Welcome to Codects, a collection of technical notes and articles researched and written by Sami Elsayed. Here, once or twice a month, I share deep dives into things that interest me in the fields of artificial intelligence, computer security, and hardware.
Recent posts
We often treat machine learning models as mathematical abstractions, pure functions that map inputs to outputs. We assume that if the math is correct, the system is secure. But models don’t exist in a vacuum; they run on imperfect hardware, rely on approximate floating-point arithmetic, and execute within physical constraints. I wanted to understand exactly how fragile these implicit assumptions are.
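The claim that floating-point arithmetic is only approximate can be made concrete with a minimal sketch (my own illustration, not from the post): in IEEE 754 doubles, addition is not even associative, so two mathematically identical sums can yield different bits.

```python
# Floating-point addition is not associative: the order of operations
# changes the rounding, so mathematically equal sums can differ.
left = (0.1 + 0.2) + 0.3   # 0.6000000000000001
right = 0.1 + (0.2 + 0.3)  # 0.6

print(left == right)       # False
print(0.1 + 0.2 == 0.3)    # False: 0.1 + 0.2 is 0.30000000000000004
```

Any security argument that assumes exact real-number arithmetic inherits this gap between the math and the machine.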
My goal wasn’t to build a high-performance classifier or to learn the basics of deep learning.
Artificial intelligence systems have rapidly transformed our lives, yet one critical aspect of human interaction remains largely absent in these digital creations—genuine empathy. While much attention is paid to issues like bias, transparency, and accountability, the inability of AI to truly understand and replicate human emotional nuance is rarely discussed. This “empathy gap” raises important ethical questions about how machines interact with vulnerable populations and the responsibilities of those who design and deploy these systems.
When we look up at the night sky, the planets, moons, and stars all seem to move in predictable paths. But what keeps them in motion? Why don’t planets like Earth just fall into the Sun? The answer lies in orbital mechanics, a fascinating branch of physics that explains how celestial bodies move under the influence of gravity.
Newton’s Law of Universal Gravitation

The first step in understanding orbits is gravity.
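As a quick sketch of the law in action (my own example, with standard textbook values assumed for the constants), the gravitational force between two bodies is F = G·m₁·m₂/r², computed here for the Earth–Sun pair:

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2
# Constants below are approximate textbook values (assumed, not from the post).
G = 6.674e-11        # gravitational constant, N * m^2 / kg^2
m_sun = 1.989e30     # mass of the Sun, kg
m_earth = 5.972e24   # mass of the Earth, kg
r = 1.496e11         # mean Earth-Sun distance, m

force = G * m_sun * m_earth / r**2
print(f"{force:.3e} N")  # roughly 3.5e22 newtons
```

This enormous but finite pull is exactly what bends Earth's straight-line motion into a closed orbit rather than dragging it into the Sun.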