Sam, it's better to keep quiet than to talk!

  • Writer: Federico Carrasco
  • May 3

Updated: May 4


or


Why Sam Altman’s Energy Comparison between AI and a Child is a Moral Failure

OpenAI's founder, a man of whom one might easily speculate that he experienced little love in his childhood, and who certainly lacks the joy of raising a child or even a pet, made a shocking remark in February 2025 at the AI Summit in India. Among other things, he said:


“One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model relative to how much it costs a human to do one inference query. But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart.”

At a narrow economic or thermodynamic level, the observation is provocative. Once amortized over billions of queries, an AI system may indeed appear more “energy efficient” than a human brain performing a single cognitive task.


But from an ethical and human‑rights perspective, the comparison collapses. Human beings are not inference engines. The energy invested in raising a child is not a computational overhead; it is the substance of human life, care, relationships, dignity, autonomy, and the right to self‑determination.


To equate two decades of human development with server electricity implicitly reduces personhood to a biological compute cost. That is not a neutral scientific analogy; it is a framing that risks dehumanization.


From a human‑rights standpoint, the analogy is fundamentally flawed. Human rights are grounded in dignity, not efficiency. A child does not eat, grow, and learn merely to “get smart” for a future task. They live, relate, imagine, and exercise agency. Treating human development as a sunk cost for labor parity drifts toward a worldview in which people are valued only for their utility relative to machines.


It is also a category error:

  • Multi‑functionality: Human energy supports an entire organism: heart, lungs, immune system, emotion, and creativity, not just cognition. AI energy is dedicated solely to computation.

  • Sustainability: A human brain runs on roughly 20 watts, and the whole body on the order of 100 watts of food energy. Data centers require megawatts of electricity and vast amounts of water for cooling.

  • Scale: A single AI inference may be efficient, but millions of simultaneous inferences create enormous aggregate demand.
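The amortization arithmetic behind Altman's comparison can be made explicit with a back-of-envelope sketch. Every figure below, the daily food intake, the model training energy, the query count, and the per-query inference cost, is an illustrative assumption, not a measured value:

```python
# Back-of-envelope sketch of the amortization logic in Altman's comparison.
# All numbers are illustrative assumptions, not measurements.

KCAL_TO_KWH = 4184 / 3.6e6  # 1 kcal expressed in kilowatt-hours

# Assumed: ~2000 kcal of food per day over 20 years of "training" a human
human_training_kwh = 2000 * KCAL_TO_KWH * 365 * 20

# Assumed: 10 GWh to train a large model, amortized over one billion
# queries, plus ~0.3 Wh of electricity per individual inference query
model_training_kwh = 10e6
queries = 1e9
model_per_query_kwh = model_training_kwh / queries + 0.0003

print(f"Human 'training' energy: {human_training_kwh:,.0f} kWh")
print(f"Model energy per query (training amortized): "
      f"{model_per_query_kwh * 1000:.1f} Wh")
```

Under these assumptions a human's twenty years of food come to roughly 17,000 kWh, while the model's amortized cost per query is on the order of 10 Wh, which is exactly the kind of result that makes the framing look favorable to the machine. The point of the bullets above is that this arithmetic, however tidy, measures nothing that matters about a human life.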


The deeper concern is what this framing reveals about the worldview shaping tomorrow’s technologies. When the energy required to raise a human being is treated as a benchmark for computational efficiency, it signals a philosophy in which return on investment eclipses the intrinsic value of human life.


The purpose of human society, and of technological progress, is not to outperform sentient beings but to support them. We invest in children not because they are efficient, but because they give meaning, continuity, and purpose to existence.


The most fulfilled individuals in society, not only parents but also those who serve in demanding roles with no expectation of material return, often describe their lives in terms of purpose, contribution, and relationship, not efficiency metrics. Humanity’s defining feature is precisely this willingness to invest in what cannot be optimized.


We are left with a troubling concern: the architects of tomorrow's technology seem to have fundamentally misunderstood the purpose of life. To view twenty years of human growth as merely a 'training cost' to be optimized away is to miss the very point of existence.


Sam Altman may understand algorithms, but he appears blind to purpose. We do not live to become 'smart' for a query.


We live to experience, to struggle, and to invest in one another, precisely because those investments hold value far beyond the energy they consume.

