AI skills' dominance, or AI Theocracy?
- Federico Carrasco

In a recent appearance on Lex Fridman's podcast, Jensen Huang stated:
📢 "If I have a choice between a new college graduate with no clue what AI is and one that is expert in using AI, I would hire the one who's expert in using AI."
This statement is hardly controversial: it reflects the direction the market is heading, and it makes a reasonable point — knowing the tools of the era matters.
At the same time, let's be careful not to generalize it into something broader than it is.
💡 Not every human activity requires AI. Not every system justifies the expensive infrastructure of NVIDIA's GPUs. In many cases we are simply talking about automation, not AI — or, even more fundamentally, about human interaction that should remain human.
The conversation took a more "spiritual" turn when Lex remarked:
📢 ". . . it's really just incredible how much you can think through your life's problems, and I don't mean like therapy problems, I mean like very practically, okay, I'm worried about my, literally, I'm worried about my job, what are the skills, what are the steps I need to take?"
💡 This is where the conversation shifts from practical advice to something closer to ideology. Lex Fridman's wording is evocative, but it risks framing AI as a default problem-solver for life decisions rather than as a complementary aid.
💡 AI is powerful, but it is still just a tool, not a substitute for judgment, experience, or agency.
💡 We need an "analogue" life before the digital one, or we risk building a culture where AI becomes a kind of modern clergy: unquestioned, ever-present, and treated as the ultimate authority.
❓ Are we empowering individuals, or gradually centralizing "guidance"?
❓ Are we moving toward an AI theocracy, where decisions and direction are increasingly dictated by systems we treat as unquestionable?
❗ Let’s build and use AI.
‼️ But let’s not forget to live and work beyond it.