Dark Light

“Men have become the tools of their tools.”

— Henry David Thoreau, Walden.

For many people, AI offers more than computing power. In previous ages, there was a clear line between the technological tool and the hand that held it. Function informed identity. A hammer hammers. A harvester harvests. What makes AI different is that it doesn’t just do something; it seems to be something. But what does a “smart” device do? Be smart? Okay. But what does it mean to “be smart”? An IQ test? An EQ test? A social test?

Francis Bacon warned centuries ago that the “mechanical arts” could cast a kind of spell—captivating us with their usefulness while reshaping how we see ourselves. If the only reference point for intelligence is humanity, then part of what makes AI such a unique advancement is that it’s not merely functional; it’s reflective. Less like a tool, more like a mirror.

But what do we find when we look into the black mirror of AI?

When we start looking to the things we’ve made to understand ourselves, the reflection is always partial, and so is the understanding. The truth is, AI doesn’t know what it’s doing. It can write about love but it cannot “be in love”. It can pattern desire, but it cannot feel “longing”. It can detect a tumour but it cannot “grieve” the loss. It can finish your sentence but not your “life story”.

Suppose someone says: “Well, maybe not yet—just give it time!” Okay. But ask yourself: What does that assume about human intelligence?

To assume that we just need to give AI more time (more data, more processing power) to bridge the gap between silicon and soul is to assume that AI is a simpler version of us waiting to “level up”. But that way of thinking rests on a peculiar madness of our day: thanking the painting while ignoring the painter. A mirror won’t turn into a person simply because you stare at it long enough! A lie can exist only because someone intelligent enough to know the truth chose to tell it. A calculator doesn’t know what “2 + 2” means—it just runs the code. AI systems don’t understand their outputs either. They shuffle symbols using rules written by humans, trained on data curated by humans, to produce results interpreted by humans.

As impressive and powerful as AI is (and oh, it is!), we commit a category confusion when we start trading it off against humanity. When we start talking about human rights for robots, as in the case of the chatbot LaMDA, or granting AI citizenship, as in the case of Sophia from Hanson Robotics in Saudi Arabia, we reveal just how deep this confusion runs—a confusion not about what AI can do, but about what we assume it means when it does. To ask if a machine can become conscious is to forget that the machine only exists because conscious humans made it. That’s like saying, “Minds are like computers because computers are like minds.” The reasoning runs in a circle.

Today, we’re not just making tools; we’re building mirrors. And in a world that’s lost much of its theological imagination, those mirrors tend to mistake a partial reflection for the whole reality. Slowly, subtly, we’ve begun to adopt machine values—efficiency, output, control—as our own. Thinking becomes computation. Intimacy, a digital transaction. Creativity, mere pattern recognition. Presence, sheer availability. We don’t just misunderstand AI—we misremember ourselves.

Here’s a great irony: the more we try to make machines like humans, the more we end up making humans like machines. It isn’t so much that machines are gaining value as that we are forgetting our own. We’ve stopped asking what makes us valuable. Now we just ask if we’re still useful—“Men have become the tools of their tools.” (Thoreau).
