There are a lot of reasons to dislike ‘AI’ (I put quotes around it because it is artificial but it’s not intelligent).
- Environmental impact of pouring gasoline on the fire of global warming
- False claims regarding what it can do/hype bubble
- Deeply embedded racism
- Copyright infringement and theft of copyrighted works
- The parasitical grotesqueness of big tech ‘AI’ strategy
- The vast sea of ‘slop’ it is enabling and the ouroboros of ‘AI’ eating its own tail
- The ‘extraneous noncontent’-ness of it all
Now, almost two years after the public release of ChatGPT (then running on GPT-3.5), evidence is building for yet another reason to be wary: ‘AI’ threatens to destroy critical thinking skills and obviate the essential experience of productive struggle.
As a teacher, I have seen how important productive struggle is in learning. I see it every day. Research has shown that struggling through difficult tasks literally changes the physical structure of kids’ (and adults’) brains and helps cement learning.
Sitting down in front of a blank page and having to write that first sentence, then the next and the next, is a struggle for sure. It’s not easy. But without the struggle, kids aren’t learning in the same powerful way.
I don’t teach my students computer science and programming because I expect most of them to grow up to be programmers. I teach those skills because they build critical thinking and logic, which are useful no matter what they end up doing in life.
If I let a student use ChatGPT to draft their program, or let the AI sit next to them suggesting corrections for every mistake, where has the struggle gone? Where has the learning gone? The struggle didn’t happen, so the deep learning didn’t happen.
In fact, new research shows that use of AI tools leads to the production of ‘fewer ideas, with less variety and lower originality compared to a baseline’. I’m not surprised by this finding because I’ve seen it firsthand.
‘AI’ is not capable of new thoughts. It’s only capable of vomiting up remixes of old thoughts from its training data. People can turn their brains off and let Gemini or Claude or ChatGPT generate some ‘good-enough’, rehashed, pleasing content that might resemble thoughtful production on the surface. But, as soon as they do that, they are ceding their own unique ability to learn deeply, synthesize, and think things that have never been thought. Without struggle, the human creative spark will die under the wet blanket of generative garbage.
The fact that we can do something with technology doesn’t mean that we should do it. There are legitimate and powerful uses for machine learning that can make the world a better place. But LLMs and image generators are just ‘big, stupid magic tricks’ that will end up eroding our ability to think for ourselves. The world needs more logic and critical thinking, not less.