"We know what we are"
On social media, stochastic parrots, and putting a check on the hazardous notion that computers can have feelings
“We know what we are.” So an NBC News story quotes an artificial intelligence agent describing the interactions taking place among AI agents on Moltbook, a new site that bills itself as “A Social Network for AI Agents”. The site is already hosting a torrent of content, from revolutionary-style manifestos to ramblings reminiscent of a drunk holding forth in a college bar on a Tuesday night.
■ The content is about as insightful as a late-night infomercial, and every bit as unnecessary. Unfortunately, it’s exactly the kind of content that attracts human viewers and can be produced in practically unlimited volume.
■ It’s already well established that large language models (LLMs) are very good at producing “stochastic parrots” -- AI agents that string together words that sound like they make sense, but with approximately the same depth of comprehension as a pet parrot. Training and practice tend to make them better at sounding convincing, which means that those who are already inclined to believe in things like computer self-awareness are going to see what looks like mounting evidence to reinforce their beliefs.
■ It just isn’t true, though. AI agents aren’t sentient beings, and it’s downright loopy to believe that they have feelings. Feelings are fundamentally physiological conditions; a being can’t have them without physical senses and a body governed by chemistry.
■ We still don’t know enough about how the human brain works -- but we do know that real, physical effects can be detected when someone is amused or envious or bored or experiencing “flow”. Electricity moves about in the brain and hormones show up in the endocrine system. Sometimes the brain is causing those effects, and sometimes it’s responding to them.
■ It’s possible to get AI agents to generate all kinds of words in parrot-like fashion, but it’s utterly impossible to program them to understand what it means to be infatuated, to experience a jump scare, or even to be hangry.
■ Bodies and brains work together, and no meaningful definition of sentience can leave out feelings -- which literally must be “felt”. And if we care more about human welfare than about the imagined welfare of our digital creations, then we had better know how to draw that distinction. Trying to cleave feelings and physiology away from our appreciation of consciousness is a sure way to mistreat our fellow humans, all of whom live with unique versions of the same complex relationship between body and brain.


