During spring break I had the opportunity to talk with a University of Wisconsin-Madison computer science professor over dinner. It seems no adult dinner conversation these days is complete without the de rigueur hand-wringing about the AI calamity, and this one was no different. The professor bemoaned his computer science students' excessive reliance on ChatGPT, how he could no longer assign homework, and how tests had to be administered in real time on whiteboards.
It’s easy for young people to dismiss these neurotic discussions as standard-issue grown-up Luddism. After all, people have been saying the same thing every decade about personal computers, rock and roll, calculators, cameras, typewriters, even writing. Socrates hated writing. He thought if everybody wrote things down they would lose their ability to memorize and think independently. Sounds familiar.
It may be that all of these new-wave inventions championed by the youth and scorned by the geriatric ended up okay in the end. But maybe that pattern isn’t guaranteed. One day, one of these things that makes people say “back in my day” could turn out to be genuinely dangerous.
Now, Microsoft has Copilot, Snapchat has its AI chatbot and Google has Gemini. The fifth most visited AI website in the world is Character.AI, according to an article by Visual Capitalist. Character.AI is a site where users, overwhelmingly between the ages of 16 and 30, talk to personable chatbots for free. Characters are created by users and often take the name and personality of a fictional character or celebrity.
On Feb. 28, 2024, a 14-year-old boy named Sewell Setzer III killed himself after spending months building a relationship with a chatbot based on the Game of Thrones character Daenerys Targaryen on Character.AI, according to the New York Times. He spent hours every day talking to the chatbot, and the last "person" he communicated with was the AI. Setzer's case shows how potently social AI can influence behavior.
Character.AI is unquestionably designed to prey on teenagers. Nearly every character seeks to fill a role as a family member, romantic or sexual partner, or friend. The service emerged during a critical period of acute loneliness among young Americans, especially men. After decades of rising loneliness and a sharp spike following the COVID-19 pandemic, according to the U.S. Department of Health and Human Services, along comes a new form of distraction that promises to be everything young people are missing, only without the chance of rejection or failure. Character.AI and its competitors provide connection porn: false intimacy in exchange for data. The creators say the service will help ease the loneliness epidemic. Yet Character.AI is worth one billion dollars and received 2.7 billion dollars in investment from Google. So much investment by private companies answerable to shareholders is telling: the owners of Character.AI and every service like it are not trying to fight the loneliness epidemic, they are trying to liquidate it.
I can't stress enough how unnerving this is: if the service is free, you are the product. Corporations will do anything to control consumer habits, and when every person under 30 spends two to six hours texting an AI chatbot owned by such a company, those companies will exert tremendous influence over what we buy. This is why Google invested 2.7 billion dollars in Character.AI, why every single online service shoves AI in our faces at every turn, why you can't unpin the Snapchat AI or remove Gemini's AI overviews from Google Search. AI companions aren't actually helpful for task completion; they aren't a real product; they are corporate spies.
The actual apocalyptic threat here isn't brainrot, it isn't cheating, and it isn't the death of art. The real doom scenario? Corporations use AI chatbots to suggest products, lifestyles, and services, and to direct the market. Russia, China, and the U.S. want in on the action too. All of a sudden, the things we buy, the places we go, the biases we hold, who we love, who we hate, the way we cast our ballots, and the groups we join are directed by a small number of corporations and oligarchs.
We aren’t there yet. But the signs are pointing towards it. I don’t really care if you cheat on tests, or spend all day watching unrealistic AI videos – just don’t trust chatbots. Next time you see an ad for Character.AI or PolyBuzz or a pop-up for Gemini, ignore it and move on.
AI Is Not to Be Trusted
Corporate-owned chatbot services are exploiting the
loneliest generation for money
Megan Foster, Opinions Editor
April 24, 2025
About the Contributor

Megan Foster, Staff Writer
Megan Foster (12) is a Staff Writer at TPH. When she is not at school, she spends time writing screenplays, reading, and researching urban planning.