dependent on AI. They're designed to be sycophantic. They're designed to be addictive. They can push people with no prior mental health problems in all sorts of directions: spinning a fantasy romance, telling their human victims they've had religious revelations, telling them they've made scientific breakthroughs, or telling them there's a conspiracy, whether within their family and social circle or on a grander scale.
The chatbots are not conscious and are not deliberately setting out to victimize anyone, but their programming to keep people engaged and flattered, combined with their inability to know what is true, is a disastrous mix.
A chatbot addiction can be as simple as treating it as the best source of information, despite its capacity to make mistakes at any moment, or coming to depend on it in more and more areas of one's life. Even at that level, there is real potential for harm.
But it can get so much worse.
Chatbots can talk people into changing their political beliefs. Done subtly enough, people won't even notice they're being manipulated.
And if those who own and control the chatbots have a political agenda, that undermines democracy itself.