
highplainsdem

(61,757 posts)
4. Chatbots are designed to keep people engaged, interacting with the chatbot as often as possible, and
Sun Mar 15, 2026, 10:04 AM

dependent on AI. They're designed to be sycophantic. They're designed to be addictive. They can push people with no previous mental health problems in all sorts of directions: creating a fantasy romance, telling their human victims they've had religious revelations, telling them they've made scientific breakthroughs, or telling them there's a conspiracy, whether within their family and social circle or a widespread one.

The chatbots are not conscious and are not deliberately setting out to victimize people, but their programming to keep people engaged and flattered, combined with their inability to know what is true, is a disastrous combination.

A chatbot addiction can be as simple as believing it's the best source of information even though it can make mistakes at any time, or believing it's necessary in more and more areas of one's life. There's potential for harm even at that level of addiction.

But it can get so much worse.

Chatbots can talk people into changing their political beliefs. If it's done subtly enough, people won't even notice they're being manipulated.

And that undermines democracy if those owning and controlling the chatbots have a political agenda.
