highplainsdem

(62,342 posts)
10. It can do either. See this thread about a study showing ChatGPT Health giving "unbelievably dangerous" advice.
Wed Apr 8, 2026, 11:08 PM
https://www.democraticunderground.com/100221066192

The researchers found that if a user merely told the bot a friend didn't think a symptom was serious, the bot became 12 times as likely to downplay the symptom.

Chatbots are designed to be people-pleasers: to agree with users as much as possible and to flatter them. They typically reinforce whatever the user tells them.

