General Discussion
The ChatGPT Symptom Spiral: Be careful asking chatbots about your health. (Sage Lazarro, The Atlantic, 4/6)
https://www.theatlantic.com/technology/2026/04/chatgpt-health-anxiety/686603/

For nearly two weeks, Mallon, a 46-year-old in Liverpool, England, spent hours each day talking with the chatbot about the potential diagnosis. "It just sent me around on this crazy Ferris wheel of emotion and fear," Mallon told me. His follow-up tests showed it wasn't cancer after all, but he could not stop talking to ChatGPT about health concerns, querying the bot about every sensation he felt in his body for months. He became convinced that something must be wrong: that a different cancer, or maybe multiple sclerosis or ALS, was lurking in his body. Prompted by his conversations with ChatGPT, he saw various specialists and got MRIs on his head, neck, and spine.
Mallon told me he believes that the cancer scare and ChatGPT together caused him to develop this crippling health anxiety. But he blames the chatbot for keeping him spiraling even after the additional tests indicated that he wasn't sick. "I couldn't put it down," he said. The chatbot kept the conversation going and surfaced articles for him to read. Its humanlike replies led Mallon to view it as a friend.
-snip-
Others seem to be struggling with this problem. Online communities focused on health anxiety, an umbrella term for excessive worrying about illness or bodily sensations, are filling up with conversations about ChatGPT and other AI tools. Some say it makes them spiral more than ever, while others who feel like it helps in the moment admit it's morphed into a compulsion they struggle to resist. I spoke with four therapists who treat the condition (including my own); they all said that they're seeing clients use chatbots in this way, and that they're concerned about how AI can lead people to constantly seek reassurance, perpetuating the condition. "Because the answers are so immediate and so personalized, it's even more reinforcing than Googling. This kind of takes it to the next level," Lisa Levine, a psychologist who specializes in anxiety and obsessive-compulsive disorder and treats patients with health anxiety specifically, told me.
-snip-
Much more at the link.
It seems that chatbots are potentially harmful and addictive, no matter what people are using them for.
SheltieLover (80,768 posts)

AZJonnie (3,735 posts)

I'll give you an example of when I regularly use it: at the plant nursery when I'm picking out plants for the yard/garden. I see the scientific name on a plant I'm thinking of getting and I ask it to give me the rundown on it: how is it likely to do in the spot where I'm thinking of putting it (how much shade there is there, is it near a wall that'll reflect heat, in Phoenix, etc.), how big will it get, show me some pictures of a full-grown version. I just say "okay google, tell me about the plant caesalpinia mexicana, how will that do 2 feet from a southern-facing wall in Phoenix, and how big will it get" and it talks back and gives me the rundown. You might say "you could google all that," and that's true, but a) I just want a quick answer, and b) AI talks, so I don't have to struggle to read a phone screen outside. I have a hard enough time reading text on my phone 'cause my eyes suck, let alone doing it outside.
Relying on it for important things it's not qualified to do is a whole other matter, like the case at hand here. The life and death of a nursery shrub is a lot less important than human health.
WhiskeyGrinder (26,977 posts)

Ocelot II (130,677 posts)

AI must be scaring the pants off people.