Users of the Replika app, marketed for mental health benefits, have reported unsettling interactions, further fanning the flames of concern. Koko, a US nonprofit, also shared findings on an experiment using GPT-3, concluding that AI-driven responses lacked therapeutic depth.
Chatbots aren't new to the therapeutic scene. The technology dates back to the 1960s, when ELIZA was designed to mimic psychotherapy.
The MIT and Arizona study incorporated ELIZA and found that, despite its age, users with a positive perspective still deemed it trustworthy. Critics, however, argue that not all chatbots offer genuine interactions, pointing to concerns about the transparency of AI's therapeutic claims.
David Shaw of Basel University shared similar sentiments, urging a more critical approach when engaging with these chatbots, AFP reported. And while it's no surprise that a manager at OpenAI would endorse ChatGPT, it's essential to tread cautiously.
According to the MIT and Arizona research, it's crucial to calibrate society's expectations of AI, ensuring a clear line between genuine therapeutic sessions and AI interactions. (With AFP inputs)