ChatGPT maker responds to growing number of people using AI as a therapist

The system has been criticised for failing to recognise when users are in mental or emotional distress and to respond appropriately
OpenAI, the creator of ChatGPT, says it is moving quickly to fix shortcomings in how the chatbot helps users in mental distress.
Over recent months, a growing number of reports have indicated that people experiencing mental health crises are turning to the program for help with their problems. But ChatGPT often tells users what they want to hear, validating their mistakes and failing to challenge their thinking.
OpenAI now says it is addressing this with a range of changes, drawing lessons from cases in which the system failed users facing mental health or similar problems.
The company said it recognises that the technology can feel especially personal and responsive, particularly to someone who is suffering emotional or mental distress.
It is updating its models to respond more appropriately and to "recognise signs of delusion", it said, after a number of reports indicated that the system can reinforce people's delusions or leave them emotionally attached to it.
Users who spend a long time sending messages in a single session will also be shown gentle reminders suggesting it may be "a good time for a break".
The company will also change how the system answers certain questions, particularly those involving "personal decisions".
"In critical situations, when working with specialists to improve the system reaction, such as if someone demonstrates signs of mental or mental suffering."
This includes medical experts, mental health and similar concerns, as well as working with researchers to improve the ability of systems and behavioral systems to answer.
The announcement comes as OpenAI continues work on GPT-5, an update to the model that powers ChatGPT. It would be the first major release since GPT-4 launched in 2023, and OpenAI chief executive Sam Altman has been talking up the new version's potential.
