
A medical warning about Snapchat’s new service: what should users do?


“My AI” is an artificial intelligence-based chat program that is currently available to all Snapchat users.

Snapchat says My AI is powered by OpenAI’s ChatGPT technology, with additional safety enhancements and controls unique to Snapchat.

“In a chat conversation, My AI can answer a simple question, advise you on the perfect gift for a dear friend, or help you plan a trip or weekend getaway,” the company adds.

However, teenagers are now putting it to another use: seeking support for their psychological and mental health, according to the American outlet Fox News.

“The responses I received from My AI on psychological health and wellbeing were helpful and changed my perspective at a time when I was feeling overwhelmed and stressed,” one user wrote on Reddit.

“It’s not a human, but it certainly comes close (and is sometimes better than humans),” he added.

Another user said: “My AI’s responses are very nice and friendly, but then you realize it’s not a real person.”

Medical warning

Doctors, on the other hand, warn against relying on artificial intelligence, despite its great potential, for mental health support in general.

In this regard, Ryan Sultan, a psychiatrist and research professor at Columbia University in New York, told Fox News: “As this technology improves and increasingly simulates personal relationships, some may begin to view artificial intelligence as a dominant personal relationship in their lives.”

He added: “Opinions about artificial intelligence are divided. Some consider it useful and others find it frightening.”

Sultan noted, “Some have found the service somewhat limited, offering only the kind of general information you might find by googling a question.”

“Others said they found it scary. It’s strange for a bot to answer personal questions in such a personal way, and they don’t like the idea of sharing data about their own mental health.”

For his part, California counselor Zachary Ginder pointed to some important warning signs that parents and mental health providers should heed.

“With human-like AI responses, it can be difficult for younger users to distinguish whether they’re talking to a real human or a bot,” he told Fox News.

“Artificial intelligence speaks with a clinical authority that seems accurate on its face, even though it sometimes fabricates an incorrect answer,” he explained.

In the same vein, mental health service providers say the “misinformation” sometimes produced by artificial intelligence software can be a “major concern”, given that it can steer people toward “pathways of assessment and treatment that are not suited to their needs”.


Arab Desk
