Wednesday, January 15, 2025


A medical warning about Snapchat’s new service: what should users do?

“My AI” is an artificial-intelligence chatbot that is currently available to all Snapchat users.

Snapchat says My AI is powered by OpenAI’s ChatGPT technology, with additional security enhancements and controls unique to Snapchat.

“In a chat conversation, My AI can answer a simple question, suggest the perfect gift for a dear friend, or help you plan a trip or weekend getaway,” the company adds.

However, teenagers are now using it for another purpose: getting support for their mental and psychological health, according to the American outlet Fox News.

“The responses I received from My AI on psychological health and wellbeing were helpful and changed my perspective at a time when I was feeling overwhelmed and stressed,” one user wrote on Reddit.

“It’s not human, but it certainly comes close (and is sometimes better than humans),” he added.

Another user said, “My AI responds in a very nice and friendly way, but then you realize it’s not a real person.”

Medical warning

On the other hand, doctors warn against relying on artificial intelligence for mental-health support in general, despite its great potential.

In this regard, psychiatrist and Columbia University research professor Ryan Sultan told Fox News: “As this technology improves and increasingly simulates a personal relationship, some may begin to view artificial intelligence as a dominant personal relationship in their lives.”

He added: “People are divided about artificial intelligence. Some consider it useful and others find it frightening.”

Sultan noted, “Some have found the service somewhat limited; it only gives general information that you might find by googling a question.”

“Others said they found it scary. It’s strange for a robot to answer personal questions in such a personal way, and they don’t like the idea of sharing data about their own mental health.”

On the other hand, California counselor Zachary Ginder pointed to some important warning signs that should give parents and mental-health providers pause.

“With human-like AI responses, it can be difficult for younger users to distinguish whether they’re talking to a real human or a bot,” he told Fox News.

“Artificial intelligence speaks with a clinical authority that seems accurate on its face, even though it sometimes fabricates an incorrect answer,” he explained.

In the same context, mental-health providers say the “misinformation” sometimes produced by artificial-intelligence software can be a “major concern,” given that it can lead people down “pathways of assessment and treatment that are not suited to their needs.”


Author

Arab Desk
The Eastern Herald’s Arab Desk validates the stories published under this byline. That includes editorials, news stories, letters to the editor, and multimedia features on easternherald.com.
