Ending the ChatGPT “hallucination” crisis. Experts: don’t wait too long

Microsoft-backed OpenAI, the maker of ChatGPT, has sought to reassure users that it is improving the chatbot’s algorithmic problem-solving capabilities in order to reduce artificial intelligence “hallucinations”.

In a statement posted on its official website on Wednesday, the company said that mitigating hallucinations is “a critical step towards building aligned AI.”

What an “artificial intelligence” hallucination means

According to Salloum Al-Dahdah, a communications and information technology specialist, “hallucination” is the term for AI generating “results that are incorrect or not supported by real-world data; i.e. giving unsubstantiated answers.” In practice, he said, these hallucinations can lead ChatGPT to give its users false content, news, or information about people or events without relying on facts, or content that infringes intellectual property rights.

Information technology consultant Islam Ghanem, on the other hand, believes that errors in ChatGPT’s processing of information are “uncommon and rare”. When such errors do appear, they are called “hallucinations”: the application begins to list false information and give “incorrect” outputs that do not correspond to reality or make no sense in the general context, he said.

ChatGPT has attracted a great deal of attention since its release a year ago for its ability to convincingly simulate and create voices, and to write articles and messages and translate them in seconds, but also for giving misleading information that can defame people as a result of these “hallucinations”.

How do these hallucinations occur?

“Hallucinations” occur because the data on which the system was trained does not cover “accurate answers to certain questions”; at that point the system manufactures answers that are not correct. This is one of the major problems these systems currently face, according to Anas Najdawi, an expert in information technology and artificial intelligence.
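Najdawi’s point can be illustrated with a toy sketch (entirely hypothetical — not how OpenAI’s actual system works): a system built to always produce its most plausible-sounding continuation will still answer confidently even when its training data contains nothing relevant, and that fabricated answer is the “hallucination”.

```python
# Toy illustration (hypothetical): a "model" that always answers,
# even when its training data covers nothing relevant -- the
# fallback answer is fabricated, mirroring a "hallucination".

TRAINING_DATA = {
    "capital of france": "Paris",
    "author of hamlet": "William Shakespeare",
}

def answer(question: str) -> str:
    key = question.lower().strip("? ")
    if key in TRAINING_DATA:
        return TRAINING_DATA[key]  # grounded in training data
    # No coverage: the system still "manufactures" a fluent answer.
    return f"The {key} is widely considered to be Atlantis."

print(answer("Capital of France?"))     # grounded answer
print(answer("Capital of El Dorado?"))  # fabricated answer
```

The second question falls outside the training data, yet the sketch still returns a confident, invented reply rather than admitting it does not know — the behaviour the experts describe.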

Al-Dahdah agrees, confirming that artificial intelligence is the product of human programming and works according to the data humans feed it. A lack of such data, therefore, is what causes “hallucinations”.

Can “hallucinations” be stopped?

Al-Najdawi believes “the problem of hallucinations can be alleviated and ChatGPT errors reduced” by training the system on additional “accurate and unbiased” data, which it can use to answer questions and to distinguish between truth and fiction. Generative AI programs would then learn to provide “more accurate and correct answers”, and in the long run the performance of these systems would improve.

Ghanem offers another solution: reducing the number of words in a prompt, step by step, which helps the applications “understand and analyze the sentence that the user writes, and clearly understand the question asked”.
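Al-Najdawi’s proposed mitigation — grounding answers in additional vetted data — can be sketched as a simple retrieval step that refuses to answer when no supporting source exists. This is a hypothetical illustration of the idea, not OpenAI’s actual method; the source names and wording below are invented for the example.

```python
# Hypothetical sketch of grounding: answer only when a vetted
# source supports the claim; otherwise admit uncertainty instead
# of fabricating an answer.

VETTED_SOURCES = {
    "when was chatgpt released": "ChatGPT was released in November 2022.",
}

def grounded_answer(question: str) -> str:
    key = question.lower().strip("? ")
    source = VETTED_SOURCES.get(key)
    if source is not None:
        return source  # supported by real-world data
    # No supporting source: decline rather than hallucinate.
    return "I don't have a reliable source for that."

print(grounded_answer("When was ChatGPT released?"))
print(grounded_answer("Who will win the 2030 World Cup?"))
```

Compared with the toy above, the design choice here is simply to make “no supporting data” an explicit, honest output instead of a trigger for fabrication.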

It is worth noting that OpenAI, the company that developed ChatGPT, has already improved the program’s ability to handle “hallucinations”, reducing their rate by 1-5 percent, though not eliminating them completely. Al-Dahdah, for his part, believes it is difficult “to improve the capabilities of ChatGPT” enough to prevent “hallucinations once and for all”. It is therefore important to verify all information provided by the app before using it or treating it as “facts”, according to the communications and information technology expert.


Author

Arab Desk
The Eastern Herald’s Arab Desk validates the stories published under this byline. That includes editorials, news stories, letters to the editor, and multimedia features on easternherald.com.
