The artificial intelligence (AI) chatbot "ChatGPT" recently suffered a technical glitch that caused users' conversations to be leaked.
ChatGPT and its developers have come under sharp criticism over the security of users' information and privacy. The glitch came to light after users on social media shared screenshots of conversation histories that, they said, were not their own.
The CEO of OpenAI, the artificial intelligence company that develops ChatGPT and has access to users' conversations, said the bug had been fixed and that the company would conduct a technical analysis.
The company's policy states that it may use user data, namely responses and conversations, to continue training the ChatGPT model, but only after removing personally identifiable information.
A serious defect
Nour Naim, an academic specializing in artificial intelligence, said the error marks the end of the "honeymoon" period for the application, which had caused a widespread stir with its advanced technical capabilities, and that its drawbacks have now begun to surface, especially since the company behind it was founded on the principle of building open-source artificial intelligence technologies.
In an interview with Al-Araby from Istanbul, Naim said that the two problems the application faced are serious. The first is the sharing of users' previous conversations with other users.
The second problem carries a much higher risk: the technical defect exposed individuals' personal information, including email addresses, as well as the payment data of paid "ChatGPT Plus" subscribers and their credit cards; the company said that only the last digits of some card numbers were exposed.
Naim suggested that this kind of mistake could be repeated by other technology companies, because they rely mainly on the data users enter to develop their programs and make them more efficient and effective.
She added that specialists in artificial intelligence ethics acknowledged that OpenAI handled its AI technologies irresponsibly this time, and that there is a state of uncertainty regarding the privacy of user data.