New Open Letter Urging AI Labs To Pause Training Of Powerful AI Systems Receives More Than 1,100 Signatories

The race for AI is growing more competitive by the day, and the launch of GPT-4 has top tech experts and critics on their toes.

Now, a new open letter has entered the public eye, urging all leading AI labs to pause and rethink the training of AI systems more powerful than GPT-4. The letter has so far received more than 1,100 signatories, and the figure is expected to rise dramatically over the next few days.

The letter calls for a pause of at least six months, and notable names on the list of signatories include leading computer scientists as well as tech figures such as Elon Musk and Steve Wozniak.

The open letter also warns that the latest AI systems, whose intelligence is approaching human-competitive levels, pose a profound risk to society, and it criticizes the lack of planning and management surrounding their development.

It describes AI labs as locked in a never-ending race to create and deploy ever more powerful digital minds that no one can fully understand or control. Experts therefore feel the time has come to step back and pause before the race sparks major chaos.

The letter notes that contemporary AI systems are now becoming human-competitive at general tasks, and asks whether it is right to let machines flood information channels with propaganda and untruth, and whether everything should be automated, including the jobs humans are fulfilling at the moment.

Creating nonhuman minds to do our tasks, it continues, could mean they eventually outnumber, outsmart, and even replace us, putting civilization at risk; such decisions, the letter argues, should not be delegated to unelected tech leaders.

The letter adds that any pause should be public and verifiable, and should include all the key players. If such a pause cannot be enacted quickly, it elaborates, governments should step in and institute a moratorium.

But it is clearly stated that labs are not being told to stop AI development in general. They are simply being asked to step back from a dangerous race toward ever-larger, unpredictable models with advanced capabilities.

The letter likewise suggests that this buffer period could be used by AI labs and top experts to jointly produce a set of safety protocols for better AI design and development.

In the end, the letter quotes statements from the makers of ChatGPT, yet it is striking that the list of signatories includes no one from the powerful research lab other than Elon Musk, who was a founding member.

