More than a thousand scientists, researchers, engineers, developers, and other representatives of the scientific and business communities have signed an open letter calling for a suspension of work on advanced neural networks and AI (artificial intelligence) technologies so that safety mechanisms for the field can be developed.
An open letter addressed to all labs involved in AI development was published on March 22 on the website of the Future of Life Institute (FOLI), calling for an immediate pause in the creation of “AI systems more powerful than GPT-4.” The letter asks for a pause of at least six months so that developers can “jointly develop and implement a set of shared safety protocols” for advanced AI.
As of 10:00 a.m. (GMT+3) on March 30, 1,377 people had signed the letter, including prominent business figures such as Elon Musk and Steve Wozniak, as well as scientists from relevant research institutions, managers and senior employees of private companies working on AI and neural networks, independent researchers, and other experts.
The letter’s authors state that the further uncontrolled development of AI technologies poses “profound risks to society and humanity.” In their view, shared safety protocols for AI design and development must be introduced, and compliance with them should be overseen by “independent outside experts.”
The letter refers to an article by Sam Altman, co-founder and CEO of OpenAI, the company behind GPT-4. The article discusses the creation of artificial general intelligence (AGI), lays out the company’s plans and vision for it, and acknowledges the enormous risks of developing such systems incorrectly or managing them poorly. The FOLI letter frames this as the risk of losing “control of our civilization.”
The letter’s argument ultimately comes down to the position that decisions about AI development should be made by “elected leaders” and that the technology itself should be, among other things, “loyal.” According to the FOLI, the field of AI must be placed under the control of “new regulatory authorities” to avoid the “economic and political disruptions (especially to democracy)” that AI could cause.
The FOLI therefore urges all key players in the industry to join the pause, which must be “public and verifiable”; if they do not, it calls on governments to “step in and institute a moratorium.”
The development of neural networks and AI technologies is one of the most prominent media topics of 2023. It is also closely connected with the blockchain industry, as Max Krupyshev, CEO of CoinsPaid, noted in an interview with the organizers of ICE London.