Elon Musk and top AI researchers call for pause on ‘giant AI experiments’

A number of well-known AI researchers — and Elon Musk — have signed an open letter calling on AI labs around the world to pause development of large-scale AI systems, citing fears over the “profound risks to society and humanity” they claim this software poses.
The letter, published by the nonprofit Future of Life Institute, notes that AI labs are currently locked in an “out-of-control race” to develop and deploy machine learning systems “that no one — not even their creators — can understand, predict, or reliably control.”
“Therefore, we call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4,” says the letter. “This pause should be public and verifiable, and include all key actors. If such a pause cannot be enacted quickly, governments should step in and institute a moratorium.”
Signatories include author Yuval Noah Harari, Apple co-founder Steve Wozniak, Skype co-founder Jaan Tallinn, politician Andrew Yang, and a number of well-known AI researchers and CEOs, including Stuart Russell, Yoshua Bengio, Gary Marcus, and Emad Mostaque. The full list of signatories can be seen here, though new names should be treated with caution as there are reports of names being added to the list as a joke (e.g. OpenAI CEO Sam Altman, an individual who is partly responsible for the current race dynamic in AI).
The letter is unlikely to have any effect on the current climate in AI research, which has seen tech companies like Google and Microsoft rush to deploy new products, often sidelining previously avowed concerns over safety and ethics. But it is a sign of growing opposition to this “ship it now and fix it later” approach — opposition that could eventually make its way into the political domain for consideration by legislators.
As noted in the letter, even OpenAI itself has expressed the potential need for “independent review” of future AI systems to ensure they meet safety standards. The signatories say that this time has now come.
“AI labs and independent experts should use this pause to jointly develop and implement a set of shared safety protocols for advanced AI design and development that are rigorously audited and overseen by independent outside experts,” they write. “These protocols should ensure that systems adhering to them are safe beyond a reasonable doubt.”
You can read the letter in full here.