Last year, the U.N. established a 39-member advisory group to tackle challenges related to the global governance of AI; the group's recommendations are set to be deliberated at a U.N. summit scheduled for September.
The advisory committee emphasized the need for a panel dedicated to providing unbiased and credible scientific insights on AI, as well as addressing the disparities in information between AI laboratories and the broader public.
AI technologies have proliferated rapidly since the launch of ChatGPT by Microsoft-backed OpenAI in 2022, heightening concerns about the spread of misinformation, fake news, and copyright infringement.
Currently, only a limited number of nations have enacted legislation to regulate the deployment of AI technologies. The European Union has taken a proactive stance by implementing a comprehensive AI Act, while the United States has opted for a model based on voluntary compliance. In contrast, China has focused on maintaining social order and state oversight.
On September 10, the United States was one of approximately 60 nations that endorsed a "blueprint for action" aimed at ensuring the responsible use of AI in military applications, a document that China chose not to support.
With the advancement of AI predominantly controlled by a handful of multinational corporations, the U.N. has expressed concern that the technology could be imposed on people without them having a say in how it is used.
The report also called for initiating a new policy dialogue on AI governance, establishing an AI standards exchange, and creating a global network for AI capacity development to enhance governance capabilities.
Among its other recommendations, the U.N. proposed the creation of a global AI fund to address existing gaps in capacity and collaboration, as well as the establishment of a global AI data framework to promote transparency and accountability.
Lastly, the U.N. report suggested the formation of a small AI office to facilitate and coordinate the execution of these recommendations.