The Biden Administration has just signed a groundbreaking AI Executive Order aimed at shaping the future of artificial intelligence (AI). The goal? To set up new rules and safety measures that make AI safer and more trustworthy. The move marks a major turning point in how governments worldwide deal with the challenges of AI’s rapid growth and potential risks.
Biden’s AI Executive Order draws mixed reactions
Opinions among AI experts are split. Some are cautiously hopeful about Biden’s plan, seeing it as a step in the right direction. Others are critical, arguing that the government is relying too heavily on big tech companies’ goodwill. The AI Executive Order follows an earlier report from Biden and his science advisers that served as a draft of the order.
“He was as impressed and alarmed as anyone. He saw fake AI images of himself, of his dog. He saw how it can make bad poetry. And he’s seen and heard the incredible and terrifying technology of voice cloning, which can take three seconds of your voice and turn it into an entire fake conversation,” deputy White House chief of staff Bruce Reed said, according to AP News.
Breaking down Biden’s AI Executive Order
The Executive Order lays out a comprehensive plan across eight key areas:
- Ensuring AI Safety and Security
- Protecting Americans’ Privacy
- Advancing Equity and Civil Rights
- Supporting Consumers, Patients, and Students
- Backing Workers
- Encouraging Innovation and Competition
- Boosting American Leadership Globally
- Responsible Government Use of AI
Before we get into the details, take a look at the official announcement: the White House has published both a fact sheet and a press release for the AI Executive Order.
New safety standards for AI
According to Gizmodo, the order asks major AI companies to share their safety test results with the government and create new tools to ensure AI systems are safe and reliable. This also means developing ways to protect against potential AI-related threats, from cyber attacks to AI-generated bioweapons.
Moreover, developers may need to share their safety test results if their tools pose national security risks. To enforce these requirements, the Biden administration is invoking the Defense Production Act, a law from 1950, underscoring how seriously it takes these risks.
Government-industry collaboration
Government agencies like the National Institute of Standards and Technology (NIST), the Department of Energy, and the Department of Homeland Security will work with private industry to set standards, assess risks, and strengthen defenses against possible AI vulnerabilities.
Focus on AI-generated content and civil rights
The order calls for guidelines on labeling AI-generated content and aims to prevent misuse of AI in the justice system, guarding against discrimination and abuse.
Privacy and worker protection
The order emphasizes that data collected by AI companies must be handled legally and kept secure. It also stresses the need to protect workers’ rights as AI advancements impact various job sectors.
Global talent recruitment and data use
The order also highlights the importance of attracting top AI talent globally, making it easier for non-U.S. citizens to contribute to AI projects in the United States. It also directs government agencies to evaluate how they collect and use available data, including from brokers.
What lies ahead
This Executive Order is a big step forward in regulating AI. It’s a move toward stricter rules and oversight, building on previous voluntary commitments from major AI companies. However, the success of these rules depends on how strictly they are followed and maintained, which could change with different governments.
Despite uncertainties, the administration sees this as a significant leap forward in ensuring AI’s safety, security, and trust. The AI Executive Order marks a decisive move in global AI governance.
Featured image credit: Donghun Shin/Unsplash