Google has dropped its long-held restrictions on applying artificial intelligence technology to the development or use of weapons and surveillance equipment.
The rules were removed quietly, a change discovered after the publication of the company's 2024 "Responsible AI" report and first reported by The Washington Post.
Google introduced its AI principles in 2018, promising not to use AI for weapons or surveillance, or for other applications likely to cause overall harm.
The company further vowed not to pursue AI applications that violate international law and human rights.
In the latest version of the policy, these explicit restrictions have been removed.
Google's AI leaders, Demis Hassabis and James Manyika, said the company will now invest in AI projects that benefit people and society while weighing the potential risks they pose.
The move contrasts sharply with Google's biggest controversy of 2018, when the company was pressured into withdrawing from its "Maven" AI project.
Project Maven was a partnership with the U.S. military to develop AI-powered drone surveillance technology.
Google employees protested strongly, forcing the company to pull out of the surveillance work.
Now, however, Google appears willing to let its AI be used for surveillance purposes again.
Google is not alone in offering AI to government defense agencies.
OpenAI and Anthropic also maintain deep relationships with the U.S. government and its defense agencies.
This reflects the deepening ties between technology companies and national security agencies.
Interestingly, Google has also stayed quiet on other contentious policy issues.
Recently, when U.S. President Donald Trump said he wanted to rename the Gulf of Mexico as the Gulf of America, Google accepted the change without objection, saying it would adopt the new name once it appeared in official records.
With this policy change, Google joins other major tech companies in shifting their stance on the use of AI and moving closer to government and defense agencies.
This shift could reshape how AI is used, and the impact it has, in the years ahead.