Microsoft taking steps to ban hacking groups using its AI products

Published by TJ Denzer

As artificial intelligence technology has boomed in popularity over the last year or so, so too have bad actors attempting to use it maliciously, as Microsoft pointed out in a report this week. The company announced a ban against state-backed hacking groups using its AI tools after discovering that such groups were using products like OpenAI's large language models for hacking purposes.

Microsoft released its report and accompanying ban earlier this week, as reported by Reuters. According to the report, Microsoft found that state-backed hacking groups working on behalf of Russia, Iran, China, and North Korea had made use of OpenAI and other AI products, using large language models to hone their skills and craft deceptive messages for their targets. With this report, Microsoft has issued a ban on such use of its products, hoping to curb the use of AI by military and government-associated groups such as Russian military intelligence and Iran's Revolutionary Guard.

OpenAI and Microsoft have worked closely since even before the AI boom, with Microsoft making several key investments in the AI company.
Source: Justin Sullivan/Getty Images

Microsoft’s moves to secure its AI products against malicious use are just another wrinkle in the dangers posed by unregulated AI, which leading tech experts have regularly weighed in on. Notably, Elon Musk has shared his thoughts several times about the dangers of unregulated AI, and even joined Steve Wozniak and other industry leaders in calling for a pause on AI system training until safeguards could be developed. President Joe Biden followed in October 2023 with an executive order demanding the establishment of AI safety standards.

AI technology is still, in many ways, a wild west, and one wonders if Microsoft has the resources to put a stop to bad actors utilizing its products, especially at a government level. Nonetheless, it will be interesting to see whether this ban has an effect on such activity. Stay tuned as we continue to follow this story for the latest updates.