What you need to know
- Microsoft recently open-sourced Counterfit, an automation tool for security testing AI systems.
- The tool performs automated security risk assessments of machine learning models and AI services.
- A Microsoft survey shows that many organizations do not have the right tools in place to secure AI systems.
Earlier this week, Microsoft open-sourced Counterfit, its automation tool for security testing AI systems. The tool can be used to perform security risk assessments of AI and machine learning systems.
In its blog post announcing the release, Microsoft explains that Counterfit was "born out of our own need to assess Microsoft's AI systems for vulnerabilities with the goal of proactively securing AI services." The tool started out as a set of attack scripts written to target specific AI models, but over time it evolved into a more general automation tool.
Microsoft regularly uses Counterfit as part of its AI red team operations, relying on the tool to automate attack techniques and run them against its own AI services.
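To illustrate the kind of probing such a tool automates, here is a minimal, hedged sketch of an evasion attack against a toy linear classifier. This is purely illustrative of the general technique (a fast-gradient-style perturbation) and is not Counterfit's actual API; the model, weights, and function names are all assumptions made for the example.

```python
import numpy as np

# Toy linear "model": predicts class 1 when w.x + b > 0.
# (Illustrative stand-in for a deployed ML model under test.)
w = np.array([1.0, -2.0, 0.5])
b = 0.1

def predict(x):
    return int(w @ x + b > 0)

def fgsm_perturb(x, epsilon):
    # FGSM-style evasion: nudge the input against the sign of the
    # score's gradient. For a linear model that gradient is just w.
    if predict(x) == 1:
        return x - epsilon * np.sign(w)
    return x + epsilon * np.sign(w)

x = np.array([2.0, 0.5, 1.0])        # originally classified as 1
adv = fgsm_perturb(x, epsilon=1.5)   # small crafted perturbation
print(predict(x), predict(adv))      # prediction flips: 1 -> 0
```

A red team tool automates loops like this at scale: generate candidate perturbations, query the model, and report which inputs flip its decisions.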
Matilda Rhode, a senior cybersecurity researcher at Airbus, has spoken about why open-sourcing Counterfit is important.
The release comes as organizations pay increasing attention to AI security. Microsoft surveyed 28 organizations, including Fortune 500 companies, governments, non-profits, and small and medium-sized businesses, to see what processes they already have in place for securing AI systems. Of the 28 organizations, 25 said they do not have the right tools in place to secure their AI systems.
Sean Endicott brings nearly a decade of experience covering Microsoft and Windows news to Windows Central. He joined our team in 2017 as an app reviewer and now heads up our day-to-day news coverage. If you have a news tip or an app to review, hit him up at email@example.com.