Microsoft is set to release an artificial intelligence (AI) tool on April 1, 2024, aimed at helping cybersecurity experts compile summaries of suspicious incidents and detect hackers’ attempts to conceal their activity.
Introduction
The AI tool, known as Security Copilot, was introduced about a year ago and has been in testing with corporate clients since. According to Andrew Conway, Microsoft’s Vice President of Security Marketing, “hundreds of partners and clients are currently participating in the testing”. As with its Azure cloud services, Microsoft will charge for the use of Security Copilot’s features.
AI Errors and Cybersecurity Concerns
Addressing the fact that AI can occasionally make mistakes, errors that can be costly in cybersecurity, Conway emphasised that Microsoft is paying particular attention to this issue. Security Copilot combines OpenAI’s capabilities with Microsoft’s vast troves of security information.
“Given the seriousness of this direction, we are striving to eliminate potential risks. False positives and false negatives still exist in computer security products. It’s inevitable,” said Conway.
Functionality of Copilot
Copilot works with all of Microsoft’s security and privacy software and offers a dedicated dashboard that summarises data and answers questions. The company illustrated the tool with an example: a security program gathers warnings and combines them into incidents that the user can open with one click, and Copilot then prepares a report based on that information, a process that would otherwise take considerable time to complete manually. Security Copilot also aims to summarise hacker activities and infer their likely intentions.
Benefits to the Cybersecurity Field
According to Conway, Security Copilot will allow seasoned cybersecurity experts to tackle more complex tasks and newcomers to get up to speed and develop their skills more quickly. Microsoft’s data shows that novices using Security Copilot worked 26% faster and 35% more accurately. More experienced staff will be able to ask the AI questions in plain English, using the field’s standard terminology.
“Criminals are getting faster, so we need to speed up—and this tool is exactly what we need. It’s not perfect yet, but it will be perfected over time,” said Chip Calhoun, Vice President of Cybersecurity at oil giant BP Plc, which has been testing Security Copilot.