Microsoft recently announced Microsoft Security Copilot, a new security offering that combines the GPT-4 language model, the well-known OpenAI technology that also powers ChatGPT and Microsoft Bing, with Microsoft's own technology, data, knowledge, and a security-specific model built from its ongoing global threat intelligence.

As the name indicates, the new solution, which is not publicly available yet, is positioned as a "copilot" that augments and empowers security analysts in SOCs (Security Operations Centers) and CDCs (Cyber Defense Centers).

A use-case specific application of GPT-4

Microsoft Security Copilot resembles ChatGPT to a certain extent but also differs from it significantly. Users can prompt the solution and receive answers. These answers are not just text; they also contain flowcharts and visuals that detail the anatomy of the incident under investigation. All or selected parts of an analysis can be stored on a pinboard that summarizes the investigation and can be shared with others.

The solution also comes with predefined promptbooks. Promptbooks are pre-engineered collections of prompts for various use cases, such as malware analysis. Thus, the user does not have to work out the right questions (or prompts) but can rely on preconfigured analytical steps. Microsoft provides promptbooks, and users can add and share their own.
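Conceptually, a promptbook can be pictured as an ordered, reusable sequence of prompt templates that get filled in with case-specific details and run one after another. The sketch below is purely illustrative: the class, field names, and prompts are my own assumptions for explaining the idea and do not reflect Security Copilot's actual format or API.

```python
# Illustrative sketch of the promptbook concept only; the structure and
# field names here are assumptions, not Security Copilot's actual format.
from dataclasses import dataclass, field


@dataclass
class Promptbook:
    """An ordered, reusable sequence of prompt templates for one use case."""
    name: str
    prompts: list = field(default_factory=list)

    def render(self, **context):
        # Fill each template with case-specific details (e.g. a file hash)
        # to produce the concrete prompts an analyst would run in order.
        return [p.format(**context) for p in self.prompts]


# A hypothetical malware-analysis promptbook.
malware_pb = Promptbook(
    name="Malware analysis",
    prompts=[
        "Summarize what is known about the file with hash {sha256}.",
        "Which hosts in the environment have executed this file?",
        "Outline containment steps for the affected hosts.",
    ],
)

# Render the promptbook for a concrete case (hash shortened for brevity).
steps = malware_pb.render(sha256="e3b0c44298fc1c14...")
for step in steps:
    print(step)
```

The point of the abstraction is that the analytical sequence is captured once and can then be reused and shared, which is what makes promptbooks valuable to less experienced analysts.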

Log analysis, context, and resolutions

Microsoft Security Copilot analyzes log files and uses the data from Microsoft's global threat intelligence. It helps identify affected systems and accounts and reconstruct attacks, so that the accounts that were compromised first can be identified. It currently works closely with Microsoft Sentinel and Microsoft Defender, with other integrations to be added. Given the multitude of security tools many organizations have deployed, openness to integration with other vendors' solutions is something I'd like to see in the future.

All of this still requires some level of security skill, but far less than a manual analysis would. While I am somewhat skilled in security, I wouldn't claim to be capable of doing threat analysis and investigation myself. However, in the demo I received from Microsoft, there was nothing in any of the responses I did not understand. In other words: Security Copilot would have turned me into a security analyst, by augmenting me.

The tool can even create a PowerPoint file with the condensed results of an investigation, for instance for reporting to management. As mentioned, the solution will be integrated with all major Microsoft security services but remain a separate offering.

Augmenting Intelligence

Microsoft Security Copilot, even though it has only just been announced and is, as of now, integrated with only some of the Microsoft security products, is an impressive showcase of the benefits and business value AI can deliver. It can help close the skills gap in security and speed up security analysis. It does so by applying a specific model and a vast amount of global data to a customer's concrete log information. Customer data is not shared and is not used for machine learning purposes; however, users can rate the quality of responses.

I must admit that I'm impressed by what I have seen so far: a solution that delivers tangible benefits and that empowers and augments human analysts instead of threatening to replace them. This is exactly what we should expect from emerging AI applications, in cybersecurity and in general.

See also