August 20, 2024 | 10 min read
Ivan Fioravanti
Ivan Fioravanti, Co-founder and CTO of CoreView, uses his systems engineering and .NET development skills to lead CoreView's technology team. He's passionate about AI, automation, and all things Microsoft 365.

According to Microsoft, Copilot for Microsoft 365 was adopted by 75 million users in its first three months. And research shows that Copilot could boost user productivity by as much as 70%. Yet, despite repeated reassurances about Copilot's safety, IT teams are rightfully cautious.

Aside from its high price tag, which can drive up M365 costs by as much as 83%, Copilot has unrestricted access to your M365 data and applications organization-wide. So, here's a detailed guide to getting your Office 365 ecosystem up to speed for the increased security risks of Microsoft Copilot, covering key issues like risky default settings, oversharing, and compliance.

By making data more easily accessible, Copilot spells the end of Microsoft 365 security by obscurity. Here’s how you can be prepared for what comes next.

Is Microsoft 365 Copilot Safe?

Copilot for Microsoft 365 is a sophisticated AI assistant that integrates with your existing M365 apps like Word, Excel, PowerPoint, Outlook and Teams. It leverages large language models (LLMs) and the emails, chats, and documents in your Microsoft Graph. This enables Copilot to provide intelligent, useful, and context-aware assistance to enhance user productivity and creativity.  

However, this deep integration with your organization’s existing data introduces a new security challenge. Copilot has access to all the data an individual user can access based on their existing M365 permissions. If those permissions are too broad or not well-managed, it could enable Copilot to inadvertently surface sensitive data to unauthorized users in Microsoft 365.  

All of this makes security best practices like permissions management, sensitivity labels, and continuous monitoring much more important under the new data-sharing model. While Copilot may benefit from the same industry-leading security and privacy controls as the rest of Microsoft 365, the way it accesses and shares data presents all-new security challenges that could make your company more vulnerable to information leaks and cyberattacks.

Unpacking Microsoft 365’s Copilot Security Model

Microsoft 365 Copilot introduces powerful AI capabilities, but also new security considerations. Let's dive into the native security model that Microsoft has built around Copilot to protect your data and ensure proper compliance.

  • Data Access Control: Copilot leverages Microsoft 365's existing Role-Based Access Control (RBAC) framework. This means access to Copilot and the data it processes is governed by the same permissions model you're already using, ensuring only authorized users can interact with sensitive information. Permissions can be finely tuned, controlling which data sources Copilot can access and how it can use that data (see the permission-trimming sketch below).
  • Data Handling and Privacy: Copilot is designed with data minimization in mind, accessing only the data necessary to perform its functions. All data, both at rest and in transit, is encrypted using industry-standard protocols like TLS/SSL, IPSec, and AES to protect it from unauthorized access.
  • Compliance and Governance: Microsoft 365 and Copilot comply with major industry standards and regulations, including GDPR, HIPAA, ISO/IEC 27001, and more. For organizations with data residency and sovereignty requirements, Microsoft provides options to ensure compliance, such as storing data in specific geographic regions.
  • Monitoring and Auditing: Copilot maintains comprehensive logs of all activities, including data accessed and actions taken. This enables full auditability and traceability. Advanced analytics and machine learning algorithms are employed to detect unusual or suspicious activities associated with Copilot.
  • AI and Model Security: Microsoft ensures the integrity of the AI models powering Copilot, protecting them against tampering and adversarial attacks. The models are continuously updated and improved to address new security challenges and enhance performance.
The Copilot Audit Log Report in CoreView
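
To make the data access control point concrete, here is a minimal Python sketch of permission-trimmed retrieval, the core idea behind Copilot's RBAC-based model. Everything here (Document, CORPUS, search_for_user) is an illustrative stand-in, not a real Microsoft API.

```python
from dataclasses import dataclass

@dataclass
class Document:
    title: str
    body: str
    acl: set[str]  # user/group IDs allowed to read this document

CORPUS = [
    Document("Q3 board deck", "...", acl={"grp-executives"}),
    Document("Travel policy", "...", acl={"grp-all-employees"}),
]

def search_for_user(query: str, user_groups: set[str]) -> list[Document]:
    """Return only documents the requesting user is already permitted to read."""
    readable = [d for d in CORPUS if d.acl & user_groups]
    return [d for d in readable if query.lower() in d.title.lower()]

# An intern surfaces the travel policy but never the board deck, no matter
# how the prompt is phrased, because retrieval is trimmed before generation.
print([d.title for d in search_for_user("", {"grp-all-employees"})])  # ['Travel policy']
```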

Built-in Security Measures in Microsoft Copilot

What kind of security measures has Microsoft built into Copilot? From robust encryption and access controls to secure development practices and compliance audits, Copilot is built to meet the stringent security requirements of modern enterprises. Up next, let's talk about the specific features and components Microsoft has designed for this new security model.

  • Data Security: Copilot employs strong encryption protocols to safeguard your data. Data at rest is protected using the Advanced Encryption Standard (AES) with 256-bit keys, a highly secure encryption algorithm. For data in transit, Copilot utilizes Transport Layer Security (TLS) to encrypt communications between users and Microsoft 365 services. Access to Copilot is tightly controlled through strict access control mechanisms. Multi-factor authentication (MFA) is enforced, requiring users to provide additional verification beyond just a password. Moreover, conditional access policies allow administrators to define granular rules based on factors like user location, device compliance, and risk level. (A short encryption sketch follows this list.)
  • Operational Security: Microsoft follows a rigorous Secure Development Lifecycle (SDL) when building Copilot. This includes extensive security testing, code reviews, and threat modeling to identify and mitigate potential vulnerabilities early in the development process. Copilot's security posture is continuously monitored, and any identified vulnerabilities are promptly patched to protect against emerging threats. Microsoft's dedicated security teams work round the clock to detect and respond to security incidents.
  • Privacy and Compliance: Microsoft is committed to upholding stringent data privacy principles. Copilot is designed to handle user data in compliance with global privacy laws and regulations, such as GDPR and HIPAA. Microsoft provides transparent information about its data handling practices and gives users control over their data. To ensure compliance, Copilot undergoes regular third-party audits against industry standards like ISO/IEC 27001, SOC 1, and SOC 2. These audits validate the effectiveness of Copilot's security controls and its adherence to compliance requirements.
  • AI and Model Security: As an AI-powered assistant, Copilot's AI models are subject to rigorous security measures. Microsoft conducts regular adversarial testing to assess the resilience of these models against potential attacks. This helps identify and fix any vulnerabilities that could be exploited by malicious actors. Copilot's AI capabilities are grounded in Microsoft's Ethical AI Framework, which ensures transparency, fairness, accountability, and responsible use of AI. The framework guides the development and deployment of Copilot to maintain trust and mitigate potential risks.
  • User and Admin Controls: Copilot provides a comprehensive admin dashboard that empowers administrators to configure, monitor, and manage Copilot's operations. Admins can set access controls, define data sharing preferences, monitor usage patterns, and receive security alerts, giving them full visibility and control over Copilot within their organization. To help users understand how to use Copilot securely, Microsoft offers extensive training resources and user education materials. These cover topics like data protection, secure collaboration, and best practices for interacting with AI.
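
As a small, hedged illustration of the at-rest protection described in the first bullet, here is a generic AES-256-GCM round trip using the third-party cryptography package (pip install cryptography). It demonstrates the algorithm class Microsoft names, not Microsoft's actual implementation.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # a 256-bit key, as in AES-256
nonce = os.urandom(12)                     # GCM's standard 96-bit nonce
aesgcm = AESGCM(key)

plaintext = b"Quarterly revenue forecast: confidential"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)  # None = no extra AAD
assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext
print(f"{len(ciphertext)} bytes stored; unreadable without the key")
```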

Common Microsoft 365 Copilot Security Concerns

While Microsoft 365 Copilot offers powerful AI capabilities to enhance productivity, it also introduces new security risks that organizations must carefully consider and address. These are often a result of the more accessible and open-ended data sharing model used by Copilot, which can make it much easier for sensitive information to fall into the wrong hands.

Here’s a look at the ten most common security threats presented by Microsoft Copilot, based on Gartner’s “Top Ten Gotchas of Copilot for Microsoft 365”:

Risky Configuration Settings Enabled by Default

Out of the box, Copilot may have access to sensitive data without appropriate safeguards in place. Default settings could allow Copilot to interact with external plugins and access web content, introducing new attack surfaces.

Reporting Tools Lack Granularity

Native Microsoft 365 reporting tools often lack the detail needed to effectively govern Copilot usage and mitigate risks. Without granular insights into how Copilot is being used, it's difficult to identify potential areas of concern, such as users accessing sensitive data inappropriately.

Confusing Options for Extending Licenses and Managing Costs

The range of licensing choices for extending Copilot capabilities can be confusing, leading to sub-optimal decisions. Without clear guidance, organizations risk making licensing missteps that drive up costs unnecessarily.

Increased Risks of Oversharing

Copilot can inadvertently expose existing security gaps, making it easier for users to discover and share information they shouldn't have access to. If a user has excessive permissions, Copilot's powerful search capabilities could allow them to surface and share sensitive data.

Introduction of New Attack Surfaces to Monitor and Protect

Copilot's broad access to data across Microsoft 365 creates additional security risks. If a user's account is compromised, an attacker could leverage Copilot to extract confidential information. The AI models powering Copilot also present potential vulnerabilities that could be exploited.

No Ability to Prioritize Content Sources

Without the ability to prioritize content sources, Copilot may struggle to surface the most pertinent information for a given query. It could pull in data from less relevant or even sensitive sources, potentially leading to data exposure.

Increased Content and App Sprawl

Copilot makes it easier for users to generate new content, which can lead to a proliferation of unmanaged and duplicative data if proper governance isn't in place. Organizations may face increased storage costs and compliance risks.

New Retention and Compliance Challenges

The dynamic and conversational nature of Copilot interactions can make traditional Microsoft 365 compliance approaches difficult to apply. Properly retaining and managing Copilot-generated data for legal, regulatory, and business requirements becomes a new challenge.

Inconsistent Capabilities Across Applications and Languages

Inconsistencies in Copilot's functionality across different Microsoft 365 apps and languages can lead to a fragmented user experience. These inconsistencies can create confusion and frustration among employees, leading to reduced trust and usage of Copilot.

Higher Than Expected Change Management Effort

Underestimating the change management needs for Copilot adoption can lead to slow uptake, user resistance, and sub-optimal return on investment. Effective change management requires a comprehensive approach addressing user awareness, training, support, and ongoing reinforcement.

Real-World Examples of Microsoft 365 Security Issues

At CoreView, we work with IT teams of all sizes to simplify, automate, and secure their Microsoft 365 business ecosystem. Our team has managed over 25 million Microsoft 365 licenses to date for clients including Asmodee, Baker Tilly, The City University of New York, and Jefferson County Library.

As you continue on your Copilot adoption journey, here are some real-world examples of potential security incidents that you should plan for, based on our experience dealing with various scenarios.

Misconfigured SharePoint Online Permissions Lead to Data Exposure

SharePoint Online allows for granular control over permissions. However, if these permissions are not properly configured, it can lead to sensitive data being exposed to unauthorized users. For example, if a SharePoint site is inadvertently set to allow access to all employees in M365, confidential documents stored there could be viewable by anyone in the company.

Copilot makes it even easier for SharePoint data to fall into the wrong hands if permissions aren’t set properly. Regularly reviewing and auditing SharePoint permissions is crucial to prevent data leaks.
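
One programmatic way to catch this class of problem is to walk a document library through the Microsoft Graph REST API and flag organization-wide or anonymous sharing links. The sketch below uses the v1.0 driveItem and permission resources; the access token and drive ID are placeholders, and paging and error handling are omitted for brevity.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <token-with-Files.Read.All>"}  # placeholder
DRIVE_ID = "<document-library-drive-id>"                           # placeholder

items = requests.get(f"{GRAPH}/drives/{DRIVE_ID}/root/children",
                     headers=HEADERS).json().get("value", [])

for item in items:
    perms = requests.get(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/permissions",
        headers=HEADERS).json().get("value", [])
    for perm in perms:
        scope = perm.get("link", {}).get("scope")
        # "organization" links are visible company-wide; "anonymous" to anyone
        if scope in ("organization", "anonymous"):
            print(f"REVIEW: '{item['name']}' carries a {scope}-scope link")
```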

Data Leakage Through Improper Use of Microsoft Teams

Microsoft Teams has become a hub for collaboration, but it also introduces new risks of data leakage if not properly governed, risks that Copilot's increased discoverability can further compound.

Users may inadvertently share sensitive files or have inappropriate conversations in Teams channels. There have also been vulnerabilities discovered that allowed external parties to access Teams chats and meetings. Organizations need to establish Teams usage policies, restrict external sharing, and monitor activity to prevent data exposure.

Compliance Violations Due to Auto-Generated Content

With Copilot's ability to auto-generate emails, documents, and messages, there is a risk that this AI-created content could violate compliance regulations like HIPAA, GDPR, or FINRA.  

For example, Copilot might include personal health information in an email it drafts if not properly prompted, leading to a HIPAA violation. The generated content may also fail to meet requirements around data handling and retention. Companies need governance frameworks to ensure Copilot-produced content adheres to all applicable regulations.

How Copilot Uses Your Microsoft 365 Data

Understanding the way Copilot handles your information within and outside the Microsoft 365 service boundary is the first step to protecting your organizational data. While there are a lot of checks and balances to make sure your data is handled securely, plenty of scenarios still exist where your company data could be compromised. Let’s explore that now.

How the Data Handling Mechanism in Copilot Works

Microsoft Copilot for Microsoft 365 provides AI-powered assistance by combining large language models (LLMs) with your organizational data accessed through Microsoft Graph, such as emails, chats, and documents that you have permission to view. Copilot only surfaces data to individual users who have at least view permissions, respecting the permission models in Microsoft 365 services like SharePoint.

When you enter prompts, the information in your prompts, the retrieved data, and the generated responses remain within the Microsoft 365 service boundary, in line with Microsoft's privacy, security, and compliance commitments. Importantly, prompts, responses, and data accessed through Microsoft Graph are not used to train the foundation LLMs, including those used by Copilot.
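
Put schematically, the flow looks something like the sketch below. Both helpers are hypothetical stand-ins for Graph retrieval and the LLM call; the point is the ordering: permissions are applied at retrieval time, before the model ever sees any data.

```python
def graph_retrieve(prompt: str, user_id: str) -> list[str]:
    """Stand-in for Microsoft Graph retrieval: returns only content the
    user already holds at least view permission on."""
    return [f"document snippet visible to {user_id}"]

def llm_complete(prompt: str, grounding: list[str]) -> str:
    """Stand-in for the LLM call. Prompts, grounding data, and responses
    stay inside the Microsoft 365 service boundary and are not used to
    train the foundation models."""
    return f"response grounded in {len(grounding)} permitted snippet(s)"

query = "Summarize the project X status"
print(llm_complete(query, graph_retrieve(query, "alice@contoso.com")))
```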

When Your Data Leaves the Microsoft 365 Service Boundary

In certain circumstances, your organization's data may leave the Microsoft 365 service boundary when using Copilot. As an IT Director or CISO, it's important that you understand how and when this happens so that you can take the appropriate steps to secure it:

  • Customer Feedback: When you provide feedback about your Copilot experience, additional diagnostic data related to your specific interaction may be collected with your permission, such as the document you were working on or the email you replied to. This data is used solely to improve Copilot and is not used to train LLMs.
  • Third-party Plugins: If your organization allows third-party plugins for Copilot, your data may be sent to the plugin provider when you use those plugins. Admins can control which plugins are allowed and monitor plugin usage.
  • Prompts and Responses: If you copy Copilot-generated content and share it outside of Microsoft 365 apps, such as pasting it into a third-party app, emailing it externally, or posting it on a public website, that content is no longer protected by Microsoft's security and compliance measures.
  • User Interactions: Diagnostic data about how users interact with Copilot, such as features used and time spent, may be collected to improve the product experience. This data does not include the contents of prompts or responses.
  • Web Content Plugin: If the Bing Web Content plugin is enabled, Copilot may access public web content to help answer queries. The contents of those queries are not linked back to individual users or used to train LLMs.
How Microsoft Copilot Uses Your Data

How to Set Up Microsoft 365 Copilot Security: Step-By-Step Guide

A well-planned and executed Copilot security setup will help you maintain control over your M365 data, ensure compliance with regulations, and minimize the risk of data breaches or unauthorized access. To establish a robust security framework for M365 Copilot, you'll need to follow a series of steps that involve identifying sensitive data, reviewing sharing policies, implementing data classification and protection measures, and enforcing access controls.  

By using Microsoft 365's built-in security and compliance tools, such as Microsoft Purview and Azure AD, you can create a comprehensive and effective security setup tailored to your organization's needs. Where appropriate, you can also use premium third-party solutions like CoreView to further bolster your security framework.

Determine What Constitutes Sensitive Data

The first step is to clearly define what data is considered sensitive in your organization. This typically includes personally identifiable information (PII), protected health information (PHI), financial data, intellectual property, and other confidential business information. Engage with stakeholders from legal, compliance, HR, etc. to create a comprehensive list of sensitive data types.
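
For a feel of what a sensitive information type boils down to, here are two common patterns as bare Python regular expressions. Real Purview SITs add checksum validation, supporting keywords, and confidence levels, so treat these as a rough sketch only.

```python
import re

SENSITIVE_PATTERNS = {
    "US SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "Credit card (13-16 digits)": re.compile(r"\b(?:\d[ -]?){12,15}\d\b"),
}

def classify(text: str) -> list[str]:
    """Return the names of sensitive types detected in the text."""
    return [name for name, rx in SENSITIVE_PATTERNS.items() if rx.search(text)]

print(classify("Payroll note: SSN 123-45-6789 on file."))  # ['US SSN']
```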

Locate Sensitive Data Within Microsoft 365

Use Microsoft Purview's built-in sensitive information types (SITs) and data classification capabilities to discover sensitive data across your Microsoft 365 environment, including Exchange Online, SharePoint Online, OneDrive for Business, and Teams. Leverage the content explorer in the Microsoft Purview compliance portal to view occurrences of sensitive data and fine-tune your SITs for improved accuracy.
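
As a complement to Purview's discovery, here is a toy local sweep over exported text files using the same pattern idea as the previous sketch (repeated here so the snippet stands alone). The directory path is hypothetical.

```python
import re
from pathlib import Path

PATTERNS = {"US SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b")}

def sweep(root: str) -> None:
    """Report which files contain matches for any sensitive pattern."""
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        hits = [name for name, rx in PATTERNS.items() if rx.search(text)]
        if hits:
            print(f"{path}: {', '.join(hits)}")

sweep("./m365-export")  # hypothetical folder of exported content
```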

Review Internal Sharing Policies

Assess your current internal data sharing policies to ensure they align with your organization's security and compliance requirements. Determine which user groups should have access to sensitive data and under what circumstances. Consider implementing least privilege access, where users are only granted the minimum permissions necessary to perform their job functions.

Review External Sharing Policies

Evaluate your external sharing policies to control how sensitive data is shared outside your organization. Determine if external sharing should be allowed, and if so, what restrictions and safeguards should be in place. Use Microsoft 365's external sharing settings to enforce these policies, such as limiting external sharing to specific domains or requiring external users to authenticate.
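
To verify the tenant-wide posture programmatically, Microsoft Graph exposes SharePoint tenant settings. The sketch below reads the sharingCapability property (v1.0 sharepointSettings resource, SharePointTenantSettings.Read.All permission); token acquisition is elided and the bearer token is a placeholder.

```python
import requests

resp = requests.get(
    "https://graph.microsoft.com/v1.0/admin/sharepoint/settings",
    headers={"Authorization": "Bearer <token>"},  # placeholder token
)
settings = resp.json()
print("sharingCapability:", settings.get("sharingCapability"))
# "externalUserAndGuestSharing" is the most permissive value; stricter
# options include "existingExternalUserSharingOnly" and "disabled".
```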

Establish a Data Classification System

Develop a data classification system that categorizes data based on its sensitivity level (e.g., public, internal, confidential, highly confidential). This classification system will serve as the foundation for applying sensitivity labels and enforcing data protection policies. Train employees on the classification system and their responsibilities for handling sensitive data.
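
In code form, a four-tier scheme like the one suggested above is just a mapping from tier to handling rules. The rules below are illustrative policy choices, not product defaults.

```python
from enum import Enum

class Tier(Enum):
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    HIGHLY_CONFIDENTIAL = 4

HANDLING = {
    Tier.PUBLIC:              {"encrypt": False, "external_share": True},
    Tier.INTERNAL:            {"encrypt": False, "external_share": False},
    Tier.CONFIDENTIAL:        {"encrypt": True,  "external_share": False},
    Tier.HIGHLY_CONFIDENTIAL: {"encrypt": True,  "external_share": False},
}

print(HANDLING[Tier.CONFIDENTIAL])  # {'encrypt': True, 'external_share': False}
```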

Assess Access Controls

Review your current access controls to ensure they are granular enough to protect sensitive data. Consider implementing role-based access control (RBAC) to grant permissions based on job functions. Use Microsoft 365's built-in tools, such as Azure AD Conditional Access and Microsoft Defender for Cloud Apps, to enforce access controls based on user identity, device health, location, and other risk factors.
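
Before broadening Copilot access, it can help to inventory your Conditional Access policies. The sketch below lists them via Microsoft Graph (v1.0, Policy.Read.All permission); authentication is elided and the token is a placeholder.

```python
import requests

resp = requests.get(
    "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies",
    headers={"Authorization": "Bearer <token>"},  # placeholder token
)
for policy in resp.json().get("value", []):
    # state is "enabled", "disabled", or "enabledForReportingButNotEnforced"
    print(f"{policy['displayName']}: state={policy['state']}")
```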

Implement Sensitivity Labels

Use Microsoft Purview Information Protection to create and apply sensitivity labels to classify and protect sensitive data. Configure labels to enforce protection settings, such as encryption, watermarking, and access restrictions. Apply labels automatically using auto-labeling policies based on sensitive information types or manually by users. Extend labeling to Microsoft 365 apps, SharePoint, Teams, and Power BI.
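
The decision logic behind an auto-labeling policy can be pictured as a simple mapping from detected sensitive types to a label. Purview evaluates this server-side with far richer conditions; this sketch only shows the shape of the rule.

```python
def pick_label(detected_types: list[str]) -> str:
    """Map detected sensitive information types to a sensitivity label."""
    high_risk = {"US SSN", "Credit card (13-16 digits)"}
    if high_risk & set(detected_types):
        return "Highly Confidential"
    if detected_types:
        return "Confidential"
    return "General"

print(pick_label(["US SSN"]))  # Highly Confidential
```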

Revise Access Controls

Once sensitivity labels are implemented, revisit your access controls to ensure they align with the labels. Use label-based access controls to automatically grant or restrict access based on the sensitivity level of the data. For example, limit access to highly confidential data to a specific group of users and require multi-factor authentication (MFA) for access.

Enforce Sharing and Creation Restrictions

Use sensitivity labels to enforce sharing and creation restrictions on sensitive data. For example, apply a label that prevents printing, copying, or sharing of highly confidential documents. Use data loss prevention (DLP) policies to detect and block the sharing of sensitive data based on labels or information type.
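
In miniature, a DLP sharing rule is a predicate evaluated before content leaves the tenant. Real enforcement happens inside Microsoft 365 DLP policies; this hedged sketch only illustrates the rule logic.

```python
BLOCK_EXTERNAL = {"Confidential", "Highly Confidential"}

def can_share_externally(label: str, recipient_domain: str,
                         allowed_domains: set[str]) -> bool:
    """Deny external sharing for restricted labels; otherwise apply an allowlist."""
    if label in BLOCK_EXTERNAL:
        return False
    return recipient_domain in allowed_domains

print(can_share_externally("Highly Confidential", "partner.com", {"partner.com"}))  # False
print(can_share_externally("General", "partner.com", {"partner.com"}))              # True
```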

Plan for Data and Access Control Lifecycle

Develop processes for managing the lifecycle of sensitive data and access controls. Regularly review and update your data classification system, sensitivity labels, and access controls to ensure they remain relevant and effective. Implement data retention and disposal policies to securely remove sensitive data when it is no longer needed.
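
As a toy version of the lifecycle review, the check below flags content whose assumed retention window has lapsed. Actual retention and disposal are enforced by Purview retention policies and labels; the periods here are made-up examples.

```python
from datetime import date, timedelta

RETENTION = {
    "Confidential": timedelta(days=7 * 365),  # example: keep 7 years
    "General": timedelta(days=2 * 365),       # example: keep 2 years
}

def past_retention(label: str, created: date, today: date) -> bool:
    """True when the item has outlived its retention period."""
    return today - created > RETENTION.get(label, timedelta.max)

print(past_retention("General", date(2020, 1, 1), date(2024, 8, 20)))  # True
```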

Set Up a Pilot Before Releasing It to the Entire Company

Before rolling out your M365 Copilot security setup to the entire organization, conduct a pilot with a select group of users. This pilot will help you identify any issues or gaps in your configuration and gather user feedback. Use the insights from the pilot to refine your setup before deploying it company-wide. Provide training and support to users during the rollout to ensure successful adoption.

Security Playbook for Microsoft 365 Copilot

IT and Security leaders looking to embrace Copilot confidently need a sharp, proactive strategy that locks down vulnerabilities and ensures strict compliance. Our Security Playbook for Microsoft 365 Copilot offers distilled guidance to fortify your environment against the unique risks Copilot introduces.

Download the playbook for:

  1. Best practices for securing Copilot
  2. Prioritized action items across the full Copilot adoption lifecycle
  3. Recommended tools to address any security gaps

From managing permissions to deploying real-time threat detection, get started today to secure Microsoft 365 and Copilot.  

Microsoft 365 Copilot Frequently Asked Questions

Is Microsoft 365 Copilot safe to enable?
Microsoft 365 Copilot comes with substantial security measures, such as role-based access control, encryption of data at rest and in transit, compliance with industry standards like GDPR and HIPAA, and continuous monitoring for vulnerabilities.
However, the deep integration with organizational data does introduce new security challenges. Before enabling Copilot, be sure you understand:
  • Data Access: Find out what data Copilot can access, and adjust permissions in Microsoft Graph according to what it should and should not have access to.
  • Default Settings: Review default access settings to prevent unintentional exposure of sensitive data.
  • Integration Points: Be cautious of integration with third-party plugins as they can introduce new security risks.
How does M365 Copilot affect data privacy and security?
Copilot inherits each user's existing permissions, so misconfigured permissions in Teams and SharePoint could lead to unwanted data sharing within those platforms. Also, be mindful of the auto-generated content Copilot creates, especially if it involves sensitive or regulated information.
What compliance standards does Copilot adhere to?
Microsoft 365 Copilot complies with multiple industry standards, including GDPR, HIPAA, and ISO/IEC 27001. It is subject to regular audits to ensure these compliance measures remain effective.
How does Microsoft Copilot handle our data?
Copilot combines large language models with your organizational data accessed through Microsoft Graph, such as emails, chats, and documents you have permission to view. Prompts, retrieved data, and generated responses remain within the Microsoft 365 service boundary and are not used to train the foundation LLMs.
What permissions does Copilot use to access data?
Copilot uses the existing Microsoft 365 Role-Based Access Control (RBAC) framework, meaning it only accesses data that users are already authorized to view, based on their permissions within Microsoft 365 services.
How can I secure my data when using Copilot?
  1. Identify Sensitive Data: Use tools like Microsoft Purview for data discovery and classification.
  2. Configure Access Controls: Set granular permissions using Azure AD (now Entra ID) and RBAC.
  3. Implement Sensitivity Labels: Protect data effectively with proper labeling.
Does M365 Copilot collect data?
Yes, Microsoft Copilot collects data related to how users interact with the system. This includes diagnostic data about user interactions and features used. Additional diagnostic data may be collected when users provide feedback.
What data does Microsoft Copilot have access to?
Copilot has access to all data that an individual user can access based on their existing Microsoft 365 permissions. It can leverage emails, chats, documents, and other data available through Microsoft Graph.
Does Copilot send data to Microsoft?
In general, Copilot operates within the Microsoft 365 boundary but may send limited data back to Microsoft for purposes like customer feedback or when third-party plugins are used.
Does Copilot share your data?
Copilot can share data within the Microsoft 365 environment based on user permissions, and this introduces some security challenges. However, unauthorized sharing largely depends on misconfigured permissions rather than Copilot itself being designed to overtly share data externally.
Can Microsoft Copilot share data across apps?
Yes, Copilot can share data across Microsoft 365 apps based on user permissions. It integrates with apps like Word, Excel, PowerPoint, Outlook, and Teams, accessing data through connected services like Microsoft Graph. This cross-app sharing capability is a function of the integrated nature of the Microsoft 365 ecosystem.
What are some of the security risks associated with M365 Copilot?
Key security risks include misconfigured permissions leading to unauthorized data access, potential oversharing of sensitive information, and increased vulnerability due to the broad access Copilot provides to Microsoft 365 data. There are also new attack surfaces and the need to manage content sprawl effectively.
How can organizations secure their Copilot usage?
Organizations should implement robust governance frameworks, establish a data classification system, manage permissions diligently, and use tools like CoreView for comprehensive oversight and sprawl management. Periodic reviews and pilot testing can also improve security.

Automate Microsoft 365 Copilot Governance with CoreView

As organizations prepare to adopt Microsoft 365 Copilot, establishing robust governance is essential to ensure security, manage data sprawl, and optimize usage. CoreView provides a comprehensive solution that addresses these critical areas, helping organizations navigate the complexities of Copilot while maintaining control over their Microsoft 365 environment.  

General Governance

Implementing proper governance in Microsoft 365 is crucial, especially with the introduction of Copilot. CoreView enhances governance by providing predefined security baselines that ensure best practices are applied from the outset. This includes configuring access controls, data governance policies, and security hardening measures tailored to your organization's needs. By automating these configurations, CoreView helps organizations avoid the risks associated with default settings that may expose sensitive data.

Sprawl Management

With the rise of Copilot, managing content and collaboration sprawl becomes increasingly important. CoreView empowers organizations to gain visibility into collaboration spaces, such as Teams chats and SharePoint sites, particularly those that are inactive or no longer in use. By leveraging real-time health checks, CoreView identifies sprawl risks and helps IT teams decommission unnecessary collaboration spaces, reducing clutter and potential security vulnerabilities.  

Usage Management

Visibility into Copilot usage is vital for optimizing licenses and managing costs. CoreView provides detailed analytics that allow organizations to track how often users engage with Copilot. If certain users are not utilizing their licenses, IT teams can reclaim those licenses and reallocate them to users who will benefit from Copilot's capabilities. This not only helps control licensing costs but also mitigates security risks associated with unused accounts, which can be targets for attackers.

Copilot License Optimization Center in CoreView

Get a Demo to Learn More

To explore how CoreView can streamline your M365 Copilot governance, request a demo today. Discover the powerful automation tools and insights that CoreView offers to help you successfully manage your Microsoft 365 environment.
