Microsoft 365 Copilot

Security and governance innovations for Microsoft 365 Copilot from Ignite 2024

alexpozin
Nov 19, 2024

Discover the latest security and governance capabilities coming to Microsoft 365 Copilot.

Today, we are excited to share the latest security and governance capabilities coming to Microsoft 365 Copilot to help customers protect sensitive data, discover AI risks, and govern Copilot usage. 

How we’re addressing oversharing for Microsoft 365 Copilot

Effective content governance has always been crucial for maintaining the integrity, security, and relevance of organizational content. AI’s power to make content more discoverable than ever before amplifies this need. Enhancing content governance practices in this new context requires strategies and tools that streamline content management processes, keep data relevant and secure, and improve overall content quality.

Microsoft offers two powerful tools to address oversharing: SharePoint Advanced Management for site management and content governance, and Microsoft Purview for security, compliance, and governance across data and files. Today, we’re excited to share new capabilities in both areas to help customers address oversharing.

Accelerate Microsoft 365 Copilot adoption with built-in content governance  

Today we’re pleased to announce that Microsoft 365 Copilot will include built-in content governance controls and insights provided by SharePoint Advanced Management (available in early 2025) to accelerate Copilot deployments and automate oversight at scale. Learn more about this update at https://aka.ms/SAMCopilot.

SharePoint Advanced Management delivers a powerful suite of tools to bolster content governance throughout your Microsoft Copilot deployment journey, including several new capabilities to reduce the risk of oversharing, control content sprawl, and manage site lifecycle. 

 

 

Figure 1: SharePoint Advanced Management landing screen

Prevent site content from being surfaced by Copilot or organization-wide Search  

Ensuring that accountable groups or individuals within the organization can access, describe, protect, and control data quality is fundamental to governance. Restricted Content Discovery (RCD), announced in September, will be rolling out in public preview from now through early December. RCD allows you to configure policies that prevent Copilot and organization-wide search from reasoning over the content of selected sites: site access is left unchanged, but the site’s content is no longer surfaced by Copilot or organization-wide search. The policy can be applied granularly to both Team and Communication site types.
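
Restricted Content Discovery itself is configured by admins through SharePoint Advanced Management, but once a policy has taken effect you may want to spot-check that content from a restricted site no longer surfaces in tenant-wide search. Below is a minimal, illustrative sketch that runs such a check with the Microsoft Graph search API from Python; it assumes you already hold a delegated Graph access token with search permissions in an environment variable, and the query string is a placeholder for a term you know appears only in the restricted site.

```python
import os

import requests

# Assumes a delegated Microsoft Graph token (for example, obtained via MSAL) in an env var.
GRAPH_TOKEN = os.environ["GRAPH_TOKEN"]


def search_drive_items(query: str) -> list[dict]:
    """Run a tenant-wide file search via the Microsoft Graph search API."""
    resp = requests.post(
        "https://graph.microsoft.com/v1.0/search/query",
        headers={"Authorization": f"Bearer {GRAPH_TOKEN}"},
        json={"requests": [{"entityTypes": ["driveItem"], "query": {"queryString": query}}]},
    )
    resp.raise_for_status()
    hits = []
    for value in resp.json().get("value", []):
        for container in value.get("hitsContainers", []):
            hits.extend(container.get("hits", []))
    return hits


# Placeholder query: a term known to exist only in documents on the restricted site.
for hit in search_drive_items("Project Falcon budget"):
    # After the RCD policy takes effect, URLs under the restricted site should not appear here.
    print(hit.get("resource", {}).get("webUrl", ""))
```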

Restrict access to specific sites and content  

Restricted Access Control (RAC) policy insights, available now in SharePoint Advanced Management (SAM), are a powerful tool for enhancing security and governance. RAC policies allow you to restrict access to specific sites and content exclusively to designated user groups, ensuring that only authorized users can access sensitive information even if individual files or folders have been overshared. RAC policy insights enhance this by delivering reports of users who have been denied access to one or more sites through RAC policy enforcement.

 

Quickly discover access by third-party applications 

Understanding how applications interact with your content is as important as understanding how people are accessing content. Enterprise Application Insights, available in December, provides detailed reports that help you discover all the SharePoint sites that third-party applications registered in your tenant are allowed to access. It also offers insight into each application’s permissions and request counts, enabling you to take further action to strengthen site security.
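
Enterprise Application Insights delivers these reports in the SharePoint admin center. As a complementary, per-site check, the Microsoft Graph site permissions endpoint can list the application grants (for example, those made via Sites.Selected) on a single site. The sketch below is illustrative only and assumes an app-only Graph token with sufficient site permissions plus a known site ID, both supplied as placeholders.

```python
import os

import requests

GRAPH_TOKEN = os.environ["GRAPH_TOKEN"]  # assumed app-only token with sufficient Sites permissions
SITE_ID = os.environ["SITE_ID"]          # placeholder, e.g. "{hostname},{site-guid},{web-guid}"


def list_site_app_permissions(site_id: str) -> list[dict]:
    """Return the permission grants recorded on a SharePoint site."""
    resp = requests.get(
        f"https://graph.microsoft.com/v1.0/sites/{site_id}/permissions",
        headers={"Authorization": f"Bearer {GRAPH_TOKEN}"},
    )
    resp.raise_for_status()
    return resp.json().get("value", [])


for perm in list_site_app_permissions(SITE_ID):
    identities = perm.get("grantedToIdentitiesV2") or perm.get("grantedToIdentities") or []
    for identity in identities:
        app = identity.get("application", {})
        print(app.get("displayName"), app.get("id"), perm.get("roles"))
```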

Identify OneDrive sites that contain potentially overshared content  

Proper file permissions are essential for safeguarding digital assets, preserving confidentiality, and maintaining file integrity. Data access governance, available in December, provides reports that identify OneDrive sites containing potentially overshared or sensitive content. You can use these reports to assess and apply appropriate security and compliance policies.

 

Figure 2: Data access governance reports in SAM
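
The data access governance reports cover this at tenant scale. For a quick, manual spot-check of a single user’s OneDrive, you can also walk the drive with Microsoft Graph and flag items that carry “anyone” or organization-wide sharing links. The sketch below is a simplified illustration: it assumes a Graph token with file-read permissions, uses a placeholder user, only inspects top-level items, and does not page through large result sets.

```python
import os

import requests

GRAPH_TOKEN = os.environ["GRAPH_TOKEN"]   # assumed Graph token with file-read permissions
USER_UPN = "someone@contoso.com"          # placeholder user

BASE = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": f"Bearer {GRAPH_TOKEN}"}


def broadly_shared_items(user_upn: str):
    """Yield (item name, link scope) for top-level OneDrive items with broad sharing links."""
    items = requests.get(f"{BASE}/users/{user_upn}/drive/root/children", headers=HEADERS)
    items.raise_for_status()
    for item in items.json().get("value", []):
        perms = requests.get(
            f"{BASE}/users/{user_upn}/drive/items/{item['id']}/permissions", headers=HEADERS
        )
        perms.raise_for_status()
        for perm in perms.json().get("value", []):
            scope = perm.get("link", {}).get("scope")
            if scope in ("anonymous", "organization"):  # "anyone" or org-wide sharing links
                yield item["name"], scope


for name, scope in broadly_shared_items(USER_UPN):
    print(f"{name}: shared via a {scope} link")
```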

 

Verify Copilot agents are accessing content appropriately 

Copilot agents are AI assistants designed to automate and execute business processes, working alongside or on behalf of a person, team, or organization. These agents can range from simple, prompt-and-response agents to more advanced, fully autonomous agents. Copilot agent insights and actions, available in preview in December, empower you to enhance governance and productivity within your SharePoint environments. These insights provide detailed reports on how Copilot agents interact with SharePoint content, helping you monitor and manage agent activities effectively. 

By leveraging these insights, you can ensure that your Copilot agents are accessing and utilizing content appropriately, reducing the risk of unauthorized access and improving overall data governance.  

Help identify potentially over-permissioned content  

Scoped permission reports, available in preview in December, provide detailed insights into the permissions assigned to individuals and groups across SharePoint sites. These reports help identify potentially over-permissioned content, ensuring that only authorized users have access to sensitive information.

By leveraging these reports, you can enhance your security posture, streamline access management, and ensure compliance with corporate policies.  

 

Figure 3: Scoped permission report in SAM

 

We’re also excited to share general availability of several SharePoint Advanced Management capabilities announced in September: 

  • Permission state report, which helps admins discover potentially over-permissioned content across the entire tenant 
  • Site ownership policy, which enables admins to automate time-consuming tasks related to content ownership management, such as maintaining the minimum number of site owners and identifying the most appropriate and accountable individuals  
  • Inactive SharePoint sites policy, which helps customers eliminate stale or irrelevant content to minimize oversharing and narrow the scope of what needs governance  
  • Restricted Site Creation, which enables admins to restrict the creation of SharePoint sites to a specific set of users, helping mitigate content sprawl within an organization 
  • Site access review, which allows IT administrators to delegate the review of data access governance reports to the site owners of overshared sites 

To learn more about these features, please read the September announcement post Governing data for GenAI with SharePoint Advanced Management. 

 

New Microsoft Purview capabilities to help manage security, compliance, and governance across data and files  

Microsoft Purview helps you accelerate Microsoft 365 Copilot transformation by managing risks related to data and file security, compliance, and governance. Complementing the built-in content governance provided by SharePoint Advanced Management, Purview addresses concerns with data oversharing, data leakage, and non-compliant usage.  

Discover data that is at risk of oversharing 

Our new oversharing assessment in Data Security Posture Management for AI (DSPM for AI), now in public preview, helps you discover data that is at risk of oversharing. Risks are assessed by scanning data for sensitive information types and identifying locations with potential oversharing based on existing user access patterns. The oversharing assessment provides recommendations on how to mitigate oversharing risks with a few clicks, such as applying a sensitivity label to overshared content, or using SharePoint Advanced Management to add the site to Restricted Content Discovery or start a new site access review. Admins can run the assessment before a Copilot deployment to identify and mitigate risks such as unlabeled files accessed by users. Post-deployment, the assessment will identify risks such as sensitive data referenced in Copilot responses. Read the announcement post to learn more. 

 

Figure 4: Act on potential AI risks from the oversharing assessment in DSPM for AI
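
Purview performs this assessment for you inside DSPM for AI. Purely to illustrate the logic the assessment describes (pairing sensitive-content detection with access breadth), here is a toy Python sketch over invented site metadata; the data structures, the single regex-based sensitive-info check, and the threshold are all hypothetical and are not how the Purview assessment works internally.

```python
import re
from dataclasses import dataclass

# Toy detector for a single sensitive information type (a U.S. SSN-like pattern).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")


@dataclass
class SiteSnapshot:
    url: str
    sample_text: str        # hypothetical sampled content from the site
    users_with_access: int  # hypothetical count taken from an access report


def oversharing_risk(site: SiteSnapshot, access_threshold: int = 500) -> str:
    """Rank a site by combining a sensitivity signal with how broadly it is accessible."""
    has_sensitive = bool(SSN_PATTERN.search(site.sample_text))
    broadly_accessible = site.users_with_access >= access_threshold
    if has_sensitive and broadly_accessible:
        return "high"
    if has_sensitive or broadly_accessible:
        return "medium"
    return "low"


sites = [
    SiteSnapshot("https://contoso.sharepoint.com/sites/HR", "Employee SSN 123-45-6789", 4200),
    SiteSnapshot("https://contoso.sharepoint.com/sites/Events", "Holiday party planning notes", 4200),
]
for site in sites:
    print(site.url, "->", oversharing_risk(site))
```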

Prevent access to sensitive content in documents with label-based Copilot exclusion 

Data oversharing and leakage are major concerns for organizations adopting generative AI technologies like Microsoft 365 Copilot. To address these concerns, we are excited to announce the public preview of Microsoft Purview Data Loss Prevention (DLP) for Microsoft 365 Copilot, aimed at reducing the risk of AI-related oversharing at scale. This new capability uses a file’s sensitivity label to prevent Microsoft 365 Business Chat from creating summaries or responses from that document’s contents, and it works with Office files and PDFs stored in SharePoint or OneDrive. This helps ensure that potentially sensitive content in labeled documents is not processed by Copilot and that responses are not available to copy and paste into other applications.

With this approach, admins have an easier option to exclude potentially sensitive content from being used with M365 Copilot. This capability can be configured at various levels, such as file, group, site, and user. For example, DLP policies can prevent M365 Copilot from processing the contents of documents with a Personal information sensitivity label or any sensitivity label you specify. Learn more. 

 

Figure 5: DLP policy setting to prevent Copilot from processing content with a sensitivity label
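
Before enabling a label-based DLP policy like this, it can help to confirm which files actually carry the sensitivity label you plan to target. One way to check an individual file is Microsoft Graph’s extractSensitivityLabels action on a drive item; the sketch below is illustrative, assumes a Graph token with sufficient file permissions, and uses placeholder drive and item IDs (verify the action’s availability and required permissions for your tenant before relying on it).

```python
import os

import requests

GRAPH_TOKEN = os.environ["GRAPH_TOKEN"]  # assumed Graph token with sufficient file permissions
DRIVE_ID = os.environ["DRIVE_ID"]        # placeholder drive ID
ITEM_ID = os.environ["ITEM_ID"]          # placeholder file (drive item) ID


def get_item_sensitivity_labels(drive_id: str, item_id: str) -> list[dict]:
    """Return the sensitivity label assignments detected on a SharePoint/OneDrive file."""
    resp = requests.post(
        f"https://graph.microsoft.com/v1.0/drives/{drive_id}/items/{item_id}/extractSensitivityLabels",
        headers={"Authorization": f"Bearer {GRAPH_TOKEN}"},
    )
    resp.raise_for_status()
    return resp.json().get("labels", [])


for label in get_item_sensitivity_labels(DRIVE_ID, ITEM_ID):
    print(label.get("sensitivityLabelId"), label.get("assignmentMethod"))
```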

Identify and respond to potential oversharing risks from user interactions with AI 

Microsoft Purview Insider Risk Management (IRM) correlates various signals to identify potential inadvertent, negligent, or malicious insider risks, such as IP theft, data leakage, and security violations. Today, we are excited to announce the public preview of risky AI usage detections in Purview IRM, which focus on identifying activities that present potential oversharing risks, such as prompts containing requests for access to sensitive information and responses generated from sensitive files. With these new detections, organizations can identify and respond to potential data risks from user interactions with M365 Copilot and Copilot Studio, preventing misuse of these technologies. Additionally, an analytics dashboard helps you identify the top risky AI usage within your organization and recommends policies to mitigate those activities. Learn more. 

 

Figure 6: Insider Risk Management Risky AI usage indicators dashboard

Learn how to prepare your information for GenAI 

We’re pleased to share two new resources to support customers as they address oversharing risks within their organizations.  

A new Microsoft deployment blueprint, Address oversharing in Microsoft 365 Copilot, provides a recommended path to address internal oversharing concerns during a Microsoft 365 Copilot deployment. The blueprint breaks the deployment journey into three phases: Pilot, Deployment, and Operation after initial deployment. It also incorporates the new SAM and Purview capabilities covered in this blog.

 

Figure 7: Blueprint to address internal oversharing concerns in Microsoft 365 Copilot

 

For demos of the approach outlined in the deployment blueprint, watch the latest episode in the Microsoft Mechanics video series, Oversharing Control at Enterprise Scale | Updates for Microsoft 365 Copilot in Microsoft Purview. This episode shows how you can use DSPM for AI, DLP for M365 Copilot, and SAM to control data visibility, automate site access reviews, and fine-tune permissions to minimize oversharing risks. 

   

Continued innovation for AI security and risk management  

Manage data security and compliance protections for Microsoft 365 Copilot and Microsoft Copilot  

Here are some of the Microsoft Purview capabilities that provide security, compliance and governance for Microsoft 365 Copilot beyond oversharing. 

Visibility into AI usage 

Security teams often find themselves in the dark when it comes to data security and compliance risks associated with AI usage, and without proper visibility, organizations struggle to safeguard their assets effectively. Data Security Posture Management for AI (DSPM for AI), now generally available, provides visibility into how sensitive data flows through Copilot prompts and responses, highlighting any risks related to data oversharing, data leakage and non-compliant use of Microsoft 365 Copilot. Instead of restricting AI use to avoid these outcomes, DSPM for AI suggests protection policies that use existing Microsoft Purview features. Learn more. 

 

Figure 8: DSPM for AI overview page in Microsoft Purview

Enable more secure and compliant GenAI usage  

GenAI introduces new security and safety risks that require new controls to address. For instance, malicious users can perform prompt injection attacks to elicit unauthorized behaviors from GenAI, and users can create content that may violate intellectual property laws. Today we are introducing new GenAI risk detections in Microsoft Purview Communication Compliance, built with the Prompt Shield and Protected Material classifiers from the Azure AI Content Safety team. Microsoft Purview can now detect risks such as direct and indirect prompt injections and the use of protected material in Copilot responses. Protected material includes sources such as news articles, lyrics, code from known GitHub repositories, and software libraries referenced within GenAI responses. Admins with appropriate permissions can receive alerts and investigate the potential incidents to help enable more secure and compliant usage of Copilot. Learn more. 

 

Figure 9: Communication Compliance detects a potential prompt injection attack

 

Better management options to retain and delete Copilot interactions 

Organizations can use Purview Data Lifecycle Management policies to keep or delete Copilot prompts and responses. Previously, these Copilot interactions shared a policy targeting location with Microsoft Teams chats, which meant that Copilot interactions and Teams chats were managed using the same retention and deletion settings. Today we are announcing a new Data Lifecycle Management policy location dedicated to Microsoft Copilot experiences. This new location enables admins to specify retention and deletion settings for only Copilot interactions, separate from Microsoft Teams chats. Learn more. 

 

Figure 10: The new Data Lifecycle Management policy location for Microsoft Copilot Experiences. Note: this image does not display all available policy locations.

Enhanced web search controls 

As more customers experience the benefits of allowing Copilot to reference web content, we’ve heard a strong interest in being able to see the queries Copilot generates to fetch that information from the Bing search service.  

Today, we’re excited to announce the general availability of new features that give both users and admins greater visibility into Copilot-generated web queries in Microsoft 365 Copilot Business Chat. Bing web search query citations for users lets users see the exact web search queries derived from their prompt in the linked citation section of the Copilot response. This gives users visibility into exactly how their prompts are used to generate the web queries sent to Bing search, helping them refine prompts and use Copilot more effectively. Web search query logging enables admins to perform search, audit, and eDiscovery on the exact web search queries Copilot derived from the user's prompt. 
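
Once web search query logging is enabled, these queries become part of the Copilot interaction records available to audit tooling. As a small illustration of what reviewing an export might look like, here is a hedged Python sketch that filters a JSON export of audit records; the file name and the “WebSearchQueries” field are assumptions about the export shape rather than a documented schema, so check the actual record structure in your tenant before relying on them.

```python
import json

# Hypothetical export of audit records (for example, downloaded from a Purview Audit search).
EXPORT_PATH = "audit_export.json"  # placeholder file name


def copilot_web_search_queries(path: str) -> list[tuple[str, str]]:
    """Return (user, query) pairs from Copilot interaction records that include web search queries.

    The field names used below ("Operation", "UserId", "WebSearchQueries") are illustrative
    assumptions about the export shape; verify them against your actual audit schema.
    """
    with open(path, encoding="utf-8") as fh:
        records = json.load(fh)
    results = []
    for record in records:
        if record.get("Operation") != "CopilotInteraction":
            continue
        for query in record.get("WebSearchQueries", []):
            results.append((record.get("UserId", "unknown"), query))
    return results


for user, query in copilot_web_search_queries(EXPORT_PATH):
    print(f"{user}: {query}")
```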

We also recently began rollout of a new policy option, Allow web search in Copilot, that provides the ability to manage web search separately from other optional connected experiences for Microsoft 365. It will also support turning off web search in Microsoft 365 Copilot Business Chat (work) while keeping web search on in Business Chat (web).  

Read the announcement blog to learn how you can use these new controls to achieve increased visibility and control over web search in Copilot.   

Equip IT to lead AI transformation with the Copilot Control System 

CIOs and IT professionals are at the epicenter of AI transformation. To empower every IT team to lead at scale, we are introducing the Copilot Control System – designed for IT to confidently adopt and accelerate the business value of Copilot and agents.  

 

Capabilities of the Copilot Control System include:  

  • Data protection, enabling intelligent grounding on enterprise data while respecting your organization's controls  
  • Management controls that allow IT to govern access and usage of Copilot and agents, including the ability to control which users can use Copilot and agents, alongside visibility into agent status and lifecycle 
  • Measurement and value reporting that allows IT and business leaders to track return on investment, adoption patterns, and business outcomes from the use of Copilot and agents.  

 

Read the announcement blog to learn more about the new governance and management capabilities available in the Copilot Control System. 

 

As generative AI offers a greater ability to leverage information than ever before, information readiness has never been more important. The features and capabilities shared today provide powerful tools for preparing your information, and we will continue to invest in delivering capabilities that enable customers to implement effective controls for data protection, security, and privacy.
