Commercial pilots have a co-pilot to help out in the cockpit. Similarly, Microsoft 365 Copilot is designed to be your helper when you’re using Microsoft 365 apps.
Embedded in the apps across the Microsoft 365 ecosystem, Copilot lets you use natural language prompts to tap into the power of large language models (LLMs): create content in Word, build visuals in PowerPoint, analyze trends in Excel, aggregate your data across apps, and more.
However, this added functionality also introduces new security risks.
Copilot for Microsoft 365 Security Risks
Copilot respects the groups and permissions set within your Microsoft apps. When logging into Microsoft 365, users are authenticated and validated, so Copilot's access is limited to the information those users can already reach. But if you do not have robust access controls in place, Copilot amplifies your existing risk.
Even if you have the right protocols in place, there are other Copilot for Microsoft 365 security concerns to keep in mind.
Unintentional data access
While Copilot is designed to prevent additional data sharing among users and groups, data access controls and Microsoft sensitivity labels should still be applied to protect sensitive data. For example, Copilot chat can reference data from different sources within the Microsoft infrastructure. Although it checks for usage rights, inconsistent sensitivity labeling can expose data to unauthorized users.
According to one study, 53% of organizations leave 1,000 or more files with sensitive data open to all employees — whether they should have access or not.
For example, if someone’s pay information or company financial documents get saved in the wrong location, they might stay hidden unless someone knows exactly where to look. With Copilot, however, employees may be able to find them just by asking, as the sketch below illustrates.
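To make this concrete, here is a minimal sketch of how discoverable such a file is through the Microsoft Search API, which queries the same permission-trimmed index that surfaces content in Microsoft 365 experiences. It assumes you have already acquired a Microsoft Graph access token (for example, via MSAL) with a delegated read scope such as Files.Read.All; the token placeholder and the search term are hypothetical.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <ACCESS_TOKEN>"}  # hypothetical token

# The Search API only returns items the signed-in user can already see --
# but if a sensitive file was saved in the wrong place, "can already see"
# may be far broader than anyone intended.
body = {
    "requests": [
        {
            "entityTypes": ["driveItem"],
            "query": {"queryString": "payroll"},  # hypothetical search term
        }
    ]
}

resp = requests.post(f"{GRAPH}/search/query", headers=headers, json=body)
resp.raise_for_status()

for container in resp.json()["value"][0]["hitsContainers"]:
    for hit in container.get("hits") or []:
        item = hit["resource"]
        print(item.get("name"), "-", item.get("webUrl"))
```

If a query like this turns up files the user was never meant to see, a natural-language question to Copilot can surface the same content.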
Sensitive data exposure
Copilot draws on diverse data sources and analyzes trends. Even if the data itself is not sensitive, the analysis it performs may be considered proprietary; this capability is one of Copilot for Microsoft 365's biggest values. For example, Copilot can join your Teams meeting and summarize what's being discussed, or summarize and generate replies to your emails. However, because Copilot can generate new data and insights based on what it finds, that new output must be protected too. Yet users may not think to treat it as sensitive or store it as securely.
Microsoft apps make sharing with third parties easy, so it is just as easy to share confidential data without thinking about it.
Sharing confidential data
One recent report showed that 15% of employees regularly post company data into AI tools like ChatGPT, and as much as a quarter of that data should be considered confidential, such as internal business data and personally identifiable information (PII).
What’s different with Copilot is that it has access to more than what you share with it. It can instantly search and compile data from your documents, spreadsheets, presentations, email, and notes. For example, employees may consider chats within Teams channels to be private conversations, but anyone with access to those channels can tap into the information and data they contain. This can lead to accidental sharing.
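As a lightweight guardrail, some teams screen outbound text before it reaches an external AI tool. Below is a minimal, self-contained sketch of that idea; the regex patterns are illustrative placeholders, not a substitute for a real data loss prevention (DLP) solution.

```python
import re

# Illustrative patterns only; a production DLP policy would be far more thorough.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def flag_pii(text: str) -> list[str]:
    """Return the names of the PII patterns found in the given text."""
    return [name for name, pattern in PII_PATTERNS.items() if pattern.search(text)]

draft = "Contact jane.doe@example.com, SSN 123-45-6789, about the Q3 numbers."
hits = flag_pii(draft)
if hits:
    print(f"Hold on: this draft appears to contain {', '.join(hits)}")
```

A check like this catches only the obvious cases, but it reinforces the habit of pausing before confidential data leaves the organization.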
Malicious intent from internal and external threat actors
Another Copilot for Microsoft 365 security risk comes from internal and external threat actors. An insider with malicious intent could purposely seek sensitive data using prompt engineering techniques, gaining access to data they shouldn’t have.
External attackers may use compromised credentials to log in to a Microsoft 365 account and enable Copilot, giving them access to far broader sets of data without having to comb through internal networks or understand taxonomy. Using natural language, they may be able to draw out sensitive data that would otherwise be hidden.
What Can Microsoft 365 Copilot Access?
In short, Copilot can access any data within your Microsoft 365 tenant through Microsoft Graph, an API that connects data across Microsoft 365 services. This includes SharePoint, OneDrive, Teams chats, Exchange (emails, contacts, and calendars), Word, Excel, PowerPoint, and more.
Copilot can also access any data or documents where users have view permissions, even if they do not have editing rights.
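One way to see this breadth for yourself is to list everything shared with a signed-in user via Microsoft Graph, view-only items included. This is a minimal sketch assuming a delegated Graph access token is already in hand; the token placeholder is hypothetical.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <ACCESS_TOKEN>"}  # hypothetical token

# Everything shared with the signed-in user, including items they can only
# view, is also within reach of Copilot acting on that user's behalf.
resp = requests.get(f"{GRAPH}/me/drive/sharedWithMe", headers=headers)
resp.raise_for_status()

for item in resp.json().get("value", []):
    shared = item.get("remoteItem", {}).get("shared", {})
    shared_by = shared.get("sharedBy", {}).get("user", {}).get("displayName")
    print(item.get("name"), "| shared by:", shared_by)
```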
Avoiding Copilot for Microsoft 365 Security Risks
You can help mitigate Copilot for Microsoft 365 security risks by taking a few key security steps.
Employ strong data governance policies
Make sure you have strong data governance policies and apply them consistently. These should cover data classifications, access controls, and permissible uses. Mandate encryption of data at rest and in transit, and apply sensitivity labels to flag sensitive information.
Microsoft 365 permissions can be extremely complex, especially for enterprise companies. One report said the average Microsoft 365 tenant has as many as 40 million unique permissions.
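At that scale, spot checks help. Here is a minimal sketch that walks the top level of a user's OneDrive and flags items exposed through broad sharing links. It assumes a delegated Graph token with file read scopes; the token placeholder is hypothetical.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <ACCESS_TOKEN>"}  # hypothetical token

# Fetch the user's top-level OneDrive items.
items = requests.get(f"{GRAPH}/me/drive/root/children", headers=headers)
items.raise_for_status()

# Flag anything reachable through an anonymous or organization-wide link --
# exactly the kind of over-sharing Copilot can inadvertently surface.
for item in items.json().get("value", []):
    perms = requests.get(
        f"{GRAPH}/me/drive/items/{item['id']}/permissions", headers=headers
    )
    perms.raise_for_status()
    for perm in perms.json().get("value", []):
        scope = perm.get("link", {}).get("scope")
        if scope in ("anonymous", "organization"):
            print(f"{item['name']}: shared via '{scope}' link")
```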
Data governance should also include employee training to ensure team members know what data can be shared and what needs to remain confidential.
Deploy the principle of least privilege (PoLP)
Restrict employees' data access to the minimum level required for them to perform their jobs. As employees move to new positions within your organization, it is also crucial to re-evaluate their roles and confirm that existing access is still needed. A zero-trust approach, requiring reauthentication at the app level, helps mitigate unauthorized access.
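A periodic access review can start as simply as enumerating what a user belongs to. Here is a minimal sketch using Microsoft Graph; it assumes an access token with a directory read scope such as GroupMember.Read.All, and the user principal name is hypothetical.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <ACCESS_TOKEN>"}  # hypothetical token

user = "jane.doe@contoso.com"  # hypothetical user principal name

# List the groups and directory roles the user is a member of; reviewing
# this after a role change is a basic least-privilege check.
resp = requests.get(f"{GRAPH}/users/{user}/memberOf", headers=headers)
resp.raise_for_status()

for entry in resp.json().get("value", []):
    print(entry.get("@odata.type"), "-", entry.get("displayName"))
```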
Stay up to date on emerging threats
As with any software or SaaS platform, staying up to date on the latest patches and security advisories is crucial. You can subscribe to Microsoft’s security notifications.
Be careful with web access and third-party plugins
Copilot for Microsoft 365 can reference third-party tools and services through Microsoft Graph connectors or plugins. If these options are enabled and the user has access, data from those external sources can surface in Copilot’s responses.
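It is worth auditing which external sources are wired in. The sketch below lists the Graph connectors configured in a tenant; it assumes an app-only Graph token with the ExternalConnection.Read.All permission, and the token placeholder is hypothetical.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <ACCESS_TOKEN>"}  # hypothetical token

# Enumerate the Graph connectors configured in the tenant so you know which
# external data sources Copilot and Microsoft Search can draw on.
resp = requests.get(f"{GRAPH}/external/connections", headers=headers)
resp.raise_for_status()

for conn in resp.json().get("value", []):
    print(conn.get("id"), "-", conn.get("name"), "-", conn.get("state"))
```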
A Final Note
While Microsoft secures its platform, remember that it’s up to you to put the proper safeguards in place. Copilot’s responses and analysis aren’t guaranteed to be 100% accurate or safe. As Copilot itself reminds us when prompted, it’s an AI tool and “the final responsibility lies with the user.”
If you have questions about Copilot for Microsoft 365 in your environment, reach out. We’d be happy to work with you on it.