search
Cloud Blog – Copilot Security Concerns, Addressed: Is Copilot Safe to Use?
Microsoft

Copilot Security Concerns, Addressed: Is Copilot Safe to Use?

As per the Harvard Gazette, data privacy was already among the most voiced concerns over artificial intelligence as early as 2020. With the advent and subsequent influx of large language models and generative AI over the past two years, the adoption levels skyrocketed across firms in various sectors, and corporate surveillance made the list, too. Some other pressing issues in this context included the possibilities of intellectual property theft and confidential data leakage.

Since regulatory frameworks surrounding AI and its use remain quite underdeveloped or non-existent in numerous jurisdictions, the IT giants behind the market-leading platforms had to take matters into their own hands and clarify their policies. This article aims to answer Copilot security concerns using Microsoft’s public information.

What’s the Flow?

Microsoft Copilot injects AI assistance directly into the Microsoft 365 suite. Let’s peel back the layers and see how it works step-by-step.

So, Microsoft Copilot can only access your organization’s content if you give it access during a chat. There are three ways this can happen:

  1. You directly type or paste the information into the chat.
  2. You upload a file through the chat box using the paperclip icon (drag and drop is also available in preview).
  3. You use Copilot in Edge with the “Allow access to any webpage or PDF” setting on and have an intranet page open. Copilot may use that page’s content to answer your questions.

First things first: secure access. You sign in with your Entra ID, verifying your identity and license. Communication becomes secure after authentication. When you interact with Copilot, your chat session data goes through a vital transformation. Any information that could identify you or your organization is carefully removed. This anonymized data is then sent securely to the Copilot Orchestrator, the central hub that manages communication within the system. Following secure transmission, the Copilot Orchestrator routes your prompts and desired actions (requests) to Copilot, previously known as Bing Chat. It’s important to note that Copilot operates as a connected experience separate from Microsoft 365, expanding its reach and potential.

Now, the real magic happens. Behind the scenes lies a powerhouse — a private instance of Azure OpenAI. This powerhouse is equipped with cutting-edge models like GPT-4 and DALL-E 3, specifically trained for tasks like generating text, translating languages, and creating images. Importantly, OpenAI itself has no access to these private models or the data they process.

To ensure Copilot’s responses are accurate and relevant, search queries come into play. These queries carefully ground your prompts and responses in the latest data from Bing’s massive search index, ensuring the information you receive is tailored to your needs and reflects the most recent developments. For added trust and transparency, citations are automatically included in Copilot’s responses, clearly revealing the sources used to generate the information and empowering you to verify its accuracy (and explore further, if desired).

With Copilot security concerns in the spotlight, safeguards remain imperative at every stage. All data is encrypted in transit using the robust TLS 1.2+ protocol, guaranteeing secure communication between you and the system. Furthermore, to protect your privacy, chat sessions are temporary. This temporary data is encrypted at rest using the AES-128 standard. Once your chat session concludes, both prompts and responses are discarded, ensuring no trace of your interaction remains unless you explicitly save it.

Copilot’s beauty lies in its accessibility. You can interact with it from various platforms, including a dedicated website (copilot.microsoft.com), directly through the Bing engine on any modern browser (desktop and mobile alike), within Microsoft Edge, and even the Windows environment itself. This flexibility allows Copilot to integrate smoothly into your workflow, no matter where you work (how to use Microsoft Copilot?).

Unsure if you’re ready for AI just yet? Not a problem. Request a Proof-of-Concept to see if Copilot could augment your Microsoft 365 experience. Order now →
CTA Image

Copilot Security Concerns: How Microsoft 365 Protects Your Data

Here’s a breakdown of the steps the company takes to ensure your Microsoft Copilot privacy and security.

Leveraging Existing Permissions

Microsoft Copilot doesn’t operate in a silo. It relies on the permission models already established within your Microsoft 365 tenant, meaning Copilot can only access and utilize data that you, as a user, already have permission to see. This approach ensures your existing data access controls are strictly enforced, preventing unintentional leaks between users, groups, or even different tenants within the Microsoft 365 ecosystem.

Respecting User Identity Boundaries

Microsoft Copilot utilizes a technology called Semantic Index. It plays a crucial role in grounding prompts and responses within relevant data; however, even within this process, user identity stays central. The Semantic Index strictly adheres to user identity-based access boundaries. In simpler terms, it only accesses content that you, the current user, are authorized to see, further reinforcing Microsoft Copilot data privacy and preventing unauthorized access to sensitive information.

Honoring Data Encryption

Many organizations leverage Microsoft Purview Information Protection for data encryption, which can be applied through sensitivity labels or restricted permissions within apps using Information Rights Management (IRM). Here’s the good news: Microsoft Copilot fully respects the usage rights granted to the user when it encounters data encrypted by Microsoft Purview. That being said, even encrypted data remains secure and cannot be accessed beyond the user’s authorized permissions.

Multi-Layered Security Infrastructure

Microsoft understands the importance of data security, and Copilot benefits from a robust infrastructure already in place within Microsoft 365. Here are some key aspects of this multi-layered approach that might ease a number of Copilot security concerns:

  • Customer content within each Microsoft 365 tenant is logically isolated through Microsoft Entra authorization and role-based access control (RBAC), meaning data will not inadvertently mingle between different users or organizations.
  • Microsoft employs rigorous physical security measures, background screening for personnel, and multi-level encryption, safeguarding the confidentiality and integrity of customer content throughout its lifecycle.
  • Microsoft 365 utilizes service-side technologies to encrypt customer content at rest and in transit. This includes technologies like BitLocker, per-file encryption, Transport Layer Security (TLS) we discussed earlier, and Internet Protocol Security (IPsec).

Commitment to Privacy Regulations

The company acknowledges the importance of Microsoft Copilot data privacy concerns and adheres to a strict set of regulations and standards. Such commitment includes compliance with broadly applicable laws like the GDPR and industry standards like ISO/IEC 27018, the world’s leading borderless code for cloud privacy.

Plug-In Access Control

Microsoft Copilot can be accessed through plug-ins within Microsoft 365. An additional security layer exists here as well. Encryption can be applied to exclude programmatic access for these plug-ins. In other words, the plug-ins cannot access the content they are not meant to read, further limiting the potential for unauthorized access.

‘Jailbreak’ Protection

The Prompt Shields API acts as a security shield when interacting with large language models. It watches for a multitude of attacker tactics designed to manipulate the AI’s behavior and potentially compromise security or privacy. These tactics can range from bypassing established rules and feeding misleading instructions to impersonating a different system or encoding unauthorized commands. Deceptive manipulation of data fed to the AI is another one of Copilot security concerns mitigated by Prompt Shields, along with attempts to break into the system itself and steal or disrupt data. The API goes as far as to safeguard users against attackers trying to trick them into giving out their financial details or even spreading malware.

Data Residency

With commercial data protection enabled, chat data is only cached temporarily for a brief period during a session to improve performance. This cached data is then discarded once you close the browser, reset the chat topic, or your session times out.

As of March 1, 2024, Microsoft offers data residency commitments for Copilot for Microsoft 365 customers through its Advanced Data Residency (ADR) and Multi-Geo Capabilities offerings. For users in the European Union, Copilot for Microsoft 365 benefits from the EU Data Boundary (EUDB) service, meaning their data is processed within the EU. For customers outside the EU, however, Copilot queries may be processed in the United States, the European Union, or even other regions.

There are some limitations to compliance guarantees with optional Bing-powered features within Copilot. These features don’t necessarily adhere to Microsoft’s EU Data Boundary (EUDB) or Data Protection Addendum (DPA) agreements.

For added peace of mind, remember that Copilot cannot access your organization’s data stored within your tenant boundaries. Additionally, chat conversations are never used to train the underlying AI models that power Copilot.

Hope we’ve managed to address all (or at least most) of your Copilot security concerns! Cloudfresh is a recognized Microsoft Partner. If you are interested in getting started with Microsoft 365 in general or Microsoft Copilot in particular, you’re more than welcome to reach out using the form below.

Get in touch with Сloudfresh