M365 Copilot & Your Data: 4 Common Misconceptions

Since Microsoft 365 Copilot landed, I’ve had many conversations with businesses ranging from those just starting their journey with it to advanced users. I have listed here some common misconceptions I hear about data and M365 Copilot. Most of them boil down to one thing: a misunderstanding of what Copilot actually is and how it works under the bonnet.

In this post I talk about the paid M365 Copilot licence that businesses typically use, not the free, personal Microsoft Copilot.

So let’s clear the air. Here are four common misconceptions I’ve heard, and the real story behind each one.


Misconception 1: “When I buy Copilot, I get my own private AI model”

Nope, that’s not how it works.

When you license Copilot, you’re not spinning up your own personal GPT instance tucked away on a private server. What you’re actually getting is secure access to a shared large language model hosted in Microsoft’s cloud. Think of it like renting a lane on a motorway: you’re using the same infrastructure as everyone else, but your data stays in your own vehicle.

Microsoft’s architecture is designed to keep your data isolated. Your prompts and context are processed securely, and your content is never used to train the model. So while the model is shared, your data isn’t.


Misconception 2: “My data never leaves my tenant”

This one’s a bit trickier. Your data does reside in your tenant, but when you use Copilot, the relevant bits of it are sent to Microsoft’s cloud-based AI models for processing.

That means your prompt and the context Copilot gathers (emails, files, chats and so on) are securely transmitted to the nearest available AI compute cluster. If that cluster’s busy, your data might be routed to another Microsoft datacentre, possibly in another country. If you’re in the EU, though, Microsoft’s EU Data Boundary commitment means that processing stays within the EU.

So yes, your data leaves your tenant, but it stays within Microsoft’s secure cloud, and within the rules.


Misconception 3: “Copilot can access all ‘Anyone’ share links in my organisation”

Not quite.

An ‘Anyone’ link (the kind that lets anyone with the link view a file) doesn’t automatically make that file searchable. For Copilot to surface content from an ‘Anyone’ link, the user must have redeemed the link, meaning they’ve clicked it and accessed the file.

Until that happens, Copilot treats the file as off-limits. It operates strictly within the context of the user’s permissions. So if you haven’t clicked the link, Copilot won’t see it, even if the link exists somewhere in your tenant.

Also worth noting, ‘Anyone’ links are risky. They’re essentially unauthenticated access tokens. Anyone can forward them, and there’s no audit trail. Use them sparingly.
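
If you want to see where ‘Anyone’ links already exist before Copilot enters the picture, you can audit them with the Microsoft Graph API. Below is a minimal Python sketch, assuming you already have an access token with Files.Read.All or Sites.Read.All consent; the token and drive ID placeholders are hypothetical, and token acquisition via MSAL is not shown.

```python
# Minimal sketch: find files in a document library that carry an 'Anyone'
# (anonymous) sharing link, using Microsoft Graph. Pagination (@odata.nextLink)
# is omitted for brevity.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access token acquired via MSAL>"       # hypothetical placeholder
DRIVE_ID = "<drive id of the library to audit>"  # hypothetical placeholder

HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def anyone_links(drive_id: str, item_id: str = "root"):
    """Walk the drive and yield (url, link type) for items with anonymous links."""
    children = requests.get(
        f"{GRAPH}/drives/{drive_id}/items/{item_id}/children", headers=HEADERS
    ).json().get("value", [])
    for child in children:
        perms = requests.get(
            f"{GRAPH}/drives/{drive_id}/items/{child['id']}/permissions",
            headers=HEADERS,
        ).json().get("value", [])
        for perm in perms:
            # 'Anyone' links are reported as sharing links with scope "anonymous"
            if perm.get("link", {}).get("scope") == "anonymous":
                yield child.get("webUrl"), perm["link"].get("type")
        if "folder" in child:
            yield from anyone_links(drive_id, child["id"])


for url, link_type in anyone_links(DRIVE_ID):
    print(f"{link_type or 'view'} 'Anyone' link on: {url}")
```

Bear in mind this tells you where the links are, not whether anyone has redeemed them; it’s simply a starting point for tidying up risky sharing.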


Misconception 4: “Copilot only sees my data if I attach it to the prompt”

Wrong again.

Copilot doesn’t wait for you to upload a document or paste in a paragraph. It automatically pulls in any content you already have access to: emails, OneDrive files, SharePoint docs, Teams chats, calendar entries, the lot.

This is called grounding. When you ask Copilot a question, it searches your Microsoft 365 environment for relevant context, then sends that along with your prompt to the AI model. If you’ve got access to a file that answers your question, Copilot will find it; there’s no need to attach anything manually.

That’s why data access controls are so important. If a user has access to sensitive content, Copilot can use that content in its responses. It won’t override permissions, but it will amplify whatever the user can already see.
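
You can’t call Copilot’s grounding step directly, but you can see the same permission trimming in action through the Microsoft Graph Search API, which only ever returns items the signed-in user can access. Here is a minimal Python sketch, assuming a delegated access token; the token placeholder and the query string are just examples, not part of anything Copilot exposes.

```python
# Minimal sketch: a permission-trimmed search over the signed-in user's
# OneDrive/SharePoint files via the Graph Search API. It only returns items
# the user can already open - the same principle Copilot grounding relies on.
# Assumes a delegated access token; token acquisition is not shown.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<delegated access token>"  # hypothetical placeholder

body = {
    "requests": [
        {
            "entityTypes": ["driveItem"],                     # files only
            "query": {"queryString": "Q3 pricing proposal"},  # example query
            "size": 5,
        }
    ]
}

resp = requests.post(
    f"{GRAPH}/search/query",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=body,
)
resp.raise_for_status()

# Everything printed here is content the user could already see, and therefore
# content Copilot could potentially ground a response on.
for container in resp.json().get("value", []):
    for hits_container in container.get("hitsContainers", []):
        for hit in hits_container.get("hits", []):
            resource = hit.get("resource", {})
            print(resource.get("name"), "-", (hit.get("summary") or "")[:80])
```

If a quick search like this surfaces sensitive files a user shouldn’t have, Copilot could use them too, so fixing over-permissioned content is the real remediation.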


Final Thoughts

M365 Copilot is powerful, but it’s not magic. It works within the boundaries of Microsoft 365’s architecture, permissions and security model. Understanding those boundaries is key to using it safely and effectively.

If you’re rolling out Copilot in your organisation, make sure your users understand what it can and can’t do, and make sure you know how to protect your data.


Further Reading

Preparing for Microsoft Copilot

💡 What are some of the things you can do NOW to start preparing for Microsoft Copilot in your organisation?

✅ Understand that there is technical readiness and people readiness. Both need attention to maximise return on investment.

✅ Familiarise yourself with Copilot and its role in the world of generative AI https://www.microsoft.com/en-us/ai.

✅ Engage your Ethics, Legal, and Inclusion champions early on to address the implications of generative AI.

✅ Incorporate the six Microsoft principles in AI into your company’s AI usage policy https://www.microsoft.com/en-us/ai/responsible-ai.

✅ Take your first step into the Copilot world with Bing Chat Enterprise https://www.microsoft.com/en-us/bing/chat/enterprise/?form=MA13FV, an AI-powered chat platform included with common M365 licences.

✅ Prioritise data security by reviewing identity and access policies, using data labelling for sensitive documents, and ensuring awareness of file sharing risks. Do your device management policies need a health check?

✅ Identify areas where you may need external assistance and initiate conversations with technology partners.

🚀 Implementing generative AI into a business is a marathon, not a sprint. Set those expectations now. There are quick wins out there to get hands on with the technology while you get your organisation ready. Windows Copilot in Windows 11 is one of them; see Hands on with Windows Copilot below.

Hands on with Windows Copilot

In this post, I explain my first impressions of Windows Copilot along with some tips for getting started with it.

Introduction

On 26th September 2023, Microsoft released optional update KB5030310, one of the most ground-breaking updates to Windows in recent times. With it comes Windows Copilot, which for millions of users worldwide will serve as an introduction to using an AI-powered chat interface to enhance their day-to-day productivity.

Installing Windows Copilot Preview

Check out my guide for installing the Windows Copilot Preview.

Accessing Windows Copilot

Once installed, the Windows Copilot Preview icon appears in the taskbar.

Clicking the Copilot icon opens the full vertical length chat interface.

The screenshot above was taken while logged in on an Azure AD joined device that had Bing Chat Enterprise enabled. This is what the Protected green shield at the top signifies. 

Changing Device Settings

First things first. Settings in Windows are buried all over the place. Copilot helps: you simply tell it what you want to do, and it responds with a question with Yes and No thanks buttons. In the video below, I ask it to set Windows to the light theme.

Compound Task

Trying something more advanced, I asked it to create a text file on the desktop. It offered to open Notepad for me and apologised for not being able to perform the end-to-end task itself. It also opened the Desktop Background settings window in case I wanted to change the theme.

Getting advice for troubleshooting

When I told Copilot that my workstation was slow, it presented me with a couple of options to get started. First were Yes/No questions for opening Task Manager or opening ‘a troubleshooter’. It followed that with some guidance text.

A bit of a fail, though: clicking ‘open a troubleshooter’ failed, saying that the Get Help app wasn’t installed!

Summarising a Web Page

Let’s get to the really clever stuff. Windows Copilot integrates with Bing Chat (Bing Chat Enterprise if it’s enabled in your Azure tenant). This brings some cool features. In the video below I am browsing the United States Constitution, a large document. Copilot summarises this in a concise manner.

  • I found that asking it to summarise the Wikipedia page of The River Thames or the recent amendments to an IEEE standard resulted in it largely regurgitating verbatim what was on the web page.
  • I think this feature will work best when not attempting to summarise highly technical documents (the IEEE standard) or, indeed, text that is already very concise, like the Wikipedia page.
  • Simply typing ‘summarise this page’ sometimes did not trigger the Edge integration; instead it described what the Copilot window was for. Typing ‘summarise this web page’ seemed to always work.

The first time you try to use the Copilot > Edge integration, you will need to give permission to share Edge browser content with Copilot when it prompts you.

Integration with Bing Search

We have had search bars in Windows for many years. Copilot, however, adds the advanced understanding that AI can bring. Here I asked it to provide an outline for a 15-minute presentation about ant nesting habits and to include reference articles. It provided links to sources for the facts in the presentation. A small section is below:

Images and Snipping Tool

Bing’s AI image creation service, powered by DALL-E, is accessible via Copilot. Just ask it to create any image and off it goes.

When you use the Snipping Tool, Copilot will ask if you want to add the image to the chat. You can ask Copilot about the image.

Choosing Conversation Style

You have probably realised by now that Copilot is quite wordy. It defaults to a conversational style, which will help many users who are new to Copilot. When you want to get to the point, though, you can change from More Creative (purple) to More Balanced (blue) or More Precise (teal).

Below is an example of how this affects the responses. As you can see, if you’re looking for a warm, cuddly bit of chat, don’t turn to More Precise!

Conclusion

Windows Copilot represents the second evolution of Microsoft’s generative AI offering: the first was Bing Chat, and the next will be Microsoft Copilot. If you are preparing your organisation for Microsoft Copilot, enabling Windows Copilot is a great way to start training users in a new way to command and converse with their computer.

Windows Copilot is usable and versatile even at this early stage in its life. It will continue to develop; for instance, more integrations with built-in Windows apps are slated for phased rollouts soon.