Microsoft 365 Summit Vancouver
I recently attended the Vancouver Microsoft 365 Summit, and it was fantastic to reconnect with the community and get a peek at what’s coming next. (Spoiler: Surprise! SharePoint is getting more agentic document workflows. That is actually a good thing.)
But one thing really hit home for me: the data readiness challenges that come with rolling out something like Microsoft 365 Copilot.
I’ve been quick to reassure folks that the privacy and licensing options for enterprise-grade AI tools like Microsoft Copilot, Google Gemini, or OpenAI’s ChatGPT are solid. They don’t train on your data, and they respect user access rights. For most organizations, that’s enough to check the privacy box. (With one caveat: if your data must stay in Canada, double-check what’s available. YMMV.)
But here’s the kicker: even if you enable Microsoft 365 Copilot and it faithfully respects access controls, indexing SharePoint and OneDrive and showing answers only to authorized users, that doesn’t mean your data is actually ‘safe.’
You flip the switch, think “Perfect. Done!” and head out to celebrate with an expensive holiday-themed coffee, some peppermint oat milk foamy something.
Nope. Bad idea.
Because here’s what happens next: someone asks Copilot a harmless question, and up pops a sensitive document buried in a subfolder of /temp/archive on the ReallyBoringDepartment’s internally public SharePoint site. Maybe it’s a performance review. Maybe it’s a confidential partnership discussion. Either way, it’s not something you want surfacing in a casual query. Not everyone in your organization has shared files properly, and before you roll out Copilot widely, you need to make sure your data controls are up to standard.
And it’s not just about permissions. IT’S WORSE THAN THAT. (Sorry, the fancy holiday coffee is making me dramatic.) What about outdated documents? The pre-final-final version of a contract missing key clauses? The HR policy draft with a critical error in the vacation section? These shouldn’t surface either.
So what to do?
Start small. Enable Copilot for a few key users. Have them actively look for issues in system settings and in Copilot’s responses. Treat it like a pilot, not a launch.
Train your users. Make sure everyone knows where the data they store goes and how it can be found.
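If your pilot team wants something concrete to hunt with, here’s a rough sketch of one way to spot broadly shared files using the Microsoft Graph API. Graph permission objects on drive items can carry a sharing `link` with a `scope` of `anonymous`, `organization`, or `users`; the first two are exactly the kind of wide sharing a Copilot pilot tends to surface. The `audit_drive` helper and the root-only scan are illustrative assumptions, not a turnkey audit (a real one would page through results and recurse into folders):

```python
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def flag_broad_links(permissions):
    """Return permissions whose sharing link is broader than specific people.

    A scope of 'anonymous' (anyone with the link) or 'organization'
    (anyone in the tenant) means far more people can open the file
    than the owner may realize.
    """
    return [p for p in permissions
            if p.get("link", {}).get("scope") in ("anonymous", "organization")]

def _get(url, token):
    """Minimal authenticated GET against Microsoft Graph."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def audit_drive(token, drive_id):
    """Report broadly shared items at the root of one drive (sketch only)."""
    items = _get(f"{GRAPH}/drives/{drive_id}/root/children", token).get("value", [])
    for item in items:
        perms = _get(f"{GRAPH}/drives/{drive_id}/items/{item['id']}/permissions",
                     token).get("value", [])
        for p in flag_broad_links(perms):
            print(f"{item['name']}: shared via '{p['link']['scope']}' link")
```

You’d run `audit_drive` with an app token that has `Files.Read.All`, then hand the flagged list to the file owners; the point isn’t the script, it’s making oversharing visible before Copilot does it for you.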
Personally, Copilot and business chat have changed how I work. The integration with Teams, SharePoint, and Outlook is a game-changer. I spend way less time hunting for things or trying to remember which channel I sent something in. It’s company-wide search and synthesis that works really well. Sometimes too well.
If you’re thinking about rolling out AI tools, talk to Cypress Falls. We can help you accelerate implementation while making sure it’s safe, effective, and aligned with your data governance needs.
Then it’s time for a well-deserved gingerbread latte.
Photo by Jennie Razumnaya on Unsplash

