🙌 A huge thank you to everyone who joined our live event: Securing Productivity with Microsoft 365 Copilot! 💻✨
It was such a pleasure to connect with so many of you, share ideas, and dive deep into how we can boost productivity while keeping security front and center. 🔐💡
Your questions and engagement made the discussion truly insightful and interactive. This is exactly what makes these sessions so valuable—learning together and exchanging perspectives!
🎥 Missed the session? No worries!
We’ve got you covered. You can catch the full recording here 👇
✨ Grab a coffee ☕, sit back, and enjoy the insights—we hope you find some great takeaways to apply in your organization.
💬 I’d love to hear from you:
👉 How are you planning to use Microsoft 365 Copilot securely in your organization?
👉 What challenges or opportunities do you see when combining AI and security?
Drop your thoughts below! 👇💭 Let’s keep the conversation going and learn from each other.
Microsoft is introducing a new capability in Copilot Pages that allows Copilot Chat to answer questions based on the contents of the currently open page. This enhancement supports more efficient workflows by enabling users to get contextual answers directly within the side-by-side Chat experience, without needing to switch views or search manually.
When this will happen: General Availability (Worldwide): We will begin rolling out in early October 2025 and expect to complete by early November 2025.
How this affects your organization:
Who is affected: Users with access to Copilot in Microsoft 365 who use Copilot Pages and Copilot Chat in side-by-side mode.
What will happen:
Users will be able to ask questions in Copilot Chat about the currently open page in Copilot Pages.
Copilot will respond using the content of the open page to provide contextual answers.
The feature will be ON by default for eligible tenants.
Existing admin policies are respected; no policy changes are required.
What you can do to prepare:
Communicate this change to helpdesk and support staff.
Update internal documentation or training materials that reference Copilot Pages or Chat functionality.
Review user guidance to ensure users understand how to use Copilot effectively in side-by-side mode.
Hello Friday 🎉! It’s time to unwind with a great story. Who doesn’t love comics? I certainly do! So, let’s dive into my comic story and introduce you to my character… My new Tech Comic: Meet Jo Valiant, the Guardian of Digital Realms & AI Ethics
🆕 Character
🦸 Name: Jo Valiant
✍ Alias: The Ciphermind
🚨 Role: Guardian of Digital Realms & AI Ethics
🗒️ Catchphrase: “Decode The Chaos.”
In a futuristic digital landscape where data flows like storms and privacy is constantly under threat, emerges Jo The Ciphermind—a guardian clad in encrypted armor and equipped with a mind shielded by advanced privacy protocols. Jo is not just a character but a symbol of resilience and vigilance in the realm of Microsoft 365 Copilot. With glowing blue accents and a brain symbol illuminated on the chest, Jo represents the fusion of human cognition and artificial intelligence. The headphones signify constant connectivity, while the surrounding icons—a padlock, AI chip, shield with a brain, and fingerprint—highlight Jo’s mission to protect data, ensure privacy, and uphold security. This slide introduces Jo as the protagonist of our comic-style journey, setting the stage for an engaging exploration of how Microsoft 365 Copilot integrates these critical elements into its ecosystem.
Balance of Innovation and Integrity Jo’s journey highlights the balance between advancing AI innovation and upholding responsible data stewardship.
Jo’s Mission Jo The Ciphermind’s mission transcends individual battles—it’s about empowering users with transparency, control, and trust in Microsoft 365 Copilot.
Microsoft Purview’s Data Loss Prevention (DLP) now allows you to prevent Microsoft 365 Copilot from processing emails and other content marked with specific sensitivity labels by configuring DLP policies in the Microsoft Purview portal.
By creating a DLP policy with the “Content contains > Sensitivity labels” condition for the Microsoft 365 Copilot policy location, you can restrict Copilot from using this sensitive content in its responses and summarizations, thereby enhancing data protection.
With this feature, DLP policies can detect sensitivity labels on emails used as enterprise grounding data and restrict Microsoft 365 Copilot chat experiences from accessing those labeled emails. Note that this only works for emails dated on or after January 1, 2025.
How this will affect your organization:
Organizations with no existing DLP for Microsoft 365 Copilot policies are not impacted. Customers with the required licenses will be able to go to the Microsoft Purview portal to create policies in the Data Loss Prevention solution. Admins can also go to Data Security Posture Management for AI (DSPM for AI) to see recommendations for creating Microsoft 365 Copilot policies.
To use this feature, admins should create a new DLP policy scoped to the Microsoft 365 Copilot policy location.
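Conceptually, such a policy acts as a filter over Copilot's grounding data: content carrying a blocked sensitivity label never reaches the model. The sketch below is a hypothetical Python illustration of that behavior only; the label names and the email structure are assumptions, not a real Purview API.

```python
# Hypothetical illustration of a DLP condition of
# "Content contains > Sensitivity labels" keeping labeled emails
# out of Copilot's grounding data. Not a real Purview API.

BLOCKED_LABELS = {"Highly Confidential"}  # assumed label name

def filter_grounding_emails(emails, blocked=BLOCKED_LABELS):
    """Keep only emails whose sensitivity label isn't blocked by policy."""
    return [e for e in emails if e.get("sensitivityLabel") not in blocked]

emails = [
    {"subject": "Q3 forecast", "sensitivityLabel": "Highly Confidential"},
    {"subject": "Team lunch", "sensitivityLabel": "General"},
]
allowed = filter_grounding_emails(emails)
# Only the email without a blocked label remains available to Copilot.
```

In the real service, this filtering happens inside Microsoft 365 when Copilot retrieves grounding data; admins only configure the policy condition in the Purview portal.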
Microsoft Copilot will be able to answer questions based on content shared onscreen during a Teams meeting.
Microsoft: “Copilot will be able to understand slides, documents, spreadsheets, and websites, or anything else shared onscreen. Users will be able to ask simple recall questions, such as ‘Show me the content that was shared on the screen,’ or more specific questions like ‘What was the sales target number?’ if it was shared on a previous slide. Users will also be able to combine screen-share with transcript and chat data to ask, ‘Show me all the slides and the feedback on each slide,’ or ‘Rewrite the paragraph based on the comments from the audience.’”
Microsoft update, August 2025:
After further review, we are not able to continue rolling this out at this time. We apologize for any inconvenience.

Now, Copilot in Teams can analyze content shared on-screen during a meeting when recording is enabled. This, along with the meeting transcript and meeting chat, enables users to ask Copilot to summarize or find specific information from screen-shared content (e.g., ‘Which products had the highest sales?’), consolidate insights across both the conversation and presentation (e.g., ‘What was the feedback per slide?’), and draft new content based on the entire meeting (e.g., ‘Rewrite the paragraph shared on the screen incorporating the feedback from the chat’). This works for any content shared while sharing your desktop screen (including but not limited to documents, slides, spreadsheets, and websites, irrespective of platform or app). Support for PowerPoint Live and Whiteboard in Teams will be available at a later date.
Microsoft announced some important updates to M365 Copilot Chat that will enhance security and user experience, as follows:
Integration with SafeLinks:
M365 Copilot Chat will integrate with SafeLinks in Defender for Office 365 to provide time-of-click URL protection for the hyperlinks included in its chat responses.
This change applies to users with Microsoft Defender for Office 365 Plan 1 or Plan 2 service plans. No policy configuration is needed within the SafeLinks policy.
Within Microsoft Defender for Office 365 Security Center, URL protection report will show the relevant summary and trend views for threats detected and actions taken on URL clicks.
Native Time-of-Click URL Reputation Check:
For users without SafeLinks protection (which is available as part of Microsoft Defender for Office 365), M365 Copilot Chat will natively enable time-of-click URL reputation check for the hyperlinks returned in its chat responses.
Hyperlink Redaction Changes:
M365 Copilot Chat will no longer redact hyperlinks in its chat responses if they are found in the grounding data used to generate the responses.
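The common idea behind the first two updates is that a link's safety is evaluated at the moment the user clicks it, not when the response is generated. The following toy Python sketch illustrates that time-of-click pattern; the hard-coded reputation set and verdict strings are assumptions, not the actual SafeLinks implementation.

```python
# Toy sketch of a time-of-click URL reputation check for hyperlinks
# in chat responses. The reputation source below is a stand-in set,
# purely an assumption; SafeLinks uses Microsoft's threat intelligence.

KNOWN_MALICIOUS = {"http://malicious.example"}  # stand-in reputation feed

def resolve_click(url: str) -> str:
    """Evaluate the URL's reputation at the moment of the click,
    so links that turn malicious after delivery are still caught."""
    if url in KNOWN_MALICIOUS:
        return "blocked"
    return "allowed"
```

The value of checking at click time is that a URL which was clean when the response was generated, but weaponized afterwards, is still blocked when the user actually follows it.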
When this will happen:
General Availability (Worldwide): We will begin rolling out in late March 2025 and expect to complete by late May 2025.
Rollout will start on desktop and web and will complete with mobile versions. We plan to extend these updates to Copilot Chat experiences in Office apps in the future.
How this will affect your organization:
These updates are designed to enhance the security of the links included in M365 Copilot Chat responses, helping ensure that users are protected from malicious URLs.
What you need to do to prepare:
You may consider updating your training and documentation as appropriate to ensure users are aware of the change in behavior with hyperlinks in M365 Copilot Chat.
Microsoft 365 Copilot is a sophisticated processing and orchestration engine that provides AI-powered productivity capabilities by coordinating the following components:
Large language models (LLMs)
Content in Microsoft Graph, such as emails, chats, and documents that you have permission to access.
The Microsoft 365 productivity apps that you use every day, such as Word and PowerPoint.
How does Microsoft 365 Copilot use your proprietary organizational data?
Microsoft 365 Copilot provides value by connecting LLMs to your organizational data. Microsoft 365 Copilot accesses content and context through Microsoft Graph. It can generate responses anchored in your organizational data, such as user documents, emails, calendar, chats, meetings, and contacts. Microsoft 365 Copilot combines this content with the user’s working context, such as the meeting a user is in now, the email exchanges the user had on a topic, or the chat conversations the user had last week. Microsoft 365 Copilot uses this combination of content and context to help provide accurate, relevant, and contextual responses.
Microsoft 365 Copilot only surfaces organizational data to which individual users have at least view permissions. It’s important that you’re using the permission models available in Microsoft 365 services, such as SharePoint, to help ensure the right users or groups have the right access to the right content within your organization. This includes permissions you give to users outside your organization through inter-tenant collaboration solutions, such as shared channels in Microsoft Teams.
When you enter prompts using Microsoft 365 Copilot, the information contained within your prompts, the data they retrieve, and the generated responses remain within the Microsoft 365 service boundary, in keeping with our current privacy, security, and compliance commitments. Microsoft 365 Copilot uses Azure OpenAI services for processing, not OpenAI’s publicly available services. Azure OpenAI doesn’t cache customer content or Copilot-modified prompts for Microsoft 365 Copilot.
Data stored about user interactions with Microsoft 365 Copilot
When a user interacts with Microsoft 365 Copilot (using apps such as Word, PowerPoint, Excel, OneNote, Loop, or Whiteboard), we store data about these interactions. The stored data includes the user’s prompt and Copilot’s response, including citations to any information used to ground Copilot’s response. We refer to the user’s prompt and Copilot’s response to that prompt as the “content of interactions,” and the record of those interactions is the user’s Copilot activity history. For example, this stored data provides users with Copilot activity history in Microsoft 365 Copilot Chat (previously named Business Chat) and meetings in Microsoft Teams. This data is processed and stored in alignment with contractual commitments, alongside your organization’s other content in Microsoft 365. The data is encrypted while it’s stored and isn’t used to train foundation LLMs, including those used by Microsoft 365 Copilot.
To view and manage this stored data, admins can use Content search or Microsoft Purview. Admins can also use Microsoft Purview to set retention policies for the data related to chat interactions with Copilot. For Microsoft Teams chats with Copilot, admins can also use Microsoft Teams Export APIs to view the stored data.
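For the Teams Export API route, admins call the `getAllMessages` endpoint in Microsoft Graph. The sketch below only builds the endpoint URL; the user id is a placeholder, and a real call additionally requires an OAuth bearer token, the appropriate application permissions, and export licensing.

```python
# Sketch: constructing the Microsoft Teams Export API endpoint that
# returns a user's stored chat messages (including Copilot chats in
# Teams) via Microsoft Graph. The user id below is a placeholder.

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def export_messages_url(user_id: str) -> str:
    """Build the Teams Export API getAllMessages URL for a user."""
    return f"{GRAPH_BASE}/users/{user_id}/chats/getAllMessages"

url = export_messages_url("<user-id>")
# A licensed admin app would then issue: GET <url> with a bearer token.
```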
Microsoft 365 Copilot calls to the LLM are routed to the closest data centers in the region, but can also be routed to other regions where capacity is available during high-utilization periods.
For European Union (EU) users, we have additional safeguards to comply with the EU Data Boundary. EU traffic stays within the EU Data Boundary while worldwide traffic can be sent to the EU and other countries or regions for LLM processing. The EU Data Boundary is a geographically defined boundary within which Microsoft has committed to store and process Customer Data and personal data for our Microsoft enterprise online services, including Azure, Dynamics 365, Power Platform, and Microsoft 365, subject to limited circumstances where Customer Data and personal data will continue to be transferred outside the EU Data Boundary.
How does Microsoft 365 Copilot protect organizational data?
The permissions model within your Microsoft 365 tenant can help ensure that data won’t unintentionally leak between users, groups, and tenants. Microsoft 365 Copilot presents only data that each individual can access using the same underlying controls for data access used in other Microsoft 365 services. Semantic Index honors the user identity-based access boundary so that the grounding process only accesses content that the current user is authorized to access.
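The access boundary described above can be pictured as a permission filter applied during grounding. The Python sketch below is a toy illustration of that idea only; the document and ACL structures are assumptions, not how Semantic Index is actually implemented.

```python
# Toy sketch of identity-based access trimming during grounding:
# Copilot surfaces only the documents the current user can already
# view. Document/ACL structures here are illustrative assumptions.

documents = [
    {"title": "Org chart", "allowed_users": {"alice", "bob"}},
    {"title": "M&A plan", "allowed_users": {"alice"}},
]

def ground_for_user(user, docs):
    """Return only documents the user has at least view access to."""
    return [d for d in docs if user in d["allowed_users"]]

visible = [d["title"] for d in ground_for_user("bob", documents)]
# bob sees only the content he already has permission to view.
```

The practical takeaway is the one the text makes: since Copilot inherits your existing permissions, oversharing in SharePoint or Teams becomes oversharing in Copilot, so the permission model itself is the control to review.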
Copilot works together with your Microsoft Purview sensitivity labels and encryption to provide an extra layer of protection. The following diagram provides a visual representation of how Copilot honors your information protection controls using sensitivity labels and encryption.
Copilot works only with your M365 tenant data and can’t access other companies’ data. Plus, your data doesn’t train the AI for other companies to leverage.
🔝 I am excited to announce that I will be speaking at the “Διημερίδα Ψηφιακής Εξέλιξης” (Digital Evolution two-day conference) in Corfu, taking place on February 7-8! @silicon_corfu
📆Title: Get started with Microsoft 365 Copilot in Excel 📝Description: I’m excited to share some insights about the amazing features of Microsoft 365 Copilot in Excel. This innovative tool is designed to help you work more efficiently with your data by providing intelligent suggestions and insights.
With Copilot in Excel, you can do much more with your data. It generates formula column suggestions, shows insights in charts and PivotTables, and highlights interesting data, making it easier for you to uncover valuable information.
In our upcoming presentation, we will explore these features in detail and see how they can enhance our productivity:
📍 Formulas: Writing, explaining, and asking questions
📍 More formula use cases
📍 Working with text
📍 Visualize: Charts and Color
📍 Ask questions about Excel
📍 Demo
🚀 I look forward to seeing you there! Don’t miss the opportunity to participate in this important event and enrich your knowledge of the latest Microsoft technologies. Register now for free and join us for discussions and learning!
Microsoft is introducing Microsoft 365 Copilot Chat, a new offering that adds pay-as-you-go agents to the existing free chat experience for Microsoft 365 commercial customers. Copilot Chat enables your entire workforce—from customer service representatives to marketing leads to front-line technicians—to start using Copilot and agents today. It includes:
Free, secure AI chat powered by GPT-4o.
Agents accessible right in the chat.
IT controls, including enterprise data protection and agent management.
Copilot Chat: The power of chat + agents
Copilot is the UI for AI, and it all starts with Copilot Chat. It’s the chat experience you’ll use every day—powered by broad knowledge from the web, built on GPT-4o, and designed to be safe and secure for business use. It represents a foundational shift in how we work, enabling everyone to work smarter, faster, and more collaboratively.
Copilot Chat includes:
Web-grounded chat with GPT-4o. You can use it to do market research, write a strategy document, or prepare for a meeting. File uploads allow you to add any document to the chat and ask Copilot to do things like summarize key points in a Word document, analyze data in an Excel spreadsheet, and suggest improvements to a PowerPoint presentation. With Copilot Pages, you can collaborate on content with people and AI in real time—adding content from Copilot, your files, and now from the web as well. And you can quickly create AI-generated images for campaigns, product launches, and social media posts.
Agents. Using natural language, now anyone can easily create agents to automate repetitive tasks and business processes—directly in Copilot Chat. A customer service representative can ask a customer relationship management (CRM) agent for account details before a customer meeting, while field service agents can access step-by-step instructions and real-time product knowledge stored in SharePoint. Agents are priced on a metered basis, and IT stays in control. IT admins can also build organization-wide agents and manage agent deployment, all powered by Microsoft Copilot Studio.
Copilot Control System. Copilot Chat includes foundational capabilities of the Copilot Control System, including enterprise data protection (EDP) for data privacy and security and the ability to govern access and manage the usage and lifecycle of Copilot and agents, as well as measurement and reporting.
Download the Microsoft 365 Copilot mobile app from here.
This month Microsoft rolled out support for an additional 12 languages in Microsoft 365 Copilot: Bulgarian, Croatian, Estonian, Greek, Indonesian, Latvian, Lithuanian, Romanian, Serbian (Latin), Slovak, Slovenian, and Vietnamese. Microsoft 365 Copilot now supports a total of 42 languages.
Finally, users working in Serbian will see Teams meeting transcripts in Cyrillic rather than Latin script. This is an issue Microsoft is working to resolve, and Microsoft will provide customers with updates on progress toward Serbian-language Teams meeting transcripts in Latin script as appropriate. Learn more about supported languages for Microsoft Copilot here.
Microsoft is also continuing to expand the list of supported languages, with plans to support even more in the coming months. Stay tuned!