This blog post is a summary of a Microsoft Mechanics video.

How Microsoft 365 Copilot Understands Organizational Data to Enhance Productivity

Author: Microsoft Mechanics
Time: 2024-01-23 01:30:01

Introduction to Large Language Models and Microsoft 365 Copilot

Microsoft recently demonstrated Microsoft 365 Copilot, which transforms how we work by leveraging large language models that interact with organizational data. Copilot works alongside users in Microsoft 365 apps like Word, Outlook, PowerPoint, and Teams to generate relevant content and responses.

In this blog post, we'll break down how large language models gain knowledge from public datasets, interact through natural language prompts, and power intelligent experiences in Microsoft 365.

How Large Language Models Gain Knowledge from Massive Public Datasets

Large language models (LLMs) like those used in Microsoft 365 Copilot are trained on massive amounts of public data, including books, articles, and websites. This training allows them to learn language, context, and meaning that can be applied when generating responses.

You interact with LLMs using natural language prompts: statements or questions that provide context. When you ask a question, the LLM generates a response based on its training and its understanding of the prompt's context. As you continue conversing, the LLM temporarily gains more context from the full prompt history.

Prompt-Based Interactions with Large Language Models

For example, you can give an LLM more context by including additional details in the prompt, and it will refer to that context when formulating its response. However, it only remembers the prompt history during an active conversation; once a new conversation starts, prior information is wiped clean. To illustrate, in a public chatbot like Microsoft Bing Chat, you can describe your outfit in a prompt and the chatbot will answer related questions accurately. In a new chat session, without that context, it no longer knows those details.
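
To make the idea of conversation context concrete, here is a minimal Python sketch of how a chat client keeps prompt history only for the active conversation. The complete() function is a hypothetical stand-in for any chat-completion API, not a Microsoft endpoint; the point is that the model only "remembers" what is resent with each request, and a new conversation starts with an empty history.

  # Minimal sketch: an LLM "remembers" only what is resent with each request.
  # complete() is a hypothetical stand-in for a chat-completion API call.
  def complete(messages: list[dict]) -> str:
      # A real call would send `messages` to an LLM; here we just show how
      # much conversation context the model would receive.
      return f"(model received {len(messages)} message(s) of context)"

  class Conversation:
      def __init__(self) -> None:
          self.history: list[dict] = []           # lives only for this session

      def ask(self, prompt: str) -> str:
          self.history.append({"role": "user", "content": prompt})
          reply = complete(self.history)          # full history sent each turn
          self.history.append({"role": "assistant", "content": reply})
          return reply

  chat = Conversation()
  chat.ask("I'm wearing a blue jacket today.")
  print(chat.ask("What color is my jacket?"))     # history includes the detail

  new_chat = Conversation()
  print(new_chat.ask("What color is my jacket?")) # fresh session: no context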

Microsoft 365 Copilot Components

The Microsoft 365 Copilot system automatically provides LLMs additional context and enterprise data to generate informed, relevant responses in Microsoft 365 apps. Copilot uses private instances of LLMs hosted in Azure. An orchestration engine retrieves relevant content and data through Microsoft Search and Microsoft Graph, respecting user permissions. This additional context is provided alongside the user's query to the LLM to formulate a response.
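
As a rough mental model (not Microsoft's actual implementation), this flow can be sketched as retrieve-then-prompt: look up permission-trimmed enterprise content, fold it into the prompt alongside the user's query, and send the result to a private LLM instance. Both helper functions below are hypothetical placeholders.

  # Illustrative grounding flow: retrieve only content the user is permitted
  # to see, combine it with the query into one prompt, then call the LLM.
  # search_permitted_content() and complete() are hypothetical placeholders,
  # not real Microsoft Search / Microsoft Graph or Azure APIs.
  def search_permitted_content(user: str, query: str) -> list[dict]:
      # Stand-in for a permission-trimmed Microsoft Search / Graph lookup.
      return []   # e.g. [{"title": ..., "snippet": ..., "source": ...}]

  def complete(prompt: str) -> str:
      # Stand-in for a call to a private, Azure-hosted LLM instance.
      return "(response)"

  def copilot_respond(user: str, query: str) -> str:
      grounding = search_permitted_content(user, query)
      context = "\n".join(
          f"- {item['title']}: {item['snippet']} (source: {item['source']})"
          for item in grounding
      )
      prompt = (
          "Answer the question using only the context below and cite the "
          "sources you used.\n\n"
          f"Context:\n{context}\n\nQuestion: {query}"
      )
      return complete(prompt)   # prompt is not stored or used for training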

Copilot Orchestration Engine Retrieves Relevant Enterprise Information

For example, in Microsoft Teams, a user asked Copilot, "Did anything happen yesterday with Fabrikam?" Copilot's orchestration engine searched Microsoft Graph and retrieved relevant emails, files, and sharing activities related to Fabrikam that the user had access to.

It combined this context with the user's question in the prompt, and the response cited each information source. This saves the user the manual effort of searching across each of those services.

Example of Copilot Formulating Response in Microsoft Teams

In this example, the orchestration engine searched Microsoft Graph and retrieved an email thread, a Project Checklist, a March planning presentation, and a contract-sharing notification related to Fabrikam, all of which the user had access to. By combining this contextual enterprise data into the prompt, Copilot could formulate a concise, relevant response for the user, citing each information source.
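
Purely as an illustration of the "citing each information source" step, a response could carry numbered references to the retrieved items named above; the answer text and helper function below are hypothetical.

  # Illustrative only: turning retrieved items into numbered citations that
  # accompany the generated answer. Item names come from the demo above.
  retrieved = [
      "Email thread with Fabrikam",
      "Project Checklist",
      "March planning presentation",
      "Contract sharing notification",
  ]

  def with_citations(answer: str, sources: list[str]) -> str:
      refs = "\n".join(f"[{i}] {name}" for i, name in enumerate(sources, 1))
      return f"{answer}\n\nSources:\n{refs}"

  print(with_citations("Summary of yesterday's Fabrikam activity...", retrieved))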

Copilot Generates Relevant Content Without Retaining Enterprise Data

In apps like Word, Copilot can also save users time by generating content like a draft proposal. It refers to relevant data sources through orchestration and prompt context, without retaining any enterprise information.

For example, Copilot scanned the user's OneNote and other accessible documents to construct a prompt for generating a new proposal draft. The LLM then produced original content guided by that additional context.

Example of Copilot Drafting Proposal in Microsoft Word

Here, Copilot's orchestration engine scanned the user's OneNote and accessible Word and PowerPoint files to extract relevant information. By combining this data into the prompt context, the LLM could generate an original proposal draft from those additional inputs. Importantly, Copilot's LLM does not retain prompt content containing enterprise data, nor does it use prompts to further train itself.
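
The same pattern applies to drafting. The sketch below, again with hypothetical gather_accessible_docs() and complete() placeholders rather than real Microsoft APIs, emphasizes that the grounded prompt is built for a single call and is not persisted or used for training.

  # Illustrative grounded drafting: collect excerpts the user already has
  # access to, fold them into a one-off prompt, and keep only the draft.
  # gather_accessible_docs() and complete() are hypothetical placeholders.
  def gather_accessible_docs(user: str, topic: str) -> list[str]:
      # Stand-in for pulling excerpts from the user's OneNote and accessible
      # Word/PowerPoint files through permission-aware search.
      return []

  def complete(prompt: str) -> str:
      # Stand-in for a call to a private, Azure-hosted LLM instance.
      return "(proposal draft)"

  def draft_proposal(user: str, topic: str) -> str:
      excerpts = gather_accessible_docs(user, topic)
      prompt = (
          f"Draft a proposal about {topic} using these excerpts as "
          "background:\n" + "\n".join(f"- {e}" for e in excerpts)
      )
      draft = complete(prompt)
      # The prompt, including any enterprise data in it, is discarded here;
      # nothing is retained or used to further train the model.
      return draft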

Summary of Copilot's Privacy-Preserving Use of Organizational Data

To summarize, Copilot provides intelligent experiences in Microsoft 365 powered by large language models and orchestration:

  • LLMs gain broad knowledge from public data training, not private enterprise data

  • Copilot orchestration retrieves relevant private data based on user permissions to formulate prompts

  • Prompts containing private data context are not retained or used to train Copilot's LLMs

  • This allows Copilot to generate informed content while respecting data privacy

Conclusion

Microsoft 365 Copilot demonstrates how large language models can deliver helpful experiences across apps while safeguarding private data. The orchestration engine targets relevant information while the LLM generates responses without retaining that sensitive context. This privacy-focused approach allows organizations to benefit from AI while maintaining control over their information.

FAQ

Q: How does Microsoft 365 Copilot gain knowledge?
A: The large language models behind Copilot gain their knowledge through training on massive public datasets like books, articles, and websites. Copilot does not retain or learn from private enterprise data.

Q: How does Copilot access my organization's data?
A: The Copilot orchestration engine retrieves relevant enterprise information based on permissions, relationships and activities in Microsoft 365. It provides this info to the LLM via prompts.

Q: Does Copilot retain my private data?
A: No, Copilot does not retain prompts containing private data nor use them to train the LLM models. Prompts are temporary and forgotten after use.

Q: Can Copilot read all my organization's documents?
A: No, Copilot respects individual user permissions and will only access documents you have access to while using the feature.

Q: Is Copilot safe and private?
A: Yes, Copilot's orchestration and permissions model ensures it only accesses information you have access to. Data is not retained or used for training.

Q: How does Copilot help save time?
A: Copilot automates multistep workflows like searching for, retrieving, and summarizing data into concise responses, and it can generate draft content by drawing on your existing documents.

Q: What Microsoft apps support Copilot?
A: Copilot capabilities are available across Microsoft 365 apps like Outlook, Word, PowerPoint and Teams to enhance productivity.

Q: Is Copilot using the public ChatGPT API?
A: No, Microsoft 365 Copilot uses private instances of LLMs via Microsoft's Azure OpenAI service, not the public ChatGPT.

Q: Where can I learn more about Copilot?
A: Visit Microsoft Mechanics on YouTube for more Copilot demo videos and aka.ms/MicrosoftResponsibleAI for details on responsible AI.

Q: When will Copilot be available?
A: Copilot is currently in limited preview. Check with your Microsoft account team for availability.