Overview

Microsoft Copilot with enterprise data protection is a generative AI tool available to University students, faculty, and staff through:

Copilot can assist with generating content, coding, drafting emails, summarizing information, and more.

Latest news:

  • Protected health information is no longer approved by the University for use in Copilot because Microsoft has revised its previous data protection statements.
  • As of October 1, 2024, files (including images, PDFs, and documents) cannot be uploaded to Copilot because Microsoft's file-upload feature now relies on an integration with Microsoft OneDrive, which the University does not use. Users attempting to upload a file see the message "An error occurred when uploading your file. Please try again." Repeated attempts return the same message.

Enterprise Data Protection

You must sign in with your University internet ID and password to enable Copilot’s enterprise data protection. To confirm that you are signed in, verify that a shield symbol appears in the top-right corner of your screen. When the shield is visible, prompts and responses are not saved, and Microsoft does not have access to the data for the purpose of training its language models. It’s a secure way to use Copilot while protecting your data and the University’s data.

Important Notes:

  • Pending review, Private-Highly Restricted data may not be used with Microsoft Copilot with enterprise data protection.
  • Before entering non-public data into other generative AI tools, including other versions of Copilot, ensure they have been properly vetted for your data classification.

Appropriate Use

Review the following resources before using generative AI:

AI tools can generate incomplete, inaccurate, or biased responses, so any output should be closely reviewed and verified by a human.

Highlights

  • Copilot can:
    • Summarize text
    • Provide feedback on the tone of your writing
    • Quickly create stock images
  • Enterprise data protection ensures:
    • Prompts and responses aren't saved
    • Chat data isn't used to train the underlying large language models
    • Previous chats aren’t retained or made available to users or the organization
    • Third-party plugins and actions are disabled, preventing data access by external entities

Getting Started