At Barysa S.A., we help make data security and privacy clear as you explore AI. With our support, you can be confident that Microsoft protects your information under strict standards. Let's talk about how to start this journey.
How does Microsoft 365 Copilot utilize organizational data?
Microsoft 365 Copilot connects large language models (LLMs) to your organizational data by accessing content through Microsoft Graph. It generates responses grounded in the documents, emails, calendar events, chats, and meetings that the signed-in user has permission to access. This combination of content and context helps produce accurate and relevant responses. Importantly, prompts and responses are not used to train the foundation LLMs.
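Copilot's internal grounding pipeline is not public, but the access pattern it relies on is visible in Microsoft Graph itself: Graph only ever returns items the authenticated user is permitted to see. The sketch below uses the Microsoft Graph JavaScript SDK to pull a user's recent mail and calendar events; the auth provider setup is assumed to come from elsewhere (for example, MSAL), and this is an illustration of the pattern, not Copilot's actual implementation.

```typescript
// Sketch: retrieving user-scoped content through Microsoft Graph.
// Graph only returns items the signed-in user can already access,
// which is the same permission-trimming principle Copilot grounding
// depends on. This is illustrative, not Copilot's internal code.
import { Client, AuthenticationProvider } from "@microsoft/microsoft-graph-client";

// The auth provider (assumed, e.g. built with MSAL) supplies a token
// carrying the user's delegated permissions.
function createGraphClient(authProvider: AuthenticationProvider): Client {
  return Client.initWithMiddleware({ authProvider });
}

async function fetchUserContext(client: Client) {
  // Recent mail the user can read (delegated Mail.Read scope).
  const messages = await client
    .api("/me/messages")
    .select("subject,bodyPreview")
    .top(5)
    .get();

  // Upcoming calendar events (delegated Calendars.Read scope).
  const events = await client
    .api("/me/events")
    .select("subject,start,end")
    .top(5)
    .get();

  return { messages: messages.value, events: events.value };
}
```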
What measures are in place to protect organizational data?
Microsoft 365 Copilot enforces your organization's existing permissions model, so it can only surface data that the requesting user is already authorized to access. It uses multiple layers of protection, including encryption for data at rest and in transit, and adheres to privacy regulations like GDPR. Additionally, it implements logical isolation of customer content and honors usage rights granted to users, ensuring that sensitive information remains secure.
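To make the permissions idea concrete, here is a minimal, purely hypothetical sketch of permission trimming: candidate documents are filtered against the requesting user's access rights before any content reaches the model. The `Doc` type, its `allowedUsers` field, and `trimByPermission` are illustrative stand-ins, not a real Copilot or Graph API.

```typescript
// Hypothetical sketch of permission trimming: restricted documents
// are excluded from the grounding context before the LLM sees anything.
interface Doc {
  id: string;
  content: string;
  allowedUsers: Set<string>; // simplified stand-in for a real ACL
}

function trimByPermission(docs: Doc[], userId: string): Doc[] {
  // Only documents the user is explicitly allowed to read survive.
  return docs.filter((d) => d.allowedUsers.has(userId));
}

// Example: only doc-1 is visible to "alice", so only it can appear
// in the context passed to the model.
const docs: Doc[] = [
  { id: "doc-1", content: "Q3 roadmap", allowedUsers: new Set(["alice"]) },
  { id: "doc-2", content: "HR salary data", allowedUsers: new Set(["bob"]) },
];
console.log(trimByPermission(docs, "alice").map((d) => d.id)); // ["doc-1"]
```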
What data is stored from user interactions with Copilot?
When users interact with Microsoft 365 Copilot, data such as prompts and generated responses is stored as part of the user's Copilot activity history. This data is encrypted and processed in line with organizational commitments. Admins can manage this stored data using tools like Microsoft Purview, and users have the option to delete their activity history through the My Account portal.
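For intuition on what a retention-driven cleanup of stored interactions involves, here is a hypothetical sketch of a retention sweep. In practice this is handled by Microsoft Purview retention policies and the My Account portal; `InteractionRecord` and `purgeExpired` are illustrative names only, not a Microsoft API.

```typescript
// Hypothetical sketch of a retention sweep over stored Copilot
// interaction records; retention in the real service is governed
// by Purview policies, not application code like this.
interface InteractionRecord {
  userId: string;
  prompt: string;
  response: string;
  timestamp: Date;
}

function purgeExpired(
  history: InteractionRecord[],
  retentionDays: number,
  now: Date = new Date()
): InteractionRecord[] {
  const cutoff = new Date(now.getTime() - retentionDays * 24 * 60 * 60 * 1000);
  // Keep only records newer than the cutoff; older prompts and
  // responses are dropped, mirroring a retention-policy purge.
  return history.filter((rec) => rec.timestamp >= cutoff);
}

// Example: a record from 90 days ago is purged under a 30-day policy.
const old: InteractionRecord = {
  userId: "u1",
  prompt: "Summarize my meetings",
  response: "...",
  timestamp: new Date(Date.now() - 90 * 24 * 60 * 60 * 1000),
};
console.log(purgeExpired([old], 30).length); // 0
```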