www.zenity.io Open in urlscan Pro
141.193.213.20  Public Scan

Submitted URL: https://pages.zenity.io/e3t/Ctc/GG+113/d30KZy04/VWJk0K2Nx0_PN38MbmLXDH9ZW3fCDQw57hj5gN9cd5G85m_5PW50kH_H6lZ3lFW2RP5Gt3qW...
Effective URL: https://www.zenity.io/microsoft-copilot-studio-vulnerabilities-explained/?utm_campaign=FY2312%20-%20Content%20-%206%20...
Submission: On December 20 via api from ES — Scanned from ES

Form analysis 0 forms found in the DOM

Text Content

Skip to content
 * Platform
 * Solutions
   
   
 * Resources
   
   
 * Company
   
   
 * Blog
 * Contact us





SOLUTIONS


ZENITY IS TRUSTED BY THE MOST INNOVATIVE COMPANIES IN THE WORLD

Browse Solutions


PLATFORMS

Power Platform
Salesforce
ServiceNow


TECHNOLOGIES

LCAP
RPA
iPaaS


USE CASES

Citizen Development
Business Continuity
Compliance


RESOURCES


VIEW EBOOKS, WHITEPAPERS, VIDEOS AND MORE IN THE INDUSTRY'S FIRST AND PACKED
RESOURCE LIBRARY FOR NO-CODE/LOW-CODE SECURITY GOVERNANCE

View Resource


COLLATERAL

Frameworks
Whitepapers
Case Studies


GALLERY

Videos
Webinars


LEARNING CENTER

Glossary


COMPANY


ZENITY IS THE LEADING PLATFORM TO SECURE AND MONITOR NO-CODE/LOW-CODE
DEVELOPMENT

Read More


WHY ZENITY

About Us
Careers
Trust Center


AROUND THE WORLD

Newsroom
Book Your Demo


MICROSOFT COPILOT STUDIO VULNERABILITIES: EXPLAINED

 * 
   Written by Andrew Silberman
 * Post published:December 19, 2023



Last week, Michael Bargury and the team at Zenity published a video summarizing
6 vulnerabilities that are found in Microsoft Copilot Studio. The video
highlights, in sequence, a myriad of ways that business users can create their
own AI Copilots that are risky, why they are risky, and how they can be easily
exploited. While I highly recommend checking out the video, this blog sets out
to provide a look at why these vulnerabilities matter, and what considerations
should be taken to mitigate them. 

POOR AUTHENTICATION PROTOCOLS LEAD TO OVER-SHARING

After building a Copilot, builders can both publish them for broader use, as
well as share them with other business users at the organization to help drive
efficiency and productivity. The challenge for security teams is that the
default setting makes it so that no authentication is required before other
people can use this Copilot, meaning that it is publicly accessible to anyone
with the link. 



In the video, Michael cites an example of an HR app, but also think of a
financial Copilot that’s been created to verify budget information. This
Copilot, in order to provide timely and accurate information, likely needs to be
connected to an internal Sharepoint site or SAP table that contains sensitive
data pertaining to corporate budget. If anyone can simply log on to access a bot
that has access to potentially sensitive data, it can lead to data leakage
and/or a failed audit. 

CREDENTIAL SHARING

Further, when building a bot without authentication methods, the builder embeds
their identity into the application, making it so that every time someone logs
in or uses the bot, it appears that it is the builder. This results in a lack of
visibility for security leaders, but also is a prime example of credential
sharing, where any user that interacts with the bot essentially interacts with,
and uses someone else’s credentials. 

THE PROCESS OF MITIGATION

While it is relatively straightforward to identify and fix the issue, two things
need to proactively happen. First, an authentication method must be chosen, and
second, builders must select the box that requires users to log in. This whole
process is very reminiscent of ‘opt-in’ vs. ‘opt-out’ that many businesses
inject into processes that allows them to harvest data, send spammy emails, and
more, unless the user goes out of their way to correct it (which many don’t).  



While seemingly simple, in the high-velocity world of citizen development, it is
very easy to make a mistake and/or forget. This is pronounced when less
technical users are the ones building their own apps and bots, as they lack the
training and awareness to think of potential security vulnerabilities. 

To prevent this from happening, anyone using Copilot Studio should be instructed
to double check the authentication method for a Copilot they are building. There
are some instances where it makes sense to not require authentication (think of
a customer service chatbot that pops up when you enter a new website), but in
many cases, particularly those that require access to sensitive data, unchecked
access is not needed. 

PROMPT INJECTION

AI Copilots are often targeted in prompt injection attacks, where in a direct
attack, a hacker modifies a large language model’s input in an attempt to
overwrite existing system prompts. In the example given, Michael shows what a
prompt injection looks like in the video with a simple request for the bot to
provide information about an impending layoff plan that is contained in a
Sharepoint site. Even for verified employees, this type of information would not
be something desirable to ‘get out,’ and even less desirable for a bot that has
not been fortified with even basic authentication. 



However, it becomes that much worse when unauthorized users access these bots
with bad intentions. It becomes easy, especially when these bad actors are
masquerading as trusted insiders, to ‘trick’ the bots to giving them sensitive
information, which results in frequent and widespread data leaks. This can also,
of course, lead to failures of compliance and data exfiltration. 

VIOLATION OF LEAST PRIVILEGE

Finally, within Copilot Studio, in order to track performance and see the
interactions with each individual bot, Microsoft provides access to transcripts
in a shared table. However, for now, this is a shared table that is accessible
by many, that contains all transcripts from all of the bots. While this is a
clear violation of least privilege, it is done with good intentions, because
people need to be able to verify that bots are working as designed, so they can
tweak and optimize them. However, in this case, it provides a single place that
contains all transcripts and engagements with each bot, resulting in a huge
opportunity for data leakage. 



RECOMMENDATIONS

For starters, many organizations are rushing to slow the use and adoption of AI
tools, but in many cases, cannot be fully contained. At Zenity, we became the
first company to offer support for business-led development of Enterprise AI
Copilots just a couple of weeks ago and are already providing value to our
customers in the way of: 

 1. Maintaining Visibility. With Zenity, customers can continuously identify
    copilots as they are being introduced to the corporate environment. Zenity
    also tags sensitive data and correlates that with any copilot that has
    access to storing, processing, or otherwise handling data that should be
    tightly guarded. 
 2. Assessing Risk. Each copilot and bot is also automatically scanned and
    analyzed to determine which bots lack proper authentication methods, are
    likely to leak data, are susceptible to prompt injections, and more.
 3. Governing Citizen Development. As citizen developers are prone to putting
    bots into production with errors, security leaders can autonomously resolve
    security and compliance issues and ensure that as more and more people use
    tools like Copilot Studio, that there are guardrails in place to ensure
    harmony.

If you’d like to see this in action, come chat with us!


ALL THE NEWS STRAIGHT TO YOUR INBOX. SIGNUP FOR ZENITY’S WEEKLY NEWSLETTER.

Don’t miss a single opportunity to get knowledge.




ABOUT THE AUTHOR


ANDREW SILBERMAN


As the Director of Marketing at Zenity, Andrew is responsible for telling
stories that resonate. He is a passionate advocate for customers,  and helping
to translate real-world findings into things that provide value for the world of
low-code/no-code security. With nearly 10 years of cybersecurity experience,
Andrew has held past sales and marketing leadership positions with CyberArk, the
leader in Privileged Access Management, and heading up product marketing at
Omada, a leader in identity governance.


TABLE OF CONTENTS





YOU MIGHT FIND THESE ARTICLES INTERESTING


THE 7 DEADLY SINS OF LOW-CODE SECURITY AND HOW TO AVOID THEM

Read More
November 17, 2021


CTO MICHAEL BARGURY’S THOUGHTS ON LOW-CODE SECURITY FEATURED ON DARK READING

Read More
December 9, 2021


LOW-CODE SECURITY AND BUSINESS EMAIL COMPROMISE VIA EMAIL AUTO-FORWARDING

Read More
January 16, 2022

hello@zenity.io
Linkedin Twitter Youtube Instagram



PLATFORM


 * Platform
 * Solutions

 * Platform
 * Solutions


COMPANY


 * About Us

 * About Us


RESOURCES


 * Resources
 * Blog
 * Terms and Conditions
 * Privacy Policy
 * Cookies Policy

 * Resources
 * Blog
 * Terms and Conditions
 * Privacy Policy
 * Cookies Policy


MY ACCOUNT


 * Sign Up

 * Sign Up


COPYRIGHT © ZENITY LTD. | ALL RIGHTS RESERVED 2023

you're currently offline


APPSEC FOR COPILOTS


ZENITY INTRODUCES FULL SUPPORT FOR MICROSOFT COPILOT STUDIO

Learn More