REVERSINGLABS BLOG

Dev & DevSecOps | February 28, 2024


ARE AI DEVELOPMENT TOOLS EXPOSING YOUR ORGANIZATION? 4 KEY CONSIDERATIONS


WHEN USING AI TOOLS INCLUDING GITHUB COPILOT, YOUR SECURITY TEAM MUST BE AWARE
OF — AND PROTECT AGAINST — CERTAIN RISKS. HERE ARE FOUR KEY QUESTIONS TO ASK.

Jaikumar Vijayan, freelance technology journalist


Microsoft's soon-to-be-released GitHub Copilot Enterprise option will give
organizations an enterprise-grade subscription plan for its AI-powered
code-completion tool, which helps developers write code faster.


The option will give administrators a single point of visibility and management
over Copilot use in the enterprise and will include security and privacy
features to protect enterprise code from potential threats and compromise.

The enterprise plan gives organizations that have a low appetite for risk one of
the best options yet to harness the productivity benefits of an AI pair
programmer such as Copilot while mitigating some of the perceived risks
associated with the technology.

When using AI tools such as Copilot, organizations need to be cognizant of a
number of security and legal risks and protect against them. This applies to
AI-powered code auto-completion, code-generation, and code-optimization tools
alike. Here are four important questions to ask.

[ Learn more in our report: The Buyer’s Guide to Software Supply Chain Security
| See Webinar: Why you need to upgrade your AppSec tools for the new era ]


1. DO COMPONENTS OF YOUR CODE BELONG TO SOMEONE ELSE?

One of the biggest concerns associated with automatic code-generation and
code-completion tools is the potential for copyright violations and licensing
complications. Copilot and similar AI-based tools often draw on public and
private codebases for training data. The black-box nature of these technologies
means organizations have no visibility into whether the code snippets and
partial code that these tools suggest include copyrighted material or other
protected intellectual property.

Microsoft has acknowledged the concerns that some of its customers have
expressed about the potential IP and copyright issues associated with the use of
Copilot: "Microsoft is bullish on the benefits of AI, but, as with any powerful
technology, we’re clear-eyed about the challenges and risks associated with it,
including protecting creative works."

The company's enterprise versions of Copilot explicitly block outputs that
match public code as one measure to reduce copyright-related risks. To further
ease some of these concerns, the company announced last September that it would
offer legal protections from third-party copyright infringement claims to
Copilot customers. Under Microsoft's Copilot Copyright Commitment provision,
Microsoft will defend customers — and even pay for any adverse judgments or
settlements — resulting from any lawsuit that a third party might file against
them over copyright infringement claims.

However, to be eligible for the protection, organizations need to ensure that
they use the specific guardrails and content filters that Microsoft has built
into Copilot. The goal of integrating the filters and other mitigation
technologies is to reduce the likelihood of Copilot returning infringing
content, Microsoft said:

> "These build on and complement our work to protect digital safety, security,
> and privacy, based on a broad range of guardrails such as classifiers,
> meta-prompts, content filtering, and operational monitoring and abuse
> detection."

FOSSA, developer of the eponymous software platform for managing open-source
license compliance, also recommends that organizations scan their AI-generated
code for potentially copyrighted or licensed code and tag all AI-generated
code. In addition, FOSSA recommends that organizations enable GitHub Copilot's
optional duplicate code-detection feature to further reduce the risk of
license compliance issues.
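
As a sketch of how a team might act on FOSSA's tagging advice, the following script flags files that carry an assumed "@ai-generated" marker comment so they can be routed to license review. The marker convention is a hypothetical team practice, not a Copilot or FOSSA feature.

```python
# Hypothetical sketch: find source files tagged as AI-generated so they can
# be routed to a license/compliance scan. The "@ai-generated" marker is an
# assumed team convention, not a Copilot or FOSSA feature.
import pathlib

MARKER = "@ai-generated"  # assumed tagging convention

def find_tagged_files(repo_root: str) -> list[str]:
    """Return paths of Python files containing the AI-generated marker."""
    tagged = []
    for path in pathlib.Path(repo_root).rglob("*.py"):
        try:
            if MARKER in path.read_text(encoding="utf-8", errors="ignore"):
                tagged.append(str(path))
        except OSError:
            continue  # unreadable file; skip it
    return tagged

if __name__ == "__main__":
    for f in find_tagged_files("."):
        print(f"License review needed: {f}")
```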

John Bambenek, president at Bambenek Consulting, said organizations should be
very concerned because “the AI made it” will likely not be a defense against a
copyright claim.

"Organizations are making money with this code, which means there is a strict
level of scrutiny on the organization against using copyrighted code, even if
they had no idea it was copyrighted in the first place."
—John Bambenek


2. IS YOUR AI TOOL INTRODUCING VULNERABILITIES?

Copilot can inadvertently suggest code snippets that contain security
weaknesses, which developers then introduce into their software projects
without review. Past research has shown that GitHub Copilot and other AI-based
code-completion and code-generation assistants such as AWS CodeWhisperer,
which are trained on open-source code and libraries with known
vulnerabilities, can often reproduce output that contains those same
vulnerabilities.

Eric Schwake, director of cybersecurity strategy at Salt Security, said that
while AI tools including Copilot can help to dramatically decrease the time
needed for software development, they can also introduce security concerns that
need to be accounted for. He said that it's imperative that organizations that
use tools such as Copilot do due diligence when utilizing code or code
suggestions.

"Because of the black-box nature of these tools, organizations need strategies
in place to ensure the AI is providing code that is both secure and compliant
with their internal regulations."
—Eric Schwake

A new study from Snyk found that if an organization uses Copilot in a project
with existing vulnerabilities, the AI-based tool will amplify those
vulnerabilities through its suggestions. Conversely, when Copilot is used in a
project without existing security issues, the code it generates is mostly
vulnerability-free.

An older, 2021 analysis of the security of GitHub Copilot's code contributions
by researchers at New York University and the University of Calgary found that
40% of 1,689 programs that Copilot helped produce for the study contained
vulnerabilities. The researchers found that while Copilot could significantly
increase the productivity of software developers, it also heightened security
risks. 

"Put simply, when Copilot suggests code, it may inadvertently replicate existing
security vulnerabilities and bad practices present in the neighbor files," Snyk
said. "This can lead to insecure coding practices and open the door to a range
of security vulnerabilities."

Mitigating these risks means having mechanisms in place to scan the output of
AI code-prompting and code-generation tools for vulnerabilities, along with
code-review policies that require approval of all auto-generated code before
deployment.
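
As a concrete illustration of such a gate, here is a minimal sketch that runs the open-source Bandit scanner over Python files changed in a pull request and fails the check if Bandit reports findings. The base branch name and CI wiring are assumptions, and Bandit is just one example of a scanner that could sit in this slot.

```python
# Minimal sketch of a pre-merge vulnerability gate: run Bandit (assumed to
# be installed) over Python files changed relative to the base branch and
# fail the CI job if it reports any findings.
import subprocess
import sys

def changed_python_files(base: str = "origin/main") -> list[str]:
    """List Python files modified relative to the base branch."""
    out = subprocess.run(
        ["git", "diff", "--name-only", base, "--", "*.py"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in out.stdout.splitlines() if line]

def main() -> int:
    files = changed_python_files()
    if not files:
        return 0  # nothing to scan
    # Bandit exits nonzero when it finds issues, which fails the check.
    return subprocess.run(["bandit", "-q", *files]).returncode

if __name__ == "__main__":
    sys.exit(main())
```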

While Copilot has evolved considerably since the 2021 study, researchers
believe organizations still need to pair the tool with controls for
comprehensive vetting. Matt Rose, field CISO at ReversingLabs, said that while
traditional application security testing (AST) tools can test for
vulnerabilities and tools such as software composition analysis (SCA) can help
validate open-source licensing, what organizations need is a tool that can
provide visibility into the entire software packages companies develop or use.

Rose wrote recently that complex binary analysis is the right tool for dealing
with today's increasingly complex software.

> "An evolution of application security (AppSec) is under way, and a key to it
> is complex binary analysis, which is like a final exam for your software
> package before release. Complex binary analysis allows your team to review the
> software in final form so that you can trust all of the software your
> organization produces and consumes."
> —Matt Rose


3. ARE YOUR SECRETS ENDING UP IN THE TRAINING DATA?

AI writing assistants, code-completion tools, and chatbots have a tendency to
memorize large chunks of training data and can regurgitate that data verbatim
when given the appropriate prompts. A study that researchers at Google's
DeepMind AI research lab conducted in collaboration with peers at Cornell
University, UC Berkeley, and three other universities showed how an adversary
could extract training data from ChatGPT simply by prompting it to incessantly
repeat specific words such as "poem," "make," "send," and "company."

This can become a significant issue with AI-based coding assistants because the
training data can sometimes contain copyrighted code and hard-coded secrets such
as access tokens, OAuth IDs, and API keys. Though such data is supposed to
remain private, it can often end up in codebases on public repositories such as
GitHub, which tools such as Copilot then use as training data. In 2022,
developers inadvertently committed codebases to GitHub that in total contained
over 3 million hard-coded secrets.
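
To make the pattern at issue concrete, here is a minimal illustration of a hard-coded secret versus one injected from the environment at runtime; the token name and value are invented for the example.

```python
# Illustration only: a hard-coded secret lives in the source tree, in every
# clone of the repo, and potentially in a model's training data.
import os

API_TOKEN = "sk-example-1234567890abcdef"  # hard-coded secret (do not do this)

# Safer: the secret stays out of the codebase entirely and is supplied by
# the runtime environment (e.g., a secrets manager or deploy-time config).
API_TOKEN = os.environ.get("EXAMPLE_API_TOKEN", "")
```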

Just as with ChatGPT, research has shown that attackers can extract these
secrets from AI-based code assistants using the appropriate prompts. To
illustrate the extent of the problem, researchers at the Chinese University of
Hong Kong and Sun Yat-sen University in China ran a tool they developed called
Hard-coded Credential Revealer against Copilot and CodeWhisperer, extracting
2,702 hard-coded credentials from Copilot and 129 secrets from CodeWhisperer.

Salt Security's Schwake said awareness was key.

> "As with most things, when it comes to security, it’s important that there are
> awareness programs in place for DevOps that explain the risks associated with
> Copilot and how it could potentially interact with sensitive data. Ensuring
> secure coding practices across the organization is especially important to
> limit the risk of data loss."
> —Eric Schwake

FOSSA also recommends that organizations ensure they have opted out of
allowing Copilot to use their prompts or code snippets for training purposes.

Philip George, executive technical assistant at Merlin Cyber, said sound
secrets hygiene should be exercised when curating training data for Copilot.
The goal should be to ensure that no hard-coded secrets exist within the
training content or in the code repositories, he said.

> "Consider establishing a cryptographic bill of materials to track the
> proliferation of secrets across a given codebase and incorporate centralized
> credential management and Just-in-time (JIT) access for CI/CD development."
> —Philip George

There are mechanisms, usually based on pattern matching, to prevent secrets
from getting into public repositories, Bambenek added. "But [often] these
mechanisms are ineffectively deployed, if they are deployed at all."
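
A minimal sketch of that pattern-matching approach follows; the regexes cover a few well-known credential formats and are illustrative rather than exhaustive.

```python
# Minimal sketch of a pattern-matching secret scanner, suitable for a
# pre-commit hook. The regexes are illustrative, not exhaustive.
import re
import sys

PATTERNS = {
    "AWS access key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "GitHub token": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    "generic API key": re.compile(r"(?i)api[_-]?key\s*=\s*['\"][^'\"]{16,}['\"]"),
}

def scan(text: str) -> list[str]:
    """Return the names of any credential patterns found in the text."""
    return [name for name, rx in PATTERNS.items() if rx.search(text)]

if __name__ == "__main__":
    for path in sys.argv[1:]:
        with open(path, encoding="utf-8", errors="ignore") as fh:
            for name in scan(fh.read()):
                print(f"{path}: possible {name}")
```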


4. CAN SOMEONE MANIPULATE YOUR CODE COMPLETION TOOL? 

One risk that organizations need to consider and protect against is the
potential for adversaries to try to "poison" AI systems or to deliberately
confuse them into malfunctioning.

The National Institute of Standards and Technology (NIST) earlier this year
highlighted four attack classes that threat actors can use to trigger these
outcomes:

 * Evasion attacks: attempts to alter an input to affect how the system
   responds to it
 * Poisoning attacks: the introduction of corrupted data into the training set
 * Privacy attacks: the extraction of training data for malicious purposes
 * Abuse attacks: the insertion of incorrect or false information into a
   source web page or document that the AI system ingests

NIST noted in its blog post:

“Most of these attacks are fairly easy to mount and require minimum knowledge of
the AI system and limited adversarial capabilities. Poisoning attacks, for
example, can be mounted by controlling a few dozen training samples, which would
be a very small percentage of the entire training set.”

Such attacks can happen in the context of AI-powered code assistants such as
GitHub Copilot and CodeWhisperer, said Merlin Cyber's George. For starters,
organizations should adopt regular static analysis scans of their codebase in
conjunction with strict access control requirements for data repositories to
mitigate this risk, he said.
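
One basic control against poisoning of a training-data repository is to record a hash manifest of its contents at review time and verify it before each training run, so unauthorized modifications become detectable. The sketch below assumes a simple JSON manifest; the format and paths are illustrative.

```python
# Hypothetical sketch: detect tampering with a training-data directory by
# comparing SHA-256 digests against a previously recorded JSON manifest.
import hashlib
import json
import pathlib

def build_manifest(data_dir: str) -> dict[str, str]:
    """Map each file's relative path to its SHA-256 digest."""
    root = pathlib.Path(data_dir)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def verify(data_dir: str, manifest_path: str) -> list[str]:
    """Return paths added, removed, or modified since the manifest."""
    recorded = json.loads(pathlib.Path(manifest_path).read_text())
    current = build_manifest(data_dir)
    suspect = {p for p in recorded if current.get(p) != recorded[p]}
    suspect |= set(current) - set(recorded)  # newly added files
    return sorted(suspect)

# Usage: write build_manifest() output to JSON when the data set is reviewed,
# then refuse to train if verify() returns any paths.
```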

DevOps teams that plan on using Copilot or similar tools should also consider
adopting access control mechanisms to enforce the compartmentalization of
datasets used for training large language models in the environment, George
said. He recommended that organizations consider models such as the
Bell-LaPadula model — often used by the military and government — to protect
sensitive information when using AI-based assistants in the development
environment.

> "The focus is to ensure both the confidentiality and integrity of data sources
> to maintain trust for AI pair/code-generation tools."
> —Philip George
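
The Bell-LaPadula rules reduce to "no read up, no write down." A minimal sketch of those two checks, with invented label names, shows how a pipeline cleared for internal data would be walled off from more sensitive repositories.

```python
# Minimal sketch of the two core Bell-LaPadula rules. Labels and the example
# subjects are invented for illustration.
LEVELS = {"public": 0, "internal": 1, "confidential": 2, "secret": 3}

def can_read(clearance: str, label: str) -> bool:
    """Simple security property: no read up."""
    return LEVELS[clearance] >= LEVELS[label]

def can_write(clearance: str, label: str) -> bool:
    """Star property: no write down."""
    return LEVELS[clearance] <= LEVELS[label]

# A training pipeline cleared for "internal" data can read public and
# internal repositories, but never confidential or secret ones.
assert can_read("internal", "public") and not can_read("internal", "secret")
assert can_write("internal", "confidential") and not can_write("internal", "public")
```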


AI GIVETH, AND AI TAKETH AWAY

AI tools such as Copilot can help to dramatically decrease the time needed for
app development. But they can also introduce security concerns that need to be
accounted for, Schwake said.

It’s important for organizations to set in place guardrails around the use of
such technologies, the kind of data they have access to, and how, or whether,
such data can be shared. 

> "There is still uncertainty on whether AI-generated code can be copyrighted,
> so this needs to be considered if your organization utilizes code that Copilot
> has built."
> —Eric Schwake

Get up to speed on key trends and understand the landscape with The State of
Software Supply Chain Security 2024. Plus: Learn about ReversingLabs Spectra
Assure for software supply chain security.


KEEP LEARNING

 * Update your understanding: Buyer's Guide for Software Supply Chain Security
 * Join the Webinar: Why you need to upgrade your AppSec for the new era
 * See the Webinar: The State of Software Supply Chain Security 2024
 * See Gartner's guidance on managing software supply chain risk

 * Tags:
 * Dev & DevSecOps

