Attackers used AI prompts to silently exfiltrate code from GitHub repositories

The exploit relied on GitHub’s “invisible comments” feature to hide malicious instructions from human reviewers.

by Kerem Gülen
October 15, 2025
in Cybersecurity, News

A critical vulnerability in GitHub Copilot Chat, dubbed “CamoLeak,” allowed attackers to silently steal source code and secrets from private repositories using a sophisticated prompt injection technique. The flaw, which carried a CVSS score of 9.6, has since been patched by GitHub.

How the CamoLeak attack worked

The attack, discovered by security researcher Omer Mayraz, began with the attacker hiding malicious instructions inside a pull request description using GitHub’s “invisible comments” feature. While this content is not visible to users in the standard interface, Copilot Chat ingests all repository and pull request context, including this hidden metadata, when generating responses.
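One way such invisible-but-ingested content can exist in Markdown is an HTML comment, which GitHub’s renderer strips from the page while it remains in the raw description. The sketch below illustrates that asymmetry only; the hidden text is a harmless placeholder, not the actual CamoLeak payload, and the exact form of GitHub’s “invisible comments” is an assumption here.

```python
import re

# Raw pull request description as stored: the HTML comment stays in the raw
# Markdown, so any tool that reads the full PR context (such as an AI
# assistant) still sees it. The hidden text is a harmless placeholder.
raw_description = """\
Fixes a typo in the README.

<!-- hidden instructions aimed at the AI assistant would sit here -->
"""

# What a human reviewer sees: Markdown/HTML renderers drop comments entirely.
rendered_for_humans = re.sub(r"<!--.*?-->", "", raw_description, flags=re.DOTALL)

print(rendered_for_humans)        # the comment is gone from the rendered view
print("<!--" in raw_description)  # True: it remains in the raw context
```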

The vulnerability was triggered when a legitimate developer with access to private repositories asked Copilot Chat a question about the compromised pull request. Copilot, which operates with the permissions of the querying user, then executed the hidden malicious prompt. This allowed the attacker to direct the AI assistant to search for sensitive information, such as API keys or source code, across the victim’s accessible private repositories.

To exfiltrate the stolen data, the attack leveraged GitHub’s own “Camo” image proxy service. Normally, GitHub’s Content Security Policy (CSP) stops rendered content from loading resources from arbitrary external domains, which would otherwise offer a direct exfiltration channel. The Camo proxy is designed to safely route external image requests, rewriting URLs to a camo.githubusercontent.com address protected by a cryptographic signature.
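GitHub has not published its exact parameters, but the open-source Camo design signs each destination URL with an HMAC keyed on a shared secret, so the proxy only fetches destinations whose signatures verify. A minimal sketch of that scheme, with a made-up secret:

```python
import hashlib
import hmac

# Sketch of Camo-style signed proxy URLs (modeled on the open-source Camo
# project; GitHub's production settings may differ). The secret is made up.
CAMO_HOST = "https://camo.githubusercontent.com"
SHARED_SECRET = b"not-the-real-secret"

def signed_proxy_url(destination: str) -> str:
    """Return a proxy URL that a Camo-style service would agree to fetch."""
    digest = hmac.new(SHARED_SECRET, destination.encode(), hashlib.sha1).hexdigest()
    return f"{CAMO_HOST}/{digest}/{destination.encode().hex()}"

print(signed_proxy_url("https://example.org/pixel.png"))
```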

CamoLeak bypassed these protections by having the attacker pre-generate a dictionary of signed Camo URLs. Each URL pointed to a benign, invisible 1×1-pixel image hosted on the attacker’s server, with each unique URL representing a single character of data (e.g., ‘A’, ‘B’, ‘1’, ‘;’).
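Conceptually, that dictionary is just a mapping from each character the attacker wants to be able to exfiltrate to its own pre-signed pixel URL. A toy illustration, with placeholder hosts, paths, and signatures rather than real Camo URLs:

```python
import string

# Toy version of the pre-signed dictionary: one distinct (already signed)
# 1x1-pixel URL per character. Hosts, paths, and signatures are placeholders.
ALPHABET = string.ascii_letters + string.digits + string.punctuation + " "

presigned_pixels = {
    ch: f"https://camo.githubusercontent.com/signature-{i:03d}/pixel-{i:03d}.png"
    for i, ch in enumerate(ALPHABET)
}

print(len(presigned_pixels))   # one URL per character in the alphabet
print(presigned_pixels["A"])
```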

The injected prompt then instructed Copilot to construct its response by referencing these pre-signed image URLs in a specific sequence that encoded the stolen repository content. When the victim’s browser rendered Copilot’s output, it made a series of requests through the trusted Camo proxy to fetch each invisible pixel. The sequence of these requests, as received by the attacker’s server, effectively reconstructed the stolen data character by character, all without displaying any malicious content to the user or triggering standard network security alerts.
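Reconstructing the data on the receiving end is then just a reverse lookup over the order in which the pixel requests arrive. A self-contained toy round-trip of that encoding (every name and URL here is illustrative, not taken from the exploit):

```python
# Toy round-trip of the per-character covert channel described above.
# char_to_url stands in for the pre-signed dictionary; the 'observed' list
# stands in for the order in which pixel requests reached the attacker's
# server. All values are made up for illustration.
char_to_url = {ch: f"https://pixels.example/{ord(ch):03d}.png" for ch in "AB12;"}
url_to_char = {url: ch for ch, url in char_to_url.items()}

def encode(data: str) -> list[str]:
    """Order the pixel URLs so their sequence spells out the data."""
    return [char_to_url[ch] for ch in data if ch in char_to_url]

def decode(observed: list[str]) -> str:
    """Reverse lookup: the request order reconstructs the data."""
    return "".join(url_to_char[url] for url in observed)

observed = encode("A1B2;")
print(decode(observed))  # -> A1B2;
```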


Tags: CamoLeak, GitHub
