Close Menu
    Facebook LinkedIn YouTube WhatsApp X (Twitter) Pinterest
    Trending
    • Let’s Talk About ChatGPT and Cheating in the Classroom
    • Google’s Will Smith double is better at eating AI spaghetti … but it’s crunchy?
    • NBA Playoffs 2025: How to Watch Pacers vs. Knicks Game 2 Tonight
    • Master VR: Self-Hacks to Reduce Sickness
    • Prototyping Gradient Descent in Machine Learning
    • Affordable small living with Dragon Tiny Homes’ Fairfax tiny house
    • Freedom of the Press Foundation Threatens Legal Action if Paramount Settles With Trump Over ’60 Minutes’ Interview
    • Researchers cause GitLab AI developer assistant to turn safe code malicious
    Facebook LinkedIn WhatsApp
    Times FeaturedTimes Featured
    Saturday, May 24
    • Home
    • Founders
    • Startups
    • Technology
    • Profiles
    • Entrepreneurs
    • Leaders
    • Students
    • VC Funds
    • More
      • AI
      • Robotics
      • Industries
      • Global
    Times FeaturedTimes Featured
    Home»News»Researchers cause GitLab AI developer assistant to turn safe code malicious
    News

    Researchers cause GitLab AI developer assistant to turn safe code malicious

    Editor Times FeaturedBy Editor Times FeaturedMay 24, 2025No Comments2 Mins Read
    Facebook Twitter Pinterest Telegram LinkedIn Tumblr WhatsApp Email
    Share
    Facebook Twitter LinkedIn Pinterest Telegram Email WhatsApp Copy Link

    Entrepreneurs promote AI-assisted developer instruments as workhorses which might be important for in the present day’s software program engineer. Developer platform GitLab, as an example, claims its Duo chatbot can “immediately generate a to-do checklist” that eliminates the burden of “wading by means of weeks of commits.” What these corporations don’t say is that these instruments are, by temperament if not default, simply tricked by malicious actors into performing hostile actions in opposition to their customers.

    Researchers from safety agency Legit on Thursday demonstrated an assault that induced Duo into inserting malicious code right into a script it had been instructed to jot down. The assault may additionally leak non-public code and confidential concern knowledge, reminiscent of zero-day vulnerability particulars. All that’s required is for the consumer to instruct the chatbot to work together with a merge request or related content material from an outdoor supply.

    AI assistants’ double-edged blade

    The mechanism for triggering the assaults is, in fact, immediate injections. Among the many most typical types of chatbot exploits, immediate injections are embedded into content material a chatbot is requested to work with, reminiscent of an e mail to be answered, a calendar to seek the advice of, or a webpage to summarize. Giant language model-based assistants are so desperate to comply with directions that they’ll take orders from nearly anyplace, together with sources that may be managed by malicious actors.

    The assaults concentrating on Duo got here from varied sources which might be generally utilized by builders. Examples embody merge requests, commits, bug descriptions and feedback, and supply code. The researchers demonstrated how directions embedded inside these sources can lead Duo astray.

    “This vulnerability highlights the double-edged nature of AI assistants like GitLab Duo: when deeply built-in into improvement workflows, they inherit not simply context—however danger,” Legit researcher Omer Mayraz wrote. “By embedding hidden directions in seemingly innocent undertaking content material, we had been capable of manipulate Duo’s conduct, exfiltrate non-public supply code, and exhibit how AI responses may be leveraged for unintended and dangerous outcomes.”



    Source link

    Share. Facebook Twitter Pinterest LinkedIn Tumblr Email
    Editor Times Featured
    • Website

    Related Posts

    Google’s Will Smith double is better at eating AI spaghetti … but it’s crunchy?

    May 24, 2025

    Feds charge 16 Russians allegedly tied to botnets used in cyberattacks and spying

    May 23, 2025

    Oracle will buy ~400,000 Nvidia GB200 chips and lease them to OpenAI at its 1.2 gigawatts Texas data center, billed as the first US Stargate project (Financial Times)

    May 23, 2025

    Intuit stock jumps 9%+ after reporting Q3 revenue up 15% YoY to $7.8B with FY 2025 guidance of $18.72B to $18.76B, up from $18.16B to $18.35B (Ashley Capoot/CNBC)

    May 23, 2025

    New Claude 4 AI model refactored code for 7 hours straight

    May 23, 2025

    Destructive malware available in NPM repo went unnoticed for 2 years

    May 23, 2025
    Leave A Reply Cancel Reply

    Editors Picks

    Let’s Talk About ChatGPT and Cheating in the Classroom

    May 24, 2025

    Google’s Will Smith double is better at eating AI spaghetti … but it’s crunchy?

    May 24, 2025

    NBA Playoffs 2025: How to Watch Pacers vs. Knicks Game 2 Tonight

    May 24, 2025

    Master VR: Self-Hacks to Reduce Sickness

    May 24, 2025
    Categories
    • Founders
    • Startups
    • Technology
    • Profiles
    • Entrepreneurs
    • Leaders
    • Students
    • VC Funds
    About Us
    About Us

    Welcome to Times Featured, an AI-driven entrepreneurship growth engine that is transforming the future of work, bridging the digital divide and encouraging younger community inclusion in the 4th Industrial Revolution, and nurturing new market leaders.

    Empowering the growth of profiles, leaders, entrepreneurs businesses, and startups on international landscape.

    Asia-Middle East-Europe-North America-Australia-Africa

    Facebook LinkedIn WhatsApp
    Featured Picks

    The DOJ Still Wants Google to Sell Off Chrome

    March 8, 2025

    The iPhone 17 might use Apple’s own Wi-Fi chips

    November 2, 2024

    Jury orders NSO to pay $167 million for hacking WhatsApp users

    May 19, 2025
    Categories
    • Founders
    • Startups
    • Technology
    • Profiles
    • Entrepreneurs
    • Leaders
    • Students
    • VC Funds
    Copyright © 2024 Timesfeatured.com IP Limited. All Rights.
    • Privacy Policy
    • Disclaimer
    • Terms and Conditions
    • About us
    • Contact us

    Type above and press Enter to search. Press Esc to cancel.