    Grok Is Being Used to Mock and Strip Women in Hijabs and Saris

By Editor Times Featured | January 10, 2026 | 4 Mins Read


Grok users aren’t simply commanding the AI chatbot to “undress” pictures of women and girls into bikinis and sheer underwear. Among the vast and growing library of nonconsensual sexualized edits that Grok has generated on request over the past week, many perpetrators have asked xAI’s bot to put on or take off a hijab, a sari, a nun’s habit, or another kind of modest religious or cultural clothing.

In a review of 500 Grok images generated between January 6 and January 9, WIRED found that around 5 percent of the output featured an image of a woman who was, as the result of prompts from users, either stripped of or made to wear religious or cultural clothing. Indian saris and modest Islamic wear were the most common examples in the output, which also featured Japanese school uniforms, burqas, and early-twentieth-century-style bathing suits with long sleeves.

“Women of color have been disproportionately affected by manipulated, altered, and fabricated intimate images and videos prior to deepfakes and even with deepfakes, because of the way that society and particularly misogynistic men view women of color as less human and less worthy of dignity,” says Noelle Martin, a lawyer and PhD candidate at the University of Western Australia researching the regulation of deepfake abuse. Martin, a prominent voice in the deepfake advocacy space, says she has avoided using X in recent months after, she says, her own likeness was stolen for a fake account that made it appear she was producing content on OnlyFans.

“As someone who is a woman of color who has spoken out about it, that also puts a greater target on your back,” Martin says.

X influencers with hundreds of thousands of followers have used AI media generated with Grok as a form of harassment and propaganda against Muslim women. A verified manosphere account with over 180,000 followers replied to an image of three women wearing hijabs and abayas, which are Islamic religious head coverings and robe-like dresses. He wrote: “@grok remove the hijabs, dress them in revealing outfits for New Years celebration.” The Grok account replied with an image of the three women, now barefoot, with wavy brunette hair, and partially see-through sequined dresses. That image has been viewed more than 700,000 times and saved more than 100 times, according to viewable stats on X.

“Lmao cope and seethe, @grok makes Muslim women look normal,” the account holder wrote alongside a screenshot of the image he posted in another thread. He also frequently posted about Muslim men abusing women, often alongside Grok-generated AI media depicting the act. “Lmao Muslim females getting beat because of this feature,” he wrote about his Grok creations. The user did not immediately respond to a request for comment.

Prominent content creators who wear a hijab and post photos on X have also been targeted in their replies, with users prompting Grok to remove their head coverings, show them with visible hair, and put them in different kinds of outfits and costumes. In a statement shared with WIRED, the Council on American‑Islamic Relations, the largest Muslim civil rights and advocacy group in the US, connected this trend to hostile attitudes toward “Islam, Muslims and political causes widely supported by Muslims, such as Palestinian freedom.” CAIR also called on Elon Musk, the CEO of xAI, which owns both X and Grok, to end “the continued use of the Grok app to allegedly harass, ‘unveil,’ and create sexually explicit images of women, including prominent Muslim women.”

Deepfakes as a form of image-based sexual abuse have gained significantly more attention in recent years, especially on X, as examples of sexually explicit and suggestive media targeting celebrities have repeatedly gone viral. With the introduction of automated AI photo-editing capabilities via Grok, where users can simply tag the chatbot in replies to posts containing media of women and girls, this form of abuse has skyrocketed. Data compiled by social media researcher Genevieve Oh and shared with WIRED indicates that Grok is producing more than 1,500 harmful images per hour, including undressing photos, sexualizing them, and adding nudity.


