    Vision-only manipulation is hitting a wall

    By Editor Times Featured, April 30, 2026


    In 2016, I said something that went against where robotics was heading at the time: vision alone doesn't work for grasping.

    Not "it needs improvement." Not "the tech isn't there yet." It doesn't fit the problem.

    Grasping is physical. Contact, force, friction. Vision can guide the approach. It can't feel what happens next.

    Back then, we saw it in the lab. Tactile vibration data predicted grasp failure with 83% accuracy and detected slip at 92%. Early results, but clear enough. The signals that matter don't show up in images.
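As a rough illustration of what slip detection from vibration can look like, here is a minimal sketch: slip tends to excite high-frequency vibration at the fingertip, so the fraction of signal energy in a high-frequency band can flag incipient slip before anything visibly moves. The sample rate, frequency band, and signals below are illustrative assumptions, not the method behind the numbers above.

```python
import numpy as np

def slip_score(vibration, fs=1000, band_hz=(50, 400)):
    """Fraction of signal energy in a high-frequency band.

    A rising score suggests incipient slip at the fingertip.
    All parameters here are illustrative.
    """
    spectrum = np.abs(np.fft.rfft(vibration)) ** 2
    freqs = np.fft.rfftfreq(len(vibration), d=1.0 / fs)
    band = (freqs >= band_hz[0]) & (freqs <= band_hz[1])
    return spectrum[band].sum() / spectrum.sum()

# Stable hold: low-frequency drift only. Slipping: added high-frequency buzz.
t = np.arange(1000) / 1000.0
stable = 0.1 * np.sin(2 * np.pi * 5 * t)       # 5 Hz sway, outside the band
slipping = stable + 0.1 * np.sin(2 * np.pi * 200 * t)  # 200 Hz slip vibration
assert slip_score(slipping) > slip_score(stable)
```

In practice a threshold on a score like this, or a learned classifier over such features, runs in the control loop and triggers a regrasp.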

    Ten years later, the rest of the field is running into the same limit.

    Vision gets you close

    Vision still matters. It handles detection, positioning, and planning. It gets the robot to the right place, lined up the right way.

    It does that well, but manipulation doesn't stop when the gripper reaches the object.

    That's where things break.

    What happens at contact isn't visible

    Before contact, the robot is working off images.

    After contact, it's dealing with forces.

    A bad grasp doesn't start as a visual change. It shows up as a shift in force. Slip begins in the fingertips before anything moves enough to see. Too much pressure shows up in the wrist before the object deforms.

    By the time a camera picks up a problem, it's already happening.

    Vision sees outcomes. Touch sensing measures interaction as it happens.

    And the useful data lives right there, at the moment of contact.

    The proof is already there

    This isn't a theory anymore.

    Tactile-driven policies beat vision-only ones on tasks that involve force. Benchmarks like ManiSkill-ViTac show better performance when you combine vision with tactile input, especially in insertion and assembly. Models like π0, OpenVLA, and Octo rely on synchronized inputs from multiple sensors. Remove force or tactile data, and performance drops.

    No one is replacing vision. They're adding what's missing.

    The strongest systems today combine vision, proprioception, force, and touch into a single model.

    That's what moves performance.
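In its simplest form, combining those modalities into a single model means the policy consumes one observation built from all of them. The sketch below shows only the shape of that combination, with a toy linear head; real systems such as the models named above learn the encoders and the policy end to end, and all dimensions here are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse(vision_feat, proprio, force_torque, tactile):
    """Late fusion by concatenation: one observation vector per timestep.

    Real policies learn encoders for each modality; this sketch only
    shows how the streams combine into a single model input.
    """
    return np.concatenate([vision_feat, proprio, force_torque, tactile])

obs = fuse(
    vision_feat=rng.standard_normal(256),  # image embedding (placeholder size)
    proprio=rng.standard_normal(7),        # joint positions
    force_torque=rng.standard_normal(6),   # 6-axis wrist force/torque
    tactile=rng.standard_normal(32),       # fingertip taxel array
)
W = rng.standard_normal((8, obs.size)) * 0.01  # toy linear policy head
action = W @ obs                               # 8-DoF action, e.g. arm + gripper
```

Dropping the `force_torque` or `tactile` slice from `obs` is exactly the ablation that makes performance fall on contact-rich tasks.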

    Vision has already given most of what it can

    Vision still carries a lot of the system. But it doesn't solve the hard part.

    Physical AI improves with more data, but not all data matters the same. Force and tactile signals have an outsized impact on how well a system handles real contact.

    Most datasets still lean heavily on vision and joint data.

    So you see the same pattern over and over. Robots reach the right place. Then struggle with insertion, assembly, and anything that depends on compliance.

    The missing information is physical.

    Tactile data hasn't scaled yet

    Collecting good contact data hasn't been easy. You need instrumented end effectors, reliable force and tactile sensors, tight synchronization, and consistent formats.
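The "tight synchronization and consistent formats" part can be made concrete with a sketch of a per-timestep record: every modality stamped against a shared clock, serialized one line per sample. The field names, rates, and values below are illustrative assumptions, not an established dataset schema.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ContactFrame:
    """One synchronized sample across modalities (illustrative schema)."""
    t_ns: int               # shared clock, nanoseconds
    joint_pos: list[float]  # proprioception
    wrench: list[float]     # 6-axis wrist force/torque
    taxels: list[float]     # fingertip tactile array
    image_id: str           # pointer to the nearest camera frame

frame = ContactFrame(
    t_ns=1_700_000_000_000_000_000,
    joint_pos=[0.0] * 7,
    wrench=[0.0, 0.0, 4.9, 0.0, 0.0, 0.0],  # ~0.5 kg load on the z axis
    taxels=[0.0] * 16,
    image_id="cam0/000123.png",
)
record = json.dumps(asdict(frame))  # one line per sample in a log file
```

Whatever the concrete format, the point is the same: force and tactile streams are only useful for training if they can be aligned, sample by sample, with the images and joint states.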

    That's a hardware problem as much as a modelling one.

    Until recently, the infrastructure wasn't there.

    Now it is.

    The bottleneck is how fast teams can deploy it and start collecting data.

    Closing the loop

    What began as a claim in 2016 is now showing up everywhere.

    Robots that only see will keep hitting the same limits. Robots that can feel will start to close the gap.

    Vision stays. It's not going anywhere.

    But it won't carry manipulation on its own. The shift comes from adding the signals that matter at the point of contact.

    At Robotiq, our tactile sensors are built to capture these signals directly at the gripper, so robots see and feel what they're doing.

    Contact us to speak with an expert.




