    Mustafa Suleyman: AI development won’t hit a wall anytime soon—here’s why

By Editor Times Featured | April 8, 2026


We evolved for a linear world. If you walk for an hour, you cover a certain distance. Walk for two hours and you cover double that distance. This intuition served us well on the savannah. But it fails catastrophically when confronting AI and the core exponential trends at its heart.

From the time I started work on AI in 2010 to now, the amount of compute that goes into training frontier AI models has grown by a staggering 1 trillion times: from roughly 10¹⁴ flops (floating-point operations, the core unit of computation) for early systems to over 10²⁶ flops for today's largest models. That is an explosion. Everything else in AI follows from this fact.
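As a quick sanity check on those figures, here is the implied arithmetic (a sketch: the two endpoint estimates are the essay's own, and the 16-year window from 2010 to 2026 is an assumption):

```python
# Training-compute growth cited in the essay: roughly 1e14 FLOPs for
# early systems (2010) to over 1e26 FLOPs for today's largest models.
early = 1e14
frontier = 1e26

growth = frontier / early          # total growth factor
print(f"{growth:.0e}")             # 1e+12, i.e. a trillionfold

# Implied average annual multiplier over an assumed 16-year window:
years = 16
annual = growth ** (1 / years)
print(f"{annual:.1f}x per year")   # 5.6x per year
```

The implied ~5.6x annual multiplier is consistent with the 5x-per-year frontier training-compute growth the essay cites later.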

The skeptics keep predicting walls. And they keep being wrong in the face of this epic generational compute ramp. Sometimes they point out that Moore's Law is slowing. They also cite a lack of data, or limitations on energy.

But when you look at the combined forces driving this revolution, the exponential trend looks quite predictable. To understand why, it's worth examining the complex and fast-moving reality beneath the headlines.

Think of AI training as a room full of people working calculators. For years, adding computational power meant adding more people with calculators to that room. Much of the time those workers sat idle, drumming their fingers on desks, waiting for the numbers to come through for their next calculation. Every pause was wasted potential. Today's revolution goes beyond more and better calculators (although it delivers those); it's really about ensuring that all those calculators never stop, and that they work together as one.

Three advances are now converging to enable this. First, the basic calculators got faster. Nvidia's chips have delivered an eightfold increase in raw performance in just six years, from 312 teraflops in 2020 to 2,500 teraflops today. Our own Maia 200 chip, launched this January, delivers 30% better performance per dollar than any other hardware in our fleet. Second, the numbers arrive faster thanks to a technology called HBM, or high-bandwidth memory, which stacks chips vertically like tiny skyscrapers; the latest generation, HBM3, triples the bandwidth of its predecessor, feeding data to processors fast enough to keep them busy all the time. Third, the room of people with calculators became an office, and then an entire campus or city. Technologies like NVLink and InfiniBand connect hundreds of thousands of GPUs into warehouse-size supercomputers that function as single cognitive entities. A few years ago this was unimaginable.

These gains all come together to deliver dramatically more compute. Where training a language model took 167 minutes on eight GPUs in 2020, it now takes under 4 minutes on equivalent modern hardware. To put this in perspective: Moore's Law would predict only about a 5x improvement over this period. We saw 50x. We've gone from two GPUs training AlexNet, the image-recognition model that kicked off the modern boom in deep learning in 2012, to over 100,000 GPUs in today's largest clusters, each individually far more powerful than its predecessors.

    Then there’s the revolution in software program. Analysis from Epoch AI means that the compute required to succeed in a hard and fast efficiency degree halves roughly each eight months, a lot sooner than the standard 18-to-24-month doubling of Moore’s Regulation. The prices of serving some current fashions have collapsed by an element of as much as 900 on an annualized foundation. AI is changing into radically cheaper to deploy.

The numbers for the near future are just as staggering. Consider that leading labs are growing capacity at nearly 4x annually. Since 2020, the compute used to train frontier models has grown 5x each year. Global AI-relevant compute is forecast to hit 100 million H100-equivalents by 2027, a tenfold increase in three years. Put all this together and we're looking at something like another 1,000x in effective compute by the end of 2028. It's plausible that by 2030 we'll bring an additional 200 gigawatts of compute online each year, comparable to the peak energy use of the UK, France, Germany, and Italy combined.
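One way the "another 1,000x by the end of 2028" figure can be reached is by compounding physical capacity growth with algorithmic efficiency gains. The essay does not spell out this arithmetic, so the multiplicative decomposition and the three-year window below are assumptions:

```python
# Rough compounding check: treat effective compute as
# physical compute x algorithmic efficiency.
hardware_per_year = 4.0               # lab capacity growth cited above
algorithmic_per_year = 2 ** (12 / 8)  # efficiency doubling every ~8 months

effective_per_year = hardware_per_year * algorithmic_per_year
three_years = effective_per_year ** 3

print(round(effective_per_year, 1))  # 11.3x per year
print(round(three_years))            # 1448, i.e. on the order of 1,000x
```

Under these assumptions, the two trends compound to roughly 11x per year, which lands in the stated ~1,000x range over three years.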

What does all this get us? I believe it will drive the transition from chatbots to nearly human-level agents: semiautonomous systems capable of writing code for days, carrying out weeks- and months-long projects, making calls, negotiating contracts, managing logistics. Forget basic assistants that answer questions. Think teams of AI workers that plan, collaborate, and execute. Right now we're only in the foothills of this transition, and the implications stretch far beyond tech. Every industry built on cognitive work will be transformed.

The obvious constraint here is energy. A single refrigerator-size AI rack consumes 120 kilowatts, equivalent to 100 homes. But this hunger collides with another exponential: solar prices have fallen by a factor of nearly 100 over 50 years; battery costs have dropped 97% over three decades. A pathway to clean scaling is coming into view.

The capital is deployed. The engineering is delivering. The $100 billion clusters, the 10-gigawatt power draws, the warehouse-scale supercomputers: these are no longer science fiction. Ground is being broken for these projects now, across the US and the world. As a result, we're heading toward true cognitive abundance. At Microsoft AI, this is the world our superintelligence lab is planning for and building.

Skeptics accustomed to a linear world will continue predicting diminishing returns. They'll keep being surprised. The compute explosion is the technological story of our time, full stop. And it's still only just beginning.

    Mustafa Suleyman is CEO of Microsoft AI.


