    Why AI leaders can’t afford fragmented AI tools

By Editor Times Featured · March 21, 2025 · 7 Mins Read


    TL;DR:

Fragmented AI tools are draining budgets, slowing adoption, and frustrating teams. To control costs and accelerate ROI, AI leaders need interoperable solutions that reduce tool sprawl and streamline workflows.

AI investment is under a microscope in 2025. Leaders aren't just being asked to prove AI's value; they're being asked why, after significant investments, their teams still struggle to deliver results.

One in four teams report difficulty implementing AI tools, and nearly 30% cite integration and workflow inefficiencies as their top frustration, according to our Unmet AI Needs report.

The culprit? A disconnected AI ecosystem. When teams spend more time wrestling with disconnected tools than delivering results, AI leaders risk ballooning costs, stalled ROI, and high talent turnover.

AI practitioners spend more time maintaining tools than solving business problems. The biggest blockers? Manual pipelines, tool fragmentation, and connectivity roadblocks.

Imagine if cooking a single dish required using a different stove every single time. Now imagine running a restaurant under those conditions. Scaling would be impossible.

Similarly, AI practitioners are bogged down by time-consuming, brittle pipelines, leaving less time to advance and deliver AI solutions.

AI integration must accommodate diverse working styles, whether code-first in notebooks, GUI-driven, or a hybrid approach. It must also bridge gaps between teams, such as data science and DevOps, where each group relies on different toolsets. When these workflows remain siloed, collaboration slows and deployment bottlenecks emerge.

Scalable AI also demands deployment flexibility, such as JAR files, scoring code, APIs, or embedded applications. Without an infrastructure that streamlines these workflows, AI leaders risk stalled innovation, rising inefficiencies, and unrealized AI potential.

How integration gaps drain AI budgets and resources

Interoperability hurdles don't just slow teams down; they carry significant cost implications.

The top workflow restrictions AI practitioners face:

    • Manual pipelines. Tedious setup and maintenance pull AI, engineering, DevOps, and IT teams away from innovation and new AI deployments.
    • Tool and infrastructure fragmentation. Disconnected environments create bottlenecks and inference latency, forcing teams into endless troubleshooting instead of scaling AI.
    • Orchestration complexities. Manual provisioning of compute resources (configuring servers, DevOps settings, and adjusting as usage scales) is not only time-consuming but nearly impossible to optimize by hand. This leads to performance limitations, wasted effort, and underutilized compute, ultimately preventing AI from scaling effectively.
    • Difficult updates. Fragile pipelines and tool silos make integrating new technologies slow, complicated, and unreliable.

The long-term cost? Heavy infrastructure management overhead that eats into ROI.

More budget goes toward the overhead of manual patchwork solutions instead of delivering results.

Over time, these process breakdowns lock organizations into outdated infrastructure, frustrate AI teams, and stall business impact.

Code-first developers need customization, but technology misalignment makes it harder to work efficiently.

    • 42% of developers say customization improves AI workflows.
    • Only 1 in 3 say their AI tools are easy to use.

This disconnect forces teams to choose between flexibility and usability, leading to misalignments that slow AI development and complicate workflows. But these inefficiencies don't stop with developers. AI integration issues have a much broader impact on the business.

The real cost of integration bottlenecks

Disjointed AI tools and systems don't just strain budgets; they create ripple effects that undermine team stability and operations.

    • The human cost. With an average tenure of just 11 months, data scientists often leave before organizations can fully benefit from their expertise. Frustrating workflows and disconnected tools contribute to high turnover.
    • Lost collaboration opportunities. Only 26% of AI practitioners feel confident relying on their own expertise, making cross-functional collaboration essential for knowledge-sharing and retention.

Siloed infrastructure slows AI adoption. Leaders often turn to hyperscalers for cost savings, but those solutions don't always integrate easily with existing tools, adding backend friction for AI teams.

Generative and agentic AI are adding more complexity

With 90% of respondents expecting generative AI and predictive AI to converge, AI teams must balance user needs with technical feasibility.

As King's Hawaiian CDAO Ray Fager explains:
    "Using generative AI in tandem with predictive AI has really helped us build trust. Business users 'get' generative AI since they can easily interact with it. When they have a GenAI app that helps them interact with predictive AI, it's much easier to build a shared understanding."

With increasing demand for generative and agentic AI, practitioners face mounting compute, scalability, and operational challenges. Many organizations are layering new generative AI tools on top of their existing technology stack without a clear integration and orchestration strategy.

Adding generative and agentic AI without the foundation to efficiently allocate these complex workloads across all available compute resources increases operational strain and makes AI even harder to scale.

4 steps to simplify AI infrastructure and cut costs

Streamlining AI operations doesn't have to be overwhelming. Here are actionable steps AI leaders can take to optimize operations and empower their teams:

Step 1: Assess tool flexibility and adaptability

Agentic AI requires modular, interoperable tools that support frictionless upgrades and integrations. As requirements evolve, AI workflows should remain flexible, not constrained by vendor lock-in or rigid tools and architectures.

Two critical questions to ask:

    • Can AI teams easily connect, manage, and interchange tools such as LLMs, vector databases, or orchestration and security layers without downtime or major reengineering?
    • Do our AI tools scale across varied environments (on-prem, cloud, hybrid), or are they locked into specific vendors and rigid infrastructure?

    Step 2: Leverage a hybrid interface

53% of practitioners prefer a hybrid AI interface that blends the flexibility of coding with the accessibility of GUI-based tools. As one data science lead explained, "GUI is important for explainability, especially for building trust between technical and non-technical stakeholders."

    Step 3: Streamline workflows with AI platforms

Consolidating tools into a unified platform reduces manual pipeline stitching, eliminates blockers, and improves scalability. A platform approach also optimizes AI workflow orchestration by leveraging the best available compute resources, minimizing infrastructure overhead while ensuring low-latency, high-performance AI solutions.

    Step 4: Foster cross-functional collaboration

When IT, data science, and business teams align early, they can identify workflow barriers before they become implementation roadblocks. Using unified tools and shared systems reduces redundancy, automates processes, and accelerates AI adoption.

    Set the stage for future AI innovation

The Unmet AI Needs survey makes one thing clear: AI leaders must prioritize adaptable, interoperable tools or risk falling behind.

Rigid, siloed systems not only slow innovation and delay ROI; they also prevent organizations from responding to fast-moving developments in AI and enterprise technology.

With 77% of organizations already experimenting with generative and predictive AI, unresolved integration challenges will only become more costly over time.

Leaders who address tool sprawl and infrastructure inefficiencies now will lower operational costs, optimize resources, and see stronger long-term AI returns.

Get the full DataRobot Unmet AI Needs report to learn how top AI teams are overcoming implementation hurdles and optimizing their AI investments.

About the authors

May Masoud

Product Marketing Manager, DataRobot

May Masoud is a data scientist, AI advocate, and thought leader trained in classical statistics and modern machine learning. At DataRobot she designs market strategy for the DataRobot AI Governance product, helping global organizations derive measurable return on AI investments while maintaining enterprise governance and ethics.

May developed her technical foundation through degrees in Statistics and Economics, followed by a Master of Business Analytics from the Schulich School of Business. This mix of technical and business expertise has shaped May as an AI practitioner and a thought leader. May delivers Ethical AI and Democratizing AI keynotes and workshops for business and academic communities.


Kateryna Bozhenko

Product Manager, AI Production, DataRobot

Kateryna Bozhenko is a Product Manager for AI Production at DataRobot, with broad experience in building AI solutions. With degrees in International Business and Healthcare Administration, she is passionate about helping users make AI models work effectively to maximize ROI and experience the true magic of innovation.


