    Build LLM Agents Faster with Datapizza AI

    By Editor Times Featured · October 30, 2025 · 8 min read

    Organizations are investing more and more in AI as these new tools are adopted in everyday operations. This continuous wave of innovation is fueling the demand for more efficient and reliable frameworks. Following this trend, Datapizza (the startup behind Italy’s tech community) just released an open-source framework for GenAI in Python, called Datapizza AI.

    When building LLM-powered Agents, you need to pick an AI stack:

    • Language Model – the brain of the Agent. The first broad choice is open-source (i.e. Llama, DeepSeek, Phi) vs paid (i.e. ChatGPT, Claude, Gemini). Then, based on the use case, one needs to consider the LLM knowledge: generic (knows a little bit of everything, like Wikipedia) vs topic-specific (i.e. fine-tuned for coding or finance).
    • LLM Engine – it’s what runs the language model, responding to prompts, inferring meaning, and generating text. Basically, it generates intelligence. The most used are OpenAI (ChatGPT), Anthropic (Claude), Google (Gemini), and Ollama (runs open-source models locally).
    • AI Framework – it’s the orchestration layer to build and manage workflows. To put it another way, the framework structures the intelligence created by LLMs. At the moment, the landscape is dominated by LangChain, LlamaIndex, and CrewAI. The new library Datapizza AI falls into this category and aims to be an alternative to the other main frameworks.

    In this article, I’m going to show how to use the new Datapizza framework to build LLM-powered AI Agents. I’ll present some useful Python code that can easily be applied to other similar cases (just copy, paste, run) and walk through every line of code with comments so that you can replicate this example.

    Setup

    I’ll use Ollama as the LLM engine, because I like to host models locally on my computer. That’s the standard practice for companies with sensitive data: keeping everything local gives full control over data privacy, model behavior, and cost.

    First of all, you need to download Ollama from the website. Then, pick a model and run the command indicated on the page to pull the LLM. I’m going with Alibaba’s Qwen, as it’s both good and light (ollama run qwen3).

    Datapizza AI supports all the main LLM engines. We can complete the setup by running the following commands:

    pip install datapizza-ai
    pip install datapizza-ai-clients-openai-like

    As indicated in the official documentation, we can quickly test our AI stack by calling the model with a simple prompt and asking a question. The object OpenAILikeClient() is how you connect to the Ollama API, which is usually hosted on the usual localhost URL.

    from datapizza.clients.openai_like import OpenAILikeClient
    
    llm = "qwen3"
    
    prompt = '''
    You are an intelligent assistant, provide the best possible answer to the user's request. 
    ''' 
    
    ollama = OpenAILikeClient(api_key="", model=llm, system_prompt=prompt, base_url="http://localhost:11434/v1")
    
    q = '''
    what time is it?
    '''
    
    llm_res = ollama.invoke(q)
    print(llm_res.text)

    Chatbot

    Another way to test the capabilities of the LLM is to build a simple Chatbot and have a conversation. To do so, at every interaction we need to store the chat history and feed it back to the model, specifying what was said by whom. The Datapizza framework already has a built-in memory system.

    from datapizza.memory import Memory
    from datapizza.type import TextBlock, ROLE
    
    memory = Memory()
    memory.add_turn(TextBlock(content=prompt), role=ROLE.SYSTEM)
    
    while True:
        ## User
        q = input('🙂 >')
        if q == "quit":
            break
        
        ## LLM
        llm_res = ollama.invoke(q, memory=memory)
        res = llm_res.text
        print("🍕 >", f"\x1b[1;30m{res}\x1b[0m")
    
        ## Update Memory
        memory.add_turn(TextBlock(content=q), role=ROLE.USER)
        memory.add_turn(TextBlock(content=res), role=ROLE.ASSISTANT)

    If you want to retrieve the chat history, you can just access the memory. Usually, AI frameworks use three roles in the interaction with an LLM: “system” (core instructions), “user” (what was said by the human), “assistant” (what the chatbot replied).

    memory.to_dict()
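To illustrate the three roles, a chat history conventionally boils down to a list of role/content turns like the sketch below. This is a generic layout for illustration only; the exact schema returned by Memory.to_dict() may differ.

```python
# Generic sketch of a chat history with the three standard roles
# (illustrative only; not the exact Datapizza Memory schema)
history = [
    {"role": "system", "content": "You are an intelligent assistant..."},
    {"role": "user", "content": "what time is it?"},
    {"role": "assistant", "content": "Sorry, I have no access to a clock."},
]

# The model receives the whole list at every turn, which is how it
# "remembers" the conversation so far
roles = [turn["role"] for turn in history]
print(roles)  # ['system', 'user', 'assistant']
```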

    Obviously, the LLM alone is very limited: it can’t do much besides chatting. Therefore, we need to give it the ability to take action, or in other words, to activate Tools.

    Tools

    Tools are the main difference between a simple LLM and an AI Agent. When the user requests something that goes beyond the LLM knowledge base (i.e. “what time is it now?“), the Agent should understand that it doesn’t know the answer, activate a Tool to get additional information (i.e. checking the clock), elaborate the result through the LLM, and generate an answer.

    The Datapizza framework allows you to create Tools from scratch very easily. You just need to import the decorator @tool and any function can become actionable for the Agent.

    from datapizza.tools import tool
    
    @tool
    def get_time() -> str:
        '''Get the current time.'''
        from datetime import datetime
        return datetime.now().strftime("%H:%M")
    
    get_time()

    Then, assign the designated Tool to the Agent, and you’ll have an AI that combines language understanding, autonomous decision-making, and tool use.

    from datapizza.agents import Agent
    import os
    
    os.environ["DATAPIZZA_AGENT_LOG_LEVEL"] = "DEBUG"  #max logging
    
    agent = Agent(name="single-agent", client=ollama, system_prompt=prompt, 
                  tools=[get_time], max_steps=2)
    
    q = '''
    what time is it?
    '''
    
    agent_res = agent.run(q)

    An LLM-powered AI Agent is an intelligent system built around a language model that doesn’t just reply: it reasons, decides, and acts. Besides conversation (which means chatting with a general-purpose knowledge base), the most common actions that Agents can perform are RAG (chatting with your documents), Querying (chatting with a database), and Web Search (chatting with the whole Internet).

    For instance, let’s try a web search Tool. In Python, the easiest way to do it is with the well-known private browser DuckDuckGo. You can use the original library directly or the Datapizza framework wrapper (pip install datapizza-ai-tools-duckduckgo).

    from datapizza.tools.duckduckgo import DuckDuckGoSearchTool
    
    DuckDuckGoSearchTool().search(query="powell")

    Let’s create an Agent that can search the web for us. If you want to make it more interactive, you can structure the AI like I did for the Chatbot.

    os.environ["DATAPIZZA_AGENT_LOG_LEVEL"] = "ERROR" #turn off logging
    
    prompt = '''
    You are a journalist. You must make assumptions, use your tool to research, make a guess, and formulate a final answer.
    The final answer must contain facts, dates, and evidence to support your guess.
    '''
    
    memory = Memory()
    
    agent = Agent(name="single-agent", client=ollama, system_prompt=prompt, 
                  tools=[DuckDuckGoSearchTool()], 
                  memory=memory, max_steps=2)
    
    while True:
        ## User
        q = input('🙂 >')
        if q == "quit":
            break
        
        ## Agent
        agent_res = agent.run(q)
        res = agent_res.text
        print("🍕 >", f"\x1b[1;30m{res}\x1b[0m")
    
        ## Update Memory
        memory.add_turn(TextBlock(content=q), role=ROLE.USER)
        memory.add_turn(TextBlock(content=res), role=ROLE.ASSISTANT)

    Multi-Agent System

    The real strength of Agents is the ability to collaborate with each other, just like humans do. These teams are called Multi-Agent Systems (MAS), a group of AI Agents that work together in a shared environment to solve complex problems that are too difficult for a single one to handle alone.

    This time, let’s create a more advanced Tool: code execution. Please note that LLMs learn to code by being exposed to a large corpus of both code and natural language text, where they pick up the patterns, syntax, and semantics of programming languages. But since they cannot perform any real action, the code they produce is just text. In short, LLMs can generate Python code but can’t execute it; Agents can.

    import io
    import contextlib
    
    @tool
    def code_exec(code:str) -> str:
        '''Execute python code. Use always the function print() to get the output'''
        output = io.StringIO()
        with contextlib.redirect_stdout(output):
            try:
                exec(code)
            except Exception as e:
                print(f"Error: {e}")
        return output.getvalue()
    
    code_exec("from datetime import datetime; print(datetime.now().strftime('%H:%M'))")

    There are two types of MAS: the sequential process ensures tasks are executed one after the other, following a linear progression. On the other hand, the hierarchical structure simulates traditional organizational hierarchies for efficient task delegation and execution. Personally, I tend to prefer the latter as there is more parallelism and flexibility.
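The two topologies can be sketched with plain functions standing in for agents. This is a toy illustration of the control flow only, not Datapizza API code; the routing rule in `manager` is an assumption for the example.

```python
# Toy sketch of the two MAS topologies, using plain functions as "agents"
def junior(task: str) -> str:
    """Generates code as plain text (it cannot run it)."""
    return f"print('{task}')"

def senior(code: str) -> str:
    """Reviews the code and 'executes' it."""
    return f"executed: {code}"

# Sequential MAS: a fixed linear pipeline, one agent after the other
def sequential(task: str) -> str:
    return senior(junior(task))

# Hierarchical MAS: a manager decides whom to delegate to, so trivial
# requests can skip the chain entirely
def manager(task: str) -> str:
    if "plot" in task or "code" in task:   # delegate coding tasks down the chain
        return senior(junior(task))
    return f"answered directly: {task}"

print(sequential("plot titanic"))   # executed: print('plot titanic')
print(manager("what time is it?"))  # answered directly: what time is it?
```

The hierarchical version is the one built below: a manager Agent that can call a junior and a senior coder.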

    With the Datapizza framework, you can link two or more Agents with the function can_call(). In this way, one Agent can pass the current task to another Agent.

    prompt_senior = '''
    You are a senior Python coder. You check the code generated by the Junior, 
    and use your tool to execute the code only if it's correct and safe.
    '''
    agent_senior = Agent(name="agent-senior", client=ollama, system_prompt=prompt_senior, 
                         tools=[code_exec])
    
    prompt_junior = '''
    You are a junior Python coder. You can generate code but you can't execute it. 
    You receive a request from the Manager, and your final output must be Python code to pass on.
    If you don't know some specific commands, you can use your tool and search the web for "how to ... with python?".
    '''
    agent_junior = Agent(name="agent-junior", client=ollama, system_prompt=prompt_junior, 
                         tools=[DuckDuckGoSearchTool()])
    agent_junior.can_call([agent_senior])
    
    prompt_manager = '''
    You do nothing, you are just a manager. After you get a request from the user, 
    first you ask the Junior to generate the code, then you ask the Senior to execute it.
    '''
    agent_manager = Agent(name="agent-manager", client=ollama, system_prompt=prompt_manager, 
                          tools=[])
    agent_manager.can_call([agent_junior, agent_senior])
    
    q = '''
    Plot the Titanic dataframe. You can find the data here: 
    https://raw.githubusercontent.com/mdipietro09/DataScience_ArtificialIntelligence_Utils/master/machine_learning/data_titanic.csv
    '''
    
    agent_res = agent_manager.run(q)
    #print(agent_res.text)

    Conclusion

    This article has been a tutorial introducing Datapizza AI, a brand-new framework to build LLM-powered Chatbots and AI Agents. The library is very flexible and user-friendly, and can cover different GenAI use cases. I used it with Ollama, but it can be connected to all the well-known engines, like OpenAI.

    Full code for this text: GitHub

    I hope you enjoyed it! Feel free to contact me with questions and feedback, or just to share your interesting projects.

    👉 Let’s Connect 👈

    (All images are by the author unless otherwise noted)


