Build Intelligent Agents with Async Python

Async-first Python framework for building, extending, and orchestrating AI Agents and Chatbots. Vendor-agnostic, production-ready, and built for speed.

pip install ai-parrot

Monorepo Packages

Install only what you need. Heavy tools and loaders are split into their own packages.

ai-parrot

Core framework: agents, LLM clients, memory, orchestration, A2A, MCP.

pip install ai-parrot

ai-parrot-tools

Tool and toolkit implementations: Jira, AWS, Slack, Google, finance, and more.

pip install ai-parrot-tools

ai-parrot-loaders

Document loaders for RAG pipelines: PDF, YouTube, audio, video, EPUB, web.

pip install ai-parrot-loaders

Why AI-Parrot?

Unified Agent API

Simple Chatbot interface to create agents with built-in memory, vector store support, and conversation history management.

Powerful Tooling

Use the @tool decorator, class-based AbstractToolkit, or instantly convert OpenAPI specs into tools with OpenAPIToolkit.

Orchestration

Manage multi-agent workflows with AgentCrew. Support for Sequential, Parallel, Flow, and Loop execution modes.

A2A & MCP

Native Agent-to-Agent (A2A) protocol and first-class support for Model Context Protocol (MCP).

Multi-Provider

Switch seamlessly between OpenAI, Anthropic, Google Gemini, Groq, X.AI, HuggingFace, vLLM, and OpenRouter without changing your code.

Scheduling

Give your agents agency with the @schedule decorator to run background tasks like daily briefings or monitoring.

Application Server

Full aiohttp-based server that exposes agents as REST APIs and WebSocket endpoints, with built-in bot management.

Platform Integrations

Expose agents natively to Telegram, MS Teams, Slack, and WhatsApp with minimal config.

CLI Tooling

Interactive parrot setup wizard, MCP server launcher, autonomous agent deployer, and Docker-based security tool installer.

Installation

LLM Providers

Install only the providers you need:

terminal
# Google Gemini
pip install "ai-parrot[google]"

# OpenAI / GPT
pip install "ai-parrot[openai]"

# Anthropic / Claude
pip install "ai-parrot[anthropic]"

# Groq
pip install "ai-parrot[groq]"

# X.AI / Grok
pip install "ai-parrot[xai]"

# All LLM providers at once
pip install "ai-parrot[llms]"

Additional providers supported out of the box (no extra install): HuggingFace, vLLM, OpenRouter, Ollama.

Tools & Loaders

terminal
# Tools with specific extras
pip install "ai-parrot-tools[jira]"
pip install "ai-parrot-tools[aws]"
pip install "ai-parrot-tools[slack]"
pip install "ai-parrot-tools[finance]"
pip install "ai-parrot-tools[all]"

# Document loaders
pip install "ai-parrot-loaders[youtube]"
pip install "ai-parrot-loaders[pdf]"
pip install "ai-parrot-loaders[audio]"
pip install "ai-parrot-loaders[all]"

Embeddings & Vector Stores

terminal
# Sentence transformers, FAISS, ChromaDB, etc.
pip install "ai-parrot[embeddings]"

CLI Quick Setup

terminal
# Interactive setup wizard
parrot setup

# Initialize config directory
parrot conf init

# Start an MCP server
parrot mcp --config server.yaml

# Deploy an autonomous agent as a systemd service
parrot autonomous create --agent my_agent.py
parrot autonomous install --agent my_agent.py --name my-agent

Quick Start

Simple Weather Bot

weather_bot.py
import asyncio
from parrot.bots import Chatbot
from parrot.tools import tool

# 1. Define a tool with type hints
@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"The weather in {location} is Sunny, 25C"

async def main():
    # 2. Create the Agent
    bot = Chatbot(
        name="WeatherBot",
        llm="openai:gpt-4o",
        tools=[get_weather],
        system_prompt="You are a helpful weather assistant."
    )

    # 3. Configure & Chat
    await bot.configure()
    response = await bot.ask("What's the weather like in Madrid?")
    print(response)

if __name__ == "__main__":
    asyncio.run(main())

Using LLM Clients Directly

image_gen.py
import asyncio
from parrot.clients.google.client import GoogleGenAIClient
from parrot.models.outputs import ImageGenerationPrompt
from parrot.models.google import GoogleModel

async def main():
    prompt = ImageGenerationPrompt(
        prompt="A realistic passport-style photo with white background",
        styles=["photorealistic", "high resolution"],
        model=GoogleModel.IMAGEN_3.value,
        aspect_ratio="16:9",
    )

    client = GoogleGenAIClient()
    async with client:
        response = await client.image_generation(prompt_data=prompt)
        for img_path in response.images:
            print(f"Image saved to: {img_path}")

if __name__ == "__main__":
    asyncio.run(main())

Core Concepts

Tools

Functional Tools (@tool)

The simplest way to create a tool. The docstring and type hints are automatically used to generate the schema for the LLM.

tools.py
from parrot.tools import tool

@tool
def calculate_vat(amount: float, rate: float = 0.20) -> float:
    """Calculate VAT for a given amount."""
    return amount * rate
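As a rough illustration of that schema generation (a stdlib sketch, not the framework's actual implementation), the signature and docstring can be introspected like this:

```python
import inspect

def calculate_vat(amount: float, rate: float = 0.20) -> float:
    """Calculate VAT for a given amount."""
    return amount * rate

def build_schema(fn) -> dict:
    """Derive a minimal tool schema from a function's signature and docstring."""
    sig = inspect.signature(fn)
    params = {
        name: {
            "type": p.annotation.__name__,
            "required": p.default is inspect.Parameter.empty,
        }
        for name, p in sig.parameters.items()
    }
    return {"name": fn.__name__, "description": inspect.getdoc(fn), "parameters": params}

schema = build_schema(calculate_vat)
# "rate" is marked optional because the parameter has a default value.
```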

Class-Based Toolkits (AbstractToolkit)

Group related tools into a reusable class. All public async methods become tools.

toolkit.py
from parrot.tools import AbstractToolkit

class MathToolkit(AbstractToolkit):
    async def add(self, a: int, b: int) -> int:
        """Add two numbers."""
        return a + b

    async def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers."""
        return a * b
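The "all public async methods become tools" rule can be pictured with a small stdlib sketch (a simplified stand-in, not AbstractToolkit's real discovery logic):

```python
import inspect

class MathToolkit:  # stand-in for AbstractToolkit
    async def add(self, a: int, b: int) -> int:
        """Add two numbers."""
        return a + b

    async def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers."""
        return a * b

    def _internal(self) -> None:
        """Underscore-prefixed helpers are not exposed."""

def discover_tools(toolkit) -> list[str]:
    """Collect the names of all public coroutine methods on an instance."""
    return [
        name
        for name, member in inspect.getmembers(toolkit, inspect.iscoroutinefunction)
        if not name.startswith("_")
    ]

tools = discover_tools(MathToolkit())  # ["add", "multiply"]
```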

OpenAPI Toolkit

Dynamically generate tools from any OpenAPI/Swagger specification.

openapi_tools.py
from parrot.bots import Chatbot
from parrot.tools import OpenAPIToolkit

petstore = OpenAPIToolkit(
    spec="https://petstore.swagger.io/v2/swagger.json",
    service="petstore"
)

# Now your agent can call petstore_get_pet_by_id, etc.
bot = Chatbot(name="PetBot", tools=petstore.get_tools())
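The generated name petstore_get_pet_by_id reflects a service-prefix plus snake_cased operationId pattern; here is a sketch of that naming convention (an illustration only, not OpenAPIToolkit's actual code):

```python
import re

def tool_name(service: str, operation_id: str) -> str:
    """Prefix the service name and convert a camelCase operationId to snake_case."""
    snake = re.sub(r"(?<!^)(?=[A-Z])", "_", operation_id).lower()
    return f"{service}_{snake}"

tool_name("petstore", "getPetById")  # -> "petstore_get_pet_by_id"
```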

Orchestration (AgentCrew)

Orchestrate multiple agents with Sequential, Parallel, Flow (DAG), and Loop modes.

crew.py
from parrot.bots.orchestration import AgentCrew

crew = AgentCrew(
    name="ResearchTeam",
    agents=[researcher_agent, writer_agent]
)

# Define a Flow: Writer waits for Researcher to finish
crew.task_flow(researcher_agent, writer_agent)

# Inside an async function:
await crew.run_flow("Research the latest advancements in Quantum Computing")
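Conceptually, a Flow dependency behaves like ordinary awaiting: the downstream agent only starts once the upstream result is available. A toy asyncio sketch of that semantics (plain coroutines, not AgentCrew itself):

```python
import asyncio

async def researcher(task: str) -> str:
    """Toy upstream agent: produces notes for the writer."""
    return f"notes on {task}"

async def writer(notes: str) -> str:
    """Toy downstream agent: turns notes into an article."""
    return f"article from {notes}"

async def run_flow(task: str) -> str:
    # writer waits for researcher, mirroring crew.task_flow(researcher, writer)
    notes = await researcher(task)
    return await writer(notes)

result = asyncio.run(run_flow("quantum computing"))
```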

Scheduling

scheduled_agent.py
from parrot.bots import Chatbot
from parrot.scheduler import schedule, ScheduleType

class DailyBot(Chatbot):
    @schedule(schedule_type=ScheduleType.DAILY, hour=9, minute=0)
    async def morning_briefing(self):
        news = await self.ask("Summarize today's top tech news")
        await self.send_notification(news)
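The DAILY semantics boil down to "next occurrence of hour:minute"; a stdlib sketch of that computation (an illustration, not the framework's scheduler):

```python
from datetime import datetime, timedelta

def next_daily_run(now: datetime, hour: int, minute: int) -> datetime:
    """Return the next datetime at hour:minute, today if still ahead, else tomorrow."""
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=1)
    return candidate

# At 10:30 the 09:00 slot has already passed, so the next run is tomorrow morning.
next_daily_run(datetime(2024, 5, 1, 10, 30), hour=9, minute=0)
```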

Connectivity & Exposure

Agent-to-Agent (A2A)

Agents can discover and talk to each other using the A2A protocol.

# Expose an Agent
from parrot.a2a import A2AServer
a2a = A2AServer(my_agent)
a2a.setup(app, url="https://my-agent.com")

# Consume an Agent
from parrot.a2a import A2AClient
async with A2AClient("https://remote-agent.com") as client:
    response = await client.send_message("Hello!")

Model Context Protocol (MCP)

First-class MCP support. Consume external MCP servers or expose your agent as one.

# Consume MCP Servers
mcp_servers = [
    MCPServerConfig(
        name="filesystem",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/home"]
    )
]
await bot.setup_mcp_servers(mcp_servers)

Running as a Server

AI-Parrot is also a full aiohttp-based application server powered by Navigator.

app.py
from parrot.manager import BotManager
from parrot.conf import STATIC_DIR
from parrot.handlers import AppHandler
from agents.my_agent import MyAgent

class Main(AppHandler):
    app_name: str = "Parrot"
    enable_static: bool = True
    staticdir: str = STATIC_DIR

    def configure(self) -> None:
        self.bot_manager = BotManager()
        self.bot_manager.register(MyAgent())
        self.bot_manager.setup(self.app)

Built-in Endpoints

Endpoint | Method | Description
/api/v1/agents/chat/{agent_id} | POST | Chat with an agent (JSON, HTML, or Markdown)
/api/v1/agents/chat/{agent_id} | PATCH | Configure tools/MCP servers for a session
/api/v1/bot_management | GET | List registered bots
/api/v1/bot_management/{bot} | GET/POST/PATCH/DELETE | CRUD operations on bots
/api/v1/agent_tools | GET | List available tools
/ws/userinfo | WebSocket | Real-time user notifications
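For example, chatting with a registered agent is a plain HTTP POST. The sketch below only builds the request; the "query" field name and the port are assumptions, so check your chat handler for the exact schema:

```python
import json
from urllib.request import Request

agent_id = "WeatherBot"
url = f"http://localhost:5000/api/v1/agents/chat/{agent_id}"
# Field name "query" is an assumption; consult the chat handler for the exact schema.
payload = json.dumps({"query": "What's the weather like in Madrid?"}).encode()
req = Request(
    url,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Send with urllib.request.urlopen(req) once the server is running.
```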

Production Deployment

terminal
# Development (single process, auto-reload)
python run.py

# Production (Gunicorn with async workers)
pip install "ai-parrot[deploy]"

gunicorn run:app \
    --worker-class aiohttp.worker.GunicornUVLoopWebWorker \
    --workers 4 \
    --bind 0.0.0.0:5000 \
    --timeout 360

Architecture

Modular design enabling agents to act as both consumers and providers.

graph TD
    User["User / Client"] --> API["AgentTalk Handlers"]
    API --> Bot["Chatbot / BaseBot"]
    subgraph Core["Agent Core"]
        Bot --> Memory["Memory / Vector Store"]
        Bot --> LLM["LLM Client"]
        Bot --> TM["Tool Manager"]
    end
    subgraph Tools["Tools & Capabilities"]
        TM --> LocalTools["Local Tools (@tool)"]
        TM --> Toolkits["Toolkits (OpenAPI/Custom)"]
        TM --> MCPServer["External MCP Servers"]
    end
    subgraph Connect["Connectivity"]
        Bot -.-> A2A["A2A Protocol"]
        Bot -.-> MCP["MCP Protocol"]
        Bot -.-> Integrations["Telegram / Teams / Slack / WhatsApp"]
    end
    subgraph Orch["Orchestration"]
        Crew["AgentCrew"] --> Bot
        Crew --> OtherBots["Other Agents"]
    end

Supported LLM Providers

Provider | Extra | Identifier | Example
OpenAI | openai | openai | openai:gpt-4o
Anthropic | anthropic | anthropic, claude | anthropic:claude-sonnet-4-20250514
Google Gemini | google | google | google:gemini-3.1-flash-lite-preview
Groq | groq | groq | groq:llama-3.3-70b-versatile
X.AI / Grok | xai | grok | grok:grok-3
HuggingFace | included | hf | hf:meta-llama/Llama-3-8B
vLLM | included | vllm | vllm:model-name
OpenRouter | included | openrouter | openrouter:anthropic/claude-sonnet-4
Ollama | included | via OpenAI-compatible endpoint
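All identifiers follow the same provider:model convention, so switching providers means changing only that string. A minimal sketch of how such an identifier splits (an illustration, not the framework's actual resolver):

```python
def parse_llm_identifier(identifier: str) -> tuple[str, str]:
    """Split an identifier like "openai:gpt-4o" into (provider, model)."""
    provider, sep, model = identifier.partition(":")
    if not sep:
        raise ValueError(f"expected 'provider:model', got {identifier!r}")
    return provider, model

parse_llm_identifier("grok:grok-3")  # -> ("grok", "grok-3")
```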

Contributing

Development Setup

terminal
git clone https://github.com/phenobarbital/ai-parrot.git
cd ai-parrot

# Create venv (Python 3.11)
make venv
source .venv/bin/activate

# Full dev install
make develop

# Run tests
make test

Project Layout

ai-parrot/
├── packages/
│   ├── ai-parrot/           # Core framework
│   │   └── src/parrot/
│   ├── ai-parrot-tools/     # Tool implementations
│   │   └── src/parrot_tools/
│   └── ai-parrot-loaders/   # Document loaders
│       └── src/parrot_loaders/
├── tests/
├── examples/
├── Makefile                  # Build, install, test, release shortcuts
└── pyproject.toml            # Workspace root

Makefile Targets

Target | Description
make develop | All packages + all extras + dev tools
make develop-fast | All packages, base deps only (no torch/tensorflow)
make install | All packages, base deps only
make install-core | Core with LLM clients + vector stores
make install-tools | Core + tools with common extras
make install-all | Everything with ALL extras
make format | Format code with black
make lint | Lint with pylint + black --check
make test | Run pytest + mypy
make build | Build all packages (sdist + wheel)
make bump-patch | Bump patch version across all packages

Guidelines