Async-first Python framework for building, extending, and orchestrating AI Agents and Chatbots. Vendor-agnostic, production-ready, and built for speed.
```bash
pip install ai-parrot
```
Install only what you need. Heavy tools and loaders are split into their own packages.
Core framework: agents, LLM clients, memory, orchestration, A2A, MCP.
```bash
pip install ai-parrot
```
Tool and toolkit implementations: Jira, AWS, Slack, Google, finance, and more.
```bash
pip install ai-parrot-tools
```
Document loaders for RAG pipelines: PDF, YouTube, audio, video, EPUB, web.
```bash
pip install ai-parrot-loaders
```
A simple Chatbot interface for creating agents with built-in memory, vector store support, and conversation history management.
Use the @tool decorator, class-based AbstractToolkit, or instantly
convert OpenAPI specs into tools with OpenAPIToolkit.
Manage multi-agent workflows with AgentCrew. Support for Sequential, Parallel, Flow,
and Loop execution modes.
Native Agent-to-Agent (A2A) protocol and first-class support for Model Context Protocol (MCP).
Switch seamlessly between OpenAI, Anthropic, Google Gemini, Groq, X.AI, HuggingFace, vLLM, and OpenRouter without changing your code.
Give your agents agency with the @schedule decorator to run background tasks like
daily briefings or monitoring.
Full aiohttp-based server that exposes agents as REST APIs and WebSocket endpoints, with built-in bot management.
Expose agents natively to Telegram, MS Teams, Slack, and WhatsApp with minimal config.
Interactive parrot setup wizard, MCP server launcher, autonomous agent deployer, and
Docker-based security tool installer.
Install only the providers you need:
```bash
# Google Gemini
pip install "ai-parrot[google]"

# OpenAI / GPT
pip install "ai-parrot[openai]"

# Anthropic / Claude
pip install "ai-parrot[anthropic]"

# Groq
pip install "ai-parrot[groq]"

# X.AI / Grok
pip install "ai-parrot[xai]"

# All LLM providers at once
pip install "ai-parrot[llms]"
```
Additional providers supported out of the box (no extra install): HuggingFace, vLLM, OpenRouter, Ollama.
```bash
# Tools with specific extras
pip install "ai-parrot-tools[jira]"
pip install "ai-parrot-tools[aws]"
pip install "ai-parrot-tools[slack]"
pip install "ai-parrot-tools[finance]"
pip install "ai-parrot-tools[all]"

# Document loaders
pip install "ai-parrot-loaders[youtube]"
pip install "ai-parrot-loaders[pdf]"
pip install "ai-parrot-loaders[audio]"
pip install "ai-parrot-loaders[all]"
```
```bash
# Sentence transformers, FAISS, ChromaDB, etc.
pip install "ai-parrot[embeddings]"
```
```bash
# Interactive setup wizard
parrot setup

# Initialize config directory
parrot conf init

# Start an MCP server
parrot mcp --config server.yaml

# Deploy an autonomous agent as a systemd service
parrot autonomous create --agent my_agent.py
parrot autonomous install --agent my_agent.py --name my-agent
```
```python
import asyncio
from parrot.bots import Chatbot
from parrot.tools import tool

# 1. Define a tool with type hints
@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"The weather in {location} is Sunny, 25°C"

async def main():
    # 2. Create the Agent
    bot = Chatbot(
        name="WeatherBot",
        llm="openai:gpt-4o",
        tools=[get_weather],
        system_prompt="You are a helpful weather assistant."
    )
    # 3. Configure & Chat
    await bot.configure()
    response = await bot.ask("What's the weather like in Madrid?")
    print(response)

if __name__ == "__main__":
    asyncio.run(main())
```
```python
import asyncio
from parrot.clients.google.client import GoogleGenAIClient
from parrot.models.outputs import ImageGenerationPrompt
from parrot.models.google import GoogleModel

async def main():
    prompt = ImageGenerationPrompt(
        prompt="A realistic passport-style photo with white background",
        styles=["photorealistic", "high resolution"],
        model=GoogleModel.IMAGEN_3.value,
        aspect_ratio="16:9",
    )
    client = GoogleGenAIClient()
    async with client:
        response = await client.image_generation(prompt_data=prompt)
        for img_path in response.images:
            print(f"Image saved to: {img_path}")

if __name__ == "__main__":
    asyncio.run(main())
```
@tool: The simplest way to create a tool. The docstring and type hints are automatically used to generate the schema for the LLM.

```python
from parrot.tools import tool

@tool
def calculate_vat(amount: float, rate: float = 0.20) -> float:
    """Calculate VAT for a given amount."""
    return amount * rate
```
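To illustrate the mechanism, here is a minimal standalone sketch (not parrot's actual implementation) of how a decorator can derive a schema-like description from a function's signature and docstring using standard introspection:

```python
import inspect

# Illustration only: map Python annotations to JSON-schema-style type names.
# This is NOT parrot's implementation, just the general mechanism @tool relies on.
TYPE_MAP = {int: "integer", float: "number", str: "string", bool: "boolean"}

def build_schema(func):
    """Derive a tool schema from type hints, defaults, and the docstring."""
    sig = inspect.signature(func)
    params = {}
    for name, param in sig.parameters.items():
        entry = {"type": TYPE_MAP.get(param.annotation, "string")}
        if param.default is not inspect.Parameter.empty:
            entry["default"] = param.default
        params[name] = entry
    return {
        "name": func.__name__,
        "description": (func.__doc__ or "").strip(),
        "parameters": params,
    }

def calculate_vat(amount: float, rate: float = 0.20) -> float:
    """Calculate VAT for a given amount."""
    return amount * rate

schema = build_schema(calculate_vat)
print(schema["name"])                # calculate_vat
print(schema["parameters"]["rate"])  # {'type': 'number', 'default': 0.2}
```

The LLM receives something like this schema, which is why accurate docstrings and type hints directly improve tool-calling quality.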
AbstractToolkit: Group related tools into a reusable class. All public async methods become tools.

```python
from parrot.tools import AbstractToolkit

class MathToolkit(AbstractToolkit):
    async def add(self, a: int, b: int) -> int:
        """Add two numbers."""
        return a + b

    async def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers."""
        return a * b
```
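The "all public async methods become tools" behavior can be approximated with standard introspection. This standalone sketch (illustration only, not parrot's code) shows how a toolkit base class might discover candidate tools:

```python
import asyncio
import inspect

# Illustration only: a plain class standing in for a toolkit.
class DemoToolkit:
    async def add(self, a: int, b: int) -> int:
        """Add two numbers."""
        return a + b

    async def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers."""
        return a * b

    def _helper(self):
        # Private (underscore-prefixed): would NOT become a tool.
        pass

def discover_tools(toolkit):
    """Return the names of public coroutine methods on the instance."""
    return [
        name
        for name, member in inspect.getmembers(toolkit, inspect.iscoroutinefunction)
        if not name.startswith("_")
    ]

print(discover_tools(DemoToolkit()))         # ['add', 'multiply']
print(asyncio.run(DemoToolkit().add(2, 3)))  # 5
```

Private helpers stay invisible to the LLM, so a toolkit class can freely mix tool methods with internal utilities.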
Dynamically generate tools from any OpenAPI/Swagger specification.
```python
from parrot.tools import OpenAPIToolkit

petstore = OpenAPIToolkit(
    spec="https://petstore.swagger.io/v2/swagger.json",
    service="petstore"
)
# Now your agent can call petstore_get_pet_by_id, etc.
bot = Chatbot(name="PetBot", tools=petstore.get_tools())
```
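A tool name like petstore_get_pet_by_id can be derived from the service name plus the spec's camelCase operationId. This is a sketch of that naming convention, not parrot's actual code:

```python
import re

def tool_name(service: str, operation_id: str) -> str:
    """Prefix the service name and convert a camelCase operationId to snake_case."""
    snake = re.sub(r"(?<!^)(?=[A-Z])", "_", operation_id).lower()
    return f"{service}_{snake}"

# Petstore's "getPetById" operation becomes:
print(tool_name("petstore", "getPetById"))  # petstore_get_pet_by_id
```

The service prefix keeps tool names unique when an agent loads several OpenAPI toolkits at once.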
AgentCrew: Orchestrate multiple agents with Sequential, Parallel, Flow (DAG), and Loop modes.

```python
from parrot.bots.orchestration import AgentCrew

crew = AgentCrew(
    name="ResearchTeam",
    agents=[researcher_agent, writer_agent]
)
# Define a Flow - Writer waits for Researcher to finish
crew.task_flow(researcher_agent, writer_agent)
await crew.run_flow("Research the latest advancements in Quantum Computing")
```
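The difference between Sequential and Parallel modes can be sketched with plain asyncio over stub agents (illustration only, not parrot's orchestration code): sequential chains each agent's output into the next, while parallel fans the same task out to all agents concurrently.

```python
import asyncio

# Stub "agents": async callables that tag their input.
async def researcher(task: str) -> str:
    return f"research({task})"

async def writer(task: str) -> str:
    return f"article({task})"

async def run_sequential(agents, task):
    """Each agent consumes the previous agent's output."""
    result = task
    for agent in agents:
        result = await agent(result)
    return result

async def run_parallel(agents, task):
    """All agents receive the same task concurrently."""
    return await asyncio.gather(*(agent(task) for agent in agents))

print(asyncio.run(run_sequential([researcher, writer], "quantum")))
# article(research(quantum))
print(asyncio.run(run_parallel([researcher, writer], "quantum")))
# ['research(quantum)', 'article(quantum)']
```

Flow mode generalizes the sequential case to a DAG of dependencies, which is what `crew.task_flow(researcher_agent, writer_agent)` declares above.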
```python
from parrot.scheduler import schedule, ScheduleType

class DailyBot(Chatbot):
    @schedule(schedule_type=ScheduleType.DAILY, hour=9, minute=0)
    async def morning_briefing(self):
        news = await self.ask("Summarize today's top tech news")
        await self.send_notification(news)
```
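What a DAILY schedule with hour=9, minute=0 resolves to can be shown with a small datetime sketch (not parrot's scheduler implementation): the next 09:00, today if it has not passed yet, otherwise tomorrow.

```python
from datetime import datetime, timedelta

def next_daily_run(now: datetime, hour: int, minute: int) -> datetime:
    """Compute the next occurrence of hour:minute relative to `now`."""
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=1)  # today's slot already passed
    return candidate

print(next_daily_run(datetime(2024, 5, 1, 8, 30), 9, 0))  # 2024-05-01 09:00:00
print(next_daily_run(datetime(2024, 5, 1, 10, 0), 9, 0))  # 2024-05-02 09:00:00
```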
Agents can discover and talk to each other using the A2A protocol.
```python
# Expose an Agent
from parrot.a2a import A2AServer

a2a = A2AServer(my_agent)
a2a.setup(app, url="https://my-agent.com")

# Consume an Agent
from parrot.a2a import A2AClient

async with A2AClient("https://remote-agent.com") as client:
    response = await client.send_message("Hello!")
```
First-class MCP support. Consume external MCP servers or expose your agent as one.
```python
# Consume MCP Servers
mcp_servers = [
    MCPServerConfig(
        name="filesystem",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/home"]
    )
]
await bot.setup_mcp_servers(mcp_servers)
```
AI-Parrot is also a full aiohttp-based application server powered by Navigator.
```python
from parrot.manager import BotManager
from parrot.conf import STATIC_DIR
from parrot.handlers import AppHandler
from agents.my_agent import MyAgent

class Main(AppHandler):
    app_name: str = "Parrot"
    enable_static: bool = True
    staticdir: str = STATIC_DIR

    def configure(self) -> None:
        self.bot_manager = BotManager()
        self.bot_manager.register(MyAgent())
        self.bot_manager.setup(self.app)
```
| Endpoint | Method | Description |
|---|---|---|
| /api/v1/agents/chat/{agent_id} | POST | Chat with an agent (JSON, HTML, or Markdown) |
| /api/v1/agents/chat/{agent_id} | PATCH | Configure tools/MCP servers for a session |
| /api/v1/bot_management | GET | List registered bots |
| /api/v1/bot_management/{bot} | GET/POST/PATCH/DELETE | CRUD operations on bots |
| /api/v1/agent_tools | GET | List available tools |
| /ws/userinfo | WebSocket | Real-time user notifications |
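As a sketch of how a client might address the chat endpoint, here is a request-building snippet. The base URL and the {"query": ...} payload shape are assumptions for illustration; check your server's handler for the exact contract.

```python
import json

# Assumption: server listening locally on the port used in the Gunicorn example.
BASE_URL = "http://localhost:5000"

def build_chat_request(agent_id: str, question: str):
    """Return the URL and JSON body for a POST to the chat endpoint."""
    url = f"{BASE_URL}/api/v1/agents/chat/{agent_id}"
    body = json.dumps({"query": question})  # hypothetical payload shape
    return url, body

url, body = build_chat_request("WeatherBot", "What's the weather in Madrid?")
print(url)  # http://localhost:5000/api/v1/agents/chat/WeatherBot
```

From here, any HTTP client (aiohttp, httpx, curl) can POST the body to the URL.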
```bash
# Development (single process, auto-reload)
python run.py

# Production (Gunicorn with async workers)
pip install "ai-parrot[deploy]"
gunicorn run:app \
  --worker-class aiohttp.worker.GunicornUVLoopWebWorker \
  --workers 4 \
  --bind 0.0.0.0:5000 \
  --timeout 360
```
Modular design enabling agents to act as both consumers and providers.
| Provider | Extra | Identifier | Example |
|---|---|---|---|
| OpenAI | openai | openai | openai:gpt-4o |
| Anthropic | anthropic | anthropic, claude | anthropic:claude-sonnet-4-20250514 |
| Google Gemini | google | google | google:gemini-3.1-flash-lite-preview |
| Groq | groq | groq | groq:llama-3.3-70b-versatile |
| X.AI / Grok | xai | grok | grok:grok-3 |
| HuggingFace | included | hf | hf:meta-llama/Llama-3-8B |
| vLLM | included | vllm | vllm:model-name |
| OpenRouter | included | openrouter | openrouter:anthropic/claude-sonnet-4 |
| Ollama | included | via OpenAI-compatible endpoint | |
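The identifiers above follow a provider:model convention. A sketch of splitting such a string (the actual resolution logic lives inside parrot's client factory):

```python
def parse_llm(identifier: str):
    """Split an llm identifier on the FIRST colon, so model names may contain '/'."""
    provider, _, model = identifier.partition(":")
    return provider, model

print(parse_llm("openai:gpt-4o"))  # ('openai', 'gpt-4o')
print(parse_llm("openrouter:anthropic/claude-sonnet-4"))
# ('openrouter', 'anthropic/claude-sonnet-4')
```

Because only the prefix changes, swapping providers is a one-string edit in the `llm` argument to `Chatbot`.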
```bash
git clone https://github.com/phenobarbital/ai-parrot.git
cd ai-parrot

# Create venv (Python 3.11)
make venv
source .venv/bin/activate

# Full dev install
make develop

# Run tests
make test
```
```text
ai-parrot/
├── packages/
│   ├── ai-parrot/            # Core framework
│   │   └── src/parrot/
│   ├── ai-parrot-tools/      # Tool implementations
│   │   └── src/parrot_tools/
│   └── ai-parrot-loaders/    # Document loaders
│       └── src/parrot_loaders/
├── tests/
├── examples/
├── Makefile                  # Build, install, test, release shortcuts
└── pyproject.toml            # Workspace root
```
| Target | Description |
|---|---|
| make develop | All packages + all extras + dev tools |
| make develop-fast | All packages, base deps only (no torch/tensorflow) |
| make install | All packages, base deps only |
| make install-core | Core with LLM clients + vector stores |
| make install-tools | Core + tools with common extras |
| make install-all | Everything with ALL extras |
| make format | Format code with black |
| make lint | Lint with pylint + black --check |
| make test | Run pytest + mypy |
| make build | Build all packages (sdist + wheel) |
| make bump-patch | Bump patch version across all packages |
Run pytest after any logic change.