AI agent directory: 2861 agents
A blazing-fast AI gateway with integrated guardrails. Routes to 200+ LLMs and 50+ AI guardrails through one fast, friendly API.
MCP server for PLANKA kanban boards
An MCP server for testing MCP servers you are developing with AI assistants
A Linear MCP server for interacting with Linear's API
First HTTP-native MCP server for Obsidian - Works with Claude Code CLI, Codex, Gemini without stdio bugs
Model Context Protocol (MCP) server for pure.md, the markdown delivery network for LLMs
Advanced Model Context Protocol (MCP) server for Google NotebookLM.
MCP server for Swift build, test, and package operations with structured, token-efficient output
MCP server for the refacil-commerce API
🚀 Easy-to-use MCP server for SAP HANA database integration with AI agents like Claude Desktop. Connect to HANA databases with natural language queries.
WhizoAI MCP Server for Claude Desktop - Enterprise web scraping through Model Context Protocol (MCP) integration
MCP server for IndexForge — gives AI agents the ability to submit URLs to Google and Bing for indexing, scan sitemaps, check index status, and detect 404s.
Simplified MCP server for Latitude.so prompt management - 8 focused tools for push, pull, run, and manage prompts
MCP server for the OpSpawn x402 Bazaar — screenshot capture, AI analysis, PDF/HTML generation, code security scanning, and dependency auditing via x402 micropayments
MCP server providing on-demand skill loading for AI coding assistants
Model Context Protocol (MCP) server designed for LLMs to interact with Obsidian vaults. Provides secure, token-aware tools for seamless knowledge base management through a standardized interface.
MCP server daemon and aggregator CLI – aggregate all your Model Context Protocol servers behind a single background daemon with lazy start, auto-reconnect, and idle shutdown
Run all your AI agents, like Claude Code and Codex CLI, in a single TUI to keep things organized.
MCP Server for validating MTSD (i-AUD Design) documents
MCP server for running long-running AI tasks using @just-every/task
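Every server listed above speaks the same Model Context Protocol, which is JSON-RPC 2.0 under the hood. As a rough sketch of what a client sends when invoking a tool (the method name follows the MCP specification; the tool name `add` and its arguments are hypothetical, not from any server listed here):

```python
import json

# Minimal sketch of an MCP "tools/call" request as a client would send it.
# MCP messages are JSON-RPC 2.0 objects; the tool "add" is a made-up example.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "add",
        "arguments": {"a": 2, "b": 3},
    },
}

# Serialize to the wire format (newline-delimited JSON over stdio,
# or a JSON body over HTTP, depending on the server's transport).
wire = json.dumps(request)
print(wire)
```

The transport varies per server (most of the entries above use stdio; a few, such as the HTTP-native Obsidian server, use HTTP), but the message shape is the same.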