What You Get
Deploy at Scale
Deploy and run AI agents on Kubernetes with production-ready configurations, auto-scaling, and high availability.
Full Observability
Capture traces, metrics, and logs for complete visibility into agent behavior using OpenTelemetry instrumentation.
Governance
Enforce policies, manage access controls, and ensure compliance across all agents with built-in governance tools.
Auto-Instrumentation
Zero-code instrumentation for popular AI frameworks including LangChain, LlamaIndex, and more.
Lifecycle Management
Manage agent versions, configurations, and deployments from a unified control plane with rollback capabilities.
External Agent Support
Monitor and govern externally deployed agents alongside internal ones with unified observability.
Built on Open Standards
Powered by OpenChoreo
Built on OpenChoreo, an open-source Kubernetes-native platform for deploying and managing cloud-native applications. This ensures production-ready deployments, auto-scaling, and high availability for your AI agents.
OpenTelemetry Compatible
Fully compatible with OpenTelemetry standards for instrumentation, enabling seamless capture of traces across popular AI frameworks including LangChain, LlamaIndex, and more.
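Because the platform follows the OpenTelemetry standard, any OTLP-compatible setup should work. As a rough sketch only (the service name, endpoint URL, and agent.py script below are illustrative placeholders, not part of the platform), a Python agent can be instrumented with OpenTelemetry's stock zero-code tooling:

# Install the OpenTelemetry distro and OTLP exporter, then auto-detect
# installed libraries and add matching instrumentation packages.
pip install opentelemetry-distro opentelemetry-exporter-otlp
opentelemetry-bootstrap -a install

# Point the exporter at your collector or ingest endpoint (placeholder URL)
# and run the agent under the zero-code instrumentation wrapper.
export OTEL_SERVICE_NAME=my-agent
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
opentelemetry-instrument python agent.py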
Get Started in Minutes
Try the platform with our quick-start dev container. Everything you need is pre-configured.
docker run --rm -it --name amp-quick-start \
-v /var/run/docker.sock:/var/run/docker.sock \
--network=host \
ghcr.io/wso2/amp-quick-start:v0.6.0
# Inside container
./install.sh
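When you are done experimenting, stop the container from another terminal; the --rm flag in the run command above removes it automatically once it stops.

docker stop amp-quick-start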