Overview
Features
MCP Compatibility
Skyvern supports the Model Context Protocol, enabling any MCP-compliant LLM to drive browser automation tasks.
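On the client side, Skyvern is registered as an MCP server in the LLM client's configuration file (for example, `claude_desktop_config.json` for Claude Desktop). A minimal sketch of such an entry is below; the `command`, `args`, and `env` values are illustrative assumptions here, as Skyvern's setup tooling generates the real entry for your environment:

```json
{
  "mcpServers": {
    "Skyvern": {
      "command": "/path/to/python",
      "args": ["-m", "skyvern", "run", "mcp"],
      "env": {
        "SKYVERN_BASE_URL": "https://api.skyvern.com",
        "SKYVERN_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```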
Wide LLM Provider Support
Supports OpenAI, Anthropic, Azure OpenAI, AWS Bedrock, Gemini, Ollama, OpenRouter, and OpenAI-compatible endpoints via MCP.
Flexible LLM Configuration
Control LLM selection and token limits through environment variables (LLM_KEY, SECONDARY_LLM_KEY, LLM_CONFIG_MAX_TOKENS).
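As a sketch, these variables can be set in a `.env` file or in the process environment before Skyvern starts. The specific key values below are assumptions for illustration; the valid `LLM_KEY` names are provider-specific and listed in Skyvern's configuration reference:

```python
import os

# Primary model driving browser automation (value is an illustrative assumption)
os.environ["LLM_KEY"] = "OPENAI_GPT4O"
# Optional cheaper model for lightweight steps (value is an illustrative assumption)
os.environ["SECONDARY_LLM_KEY"] = "OPENAI_GPT4O_MINI"
# Cap on tokens per LLM call
os.environ["LLM_CONFIG_MAX_TOKENS"] = "4096"
```

In practice these usually live in the `.env` file rather than being set programmatically.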
Structured Output with Data Schemas
Use data_extraction_schema to enforce consistent outputs from MCP-driven runs.
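A `data_extraction_schema` is a standard JSON Schema object describing the shape of the data you want back. A hypothetical example for scraping a job listing (the field names here are illustrative, not part of Skyvern's API):

```python
# JSON Schema constraining the structure of extracted data
data_extraction_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string", "description": "Job title"},
        "location": {"type": "string", "description": "Job location"},
        "salary": {"type": "string", "description": "Listed salary range, if any"},
    },
    "required": ["title"],
}
```

Passing a schema like this makes every run return the same keys and types, which is what makes MCP-driven outputs safe to feed into downstream code.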
Task & Workflow Automation
Drive Skyvern Tasks and Workflows with MCP-driven LLMs to automate browser interactions (form filling, data extraction, downloads, etc.).
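A task run of this kind can be sketched as an HTTP request to Skyvern's API. The endpoint path, header name, and base URL below are assumptions for illustration; consult the Skyvern API reference for the exact contract:

```python
import json
import urllib.request

API_BASE = "https://api.skyvern.com"  # assumed cloud endpoint; self-hosted deployments differ

payload = {
    # Natural-language goal for the agent (wording is illustrative)
    "prompt": "Go to example.com and extract the page title",
    # Optional JSON Schema enforcing structured output
    "data_extraction_schema": {
        "type": "object",
        "properties": {"title": {"type": "string"}},
    },
}

# Build the request without sending it; endpoint path and auth header are assumptions
req = urllib.request.Request(
    f"{API_BASE}/v1/run/tasks",
    data=json.dumps(payload).encode(),
    headers={"x-api-key": "YOUR_API_KEY", "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment with a real API key
```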
Documentation & Resources
See the MCP documentation and related guides for configuring and using MCP with Skyvern.
Who Is This For?
- LLM developers: Integrate MCP-compliant models with Skyvern to drive browser automation workflows.
- Automation engineers: Leverage MCP-enabled Skyvern to automate browser tasks across multiple websites.
- AI researchers: Experiment with MCP-based browser automation and compare model performance on tasks.