Overview
Features
Grounding-based search with Vertex AI data stores
Grounds Gemini's responses in data stored in Vertex AI data stores, improving the quality and relevance of search results.
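As an illustration of the underlying mechanism, grounding against a Vertex AI Search data store can be wired up through the Vertex AI Python SDK. This is a minimal sketch, not this server's exact implementation: the project, location, model name, and data store path are placeholders, and on older SDK versions the grounding module lives under vertexai.preview.generative_models instead.

```python
import vertexai
from vertexai.generative_models import GenerativeModel, Tool, grounding

# Placeholder project and location -- replace with your own.
vertexai.init(project="my-project", location="us-central1")

# Full resource path of the Vertex AI Search data store (placeholder IDs).
datastore = (
    "projects/my-project/locations/global/"
    "collections/default_collection/dataStores/my-datastore"
)

# Build a retrieval tool that grounds generation in the data store.
tool = Tool.from_retrieval(
    grounding.Retrieval(grounding.VertexAISearch(datastore=datastore))
)

model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    "What does the onboarding guide say about laptops?",
    tools=[tool],
)
print(response.text)
```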
Multi-datastore integration
Supports integrating one or more Vertex AI data stores into the MCP server.
Docker deployment ready
Provides a Dockerfile for running the MCP server in Docker.
Config-driven operation
Configured via a YAML file (config.yml.template) detailing server, model, and data_stores settings.
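A hypothetical config.yml showing the three sections. Every field name below is illustrative; consult config.yml.template in the repository for the authoritative structure.

```yaml
server:
  name: vertex-ai-search        # MCP server name advertised to clients
model:
  project_id: my-project        # GCP project hosting the Gemini model
  location: us-central1
  model_name: gemini-1.5-flash
data_stores:                    # one entry per Vertex AI data store
  - project_id: my-project
    location: global
    datastore_id: my-datastore
    tool_name: company_docs
    description: Internal company documentation
```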
Two transport options
Supports Server-Sent Events (SSE) and standard input/output (stdio) via the --transport flag.
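For example, assuming the server is started with a serve subcommand (the command and subcommand names here are assumptions; only the --transport flag is documented above):

```bash
# Serve over SSE, suitable for clients that connect over HTTP
mcp-vertexai-search serve --config config.yml --transport sse

# Serve over stdio, suitable for clients that spawn the server as a subprocess
mcp-vertexai-search serve --config config.yml --transport stdio
```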
CLI-based search testing
Allows testing Vertex AI Search with the search command without running the MCP server.
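A sketch of such a check; apart from the documented search command, the flag names are assumptions:

```bash
# Query the configured data stores directly, without starting the MCP server
mcp-vertexai-search search --config config.yml --query "onboarding policy"
```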
Config template and appendix
Appendix A describes the config file structure and required fields for setup.
Who Is This For?
- AI engineers: Build an MCP server that grounds Gemini responses in private Vertex AI data stores.
- Data scientists: Ground model outputs in private Vertex AI data stores to improve search result quality.
