Google Vertex AI Search

Provides Google Vertex AI Search results by grounding a Gemini model with your own private data

Stars: 31
Forks: 12
Releases: 0

Overview

This MCP server searches documents with Vertex AI by grounding Gemini's responses in your private data stored in Vertex AI Datastore. Grounding improves the quality of search results by anchoring Gemini's outputs to your data; see the Vertex AI Grounding documentation for details. The server can connect to one or more Vertex AI data stores, enabling flexible grounding across datasets.

The server can be run in Docker (a Dockerfile is provided) or installed from the repository as a Python package. Configuration is YAML-based and required (a config.yml.template is included), with sections for server, model, and data_stores (including fields such as project_id, location, datastore_id, tool_name, and description). The MCP server supports two transports, Server-Sent Events (SSE) and stdio, selected via the --transport flag, and it offers a test command (uv run mcp-vertexai-search search) to query Vertex AI Search without running the MCP server. Appendix A details the config file structure and required fields.
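
A minimal sketch of how such a config might be laid out, with placeholder values throughout; only the section names (server, model, data_stores) and the data store fields (project_id, location, datastore_id, tool_name, description) come from this listing, while the inner keys of the server and model sections are assumptions.

```yaml
# Hypothetical config.yml sketch; values are placeholders, not project defaults.
server:
  name: vertex-ai-search            # assumed key: server display name
model:
  model_name: gemini-2.0-flash      # assumed key: Gemini model used for grounding
  project_id: my-gcp-project        # placeholder GCP project
  location: us-central1             # placeholder region
data_stores:
  - project_id: my-gcp-project      # documented field
    location: global                # documented field
    datastore_id: my-datastore-id   # documented field
    tool_name: product_docs         # documented field
    description: Searches the product documentation datastore.
  - project_id: my-gcp-project      # second entry, since multiple data stores are supported
    location: global
    datastore_id: another-datastore-id
    tool_name: support_articles
    description: Searches the support article datastore.
```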

Details

Owner: ubie-oss
Language: Python
License: Apache License 2.0
Updated: 2025-12-07

Features

Grounding-based search with Vertex AI datastore

Grounds Gemini's responses in data stored in Vertex AI Datastore to improve the quality of search results.

Multi-datastore integration

Supports integrating one or more Vertex AI data stores into the MCP server.

Docker deployment ready

Provides a Dockerfile for running the MCP server in Docker.

Config-driven operation

Configured via a YAML file (config.yml.template) detailing server, model, and data_stores settings.

Two transport options

Supports Server-Sent Events (SSE) and standard input/output (stdio), selected via the --transport flag (see the command sketch after this list).

CLI-based search testing

Allows testing Vertex AI Search with the search command without running the MCP server.

Config template and appendix

Appendix A describes the config file structure and required fields for setup.
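
The command sketches below are illustrative. Only the --transport flag, the search subcommand (uv run mcp-vertexai-search search), and the presence of a Dockerfile are documented in this listing; the serve subcommand name, the --config flag, and the image tag are assumptions.

```sh
# Build the container image from the provided Dockerfile (the tag is arbitrary).
docker build -t mcp-vertexai-search .

# Start the MCP server over either transport; 'serve' and '--config' are assumptions,
# only the '--transport' flag is documented in this listing.
uv run mcp-vertexai-search serve --config config.yml --transport sse
uv run mcp-vertexai-search serve --config config.yml --transport stdio

# Query Vertex AI Search directly, without running the MCP server.
uv run mcp-vertexai-search search
```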

Audience

AI engineers: Build an MCP server that grounds Gemini responses in private Vertex AI data stores.
Data scientists: Integrate private Vertex AI data stores to ground model outputs for better search results.

Tags

Vertex AI, Gemini, Grounding, Datastore, MCP, Document Search, Config, Docker, SSE, stdio