## Overview

### Features
**Reliable verse retrieval for LLMs**

The current MCP server implementation supports repeated, reliable fetching of Bible verses for use with large language models.
**Claude Desktop integration**

The server can be configured to run as an MCP server with Claude Desktop via `mcp-server.stdio.js`.
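A minimal Claude Desktop entry might look like the sketch below, added to `claude_desktop_config.json`. The server name `bible-verses` and the script path are placeholders; adjust them for your install:

```json
{
  "mcpServers": {
    "bible-verses": {
      "command": "node",
      "args": ["/path/to/mcp-server.stdio.js"]
    }
  }
}
```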
**Dockerized completions API wrapper**

A Docker container wraps the MCP server to expose an OpenAI-completions-style API via mcpo.
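Outside the container, the same wrapping can be sketched with mcpo directly. The port is an example, not a project default:

```shell
# Expose the stdio MCP server as an HTTP API via mcpo
# (port and script path are examples; adjust for your setup)
uvx mcpo --port 8000 -- node mcp-server.stdio.js
```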
**Swagger API documentation**

A Swagger UI is available at `/docs` for API exploration and testing.
**Get-verse endpoint with multiple references and language**

The endpoint accepts an array of verse references and a language parameter, allowing flexible lookups.
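A request might be shaped roughly as follows. The field names, port, and endpoint path here are assumptions for illustration, not taken from the actual API; consult the Swagger UI at `/docs` for the real schema:

```shell
# Hypothetical request shape against a locally running instance
curl -X POST http://localhost:8000/get-verse \
  -H "Content-Type: application/json" \
  -d '{"references": ["John 3:16", "Psalm 23:1"], "language": "en"}'
```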
**Open WebUI and Ollama workflow guidance**

Documentation covers running the project fully locally with Open WebUI and local LLMs (e.g., Ollama with Llama 3.1 8B).
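For the local setup, the model can be fetched with Ollama's CLI (the `llama3.1:8b` tag assumes Ollama's standard naming for the Llama 3.1 8B model):

```shell
# Download the Llama 3.1 8B model for local use with Open WebUI
ollama pull llama3.1:8b
```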
**Web front end access**

A public web front end is available at ai-bible.com for exploring the project.
## Who Is This For?

- **Researchers:** Use the MCP server to retrieve Bible verses reliably for evaluation and reproducible experiments with LLMs.
- **Educators:** Provide students with consistent verse lookups for AI-assisted biblical interpretation and study.
- **Developers / LLM integrators:** Connect the MCP server to Claude Desktop, OpenAI-like completions APIs, or local LLM stacks (via Open WebUI and Ollama).
