ai-Bible

Search the Bible reliably and repeatably (ai-Bible Labs)

Stars: 6
Forks: 2
Releases: 0

Overview

ai-Bible's MCP server retrieves Bible verses repeatably and reliably for use with large language models. It is part of a broader project exploring how AI interprets biblical text, and it emphasizes reproducibility and reasonable results for educational and research purposes.

The server supports Claude Desktop via the mcp-server.stdio.js wrapper in the build folder, allowing Claude to connect to it as an MCP server. A Docker container wraps the MCP server with mcpo to expose an API compatible with OpenAI's completions API, so research- and education-focused workflows can look up data in consistent formats. A Swagger UI at /docs supports API exploration, and the get-verse endpoint accepts an array of verse references and a language parameter (e.g., English). A public front end is available at ai-bible.com, and the documentation covers local workflows using Open WebUI and Ollama with models such as Llama 3.1 8B.

Details

Owner
AdbC99
Language
JavaScript
License
GNU General Public License v3.0
Updated
2025-12-07

Features

Reliable verse retrieval for LLMs

The MCP server implementation enables repeatable, reliable fetching of Bible verses for use with large language models.

Claude Desktop integration

Can be configured to run as an MCP server with Claude Desktop via mcp-server.stdio.js.
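For reference, a Claude Desktop configuration along these lines would register the server. The server name ("ai-bible") and the install path are assumptions for illustration; adjust them to the actual build location:

```json
{
  "mcpServers": {
    "ai-bible": {
      "command": "node",
      "args": ["/path/to/ai-bible/build/mcp-server.stdio.js"]
    }
  }
}
```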

Dockerized completions API wrapper

Docker container wraps the MCP server to expose an OpenAI-completions-like API via mcpo.
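As a sketch of how such a container is typically built and run (the image name and port here are assumptions, not taken from the repository):

```shell
# Build the image from the repository root (image name is hypothetical).
docker build -t ai-bible-mcpo .

# Run the container, exposing the mcpo-provided OpenAI-compatible API
# on port 8000; the Swagger UI would then be at http://localhost:8000/docs.
docker run --rm -p 8000:8000 ai-bible-mcpo
```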

Swagger API documentation

Swagger UI is available at /docs for API exploration and testing.

Get-verse endpoint with multiple references and language

Endpoint accepts an array of verse references and a language parameter for flexible lookups.
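A minimal sketch of a request body for such an endpoint, assuming the field names references and language (the actual schema may differ; the Swagger UI at /docs is authoritative):

```javascript
// Hypothetical get-verse request body; field names and the endpoint
// URL below are assumptions for illustration only.
const body = {
  references: ["John 3:16", "Genesis 1:1"],
  language: "English",
};

// Serialize for an HTTP POST, e.g. with fetch:
//   fetch("http://localhost:8000/get-verse", {
//     method: "POST",
//     headers: { "Content-Type": "application/json" },
//     body: payload,
//   });
const payload = JSON.stringify(body);
console.log(payload);
```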

Open WebUI and Ollama workflow guidance

Documentation and guidance for using Open WebUI and local LLMs (e.g., Ollama with Llama 3.1 8B) for local operation.

Web front end access

Public web front end is available at ai-bible.com for exploring the project.

Audience

Researchers: Use the MCP server to retrieve Bible verses reliably for evaluation and reproducible experiments with LLMs.
Educators: Provide students with consistent verse lookups for AI-assisted biblical interpretation and study.
Developers / LLM integrators: Connect the MCP server to Claude Desktop, OpenAI-like completions APIs, or local LLM stacks (via Open WebUI and Ollama).

Tags

mcp-server, bible, llm, completions-api, docker, openai, claude, reproducible, research, education, get-verse, ai-bible