MCP-VectorSQL is a powerful vector SQL generation tool that converts natural language questions into high-quality SQL queries, specifically designed for vector databases. It enables users to interact with vector databases using natural language, simplifying complex vector search operations.
The architecture consists of three main components:
- Text2VectorSql: Handles natural language input and generates unified SQL output
- LLM: Processes natural language questions and generates vector queries
- VecDB (MyScale): Performs vector similarity searches and stores vector data
The workflow includes:
- Step 1: LLM lists database tables and schemas from the vector database
- Step 2: Text2VectorSql gets vector queries based on natural language questions
- Step 3: VecDB executes vector queries and returns results
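The three steps above can be sketched with stubbed components. The function names, the schema, and the MyScale-style query below are illustrative assumptions, not the server's actual internal API:

```python
# Hypothetical sketch of the list-schemas -> generate-SQL -> execute workflow.

def list_schemas():
    """Step 1: the LLM first inspects available tables and their schemas.
    The table and column names here are made up for illustration."""
    return {"documents": {"id": "UInt64", "text": "String",
                          "embedding": "Array(Float32)"}}

def generate_vector_sql(question, schemas):
    """Step 2: Text2VectorSql turns the question into a vector SQL query.
    A real implementation would prompt the LLM with the schemas; here we
    return a hard-coded MyScale-style query as a placeholder."""
    return ("SELECT id, text FROM documents "
            "ORDER BY distance(embedding, {query_vector}) LIMIT 5")

def execute_query(sql):
    """Step 3: VecDB (MyScale) executes the query and returns rows.
    Stubbed result for illustration."""
    return [{"id": 1, "text": "example result"}]

schemas = list_schemas()
sql = generate_vector_sql("Find documents about vector search", schemas)
rows = execute_query(sql)
print(sql)
print(rows)
```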
Key capabilities:
- Natural language input
  - Accepts direct natural language questions from users
  - Converts natural language into structured vector queries
  - Supports complex questions with multiple conditions
- Vector search
  - Performs efficient similarity searches on vector databases
  - Supports multiple similarity metrics (cosine similarity, Euclidean distance, etc.)
  - Optimized for large-scale vector datasets
- Result processing
  - Processes and integrates results from vector searches
  - Combines information from multiple sources when needed
  - Generates coherent and comprehensive answers
- Answer generation
  - Returns natural language answers based on search results
  - Provides relevant and accurate information to users
  - Maintains context and relevance throughout the conversation
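The supported similarity metrics can be illustrated in plain Python. MyScale evaluates these natively inside SQL, so these reference implementations are for understanding only, not part of the server:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity: dot product divided by the product of norms."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def euclidean_distance(a, b):
    """Euclidean (L2) distance between two vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

v1, v2 = [1.0, 0.0], [0.0, 1.0]
print(cosine_similarity(v1, v1))   # identical vectors -> 1.0
print(euclidean_distance(v1, v2))
```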
```shell
# Copy environment variable example file
cp .env.example .env
```

Modify the .env file with your configuration:
- API settings (API_KEY, API_URL, etc.)
- Database settings (MYSCALE_HOST, MYSCALE_PORT, MYSCALE_USER, etc.)
- Server settings (MCP_SERVER_TRANSPORT, MCP_BIND_HOST, etc.)
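A hedged example of what the .env might look like. Only the variable names listed above are confirmed; the additional names (e.g. MYSCALE_PASSWORD) and all values are placeholders — .env.example is the authoritative reference:

```env
# API settings
API_KEY=your-api-key
API_URL=https://api.example.com/v1

# Database settings (MyScale)
MYSCALE_HOST=localhost
MYSCALE_PORT=9000
MYSCALE_USER=default
MYSCALE_PASSWORD=your-password

# Server settings
MCP_SERVER_TRANSPORT=sse
MCP_BIND_HOST=0.0.0.0
```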
```shell
# Initialize runtime environment
uv sync --all-extras --dev

# Run MCP server
uv run python -m mcp_server.main
```

Register the MCP server with the Dify platform to use its SQL generation capabilities.
Please refer to the LICENSE file for license information.
For questions or suggestions, please contact the development team.
