Ollama MCP Server

Integrate local LLMs with MCP-compatible AI assistants via Ollama

An MCP server exposing the complete Ollama SDK as tools, enabling seamless integration between local LLMs and MCP-compatible applications such as Claude Desktop, Cline, and Cursor.

Author: rawveg

Stars: 151


Install: npx -y ollama-mcp
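The install command above can be registered in an MCP client's configuration so the server is launched on demand. A minimal sketch of a Claude Desktop `mcpServers` entry is shown below; the key "ollama" is just an arbitrary label, not a requirement of this package:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"]
    }
  }
}
```

With this entry in place, the client spawns the server over stdio and the Ollama tools become available to the assistant. A local Ollama instance must be running for the tools to respond.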