A Streamlit interface for Ollama models with full MCP (Model Context Protocol) integration. Works with any tool-calling capable model like deepseek-r1-tool-calling:14b or llama2:latest.
- Local LLM Execution: Run models locally using Ollama (deepseek-r1)
- MCP Integration: Universal tool protocol support
- Streamlit Interface: Real-time streaming chat interface
- Dynamic Tool Support: Automatic capability detection
MCP is a universal protocol that standardizes how AI models interact with tools and services. It provides:
- Universal Tool Interface: Common protocol for all AI tools
- Standardized Messages: Consistent communication format
- Discoverable Capabilities: Self-describing tools and services
- Language Agnostic: Works with any programming language
- Growing Ecosystem: Many tools available
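Concretely, MCP messages are JSON-RPC 2.0. A sketch of the capability-discovery exchange is below; the field names (`tools/list`, `inputSchema`) follow the MCP specification, while the example tool and its schema are illustrative:

```python
import json

# An MCP client discovers a server's tools with a JSON-RPC "tools/list" request.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# The server replies with self-describing tool metadata (hypothetical example tool).
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "brave_web_search",
                "description": "Search the web with Brave",
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "query": {"type": "string"},
                        "count": {"type": "integer"},
                    },
                    "required": ["query"],
                },
            }
        ]
    },
}

# The model can inspect each tool's schema before deciding how to call it.
tool = response["result"]["tools"][0]
print(tool["name"], sorted(tool["inputSchema"]["properties"]))
```

Because every server answers the same request with the same shape of metadata, tools are discoverable without any server-specific client code.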
Learn more:
- Python 3.9+
- Ollama desktop app installed and running
- MCP-compatible tools
- python-dotenv
- An Ollama-compatible model with tool-calling support
Prerequisites:
```bash
# Install Ollama desktop app from https://ollama.ai/download
# Make sure Ollama is running

# Then pull the recommended model (or choose another tool-calling capable model)
ollama pull MFDoom/deepseek-r1-tool-calling:14b

# Alternative models that support tool calling:
# ollama pull llama2:latest
```
Setup:
```bash
git clone https://github.com/madtank/OllamaAssist.git
cd OllamaAssist
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```
OllamaAssist uses environment variables for configuration. Create a `.env` file:
```bash
# Brave Search Configuration
BRAVE_API_KEY=your_api_key_here

# Optional: Override default commands
#BRAVE_COMMAND=docker
#BRAVE_ARGS=run -i --rm -e BRAVE_API_KEY mcp/brave-search

# Filesystem Configuration
#FILESYSTEM_PATHS=/path1:/path2:/path3
```
Variables can be:
- Set in the `.env` file
- Commented out to use defaults
- Overridden with environment variables
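That precedence (process environment over `.env` values over built-in defaults) can be sketched with the standard library alone. The variable names mirror the `.env` example above; `resolve` is a hypothetical helper for illustration, not part of OllamaAssist:

```python
import os
from typing import Optional

def resolve(name: str, dotenv_values: dict, default: Optional[str] = None) -> Optional[str]:
    """Hypothetical lookup: the real environment wins, then .env values, then the default."""
    if name in os.environ:
        return os.environ[name]
    return dotenv_values.get(name, default)

# Values as python-dotenv might load them from the .env file above.
dotenv_values = {"BRAVE_API_KEY": "your_api_key_here"}

# BRAVE_COMMAND is commented out in .env, so the built-in default applies
# unless you export BRAVE_COMMAND in your shell.
print(resolve("BRAVE_COMMAND", dotenv_values, default="npx"))
```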
OllamaAssist uses MCP to provide powerful capabilities through standardized tools. Configure available tools in `mcp_config.json`:
```json
{
  "mcpServers": {
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "your-api-key-here"
      }
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/allowed/path"]
    }
  }
}
```
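A minimal sketch of how a config in this shape can be loaded and sanity-checked; the validation rules here are assumptions for illustration, not OllamaAssist's actual loader:

```python
import json

def load_mcp_config(text: str) -> dict:
    """Parse an mcp_config.json payload and check each server entry."""
    config = json.loads(text)
    servers = config.get("mcpServers", {})
    for name, entry in servers.items():
        if "command" not in entry:
            raise ValueError(f"server '{name}' is missing required 'command'")
        entry.setdefault("args", [])  # args and env are optional
        entry.setdefault("env", {})
    return servers

servers = load_mcp_config("""
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/allowed/path"]
    }
  }
}
""")
print(sorted(servers))  # → ['filesystem']
```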
The project supports various MCP servers:
- brave-search - Web and local search capabilities
- filesystem - Secure file operations
- chromadb - Vector database operations
- postgres - SQL database integration
- mcp-memory - Long-term context persistence
- sqlite - Lightweight database operations
- huggingface - Model and dataset access
- langchain - AI workflow integration
- git - Repository operations
- jupyter - Notebook integration
Check out Awesome MCP Servers for more.
Each server entry needs:
- `command`: The MCP tool executable
- `args`: Optional command-line arguments
- `env`: Environment variables (such as API keys)
Common MCP servers:
- `brave-search`: Web search (requires a Brave API key)
- `filesystem`: Local file operations
- `sequential-thinking`: Self-reflection capabilities
- Add your own MCP-compatible tools!
For services requiring authentication:
- Get your API key (e.g., Brave Search API)
- Add it to the appropriate server's `env` section
- Never commit API keys to version control
Example tool implementation:

```python
from typing import Any

async def brave(action: str, query: str = "", count: int = 5) -> Any:
    """Brave Search API wrapper."""
    server_name = "brave-search"
    # Delegate to the generic MCP dispatcher, which routes the
    # call to the configured brave-search server.
    return await mcp(
        server=server_name,
        tool=f"brave_{action}_search",
        arguments={"query": query, "count": count}
    )
```
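The `mcp(...)` call routes the request to the configured server. A heavily simplified, hypothetical dispatcher of that shape is sketched below; the real one launches the server's command and speaks the MCP wire protocol, whereas here a plain dict of handlers stands in:

```python
import asyncio
from typing import Any, Callable, Dict

# Stand-in registry: server name -> tool name -> handler.
# A real dispatcher would spawn the server process and exchange JSON-RPC messages.
_HANDLERS: Dict[str, Dict[str, Callable[..., Any]]] = {
    "brave-search": {
        "brave_web_search": lambda query="", count=5: f"{count} results for {query!r}",
    }
}

async def mcp(server: str, tool: str, arguments: dict) -> Any:
    """Hypothetical sketch: look up the tool on the named server and invoke it."""
    try:
        handler = _HANDLERS[server][tool]
    except KeyError:
        raise ValueError(f"unknown tool {tool!r} on server {server!r}")
    return handler(**arguments)

async def brave(action: str, query: str = "", count: int = 5) -> Any:
    """The wrapper above, exercised against the stub registry."""
    return await mcp(
        server="brave-search",
        tool=f"brave_{action}_search",
        arguments={"query": query, "count": count}
    )

print(asyncio.run(brave("web", query="model context protocol")))
```

The wrapper never needs to know how the server is launched or spoken to; that indirection is what lets any MCP-compatible tool drop in via configuration alone.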
- Create an MCP-compatible tool
- Add it to `mcp_config.json`
- The tool will be automatically available to the chatbot
- Ensure Ollama desktop app is running
- Launch OllamaAssist:
```bash
streamlit run streamlit_app.py
```
Run tests:
```bash
python -m pytest tests/test_tools.py -v
```
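Your own tool wrappers can be unit-tested without a live MCP server by injecting the dispatch call. A minimal, stdlib-only sketch of that pattern (the wrapper signature and fake are hypothetical, not taken from `tests/test_tools.py`):

```python
import asyncio

# Hypothetical wrapper under test: the dispatch callable is a parameter,
# so a fake can record the request instead of contacting a server.
async def brave(action: str, query: str = "", count: int = 5, call_tool=None):
    return await call_tool(
        server="brave-search",
        tool=f"brave_{action}_search",
        arguments={"query": query, "count": count},
    )

async def fake_call_tool(server: str, tool: str, arguments: dict) -> dict:
    # Stands in for the real dispatcher; just echoes the request back.
    return {"server": server, "tool": tool, "arguments": arguments}

result = asyncio.run(brave("web", query="ollama", call_tool=fake_call_tool))
print(result["tool"])  # → brave_web_search
```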
Want to create your own MCP tool? Follow these guides:
Use the MCP Inspector to test your tools:
```bash
mcp dev your_server.py
```
Or install in Claude Desktop:
```bash
mcp install your_server.py
```
- Fork the repository
- Create a feature branch
- Test your changes
- Submit a pull request
- Additional MCP server integrations
- Enhanced model capability detection
- Advanced tool chaining
- UI improvements for tool interactions
This project is licensed under the MIT License - see the LICENSE file for details.