LinuAI

MCP.Pizza Chef: E13Lau

LinuAI is a native macOS client designed for AI chat services with configurable custom LLM endpoints. Unlike Electron-based apps, it offers smooth native performance and deep system integration, adhering to macOS design principles and accessibility features. LinuAI supports local AI model execution through MCP Server integration, compatible with Ollama and LM Studio, enabling private, offline AI interactions. It also features real-time Markdown rendering and code syntax highlighting, making it ideal for developers and power users seeking a responsive, integrated AI chat experience on macOS.

Use This MCP client To

  • Configure custom LLM endpoints for personalized AI chat
  • Run local AI models via MCP Server integration
  • Use native macOS AI chat with system-wide accessibility
  • Display real-time Markdown and code syntax highlighting
  • Integrate AI chat seamlessly into macOS workflows

README

Linu

Linu is your own AI chat service with a configurable custom LLM endpoint.

Download

image

Features

  1. Native Performance & Integration

Built as a true native macOS app (not Electron!), LinuAI provides smooth performance and deep system integration. It supports system-wide accessibility features and follows macOS design guidelines for a cohesive experience.

  2. Local AI Support with MCP Server

Run AI models locally using MCP (Model Context Protocol) Server integration, with support for Ollama and LM Studio. This means you can keep your AI interactions private and fully offline.
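Under the hood, local runtimes like Ollama expose an OpenAI-compatible HTTP API that chat clients can target as a custom endpoint. As a minimal sketch (the base URL below is Ollama's default; the model name `llama3` and the helper function are illustrative, not part of LinuAI), this is the kind of request such a client would build:

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str):
    """Return the URL and JSON body for an OpenAI-compatible chat request."""
    # Ollama serves its OpenAI-compatible API under /v1 by default.
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return url, json.dumps(body)

# Sending this requires a running local server (e.g. `ollama serve`);
# here we only construct the request.
url, body = build_chat_request("http://localhost:11434", "llama3", "Hello!")
```

Because the API shape matches OpenAI's, the same client code works against a hosted endpoint or a local one by swapping the base URL.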

  3. Real-time Markdown & Code Highlighting

LinuAI renders Markdown in real time and applies syntax highlighting to code blocks directly in the chat interface, so responses containing formatted text or code stay easy to read.

Release notes

0.5.1.alpha Release

The 0.5.1.alpha version of our third-party AI chat client is here! 🎉

  • Supports the Claude model for more intelligent conversations.
  • Supports the MCP protocol for calling local tools, enabling more powerful functionality.
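MCP clients typically launch local tool servers from a JSON configuration. As a hedged illustration only, a common shape used by other MCP clients looks like the fragment below; LinuAI's actual schema and settings location may differ, and the server package and path are placeholders:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/Documents"]
    }
  }
}
```

Each entry names a tool server and the command used to start it; the client spawns the process and routes tool calls from the model to it over MCP.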

image

LinuAI FAQ

How do I configure a custom LLM endpoint in LinuAI?
You can set your preferred LLM endpoint in the app settings, allowing LinuAI to connect to any compatible model server.
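In practice, pointing the app at a compatible model server boils down to a few values. The key names below are illustrative rather than LinuAI's actual settings schema, and the endpoint/model values are placeholders for a local Ollama setup:

```json
{
  "endpoint": "http://localhost:11434/v1",
  "apiKey": "",
  "model": "llama3"
}
```

Hosted providers work the same way: swap in their base URL and a real API key.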
Can LinuAI run AI models locally?
Yes, LinuAI supports local AI model execution through MCP Server integration with Ollama and LM Studio.
Is LinuAI a web-based or native app?
LinuAI is a true native macOS app, not built on Electron, ensuring better performance and system integration.
Does LinuAI support accessibility features?
Yes, it supports macOS system-wide accessibility features for an inclusive user experience.
How does LinuAI handle code and Markdown?
It provides real-time Markdown rendering and syntax highlighting for code within the chat interface.
Can I use LinuAI with multiple LLM providers?
Yes, LinuAI can connect to various LLM endpoints, including OpenAI, Claude, and Gemini, via custom configuration.
Is LinuAI suitable for developers?
Absolutely, its code highlighting and local model support make it ideal for developer workflows on macOS.
Where can I download LinuAI?
You can download LinuAI from its GitHub releases page at https://github.com/E13Lau/Linu/releases.