AI integration: getting started
AI integration is available starting with Freeplane version 1.13.1.
Configure AI in Preferences
Open AI Preferences from the AI panel toolbar burger menu, then configure the AI options.
Required user input:
- For OpenRouter: set AI OpenRouter key.
- For Gemini: set AI Gemini key.
- For Ollama:
  - set AI Ollama service address (for example http://localhost:11434 or your remote endpoint URL).
  - optional: set AI Ollama API key when your Ollama endpoint requires token authentication.
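As a quick sanity check that the configured Ollama service address is reachable, you can query Ollama's model-listing endpoint (`/api/tags`). This is a minimal sketch, independent of Freeplane; the address is the same example value used above:

```python
import json
import urllib.error
import urllib.request

def tags_url(service_address: str) -> str:
    """Build the Ollama model-listing URL from a service address."""
    return service_address.rstrip("/") + "/api/tags"

def check_ollama(service_address: str) -> None:
    """Print the models an Ollama endpoint reports, or a failure notice."""
    try:
        with urllib.request.urlopen(tags_url(service_address), timeout=5) as resp:
            models = json.load(resp).get("models", [])
            print("Reachable; models:", [m["name"] for m in models])
    except (urllib.error.URLError, OSError) as exc:
        print("Ollama endpoint not reachable:", exc)

check_ollama("http://localhost:11434")  # example address, as above
```

If this prints a model list, the same address should work as the AI Ollama service address preference.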
Notes for Ollama:
- Ollama is available only when AI Ollama service address is set.
- When AI Ollama API key is non-empty, Freeplane sends Authorization: Bearer <key> for Ollama chat and model discovery requests.
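The Authorization header behavior described above can be sketched as follows. This is an illustrative reconstruction, not Freeplane's actual code; the model name and key value are placeholders:

```python
import json
import urllib.request

def build_chat_request(service_address: str, api_key: str,
                       prompt: str) -> urllib.request.Request:
    """Build (but do not send) an Ollama /api/chat request, attaching
    a Bearer token only when an API key is configured."""
    body = json.dumps({
        "model": "llama3",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    if api_key:  # empty key: no Authorization header, matching the note above
        headers["Authorization"] = "Bearer " + api_key
    return urllib.request.Request(service_address.rstrip("/") + "/api/chat",
                                  data=body, headers=headers)

req = build_chat_request("http://localhost:11434", "my-token", "Hello")
print(req.get_header("Authorization"))  # → Bearer my-token
```

With an empty API key the request carries no Authorization header, which is what an unauthenticated local Ollama expects.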
If configuration is missing, chat shows:
No AI provider is configured.
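The configuration rules above can be condensed into a small check. The preference names are taken from this page; the dictionary is a hypothetical stand-in for Freeplane's preferences store, not its real API:

```python
def configured_providers(prefs: dict) -> list:
    """Return the providers usable under the rules described above."""
    providers = []
    if prefs.get("AI OpenRouter key"):
        providers.append("OpenRouter")
    if prefs.get("AI Gemini key"):
        providers.append("Gemini")
    # Ollama needs the service address; an API key alone is not enough.
    if prefs.get("AI Ollama service address"):
        providers.append("Ollama")
    return providers

prefs = {"AI Ollama API key": "token"}  # address missing, so Ollama is unavailable
if not configured_providers(prefs):
    print("No AI provider is configured.")  # the message shown in chat
```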
Send your first AI chat message
- Open the AI panel.
- Type a request in the input area.
- Use Send or press Command/Ctrl + Enter to start the request.
- Use Cancel to stop an active request.
Helpful chat controls:
- New chat: start a clean live chat.
- Chats: open the chat list dialog.
- Manage profiles: open profile management.
- AI profile: shows or selects the active profile.
Next steps
- Continue with workflow patterns in AI chat workflows.
- For remote tool access, see Model Context Protocol server. MCP is disabled by default; enable it only when needed, for example when driving Freeplane from local MCP-capable tools such as Claude Desktop or the Codex App.
- For diagnosis, see AI integration troubleshooting.