# Installation Guide

Producer Pal is free and open source. Choose your preferred AI platform to get
started. Note that some AI services charge for usage.

::: tip Prerequisites

**Download** v{{ $frontmatter.version }}**:**&nbsp;
[Producer_Pal.amxd](https://github.com/adamjmurray/producer-pal/releases/latest/download/Producer_Pal.amxd)
(the Producer Pal Max for Live device)

**Requires:** [Ableton Live 12.3+](https://www.ableton.com/live/) with
[Max for Live](https://www.ableton.com/live/max-for-live/)

Upgrading from a previous version? See the
[upgrading guide](./installation/upgrading).

:::

## Which AI Do You Use?

Pick your preferred AI provider:

- **[Anthropic / Claude](./installation/choose-claude)** — Claude Desktop,
  claude.ai, or Claude Code
- **[OpenAI / ChatGPT](./installation/choose-openai)** — Codex app, ChatGPT web
  app, or Codex CLI
- **[Google / Gemini](./installation/choose-gemini)** — Built-in Chat UI or
  Gemini CLI
- **[Mistral / Mistral AI](./installation/choose-mistral)** — Le Chat or Mistral
  Vibe
- **[Local / Offline](./installation/choose-local)** — Ollama or LM Studio
- **[Multiple Providers](./installation/choose-multi)** — OpenRouter, for
  flexibility across providers

## Recommended Options

These are the easiest and most reliable ways to use Producer Pal:

- **[Claude Desktop](./installation/claude-desktop)** — Anthropic's desktop app
  (easiest setup, subscription required)
- **[Codex App](./installation/codex-app)** — OpenAI's desktop app (easy setup,
  macOS only, subscription required)
- **[Built-in Chat UI](./installation/chat-ui)** — Integrated chat interface
  supporting cloud providers and local models

Looking for offline/local options? See
[Desktop Apps](./installation/desktop-apps) for LM Studio, or run
[Ollama](./installation/ollama) through the built-in chat UI.

Already have an MCP-compatible client? Connect with
[`npx producer-pal`](https://www.npmjs.com/package/producer-pal) (see
[Other MCP LLMs](./installation/other-mcp) and
[CLI options](#command-line-interfaces)).
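Most MCP clients are configured by registering the server's launch command. As a sketch (the config file location, top-level key, and server label vary by client — check your client's MCP documentation), a typical JSON entry looks like:

```json
{
  "mcpServers": {
    "producer-pal": {
      "command": "npx",
      "args": ["producer-pal"]
    }
  }
}
```

With this entry, the client launches `npx producer-pal` on startup and communicates with it over stdio.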

## Command Line Interfaces

For users comfortable with the terminal:

- **[Gemini CLI](./installation/gemini-cli)** — Google's command line agent
  (free tier has strict rate limits)
- **[Codex CLI](./installation/codex-cli)** — OpenAI's command line agent
  (subscription required)
- **[Claude Code](./installation/claude-code)** — Anthropic's command line agent
  (subscription required)
- **[Mistral Vibe](./installation/mistral-vibe)** — Mistral's command line agent
  (API key required)

## Web Applications

Use Producer Pal in your browser:

- **[claude.ai Web App](./installation/claude-web)** — Anthropic's web app
  (requires a [web tunnel](./installation/web-tunnels))
- **[ChatGPT Web App](./installation/chatgpt-web)** — OpenAI's web app (requires
  a [web tunnel](./installation/web-tunnels))
- **[Le Chat](./installation/mistral-le-chat)** — Mistral's web app (requires a
  [web tunnel](./installation/web-tunnels))

## Local & Offline Options

Run models completely offline:

- **[Ollama](./installation/ollama)** — Using the built-in chat interface
- **[LM Studio](./installation/lm-studio)** — Alternative local model server
- **[Other MCP-compatible LLMs](./installation/other-mcp)** — Any LLM supporting
  MCP

## Additional Resources

- **[Upgrading](./installation/upgrading)** — How to update to a new version
- **[Web Tunnels](./installation/web-tunnels)** — Set up remote access (for web
  apps)
- **[Troubleshooting](/support/troubleshooting)** — Common issues and solutions
