How to use OpenRouter in BoltAI

Written By Daniel Nguyen

Last updated about 1 month ago

OpenRouter is a unified API platform for accessing multiple large language models (LLMs), such as GPT-4, Claude, and Llama, through a single interface. Instead of managing separate providers, you can switch between commercial and open-source models with a single API key. It offers standardized access, better performance, cost management, and data control, making it easier to integrate AI capabilities into your applications.

BoltAI has first-class support for OpenRouter.

Follow these quick steps to set up and interact with OpenRouter models inside BoltAI.

Prerequisites

Make sure you’ve generated a valid API key and your OpenRouter account has enough credits.
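If you want to confirm the key works before adding it to BoltAI, you can send a test request to OpenRouter's OpenAI-compatible chat completions endpoint. A minimal sketch in Python using only the standard library (the `OPENROUTER_API_KEY` environment variable and the `openai/gpt-4o-mini` model slug are illustrative assumptions; substitute your own):

```python
# Sanity-check an OpenRouter API key against the OpenAI-compatible
# chat completions endpoint (stdlib only; no extra packages needed).
import json
import os
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request in the OpenAI-compatible format OpenRouter expects."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",  # your OpenRouter key
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__" and os.environ.get("OPENROUTER_API_KEY"):
    # Only runs when OPENROUTER_API_KEY is set. A 401 response means the
    # key is invalid; a 402 usually means the account is out of credits.
    req = build_request(os.environ["OPENROUTER_API_KEY"],
                        "openai/gpt-4o-mini", "Say hello")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

A successful response confirms both conditions above: the key is valid and the account has credits to spend.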

Set up OpenRouter provider in BoltAI

  1. Open BoltAI → Settings → AI Services.

  2. Click Add → choose OpenRouter → click Next.

  3. Enter your OpenRouter API key.

  4. (Optional) Select your favorite models.

  5. Set a Default Model → click Add Service.

Enter OpenRouter API Key
Set your default model

Start a new chat with OpenRouter models

  1. Click New Chat.

  2. Choose the OpenRouter service/model in the model switcher.

  3. (Optional) Enable the web search tool if you want to use OpenRouter’s native web search capabilities.

  4. Send your message.

OpenRouter's native web search
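BoltAI’s web search toggle corresponds to OpenRouter’s own web search feature. On the API side, OpenRouter documents enabling it by appending `:online` to the model slug; a sketch of such a request body (the model slug is an illustration, not a recommendation):

```json
{
  "model": "openai/gpt-4o:online",
  "messages": [
    { "role": "user", "content": "Summarize today's AI news." }
  ]
}
```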

Start an Instant Chat with OpenRouter

  1. Trigger your Instant Chat keyboard shortcut (default: Option + Space).

  2. Select an OpenRouter model (Command + Shift + M).

  3. Send your message.