Cursor

Configure Cursor IDE to access Large Language Models hosted by AI Enabler.

This tutorial walks you through configuring Cursor to use an OpenAI-compatible API to access LLMs hosted by AI Enabler in Chat, inline edits, and Agent mode. You'll point Cursor at an AI Enabler base URL so that all built-in AI features run through Cast AI infrastructure.

Overview

By the end of this tutorial, you'll be able to:

  • Use MiniMax 2.5 and GLM 5 through Cursor's native Chat and Agent modes
  • Route all Cursor AI requests to AI Enabler
  • Switch between available models directly from Cursor's model selector

This tutorial is intended for developers who have a Cursor Pro subscription or higher. The free tier does not allow configuring custom API keys.

📘

Note

The AI Enabler base URL works with any OpenAI-compatible client. This tutorial covers Cursor, but the same base URL and API key work with OpenCode and other compatible tools.
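As a concrete illustration of the note above (a sketch, not part of the Cursor setup): many OpenAI-compatible tools, including the official OpenAI Python and JavaScript SDKs, read the standard OpenAI environment variables, so exporting them is often all such a client needs.

```shell
# Point any client that honors the standard OpenAI environment
# variables at AI Enabler instead of api.openai.com.
export OPENAI_BASE_URL="https://llm.cast.ai/openai/v1"
export OPENAI_API_KEY="<your Cast AI API key>"
```

Tools that use their own configuration mechanism (like Cursor, covered below) need the same two values entered in their settings instead.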

Prerequisites

Before starting, ensure you have:

  1. A Cast AI API key — Generate one from the Cast AI console under AI Enabler > Overview
  2. Cursor IDE — Download from cursor.com
  3. A Cursor Pro subscription or higher

Step 1: Open Cursor Settings

Open Cursor and navigate to Cursor Settings:

  • macOS: Menu bar > Cursor > Settings > Cursor Settings
  • Windows/Linux: Menu bar > File > Settings > Cursor Settings

In the left sidebar, click Models.

Step 2: Configure the OpenAI base URL

In the Models settings page, scroll down to API Keys and configure the following:

  1. Enter https://llm.cast.ai/openai/v1 in the Override OpenAI Base URL field, then toggle it ON.

  2. Enter your Cast AI API key in the OpenAI API Key field, then toggle it ON.

    This tells Cursor to route all OpenAI-compatible requests through AI Enabler instead of OpenAI's default endpoint.
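Before continuing, you can sanity-check the same base URL and key from a terminal. This is a sketch that assumes AI Enabler implements the standard OpenAI-compatible /models endpoint and that CASTAI_API_KEY holds your Cast AI API key:

```shell
# List the models available through AI Enabler.
# Assumes the standard OpenAI-compatible GET /models endpoint.
BASE_URL="https://llm.cast.ai/openai/v1"

curl -s "$BASE_URL/models" \
  -H "Authorization: Bearer $CASTAI_API_KEY" \
  || echo "Request failed: check your network and API key"
```

If the response lists models rather than an authentication error, Cursor will be able to reach AI Enabler with the same values.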

Step 3: Add models

In the same Models page, use the Add or search model field at the top to add the following models:

  • minimax-m2.5
  • glm-5-fp8

Toggle each model ON after adding it.

Your settings should look like this:

Cursor Settings — Models page with minimax-m2.5 and glm-5-fp8 enabled, API key and Override OpenAI Base URL configured

Step 4: Verify your configuration

  1. Open Cursor Chat.

  2. Click the model selector at the bottom of the chat input panel and switch from Auto to minimax-m2.5.

    The model selector should show minimax-m2.5 and glm-5-fp8 as available options:

    Cursor Chat — model selector showing minimax-m2.5 selected

  3. Send a test prompt — for example, Hello. If you get a response, the connection is working.
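If the chat test fails, you can reproduce it outside Cursor to isolate the problem. This is a sketch that assumes AI Enabler accepts the standard OpenAI chat completions request shape with the model IDs added above:

```shell
# Send the same "Hello" test prompt directly to AI Enabler,
# bypassing Cursor, via the OpenAI-compatible chat endpoint.
BASE_URL="https://llm.cast.ai/openai/v1"

curl -s "$BASE_URL/chat/completions" \
  -H "Authorization: Bearer $CASTAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "minimax-m2.5",
        "messages": [{"role": "user", "content": "Hello"}]
      }' \
  || echo "Request failed: check your network and API key"
```

If this call succeeds but Cursor does not, recheck the toggles in Step 2; if it fails with an authentication error, regenerate your Cast AI API key.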

Next steps