Remocode · Getting Started · 6 min read

Setting Up Your First AI Provider in Remocode

A step-by-step tutorial for connecting your first AI provider to Remocode. Covers Anthropic, OpenAI, Google, Groq, and Ollama setup with API key instructions and model selection tips.

Tags: AI provider · API key · configuration · Anthropic · OpenAI · Gemini · Groq · Ollama


Remocode supports five AI providers out of the box. This guide will help you pick the right one and get it connected in a few minutes.

Accessing Provider Settings

  • Press `⌘⇧A` to open the AI panel
  • Click the ⚙ Settings (gear) icon
  • Navigate to the Provider tab

This is where all provider API keys and model selections live.

Option 1: Anthropic (Cloud)

Anthropic offers Claude, one of the most capable coding assistants around. Available models:

  • Claude Opus 4.6 — top-tier reasoning, best for complex architecture tasks
  • Claude Sonnet 4.6 — excellent balance of speed and quality
  • Claude Haiku 4.5 — fast responses, great for quick completions
  • Claude Haiku 3.5 — lightweight option for simple tasks

Setup:

  • Visit [console.anthropic.com](https://console.anthropic.com) and create an account
  • Navigate to API Keys and click Create Key
  • Copy the key (it starts with `sk-ant-`)
  • Paste it into the Anthropic field in Remocode settings
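
Before saving the key, an optional prefix check catches truncated or mis-pasted copies. This is a sketch, not a Remocode feature: the placeholder value and the `ANTHROPIC_API_KEY` variable name are just illustrative conventions.

```shell
# Placeholder only -- substitute the key you copied from the console.
ANTHROPIC_API_KEY="sk-ant-api03-EXAMPLE"

# Anthropic keys begin with "sk-ant-", so a quick prefix check catches
# truncated or mis-pasted keys before you save them in Remocode.
case "$ANTHROPIC_API_KEY" in
  sk-ant-*) echo "key format looks valid" ;;
  *)        echo "key format looks wrong -- re-copy it" ;;
esac
```

With the placeholder above, the check prints `key format looks valid`.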

Option 2: OpenAI (Cloud)

OpenAI provides the GPT and o-series models:

  • GPT-5.4 and GPT-5 — latest generation, strong at code generation
  • GPT-5 Mini and GPT-5 Nano — faster and more affordable
  • GPT-4.1 and GPT-4.1 Mini — proven and reliable
  • GPT-4o and GPT-4o Mini — optimized multimodal models
  • o3 and o3 Mini — advanced reasoning models

Setup:

  • Visit [platform.openai.com](https://platform.openai.com) and sign in
  • Go to API Keys and generate a new key
  • Copy and paste it into Remocode settings
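
If you want to confirm the key works before pasting it in, listing the models it can access is a cheap smoke test. This optional sketch assumes the key is exported as `OPENAI_API_KEY` (a naming convention, not a Remocode requirement); the model names your account lists may differ from those above.

```shell
# Assumes your key is exported as OPENAI_API_KEY; if it isn't set,
# the check is skipped rather than sent with an empty header.
if [ -n "${OPENAI_API_KEY:-}" ]; then
  # GET /v1/models lists every model the key can access -- a cheap
  # way to confirm the key is valid before saving it in Remocode.
  curl -sS https://api.openai.com/v1/models \
    -H "Authorization: Bearer $OPENAI_API_KEY" | head -c 300
  openai_check="ran"
else
  openai_check="skipped"
fi
echo "$openai_check"
```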

Option 3: Google Gemini (Cloud)

Google provides the Gemini family of models:

  • Gemini 3.1 Pro — latest flagship model
  • Gemini 3 Flash — fast and capable
  • Gemini 2.5 Pro and Gemini 2.5 Flash — strong all-rounders
  • Gemini 2.0 Flash, Gemini 1.5 Pro, Gemini 1.5 Flash — stable and well-tested

Setup:

  • Visit [aistudio.google.com](https://aistudio.google.com) and sign in with your Google account
  • Navigate to Get API Key and create one
  • Copy the key and paste it into Remocode settings
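
The same kind of smoke test works for Gemini via Google's public models-list endpoint. An optional sketch, assuming the key is exported as `GEMINI_API_KEY` (an illustrative name):

```shell
# Skips the request entirely if no key is set.
if [ -n "${GEMINI_API_KEY:-}" ]; then
  # Lists the Gemini models your key can access -- a quick validity
  # check before pasting the key into Remocode.
  curl -sS "https://generativelanguage.googleapis.com/v1beta/models?key=$GEMINI_API_KEY" | head -c 300
  gemini_check="ran"
else
  gemini_check="skipped"
fi
echo "$gemini_check"
```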

Option 4: Groq (Cloud)

Groq delivers blazing-fast inference. Available models:

  • Llama 3.3 70B — powerful open-source model
  • Llama 3.1 8B — lightweight and fast
  • Mixtral 8x7B — efficient mixture-of-experts architecture

Setup:

  • Visit [console.groq.com](https://console.groq.com) and create an account
  • Generate an API key from the dashboard
  • Paste it into Remocode settings
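
Groq exposes an OpenAI-compatible API, so the same models-list check works against its endpoint. An optional sketch, assuming the key is exported as `GROQ_API_KEY` (a name chosen here for illustration):

```shell
# Skips the request if no key is set.
if [ -n "${GROQ_API_KEY:-}" ]; then
  # Groq's API mirrors OpenAI's, including the /models route.
  curl -sS https://api.groq.com/openai/v1/models \
    -H "Authorization: Bearer $GROQ_API_KEY" | head -c 300
  groq_check="ran"
else
  groq_check="skipped"
fi
echo "$groq_check"
```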

Option 5: Ollama (Local)

Ollama runs models on your own machine with zero cloud dependency. Supported models:

  • Llama 3.2, Mistral, Code Llama, Qwen 3.5, DeepSeek V3

Setup:

  • Install Ollama from [ollama.com](https://ollama.com)
  • Pull a model: `ollama pull llama3.2`
  • Ensure Ollama is running locally
  • Select Ollama as your provider in Remocode — no API key needed
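
The steps above can be verified from a terminal: Ollama serves a local HTTP API (port 11434 by default), and its `/api/tags` route lists the models you have pulled. A guarded sketch, so it does nothing on machines without Ollama installed:

```shell
# Only runs the check when the ollama binary is on PATH.
if command -v ollama >/dev/null 2>&1; then
  # /api/tags lists locally pulled models; if this responds,
  # Remocode can talk to the same local endpoint.
  curl -sS http://localhost:11434/api/tags
  ollama_check="checked local API"
else
  ollama_check="ollama not installed"
fi
echo "$ollama_check"
```

If the check fails even though Ollama is installed, start the server first (for example by launching the Ollama app).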

Which Provider Should You Choose?

  • Best quality: Anthropic Claude Opus 4.6 or OpenAI GPT-5.4
  • Best speed: Groq Llama 3.3 70B
  • Best privacy: Ollama with any local model
  • Best free tier: Google Gemini (generous free quota)

You can configure multiple providers and switch between them at any time. Start with one, and expand as your workflow evolves.

Ready to try Remocode?

Start with a 7-day Pro trial — no credit card required. Download now and start coding with AI from anywhere.

Download Remocode for macOS
