What it’s good for

Fast, capable models with generous free tiers. Gemini serves both chat and embeddings through a single provider and a single API key, so you don't need a separate setup for embeddings. A good choice for getting started quickly with a cloud provider.

Requirements

  • A Gemini API key (create one at aistudio.google.com; a free tier is available)

Configure in Spaceduck

Chat

1. Set your API key

In Settings > Chat, select Google Gemini as the provider, then enter your API key.

Or via CLI:
spaceduck config secret set /ai/secrets/geminiApiKey
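If you want to sanity-check the key outside Spaceduck, you can ask Google's public Gemini REST API (the `generativelanguage` endpoint) to list the models your key can access. `GEMINI_API_KEY` below is a placeholder environment variable holding your key:

```shell
# A 200 response with a JSON list of models confirms the key is valid.
# A 400/401/403 error means the key is missing, malformed, or revoked.
curl -s "https://generativelanguage.googleapis.com/v1beta/models?key=${GEMINI_API_KEY}"
```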
2. Select a model

In Settings > Chat:
  • Provider: Google Gemini
  • Model: e.g., gemini-2.5-flash or gemini-2.5-pro
Or via CLI:
spaceduck config set /ai/provider gemini
spaceduck config set /ai/model "gemini-2.5-flash"
3. Verify

Send a message in the chat. You should see a streaming response.

Embeddings

Gemini provides text embeddings through the same API key.
1. Configure embedding model

In Settings > Memory:
  • Toggle Semantic recall on
  • Provider: Google Gemini (or “Same as chat provider”)
  • Model: text-embedding-004
  • Dimensions: leave empty for the provider default
Or via CLI:
spaceduck config set /embedding/enabled true
spaceduck config set /embedding/provider gemini
spaceduck config set /embedding/model "text-embedding-004"
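Semantic recall works by embedding your notes and your query as vectors, then ranking notes by cosine similarity. A minimal sketch of that comparison (the 3-dimensional vectors below are made-up stand-ins; real text-embedding-004 vectors have 768 dimensions):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for an embedded document and an embedded query.
doc = [0.1, 0.8, 0.2]
query = [0.1, 0.7, 0.3]
print(round(cosine_similarity(doc, query), 3))  # close to 1.0 = similar
```

This is also why the Dimensions setting matters: vectors can only be compared if they have the same length, so all stored embeddings must come from the same model and dimension count.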
2. Test

Click the Test button in Settings > Memory.

Test and troubleshoot

Problem               | Cause                  | Fix
--------------------- | ---------------------- | --------------------------------------------------
401 or 403 error      | Invalid API key        | Check your key at aistudio.google.com
429 Too Many Requests | Rate limit (free tier) | Wait a minute and retry, or upgrade to a paid plan
Model not found       | Typo in model name     | Check available models at ai.google.dev/models
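If you script against the API yourself, the usual way to handle the 429 case is exponential backoff: retry with a doubling delay instead of hammering the rate limit. A generic sketch (the `call` argument stands in for whatever request you are retrying; nothing here is Spaceduck-specific):

```python
import time

def with_backoff(call, max_attempts: int = 4, base_delay: float = 1.0):
    """Retry `call` on a 429-style error, doubling the delay each attempt."""
    for attempt in range(max_attempts):
        try:
            return call()
        except RuntimeError as err:
            if "429" not in str(err) or attempt == max_attempts - 1:
                raise  # not a rate limit, or out of retries
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...

# Example: a fake request that is rate-limited twice before succeeding.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("429 Too Many Requests")
    return "ok"

print(with_backoff(flaky, base_delay=0.01))  # prints "ok" on the 3rd attempt
```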