Providers

TransFlex is built to be model-agnostic. We believe you should be free to pick whichever model best fits your use case.

  1. OpenAI: gpt-4o-mini, gpt-4o, gpt-4.1, gpt-5, o3-mini, o4-mini, and more
  2. Anthropic: claude-haiku-4-5, claude-sonnet-4, claude-sonnet-4-5, claude-opus-4-5, and more
  3. Google Gemini: gemini-2.5-flash, gemini-2.5-pro, gemini-3-flash-preview, and more

You can also connect to any OpenAI-compatible server, including local instances like Ollama, corporate API proxies, and other third-party inference providers.

  1. In TransFlex, open Settings > Providers > Custom.
  2. Click Add Custom Provider.
  3. Enter your Endpoint URL (e.g. http://localhost:11434/v1).
  4. If applicable, enter your Bearer token.
  5. Enter the model ID manually (e.g. llama3).
  6. Save and select it as your default provider.
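To sanity-check the values you enter in the steps above, it can help to see what TransFlex will send on your behalf: an OpenAI-compatible server expects a POST to `<endpoint>/chat/completions` with an optional `Authorization: Bearer …` header. The sketch below builds such a request using only the Python standard library; the endpoint URL and model ID are the examples from the steps (a local Ollama server), not values specific to TransFlex.

```python
import json
from urllib import request

# Example values from the steps above; replace with your own endpoint,
# token, and model ID.
ENDPOINT_URL = "http://localhost:11434/v1"  # e.g. a local Ollama server
BEARER_TOKEN = ""                           # leave empty if no auth is needed
MODEL_ID = "llama3"

def build_chat_request(prompt: str) -> request.Request:
    """Build an OpenAI-compatible /chat/completions request."""
    body = json.dumps({
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    if BEARER_TOKEN:  # the Bearer token is optional for local servers
        headers["Authorization"] = f"Bearer {BEARER_TOKEN}"
    return request.Request(
        f"{ENDPOINT_URL}/chat/completions",
        data=body,
        headers=headers,
        method="POST",
    )

req = build_chat_request("Hello!")
print(req.full_url)   # http://localhost:11434/v1/chat/completions
```

If a request shaped like this (sent with `urllib.request.urlopen(req)` or `curl`) succeeds against your server, the same endpoint URL, token, and model ID should work when entered into the Custom Provider form.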

You can set a Global Default Provider under Settings > Providers. For advanced users, each Custom Preset can also override the global default and target a different model or API provider.