Providers
TransFlex is built to be model-agnostic. We believe you should be free to pick whichever model best fits your use case.
Included Base Providers
- OpenAI — gpt-4o-mini, gpt-4o, gpt-4.1, gpt-5, o3-mini, o4-mini, and more
- Anthropic — claude-haiku-4-5, claude-sonnet-4, claude-sonnet-4-5, claude-opus-4-5, and more
- Google Gemini — gemini-2.5-flash, gemini-2.5-pro, gemini-3-flash-preview, and more
Custom OpenAI-Compatible Endpoints
Connect to any OpenAI-compatible server — local instances like Ollama, corporate API proxies, or other third-party inference providers.
- In TransFlex, open Settings > Providers > Custom.
- Click Add Custom Provider.
- Enter your Endpoint URL (e.g. http://localhost:11434/v1).
- If applicable, enter your Bearer token.
- Enter the model ID manually (e.g. llama3).
- Save and select it as your default provider.
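Under the hood, every OpenAI-compatible server accepts the same HTTP request shape, which is why the steps above only need a base URL, an optional token, and a model ID. A minimal sketch of what such a request looks like, assuming an Ollama default install (the localhost:11434 URL, the placeholder "ollama" token, and the llama3 model ID are the example values from the steps above, not fixed requirements):

```python
import json

# Example values matching what you might enter in
# Settings > Providers > Custom (an Ollama default install is assumed).
BASE_URL = "http://localhost:11434/v1"
API_KEY = "ollama"  # Bearer token; many local servers accept any value

def build_chat_request(model: str, prompt: str) -> tuple[str, dict, bytes]:
    """Build the URL, headers, and JSON body for an OpenAI-compatible
    /chat/completions call. Returns them without sending anything."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, headers, body

url, headers, body = build_chat_request("llama3", "Hello!")
# To actually send it:
#   urllib.request.urlopen(urllib.request.Request(url, data=body, headers=headers))
```

Because the request shape never changes, pointing TransFlex at a different provider is just a matter of swapping the base URL, token, and model ID.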
Switching Providers
You can set a Global Default Provider under Settings > Providers. Each Custom Preset can also override the global default and target a completely different model or API provider.