feat: add MiniMax provider support #492
Conversation
- Add MiniMax chat model provider via OpenAI-compatible interface
- Add CLAUDE_CODE_USE_MINIMAX environment variable to select provider
- Add MINIMAX_API_KEY support for authentication
- Default model: MiniMax-M2.7, default base URL: https://api.minimax.io/v1
- Add unit tests for MiniMax provider configuration
- Document MiniMax provider option in .env.example
|
@Vasanthdev2004 @gnanam1990 I think this is the better approach compared to #494 |
|
@octo-patch 1. MiniMax is not wired into the broader model-selection path yet. src/utils/model/model.ts still only treats openai / gemini / github / codex specially, so with CLAUDE_CODE_USE_MINIMAX=1 the runtime can still fall through to non-MiniMax defaults instead of MiniMax-M2.7. |
Vasanthdev2004
left a comment
Rechecked the latest head 229fff70ce2526b91fa6df7afc92730fa8fc3915 against current origin/main.
I still can't approve this head because MiniMax is now treated as a first-class provider selector in the request path, but the surrounding provider/model wiring is still incomplete.
- High: `src/utils/providerProfiles.ts:255-265,319-330,345-358`
  Provider-profile env management still does not know about `CLAUDE_CODE_USE_MINIMAX` or `MINIMAX_API_KEY`. That makes the MiniMax flag sticky across profile switches:
  - `clearProviderProfileEnvFromProcessEnv()` removes the older provider flags, but leaves `CLAUDE_CODE_USE_MINIMAX` and `MINIMAX_API_KEY` behind
  - `client.ts:177-182` now treats `CLAUDE_CODE_USE_MINIMAX` as enough to route through `createOpenAIShimClient()`
  - `openaiShim.ts:1108-1112` then prefers `MINIMAX_API_KEY` over `OPENAI_API_KEY`

  So a user who previously enabled MiniMax and later switches to an Anthropic or normal OpenAI profile can silently keep hitting the MiniMax/OpenAI-shim path with the wrong credential.
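The sticky-env fix this points at amounts to adding the two new variables to the set that gets cleared on a profile switch. A minimal TypeScript sketch — the variable names and the `clearProviderProfileEnvFromProcessEnv()` name come from the review; the list contents and function shape are assumptions, not the repo's actual code:

```typescript
// Env vars that select a provider profile. The older flags are cleared
// today; the two MiniMax entries are the ones the review says are missing.
const PROVIDER_PROFILE_ENV_VARS = [
  "CLAUDE_CODE_USE_OPENAI",
  "CLAUDE_CODE_USE_GEMINI",
  "CLAUDE_CODE_USE_GITHUB",
  "CLAUDE_CODE_USE_CODEX",
  "CLAUDE_CODE_USE_MINIMAX", // missing today -> sticky MiniMax selection
  "MINIMAX_API_KEY",         // missing today -> wrong credential after switch
];

// Assumed shape of the helper: delete every profile-scoped var from the
// given env map so the next profile starts from a clean slate.
function clearProviderProfileEnvFromProcessEnv(
  env: Record<string, string | undefined>,
): void {
  for (const key of PROVIDER_PROFILE_ENV_VARS) {
    delete env[key];
  }
}

// Simulate a user who had MiniMax enabled, then switches profiles.
const env: Record<string, string | undefined> = {
  CLAUDE_CODE_USE_MINIMAX: "1",
  MINIMAX_API_KEY: "sk-placeholder",
  HOME: "/home/user", // unrelated var must survive
};
clearProviderProfileEnvFromProcessEnv(env);
console.log(env.CLAUDE_CODE_USE_MINIMAX, env.MINIMAX_API_KEY, env.HOME);
// → undefined undefined /home/user
```

With the two entries included, switching away from a MiniMax profile no longer leaves the shim routing flag or the MiniMax credential behind.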
- Medium: `src/utils/model/model.ts:78-87,129-145`
  MiniMax is also not wired into the broader model-selection path yet. `getUserSpecifiedModelSetting()` and the provider-specific defaults still only treat `openai` / `gemini` / `github` / `codex` as special OpenAI-compatible providers, not `minimax`. That means MiniMax mode can still fall through to the generic non-MiniMax defaults instead of consistently using `MiniMax-M2.7` as the effective default model throughout the runtime.
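What "wiring `minimax` into the defaults" could look like, sketched in TypeScript. The provider names, the `APIProvider` union, and the `MiniMax-M2.7` default are from this PR and review; the `anthropic` member, the function shape, and the placeholder model strings are assumptions for illustration, not the repo's real defaults:

```typescript
// Assumed shape of the provider union; "minimax" is the member this PR adds.
type APIProvider =
  | "anthropic" // assumed baseline provider, not named in the review
  | "openai"
  | "gemini"
  | "github"
  | "codex"
  | "minimax";

// Sketch of a provider-default lookup with the missing "minimax" case added,
// so MiniMax mode no longer falls through to the generic default.
function defaultModelFor(provider: APIProvider): string {
  switch (provider) {
    case "minimax":
      return "MiniMax-M2.7"; // the PR's documented default model
    case "openai":
    case "gemini":
    case "github":
    case "codex":
      return "openai-compatible-default"; // placeholder; real values live in model.ts
    default:
      return "anthropic-default"; // placeholder generic fallback
  }
}

console.log(defaultModelFor("minimax")); // → MiniMax-M2.7
```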
- Medium: `src/utils/context.ts:75-92`
  MiniMax is not included in the OpenAI-compatible context-window path either. The code still only checks `CLAUDE_CODE_USE_OPENAI`, `CLAUDE_CODE_USE_GEMINI`, and `CLAUDE_CODE_USE_GITHUB` before consulting `openaiContextWindows.ts`. So in MiniMax mode, token warnings and auto-compact can still use the wrong fallback behavior instead of the OpenAI-compatible context/output metadata path this provider is otherwise using.
What I rechecked on this head:
- direct code-path review of `src/utils/providerProfiles.ts`, `src/utils/model/model.ts`, `src/utils/context.ts`, `src/services/api/client.ts`, and `src/services/api/openaiShim.ts`
- `bun install --frozen-lockfile`
- attempted direct Bun repro commands for the sticky-env/model/context paths, but this Windows worktree hit the recurring Bun module-resolution issue (`lodash-es/memoize.js` from internal imports) before those modules executed, so I am not using those runtime failures themselves as review evidence
The request-path pieces for MiniMax are in place, but the provider is not yet wired through the rest of the provider lifecycle safely enough for me to approve.
|
We've asked MiniMax support about this and invited one of their devs for review. |
|
@kevincodex1 sounds cool! |
|
@kevincodex1, so will we be able to use the MiniMax model for free? |
gnanam1990
left a comment
Thanks for this — clean provider addition. Routing through the existing OpenAI shim (no new fetch path), tight scope (6 files / 120 lines), tests cover the default/override branches, CI green.
No red flags on the third-party-path checks. Approving.
|
this will be impacted by #910 |
|
This might not be needed; MiniMax is a vendor in 0.8.0 |
|
This one looks superseded by current main. After #910 / the 0.8 provider-registry work, MiniMax is already represented as a descriptor-backed vendor/route with:
This PR is still based on the older provider wiring and now conflicts in request routing, provider config, OpenAI shim env hydration, and |
Summary
Add MiniMax as a first-class provider for OpenClaude by introducing a dedicated `CLAUDE_CODE_USE_MINIMAX` flag that routes through the existing OpenAI-compatible shim.

- Add `minimax` to the `APIProvider` union type and `getAPIProvider()` detection logic
- Route `CLAUDE_CODE_USE_MINIMAX=1` through the existing OpenAI-compatible shim (no new shim needed)
- Default base URL: `https://api.minimax.io/v1`, default model: `MiniMax-M2.7`
- Support `MINIMAX_API_KEY` for authentication (falls back to `OPENAI_API_KEY`)
- Add unit tests in `providerConfig.minimax.test.ts`
- Document the new options in `.env.example`

Usage
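Enabling the provider comes down to the two environment variables this PR documents. A minimal `.env` sketch — the key value is a placeholder, and the commented defaults are the ones stated in this PR:

```shell
# Select the MiniMax provider (routes through the OpenAI-compatible shim)
CLAUDE_CODE_USE_MINIMAX=1

# MiniMax credential; per this PR it falls back to OPENAI_API_KEY if unset
MINIMAX_API_KEY=your-minimax-api-key

# Defaults when nothing else is configured (from this PR's description):
#   base URL: https://api.minimax.io/v1
#   model:    MiniMax-M2.7
```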
Test plan
- `bun run build`
- `bun run smoke`

API Reference
Chat (OpenAI Compatible): https://platform.minimax.io/docs/api-reference/text-openai-api