
feat: add MiniMax provider support #492

Open
octo-patch wants to merge 1 commit into Gitlawb:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a first-class provider for OpenClaude by introducing a dedicated CLAUDE_CODE_USE_MINIMAX flag that routes through the existing OpenAI-compatible shim.

  • Add minimax to APIProvider union type and getAPIProvider() detection logic
  • Route CLAUDE_CODE_USE_MINIMAX=1 through the existing OpenAI-compatible shim (no new shim needed)
  • Default base URL: https://api.minimax.io/v1, default model: MiniMax-M2.7
  • Support MINIMAX_API_KEY for authentication (falls back to OPENAI_API_KEY)
  • Add 6 unit tests in providerConfig.minimax.test.ts
  • Document provider in .env.example
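The detection and fallback logic described in these bullets could be sketched roughly as follows. The names mirror the PR description (`APIProvider`, `getAPIProvider()`, the `CLAUDE_CODE_USE_*` flags, `MINIMAX_API_KEY`), but the concrete shapes are illustrative assumptions, not the actual OpenClaude source:

```typescript
// Illustrative sketch only; union members and branch order are assumptions.
type APIProvider = "anthropic" | "openai" | "gemini" | "github" | "codex" | "minimax";

function getAPIProvider(env: Record<string, string | undefined>): APIProvider {
  if (env.CLAUDE_CODE_USE_MINIMAX === "1") return "minimax";
  if (env.CLAUDE_CODE_USE_OPENAI === "1") return "openai";
  if (env.CLAUDE_CODE_USE_GEMINI === "1") return "gemini";
  if (env.CLAUDE_CODE_USE_GITHUB === "1") return "github";
  return "anthropic"; // assumed baseline provider
}

// Credential resolution as described: MINIMAX_API_KEY, falling back to OPENAI_API_KEY.
function resolveMiniMaxApiKey(env: Record<string, string | undefined>): string | undefined {
  return env.MINIMAX_API_KEY ?? env.OPENAI_API_KEY;
}
```

Since MiniMax reuses the OpenAI-compatible shim, only the flag check and the key fallback are new surface area in this sketch.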

Usage

CLAUDE_CODE_USE_MINIMAX=1 \
MINIMAX_API_KEY=sk-... \
openclaude

Test plan

  • 6 new unit tests pass
  • 68 total tests pass (no regressions)
  • Build succeeds: bun run build
  • Smoke test passes: bun run smoke

API Reference

Chat (OpenAI Compatible): https://platform.minimax.io/docs/api-reference/text-openai-api

- Add MiniMax chat model provider via OpenAI-compatible interface
- Add CLAUDE_CODE_USE_MINIMAX environment variable to select provider
- Add MINIMAX_API_KEY support for authentication
- Default model: MiniMax-M2.7, default base URL: https://api.minimax.io/v1
- Add unit tests for MiniMax provider configuration
- Document MiniMax provider option in .env.example
@kevincodex1 kevincodex1 requested review from Vasanthdev2004 and gnanam1990 and removed the request for gnanam1990 on April 7, 2026 at 17:20
@kevincodex1
Contributor

@Vasanthdev2004 @gnanam1990 I think this is the better approach compared to #494

@gnanam1990
Collaborator

gnanam1990 commented Apr 7, 2026

@octo-patch 1. MiniMax is not wired into the broader model-selection path yet. src/utils/model/model.ts still only treats openai / gemini / github / codex specially, so with CLAUDE_CODE_USE_MINIMAX=1 the runtime can still fall through to non-MiniMax defaults instead of MiniMax-M2.7.
2. MiniMax is also not included in the OpenAI-compatible context/output metadata path in src/utils/context.ts, so token warnings and auto-compact can still use the wrong fallback behavior for MiniMax mode.
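The fall-through in point 1 can be illustrated with a minimal sketch. None of these identifiers are the real `src/utils/model/model.ts` code; the point is only that a default-model switch with no `minimax` branch silently falls back to a generic default instead of MiniMax-M2.7:

```typescript
// Hypothetical placeholder for whatever the runtime's generic default is.
const GENERIC_DEFAULT_MODEL = "generic-default";

function defaultModelForProvider(provider: string): string {
  switch (provider) {
    case "minimax":
      return "MiniMax-M2.7"; // the branch the current head is missing
    default:
      return GENERIC_DEFAULT_MODEL; // where CLAUDE_CODE_USE_MINIMAX=1 lands today
  }
}
```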

Collaborator

@Vasanthdev2004 Vasanthdev2004 left a comment


Rechecked the latest head 229fff70ce2526b91fa6df7afc92730fa8fc3915 against current origin/main.

I still can't approve this head because MiniMax is now treated as a first-class provider selector in the request path, but the surrounding provider/model wiring is still incomplete.

  1. High: src/utils/providerProfiles.ts:255-265, 319-330, 345-358
    Provider-profile env management still does not know about CLAUDE_CODE_USE_MINIMAX or MINIMAX_API_KEY.

    That makes the MiniMax flag sticky across profile switches:

    • clearProviderProfileEnvFromProcessEnv() removes the older provider flags, but leaves CLAUDE_CODE_USE_MINIMAX and MINIMAX_API_KEY behind
    • client.ts:177-182 now treats CLAUDE_CODE_USE_MINIMAX as enough to route through createOpenAIShimClient()
    • openaiShim.ts:1108-1112 then prefers MINIMAX_API_KEY over OPENAI_API_KEY

    So a user who previously enabled MiniMax and later switches to an Anthropic or normal OpenAI profile can silently keep hitting the MiniMax/OpenAI-shim path with the wrong credential.

  2. Medium: src/utils/model/model.ts:78-87, 129-145
    MiniMax is also not wired into the broader model-selection path yet. getUserSpecifiedModelSetting() and the provider-specific defaults still only treat openai / gemini / github / codex as special OpenAI-compatible providers, not minimax.

    That means MiniMax mode can still fall through to the generic non-MiniMax defaults instead of consistently using MiniMax-M2.7 as the effective default model throughout the runtime.

  3. Medium: src/utils/context.ts:75-92
    MiniMax is not included in the OpenAI-compatible context-window path either. The code still only checks CLAUDE_CODE_USE_OPENAI, CLAUDE_CODE_USE_GEMINI, and CLAUDE_CODE_USE_GITHUB before consulting openaiContextWindows.ts.

    So in MiniMax mode, token warnings and auto-compact can still use the wrong fallback behavior instead of the OpenAI-compatible context/output metadata path this provider is otherwise using.
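The sticky-flag issue in point 1 reduces to a missing key list entry. A minimal sketch, assuming the cleanup helper simply iterates a list of provider env keys (the variable names are quoted from the review; the helper's shape is an assumption):

```typescript
// Hypothetical shape of the profile-switch cleanup. The review's point is
// that the key list must include the MiniMax variables, or the flag survives
// a profile switch and keeps routing through the OpenAI shim.
const PROVIDER_PROFILE_ENV_KEYS = [
  "CLAUDE_CODE_USE_OPENAI",
  "CLAUDE_CODE_USE_GEMINI",
  "CLAUDE_CODE_USE_GITHUB",
  "CLAUDE_CODE_USE_MINIMAX", // missing on the reviewed head
  "MINIMAX_API_KEY",         // missing on the reviewed head
];

function clearProviderProfileEnvFromProcessEnv(
  env: Record<string, string | undefined>,
): void {
  for (const key of PROVIDER_PROFILE_ENV_KEYS) {
    delete env[key];
  }
}
```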

What I rechecked on this head:

  • direct code-path review of src/utils/providerProfiles.ts, src/utils/model/model.ts, src/utils/context.ts, src/services/api/client.ts, and src/services/api/openaiShim.ts
  • bun install --frozen-lockfile
  • attempted direct Bun repro commands for the sticky-env/model/context paths, but this Windows worktree hit the recurring Bun module-resolution issue (lodash-es/memoize.js from internal imports) before those modules executed, so I am not using those runtime failures themselves as review evidence

The request-path pieces for MiniMax are in place, but the provider is not yet wired through the rest of the provider lifecycle safely enough for me to approve.
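Point 3 above amounts to a one-line extension of the OpenAI-compatible gate. A sketch, assuming that gate is a simple boolean helper (the three existing flag checks are quoted from the review; everything else is assumed shape):

```typescript
// Sketch of the missing check in the context-window path.
function usesOpenAICompatibleContextWindows(
  env: Record<string, string | undefined>,
): boolean {
  return (
    env.CLAUDE_CODE_USE_OPENAI === "1" ||
    env.CLAUDE_CODE_USE_GEMINI === "1" ||
    env.CLAUDE_CODE_USE_GITHUB === "1" ||
    env.CLAUDE_CODE_USE_MINIMAX === "1" // the check the reviewed head lacks
  );
}
```

Without the last clause, MiniMax mode skips the openaiContextWindows.ts lookup entirely, which is what produces the wrong token-warning and auto-compact fallbacks.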

@kevincodex1
Contributor

We've asked MiniMax support about this and invited one of their devs to review.

@Vasanthdev2004
Collaborator

@kevincodex1 sounds cool!

@Meetpatel006
Contributor

@kevincodex1, so will we be able to use the MiniMax model for free?

Collaborator

@gnanam1990 gnanam1990 left a comment


Thanks for this — clean provider addition. Routing through the existing OpenAI shim (no new fetch path), tight scope (6 files / 120 lines), tests cover the default/override branches, CI green.

No red flags on the third-party-path checks. Approving.

@jatmn
Collaborator

jatmn commented Apr 27, 2026

this will be impacted by #910

@jatmn
Collaborator

jatmn commented May 2, 2026

This might not be needed; MiniMax is a vendor in 0.8.0.

@Vasanthdev2004
Collaborator

Vasanthdev2004 commented May 2, 2026

This one looks superseded by current main now.

After #910 / the 0.8 provider-registry work, MiniMax is already represented as a descriptor-backed vendor/route with:

  • provider preset/default route model MiniMax-M2.7
  • MiniMax credential detection and validation
  • MiniMax route metadata/context coverage
  • focused tests around provider/profile/model behavior

This PR is still based on the older provider wiring and now conflicts in request routing, provider config, OpenAI shim env hydration, and .env.example. I would not try to merge it as-is. Best path from my side: close this as superseded, and if we still want any missing MiniMax cleanup, open a fresh tiny PR against current main for that exact gap.

