feat(provider): add AI/ML API provider #863
Introduce first-class support for the AI/ML API (aimlapi) OpenAI-compatible provider. Changes include:

- `AIMLAPI_API_KEY` env handling and docs/README updates; new `docs/aimlapi-setup.md` and `.env.example` entries
- provider preset `aimlapi` and a UI entry in ProviderManager
- providerConfig: `DEFAULT_AIMLAPI_BASE_URL` and an `isAimlapiBaseUrl` helper
- providerDiscovery: new `listOpenAICompatibleModelOptions` with AIMLAPI-specific headers and metadata filtering
- openaiShim: AIMLAPI header attribution and `AIMLAPI_API_KEY` fallback to `OPENAI_API_KEY`
- bootstrap updated to consume model options and the fallback API key
- providerProfiles/providerProfile/providerSecrets/providerValidation updates to include `AIMLAPI_API_KEY` in profiles, secrets, apply/clear flows, and validation messages
- tests added/updated to cover aimlapi behaviors

These changes enable using AI/ML API endpoints (https://api.aimlapi.com/v1) seamlessly as an OpenAI-compatible provider.
Introduce a new AIMLAPI provider module (`src/providers/aimlapi/index.ts`) with helpers, constants, model mapping, and tests. Wire AIMLAPI into discovery, the provider picker, bootstrap, the OpenAI shim, provider profiles, and validation so attribution headers, API-key resolution, environment syncing, and model catalog parsing are centralized. Replace scattered AIMLAPI logic and hardcoded strings with exported utilities and constants (`AIMLAPI_LABEL`, `AIMLAPI_PROVIDER_PRESET_OPTION`, `getAimlapiApiKey`, `getAimlapiAttributionHeaders`, `getAimlapiOpenAICompatibleApiKey`, `syncAimlapiOpenAIEnv`, `mapAimlapiModelCatalog`, etc.), and adjust model description formatting. This consolidates AIMLAPI-specific behavior and ensures consistent handling across the codebase.
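For illustration, a minimal sketch of how two of these helpers could look. The URL-matching rule and the header name below are assumptions for this sketch, not the actual implementations in `src/providers/aimlapi/index.ts`:

```typescript
// Sketch only; the real module may differ. The attribution header name
// here is an illustrative assumption.
const DEFAULT_AIMLAPI_BASE_URL = "https://api.aimlapi.com/v1";

// Detect whether an OpenAI-compatible base URL points at AI/ML API.
function isAimlapiBaseUrl(baseUrl: string | undefined): boolean {
  if (!baseUrl) return false;
  try {
    return new URL(baseUrl).hostname === "api.aimlapi.com";
  } catch {
    return false; // not a parseable URL
  }
}

// Attribution headers attached to AI/ML API requests.
function getAimlapiAttributionHeaders(): Record<string, string> {
  return { "X-Title": "example-client" };
}
```

Centralizing these checks means the shim, discovery, and bootstrap paths all agree on what counts as an AI/ML API endpoint.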
Pull request overview
This PR adds first-class support for the AI/ML API service as an OpenAI-compatible provider, spanning provider selection UX, env/profile handling, runtime request behavior, and model discovery.
Changes:
- Adds an `AI/ML API` provider preset, plus env/profile support for `AIMLAPI_API_KEY` as a first-class credential.
- Adds AI/ML API–specific runtime behavior (attribution headers + credential fallback/sync) when targeting `api.aimlapi.com`.
- Improves `/models` discovery for AI/ML API (chat-completions filtering, dedupe, and metadata mapping) and adds docs/tests for the new provider.
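As a sketch of the filtering and dedupe step described above (the catalog entry field names here are assumptions, not the actual AI/ML API response shape):

```typescript
// Hypothetical catalog entry shape; field names are illustrative.
interface AimlapiModelEntry {
  id: string;
  type?: string; // e.g. "chat-completion", "image", "embedding"
  info?: { description?: string };
}

// Keep only chat-completion-capable models and drop duplicate ids,
// carrying over description metadata for the picker.
function mapAimlapiModelCatalog(
  entries: AimlapiModelEntry[],
): { id: string; description?: string }[] {
  const seen = new Set<string>();
  const options: { id: string; description?: string }[] = [];
  for (const entry of entries) {
    if (entry.type !== "chat-completion") continue; // filter non-chat models
    if (seen.has(entry.id)) continue; // dedupe by model id
    seen.add(entry.id);
    options.push({ id: entry.id, description: entry.info?.description });
  }
  return options;
}
```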
Reviewed changes
Copilot reviewed 19 out of 19 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| src/utils/providerValidation.ts | Accepts AIMLAPI_API_KEY as an alternative credential when base URL is AI/ML API and improves missing-key messaging. |
| src/utils/providerSecrets.ts | Treats AIMLAPI_API_KEY as a secret for sanitization/redaction. |
| src/utils/providerProfiles.ts | Adds aimlapi preset defaults and ensures profiles clear/apply AIMLAPI_API_KEY where appropriate. |
| src/utils/providerProfiles.test.ts | Adds tests for AI/ML API preset defaults and credential fallback behavior. |
| src/utils/providerProfile.ts | Adds AIMLAPI_API_KEY to profile env/secret key sets and associated types. |
| src/utils/providerDiscovery.ts | Adds AI/ML API attribution headers and model-option discovery with metadata mapping; introduces listOpenAICompatibleModelOptions. |
| src/utils/providerDiscovery.test.ts | Tests AI/ML API model option mapping, filtering, deduplication, and headers. |
| src/services/api/providerConfig.ts | Allows additional model option caching for AI/ML API base URLs (in addition to local providers). |
| src/services/api/openaiShim.ts | Syncs AIMLAPI credentials into OpenAI flow for AI/ML API base URLs and adds attribution headers. |
| src/services/api/openaiShim.test.ts | Tests AI/ML API attribution headers and AIMLAPI_API_KEY fallback behavior. |
| src/services/api/bootstrap.ts | Uses model-option discovery (with metadata) and attempts AIMLAPI credential fallback for model discovery. |
| src/providers/aimlapi/index.ts | New provider module defining constants, base URL detection, attribution headers, credential helpers, preset defaults, and model mapping. |
| src/providers/aimlapi/index.test.ts | Unit tests for the new AI/ML API provider helper module. |
| src/components/ProviderManager.tsx | Adds AI/ML API preset to the provider picker. |
| src/components/ProviderManager.test.tsx | Verifies AI/ML API appears in the preset picker output. |
| src/commands/provider/provider.tsx | Updates provider summary labeling to identify AI/ML API specifically. |
| README.md | Adds onboarding instructions and links for AI/ML API setup. |
| docs/aimlapi-setup.md | Adds dedicated AI/ML API setup documentation. |
| .env.example | Documents AI/ML API env var configuration option. |
Comments suppressed due to low confidence (1)
`src/services/api/bootstrap.ts:153`

`getAdditionalModelOptionsCacheScope()` now allows AI/ML API (non-local) to use this discovery path, but the log message still says "Local OpenAI model discovery failed". This makes debugging confusing when the failing request was against a remote AI/ML API endpoint; consider updating the wording to reflect "OpenAI-compatible" (or include the resolved base URL/provider label).
```ts
if (models === null) {
  logForDebugging('[Bootstrap] Local OpenAI model discovery failed')
  return null
}
```
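One possible wording fix, sketched with a stubbed logger (`logForDebugging` here is a stand-in, and the sketch assumes the resolved `baseUrl` is available at that point in `bootstrap.ts`):

```typescript
// Stub logger for illustration; the real logForDebugging lives elsewhere.
const logForDebugging = (msg: string): string => msg;

// Include the resolved base URL so failures against a remote host
// (e.g. AI/ML API) are distinguishable from local-provider failures.
function describeDiscoveryFailure(baseUrl: string): string {
  return logForDebugging(
    `[Bootstrap] OpenAI-compatible model discovery failed (baseUrl=${baseUrl})`,
  );
}
```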
Hello, thank you for this. Some tests failed; kindly take a look.
Reworked bootstrap model discovery to detect AIMLAPI base URLs and use `AIMLAPI_API_KEY` as a fallback when `OPENAI_API_KEY` is not set. Removed the direct `getAimlapiApiKey` import; the discovery API key is now determined at runtime based on `resolveProviderRequest()` and `isAimlapiBaseUrl()`. Also improved the debug log to include the `baseUrl` for failed discoveries. Added an explicit re-export of `isAimlapiBaseUrl` from providerConfig for external use.
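The runtime key resolution described here might be sketched roughly like this (the helper names mirror the description above, but the shapes are assumptions, not the actual `bootstrap.ts` code):

```typescript
// Simplified base-URL check; a stand-in for the providerConfig helper.
function isAimlapiBaseUrl(baseUrl: string | undefined): boolean {
  return !!baseUrl && baseUrl.startsWith("https://api.aimlapi.com");
}

// Resolve the key used for model discovery: prefer the generic OpenAI key,
// and fall back to AIMLAPI_API_KEY only when targeting an AI/ML API endpoint.
function resolveDiscoveryApiKey(
  env: Record<string, string | undefined>,
  baseUrl: string | undefined,
): string | undefined {
  if (env.OPENAI_API_KEY) return env.OPENAI_API_KEY;
  if (isAimlapiBaseUrl(baseUrl)) return env.AIMLAPI_API_KEY;
  return undefined;
}
```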
Thanks!
Vasanthdev2004 left a comment
Thanks for the PR. I reviewed the current head as a full review.
Verdict: Needs changes
Blocking issue:
- `src/services/api/providerConfig.ts` and `src/services/api/bootstrap.ts` now route AI/ML API through the bootstrap additional-model discovery path, which means startup/bootstrap can make an authenticated background `GET https://api.aimlapi.com/v1/models` request.
- The existing nonessential-traffic/privacy guard is only applied in `fetchBootstrapAPI()`, not in the OpenAI-compatible bootstrap branch. Before this change that branch was effectively for local providers; on the current head it now reaches a remote third-party host even when nonessential traffic is disabled. That is a trust-boundary/privacy regression.
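One way to gate this, as a sketch only (the flag and helper names are assumptions, not the project's actual guard):

```typescript
// Treat loopback hosts as local; everything else is remote.
function isLocalBaseUrl(baseUrl: string): boolean {
  const host = new URL(baseUrl).hostname;
  return host === "localhost" || host === "127.0.0.1";
}

// Remote model discovery only proceeds when nonessential traffic is
// explicitly allowed; local providers are always permitted.
function shouldDiscoverModels(
  baseUrl: string,
  allowNonessentialTraffic: boolean,
): boolean {
  return isLocalBaseUrl(baseUrl) || allowNonessentialTraffic;
}
```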
Non-blocking notes:
- I did not find a separate blocker in the provider preset wiring, host detection, header injection, or AIMLAPI/OpenAI key fallback logic.
- Saved startup profiles still serialize through the generic `OPENAI_API_KEY` path rather than preserving `AIMLAPI_API_KEY` end-to-end.
- I am also not seeing any GitHub check-runs or statuses on the current head, so there is no CI signal attached to this review.
Happy to re-review once the AI/ML API model discovery path is either explicitly user-driven or gated behind the existing nonessential-traffic/privacy rules.
Okay, thanks. I'll wait.
#910 merged, please rebase and fix conflicts.
I would not re-review this current head yet because #910 has now landed and changed the provider architecture quite a lot. The useful next step is to rebase/rework this against current main using the descriptor-backed provider route shape instead of the older provider-specific wiring. After that, the main thing I would re-check is the earlier trust-boundary concern: AI/ML API model discovery should not introduce authenticated remote/background traffic unless it is user-driven or gated by the existing nonessential-traffic/privacy rules. Once the branch is rebased and conflicts are resolved, happy to do a fresh review on the new shape.
Summary
This PR adds first-class AI/ML API support as an OpenAI-compatible provider across setup, runtime, model discovery, and provider profile flows. From #835
What changed
- Adds an `AI/ML API` provider preset in the provider picker
- Adds a new provider module under `src/providers/aimlapi/`
- Adds `AIMLAPI_API_KEY` as a provider-specific credential
- Syncs `AIMLAPI_API_KEY` into the OpenAI-compatible runtime flow when `OPENAI_BASE_URL` points to AI/ML API (`api.aimlapi.com`)
- Improves `/models` discovery for AI/ML API (chat-completions filtering, deduplication, and metadata mapping)

Why it changed
OpenClaude already supports OpenAI-compatible providers, but AI/ML API needed a smoother first-class setup path and provider-aware behavior.
This change makes AI/ML API easier to configure, ensures the correct credential is used without forcing users to rename env vars, and improves model selection UX by showing only relevant chat-capable models with better metadata.
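The credential sync mentioned in the changes above might look roughly like this (a sketch; the real `syncAimlapiOpenAIEnv` may differ in shape and side effects):

```typescript
// When OPENAI_BASE_URL targets AI/ML API and no OpenAI key is set,
// mirror AIMLAPI_API_KEY into OPENAI_API_KEY so the OpenAI-compatible
// flow works without users renaming env vars. Sketch only.
function syncAimlapiOpenAIEnv(env: Record<string, string | undefined>): void {
  const base = env.OPENAI_BASE_URL ?? "";
  const targetsAimlapi = base.startsWith("https://api.aimlapi.com");
  if (targetsAimlapi && !env.OPENAI_API_KEY && env.AIMLAPI_API_KEY) {
    env.OPENAI_API_KEY = env.AIMLAPI_API_KEY;
  }
}
```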
Impact
User-facing impact:
- Select `AI/ML API` directly in `/provider`
- Configure `AIMLAPI_API_KEY` instead of only `OPENAI_API_KEY`

Developer/maintainer impact:
Testing
- `bun run build`
- `bun run smoke`
- `src/providers/aimlapi/index.test.ts`
- `src/components/ProviderManager.test.tsx`
- `src/services/api/openaiShim.test.ts`
- `src/utils/providerDiscovery.test.ts`
- `src/utils/providerProfiles.test.ts`
provider/model path tested:
- `/provider` → `AI/ML API`
- `CLAUDE_CODE_USE_OPENAI=1`
- `AIMLAPI_API_KEY=...`
- `OPENAI_BASE_URL=https://api.aimlapi.com/v1`
- `OPENAI_MODEL=gpt-4o`
- `GET /models` against `https://api.aimlapi.com/v1`

Screenshots attached (if UI changed):
- `AI/ML API` preset
- `AI/ML API`