
feat(provider): add AI/ML API provider#863

Open
D1m7asis wants to merge 3 commits into Gitlawb:main from aimlapi:feature/aimlapi-provider-integration

Conversation

@D1m7asis

@D1m7asis D1m7asis commented Apr 23, 2026

Summary

This PR adds first-class AI/ML API support as an OpenAI-compatible provider across setup, runtime, model discovery, and provider profile flows. From #835

What changed

  • added a dedicated AI/ML API provider preset in the provider picker
  • added AI/ML API setup docs and README onboarding instructions
  • introduced provider-specific helpers in src/providers/aimlapi/
  • added support for AIMLAPI_API_KEY as a provider-specific credential
  • synced AIMLAPI_API_KEY into OpenAI-compatible runtime flow when OPENAI_BASE_URL points to AI/ML API
  • added AI/ML API attribution headers for requests sent to api.aimlapi.com
  • improved /models discovery for AI/ML API by:
    • filtering to chat-completions models
    • deduplicating model IDs
    • mapping metadata into user-friendly labels/descriptions
  • updated provider summary, validation, bootstrap, profile persistence, and secret handling to recognize AI/ML API
  • added focused tests for provider preset rendering, credential fallback, attribution headers, model discovery, and preset defaults
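The /models discovery improvements above (filter to chat-completions models, dedupe IDs, map metadata to labels) can be sketched roughly as follows. The interfaces and field names here are illustrative, not the exact shapes used in src/providers/aimlapi/:

```typescript
// Sketch of the model-catalog mapping: keep chat models, drop duplicate
// IDs, and turn raw metadata into picker-friendly options.
interface AimlapiModelEntry {
  id: string;
  type?: string;                                  // e.g. "chat-completion"
  info?: { name?: string; description?: string }; // assumed metadata shape
}

interface ModelOption {
  value: string;        // model ID sent to the API
  label: string;        // user-friendly display name
  description?: string;
}

function mapModelCatalog(entries: AimlapiModelEntry[]): ModelOption[] {
  const seen = new Set<string>();
  const options: ModelOption[] = [];
  for (const entry of entries) {
    // Filter: only chat-completions models reach the picker.
    if (entry.type && entry.type !== "chat-completion") continue;
    // Dedupe: the catalog can list the same ID more than once.
    if (seen.has(entry.id)) continue;
    seen.add(entry.id);
    options.push({
      value: entry.id,
      label: entry.info?.name ?? entry.id,
      description: entry.info?.description,
    });
  }
  return options;
}
```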

Why it changed

OpenClaude already supports OpenAI-compatible providers, but AI/ML API needed a smoother first-class setup path and provider-aware behavior.

This change makes AI/ML API easier to configure, ensures the correct credential is used without forcing users to rename env vars, and improves model selection UX by showing only relevant chat-capable models with better metadata.
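The "correct credential without renaming env vars" behavior amounts to a base-URL-aware key lookup. A minimal sketch, assuming URL-hostname detection (function names are illustrative, not the PR's exact exports):

```typescript
// When OPENAI_BASE_URL targets AI/ML API, prefer AIMLAPI_API_KEY but
// still accept OPENAI_API_KEY; otherwise use the generic key only.
const AIMLAPI_HOST = "api.aimlapi.com";

function isAimlapiBaseUrl(baseUrl: string | undefined): boolean {
  if (!baseUrl) return false;
  try {
    return new URL(baseUrl).hostname === AIMLAPI_HOST;
  } catch {
    return false; // not a parseable URL, so not AI/ML API
  }
}

function resolveApiKey(
  env: Record<string, string | undefined>,
): string | undefined {
  if (isAimlapiBaseUrl(env.OPENAI_BASE_URL)) {
    // Users keep their native env var; no renaming required.
    return env.AIMLAPI_API_KEY ?? env.OPENAI_API_KEY;
  }
  return env.OPENAI_API_KEY;
}
```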

Impact

  • user-facing impact:

    • users can now select AI/ML API directly in /provider
    • users can configure AI/ML API with AIMLAPI_API_KEY instead of only OPENAI_API_KEY
    • model discovery is cleaner and more useful for AI/ML API users
    • README and dedicated setup docs now include explicit AI/ML API onboarding
  • developer/maintainer impact:

    • adds a provider-specific module for AI/ML API behavior
    • keeps OpenAI-compatible support generic while allowing targeted handling for AI/ML API
    • extends provider profile, validation, bootstrap, and secret flows with explicit AI/ML API support
    • adds regression coverage for the new provider path
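The targeted handling mentioned above includes attribution headers that are attached only when a request is bound for api.aimlapi.com. The PR does not show the actual header names, so "X-Aimlapi-App" below is a placeholder; this is a sketch of the gating, not the real implementation:

```typescript
// Decorate only requests bound for AI/ML API; other OpenAI-compatible
// providers keep the generic request shape.
function getAttributionHeaders(baseUrl: string): Record<string, string> {
  let isAimlapi = false;
  try {
    isAimlapi = new URL(baseUrl).hostname === "api.aimlapi.com";
  } catch {
    // unparseable base URL: treat as non-AI/ML API
  }
  // "X-Aimlapi-App" is a placeholder header name for illustration.
  return isAimlapi ? { "X-Aimlapi-App": "openclaude" } : {};
}
```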

Testing

  • bun run build
  • bun run smoke
  • focused tests:
    • src/providers/aimlapi/index.test.ts
    • src/components/ProviderManager.test.tsx
    • src/services/api/openaiShim.test.ts
    • src/utils/providerDiscovery.test.ts
    • src/utils/providerProfiles.test.ts

Notes

  • provider/model path tested:

    • /provider → AI/ML API
    • env-based setup with:
      • CLAUDE_CODE_USE_OPENAI=1
      • AIMLAPI_API_KEY=...
      • OPENAI_BASE_URL=https://api.aimlapi.com/v1
      • OPENAI_MODEL=gpt-4o
    • model discovery via GET /models against https://api.aimlapi.com/v1
  • screenshots attached (if UI changed):

    • provider picker with AI/ML API preset
  • optional: provider summary showing AI/ML API
  • follow-up work or known limitations:
    • current model discovery intentionally focuses on chat-completions models used by OpenClaude's coding workflow
    • other AI/ML API modalities are documented but are outside the current provider loop

Introduce first-class support for the AI/ML API (aimlapi) OpenAI-compatible provider. Changes include:

  • AIMLAPI_API_KEY env handling plus docs/README updates, a new docs/aimlapi-setup.md, and .env.example entries
  • provider preset 'aimlapi' and a UI entry in ProviderManager
  • providerConfig: DEFAULT_AIMLAPI_BASE_URL and the isAimlapiBaseUrl helper
  • providerDiscovery: a new listOpenAICompatibleModelOptions with AIMLAPI-specific headers and metadata filtering
  • openaiShim: AIMLAPI header attribution and AIMLAPI_API_KEY fallback to OPENAI_API_KEY
  • bootstrap updated to consume model options and the fallback API key
  • providerProfiles/providerProfile/providerSecrets/providerValidation updates to include AIMLAPI_API_KEY in profiles, secrets, apply/clear flows, and validation messages
  • tests added/updated to cover aimlapi behaviors

These changes enable using AI/ML API endpoints (https://api.aimlapi.com/v1) seamlessly as an OpenAI-compatible provider.
Introduce a new AIMLAPI provider module (src/providers/aimlapi/index.ts) with helpers, constants, model-mapping, and tests. Wire AIMLAPI into discovery, provider picker, bootstrap, OpenAI shim, provider profiles, and validation so attribution headers, API-key resolution, environment syncing, and model catalog parsing are centralized. Replace scattered AIMLAPI logic and hardcoded strings with exported utilities and constants (AIMLAPI_LABEL, AIMLAPI_PROVIDER_PRESET_OPTION, getAimlapiApiKey, getAimlapiAttributionHeaders, getAimlapiOpenAICompatibleApiKey, syncAimlapiOpenAIEnv, mapAimlapiModelCatalog, etc.), and adjust model description formatting. This consolidates AIMLAPI-specific behavior and ensures consistent handling across the codebase.
Copilot AI review requested due to automatic review settings April 23, 2026 16:17
Contributor

Copilot AI left a comment


Pull request overview

This PR adds first-class support for the AI/ML API service as an OpenAI-compatible provider, spanning provider selection UX, env/profile handling, runtime request behavior, and model discovery.

Changes:

  • Adds an AI/ML API provider preset, plus env/profile support for AIMLAPI_API_KEY as a first-class credential.
  • Adds AI/ML API–specific runtime behavior (attribution headers + credential fallback/sync) when targeting api.aimlapi.com.
  • Improves /models discovery for AI/ML API (chat-completions filtering, dedupe, and metadata mapping) and adds docs/tests for the new provider.

Reviewed changes

Copilot reviewed 19 out of 19 changed files in this pull request and generated 1 comment.

Summary per file:

  • src/utils/providerValidation.ts: Accepts AIMLAPI_API_KEY as an alternative credential when the base URL is AI/ML API and improves missing-key messaging.
  • src/utils/providerSecrets.ts: Treats AIMLAPI_API_KEY as a secret for sanitization/redaction.
  • src/utils/providerProfiles.ts: Adds aimlapi preset defaults and ensures profiles clear/apply AIMLAPI_API_KEY where appropriate.
  • src/utils/providerProfiles.test.ts: Adds tests for AI/ML API preset defaults and credential fallback behavior.
  • src/utils/providerProfile.ts: Adds AIMLAPI_API_KEY to profile env/secret key sets and associated types.
  • src/utils/providerDiscovery.ts: Adds AI/ML API attribution headers and model-option discovery with metadata mapping; introduces listOpenAICompatibleModelOptions.
  • src/utils/providerDiscovery.test.ts: Tests AI/ML API model option mapping, filtering, deduplication, and headers.
  • src/services/api/providerConfig.ts: Allows additional model option caching for AI/ML API base URLs (in addition to local providers).
  • src/services/api/openaiShim.ts: Syncs AIMLAPI credentials into the OpenAI flow for AI/ML API base URLs and adds attribution headers.
  • src/services/api/openaiShim.test.ts: Tests AI/ML API attribution headers and AIMLAPI_API_KEY fallback behavior.
  • src/services/api/bootstrap.ts: Uses model-option discovery (with metadata) and attempts AIMLAPI credential fallback for model discovery.
  • src/providers/aimlapi/index.ts: New provider module defining constants, base URL detection, attribution headers, credential helpers, preset defaults, and model mapping.
  • src/providers/aimlapi/index.test.ts: Unit tests for the new AI/ML API provider helper module.
  • src/components/ProviderManager.tsx: Adds the AI/ML API preset to the provider picker.
  • src/components/ProviderManager.test.tsx: Verifies AI/ML API appears in the preset picker output.
  • src/commands/provider/provider.tsx: Updates provider summary labeling to identify AI/ML API specifically.
  • README.md: Adds onboarding instructions and links for AI/ML API setup.
  • docs/aimlapi-setup.md: Adds dedicated AI/ML API setup documentation.
  • .env.example: Documents the AI/ML API env var configuration option.
Comments suppressed due to low confidence (1)

src/services/api/bootstrap.ts:153

  • getAdditionalModelOptionsCacheScope() now allows AI/ML API (non-local) to use this discovery path, but the log message still says "Local OpenAI model discovery failed". This makes debugging confusing when the failing request was against a remote AI/ML API endpoint; consider updating the wording to reflect "OpenAI-compatible" (or include the resolved base URL/provider label).
  if (models === null) {
    logForDebugging('[Bootstrap] Local OpenAI model discovery failed')
    return null
  }


Comment thread src/services/api/bootstrap.ts
@kevincodex1
Contributor

hello, thank you for this. some tests failed, kindly take a look

Rework bootstrap model discovery to detect AIMLAPI base URLs and use AIMLAPI_API_KEY as a fallback when OPENAI_API_KEY is not set. Removed the direct getAimlapiApiKey import and instead determine the discovery API key at runtime based on resolveProviderRequest() and isAimlapiBaseUrl(). Also improved the debug log to include the baseUrl for failed discoveries. Added an explicit re-export of isAimlapiBaseUrl from providerConfig for external use.
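The rework described above can be sketched roughly as follows. The shape of resolveProviderRequest()'s result is assumed here, and the function names are illustrative, not the exact ones in bootstrap.ts:

```typescript
// Pick the discovery API key at runtime from the resolved provider
// request, falling back to AIMLAPI_API_KEY only for AI/ML API hosts.
interface ProviderRequest {
  baseUrl: string; // assumed field on resolveProviderRequest()'s result
}

function isAimlapiBaseUrl(baseUrl: string): boolean {
  try {
    return new URL(baseUrl).hostname === "api.aimlapi.com";
  } catch {
    return false;
  }
}

function pickDiscoveryApiKey(
  request: ProviderRequest,
  env: Record<string, string | undefined>,
): string | undefined {
  // Prefer the generic key; fall back to AIMLAPI_API_KEY only when the
  // resolved base URL targets AI/ML API.
  if (env.OPENAI_API_KEY) return env.OPENAI_API_KEY;
  return isAimlapiBaseUrl(request.baseUrl) ? env.AIMLAPI_API_KEY : undefined;
}

function formatDiscoveryFailureLog(baseUrl: string): string {
  // Include the base URL so remote failures are distinguishable from
  // local ones (addresses the Copilot comment about the old wording).
  return `[Bootstrap] OpenAI-compatible model discovery failed (${baseUrl})`;
}
```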
@D1m7asis
Author

hello, thank you for this. some tests failed, kindly take a look

Thanks!

Collaborator

@Vasanthdev2004 Vasanthdev2004 left a comment


Thanks for the PR. I reviewed the current head as a full review.

Verdict: Needs changes

Blocking issue:

  1. src/services/api/providerConfig.ts and src/services/api/bootstrap.ts now route AI/ML API through the bootstrap additional-model discovery path, which means startup/bootstrap can make an authenticated background GET https://api.aimlapi.com/v1/models request.
  2. The existing nonessential-traffic/privacy guard is only applied in fetchBootstrapAPI(), not in the OpenAI-compatible bootstrap branch. Before this change that branch was effectively for local providers; on the current head it now reaches a remote third-party host even when nonessential traffic is disabled. That is a trust-boundary/privacy regression.

Non-blocking notes:

  • I did not find a separate blocker in the provider preset wiring, host detection, header injection, or AIMLAPI/OpenAI key fallback logic.
  • Saved startup profiles still serialize through the generic OPENAI_API_KEY path rather than preserving AIMLAPI_API_KEY end-to-end.
  • I am also not seeing any GitHub check-runs or statuses on the current head, so there is no CI signal attached to this review.

Happy to re-review once the AI/ML API model discovery path is either explicitly user-driven or gated behind the existing nonessential-traffic/privacy rules.
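One way the gating the reviewer asks for could look, as a hypothetical sketch: the setting name "allowNonessentialTraffic" and the function names below are illustrative, not the project's actual privacy-guard API.

```typescript
// Hypothetical guard: local providers and explicit user actions are
// always allowed; remote background discovery needs the opt-in.
interface PrivacySettings {
  allowNonessentialTraffic: boolean; // illustrative setting name
}

function isLocalHost(baseUrl: string): boolean {
  try {
    const host = new URL(baseUrl).hostname;
    return host === "localhost" || host === "127.0.0.1";
  } catch {
    return false;
  }
}

function mayAutoDiscoverModels(
  baseUrl: string,
  settings: PrivacySettings,
  userInitiated: boolean,
): boolean {
  if (userInitiated || isLocalHost(baseUrl)) return true;
  // Remote background traffic (e.g. GET /models against api.aimlapi.com)
  // only proceeds when nonessential traffic is allowed.
  return settings.allowNonessentialTraffic;
}
```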

@kevincodex1
Contributor

hello @D1m7asis, thank you for this. We paused the provider integration to give way to this PR: #910. Once it's merged you'll need to rewrite the approach, as it will be a more optimized one.

@D1m7asis
Author

okay thanks. I'll wait

@jatmn
Contributor

jatmn commented May 2, 2026

#910 merged, please rebase and fix conflicts.
note: providers changed a lot

@Vasanthdev2004
Copy link
Copy Markdown
Collaborator

I would not re-review this current head yet because #910 has now landed and changed the provider architecture quite a lot.

The useful next step is to rebase/rework this against current main using the descriptor-backed provider route shape instead of the older provider-specific wiring. After that, the main thing I would re-check is the earlier trust-boundary concern: AI/ML API model discovery should not introduce authenticated remote/background traffic unless it is user-driven or gated by the existing nonessential-traffic/privacy rules.

Once the branch is rebased and conflicts are resolved, happy to do a fresh review on the new shape.

