
fix(session): server always generates message ID, store client-provided ID as clientMessageID#26079

Open
klly14 wants to merge 2 commits into anomalyco:dev from klly14:fix/server-generated-message-id-race

Conversation


@klly14 klly14 commented May 6, 2026

Issue for this PR

Closes #11869

Type of change

  • Bug fix
  • New feature
  • Refactor / code improvement
  • Documentation

What does this PR do?

When a caller passes a custom `messageID` to `prompt`, OpenCode used it directly:

id: input.messageID ?? MessageID.ascending(),

Passing an ID takes a different async code path than the no-ID case. With openai-compatible providers like LiteLLM, streaming starts fast enough that by the time the stream processor attaches, the first `text-start` chunk has already come and gone. The Vercel AI SDK tracks text parts by ID in `activeTextContent` — `text-start` registers the entry, `text-delta` looks it up. If `text-start` is dropped because the processor wasn't ready yet, `text-delta` finds nothing and throws:

text part chatcmpl-xxx not found

The no-`messageID` path doesn't hit this because it's synchronous — the processor is always attached before LiteLLM sends anything.
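The failure mode can be illustrated with a minimal sketch (not the actual AI SDK source — `activeTextContent` is the real map's name, but everything else here is illustrative): `text-start` registers a text part by ID, and any `text-delta` for an unregistered ID throws.

```typescript
// Sketch of why a missed `text-start` chunk makes later `text-delta`
// chunks throw. If the processor attaches after `text-start` has already
// streamed past, the map entry is never created.
type Chunk =
  | { type: "text-start"; id: string }
  | { type: "text-delta"; id: string; delta: string };

const activeTextContent = new Map<string, string>();

function process(chunk: Chunk): void {
  if (chunk.type === "text-start") {
    // Registers the text part; deltas can only append to a registered part.
    activeTextContent.set(chunk.id, "");
  } else {
    const existing = activeTextContent.get(chunk.id);
    if (existing === undefined) {
      // The error reported in this PR: delta arrived for an ID that
      // was never registered because `text-start` was dropped.
      throw new Error(`text part ${chunk.id} not found`);
    }
    activeTextContent.set(chunk.id, existing + chunk.delta);
  }
}

// Happy path: `text-start` is seen before any deltas.
process({ type: "text-start", id: "chatcmpl-xxx" });
process({ type: "text-delta", id: "chatcmpl-xxx", delta: "hi" });
```

Skipping the `text-start` call and sending only the delta reproduces the `text part chatcmpl-xxx not found` error described above.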

I looked at whether an `await` could fix it. `createUserMessage` is already awaited before the loop. The race is inside `streamText`'s `eventProcessor` TransformStream, which is internal to the AI SDK — there's nowhere in OpenCode's code to insert an await that reliably closes the window.

The fix: always generate the ID server-side, store whatever the caller sent as `clientMessageID`:

// Before
id: input.messageID ?? MessageID.ascending(),

// After
id: MessageID.ascending(),
clientMessageID: input.messageID,

One code path now. Processor always attaches before streaming starts. Callers can still correlate their ID via `clientMessageID` on the `message.updated` SSE event. This also fixes the clock skew stall from #24476 / #25024 as a side effect — client-generated IDs can arrive out of order when clocks differ.
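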
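For callers, correlation might look like the sketch below. The exact `message.updated` payload shape is an assumption here, not OpenCode's published schema — only the `clientMessageID` field is what this PR adds.

```typescript
// Hypothetical sketch: mapping a client-chosen ID back to the canonical
// server-generated one via the `message.updated` SSE event. The event
// payload shape is assumed for illustration.
interface MessageUpdated {
  type: "message.updated";
  properties: {
    info: { id: string; clientMessageID?: string };
  };
}

// Callbacks keyed by the client's own ID, registered before prompting.
const pending = new Map<string, (serverID: string) => void>();

function onEvent(event: MessageUpdated): void {
  const { id, clientMessageID } = event.properties.info;
  if (clientMessageID && pending.has(clientMessageID)) {
    // Resolve the caller's ID to the server-generated canonical ID.
    pending.get(clientMessageID)!(id);
    pending.delete(clientMessageID);
  }
}

// Usage: register interest, then handle the SSE event when it arrives.
let resolved = "";
pending.set("my-client-id", (serverID) => {
  resolved = serverID;
});
onEvent({
  type: "message.updated",
  properties: { info: { id: "msg_01", clientMessageID: "my-client-id" } },
});
```

The point is that nothing the caller does depends on its ID being the canonical one — it only needs to see its ID echoed back alongside the server's.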

How did you verify your code works?

Added tests to `test/session/prompt.test.ts` covering:

  • When `messageID` is passed, the stored message gets a server-generated `id`, not the caller's value
  • The caller's value is saved as `clientMessageID`
  • The assistant's `parentID` references the server-generated `id`
  • When no `messageID` is passed, `clientMessageID` is `undefined`

All pass. There's one pre-existing timeout failure in the suite unrelated to this change.
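The shape of the assertions is roughly the sketch below. `createUserMessage` here is a stand-in for illustration, not OpenCode's real API surface, and the random suffix stands in for `MessageID.ascending()`.

```typescript
// Hypothetical sketch of the behavior the new tests assert: the
// caller's messageID never becomes the canonical id, but is preserved
// on the message as clientMessageID.
interface UserMessage {
  id: string;
  clientMessageID?: string;
}

function createUserMessage(input: { messageID?: string }): UserMessage {
  return {
    // Stand-in for MessageID.ascending(): always server-generated.
    id: "msg_" + Math.random().toString(36).slice(2),
    clientMessageID: input.messageID,
  };
}

const msg = createUserMessage({ messageID: "client-123" });
console.assert(msg.id !== "client-123"); // id is server-generated
console.assert(msg.clientMessageID === "client-123"); // caller's ID preserved

const anonymous = createUserMessage({});
console.assert(anonymous.clientMessageID === undefined);
```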

Screenshots / recordings

N/A — not a UI change.

Checklist

  • I have tested my changes locally
  • I have not included unrelated changes in this PR

…ed ID as clientMessageID

When a client passes a custom messageID via prompt_async, OpenCode was
using it directly as the user message ID. This triggered a different
internal setup path that ran asynchronously, creating a race condition
where streaming chunks from LiteLLM (openai-compatible provider) arrived
before the text part container was created — causing "text part
chatcmpl-... not found" errors and dropped chunks.

Fix: always generate the message ID server-side using MessageID.ascending().
Store the client-provided messageID as clientMessageID on the UserMessage
for external correlation only.

This matches the behaviour when no messageID is provided, ensuring the
eager setup path is always used regardless of whether the client sends
an ID.
@github-actions github-actions Bot added the needs:compliance and needs:issue labels May 6, 2026
@github-actions
Contributor

github-actions Bot commented May 6, 2026

Thanks for your contribution!

This PR doesn't have a linked issue. All PRs must reference an existing issue.

Please:

  1. Open an issue describing the bug/feature (if one doesn't exist)
  2. Add Fixes #<number> or Closes #<number> to this PR description

See CONTRIBUTING.md for details.

@github-actions
Contributor

github-actions Bot commented May 6, 2026

The following comment was made by an LLM; it may be inaccurate:

Based on my search, I found a related PR:

Related PR:

The current PR (#26079) explicitly states it "Closes the same underlying issue as #11869," so this is an intentional follow-up rather than a true duplicate. However, if #11869 is still open, there may be overlapping changes.

No other significant duplicate PRs were found addressing the same messageID handling problem.

…r-generated message id

Add tests to prompt.test.ts that cover the fix in prompt.ts where
input.messageID is stored as clientMessageID instead of being used as
the canonical message id:

- When a caller provides messageID, the stored user message uses a
  server-generated id (not the caller-provided one)
- The caller-provided messageID is preserved in clientMessageID
- The assistant's parentID points to the server-generated user message
  id, not the client-provided one
- When no messageID is provided, clientMessageID is undefined
@github-actions github-actions Bot removed the needs:compliance label May 6, 2026
@github-actions
Contributor

github-actions Bot commented May 6, 2026

Thanks for updating your PR! It now meets our contributing guidelines. 👍

@github-actions github-actions Bot added and removed the needs:compliance label May 6, 2026
@github-actions
Contributor

github-actions Bot commented May 6, 2026

Thanks for updating your PR! It now meets our contributing guidelines. 👍

