
Prevent downstream crashes by adding x/y to through_obstacle route points#1121

Closed
AnasSarkiz wants to merge 1 commit into tscircuit:main from AnasSarkiz:main

Conversation

@AnasSarkiz
Member

No description provided.
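The PR has no description, but the title suggests the change: "through_obstacle" route points were being emitted without x/y coordinates, which crashed downstream consumers. A hypothetical TypeScript sketch of that kind of fix; the names (RoutePoint, fillMissingCoordinates) and the interpolation strategy are assumptions for illustration, not the actual tscircuit code:

```typescript
// Hypothetical sketch: route points of type "through_obstacle" sometimes
// lack x/y, crashing downstream consumers that assume coordinates exist.
interface RoutePoint {
  x?: number
  y?: number
  pointType?: "through_obstacle" | "normal"
}

// Give every point a concrete x/y by interpolating between the nearest
// neighbors that already have coordinates (midpoint when both exist,
// otherwise copy the one that does; fall back to the origin).
function fillMissingCoordinates(points: RoutePoint[]): RoutePoint[] {
  return points.map((p, i) => {
    if (p.x !== undefined && p.y !== undefined) return p
    const prev = points
      .slice(0, i)
      .reverse()
      .find((q) => q.x !== undefined && q.y !== undefined)
    const next = points
      .slice(i + 1)
      .find((q) => q.x !== undefined && q.y !== undefined)
    const x = prev && next ? (prev.x! + next.x!) / 2 : prev?.x ?? next?.x ?? 0
    const y = prev && next ? (prev.y! + next.y!) / 2 : prev?.y ?? next?.y ?? 0
    return { ...p, x, y }
  })
}

const route: RoutePoint[] = [
  { x: 0, y: 0 },
  { pointType: "through_obstacle" }, // previously emitted without x/y
  { x: 4, y: 2 },
]
console.log(fillMissingCoordinates(route)[1])
// → { pointType: 'through_obstacle', x: 2, y: 1 }
```

As the closing review comment notes, the maintainer preferred fixing the downstream consumers instead of patching the points here.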

@AnasSarkiz AnasSarkiz requested a review from seveibar May 6, 2026 05:36
@vercel

vercel Bot commented May 6, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project: capacity-node-autorouter | Deployment: Ready | Actions: Preview, Comment | Updated (UTC): May 6, 2026 5:40am


@tscircuitbot
Contributor

🏃 Benchmark This PR

Run benchmarks by commenting on this PR:

/benchmark [benchmark.sh args...]
/benchmark [solver-name|all] [scenario-limit] --concurrency <n> --effort <n> --dataset <dataset01|zdwiel|srj05> [--profile-solvers]

Everything after /benchmark is forwarded directly to ./benchmark.sh, except --profile-solvers, which adds main/PR profile comparison tables to the result comment.

Examples:

  • /benchmark -> AutoroutingPipelineSolver4, all scenarios (default concurrency uses the benchmark runner CPU count)
  • /benchmark AutoroutingPipelineSolver4 -> one solver, all scenarios
  • /benchmark all 20 -> all solvers, first 20 scenarios
  • /benchmark AutoroutingPipelineSolver4 20 --concurrency 8 -> one solver, 20 scenarios, 8 workers
  • /benchmark AutoroutingPipelineSolver4 20 --effort 2 -> one solver, 20 scenarios, 2x effort
  • /benchmark AutoroutingPipelineSolver4 --dataset zdwiel -> one solver, all scenarios, zdwiel dataset
  • /benchmark AutoroutingPipelineSolver4 --dataset srj05 -> one solver, all scenarios, srj05 dataset
  • /benchmark all 20 --dataset dataset01 --concurrency 8 -> all solvers, 20 scenarios, dataset01, 8 workers
  • /benchmark AutoroutingPipelineSolver4 20 --profile-solvers -> one solver, 20 scenarios, with profile solver comparison tables
  • /benchmark --pipeline 4 -> same as ./benchmark.sh --pipeline 4
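The forwarding rule above (everything after /benchmark goes to ./benchmark.sh, except --profile-solvers, which the bot handles itself) can be sketched in shell. This is a hypothetical illustration, not the actual workflow code:

```shell
# Sketch (assumed, not the real bot implementation): split a comment like
# "/benchmark all 20 --profile-solvers --concurrency 8" into the flag the
# bot consumes and the arguments forwarded to ./benchmark.sh.
COMMENT="/benchmark all 20 --profile-solvers --concurrency 8"
ARGS="${COMMENT#/benchmark }"   # drop the "/benchmark " command prefix

PROFILE=0
FORWARD=""
for a in $ARGS; do
  case "$a" in
    --profile-solvers) PROFILE=1 ;;        # handled by the bot itself
    *) FORWARD="$FORWARD $a" ;;            # forwarded verbatim
  esac
done

echo "profile=$PROFILE"
echo "./benchmark.sh${FORWARD}"
# prints:
#   profile=1
#   ./benchmark.sh all 20 --concurrency 8
```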

Use /update-snapshots (or /us) to run BUN_UPDATE_SNAPSHOTS=1 bun test --timeout 120_000 on the PR branch and auto-commit snapshot updates.
Use /usf to read recent failed test files, run BUN_UPDATE_SNAPSHOTS=1 bun test --timeout 120_000 <files>, auto-commit snapshot updates, then verify with bun test <files>.

Any PR whose title contains [BENCHMARK TEST] will automatically run the benchmark workflow on PR updates.

Contributor

@seveibar seveibar left a comment


Ah sorry, didn't know this is what you meant. This is confusing and we should fix it downstream.

@AnasSarkiz AnasSarkiz closed this May 6, 2026


3 participants