
Publish results for ignored benchmarks separately #41

Open
kumaraditya303 opened this issue Nov 12, 2024 · 3 comments

Comments


kumaraditya303 commented Nov 12, 2024

Publish results separately for benchmarks like asyncio_tcp that are ignored by default.

cc @mdboom

Contributor

mdboom commented Dec 16, 2024

If the benchmarks are ignored, they aren't even run, so fixing this would increase our runtimes. One reason we ignore asyncio_tcp is that it's pretty non-deterministic, but the other is just to save time on something that doesn't help us optimize the interpreter.

That said, one middle ground might be to run these on the weeklies rather than every single run.
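The "run on the weeklies" middle ground maps naturally onto a scheduled CI trigger. A minimal sketch of a hypothetical GitHub Actions workflow (the workflow name, benchmark selection, and output path are illustrative assumptions, not bench_runner's actual configuration; the cron syntax and pyperformance's `-b`/`-o` options are real):

```yaml
# Hypothetical workflow: run the normally-ignored benchmarks once a week
# instead of on every commit, so regressions still surface eventually.
name: weekly-ignored-benchmarks

on:
  schedule:
    - cron: "0 3 * * 0"   # every Sunday at 03:00 UTC
  workflow_dispatch: {}    # also allow manual runs

jobs:
  bench:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run the ignored benchmarks (illustrative selection)
        run: |
          pip install pyperformance
          # Select only the benchmarks excluded from the per-commit runs
          pyperformance run -b asyncio_tcp -o weekly-results.json
```

Results written this way could then be published to a separate results page, keeping the noisy weekly numbers out of the per-commit comparisons.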

@kumaraditya303
Author

Yeah, having these run even on a weekly basis would help avoid possible performance regressions going unnoticed.

Contributor

mdboom commented Dec 18, 2024

Ok -- I've made an issue to track this work here: faster-cpython/bench_runner#320
