Workflows to benchmark using ASV #250
Conversation
python-project-template/{% if include_benchmarks %}benchmarks{% endif %}/asv.conf.json.jinja
This is looking pretty interesting - I left a couple of questions in the code, but I have a few others that I'm curious to get your take on:

- Is there any way to use the unit tests as the benchmarked code? Currently what is written is pretty explicit; I'm curious if we could use some introspection to list all the methods in the pytest classes and run those dynamically.
- Documentation will be key here, including access perhaps to some examples so that people can see what they are getting. It doesn't necessarily have to be part of this PR, but we should be sure to include a reasonable amount of documentation around what ASV does. We have a section, https://lincc-ppt.readthedocs.io/en/latest/practices/overview.html, that describes many of the utilities included with the PPT and attempts to describe, at a very high level, how to use them. This work should definitely be included there. Additionally, we list all the copier questions a user could encounter (https://lincc-ppt.readthedocs.io/en/latest/source/new_project.html#create-a-new-project-from-the-template); we should expand that to include any new questions introduced here.
- Is there any additional setup required for users to publish the results to GH pages for a given project? i.e., do we need to tell them to click a button in the UI before they start?
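The introspection idea above could be sketched roughly like this - a minimal, hypothetical example (the `TestExample` class and `collect_test_methods` helper are illustrative, not part of the template):

```python
import inspect

class TestExample:
    """Stand-in for a real pytest test class (hypothetical)."""

    def test_addition(self):
        assert 1 + 1 == 2

    def test_concat(self):
        assert "a" + "b" == "ab"

def collect_test_methods(cls):
    """List the test_* methods of a pytest-style class via introspection."""
    return [
        name
        for name, member in inspect.getmembers(cls, inspect.isfunction)
        if name.startswith("test_")
    ]

# The collected names could then be wrapped in time_* methods that ASV
# would discover and benchmark dynamically.
print(collect_test_methods(TestExample))
```

Whether ASV's discovery plays well with dynamically generated benchmark methods would need to be verified.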
I've left a few questions in the code and in a comment here. I would love to get your take on them when you have a moment. I'm happy to set up some time to chat face to face too if that's easier :)
python-project-template/{% if include_benchmarks %}benchmarks{% endif %}/asv.conf.json.jinja
python-project-template/{% if include_benchmarks %}benchmarks{% endif %}/format.py
...-project-template/.github/workflows/{% if include_benchmarks %}asv-main.yml{% endif %}.jinja
We should also update RTD with the new practice.
- list of options and more help text in docs/source/new_project.rst
- a new practices/benchmarking.rst with the how and why
- any initial steps someone might need to undergo to make it work the first time
python-project-template/{% if include_benchmarks %}benchmarks{% endif %}/format.py
I've extracted the format script to its own PyPI package (lf-asv-formatter) and added relevant documentation to the RTD website! Could you please have a new look? I feel like I'm so deeply involved in this feature that I might be forgetting to write essential information. Thanks :)
Creates GitHub workflows to benchmark methods using airspeed velocity (ASV). Tested on the benchmarking-asv mock project (live dashboard).

Workflows added:
- asv-main
- asv-pr
- asv-nightly
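For reviewers unfamiliar with ASV: the workflows above run benchmark modules that follow ASV's naming conventions. A minimal sketch of what such a module looks like (the class name, data, and methods here are hypothetical; ASV discovers `time_*` and `mem_*` prefixed methods and calls `setup` before each one):

```python
"""Minimal ASV benchmark module sketch (illustrative only)."""

class TimeSuite:
    def setup(self):
        # ASV calls setup() before each benchmark method.
        self.data = list(range(1000))

    def time_sum(self):
        # time_* methods are timed by ASV; the return value is ignored.
        sum(self.data)

    def mem_list(self):
        # mem_* methods return an object whose memory footprint ASV measures.
        return [0] * 1000
```

ASV runs these against each commit (or nightly, per the workflows above) and publishes the results as an HTML dashboard.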
Checklist
- This pull request targets the lincc-frameworks/python-project-template repo and not a downstream one instead.

Relates to #79.