
Add configuration workflow job, which automatically configures the rest of the workflow run #43


Open · wants to merge 4 commits into master from automatic-config-workflow

Conversation

chrisjbillington (Collaborator)

And other changes aimed at minimising deployment and maintenance overhead. Ideally, one shouldn't need to update this workflow file for new Python versions, and should not need to edit it at all for each repository it is deployed to - the workflow file can be deployed as-is.

  • No longer need to specify PURE or NOARCH - these will be introspected during the configure job (see the sketch after this list).
  • No longer need to specify the package name, it will be introspected during the configure job.
  • No longer need to specify which versions of Python to build for in the case of a non-pure wheel or non-noarch conda package - the configure job will set variables to target all currently-supported Python versions.
  • Use cibuildwheel to build impure wheels on all platforms, instead of building only manylinux wheels for Linux (an approach that is now deprecated).
  • No longer need to adjust list of OSs to run jobs on - this will be determined automatically based on pure/noarch status.
  • No longer need to specify the repository in job-level if-statements to prevent jobs running on forks. Instead, jobs simply do run on forks, but upload steps are skipped if the relevant secrets are absent.
  • Removed the "ignore tags" step, which was designed to prevent the workflow running twice on a tagged commit and uploading twice to the Anaconda test label, resulting in an error. Instead, we upload releases to real PyPI and the Anaconda main label, and non-releases to test PyPI and the Anaconda test label, so there is no duplication. We also add the --skip-existing flag to Anaconda uploads: sometimes you just need to re-run a workflow, and that should be idempotent rather than crashing simply because some previous uploads succeeded.
  • Add if-no-files-found: error to all Upload Artifact actions, with step-level if statements so they only run when we expect output. This reduces the incidence of runs that produced no artifacts nonetheless appearing successful.
  • Fix macOS impure conda builds. These previously didn't work at all on the newer macOS runners, because conda's compiler toolchains are super duper out of date. Instead, we use conda-forge toolchains by installing miniforge instead of miniconda.
  • Use bash to build conda packages on Windows, the same as on Linux/macOS. It was never necessary to use cmd.exe in the first place; I think doing so just masked the path-length-limit problem for some packages, which we now address by setting the conda build root to a directory with a very short filepath.
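
To illustrate, here is a minimal sketch of what such a configure job could look like. The step names, the introspection commands, and the Python version list are assumptions for illustration, not necessarily what the workflow actually does:

```yaml
jobs:
  configure:
    runs-on: ubuntu-latest
    outputs:
      pure: ${{ steps.introspect.outputs.pure }}
      pkgname: ${{ steps.introspect.outputs.pkgname }}
      pythons: ${{ steps.introspect.outputs.pythons }}
    steps:
      - uses: actions/checkout@v4
      - name: Introspect package
        id: introspect
        run: |
          # Build a wheel and inspect its tags: a '-none-any' wheel is pure.
          pip install build
          python -m build --wheel --outdir dist .
          wheel=$(ls dist/*.whl)
          case "$wheel" in
            *-none-any.whl) echo "pure=true" >> "$GITHUB_OUTPUT" ;;
            *)              echo "pure=false" >> "$GITHUB_OUTPUT" ;;
          esac
          echo "pkgname=$(basename "$wheel" | cut -d- -f1)" >> "$GITHUB_OUTPUT"
          # All currently-supported Python versions, maintained here only:
          echo 'pythons=["3.9", "3.10", "3.11", "3.12", "3.13"]' >> "$GITHUB_OUTPUT"

  build:
    needs: configure
    strategy:
      matrix:
        # Pure packages build once on Linux; impure packages build on all
        # OSs, for every supported Python version:
        os: ${{ needs.configure.outputs.pure == 'true' && fromJSON('["ubuntu-latest"]') || fromJSON('["ubuntu-latest", "windows-latest", "macos-latest"]') }}
        python: ${{ needs.configure.outputs.pure == 'true' && fromJSON('["3.13"]') || fromJSON(needs.configure.outputs.pythons) }}
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      # ...build steps elided...
      - name: Upload Artifact
        uses: actions/upload-artifact@v4
        with:
          name: dist-${{ matrix.os }}-${{ matrix.python }}
          path: dist/
          if-no-files-found: error
```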

Changes needed to use this workflow

It was not possible to get rid of absolutely all configuration, but at least we can obviate the need to edit the workflow file itself, which can now be identical across repositories. To have this workflow do Anaconda uploads, you'll need to set the ANACONDA_USER variable in your repository or organisation's Actions variables (similar to how secrets are set). I've already done this for the labscript-suite organisation.
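
For instance, the Anaconda upload step might look something like the following sketch. The secret name ANACONDA_API_TOKEN and the package paths are assumptions; the point is the pattern of mapping the secret to an environment variable and checking it in a step-level if, which is what makes uploads silently skip on forks:

```yaml
jobs:
  upload-conda:
    runs-on: ubuntu-latest
    env:
      ANACONDA_API_TOKEN: ${{ secrets.ANACONDA_API_TOKEN }}
    steps:
      # ...downloading built conda packages and installing anaconda-client elided...
      - name: Upload to Anaconda
        # Forks won't have the secret set, so this step is simply skipped there:
        if: env.ANACONDA_API_TOKEN != ''
        run: anaconda upload --skip-existing --user "${{ vars.ANACONDA_USER }}" conda_packages/*/*
```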

chrisjbillington added a commit to philipstarkey/qtutils that referenced this pull request on Dec 6, 2024:
Add configuration workflow job, which automatically configures the rest of the workflow run

chrisjbillington force-pushed the automatic-config-workflow branch from 276779f to 5ec853f on December 6, 2024 04:40
chrisjbillington (Collaborator, Author)

Also switch to a pyproject.toml-only layout and hard-code the setuptools_scm version scheme settings. The workflow no longer changes these in CI; it's easier to just always have release-branch-semver and no-local-version all the time, in CI or otherwise.
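
For reference, the hard-coded settings in pyproject.toml presumably look something like this (version_scheme and no-local-version are setuptools_scm's documented options; the rest of the file is elided):

```toml
[tool.setuptools_scm]
version_scheme = "release-branch-semver"
local_scheme = "no-local-version"
```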

chrisjbillington force-pushed the automatic-config-workflow branch 4 times, most recently from fae56ab to 243b0b9 on December 7, 2024 00:22
hard-code release-branch-semver and no-local-version

These are no longer configured to be different in CI than locally

Drop minimum version requirements for build requirements where those
versions are now several years old

This is important for moving to PyPI attested uploads and is generally
recommended by the PyPI upload actions anyway.

We split the other steps into separate jobs for consistency
chrisjbillington force-pushed the automatic-config-workflow branch from 243b0b9 to 45ffdcd on December 7, 2024 10:48
Since we can no longer use the presence of TestPyPI and PyPI API keys to
know whether we should attempt to upload releases (i.e. that we're not
running in a fork), we instead do need some explicit configuration.

Rather than do this all in repository variables, we do it in a config
file in the workflow directory.

RELEASE_REPO is set to the repository releases should be made from
(otherwise uploads are skipped).

For consistency, ANACONDA_USER is set here too rather than as a
repository variable.

For completeness everything else is configurable there too.

Re-add the "ignore tags" step - with fixes since it was incorrect given
recent changes to the workflow `on:` configuration. It is needed due to
race conditions - we are cloning the repo in multiple jobs and a tag
could appear in between them, leading to version inconsistencies.
chrisjbillington (Collaborator, Author)

Ok, this is working, and I've used it in qtutils (https://github.com/philipstarkey/qtutils/actions/runs/12364591645) and zprocess (https://github.com/chrisjbillington/zprocess/actions/runs/12365789345).

To enable trusted publisher releases, one just needs to go to:

https://pypi.org/manage/project/qtutils/settings/publishing/

(or the same with test.pypi) and fill in the details. We are using the optional environments (named pypi and testpypi).
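
For anyone following along, a publish job using trusted publishing looks roughly like the sketch below. The job name and artifact handling are assumptions, but the environment name, the id-token permission, and the pypa publish action are the standard trusted-publishing setup:

```yaml
jobs:
  release:
    runs-on: ubuntu-latest
    environment: pypi  # must match the environment registered on PyPI
    permissions:
      id-token: write  # required for trusted (OIDC-based) publishing
    steps:
      # ...downloading the built distributions into dist/ elided...
      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
```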

Unfortunately, if we're using trusted publisher releases, we can't infer whether we should attempt to upload to PyPI/TestPyPI based on the presence of the needed secret. So we do need some per-repository configuration after all. Rather than put this in repository variables, or hard-code it within the workflow file itself, I've decided on a separate release-vars.sh that lives next to release.yml in the repository and contains the per-repository configuration. It's a bash script that is sourced by the configuration job.

I made most things configurable, even though everything other than RELEASE_REPO (the repository from which package uploads should be made; otherwise they're skipped) and ANACONDA_USER can be introspected.
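
As a sketch, a release-vars.sh along these lines is what that amounts to; the values shown are placeholders, and any variable other than RELEASE_REPO and ANACONDA_USER is hypothetical:

```sh
# release-vars.sh: per-repository configuration, sourced by the configure job.

# Uploads are only attempted when the workflow runs in this repository
# (placeholder value - set to the actual repository on deployment):
export RELEASE_REPO="some-org/some-repo"

# Anaconda user/organisation to upload conda packages to:
export ANACONDA_USER="labscript-suite"

# Everything else can be introspected; uncomment only to override, e.g.:
# export PKGNAME="some_package"
```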

So for deployment to other repos, we'll need to copy both files into other repos and modify release-vars.sh with the name of each repository.

This is simple enough that it could be scripted to roll out to master and release branches on all repos, though I'd probably make a checklist of other changes to make at the same time (a pyproject.toml sketch covering these follows the list), such as:

  • explicitly depending on setuptools_scm in runtime dependencies — necessary for __version__ to work in editable installs
  • removing wheel from build-system/requires
  • removing importlib_metadata from dependencies and requiring python ≥ 3.8, from which importlib.metadata is in the stdlib (3.7 and lower are end-of-life so this is kosher)
  • archiving all but the latest "maintenance" branches
  • simplifying declared Python version compatibility to just say "3" so we don't have to update it constantly - we aim for compatibility with all stable Python versions, no need to be more specific.
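
A hedged pyproject.toml sketch of those checklist items (the package name is a placeholder):

```toml
[build-system]
# 'wheel' removed from requires - setuptools injects it itself when needed:
requires = ["setuptools", "setuptools_scm"]
build-backend = "setuptools.build_meta"

[project]
name = "example-package"  # placeholder
dynamic = ["version"]
# setuptools_scm in runtime dependencies so __version__ works in editable installs:
dependencies = ["setuptools_scm"]
# 3.8+ has importlib.metadata in the stdlib, so no importlib_metadata backport:
requires-python = ">=3.8"
classifiers = [
    # Just "3" - we aim for compatibility with all stable Python versions:
    "Programming Language :: Python :: 3",
]
```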

I'll merge this soon if there aren't objections or additional suggestions. I've already done the trusted publisher setup for this repo.

dihm (Collaborator)

dihm commented Dec 19, 2024

Well I think this is a pretty solid upgrade. Keeping the config options localized is a decent compromise.

As a suggestion (one you have probably already considered): could we introspect whether the repo running the workflow is a fork, so we don't have to set RELEASE_REPO? A quick look turned up this SO suggestion (the comments are important as well):

https://stackoverflow.com/questions/70775445/determining-the-origin-repo-of-pull-request-in-workflow-yaml
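
For push-triggered runs that might amount to something like this step-level check (hedged - I haven't verified that the fork field is populated for all the relevant events):

```yaml
      - name: Upload release
        # The push event payload includes a 'fork' boolean on the repository:
        if: github.event.repository.fork == false
        run: echo "not a fork - uploads would happen here"
```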

Even if that works, I still like the idea of a shell config script that collects all the important things (selecting release targets etc), even if the defaults are what we want most of the time.

dihm (Collaborator)

dihm commented Apr 11, 2025

Note: for a non-pure package, you get an error during configuration on apparently any ci-helper distinfo command if setup.py is not importable. This came up when updating the labscript-c-extensions workflow, whose setup.py imports cython.
