From 77c64eafc3e3e587349df29054f79cc0a18def2c Mon Sep 17 00:00:00 2001
From: David Meyer
Date: Tue, 29 Apr 2025 14:00:38 -0400
Subject: [PATCH] Squashed commit of the following:
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

commit 1a1c053fa6077e9dab47fc81113b677517ffe813
Author: David Meyer
Date:   Fri Apr 11 10:49:41 2025 -0400

    Update link to Build and Release action status

commit 0a58581240f36800226071bcd6d259f1186b2bce
Author: David Meyer
Date:   Fri Apr 11 10:43:09 2025 -0400

    Bypass `setuptools-conda` incompatibility with `conda-build>=25`

commit 8c11c95a6db1616be9feb87dc6321a0c2a808ccb
Merge: 56fedba fe2a120
Author: David Meyer
Date:   Wed Apr 2 14:19:55 2025 -0400

    Merge pull request #111 from labscript-suite/pyside6

    Make splash screen widget compatible with PySide6

commit fe2a120fe72fca1039e5ccc74a74729fc5a93bae
Author: David Meyer
Date:   Wed Apr 2 13:25:39 2025 -0400

    Drop unnecessary `splash.show` overload that causes hangs in some Qt5
    situations.

commit 56fedba12b227004355579eaf9c486d4ec05ef48
Merge: cedf98d 858d17a
Author: David Meyer
Date:   Mon Mar 24 15:37:55 2025 -0400

    Merge pull request #114 from dihm/workflow_refresh

    Workflow refresh

commit 858d17adea00189d7a229669a5925c8507d0da06
Author: David Meyer
Date:   Mon Mar 24 15:34:30 2025 -0400

    Re-add setuptools-scm dependency

    Only `labscript_utils.versions.get_version` requires it, and it could be
    removed, but zprocess requires the package anyway so may as well keep
    the functionality.
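The version-introspection theme above (`labscript_utils.versions.get_version`, "introspected version" in the release workflow) can be illustrated with the standard library alone. This is a minimal sketch, not labscript's actual implementation: the real `get_version` additionally falls back to setuptools-scm for source checkouts, and the helper name here is invented for illustration.

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist_name):
    """Return the installed version of a distribution, or None if absent.

    Illustrative helper only; labscript's own get_version also consults
    setuptools-scm when running from a git checkout.
    """
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None

# A distribution that is not installed yields None rather than raising:
print(installed_version("definitely-not-a-real-distribution"))  # None
```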
commit 946e1e5b4d1e21a20c409c746ed5897ae0ed0069
Author: David Meyer
Date:   Mon Mar 24 14:23:22 2025 -0400

    Remove stale dependencies

commit 1ec7f0133290ec9f3920090fdbcfc7af16ed81c5
Author: David Meyer
Date:   Mon Mar 24 14:23:02 2025 -0400

    Update release workflow to introspected version

commit cedf98d5b3f72e6e059728f77e768485411298ce
Merge: 3ab11ce c9a5e13
Author: David Meyer
Date:   Fri Jan 17 11:24:38 2025 -0500

    Merge pull request #112 from dihm/example_compilation

    Example compilation

commit c9a5e1366156b2fa96dcb00a30dca75b4e65058c
Author: David Meyer
Date:   Fri Jan 17 09:58:19 2025 -0500

    Separate labscript-profile-create into a separate CLI interface and the
    actual creation function `create_profile`, which is automatically called
    when labscript_utils is imported and the profile does not already exist.

    Separation ensures that argparse doesn't consume erroneous CLI arguments
    just because a command incidentally imports labscript_utils (like on RTD).

commit be08cee368610c04a68e2466e130b3cfefd3eeb6
Author: David Meyer
Date:   Wed Jan 15 10:00:31 2025 -0500

    Add basic docstrings to labscript-profile submodule

commit a9927c87c543eaaa986777830ae7bda1e1a19b28
Author: David Meyer
Date:   Tue Jan 14 15:53:43 2025 -0500

    Ensure $HOME gets expanded correctly during compilation on unix

    Co-authored-by: NMFolks <70551431+NMFolks@users.noreply.github.com>

commit c9eac49ae37550e2754ba1e70572b5c6c897197a
Author: David Meyer
Date:   Thu Oct 3 15:05:13 2024 -0400

    Add some feedback to `labscript-profile-create` to track progress and
    output directories.

commit 00a21984b6e7dcfd664e7a17c4d6599a9d4c79cf
Author: David Meyer
Date:   Thu Oct 3 15:03:59 2024 -0400

    Make compile flag a toggle.

commit 2d9f23d911358cd457428222d5f9b99f95797154
Author: David Meyer
Date:   Wed Aug 21 13:24:45 2024 -0400

    Make connection table compilation an optional flag.

    Default is to not compile, maintaining older behavior.
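The "$HOME gets expanded correctly" fix above exists because paths read from a labconfig file come back from configparser as raw strings, so `$HOME`-style variables must be expanded before compilation. A minimal sketch of the mechanism, using an invented environment variable rather than a real labconfig path:

```python
import os

# configparser returns the raw string from the config file, so environment
# variables in it must be expanded explicitly with os.path.expandvars.
# DEMO_PROFILE_DIR is a stand-in for illustration, not a variable labscript uses.
os.environ["DEMO_PROFILE_DIR"] = "/home/example/labscript-suite"
raw = "$DEMO_PROFILE_DIR/userlib/labscriptlib/connection_table.py"

expanded = os.path.expandvars(raw)
print(expanded)  # /home/example/labscript-suite/userlib/labscriptlib/connection_table.py
```

On Windows, `os.path.expandvars` also handles `%VAR%` syntax, which is why the same call works cross-platform.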
commit a706494a5467e149ae8b1d9bff6f306e77134f09
Author: David Meyer
Date:   Wed Aug 21 13:12:07 2024 -0400

    Configure `labscript-profile-create` to compile the default connection
    table.

    This will allow BLACS to run immediately using dummy devices, without
    manual compilation required from the user.

    Co-authored-by: NMFolks <70551431+NMFolks@users.noreply.github.com>

commit 3ab11cedae5db93b87846f6d7b9e03d664e64948
Merge: 5e3fb28 09b6249
Author: David Meyer
Date:   Thu Jan 16 16:24:18 2025 -0500

    Merge pull request #109 from carterturn/cartertu-digitaloutput-inverted-fix

    Handling inverted for DigitalOutput set_DO

commit 19f8f2344d04f18a67ee21056231c6249d563115
Author: chrisjbillington
Date:   Tue Dec 17 15:06:50 2024 +1100

    Require qtutils >=4.0 for PySide6 support

commit c6e729b3d837ca6e74af143fb2ea6400250d5289
Author: chrisjbillington
Date:   Thu Dec 12 00:21:52 2024 +1030

    Make splash screen widget compatible with PySide6

commit 5e3fb28f84f614c2972a59849482edd4f8a8740a
Merge: b042db4 a9537dc
Author: David Meyer
Date:   Fri Nov 15 16:29:41 2024 -0500

    Merge pull request #110 from dihm/zmq_fix

    Modify default zmq binding address to use `tcp://*`

commit a9537dcbd3e19630bc21da000f01fd44a591e315
Author: David Meyer
Date:   Fri Nov 15 09:13:11 2024 -0500

    Modify default zmq binding address to use `tcp://*`

    Fixes compatibility with zmq>=26

commit 09b624998907aef7ececaa98689a1ffbd13c4478
Author: Carter Turn
Date:   Tue May 30 19:02:07 2023 -0400

    Removed unneeded return

commit 26344e6802ebe0da68fc241e43b117304b20a4ca
Author: Carter Turn
Date:   Tue May 2 09:08:08 2023 -0400

    Fix the set_DO function for InvertedDigitalOutput

commit b042db43cd3e02f16df3ab4aa410761b7237fc24
Merge: 9bb8842 d7a044a
Author: Phil Starkey
Date:   Thu Oct 3 09:16:44 2024 +1000

    Merge pull request #108 from dihm/hotfix_editable_installs

    Urgent hotfix to #107

commit d7a044adf9dfebf5365cefb7e64ebe090ca1b2fd
Author: David Meyer
Date:   Wed Oct 2 12:01:37 2024 -0400

    Re-adds `setup.py` with custom editable install code that should not
    have been deleted in #107

commit 9bb884223991e08dda6528ca71a25d2ee53b988f
Merge: 426bff5 8c1620c
Author: David Meyer
Date:   Sat Jul 13 19:47:16 2024 -0400

    Merge pull request #107 from dihm/metadata_overhaul

    Metadata overhaul

commit 426bff511f3c1d1d06f46283235ef8bdb8804e7a
Merge: b0882cd 189c3cf
Author: David Meyer
Date:   Sat Jul 13 19:45:53 2024 -0400

    Merge pull request #106 from dihm/mod_watcher_imp_fix

    Module watcher imp removal

commit b0882cde5cd38289c592fde4155dcd83aa3f826b
Merge: 851f354 2ab6077
Author: David Meyer
Date:   Sat Jul 13 19:44:04 2024 -0400

    Merge pull request #105 from dihm/imp_hotfix

    Imp hotfix

commit 8c1620c0a5e9481bc08eae9a84784c0521195c1f
Author: David Meyer
Date:   Sat Jul 13 02:20:29 2024 -0400

    Move metadata to pyproject.toml, update setuptools-scm configs

commit 189c3cf34475142caa44bb783b9f6aed2f9fe0c5
Author: David Meyer
Date:   Thu Apr 4 15:20:13 2024 -0400

    Remove `imp.acquire_lock` and `imp.release_lock`.

    Global import locks have been removed since Python 3.3, so they are
    likely just not needed anymore.

commit 4e3b10105674bed404e4aeab9abdc9185e742db6
Author: David Meyer
Date:   Thu Apr 4 15:16:24 2024 -0400

    Add very simple test that confirms `ModuleWatcher` nominally functions

commit 2ab607796df39b83fe622c4dccff6d52d008dfae
Author: David Meyer
Date:   Thu Apr 4 14:17:04 2024 -0400

    Remove old method of importing `register_classes.py` to remove `imp`
    dependency.

commit 37aa2ca2986a9f530d947bdf8f93cf37a39e0a4e
Author: David Meyer
Date:   Thu Apr 4 14:09:32 2024 -0400

    Swap over of `imp` import check to default to older behavior if available.
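The `imp`-removal commits above replace the deprecated (and since-removed) `imp` module with `importlib`. The standard idiom for importing a script from a file path, which is the kind of loading `register_classes.py` needs, looks like this; the helper name and demo file are illustrative, not labscript's actual API:

```python
import importlib.util
import os
import sys
import tempfile

def import_script(name, path):
    """Load a Python source file as a module via importlib.

    Minimal sketch of the modern replacement for imp-based loading;
    the function name is invented for this example.
    """
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    # Register in sys.modules before executing, so imports during exec resolve:
    sys.modules[name] = module
    spec.loader.exec_module(module)
    return module

# Demonstrate on a throwaway script:
with tempfile.TemporaryDirectory() as tmp:
    script = os.path.join(tmp, "demo_register_classes.py")
    with open(script, "w") as f:
        f.write("VALUE = 42\n")
    demo = import_script("demo_register_classes", script)

print(demo.VALUE)  # 42
```

Note that no explicit import lock is taken: since Python 3.3 the import machinery holds per-module locks internally, which is the rationale the commit message gives for dropping `imp.acquire_lock`/`imp.release_lock`.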
commit 851f354303c7a54bbc18e8151a43edb2180438bc
Merge: 77f5cf7 da5a2c4
Author: David Meyer
Date:   Mon Feb 12 15:20:43 2024 -0500

    Merge pull request #101 from ispielma/FixImpIssue

    Fixes an issue where `imp` is no longer a python module

commit 77f5cf70a54796523d89412488e2c77711ecc399
Merge: ea44a21 1acfde2
Author: David Meyer
Date:   Fri Feb 9 12:48:56 2024 -0500

    Merge pull request #102 from dihm/setuptools_scm_fix

    Ensure setuptools_scm always uses `release-branch-semver`

commit ea44a216e7721c7ed9c22c87568945bb509a1a74
Merge: 2cf7fb6 01bf571
Author: David Meyer
Date:   Fri Feb 9 12:48:42 2024 -0500

    Merge pull request #103 from dihm/update_workflow

    Update workflow pins to use node.js 20

commit 01bf571bee86e5aac7f83ff6aaf7fce75e644c99
Author: David Meyer
Date:   Fri Feb 9 12:46:14 2024 -0500

    Update workflow pins to use node.js 20

commit 1acfde2c12da9c2454976d2fd8c3344aebf00019
Author: David Meyer
Date:   Fri Feb 9 12:42:36 2024 -0500

    Ensure setuptools_scm always uses `release-branch-semver`

    Also update setuptools and setuptools_scm pin in build.

commit da5a2c457f0f3b44b363e320d4ce53ceadc5adcd
Author: spielman
Date:   Fri Feb 9 12:22:20 2024 -0500

    Comment fix.

commit 5f9e1ddc7d9c7a72cd309af886f4721452e7c488
Author: spielman
Date:   Fri Feb 9 12:16:15 2024 -0500

    Fixed.
commit 2cf7fb65b9323f71c9185e7f873a8a2303a25e57
Merge: 9e9df6d be1c791
Author: David Meyer
Date:   Thu Jan 18 20:47:13 2024 -0500

    Merge pull request #99 from dihm/rtd_update

    Modernize RTD build

commit be1c7913b78568b2750854168e84f1de35149c39
Author: David Meyer
Date:   Thu Jan 18 20:39:29 2024 -0500

    Modernize RTD build
---
 .github/workflows/release-vars.sh                  |  47 ++
 .github/workflows/release.yml                      | 519 ++++++++++++------
 README.md                                          |   2 +-
 docs/source/conf.py                                |  22 +-
 labscript_profile/create.py                        | 101 +++-
 labscript_utils/__version__.py                     |  19 +-
 .../device_registry/_device_registry.py            |  22 +-
 labscript_utils/ls_zprocess.py                     |   2 +-
 labscript_utils/modulewatcher.py                   |  42 +-
 labscript_utils/qtwidgets/digitaloutput.py         |   7 +-
 labscript_utils/qtwidgets/outputbox.py             |   2 +-
 labscript_utils/splash.py                          |  64 +--
 pyproject.toml                                     |  64 ++-
 readthedocs.yaml                                   |  10 +-
 setup.cfg                                          |  55 --
 setup.py                                           |   9 +-
 16 files changed, 660 insertions(+), 327 deletions(-)
 create mode 100644 .github/workflows/release-vars.sh
 delete mode 100644 setup.cfg

diff --git a/.github/workflows/release-vars.sh b/.github/workflows/release-vars.sh
new file mode 100644
index 0000000..27aab92
--- /dev/null
+++ b/.github/workflows/release-vars.sh
@@ -0,0 +1,47 @@
+# This repository. PyPI and Anaconda test and release package uploads are only done if
+# the repository the workflow is running in matches this (i.e. is not a fork). Optional,
+# if not set, package uploads are skipped.
+export RELEASE_REPO="labscript-suite/labscript-utils"
+
+# Username with which to upload conda packages. If not given, anaconda uploads are
+# skipped.
+export ANACONDA_USER="labscript-suite"
+
+# Whether (true or false) to upload releases to PyPI, non-releases to Test PyPI,
+# releases to Anaconda, non-releases to Anaconda test label. Only used if the repository
+# the workflow is running in matches RELEASE_REPO, otherwise uploads are skipped.
+# Anaconda uploads require ANACONDA_USER be specified and ANACONDA_API_TOKEN secret be
+# set. Optional, all default to true.
+export PYPI_UPLOAD=""
+export TESTPYPI_UPLOAD=""
+export ANACONDA_UPLOAD=""
+export TEST_ANACONDA_UPLOAD=""
+
+# Which Python version to use for pure wheel builds, sdists, and as the host Python for
+# cibuildwheel. Optional, defaults to the second-most recent minor Python version.
+export DEFAULT_PYTHON=""
+
+# Comma-separated list of Python versions to build conda packages for. Only used if
+# HAS_ENV_MARKERS=true or PURE=false, otherwise a noarch conda package is built instead.
+# Optional, defaults to all non-end-of-life stable Python minor versions.
+export CONDA_PYTHONS=""
+
+# Environment variable set in the environment that `cibuildwheel` runs in instructing it
+# which Pythons to build for, as a space-separated list of specifiers in the format
+# specified by `cibuildwheel`. Only used if PURE=false. Optional, defaults to all
+# non-end-of-life stable CPython versions.
+export CIBW_BUILD=""
+
+# Name of Python package. Optional, defaults to name from the package metadata
+export PKGNAME=""
+
+# Version of Python package. Optional, defaults to version from the package metadata
+export PKGVER=""
+
+# Whether the Python package is pure (true) or impure (false). Optional, defaults to
+# false if the setuptools package has extension modules or libraries, otherwise true.
+export PURE=""
+
+# Whether (true or false) the Python package has dependencies that vary by platform or
+# Python version. Optional, defaults to presence of env markers in package metadata.
+export HAS_ENV_MARKERS="" diff --git a/.github/workflows/release.yml b/.github/workflows/release.yml index 3e00491..720d509 100644 --- a/.github/workflows/release.yml +++ b/.github/workflows/release.yml @@ -5,252 +5,443 @@ on: branches: - master - maintenance/* - create: tags: - 'v[0-9]+.[0-9]+.[0-9]+*' env: - PACKAGE_NAME: labscript-utils - SCM_LOCAL_SCHEME: no-local-version - ANACONDA_USER: labscript-suite - - # Configuration for a package with compiled extensions: - # PURE: false - # NOARCH: false - - # Configuration for a package with no extensions, but with dependencies that differ by - # platform or Python version: - # PURE: true - # NOARCH: false - - # Configuration for a package with no extensions and the same dependencies on all - # platforms and Python versions. For this configuration you should comment out all but - # the first entry in the job matrix of the build job since multiple platforms are not - # needed. - PURE: true - NOARCH: true + OS_LIST_UBUNTU: '["ubuntu-latest"]' + OS_LIST_ALL: '["ubuntu-latest", "windows-latest", "macos-latest", "macos-13"]' + jobs: - build: - name: Build - runs-on: ${{ matrix.os }} - strategy: - matrix: - include: - - { os: ubuntu-latest, python: '3.11', arch: x64, conda: true} - # - { os: ubuntu-latest, python: '3.10', arch: x64, conda: true } - # - { os: ubuntu-latest, python: '3.9', arch: x64, conda: true } - # - { os: ubuntu-latest, python: '3.8', arch: x64, conda: true } - # - { os: ubuntu-latest, python: '3.7', arch: x64, conda: true } - - # - { os: macos-11, python: '3.11', arch: x64, conda: true } - # - { os: macos-11, python: '3.10', arch: x64, conda: true } - # - { os: macos-11, python: '3.9', arch: x64, conda: true } - # - { os: macos-11, python: '3.8', arch: x64, conda: true } - # - { os: macos-11, python: '3.7', arch: x64, conda: true } - - # - { os: windows-latest, python: '3.11', arch: x64, conda: true } - # - { os: windows-latest, python: '3.10', arch: x64, conda: true } - # - { os: windows-latest, 
python: '3.9', arch: x64, conda: true } - # - { os: windows-latest, python: '3.8', arch: x64, conda: true } - # - { os: windows-latest, python: '3.7', arch: x64, conda: true } - - # - { os: windows-latest, python: '3.11', arch: x86, conda: false } # conda not yet available - # - { os: windows-latest, python: '3.10', arch: x86, conda: true } - # - { os: windows-latest, python: '3.9', arch: x86, conda: true } - # - { os: windows-latest, python: '3.8', arch: x86, conda: true } - # - { os: windows-latest, python: '3.7', arch: x86, conda: true } - - if: github.repository == 'labscript-suite/labscript-utils' && (github.event_name != 'create' || github.event.ref_type != 'branch') + configure: + name: Configure workflow run + runs-on: ubuntu-latest + outputs: + DEFAULT_PYTHON: ${{ steps.config.outputs.DEFAULT_PYTHON }} + CIBW_BUILD: ${{ steps.config.outputs.CIBW_BUILD }} + PKGNAME: ${{ steps.config.outputs.PKGNAME }} + PKGVER: ${{ steps.config.outputs.PKGVER }} + PURE: ${{ steps.config.outputs.PURE }} + ANACONDA_USER: ${{ steps.config.outputs.ANACONDA_USER }} + CONDA_BUILD_ARGS: ${{ steps.config.outputs.CONDA_BUILD_ARGS }} + BUILD_OS_LIST: ${{ steps.config.outputs.BUILD_OS_LIST }} + RELEASE: ${{ steps.config.outputs.RELEASE }} + TESTPYPI_UPLOAD_THIS_RUN: ${{ steps.config.outputs.TESTPYPI_UPLOAD_THIS_RUN }} + PYPI_UPLOAD_THIS_RUN: ${{ steps.config.outputs.PYPI_UPLOAD_THIS_RUN }} + TEST_ANACONDA_UPLOAD_THIS_RUN: ${{ steps.config.outputs.TEST_ANACONDA_UPLOAD_THIS_RUN }} + ANACONDA_UPLOAD_THIS_RUN: ${{ steps.config.outputs.ANACONDA_UPLOAD_THIS_RUN }} + steps: - name: Checkout - uses: actions/checkout@v3 + uses: actions/checkout@v4 with: fetch-depth: 0 - - name: Ignore Tags - if: github.event.ref_type != 'tag' + - name: Ignore Tags for non-tag pushes + if: "!startsWith(github.ref, 'refs/tags/')" run: git tag -d $(git tag --points-at HEAD) - name: Install Python - uses: actions/setup-python@v4 + uses: actions/setup-python@v5 with: - python-version: ${{ matrix.python }} - 
architecture: ${{ matrix.arch }} + python-version: '3.x' - - name: Source Distribution - if: strategy.job-index == 0 + - name: Configure workflow + id: config run: | - python -m pip install --upgrade pip setuptools wheel build - python -m build -s . + pip install ci-helper - - name: Wheel Distribution - # Impure Linux wheels are built in the manylinux job. - if: (env.PURE == 'true' && strategy.job-index == 0) || (env.PURE == 'false' && runner.os != 'Linux') - run: | - python -m pip install --upgrade pip setuptools wheel build - python -m build -w . + # Load repo-specific variables and overrides: + VARS_FILE=".github/workflows/release-vars.sh" + if [ -f "${VARS_FILE}" ]; then + source "${VARS_FILE}" + fi - - name: Upload Artifact - if: strategy.job-index == 0 || (env.PURE == 'false' && runner.os != 'Linux') - uses: actions/upload-artifact@v3 - with: - name: dist - path: ./dist + # Python version used to build sdists, pure wheels, and as host Python for + # `cibuildwheel`: + if [ -z "${DEFAULT_PYTHON}" ]; then + # Default to second-most recent supported Python version: + DEFAULT_PYTHON=$(ci-helper defaultpython) + fi - - name: Set Variables for Conda Build - if: matrix.conda - shell: bash - run: | - if [ $NOARCH == true ]; then - CONDA_BUILD_ARGS="--noarch" + # Versions of Python to build conda packages for: + if [ -z "${CONDA_PYTHONS}" ]; then + # Default to all supported Python versions: + CONDA_PYTHONS=$(ci-helper pythons) + fi + + # Env var for `cibuildwheel` specifying target Python versions: + if [ -z "${CIBW_BUILD}" ]; then + # default to all supported CPython versions: + CIBW_BUILD=$(ci-helper pythons --cibw) + fi + + # Package name and version + if [ -z "${PKGNAME}" ]; then + # Default to package name from project metadata: + PKGNAME=$(ci-helper distinfo name .) + fi + if [ -z "${PKGVER}" ]; then + # Default to package version from project metadata: + PKGVER=$(ci-helper distinfo version .) 
+ fi + + # Whether the package is pure python + if [ -z "${PURE}" ]; then + # Default to whether the setuptools package declares no modules/libraries: + PURE=$(ci-helper distinfo is_pure .) + fi + + # Whether the package requirements depend on platform or Python version: + if [ -z "${HAS_ENV_MARKERS}" ]; then + # Default to the presence of env markers in package metadata: + HAS_ENV_MARKERS=$(ci-helper distinfo has_env_markers .) + fi + + # List of OSs we need to run the build job on and arguments to + # `setuptools-conda build`: + if [[ "${PURE}" == false || "${HAS_ENV_MARKERS}" == true ]]; then + BUILD_OS_LIST="${OS_LIST_ALL}" + CONDA_BUILD_ARGS="--pythons=${CONDA_PYTHONS}" else - CONDA_BUILD_ARGS="" + BUILD_OS_LIST="${OS_LIST_UBUNTU}" + CONDA_BUILD_ARGS="--noarch" fi - echo "CONDA_BUILD_ARGS=$CONDA_BUILD_ARGS" >> $GITHUB_ENV - - name: Install Miniconda - if: matrix.conda - uses: conda-incubator/setup-miniconda@v2 - with: - auto-update-conda: true - python-version: ${{ matrix.python }} - architecture: ${{ matrix.arch }} - miniconda-version: "latest" + # Release if a tag was pushed: + if [ "${{ contains(github.ref, '/tags') }}" == true ]; then + RELEASE=true + else + RELEASE=false + fi - - name: Workaround conda-build incompatibility with xcode 12+ - if: runner.os == 'macOS' - uses: maxim-lobanov/setup-xcode@v1 - with: - xcode-version: 11.7 + # What types of package uploads are enabled: + if [ -z "${PYPI_UPLOAD}" ]; then + PYPI_UPLOAD=true + else + PYPI_UPLOAD=false + fi + if [ -z "${TESTPYPI_UPLOAD}" ]; then + TESTPYPI_UPLOAD=true + else + TESTPYPI_UPLOAD=false + fi + if [ -z "${ANACONDA_UPLOAD}" ]; then + ANACONDA_UPLOAD=true + else + ANACONDA_UPLOAD=false + fi + if [ -z "${TEST_ANACONDA_UPLOAD}" ]; then + TEST_ANACONDA_UPLOAD=true + else + TEST_ANACONDA_UPLOAD=false + fi - - name: Conda package (Unix) - if: (matrix.conda && runner.os != 'Windows') - shell: bash -l {0} - run: | - conda install -c labscript-suite setuptools-conda - setuptools-conda build 
$CONDA_BUILD_ARGS . + if [ "${{ github.repository }}" != "${RELEASE_REPO}" ]; then + echo "Workflow repo doesn't match ${RELEASE_REPO}, disabling package uploads" + PYPI_UPLOAD=false + TESTPYPI_UPLOAD=false + ANACONDA_UPLOAD=false + TEST_ANACONDA_UPLOAD=false + fi - - name: Conda Package (Windows) - if: (matrix.conda && runner.os == 'Windows') - shell: cmd /C CALL {0} - run: | - conda install -c labscript-suite setuptools-conda && ^ - setuptools-conda build %CONDA_BUILD_ARGS% --croot ${{ runner.temp }}\cb . + # If Anaconda uploads enabled, check necessary username and token are + # available: + if [[ "${ANACONDA_UPLOAD}" == true || "${TEST_ANACONDA_UPLOAD}" == true ]]; then + if [ -z "${{ secrets.ANACONDA_API_TOKEN }}" ]; then + echo "Anaconda uploads enabled but ANACONDA_API_TOKEN secret not set" + exit 1 + fi + if [ -z "${ANACONDA_USER}" ]; then + echo "Anaconda uploads enabled but ANACONDA_USER not set" + exit 1 + fi + fi - - name: Upload Artifact - if: matrix.conda - uses: actions/upload-artifact@v3 - with: - name: conda_packages - path: ./conda_packages + # If enabled, upload releases to PyPI and Anaconda: + if [[ "${RELEASE}" == true && "${PYPI_UPLOAD}" == true ]]; then + PYPI_UPLOAD_THIS_RUN=true + else + PYPI_UPLOAD_THIS_RUN=false + fi + if [[ "${RELEASE}" == true && "${ANACONDA_UPLOAD}" == true ]]; then + ANACONDA_UPLOAD_THIS_RUN=true + else + ANACONDA_UPLOAD_THIS_RUN=false + fi + + # If enabled, upload non-releases to Test PyPI and Anaconda test label: + if [[ "${RELEASE}" == false && "${TESTPYPI_UPLOAD}" == true ]]; then + TESTPYPI_UPLOAD_THIS_RUN=true + else + TESTPYPI_UPLOAD_THIS_RUN=false + fi + if [[ "${RELEASE}" == false && "${TEST_ANACONDA_UPLOAD}" == true ]]; then + TEST_ANACONDA_UPLOAD_THIS_RUN=true + else + TEST_ANACONDA_UPLOAD_THIS_RUN=false + fi + echo "DEFAULT_PYTHON=${DEFAULT_PYTHON}" >> "${GITHUB_OUTPUT}" + echo "CIBW_BUILD=${CIBW_BUILD}" >> "${GITHUB_OUTPUT}" + echo "PKGNAME=${PKGNAME}" >> "${GITHUB_OUTPUT}" + echo "PKGVER=${PKGVER}" >> 
"${GITHUB_OUTPUT}" + echo "PURE=${PURE}" >> "${GITHUB_OUTPUT}" + echo "ANACONDA_USER=${ANACONDA_USER}" >> "${GITHUB_OUTPUT}" + echo "CONDA_BUILD_ARGS=${CONDA_BUILD_ARGS}" >> "${GITHUB_OUTPUT}" + echo "BUILD_OS_LIST=${BUILD_OS_LIST}" >> "${GITHUB_OUTPUT}" + echo "RELEASE=${RELEASE}" >> "${GITHUB_OUTPUT}" + echo "TESTPYPI_UPLOAD_THIS_RUN=${TESTPYPI_UPLOAD_THIS_RUN}" >> "${GITHUB_OUTPUT}" + echo "PYPI_UPLOAD_THIS_RUN=${PYPI_UPLOAD_THIS_RUN}" >> "${GITHUB_OUTPUT}" + echo "TEST_ANACONDA_UPLOAD_THIS_RUN=${TEST_ANACONDA_UPLOAD_THIS_RUN}" >> "${GITHUB_OUTPUT}" + echo "ANACONDA_UPLOAD_THIS_RUN=${ANACONDA_UPLOAD_THIS_RUN}" >> "${GITHUB_OUTPUT}" + + echo + echo "==========================" + echo "Workflow run configuration:" + echo "--------------------------" + cat "${GITHUB_OUTPUT}" + echo "==========================" + echo + + + build: + name: Build + runs-on: ${{ matrix.os }} + needs: configure + strategy: + matrix: + os: ${{ fromJSON(needs.configure.outputs.BUILD_OS_LIST) }} + + env: + DEFAULT_PYTHON: ${{ needs.configure.outputs.DEFAULT_PYTHON }} + CIBW_BUILD: ${{ needs.configure.outputs.CIBW_BUILD }} + PURE: ${{ needs.configure.outputs.PURE }} + CONDA_BUILD_ARGS: ${{ needs.configure.outputs.CONDA_BUILD_ARGS }} - manylinux: - name: Build Manylinux - runs-on: ubuntu-latest - if: github.repository == 'labscript-suite/labscript-utils' && (github.event_name != 'create' || github.event.ref_type != 'branch') steps: - name: Checkout - if: env.PURE == 'false' - uses: actions/checkout@v3 + uses: actions/checkout@v4 with: fetch-depth: 0 - - name: Ignore Tags - if: github.event.ref_type != 'tag' && env.PURE == 'false' + - name: Ignore Tags for non-tag pushes + if: "!startsWith(github.ref, 'refs/tags/')" run: git tag -d $(git tag --points-at HEAD) - - name: Build Manylinux Wheels - if: env.PURE == 'false' - uses: RalfG/python-wheels-manylinux-build@v0.4.2 + - name: Install Python + uses: actions/setup-python@v5 with: - python-versions: 'cp37-cp37m cp38-cp38 cp39-cp39 cp310-cp310 
cp311-cp311' - pre-build-command: 'git config --global --add safe.directory "*"' + python-version: ${{ env.DEFAULT_PYTHON }} - - name: Upload Artifact - if: env.PURE == 'false' - uses: actions/upload-artifact@v3 - with: - name: dist - path: dist/*manylinux*.whl + - name: Install Python tools + run: python -m pip install --upgrade pip setuptools wheel build cibuildwheel - release: - name: Release - runs-on: ubuntu-latest - needs: [build, manylinux] - steps: + - name: Source distribution + if: strategy.job-index == 0 + run: python -m build -s . - - name: Download Artifact - uses: actions/download-artifact@v3 + - name: Wheel distribution (pure) + if: env.PURE == 'true' && strategy.job-index == 0 + run: python -m build -w . + + - name: Wheel distribution (impure) + if: env.PURE == 'false' + run: cibuildwheel --output-dir dist + + - name: Upload artifact + if: env.PURE == 'false' || strategy.job-index == 0 + uses: actions/upload-artifact@v4 with: - name: dist + name: dist-${{ matrix.os }} path: ./dist + if-no-files-found: error - - name: Download Artifact - uses: actions/download-artifact@v3 + - name: Install Miniforge + uses: conda-incubator/setup-miniconda@v3 with: - name: conda_packages - path: ./conda_packages + miniforge-version: "latest" + auto-update-conda: true + conda-remove-defaults: true + auto-activate-base: true + activate-environment: "" - - name: Get Version Number - if: github.event.ref_type == 'tag' + - name: Conda package + shell: bash -l {0} run: | - VERSION="${GITHUB_REF/refs\/tags\/v/}" - echo "VERSION=$VERSION" >> $GITHUB_ENV + if [ "${{ runner.os }}" == Windows ]; then + # Short path to minimise odds of hitting Windows max path length + CONDA_BUILD_ARGS+=" --croot ${{ runner.temp }}\cb" + fi + conda install -c labscript-suite setuptools-conda "conda-build<25" + setuptools-conda build $CONDA_BUILD_ARGS . 
- - name: Create GitHub Release and Upload Release Asset - if: github.event.ref_type == 'tag' - uses: softprops/action-gh-release@v1 + - name: Upload artifact + uses: actions/upload-artifact@v4 + with: + name: conda_packages-${{ matrix.os }} + path: ./conda_packages + if-no-files-found: error + + + github-release: + name: Publish release (GitHub) + runs-on: ubuntu-latest + needs: [configure, build] + if: ${{ needs.configure.outputs.RELEASE == 'true' }} + permissions: + contents: write + env: + PKGNAME: ${{ needs.configure.outputs.PKGNAME }} + PKGVER: ${{ needs.configure.outputs.PKGVER }} + + steps: + - name: Download Artifact + uses: actions/download-artifact@v4 + with: + pattern: dist* + path: ./dist + merge-multiple: true + + - name: Create GitHub release and upload release asset + uses: softprops/action-gh-release@v2 env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} with: tag_name: ${{ github.event.ref }} - name: ${{ env.PACKAGE_NAME }} ${{ env.VERSION }} + name: ${{ env.PKGNAME }} ${{ env.PKGVER }} draft: true prerelease: ${{ contains(github.event.ref, 'rc') }} - files: ./dist/${{ env.PACKAGE_NAME }}-${{ env.VERSION }}.tar.gz + files: ./dist/*.tar.gz + + + testpypi-upload: + name: Publish on Test PyPI + runs-on: ubuntu-latest + needs: [configure, build] + if: ${{ needs.configure.outputs.TESTPYPI_UPLOAD_THIS_RUN == 'true' }} + env: + PKGNAME: ${{ needs.configure.outputs.PKGNAME }} + PKGVER: ${{ needs.configure.outputs.PKGVER }} + environment: + name: testpypi + url: https://test.pypi.org/project/${{ env.PKGNAME }}/${{ env.PKGVER }} + permissions: + id-token: write + + steps: + - name: Download Artifact + uses: actions/download-artifact@v4 + with: + pattern: dist* + path: ./dist + merge-multiple: true - name: Publish on TestPyPI uses: pypa/gh-action-pypi-publish@release/v1 with: - user: __token__ - password: ${{ secrets.testpypi }} repository-url: https://test.pypi.org/legacy/ + + pypi-upload: + name: Publish on PyPI + runs-on: ubuntu-latest + needs: [configure, 
build] + if: ${{ needs.configure.outputs.PYPI_UPLOAD_THIS_RUN == 'true' }} + env: + PKGNAME: ${{ needs.configure.outputs.PKGNAME }} + PKGVER: ${{ needs.configure.outputs.PKGVER }} + environment: + name: pypi + url: https://pypi.org/project/${{ env.PKGNAME }}/${{ env.PKGVER }} + permissions: + id-token: write + + steps: + - name: Download Artifact + uses: actions/download-artifact@v4 + with: + pattern: dist* + path: ./dist + merge-multiple: true + - name: Publish on PyPI - if: github.event.ref_type == 'tag' uses: pypa/gh-action-pypi-publish@release/v1 + + + test-anaconda-upload: + name: Publish on Anaconda (test label) + runs-on: ubuntu-latest + needs: [configure, build] + if: ${{ needs.configure.outputs.TEST_ANACONDA_UPLOAD_THIS_RUN == 'true' }} + + steps: + - name: Download Artifact + uses: actions/download-artifact@v4 with: - user: __token__ - password: ${{ secrets.pypi }} + pattern: conda_packages-* + path: ./conda_packages + merge-multiple: true - - name: Install Miniconda - uses: conda-incubator/setup-miniconda@v2 + - name: Install Miniforge + uses: conda-incubator/setup-miniconda@v3 with: + miniforge-version: "latest" auto-update-conda: true + conda-remove-defaults: true + auto-activate-base: true + activate-environment: "" - name: Install Anaconda cloud client shell: bash -l {0} run: conda install anaconda-client - name: Publish to Anaconda test label - if: github.event.ref_type != 'tag' shell: bash -l {0} run: | anaconda \ --token ${{ secrets.ANACONDA_API_TOKEN }} \ upload \ - --user $ANACONDA_USER \ + --skip-existing \ + --user ${{ needs.configure.outputs.ANACONDA_USER }} \ --label test \ conda_packages/*/* - - name: Publish to Anaconda main label + + anaconda-upload: + name: Publish on Anaconda + runs-on: ubuntu-latest + needs: [configure, build] + if: ${{ needs.configure.outputs.ANACONDA_UPLOAD_THIS_RUN == 'true' }} + + steps: + - name: Download Artifact + uses: actions/download-artifact@v4 + with: + pattern: conda_packages-* + path: ./conda_packages + 
merge-multiple: true + + - name: Install Miniforge + uses: conda-incubator/setup-miniconda@v3 + with: + miniforge-version: "latest" + auto-update-conda: true + conda-remove-defaults: true + auto-activate-base: true + activate-environment: "" + + - name: Install Anaconda cloud client + shell: bash -l {0} + run: conda install anaconda-client + + - name: Publish to Anaconda main shell: bash -l {0} - if: github.event.ref_type == 'tag' run: | anaconda \ --token ${{ secrets.ANACONDA_API_TOKEN }} \ upload \ - --user $ANACONDA_USER \ + --skip-existing \ + --user ${{ needs.configure.outputs.ANACONDA_USER }} \ conda_packages/*/* diff --git a/README.md b/README.md index 9963d95..eb13384 100644 --- a/README.md +++ b/README.md @@ -4,7 +4,7 @@ ### Shared modules used by the _labscript suite_ -[![Actions Status](https://github.com/labscript-suite/labscript-utils/workflows/Build%20and%20Release/badge.svg?branch=maintenance%2F3.0.x)](https://github.com/labscript-suite/labscript-utils/actions) +[![Actions Status](https://github.com/labscript-suite/labscript-utils/workflows/Build%20and%20Release/badge.svg)](https://github.com/labscript-suite/labscript-utils/actions) [![License](https://img.shields.io/pypi/l/labscript-utils.svg)](https://github.com/labscript-suite/labscript-utils/raw/master/LICENSE.txt) [![Python Version](https://img.shields.io/pypi/pyversions/labscript-utils.svg)](https://python.org) [![PyPI](https://img.shields.io/pypi/v/labscript-utils.svg)](https://pypi.org/project/labscript-utils) diff --git a/docs/source/conf.py b/docs/source/conf.py index 93b69f8..403ebbf 100644 --- a/docs/source/conf.py +++ b/docs/source/conf.py @@ -13,8 +13,6 @@ import copy import os from pathlib import Path -from m2r import MdInclude -from recommonmark.transform import AutoStructify from jinja2 import FileSystemLoader, Environment # -- Project information (unique to each project) ------------------------------------- @@ -47,7 +45,7 @@ "sphinx.ext.todo", "sphinx.ext.viewcode", 
"sphinx_rtd_theme", - "recommonmark", + "myst_parser", ] autodoc_typehints = 'description' @@ -73,6 +71,7 @@ # Prefix each autosectionlabel with the name of the document it is in and a colon autosectionlabel_prefix_document = True +myst_heading_anchors = 2 # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] @@ -211,23 +210,8 @@ # Customize the html_theme html_theme_options = {'navigation_depth': 3} -# Use m2r only for mdinclude and recommonmark for everything else -# https://github.com/readthedocs/recommonmark/issues/191#issuecomment-622369992 def setup(app): - config = { - # 'url_resolver': lambda url: github_doc_root + url, - 'auto_toc_tree_section': 'Contents', - 'enable_eval_rst': True, - } - app.add_config_value('recommonmark_config', config, True) - app.add_transform(AutoStructify) - - # from m2r to make `mdinclude` work - app.add_config_value('no_underscore_emphasis', False, 'env') - app.add_config_value('m2r_parse_relative_links', False, 'env') - app.add_config_value('m2r_anonymous_references', False, 'env') - app.add_config_value('m2r_disable_inline_math', False, 'env') - app.add_directive('mdinclude', MdInclude) + app.add_css_file('custom.css') # generate the components.rst file dynamically so it points to stable/latest diff --git a/labscript_profile/create.py b/labscript_profile/create.py index cf91b53..38a6773 100644 --- a/labscript_profile/create.py +++ b/labscript_profile/create.py @@ -5,6 +5,7 @@ from pathlib import Path from subprocess import check_output from labscript_profile import LABSCRIPT_SUITE_PROFILE, default_labconfig_path +import argparse _here = os.path.dirname(os.path.abspath(__file__)) DEFAULT_PROFILE_CONTENTS = os.path.join(_here, 'default_profile') @@ -21,7 +22,15 @@ def make_shared_secret(directory): raise RuntimeError("Could not parse output of zprocess.makesecret") -def make_labconfig_file(): +def make_labconfig_file(apparatus_name = None): + """Create labconfig file from 
template + + Parameters + ---------- + apparatus_name: str, optional + Overrides the default apparatus name with the provided one if not None + """ + source_path = os.path.join(LABSCRIPT_SUITE_PROFILE, 'labconfig', 'example.ini') target_path = default_labconfig_path() if os.path.exists(target_path): @@ -47,16 +56,88 @@ def make_labconfig_file(): '%(labscript_suite)s', shared_secret.relative_to(LABSCRIPT_SUITE_PROFILE) ) config.set('security', 'shared_secret', str(shared_secret_entry)) + if apparatus_name is not None: + print(f'\tSetting apparatus name to \'{apparatus_name}\'') + config.set('DEFAULT', 'apparatus_name', apparatus_name) with open(target_path, 'w') as f: config.write(f) +def compile_connection_table(): + """Compile the connection table defined in the labconfig file + + The output is placed in the location defined by the labconfig file. + """ + + try: + import runmanager + except ImportError: + # if runmanager doesn't import, skip compilation + return + + config = configparser.ConfigParser(defaults = {'labscript_suite': str(LABSCRIPT_SUITE_PROFILE)}) + config.read(default_labconfig_path()) + + # The path to the user's connection_table.py script + script_path = os.path.expandvars(config['paths']['connection_table_py']) + # path to the connection_table.h5 destination + output_h5_path = os.path.expandvars(config['paths']['connection_table_h5']) + # create output directory, if needed + Path(output_h5_path).parent.mkdir(parents=True, exist_ok=True) + # compile the h5 file + runmanager.new_globals_file(output_h5_path) + + def dummy_callback(success): + pass + + runmanager.compile_labscript_async(labscript_file = script_path, + run_file = output_h5_path, + stream_port = None, + done_callback = dummy_callback) + print(f'\tOutput written to {output_h5_path}') + +def create_profile_cli(): + """Function that defines the labscript-profile-create command + + Parses CLI arguments and calls :func:`~.create_profile`. 
+ """ + + # capture CMD arguments + parser = argparse.ArgumentParser(prog='labscript-profile-create', + description='Initialises a default labscript profile' + ) + + parser.add_argument('-n', '--apparatus_name', + type=str, + help='Sets the apparatus_name in the labconfig file. Defaults to example_apparatus', + ) + parser.add_argument('-c', '--compile', + action='store_true', + help='Enables compilation of the default example connection table', + default=False) + + args = parser.parse_args() + + create_profile(args.apparatus_name, args.compile) + +def create_profile(apparatus_name = None, compile_table = False): + """Function that creates a labscript config profile from the default config + + Parameters + ---------- + appratus_name: str, optional + apparatus_name to define in the config. + If None, defaults to example_apparatus (set in default config file) + compile_table: bool, optional + Whether to compile to example connection table defined by the default config file + Default is False. 
+ """ -def create_profile(): src = Path(DEFAULT_PROFILE_CONTENTS) dest = Path(LABSCRIPT_SUITE_PROFILE) + print(f'Creating labscript profile at {LABSCRIPT_SUITE_PROFILE}') # Profile directory may exist already, but we will error if it contains any of the - # files or directories we want to copy into it: + # sub-directories we want to copy into it: os.makedirs(dest, exist_ok=True) # Preferable to raise errors if anything exists before copying anything, rather than # do a partial copy before hitting an error: @@ -71,4 +152,16 @@ def create_profile(): else: shutil.copy2(src_file, dest_file) - make_labconfig_file() + print('Writing labconfig file') + make_labconfig_file(apparatus_name) + + # rename apparatus directories + if apparatus_name is not None: + print('\tRenaming apparatus directories') + for path in dest.glob('**/example_apparatus/'): + new_path = Path(str(path).replace('example_apparatus', apparatus_name)) + path.rename(new_path) + + if compile_table: + print('Compiling the example connection table') + compile_connection_table() \ No newline at end of file diff --git a/labscript_utils/__version__.py b/labscript_utils/__version__.py index f10de68..c21f494 100644 --- a/labscript_utils/__version__.py +++ b/labscript_utils/__version__.py @@ -1,5 +1,4 @@ from pathlib import Path - try: import importlib.metadata as importlib_metadata except ImportError: @@ -7,10 +6,22 @@ root = Path(__file__).parent.parent if (root / '.git').is_dir(): - from setuptools_scm import get_version - __version__ = get_version(root, version_scheme="release-branch-semver") + try: + from setuptools_scm import get_version + VERSION_SCHEME = { + "version_scheme": "release-branch-semver", + "local_scheme": "node-and-date", + } + scm_version = get_version(root, **VERSION_SCHEME) + except ImportError: + scm_version = None +else: + scm_version = None + +if scm_version is not None: + __version__ = scm_version else: try: __version__ = importlib_metadata.version(__package__) except 
importlib_metadata.PackageNotFoundError: - __version__ = None \ No newline at end of file + __version__ = None diff --git a/labscript_utils/device_registry/_device_registry.py b/labscript_utils/device_registry/_device_registry.py index 256d76d..e5a122a 100644 --- a/labscript_utils/device_registry/_device_registry.py +++ b/labscript_utils/device_registry/_device_registry.py @@ -1,12 +1,13 @@ +import importlib.machinery import os import importlib -import imp import warnings import traceback import inspect from labscript_utils import dedent from labscript_utils.labconfig import LabConfig + """This file contains the machinery for registering and looking up what BLACS tab and runviewer parser classes belong to a particular labscript device. "labscript device" here means a device that BLACS needs to communicate with. These devices have @@ -248,20 +249,19 @@ def register_classes(labscript_device_name, BLACS_tab=None, runviewer_parser=Non def populate_registry(): """Walk the labscript_devices folder looking for files called register_classes.py, - and run them (i.e. import them). These files are expected to make calls to + and run them. These files are expected to make calls to register_classes() to inform us of what BLACS tabs and runviewer classes correspond to their labscript device classes.""" - # We import the register_classes modules as a direct submodule of labscript_devices. - # But they cannot all have the same name, so we import them as - # labscript_devices._register_classes_script_ with increasing number. - module_num = 0 + # We execute the register_classes modules as a direct submodule of labscript_devices. for devices_dir in LABSCRIPT_DEVICES_DIRS: for folder, _, filenames in os.walk(devices_dir): if 'register_classes.py' in filenames: # The module name is the path to the file, relative to the labscript suite # install directory: - # Open the file using the import machinery, and import it as module_name. 
- fp, pathname, desc = imp.find_module('register_classes', [folder]) - module_name = 'labscript_devices._register_classes_%d' % module_num - _ = imp.load_module(module_name, fp, pathname, desc) - module_num += 1 + # Open the file using the import machinery, and run it + spec = importlib.machinery.PathFinder.find_spec('register_classes', [folder]) + mod = importlib.util.module_from_spec(spec) + spec.loader.exec_module(mod) + # fully importing module would require adding to sys.modules + # and each import would need to have unique names + # but we just need to run the registering code, not actually import the module diff --git a/labscript_utils/ls_zprocess.py b/labscript_utils/ls_zprocess.py index f3f2021..4d9e957 100644 --- a/labscript_utils/ls_zprocess.py +++ b/labscript_utils/ls_zprocess.py @@ -167,7 +167,7 @@ def __init__( port=None, dtype='pyobj', pull_only=False, - bind_address='tcp://0.0.0.0', + bind_address='tcp://*', timeout_interval=None, **kwargs ): diff --git a/labscript_utils/modulewatcher.py b/labscript_utils/modulewatcher.py index 25df904..ec1cdb1 100644 --- a/labscript_utils/modulewatcher.py +++ b/labscript_utils/modulewatcher.py @@ -14,7 +14,6 @@ import threading import time import os -import imp import site import sysconfig @@ -58,16 +57,8 @@ def mainloop(self): while True: time.sleep(1) with self.lock: - # Acquire the import lock so that we don't unload modules whilst an - # import is in progess: - imp.acquire_lock() - try: - if self.check(): - self.unload() - finally: - # We're done mucking around with the cached modules, normal imports - # in other threads may resume: - imp.release_lock() + if self.check(): + self.unload() def check(self): unload_required = False @@ -133,3 +124,32 @@ def unload(self): # code holds references to sys.meta_path, and to preserve order, since order # is relevant. 
sys.meta_path[:] = self.meta_whitelist + +if __name__ == "__main__": + + from pathlib import Path + import time + + dict1 = {'t': 5, 'val': 10} + dict2 = {'t': 5, 'val': 11} + + print('ModuleWatcher instantiated in debug mode') + module_watcher = ModuleWatcher(debug=True) + + # import a local module + import labscript_utils.dict_diff + print('imported labscript_utils.dict_diff') + print(labscript_utils.dict_diff.dict_diff(dict1, dict2)) + print('used dict_diff function, waiting 2 seconds for module watcher to update') + time.sleep(2) + + # now pretend it has been updated + ex_mod = Path('dict_diff.py') + ex_mod.touch() + print('dict_diff module touched, waiting 2 seconds for ModuleWatcher to notice') + time.sleep(2) + + print(labscript_utils.dict_diff.dict_diff(dict1, dict2)) + print('Used dict_diff again, waiting 2 seconds for ModuleWatcher to not do anything') + time.sleep(2) + \ No newline at end of file diff --git a/labscript_utils/qtwidgets/digitaloutput.py b/labscript_utils/qtwidgets/digitaloutput.py index 0704d6d..54a73e2 100644 --- a/labscript_utils/qtwidgets/digitaloutput.py +++ b/labscript_utils/qtwidgets/digitaloutput.py @@ -30,13 +30,13 @@ def __init__(self,*args,**kwargs): self._DO = None # Setting and getting methods for the Digitl Out object in charge of this button - def set_DO(self,DO,notify_old_DO=True,notify_new_DO=True): + def set_DO(self,DO,notify_old_DO=True,notify_new_DO=True,inverted=False): # If we are setting a new DO, remove this widget from the old one (if it isn't None) and add it to the new one (if it isn't None) if DO != self._DO: if self._DO is not None and notify_old_DO: self._DO.remove_widget(self) if DO is not None and notify_new_DO: - DO.add_widget(self) + DO.add_widget(self, inverted) # Store a reference to the digital out object self._DO = DO @@ -93,6 +93,9 @@ def state(self,state): class InvertedDigitalOutput(DigitalOutput): + def set_DO(self,DO,notify_old_DO=True,notify_new_DO=True,inverted=True): + DigitalOutput.set_DO(self, 
DO, notify_old_DO, notify_new_DO, inverted) + @property def state(self): return not DigitalOutput.state.fget(self) diff --git a/labscript_utils/qtwidgets/outputbox.py b/labscript_utils/qtwidgets/outputbox.py index b6e1bc4..c1b16b4 100644 --- a/labscript_utils/qtwidgets/outputbox.py +++ b/labscript_utils/qtwidgets/outputbox.py @@ -29,5 +29,5 @@ def __init__(self, container, scrollback_lines=1000): container=container, scrollback_lines=scrollback_lines, zmq_context=context, - bind_address='tcp://0.0.0.0', + bind_address='tcp://*', ) diff --git a/labscript_utils/splash.py b/labscript_utils/splash.py index e00c892..51c5686 100644 --- a/labscript_utils/splash.py +++ b/labscript_utils/splash.py @@ -15,27 +15,28 @@ from labscript_utils import dedent try: - from qtutils.qt import QtWidgets, QtCore, QtGui + from qtutils.qt import QtWidgets, QtCore, QtGui, QT_ENV except ImportError as e: if 'DLL load failed' in str(e): msg = """Failed to load Qt DLL. This can be caused by application shortcuts not being configured to activate conda environments. 
Try running the following from within the activated conda environment to fix the shortcuts: - python -m labscript_utils.winshell --fix-shortcuts.""" + desktop-app install blacs lyse runmanager runviewer""" raise ImportError(dedent(msg)) raise Qt = QtCore.Qt - -# Set auto high-DPI scaling - this ensures pixel metrics are scaled -# appropriately so that we don't get a weird mix of large fonts and small -# everything else on High DPI displays: -QtWidgets.QApplication.setAttribute(Qt.AA_EnableHighDpiScaling, True) -# Use high res pixmaps if available, instead of rendering at low resolution and -# upscaling: -QtWidgets.QApplication.setAttribute(Qt.AA_UseHighDpiPixmaps, True) +# These are default in Qt6 and print a warning if set +if QT_ENV == 'PyQt5': + # Set auto high-DPI scaling - this ensures pixel metrics are scaled + # appropriately so that we don't get a weird mix of large fonts and small + # everything else on High DPI displays: + QtWidgets.QApplication.setAttribute(Qt.AA_EnableHighDpiScaling, True) + # Use high res pixmaps if available, instead of rendering at low resolution and + # upscaling: + QtWidgets.QApplication.setAttribute(Qt.AA_UseHighDpiPixmaps, True) class Splash(QtWidgets.QFrame): @@ -46,12 +47,13 @@ class Splash(QtWidgets.QFrame): alpha = 0.875 icon_frac = 0.65 BG = '#ffffff' + FG = '#000000' def __init__(self, imagepath): self.qapplication = QtWidgets.QApplication.instance() if self.qapplication is None: self.qapplication = QtWidgets.QApplication(sys.argv) - QtWidgets.QFrame.__init__(self) + super().__init__() self.icon = QtGui.QPixmap() self.icon.load(imagepath) if self.icon.isNull(): @@ -63,7 +65,7 @@ def __init__(self, imagepath): self.setWindowFlags(Qt.SplashScreen) self.setWindowOpacity(self.alpha) self.label = QtWidgets.QLabel(self.text) - self.setStyleSheet("background-color: %s; font-size: 10pt" % self.BG) + self.setStyleSheet(f"color: {self.FG}; background-color: {self.BG}; font-size: 10pt") # Frame not necessary on macos, and looks ugly. 
if sys.platform != 'darwin': self.setFrameShape(QtWidgets.QFrame.StyledPanel) @@ -79,46 +81,24 @@ def __init__(self, imagepath): layout.addWidget(image_label) layout.addWidget(self.label) - center_point = QtWidgets.QDesktopWidget().availableGeometry().center() - x0, y0 = center_point.x(), center_point.y() - self.move(x0 - self.w // 2, y0 - self.h // 2) - self._first_paint_complete = False + self._paint_pending = False def paintEvent(self, event): - result = QtWidgets.QFrame.paintEvent(self, event) - if not self._first_paint_complete: - self._first_paint_complete = True - self.qapplication.quit() - return result - - def show(self): - QtWidgets.QFrame.show(self) - self.update_text(self.text) + self._paint_pending = False + return super().paintEvent(event) def update_text(self, text): self.text = text self.label.setText(text) - # If we are not visible yet, exec until we are painted. - if not self._first_paint_complete: - self.qapplication.exec_() - else: - self.repaint() + self._paint_pending = True + while self._paint_pending: + QtCore.QCoreApplication.processEvents(QtCore.QEventLoop.AllEvents) + QtCore.QCoreApplication.sendPostedEvents() if __name__ == '__main__': import time - - MACOS = sys.platform == 'darwin' - WINDOWS = sys.platform == 'win32' - LINUX = sys.platform.startswith('linux') - - if MACOS: - icon = '/Users/bilbo/tmp/runmanager/runmanager.svg' - elif LINUX: - icon = '/home/bilbo/labscript_suite/runmanager/runmanager.svg' - elif WINDOWS: - icon = R'C:\labscript_suite\runmanager\runmanager.svg' - + icon = '../../runmanager/runmanager/runmanager.svg' splash = Splash(icon) splash.show() time.sleep(1) diff --git a/pyproject.toml b/pyproject.toml index c72bd8d..e7bb065 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,3 +1,65 @@ [build-system] -requires = ["setuptools>=42", "wheel", "setuptools_scm[toml]>=4.1.0"] +requires = ["setuptools>=64", "setuptools_scm>=8"] build-backend = "setuptools.build_meta" + +[tool.setuptools_scm] +version_scheme = 
"release-branch-semver" +local_scheme = "no-local-version" + +[tool.setuptools] +zip-safe = false +include-package-data = true +packages = [ + "labscript_utils", + "labscript_profile", +] + +[tool.setuptools.package-data] +labscript_profile = ["../labscript-suite.pth"] + +[project] +name = "labscript-utils" +description = "Shared utilities for the labscript suite" +authors = [ + {name = "The labscript suite community", email = "labscriptsuite@googlegroups.com"}, +] +keywords = ["experiment control", "automation"] +license = {file = 'LICENSE.txt'} +classifiers = [ + "License :: OSI Approved :: BSD License", + "Programming Language :: Python :: 3 :: Only", +] +requires-python = ">=3.8" +dependencies = [ + "h5py>=2.9", + "numpy>=1.15", + "packaging>=20.4", + "pyqtgraph>=0.11.0rc0", + "qtutils>=4.0", + "scipy", + "zprocess>=2.18.0", + "setuptools_scm>=4.1.0", +] +dynamic = ["version"] + +[project.readme] +file = "README.md" +content-type = "text/markdown" + +[project.urls] +Homepage = "http://labscriptsuite.org/" +Documentation = "https://docs.labscriptsuite.org/" +Repository = "https://github.com/labscript-suite/labscript-utils/" +Downloads = "https://github.com/labscript-suite/labscript-utils/releases/" +Tracker = "https://github.com/labscript-suite/labscript-utils/issues/" + +[project.optional-dependencies] +docs = [ + "PyQt5", + "Sphinx==7.2.6", + "sphinx-rtd-theme==2.0.0", + "myst_parser==2.0.0", +] + +[project.scripts] +labscript-profile-create = "labscript_profile.create:create_profile_cli" diff --git a/readthedocs.yaml b/readthedocs.yaml index 35084d6..4f8b68e 100644 --- a/readthedocs.yaml +++ b/readthedocs.yaml @@ -4,6 +4,12 @@ # Required version: 2 +# Set build environment options +build: + os: ubuntu-22.04 + tools: + python: "3.11" + # Build documentation in the docs/ directory with Sphinx sphinx: builder: dirhtml @@ -15,13 +21,11 @@ formats: - pdf - epub -# Optionally set the version of Python and requirements required to build your docs +# Optionally set 
the requirements required to build your docs python: - version: 3.7 install: - method: pip path: . extra_requirements: - docs - system_packages: true \ No newline at end of file diff --git a/setup.cfg b/setup.cfg deleted file mode 100644 index 399299e..0000000 --- a/setup.cfg +++ /dev/null @@ -1,55 +0,0 @@ -[metadata] -name = labscript-utils -description = Shared utilities for the labscript suite -long_description = file: README.md -long_description_content_type = text/markdown -author = The labscript suite community -author_email = labscriptsuite@googlegroups.com -url = http://labscriptsuite.org -project_urls = - Source Code=https://github.com/labscript-suite/labscript-utils - Download=https://github.com/labscript-suite/labscript-utils/releases - Tracker=https://github.com/labscript-suite/labscript-utils/issues -keywords = experiment control automation -license = BSD -classifiers = - License :: OSI Approved :: BSD License - Programming Language :: Python :: 3 :: Only - Programming Language :: Python :: 3.6 - Programming Language :: Python :: 3.7 - Programming Language :: Python :: 3.8 - Programming Language :: Python :: 3.9 - Programming Language :: Python :: 3.10 - Programming Language :: Python :: 3.11 - -[options] -zip_safe = False -include_package_data = True -packages = labscript_utils, labscript_profile -python_requires = >=3.6 -install_requires = - importlib_metadata>=1.0 - h5py>=2.9 - numpy>=1.15 - packaging>=20.4 - pyqtgraph>=0.11.0rc0 - qtutils>=2.2.3 - scipy - setuptools_scm>=4.1.0 - zprocess>=2.18.0 - -[options.extras_require] -docs = - PyQt5 - Sphinx==4.4.0 - sphinx-rtd-theme==0.5.2 - recommonmark==0.6.0 - m2r==0.2.1 - mistune<2.0.0 - -[options.package_data] -labscript_profile = ../labscript-suite.pth - -[options.entry_points] -console_scripts = - labscript-profile-create = labscript_profile.create:create_profile diff --git a/setup.py b/setup.py index 5f958c3..6245401 100644 --- a/setup.py +++ b/setup.py @@ -33,16 +33,9 @@ def run(self): if not 
self.dry_run: self.copy_file('labscript-suite.pth', path) - -VERSION_SCHEME = { - "version_scheme": os.getenv("SCM_VERSION_SCHEME", "release-branch-semver"), - "local_scheme": os.getenv("SCM_LOCAL_SCHEME", "node-and-date"), -} - setup( - use_scm_version=VERSION_SCHEME, cmdclass={ 'develop': develop_command, 'editable_wheel': editable_wheel_command, }, -) +) \ No newline at end of file
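The `_device_registry.py` hunk replaces the long-removed `imp` module with `importlib` machinery: find a spec by path, build a module from it, and execute it without registering it in `sys.modules`. As a standalone sketch of that pattern (the folder layout, file contents, and helper name here are made up for illustration; note `importlib.util` and `importlib.machinery` each need an explicit import):

```python
import importlib.machinery
import importlib.util
import sys
import tempfile
from pathlib import Path

def run_module_from_folder(module_name, folder):
    """Locate module_name inside folder and execute it WITHOUT adding it to
    sys.modules -- we only want its side effects, as in populate_registry()."""
    spec = importlib.machinery.PathFinder.find_spec(module_name, [str(folder)])
    if spec is None:
        raise ImportError(f"{module_name} not found in {folder}")
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    return mod

# Demonstration with a throwaway register_classes-style file:
with tempfile.TemporaryDirectory() as d:
    Path(d, "register_classes.py").write_text("REGISTERED = ['my_tab']\n")
    mod = run_module_from_folder("register_classes", d)

print(mod.REGISTERED)                     # -> ['my_tab']
print("register_classes" in sys.modules)  # -> False: never imported by name
```

Because nothing is inserted into `sys.modules`, every `register_classes.py` on disk can be executed this way without inventing unique module names, which is exactly the simplification the hunk's trailing comments describe.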
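The `__version__.py` hunk wraps `setuptools_scm` in a try/except so that a missing dev-only dependency no longer breaks `import labscript_utils`. A self-contained sketch of the same fallback chain (the `detect_version` function name is mine, not from the patch):

```python
from pathlib import Path

try:
    import importlib.metadata as importlib_metadata
except ImportError:  # very old Pythons: the backport package, as in the patch
    import importlib_metadata

def detect_version(package, root):
    """Prefer a setuptools_scm-introspected version in a git checkout, then
    installed package metadata, then give up and return None."""
    scm_version = None
    if (Path(root) / ".git").is_dir():
        try:
            from setuptools_scm import get_version
            scm_version = get_version(
                root,
                version_scheme="release-branch-semver",
                local_scheme="node-and-date",
            )
        except ImportError:  # setuptools_scm absent: fall through to metadata
            pass
    if scm_version is not None:
        return scm_version
    try:
        return importlib_metadata.version(package)
    except importlib_metadata.PackageNotFoundError:
        return None

# An uninstalled package outside any git checkout yields None rather than
# raising, mirroring the patched behaviour:
print(detect_version("not-a-real-package-xyz", "/nonexistent"))
```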