-## pip_import
-
-pip_import(name, extra_pip_args, python_interpreter, python_interpreter_target, requirements, timeout)
-
-A rule for importing `requirements.txt` dependencies into Bazel.
-
-This rule imports a `requirements.txt` file and generates a new
-`requirements.bzl` file. This is used via the `WORKSPACE` pattern:
-
-```python
-pip_import(
-    name = "foo",
-    requirements = ":requirements.txt",
-)
-load("@foo//:requirements.bzl", "pip_install")
-pip_install()
-```
-
-You can then reference imported dependencies from your `BUILD` file with:
-
-```python
-load("@foo//:requirements.bzl", "requirement")
-py_library(
-    name = "bar",
-    ...
-    deps = [
-        "//my/other:dep",
-        requirement("futures"),
-        requirement("mock"),
-    ],
-)
-```
-
-Or alternatively:
-```python
-load("@foo//:requirements.bzl", "all_requirements")
-py_binary(
-    name = "baz",
-    ...
-    deps = [
-        ":foo",
-    ] + all_requirements,
-)
-```
-
-### Attributes
-
-| Attribute | Type | Description |
-| :-------- | :--- | :---------- |
-| `name` | Name; required | A unique name for this repository. |
-| `extra_pip_args` | List of strings; optional | Extra arguments to pass on to pip. Must not contain spaces. |
-| `python_interpreter` | String; optional | The command to run the Python interpreter used to invoke pip and unpack the wheels. |
-| `python_interpreter_target` | Label; optional | If you are using a custom python interpreter built by another repository rule, use this attribute to specify its BUILD target. This allows pip_import to invoke pip using the same interpreter as your toolchain. If set, takes precedence over `python_interpreter`. |
-| `requirements` | Label; required | The label of the requirements.txt file. |
-| `timeout` | Integer; optional | Timeout (in seconds) for repository fetch. |
-
-## pip3_import
-
-pip3_import(kwargs)
-
-A wrapper around pip_import that uses the `python3` system command.
-
-Use this for requirements of PY3 programs.
-
-### Parameters
-
-| Parameter | Description |
-| :-------- | :---------- |
-| `kwargs` | optional. |
-
-## pip_repositories
-
-pip_repositories()
-
-Pull in dependencies needed to use the packaging rules.
-
diff --git a/docs/precompiling.md b/docs/precompiling.md
new file mode 100644
index 0000000000..ea978cddce
--- /dev/null
+++ b/docs/precompiling.md
@@ -0,0 +1,124 @@
+# Precompiling
+
+Precompiling is compiling Python source files (`.py` files) into bytecode
+(`.pyc` files) at build time instead of runtime. Doing it at build time can
+improve performance by skipping that work at runtime.
+
+Precompiling is disabled by default, so you must enable it via flags or
+attributes.
+
+## Overhead of precompiling
+
+While precompiling helps runtime performance, it has two main costs:
+1. Increasing the size (count and disk usage) of runfiles. It approximately
+   doubles the count of the runfiles because for every `.py` file, there is also
+   a `.pyc` file. Compiled files are generally around the same size as the
+   source files, so it approximately doubles the disk usage.
+2. Precompiling requires running an extra action at build time. While
+   compiling itself isn't that expensive, the overhead can become noticeable
+   as more files need to be compiled.
+
+## Binary-level opt-in
+
+Binary-level opt-in allows enabling precompiling on a per-target basis. This is
+useful for situations such as:
+
+* Globally enabling precompiling in your `.bazelrc` isn't feasible. This may
+  be because some targets don't work with precompiling, e.g. because they're too
+  big.
+* Enabling precompiling for build tools (exec config targets) separately from
+  target-config programs.
+
+To use this approach, set the {bzl:attr}`pyc_collection` attribute on the
+binaries/tests that should or should not use precompiling, then change the
+{bzl:flag}`--precompile` default for everything else.
+
+The default for the {bzl:attr}`pyc_collection` attribute is controlled by the flag
+{bzl:obj}`--@rules_python//python/config_settings:precompile`, so you
+can use an opt-in or opt-out approach by setting its value:
+* targets must opt-out: `--@rules_python//python/config_settings:precompile=enabled`
+* targets must opt-in: `--@rules_python//python/config_settings:precompile=disabled`
+
+## Pyc-only builds
+
+A pyc-only build (aka a "sourceless" build) is one where only `.pyc` files are
+included; the source `.py` files are not.
+
+To enable this, set the
+{bzl:obj}`--@rules_python//python/config_settings:precompile_source_retention=omit_source`
+flag on the command line or the {bzl:attr}`precompile_source_retention=omit_source`
+attribute on specific targets.
+
+The advantages of pyc-only builds are:
+* Fewer total files in a binary.
+* Imports _may_ be _slightly_ faster.
+
+The disadvantages are:
+* Error messages will be less precise because the precise line and offset
+  information isn't in a pyc file.
+* pyc files are Python major-version-specific.
+
+:::{note}
+pyc files are not a form of hiding source code. They are trivial to uncompile,
+and uncompiling them can recover almost all of the original source.
+:::
+
+## Advanced precompiler customization
+
+The default implementation of the precompiler is a persistent, multiplexed,
+sandbox-aware, cancellation-enabled, json-protocol worker that uses the same
+interpreter as the target toolchain. This works well for local builds, but may
+not work as well for remote execution builds. To customize the precompiler, two
+mechanisms are available:
+
+* The exec tools toolchain allows customizing the precompiler binary used with
+  the {bzl:attr}`precompiler` attribute. Arbitrary binaries are supported.
+* The execution requirements can be customized using
+  `--@rules_python//tools/precompiler:execution_requirements`. This is a list
+  flag that can be repeated. Each entry is a `key=value` pair that is added to the
+  execution requirements of the `PyCompile` action. Note that this flag
+  is specific to the `rules_python` precompiler. If a custom binary is used,
+  this flag will have to be propagated from the custom binary using the
+  `testing.ExecutionInfo` provider; refer to the `py_interpreter_program` example.
+
+The default precompiler implementation is an asynchronous/concurrent
+implementation. If you find it has bugs or hangs, please report them. In the
+meantime, the flag `--worker_extra_flag=PyCompile=--worker_impl=serial` can
+be used to switch to a synchronous/serial implementation that may not perform
+as well, but is less likely to have issues.
+
+The `execution_requirements` keys of most relevance are:
+* `supports-workers`: `1` or `0`, to indicate if a regular persistent worker is
+  desired.
+* `supports-multiplex-workers`: `1` or `0`, to indicate if a multiplexed persistent
+  worker is desired.
+* `requires-worker-protocol`: `json` or `proto`; the `rules_python` precompiler
+  currently only supports `json`.
+* `supports-multiplex-sandboxing`: `1` or `0`, to indicate if sandboxing of the
+  worker is supported.
+* `supports-worker-cancellation`: `1` or `0`, to indicate if requests to the worker
+  can be cancelled.
+
+Note that any execution requirements values can be specified in the flag.
+
+## Known issues, caveats, and idiosyncrasies
+
+* Precompiling requires Bazel 7+ with the Pystar rule implementation enabled.
+* Mixing rules_python PyInfo with Bazel builtin PyInfo will result in pyc files
+  being dropped.
+* Precompiled files may not be used in certain cases prior to Python 3.11. This
+  occurs due to Python adding the directory of the binary's main `.py` file, which
+  causes the module to be found in the workspace source directory instead of
+  within the binary's runfiles directory (where the pyc files are). This can
+  usually be worked around by removing `sys.path[0]` (or otherwise ensuring the
+  runfiles directory comes before the repo's source directory in `sys.path`).
+* The pyc filename does not include the optimization level (e.g.,
+  `foo.cpython-39.opt-2.pyc`). This works fine (it's all bytecode), but also
+  means the interpreter `-O` argument can't be used -- doing so will cause the
+  interpreter to look for the non-existent `opt-N` named files.
+* Targets with the same source files and different exec properties will result
+  in action conflicts. This most commonly occurs when a `py_binary` and
+  a `py_library` have the same source files. To fix this, modify both targets so
+  they have the same exec properties. If this is difficult because unsupported
+  exec groups end up being passed to the Python rules, please file an issue
+  to have those exec groups added to the Python rules.
diff --git a/docs/pypi/circular-dependencies.md b/docs/pypi/circular-dependencies.md
new file mode 100644
index 0000000000..62613f489e
--- /dev/null
+++ b/docs/pypi/circular-dependencies.md
@@ -0,0 +1,82 @@
+:::{default-domain} bzl
+:::
+
+# Circular dependencies
+
+Sometimes PyPI packages contain dependency cycles. For instance, a particular
+version of `sphinx` (this is no longer the case in the latest version as of
+2024-06-02) depends on `sphinxcontrib-serializinghtml`, which in turn depends
+on `sphinx` itself. When using them as `requirement()`s, as in:
+
+```starlark
+py_binary(
+    name = "doctool",
+    ...
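+    # requirement("sphinx") resolves to the sphinx package's target in the
+    # hub repo created by pip_parse (e.g. "@pypi//sphinx").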
+    deps = [
+        requirement("sphinx"),
+    ],
+)
+```
+
+Bazel will protest because it doesn't support cycles in the build graph --
+
+```
+ERROR: .../external/pypi_sphinxcontrib_serializinghtml/BUILD.bazel:44:6: in alias rule @pypi_sphinxcontrib_serializinghtml//:pkg: cycle in dependency graph:
+    //:doctool (...)
+    @pypi//sphinxcontrib_serializinghtml:pkg (...)
+.-> @pypi_sphinxcontrib_serializinghtml//:pkg (...)
+|   @pypi_sphinxcontrib_serializinghtml//:_pkg (...)
+|   @pypi_sphinx//:pkg (...)
+|   @pypi_sphinx//:_pkg (...)
+`-- @pypi_sphinxcontrib_serializinghtml//:pkg (...)
+```
+
+The `experimental_requirement_cycles` attribute allows you to work around these
+issues by specifying groups of packages which form cycles. `pip_parse` will
+transparently fix the cycles for you and provide the cyclic dependencies
+simultaneously.
+
+```starlark
+    ...
+    experimental_requirement_cycles = {
+        "sphinx": [
+            "sphinx",
+            "sphinxcontrib-serializinghtml",
+        ]
+    },
+)
+```
+
+`pip_parse` supports fixing multiple cycles simultaneously; however, cycles must
+be distinct. `apache-airflow`, for instance, has dependency cycles with a number
+of its optional dependencies, which means those optional dependencies must all
+be a part of the `airflow` cycle. For example:
+
+```starlark
+    ...
+    experimental_requirement_cycles = {
+        "airflow": [
+            "apache-airflow",
+            "apache-airflow-providers-common-sql",
+            "apache-airflow-providers-postgres",
+            "apache-airflow-providers-sqlite",
+        ]
+    }
+)
+```
+
+Alternatively, one could resolve the cycle by removing one leg of it.
+
+For example, while `apache-airflow-providers-sqlite` is "baked into" the Airflow
+package, `apache-airflow-providers-postgres` is not and is an optional feature.
+Rather than listing `apache-airflow[postgres]` in your `requirements.txt`, which
+would expose a cycle via the extra, one could _manually_ depend on
+`apache-airflow` and `apache-airflow-providers-postgres` separately as
+requirements. Bazel rules which need only `apache-airflow` can take it as a
+dependency, and rules which explicitly want to mix in
+`apache-airflow-providers-postgres` now can.
+
+Alternatively, one could use `rules_python`'s patching features to remove one
+leg of the dependency manually, for instance, by making
+`apache-airflow-providers-postgres` not explicitly depend on `apache-airflow` or
+perhaps `apache-airflow-providers-common-sql`.
diff --git a/docs/pypi/download-workspace.md b/docs/pypi/download-workspace.md
new file mode 100644
index 0000000000..5dfb0f257a
--- /dev/null
+++ b/docs/pypi/download-workspace.md
@@ -0,0 +1,107 @@
+:::{default-domain} bzl
+:::
+
+# Download (WORKSPACE)
+
+This documentation page covers how to download PyPI dependencies in the legacy `WORKSPACE` setup.
+
+To add pip dependencies to your `WORKSPACE`, load the `pip_parse` function and
+call it to create the central external repo and individual wheel external repos.
+
+```starlark
+load("@rules_python//python:pip.bzl", "pip_parse")
+
+# Create a central repo that knows about the dependencies needed from
+# requirements_lock.txt.
+pip_parse(
+    name = "my_deps",
+    requirements_lock = "//path/to:requirements_lock.txt",
+)
+
+# Load the starlark macro, which will define your dependencies.
+load("@my_deps//:requirements.bzl", "install_deps")
+
+# Call it to define repos for your requirements.
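+# (Older pip_import-based setups exposed this macro as pip_install instead.)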
+install_deps()
+```
+
+## Interpreter selection
+
+Note that because `pip_parse` runs before Bazel decides which Python toolchain to use, it cannot
+enforce that the interpreter used to invoke `pip` matches the interpreter used to run `py_binary`
+targets. By default, `pip_parse` uses the system command `"python3"`. To override this, pass in the
+{attr}`pip_parse.python_interpreter` attribute or {attr}`pip_parse.python_interpreter_target`.
+
+You can have multiple `pip_parse`s in the same workspace. This configuration will create multiple
+external repos that have no relation to one another and may result in downloading the same wheels
+numerous times.
+
+As with any repository rule, if you would like to ensure that `pip_parse` is
+re-executed to pick up a non-hermetic change to your environment (e.g., updating
+your system `python` interpreter), you can force it to re-execute by running
+`bazel sync --only [pip_parse name]`.
+
+(per-os-arch-requirements)=
+## Requirements for a specific OS/Architecture
+
+In some cases, you may need to use different requirements files for different OS and architecture combinations.
+This is enabled via the {attr}`pip_parse.requirements_by_platform` attribute. The keys of the
+dictionary are labels of the requirements files, and the values are comma-separated lists of
+target (os, arch) tuples.
+
+For example:
+```starlark
+    # ...
+    requirements_by_platform = {
+        "requirements_linux_x86_64.txt": "linux_x86_64",
+        "requirements_osx.txt": "osx_*",
+        "requirements_linux_exotic.txt": "linux_exotic",
+        "requirements_some_platforms.txt": "linux_aarch64,windows_*",
+    },
+    # For the list of standard platforms that rules_python has toolchains for, default to
+    # the following requirements file.
+    requirements_lock = "requirements_lock.txt",
+```
+
+In case of duplicate platforms, `rules_python` will raise an error, as there has
+to be an unambiguous mapping of the requirement files to the (os, arch) tuples.
+
+An alternative way is to use per-OS requirement attributes.
+```starlark
+    # ...
+    requirements_windows = "requirements_windows.txt",
+    requirements_darwin = "requirements_darwin.txt",
+    # For the remaining platforms (which is basically only linux OS), use this file.
+    requirements_lock = "requirements_lock.txt",
+)
+```
+
+:::{note}
+If you are using a universal lock file but want to restrict the list of platforms that
+the lock file will be evaluated against, consider using the aforementioned
+`requirements_by_platform` attribute and listing the platforms explicitly.
+:::
+
+(vendoring-requirements)=
+## Vendoring the requirements.bzl file

+:::{note}
+For `bzlmod`, refer to standard `bazel vendor` usage if you want to really vendor it; otherwise,
+just use the `pip` extension as you would normally.
+
+However, be aware that there are caveats when doing so.
+:::
+
+In some cases you may not want to generate the requirements.bzl file as a repository rule
+while Bazel is fetching dependencies. For example, if you produce a reusable Bazel module
+such as a ruleset, you may want to include the `requirements.bzl` file rather than make your users
+install the `WORKSPACE` setup to generate it, see {gh-issue}`608`.
+
+This is the same workflow as Gazelle, which creates `go_repository` rules with
+[`update-repos`](https://github.com/bazelbuild/bazel-gazelle#update-repos).
+
+To do this, use the "write to source file" pattern documented in
-## py_runtime_pair
-
-py_runtime_pair(name, py2_runtime, py3_runtime)
-
-A toolchain rule for Python.
-
-This wraps up to two Python runtimes, one for Python 2 and one for Python 3.
-The rule consuming this toolchain will choose which runtime is appropriate.
-Either runtime may be omitted, in which case the resulting toolchain will be
-unusable for building Python code using that version.
-
-Usually the wrapped runtimes are declared using the `py_runtime` rule, but any
-rule returning a `PyRuntimeInfo` provider may be used.
-
-This rule returns a `platform_common.ToolchainInfo` provider with the following
-schema:
-
-```python
-platform_common.ToolchainInfo(
-    py2_runtime = <PyRuntimeInfo for Python 2, or None>,
-    py3_runtime = <PyRuntimeInfo for Python 3, or None>,
-)
-```
-
-### Attributes
-
-| Attribute | Type | Description |
-| :-------- | :--- | :---------- |
-| `name` | Name; required | A unique name for this target. |
-| `py2_runtime` | Label; optional | The runtime to use for Python 2 targets. Must have `python_version` set to `PY2`. |
-| `py3_runtime` | Label; optional | The runtime to use for Python 3 targets. Must have `python_version` set to `PY3`. |
-
-## py_binary
-
-py_binary(attrs)
-
-See the Bazel core [py_binary](https://docs.bazel.build/versions/master/be/python.html#py_binary) documentation.
-
-### Parameters
-
-| Parameter | Description |
-| :-------- | :---------- |
-| `attrs` | optional. Rule attributes. |
-
-## py_library
-
-py_library(attrs)
-
-See the Bazel core [py_library](https://docs.bazel.build/versions/master/be/python.html#py_library) documentation.
-
-### Parameters
-
-| Parameter | Description |
-| :-------- | :---------- |
-| `attrs` | optional. Rule attributes. |
-
-## py_runtime
-
-py_runtime(attrs)
-
-See the Bazel core [py_runtime](https://docs.bazel.build/versions/master/be/python.html#py_runtime) documentation.
-
-### Parameters
-
-| Parameter | Description |
-| :-------- | :---------- |
-| `attrs` | optional. Rule attributes. |
-
-## py_test
-
-py_test(attrs)
-
-See the Bazel core [py_test](https://docs.bazel.build/versions/master/be/python.html#py_test) documentation.
-
-### Parameters
-
-| Parameter | Description |
-| :-------- | :---------- |
-| `attrs` | optional. Rule attributes. |
-
-## whl_library
-
-whl_library(name, extras, python_interpreter, requirements, whl)
-
-A rule for importing `.whl` dependencies into Bazel.
-
-This rule is currently used to implement `pip_import`. It is not intended to
-work standalone, and the interface may change. See `pip_import` for proper
-usage.
-
-This rule imports a `.whl` file as a `py_library`:
-```python
-whl_library(
-    name = "foo",
-    whl = ":my-whl-file",
-    requirements = "name of pip_import rule",
-)
-```
-
-This rule defines `@foo//:pkg` as a `py_library` target.
-
-### Attributes
-
-| Attribute | Type | Description |
-| :-------- | :--- | :---------- |
-| `name` | Name; required | A unique name for this repository. |
-| `extras` | List of strings; optional | A subset of the "extras" available from this `.whl` for the `py_library` to depend on. |
-| `python_interpreter` | String; optional | The command to run the Python interpreter used when unpacking the wheel. |
-| `requirements` | String; optional | The name of the `pip_import` repository rule from which to load this `.whl`'s dependencies. |
-| `whl` | Label; required | The path to the `.whl` file. |
-
-# Package just a specific py_libraries, without their dependencies
-py_wheel(
- name = "minimal_with_py_library",
- # Package data. We're building "example_minimal_library-0.0.1-py3-none-any.whl"
- distribution = "example_minimal_library",
- python_tag = "py3",
- version = "0.0.1",
- deps = [
- "//experimental/examples/wheel/lib:module_with_data",
- "//experimental/examples/wheel/lib:simple_module",
- ],
-)
-
-# Use py_package to collect all transitive dependencies of a target,
-# selecting just the files within a specific python package.
-py_package(
- name = "example_pkg",
- # Only include these Python packages.
- packages = ["experimental.examples.wheel"],
- deps = [":main"],
-)
-
-py_wheel(
- name = "minimal_with_py_package",
- # Package data. We're building "example_minimal_package-0.0.1-py3-none-any.whl"
- distribution = "example_minimal_package",
- python_tag = "py3",
- version = "0.0.1",
- deps = [":example_pkg"],
-)
-
-""",
- attrs = _concat_dicts(
- {
- "deps": attr.label_list(
- doc = """\
-Targets to be included in the distribution.
-
-The targets to package are usually `py_library` rules or filesets (for packaging data files).
-
-Note it's usually better to package `py_library` targets and use
-`entry_points` attribute to specify `console_scripts` than to package
-`py_binary` rules. `py_binary` targets would wrap a executable script that
-tries to locate `.runfiles` directory which is not packaged in the wheel.
-""",
- ),
- "_wheelmaker": attr.label(
- executable = True,
- cfg = "host",
- default = "//experimental/tools:wheelmaker",
- ),
- },
- _distribution_attrs,
- _requirement_attrs,
- _entrypoint_attrs,
- _other_attrs,
- ),
-)
diff --git a/experimental/tools/wheelmaker.py b/experimental/tools/wheelmaker.py
deleted file mode 100644
index 18e63573e9..0000000000
--- a/experimental/tools/wheelmaker.py
+++ /dev/null
@@ -1,325 +0,0 @@
-# Copyright 2018 The Bazel Authors. All rights reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import argparse
-import base64
-import collections
-import hashlib
-import os
-import os.path
-import sys
-import zipfile
-
-
-def commonpath(path1, path2):
- ret = []
- for a, b in zip(path1.split(os.path.sep), path2.split(os.path.sep)):
- if a != b:
- break
- ret.append(a)
- return os.path.sep.join(ret)
-
-
-class WheelMaker(object):
- def __init__(self, name, version, build_tag, python_tag, abi, platform,
- outfile=None, strip_path_prefixes=None):
- self._name = name
- self._version = version
- self._build_tag = build_tag
- self._python_tag = python_tag
- self._abi = abi
- self._platform = platform
- self._outfile = outfile
- self._strip_path_prefixes = strip_path_prefixes if strip_path_prefixes is not None else []
-
- self._zipfile = None
- self._record = []
-
- def __enter__(self):
- self._zipfile = zipfile.ZipFile(self.filename(), mode="w",
- compression=zipfile.ZIP_DEFLATED)
- return self
-
- def __exit__(self, type, value, traceback):
- self._zipfile.close()
- self._zipfile = None
-
- def filename(self):
- if self._outfile:
- return self._outfile
- components = [self._name, self._version]
- if self._build_tag:
- components.append(self._build_tag)
- components += [self._python_tag, self._abi, self._platform]
- return '-'.join(components) + '.whl'
-
- def distname(self):
- return self._name + '-' + self._version
-
- def disttags(self):
- return ['-'.join([self._python_tag, self._abi, self._platform])]
-
- def distinfo_path(self, basename):
- return self.distname() + '.dist-info/' + basename
-
- def _serialize_digest(self, hash):
- # https://www.python.org/dev/peps/pep-0376/#record
- # "base64.urlsafe_b64encode(digest) with trailing = removed"
- digest = base64.urlsafe_b64encode(hash.digest())
- digest = b'sha256=' + digest.rstrip(b'=')
- return digest
-
- def add_string(self, filename, contents):
- """Add given 'contents' as filename to the distribution."""
- if sys.version_info[0] > 2 and isinstance(contents, str):
- contents = contents.encode('utf-8', 'surrogateescape')
- self._zipfile.writestr(filename, contents)
- hash = hashlib.sha256()
- hash.update(contents)
- self._add_to_record(filename, self._serialize_digest(hash),
- len(contents))
-
- def add_file(self, package_filename, real_filename):
- """Add given file to the distribution."""
-
- def arcname_from(name):
- # Always use unix path separators.
- normalized_arcname = name.replace(os.path.sep, '/')
- for prefix in self._strip_path_prefixes:
- if normalized_arcname.startswith(prefix):
- return normalized_arcname[len(prefix):]
-
- return normalized_arcname
-
- arcname = arcname_from(package_filename)
-
- self._zipfile.write(real_filename, arcname=arcname)
- # Find the hash and length
- hash = hashlib.sha256()
- size = 0
- with open(real_filename, 'rb') as f:
- while True:
- block = f.read(2 ** 20)
- if not block:
- break
- hash.update(block)
- size += len(block)
- self._add_to_record(arcname, self._serialize_digest(hash), size)
-
- def add_wheelfile(self):
- """Write WHEEL file to the distribution"""
- # TODO(pstradomski): Support non-purelib wheels.
- wheel_contents = """\
-Wheel-Version: 1.0
-Generator: bazel-wheelmaker 1.0
-Root-Is-Purelib: {}
-""".format("true" if self._platform == "any" else "false")
- for tag in self.disttags():
- wheel_contents += "Tag: %s\n" % tag
- self.add_string(self.distinfo_path('WHEEL'), wheel_contents)
-
- def add_metadata(self, extra_headers, description, classifiers, python_requires,
- requires, extra_requires):
- """Write METADATA file to the distribution."""
- # https://www.python.org/dev/peps/pep-0566/
- # https://packaging.python.org/specifications/core-metadata/
- metadata = []
- metadata.append("Metadata-Version: 2.1")
- metadata.append("Name: %s" % self._name)
- metadata.append("Version: %s" % self._version)
- metadata.extend(extra_headers)
- for classifier in classifiers:
- metadata.append("Classifier: %s" % classifier)
- if python_requires:
- metadata.append("Requires-Python: %s" % python_requires)
- for requirement in requires:
- metadata.append("Requires-Dist: %s" % requirement)
-
- extra_requires = sorted(extra_requires.items())
- for option, option_requires in extra_requires:
- metadata.append("Provides-Extra: %s" % option)
- for requirement in option_requires:
- metadata.append(
- "Requires-Dist: %s; extra == '%s'" % (requirement, option))
-
- metadata = '\n'.join(metadata) + '\n\n'
- # setuptools seems to insert UNKNOWN as description when none is
- # provided.
- metadata += description if description else "UNKNOWN"
- metadata += "\n"
- self.add_string(self.distinfo_path('METADATA'), metadata)
-
- def add_recordfile(self):
- """Write RECORD file to the distribution."""
- record_path = self.distinfo_path('RECORD')
- entries = self._record + [(record_path, b'', b'')]
- entries.sort()
- contents = b''
- for filename, digest, size in entries:
- if sys.version_info[0] > 2 and isinstance(filename, str):
- filename = filename.encode('utf-8', 'surrogateescape')
- contents += b'%s,%s,%s\n' % (filename, digest, size)
- self.add_string(record_path, contents)
-
- def _add_to_record(self, filename, hash, size):
- size = str(size).encode('ascii')
- self._record.append((filename, hash, size))
-
-
-def get_files_to_package(input_files):
- """Find files to be added to the distribution.
-
- input_files: list of pairs (package_path, real_path)
- """
- files = {}
- for package_path, real_path in input_files:
- files[package_path] = real_path
- return files
-
-
-def main():
- parser = argparse.ArgumentParser(description='Builds a python wheel')
- metadata_group = parser.add_argument_group(
- "Wheel name, version and platform")
- metadata_group.add_argument('--name', required=True,
- type=str,
- help="Name of the distribution")
- metadata_group.add_argument('--version', required=True,
- type=str,
- help="Version of the distribution")
- metadata_group.add_argument('--build_tag', type=str, default='',
- help="Optional build tag for the distribution")
- metadata_group.add_argument('--python_tag', type=str, default='py3',
- help="Python version, e.g. 'py2' or 'py3'")
- metadata_group.add_argument('--abi', type=str, default='none')
- metadata_group.add_argument('--platform', type=str, default='any',
- help="Target platform. ")
-
- output_group = parser.add_argument_group("Output file location")
- output_group.add_argument('--out', type=str, default=None,
- help="Override name of ouptut file")
-
- output_group.add_argument('--strip_path_prefix',
- type=str,
- action="append",
- default=[],
- help="Path prefix to be stripped from input package files' path. "
- "Can be supplied multiple times. "
- "Evaluated in order."
- )
-
- wheel_group = parser.add_argument_group("Wheel metadata")
- wheel_group.add_argument(
- '--header', action='append',
- help="Additional headers to be embedded in the package metadata. "
- "Can be supplied multiple times.")
- wheel_group.add_argument('--classifier', action='append',
- help="Classifiers to embed in package metadata. "
- "Can be supplied multiple times")
- wheel_group.add_argument('--python_requires',
- help="Version of python that the wheel will work with")
- wheel_group.add_argument('--description_file',
- help="Path to the file with package description")
- wheel_group.add_argument('--entry_points_file',
- help="Path to a correctly-formatted entry_points.txt file")
-
- contents_group = parser.add_argument_group("Wheel contents")
- contents_group.add_argument(
- '--input_file', action='append',
- help="'package_path;real_path' pairs listing "
- "files to be included in the wheel. "
- "Can be supplied multiple times.")
- contents_group.add_argument(
- '--input_file_list', action='append',
- help='A file that has all the input files defined as a list to avoid the long command'
- )
-
- requirements_group = parser.add_argument_group("Package requirements")
- requirements_group.add_argument(
- '--requires', type=str, action='append',
- help="List of package requirements. Can be supplied multiple times.")
- requirements_group.add_argument(
- '--extra_requires', type=str, action='append',
- help="List of optional requirements in a 'requirement;option name'. "
- "Can be supplied multiple times.")
- arguments = parser.parse_args(sys.argv[1:])
-
- if arguments.input_file:
- input_files = [i.split(';') for i in arguments.input_file]
- else:
- input_files = []
-
- if arguments.input_file_list:
- for input_file in arguments.input_file_list:
- with open(input_file) as _file:
- input_file_list = _file.read().splitlines()
- for _input_file in input_file_list:
- input_files.append(_input_file.split(';'))
-
- all_files = get_files_to_package(input_files)
- # Sort the files for reproducible order in the archive.
- all_files = sorted(all_files.items())
-
- strip_prefixes = [p for p in arguments.strip_path_prefix]
-
- with WheelMaker(name=arguments.name,
- version=arguments.version,
- build_tag=arguments.build_tag,
- python_tag=arguments.python_tag,
- abi=arguments.abi,
- platform=arguments.platform,
- outfile=arguments.out,
- strip_path_prefixes=strip_prefixes
- ) as maker:
- for package_filename, real_filename in all_files:
- maker.add_file(package_filename, real_filename)
- maker.add_wheelfile()
-
- description = None
- if arguments.description_file:
- if sys.version_info[0] == 2:
- with open(arguments.description_file,
- 'rt') as description_file:
- description = description_file.read()
- else:
- with open(arguments.description_file, 'rt',
- encoding='utf-8') as description_file:
- description = description_file.read()
-
- extra_requires = collections.defaultdict(list)
- if arguments.extra_requires:
- for extra in arguments.extra_requires:
- req, option = extra.rsplit(';', 1)
- extra_requires[option].append(req)
- classifiers = arguments.classifier or []
- python_requires = arguments.python_requires or ""
- requires = arguments.requires or []
- extra_headers = arguments.header or []
-
- maker.add_metadata(extra_headers=extra_headers,
- description=description,
- classifiers=classifiers,
- python_requires=python_requires,
- requires=requires,
- extra_requires=extra_requires)
-
- if arguments.entry_points_file:
- maker.add_file(maker.distinfo_path(
- "entry_points.txt"), arguments.entry_points_file)
-
- maker.add_recordfile()
-
-
-if __name__ == '__main__':
- main()
diff --git a/gazelle/.bazelrc b/gazelle/.bazelrc
new file mode 100644
index 0000000000..97040903a6
--- /dev/null
+++ b/gazelle/.bazelrc
@@ -0,0 +1,14 @@
+test --test_output=errors
+
+# Do NOT implicitly create empty __init__.py files in the runfiles tree.
+# By default, these are created in every directory containing Python source code
+# or shared libraries, and every parent directory of those directories,
+# excluding the repo root directory. With this flag set, we are responsible for
+# creating (possibly empty) __init__.py files and adding them to the srcs of
+# Python targets as required.
+build --incompatible_default_to_explicit_init_py
+
+# Windows makes use of runfiles for some rules
+build --enable_runfiles
+
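+# When running with --config=bazel7.x, fail if the native (pre-Starlark)
+# Python rules are used.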
+common:bazel7.x --incompatible_python_disallow_native_rules
diff --git a/gazelle/.gitignore b/gazelle/.gitignore
new file mode 100644
index 0000000000..8481c9668c
--- /dev/null
+++ b/gazelle/.gitignore
@@ -0,0 +1,12 @@
+# Bazel directories
+/bazel-*
+/bazel-bin
+/bazel-genfiles
+/bazel-out
+/bazel-testlogs
+user.bazelrc
+
+# Go/Gazelle files
+# These otherwise match patterns above
+!go.mod
+!BUILD.out
diff --git a/gazelle/BUILD.bazel b/gazelle/BUILD.bazel
new file mode 100644
index 0000000000..0938be3dfc
--- /dev/null
+++ b/gazelle/BUILD.bazel
@@ -0,0 +1,38 @@
+load("@bazel_gazelle//:def.bzl", "gazelle")
+
+# Gazelle configuration options.
+# See https://github.com/bazelbuild/bazel-gazelle#running-gazelle-with-bazel
+# gazelle:prefix github.com/bazel-contrib/rules_python/gazelle
+# gazelle:exclude bazel-out
+gazelle(
+ name = "gazelle",
+)
+
+gazelle(
+ name = "gazelle_update_repos",
+ args = [
+ "-from_file=go.mod",
+ "-to_macro=deps.bzl%go_deps",
+ "-prune",
+ ],
+ command = "update-repos",
+)
+
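+# Source files shipped in rules_python's release distribution (hence the
+# @rules_python visibility below).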
+filegroup(
+ name = "distribution",
+ srcs = [
+ ":BUILD.bazel",
+ ":MODULE.bazel",
+ ":README.md",
+ ":WORKSPACE",
+ ":def.bzl",
+ ":deps.bzl",
+ ":go.mod",
+ ":go.sum",
+ "//manifest:distribution",
+ "//modules_mapping:distribution",
+ "//python:distribution",
+ "//pythonconfig:distribution",
+ ],
+ visibility = ["@rules_python//:__pkg__"],
+)
diff --git a/gazelle/MODULE.bazel b/gazelle/MODULE.bazel
new file mode 100644
index 0000000000..51352a0ba6
--- /dev/null
+++ b/gazelle/MODULE.bazel
@@ -0,0 +1,56 @@
+module(
+ name = "rules_python_gazelle_plugin",
+ version = "0.0.0",
+ compatibility_level = 1,
+)
+
+bazel_dep(name = "bazel_skylib", version = "1.6.1")
+bazel_dep(name = "rules_python", version = "0.18.0")
+bazel_dep(name = "rules_go", version = "0.41.0", repo_name = "io_bazel_rules_go")
+bazel_dep(name = "gazelle", version = "0.33.0", repo_name = "bazel_gazelle")
+bazel_dep(name = "rules_cc", version = "0.0.16")
+
+local_path_override(
+ module_name = "rules_python",
+ path = "..",
+)
+
+go_deps = use_extension("@bazel_gazelle//:extensions.bzl", "go_deps")
+go_deps.from_file(go_mod = "//:go.mod")
+use_repo(
+ go_deps,
+ "com_github_bazelbuild_buildtools",
+ "com_github_bmatcuk_doublestar_v4",
+ "com_github_emirpasic_gods",
+ "com_github_ghodss_yaml",
+ "com_github_stretchr_testify",
+ "in_gopkg_yaml_v2",
+ "org_golang_x_sync",
+)
+
+http_archive = use_repo_rule("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
+
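+# go-tree-sitter supplies the parser the plugin uses to analyze Python
+# source files.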
+http_archive(
+ name = "com_github_smacker_go_tree_sitter",
+ build_file = "//:internal/smacker_BUILD.bazel",
+ integrity = "sha256-4AkDY4Rh5Auu9Kwzhj5XYSirMLlhmd6ClMWo/r0kmu4=",
+ strip_prefix = "go-tree-sitter-dd81d9e9be82a8cac96ed1d50c7389c5f1997c02",
+ url = "https://github.com/smacker/go-tree-sitter/archive/dd81d9e9be82a8cac96ed1d50c7389c5f1997c02.zip",
+)
+
+python_stdlib_list = use_extension("//python:extensions.bzl", "python_stdlib_list")
+use_repo(
+ python_stdlib_list,
+ "python_stdlib_list",
+)
+
+internal_dev_deps = use_extension(
+ "//:internal_dev_deps.bzl",
+ "internal_dev_deps_extension",
+ dev_dependency = True,
+)
+use_repo(
+ internal_dev_deps,
+ "django-types",
+ "pytest",
+)
diff --git a/gazelle/README.md b/gazelle/README.md
new file mode 100644
index 0000000000..df3085bb37
--- /dev/null
+++ b/gazelle/README.md
@@ -0,0 +1,840 @@
+# Python Gazelle plugin
+
+:::{note}
+The gazelle plugin docs are being migrated to our primary documentation on
+ReadTheDocs. Please see https://rules-python.readthedocs.io/gazelle/docs/index.html.
+:::
+
+
+## Example
+
+We have an example of using Gazelle with Python located [here](https://github.com/bazel-contrib/rules_python/tree/main/examples/bzlmod).
+A fully-working example without using bzlmod is in [`examples/build_file_generation`](../examples/build_file_generation).
+
+The following documentation covers using bzlmod.
+
+## Adding Gazelle to your project
+
+First, you'll need to add Gazelle to your `MODULE.bazel` file.
+Get the current version of Gazelle from the releases page: https://github.com/bazelbuild/bazel-gazelle/releases/.
+
+
+See the installation `MODULE.bazel` snippet on the Releases page:
+https://github.com/bazel-contrib/rules_python/releases in order to configure rules_python.
+
+You will also need to add a `bazel_dep` for `rules_python_gazelle_plugin`.
+
+Here is a snippet of a `MODULE.bazel` file.
+
+```starlark
+# The following stanza defines the dependency rules_python.
+bazel_dep(name = "rules_python", version = "0.22.0")
+
+# The following stanza defines the dependency rules_python_gazelle_plugin.
+# For typical setups you set the version.
+bazel_dep(name = "rules_python_gazelle_plugin", version = "0.22.0")
+
+# The following stanza defines the dependency gazelle.
+bazel_dep(name = "gazelle", version = "0.31.0", repo_name = "bazel_gazelle")
+
+# Define the python toolchain extension and declare the interpreter version.
+# (The load path below matches the pip extension used later in this snippet;
+# it may differ between rules_python versions.)
+python = use_extension("@rules_python//python:extensions.bzl", "python")
+python.toolchain(
+    name = "python3_9",
+    python_version = "3.9",
+)
+
+# Import the python repositories generated by the given module extension into the scope of the current module.
+use_repo(python, "python3_9")
+use_repo(python, "python3_9_toolchains")
+
+# Register an already-defined toolchain so that Bazel can use it during toolchain resolution.
+register_toolchains(
+ "@python3_9_toolchains//:all",
+)
+
+# Use the pip extension
+pip = use_extension("@rules_python//python:extensions.bzl", "pip")
+
+# Use the extension to call the `pip_repository` rule that invokes `pip`, with `incremental` set.
+# Accepts a locked/compiled requirements file and installs the dependencies listed within.
+# Those dependencies become available in a generated `requirements.bzl` file.
+# You can instead check this `requirements.bzl` file into your repo.
+# Because this project has different requirements for windows vs other
+# operating systems, we have requirements for each.
+pip.parse(
+ name = "pip",
+ requirements_lock = "//:requirements_lock.txt",
+ requirements_windows = "//:requirements_windows.txt",
+)
+
+# Imports the pip toolchain generated by the given module extension into the scope of the current module.
+use_repo(pip, "pip")
+```
+Next, we'll fetch metadata about your Python dependencies, so that gazelle can
+determine which package a given import statement comes from. This is provided
+by the `modules_mapping` rule. We'll make a target for consuming this
+`modules_mapping`, and writing it as a manifest file for Gazelle to read.
+This is checked into the repo for speed, as it takes some time to calculate
+in a large monorepo.
+
+Gazelle will walk up the filesystem from a Python file to find this metadata,
+looking for a file called `gazelle_python.yaml` in an ancestor folder of the Python code.
+Create an empty file with this name. It might be next to your `requirements.txt` file.
+(You can just use `touch` at this point, it just needs to exist.)
+
+To keep the metadata updated, put this in your `BUILD.bazel` file next to `gazelle_python.yaml`:
+
+```starlark
+load("@pip//:requirements.bzl", "all_whl_requirements")
+load("@rules_python_gazelle_plugin//manifest:defs.bzl", "gazelle_python_manifest")
+load("@rules_python_gazelle_plugin//modules_mapping:def.bzl", "modules_mapping")
+
+# This rule fetches the metadata for python packages we depend on. That data is
+# required for the gazelle_python_manifest rule to update our manifest file.
+modules_mapping(
+ name = "modules_map",
+ wheels = all_whl_requirements,
+)
+
+# Gazelle python extension needs a manifest file mapping from
+# an import to the installed package that provides it.
+# This macro produces two targets:
+# - //:gazelle_python_manifest.update can be used with `bazel run`
+# to recalculate the manifest
+# - //:gazelle_python_manifest.test is a test target ensuring that
+# the manifest doesn't need to be updated
+gazelle_python_manifest(
+ name = "gazelle_python_manifest",
+ modules_mapping = ":modules_map",
+ # This is what we called our `pip_parse` rule, where third-party
+ # python libraries are loaded in BUILD files.
+ pip_repository_name = "pip",
+ # This should point to wherever we declare our python dependencies
+ # (the same as what we passed to the modules_mapping rule in WORKSPACE)
+ # This argument is optional. If provided, the `.test` target is very
+ # fast because it just has to check an integrity field. If not provided,
+ # the integrity field is not added to the manifest which can help avoid
+ # merge conflicts in large repos.
+ requirements = "//:requirements_lock.txt",
+ # include_stub_packages: bool (default: False)
+ # If set to True, this flag automatically includes any corresponding type stub packages
+ # for the third-party libraries that are present and used. For example, if you have
+ # `boto3` as a dependency, and this flag is enabled, the corresponding `boto3-stubs`
+ # package will be automatically included in the BUILD file.
+ #
+ # Enabling this feature helps ensure that type hints and stubs are readily available
+ # for tools like type checkers and IDEs, improving the development experience and
+ # reducing manual overhead in managing separate stub packages.
+ include_stub_packages = True
+)
+```
+
+Finally, you create a target that you'll invoke to run the Gazelle tool
+with the rules_python extension included. This typically goes in your root
+`/BUILD.bazel` file:
+
+```starlark
+load("@bazel_gazelle//:def.bzl", "gazelle")
+
+# Our gazelle target points to the python gazelle binary.
+# This is the simple case where we only need one language supported.
+# If you also had proto, go, or other gazelle-supported languages,
+# you would also need a gazelle_binary rule.
+# See https://github.com/bazelbuild/bazel-gazelle/blob/master/extend.rst#example
+gazelle(
+ name = "gazelle",
+ gazelle = "@rules_python_gazelle_plugin//python:gazelle_binary",
+)
+```
+
+That's it, now you can finally run `bazel run //:gazelle` anytime
+you edit Python code, and it should update your `BUILD` files correctly.
+
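+
+For a rough idea of the result, given a package `src/mylib` containing
+`mylib.py` that imports `requests`, a run under the default package generation
+mode would produce something like this (the target name, visibility, and
+`@pip` repo name all depend on your configuration, so treat this as a sketch):
+
+```starlark
+py_library(
+    name = "mylib",
+    srcs = ["mylib.py"],
+    visibility = ["//:__subpackages__"],
+    deps = ["@pip//requests"],
+)
+```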
+
+### Directives
+
+You can configure the extension using directives, just like for other
+languages. These are just comments in the `BUILD.bazel` file which
+govern behavior of the extension when processing files under that
+folder.
+
+See https://github.com/bazelbuild/bazel-gazelle#directives
+for some general directives that may be useful.
+In particular, the `resolve` directive is language-specific
+and can be used with Python.
+Examples of these directives in use can be found in the
+/gazelle/testdata folder in the rules_python repo.
+
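+For example, to make imports of `my_library` resolve to a specific target, one
+might write (the import name and label here are hypothetical):
+
+```starlark
+# gazelle:resolve py my_library //third_party/python:my_library
+```
+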
+Python-specific directives are as follows:
+
+| **Directive** | **Default value** |
+|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------------|
+| `# gazelle:python_extension` | `enabled` |
+| Controls whether the Python extension is enabled or not. Sub-packages inherit this value. Can be either "enabled" or "disabled". | |
+| [`# gazelle:python_root`](#directive-python_root) | n/a |
+| Sets a Bazel package as a Python root. This is used on monorepos with multiple Python projects that don't share the top-level of the workspace as the root. See [Directive: `python_root`](#directive-python_root) below. | |
+| `# gazelle:python_manifest_file_name` | `gazelle_python.yaml` |
+| Overrides the default manifest file name. | |
+| `# gazelle:python_ignore_files` | n/a |
+| Controls the files which are ignored from the generated targets. | |
+| `# gazelle:python_ignore_dependencies` | n/a |
+| Controls the ignored dependencies from the generated targets. | |
+| `# gazelle:python_validate_import_statements` | `true` |
+| Controls whether the Python import statements should be validated. Can be "true" or "false" | |
+| `# gazelle:python_generation_mode` | `package` |
+| Controls the target generation mode. Can be "file", "package", or "project" | |
+| `# gazelle:python_generation_mode_per_file_include_init` | `false` |
+| Controls whether `__init__.py` files are included as srcs in each generated target when target generation mode is "file". Can be "true", or "false" | |
+| [`# gazelle:python_generation_mode_per_package_require_test_entry_point`](#directive-python_generation_mode_per_package_require_test_entry_point) | `true` |
+| Controls whether a file called `__test__.py` or a target called `__test__` is required to generate one test target per package in package mode. ||
+| `# gazelle:python_library_naming_convention` | `$package_name$` |
+| Controls the `py_library` naming convention. It interpolates `$package_name$` with the Bazel package name. E.g. if the Bazel package name is `foo`, setting this to `$package_name$_my_lib` would result in a generated target named `foo_my_lib`. | |
+| `# gazelle:python_binary_naming_convention` | `$package_name$_bin` |
+| Controls the `py_binary` naming convention. Follows the same interpolation rules as `python_library_naming_convention`. | |
+| `# gazelle:python_test_naming_convention` | `$package_name$_test` |
+| Controls the `py_test` naming convention. Follows the same interpolation rules as `python_library_naming_convention`. | |
+| [`# gazelle:python_proto_naming_convention`](#directive-python_proto_naming_convention) | `$proto_name$_py_pb2` |
+| Controls the `py_proto_library` naming convention. It interpolates `$proto_name$` with the proto_library rule name, minus any trailing _proto. E.g. if the proto_library name is `foo_proto`, setting this to `$proto_name$_my_lib` would render to `foo_my_lib`. | |
+| `# gazelle:resolve py ...` | n/a |
+| Instructs the plugin what target to add as a dependency to satisfy a given import statement. The syntax is `# gazelle:resolve py import-string label` where `import-string` is the symbol in the python `import` statement, and `label` is the Bazel label that Gazelle should write in `deps`. | |
+| [`# gazelle:python_default_visibility labels`](#directive-python_default_visibility)                                                                                                                                                                                             | `//$python_root$:__subpackages__` |
+| Instructs gazelle to use these visibility labels on all python targets. `labels` is a comma-separated list of labels (without spaces).                                                                                                                                            | |
+| [`# gazelle:python_visibility label`](#directive-python_visibility) | |
+| Appends additional visibility labels to each generated target. This directive can be set multiple times. | |
+| [`# gazelle:python_test_file_pattern`](#directive-python_test_file_pattern) | `*_test.py,test_*.py` |
+| Filenames matching these comma-separated `glob`s will be mapped to `py_test` targets. |
+| `# gazelle:python_label_convention` | `$distribution_name$` |
+| Defines the format of the distribution name in labels to third-party deps. Useful for using Gazelle plugin with other rules with different repository conventions (e.g. `rules_pycross`). Full label is always prepended with (pip) repository name, e.g. `@pip//numpy`. |
+| `# gazelle:python_label_normalization` | `snake_case` |
+| Controls how distribution names in labels to third-party deps are normalized. Useful for using Gazelle plugin with other rules with different label conventions (e.g. `rules_pycross` uses PEP-503). Can be "snake_case", "none", or "pep503". |
+| `# gazelle:python_experimental_allow_relative_imports` | `false` |
+| Controls whether Gazelle resolves dependencies for import statements that use paths relative to the current package. Can be "true" or "false".|
+| `# gazelle:python_generate_pyi_deps` | `false` |
+| Controls whether to generate a separate `pyi_deps` attribute for type-checking dependencies or merge them into the regular `deps` attribute. When `false` (default), type-checking dependencies are merged into `deps` for backward compatibility. When `true`, generates separate `pyi_deps`. Imports in blocks with the format `if typing.TYPE_CHECKING:`/`if TYPE_CHECKING:` and type-only stub packages (e.g. boto3-stubs) are recognized as type-checking dependencies. |
+| [`# gazelle:python_generate_proto`](#directive-python_generate_proto) | `false` |
+| Controls whether to generate a `py_proto_library` for each `proto_library` in the package. By default we load this rule from the `@protobuf` repository; use `gazelle:map_kind` if you need to load this from somewhere else. |
+| `# gazelle:python_resolve_sibling_imports` | `false` |
+| Allows absolute imports to be resolved to sibling modules (Python 2's behavior without `absolute_import`). |
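+
+As a sketch of how directives are written, they are plain comments in a
+`BUILD.bazel` file and apply to that Bazel package and its subpackages (the
+values below are illustrative only):
+
+```starlark
+# gazelle:python_extension enabled
+# gazelle:python_generation_mode file
+# gazelle:python_test_naming_convention $package_name$_tests
+```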
+
+#### Directive: `python_root`:
+
+Set this directive within the Bazel package that you want to use as the Python root.
+For example, if using a `src` dir (as recommended by the [Python Packaging User
+Guide][python-packaging-user-guide]), then set this directive in `src/BUILD.bazel`:
+
+```starlark
+# ./src/BUILD.bazel
+# Tell gazelle that our python root is the same dir as this Bazel package.
+# gazelle:python_root
+```
+
+Note that the directive does not have any arguments.
+
+Gazelle will then add the necessary `imports` attribute to all targets that it
+generates:
+
+```starlark
+# in ./src/foo/BUILD.bazel
+py_library(
+ ...
+ imports = [".."], # Gazelle adds this
+ ...
+)
+
+# in ./src/foo/bar/BUILD.bazel
+py_library(
+ ...
+ imports = ["../.."], # Gazelle adds this
+ ...
+)
+```
+
+[python-packaging-user-guide]: https://github.com/pypa/packaging.python.org/blob/4c86169a/source/tutorials/packaging-projects.rst
+
+#### Directive: `python_proto_naming_convention`:
+
+Set this directive to a string pattern to control how the generated `py_proto_library` targets are named. When generating new `py_proto_library` rules, Gazelle will replace `$proto_name$` in the pattern with the name of the `proto_library` rule, stripping out a trailing `_proto`. For example:
+
+```starlark
+# gazelle:python_generate_proto true
+# gazelle:python_proto_naming_convention my_custom_$proto_name$_pattern
+
+proto_library(
+ name = "foo_proto",
+ srcs = ["foo.proto"],
+)
+```
+
+produces the following `py_proto_library` rule:
+```starlark
+py_proto_library(
+ name = "my_custom_foo_pattern",
+ deps = [":foo_proto"],
+)
+```
+
+The default naming convention is `$proto_name$_py_pb2`, so by default in the above example Gazelle would generate `foo_py_pb2`. Any pre-existing rules are left in place and not renamed.
+
+Note that the Python library will always be imported as `foo_pb2` in Python code, regardless of the naming convention. Also note that Gazelle is currently not able to map said imports, e.g. `import foo_pb2`, to fill in `py_proto_library` targets as dependencies of other rules. See [this issue](https://github.com/bazel-contrib/rules_python/issues/1703).
+
+#### Directive: `python_default_visibility`:
+
+Instructs gazelle to use these visibility labels on all _python_ targets
+(typically `py_*`, but can be modified via the `map_kind` directive). The arg
+to this directive is a comma-separated list (without spaces) of labels.
+
+For example:
+
+```starlark
+# gazelle:python_default_visibility //:__subpackages__,//tests:__subpackages__
+```
+
+produces the following visibility attribute:
+
+```starlark
+py_library(
+ ...,
+ visibility = [
+ "//:__subpackages__",
+ "//tests:__subpackages__",
+ ],
+ ...,
+)
+```
+
+You can also inject the `python_root` value by using the exact string
+`$python_root$`. All instances of this string will be replaced by the `python_root`
+value.
+
+```starlark
+# gazelle:python_default_visibility //$python_root$:__pkg__,//foo/$python_root$/tests:__subpackages__
+
+# Assuming the "# gazelle:python_root" directive is set in ./py/src/BUILD.bazel,
+# the results will be:
+py_library(
+ ...,
+ visibility = [
+ "//foo/py/src/tests:__subpackages__", # sorted alphabetically
+ "//py/src:__pkg__",
+ ],
+ ...,
+)
+```
+
+Two special values are also accepted as an argument to the directive:
+
++ `NONE`: This removes all default visibility. Labels added by the
+ `python_visibility` directive are still included.
++ `DEFAULT`: This resets the default visibility.
+
+For example:
+
+```starlark
+# gazelle:python_default_visibility NONE
+
+py_library(
+ name = "...",
+ srcs = [...],
+)
+```
+
+```starlark
+# gazelle:python_default_visibility //foo:bar
+# gazelle:python_default_visibility DEFAULT
+
+py_library(
+ ...,
+ visibility = ["//:__subpackages__"],
+ ...,
+)
+```
+
+These special values can be useful for sub-packages.
+
+
+#### Directive: `python_visibility`:
+
+Appends additional `visibility` labels to each generated target.
+
+This directive can be set multiple times. The generated `visibility` attribute
+will include the default visibility and all labels defined by this directive.
+All labels will be ordered alphabetically.
+
+```starlark
+# ./BUILD.bazel
+# gazelle:python_visibility //tests:__pkg__
+# gazelle:python_visibility //bar:baz
+
+py_library(
+ ...
+ visibility = [
+ "//:__subpackages__", # default visibility
+ "//bar:baz",
+ "//tests:__pkg__",
+ ],
+ ...
+)
+```
+
+Child Bazel packages inherit values from parents:
+
+```starlark
+# ./bar/BUILD.bazel
+# gazelle:python_visibility //tests:__subpackages__
+
+py_library(
+ ...
+ visibility = [
+ "//:__subpackages__", # default visibility
+ "//bar:baz", # defined in ../BUILD.bazel
+ "//tests:__pkg__", # defined in ../BUILD.bazel
+ "//tests:__subpackages__", # defined in this ./BUILD.bazel
+ ],
+ ...
+)
+
+```
+
+This directive also supports the `$python_root$` placeholder that
+`# gazelle:python_default_visibility` supports.
+
+```starlark
+# gazelle:python_visibility //$python_root$/foo:bar
+
+py_library(
+ ...
+ visibility = ["//this_is_my_python_root/foo:bar"],
+ ...
+)
+```
+
+
+#### Directive: `python_test_file_pattern`:
+
+This directive adjusts which python files will be mapped to the `py_test` rule.
+
++ The default is `*_test.py,test_*.py`: both `test_*.py` and `*_test.py` files
+ will generate `py_test` targets.
++ This directive must have a value. If no value is given, an error will be raised.
++ It is recommended, though not necessary, to include the `.py` extension in
+ the `glob`s: `foo*.py,?at.py`.
++ Like most directives, it applies to the current Bazel package and all subpackages
+ until the directive is set again.
++ This directive accepts multiple `glob` patterns, separated by commas without spaces:
+
+```starlark
+# gazelle:python_test_file_pattern foo*.py,?at.py
+
+py_library(
+ name = "mylib",
+ srcs = ["mylib.py"],
+)
+
+py_test(
+ name = "foo_bar",
+ srcs = ["foo_bar.py"],
+)
+
+py_test(
+ name = "cat",
+ srcs = ["cat.py"],
+)
+
+py_test(
+ name = "hat",
+ srcs = ["hat.py"],
+)
+```
+
+
+##### Notes
+
+Resetting to the default value (such as in a subpackage) is manual. Set:
+
+```starlark
+# gazelle:python_test_file_pattern *_test.py,test_*.py
+```
+
+There currently is no way to tell gazelle that _no_ files in a package should
+be mapped to `py_test` targets (see [Issue #1826][issue-1826]). The workaround
+is to set this directive to a pattern that will never match a `.py` file, such
+as `foo.bar`:
+
+```starlark
+# No files in this package should be mapped to py_test targets.
+# gazelle:python_test_file_pattern foo.bar
+
+py_library(
+ name = "my_test",
+ srcs = ["my_test.py"],
+)
+```
+
+[issue-1826]: https://github.com/bazel-contrib/rules_python/issues/1826
+
+#### Directive: `python_generation_mode_per_package_require_test_entry_point`:
+When `# gazelle:python_generation_mode package` is set, this directive controls whether a file called `__test__.py` or a target called `__test__` (a.k.a. an entry point) is required in order to generate one test target per package. If this is set to true but no entry point is found, Gazelle will fall back to file mode and generate one test target per file. Setting this directive to false forces Gazelle to generate one test target per package even without an entry point. However, this means the `main` attribute of the `py_test` will not be set and the target will not be runnable unless either:
+1. there happens to be a file in the `srcs` with the same name as the `py_test` target, or
+2. a macro populating the `main` attribute of `py_test` is configured with `gazelle:map_kind` to replace `py_test` when Gazelle is generating Python test targets. For example, a user can provide such a macro to Gazelle:
+
+```starlark
+load("@rules_python//python:defs.bzl", _py_test="py_test")
+load("@aspect_rules_py//py:defs.bzl", "py_pytest_main")
+
+def py_test(name, main=None, **kwargs):
+ deps = kwargs.pop("deps", [])
+ if not main:
+ py_pytest_main(
+ name = "__test__",
+ deps = ["@pip_pytest//:pkg"], # change this to the pytest target in your repo.
+ )
+
+ deps.append(":__test__")
+ main = ":__test__.py"
+
+ _py_test(
+ name = name,
+ main = main,
+ deps = deps,
+ **kwargs,
+    )
+```
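+
+The `map_kind` directive mentioned above is a general Gazelle directive. As a
+sketch, if the wrapper macro above lived at `//tools/build:defs.bzl` (a
+hypothetical path), the mapping would be configured with:
+
+```starlark
+# gazelle:map_kind py_test py_test //tools/build:defs.bzl
+```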
+
+#### Directive: `python_generate_proto`:
+
+When `# gazelle:python_generate_proto true`, Gazelle will generate one
+`py_proto_library` for each `proto_library`, generating Python clients for
+protobuf in each package. By default this is turned off. Gazelle will also
+generate a load statement for the `py_proto_library`: it attempts to detect
+the configured name for the `@protobuf` / `@com_google_protobuf` repo in your
+`MODULE.bazel`, and otherwise falls back to `@com_google_protobuf` for
+compatibility with `WORKSPACE`.
+
+For example, in a package with `# gazelle:python_generate_proto true` and a
+`foo.proto`, if you have both the proto extension and the Python extension
+loaded into Gazelle, you'll get something like:
+
+```starlark
+load("@protobuf//bazel:py_proto_library.bzl", "py_proto_library")
+load("@rules_proto//proto:defs.bzl", "proto_library")
+
+# gazelle:python_generate_proto true
+
+proto_library(
+    name = "foo_proto",
+    srcs = ["foo.proto"],
+    visibility = ["//:__subpackages__"],
+)
+
+py_proto_library(
+    name = "foo_py_pb2",
+    visibility = ["//:__subpackages__"],
+    deps = [":foo_proto"],
+)
+```
+
+When `false`, Gazelle will ignore any `py_proto_library`, including previously-generated or hand-created rules.
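+
+For example, to stop Gazelle from managing Python protobuf targets in a
+subtree, set the directive to false; any existing `py_proto_library` rules,
+generated or hand-written, are then left alone:
+
+```starlark
+# gazelle:python_generate_proto false
+```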
+
+### Annotations
+
+*Annotations* refer to comments found _within Python files_ that configure how
+Gazelle acts for that particular file.
+
+Annotations have the form:
+
+```python
+# gazelle:annotation_name value
+```
+
+and can reside anywhere within a Python file where comments are valid. For example:
+
+```python
+import foo
+# gazelle:annotation_name value
+
+def bar(): # gazelle:annotation_name value
+ pass
+```
+
+The annotations are:
+
+| **Annotation** | **Default value** |
+|---------------------------------------------------------------|-------------------|
+| [`# gazelle:ignore imports`](#annotation-ignore) | N/A |
+| Tells Gazelle to ignore import statements. `imports` is a comma-separated list of imports to ignore. | |
+| [`# gazelle:include_dep targets`](#annotation-include_dep) | N/A |
+| Tells Gazelle to include a set of dependencies, even if they are not imported in a Python module. `targets` is a comma-separated list of target names to include as dependencies. | |
+| [`# gazelle:include_pytest_conftest bool`](#annotation-include_pytest_conftest) | N/A |
+| Whether or not to include a sibling `:conftest` target in the deps of a `py_test` target. Default behaviour is to include `:conftest`. | |
+
+
+#### Annotation: `ignore`
+
+This annotation accepts a comma-separated string of values. Values are names of Python
+imports that Gazelle should _not_ include in target dependencies.
+
+The annotation can be added multiple times, and all values are combined and
+de-duplicated.
+
+For `python_generation_mode = "package"`, the imports listed in `ignore`
+annotations across all files included in the generated target are removed
+from that target's `deps`.
+
+Example:
+
+```python
+import numpy # a pypi package
+
+# gazelle:ignore bar.baz.hello,foo
+import bar.baz.hello
+import foo
+
+# Ignore this import because _reasons_
+import baz # gazelle:ignore baz
+```
+
+will cause Gazelle to generate:
+
+```starlark
+deps = ["@pypi//numpy"],
+```
+
+
+#### Annotation: `include_dep`
+
+This annotation accepts a comma-separated string of values. Values _must_
+be Python targets, but _no validation is done_. If a value is not a Python
+target, building will fail with an error saying the target is missing
+mandatory providers (such as `PyInfo`).
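+
+For example (the targets listed here are illustrative placeholders):
+
+```python
+# gazelle:include_dep //foo:bar,//hello:world
+import numpy  # a pypi package
+```
+
+will cause Gazelle to generate:
+
+```starlark
+deps = [
+    "//foo:bar",
+    "//hello:world",
+    "@pypi//numpy",
+],
+```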
+<code>&lt;login&gt;</code> and <code>&lt;password&gt;</code>, which are replaced with their equivalent value
+in the netrc file for the same host name. After formatting, the result is set
+as the value for the <code>Authorization</code> field of the HTTP request.
+
+Example attribute and netrc for a http download to an oauth2 enabled API using a bearer token:
+
+<pre>
+auth_patterns = {
+    "storage.cloudprovider.com": "Bearer &lt;password&gt;"
+}
+</pre>
+
+netrc:
+
+<pre>
+machine storage.cloudprovider.com
+        password RANDOM-TOKEN
+</pre>
+
+The final HTTP request would have the following header:
+
+<pre>
+Authorization: Bearer RANDOM-TOKEN ++""" + +# AUTH_ATTRS are used within whl_library and pip bzlmod extension. +AUTH_ATTRS = { + "auth_patterns": attr.string_dict( + doc = _AUTH_PATTERN_DOC, + ), + "netrc": attr.string( + doc = "Location of the .netrc file to use for authentication", + ), +} + +def get_auth(ctx, urls, ctx_attr = None): + """Utility for retrieving netrc-based authentication parameters for repository download rules used in python_repository. + + Args: + ctx(repository_ctx or module_ctx): The extension module_ctx or + repository rule's repository_ctx object. + urls: A list of URLs from which assets will be downloaded. + ctx_attr(struct): The attributes to get the netrc from. When ctx is + repository_ctx, then we will attempt to use repository_ctx.attr + if this is not specified, otherwise we will use the specified + field. The module_ctx attributes are located in the tag classes + so it cannot be retrieved from the context. + + Returns: + dict: A map of authentication parameters by URL. + """ + + # module_ctx does not have attributes, as they are stored in tag classes. Whilst + # the correct behaviour should be to pass the `attr` to the + ctx_attr = ctx_attr or getattr(ctx, "attr", None) + ctx_attr = struct( + netrc = getattr(ctx_attr, "netrc", None), + auth_patterns = getattr(ctx_attr, "auth_patterns", ""), + ) + + if ctx_attr.netrc: + netrc = read_netrc(ctx, ctx_attr.netrc) + elif "NETRC" in ctx.os.environ: + # This can be used on newer bazel versions + if hasattr(ctx, "getenv"): + netrc = read_netrc(ctx, ctx.getenv("NETRC")) + else: + netrc = read_netrc(ctx, ctx.os.environ["NETRC"]) + else: + netrc = read_user_netrc(ctx) + + return use_netrc(netrc, urls, ctx_attr.auth_patterns) diff --git a/python/private/builders.bzl b/python/private/builders.bzl new file mode 100644 index 0000000000..54d46c2af2 --- /dev/null +++ b/python/private/builders.bzl @@ -0,0 +1,197 @@ +# Copyright 2024 The Bazel Authors. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +"""Builders to make building complex objects easier.""" + +load("@bazel_skylib//lib:types.bzl", "types") + +def _DepsetBuilder(order = None): + """Create a builder for a depset. + + Args: + order: {type}`str | None` The order to initialize the depset to, if any. + + Returns: + {type}`DepsetBuilder` + """ + + # buildifier: disable=uninitialized + self = struct( + _order = [order], + add = lambda *a, **k: _DepsetBuilder_add(self, *a, **k), + build = lambda *a, **k: _DepsetBuilder_build(self, *a, **k), + direct = [], + get_order = lambda *a, **k: _DepsetBuilder_get_order(self, *a, **k), + set_order = lambda *a, **k: _DepsetBuilder_set_order(self, *a, **k), + transitive = [], + ) + return self + +def _DepsetBuilder_add(self, *values): + """Add value to the depset. + + Args: + self: {type}`DepsetBuilder` implicitly added. + *values: {type}`depset | list | object` Values to add to the depset. + The values can be a depset, the non-depset value to add, or + a list of such values to add. 
+ + Returns: + {type}`DepsetBuilder` + """ + for value in values: + if types.is_list(value): + for sub_value in value: + if types.is_depset(sub_value): + self.transitive.append(sub_value) + else: + self.direct.append(sub_value) + elif types.is_depset(value): + self.transitive.append(value) + else: + self.direct.append(value) + return self + +def _DepsetBuilder_set_order(self, order): + """Sets the order to use. + + Args: + self: {type}`DepsetBuilder` implicitly added. + order: {type}`str` One of the {obj}`depset` `order` values. + + Returns: + {type}`DepsetBuilder` + """ + self._order[0] = order + return self + +def _DepsetBuilder_get_order(self): + """Gets the depset order that will be used. + + Args: + self: {type}`DepsetBuilder` implicitly added. + + Returns: + {type}`str | None` If not previously set, `None` is returned. + """ + return self._order[0] + +def _DepsetBuilder_build(self): + """Creates a {obj}`depset` from the accumulated values. + + Args: + self: {type}`DepsetBuilder` implicitly added. + + Returns: + {type}`depset` + """ + if not self.direct and len(self.transitive) == 1 and self._order[0] == None: + return self.transitive[0] + else: + kwargs = {} + if self._order[0] != None: + kwargs["order"] = self._order[0] + return depset(direct = self.direct, transitive = self.transitive, **kwargs) + +def _RunfilesBuilder(): + """Creates a `RunfilesBuilder`. + + Returns: + {type}`RunfilesBuilder` + """ + + # buildifier: disable=uninitialized + self = struct( + add = lambda *a, **k: _RunfilesBuilder_add(self, *a, **k), + add_targets = lambda *a, **k: _RunfilesBuilder_add_targets(self, *a, **k), + build = lambda *a, **k: _RunfilesBuilder_build(self, *a, **k), + files = _DepsetBuilder(), + root_symlinks = {}, + runfiles = [], + symlinks = {}, + ) + return self + +def _RunfilesBuilder_add(self, *values): + """Adds a value to the runfiles. + + Args: + self: {type}`RunfilesBuilder` implicitly added. + *values: {type}`File | runfiles | list[File] | depset[File] | list[runfiles]` + The values to add. + + Returns: + {type}`RunfilesBuilder` + """ + for value in values: + if types.is_list(value): + for sub_value in value: + _RunfilesBuilder_add_internal(self, sub_value) + else: + _RunfilesBuilder_add_internal(self, value) + return self + +def _RunfilesBuilder_add_targets(self, targets): + """Adds runfiles from targets + + Args: + self: {type}`RunfilesBuilder` implicitly added. + targets: {type}`list[Target]` targets whose default runfiles + to add. + + Returns: + {type}`RunfilesBuilder` + """ + for t in targets: + self.runfiles.append(t[DefaultInfo].default_runfiles) + return self + +def _RunfilesBuilder_add_internal(self, value): + if _is_file(value): + self.files.add(value) + elif types.is_depset(value): + self.files.add(value) + elif _is_runfiles(value): + self.runfiles.append(value) + else: + fail("Unhandled value: type {}: {}".format(type(value), value)) + +def _RunfilesBuilder_build(self, ctx, **kwargs): + """Creates a {obj}`runfiles` from the accumulated values. + + Args: + self: {type}`RunfilesBuilder` implicitly added. + ctx: {type}`ctx` The rule context to use to create the runfiles object. + **kwargs: additional args to pass along to {obj}`ctx.runfiles`. 
+ + Returns: + {type}`runfiles` + """ + return ctx.runfiles( + transitive_files = self.files.build(), + symlinks = self.symlinks, + root_symlinks = self.root_symlinks, + **kwargs + ).merge_all(self.runfiles) + +# Skylib's types module doesn't have is_file, so roll our own +def _is_file(value): + return type(value) == "File" + +def _is_runfiles(value): + return type(value) == "runfiles" + +builders = struct( + DepsetBuilder = _DepsetBuilder, + RunfilesBuilder = _RunfilesBuilder, +) diff --git a/python/private/builders_util.bzl b/python/private/builders_util.bzl new file mode 100644 index 0000000000..139084f79a --- /dev/null +++ b/python/private/builders_util.bzl @@ -0,0 +1,116 @@ +# Copyright 2025 The Bazel Authors. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +"""Utilities for builders.""" + +load("@bazel_skylib//lib:types.bzl", "types") + +def to_label_maybe(value): + """Converts `value` to a `Label`, maybe. + + The "maybe" qualification is because invalid values for `Label()` + are returned as-is (e.g. None, or special values that might be + used with e.g. the `default` attribute arg). + + Args: + value: {type}`str | Label | None | object` the value to turn into a label, + or return as-is. + + Returns: + {type}`Label | input_value` + """ + if value == None: + return None + if is_label(value): + return value + if types.is_string(value): + return Label(value) + return value + +def is_label(obj): + """Tell if an object is a `Label`.""" + return type(obj) == "Label" + +def kwargs_set_default_ignore_none(kwargs, key, default): + """Normalize None/missing to `default`.""" + existing = kwargs.get(key) + if existing == None: + kwargs[key] = default + +def kwargs_set_default_list(kwargs, key): + """Normalizes None/missing to list.""" + existing = kwargs.get(key) + if existing == None: + kwargs[key] = [] + +def kwargs_set_default_dict(kwargs, key): + """Normalizes None/missing to list.""" + existing = kwargs.get(key) + if existing == None: + kwargs[key] = {} + +def kwargs_set_default_doc(kwargs): + """Sets the `doc` arg default.""" + existing = kwargs.get("doc") + if existing == None: + kwargs["doc"] = "" + +def kwargs_set_default_mandatory(kwargs): + """Sets `False` as the `mandatory` arg default.""" + existing = kwargs.get("mandatory") + if existing == None: + kwargs["mandatory"] = False + +def kwargs_getter(kwargs, key): + """Create a function to get `key` from `kwargs`.""" + return lambda: kwargs.get(key) + +def kwargs_setter(kwargs, key): + """Create a function to set `key` in `kwargs`.""" + + def setter(v): + kwargs[key] = v + + return setter + +def kwargs_getter_doc(kwargs): + """Creates a `kwargs_getter` for the `doc` key.""" + return kwargs_getter(kwargs, "doc") + +def kwargs_setter_doc(kwargs): + """Creates a `kwargs_setter` for the `doc` key.""" + return kwargs_setter(kwargs, "doc") + +def kwargs_getter_mandatory(kwargs): + """Creates a `kwargs_getter` for the `mandatory` key.""" + return kwargs_getter(kwargs, "mandatory") + +def kwargs_setter_mandatory(kwargs): + 
"""Creates a `kwargs_setter` for the `mandatory` key.""" + return kwargs_setter(kwargs, "mandatory") + +def list_add_unique(add_to, others): + """Bulk add values to a list if not already present. + + Args: + add_to: {type}`list[T]` the list to add values to. It is modified + in-place. + others: {type}`collection[collection[T]]` collection of collections of + the values to add. + """ + existing = {v: None for v in add_to} + for values in others: + for value in values: + if value not in existing: + add_to.append(value) diff --git a/python/private/bzlmod_enabled.bzl b/python/private/bzlmod_enabled.bzl new file mode 100644 index 0000000000..84839981a0 --- /dev/null +++ b/python/private/bzlmod_enabled.bzl @@ -0,0 +1,18 @@ +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +"""Variable to check if bzlmod is enabled""" + +# When bzlmod is enabled, canonical repos names have @@ in them, while under +# workspace builds, there is never a @@ in labels. +BZLMOD_ENABLED = "@@" in str(Label("//:unused")) diff --git a/examples/legacy_pip_import/helloworld/helloworld.py b/python/private/cc_helper.bzl similarity index 58% rename from examples/legacy_pip_import/helloworld/helloworld.py rename to python/private/cc_helper.bzl index b629e80f28..552b42eae8 100644 --- a/examples/legacy_pip_import/helloworld/helloworld.py +++ b/python/private/cc_helper.bzl @@ -1,4 +1,4 @@ -# Copyright 2017 The Bazel Authors. All rights reserved. +# Copyright 2023 The Bazel Authors. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. @@ -11,19 +11,13 @@ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. +"""PYTHON RULE IMPLEMENTATION ONLY: Do not use outside of the rule implementations and their tests. -from concurrent import futures +Adapter for accessing Bazel's internal cc_helper. +These may change at any time and are closely coupled to the rule implementation. +""" -class HelloWorld(object): - def __init__(self): - self._threadpool = futures.ThreadPoolExecutor(max_workers=5) +load(":py_internal.bzl", "py_internal") - def SayHello(self): - print("Hello World") - - def SayHelloAsync(self): - self._threadpool.submit(self.SayHello) - - def Stop(self): - self._threadpool.shutdown(wait = True) +cc_helper = getattr(py_internal, "cc_helper", None) diff --git a/python/private/common.bzl b/python/private/common.bzl new file mode 100644 index 0000000000..96f8ebeab4 --- /dev/null +++ b/python/private/common.bzl @@ -0,0 +1,530 @@ +# Copyright 2022 The Bazel Authors. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +"""Various things common to rule implementations.""" + +load("@bazel_skylib//lib:paths.bzl", "paths") +load("@rules_cc//cc/common:cc_common.bzl", "cc_common") +load("@rules_cc//cc/common:cc_info.bzl", "CcInfo") +load(":cc_helper.bzl", "cc_helper") +load(":py_cc_link_params_info.bzl", "PyCcLinkParamsInfo") +load(":py_info.bzl", "PyInfo", "PyInfoBuilder") +load(":py_internal.bzl", "py_internal") +load(":reexports.bzl", "BuiltinPyInfo") + +_testing = testing +_platform_common = platform_common +_coverage_common = coverage_common +PackageSpecificationInfo = getattr(py_internal, "PackageSpecificationInfo", None) + +# Extensions without the dot +_PYTHON_SOURCE_EXTENSIONS = ["py"] + +# Extensions that mean a file is relevant to Python +PYTHON_FILE_EXTENSIONS = [ + "dll", # Python C modules, Windows specific + "dylib", # Python C modules, Mac specific + "py", + "pyc", + "pyi", + "so", # Python C modules, usually Linux +] + +def create_binary_semantics_struct( + *, + create_executable, + get_cc_details_for_binary, + get_central_uncachable_version_file, + get_coverage_deps, + get_debugger_deps, + get_extra_common_runfiles_for_binary, + get_extra_providers, + get_extra_write_build_data_env, + get_interpreter_path, + get_imports, + get_native_deps_dso_name, + get_native_deps_user_link_flags, + get_stamp_flag, + maybe_precompile, + should_build_native_deps_dso, + should_create_init_files, + should_include_build_data): + """Helper to ensure a semantics struct has all necessary fields. + + Call this instead of a raw call to `struct(...)`; it'll help ensure all + the necessary functions are being correctly provided. + + Args: + create_executable: Callable; creates a binary's executable output. See + py_executable.bzl#py_executable_base_impl for details. + get_cc_details_for_binary: Callable that returns a `CcDetails` struct; see + `create_cc_detail_struct`. + get_central_uncachable_version_file: Callable that returns an optional + Artifact; this artifact is special: it is never cached and is a copy + of `ctx.version_file`; see py_builtins.copy_without_caching + get_coverage_deps: Callable that returns a list of Targets for making + coverage work; only called if coverage is enabled. + get_debugger_deps: Callable that returns a list of Targets that provide + custom debugger support; only called for target-configuration. + get_extra_common_runfiles_for_binary: Callable that returns a runfiles + object of extra runfiles a binary should include. + get_extra_providers: Callable that returns extra providers; see + py_executable.bzl#_create_providers for details. + get_extra_write_build_data_env: Callable that returns a dict[str, str] + of additional environment variable to pass to build data generation. + get_interpreter_path: Callable that returns an optional string, which is + the path to the Python interpreter to use for running the binary. + get_imports: Callable that returns a list of the target's import + paths (from the `imports` attribute, so just the target's own import + path strings, not from dependencies). 
+ get_native_deps_dso_name: Callable that returns a string, which is the + basename (with extension) of the native deps DSO library. + get_native_deps_user_link_flags: Callable that returns a list of strings, + which are any extra linker flags to pass onto the native deps DSO + linking action. + get_stamp_flag: Callable that returns bool of if the --stamp flag was + enabled or not. + maybe_precompile: Callable that may optional precompile the input `.py` + sources and returns the full set of desired outputs derived from + the source files (e.g., both py and pyc, only one of them, etc). + should_build_native_deps_dso: Callable that returns bool; True if + building a native deps DSO is supported, False if not. + should_create_init_files: Callable that returns bool; True if + `__init__.py` files should be generated, False if not. + should_include_build_data: Callable that returns bool; True if + build data should be generated, False if not. + Returns: + A "BinarySemantics" struct. + """ + return struct( + # keep-sorted + create_executable = create_executable, + get_cc_details_for_binary = get_cc_details_for_binary, + get_central_uncachable_version_file = get_central_uncachable_version_file, + get_coverage_deps = get_coverage_deps, + get_debugger_deps = get_debugger_deps, + get_extra_common_runfiles_for_binary = get_extra_common_runfiles_for_binary, + get_extra_providers = get_extra_providers, + get_extra_write_build_data_env = get_extra_write_build_data_env, + get_imports = get_imports, + get_interpreter_path = get_interpreter_path, + get_native_deps_dso_name = get_native_deps_dso_name, + get_native_deps_user_link_flags = get_native_deps_user_link_flags, + get_stamp_flag = get_stamp_flag, + maybe_precompile = maybe_precompile, + should_build_native_deps_dso = should_build_native_deps_dso, + should_create_init_files = should_create_init_files, + should_include_build_data = should_include_build_data, + ) + +def create_library_semantics_struct( + *, + get_cc_info_for_library, + get_imports, + maybe_precompile): + """Create a `LibrarySemantics` struct. + + Call this instead of a raw call to `struct(...)`; it'll help ensure all + the necessary functions are being correctly provided. + + Args: + get_cc_info_for_library: Callable that returns a CcInfo for the library; + see py_library_impl for arg details. + get_imports: Callable; see create_binary_semantics_struct. + maybe_precompile: Callable; see create_binary_semantics_struct. + Returns: + a `LibrarySemantics` struct. + """ + return struct( + # keep sorted + get_cc_info_for_library = get_cc_info_for_library, + get_imports = get_imports, + maybe_precompile = maybe_precompile, + ) + +def create_cc_details_struct( + *, + cc_info_for_propagating, + cc_info_for_self_link, + cc_info_with_extra_link_time_libraries, + extra_runfiles, + cc_toolchain, + feature_config, + **kwargs): + """Creates a CcDetails struct. + + Args: + cc_info_for_propagating: CcInfo that is propagated out of the target + by returning it within a PyCcLinkParamsProvider object. + cc_info_for_self_link: CcInfo that is used when linking for the + binary (or its native deps DSO) itself. This may include extra + information that isn't propagating (e.g. a custom malloc) + cc_info_with_extra_link_time_libraries: CcInfo of extra link time + libraries that MUST come after `cc_info_for_self_link` (or possibly + always last; not entirely clear) when passed to + `link.linking_contexts`. 
+ extra_runfiles: runfiles of extra files needed at runtime, usually as + part of `cc_info_with_extra_link_time_libraries`; should be added to + runfiles. + cc_toolchain: CcToolchain that should be used when building. + feature_config: struct from cc_configure_features(); see + //python/private:py_executable.bzl%cc_configure_features. + **kwargs: Additional keys/values to set in the returned struct. This is to + facilitate extensions with less patching. Any added fields should + pick names that are unlikely to collide if the CcDetails API has + additional fields added. + + Returns: + A `CcDetails` struct. + """ + return struct( + cc_info_for_propagating = cc_info_for_propagating, + cc_info_for_self_link = cc_info_for_self_link, + cc_info_with_extra_link_time_libraries = cc_info_with_extra_link_time_libraries, + extra_runfiles = extra_runfiles, + cc_toolchain = cc_toolchain, + feature_config = feature_config, + **kwargs + ) + +def create_executable_result_struct(*, extra_files_to_build, output_groups, extra_runfiles = None): + """Creates a `CreateExecutableResult` struct. + + This is the return value type of the semantics create_executable function. + + Args: + extra_files_to_build: depset of File; additional files that should be + included as default outputs. + output_groups: dict[str, depset[File]]; additional output groups that + should be returned. + extra_runfiles: A runfiles object of additional runfiles to include. + + Returns: + A `CreateExecutableResult` struct. + """ + return struct( + extra_files_to_build = extra_files_to_build, + output_groups = output_groups, + extra_runfiles = extra_runfiles, + ) + +def csv(values): + """Convert a list of strings to comma separated value string.""" + return ", ".join(sorted(values)) + +def filter_to_py_srcs(srcs): + """Filters .py files from the given list of files""" + + # TODO(b/203567235): Get the set of recognized extensions from + # elsewhere, as there may be others. e.g. Bazel recognizes .py3 + # as a valid extension. + return [f for f in srcs if f.extension == "py"] + +def collect_cc_info(ctx, extra_deps = []): + """Collect C++ information from dependencies for Bazel. + + Args: + ctx: Rule ctx; must have `deps` attribute. + extra_deps: list of Target to also collect C+ information from. + + Returns: + CcInfo provider of merged information. + """ + deps = ctx.attr.deps + if extra_deps: + deps = list(deps) + deps.extend(extra_deps) + cc_infos = [] + for dep in deps: + if CcInfo in dep: + cc_infos.append(dep[CcInfo]) + + if PyCcLinkParamsInfo in dep: + cc_infos.append(dep[PyCcLinkParamsInfo].cc_info) + + return cc_common.merge_cc_infos(cc_infos = cc_infos) + +def collect_imports(ctx, semantics): + """Collect the direct and transitive `imports` strings. + + Args: + ctx: {type}`ctx` the current target ctx + semantics: semantics object for fetching direct imports. + + Returns: + {type}`depset[str]` of import paths + """ + transitive = [] + for dep in ctx.attr.deps: + if PyInfo in dep: + transitive.append(dep[PyInfo].imports) + if BuiltinPyInfo != None and BuiltinPyInfo in dep: + transitive.append(dep[BuiltinPyInfo].imports) + return depset(direct = semantics.get_imports(ctx), transitive = transitive) + +def get_imports(ctx): + """Gets the imports from a rule's `imports` attribute. + + See create_binary_semantics_struct for details about this function. + + Args: + ctx: Rule ctx. + + Returns: + List of strings. 
+ """ + prefix = "{}/{}".format( + ctx.workspace_name, + py_internal.get_label_repo_runfiles_path(ctx.label), + ) + result = [] + for import_str in ctx.attr.imports: + import_str = ctx.expand_make_variables("imports", import_str, {}) + if import_str.startswith("/"): + continue + + # To prevent "escaping" out of the runfiles tree, we normalize + # the path and ensure it doesn't have up-level references. + import_path = paths.normalize("{}/{}".format(prefix, import_str)) + if import_path.startswith("../") or import_path == "..": + fail("Path '{}' references a path above the execution root".format( + import_str, + )) + result.append(import_path) + return result + +def collect_runfiles(ctx, files = depset()): + """Collects the necessary files from the rule's context. + + This presumes the ctx is for a py_binary, py_test, or py_library rule. + + Args: + ctx: rule ctx + files: depset of extra files to include in the runfiles. + Returns: + runfiles necessary for the ctx's target. + """ + return ctx.runfiles( + transitive_files = files, + # This little arg carries a lot of weight, but because Starlark doesn't + # have a way to identify if a target is just a File, the equivalent + # logic can't be re-implemented in pure-Starlark. + # + # Under the hood, it calls the Java `Runfiles#addRunfiles(ctx, + # DEFAULT_RUNFILES)` method, which is the what the Java implementation + # of the Python rules originally did, and the details of how that method + # works have become relied on in various ways. Specifically, what it + # does is visit the srcs, deps, and data attributes in the following + # ways: + # + # For each target in the "data" attribute... + # If the target is a File, then add that file to the runfiles. + # Otherwise, add the target's **data runfiles** to the runfiles. + # + # Note that, contrary to best practice, the default outputs of the + # targets in `data` are *not* added, nor are the default runfiles. + # + # This ends up being important for several reasons, some of which are + # specific to Google-internal features of the rules. + # * For Python executables, we have to use `data_runfiles` to avoid + # conflicts for the build data files. Such files have + # target-specific content, but uses a fixed location, so if a + # binary has another binary in `data`, and both try to specify a + # file for that file path, then a warning is printed and an + # arbitrary one will be used. + # * For rules with _entirely_ different sets of files in data runfiles + # vs default runfiles vs default outputs. For example, + # proto_library: documented behavior of this rule is that putting it + # in the `data` attribute will cause the transitive closure of + # `.proto` source files to be included. This set of sources is only + # in the `data_runfiles` (`default_runfiles` is empty). + # * For rules with a _subset_ of files in data runfiles. For example, + # a certain Google rule used for packaging arbitrary binaries will + # generate multiple versions of a binary (e.g. different archs, + # stripped vs un-stripped, etc) in its default outputs, but only + # one of them in the runfiles; this helps avoid large, unused + # binaries contributing to remote executor input limits. + # + # Unfortunately, the above behavior also results in surprising behavior + # in some cases. For example, simple custom rules that only return their + # files in their default outputs won't have their files included. Such + # cases must either return their files in runfiles, or use `filegroup()` + # which will do so for them. 
+ # + # For each target in "srcs" and "deps"... + # Add the default runfiles of the target to the runfiles. While this + # is desirable behavior, it also ends up letting a `py_library` + # be put in `srcs` and still mostly work. + # TODO(b/224640180): Reject py_library et al rules in srcs. + collect_default = True, + ) + +def create_py_info( + ctx, + *, + original_sources, + required_py_files, + required_pyc_files, + implicit_pyc_files, + implicit_pyc_source_files, + imports, + venv_symlinks = []): + """Create PyInfo provider. + + Args: + ctx: rule ctx. + original_sources: `depset[File]`; the original input sources from `srcs` + required_py_files: `depset[File]`; the direct, `.py` sources for the + target that **must** be included by downstream targets. This should + only be Python source files. It should not include pyc files. + required_pyc_files: `depset[File]`; the direct `.pyc` files this target + produces. + implicit_pyc_files: `depset[File]` pyc files that are only used if pyc + collection is enabled. + implicit_pyc_source_files: `depset[File]` source files for implicit pyc + files that are used when the implicit pyc files are not. + implicit_pyc_files: {type}`depset[File]` Implicitly generated pyc files + that a binary can choose to include. + imports: depset of strings; the import path values to propagate. + venv_symlinks: {type}`list[VenvSymlinkEntry]` instances for + symlinks to create in the consuming binary's venv. + + Returns: + A tuple of the PyInfo instance and a depset of the + transitive sources collected from dependencies (the latter is only + necessary for deprecated extra actions support). + """ + py_info = PyInfoBuilder.new() + py_info.venv_symlinks.add(venv_symlinks) + py_info.direct_original_sources.add(original_sources) + py_info.direct_pyc_files.add(required_pyc_files) + py_info.direct_pyi_files.add(ctx.files.pyi_srcs) + py_info.transitive_original_sources.add(original_sources) + py_info.transitive_pyc_files.add(required_pyc_files) + py_info.transitive_pyi_files.add(ctx.files.pyi_srcs) + py_info.transitive_implicit_pyc_files.add(implicit_pyc_files) + py_info.transitive_implicit_pyc_source_files.add(implicit_pyc_source_files) + py_info.imports.add(imports) + py_info.merge_has_py2_only_sources(ctx.attr.srcs_version in ("PY2", "PY2ONLY")) + py_info.merge_has_py3_only_sources(ctx.attr.srcs_version in ("PY3", "PY3ONLY")) + + for target in ctx.attr.deps: + # PyInfo may not be present e.g. cc_library rules. + if PyInfo in target or (BuiltinPyInfo != None and BuiltinPyInfo in target): + py_info.merge(_get_py_info(target)) + else: + # TODO(b/228692666): Remove this once non-PyInfo targets are no + # longer supported in `deps`. + files = target[DefaultInfo].files.to_list() + for f in files: + if f.extension == "py": + py_info.transitive_sources.add(f) + py_info.merge_uses_shared_libraries(cc_helper.is_valid_shared_library_artifact(f)) + for target in ctx.attr.pyi_deps: + # PyInfo may not be present e.g. cc_library rules. + if PyInfo in target or (BuiltinPyInfo != None and BuiltinPyInfo in target): + py_info.merge(_get_py_info(target)) + + deps_transitive_sources = py_info.transitive_sources.build() + py_info.transitive_sources.add(required_py_files) + + # We only look at data to calculate uses_shared_libraries, if it's already + # true, then we don't need to waste time looping over it. 
+ if not py_info.get_uses_shared_libraries(): + # Similar to the above, except we only calculate uses_shared_libraries + for target in ctx.attr.data: + # TODO(b/234730058): Remove checking for PyInfo in data once depot + # cleaned up. + if PyInfo in target or (BuiltinPyInfo != None and BuiltinPyInfo in target): + info = _get_py_info(target) + py_info.merge_uses_shared_libraries(info.uses_shared_libraries) + else: + files = target[DefaultInfo].files.to_list() + for f in files: + py_info.merge_uses_shared_libraries(cc_helper.is_valid_shared_library_artifact(f)) + if py_info.get_uses_shared_libraries(): + break + if py_info.get_uses_shared_libraries(): + break + + return py_info.build(), deps_transitive_sources, py_info.build_builtin_py_info() + +def _get_py_info(target): + return target[PyInfo] if PyInfo in target or BuiltinPyInfo == None else target[BuiltinPyInfo] + +def create_instrumented_files_info(ctx): + return _coverage_common.instrumented_files_info( + ctx, + source_attributes = ["srcs"], + dependency_attributes = ["deps", "data"], + extensions = _PYTHON_SOURCE_EXTENSIONS, + ) + +def create_output_group_info(transitive_sources, extra_groups): + return OutputGroupInfo( + compilation_prerequisites_INTERNAL_ = transitive_sources, + compilation_outputs = transitive_sources, + **extra_groups + ) + +def maybe_add_test_execution_info(providers, ctx): + """Adds ExecutionInfo, if necessary for proper test execution. + + Args: + providers: Mutable list of providers; may have ExecutionInfo + provider appended. + ctx: Rule ctx. + """ + + # When built for Apple platforms, require the execution to be on a Mac. + # TODO(b/176993122): Remove when bazel automatically knows to run on darwin. + if target_platform_has_any_constraint(ctx, ctx.attr._apple_constraints): + providers.append(_testing.ExecutionInfo({"requires-darwin": ""})) + +_BOOL_TYPE = type(True) + +def is_bool(v): + return type(v) == _BOOL_TYPE + +def target_platform_has_any_constraint(ctx, constraints): + """Check if target platform has any of a list of constraints. + + Args: + ctx: rule context. + constraints: label_list of constraints. + + Returns: + True if target platform has at least one of the constraints. + """ + for constraint in constraints: + constraint_value = constraint[_platform_common.ConstraintValueInfo] + if ctx.target_platform_has_constraint(constraint_value): + return True + return False + +def runfiles_root_path(ctx, short_path): + """Compute a runfiles-root relative path from `File.short_path` + + Args: + ctx: current target ctx + short_path: str, a main-repo relative path from `File.short_path` + + Returns: + {type}`str`, a runflies-root relative path + """ + + # The ../ comes from short_path is for files in other repos. + if short_path.startswith("../"): + return short_path[3:] + else: + return "{}/{}".format(ctx.workspace_name, short_path) diff --git a/python/private/common/py_binary_rule_bazel.bzl b/python/private/common/py_binary_rule_bazel.bzl new file mode 100644 index 0000000000..7858411963 --- /dev/null +++ b/python/private/common/py_binary_rule_bazel.bzl @@ -0,0 +1,6 @@ +"""Stub file for Bazel docs to link to. + +The Bazel docs link to this file, but the implementation was moved. 
+ +Please see: https://rules-python.readthedocs.io/en/latest/api/rules_python/python/defs.html#py_binary +""" diff --git a/python/private/common/py_library_rule_bazel.bzl b/python/private/common/py_library_rule_bazel.bzl new file mode 100644 index 0000000000..be631c9087 --- /dev/null +++ b/python/private/common/py_library_rule_bazel.bzl @@ -0,0 +1,6 @@ +"""Stub file for Bazel docs to link to. + +The Bazel docs link to this file, but the implementation was moved. + +Please see: https://rules-python.readthedocs.io/en/latest/api/rules_python/python/defs.html#py_library +""" diff --git a/python/private/common/py_runtime_rule.bzl b/python/private/common/py_runtime_rule.bzl new file mode 100644 index 0000000000..cadb48c704 --- /dev/null +++ b/python/private/common/py_runtime_rule.bzl @@ -0,0 +1,6 @@ +"""Stub file for Bazel docs to link to. + +The Bazel docs link to this file, but the implementation was moved. + +Please see: https://rules-python.readthedocs.io/en/latest/api/rules_python/python/defs.html#py_runtime +""" diff --git a/python/private/common/py_test_rule_bazel.bzl b/python/private/common/py_test_rule_bazel.bzl new file mode 100644 index 0000000000..c89e3a65c4 --- /dev/null +++ b/python/private/common/py_test_rule_bazel.bzl @@ -0,0 +1,6 @@ +"""Stub file for Bazel docs to link to. + +The Bazel docs link to this file, but the implementation was moved. + +Please see: https://rules-python.readthedocs.io/en/latest/api/rules_python/python/defs.html#py_test +""" diff --git a/python/private/config_settings.bzl b/python/private/config_settings.bzl new file mode 100644 index 0000000000..3089b9c6cf --- /dev/null +++ b/python/private/config_settings.bzl @@ -0,0 +1,277 @@ +# Copyright 2024 The Bazel Authors. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +"""This module is used to construct the config settings in the BUILD file in this same package. +""" + +load("@bazel_skylib//lib:selects.bzl", "selects") +load("@bazel_skylib//rules:common_settings.bzl", "BuildSettingInfo") +load("//python/private:text_util.bzl", "render") +load(":version.bzl", "version") + +_PYTHON_VERSION_FLAG = Label("//python/config_settings:python_version") +_PYTHON_VERSION_MAJOR_MINOR_FLAG = Label("//python/config_settings:python_version_major_minor") + +_DEBUG_ENV_MESSAGE_TEMPLATE = """\ +The current configuration rules_python config flags is: + {flags} + +If the value is missing, then the default value is being used, see documentation: +{docs_url}/python/config_settings +""" + +# Indicates something needs public visibility so that other generated code can +# access it, but it's not intended for general public usage. +_NOT_ACTUALLY_PUBLIC = ["//visibility:public"] + +def construct_config_settings(*, name, default_version, versions, minor_mapping, documented_flags): # buildifier: disable=function-docstring + """Create a 'python_version' config flag and construct all config settings used in rules_python. 
+ + This mainly includes the targets that are used in the toolchain and pip hub + repositories that only match on the 'python_version' flag values. + + Args: + name: {type}`str` A dummy name value that is no-op for now. + default_version: {type}`str` the default value for the `python_version` flag. + versions: {type}`list[str]` A list of versions to build constraint settings for. + minor_mapping: {type}`dict[str, str]` A mapping from `X.Y` to `X.Y.Z` python versions. + documented_flags: {type}`list[str]` The labels of the documented settings + that affect build configuration. + """ + _ = name # @unused + _python_version_flag( + name = _PYTHON_VERSION_FLAG.name, + build_setting_default = default_version, + visibility = ["//visibility:public"], + ) + + _python_version_major_minor_flag( + name = _PYTHON_VERSION_MAJOR_MINOR_FLAG.name, + build_setting_default = "", + visibility = ["//visibility:public"], + ) + + native.config_setting( + name = "is_python_version_unset", + flag_values = {_PYTHON_VERSION_FLAG: ""}, + visibility = ["//visibility:public"], + ) + + _reverse_minor_mapping = {full: minor for minor, full in minor_mapping.items()} + for version in versions: + minor_version = _reverse_minor_mapping.get(version) + if not minor_version: + native.config_setting( + name = "is_python_{}".format(version), + flag_values = {":python_version": version}, + visibility = ["//visibility:public"], + ) + continue + + # Also need to match the minor version when using + name = "is_python_{}".format(version) + native.config_setting( + name = "_" + name, + flag_values = {":python_version": version}, + visibility = ["//visibility:public"], + ) + + # An alias pointing to an underscore-prefixed config_setting_group + # is used because config_setting_group creates + # `is_{version}_N` targets, which are easily confused with the + # `is_{minor}.{micro}` (dot) targets. + selects.config_setting_group( + name = "_{}_group".format(name), + match_any = [ + ":_is_python_{}".format(version), + ":is_python_{}".format(minor_version), + ], + visibility = ["//visibility:private"], + ) + native.alias( + name = name, + actual = "_{}_group".format(name), + visibility = ["//visibility:public"], + ) + + # This matches the raw flag value, e.g. --//python/config_settings:python_version=3.8 + # It's private because matching the concept of e.g. "3.8" value is done + # using the `is_python_X.Y` config setting group, which is aware of the + # minor versions that could match instead. + for minor in minor_mapping.keys(): + native.config_setting( + name = "is_python_{}".format(minor), + flag_values = {_PYTHON_VERSION_MAJOR_MINOR_FLAG: minor}, + visibility = ["//visibility:public"], + ) + + _current_config( + name = "current_config", + build_setting_default = "", + settings = documented_flags + [_PYTHON_VERSION_FLAG.name], + visibility = ["//visibility:private"], + ) + native.config_setting( + name = "is_not_matching_current_config", + # We use the rule above instead of @platforms//:incompatible so that the + # printing of the current env always happens when the _current_config rule + # is executed. + # + # NOTE: This should in practise only happen if there is a missing compatible + # `whl_library` in the hub repo created by `pip.parse`. 
+ flag_values = {"current_config": "will-never-match"}, + # Only public so that PyPI hub repo can access it + visibility = _NOT_ACTUALLY_PUBLIC, + ) + + libc = Label("//python/config_settings:py_linux_libc") + native.config_setting( + name = "_is_py_linux_libc_glibc", + flag_values = {libc: "glibc"}, + visibility = _NOT_ACTUALLY_PUBLIC, + ) + native.config_setting( + name = "_is_py_linux_libc_musl", + flag_values = {libc: "musl"}, + visibility = _NOT_ACTUALLY_PUBLIC, + ) + freethreaded = Label("//python/config_settings:py_freethreaded") + native.config_setting( + name = "_is_py_freethreaded_yes", + flag_values = {freethreaded: "yes"}, + visibility = _NOT_ACTUALLY_PUBLIC, + ) + native.config_setting( + name = "_is_py_freethreaded_no", + flag_values = {freethreaded: "no"}, + visibility = _NOT_ACTUALLY_PUBLIC, + ) + +def _python_version_flag_impl(ctx): + value = ctx.build_setting_value + return [ + # BuildSettingInfo is the original provider returned, so continue to + # return it for compatibility + BuildSettingInfo(value = value), + # FeatureFlagInfo is returned so that config_setting respects the value + # as returned by this rule instead of as originally seen on the command + # line. + # It is also for Google compatibility, which expects the FeatureFlagInfo + # provider. + config_common.FeatureFlagInfo(value = value), + ] + +_python_version_flag = rule( + implementation = _python_version_flag_impl, + build_setting = config.string(flag = True), + attrs = {}, +) + +def _python_version_major_minor_flag_impl(ctx): + input = _flag_value(ctx.attr._python_version_flag) + if input: + ver = version.parse(input) + value = "{}.{}".format(ver.release[0], ver.release[1]) + else: + value = "" + + return [config_common.FeatureFlagInfo(value = value)] + +_python_version_major_minor_flag = rule( + implementation = _python_version_major_minor_flag_impl, + build_setting = config.string(flag = False), + attrs = { + "_python_version_flag": attr.label( + default = _PYTHON_VERSION_FLAG, + ), + }, +) + +def _flag_value(s): + if config_common.FeatureFlagInfo in s: + return s[config_common.FeatureFlagInfo].value + else: + return s[BuildSettingInfo].value + +def _print_current_config_impl(ctx): + flags = "\n".join([ + "{}: \"{}\"".format(k, v) + for k, v in sorted({ + str(setting.label): _flag_value(setting) + for setting in ctx.attr.settings + }.items()) + ]) + + msg = ctx.attr._template.format( + docs_url = "https://rules-python.readthedocs.io/en/latest/api/rules_python", + flags = render.indent(flags).lstrip(), + ) + if ctx.build_setting_value and ctx.build_setting_value != "fail": + fail("Only 'fail' and empty build setting values are allowed for {}".format( + str(ctx.label), + )) + elif ctx.build_setting_value: + fail(msg) + else: + print(msg) # buildifier: disable=print + + return [config_common.FeatureFlagInfo(value = "")] + +_current_config = rule( + implementation = _print_current_config_impl, + build_setting = config.string(flag = True), + attrs = { + "settings": attr.label_list(mandatory = True), + "_template": attr.string(default = _DEBUG_ENV_MESSAGE_TEMPLATE), + }, +) + +def is_python_version_at_least(name, **kwargs): + flag_name = "_{}_flag".format(name) + native.config_setting( + name = name, + flag_values = { + flag_name: "yes", + }, + ) + _python_version_at_least( + name = flag_name, + visibility = ["//visibility:private"], + **kwargs + ) + +def _python_version_at_least_impl(ctx): + flag_value = ctx.attr._major_minor[config_common.FeatureFlagInfo].value + + # CI is, somehow, getting an empty string 
for the current flag value. + # How isn't clear. + if not flag_value: + return [config_common.FeatureFlagInfo(value = "no")] + + current = tuple([ + int(x) + for x in flag_value.split(".") + ]) + at_least = tuple([int(x) for x in ctx.attr.at_least.split(".")]) + + value = "yes" if current >= at_least else "no" + return [config_common.FeatureFlagInfo(value = value)] + +_python_version_at_least = rule( + implementation = _python_version_at_least_impl, + attrs = { + "at_least": attr.string(mandatory = True), + "_major_minor": attr.label(default = _PYTHON_VERSION_MAJOR_MINOR_FLAG), + }, +) diff --git a/python/private/coverage.patch b/python/private/coverage.patch new file mode 100644 index 0000000000..051f7fc543 --- /dev/null +++ b/python/private/coverage.patch @@ -0,0 +1,17 @@ +# Because of how coverage is run, the current directory is the first in +# sys.path. This is a problem for the tests, because they may import a module of +# the same name as a module in the current directory. +# +# NOTE @aignas 2023-06-05: we have to do this before anything from coverage gets +# imported. +diff --git a/coverage/__main__.py b/coverage/__main__.py +index ce2d8db..7d7d0a0 100644 +--- a/coverage/__main__.py ++++ b/coverage/__main__.py +@@ -6,5 +6,6 @@ + from __future__ import annotations + + import sys ++sys.path.append(sys.path.pop(0)) + from coverage.cmdline import main + sys.exit(main()) diff --git a/python/private/coverage_deps.bzl b/python/private/coverage_deps.bzl new file mode 100644 index 0000000000..e80e8ee910 --- /dev/null +++ b/python/private/coverage_deps.bzl @@ -0,0 +1,190 @@ +# Copyright 2023 The Bazel Authors. All rights reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +"""Dependencies for coverage.py used by the hermetic toolchain. +""" + +load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive") +load("@bazel_tools//tools/build_defs/repo:utils.bzl", "maybe") +load("//python/private:version_label.bzl", "version_label") + +# START: maintained by 'bazel run //tools/private/update_deps:update_coverage_deps
-<code>.whl</code> for which
-<code>requirements</code> has the dependencies.
-"""),
-        "python_interpreter": attr.string(default = "python", doc = """
-The command to run the Python interpreter used when unpacking the wheel.
-"""),
-        "requirements": attr.string(doc = """
-The name of the <code>pip_import</code> repository rule from which to load this
-<code>.whl</code>'s dependencies.
-"""),
-        "whl": attr.label(
-            mandatory = True,
-            allow_single_file = True,
-            doc = """
-The path to the <code>.whl</code> file. The name is expected to follow [this
-convention](https://www.python.org/dev/peps/pep-0427/#file-name-convention).
-""",
-        ),
-        "_script": attr.label(
-            executable = True,
-            default = Label("//tools:whltool.par"),
-            cfg = "host",
-        ),
-    },
-    implementation = _whl_impl,
-    doc = """A rule for importing `.whl` dependencies into Bazel.
-
-This rule is currently used to implement `pip_import`. It is not intended to
-work standalone, and the interface may change. See `pip_import` for proper
-usage.
-
-This rule imports a `.whl` file as a `py_library`:
-```python
-whl_library(
- name = "foo",
- whl = ":my-whl-file",
- requirements = "name of pip_import rule",
-)
-```
-
-This rule defines `@foo//:pkg` as a `py_library` target.
-""",
-)
diff --git a/renovate.json b/renovate.json
deleted file mode 100644
index ee8c906b91..0000000000
--- a/renovate.json
+++ /dev/null
@@ -1,5 +0,0 @@
-{
- "extends": [
- "config:base"
- ]
-}
diff --git a/sphinxdocs/BUILD.bazel b/sphinxdocs/BUILD.bazel
new file mode 100644
index 0000000000..9ad1e1eef9
--- /dev/null
+++ b/sphinxdocs/BUILD.bazel
@@ -0,0 +1,66 @@
+# Copyright 2023 The Bazel Authors. All rights reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+load("@bazel_skylib//:bzl_library.bzl", "bzl_library")
+load("@bazel_skylib//rules:common_settings.bzl", "bool_flag")
+load("//sphinxdocs/private:sphinx.bzl", "repeated_string_list_flag")
+
+package(
+    default_visibility = ["//:__subpackages__"],
+)
+
+# Additional -D values to add to every Sphinx build.
+# This is usually used to override the version when building release docs.
+repeated_string_list_flag(
+    name = "extra_defines",
+    build_setting_default = [],
+)
+
+repeated_string_list_flag(
+    name = "extra_env",
+    build_setting_default = [],
+)
+
+# Whether to add the `-q` arg to Sphinx invocations, which determines if
+# stdout has any output or not (logging INFO messages and progress messages).
+# If true, add `-q`. If false, don't add `-q`. This is mostly useful for
+# debugging invocations or developing extensions.
+bool_flag(
+    name = "quiet",
+    build_setting_default = True,
+)
+
+bzl_library(
+    name = "sphinx_bzl",
+    srcs = ["sphinx.bzl"],
+    deps = ["//sphinxdocs/private:sphinx_bzl"],
+)
+
+bzl_library(
+    name = "sphinx_docs_library_bzl",
+    srcs = ["sphinx_docs_library.bzl"],
+    deps = ["//sphinxdocs/private:sphinx_docs_library_macro_bzl"],
+)
+
+bzl_library(
+    name = "sphinx_stardoc_bzl",
+    srcs = ["sphinx_stardoc.bzl"],
+    deps = ["//sphinxdocs/private:sphinx_stardoc_bzl"],
+)
+
+bzl_library(
+    name = "readthedocs_bzl",
+    srcs = ["readthedocs.bzl"],
+    deps = ["//sphinxdocs/private:readthedocs_bzl"],
+)
diff --git a/sphinxdocs/docs/BUILD.bazel b/sphinxdocs/docs/BUILD.bazel
new file mode 100644
index 0000000000..070e0485d7
--- /dev/null
+++ b/sphinxdocs/docs/BUILD.bazel
@@ -0,0 +1,64 @@
+load("//python/private:bzlmod_enabled.bzl", "BZLMOD_ENABLED") # buildifier: disable=bzl-visibility
+load("//sphinxdocs:sphinx_docs_library.bzl", "sphinx_docs_library")
+load("//sphinxdocs:sphinx_stardoc.bzl", "sphinx_stardocs")
+
+package(default_visibility = ["//:__subpackages__"])
+
+# We only build for Linux and Mac because:
+# 1. The actual doc process only runs on Linux
+# 2. Mac is a common development platform, and is close enough to Linux
+# it's feasible to make work.
+# Making CI happy under Windows is too much of a headache, though, so we don't
+# bother with that.
+_TARGET_COMPATIBLE_WITH = select({
+    "@platforms//os:linux": [],
+    "@platforms//os:macos": [],
+    "//conditions:default": ["@platforms//:incompatible"],
+}) if BZLMOD_ENABLED else ["@platforms//:incompatible"]
+
+sphinx_docs_library(
+    name = "docs_lib",
+    deps = [
+        ":artisian_api_docs",
+        ":bzl_docs",
+        ":py_api_srcs",
+        ":regular_docs",
+    ],
+)
+
+sphinx_docs_library(
+    name = "regular_docs",
+    srcs = glob(
+        ["**/*.md"],
+        exclude = ["api/**"],
+    ),
+    prefix = "sphinxdocs/",
+)
+
+sphinx_docs_library(
+    name = "artisian_api_docs",
+    srcs = glob(
+        ["api/**/*.md"],
+    ),
+    prefix = "api/sphinxdocs/",
+    strip_prefix = "sphinxdocs/docs/api/",
+)
+
+sphinx_stardocs(
+    name = "bzl_docs",
+    srcs = [
+        "//sphinxdocs:readthedocs_bzl",
+        "//sphinxdocs:sphinx_bzl",
+        "//sphinxdocs:sphinx_docs_library_bzl",
+        "//sphinxdocs:sphinx_stardoc_bzl",
+        "//sphinxdocs/private:sphinx_docs_library_bzl",
+    ],
+    prefix = "api/sphinxdocs/",
+    target_compatible_with = _TARGET_COMPATIBLE_WITH,
+)
+
+sphinx_docs_library(
+    name = "py_api_srcs",
+    srcs = ["//sphinxdocs/src/sphinx_bzl"],
+    strip_prefix = "sphinxdocs/src/",
+)
diff --git a/sphinxdocs/docs/api/index.md b/sphinxdocs/docs/api/index.md
new file mode 100644
index 0000000000..3420b9180d
--- /dev/null
+++ b/sphinxdocs/docs/api/index.md
@@ -0,0 +1,8 @@
+# sphinxdocs Bazel APIs
+
+API documentation for sphinxdocs Bazel objects.
+
+```{toctree}
+:glob:
+**
+```
diff --git a/sphinxdocs/docs/api/sphinxdocs/index.md b/sphinxdocs/docs/api/sphinxdocs/index.md
new file mode 100644
index 0000000000..bd4e9b6eec
--- /dev/null
+++ b/sphinxdocs/docs/api/sphinxdocs/index.md
@@ -0,0 +1,29 @@
+:::{bzl:currentfile} //sphinxdocs:BUILD.bazel
+:::
+
+# //sphinxdocs
+
+:::{bzl:flag} extra_defines
+Additional `-D` values to add to every Sphinx build.
+
+This is a list flag. Multiple uses are accumulated.
+
+This is most useful for overriding e.g. the version when performing
+release builds.
+:::
+
+:::{bzl:flag} extra_env
+Additional environment variables to set for every Sphinx build.
+
+This is a list flag. Multiple uses are accumulated. Values are `key=value`
+format.
+:::
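+
+For example (target name illustrative):
+
+```
+bazel build //docs --@rules_python//sphinxdocs:extra_env=READTHEDOCS=True
+```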
+
+:::{bzl:flag} quiet
+Whether to add the `-q` arg to Sphinx invocations.
+
+This is a boolean flag.
+
+Disabling it is useful for debugging invocations or developing extensions:
+without `-q`, Sphinx logs INFO messages and progress output to stdout.
+:::
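+
+For example, to see Sphinx's INFO and progress output (target name
+illustrative):
+
+```
+bazel build //docs --@rules_python//sphinxdocs:quiet=false
+```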
diff --git a/sphinxdocs/docs/api/sphinxdocs/inventories/index.md b/sphinxdocs/docs/api/sphinxdocs/inventories/index.md
new file mode 100644
index 0000000000..a03645ed44
--- /dev/null
+++ b/sphinxdocs/docs/api/sphinxdocs/inventories/index.md
@@ -0,0 +1,11 @@
+:::{bzl:currentfile} //sphinxdocs/inventories:BUILD.bazel
+:::
+
+# //sphinxdocs/inventories
+
+:::{bzl:target} bazel_inventory
+A Sphinx inventory of Bazel objects.
+
+By including this target in your Sphinx build and enabling intersphinx, you
+can write cross references to builtin Bazel objects.
+:::
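+
+A minimal `conf.py` sketch for consuming the inventory via intersphinx (the
+inventory file name and URL are assumptions; adjust them to your build's
+layout):
+
+```
+extensions = ["sphinx.ext.intersphinx"]
+intersphinx_mapping = {
+    # (base URL used for generated links, local inventory file from the build)
+    "bazel": ("https://bazel.build/", "bazel_inventory.inv"),
+}
+```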
diff --git a/sphinxdocs/docs/index.md b/sphinxdocs/docs/index.md
new file mode 100644
index 0000000000..2ea1146e1b
--- /dev/null
+++ b/sphinxdocs/docs/index.md
@@ -0,0 +1,44 @@
+# Docgen using Sphinx with Bazel
+
+The `sphinxdocs` project allows using Bazel to run Sphinx to generate
+documentation. It comes with:
+
+* Rules for running Sphinx.
+* Rules for generating documentation for Starlark code.
+* A Sphinx plugin for documenting Starlark and Bazel objects.
+* Rules for Read the Docs build integration.
+
+While it is primarily oriented towards docgen for Starlark code, the core of it
+is agnostic as to what is being documented.
+
+## Optimization
+
+Normally, Sphinx keeps various cache files to improve incremental building.
+Unfortunately, programs that perform their own caching don't interact well
+with Bazel's model of precisely declaring and strictly enforcing what the
+inputs and outputs are and what files are available when a program runs. The
+net effect is that a program doesn't have a prior invocation's cache files
+available.
+
+There are two mechanisms for making some cached state available to Sphinx
+under Bazel:
+
+* Disable sandboxing, which allows some files from prior invocations to be
+ visible to subsequent invocations. This can be done multiple ways:
+ * Set `tags = ["no-sandbox"]` on the `sphinx_docs` target
+ * `--modify_execution_info=SphinxBuildDocs=+no-sandbox` (Bazel flag)
+ * `--strategy=SphinxBuildDocs=local` (Bazel flag)
+* Use persistent workers (enabled by default) by setting
+  `allow_persistent_workers=True` on the `sphinx_docs` target. Note that other
+  Bazel flags can disable using workers even if an action supports it. Setting
+  `--strategy=SphinxBuildDocs=dynamic,worker,local,sandbox` should tell Bazel
+  to use workers if possible, otherwise fall back to non-worker invocations.
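+
+For example, a `.bazelrc` sketch combining both mechanisms (the strategy list
+is illustrative):
+
+```
+build --strategy=SphinxBuildDocs=dynamic,worker,local,sandbox
+build --modify_execution_info=SphinxBuildDocs=+no-sandbox
+```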
+
+
+```{toctree}
+:hidden:
+
+starlark-docgen
+sphinx-bzl
+readthedocs
+```
diff --git a/sphinxdocs/docs/readthedocs.md b/sphinxdocs/docs/readthedocs.md
new file mode 100644
index 0000000000..c347d19850
--- /dev/null
+++ b/sphinxdocs/docs/readthedocs.md
@@ -0,0 +1,156 @@
+:::{default-domain} bzl
+:::
+
+# Read the Docs integration
+
+The {obj}`readthedocs_install` rule makes it easy to build for, and deploy
+to, Read the Docs. Bazel does all the work of building, and the outputs are
+then copied to where Read the Docs expects served content to be placed.
+Because Bazel does the majority of the work, you have more certainty that the
+docs you generate locally will match what is created in the Read the Docs
+build environment.
+
+Setting this up is conceptually simple: make the Read the Docs build call
+`bazel run` with the appropriate args. Doing so requires gluing a couple of
+things together, most of which can be copy/pasted from the examples below.
+
+## `.readthedocs.yaml` config
+
+In order for Read the Docs to call our custom commands, we have to use the
+advanced `build.commands` setting of the config file. This needs to do two key
+things:
+1. Install Bazel
+2. Call `bazel run` with the appropriate args.
+
+In the example below, `npm` is used to install Bazelisk, and a helper shell
+script, `readthedocs_build.sh`, is used to construct the Bazel invocation.
+
+The key purpose of the shell script is to set the
+`--@rules_python//sphinxdocs:extra_env` and
+`--@rules_python//sphinxdocs:extra_defines` flags. These are used to communicate
+`READTHEDOCS*` environment variables and settings to the Bazel invocation.
+
+## BUILD config
+
+In your build file, the {obj}`readthedocs_install` rule handles building the
+docs and copying the output to the Read the Docs output directory
+(`$READTHEDOCS_OUTPUT` environment variable). As input, it takes a `sphinx_docs`
+target (the generated docs).
+
+## conf.py config
+
+Normally, Read the Docs will inject extra content into your `conf.py` file
+to make certain integrations available (e.g. the version selection flyout).
+However, because our yaml config uses the advanced `build.commands` feature,
+those config injections are disabled and we have to manually re-enable them.
+
+To do this, we modify `conf.py` to detect `READTHEDOCS=True` in the environment
+and perform some additional logic. See the example code below for the
+modifications.
+
+Depending on your theme, you may have to tweak `conf.py` further; the example
+is based on using the `sphinx_rtd_theme`.
+
+## Example
+
+```
+# File: .readthedocs.yaml
+version: 2
+
+build:
+ os: "ubuntu-22.04"
+ tools:
+ nodejs: "19"
+ commands:
+ - env
+ - npm install -g @bazel/bazelisk
+ - bazel version
+ # Put the actual action behind a shell script because it's
+ # easier to modify than the yaml config.
+ - docs/readthedocs_build.sh
+```
+
+```
+# File: docs/BUILD
+
+load("@rules_python//sphinxdocs:readthedocs.bzl.bzl", "readthedocs_install")
+readthedocs_install(
+ name = "readthedocs_install",
+ docs = [":docs"],
+)
+```
+
+```
+#!/bin/bash
+# File: docs/readthedocs_build.sh
+
+set -euo pipefail
+
+declare -a extra_env
+while IFS='=' read -r -d '' name value; do
+ if [[ "$name" == READTHEDOCS* ]]; then
+ extra_env+=("--@rules_python//sphinxdocs:extra_env=$name=$value")
+ fi
+done < <(env -0)
+
+# The build id is encoded in the host name, so pass it along explicitly
+# (conf.py extracts it from HOSTNAME).
+extra_env+=("--@rules_python//sphinxdocs:extra_env=HOSTNAME=$HOSTNAME")
+
+set -x
+bazel run \
+ --stamp \
+ "--@rules_python//sphinxdocs:extra_defines=version=$READTHEDOCS_VERSION" \
+ "${extra_env[@]}" \
+ //docs:readthedocs_install
+```
+
+```
+# File: docs/conf.py
+
+import os
+
+# Adapted from the template code:
+# https://github.com/readthedocs/readthedocs.org/blob/main/readthedocs/doc_builder/templates/doc_builder/conf.py.tmpl
+if os.environ.get("READTHEDOCS") == "True":
+ # Must come first because it can interfere with other extensions, according
+ # to the original conf.py template comments
+ extensions.insert(0, "readthedocs_ext.readthedocs")
+
+ if os.environ.get("READTHEDOCS_VERSION_TYPE") == "external":
+ # Insert after the main extension
+ extensions.insert(1, "readthedocs_ext.external_version_warning")
+ readthedocs_vcs_url = (
+ "http://github.com/bazel-contrib/rules_python/pull/{}".format(
+ os.environ.get("READTHEDOCS_VERSION", "")
+ )
+ )
+ # The build id isn't directly available, but it appears to be encoded
+ # into the host name, so we can parse it from that. The format appears
+ # to be `build-X-project-Y-Z`, where:
+ # * X is an integer build id
+ # * Y is an integer project id
+ # * Z is the project name
+ _build_id = os.environ.get("HOSTNAME", "build-0-project-0-rules-python")
+ _build_id = _build_id.split("-")[1]
+ readthedocs_build_url = (
+ f"https://readthedocs.org/projects/rules-python/builds/{_build_id}"
+ )
+
+html_context = {
+ # This controls whether the flyout menu is shown. It is always false
+ # because:
+ # * For local builds, the flyout menu is empty and doesn't show in the
+ # same place as for RTD builds. No point in showing it locally.
+ # * For RTD builds, the flyout menu is always automatically injected,
+ # so having it be True makes the flyout show up twice.
+ "READTHEDOCS": False,
+ "github_version": os.environ.get("READTHEDOCS_GIT_IDENTIFIER", ""),
+ # For local builds, the github link won't work. Disabling it replaces
+ # it with a "view source" link to view the source Sphinx saw, which
+ # is useful for local development.
+ "display_github": os.environ.get("READTHEDOCS") == "True",
+ "commit": os.environ.get("READTHEDOCS_GIT_COMMIT_HASH", "unknown commit"),
+ # Used by readthedocs_ext.external_version_warning extension
+ # This is the PR number being built
+ "current_version": os.environ.get("READTHEDOCS_VERSION", ""),
+}
+```
diff --git a/sphinxdocs/docs/sphinx-bzl.md b/sphinxdocs/docs/sphinx-bzl.md
new file mode 100644
index 0000000000..8376f60679
--- /dev/null
+++ b/sphinxdocs/docs/sphinx-bzl.md
@@ -0,0 +1,328 @@
+# Bazel plugin for Sphinx
+
+The `sphinx_bzl` Python package is a Sphinx plugin that defines a custom domain
+("bzl") in the Sphinx system. This provides first-class integration with
+Sphinx: code comments can carry rich information, and docs can be manually
+written for objects that aren't directly representable in bzl source code. For
+example, the fields of a provider can use `:type:` to indicate the type of a
+field, or manually written docs can use the `{bzl:target}` directive to document
+a well known target.
+
+## Configuring Sphinx
+
+To enable the plugin in Sphinx, depend on
+`@rules_python//sphinxdocs/src/sphinx_bzl` and enable it in `conf.py`:
+
+```
+extensions = [
+ "sphinx_bzl.bzl",
+]
+```
+
+## Brief introduction to Sphinx terminology
+
+To aid understanding how to write docs, let's define a few common terms:
+
+* **Role**: A role is the "bzl:obj" part when writing ``{bzl:obj}`ref` ``.
+  Roles mark inline text as needing special processing. There are generally
+  two types of processing: creating cross references, or role-specific custom
+  rendering. For example, `{bzl:obj}` will create a cross reference, while
+  `{bzl:default-value}` indicates the default value of an argument.
+* **Directive**: A directive is indicated with `:::` and allows defining an
+  entire object and its parts. For example, to describe a function and its
+  arguments, the `:::{bzl:function}` directive is used.
+* **Directive Option**: A directive option is the "type" part when writing
+  `:type:` within a directive. Directive options are how directives are told
+  the meaning of certain values, such as the type of a provider field.
+  Depending on the object being documented, a directive option may be used
+  instead of a special role to indicate semantic values.
+
+Most often, you'll be using roles to refer to other objects or to indicate
+special values in doc strings. Directives are mostly needed when manually
+writing docs to document flags, targets, or other objects that
+`sphinx_stardoc` doesn't generate for you.
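+
+As a quick illustration, here is a manually written directive and a role that
+cross references it (the target name is hypothetical):
+
+```
+:::{bzl:target} my_target
+A manually documented, well known target.
+:::
+
+See {bzl:obj}`my_target` for details.
+```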
+
+## MyST vs RST
+
+By default, Sphinx uses reStructuredText (RST) syntax for its documents.
+Unfortunately, RST syntax is very different from the popular Markdown syntax.
+To bridge the gap, MyST translates Markdown-style syntax into the RST
+equivalents. This allows easily using Markdown in bzl files.
+
+While MyST isn't required for the core `sphinx_bzl` plugin to work, this
+document uses MyST syntax because the `sphinx_stardoc` bzl doc gen rule
+requires MyST.
+
+The main differences in syntax are:
+* MyST directives use `:::{name}` with closing `:::` instead of `.. name::` with
+ indented content.
+* MyST roles use `{role:name}` instead of `:role:name:`
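+
+For example, the same directive in both syntaxes (target name hypothetical):
+
+MyST:
+```
+:::{bzl:target} my_target
+Documentation here.
+:::
+```
+
+RST:
+```
+.. bzl:target:: my_target
+
+   Documentation here.
+```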
+
+## Type expressions
+
+Several roles or fields accept type expressions. Type expressions use
+Python-style annotation syntax to describe data types. For example,
+`None | list[str]` describes a type of "None, or a list of strings". Each
+component of the expression is parsed and cross referenced to its associated
+type definition.
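+
+For example, a provider field's type might be written with the `:type:`
+directive option (a sketch; the exact directive name for fields is an
+assumption here):
+
+```
+:::{bzl:provider-field} srcs
+:type: depset[File] | None
+:::
+```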
+
+## Cross references
+
+In brief, to reference bzl objects, use the `bzl:obj` role and use the
+Bazel label string you would use to refer to the object in Bazel (using `%` to
+denote names within a file). For example, to unambiguously refer to `py_binary`:
+
+```
+{bzl:obj}`@rules_python//python:py_binary.bzl%py_binary`
+```
+
+The above is pretty long, so shorter names are also supported, and `sphinx_bzl`
+will try to find something that matches. Additionally, in `.bzl` code, the
+`bzl:` prefix is set as the default. The above can then be shortened to:
+
+```
+{obj}`py_binary`
+```
+
+The text that is displayed can be customized by putting the reference string in
+chevrons (`<>`):
+
+```
+{obj}`the binary rule <py_binary>`
+```