Description:
This may relate to #626, and it may conflict with your stated anti-goals, but I believe it's worth surfacing as a potential bug: I can't find a direct issue about it, and it may be affecting many users who rely on this action.
If you build different extras for your Python project, each with its own independent dependencies, and you want a job per extra to verify that each extra's dependencies are complete, alongside overall lint/type-safety/test jobs, you may run into this issue as I have.
When you specify `cache: poetry` or `cache: pip`, etc., and point it at your `requirements.txt` or more up-to-date `pyproject.toml`, the cache key doesn't take into account what you are installing in that job.
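In other words, the restored cache is (roughly, as an illustration, not the action's exact internals) equivalent to keying only on the dependency file's contents, something like:

```yaml
# Illustrative only: the key is a function of the lock file's contents,
# not of the install command the job later runs.
- uses: actions/cache@v4
  with:
    path: ~/.cache/pypoetry
    key: poetry-${{ runner.os }}-${{ hashFiles('**/poetry.lock') }}
```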
So, if I have a `pyproject.toml` like so:
```toml
# ...
[tool.poetry.dependencies]
python = ">=3.10.9,<3.11"
numpy = "^1.22.3"
boto3 = "^1.24.59"
pydantic = {version = "<2.0", extras = ["dotenv"]}
jinja2 = "^3.1.2"
openai = {version = "0.28", optional = true}

[tool.poetry.extras]
openai = ["openai"]
```
And in my first job, I use setup-python and then run `poetry install --all-extras`, but in another job, I run `poetry install`.
One may assume that openai will not be installed in the second job. But with caching enabled, everything present when the cache was first created gets restored, regardless of what I install with.
I would expect the install command itself to contribute to the hash, rather than the dependency file alone.
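The idea could be sketched like this (a hypothetical key scheme, not the action's current behavior; the `deps-` prefix and the stand-in lock file are illustrative):

```shell
# Sketch: derive a cache key from the dependency file AND the exact install
# command, so `poetry install --all-extras` and plain `poetry install`
# produce different caches.
printf 'demo lock contents\n' > poetry.lock   # stand-in lock file for the demo
INSTALL_CMD="poetry install --all-extras"
# Hash the concatenation of the lock file and the install command.
KEY="deps-$( { cat poetry.lock; echo "$INSTALL_CMD"; } | sha256sum | cut -c1-12 )"
echo "$KEY"
```

With this scheme, two jobs that share a lock file but pass different install flags would restore different caches.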
Based on your non-goals, I understand if this isn't something you want to pursue, but it might be worth documenting in an overly clear way for users who may not understand this behavior upfront.
Thank you!