🐛 fix(check_dependency): verify URL reqs via PEP 610 #1027
gaborbernat merged 1 commit into pypa:main
Conversation
|
I didn't think we installed anything in non-isolated builds? I don't understand the comment about causing more reinstalls, we are not a package installer. And this doesn't run in isolated mode, since we install everything anyway (I think). We are not a package installer, we are not pip. I'm not a fan of checking this more strictly anyway, it causes more trouble than it helps (pip is opt-out), so also not a fan of making all editable installs always count as invalid. This would be potentially disruptive with minimal benefit. Edit: new version is better. |
Force-pushed 4023a45 to 5ab3af6
Force-pushed 058f37e to 3a9963b
Force-pushed 2456f4c to 2675a13
check_dependency silently accepted URL requirements like `pkg @ https://...` whenever a package with the same name was installed, regardless of source. Use PEP 610 direct_url.json metadata to compare the installed distribution's origin against the required URL, reporting a mismatch as unmet. Fixes pypa#860
|
Firstly, I'm not sure this belongs in build. If direct URLs should be comparable, I feel it'd be better if support was added to packaging. Secondly, this check is not exhaustive - it will not reconstruct URLs with subdirectories and will reject dependencies with subdirectories. I don't know if other edge cases exist, but if we should reconstruct direct URLs at all, it'd be a good idea to see how pip deconstructs them. Thirdly, can we slow down with the PR merges? You've asked for review from 3 other people. Unless the PR is trivial, maybe give it a day? Fourthly, and this might not be a sentiment shared by other contributors, but if an LLM was used to write a PR whole or in part, I'd like to have that clearly stated. |
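To illustrate the subdirectory gap mentioned above (a hypothetical sketch, not build's code): a requirement URL carries the subdirectory in a `#subdirectory=...` fragment, while PEP 610 records it as a separate top-level key in `direct_url.json` alongside a bare `url`, so a naive string comparison against a reconstructed URL would never match.

```python
from urllib.parse import parse_qs, urlparse

# A URL requirement pointing at a project in a repo subdirectory (hypothetical):
req_url = "git+https://github.com/pypa/sample.git@main#subdirectory=pkg_a"

# On the requirement side, the subdirectory travels in the URL fragment:
subdir = parse_qs(urlparse(req_url).fragment).get("subdirectory", [None])[0]
print(subdir)  # pkg_a

# In direct_url.json it is a separate top-level key (per PEP 610), and the
# recorded "url" has no fragment at all:
direct_url = {
    "url": "https://github.com/pypa/sample.git",
    "vcs_info": {"vcs": "git", "requested_revision": "main"},
    "subdirectory": "pkg_a",
}
```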
|
I think I can increase the "required number of approvals" to 2. The downside is that there really are only three active maintainers, so that might cause some issues with PRs getting stuck, but we can always force merge them if they only get one review and time passes (I think several of us can do that?). Opinions on how and whether LLMs should be mentioned vary between "they must be" and "never mention them". 🙄 Pip pretty much did the flip: they went from a nice "please mention the tool used" to "make sure you don't co-sign with the tool" (and no more mention of mentioning the tool). IMO, I like to know exactly what's used, so I always add a mention in the PR description (now using 🤖!), and I've started adding Linux's |
|
Also, I'd add that I'd like a little agreement from maintainers on releasing. I really like the 'required approvals' setting on GitHub releases, but we aren't using it (and since the tag has already been made by that point, the setting only affects the deployment, so it's not ideal, but it's better than nothing). And a big thank you for working on improving build; happy to see movement here. I don't mean to be discouraging, just making the process a little more intentional and involving everyone would be nice. :) |
|
I don't wanna seem discouraging either. It's just that I've noticed a pattern of things being merged before I can get to them or with comments left unaddressed, which is a tiny bit grating. The three of us are (broadly) active, and I do personally tend to provide feedback on PRs. If it's been a few days or if you've poked me and I haven't got back to you, by all means - but this PR was wrapped up in less than 3h. I don't think we need to make a process change when it comes to PR approvals - just a little something to bear in mind before pressing the merge button.
I like this, and I think it'd be a good idea to codify it, though I wonder why pip reversed course? |
|
Generally I always follow up on feedback (even if the PR is already merged), 🤔 did I miss something?
While I can see the argument for merge approval, I don't see much benefit behind release approval; it kinda goes against the spirit of releasing often and small by putting blockers in that process.
I'm +0 on this. IMHO LLMs are tools, and the work is ultimately signed off as written by whoever publishes it (and I use them as such). We never added a note that PyCharm autocomplete was used to create a PR in the past, so why would LLMs be different? That being said, it depends how you use them. If some people send a prompt and then don't even review the result themselves, I can see an argument; but you can generally tell that from the PR, and just not approve it if it's not ready. |
I think the problem has been that almost every tool adds
Fully automated PRs are another story. :) Personally, I like knowing the tool, I think
I think the argument was just that there wasn't time for feedback. I think it's because of the auto-merge setting merging after one review. Either we shouldn't use it, or we should increase the number of required approvals to 2.
I don't think having a bit of delay is that harmful, and build is pretty stable, I don't think we need to have a really quick release cycle (and I'm not advocating for too much, just roughly the same approvals as a PR, currently it's less than a PR). |
|
Still don't see the benefit of needing to approve releases? What benefit are we getting out of it? |
|
Maybe not approve, but having a chance to have some input would be nice. If you could drop a note in the Discord that you plan a release for the next day, that would be slightly preferred. Not critical (I'd rather you make releases immediately without input than me make releases later with input! ;) ). |
|
To return to this PR:
Would you be opposed to reverting? |
|
Would you entertain a fix for the raised issues, or do you want a full revert? |
|
Are subdirectories the main issue? That seems like something that should be fixed. FWIW, nox also checks direct URLs, and I think it's likely to also not handle subdirectories correctly. 😳 People can turn off the check - it's already very common to have to turn off the check - If someone requires a command line tool like But as I've said before, I don't like this entire check in general, so I'm fine if direct URLs are ignored, too. Build only uses the check to throw an error instead of trying to do something in non-isolated mode, while pip and nox actually benefit by trying to install based on it, since they control the environment they are targeting. |
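The lenient behavior described here (ignore direct URLs entirely and match on name alone) could be sketched roughly as below. `is_satisfied_by_name` and the split-on-`" @ "` parsing are illustrative assumptions, not build's actual API; real version specifiers would need a proper requirement parser such as `packaging.requirements`.

```python
def is_satisfied_by_name(req_string: str, installed: dict[str, str]) -> bool:
    """Treat a requirement as met if any version of the named package is
    installed, ignoring any direct URL (the pre-fix behavior)."""
    # PEP 508 URL requirements look like "name @ url"; splitting on the
    # first " @ " strips the URL part. Version specifiers are out of scope
    # for this sketch.
    name = req_string.split(" @ ", 1)[0].strip().lower()
    return name in installed

installed = {"packaging": "24.2"}
print(is_satisfied_by_name(
    "packaging @ git+https://github.com/pypa/packaging.git@24.1", installed))
# True: the URL is ignored, which is exactly the looseness the PR tightened.
```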
Also, after @layday responds, whatever way you want to go, feel free to make a release, no need to discuss a small patch release. I'm just referring to releases when it's been a while and might be minor releases (like 1.4.1 was). |
|
I'd prefer reverting it for now. We can look into approaching this a little more comprehensively down the line when we can depend on packaging having direct URL support and if we agree that it's something that should be handled by build. |
|
Let's revert then, make a release (will also fix docs), then move forward with dropping 3.9. |
`check_dependency` silently accepted URL requirements like `pkg @ https://...` whenever a package with the same name was already installed, regardless of its origin. 🐛 For example, `packaging==24.2` from PyPI would satisfy `packaging @ git+https://github.com/pypa/packaging.git@24.1`.

The fix uses PEP 610 `direct_url.json` metadata to compare the installed distribution's origin against the required URL. For VCS requirements, the URL is reconstructed from the `vcs_info` fields (`vcs`, `url`, `requested_revision`). A URL requirement is only reported as unmet when the origin doesn't match; if the installed package was installed from the same URL, it's accepted as before.

Fixes #860