GitHub autogenerated tarballs not considered stable #2458
Comments
If anyone is looking for a workaround, the following may be used: opam install <package> --no-checksums
@erikmd, maybe we could use this by default in the Coq Docker images?
Unless the checksums get rolled back (unlikely), working around this will take the whole OCaml and Coq ecosystem some time.
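For illustration, a rough sketch of how the workaround could be applied, e.g. in a Docker build. The package name is only a placeholder, and the OPAMNOCHECKSUMS variable is cited from memory, so please double-check it against the opam manual:

```
# One-off: skip checksum verification for a single install
# ("coq-serapi" is just an example package name).
opam install coq-serapi --no-checksums

# For a whole (Docker) build, the corresponding environment variable could be
# exported once; variable name recalled from the opam manual, please verify:
export OPAMNOCHECKSUMS=true
opam install coq-serapi
```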
GitHub may be in the process of reverting the change:
From what I can tell, GitHub has now rolled back the tarball checksum changes. But the central problem remains in the long term: autogenerated GitHub tarballs are probably not trustworthy for preservation of code in the OCaml and Coq ecosystem.
I just saw that an official request for feedback on this problem was opened: community/community#46034
What I understand from reading this link is that the problem needs to be handled by opam. A solution that comes to mind is to hash not the compressed archive but the uncompressed one (or the result of any chosen filter).
Opam could hash the uncompressed archive and be future-proof. But we have a lot of packages that declare the hash of the compressed archive; for these we would need some automation to update the hash once opam understands the new hashing scheme.
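A minimal sketch of that idea, outside of opam itself: if the hash covered the decompressed tar stream instead of the .tar.gz bytes, a change of compression settings alone would not invalidate it (foo-1.0.tar.gz below is a placeholder name):

```
# Today: the checksum covers the compressed bytes, so a new gzip setting
# on GitHub's side changes it even though the contents are identical.
sha256sum foo-1.0.tar.gz

# Sketched alternative: hash the decompressed tar stream; this digest only
# changes when the archive contents actually change.
gunzip -c foo-1.0.tar.gz | sha256sum
```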
Due to GitHub changing its compression approach on Jan 30, 2023, every auto-generated GitHub-hosted tarball now has a different checksum, even though the archive contents are unchanged.
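For anyone who wants to confirm this locally, a quick check along these lines (the file names are placeholders for the same tarball fetched before and after the change) shows that the compressed bytes differ while the decompressed content is identical:

```
# The compressed archives hash differently...
sha256sum before.tar.gz after.tar.gz

# ...but the decompressed tar streams are byte-for-byte identical.
cmp <(gunzip -c before.tar.gz) <(gunzip -c after.tar.gz) && echo "same contents"
```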
The following is GitHub's message to anyone affected by this, like us:
So it seems we have to both (1) ensure checksums are refreshed for all GitHub tarballs in the archive and (2) ensure that only stable archives are used in packages going forward...
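As a sketch of what (2) could mean in practice: prefer tarballs uploaded as release assets (whose bytes GitHub stores as-is) over the /archive/ URLs that GitHub regenerates on the fly, and recompute the checksum whenever a package's URL is switched. OWNER/REPO and the file names below are placeholders:

```
# Auto-generated tarball: regenerated on demand, bytes not guaranteed stable.
#   https://github.com/OWNER/REPO/archive/refs/tags/v1.0.tar.gz
# Uploaded release asset: a fixed file attached to the release, bytes stable.
#   https://github.com/OWNER/REPO/releases/download/v1.0/repo-1.0.tar.gz

# Refreshing a checksum after switching a package to the stable URL:
curl -L -o repo-1.0.tar.gz \
  "https://github.com/OWNER/REPO/releases/download/v1.0/repo-1.0.tar.gz"
sha256sum repo-1.0.tar.gz   # digest to paste into the package's url/checksum field
```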