chore: lossless png compression #585

Draft · wants to merge 2 commits into base: main
Conversation

@ckerr (Member) commented Jun 21, 2024

I used this script to run lossless compression utils [zopflipng, optipng, advpng, pngcrush] dialed to maximum (e.g. advpng --shrink-insane) on all the png files in the repo.
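The script isn't shown here, but the approach it describes can be sketched in a few lines. This is a hypothetical reconstruction, not the actual script from the PR: only `advpng --shrink-insane` is named above, so the other tools' maximum-effort flags below are assumptions to be checked against each tool's man page.

```python
# Sketch (not the PR's actual script): run several lossless PNG compressors
# at maximum effort over every *.png under a repo root. All flags except
# `advpng --shrink-insane` are assumptions.
import subprocess
from pathlib import Path

def build_commands(png: str) -> list[list[str]]:
    """Return one maximum-effort command line per compressor (flags assumed)."""
    return [
        ["zopflipng", "-m", "-y", png, png],            # -m: more iterations, -y: overwrite
        ["optipng", "-o7", png],                        # -o7: highest optimization level
        ["advpng", "--recompress", "--shrink-insane", png],
        ["pngcrush", "-brute", "-ow", png],             # -brute: try all filter/strategy combos
    ]

def compress_all(repo_root: str) -> None:
    for png in Path(repo_root).rglob("*.png"):
        for cmd in build_commands(str(png)):
            subprocess.run(cmd, check=False)            # tolerate a missing tool
```

Running every tool on every file is what makes this so slow: each compressor re-searches the DEFLATE parameter space from scratch.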

The ugly: this is very slow, which is why I'm PR'ing the changed files instead of adding the process to a workflow.

The good: this PR reduces the total size of png files in the repo from 14,328,881 bytes to 6,536,247 bytes, a lossless savings of about 55%.

See also #583 (comment). CC @dsanders11, @BlackHole1, @erickzhao who were all active in that PR.

@ckerr ckerr requested a review from a team as a code owner June 21, 2024 22:38
@MarshallOfSound MarshallOfSound temporarily deployed to electronjsorg-new-pr-585 June 21, 2024 22:38 Inactive
@dsanders11 (Member) left a comment
We don't want the changes to images under docs/ as they'll be overwritten by the next docs update.

@dsanders11 (Member) commented

It would be nice if we had a way to verify the changes were actually lossless (not just for this PR, but any future ones), so we don't risk committing lossy changes and then stacking further lossy changes on top of them later.

This comment suggests that `ffmpeg -loglevel error -i <name>.png -f md5 -` would get you the hash of the decoded raw output as a way to confirm two PNGs are the same, but I'm getting different hashes between main and the content in this PR, so I'm not sure it's a proper method.
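The idea behind hashing decoded output rather than the file bytes can be shown with the Python stdlib alone: a PNG's IDAT payload is DEFLATE data, and different encoder settings produce different bytes for the same underlying content, so whole-file hashes diverge while content hashes should not.

```python
# Illustration (stdlib only) of why file hashes change after lossless
# recompression while decoded-content hashes should not.
import hashlib
import zlib

raw = bytes(range(256)) * 64                 # stand-in for decoded pixel data
fast = zlib.compress(raw, level=1)           # quick encoder settings
small = zlib.compress(raw, level=9)          # max-effort encoder settings

# The "files" differ, so whole-file hashes differ...
assert hashlib.md5(fast).hexdigest() != hashlib.md5(small).hexdigest()
# ...but the decoded content is bit-identical, so content hashes agree.
assert zlib.decompress(fast) == zlib.decompress(small) == raw
```

So if a recompression is truly lossless, an ffmpeg-style decoded-output hash should match; the mismatch observed here suggests something beyond the DEFLATE stream is changing (see the chunk discussion below in the thread).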

If we can verify that the output is lossless, we could rig that script up as a GitHub Actions workflow to run after every push to main and commit back the compressed PNGs. I think that would satisfy our issue of not wanting it to be part of the build pipeline but still get the benefit of smaller file sizes.
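A workflow along those lines might look like the following. This is a hypothetical sketch, not config from the repo: the workflow name, script path, and package names are all assumptions.

```yaml
# Hypothetical sketch of the proposed workflow (names, paths, and packages
# are assumptions): recompress PNGs after each push to main and commit the
# result back, keeping the slow step out of the build pipeline.
name: compress-pngs
on:
  push:
    branches: [main]
jobs:
  compress:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Package names assumed; advpng ships in Debian/Ubuntu's advancecomp.
      - run: sudo apt-get install -y zopfli optipng advancecomp pngcrush
      - run: ./scripts/compress-pngs.sh   # hypothetical path to the script
      - run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add -A -- '*.png'
          if ! git diff --cached --quiet; then
            git commit -m "chore: lossless png compression"
            git push
          fi
```

A lossless-verification step would slot in between the compression run and the commit, failing the job if any decoded image changed.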

@ckerr (Member, Author) commented Jun 24, 2024

> but I'm getting different hashes between main and the content in this PR, so not sure it's a proper method.

Interesting!

I believe these compressors are lossless but I haven't done anything to confirm it. It's possible that I'm holding one of them wrong. I'll do a little digging to see if there's a good way to check and will mark this PR as a draft in the interim.

Agree on docs/, I'll revert those changes too.

@ckerr ckerr marked this pull request as draft June 24, 2024 13:38
@MarshallOfSound MarshallOfSound temporarily deployed to electronjsorg-new-pr-585 June 24, 2024 13:40 Inactive
@ckerr (Member, Author) commented Jul 1, 2024

> but I'm getting different hashes between main and the content in this PR, so not sure it's a proper method.

I still have some work to do on this, but just to check in: I've found two reasons why a losslessly recompressed PNG could still fail a whole-file checksum test:

  1. This thread says some PNGs contain creation timestamps in their data, so whole-file checksum tests would fail unless those timestamps are identical.

  2. Some PNGs (including some of the ones in this repo) contain a nonstandard data chunk, "iDOT", which is an interesting side story; the tl;dr is that it's a nonstandard Apple extension that's unused by non-Apple PNG readers. If the compressors discarded these chunks, that would also change the checksum.
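Both failure modes above come down to ancillary PNG chunks (timestamp chunks, Apple's iDOT) that a compressor may drop or rewrite without touching pixel data. PNG's container is simple enough to walk with the stdlib: an 8-byte signature followed by repeated `[length][type][data][crc]` records, so it's easy to sketch a tool that lists or strips the chunks responsible for checksum mismatches. This is an illustrative sketch, not code from the PR.

```python
# Walk a PNG's chunk list and rebuild it without selected ancillary chunks.
# PNG layout: 8-byte signature, then repeated [length][type][data][crc].
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def chunks(data: bytes):
    """Yield (type, payload) for each chunk in a PNG byte string."""
    assert data[:8] == PNG_SIG, "not a PNG"
    pos = 8
    while pos < len(data):
        (length,) = struct.unpack(">I", data[pos:pos + 4])
        ctype = data[pos + 4:pos + 8]
        yield ctype.decode("ascii"), data[pos + 8:pos + 8 + length]
        pos += 12 + length  # 4 length + 4 type + data + 4 crc

def strip_ancillary(data: bytes, drop=("tIME", "iDOT")) -> bytes:
    """Rebuild the PNG without the named ancillary chunks, recomputing CRCs."""
    out = bytearray(PNG_SIG)
    for ctype, payload in chunks(data):
        if ctype in drop:
            continue
        raw = ctype.encode("ascii") + payload
        out += struct.pack(">I", len(payload)) + raw
        out += struct.pack(">I", zlib.crc32(raw))
    return bytes(out)
```

Stripping such chunks from both files before hashing would rule them out as the source of a checksum mismatch, isolating any remaining difference to the pixel data itself.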

The closest thing I could find to a clean off-the-shelf test for equality was ImageMagick's `compare -metric AE a.png b.png null:`. That looks promising, but I still need to test it out.
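ImageMagick's AE ("absolute error") metric is a count of pixels that differ at all, so a result of zero means the two decoded images are pixel-identical. The same check over already-decoded pixel sequences is a one-liner; this stdlib sketch is an illustration of the metric, not a replacement for the ImageMagick call.

```python
# AE-style check: count pixels that differ between two decoded images.
# Zero means the images are pixel-identical (i.e. the recompression was lossless).
def absolute_error(pixels_a, pixels_b) -> int:
    if len(pixels_a) != len(pixels_b):
        raise ValueError("images differ in size")
    return sum(1 for a, b in zip(pixels_a, pixels_b) if a != b)
```

In a verification workflow, decoding both PNGs (e.g. with Pillow) and asserting `absolute_error(...) == 0` would be the acceptance gate.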
