I tried adding this to the matrix, but I couldn't work out how to do
this *and* keep a vanilla 3.1 test in the mix as well. It seems you
can't add different `env`s with `include`, but maybe I missed something.
This method provides a shortcut to finding out how many entries are in
an archive: it reads the count directly from the central directory
rather than iterating through the entire set of entries.
This allows the EOCD records to be read without automatically reading
all of the entry data as well, so that we can do other things faster,
such as providing the number of entries in an archive.
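For illustration, here's a minimal sketch of pulling that count straight
out of the record - not rubyzip's actual code, just the general idea,
assuming a binary EOCD buffer already read from the end of the file:

```ruby
# The EOCD record stores the total entry count at byte offset 10, as a
# little-endian uint16, so no entry ever needs to be parsed to get it.
EOCD_SIGNATURE = "PK\x05\x06".b

def total_entries(eocd_bytes)
  raise ArgumentError, 'not an EOCD record' unless eocd_bytes.start_with?(EOCD_SIGNATURE)

  eocd_bytes[10, 2].unpack1('v') # 'v' = 16-bit little-endian unsigned
end
```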
When loading extra fields from both the central directory and local headers,
unknown fields were not merged correctly. They were being appended, which
meant that we ended up with the two versions stuck together - in some
cases duplicating the field completely.
This broke all kinds of things (like calculating the size of a local
header) in subtle ways.
This commit fixes this by implementing a new `Unknown` extra field type,
and making sure that when reading local and central extra fields they
are stored and preserved correctly. We cannot assume the unknown fields
use the same data in the local and central headers.
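As an illustration only - the names here are assumed, not rubyzip's
actual internals - a minimal sketch of an unknown extra field type that
keeps the two payloads separate:

```ruby
# Keep the local-header and central-directory payloads in separate
# slots instead of appending one onto the other.
class UnknownExtraField
  attr_reader :local_data, :central_data

  # Re-reading the same header overwrites its slot rather than
  # appending, so a field can never be duplicated.
  def merge(binstr, local: false)
    if local
      @local_data = binstr
    else
      @central_data = binstr
    end
  end

  def to_local_bin
    local_data.to_s
  end

  def to_c_dir_bin
    central_data.to_s
  end
end
```

Writing each header back from its own slot means local data can never
leak into the central directory, or vice versa.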
Fixes #505.
If a zip file has a comment that is 65,535 characters long - which is a
valid length, and the maximum allowable - the initial read of the
archive fails to find the End of Central Directory Record and therefore
cannot read the rest of the file.
This commit fixes this by making sure that we look far enough back from
the end of the file to find the EOCD record. A test has been added to
catch regressions.
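The core of the fix can be sketched like this (standalone Ruby assuming
a file opened in binary mode, not rubyzip's actual code): the record is
22 bytes plus a comment of up to 65,535 bytes, so the backwards search
has to cover up to 65,557 bytes.

```ruby
EOCD_BASE_SIZE   = 22
MAX_COMMENT_SIZE = 0xFFFF # 65,535

def find_eocd_offset(io)
  # Never look further back than the file itself, or the maximum
  # possible distance of the EOCD record from the end.
  span = [io.size, EOCD_BASE_SIZE + MAX_COMMENT_SIZE].min
  io.seek(-span, IO::SEEK_END)
  tail = io.read(span)
  index = tail.rindex("PK\x05\x06".b) or raise 'EOCD record not found'
  io.size - span + index
end
```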
Fixes #508.
so that contributors will be aware that metrics are being gathered on this repo over the next year. If there's a better place to put this please let me know. :)
We were previously trying to work out where the next entry would be,
even with GP bit 3 set, but the logic was flaky and could not really be
correct given the data available. Seeking past such an entry isn't
expected behaviour, so we now raise an error instead.
This means that we get rid of the incorrect `Entry.data_descriptor_size`,
which was doing more harm than good.
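A rough sketch of the new behaviour, with assumed names rather than
rubyzip's actual internals: when GP bit 3 is set, the real sizes live in
a data descriptor *after* the compressed data, so the headers alone
cannot tell us where the next entry starts.

```ruby
GP_FLAG_BIT_3 = 0x0008

def next_entry_offset(entry)
  if (entry.gp_flags & GP_FLAG_BIT_3) != 0
    # Previously we guessed at a data descriptor size here; now we refuse.
    raise 'sizes are deferred to a data descriptor; cannot locate the next entry'
  end

  # Without bit 3 the compressed size in the local header is trustworthy.
  entry.local_header_offset + entry.local_header_size + entry.compressed_size
end
```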