fetch-pack: avoid repeatedly re-scanning pack directory

When we look up a sha1 object for reading via the parse_object() =>
read_sha1_file() => read_object() callpath, we first check the
packfiles, and then the loose objects. If we still haven't found it,
we re-scan the list of packfiles in `objects/pack`. This final step
ensures that we can co-exist with a simultaneous repack process which
creates a new pack and then prunes the old copy of the object.
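
To make the ordering concrete, the lookup roughly takes the shape
below. This is a simplified sketch, not the actual read_object()
code: find_pack_entry() and reprepare_packed_git() correspond to real
routines, while read_packed(), read_loose(), and has_loose_object()
are stand-ins for the real packed/loose readers.

    /* Sketch of the lookup order; error handling etc. omitted. */
    void *read_object_sketch(const unsigned char *sha1,
                             enum object_type *type, unsigned long *size)
    {
            struct pack_entry e;

            if (find_pack_entry(sha1, &e))
                    return read_packed(sha1, type, size);  /* 1: packfiles */
            if (has_loose_object(sha1))
                    return read_loose(sha1, type, size);   /* 2: loose objects */
            /*
             * 3: a simultaneous repack may have packed the object and
             * pruned the old copy; re-scan objects/pack and retry.
             */
            reprepare_packed_git();
            if (find_pack_entry(sha1, &e))
                    return read_packed(sha1, type, size);
            return NULL;
    }
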
This extra re-scan usually does not have a performance impact for
two reasons:

  1. If an object is missing, then typically the re-scan will find a
     new pack, after which no more misses will occur. Or if it truly
     is missing, then our next step is usually to die().

  2. Re-scanning is cheap enough that we do not even notice.

However, these do not always hold. The assumption in (1) is that the
caller is expecting to find the object. This is usually the case,
but the call to `parse_object` in `everything_local` does not follow
this pattern. It is looking to see whether we have objects that the
remote side is advertising, not something we expect to
have. Therefore, if we are fetching from a remote which has many refs
pointing to objects we do not have, we may end up re-scanning the
pack directory many times.
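
Roughly, the loop in question looks like this (a trimmed-down sketch
of everything_local() in fetch-pack.c; exact field and function names
vary between versions):

    /* Sketch: probe every ref the remote advertised. */
    for (ref = *refs; ref; ref = ref->next) {
            struct object *o = parse_object(ref->old_oid.hash);

            if (!o)
                    continue;  /* not ours; the miss ended in a re-scan */

            /* ... otherwise mark what we have as complete ... */
    }

Every advertised ref we do not have pays for the full fallback chain,
including the final re-scan of the pack directory.
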
Even with this extra re-scanning, the impact is often not noticeable
due to (2); we just readdir() the pack directory and skip any packs
that are already loaded. However, if there are a large number of
packs, even enumerating the directory can be expensive, especially
if we do it repeatedly.
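
The per-miss cost is essentially one directory walk, along the lines
of the sketch below (a simplification of the re-scan;
pack_is_already_loaded() is a hypothetical stand-in for the check
against the in-memory pack list, while ends_with() is git's usual
helper):

    /* Sketch of the directory walk done on every re-scan. */
    static void rescan_pack_dir_sketch(void)
    {
            DIR *dir = opendir("objects/pack");
            struct dirent *de;

            if (!dir)
                    return;
            while ((de = readdir(dir)) != NULL) {
                    if (!ends_with(de->d_name, ".idx"))
                            continue;  /* only .idx files name packs */
                    if (pack_is_already_loaded(de->d_name))
                            continue;  /* cheap skip, but readdir() still
                                          enumerates every entry */
                    /* otherwise open and add the new pack */
            }
            closedir(dir);
    }

Even though the per-pack work is trivial, the walk itself is linear
in the number of packs, and we repeat it once per missing object.
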
Having this many packs is a good sign the user should run `git gc`,
but it would still be nice to avoid having to scan the directory at
all.

Signed-off-by: Jeff King <peff@peff.net>
Signed-off-by: Junio C Hamano <gitster@pobox.com>