Re: Change set based shallow clone

From: Linus Torvalds <>
Date: 2006-09-11 12:55:23
On Mon, 11 Sep 2006, Paul Mackerras wrote:
> Do you think there is any way to speed up the cold-cache case for
> git-ls-remote and git-rev-parse with thousands of heads and tags?

Nothing obvious comes to mind.

If we did the same pack-file approach that we do for objects, the problem 
ends up being that _updating_ things is really hard. What we could do (and 
might work) is that a "git repack" would create a "packed representation 
of the heads too".

The issue is that once you create a pack, you _really_ don't want to use 
read-modify-write to modify it ever afterwards. That's how you get nasty 
corruption. The "append-only" side of the git object database is really 
what makes things so stable, and packing multiple heads in the same file 
automatically means that if something goes wrong, it's _disastrously_ 
wrong, in a way it isn't when you have separate files.

So we could generate a "pack of references", but then any modifications 
would be done with the current loose "file objects" approach, and just 
have the filesystem override the pack-files. The problem then actually 
becomes one of _deleting_ branches, because then we'd have to add a 
"negative branch" loose object. Ugly.
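The lookup scheme described above can be sketched as follows. This is a minimal illustration, not git's actual implementation; the file layout (a "packed-refs" file of `<sha1> <refname>` lines, with loose files under the repository directory taking precedence) and the function names are assumptions for the sake of the example:

```python
import os

def read_packed_refs(path):
    # Parse a hypothetical packed-refs file: one "<sha1> <refname>" per line.
    refs = {}
    if os.path.exists(path):
        with open(path) as f:
            for line in f:
                sha, name = line.strip().split(" ", 1)
                refs[name] = sha
    return refs

def resolve_ref(repo, name):
    # The filesystem overrides the pack: a loose ref file, if present,
    # wins over whatever the packed-refs file says.
    loose = os.path.join(repo, name)
    if os.path.exists(loose):
        with open(loose) as f:
            return f.read().strip()
    return read_packed_refs(os.path.join(repo, "packed-refs")).get(name)
```

Note that this sketch shows exactly the deletion problem from the paragraph above: removing the loose file merely re-exposes the packed entry, so a true delete would need some kind of negative marker.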

An alternate approach might be to keep the current filesystem layout, but 
simply do the readdir over the whole reference directory in one pass, then 
sort the entries by inode number, and look them up in order. That would 
help on some filesystems, at least.
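The readdir-then-sort idea can be sketched like so; the function name is made up, but the technique is as described: one pass over the directory, then process entries in inode order so the subsequent stat()/open() calls walk the inode table sequentially instead of seeking randomly:

```python
import os

def refs_in_inode_order(directory):
    # One readdir pass over the whole directory, collecting (inode, name)
    # pairs without touching the inodes themselves yet.
    entries = [(e.inode(), e.name) for e in os.scandir(directory)
               if e.is_file(follow_symlinks=False)]
    # Sort by inode number, so the caller's later lookups hit the
    # on-disk inode table roughly in order.
    entries.sort()
    return [name for _, name in entries]
```

A caller would then stat or open the returned names in order; on filesystems where inode number correlates with on-disk location, that turns thousands of random cold-cache seeks into a mostly sequential scan.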

Received on Mon Sep 11 12:55:58 2006

This archive was generated by hypermail 2.1.8 : 2006-09-11 12:56:37 EST