git clone --filter + git sparse-checkout downloads only the required files

E.g., to clone only the files in subdirectory small/ in this test repository:

```
git clone -n --depth=1 --filter=tree:0 \
  <repository URL>
cd test-git-partial-clone-big-small
git sparse-checkout set --no-cone small
git checkout
```

You could also select multiple directories for download with:

```
git sparse-checkout set --no-cone small small2
```

This method doesn't work for individual files, however, but there is another method that does.

In this test, the clone is basically instantaneous, and we can confirm that the cloned repository is very small, as desired:

```
du --apparent-size -hs *
```

The test repository contains:

- a big/ subdirectory with 10x 10MB files
- small/ and small2/ subdirectories with 1000 files of size one byte each
- small files on toplevel, numbered up to 9 (this is because certain previous attempts would download toplevel files)

All contents are pseudo-random and therefore incompressible, so we can easily notice if any of the big files were downloaded: if you downloaded anything you didn't want, you would get 100 MB extra, and it would be very noticeable.

On the above, git clone downloads a single object, presumably the commit:

```
Cloning into 'test-git-partial-clone-big-small'...
remote: Counting objects: 100% (1/1), done.
remote: Total 1 (delta 0), reused 1 (delta 0), pack-reused 0
```

And then the final checkout downloads the files we requested:

```
remote: Enumerating objects: 3, done.
remote: Counting objects: 100% (3/3), done.
remote: Compressing objects: 100% (3/3), done.
remote: Total 3 (delta 0), reused 3 (delta 0), pack-reused 0
```
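To reproduce the size comparison locally without the remote repository, you can generate a repository with the same shape as the one described above. This is a minimal sketch under assumptions: the directory name `test-git-partial-clone-sketch` and the per-file naming are made up here, only the sizes (10x 10MB incompressible files, 1000 one-byte files) follow the description:

```shell
set -e

# Hypothetical local stand-in for the test repository described above.
rm -rf test-git-partial-clone-sketch
mkdir -p test-git-partial-clone-sketch/big test-git-partial-clone-sketch/small
cd test-git-partial-clone-sketch

# big/: 10 files of 10 MB each, filled with pseudo-random data so they are
# incompressible and any accidental download of them shows up clearly in du.
for i in 0 1 2 3 4 5 6 7 8 9; do
  head -c 10000000 /dev/urandom > "big/$i"
done

# small/: 1000 files of one byte each.
for i in $(seq 0 999); do
  printf a > "small/$i"
done

git init -q
git add .
git -c user.name=test -c user.email=test@example.com commit -q -m 'test data'

# Sanity check: big/ is ~100 MB of incompressible data, small/ only ~1 KB.
du --apparent-size -hs big small

cd ..
```

Cloning such a repository with `--filter=tree:0` and `git sparse-checkout set --no-cone small` should then fetch only the one-byte files, which `du --apparent-size -hs *` makes immediately visible.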