What's faster, cp -R or unpacking tar.gz files?

Buttle Butkus asked:

I have some tar.gz files that total many gigabytes on a CentOS system. Most of the tar.gz files are actually pretty small, but the ones with images are large. One is 7.7G, another is about 4G, and a couple are around 1G.

I have unpacked the files once already and now I want a second copy of all those files.

I assumed that copying the unpacked files would be faster than re-unpacking them. But I started running cp -R about 10 minutes ago and so far less than 500M has been copied. I feel certain that the unpacking process was faster.

Am I right?

And if so, why? It doesn’t seem to make sense that unpacking would be faster than simply duplicating existing structures.

My answer:


Reading one large file sequentially is much faster than reading a bunch of smaller files, because the disk doesn't have to seek around or look up metadata for every file. On top of that, the compressed archive holds fewer bytes than the data it unpacks to, so there's less to read in the first place. That's generally true even though the CPU has to decompress the stream; decompression is cheap compared to disk I/O.
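
If you want to see the difference on your own system, a rough comparison is easy to run. This is just a sketch: the paths (images.tar.gz, /data/copy1, /data/copy2, /data/copy3) are placeholders for your own layout, and dropping the page cache between runs (as root) keeps the second test from being served out of RAM instead of from disk.

# time a second extraction of the archive
sync && echo 3 > /proc/sys/vm/drop_caches
time tar -xzf images.tar.gz -C /data/copy2

# time copying the already-unpacked tree
sync && echo 3 > /proc/sys/vm/drop_caches
time cp -R /data/copy1 /data/copy3

The tar run does one sequential read of the compressed file plus the writes; the cp run has to read every unpacked file back in before writing it out again.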


View the full question and answer on Server Fault.

This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.