I have a large folder (about 3000 GB) on an enterprise server. I need to compress it and split the resulting archive into parts of 100 GB each.
I did look into these articles:

- "Use multiple CPU thread/core to make tar compression faster" (www.peterdavehello.org)
- "Multi-Core Compression tools" (askubuntu.com)

But they only cover multi-core compression, not splitting the archive into multiple parts.
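For context, here is the kind of pipeline I have in mind, as a sketch: stream `tar` through a compressor and pipe it into `split` so no intermediate full-size archive is needed. The demo below uses a small temp folder, `gzip`, and 1 MB parts so it can run anywhere; on the real server the same pipeline would use the actual folder path, `-b 100G`, and `pigz` (from the blog post above) for multi-core compression. The paths and sizes are placeholders, not my real layout.

```shell
#!/bin/sh
# Sketch: compress and split in one pass, then reassemble and verify.
# Assumes GNU tar, gzip, and coreutils split/cat are available.
set -e

work=$(mktemp -d)
mkdir -p "$work/bigfolder"
# Stand-in data; the real folder would be the 3000 GB tree.
head -c 3000000 /dev/urandom > "$work/bigfolder/data.bin"

# Compress and split in one streaming pass -- no full-size temp archive.
# On the server this would be: tar -cf - bigfolder | pigz | split -b 100G - ...
( cd "$work" && tar -cf - bigfolder | gzip | split -b 1M - backup.tar.gz.part- )

# Restore: concatenate the parts back into one stream and extract.
mkdir "$work/restore"
cat "$work"/backup.tar.gz.part-* | gzip -d | tar -xf - -C "$work/restore"

# Verify the round trip reproduced the original file byte-for-byte.
cmp "$work/bigfolder/data.bin" "$work/restore/bigfolder/data.bin" && echo "round-trip OK"
```

Is this streaming `tar | compressor | split` approach sound at the 3000 GB scale, or is there a better tool for fixed-size multi-part archives?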