Compressing Files And Archiving in Linux Terminal
There are powerful compression utilities in Linux that can be run easily from the command line. Linux also features an archiving utility called tar, with which you can bundle many files into a single tar file that is then treated as one file. You may wonder how this is different from a folder: unlike a folder, a tar archive is a single file, so it can be copied, moved, or compressed as one unit while preserving the directory structure stored inside it.
Linux features the zip and gzip utilities to compress and extract (uncompress) files. There are also bzip and bzip2, but here we will talk about zip and gzip.
At the command prompt, use this format to compress a file:
$ gzip filename
The file will be compressed and saved as filename.gz. To extract a compressed file:
$ gunzip filename.gz
The file filename.gz will be deleted, and the extracted file, filename, is created in the current folder.
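As a quick sketch of the round trip (using a hypothetical file named report.txt), and noting that the -k option tells gzip to keep the original file instead of deleting it:
$ gzip report.txt        # produces report.txt.gz and removes report.txt
$ gunzip report.txt.gz   # restores report.txt and removes report.txt.gz
$ gzip -k report.txt     # compress but keep the original file as well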
Similarly, to use the zip format so that the file can be opened even on a Windows machine, you can use the zip utility:
$ zip filename.zip filename    # to compress
$ unzip filename.zip           # to extract
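Unlike gzip, zip can also bundle several files or a whole directory into one archive. A minimal sketch, assuming a hypothetical directory named project/:
$ zip -r project.zip project/   # -r recurses into the directory and adds everything inside it
$ unzip project.zip             # recreates project/ and its contents in the current folder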
Archiving at the command prompt
You will need to know the four basic options of the tar command that are generally used:
c: create a new archive
v: verbose; show the files being archived
x: extract files from an archive
f: use the filename that follows as the archive file; with c it names the archive to create, and with x it names the archive to extract from
You can add several files to the archive with the tar utility as follows:
$ tar -cvf filename.tar /directory1/directory2/directory3 /directory1/directory2/directory3
The files under /directory1/directory2/directory3 will be added to the tar archive with their paths preserved, so that directory1, directory2 (created under directory1) and directory3 (created under directory2) are recreated when the archive is extracted. You can, of course, pass a different path as the second argument to archive more than one location in the same command.
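To unpack the archive later, combine the x, v and f options. A short sketch; the -C option (supported by common tar implementations) extracts into a directory other than the current one:
$ tar -xvf filename.tar            # extract into the current directory, listing each file as it is restored
$ tar -xvf filename.tar -C /tmp    # extract into /tmp instead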
I hope this blog helps you understand how to compress, extract, and archive files in Linux/Ubuntu.
Thanks for reading.