Data compression is the process of reducing the number of bits needed to store or transmit information. It can be done with or without loss of information: lossless compression removes only redundant data, so the original content can be restored exactly, while lossy compression also discards less important data, so the restored content is of lower quality. Different compression algorithms are more effective for different kinds of data. Compressing and decompressing data takes processing time, so the server performing the operation needs adequate resources to handle it quickly enough. A simple example of how information can be compressed is run-length encoding: instead of storing every individual 1 and 0 in a binary sequence, you store how many consecutive positions hold a 1 and how many hold a 0.
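The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration, not a production codec; the function names are chosen for this example only.

```python
from itertools import groupby

def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse a string of '0'/'1' characters into (bit, run_length) pairs."""
    return [(bit, len(list(run))) for bit, run in groupby(bits)]

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (bit, run_length) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

bits = "1111100000000111"
encoded = rle_encode(bits)
print(encoded)  # [('1', 5), ('0', 8), ('1', 3)]

# Lossless: decoding restores the input exactly.
assert rle_decode(encoded) == bits
```

Here sixteen stored symbols shrink to three runs; the longer and more repetitive the sequence, the greater the saving.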

Data Compression in Shared Website Hosting

The ZFS file system that runs on our cloud Internet hosting platform uses a compression algorithm called LZ4. LZ4 is among the fastest compression algorithms available and performs particularly well on non-binary data such as web content. It can even decompress data faster than the data can be read from a hard disk drive, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several backup copies of all the content kept in the shared website hosting accounts on our servers every day. Both your content and its backups require less space, and since both ZFS and LZ4 work extremely fast, backup generation does not affect the performance of the web servers where your content is stored.
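How well text-based web content compresses is easy to demonstrate. The sketch below uses Python's built-in zlib (DEFLATE) as a stand-in, since LZ4 bindings are not in the standard library; LZ4 trades some compression ratio for much higher speed, but the point about repetitive HTML shrinking dramatically holds for both.

```python
import zlib

# A snippet of repetitive markup, typical of generated web content.
html = ("<ul>" + "".join(f"<li>item {i}</li>" for i in range(200)) + "</ul>").encode()

compressed = zlib.compress(html, level=6)
print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")

# Lossless: decompression restores the page byte for byte.
assert zlib.decompress(compressed) == html
```

On this input the compressed form is a small fraction of the original size, which is why storing compressed content and its backups requires so much less disk space.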