The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be performed with or without loss of information: lossless compression removes only redundant data, so the content and its quality are identical once the data is decompressed, while lossy compression discards less important data, so the restored copy is of lower quality. Different compression algorithms are more effective for different types of data. Compressing and decompressing data usually takes considerable processing time, so the server performing the operation must have sufficient resources to handle it quickly enough. One simple example of compression is run-length encoding: instead of storing every individual 1 and 0 in a binary sequence, you store how many consecutive positions contain a 1 and how many contain a 0.
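The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration, not a production codec; the function names are our own:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Encode a binary string as (symbol, run length) pairs."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            # Same symbol as the previous position: extend the current run.
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            # Symbol changed: start a new run of length 1.
            runs.append((b, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (symbol, run length) pairs back into the original string."""
    return "".join(b * n for b, n in runs)

data = "0000011111111000"
encoded = rle_encode(data)
print(encoded)  # [('0', 5), ('1', 8), ('0', 3)]
assert rle_decode(encoded) == data
```

Three pairs replace sixteen symbols here, which is why run-length encoding pays off on data with long uniform stretches and poorly on data that alternates frequently.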

Data Compression in Cloud Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. LZ4 is considerably faster than most alternatives, particularly at compressing and decompressing non-binary data such as web content. In fact, LZ4 can decompress data faster than it can be read from a hard disk, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several daily backups of all the content kept in the cloud hosting accounts on our servers. Both your content and its backups take up less space, and since ZFS and LZ4 both work extremely fast, generating the backups does not affect the performance of the servers where your content is stored.
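For illustration, on systems where you administer ZFS yourself, LZ4 compression is enabled per dataset with a single property; the dataset name below is a placeholder:

```shell
# Enable LZ4 compression on a ZFS dataset (dataset name is hypothetical)
zfs set compression=lz4 tank/websites

# Confirm the setting and inspect the achieved compression ratio
zfs get compression tank/websites
zfs get compressratio tank/websites
```

The `compressratio` property reports how much space compression is actually saving, which for typical HTML, CSS, and text content tends to be substantial.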