Data compression is the process of encoding data using fewer bits than the original representation, so the compressed data needs less disk space to store or less bandwidth to transmit, and more content can fit in the same amount of space. There are many compression algorithms that work in different ways. Some remove only redundant bits, so that when the data is uncompressed it is identical to the original with no loss of quality; these are known as lossless algorithms. Lossy algorithms, in contrast, discard bits deemed unnecessary, so uncompressing the data yields lower quality than the original. Compressing and uncompressing content consumes a significant amount of system resources, in particular CPU time, so any hosting platform that applies compression in real time must have sufficient processing power to support it. A simple example of compression is replacing a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the sequence itself.
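The 111111 → 6x1 idea above is known as run-length encoding. Here is a minimal sketch in Python (the function names are illustrative, not from any particular library); note that decoding restores the input exactly, which is what makes this a lossless technique:

```python
def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Run-length encode a string into (count, symbol) pairs."""
    runs: list[tuple[int, str]] = []
    for symbol in bits:
        if runs and runs[-1][1] == symbol:
            # Same symbol as the previous run: extend its count.
            runs[-1] = (runs[-1][0] + 1, symbol)
        else:
            # New symbol: start a fresh run of length 1.
            runs.append((1, symbol))
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Expand (count, symbol) pairs back into the original string."""
    return "".join(symbol * count for count, symbol in runs)

print(rle_encode("111111"))        # [(6, '1')] — "6x1"
print(rle_decode([(6, "1")]))      # '111111' — restored with no loss
```

Run-length encoding only saves space when the data actually contains long runs; on data with no repetition, the (count, symbol) pairs can be larger than the input, which is one reason practical algorithms combine several techniques.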

Data Compression in Web Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It is significantly faster than comparable algorithms, especially at compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than it can be read from a hard drive, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backup copies of all the content in the web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the servers where your content is stored.
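LZ4, like the other algorithms ZFS supports, is lossless: what comes out of decompression is byte-for-byte identical to what went in. LZ4 itself is not part of the Python standard library, so the sketch below uses the built-in zlib module (DEFLATE) purely as a stand-in to illustrate the same round-trip property on repetitive web-style content:

```python
import zlib

# Repetitive markup compresses well, much like typical web content.
original = b"<html><body>" + b"<p>repeated page content</p>" * 100 + b"</body></html>"

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

# Lossless: the restored data is identical to the original.
assert restored == original
# And the compressed form needs far fewer bytes to store.
assert len(compressed) < len(original)
print(f"{len(original)} bytes -> {len(compressed)} bytes")
```

On a real ZFS dataset this round trip happens transparently inside the file system: blocks are compressed on write and uncompressed on read, so applications never see the compressed form at all.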