Data compression is the process of reducing the number of bits needed to store or transmit data. Compressed data takes up less disk space than the original, so more content can be kept in the same amount of space. Compression algorithms work in different ways: some remove only redundant bits, so the data loses no quality when it is uncompressed (lossless compression), while others discard bits deemed unnecessary, so the uncompressed data is of lower quality than the original (lossy compression). Compressing and uncompressing content consumes a considerable amount of system resources, particularly CPU time, so any hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of how information can be compressed is run-length encoding: a binary sequence such as 111111 is replaced with 6x1, i.e. the number of consecutive 1s or 0s is stored instead of the whole sequence.
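To make the run-length idea concrete, here is a minimal Python sketch of that encoding; the function names and the "NxC" output format are hypothetical, chosen just for this illustration:

    def rle_encode(bits):
        # Collapse each run of identical characters into a "count x character"
        # pair, e.g. "111111" -> "6x1", "1110001" -> "3x1 3x0 1x1".
        out = []
        i = 0
        while i < len(bits):
            j = i
            while j < len(bits) and bits[j] == bits[i]:
                j += 1
            out.append(f"{j - i}x{bits[i]}")
            i = j
        return " ".join(out)

    def rle_decode(encoded):
        # Expand each "NxC" pair back into N copies of character C.
        return "".join(c * int(n) for n, c in
                       (pair.split("x") for pair in encoded.split()))

    print(rle_encode("111111"))               # 6x1
    print(rle_decode(rle_encode("1110001")))  # 1110001

Because decoding reproduces the input exactly, this is a lossless scheme: nothing is discarded, only the redundant repetition is stored more compactly.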
Data Compression in Shared Hosting
The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is substantially faster than most other algorithms, particularly for compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than it can be read from a hard disk drive, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backups of all the content kept in the shared hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, the backup generation does not affect the performance of the web servers where your content is stored.
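On our platform LZ4 is applied transparently at the file-system level (on ZFS itself this is enabled per dataset with a command like zfs set compression=lz4 pool/dataset), so no application code is required. Purely as an illustration of how lossless LZ4 compression behaves, the sketch below compresses and restores a block of repetitive, web-like text using the third-party Python lz4 package, assuming it is installed:

    import lz4.frame

    # Highly repetitive data, typical of HTML/CSS/JS, compresses very well.
    original = b"<p>hello world</p>" * 1000

    compressed = lz4.frame.compress(original)
    restored = lz4.frame.decompress(compressed)

    print(len(original), len(compressed))  # compressed size is far smaller
    assert restored == original            # lossless: the data is identical

The assertion at the end is the key property: the uncompressed content is bit-for-bit identical to the original, which is why the file system can compress everything, including backups, without any loss of quality.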