Data compression is the reduction of information to a smaller number of bits for storage or transmission. Compressed data takes up considerably less disk space than the original, so additional content can be stored in the same amount of space. There are many compression algorithms that work in different ways. With some of them, only redundant bits are removed, so when the data is uncompressed there is no loss of quality. Others discard bits in a way that cannot be reversed, so uncompressing the data later results in lower quality than the original. Compressing and uncompressing content consumes system resources, in particular CPU processing time, so any hosting platform that employs real-time compression should have adequate processing power to support this feature. One example of how data can be compressed is run-length encoding: a binary sequence such as 111111 is replaced with 6x1, i.e. the number of consecutive 1s or 0s is stored instead of the full sequence.
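The run-length encoding idea described above can be sketched in a few lines of Python. This is a minimal illustration, not a production codec; the function names are chosen for this example:

```python
# Minimal run-length encoding sketch: each run of repeated symbols
# is stored as a (count, symbol) pair instead of the raw sequence.
from itertools import groupby

def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Collapse runs like '111111' into (6, '1')."""
    return [(len(list(run)), symbol) for symbol, run in groupby(bits)]

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Reverse the encoding: expand each (count, symbol) pair."""
    return "".join(symbol * count for count, symbol in runs)

print(rle_encode("111111"))      # [(6, '1')]
print(rle_encode("0001100000"))  # [(3, '0'), (2, '1'), (5, '0')]

# The encoding is lossless: decoding restores the exact input.
assert rle_decode(rle_encode("0001100000")) == "0001100000"
```

Note that this scheme only saves space when the input contains long runs; for data without repetition, the (count, symbol) pairs can be larger than the original.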
Data Compression in Shared Hosting
The ZFS file system that runs on our cloud hosting platform employs a compression algorithm called LZ4. LZ4 is considerably faster than most other algorithms, especially at compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data quicker than it can be read from a hard disk drive, which improves the overall performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backup copies of all the content kept in the shared hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very quickly, generating the backups does not affect the performance of the web hosting servers where your content is kept.
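The lossless compression of web content described above can be illustrated with a short sketch. LZ4 itself is not in Python's standard library, so this example uses the stdlib `zlib` module as a stand-in to show the same principle: repetitive text such as HTML shrinks dramatically, and decompression restores it bit for bit. The sample markup is made up for the demonstration:

```python
import zlib

# Hypothetical web content: markup is highly repetitive, which is
# exactly the kind of data that compresses well.
html = b"<li class='product'>item</li>\n" * 1000

compressed = zlib.compress(html)
restored = zlib.decompress(compressed)

# Lossless: the restored data is identical to the original.
assert restored == html
print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")
```

On a ZFS dataset this happens transparently at the file-system level rather than in application code, but the trade-off is the same: CPU time spent compressing in exchange for less disk space and, often, faster reads.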