Data compression is the reduction of the number of bits needed to store or transmit information. The process matters in the web hosting field because data recorded on hard drives is typically compressed so that it takes up less space. There are various compression algorithms, and their effectiveness depends on the content. Some of them remove only redundant bits, so no information is lost, while others discard less important bits, which results in lower quality once the data is uncompressed. Compressing and uncompressing data takes a considerable amount of processing time, so a hosting server has to be powerful enough to handle it on the fly. A simple example of how binary data can be compressed is to record that a sequence contains five consecutive 1s instead of storing all five 1s, a technique known as run-length encoding.
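
To illustrate the idea, here is a minimal run-length encoding sketch in Python; the function names and the sample bit string are purely illustrative and are not part of any hosting platform.

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical bits into (bit, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)   # extend the current run
        else:
            runs.append((bit, 1))               # start a new run
    return runs


def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (bit, count) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)


# "Remembering" five consecutive 1s instead of storing each one:
encoded = rle_encode("11111000")
print(encoded)                      # [('1', 5), ('0', 3)]
assert rle_decode(encoded) == "11111000"   # lossless: original fully restored
```

Because the decoded output matches the input exactly, this is a lossless scheme, the same property the redundancy-removing algorithms mentioned above rely on.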

Data Compression in Shared Web Hosting

The compression algorithm that we employ on the cloud web hosting platform where your new shared web hosting account will be created is called LZ4, and it's used by the advanced ZFS file system that powers the platform. The algorithm outperforms the ones other file systems use because its compression ratio is much higher and it processes data considerably faster. The speed advantage is most noticeable when content is uncompressed, as this happens faster than the same information can be read from a hard disk. As a result, LZ4 improves the performance of every website hosted on a server that uses the algorithm. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate multiple daily backups of the full content of all accounts and keep them for a month. Not only do these backups take up less space, but generating them doesn't slow the servers down, as often happens with other file systems.
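
As a rough illustration of how lossless LZ4 compression behaves, the sketch below uses the third-party Python "lz4" package (installable with pip). The package, the sample HTML snippet, and the printed figures are assumptions for demonstration only and are not tied to our platform.

```python
# A minimal LZ4 compression sketch; assumes the "lz4" package is installed.
import lz4.frame

# Web content tends to be repetitive (HTML tags, CSS rules), so it
# usually compresses well with LZ4.
page = ("<div class='post'><p>Hello, world!</p></div>\n" * 1000).encode()

compressed = lz4.frame.compress(page)
restored = lz4.frame.decompress(compressed)

assert restored == page                      # lossless: nothing is altered
print(f"original:   {len(page)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"ratio:      {len(page) / len(compressed):.1f}x")
```

The ratio printed will vary with the content, but highly repetitive data of this kind typically shrinks many times over while still decompressing very quickly.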

Data Compression in Semi-dedicated Hosting

The ZFS file system that runs on the cloud platform where your semi-dedicated hosting account will be created uses a powerful compression algorithm called LZ4. It is among the best algorithms available and by far the most efficient one for compressing and uncompressing web content: its compression ratio is very high, and it uncompresses data faster than the same data could be read from a hard drive if it were stored uncompressed. As a result, LZ4 speeds up any Internet site that runs on a platform where the algorithm is enabled. This high performance requires a lot of CPU processing time, which is provided by the large number of clusters working together as part of our platform. In addition, LZ4 allows us to generate several backup copies of your content every day and keep them for a month, as they take up much less space than regular backups and are created considerably faster without putting load on the servers.
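
For a sense of why LZ4-compressed backups can be both smaller and quick to restore, here is a small Python sketch that compresses a mock backup payload and times its decompression. The lz4 package, the sample data, and the resulting numbers are illustrative assumptions, not measurements from the semi-dedicated platform.

```python
# Illustrative only: compresses a mock backup payload with LZ4 and times
# how fast it can be decompressed on the machine running the script.
import time
import lz4.frame

payload = ("user_id,login,last_visit\n"
           + "1001,demo,2024-01-01\n" * 200_000).encode()

compressed = lz4.frame.compress(payload)

start = time.perf_counter()
restored = lz4.frame.decompress(compressed)
elapsed = time.perf_counter() - start

assert restored == payload   # the backup is reproduced bit for bit
print(f"backup size:   {len(payload)} bytes raw, {len(compressed)} bytes compressed")
print(f"decompression: {len(payload) / elapsed / 1e6:.0f} MB/s on this machine")
```

The smaller compressed size is what lets multiple daily copies be kept for a month, and the high decompression throughput is what keeps restores fast.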