Data compression is the process of encoding information with fewer bits than the original representation, so the compressed data needs less disk space and more content fits in the same amount of storage. Different compression algorithms work in different ways: lossless algorithms remove only redundant bits, so when the data is uncompressed there is no loss of quality, while lossy algorithms discard bits considered less important, so the uncompressed data ends up with lower quality than the original. Compressing and uncompressing content consumes considerable system resources, in particular CPU time, so any Internet hosting platform that applies compression in real time must have adequate processing power to support the feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there should be instead of storing the whole sequence.
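The replacement described above, storing the length of each run of identical bits instead of the bits themselves, is the idea behind run-length encoding. A minimal sketch in Python (the function names and the "6x1"-style format are illustrative, not a standard):

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string, e.g. "111111" -> "6x1"."""
    if not bits:
        return ""
    runs = []
    count = 1
    for prev, curr in zip(bits, bits[1:]):
        if curr == prev:
            count += 1          # same bit as before: extend the current run
        else:
            runs.append(f"{count}x{prev}")  # run ended: record its length
            count = 1
    runs.append(f"{count}x{bits[-1]}")      # record the final run
    return " ".join(runs)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding, e.g. "6x1" -> "111111" (lossless)."""
    if not encoded:
        return ""
    return "".join(int(n) * ch
                   for n, ch in (run.split("x") for run in encoded.split()))

print(rle_encode("111111"))    # -> 6x1
print(rle_encode("11100001"))  # -> 3x1 4x0 1x1
```

Because decoding restores the input exactly, this is a lossless scheme; it pays off only when the data actually contains long runs, which is why real compressors combine several techniques.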