So basically, any company or group could theoretically increase its overall download speed by giving you tiny chunks of data that correspond to multiple possible downloads. It may take something like artificial intelligence to predict what you are likely to download.
So compression is where computers find patterns in data and replace them with smaller, abridged segments. If you send a text file, the compressor might go through and replace every common word with a single symbol.
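To make that concrete, here is a minimal sketch of the word-for-symbol idea in Python. The word list and one-character codes are invented for illustration; real compressors like gzip find repeated patterns automatically rather than using a fixed dictionary.

```python
# Minimal sketch of dictionary compression: swap common words for short
# codes. The dictionary here is made up; real compressors discover
# repeated patterns on their own. Splitting on whitespace is naive and
# assumes single-spaced text.
DICTIONARY = {"the": "\x01", "and": "\x02", "that": "\x03", "with": "\x04"}
REVERSE = {code: word for word, code in DICTIONARY.items()}

def compress(text: str) -> str:
    """Replace each dictionary word with its one-character code."""
    return " ".join(DICTIONARY.get(word, word) for word in text.split())

def decompress(data: str) -> str:
    """Replace each code with the original word."""
    return " ".join(REVERSE.get(token, token) for token in data.split())

original = "the cat and the dog ran with that ball"
packed = compress(original)
assert decompress(packed) == original
print(len(original), "->", len(packed), "characters")
```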
The reason quantum computing could make a difference is that it might search through thousands of formulas and data segments the user keeps locally, and find an optimal combination that turns a short coded message into the data being downloaded.
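I can't show a quantum algorithm, but the underlying search problem can be sketched classically: given a library of segments the user already holds, find a combination of them that reconstructs the target. Here is a hedged stand-in using a greedy longest-match scan over a made-up library, standing in for where a quantum (or just smarter) search might find a truly optimal combination:

```python
# Classical stand-in for "find the best combination of local segments".
# The segment library is invented for illustration; greedy longest-match
# is used in place of the hypothesized optimal search.
LIBRARY = {1: b"hello ", 2: b"world", 3: b"hello world, ", 4: b"again"}

def encode(target: bytes) -> list:
    """Greedily cover `target` with library segment IDs, falling back
    to literal bytes when nothing in the library matches."""
    recipe, i = [], 0
    while i < len(target):
        # Prefer the longest library segment that matches at position i.
        best = max(
            (sid for sid, seg in LIBRARY.items() if target.startswith(seg, i)),
            key=lambda sid: len(LIBRARY[sid]),
            default=None,
        )
        if best is None:
            recipe.append(target[i:i + 1])  # literal byte, sent as-is
            i += 1
        else:
            recipe.append(best)             # just an ID, not the bytes
            i += len(LIBRARY[best])
    return recipe

def decode(recipe: list) -> bytes:
    return b"".join(LIBRARY[item] if isinstance(item, int) else item
                    for item in recipe)

msg = b"hello world, hello again!"
assert decode(encode(msg)) == msg  # only IDs plus one "!" crossed the wire
```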
So an extreme example of this is that Netflix may send you a movie you're likely to want to see, and then when you "download" it, it is instantaneous. Obviously this saves nothing in terms of data size, but what if 10 videos you may watch all have the same introduction? Netflix can then send you this intro once and save 90% of the redundant downloading (nine of the ten copies).
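One plausible mechanism for spotting a shared piece like that intro is content-addressed chunking: split files into chunks, name each chunk by its hash, and only transfer the chunks the receiver doesn't already have. A toy sketch, with chunk size and "videos" invented for illustration:

```python
# Sketch of content-addressed chunking: identical chunks hash to the
# same ID, so a shared intro is downloaded only once.
import hashlib

CHUNK_SIZE = 4  # tiny for demonstration; real systems use KB-MB chunks

def chunk_ids(data: bytes) -> list:
    """Hash each fixed-size chunk so identical chunks get identical IDs."""
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    return [(hashlib.sha256(c).hexdigest(), c) for c in chunks]

video_a = b"INTROINTROfilm-A"
video_b = b"INTROINTROfilm-B"   # same intro, different ending

cache = {h: c for h, c in chunk_ids(video_a)}        # already downloaded
missing = [(h, c) for h, c in chunk_ids(video_b) if h not in cache]

print(f"video_b needs {len(missing)} of {len(chunk_ids(video_b))} chunks")
```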
So this is quite poorly written, but here is a vision for better compression. (Now, in honesty, I doubt I know enough to make a dent in work already done, but whatever.) The idea is that senders of data, or third parties that companies can hire, could significantly decrease sending time, cost, and internet traffic by using an improved form of compression. Actually, it could be an open protocol anyone can use.
The idea is that one can download a protocol package of data and formulas, up to various sizes, and those sending downloads can tell your computer how to use all this data in clever combinations to recreate the desired files locally. I'm not sure of the math, but I believe that with 100 gigs of data and good processing power you... ok, out of time.
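For what it's worth, here is a hedged sketch of what that protocol might look like on the wire: the receiver pre-downloads blocks of data plus a few "formulas" (transformations), and the sender ships only a tiny recipe describing how to combine them. Every name, block, and operation here is invented for illustration:

```python
# Sketch of the recipe protocol: pre-shared blocks plus simple formulas,
# driven by a small message from the sender.
LOCAL_BLOCKS = {"greeting": b"hello", "sep": b", ", "subject": b"world"}

def op_block(name):
    """Fetch a pre-shared block by name."""
    return LOCAL_BLOCKS[name]

def op_repeat(name, times):
    """A simple 'formula': repeat a pre-shared block."""
    return LOCAL_BLOCKS[name] * times

OPS = {"block": op_block, "repeat": op_repeat}

def reconstruct(recipe):
    """Run each step of the recipe and concatenate the results."""
    return b"".join(OPS[op](*args) for op, *args in recipe)

# The sender transmits only this recipe, not the payload it expands into.
recipe = [("block", "greeting"), ("block", "sep"),
          ("repeat", "subject", 3)]
print(reconstruct(recipe))  # b'hello, worldworldworld'
```

The payoff is that the recipe stays a few dozen bytes no matter how large the blocks and formulas it expands into are, which is the whole bet behind keeping a big local library.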