Local proxies are gaining favour at the moment, thanks to the publicity around Google's Web Accelerator as well as long-standing products like my favourite, the Proxomitron.
I was thinking that someone could write a small personal proxy that, whenever it gets a request for, say, http://asite.com/photo.jpg, first tests for http://asite.com/photo.jpg.zip.
If it finds photo.jpg.zip, it fetches that instead, unpacks photo.jpg, and hands that up to the browser (or to the next proxy in the chain).
Zip isn't a big improvement on JPEGs, but there are other compression formats that do better. Or webmasters could offer photo.jpg.jpeg2000 and the proxy could convert it to a JPEG.
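The core idea can be sketched in a few lines. This is a minimal illustration, not a working proxy: the `.zip` naming convention and the assumption that the archive holds exactly one file (the original resource) are mine.

```python
import io
import zipfile
import urllib.request
import urllib.error

def unpack_single(zip_bytes: bytes) -> bytes:
    """Extract the one file a packed variant is assumed to contain."""
    archive = zipfile.ZipFile(io.BytesIO(zip_bytes))
    inner_name = archive.namelist()[0]
    return archive.read(inner_name)

def fetch_with_zip_fallback(url: str) -> bytes:
    """Probe url + '.zip' first; on a miss, fetch the plain resource."""
    try:
        with urllib.request.urlopen(url + ".zip") as resp:
            return unpack_single(resp.read())
    except urllib.error.URLError:
        # No packed variant offered; fall back to the original URL.
        with urllib.request.urlopen(url) as resp:
            return resp.read()
```

The browser never sees the `.zip` step: it asked for photo.jpg and gets photo.jpg bytes back.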
Bonus points for the following features:
* Pass through to another proxy (fairly necessary these days)
* Record hits and misses per website (after 10 misses for a given website, only check once every 10 requests, so the probes don't fill the site's logs)
* Option to allow only localhost requests, or requests from a specified subnet.
* Use an enhanced robots.txt file to find out which optimised formats a given website offers.
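The hit/miss back-off from the list above could look something like this. The thresholds (10 misses, then probe 1 request in 10) are the ones suggested in the text; the class and names are my own sketch.

```python
from collections import defaultdict

MISS_LIMIT = 10   # consecutive misses before a host is throttled
PROBE_EVERY = 10  # once throttled, probe only 1 request in 10

class ProbePolicy:
    def __init__(self):
        self.misses = defaultdict(int)   # consecutive misses per host
        self.skipped = defaultdict(int)  # requests since the last probe

    def should_probe(self, host: str) -> bool:
        if self.misses[host] < MISS_LIMIT:
            return True
        self.skipped[host] += 1
        if self.skipped[host] >= PROBE_EVERY:
            self.skipped[host] = 0
            return True
        return False

    def record(self, host: str, hit: bool) -> None:
        # A single hit resets the counter, so a site that starts
        # offering packed files is picked up again quickly.
        self.misses[host] = 0 if hit else self.misses[host] + 1
```

Resetting on a hit means the throttle only applies to sites that consistently offer nothing.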
This would mean that as a web developer I could create a set of packed files using much better compression than current web standards allow and potentially reduce my traffic by quite a bit if I can convince my regulars to install the proxy. The advantage of implementing this as a proxy rather than a plug-in is that it would support all browsers. Also, the implementation described wouldn't need websites to replace img tags with embed tags -- since the browser still gets the file type it's expecting.
Repacking libraries could be modular. This proxy could form the basis of a number of web enhancements related to file compression. You wouldn't have to wait for your favourite browser to get updated to support new file formats. ISPs could implement this to reduce their upstream traffic. Big sites with huge traffic bills could see a big saving as more people adopt better compression.
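The modular bit might be as simple as a registry mapping a packed-file suffix to an unpacking function. This is a hypothetical plugin shape, not a spec; a real `.jpeg2000` module would call an image library to transcode, which I've left out.

```python
import io
import zipfile
from typing import Callable, Dict

# suffix probed on the wire -> function turning packed bytes into the original
DECODERS: Dict[str, Callable[[bytes], bytes]] = {}

def decoder(suffix: str):
    """Register an unpacker for resources fetched as url + suffix."""
    def register(fn):
        DECODERS[suffix] = fn
        return fn
    return register

@decoder(".zip")
def unzip(data: bytes) -> bytes:
    archive = zipfile.ZipFile(io.BytesIO(data))
    return archive.read(archive.namelist()[0])

# Supporting a new format is just another registered module --
# no browser update required.
```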