garreh Posted April 12, 2008

Hi all developers, let me know what you think of this new idea for a compression algorithm: whether it would work, what the complications are, and so on.

Video files are compressed using codecs; let's take XviD as an example. Rather than storing every frame in full, the codec stores only the information that has changed between frames, reducing the amount of data in the file. In effect, a "map" is created describing what has changed, and that map is local to the file. Now, what if the same idea were applied across multiple files in a predefined directory, with a single mapping file shared between them and used almost like a lookup table?

One downside, of course, is that if the mapping file is deleted, every file that points into it becomes corrupt. A workaround would be to copy the shared data back into any file that would otherwise be corrupted whenever a deletion would leave it dangling.

Another downside: suppose a series of Red Dwarf episodes is encoded at different bitrates. The intro is the same for the whole season, yet it won't be identical bit for bit, simply because it was encoded at different bitrates. The workaround would be to analyse each frame of all the video files being compressed, compare how different they are, and decide whether certain frames can be described as one with little to no difference in human perception.

This theory has a lot of caveats and I'm not sure of the technicalities: whether it could be incorporated into the codec, or even whether the whole concept would save that much data. But it seems there is a lot of duplicated data from one video file to the next, and simply pointing to that shared information could save a great deal of space. This is particularly true among files from the same TV series, where the intro and outro are identical, the scenes are similar, and so forth.

This idea is loosely related to an article I read a few years ago by a professor at some university.
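To make the idea concrete, here is a minimal sketch of the shared-mapping concept at the block level rather than the frame level. Everything here is hypothetical (the `BLOCK_SIZE`, the function names, and storing whole blocks in a Python dict standing in for the "mapping file"): each file is split into fixed-size blocks, duplicate blocks are stored once in a shared store keyed by hash, and each file is reduced to an ordered list of block hashes.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size for this sketch

def build_map(files):
    """Split each file into blocks; store each unique block once.

    files: dict of filename -> bytes.
    Returns (store, manifests) where store is the shared "mapping file"
    (block hash -> block bytes) and manifests maps each filename to its
    ordered list of block hashes.
    """
    store = {}
    manifests = {}
    for name, data in files.items():
        hashes = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            h = hashlib.sha256(block).hexdigest()
            store.setdefault(h, block)  # duplicate blocks stored only once
            hashes.append(h)
        manifests[name] = hashes
    return store, manifests

def restore(store, manifest):
    """Rebuild a file's bytes from its manifest and the shared store."""
    return b"".join(store[h] for h in manifest)
```

With two "episodes" that share an identical intro, the intro's blocks land in the store only once, and either file can still be rebuilt exactly. Deleting the store corrupts both files, which is the downside described above.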
I'm sorry I can't quite remember the details, but the general idea was the same: speed up P2P and torrent downloads by finding duplicate pieces of a file in other files from different sources. So, for example, you're downloading Battlestar Galactica and some piece of that file is exactly the same as a piece found in a download of a Windows XP hotfix file. With Microsoft's very fast servers as an extra source, it would dramatically improve the download speed.

What are your thoughts?
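The cross-source piece lookup could be sketched roughly like this (again hypothetical: the `PIECE_SIZE`, the function names, and using in-memory bytes in place of real files are all assumptions of the sketch). Pieces of files already on disk are indexed by hash, so a new download can check whether any of its wanted pieces can be copied from a local file instead of fetched over the network:

```python
import hashlib

PIECE_SIZE = 16384  # assumed fixed piece size, BitTorrent-style

def index_pieces(available_files):
    """Index every piece of every local file by its hash.

    available_files: dict of filename -> bytes.
    Returns dict of piece hash -> (filename, byte offset).
    """
    index = {}
    for name, data in available_files.items():
        for off in range(0, len(data), PIECE_SIZE):
            piece = data[off:off + PIECE_SIZE]
            index[hashlib.sha1(piece).hexdigest()] = (name, off)
    return index

def local_sources(wanted_piece_hashes, index):
    """Return the wanted pieces that already exist locally, with locations."""
    return {h: index[h] for h in wanted_piece_hashes if h in index}
```

A real client would get `wanted_piece_hashes` from the torrent's metadata; any hit in `local_sources` is a piece it never has to download.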