torrentguy2012 Posted March 13, 2012

I've been thinking: can't we just generate chunks of data locally and then use them to feed into the downloading file? I've used the repair feature to avoid re-downloading whole big files when I already had a similar version. For example: I have a 1 GB binary file, and someone on their end patches their 1 GB copy by changing 20 MB of it. He shares it with me, and I let the torrent repair and download only the 20 MB that the other guy changed.

Back to the topic: can't something similar be done to avoid downloading chunks of data again and again, by pulling matching chunks from local storage (from a database of randomly generated chunks, etc.)?
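For context, the "repair" behaviour described above is essentially a hash check (force re-check): the client hashes each fixed-size piece of the local file and compares it against the SHA-1 piece hashes stored in the .torrent metadata, then marks only mismatching pieces for download. A minimal sketch of that check, assuming a hypothetical piece size and helper name (real clients read the piece size from the torrent metadata):

```python
import hashlib

PIECE_SIZE = 256 * 1024  # illustrative; the real value comes from the .torrent file


def pieces_to_download(path, expected_hashes, piece_size=PIECE_SIZE):
    """Hash each piece of the local file and return the indices of
    pieces whose SHA-1 does not match the torrent's piece hashes.
    Only those pieces would need to be fetched from peers."""
    needed = []
    with open(path, "rb") as f:
        for index, expected in enumerate(expected_hashes):
            piece = f.read(piece_size)
            if hashlib.sha1(piece).digest() != expected:
                needed.append(index)
    return needed
```

So if 20 MB of a 1 GB file changed, only the pieces overlapping that 20 MB fail the hash check and get re-downloaded, which matches the repair scenario in the post.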