
4 Computers / 1 torrent



Hello all... I have scoured the internet looking for support/direction on this idea. I currently have 4 computers running on my home network, and I was wondering if it would be possible to have each machine download only part of a single torrent to a common network storage point (a server?). My thinking is that I would need separate IP addresses for each computer via a proxy (or a borrowed connection... my neighbour is my best friend, so no problem there), a common storage point that all the machines can access, and a way to 'compile' the torrent once it reaches the storage point so there is no overlap in the downloaded data. I think each computer would have to 'talk' to the storage point to see what part of the remaining file still needs to be downloaded, just as it would on any single machine. If anyone can point me in the right direction, or has any thoughts on this, your feedback would be greatly appreciated, as there is next to nothing on this subject anywhere.


So... running through a proxy wouldn't work? (I know next to nothing about proxy servers.) Would each machine have to be an entirely separate entity on the web, but at the same time all be connected via a switch or hub of some sort? (Sorry to be so naive, but this is a fairly steep learning curve.)


My thinking was to make uTorrent 'think' there were 4 separate IP addresses asking for the same torrent, while in actuality, once the data arrived, it would all be coming to a single IP address. So... how would I go about compiling the information, or reassembling the torrent back into a single file, on the fly so to speak? Would I need a remote file-sharing application of some sort?


Proxying multiple computers on the same connection is STILL using the same connection split across all the computers.

All the computers would need separate internet connections but a shared local connection.

As they complete pieces, they would share those pieces with each other locally.

Meaning that as each computer completes a piece, that completed piece is shared to the others working on the torrent.
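The local piece-sharing idea above can be sketched in a few lines. This is not uTorrent's code; `Client` and `exchange_pieces` are made-up names for illustration, and real clients exchange pieces over the peer wire protocol rather than copying sets in memory:

```python
# Toy sketch: four LAN peers converging on a full copy by exchanging
# completed pieces with each other. Illustrative names only.

TOTAL_PIECES = 8  # a real torrent has thousands of pieces

class Client:
    def __init__(self, name, completed):
        self.name = name
        self.completed = set(completed)  # indices of pieces this client holds

def exchange_pieces(clients):
    """One round of local sharing: every client copies any piece
    another client has already completed (LAN transfer, no internet)."""
    union = set()
    for c in clients:
        union |= c.completed
    for c in clients:
        c.completed |= union
    return union

# Four machines, each having pulled a different quarter from the swarm:
clients = [
    Client("A", {0, 1}),
    Client("B", {2, 3}),
    Client("C", {4, 5}),
    Client("D", {6, 7}),
]
have = exchange_pieces(clients)
print(sorted(have))  # every piece is present somewhere on the LAN
print(all(len(c.completed) == TOTAL_PIECES for c in clients))  # True
```

The point of the sketch: as long as the four machines collectively hold every piece, local exchange completes all four copies without pulling any piece from the internet twice.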


Would uTorrent know how to interpret and use data from potentially 4 different sources (3 machines on the LAN, plus whatever comes from the swarm via the tracker)? What I have right now is 4 different machines downloading the same torrent, and there are spikes up to 2,000 kb/s, but I end up with 4 copies of the same torrent at the end of the download. It is speeding up the download, but only marginally, and there is massive repetition. The average speed is still only around 8 kb/s (it's an old torrent with very few seeders), whereas with a single machine it was down around 3 or so. So there is some improvement, but streamlining it the way I describe should, in theory, get that file onto my HDD 4 times quicker...


I don't see how you can have multiple uTorrent clients writing to the same file, since one doesn't know what the others are doing. If the torrent contains multiple files, you can assign a partial file list to each client to download. Point all the downloads at the same network share. Once you do a force re-check, you should be at 100% when all four clients have downloaded their assigned files.
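A rough sketch of that file-splitting suggestion, assuming a multi-file torrent. The file names and the `assign_files` helper are hypothetical; in practice you'd set the file priorities by hand in each client:

```python
# Sketch: split a multi-file torrent's file list into four roughly equal
# groups by size, one group per client, using a greedy partition.

def assign_files(files, n_clients):
    """Give each file (name, size) to the currently lightest client."""
    groups = [[] for _ in range(n_clients)]
    totals = [0] * n_clients
    for name, size in sorted(files, key=lambda f: -f[1]):
        i = totals.index(min(totals))  # lightest group so far
        groups[i].append(name)
        totals[i] += size
    return groups, totals

# Hypothetical file list (sizes in MB):
files = [("disc1.iso", 8_000), ("disc2.iso", 7_500),
         ("extras.mkv", 6_000), ("soundtrack.flac", 2_000),
         ("booklet.pdf", 500)]
groups, totals = assign_files(files, 4)
print(groups)  # which client downloads which files
print(totals)  # per-client load in MB
```

Each client then downloads only its group to the same network share; the final force re-check finds all the files in place and reports 100%.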


Unfortunately, this is the problem I am having... it is a single 34GB file that I'm downloading, and at the end of the day I was wondering how I could make all 4 machines write to the same cache as they download, and the inverse, read from that same cache so as not to fetch the same data twice, or in this case, 4 times. There should be a way to combine all 4 to work towards a common goal simultaneously. What I do see is that if I've been downloading on one or two machines for a few days and then bring a 3rd or even a 4th online, the upload on the original two goes through the roof (like 2,200 kb/s), and inversely, the XP machines match that on download... The Vista machines, on the other hand, are limited to 10 half-open connections, which... this is the only case where I've seen that limit matter...


High half-open counts do not increase throughput. They may be useful for the initial burst of connections, but once you're connected they matter very little for sustained throughput.

Trying to exploit a swarm in this manner is... not nice. As DWKnight says, you need each client to appear as a different IP to the swarm to get this benefit. And uT does not offer the ability to WRITE multiple torrent jobs to the same files.


Would this not benefit the entire torrent, though? By contributing 4 more machines to the swarm, yes, I would receive it faster, but once I have the torrent in its entirety, would I not then become a seeder, increasing the swarm and in turn benefiting others trying to download? I apologize if that sounded argumentative, or even defensive, but I'm trying my best to understand how this might work, if it's even possible.


To make it possible it's as DWKnight said: you need multiple Internet IP addresses. You still will not be able to keep a single copy of the data while downloading; however, if it's a multi-file torrent you can split up the files, and when each client finishes its 25% you re-check the directory and it will show 100%.
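Conceptually, that re-check works because every piece on disk is verified against the torrent's SHA-1 piece hashes, no matter which client wrote it. A toy sketch (tiny piece size for illustration; `recheck` is an invented name, not uTorrent's API):

```python
# Sketch: a force re-check hashes the data on disk piece by piece and
# compares against the torrent's stored piece hashes.
import hashlib

PIECE_LEN = 4  # toy size; real torrents use piece sizes of 16 KiB to 4 MiB

def piece_hashes(data):
    """SHA-1 of each fixed-size piece, like the 'pieces' field of a torrent."""
    return [hashlib.sha1(data[i:i + PIECE_LEN]).digest()
            for i in range(0, len(data), PIECE_LEN)]

def recheck(data, expected):
    """Return the completion fraction: share of pieces whose hash matches."""
    actual = piece_hashes(data)
    good = sum(a == e for a, e in zip(actual, expected))
    return good / len(expected)

original = b"abcdefghijklmnop"  # the "payload": 4 pieces of 4 bytes
expected = piece_hashes(original)

print(recheck(original, expected))             # 1.0  -> shows as 100%
print(recheck(b"abcdXXXXijklmnop", expected))  # 0.75 -> one bad piece
```

Since the check only cares about the bytes on disk, pieces contributed by any of the four clients pass identically once they land in the shared directory.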

Apparently Azureus allows this, but due to data integrity concerns it is not possible in uT.


Archived

This topic is now archived and is closed to further replies.
