
larger swarms = slower speeds?



heya folks,

what may be the cause of this behavior?

i mean,

i have a torrent with 600+ seeds and 2000+ leechers, and it is averaging ~40 kB/s

another torrent with 20-something seeds and 100+ leechers, and it is maxing out my connection


is there a setting or something?

tiaa. :)


You also get the most download speed when you find your upload speed's "sweet spot".

This is the point where uploading faster only causes download speeds to slow...as does uploading slower.

You also have to work out how many total upload slots to use. Using too few upload slots means there are FEW peers who will give you anything back...because you're only uploading to a few peers.

And using too many upload slots means few peers will give you anything back...because you're uploading very slowly to each peer.
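The slot arithmetic above can be sketched like this. Note the 3 kB/s per-slot floor and the 20-slot cap are my own rule-of-thumb assumptions for illustration, not official uTorrent values:

```python
# Hedged sketch: pick an upload-slot count so each slot still gets a
# useful share of your upload cap. The 3 kB/s floor per slot and the
# 20-slot cap are rule-of-thumb assumptions, not uTorrent's values.

def suggest_upload_slots(upload_cap_kbps, min_per_slot_kbps=3.0, max_slots=20):
    """Return a slot count where per-slot speed stays above the floor."""
    if upload_cap_kbps <= 0:
        return 1
    slots = int(upload_cap_kbps // min_per_slot_kbps)
    return max(1, min(slots, max_slots))

def per_slot_speed(upload_cap_kbps, slots):
    """Average upload speed each unchoked peer sees from you."""
    return upload_cap_kbps / slots

# Example: a 40 kB/s upload cap
slots = suggest_upload_slots(40)        # -> 13 slots
print(slots, per_slot_speed(40, slots)) # each peer sees ~3.1 kB/s
```

The idea is just that below the floor, each peer sees you as too slow to be worth reciprocating with.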



I can reproduce this behaviour (on some popular newly-uploaded torrent/public tracker whose name I won't mention ;)

The tracker reports Seeders: 3045 / Leechers: 5721, and it seems those numbers are not faked by the tracker (DHT status: got ~2500 peers). However, with ~40 seeds connected the download speed only hovers in the ~10-60 kB/s range.

ps. Aha, the speed is gradually ramping up. It seems it is indeed not easy to find a fast uploader in such a huge swarm.

pps. Oops. This might also be related - I see that a huge number of peers in this swarm are users of BAD ISPs ("limits BitTorrent bandwidth: Yes" or even "prevents seeding: Yes").


thank you guys very much for your replies. :)

i wonder if there is a download speed "logic" in utorrent - something like utorrent constantly striving to achieve a maximum sustained top speed, which the user would set, and it keeps trying every seed/peer available until it reaches that specified speed, etc. etc. ...if you get my drift
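There is real logic somewhat like that in the BitTorrent protocol itself: the "choking" algorithm. Here is a toy Python model of the general idea (a simplified sketch, not uTorrent's actual code; the slot count and peer names are made up): every round a client unchokes the few peers who have been sending it data fastest, plus one random "optimistic" unchoke to probe for faster peers.

```python
# Toy model of BitTorrent-style choking (NOT uTorrent's real code):
# roughly every 10 s, unchoke the peers who uploaded to you fastest,
# plus one random "optimistic" unchoke to discover faster peers.
import random

def choose_unchoked(peer_rates, regular_slots=3, rng=random):
    """peer_rates: {peer_id: observed download rate from that peer, kB/s}.
    Returns the set of peers to unchoke this round."""
    by_rate = sorted(peer_rates, key=peer_rates.get, reverse=True)
    unchoked = set(by_rate[:regular_slots])
    remaining = [p for p in peer_rates if p not in unchoked]
    if remaining:
        unchoked.add(rng.choice(remaining))  # optimistic unchoke
    return unchoked

rates = {"A": 50.0, "B": 5.0, "C": 30.0, "D": 0.0, "E": 12.0}
print(choose_unchoked(rates))  # always A, C, E, plus one of B or D
```

The optimistic slot is what lets a new peer in a huge swarm eventually get "discovered", which fits the gradual ramp-up described earlier in the thread.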


You connect to more seeds, who care nothing about what you're uploading...or if you're even uploading at all!

...and more seeds will upload to you.

End result, more connections CAN increase your download speed.

But there are only so many connections PER torrent, and many of those peers are FIREWALLED, so you can't connect to them and they may not bother trying to connect to you!

Simple solution:

More torrents = more connections.

HOWEVER...that doesn't scale indefinitely, and the disaster it would cause if "everyone" tried it would be like everyone trying to make a phone call at exactly the same instant. BitTorrent won't just fail alone...quite likely the vast majority of the internet will too! It'll fragment into "island networks", just like the minor Gnutella disaster back around 2002.

The internet backbones are scary fast; they can handle the traffic. The "last mile" has improved immensely too for many ISPs - they now use fiber optic even for cable lines and switch to copper only at the poles near people's houses. And I've heard the fiber they were using was at least dual-line 100 megabit/sec as of 6+ years ago. Now they typically use 1, maybe 10 gigabit/sec fiber lines just to feed one group of cable modems in a small area of a city. No offense, but it's a lie to say that shared last-mile bandwidth from the ISP to people's houses is a major bottleneck for the best cable providers! But there ARE major bottlenecks out there, near and far.

A lot of people here have found out the HARD way that even Win XP SP2's normal 'hard' limit of 10 half-open connections at once is more than sufficient to hose their computers, routers, or modems. But what if more people with good computers, good routers, and good modems tried running with 500+ connections at once and the half-open limit (the rate at which new connections are attempted) set at 100+? They would all be hammering the marginal peers and seeds on the torrents with their connection attempts, and those computers and/or connections would crash.

I saw something like that back around 2001 when I tried to share about 6 super-popular 100-300 MB files on Gnutella. After about a day, things slowed down. Not long after that, I quit sharing the files, turned off the Gnutella program I was using, disabled port-forwarding on my router, and reset my router+computer+modem. I still had problems, so I shut my internet connection off for A DAY. When I got back on, I still had the same internet IP. 3 weeks later, after not sharing those files nor running Gnutella at all, the predominant IPs hitting my router were still seeking to download those 6 files from me. The "noise" was still so great that my connection was only at about 60-80% sustainable capacity for anything else. That was a network "working as designed"...but perhaps not as intended! I nicknamed the denial-of-service attack that resulted from such overwhelming demand: "Plague of the Angry Locust Swarm".
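For what it's worth, the half-open limit mentioned above just caps how many connection attempts can be pending at once; further attempts queue up rather than fail. A minimal Python simulation of that idea, using a semaphore in place of real TCP handshakes (the 10 is XP SP2's default cap; the attempt count and delay are illustrative):

```python
# Hedged sketch of a half-open connection cap: only LIMIT connection
# attempts may be "in flight" at once; everything else waits its turn.
# Simulates attempts with a semaphore and sleep instead of real sockets.
import threading
import time

HALF_OPEN_LIMIT = 10           # XP SP2's default cap mentioned above
half_open = threading.Semaphore(HALF_OPEN_LIMIT)
connected = []
lock = threading.Lock()

def attempt_connect(peer_id):
    with half_open:            # blocks while 10 attempts are pending
        time.sleep(0.01)       # stand-in for the TCP handshake delay
        with lock:
            connected.append(peer_id)

threads = [threading.Thread(target=attempt_connect, args=(i,)) for i in range(50)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(connected))  # 50: all complete, just rate-limited
```

The point is that a sane cap only slows the attempt rate; cranking it to 100+ is what turns a swarm of clients into a connection flood.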

If anything, BitTorrent would be even more "efficient" than that...especially if lots of people used crazy settings. Instead of only scaling to maybe 100 IPs trying to get the file every second, it might be over one million. If that happens on a massive scale, it won't be just BitTorrent trackers that go down. And no raindrop ever thinks it's responsible for the flood.

When a major portion of the Eastern Seaboard internet backbone was knocked out by a fire (I think it was in a Boston tunnel), the "backlog" of packets that normally would've gone through there was re-routed automatically across lesser backbones that couldn't handle the strain. Lines overloaded, network switches crashed...so more packets got re-routed. Domino effect, just like big power outages.

It'll be "fun" to see if BitTorrent can at least temporarily knock out a major portion of the internet.

Are you ready to contribute your part by running lots of torrents at once with 100's of connections each? :lol:



This topic is now archived and is closed to further replies.
