uTorrent 3.3 stable (29126) Disk overloaded 100% [Win xp]


Malediciton

Wondering if everyone is still having this disk overloaded issue with build 29677?

I had this problem way back, a year or more ago; I set my disk cache to 1800 and never saw it again (never had a problem on my system with that setting). I also run uTorrent very aggressively on my machine: 999,999 connections allowed, connect speed at 100, half open connections at 100. I do not allow file preallocation because of other issues that it causes in uTorrent. It really taxes my router's CPU (WNDR4500 with DD-WRT and some custom connection settings), running around 85%, but I download nearly 10 MB/s and upload 1.5 MB/s with thousands of connections on a single torrent, even with dozens and dozens of torrents seeding/downloading. My system never becomes slow.

My downloads folder is on a gigabit network drive (WD MyBookLive 3TB), which is nothing special but seems to keep up. My system is a DIY quad-core Phenom at 3.6 GHz, 16 GB RAM (no page file), 240 GB SSD (I don't download to it), running Windows 7 Pro x64. uTorrent seems to use around 500-800 MB of my system memory and 10-15% CPU.

Recently I reinstalled Windows and neglected to set uTorrent's cache back to 1800 like I had before, and instantly, on the very first download, it hit disk overloaded. I changed the size back to 1800 and it's gone; I haven't seen it since (a few months now). Has the issue been fixed or even identified yet? I'm wondering why my system doesn't seem to have the problem you all describe, as aggressive as my settings are.
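For what it's worth, some rough back-of-the-envelope math with the numbers above shows why a bigger manual cache hides the warning; the disk write speed here is an assumed typical figure, not a measurement of the MyBookLive:

```python
# Rough back-of-the-envelope math using the figures quoted above.
# The disk write speed is an assumed typical value, not a measurement.

cache_mb = 1800          # manual cache size set in Preferences > Advanced > Disk Cache
download_mb_s = 10       # sustained download rate described above
disk_write_mb_s = 80     # assumed sustained write speed of the network drive

# If the disk (or the network share) briefly stalls, the cache alone can
# absorb roughly this many seconds of incoming data before it fills:
print(cache_mb / download_mb_s)          # 180.0 seconds of headroom

# As long as the disk's average write rate exceeds the download rate,
# the cache drains again and the "Disk overloaded" warning never shows:
print(disk_write_mb_s > download_mb_s)   # True
```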

Since I was still constantly having this problem, I decided to go the other direction and turn my write cache back on, set to the maximum. This has eliminated the disk overloaded issue, but there is still a bug with flushing writes to disk, and this change has shed some light on it.

Since the cache now absorbs the extra writes that would have had to wait to write to disk, I don't get disk overloaded anymore, but I do get periods where the disk itself is not overloaded at all and yet the cache keeps filling up instead of flushing.

Sometimes, it will right itself and everything will end up fine. But other times, the disk is not utilized at all and the cache just fills and fills. Usually, but not always, the corked jobs number will keep rising at this point, and actual disk writes will go to 0.

The cache never actually does get completely filled though. It has passed 1 GB but at some point the entire application becomes unresponsive and needs to be killed.

All of the data that was downloaded but never written to disk is typically lost and needs to be downloaded again.
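To make the failure mode concrete, here is a tiny toy model of what I'm describing: data keeps landing in the write cache while the flusher writes nothing, so cache usage and the corked-job count just climb until the client looks hung. This is only an illustration of the symptom, not how uTorrent's cache is actually implemented:

```python
# Toy model of the stall described above: pieces keep arriving in the write
# cache while flushing to disk has stopped, so cache usage and the number of
# blocked ("corked") write jobs climb until the client appears hung.
# Purely illustrative; this is not uTorrent's actual cache code.

cache_used_mb = 0
corked_jobs = 0
download_mb_s = 10      # incoming data rate
flush_mb_s = 0          # observed: actual disk writes drop to zero

for second in range(120):
    cache_used_mb += download_mb_s            # new pieces land in the cache
    flushed = min(cache_used_mb, flush_mb_s)  # nothing leaves while flushing is stalled
    cache_used_mb -= flushed
    if flushed < download_mb_s:
        corked_jobs += 1                      # writes pile up behind the stalled flush

print(cache_used_mb, corked_jobs)             # 1200 MB buffered, 120 stuck jobs
```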

In at least one instance where the cache started filling, it did not crap out completely; it just wrote data to the disk VERY slowly. The odd thing is that in Performance Monitor I could see both System and the uTorrent process logging writes to the disk at around the same rate (a rate fast enough that the cache should have drained quickly, yet it didn't).

All of the problems noted in this thread have been observed on different disks in my system (I have 3 non-system spinning disks of various sizes, and an SSD for the system; I've tried downloading to all 3).

Also of note, these problems are far more likely to happen when downloading from multiple torrents simultaneously, and more likely still when the torrents in question have multiple files (though multiple torrents, even with a single file each, are more likely to trigger it in my experience). It has definitely happened on a single active download with a single file (I've come back to the machine after RSS picked up a single-file torrent and found it stuck).

Hope this helps. It would be great to hear from an actual developer; I don't see any indication that this is being looked at in the dev builds.

Why is it stupidly set?

Simple

MORE DOESN'T MEAN BETTER.

You are splitting your connection a potential of almost a million different ways, not to mention the fact that most (read: almost all) residential internet connection hardware starts choking down in the 4000-10000 connection range.

Not to mention the fact that if you're actually downloading from a lot of sources at a time, your hard drive will have to keep up with a massive number of writes all over the drive, pushing more towards the random seek times of your hard drive, where disk overloading happens more frequently.

In short, your own settings likely cause your problems DIRECTLY.
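To put a number on it, here is an idealized split using the figures from this thread; real clients don't divide bandwidth evenly, so treat it as an order-of-magnitude illustration only:

```python
# Rough arithmetic behind "more doesn't mean better". Figures are the ones
# quoted in this thread; the even split is an idealization, so treat this
# as an order-of-magnitude illustration only.

download_kib_s = 10 * 1024   # ~10 MB/s line rate quoted earlier, in KiB/s
active_peers = 4000          # low end of where home routers start to choke

per_peer_kib_s = download_kib_s / active_peers
print(round(per_peer_kib_s, 2))   # 2.56 KiB/s per peer: each connection is a trickle

# Each slow peer also tends to supply pieces from different parts of the
# file(s), so more simultaneous peers means more scattered, seek-heavy writes.
```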

Why is it stupidly set?

Simple

MORE DOESN'T MEAN BETTER.

You are splitting your connection a potential of almost a million different ways, not to mention the fact that most (read: almost all) residential internet connection hardware starts choking down in the 4000-10000 connection range.

This is a pretty big generalization. The residential ISP hardware doesn't know or care how many connections there are unless it does NAT or firewalling, and that depends heavily on who the provider is. Typically, I've seen the ISP provide a (separate) router which can be terrible (FiOS/Actiontec) but can also be replaced.

BUT he already said he uses a DD-WRT with a custom firmware.

Not to mention the fact that if you're actually downloading from a lot of sources at a time, your hard drive will have to keep up with a massive number of writes all over the drive, pushing more towards the random seek times of your hard drive, where disk overloading happens more frequently.

In short, your own settings likely cause your problems DIRECTLY.

Then you talk about disk issues. Well, first, if his connection was choking, this wouldn't be an issue.

Second, you're talking about downloads of 10 MB/s on a 3 TB drive. A drive with platters that size will typically have write speeds upwards of 100 MB/s sequential, and random access of 10 MB/s is unlikely to be a problem. It's not even a system disk.
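A quick sanity check on those numbers; the drive figure below is a typical spec-sheet value, not a benchmark of this particular MyBookLive:

```python
# Quick sanity check of the claim above. The sequential write speed is an
# assumed spec-sheet-typical value for a 3 TB drive, not a benchmark of
# the poster's MyBookLive.

download_mb_s = 10
drive_seq_write_mb_s = 120    # assumed sequential write speed

print(drive_seq_write_mb_s / download_mb_s)   # 12.0x headroom at sequential rates

# Heavily random small writes erode that margin, which is exactly what the
# client's write cache is there to smooth out by batching piece writes.
```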

Maybe you should apply a little more logic and thought before rushing to tell someone how stupid they are.

Maybe you should apply a little more logic and thought...

I think my proposed settings (sig/below) came after a little thought... You are welcome to read the guide/tips, adjust per your connection, and try them out.

BUT he already said he uses a DD-WRT with a custom firmware.

DD-WRT is in the 10k range, NOT THE 1 MILLION RANGE THAT HE STUPIDLY CHOSE.

Then you talk about disk issues. Well, first, if his connection was choking, this wouldn't be an issue.

Except that if he's actually downloading from a high number of peers, it's going to increase the randomness of the disk access, which increases the strain on the drive.

A drive with platters that size will typically have write speeds upwards of 100 MB/s sequential, and random access of 10 MB/s is unlikely to be a problem.

Benchmarks or GTFO.

Maybe you should apply a little more logic and thought before rushing to tell someone how stupid they are.

Maybe you should before you claim that I'm not. I've been doing this support for over 5 years. Chances are, if they've set their connection limits to the million range, they have absolutely no clue as to the consequences of it.

Archived

This topic is now archived and is closed to further replies.

