
uTorrent rapes HDD with 4-byte reads / 4k writes


boomman


When you force uTorrent to create a partfile, it reads in the index of the partfile in brain-dead 4-byte chunks. Those 4-byte chunks are indexes into the partfile.
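Roughly, here is the difference between that access pattern and a buffered one (an illustrative sketch only; the index layout and function names are assumptions, not uTorrent's actual partfile format or code):

    // Illustrative sketch -- the partfile index layout here is assumed,
    // not uTorrent's real on-disk format.
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Pattern described above: one seek plus one 4-byte read per index entry.
    std::vector<uint32_t> read_index_naive(FILE* f, size_t entries) {
        std::vector<uint32_t> index(entries);
        for (size_t i = 0; i < entries; ++i) {
            fseek(f, static_cast<long>(i * sizeof(uint32_t)), SEEK_SET);
            fread(&index[i], sizeof(uint32_t), 1, f);  // tiny 4-byte read
        }
        return index;
    }

    // Buffered pattern: read the whole index in one call, then use it from memory.
    std::vector<uint32_t> read_index_buffered(FILE* f, size_t entries) {
        std::vector<uint32_t> index(entries);
        fseek(f, 0, SEEK_SET);
        fread(index.data(), sizeof(uint32_t), entries, f);  // one bulk read
        return index;
    }

A bulk read like the second function collapses tens of thousands of tiny I/O requests into a single one.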

There have been many other things we wanted to fix in the disk system, so we haven't gotten to this rather old problem yet. We'll fix this in 3.3, and may back-port it to 3.2.

Thank you.


Turn off diskio.use_partfile, or don't skip files.

Thanks, the first workaround helped. However, files that I download now take three times as much space as before, due to preallocation of partially downloaded files.

Please let me know whether you acknowledge that this is a bug that should be fixed. It shouldn't be hard; just apply a cache to those operations.


  • 2 months later...
There have been many other things we wanted to fix in the disk system, so we haven't gotten to this rather old problem yet. We'll fix this in 3.3, and may back-port it to 3.2.

Thanks for the updated status :)

I haven't seen any other reports about the issue either (they used to be fairly widespread).

I'll make sure to send you a copy of my test report ... I think those 100 people running this beta are just getting tired of reporting issues when the ETA for a possible fix is far, far away ... Issues usually do not tend to magically disappear ... :P


  • 4 months later...
  • 2 weeks later...
Honestly, I haven't been able to repro this in over a month with any recent build of 3.2. I haven't seen any other reports about the issue either (they used to be fairly widespread).

I'm getting this an awful lot on 3.2 build 27886

With 2 torrents going at once, there doesn't seem to be an issue. If I force-start a 3rd torrent, within approximately 1 minute uTorrent begins sending 4 MB/s (while only receiving about 1 MB/s). I don't know what is in the data or where it is destined for, but it doesn't look like it's written to my NAS, where my torrents are saved. As my WiFi connection's bandwidth (130 Mbps 802.11n) is now getting hammered, reading/writing to my NAS is delayed, causing a 'Disk Overload'.

I can stop the Force Started torrent, but the 4 MB/s continues until I exit the program. uTorrent remains resident in memory for about 2 minutes until it finally closes. The instant it closes, the 4 MB/s disappears.

If you guys need more info or need to recreate it, let me know. I just wanted to let you know this is still an issue.

EDIT: The 4 MB/s is reported through Microsoft's Wireless Network Connection status. uTorrent only says it is sending 10 KB/s. I have been able to recreate this the last 5 times I've tried, so I know the 4 MB/s is uTorrent-related.


  • 4 weeks later...
As far as I can tell, this is no longer a problem on 3.2.

Looks like it is still a problem. I'm currently using uTorrent 3.2.2.28595 (but the same behaviour existed in all previous versions) and here is uTorrent's file activity shown by Process Monitor:

http://i47.tinypic.com/29z76di.jpg

As you can see, one torrent's part file is being read in tiny 4-byte pieces. This part file is about 50 MB, the torrent's total size is about 400 GB, of which only 24 MB have been downloaded, the torrent's piece size is 16 MB, and the total number of torrents being seeded simultaneously is 700+. I re-downloaded the torrent to test for the possibility that the part file format is different in uTorrent 3.2.2, but the problem didn't go away (although the part file's contents changed, with the size remaining the same).
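A rough back-of-the-envelope calculation based on these numbers (assuming one 4-byte index entry per piece, which is an assumption about the partfile format, not something stated in this thread):

    // Back-of-the-envelope arithmetic for the torrent described above.
    // Assumption: one 4-byte index entry per 16 MB piece.
    #include <cstdio>

    int main() {
        const long long torrent_size = 400LL * 1024 * 1024 * 1024;  // ~400 GB
        const long long piece_size   = 16LL * 1024 * 1024;          // 16 MB pieces
        const long long pieces       = torrent_size / piece_size;   // 25,600 pieces
        const long long index_bytes  = pieces * 4;                  // ~100 KB of index
        printf("%lld pieces -> %lld separate 4-byte reads per pass (~%lld KB of index)\n",
               pieces, pieces, index_bytes / 1024);
        return 0;
    }

Even though the index itself would only be on the order of 100 KB, reading it 4 bytes at a time means tens of thousands of tiny reads per pass for a single torrent, which can add up quickly when many of the 700+ seeded torrents have large part files.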

This kind of behaviour is not constant, but appears periodically when torrents with big part files are being seeded. It wastes CPU power and interferes with other uTorrent activity on slow computers, which may produce a wavelike (fluctuating, undulating) upload pattern:

http://i48.tinypic.com/2pr8itd.png

Since part files may reach gigabytes in size on really huge torrents, this issue is becoming quite annoying, so I think it must be addressed in the next release.


Archived

This topic is now archived and is closed to further replies.
