
Disk Overloaded 100% - 10 kb/sec download



I think this is a problem with how the .DAT file is written for large torrents in which you have selected only some of the files.

I have opened UT with ~3-5 torrents going and a download speed of about 1.1 MB/sec. I then added a 100+ GB torrent and selected only 12 GB of files inside it. As soon as I did this, the speed died. From what I can see, it's because of the .DAT file.

In the following picture you can see the file was created on 2/20 at 1:18 PM. It is currently 2:02 PM and the file is 10 MB in size:


Looking at Task Manager, I see nothing out of the ordinary in the way of memory or CPU usage:


If I look at disk performance, I see the culprit:


Nothing else is touching the disk. UT is only downloading at 10 kB/sec. The creation of this file is what is killing UT; as soon as it finishes being created, UT resumes normal operation.

The question is: why does this file take so long to create? It's not using any CPU or bandwidth. Why is it writing at 5 MB/sec when it's been doing this for approximately 40 minutes and the file is only 10 MB in size?
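One way those numbers can be reconciled (a sketch only, not µTorrent's actual implementation): sustained write traffic does not have to grow a file. If a client keeps seeking back and rewriting the same fixed regions of a small file, "Disk Write Bytes/sec" stays high while the file size barely moves. The block size and slot count below are made-up illustration values:

```python
import os
import tempfile

# Illustration: rewriting the same regions of a file produces lots of
# write traffic (as a disk monitor would report) while the file itself
# stays small -- similar to seeing ~5 MB/sec of writes for ~40 minutes
# against a .DAT file that only reaches ~10-19 MB.

BLOCK = 16 * 1024   # 16 KiB per write (hypothetical)
SLOTS = 64          # pretend the file holds 64 fixed-offset slots

fd, path = tempfile.mkstemp(suffix=".dat")
total_written = 0
with os.fdopen(fd, "wb") as part:
    for _round in range(10):            # each round overwrites every slot
        for slot in range(SLOTS):
            part.seek(slot * BLOCK)     # slots live at fixed offsets
            part.write(b"\x00" * BLOCK)
            total_written += BLOCK

size = os.path.getsize(path)
os.remove(path)
print(f"bytes written: {total_written}, file size: {size}")
```

Here ten times as many bytes are written as the file ever holds; the file size is capped at `SLOTS * BLOCK` no matter how long the rewriting continues.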

I invite you to run the same test. Find a way to monitor your disk usage and you will see that it's the .DAT file UT creates that causes the hard-drive strain and, ultimately, the Disk Overloaded status.


Here's the last bit of detail:

uTorrent: 3.1.2 Build 26753

Original torrent was 287.98 GB in size

The torrent had 3003 files

I selected 12.3 GB of data from the original torrent

The .DAT File took from 1:18 pm to 2:22 pm to fully create

The final size of the .DAT file was 19,327 KB

Upon completion of this file my speed rose back to 1.1 MB/sec.

This test is easily repeatable. What is the problem with writing these files?
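For context on why a .DAT file exists at all when only some files are selected: BitTorrent pieces ignore file boundaries, so a piece can straddle a selected file and a deselected one, and the slice that lands in the deselected file still has to be stored somewhere for hash checking and seeding. The sketch below computes that spill for a hypothetical file layout (the piece size and file sizes are assumptions, not the real torrent's):

```python
# Sketch (hypothetical layout): count the bytes of "wanted" pieces that
# fall inside deselected files. Those slices are what a part file like
# the .DAT would have to hold.

PIECE = 4 * 1024 * 1024  # assumed 4 MiB piece size

# (name, length in bytes, selected?) -- files laid end to end
files = [
    ("skip_a.bin", 10_000_000, False),
    ("want.bin",   12_300_000, True),
    ("skip_b.bin", 50_000_000, False),
]

def partfile_bytes(files, piece):
    """Bytes of wanted pieces that fall inside deselected files."""
    offset, ranges = 0, []
    for _name, length, sel in files:
        ranges.append((offset, offset + length, sel))
        offset += length
    total = offset
    spill = 0
    for p in range(0, total, piece):
        p_end = min(p + piece, total)
        # a piece is wanted if it overlaps any selected file
        wanted = any(sel and p < e and p_end > s for s, e, sel in ranges)
        if not wanted:
            continue
        # slices of this wanted piece inside deselected files
        for s, e, sel in ranges:
            if not sel:
                spill += max(0, min(p_end, e) - max(p, s))
    return spill

print(partfile_bytes(files, PIECE))
```

Only the two boundary pieces contribute here, so the spill is a few MB even though 60 GB-scale files were deselected. That is consistent with a 19 MB .DAT for a 288 GB torrent, though it does not explain why writing it should take an hour.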


We're looking into this, but more than likely a fix won't happen until the next release, since the code is involved and pretty messy. We're rewriting disk I/O anyway for the next version.

Just turn off diskio.use_partfile in the meantime.


I'm glad to hear you guys are working on it. For the most part, users are left with just questions and dying threads; the net is littered with them, and plenty of wannabe experts make things worse by giving poor advice. What we needed is exactly what you just stated: "we're looking into this." I'd like to see a big "sticky" addressing this issue in all caps and bold. It would help the endless lost souls looking for the "fix" (which isn't there yet).

Appreciate your honesty and response to this issue... will try the "diskio" band-aid for now.





This topic is now archived and is closed to further replies.
