HandOfCode Posted September 23, 2010

Storing torrents as individual files makes load times get slower and slower, because uTorrent eventually needs hundreds of file accesses (or more) just to start up. Is it possible to store them in a client-side DB like SQLite, or something custom? This would boost load times across the board, especially for users with thousands of torrent files and for users whose anti-virus (AVG, for instance) is aggressive about checking every file access.

Why keep thousands of files? So you know which ones you've already downloaded, because uTorrent warns you when a torrent is already in the list.

All in all this should be an easy feature to add, because instead of reading lots of files you're just reading a different kind of database. A file system already qualifies as a minimal database, so with properly decoupled code the conversion should be straightforward, but I don't know how uTorrent is actually written.
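For illustration, here is a minimal sketch of what a single-file store could look like, written with Python's sqlite3 module. The table and column names (torrents, info_hash, data) are placeholders made up for this sketch, not anything uTorrent actually uses, so treat it as an outline of the idea rather than a concrete proposal for the codebase.

# Sketch: one SQLite database replacing thousands of individual .torrent files.
# All names here are illustrative; nothing below reflects uTorrent internals.
import sqlite3

def open_store(path="torrents.db"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS torrents (
                      info_hash TEXT PRIMARY KEY,  -- hex info-hash, the natural lookup key
                      name      TEXT,              -- display name
                      data      BLOB               -- raw bencoded .torrent contents
                  )""")
    return db

def add_torrent(db, info_hash, name, torrent_bytes):
    # INSERT OR IGNORE gives the "already in the list" check for free:
    # a second insert of the same info-hash simply does nothing.
    db.execute("INSERT OR IGNORE INTO torrents VALUES (?, ?, ?)",
               (info_hash, name, torrent_bytes))
    db.commit()

def load_all(db):
    # One file open and one sequential scan at startup,
    # instead of thousands of individual file accesses.
    return db.execute("SELECT info_hash, name, data FROM torrents").fetchall()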
BuTaMuH Posted October 13, 2010

+1. Now I'm seeding over 11 000 torrents and the start time is... just under a minute.
jthill Posted October 13, 2010

Coincidentally, I was playing with MyDefrag yesterday and came up with the script below, which gathers everything in its subtree. It makes reading lots of files out of that subtree much faster. Maybe it will help here.

Description("Localize (defrag and gather) the entire contents of !ScriptDirectory!. Needs a relatively well-maintained disk, as it will gather the files into available space and not move anything else around to make room.")
Title("Clean up !ScriptDirectory! after install")
#WindowSize(invisible)

VolumeSelect
    Writable(yes) and Fixed(yes)    # Don't see how to make MyDefrag derive the volume from a path
VolumeActions
    FileSelect
        fullpath('!ScriptDirectory!\*','*.MyD')
    FileActions
    FileEnd
    FileSelect
        not(fullpath('!ScriptDirectory!\*','*'))
    FileActions
    FileEnd
    FileSelect
        size(0,40000000)    # anything larger we can just defrag; avoiding even a long seek for a >500ms read doesn't seem worth it
    FileActions
        SortByName(Ascending)
    FileEnd
    FileSelect
        all
    FileActions
        Defragment()
    FileEnd
VolumeEnd
WhenFinished(exit)
moogly Posted October 13, 2010

SQLite could be a solution (if possible), the same way Firefox stores cookies, history, passwords and so on. But huge SQLite databases are problematic too: in Firefox, when the history database (places.sqlite) grows to several dozen MB, the browser really slows down and takes noticeably longer to start. Of course, the advantage is that you merge all the .torrents into one file.
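To put a rough number on "huge", SQLite itself can report how much of a file is live data versus free pages, and compact it with VACUUM. The sketch below assumes the owning application (Firefox, in this example) is closed while it runs; the path and function name are just examples.

# Sketch: measure and compact a SQLite database such as Firefox's places.sqlite.
# Run this only while the application that owns the file is closed.
import sqlite3

def compact(path):
    db = sqlite3.connect(path)
    page_size = db.execute("PRAGMA page_size").fetchone()[0]
    pages     = db.execute("PRAGMA page_count").fetchone()[0]
    free      = db.execute("PRAGMA freelist_count").fetchone()[0]
    print(f"{path}: {pages * page_size / 1e6:.1f} MB total, "
          f"{free * page_size / 1e6:.1f} MB reclaimable")
    db.execute("VACUUM")   # rewrites the file without the free pages
    db.close()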
GTHK Posted October 13, 2010

There's a way to clean up the DB in Firefox. Viewing history gets pretty damn slow if you increase retention to half a year or more; the old way of doing things was better in this regard.
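For what it's worth, that kind of cleanup can be as simple as deleting visits older than a cutoff and then vacuuming. The sketch below assumes the places.sqlite schema has a moz_historyvisits table with visit_date stored in microseconds since the Unix epoch; treat that as an assumption, back the file up first, and keep the browser closed while running it.

# Sketch: prune old browsing history from places.sqlite, then compact it.
# Assumes moz_historyvisits.visit_date is microseconds since the Unix epoch;
# back up the file and close the browser before trying anything like this.
import sqlite3, time

def prune_history(path, keep_days=180):
    cutoff_us = int((time.time() - keep_days * 86400) * 1_000_000)
    db = sqlite3.connect(path)
    db.execute("DELETE FROM moz_historyvisits WHERE visit_date < ?", (cutoff_us,))
    db.commit()
    db.execute("VACUUM")   # reclaim the space freed by the deletes
    db.close()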
Archived
This topic is now archived and is closed to further replies.