RSS feed, single update, double download regardless of wait time.


Tatarize

When the RSS feed updates and a single favorites filter matches two newly added files in that particular feed update, it will begin downloading both of them, regardless of whether the time limit between downloads is set to "4 days" (though reason tells me other time settings will be likewise affected).

Filter: FileX *, match only every 4 days.

RSS feed update:

blah1

blah2

<15 minutes passes>

RSS feed update:

FileX 123

FileX 124

blah1

blah2

Both "FileX 123" and "FileX 124" will start downloading. Less than 15 minutes is also less than 4 days.

What version of µTorrent are you using? I can't confirm this on 1.8.5 build 17091.

Say a filter matches, and has a minimum interval of 4 days. If you update the feed, and there is a new torrent that also matches, it doesn't get added here. Even clicking the "?" in the RSS Downloader dialog for the particular filter in question returns "None" for matching releases.

Oh. In that case, I'm not sure it's really a bug. The minimal interval is designed to control how often the filter is applied to the feed as a whole, not the interval between torrents being added, as it's always possible for multiple torrents to match a filter on any one feed update.

If the user wishes to prevent that, then it would make more sense to have an option to only match one item for each time the filter is applied -- but then that begs the question: which one is µTorrent then supposed to choose if there are multiple files?

  • 2 weeks later...

As it stands presently there's absolutely no way to prevent it. If two or three files matching a given RSS filter come through in the same update, all of them will be downloaded. There's no preventing this. If two copies of the same file get added to the RSS feed in rapid succession and an update falls between the two, the system acts completely differently than if the two filter matches land in the same update.

Given a 15 minute update interval, if the two matching files are added within five minutes of each other, there's a 67% chance that both will be downloaded and a 33% chance that only the first one will be downloaded.
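The 67/33 split follows from the 5-minute gap versus the 15-minute update interval: an update lands between the two files with probability 5/15 = 1/3, and otherwise both fall into the same update. A quick simulation (a hypothetical model, assuming the feed's update phase is uniformly random relative to when the files are published) bears this out:

```python
import random

def simulate(trials=100_000, update_interval=15.0, gap=5.0):
    """Estimate how often two matching files land in the same feed update.

    Model: the feed refreshes every `update_interval` minutes with a
    uniformly random phase; the two files are published `gap` minutes apart.
    """
    both = 0
    for _ in range(trials):
        # Time until the next feed update after the first file appears.
        phase = random.uniform(0, update_interval)
        # If no update occurs during the gap, both files fall in the
        # same update and both get downloaded.
        if phase > gap:
            both += 1
    return both / trials

print(simulate())  # ~0.667: both files downloaded two-thirds of the time
```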

Strictly speaking the bug is in the protocol rather than in the program (the program is doing what it's supposed to do). But, that way of doing things is the wrong way of doing things. Why should the frequency of updates and happenstance dictate what is and isn't downloaded? -- It shouldn't. The minimal interval should automatically also prevent multiple matches of the filter in any given feed update.

The minimal interval should control the minimum amount of time between uses of that filter, which requires excluding duplicates within the same feed update.
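The proposed behavior could be sketched roughly like this (a hypothetical model only; `Item`, `apply_filter`, and the parameters are illustrative, not µTorrent's actual internals):

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    timestamp: float  # publication time, seconds since epoch

def apply_filter(pattern_match, items, last_applied, now, min_interval):
    """Apply a favorites filter to one feed update, enforcing the minimal
    interval AND downloading at most the single earliest match.

    pattern_match: predicate testing whether an item's title matches.
    Returns the one item to download, or None.
    """
    if now - last_applied < min_interval:
        return None  # interval not yet elapsed; skip this whole update
    matches = [i for i in items if pattern_match(i.title)]
    if not matches:
        return None
    # Take the earliest match, as if the feed had been refreshed before
    # the duplicates showed up.
    return min(matches, key=lambda i: i.timestamp)
```

With this scheme, the FileX example from the first post would yield only "FileX 123", and the result no longer depends on whether the two files straddle a feed update.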

(I use 1.8.4)

Fair enough, you've convinced me, but...

... then that begs the question: which one is µTorrent then supposed to choose if there are multiple files?

Should it always choose the earliest one (to pretend as if it happened to refresh the feed before the duplicates showed up)? Or should it pick one of the subsequent ones, which may or may not have fixed something without indicating "repack" or something (in which case smart.repack_filter would've caught it). In the (admittedly edge) case that the feed has messed up timestamps (all items share the same timestamp -- I've seen b0rked feeds like that), which one is it supposed to choose?

Yes, it should take the first one time-wise. That's the one you'd get if the two matches had arrived in different RSS feed updates. The point is that the behavior shouldn't differ based on when the feed updates happened to strike.

If you choose by some other criterion then it's possible the order would flip and you'd get the later one, but only when the two matches land in the same update. In that case the behavior would still change based on the timing of the feed update.

Nobody publishes repacks within 15 minutes of each other; the only time I've seen this error is when the files are nearly identical (often different torrents but identical files) and just happen to get published by two different sources.

As for "repack" priority defaulting to the later one, I'm somewhat opposed to the idea of even having an "episode" feature hardcoded at all, so certainly the "repack" stuff seems a bit off. I'd prefer that things like the "smart episode filter" and "episode number" be converted into modifiable regex and soft-coded alternatives and not exist in the program as hardcoded features at all. But, that's a largely unrelated matter.

For the borked feeds, the files are likely in order of the timestamp even without a functional timestamp in the XML itself. One assumes it's not sorted by name or anything silly like that. In those cases the edge condition simply goes to whichever one the implementation happens to see first, and iterating the items in the feed update in what is generally the oldest-to-newest direction (bottom to top) would catch the edge condition.
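Under the assumption that items appear newest-first in the feed document (as RSS feeds conventionally do), that bottom-to-top scan can be sketched in a few lines (the `earliest_match` helper is purely illustrative):

```python
def earliest_match(feed_items, matches):
    """feed_items is in document order (newest first). Scanning from the
    bottom yields the oldest matching item even when timestamps are
    broken or identical, so the tie-break is deterministic."""
    for title in reversed(feed_items):
        if matches(title):
            return title
    return None

# With two duplicate titles sharing a timestamp, the one lower in the
# feed (published first) wins:
print(earliest_match(["FileX 124", "blah1", "FileX 123"],
                     lambda t: t.startswith("FileX")))  # FileX 123
```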

Indeed, regular expressions support in filters is something that has been requested for a while, but one of the main issues with it is that quality regex libraries are huge, and in-house libraries would almost definitely be buggy.

And I agree with you in that I also get an odd feeling from seeing this episode number/quality and repack stuff included, but it was all added for the sake of ease-of-use more than anything. And regex couldn't provide what the smart episode filter or repack filter does :P

And alright, your suggestions make sense. I'll pass it along in case it hasn't already been seen by the devs.

Archived

This topic is now archived and is closed to further replies.
