[Twisted-Python] running 1,000,000 tasks, 40 at-a-time

Phil Mayers p.mayers at imperial.ac.uk
Wed Oct 26 11:23:09 EDT 2011


On 26/10/11 16:08, Jason Rennie wrote:
> On Wed, Oct 26, 2011 at 10:22 AM, Terry Jones <terry at jon.es
> <mailto:terry at jon.es>> wrote:
>
>     Sounds like you should be using a Python generator and that you're not.
>
>
> The issue I'm seeing is that the ~million DeferredSemaphore.run() calls
> are inefficient; it's independent of list/generator.
>
>     First off, have you read http://jcalderone.livejournal.com/24285.html ?
>     If not, do. Take note of the line
>
>     work = (callable(elem, *args, **named) for elem in iterable)
>
>     work is a generator. Make sure you understand every line of that
>     code :-)
>
>
> I don't see anything in task.Cooperator to limit the # of
> simultaneously-running tasks. Am I missing something? I guess,

Yes, you're missing something AIUI.

In JP's example at the given URL, he basically does two things:

  1. Creates a *single* generator (using a generator expression, but 
a function def would work just as well) that yields each task's 
Deferred in turn (not all at once).

  2. Passes the generator to the task.Cooperator "coiterate" method, N 
times. Each call to "coiterate" sets up a chain of events that calls 
.next() on the shared generator, waits for the resulting Deferred to 
fire (callback or errback), and then repeats.

So N is the concurrency limit.

> technically, could write my own scheduler that limits the # of
> simultaneously-running tasks. But, then task.Cooperator isn't really
> doing anything useful for me.

See above.
