[Twisted-Python] Handling too many open file descriptors

exarkun at twistedmatrix.com
Mon Sep 27 10:53:39 EDT 2010


On 02:45 pm, landreville at deadtreepages.com wrote:
>I'm running an application that makes about 1300 snmp connections
>every minute; I'm using utils.getProcessOutput with snmpget because
>pysnmp throws an error when I try to run it. Now of course I get the
>"Too many open files" error, but is the best way to handle this
>increasing the limit on Linux, or implementing some sort of queue so
>that only x snmpget processes run at a time? Is
>there such a queue feature in twisted?

It doesn't seem likely to me that it's useful to have a thousand snmpget 
processes running at once.  But who knows, maybe it is.  To actually 
know, you'll have to decide what your requirements are (how many do you 
actually have to run in parallel to get the throughput you need?  how 
hard are you willing to thrash the machine you're running on?) and then 
measure various approaches and configurations.

If you decide you want to limit the number of snmpget processes you 
launch, then you might find twisted.internet.defer.DeferredSemaphore 
useful (and since the API docs for it are messed up, make sure you find 
the "run" method, not just the "acquire" and "release" methods).

Jean-Paul
