[Twisted-Python] conceptually, why a deferred can't be used more than once?
jeandaniel.browne at gmail.com
Thu Jul 8 17:46:41 EDT 2010
Concerning the concept of the deferred: why is it more useful to go
with a deferred which gets consumed and can only be fired once?
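To make the question concrete, here is a minimal sketch of the one-shot semantics being asked about, in plain Python rather than Twisted itself (the class name `OneShot` is hypothetical; Twisted's real Deferred raises `AlreadyCalledError` on a second fire and does much more, e.g. chaining and errbacks):

```python
class OneShot:
    """Hypothetical sketch of one-shot deferred semantics."""

    def __init__(self):
        self.callbacks = []
        self.called = False
        self.result = None

    def add_callback(self, fn):
        if self.called:
            fn(self.result)  # late callbacks run immediately with the stored result
        else:
            self.callbacks.append(fn)

    def callback(self, result):
        if self.called:
            # Twisted raises defer.AlreadyCalledError here
            raise RuntimeError("already called")
        self.called = True
        self.result = result
        for fn in self.callbacks:
            fn(result)

d = OneShot()
seen = []
d.add_callback(seen.append)
d.callback(42)        # fires once; delivers 42
try:
    d.callback(43)    # a second fire is an error, not a second delivery
except RuntimeError as e:
    seen.append(str(e))
print(seen)
```

Because the result is stored, a callback added *after* the fire still gets the value, which is exactly what a one-shot "promise of a single result" buys you.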
In my small script I realize I need to take special care that the
deferred has not already been used, and that I must explicitly
recreate a deferred for each network request. In a parallel world,
someone might have come up with a deferred concept which happily fires
its callbacks as many times as there is data coming back from the
server. Is that a dumb idea?
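That "parallel world" variant is essentially an event/observer pattern rather than a promise of a single result; a rough sketch (the `MultiFire` class is invented for illustration, not a Twisted API):

```python
class MultiFire:
    """Hypothetical multi-fire variant: no stored result, no 'already
    called' state, just listeners invoked on every fire."""

    def __init__(self):
        self.listeners = []

    def add_listener(self, fn):
        self.listeners.append(fn)

    def fire(self, data):
        for fn in self.listeners:
            fn(data)

ev = MultiFire()
got = []
ev.add_listener(got.append)
for chunk in ("a", "b", "c"):  # e.g. successive packets from the server
    ev.fire(chunk)
print(got)  # three separate deliveries, not one result
```

Note that a listener added after a fire misses earlier data, since there is no single result to store; in Twisted, this repeated-delivery case is handled by protocols (`dataReceived`) rather than by Deferreds.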
Could the deferred design be part of the solution to the network
problem of two requests passing each other, where each end is not yet
aware that the other has just sent a request? Buggy network nodes
would expect a response but get a request instead and go crazy...
Thanks for your help,