[Reality] speaking/listening interfaces

Chris Armstrong reality@twistedmatrix.com
Sat, 13 Sep 2003 00:01:30 -0400


On Fri, Sep 05, 2003 at 03:09:19PM -0700, Eric Wong wrote:
> Is the mailing list back?

Now it is! :-)

> I'm attaching two files, speaking.py and listening.py.  Do these do the
> right thing to create interfaces/adapters?  speaking.py defines a basic
> interace for talking or whispering in a room, listening.py defines an
> interface for listening to things said.

Why is Listening an Action?

> The latest thing that I find a little confusing is using the variable
> 'actor' to mean the adapter to the interface.  It probably makes perfect
> sense in the component model, but it seems a little ambiguous when
> talking about mud/mush/moo/virtual worlds.

Well, "actor" is whatever implements I*Actor, AIUI.
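
For what it's worth, a hand-rolled sketch of that relationship -- this is
not the real twisted.python.components API, and ISpeaking, Player, and
SpeakingAdapter are made-up names -- but it shows why "actor" names the
adapter rather than the underlying object:

```python
# Hypothetical illustration of the interface/adapter split; none of
# these names come from speaking.py or listening.py.

class ISpeaking:
    """Marker interface: things that can talk in a room."""

class Player:
    """A plain world object with no speaking behaviour of its own."""
    def __init__(self, name):
        self.name = name

class SpeakingAdapter:
    """Wraps a Player to provide ISpeaking; this wrapper is the 'actor'."""
    def __init__(self, original):
        self.original = original

    def say(self, text):
        return "%s says: %s" % (self.original.name, text)

actor = SpeakingAdapter(Player("bob"))
print(actor.say("hello"))  # -> bob says: hello
```

So in action-handler code, 'actor' is the ISpeaking-ish wrapper around the
player, not the player object itself.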

> How does the unit testing work?  I looked over test_reality.py and I'm
> not sure how to use it (actually run the unit tests) or add new tests.

"trial reality.test_reality". To add a new test, just define a new testFoo
method on one of the existing TestCase subclasses, or create a new
TestCase subclass with its own testFoo methods.
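
Trial's TestCase follows the stdlib unittest shape, so a minimal sketch of
a new test looks like this (SpeakingTest and its assertion are made up,
not the actual contents of test_reality.py):

```python
# Sketch of adding a test case; plain unittest is used here for
# illustration, since trial's TestCase mirrors it.  Method names
# starting with "test" are picked up automatically.
import unittest

class SpeakingTest(unittest.TestCase):
    def testEcho(self):
        # stand-in for a real check against the speaking interface
        self.assertEqual("hello".upper(), "HELLO")
```

Drop something shaped like that into test_reality.py and trial will find
and run it along with the existing tests.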



-- 
                                Chris Armstrong
                         << radix@twistedmatrix.com >>
                http://twistedmatrix.com/users/radix.twistd/