[Pharo-project] Rethinking the event's handling in Gaucho/Morphic
fernando.olivero at usi.ch
Mon Jan 23 12:37:58 CET 2012
On Mon, Jan 23, 2012 at 11:21 AM, Stéphane Ducasse
<stephane.ducasse at inria.fr> wrote:
> On Jan 23, 2012, at 1:18 AM, Fernando Olivero wrote:
>> Hi, in the context of the new UI framework I'm developing for Gaucho,
>> I've decided to design the event handling part as follows:
>> The entry point to the framework is an instance of GUserInterface,
>> which collaborates with a set of peripherals: in the standard case it
>> holds 1) a GDisplay, 2) a GMouse, and 3) a GKeyboard.
> Now I have more time to answer.
> Indeed, HandMorph looks to me like it has too many responsibilities:
> - transform low level event into high level
> - manage the Morphic interaction loops
> - looks like a cursor
YES! This design favors sharing the above responsibilities
between the user interface and its peripherals.
>> The GUserInterface listens to VM events, and processes them as they arrive.
>> GUserInterface>>process: aSensorEvent dispatches to either the mouse
>> or the keyboard.
> In what we are putting in place we use double dispatch, because
> we want to be able to plug in new events.
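Right, a double dispatch along these lines would keep the processing open to new event kinds. This is only a sketch: the GSensor* event class names and #dispatchTo: are hypothetical, not the actual code.

```smalltalk
"The event itself picks the peripheral, so plugging in a new event class
only means adding one dispatchTo: method, without touching GUserInterface."
GUserInterface >> process: aSensorEvent
	aSensorEvent dispatchTo: self

GSensorMouseEvent >> dispatchTo: aUserInterface
	aUserInterface mouse process: self

GSensorKeyboardEvent >> dispatchTo: aUserInterface
	aUserInterface keyboard process: self
```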
>> GKeyboard>>process: distinguishes between keyDown, keyUp, and
>> keystrokes, and reacts accordingly.
>> GMouse>>process: finds the proper reaction depending on the context
>> of the mouse.
>> The keyboard handling logic is quite simple.
>> The mouse is significantly more complicated, because mouse events
>> produce different actions depending on the current "mode" of the
>> mouse. I modeled this problem using the State pattern, depicted by
>> the attached FSM.
> interesting. Indeed it looks to me that using a state pattern should be a nice way to
> handle the Morphic logic
> Now I do not understand the arrows to the big Mouse.
When a MouseState reacts to a given event, such as MouseDragging
reacting to #up: aDisplayPoint, it sends messages to the Mouse, such
as #drop:at:, #clicked:at:, and #secondaryclicked:at:.
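Concretely, such a transition could look something like the following sketch (the #grabbedMorph accessor, #changeState:, and the MouseHovering class are hypothetical names, just to illustrate the idea):

```smalltalk
"MouseDragging reacts to a button release by telling the Mouse to drop
what it is carrying, then switches the Mouse to an idle state."
MouseDragging >> up: aDisplayPoint
	mouse drop: mouse grabbedMorph at: aDisplayPoint.
	mouse changeState: MouseHovering new
```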
> I like the idea that you dispatch to the peripherals. Now, what we encounter
> with Igor is that the HandMorph needs to keep the previous system event and modifiers,
> so does the distribution to Mouse and Keyboard support that?
Yes, the last mouse event needs to be stored, to be able to decode
whether the sensor input event is a mouse up or a mouse down.
But this is stored by the Mouse. Surely we should improve how the VM
encodes the OS event; this shouldn't be needed!
The keyboard doesn't need such a mechanism.
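A sketch of that decoding, with hypothetical names (#anyButtonPressed, the state and lastEvent variables) just to show the intent:

```smalltalk
"The Mouse remembers the previous sensor event, because the VM only gives
us the current buttons state: up/down must be decoded as a transition."
GMouse >> process: aSensorEvent
	| wasPressed isPressed |
	wasPressed := lastEvent notNil and: [ lastEvent anyButtonPressed ].
	isPressed := aSensorEvent anyButtonPressed.
	(wasPressed and: [ isPressed not ])
		ifTrue: [ state up: aSensorEvent position ].
	(isPressed and: [ wasPressed not ])
		ifTrue: [ state down: aSensorEvent position ].
	lastEvent := aSensorEvent
```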
> Fernando, if you have a look at the EventModel,
> we are asking ourselves the following questions:
> - HostMenu management does not look the way we want (right now it hijacks low-level events and produces events to simulate them);
> we are thinking that the HandMorph (or the User Interface in your case) should be responsible for deciding to publish a menu
> as a host menu, and a host menu should be able to call the same code as the default menu.
> - We want to look at the API of InputEventSensor (using event buffers) and SystemInputEventAnnouncer (using events!)
> and check their various users and fix them, because we do not want the system to rely on InputEventSensor and its eventBuf.
I've looked at the EventModel; it's quite a nice design.
I will adapt the design proposed above to it. The only difference is
that GUserInterface, GMouse and GKeyboard will process a
SystemInputEvent instead of a raw event buffer encoded in an Array,
which will ease and standardize the dispatching amongst the peripherals.