Rewriting an Input/EventSensor code


Rewriting an Input/EventSensor code

Igor Stasenko
Hello lists,

I'm cross-posting because there is interest from both camps in getting
this done :)

Michael published a changeset with new event-sensor code for Pharo,
mainly focused on supporting multiple event consumers (frameworks).
Recently, Andreas and I discussed it and had the idea of integrating
this with the Announcements framework (Announcements would then become
a core package).

I like this idea very much, but I think we should discuss the
implementation details before starting.

Here is my vision of how things should look:

- role 1. Event source. The VM is an event source; in most cases it is
the only one, but not always. I would like to be able to plug in a
different event source: imagine a remote socket connection, or
previously recorded events being replayed. So an event source should
be a separate, abstract entity, with the VM being one concrete kind of
it. There is also sometimes a need to inject fabricated events into
the event queue, to simulate user input or to produce synthetic
events.

- role 2. Event listener/event sensor (or InputSensor). This is a
mediator between an event source and a higher-level framework
(Morphic, etc.); its role is to listen for events from the event
source and dispatch them to the concrete consumer(s), if any.

- role 3. Events. Events should be full-featured objects with a good
protocol. A high-level framework should not have to deal with raw
data, as it currently does with the raw event buffers coming from the
VM. This means the changes will affect Morphic: Morphic has classes
for user input events (keyboard/mouse) and deciphers raw VM events
into its own representation. I think we should move the 'deciphering'
part into EventSource (sub)classes and make sure that EventSensor (and
its consumers) deal with full-featured event objects, leaving event
consumers free to decide what to do with them.

- role 4. Event consumers. Note that there can be multiple different
consumers, not just one, as Morphic currently is. We should make sure
that integration with any other framework is painless.

- be ready to support multiple host windows. This part is quite simple
to do in EventSensor, but not so simple in Morphic. One approach would
be to refactor all code that uses the Sensor global directly and
replace it with the appropriate thing(s). But this is out of scope for
the event-handling framework; at the initial stage we could keep
things compatible with the old ways (1 Sensor, 1 Display, 1 Morphic
World).
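
To make the four roles concrete, here is a rough sketch (all class and
message names below are hypothetical, just to show the shape I have in
mind):

        "role 1: concrete sources behind one abstract entity
        (names are illustrative only)"
        EventSource subclass: #VMEventSource.          "decodes raw VM buffers"
        EventSource subclass: #ReplayEventSource.      "replays recorded events"

        "role 2: the sensor listens to a source and dispatches to consumers"
        EventSensor>>dispatchEvent: aKernelEvent
                "Hand a decoded event (role 3) to every registered consumer (role 4)"
                consumers do: [:each | each handleEvent: aKernelEvent]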


I started prototyping classes for events, where each event kind has
its own subclass which directly decodes the raw VM event buffer, so
that event consumers don't have to poke at raw events. But since there
was a good idea to use Announcements for this, it may need some
refactoring.

Michael, Andreas, I'd like to hear your comments and remarks; anyone
else's are welcome as well.

--
Best regards,
Igor Stasenko AKA sig.

_______________________________________________
Pharo-project mailing list
[hidden email]
http://lists.gforge.inria.fr/cgi-bin/mailman/listinfo/pharo-project

Re: [squeak-dev] Re: Rewriting an Input/EventSensor code

Igor Stasenko
2009/3/21 Andreas Raab <[hidden email]>:

> Igor Stasenko wrote:
>>
>> Recently, Andreas and I discussed it and had the idea of integrating
>> this with the Announcements framework (Announcements would then become
>> a core package).
>
> Actually, Jecel proposed the combination.
>
>> Here is my vision of how things should look:
>
> [... big snip ...]
>
>> Michael, Andreas, I'd like to hear your comments and remarks; anyone
>> else's are welcome as well.
>
> It's too much to do all that in one go around. What I would propose is to
> start simple by having an event source which maps from raw event buffers to
> some kind of (non-morphic) event objects and have InputSensor be a client of
> that. I believe that is a straightforward extension of the work that has
> already been done.
>

Wait, I'm proposing nearly the same: have an event source which
produces (non-Morphic) event objects, and an InputSensor.
I just want to know where Announcements take part in this, or should
we postpone that for a next step?

> Cheers,
>  - Andreas
>

--
Best regards,
Igor Stasenko AKA sig.


Re: [squeak-dev] Re: Rewriting an Input/EventSensor code

Michael Rueger-6
Igor Stasenko wrote:

> Wait, I'm proposing nearly the same: have an event source which
> produces (non-Morphic) event objects, and an InputSensor.
> I just want to know where Announcements take part in this, or should
> we postpone that for a next step?

What I did while exploring an alternative UI framework was to use the
rewrite and add an Announcer as a second listener. "Interested
parties" could then subscribe to event announcements. The raw input
events are first converted to first-class event objects before being
submitted to the announcer.
As discussed earlier, this allows for having a completely separate UI
running without any overlap with Morphic. Tweak always had the problem
of still being tied into the Morphic event processing; the combination
of the sensor rewrite and announcers avoids this.
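
So an "interested party" would only need something along these lines
to get, say, keyboard events (the names here are placeholders for
illustration, not the actual ones from my code; the subscription
message is the standard Announcements protocol):

        "Subscribe to keyboard announcements on the sensor's announcer"
        sensor announcer
                on: KeyboardEvent
                do: [:evt | Transcript showln: 'key event: ', evt printString]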

I meant to make this stuff available a long time ago, partly in an
effort to avoid duplicating work with Alain's Miro framework, but I
kept getting distracted by other things.
I will put it a bit higher on my list :-)

Michael


Re: [squeak-dev] Re: Rewriting an Input/EventSensor code

Igor Stasenko
2009/3/21 Michael Rueger <[hidden email]>:

> Igor Stasenko wrote:
>
>> Wait, I'm proposing nearly the same: have an event source which
>> produces (non-Morphic) event objects, and an InputSensor.
>> I just want to know where Announcements take part in this, or should
>> we postpone that for a next step?
>
> What I did while exploring an alternative UI framework was to use the
> rewrite and add an Announcer as a second listener. "Interested
> parties" could then subscribe to event announcements. The raw input
> events are first converted to first-class event objects before being
> submitted to the announcer.

Right, but here we're talking about doing such conversion much
earlier (at the event source object), so the event sensor already
deals with first-class event objects.
I want to know if the scheme I described in my first post is plausible.

> As discussed earlier, this allows for having a completely separate UI
> running without any overlap with Morphic. Tweak always had the problem
> of still being tied into the Morphic event processing; the combination
> of the sensor rewrite and announcers avoids this.
>
Right, that's why we need a separate set of classes (I called them
KernelXXXEvent) which represent the events coming from the VM and are
not tied to Morphic.

> I meant to make this stuff available a long time ago, partly in an
> effort to avoid duplicating work with Alain's Miro framework, but I
> kept getting distracted by other things.
> I will put it a bit higher on my list :-)
>
Let me know if you need some help. At the least, I can send you a
prototype implementation of the KernelXXXEvent classes.
I also started writing it, but then other things drew my attention :)

> Michael
>
>



--
Best regards,
Igor Stasenko AKA sig.


Re: [squeak-dev] Re: Rewriting an Input/EventSensor code

johnmci

On 21-Mar-09, at 3:09 AM, Igor Stasenko wrote:

> Right, but here we're talking about doing such conversion much
> earlier (at the event source object), so the event sensor already
> deals with first-class event objects.
> I want to know if the scheme I described in my first post is
> plausible.

For the iPhone VM I return a complex event type, which then points to
Smalltalk objects which are the representation of the touch events.
For location and acceleration data I return the actual Objective-C
objects. This data is then processed by EventSensor.

If you choose to push the responsibility for creating event objects to
the VM, then you need to be cognizant of the fact that whatever is
proposed has to change very little over time; otherwise you end up
with the issue of image versus VM compatibility, and the fact that VM
version changes proceed at a slow rate.


--
===========================================================================
John M. McIntosh <[hidden email]>
Corporate Smalltalk Consulting Ltd.  http://www.smalltalkconsulting.com
===========================================================================





Re: [squeak-dev] Re: Rewriting an Input/EventSensor code

Igor Stasenko
2009/3/21 John M McIntosh <[hidden email]>:

>
> On 21-Mar-09, at 3:09 AM, Igor Stasenko wrote:
>
>> Right, but here we're talking about doing such conversion much
>> earlier (at the event source object), so the event sensor already
>> deals with first-class event objects.
>> I want to know if the scheme I described in my first post is
>> plausible.
>
> For the iPhone VM I return a complex event type, which then points to
> Smalltalk objects which are the representation of the touch events.
> For location and acceleration data I return the actual Objective-C
> objects. This data is then processed by EventSensor.
>
> If you choose to push the responsibility for creating event objects to
> the VM, then you need to be cognizant of the fact that whatever is
> proposed has to change very little over time; otherwise you end up
> with the issue of image versus VM compatibility, and the fact that VM
> version changes proceed at a slow rate.
>
Nope. I don't want the VM to deal with real event objects.
The VM will still use the old event buffers to deliver events to the image.
But once the image receives an event, it should convert it to an event
object as close to the source as possible.
This is the role of the EventSource class: to represent the VM as an
event source which produces instances of the KernelXXXEvent classes,
hiding the details of converting raw event buffers from the higher
layers that then handle the event (EventSensor/Morphic etc.).
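
In other words, the fetch loop on the image side would look roughly
like this (primGetNextEvent: is the existing primitive call; the
VMEventSource side is just my proposed naming, a sketch):

        VMEventSource>>nextEvent
                "Fetch one raw event buffer from the VM and decode it right at the
                source, so higher layers never see the raw array"
                | buffer |
                buffer := Array new: 8.
                self primGetNextEvent: buffer.
                (buffer at: 1) = 0 ifTrue: [^ nil].     "no event pending"
                ^ KernelEvent fromBuffer: buffer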


>
> --
> ===========================================================================
> John M. McIntosh <[hidden email]>
> Corporate Smalltalk Consulting Ltd.  http://www.smalltalkconsulting.com
> ===========================================================================
>
>
>
>
>



--
Best regards,
Igor Stasenko AKA sig.


Re: [squeak-dev] Re: Rewriting an Input/EventSensor code

Igor Stasenko
2009/3/21 Igor Stasenko <[hidden email]>:

> 2009/3/21 John M McIntosh <[hidden email]>:
>>
>> On 21-Mar-09, at 3:09 AM, Igor Stasenko wrote:
>>
>>> Right, but here we're talking about doing such conversion much
>>> earlier (at the event source object), so the event sensor already
>>> deals with first-class event objects.
>>> I want to know if the scheme I described in my first post is
>>> plausible.
>>
>> For the iPhone VM I return a complex event type, which then points to
>> Smalltalk objects which are the representation of the touch events.
>> For location and acceleration data I return the actual Objective-C
>> objects. This data is then processed by EventSensor.
>>
>> If you choose to push the responsibility for creating event objects to
>> the VM, then you need to be cognizant of the fact that whatever is
>> proposed has to change very little over time; otherwise you end up
>> with the issue of image versus VM compatibility, and the fact that VM
>> version changes proceed at a slow rate.
>>
> Nope. I don't want the VM to deal with real event objects.
> The VM will still use the old event buffers to deliver events to the image.
> But once the image receives an event, it should convert it to an event
> object as close to the source as possible.
> This is the role of the EventSource class: to represent the VM as an
> event source which produces instances of the KernelXXXEvent classes,
> hiding the details of converting raw event buffers from the higher
> layers that then handle the event (EventSensor/Morphic etc.).
>

To give an example of what I'm talking about, here are bits of the
prototype implementation:

KernelEvent class>>initialize
        "Initialize the array of event types.
        Note: the order of the array elements matters; it must match the
        event type codes returned by the VM in the event buffer"
        EventTypes := {
                KernelMouseEvent.
                KernelKeyboardEvent.
                KernelDragDropEvent.
                KernelMenuEvent.
                KernelWindowEvent.
                }
-----

KernelEvent class>>fromBuffer: eventBuffer
        "Decode a raw VM event into an instance of a KernelEvent subclass"

        | type |
        type := EventTypes
                at: (eventBuffer at: 1)
                ifAbsent: [ ^ KernelUnknownEvent new from: eventBuffer ].
        ^ type new from: eventBuffer

-----

KernelEvent>>from: buffer
        "Initialize an event instance from raw event buffer.
        Note, all subclasses should call super to initialize fields correctly"

        eventType := buffer at: 1.
        timeStamp := buffer at: 2.
        timeStamp = 0 ifTrue: [timeStamp := Time millisecondClockValue].
        windowIndex := buffer at: 8.
-----

KernelMouseEvent>>from: buffer

        super from: buffer.

        position := Point x: (buffer at: 3) y: (buffer at: 4).
        buttons := buffer at: 5.
        modifiers := buffer at: 6.

As you can see, there is nothing complicated. It simply frees the
underlying event-handling layers from deciphering event buffers
themselves; instead, they deal with first-class event objects with a
harmonized protocol.
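
For instance, a consumer could then do something like this (the buffer
contents here are fabricated, just to illustrate; the field offsets
follow the code above):

        | evt |
        "type=1 (mouse), timestamp, x, y, buttons, modifiers, reserved, windowIndex"
        evt := KernelEvent fromBuffer: #(1 0 100 200 4 0 0 1).
        evt position.   "100@200 - no raw buffer poking needed"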

>
> --
> Best regards,
> Igor Stasenko AKA sig.
>



--
Best regards,
Igor Stasenko AKA sig.
