Re: Touch/MultiTouch Events


Re: Touch/MultiTouch Events

KenDickey
 
Greetings all,

I am cross posting to vm-dev because of wide potential interest, e.g.
for Squeak/Cuis/Whatever.. on smartphones and touchscreen tablets.

I would like to have a solution across X11, vm-display-fbdev, MacOS,
Windose, et al.

I think we need to approach from both ends of the problem: user gesture
recognition and hardware driver/library availability.  What the user
sees, and what the VM sees and delivers.

My thought is to start by looking at what convergence in thinking and
APIs has already been done.

A useful set of user gestures is captured in
   https://static.lukew.com/TouchGestureGuide.pdf

One description of basic touch event properties is
   https://developer.mozilla.org/en-US/docs/Web/API/Touch

For low level vm event input on Linux, I am looking at libinput, which
is used by Wayland (the successor to X11, but libinput is also usable
within X11):
   https://wayland.freedesktop.org/libinput/doc/latest/index.html

How gestures are recognized on Android:
   https://developer.android.com/training/gestures

Gestures recognized using libinput:
   https://github.com/bulletmark/libinput-gestures

I am just getting oriented, but if we can largely agree on gesture usage
in the UI and on what gesture events the VM delivers, I suspect the
implementation convergence details will get worked out as we
experiment/prototype.

My tummy is now full and I need to digest and think about this..
-KenD


On 2020-09-19 00:16, Beckmann, Tom wrote:

> Hi all,
>
> thank you Ken for bringing this up.
>
> I'll go ahead and share my thoughts thus far. Maybe you could check
> with the respective APIs you use/know of, if this protocol appears
> like something that would be compatible.
>
> VM Side
> -----------
>
> In [1] is the definition of a VM-side sqTouchEvent. When compared to
> for example the Javascript/Browser API, we would not be able to
> represent the touch area fields radiusX,radiusY,rotationAngle,force
> [2] with this definition.
>
> I noticed that there is a sqComplexEvent [3] that appears to have been
> used for touch events on the iPhone. While the constant for
> EventTypeComplex is defined in my image, I see no code handling this
> type of event. I'd be very curious to learn how the objectPointer was
> handled on the image-side. This may also be an option for us to
> support more properties such as the touch area.
>
> Looking at the properties provided by the iPhone API [4], I would
> prefer to derive some of those on the image-side (e.g.
> phase=stationary or tapCount). The current implementation in [5] seems
> to also bundle all active touch points in each event (not quite sure
> about this since it also assigns a single phase as the event type?);
> I'd be more in favor of identifying ongoing touch sequences of one
> finger via an identifier. On XInput2, the sequence field is a simple
> integer that increments each time a finger is touching the screen
> anew.
> One more consideration for the info provided by the VM: the iPhone API
> also provides info on the device type [6], which I think could be an
> interesting addition, allowing us to react appropriately to stylus
> input. This may also require us to then not only provide radius
> information but also tilt angle of the pen.
>
> The properties I would currently see of interest to us:
> 1. event type (EventTypeTouch)
> 2. timestamp
> 3. x coordinate
> 4. y coordinate
> 5. phase/touch type (begin,update,end,cancel)
> 6. sequence (identifier for continuous events for the same finger
> stroke)
> 7. windowIndex (for host window plugin)
> 8. radiusX
> 9. radiusY
> 10. rotationAngle ("Returns the angle (in degrees) that the ellipse
> described by radiusX and radiusY must be rotated, clockwise, to most
> accurately cover the area of contact between the user and the
> surface." [2])
> 11. force
> 12. tiltX
> 13. tiltY
> 14. deviceType (touch,pen)
>
> It could be considered to make the interpretation of fields 8 and 9
> depend on the deviceType and thus merge the radius and tilt fields.
> In practice, field 6 would likely turn into an objectPointer as for
> the ComplexEvent and bundle the fields >= 8.
>
> Image Side
> ---------------
> I have not invested much thought on the image-side handling just yet.
> The suggestion of mapping to multiple hands sounds sensible. I would
> assume we would still call our touch events mouse events such that
> existing handler code keeps working on a touch-only device? The touch
> event classes could then also simply extend the existing mouse event
> classes.
>
> An alternative could be to check for the implementation of
> touchBegin:/Move:/End: methods on the receiving Morph, similar to
> `Morph>>#wantsStep`, but I would prefer not to. I think treating
> touch and mouse as synonymous for most purposes avoids a lot of
> confusion for the user. I might be wrong though :)
>
> In terms of what would break with this implementation: I have on
> various occasions written event handling code that remembers the
> lastX/Y position of a mouseMove: event to for example paint a stroke.
> This would no longer work with multiple hands sending interleaved
> events to the same Morph. I suppose relying on MouseMoveEvent's
> startPoint and endPoint could be a better pattern here. It will also
> be interesting to see how our current keyboard focus system will be
> able to cope.
>
> Looking forward to reading your thoughts! If you feel like this is
> appropriate, please also include the squeak-dev list in your reply.
>
> Best,
> Tom
>
> (please excuse that I linked to my fork each time, the only changes to
> upstream are in the X11 event plugin and sq.h)
> [1]
> https://github.com/tom95/opensmalltalk-vm/blob/xi-experiment/platforms/Cross/vm/sq.h#L489
> [2] https://developer.mozilla.org/en-US/docs/Web/API/Touch
> [3]
> https://github.com/tom95/opensmalltalk-vm/blob/xi-experiment/platforms/Cross/vm/sq.h#L568
> [4] https://developer.apple.com/documentation/uikit/uitouch/phase
> [5]
> https://github.com/tom95/opensmalltalk-vm/blob/xi-experiment/platforms/iOS/vm/iPhone/Classes/sqSqueakIPhoneApplication+events.m#L145
> [6] https://developer.apple.com/documentation/uikit/uitouch/touchtype
> ________________________________________
> From: [hidden email] <[hidden email]>
> Sent: Monday, September 14, 2020 5:15:17 PM
> To: Beckmann, Tom
> Cc: Tonyg; Eliot Miranda; [hidden email]
> Subject: Touch/MultiTouch Events
>
> Tom,
>
> I noticed your recent post on "cellphone responds to touchscreen".
>
> Tony Garnock-Jones has gotten vm-display-fbdev up on postmarketOS
> (Alpine Linux) and I was wondering about getting touch event gestures
> working using libevdev.
>
> Early days, but perhaps we can share some thoughts about
> InputSensor/EventSensor and gesture strategy?
>
> -KenD

Re: Touch/MultiTouch Events

Phil B
 
It would be appreciated if we didn't shoehorn touches the way some of the mouse events are (i.e. the scroll wheel).  At a low level, it would be nice to be able to access any and all touch event data (position, radius, pressure for each touch... if available), and to treat touches as touches rather than pseudo-mouse events.  For example, when you lift your finger(s) from the screen, there is no valid current mouse position as far as touch is concerned.  And with in-screen fingerprint readers becoming more common, consider a day when you'll know which finger (and possibly which user) is registering which touch.

At a higher level, a single touch could be synthesized into a click/select or move event... but for some applications you don't want touches treated that way.  I'm basically just asking that we don't 'cook' the touch events too much in the VM: pass a (somewhat abstracted, so it can be platform neutral) touch event along and let the image decide how to process it.

On Sat, Sep 19, 2020 at 10:44 AM <[hidden email]> wrote:
 
[snips]

Re: Touch/MultiTouch Events

timrowledge
 
Faintly on-topic and possibly interesting -

pi-top are selling a ~12" 10-point HD touch screen for Raspberry Pis. Seems like quite a nice way to do your research into what a touch UI for Squeak might be.

https://www.pi-top.com/products/display-keyboard

tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
Strange OpCodes: PO: Punch Operator



Re: Touch/MultiTouch Events

Robert Withers-2
 
12”? That’s auto infotainment center size! It would be totally wicked to see Squeak in cars. In German I would say: vollständig geil (roughly, "totally awesome")

And there would be no AppStore restrictions on reflection! There must be a rootkit.

I have been reading this multitouch thread and it keeps reminding me of Supertouch, by Bad Brains!

https://youtu.be/6ch4i-WxnOI

Kindly,
Robert


On Fri, Sep 25, 2020 at 14:45, tim Rowledge <[hidden email]> wrote:

[snips]





Re: Touch/MultiTouch Events

Robert Withers-2
 
Sorry about that stale album cut. Here’s Supertouch live...


Kindly,
Robert
. ..  ...   ‘...^,^

On Fri, Sep 25, 2020 at 15:08, Robert <[hidden email]> wrote:
[snips]







Re: Touch/MultiTouch Events

Eliot Miranda-2
In reply to this post by timrowledge
 
Hi Tim,

On Fri, Sep 25, 2020 at 11:45 AM tim Rowledge <[hidden email]> wrote:
 
Faintly on-topic and possibly interesting -

pi-top are selling a ~12" 10-point HD touch screen for Raspberry Pis. Seems like quite a nice way to do your research into what a touch UI for Squeak might be.

https://www.pi-top.com/products/display-keyboard

Search on eBay and you'll find a large number of much cheaper unpackaged displays in all sorts of sizes.  e.g.



tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
Strange OpCodes: PO: Punch Operator




--
_,,,^..^,,,_
best, Eliot

Re: Touch/MultiTouch Events

timrowledge
 


> On 2020-09-25, at 7:34 PM, Eliot Miranda <[hidden email]> wrote:
>
> Hi Tim,
>
> On Fri, Sep 25, 2020 at 11:45 AM tim Rowledge <[hidden email]> wrote:
>  
> Faintly on-topic and possibly interesting -
>
> pi-top are selling a ~12" 10-point HD touch screen for Raspberry Pis. Seems like quite a nice way to do your research into what a touch UI for Squeak might be.
>
> https://www.pi-top.com/products/display-keyboard
>
> Search on eBay and you'll find a large number of much cheaper unpackaged displays in all sorts of sizes.  e.g.

Undoubtedly - but will they work with a Pi without hours of fiddling and having to make assorted wiring, supports and so on? Not being an electronics guy (beyond sometimes being able to work out what resistor is needed to make an LED work, etc.) I truly can't answer that.

Is the price difference either way justifiable? If I were doing it as a for-pay project I'd say definitely not: $150 is barely an hour of work.

For hobby? Maybe the fiddle is half the fun; I've done dafter things just for the heck of it. Ask about my R/C transmitter case project some day... now there's a long story of spending money to do something just in order to do it. Buy some CFRP! Make a big CNC router to cut it! 3D printers to make brackets! GRP moulds for case parts! Break the damn circuit board! Short out the replacement board! AaaaaaRgh!


tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
"How many Teela Browns does it take to change a lightbulb?”
"Stupid question."




Re: Touch/MultiTouch Events

KenDickey
In reply to this post by KenDickey
 
Hi Tim,

Your info on pi-top touch display came after I had placed an order for a
PinePhone.

Note that Pine64 makes a PineTab for $200 with touch screen and
keyboard:
   https://www.pine64.org/pinetab/

I have no experience with this, just noted the advert while looking at
PinePhone specs.

I am just playing with KDE-Plasma on Wayland on RPi4 Manjaro, looking to
see the API and how touch events get handled.

My current thought, assuming we can get raw events and do gesture
recognition in Smalltalk, is to put up an invisible morph on the screen
when a touch event starts and use submorphs to show touch and give
feedback (Chromebook Plus does this with circles for mouse & touch if
you ask for it -- very handy).

Anyway, back to playtime!

Good on ya,
-KenD

Re: Touch/MultiTouch Events

timrowledge
 


> On 2020-09-26, at 8:18 AM, [hidden email] wrote:
>
> Your info on pi-top touch display came after I had placed an order for a PinePhone.
>
> Note that Pine64 makes a PineTab for $200 with touch screen and keyboard:
>  https://www.pine64.org/pinetab/
>

Interesting, though disappointing to have a 720p display, especially when their PineBook Pro is the same price with an HD display etc.

> I have no experience with this, just noted the advert while looking at PinePhone specs.

That's also an interesting device; with only 2GB of RAM I suspect that only something as compact as Squeak will really run on it. Bit of a difference from an ancient IBM project I nearly got hired for, where the worry was whether an entire 1MB of RAM was big enough for a Smalltalk.


tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
Foolproof operation:  All parameters are hard coded.



Re: Touch/MultiTouch Events

Phil B
 


On Sat, Sep 26, 2020 at 1:29 PM tim Rowledge <[hidden email]> wrote:
 


> On 2020-09-26, at 8:18 AM, [hidden email] wrote:
>
> Your info on pi-top touch display came after I had placed an order for a PinePhone.
>
> Note that Pine64 makes a PineTab for $200 with touch screen and keyboard:
https://www.pine64.org/pinetab/
>

Interesting, though disappointing to have a 720p display, especially when their PineBook Pro is the same price with an HD display etc.

The lower resolution is a reasonable choice in terms of pairing with the A64, which is a fairly anemic SoC.  The power envelope and price are why they went with it.  On the laptop, they can get rid of the cell modem (and associated certification costs), some sensors, etc., which gives them more money to put towards the SoC and display.  It wouldn't surprise me if in a couple of years they come out with a PinePhone Pro which costs $300-400, to give them more room to play with higher end hardware.
 

> I have no experience with this, just noted the advert while looking at PinePhone specs.

That's also an interesting device; with only 2GB of RAM I suspect that only something as compact as Squeak will really run on it. Bit of a difference from an ancient IBM project I nearly got hired for, where the worry was whether an entire 1MB of RAM was big enough for a Smalltalk.

2GB is plenty of RAM for a handful of running applications... just not a web browser with 50 open tabs.  The majority of Debian packages (that don't have full desktop GL or GL ES 3 requirements) run on it including Gimp etc. Not terribly well, but they do run.

Where Squeak is going to have problems is that it's only able to take advantage of a single core @ ~1.1GHz, plus the lack of JIT support.  That's going to limit you far more than the RAM.

One other not so surprising issue with Pine64 devices is the not great flash memory performance: ~80MB/s with eMMC, ~20MB/s with microSD.  While it's not as bad as older Raspberry Pi devices (thanks to the eMMC), it's in the same class of performance.  You'll feel this when loading things much more than you're likely to notice the 2GB of RAM.
 


tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
Foolproof operation:  All parameters are hard coded.



Re: Touch/MultiTouch Events

KenDickey
In reply to this post by timrowledge
 
On 2020-09-26 10:29, tim Rowledge wrote:
..
> That's also an interesting device; with only 2GB ram I suspect that
> only sometihng as compact as Squeak will really run on it. Bit of a
> difference from an ancient IBM project I nearly got hired for where
> the worry was whether an entire 1MB ram was big enough for a
> Smalltalk.

My first computer was an Ithaca InterSystems S-100 with 64K of RAM and
one 8" floppy.  Remember when floppy disk media really was floppy?  [You
can see their ad in the Byte Smalltalk-80 issue]

When I got an Amiga with 1MB of RAM, that was infinite.  I believe it
booted with a realtime kernel and window system from 256K EPROM.  I had
WYSIWYG editing, music and animation tools.  I ported a Scheme compiler
(compiled to 68k native).  20MB hard disk.

Going from floppy to 10 or 20MB hard drives changed the way we worked!

The Apple Newton, my first ARM device, had a 1/2 MB of static ram (but
4MB of EPROM!).

Remember Dr. Dobbs: "running lite without overbite"?

Hey, one foot in front of another..
-KenD





Re: Touch/MultiTouch Events

Stephan Eggermont-3
In reply to this post by Eliot Miranda-2
 
There are large differences in touch displays. You’d probably want a capacitive one with 10 finger detection. 

Stephan

Sent from my iPhone

On 26 Sep 2020 at 04:35, Eliot Miranda <[hidden email]> wrote:


[snips]

Re: Touch/MultiTouch Events

timrowledge
 


> On 2020-09-26, at 1:34 PM, Stephan Eggermont <[hidden email]> wrote:
>
> There are large differences in touch displays. You’d probably want a capacitive one with 10 finger detection.

True; apparently the pi-top one claims that.


tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
Strange OpCodes: RSC: Rewind System Clock



Re: Touch/MultiTouch Events

timrowledge
In reply to this post by Phil B
 


> On 2020-09-26, at 11:04 AM, Phil B <[hidden email]> wrote:
> [snips]
>
> 2GB is plenty of RAM for a handful of running applications... just not a web browser with 50 open tabs.  The majority of Debian packages (that don't have full desktop GL or GL ES 3 requirements) run on it including Gimp etc. Not terribly well, but they do run.

I've been running most of my work on assorted Pis with 1GB or less for ... however long it is that Pis have been around, so yes, I understand that. And the browser thing is just another indication of how utterly the WWW infrastructure has been screwed up. We'd be a lot better off if Squeak were the in-browser language.

>
> Where Squeak is going to have problems is that it's only able to take advantage of a single core @ ~1.1GHz and the lack of JIT support.  


Err, what lack of JIT support would that be? ARM32 has been Cog'd for a long time now (coming up on 6 years, I think) and ARM64 is working well, though it is not totally finished in some parts of the FFI stuff, IIRC.

My Pi4 (running 32 bit Raspbian) benchmarks at around 35% of my 4.3GHz/i7 iMac. I'm not expecting a huge increase when I flip to AARCH64 but there will likely be some.

And multi-core? Well yes, I suppose. Except that one can very easily spawn multiple running images using Dave Lewis' OSProcess package, including in ways that do some work and return the results. The spawning takes very little time; for example on said Pi4 it takes 39 ms to do
    UnixProcess forkHeadlessSqueakAndDoThenQuit: [UnixProcess helloWorld]
I suspect we could do very useful things with that.

>
> One other not so surprising issue with Pine64 devices is the not great flash memory performance: ~80MB/s with eMMC, ~20MB/s with microSD.  While it's not as bad as older Raspberry Pi devices (thanks to the eMMC), it's in the same class of performance.  You'll feel this when loading things much more than you're likely to notice the 2GB of RAM.

On my Pi4 again, loading a fresh Squeak trunk image takes ~1-2 sec.


tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
Useful random insult:- His mind wandered and never came back.



Re: Touch/MultiTouch Events

Phil B
 
Tim,

On Sat, Sep 26, 2020 at 6:20 PM tim Rowledge <[hidden email]> wrote:
 


> On 2020-09-26, at 11:04 AM, Phil B <[hidden email]> wrote:
> [snips]
>
> 2GB is plenty of RAM for a handful of running applications... just not a web browser with 50 open tabs.  The majority of Debian packages (that don't have full desktop GL or GL ES 3 requirements) run on it including Gimp etc. Not terribly well, but they do run.

I've been running most of my work on assorted Pis with 1GB or less for ... however long it is that Pis have been around, so yes, I understand that. And the browser thing is just another indication of how utterly the WWW infrastructure has been screwed up. We'd be a lot better off if Squeak were the in-browser language.

I was just using that as an example of the kind of desktop task one wouldn't want to try on a low-powered device even though the software is mostly technically there.  Yes, I know the RPi is similarly mostly there (to a degree).  The difference is that on other devices where we're running things like Mobian, they use the actual Debian (or whatever base distro you're using) ARM repos which have advantages in terms of package selection.
 

>
> Where Squeak is going to have problems is that it's only able to take advantage of a single core @ ~1.1GHz and the lack of JIT support. 


Err, what lack of jit support would that be? ARM32 has been Cog'd for a long time now (coming up on 6 years I think) and ARM64 is working well, though is not totally finished in some part of the FFI stuff IIRC.

Most (all?) current Pinephone distros run ARM64. Given the previously mentioned anemic storage performance and already taxed SoC, throwing multi-arch into the mix to run ARM32 code (especially just for Squeak) wouldn't be a good experience.  Isn't the current state of Cog on ARM64 DIY with major caveats such as FFI? (which doesn't help me as I'm very dependent on FFI)


My Pi4 (running 32 bit Raspbian) benchmarks at around 35% of my 4.3GHz/i7 iMac. I'm not expecting a huge increase when I flip to AARCH64 but there will likely be some.

Your Pi 4 relative performance isn't relevant for current Linux mobile devices.  The power and thermal envelopes of the Pi 4 don't fit that use case.  Realistically, most mobile SoC's that can be used on a remotely open device today are going to have significantly less performance.  The A64 is at the lower end of the range and it is what it is.  The motivation for going to AARCH64 is primarily to avoid multi-arch rather than any significant performance boost inherent in 64-bit.
 

And multi-core? Well yes, I suppose. Except that one can very easily spawn multiple running images using Dave Lewis' OSProcess package, including in ways that do some work and return the results. The spawning takes very little time; for example on said Pi4 it takes 39 ms to do
    UnixProcess forkHeadlessSqueakAndDoThenQuit: [UnixProcess helloWorld]
I suspect we could do very useful things with that.

OSProcess helps (to a degree) only where you have coarse-grained parallelism needs and/or latency isn't an issue.


>
> One other not so surprising issue with Pine64 devices is the not great flash memory performance: ~80MB/s with eMMC, ~20MB/s with microSD.  While it's not as bad as older Raspberry Pi devices (thanks to the eMMC), it's in the same class of performance.  You'll feel this when loading things much more than you're likely to notice the 2GB of RAM.

On my Pi4 again, loading a fresh Squeak trunk image takes ~1-2 sec.

Again, apples and oranges and not terribly apropos when running on currently available, reasonably open, mobile devices.  Sure, eventually devices based on bargain basement sub-10nm SoCs will appear.  But in the meantime (i.e. next several years), we have what we have.
 


tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
Useful random insult:- His mind wandered and never came back.



Thanks,
Phil 

Re: Touch/MultiTouch Events

David T. Lewis
In reply to this post by timrowledge
 
On Sat, Sep 26, 2020 at 03:20:10PM -0700, tim Rowledge wrote:
>
> And multi-core? Well yes, I suppose. Except that one can very easily spawn multiple running images using Dave Lewis' OSProcess package, including in ways that do some work and return the results. The spawning takes very little time; for example, on said Pi4 it takes 39 ms to do
>     UnixProcess forkHeadlessSqueakAndDoThenQuit: [UnixProcess helloWorld]

<OT>
Or the somewhat more interesting example:

   RemoteTask do: [3 + 4] ==> 7

which completes on the order of 10 ms on my PC, and hopefully not too
much worse on Pi4. The [3 + 4] block is evaluated in a spawned image with
results returned to the parent image.
</OT>
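For readers unfamiliar with the pattern: RemoteTask forks a child image, evaluates the block there, and serializes the answer back to the parent. A rough analogue can be sketched with Python's multiprocessing module standing in for the forked Squeak image; none of these names come from OSProcess, and the real RemoteTask serializes the Smalltalk block and its result between images:

```python
# Hedged analogue of the RemoteTask pattern: fork a child process,
# evaluate a block of code there, and ship the answer back to the
# parent over a pipe.
from multiprocessing import Process, Pipe

def _evaluate(block, conn):
    conn.send(block())           # run the block in the child, send the answer
    conn.close()

def remote_do(block):
    parent_end, child_end = Pipe()
    child = Process(target=_evaluate, args=(block, child_end))
    child.start()
    answer = parent_end.recv()   # parent blocks until the child answers
    child.join()
    return answer

def add():                       # stands in for the [3 + 4] block
    return 3 + 4

if __name__ == "__main__":
    print(remote_do(add))        # prints 7
```

The interesting part, as in RemoteTask, is that the block's evaluation happens in a separate address space, so a crash or long GC in the child cannot disturb the parent.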

Dave



Re: Touch/MultiTouch Events

timrowledge
In reply to this post by Phil B
 


> On 2020-09-26, at 4:35 PM, Phil B <[hidden email]> wrote:
>
>  Yes, I know the RPi is similarly mostly there (to a degree).  The difference is that on other devices where we're running things like Mobian, they use the actual Debian (or whatever base distro you're using) ARM repos which have advantages in terms of package selection.

Raspbian is simply a repackaged Debian ARM, with some extra stuff (like my NuScratch development) and a less-awful UI setup than most distros. All the Debian ARM repos are available and (barring maybe some odd things?) all will work.


> Most (all?) current Pinephone distros run ARM64. Given the previously mentioned anemic storage performance and already taxed SoC, throwing multi-arch into the mix to run ARM32 code (especially just for Squeak) wouldn't be a good experience.

I'm not sure where you picked up the idea I was suggesting any such thing, 'cos I'm fairly sure I didn't. Currently quite a few people are playing with ARM64 kernels and 64/32 mixed userspace on Pi's, as well as several entirely AARCH64 systems. Eliot & Ken have both been using Manjaro quite a lot, for example.

I always feel a bit puzzled when a modern ARM system is treated like some sort of toy; they're the sort of machine we didn't even know to fantasize about not so very long ago. My Pi 4 runs Smalltalk somewhere around 100,000 times as fast as my original ARM1 development machine (which I still have) did in 1986.

>  Isn't the current state of Cog on ARM64 DIY with major caveats such as FFI? (which doesn't help me as I'm very dependent on FFI)

If you're in need of FFI a lot then maybe it isn't ready for you just yet. I think the remaining problem is some float structure returning complications.

>
> Realistically, most mobile SoCs that can be used on a remotely open device today are going to have significantly less performance.

Well, that's a problem for other makers to get off their butts and fix. It's not like the Pi is based on some exotic special core; it's merely a quad-core A72, which has been available for some time now in Snapdragons, NXP, Jacinto, etc. And the Pi version is on an ancient 28nm process, whereas the A72 is available on 16nm at considerably faster clock speeds. Numerous tablets and phones use those.

Now the Apple A14 is definitely special and I'm really looking forward to using that. My expectation is somewhat better performance than a typical i7, and maybe attacking Intel i9 territory.

But we've wandered a long way from worrying about touch events, so...

tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
Useful random insult:- Cackles a lot, but I ain't seen no eggs yet.



Re: Touch/MultiTouch Events

timrowledge
 


> On 2020-09-26, at 5:44 PM, tim Rowledge <[hidden email]> wrote:
>

> But we've wandered a long way from worrying about touch events, so...

I should probably point out that I have experience of Smalltalk on ARM mobile devices - and touch screens - with half a dozen products/projects going back to 1987 and the Active Book (https://www.microsoft.com/buxtoncollection/detail.aspx?id=158&from=http%3A%2F%2Fresearch.microsoft.com%2Fen-us%2Fum%2Fpeople%2Fbibuxton%2Fbuxtoncollection%2Fdetail.aspx%3Fid%3D158)

Along with the MediaPad (1996-99), DEC/Compaq, Alan's HP tablet project (2003-ish?), and a couple of more minor efforts. They're all existence proofs that Smalltalk on even very slow/small ARMs (the Active Book was an 8 MHz ARM2aS with 1 MB of RAM *including* the screen buffer & filing system) can work really well. The current software world's expectations of multi-core, multi-GHz, multi-GB of 64-bit, hardware floating point, huge caches, etc., etc. just show how lazy people have become.

tim
--
tim Rowledge; [hidden email]; http://www.rowledge.org/tim
Programmers do it bit by bit.