Squeak/Pharo on touchscreen: requesting opinions


Squeak/Pharo on touchscreen: requesting opinions

Dimitry Golubovsky
Hi,

As I progress through the Stack Cog port for Android (basically, I
can load a recent PharoCore or Squeak image, although some stability
issues still exist), I would like to collect opinions from prospective
users on which is the better way to interact with Squeak (meaning
Pharo and others as well) on a touch screen.

9" tabet screen is not much smaller than say EEE PC netbook's, so at
800x480 px resolution menus and other Squeak GUI elements are well
readable, and can be easily pointed at with a stylus. So Squeak
environment itself can be as well used even on my rather cheap device.

Squeak requires a lot of mousing to interact with. On an Android
device, there is a touch screen and possibly several hardware buttons.
In the current implementation, taps on the screen are treated as "red"
clicks, and pressing one of the hardware buttons prior to tapping the
screen changes the click color for just that one click; see this
picture:

http://wiki.squeakvm-tablet.googlecode.com/hg/pics/pdn-land.jpg

That is, in order to get a context menu in Squeak, one has to press
the "Yellow" hardware button (in PharoCore it is the "Blue" button),
and then tap the screen; the menu appears.

See the whole explanation at
http://code.google.com/p/squeakvm-tablet/wiki/TestDrive

This is how the classic Squeak VM port (based on Andreas' code) works
now, and I have the same working in Stack Cog.
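
To make this concrete, here is a minimal Java sketch of the one-shot
click-colour scheme, using Android's standard View callbacks. The key
codes chosen and the queueMouseEvent hook are assumptions for
illustration, not the actual port code; the button masks follow
Squeak's usual encoding (red=4, yellow=2, blue=1).

import android.content.Context;
import android.view.KeyEvent;
import android.view.MotionEvent;
import android.view.View;

public class SqueakTouchView extends View {
    private static final int RED = 4, YELLOW = 2, BLUE = 1;

    // A plain tap is a red click unless a hardware button armed
    // another colour for the next tap.
    private int pendingButtons = RED;

    public SqueakTouchView(Context context) {
        super(context);
        setFocusable(true); // so the view receives hardware key events
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        // A hardware button changes the click colour for one tap only.
        switch (keyCode) {
            case KeyEvent.KEYCODE_MENU:   pendingButtons = YELLOW; return true;
            case KeyEvent.KEYCODE_SEARCH: pendingButtons = BLUE;   return true;
            default: return super.onKeyDown(keyCode, event);
        }
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                queueMouseEvent((int) event.getX(), (int) event.getY(),
                                pendingButtons);
                break;
            case MotionEvent.ACTION_UP:
                queueMouseEvent((int) event.getX(), (int) event.getY(), 0);
                pendingButtons = RED; // the colour reverts after one click
                break;
        }
        return true;
    }

    // Hypothetical hand-off into the VM's event queue.
    private native void queueMouseEvent(int x, int y, int buttons);
}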

Advantages of this method: it is close to the way traditional mobile
applications interact. In a Squeak application, a Morph has to set up
proper handling of red button clicks in order to enable interaction
with the user. A Morph can be dragged with one hand (although, for a
reason I could not explain, morphs disappear from the screen while
being dragged and reappear in the new location only when the tap is
released).

Disadvantages: the current mouse position is unknown/invisible to the
user, and mouse-over is impossible (hence no balloon help). Also, due
to high touchscreen sensitivity, holding a finger/stylus on the screen
may generate many events within a short period of time: even a
one-pixel change in touch position generates an event, and some
involuntary finger movements always take place. Such frequently
reported events may "choke" the interpreter, given that the CPU is
slow, and the Android OS then quickly kills the application as
unresponsive.
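
One conceivable mitigation (a minimal sketch, assuming move events can
be filtered on the Java side before they are handed to the VM; the
thresholds and the forwardToVM hook are hypothetical) is to drop move
reports that are too close in space or time to the last one forwarded:

import android.view.MotionEvent;
import android.view.View;

public class ThrottlingTouchListener implements View.OnTouchListener {
    private static final float MIN_DISTANCE_PX = 4.0f; // swallow sub-4px jitter
    private static final long  MIN_INTERVAL_MS = 20;   // at most ~50 moves/sec

    private float lastX, lastY;
    private long lastTime;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_MOVE) {
            float dx = event.getX() - lastX;
            float dy = event.getY() - lastY;
            long dt = event.getEventTime() - lastTime;
            // Forward a move only when the finger has travelled far
            // enough AND enough time has passed since the last report.
            if (dx * dx + dy * dy < MIN_DISTANCE_PX * MIN_DISTANCE_PX
                    || dt < MIN_INTERVAL_MS) {
                return true; // consume the event without reporting it
            }
        }
        lastX = event.getX();
        lastY = event.getY();
        lastTime = event.getEventTime();
        forwardToVM(event); // hypothetical hand-off to the VM event queue
        return true;
    }

    private void forwardToVM(MotionEvent event) { /* platform-specific */ }
}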

While I am trying to address these issues in various ways, there seems
to be another way to interact via the touchscreen. This method has not
been implemented yet, and I would like to hear from the community
whether it would be good to have.

A mouse pointer is displayed as part of the activity interface (maybe
even done entirely in Java, so mouse movement itself will not put any
load on the interpreter). Finger movements on the screen move the
pointer, but the new position is reported (with button mask 0) only
when the screen tap is released. So mouse-over becomes possible: just
leave the pointer where needed. The hardware buttons are used as
before (one becomes Red, another Blue, and a chord would be Yellow),
but clicks are reported only when those buttons are pressed or
released*. To drag a Morph, one would have to hold one of the hardware
buttons and slide a finger/stylus on the screen.
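
A minimal sketch of how this could look on the Java side (pointer
drawing is elided, and queueMouseEvent is again a hypothetical hook
into the VM's event queue):

import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

public class TrackpadTouchView extends View {
    private float pointerX, pointerY; // pointer owned by the activity
    private float downX, downY;       // where the current tap started
    private float grabX, grabY;       // pointer position at tap start

    public TrackpadTouchView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getX();
                downY = event.getY();
                grabX = pointerX;
                grabY = pointerY;
                break;
            case MotionEvent.ACTION_MOVE:
                // Slide the pointer entirely on the Java side; the
                // interpreter sees nothing while the finger moves.
                pointerX = grabX + (event.getX() - downX);
                pointerY = grabY + (event.getY() - downY);
                invalidate(); // redraw the pointer (onDraw elided)
                break;
            case MotionEvent.ACTION_UP:
                // Only now report the new position, with button mask 0,
                // so mouse-over (and balloon help) can work in the image.
                queueMouseEvent((int) pointerX, (int) pointerY, 0);
                break;
        }
        return true;
    }

    // Hypothetical hand-off into the VM's event queue.
    private native void queueMouseEvent(int x, int y, int buttons);
}

Hardware-button clicks would then be reported at the pointer's current
position via onKeyDown/onKeyUp, along the lines of the first sketch.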

So this hypothetical mode works more like a laptop's trackpad. One
obvious advantage of this method is a lower event frequency, hence
less load on the interpreter and greater stability, although at the
cost of some convenience. Again, to run an end-user application, the
former mouse-tracking method could be re-enabled.

I am asking for your opinion here (as much opinion as can be had about
something still imaginary): is the latter method worth implementing?
Would you use it if it were available?

Thanks.

-------------------------
* or long taps may be recognized as red clicks

--
Dimitry Golubovsky

Anywhere on the Web


Re: Squeak/Pharo on touchscreen: requesting opinions

Stefan Marr
Hi Dimitry:

On 18 Aug 2011, at 20:40, Dimitry Golubovsky wrote:

> A mouse pointer is displayed as part of the activity interface (maybe
> even done entirely in Java, so mouse movement itself will not put any
> load on the interpreter). Finger movements on the screen move the
> pointer, but the new position is reported (with button mask 0) only
> when the screen tap is released. So mouse-over becomes possible: just
> leave the pointer where needed. The hardware buttons are used as
> before (one becomes Red, another Blue, and a chord would be Yellow),
> but clicks are reported only when those buttons are pressed or
> released*. To drag a Morph, one would have to hold one of the hardware
> buttons and slide a finger/stylus on the screen.

For the old MVC image we are using for the RoarVM, we actually display a circular morph to be able to see where to click. (That is for our iOS port: https://github.com/smarr/RoarVM/tree/integrate-ipad-code)
The morph is basically a traditional mouse pointer: it can be dragged around and positioned, and then clicking works with one or multiple fingers.

I haven't actually used it myself on hardware, only in the simulator, but it 'works'.
At least you know where you are going to click, and the MVC image's buttons/menus
are not sized for fingers...

However, I feel that this is really just a workaround. Even in the simulator it does not feel natural.

And in general, for things like tool-tips, the UI designer needs to find a better solution.
I think that for usability it would be better to design a good touch UI instead of trying to make the existing UI fit.
Until then, an artificial, optional 'mouse pointer' might be a choice.

Best regards
Stefan

PS: We do not use hardware buttons, but different click types: single-finger tap, double-finger tap, etc.



--
Stefan Marr
Software Languages Lab
Vrije Universiteit Brussel
Pleinlaan 2 / B-1050 Brussels / Belgium
http://soft.vub.ac.be/~smarr
Phone: +32 2 629 2974
Fax:   +32 2 629 3525



Re: [Pharo-project] Squeak/Pharo on touchscreen: requesting opinions

Douglas Brebner
In reply to this post by Dimitry Golubovsky
On 18/08/2011 19:40, Dimitry Golubovsky wrote:
> I am asking for your opinion here (as much opinion as can be had
> about something still imaginary): is the latter method worth
> implementing? Would you use it if it were available?
>
> Thanks.
>

One thing to bear in mind is that both Android and Morphic support
multitouch; specifically, Morphic supports multiple simultaneous Hands
(in theory, whether the code still works is another matter).

I believe Bert did a demo of multitouch in Squeak (Etoys) on an iPad