The cooked and the raw [RE: Touch/MultiTouch Events]


KenDickey
 
Greetings,

As I expected, there is someone here before me.

The devil is in the details, of course, but the Qt framework delivers
recognized gesture events based on which widget takes the first touch
event and on recognized touch patterns (see below).  Custom gesture
recognizers can be added, or a widget can bypass recognition and take
the raw touch events.

? handlesTouch{,Down,Over} ?


=== from:
https://doc.qt.io/qt-5/gestures-overview.html
==============vvv=============
Overview

QGesture is the central class in Qt's gesture framework, providing a
container for information about gestures performed by the user. QGesture
exposes properties that give general information that is common to all
gestures, and these can be extended to provide additional
gesture-specific information. Common panning, pinching and swiping
gestures are represented by specialized classes: QPanGesture,
QPinchGesture and QSwipeGesture.

Developers can also implement new gestures by subclassing and extending
the QGestureRecognizer class. Adding support for a new gesture involves
implementing code to recognize the gesture from input events. This is
described in the Creating Your Own Gesture Recognizer section.

==============^^^=============
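A custom recognizer of the kind the Overview describes boils down to a small state machine over raw touch events. Here is a framework-neutral sketch in Python (the event shape, names, and threshold are made up for illustration; this is not Qt's API):

```python
# Hypothetical sketch: a recognizer consumes raw touch events and reports
# a recognized gesture, roughly the role QGestureRecognizer::recognize()
# plays in Qt.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    kind: str   # "down", "move", or "up"
    x: float
    y: float

class SwipeRecognizer:
    """Recognizes a horizontal swipe: a down/up pair far enough apart in x."""

    THRESHOLD = 50.0  # minimum horizontal travel, in pixels (arbitrary)

    def __init__(self):
        self.start = None  # TouchEvent that began the candidate gesture

    def recognize(self, event):
        """Return 'swipe-left'/'swipe-right' on recognition, else None."""
        if event.kind == "down":
            self.start = event
            return None
        if event.kind == "up" and self.start is not None:
            dx = event.x - self.start.x
            self.start = None
            if dx > self.THRESHOLD:
                return "swipe-right"
            if dx < -self.THRESHOLD:
                return "swipe-left"
        return None
```

A widget that wants raw events would simply handle the TouchEvents itself instead of feeding them through a recognizer; that is the "bypass" path mentioned above.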
https://doc.qt.io/qt-5/qgesturerecognizer.html
https://doc.qt.io/qt-5/qevent.html
https://doc.qt.io/qt-5/qtouchevent.html
==============vvv=============
Qt will group new touch points together using the following rules:

- When the first touch point is detected, the destination widget is
  determined firstly by the location on screen and secondly by the
  propagation rules.
- When additional touch points are detected, Qt first looks to see if
  there are any active touch points on any ancestor or descendant of the
  widget under the new touch point. If there are, the new touch point is
  grouped with the first, and the new touch point will be sent in a single
  QTouchEvent to the widget that handled the first touch point.
==============^^^=============
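That grouping rule can be sketched as a routing function over a widget tree (a hypothetical toy model, not Qt code):

```python
# Sketch of the grouping rule quoted above: a new touch point joins an
# existing touch sequence if the widget under it is an ancestor or
# descendant of a widget that already has active touch points.
class Widget:
    def __init__(self, parent=None):
        self.parent = parent

    def ancestors(self):
        w = self.parent
        while w is not None:
            yield w
            w = w.parent

def related(a, b):
    """True if a is an ancestor or descendant of b (or the same widget)."""
    return a is b or a in b.ancestors() or b in a.ancestors()

def route_touch(new_widget, active):
    """active maps widget -> touch points it is already handling.
    Return the widget that should receive the new point's touch event."""
    for owner in active:
        if related(new_widget, owner):
            return owner          # group with the first touch point
    return new_widget             # otherwise start a new touch sequence
```

The upshot is that a two-finger gesture spanning a parent and its child still arrives as a single multi-point event at the widget that took the first touch.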

FYI,
-KenD

Re: The cooked and the raw [RE: Touch/MultiTouch Events]

Phil B
 
Ken,

Just a word of warning: I'd be cautious about taking what Qt or GTK have done on this front as tried-and-true solutions.  Neither framework is widely deployed on mobile yet, and based on my experience with both (and with the applications built using them) as a user on the PinePhone, they both still have quite a way to go.  You'd probably be better off spelunking in the Android system code to see how its touch handling works (I believe it's all Apache-licensed, so you can steal any good ideas you find there).

Something I didn't understand until I got mine was why, in most of the videos people were putting up, they were constantly having to re-tap controls to get things to work.  Well, now I know: the controls are often far too small, have no margin for error, and generally make the mistake of assuming 'this is just like a mouse, but with a finger'.  Contrast this with how iOS and Android do things: they lie and cheat all over the place.  For example, in the onscreen keyboard I believe both platforms 'lie' by drawing all keys at the same size while varying the hit areas, so that the keys used most frequently in a given language are easier to hit.

Touch input is also incredibly noisy, both because of the capacitive touch screen and because the user's hand/finger isn't terribly precise, shakes, etc. (think of it more as a noisy analog signal than a digital one).  So the comparatively precise nature of mouse-based GUIs doesn't always translate (i.e. just blowing up the control sizes often isn't enough for a great experience).
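The keyboard trick Phil describes, equal-looking keys with unequal touch targets, can be sketched as weighted hit-testing (the frequencies and geometry below are invented for illustration; neither platform's actual tuning is public in this form):

```python
# Hypothetical sketch: keys are drawn at equal widths, but a row of width
# row_width is divided into invisible touch targets proportional to how
# often each key is typed.
def hit_areas(keys, freqs, row_width):
    """Return key -> (left, right) touch-target edges, sized by frequency."""
    total = sum(freqs[k] for k in keys)
    areas, x = {}, 0.0
    for k in keys:
        w = row_width * freqs[k] / total
        areas[k] = (x, x + w)
        x += w
    return areas

def key_for_touch(areas, x):
    """Resolve a touch x-coordinate to the key whose target contains it."""
    for k, (left, right) in areas.items():
        if left <= x < right:
            return k
    return None
```

Real keyboards go further, weighting targets dynamically by the letters likely to follow what has already been typed, but even this static version shows how 'what you see' and 'what you hit' can diverge.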

Phil

On Mon, Sep 21, 2020 at 10:13 AM <[hidden email]> wrote:
 