[Development] need to handle touch events rather than depending on mouse event synthesis

Samuel Rødal samuel.rodal at nokia.com
Thu Mar 1 15:16:49 CET 2012


On 03/01/2012 02:49 PM, ext Frederik Gladhorn wrote:
> On Thursday, 1 March 2012 19:48:15, ext Alan Alpert wrote:
>> On Thu, 1 Mar 2012 18:26:28 ext Samuel Rødal wrote:
>>> On 02/29/2012 05:20 PM, ext Shawn Rutledge wrote:
>>>> The proposal is this: I think we need to have a QML PointingArea element
>>>> which looks just like MouseArea except that it handles both mouse events
>>>> and single-touch events the same way.  Then we need to start using it
>>>> instead of MouseArea in almost every case.  That way
>>>> AA_SynthesizeMouseForUnhandledTouchEvents can eventually be set to false
>>>> for some applications.
>>>
>>> PointingArea sounds a bit strange at first, but maybe it's hard to find
>>> a better name for it; at least I can't think of one.
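
To make the idea concrete, usage could look roughly like this (PointingArea
is purely hypothetical at this point, so the name and the signal are just
placeholders):

  import QtQuick 2.0

  Rectangle {
      width: 100; height: 40

      // Hypothetical element: reacts the same way to a mouse click and to
      // a single-finger tap, without relying on synthesized mouse events.
      PointingArea {
          anchors.fill: parent
          onClicked: console.log("activated by mouse or finger")
      }
  }
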
>>>
>>>> Some apps may eventually have a use for the distinction between mouse
>>>> and touch, too. One reason I got interested again at this particular
>>>> time is that I was thinking it would be nice if the KDE Konsole
>>>> (terminal application) was flickable. But dragging with the left mouse
>>>> button is the way that you select text, and that is also useful. So on
>>>> my touchscreen laptop, I can select text by dragging, regardless of whether
>>>> I drag with a finger on the screen, with the touchpad, or with an
>>>> external mouse; but with a finger on the screen, selecting text is not
>>>> really what I expect. If Konsole handled both mouse events and touch
>>>> events, it could do something appropriate for each of them, and there
>>>> could be some useful multi-finger gestures too. This is just an example,
>>>> and in fact it may already be possible to implement this use case
>>>> without changes to Qt (except for the lack of support for XInput 2.2
>>>> and/or direct support for the evdev driver on my laptop, which I'm also
>>>> interested in looking at separately.)  But in a more complex case which
>>>> has more widgets or Qt Quick elements, if touch events are being
>>>> accepted at all, you need to have them understood everywhere.
>>>>
>>>> But we have the issue that the QML MouseArea component is in very
>>>> widespread use. This is because QML started life on the desktop. There
>>>> are already QQuickMultiPointTouchArea and QQuickPinchArea; so in
>>>> applications which intend to be portable between touchscreen devices and
>>>> conventional desktop usage, it should be OK to have overlapping touch
>>>> areas and MouseAreas, and this will enable the app developer to
>>>> customize the behavior depending on whether the user is interacting with
>>>> a mouse or a finger. But naive components like any of the many Button
>>>> implementations should not necessarily need to care.  They should be
>>>> using the proposed PointingArea instead of MouseArea.
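
As a side note, the overlapping-areas pattern described above is already
expressible with the elements we have today; the structure would be
something like the sketch below, though how events reach overlapping areas
still depends on stacking order and on whether mouse synthesis is enabled:

  import QtQuick 2.0

  Rectangle {
      width: 200; height: 200

      // Mouse-specific behavior, e.g. drag-to-select
      MouseArea {
          anchors.fill: parent
          onPressed: console.log("mouse press at", mouse.x, mouse.y)
      }

      // Touch-specific behavior, e.g. flicking or gestures
      MultiPointTouchArea {
          anchors.fill: parent
          maximumTouchPoints: 2
          onPressed: console.log(touchPoints.length, "touch point(s) pressed")
      }
  }
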
>>>>
>>>> Alternatively MouseArea could handle touch events itself, but then
>>>> pretty soon we will start thinking the name sounds rather dated.  It
>>>> wouldn't even surprise me if people stop using mice in a decade or so;
>>>> whereas we will probably always have some kind of "pointing device",
>>>> thus the need for a generic name which won't be obsolete later.  And, we
>>>> still need a mouse-only Area which can be used in combination with the
>>>> touch-only Areas so that it's possible to make a cross-platform UI.
>>>
>>> I agree with your argument that changing MouseArea to handle touch
>>> events might not be a good idea after all, as applications might still
>>> want to handle touch and mouse input slightly differently by putting both
>>> a MouseArea and a TouchArea over the same region.
>>
>> The overlapping MouseArea/TouchArea is an interesting idea, and might
>> explain why we'd need a TouchArea when it's virtually identical to the
>> "PointerArea" element. But then we'd have three area interaction elements
>> that are virtually identical, with a few slight differences in
>> functionality (like wheel events) and some name changes (onTap instead of
>> onClicked despite identical functionality...).
>>
>> Perhaps we could just add an enum to MouseArea? EventType { MouseEvents,
>> TouchEvents, MouseAndTouchEvents (default) }. That would allow you more
>> fine-grained control, with identical default behavior that doesn't require
>> event synthesis that messes with other elements. Not to mention that in the
>> common case you don't care what pointy thing they used to say 'do that':
>> there are devices with both mouse and touch, and often the app really
>> doesn't care which it was.

Hmm, true; then if you wanted to handle touch and mouse differently
you'd just put two of them on top of each other with different event types.
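
Roughly like this; the "acceptedEventTypes" property and the EventType enum
are purely hypothetical names for Alan's idea:

  Item {
      // Two MouseAreas covering the same region, each restricted to one
      // event type via the hypothetical enum.
      MouseArea {
          anchors.fill: parent
          acceptedEventTypes: MouseArea.MouseEvents   // hypothetical
          onPressed: console.log("mouse-only behavior")
      }
      MouseArea {
          anchors.fill: parent
          acceptedEventTypes: MouseArea.TouchEvents   // hypothetical
          onPressed: console.log("touch-only behavior")
      }
  }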

>> It doesn't solve the name issue, but that is a difficult one, because it
>> leads to a lot of API differences which are purely about naming. I'd rather use a
>> MouseArea's onClicked signal for a touch UI than have to switch to using
>> TouchArea's onTapped everywhere just because this is the mobile app UI.
>> PointerArea's onJab (onPointedWithEnthusiasm? We've run away from the
>> metaphor a little here...) might not solve this, but it would feel like an
>> unnecessary change during the transition period even if it did.
>
> I think I agree with Alan here. Adding the functionality to MouseArea would be
> the least disruptive option and would solve the things we discussed.
> In an ideal world we might start with a different name, but MouseArea has
> become so widespread and predominant that it makes sense to simply keep it.
>
> +1 for adding touch handling to MouseArea.

It's possible to make MouseArea a sub-class of whatever we actually want 
to call it, and mark it as deprecated. That way it could be removed down 
the line, or made to only handle the MouseEvent type. Similarly we could 
have a TouchArea convenience sub-class that only handles the TouchEvent 
type.
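
In usage terms that would give us something like the sketch below; all of
the type names besides MouseArea are hypothetical and up for discussion:

  Item {
      PointingArea {      // generic element: handles both mouse and touch
          onClicked: console.log("clicked or tapped")
      }
      MouseArea {         // deprecated convenience: MouseEvent type only
          onClicked: console.log("clicked with a mouse")
      }
      TouchArea {         // convenience: TouchEvent type only
          onClicked: console.log("tapped with a finger")
      }
  }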

--
Samuel


