[Development] need to handle touch events rather than depending on mouse event synthesis

Alan Alpert alan.alpert at nokia.com
Thu Mar 1 10:48:15 CET 2012


On Thu, 1 Mar 2012 18:26:28 ext Samuel Rødal wrote:
> On 02/29/2012 05:20 PM, ext Shawn Rutledge wrote:
> > The proposal is this: I think we need to have a QML PointingArea element
> > which looks just like MouseArea except that it handles both mouse events
> > and single-touch events the same way.  Then we need to start using it
> > instead of MouseArea in almost every case.  That way
> > AA_SynthesizeMouseForUnhandledTouchEvents can eventually be set to false
> > for some applications.
> 
> PointingArea sounds a bit strange at first, but maybe it's hard to find
> a better name for it; at least I can't think of one.
> 
> > Some apps may eventually have a use for the distinction between mouse
> > and touch, too. One reason I got interested again at this particular
> > time is that I was thinking it would be nice if the KDE Konsole
> > (terminal application) was flickable. But dragging with the left mouse
> > button is the way that you select text, and that is also useful. So on
> > my touchscreen laptop, I can select text by dragging, regardless of whether
> > I drag with a finger on the screen, with the touchpad, or with an
> > external mouse; but with a finger on the screen, selecting text is not
> > really what I expect. If Konsole handled both mouse events and touch
> > events, it could do something appropriate for each of them, and there
> > could be some useful multi-finger gestures too. This is just an example,
> > and in fact it may already be possible to implement this use case
> > without changes to Qt (except for the lack of support for XInput 2.2
> > and/or direct support for the evdev driver on my laptop, which I'm also
> > interested in looking at separately.)  But in a more complex case which
> > has more widgets or Qt Quick elements, if touch events are being
> > accepted at all, you need to have them understood everywhere.
> > 
> > But we have the issue that the QML MouseArea component is in very
> > widespread use. This is because QML started life on the desktop. There
> > is already QQuickMultiPointTouchArea and QQuickPinchArea; so in
> > applications which intend to be portable between touchscreen devices and
> > conventional desktop usage, it should be OK to have overlapping touch
> > areas and MouseAreas, and this will enable the app developer to
> > customize the behavior depending on whether the user is interacting with
> > a mouse or a finger. But naive components like any of the many Button
> > implementations should not necessarily need to care.  They should be
> > using the proposed PointingArea instead of MouseArea.
> > 
> > Alternatively MouseArea could handle touch events itself, but then
> > pretty soon we will start thinking the name sounds rather dated.  It
> > wouldn't even surprise me if people stop using mice in a decade or so;
> > whereas we will probably always have some kind of "pointing device",
> > thus the need for a generic name which won't be obsolete later.  And, we
> > still need a mouse-only Area which can be used in combination with the
> > touch-only Areas so that it's possible to make a cross-platform UI.
> 
> I agree with your argument that changing MouseArea to handle touch
> events might not be a good idea after all, as applications might want to
> still handle touch and mouse input slightly differently by putting both
> a MouseArea and a TouchArea covering the same region.

The overlapping MouseArea/TouchArea is an interesting idea, and might explain 
why we'd need a TouchArea even though it's virtually identical to the proposed 
'PointerArea' element. But then we'd have three area interaction elements that 
are nearly the same, with a few slight differences in functionality (like wheel 
events) and some name changes (onTap instead of onClicked despite identical 
functionality...).
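
To make the overlap concrete, a rough sketch using the existing 
MultiPointTouchArea in place of a hypothetical TouchArea (the exact event 
delivery and stacking rules are glossed over; this is just the shape of the 
idea):

    import QtQuick 2.0

    Rectangle {
        width: 400; height: 300

        // Mouse-specific behaviour, e.g. drag-to-select with the left button.
        MouseArea {
            anchors.fill: parent
            onPressed: console.log("mouse press at", mouse.x, mouse.y)
        }

        // Touch-specific behaviour layered over the same region,
        // e.g. flicking or multi-finger gestures.
        MultiPointTouchArea {
            anchors.fill: parent
            onPressed: console.log("touch press with", touchPoints.length, "point(s)")
        }
    }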

Perhaps we could just add an enum to MouseArea? EventType { MouseEvents, 
TouchEvents, MouseAndTouchEvents (default) }. That would allow more 
fine-grained control, with identical default behavior that doesn't require 
event synthesis that messes with other elements. Not to mention that in the 
common case you don't care what pointy thing they used to say 'do that'; 
there are devices with both mouse and touch, and often the app really doesn't 
care which it was.
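
As a sketch only (the property and enum names here are made up, nothing like 
this exists in MouseArea today):

    MouseArea {
        anchors.fill: parent
        // Hypothetical: restrict which events this area reacts to;
        // MouseAndTouchEvents would be the default.
        acceptedEventTypes: MouseArea.TouchEvents
        onClicked: console.log("activated, by whichever pointy thing")
    }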

It doesn't solve the naming issue, but that one is difficult because it leads 
to a lot of API differences which are purely naming. I'd rather use a 
MouseArea's onClicked signal for a touch UI than have to switch to 
TouchArea's onTapped everywhere just because this is a mobile app UI. 
PointerArea's onJab (onPointedWithEnthusiasm? We've run away from the metaphor 
a little here...) might not solve this either, and even if it did, it would 
still feel like an unnecessary change during the transition period.

-- 
Alan Alpert
Senior Engineer
Nokia, Qt Development Frameworks


