[Development] need to handle touch events rather than depending on mouse event synthesis

Frederik Gladhorn frederik.gladhorn at nokia.com
Tue Jun 19 14:19:26 CEST 2012


Hi,

On Friday, 15 June 2012 15:33:03, ext Luciano Wolf wrote:
> Hi Frederik,
> 
> How are the patches going? I'm interested in these touch-related things,
> as we're having problems with the Snowshoe web browser running on an N9. I'm
> trying your patch (the 50th) against the latest valid Qt 5 hash, but it's
> not working. Should I try with a specific hash?

As Andras commented on the patch, it works with WebView in some cases.
The plan now is to finally integrate it and start fixing bugs on top of it
instead of rebasing it again and again.
All in all, it has reached a good state.
I'm waiting for a final OK from the QML team.

Martin, it would be great to know if you have any additional insights. 
Otherwise we'll go forward and finally get this behavior change in, hopefully 
before any Qt 5 beta.

Greetings,
Frederik


> 
> Cheers,
> Luciano

> On Tue, May 15, 2012 at 1:12 PM, Frederik Gladhorn <frederik.gladhorn at nokia.com> wrote:
> > Hi,
> > 
> > It's been a while and the touch handling issues keep coming back.
> > 
> > Since I don't think just talking helps, I spent the last month writing
> > patches with Shawn's and Laszlo's help, testing on actual touch screens.
> > 
> > The result is two patches, and I'd like to have more opinions on them. They
> > are not 100% polished, but they work quite well for me.
> > 
> > The patches are bigger than I hoped. To sum them up, they "just change the
> > event propagation of touch and synthesized mouse events to be parallel".
> > So each item is first offered the touch event and, if it claims to handle
> > mouse events, also the synthesized mouse event.
> > The tricky part is that items can filter their children's events to steal
> > them (this is how a flickable works with a mouse area on top of it).
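A rough conceptual sketch of that per-item delivery order (this is not the actual patch code, and the helper name deliverPointToItem is invented purely for illustration):

    #include <QCoreApplication>
    #include <QMouseEvent>
    #include <QQuickItem>
    #include <QTouchEvent>

    // Sketch only: during one delivery pass, the item is offered the raw
    // touch event first and, if it declares interest in mouse buttons, a
    // mouse event synthesized from the primary touch point as well.
    static void deliverPointToItem(QQuickItem *item, QTouchEvent *touch)
    {
        QCoreApplication::sendEvent(item, touch);

        if (item->acceptedMouseButtons() & Qt::LeftButton) {
            const QTouchEvent::TouchPoint p = touch->touchPoints().first();
            QMouseEvent mouse(QEvent::MouseButtonPress, p.pos(),
                              Qt::LeftButton, Qt::LeftButton, Qt::NoModifier);
            QCoreApplication::sendEvent(item, &mouse);
        }
    }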
> > 
> > 
> > Send mouse and touch in parallel:
> > https://codereview.qt-project.org/#change,24189
> > 
> > Make it work again with more than one touch point:
> > https://codereview.qt-project.org/#change,25578
> > 
> > (I don't mind squashing them, but it was easier to have one known good
> > checkpoint while working on this)
> > 
> > 
> > There are still some issues; for example, currently you can only interact
> > with one pinch area at a time.
> > 
> > Basically my goal now is to get the big behavior change in as soon as
> > possible. In my opinion it's now or never, since we are already way too
> > far into the Qt 5 cycle. I personally do think this is important, and
> > we'll always regret not getting this straight.
> > 
> > Cheers
> > Frederik
> > 
> > On Wednesday, 29 February 2012 17:20:36, ext Shawn Rutledge wrote:
> >> We've been chatting some more around the Oslo office about the fact that
> >> even on platforms where the touchscreen is the primary pointing device,
> >> both QWidgets and Qt Quick components are handling mostly mouse
> >> events. In order to make that work, the Qt 4 approach (which is still
> >> in place now) is, for each incoming touch event, to first let it propagate
> >> as a touch event and then, if it was not accepted, to synthesize a
> >> mouse event and let that propagate through. This method of achieving
> >> backwards compatibility discourages handling touch events though: if
> >> any variety of button widget or QML component, which handles mouse
> >> events only, is a child (or descendant) of any component which handles
> >> touch events, then when the touch event is accepted by the parent
> >> component, the mouse event will not be synthesized. So the button (or
> >> other mouse-only component) cannot be pressed.
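The widget-side mechanics behind that look roughly like the sketch below, assuming a hypothetical FlickableArea parent that opts into and accepts touch events; this is exactly the case in which a mouse-only child (say, a QPushButton placed on top of it) no longer receives synthesized mouse events:

    #include <QEvent>
    #include <QWidget>

    // Hypothetical parent widget that handles touch itself (e.g. for flicking).
    class FlickableArea : public QWidget
    {
    public:
        explicit FlickableArea(QWidget *parent = 0) : QWidget(parent)
        {
            // Opt in to receiving QTouchEvents at all.
            setAttribute(Qt::WA_AcceptTouchEvents);
        }

    protected:
        bool event(QEvent *e)
        {
            switch (e->type()) {
            case QEvent::TouchBegin:
            case QEvent::TouchUpdate:
            case QEvent::TouchEnd:
                // Accepting the touch event means no mouse event is
                // synthesized from it, so a mouse-only child widget on
                // top of this one cannot be pressed with a finger.
                e->accept();
                return true;
            default:
                return QWidget::event(e);
            }
        }
    };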
> >> 
> >> The WebKit team has this problem, in that they want the QML web view to
> >> be flickable, but obviously the user needs to be able to interact with
> >> any mouse-only components (such as buttons) which might be on top. So
> >> WebKit needs to separately synthesize mouse events from touch events
> >> because the main synthesis mechanism doesn't work in that case.
> >> 
> >> In src/quick/items/qquickcanvas.cpp, there is a very recent new method
> >> translateTouchToMouse which generates a new QQuickMouseEventEx event
> >> type containing velocity information. (In fact maybe it makes sense to
> >> just put the velocity in the base QMouseEvent, but that's optional to
> >> what I'm about to propose.) This is IMO another case where mouse event
> >> synthesis should not be necessary.  I suspect the reason for it is the
> >> same as the WebKit case.
> >> 
> >> Graphics View has yet another way, but handling touch events there is
> >> lower-priority than for QML.
> >> 
> >> If we set aside all the Qt history and think about what would have been
> >> ideal if we were starting over from scratch, I think I'd have wanted a
> >> "pointing" event type which has a bit less than what QMouseEvent does:
> >> at least the coordinates and an enum for "what happened" (pressed,
> >> released, clicked, double-clicked, entering, leaving and so on). The
> >> mouse event could inherit that and add mouse-specific concepts like
> >> having multiple buttons, and the touch event could inherit and add the
> >> multiple-finger concept. The point being that naive widgets like
> >> Button should not need to care where the event came from, just that it
> >> was clicked or tapped, but not dragged; so QPushButton would just handle
> >> the hypothetical Pointing event. Then most of the third-party legacy
> >> apps would have already been doing the same thing, and we wouldn't have
> >> trouble at this stage introducing a touch event as a different
> >> specialization of the Pointing event. (Alternatively multiple fingers
> >> could be treated just like multiple mice: separate press/release events
> >> for the independent fingers. But actually it's useful to group multiple
> >> touch points together as long as they come from the same user; it
> >> facilitates gestural UIs, in that the UI does not need to gather up
> >> multiple points from multiple events occurring at different times, and
> >> figure out that they are part of one gesture.)
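Purely as an illustration of that idea, such a hierarchy might have looked like this (all class names below are invented; nothing like this exists in Qt):

    #include <QEvent>
    #include <QPointF>
    #include <QVector>

    // Hypothetical base event: just a position and "what happened".
    class QPointingEvent : public QEvent
    {
    public:
        enum Action { Pressed, Released, Clicked, DoubleClicked, Entered, Left };

        QPointingEvent(Type type, const QPointF &pos, Action action)
            : QEvent(type), m_pos(pos), m_action(action) {}

        QPointF pos() const { return m_pos; }
        Action action() const { return m_action; }

    private:
        QPointF m_pos;
        Action m_action;
    };

    // Mouse specialization adds mouse-only concepts such as buttons.
    class QMousePointingEvent : public QPointingEvent
    {
    public:
        QMousePointingEvent(Type type, const QPointF &pos, Action action,
                            Qt::MouseButton button, Qt::MouseButtons buttons)
            : QPointingEvent(type, pos, action),
              m_button(button), m_buttons(buttons) {}

        Qt::MouseButton button() const { return m_button; }
        Qt::MouseButtons buttons() const { return m_buttons; }

    private:
        Qt::MouseButton m_button;
        Qt::MouseButtons m_buttons;
    };

    // Touch specialization adds the multiple-finger concept.
    class QTouchPointingEvent : public QPointingEvent
    {
    public:
        QTouchPointingEvent(Type type, const QVector<QPointF> &points, Action action)
            : QPointingEvent(type, points.value(0), action), m_points(points) {}

        QVector<QPointF> points() const { return m_points; }

    private:
        QVector<QPointF> m_points;
    };

A naive Button would then handle only the base QPointingEvent and would never need to know whether a mouse or a finger produced it.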
> >> 
> >> Anyway, back to reality... my next idea was let's have a flag to enable
> >> the mouse event synthesis, which should be true by default, so that we
> >> can at least turn it off and try to develop pure-touch UIs. But it turns
> >> out this flag already exists: AA_SynthesizeMouseForUnhandledTouchEvents,
> >> which is true by default. And there is even the reverse one:
> >> AA_SynthesizeTouchForUnhandledMouseEvents. So that's a good start.
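Both attributes exist today and can be toggled per application; a minimal example of switching the default mouse synthesis off (assuming a plain Qt 5 QGuiApplication):

    #include <QGuiApplication>

    int main(int argc, char *argv[])
    {
        // Stop synthesizing mouse events from unhandled touch events;
        // this attribute defaults to true.
        QCoreApplication::setAttribute(
            Qt::AA_SynthesizeMouseForUnhandledTouchEvents, false);

        // The reverse attribute, Qt::AA_SynthesizeTouchForUnhandledMouseEvents,
        // defaults to false and can be enabled the same way.

        QGuiApplication app(argc, argv);
        return app.exec();
    }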
> >> 
> >> The proposal is this: I think we need to have a QML PointingArea element
> >> which looks just like MouseArea except that it handles both mouse events
> >> and single-touch events the same way.  Then we need to start using it
> >> instead of MouseArea in almost every case.  That way
> >> AA_SynthesizeMouseForUnhandledTouchEvents can eventually be set to false
> >> for some applications.
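PointingArea does not exist yet, but as a sketch of the idea, the backing item of such an element could route a mouse press and a single-finger touch into the same handler (class and signal names below are invented for illustration):

    #include <QMouseEvent>
    #include <QQuickItem>
    #include <QTouchEvent>

    // Hypothetical backing item for the proposed PointingArea element.
    class PointingArea : public QQuickItem
    {
        Q_OBJECT
    public:
        explicit PointingArea(QQuickItem *parent = 0) : QQuickItem(parent)
        {
            setAcceptedMouseButtons(Qt::LeftButton);
        }

    signals:
        void pressed(qreal x, qreal y);

    protected:
        void mousePressEvent(QMouseEvent *event)
        {
            handlePress(event->localPos());
            event->accept();
        }

        void touchEvent(QTouchEvent *event)
        {
            if (event->type() == QEvent::TouchBegin) {
                // Only the first touch point matters here; multi-finger
                // gestures remain the job of MultiPointTouchArea/PinchArea.
                handlePress(event->touchPoints().first().pos());
                event->accept();
            } else {
                QQuickItem::touchEvent(event);
            }
        }

    private:
        void handlePress(const QPointF &p) { emit pressed(p.x(), p.y()); }
    };

With something like this in place, AA_SynthesizeMouseForUnhandledTouchEvents could indeed be switched off for applications that use it consistently.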
> >> 
> >> We also need the existing QWidgets to start handling touch events.
> >> After that is done, individual QWidget-based apps in the field
> >> (especially those with custom widgets) can set the
> >> AA_SynthesizeMouseForUnhandledTouchEvents flag or not, depending on what
> >> works better for them; but we need to move towards a future in which we
> >> do not need to synthesize mouse events.
> >> 
> >> Some apps may eventually have a use for the distinction between mouse
> >> and touch, too. One reason I got interested again at this particular
> >> time is that I was thinking it would be nice if the KDE Konsole
> >> (terminal application) was flickable. But dragging with the left mouse
> >> button is the way that you select text, and that is also useful. So on
> >> my touchscreen laptop, I can select text by dragging, regardless of whether
> >> I drag with a finger on the screen, with the touchpad, or with an
> >> external mouse; but with a finger on the screen, selecting text is not
> >> really what I expect. If Konsole handled both mouse events and touch
> >> events, it could do something appropriate for each of them, and there
> >> could be some useful multi-finger gestures too. This is just an example,
> >> and in fact it may already be possible to implement this use case
> >> without changes to Qt (except for the lack of support for XInput 2.2
> >> and/or direct support for the evdev driver on my laptop, which I'm also
> >> interested in looking at separately.)  But in a more complex case which
> >> has more widgets or Qt Quick elements, if touch events are being
> >> accepted at all, you need to have them understood everywhere.
> >> 
> >> But we have the issue that the QML MouseArea component is in very
> >> widespread use. This is because QML started life on the desktop. There
> >> are already QQuickMultiPointTouchArea and QQuickPinchArea; so in
> >> applications which intend to be portable between touchscreen devices and
> >> conventional desktop usage, it should be OK to have overlapping touch
> >> areas and MouseAreas, and this will enable the app developer to
> >> customize the behavior depending on whether the user is interacting with
> >> a mouse or a finger. But naive components like any of the many Button
> >> implementations should not necessarily need to care.  They should be
> >> using the proposed PointingArea instead of MouseArea.
> >> 
> >> Alternatively MouseArea could handle touch events itself, but then
> >> pretty soon we will start thinking the name sounds rather dated.  It
> >> wouldn't even surprise me if people stop using mice in a decade or so;
> >> whereas we will probably always have some kind of "pointing device",
> >> thus the need for a generic name which won't be obsolete later.  And, we
> >> still need a mouse-only Area which can be used in combination with the
> >> touch-only Areas so that it's possible to make a cross-platform UI.
> >> 
> >> In summary, the consequences of mouse event synthesis present some real
> >> problems, and I think we need to get the device-agnostic PointingArea
> >> into Qt5 ASAP.
> > 
> > _______________________________________________
> > Development mailing list
> > Development at qt-project.org
> > http://lists.qt-project.org/mailman/listinfo/development


