[Development] QInputEvent refactoring

Shawn Rutledge shawn.rutledge at qt.io
Tue May 26 17:27:21 CEST 2020


> On 2020 May 26, at 16:27, Henry Skoglund <henry at tungware.se> wrote:
> Hi, I tried some QGestureEvents this winter in my first Qt iOS app, and got bored having to use 3 fingers on my iPhone for swiping (1 finger seems to be the norm on Android and iPhones). And I discovered that it was pretty easy to integrate the vanilla standard iOS 1-finger swiping with Qt, see https://bugreports.qt.io/browse/QTBUG-81042
That sounds like what we have been using QNativeGestureEvent for.  The plugin should probably call QWindowSystemInterface::handleGestureEvent() or handleGestureEventWithRealValue() rather than directly posting a QGestureEvent.  So far only the cocoa plugin does that, though.
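For illustration, here is a minimal plain-C++ sketch of the shape this takes. All names are hypothetical stand-ins (not the real QPA types or signatures): the platform plugin reports the OS-recognized gesture through a window-system-interface function that queues it for normal event delivery, rather than constructing and posting a high-level gesture event itself.

```cpp
#include <functional>
#include <queue>

// Stand-ins for Qt::NativeGestureType and QNativeGestureEvent.
enum class NativeGestureType { Swipe, Zoom, Rotate };

struct NativeGestureEvent {
    NativeGestureType type;
    double x, y;       // position in window coordinates
    double realValue;  // e.g. zoom factor or rotation angle; 0 for swipes
};

// Stand-in for the QWindowSystemInterface entry points: the plugin only
// enqueues; delivery happens later, on the GUI thread's event loop.
class WindowSystemInterface {
public:
    static void handleGestureEventWithRealValue(NativeGestureType type,
                                                double x, double y,
                                                double value) {
        queue().push({type, x, y, value});
    }

    // Drain the queue, handing each event to the application's dispatcher.
    // Returns the number of events delivered.
    static int flush(const std::function<void(const NativeGestureEvent &)> &deliver) {
        int count = 0;
        while (!queue().empty()) {
            deliver(queue().front());
            queue().pop();
            ++count;
        }
        return count;
    }

private:
    static std::queue<NativeGestureEvent> &queue() {
        static std::queue<NativeGestureEvent> q;
        return q;
    }
};
```

The point of the indirection is that the gesture then flows through the same queued delivery path as every other input event, so its ordering relative to touch and mouse events is preserved.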

Do you have to swipe from the edge of the screen with one finger?  Then I wonder if that would work well on a desktop if your window isn’t maximized.  Like with Drawer in Controls… swiping from the edge is fine when you can feel the edge, but opening a Drawer with the mouse isn’t much fun, and opening it on a desktop touchscreen isn’t either, if the edge of the window is in the middle of the screen.  (You might resize it instead, or end up pressing some other window and changing focus, or…)  But Drawer works with mouse and touch events: you simply start dragging an invisible item along its edge.  If it worked with a native swipe gesture instead, maybe it would only receive the gesture if you are swiping from the screen edge, and only on certain platforms. (Maybe it should handle that too though, optionally?)  So it’s a bit confusing when you need a gesture recognizer and when you need to use raw events.
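To make the screen-edge vs. window-edge distinction concrete, here is a small self-contained sketch (hypothetical helper names, not Qt API) of the two recognizer conditions: an OS edge-swipe recognizer keys off the screen edge, so it never fires for a window whose edge sits mid-screen, while a Drawer-style drag keys off the window's own geometry, so it works anywhere, with mouse or touch.

```cpp
struct Rect { int x, y, w, h; };

// OS-level recognizer: only fires if the press begins at a *screen* edge.
bool isScreenEdgeSwipe(int pressX, int screenWidth, int tolerance) {
    return pressX <= tolerance || pressX >= screenWidth - tolerance;
}

// Drawer-style recognizer: fires if the press begins near either vertical
// *window* edge, wherever the window happens to be on screen.
bool isWindowEdgeDrag(int pressX, const Rect &win, int tolerance) {
    return (pressX >= win.x && pressX <= win.x + tolerance) ||
           (pressX <= win.x + win.w && pressX >= win.x + win.w - tolerance);
}
```

For a non-maximized window at x = 400 on a 1920-wide screen, a press at x = 405 satisfies the window-edge test but not the screen-edge one, which is exactly the case where a native edge-swipe gesture would never arrive.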

The old gesture framework started out assuming that it could only get raw events, and with the very ambitious goal of being able to detect any kind of gesture from any kind of event.  So it filters all events, just in case one of the recognizers might be able to detect a gesture in any part of the event stream.  Thus it is a bottleneck, and slows down all event delivery slightly; that’s what’s wrong with the architecture.  But for the sake of the widget API, QNativeGestureEvents can also turn into QGestureEvents during delivery.

I haven’t dug too deeply into the impact on the widget API if we get rid of the gesture recognition framework in Qt 6.  I think widgets ought to be able to handle QNativeGestureEvent just as well as Qt Quick does, as long as the events get delivered properly.  But at that point, maybe we should rename it back to QGestureEvent?  Either way, we can probably say that the future is for most gestures to be native: either you process raw touch events in leaf items/widgets, or you get a gesture event that the OS has already recognized.  Qt Quick does both.

But the more we depend on the OS to recognize gestures, the more we need every platform to provide them.  So eventually we might need to find or write another gesture recognition engine, just for use on platforms that don’t already have one.  (Although now that libinput does it, maybe that’s good enough on Linux.  Someone contributed a WIP patch for that already, too.)  I think gesture recognizers should live below the QPA interface, though: probably not in QtGui, and certainly not in the widget module (as the existing one does), because the main desktop platforms won’t need it.
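If such an engine is ever needed on platforms without native recognition, its core is unglamorous state tracking over raw touch points.  A deliberately minimal sketch (plain C++, and nothing to do with the existing QGestureRecognizer API): classify a completed stroke as a swipe if it moved far enough and mostly along one axis.

```cpp
#include <cmath>
#include <vector>

struct Point { double x, y; };

enum class Swipe { None, Left, Right, Up, Down };

// Classify a finished touch stroke.  A real engine would also track timing,
// velocity, and multiple touch points; this only looks at net displacement.
Swipe recognizeSwipe(const std::vector<Point> &stroke, double minDistance = 50.0) {
    if (stroke.size() < 2)
        return Swipe::None;
    double dx = stroke.back().x - stroke.front().x;
    double dy = stroke.back().y - stroke.front().y;
    if (std::abs(dx) >= std::abs(dy)) {          // mostly horizontal
        if (std::abs(dx) < minDistance)
            return Swipe::None;
        return dx > 0 ? Swipe::Right : Swipe::Left;
    }
    if (std::abs(dy) < minDistance)              // mostly vertical
        return Swipe::None;
    return dy > 0 ? Swipe::Down : Swipe::Up;
}
```

Even a toy like this shows why the recognizer wants to sit below QPA: it consumes the raw stream once, at the source, instead of filtering every event on its way through the whole delivery chain.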


