[Development] QInputEvent refactoring

Henry Skoglund henry at tungware.se
Tue May 26 16:27:35 CEST 2020


On 2020-05-26 16:07, Shawn Rutledge wrote:
> When I started working with Qt Quick in 2011, it wasn't too long before I
> began to notice that our vaunted support for multitouch (which still felt
> like an innovative new feature at the time, even though it had already
> been 5 years since the introduction of the iPhone) was quite flawed.
> Everyone was using MouseArea and Flickable, and those were not very good
> companions for the only components that actually supported multitouch:
> MultiPointTouchArea and PinchArea.  At some point someone had made the
> decision that touch is enough like mouse that it would simplify things if
> we just converted touch events to synthetic mouse events and reused the
> same logic for both.  Maybe that was decided before multi-touch screens
> were invented; it would have been ok for the resistive ones from the
> QTopia days, for example.  But I thought that Qt Quick was newer than
> that, so it was an immediate facepalm moment when I realized that mistake
> was perpetuated there.  Finding ways to work around it has dominated a
> lot of my time working on Qt Quick ever since.  But Qt 5 has had such a
> long lifespan, and it wasn't possible to fix it in a fundamental way
> without increasing complexity in ways that you might have noticed.
>
> First, I did some patches to handle actual QTouchEvents in Flickable and
> MouseArea.  This naturally increased the code size quite dramatically,
> because QMouseEvent and QTouchEvent have too little in common, and the
> delivery strategy needs to be different.  QTouchEvents are broken up
> according to item boundaries, so that you can touch multiple Items at the
> same time with different fingers.  This complexity was already there in
> QQuickWindow, but I had to add a lot more code to Flickable and MouseArea.
> It took a really long time to get all the tests to pass again.  Frederik
> was supportive, understood the point of what I was doing, and helped with
> it.  Finally this work was ready to go into 5.5, but then reviewers still
> had too many doubts.  From one point of view, it doesn't make a lot of
> sense to take a component called MouseArea, with all the limitations that
> name implies, and try to make it do the right things with touch events
> too.  (Even though users already expected it to handle taps and drags on
> touchscreens, because the touch->mouse synthesis had always been there.)
> It was a lot of code to read and understand.  So we left it broken.  And
> customers keep asking for those patches, so they still exist on a personal
> branch somewhere, which I haven't updated for quite a while.  To this day,
> if you use any touch-handling Item or pointer handler inside a Flickable,
> the results are often not very satisfying.  If you turn on pressDelay, it
> gets worse.  (Flickable never saw the touch press, only a synth-mouse
> press.  So it cannot replay a touch press either.  So the children will
> see a synthetic mouse press followed by a series of touch events, and will
> be required to see through the synth-mouse crap and treat the whole series
> as if the press had been a touch event too.  But… should filtering events
> that would otherwise go to the children really be Flickable's
> responsibility?  And what about replaying events after a press delay:
> should the monolithic Flickable really do that too?)  And because of
> another architectural abomination, making Item Views inherit Flickable,
> that affects even more use cases with ListView and TableView delegates.
> (I have some hope of eventually rewriting Flickable using Pointer Handlers
> (that's what FakeFlickable.qml demonstrates), but keeping it working the
> same both for subclasses and for end-users is quite a high bar.)
>
> That experience taught me that we can only fix touch and mouse and Wacom
> tablet event delivery by making it the same for all of them.  That means
> we must make the events look enough alike that the same delivery code
> will work for all of them.  It's not possible with the leftover
> QInputEvent hierarchy from Qt 4 and earlier.  There is not even a
> consistently-named set of accessors for getting the coordinates from the
> various event types.
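>
> Just to illustrate the status quo (this is my own sketch, using the Qt 5
> accessors as they exist today): even asking "where did this event happen,
> in local coordinates" is spelled differently for each event type, and a
> touch event does not even have a single answer.
>
>     #include <QMouseEvent>
>     #include <QPointF>
>     #include <QTabletEvent>
>     #include <QTouchEvent>
>
>     // Qt 5: three event types, three ways of asking for the local position
>     static QPointF localPos(const QMouseEvent *e)  { return e->localPos(); }
>     static QPointF localPos(const QTabletEvent *e) { return e->posF(); }
>     static QPointF localPos(const QTouchEvent *e)
>     {
>         // ...and a touch event doesn't even have a single position
>         return e->touchPoints().isEmpty() ? QPointF()
>                                           : e->touchPoints().first().pos();
>     }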
>
> Continuing with the touch->mouse synthesis approach could maybe have been
> justified if we had support for multiple mice in Qt (so that there could
> be a virtual mouse for each touchpoint), and if we could agree that it's
> ok to disassociate touchpoints from each other and deliver them as
> separate events.  I had a series of patches to deliver touch events that
> way.  It worked fine in practice, but for that prototype I had done some
> non-BC modifications in qevent.h (which could have been mitigated with
> differently-designed wrapper events).  But when we discussed it with
> Lars, he was very much against the idea of disassociating touchpoints,
> feeling strongly that points which belong to the same gesture need to be
> kept together.  As I said, touch events get broken up during delivery in
> QQuickWindow; but if PinchHandler for example received multiple events,
> one for each finger involved in the pinch, it would have to update the
> gesture each time.  We could have mitigated that by adding an
> incrementing frame counter so that touchpoints could be re-associated.
>
> But at that time, we concluded that we would go the other way in Qt
> Quick: every "pointer event" will have API appropriate for multiple
> points.  QMouseEvent can have hard-coded accessors for the single point
> that it carries; but touch events carry multiple points.  This is how we
> can eventually refactor the delivery code so that mouse and touch events
> are delivered the same way.  And we agreed to make the Qt Quick events a
> prototype of how we would refactor the QEvents in Qt 6.  So since Qt 5.8,
> Qt Quick has been delivering wrapper events instead: QQuickPointerEvent,
> which contains instances of QQuickEventPoint.  Some of the delivery
> refactoring has been done: conservatively, because although few are
> willing to help, a lot more complain loudly when any existing use case
> breaks.  And there are so many applications already.  But the wrapper
> events made it possible to develop Pointer Handlers; and the goal has
> always been that those would retain QML source compatibility in Qt 6.
> The delivery mechanism for those adds a lot of flexibility.  After enough
> use cases have been ported over to using them, maybe we can eventually
> deprecate some of the most pernicious features that depend on complex
> delivery code that I'd like to get rid of in QQuickWindow; but progress
> has been so slow in other modules outside of Qt Quick itself that it
> still seems too early to consider doing that in Qt 6, because it would
> require heroic effort by a number of people over a short time period.
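>
> To give an idea of where this is heading, device-agnostic handling code
> could end up looking roughly like the sketch below.  The class and
> accessor names (QPointerEvent, QEventPoint, pointCount(), point(i),
> position() and so on) follow the Qt Quick prototype and are not final;
> beginDrag()/updateDrag()/endDrag() are just placeholders for whatever the
> handler actually does.
>
>     // Sketch only: one handler for mouse, touch and tablet alike.
>     void handlePointerEvent(QPointerEvent *event)
>     {
>         for (int i = 0; i < event->pointCount(); ++i) {
>             const QEventPoint &pt = event->point(i);
>             switch (pt.state()) {
>             case QEventPoint::Pressed:  beginDrag(pt.id(), pt.position());  break;
>             case QEventPoint::Updated:  updateDrag(pt.id(), pt.position()); break;
>             case QEventPoint::Released: endDrag(pt.id());                   break;
>             default:                                                        break;
>             }
>         }
>     }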
>
> Anyway, the time has come to at least get the QEvent refactoring done, so
> that Qt Quick can go back to delivering them without wrappers, and so
> that Items can receive pointer events directly.  We have discussed this a
> few times already at various events.  At one QtCS Qt 6 planning session,
> I think in 2018 (or was it earlier?), I promised to do the refactoring
> for Qt 6.  The goal is to break nothing in widgets: that is, QMouseEvent
> needs to keep its existing accessors.  We will add new ones, and
> deprecate the ones that are named inconsistently and the ones that
> provide integer coordinates.  The same goes for the other event types.
> QTouchEvent::TouchPoint is a bit special: it will be replaced by the
> QEventPoint that every QPointerEvent contains at least one of.  So far it
> looks like I can use "using" to make QTouchEvent::TouchPoint an alias of
> QEventPoint, for the sake of source compatibility.
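>
> Roughly like this (a stripped-down sketch, not the actual header):
>
>     class QTouchEvent : public QPointerEvent
>     {
>     public:
>         // Source compatibility: code that names QTouchEvent::TouchPoint
>         // keeps compiling, but the type is really QEventPoint now.
>         using TouchPoint = QEventPoint;
>         // ...
>     };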
>
> I think the result will look something like this (see the attached
> diagram, event-hierarchy-qt6.png):
>
> What I started with a few months ago was adding const QInputDevice
> *device() to QInputEvent.  We have seen that the MouseEventSource enum
> does not provide enough information.  E.g. in Controls we had to assume
> in some places that if a mouse event is SynthesizedByQt, then it's
> synthesized from a touchscreen by QQuickWindow during delivery of the
> original QTouchEvent.  That's not always true (there are other places
> where synthesis occurs), and that resulted in some bugs.  Now that we're
> trying to fully support Wacom tablets in Qt Quick, a synth-mouse event
> could come from that.  So I want to completely replace MouseEventSource
> with the device pointer, so that the event consumer can see specifically
> which device the event came from, and thus can adjust behavior depending
> on the specific type of device, its capabilities, the screen area that
> the device can access (e.g. a touchscreen input device or Wacom tablet is
> probably mapped to a specific QScreen or QWindow), etc.  This way we can
> also begin to support multiple mice and multiple "seats" (in the Wayland
> sense) at the same time.  But it imposes a new requirement on platform
> plugins: to create the QInputDevice instances.  The plugins that I know
> about all do device discovery anyway, though; they have just been using
> internal ad-hoc data structures of their own, and not exposing those
> devices to QWindowSystemInterface.  I've been working on the xcb plugin
> so far, since I understand that one the best, and it already supports
> multiple seats after a fashion (there can be multiple core pointers and
> core keyboards, and they can be associated with each other; there just
> isn't a seat name, but I can make one up).
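>
> As a sketch of what that enables (the device-type enum values and the
> setDragThreshold() helper are illustrative, not final API), an event
> consumer could do something like this:
>
>     // Sketch: tune behavior for the device that actually sent the event.
>     void handlePress(QPointerEvent *event)
>     {
>         switch (event->device()->type()) {
>         case QInputDevice::DeviceType::TouchScreen:
>             setDragThreshold(16);   // fingers are imprecise
>             break;
>         case QInputDevice::DeviceType::Stylus:
>             setDragThreshold(4);    // a pen is precise
>             break;
>         default:
>             setDragThreshold(8);    // mouse and everything else
>             break;
>         }
>     }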
>
> The fantastic result of that should be that event delivery code can
> finally be device-agnostic!  QQuickWindow and QQuickFlickable will no
> longer need to treat mouse and touch events differently, and Wacom tablet
> events will go through the same way too.  Flickable should be able to
> blindly replay a copy of whatever event it got when the pressDelay timer
> fires, without caring about every piece of data inside.  Only the final
> event consumer (QQuickItem or Pointer Handler or even a QWidget subclass)
> will need to care about the device-specific details, and it will have all
> the information necessary for very finely-tuned behavior.  Now we can
> finally add virtual functions to QQuickItem to handle pointer events, so
> that not only Pointer Handlers will be able to do that.  And we will open
> up the possibility of refactoring event delivery code in other parts of
> Qt later on.  It should become possible to fix most of the open bugs
> related to handling mouse and touch events in Qt Quick and Controls
> during the Qt 6 series.
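>
> For instance (purely hypothetical; the name and signature of such a
> virtual are not decided), an Item subclass could end up looking like:
>
>     class MyItem : public QQuickItem
>     {
>     protected:
>         // hypothetical virtual: one entry point instead of separate
>         // mousePressEvent(), touchEvent() and tabletEvent() overrides
>         void pointerEvent(QPointerEvent *event) /* override, once it exists */
>         {
>             // device-agnostic handling goes here
>             event->accept();
>         }
>     };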
>
> Because we will make QPointerEvent look as much as possible like
> QQuickPointerEvent, we will maintain QML source compatibility for anyone
> who just started using pointer handlers.  Of course the goal for older
> stuff like MouseArea and Flickable and Controls is that we can choose
> appropriate API changes, rather than being forced into them by the event
> changes.  QPointerEvent will be a gadget instead of a QObject wrapper,
> but it will have the same properties, so to the extent that it's exposed
> in QML API (which is not much), it will look the same.  It will also look
> enough like a QMouseEvent and enough like a QTouchEvent that we will have
> source compatibility in virtual functions that handle those, too.
> Hopefully.
>
> After proving that we have also maintained source compatibility as much
> as possible (including in the great heap of widget code in the world), we
> still end up with a lot of deprecated methods that should be replaced
> over time (QPoint pos() -> QPointF position() and such).  For that
> purpose I want to add a feature to clazy or develop some other
> clang-based tooling which can fix all of our modules, and also be
> available to customers to make the same replacements in their code bases.
> If we end up with any SC breaks, it's a possible fallback position that
> at least we can deliver a tool to fix them.
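>
> To make that concrete, this is the kind of purely mechanical rewrite I
> have in mind (handlePress() is only there to make the snippet complete):
>
>     #include <QMouseEvent>
>     #include <QWidget>
>
>     class Widget : public QWidget
>     {
>     protected:
>         void mousePressEvent(QMouseEvent *event) override
>         {
>             // Qt 5 spelling, deprecated:  QPoint p = event->pos();
>             // what the tool would rewrite it to:
>             QPointF p = event->position();
>             handlePress(p.toPoint());   // callers that really want ints convert explicitly
>         }
>         void handlePress(const QPoint &) {}   // illustration only
>     };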
>
> Beyond that, we probably ought to do something about native gestures.
> Another reason Qt Quick is so complex is that it started out assuming it
> would be given only raw touch events and would need to do gesture
> recognition itself; but now gestures have gone mainstream on most
> platforms, so we could be getting QNativeGestureEvents from most of them,
> especially from touchpads.  But I didn't get as far as I should have over
> the last couple of years just exploring how to improve that aspect of Qt,
> and the platform plugin maintainers have not gotten around to adding
> support for native gestures to all the platforms that could have them,
> either.  I wish we could get rid of the gesture recognizer in the widgets
> module completely; but customers will not be satisfied unless we then
> have native gesture recognition on all platforms where it's possible.
> Some of them want to continue doing custom gesture recognition, too.  But
> we have QGestureEvent (for the widget gesture recognizer) and
> QNativeGestureEvent (for gesture events that come from the QPA plugin) as
> separate types.  We're running out of time to figure out whether we can
> unify those two, just to make it less confusing for applications.  I
> guess it's not so terrible to keep the events separate if we still have
> to keep the old gesture framework intact; but do we?
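>
> For reference, this is my own minimal illustration of the two parallel
> routes a pinch/zoom can take to a widget today (the applyZoom* helpers
> stand in for real zoom logic):
>
>     #include <QGestureEvent>
>     #include <QNativeGestureEvent>
>     #include <QPinchGesture>
>     #include <QWidget>
>
>     class ZoomableWidget : public QWidget
>     {
>     protected:
>         bool event(QEvent *ev) override
>         {
>             // route 1: QNativeGestureEvent, straight from the QPA plugin
>             if (ev->type() == QEvent::NativeGesture) {
>                 auto *ng = static_cast<QNativeGestureEvent *>(ev);
>                 if (ng->gestureType() == Qt::ZoomNativeGesture) {
>                     applyZoomDelta(ng->value());   // incremental zoom delta
>                     return true;
>                 }
>             }
>             // route 2: QGestureEvent, from the widget gesture recognizer
>             // (only delivered after calling grabGesture(Qt::PinchGesture))
>             if (ev->type() == QEvent::Gesture) {
>                 auto *ge = static_cast<QGestureEvent *>(ev);
>                 if (auto *pinch = static_cast<QPinchGesture *>(ge->gesture(Qt::PinchGesture))) {
>                     applyZoomTotal(pinch->totalScaleFactor());
>                     return true;
>                 }
>             }
>             return QWidget::event(ev);
>         }
>     private:
>         void applyZoomDelta(qreal) {}   // placeholders for real zoom logic
>         void applyZoomTotal(qreal) {}
>     };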
>
> So that's the status of pointer event handling.  And I'm still working
> mostly alone on it.  It seems with our schedule that the QEvent API, and
> all other APIs that need to change as a result of that, need to be
> stabilized by the end of June.  I still think I can get the broad strokes
> done if nobody, and no unexpected bugs, get in the way this time, and the
> +2's come quickly (keeping in mind that the perfect is the enemy of the
> good, and every change is subject to ongoing incremental changes later).
> (It looks like my time for making other API changes in Qt Quick will be
> limited, since this is taking most of it.)  Is anyone interested in
> helping?
>
Hi, I tried some QGestureEvents this winter in my first Qt iOS app, and
got tired of having to use 3 fingers on my iPhone for swiping (1 finger
seems to be the norm on Android and iPhones).  And I discovered that it
was pretty easy to integrate the standard iOS 1-finger swiping with Qt,
see https://bugreports.qt.io/browse/QTBUG-81042

If you're also thinking about refactoring for iOS, maybe my suggestion
can help...

Rgrds Henry

-------------- next part --------------
A non-text attachment was scrubbed...
Name: event-hierarchy-qt6.png
Type: image/png
Size: 167066 bytes
Desc: not available
URL: <http://lists.qt-project.org/pipermail/development/attachments/20200526/12fb6caf/attachment-0001.png>

