[Development] [QtQuick] Mouse/Touch handling from Qt CS

Rutledge Shawn Shawn.Rutledge at digia.com
Thu Jul 25 10:14:25 CEST 2013


On 24 Jul 2013, at 5:45 PM, Alan Alpert wrote:

> Qt CS had a discussion on how to handle mouse and touch event
> conflicts in QtQuick going forwards.
> 
> An interesting idea came up that the existing model might be
> salvageable if we fix multi-cursor support.

I suggested that idea two years ago, but I was told authoritatively at the time that Qt will realistically never support multiple cursors, because the assumption of a single cursor, a single grab, a single focus and so on is so pervasive in Qt.

Another issue is that doing a pinch gesture with two mice, or with two hands on a touch screen, is not the common use case, and multi-user applications will need to distinguish the users.  The ideal touch hardware would be able to treat separate hands as separate users, so that the software could handle multiple independent gestures at the same time.  It is therefore an oversimplification to say that one mouse cursor is the same thing as one touch point, although we might get away with that in single-user applications.
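
To make that concrete, here is a rough sketch of what per-user delivery could look like.  It is not real Qt API: QTouchEvent::TouchPoint has no user or hand id, so the userId field below is purely hypothetical.

#include <QTouchEvent>
#include <QHash>
#include <QList>

// Hypothetical: suppose each touch point also carried an id for the
// user (or hand) that produced it.  This struct just illustrates the
// idea; the real TouchPoint has no such field.
struct AttributedPoint {
    int userId;                         // hypothetical user/hand id
    QTouchEvent::TouchPoint point;      // the ordinary Qt touch point
};

// Partition the points so that each user's pinch can be recognized
// independently, instead of feeding every point to one recognizer.
QHash<int, QList<QTouchEvent::TouchPoint> >
groupByUser(const QList<AttributedPoint> &points)
{
    QHash<int, QList<QTouchEvent::TouchPoint> > groups;
    foreach (const AttributedPoint &p, points)
        groups[p.userId].append(p.point);
    return groups;
}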

I also don't like the fact that events take such a circuitous path: the OS event is morphed into a QPA event and queued; then it's morphed into a classic Qt event and queued again; then it's morphed into a Qt Quick event, and the hierarchy traversal finally begins to find the right item in the scene.  If it's a touch event, then for each item in the scene it might have to be converted into an item-specific synthetic mouse event (that part is my doing, though).  It's amazing that we get real-time dragging performance at all with so much overhead.  (And Graphics View has a similar parallel event path of its own.)

Given that every OS already has an event queue, I wonder why we need multiple queues of our own: why don't we just ask QPA to fetch the next event from the OS queue and convert it just once, into a single cross-platform event type?  I suppose it's mainly so that Qt can inject its own events into the stream, although a timeline of "our stuff" interleaved with "their stuff" could have been implemented without requiring a common event type.

And why does QtQuick really need its own separate event types?  Is it just that binary compatibility guarantees keep us from changing the old event types?  Or the risk that, if we got it wrong, all the widgets would need work?  We could have planned to change this in time for 5.0, to have one event type for every layer.  That chance is past, but we could still plan to reunify the events for Qt 6.
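
For reference, a heavily simplified sketch of the first hop in that chain, using the real QPA entry point; everything after step 1 is internal to QtGui and QtQuick, so the remaining steps are only described in comments.

#include <QWindow>
#include <qpa/qwindowsysteminterface.h>

// 1. The platform plugin turns the native OS event into a QPA event
//    and enqueues it:
static void onNativeMouseMove(QWindow *window,
                              const QPointF &local, const QPointF &global)
{
    QWindowSystemInterface::handleMouseEvent(window, local, global,
                                             Qt::LeftButton);
}

// 2. QGuiApplication later drains that queue, constructs a classic
//    QMouseEvent, and delivers it to the QWindow.
// 3. QQuickWindow translates it once more into the Qt Quick delivery
//    pass that walks the item hierarchy; for touch, an item-specific
//    synthetic mouse event may be created on top of that.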

If not, it would still be nice if QtQuick could short-circuit some of the event delivery path.  I've been wondering whether it could interface directly to QPA.  That would probably have profound effects on our chances of properly nesting QtQuick and widgets in the same application (which is problematic anyhow, because OpenGL rendering always requires its own window).  But it would also provide a chance to handle multiple cursors in QtQuick alone, without making incompatible changes in all the other layers.
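
Purely as a thought experiment, the short circuit might look like a hook that lets the scene claim events straight off the QPA queue, before the QGuiApplication delivery.  Nothing like this exists in Qt today; the class and method below are invented for illustration.

#include <QEvent>

// Hypothetical interface; not part of Qt.
class QPADirectEventSink
{
public:
    virtual ~QPADirectEventSink() {}

    // Called once per cross-platform input event, straight off the QPA
    // queue.  Returning true means the scene consumed the event and the
    // normal QGuiApplication (and widget) delivery path is skipped.
    virtual bool deliverDirect(QEvent *event) = 0;
};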

Ironically, it was pointed out again after Doug Engelbart's recent passing that from the beginning he assumed a multi-user, multi-tasking computer would of course require multiple mice, so that each user could interact with objects on the screen independently.  Forty-five years later, we still haven't gotten around to treating that as the mainstream use case.  It could be one of the last remaining reasons to use a desktop or large-screen computer.



