[Development] [QtQuick] Mouse/Touch handling from Qt CS

Alan Alpert 416365416c at gmail.com
Thu Jul 25 18:06:47 CEST 2013


On Thu, Jul 25, 2013 at 1:14 AM, Rutledge Shawn
<Shawn.Rutledge at digia.com> wrote:
>
> On 24 Jul 2013, at 5:45 PM, Alan Alpert wrote:
>
>> Qt CS had a discussion on how to handle mouse and touch event
>> conflicts in QtQuick going forwards.
>>
>> An interesting idea came up that the existing model might be
>> salvageable if we fix multi-cursor support.
>
> I suggested that idea 2 years ago, but I was told authoritatively at that time that realistically, Qt will never support multiple cursors, because the idea of a single cursor, single grab, single focus, etc. is so pervasive in Qt.

Two years ago, was capacitive multi-touch an essential feature for
UIs? Or was it still just emerging as a dominant UI paradigm?

It will be easier to fix in QtQuick than in Widgets. Hopefully.
Realistically, we probably don't have the capacity to fix it in
widgets anyway. If it is indeed too hard to fix, then we'll just
implement mouse and touch handling for now and continue researching
the ultimate solution for Qt 6 (or QtQuick 3).

> Another thing is that doing a pinch gesture with 2 mice, or 2 hands on a touch screen, is not the common use case, and multi-user applications will need to distinguish the users.  The ideal touch hardware would be able to treat separate hands as separate users, so that the software could handle multiple separate gestures at the same time.  Therefore it's an oversimplification to say that one mouse cursor is the same thing as one touch point, although we might get away with it in single-user applications.

Single-user applications are the dominant form right now; I'm okay
with focusing on them.
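As a thought experiment on the "separate hands as separate users" idea: if the hardware only reports raw touch points, the software would have to guess at hand grouping, e.g. by clustering points that are close enough to belong to one hand. This is a hypothetical heuristic sketch, not anything Qt provides; the function name and the span threshold are made up for illustration.

```python
import math

def group_touch_points(points, max_hand_span=120.0):
    """Greedily cluster touch points whose pairwise distances are within
    max_hand_span (pixels), treating each cluster as one hand/user.
    Purely illustrative -- doing this reliably would need hardware or
    driver support, as the quoted text points out."""
    clusters = []
    for p in points:
        for cluster in clusters:
            if all(math.dist(p, q) <= max_hand_span for q in cluster):
                cluster.append(p)
                break
        else:
            clusters.append([p])
    return clusters

# Two fingers close together (one hand) plus one far-away finger
# (plausibly another hand) yield two clusters:
hands = group_touch_points([(100, 100), (150, 120), (800, 400)])
```

Proximity alone obviously misidentifies two users working side by side, which is exactly why the quoted text calls per-hand identification a hardware problem.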

> I also don't like the fact that events take such a circuitous path: the OS event is morphed into a QPA event and queued; then it's morphed into a classic Qt event and queued again; then it's morphed into a Qt Quick event and the hierarchy traversal finally begins, to find the right item in the scene.  If it's a touch event, then for each item in the scene, it might have to get converted to an item-specific synthetic mouse event (that part is my doing though).  It's amazing that we can get real-time dragging performance at all, with so much overhead.  (And graphics view has a similar parallel event path of its own.)
>
> Given that every OS has an event queue already, I wonder why we even need multiple queues of our own; why don't we just ask QPA to fetch the next event from the OS queue, and convert it just once into a single cross-platform event type?  I suppose mainly it's so that Qt can inject its own events into the stream, although a timeline of "our stuff" interleaved with "their stuff" could have been implemented without requiring a common event type.
>
> And why does QtQuick really need its own separate event types?  Just because of binary compatibility guarantees, we can't change the old event types?  Or because of the chance that if we did it wrong, all the widgets would need work?  But we could have planned to change this in time for 5.0, to have one event type for every layer.  Now that is past, but we could still plan to reunify the events for Qt 6.

Sounds good. Add a TODO Qt 6 somewhere.
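To make the overhead concrete, the path described above can be caricatured as a chain of per-stage conversions, each copying the same payload into yet another representation. This is a toy model only; none of these class names are real Qt types, and the real stages also involve queueing and scene traversal.

```python
from dataclasses import dataclass

# Illustrative stand-ins for the stages named in the quoted text:
# OS event -> QPA event -> classic Qt event -> Qt Quick event.

@dataclass
class OSTouchEvent:
    x: float
    y: float
    pressed: bool

@dataclass
class QPAEvent:
    kind: str
    x: float
    y: float
    pressed: bool

@dataclass
class QtEvent:
    kind: str
    pos: tuple
    pressed: bool

@dataclass
class QuickEvent:
    kind: str
    scene_pos: tuple
    pressed: bool

def deliver(os_event):
    # Each hop re-packages the same data; in the real pipeline some hops
    # also queue the event before the next conversion runs.
    qpa = QPAEvent("touch", os_event.x, os_event.y, os_event.pressed)
    qt = QtEvent(qpa.kind, (qpa.x, qpa.y), qpa.pressed)
    quick = QuickEvent(qt.kind, qt.pos, qt.pressed)
    # On top of this, a per-item synthetic mouse event may still be derived.
    return quick

evt = deliver(OSTouchEvent(10.0, 20.0, True))
```

The "convert just once" proposal amounts to collapsing `deliver` into a single OS-to-cross-platform step shared by every layer.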

> If not, it would still be nice if QtQuick could short-circuit some of the event delivery path.  I've been wondering if it could interface directly to QPA.  Probably it would have profound effects on our chances of properly nesting QtQuick and widgets in the same application.  (Which is problematic anyhow, due to the fact that OpenGL rendering always requires its own window.)  But it would also provide a chance to handle multiple cursors in QtQuick only, without making incompatible changes in all the other layers.

I'd have to see a prototype to understand how that would help...

> Ironically it was pointed out again after Doug Engelbart's recent passing that from the beginning, he assumed that a multi-user multi-tasking computer would of course require multiple mice, so that each user can interact with objects on the screen independently.  45 years later, we still haven't gotten around to treating that as the mainstream use case.  It could be one of the last reasons left to use a desktop or large-screen computer.

Most people seem to have discarded the idea as impractical roughly 45
years ago. Without a screen the size of a house (which we are only just
now getting with 4K projectors), users get in each other's way, and
fingers don't even have the physical bulk to help prevent that. But my
point is that it's more important for Qt to support the interaction
paradigms that are established in the market, like the single cursor and
capacitive multi-touch, than speculative or unpopular models. Those
are just a bonus.

--
Alan Alpert
