[Development] need to handle touch events rather than depending on mouse event synthesis

Shawn Rutledge shawn.rutledge at nokia.com
Wed Feb 29 18:31:36 CET 2012


On Wednesday, February 29, 2012 12:08:19 PM ext Atlant Schmidt wrote:
> Shawn:
> 
>   This sounds like the roots of a strong proposal -- carry on!
> 
>   Two thoughts:
> 
>     o Please be sure your "pointing device" proposal
>       can be generalized beyond "mice and touchscreens".
>       There are certainly other pointing devices already
>       in the world (who remembers Light Pens, Joysticks,
>       and "Dial Boxes"? ;-) ) and others that will become
>       fully practical soon (eye tracking, where your
>       point-of-regard acts as the pointing device).
> 
>       We should make sure the new approach is "future
>       proof" to the maximal possible degree.

I agree, but the idea of the PointingArea is more of a lowest-common-
denominator handler for the things that mice and touchscreens can both do, 
and hopefully unknown future devices as well.  Joysticks and light pens are 
similar enough too, in that both have at least two axes and one primary 
button.  You should be able to click the same Button component with any of 
these devices, as long as there is a driver which can generate an existing 
Qt event type (whether such drivers exist is questionable, but writing one 
is a small thing to do if the need arises).
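
To make the idea concrete, here is a rough QML sketch of how such an 
element might be used.  PointingArea is only a proposal at this point, so 
the element name and its signals are assumptions, not an existing API:

    import QtQuick 2.0

    Rectangle {
        id: box
        width: 200; height: 200; color: "white"

        // Hypothetical element: PointingArea does not exist yet; the
        // name and the pressed/released signals are assumed from the
        // proposal in this thread.
        PointingArea {
            anchors.fill: parent
            onPressed: box.color = "lightsteelblue"  // any device's primary button
            onReleased: box.color = "white"
        }
    }

The point is that the same handler reacts whether the press comes from a 
mouse button, a finger, or some future device's primary trigger.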

>     o Please be sure that pinch/unpinch gestures fit
>       within the overall strategy.

That's covered by the PinchArea.  As far as I know, there shouldn't be a 
problem with stacking multiple Areas on the same item to handle multiple 
types of events, in case you want the item to be interactive via mouse as 
well.
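
For example, something like this should already work with the existing 
PinchArea and MouseArea elements in QtQuick 1.1 (the image source is just 
a placeholder):

    import QtQuick 1.1

    Image {
        id: photo
        source: "photo.png"  // placeholder

        // PinchArea handles the two-finger pinch gesture; with
        // pinch.target set, it scales the item automatically.
        PinchArea {
            anchors.fill: parent
            pinch.target: photo
            pinch.minimumScale: 0.5
            pinch.maximumScale: 4.0
        }

        // The stacked MouseArea still receives single-point events,
        // whether they come from a real mouse or are synthesized.
        MouseArea {
            anchors.fill: parent
            onDoubleClicked: photo.scale = 1.0  // reset zoom
        }
    }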

Thanks for the feedback.
-- 
MVH, Shawn Rutledge ❖ "ecloud" on IRC


