[Interest] Awkward touch behaviour on Windows with MouseArea and MultiPointTouchArea

Nuno Santos nunosantos at imaginando.pt
Tue Nov 13 17:21:19 CET 2018


Hi,

After spending many years developing touch applications for iOS and Android, I have recently acquired a touchscreen for Windows touch development.

I’m writing to this list because I’m facing an awkward issue with MouseArea and MultiPointTouchArea.

Let’s suppose I have the following qml code:

MouseArea {
	anchors.fill: parent
	onPressedChanged: console.log(pressed)
}

MultiPointTouchArea {
	anchors.fill: parent
	onPressed: console.log("pressed");
	onReleased: console.log("released");
}


What is happening is that, with either of these pieces of code alone, I get the same result when I touch the screen: the output only appears after the finger is released.
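
For reference, here is a self-contained version of what I’m testing. The import versions are simply what I’d expect a 5.12 kit to provide, so adjust them if needed; the timestamps just make the delay visible.

import QtQuick 2.12
import QtQuick.Window 2.12

Window {
	width: 640
	height: 480
	visible: true

	MultiPointTouchArea {
		anchors.fill: parent
		// Both lines end up printed together, only when the finger is lifted.
		onPressed: console.log("pressed at", Date.now())
		onReleased: console.log("released at", Date.now())
	}
}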

At first I looked at the Windows touch input settings and found the "press and hold to right-click" option, which was enabled. But even after disabling it, the issue remains.

I was expecting qml: true / qml: pressed when the finger touches the screen and qml: false / qml: released when the finger is lifted from the screen, but that is not happening.

What am I missing?

I’m on Windows 10 with the Qt 5.12 beta 3 MSVC2017 kit.

Thanks in advance!

Best regards,

Nuno



