[Accessibility] Approaches to get screen reader to read text

Marc Sabatella marc at outsideshore.com
Wed Feb 19 22:44:49 CET 2020


The TLDR: how do I get Qt to generate a LiveRegionChanged event for a
desktop application that has nothing to do with HTML or web browsers?

The rest of the story:

Recently I started a discussion on our efforts to get our application
(MuseScore, an open source music notation program) to read information via
screen readers.  I got some good advice, and managed to follow through to
the point of getting MuseScore reading with JAWS and Orca in addition to
NVDA.  But we are still unable to get Narrator or VoiceOver to read, and I
feel I have pretty much exhausted the possibilities of the path I had been
on, so maybe it's time to try a different approach instead of just tweaking
the current one.  And we now have a new incentive: program "X" (one of the
biggest commercial alternatives to MuseScore) recently got their
application working with screen readers, including Narrator for sure and I
believe VoiceOver as well.  I know they use Qt, so it's got to be possible.
But they aren't open source, so I'm trying to reverse engineer what they
did.

Here is the thread from last month:
https://lists.qt-project.org/pipermail/accessibility/2020-January/000103.html
Further discussion took place on the Orca list and also offline.

To summarize, the basic issue is that we have a custom widget representing
the music score, and we have implemented our own keyboard commands to
navigate it as well as perform various editing operations.  At the end of
each command, we want the screen reader to read a line of information that
is a voice-optimized version of the info in the status bar.  So far we've
done that by implementing accessibility for our widget as recommended in
the Qt documentation: using a role of StaticText, implementing the
ValueInterface, and having text() return the information we want as both
the Value and the Description (the Name is the name of the score).  We are
able to get NVDA, JAWS, and Orca to read this, but Narrator and VoiceOver
do not - at least, not automatically.  If you explicitly give the Narrator
command to read the current object, it will, but that's not good enough.
Unfortunately, no amount of fiddling with different roles and different
events seemed to help.
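For concreteness, here is a minimal sketch of the setup described above.
All names (ScoreAccessible, scoreName(), screenReaderInfo(), etc.) are
hypothetical stand-ins for illustration, not our actual code:

```cpp
// Sketch of the current approach: a QAccessibleWidget subclass with
// role StaticText that exposes the voice-optimized status line as
// both the Description and, via QAccessibleValueInterface, the Value.
#include <QAccessible>
#include <QAccessibleWidget>
#include <QWidget>

class ScoreAccessible : public QAccessibleWidget,
                        public QAccessibleValueInterface
{
public:
    explicit ScoreAccessible(QWidget *w)
        : QAccessibleWidget(w, QAccessible::StaticText) {}

    QString text(QAccessible::Text t) const override
    {
        switch (t) {
        case QAccessible::Name:
            return scoreName();          // the name of the score
        case QAccessible::Description:
            return screenReaderInfo();   // voice-optimized status line
        default:
            return QAccessibleWidget::text(t);
        }
    }

    // QAccessibleValueInterface: expose the same line as the Value
    QVariant currentValue() const override { return screenReaderInfo(); }
    void setCurrentValue(const QVariant &) override {}
    QVariant maximumValue() const override { return QVariant(); }
    QVariant minimumValue() const override { return QVariant(); }
    QVariant minimumStepSize() const override { return QVariant(); }

    void *interface_cast(QAccessible::InterfaceType t) override
    {
        if (t == QAccessible::ValueInterface)
            return static_cast<QAccessibleValueInterface *>(this);
        return QAccessibleWidget::interface_cast(t);
    }

private:
    QString scoreName() const;          // hypothetical helpers
    QString screenReaderInfo() const;
};

// At the end of each command, notify assistive technologies that the
// value changed, in the hope that the screen reader speaks it:
void announceStatus(QWidget *scoreView, const QString &info)
{
    QAccessibleValueChangeEvent ev(scoreView, info);
    QAccessible::updateAccessibility(&ev);
}
```

This is enough for NVDA, JAWS, and Orca, but as noted, Narrator and
VoiceOver stay silent unless explicitly asked to read the object.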

I've been using Accessibility Insights in my work so far, and after seeing
that program "X" is able to do this, I ran Accessibility Insights on
program "X" (it doesn't run on Linux, so I can't use Accerciser).  I'm
getting confusing results, though.  At first when I ran it, I saw almost no
objects in the tree.  When I listened for events, I saw only some
"structure changed" events ("children invalidated"), which made me think
they were creating and destroying helper objects to force the reading -
something I've considered but so far rejected, and wasn't sure how to
implement anyhow.  But now, trying Accessibility Insights on program "X"
again, I'm seeing a more normal-looking object tree.  They appear to be
using a widget with the role set to "Document", and on each navigation they
are generating LiveRegionChanged events.  ARIA live regions are something I
definitely wanted to experiment with, but I thought they were only for the
web.  I guess not.

So, how do I generate LiveRegionChanged in a Qt app?  Searching for info
yields a few interesting hits, such as
https://blogs.msdn.microsoft.com/winuiautomation/2017/11/08/can-your-desktop-app-leverage-the-new-uia-notification-event-in-order-to-have-narrator-say-exactly-what-your-customers-need/
but not really a clear answer.
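One possibility I'm considering, in case Qt doesn't expose this at all: on
Windows, bypass Qt and raise the UIA events directly against the widget's
native HWND.  A hedged sketch of what I have in mind - the helper names
and the activity id string are made up, and I haven't verified that a
plain host provider carries enough LiveSetting information for Narrator:

```cpp
// Sketch: raising UIA events natively on Windows. Requires linking
// against UIAutomationCore; hwnd would come from QWidget::winId().
#include <windows.h>
#include <uiautomation.h>

// Option 1: the LiveRegionChanged event itself. For this to be spoken,
// the element presumably also needs its LiveSetting property set
// (Polite/Assertive), which a bare HWND host provider may not supply.
void raiseLiveRegionChanged(HWND hwnd)
{
    IRawElementProviderSimple *provider = nullptr;
    if (SUCCEEDED(UiaHostProviderFromHwnd(hwnd, &provider)) && provider) {
        UiaRaiseAutomationEvent(provider, UIA_LiveRegionChangedEventId);
        provider->Release();
    }
}

// Option 2: the newer notification event from the MSDN post above,
// which carries the string to speak directly (Windows 10 1709+).
void announce(HWND hwnd, const wchar_t *msg)
{
    IRawElementProviderSimple *provider = nullptr;
    if (SUCCEEDED(UiaHostProviderFromHwnd(hwnd, &provider)) && provider) {
        BSTR text = SysAllocString(msg);
        BSTR activityId = SysAllocString(L"ScoreStatus"); // arbitrary id
        UiaRaiseNotificationEvent(provider,
                                  NotificationKind_ActionCompleted,
                                  NotificationProcessing_MostRecent,
                                  text, activityId);
        SysFreeString(activityId);
        SysFreeString(text);
        provider->Release();
    }
}
```

Of course this does nothing for VoiceOver on macOS, so a cross-platform
answer inside Qt itself would still be preferable.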

Or, does any of this suggest any other approach?

-- 
Marc Sabatella
marc at outsideshore.com

