[Development] resolution independence (was Re: Retina display support)

Shawn Rutledge shawn.t.rutledge at gmail.com
Wed Oct 3 14:24:31 CEST 2012


On 3 October 2012 12:12, Samuel Rødal <samuel.rodal at digia.com> wrote:
> On 10/02/2012 01:58 PM, Atlant Schmidt wrote:
>> Samuel:
>>
>>> Specifying that all characters must be 5mm in physical height is
>>> typically not what a UI wants in any case, since different types of
>>> displays (mobiles / tablets, computer monitors, and TVs) are typically
>>> viewed at different distances.
>>
>>    In my world (medical devices), it's entirely *TYPICAL*
>>    that there will be specifications on the size of the
>>    text. Realistically, it may be a spec on the minimum
>>    size of the text (so that it is readable by the elderly)
>>    but we can't go bigger either because the text won't
>>    fit.
>>
>>    And yes, our interface is a touch screen so buttons
>>    must also be specified in real physical sizes.
>>
>>    In other words, we need a solution that lets us specify
>>    the actual rendered size of the text on the screen.
>>
>>                                       Atlant
>
> Sure, that assumes that the distance the application will be viewed at
> is constant though, and makes it hard to port to another display type
> unless you made the font size configurable in a different way. In some
> cases that is acceptable since the application is not meant to be portable.
>
> Since logical DPI is how Windows and Linux at least (not sure about
> Mac?) let you control font scaling in a global way, I'd say that Qt's
> suggestion for scalable UIs should be to use point sizes and not
> physical size or pixels to specify the size of fonts. The logical DPI
> can then be set based on viewing distance, resolution, and user preference.

Actually, though, a point is defined as 1/72 of an inch.  The unit
comes from the printing trade and predates digital technology.

Since logical DPI is often overridden in OS settings, we end up with
logical pixels that differ from actual screen pixels.  Users have
gotten into the habit of specifying fonts in points, but that too is
a leftover from the pre-digital printing days (a "font" being
something physical: a set of cast type, Linotype character molds, or
films for a phototypesetter).  We seem to be redefining points to be
virtualized as well (1 point = 1 logical pixel is what you are
proposing, right?), so that they scale the same way the virtual
pixels scale, but I don't believe users are accustomed to thinking of
widget dimensions in points.  And 1 point is not 1 pixel if your
logical DPI is actually 96 rather than 72, or is it?  When you draw a
line with a width of 1 pixel, do you expect 1 logical pixel (which is
much more than 1 device pixel on a HiDPI screen) or 1 actual pixel?
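
To make the arithmetic concrete, here is a minimal standalone sketch
(plain C++, not Qt API; the DPI and scale numbers are just assumed
examples) of how a point size would turn into logical and then device
pixels under that interpretation:

  #include <cstdio>

  int main()
  {
      const double pointSize  = 12.0;  // the "font size" a user picks
      const double logicalDpi = 96.0;  // typical Windows/Linux default
      const double hidpiScale = 2.0;   // 2 device pixels per logical pixel

      // 1 point = 1/72 inch, so at 96 logical DPI one point covers
      // 96/72 logical pixels.
      double logicalPixels = pointSize * logicalDpi / 72.0;
      double devicePixels  = logicalPixels * hidpiScale;

      std::printf("%.1fpt -> %.1f logical px -> %.1f device px\n",
                  pointSize, logicalPixels, devicePixels);
      return 0;
  }

At 96 logical DPI a 12-point font comes out at 16 logical pixels, so
1 point is clearly not 1 pixel there.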

It would be much less confusing if the whole computing industry gave
up on obsolete units and moved to metric units for everything.  But
you do have a point that in practice there often is some scaling, so
that fonts of a given "point size" tend to end up smaller on mobile
devices.  And for touchscreen controls the actual physical size of
the control matters, because people's fingers aren't getting any
smaller.  Furthermore, mobile devices are proliferating faster than
PCs.  So maybe in the future, UIs should be designed in real-world
units and then scaled up when displayed on larger screens that are
viewed from a distance.  There could be a global scale factor so that
1 mm becomes 1.7 mm or whatever the user wants.  It would be less
confusing for users to set a zoom factor than to have to understand
what DPI means: 1.0 would mean the exact size as designed, and larger
numbers would make everything uniformly larger.  Basically we are
already interpreting logical DPI that way, but given an OS that
allows me to override logical DPI, at what setting am I guaranteed
that a 72-point font will be exactly 1 inch tall?  That should be
independent of which model of monitor is attached, as long as the
monitor accurately reports its real size, right?
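
As a sketch of that zoom-factor idea (mmToPixels is a hypothetical
helper and the DPI value is just an example; nothing here is existing
Qt API):

  #include <cstdio>

  // Design in millimetres; the user's zoom factor scales everything
  // uniformly.  A zoom of 1.0 means the exact size as designed.
  double mmToPixels(double mm, double physicalDpi, double userZoom)
  {
      return mm / 25.4 * physicalDpi * userZoom;  // 1 inch = 25.4 mm
  }

  int main()
  {
      // A 10 mm touch button on a 326 DPI screen, zoomed to 1.7x:
      std::printf("%.0f px\n", mmToPixels(10.0, 326.0, 1.7));
      return 0;
  }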

I've been working on this patch to make the QScreen properties more
complete and to remove the hard-coded 72 DPI:
https://codereview.qt-project.org/#change,36121
I was surprised to find that if I ask OS X for the logical DPI, it
really does still report 72 DPI, unlike the other platforms.
Doesn't that mean that on the same monitor, the same Qt UI will
appear larger on a Mac than on Windows or Linux?

But at least with this patch it becomes possible to get the actual
screen size and the actual DPI, so designing UIs in real-world units
will be feasible whenever we get around to it.
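
For example, something along these lines should work once the patch
is in (a sketch using the QScreen property getters as they exist in
Qt 5; how accurate the values are still depends on what the platform
reports):

  #include <QGuiApplication>
  #include <QScreen>
  #include <QDebug>

  int main(int argc, char *argv[])
  {
      QGuiApplication app(argc, argv);
      QScreen *screen = app.primaryScreen();

      // Physical size in millimetres, as reported by the display.
      qDebug() << "physical size (mm):" << screen->physicalSize();
      // Actual pixels per inch, derived from the physical size.
      qDebug() << "physical DPI:" << screen->physicalDotsPerInch();
      // The virtualized DPI the OS uses for font scaling.
      qDebug() << "logical DPI:" << screen->logicalDotsPerInch();
      return 0;
  }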


