[Development] resolution independence (was Re: Retina display support)
Samuel Rødal
samuel.rodal at digia.com
Wed Oct 3 15:02:12 CEST 2012
On 10/03/2012 02:24 PM, Shawn Rutledge wrote:
> On 3 October 2012 12:12, Samuel Rødal <samuel.rodal at digia.com> wrote:
>> On 10/02/2012 01:58 PM, Atlant Schmidt wrote:
>>> Samuel:
>>>
>>>> Specifying that all characters must be 5mm in physical height is
>>>> typically not what a UI wants in any case, since different types of
>>>> displays (mobiles / tablets, computer monitors, and TVs) are typically
>>>> viewed at different distances.
>>>
>>> In my world (medical devices), it's entirely *TYPICAL*
>>> that there will be specifications on the size of the
>>> text. Realistically, it may be a spec on the minimum
>>> size of the text (so that it is readable by the elderly)
>>> but we can't go bigger either because the text won't
>>> fit.
>>>
>>> And yes, our interface is a touch screen so buttons
>>> must also be specified in real physical sizes.
>>>
>>> In other words, we need a solution that lets us specify
>>> the actual rendered size of the text on the screen.
>>>
>>> Atlant
>>
>> Sure, though that assumes the viewing distance is constant, and it
>> makes the UI hard to port to another display type unless you made
>> the font size configurable in a different way. In some cases that is
>> acceptable, since the application is not meant to be portable.
>>
>> Since logical DPI is how Windows and Linux at least (not sure about
>> Mac?) let you control font scaling in a global way, I'd say that Qt's
>> suggestion for scalable UIs should be to use point sizes and not
>> physical size or pixels to specify the size of fonts. The logical DPI
>> can then be set based on viewing distance, resolution, and user preference.
>
> Actually, though, a point is defined as 1/72 of an inch. That unit
> comes from the printing field and predates digital technology.
>
> Since logical DPI is often overridden in OS settings, we end up with
> logical pixels that differ from actual screen pixels. Users have
> gotten in the habit of specifying fonts in points, but that's also a
> leftover habit from the pre-digital printing days (a "font" being
> something physical like a set of cast type, or linotype character
> molds, or films for a phototypesetter). We seem to be redefining
> points to be virtualized too (1 point = 1 logical pixel is what you
> are proposing, right?), so that they scale the same way the virtual
> pixels scale, but I don't believe that users are accustomed to
> thinking that widget dimensions are in points. And 1 point is not 1
> pixel if your logical DPI is actually 96 rather than 72, or is it?
> What about when you draw a line with a width of 1 pixel, do you expect
> 1 logical pixel (which is much more than 1 pixel on a HiDPI screen) or
> 1 actual pixel?
>
> It would be much less confusing if the whole computing industry would
> give up on obsolete units and move towards using metric units for
> everything. But you do have a point that in practice, there often is
> some scaling, so that fonts of a given "point size" tend to end up
> smaller on mobile devices. And there is also the point that for
> touchscreen controls, the actual physical size of the control is
> important because people's fingers aren't getting smaller.
> Furthermore, mobile devices are proliferating faster than PCs. So
> maybe in the future, UIs should be designed in real-world units and
> then scaled up when displayed on larger screens that are viewed from a
> distance. There could be a global scale factor so that 1 mm becomes
> 1.7 mm or whatever the user wants. It would be less confusing for
> users if they set a zoom factor instead of having to understand what
> DPI means. 1.0 would mean exact size as designed, and larger numbers
> make everything uniformly larger.
Yeah, Microsoft has an article that tries to explain this:
http://msdn.microsoft.com/en-us/library/windows/desktop/ff684173
Basically, logical DPI _is_ a scaling factor. If your font has a point
size of 24, the number of pixels it corresponds to is
24 * logical_dpi / 72. Only with a logical DPI of 72 does one point
correspond to one pixel.
> Basically we are interpreting
> logical DPI that way, but given an OS that allows me to override
> logical DPI, at what setting can I be guaranteed that a 72-point font
> will be exactly 1 inch tall? That should be independent of what model
> of monitor is attached, as long as the monitor accurately reports its
> real size, right?
To get a 72-point font to be 1 inch tall you'd simply make sure the
logical DPI was the same as the physical DPI.
--
Samuel