[Development] resolution independence (was Re: Retina display support)

Ziller Eike Eike.Ziller at digia.com
Thu Oct 4 10:56:00 CEST 2012


On 1 Oct 2012, at 11:57, Rutledge Shawn <Shawn.Rutledge at digia.com> wrote:

> On Sep 21, 2012, at 10:37 AM, ext Ziller Eike wrote:
>> 
>>>> but that would be a huge waste of system resources and a performance drag when running on a non-retina system. Are there any better solutions?
>>> 
>>> Aren't you seeing the window size in pixels as usual? With that available, you would have a generic answer to your kind of question.
>> 
>> Well, no. "Pixel" in the Qt world atm means something different than "pixel" in the physical world (when talking about Cocoa / Mac).
>> The integer coordinates in Qt actually are mapped to what Cocoa calls "points" which is referring to "logical" coordinate space, not "device" coordinate space.
>> A HiDPI screen has the same number of "points" as a corresponding non-HiDPI screen, but it has a "scale" (of 2). Applications see the same number of points when they run on a HiDPI screen as they would on a non-HiDPI screen (--> everything has exactly the same physical dimensions when running on different screens).
>> That means that Qt also reports the same dimensions. Rastering for pixmaps is also done based on "points".
> 
> That distorts the definition of "pixel" rather more than one would expect.

The above just states the facts about how the Apple world *is*. Qt atm actually does the right thing (in terms of Mac OS applications), i.e. what one should do on Mac when running a "non-hi-resolution application" on a hi-resolution screen. Running Qt Creator on a retina MacBook works in the sense that all things have the same size as before (but with smooth & nice fonts).
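
To put numbers on it (a minimal sketch in plain C++; none of the names are Qt or Cocoa API, and the values are just the common retina MacBook case):

#include <cstdio>

int main()
{
    const double scale = 2.0;        // what Cocoa reports as the screen's scale
    const int widthInPoints = 1440;  // what the application (and Qt) sees
    const int heightInPoints = 900;
    // What the hardware actually has:
    std::printf("%d x %d device pixels\n",
                int(widthInPoints * scale), int(heightInPoints * scale));
    // -> 2880 x 1800 device pixels
    return 0;
}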

> Here's how it's supposed to work (how it already works on Linux and Windows):  QScreen reports both the logical and physical DPI, and the documentation already states that logical DPI determines the size of a "point" for fonts.  The physical DPI is calculated as the ratio of the configured resolution to the physical dimensions of the screen (as reported over the DDC connection from the monitor).  Logical DPI can be overridden in the operating system (in the display control panel, or on Linux, in xorg.conf or by giving a parameter when starting X).  Overriding the logical DPI is the normal way for people to "zoom" the screen, for example to get larger fonts if one's vision is impaired.

I don't think that is normal (or even possible?) on Mac OS X. The accessibility features have a complete screen zoom, but no logical DPI or "font size" setting.

What happens with images/pixmaps in the application? And coordinates within widgets? And widget sizes? The reality is that a developer nowadays has to face screens that have double (or more) the resolution of most screens. As a developer I want to

1) either not have to care whether the user has a hi-resolution screen or not, so my application should be scaled, including all coordinates & sizes & pixmaps, and in the best case make use of the dpi for the automatically scalable things like vector fonts and vector graphics

2) or take real advantage of the hi resolution, e.g. by providing hi-dpi images/pixmaps for that case, and maybe even use the full resolution for coordinate calculations in some places

Case (1) works with Qt on Mac already.
In case (2) I still do *not* want to manually scale all the coordinates & sizes. And I'll probably not be able to provide sensible pixmaps for all kinds of possible dpis. And usually I only care about the case where the dpi differs heavily.
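
To illustrate what I mean by (2), the kind of boilerplate I'd like to avoid looks roughly like this (a minimal sketch; the "@2x" naming follows the Mac convention, and where the scale factor comes from is exactly the open question, so treat both as assumptions):

#include <QPixmap>
#include <QString>

// Pick the pixmap variant matching the scale factor of the screen we are on.
// Without further support the caller also has to remember to draw the "@2x"
// variant at half its pixel size to keep the layout intact.
QPixmap pixmapForScale(const QString &baseName, qreal scale)
{
    if (scale >= 2.0) {
        QPixmap hi(baseName + QLatin1String("@2x.png"));
        if (!hi.isNull())
            return hi;
    }
    return QPixmap(baseName + QLatin1String(".png"));
}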

>  (Or else, people who don't know better might just change the resolution and let the scaling hardware zoom it up to fit, which will have a similar effect on logical DPI, but makes it blurry too.)  On pre-OSX Macs, 72 DPI was normal, and was relatively constant if you bought Apple displays.  But in more recent times 96 DPI has become normal.  So I think a logical pixel should be defined as whatever the user or the OS sets it to be, by setting the logical DPI.  (Maybe Qt could have a configurable limit though, in case the OS doesn't provide a way to override the logical resolution.)
> 
> QScreen on OSX currently has a hard-coded definition of DPI, 72 pixels per inch.  This is not accurate on any modern hardware, and I'm planning to change it to report actual resolution and logical resolution, just like the other platforms.

Reporting reality to the developer sounds like a good idea.
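Then something along these lines should give meaningful numbers on OS X too (a minimal sketch with the QScreen getters that the other platforms already fill in):

#include <QGuiApplication>
#include <QScreen>
#include <QDebug>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    const QScreen *screen = QGuiApplication::primaryScreen();
    // Logical dpi drives point-based font sizes, physical dpi is derived from
    // the monitor's reported physical dimensions.
    qDebug() << "logical dpi:" << screen->logicalDotsPerInch()
             << "physical dpi:" << screen->physicalDotsPerInch()
             << "geometry:" << screen->geometry();
    return 0;
}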

>  There are already HiDPI non-Apple displays, for example this from 2009:  http://techreport.com/news/16181/sony-intros-wide-expensive-vaio-p-netbook  which has an 8" display with 1600x768 resolution.  If you run Linux or Windows on it, I expect that QScreen will tell you the actual resolution.  Qt is supposed to be cross-platform, so it doesn't make sense to do something completely different on OSX only.
> 
> Likewise the idea that HiDPI displays are always "2x" seems to me another inelegant hack.

API-wise they aren't always "2x" on Mac. "UIScreen scale" and "NSScreen userSpaceScaleFactor" are CGFloat. They just happen to be 1 or 2 in reality atm, and classes like NSImage provide API shortcuts to load images with pixelSize = 2 * pointSize (but you can use NSImageRep to define images with any pixel-vs-point relationship).

>  Actually the DPI varies between devices, so high-resolution art should not always need to be exactly 2x the normal size.  It may be convenient, but it's not the kind of "solution" we can expect to last very long.  I wouldn't be surprised if Apple themselves changes their tune later.

I'd expect a non-integer scaling factor to introduce ugly scaling artifacts for pixmaps. That might no longer be relevant once we get to high enough resolutions, but I somehow doubt that < 200 dpi (< 2x) would be enough.
We already had the case of slightly varying dpi between different devices before, and nobody cared that on some of them the UI was slightly smaller / bigger than on others ;), so I suppose the assumption is that it is better to scale by an integer factor and live with the small differences in "real size".
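
Just to spell out the artifact problem with a non-integer factor (illustration only, the 1.5 is hypothetical):

#include <cstdio>

int main()
{
    const double scale = 1.5; // hypothetical non-integer scale factor
    for (int pointCoord = 0; pointCoord <= 4; ++pointCoord)
        std::printf("point %d -> device pixel %.1f\n", pointCoord, pointCoord * scale);
    // 0 -> 0.0, 1 -> 1.5, 2 -> 3.0, 3 -> 4.5, 4 -> 6.0:
    // every other point boundary falls between device pixels, so a 1-point
    // line ends up sometimes 1 and sometimes 2 pixels wide, or gets filtered
    // and looks blurry.
    return 0;
}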

> I think for the sake of true resolution independence, we need to extend QML to have support for units.  E.g. you should be able to specify
> 
> Rectangle {
>     width: 20mm
>     height: 10mm
>     Text {
>         font.size: 5mm
>         text: "Hello World"
>     }
> }

Maybe. It might be interesting for specifying some font sizes or the occasional button, but I doubt that it is useful for designing a UI as a whole. The remaining question then would be whether it would be worth the whole effort described below, when the occasional font.size: n*dpi would do as well.
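
(With "the occasional font.size: n*dpi" I mean something like the following, here expressed in C++ for illustration; a minimal sketch assuming logicalDotsPerInch() reports a real value by then:)

#include <QFont>
#include <QGuiApplication>
#include <QScreen>
#include <QtGlobal>

// A dpi-dependent size done by hand: 10 "points" worth of pixels, using the
// logical dpi of the primary screen (72 points per inch).  Must be called
// after the QGuiApplication has been created.
QFont dpiScaledFont()
{
    const qreal dpi = QGuiApplication::primaryScreen()->logicalDotsPerInch();
    QFont font;
    font.setPixelSize(qRound(10.0 * dpi / 72.0));
    return font;
}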

Btw, the CSS units cm, mm, and in are interpreted completely differently on mobile browsers, so you don't have any guarantees there.

> font.pixelSize and font.pointSize could even be deprecated then, because every supported unit would be OK for every possible dimension: pixels (which would probably be logical pixels), millimeters, points, inches, etc.  (Maybe we could also have "rpx" or some such to represent actual pixels rather than logical pixels.)  The fact that it's a change to the language makes it nontrivial, but at least it's the same as what CSS does, and QML was designed to be similar to CSS, after all.  Then we can claim that we have true resolution-independence.  You could specify a rectangle as above, and measure with a ruler on the screen, and it should be exactly 2 x 1 cm on every device, as long as the device reports its own screen resolution accurately.  It would be the same if you print it.  When you are creating a UI, if you want exact sizes you could use real-world units, whereas if you want a UI which is scaled in proportion to the user's system-wide wishes, you would use logical pixels.
> 
> But then it would also make sense to extend the Javascript implementation too, so that it's possible to assign numbers with units.  As soon as such unit-value types exist, one begins to think it should be possible to do math with them too, and have transparent unit conversions whenever necessary.  It would be really cool, but it's all-new territory for Javascript (although it has been done before in some math-oriented languages).  As a stop-gap until the JS extension is done, maybe you could still assign a plain number to a unit-value quantity, in which case only the number is changed while the units remain the same.
> 

-- 
Eike Ziller, Senior Software Engineer - Digia, Qt
 
Digia Germany GmbH, Rudower Chaussee 13, D-12489 Berlin
Geschäftsführer: Mika Pälsi, Juha Varelius, Anja Wasenius
Sitz der Gesellschaft: Berlin, Registergericht: Amtsgericht Charlottenburg, HRB 144331 B



