[Development] resolution independence (was Re: Retina display support)

Tony Van Eerd tvaneerd at rim.com
Thu Oct 4 16:27:17 CEST 2012

I think proper resolution independence requires, at least at times, two separate measurements for each widget:

visual measure:
	- the physical measurement is in arcminutes (i.e. the angle <top-of-dot, your-eye, bottom-of-dot> subtended by the smallest dot you can see - i.e. discerning the last line of an eye chart)
	- note that this depends on how far the user is from the screen (use an average if necessary)
	- needs a scale factor for audiences/users/situations that do not have 20/20 vision
	- so visual measurements should be in "logical arcminutes" or something like that - arcminutes * scale_factor
	- in many cases, until recently, the visual measure == pixels
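
As a rough numeric sketch of what the visual measure implies (the helper name and the viewing-distance parameter are hypothetical, not Qt API):

```cpp
#include <cmath>

// Hypothetical helper: convert a size in "logical arcminutes"
// (arcminutes * scale_factor, as above) into device pixels,
// given a viewing distance and the screen's physical DPI.
double arcminutesToPixels(double logicalArcmin,
                          double viewingDistanceMm,
                          double dotsPerInch)
{
    const double pi = 3.14159265358979323846;
    double radians = logicalArcmin * pi / (180.0 * 60.0); // angle subtended
    double sizeMm = 2.0 * viewingDistanceMm * std::tan(radians / 2.0);
    return sizeMm * dotsPerInch / 25.4; // mm -> pixels
}
```

At a typical desktop distance of ~600 mm on a 96 DPI screen, one arcminute works out to roughly two thirds of a pixel, which is why "visual measure == pixels" held up for so long.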

touch measure:
	- the physical measurement is in mm, but it also needs a scale factor for given situations
	- e.g. "big thumbs", wearing gloves, physical impairments, etc. - or even just for touch novices, or per situation (touch in a car vs touch on the device)

The problem, of course, is that both these measurements need to be rendered onto the same surface.

UI elements need to be able to supply both measurements (at least for elements that are both viewable and touchable), and the renderer should probably pick the larger of the two. Or some other heuristic(s).
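
A minimal sketch of that "larger of the two" heuristic, assuming both measures have already been converted to pixels (all names here are illustrative):

```cpp
#include <algorithm>

// Each widget supplies both measurements; the renderer resolves them.
struct WidgetMeasures {
    double visualPx; // minimum size needed for legibility
    double touchPx;  // minimum size needed as a reliable touch target
};

// The simplest heuristic: take the larger of the two measures.
double renderedSizePx(const WidgetMeasures &m)
{
    return std::max(m.visualPx, m.touchPx);
}
```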

Font sizes typically rely on the visual measure (although you need to take the touch measure into account when editing the text), whereas widgets like buttons rely more heavily on the touch measure.

The text of a button could be at minimum the visual measure size, but to be aesthetically pleasing it should probably scale up to the touch size.  A multi-line text box, however, might be more inclined to stick to the visual measure, so as to fit more text on the screen, even when the touch size has been scaled up.  (i.e. the result might be a bigger text box, but the text inside didn't get bigger.)

We also need to be able to get "real" units, in order to render, for example, a ruler. (But is that a ruler on the device screen, or on an attached 55-inch monitor...) This doesn't need to be part of a UI layout scheme - just some functions that can be called to get real values for the rare cases where they are necessary.
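
For the "real units" case, the conversion itself is trivial once the platform reports an accurate physical DPI (in Qt that would come from something like QScreen::physicalDotsPerInch(); the function below is a hypothetical sketch, not existing API):

```cpp
// Convert real-world millimetres to device pixels using the
// screen's physical DPI (25.4 mm per inch).
double millimetresToPixels(double mm, double physicalDotsPerInch)
{
    return mm * physicalDotsPerInch / 25.4;
}
```

A 20 mm ruler tick on a 96 DPI screen would then be about 76 pixels wide - but only if the reported DPI belongs to the screen the ruler is actually drawn on.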

> -----Original Message-----
> From: development-bounces+tvaneerd=rim.com at qt-project.org
> [mailto:development-bounces+tvaneerd=rim.com at qt-project.org] On Behalf
> Of Ziller Eike
> Sent: Thursday, October 04, 2012 4:56 AM
> To: Rutledge Shawn
> Cc: Sorvig Morten; development at qt-project.org
> Subject: Re: [Development] resolution independence (was Re: Retina
> display support)
> On 1 Oct 2012, at 11:57, Rutledge Shawn <Shawn.Rutledge at digia.com>
> wrote:
> > On Sep 21, 2012 w38, at 10:37 AM, ext Ziller Eike wrote:
> >>
> >>>> but that would be a huge waste of system resource and performance
> drag when running on non-retina system. Are there any better solutions?
> >>>
> >>> Aren't you seeing the window size in pixels as usual? With that
> available, you would have a generic answer for your kind of question.
> >>
> >> Well, no. "Pixel" in the Qt world atm means something different than
> "pixel" in the physical world (when talking about Cocoa / Mac).
> >> The integer coordinates in Qt actually are mapped to what Cocoa
> calls "points" which is referring to "logical" coordinate space, not
> "device" coordinate space.
> >> A HiDPI screen has the same number of "points" as a corresponding
> non-HiDPI screen, but it has a "scale" (of 2). Applications see the
> same number of points when they run on a HiDPI screen as they would on
> a non-HiDPI screen (--> everything has exactly the same physical
> dimensions when running on different screens).
> >> That means that Qt also reports the same dimensions. Rastering for
> pixmaps is also done based on "points".
> >
> > That distorts the definition of "pixel" rather more than one would
> expect.
> The above just states the facts about how the Apple world *is*. Qt
> actually does the right thing (in terms of Mac OS applications) atm
> that one should do on Mac when running a "non-hi resolution
> application" on a hi resolution screen. Running Qt Creator on a retina
> Macbook works in the sense of all things having the same size as before
> (but having smooth & nice fonts)
> > Here's how it's supposed to work (how it already works on Linux and
> Windows):  QScreen reports both the logical and physical DPI, and the
> documentation already states that logical DPI determines the size of a
> "point" for fonts.  The physical DPI is calculated as the ratio of the
> configured resolution to the physical dimensions of the screen (as
> reported over the DDC connection from the monitor).  Logical DPI can be
> overridden in the operating system (in the display control panel, or on
> Linux, in xorg.conf or by giving a parameter when starting X).
> Overriding the logical DPI is the normal way for people to "zoom" the
> screen, for example to get larger fonts if one's vision is impaired.
> I don't think that is normal (or even possible?) on Mac OS X. The
> accessibility features have a complete screen zoom, but no logical DPI
> or "font size" setting.
> What happens with images/pixmaps in the application? And coordinates
> within widgets? And widget sizes? Reality is that a developer nowadays
> has to face screens that have double (or more) the resolution than most
> screens. As a developer I want to
> 1) either not care if the user has a hi resolution screen or not, so my
> application should be scaled, including all coordinates & sizes &
> pixmaps, and in the best case make use of the dpi for the automatically
> scalable things like vector fonts and vector graphics
> 2) or take real advantage of the hi resolution, e.g. by providing hi
> dpi images/pixmaps for that case, and maybe even use full resolution
> for coordinate calculations at some points
> Case (1) works with Qt on Mac already.
> In case (2) I still do *not* want to manually scale all the coordinates
> & sizes. And I'll probably not be able to provide sensible pixmaps for
> all kinds of possible dpis. And usually I only care for when the dpi is
> heavily different.
> >  (Or else, people who don't know better might just change the
> resolution and let the scaling hardware zoom it up to fit, which will
> have a similar effect on logical DPI, but makes it blurry too.)  On
> pre-OSX Macs, 72 DPI was normal, and was relatively constant if you
> bought Apple displays.  But in more recent times 96 DPI has become
> normal.  So I think a logical pixel should be defined as whatever the
> user or the OS sets it to be, by setting the logical DPI.  (Maybe Qt
> could have a configurable limit though, in case the OS doesn't provide
> a way to override the logical resolution.)
> >
> > QScreen on OSX currently has a hard-coded definition of DPI, 72
> pixels per inch.  This is not accurate on any modern hardware, and I'm
> planning to change it to report actual resolution and logical
> resolution, just like the other platforms.
> Reporting reality to the developer sounds like a good idea
> >  There are already HiDPI non-Apple displays, for example this from
> 2009:  http://techreport.com/news/16181/sony-intros-wide-expensive-
> vaio-p-netbook  which has an 8" display with 1600x768 resolution.  If
> you run Linux or Windows on it, I expect that QScreen will tell you the
> actual resolution.  Qt is supposed to be cross-platform, so it doesn't
> make sense to do something completely different on OSX only.
> >
> > Likewise the idea that HiDPI displays are always "2x" seems to me
> another inelegant hack.
> API-wise they aren't always "2x" on Mac. "UIScreen scale" and "NSScreen
> userSpaceScaleFactor" are CGFloat. Just happen to be 1 or 2 in reality
> atm, and classes like NSImage provide API shortcuts to load images with
> pixelSize=2*pointSize (but you can use NSImageRep to define images with
> any pixel-vs-point relationship).
> >  Actually the DPI varies between devices, so high-resolution art
> should not always need to be exactly 2x the normal size.  It may be
> convenient, but it's not the kind of "solution" we can expect to last
> very long.  I wouldn't be surprised if Apple themselves changes their
> tune later.
> I'd expect a non-integer scaling factor to introduce ugly scaling
> artifacts for pixmaps. That might no longer be relevant when we go to
> high enough resolution, but I somehow doubt that < 200dpi (< x2) would
> be enough.
> We had the case of slightly varying dpi between different devices
> already before, and nobody cared that on some the UI was slightly
> smaller / bigger than on the other ;), so I suppose the assumption is
> that better have the scaling by an integer factor and live with the
> small differences in "real size".
> > I think for the sake of true resolution independence, we need to
> extend QML to have support for units.  E.g. you should be able to
> specify
> >
> > Rectangle {
> >     width: 20mm
> >     height: 10mm
> >     Text {
> >         font.size: 5mm
> >         text: "Hello World"
> >     }
> > }
> Maybe. It might be interesting for specifying some font sizes or the
> occasional button, but I doubt that it is useful for designing a UI in
> the whole. The remaining question then would be if it would be worth
> the whole effort described below, where the occasional font.size: n*dpi
> would do it as well.
> Btw, CSS units cm, mm, in, are interpreted completely differently on
> mobile browsers, so you don't have any guarantees there.
> > font.pixelSize and font.pointSize could even be deprecated then,
> because every supported unit would be OK for every possible dimension:
> pixels (which would probably be logical pixels), millimeters, points,
> inches, etc.  (Maybe we could also have "rpx" or some such to represent
> actual pixels rather than logical pixels.)  The fact that it's a change
> to the language makes it nontrivial, but at least it's the same as what
> CSS does, and QML was designed to be similar to CSS, after all.  Then
> we can claim that we have true resolution-independence.  You could
> specify a rectangle as above, and measure with a ruler on the screen,
> and it should be exactly 2 x 1 cm on every device, as long as the
> device reports its own screen resolution accurately.  It would be the
> same if you print it.  When you are creating a UI, if you want exact
> sizes you could use real-world units, whereas if you want a UI which is
> scaled in proportion to the user's system-wide wishes, you would use
> logical pixels.
> >
> > But then it would also make sense to extend the Javascript
> implementation too, so that it's possible to assign numbers with units.
> As soon as such unit-value types exist, one begins to think it should
> be possible to do math with them too, and have transparent unit
> conversions whenever necessary.  It would be really cool, but it's all-
> new territory for Javascript (although it has been done before in some
> math-oriented languages).  As a stop-gap until the JS extension is
> done, maybe you could still assign a plain number to a unit-value
> quantity, in which case only the number is changed while the units
> remain the same.
> >
> --
> Eike Ziller, Senior Software Engineer - Digia, Qt
> Digia Germany GmbH, Rudower Chaussee 13, D-12489 Berlin
> Geschäftsführer: Mika Pälsi, Juha Varelius, Anja Wasenius
> Sitz der Gesellschaft: Berlin, Registergericht: Amtsgericht
> Charlottenburg, HRB 144331 B
> _______________________________________________
> Development mailing list
> Development at qt-project.org
> http://lists.qt-project.org/mailman/listinfo/development
