[Interest] Implementation of accelerated rendering of Media in QtWebKit

Luca Carlon carlon.luca at gmail.com
Mon May 6 19:15:42 CEST 2013

Hi! I would like to reimplement the MediaPlayerPrivateQt class in 
QtWebKit in Qt 5.0.2 to use hardware acceleration for rendering video on 
the Raspberry Pi (I got some info from here: 
https://bugs.webkit.org/show_bug.cgi?id=86410). I suppose that by 
returning true from MediaPlayerPrivateQt::supportsAcceleratedRendering(), 
I should start getting calls to 
MediaPlayerPrivateQt::paintToTextureMapper(...). 
Is this assumption correct? If so, who is supposed to call that method?

I suppose that this would be done in the web process, and that I should 
somehow place decoded frames in GPU memory in a way that is accessible 
from the UI process. Assuming I can do this, can somebody point me to 
the place in the sources where the "drawing" in the UI process takes 
place? The idea would be to "pack" the data somehow in the web process 
and "unpack" it in the UI process to draw the OpenGL texture.

Could someone point me to these two places in the code to get me 
started? (I understand that my ideas are still quite vague.)
