[Qtwebengine] Preparing external textures for HTML5 video in qtwebengine

kenz kiran kirankenz at gmail.com
Thu Jan 30 21:00:49 CET 2014


Hello

We (the QNX port of Blink) are developing a Blink-based browser with
QtWebEngine in delegated rendering mode.

To implement HTML5 video we introduced an Android-like StreamTexture
class, based on the GL_OES_EGL_image_external extension.

We have extended delegated_frame_node and added a YUV-node-like class
(with custom shaders, etc.) to support the GL_OES_EGL_image_external
extension.
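
For context, the fragment shader for such a node follows the standard
samplerExternalOES pattern, roughly as below (a sketch, not our actual
shader; the constant name is made up):

    // Minimal GLSL ES 2.0 fragment shader for sampling an EGLImage-backed
    // external texture; the real YUV node's shaders may differ.
    static const char kExternalFragmentShader[] =
        "#extension GL_OES_EGL_image_external : require\n"
        "precision mediump float;\n"
        "varying vec2 v_texCoord;\n"
        "uniform samplerExternalOES s_texture;\n"
        "void main() {\n"
        "  gl_FragColor = texture2D(s_texture, v_texCoord);\n"
        "}\n";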


We are discussing a current design limitation on the Chromium graphics-dev
mailing list; the thread to search for is "Updating StreamTextures in
Delegated Rendering".

The design issue we need to solve (we have worked around it, but the
workaround is not optimal) is the following:
Whenever we get a new frame from the video decoder, before we blit the
quads from the quad list received for the frame, we need to call update(),
which internally does

glBindTexture(GL_TEXTURE_EXTERNAL_OES, mTexName) and
glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, (GLeglImageOES)image),

where 'image' is the EGLImage of the latest video frame from the decoder.
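
In other words, something like the sketch below (class and member names
are illustrative, not our exact code; note that on most platforms
glEGLImageTargetTexture2DOES must be resolved via eglGetProcAddress):

    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    class StreamTexture {
    public:
        explicit StreamTexture(GLuint texName) : mTexName(texName) {}

        // Rebind the texture to the EGLImage wrapping the newest decoded
        // video frame; must run on the thread that owns the GL context.
        void update(EGLImageKHR image) {
            glBindTexture(GL_TEXTURE_EXTERNAL_OES, mTexName);
            glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES,
                                         (GLeglImageOES)image);
        }

    private:
        GLuint mTexName;
    };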

In Chromium's existing design, before any draw call (say, DrawArrays or
DrawElements), Chromium calls PrepareTexturesForRender(), which internally
calls Update() on each of the StreamTextures.

But in the case of delegated rendering, since we don't use Chromium's GL
renderer but instead use Qt as the parent compositor rendering the
received quad list, we don't make any draw calls through Chromium's GL
infrastructure.

So at this point we need a PrepareTextures()-like call from
delegated_frame_node. This should happen on the GPU thread, and the Qt
render thread should be blocked until it completes.
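
For illustration, here is a minimal sketch of the blocking behaviour we
have in mind, using plain C++11 primitives rather than Qt's or Chromium's
own task runners (postToGpuThread() is a hypothetical stand-in for
whatever task-posting mechanism the port provides):

    #include <condition_variable>
    #include <functional>
    #include <mutex>

    // Provided elsewhere: queues |task| for execution on the GPU thread.
    void postToGpuThread(std::function<void()> task);

    // Called from the Qt render thread before the quad list is drawn;
    // blocks until the GPU thread has updated all stream textures.
    void prepareTexturesAndWait(const std::function<void()>& updateTextures) {
        std::mutex mutex;
        std::condition_variable cv;
        bool done = false;

        postToGpuThread([&] {
            updateTextures();  // e.g. call update() on every StreamTexture
            std::lock_guard<std::mutex> lock(mutex);
            done = true;
            cv.notify_one();
        });

        std::unique_lock<std::mutex> lock(mutex);
        cv.wait(lock, [&] { return done; });
    }

delegated_frame_node would call something like this right before handing
the quads to the Qt scene graph.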

So I wanted to check with this forum (specifically Jocelyn, and maybe
Gunnar) for ideas on how to resolve this problem.


Also, the Chromium folks have raised some concerns about textures being
deleted while they are being composited by Qt. I think mailboxes and the
existing infrastructure should cover this case, but I haven't studied them
in much depth.
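
As far as I understand it, the mailbox route (the
GL_CHROMIUM_texture_mailbox extension) would look roughly like the sketch
below; both calls go through Chromium's command-buffer GL interface, so I
am not yet sure how it maps onto a raw Qt context:

    // Producer side, in the command-buffer context that owns the texture:
    GLbyte mailbox[GL_MAILBOX_SIZE_CHROMIUM];
    glGenMailboxCHROMIUM(mailbox);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, mTexName);
    glProduceTextureCHROMIUM(GL_TEXTURE_EXTERNAL_OES, mailbox);

    // Consumer side, in another command-buffer context in the same share
    // group; replaces the contents of the currently bound texture:
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, consumerTexId);
    glConsumeTextureCHROMIUM(GL_TEXTURE_EXTERNAL_OES, mailbox);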

Here is the excerpt:

Q: How exactly do you render with the texture if not DrawArrays or
DrawElements? Does the actual final drawing not go through the
gpu/commandbuffer layer, i.e. do you pull out or share the underlying GL
texture somehow with an external GL context?

 WillUseTexture() is supposed to be called from any GL command being parsed
that causes the texture to be used for drawing or blits.

     I am certain we don't go through the gpu/command buffer setup for
drawing (hence no DrawArrays/DrawElements). Yes, we have a shared-context
setup. The texture IDs are known to the Qt rendering context (which is not
the same one used by the GPU thread).

 Concern: Does that not cause problems when sharing the texture that way,
since the gpu TextureManager and ContextGroup manage the lifetime of the
texture (and could try to delete it)?
I recommend you either have your view system use the gpu stack's GL client
interface, or share the texture externally in a more explicit way.

 Otherwise, if you are already pulling the GL texture out from underneath
the Chromium gpu stack, you might as well pull out the SurfaceTexture
handle as well and update it (see GetLevelImage()).


regards
Ravi