[Interest] Efficiently render QtQuick-scene and encode to H264 on i.MX6 VPU?

Gunnar Sletta gunnar at crimson.no
Thu Mar 22 11:15:40 CET 2018


Hi Ola,

I think I would prefer to skip the FBO -> QImage step, as the normal
FBO readback goes through glReadPixels, which is quite costly, so
you'll lose a lot of performance that way. You might also benefit
from doing the RGB -> YUV conversion on the GPU, if there is time
enough to do that.
The i.MX6 Vivante GPU will most likely have the GL_VIV_direct_texture
extension available. In that case, what you could do is set up
your rendering using QQuickRenderControl to render to a
QOpenGLFramebufferObject, as you say. Then, once you have the FBO, you
can run a separate pass over that FBO to do the RGB -> YUV conversion
with a custom fragment shader. Or probably multiple passes, since it is
planar YUV and the Y goes into a separate location in the destination
buffer from the U and V, but you get the idea.
The target for the second pass should be a GL_VIV_direct_texture, which
means that once the second pass has completed, the memory will be
directly accessible from the CPU side and you can pass that on
to the VPU encoder.
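For reference, planar I420 lays its planes out back to back in one buffer, so the offsets the encoder expects inside the direct-texture memory can be computed like this (a sketch of the layout arithmetic only; the glTexDirectVIV entry points from the Vivante extension are not shown):

```cpp
#include <cstddef>

// Plane offsets for an I420 (YUV 4:2:0 planar) buffer of the kind a
// GL_VIV_direct_texture surface exposes to the CPU: a full-resolution
// Y plane followed by quarter-resolution U and V planes.
// width and height are assumed to be even.
struct I420Layout {
    std::size_t ySize;    // width * height bytes
    std::size_t uOffset;  // start of the U plane
    std::size_t vOffset;  // start of the V plane
    std::size_t total;    // total buffer size in bytes
};

inline I420Layout i420Layout(std::size_t width, std::size_t height)
{
    const std::size_t ySize = width * height;
    const std::size_t cSize = (width / 2) * (height / 2);
    return { ySize, ySize, ySize + cSize, ySize + 2 * cSize };
}
```

For a 1280x720 frame this gives a 921600-byte Y plane and 230400 bytes for each chroma plane, 1382400 bytes in total.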
You might want to add one frame of latency into the mix, so that you
pipeline things rather than stalling to wait for the first and second
pass to complete.
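A sketch of that pipelining idea, with the render and encode steps abstracted away: while frame N is being encoded, frame N+1 is already rendering, at the cost of one frame of latency.

```cpp
#include <optional>

// Two-slot frame pipeline: render into one slot while the previously
// rendered slot is handed to the encoder. submit() is called after
// rendering a frame and returns the slot that is now ready to encode,
// i.e. the one rendered on the previous call.
class FramePipeline {
public:
    std::optional<int> submit()
    {
        std::optional<int> ready;
        if (m_framesSubmitted > 0)
            ready = 1 - m_current;  // previous slot is now complete
        m_current = 1 - m_current;  // render into the other slot next
        ++m_framesSubmitted;
        return ready;
    }

private:
    int m_current = 0;
    long m_framesSubmitted = 0;
};
```

The very first submit() returns nothing (there is no completed frame yet); after that the encoder always trails the renderer by exactly one slot.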
If the GPU isn't able to keep up doing both things, then you can create
an FBO directly from a direct texture and render into it using
QQuickWindow::setRenderTarget(GLuint fbo, QSize size) with your custom
FBO. Again, pipeline it to have at least two buffers in use. Then use
the direct-texture access to the FBO to do the RGB -> YUV conversion on
the CPU side and pass it on to the encoder from there.
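A sketch of that CPU-side conversion, assuming the direct texture hands you tightly packed 8-bit RGBA and the encoder wants I420 (integer full-range BT.601 coefficients; on a real i.MX6 you would more likely use the NEON-optimized routines in libyuv than a plain loop like this):

```cpp
#include <cstdint>
#include <vector>

// Convert tightly packed 8-bit RGBA to planar I420 (full-range BT.601,
// fixed-point /256 coefficients). Chroma is subsampled by taking the
// top-left pixel of each 2x2 block; averaging all four would be nicer.
// width and height are assumed to be even.
std::vector<std::uint8_t> rgbaToI420(const std::uint8_t *rgba,
                                     int width, int height)
{
    const int ySize = width * height;
    const int cSize = (width / 2) * (height / 2);
    std::vector<std::uint8_t> out(ySize + 2 * cSize);
    std::uint8_t *yPlane = out.data();
    std::uint8_t *uPlane = yPlane + ySize;
    std::uint8_t *vPlane = uPlane + cSize;

    for (int j = 0; j < height; ++j) {
        for (int i = 0; i < width; ++i) {
            const std::uint8_t *p = rgba + 4 * (j * width + i);
            const int r = p[0], g = p[1], b = p[2];
            yPlane[j * width + i] =
                static_cast<std::uint8_t>((77 * r + 150 * g + 29 * b) >> 8);
            if ((i & 1) == 0 && (j & 1) == 0) {
                const int ci = (j / 2) * (width / 2) + i / 2;
                uPlane[ci] = static_cast<std::uint8_t>(
                    ((-43 * r - 85 * g + 128 * b) >> 8) + 128);
                vPlane[ci] = static_cast<std::uint8_t>(
                    ((128 * r - 107 * g - 21 * b) >> 8) + 128);
            }
        }
    }
    return out;
}
```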
I don't know which one will work best, but you want to avoid the
QOpenGLFramebufferObject -> QImage conversion at all cost :)
--
  Gunnar Sletta
  gunnar at crimson.no



On Thu, Mar 22, 2018, at 2:37 AM, Ola Røer Thorsen wrote:
> Hi,
> I need to render a Qt Quick scene and stream it as H264 data on an i.MX6-
> device running Linux. The use case is to be used live with as little
> latency as possible.> 
> I'm thinking of using the QQuickRenderControl class. The easy way is
> probably to render the scene to a QOpenGLFramebufferObject, convert to
> QImage, convert to yuv420p, and feed to the VPU encoder. But maybe
> there is some device-specific way that is more efficient?
>
> Best regards,
> Ola
> 
> 
> _________________________________________________
> Interest mailing list
> Interest at qt-project.org
> http://lists.qt-project.org/mailman/listinfo/interest


