[Interest] I.MX6 and QT5

Thomas Senyk thomas.senyk at pelagicore.com
Thu Jan 17 18:58:16 CET 2013


On Thu, January 17, 2013 09:46:30 Eric Nelson wrote:
> On 01/17/2013 03:55 AM, Thomas Senyk wrote:
> > On Thu, January 17, 2013 13:52:06 Mandeep Sandhu wrote:
> >> On Wed, Jan 16, 2013 at 4:22 AM, Eric Nelson
> >> <eric.nelson at boundarydevices.com> wrote:
> >>> On 01/15/2013 03:43 PM, qtnext wrote:
> >>>> Thanks for the info ... I hope that with the i.MX6 it's possible to
> >>>> decode HD video in hardware and, for example, map it to an OpenGL
> >>>> texture or a Quick 2 item, but I have checked the Freescale website
> >>>> and I am not sure whether accelerated decoding is limited to overlay
> >>>> display...
> >>>> 
> >>> <snip>
> >>> 
> >>> Possible? Yes.
> >>> 
> >>> Easy? Maybe for someone with a very precise set of knowledge...
> >>> 
> >>> The VPU decoder natively outputs YUV, which, as you mention,
> >>> can be fed directly to a hardware overlay layer.
> >>> 
> >>> The Vivante GPU also supports a variety of YUV formats though, so
> >>> there's a possibility that a memory buffer known to the GPU could
> >>> be used as the 'sink' of a GStreamer element, and the GPU could
> >>> sample or convert it as necessary.
> >> 
> >> We're also using a media SoC with a Vivante GPU and Qt 4.8 (on
> >> DirectFB). We have a media player application (2D gfx only) that uses
> >> GStreamer with Fluendo gst elements for the audio/video decoders and
> >> sinks. Both media decoding and 2D gfx are h/w accelerated.
> >> 
> >> We merge the output of Qt and GStreamer on the various h/w planes
> >> made available by the SoC (there are four of them). The z-order is
> >> such that the OSD plane (used by Qt) sits on top of the video plane;
> >> the video is made visible by making the required area of the OSD
> >> plane transparent (a sketch of the idea follows below).
> >> 
> >> I don't have much experience with OpenGL, but wouldn't that work for
> >> merging an OpenGL surface with the decoded video output as well? Why
> >> would you want to hand the GPU buffer to the GStreamer sink?
> >> 
> >> Though I understand that any effects applied on the OSD layer (i.e.
> >> in Qt) would not work on the video, as that is a different plane.
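
Side note: a minimal sketch of that OSD-transparency trick, written in
QtQuick 1.1 terms since that runs on Qt 4.8 (the real app may well be
plain widgets). The geometry is made up, and it assumes the view is
created with a translucent background so that unpainted pixels expose
the video plane underneath:

import QtQuick 1.1

Item {
    width: 1280; height: 720

    // Nothing is painted here, so these pixels stay fully transparent
    // and the video h/w plane below the OSD plane shows through.
    Item { id: videoHole; width: 1280; height: 640 }

    // Opaque OSD controls drawn on top of the video.
    Rectangle {
        anchors { left: parent.left; right: parent.right; bottom: parent.bottom }
        height: 80
        color: "#202020"
        Text { anchors.centerIn: parent; color: "white"; text: "play / pause / seek" }
    }
}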
> > 
> > The concept is to use the GPU buffer (maybe an FBO) as a texture
> > (that's how it's done on the desktop and on other embedded HW).
> > 
> > That way you can use the video in the same manner as any other item
> > in your SceneGraph (e.g. transform, translate, shader effects,
> > mapping onto 3D items, ...).
> > 
> > ... so it's about features and flexibility.
> > 
> > If you want a straight, simple, pixel- and color-perfect video player,
> > the HW plane sounds like the better choice.
> > If you want to apply shader effects to it
> > (http://www.youtube.com/watch?v=SMRln8FJvKc), you have to have a
> > video texture (a minimal sketch follows below).
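
To make that concrete, a minimal Qt Quick 2 sketch, assuming a platform
plugin that delivers video frames as scene-graph textures (the file name
is made up):

import QtQuick 2.0
import QtMultimedia 5.0

Rectangle {
    width: 800; height: 480; color: "black"

    MediaPlayer { id: player; source: "movie.mp4"; autoPlay: true }

    // Once the frames end up as textures, the video is an ordinary
    // scene-graph item: rotate, scale, animate it like anything else.
    VideoOutput {
        source: player
        anchors.centerIn: parent
        width: 512; height: 288
        rotation: 15
        scale: area.pressed ? 1.3 : 1.0
        Behavior on scale { NumberAnimation { duration: 200 } }
    }

    MouseArea { id: area; anchors.fill: parent }
}

Anything you can do to a Rectangle you can now do to the video.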
> Thanks, Thomas.
> 
> I want one!
> 
> I'm particularly interested in how that edge detection works (i.e.
> whether this is a stock OpenGL operation).

It's all in the QtMultimedia examples already:
https://qt.gitorious.org/qt/qtmultimedia/trees/stable/examples/multimedia/video/qmlvideofx
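
In case it helps while reading that code: the edge detection is a stock
fragment-shader operation, nothing i.MX6-specific. A rough sketch of the
idea, not the qmlvideofx code itself (a simple central-difference
gradient magnitude rather than the example's full kernel; the file name
is made up):

import QtQuick 2.0
import QtMultimedia 5.0

Item {
    width: 640; height: 360

    MediaPlayer { id: player; source: "movie.mp4"; autoPlay: true }
    VideoOutput { id: video; source: player; anchors.fill: parent }

    // Grab the video item into a texture the effect can sample.
    ShaderEffectSource { id: videoTexture; sourceItem: video; hideSource: true }

    ShaderEffect {
        anchors.fill: parent
        property variant source: videoTexture
        property variant texel: Qt.size(1.0 / width, 1.0 / height)
        fragmentShader: "
            varying highp vec2 qt_TexCoord0;
            uniform sampler2D source;
            uniform highp vec2 texel;
            uniform lowp float qt_Opacity;
            highp float lum(highp vec2 p) {
                return dot(texture2D(source, p).rgb, vec3(0.299, 0.587, 0.114));
            }
            void main() {
                // Central differences in x and y; edge = gradient magnitude.
                highp float gx = lum(qt_TexCoord0 + vec2(texel.x, 0.0))
                               - lum(qt_TexCoord0 - vec2(texel.x, 0.0));
                highp float gy = lum(qt_TexCoord0 + vec2(0.0, texel.y))
                               - lum(qt_TexCoord0 - vec2(0.0, texel.y));
                highp float edge = clamp(length(vec2(gx, gy)) * 4.0, 0.0, 1.0);
                gl_FragColor = vec4(vec3(edge), 1.0) * qt_Opacity;
            }"
    }
}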




