[Qt-interest] Fade In/Out Video with Phonon
Josiah Bryan
jbryan at productiveconcepts.com
Mon Feb 22 13:35:40 CET 2010
Oliver.Knoll at comit.ch wrote:
> Josiah Bryan wrote on Saturday, February 20, 2010 8:07 PM:
>
>> ... Even with setting the contrast down to -1 at
>> the same time, the video still shows through. Going the other way (fade to
>> white, or +1.0 brightness) works a bit better, but even at 1.0, the
>> video is still legible.
>>
>
> Setting the contrast to -1 (or -100%) does *not* set the video to black: it reduces, well, the contrast by 100% (exactly what the method name implies).
>
Right, well - it was just a last-ditch effort after the brightness at -1
didn't work - I was just trying to elaborate on what I had tried, that's all.
> What you want is to set the alpha value from 100% (opaque) to gradually 0% (transparent), so that the underlying surface - black or another video - shines through.
>
Right. I've tried putting a Phonon::VideoWidget in a
QGraphicsProxyWidget via QGraphicsScene::addWidget and setting the
opacity that way - the first setOpacity call freezes the UI thread for
about 1-2 minutes (the video continues to play), then the video pauses
and the fade timer finishes its run to zero (even though no pause() call
was ever made). Bottom line: the conventional / pedestrian method of
setting the alpha does not seem to work with VideoWidget.
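For reference, this is roughly the setup (trimmed down; the file name
and the single setOpacity() call stand in for the real code, which steps
the opacity from a QTimer):

#include <QApplication>
#include <QGraphicsProxyWidget>
#include <QGraphicsScene>
#include <QGraphicsView>
#include <Phonon/MediaObject>
#include <Phonon/MediaSource>
#include <Phonon/VideoWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    app.setApplicationName("fadetest"); // Phonon warns if this is unset

    Phonon::MediaObject *media = new Phonon::MediaObject;
    media->setCurrentSource(Phonon::MediaSource("clip.avi"));

    Phonon::VideoWidget *video = new Phonon::VideoWidget;
    Phonon::createPath(media, video);

    // Wrap the video widget in a proxy so it becomes a graphics item
    QGraphicsScene scene;
    QGraphicsProxyWidget *proxy = scene.addWidget(video);

    QGraphicsView view(&scene);
    view.show();

    media->play();

    // In the real code a timer steps this from 1.0 down to 0.0;
    // the first such call is where the UI thread stalls.
    proxy->setOpacity(0.5);

    return app.exec();
}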
> How you do that with Phonon I don't know (I have no experience with that API), but basically you would
>
> - render the video frame on some "surface" (QImage, OpenGL textured rectangle, ...)
> - set the alpha value accordingly (1.0 down to 0.0)
> - render that "surface" on top of an existing one (black background, other video frame)
>
Right again - all well and good, but how do you actually get the
individual frames from the video to do your own rendering? I've yet to
find a signal like, say, MediaObject::newImage(const QImage&) that fires
so I can render the image myself. If I could do my own rendering of the
video, I'd gladly experiment with any and all of these suggestions - but
they all hinge on low-level access to the video stream that Phonon
doesn't seem to provide. Please, someone, anyone - tell me I'm wrong! :-)
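If it helps to see what I mean, the compositing step itself would be
trivial once a frame exists as a QImage - something like the sketch
below, where the 'frame' argument is exactly the piece Phonon won't
hand me:

#include <QColor>
#include <QImage>
#include <QPainter>

// Composite one video frame over black at the given opacity
// (1.0 = fully visible, 0.0 = faded to black). 'frame' is assumed to
// come from some per-frame hook that Phonon doesn't seem to expose.
QImage fadeOverBlack(const QImage &frame, qreal opacity)
{
    QImage out(frame.size(), QImage::Format_RGB32);

    QPainter p(&out);
    p.fillRect(out.rect(), QColor(Qt::black)); // the fade target
    p.setOpacity(opacity);                     // blend factor for the frame
    p.drawImage(0, 0, frame);
    p.end();

    return out;
}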
> Note that alpha-blending images "in realtime" (at say 25 FPS) can be quite CPU intensive, so depending on the video size (PAL, HDTV, ...) and the compression (h.264, ...), which might also use a lot of CPU power, doing this "in software" can be unsatisfactory. But that would be my first approach: see what you get with a simple QImage/QPainter approach. Maybe painting onto a QGLWidget with the QPainter API would already improve the performance enough. Otherwise use as much hardware support as you can get (use a codec which is supported by your graphics hardware, use OpenGL, use Pixel Buffer Objects (PBOs) to transfer texture data as quickly as possible into graphics card memory, etc.). Since you want to blend against "black" (instead of another video), maybe there are even more tricks to speed things up.
>
Thanks for the ideas! If someone can share the secrets of custom Phonon
video rendering, I'll explore these more.
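For instance, I read the QGLWidget suggestion as something along these
lines (untested sketch - filling in the frame each tick is still the
open question):

#include <QColor>
#include <QGLWidget>
#include <QImage>
#include <QPaintEvent>
#include <QPainter>

// Untested sketch: let the GL paint engine do the per-frame blend of
// the current frame over black.
class FadeGLWidget : public QGLWidget
{
public:
    explicit FadeGLWidget(QWidget *parent = 0)
        : QGLWidget(parent), m_opacity(1.0) {}

    void setFrame(const QImage &frame) { m_frame = frame; update(); }
    void setFadeOpacity(qreal opacity) { m_opacity = opacity; update(); }

protected:
    void paintEvent(QPaintEvent *)
    {
        QPainter p(this);
        p.fillRect(rect(), QColor(Qt::black)); // fade target
        p.setOpacity(m_opacity);               // 1.0 visible, 0.0 black
        if (!m_frame.isNull())
            p.drawImage(rect(), m_frame);
    }

private:
    QImage m_frame;  // current video frame - source still unsolved
    qreal m_opacity;
};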
-josiah