Symbian^3 introduces a new compositing graphics subsystem in which
non-UI content such as video is provided by client applications via
graphics surfaces.
This patch modifies the video playback part of the Phonon MMF backend
so that, on devices which use the new graphics architecture (NGA),
video is rendered to a surface. On devices which use the legacy
graphics architecture, the existing video rendering path, which uses
Direct Screen Access (DSA), is maintained.
On NGA devices, video playback applications do not deal with surfaces
directly; instead, they use a new MMF client API called
CVideoPlayerUtility2. The implementation of this API takes care of
creating a graphics surface, registering it with the window manager,
and directing the output of the video decoder into this surface.
CVideoPlayerUtility2 inherits from the legacy video playback API,
CVideoPlayerUtility, deprecating certain functions and adding new ones.
The main changes involved in porting CVideoPlayerUtility client code
to CVideoPlayerUtility2 are as follows (a sketch of the surface-path
window setup follows the list):
1. CVideoPlayerUtility requires a window handle to be provided at
object construction time.
The CVideoPlayerUtility2 constructor does not take a window
handle; it is provided by the client later via the
SetDisplayWindowL function.
2. CVideoPlayerUtility requires the client to provide an absolute
screen rectangle at construction time, and then to call
SetDisplayWindowL whenever this rectangle changes due to either
window repositioning or resizing.
CVideoPlayerUtility2 requires the client to provide a display
rectangle which is relative to the display window. This
rectangle must be updated via SetVideoExtentL /
SetWindowClipRectL when the window is resized, but no update is
required when the window is repositioned - the compositing
window system takes care of repositioning the video content on
the screen.
3. CVideoPlayerUtility requires the client to paint transparent
black into the region of the window in which video will be
displayed. CVideoPlayerUtility2 does not require the client
to paint the video window.
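To make the three differences concrete, here is a minimal sketch of the
surface-path setup, written only from the description above. The helper
functions themselves and the parameter lists of the CVideoPlayerUtility2
calls are assumptions, not code taken from videoplayer2.h or the backend.

```cpp
#include <videoplayer2.h>  // CVideoPlayerUtility2
#include <w32std.h>        // RWsSession, CWsScreenDevice, RWindow

// Sketch only: the surface-rendering setup implied by points (1)-(3) above.
// The functions called on the player are named in this commit message; their
// parameter lists and the helpers shown here are assumptions.
CVideoPlayerUtility2* CreateSurfacePlayerL(MVideoPlayerUtilityObserver& aObserver)
{
    // (1) No window handle or screen rectangle is needed at construction time.
    return CVideoPlayerUtility2::NewL(aObserver,
                                      EMdaPriorityNormal,
                                      EMdaPriorityPreferenceNone);
}

void AttachWindowL(CVideoPlayerUtility2& aPlayer, RWsSession& aWs,
                   CWsScreenDevice& aScreenDevice, RWindow& aWindow)
{
    // (2) The display rectangle is relative to the window, not the screen.
    const TRect displayRect(TPoint(0, 0), aWindow.Size());
    aPlayer.SetDisplayWindowL(aWs, aScreenDevice, aWindow,
                              displayRect,   // video extent
                              displayRect);  // window clip rect
}

void HandleWindowResizeL(CVideoPlayerUtility2& aPlayer, const TRect& aNewRect)
{
    // Only a resize needs to be propagated; repositioning is handled by the
    // compositing window system, and (3) no transparent-black painting of
    // the video region is required.
    aPlayer.SetVideoExtentL(aNewRect);
    aPlayer.SetWindowClipRectL(aNewRect);
}
```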
In order to accommodate these differences, the existing VideoPlayer and
VideoOutput classes are replaced with AbstractVideoPlayer and
AbstractVideoOutput respectively. These abstract base classes
encapsulate functionality which is common between the DSA and surface
rendering client code. Because CVideoPlayerUtility2 inherits from
CVideoPlayerUtility, AbstractVideoPlayer is able to hold a pointer to
CVideoPlayerUtility, via which it controls functionality which is not
affected by the details of the rendering path, such as play/pause/stop,
seek and metadata access.
The three areas of divergence listed above are encapsulated in the
derived classes DsaVideoOutput/SurfaceVideoOutput and DsaVideoPlayer/
SurfaceVideoPlayer. Of the three, (1) and (3) are fairly
straightforward. For DSA video playback, the need to respond to
changes in video widget absolute screen position in (2) necessitated
the AncestorMoveMonitor class, which installs an event filter on each
ancestor of the video widget. This class is not required for surface
video playback and is therefore removed from the surface-rendering
code path.
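The resulting class split can be pictured with a skeleton like the
following; the class names and the CVideoPlayerUtility base-class pointer
are taken from this message, while the member and function names shown
are illustrative assumptions.

```cpp
#include <videoplayer.h>  // CVideoPlayerUtility

// Skeleton of the class split described above; everything other than the
// class names and the base-class pointer is an illustrative assumption.
class AbstractVideoPlayer
{
public:
    // Rendering-path-independent functionality goes through the base-class
    // pointer. Because CVideoPlayerUtility2 derives from CVideoPlayerUtility,
    // both the DSA and the surface paths can share this code.
    void play()  { m_player->Play(); }
    void pause() { TRAP_IGNORE(m_player->PauseL()); }
    void stop()  { m_player->Stop(); }

protected:
    // Created by the derived class as either CVideoPlayerUtility (DSA) or
    // CVideoPlayerUtility2 (surfaces).
    CVideoPlayerUtility *m_player;

private:
    // Rendering-path-specific steps implemented by the derived classes.
    virtual void createPlayer() = 0;
    virtual void setupVideoOutput() = 0;
};

// DSA path: constructs CVideoPlayerUtility with a window handle and an
// absolute screen rectangle; relies on AncestorMoveMonitor for moves.
class DsaVideoPlayer : public AbstractVideoPlayer { /* ... */ };

// Surface path: constructs CVideoPlayerUtility2 and supplies the window
// later via SetDisplayWindowL; no AncestorMoveMonitor is needed.
class SurfaceVideoPlayer : public AbstractVideoPlayer { /* ... */ };
```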
Selection of either the DSA- or surface-rendering code path is done
at qmake time, via the exists(...) check introduced in mmf.pro. This
checks for existence of the header in which CVideoPlayerUtility2 is
defined; if this file is found, surface rendering is selected,
otherwise the DSA rendering version of the backend is built. Note that
this approach is not completely robust, since it is possible for an
environment to include the videoplayer2.h header and yet be configured
to use the legacy graphics subsystem. This could be dealt with by
instead performing the check for surface support at configuration time,
building and executing a small Symbian program which would return
different output according to which of the two graphics subsystems is
in use.
Task-number: QTBUG-8919
Reviewed-by: Frans Englich
---
Recent changes to phonon and syncqt cause problems with the default
search path for #include with this compiler.
Reviewed-by: Gareth Stockwell
---
Task-number: QTBUG-4663
Reviewed-by: Frans Englich
---
- Replaced VolumeObserver and VideoOutputObserver interfaces with
signals.
- Added signals for propagating changes in aspect ratio and scale mode
from VideoOutput to VideoPlayer.
- Removed VideoPlayer::getNativeWindowSystemHandles. Interaction with
the window system is moved into VideoOutput, so that VideoPlayer is
better focussed on its main task: interacting with CVideoPlayerUtility.
- WId changes, resize and move events received by the VideoOutput
widget cause it to emit a videoWindowChanged signal. This is received
by the VideoPlayer, which triggers a call to updateVideoRect. The
main task of this function is to calculate the scale factors which are
provided to the MMF via CVideoPlayerUtility::SetScaleFactorL (see the
sketch after this list). Note that:
i) This function must be called both before and after the call
to SetDisplayWindowL. For changes in aspect ratio or scale
mode, setting the scale mode after updating the display window
is sufficient. However, testing showed that, when switching in
or out of full-screen mode, two calls were necessary in order
to preserve the correct aspect ratio.
ii) The screen rectangle passed to the MMF is still the full
window extent; it is not the region in which video will
actually be rendered. The post-processor will fill in the
remainder of the window with a background colour (typically
black). If, on the other hand, we passed in the actual video
display rectangle, we would need to do this background painting
in the widget. This in turn would require a change to QtGui:
at present, we can only disable blitting on a per-widget basis
(by setting QWExtra::disableBlit). If we needed to paint the
borders of the video window, disableBlit would need to contain
the actual DSA region, rather than just a single boolean flag.
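A condensed sketch of the videoWindowChanged -> updateVideoRect flow in
the last bullet point. VideoOutput, videoWindowChanged, updateVideoRect
and SetScaleFactorL are named above; the helper shown, the member wiring
and the percentage arithmetic are assumptions about one plausible
implementation.

```cpp
#include <QMoveEvent>
#include <QResizeEvent>
#include <QWidget>
#include <videoplayer.h>  // CVideoPlayerUtility

// Widget side: report geometry changes to the player via a signal.
class VideoOutput : public QWidget
{
    Q_OBJECT
signals:
    void videoWindowChanged();
protected:
    void resizeEvent(QResizeEvent *event)
    {
        QWidget::resizeEvent(event);
        emit videoWindowChanged();
    }
    void moveEvent(QMoveEvent *event)
    {
        QWidget::moveEvent(event);
        emit videoWindowChanged();
    }
    // A WId change would be reported to the player in the same way.
};

// Plays the role of VideoPlayer::updateVideoRect: convert the widget size
// and the clip's native frame size into the percentage scale factors
// expected by the MMF.
static void updateVideoRect(CVideoPlayerUtility &player,
                            const QSize &windowSize, const QSize &frameSize)
{
    if (frameSize.isEmpty())
        return;

    const TReal32 widthScale  = 100.0f * windowSize.width()  / frameSize.width();
    const TReal32 heightScale = 100.0f * windowSize.height() / frameSize.height();

    // Called both before and after SetDisplayWindowL; see note (i) above.
    TRAP_IGNORE(player.SetScaleFactorL(widthScale, heightScale,
                                       ETrue /* anti-alias filtering */));
}
```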
Task-number: QTBUG-5585
Reviewed-by: Frans Englich
---
Propagate changes in the video widget's absolute screen position caused
by an ancestor widget being moved.
Because QWidget::moveEvent is only called when a widget moves relative
to its parent, a widget's absolute screen position may change without it
receiving a moveEvent (for example, as a result of its parent being
moved).
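The premise is easy to demonstrate in isolation with a toy program (not
backend code): moving the parent changes the child's global position,
yet the child receives no move event.

```cpp
// Toy demonstration: the child's position relative to its parent does not
// change when the parent is moved, so no move event reaches the child even
// though its position on screen has changed.
#include <QApplication>
#include <QDebug>
#include <QMoveEvent>
#include <QWidget>

class Child : public QWidget
{
public:
    Child(QWidget *parent) : QWidget(parent) {}
protected:
    void moveEvent(QMoveEvent *)
    {
        qDebug() << "child moveEvent, pos relative to parent:" << pos();
    }
};

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QWidget parent;
    Child child(&parent);
    parent.show();

    qDebug() << "child on screen before:" << child.mapToGlobal(QPoint(0, 0));
    parent.move(parent.pos() + QPoint(50, 50)); // no moveEvent reaches the child
    qDebug() << "child on screen after: " << child.mapToGlobal(QPoint(0, 0));
    return 0;
}
```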
The MMF video playback API on Symbian v9.4 requires, in addition to a
window handle, an absolute screen rectangle. Changes in the video
widget's absolute screen position therefore need to be propagated
into the MMF.
This change introduces a new object, AncestorMoveMonitor, which
installs an event filter on the QCoreApplication instance. A
VideoOutput object registers with the AncestorMoveMonitor, which
listens on its behalf for MoveEvents and ParentChangeEvents directed
at any of the ancestors of the VideoOutput. MoveEvents trigger a
callback to the VideoOutput instance, which then notifies the MMF of
the new screen rectangle. ParentChangeEvents cause the
AncestorMoveMonitor to update the ancestor list associated with the
target VideoOutput instance.
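A condensed sketch of how such a monitor can be structured; the class
name, the application-wide event filter and the two event types come
from this message, while the method names and the bookkeeping are
assumptions.

```cpp
// Condensed sketch of an application-wide ancestor monitor as described
// above; method names and internal bookkeeping are illustrative assumptions.
#include <QCoreApplication>
#include <QEvent>
#include <QHash>
#include <QList>
#include <QWidget>

class AncestorMoveMonitor : public QObject
{
public:
    AncestorMoveMonitor()
    {
        QCoreApplication::instance()->installEventFilter(this);
    }

    // Called when a VideoOutput registers itself; records the widget's
    // current chain of ancestors.
    void registerTarget(QWidget *videoOutput)
    {
        QList<QObject *> ancestors;
        for (QWidget *w = videoOutput->parentWidget(); w; w = w->parentWidget())
            ancestors.append(w);
        m_targets.insert(videoOutput, ancestors);
    }

protected:
    bool eventFilter(QObject *watched, QEvent *event)
    {
        const QEvent::Type type = event->type();
        if (type == QEvent::Move || type == QEvent::ParentChange) {
            // Find every registered VideoOutput with 'watched' among its
            // ancestors.
            QList<QWidget *> affected;
            QHash<QWidget *, QList<QObject *> >::const_iterator it;
            for (it = m_targets.constBegin(); it != m_targets.constEnd(); ++it) {
                if (it.value().contains(watched))
                    affected.append(it.key());
            }
            foreach (QWidget *videoOutput, affected) {
                if (type == QEvent::Move)
                    notifyVideoOutput(videoOutput); // push new screen rect to the MMF
                else
                    registerTarget(videoOutput);    // ancestor chain changed: rebuild it
            }
        }
        return QObject::eventFilter(watched, event);
    }

private:
    void notifyVideoOutput(QWidget *videoOutput)
    {
        // In the backend this is a callback into VideoOutput, which in turn
        // informs the MMF of the new absolute screen rectangle.
        Q_UNUSED(videoOutput);
    }

    QHash<QWidget *, QList<QObject *> > m_targets;
};
```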
The video position now tracks that of the associated widget, but there
are two problems which require further investigation:
1. The video window lags behind. This may be an unavoidable
consequence of the fact that setting a new screen rectangle causes the
MMF to tear down its DSA session and start a new one; this is known to
block the window server and take some time to complete.
2. Artifacts are visible around the edges of the moving video widget.
Task-number: QTBUG-4787
Reviewed-by: Frans Englich
---
Previously the MediaObject propagation was only done for effects, but
now it is done for all kinds of nodes. This is needed for AudioOutput.
---
This extends the framework to handle audio effects, which largely
affects how the audio chain is set up, connected and disconnected. The
Backend has therefore been refactored slightly and the class MediaNode
introduced (see its documentation).
In addition, two effects have been written: BassBoost and AudioEqualizer.
---
These hacks, which are enabled using macros in defs.h, are:
PHONON_MMF_HARD_CODE_VIDEO_RECT
VideoPlayer should obtain the screen rectangle into which video
rendering will occur, using the CoeControl associated with the video
widget (i.e. the VideoOutput object). However, this control always has
co-ordinates (relative to its parent) of (0,0)-(100,30), regardless of
whether updateGeometry() and show() are called on the widget. So, this
macro just hard-codes the screen rectangle to a value which works when
the breakfast.mp4 clip is used for testing.
PHONON_MMF_EXPLICITLY_SHOW_VIDEO_WIDGET
In order that the video region does not overwrite the slider and
buttons below it, the parent widget of the VideoObject instance must be
appropriately re-sized. Debugging showed that both the parent and
grandparent of VideoObject are in fact invisible at the time playback
begins, so for now we just manually call show() on the grandparent.
Clearly both of these are only temporary measures, for use until the underlying cause of the problem has been determined.
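For illustration, the two macros might gate code along the following
lines; only the macro names come from this message, while the helper
functions, the CCoeControl lookup and the hard-coded rectangle value are
assumptions.

```cpp
#include <QWidget>
#include <coecntrl.h>  // CCoeControl (Symbian; QWidget::winId() is a CCoeControl*)

// Hypothetical helper gated by PHONON_MMF_HARD_CODE_VIDEO_RECT.
static TRect videoScreenRect(QWidget *videoOutput)
{
#ifdef PHONON_MMF_HARD_CODE_VIDEO_RECT
    Q_UNUSED(videoOutput);
    // Work around the bogus (0,0)-(100,30) control rectangle by hard-coding
    // a region that suits the test clip (placeholder value).
    return TRect(TPoint(0, 0), TSize(320, 240));
#else
    // Intended behaviour: take the rectangle from the control backing the
    // VideoOutput widget.
    CCoeControl *control = videoOutput->winId();
    return TRect(control->PositionRelativeToScreen(), control->Size());
#endif
}

// Hypothetical helper gated by PHONON_MMF_EXPLICITLY_SHOW_VIDEO_WIDGET.
static void explicitlyShowVideoWidget(QWidget *videoOutput)
{
#ifdef PHONON_MMF_EXPLICITLY_SHOW_VIDEO_WIDGET
    // The widget's parent and grandparent are still hidden when playback
    // starts, so force the grandparent visible for now.
    if (QWidget *parent = videoOutput->parentWidget())
        if (QWidget *grandparent = parent->parentWidget())
            grandparent->show();
#else
    Q_UNUSED(videoOutput);
#endif
}
```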
---
Video is still not visible; need to debug the initialization of the
VideoOutput object to determine whether DSA is being aborted.
---
The backend now loads, prepares and plays a clip, but the video is not
visible because it is not yet wired up to a VideoWidget.
Video 'playback' can be tested using the demos/mediaplayer application,
but the menus are not displayed properly, so a video clip filename must
be hardcoded in main.cpp and passed to the MediaPlayer constructor.