Commit message

being moved.
Because QWidget::moveEvent is only called when a widget moves relative
to its parent, a widget's absolute screen position may change without it
receiving a moveEvent (for example, as a result of its parent being
moved).
The MMF video playback API on Symbian v9.4 requires, in addition to a
window handle, an absolute screen rectangle. Changes in the video
widget's absolute screen position therefore need to be propagated
into the MMF.
This change introduces a new object, AncestorMoveMonitor, which
installs an event filter on the QCoreApplication instance. A
VideoOutput object registers with the AncestorMoveMonitor, which
listens on its behalf for MoveEvents and ParentChangeEvents directed
at any of the ancestors of the VideoOutput. MoveEvents trigger a
callback to the VideoOutput instance, which then notifies the MMF of
the new screen rectangle. ParentChangeEvents cause the
AncestorMoveMonitor to update the ancestor list associated with the
target VideoOutput instance.
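For illustration, a minimal sketch of this pattern using only public Qt
API; the class below is hypothetical and is not the backend's actual
AncestorMoveMonitor:

    // Hypothetical sketch of the ancestor-monitoring idea; not the actual
    // AncestorMoveMonitor. It watches Move / ParentChange events for a widget
    // and all of its ancestors via an application-wide event filter.
    #include <QCoreApplication>
    #include <QEvent>
    #include <QList>
    #include <QRect>
    #include <QWidget>

    class AncestorMoveWatcher : public QObject
    {
    public:
        explicit AncestorMoveWatcher(QWidget *target)
            : QObject(target), m_target(target)
        {
            rebuildAncestorList();
            QCoreApplication::instance()->installEventFilter(this);
        }

    protected:
        bool eventFilter(QObject *watched, QEvent *event)
        {
            QWidget *widget = qobject_cast<QWidget *>(watched);
            if (widget && m_ancestors.contains(widget)) {
                if (event->type() == QEvent::Move)
                    screenRectChanged();     // absolute position may have changed
                else if (event->type() == QEvent::ParentChange)
                    rebuildAncestorList();   // ancestor chain is now different
            }
            return QObject::eventFilter(watched, event);
        }

    private:
        void rebuildAncestorList()
        {
            m_ancestors.clear();
            for (QWidget *w = m_target; w; w = w->parentWidget())
                m_ancestors.append(w);
        }

        void screenRectChanged()
        {
            // New absolute screen rectangle; the real backend would hand
            // this to the MMF via the video output node.
            const QRect rect(m_target->mapToGlobal(QPoint(0, 0)), m_target->size());
            Q_UNUSED(rect);
        }

        QWidget *m_target;
        QList<QWidget *> m_ancestors;
    };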
The video position now tracks that of the associated widget, but there
are two problems which require further investigation:
1. The video window lags behind. This may be an unavoidable
consequence of the fact that setting a new screen rectangle causes the
MMF to tear down its DSA session and start a new one; this is known to
block the window server and take some time to complete.
2. Artifacts are visible around the edges of the moving video widget.
Task-number: QTBUG-4787
Reviewed-by: Frans Englich
The MediaPlayer requires that an output device is available.
Task-number: QTBUG-4755
Reviewed-by: Gareth Stockwell
Addresses review comment.
Previously, MediaObject propagation was done only for effects; now it
is done for all kinds of nodes. This is needed for AudioOutput.
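Roughly, the idea looks like the following; the names are illustrative
only, since the backend's real MediaNode interface is not part of this
message:

    // Illustrative only: propagate the owning MediaObject to every
    // connected node in the graph, not just to effects.
    #include <QList>

    class MediaObject;

    class MediaNode
    {
    public:
        MediaNode() : m_mediaObject(0) {}
        virtual ~MediaNode() {}

        // Called on the source node; every downstream node (effects, audio
        // outputs, video outputs, ...) now receives the MediaObject pointer.
        virtual void setMediaObject(MediaObject *mediaObject)
        {
            m_mediaObject = mediaObject;
            foreach (MediaNode *output, m_outputs)
                output->setMediaObject(mediaObject);
        }

    protected:
        MediaObject *m_mediaObject;
        QList<MediaNode *> m_outputs;
    };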
This extends the framework to handle audio effects, which largely
affects how the audio chain is set up, connected and disconnected. The
Backend has therefore been refactored slightly, and the class MediaNode
has been introduced; see its documentation.
In addition, two effects have been written: BassBoost and AudioEqualizer.
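For context, this is roughly how an application exercises such a chain
through the public Phonon API; the lookup of a bass-boost effect by
name is illustrative and depends on what the backend advertises:

    // Sketch of inserting an audio effect into a Phonon playback path.
    #include <phonon/audiooutput.h>
    #include <phonon/backendcapabilities.h>
    #include <phonon/mediaobject.h>
    #include <phonon/objectdescription.h>
    #include <phonon/path.h>
    #include <QString>

    Phonon::Path createChainWithBassBoost(Phonon::MediaObject *media)
    {
        Phonon::AudioOutput *output =
            new Phonon::AudioOutput(Phonon::MusicCategory, media);
        Phonon::Path path = Phonon::createPath(media, output);

        // Pick a bass-boost effect from those the backend reports, if any.
        foreach (const Phonon::EffectDescription &description,
                 Phonon::BackendCapabilities::availableAudioEffects()) {
            if (description.name().contains(QLatin1String("BassBoost"))) {
                path.insertEffect(description);  // spliced into the audio chain
                break;
            }
        }
        return path;
    }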
Video is still not visible; need to debug the initialization of the
VideoOutput object to determine whether DSA is being aborted.
Now loads, prepares and plays a clip, but the video is not visible
because it's not yet wired up to a VideoWidget.
Video 'playback' can be tested using the demos/mediaplayer application,
but the menus are not displayed properly, so a video clip filename must
be hardcoded in main.cpp and passed to the MediaPlayer constructor.
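For a quick standalone test, something along these lines should also
work; the clip path below is just a placeholder:

    // Minimal Phonon playback test; the clip path is a placeholder.
    #include <QApplication>
    #include <phonon/audiooutput.h>
    #include <phonon/mediaobject.h>
    #include <phonon/mediasource.h>
    #include <phonon/path.h>
    #include <phonon/videowidget.h>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);

        Phonon::MediaObject media;
        Phonon::VideoWidget videoWidget;
        Phonon::AudioOutput audioOutput(Phonon::VideoCategory);

        // Route the media to both video and audio sinks.
        Phonon::createPath(&media, &videoWidget);
        Phonon::createPath(&media, &audioOutput);

        media.setCurrentSource(Phonon::MediaSource(QLatin1String("c:\\data\\clip.3gp")));
        videoWidget.show();
        media.play();

        return app.exec();
    }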
and AudioOutput
Audio playback now working to the same extent as prior to the
abstraction, with one regression: the initial volume level in the UI is
set to zero, although playback is audible.
Some cleanup is required:
- Functionality common to AudioPlayer and VideoPlayer (e.g. tick timer,
changeState function) should be moved into AbstractPlayer.
- Files may be opened by multiple instances of MediaObject at a time.
For example, the musicplayer example app uses one instance to read file
metadata, and one for the actual playback. In order to avoid KErrInUse
errors from the file server, files must be opened with an EShare* flag
and passed around by handle. At present this is done in a slightly
hacky way (i.e. AbstractPlayer::setSource is renamed to setFileSource).
- The pointer held by MediaObject::m_player must be checked for
nullness in many of the public API calls. This could be made cleaner by
implementing a stub derivation of AbstractPlayer which returns sensible
default values (see the sketch below). Note that, if functionality such
as tick timer handling is going to be pushed upwards from AudioPlayer /
VideoPlayer, we should add an intermediate class to the hierarchy so
that the overhead of constructing DummyPlayer objects is minimised.
At present, media type (audio / video) is only recognised from file
streams - this needs to be extended to include HTTP streaming as well.
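The stub mentioned above could look roughly like this; the interface
shown is invented for illustration and is not the backend's actual
AbstractPlayer:

    // Illustrative null-object stub; the real AbstractPlayer interface in
    // the backend will differ.
    #include <QtGlobal>

    class AbstractPlayer
    {
    public:
        virtual ~AbstractPlayer() {}
        virtual void play() = 0;
        virtual void pause() = 0;
        virtual void stop() = 0;
        virtual qint64 totalTime() const = 0;
        virtual qreal volume() const = 0;
    };

    // Held by MediaObject::m_player until a real AudioPlayer or VideoPlayer
    // is created, so the public API calls need no null checks.
    class DummyPlayer : public AbstractPlayer
    {
    public:
        void play() {}
        void pause() {}
        void stop() {}
        qint64 totalTime() const { return 0; }
        qreal volume() const { return 1.0; }
    };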
The patch originally contained all changes made to MMF Phonon, but
this commit contains only the changes made by Gareth.