The Phonon API does not provide any way for the client to specify
which network connection should be used for streaming playback.
If the application already has a connection open on a bearer other
than the default (e.g. WiFi when the device default is GPRS), it may
be desirable to use that connection for streaming, rather than
allowing the Phonon backend to open a second connection on the
default bearer.
This patch adds a custom property on the Phonon::MediaObject,
called InternetAccessPointName. The client can specify the IAP
which Phonon should use by setting this property.
Note that support for this property is only provided in the Phonon
MMF backend.
Task-number: QTBUG-11436
Reviewed-by: Gareth Stockwell
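As an illustration, a client that wants to reuse an already-open WLAN
connection might do something like the sketch below. The property name
comes from this change; the access point name used here, and the
assumption that the value is passed as a QString, are illustrative.

    #include <phonon/audiooutput.h>
    #include <phonon/mediaobject.h>
    #include <phonon/mediasource.h>
    #include <phonon/path.h>
    #include <QUrl>

    Phonon::MediaObject *createStreamingPlayer(QObject *parent)
    {
        Phonon::MediaObject *media = new Phonon::MediaObject(parent);
        Phonon::AudioOutput *output =
                new Phonon::AudioOutput(Phonon::MusicCategory, parent);
        Phonon::createPath(media, output);

        // Ask the MMF backend to stream over a specific IAP rather than the
        // device default ("My WLAN" is a hypothetical access point name).
        media->setProperty("InternetAccessPointName",
                           QString::fromLatin1("My WLAN"));

        media->setCurrentSource(
                Phonon::MediaSource(QUrl("rtsp://example.com/stream.3gp")));
        return media;
    }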
This commit integrates the Download class with the media playback
classes in the backend, to implement Progressive Download.
Note that this PDL implementation has one drawback: when video
playback is paused due to shortage of data (i.e. due to the download
being temporarily stalled), the display goes black. This is because,
when the end of the currently-downloaded data is reached, the
playback session is closed. When more data becomes available, the
clip is re-opened, a seek is done to reach the previous playback
position, and playback is re-started. Closing the playback session
closes the video stack's connection to the display, thereby causing
the video widget to go black while more data is buffered.
This is a consequence of the level in the native video stack at which
the Phonon integration is done: managing a network stall without
requiring the playback session to be closed would require integration
below the MMF client API, specifically at the MMF controller level.
Task-number: QTBUG-10769
Reviewed-by: Derick Hawcroft
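For illustration, the close / re-open / seek / resume cycle described
above might look roughly like this in the backend. The member and
function names (iPlayer, iResumePosition, the download callbacks) are
assumptions, not the actual implementation; the Mvpuo* callbacks are
the standard MVideoPlayerUtilityObserver ones.

    #include <videoplayer.h>   // CVideoPlayerUtility, observer interfaces

    void VideoPlayer::downloadStalled()
    {
        // Record where playback got to, then tear the session down. Closing
        // the session also drops the video stack's connection to the display,
        // which is why the video widget goes black at this point.
        TRAP_IGNORE(iResumePosition = iPlayer->PositionL());
        iPlayer->Close();
    }

    void VideoPlayer::downloadResumed(const TDesC &aFileName)
    {
        // Re-open the partially downloaded clip; completion is reported via
        // MvpuoOpenComplete(), after which Prepare() is called.
        TRAP_IGNORE(iPlayer->OpenFileL(aFileName));
    }

    void VideoPlayer::MvpuoOpenComplete(TInt aError)
    {
        if (aError == KErrNone)
            iPlayer->Prepare();
    }

    void VideoPlayer::MvpuoPrepareComplete(TInt aError)
    {
        if (aError == KErrNone) {
            // Seek back to where playback stopped and carry on.
            TRAP_IGNORE(iPlayer->SetPositionL(iResumePosition));
            iPlayer->Play();
        }
    }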
The backend accesses the resource file path via MediaSource::url().
A small patch to Phonon was required to enable this, because by
default, Phonon passes a QIODevice, rather than the resource file
path, to the backend.
The backend uses this path to create a QResource object, which gives
access to the in-memory buffer into which the resource file has been
read. This buffer is wrapped in a Symbian 8-bit descriptor and passed
to the OpenDesL() function of the appropriate MMF client utility API.
Playback only works for certain file formats, as the Symbian MIME type
recognizer does not always work. For example, playback of an audio
WAV resource file works, while playback of an MP3 resource file does
not.
Task-number: QTBUG-6562
Reviewed-by: Justin McPherson
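A minimal sketch of that mechanism, assuming the player utility is a
CMdaAudioPlayerUtility held in an iPlayer member; the surrounding class
and the buffer-lifetime handling are simplified.

    #include <QResource>
    #include <mdaaudiosampleplayer.h>   // CMdaAudioPlayerUtility

    void AudioPlayer::openResourceFileL(const QString &path)
    {
        // 'path' comes from MediaSource::url(), e.g. ":/sounds/click.wav"
        // (hypothetical resource). QResource exposes the in-memory data
        // that the resource system has already loaded (assumed to be
        // stored uncompressed here).
        QResource resource(path);

        // Wrap the read-only buffer in an 8-bit descriptor. In real code the
        // buffer must remain valid for the lifetime of the playback session,
        // so the QResource (or a copy of the data) would be kept as a member.
        TPtrC8 buffer(resource.data(), resource.size());

        // Hand the descriptor to the MMF client utility.
        iPlayer->OpenDesL(buffer);
    }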
In S60 3.2, CMdaAudioPlayerUtility::SetVolume(TInt) was changed from
having a void return type to returning TInt. The code in the Phonon
MMF backend which calls this function uses a runtime platform version
check to ensure that only on S60 3.2 and above is the return value
from SetVolume treated as valid.
This check was previously testing for the wrong platform version (5.0
rather than 3.2).
Reviewed-by: Shane Kearns
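A sketch of the kind of check involved, assuming the backend is built
against an SDK in which SetVolume() returns TInt; the member names and
the error-reporting helper are illustrative.

    #include <QSysInfo>
    #include <mdaaudiosampleplayer.h>   // CMdaAudioPlayerUtility

    void AudioPlayer::applyVolume(TInt aVolume)
    {
        if (QSysInfo::s60Version() >= QSysInfo::SV_S60_3_2) {
            // From S60 3.2 onwards SetVolume() reports failure via its
            // return value, so propagate any error.
            const TInt err = iPlayer->SetVolume(aVolume);
            if (err != KErrNone)
                setError(err);   // hypothetical error-reporting helper
        } else {
            // On S60 3.1 the function was declared as returning void, so
            // whatever comes back in the return register is meaningless
            // and must be ignored.
            iPlayer->SetVolume(aVolume);
        }
    }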
Previously, the MMF backend simply swallowed a call to pause() when in
StoppedState. However, the stopToPause step in tst_mediaobject requires
the backend to emit a stateChanged signal when this happens.
Reviewed-by: Frans Englich
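A sketch of the new behaviour; state(), changeState() and doPause() are
assumptions about the backend's internals, not the actual code.

    #include <phonon/phononnamespace.h>   // Phonon::State

    void AbstractMediaPlayer::pause()
    {
        if (state() == Phonon::StoppedState) {
            // Previously this call was silently swallowed. The test expects
            // a stateChanged(PausedState, StoppedState) emission, so record
            // the transition even though no MMF call is needed.
            changeState(Phonon::PausedState);
            return;
        }
        doPause();   // hypothetical hook implemented by the concrete players
    }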
Task-number: QTBUG-4663
Reviewed-by: Frans Englich
The main changes are:
1. MediaObject emits prefinishMark at the appropriate instant
2. MediaObject emits aboutToFinish at the appropriate instant
3. MediaObject switches to next source when playback completes
Task-number: QTBUG-6214
Reviewed-by: Frans Englich
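A sketch of how the two marks might be checked whenever the playback
position is updated; the member flags and the window used for
aboutToFinish are assumptions.

    #include <QObject>   // emit macro, qint32/qint64

    void MediaObject::checkMarks(qint64 positionMs)
    {
        const qint64 remainingMs = totalTime() - positionMs;

        // prefinishMark() is the number of milliseconds before the end at
        // which the client asked to be notified.
        if (!m_prefinishMarkSent && prefinishMark() > 0
                && remainingMs <= prefinishMark()) {
            m_prefinishMarkSent = true;
            emit prefinishMarkReached(static_cast<qint32>(remainingMs));
        }

        // aboutToFinish() gives the client a chance to queue the next source
        // before playback completes (the window size here is illustrative).
        const qint64 aboutToFinishWindowMs = 2000;
        if (!m_aboutToFinishSent && remainingMs <= aboutToFinishWindowMs) {
            m_aboutToFinishSent = true;
            emit aboutToFinish();
        }
    }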
When clips are buffering (either at the start of playback, or
during playback, when buffer levels drop due to e.g. CPU, file system
or network load), the backend receives notification from the MMF.
While buffering is ongoing, the backend periodically queries the
filling status and emits a signal.
Task-number: QTBUG-4660
Reviewed-by: Frans Englich
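For the video case, the polling might look roughly like the sketch
below; the timer wiring, member names and the bufferStatus signal hookup
are assumptions, while GetVideoLoadingProgressL() is the
CVideoPlayerUtility query for the current fill level.

    #include <QObject>
    #include <videoplayer.h>   // CVideoPlayerUtility

    // Called from a timer while the MMF has reported, via
    // MVideoLoadingObserver, that loading/buffering is in progress.
    void VideoPlayer::bufferStatusTick()
    {
        TInt percentFilled = 0;
        TRAPD(err, iPlayer->GetVideoLoadingProgressL(percentFilled));
        if (err == KErrNone)
            emit bufferStatus(percentFilled);   // forwarded to the MediaObject
    }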
Because the MIME type of the stream cannot always be deduced from the
URL, we assume that it is a video stream. This relies on the
assumption that the video controllers are capable of parsing the
container formats of audio-only clips as well as video clips. Note
that this assumption may not hold on all devices.
Note that most implementations of the MMF client APIs do not support
HTTP streaming (a.k.a. progressive download). The backend has therefore
only been tested with RTSP streams - see the JIRA entry for further
details.
Task-number: QTBUG-4660
Reviewed-by: Frans Englich
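As an illustration of that assumption, a URL-based source would be
handed straight to the video player utility, roughly as sketched here;
the member names and the QString-to-descriptor conversion are
illustrative.

    #include <QUrl>
    #include <videoplayer.h>   // CVideoPlayerUtility

    void MediaObject::openUrlL(const QUrl &url)
    {
        const QString urlString = url.toString();
        const TPtrC urlPtr(reinterpret_cast<const TUint16 *>(urlString.utf16()),
                           urlString.length());

        // No MIME type is supplied: the selected video controller is expected
        // to recognise the container format itself, even for audio-only
        // streams such as rtsp://example.com/radio.3gp (hypothetical URL).
        iVideoPlayer->OpenUrlL(urlPtr);
    }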
Reviewed-by: Frans Englich
Note that changing Utils from a namespace into a class, and using
Q_DECLARE_TR_FUNCTIONS in the class declaration, was necessary so
that tr(...) could be called from the implementation of
Utils::symbianErrorToString.
Task-number: QTBUG-4994
Reviewed-by: Oswald Buddenhagen
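A sketch of the pattern, with an illustrative (not actual) body for
symbianErrorToString:

    #include <QCoreApplication>   // Q_DECLARE_TR_FUNCTIONS
    #include <QString>
    #include <e32std.h>

    class Utils
    {
        Q_DECLARE_TR_FUNCTIONS(Utils)

    public:
        static QString symbianErrorToString(TInt errorCode);
    };

    QString Utils::symbianErrorToString(TInt errorCode)
    {
        // tr() is available here thanks to Q_DECLARE_TR_FUNCTIONS above,
        // even though Utils does not derive from QObject.
        switch (errorCode) {
        case KErrNotFound:
            return tr("Not found");
        case KErrNoMemory:
            return tr("Out of memory");
        default:
            return tr("Unknown error");
        }
    }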
Task-number: QTBUG-4662
Reviewed-by: Frans Englich
"*.pro"`
Reviewed-by: Gareth Stockwell
Conflicts:
src/gui/kernel/qwidget_s60.cpp
This is part of an attempt to get the draggablevideo test app working
on Symbian. Use of the dummy video output object causes a top-level
window to be created, which was suspected of being the reason why
video is not visible. This is not the case: for some reason, when the
VideoOutput window is activated, it is still marked as 'hidden' by the
window server (CWsClientWindow::ResetHiddenFlag, triggered from
CCoeControl::ActivateL).
Addresses review comment.
This change was previously present, but apparently disappeared as
part of the git work.
Addresses review comment.
Addresses review comment.
This extends the framework to handle audio effects. It largely affects
how the audio chain is set up, connected and disconnected; the Backend
has therefore been refactored slightly and the class MediaNode
introduced (see its documentation).
In addition, two effects have been written: BassBoost and AudioEqualizer.
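From the client side, the new effects would be reachable through the
standard Phonon effect API, roughly as follows; looking the effect up by
the name "BassBoost" is an assumption about how the backend advertises it.

    #include <phonon/backendcapabilities.h>
    #include <phonon/effect.h>
    #include <phonon/objectdescription.h>
    #include <phonon/path.h>

    Phonon::Effect *insertBassBoost(Phonon::Path &path, QObject *parent)
    {
        // Ask the backend which audio effects it offers, and insert the bass
        // boost (if present) into an existing MediaObject -> AudioOutput path.
        foreach (const Phonon::EffectDescription &description,
                 Phonon::BackendCapabilities::availableAudioEffects()) {
            if (description.name() == QLatin1String("BassBoost"))
                return path.insertEffect(description, parent);
        }
        return 0;   // the backend does not offer this effect
    }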
Video is still not visible; need to debug the initialization of the
VideoOutput object to determine whether DSA is being aborted.
The backend now loads, prepares and plays a clip, but the video is not
visible because it is not yet wired up to a VideoWidget.
Video 'playback' can be tested using the demos/mediaplayer application,
but since the menus are not displayed properly, a video clip filename
must be hardcoded in main.cpp and passed to the MediaPlayer constructor.
and AudioOutput
AbstractMediaPlayer
Audio playback is now working to the same extent as prior to the
abstraction, with one regression: the initial volume level in the UI is
set to zero, although playback is audible.
Some cleanup is required:
- Functionality common to AudioPlayer and VideoPlayer (e.g. the tick
  timer and the changeState function) should be moved into
  AbstractPlayer.
- Files may be opened by multiple instances of MediaObject at a time.
  For example, the musicplayer example app uses one instance to read
  file metadata and another for the actual playback. To avoid KErrInUse
  errors from the file server, files must be opened with an EShare*
  flag and passed around by handle (see the sketch below). At present
  this is done in a slightly hacky way (i.e. AbstractPlayer::setSource
  is renamed to setFileSource).
- The pointer held by MediaObject::m_player must be checked for
  nullness in many of the public API calls. This could be made cleaner
  by implementing a stub derivation of AbstractPlayer which returns
  sensible default values. Note that, if functionality such as tick
  timer handling is going to be pushed upwards from AudioPlayer /
  VideoPlayer, we should add an intermediate class to the hierarchy so
  that the overhead of constructing DummyPlayer objects is minimised.
At present, the media type (audio / video) is only recognised for
file-based sources; this needs to be extended to cover HTTP streaming
as well.
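The shared-open scheme mentioned in the second list item might look
something like this; the member names, and the hook that accepts an
RFile handle, are assumptions.

    #include <f32file.h>

    void AbstractPlayer::openSharedFileL(RFs &aFs, const TDesC &aFileName)
    {
        // EFileShareReadersOrWriters lets another MediaObject instance (for
        // example one that only reads metadata) open the same file without
        // the file server returning KErrInUse. iFile is assumed to be an
        // RFile member kept open for the lifetime of the playback session.
        User::LeaveIfError(iFile.Open(aFs, aFileName,
                                      EFileRead | EFileShareReadersOrWriters));

        // The MMF client utilities can be given the open handle (e.g. the
        // OpenFileL(const RFile &) overload), so the handle rather than the
        // file name is what gets passed between objects. iPlayer stands in
        // for the audio or video player utility.
        iPlayer->OpenFileL(iFile);
    }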
As per discussions with Gareth.