Documentation/Maemo 5 Developer Guide/Architecture/Multimedia Domain

[[Image:OMAP_architecture.png|800px|alt=Diagram of top-level multimedia architecture|OMAP multimedia architecture]]
A major challenge for the multimedia framework is to provide the same interfaces as on desktop Linux across different hardware platforms. To achieve this, the whole audio processing is moved to the ARM side (ALSA / PulseAudio) and OpenMAX IL is introduced to abstract the hardware specific components (codecs and other hardware accelerated elements).
==Video subsystem==
[[Image:mmf_video_decomposition.png|frame|center|alt=UML diagram of video subsystem|Video subsystem]]
===Video Codecs===
 
Most of the video encoders and decoders are wrapped under the OpenMAX IL interface [OMAX], which abstracts the codec specific implementation (which in general can run on the ARM or the DSP). All the OpenMAX IL video codecs run on the DSP side in order to best exploit the hardware acceleration provided by the OMAP 3430 platform. The DSP based OpenMAX IL components are loaded by TI's IL-Core, which in turn uses LCML and the DSP-Bridge. The ARM based OpenMAX IL components are loaded via the Bellagio IL-Core. Both IL-Cores are used by the gst-openmax bridge. The GStreamer framework resides on the ARM side.
Video post processing is performed on the DSS screen accelerator. The DSS is used to do the colorspace conversion, the scaling and the composition, including overlays. A separate external graphics accelerator is used to refresh the screen. If needed (in complicated use cases) the scaling and colorspace conversion can be done on the ARM side as well, but that is not recommended as it is not optimized. A/V synchronization is done on the ARM, using an audio clock that is based on information from the audio interface.
The communication between ARM and DSP software is provided by the TI DSP bridge. Any messages or data buffers exchanged between the ARM and DSP go through it. This layer can be regarded as transparent from the Multimedia Architecture point of view and hence it is not described in this document.
===GStreamer===
 
GStreamer is a cross-platform media framework, covering most multimedia application use cases from playback to streaming and imaging. It is a huge collection of objects, interfaces, libraries and plugins. From the application point of view it is just one utility library that can be used by applications to process media streams. The library interface is actually a facade to a versatile collection of dynamic modules that implement the actual functionality. The GStreamer core hides the complexity of timing, synchronization, buffering, threading, streaming and other functionality that is needed to produce a usable media application.
*GStreamer provides good modularity and flexibility. Hence, applications can be built on GStreamer in a short time.
*GStreamer is licensed under the LGPL, which allows the Multimedia Project to combine GStreamer with proprietary software.
====Public interfaces provided by GStreamer====
{| class="wikitable"
|+ Public interfaces provided by GStreamer
|-
! Interface name !! Description
|-
| GStreamer API || Interface for multimedia applications, VoIP, etc.
|-
| <code>playbin2</code> || Recommended high level element for playback.
|-
| <code>uridecodebin</code> || Recommended high level element for decoding.
|-
| <code>tagreadbin</code> || Recommended high level element for fast metadata reading.
|-
| <code>camerabin</code> || Recommended high level element for camera applications.
|}
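As an illustration of the GStreamer API, the following is a minimal sketch of playing back a file with <code>playbin2</code> from a C application. The URI is only a placeholder and error handling is reduced to the bare minimum.

<pre>
#include <gst/gst.h>

int main (int argc, char *argv[])
{
  GstElement *player;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  /* playbin2 builds the whole decoding pipeline automatically */
  player = gst_element_factory_make ("playbin2", "player");
  /* placeholder URI, replace with a real media file */
  g_object_set (player, "uri", "file:///home/user/MyDocs/example.mp3", NULL);

  gst_element_set_state (player, GST_STATE_PLAYING);

  /* block until end-of-stream or an error is posted on the bus */
  bus = gst_element_get_bus (player);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);

  if (msg)
    gst_message_unref (msg);
  gst_object_unref (bus);
  gst_element_set_state (player, GST_STATE_NULL);
  gst_object_unref (player);
  return 0;
}
</pre>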
===OpenMAX IL===
 
OpenMAX is an effort to provide an industry standard for a multimedia API. The standard defines three layers: OpenMAX DL (Development Layer), OpenMAX IL (Integration Layer) and OpenMAX AL (Application Layer). DL is a vendor specific and optional component. IL is the layer that interfaces with IL components (e.g. codecs). We integrate the TI IL-Core for the DSP components and the Bellagio IL-Core for the ARM components. Neither of the cores uses DL.
We use GStreamer’s gomx module to transparently make OpenMAX IL components available to any GStreamer application.
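In practice this means an application can ask the GStreamer registry for an OpenMAX IL based element just like for any other element. The sketch below checks for such an element; the name <code>omx_h264dec</code> is only an assumed example, the actual names depend on the IL-Core configuration.

<pre>
#include <gst/gst.h>

int main (int argc, char *argv[])
{
  GstElementFactory *factory;

  gst_init (&argc, &argv);

  /* OpenMAX IL components wrapped by gst-openmax show up as ordinary
     element factories; "omx_h264dec" is an assumed example name */
  factory = gst_element_factory_find ("omx_h264dec");
  if (factory != NULL) {
    g_print ("%s is available\n", GST_OBJECT_NAME (factory));
    gst_object_unref (factory);
  } else {
    g_print ("element not found in the registry\n");
  }
  return 0;
}
</pre>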
==Imaging subsystem==
 
The multimedia framework also provides support for imaging applications. The subsystem is illustrated below.
[[Image:Imaging_decomposition.png|800px|alt=UML diagram of imaging subsystem|Imaging subsystem]]
===Camera Source===
The <code>GstV4L2CamSrc</code> is a fork of the <code>GstV4l2Src</code> plugin. The reason for the fork is that the original plugin has a lot of extra code for handling various V4L2 devices (such as tuner cards), which made the code quite complex.
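A minimal viewfinder sketch using <code>v4l2camsrc</code> is shown below; the caps and the sink element are assumptions and may need adjusting for the actual hardware.

<pre>
#include <gst/gst.h>

int main (int argc, char *argv[])
{
  GstElement *pipeline;
  GError *error = NULL;

  gst_init (&argc, &argv);

  /* v4l2camsrc delivers raw frames from the camera; the caps and the
     sink used here are example values only */
  pipeline = gst_parse_launch (
      "v4l2camsrc ! video/x-raw-yuv,width=640,height=480 ! xvimagesink",
      &error);
  if (pipeline == NULL) {
    g_printerr ("Could not build pipeline: %s\n", error->message);
    g_error_free (error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}
</pre>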
====GStreamer V4L2 Camera Source====
{| class="wikitable"
|+ Public interface provided by camera daemon
|-
! Interface name !! Description
|-
| V4L2 ioctl || Extensions for V4L2 protocol
|}
==Audio Subsystem==
[[Image:Audio_decomposition.png|frame|center|alt=UML diagram of audio subsystem|audio subsystem]]
{| class="wikitable"
|+ Public interface provided by ALSA library
|-
! Interface name !! Description
|-
: GPL/LGPL
;Packages
: [http://maemo.org/packages/view/pulseaudio/ pulseaudio] [http://maemo.org/packages/view/pulseaudio-utils/ pulseaudio-utils]
====Public interface provided by PulseAudio====
{| class="wikitable"
|+ Public interface provided by PulseAudio
|-
! Interface name !! Description
|-
AEP (Audio Enhancements Package) is a full duplex speech audio enhancement package including echo cancellation, background noise suppression, DRC, AGC, etc.  
Both EAP and AEP are implemented as PulseAudio modules.
;Licence
: Nokia
 
===FMTX Middleware===
FMTX middleware provides a daemon for controlling the [[N900 FM radio transmitter|FM Transmitter]]. The daemon listens to commands from clients via the D-Bus system bus interface. The frequency of the transmitter is controlled via the Video4Linux2 interface. Note that the transmitter must be unmuted before changing the frequency: the device is muted by default, and while it is muted it is not powered. Other settings are controlled by sysfs files in the directory <code>/sys/bus/i2c/devices/2-0063/</code>.
The wire of the headset acts as an antenna, which would boost the FM transmission power over the allowed limits. Therefore the daemon monitors plugged devices and powers the transmitter down if a headset is connected.
GConf keys under <code>/system/fmtx/</code> (see the sketch below):
*Bool enabled
*Int frequency
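A minimal sketch of driving these keys from C with the GConf client library follows. The frequency value and its unit are assumptions; check the key documentation for the exact range.

<pre>
#include <gconf/gconf-client.h>

int main (void)
{
  GConfClient *client;
  GError *error = NULL;

  g_type_init ();
  client = gconf_client_get_default ();

  /* /system/fmtx/frequency is an integer key; 107800 (kHz) is only an
     assumed example value */
  gconf_client_set_int (client, "/system/fmtx/frequency", 107800, &error);
  if (error == NULL)
    gconf_client_set_bool (client, "/system/fmtx/enabled", TRUE, &error);

  if (error != NULL) {
    g_printerr ("FMTX GConf error: %s\n", error->message);
    g_error_free (error);
  }

  g_object_unref (client);
  return 0;
}
</pre>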
;Licence
: Nokia
 
===Audio/Video Synchronization===
MMF delivers parts of the notification mechanism. The subsystem is illustrated below:
[[Image:Notification_decomposition.png|frame|center|alt=UML diagram of notification subsystem|Notification subsystem]]
===Input Event Sounds===
The input event sounds module uses the X server XTest extension (http://www.xfree86.org/current/xtestlib.pdf) to produce input event feedback via libcanberra. The input-sound module is started with the XSession as a separate process.
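For reference, a minimal sketch of producing an event sound through libcanberra is shown below; the event id <code>button-pressed</code> is an assumed example from the sound theme.

<pre>
#include <canberra.h>
#include <unistd.h>

int main (void)
{
  ca_context *ctx = NULL;

  if (ca_context_create (&ctx) < 0)
    return 1;

  /* "button-pressed" is an assumed example event id */
  ca_context_play (ctx, 0,
      CA_PROP_EVENT_ID, "button-pressed",
      CA_PROP_EVENT_DESCRIPTION, "Input event feedback",
      NULL);

  /* playback is asynchronous; wait briefly before tearing down */
  sleep (1);

  ca_context_destroy (ctx);
  return 0;
}
</pre>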
==Metadata Subsystem==
The multimedia framework supports the search engine when indexing media files. The subsystem is illustrated below:
[[Image:Metadata_decomposition.png|frame|center|alt=UML diagram of metadata subsystem|Metadata subsystem]]
It is neither user friendly nor always possible to run several multimedia use cases at the same time. To provide predictable and stable behavior, multimedia components interact with a system-wide policy component to keep concurrent multimedia use cases under control. The subsystem is illustrated below. The policy engine is based on the OHM framework. It is dynamically configurable with scripting and a Prolog-based rule database.
[[Image:Policy_decomposition.png|frame|center|alt=UML diagram of policy subsystem|Policy subsystem]]
;Purpose
: It can be used by media applications to synchronize their playback state. This includes audio playback when the silent profile is active.
;Responsibilities
: Provides an interface for media playback management
;Packages
: [http://maemo.org/packages/view/libplayback-1-0/ libplayback]
;Documentation and example code
: [http://maemo.org/api_refs/5.0/5.0-final/libplayback-1/ Documentation]
: [http://talk.maemo.org/showthread.php?t=67157 Thread on libplayback]
====Public interface provided by libplayback====
{| class="wikitable"
|+ Public interface provided by libplayback
|-
! Interface name !! Description
|-
