[[Image:OMAP_architecture.png|800px|alt=Diagram of top-level multimedia architecture|OMAP multimedia architecture]]

A major challenge for the multimedia framework is to provide the same interfaces as desktop Linux on different hardware platforms. To achieve this, the whole audio processing chain is moved to the ARM side (ALSA/PulseAudio), and OpenMAX IL is introduced to abstract the hardware-specific components (codecs and other hardware-accelerated elements).

==Video subsystem==
[[Image:mmf_video_decomposition.png|frame|center|alt=UML diagram of video subsystem|Video subsystem]]

===Video Codecs===
Most of the video encoders and decoders are wrapped under the OpenMAX IL interface [OMAX], which abstracts the codec-specific implementation (which in general can run on the ARM or the DSP). All OpenMAX IL video codecs run on the DSP side in order to best exploit the hardware acceleration provided by the OMAP 3430 platform. The DSP-based OpenMAX IL components are loaded by TI's IL-Core, which in turn uses LCML and the DSP bridge. The ARM-based OpenMAX IL components are loaded via the Bellagio IL-Core. Both IL-Cores are used by the gst-openmax bridge. The GStreamer framework resides on the ARM side.

Video post-processing is performed on the DSS screen accelerator. The DSS does colorspace conversion, scaling and composition, including overlays. A separate external graphics accelerator is used to refresh the screen. If needed (in complicated use cases), scaling and colorspace conversion can also be done on the ARM side, but this is not recommended because that path is not optimized.

A/V synchronization is done on the ARM, using an audio clock that is based on information from the audio interface.

The communication between the ARM and DSP software is provided by the TI DSP bridge; any messages or data buffers exchanged between the ARM and the DSP go through it. From the multimedia architecture's point of view this layer is transparent, so it is not described in this document.

===GStreamer===
GStreamer is a cross-platform media framework covering most multimedia application use cases, from playback to streaming and imaging. It is a large collection of objects, interfaces, libraries and plugins. From the application's point of view, it is a single utility library that applications can use to process media streams. The library interface is actually a facade to a versatile collection of dynamic modules that implement the actual functionality. The GStreamer core hides the complexity of timing, synchronization, buffering, threading, streaming and the other functionality needed to produce a usable media application.
*GStreamer provides good modularity and flexibility, so applications can be built on it in a short time (see the sketch below).
*GStreamer is licensed under the LGPL, which allows the Multimedia Project to combine it with proprietary software.
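
The following fragment illustrates that modularity with a minimal, self-contained sketch: three independently loaded plugins are linked into one pipeline that plays a short test tone. It assumes the stock <code>audiotestsrc</code>, <code>audioconvert</code> and <code>pulsesink</code> plugins are installed, and builds against the GStreamer 0.10 series shipped with Maemo 5.

<pre>
/* Minimal GStreamer pipeline sketch: three independently loaded
 * plugins linked into one pipeline.  Build on Maemo 5 with:
 *   gcc tone.c -o tone $(pkg-config --cflags --libs gstreamer-0.10)
 */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *src, *conv, *sink;

  gst_init (&argc, &argv);

  pipeline = gst_pipeline_new ("tone");
  src  = gst_element_factory_make ("audiotestsrc", "src");
  conv = gst_element_factory_make ("audioconvert", "conv");
  /* On Maemo, all audio is routed through PulseAudio */
  sink = gst_element_factory_make ("pulsesink", "sink");

  gst_bin_add_many (GST_BIN (pipeline), src, conv, sink, NULL);
  gst_element_link_many (src, conv, sink, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_usleep (3 * G_USEC_PER_SEC);   /* let the tone play for three seconds */

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
</pre>

Replacing any element, for example the sink, requires no change to the rest of the pipeline.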

====Public interfaces provided by GStreamer====
{|
! Interface name !! Description
|-
| GStreamer API || Interface for multimedia applications, VoIP, etc.
|-
| <code>playbin2</code> || Recommended high-level element for playback.
|-
| <code>uridecodebin</code> || Recommended high-level element for decoding.
|-
| <code>tagreadbin</code> || Recommended high-level element for fast metadata reading.
|-
| <code>camerabin</code> || Recommended high-level element for camera applications.
|}
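
As a sketch of how the recommended <code>playbin2</code> element is used, the following fragment plays a single file and waits for the end of the stream or an error. The URI is a placeholder and error handling is kept minimal.

<pre>
/* Playback sketch with playbin2, which assembles the complete
 * decode/render pipeline internally.  The URI below is a placeholder. */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *player;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  player = gst_element_factory_make ("playbin2", "player");
  g_object_set (player, "uri", "file:///home/user/MyDocs/sample.mp3", NULL);

  gst_element_set_state (player, GST_STATE_PLAYING);

  /* Block until end-of-stream or an error is posted on the bus */
  bus = gst_element_get_bus (player);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);
  gst_element_set_state (player, GST_STATE_NULL);
  gst_object_unref (player);
  return 0;
}
</pre>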

===OpenMAX IL===
OpenMAX is an effort to provide an industry standard for a multimedia API. The standard defines three layers: OpenMAX DL (Development Layer), OpenMAX IL (Integration Layer) and OpenMAX AL (Application Layer). DL is a vendor-specific and optional component. IL is the layer that interfaces with the IL components (e.g. codecs). We integrate the TI IL-Core for the DSP components and the Bellagio IL-Core for the ARM components; neither of the cores uses DL.

We use GStreamer's gomx module to transparently make OpenMAX IL components available to any GStreamer application.
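
Because gomx registers the IL components as ordinary GStreamer elements, an application obtains them through the normal factory mechanism with no OpenMAX-specific code. In the sketch below, the element name <code>omx_mp3dec</code> is purely illustrative; the names actually registered depend on the gst-openmax configuration of the device.

<pre>
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *dec;

  gst_init (&argc, &argv);

  /* "omx_mp3dec" is a hypothetical name; the element names actually
   * registered are defined by the gst-openmax configuration. */
  dec = gst_element_factory_make ("omx_mp3dec", "decoder");
  if (dec == NULL)
    g_print ("requested OpenMAX IL decoder is not available\n");
  else
    gst_object_unref (dec);

  return 0;
}
</pre>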

==Imaging subsystem==
The multimedia framework also provides support for imaging applications. The subsystem is illustrated below.

[[Image:Imaging_decomposition.png|800px|alt=UML diagram of imaging subsystem|Imaging subsystem]]

===Camera Source===
<code>GstV4L2CamSrc</code> is a fork of the <code>GstV4l2Src</code> plugin. The reason for the fork is that the original plugin contains a lot of extra code for handling various V4L2 devices (such as tuner cards), which made the code quite complex.
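
As a rough sketch of how such a source is used, the following fragment builds a viewfinder-style pipeline with <code>gst_parse_launch()</code>. The element name <code>v4l2camsrc</code>, the caps and the sink are assumptions made for illustration; applications are normally expected to use the <code>camerabin</code> element listed above.

<pre>
/* Hypothetical viewfinder sketch; element names and caps are
 * illustrative, not a definitive Maemo camera pipeline. */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GError *error = NULL;

  gst_init (&argc, &argv);

  pipeline = gst_parse_launch (
      "v4l2camsrc ! video/x-raw-yuv,width=640,height=480 ! xvimagesink",
      &error);
  if (pipeline == NULL) {
    g_print ("pipeline could not be built: %s\n", error->message);
    g_error_free (error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_usleep (5 * G_USEC_PER_SEC);   /* show the viewfinder briefly */

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
</pre>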

====GStreamer V4L2 Camera Source====
{|
! Interface name !! Description
|-
| V4L2 ioctl || Extensions to the V4L2 protocol
|}

==Audio Subsystem==
[[Image:Audio_decomposition.png|frame|center|alt=UML diagram of audio subsystem|Audio subsystem]]
MMF delivers parts of the notification mechanism. The subsystem is illustrated below:

[[Image:Notification_decomposition.png|frame|center|alt=UML diagram of notification subsystem|Notification subsystem]]
The multimedia framework supports the search engine when indexing media files. The subsystem is illustrated below:

[[Image:Metadata_decomposition.png|frame|center|alt=UML diagram of metadata subsystem|Metadata subsystem]]
It is neither user friendly nor always possible to run several multimedia use cases at the same time. To provide predictable and stable behavior, multimedia components interact with a system-wide policy component that keeps concurrent multimedia use cases under control. The subsystem is illustrated below. The policy engine is based on the OHM framework; it is dynamically configurable through scripting and a Prolog-based rule database.

[[Image:Policy_decomposition.png|frame|center|alt=UML diagram of policy subsystem|Policy subsystem]]
