
Media processing meets portable content

Posted: 02 Aug 2004

Keywords: media, multimedia, portable, streaming, API

The proliferation of consumer A/V devices and streaming-media types has kept chipmakers and systems houses scrambling to find the best ways to transfer content and control functions between devices. Though the number of devices and functions may seem to be spiraling out of control, efforts to standardize at the middleware level will go a long way toward untangling the programming task for designers.

Formats, functionality and features come in almost unlimited variety. In addition to JPEG, MPEG-1 to MPEG-4, MP3 and high-definition standards, devices should be able to handle legacy analog formats, such as NTSC and PAL.

Systems - that is, the chips and the software they contain - also have to accommodate at least two aspect ratios, multiple resolutions and both progressive and interlaced scan modes. As if this were not enough, multiple display types must be addressed.

To fully appreciate the magnitude of the task of supporting all this diversity, consider just three of the many operations involved when a codec transfers content from a DVD disc to a personal video recorder (PVR), sketched in code after this list:

• Configuration - The codec must be configured to use the correct A/V formats, encoding parameters and stream type.

• Control - The codec has to know when to start and stop, where to get its bit stream and where to send its output.

• Operation - The codec has to encode or decode the actual A/V data in the correct formats.
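
As an illustration, here is a minimal C sketch of how those three operations might surface in a codec interface. All of the names (codec_t, codec_configure and so on) are invented for this example; they are not taken from any shipping API.

#include <stddef.h>

/* Hypothetical codec interface illustrating the three operations.
 * Every name here is invented for this sketch.                     */
typedef enum { FMT_MPEG2, FMT_MPEG4, FMT_MP3 } av_format_t;

typedef struct {
    av_format_t format;      /* A/V format to encode or decode      */
    int         bitrate;     /* target bit rate, bits per second    */
    int         interlaced;  /* 1 = interlaced, 0 = progressive     */
} codec_config_t;

typedef struct codec codec_t;   /* opaque handle to a codec instance */

/* Configuration: select formats, encoding parameters, stream type.  */
int codec_configure(codec_t *c, const codec_config_t *cfg);

/* Control: bind the input/output streams, then start or stop.       */
int codec_set_streams(codec_t *c, int in_fd, int out_fd);
int codec_start(codec_t *c);
int codec_stop(codec_t *c);

/* Operation: push compressed or raw A/V data through the codec.     */
int codec_process(codec_t *c, const void *in, size_t in_len,
                  void *out, size_t out_len);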

The net result is that all those content types, and the constantly escalating speeds required to process them, cannot be addressed effectively by hardware or software alone. Attention must be paid to what used to be called the hardware/software interface.

Multimedia applications cannot tolerate disruptions. The system must be capable of receiving, decoding, converting and displaying multiple data streams simultaneously, each with a different data format, including MPEG-2, NTSC, PAL and audio.

From a system perspective, the architecture for handling streaming media usually includes a control processor and (multiple) DSPs for the A/V-intensive tasks; a unified memory architecture tuned to the needs of streaming media; multiple internal buses, each servicing specific hardware accelerators and processors; and a software architecture that integrates these components into a working system.

In such a media-processing platform, the control processor runs the OS, graphics and applications software, while the DSPs run the RTOS and handle streaming-media processing. These processors are part of a single integrated system, sharing a unified memory, which allows them to swap tasks to balance computing loads and also yields savings in memory costs.
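
To make the idea concrete, a task handed between the control processor and a DSP might be described by a small record in the shared memory. The structure below is a simplified, hypothetical sketch; the field names are illustrative and are not drawn from any specific platform's documentation.

#include <stdint.h>

/* Hypothetical shared-memory task descriptor (illustrative only).  */
typedef enum { PROC_ANY, PROC_CPU, PROC_DSP } affinity_t;

typedef struct {
    uint32_t   task_id;     /* unique ID of the media task             */
    affinity_t affinity;    /* which processor may claim the task      */
    uint32_t   priority;    /* scheduling priority                     */
    uint32_t   buf_offset;  /* offset of the A/V buffer in unified RAM */
    uint32_t   buf_len;     /* buffer length in bytes                  */
} media_task_t;

Because the memory is unified, moving a task from one processor to another means updating the descriptor, not copying the A/V buffer it points to - which is where the memory-cost savings come from.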

The backbone of the bus architecture is a point-to-point memory bus that connects external SDRAM with the hardware platform's peripherals for high-throughput, low-latency DMA.
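
A common way to feed such a bus is with chained DMA descriptors, so the engine can stream an entire frame without per-block processor intervention. The layout below is a generic, hypothetical example, not the actual hardware definition.

#include <stdint.h>

/* Hypothetical chained DMA descriptor for moving A/V data between
 * SDRAM and a peripheral; the layout is illustrative only.         */
typedef struct dma_desc {
    uint32_t         src;    /* source address in SDRAM              */
    uint32_t         dst;    /* destination (e.g. peripheral FIFO)   */
    uint32_t         len;    /* transfer length in bytes             */
    struct dma_desc *next;   /* next descriptor; NULL ends the chain */
} dma_desc_t;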

The primary function of the software architecture is to support the hardware with multimedia libraries that consist of components that perform most of the A/V stream processing, including digitizing, processing and rendering.
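
One plausible shape for such a library component is a small table of entry points that every stage (digitizer, processor, renderer and so on) implements; a pipeline is then an ordered list of these components, each feeding its output buffer to the next. The interface below is a hypothetical sketch, not the actual library definition.

/* Hypothetical stream-component interface; all names are invented. */
typedef struct av_buffer {
    void    *data;           /* pointer to the A/V payload           */
    unsigned len;            /* payload length in bytes              */
} av_buffer_t;

typedef struct av_component {
    int (*open)(struct av_component *self);
    int (*process)(struct av_component *self,
                   const av_buffer_t *in, av_buffer_t *out);
    int (*close)(struct av_component *self);
} av_component_t;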

Middleware needed

The value-added software in today's consumer A/V systems is a significant component of the system's value, and its importance will grow over time. Multiplied across the PVRs, DVD players and other A/V devices on the market, however, the number of software variants is spiraling out of control.

Each of these A/V devices requires different middleware, primarily to implement a system specification that provides interoperability among the devices that support it. Examples of consumer A/V middleware include the Multimedia Home Platform (MHP), the DVD navigator and the OpenCable Application Platform (OCAP).

Middleware has solved one problem but created another: cost-effective porting. An industry-wide standard for accessing the A/V functions in a device could resolve this challenge, offering value across the board, from semiconductor vendors to systems houses to software developers. One such standardization initiative is the Universal Home API (UHAPI).

UHAPI will be implemented through an A/V software layer tailored to optimize performance and leverage the architectural and functional features of each compliant IC. The software layer abstracts the API to the task level, enabling easy porting of middleware common in DTV applications. Built-in advanced functionality shields the programmer from having to hand-code the complex steps associated with real-time and streaming applications.
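
The published UHAPI definitions are not reproduced here; the sketch below only illustrates what task-level calls in the spirit of such a layer could look like, with invented names.

/* Hypothetical task-level calls in the spirit of a UHAPI-style
 * layer; these are not the published UHAPI definitions.          */
typedef struct av_task av_task_t;   /* opaque handle to a media task */

av_task_t *task_create(const char *name);   /* e.g. "mpeg2_decode"    */
int        task_connect(av_task_t *from, av_task_t *to);
int        task_start(av_task_t *t);        /* real-time and streaming */
int        task_stop(av_task_t *t);         /* details handled beneath */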

Introducing the concept of a "use case" gives the API even greater value for application developers, who can interface the middleware of a specific system with that of any other API-compliant device simply by selecting a use case. A connection manager then automatically sets up the media-processing tasks and handles the administrative steps associated with them, such as priority setting and synchronization.

Today, application developers must write the complex code that connects A/V components. With such an API, that work would be reduced to little more than a function call, and developers could write application code confidently, without worrying about hardware or middleware complexities.
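
A minimal C sketch of what that reduction might look like, with stub functions standing in for the connection manager; every name here is hypothetical.

#include <stdio.h>
#include <string.h>

/* Hypothetical use-case handle; a real connection manager would
 * instantiate and wire up the media tasks behind these calls.    */
typedef struct usecase { const char *name; } usecase_t;

static usecase_t dvd_to_pvr = { "dvd_to_pvr_record" };

static usecase_t *usecase_select(const char *name)
{
    /* Look up the named use case (stubbed: one entry only).      */
    return (name && strcmp(name, dvd_to_pvr.name) == 0) ? &dvd_to_pvr : NULL;
}

static int usecase_start(usecase_t *uc)
{
    /* Here the connection manager would set up, prioritize and
     * synchronize the decode, encode and buffering tasks.        */
    printf("start: %s\n", uc->name);
    return 0;
}

static int usecase_stop(usecase_t *uc)
{
    printf("stop: %s\n", uc->name);
    return 0;
}

int main(void)
{
    /* The application's entire "plumbing" is one selection call. */
    usecase_t *uc = usecase_select("dvd_to_pvr_record");
    if (uc == NULL)
        return 1;
    usecase_start(uc);
    /* ... recording runs ... */
    usecase_stop(uc);
    return 0;
}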

- Hans van Antwerpen

Chief Software Architect

Philips Semiconductors




