DLNA and UPnP will enable easy home video networks

Posted: 12 Oct 2005

Keywords: DLNA, UPnP, home video networks

By Joseph Chou and Timothy Simerly

Streaming Media DSP Group, Texas Instruments Inc.
Video/Imaging DesignLine

The networked digital home of the not-too-distant future will contain a wide variety of consumer electronics equipment, PCs and mobile handheld devices. These devices will have to learn how to exchange content that is stored in an almost equally diverse number of video standards and other streaming media data types. And it is possible that this exchange will take place over several wired and wireless network protocols.

Video, audio, and still-image media interchange presents a bewildering assortment of combinations and permutations to design engineers. Dozens of media formats, codecs, transmission protocols and display technologies must somehow be woven into what seems to the consumer to be a seamless, simple system.

How to get all these disparate technologies to work together is a big challenge. It is an ambitious goal, but one that is well on its way to being achieved.

In today's digital world, setting common interchange content formats and common network protocols is not sufficient. Digital Rights Management (DRM) must also be included because embedded DRM systems can prevent commercial premium content from being illegally copied, listened to or viewed, as required by the commercial content owners.

Given the level of complexity and widely available standards, getting devices to talk together is not so much a matter of creating new standards as it is of cooperation between leading companies in the PC, CE and mobile markets.

Communication, not convergence
In the past, the electronics industry used convergence to describe a digital home in which content from numerous sources is available to the consumer. Over time, however, convergence has also come to mean literally merging electronics equipment into a single, all-powerful device. In some scenarios it was a PC; in others, a media player; in still others, a set-top box. For a number of reasons, this vision of the digital home has not come to pass.

But that's OK because all that consumers really want is for all of their electronics gear to work better together. The Digital Living Network Alliance (DLNA) was formed in 2003 to take that approach. Its first set of baseline design guidelines, version 1.0, was introduced in June 2004.

An organization with more than 200 members, including virtually all of the global brands in PC, CE and mobile electronics, the DLNA is pursuing a lowest-common-denominator strategy. Member companies commit themselves to implementing a selected number of already common and widely deployed formats, protocols and codecs in all new equipment.

DLNA's initial 2004 v1.0 interoperability guidelines set a baseline for sharing digital content across a broad range of PC and consumer devices by agreeing to a set of core requirements and providing details on how they should be implemented. Drawn from PC and Internet standards, they include support for wired Ethernet and wireless LAN, IPv4, Universal Plug and Play (UPnP), and JPEG, LPCM and MPEG-2 as the baseline image, audio and video formats.

Optimizing performance
This brings up the matter of performance. When two devices that both support advanced compression algorithms link up, for example, defaulting to the baseline specification means a big performance hit.

Video is a good example. When broadcasting of MPEG-2 video began in 1993, most broadcast content required bandwidths in the range of 6Mbps to 8Mbps. High-motion content such as basketball and football, which involve a lot of camera panning and scanning, needed nearly the full maximum bit rate allowed for MPEG-2 main profile at main level (MP@ML), which is 15Mbps. Compression algorithms improved over time until most broadcast-quality MPEG-2 content could be carried at roughly 2Mbps to 5Mbps. The newer compression standards, such as H.264 (MPEG-4 AVC, or MPEG-4 Part 10) and Microsoft's VC-1, provide a more sophisticated tool suite that reduces the bit rate by more than a factor of two compared with MPEG-2. This allows broadcast-quality content to be distributed within the home at under 1Mbps, well within the available bandwidth of the home devices used to render the content.
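As a rough illustration of those numbers, the short Python calculation below compares the bit rates cited above (they are the approximate figures from the text, not measurements; the 2.5x coding gain is an assumption consistent with the "more than a factor of two" improvement just mentioned):

# Back-of-the-envelope comparison of the bitrates discussed above.
# The numbers are the approximate figures cited in the text, not measurements.

MPEG2_MP_ML_CEILING_MBPS = 15.0   # MPEG-2 main profile at main level maximum
MPEG2_BROADCAST_MBPS = 2.0        # lower end of today's broadcast MPEG-2 range
ADVANCED_CODEC_GAIN = 2.5         # assumed H.264/VC-1 gain over MPEG-2

h264_mbps = MPEG2_BROADCAST_MBPS / ADVANCED_CODEC_GAIN
print(f"MPEG-2 MP@ML ceiling:       {MPEG2_MP_ML_CEILING_MBPS:.1f} Mbit/s")
print(f"Broadcast-quality MPEG-2:   {MPEG2_BROADCAST_MBPS:.1f} Mbit/s")
print(f"Same content in H.264/VC-1: {h264_mbps:.1f} Mbit/s (sub-1 Mbit/s)")

# Storage for one hour of content at each rate, in gigabytes.
for label, mbps in [("MPEG-2", MPEG2_BROADCAST_MBPS), ("H.264/VC-1", h264_mbps)]:
    gigabytes = mbps * 1e6 * 3600 / 8 / 1e9
    print(f"One hour of {label}: roughly {gigabytes:.2f} GB")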

From a hardware perspective, the early MPEG-2 broadcast encoders were implemented with 12 to 13 dedicated ASICs. Today the same job is done with one or two devices, many of them programmable rather than the earlier hardware-only implementations. In addition, the processing power of programmable devices such as TI DSPs has increased through faster clock speeds and more advanced parallel architectures, allowing them to implement the much more computationally intensive advanced compression algorithms such as H.264 and VC-1. Broadcast-quality standard-definition video can now be compressed with one to four programmable devices, the actual number depending on the desired level of quality, the complexity of the algorithm, and the profile and level of the compression standard being implemented.

But lowering the bit rate below the maximum channel capacity is not the only consideration at play here. In some instances, high bit rates can mean severely degraded quality, depending on the transport medium. Connections over an 802.11 WLAN, for example, are heavily dependent on distance: the sustainable bit rate drops off precipitously as the separation between sender and receiver grows. A 0.5Mbps bandwidth requirement, by contrast, means that high-quality video service can be delivered throughout the home.
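One way to picture that trade-off is a simple admission check: before streaming, a sender compares the stream's bit rate against the throughput it can actually sustain to the receiver, leaving headroom for retransmissions and other traffic. The function and the 50 percent headroom figure below are purely illustrative assumptions, not part of DLNA or UPnP:

def link_can_carry(stream_mbps: float, sustained_wlan_mbps: float,
                   headroom: float = 0.5) -> bool:
    """Return True if the measured sustainable WLAN throughput can carry the
    stream with headroom left for retransmissions and other traffic.

    Illustrative only; the 50% headroom figure is an assumption, not a
    DLNA requirement.
    """
    return stream_mbps <= sustained_wlan_mbps * headroom

# An 8 Mbit/s MPEG-2 stream over a distant 802.11 link (~5 Mbit/s sustained)
print(link_can_carry(8.0, 5.0))   # False: transcode or reduce the bit rate
# The same content re-encoded to ~1 Mbit/s with an advanced codec
print(link_can_carry(1.0, 5.0))   # True: fits comfortably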

Subsequent versions of the DLNA interoperability specification address the performance issue by offering a number of optional standards. If two devices discover that each is MPEG-4 capable, for example, no transcoding to MPEG-2 will occur. Optional standards include GIF, PNG and TIFF images; MP3, Windows Media Audio, AC-3, AAC and ATRAC3 audio; and MPEG-4 Part 2, H.264 (MPEG-4 AVC, or MPEG-4 Part 10) and Microsoft's VC-1 video.
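The negotiation just described can be pictured as an intersection of capability lists with a fallback to the mandatory baseline. The sketch below only illustrates that idea and is not DLNA's actual matching logic; the preference order is an assumption:

# Illustrative only: pick the best video format two DLNA devices share,
# falling back to the mandatory MPEG-2 baseline.

# Optional formats listed from most to least preferred for this sketch.
PREFERRED_VIDEO = ["H.264", "VC-1", "MPEG-4 Part 2"]
BASELINE_VIDEO = "MPEG-2"

def choose_video_format(server_formats: set, player_formats: set) -> str:
    shared = server_formats & player_formats
    for fmt in PREFERRED_VIDEO:
        if fmt in shared:
            return fmt              # both ends support it: no transcoding needed
    return BASELINE_VIDEO           # every DLNA device must handle MPEG-2

# A media server with H.264 content and an H.264-capable player
print(choose_video_format({"MPEG-2", "H.264"}, {"MPEG-2", "H.264"}))  # H.264
# A baseline-only player forces a fall back (and transcode) to MPEG-2
print(choose_video_format({"MPEG-2", "H.264"}, {"MPEG-2"}))           # MPEG-2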

Universal plug and play
Earlier attempts at device interoperability have fallen short of the mark because they did not address a baseline set of requirements each device must support. However, one DLNA precursor, Universal Plug and Play (UPnP), has broad support already and, along with DLNA, is a critical piece of the solution to the interoperability puzzle.

UPnP enables self-configuration and self-discovery between devices. Devices announce their capabilities and options without any user intervention. The specific mechanisms are: automatic address configuration, device discovery, command and control, event generation, and presentation for viewing device status and control.
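Device discovery, for example, is carried out with SSDP: a control point multicasts an M-SEARCH request and listens for unicast replies, each of which carries a LOCATION header pointing at the responding device's XML description. The following sketch uses only the Python standard library; the three-second timeout and the root-device search target are arbitrary choices:

# Minimal SSDP discovery sketch (the UPnP discovery step described above).
import socket

SSDP_ADDR = ("239.255.255.250", 1900)
MSEARCH = (
    "M-SEARCH * HTTP/1.1\r\n"
    "HOST: 239.255.255.250:1900\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"
    "ST: upnp:rootdevice\r\n"
    "\r\n"
)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3.0)
sock.sendto(MSEARCH.encode("ascii"), SSDP_ADDR)

try:
    while True:
        data, addr = sock.recvfrom(4096)
        # Each response is an HTTP-style message whose LOCATION header points
        # at the device's XML description document.
        for line in data.decode(errors="replace").splitlines():
            if line.lower().startswith("location:"):
                print(addr[0], line.split(":", 1)[1].strip())
except socket.timeout:
    pass  # no more responses within the timeout window
finally:
    sock.close()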

UPnP runs on top of the IP network layer and utilizes standards such as UDP, TCP, HTTP, XML, GENA, and SOAP. UPnP's audio/video architecture consists of the following devices:

  • Control Point: Discovers Media Servers and Media Renderers and connects them.
  • Media Server: Stores content on the network for access by Media Renderers.
  • Media Renderer (Player): Renders content received from a Media Server.

Figure 1 illustrates the basic UPnP architecture.


Figure 1: Basic UPnP architecture
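To make the three roles concrete, the sketch below acts as a bare-bones Control Point: it invokes the standard Browse action on a Media Server's ContentDirectory service over SOAP. The control URL is a placeholder that a real control point would learn from the device description fetched during discovery; the envelope structure follows the UPnP AV ContentDirectory conventions:

# Minimal control-point sketch: browse the root container of a Media Server's
# ContentDirectory service with a SOAP request.
import urllib.request

CONTROL_URL = "http://192.168.1.10:8080/ContentDirectory/control"  # placeholder
SERVICE_TYPE = "urn:schemas-upnp-org:service:ContentDirectory:1"

BROWSE_BODY = f"""<?xml version="1.0" encoding="utf-8"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:Browse xmlns:u="{SERVICE_TYPE}">
      <ObjectID>0</ObjectID>
      <BrowseFlag>BrowseDirectChildren</BrowseFlag>
      <Filter>*</Filter>
      <StartingIndex>0</StartingIndex>
      <RequestedCount>10</RequestedCount>
      <SortCriteria></SortCriteria>
    </u:Browse>
  </s:Body>
</s:Envelope>"""

request = urllib.request.Request(
    CONTROL_URL,
    data=BROWSE_BODY.encode("utf-8"),
    headers={
        "Content-Type": 'text/xml; charset="utf-8"',
        "SOAPACTION": f'"{SERVICE_TYPE}#Browse"',
    },
)
with urllib.request.urlopen(request, timeout=5) as response:
    # The reply is a SOAP envelope whose Result element holds DIDL-Lite XML
    # describing the server's content; a Media Renderer would then be handed
    # one of the item URIs to play.
    print(response.read().decode("utf-8"))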

DLNA baseline requirements
DLNA picks up where UPnP left off by defining baseline design guidelines. To keep its interoperability specification consumer focused, DLNA derives its design guidelines from carefully thought-out use cases and usage scenarios. After collecting a wide range of scenarios, DLNA sorts them into "immediate", "next-version" and "future" categories.

Use scenarios were analyzed for common elements and consistent features. The highest priority use cases were simplified by removing all non-essential details. The resulting guidelines deliver all the functionality needed with a relatively small set of device classes and function/capability categories.

In DLNA design guidelines V1.0, devices fall into two general groups, Digital Media Servers (DMS) and Digital Media Players (DMP).

DMS devices source, acquire, record and store media. They usually have rendering capability and they may have intelligence, such as device and user services management, rich user interfaces, media management, aggregation and distribution functions.

Some examples include:

  • Advanced set-top boxes (STB)
  • Digital video recorders (DVR)
  • Personal and laptop computers
  • Stereo and home theaters with hard disk drives (for example, music servers)
  • Broadcast tuners
  • Video and image capture devices, such as cameras and camcorders
  • Multimedia mobile phones

DMP devices let users select and play the digital media stored on the home network. Examples include:

  • TV monitors
  • Stereo and home theaters
  • Wireless monitors
  • Game consoles
  • Digital media adapters (DMA)
