EE Times-Asia

Consortium takes 3D TV to the next level

Posted: 01 Jun 2011

Keywords: 3D, TV broadcasting, 3D standard

The market for consumer 3D television sets is expanding at an enormous pace, with a 75 percent annual growth rate, following the trend of popular movies shot in 3D. With this rapidly growing market comes the need for standardization and affordable equipment. The timing was right for the project, as standards for 3D technologies had reached a peak in their diversity and number. The TritonZ consortium, supported by the EUREKA Cluster MEDEA+, set out to develop more integrated standards and technologies. One of the project's major outcomes is CoaXPress, a new standard now used worldwide.

The consortium explored technologies along the entire 3D TV broadcasting chain, from capture to display. "The first challenge was making sure that cameras could actually film in 3D," said project leader Klass Jan Damstra of teleproduction company Grass Valley.

He added: "Now that means, of course, recording, but also editing and broadcasting; this means that the data processing must be very fast!" The project partners developed technologies that make it possible to capture live shows in 3D, although the broadcasting network is not yet ready for them. At the other end of the broadcasting chain, the researchers focused on new types of screens that would make 3D TV a more pleasant experience than is the case with the technology currently available on the market.

The challenge with 3D TV broadcasting lies in the amount of data to be processed. Current filming techniques are based on a stereoscopic pair of 2D images filmed with two cameras, doubling the data load going through the transmission channel. The consortium first developed faster sensors and transmission cables to accommodate this increase, but the project team pursued another path as well: adding a third dimension to existing 2D HDTV images filmed with a single camera. Transmitting a 2D picture plus depth information (2D+Z) uses less transmission bandwidth and provides more flexibility for image display. The result is 'Time of Flight,' a technology similar to that used in radar.
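As a rough, back-of-the-envelope illustration of the bandwidth argument (a sketch with assumed numbers, not figures from the project): for an uncompressed 1080p frame in 4:2:0 YCbCr, a stereo pair doubles the payload, while 2D+Z adds only an assumed 8-bit depth map on top of the single 2D view.

```python
# Illustrative per-frame payload comparison for 1080p HDTV.
# Assumptions (not from the article): 8-bit 4:2:0 YCbCr video,
# and an 8-bit-per-pixel depth map for the 2D+Z format.
WIDTH, HEIGHT = 1920, 1080

# 4:2:0 sampling averages 1.5 bytes per pixel (Y plus subsampled Cb/Cr).
frame_2d = int(WIDTH * HEIGHT * 1.5)      # one 2D HDTV frame
stereo_pair = 2 * frame_2d                # left-eye + right-eye views
depth_map = WIDTH * HEIGHT                # assumed 8-bit Z per pixel
frame_2d_plus_z = frame_2d + depth_map    # 2D picture + depth channel

print(f"stereo pair: {stereo_pair / 1e6:.1f} MB/frame")
print(f"2D+Z:        {frame_2d_plus_z / 1e6:.1f} MB/frame")
```

Even at this uncompressed level 2D+Z is smaller than a stereo pair; in practice the advantage is larger still, since a smooth depth map compresses far better than a second full camera view.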

The camera emits near-infrared light and measures the time the light needs to travel to an object and back. In this way the camera can calculate, for each pixel in the scene, the distance to the camera. Time of Flight thus creates a 'depth map' that communicates to the TV screen the exact depth of each point in the picture. The company Trident Micro Systems developed the algorithms needed to render a stereo image from the 2D+Z information. The format of the TV screen matters less than with current technologies, as the algorithm can adapt the image to the size of the screen. The same technology can be used to calculate 'multiple views,' allowing more angles from which to watch a movie and making 3D glasses superfluous.
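A minimal sketch of the two steps described above, under assumed camera parameters (the focal length and baseline are illustrative, and this is not Trident's actual algorithm): first the time-of-flight conversion from round-trip time to per-pixel depth, then the horizontal disparity a renderer would apply to synthesize left and right views from 2D+Z.

```python
# Speed of light in m/s, used to convert round-trip time into distance.
C = 299_792_458.0

def depth_from_round_trip(t_seconds):
    """Per-pixel Time-of-Flight depth: the emitted near-infrared pulse
    travels to the object and back, so distance is half the round trip."""
    return C * t_seconds / 2.0

def disparity_pixels(depth_m, focal_px=1000.0, baseline_m=0.065):
    """Horizontal shift (in pixels) between synthesized left and right
    views for a point at the given depth, under a simple pinhole model.
    focal_px and baseline_m are assumed values, not from the article."""
    return focal_px * baseline_m / depth_m

# A round trip of ~20 ns puts the object about 3 m from the camera;
# nearer pixels get a larger shift, which is what creates the 3D effect.
z = depth_from_round_trip(20e-9)
print(f"depth: {z:.2f} m, disparity: {disparity_pixels(z):.1f} px")
```

Because the shift is computed per pixel from the depth map, the same 2D+Z data can be re-rendered for any screen size, or for several slightly different viewpoints to produce the glasses-free 'multiple views' mentioned above.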

