Interfacing Processors to Audio and Video Devices

This article introduces the basics of connecting DSP processors to audio and video devices. It includes helpful refreshers on common interfaces such as I²C, I²S, SPI, and BT.656.


November 02, 2006
URL: http://www.drdobbs.com/embedded-systems/interfacing-processors-to-audio-and-vide/193501323

This article aims to familiarize the reader with the basic principles of connecting digital signal processors to audio and video devices. While a basic working knowledge of audio and video would provide the reader with a helpful background, basic refreshers are included in the article where appropriate.

Processor Interface Overview
Generally targeted at applications requiring heavy I/O loads, DSPs commonly provide developers with a variety of integrated interfaces—some standard and some proprietary. For example, Blackfin processors from Analog Devices are convergent processors, which means they integrate DSP and MCU functionality into a single device. Blackfin processors have two main types of serial interfaces relevant to audio applications. The processor's low bit-rate TWI (two wire interface) and SPI (serial peripheral interface) modules are used for control and configuration of audio devices. The forward channel of these peripherals is generally used to configure or control audio converters, and the reverse channel relays feedback or status information from the converters. The processor's higher bit rate SPORT (serial port) peripheral is customarily used to perform the actual interchange of audio data.

Blackfin's TWI is compatible with the Philips bidirectional open-collector I²C (Inter-Integrated Circuit) bus. It provides a simple and efficient way to exchange control and data information between multiple devices. It supports both master and slave operation with transmission speeds up to 400 kbits per second. The I²C bus serial data (SDA) and serial clock (SCL) lines (see Figure 1) comprise a multi-master interface, so the bus can connect to more than one IC capable of initiating a data transfer. The timing relationship between the SDA and SCL lines defines bus conditions such as start and stop, and arbitration determines which device acts as master at a given point in time. To list all the world's I²C devices here would be impossible, but suffice it to say that using a DSP with an I²C port opens the architecture up to a very wide variety of interconnectivity options.


1. Example I²C signaling (adapted from Philips spec)
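
For illustration, here is a minimal bit-banged sketch of an I²C master register write, assuming hypothetical GPIO helper functions; on a real design the Blackfin TWI peripheral (or a vendor driver) performs this in hardware. It shows the open-collector nature of the bus: a logic "1" is produced by releasing the line and letting the external pull-up resistor raise it.

#include <stdint.h>
#include <stdbool.h>

/* Hypothetical open-collector GPIO helpers. */
extern void sda_release(void);    /* let the pull-up take SDA high */
extern void sda_drive_low(void);
extern void scl_release(void);
extern void scl_drive_low(void);
extern bool sda_read(void);
extern void i2c_delay(void);      /* roughly half of one SCL period */

static void i2c_start(void)
{
    sda_release(); scl_release(); i2c_delay();
    sda_drive_low(); i2c_delay();     /* SDA falls while SCL is high: START */
    scl_drive_low();
}

static void i2c_stop(void)
{
    sda_drive_low(); i2c_delay();
    scl_release(); i2c_delay();
    sda_release(); i2c_delay();       /* SDA rises while SCL is high: STOP */
}

/* Clock out one byte, MSB first; return true if the slave ACKs. */
static bool i2c_write_byte(uint8_t b)
{
    for (int i = 7; i >= 0; i--) {
        if (b & (1u << i)) sda_release(); else sda_drive_low();
        i2c_delay(); scl_release(); i2c_delay(); scl_drive_low();
    }
    sda_release();                    /* release SDA so the slave can ACK */
    i2c_delay(); scl_release(); i2c_delay();
    bool ack = !sda_read();           /* slave pulls SDA low to acknowledge */
    scl_drive_low();
    return ack;
}

/* Write one register of a 7-bit-addressed device, e.g. an audio codec. */
bool i2c_reg_write(uint8_t dev_addr7, uint8_t reg, uint8_t value)
{
    i2c_start();
    bool ok = i2c_write_byte((uint8_t)(dev_addr7 << 1)) &&  /* R/W bit = 0 */
              i2c_write_byte(reg) &&
              i2c_write_byte(value);
    i2c_stop();
    return ok;
}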

The full-duplex Blackfin SPI is compatible with the Motorola SPI standard. It operates at up to 33 Mbits per second, well beyond the control needs of most A/D and D/A converters. SPI was created by Motorola; a similar interface, Microwire, is a trademark of National Semiconductor. Extensions to SPI, including QSPI (Queued Serial Peripheral Interface) and MicrowirePLUS, have also come to market. The SPI consists of a three-pin data communication interface (see Figure 2) that supports both master-slave and multi-master environments. The SPI pins include MOSI (master output to slave device), MISO (master input from slave device), and SCK (serial clock). There are also a total of eight SPI chip-select pins: one input pin lets other SPI devices select the Blackfin processor, and seven output pins let the Blackfin processor select other SPI devices. While developers typically utilize SPI as a synchronous serial communication interface between processors and peripherals, SPI can just as well be used for interprocessor communication. As with I²C, SPI has been widely adopted throughout the industry, and the list of SPI-compatible devices is richly populated.


2. Example SPI signaling
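
As a sketch of what one SPI control transaction looks like on the wire, the following bit-banged mode-0 byte exchange uses hypothetical GPIO helpers; in practice the Blackfin SPI peripheral shifts the data in hardware, but the sequencing of chip select, clock, MOSI, and MISO is the same. The register/value framing of the hypothetical converter is an assumption for illustration.

#include <stdint.h>
#include <stdbool.h>

/* Hypothetical GPIO helpers for one SPI slave. */
extern void cs_assert(void);      /* drive chip-select low  */
extern void cs_deassert(void);    /* drive chip-select high */
extern void sck_set(bool level);
extern void mosi_set(bool level);
extern bool miso_get(void);
extern void spi_delay(void);      /* half of one SCK period */

/* Shift one byte out on MOSI while shifting one byte in on MISO (mode 0). */
static uint8_t spi_xfer_byte(uint8_t out)
{
    uint8_t in = 0;
    for (int i = 7; i >= 0; i--) {
        mosi_set((out >> i) & 1u);    /* data valid before the rising edge */
        spi_delay();
        sck_set(true);                /* both sides sample on this edge */
        in = (uint8_t)((in << 1) | (miso_get() ? 1u : 0u));
        spi_delay();
        sck_set(false);
    }
    return in;
}

/* Example: write an 8-bit value to an 8-bit register of a
 * hypothetical SPI-controlled audio converter. */
void converter_reg_write(uint8_t reg, uint8_t value)
{
    cs_assert();
    (void)spi_xfer_byte(reg);
    (void)spi_xfer_byte(value);
    cs_deassert();
}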

Blackfin's full-duplex synchronous serial port, called "SPORT," operates at high bit rates and supports simultaneous transmit and receive. SPORT features relevant to audio applications include two sets of independent transmit and receive pins (primary data, secondary data, clock, and frame sync). These pins enable eight channels of I²S stereo audio. (I²S is Philips Semiconductor's "Integrated Interchip Sound" bus protocol for digital audio.) Each of these channels supports word lengths up to 32 bits, surpassing the resolution of most high-precision audio applications.

I²S is a well-known serial-bus standard for stereo audio transmission, widely used to interconnect system elements such as analog-to-digital and digital-to-analog converters. I²S interfaces are also found on some high-end CD and DVD players and on some PC sound cards. An I²S bus design consists of three serial bus lines: a line with two time-division-multiplexed (TDM) data channels, a word select line, and a clock line. In the I²S format (see Figure 3), any device can act as the system master by providing the necessary clock signals, and an I²S slave usually derives its internal clock from an external clock input. Because I²S carries the audio data separately from the clock signals, the time-related errors that introduce jitter are mitigated, eliminating the need for anti-jitter devices.


3. Example I²S audio signaling
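
To make the data format concrete, here is a small sketch of how interleaved I²S stereo samples received into a serial-port DMA buffer might be split into left and right channels. The layout (one left word, then one right word, per word-select period) follows the usual I²S convention; the buffer names and block size are illustrative.

#include <stddef.h>
#include <stdint.h>

#define FRAMES_PER_BLOCK 256

/* One receive DMA block as delivered by the serial port: L0, R0, L1, R1, ... */
int32_t rx_block[2 * FRAMES_PER_BLOCK];

void deinterleave_stereo(const int32_t *interleaved, size_t frames,
                         int32_t *left, int32_t *right)
{
    for (size_t n = 0; n < frames; n++) {
        left[n]  = interleaved[2 * n];      /* word select low: left channel   */
        right[n] = interleaved[2 * n + 1];  /* word select high: right channel */
    }
}

In practice, the serial port's DMA engine fills rx_block, and deinterleave_stereo(rx_block, FRAMES_PER_BLOCK, left, right) would be called from the block-complete handler.
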
Audio in, audio out
Connecting an audio source to the DSP is a fairly straightforward interface task. Figure 4 shows an example where a microphone's analog output is converted to digital through an A/D converter. Working as the master device, the DSP selects the A/D as an SPI slave peripheral for configuration and control. The audio data then gets transferred using three of the four Blackfin SPORT pins (data, clock, and frame sync) in the receive direction. The reverse case shown in Figure 5—connecting the processor to a D/A converter—is equally easy. Again acting as the SPI master, the processor configures and controls the converter, with the data flowing in the other direction over the I²S SPORT interface to the D/A. The analog output subsequently feeds a speaker.



4. Connecting an audio A/D converter to an embedded processor



5. Connecting an audio D/A converter to an embedded processor
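
As a conceptual sketch of what sits between the two figures above, the following block-processing routine applies a fixed-point gain to samples arriving from the A/D before they are handed to the D/A. The buffer handling and Q15 gain value are illustrative; in a real system this would typically run from a DMA-completion interrupt.

#include <stddef.h>
#include <stdint.h>

/* Apply a Q15 fixed-point gain to one block of 16-bit audio samples. */
void process_block(const int16_t *in, int16_t *out, size_t count,
                   int16_t gain_q15)
{
    for (size_t n = 0; n < count; n++) {
        int32_t y = ((int32_t)in[n] * gain_q15) >> 15;  /* Q15 multiply  */
        if (y >  32767) y =  32767;                     /* saturate high */
        if (y < -32768) y = -32768;                     /* saturate low  */
        out[n] = (int16_t)y;
    }
}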

Audio interface tips and tricks
A few quick reminders will make an already easy application even easier. On the processor's TWI, remember to put pull-up resistors on both the SCL and SDA lines, per the I²C specification. This is important because these pins are never actually driven high. (For that matter, you should properly terminate all clock and frame sync signals.) And on the SPI, check that all MISO pins are connected to MISO pins, and MOSI pins to MOSI pins. Because the two signal names are so similar, it's easy to swap them by mistake.

A good way to simplify development is to take advantage of a vendor's device driver suites. For instance, the ADI VisualDSP++ tool suite includes peripheral device drivers for SPI, SPORT, TWI, and other interfaces that facilitate their configuration and control via a standard API.

Also, a vendor's hardware platform offerings can be indispensable for evaluating and developing an embedded processing solution. For example, the Blackfin EZ-KITs and associated EZ-Extender cards come with integrated converters and numerous code examples. These offerings provide a useful framework for quickly learning how to interface audio to Blackfin devices. As another example, ADI's VisualAudio algorithm development tool helps streamline the design of audio systems using the Blackfin processor.

Connecting to video sources
Figure 8 shows how a CMOS imager can connect to an embedded processor. Here, the TWI control channel connects to a Micron CMOS image sensor's I²C bus for device configuration, while the image data flows straight into the Blackfin parallel peripheral interface (PPI), configured here for 8 bits; it could instead be configured for 10, 12, or even 16 bits, depending on the resolution of the sensor.


8. CMOS imager connection to Blackfin processor

While Blackfin's PPI is, in the generic sense, a high speed parallel port with a straightforward interface (16 data lines, up to three frame syncs, and a clock), the port does incorporate some video-friendly features that work hand in hand with the processor's DMA engine. The details of the PPI are too numerous to articulate in this article, but they are readily available on the Analog Devices website.

The imager in this example provides the pixel clock as well as the video framing: the horizontal sync demarcates the valid line region, and the vertical sync serves as a "frame valid" signal. Other sensors support the ITU-R BT.656 standard, in which case these synchronization signals are not needed. The Blackfin EZ-Extender cards and EZ-KITs support connection to a wide range of CMOS sensors from leading vendors including Micron, Omnivision, and Kodak, as illustrated in Figure 9.


9. Micron camera board connected to Blackfin EZ-KIT
If the video source is analog—an analog camcorder, for example—the signal must first pass through a video decoder. In the example of Figure 10, the processor configures the decoder over the TWI (I²C) interface, and its PPI receives an 8-bit digital data stream and a line-locked pixel clock from the video decoder.



10. Processor connection to video decoder from analog source
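
Once a frame has landed in memory, it is simply a two-bytes-per-pixel 4:2:2 YCbCr buffer. As a small illustrative example (frame dimensions and names are assumptions), the following routine pulls the luma plane out of a Cb-Y-Cr-Y ordered buffer, such as a BT.656-style decoder produces, for instance to build a grayscale preview.

#include <stddef.h>
#include <stdint.h>

#define FRAME_WIDTH  720
#define FRAME_HEIGHT 480   /* NTSC-style active resolution, as an example */

/* Copy the luma (Y) samples out of a Cb-Y-Cr-Y (4:2:2) frame buffer. */
void extract_luma(const uint8_t *cbycry, uint8_t *luma)
{
    for (int row = 0; row < FRAME_HEIGHT; row++) {
        const uint8_t *src = cbycry + (size_t)row * FRAME_WIDTH * 2;
        uint8_t *dst = luma + (size_t)row * FRAME_WIDTH;
        for (int col = 0; col < FRAME_WIDTH; col++) {
            dst[col] = src[2 * col + 1];   /* every second byte is a Y sample */
        }
    }
}
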
Connecting to video displays
In most applications, a video frame stored only in processor memory is of limited utility; it must somehow be sent out for display. Analog displays first require, of course, a video encoder, as illustrated in Figure 11. The processor configures the encoder over the I²C interface and sends the video data over its PPI data bus. While these particular encoders support BT.656, some encoders require explicit frame sync signals to be sent via the PPI frame syncs.



11. Processor connection to video encoder for analog display

Digital displays such as TFT-LCD panels need many of the same synchronization signals, including horizontal and vertical syncs and a data sampling clock. The full 16-bit PPI data bus is used for driving the display, since most TFT panels accept at least 18 bits of color data. A PWM timer block is optionally employed, since many TFT-LCDs do not integrate a timing controller. A representative connection is shown in Figure 12.



12. Example processor connection to TFT LCD panel
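
To see how the required pixel clock rate follows from the panel timing, here is a back-of-the-envelope calculation for a hypothetical QVGA panel driven without an on-glass timing controller. The porch and sync widths below are placeholders; actual values come from the panel datasheet.

#include <stdio.h>

int main(void)
{
    /* Placeholder timing for a hypothetical 320x240 panel. */
    const int h_active = 320, h_front = 20, h_sync = 30, h_back = 38;
    const int v_active = 240, v_front = 4,  v_sync = 3,  v_back = 15;
    const int refresh_hz = 60;

    int h_total = h_active + h_front + h_sync + h_back;  /* clocks per line */
    int v_total = v_active + v_front + v_sync + v_back;  /* lines per frame */
    long pixel_clock = (long)h_total * v_total * refresh_hz;

    printf("Required pixel clock: %ld Hz (%d x %d total timing)\n",
           pixel_clock, h_total, v_total);
    return 0;
}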

Video tips and tricks
BT.656 should be used whenever possible because it eliminates a lot of timing incongruities and inconsistencies that can occur in video interfaces. Also, it is important to pay close attention to the default converter settings for the A/D or D/A: Sometimes these converters can be used directly without programming through an I²C or SPI interface. Further, it is important to make sure the pixel clock source is as clean as possible, since these clock rates can run to tens of MHz, depending on the application. A clean clock can make a huge difference in the stability of a system.
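
To illustrate why BT.656 needs no separate sync lines, the following sketch scans a captured byte stream for the embedded timing reference codes (the FF 00 00 XY sequences) and decodes their field, vertical-blanking, and SAV/EAV flags. The buffer handling is illustrative; real capture is handled by the PPI and DMA.

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Scan a BT.656 byte stream and report each embedded timing reference code. */
void scan_bt656(const uint8_t *buf, size_t len)
{
    for (size_t i = 0; i + 3 < len; i++) {
        if (buf[i] == 0xFF && buf[i + 1] == 0x00 && buf[i + 2] == 0x00) {
            uint8_t xy = buf[i + 3];
            int f = (xy >> 6) & 1;   /* field: 0 = field 1, 1 = field 2 */
            int v = (xy >> 5) & 1;   /* 1 during vertical blanking      */
            int h = (xy >> 4) & 1;   /* 0 = SAV (start of active video),
                                        1 = EAV (end of active video)   */
            printf("%s at offset %zu: field %d, vblank %d\n",
                   h ? "EAV" : "SAV", i, f + 1, v);
            i += 3;   /* skip the rest of this code */
        }
    }
}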

When the processor is connected to a typical RGB666 LCD panel (6 bits each of R, G, and B, for a total of 18 bits), the processor's 16 data bits actually connect as an RGB565 source. In this configuration, the red and blue channels receive only 5 bits each. Do not ground the least-significant bit of the red and blue channels, since this compromises the dynamic range being represented. Instead, at the panel, tie the least-significant bit of red to the most-significant bit of red, and do the same for blue. This ensures a full dynamic range from the very lowest to the very highest value on each of those channels. With green, all 6 bits are connected and represented, because green is the most visually important channel of the three.
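
Viewed in software, the wiring trick above is just bit replication. The following sketch (function name is illustrative; the packing is the usual RGB565 convention) expands a 16-bit pixel into the 6-6-6 values the panel sees when the red and blue LSBs are tied to their MSBs.

#include <stdint.h>

/* Expand one RGB565 pixel to the 6-bit-per-channel values the panel sees. */
void rgb565_to_rgb666(uint16_t pixel, uint8_t *r6, uint8_t *g6, uint8_t *b6)
{
    uint8_t r5 = (pixel >> 11) & 0x1F;
    uint8_t g  = (pixel >> 5)  & 0x3F;
    uint8_t b5 = pixel & 0x1F;

    *r6 = (uint8_t)((r5 << 1) | (r5 >> 4));  /* red LSB tied to red MSB   */
    *g6 = g;                                 /* green already has 6 bits  */
    *b6 = (uint8_t)((b5 << 1) | (b5 >> 4));  /* blue LSB tied to blue MSB */
}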

Considering the complexity of video development, it's important to take advantage of all the support a vendor provides. Taking Blackfin's offerings as an example: VisualDSP++ provides the peripheral drivers for the PPI as well as for the TWI, the timers, and all the different modules involved with connecting a video system to a Blackfin processor. This includes many complete video device drivers for encoders, decoders, CMOS sensors and LCD displays. Furthermore, EZ-KITs and the EZ-Extender cards are available with CMOS sensor and LCD interfaces, reference schematics, and a bill of materials. Also, the EZ-KIT and EZ-Extender code examples are invaluable, especially for systems where data movement and memory accesses are crucial to guaranteeing correct system operation.

This article has given an overview of audio and video connections to DSPs and convergent processors. With a basic knowledge of the "lingo," an understanding of the interfaces involved, and a proper choice of processor architecture (including strong consideration of vendor support), it is possible to piece together multimedia systems in a straightforward manner.

About the author
David Katz is a Senior DSP Applications Engineer at Analog Devices, Inc., where he is involved in specifying and supporting Blackfin media processors. He has published dozens of embedded processor articles both domestically and internationally. Previously, he worked at Motorola, Inc., as a senior design engineer in cable modem and automation groups. David holds both a B.S. and M. Eng. in Electrical Engineering from Cornell University. He can be reached at [email protected].
