Designing DSP-based digital DC/DC power supply (Part 1)

Posted: 23 Apr 2008

Keywords: DC/DC, power supply, digital control design, DSP controller, buck regulator

By Shamim Choudhury
Texas Instruments Inc.

As DSPs receive serious consideration for controlling power supplies, embedded systems designers must address a number of factors in the design and implementation of a digital control loop.

For one thing, the control blocks and their associated parameters must be represented accurately so that analog designers can implement DSP-based digital control techniques using the well-known analog control design approaches.

But the result is well worth the effort. DSP-based digital control allows for the implementation of more functional control schemes, standard control hardware designs across multiple platforms, and the flexibility to make quick design modifications to meet specific customer needs.

Digital controllers are also less susceptible to aging and environmental variations and have better noise immunity. Moreover, modern 32-bit DSP controllers, such as the TMS320F280x, with their real-time code-debugging capabilities, give power supply designers all the benefits of digital control and allow implementation of high-bandwidth, high-frequency power supplies without sacrificing performance [2-4]. The extra computing power of such processors also allows designers to implement sophisticated nonlinear control algorithms, integrate the control of multiple converters into the same processor and optimize total system cost.

However, power supply engineers, who are mostly familiar with analog control design, face new challenges as they start to adopt these digital control techniques in their designs.

This article describes a step-by-step DSP-based digital control design and implementation of a high-frequency DC/DC converter, illustrating two different approaches to digital control design: design by emulation and direct digital design.

The first method, design by emulation, allows power supply designers to do the control design in the familiar s-domain and then convert it to a discrete/digital controller. The second approach, known as direct digital design, allows the digital controller to be designed directly in the z-domain.
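As a simple illustration of the difference (not the compensator designed later in this article): under design by emulation, an analog compensator such as Gc(s) = Ki/s is first designed in the s-domain and then discretized, for example with the bilinear (Tustin) mapping s = (2/Ts)(z - 1)/(z + 1), which gives Gc(z) = (Ki*Ts/2)(z + 1)/(z - 1), i.e. the difference equation u(n) = u(n-1) + (Ki*Ts/2)[e(n) + e(n-1)]. Under direct digital design, the plant itself is first discretized, including the sample-and-hold and delay effects, and the compensator is then shaped directly in the z-domain against that discrete plant model.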

Starting with a DC/DC buck converter and a given set of performance specifications, it discusses the different control blocks and control design approaches, and highlights the significant differences between designing the control in the digital domain and the analog approach.

The two methodologies are described in detail, first in MATLAB and then verified by experimental results. In the process, the effects of the sampling delay and the computation delay are also analyzed in MATLAB and then verified experimentally.

DC/DC converter setup
Figure 1 shows a simplified block diagram of a digitally controlled DC/DC converter interfaced to a TMS320F280x DSP controller, with processor speed up to 100MHz and enhanced peripherals such as a high-resolution PWM module, a 12-bit A/D converter with conversion time as fast as 160ns, a 32 x 32-bit multiplier, and 32-bit timers. The converter and control specifications used in this design are listed below (and collected in a short code sketch after the list):

*Vin = 4~6V, Vout = 1.6V, maximum output current Iout = 16A, RL = Vout/Iout = 0.1Ω (minimum)

*Maximum output voltage (used for ADC signal scaling) Vomax = 2V

*PWM frequency fpwm = 250kHz; Voltage loop sampling frequency fs = 250kHz

*Output filter components: L = 1.0µH, C = 1620µF, RC = 0.004Ω

*Desired voltage loop bandwidth fcv = 20kHz

*Phase margin = 45°, settling time
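For reference, these specifications might be captured in code along the following lines. The macro names are illustrative, and the floating-point literals are for readability only; a fixed-point implementation would normally convert them to scaled integer (Q-format) values.

/* Design parameters from the specification list above (names illustrative). */
#define VIN_MIN_V       4.0f        /* minimum input voltage */
#define VIN_MAX_V       6.0f        /* maximum input voltage */
#define VOUT_V          1.6f        /* nominal output voltage */
#define IOUT_MAX_A      16.0f       /* maximum output current */
#define RL_MIN_OHM      0.1f        /* minimum load resistance, Vout/Iout */
#define VOMAX_V         2.0f        /* maximum output voltage, ADC full-scale reference */
#define F_PWM_HZ        250000.0f   /* PWM frequency */
#define F_SAMPLE_HZ     250000.0f   /* voltage-loop sampling frequency */
#define L_OUT_H         1.0e-6f     /* output filter inductance */
#define C_OUT_F         1620.0e-6f  /* output filter capacitance */
#define RC_ESR_OHM      0.004f      /* output capacitor ESR */
#define F_CROSS_HZ      20000.0f    /* desired voltage-loop bandwidth */
#define PM_TARGET_DEG   45.0f       /* target phase margin */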

Figure 1: TMS320F280x DSP-based digital control of the DC/DC converter

As indicated in Figure 1, a single signal measurement is needed to implement the voltage mode control of the DC/DC converter. The instantaneous output voltage Vout is sensed and conditioned by the voltage sense circuit and then input to the DSP via the ADC channel. The digitized sensed output voltage Vo is compared to the reference Vref.

The voltage loop controller Gc is designed to make the output voltage Vout track the reference Vref and at the same time achieve the desired dynamic performance. The digitized output U of this controller provides the duty ratio command for the buck regulator switch Q1. This command output is used to calculate the appropriate values for the timer compare registers in the on-chip pulse-width modulation (PWM) module. The PWM module uses this value to generate the PWM output, PWM1 in this case, that finally drives the buck converter switch Q1.
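To make this signal path concrete, below is a minimal sketch of such a control ISR in C. The names AdcResult0 and Pwm1Cmp, the ADC scaling, and the two-pole/two-zero coefficient values are illustrative assumptions, not the actual F280x register map or the compensator designed later in this series; a production implementation on this fixed-point DSP would also normally use scaled integer (Q-format) arithmetic rather than float.

#include <stdint.h>

#define PWM_PERIOD  200U   /* assumed timer period register value for the 250kHz symmetric PWM */

/* Stand-ins for the memory-mapped ADC result and PWM compare registers. */
static volatile uint16_t AdcResult0;
static volatile uint16_t Pwm1Cmp;

static const float Vref = 0.8f;   /* 1.6V reference, normalized to Vomax = 2V */

/* Two-pole/two-zero compensator state and coefficients (values arbitrary here). */
static float e1, e2, u1, u2;
static const float b0 = 3.0f, b1 = -5.6f, b2 = 2.7f, a1 = 1.7f, a2 = -0.7f;

void control_isr(void)   /* triggered by the ADC end-of-conversion interrupt */
{
    /* Scale the 12-bit ADC result so that Vomax (2V) maps to 1.0. */
    float vo = (float)AdcResult0 / 4096.0f;
    float e  = Vref - vo;

    /* Gc as a difference equation:
       u(n) = a1*u(n-1) + a2*u(n-2) + b0*e(n) + b1*e(n-1) + b2*e(n-2) */
    float u = a1 * u1 + a2 * u2 + b0 * e + b1 * e1 + b2 * e2;

    /* Clamp the duty ratio command to the physically meaningful range. */
    if (u < 0.0f) u = 0.0f;
    if (u > 1.0f) u = 1.0f;

    /* Shift the controller history for the next sample. */
    e2 = e1; e1 = e;
    u2 = u1; u1 = u;

    /* Convert the duty ratio command to a compare value; the PWM module
       applies it at the start of the next PWM period (shadow load). */
    Pwm1Cmp = (uint16_t)(u * (float)PWM_PERIOD);
}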

Figure 2: DC/DC converter digital control loop sampling scheme

Figure 2 shows one example of a digital sampling scheme using the DSP on-chip peripherals. The sampling scheme affects the digital controller design and, therefore, needs appropriate attention. PWM output frequency is set up by configuring one of the on-chip Timers, T1 in this case. In this example, T1 generates a dual edge modulated (symmetric), 250kHz PWM output.

These timers have associated compare registers to which the calculated duty ratio values are written. These values are then compared with the timer counter value to generate the PWM output. The time at which a newly written compare value affects the actual PWM output duty ratio is controlled by the associated PWM control registers. In this example, the PWM control registers are set up so that a new value written to the compare register changes the actual PWM output duty ratio at the start of the subsequent timer (T1) period.
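A rough sketch of such a timer/PWM setup follows. The register block and bit-field names below are placeholders rather than the real F280x register map; only the configuration steps mirror the text: an up-down (symmetric) carrier, a period corresponding to 250kHz, and a compare register that shadow-loads at the start of each period.

#include <stdint.h>

#define CPU_CLK_HZ   100000000UL   /* 100MHz device clock */
#define PWM_FREQ_HZ  250000UL      /* 250kHz PWM / sampling frequency */

/* For a dual-edge modulated (symmetric) PWM the counter ramps up and then
   down once per period, so the period register holds half of fclk/fpwm:
   100MHz / 250kHz / 2 = 200 counts. */
#define PWM_PERIOD_COUNTS  (CPU_CLK_HZ / PWM_FREQ_HZ / 2UL)

typedef struct {                    /* illustrative register block */
    volatile uint16_t PERIOD;       /* timer period register */
    volatile uint16_t COMPARE;      /* duty-ratio compare register */
    volatile uint16_t CTRL;         /* count mode and shadow-load control */
} pwm_regs_t;

static pwm_regs_t Pwm1;             /* stands in for the memory-mapped PWM module */

#define CTRL_UPDOWN_COUNT    (1U << 0)   /* symmetric (up-down) carrier */
#define CTRL_LOAD_ON_PERIOD  (1U << 1)   /* compare value shadow-loads at period start */

void pwm_init(void)
{
    Pwm1.PERIOD  = (uint16_t)PWM_PERIOD_COUNTS;
    Pwm1.COMPARE = 0U;               /* start at 0% duty */
    /* A new compare value written by the control ISR is buffered and only
       takes effect at the start of the next timer period, as in Figure 2. */
    Pwm1.CTRL = CTRL_UPDOWN_COUNT | CTRL_LOAD_ON_PERIOD;
}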

Also, the ADC control registers are set up such that the AD conversion is triggered at the middle of the ON pulse of the PWM output. As soon as the conversion is complete, the ADC module is set up to generate an interrupt. The time delay between the start of AD conversion and this interrupt is shown in Figure 2, as Tadc. This time includes the AD conversion time and the processor interrupt latency.

Inside the interrupt service routine (ISR), the user software reads the converted value from the ADC result register, runs the controller and then writes the new PWM duty ratio value to the appropriate PWM compare register. However, this new duty ratio value takes effect at the start of the subsequent PWM cycle. From Figure 2, it is clear that the time delay Td between the ADC sampling instant and the PWM duty ratio update is half the PWM period. In this case, the PWM period and the sampling period Ts are equal, so the computation delay is Td = Ts/2.
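To put a number on this delay: with fs = 250kHz, Ts = 4µs, so Td = 2µs. A pure delay of Td contributes a phase lag of 360° x f x Td at frequency f, which at the desired 20kHz crossover amounts to roughly 360° x 20kHz x 2µs ≈ 14.4°. This lag must be budgeted against the 45° phase margin target, which is why the sampling and computation delays are analyzed explicitly, as noted earlier.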

Also shown in Figure 2, the calculation of the new duty ratio value inside the ISR is completed well before the subsequent interrupt is generated. This means that, at this sampling frequency, the processor bandwidth (100MHz) leaves sufficient spare time to extend the ISR and execute multiple controllers or other time-critical tasks. Some of this spare time can also be used for non-time-critical tasks by running them from a background loop.
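As a structural sketch, with illustrative function names only, the split between the time-critical control ISR and the background work might look like this:

/* The 250kHz control loop runs entirely in the ADC end-of-conversion ISR
   (see the earlier control_isr() sketch); everything that is not time
   critical runs from a background loop in main(). */
static void system_init(void)         { /* clocks, PWM, ADC and interrupt setup (omitted) */ }
static void update_telemetry(void)    { /* non-time-critical housekeeping (omitted) */ }
static void poll_host_interface(void) { /* non-time-critical communications (omitted) */ }

int main(void)
{
    system_init();

    for (;;) {
        /* Background tasks: pre-empted whenever the ADC end-of-conversion
           interrupt fires and the control ISR runs. */
        update_telemetry();
        poll_host_interface();
    }
}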

