BUCK CONVERTERS AND THEIR OPERATION

                        BUCK CONVERTERS

A buck converter is a type of DC-DC converter whose main goal is to step down a DC voltage to a lower level with minimum ripple. It is also used in SMPS circuits where the output DC voltage must be lower than the input voltage. It typically consists of two semiconductors, a diode and a transistor, and at least one energy storage element, which may be a capacitor or an inductor. A buck chopper may interface between the variable output voltage of a storage battery and a microprocessor. A buck converter uses feedback to regulate the output voltage in the presence of load changes.
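The step-down relationship described above can be sketched numerically. In the ideal (lossless, continuous-conduction) case, the output voltage is simply the input voltage scaled by the switching duty cycle, V_out = D × V_in. The function name and the 12 V / 3.3 V operating point below are illustrative assumptions, not taken from the text.

```python
# Minimal sketch of the ideal buck relation V_out = D * V_in,
# where D is the switching duty cycle (between 0 and 1).

def buck_output_voltage(v_in, duty_cycle):
    """Ideal (lossless, continuous-conduction) buck output voltage."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return v_in * duty_cycle

# Example: stepping a 12 V rail down to 3.3 V for a microprocessor.
v_in = 12.0
v_out_target = 3.3
duty = v_out_target / v_in  # required duty cycle = 3.3 / 12 = 0.275
print(f"duty cycle: {duty:.3f}")
print(f"output: {buck_output_voltage(v_in, duty):.2f} V")
```

Because the output scales linearly with duty cycle, the controller only needs to adjust D to hold the output at the target level.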

BUCK POWER CONVERTER

WHY BUCK CONVERTERS ARE USED

Buck converters are mostly used for DC-DC conversion in power management and for microprocessor voltage-regulator applications. They suit applications that require fast load and line transient responses and high efficiency over a wide load-current range. A buck converter can convert a voltage source into a lower regulated voltage, and it helps provide longer battery life for mobile systems that spend most of their time in "stand-by". Buck converters are also used as switch-mode power supplies for baseband digital cores and for RF power amplifiers.

OPERATION

The name buck converter comes from the fact that the input voltage is chopped, or "bucked", so that a lower-amplitude voltage is created at the output. This converter provides non-isolated, switch-mode DC-DC conversion, and its advantages include simplicity and low cost.

In this figure, the converter accepts a DC input and uses pulse-width modulation (PWM) of the switching frequency to control the internal MOSFET; with the help of an external diode, inductor, and output capacitor, a regulated DC output voltage is created.
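The external inductor and output capacitor mentioned above set how much ripple remains at the output. A rough sketch of the standard first-order estimates is shown below; the component values, switching frequency, and operating point are assumptions chosen for illustration, not values from the text.

```python
# Hedged sketch: first-order ripple estimates for a buck converter's
# inductor and output capacitor, assuming continuous conduction.

def inductor_ripple_current(v_in, v_out, f_sw, L):
    """Peak-to-peak inductor current ripple: dI = (V_in - V_out) * D / (f_sw * L)."""
    duty = v_out / v_in
    return (v_in - v_out) * duty / (f_sw * L)

def output_ripple_voltage(delta_i_l, f_sw, C):
    """Peak-to-peak output voltage ripple from the capacitor charge swing."""
    return delta_i_l / (8.0 * f_sw * C)

# Assumed example operating point: 12 V in, 5 V out, 500 kHz, 22 uH, 47 uF.
d_i = inductor_ripple_current(12.0, 5.0, 500e3, 22e-6)
d_v = output_ripple_voltage(d_i, 500e3, 47e-6)
print(f"inductor ripple: {d_i*1e3:.1f} mA")
print(f"output voltage ripple: {d_v*1e3:.2f} mV")
```

A larger inductor or capacitor, or a higher switching frequency, reduces the ripple, which is the trade-off a designer works through when sizing these parts.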

In this figure, a capacitor is added across the load resistor to reduce the ripple in the voltage across it, and an inductor smooths the current passing through it. The PWM controller compares the DC output with a reference voltage and adjusts the PWM duty cycle to maintain a constant voltage: if the output voltage falls, the duty cycle increases to restore the output; if the output voltage rises, the duty cycle decreases.
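The feedback action just described can be sketched as a simple control loop: an integral-style controller nudges the duty cycle up when the output is below the reference and down when it is above. The gain, step count, and ideal averaged output model below are assumptions for illustration, not the controller from the figure.

```python
# Illustrative sketch of PWM feedback regulation: the duty cycle is
# adjusted until the (ideal, averaged) output matches the reference.

def regulate(v_in, v_ref, steps=200, gain=0.05):
    duty = 0.5                          # assumed initial duty-cycle guess
    for _ in range(steps):
        v_out = v_in * duty             # ideal averaged buck output
        error = v_ref - v_out           # output low -> positive error -> raise duty
        duty += gain * error            # integral-style correction
        duty = min(max(duty, 0.0), 1.0) # duty cycle must stay in [0, 1]
    return duty, v_in * duty

duty, v_out = regulate(v_in=12.0, v_ref=3.3)
print(f"settled duty cycle: {duty:.3f}")
print(f"regulated output: {v_out:.2f} V")
```

If the input voltage or the load changes, the same loop drives the duty cycle to a new value that keeps the output at the reference, which is the behavior the paragraph describes.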