I have designed three power supplies with 3.3 V, 2.5 V, and 1.2 V outputs.

I use a BD9870FPS buck converter to step down an 8 V - 24 V input to 3.3 V.

The 3.3 V rail then feeds two ENPIRION EN5322QI buck converters, which step it down to 2.5 V and 1.2 V. These two supplies power the load.

The 3.3 V output also powers part of the load.

The question is: when I tested all three, the output voltages were stable and looked good. However, the input current draw decreases as the input voltage to the 3.3 V buck converter is increased. Is this normal?

I read that a buck converter steps down voltage by increasing current. But what I observed was the other way around.

Can you help me explain this?

Have you heard of the law of conservation of energy? You increase the voltage, so the current must decrease in order to maintain the same energy consumption. – Eugene Sh. 2 hours ago

What type of load is connected to the output of the buck converter? If it is a constant-power load, then it's energy conservation. If not, please explain the nature of the load. Is it a pure resistance or something else? – Saprativ Saha 2 hours ago

Yes! \$P = U \cdot I\$. Increase the voltage and the current drops, and vice versa. That's the entire point of having a switcher instead of a linear supply.
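
To make that concrete, here is a hypothetical worked example (the 1 A load current and the lossless converter are assumptions for illustration, not values from the question). If the 3.3 V output delivers 1 A, the input has to supply the same 3.3 W no matter what its voltage is:

$$I_{in} = \frac{3.3\ \text{V} \times 1\ \text{A}}{8\ \text{V}} \approx 0.41\ \text{A} \qquad\text{versus}\qquad I_{in} = \frac{3.3\ \text{V} \times 1\ \text{A}}{24\ \text{V}} \approx 0.14\ \text{A}$$

Tripling the input voltage cuts the input current to roughly a third, which is exactly the behaviour you measured.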


I read that a buck converter steps down voltage by increasing current.

This means that the output current is higher than the input current.

What you are seeing is that the input current depends on the input voltage. As others have mentioned, this is a consequence of conservation of energy, which gives the following rule for a buck (or any other switching) converter:

$$V_{in}I_{in} = \frac{1}{\eta}V_{out}I_{out}$$.

Here, \$\eta\$ is an efficiency factor to account for power lost in the converter itself. It does vary slightly when the other parameters are changed, but it's typically between 0.85 and 0.95 for a well-designed buck converter.

So you can see that if the output voltage and current stay the same, but you increase \$V_{in}\$, \$I_{in}\$ will decrease.
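
Rearranging for the input current makes the dependence explicit. With example numbers that are only assumptions (a 1 A load on the 3.3 V rail and \$\eta = 0.9\$, neither taken from your design):

$$I_{in} = \frac{V_{out}I_{out}}{\eta\,V_{in}} = \frac{3.3\ \text{V} \times 1\ \text{A}}{0.9 \times 12\ \text{V}} \approx 0.31\ \text{A}$$

Doubling \$V_{in}\$ from 12 V to 24 V roughly halves this to about 0.15 A.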


I read that a buck converter steps down voltage by increasing current. But what I observed was the other way around.

Your logic is incorrect.

1) You step down the voltage by increasing the current.

2) Therefore, the more you step down the voltage, the more you can increase the current.

3) The more you can increase the current, the less input current you need for the same output current.

4) Therefore, the more you step down the voltage, the less input current you need.

5) Thus, for the same output, the higher the input voltage, the lower the input current.
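
If it helps to see numbers, here is a minimal Python sketch of the same reasoning. The 1 A load current and 90 % efficiency are assumptions chosen for illustration, not values taken from your design:

    # Illustrative only: input current of a buck converter delivering a fixed
    # output, swept over the input voltage range. Load and efficiency are assumed.
    V_OUT = 3.3        # output voltage in volts
    I_OUT = 1.0        # assumed load current in amperes
    EFFICIENCY = 0.9   # assumed converter efficiency (typically 0.85-0.95)

    def input_current(v_in):
        """Input current needed to deliver V_OUT * I_OUT from the given input voltage."""
        p_in = (V_OUT * I_OUT) / EFFICIENCY   # conservation of energy, including losses
        return p_in / v_in                    # P_in = V_in * I_in

    for v_in in (8, 12, 16, 20, 24):
        print(f"V_in = {v_in:2d} V  ->  I_in = {input_current(v_in):.3f} A")

Running it shows the input current falling from about 0.46 A at 8 V to about 0.15 A at 24 V while the output stays fixed.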

