I’m looking at getting a bench-top DC power supply to tinker with LED projects, etc.
This may be a silly or obvious question, but is the output current control a literal “I’m pumping out x Amps” setting, or is it more of a cap on the current output?
When using it to power an LED driver, the driver normally determines its own current draw, so how does that work in this situation?
I’m wanting to do some resistor mods on various drivers, and I want to be able to measure input current and output current. How do I use a DC power supply in this context?
You can use it either to current-limit or to set the actual current output. LEDs require constant current because they have extremely low dynamic resistance above their forward voltage: a small increase in voltage produces a large increase in current draw.
In this case you want to limit the current. Set the limit where you want it, and as you raise the voltage the current will rise until it hits that limit; at that point the supply holds the current there by pulling the voltage back down.
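To see why that small-voltage/big-current claim matters, here’s a minimal sketch using the Shockley diode equation. The parameters (`i_s`, `n`, `v_t`) are illustrative assumptions, not data for any real LED; only the ratio between the two currents is the point.

```python
import math

def led_current(v, i_s=1e-12, n=2.0, v_t=0.025):
    """Shockley diode model: I = I_s * (exp(V / (n*V_T)) - 1).
    i_s, n, and v_t are made-up illustrative values."""
    return i_s * (math.exp(v / (n * v_t)) - 1)

# A small voltage step produces a disproportionately large current step:
i1 = led_current(3.00)
i2 = led_current(3.10)  # only +0.1 V higher
print(f"{i2 / i1:.1f}x more current for a ~3% voltage increase")  # 7.4x
```

This is why you drive LEDs from a current limit rather than a voltage setting: the exponential I-V curve means the voltage knob alone gives you almost no fine control over current.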
I use a benchtop power supply a lot. Mine is an old-style model built around a heavy AC-DC transformer; that is not what you want for working with LEDs and drivers.
The most useful feature is the current limit. You can set the current to the lowest level and not worry about your components. Well, with a modern PS. My UNI-T does not limit current immediately (probably too much energy is stored in its coils and capacitors), and sometimes I burn LEDs when the voltage is set well above Vf, even with the current set to the lowest level. A modern, fully electronic PS should not have this problem.
The stock leads are awful; in most cases you’ll want to make your own (connectors and silicone wire from eBay).
But it is still very useful. You can check how LVP works, and see how FET and linear drivers drop current (while boost and buck drivers raise it) as the input voltage drops. When I use COBs or several LEDs connected in series (while a batch of them will be connected in parallel), I check Vf to avoid issues. You can also run DC motors, revive dead cells, etc.
The numbers on the digital display are not precise at all (I’ve had several budget models, and all were bad on this front), so don’t try to calculate driver losses from them. To imitate your driver’s normal operation, set the voltage you need and leave the current setting at maximum (in 99% of cases). My PS is rated for 5 A, but its internal resistance is never comparable to a 5 A cell’s, so direct-drive current from the PS is always lower than from a real cell. If you limit the current below your driver’s rating, linear and FET drivers will drop their output current (the input voltage will also sag, so at some point you’ll trip LVP). Boost and buck drivers always have a “saturation point”: they try to deliver their rated output whenever possible. If you slowly reduce the input current you can find the point where the output is still good but the driver produces extra noise; you don’t want to hit that point in real usage, and finding it gives you rough information about the cell properties the driver needs.
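Since the original question was about measuring input and output current, here’s a trivial sketch of the arithmetic once you have trustworthy meter readings. The function name and the sample numbers are made up for illustration; use your own measurements, not the supply’s front-panel display.

```python
def driver_efficiency(v_in, i_in, v_out, i_out):
    """Efficiency = output power / input power, from four meter readings."""
    p_in = v_in * i_in    # power drawn from the supply
    p_out = v_out * i_out  # power delivered to the LED
    return p_out / p_in

# Hypothetical example: a buck driver fed 8.0 V at 1.50 A (12.0 W in),
# delivering 6.0 V at 1.80 A to the LED (10.8 W out).
eff = driver_efficiency(8.0, 1.50, 6.0, 1.80)
print(f"efficiency = {eff:.0%}")  # prints "efficiency = 90%"
```

For a linear driver the same formula works, but the “loss” is simply (Vin − Vf) × I burned off as heat, so efficiency falls as input voltage rises.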
It sort of is a silly question, but how do you think a power supply is able to pump out current? Current flow is a consequence of voltage applied across a circuit’s impedance/resistance. So, to increase current, the supply increases voltage, and vice versa.
Both the voltage and current settings are limits on the supply’s output. When the load resistance is very high (a “no load” condition), the output voltage sits at the set value (the voltage cap). When the load resistance is low enough that the current would rise above the set value (for example, shorting the output terminals), the supply, which senses its output current, reduces the output voltage so that the current stays capped at the set value.
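That two-limit behavior can be sketched in a few lines for a purely resistive load (a real LED isn’t resistive, but the crossover logic is the same): the supply regulates whichever limit is hit first.

```python
def supply_output(v_set, i_set, r_load):
    """Return (voltage, current, mode) for an ideal supply into a resistor.

    v_set: voltage limit (V), i_set: current limit (A), r_load: ohms.
    """
    i_at_vset = v_set / r_load
    if i_at_vset <= i_set:
        # Load is light: constant-voltage mode, current follows Ohm's law.
        return v_set, i_at_vset, "CV"
    # Load is heavy: constant-current mode, voltage drops to I*R.
    return i_set * r_load, i_set, "CC"

print(supply_output(v_set=5.0, i_set=1.0, r_load=10.0))  # (5.0, 0.5, 'CV')
print(supply_output(v_set=5.0, i_set=1.0, r_load=2.0))   # (2.0, 1.0, 'CC')
```

The crossover point is at r_load = v_set / i_set; above it the supply acts as a voltage source, below it as a current source, which is exactly the “cap” behavior the original question asked about.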