I'm trying to build a 3.3 V power supply using an old laptop charger that supplies 6.4 V as the input. I'm using an LM317 regulator with two external resistors to set the output to 3.3 V. R1 is fixed, while R2 is a 10k pot that can be finely adjusted to within an ohm or so. With no load connected, the output is what I want, and I can trim it with the potentiometer. I used the formula Vout = 1.25 V × (1 + R2/R1), which is in the LM317 datasheet. However, I've realized that when low-resistance loads are connected to the output, the output voltage drops considerably. I tested this with a 14 Ω resistor between the output and ground, and the voltage across it ended up being only 0.27 V. This does not happen with larger resistances.
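To put some numbers on that test (this is just my own back-of-the-envelope arithmetic, ignoring the regulator's quiescent and adjust-pin currents): if the output were holding 3.3 V, the 14 Ω resistor should draw about

$$I = \frac{3.3\ \text{V}}{14\ \Omega} \approx 0.24\ \text{A},$$

but at the 0.27 V I actually measure, it is only drawing roughly 19 mA.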
I've looked through some old posts and found that 120 Ω is strongly recommended for R1. However, I also found a website (link below) that essentially says anywhere from 100 Ω to 1000 Ω is fine, and since I was getting my desired voltage with no load connected, it seemed like R1 shouldn't be the issue. Still, I have little experience with this part, so I'm not sure.
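For reference, if R1 were the recommended 120 Ω (just an example value; this is my own rearrangement of the datasheet formula, neglecting the small adjust-pin current), the pot would need to be set to roughly

$$R_2 = R_1\left(\frac{V_\text{out}}{1.25\ \text{V}} - 1\right) = 120\ \Omega \times \left(\frac{3.3}{1.25} - 1\right) \approx 197\ \Omega.$$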
I understand that the output of a simple voltage divider without a regulator can be affected by small loads, which is why I included the regulator in the first place. Ultimately I would like to use this supply to power an AD9914 DDS, but for now I've been testing it on resistors. Included below are my circuit and the datasheet for the LM317. I also see that in the LM317 datasheet, two capacitors are added to provide some sort of stability. How necessary are they, and could leaving them out be the cause of my problem? I'd like to understand the theory behind how a regulator circuit works rather than blindly copying a circuit and using an equation I don't understand. If it helps, I have previous experience using op-amps in a lab. Thanks in advance.
Here are the details of the LM317: http://www.kynix.com/Detail/799868/LM317.html