As has been intimated, there are 1001 reasons for the various discrepancies. OK, I exaggerate somewhat, but nevertheless, this really is something of a minefield.
For a start, the meter may well show both an AC voltage reading and a DC voltage reading. On the AC scale, it may even show different AC voltages depending on which way round the meter leads are connected. It could even show no AC voltage one way, but an AC voltage the other. Probably the best advice is to measure DC voltage if you are expecting DC voltage, or AC voltage if expecting AC voltage. Provided you always start from the highest range and work down, this should not damage the meter, and the high internal resistance of the meter will not harm a power unit.
In respect of the power units, there are:
Basic transformer. AC output only.
Basic transformer with rectifier. DC output, but your meter is unlikely to show the rated voltage, since the actual output is dependent on the current loading; off-load it will usually read higher than the rating.
Basic transformer with rectifier and smoothing capacitor. DC output. Again, the actual output is dependent on the current loading.
Transformer-rectifier-capacitor with electronic stabilisation. These will give a specific voltage regardless of the load current up to rated load current. Above this, the unit may current limit by reducing the output voltage or simply shut down until the overload is removed.
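For the unregulated types above, the off-load reading can be estimated: the smoothing capacitor charges toward the peak of the AC waveform (sqrt(2) times the RMS rating), less the rectifier diode drops. A minimal sketch, assuming a full bridge rectifier and a nominal 0.7 V drop per silicon diode (both assumptions, not from the original post):

```python
import math

def estimated_no_load_dc(v_ac_rms, bridge=True, diode_drop=0.7):
    """Rough no-load DC output of an unregulated transformer-
    rectifier-capacitor supply. The capacitor charges toward the
    AC peak; two diodes conduct at a time in a bridge rectifier."""
    peak = v_ac_rms * math.sqrt(2)
    drops = 2 * diode_drop if bridge else diode_drop
    return peak - drops

# A transformer with a 9 V RMS secondary, bridge-rectified:
print(round(estimated_no_load_dc(9.0), 1))  # roughly 11.3 V off-load
```

This is only a ballpark figure, but it shows why a unit labelled 9 V DC can quite legitimately read 11 V or so on the meter with nothing connected.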
And then there are the SMPS (Switched Mode Power Supply) types. These are the ones which reside in a small box, and can give a high current output and a relatively high voltage output. The power that these units can supply is out of all proportion to their size. A good example is the power supply for a laptop, which may be something like 19 Volts and 4.6 Amps out of a box 4" x 1.6" x 1". These are actually highly efficient, which is why they are so small. Also, because of their design, they hold their rated voltage very well from no load up to their maximum loading. I don't know what happens if overloaded, but I would not be at all surprised if they simply shut down until the overloading is removed. They are, after all, a highly complex electronic circuit.
Finally, there are the specialist power units used for charging batteries. These are designed to limit overcharging, and overheating in the case of NiCd and NiMH. I don't know anything about these other than that they are capable of providing sufficient voltage at a suitable current for the number of cells being charged; eg, I have a rapid charger for NiMH cells which can detect how many cells are connected (2 or 4), and when they are charged (by measuring the voltage across the cells whilst charging). I also have another NiCd/NiMH charger which simply charges at a rate sufficiently low to avoid overcharging.
There are also power supplies used in desktop computers etc. which can detect whether they are connected to a motherboard, and will not supply power unless so connected.
As you can see, there is a wide variety of power supplies, and really the only basic way to test them is to measure for the expected voltage, eg 9V DC or 15V AC, and see what comes out. If the reading is higher than expected, but not grossly higher, then it is possible that the unit is satisfactory: a unit rated at 9V DC could well be OK if the meter indicates say 11V DC, but 34V DC would be somewhat worrying. If you think it is OK, then a cheap second test would be to connect up a torch bulb, or a car bulb, and see what happens.
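That "higher, but not grossly higher" rule of thumb can be written down. A minimal sketch, where the 0.9 lower bound and the 1.4 off-load headroom factor are my own assumed tolerances (roughly the sqrt(2) peak-to-RMS ratio of an unregulated supply), not figures from the original post:

```python
def supply_looks_ok(rated_v, measured_v, low=0.9, headroom=1.4):
    """Crude sanity check on an off-load voltage reading.
    Readings up to ~40% above the rating are normal for an
    unregulated supply (the AC peak is sqrt(2) times the RMS
    rating); grossly higher readings suggest a fault.
    The low and headroom factors are assumed values."""
    return rated_v * low <= measured_v <= rated_v * headroom

print(supply_looks_ok(9, 11))   # True  - plausibly fine
print(supply_looks_ok(9, 34))   # False - worrying
```

As in the text, this is only a first screen; the bulb test that follows is still the cheap way to confirm the unit holds up under an actual load.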
Regards,
Peter G. Shaw