There are a bunch of factors that reduce accuracy and precision once you get into the detail. First you have to know what you have and compare like with like. For example:
- Dial types are typically ±0.02mm but many are ±0.05mm.
- Digitals are also typically ±0.02mm but some are ±0.01mm and a few ±0.005mm.
So any typical digital will outperform a ±0.05mm dial type, and the best digitals outperform the best dials. However, I bet most of us own calipers only good to ±0.02mm, irrespective of whether they have a dial or a digital readout.
Awkward questions arise about any caliper's claim to accuracy as soon as you look at the detail. Do you ensure that the object and the caliper are both at the same temperature before taking measurements? Do you wear gloves when handling the caliper? Both digital and dial calipers are vulnerable to dirt, dial mechanisms particularly so: is the instrument clean? How repeatable are measurements? Is the instrument accurate over its full scale or just in one place? Does it have an up-to-date calibration certificate? If not, a long list of problems may be hidden under the bonnet: wear, bent jaws, loose gibs, skipping teeth, or other damage may be making the instrument untrustworthy.
Digital and analogue displays are both imperfect. A fault of the digital display on a ±0.02mm caliper is that its 0.01mm resolution implies an accuracy of 0.01mm, which is a big fib: a reading of 25.43mm on a ±0.02mm instrument only tells you the true size lies somewhere between 25.41 and 25.45mm. The faults of the dial display are that it's subject to mechanical and parallax errors, plus a dial can be misread by a whole revolution, i.e. 10.05mm read as 11.05mm.
Worst of all is operator error. Not everyone is good at measuring! Getting the jaws aligned on the object isn't always easy. The pressure applied by the operator is both critical and difficult to judge: too little and the jaws won't seat properly on the object, too much and the caliper will flex. Soft objects like plastics may deform under the jaws.
One thing that makes me chuckle is chaps on YouTube 'proving' that their calipers are better than another make by checking a gauge block. The problem is operator bias. If you know the correct answer in advance, you tend to get the answer you expected rather than the truth. As operator bias is usually subconscious rather than dishonest, nothing is proved by this type of testing. If you really want to know how good you and your instrument are, measure the dimensions of unknown objects handed to you randomly by a third party, and only compare your readings with the true sizes afterwards. Some who do this will be pleasantly surprised, most will be moderately disappointed, and a few horrified.
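If you do try a blind test like that, it's easy to put a number on the result afterwards. The sketch below is only an illustration: the readings and reference size are made-up figures, and the simple bias-and-spread calculation is just one way of scoring yourself, not a standard anyone insists on.

```python
# Rough sketch of scoring a blind measurement test.
# The readings and reference size below are made-up numbers for illustration;
# substitute your own blind readings and the true size revealed afterwards.
import statistics

readings_mm = [12.46, 12.43, 12.48, 12.45, 12.44]  # your blind readings of one object
reference_mm = 12.45                                # true size, revealed after measuring

errors = [r - reference_mm for r in readings_mm]

bias = statistics.mean(errors)     # systematic offset: are you consistently high or low?
spread = statistics.stdev(errors)  # repeatability: how much your readings scatter

print(f"bias   = {bias:+.3f} mm")
print(f"spread = {spread:.3f} mm (1 standard deviation)")
```

If the bias plus a couple of standard deviations comes out bigger than the maker's ±0.02mm claim, what's really been measured is the operator-plus-instrument combination, which is rather the point.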
Like as not, in the typical workshop both types are 'good enough' in the right hands. I say use whichever you prefer.
Dave