|James Alford||12/11/2020 17:34:54|
|435 forum posts|
I have bought some second hand 2", 3" and 4" micrometers. Is there any way to check that they "zero" correctly without using a block of known size?
1207 forum posts
You need an accurate test piece, preferably the size of the minimum reading of the micrometer, in order to zero the scale. I don’t know of any other method by which you can achieve this. The test piece can be a micrometer test standard or slip gauges.
|old mart||12/11/2020 17:43:36|
|3345 forum posts|
Difficult without the length bars they came with when new. To check the calibration of a micrometer properly, a box of slip gauges is used to measure odd sizes, so that wear or damage to the micrometer leadscrew is checked as well as the zero, which is what the length bar is for.
|Michael Gilligan||12/11/2020 17:43:48|
18932 forum posts
Starrett make some excellent ‘rods’ for that very purpose, available at surprisingly modest prices.
... I believe the Mitutoyo equivalent is ‘reassuringly expensive’ [but haven’t checked recently]
Edit: I beg Mitutoyo’s pardon ... This is not at all unreasonable:
Edited By Michael Gilligan on 12/11/2020 17:53:25
|Howard Lewis||12/11/2020 17:58:23|
|5299 forum posts|
You can buy checking pieces for Micrometers.
I think that Cromwell Tools are where I got some.
If the mic does not zero after three clicks of the ratchet, the "C" part of the spanner supplied with the mic can be used to rotate the thimble until the two zero lines coincide.
Ideally, the mic and the standard should be allowed to soak to the same temperature for 24 hours. A warm mic set with a cold standard, or vice versa, are not the routes to the accuracy sought.
A secondhand mic may have defects, such as worn threads or even a strained frame (throwing the anvils out of parallelism ) or damage from trying to measure rotating work and consequently being thrown across the shop!
If the mics are modern and have carbide anvils, they can easily be chipped.
|Neil Lickfold||12/11/2020 18:08:22|
|720 forum posts|
Inspecting micrometers is a specialist's job, and requires things like an optical flat and monochromatic light, apart from length stacks that check the micrometer at different positions. There are now newer methods of inspecting the spindle accuracy and the anvil runout at the same time, while it is rotated. But zero is still done with length bars.
|Andrew Tinsley||12/11/2020 18:43:16|
|1485 forum posts|
Just as Michael said, Starrett do the standard rods for zeroing micrometers. I purchased four about 18 months ago and could not believe how cheap they were!
|Howard Lewis||12/11/2020 18:49:46|
|5299 forum posts|
As Neil Lickfold says, to check a measuring instrument properly requires a temperature and humidity controlled room with highly specialised equipment. Not to mention skill.
We have no control over the accuracy of the standards for each mic. What tolerances apply to the length standards?
Possibly not as stringent as a set of slips held in a Calibration Room. And even such a standard is traceable back to NPL standards or an international standard.
And do we question the accuracy with which the temperature and humidity of the Calibration Room is controlled?
So then we question the accuracy of the standards.
As hobbyists, not being in Industrial Clean Room or Calibration Room conditions and lacking such extreme environmental controls, we have to do the best that we can with what is available to us.
So chasing hundredths of a thou is not really practicable.
Hence the advice to soak the instrument and standard for 24 hours, and to avoid holding either for too long. (Thermal insulation pads do help to reduce the influence of body temperature on the dimensions.)
To exert a constant torque, and therefore, we hope, a constant force on the anvils of the mic, we rely on the ratchet behaving consistently when we take our measurements or check against the standard.
Unless the force exerted is constant, the frame of the mic will deflect by a varying amount to reduce accuracy, just as temperature departures from the usual standard of 20 degrees Celsius will detract from absolute accuracy.
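To put a rough number on the temperature effect, here is a minimal sketch in Python (assuming plain steel with a linear expansion coefficient of roughly 11.5 × 10⁻⁶ per °C; the exact figure varies with the alloy):

```python
# Rough estimate of how far a steel length standard drifts from its
# nominal size when the workshop is not at the 20 degC reference.
ALPHA_STEEL = 11.5e-6  # linear expansion coefficient, per degC (approximate)

def expansion_error(length_in, temp_c, ref_c=20.0):
    """Change in length (inches) of a steel bar of nominal size
    length_in when it sits at temp_c instead of ref_c."""
    return ALPHA_STEEL * length_in * (temp_c - ref_c)

# A 4 inch standard in a 25 degC workshop:
err = expansion_error(4.0, 25.0)
print(f"4 in standard at 25 degC is off by {err * 1000:.2f} thou")  # about 0.23 thou
```

So a few degrees either way already amounts to a couple of tenths of a thou on a 4" standard, which is why the 24 hour soak matters.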
Let us not confuse ourselves with delusions of accuracy. If a 20 mm diameter piston produces an acceptable fit in a supposedly 20 mm cylinder, the parts are fit for purpose. This statement assumes that we are not manufacturing a sub-atmospheric pump to produce a pressure level of a couple of mm Hg.
A plea to be realistic in our expectations.
|Michael Gilligan||12/11/2020 19:01:23|
18932 forum posts
Here’s a drawing of the Starrett 1”
I do hope they are not serious about that fractional tolerance !!
Edited By Michael Gilligan on 12/11/2020 19:05:11
|Stuart Bridger||12/11/2020 19:01:29|
|531 forum posts|
Excellent post from Howard, absolutely spot on.
|Andrew Johnston||12/11/2020 19:30:13|
6266 forum posts
I'm mystified as to how one can check zero with a length standard, unless it's of zero thickness?
To set zero I simply close the micrometer anvils in the normal manner and tweak the thimble to read zero. I then use length standards, gauge blocks or screw-together length bars (depending on what I have and the size of the micrometer) to check the full scale reading. I might also check some intermediate values as a sanity check on linearity.
Of course it's not up to toolroom standards, but it's plenty good enough for the standard my work needs.
|Zan|
|282 forum posts|
Andrew, he said: “I have bought some second hand 2", 3" and 4" micrometers. Is there any way to check that they "zero" correctly without using a block of known size?”
The only way to zero a 2” micrometer is with a 1” standard; you are referring to a 0-1” micrometer.
Edited By Zan on 12/11/2020 19:45:27
1487 forum posts
I guess some of the answer will depend upon whether you're making stuff purely for yourself, or for someone else to use at the far end. If your "customer" needed calibrated standards for the parts, you wouldn't need to be trying this yourself.
The general conclusion was that for my home use, consistency within the workshop is more important than absolute accuracy, so long as we are "close enough", for want of a better expression.
I had a variety of second hand mics, of dubious origins, from 0-6" with no standard length bars; I did eventually buy a 1-2" with a 1" round disk standard in the box (I'd no idea how accurate that was).
My 0-1" was set to zero OK as normal, and checked wide open with the 1" standard from the 1-2" mic. All seemed to be OK
Having adjusted/"proved" the 1-2" mic at both ends, find something that's almost exactly 2" long/diameter, measure it with the wide end of the 1-2" mic and then the short end of the 2-3", and make sure there is consistency between the two instruments.
Carry on and work through your external mics, doing sanity checks with internal mics as you work through everything. Your newly made/measured home 2" standard, doesn't have to be 2" of course, provided both 1-2" & 2-3" mics measure it as the same 1.994"
On the other hand, I recently picked up several larger mics, up to 10" I think.
|2912 forum posts|
I believe you need a Micrometer Calibration Gauge Block Set to accurately check them. Unlike ordinary sets they contain specific sizes for the job. Checking with a single size is apparently pointless but I’m no metrology expert.
|James Alford||12/11/2020 21:53:38|
|435 forum posts|
Thank you for all of the replies, which will be really useful. It had never occurred to me to think about how to check these micrometers when I bought them. I have a 1" micrometer which I have set by closing the anvils and tweaking the barrel.
Anything that I measure is purely for my own use, not for anyone else, so extreme accuracy is not needed. I bought them so that I could measure things like the journals on my crankshaft and other similar car parts.
I shall look for some gauges or, as suggested, large ball bearings.
|Bill Davies 2||12/11/2020 22:35:32|
|246 forum posts|
Not far from Stuart's employment, I did an HNC unit on metrology at Brooklands College. My employer had multiple inspection departments and a temperature-controlled standards room, which I worked in for a while.
For our purposes, it's worth remembering that the micrometer, length standards and workpieces are likely to be steel, so we can largely ignore temperature control, as all will expand or contract to the same extent. Allow a few hours to pass and ensure that you handle everything as little as possible, to avoid differentially heating them with your hands. Use cloths as insulators.
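Bill's same-material point can be illustrated with a quick sketch (assuming everything is steel with the same expansion coefficient, taken here as 11.5 × 10⁻⁶ per °C):

```python
# If the standard and the workpiece are both steel at the same
# temperature, they scale together, so a comparison between them
# is unaffected even though both are off their 20 degC sizes.
ALPHA = 11.5e-6  # per degC, approximate for steel

def at_temp(nominal_in, temp_c, ref_c=20.0):
    """Actual length at temp_c of a steel piece sized nominal_in at ref_c."""
    return nominal_in * (1 + ALPHA * (temp_c - ref_c))

standard = at_temp(2.000, 28.0)   # 2 in standard in a warm workshop
workpiece = at_temp(2.000, 28.0)  # workpiece made to match the standard
print(f"each is {standard - 2.0:.6f} in over nominal, "
      f"but they differ by {abs(standard - workpiece):.6f} in")
```

Both pieces are a couple of tenths of a thou over their 20 °C size, but the difference between them, which is what the comparison actually measures, stays at zero.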
Length standards that leave the thimble in different positions from the zero will check for a periodic error, but I would generally say that simply checking at a known length will suffice for us. In the workshops I worked in, there was always a series of discs, in one inch steps, to check micrometers and calipers against. The most likely problem is a dropped measuring instrument, which brings the jaws closer together, giving an identical error at all positions. The easiest solution is to buy a new instrument.
I would caution against using ball bearings, as they have a theoretical point contact, and unless using a light feel, the ratchet will tend to cause the size to read slightly under - especially smaller balls. Cylindrical shapes are similarly a potential problem, but less so due to a theoretical line contact.
|Hopper|
|5505 forum posts|
Yes there is. But you also need a 0 to 1" mike.
Set the 0-1" mike to read zero when the anvils are closed. Then use that mike to turn a piece of bar to read exactly 1.000" diameter. (Use emery paper to achieve final size.)
Then use that bar to set your 1-2" mike to read exactly 1.000". Then use that mike to turn a piece of bar to read exactly 2.000" diameter and use that to set the 2-3" mike.
And so forth and so on.
You can double-check your mikes by measuring the outside race diameter of ball bearings. They are made to pretty tight tolerances, which are available online on the manufacturers' websites.
Not toolroom metrology standard for sure, as those of a sensitive nature are sure to point out. But good enough for most home workshop use in a pinch. The only time it is likely to be critical is fitting together two parts such as boring a 1.500" cylinder to fit a 1.499" piston. But if both dimensions are measured with the same mike, you will get the desired clearance anyway.
The main reason industry sets mikes in air-con metrology rooms etc is to ensure consistency of parts size in mass production and so that parts made by different machinists will all fit together. So not necessarily relevant in a one-man home workshop where things are usually individually hand-fitted together.
Edited By Hopper on 12/11/2020 23:18:21
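One caveat to the bootstrap chain above: each turned bar inherits the error of the mike that sized it, so errors accumulate step by step. A rough sketch of the worst case (the 0.2 thou per-step figure is purely an assumed illustration, not a measured value):

```python
# Worst-case error accumulation in the turn-a-bar bootstrap chain:
# each new standard bar carries the setting error of the previous
# micrometer plus the turning/measuring error of the current step.
PER_STEP_ERROR = 0.0002  # assumed worst case per step, inches (0.2 thou)

def chain_error(steps, per_step=PER_STEP_ERROR):
    """Worst-case accumulated error after bootstrapping `steps` mikes."""
    return steps * per_step

for step in range(1, 5):
    print(f"after step {step} ({step}-{step + 1} in mike): "
          f"up to {chain_error(step) * 1000:.1f} thou out")
```

In practice the errors partly cancel rather than all adding the same way, but it shows why the 4" mike at the end of the chain deserves the most scepticism.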
|IanT|
|1895 forum posts|
Someone may have made this point already, James, but I've only quickly skimmed the previous posts.
I've now used slip gauges to check my larger mics (good enough for my purposes) but had "uncalibrated" mics (that I had just cleaned up) for a good while before I did that.
If you are simply using your mics to make comparative measurements (and not absolute ones) then you don't need to know if they are zero'd or not. I should state that I'm not after lab accuracy either, just "good enough"
For instance, if you are checking for wear in two parallel flat surfaces (e.g. a Myford bed) then the mic will measure the difference between the two for you and be accurate. What you cannot do is measure what the actual width is with any confidence, just the difference in width. But for many things that is fine and still useful (till you get the setting gauges anyway). I was surprised how inexpensive the setting gauges were too, btw - it's been a while but a 1" one was about £9 I think.
Edited By IanT on 13/11/2020 00:08:36
|duncan webster||13/11/2020 00:35:43|
|3509 forum posts|
Big ball races are made to very tight tolerance on OD, better than I can work to. If you haven't got any in your 'come in handy' box, try your local garage for old wheel bearings etc.
Actually for most model engineering applications you measure the bit you've just made/bought and machine the next bit to fit, so calibration to external standards isn't that important. (Waits for the howls of protest)
|Michael Gilligan||13/11/2020 07:34:06|
18932 forum posts
No protest from here, Duncan; you make a valid point
Regarding the broader discussion, however:
It is worth noting that ... despite its title this thread started with a simple question about setting the zero reference for some larger micrometers ... not about fully calibrating them.