Measuring Absolute Ball Diameter Commercially
There are a number of rather complicated problems involved in measuring the absolute diameters of precision balls. Before we even start discussing the techniques used in determining the diameter of precision balls, we must accurately determine their sphericity and surface texture. You cannot measure quality into a ball. A high quality commercial ball is only round within 25 millionths of an inch (0.64 micrometers). Measuring it 30 times with a gage that is accurate to one micro inch (0.025 micrometers) will not make it one bit better, nor can you take a volumetric averaging approach, because in all probability the out-of-round condition will be in an odd number of lobes, so the two-point measuring gage you are using will not even detect it.

The important effect of surface quality on the accurate determination of absolute ball diameter cannot be overemphasized. The peak-to-valley height of the surface texture is 3 to 5 times the surface roughness average (Ra). The surface finish of the same high quality commercial ball is 1.5 micro inches (0.038 micrometers) Ra, so at a minimum the peak-to-valley is 4.5 micro inches (0.12 micrometers), and the two contacts of the measuring machine see twice that, or 9 micro inches (0.23 micrometers). A question without an answer is: where does the surface texture end and the body of the ball begin? Corrections for elastic compression of the ball surface by the measuring force will be dramatically skewed by the reduced percentage of bearing area caused by the peak-to-valley of the surface texture. A higher measuring force will cause a disproportionate squashing of the rough ball surface. This reminds me of the old adage that you can't make a silk purse out of a sow's ear. Make sure that you are not trying to measure quality into the balls.
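The arithmetic above (peak-to-valley estimated from Ra, then doubled because both gage contacts ride on the texture) can be sketched as follows, using the figures quoted in the text:

```python
# Sketch: how surface texture inflates a two-point diameter reading.
# Figures from the text: Ra = 1.5 micro inches, peak-to-valley is
# roughly 3x to 5x Ra, and both the anvil and the tip touch texture,
# doubling the effect on the reading.
ra_uin = 1.5                                # roughness average, micro inches
pv_low, pv_high = 3 * ra_uin, 5 * ra_uin    # peak-to-valley band per surface
two_contact_low = 2 * pv_low                # both contacts see texture
two_contact_high = 2 * pv_high

print(f"P-V per surface: {pv_low:.1f} to {pv_high:.1f} micro inches")
print(f"Seen by two contacts: {two_contact_low:.1f} to {two_contact_high:.1f} micro inches")
```

This reproduces the 4.5 and 9 micro inch figures in the text as the low end of the band.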
We are not trying to make the ball measuring process complicated; it is complicated, very complicated.
In addition to a broad array of standard quality commercial balls, we manufacture two qualities of stainless steel master balls. The Grade 5 is spherical within 5 micro inches (0.13 micrometers) with a 0.4 micro inch (0.01 micrometers) Ra surface finish, and the Ultra Precise is a Grade 2.5, which is spherical within 2.5 micro inches (0.064 micrometers) and has a 0.2 micro inch (0.005 micrometers) Ra surface finish. The quality level of this Grade 2.5 is at the very limit of commercially available measuring capability.
It has been shown time and time again that two highly qualified technicians using different pieces of high quality measuring equipment in two different but similar environments will get at least slightly different measurements for the absolute dimensions of the same artifact. The challenge is to develop a procedural philosophy that will hold this gap to an acceptable level. With so very many variables involved we have found redundancy to be an invaluable tool in further assuring a high level of reliability in our ball size evaluation. Some years ago on a visit to the P.T.B. in Germany, I was privileged to observe their gage block calibration procedure. Their superb facility had close temperature control, excellent vibration isolation and state of the art measuring instruments, but the thing that blew me away was the 100% redundancy of their measurements. They would measure a gage block one day in one lab and then the next day a different technician would re-measure it in another lab on a different instrument and only when the two measurements coincided was it reported as the size of that block. We have adopted this basic philosophy for our final calibration of master reference balls. We use a binary level of certainty in that the two measurements must agree within two micro inches on two different days to be accepted and they consistently do.
The first challenge of our methodology is to reduce subjective variables to as near zero as possible. With modern computers so readily available, the first and easiest step is to establish a communication link directly between the measuring instrument and the computer. This link will eliminate ambiguity, as each decision will be made based on predetermined algorithms. It will eliminate clerical errors and it will speed up the actual measuring process substantially, which is important to minimize instrument drift. In addition all of the raw data are stored so that additional math, such as averaging of several measurements and the addition or subtractions of constants and variables, can be automatically achieved.
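The post-processing step described above can be sketched in a few lines. The function and parameter names here are illustrative, not an actual instrument protocol: the point is that raw readings are stored, averaged, and corrected by predetermined algorithms rather than by hand.

```python
# Minimal sketch of the instrument-to-computer post-processing the text
# describes: store the raw readings, average them, then apply
# predetermined corrections (constants such as a master offset,
# variables such as a temperature term). All names are hypothetical.
def reported_diameter(raw_readings, master_offset=0.0, thermal_correction=0.0):
    """Average raw gage readings and apply predetermined corrections."""
    mean = sum(raw_readings) / len(raw_readings)
    return mean + master_offset + thermal_correction

# Three orthogonal readings of a nominal 1/4 inch ball, in inches:
readings = [0.2500012, 0.2500010, 0.2500014]
print(f"{reported_diameter(readings, master_offset=-0.0000002):.7f}")
```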
According to the new ISO philosophy, as set forth in ISO 9001:2008, traceability to the international standard is now required to the actual piece part being measured, not just to the calibration standard or the measuring gage. In attempting to achieve this goal, more and more measuring equipment is moving toward absolute gauging techniques, often using laser scales that can be zeroed immediately before each measurement. You may object, as I do, to the use of "absolute" as the term to describe a measuring device that starts at zero and reads to a discrete end point, but this has become the jargon. In a sense, calipers and micrometers can be considered absolute gages, as contrasted with a comparative gage system like a dial indicator and a stack of gage blocks, with which an exemplar part is compared.
We have written this paper based on a question and (hopefully) answer format. Our questions and their answers should give the information required and the reasons behind the techniques and the methods employed in determining the free space diameter of precision balls. By free space, we mean the ball is floating at sea level with no force being applied to the ball by the measuring machine or by the mass of the ball against other objects. At present, actual measurements cannot be made under these conditions, so corrections for the force applied must be made to the measurements that we do make.
None of our questions are frivolous, as each and every one must be answered, to be sure that the measurements are reliable.
Let's start with the questions. Although much of the technical knowledge in this paper is transferable to other measuring problems, you must keep in mind that we are only measuring precision balls.
The questions are:
- Do you demagnetize the balls to be measured?
Most precision balls are made of high carbon steel alloys that have a high magnetic retentivity. If these balls are exposed to a strong magnetic field they will become permanently magnetized and will then retain iron-based debris that no amount of cleaning will remove. All steel-based balls must be demagnetized before measurement, because there is no way of knowing the prior magnetic exposure to which the balls may have been subjected.
- How do you clean the balls for measurement?
Cleaning the precision balls before measurement can be another complex task. The level of cleanliness required will depend on the desired certainty of the measurement.
One of the considerations is where the balls originated. Rust preventative oils that are applied by commercial ball manufacturers contain polar compounds that can polymerize over time to form waxes and varnishes. These tenacious materials are actually a byproduct of the rust preventative oil locking up the moisture that comes into contact with the surface of the balls. To reduce the problem of cleaning, our master balls are only lightly coated with mineral oil.
In general there are two types of contaminants that must be removed from the balls before measurement. The first is particulate: solid particles of dust and dirt. The second is adhesive films, including oils, greases, waxes and varnishes. Water based cleaning solutions present a serious corrosion problem when dealing with really fine quality balls. For this reason, organic solvents are normally the only safe approach for cleaning balls of the highest quality.
When using solvents for cleaning, great care must be exercised to prevent the operator from inhaling any substantial quantities of these liquid agents or absorbing them through contact with the skin. Solvent cleaning of the balls will remove the adhesives that hold the particulate on the surface of the balls. In this way we eliminate both forms of contamination.
Three or even four different solvents may be required to remove the various organic adhesives that may be attached to the surface of the balls. The four common solvents used are mineral spirits, acetone, alcohol and the very highest purity water. We use ultrasonically assisted cleaning.
In our experience, for the absolute ultimate in cleaning, when we are doing research at the nanometric level, we use three sequential cleanings in ultrasonically activated baths of medical quality ethyl ether.
Most contaminants, both particulate and adhesive, are of organic origin. An excellent way to check for their presence is by illuminating the cleaned balls with ultraviolet (black) light. Any remaining organic matter will fluoresce.
- Are you making absolute measurements or are they comparative measurements?
Absolute gauging is faster, more versatile and at least as accurate as the comparative method.
- Are you using flat parallel measuring surfaces?
We believe that this is essential for high quality ball measurement.
- If they are flat, parallel measuring surfaces, how do you evaluate their flatness and parallelism? I only know of one major metrology lab that doesn't use flat parallel measuring surfaces, although there may be more. A Fizeau Optical Interferometer is the best way to check these conditions but an X and Y scan of the measuring tip is entirely acceptable. By measuring the diameter of a small precision ball, say 3 mm (0.1181 inch), as it is moved in small increments from the front to the back and then from one side to the other side, an accurate scan of both flatness and parallelism can be made.
We have spent thousands of man-hours developing tools and techniques to perfect both the generation of flat parallel surfaces and the evaluation of these conditions. These conditions can be a major source of error in determining absolute ball size: one light band, which is the generally accepted flatness tolerance for gage anvils, will add eleven micro inches (0.279 micrometers) of uncertainty to the gage's error budget.
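As a rough cross-check of the eleven micro inch figure: one interference band corresponds to half the wavelength of the light used. A sketch, assuming the classic helium light reference (the wavelength is a handbook value, not stated in the text):

```python
# Sketch: converting a flatness spec expressed in "light bands" to
# linear units. One band = half the wavelength of the light used;
# helium light (~23.1 micro inches) is the traditional reference.
helium_wavelength_uin = 23.1
one_band_uin = helium_wavelength_uin / 2        # ~11.6 micro inches
one_band_um = one_band_uin * 0.0254             # convert to micrometers
print(f"one light band ~ {one_band_uin:.2f} micro inches ({one_band_um:.3f} micrometers)")
```

This lands close to the eleven micro inches (0.279 micrometers) quoted above; the small difference comes from the wavelength assumed.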
- If you are using a flat measuring anvil and a spherical gage tip instead of flat parallel surfaces, how do you know when you are at the Apex of the sphere? As this is a two-axis problem, some form of dual axis scan will be required. We feel that this method is far too slow and too complex to be practical.
- If you are using a flat anvil and a spherical gage tip, what is the front-to-back and side-to-side hysteresis (repeatability) of the gage? This can be determined by measuring a flatted cylinder with approximately the same radius as the ball. The cylinder is pushed forward under the gage and then pulled back. Next it is introduced from the left side and then from the right. Any variation in the peak readings is the mechanical hysteresis and can be an important part of the gage's error budget. If you don't have this cylindrical device, a gage maker can supply it. A cylinder made of the same material as the balls, with a precision lapped surface texture comparable to that of the balls, is desirable.
- What is the sensitivity of your measuring gage? Or, what are the measuring increments? The finest sensitivity consistent with your financial budget and the geometry and surface quality of the balls being measured should be used. Be sure that you are not using a micrometer to measure a brick.
- What force is being applied to the sphere by the measuring gage? This is the most important variable in determining Hertzian Elastic Deformations and a force gage should be used to accurately determine it. When larger diameter balls are measured on a vertical gage, the mass of the ball must be included as an additional factor in the elasticity calculations. The axis of the standard sphere measuring machine used by N.I.S.T. is horizontal. This approach eliminates the mass of the ball from the equation but it is a far more complex mechanism.
- What are the Young's Modulus of Elasticity and the Poisson's Ratio of the balls, the measuring tip and the measuring table or gage anvil? If these numbers are not known, what materials are the balls, the measuring tip and the measuring table (gage anvil) made of? Use this information to look up the Y.M. and P.R. These numbers are one of the greatest potential sources of disparity between two different systems, due to the three potentially different materials involved. Accurate knowledge of these numbers is never better than 10%, and in addition the determinations were all made in tension, while we are always working in compression. Young's Modulus of elasticity is a measure of stiffness: it is the ratio of stress to strain. Poisson's Ratio is the ratio of lateral strain to longitudinal strain.
Dr. Ted Doiron, the head gage block metrologist at NIST, says that "diamonds are for jewelry," but they are still used for some gage tips. The stiffness (Y.M.) of diamond can only be guessed at. For different orientations of the same diamond this characteristic can vary 300%, and three different studies couldn't agree on the Y.M. within hundreds of millions of pounds per square inch.
- What algorithms are you using to correct the four Hertzian Elastic Deformations? We consider the Y.M. of the 440c alloy ball to be 30,000,000 P.S.I. and its Poisson's Ratio to be 0.293 (440c alloy is the universal material for steel master balls). These same values can be applied to standard chrome alloy steel bearing balls. In trying to understand Hertzian Elastic Deformation, it helps to remember that everything is mostly nothing. When you get really serious about making absolute dimensional evaluations, compensation must be made for the elastic compression of the ball surfaces by the force applied between the measuring tip and the anvil or measuring table. At the same time that the measuring surfaces are elastically denting the ball, the ball is deforming the tip and the table. All four of these elastic deformations are in compression, and each one makes the ball look somewhat smaller than its true free-space diameter.
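A minimal sketch of such a correction, assuming the standard Hertz sphere-on-flat contact formula from the elasticity handbooks. The gauging force and ball size below are illustrative assumptions, not the author's values; the material constants are the 440c figures quoted above, used for the ball, tip and anvil alike:

```python
# Hedged sketch of the Hertzian correction for a ball squeezed between
# flat measuring surfaces. Standard sphere-on-flat Hertz contact:
#     delta = (9 * F**2 / (16 * E_star**2 * R)) ** (1/3)
# where 1/E_star = (1 - v1**2)/E1 + (1 - v2**2)/E2.
def hertz_approach(force_lbf, radius_in, e1, v1, e2, v2):
    """Elastic approach (inches) of a sphere pressed against a flat."""
    e_star = 1.0 / ((1 - v1**2) / e1 + (1 - v2**2) / e2)
    return (9 * force_lbf**2 / (16 * e_star**2 * radius_in)) ** (1 / 3)

E_STEEL, V_STEEL = 30.0e6, 0.293   # 440c values from the text (psi, -)
force = 0.25                       # assumed ~4 oz gauging force, lbf
radius = 0.125                     # assumed 0.250 inch diameter ball

# The ball-anvil and ball-tip contacts each shorten the reading, so the
# apparent diameter loss is twice the single-contact approach:
shrink_in = 2 * hertz_approach(force, radius, E_STEEL, V_STEEL, E_STEEL, V_STEEL)
print(f"apparent diameter loss: {shrink_in * 1e6:.1f} micro inches")
```

With dissimilar materials (a carbide tip, say) the two contacts would be computed separately with their own constants, which is exactly why the text stresses knowing all three materials.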
When setting a comparison gage with a gage block master, the zero setting is not truly zero. As if the ball gauging process were not already complicated enough, there is an additional elastic deformation that occurs when the tip of the gage comes into contact with the flat surface of the gage block. A similar phenomenon occurs with a laser interferometer or laser scale gage when the gage tip touches the anvil. When you set the gage to zero, the pent-up stresses caused by the gauging force are still being applied. In these cases, absolute zero is the gage reading plus a small value. This condition requires an additional stress strain calculation. This correction is a positive number, so it must be subtracted from the gage readout. Because this is an in-gage condition, it will be a constant that will not vary with the ball diameter or material.
There are two acceptable methods of compensating for these elastic deformations. You can do stress strain calculations based on the three geometries involved and the metallurgical properties of the three materials involved. The alternative method is to measure the ball diameter under several different loads and then extrapolate these readings to zero force. If you know the metallurgical properties of all three materials well enough, either method will give identical results. Copyright restrictions will require that you look up the formulas for the elastic deformations in a materials handbook.
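The second method can be sketched as a small fit. Since Hertz theory makes the compression scale as force to the two-thirds power, the readings are fitted linearly against F^(2/3) and the intercept is the free-space diameter. The data below are made up for illustration:

```python
# Sketch of the zero-force extrapolation method. Hertzian compression
# scales as F**(2/3), so a linear fit of reading versus F**(2/3) has
# the free-space diameter as its intercept. Data are illustrative.
forces = [0.1, 0.2, 0.4]                        # gauging forces, lbf
readings = [0.2499892, 0.2499829, 0.2499729]    # measured diameters, inches

xs = [f ** (2 / 3) for f in forces]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(readings) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
         / sum((x - mean_x) ** 2 for x in xs))
d0 = mean_y - slope * mean_x                    # intercept = zero-force size
print(f"extrapolated free-space diameter: {d0:.7f} in")
```

Here the fit recovers a free-space diameter of about 0.2500000 inch from readings that were all compressed smaller, which is the behavior the text describes.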
- What is the real time dimensional stability of your gauging system?
This is not the hands-off static stability but the in-use stability while measuring balls. This stable time will be very short, even on the best gages. This is why we put such emphasis on getting in there, making the measurement and getting it over with. As you will read later, this is one of the three major error sources.
- How long does it take to make your measurements?
We limit our number of measurements to three perfectly orthogonal (at right angles to each other) evaluations, in order to limit the gauging time (N.I.S.T. uses four). It should not take more than thirty seconds to complete all three measurements.
- How often do you master your gage? Again the short time stability of the gage is one of those three major ball-measuring limitations. We recommend that you master the gage before each measurement.
- Do you take the measurements at random points? They should be taken in a geometric pattern. The implication of just moving the ball a little this way or that is quite obvious.
- What device do you use to change the orientation of the ball for the three different measurements? The ball should be quickly and accurately moved to three orthogonal positions. We use a discrete mechanical device to move the ball.
- Do you measure the room temperature, the temperature of the soaking plate, the temperature of the gage, or the actual temperature of the balls? The best and most practical method is to measure the temperature of the soaking plate. It has a large thermal mass that will only change temperature slowly, and the small self-heating of the temperature measuring transducer won't affect it. Although all four temperatures bear on the results, you will find yourself chasing your tail if you try to juggle all four of them. As with all dimensional measurements, precision balls must be measured at exactly 68 degrees F. (20 degrees C.), or a correction for any variation in the temperature of the lab must be made. We seldom know the thermal coefficient of expansion for the ball material better than 10-15%, so having to correct for any temperature variation will add to the measuring uncertainty. The top of the soaking plate should be situated at the same elevation as the top of the measuring machine anvil, to minimize the effect of temperature stratification (floor to ceiling) within the measuring lab.
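A hedged sketch of the correction back to 68 degrees F. The expansion coefficient shown is a typical handbook value for chrome alloy steel (not a value from the text) and, as noted above, is itself seldom known better than 10-15%:

```python
# Sketch of the temperature correction back to the 68 F (20 C)
# reference. alpha is a typical handbook coefficient for chrome alloy
# steel, per degree F; the text warns it is uncertain by 10-15%.
def size_at_68f(measured_in, temp_f, alpha_per_f=6.5e-6):
    """Correct a measured length back to the 68 F reference temperature."""
    return measured_in / (1 + alpha_per_f * (temp_f - 68.0))

# A 1/4 inch ball measured 1.2 F warm reads about 2 micro inches large:
d = size_at_68f(0.2500010, temp_f=69.2)
print(f"corrected diameter: {d:.7f} in")
```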
- Of what material is the soaking plate made? We have found that in a well-lighted lab, the commonly used cast iron soaking plates, because of their dark oxidized color, can be warmer than the environment. We consider aluminum to be the best practical soaking plate material. Liquid soaking baths seem like a good approach but they simply aren't practical. For critical measurement, the ball must be dry. When the liquid evaporates it reduces the temperature of the ball and thus its apparent diameter.
- Is the soaking plate covered, or is it exposed to radiant energy? Uncovered balls can be a full half of a degree F. warmer than the soaking plate and the rest of the environment.
- Do you use a breath shield? Is it transparent or opaque? We have found a breath shield to be mandatory, and a transparent plastic shield is preferable, as it allows a clear view of the measuring gage.
- What form of traceable master is being used to evaluate the gage? Is it a gage block? (Absolute uncertainty is plus or minus 4 micro inches (0.10 micrometers).) Is it a master ball? Or? A master ball has a built-in uncertainty of plus or minus 10 micro inches (0.25 micrometers). The only good reason for using a ball to master the gage is that a master ball made of a complementary material will eliminate all of the Hertzian elastic corrections.
- How do you achieve traceability of this master? It should obviously be traceable to a National Laboratory. This is a very good reason to use gage block masters, as they are so readily available and have such a high degree of dimensional certainty. Although the modern laser scale devices are astonishingly sensitive and accurate, there are still enough opportunities for error to keep the metrologist on his toes. As the gage itself is not our subject, I will concentrate on an outline of a gage evaluation procedure that provides direct traceability. When an absolute gauging system is being used, you install a pin and rail anvil, i.e. a gage block calibration anvil, in the measuring machine. Set the machine to zero with the gage tip against the anvil. Traceability is achieved by measuring a series of high quality calibrated gage blocks of several different thicknesses. It is a good policy to mix some English sizes with some Metric sizes to preclude the possibility of harmonic or systematic patterns being present. If a comparison gage is to be used, a traceable combination of gage blocks that is the same height as the diameter of the ball is wrung up. This combination is then wrung to the flat anvil of the measuring machine and the indicating gage is set to zero. After a soaking period to thermally stabilize (equilibrate) the measuring system, the ball is measured by comparison. We consider an uncertainty of 2 micro inches per wring to be realistic.
- Of what material is the master made? This is a major concern; we often find companies using tungsten carbide master balls or gage blocks to measure steel parts. It is imperative that the master and the exemplar are made of complementary materials, otherwise complex and uncertain corrections must be applied. The main disparities between materials are a difference in their stiffness and a difference in their thermal coefficient of expansion. We measured one steel and one tungsten carbide master reference gage block, both calibrated with direct traceability, on one of our laser scale measuring machines. The tungsten carbide gage block measured six micro inches (0.152 micrometers) larger than the steel block.
- What is the proper calibration time cycle for the master?
There is no magic answer. It will depend on how much use it gets, but never longer than one year.
- What is the metallurgical stability of the master? Is it growing? The metallurgical instability of some materials can be gross. If a master ball is used, it must be made of select material and thermal-cycled for long-term dimensional stability. Gage blocks have been dimensionally stabilized by thermal cycling for over eighty years.
- What is the simple repeatability of the gage system? What sizes do the same balls measure on two different days? The result of this simple test may shock you.
- If you are using a gage tip lifting device, what is its repeatability? We find that these devices inject several micro inches into the gage's error budget. Set the tip on a ball and just raise and lower the tip 20 times to test.
- What is the entire error budget of your gauging system? This can be considerable, as it includes the uncertainty of the master; the knowledge of and the repeatability of the force being applied; all four or five of the Hertzian Elastic Deformations involved; the full spectrum of the temperatures involved; and all of the radiant energies, but especially those of the metrology technician. An average adult radiates 125 watts of heat. Be sure you don't forget hysteresis. Have you measured it? We had a disagreement with a National Standards lab recently involving a sphere. When I asked if they had checked the bi-directional repeatability (hysteresis) of their gage, they had not, and that was the problem. The final assigned value of the error budget can vary dramatically according to how the independent factors are combined. Do you add the errors, do you average them arithmetically, or do you add them in quadrature? N.I.S.T. offers their Technical Note 1297, "Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results".
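The difference between the combination rules named above is easy to see in a sketch. The individual error terms below are illustrative values, not a real budget:

```python
# Sketch of the two combination rules the text names: straight
# arithmetic addition (worst case) versus root-sum-square addition in
# quadrature, as recommended for independent terms by NIST TN 1297.
# The individual terms are illustrative, in micro inches.
import math

terms_uin = {
    "master uncertainty": 4.0,
    "force / Hertz corrections": 3.0,
    "temperature": 2.0,
    "hysteresis": 1.5,
    "cleanliness / texture": 2.5,
}

worst_case = sum(terms_uin.values())
quadrature = math.sqrt(sum(v * v for v in terms_uin.values()))
print(f"arithmetic sum: {worst_case:.1f} micro inches")
print(f"quadrature:     {quadrature:.1f} micro inches")
```

The quadrature result is roughly half the worst-case sum here, which is why the choice of rule can change the assigned budget so dramatically.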
- Do you average the multiple ball diameter measurements arithmetically, by least squares, or? We average arithmetically. It is a little more critical, because any diameter variations add more to the result, but it is simple and understandable.
- Do you average in outliers caused by random errors?
We are very deterministic. Random errors are simply gauging errors that we have not worked hard enough to eliminate, so eliminate them.
We start out with the premise that every one of the balls is spherical to a predetermined number of micro inches (nanometers). This roundness must be well established by the manufacturing process and by in-process roundness inspection. The computer program of our gauging system is written to flag any of the three readings that vary by more than this number of micro inches (nanometers); when that happens, no label is printed, so no report can be made.
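The gate just described can be sketched as follows; the limit value and readings are illustrative, and the function name is hypothetical:

```python
# Sketch of the outlier gate the text describes: three orthogonal
# readings are accepted only if their spread stays within the
# pre-established sphericity of the balls; otherwise the measurement
# is flagged and no result is reported.
ROUNDNESS_LIMIT_UIN = 5.0        # predetermined sphericity, micro inches

def evaluate(readings_uin):
    """Return the mean if the spread is within limit, else None (alarm)."""
    spread = max(readings_uin) - min(readings_uin)
    if spread > ROUNDNESS_LIMIT_UIN:
        return None              # an outlier is a gauging error, not the ball
    return sum(readings_uin) / len(readings_uin)

print(evaluate([250001.0, 250002.5, 250000.5]))   # spread 2.0 -> accepted
print(evaluate([250001.0, 250009.0, 250000.5]))   # spread 8.5 -> flagged
```

Note that the rejected reading is discarded rather than averaged in, which is exactly the deterministic policy stated above.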
- What level of cleanliness do you maintain in your lab: class 10,000, 1,000, 200, or what? We have found this to be one of the three most important factors in determining absolute ball size. An outlier generated by a sub-micron particle of debris, when averaged into the measured size of the ball, will dramatically skew the resultant size reported. There is a real problem here, because flakes of human epidermis (skin) are usually 3 to 10 micro inches (0.076 to 0.25 micrometers) thick, so a reading taken on top of one of these flakes may look like a true reading of the ball's diameter. These flakes are sticky and tend to attach themselves to the measuring surfaces, so one flake may distort the size measurement of several balls.
- How often do you clean the measuring surfaces? In a clean room environment, the gage tip and anvil will stay clean almost indefinitely. In a commercial facility with class 10,000 or less environment the measuring surfaces must be cleaned frequently. If the gage does not re-zero before a measurement begins, the gage tip and the anvil should be cleaned.
- What is the relative humidity of your measuring facility? If it is over 40% you have a serious potential for corrosion of the balls. The most common ball material is chrome alloy steel, which will rust while you are looking at it in a high humidity environment. Even the hard 440c stainless steel used for master balls can corrode in this environment, if they have been touched by bare human hands.
- How do you handle the balls? How do you move the balls from the soaking plate to the gauging station? For every day measuring the balls can be quickly moved with gloved hands. Vinyl or latex surgical gloves are probably the best choice, to reduce heat transfer, because of their cleanliness, but nylon gloves are acceptable. When high-end measurements are to be made, we find a vacuum pickup wand, with a soft plastic contact, to be the best way to go.
- How long should balls be soaked before measuring? We thoroughly clean the balls to be measured and place them on a covered soaking plate for 24 hours. When time is of the essence, a one-half-hour soak for one-sixteenth inch diameter balls, on up to a one-hour soak for one-inch diameter balls, seems to be adequate. A note of caution: we are assuming that the soaking plate temperature is already at 68 degrees F. (20 degrees C.) when the balls are placed on the plate. It will take many hours to equilibrate a large, thick soaking plate.
In studies that we have conducted on ball size measurement for the U.S. Navy, the three main error sources that we reported were as follows:
- (A.) Cleanliness.
- (B.) The real-time stability and the repeatability of the gauging system, due mainly to temperature and electrical phenomena.
- (C.) A very rapid and sometimes quite substantial reference datum shift, due to slippage of the interfaces or joints between elements of the gage.
These elements are the measuring tip with the stem of the gage, the anvil with the base of the gage stand, the back of the transducer with the gage arm, the gage arm with the vertical column and the vertical column with the base of the gage. All of the aforementioned elements are clamped under considerable compressive stress. The trigger for the sudden shift may be a tap on the table, setting a specimen down on the gage or a door closing. There are good reasons for these reactions and they can be minimized by sophisticated designs but most companies do not find such elaborate and expensive techniques practical or even necessary when very frequent mastering of the gage will give acceptable results. We have spent over $50,000 to build a single gage system with mechanical stability at the 0.2 micro inch (five-nanometer) level.
In the title of this paper appears the term "commercially", so this may raise the question of "how much further can you go and what is the cost?" We ran a program for the U.S. Air Force to manufacture, and have calibrated, a full set of 23 sizes of stainless steel master balls from one sixteenth of an inch (1.588 millimeters) to one inch (25.4 millimeters). N.I.S.T. did an absolute five color interferometric evaluation of each ball under three different loads and extrapolated the force to zero. This N.I.S.T. calibration cost a small fortune. Even on this ultimate effort, N.I.S.T. reported uncertainties in their measurements as large as three micro inches (0.076 micrometers). Our readings were in agreement with theirs within three micro inches (0.076 micrometers) across the board; however, we do not consider this to be our entire error budget. On our super lab measuring machine we get repeatability of two tenths of a micro inch (five nanometers), but with all of the uncertainties involved, we consider plus or minus ten micro inches (0.25 micrometers) to be a realistic total error budget (uncertainty) for balls up to one and one half inches (38 millimeters) in diameter.
As far as we know, this N.I.S.T. effort is the only time that a full range of master balls has ever been calibrated to this level of accuracy. Unfortunately, N.I.S.T. has not had the budget to publish the results of this program. We were told that this set of balls was the highest quality that N.I.S.T. had ever calibrated. The group leader involved in this project was Dr. Howard Harary.
There are at least two programs being conducted by National Laboratories that are attempting to make the ultimate spherical diameter measurements using non-contact optical techniques. I have visited both the Japanese National Lab in Tsukuba and the Italian National Lab in Turin. What I came away with was an appreciation of the enormity of the problems. In Italy, I was shown a polar recording of an ultra precise sphere at a magnification of 200,000 to 1. It was not a nice ellipse or other polylobular form but hundreds of craggy peaks and valleys. After digesting what I have seen, I have concluded that with careful measurement, we may be able to determine the volume of a spherical shape to some very fine number; but it remains for the development of some very hard, amorphous metallic material before we can go much further with the production and measurement of practical master balls.
Another factor that can affect ball diameter measurements at the highest level is vibration. The measuring machine that we use for high-level research is suspended on an air isolation table.
The actual factors that have the greatest effect on the vibration generated by the balls running in a bearing are the slope of the surface topography and its amplitude. This, in turn, is determined by the number of undulations per revolution and their amplitude. The basic geometry of a ball is typically considered to be from one to fifteen undulations per revolution. In practice, the surface texture of balls is divided into two components: the microfinish, which is measured optically by interferometry, and the waviness, the amplitude of which is measured in three frequency bands by a waveometer. It is interesting to correlate the variations in ball quality with the level of vibration generated in a running bearing. This quality level is measured by an Anderometer. An Anderon is a unit of measure combining a velocity vector (rate of change) with the amplitude of the undulations.
Any variation in the diameters of individual balls in a given bearing is another factor contributing to the Anderon readings.
Determining ball diameter variations in the micro inch (nanometer) range has been a major factor in driving our ball diameter-measuring program.