A common rule of thumb for deciding how accurate a CMM measurement must be is to keep the ratio of CMM measurement uncertainty to feature tolerance at no worse than 1:5 (1:10 is ideal, but often proves too expensive to be practical). This ratio provides a safety buffer, ensuring the measurement uncertainty is small compared with the component's expected range of variation. As long as a 1:5 ratio can be maintained on the tightest tolerance being measured, the accuracy question is settled.
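To make the arithmetic concrete, here is a minimal sketch in Python; the function name and the example values are hypothetical, chosen only to illustrate the 1:5 and 1:10 thresholds described above:

```python
def uncertainty_ratio_ok(uncertainty: float, tolerance_band: float,
                         min_ratio: float = 5.0) -> bool:
    """Check whether measurement uncertainty is small enough for a tolerance.

    uncertainty:    expanded measurement uncertainty of the CMM
                    (same units as the tolerance band, e.g. mm).
    tolerance_band: total feature tolerance band, e.g. 0.050 for +/-0.025.
    min_ratio:      required tolerance-to-uncertainty ratio
                    (5.0 for the common 1:5 rule, 10.0 for the ideal 1:10).
    """
    return tolerance_band / uncertainty >= min_ratio

# Hypothetical example: a feature toleranced at +/-0.025 mm (0.050 mm band)
# measured on a CMM with an expanded uncertainty of 0.008 mm.
print(uncertainty_ratio_ok(0.008, 0.050))        # True: 6.25:1 meets 1:5
print(uncertainty_ratio_ok(0.008, 0.050, 10.0))  # False: falls short of 1:10
```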
Unfortunately, something as seemingly minor as replacing the stylus on a probe can have a surprising impact on the achievable accuracy, producing noticeable variation in measurement results. The CMM's annual calibration is not enough to verify this accuracy, because it only verifies results obtained with the test stylus (usually a very short one), which likely represents the best-case scenario. To fully understand the achievable accuracy across a wider range of measurements, we need to understand how the stylus contributes to measurement uncertainty.
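One common way to reason about that contribution is an uncertainty budget in which independent contributors combine in root-sum-square fashion, the standard GUM approach. The sketch below only illustrates the idea; the contributor names and values are hypothetical assumptions, not figures from this article:

```python
import math

def combined_uncertainty(contributors: dict[str, float]) -> float:
    """Root-sum-square (RSS) combination of independent standard
    uncertainty contributors, all expressed in the same units."""
    return math.sqrt(sum(u ** 2 for u in contributors.values()))

# Hypothetical budget (mm): the stylus-related terms grow when a longer or
# lower-quality stylus replaces the short test stylus used at calibration.
budget = {
    "machine_geometry": 0.0015,  # verified at annual calibration
    "ball_sphericity":  0.0005,  # stylus factor
    "stylus_bending":   0.0020,  # stylus factor, grows with stylus length
    "thermal_drift":    0.0010,  # environmental/stylus factor
}
print(f"{combined_uncertainty(budget):.4f} mm")  # ~0.0027 mm
```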
This section examines the four primary stylus selection factors that affect CMM accuracy:
1. Stylus ball sphericity (roundness)
2. Stylus bending
3. Heat resistance
4. Stylus tip material selection (scanning applications)