The next time you replace human inspection with a vision system, do you think Production will complain about the false reject rate?
I think so. In fact I’ll bet a slap-up dinner for two that they will. “The system’s kicking out too many parts,” someone will bleat, “and it’s costing us money.”
Well, since you know this will happen, why not take action beforehand to head it off? Here’s what I suggest: gather some data on the accuracy of your human inspectors. Juran tells us they’ll be around 80% accurate, though I suspect that’s a long-run, task-dependent number. Assuming he’s correct, that means humans produce significant percentages of false rejects as well as false accepts. But how do you get your hands on this data?
One way would be to quarantine all inspected product and reinspect it. But with any respectable sample size your reinspectors will also have an error rate, so how do you know who is right?
I’m sure there are some statistical ways of handling this, but I’d like to suggest a more pragmatic approach: find out what your customers are telling you. Your Quality department almost certainly has data on returns and complaints: go through this to find out how much bad product human inspection is allowing through.
Calculate what that is as a percentage of total production and you have your False Accept rate. Subtract that from 20% (the total error rate implied by Juran’s 80% accuracy figure) and you have your False Reject rate.
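The arithmetic above can be sketched in a few lines. All the numbers below are hypothetical placeholders standing in for your own returns data and production volume; only the 80% figure comes from Juran’s rule of thumb:

```python
# Rough estimate of false accept / false reject rates from customer-return
# data, assuming Juran's ~80% inspector-accuracy figure.
JURAN_ACCURACY = 0.80                      # assumed long-run inspector accuracy
total_error_rate = 1.0 - JURAN_ACCURACY    # 20% of inspection decisions are wrong

total_production = 100_000    # units inspected (hypothetical)
returned_bad_units = 1_200    # bad product that slipped through (hypothetical)

# Bad product reaching customers, as a fraction of everything inspected
false_accept_rate = returned_bad_units / total_production

# Whatever error isn't a false accept is assumed to be a false reject
false_reject_rate = total_error_rate - false_accept_rate

print(f"False accept rate: {false_accept_rate:.1%}")
print(f"False reject rate: {false_reject_rate:.1%}")
```

With these made-up numbers the split comes out at 1.2% false accepts and 18.8% false rejects, which illustrates the point in the next paragraph: most of the error budget tends to sit on the reject side.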
Okay, that’s hardly scientific, but it will give you an indication of how much good product is being scrapped. (Yes, it’s probably over 10% because, in my experience, Inspectors tend to err on the side of caution.)
So how can you use this in your next project? Well, hard numbers on your current inspection performance will let you show that the vision system you install does a better job. (And if it doesn’t do better … well, just make sure it does!) I suggest, though, that rather than trying to design and run your own experiment out in the plant, you ask Quality and HR to work on it for you. This will help get some buy-in and add credibility to the results. Sharing the numbers you estimated from customer complaints and Juran’s 80% rule might be enough to spur them into action without further nagging.
That’s what I mean by putting a stake in the ground: establish the current level of performance, so that you can, post-implementation, show an improvement.