Thursday, April 30, 2009

What’s your worst case scenario?

National Instruments has a neat little tool in Vision Assistant and Vision Builder (the Performance meter) that shows how fast each step of an image processing algorithm runs. What makes it particularly useful is that it runs the algorithm multiple times and reports the average, maximum, minimum and standard deviation for each processing step.
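You don't need NI's tools to collect those numbers; any environment with a high-resolution timer will do. Here's a minimal Python sketch of the same idea (the function name `profile_step` and its arguments are mine, not NI's API):

```python
import statistics
import time

def profile_step(step, images):
    """Time one processing step over many images; return per-run stats in ms."""
    timings = []
    for img in images:
        start = time.perf_counter()
        step(img)
        timings.append((time.perf_counter() - start) * 1000.0)
    return {
        "mean": statistics.mean(timings),
        "min": min(timings),
        "max": max(timings),
        "stdev": statistics.stdev(timings),
    }

# Usage: stats = profile_step(my_threshold_step, captured_images)
# where my_threshold_step is any callable that takes a single image.
```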

At this point you may be wondering why the execution time varies. The best answer I can offer is that some algorithms – blob detection is a good example – will take more or less time to run depending on what they find in the image. (A far better explanation is provided in “Accelerate Processing Performance,” by Pierantonio Boriero and Dwayne Crawford of Matrox Imaging, published on Visionsensorsmag.com Feb 27th, 2009.)
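If you want to see the effect directly, here's a small demonstration of my own (assuming OpenCV and NumPy are installed; this is not the code from the Matrox article) that times a connected-components pass over a nearly empty image versus a noise image full of tiny blobs:

```python
import time
import cv2
import numpy as np

rng = np.random.default_rng(0)

# A nearly empty binary image containing a single blob...
sparse = np.zeros((1024, 1024), dtype=np.uint8)
sparse[100:120, 100:120] = 255

# ...versus a noise image containing thousands of tiny blobs.
dense = (rng.random((1024, 1024)) > 0.5).astype(np.uint8) * 255

for name, img in (("sparse", sparse), ("dense", dense)):
    start = time.perf_counter()
    n, _ = cv2.connectedComponents(img)
    elapsed = (time.perf_counter() - start) * 1000.0
    print(f"{name}: {n - 1} blobs found in {elapsed:.2f} ms")
```

Same algorithm, same image size; the only difference is what's in the image.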

What this means is that you can’t just assume there’s plenty of time between image acquisitions for processing, because sometimes there may not be. In other words, 99% of the images you capture may process in, say, 20ms, but occasionally the tools might need much longer to run, perhaps stretching execution time to 40ms. If your images are arriving at 30 fps – in other words, one every 33ms – a 40ms execution time could cause a problem.
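The arithmetic is worth spelling out. Using the hypothetical numbers above:

```python
fps = 30.0
frame_budget_ms = 1000.0 / fps   # one new image every ~33.3 ms
typical_ms = 20.0                # what 99% of images need
worst_case_ms = 40.0             # the occasional slow frame

print(f"frame budget: {frame_budget_ms:.1f} ms")
print(f"typical fits: {typical_ms <= frame_budget_ms}")     # True  - plenty of headroom
print(f"worst fits:   {worst_case_ms <= frame_budget_ms}")  # False - frames queue up or get dropped
```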

Bottom line? Run your algorithm on many, many images before deploying it on the factory floor, and make sure you know the worst-case execution time.
