Wednesday, October 31, 2012

Benchmarking optics


How do you determine whether a lens costing $800 is better than one costing a quarter of the price? Well, aside from thinking, as I do with wine, that it must be better if they can command a premium for it, the answer is to do a side-by-side comparison.

“But Brian,” I hear you moaning, “I only want to buy one lens, not two, one of which I will not need.”

Well, fortunately, Edmund Optics has already done some benchmarking for you. “Better Optics = Better System Performance” describes how they compared their lens with one from a competitor. Surprisingly enough, the Edmund lens came out on top, but that’s not important right now.

What is important is how they conducted the comparison. It was a simple test that you could do yourself. Yes, you will need two lenses, but if you ask nicely your supplier will let you have them “on evaluation.” They may ask that you share the results with them. That’s something I have no problem with: mutual back-scratching is mutually beneficial.

So read the Edmund article and learn both why their lenses are so good and how to evaluate lenses for yourself.
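If you do run your own side-by-side test, the usual yardstick is contrast on a line-pair target. Here’s a minimal, stdlib-only sketch using the standard Michelson contrast formula; the intensity profiles are made-up numbers standing in for a row of pixels across the same target imaged through each lens.

```python
# Michelson contrast of a line-pair target: (Imax - Imin) / (Imax + Imin).
# A sharper lens preserves more of the target's modulation.
def michelson_contrast(profile):
    hi, lo = max(profile), min(profile)
    return (hi - lo) / (hi + lo)

# Hypothetical gray-level profiles across the same bar target,
# captured with two lenses under identical conditions:
lens_a = [30, 220, 35, 215, 28, 225]   # deep modulation: crisp bars
lens_b = [90, 160, 95, 155, 88, 165]   # shallow modulation: washed out

print(round(michelson_contrast(lens_a), 2))  # 0.78
print(round(michelson_contrast(lens_b), 2))  # 0.3
```

Whichever lens scores higher at the spatial frequencies your application cares about is the better buy, price tag notwithstanding.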

Tuesday, October 30, 2012

An Atom-based vision PC? Really?


The Intel Atom processor is a neat little device. It’s miserly in its use of electricity, so in laptops and netbooks it gives great battery life. It also generates relatively little heat, which makes it good for fanless applications. I’ve seen some camera companies use it in their smart cameras too, but I have to question the wisdom of doing so.

I’ve nothing against the little processor that could – in fact I have one in the netbook I’m typing this on – but they’re not really up to high-speed image processing.

So you’ll understand my surprise in learning that Dalsa – sorry, Teledyne Dalsa – have launched a vision PC built around the little fellow. The box I’m referring to is the GEVA-300. Now I try to avoid being negative, but really Dalsa, an Atom-based machine? And it’s intended for multiple cameras. Why not just step up to an i5? Was the whole point just to have a fanless offering?

And to that point, if you spend just a couple of minutes on the Dalsa site you’ll see they offer such a device, the GEVA-1000. It’s not clear what processor it actually employs, but I’m figuring that the bigger model number means more horsepower under the hood. I don’t know what the price difference is either, but I’ll wager anyone who buys the 300 will later wish they’d stepped up to the 1000.

Monday, October 29, 2012

Q3 Results from Cognex


If you haven't already heard, Cognex reported solid numbers for the third quarter of 2012. Revenue (sales) was flat compared to the same period in 2011, while income (profit) was down just 1%. Comparing the first nine months, 2012 is looking a little better than '11, although we must remember that every year seems to be a good year for Cognex.

R&D expenditure, (the number I use as an indicator of new products in the pipeline,) has been pretty flat all year, suggesting the pace is being maintained but not accelerated. I guess when you're number one, all you have to do is stay ahead of the competition.

One point that I keep returning to is the amount of cash Cognex keeps in the bank. Right now they're sitting on $416m, up from $357m at the end of last year. That's right, nearly half a billion dollars in the bank!

If I worked at Cognex, (and I am available!) I'd be looking for a way to make that pot of gold work a little harder. We've seen some consolidation in the machine vision business of late. I can't help thinking that if the people in Natick don't put that money to use, (R&D or acquisitions,) someone is going to do it for them.

Time will tell.

Sunday, October 28, 2012

A different approach to color imaging


Machine vision pros know it’s not always necessary to use a color camera for a color application. Often all that’s needed is to match the wavelength of the illumination source to the target. Like colors lighten, so a red light will make a red target appear white(ish) to the camera. An alternative is to put a filter over the lens, so that only light of the target wavelength is allowed through.
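To make the “like colors lighten” idea concrete, here’s a toy sketch. Under narrowband red illumination, a monochrome camera’s response is dominated by each surface’s red reflectance, so to a rough first approximation it sees the red channel of the scene. The RGB values below are invented for illustration.

```python
# Toy RGB pixels: a red feature sitting on a green background.
red_target  = (200, 40, 30)
green_field = (35, 180, 40)

def gray_under_red_light(rgb):
    # Under narrowband red light, a mono camera responds (roughly)
    # to the red component of the surface alone.
    return rgb[0]

print(gray_under_red_light(red_target))   # 200 -> bright
print(gray_under_red_light(green_field))  # 35  -> dark
```

The red target jumps out as bright against a dark background, and a simple threshold does the rest.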

But what if you need to look at several targets in several different wavelengths?

What you need then is a liquid crystal tunable filter. A product like VariSpec might do what you want, and something similar is available from Inno-Spec. And if you want to understand just how they might be used, I suggest you read “Tell-Tale Color Changes: Camera Can Find Age of a Bruise” on the BioPhotonics website, October 2012.

This fascinating article describes some important color imaging work going on in the medical field. I’m not going to steal their page views by telling you about it: click the link and read for yourself.

And last, if you want to understand more about how a liquid crystal tunable filter actually works, take a look at “Liquid Crystal Tunable Filters” on the Olympus Microscopy Resource Center site.

Thursday, October 25, 2012

Industrial lights from Smart Vision Lights


At the risk of offending some manufacturers, not all machine vision lights are terribly robust. I find they often lack sturdy mounting points or heat sinks, and seem rather fragile. That’s why, when I saw the Prox Spot Lights from Smart Vision Lights, I said, “Gosh, isn’t that a good idea.”

My favorites are the SA30 Series which come in a 30mm housing and have an adjustable spot size, thanks to a sliding outer barrel. There’s no need for a dedicated external power supply – just feed them 24V – and it’s even possible to vary the light output.

I’d suggest setting them up at 80%, so if their output drops off over a year or two there’s some space to crank ‘em up a bit. Assuming you know what intensity you actually need. ;-)
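How much breathing room does that 80% setting actually buy you? Here’s a back-of-envelope calculation; the 5%-per-year droop figure is my assumption for illustration, not a Smart Vision Lights spec.

```python
import math

# Commission the light at 80% drive, with the required scene intensity
# equal to what 80% delivers on day one. If LED output droops a
# hypothetical 5% per year, cranking up to 100% covers the shortfall
# until (1 - droop)^t falls below the initial drive fraction:
initial_drive = 0.80
annual_droop = 0.05
years = math.log(initial_drive) / math.log(1 - annual_droop)
print(round(years, 1))  # 4.4 years of headroom
```

Four-and-a-bit years before you need to think about it. Not bad for the price of turning a knob down at commissioning time.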

Wednesday, October 24, 2012

Lens focusing, simplified?


Anything has to be better than turning the focus ring first one way, then the other while trying to watch an image on a monitor. It’s even worse when, as in a system I worked on recently, the monitor is not viewable when working at the camera. And don’t get me started on the tribulations of linescan camera focusing!

All of which is why, since 2009 I’ve been getting excited about the potential of liquid lenses. (“The end of focus problems” June 14th, 2009.) Cognex and Microscan have offered liquid lenses on select products for a few years, but there’s been nothing I could buy to add to a camera, until now.

Just in time for Vision 2012, optics specialists Qioptiq have announced their flo.x lens “with liquid lens focusing.” This sounds like exactly what I’ve been waiting for, albeit with a couple of drawbacks. First, it’s made for an M12 mount, rather than C-mount. And second, the focal length is a rather wide-angle 3.35mm.

No, it’s not exactly what I’ve been waiting for, but the very fact that it exists gives me hope. Who knows, perhaps I’ll be sent a plane ticket to Stuttgart so I can attend Vision 2013 for the unveiling of the C-mount liquid lens!

By the way, if you’re looking on the Qioptiq website for details of the flo.x, well, I couldn’t find anything, although they do have some great machine vision lenses.

Tuesday, October 23, 2012

A foolish look at Cognex


US-based stock market investors will understand that by “foolish” I’m referring to that excellent source of financial advice, The Motley Fool. And the Fool recently took a look at our friends from Natick. Or more accurately, they took a look at how Cognex manages its cash flow.

“Cognex: Making Bucks More Quickly” (fool.com, October 12th, 2012,) discusses what is, to me at any rate, a different way to assess cash flow management. To understand how the “Cash Conversion Cycle” is calculated and what it means you’ll need to read the Fool article. I will however tell you the bottom line: they believe Cognex is doing a pretty good job of managing its cash flow.
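For the impatient, the standard textbook formula is simple enough to sketch here; the input figures below are made-up for illustration, not Cognex’s actual numbers.

```python
def cash_conversion_cycle(dio, dso, dpo):
    """Days inventory outstanding + days sales outstanding
    - days payable outstanding: roughly how many days cash
    is tied up between paying suppliers and collecting from
    customers. Lower (or negative) is better."""
    return dio + dso - dpo

# Illustrative figures only:
print(cash_conversion_cycle(dio=90, dso=60, dpo=40))  # 110 days
```

The Fool article tracks how that number trends over several years, which is where it gets interesting.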

Third quarter results come out in just a few days, so we’ll be able to see for ourselves if that trend is continued.

Monday, October 22, 2012

Directional light


Ah yes, lighting. The bane of our machine vision lives. One of the challenges we face is that what works for item A on our conveyor may not work so well for item B, even though they come off the same machines and travel down the same conveyor. So when designing a system we spend hours … make that days … trying to find an optimal solution.

How about just using different lights? When space allows, this can be an effective solution, but sometimes it’s not possible. In desperation I have even resorted to placing a mask over regions of my lights, thus Mask A for item A and so on.

But wouldn’t it be easier to have control over each LED?

The RL28Q and RL16Q from Oregon-based Orled (ORegon–LED?) go some way towards this illumination nirvana. These ring lights provide independent control over four quadrants, so you can choose to cast a shadow in a particular direction, for example. (Something that might help with detecting topographical defects.)

Now my impression is that these lights are really intended for use with microscopes, where a person viewing would switch quadrant as necessary. But I’m pretty sure that, with a little ingenuity, segment-switching could be automated for a machine vision application.
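The automation itself wouldn’t be much code. Here’s a hedged sketch of the idea: fire each quadrant in turn and grab a frame, giving four directionally lit images of the same part. The `set_quadrant()` and `grab_frame()` stubs stand in for whatever I/O your light controller and camera SDK actually provide.

```python
# Stand-ins for real light-controller and camera SDK calls:
def set_quadrant(active):
    print(f"quadrant {active} on, others off")

def grab_frame():
    return "frame"  # placeholder for a real image buffer

def acquire_directional_set(quadrants=("N", "E", "S", "W")):
    """Capture one frame per lit quadrant of the ring light."""
    frames = {}
    for q in quadrants:
        set_quadrant(q)
        frames[q] = grab_frame()
    return frames

shots = acquire_directional_set()
print(sorted(shots))  # ['E', 'N', 'S', 'W']
```

Four images with shadows cast from four directions is also the raw material for photometric-stereo-style surface inspection, should you feel ambitious.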

And that might make lighting just a tiddly bit easier.

Sunday, October 21, 2012

Dealing with variation (have I used that title before?)


Back on October 7th I started to talk about the challenge of inspecting objects that vary in appearance. This was in reference to meat (“The intelligent bacon slicer”) but even stamped, cast, molded and machined parts have a habit of changing.

Now the normal, random, short-term variation you should take into account when the system is first designed. But how about the unexpected shift that takes place when Purchasing switch you to a different coating supplier? Chances are, no one thinks to tell you there’s a change coming; you’re only the machine vision engineer, after all. So the first you know of it is when you get a phone call to tell you all the parts are failing inspection.

First off, your troubleshooting skills are put to the test. This is where some initial setup images are so useful. Pull them out and compare them with the latest images. If you can run your software offline – in emulator mode perhaps – so much the better. That way you should see what’s gone wrong. But what do you do?
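Even a crude numerical comparison of golden versus current images will often name the culprit. Here’s a stdlib-only sketch using toy pixel lists; a real system would use numpy or your vision library, but the arithmetic is the same.

```python
# Compare a saved "golden" commissioning image against the latest
# image via mean gray level and a crude per-pixel difference score.

def mean_gray(pixels):
    return sum(pixels) / len(pixels)

def mean_abs_diff(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

golden = [120, 122, 118, 121, 119, 120]   # captured at commissioning
latest = [78, 80, 75, 79, 77, 78]         # today's "failing" parts

drift = mean_gray(golden) - mean_gray(latest)
print(round(drift, 1))                      # 42.2 gray levels darker
print(round(mean_abs_diff(golden, latest), 1))  # 42.2
```

A uniform 40-odd gray-level drop like this points at lighting or a part-finish change, not at your algorithm.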

Personally, I like to rant at those who made the change – “throwing my toys out the pram,” my co-worker calls it – but then it’s time to make some changes.

Don’t tweak the lens. Once you start altering hardware you’ll never get it back to the original settings. And be careful about changing thresholds because what will you do when Purchasing switch back to the old supplier?

I suggest you create a new configuration – call it “Product 999-dark” and have the original be “Product 999-light”. Then, if you’re really clever, you’ll find some way for the system to determine which of these is in front of the camera and auto-select the right file.
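The auto-select logic can be as simple as measuring mean gray in a region that differs reliably between the two coatings and picking the matching file. A minimal sketch, where the threshold of 100 and the config names are illustrative:

```python
# Pick the configuration file that matches what's in front of the camera.
CONFIGS = {"light": "Product 999-light", "dark": "Product 999-dark"}

def select_config(roi_pixels, threshold=100):
    mean = sum(roi_pixels) / len(roi_pixels)
    return CONFIGS["light"] if mean >= threshold else CONFIGS["dark"]

print(select_config([140, 150, 145]))  # Product 999-light
print(select_config([60, 55, 65]))     # Product 999-dark
```

Pick the ROI somewhere the two coatings always differ and no part-to-part feature intrudes, and the system will quietly ride out the next supplier switch.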

This approach won’t get you out of every hole, but if it did, what would I write about next? I hope, though, that it stops you from digging yourself in deeper.

As always, your comments are welcomed.

Thursday, October 18, 2012

Less of the doom and gloom


Headlines last month reported a slowdown in the North American machine vision business. (“Local, Global Machine Vision Markets Slow” ControlDesign, September 10th 2012.) This sounds grim, so let’s take a look at the report.

First, it’s just the rate of growth that’s slowing. Back in the heady, rebound days of 2011 growth was up at 10%; through to 2016 IMS Research are projecting 7 – 8%. That’s pretty darned good growth. Name me another industry that can look forward to that.

Second, and this is where it gets really interesting, the same article says the AIA reported machine vision sales dropped 2% in the first quarter of 2012, yet in the same breath said software sales were up 26%, lighting up 11%, frame grabbers up 7% and smart cameras up by 4%. That sounds like pretty robust business for the components folks, so who’s down?

It has to be the builders of inspection machines, doesn’t it? Fewer turnkey systems and more people doing it themselves. At least, that’s my interpretation. Do you have a different one?

And one last complaint, while I’m in a moaning mood: There’s been a tendency in recent years, started I believe by the Wall Street Journal, to follow every announcement of positive news with a “But” and this Control Design article is no different: “… vision revenues grew more than 10% … But…”

Am I the only one tired of the negative spinmeisters? The machine vision industry is looking pretty healthy. Let’s celebrate that.

Wednesday, October 17, 2012

Looking for a new job?


Employment prospects in machine vision are pretty good: the industry is growing and our skills are in demand. But even if you use LinkedIn, Indeed and Dice, which I think are some of the best places to search, the odds of landing an interview seem slim.

“Why can't good engineers get good jobs?” on the EE Times website, October 10th, 2012, discusses this problem. It’s a good article, so I’m not going to regurgitate it here, just click the link. I would say though, read the comments and then adapt your job search accordingly.

Tuesday, October 16, 2012

My new friend


Attentive readers may have noticed an addition to my Friends list over on the right. “Doctor of Vision” is my link, (they appear in alphabetical order, so Vision Doctor would have had lower placement,) to a new site targeted at those looking for machine vision information.

That might sound like what I'm doing, but as you can tell from the screen shot below, the site Lars Fermum has put together is in a completely different league to my humble blog.

Whether you're new to machine vision, or just looking to improve your knowledge, the Vision Doctor site is an absolute treasure trove of info. Lars has articles on all the main topics – lighting, optics, cameras – plus, in the Service area, a number of useful calculators.

Check it out.


Monday, October 15, 2012

Laser triangulation application


Time is short, so I shall very quickly point you at an interesting 3D application. This is on the website of Netherlands vision company, Ellips. Ellips specialize in fruit and vegetable sorting, and you may have clicked through to them from the article I linked to in my last post, “Where the volume is.” Clearly though, they also dabble in 3D, and as the pictures show, it looks very interesting.

Sunday, October 14, 2012

Where the volume is


Everyone in machine vision wants volume. It’s where the money is, but so few applications are more than one or two-offs. Food inspection, as detailed in “Machine Vision Helps Food Processors Cut Overhead Costs,” on the Vision Online website (October 9th, 2012,) is a notable exception.

Everyone eats food, which is probably why there are so many inspection opportunities. Few, though, match the application writer Winn Hardin described thus: “The machine builder typically sells 50 sorting machines per year, requiring 400 machine vision camera solutions.”

Mr. Hardin doesn’t name the particular builder, although you could do the same as me and Google the type of application.

Thursday, October 11, 2012

Camera trends


The love-fest that is Vision 2012 is less than a month away and already vendors are giving us sneak-peeks at their latest and greatest. Some of the most intriguing come from camera-maker AVT who, besides some ho-hum range extensions, have an entirely new product, the Mako.

As I write this, the Mako page on the AVT website doesn’t have any content, so instead I will direct you to their press release. Now this camera looks much like the reliable old Guppy, but whereas that was a FireWire camera, this will be offered with both GigE and USB3 interfaces.

I see that as highly significant. In fact I’d go so far as to say it means FireWire is on its death-bed, at least in the machine vision world. Some years back AVT saw how the wind was blowing when they snapped up Prosilica, thereby gaining a GigE product line up, and now they’re backing the new USB standard. I’d say that means the future is GigE and USB3, for everything except really large sensor formats and very high data rates.

Just out of curiosity, does anyone think we’ll see a linescan USB3 camera before the year end?

Wednesday, October 10, 2012

Cross-shopping


In the automotive world Sales and Marketing types like to talk about cross-shopping. The BMW 3 Series is often cross-shopped with the Audi A4, the VW Beetle with the Mini. In other words, consumers are smart enough to check out competitive products before plunking down their hard-earned.

So here's my question: how often do you cross-shop vision systems?

I'll bet the answer is “practically never.” I'll bet that when you need a smart camera you call your friendly local electrical bits-and-bobs distributor and ask what he can offer. If he carries Cognex, that's what you'll get. If he works with Omron, or Panasonic, well guess what you'll be using.

Is this a smart way to buy machine vision? I don't think so.

I'm shopping for a new car right now. I figured out my needs, (interior space, good gas mileage, good warranty,) and my budget, and now I'm compiling a spreadsheet where I can compare the models that meet my constraints. The final decision will still involve some subjective criteria – how it looks, how it feels – but I'm comfortable with that because I know I'll be working from a base of quantifiable data and will be making an informed decision.

I think we should buy vision systems the same way. There are a lot of vendors out there with products that differ but all have strengths and weaknesses. So before you buy a Matrox Iris, a Banner PresencePlus P4, or a Cognex InSight, figure out your needs and see which fits best. That way you won't be buying a Maserati when what you need is a minivan.

Tuesday, October 9, 2012

Camera mounts


A good camera mount is hard to find, so when I saw an interesting design in a recent movie from Allied Vision Technologies I had to pause and rewind a few times. Eventually I grabbed this screen shot.

This shows a couple of AVT cameras on mounts that both act as heat sinks and provide two axes of rotation. I guess you could count a third translational axis too.

I like the look of these, although I have to wonder how you’d get a camera back to its original position if it got knocked. Perhaps some kind of scale could be added?

If my plea for a camera mounting standard, (“We need a new camera mounting system”,) ever gains any traction perhaps it would look something like these.

And for any of you interested in viewing the entire movie, here it is in all its glory.



Monday, October 8, 2012

The new vision system is a secret


If you visit the machine vision section of Keyence’s website, (www.keyence.com,) you’ll see they’re talking about the wonderful new CVX series of vision sensors. With features like a “judgment algorithm,” and “color learning,” it sounds pretty slick. The pictures show what looks like a low-end smart camera slash high-end vision sensor that you just point at the objects to be inspected as they shuffle by.

But then disappointment struck. I asked a co-worker to get some more info – download the brochure, find out about pricing and so on – and we hit a wall. Apparently Keyence aren’t ready to share that kind of info.

I understand it takes time to get all the marketing materials together, but surely they can be coordinated? And here’s something I’ve said before but I think it bears repeating: please don’t tell me you’ve got a great new product until you’re actually ready to sell it to me. Just telling me what’s coming soon causes me to, at best, delay a purchase, and at worst, go to one of your competitors.

So please, Keyence and every other vendor in our industry, don’t be coy about your new products. Talk about them openly, have all the info, and be willing to ship, or shut up until you are.

Sunday, October 7, 2012

The intelligent bacon slicer


There’s nothing like the smell of bacon frying in the pan, unless you’re a vegetarian I suppose, but when you buy a pound, or a kilo, how do you know what you’re getting?

Well the butcher, or more likely the packaging machine, weighs it. But there lies a problem. The retailer wants to give you exactly a pound of bacon, and no more. But as bacon comes in slices he has to add in that extra rasher so as to make the minimum weight.

This giveaway is a problem, but machine vision has come to the rescue.

Stemmer Imaging announced recently that they’ve been working with slicing equipment maker Marel on an automated grading system. The issue here is that bacon is sliced to a thickness, but the weight of a slice can vary depending on the ratio of meat to fat. So the “smart” slicer looks at the end face of the slab of bacon and determines how much meat and how much fat. From here it can determine the optimum thickness to ensure the final pack weighs exactly the declared weight, and no more.
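The sums involved are pleasingly simple. Here’s my own back-of-envelope sketch of the idea, emphatically not Marel’s actual algorithm: threshold the end face into meat and fat pixels, estimate the slab’s mass per unit thickness, then solve for the slice thickness that hits the target weight. The densities are approximate and the toy six-pixel “face” is invented.

```python
MEAT_DENSITY = 1.06  # g/cm^3, approximate
FAT_DENSITY = 0.92   # g/cm^3, approximate

def slice_thickness_mm(face_pixels, pixel_area_cm2, target_g,
                       fat_threshold=180):
    # Fat images brighter than meat; the 180 threshold is illustrative.
    fat_area = sum(pixel_area_cm2 for p in face_pixels if p >= fat_threshold)
    meat_area = sum(pixel_area_cm2 for p in face_pixels if p < fat_threshold)
    grams_per_cm = meat_area * MEAT_DENSITY + fat_area * FAT_DENSITY
    return 10 * target_g / grams_per_cm  # cm of thickness -> mm

# Toy 6-pixel face, each pixel covering 10 cm^2, aiming for a 30 g slice:
face = [90, 100, 95, 200, 210, 205]  # 3 meat pixels, 3 fat pixels
print(round(slice_thickness_mm(face, 10.0, 30.0), 2))  # 5.05 mm
```

A fattier face yields a thicker slice to reach the same weight, which is exactly the behavior the Stemmer/Marel system is after.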

Clever, eh?

Sadly, there’s little information about the IBS2000 Vision Bacon Slicer on the Marel website, but I did find a write-up on, of all places, the ITS International website. They’re the people who cover intelligent transportation systems, (another big vision market,) and if you scroll to the bottom of “Machine vision - cameras for intelligent traffic management,” (October 2011,) you’ll find a sidebar piece about the bacon slicer.

But the story doesn’t end there. While Googling bacon and vision, up popped a link to a University of Nebraska-Lincoln report for the National Pork Board. Published in 2000, “Quality Lean Growth Modeling-Bacon Quality Assessment,” (it’s a pdf,) includes a section titled, “CHAPTER 3- MACHINE VISION ANALYSIS OF BACON”.

What I found fascinating is that this addresses the same issue: how to objectively determine ratios of fat and meat. I hope the folks at Stemmer read the report because it goes in to much detail about how the appearance of the meat can vary.

Variation in the object being inspected is of course a big challenge for the development of automated inspection systems. So big in fact that I think I’ll return to it very soon. Until then, how’s that bacon sandwich looking?

Thursday, October 4, 2012

Going the distance


Picking a camera is hard work. There’s resolution and lens format, sensor type and frame rate, and then there’s the interface standard, in terms of both speed and range. Now if you’re able to put the PC close to the camera pretty much any interface – GigE, FireWire, USB, or CameraLink - will span the distance, but there are many applications where the two are, of necessity, separated.

As a rough guide, if you have more than 3m between them you should give some thought to the most appropriate standard. Sometimes though this limits your choice of camera, but never fear, Andy Wilson is here.

No, Andy’s not going to stretch your cables, but he does have some advice on how to make them reach further. “Clearing Up Choices for Cabling and Connectors,” (Vision Systems Design, September 1st, 2012,) points out that most of the standard interfaces now have some kind of extender technology. If this is something you might need to deal with, I suggest you click the link above and save Andy’s article in your Favorites.
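As a quick sanity check before you shop for extenders, here’s a sketch using the commonly quoted maximum native cable lengths. These figures vary with cable quality and data rate, so treat them as ballpark values and check your vendor’s specs.

```python
# Rough, commonly quoted maximum native cable lengths in meters
# (ballpark figures only; real limits depend on cable and data rate):
MAX_NATIVE_M = {
    "GigE": 100,
    "Camera Link": 10,
    "FireWire": 4.5,
    "USB 3.0": 3,
}

def needs_extender(interface, distance_m):
    return distance_m > MAX_NATIVE_M[interface]

print(needs_extender("GigE", 30))     # False: well within copper reach
print(needs_extender("USB 3.0", 10))  # True: time for an extender
```

GigE’s 100m of cheap copper is a big part of why it has taken over the mid-range, as the next post’s AVT news rather underlines.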

Wednesday, October 3, 2012

Machine Vision Education


Perhaps you stumbled across this site because you need to learn about machine vision. If so, allow me to point you at some quality video material. This comes from Microscan, who sell cameras, lights, and software. In short, they’re positioning themselves as something of a one-stop-shop for vision.

Needless to say, though I shall say it anyway, their videos are a little biased towards the use of Microscan products, but I’ve found them a pretty good start point. They won’t get you through the AIA’s Certification but they’ll put you on the right road.

Take a look and come back to me with questions.

Tuesday, October 2, 2012

Troubleshooting


It’s the rare vision system that never needs any loving. It shouldn’t be that way, but factories are rough places for electronics, and that’s before all those nightshift tweakers get their fingers in the system. So inevitably, the long-suffering vision engineer gets called to fix a system that isn’t working.

Now definitions of “not working” vary. It might mean that everything is being rejected, or that the system won’t power-up, or something in between, so said vision engineer needs some good troubleshooting skills.

This is something I’ve tried to address over my years of blogging, and now I’ve come across an article on the Assembly website, (“Machine Vision: Troubleshooting Vision Systems,” December 22nd, 2010,) that seems to repeat many of my suggestions.

One of the more interesting points made concerned camera mounting. To quote writer John Sprovieri, quoting Mark Sippel of Balluff: “A standard 1/4-inch screw mount is fine for attaching a camera to a tripod, but it’s ridiculous for industrial applications.”

Does that call to mind my recent post, “We need a new camera mounting system”?

Monday, October 1, 2012

Control your computer with machine vision?


Is your webcam spying on you? According to this story on the BBC, it could be. But more to the point, could you use that camera for something useful?

Start-up Flutter thinks so. They’ve developed software that will use your PC’s webcam as an alternative to the mouse. The idea is that it will recognize your gestures and respond appropriately. (See “How Flutter wants to become the eye of the machine,” posted on the Gigaom website, September 24th, 2012.)

I understand some will ask, “Has it become too hard to use a mouse?” But I think there are some serious applications.

On the domestic front, I often leave Pandora playing on a laptop. That’s all well and good until it decides to ask if I’m still listening. Wouldn’t it be cool if I could just give it a thumbs-up from the comfort of my couch, rather than getting up to walk across the room?

Then there are the work applications. The factory floor is a tough place for a mouse and keyboard. Yes, touch screens are possible but they cost two limbs at least and still may not do all I want. Imagine just being able to start and stop by pointing.

And one last question for you. Is this machine vision or computer vision?