Wednesday, November 20, 2013

Putting 3D to work


If you read my post “Playing in the 3D space” you might be wondering what 3D can do for you. Well, an interesting application story in Quality magazine might prompt some ideas.

“Integrating A 3-D Inspection System” (May 2013) describes a system that Cyth Systems of San Diego put together. Cyth is a National Instruments Alliance partner, so naturally this uses LabVIEW (not that there’s anything wrong with that).

Take a look, and perhaps you'll be inspired.

Sunday, September 22, 2013

It’s smart and it’s linescan


Here’s an interesting product, one that I intend to learn more about. One Box Vision of Clonmel, Ireland, has launched a linescan smart camera.

The Astro Vision System is available with sensors from 2k up to 12k, but what I find especially interesting is the software: the camera comes with Vision Builder AI from National Instruments.

I’m a fan of VBAI, as we NI fanboys call it, because it has a host of tools from the LabVIEW Vision Development Module and it’s simple to use. (Alright, I find the state machine concept hard to get my head around, but I’m a slow learner.)

The only downside I’m aware of is that VBAI tends to be expensive, but perhaps One Box Vision has done a good deal with NI. I suppose there’s only one way to find out …



Thursday, February 9, 2012

My theory on smart cameras from NI


This follows on from my post “New smart cameras from NI,” so if you haven’t read that, click the link and do so.

Now, answer me this: why would NI introduce a new range of smart cameras with specifications remarkably similar to those of the Iris GT from Matrox? And if you don’t think they’re close, perhaps you’d like to highlight one area of difference, because I can’t find it.

Here’s what I think is going on. NI want to be in the smart camera business as a way of pushing their very capable VBAI vision software. Unfortunately, neither VBAI nor their range of low-cost smart cameras has set the world alight, so how do they maintain a presence while minimizing costs?

They buy cameras from someone else.

This, then, is an interesting trend: the smart camera hardware becomes something of a commodity item, and the differentiation comes from the software the user selects.

I have no problem with this; in fact, I suspect it’s a trend we will see more of: vision companies will focus on the software that differentiates their products, while one or a few companies will remain in the hardware manufacturing business.

Tuesday, January 24, 2012

New smart cameras from NI


Here’s a puzzle. Why does the NI website show two different products as an NI 1772 Smart Camera?

Seriously. Here’s the old, strangely-styled 1772 priced at $1,999, and here’s the chunky new 1772, priced at $3,598.

That’s quite a bump in price, so what’s going on? Well, the new cams, billed as “High-Performance” and running from VGA to 5Mp, use a 1.6GHz Atom processor, while the old – and presumably outgoing – model harnesses a 400 MHz PowerPC processor. That’s quite a bump in horsepower, but does it really warrant a nearly $1,600 price increase?

Or, have NI just noticed that competing products, from the likes of, say, Matrox, are priced from around $3,500 for an Atom-powered, VGA-res smart camera, so they figured they should position their camera at the same level?

Or, could there be more to this than meets the eye? If you’re interested in my hypothesis, check back in a day or two.

Thursday, August 4, 2011

Business outlook for machine vision


July/August is second quarter reporting time for most companies, so I thought it might be fun to see what trends we can deduce from the reports appearing.

Vision Systems Design magazine provide an industry overview under the heading of “Machine-vision market report indicates slowing revenue growth in 2011” (July 28, 2011). The title says it all really, so my only comment is to say that this confirms the hockey stick projections I was making a long time ago: we’ve had the rebound and now we’re back to that pattern of steady and sustainable growth.

Looking for clues in specific company results has become harder since Teledyne acquired Dalsa. All their 2nd quarter report had to say was that Dalsa contributed sales of $63.8m to their imaging group. Not terribly informative.

Likewise, National Instruments reported their half year results at the end of July. Looks like they’re back on track with revenue growth of 20% over the same period in 2010, but how much of that $452m of first half sales came from machine vision? NI don’t say.

Across the pond, Basler don’t report until later in August, but we do know that they had an excellent 1st quarter, with revenues up 47% over the start of 2010. Likewise, Augusta, parent of Allied Vision Technologies, VDS Vosskühler and LMI Technologies, reported very strong 1st quarter growth.

So what does all this suggest going forwards for the rest of 2011? Can this pace be sustained? Well perhaps the results from Cognex, published August 1st, give us a clue.

Like NI, Cognex also saw first half growth of around 20%, although an interesting side note is that this all came from the Modular Vision Systems Division (In-Sight, Checker, and the like) rather than from the much smaller Surface Inspection Systems Division. Also of interest was the note that much of the future growth is anticipated to come from China. This would seem to link with the recent announcement from Foxconn that they intend to dramatically increase their usage of manufacturing automation.

Staying with Cognex, a couple of other points to mention: finance website The Motley Fool recently identified Cognex as one of the very few companies satisfying their “7 Signs of a Winner” criteria. (Perhaps it’s time I finally bought some stock – the dividend alone is at least as good as I can get in the bank.) And second, tucked away in the various financial announcements was word that R&D headcount has been increasing. That has to be a good thing, although I suspect the additional resources will be channeled into enhanced smart camera products rather than cool high-end systems. Then again, as their results show, they are following the money.

So, what of the second half of 2011?

Well clearly, growth is slowing from the pace of 2010, but that’s only to be expected. I also noticed several players commented that the third quarter is traditionally slow, so perhaps like-for-like growth of around 10% should be expected. My guess is that the fourth will be similar, at around 10% compared to Q4 ’10, meaning that 2011 will probably turn out to be the best ever for machine vision.

Monday, January 31, 2011

Microsoft’s Kinect as a machine vision platform?

Regular readers will know that I’m quite enamored of the Kinect: this low-cost 3D imaging system seems to have huge potential and is just waiting for hackers to figure out how to put it to work.
I’m also a fan of National Instruments LabVIEW software, so the news that at least one clever person has figured out how to marry the two is of course music to my ears.

I figure there’s no point in me regurgitating a summary of how it’s all done – I’ll probably get it wrong anyway – as you can just go straight to the source, so here are the links you need:

Part 3 can’t be far away!

Tuesday, November 16, 2010

Speakers for machine vision conferences

I don’t attend many conferences, partly because my travel budget is microscopic, but mainly because I don’t find them very useful. The reason they lack usefulness is that most of the presenters are trying to sell me something. If I want to learn about new products or services I’ll read about them in the many magazines that cross my desk or on-line. What I want from a conference is to learn about how my peers have solved difficult imaging and inspection problems. Unfortunately though, these kinds of presentations are rather rare.

I think there are two reasons for this. First, many users of machine vision are reluctant to share what they’ve done out of a fear that they will be helping their competitors. That’s a valid concern although I think some companies are slipping into paranoia – exchanging information can help everyone advance (why else do we have a patent system?)

Second, people who develop and use machine vision systems are too busy doing the work to spend time talking about it. Let’s not forget that actually getting up at the conference is the least of it. Many hours have to go into preparing a good presentation, getting it approved by the company “higher ups”, and then rehearsing.

And what’s the return on this investment of time and effort? Not a whole lot, for either the speaker or the employer. Sure, there’s the slim chance that a recruiter will get to know of the speaker and match him or her up with a wonderful new job, but that’s another good reason for employers not to let their staff present.

So what’s the answer? Well if conference organizers want to avoid a parade of thinly disguised sales presentations I suggest they do the following:

  • Don’t just advertise a “Call for Speakers.” Proactively go out and invite people to speak at your conference.
  • Pick up the speakers’ travel and hotel costs.
  • Provide some freebies for the employer – maybe a $500 credit towards some of the products exhibited at the show or multiple free conference passes.
  • Be very appreciative of the time and effort (not to mention the nervous anxiety) that go into preparing and delivering a presentation.

In wrapping up, I should mention that one notable exception to my “thinly disguised sales presentations” rule seems to be National Instruments Week. Reading some of the press reviews, it appears they actually took the time to seek out machine vision practitioners rather than vendors, so kudos to NI.

Wednesday, October 27, 2010

Spreading the word

I always thought that National Instruments were very smart in the way they get their products into the hands of engineering students. It’s such a chore to learn a programming language that most of us would rather only do it once, so by indoctrinating students, NI wins over a large body of disciples who go out into the world each year and lobby their employers to start using LabVIEW.

As evidence to support my thesis I present, “Winning an International Robotics Competition with LabVIEW and NI Machine Vision Hardware,” (undated) published on the National Instruments web site.

This case study describes a project undertaken by students at Fontys University in The Netherlands. The goal was to develop an autonomous vehicle that can drive through a field of corn. The students achieved this by combining the LabVIEW Real-Time Module for motion control with the NI Embedded Vision System. A particularly interesting aspect of the latter was that it used only a single camera to obtain 360 degree situational awareness.

How did they do that? Well, you’ll have to read the article to find out, and while you’re at it, I suggest you click through the photo gallery too.

Sunday, August 22, 2010

How the dart was illuminated

Did you watch the video in my last post? If not, you may want to do that before reading on.

Here are my thoughts on how the dart was illuminated. I think there are two parts to it.

First, we see an LED ringlight around each camera but they never appear to fire up. Additionally, there appears to be a filter over each lens. My guess is that the lighting is infrared and the lenses are each covered with an IR bandpass filter. That would allow the system builder to strobe the lights without blinding anyone watching.

This much was also deduced by the first person to post a comment, but I think there's more to it than just the use of IR. Did you notice the big shroud over the dart’s flightpath? I think that does much more than just keep out ambient light: I suspect that the ringlights light it up so it acts as a giant backlight. And to that end, the shroud is probably not made of any old cardboard but rather is formed from a material selected specifically for its ability to bounce light around.

Well that’s how I’d do it anyway.

Now if you read the second comment, from Andy Wilson of Vision Systems Design, you'll know that this demo is not exactly novel. Like Andy, I'm a little surprised that NI would replicate what someone else has already done. Yes, I know the NI system has to work faster than its predecessor, but what they've done isn't really awe-inspiring.

Thursday, August 19, 2010

Test your lighting knowledge

Here’s a little quiz for all you experts in machine vision lighting … watch the video, shot earlier this month at NI Week in Austin, TX, then answer this question: the object of the exercise is to calculate the trajectory of the dart in flight. We are shown two cameras but we don’t see any illumination, so how does this work?



Check back next week for my thoughts.

Thursday, August 12, 2010

More on the new VBAI

Yesterday I shared with you some of the image processing upgrades to this low(ish) priced vision software: today let’s talk about the interface.

Most machine vision developers like to put a customized user interface on the systems they deploy on the plant floor, and until now VBAI has been rather inflexible in that regard. However, with the 2010 release that has changed.

VBAI 2010 has a user interface design capability that works pretty much like grown-up LabVIEW. You can drop in images, meters, and numeric indicators with gay abandon, and it’s even possible to import your own logos and color schemes. If you believe, as I do, that the GUI is a critical factor in getting a system accepted, this is all going to matter to you.

There’s no doubt in my mind that NI is serious about being a player in machine vision. Cognex beware!

Wednesday, August 11, 2010

NI adding machine vision capabilities

The VBAI vision product (that’s Vision Builder for Automated Inspection) is great for when you don’t want to wrestle the beast that is LabVIEW – that is, if you just need to implement a fairly straightforward solution and don’t want to mess with all those wires – but it has been just a little limited.

It seems National Instruments have recognized that they’re missing out on a significant market, because VBAI 2010 has some useful additions. There are four new software tools: one for checking contours, a rather clever color segmentation tool or classifier, optical flow and texture defect detection.

Optical flow is a little ho-hum (RoboRealm has had it for a long time), but the others could be rather useful. The defect detection tool in particular is quite clever. It uses such tools as “wavelet decomposition,” “Haralick features,” and a “Support Vector Machine” classifier. In layman’s terms, that means you train the system on the texture you want to inspect. The software offers a number of different processing effects and you, the user, select the one that seems to work best on your particular surface.
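For the curious, here’s roughly what that sort of pipeline looks like if you roll it yourself in Python with scikit-image and scikit-learn. To be clear, this is my own back-of-the-envelope sketch of the general co-occurrence-features-plus-SVM idea, not a peek inside VBAI; the feature set and parameters below are assumptions on my part.

```python
# A hypothetical illustration of the general idea (GLCM / Haralick-style texture
# features feeding an SVM classifier). Not NI's implementation; the feature set,
# parameters, and kernel choice below are my own assumptions.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def texture_features(patch):
    """Summarize an 8-bit greyscale patch with a few co-occurrence statistics."""
    glcm = graycomatrix(patch, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def train_defect_classifier(good_patches, bad_patches):
    """Fit an SVM on labelled good/defective texture patches."""
    X = np.array([texture_features(p) for p in list(good_patches) + list(bad_patches)])
    y = np.array([0] * len(good_patches) + [1] * len(bad_patches))
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(X, y)
    return clf

# Usage: clf = train_defect_classifier(goods, bads)
#        clf.predict([texture_features(new_patch)])   # 1 => likely defective
```

In VBAI, of course, all of this is hidden behind a training dialog, which is rather the point.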

As you might expect, it’s very computationally intensive, so don’t use it when time is of the essence – unless you want to take advantage of it in its “Vision Development Module” incarnation (which means using LabVIEW).

Regular readers know I have a cynical streak, so you won’t be surprised that I see this as a response to the “Stain” tool that Keyence launched a while back, and which works pretty well.

There is one other new feature in VBAI 2010 that I rather like, but as I try to keep these posts short you’ll have to check back tomorrow to learn about that.

Wednesday, August 4, 2010

Time for an update on NI machine vision

It’s been a while since I passed comment on the vision offerings from National Instruments, so let me give a shout-out to the latest incarnation of their real time vision box.

I’m referring to the sexily-named EVS-1463RT, which is billed as a complete vision system in an industrial box. The 1463 handles GigE and CameraLink, unlike its sister box, the 1464, which was released first and provides FireWire (b) and GigE connectivity.

I like the idea behind these boxes: all the I/O, memory, and OS – everything, really – is in one robust housing, which should make them quick and easy to implement. I just have to swallow hard when I look at the price: $4,500, and that’s before you’ve bought a camera and lights.

I like the NI range of vision products, but their pricing sure makes it hard to argue a persuasive case for going with them rather than the competition.

Monday, February 8, 2010

Inexpensive machine vision from NI

You don’t often see “NI” and “inexpensive” in the same sentence, but I’ve found an exception. Let me explain:

Log on to NI’s website and you’ll find that Vision Builder for Automated Inspection (VBAI) is listed at $1,899. It’s a nice piece of software, but to me that pricing seems a little steep. Purchase their entry-level smart camera – the 1722 – though, and VBAI is included for the grand price of $1,999.

Now to me that’s a good deal, and it puts the base camera in direct competition with the more basic offerings such as those from Banner.

Order it now before NI realize what they’ve done!

Sunday, February 7, 2010

Machine vision prototyping tool

As you put together your system, one of the challenges is anticipating every possible environmental variation. You know that lighting will change, there could be vibration and everything will get dusty, but how do you ensure your application will continue to perform?

Well one way is to test it on a large set of doctored images. By “doctored” I mean start with a good image and use a package such as Photoshop to make subtle tweaks in brightness and contrast as well as adding in noise or blur. (Of course, as a good engineer, you’ll make controlled changes so you can actually quantify sensitivity to the various parameters you select.)
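If you’d rather script the doctoring than click through Photoshop, something along these lines will generate a controlled set of variants. This is just my own OpenCV sketch: the file names, output folder, and parameter ranges are invented, so substitute values that bracket the variation you actually expect on the line.

```python
# A minimal sketch of scripted image "doctoring" with OpenCV and NumPy. The file
# names, output folder, and parameter sweeps are made up for illustration.
import os
import cv2
import numpy as np

def doctored_variants(image):
    """Yield (label, image) pairs with controlled brightness, contrast, blur and noise changes."""
    for gain in (0.8, 1.0, 1.2):                 # contrast scaling
        for offset in (-20, 0, 20):              # brightness shift
            yield (f"gain{gain}_offset{offset}",
                   cv2.convertScaleAbs(image, alpha=gain, beta=offset))
    for k in (3, 5, 9):                          # increasing blur
        yield (f"blur{k}", cv2.GaussianBlur(image, (k, k), 0))
    rng = np.random.default_rng(0)
    for sigma in (5, 10, 20):                    # additive Gaussian noise
        noisy = image.astype(np.float64) + rng.normal(0.0, sigma, image.shape)
        yield (f"noise{sigma}", np.clip(noisy, 0, 255).astype(np.uint8))

img = cv2.imread("good_part.png", cv2.IMREAD_GRAYSCALE)   # hypothetical reference image
os.makedirs("test_set", exist_ok=True)
for label, variant in doctored_variants(img):
    cv2.imwrite(os.path.join("test_set", f"{label}.png"), variant)
```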

This is all well and good in theory, but time is money, so the hours you spend developing that set of test images have to be paid for by the customer.

An alternative is to let your machine vision software do it for you. That’s the idea behind some of the new tools in the latest release of Vision Builder AI from National Instruments, as discussed by Kamalina Srikant in Advanced Imaging Magazine (January 4, 2010).

I think it’s a great idea. Let me know if you’ve tried it.

Monday, August 31, 2009

The only web link you’ll ever need

OK, that’s an exaggeration, but if you’re new to machine vision and looking for somewhere to begin, the Vision Resource Kit from National Instruments is a good place to start. Topics covered include lighting, optics, image processing, and of course, an introduction to NI’s family of smart cameras.

Beware, though: it is a 26 MB download, so you probably don’t want to be on a dial-up connection when you follow the link above.

Monday, June 29, 2009

NI – still committed to machine vision

I’ve always found Vision Builder a good product for desktop evaluation of images and rudimentary algorithm prototyping, so I’m looking forward to the next release. I anticipate the inclusion of a new function to help in testing algorithm robustness. More specifically, NI have taken on board a suggestion, made at a previous NI Week, that a large image set should be created artificially for offline robustness testing.

What this means is that the new release of VBAI will have tools to make subtle modifications to images, such as adding motion blur or changing scale. With this I can start with a small set of acquired images and turn them into a much larger set that reflects every condition my vision system might see in the real world. By running these through my prototype algorithm I can then quantify the level of false accepts and false rejects to be expected, and so identify where improvements are needed.
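The bookkeeping at the end is simple enough to sketch. Here’s a minimal example of tallying false accepts and false rejects once you have ground-truth labels and your algorithm’s verdicts on the expanded image set; this is my own illustration, not anything VBAI exposes in this form.

```python
# My own minimal example of the bookkeeping: given ground-truth labels and the
# algorithm's verdicts, report the false accept rate (bad parts passed) and the
# false reject rate (good parts failed).
def false_accept_reject_rates(ground_truth, decisions):
    """Both arguments are parallel sequences of booleans, True meaning 'good part'."""
    fa = sum(1 for truth, passed in zip(ground_truth, decisions) if not truth and passed)
    fr = sum(1 for truth, passed in zip(ground_truth, decisions) if truth and not passed)
    bad = sum(1 for truth in ground_truth if not truth)
    good = len(ground_truth) - bad
    return (fa / bad if bad else 0.0, fr / good if good else 0.0)

# Example: two good parts, two bad parts; one of each is misjudged.
# false_accept_reject_rates([True, True, False, False], [True, False, True, False])
# -> (0.5, 0.5)
```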

Or at least, that’s the theory. When I get my hands on the new release I’ll let you know if it really delivers.

Tuesday, June 16, 2009

Compact, yet effective

The NI presence at the 2009 Vision & Robots show was very much smaller than the extravaganza they’ve put on in previous years, yet they managed to pack a lot of machine vision into a small space. I counted 4 cameras in a single small demo: two of their smart cameras, a FireWire camera hooked up to the venerable CVS, and most interestingly, a line scan camera plugged in to the new “Embedded Vision System,” the EVS.

The EVS is perhaps best described as the CVS all grown up. Physically, it’s a little bigger, although much of the increased bulk is heatsink, necessitated by the fanless design. Inside there’s an Intel 1.66GHz dual core processor, but rather than Windows it runs LabVIEW RT. Sporting every connection interface imaginable (except CameraLink), this is a versatile little box with aspirations to be the heart of a real industrial machine vision system. (If you’re interested in watching a webcast intro to the product, use this link.)

The big downside of the package is price. The EVS itself is $4,500, on top of which you need the LabVIEW RT vision development bundle (at least, I think you do. NI are as confusing as ever about software.)

So who’s going to buy this product? I think its appeal is to the hardcore LabVIEW developer (or Alliance member) who needs to implement a vision system. I really can’t see it winning converts from users of other machine vision products because of the price and the learning curve.

Thursday, May 21, 2009

NI Week – will you be there?

It’s a few years since I last attended the LabVIEW lovefest that is NI Week. Held every August in sunny – and hot – Austin, NI Week is always an impressive demonstration of the power of the LabVIEW graphical programming language.

But my interest is machine vision, and industrial applications thereof, and having scanned the program, I’m not seeing a lot to whet my appetite. Is it that they are slow pulling the program together, or could it be that machine vision is not high on NI’s priorities?

Time was, NI had a strong vision champion in Matt Slaughter, but Matt moved on to a new position earlier this year and I sense that the baton was not passed. So just what is NI’s long term plan for vision? Speak up Austin, I’d like to hear what you have to say.

Sunday, November 23, 2008

Is execution speed a differentiator?


In their white paper “10 Things to Consider When Choosing Vision Software” (published as part of the “Vision Resources Kit”), National Instruments (NI) included a table comparing the speed of their image processing algorithms with those of a competitor. As I’m promoting their products, I’m sure they won’t object to me reproducing it here.

I would have liked some information on how they produced these statistics – for example, what size was the image? But even without such details I find the table interesting, for two reasons. First, the NI algorithms appear to execute much faster than those from the unnamed competitor, which suggests that NI engineers do their coding very efficiently. And second, NI appear to believe that, even in these days of quad-core processors running at a gazillion MIPS, speed is still an issue for some vision users.
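Of course, you don’t have to take a vendor’s table at face value: it’s easy enough to time your own pipeline on your own images. Here’s a minimal harness, assuming Python with OpenCV installed; the operations are stand-ins, so swap in whatever steps your inspection actually runs.

```python
# A minimal timing harness. The operations below are stand-in workloads, not a
# reproduction of NI's benchmark; image size and repeat count are arbitrary.
import time
import cv2
import numpy as np

def benchmark(fn, image, repeats=100):
    """Return the mean wall-clock time per call, in milliseconds."""
    fn(image)                                    # warm-up run (caches, lazy init)
    start = time.perf_counter()
    for _ in range(repeats):
        fn(image)
    return (time.perf_counter() - start) * 1000.0 / repeats

img = np.random.randint(0, 256, (1024, 1024), dtype=np.uint8)   # synthetic 1-megapixel frame
print("binary threshold:", benchmark(lambda im: cv2.threshold(im, 128, 255, cv2.THRESH_BINARY), img), "ms")
print("5x5 median filter:", benchmark(lambda im: cv2.medianBlur(im, 5), img), "ms")
```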

So let me throw out a question: do you ever run into problems with algorithms taking too long to execute? If you do, would you be kind enough to share some details?