Thursday, July 5, 2012

Can GigE be trusted?


I spoke with a gentleman a few days back who advised me not to use GigE because it “drops frames.” His recommendation was to do everything with CameraLink. Now while I agree that CameraLink is robust and reliable, I’d not heard of GigE “dropping frames,” so I did some research.

In my library of useful machine vision stuff I found two relevant papers. One is “GigE Vision cameras and network performance,” published by Leutron Vision back in 2009 (unfortunately I can’t find a link on their website). The other, more recent paper is “4 Critical Factors: Deploying GigE Vision in Real-Time Industrial Imaging,” available in the Knowledge Center area of Teledyne Dalsa’s website. (I believe you’ll have to register for access.)

These papers both talk about overhead, packet size, flow control, selecting the right network card, and the use of Ethernet switches, but there’s little about the possibility of dropping frames. The closest I found was a comment in the Leutron paper about the risk of dropping packets if the CPU load is too high. Dropping packets would be bad, although it strikes me the risk is low unless you’ve loaded a lot of other processing onto an older PC. But dropping frames?
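Incidentally, if you want to see whether your own host is the one dropping packets, Linux keeps per-interface counters you can watch while the camera streams. Here’s a minimal sketch; the Linux host and the interface name “eth1” are assumptions, so substitute the NIC your camera actually sits on.

```python
# Minimal sketch: poll the Linux per-interface drop counters while the camera
# is streaming. The interface name "eth1" is an assumption -- substitute the
# NIC your camera is attached to. Rising rx_dropped / rx_errors counts during
# acquisition suggest the host, not the camera, is where data is being lost.
import time
from pathlib import Path

IFACE = "eth1"  # assumed interface name
STATS = Path(f"/sys/class/net/{IFACE}/statistics")

def read_counter(name: str) -> int:
    return int((STATS / name).read_text())

def watch(seconds: int = 30, interval: float = 1.0) -> None:
    names = ("rx_packets", "rx_dropped", "rx_errors")
    baseline = {n: read_counter(n) for n in names}
    end = time.time() + seconds
    while time.time() < end:
        time.sleep(interval)
        deltas = {n: read_counter(n) - baseline[n] for n in names}
        print(f"rx_packets +{deltas['rx_packets']:>9}  "
              f"rx_dropped +{deltas['rx_dropped']:>4}  "
              f"rx_errors +{deltas['rx_errors']:>4}")

if __name__ == "__main__":
    watch()
```

Counters that stay flat while frames go missing point the finger elsewhere: the switch, the camera, or the application layer.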

GigE is rapidly becoming the standard camera interface for machine vision, so presumably the community is happy with how it performs. But are there pitfalls lurking for the unwary?

If you have any knowledge, experience, or, perhaps most important, verifiable facts about the reliability of GigE, please use the Comment function to share.

6 comments:

Zhenyu said...

GigE Vision uses UDP rather than TCP. Therefore, the sender (camera) keeps sending packets without knowing whether the receiver (grabber and CPU) is ready for new packets. This is less of an issue if the receiver triggers the sender to capture a new image, although the receiver may still drop the whole image if one of the many packets for that image is lost. It is more of an issue if the sender is configured to be the master (free-running).

Disclaimer: I have not verified the above speculation. My speculation is based only on the available documents about GigE Vision.
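Zhenyu’s point about one lost packet costing a whole image is easy to illustrate with a toy receiver. The sketch below is not the real GVSP wire format; it just assumes each datagram carries a (frame_id, packet_id, total_packets) header, which is enough to show why a single gap leaves the receiver holding an incomplete frame it must either discard or flag.

```python
# Toy model of a UDP image-stream receiver -- NOT the real GVSP packet layout.
# Assumed header: three 32-bit big-endian fields (frame_id, packet_id,
# total_packets), with packet_id running from 0 to total_packets - 1,
# followed by a slice of image data.
import socket
import struct

HEADER = struct.Struct(">III")  # assumed layout, for illustration only

def receive_frames(port: int = 5000):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    frames = {}  # frame_id -> {packet_id: payload}

    while True:
        data, _ = sock.recvfrom(9000)  # room for jumbo frames
        frame_id, packet_id, total = HEADER.unpack_from(data)
        chunks = frames.setdefault(frame_id, {})
        chunks[packet_id] = data[HEADER.size:]

        if len(chunks) == total:
            # Every packet arrived: reassemble and hand the image over.
            image = b"".join(chunks[i] for i in range(total))
            del frames[frame_id]
            yield frame_id, image
        # If even one packet never shows up, the frame sits here incomplete.
        # UDP will not retransmit it on its own; without a resend mechanism
        # the receiver's only choices are to drop the frame or deliver it
        # marked as partial -- exactly the behaviour Zhenyu describes.
```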

Steve Maves said...

I wonder if the real complaint isn't "drops triggers" rather than "drops frames." In my experience, some of the GigE Vision vendors have not done a great job duplicating basic frame grabber functions. The most basic one that gets overlooked is how the camera's trigger input responds when it receives triggers too fast (an over-run condition). Some vendors have no mechanism to report this back to the host PC. Basler seems to do a good job of reporting it; I'm sure there are others as well.
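Steve’s over-run scenario also yields to a bit of arithmetic: if triggers arrive faster than the camera can expose, read out, and ship a frame, something has to give. A rough sanity check, using made-up numbers rather than any particular camera’s datasheet:

```python
# Back-of-the-envelope trigger over-run check. All figures are illustrative
# assumptions, not any specific camera's specification.
def max_trigger_rate_hz(exposure_s: float, readout_s: float, transfer_s: float) -> float:
    """Conservative (non-overlapping) estimate of the fastest sustainable
    trigger rate; many cameras overlap these stages and can go faster."""
    return 1.0 / (exposure_s + readout_s + transfer_s)

exposure = 0.002                      # 2 ms exposure
readout = 0.008                       # 8 ms sensor readout
transfer = (1.2e6 * 8) / 1e9          # ~9.6 ms for a 1.2 MB image on a 1 Gbit/s link
limit = max_trigger_rate_hz(exposure, readout, transfer)
print(f"Sustainable trigger rate: roughly {limit:.0f} Hz")
# Trigger faster than this and the camera must ignore or queue triggers;
# whether that over-run is ever reported back to the host is entirely down
# to the vendor's implementation, which is Steve's point.
```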

Mark Willamson said...

CameraLink is a very basic point-to-point interface. In fact it has no validation, so if the data gets corrupted in the cable, which is quite possible with fast cameras and longer cable lengths, the error will not be detected.

On the other hand, GigE Vision uses a CRC and has the ability to resend data if it gets lost. With a point-to-point GigE Vision setup the chances of packets being dropped are very slight. Yes, if the CPU is hitting 100% you may get a loss because the CPU can't process all the packets, but optimising the network card will minimise this, and with the right GigE software, such as CVB CameraSuite, you can detect a corrupted frame and choose whether to process it or reject the item. A well-designed vision system should never hit 100% CPU anyhow.

If you really want to be 100% sure, use a GigE frame grabber such as the Silicon Software microEnable. This does all the packet receiving on board, so it is not subject to host CPU load; it delivers a full frame image to the host by DMA, just like a CameraLink grabber.

Finally, if you want a network of cameras you have to ensure your network can handle the payloads, so you need a good understanding of networking and of choosing the right switches. If you get this wrong you will get dropped packets. What other vision standard can give you this level of networking capability? You just need to be sensible. We implemented a system with 250 cameras in a networked environment at a very large scientific research centre; using the advanced capabilities of GigE Vision we delivered a very reliable system.
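Mark’s closing point, that the network has to be sized for the payload, also comes down to arithmetic. A rough budget with made-up camera parameters shows how quickly a few cameras can saturate a single gigabit uplink:

```python
# Rough bandwidth budget for GigE cameras sharing one switch uplink.
# Camera parameters are invented examples, not any specific model.
def camera_bandwidth_bps(width: int, height: int, bytes_per_pixel: float,
                         fps: float, overhead: float = 1.05) -> float:
    """Approximate on-the-wire bit rate; ~5% is assumed for GVSP/UDP/IP/
    Ethernet headers (the real figure depends on packet size)."""
    return width * height * bytes_per_pixel * 8 * fps * overhead

GIGE_LINK_BPS = 1e9

cameras = [
    ("inspection cam", camera_bandwidth_bps(1600, 1200, 1, 30)),   # Mono8, 30 fps
    ("barcode cam",    camera_bandwidth_bps(640, 480, 1, 100)),    # Mono8, 100 fps
    ("colour cam",     camera_bandwidth_bps(1280, 1024, 3, 15)),   # RGB8, 15 fps
]

total = sum(bps for _, bps in cameras)
for name, bps in cameras:
    print(f"{name:15s} {bps / 1e6:7.1f} Mbit/s")
print(f"{'total':15s} {total / 1e6:7.1f} Mbit/s "
      f"({100 * total / GIGE_LINK_BPS:.0f}% of one GigE link)")
# Anything approaching (or, as here, exceeding) 100% of the link guarantees
# dropped packets -- give each camera its own port or aggregate over 10 GigE.
```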

Paul Kozik said...

At the end of the day, most GigE cameras offer a resend mechanism, which lets users know when a packet has actually been dropped; the driver can then ask the camera to resend that packet. CameraLink, on the other hand, does not provide this visibility, so if you lose data because of poor cabling, etc., the user gets no warning bells.
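For a sense of what that driver-side visibility looks like, here is a purely hypothetical bit of bookkeeping; the camera object and its request_resend()/collect_resent() calls are invented stand-ins, not any real SDK’s API:

```python
# Hypothetical driver-side resend bookkeeping. The `camera` object and its
# request_resend() / collect_resent() methods are invented for illustration;
# real GigE Vision drivers expose this through their own APIs.
def complete_frame(expected_packets: int, received: dict, camera, frame_id: int,
                   max_attempts: int = 3) -> bool:
    """Ask the camera to resend whatever is missing; give up after a few tries."""
    for _ in range(max_attempts):
        missing = [i for i in range(expected_packets) if i not in received]
        if not missing:
            return True                                     # frame is complete
        camera.request_resend(frame_id, missing)            # hypothetical call
        received.update(camera.collect_resent(frame_id))    # hypothetical call
    return False  # still incomplete, but at least the application knows it
```

The point is not the code itself but the last line: with GigE Vision the application can know a frame is incomplete, which is exactly the warning bell CameraLink never rings.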

Julie Harrison said...

Hi Everyone,

For a detailed response, I'd recommend checking out the answer from Eric at Dalsa, sparked directly by your interesting discussion here.

You can access it here: http://blog.teledynedalsa.com/2012/07/can-gige-vision-lose-frames/

I thought he covered the issue really well.

Julie (Pleora Technologies)

W. Kent said...

Yes, you will drop packets.
Yes, it will suck.
No, there is no easy way to fix it.

So, what it comes down to is two key considerations: your camera and driver, and the network adapter you are using.

1. Some companies implement GigE really, really POORLY; here's looking at you, *******. This will cause dropped packets when you are running a multi-camera system, but it depends on how it's done: some drivers switch everything over a single IP port on the computer, others give each camera a unique IP port... and this is where things get screwy. The driver I was stuck working with, before I implemented the GigE spec myself (a HUGE pain) to resolve the matter, had an issue if you crossed subnets above 10.0.* and had them on the same computer. Shitty implementation and lots of headaches over dropped packets.

2. The multi-port network adapter itself. Some cards have two or four physically individual chips on them, each basically a completely separate network card from the others. Others share memory between the ports. With the second kind, expect lots of dropped packets.

So, in total: dropped packets can happen. They will happen. You will cry. But there is a bright side: if you are using a single-camera-per-computer setup you are fine. If you end up needing multi-camera single-port or multi-camera multi-port setups like I have, look at other GigE software vendors that support your camera; their usability and overall workability, on average, seem much better than the vendor-supplied software.

-Sincerely,

The guy who thought he could run 12 cameras into a single server with 12 individual ports.
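One common way to sidestep the subnet confusion Kent describes is to give every NIC port its own private subnet and pin one camera to each, so the host's routing table can never send a camera's traffic out of the wrong port. A hypothetical addressing plan (interface names and addresses are examples only):

```python
# Hypothetical addressing plan for a multi-port, multi-camera host.
# One private /24 per NIC port and one camera per port; no overlapping
# subnets, so routing is unambiguous. All names and addresses are examples.
ADDRESS_PLAN = {
    "eth1": {"host": "192.168.10.1/24", "camera": "192.168.10.2"},
    "eth2": {"host": "192.168.11.1/24", "camera": "192.168.11.2"},
    "eth3": {"host": "192.168.12.1/24", "camera": "192.168.12.2"},
    "eth4": {"host": "192.168.13.1/24", "camera": "192.168.13.2"},
}

for port, addrs in ADDRESS_PLAN.items():
    print(f"{port}: host {addrs['host']:18s} camera {addrs['camera']}")
```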
