Why interoperability is important for open standards - and how to get it

What do we mean by an open standard? A thousand-page explanation could easily be written, but for the world of embedded computing it usually means a succinct definition of everything a vendor needs to know to build equipment (and write software) that will work with compatible products from other vendors. In our business, these standards are usually defined at the board level: boards from Vendor A and Vendor B will plug into a chassis built by Vendor C, and everything will work together as intended.

While large ecosystems of products have grown up around popular standards, the systems built on those standards are becoming more complex. Just plugging together a bunch of equipment obtained from multiple vendors and expecting it to work immediately is no longer always realistic. Modern high-performance systems call for signaling rates of up to 10 Gigabits per second (Gbps) through boards, connectors, and backplanes. AdvancedTCA (ATCA), for example, gave the industry its first open-standard system management infrastructure that allows High Availability (HA) systems to continue working in the presence of malfunctions or failures. A lot of complex software and hardware needs to work reliably to build these systems. "Plug-and-play" is not automatically assured.

So why not use proprietary products from a single vendor if you want everything to work out of the box? After all, those vendors offer everything assembled, tested, and integrated. And there is usually only one phone call to make if something doesn't work. Sounds good, right? Sometimes it is and sometimes it isn't. The downside of proprietary systems is that they are usually expensive and often don't use the latest technology or offer particularly high performance. Upgrades tend to be slow to market and are also expensive, because the vendor already owns you and can charge what they please. And only very large companies (with their very large overhead expenses) can be true experts in the myriad technical disciplines necessary to build a complex, high-performance system.

These are some of the issues driving many industries to move to open standards, if they have not already done so. Open standards are usually developed by standards organizations (like PICMG) that have hundreds of members with an extremely diverse technical talent base. When dozens of vendors compete for a customer's business there is price and performance competition, which is a good thing for the customer. If the customer doesn't like their vendor(s), they can go someplace else. And open standards-based products tend to offer leading-edge technology and improvements. Certain companies will specialize in certain areas, providing best-in-class performance. Sounds great, huh? No brainer, right?

Hold on. As standards-based systems get more complex, more software and fine-tuned hardware is required. Simply buying a bunch of equipment that claims to be compliant with a standard, plugging it together, and expecting it all to work immediately is becoming less and less realistic, for many reasons. Complexity is the main one. Another is that popular standards are often written with dozens, hundreds, or even thousands of options that may or may not interoperate.

So what can be done? One simple answer is to test standardized equipment from various vendors for compliance. In reality, compliance testing can be ugly. It is expensive, and disputes sometimes arise over the exact meaning of the always-imperfect language written into the standard. And the first time a product is deemed non-compliant, the testing body usually gets sued. As a result, most standards organizations shy away from compliance testing unless they are large and have a strong legal arm. They rely on vendors to self-certify instead.

So how do you achieve multi-vendor interoperability for standards-based equipment? One real solution is to put vendors and their equipment in a secure environment and let them test everything against everything else. PICMG was a pioneer in doing this and has been conducting Interoperability Workshops (IWs) on a regular basis since the mid-1990s. These events are held as needed, usually at least once a year, or more often if a standard changes or adds significant new features (for example, 40G ATCA). Testing is structured so that everyone arrives with a list of what they want to test and with whom, and adequate time is scheduled so that all participants can complete the tests they need. IWs are conducted in a "safe" environment so that "what happens in the room stays in the room."

PICMG has conducted ATCA IWs for a decade, and other technologies, such as Serial, are now part of the program. The 24th PICMG IW will be held September 16-19 and hosted by Pentair in Straubenhardt, Germany.

Improving multi-vendor interoperability through these workshops still does not solve every problem. That is where systems integrators play an increasingly important role, but we'll talk about that in another issue.