Do you remember when your company connected the first sensor or device to the Internet of Things (IoT)? It wasn’t plug and play. It was more wish and wait. You were excited about the potential, but it was weeks before the device was communicating properly. You didn’t understand why it took so long. Both you and your folks in IT were likely frustrated by that point.
Even today, you see stressed IT personnel walking the halls every time a new device is added. You hear words such as legacy, protocols, and IPv6. Very little of it makes sense, but you have a feeling it’s going to cost the company valuable time and money.
Manufacturing is not the first industry to face the problem of connecting new devices to an existing infrastructure. The financial services sector has been struggling with connectivity issues for decades. Like manufacturers, financial institutions began with in-house, closed systems that shared information within the physical confines of a building or branch.
These closed systems didn’t need to share information with anyone or anything outside of the company. No one cared (yet) how the data was shared. After all, the company was only sharing with itself. Then, the closed system needed to communicate with a device that was not connected to the system. To communicate, the system and the device needed a way to connect and a way to exchange data. This required communication and data protocols.
- Communication protocols are ways that different technologies tell each other they want to exchange data.
- Data protocols define how the information is going to be sent and received.
There was no standard for connectivity, so organizations set their own.
Companies created protocols to communicate with devices. Manufacturers developed protocols for their devices to talk to companies. Very few methods were the same. Most of them were proprietary, meaning the method was only valid for a specific company or device. Proprietary systems worked until companies needed to connect hundreds of devices to their infrastructure in hours, not days.
Further complicating the problem were advances in technology that made the proprietary methods obsolete. Newer devices used a completely different paradigm for connecting and exchanging data. Suddenly, those proprietary or legacy systems were restricting growth, but to replace the legacy systems was cost-prohibitive.
The obvious solution was to create open systems where data could be exchanged according to set standards. No system would be proprietary. That ideal was simply too expensive. Decades of investment went into creating those legacy systems, and data integrity had to be preserved. No single standard exists for data exchange; however, a set of standards has emerged. This unofficial set of standards means that financial institutions only need to support a limited number of protocols to connect with the majority of devices. Manufacturing is facing similar problems as it tries to merge legacy systems with new technology.
How Open Is Open?
Unfortunately, no simple solution exists for the challenges facing manufacturers. No one has a crystal ball to know what technologies will dominate in the future. Part of the difficulty rests with the number of possible protocols and the layers through which data must flow. A secondary problem is the lack of consensus among the technical community as to the best way to approach the problem. Most technology vendors are biased towards their solution, so how do manufacturers decide?
Layers and Protocols
Think of data layers as floors in a building. To get to the top floor, you have to pass through the lower floors. The following are the four primary layers:
- Link Layer: At the link level, protocols determine how data is sent over the network's physical medium. Will the device connect through a wired Ethernet cable, wireless broadband, or mobile communications?
- Network Layer: On top of the link layer, a network protocol establishes the primary connection. For the IoT, an Internet Protocol such as IPv4 or IPv6 is used.
- Transport Layer: The third layer defines the communication protocol used between the device and the network. The most common transport protocol on the internet is the Transmission Control Protocol (TCP), but it is not the only one. The best choice depends on the network's requirements for speed, data integrity, and reliability.
- Application Layer: The final layer defines how applications format and exchange the data. It is also the layer with the most variability. Examples of application-layer protocols include HTTP, XMPP, and CoAP, among others.
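The layering above can be sketched with Python's standard library. In this illustrative example (the loopback address, port, and "ok" payload are arbitrary choices, not anything prescribed by the article), the operating system supplies the link layer, `AF_INET` selects IPv4 at the network layer, `SOCK_STREAM` selects TCP at the transport layer, and the bytes exchanged follow HTTP at the application layer:

```python
# Sketch of the four layers using a tiny local client and server.
# Link layer: handled by the OS (Ethernet, Wi-Fi, etc.).
import socket
import threading

def tiny_server(server_sock):
    """Accept one connection and answer an HTTP GET with a 200 response."""
    conn, _ = server_sock.accept()
    request = conn.recv(1024).decode()
    # Application layer: interpret the raw bytes as an HTTP request.
    if request.startswith("GET"):
        conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
    conn.close()

# Network layer: AF_INET = IPv4; transport layer: SOCK_STREAM = TCP.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))           # bind to any free local port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=tiny_server, args=(server,)).start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))     # transport layer: TCP handshake
client.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\n\r\n")  # application layer
reply = client.recv(1024).decode()
client.close()
server.close()
print(reply.split("\r\n")[0])           # → HTTP/1.1 200 OK
```

The point of the sketch is that each layer is independent: swapping `AF_INET` for `AF_INET6` changes only the network layer, and replacing the HTTP bytes with CoAP or XMPP would change only the application layer, which is exactly why mismatched protocols at any one layer break the whole exchange.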
When the protocols used within each layer do not align, data is lost or misinterpreted. Standards help organizations exchange information accurately and reliably.
A pivotal question surrounding standardization is: who sets the standards? As early as 2008, a group of companies was working to create an organization that would develop standards for interoperability on the IoT. This organization was recognized in 2017 as the Open Connectivity Foundation (OCF). This recognition enabled the OCF to submit specifications to the International Organization for Standardization (ISO) for publication, thus creating ISO standards for IoT connectivity.
Adherence to these ISO standards is voluntary, although acceptance of the OCF specifications is increasing. Approximately 400 companies belong to the foundation, including companies such as Samsung, Microsoft, and Cisco. Whether the industry goes with the OCF standards or another set of standards, manufacturers should push for an open standard of interoperability. Adhering to a standard means that devices can be added to a network faster, more reliably, and at a lower cost. It also means organizations can realize:
- Better quality and reliability: Products that comply with a standard meet a defined baseline of quality and reliability.
- Interoperability: Standardized communication protocols can be added to devices to support multi-vendor solutions.
- Global scalability: Standards allow industrial users with worldwide operations to deploy the same devices across the enterprise.
Manufacturers should not feel constrained by technology. The financial sector began to realize the benefits of standardization when financial institutions stopped purchasing devices that did not conform to their network requirements. They did not disregard technology when they made their decisions, but they realized that standardization was a business imperative.
We at GT have a wealth of experience in interoperability, especially when it comes to delivering business knowledge of all types across a variety of user devices and systems. Lean on this expertise to get yourself started or to optimize your existing digital transformation journey.