May 25, 2022

Overcoming the IoT interoperability hurdle


Just about every surface in the material world, and every environment in the digital world, is being instrumented to emit data describing itself and its behavior. This includes but is not limited to cars, software applications, factories, 400-ton mining trucks, financial markets, power grids, ice caps, satellites, apparel, appliances, phones, bodies, brains, jet engines—the list goes on. If the ripening of your apples isn’t already being tracked by line graphs in a cloud-based app, it may be soon.

It’s no surprise, then, that data is growing exponentially. By 2025, the world is forecast to hold 41.7 billion IoT devices transmitting 73.1 zettabytes of data. Data only grows more important as businesses increasingly use it to make business-altering decisions, and instrumenting things that previously went uninstrumented means still more data.

This expected ballooning of data is a good thing for us data nerds, but it comes with an infamous set of complications. As it stands, hundreds of vendors and thousands of individual community contributors are responsible for instrumenting the world. But this network of contributors is far from a well-oiled machine; disparate technologies handle data communication in disparate ways all over the world.

What’s needed is a set of IoT interoperability standards spanning data collection, stream processing, visualization, alerting, and machine learning and AI frameworks.

Dissecting one of IoT’s biggest challenges

A quick look at the IoT landscape shows how quickly the complexity can grow. There are multiple variables in play, and there is a potential for issues at every level of IoT architecture.

Take instrumentation as one example. If a device vendor writes metric emission into the firmware, users typically cannot change that instrumentation; if you’re lucky, the target systems where these devices send data may be configurable. Meanwhile, purpose-built sensors are designed to attach to equipment and gather their respective signals, but efficiently collecting that data can again be hampered by vendor firmware. You could deploy third-party services to pull, or “scrape,” data from data sources, but this requires that the data sources actually allow it.
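The pull model described above can be sketched in a few lines. In this illustration each data source is modeled as a plain callable standing in for a real device endpoint; `scrape_once` and `locked_device` are hypothetical names invented for the sketch, not any vendor’s API:

```python
import time
from typing import Callable, Dict, List

# Hypothetical setup: each "device" is a callable returning a reading,
# standing in for an HTTP endpoint or a register read on real hardware.
Source = Callable[[], float]

def scrape_once(sources: Dict[str, Source]) -> List[dict]:
    """Poll every source; tag each reading with its origin and a timestamp."""
    readings = []
    for name, read in sources.items():
        try:
            readings.append({"source": name, "value": read(), "ts": time.time()})
        except PermissionError:
            # A source whose firmware does not allow pulls yields no reading.
            continue
    return readings

def locked_device() -> float:
    # Stands in for a device whose vendor firmware rejects scraping.
    raise PermissionError("pull not allowed")

readings = scrape_once({"temp_sensor": lambda: 21.5, "locked": locked_device})
print(readings)
```

The key point is the last clause of the paragraph above: the collector only works for sources that permit being polled; the locked device simply drops out of the result set.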

After instrumentation, the next thing to consider is the architecture of the pipeline from the data source to where that data is ultimately analyzed—a convoluted space to say the least. The moment you begin to think about agents, gateways, message queues, and streaming engines, the questions pour in. Which do you use? One? Some? All? Which of each do you use? How many? Where do you put them? What order do they go in?

To complicate matters further, the answers to these questions depend on your answers to all the other questions—a Cartesian product of possible solutions. They’re all interdependent decisions, so the technologies you use need to be both evaluated and decided on essentially simultaneously. Is your head spinning yet? It’s no wonder digital transformation has felt more like a lifestyle than a step toward progress.

And it doesn’t end there. What does the data actually look like? In which formats are data being emitted and transmitted? Do you use JSON? CSV? XML? Some binary? In most cases, the answer is likely a combination of these. Finally, we also need to decide on the way the technologies transfer data in these various formats. In other words, which protocol should we use? It could be OPC, MQTT, Sparkplug, Modbus, HTTP, TCP/UDP, WebSocket, or a number of other options.
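One common way to cope with that mix of formats is for the collection layer to normalize every payload into a single internal record shape. Below is a minimal sketch using only the Python standard library; the `normalize` function and the device/metric/value record schema are assumptions made for illustration, not a standard:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def normalize(payload: str, fmt: str) -> dict:
    """Reduce a JSON, CSV, or XML payload to one common record shape."""
    if fmt == "json":
        d = json.loads(payload)
    elif fmt == "csv":
        # Assumes a header row followed by one data row.
        d = next(csv.DictReader(io.StringIO(payload)))
    elif fmt == "xml":
        d = {child.tag: child.text for child in ET.fromstring(payload)}
    else:
        raise ValueError(f"unsupported format: {fmt}")
    return {"device": d["device"], "metric": d["metric"], "value": float(d["value"])}

# The same reading arriving in three different wire formats:
print(normalize('{"device": "pump-1", "metric": "temp", "value": 41.7}', "json"))
print(normalize("device,metric,value\npump-1,temp,41.7", "csv"))
print(normalize(
    "<r><device>pump-1</device><metric>temp</metric><value>41.7</value></r>",
    "xml"))
```

All three calls yield the same record, which is exactly the property an interoperable pipeline needs from its ingestion layer.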

What IoT interoperability will require

At this point in time, there is no perfect answer to true IoT interoperability other than getting everyone on the same page—quite the lofty goal. The first step to getting to that point is designing and utilizing tools that enable interoperability in a way that can take IoT a significant step forward.

There are several quality technologies aimed at this problem; each approaches it in a slightly different way and targets a different niche. In some cases, these platforms and services complement one another. When it comes to interoperability, a system requires a laundry list of available inputs and outputs. Further, such a system will see data arriving from inputs in all sorts of shapes and sizes, and the ways in which that data is distributed are just as varied. There are multiple moving parts at every level that contribute to the complexity of the challenge.
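That laundry list of inputs and outputs is typically handled with a plug-in registry, so new sources and sinks can be added without touching the pipeline core. The following is a toy sketch of that general pattern, with invented names and an in-memory sink standing in for a real database or message queue:

```python
from typing import Callable, Dict, List

# Registries of input and output plug-ins (illustrative, not any product's API).
INPUTS: Dict[str, Callable[[], List[dict]]] = {}
OUTPUTS: Dict[str, Callable[[List[dict]], None]] = {}

def register_input(name: str):
    def wrap(fn):
        INPUTS[name] = fn
        return fn
    return wrap

def register_output(name: str):
    def wrap(fn):
        OUTPUTS[name] = fn
        return fn
    return wrap

@register_input("dummy_sensor")
def read_dummy() -> List[dict]:
    # Stands in for a real input plug-in (MQTT subscriber, HTTP poller, etc.).
    return [{"metric": "temp", "value": 21.5}]

sink: List[dict] = []

@register_output("memory_sink")
def write_memory(batch: List[dict]) -> None:
    # Stands in for a real output plug-in (database writer, queue publisher).
    sink.extend(batch)

# One pipeline tick: fan every input's readings out to every output.
for read in INPUTS.values():
    batch = read()
    for write in OUTPUTS.values():
        write(batch)

print(sink)
```

Because inputs and outputs only meet at the registry, either side can be swapped independently, which is the essence of the interoperability the paragraph above describes.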

As IoT extends into every facet of our lives, the big challenge for operators and data architects will be delivering data solutions that are interoperable with legacy, current, and future systems—ultimately getting data into the hands of the operators and analysts who need it. A common goal across the IoT space is to glean insights from this explosion of data, so it’s the shared responsibility of the community to make interoperability a consideration in all the work it does going forward.

Sam Dillard is a senior product manager at InfluxData. He is passionate about building software that solves real problems and the research that uncovers these problems. Sam has a BS in Economics from Santa Clara University.

New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to newtechforum@infoworld.com.

Copyright © 2022 IDG Communications, Inc.
