This article appeared in Evaluation Engineering and has been published here with permission.
The migration to more sophisticated cloud-based IoT functionality is relentless and rapid. However, ensuring optimal functionality in the complicated infrastructure of today's connectivity ecosystem is no small task. One advance is leveraging computing power elsewhere in the system to increase performance in situations where the application system's local hardware may not be the most capable.
One of the companies addressing advanced computing functionality in the cloud is IOTech, an edge software company, which recently launched Edge XRT, a time-critical edge platform for Microsoft Azure Sphere, a secured, high-level application platform with built-in communication and security features for connected devices. Designed and optimized for resource-constrained environments, Edge XRT enables device connectivity and edge intelligence for MCUs, gateways, and smart sensors at the IoT edge.
Edge XRT for Azure is fully compatible with Azure Sphere-certified chips and the Azure Sphere OS. An Azure Sphere device is designed to integrate securely with the Azure Sphere security service running in the cloud, ensuring the integrity of the device. Edge XRT also simplifies connectivity to sensors and devices at the edge through configuration rather than coding, enabling connectivity to Azure Sphere devices using standard industrial protocols such as Modbus, BACnet, EtherNet/IP, and others.
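To make "configuration rather than coding" concrete, here is a minimal, hypothetical sketch in Python; it is not Edge XRT's actual configuration format or API. A declarative device profile describes which registers to poll and how to scale them, so adding a sensor means editing data rather than writing new code. The `ModbusDriver` stub and profile fields are illustrative stand-ins.

```python
# Hypothetical sketch: configuration-driven device connectivity.
# The device profile is plain data; the polling code never changes
# when a new register or sensor is added.

from dataclasses import dataclass

# Declarative profile for a Modbus device; in a real platform this
# would typically live in a YAML/JSON file rather than in code.
DEVICE_PROFILE = {
    "name": "boiler-plc-01",
    "protocol": "modbus-tcp",
    "address": "192.168.1.50:502",
    "resources": [
        {"name": "temperature_c", "register": 100, "scale": 0.1},
        {"name": "pressure_kpa",  "register": 102, "scale": 1.0},
    ],
}

@dataclass
class Reading:
    device: str
    resource: str
    value: float

class ModbusDriver:
    """Hypothetical driver; in practice this would wrap a real Modbus stack."""
    def __init__(self, address: str):
        self.address = address

    def read_register(self, register: int) -> int:
        # Placeholder for an actual holding-register read.
        return 215  # e.g., raw value meaning 21.5 degC at scale 0.1

def poll(profile: dict) -> list:
    driver = ModbusDriver(profile["address"])
    return [
        Reading(profile["name"], r["name"],
                driver.read_register(r["register"]) * r["scale"])
        for r in profile["resources"]
    ]

if __name__ == "__main__":
    for reading in poll(DEVICE_PROFILE):
        print(reading)
```

The point of the pattern is that the loop over `resources` is generic; only the data describing the device changes from deployment to deployment.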
The More Things Change…
To get a deeper perspective on the situation as well as the application spaces impacted, we spoke with Andy Foster, Product Director at IOTech, about edge computing, and the IoT in the cloud. The conversation began with an observation that the more things change, the more they stay the same.
EE: We're going to drag you back a little bit, because my position has always been that the more things change, the more they stay the same, and these are the same arguments we used to have with thin clients, right? Back in the day when it was just a server in the basement and dumb terminals on desktops.
[Photo: Andy Foster, Product Director, IOTech]
Andy Foster: I think that's right. Technology goes through these kinds of cycles; it seems to reinvent itself every few years. As you said, we've moved from things like centralized, cloud-based applications to, particularly in what we're involved in at my company, more decentralized edge-based systems. But these trends are not something new from just the last couple of years. This tension between centralized and decentralized, client-server styles of architecture has been around for over 30 years.
The areas of concern are probably changing in terms of the types of applications and use cases we're trying to address. But architecturally, these things come in waves and problems come in cycles, and we're in a cycle now with edge computing, where things are moving away from a centralized architecture. With the advent of industrial IoT in particular, the need for more decentralized, edge-based systems is becoming really important. That's the market we're trying to address, but I think it's a little more nuanced than that.
What we're finding is that, in fact, a lot of companies building newer IoT use cases don't have simply an edge strategy or a cloud strategy; in many cases they need a hybrid strategy. In terms of the AI use cases in the markets and industrial verticals we support, truly autonomous systems are quite rare, although you may be moving a lot of the processing down to your edge devices, close to where the data sources typically are. So edge computing, and how we support it at scale, is very important. Interoperability between the edge and the cloud is also a very important concern.
EE: Not to drag up more jargon, and I hate jargon, but isn't that what the term "fog" is trying to describe?
Andy Foster: That's right. Fog is a term that's not used quite as commonly now, I think. It was coined a few years ago, originally by Cisco Systems, I believe, and it embraces the continuum from the edge all the way through to the cloud; that whole span is the fog.
EE: Looking at it from the point of view of the designer, the moving parts have increased by multiple orders of magnitude. There were different protocols in the hardware days, but they were expressed in connector shapes. Today you could have a device that could functionally utilize upwards of, say, four different wireless connectivity solutions.
Andy Foster: That's absolutely true, and I think that's one of the challenges for industrial IoT and edge-based systems. The industrial world in particular is quite complex. Even a factory environment is very heterogeneous, and connected systems that can talk to a PLC or get data off a piece of equipment are not particularly new.
What you typically found traditionally is that these connected systems were siloed. You could get a data stream from a particular piece of equipment on a production line, but that data was very much a single strand of siloed information. What IoT and edge computing are trying to achieve is to give you a much broader picture of exactly what's going on, for example, across a production line.
What that means is that instead of just tapping into a single data source, IoT applications may need to connect to multiple data sources in parallel, and in an industrial environment many different communication protocols can be in use. It's one of the big challenges. One of the areas we put a lot of focus on as a company is providing broad, out-of-the-box, configurable connectivity for the myriad of standards you encounter everywhere from a factory floor to a retail space, for example.
So you've got wireless, over-the-air protocols like Zigbee and Bluetooth, lots of specialized real-time protocols, and proprietary standards and protocols that come from some of the big automation vendors. That's a major challenge in terms of acquiring data and supporting these new use cases.
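One way to picture that kind of out-of-the-box, configurable connectivity is a registry of pluggable protocol drivers selected by configuration. The sketch below is purely illustrative; the driver functions and device entries are hypothetical stand-ins, not any vendor's API.

```python
# Hypothetical sketch: pluggable protocol drivers chosen by configuration,
# so supporting another protocol means registering a driver rather than
# rewriting the application.

from typing import Callable, Dict

# Each driver exposes the same tiny interface: read(resource) -> float.
def modbus_read(resource: str) -> float:
    return 21.5          # placeholder for a Modbus register read

def bacnet_read(resource: str) -> float:
    return 4.2           # placeholder for a BACnet object read

def zigbee_read(resource: str) -> float:
    return 55.0          # placeholder for a Zigbee attribute read

DRIVERS: Dict[str, Callable[[str], float]] = {
    "modbus-tcp": modbus_read,
    "bacnet": bacnet_read,
    "zigbee": zigbee_read,
}

# Configuration, not code, says which protocol each device speaks.
DEVICES = [
    {"name": "plc-7",   "protocol": "modbus-tcp", "resource": "temperature"},
    {"name": "hvac-2",  "protocol": "bacnet",     "resource": "vibration"},
    {"name": "meter-9", "protocol": "zigbee",     "resource": "humidity"},
]

if __name__ == "__main__":
    for dev in DEVICES:
        value = DRIVERS[dev["protocol"]](dev["resource"])
        print(dev["name"], dev["resource"], value)
```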
This is one of the key areas where platform technology, the infrastructure, plays a big role: being able to support all of that, and allowing the users or the system integrators who are building applications to integrate those data streams and view the data. And if you're doing something like a predictive-analytics application running locally at the edge, you get a good response time.
The analytics aren't driven simply by measuring the temperature coming off a machine. You might need multiple different sensors, on different protocols, to give you the full picture that drives the analytics. So what you actually need is to tap into all of these data streams easily; you don't want a big, complex, costly software-integration activity. You've got to be able to pull the data into your system and, typically, if you're going to process it locally, normalize the information into a common format. Then you can feed your analytics, or whatever application is going to do something interesting with that data.
For example, you may be predicting that there's a problem with a machine, say the engine or the motor on a piece of equipment. So being able to tap into these multiple data streams, acquire that data, and then feed it in a normalized form to the analytics or the application, either at the edge or potentially up in the cloud as well, is a big challenge.
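Here is a rough sketch of that pipeline, with hypothetical readers and a toy analytic rather than any real product code: raw readings from two different protocols are normalized into one common record format and then fed to a local anomaly check.

```python
# Illustrative sketch: heterogeneous readings are normalized into one
# common record format before being handed to a local analytics function.

import time
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Event:
    device: str
    resource: str
    value: float
    unit: str
    timestamp: float

# Hypothetical protocol-specific readers, each returning its own raw shape.
def read_modbus_temperature() -> dict:
    return {"reg": 100, "raw": 215, "scale": 0.1}      # 21.5 degC

def read_bacnet_vibration() -> dict:
    return {"object": "analog-input-3", "present_value": 4.2}

# Normalizers map each raw shape onto the common Event record.
def normalize_modbus(raw: dict) -> Event:
    return Event("press-01", "temperature", raw["raw"] * raw["scale"],
                 "degC", time.time())

def normalize_bacnet(raw: dict) -> Event:
    return Event("press-01", "vibration", raw["present_value"],
                 "mm/s", time.time())

def detect_anomaly(events: Iterable[Event]) -> bool:
    """Toy edge analytic: flag a possible motor problem when both
    temperature and vibration exceed simple thresholds."""
    by_resource = {e.resource: e.value for e in events}
    return by_resource.get("temperature", 0) > 80 and \
           by_resource.get("vibration", 0) > 7.0

if __name__ == "__main__":
    events = [normalize_modbus(read_modbus_temperature()),
              normalize_bacnet(read_bacnet_vibration())]
    print("possible motor fault:", detect_anomaly(events))
```

Once everything is expressed as the same `Event` record, the analytic no longer cares which protocol each value arrived on.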
EE: The recent Mars mission is a triumph of telemetry, and the space program had to deal with the issues of latency long before industry did. Speaking of old concepts and new terms, they had edge computing before they called it "edge": they simply put a computer in every spacecraft.
Andy Foster: Edge-based computing, centralized cloud-based computing, decentralized systems: these are not new things or new terms. People have been doing edge computing in factory and industrial environments for a long period of time. Cloud access, the ability to run and deploy application workloads in the cloud, has been around for the last 20 years. What is new with the newer architectures is the convergence between, for example, the operational-technology world and the cloud world.
Among the new challenges: instead of just processing, say, a single file or data stream, you need to be able to process multiple data streams in parallel, do something sophisticated with them, and drive the AI or the analytics at the edge. But it's also important that you have interoperability between the edge systems and the cloud systems. That's something that's a bit different, and I think we're finding that it's not simply a case of saying we're going to deploy everything at the edge and declare that everything is now edge-based.
Most of the systems our customers deploy effectively become a hybrid architecture. They run local workloads at the edge, maybe for latency, response times, or cycle times, or to protect sensitive data where that makes sense. And they do things like send a subset of the data up to the cloud for longer-term analysis and big-data analytics, maybe retraining AI models that subsequently get redeployed down to the edge nodes.
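A minimal sketch of that hybrid pattern, using hypothetical names and a stubbed-out cloud uploader: every sample is handled locally for low-latency control, while only a periodic, downsampled summary leaves the edge for longer-term analysis in the cloud.

```python
# Hypothetical hybrid edge/cloud pattern: act locally on every sample,
# forward only a periodic summary to the cloud.

import random
import statistics
from typing import List

LOCAL_ALARM_THRESHOLD = 90.0   # act immediately at the edge
CLOUD_BATCH_SIZE = 60          # forward one summary per 60 samples

def read_sensor() -> float:
    # Stand-in for a real sensor read.
    return random.uniform(20.0, 100.0)

def local_control(value: float) -> None:
    # Low-latency decision made at the edge, no round trip to the cloud.
    if value > LOCAL_ALARM_THRESHOLD:
        print(f"edge alarm: {value:.1f} exceeds threshold")

def upload_summary(summary: dict) -> None:
    # Placeholder for a cloud client (e.g., an MQTT or HTTPS publish).
    print("uploading summary to cloud:", summary)

def run(samples: int = 180) -> None:
    batch: List[float] = []
    for _ in range(samples):
        value = read_sensor()
        local_control(value)          # every sample handled locally
        batch.append(value)
        if len(batch) >= CLOUD_BATCH_SIZE:
            upload_summary({          # only aggregates leave the edge
                "count": len(batch),
                "mean": round(statistics.mean(batch), 2),
                "max": round(max(batch), 2),
            })
            batch.clear()

if __name__ == "__main__":
    run()
```

In a real deployment the same split also governs which data is considered sensitive enough to keep on-premises and which aggregates are acceptable to export.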
Every system has to have some level of security, for example, and what you're trying to achieve has a big bearing on the type of security you need to be able to deploy within the system. Security is a big challenge, particularly when part of your system is connected to external networks. Then there's the question customers ask: how do we deploy this across our factories?
They've got multiple factories, with maybe a hundred nodes per factory; that's thousands of nodes this technology needs to be deployed on. We need to be able to manage the nodes it's running on, and we need orchestration of the software and the applications running across these large-scale networks.