New Applications Foreseen from Combination of AI and Powerful Edge Devices
Perry Lea is a 30-year veteran technologist. He spent over 20 years at Hewlett-Packard as a chief architect and distinguished technologist of the LaserJet business. He then led a team at Micron as a technologist and strategic director, working on emerging compute using in-memory processing for machine learning and computer vision. Perry's leadership extended to Cradlepoint, where he pivoted the company into 5G and the Internet of Things (IoT). Soon afterwards, he co-founded Rumble, an industry leader in edge/IoT products. He was also a principal architect for Microsoft's Xbox and xCloud and today is a director of architecture for Microsoft. Perry has degrees in computer science and computer engineering, and an EngrD in electrical engineering from Columbia University. He is a senior member of the Institute of Electrical and Electronics Engineers (IEEE) and a senior member/distinguished speaker of the Association for Computing Machinery (ACM). He holds 50 patents, with 30 pending. After recently publishing a new book, "IoT and Edge Computing for Architects," he took a few minutes to speak with AI Trends Editor John P. Desmond about his work.
AI Trends: Thank you Perry for talking to us today, just after the release of your new book, very timely. So what would you say is the best way to define the Internet of Things?
Perry Lea: Well I like to define it as the ability to connect the previously unconnectable world. Thirty years ago we didn't have the technology, or the genuine interest, to connect inanimate objects and unconnectable things and to do it pervasively. So, what's happened in the last 30 years is we've ridden on Moore's Law, Dennard scaling [Ed. Note: As transistors get smaller, their power density stays constant.], Nielsen's Law [Ed. Note: Users' internet bandwidth grows by 50 percent per year.], the big hierarchy of computer science and computer engineering laws that have driven the industry. That has now stretched down into IoT devices connecting things, people, animals, vegetables and minerals. And now you have the ability to do that at scale, and at a cost point where you can do interesting things.
And people are coming up with very creative, interesting and sometimes business-savvy IoT devices; other times they're more hype and fad, or have a business model that just won't succeed. So the book is about IoT and edge computing, approached from an enterprise, commercial and industrial scale.
Sounds good. Can you say what is the business case for IoT and edge computing? Will it save money?
Well, it can save money, or it can make money. There are business cases where it’s positive, and there are also business cases where it simply doesn’t make sense. I’ll briefly talk about those later.
I tend to think of IoT, first and foremost, as the insurance of things, not the Internet of Things. And I have personal experience with this too in saving my own home from flood damage. In a lot of cases, the devices are monitoring, they’re watching, they’re trying to find a problem before it becomes exacerbated for the customer or the client.
And you think of things like connected homes. Well, a lot of connected homes are about monitoring the state of the home, the temperature, whether the furnace is working or the air conditioning. Or whether an alarm or a flood alert went off. Those are for insurance, and you extend that to retail and enterprise.
In the enterprise and industrial setting, sensors look at things like predictive maintenance. They can analyze classical devices on a factory floor that haven’t been touched in 40 years. You can monitor them to find out when a spindle or a bearing will wear out. That saves money and time. It is the insurance of things.
In other cases, a lot of information is accumulated and you can apply more learning to it, some adaptive and predictive analytics, to do some cool things. So, in a lot of cases I think the IoT is about the insurance of things, remotely monitoring devices, and edge computing is a little different. I put that in a different category.
What does AI bring to the table for IoT?
Well you now have the ability to collect a lot of information, and as you are able to connect an entire herd of livestock, for example, you can start monitoring the behavior and health of animals. You can start accumulating data on that herd. And this is an example where you bring AI to this exorbitant amount of data, and you are able to train models to find, for example, sickly animals, or underweight animals, or animals that are carriers of a disease, before that disease spreads, say in a feedlot or on a farm.
So, AI can absorb this accumulation of data and that's good, because you're accumulating quite a bit of data. IoT is only as useful as the number of nodes that you're connecting; that's Metcalfe's Law [Ed. Note: The value of a network is proportional to the square of its connected nodes.]. And the more you connect, the more valuable your network is. And the more connected devices, the more data that you can produce, and AI, machine learning and deep learning are applications there.
That comes with a problem though, because as you're connecting more and more devices, you start plaguing the network with bottlenecks from the sheer amount of data you're moving over wires or into the cloud. The trend is to run inference on the edge and keep the training in the cloud. That's the typical model that's used right now. There are some exceptions, but that helps in curbing the amount of data that you have to transport, and also reduces your latency.
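The edge-inference pattern described here can be sketched in a few lines. This is a minimal illustration with invented names and thresholds, not any particular product's API: a cheap on-device check stands in for a trained model, and only the readings it flags are forwarded upstream.

```python
# Sketch of edge-side filtering: run a cheap local check ("inference")
# on each reading and forward only the anomalies to the cloud, instead
# of streaming every sample over the network. All names and thresholds
# here are illustrative assumptions.

def is_anomaly(reading: float, lower: float = 15.0, upper: float = 30.0) -> bool:
    """Stand-in for an on-device model: flag out-of-range readings."""
    return not (lower <= reading <= upper)

def filter_at_edge(readings):
    """Return only the readings worth transporting upstream."""
    return [r for r in readings if is_anomaly(r)]

samples = [21.4, 22.0, 48.9, 21.7, 3.2, 22.3]
to_cloud = filter_at_edge(samples)
print(to_cloud)  # [48.9, 3.2] -- only 2 of 6 samples cross the wire
```

In a real deployment the `is_anomaly` stand-in would be a quantized model running on the device, but the data-reduction effect is the same.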
These are all really important from a system point of view, because you don’t just sprinkle a little AI on a bunch of IoT devices, and hope to garner some deeply hidden messages for your business. You actually have to build the setup from a system point of view; otherwise it becomes too costly and impractical. I’ve seen a lot of projects fail on their proof of concept because of that.
I know that sensors are important for IoT. What are some of the trends in sensor technology?
Sensors aren't new. There is micro-electronic sensing; there are sound, infrared, lidar, optical- and vision-based sensors. Anything that can receive an input and generate a signal or a binary value is a sensor. You're moving data from the analog world to the digital world, and that's how they've been working for the last 60 years.
Trends in sensing technology are that more intelligence is being placed on the sensors. So again, riding Moore’s law, you can do more at the sensor level at the far edge. You can rule out bad data, you can run machine learning or deep learning models on sensors, or on vision based sensors, and do inference there and rule out false negatives, or false positives and only extract data that’s meaningful. It has everything to do with reducing the amount of data that you’re pushing over the network, and doing more work closer to where the data is being generated.
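One simple way a sensor can rule out false positives, as described above, is debouncing: require several consecutive out-of-threshold samples before raising an event, so a single noisy spike never leaves the device. A sketch, with illustrative threshold and window values:

```python
# Sketch of false-positive suppression at the sensor: only raise an
# event after `k` consecutive over-threshold samples, so a lone noisy
# spike does not trigger an alert. Threshold and k are invented values.

def debounce_events(samples, threshold=100.0, k=3):
    events, run = [], 0
    for i, s in enumerate(samples):
        run = run + 1 if s > threshold else 0
        if run == k:           # k-th consecutive hit: treat as real
            events.append(i)
    return events

# The lone spike at index 1 is ignored; the sustained excursion
# starting at index 3 fires once its third sample arrives.
print(debounce_events([10, 150, 12, 120, 130, 140, 11]))  # [5]
```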
You now have the ability to sense everything around sound and vibration, including looking into other parts of the spectrum. Some sensors are looking at Wi-Fi signals, and using them to see through walls and find out where people are in emergency situations or in a catastrophe. And some sensors are doing very interesting things with lidar and radar-based technologies, detecting and tracking objects. So sensor electronics have been greatly enhanced in the last 10 years, riding on all the IoT devices that are building that industry.
What are the major trends in edge computing?
Edge computing has been around for a long time. Back 30 years ago, I called it embedded systems. And again, with Moore’s law, you’re able to actually push very powerful computing, very close to IoT devices, and sensors, and aggregate data and perform what had been previously required in a cloud, on the edge.
We are seeing a shift from 40 years of traditional IT in a corporate setting. If you’re in a corporation and you have an IT department, you have policies and security controls and software updates, and IT is basically managing the information and the appliances that work with the data.
Edge computing could be in a completely remote location in the middle of nowhere, completely unmanned, and not conveniently placed in a data center at a corporate premise. So they're insecure, they're unreliable, and there's no staff there, but vendors are building edge systems that are capable of integrating very seamlessly into corporate IT infrastructure. So an IT manager can manage an edge computing device, and an OT [operational technology] user can actually use the device and peer into the environmental controls, or some appliance.
That’s a major step, because that’s taken the last 60 years of IT policy learning and formalized it into edge computers. Many interesting things are going on with edge computing. Ambient computing is pushing these edge computers everywhere, but making them unobtrusive, making them part of just the environment, having them communicate with each other and in controlling an entire environment and experience holistically.
So you don’t see an edge computer in a little black box with a monitor connected to it. They’re embedded in walls, they’re embedded in lights, in switches, in the infrastructure of a residence, or a place of commerce, or an industry, and they’re just ubiquitous.
Another interesting trend I’m excited about is synthetic sensing, which requires the performance you get with high-performance edge boxes, but it takes all of these different data streams coming from the sensors connected to it, which could include heat sensors, vibration sensors, electromagnetic sensing and whatnot. It could be normalizing all that data from hundreds of different sensors, and using deep learning models to actually extract information that would not be perceivable with just one single sensor.
So you can look at an entire environment in the home, for example, and understand if someone left a burner on without having an actual sensor attached to a stove or a burner. Those are some exciting trends in sensors.
What impact do you think 5G will have on IoT and edge computing?
Well, 5G has provisions for IoT devices. One of the claims is that they can have greatly improved density. Another claim is that they can have a million IoT devices streaming data within a square kilometer. And they also have provisions — one I think is absolutely necessary — which is lower latency. So a device can actually send data with sub-one-millisecond latency to an edge device, or the cloud, or whatnot. That starts to enable at least some degree of real-time control over IoT devices.
I think it is actually more meaningful for edge computing. Because in edge computing there’s a technology called MEC [mobile edge computing]. 5G is constructed as basically a virtualized system. Everything is a virtual extension running off of hypervisors through the entire 5G stack. That allows for companies to start building edge appliances very close to the customers.
So for example, in a streaming service, 5G could be used with edge blades or edge hardware in a carrier substation, or cell tower, or within a local premise that the carrier might be using. And they would lease space, and lease time on their network for 5G services.
So, a streaming service like Netflix, or video gaming and streaming, could deploy MEC hardware that would be installed in a data center very close to the user, reducing latency and relieving network traffic. So 5G is really built for edge computing and edge hardware appliances. That's kind of a big shift from where 4G LTE [Long-Term Evolution] was. You probably won't see any use cases like this popping up in the next year. But I think within the next five years, edge computing and 5G will be pervasive.
What is fog computing? Why do we need that?
I tend to call fog computing a subset of edge computing. Essentially you need a computer at the edge somewhere, and fog computing is an extension of the cloud. We see a lot of hype and buzzwords around things like this, like mist computing and whatnot, but essentially it’s an extension of the APIs, and the services that you would see in the cloud.
So if you’re developing a traditional SaaS application in the cloud, you will need a DevOps team, programmers very familiar with developing applications and deploying them at scale on the cloud. Fog extends that, so these nodes running at the edge are essentially extensions of the cloud.
And from a developer standpoint, they don't really need to know where the software is residing, or how it's auto-scaled, or how it's deployed on the cloud or the edge, or maybe a hybrid of both. And it's really a practicality for modern programmers who are very cloud focused and SaaS focused, who will face very little of a learning curve to start developing edge-based software. So a classic example would be something like AWS IoT Greengrass, or Azure IoT.
These are extensions that allow you to deploy things in the cloud, or on the edge, or you can have an extension of both. From a programmatic point of view, it's easy to manage, easy to code for and easy to scale. But it does have that cloud-based component to it that you have to consider. And not all applications that people are developing for IoT and edge really need the fog. Some edge-based devices can be completely autonomous by design.
In the security area, what are the risks that hackers can get control of, or they can spy on an IoT system? Can the systems be made secure?
Well, there have been issues with some home appliances. In one instance, some secure information was available to employees of the organization that was building the appliance. There have been breaches, and there have been denial-of-service attacks. The biggest attack in the world, the Mirai botnet attack [October 2016], was based on essentially IoT devices. It was a massive denial of service that used these appliances to basically attack the internet and shut down specific websites.
That will happen again. Like I said earlier, a lot of these edge devices are now catching up to 60 years of IT security management and policies. They have been traditional embedded systems, and embedded systems have been secure through obscurity for a very long time.
Well now they’re connected, and they’re running Linux stacks. They’re sitting there with open ports. They haven’t had all the hardening and the maturity that a traditional server in a data center has, and now edge systems are the servers. So there is a risk, and a lot of the risks can be mitigated and controlled through basically common practices, common security provisions that you would employ in your home or in your corporation, but you would also apply them on the edge. Security professionals have many measures to choose from. With IoT, some degree of security maturity needs to be built into the system.
Are there any consortiums that are involved with IoT that you would care to highlight?
Well, I'm a big proponent of IEEE and the Eclipse Foundation, eclipse.org. Many organizations come and go, but IEEE is a very good organization. They have a subchapter for IoT that is worth looking at. Personal area networks are very prevalent now; the Bluetooth SIG is a very good organization to join to help keep up to date on new standards. Another one is the OASIS group, the organization that stewards the standards for communication from edge devices to the cloud.
They do a lot more, but one prevalent communication standard is called MQTT. It’s an open standard, and is by far the most prevalent edge-to-cloud communication system that exists. Others I would call out are the Object Management Group, and the task forces they have. They do a lot of work with UML and organizations like that.
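MQTT itself requires a broker and a client library (the Eclipse Paho clients are a common choice), but its pub-sub conventions of hierarchical topics and small payloads can be sketched with nothing but the standard library. The topic layout below is an illustrative convention, not part of the MQTT specification:

```python
# Sketch of MQTT-style conventions: a slash-delimited topic hierarchy
# plus a small JSON payload. The site/device/metric layout and field
# names are illustrative assumptions, not mandated by the standard.
import json

def make_message(site: str, device: str, metric: str, value: float):
    topic = f"{site}/{device}/{metric}"          # e.g. plant1/pump07/temp
    payload = json.dumps({"v": value, "unit": "C"})
    return topic, payload

topic, payload = make_message("plant1", "pump07", "temp", 71.5)
print(topic)  # plant1/pump07/temp
# A subscriber could use the single-level wildcard "plant1/+/temp" to
# receive this metric from every device at the site.
```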
On the connectivity side, you could throw in the ITU and 3GPP [The 3rd Generation Partnership Project, an umbrella term for a number of standards groups developing protocols for mobile communications.] as organizations to watch for telecommunications standards, especially 5G. Also, the Wi-Fi Alliance is working on Wi-Fi post 802.11ac and 802.11ax, with some really exciting development happening now. Many organizations are out there; those are the standouts in the various segments of IoT.
What are some of the use cases for IoT and edge computing that in your view illustrate what the technology is delivering today, and maybe what the potential is for the future?
When you look at successful IoT deployments, things that are driven at scale and where there’s high volume growth and interest, one obvious area that continues to grow is telematics, as in fleet telematics and mobile telematics.
This is where you have a vehicle or a ship or carrier, and you’re monitoring the status of the cargo, and of the delivery and trying to optimize it. You try to ensure, for example in dairy production where you have a lot of milk being generated in a constant volume, that you know where you’re moving it. It has a shelf life; you can only keep it in a tank for so long. It has to be cooled or kept within a range of temperatures; you’re under the gun.
And so you’re monitoring the fleet, you’re monitoring the passage of this material, the movement of this material, and you’re monitoring the health of the material, the temperature range that it falls in, so you can place it at the right processing center at the right time to have this constant stream, this flow of milk production be managed.
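The dairy scenario described above largely comes down to tracking how long a load has spent outside its safe temperature band. A toy sketch, with an assumed band and one sample per minute (both are illustrative choices, not dairy-industry figures):

```python
# Sketch of cold-chain monitoring: given timestamped temperature
# readings from a tank, accumulate the time spent outside the safe
# band so a dispatcher can prioritize at-risk loads. The band and the
# one-sample-per-minute cadence are assumptions for illustration.

SAFE_LOW, SAFE_HIGH = 0.0, 4.0   # degrees C, assumed safe band

def minutes_out_of_band(readings):
    """readings: list of (minute, temp_c) pairs, one per minute."""
    return sum(1 for _, t in readings if not (SAFE_LOW <= t <= SAFE_HIGH))

log = [(0, 2.1), (1, 3.9), (2, 5.2), (3, 6.0), (4, 3.5)]
print(minutes_out_of_band(log))  # 2 -- minutes 2 and 3 were too warm
```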
The same is true, and in other areas in telematics, for example, in municipalities, monitoring your fleet of snowplows, or your fleet of garbage trucks and garbage collection, to understand where they are, and how they’re operating. In snow removal, you don’t want to send multiple plows to the same street and not use them optimally.
So you’re trying to extract and optimize your problem, and you can only optimize a problem when you can measure it. So IoT allows you to measure what has been previously unmeasurable. Other use cases around residential include personal assistants, security systems, irrigation systems. Many people put a high value on being able to monitor their home remotely. That saved me from a flooding situation I was able to respond to before there was additional damage. As more advanced processing is moving close to the devices through edge computing, it opens up many new opportunities.
You mentioned in the beginning that some things are not appropriate for IoT and edge computing. You want to mention maybe some of those things?
I've seen some devices, many of them consumer brands, where I couldn't understand the business behind them. One was an IoT-connected hairbrush that monitors how you brush your hair. I was thinking, what data are you gathering from that? How does it optimize a problem? That didn't make any sense to me.
Another device monitors how often you're drinking water from an internet-connected water bottle. Maybe there is a business case for some medical situations to monitor patients, or maybe in home care, but this was being marketed to the hiking and weekend-enthusiast community. I couldn't see justifying paying five times the price for a water bottle just so you could measure and monitor how many times you've lifted it to drink.
So there’s a lot of hype and fads that kind of distract from the reality. That’s where you have to put some blinders on, and not embed intelligence into every device, just because you can.
If there are any students or early career professionals out there reading this who want to get involved with IoT and edge computing, what should they study, or how do you recommend they go up the learning curve?
Well, when I got involved with this, when I wrote the first book, and the third book, IoT and Edge Computing for Architects, what became absolutely clear was that if you're taking data, and you're building an IoT device, and you're putting something at the edge, and you want to do something intelligent and beneficial with that data, with that device and with whatever it's connected to, you get involved with many domains of engineering and technology when building a holistic system.
I would also advise that if you have a very narrow or myopic view, if you're very siloed, such as being focused exclusively on SaaS development, you should extend into other domains of engineering as well. In this call we're talking about everything from sensor physics, to electronics, to energy harvesting, to embedded system design, the things that are traditionally in a computer engineering or electrical engineering realm.
When you start moving into embedding operating systems, IT appliances, IT policies and hardening your edge device, now you’re starting to develop security skills.
We didn't really talk about all this, but the way communication happens with wireless and wired IoT devices is a huge consideration, and has many ramifications on the overall cost, or even whether you're going to succeed in your project. So you need an understanding of telecommunications from a signal point of view, an understanding of the Bluetooth stack, and the ability to decide whether to use 5G everywhere; that probably isn't a good idea. Understanding all of that helps. I briefly talked about MQTT and other protocols, and all the nuances of those protocols as pub-sub types of protocols. [Ed. Note: Publish/subscribe messaging is used in serverless and microservices architectures.] You need to know the security issues there.
When you start building a network, you can have north-south and east-west traffic components, edge computers talking to each other. You need to decide what to do with the data; maybe it marshals up to the cloud. So understanding SaaS skills, or platform as a service (PaaS), or how auto-scaling works, or basically cloud dynamics, are all very useful skills.
So you have all this data you’ve accumulated in the cloud, and now you want to do something meaningful for customers. They are going to want visibility into that data. So you need to understand UI and user experience dynamics, where they can see a lot of data and garner information easily just looking at some kind of chart.
Then you want to go a step further and try to do something intelligent with the data. Now you need machine learning skills, and data science skills. Maybe not necessarily deep learning in all applications, but you might use a Bayesian type machine learning model, or you might use a random decision forest or just a decision tree. Embedding that expert knowledge into a system that can take all this data and give you something like predictive analytics, or predictive maintenance is invaluable.
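The kind of embedded expert knowledge described here can be as simple as a hand-built decision tree over a couple of sensor features. The split points below are invented for illustration, not taken from any real maintenance model:

```python
# Sketch of expert knowledge as a tiny hand-built decision tree for
# predictive maintenance: classify a machine's state from vibration
# and temperature. All thresholds are invented for illustration.

def maintenance_decision(vibration_mm_s: float, temp_c: float) -> str:
    # Root split on vibration severity (thresholds are illustrative)
    if vibration_mm_s > 7.0:
        return "schedule repair"
    if vibration_mm_s > 4.0:
        # Secondary split: heat plus moderate vibration suggests wear
        return "inspect soon" if temp_c > 70.0 else "monitor"
    return "healthy"

print(maintenance_decision(8.2, 55.0))  # schedule repair
print(maintenance_decision(5.0, 75.0))  # inspect soon
```

A learned tree (or a Bayesian model, as Lea mentions) would fit these split points from data rather than hard-coding them, but the deployed artifact is just as compact.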
Understanding that whole stack is really helpful. I think it’s good to be an expert in one area, but it’s really quite helpful to understand that whole spectrum of technologies.
That’s a lot to know. It seems having one major, like computer science, would be too narrow.
When I wrote the book, it was for architects, and it's for anyone. If you're a SaaS developer, you can start by looking at the cloud-based chapters of the book and then extend outward. You might be deep in one area, and broad in a lot of areas. With some exposure, at least you can start talking the same vernacular. You can start making decisions based on where you put your training for a deep learning system. What types of hardware do you need to run good, real-time inference on the edge? How much power is that going to need? What if you don't have a good reliable power source, how is that going to work?
IoT devices have many nuances. If you’re doing fleet telematics and you drive under a tunnel, and lose your signal, what happens? Do you cache data on the edge? Do you process locally? You’re no longer completely digital, you’re dealing with the analog world, and doing it on a massive scale. Stuff goes wrong all over the place.
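The tunnel scenario above is typically handled with store-and-forward: cache readings locally while the uplink is down, then flush them in order once it returns. A minimal sketch in which connectivity is just a simulated flag (a real device would probe its modem):

```python
# Sketch of store-and-forward buffering for intermittent connectivity:
# readings queue locally while offline and flush in order when the
# uplink returns. The `online` flag and `sent` list are stand-ins for
# real connectivity checks and a real cloud endpoint.
from collections import deque

class EdgeBuffer:
    def __init__(self):
        self.queue = deque()
        self.sent = []            # stands in for the cloud endpoint

    def record(self, reading, online: bool):
        self.queue.append(reading)
        if online:
            self.flush()

    def flush(self):
        while self.queue:
            self.sent.append(self.queue.popleft())  # ordered delivery

buf = EdgeBuffer()
buf.record("gps1", online=True)
buf.record("gps2", online=False)   # in the tunnel: cached locally
buf.record("gps3", online=False)
buf.record("gps4", online=True)    # signal back: backlog flushes in order
print(buf.sent)  # ['gps1', 'gps2', 'gps3', 'gps4']
```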
Those are all the questions I had. Is there anything you would like to add or emphasize?
We live in an interesting time right now, fighting the coronavirus, and people are learning that doing things remotely is where the future's going. Look at autonomous driving, remote work and highly-automated systems. This is a growing area. When you wipe away all the hype and the surrounding fads, including claims that we will have a trillion IoT devices by 2025 or 2030, it is a double-digit growth area. So it is where to go, and there's a lot of room for improvement.
Learn more about Perry Lea.