The Sustainability Podcast

Learn About Edge Computing Virtualization, Management and Orchestration with Darren Kimura, COO of Zededa

The Smart Cities Team at ARC Advisory Group Season 7 Episode 11

The only way to harmonize all data diversity at the edge is to drive a standard, open source, cloud-native edge. This edge has to be highly efficient and different from previous generations of distributed computing: this edge has to be virtualized. In this intriguing podcast, learn how ZEDEDA focuses on edge virtualization to deliver visibility, control and protection for distributed edge gateways, applications and networks at the enterprise edge as a cloud-based service.

--------------------------------------------------------------------------

Would you like to be a guest on our growing podcast?

If you have an intriguing, thought-provoking topic you'd like to discuss on our podcast, please contact our host Jim Frazer.

View all the episodes here: https://thesustainabilitypodcast.buzzsprout.com

Learning About Edge Computing Virtualization, Management and Orchestration with Darren Kimura, COO of Zededa

SUMMARY KEYWORDS

edge, device, applications, cloud, data, running, server, factory floor, operating system, systems, case, ai, compute, license plate, darren, snmp, community, fascinating, podcast, oftentimes

SPEAKERS

Darren Kimura, COO, Zededa & Jim Frazer


The following is a transcript of a conversation at the ARC Forum in Orlando, Florida, this past June.


Jim Frazer  

Welcome to another episode of the Smart Cities Podcast. Today we're broadcasting live at ARC's 26th annual forum here in Orlando, Florida. I'm thrilled to be joined by Darren Kimura of Zededa to talk about edge devices and, more specifically, edge management: edge device software platforms that manage those terminal devices out in the field. So Darren, welcome. It's great to have you. How are you today?

 

Darren Kimura

I'm great. It is awesome to be here. Thanks for having us.

 

Jim Frazer  

That's great. So let's get started with perhaps a little bit of background about yourself and your company. How did you come to this edge device ecosystem?

 

Darren Kimura

Yeah, so I'm Darren Kimura. I'm the Chief Operating Officer for Zededa. My responsibility is running the business and all things related to product, go-to-market, engineering, whatever the case may be. Zededa has been around since 2016, and it's a really interesting company. First of all, we often get asked what the name Zededa means, because it's a pretty unique name. We have three founders at the company, and one of them is from Morocco; the word Zededa translates to innovative or new. So we always think about what we do as, at our core, needing to be very innovative. That's what Zededa means. Back in 2016, things were changing quite a bit. IoT was a very big topic, terms like fog computing were out there, edge computing was also out there, and the definitions were quite loose. But the founders saw an opportunity for cloud computing to change and be brought closer to where the workloads are, for example on a factory floor: taking the compute and moving it all the way down to where the robot arm is, where the automation needs to be. That was the idea, but the technology wasn't quite there yet. So what they decided to do was work at the operating system level and figure out a way to make the different applications running in those locations, for example the factory floor, work with the technologies of tomorrow, things like AI and TensorFlow, and bring all of this together. What they developed was a brand new operating system, very thin, very lightweight, designed for those applications, designed for the factory floor. What they then decided to do, which was really fascinating, was to contribute it to open source.
And as a result, they brought a community in and around the technology to improve it, support it, make it better, and bring use cases in. That was kind of the legacy behind Zededa. The next part, and I'll keep this part really short because I know we want to get into a lot here today, was that they came up with a way to orchestrate it, because one of the things we recognized was that you're not just going to have one or two of those servers on the factory floor, you're going to have hundreds, maybe even thousands. So how do you go about managing them? They developed a cloud-based orchestration system, which allows you to go down to each one of those devices very securely and control them, manage them, and monitor them. And really, that's what Zededa has today: an open source operating system and a cloud-based orchestration system.

 

Jim Frazer  

Darren, that's a great point. We all know the world is evolving toward open, standardized protocols that will allow interoperability and, frankly, applications to be developed that none of us could even dream of. Let's look a little more foundationally at each of the building blocks of an edge-based cloud/fog type system. What actually is an edge device? I know that some of our listeners may not be familiar with that, so can we just start there?

 

 

Darren Kimura

Absolutely. That's a great place to start. Let me explain the way we think about edge and the definition. Think about a triangle. At the very top of the triangle, you might start with cloud compute, the hyperscale infrastructure providers like AWS, Azure, GCP. The edge starts at the next layer down. From an edge definition, you're going to have your telco providers and your CDN providers, which will be your geographical data centers; that's really where the edge begins. The next tier down, you're going to have something like an on-prem data center. If you have a factory, for example, and there's a data center there, it might be racked and cooled with a UPS and whatnot; that would be kind of an edge data center, but on-prem. And the next tier down is what we call the distributed edge. These are devices like gateways or small servers that you might have running inside machines. They're not your typical 1U or 2U servers, but smaller than that: they might be self-cooled, solid-state, non-rack devices you mount on the ceiling, whatever the case may be. That's where we think about the distributed edge playing. Those servers could be anywhere. They could be in solar farms running trackers. They could be in cars: if you think about an autonomous vehicle like a Tesla, for example, there's an edge server running inside that Tesla; that's also an edge device. The form factors vary depending on what the use cases are, but really, all of that is edge.

 

Jim Frazer  

Just to clarify for our audience, you've introduced a few new terms, like distributed edge server, whereas the more commonly used term that I'm familiar with is simply an edge device. Are these distributed edge servers one step up in a hierarchy above the standalone, sole-function edge device?

 

Darren Kimura

I think they're the same. I would think about an edge device as being a broader category, and the distributed edge type server as a subset of that broader category. The distributed edge server is going to be your smaller device, like a gateway, for example, which would be more on the distributed side because it's all over the place: you're going to have perhaps one in each of the autonomous cars, or one in each of the industrial processes, for example, but small. But they're all generally of the classification of edge device.

 

Jim Frazer  

Okay, so we've talked about that architecture. Let's talk about some edge device applications. As you know, before we started recording today, you and I talked about license plate reading cameras. I think that's a very good example of a sole-use, sole-application edge device. Can you talk a little bit about the value proposition of something like that?

 

Darren Kimura

Yeah, you know, the interesting thing about edge is that it's kind of everywhere. It's not any specific sector; it's pretty much every sector. What we've seen quite a bit of, for example, is in the retail markets, where you have camera systems, which today are digital, with some common server on site. You might have 90 cameras out there, all connected back to some server. What you cannot do is take each of the frames from those cameras and push it all the way up to the cloud. It would be very cost-prohibitive to do that, you would clog your pipes, and it could be very difficult on the other side to receive all of that information. So what you ideally want to do is run local compute, to take all of that information and scrape it for whatever it is you're looking for. One example is license plates on vehicles. These cameras are streaming data, capturing frames all the time. But when you actually see a car enter the picture, you can use machine vision or some form of artificial intelligence to hone in on identifying the license plate of that vehicle. You can capture it, OCR it, and get that information, all captured in your edge compute device locally on site, then take that license plate information and shoot it back up to the cloud. Now, from the cloud, you can go in and take a look at all the different vehicles and license plates that have entered and exited that parking lot or that intersection. But if you think about the entire description I just made, all of the heavy data is actually still there on site.
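The local-filtering pattern Darren describes can be sketched in a few lines of Python. Here `detect_plate` is a hypothetical stand-in for a real machine-vision and OCR stage (not ZEDEDA's API): the point is that megabytes of frames stay on site while only a few characters per vehicle go upstream.

```python
# Sketch of the edge-filtering pattern: heavy frame data stays on site,
# and only tiny, high-value records are uploaded to the cloud.

def detect_plate(frame):
    """Hypothetical ML/OCR step: return plate text if a vehicle is seen."""
    return frame.get("plate")  # a real system would run vision + OCR here

def process_frames(frames):
    """Run local compute over every frame; upload only plate records."""
    uploads = []
    for frame in frames:
        plate = detect_plate(frame)
        if plate is not None:
            # ~7 characters go upstream instead of a full video frame.
            uploads.append({"camera": frame["camera"], "plate": plate})
    return uploads

frames = [
    {"camera": 12, "pixels": b"\x00" * 1_000_000},                      # no vehicle
    {"camera": 12, "pixels": b"\x00" * 1_000_000, "plate": "7ABC123"},  # vehicle
]
print(process_frames(frames))  # [{'camera': 12, 'plate': '7ABC123'}]
```

The cloud side only ever sees the short plate records, which is why the bandwidth and ingest costs stay manageable.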

 

Jim Frazer  

In fact, I would point our listeners to a podcast we recorded yesterday here at the ARC Forum with Blue Skies AI, which has an artificial-intelligence-based inspection system for medical supplies; it actually originated at Intel for semiconductor testing.

 

And so in that application there's a tremendous amount of processing horsepower looking at misalignment of medical products as they come down the assembly line. The AI is applied, and the only message that's sent is pass or fail. In the license plate reading application, it's typically six or seven alphanumeric characters that are sent back to the state regulatory folks, and then the state, via the cloud, reports back stolen vehicle, no insurance, or something similar. So what are the challenges in edge device development, and in particular in management? I think one of the largest challenges is probably managing these disparate devices that are all over the place, on different media, in different geographies. And some probably come and go on the network as well.

 

Darren Kimura

 

Yeah, you hit on a bunch of key points there. One of the things we're working hard to solve at Zededa is how you migrate an organization that's been running its systems for the last 20 years, probably on some Windows application, and modernize it so that you can now run things like artificial intelligence right next to it. That's quite a challenge, because again, the data, the quantity of data, the gravity, the volume, is local. So what we have, for example, in the operating system we produce is the ability to run multiple runtimes side by side. Why that's important is that it allows you to take a physical server, a device the end user may like and a brand they may trust, and take the Windows application they've been using for the last 15 years and run it next to something in Linux, where they might have a proprietary application doing something like preventive maintenance, for example. That allows them to gently transition into this new digital world, but using applications they're very familiar with. One of the biggest challenges is the leap from your traditional on-prem environment into the new cloud world, and allowing them to take these baby steps to get there is significant. The learning curve is lower, the amount of instrumentation goes down quite a bit, and it's cheaper to do it this way as well. So there are a lot of advantages to this approach.
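The "multiple runtimes side by side" idea Darren describes can be pictured as one physical node hosting a legacy VM next to a modern container. The data model below is purely illustrative, not ZEDEDA's actual configuration schema:

```python
# Illustrative model of one edge node hosting a legacy Windows VM
# alongside a Linux container workload. All names are hypothetical.

edge_node = {
    "hardware": "factory-floor-gateway-01",
    "workloads": [
        {"name": "legacy-hmi", "runtime": "vm", "os": "Windows"},
        {"name": "predictive-maintenance", "runtime": "container", "os": "Linux"},
    ],
}

def runtimes(node):
    """List the distinct runtimes the node is hosting side by side."""
    return sorted({w["runtime"] for w in node["workloads"]})

print(runtimes(edge_node))  # ['container', 'vm']
```

The transition story is exactly this: the 15-year-old application keeps running as a VM while new workloads land as containers on the same box.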

 

Jim Frazer  

So does this application reside in the cloud, or on the device?

 

Darren Kimura

Yeah, it's actually both. With the edge orchestration and the edge operating system we have, we can take an application you wrote for AWS, or Azure, or any other cloud implementation, port it down into the Zededa marketplace (it's designed to port very easily, with very little if any work), and then run that application physically on the edge server you might have on the factory floor. Now that application can do local processing, but you're also still able to take the information you want and process it in the cloud. The heavy lifting can be done locally, and then the high-level analytics, the Power BI and so on, can be done in the cloud. The low-latency applications can run out there in the field, and the heavy, less time-dependent data mining can be done back home in the cloud, or vice versa. Again, when you have a lot of the data, you might want to do a lot of the heavy lifting physically on site, then extract the high-value information and push that up. You then aggregate that across all of the different locations you have, so it's a higher-level view, a bird's-eye view, of your data.
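The edge/cloud split Darren describes can be sketched as two small functions: heavy reduction at the edge, lightweight aggregation in the cloud. The function names and summary fields are illustrative assumptions, not any real API:

```python
# Sketch of the split: latency-sensitive heavy lifting runs at the edge;
# only small summaries go to the cloud for fleet-wide analytics.

def edge_summarize(readings):
    """Edge side: reduce raw local sensor data to a small summary."""
    return {
        "count": len(readings),
        "max": max(readings),
        "mean": round(sum(readings) / len(readings), 2),
    }

def cloud_aggregate(site_summaries):
    """Cloud side: aggregate per-site summaries for a bird's-eye view."""
    return {
        "sites": len(site_summaries),
        "worst_max": max(s["max"] for s in site_summaries),
    }

site_a = edge_summarize([3.1, 9.8, 4.2])  # thousands of raw points in practice
site_b = edge_summarize([2.0, 5.5])
print(cloud_aggregate([site_a, site_b]))  # {'sites': 2, 'worst_max': 9.8}
```

Only the summaries cross the network, which is what makes the bird's-eye view cheap even across many locations.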

 

Jim Frazer  

That's fascinating. I know that in the traffic signal industry in particular, in the last decade they've moved to a Linux operating system, and there are now a number of applications, diagnostic applications, all types of different applications, that reside there. The interesting part is that now that data gets shared across those two or three or four applications, and again, you start getting applications created that no one has really even conceived of.

 

Darren Kimura

Yep, and that's happening on the regular now. I mean, we have organizations that are running proprietary applications, doing things like trying to calculate where lightning strikes are, or weather patterns. These are not applications that are made publicly available; they're proprietary, and we have to be able to enable them to run side by side next to open applications, which might be doing things like firewalls, for example.

 

Jim Frazer  

There's certainly a plethora of those; gunshot detection is a good example. The gunshot detection is fascinating: every gun model actually has a unique signature, so you could even identify the actual model that's been fired. So again, let's go back to some of those challenges, and in particular, how does your organization surmount them? What makes your offering unique?

 

 

Darren Kimura

Yeah, I think the big question is: what about security? That's top of mind for any large organization when they're working in the cloud, but just in general, security is such a key topic these days. We start with security at our core. When we're looking at how to create a secure connection, we go all the way down to the silicon. Now, we don't make hardware; we work with over 70 different hardware models out there from the major manufacturers. We connect to the TPM, for example. When we initially install the operating system on a piece of bare metal, we're looking at the way that server is designed, its BIOS; we're looking at every single line of code as it boots. We take all of that information and we connect it back to our cloud, which creates a secure connection via encryption, in transit and at rest. And every time afterwards, when that device connects to the cloud, we match it; we call that remote attestation. If the device doesn't attest correctly, we do not allow it to connect to your network, but we allow it to run. So it can be, quote unquote, offline, but not necessarily connected. For example, a lot of these distributed servers are going to be little boxes sitting on someone's desktop, or wherever the case may be. There might be a USB port there, and someone could have inserted a USB key with malware on it. You wouldn't have any idea, because a lot of these sites may be autonomous, except for the fact that when the device tries to remotely attest itself to our cloud, we know: hey, something happened, something changed, it's been spoofed. When we see that, we let the user know: hey, you need to know this, so you can go out there and proactively take action before you connect it to your network and take everything down. So we start with security at the beginning.
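The remote attestation idea Darren describes (measure the boot chain at enrollment, re-measure on every connection, flag any drift) can be sketched with a plain hash. A real implementation anchors the measurements in a TPM; this illustrative version just hashes the measured components:

```python
import hashlib

# Sketch of remote attestation: hash every measured boot component into
# one digest and compare it against the digest recorded at enrollment.
# Component names and values here are purely illustrative.

def measure(components):
    """Fold all measured boot components into one attestation digest."""
    digest = hashlib.sha256()
    for name, blob in sorted(components.items()):
        digest.update(name.encode())
        digest.update(blob)
    return digest.hexdigest()

enrolled = measure({"bios": b"v1.2", "bootloader": b"grub", "os": b"eve"})

def attest(components, expected=enrolled):
    """Return True only if the device still matches its enrolled state."""
    return measure(components) == expected

print(attest({"bios": b"v1.2", "bootloader": b"grub", "os": b"eve"}))      # True
print(attest({"bios": b"v1.2", "bootloader": b"grub", "os": b"malware"}))  # False
```

A USB-key malware injection like the one in the anecdote changes a measured component, so the next attestation fails and the orchestrator can quarantine the device.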
And then we take a look at things like usability. Most of our customers have tens of thousands of different node devices. The last thing you want to do is physically send a technician out to every single site, plug in a laptop to bring that device up and configure it, and then have to send that technician back out there every time there's a software update. So we allow you to create an image of what you want, software, application, whatever the case may be, attach it to a particular device or cluster, and push that down. Now one admin is able to manage tens of thousands of nodes around the world and ensure each one is running exactly the way you want, with the right applications you want it to be running. Allowing for that ease, and one-touch provisioning, is such a big game changer for our customers.
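The fleet-management pattern above (attach a desired image to a cluster, let the orchestrator push it to every node) can be sketched as follows. The cluster names, node IDs, and `push_image` function are hypothetical, not ZEDEDA's actual orchestration API:

```python
# Sketch of one-admin fleet provisioning: declare the desired image per
# cluster and fan it out to every node. All names are illustrative.

fleet = {
    "retail-stores": ["node-001", "node-002", "node-003"],
    "solar-farms": ["node-101", "node-102"],
}

deployed = {}  # node id -> image currently running

def push_image(cluster, image):
    """Assign an image to a cluster; update each node, return node count."""
    for node in fleet[cluster]:
        deployed[node] = image
    return len(fleet[cluster])

updated = push_image("retail-stores", "vision-app:2.4")
print(updated, deployed["node-003"])  # 3 vision-app:2.4
```

The same call scales from three nodes to tens of thousands: the admin touches the desired state once, never the individual boxes.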

 

Jim Frazer  

Let me just ask: what markets are you participating in?

 

Darren Kimura

Yeah, you know, oil and gas has always been a leader in IoT and edge, and they continue to lead, particularly these days with oil prices as they are and more exploration needed, so we're seeing just tremendous activity in that market. Industrial and manufacturing, process automation, like what we see a lot of here at the ARC Forum: many of the end users here are exactly the types of organizations implementing edge computing now, because they're trying to modernize, trying to automate, trying to make their systems more robust and redundant. This is a goldmine for companies doing edge computing, in my opinion. Then we're seeing things like autonomy: Amazon warehouses run autonomous forklifts and drones, and each one of those systems is an edge device; each one of those drones has an edge device in it, as does each one of those Teslas out there on the road. So automotive is really picking up quite a bit right now as well. But it's everywhere: EV charging stations, solar farms as I mentioned. Edge computing is pretty much absolutely everywhere now.

 

Jim Frazer  

That is fascinating. Is it an open API protocol? Do you need functional profiles from those device manufacturers, or do you code those yourself? How does that work out?

 

Darren Kimura

Yeah, two ways. It definitely helps when we have a relationship directly with the hardware providers, because, exactly to your point, each one of these devices, even in the same model class, can have little differences that make them tricky. If we have that relationship, then as the model is being produced by the vendor, we can work with them to make sure that EVE is going to work with it. But the second way, and probably the most powerful way, is the community. EVE stands for the Edge Virtualization Engine; it's the operating system I've been describing, and it's actually in open source under the Linux Foundation. We have an active community of over 10,000 contributors who are working on these things. They're working on interoperability of hardware; they're working on using EVE in new and unique ways. And that's a totally open source world out there.

 

Jim Frazer  

Well, then let's go into some of the technical details there. If a supplier shows up with a new edge device you've never seen before, how do you know the arrangement of the objects in there? What kind of definition files might they provide? How do they go about engaging with Zededa?

 

Darren Kimura

Yeah. One way is that they directly engage with us. They become a registered supplier with Zededa, and we get them to tell us the types of models they have; a lot of times they have a virtual image of what that looks like. What we try to do is put EVE on it and see how that works. That allows us to identify bugs or inconsistencies, and from there we can begin to work on the patches, or whatever needs to be done, to make it work. That's what we typically do: we would rather start with a direct engagement with the vendor, know what they offer, get access to their images, and work with them directly. But it doesn't always happen that way. Sometimes it happens through an end user who has a device and is in our open source community; Zededa doesn't even know they're out there. What they'll do is try to put EVE on it themselves, and then through the community they'll submit what they're trying to do and where the problems exist. We have people at Zededa looking at that communication on a regular basis, and we'll drop in and try to help out. If we've seen the situation before, if we've seen the patch or fix that needs to happen there before, we can directly get involved. And that happens, like I said, naturally, because of the open source nature of what we do; oftentimes we don't even know. Once we do know, of course, we then try to build a relationship with the vendor so that we can continually be ahead of the curve: as they're coming out with a new GA of a product, we want it to work. And as we continue to get bigger as an organization and more well known, I think that will become more the normal course of business.

 

Jim Frazer  

So is there a certification process, to show that a device is a good citizen? And who manages that?

 

Darren Kimura

Yep, absolutely. We have the Zededa official devices; we call this the approved device list. That's a certification for devices we've seen before, that we work with, that we know will work consistently with our solution. Then we have our community device list, which is basically a list of devices that have been reviewed and submitted by the community as having worked.

 

Jim Frazer  

I'm guessing, intuiting, that this is not a trivial scenario. The first analog I think of is SNMP and MIBs, where you get MIB nodes and an SNMP IP address, and it pretty much works. But here you have a plethora of different silicon providers and operating systems in those terminal devices, and that makes this a non-trivial situation.

 

Darren Kimura

It's true, though I would say SNMP, well, it's supposed to be a standard but is very unstandard, as anyone who's worked in networking knows. Having come from the networking industry myself, I know that can also be very challenging. But to your point, absolutely: each one of these devices is configured in a different way, the chips are different, the I/Os are different. So there's oftentimes a little bit of work we have to do. But after having been around for about six years, we've seen a lot of the variations in the past, so the fixes come pretty quickly now.

 

Jim Frazer  

Darren, what do you see for the future of edge devices in general, and edge device management platforms in particular?

 

 

Darren Kimura

The future is very exciting. As we look at what end users are trying to do, they're pushing edge now to be even beefier than it used to be: a lot more compute, a lot more cores, for example, allowing them to do more things, with multiple runtimes running in parallel to each other. So you can now do things like AI next to frame grabbing with machine vision and automation. I think you're going to see more and more autonomy from these systems. What I'm really excited about is the end-to-end nature of what we're seeing: organizations now that are looking to deliver complete solutions, AI solutions, for example, without data science, no-code, low-code type offerings, but embedding the Zededa platform in there to become the operating system of use, and also the orchestrator of choice, which is what we're trying to do. We're not trying to compete with anyone making an application; we're trying to enable the infrastructure so those applications can be made.

 

Jim Frazer  

Now, one point we didn't discuss here: as I'm thinking about your platform, it's frequently used by the end user community, on oil refinery process control, for example. What does the user interface look like? And what about usability for folks that are not experts? Do they need to be edge device gurus?

 

Darren Kimura

Do they have to be edge device gurus? No, no, because there's no such thing, not yet anyway. Maybe there are, and we'd love to meet them, by the way. But for the most part, edge is so new that you have to make it simple, because even the definitions are still being worked on. What we offer through the Zededa cloud is a very familiar type of UI. It's very intuitive and doesn't require any training; you can figure it out yourself, step by step, though of course we have the documentation as well: how to set up a device, manage a device, deploy an application, monitor its health. We do that through our cloud. But in addition to that, we offer over 100 different APIs from our edge engine, which allows our customers to take those APIs and run them in their own cloud. For example, some of our customers have their own UI, in which they run different types of analytics across their systems, and they ingest information via API from Zededa into that UI. So it's like Zededa inside: we're behind the scenes, providing them all of the information and the configuration control, but they're not necessarily logging into our console itself.
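The "Zededa inside" integration Darren describes, pulling device data over an API into a customer's own dashboard, might look like the sketch below. The base URL, path, and header scheme are entirely hypothetical; the source only says 100+ APIs exist, not what they look like:

```python
from urllib.request import Request

# Sketch of pulling device health from an orchestrator API into a
# customer's own UI. Endpoint and fields are hypothetical examples.

API_BASE = "https://orchestrator.example.com/api/v1"  # hypothetical URL

def build_health_request(device_id, token):
    """Construct (but don't send) an authenticated device-health request."""
    return Request(
        f"{API_BASE}/devices/{device_id}/health",
        headers={"Authorization": f"Bearer {token}"},
    )

req = build_health_request("edge-node-42", "s3cret")
print(req.full_url)
# https://orchestrator.example.com/api/v1/devices/edge-node-42/health
```

A customer dashboard would issue requests like this on a schedule and render the responses alongside its own analytics, without users ever touching the orchestrator's console.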

 

Jim Frazer  

Well, Darren, thank you. This has really been a very fast half hour of demystifying what an edge device is, and edge device management platforms. Do you have any last-minute parting words for our audience?

 

Darren Kimura

Yeah, you know, I think what we're trying to do at Zededa is really be a thought leader as it relates to all things edge computing. One thing we recently launched is what we call the Edge Academy. The Edge Academy has a lot of curated information, a lot of material just in general about edge. If you want to get smarter about edge, to just learn about what it is, you should go there; you can get access to the Edge Academy for free. We make that resource available for free: simply go to zededa.com and get access to it from there. Also on our website are a lot of white papers. We sponsor a lot of material not necessarily related to Zededa, things like the State of the Edge annual report, where we talk with well over 100 different major end users around the world and get their views on where the future is going and what they're investing in. That's also available for free. All of these resources are available for free on our website.

 

Jim Frazer  

That's just fascinating. Thank you for all of that. Lastly, if any of our listeners would like to contact you, how do they actually find Zededa?

 

 

Darren Kimura

We make it easy. Just go to the Zededa website, and you can get access to us, contact us, find any one of us at any given point in time. We're there to help.

 

Jim Frazer  

Well, again, our guest today has been Darren Kimura, COO of Zededa. Thank you very much, Darren.

 

Darren Kimura

This has been awesome. Thank you so much.

 

Jim Frazer  

You're very welcome. Thank you, everybody. We look forward to seeing you on another episode of the Smart Cities Podcast.