Logistics Viewpoints

Safety & Sustainability Impacts of an AI / Lidar System with Austin Wilson of Velodyne

Season 7 Episode 7


Duration: 47:50

Austin Wilson, Velodyne's Director of Intelligent Infrastructure, discusses safety, sustainability, lidar, intelligent infrastructure, and Vision Zero, including:

  • What is Lidar? 
  • What is Vision Zero? 
  • What challenges do you see in the intelligent transportation market (VZ/safety or otherwise)  that LIDAR uniquely addresses?
  • How does Lidar surmount those obstacles?
  • What is Velodyne’s Intelligent Infrastructure Solution and the technology behind it?
  • What benefits does Velodyne’s IIS offer?
  • Can you share any details on city partners that are actively utilizing the solution to improve traffic safety and achieve Vision Zero goals?
  • What kinds of technology competes with lidar-enabled infrastructure? 
  • What are the challenges within the industry that you see for these existing technologies that lidar solves?
  • Can you discuss how traffic safety is often "reactive" rather than "proactive" and how Velodyne Lidar is a "proactive" solution?
  • Pedestrian fatalities continue to rise, despite an increase in technologies meant to protect them. Can you tell us how Velodyne’s IIS can be utilized to improve safety for vulnerable road users?
  • What do you see in the future for Lidar in general and Velodyne in particular

--------------------------------------------------------------------------

Would you like to be a guest on our growing podcast?

If you have an intriguing, thought-provoking topic you'd like to discuss on our podcast, please contact our host Jim Frazer or our producer Tom Cabot.

View all the episodes here: https://thesustainabilitypodcast.buzzsprout.com

Jim Frazer  

Welcome to another episode of the ARC Podcast. I'm Jim Frazer, Vice President of Smart Cities here at ARC Advisory Group. And today, I'm thrilled to welcome Austin Wilson, the Velodyne Director of Intelligent Infrastructure. Austin, can you tell us a little about your career journey, what led you to work with lidar and intelligent infrastructure, and perhaps even a little bit about Vision Zero?

 

Austin Wilson  

Absolutely. I appreciate you having me on today, Jim, and giving me the stage to talk about some of the work that Velodyne and I are up to; it's greatly appreciated. My background, how I got where I am today, is probably not the traditional type of route that you hear from people working in civic innovation, or government innovation, or traffic safety, or anything like that. I'm actually a college dropout who started my own company, which I then sold and exited about a decade later. And after that, I wanted to get into the tech world just because I had a passion for technology. A couple of years before that, I had the experience of unfortunately being hit by a vehicle in Los Angeles while I was on my skateboard. And through that experience, I started to really try, on my own, to understand how organizations (at the time I was thinking of organizations, not local governments) make decisions on corrective actions, on how they can make roadways safer for vulnerable road users. At the time, I didn't think about it at that level. But after being hit by a vehicle, you tend to notice things on the road more and think about things a little bit differently. And so in my journey of getting into tech, I started working for a company called Urban Leaf, which is a tech startup based out of San Francisco. Through that, I kind of fell in love with working with the public sector, understanding large problems that exist within communities, and how you can leverage technology to then understand those challenges and potentially solve them through X, Y, and Z. And so for me, you know, I fell in love with that whole aspect of problem solving and leveraging technology to help us understand the outcomes. After working at Urban Leaf for several years, I branched out and started my own consultancy called GovTech Labs, where I would advise govtech companies about how to scale their products into the public sector ecosystem at the local and state level.
And through that experience, Velodyne Lidar was a client of mine, and we started launching several projects. They came to me with an opportunity to join them full time, help them continue to launch the Intelligent Infrastructure Solution, and effectively be the general manager of that market for Velodyne. And the rest was history. I've been full time with them for about five or six months, and it's honestly the opportunity of a lifetime to be able to work every day on my passions and the things I'm excited about, and deliver those safe and meaningful outcomes to communities around the world.

 

Jim Frazer  

Wow. Well, thanks, Austin. You know, for those who might not be familiar with the technology, and in fact the acronym, can you talk a little bit about what lidar is? What does that acronym stand for? And what is Vision Zero?

 

Austin Wilson  

Excellent question. So lidar: the acronym stands for light detection and ranging. Think of sonar, which uses sound, or radar, which uses radio; lidar is light detection and ranging. We use laser beams to create a 3D representation of a surveyed environment. So lidar, quote unquote, sees in 3D and provides a very high-resolution point cloud in real time for a wide range of environments, through day, night, rain, or snow. The easy way I like to say it is that it enables computers to have vision, so your drone, your robot, your autonomous vehicle can actually see.

Vision Zero has not so much to do with lidar. Vision Zero itself is a strategy to eliminate all traffic fatalities and severe injuries while increasing safe, healthy mobility for all. This was first implemented, I think, in Sweden in the 1990s, and Vision Zero has been a pretty big success across Europe. Now it's gaining a ton of momentum here in America; I think there are probably 65 to 70 cities that are now committed to Vision Zero and trying to reach specific goals by next year to, again, reduce and eliminate all traffic fatalities and severe injuries on the roads. And I can't remember exactly what city it might have been, it might have been Oslo, but I believe in 2020 they had zero vulnerable road user deaths, because they executed on their Vision Zero plan well. Over the course of a decade, they were able to reduce that down to zero, which is pretty wild if you really think about it.

 

Jim Frazer  

That's pretty nuts. That's almost unbelievable. Now, Austin, let's talk a little bit more about lidar. You're saying computers will have vision. You know, most of human vision is two-dimensional. Is lidar two-dimensional, or would you consider it multi-dimensional? Does it give us actual distances and shapes?

 

Austin Wilson  

Oh, absolutely. It's definitely three-dimensional. As the laser beam leaves the lidar sensor, it bounces off an object and bounces back, and then creates a 3D image of its environment to tell you how far away something is, its trajectory, how tall it is. And then you can take all of that data, as we do, process it on the edge, and then we send that data to our cloud. So our users can then experience that 3D data that the lidar sensor creates and make informed and prescriptive decisions based off those datasets about how to fix intersections, arterials, highways, and things of that nature.
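As a rough sketch of the ranging Austin describes: the sensor times a laser pulse's round trip and halves it to get a distance, and the beam's angles place that distance as a 3D point in the point cloud. The function names and angle convention here are illustrative, not Velodyne's actual firmware interface:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_to_range(round_trip_s: float) -> float:
    """Range from laser time-of-flight: the pulse travels out and back,
    so the one-way distance is half the round trip."""
    return C * round_trip_s / 2.0

def to_xyz(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one lidar return (range plus beam angles) into a 3D point,
    the building block of the point cloud."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A return whose round trip takes 200 ns corresponds to an object
# roughly 30 m away.
r = tof_to_range(200e-9)
```

A spinning, multi-channel sensor repeats this for every beam at every azimuth step, which is how a single unit builds the full 3D scene.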

 

Jim Frazer  

Well, okay, Austin. Anytime anyone mentions vision and camera systems, there's a privacy concern about facial recognition and situations like that. Does lidar suffer from those same challenges in the public's perspective?

 

Austin Wilson  

So something that I talk with our city partners about is the fact that lidar has no bias in terms of the human being, or the object, or animal, or vehicle, or whatever it may be that it sees. With cameras at an intersection, you know, there's a lot of civic unrest with putting four, five, six cameras on an intersection. You've got a thousand intersections around your city, you've got cameras everywhere that can be doing facial recognition, capturing people's biometrics, all sorts of different things. Lidar doesn't do any of that. Lidar doesn't see skin color, eye color, hair color, or anything, right? In the circumstance we're talking about, the vulnerable road user, it just sees the human being. So it's a great technology for cities to leverage without breaking the community's trust.

 

Jim Frazer  

That's good. So that leads to my third question: what challenges do you see in this intelligent transportation market that lidar maybe uniquely addresses?

 

Austin Wilson  

Well, safety for one; sustainability, efficiency, equity, and privacy concerns, as you and I were just discussing. To the front part of that, where I said equity: cities right now are looking for solutions that are systemically safe throughout their communities, meaning they don't want it to just work in one community, they want it to work in every facet of their community. Now, I live in Kansas City, Missouri. We have a pretty diverse community here in KCMO. For example, if you go about ten blocks east of where I currently live, the life expectancy is about 18 years less than where I live. That's not that far from where I live, yet it's an 18-year difference. So when you're talking about, let's say, the communities that are east of where I live in Kansas City, those are the communities where people are underserved, and can be underappreciated, undervalued. And unfortunately, a lot of the people who live in those communities in Kansas City have, say, darker skin complexions. As we were talking about earlier, Jim, a lot of these accidents happen at dusk, or at night, or at dawn, and unfortunately a lot of these vulnerable road users are hit by vehicles at those times. So when you're talking about equity: if my technology can work the same in any community, because it doesn't see the human being or the eye color or the skin color, it can be equitably distributed to drive innovation and systemic safety throughout a community. That's extremely valuable for communities these days: to scale technologies throughout an entire community to deliver outcomes, not just to a portion of the community.

 

Jim Frazer  

Austin, earlier you mentioned sustainability. What sustainability components does your solution impact?

 

Austin Wilson  

So one aspect of the technology, and this doesn't even necessarily have that much to do with the lidar itself: the lidar data enables our cloud to give this data to the user. It's called green allocation, which is how much of your green light time is allocated to users, and how often. As our system detects, let's say, red lights, green lights, and yellow lights, we can leverage that data, or a city can leverage that data, and say: at this light, people are waiting at a red 65% of the time, instead of driving through a green 30% of the time, and maybe it's 5% of the time at a yellow. If you can start to reduce the amount of time that people are waiting at red lights, you're effectively reducing the amount of CO2 that that intersection is producing from cars that are just waiting. That's kind of why we have the new start-stop technology: when you see new cars at an intersection waiting for the green light, the engine shuts off, and when they hit the gas it turns back on. That's the automotive world trying to say, how do we mitigate emissions at red lights? Our technology can actually quantify that specific intersection and then offer datasets to say: this is what needs to happen to reduce your CO2 or your emission outputs.

 

Jim Frazer  

Wow, that's a great example. That's fascinating. With that example, you're focusing on motor vehicles. Safety, when we think of safety, and most of the impacts of deaths and injuries, very often falls onto pedestrians. And as we know, pedestrians don't like to wait. In the most optimistic examples, only 25% of them will actually press the crosswalk button, and only 25% of those will actually wait for the crosswalk signal to turn. I'm supposing that's probably some low-hanging fruit for the Velodyne system.

 

Austin Wilson  

Yeah, we could call it low hanging fruit. I guess that's an interesting term. But yeah, I suppose you're right. You're correct there for sure.

 

Jim Frazer  

Okay. So can you tell us then, perhaps in a little more depth, about Velodyne's Intelligent Infrastructure Solution and the technology behind it? Because I know it's not just the lidar sensor itself.

 

Austin Wilson  

Yeah, absolutely. So we launched in May of 2021, and we keep calling it the Intelligent Infrastructure Solution; it's honestly a mouthful, so we call it IIS for the sake of conversation. IIS is this breakthrough hardware and software technology utilizing best-in-class AI, which is actually from our partner, Bluecity.ai. They're a small company based out of Canada; we had a lot of shared goals, and our mission really aligned with theirs, so a partnership was very easy to come upon. And the entire idea is to make cities safer, more equitable, and greener.

How that works is, again, it creates, as we were saying earlier, that real-time 3D map of roads and intersections, providing really precise traffic monitoring and analytics. And it reliably collects all of this data in lighting and weather conditions of all sorts, supporting 24/7, 365 operation. To unpack that a little bit for you: we talked with some state DOTs, and some of their systems aren't detecting and picking up data for, let's say, four to six hours at night, because the cameras they have unfortunately don't function that well in ambient lighting, or heavy snow, or heavy rain. So if you're thinking about a system that's collecting data but missing, you know, 20% of the day for X reason, that's 20% of your data; you're missing data every single day, every single year. Whereas lidar isn't going to have that issue, and our solution runs 24/7, 365. And so IIS is advancing safety through multimodal analytics. It detects road users like vehicles, pedestrians, and cyclists; we can tell the difference between trucks and trailers and all sorts of different vehicles. And it can predict, diagnose, and address road safety challenges, helping municipalities and some of our other customers make those informed decisions and take corrective action.

 

Jim Frazer  

So the components of the IIS package are the field lidar sensor array, the AI package, and the connectivity between those. How do you manage the connectivity between the sensor and your AI cloud?

 

Austin Wilson  

So it's actually pretty elegant. We put our sensor up on a pole, and we only need one sensor per intersection. This is our rotational, 32-channel Ultra Puck that we put up on the traffic pole, and we have a pod up there that's powering the actual device. And then we also have an edge box. Think of the edge box as a supercomputer: the sensor produces a massive amount of data, as you can imagine, and the edge box is processing all of that data in real time. Sometimes this box goes in the traffic cabinet, sometimes it goes on the pole, just depending on a few other circumstances. But after all that data is processed on the edge, it's sent up to the cloud, where a city can then leverage that data to make those decisions. So it's a full suite: the sensor, the edge box, and then the analytics.
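The pole-to-cloud flow Austin outlines (sensor, edge box, cloud) can be sketched as a pipeline. The class and function names below are hypothetical stand-ins; the actual Velodyne/Bluecity software interfaces aren't described in this conversation:

```python
from dataclasses import dataclass, asdict

@dataclass
class Detection:
    kind: str        # e.g. "pedestrian", "vehicle", "cyclist"
    x_m: float       # position on the intersection's ground plane
    y_m: float
    speed_mps: float

def edge_process(frames, classify):
    """Stand-in for the edge box: the raw point cloud is far too heavy
    to ship upstream, so perception runs on-site and only compact
    detections continue toward the cloud."""
    detections = []
    for frame in frames:
        detections.extend(classify(frame))
    return detections

def to_cloud_payload(detections):
    """Only processed detections (a handful of fields each) leave the
    intersection, not the raw point cloud."""
    return [asdict(d) for d in detections]

def toy_classify(frame):
    # Placeholder for the real perception stack, which clusters and
    # classifies raw lidar points into objects.
    return [Detection(**obj) for obj in frame]
```

The design point the sketch captures is bandwidth: classifying on the edge turns a multi-megabyte-per-second point cloud into a few hundred bytes of detections per frame before anything touches the network.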

 

Jim Frazer  

Does the AI do calculations based on adjacent sensors that might be on sequential intersections along a road?

 

Austin Wilson  

So in terms of like our system, are you asking in terms of like other systems that are on the road?

 

Jim Frazer  

In a custom configuration, is this a point-to-point monitoring system per intersection? Or does this also help in, say, eco-driving, where I'm sensing a collection of vehicles coming down the road and I would like this signal to turn green, and then, you know, a quarter of a mile down the road, the next one can anticipate those vehicles coming?

 

Austin Wilson  

Yeah, so I love that question, Jim. I've been talking about this with my team a lot lately. We're looking at a digital twin type of model. You know, we can manage intersections, we have traffic actuation, we can create virtual loops, which effectively can replace inductive loops. And when we're thinking about creating these digital twins with the sensor at an intersection, and let's say it's just one block away from the next intersection, and you have multiple blocks of this: by putting a sensor at each intersection, and then putting a sensor at midblock, as long as we can fuse the lidar data, we can create a digital twin to give a city, or any of our other partners, a really, really clear understanding of how all of their routes are moving. And from there, you know, we're not doing this now, but this is kind of just a wacky idea I've been thinking of: can we put sensors on buses? As buses are actually going around the city, those buses are connected to our intelligent infrastructure, along with all the data the buses are picking up, because now we have moving assets, not just lidar at fixed positions at intersections. Can those buses detect new graffiti on a wall and then automatically send that data to Public Works, telling them they need to go clean that wall off? Or can a bus detect a pothole, and then, as it runs its route every day, measure that same pothole, and as it gets to a certain size, notify the needed department within the city to go and fix it? Again, that's kind of a future state. But I like the question, because we're really thinking that way, about how we can create a full view.
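A "virtual loop" like the ones Austin mentions can be sketched as a zone test on the lidar's ground plane: the loop "fires" when any detected object falls inside a polygon drawn in software, mimicking an inductive wire loop without cutting pavement. This is a minimal illustration under that interpretation, not Velodyne's implementation:

```python
def point_in_polygon(px, py, polygon):
    """Ray-casting test: does the point (px, py) fall inside the polygon
    (given as a list of (x, y) vertices)?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            # x-coordinate where the edge crosses the point's horizontal ray
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

def loop_occupied(detections, loop_polygon):
    """The virtual loop is 'occupied' when any detected object's ground
    position sits inside the drawn zone, just as an inductive loop
    triggers when metal sits over the wire."""
    return any(point_in_polygon(x, y, loop_polygon) for x, y in detections)

# Example zone: a 4 m x 10 m box at a stop bar, in meters.
stop_bar = [(0, 0), (4, 0), (4, 10), (0, 10)]
```

Feeding the occupancy signal to the controller gives the same actuation an in-pavement loop provides, and the same zones can be redrawn in software instead of re-cut in asphalt.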

 

Jim Frazer  

The perspective I'm taking in asking that question is that, you know, there is connected vehicle technology, C-V2X and V2V, that is starting to be implemented in new vehicles, where there's a message set that has very low latency but only goes about 100 meters. And one of the more impactful technologies there is, in fact, eco-driving, where the vehicle can inform the traffic signal: hey, I'm coming, and a number of vehicles can inform the traffic signal, we're on our way; maybe it'll stay green a little bit longer to allow them to pass through. Or the traffic signal itself can tell the vehicle: I'm turning red, why don't you just coast into the signal? That's a great technology, which is not part of autonomous driving, but the adoption rate is going to take some time, particularly when the average age of a vehicle on the road today is, you know, 11 or 12 years. So that's all well and good, but it's going to take some time. So in a way, some of this fixed infrastructure can really do that kind, or a similar kind, of work very quickly and easily, without waiting for that adoption rate.

 

Austin Wilson  

Yeah, I mean, you're absolutely right. So we're working on a project in Helsinki right now with Commsignia, the RSU partner we are integrated with overseas. We have one of our sensors at a specific intersection, and we have a couple of test vehicles. The idea is, since one of my sensors can be placed at an intersection but detect on all four approaches, if a vehicle is approaching the intersection, I can send data from, let's say, my system to the RSU, and from the RSU to the OBU that's in the vehicle. And now the vehicle knows where all the pedestrians are on the street. So as somebody is coming up to a crosswalk, the system will actually let the vehicle know, before the human being will even see the person in the crosswalk, that there's somebody who's going to be there when the car meets that crosswalk. And because my sensor is at that fixed position at the intersection, whether the car is going right or left, it's going to know where people are before it makes that turn. So now a car knows where people are around its environment before it even makes a turn. We're really just trying to test the data accuracy and efficiency. It is early.
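The relay Austin describes (lidar system to RSU to in-vehicle OBU) amounts to packaging pedestrian detections into a broadcast message. The JSON payload below is purely illustrative; a production deployment would use a standardized V2X message set such as SAE J2735, not this ad-hoc format:

```python
import json

def pedestrian_advisory(detections, intersection_id, timestamp_ms):
    """Package lidar pedestrian detections for an RSU to broadcast to
    approaching vehicles' OBUs. Illustrative payload only: field names
    and structure are assumptions for this sketch."""
    peds = [d for d in detections if d["kind"] == "pedestrian"]
    return json.dumps({
        "intersection": intersection_id,
        "timestamp_ms": timestamp_ms,
        "pedestrians": [{"x_m": d["x_m"], "y_m": d["y_m"]} for d in peds],
    })

# One pedestrian near the crosswalk and one vehicle: only the
# pedestrian belongs in the advisory the OBU receives.
msg = pedestrian_advisory(
    [{"kind": "pedestrian", "x_m": 3.1, "y_m": -0.4},
     {"kind": "vehicle", "x_m": 25.0, "y_m": 1.2}],
    intersection_id="helsinki-test-01",
    timestamp_ms=1650000000000,
)
```

The key property is the one Austin notes: because the sensor is fixed at the intersection, the advisory covers all four approaches regardless of which way the receiving vehicle is about to turn.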

 

Jim Frazer  

It may be early for that, Austin, but that's a great stepping stone to the full implementation. Because today, that V2V chipset is actually in your cell phone, and in a perfect world down the road, if your phone is on and you are walking through the crosswalk, your phone will send that message 100 meters to the vehicle that's bearing down on you. But that hasn't been turned on in current generations of phones, nor is it in every vehicle yet. So this really is a great intermediary step. That's great. I'm looking forward to hearing how that develops in the coming weeks and months.

 

Austin Wilson  

Absolutely.

 

Jim Frazer  

Yeah. Um, how about city partners: can you share any details on city partners that are utilizing your solutions? And what are they finding?

 

Austin Wilson  

Yeah, absolutely. So some datasets I can share, some I can't necessarily mention. As I was talking about earlier, we have been installing in Boca Raton, where you live, over at East Palmetto Park and Ocean. They installed on the southwest corner. So, you know, next time you're over there, look up and snap a pic and send it over to me, because I haven't seen it up on the pole yet; you could help us with some marketing content there. Now, the city was trying to accomplish a few things, but something that was important to them, and that's on Ocean, right, so you get a lot of people walking across the street from the beach to where the hotels and restaurants are, is that they have, let's say, an aging population there that crosses the street a little bit slower. So they wanted a system that can hold traffic for, let's say, somebody who is in a wheelchair, or somebody who is elderly and gets across the street a little slower. That was kind of their first step: how can we use this sort of technology to detect people crossing the street and hold the lights for those vulnerable road users? And from there, it spiraled into a bunch of other things that we're doing with them now. We're also working with cities like Austin, Texas, specifically to help them achieve some of their Vision Zero goals of eliminating traffic fatalities and severe injuries. We are deployed at Springdale and Seventh; I feel like that's the installation, and Jason and John Michael would kill me if I got that wrong. But we are actually deployed at an intersection, and I believe the data is that 70% of the intersection accidents that happen within Austin, Texas happen at the actual intersection where we are deployed. And so we are working with the city to understand near-miss detection, red-light running, and a lot of other datasets.
So that we can say, well, how do we redesign this road? What do we need to do here in order to make the streets safer? We were actually out in Austin, Texas a month and a half ago for the South by Southwest event, and IIS actually won an innovation award out there. And we went out to go see the installation, because none of us had seen it in person. We were all really excited, and we wanted to take pictures; it was a very big moment for us. And it was a very sobering experience to roll up to the intersection: 30 feet east of it was a shrine of flowers where somebody clearly had been hit by a vehicle and killed, not at the intersection itself, but about 30 yards back from the queue, I want to say. It was a very big reminder of why we're doing what we're doing, and a very impactful moment for me, and probably for the entire team that was with us. And you know, we're also partnered with the city of San Jose, helping them deliver meaningful outcomes and meet their Vision Zero goals as well.

I can share a fun little anecdote with you. One of our partner cities was actually able to detect that at one of their intersections, they had buses, trucks, and cars barreling through red lights. And we could say, well, this is how far after the red light the truck went through. When you have trucks or cars or buses going through red lights two or three seconds after the light turns red, it's a big warning sign as to what the heck is going on, and usually it's not just driver behavior. And then our system also told us that this was happening within the same 45-minute window every single day. This is the world of proactive versus reactive traffic safety: the city leveraged that data to understand that there were near misses happening because of the red-light running.
So our system fused the red-light running data together with near-miss data, which is what we call post-encroachment time: the measure of how much time passes between one object leaving an area and another object entering that exact same area. When do they almost hit one another, right? That data is extremely important for cities to have and leverage, to understand before accidents happen. So the system said red-light running and near misses are happening, and at this time of day. The city sent someone down there at that time of day, and what we were able to deduce was that there was a glare coming off a really big building. People were heading west, and they were getting blinded, because this was at dusk, as we were talking about earlier. And so people weren't seeing the light change from green to yellow to red. Looking at this data, we're thinking, gosh, well, how do we solve this challenge? My technology doesn't solve that challenge; it gave the city the datasets to understand that the problem was happening. The actual solution was planting trees that were tall enough to block the glare. So that right there is the definition of urban innovation, right? We leveraged technology to understand a problem, but planting trees was the solution to stop people from running red lights and then almost hitting people on the road. I mean, how cool is that?

 

Jim Frazer

Austin, because I do live in Boca Raton, I am familiar with that intersection. It's fascinating, the variety of road users there that you mentioned. I never thought about it this way, but it is very broad. You not only have the seniors who are walking slower, and some do have assistive devices like wheelchairs and walkers, but you have a very active community of road-biking people; sometimes you have 20 or 30 bicyclists in a clump. You have a playground right at the intersection as well, where a lot of kids are coming back and forth. You also have a fire station, and a lot of people walking and biking to the beach. So knowing what I know about lidar, I'm supposing that you can do a fairly good job of identifying each of those different kinds of users, whether it's a pedestrian, a single biker, a group of bikers, or perhaps even someone walking and dragging a cooler to the beach.

 

Austin Wilson  

Yeah, absolutely, we can detect all sorts of different road users. It's the perception software and the AI that are really able to detect that, which the lidar enables the software to do. You know, you also have tons of people on scooters. And when you talk about people on bikes, there's the other aspect of people on electric bikes now that go like twice as fast as normal bikes. So you have people on the same type of, let's say, vehicle, which is a bike, going twice as fast. You know, it's Florida. I mean, I lived there for a while; I rode around with my shirt off a lot. People aren't wearing leather jackets or anything like that; it's a little different, right? So I think in that environment, Boca made a really smart decision by putting that sensor there to really understand if it can help them solve challenges before they scale it. And that's really what I liked and appreciated about Boca's approach to their community: we don't want to just buy this and hope that it works. We want to take the approach of understanding the datasets it can offer, and how we can leverage those datasets to solve challenges around our community. Let's prove this works at one intersection before we actually scale it. And I love that approach, because it mitigates the risk of failure, it doesn't waste taxpayers' dollars, and it really helps the city understand, at a more holistic level, how to sustain that innovation and deliver those outcomes. And you know, Boca, as you said...

 

Jim Frazer  

It's going to be a fascinating application. It occurs to me we left out one road user: what comes to mind is the ubiquitous golf cart community we have down here as well.

 

Austin Wilson  

You know, not to say we couldn't train our AI to learn what a golf cart is; we can absolutely do that. We had a scenario in Austin, Texas, where these big black blobs were just going through an intersection and we couldn't figure out what the heck they were. And then we kind of figured it out: well, this is Austin, Texas, there are lots of people towing trailers with horses in them behind trucks. We needed to train our AI what a trailer is, so that if there's a near miss, somebody almost hit by a trailer or something like that, we can identify that it wasn't just the vehicle, it was the trailer attached to the hitch. The same thing happened with trolleys in a city we deployed in: these big blobs were going through the intersections, and we realized those were trolleys, so we had to train the AI what a trolley was. It's actually pretty amazing how quickly the AI learns. It's pretty cool.

 

Jim Frazer  

Okay, so, Austin, what type of technology competes with lidar? It's not the only sensor that you could put out there in the field. What are the challenges within the industry that you see for those existing technologies that lidar in fact solves?

 

Austin Wilson  

So I'll start this by saying, you know, I'm certainly not bashing any of the other technologies I mention right now; there are pros and cons to everything in life, right? When talking about, let's say, lidar versus cameras: cameras produce 2D images. We were talking before about 3D images; we live in a 3D world, so getting data at a 3D level can be a lot more accurate and timely than that 2D data. And mentioning cameras again: in low lighting conditions, or for people at night who have darker skin complexions, cameras struggle to see in those ambient lighting settings, or in heavy rain or heavy snow. Those are the moments where lidar can actually shine; lidar carries its own light source, and I didn't mean to make a pun there, but this is really where lidar shines, within those environments. And when it comes to lidar versus radar: like cameras, radars face similar challenges, and the technology is not refined enough to accurately identify objects the way lidar can. In addition, radar's resolution, or image clarity, is relatively poor compared to lidar. So I see cities leveraging cameras and lidar at some intersections; I see them leveraging cameras, lidar, and radar at some intersections; some intersections maybe will just have lidar and no cameras, though that probably won't happen. But in the circumstance of all of those technologies: if I can deploy one lidar sensor and get the same type of coverage as four to six cameras, or, let's say, six to ten radar sensors, that one lidar sensor is already saving the taxpayers of whatever community tons of money in hardware. If you're deploying at a hundred intersections, that adds up pretty quickly, as you can imagine.
And again, kind of going back to the cost of it: our goal is to continuously lower the cost of LIDAR so it becomes easier for governments to introduce it to their communities. And then, of course, there's privacy. This is one of the biggest things that we talk about with our city partners: as we go into the fourth industrial revolution, tons of innovations are being deployed, and all this emerging technology is going to be hitting the streets, especially with the infrastructure package that just got signed into law. Public trust and data protection is an ever-growing public concern, and where cameras don't necessarily mitigate that, LIDAR certainly does.

 

Jim Frazer  

You know, Austin, you bring up cost, and I'm not looking for any hard and fast numbers. There's been a perception about LIDAR, and we know costs have been coming down in recent years due to Moore's law; that's just the way semiconductors go. But can you give us an idea of the order of magnitude of costs between, say, the installed cost of four cameras, one on each mast arm at an intersection, versus one LIDAR sensor today? Is it roughly equivalent? Or what could you say about that?

 

Austin Wilson  

Well, when it comes to hardware, it's going to be a lot less to deploy one LIDAR sensor as opposed to four cameras. You know, one camera may cost the same as one LIDAR sensor, but because cameras are only 2D, you have to put four cameras at an intersection to get that full 360-degree view. So from a hardware perspective, it's already going to be economically beneficial for cities to start deploying less hardware. Also, it looks prettier, you know, not so much crap all over your traffic lights and your poles and whatnot.
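The hardware math behind this answer can be put in rough numbers. The sketch below is a hypothetical back-of-the-envelope calculation: the unit price and city size are placeholder assumptions, not Velodyne pricing, using the one-LIDAR-versus-four-cameras coverage ratio from the conversation.

```python
# Hypothetical cost comparison: one 360-degree LIDAR sensor per intersection
# versus four single-view cameras. All prices are illustrative assumptions.

def hardware_cost(unit_price: float, units_per_intersection: int,
                  intersections: int) -> float:
    """Total sensor hardware cost for a citywide deployment."""
    return unit_price * units_per_intersection * intersections

UNIT_PRICE = 1_000.0  # placeholder: assume one camera costs about one LIDAR
CITY_SIZE = 100       # intersections, per the example in the conversation

camera_total = hardware_cost(UNIT_PRICE, 4, CITY_SIZE)  # four cameras for 360 degrees
lidar_total = hardware_cost(UNIT_PRICE, 1, CITY_SIZE)   # one LIDAR for 360 degrees

print(f"cameras: ${camera_total:,.0f}  lidar: ${lidar_total:,.0f}  "
      f"saved: ${camera_total - lidar_total:,.0f}")
# prints: cameras: $400,000  lidar: $100,000  saved: $300,000
```

Even at price parity per unit, the four-to-one device count dominates the hardware line item, before counting mounting, cabling, and maintenance per device.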

 

Jim Frazer  

That's a very good answer. And isn't the install cost, the hardware, a small part of that? It's the continuing maintenance, the truck rolls to adjust the cameras that, you know, often get windblown and bent over and suffer other environmental impacts.

 

Austin Wilson  

Yeah, I mean, to that point, we have a sensor up in Canada. I want to say it's up in British Columbia; I can't remember, it might be Edmonton. And we literally have had that sensor in the exact same position for well over a year, through a multitude of different weather conditions. If you remember, last year we had a pretty warm summer in the Northwest, and of course the Canadian winters, you can imagine, are pretty brutal. We haven't had to go in and clean that sensor once. We haven't had to adjust it once. It's been pretty amazing how resilient that little sucker has been up on that pole.

 

Jim Frazer  

That brings up something. I know that with LED streetlights, the life is very long; it could be 10 to 20 years. So the major maintenance issue becomes cleaning the lens. Is there a suggested time to clean the lens?

 

Austin Wilson  

If you have a LIDAR sensor on a drone or a robot or a vehicle, they're lower to the ground and they're moving, so it's easier for dust to get kicked up onto them. I was talking with the Iowa DOT the other day, and they were showing me the Velodyne sensors that are on one of their autonomous vehicles, and it's Iowa, right? So they're not just testing this in the urban environment of Des Moines; they're testing it out on dirt roads. And when you have an autonomous vehicle on a dirt road, well, that LIDAR sensor gets dirty pretty dang quick, as you can imagine. So when it comes to different technologies, there are different measures for cleaning. As for the durability of our LIDAR sensor that's up, let's say, 15 to 20 feet in the air on a pole: it doesn't get a lot of dust kicked up on it, it doesn't get a lot of interaction, it doesn't move, it sits completely still. And so right now, every day that goes by is just more data telling us how long this little sucker can last. We're not so much in research and development anymore; we're very much in a proof-of-concept and pilot phase of this technology. And so, with all the projects we launch, we're obviously learning a ton of new things. But for all intents and purposes: we've had sensors out in Florida in the summer, and they're not malfunctioning, and we have sensors in Canada in the coldest winters, not malfunctioning. Nothing has needed cleaning. Nothing has malfunctioned yet. Not to say it won't happen, but, you know, we're about a year and a half into this, so that's the data we have for now.

 

Jim Frazer  

Oh, that's great to know. I know that in the early days of LED streetlights, the heat issues were challenging, because there's a tremendous amount of heat that needs to be dissipated when you're running that much current through a streetlight. And that's the basis of my question; it also comes from the streetlight world, where there actually is a calculated dirt depreciation factor on light output for streetlights. Occasionally it's suggested to ride around every few years and do a pressure wash of the lens. But I'm guessing your sensors are robust enough that even with a five or ten percent degradation, they work the same.

 

Austin Wilson  

Yeah, absolutely. I mean, it has to, right?

 

Jim Frazer  

So let's move on to some forward-looking issues. Traffic safety is often said to be reactive rather than proactive. How is Velodyne LIDAR a proactive solution?

 

Austin Wilson  

So, the way that I talk about this a lot with our city partners and with the traffic engineering and civil engineering firms we work with: traditionally speaking, there's an accident on a road or at an intersection, and that data can take anywhere from six months to maybe even a year to get into the hands of somebody who needs to analyze it to understand why the accident happened and what needs to be done to fix that intersection. Well, whether it's because of poor design, or because of new technology on the streets like scooters, or maybe something repetitive that could be fixed: over the course of that year that somebody is waiting to analyze the data, what if there are more accidents for the exact same reason, right? So it's the reactive world: an accident has to happen in order for an organization to analyze and then fix that intersection. That's all reactive, meaning that if you're only going to be reactive, you're never going to stop something before it happens. The post-encroachment time algorithm that we have, which is the near-miss detection, can tell a city: you haven't had six people hit at this intersection, you've had six people almost hit at this intersection. Just like that story I was telling you about, one of our partner cities detecting red-light runners and near misses. That was a city understanding that this was happening prior to an accident happening, right? That's a whole proactive mindset: we're actually going to understand the data before anything happens to the people on our roadways. In my mind, that's the easiest way to explain how we are taking the public sector into the future, giving them the correct data set so they can proactively solve challenges for the community.
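The post-encroachment time (PET) idea described here can be sketched in a few lines. This is an illustrative toy, not Velodyne's implementation: the event structure, the 1.5-second threshold, and all names below are assumptions made for the sake of the example. PET measures the time gap between one road user leaving a shared conflict zone and the next one entering it; a small gap means a near miss.

```python
# Toy post-encroachment time (PET) near-miss detector.
# All names and the 1.5 s threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ZoneEvent:
    """One road user's pass through a shared conflict zone at an intersection."""
    road_user: str   # e.g. "vehicle", "pedestrian", "cyclist"
    t_enter: float   # time (s) the user entered the conflict zone
    t_exit: float    # time (s) the user left the conflict zone

def post_encroachment_time(first: ZoneEvent, second: ZoneEvent) -> float:
    """PET = gap between the first user leaving and the second arriving."""
    return second.t_enter - first.t_exit

def near_misses(events: list[ZoneEvent], threshold_s: float = 1.5):
    """Flag consecutive pairs of users whose PET falls below the threshold."""
    ordered = sorted(events, key=lambda e: e.t_enter)
    flagged = []
    for first, second in zip(ordered, ordered[1:]):
        pet = post_encroachment_time(first, second)
        if 0 <= pet < threshold_s:
            flagged.append((first.road_user, second.road_user, round(pet, 2)))
    return flagged

# A pedestrian clears the crosswalk 0.8 s before a red-light runner arrives:
events = [
    ZoneEvent("pedestrian", t_enter=10.0, t_exit=12.0),
    ZoneEvent("vehicle",    t_enter=12.8, t_exit=13.4),
]
print(near_misses(events))  # prints [('pedestrian', 'vehicle', 0.8)]
```

In practice, the entry and exit times would come from LIDAR object tracks; the thresholding step is what turns raw trajectories into the "six people almost hit at this intersection" statistic, with no collision needing to occur first.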

 

Jim Frazer  

That's fascinating. This brings up a related question: does your sensor look only at the roadway, or also at adjacent sidewalks?

 

Austin Wilson  

Oh, roadway, sidewalks, the whole shebang.

 

Jim Frazer  

That's interesting, because I am a bit of a bicycle advocate here in Boca, and I don't want to speak for the city, but there recently was, I believe, a study of bike accidents. And many of them, particularly in the downtown areas, happened on the sidewalk because the bike lanes were inadequate, not, you know, a five-foot marked bike lane. So the rules that are being developed are now looking at why accidents are happening on sidewalks and driveways rather than in the bike lanes. So this is interesting: it can pick up those scenarios as well.

 

Austin Wilson  

Yeah, absolutely. I mean, I have data that I can share with you just showing bikers blowing through red lights, right? You see that quite often: bikers are in a flow, they don't want to wait at a red light, they're going to check that it's safe and then go across the street, because they're on a bike ride. I admittedly do that with my bike, but only when I know it's safe to cross the street. So if a city can understand that, let's say, bikers are running through these red lights, and there are near misses, the city would be able to detect that and understand why those near misses are happening, and then take action to fix X, Y, and Z at said intersection. And obviously, that goes for sidewalks as well.

 

Jim Frazer  

Austin, this has been a fascinating discussion so far today. As we're nearing the end of our time, let me ask, what do you see in the future for LIDAR in general, and Velodyne, in particular.

 

Austin Wilson  

So, a few priorities here: really driving down the cost of LIDAR technology, making LIDAR ubiquitous, providing solutions that make the technology easier to deploy in urban or rural environments, really maximizing and delivering an excellent customer experience (ease of use, world-class service, and so on), and continuing to expand LIDAR applications in automotive, widespread infrastructure, and, of course, industrial and robotics uses.

 

Jim Frazer  

Austin, we're just about out of time. Do you have any parting thoughts for our listeners?

 

Austin Wilson  

Yeah, you know, something that got me into this whole world was that I didn't really understand my local government as much as I do today. Going back, say, six or seven years, I was very much a novice in my understanding of what my local government does for me. So I encourage any of our listeners to engage with their local governments and understand who the people are who are driving innovation and sustainability and all these other big buzzwords these days that cities and everyone are using. You know, most people who work for your local government are not politicians; they choose to work for the government. And these people are overworked, they are underpaid, they are undervalued, and they have a really hard job. So I would encourage any of the listeners to really start to dig into their local communities and understand who is doing what and who is in charge of what, other than just the politicians, and really get into different programs and help your local government understand how they can help you, the citizen, deliver a better outcome to your community. So that's, I guess, my parting thought.

 

Jim Frazer  

I will second that. Very often those public workers are looking for input from stakeholders, and they really do try to collect those perspectives and see which ones have consensus. And if it's about, you know, bike and pedestrian safety at a particular intersection, well, they usually do their best to try to rectify those problems.

 

Austin Wilson  

Yeah, absolutely. And it's also something to consider: the government at the local level does not have a move-fast-and-break-things mentality, right? It's don't mess up, don't waste taxpayers' dollars; everything has to work, and if not, heads roll. And so I'm excited to see the culture of local governments and, you know, DOTs and state governments change, allowing more of a culture of failure where people can learn and feel safe innovating. In innovation, there's a level of failure that has to be accepted. Me personally, I'd rather know that Kansas City is going to blow $100,000 of taxpayers' dollars testing new technologies, as opposed to putting something out to RFP for $20 million and then having it fail because they didn't test it, wasting $20 million. I would rather hear that my local government is willing to take those risks. And it's really cool to see that change happening, slowly.

 

Jim Frazer  

Well, today we've talked with Austin Wilson, Velodyne Director of Intelligent Infrastructure. Austin, before we go, how can listeners contact you?

 

Austin Wilson  

You can email me at awilson@velodyne.com, or you can reach out to me on LinkedIn. There are probably about 2,000 Austin Wilsons, but, you know, hopefully I'm up there at the top.

 

Jim Frazer  

Well, thanks. This has really been enlightening. Thank you very much, and thanks to all our listeners. Hopefully we will see you again on another episode of the Smart City Podcast.

 

Austin Wilson  

Absolutely. Jim, you're the man. Thank you so much for having me on and allowing me to tell our story.

 

Jim Frazer  

It's our pleasure. Looking forward to seeing all of our listeners again on our next episode.

 


SPEAKERS

Jim Frazer, ARC Advisory Introduction, Austin Wilson