Working Together to Raise the Bar on Safety
Degraded Visual Environment (DVE) technology is an important feature for airborne platforms, but it appears even my CAR needs degraded visual environment technology! DVE systems use radio frequency (RF) or electro-optic (EO) sensing technologies – or combinations thereof – along with data fusion algorithms to provide the best available image to the pilot. This essentially allows a pilot to ‘see’ their landing area when dangerous weather or ground conditions would otherwise make visual references nearly impossible to pick out and make landing a hazard. This not only keeps the pilot and crew safe, but also helps avoid unnecessary aborted landings, aircraft rerouting, and the resulting loss of money, time, and resources. DVE’s undeniable benefits to the pilots and crews of aircraft reminded me of similar safety technology used every day on our roads.
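For the engineers in the audience, the fusion idea can be sketched in a few lines. This is a deliberately minimal illustration, not any real DVE implementation: it assumes two already co-registered grayscale frames (one EO, one RF-derived) and blends them by a hypothetical visibility weight.

```python
import numpy as np

def fuse_frames(eo_frame, rf_frame, visibility):
    """Blend an electro-optic frame with an RF-derived frame.

    visibility in [0, 1]: 1.0 = clear air (trust the EO camera),
    0.0 = brownout/whiteout (fall back on the RF sensor).
    Both frames are grayscale arrays normalized to [0, 1] and
    assumed to be co-registered; real systems do far more.
    """
    w = np.clip(visibility, 0.0, 1.0)
    return w * eo_frame + (1.0 - w) * rf_frame

# Clear day: the composite simply follows the EO camera.
eo = np.full((4, 4), 0.8)   # bright optical image
rf = np.full((4, 4), 0.3)   # coarser radar return
print(float(fuse_frames(eo, rf, 1.0)[0, 0]))  # 0.8
```

As visibility drops, the weight shifts smoothly toward the RF channel, which is exactly the behavior a pilot wants in a dust or snow landing: the picture degrades gracefully instead of disappearing.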
The other day, my car went into the shop for a few days of long-overdue repairs. I was given a loaner vehicle – a shiny new model with all the latest technology: digital dashboard, heated steering wheel (good for those cold Canadian winters), and some nifty driver-assist features that I had read about but not yet experienced. At first, I was delighted to change the look of the dash from Elegance White to Performance Red at the push of a button. I’ve been called an aggressive driver, and Performance Red suits me just fine. As I explored the car over the next few days, I began to feel an unnerving and somewhat concerning sense of the future of our roadways, and felt this car was a cautionary precursor to the next generation of semi-autonomous and autonomous driving vehicles… and I’m not sure we’re truly ready for it.
As I passed by a speed limit sign at the side of the road, my dashboard came alive with a matching Maximum Speed sign graphic, and then immediately scolded me with flashing lights that I was exceeding the posted speed limit. I felt a strange unease – how dare my car tell me how fast to drive! As I turned onto a side street, where I knew the speed limit was different, the dashboard remained steady with the previous road’s speed indication. Tapping my inquisitive mind, I determined the car was not identifying the road speed via GPS location – it had cameras looking out the front window to identify speed signs in real time as I drove. It even told me to slow down as I went through a construction zone, capturing and identifying the temporary reduced-speed sign. Smart car, I thought to myself.
Later that same day, I pressed a button on the dash and activated the lane departure feature. The car, as I later researched, uses cameras to identify the lane marking lines on the road just as all of us human drivers unconsciously do with our eyes. I quickly learned that if I were to drift to the edge of my lane without activating the turn signal, indicating my intent to switch lanes, the car would warn me and the steering wheel would ever-so-gently nudge me back into my lane. I experienced the full effect of this feature when I made a lane change without activating my turn signal (I was not intentionally testing the car…sigh). The steering wheel rumbled and nudged me back to where I should have been, and a flashing light near my right side mirror illuminated, indicating another car was in the adjacent lane – behind me, but rather close. I was immediately thrown back to the day when I was 16 years old, with the Driver’s Education instructor sitting in the passenger seat, reminding me to signal my intent and to perform that ever-important shoulder check before switching lanes. Smart car, I once again thought, as I activated the turn signal and accelerated into the next lane to avoid cutting off the nearby car.
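The decision logic I was experiencing can be caricatured in a few lines. The thresholds and names below are purely illustrative assumptions – no automaker publishes its tuning – but the shape of the logic matches what the car did: stay quiet if the driver signals intent, warn on a modest drift, and nudge on a real departure.

```python
def lane_departure_action(lateral_offset_m, turn_signal_on,
                          lane_half_width_m=1.6):
    """Toy lane-departure decision (illustrative thresholds only).

    lateral_offset_m: distance of the car's centerline from the
    lane's centerline, in meters (sign indicates left/right).
    """
    if turn_signal_on:
        return "none"            # driver signaled intent; stay quiet
    if abs(lateral_offset_m) > lane_half_width_m:
        return "warn_and_nudge"  # rumble the wheel and steer back
    if abs(lateral_offset_m) > 0.8 * lane_half_width_m:
        return "warn"            # flash a warning before intervening
    return "none"

print(lane_departure_action(1.7, turn_signal_on=False))  # warn_and_nudge
print(lane_departure_action(1.7, turn_signal_on=True))   # none
```

Note how the turn signal acts as an override: the same drift that triggers a nudge is treated as intentional the moment the driver signals – exactly the behavior I (accidentally) verified.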
That night, my neighbor told me he’s been waiting for the newer model of the car I was driving. It’s more advanced than my loaner model, and (apparently) has the ability to fully drive at highway speeds and stay within the lane indicators - all without human control. Pilot Assist is what they are calling it – part of the “semi-autonomous features” package. Park Assist will parallel park the car for you, and City Safety will automatically brake if somebody, or something, jumps out in front of your car.
This IS the future of driving, I thought. My future car will have the ability to drive itself and adjust to road conditions just as we expect every driver who gets behind the wheel today to do. Yet the engineer within me wonders if this technology is truly ready for prime time.
Several times, I experienced the steering wheel nudge me when I was, truthfully, in the middle of my lane. Slow curves, I wondered, might be a weakness in its algorithm. I was relieved that the car’s only action was a steering wheel rumble and a gentle nudge, and not a “human override” action of taking full control of the vehicle. An oddly familiar mechanical voice whispered in the back of my head: “I’m sorry, Aaron, I can’t let you do that.” Over the next couple of days, I monitored my car’s Pilot Assist feature as it nudged me periodically, and critically observed my own ingrained driving habits. One day, during an all-too-familiar spring shower, I intentionally let go of the steering wheel to see how well my silicon pilot could stay within the lane markers. Sure enough, the car gradually slid into the adjacent lane without even a single flash or warning. It became clear to me that my Pilot Assist could not discern the lane markers on the wet pavement, and needed the help of my trusty trained eyes.
What if cars had degraded visual environment systems to guide them on the road? Would the car have been able to process visual input from a variety of camera sensors and merge it into a composite image in which the lane markers were identifiable? My car would have stayed in the lane, and the Pilot Assist feature would have avoided a potentially hazardous lane change. I can only hope that, for my neighbor’s sake, the makers of his new car have tweaked the Pilot Assist algorithm with some basic degraded visual environment features for the coming model year.
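That “what if” can be made concrete with a small sketch. The sensor names, confidence scores, and handover threshold below are all hypothetical; the point is the DVE-style principle of taking the best available channel and degrading gracefully – handing control back to the driver only when no sensor can see the lines.

```python
def lane_confidence(detections):
    """Combine per-sensor lane-marker confidences (each in [0, 1])
    by trusting the best available view. A real fusion scheme
    would be far more sophisticated; this is the simplest case."""
    return max(detections.values(), default=0.0)

def pilot_assist_mode(detections, threshold=0.6):
    """Keep assisting only while some sensor sees the lane well
    enough; the 0.6 threshold is an illustrative assumption."""
    if lane_confidence(detections) >= threshold:
        return "assist"
    return "handover_to_driver"

# Rain washes out the visible-light camera alone...
print(pilot_assist_mode({"visible": 0.3}))             # handover_to_driver
# ...but a hypothetical IR channel keeps the lane in view.
print(pilot_assist_mode({"visible": 0.3, "ir": 0.7}))  # assist
```

With only a rain-degraded visible camera, the sketch hands control back – what my loaner silently failed to do. Add a second sensing modality, as DVE systems do, and the assist feature can keep working through the shower.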
Senior Product Manager
Aaron Frank joined Curtiss-Wright in January 2010. As a Senior Product Manager within the C5ISR group, he is responsible for a wide range of COTS products utilizing advanced processing, video graphics/GPU, and network switching technologies in many industry-standard module formats (VME, VPX, etc.). His focus includes product development and marketing strategies, technology roadmaps, and serving as a subject matter expert for the sales team and customers. Aaron has a Bachelor of Science degree in Electrical Engineering from the University of Waterloo.