Retrofitting a Classic Mini with an autonomy stack

The sensor array on Tangram Vision’s Classic Mini. | Photo Credit: Tangram Vision.

For the past decade, an increasing number of self-driving test mules have plied the streets of Silicon Valley. Waymo. Zoox. Motional. Cruise. They’ve become so commonplace that residents like me barely bat an eye when a Jaguar I-Pace outfitted with multiple LiDARs, cameras, and radar units silently glides by.

But put those cars on a race track? Now that’s a spectacle. And that’s exactly what Joshua Schachter has done since 2016, when he launched Self Racing Cars (SRC). Since then, SRC has become an autonomous proving ground for companies big and small.

SRC lets these companies, as well as hobbyists, students, and researchers, test the capabilities of their autonomous vehicles at California’s legendary Thunderhill Raceway Park. Vehicles are grouped into classes like fully autonomous, tele-operated, or human-driven but sensor equipped. Over the weekend, each vehicle gets multiple opportunities to set lap times, capture data, test systems, and compete to see which vehicle can run the fastest autonomous lap.

As an automotive enthusiast who has also worked in perception and sensors for nearly 15 years, I knew that being a part of SRC was not a matter of if, but when. Fortunately, participating with Tangram Vision was the perfect opportunity to head to the track and test new sensor streaming, fusion, and runtime modules that our team has been building over the past few months.

But what vehicle to bring? As you may have seen from the main image of this article, the vehicle we chose is not a typical platform for autonomy or sensor testing. And, as it turns out, outfitting a Classic Austin Mini Cooper with sensors is not a straightforward task.

Mechanical mounting

One of the first challenges we tackled was determining where the sensors could go, and how they would be mounted to the Mini. Given its 1950s origins, the Mini was clearly not designed with sensors in mind, much less many other items we take for granted on modern cars, like safety equipment. In a Classic Mini, you are the bumper.

Modern cars offer many suitable options for sensor mounting. They feature rigidly mounted side mirrors, which can serve as stable platforms for attaching sensors. They often have flat roofs upon which a sensor can be easily mounted. Flat windshield and rear window glass allows for quick internal mounting of cameras. A Classic Mini has none of these features. With the exception of the door skins and side window glass, everything is curved, which complicates the task of finding stable, flat mounting surfaces for sensors. Therefore, we turned to a solution that many other autonomous vehicle developers have chosen for prototyping: a roof rack.

Given that the Classic Mini has been out of production for 21 years, there are no longer bespoke racks being produced for it. Fortunately, the Mini’s 1950s design includes prominent rain gutters along the roofline, similar to those found on a modern Jeep Wrangler. That resemblance pointed us to our solution: a gutter-mount Wrangler rack that we could adapt to the Classic Mini.

Having sourced an appropriate rack, we solved our remaining mechanical mounting needs with multiple trips to a local Home Depot. We built a flat, rigid platform for the sensors from 16-gauge steel panels bolted directly to the rack’s cross bars. Our chosen sensors (two Velodyne Pucks and an Intel RealSense D435i) all included a threaded insert for tripod mounting, using standard 1/4″-20 threads. We were able to easily attach all of the sensors with 1/4″-20 bolts threaded up through the metal platform.

We needed to raise our LiDAR units further off the roof to ensure that they could capture sufficient data in 360 degrees around the Mini. It turns out that the 4″ footprint of the Velodyne Puck units is a perfect fit for an electrical junction box, which is exactly what we used. As a directional sensor, the Intel RealSense D435i simply needed to be mounted at the front of our sensor platform; an upside-down 1/4″-20 bolt did the trick. To keep vibration to a minimum, all sensors were isolated from the metal rack with red rubber packing gaskets. With the three sensors securely mounted in their proper positions, our next step was to route cabling for power and data.

Tangram Vision’s sensor-equipped Classic Mini at Self Racing Cars 2021. | Photo Credit: Tangram Vision.

Transmitting data and power

Both the Velodyne and RealSense units require an AC power source, which meant installing an AC/DC inverter in the Mini. Our first concern was whether the Mini’s electrical system would even be up to the task of powering the inverter, as it would need to power the three sensors, a USB hub, and a laptop PC. After all, the Mini’s electrical system used Lucas components. Lucas is affectionately known as the “Prince of Darkness” among British auto enthusiasts due to the manufacturer’s reputation for spotty quality and sudden component failures. Fortunately, the Mini’s electrical system did just fine, powering all components reliably through the event.
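
For a rough sense of the load involved, here’s a back-of-the-envelope power budget in Rust. Every wattage below is our own assumption based on typical published figures for this class of component (not measured values), and the 85% inverter efficiency is likewise assumed:

```rust
fn main() {
    // Assumed, typical-spec wattages for each load; not measurements.
    let loads_w = [
        ("Velodyne Puck #1", 8.0),
        ("Velodyne Puck #2", 8.0),
        ("RealSense D435i", 3.5),
        ("Powered USB hub", 10.0),
        ("Laptop PC", 60.0),
    ];
    let total_w: f64 = loads_w.iter().map(|&(_, w)| w).sum();

    // Assume ~85% inverter efficiency and a nominal 13.8 V charging system.
    let inverter_input_w = total_w / 0.85;
    let battery_current_a = inverter_input_w / 13.8;

    println!("sensor + compute load: {total_w:.1} W");
    println!("inverter draw: {inverter_input_w:.1} W (~{battery_current_a:.1} A at 13.8 V)");
}
```

Under those assumptions, the whole rig draws roughly 105 W, or about 8 A at charging voltage: comfortably within reach of even a small classic alternator in good health.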

With power solved, the last challenge was data cabling from the sensors to a compute source. The Intel RealSense uses a single USB-C port for both power and data, hence the powered USB hub. This also meant sourcing a USB cable that could carry both power and high-rate data over a long run; the cable that came with the RealSense could not reach from the center of the Mini’s roof to the USB hub in the interior. The Velodyne sensors split power and data into separate cables, with data carried over Cat6 Ethernet, which imposed no practical limit on cable length at this scale. With our sensor rig mounted, powered, and transmitting data, the final step was capturing and processing the data it generated while hurtling around Thunderhill at high speed.
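
On the data side, “capturing” a Puck’s output is conceptually simple: per Velodyne’s interface documentation, each sensor streams fixed-size 1,206-byte UDP data packets to port 2368. A minimal Rust listener, stripped of any real parsing, looks something like this:

```rust
use std::net::UdpSocket;

fn main() -> std::io::Result<()> {
    // Velodyne Pucks broadcast 1,206-byte data packets on UDP port 2368.
    let socket = UdpSocket::bind("0.0.0.0:2368")?;
    let mut buf = [0u8; 1206];

    loop {
        let (len, src) = socket.recv_from(&mut buf)?;
        if len != 1206 {
            // Not a standard data packet; skip it.
            continue;
        }
        // Each packet holds 12 firing blocks; a real pipeline would parse
        // azimuth and per-channel ranges here instead of just counting bytes.
        println!("got {len}-byte packet from {src}");
    }
}
```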

Component, power, and data diagram for the Classic Mini sensor array. | Photo Credit: Tangram Vision.

Software testing for the Tangram Vision SDK

We tested three aspects of the Tangram Vision SDK at SRC: sensor runtime, multi-modal sensor synchronization, and LiDAR streaming.

Perhaps it was apt that we powered our RealSense with a Lucas alternator, as the RealSense series has earned a Lucas-like reputation for unreliability among its many users: the sensors can shut down unexpectedly, and can prove difficult to reboot quickly after a shutdown. The Tangram Vision runtime module is designed to solve these stability issues by wrapping the RealSense libraries in Rust, a memory-safe programming language that is becoming increasingly popular in robotics. By leveraging Rust’s safety features, the Tangram Vision runtime streamed the USB-connected D435i consistently and reliably throughout the weekend. We validated this with an instant boot before our lapping session, followed by thirty minutes of continuous, fault-free data collection during our high-speed mapping laps of Thunderhill’s West track.
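
The Tangram Vision runtime itself isn’t public, but the general shape of a fault-tolerant sensor loop in Rust can be sketched. Note that everything here (the Sensor trait, the error type, the retry policy) is our hypothetical illustration, not the SDK’s actual API:

```rust
use std::{thread, time::Duration};

// Hypothetical illustration only; not the Tangram Vision SDK's API.
#[derive(Debug)]
enum SensorError {
    Disconnected,
    Timeout,
}

trait Sensor {
    fn start(&mut self) -> Result<(), SensorError>;
    fn poll_frame(&mut self) -> Result<Vec<u8>, SensorError>;
}

// Supervisor: on any fault, back off briefly and restart the device
// rather than letting the whole pipeline die.
fn run_supervised<S: Sensor>(sensor: &mut S) {
    loop {
        if let Err(e) = sensor.start() {
            eprintln!("start failed ({e:?}); retrying");
            thread::sleep(Duration::from_millis(500));
            continue;
        }
        loop {
            match sensor.poll_frame() {
                Ok(frame) => {
                    // Hand the frame to downstream consumers here.
                    let _ = frame;
                }
                Err(e) => {
                    eprintln!("sensor fault ({e:?}); restarting device");
                    thread::sleep(Duration::from_millis(500));
                    break; // fall through to start() again
                }
            }
        }
    }
}
```

The point is that Rust’s Result type forces every fault path to be handled explicitly, so a flaky device degrades into a retry rather than an unhandled crash.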

Our Velodyne LiDAR testing was simpler; we’ll be releasing support for LiDAR runtime, calibration, and spatial registration in the Tangram Vision SDK in the near term, and SRC allowed us an opportunity to test our LiDAR pipeline. Unfortunately, one of our Velodyne Puck units failed before we began testing, so we were only able to capture data from a single LiDAR unit during the event.
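
For context on what a LiDAR pipeline has to do with each packet: every return arrives as polar measurements (a range in 2 mm units, an azimuth, and a fixed per-laser elevation angle) that must be converted into Cartesian points. A minimal sketch of that conversion, following the coordinate convention in the VLP-16 manual:

```rust
// Convert one Velodyne return from polar to Cartesian coordinates.
// range_m: distance in meters; azimuth/elevation in degrees.
// (Per the VLP-16 manual: y points forward, x right, z up.)
fn polar_to_xyz(range_m: f64, azimuth_deg: f64, elevation_deg: f64) -> (f64, f64, f64) {
    let alpha = azimuth_deg.to_radians();
    let omega = elevation_deg.to_radians();
    let x = range_m * omega.cos() * alpha.sin();
    let y = range_m * omega.cos() * alpha.cos();
    let z = range_m * omega.sin();
    (x, y, z)
}

fn main() {
    // Raw packet ranges come in 2 mm units; a raw value of 5000 is 10 m.
    let range_m = 5000.0 * 0.002;
    let (x, y, z) = polar_to_xyz(range_m, 45.0, -15.0);
    println!("point: ({x:.2}, {y:.2}, {z:.2}) m");
}
```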

For the Velodyne Puck and Intel RealSense D435i that did work during the SRC weekend, we captured simultaneous datasets to test real-time sensor synchronization across two different sensing modalities from two different manufacturers. (These synchronized datasets, along with other teams’ datasets, will be released on the Self Racing Cars website in the next few weeks.)
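
We won’t detail the SDK’s synchronization mechanism here, but one common way to align two free-running streams like these is nearest-timestamp matching: pair each camera frame with the closest LiDAR sweep in time, rejecting pairs outside a tolerance. A sketch of that idea (our illustration, not the SDK’s implementation):

```rust
// Pair each camera frame with the nearest-in-time LiDAR sweep.
// Both timestamp slices must be sorted ascending (seconds).
fn match_nearest(cam_ts: &[f64], lidar_ts: &[f64], tol_s: f64) -> Vec<(usize, usize)> {
    let mut pairs = Vec::new();
    let mut j = 0;
    for (i, &t) in cam_ts.iter().enumerate() {
        // Advance j while the next lidar stamp is closer to t.
        while j + 1 < lidar_ts.len() && (lidar_ts[j + 1] - t).abs() < (lidar_ts[j] - t).abs() {
            j += 1;
        }
        if !lidar_ts.is_empty() && (lidar_ts[j] - t).abs() <= tol_s {
            pairs.push((i, j));
        }
    }
    pairs
}

fn main() {
    // 30 Hz camera vs. 10 Hz LiDAR, 50 ms tolerance.
    let cam: Vec<f64> = (0..9).map(|i| i as f64 / 30.0).collect();
    let lidar: Vec<f64> = (0..3).map(|i| i as f64 / 10.0).collect();
    for (i, j) in match_nearest(&cam, &lidar, 0.05) {
        println!("camera frame {i} <-> lidar sweep {j}");
    }
}
```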

We’d be remiss if we did not mention the harsh conditions in which our systems, and other teams’ systems, were tested. With a noontime high of 86°F and not a cloud in the sky, cars, sensors, and drivers alike were heat-soaked and bathed in full-spectrum sunlight. Thunderhill’s two-mile-long West track, opened in 2014, has a challenging layout full of decreasing-radius corners, elevation changes, and off-camber chicanes. Collectively, these twists and turns ensure that even the most stably mounted sensor will experience physical forces well beyond those encountered on a typical city street or interurban highway.

NVIDIA R&D’s multi-sensor equipped Ford Fusion test car. | Photo Credit: Tangram Vision.

Other teams at SRC

SRC attracts a diverse set of teams that bring different kinds of vehicles with different levels of autonomy. Along with Tangram Vision, other teams that participated in SRC this year included:

  • PointOne Navigation: The company provides spatial localization for autonomous and ADAS-enabled cars. PointOne Navigation not only completed this year’s fastest fully autonomous lap with its self-driving Lexus, but also completed a full autonomous lap … in reverse. Over the two-mile course, its forward-facing lap time was a quick 2:49, with an even more impressive 4:37 in reverse. One more fun fact: PointOne’s autonomy stack was developed by CEO Aaron Nathan for the 2007 DARPA Urban Challenge. That 14-year-old code was written in C# and runs on Windows 7, yet it still excels every year when Aaron brings it to SRC.
  • Qibus: vehicle tele-operation on demand
  • Faction: lightweight, driverless vehicle fleets for delivery and transportation. Faction uses Arcimoto three-wheeled EVs and completed multiple autonomous laps at the event.
  • AEye: high-performance, adaptive LiDAR sensors
  • NVIDIA: the R&D team tested a Ford Fusion outfitted with multiple LiDARs, cameras, radars, and other sensors.
  • Monarch Tractor: The company develops compact, autonomous, electric tractors. Unfortunately, Monarch’s tractor was too heavy to be allowed on track, but it was able to navigate autonomously around the track paddock.
  • Boltu Robotics: autonomous delivery robots. Boltu brought an autonomous Prius to this year’s event.

After returning this year from its 2020 COVID-19 hiatus, Self Racing Cars will be back at Thunderhill Raceway Park in 2022 for another weekend of autonomous excitement. Tangram Vision will return with our Classic Mini Cooper and an evolved sensor package. That said, we’re still trying to figure out how to automate a manual gearbox. Got any ideas for us? Whether or not you can help us crack that one, we highly recommend signing up to participate in, or spectate at, next year’s event.

Adam Rodnitzky

About the Author

Adam Rodnitzky is a serial entrepreneur in perception and sensors. He was co-founder of ReTel Technologies, one of the first companies to apply human-in-the-loop techniques to video analytics. After ReTel, he joined Occipital as GM and launched Structure Sensor and SDK, the most ubiquitous mobile depth sensor and SDK platform for iOS.

Adam is now COO and co-founder of Tangram Vision. Tangram Vision helps autonomy and robotics companies reliably scale their perception stack with enterprise-grade tooling to manage sensor integration, stability, fusion, calibration, and more.

