A new era in autonomous navigation, or why we drove toy cars at CES 2019
The Navigine team, in partnership with Analog Devices, created and presented a brand-new concept of an autonomous car. Not only is it a cutting-edge marketing tool, it is also a hands-on demonstration of the operational capabilities of the new Analog Devices sensors. How does this project contribute to a new generation of drones and trigger a revolution in autonomous navigation?
We officially started 2019 with the inspiring launch of our autonomous car project, made in partnership with Analog Devices, a global leader in the design and manufacturing of analog, mixed-signal, and DSP integrated circuits. The project commenced in October and, after three months of meticulous work on navigation algorithms, special drivers, prototype construction and component production, was finally presented at the Consumer Electronics Show (CES) on January 9-12 in Las Vegas, USA.
All visitors got the opportunity to drive four toy cars (0.5 m × 0.3 m in size), equipped with Analog Devices' triaxial accelerometer and triaxial gyroscope, and to evaluate the performance of the sensors in terms of power and accuracy. The Navigine team created modified positioning algorithms for these autonomous cars. The obtained positioning accuracy became convincing evidence of the high quality and sophisticated technical characteristics of Analog Devices products compared to other products on the market.
This showcase was not only a great marketing experiment to attract and entertain visitors, but also a hands-on demonstration of the operational capabilities of the sensors. The real-life tests confirmed that the Analog Devices sensors outperform those of competing sensor manufacturers.
How was it?
So, how did we spend those fascinating days at CES? We drove cars, of course! And not only for entertainment.
While a toy car was driving along a track, Navigine's system recreated its trajectory on the screen. The positioning was based on data collected from inertial sensors (an accelerometer and a gyroscope) as well as on measurements of shaft revolutions and the level of pressure on the joystick trigger. Simultaneously, the user could see the real trajectory of their car, generated by another computer model based on data from optical devices, and compare the two. The smaller the difference between these two trajectories, the recreated and the real, the better the tested sensors performed.
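To make the idea concrete, here is a minimal dead-reckoning sketch of how a trajectory can be recreated from a gyroscope and shaft revolutions. The wheel circumference, the sample format and the function names are illustrative assumptions, not our production code, which also fuses accelerometer and trigger-pressure data.

```python
import math

# Simplified dead reckoning (hypothetical): heading is integrated from
# the gyroscope yaw rate, distance comes from shaft revolutions, and
# the 2D position is updated at every sample.

WHEEL_CIRCUMFERENCE = 0.10  # metres of travel per shaft revolution (assumed)

def dead_reckon(samples, x=0.0, y=0.0, heading=0.0):
    """samples: iterable of (dt_seconds, yaw_rate_rad_s, shaft_revs)."""
    trajectory = [(x, y)]
    for dt, yaw_rate, revs in samples:
        heading += yaw_rate * dt               # integrate gyro yaw rate
        distance = revs * WHEEL_CIRCUMFERENCE  # odometry from revolutions
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
        trajectory.append((x, y))
    return trajectory

# Example: drive straight for two steps, turn 90 degrees in place,
# then drive one more step.
path = dead_reckon([
    (0.1, 0.0, 1.0),
    (0.1, 0.0, 1.0),
    (0.1, (math.pi / 2) / 0.1, 0.0),  # rotate 90 degrees on the spot
    (0.1, 0.0, 1.0),
])
```

In a real system each integration step accumulates sensor noise, which is exactly why the quality of the inertial sensors dominates the final accuracy.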
For instance, our car showed an error of only 0.15 m after ten laps of the track, a great result for such an experiment. Nevertheless, three more factors influence the final quality and accuracy of the solution: trajectory smoothness, correct distance estimation despite wheel slip, and correct determination of the motion direction. Firstly, a reconstructed trajectory has to be visually smooth: a car cannot fly or jump, so fractures or breaks in the measured trajectory are not acceptable. Secondly, if your car travels 10 m, the system has to evaluate this distance as 10 m, not 7 or 15. Thirdly, the direction of the car should be consistent from start to finish: if a car started driving north, the measured direction at the end must still be northward. Our algorithms proved that the Analog Devices sensors coped with all of these challenges.
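One simple way to score such a test, sketched below under the assumption that both trajectories are sampled at the same timestamps, is the mean point-to-point distance between the recreated path and the optically measured reference. This metric and the function name are illustrative, not the exact evaluation we ran at CES.

```python
import math

# Hypothetical comparison metric: average distance between matching
# points of the recreated and the reference (ground-truth) trajectory.

def mean_trajectory_error(recreated, reference):
    """Both arguments: equal-length lists of (x, y) points in metres."""
    assert len(recreated) == len(reference)
    total = sum(math.hypot(rx - gx, ry - gy)
                for (rx, ry), (gx, gy) in zip(recreated, reference))
    return total / len(recreated)

# Example: a recreated path that drifts 0.1 m sideways at every point.
err = mean_trajectory_error([(0, 0.1), (1, 0.1), (2, 0.1)],
                            [(0, 0.0), (1, 0.0), (2, 0.0)])
```

A fuller evaluation would also check the smoothness, scale and heading criteria described above rather than a single averaged number.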
And what about our difficulties?
Creating special navigation algorithms for autonomous cars was a real challenge for us, too! It is much easier to develop a positioning system for a real car than for a toy one. Unlike the latter, a real automobile does not crash into things, flip over or deal with other emergencies so often. Moreover, the navigation system of a real car usually has access not only to inertial sensors but also to a variety of optical devices, such as video cameras and radars. The idea of this autonomous car experiment was to evaluate the sensors' ability to position an object without any optical devices, in all circumstances, including underground and in remote areas.
The first challenge a toy car posed was its higher dynamics. Evidently, a toy car has some 'super powers': it brakes hard, spins rapidly and at length, drifts, speeds up and hits obstacles. All these effects are spectacular, but at the same time they overheat the internal sensors, which posed an additional challenge for us: developing algorithms able to process data from these overheated sensors correctly.
The second problem was the questionable accuracy of the shaft-revolution data. Usually this data helps us identify the speed of an object. However, as a wheel spins faster, its radius increases due to the centripetal acceleration, so we had to estimate this parameter dynamically.
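The effect can be sketched as follows. The quadratic growth model and both coefficients are illustrative assumptions chosen for the example, not measured values from our cars; they simply show why a fixed-radius model misjudges speed.

```python
# Hypothetical model: the effective wheel radius grows with rotation
# speed, so linear speed must be computed from a dynamically estimated
# radius rather than a fixed one. R0 and K_EXPANSION are assumed values.

R0 = 0.030          # resting wheel radius in metres (assumed)
K_EXPANSION = 1e-6  # radius growth per (rad/s)^2 (assumed)

def effective_radius(omega):
    """omega: wheel angular velocity in rad/s."""
    return R0 + K_EXPANSION * omega ** 2

def wheel_speed(omega):
    """Linear speed derived from the speed-dependent radius."""
    return effective_radius(omega) * omega

# At 100 rad/s the effective radius in this model is 0.04 m instead of
# 0.03 m, so a fixed-radius estimate would be 25% too low.
```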
What is next?
There are two potential directions for further development of this autonomous car project. The first is using such drones to prepare digital maps. Nowadays this process requires a lot of time as well as human and financial resources. It usually requires dedicated employees who walk around a building with a smartphone and measure the radio signal level at different spots. During these walks a person has to spend some time at every test spot and regularly upload data about their own location into our system. Not only is this long and tedious work for employees, it is also an inconvenient process for some of our customers. Malls are an example: while such buildings are usually crowded during the day, they are empty at night, and it is quite difficult to organize night work.
Autonomous cars could be used instead. One car can carry several (5-7) smartphones and be controlled by special signals. Such a drone could independently drive around a building and automatically collect and upload data about both the signal intensity and its own location. Moreover, such cars can be left to work for a few days at a time, including at night.
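The survey loop such a mapping drone runs can be sketched like this. The helper functions `current_position` and `scan_rssi` are stand-ins for real positioning and radio-scanning hardware APIs, and the beacon name and signal values are invented for illustration.

```python
import random

# Hypothetical fingerprint-collection loop: at every survey spot the
# drone records its own estimated position together with the observed
# radio-signal strengths, building a radio map with no human walker.

def current_position(step):
    """Stand-in for the drone's own positioning (e.g. dead reckoning)."""
    return (step * 0.5, 0.0)  # drive 0.5 m per step along a corridor

def scan_rssi():
    """Stand-in for a Bluetooth/Wi-Fi scan at the current spot."""
    return {"beacon-1": -60 + random.randint(-3, 3)}  # dBm, simulated

def collect_fingerprints(n_spots):
    fingerprints = []
    for step in range(n_spots):
        fingerprints.append({
            "position": current_position(step),
            "rssi": scan_rssi(),
        })
    return fingerprints

radio_map = collect_fingerprints(10)  # 10 survey spots, fully automatic
```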
This is a great tool for reducing not only costs but also time. Firstly, a car is able to position itself and gather statistics simultaneously. Secondly, such drones do not need breaks to rest, eat or sleep.
Another direction for developing our autonomous car project is implementing these sensors and algorithms in real automobiles and other vehicles, positioning them without GPS. The so-called "cold start" is one example: when we start a car, we have to wait for some time before it catches a GPS signal and can position itself, whereas our autonomous system ensures instant positioning and readiness for navigation. Another application is locating lost cars: many cars regularly disappear from radars, which is a huge problem for logistics and car-sharing companies, as well as for police and rescue services.
We hope that our partnership with Analog Devices will continue and that together we can push a revolution in autonomous navigation and contribute to a new generation of drones.