Specialists at Innopolis University have developed a simulator that will make autonomous vehicles cheaper and safer

Built with the Unity 3D engine, Innopolis Simulator for autonomous vehicles can be used by developers and end users for testing, debugging and educational purposes. The Russian IT university presented the development on October 15.

Researchers at the Center for Technologies in Robotics and Mechatronics Components at Innopolis University have created a software package for modeling and debugging the behavior of a fully featured vehicle model in simulation. With this software, developers of autonomous systems can catch critical design errors, correct them at an early stage, run more prototype tests and save on field testing. “Testing a real car requires two or three people doing many runs at a real test site. With the simulator, we can do everything indoors with just one person,” explains Sergey Kopylov, research engineer at the Autonomous Transportation Laboratory.

Innopolis Simulator models a variety of road scenarios, simulates the movement of traffic and pedestrians and their detection, and simulates all the necessary sensors: radars, lidars, cameras, GPS and IMU. The software includes ground-truth modules that report the exact location of objects in space.
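To illustrate what a ground-truth module provides, here is a minimal sketch in Python of a hypothetical record of an object's exact simulated state and a helper that compares it against what the ego vehicle's sensors should see; the names and fields are assumptions for illustration, not the simulator's actual interface.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GroundTruthObject:
    """Exact state of one simulated object, as a ground-truth module might report it."""
    object_id: int
    category: str                            # e.g. "car", "pedestrian"
    position_m: Tuple[float, float, float]   # x, y, z in the scene frame, metres
    yaw_rad: float                           # heading around the vertical axis

def objects_in_range(objects: List[GroundTruthObject],
                     ego_xy: Tuple[float, float],
                     radius_m: float) -> List[GroundTruthObject]:
    """Keep only the ground-truth objects within a given radius of the ego vehicle,
    e.g. to check them against the detections produced by a simulated radar or lidar."""
    ex, ey = ego_xy
    return [o for o in objects
            if (o.position_m[0] - ex) ** 2 + (o.position_m[1] - ey) ** 2 <= radius_m ** 2]
```

Comparing such exact states with simulated detections is what lets developers measure perception errors without any field runs.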

“We also developed a system for visualizing active cameras, which transmits images to the driverless vehicle's systems in real time. We created modules for mapping and for dynamic loading of maps, and for texturing the landscape surface and some environmental objects in real time, depending on the vehicle's position in global coordinates,” continues Sergey Kopylov. “We made it possible to convert coordinates between different systems and to create a roadway, and we integrated a module for creating and exporting HD maps.”
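Converting between global and scene coordinates, as mentioned in the quote, typically means projecting GPS latitude and longitude into flat local metres around a map origin. A minimal sketch of one common approach (an equirectangular approximation, adequate at city scale) is shown below; the function name and the example coordinates are illustrative, not taken from the simulator.

```python
import math
from typing import Tuple

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def geodetic_to_local(lat_deg: float, lon_deg: float,
                      origin_lat_deg: float, origin_lon_deg: float) -> Tuple[float, float]:
    """Convert GPS latitude/longitude to east/north metres around a chosen map origin
    using an equirectangular approximation."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    lat0, lon0 = math.radians(origin_lat_deg), math.radians(origin_lon_deg)
    east = (lon - lon0) * math.cos(lat0) * EARTH_RADIUS_M
    north = (lat - lat0) * EARTH_RADIUS_M
    return east, north

# Example: a vehicle's position relative to an (assumed) map origin near Innopolis
east, north = geodetic_to_local(55.7520, 48.7450, 55.7500, 48.7400)
print(f"east: {east:.1f} m, north: {north:.1f} m")
```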

[Image: Innopolis Simulator screenshot]

Compared to its analogs, the Innopolis University simulator offers 10 distinctive features: flexible configuration of each sensor, dataset capture, development of custom scenarios, support for various models of moving objects, a module for mapping, geocoding and procedural generation of surfaces and objects depending on the vehicle's position, a module for creating and exporting HD maps, a module for controlling the time of day and weather effects, an analytics module, and a runtime scene-editing module.
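The per-sensor configuration mentioned first in that list might look something like the sketch below; every parameter name and value here is a hypothetical example of the kind of settings such a feature exposes, not the simulator's actual configuration format.

```python
# Hypothetical sensor suite configuration (illustrative names and values only).
sensor_suite = {
    "front_lidar": {
        "type": "lidar",
        "channels": 32,
        "range_m": 120.0,
        "rotation_hz": 10,
        "mount_xyz_m": (1.5, 0.0, 1.8),   # position on the vehicle body
    },
    "front_camera": {
        "type": "camera",
        "resolution": (1920, 1080),
        "fov_deg": 90,
        "fps": 30,
        "mount_xyz_m": (1.2, 0.0, 1.4),
    },
    "gnss": {
        "type": "gps",
        "rate_hz": 5,
        "noise_std_m": 0.5,               # position noise added to ground truth
    },
}
```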

“Developing a virtual environment for simulating the movement of a robotic city car will reduce the cost of the development process, especially for hardware setup and for testing sensor and control equipment. Specialists will understand what equipment is needed and what can be dropped. Our simulator will help increase driving safety in a real urban environment,” added Alexander Klimchik, Head of the Center for Technologies in Robotics and Mechatronics Components.