Self-driving vehicle R&D has evolved rapidly over the past few years. Self-driving depends on several integrated technologies, one of which is high-definition mapping and map-based localization. The art and science of mapping, together with national and international standards and specifications, have produced mapping products of adequate quality for construction, infrastructure, and transportation networks, particularly in the industrialized world, but these products do not meet the more demanding precision requirements of autonomous driving. Much of the developing world lacks even basic mapping products, let alone high-definition maps. Therefore, on the one hand, the lack of high-definition maps might be an obstacle to self-driving vehicle development.
On the other hand, sensor technology has evolved significantly over the past few decades, including:
– Multi-head cameras with oblique/nadir components
– High-performance lidar systems and radar sensors
– The evolution of GPS into GNSS, which integrates satellite positioning systems from around the globe
– The evolution of inertial sensor technology, driven by its successful use in the automotive industry to trigger airbags and its implementation in cellular phones, which has led to smaller, lighter, more accurate, and more economical inertial measurement units (IMUs).
Reaping the benefits of these evolving technologies to address the high-definition mapping challenge is the main scope of this proposed session. Informative, educational, and interactive, the workshop is designed to address the following audience categories:
– Decision Makers
– Technicians, Technologists, and Engineers
– Students
The workshop will start by addressing the ‘here and now’ technological challenges and their associated solutions, as well as the future development plans intended to address them. First, the technological evolution of film into digital cameras, GPS into GNSS, laser profilers into lidar systems, and INS into IMU will be covered, along with the algorithmic evolution of aerotriangulation, SLAM, direct georeferencing (DG), and integrated sensor orientation (ISO). The benefits of integrating all the aforementioned technologies in the form of simultaneous adjustment and mapping (SAM) will also be discussed. The workshop will then address the mechanisms for integrating imaging, lidar, radar, inertial, and GNSS data, not only to georeference an image or a lidar range, but also to leverage the precise lidar ranges and image pixels to improve trajectory precision in both post-processing and real-time modes.
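To give a flavor of the kind of GNSS/inertial integration discussed above, the sketch below shows a loosely coupled, one-dimensional Kalman-filter fusion in which IMU accelerations propagate a position/velocity state and GNSS position fixes correct it. It is a minimal illustration, not the workshop's actual algorithms; all noise values, sample rates, and variable names are assumptions made for the example.

```python
import numpy as np

# Minimal 1-D loosely coupled GNSS/IMU fusion sketch (illustrative only).
# State x = [position, velocity]; IMU acceleration drives the prediction,
# GNSS position fixes correct it via a standard Kalman update.

dt = 0.01                      # IMU sample interval [s] (assumed 100 Hz)
F = np.array([[1.0, dt],       # state transition over one IMU step
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],   # maps measured acceleration into the state
              [dt]])
H = np.array([[1.0, 0.0]])     # GNSS observes position only
Q = np.diag([1e-4, 1e-3])      # process noise (assumed IMU error level)
R = np.array([[2.0**2]])       # GNSS position noise, ~2 m std (assumed)

x = np.zeros((2, 1))           # initial position/velocity estimate
P = np.eye(2) * 10.0           # initial state covariance

def imu_predict(x, P, accel):
    """Propagate the state with one IMU acceleration measurement."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def gnss_update(x, P, pos_meas):
    """Correct the predicted state with a GNSS position fix."""
    y = np.array([[pos_meas]]) - H @ x      # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulated run: constant 1 m/s^2 acceleration, GNSS fixes at 1 Hz.
rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 0.0
for k in range(500):
    accel_true = 1.0
    true_pos += true_vel * dt + 0.5 * accel_true * dt**2
    true_vel += accel_true * dt
    accel_meas = accel_true + rng.normal(0.0, 0.05)    # noisy IMU reading
    x, P = imu_predict(x, P, accel_meas)
    if k % 100 == 99:                                   # GNSS fix once per second
        pos_meas = true_pos + rng.normal(0.0, 2.0)
        x, P = gnss_update(x, P, pos_meas)

print(f"true position {true_pos:.2f} m, estimated {x[0, 0]:.2f} m")
```

In the tightly coupled and SAM-style approaches the workshop covers, lidar ranges and image measurements would enter the same estimation framework as additional observations, further constraining the trajectory rather than only being georeferenced by it.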