NNG at ELIV 2023
Conference
NNG CTO Dr. Martin Pfeifle will attend and speak at the International VDI Congress "ELIV 2023" in Bonn, Germany.
Bonn, Germany
18–19 October 2023
Augmented reality (AR) applications that render guidance announcements, either over a front-facing camera image or on a head-up display (HUD), are drawing increasing attention. In [1], [2] we outline how visual-inertial odometry, combined with HD lane and landmark information, yields a precise six-degrees-of-freedom (6-DoF) pose (x, y, z, roll, pitch, yaw), which forms the foundation for accurate AR guidance applications. Precise positioning and lane data allow guidance information to be mapped exactly onto road and lane elements (cf. Figure 1).
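To illustrate the projection step, the following minimal Python sketch shows how a 6-DoF camera pose can be used to project a world-frame guidance point, such as a turn-arrow anchor on the lane, into the front-facing camera image. The camera intrinsics and mounting height are illustrative placeholders, not values from the actual system.

import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Build a camera-to-world rotation from roll/pitch/yaw (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project_to_image(point_world, pose_xyz, pose_rpy,
                     fx=1000.0, fy=1000.0, cx=640.0, cy=360.0):
    """Project a 3D world point into pixel coordinates given the camera's 6-DoF pose."""
    R = rotation_matrix(*pose_rpy)                 # camera-to-world rotation
    t = np.asarray(pose_xyz, dtype=float)          # camera position in the world frame
    p_cam = R.T @ (np.asarray(point_world, dtype=float) - t)  # world -> camera frame
    if p_cam[2] <= 0.0:                            # behind the camera: nothing to draw
        return None
    u = fx * p_cam[0] / p_cam[2] + cx              # pinhole projection
    v = fy * p_cam[1] / p_cam[2] + cy
    return u, v

# Example: anchor a turn arrow 15 m ahead of the camera on the road surface
# (camera frame: x right, y down, z forward; camera mounted 1.2 m above the road).
pixel = project_to_image(point_world=(0.0, 1.2, 15.0),
                         pose_xyz=(0.0, 0.0, 0.0),
                         pose_rpy=(0.0, 0.0, 0.0))
print(pixel)  # pixel coordinates where the arrow overlay should be drawn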
In this paper, we will show that high-quality AR applications are possible without HD maps, especially if ADAS information is taken into account as well [3]. First, HD maps are commonly unavailable in rural areas, so the AR guidance experience would otherwise be limited to a rather small operational design domain. Second, guidance information should not be limited to static map data but should also take the objects around the car into consideration. For instance, the driver should be warned if the predicted trajectory of a vulnerable road user, e.g., a bicyclist, intersects the ego vehicle's trajectory (cf. Figure 2).
In the paper, we will explain that an adequate 6-DoF pose can be created using visual-inertial odometry alone, without HD landmark information. This pose is then used to draw guidance information derived from a navigation system into the 3D scene, e.g., turn information, POIs, or house numbers (cf. the blue arrows in Figure 2 and the scenarios shown in [3]). In addition, a visual perception engine detects cars, pedestrians, bikes, traffic signs, traffic lights, lane information, etc. This perception engine can run inside the AR module, processing front-facing camera images; alternatively, the AR module can read perception information provided by an external ADAS unit. On top of the perception layer, the AR guidance module must decide which objects to highlight. To do so, it checks the trajectory of the ego vehicle against the most likely trajectories of all other road users and, in case of a possible collision, warns the driver accordingly (cf. Figure 2 and the scenarios shown in [3]).
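As an illustration of this highlight decision, the sketch below samples the ego trajectory and a road user's predicted trajectory at common time steps and flags the user if the two paths come within a safety radius at the same time. The constant-velocity prediction and the safety radius are simplifying assumptions for this sketch, not the method used in the actual module.

import numpy as np

def predict_constant_velocity(position, velocity, timestamps):
    """Predict future 2D positions under a constant-velocity assumption."""
    return np.asarray(position, dtype=float) + np.outer(timestamps, np.asarray(velocity, dtype=float))

def needs_highlight(ego_traj, user_traj, safety_radius_m=2.0):
    """True if the two sampled trajectories ever come within the safety radius."""
    distances = np.linalg.norm(ego_traj - user_traj, axis=1)
    return bool(np.any(distances < safety_radius_m))

# Example: ego vehicle driving straight ahead, bicyclist crossing from the right.
t = np.linspace(0.0, 3.0, 31)                                   # 3 s horizon, 0.1 s steps
ego = predict_constant_velocity((0.0, 0.0), (10.0, 0.0), t)     # ego at 10 m/s
bike = predict_constant_velocity((25.0, -8.0), (0.0, 3.0), t)   # bicyclist at 3 m/s

if needs_highlight(ego, bike):
    print("Highlight bicyclist in AR view")  # trigger the warning overlay (cf. Figure 2)

In a production system, the same check would run over the perception engine's tracked objects each frame, with richer motion models per road-user class.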
The lecture will be prepared jointly by the authors, who have decades of combined experience in the industry. The presenter, Dr. Martin Pfeifle, CTO of NNG, has 20 years of experience in standardization, navigation, cloud, and ADAS perception and has delivered keynote speeches at various industry conferences, including ELIV 2021 and ELIV 2022.
Get in touch to learn more about our latest products and services or company news.