Advancements in Cooperative Driving Systems using Digital Twin Technologies and in Content-Adaptation Rendering for Demanding XR Interactions

A Showcase of DIDYMOS-XR’s recent research achievements.

The University of Patras (UPAT), a vital member of the DIDYMOS-XR team, recently achieved a significant milestone: their scientific paper titled “Cooperative Saliency-Based Pothole Detection and AR Rendering for Increased Situational Awareness” has been accepted for publication in the IEEE Transactions on Intelligent Transportation Systems, a high-impact journal.

This work revolves around a groundbreaking cooperative obstacle detection and rendering scheme that employs LiDAR data and driving patterns to identify obstacles within the road's range. The proposed system facilitates information sharing among connected vehicles, allowing drivers to be notified about approaching potholes and other road obstacles even when there is no direct line of sight. This cooperative driving scheme not only enhances drivers' situational awareness but also significantly reduces the risk of accidents caused by unexpected obstacles, through automated cooperative obstacle detection, visualization, and information sharing in a V2X (vehicle-to-everything) setting. At its core, a point cloud processing system takes road environment data as input and classifies it into safe and potentially hazardous regions by identifying obstacles within the road's range.
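To make the classification step concrete, here is a minimal illustrative sketch (not the paper's actual method) of splitting road-surface LiDAR points into safe and potentially hazardous regions, using deviation from a least-squares-fitted road plane as a stand-in obstacle criterion; the function name and threshold are assumptions for illustration only:

```python
import numpy as np

def classify_road_points(points, threshold=0.05):
    """Split road-surface points into safe and potentially hazardous regions.

    points: (N, 3) array of LiDAR returns already cropped to the road area.
    A plane z = a*x + b*y + c is least-squares fitted to the points; returns
    whose vertical deviation from that plane exceeds `threshold` (metres) are
    flagged as potential obstacles such as potholes or debris.
    """
    # Build the design matrix [x, y, 1] and solve for the plane coefficients.
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    deviation = points[:, 2] - A @ coeffs
    hazardous = np.abs(deviation) > threshold
    return points[~hazardous], points[hazardous]

# Example: a flat 5x5 road patch with one sunken point (a pothole-like dip).
road = np.array([[x, y, 0.0] for x in range(5) for y in range(5)])
road[12, 2] = -0.15  # 15 cm depression at the patch centre
safe, hazard = classify_road_points(road)
```

In a full system the hazardous subset would then be shared over V2X and rendered as an AR warning for approaching vehicles.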

UPAT’s active participation extended to the International Conference on Intelligent Systems & Consciousness Society 2023 (ITS2023), hosted by the University of Patras on November 2-3, 2023. At this prestigious conference, the DIDYMOS-XR team presented a keynote address titled “Digital Twins and the City of the Future: Sensing, Reconstruction, and Rendering for Advanced Mobility.”

The core focus of the presented research is the development of an adaptive, scalable, Explainable Artificial Intelligence (XAI)-powered, interoperable processing and Augmented Reality (AR) rendering system, supported by credible reconstructed Digital Twins. The implementations are designed to meet users’ requirements, rendering explainable information in a personalized way that is non-distracting yet understandable by everyone. An AI pre-processing toolkit manages data processing tasks, including denoising, registration, completion, reconstruction, segmentation, and saliency estimation of the input received from the driver’s monitoring system, to create representative Digital Twins of the scene. This system performs extensive scene analysis and object recognition under diverse and dynamically changing conditions such as weather, lighting, and traffic. Furthermore, cooperative information from other connected vehicles, obtained through Vehicle-to-Everything (V2X) infrastructures, is evaluated to enhance the accuracy of the results.
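As a loose illustration of how such staged processing can be organized (the stage names and dummy transforms below are placeholders, not the toolkit's actual API), the tasks can be chained as a composition of point cloud transforms, each consuming and producing a cloud:

```python
from functools import reduce

def make_pipeline(*stages):
    """Chain processing stages; each stage maps a point cloud to a point cloud."""
    return lambda cloud: reduce(lambda acc, stage: stage(acc), stages, cloud)

# Dummy stages standing in for the toolkit's tasks (illustrative only):
def denoise(cloud):   # drop gross outliers
    return [p for p in cloud if abs(p[2]) < 10.0]

def register(cloud):  # align to a (hypothetical) reference frame
    return [(x + 1.0, y, z) for x, y, z in cloud]

def segment(cloud):   # keep above-ground points
    return [p for p in cloud if p[2] > 0.0]

preprocess = make_pipeline(denoise, register, segment)
cloud = [(0.0, 0.0, 1.0), (1.0, 0.0, -0.5), (2.0, 0.0, 99.0)]
result = preprocess(cloud)
```

Structuring the toolkit this way lets individual stages be swapped or reordered as conditions (weather, lighting, traffic) change, without touching the rest of the chain.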

Another noteworthy event in UPAT’s recent endeavours was their participation in the 20th EuroXR International Conference 2023, held at De Doelen, Rotterdam, on November 29 to December 1, 2023. The UPAT team presented a conference paper titled “Aggressive Salience-aware Point Cloud Compression.”

This paper introduces a geometry-based, end-to-end compression scheme designed for large point cloud scenes. The proposed method emphasizes the most visually significant parts of the point cloud and compresses each point’s position according to an extended saliency metric that combines the viewer’s relative position with geometric saliency. The quality reduction is confined to perceptually insignificant parts of the scene, so the rendered result retains a realistic sensation and the loss is not noticeable to the user, even at aggressive compression rates. Qualitative tests demonstrated that the quality of the reconstructed point clouds remains almost unaffected even at very low bit rates. This is particularly advantageous for highly demanding XR interactions and user-based content-adaptation rendering applications.
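The idea of spending bits where saliency is high can be sketched as follows. This is a simplified illustration under assumed parameters, not the paper's exact metric or codec: combined saliency mixes a given geometric saliency score with proximity to the viewer, and each point's coordinates are quantized with a bit depth proportional to that combined score:

```python
import numpy as np

def saliency_aware_quantize(points, geometric_saliency, viewer_pos,
                            min_bits=4, max_bits=12):
    """Quantize point positions with bit depth driven by combined saliency.

    points: (N, 3) array; geometric_saliency: (N,) scores in [0, 1].
    Salient / nearby points get more bits, i.e. finer quantization steps.
    The 50/50 blend of the two cues is an assumption for illustration.
    """
    dist = np.linalg.norm(points - viewer_pos, axis=1)
    proximity = 1.0 - (dist - dist.min()) / (np.ptp(dist) + 1e-9)
    combined = 0.5 * geometric_saliency + 0.5 * proximity
    bits = np.round(min_bits + combined * (max_bits - min_bits)).astype(int)
    # Uniform scalar quantization; each point uses its own step size.
    extent = np.ptp(points, axis=0).max() + 1e-9
    steps = extent / (2.0 ** bits)
    quantized = np.round(points / steps[:, None]) * steps[:, None]
    return quantized, bits

# A salient point at the viewer vs. a non-salient point 10 m away.
pts = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
sal = np.array([1.0, 0.0])
quantized, bits = saliency_aware_quantize(pts, sal,
                                          viewer_pos=np.array([0.0, 0.0, 0.0]))
```

The per-point bit budget (12 bits for the salient near point vs. 4 for the distant one here) is what keeps the quantization error concentrated in regions the viewer is unlikely to notice.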

These achievements underscore UPAT’s commitment to cutting-edge research and technological advancements in intelligent transportation systems and XR applications.

By Gerasimos Arvanitis and Konstantinos Moustakas