Dynamic Multi-modal Fusion for Biosignal-based Motion Sickness Prediction in Vehicles
View abstract on PubMed
Summary
This summary is machine-generated. Autonomous vehicles pose motion sickness challenges for passengers. A new dynamic multi-modal fusion framework classifies motion sickness by adaptively focusing on the most reliable biosignals during real-world driving.
Area Of Science
- Automotive Engineering
- Biomedical Engineering
- Human-Computer Interaction
Background
- Motion sickness (MS) is a significant challenge for autonomous vehicle (AV) adoption and passenger comfort.
- Traditional MS research methods fail to capture the complexity of real-world driving and lack robust analysis of noisy biosignal data.
- Multi-biosignal fusion offers a promising approach to capture complex physiological responses related to MS.
Purpose Of The Study
- To introduce a novel Dynamic Multi-modal Fusion framework for Motion Sickness classification (DMFMS).
- To address the limitations of traditional MS detection in noisy, real-world driving environments.
- To enhance the accuracy and reliability of MS prediction in autonomous vehicles.
Main Methods
- Developed a DMFMS framework incorporating confidence-aware learning for modality reliability estimation.
- Implemented a dynamic gating mechanism to adaptively weigh modality contributions.
- Integrated a spatial-temporal attention module (STAM) to focus on relevant biosignal information.
- Conducted experiments using a multi-biosignal dataset from 13 subjects in real driving scenarios.
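The confidence-aware gating described above can be illustrated with a minimal sketch: each biosignal modality is encoded into a feature vector, a confidence score is estimated per modality, and a softmax gate converts those scores into adaptive fusion weights. All function names and shapes here are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def confidence_gated_fusion(features, confidences):
    """Fuse per-modality feature vectors using gate weights derived from
    estimated modality confidences. Noisy (low-confidence) modalities
    receive smaller weights, so they contribute less to the fused vector."""
    gates = softmax(np.asarray(confidences, dtype=float))
    fused = sum(g * f for g, f in zip(gates, features))
    return fused, gates

# Toy example: three biosignal modalities (e.g., ECG, EDA, respiration),
# each already encoded into a 4-dim feature vector by some backbone.
feats = [np.array([1.0, 0.0, 0.0, 0.0]),
         np.array([0.0, 1.0, 0.0, 0.0]),
         np.array([0.0, 0.0, 1.0, 0.0])]
conf = [2.0, 0.5, 0.5]  # first modality judged most reliable
fused, gates = confidence_gated_fusion(feats, conf)
```

In this toy run the first modality's gate dominates, so the fused vector leans toward its features; in the paper's framework the confidence estimates are learned rather than hand-set.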
Main Results
- The DMFMS framework demonstrated superior performance compared to conventional MS prediction models.
- Adaptively focusing on informative samples while evaluating noise improved classification accuracy.
- Confidence-aware learning and dynamic gating effectively managed noisy multi-modal data.
- The STAM successfully filtered extraneous information, enhancing focus on critical biosignals.
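The filtering role of the STAM can be sketched as attention applied along both the channel ("spatial") and time axes of a biosignal window, down-weighting uninformative channels and time steps. This is a simplified stand-in, assuming a `(time, channels)` input layout; the paper's module is a learned neural component, not these hand-derived scores.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def spatial_temporal_attention(x):
    """Toy spatial-temporal attention for x with shape (T, C).
    Channel and time-step weights are derived from mean signal magnitude,
    so flat (uninformative) channels and quiet time steps are suppressed."""
    chan_w = softmax(np.abs(x).mean(axis=0))  # (C,) channel weights
    time_w = softmax(np.abs(x).mean(axis=1))  # (T,) time-step weights
    weighted = x * chan_w[None, :] * time_w[:, None]
    return weighted.sum(axis=0)  # attended feature vector, shape (C,)

# 5 time steps, 3 channels; only channel 2 carries a transient response.
x = np.zeros((5, 3))
x[:, 2] = [0.1, 0.2, 2.0, 0.3, 0.1]
feat = spatial_temporal_attention(x)
```

The attended output concentrates on the active channel and the time step where its response peaks, mirroring how the STAM is reported to filter extraneous information.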
Conclusions
- The proposed DMFMS framework represents a superior solution for detecting motion sickness in real-world autonomous driving.
- Dynamic multi-modal fusion effectively handles the complexities of biosignal data in noisy environments.
- This approach enhances passenger comfort and safety in autonomous vehicles.