Synchronising Physiological and Behavioural Sensors in a Driving Simulator

Published: 12 November 2014

Abstract

Accurate and noise-robust multimodal activity and mental state monitoring can be achieved by combining physiological, behavioural and environmental signals. This is especially promising in assistive driving technologies, because vehicles now ship with sensors ranging from wheel and pedal activity to voice and eye tracking. In practice, however, multimodal user studies are confronted with challenging data collection and synchronisation issues, due to the diversity of sensing, acquisition and storage systems. Referencing current research on cognitive load measurement in a driving simulator, this paper describes the steps we take to consistently collect and synchronise signals, using the Orbit Measurement Library (OML) framework combined with a multimodal version of a cinema clapperboard. The resulting data is automatically stored in a networked database, in a structured format, including metadata about the data and experiment. Moreover, fine-grained synchronisation between all signals is provided without additional hardware, and clock drift can be corrected post-hoc.
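
To make the post-hoc drift correction concrete, the following Python sketch is a hypothetical illustration, not the paper's OML-based implementation. It assumes each sensor stream has recorded the shared clapperboard events on its own clock, fits a linear drift model with numpy, and remaps that stream's timestamps onto the reference clock.

    # Hypothetical sketch of post-hoc clock-drift correction between two
    # sensor streams that both observed the shared clapperboard events.
    import numpy as np

    def fit_drift(ref_marks, sensor_marks):
        """Fit sensor_time ~= a * ref_time + b from paired clapperboard events."""
        a, b = np.polyfit(np.asarray(ref_marks), np.asarray(sensor_marks), 1)
        return a, b

    def to_reference_clock(sensor_times, a, b):
        """Map timestamps from the sensor's clock onto the reference clock."""
        return (np.asarray(sensor_times) - b) / a

    # Example: clapperboard events as seen by the reference clock and by an
    # eye tracker whose clock has a small offset and drift (times in seconds).
    ref_marks = [0.0, 60.0, 120.0, 180.0]
    eye_marks = [0.12, 60.13, 120.15, 180.16]

    a, b = fit_drift(ref_marks, eye_marks)
    print(to_reference_clock([30.14, 90.15], a, b))  # eye-tracker samples in reference time

With more than two streams, each stream would be mapped onto a common reference clock in the same way, using whichever marker events it shares with the reference.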


Cited By

  • Automated Time Synchronization of Cough Events from Multimodal Sensors in Mobile Devices. Proceedings of the 2020 International Conference on Multimodal Interaction, pp. 614-619, online publication date 21 October 2020. DOI: 10.1145/3382507.3418855


Published In

ICMI '14: Proceedings of the 16th International Conference on Multimodal Interaction
November 2014, 558 pages
ISBN: 9781450328852
DOI: 10.1145/2663204

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. cognitive load
  2. data collection and storage
  3. driving simulator
  4. multimodal data synchronisation
  5. physiological and behavioural sensors

Qualifiers

  • Poster

Conference

ICMI '14

Acceptance Rates

ICMI '14 Paper Acceptance Rate: 51 of 127 submissions, 40%
Overall Acceptance Rate: 453 of 1,080 submissions, 42%
