Research Article
Public Access

LineChaser: A Smartphone-Based Navigation System for Blind People to Stand in Lines

Published: 07 May 2021

Abstract

Standing in line is one of the most common social behaviors in public spaces, but it can be challenging for blind people. We propose an assistive system named LineChaser, which navigates a blind user to the end of a line and continuously reports the distance and direction to the last person in the line so that the user can follow them. LineChaser uses the smartphone's RGB camera to detect nearby pedestrians and its built-in infrared depth sensor to estimate their positions. From these position estimates, LineChaser determines whether nearby pedestrians are standing in line, and it uses audio and vibration signals to notify the user when to start and stop moving forward. In this way, users can stay correctly positioned in the line while maintaining social distance. We conducted a usability study with 12 blind participants. LineChaser allowed them to successfully navigate lines, significantly increasing their confidence in standing in lines.
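To make the cueing loop concrete, the following Swift sketch illustrates the start/stop behavior the abstract describes: given an estimated position for the last person in the line, compute distance and bearing, then decide whether the user should keep walking or stop. This is a minimal illustration under assumptions, not the paper's implementation; the 2-D position model, the stop-distance threshold, and all type and function names are hypothetical.

```swift
import Foundation

// Minimal sketch (assumptions, not the authors' code) of start/stop
// cueing: follow the last person in the line, report distance and
// direction, and stop when close enough to hold a socially
// distanced position.

/// Estimated pedestrian position in the user's local frame,
/// in metres: +x is straight ahead, +y is to the user's left.
struct PedestrianPosition {
    var x: Double
    var y: Double
}

enum Cue {
    case move(distanceMetres: Double, bearingDegrees: Double)
    case stop
}

/// Hypothetical threshold: the abstract does not state the system's
/// parameters, so 2.0 m is only a plausible social distance.
let stopDistance = 2.0

func cue(toLastPerson p: PedestrianPosition) -> Cue {
    let distance = hypot(p.x, p.y)
    if distance <= stopDistance {
        return .stop  // close enough: signal "stop" via vibration/audio
    }
    // Bearing relative to straight ahead; positive means "to your left".
    let bearing = atan2(p.y, p.x) * 180.0 / .pi
    return .move(distanceMetres: distance, bearingDegrees: bearing)
}

// Example: last person detected 3.4 m ahead, 0.5 m to the right.
switch cue(toLastPerson: PedestrianPosition(x: 3.4, y: -0.5)) {
case .move(let d, let b):
    print(String(format: "Move: %.1f m, bearing %.0f degrees", d, b))
case .stop:
    print("Stop")
}
```

A real implementation would feed this loop from per-frame pedestrian detections, with positions derived from the RGB camera and depth sensor as described above, and would likely smooth those estimates over time before issuing audio or vibration cues.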

Supplementary Material

VTT File (3411764.3445451_videofigurecaptions.vtt): Video figure captions
VTT File (3411764.3445451_videopreviewcaptions.vtt): Video preview captions
MP4 File (3411764.3445451_videofigure.mp4): Supplemental video
MP4 File (3411764.3445451_videopreview.mp4): Preview video





    Published In

    CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
    May 2021
10,862 pages
    ISBN:9781450380966
    DOI:10.1145/3411764
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 07 May 2021


    Author Tags

    1. line detection
    2. orientation and mobility
    3. pedestrian detection
    4. visual impairment

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • JST-Mirai Program
    • JSPS KAKENHI
    • Early Bird, Grant-in-Aid for Young Scientists

    Conference

    CHI '21

    Acceptance Rates

    Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


    Cited By

• (2024) AngleSizer: Enhancing Spatial Scale Perception for the Visually Impaired with an Interactive Smartphone Assistant. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(3), 1–31. https://doi.org/10.1145/3678525 (9 Sep 2024)
• (2024) Snap&Nav: Smartphone-based Indoor Navigation System For Blind People via Floor Map Analysis and Intersection Detection. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1–22. https://doi.org/10.1145/3676522 (24 Sep 2024)
• (2024) ChitChatGuide: Conversational Interaction Using Large Language Models for Assisting People with Visual Impairments to Explore a Shopping Mall. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1–25. https://doi.org/10.1145/3676492 (24 Sep 2024)
• (2024) Development and Evaluation of Collision Avoidance User Interface for Assistive Vision Impaired Navigation. Adjunct Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1–3. https://doi.org/10.1145/3672539.3686354 (13 Oct 2024)
• (2024) Dude, Where's My Luggage? An Autoethnographic Account of Airport Navigation by a Traveler with Residual Vision. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1–13. https://doi.org/10.1145/3663548.3675624 (27 Oct 2024)
• (2024) WorldScribe: Towards Context-Aware Live Visual Descriptions. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1–18. https://doi.org/10.1145/3654777.3676375 (13 Oct 2024)
• (2024) FetchAid: Making Parcel Lockers More Accessible to Blind and Low Vision People With Deep-learning Enhanced Touchscreen Guidance, Error-Recovery Mechanism, and AR-based Search Support. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–15. https://doi.org/10.1145/3613904.3642213 (11 May 2024)
• (2024) Cultivating Navigational Autonomy in the Visually Impaired: A Novel Approach with VirtualEYE. 2024 IEEE Conference on Artificial Intelligence (CAI), 408–413. https://doi.org/10.1109/CAI59869.2024.00081 (25 Jun 2024)
• (2024) Gaze-Led Audio Description (GLAD). Concept and Application to Accessibility of Architectural Heritage. Transforming Media Accessibility in Europe, 53–72. https://doi.org/10.1007/978-3-031-60049-4_4 (20 Aug 2024)
• (2023) Grid Map Correction for Fall Risk Alert System Using Smartphone. Journal of Robotics and Mechatronics 35(3), 867–878. https://doi.org/10.20965/jrm.2023.p0867 (20 Jun 2023)
