VR-based gesture elicitation for user interfaces with low vision

K. Kalaiselvi, R. Bhuvaneswari, R. Rekha, T. Rajesh Kumar, Rishav Banerjee, Omang Baheti

Abstract


User interfaces (UI) and menus in virtual reality (VR) frequently replicate traditional computer and smartphone UIs and are not designed with individuals with low vision in mind, as they demand accurate pointing and good eyesight for effective engagement. Gestures can be recommended as an alternative method of interacting with such UIs. To test this hypothesis, gesture-based interaction is compared with the conventional point-and-click technique for changing system settings such as volume and brightness and for window manipulation. Leveraging gestures can improve accessibility, spatial awareness, and precision for those with low vision while lowering cognitive load and enhancing immersion for all users. The objective of this research work is to explore a framework for gesture elicitation in VR environments for users with low vision. It proposes the use of gestures as a more effective and immersive means of interacting with menus, which will not only enhance the experience of sighted VR users but also drastically reduce the friction experienced by those with visual impairments. User studies demonstrate a noticeable improvement in the aforementioned areas, with faster task completion times, greater immersion, and higher user satisfaction.
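The interaction model described above, where recognized gestures adjust system settings such as volume and brightness instead of point-and-click widgets, can be sketched minimally as a gesture-to-action dispatch. This is an illustrative sketch only, not the paper's implementation: the gesture names, step sizes, and 0-100 setting ranges are assumptions.

```python
# Illustrative sketch (not from the paper): dispatching recognized hand
# gestures to system-setting adjustments, mirroring the study's tasks
# (volume, brightness). Gesture names and step sizes are hypothetical.

def apply_gesture(settings, gesture):
    """Return a new settings dict after applying one recognized gesture."""
    actions = {
        "swipe_up":    ("volume", +10),
        "swipe_down":  ("volume", -10),
        "pinch_open":  ("brightness", +10),
        "pinch_close": ("brightness", -10),
    }
    if gesture not in actions:
        return settings  # unrecognized gestures leave settings unchanged
    key, delta = actions[gesture]
    updated = dict(settings)
    # clamp each setting to the assumed 0-100 range
    updated[key] = max(0, min(100, updated[key] + delta))
    return updated

settings = {"volume": 50, "brightness": 50}
for g in ["swipe_up", "swipe_up", "pinch_close"]:
    settings = apply_gesture(settings, g)
print(settings)  # {'volume': 70, 'brightness': 40}
```

A real system would replace the gesture strings with the output of a hand-tracking recognizer (e.g. pinch or swipe detection from tracked joint positions), but the dispatch-and-clamp structure stays the same.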


Keywords


virtual reality; gesture elicitation; accessibility


DOI: https://doi.org/10.32629/jai.v7i5.1056


Copyright (c) 2024 K. Kalaiselvi, R. Bhuvaneswari, R. Rekha, T. Rajesh Kumar, Rishav Banerjee, Omang Baheti

License URL: https://creativecommons.org/licenses/by-nc/4.0/