Multimodal Interaction Techniques for Disabled Developers

Paudyal, Bharat (2023) Multimodal Interaction Techniques for Disabled Developers. Doctoral thesis, Birmingham City University.

Bharat Paudyal PhD Thesis published_Final version_Submitted Mar 2023_Final Award Mar 2024.pdf - Accepted Version (3MB)

Abstract

Technologies such as speech recognition, eye tracking, and mechanical switches present alternative opportunities to make coding more accessible for people with physical impairments. These technologies have previously been explored independently to assess their potential for supporting development work, although each input method exhibits unique strengths and challenges. A multimodal approach utilising different combinations of these technologies holds significant potential to address the individual limitations of each technology. However, there has been a lack of research to date investigating how these input approaches can be combined and the extent to which they can support inclusive coding experiences for people with physical impairments.

To address the limited work in this area, three independent research studies were conducted investigating multimodal input approaches for disabled developers. The first study focused on developing a research prototype utilising a combination of speech, gaze and mechanical switches to support writing and editing syntax. A user evaluation with 29 non-disabled developers found that the system was perceived positively in terms of usability and facilitated the successful completion of common coding activities. A follow-up study with five developers who have physical impairments validated that this target audience could successfully utilise the multimodal approach to complete standard coding tasks.

The second study investigated the usability and feasibility of different multimodal voice coding approaches (i.e. natural language and fixed commands) in conjunction with a mechanical switch to support coding activities. A comparative study with 25 non-disabled developers found that both approaches demonstrated similar levels of efficacy and usability, although participants highlighted significant potential in terms of natural language coding. This approach was therefore developed further and evaluated within a multi-session study with five developers who have physical impairments. Results validated the feasibility of the multimodal natural language approach to support developers in successfully completing coding activities.

The final study investigated the efficacy of tailoring code navigation features commonly used within mainstream development environments (i.e. “Find by Reference”, “Go to Definition”, and “Find”) for multimodal voice and mechanical switch interaction. A user evaluation with 14 developers who have physical impairments highlighted that the code navigation approaches were efficient to use and demonstrated a high level of usability.

The contributions presented in this thesis highlight how the combination of different alternative input methods can provide more inclusive coding experiences for developers with physical impairments.

Item Type: Thesis (Doctoral)
Dates:
Submitted: 31 March 2023
Accepted: 15 March 2023
Uncontrolled Keywords: Voice coding, assistive technology, coding tools, multimodality, interaction design, human-centric design, eye tracking
Subjects: CAH10 - engineering and technology > CAH10-03 - materials and technology > CAH10-03-02 - materials technology
CAH11 - computing > CAH11-01 - computing > CAH11-01-01 - computer science
Divisions: Doctoral Research College > Doctoral Theses Collection
Faculty of Computing, Engineering and the Built Environment > College of Computing
Depositing User: Jaycie Carter
Date Deposited: 08 May 2024 10:06
Last Modified: 19 Jun 2024 12:12
URI: https://www.open-access.bcu.ac.uk/id/eprint/15342
