Serious Gaming for Learning the Intuitive, Non-Natural Control of Prosthetic Hands

Morten Kristoffersen

    Research output: Thesis › Thesis fully internal (DIV)


    Abstract

    When someone loses their hand, they are often given a prosthetic hand to restore function. Most prosthetic hands can only open and close and are controlled using two sensors that detect the weak electrical signal generated when muscles in the stump contract. The most advanced prosthetic hands have more functions and are controlled using machine learning (artificial intelligence). Machine learning techniques use many sensors to detect patterns of electrical activity in the stump and determine the intent of the user. Unfortunately, the patterns for different contractions are often too similar, which means that the user's intent cannot be determined. However, users can train with a coach to learn how to perform the contractions in such a way that the patterns become distinct.
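    To make the pattern-recognition idea concrete, here is a minimal sketch in Python. It is not the method used in the thesis: the channel count, window length, grip set, feature choice (mean absolute value), and classifier (linear discriminant analysis, a common choice for EMG control) are all illustrative assumptions, and synthetic noise stands in for real EMG recordings.

    ```python
    # Sketch: multi-channel EMG pattern recognition for prosthetic control.
    # All names and parameters below are hypothetical stand-ins.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    N_CHANNELS = 8        # sensors on the stump (assumed count)
    WINDOW = 200          # samples per analysis window (assumed)
    GRIPS = ["rest", "open", "close", "pinch"]  # hypothetical intents

    def mav_features(window: np.ndarray) -> np.ndarray:
        """Mean absolute value per channel: a simple, common EMG feature."""
        return np.abs(window).mean(axis=0)

    def synthetic_window(grip_idx: int) -> np.ndarray:
        """Stand-in for a recorded EMG window: each intent activates the
        channels with a different amplitude profile, plus noise."""
        profile = np.roll(np.linspace(1.0, 0.2, N_CHANNELS), grip_idx * 2)
        return rng.normal(0.0, 1.0, (WINDOW, N_CHANNELS)) * profile

    # Build a labelled training set: feature vectors -> intended grip.
    X = np.array([mav_features(synthetic_window(g))
                  for g in range(len(GRIPS)) for _ in range(50)])
    y = np.repeat(np.arange(len(GRIPS)), 50)

    clf = LinearDiscriminantAnalysis().fit(X, y)

    # At run time, each new window is classified into an intended grip.
    test = mav_features(synthetic_window(2))
    print("Predicted intent:", GRIPS[clf.predict([test])[0]])
    ```

    If the feature clusters for two intents overlap (the "too similar" problem described above), the classifier confuses them, which is exactly what user training aims to prevent.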
    My PhD research explores user training, with an emphasis on serious games. A serious game teaches the user a new skill through play. The benefit of using serious games is that users can train at home, and they might be more effective than training with a coach. The main results of my research are that a coach is not needed for training and that serious games can train users to generate more distinct patterns than coaching can. However, I also found that neither coaching nor serious game training led to improvements when the effect of training on prosthetic control performance was evaluated. These results highlight the need for more research into user training for controlling prosthetic hands.
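    The claim that training makes patterns "more distinct" can be quantified with a cluster-separability measure. The thesis does not prescribe the metric below; the silhouette score is one illustrative choice, shown here on synthetic pre- and post-training feature clusters.

    ```python
    # Sketch: quantifying how separable contraction patterns are, using
    # scikit-learn's silhouette score. Data below is entirely synthetic.
    import numpy as np
    from sklearn.metrics import silhouette_score

    rng = np.random.default_rng(1)

    # Two hypothetical intents whose EMG feature clusters either overlap
    # (before training) or are well separated (after training).
    pre  = np.vstack([rng.normal(0.0, 1.0, (50, 8)),
                      rng.normal(0.5, 1.0, (50, 8))])   # similar patterns
    post = np.vstack([rng.normal(0.0, 1.0, (50, 8)),
                      rng.normal(3.0, 1.0, (50, 8))])   # distinct patterns
    labels = np.repeat([0, 1], 50)

    # Scores near 1 mean well-separated clusters; near 0 means overlap,
    # i.e. the user's intent is hard to decode.
    print(f"Separability before training: {silhouette_score(pre, labels):.2f}")
    print(f"Separability after training:  {silhouette_score(post, labels):.2f}")
    ```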
    Original language: English
    Qualification: Doctor of Philosophy
    Awarding Institution
    • University of Groningen
    Supervisors/Advisors
    • van der Sluis, Corry, Supervisor
    • Bongers, Raoul, Co-supervisor
    • Murgia, Alessio, Co-supervisor
    Award date: 3 May 2021
    Place of Publication: [Groningen]
    Publication status: Published - 2021