ISSN: 2155-9538
Journal of Bioengineering & Biomedical Science

Using Augmented Reality Techniques to Simulate Myoelectric Upper Limb Prostheses

Edgard Afonso Lamounier Jr*, Kenedy Lopes, Alexandre Cardoso and Alcimar Barbosa Soares

Faculty of Electrical Engineering, Biomedical Engineering and Virtual Reality Group, Federal University of Uberlandia, Uberlandia-MG, Brazil

*Corresponding Author:
Edgard Afonso Lamounier Jr
Faculty of Electrical Engineering
Biomedical Engineering and Virtual Reality Group
Federal University of Uberlandia, Uberlandia-MG, Brazil
Tel: +55 34 3239 4701
Fax: +55 34 3239 4704
E-mail: [email protected]

Received Date: November 24, 2011; Accepted Date: February 14, 2012; Published Date: February 17, 2012

Citation: Lamounier EA Jr, Lopes K, Cardoso A, Soares AB (2012) Using Augmented Reality Techniques to Simulate Myoelectric Upper Limb Prostheses. J Bioeng Biomed Sci S1: 010. doi: 10.4172/2155-9538.S1-010

Copyright: © 2012 Lamounier EAJr, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Abstract

This article proposes the use of Augmented Reality (AR) techniques for the control and simulation of myoelectric prostheses. The system has been designed to reproduce the operation of a real prosthesis in an immersive AR environment, using a virtual device that operates in a similar fashion to the real one, resulting in a training environment for users and therapists. Motion and posture of the virtual prosthesis are controlled by EMG signals collected via surface electrodes and classified into four classes of movements. The results of tests with non-amputee volunteers show that the system is capable of generating the correct prosthesis motion and posture in the AR environment, in real time.

Keywords

Upper limb prostheses; ARToolKit; Augmented reality; Virtual reality

Introduction

This article proposes the use of Augmented Reality (AR) as an aid to help patients operate and properly control myoelectric upper limb prostheses. The overall aim of this work is to reproduce the operation of a real prosthesis in an immersive AR environment, using a virtual device that operates in a similar fashion to the real one, resulting in a training environment for users and therapists. Also, since real upper limb prostheses are relatively heavy and can become uncomfortable and cumbersome, especially during the first stages of fitting, the use of a virtually weightless and fully controllable device can help reduce the considerable physical and mental effort usually required, especially in the first trials. Furthermore, the virtual device can easily be programmed to act according to the user's ability.

For many years, great effort has been devoted to the search for better strategies for controlling artificial limbs [1-7]. One of the major challenges is to produce devices that perfectly mimic their natural counterparts. A very popular approach to prosthesis control is based on the use of EMG signals (the electrical manifestation of the neuromuscular activation associated with a contracting muscle), collected from remnant muscles, to generate control inputs. Since those devices, also known as myoelectric prostheses, use a biological signal to control their movements, it might be expected that they should be much easier to operate. However, that is not always the case. In fact, as described in [8], prosthesis control is very unnatural and requires a great mental effort, especially during the first months after fitting. As a result, a significant number of patients give up using those devices very early. Hence, it is important to devise new strategies for control and also for training new users.

Augmented Reality has been identified in a variety of medical areas as an important aiding tool [9-11], capable of turning difficult tasks into more manageable ones, such as image-guided surgery, telemedicine, medical training and surgical planning (see Figure 1 for an example).

Figure 1: Computed Tomography (CT) image associated with Augmented Reality (AR) (extracted from [9]).

In this sense, this work has emerged as an attempt to devise more suitable strategies to be used in those critical initial months after fitting of myoelectric prostheses. To that end, the research group has, over the years, developed new methods for the detection and processing of EMG signals in order to extract the commands issued by the user which, in turn, can be used to control the movements of a device in a virtual reality (VR) environment. In fact, this work is an extension of previous work on the use of VR systems for prosthesis control [8,12]. However, although a purely non-immersive VR environment showed good results, it was thought that an Augmented Reality environment would provide a more realistic experience. In other words, it is expected to provide the user with a more natural and intuitive training environment, through AR immersion experiences, which can be achieved by using reliable techniques for capturing and processing biological EMG signals.

Materials and Methods

Underlying works

A set of Virtual Reality techniques to create and simulate a skeletal model of the human upper limbs was proposed in [13]. The authors developed a system in which the forces generated by the muscles are parameterized and fed into biomechanics equations to produce limb motion animation. MATLAB® is used to solve the equations (Figure 2). Three degrees of freedom are possible: arm flexion/extension, arm adduction/abduction and forearm flexion/extension. Although virtual simulation is achieved, capture and classification of EMG signals and Augmented Reality techniques are not supported.
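As a rough illustration of that idea (and not the actual biomechanical equations of [13]), the sketch below advances a single degree of freedom by converting two antagonistic muscle forces into a joint torque and integrating simple joint dynamics; all parameter values are assumptions.

```python
# Illustrative sketch of force-driven limb animation for a single degree of freedom
# (e.g. forearm flexion/extension). This is NOT the biomechanical model of [13];
# the moment arm, inertia, damping and time step are arbitrary assumed values.

def step_joint(angle, velocity, flexor_force, extensor_force,
               moment_arm=0.03, inertia=0.05, damping=0.5, dt=0.01):
    """Advance the joint state (angle in rad, velocity in rad/s) by one time step."""
    torque = moment_arm * (flexor_force - extensor_force)  # net torque from the two muscle forces
    accel = (torque - damping * velocity) / inertia        # simple second-order joint dynamics
    velocity += accel * dt
    angle += velocity * dt
    return angle, velocity

# Example: a constant flexor force gradually flexes the joint
angle, velocity = 0.0, 0.0
for _ in range(100):
    angle, velocity = step_joint(angle, velocity, flexor_force=10.0, extensor_force=0.0)
```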

Figure 2: Human upper limb skeleton in a VRML viewer (extracted from [11]).

The interface developed in [14], called ROAD, uses a set of sensors known as force-sensing resistors (FSRs), whose resistance decreases as force is applied. The sensors are fixed inside a data sleeve with adjustment straps to fit the user's upper limb. The objective is to detect superficial forearm muscle activity. The authors claim that the system can be used for physiotherapeutic rehabilitation or as a supporting tool for prosthesis adaptation. The computer program, written in LabVIEW®, works in two separate stages: training and operation. During training, specific actions must be executed using multiple repetitions of three different movements. During normal operation, the user's action is detected by the sensors and filtered to obtain the control inputs and generate the required movement, which is reproduced by a virtual limb created as part of a training or rehabilitation environment, with suggested exercises, as illustrated in Figure 3. The authors showed that the system helps reduce the time needed for training and adaptation to a real prosthesis. Nevertheless, the training and operation phases are performed only within a virtual environment, and EMG signals are not analyzed.

Figure 3: Rehabilitation exercises proposed by the ROAD system: (a) Pick-and-place (b) Pegboard (extracted from [12]).

Smith et al. [15] proposed combining continuous decoding of finger position, based on EMG signals, with a virtual prosthesis to evaluate controllability in an online setting. In particular, they investigated whether intact-limbed subjects could exert control over individual fingers of this virtual prosthesis in target-touching tasks that require both active movement and sustained contractions at various locations in the flexion space of the fingers (Figure 4).

Figure 4: Controlling a virtual prosthesis through the use of the CyberGlove, while EMG signals are simultaneously recorded and processed (extracted from [13]).

During the tests, subjects achieved overall accuracies (in hitting virtual targets) in the range of 81.56% to 94.06%, depending on whether performance was measured by target region or by target angle. Despite the success demonstrated in the experiments, the benefits of exploring Augmented Reality techniques for this kind of application are not investigated in that work.

As described earlier, this work is an extension of previous research by the authors on the use of VR systems for prosthesis control, to which Augmented Reality is added to provide the user with a more natural and intuitive training environment. For this new system, the authors used the techniques already developed as the basis for capturing and processing the EMG signals that control the movements of the virtual prosthesis in the AR environment. An overview of those methods is given next (for more details, please refer to [8]).

Figure 5 shows the basic architecture used for capturing and processing EMG signals in order to generate control inputs for a virtual device. The raw EMG signal, detected by surface electrodes, is amplified by a factor of 1000, band-pass filtered between 20 Hz and 1000 Hz to remove unwanted artifacts, and digitized at 16 bits with a 2 kHz sampling rate. However, in order to produce the proper movement of the virtual device, the EMG signal must be processed to determine which movement the user intends to perform. To do so, the areas of activity in the EMG data were detected (windowing) and the resulting signal was then processed to generate a set of features used by an artificial neural network to classify the EMG signal. Basically, each EMG contraction was represented by a set of Auto-Regressive (AR) coefficients calculated according to a modified algorithm, as proposed in [8]. This provided a fast, iterative way to estimate the parameters of the AR model adaptively and feed them as inputs to the neural network.

According to the authors, the choice of a neural network as classifier was due to its ability to learn and later recognize signals as belonging to the same class of movement in real time (delays much longer than 100 ms between the muscle contraction and the activation of the prosthesis are generally unacceptable). Also, depending on the level of amputation, different users may generate different levels of contraction of the remaining part of the limb for the same class of movement. Moreover, even if a single user performs only isometric or isotonic contractions, no two contractions for the same movement will be identical.

The neural network was trained with four classes of movements (elbow flexion, elbow extension, wrist pronation and wrist supination), with a group of 25 patterns per class. After successful training, the neural network was presented with a new set of EMG patterns and “asked” to determine which movement each one was related to. The results show near-perfect performance of the classifier (95% to 100% success rate) when using the described method. The authors believe that such performance was achieved due to the judicious process used to detect and acquire clean EMG signals, along with the improvements made to the estimation of the AR coefficients.
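To make the feature-extraction and classification steps concrete, the sketch below computes AR coefficients from windowed EMG bursts and feeds them to a small neural network. It is only an illustration, not the authors' implementation: [8] uses a modified adaptive AR estimator, whereas this sketch uses the standard Levinson-Durbin recursion, and the window length, AR model order and network size are assumed values.

```python
# Minimal sketch of EMG windowing, AR-coefficient feature extraction and classification.
# NOT the authors' pipeline: standard Levinson-Durbin is used in place of the modified
# adaptive estimator of [8]; window size, model order and network size are assumptions.

import numpy as np
from sklearn.neural_network import MLPClassifier  # stand-in for the authors' neural network

FS = 2000        # sampling rate used in the paper (2 kHz)
AR_ORDER = 4     # assumed AR model order
CLASSES = ["elbow flexion", "elbow extension", "wrist pronation", "wrist supination"]

def detect_activity(emg, threshold, win=256):
    """Crude 'windowing': return the windows whose RMS exceeds an activity threshold."""
    return [emg[i:i + win] for i in range(0, len(emg) - win, win)
            if np.sqrt(np.mean(emg[i:i + win] ** 2)) > threshold]

def ar_coefficients(window, order=AR_ORDER):
    """Estimate AR coefficients of one EMG window via the Levinson-Durbin recursion."""
    x = np.asarray(window, dtype=float)
    x = x - x.mean()
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) for k in range(order + 1)]) / n  # autocorrelation
    a = np.zeros(order + 1)
    a[0], err = 1.0, r[0]
    for k in range(1, order + 1):
        reflection = -(r[k] + np.dot(a[1:k], r[k - 1:0:-1])) / err
        a_prev = a.copy()
        a[k] = reflection
        for i in range(1, k):
            a[i] = a_prev[i] + reflection * a_prev[k - i]
        err *= 1.0 - reflection ** 2
    return a[1:]  # the AR coefficients form the feature vector

def train_classifier(contractions, labels):
    """Train on labelled contractions (25 patterns per class were used in the paper)."""
    features = np.array([ar_coefficients(c) for c in contractions])
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(features, labels)
    return clf

def classify(clf, window):
    """Return the movement class predicted for a new EMG window."""
    return CLASSES[int(clf.predict([ar_coefficients(window)])[0])]
```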

Figure 5: System architecture for capturing and processing EMG signal to control a virtual prosthesis (extracted from [10]).

The output of the neural network was then used as the control input to the virtual device, modeled with VRML (Virtual Reality Modeling Language) and Java, as shown in Figure 6.

Figure 6: Example of a virtual environment, where a virtual arm shows elbow extension (left) and elbow flexion (right) (extracted from [10]).

By carefully evaluating each of those systems, it is possible to conclude that none of them provides the level of integration between the user and the virtual environment needed to achieve a natural and intuitive training experience. Table 1 shows a comparison among them, taking into account some important features for a “true” myoelectric virtual prosthesis.

Features                        Soares et al. [8]   Delp and Loan [10]   Kuttuva et al. [14]   Smith et al. [15]
Other software dependent        Yes                 Yes                  Yes                   No
Real EMG signal for control     Yes                 No                   No                    Yes
Real-time processing            Yes                 Yes                  Yes                   Yes
Virtual reality techniques      Yes                 Yes                  Yes                   Yes
Augmented reality techniques    No                  No                   No                    No

Table 1: Comparison of virtual devices used for upper limb prosthesis simulation.

Next, the new approach proposed by the authors is described. We believe that the use of Augmented Reality, where the images of the virtual device are combined with the images of the real world, can provide a much more natural and realistic environment for training upper limb prosthesis users.

Using Augmented Reality (AR) for prosthesis simulation

Upper limb myoelectric prostheses rely upon signals detected by surface electrodes, which are placed in the socket (where the stump is fitted), to detect the muscle activity of a residual limb. The joints of the device (fingers, wrist, elbow etc.) are operated by small electric motors which, in turn, are controlled by a microprocessor or microcontroller (Figure 7). In so doing, it is expected that, as the user contracts remnant muscles, the prosthesis will react accordingly, performing the required task.
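As a toy illustration of this control chain (and not the firmware of any actual device), the snippet below maps each of the four movement classes used later in the paper onto a signed command for the corresponding joint; the joint names and the step size are assumptions.

```python
# Toy illustration of the control chain in Figure 7: a classified EMG movement is
# mapped onto a signed command for the corresponding joint. Joint names and the
# step size are assumptions, not parameters of a real prosthesis controller.

ELBOW_FLEXION, ELBOW_EXTENSION, WRIST_PRONATION, WRIST_SUPINATION = range(4)

CLASS_TO_COMMAND = {
    ELBOW_FLEXION:    ("elbow", +1),   # drive the elbow motor towards flexion
    ELBOW_EXTENSION:  ("elbow", -1),   # drive the elbow motor towards extension
    WRIST_PRONATION:  ("wrist", +1),   # rotate the wrist towards pronation
    WRIST_SUPINATION: ("wrist", -1),   # rotate the wrist towards supination
}

def actuate(joint_angles, movement_class, step_deg=5.0):
    """Advance the commanded joint by one step in the direction of the classified movement."""
    joint, direction = CLASS_TO_COMMAND[movement_class]
    joint_angles[joint] += direction * step_deg
    return joint_angles

# Example: two elbow-flexion contractions followed by one wrist-pronation contraction
angles = {"elbow": 0.0, "wrist": 0.0}
for movement in (ELBOW_FLEXION, ELBOW_FLEXION, WRIST_PRONATION):
    angles = actuate(angles, movement)
```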

Figure 7: The basis of a myoelectric prosthesis: EMG signals emanating from remnant muscles are used as the control input for the device.

Proposed architecture and experimental apparatus

Figure 8 shows the architecture proposed by the authors to generate a proper environment to control and simulate a virtual myoelectric prosthesis. In our system, the user is fitted with a head-mounted device that includes a camera, for capturing the real images from the user's viewpoint, and a display to show the mixed (augmented: real and virtual) images. The EMG signals are collected and processed, as described in [8], to generate inputs to the virtual reality unit. A processing center decides when to update both the virtual arm and the augmented reality images, and sends them to the graphical user interface (the head-mounted display).
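A minimal sketch of the update cycle implied by this architecture is given below. Every name in it (camera, emg_classifier, virtual_arm, display, find_shoulder_marker, composite) is a hypothetical placeholder used only to show how the blocks of Figure 8 interact; it is not the authors' implementation.

```python
# Minimal sketch of the AR update cycle implied by Figure 8. All names are
# hypothetical placeholders; this is an illustration, not the actual system.

def run_ar_loop(camera, emg_classifier, virtual_arm, display):
    while display.is_open():
        frame = camera.capture()                  # real-world image from the head-mounted camera
        movement = emg_classifier.poll()          # None, or one of the four movement classes
        if movement is not None:
            virtual_arm.apply_movement(movement)  # update the joint angles of the virtual prosthesis
        pose = find_shoulder_marker(frame)        # marker gives the anchor pose (see Figure 12)
        if pose is not None:
            overlay = virtual_arm.render(pose)    # render the prosthesis from the camera viewpoint
            frame = composite(frame, overlay)     # augmented image: real + virtual
        display.show(frame)                       # sent to the head-mounted display
```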

Figure 8: Proposed architecture for Augmented Reality upper-limb prostheses.

The 3D object (Figure 9) was modeled using 3D Studio Max® and includes all parts of a real device, with elements representing each of the joints of a full arm prosthesis (Figure 10). The model was generated in blocks and then exported to VRML97 for integration into the virtual and augmented environments (Figure 11).
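The block structure of the exported model lends itself to a simple joint hierarchy. The sketch below shows one way to compute the hand pose from the shoulder (marker) pose and the two joint angles used in this work, elbow flexion/extension and wrist pronation/supination; the segment lengths, rotation axes and offsets are illustrative assumptions, not the dimensions of the actual VRML97 model.

```python
# Forward-kinematics sketch of the virtual prosthesis joints (Figure 10). Segment
# lengths and axis choices are illustrative assumptions, not the real model's values.

import numpy as np

def rot_x(deg):
    """Homogeneous rotation about x (used here for elbow flexion/extension)."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1]], dtype=float)

def rot_y(deg):
    """Homogeneous rotation about y (used here for wrist pronation/supination)."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]], dtype=float)

def translate(x, y, z):
    t = np.eye(4)
    t[:3, 3] = [x, y, z]
    return t

def hand_pose(shoulder_pose, elbow_deg, wrist_deg, upper_arm=0.30, forearm=0.25):
    """Pose of the virtual hand given the shoulder (marker) pose and the two joint angles."""
    elbow = shoulder_pose @ translate(0, -upper_arm, 0) @ rot_x(elbow_deg)
    wrist = elbow @ translate(0, -forearm, 0) @ rot_y(wrist_deg)
    return wrist
```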

Figure 9: The proposed virtual prosthesis.

Figure 10: Degrees of freedom proposed for the virtual prosthesis.

Figure 11: The virtual prosthesis performs rigid-body transformations (movements) in a virtual environment.

During operation, the camera captures the image and locates a marker placed at the patient's shoulder. The algorithm then searches for the virtual object that corresponds to that marker (Figure 12) and inserts it into the real-world scene captured by the camera. The ARToolKit™ framework [16] was used to combine the virtual scenes generated by the computer with the real world observed by the user.
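The placement step amounts to a single change of coordinates: the tracker reports where the shoulder marker is relative to the camera, and the virtual arm is rendered at that pose plus a fixed offset. In the sketch below, the 3x4 marker-to-camera transform is assumed to come from the tracking layer (ARToolKit's C API exposes such a matrix); the offset value is an illustrative assumption.

```python
# Sketch of anchoring the virtual arm to the detected shoulder marker. The 3x4
# marker-to-camera transform is assumed to be provided by the tracking layer;
# the shoulder offset is an illustrative assumption.

import numpy as np

def place_virtual_arm(marker_to_camera_3x4, shoulder_offset=(0.0, 0.0, 0.05)):
    """Return the 4x4 camera-space pose at which the virtual prosthesis root is rendered."""
    marker_pose = np.vstack([marker_to_camera_3x4, [0.0, 0.0, 0.0, 1.0]])  # promote to 4x4
    offset = np.eye(4)
    offset[:3, 3] = shoulder_offset  # place the arm root slightly off the marker centre
    return marker_pose @ offset
```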

Figure 12: Marker placed at the user's shoulder. The marker is used by the system to decide where the virtual arm should be positioned when combining the virtual object with the real-world scene.

Results

The first set of experiments aimed to evaluate the response of the virtual device to EMG commands. As shown in Figure 13, the new virtual device responded as expected, executing a different movement (elbow flexion/extension or wrist pronation/supination) for each of the four classes of EMG contraction.

Figure 13: Responses of the virtual prosthesis to different EMG inputs.

Figure 14 shows the standard setup used to operate the Augmented Reality system. The user/volunteer is fitted with the head-mounted 3D display, and the marker is placed over the shoulder. The EMG surface electrodes must be positioned over the remnant muscles that will be used to generate the control inputs for the system. As described in [10,14], at least three EMG sites must be used to properly train the classifier and subsequently classify the required movement. In the experiments performed by the authors, EMG activity was collected from the long head of the biceps brachii, the long head of the triceps brachii and the lateral head of the triceps brachii.

Figure 14: The standard setup used to operate the proposed virtual myoelectric upper limb prosthesis in the Augmented Reality environment.

As shown in the block diagram of Figure 8, the system uses the outputs of the EMG data classifier to generate the prosthesis motion and posture in the virtual environment, which is combined with the real images to provide the Augmented Reality feedback. If necessary, the system can output either the standard VR environment (without real-world scenes, as shown in Figure 15) or the full AR image feedback, as shown in Figure 16.

Figure 15: User's point of view within the VR environment.

Figure 16: User's point of view within the AR environment.

Discussion and Conclusion

This paper proposes the use of Augmented Reality (AR) as an aid to help patients operate and properly control virtual myoelectric upper limb prostheses. The control inputs for the AR environment are generated from EMG signals, captured via surface electrodes and classified into four classes of movements. The results of tests with non-amputee volunteers show that the system is capable of generating the correct prosthesis motion and posture in the AR environment, in real time.

This first prototype has not yet been used in clinical trials. However, we are in the process of testing it with amputee volunteers. The results of those experiments and further adjustments to the system will be reported in a future publication.

Acknowledgement

The authors would like to express their gratitude to “Coordenação de Aperfeiçoamento de Pessoal de Nível Superior” (CAPES - Brazil), “Conselho Nacional de Desenvolvimento Científico e Tecnológico” (CNPq – Brazil) and “Fundação de Amparo à Pesquisa do Estado de Minas Gerais” (FAPEMIG – MG – Brazil) for the financial support.

References
