
Smart Prosthetics Prototype

Concept

One of the most important components of good design is a strong understanding of the context the user lives in. This is particularly important, and particularly challenging, in wearable technology, where contextual information can be delivered to a device that is effectively an extension of the user's body. Prosthetics have always required user input to control the device, but what if a prosthetic's behavior were influenced by contextual information from the user's environment? For this project, I designed a method of using NFC technology to trigger hand positions in a 3D printed prosthetic hand.

 

Over the past couple of years, Near Field Communication (NFC) technology has become standard in most smartphones. A basic application of NFC can be seen in Google Wallet and Apple Pay, which let people make purchases in stores by simply touching their phone to the checkout terminal.

The great thing about NFC is that the patch triggering a response requires no power of its own. These patches can already have tasks programmed wirelessly into their memory from a smartphone, triggering interactions on the phone when tapped (e.g., turn on Bluetooth and automatically pair to the car stereo). For my prosthetic hand scenario, the phone would program a hand position into an NFC patch, and the prosthetic hand could theoretically have a sensor that reacts automatically to the patch's program when the two are close.
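To make that interaction concrete, here is a minimal sketch of how the hand's side of the exchange might work, assuming an Arduino with an Adafruit PN532 NFC breakout and one hobby servo per finger. The pin assignments and the tag-to-position table are illustrative, and a real version would read the position out of the patch's memory rather than hard-coding it:

```cpp
// Minimal sketch: adopt a stored hand position when a known NFC patch
// is detected. Assumes an Adafruit PN532 breakout over I2C and one
// servo per finger; pins, UIDs, and angles are placeholder values.
#include <Wire.h>
#include <Adafruit_PN532.h>
#include <Servo.h>

#define PN532_IRQ   2
#define PN532_RESET 3
Adafruit_PN532 nfc(PN532_IRQ, PN532_RESET);

const int NUM_FINGERS = 5;
Servo fingers[NUM_FINGERS];
const int fingerPins[NUM_FINGERS] = {5, 6, 9, 10, 11};

// Hypothetical lookup: first UID byte of a patch -> servo angles (degrees).
struct HandPosition { uint8_t tagId; int angles[NUM_FINGERS]; };
const HandPosition positions[] = {
  {0x04, {170, 170, 170, 170, 170}},  // patch 1: full grip (water bottle)
  {0x08, { 60, 170, 170, 170,  40}},  // patch 2: pinch (keyboard)
};

void setup() {
  for (int i = 0; i < NUM_FINGERS; i++) fingers[i].attach(fingerPins[i]);
  nfc.begin();
  nfc.SAMConfig();  // put the PN532 into passive tag-reading mode
}

void loop() {
  uint8_t uid[7];
  uint8_t uidLength;
  // Waits for a tag to enter the field, then returns its UID.
  if (nfc.readPassiveTargetID(PN532_MIFARE_ISO14443A, uid, &uidLength)) {
    for (const HandPosition &p : positions) {
      if (p.tagId == uid[0]) {
        for (int i = 0; i < NUM_FINGERS; i++) fingers[i].write(p.angles[i]);
      }
    }
  }
}
```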

 

This addition to prosthetic devices could be cheaper to develop than prosthetic hands that respond to muscle and nerve stimuli. It wouldn't be as robust as those devices, but it has the potential to help users with daily tasks like opening doors at home, typing on a keyboard, and grabbing their favorite water bottle.

Video prototype of the NFC-activated prosthetic hand in action!

Design
3D Printed Hand

I first had to figure out how to obtain a 3D printed prosthetic hand. Through a lot of research into 3D printed prosthetic hands, I learned that there were many open-source models online with the files needed to print the parts. It looked simple enough, but after looking into the availability of 3D printers on campus and the possibility of botching the initial prints, I chose to contact a research group that had access to its own 3D printed prosthetic hands. The group let me borrow a couple of hands based on the open-source files for the Raptor Hand by e-Nable, which can be found here: http://enablingthefuture.org/upper-limb-prosthetics/the-raptor-hand/

Since the Raptor Hand is controlled manually by the user's own wrist movements, the user group that would benefit from it is people whose hands are partially developed past the wrist. The hand is limited to a simple grip, created by tension in the wires that run from the wrist down to the joints in the fingers. By loosening the Phillips screws that set the wire tension, I was able to create different hand positions for a behavioral prototype test.

 

Mobile App Prototype

In order to perform a full usability study on the entire system of programming the NFC patches and using them to trigger hand positions, I first needed to create a prototype of the mobile application that would be used to program the patches. These patches would have to be uniquely programmed to communicate with the user's prosthetic hand. I started with sketches showing the series of interactions the application requires to successfully program hand positions into each NFC patch. The prosthetic hand would first need to be registered with the mobile app; afterward, the user would be able to register multiple positions to the NFC patches. Each patch would have a unique ID associated with a hand position, and all of the hand positions would be listed on the application's main screen. Once the flow of interactions was understood, I moved from paper to digital by creating mockup wireframes in Axure.
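Although the Axure prototype was a non-functional wireframe, the data model behind this flow is simple enough to sketch. The structure below is my own illustration of the paired hand, its registered positions, and the patch IDs the app would list; the names and the 0-100 flexion scale are assumptions, not an actual implementation:

```cpp
// Illustrative data model for the registration flow: a paired hand plus
// a list of positions, each tied to a numbered NFC patch.
#include <array>
#include <string>
#include <vector>

struct RegisteredPosition {
  int patchId;                 // the number written on the laser-cut patch
  std::string label;           // e.g. "Water bottle grip"
  std::array<int, 5> fingers;  // per-finger flexion, 0 (open) to 100 (closed)
};

struct ProstheticHand {
  std::string deviceId;        // assigned when the hand is paired with the app
  std::vector<RegisteredPosition> positions;  // listed on the main screen

  // The main screen's large reset button maps to something like this,
  // commanding the hand back to its relaxed position.
  void reset() { /* send the relax command to the hand */ }
};
```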

 

As for the process of registering hand positions to the NFC patches, I initially intended the user to have full control over each finger, allowing full customization of the hand position. Since the project's timeframe was limited to one week, however, I used only a handful of images to represent different hand positions in the wireframe, which gave me more time to test the prototype's interactions and to create a video prototype of the scenarios where the system would work. For illustration purposes, I also created loading screens to mimic the time it would take to pair the prosthetic hand and program the NFC patches. At the bottom of the app's main screen is a large button that resets the prosthetic hand to a relaxed position. I imagined this would be used if the hand errored and locked in one position; ideally, this feature would also be available through a reset button on the hand itself.

 

Below are screenshots from the wireframing process of the mobile application, as well as a link to the interactive prototype!

Mobile Application Prototype: http://4mavkw.axshare.com/

Laser Cut NFC Patches

It is possible to buy NFC patches, but since they usually cost about a dollar each, I chose to draw up some quick sketches in Rhino and laser cut the patches from cardstock in the design lab. Each patch has a number associated with it, and upon creating a new position for the prosthetic hand, the user needs to enter the patch number in the provided text field. This helps the user know which patch is associated with which position when viewing the registered positions in the mobile application.

Testing & Reflection

For my user test scenario, I asked two users to go through the steps of pairing the application with the prosthetic hand, registering custom hand positions to the NFC patches, and activating the hand positions by waving the prosthetic hand over the patch they had programmed. When a participant reached the last task in the study, I used a screwdriver to adjust the tension and position of the fingers on the hand, which was noted as a disruption to the flow of the test.

Cutting out the NFC patches

From these tests, I wanted to see whether the users found all of the steps for registering positions intuitive, and whether they found the positions practical for the tasks they needed to do. Unfortunately, neither of my participants had a hand disability, meaning my results would not necessarily hold for users who actually use or need a prosthetic hand.

 

However, I still received some important feedback from the tests.

 

  1. Pairing hand to application - One of my participants noted that there should be a different way to pair the hand with the application (the current method requires the user to close and open the hand). This makes sense because the prosthetic hand is not necessarily supposed to be controlled manually.

  2. Locking and unlocking the position - Both participants commented on the process of unlocking the hand from the current position after activating it on an NFC patch. The prosthetic hand would need to know when it is appropriate to stay locked in a position and when to unlock. For example, the hand should know to stay locked while holding a water bottle, but it should also know when to unlock and release it.

  3. Further customization of hand positions - One participant wished there were more customization options when creating a position to register. This was understandable because the app only showed a small sample of possible hand positions. The participant also noted that it would help to be able to specify how tight a grip the hand should make.

 

Future Iterations

Although I did not get to test my product with a potential user, the feedback I received showed me ways to improve the product in higher-fidelity prototypes. The improvements noted below are still focused on iterations that do not require a fully functional 3D printed prosthetic hand equipped with servos, an Arduino, an NFC sensor, and the other components a complete device would need.

 

  1. Rather than having the hand pair to the application through movement of the hand, there should simply be a Bluetooth button on the prosthetic hand that pairs it to the application.

  2. The hand should have a way to recognize when an object requires it to hold its position until the appropriate time to release the grip. As an initial design, there could be an unlock button that the user presses with their other hand to release the grip activated by the NFC patch (see the sketch after this list).

  3. The position customization options in the current prototype get the basic point across, but there should be more specific customization of the individual fingers on the prosthetic hand. Along with this, it would be interesting to look into letting the user set how strong they want the grip to be for each position.
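To illustrate the first two iterations, here is a rough sketch of how the pairing and unlock buttons might behave on the hand's controller, in the same Arduino style as the earlier sketch. The pin numbers and helper functions are hypothetical placeholders:

```cpp
// Sketch of the behavior from items 1-2: a Bluetooth pairing button and
// an unlock button on the hand itself. Pins and helpers are placeholders;
// button debouncing is omitted for brevity.
const int PAIR_BUTTON_PIN   = 7;
const int UNLOCK_BUTTON_PIN = 8;
bool gripLocked = false;

bool tagDetected()       { return false; /* poll the NFC reader here    */ }
void moveToTagPosition() { /* drive servos to the patch's stored angles */ }
void relaxHand()         { /* return all fingers to the relaxed pose    */ }
void startPairing()      { /* make the hand discoverable over Bluetooth */ }

void setup() {
  pinMode(PAIR_BUTTON_PIN, INPUT_PULLUP);    // pressing pulls the pin LOW
  pinMode(UNLOCK_BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(PAIR_BUTTON_PIN) == LOW) {
    startPairing();      // item 1: pair via a button, not a hand movement
  }
  if (!gripLocked && tagDetected()) {
    moveToTagPosition();
    gripLocked = true;   // item 2: stay locked, e.g. while holding a bottle
  } else if (gripLocked && digitalRead(UNLOCK_BUTTON_PIN) == LOW) {
    relaxHand();         // the user's other hand releases the grip
    gripLocked = false;
  }
}
```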

     
