The Robotic Sixth Finger: a wearable extra limb to compensate hand function in chronic post-stroke patients


Fig. 1. The Robotic Sixth Finger concept. The device is worn like a bracelet and pops up when needed.

This post summarises our research on wearable extra fingers. We started investigating how to enhance the capability of the human hand by means of wearable robots in 2011 [1]. The goal was to augment the human hand with an additional robotic finger, as represented in Fig. 1. We first investigated the potential of extra fingers in healthy subjects. Such devices could allow humans to manipulate objects more efficiently, enhancing the grasping dexterity of the hand. The first prototype was presented in [2], together with several examples of extra-finger applications. Besides the design issues related to portability and wearability of the devices, another critical aspect was integrating the motion of the extra finger with that of the human hand. In [3], we presented a mapping algorithm able to transfer to the extra finger part or all of the motion of the human hand. A commercial dataglove was used to measure the hand configuration during a grasping task. A video is available here. Although this control approach guarantees reliable tracking of the human hand, there were two main drawbacks to be solved. First, the user lacked feedback on the robotic finger status and could only perceive the force


Fig. 2. The Robotic Sixth Finger together with the vibrotactile interface ring.

exerted by the device as mediated by the grasped object. The second problem concerned the approaching phase of the grasp: the algorithm presented in [3] considers the motion of the whole hand to compute the motion of the extra finger, limiting the user's ability to make fine adjustments that adapt the finger shape to that of the grasped object. In [4] we addressed these issues by introducing a vibrotactile interface that can be worn as a ring. Through this interface, the user receives information about the robotic finger status, both in terms of contact/no contact with the grasped object and in terms of the force exerted by the device. Regarding the grasp approaching phase, we introduced a new control strategy that enables the finger to autonomously adapt to the shape of the grasped object.
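The shape-adaptation idea can be illustrated with a minimal sketch (not the implementation from [4]): each joint flexes until a sensed quantity suggests contact, so the finger wraps around an object of unknown shape. The `Joint` class, the motor-current contact test, and all thresholds below are illustrative assumptions.

```python
class Joint:
    """Minimal simulated joint: flexing past a hidden contact angle
    raises the sensed motor current (a stand-in for touching the object)."""
    def __init__(self, contact_angle, max_angle=90.0):
        self.angle = 0.0
        self.max_angle = max_angle
        self._contact_angle = contact_angle  # simulated object surface

    def flex(self, step):
        self.angle = min(self.angle + step, self.max_angle)

    def motor_current(self):
        # Simulated sensing: current rises once the phalanx presses the object
        return 0.6 if self.angle >= self._contact_angle else 0.1

CONTACT_CURRENT = 0.4  # assumed current threshold (A) indicating contact

def adapt_to_object(joints, step=2.0):
    """Flex each joint in turn until its motor current suggests contact,
    so the finger conforms to an object of unknown shape."""
    for j in joints:
        while j.angle < j.max_angle and j.motor_current() <= CONTACT_CURRENT:
            j.flex(step)
    return [j.angle for j in joints]
```

Because each joint stops independently at contact, the final joint angles trace the object profile without any prior model of it, which is the behaviour the autonomous adaptation strategy aims for.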


Fig. 3. The Robotic Sixth Finger for hand grasping compensation in chronic stroke patients.

The experience gained with healthy subjects was fundamental for the development of the Robotic Sixth Finger for compensating hand function in chronic stroke patients. We proposed to use the Robotic Sixth Finger together with the paretic hand/arm to constrain the motion of the object. The device is worn on the user's forearm by means of an elastic band. The system acts like a two-finger gripper, where one finger is the Robotic Sixth Finger and the other is the patient's paretic limb. The patient regulates the finger flexion/extension through a wearable switch embedded in a ring worn on the healthy hand. Two predefined motions can be chosen, yielding either a precision or a power grasp. In addition to the switch, the ring interface also embeds a vibrotactile motor that provides the patient with information about the force exerted by the device. Preliminary results with patients are presented in [5] and a video is available here.
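The switch-driven control logic described above can be sketched as a simple toggle between flexion along a predefined trajectory and extension back to rest. This is a hypothetical illustration, not the device firmware; the class name, the joint targets in `PRESETS`, and the three-joint layout are all assumptions.

```python
# Assumed joint-angle targets (degrees) for the two predefined grasp motions
PRESETS = {
    "precision": [20.0, 40.0, 60.0],
    "power":     [45.0, 80.0, 90.0],
}

class SixthFingerController:
    """Toggle controller driven by the ring-mounted switch:
    one press flexes the finger to the chosen grasp, the next press releases."""
    def __init__(self, grasp_type="power"):
        self.targets = PRESETS[grasp_type]
        self.flexed = False

    def on_switch_press(self):
        """Return the joint-angle targets to command after a switch press."""
        self.flexed = not self.flexed
        return self.targets if self.flexed else [0.0, 0.0, 0.0]
```

Keeping the interface to a single switch is what makes the device operable with the healthy hand alone, which matters for patients with only one functional limb.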

Related publications

[1] O. A. Atassi, "Design of a robotic sixth finger for grasping enhancement," Master's thesis, Università degli Studi di Siena (advisor: Domenico Prattichizzo), December 2011.

[2] D. Prattichizzo, M. Malvezzi, I. Hussain, G. Salvietti. The Sixth-Finger: a Modular Extra-Finger to Enhance Human Hand Capabilities. In Proc. IEEE Int. Symp. in Robot and Human Interactive Communication, pages 993-998, Edinburgh, United Kingdom, August 2014.

[3] D. Prattichizzo, G. Salvietti, F. Chinello, M. Malvezzi. An Object-based Mapping Algorithm to Control Wearable Robotic Extra-Fingers. In Proc. IEEE/ASME Int. Conf. on Advanced Intelligent Mechatronics, pages 1563-1568, Besançon, France, July 2014.

[4] I. Hussain, L. Meli, C. Pacchierotti, G. Salvietti, D. Prattichizzo. Vibrotactile haptic feedback for intuitive control of robotic extra fingers. In Proc. IEEE World Haptics Conference (WHC), Chicago, IL, June 2015.

[5] I. Hussain, G. Salvietti, L. Meli, C. Pacchierotti, D. Prattichizzo. Using the robotic sixth finger and vibrotactile feedback for grasp compensation in chronic stroke patients. In Proc. IEEE/RAS-EMBS International Conference on Rehabilitation Robotics (ICORR), Singapore, Republic of Singapore, 2015. [Finalist for the Best Student Paper Award]

[6] D. Prattichizzo. The interplay between humans and robots in grasping. In Proc. International Symposium on Robotic Research, Sestri Levante, Italy, September 2015.

[7] I. Hussain, G. Salvietti, M. Malvezzi and D. Prattichizzo. Design guidelines for a wearable robotic extra-finger. In Proc. IEEE Int. Forum on Research and Technology for Society and Industry, Turin, Italy, September 2015.

Human Guidance with Wearable Haptics: the research so far in our SIRSLab

This post summarises our contributions to research on human guidance with wearable devices. One of our recent works, "Evaluation of a predictive approach in steering the human locomotion via haptic feedback" [1], has recently been accepted for publication in the proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2015). In this work, we model the human as a unicycle robot, following the work by Arechavaleta et al. [2], and we exploit a path-following control law [3] to generate appropriate haptic (vibrotactile) cues applied to the user. The task is to follow an ideal path of which the user has no explicit knowledge. This is just one example of applying wearable haptics, in this case with vibrotactile feedback, to human guidance.
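The core loop can be sketched as follows: a path-following law maps the unicycle's lateral and heading errors to a desired turn rate, which is then quantised into a discrete vibrotactile command (vibrate left, vibrate right, or stay silent). This is a hedged illustration of the idea, not the controller from [1]; the gains, the dead band, and the sign convention are assumptions.

```python
K_LAT, K_HEAD = 1.0, 0.8   # assumed feedback gains on lateral / heading error
DEAD_BAND = 0.2            # assumed: no vibration for small corrections (rad/s)

def steering_cue(lateral_error, heading_error):
    """Map path-following errors to a discrete vibrotactile command.

    Assumed convention: positive errors mean the walker has drifted to the
    left of the path, so the correction is a right turn.
    """
    omega = -(K_LAT * lateral_error + K_HEAD * heading_error)  # desired turn rate
    if omega > DEAD_BAND:
        return "left"    # vibrate the left side: turn left
    if omega < -DEAD_BAND:
        return "right"   # vibrate the right side: turn right
    return None          # on track: no cue
```

The dead band is what keeps the interface comfortable: continuous vibration would be ignored over time, so cues fire only when the deviation is worth correcting.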

This paper is an evolution of one of our previous works [4], in which we guided the human as part of a mixed human-robot formation, where the roles of leader and follower were, in some sense, blended. There, vibrotactile cues were used to tune the position of the human user so that she/he maintained a rigid formation with respect to the robot while it moved toward a target.

We also explored the idea of considering the human as the leader of a mixed human-robot team [5, 6]. In these cases, the haptic feedback was used to notify the user about violations of formation constraints, so that she/he could modify her/his pace and maintain the formation.
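The constraint-violation feedback in the human-leader case reduces to a simple check: the haptic interface stays silent while the formation constraint holds and fires a pace-correction cue only when it is violated. The sketch below is a hypothetical illustration; the desired distance, the tolerance, and the cue names are assumptions, not values from [5] or [6].

```python
D_DES = 1.5   # assumed desired human-robot distance (m)
TOL = 0.3     # assumed admissible deviation before a cue fires (m)

def formation_cue(distance_to_robot):
    """Return a pace-correction cue when the formation constraint is violated."""
    error = distance_to_robot - D_DES
    if error > TOL:
        return "slow_down"   # human has moved too far from the robot
    if error < -TOL:
        return "speed_up"    # human is too close: the robot is catching up
    return None              # constraint satisfied: stay silent
```

Notifying only on violations, rather than continuously, is what lets the human remain the leader: the robot never steers the person, it only signals when the team is about to break apart.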

Most of our results have received funding from the European Union Seventh Framework Programme FP7/2007-2013 under grant agreement n. 601165 of the project “WEARHAP – WEARable HAPtics for humans and robots” and under grant agreement n. 288917 of the project “DALi – Devices for Assisted Living”, and from the European Union’s Horizon 2020 research and innovation programme – Societal Challenge 1 (DG CONNECT/H) under grant agreement n. 643644 of the project “ACANTO: A CyberphysicAl social NeTwOrk using robot friends”.

Publications, videos, and PDFs are also available on our website.

[1] M. Aggravi, S. Scheggi, and D. Prattichizzo – Evaluation of a predictive approach in steering the human locomotion via haptic feedback – IROS, 2015

[2] G. Arechavaleta, J.-P. Laumond, H. Hicheur, and A. Berthoz – On the nonholonomic nature of human locomotion – Autonomous Robots, 2008

[3] C. Canudas de Wit, G. Bastin, and B. Siciliano – Theory of Robot Control – Chapter 9, Nonlinear Feedback Control

[4] S. Scheggi, M. Aggravi, F. Morbidi, and D. Prattichizzo – Cooperative human-robot haptic navigation – ICRA, 2014, [pdf]

[5] S. Scheggi, F. Morbidi, and D. Prattichizzo. Human-robot formation control via visual and vibrotactile haptic feedback – IEEE Trans. on Haptics, 2014, [pdf]

[6] S. Scheggi, F. Chinello, and D. Prattichizzo – Vibrotactile haptic feedback for human-robot interaction in leader-follower tasks – Proc. ACM Int. Conf. on PErvasive Technologies Related to Assistive Environments (PETRA '12), pages 1-4, 2012, [pdf]