New paper accepted for publication in the Proceedings of the 2016 IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2016)

The paper entitled "Haptic Wrist Guidance Using Vibrations for Human-Robot Teams" has been accepted for publication in the proceedings of the 2016 IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). In this work, we exploit haptic signals in a heterogeneous human-robot formation to guide the user's wrist towards a location desired by the robot. Possible applications of such a model include Urban Search and Rescue (USAR) situations, where robots may be able to enter disaster sites that are too dangerous or too difficult for humans to reach. Once there, robots can gather information about the situation, providing human operators with video feeds, maps, and sensor data.

Concept
Haptic Wrist Guidance Using Vibrations for Human-Robot Teams – Concept

The human operator is tracked by the robot and is equipped with a vibro-tactile armband that informs her/him about the desired location, guiding the user with vibrations based on the concept of the so-called virtual magnet.

In this work, we take advantage of a haptic armband with only four vibrating motors. Different vibration patterns are used to guide the motion of the human wrist. The use of a single armband guarantees high wearability and portability of the system, while reducing power consumption and increasing the overall battery autonomy. To assess the reliability of the guidance with a vibrating armband, we use an external tracking system instead of a camera on board the robot.

Armband
Haptic Wrist Guidance Using Vibrations for Human-Robot Teams – Haptic armband

The desired position is represented by a sphere of radius r in three-dimensional space. The proposed framework has been tested with seven subjects in two guidance scenarios: point-to-point and trajectory guidance. In the former evaluation, each participant was able to reach the eight desired wrist positions, whereas in the trajectory guidance, all participants were able to complete the task (tracing a vertical figure-eight in 3D space).
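To make the guidance mechanism more concrete, below is a minimal sketch of how a virtual-magnet-style mapping from the wrist-to-target direction to one of the four motors could look. The motor layout, the attractive convention, the stop radius, and the function names are illustrative assumptions, not the paper's actual control law.

```python
import numpy as np

# Hypothetical placement of the four vibration motors around the wrist,
# expressed as unit vectors in the armband frame (up/down/left/right are
# assumptions, not the actual layout used in the paper).
MOTOR_AXES = {
    "up":    np.array([0.0, 0.0, 1.0]),
    "down":  np.array([0.0, 0.0, -1.0]),
    "left":  np.array([0.0, 1.0, 0.0]),
    "right": np.array([0.0, -1.0, 0.0]),
}

def virtual_magnet_cue(wrist_pos, target_pos, r=0.05):
    """Return the motor to activate, or None once the wrist is inside the
    target sphere of radius r (positions in metres, armband frame)."""
    error = np.asarray(target_pos, dtype=float) - np.asarray(wrist_pos, dtype=float)
    dist = np.linalg.norm(error)
    if dist <= r:
        return None  # target reached: stop all vibrations
    direction = error / dist
    # "Attractive" convention: vibrate the motor on the side of the target,
    # pulling the wrist towards it like a virtual magnet.
    return max(MOTOR_AXES, key=lambda m: float(MOTOR_AXES[m] @ direction))

# Example: a target 20 cm above the current wrist position activates "up".
print(virtual_magnet_cue([0.0, 0.0, 0.0], [0.0, 0.0, 0.20]))
```

A real implementation would of course need the wrist pose from the tracking system expressed in the armband frame, and possibly richer vibration patterns than a single active motor.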

Results
Haptic Wrist Guidance Using Vibrations for Human-Robot Teams – Results

The proposed system can be exploited in human-robot teams. The haptic communication channel is an effective way to let a robot convey a target position or a desired path to a human teammate. We are currently integrating the tracking system into a mobile robot. We are also investigating solutions to suggest the orientation of the wrist in addition to the 3-D position.

M. Aggravi, G. Salvietti, D. Prattichizzo. Haptic Wrist Guidance Using Vibrations for Human-Robot Teams. In Proc. IEEE International Symposium on Robot and Human Interactive Communication. In Press, 2016.

Paper “Haptic assistive bracelets for blind skier guidance” accepted and presented at Augmented Human 2016

Blindness dramatically limits the quality of life of individuals and has profound implications for the person affected and for society as a whole. Physical mobility and exercise are strongly encouraged as ways to maintain health and well-being.
We introduce a novel use of haptic feedback in this context. In particular, the skier can receive directional information through two vibrating bracelets worn on the forearms. At the same time, the instructor can take advantage of his/her instrumented ski poles to pass information to the skier. Communication through haptic cues has been shown to be processed faster by the brain, demanding less cognitive effort than auditory cues.

Device

The visually impaired skier is provided with a pair of vibrotactile bracelets and a mobile computing device, e.g., a smartphone. The instructor is provided with a pair of augmented ski poles, each embedding an additional electronic device. Each augmenting device is composed of a microcontroller, a wireless communication antenna, a battery for powering the electronics, and a switch. Each pole is wirelessly connected to the mobile computing device worn by the blind skier. Pressing the switch mounted on the left/right pole triggers a signal aimed at activating the left/right vibrating bracelet worn by the blind skier. This triggering signal is sent over the wireless link to the mobile computing device on the blind skier, who must stay within 10 m of the ski instructor, a reliable operating distance for the wireless communication. Once the mobile computing device has received a triggering signal, it activates the corresponding vibrotactile bracelet. The bracelet keeps vibrating until the instructor presses the button again. This solution allows the instructor to regulate the duration of the haptic stimulus.
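The toggle behaviour described above can be illustrated with a minimal sketch of the logic the mobile computing device could run; the class and method names are hypothetical, and the actual firmware and radio protocol are not reproduced here.

```python
class BraceletController:
    """Minimal sketch of the toggle logic running on the skier's mobile device:
    the first trigger received from a pole starts the corresponding bracelet
    vibrating, and the next trigger from the same pole stops it."""

    def __init__(self):
        self.vibrating = {"left": False, "right": False}

    def on_trigger(self, side):
        """Called whenever a wireless trigger arrives from the left/right pole."""
        self.vibrating[side] = not self.vibrating[side]
        if self.vibrating[side]:
            self.start_vibration(side)
        else:
            self.stop_vibration(side)

    # Placeholders for the commands actually sent to the bracelets over radio.
    def start_vibration(self, side):
        print(f"{side} bracelet: vibration ON")

    def stop_vibration(self, side):
        print(f"{side} bracelet: vibration OFF")

# Example: two presses of the left-pole switch start and then stop the left
# bracelet, letting the instructor control the duration of the stimulus.
ctrl = BraceletController()
ctrl.on_trigger("left")
ctrl.on_trigger("left")
```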

Conference Website: http://www.augmented-human.com
 

PDF

M. Aggravi, G. Salvietti, D. Prattichizzo
“Haptic assistive bracelets for blind skier guidance”
Proceedings of the 7th Augmented Human International Conference 2016, ACM, 2016.
DOI: 10.1145/2875194.2875249

New paper published in Medical & Biological Engineering & Computing: "Hand–tool–tissue interaction forces in neurosurgery for haptic rendering"

Haptics provides sensory stimuli that represent the interaction with a virtual or tele-manipulated object, and it is considered a valuable navigation and manipulation tool during tele-operated surgical procedures. Haptic feedback can be provided to the user via cutaneous information and kinesthetic feedback.

 

Sensory subtraction removes the kinesthetic component of the haptic feedback, so that only the cutaneous component is provided to the user. Such a technique guarantees a stable haptic feedback loop while keeping the transparency of the tele-operation system high, which means that the system faithfully replicates and renders back the user's directives.

 

This work investigates whether the interaction forces during a bench-model neurosurgery operation lie within the range of the purely cutaneous perception of the human finger pads. If this assumption holds, it would be possible to exploit sensory subtraction techniques to provide surgeons with force feedback in neurosurgery. We measured the forces exerted on surgical tools by three neurosurgeons performing typical actions on a brain phantom, using contact force sensors, while the forces exerted by the tools on the phantom tissue were recorded using a load cell placed under the brain phantom box. The measured surgeon-tool contact forces were 0.01 – 3.49 N for the thumb and 0.01 – 6.6 N for the index and middle fingers, whereas the measured tool-tissue interaction forces were six to eleven times smaller than the contact forces, i.e., 0.01 – 0.59 N.

 

The measured contact forces fit within the range of cutaneous sensitivity of the human finger pad. Thus, in a tele-operated robotic neurosurgery scenario, it would be possible to render forces at the fingertip level by conveying haptic cues solely through the cutaneous channel of the surgeon's finger pads. This approach would allow high transparency and high stability of the haptic feedback loop in a tele-operation system.

 

PDF: http://sirslab.dii.unisi.it/papers/2015/Aggravi.MBEC.2015.Surgeons.pdf

M. Aggravi, E. De Momi, F. DiMeco, F. Cardinale, G. Casaceli, M. Riva, G. Ferrigno, D. Prattichizzo
“Hand-Tool-Tissue Interaction Forces in Neurosurgery for Haptic Rendering.”
Medical & Biological Engineering and Computing, Springer, 2015.
DOI: 10.1007/s11517-015-1439-8

Human Guidance with Wearable Haptics: the research so far in our SIRSLab

This post summarises our contributions to the research on human guidance with wearable devices. One of our recent works, "Evaluation of a predictive approach in steering the human locomotion via haptic feedback" [1], has been accepted for publication in the proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2015). In this work, we consider the human as a unicycle robot, following the work by Arechavaleta [2], and we exploit a path following control [3] to generate appropriate haptic (vibrotactile) cues to be applied to the user. The task is to follow ideal paths of which the user has no prior knowledge. This is just one example of the application of wearable haptics, in this case with vibrotactile feedback, to human guidance.
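As a rough illustration of this idea, the walking human can be modelled as a unicycle, and a left/right vibrotactile cue can be issued whenever the lateral deviation from the reference path grows too large. The dead-zone threshold and the cue convention below are our own illustrative assumptions, not the controller of [1] or [3].

```python
import numpy as np

def unicycle_step(x, y, theta, v, omega, dt):
    """Unicycle kinematics used to model the walking human:
    forward speed v, turning rate omega, integration step dt."""
    return (x + v * np.cos(theta) * dt,
            y + v * np.sin(theta) * dt,
            theta + omega * dt)

def path_following_cue(x, y, path_point, path_tangent, dead_zone=0.3):
    """Return 'left', 'right' or None: vibrate on the side the user should
    turn towards when the lateral deviation from the reference path exceeds
    a dead zone (0.3 m is an arbitrary illustrative threshold)."""
    px, py = path_point
    tx, ty = path_tangent                      # unit tangent of the path
    # Signed cross-track error: positive when the user is left of the path.
    e = (y - py) * tx - (x - px) * ty
    if abs(e) < dead_zone:
        return None                            # close enough: no cue
    return "right" if e > 0 else "left"
```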

This paper is an evolution of one of our previous works [4], in which we guided the human as part of a mixed human-robot formation, where the roles of leader and follower were, in some sense, blended. There, vibrotactile cues were used to tune the position of the human user, so that she/he maintained a rigid formation with respect to the robot while it moved toward a target.

We have also exploited the idea of considering the human as the leader of a mixed human-robot team [5, 6]. In these cases, the haptic feedback was used to notify the user about violations of the formation constraints, so that she/he could modify her/his pace and maintain the formation.

Most of our results have received funding from the European Union Seventh Framework Programme FP7/2007-2013 under grant agreement n. 601165 of the project “WEARHAP – WEARable HAPtics for humans and robots” and under grant agreement n. 288917 of the project “DALi – Devices for Assisted Living”, and from the European Union’s Horizon 2020 research and innovation programme – Societal Challenge 1 (DG CONNECT/H) under grant agreement n. 643644 of the project “ACANTO: A CyberphysicAl social NeTwOrk using robot friends”.

Publications, videos, and PDFs are also available on our website (sirslab.dii.unisi.it).

[1] M. Aggravi, S. Scheggi, and D. Prattichizzo – Evaluation of a predictive approach in steering the human locomotion via haptic feedback – IROS, 2015 [pdf]

[2] G. Arechavaleta, J.-P. Laumond, H. Hicheur, and A. Berthoz – On the nonholonomic nature of human locomotion – Autonomous Robots, 2008

[3] C. Canudas De Wit, G. Bastin, and B. Siciliano – Theory of robot control – Chapter 9 Nonlinear Feedback Control

[4] S. Scheggi, M. Aggravi, F. Morbidi, and D. Prattichizzo – Cooperative human-robot haptic navigation – ICRA, 2014, [pdf]

[5] S. Scheggi, F. Morbidi, and D. Prattichizzo – Human-robot formation control via visual and vibrotactile haptic feedback – IEEE Trans. on Haptics, 2014, [pdf]

[6] S. Scheggi, F. Chinello, and D. Prattichizzo – Vibrotactile haptic feedback for human-robot interaction in leader-follower tasks – Proc. ACM Int. Conf. on PErvasive Technologies Related to Assistive Environments (PETRA ’12), pages 1-4, 2012, [pdf]

New paper accepted for publication in the Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2015)

In this paper, we present a haptic guidance policy to steer the user along predefined paths, and we evaluate a predictive approach to compensate for the actuation delays that humans exhibit when guided along a given trajectory via sensory stimuli.

The proposed navigation policy exploits the nonholonomic nature of human locomotion along goal-directed paths, which leads to a very simple guidance mechanism. The proposed method has been evaluated in a real scenario in which seven human subjects were asked to walk along a set of predefined paths guided by vibrotactile cues. Their poses, as well as their distances from the path, were recorded using an accurate optical tracking system.
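A rough sketch of what such a predictive compensation could look like is given below; the 0.5 s reaction delay, the dead zone, and the decision rule are illustrative assumptions rather than the exact algorithm evaluated in the paper. The idea is to propagate the user's pose forward by the reaction delay with a unicycle model, and to decide the vibrotactile cue on the predicted pose instead of the measured one.

```python
import numpy as np

def predicted_lateral_error(x, y, theta, v, omega, path_point, path_tangent,
                            reaction_delay=0.5, dt=0.05):
    """Propagate the user's pose forward by an assumed human reaction delay
    using unicycle kinematics, then return the signed lateral distance of
    the predicted position from the reference path (positive = left)."""
    t = 0.0
    while t < reaction_delay:                  # simple forward integration
        x += v * np.cos(theta) * dt
        y += v * np.sin(theta) * dt
        theta += omega * dt
        t += dt
    px, py = path_point
    tx, ty = path_tangent                      # unit tangent of the path
    return (y - py) * tx - (x - px) * ty

def predictive_cue(x, y, theta, v, omega, path_point, path_tangent,
                   dead_zone=0.3, **kwargs):
    """Decide the left/right vibrotactile cue on the basis of the predicted
    lateral error rather than the current one."""
    e = predicted_lateral_error(x, y, theta, v, omega,
                                path_point, path_tangent, **kwargs)
    if abs(e) < dead_zone:
        return None
    return "right" if e > 0 else "left"
```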

Results revealed that the proposed haptic policy achieves an average error of 0.24 m, and that the predictive approach does not bring significant improvements to the path-following problem in terms of distance error. On the other hand, the predictive approach achieved a markedly lower activation time of the haptic interfaces.

M. Aggravi, S. Scheggi, D. Prattichizzo. Evaluation of a predictive approach in steering the human locomotion via haptic feedback. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems. In Press, 2015.

What they say about us

Some articles and videos that have appeared in the media:

2014 

July 18, 2014 – Il Sole 24 Ore – A Siena i robot salgono in cattedra e li programmano i bimbi della scuola materna.
July 12, 2014 – Il corriere della Sera/SCUOLA – In classe con la maestra e i robot.
July 1, 2014 – Il Corriere della sera.it – Siena, il bracciale per non vedenti che ha reso Massimo libero.
June 30, 2014 – l’Espresso – Toccami automa di Viola Bachini.
June 25, 2014 – RNext La repubblica delle idee – Tecnologia tattile e indossabile, così la robotica ci da una mano.
June 24, 2014 – La Repubblica.it (video) – Talk of Professor Prattichizzo @RNext Siena
June 10, 2014 – Il Corriere di Siena – Quando la robotica diventa educativa.
June 6, 2014 – La Stampa.it – Si sperimentano i robot con i più piccoli.
June 4, 2014 – Antenna Radio Esse – La robotica per l’apprendimento delle nuove tecnologie.
June 4, 2014 – SienaFree.it – La robotica per l’apprendimento delle nuove tecnologie.
June 4, 2014 – Ufficio Stampa UNISI – La robotica per l’apprendimento delle nuove tecnologie.
May 8, 2014 – Il Corriere della Calabria – Le emozioni a portata di mano.
March, 2014 – MICRON: the journal of ecology, science and knowledge – Che robot mi metto oggi.
March 31, 2014 – Rassegna stampa UNISI – L’arte di manipolare gli oggetti con le mani robotiche
February 24, 2014 – ilCittadino online – La robotica e le sue applicazioni mediche al TEDx Roma
February 22, 2014 – Wind Business Factor (video) – Interview to Prof. Domenico Prattichizzo by Wind Business Factory at TEDx Roma
February 15, 2014 – RAI3 TG Regione Toscana (video) – Robot da indossare
January 24, 2014 – SienaFree.it – Unione italiana ciechi la robotica un sogno realizzabile

2013 

December 13, 2013 – La Repubblica – L’archivio delle carezze le nostre emozioni in un file
December 6, 2013 – La Repubblica Firenze – Il guanto Robot
September 26, 2013 – Il Corriere fiorentino – La ricerca fa show
May 28, 2013 – Appetizer TV program (video) – Intervista al Prof. Domenico Prattichizzo
May 13, 2013 – Il Sole 24 Ore/Eventi – Dai supercomputer alla robotica per comunicare con il tatto
May 8, 2013 – Il Gazzettino Senese – WEARHAP, il robot che si indossa
May 6, 2013 – La Repubblica.it/Affari e finanza – L’automa sarà presto indossabile
May 2, 2013 – Antenna Radio Esse (video) Intervista al Prof. Domenico Prattichizzo
April 30, 2013 – La Nazione – Siena – Progetto di Ingegneria ottiene fondi e crea posti di lavoro
April 29, 2013 – Ansa – Tocco a distanza, ateneo Siena capofila progetto Ue Siena,
April 29, 2013 – Siena Free – WEARHAP: sistemi robotici indossabili per uomini e robot Siena,
April 29, 2013 – C3T News (video) Intervista al Dr. Simone Rossi
April 29, 2013 – Il Cittadino Online – Wearhap: la rivoluzione delle tecnologie robotiche
April 29, 2013 – TGT Italia 7 (video) Intervista al Prof. Domenico Prattichizzo
April 29, 2013 – SienaNews.it – Sistemi robotici indossabili per uomini e robot: al via un grande progetto di ricerca europeo guidato dall’Università di Siena
April 29, 2013 – Meteoweb – Scienza: al via il progetto Wearhap, sistemi robotici indossabili anche per gli esseri umani

2012

May 10, 2012 – Il Corriere di Siena – La robotica a sostegno della riabilitazione motoria
May 10, 2012 – Sienanews.it – Corso di Robotica a Siena
May 09, 2012 – La Nazione Siena – Università di Siena, workshop internazionale di robotica

2010

May 24, 2010 – WIRED.com – Augmented Reality: Haptic Augmented Reality
March 23, 2010 – Sky TG24 – Intervista a “Io Reporter”
March 27, 2010 – Sky TG24 – Intervista a “Io Reporter”
May, 2010 – La Repubblica.it/Affari&Finanza – Il tocco esperto del medico arriva via Skype
March, 2010 – Il Corriere di Siena – Esperienze tattili a distanza
March, 2010 – Goveroitaliano.it – Conferenza stampa del Ministro Brunetta

2009

June, 2009 – La Repubblica.it/Affari&Finanza – Il made in Italy dei Robot in cima al podio mondiale
May 29, 2009 – Reuters Italia – A Professore senese premio internazionale di robotica in Giappone
May 30, 2009 – Corriere di Siena – Premio internazionale a Prattichizzo
January 19, 2009 – La Repubblica.it – Automi virtuali in second life

2007

May, 2007 – Teletruria.it – Intervista a Prof. Prattichizzo
May, 2007 – L’Arena, Il giornale di Verona – Mini robot nel cervello guidato dal chirurgo

2006

July, 2006 – Panorama – Muovi la mano con il pensiero

2005

July, 2005 – Panorama – Una carezza al feto
November, 2005 – La Repubblica.it – Si può toccare il feto in 3D grazie alle interfacce aptiche

2004

June, 2004 – Video – La Macchina del tempo