Courtesy IEEE World Haptics 2013, Daejeon Korea

Program

For the 2015 IEEE World Haptics Conference, we have made major changes to the program to better accommodate the number of paper and demo submissions, which has increased dramatically over the past several years.

Program at a glance

All times are Chicago local time (UTC-5h). Click sessions for more info.

Monday (6/22)
09:00 Morning Workshops and Tutorials
12:00 Lunch (provided)
13:30 Afternoon Workshops and Tutorials
17:00 Opening Reception (dinner served)

Tuesday (6/23)
Breakfast
08:45 McA: Welcome
09:00 McA: Haptic Science - Perception, Virtual Reality, and Human-Computer Interaction
10:15 Coffee Break
10:30 LRs: Demonstrations
12:00 Lunch (on your own)
13:30 McA: Plenary Session: Somatosensory Prosthetics
14:30 LRs: Demonstrations
16:00 McA: Texture / NwR: Sensors and Actuators
17:00 McA: WHC Retrospective
17:15 McA: Exhibitors
17:30 McA: Poster Teaser
18:00 Dinner (on your own)
19:30 Poster Setup
20:00 Poster Session at Orrington

Wednesday (6/24)
Breakfast
08:45 McA: Haptic Technology - Devices and Algorithms for Feedback and Exploration
10:15 Coffee Break
10:30 McA: Tactile Devices and Rendering / NwR: Perception - Softness and Size
12:00 Lunch (on your own)
13:30 McA: Plenary Session: Somatosensory Prosthetics
14:30 LRs: Demonstrations
16:00 McA: Recognition, Modeling, and Rendering / NwR: Perception - Applied
16:30 Walk to Boat Tour
17:00 Architectural Boat Tour
18:30 Buses to Banquet
19:00 Banquet at Planetarium
22:00 Buses Back

Thursday (6/25)
Breakfast
09:00 McA: Mid-Air and Wearable / NwR: Motor Control and Learning
10:15 Coffee Break
10:30 McA: Human-Computer Interaction / NwR: Dynamics and Interaction
12:00 Lunch (on your own)
13:30 McA: Perception - Weight, Vibration, Force, and Temperature / NwR: Teleoperation
14:45 Coffee
15:15 McA: Eurohaptics PhD Talk
15:45 McA: Early Career Talk
16:15 McA: Awards
18:00 Dinner (on your own)
20:00 "Panic at Cloud 9" @ Second City

Friday (6/26)
09:00 Northwestern Evanston Campus Lab Tours (continental breakfast)
11:00 Buses to University of Chicago
12:00 Bensmaia Lab Tour @ University of Chicago (lunch provided)
14:00 Buses to RIC
14:30 Rehabilitation Institute of Chicago (RIC) Tours
15:30 Walk to PTHMS
16:00 Physical Therapy and Human Movement Sciences (PTHMS) Lab Tours
16:30 Closing Reception
Legend

McA: McCormick Auditorium
NwR: Northwestern Room
LRs: Louis and Lake Rooms

Workshops & Tutorials

All workshops & tutorials will be held on Monday June 22, 2015 at Northwestern University.

Full Day

(09:00~17:00)

Organizers
Sliman Bensmaia, University of Chicago, USA

Speakers
Sliman Bensmaia, University of Chicago, USA
Ingvars Birznieks, University of New South Wales, Australia
Juan Huang, Johns Hopkins University, USA
Mitra Hartmann, Northwestern University, USA
Daniel Goldreich, McMaster University, Canada
Masashi Nakatani, Columbia University, USA
Daniel O’Connor, Johns Hopkins University, USA
Andrew Pruszynski, University of Western Ontario, Canada
Allan Smith, Université de Montréal, Canada
Stephen Helms Tillery, Arizona State University, USA
Johan Wessberg, University of Gothenburg, Sweden
Jeffrey Yau, Baylor College of Medicine, USA

Organizers
Pedro Lopes, Hasso Plattner Institut, Germany
Max Pfeiffer, University of Hannover, Germany
Michael Rohs, University of Hannover, Germany
Patrick Baudisch, Hasso Plattner Institut, Germany

Speakers
Pedro Lopes, Hasso Plattner Institut, Germany
Max Pfeiffer, University of Hannover, Germany

More info:
PDF
http://haptics15.plopes.org/
(electronic kits included)

Morning 

(9:00~12:00)

Organizers
Claudio Pacchierotti, Istituto Italiano di Tecnologia, Italy
Domenico Prattichizzo, University of Siena, Italy
Katherine J. Kuchenbecker, University of Pennsylvania, PA, USA

Speakers
Antonio Gangemi, University of Illinois Medical Center
Lawton Verner, Intuitive Surgical, Inc.
Cagatay Basdogan, Koc University
Claudio Pacchierotti, Istituto Italiano di Tecnologia, Italy
Domenico Prattichizzo, University of Siena, Italy
Allison M. Okamura, Stanford University
Dong-Soo Kwon, KAIST
Katherine J. Kuchenbecker, University of Pennsylvania, PA, USA

More info:
PDF
http://sirslab.diism.unisi.it/whc15-cutaneous-in-medicine/

Organizers
Roberta Klatzky, Carnegie Mellon University, USA
Astrid Kappers, VU Amsterdam, Netherlands

Speakers
Wouter Bergmann Tiest, VU University of Amsterdam, Netherlands
Knut Drewing, Giessen University, Germany
Martha Flanders, University of Minnesota, USA
Hilary Kalaghar, Drew University, USA (developmental)
Carla Pugh & Shlomi Laufer, University of Wisconsin, USA
Roberta Klatzky, Carnegie Mellon University, USA

More info:
PDF

Organizers
Oliver Schneider, UBC, Canada
Karon MacLean, UBC, Canada

Speakers
Ali Israr, Disney Research
David Birnbaum, Director, UX Design, Immersion
Sriram Subramanian, University of Bristol
Marianna Obrist, Department of Informatics, University of Sussex

More info:
PDF
http://oliverschneider.ca/HaXD

Organizers / Speakers
Yoshihiro Tanaka, Nagoya Institute of Technology
Junji Watanabe, NTT Communication Science Laboratories / Tokyo Institute of Technology
Shogo Okamoto, Nagoya University

More info:
PDF
http://yoshihiro.web.nitech.ac.jp/whc15_texture_workshop

Organizers
Michael Adams, University of Birmingham, UK
Jean-Louis Thonnard, Université Catholique de Louvain, Belgium
Vincent Hayward, Université Pierre et Marie Curie, France
Betty Semail, Université Lille I, France

Speakers
Johan Wessberg, University of Gothenburg, Sweden
Michael Adams, University of Birmingham, UK
Jean-Louis Thonnard, Université Catholique de Louvain, Belgium
Vincent Hayward, Université Pierre et Marie Curie, France
Betty Semail, Université Lille I, France

More info:
PDF

Afternoon

(13:30~17:00)

Organizers
Domenico Prattichizzo, University of Siena and IIT
Antonio Frisoli, Scuola Superiore Sant'Anna
Miguel Otaduy, Universidad Rey Juan Carlos

Speakers
Domenico Prattichizzo, University of Siena and IIT
Antonio Frisoli, Scuola Superiore Sant'Anna
Miguel Otaduy, Universidad Rey Juan Carlos
Vincent Hayward, UPMC
Marc Ernst, University of Bielefeld
Sandra Hirche, TUM
Antonio Bicchi, University of Pisa
One industrial speaker (to be confirmed)

More info:
PDF

Organizers
Hiroyuki Shinoda, the University of Tokyo
Ali Israr, Disney Research

Speakers
Jun Rekimoto, the University of Tokyo
Naotaka Fujii, Brain Science Institute RIKEN
Masahiko Inami, Keio University
Ali Israr, Disney Research
Hiroyuki Shinoda, the University of Tokyo

More info:
PDF
http://www.hapis.k.u-tokyo.ac.jp/public/HAA/

Organizers
David Abbink - Delft University of Technology, Netherlands
Astrid Kappers, VU Amsterdam, Netherlands
Frans van der Helm, Delft University of Technology, Netherlands

Speakers
David Abbink - Delft University of Technology, Netherlands
Astrid Kappers, VU Amsterdam, Netherlands
Winfred Mugge, Delft University of Technology, Netherlands
Tricia Gibo, Delft University of Technology, Netherlands
Jeroen Wildenbeest, Delft University of Technology, Netherlands

More info:
PDF

Organizers
Gavin Buckingham, Heriot-Watt University, UK

Speakers
Gavin Buckingham, Heriot-Watt University, UK
Lee Baugh, University of South Dakota, USA
Myrthe Plaisier, VU University Amsterdam, Netherlands
Joachim Hermsdörfer, Technical University Munich, Germany
Christian Wolf, Justus-Liebig-Universität Gießen, Germany
Vincent Hayward, Université Pierre et Marie Curie, France
Michael Parzuchowski, University of Social Sciences and Humanities in Sopot, Poland

More info:
PDF

Back to the Program ↑

Plenary Sessions

World Haptics 2015 will feature a series of plenary talks by four renowned neuroscience researchers on the topic of Somatosensory Prosthetics.

Tuesday Plenary Session: Somatosensory Prosthetics

Time: Tue, 13:30 ~ 14:30
Room: McCormick Auditorium
Session chair: Allison Okamura

Dustin J Tyler
Biomedical Engineering
Case Western Reserve University
Associate Director, Advanced Platform Technology Center (APT)
Louis Stokes Cleveland Department of Veterans Affairs Medical Center

Dustin J. Tyler received his Ph.D. in biomedical engineering from Case Western Reserve University in Cleveland, OH, and his B.S. in electrical engineering from Michigan Technological University in Houghton, MI. His primary appointment is associate professor of biomedical engineering at Case Western Reserve University. He is a principal investigator at the Louis Stokes Cleveland Department of Veterans Affairs Medical Center (LSCDVAMC) and Associate Director of the Advanced Platform Technology Center, a Department of Veterans Affairs Rehabilitation Research & Development Center of Excellence. Dr. Tyler is founder and President of Bear Software, LLC, which is currently developing neural stimulation devices for the management of dysphagia. He spent several years in industry commercializing neural prostheses for restoration of function in patients with spinal cord injury and stroke, leading research and development efforts as well as managing the development of clinical programming software for Class III medical devices. He is a member of the IEEE Engineering in Medicine and Biology Society, the Materials Research Society, the Biomedical Engineering Society, AAAS, the Society for Neuroscience, and Tau Beta Pi. In addition to serving as a reviewer for several journals, he has served as an associate editor of IEEE-TNSRE and is a founding and contributing associate editor of the journal Bioelectric Medicine. He has served on the organizing committees for symposia at the Materials Research Society Meeting and as session chair for multiple IEEE-EMBS and BMES conferences. He is the founding co-chair of the Cleveland Neural Engineering Workshop (Cleveland NEW); the third semi-annual Cleveland NEW will be held in June 2015.
Dr. Tyler's research interests include clinical implementation of neural interfaces; neural interfaces for restoration of sensation and control in limb loss; neuromimetic devices and materials; MEMS technology applied to the development of advanced neural devices; neuromodulation system development; computational neural modeling; neuroprostheses for restoration of lost function in physically intact but neurologically impaired individuals; and neuroprostheses for head and neck applications.

Abstract: One of the first biology lessons in grade school is of the five senses: sight, sound, smell, taste, and touch. They are our only connection to the world and people around us. Of the five senses, touch is the one we take most for granted, and the one whose loss has the least understood, most devastating impact. The sense of touch comes from the skin, the largest sensory organ in the human body. For nearly 2 million people in the US, and 185,000 more each year, the loss of sensation is one of the most significant effects of limb loss resulting from trauma or vascular disease. Body-powered prostheses are often preferred over more functional powered devices because the user can "feel" the pressure of a grip through a requisite body harness. Providing representative sensory information on the residual limb is unnatural and does not directly match the sensory locations expected from the user's visual experience of the prosthesis. We have addressed these challenges with permanently implanted, multi-contact nerve cuff electrodes on the residual ulnar, radial, and median nerves of subjects with limb loss. These electrodes directly and selectively activate the peripheral neural pathways, and hence all upstream pathways, normally responsible for sensation. Ninety percent of the channels produce physically unique locations of sensation, distributed around the hand and wrist. The quality of the sensation is controlled by patterning the stimulation intensity; the pulse patterns are critical to the brain's interpretation of the sensation. Varying the pattern can produce multiple different sensations at a common location. The addition of sensation during tasks improves the user's fine motor control with standard myoelectric prostheses. The system has been implanted and stable for three years. Users report feeling their hand - the missing hand - when touching and manipulating objects.
Restoring feeling has allowed individuals to "feel [my] hand for the first time since the accident" and "feel [my] wife touch my hand." With more than five subject-years of experience, this work is leading the evolution of a new era in prostheses and haptic interfaces.

Zelma Kiss
Associate Professor of Neurosurgery and Neuroscience
University of Calgary

Zelma H.T. Kiss, MD, PhD, is an Associate Professor of Neurosurgery and Neuroscience at the University of Calgary, where she conducts basic and clinical research on the mechanisms of action of deep brain stimulation (DBS) and on somatosensory restoration with neural prostheses. She earned her MD at the University of Ottawa, followed by neurosurgery training and a PhD in neurophysiology at the University of Toronto. After fellowship training with Professor Benabid, the originator of DBS for movement disorders, at Université Joseph Fourier in Grenoble, France, she worked initially at the University of Ottawa, obtaining further training in electrophysiology, and moved to Calgary in 2000. She is an Alberta Heritage Foundation for Medical Research Clinical Scholar and studies how DBS works in animal models and in movement disorders, as well as novel indications for DBS such as depression; her clinical program also includes the management of patients with pain. She directs the Neuromodulation Program for Alberta Health Services Calgary and the Clinician Investigator Training Program at the University of Calgary.

Abstract: A useful tactile somatosensory prosthesis must (i) evoke somatosensory percepts in limited body regions, (ii) provide graded sensation, (iii) have reproducibility and persistence, (iv) provide the perception of slip and pressure, and (v) provide proprioception. With thalamic stimulation, we have been able to produce three of these five.

Microelectrode recording and stimulation in non-sedated humans prior to deep brain stimulator (DBS) implantation allows investigation of the percepts induced by electrical stimulation of somatosensory pathways. In thalamus we target the kinesthetic nucleus to treat tremor, and the immediately adjacent tactile nucleus to treat pain. Cells in these nuclei respond to movement around a joint, deep pressure, and light touch. Electrical stimulation (333 Hz, 0.2 ms square-wave pulses, 1–25 µA, 2–10 s trains) is applied through 25–40 µm microelectrodes and the DBS macroelectrode, and patients describe what they feel.

First we digitized the activity of individual neurons in thalamic somatosensory nuclei and played these patterns back as electrical pulses in thalamus of different subjects. We learned that natural percepts were rare; most subjects described such stimulation as unnatural and ‘tingling’. High frequency stimulation was just as good as complex spike trains at evoking ‘natural’ percepts, and even macrostimulation could induce natural percepts. Lower currents were more likely to induce natural percepts. Different patterns could evoke different percepts in the same brain location and these were generally reproducible. Persistence of percepts was related to the time that high frequency stimulation was applied: higher duty cycles (especially continuous stimulation) reduced the duration of percepts evoked. Macrostimulation was as good as microstimulation in evoking a variety of percepts, it just did not have the somatotopic focus that microstimulation had.

Recently we tried to improve the naturalness of percepts by applying interleaved patterns of stimulation through 2 microelectrodes 200-300 um apart. While this did not change the ‘naturalness’, different interleaved patterns did alter the percepts described.

In thalamus we have been unable to evoke sensations of slip, however we could evoke pressure in rare cases. We were also unable to elicit body movement / limb position percepts. We could induce motor effects with real body movement using macrostimulation, but these were uncomfortable in our intact subjects. This remains the most challenging aspect in the design of a somatosensory neural prosthesis and suggests that thalamus may provide some percepts, but other sites may be required for the full range of somatic sensation.

Wednesday Plenary Session: Somatosensory Prosthetics

Time: Wed, 13:30 ~ 14:30
Room: McCormick Auditorium
Session chair: Roger Gassert

Sliman Bensmaia
Assistant Professor, Department of Organismal Biology and Anatomy
University of Chicago

Dr. Sliman Bensmaia received a B.A. in Cognitive Science from the University of Virginia in 1995, and a PhD in Cognitive Psychology from the University of North Carolina at Chapel Hill in 2003, under the tutelage of Dr. Mark Hollins. He then joined the lab of Dr. Kenneth Johnson at the Johns Hopkins University Krieger Mind/Brain Institute as a postdoctoral fellow until 2006, at which time he was promoted to Associate Research Scientist. In 2009, Dr. Bensmaia joined the faculty as an Assistant Professor in the Department of Organismal Biology and Anatomy at the University of Chicago, where he is also a member of the Committees on Neurobiology and on Computational Neuroscience. The main objective of Bensmaia's lab is to discover the neural basis of somatosensory perception using psychophysics, neurophysiology, and computational modeling. Bensmaia also seeks to apply insights from basic science to develop approaches to convey sensory feedback in upper-limb neuroprostheses.

Abstract: Our ability to manipulate objects dexterously relies fundamentally on sensory signals originating from the hand. To restore motor function with upper-limb neuroprostheses requires that somatosensory feedback be provided to the tetraplegic patient or amputee. Given the complexity of state-of-the-art prosthetic limbs, and thus the huge state-space they can traverse, it is desirable to minimize the need of the patient to learn associations between events impinging upon the limb and arbitrary sensations. With this in mind, we seek to develop approaches to intuitively convey sensory information that is critical for object manipulation – information about contact location, pressure, and timing – through intracortical microstimulation (ICMS) of primary somatosensory cortex (S1). To this end, we first explore how this information is naturally encoded in the cortex of (intact) non-human primates (Rhesus macaques). In stimulation experiments, we then show that we can elicit percepts that are projected to a specific localized patch of skin by stimulating neurons with corresponding receptive fields. Similarly, information about contact pressure is conveyed by invoking the natural neural code for pressure, which entails not only increasing the activation of local neurons but also recruiting adjacent neurons to signal an increase in pressure. In a real-time application, we demonstrate that animals can perform a pressure discrimination task equally well whether mechanical stimuli are delivered to their native fingers or to a prosthetic one. Finally, we propose that the timing of contact events can be signaled through phasic ICMS at the onset and offset of object contact that mimics the ubiquitous on and off responses observed in S1 to complement slowly-varying pressure-related feedback. 
We anticipate that the proposed biomimetic feedback will considerably increase the dexterity and embodiment of upper-limb neuroprostheses and will constitute an important step in restoring touch to individuals who have lost it.

Lee E. Miller
The Edgar C. Stuntz Distinguished Professor of Neuroscience
Departments of Physiology, Physical Medicine and Rehabilitation, and Biomedical Engineering
Northwestern University

Lee E. Miller is the Edgar C. Stuntz Distinguished Professor of Neuroscience in the Departments of Physiology, Physical Medicine and Rehabilitation, and Biomedical Engineering at Northwestern University. He received the B.A. degree in physics from Goshen College, Goshen, IN, in 1980, and the M.S. degree in biomedical engineering and the Ph.D. degree in physiology from Northwestern University in 1983 and 1989, respectively. He completed two years of postdoctoral training in the Department of Medical Physics, University of Nijmegen, The Netherlands.

Dr. Miller has had a career-long interest in the motor and sensory signals that are generated by single neurons in the brain during arm movement. His early work was devoted to studying these signals in the brainstem, cerebral cortex, and cerebellum, and their relation to muscle activity. In the past 10 years, Dr. Miller’s lab has increasingly focused on translational research, pioneering the use of brain machine interface technology in projects aimed at restoring movement and sensation to paralyzed patients. His interdisciplinary approach has led to productive collaborations locally, nationally, and internationally. In 1997, Dr. Miller received a North Atlantic Treaty Organization award to promote international collaborative research for a project at the University of Bochum, Germany. He was later appointed a Senior Visiting Fellow at the Institute of Neurology at the University College London in 2002. He has authored over 100 manuscripts, book chapters, and review articles.

Dr. Miller currently serves as development officer and president-elect of the Society for the Neural Control of Movement, and as a board member of the International Brain-Computer Interface Steering Committee.

Abstract: Brain Machine Interfaces (BMIs) that use recordings from motor areas of the brain to effect movement of a robotic limb or even a patient's paralyzed limb have progressed tremendously in the past decade. However, a major issue to be addressed is the need to provide proprioceptive feedback through an afferent neural interface. Loss of proprioception largely eliminates the ability to plan movement dynamics or to make rapid corrections to limb perturbations, even in the presence of vision. The representation of proprioceptive signals within the cortex has been far less studied than that of touch, and while some progress has been made toward restoring touch through intracortical microstimulation of somatosensory cortex (S1), there has as yet been very little corresponding success for proprioception.

We have completed a series of experiments designed to study the way limb movements are encoded by neurons in area 2 of S1. These neurons signal limb movement, whether generated actively by the monkey or as the result of a passive limb displacement. The discharge of most neurons is tuned to the direction of hand movement and can be summarized reasonably accurately by a sinusoidal tuning curve with a single "preferred direction" (PD). There is even evidence of an efference copy component of S1 activity that precedes the onset of active movement and is well aligned spatially with the afferent component. The representation of different movement directions by populations of S1 neurons is linearly separable, as is the brain state representing active and passive movements. The latter is likely due to the interaction of kinematic and force representations by individual neurons.

We have now begun a new series of experiments, the goal of which is to evoke a sensation of directed limb movement by stimulating electrodes within S1 to recreate these natural patterns of cortical activity. By stimulating small groups of electrodes with similar PDs, we have succeeded in inducing perceptions of limb motion that appear to be similar to those caused by actual movement. We are working to develop a neuroprosthesis based on continuously varying stimulation of many electrodes, in order to restore proprioceptive feedback to patients with high-level spinal cord injury.

Back to the Program ↑

Semi-Plenary Talk: TCH Early Career Award 2015

Time: Thur, 15:45 ~ 16:15
Room: McCormick Auditorium
Session chair: Dangxiao Wang

The IEEE Technical Committee on Haptics (TCH) presents the TCH Early Career Award biannually. The aim of the award, as announced by the TCH, is to recognize outstanding contributions to the area of haptics by members of our community who are in the earlier stages of their careers. Eligible members are those who were within 10 years of receiving their doctoral degree as of 1/1/2015 and are in an untenured position. The recipient of the 2015 TCH Early Career Award is Ki-Uk Kyung, and we are pleased to host a semi-plenary talk by Dr. Kyung at World Haptics 2015.

Ki-Uk Kyung
Principal Research Scientist
Director, Transparent Transducer and User Experience Research Center
Electronics and Telecommunications Research Institute (ETRI), Korea

Ki-Uk Kyung is a principal research scientist at the Electronics and Telecommunications Research Institute (ETRI), Korea. He received a Bachelor's degree in mechanical engineering from the Korea Advanced Institute of Science and Technology (KAIST) in 1999, and a Ph.D. degree from KAIST in 2006 under the supervision of Prof. Dong-Soo Kwon. He then joined the POST-PC Research Group at ETRI, where he developed the Ubi-Pen series of pen-like haptic interfaces between 2006 and 2009. He has participated in the development of ISO standards for haptics as a delegate member representing Korea since 2007. In 2008, he worked as a visiting scientist in the Touch Lab at MIT. Since 2010, he has managed TAXEL (TActile pixel), a project on the convergence of haptic and visual elements, and has proposed several flexible visuo-haptic interfaces. In 2012, Dr. Kyung founded the Transparent Transducer and User Experience Research Center at ETRI. The main objectives of his lab are to discover soft, transparent materials for flexible sensors and actuators and to apply them to future devices such as flexible displays and wearable devices.

Abstract: This talk traces a historical change in haptics research, grounded in personal experience. For a long time, researchers proposed force and tactile display devices that had to be placed on a table. These devices were mainly composed of very stiff supporting structures and rigid actuating and sensing components, such as electric motors, piezoelectric actuators, and force/torque sensors. With the recent development of visual display devices, haptic interfaces have been investigated for interacting with portable touchscreen devices. To install a haptic interface in a touchscreen device, the haptic components must be miniaturized and sometimes made transparent. Now we face even more challenging issues, as flexible electronic devices and wearable devices are rapidly gaining interest in the market as well as in research. In order to apply haptics technology to future flexible interfaces, we need to consider new form factors for actuators and sensors. This talk starts with a brief description of compact tactile displays and haptic interfaces for touchscreens, and then focuses on current research activities toward flexible and transparent haptic interfaces.

Semi-Plenary Talk: Eurohaptics Best PhD Thesis Award 2014

Time: Thur, 15:15 ~ 15:45
Room: McCormick Auditorium
Session chair: Jan van Erp

Every year the EuroHaptics Society awards a prize for the best PhD thesis of the year. The EuroHaptics Society Ph.D. Award competition is open to all candidates whose dissertation, in the broad sense, addresses a topic relevant to the science and/or technology of haptics. The winner of the award will be announced at the conference.

Current robotic teleoperation systems have very limited haptic feedback. This omission is related to many different factors, including the negative effect that haptic feedback has on the stability of these systems. In this respect, cutaneous feedback has recently received great attention; delivering ungrounded sensory cues to the operator’s skin conveys rich information and does not affect the stability of teleoperation systems.

This work addresses the challenge of providing effective cutaneous feedback in robotic teleoperation, with the objective of achieving the highest degree of transparency while guaranteeing the stability of the considered systems. On the one hand, it evaluates teleoperation systems that provide only cutaneous cues to the operator, thus guaranteeing the highest degree of safety. On the other hand, in order to achieve a higher level of performance, it also investigates novel force feedback systems for teleoperation that provide mixed cutaneous and kinesthetic cues to the operator.

Back to the Program ↑

Featured Sessions

Single-track sessions will feature 11 papers, selected by the Conference Editorial Board on the basis of reviews, to be presented for 15 minutes each (12 minutes of oral presentation + 3 minutes of Q&A).

Haptic Science - Perception, Virtual Reality and Human-Computer Interaction

Time: Tue, 09:00 ~ 10:15
Room: McCormick Auditorium
Session chairs: Roberta Klatzky and Jean-Louis Thonnard
Valerie Morash
Movement strategies were investigated in a one-handed haptic search task where blindfolded sighted participants used either one or five fingers to find a landmark on an unstructured tactile map. Search theory predicts that systematic strategies, such as parallel sweeps and spirals, should be more prevalent when the searcher's detection radius is small (one finger) than when the detection radius is large (five fingers). Movement patterns were classified as either non-systematic or systematic, and systematic strategies were more common in one-finger than five-finger searches. Therefore, systematic haptic search strategies are used and modulated by detection radius for untrained sighted participants.
Jess Hartcher-O'Brien, Malika Auvray, Vincent Hayward
Vision-to-touch prostheses typically code space-to-space, space-to-intensity, or space-to-frequency. Yet organisms use a space-to-time-delay mapping to anticipate and avoid impending collisions. Many organisms have developed computational short-cuts where distance-to-target is proportional to a time span. Can untrained humans spontaneously employ such a short-cut to estimate distance-to-obstacle in the absence of vision? Tactile feedback, a pulse, was delivered to the hand with a delay proportional to an obstacle's distance as detected by the device's optical rangefinder. Observers were naïve to the nature of the code but quickly calibrated and accurately estimated distance within a range of 4 m for delays corresponding to a velocity of 1 m/s.
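The space-to-time-delay code described in the abstract above reduces to a one-line proportionality, delay = distance / velocity. A minimal sketch under the abstract's stated parameters (1 m/s scaling, 4 m range); the function and parameter names are illustrative, not taken from the paper:

```python
def distance_to_delay(distance_m, velocity_m_per_s=1.0, max_range_m=4.0):
    """Map a rangefinder distance to a tactile pulse delay (seconds).

    The delay is proportional to distance, mimicking a time-to-collision
    code: with a 1 m/s scaling, an obstacle 4 m away yields a 4 s delay.
    """
    if not 0.0 <= distance_m <= max_range_m:
        raise ValueError(f"distance {distance_m} m is outside the {max_range_m} m range")
    return distance_m / velocity_m_per_s

def delay_to_distance(delay_s, velocity_m_per_s=1.0):
    """Invert the code, as an observer estimating distance would."""
    return delay_s * velocity_m_per_s
```

Observers in the study effectively learned this inverse mapping without being told the code.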
Emma Treadway, Brent Gillespie, Darren Bolger, Amy Blank, Marcia O'Malley, Alicia Davis
The use of haptic display to refer cues from a prosthetic terminal device promises to improve the function of myoelectrically controlled upper limb prostheses. This promise is often evaluated in experiments involving non-amputees. However, the availability of auxiliary haptic feedback from an intact hand may confound attempts to use non-amputees as stand-ins for amputees. In this paper we test the influence of auxiliary haptic feedback on myoelectric control by asking non-amputees to use myoelectric control to perform visual and haptic compensatory tracking with either a hard object, soft object, or no object in their grasp.
Yongseok Lee, Inyoung Jang, Dongjun Lee
For wearable finger-based haptics with cutaneous feedback, one of the key challenges is finger tracking, which can never be perfect, yet, if used in VR, would be adequate as long as its tracking error is under a certain detection threshold. For such a system, we aim to quantitatively answer the following questions: 1) what is the detection threshold (JND) of visual-proprioceptive conflict (i.e., the error tolerance of the finger-tracking system in VR); and 2) is it possible to further reduce this visual-proprioceptive conflict by utilizing cutaneous haptic feedback? These results would be useful for determining the design specifications of finger-tracking systems.
Jongman Seo, Seungmoon Choi
Vibrotactile flows refer to vibrotactile sensations that move continuously on the surface of a mobile device. This paper extends vibrotactile flows to two dimensions by means of edge flows -- vibrotactile flows that rotate along the edges of a rectangular device. We carried out a longitudinal user study to measure the information transmission capacity of 32 edge flows. Results showed that the information transmission capacity of the edge flows was 3.70 bits, which is greater than in most previous studies, and that some practice is required for robust identification, although this learning occurs quickly.

Back to the Program ↑

Haptic Technology - Devices and Algorithms for Feedback and Exploration

Time: Wed, 08:45 ~ 10:15
Room: McCormick Auditorium
Session chairs: Cagatay Basdogan and Vincent Hayward

Patrice Lambert, Just Herder
This paper introduces a novel parallel architecture that provides 6 DOF motion and 1 DOF grasping capabilities while all the motors are located at the base. Classical parallel haptic devices with all motors located at the base usually do not provide grasping capabilities. Thanks to a novel configurable platform, the grasping capability of this haptic device is part of the mechanical architecture itself and can be fully controlled by base-located motors.
Colin Ho, Jonathan Kim, Sachin Patil, Ken Goldberg
We introduce a novel haptic display designed to reproduce the sensation of both lateral and rotational slip on a user's fingertip. The device simulates three-degrees-of-freedom of slip by actuating four interleaved tactile belts on which the user's finger rests. We present the specifications for the device, the mechanical design considerations, and initial evaluation experiments. We conducted experiments on user discrimination of tangential lateral and rotational slip. Initial results from our preliminary experiments suggest the device design has potential to simulate both tangential lateral and rotational slip. Source files: https://github.com/Slip-Pad
Ildar Farkhatdinov, Arnaud Garnier, Etienne Burdet
We present an MR-compatible haptic interface for human motor control studies, which can be easily installed in and removed from the scanner room. The interface is actuated by a shielded motor located 2.1 m away from the MR scanner. Torque is transmitted to a subject's wrist through a cable transmission. The handle is adjustable to different hand sizes, enabling comfortable wrist movements. A dynamic model of the interface is presented and identified for position and torque control modes. A phantom MR compatibility test in a clinical environment showed that the interface is compatible with the strong magnetic field and radio-frequency emission.
Hiroaki Yano, Shoichiro Taniguchi, Hiroo Iwata
In order to free a user's fingertip from the restrictions of the end effector of a haptic interface, separate contact and reaction-force points are selected for haptic rendering. A prototype system, which consists of a motion tracker, a 2-DOF haptic interface with a force sensor, and a visual display, is developed. The system measures the position of the user's fingertip and produces an appropriate horizontal force on the user's thenar eminence. By using the interface, the user can perceive the surfaces of 3D virtual objects and the frictional force on those surfaces.
Rebecca Fenton Friesen, Michael Wiertlewski, Michael Peshkin, Ed Colgate
This paper presents the design of a bioinspired artificial fingertip that resembles the mechanical behavior of a human fingertip under conditions of both static deformation and high frequency excitation. Force-deformation characteristics and response to a transient mechanical perturbation are both shown to be in good qualitative agreement with those of a real finger. More importantly, the fingertip exhibits friction reduction when interacting with TPads (variable friction tactile displays based on transverse ultrasonic vibrations). Comparison with artificial fingertips that do not exhibit friction reduction suggests that mechanical damping characteristics play a key role in the amount of friction reduction achieved.
Matti Strese, Clemens Schuwerk, Eckehard Steinbach
When a tool is dragged over a surface, vibrations are induced that can be captured using acceleration sensors. This paper presents an approach to tool-mediated surface classification that is robust against varying scan-time parameters and works without explicit scan force and velocity measurements. We focus on mitigating the effect of varying contact force and hand speed conditions on our proposed features as a prerequisite for a robust machine-learning-based approach to surface classification. Our approach allows for classification of surfaces under freehand movement conditions. We achieve a classification accuracy of 95% with a Naïve-Bayes classifier for a database of 69 textures.
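As a rough illustration of the classification step named above, a Gaussian naïve-Bayes classifier over per-texture vibration features can be sketched as follows. This is a minimal self-contained sketch with synthetic two-feature data; the paper's actual features and implementation are not reproduced here:

```python
import math

def fit_gaussian_nb(samples):
    """Fit per-class, per-feature mean and variance.

    samples: {label: list of feature vectors}, e.g. spectral features
    extracted from tool-surface acceleration signals (synthetic here).
    """
    model = {}
    for label, vecs in samples.items():
        stats = []
        for dim in zip(*vecs):
            mu = sum(dim) / len(dim)
            var = sum((x - mu) ** 2 for x in dim) / len(dim) + 1e-9  # avoid zero variance
            stats.append((mu, var))
        model[label] = stats
    return model

def predict(model, x):
    """Return the label with the highest Gaussian log-likelihood (uniform priors)."""
    best, best_lp = None, float("-inf")
    for label, stats in model.items():
        lp = 0.0
        for xi, (mu, var) in zip(x, stats):
            lp += -0.5 * math.log(2 * math.pi * var) - (xi - mu) ** 2 / (2 * var)
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

With two well-separated synthetic classes, `predict` recovers the correct texture label; the paper's robustness comes from its force- and speed-invariant features, not from the classifier itself.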

Back to the Program ↑

Regular Sessions

The regular format of paper presentation at World Haptics 2015 is podium talks (12 minutes each; 10 minutes of oral presentation + 2 minutes of Q&A) that will take place in two parallel sessions.

Tactile Devices and Rendering

Time: Tue, 10:45 ~ 12:00
Room: McCormick Auditorium
Session chairs: Evren Samur and Hiroyuki Shinoda

Sofiane Ghenna, Frederic Giraud, Christophe Giraud-Audine, Michel Amberg, Betty Lemaire-Semail
The goal of our study is to provide multi-touch tactile stimulation on a surface using ultrasonic vibrations. We propose a method to control ultrasonic waves on a beam, enabling multi-touch ultrasonic tactile stimulation at two points so that a sensation can be delivered to two fingers from two piezoelectric transducers. The multi-modal approach and the vector control method are used to regulate the vibration amplitude in order to modulate the friction coefficient with the fingers. A psychophysical experiment with 6 subjects is conducted to demonstrate the idea.
Thomas Sednaoui, Eric Vezzoli, Brigida Dzidek, Betty Lemaire-Semail, Cedrick Chappaz, Michael Adams
Previously proposed models of the ultrasonic lubrication of a finger mediated by flat surfaces are not consistent with experimental results for vibrational amplitudes greater than a few microns. This paper presents experimental data acquired through a dedicated passive-touch tribometer and compares it with an existing model of squeeze-film lubrication. Considering the large difference between the analytic and experimental results, an experimental model of ultrasonic lubrication at high vibrational amplitudes is then proposed.
Eric Vezzoli, Brygida Dzidek, Thomas Sednaoui, Frederic Giraud, Michael Adams, Betty Lemaire-Semail
Ultrasonic vibration of a plate can be used to modulate the friction of a finger pad sliding on its surface. This modulation can modify the user's perception of the touched object and induce the perception of textured materials. In the current paper, an elastic model of fingerprint ridges is developed. A friction reduction phenomenon based on non-Coulombic friction is evaluated with this model. Then, a comparison with experimental data is carried out to assess the validity of the proposed model and analysis.
Matteo Bianchi, Mattia Poggiani, Alessandro Serio, Antonio Bicchi
This work presents FYD-pad, a fabric-based yielding tactile display for softness and texture rendering. The system exploits the control of two motors both to modify the stretching state of the elastic fabric for softness rendering and to convey texture information on the basis of accelerometer-based data. At the same time, the measurement of the contact area can be used to control remote or virtual robots. In this paper, we discuss the architecture of FYD-pad and the techniques used for softness and texture reproduction, as well as experiments with humans that show the effectiveness of the device in delivering tactile information.
Craig Shultz, Michael Peshkin, Ed Colgate
We present and discuss a nearly century-old haptic effect on human fingertips. This effect, based on the 1923 work of Johnsen and Rahbek, is capable of producing DC electrostatic forces on the bare finger an order of magnitude larger than those previously reported in the literature. We propose an electrical-circuit-based force model for this effect, drawn from research on electrostatic chucking devices, and show how this model fits in with previous electrostatic force models. Through this discussion we aim to clarify the concept of electrovibration and expand it to the more general principle of electroadhesion.
Eder Miguel, Maria Laura D'Angelo, Ferdinando Cannella, Matteo Bianchi, Mariacarla Memeo, Antonio Bicchi, Darwin G. Caldwell, Miguel A. Otaduy
The computation of skin forces and deformations for tactile rendering requires an accurate model of the extremely nonlinear behavior of the skin. In this work, we first describe a measurement setup that enables the acquisition of contact force and contact area in the context of controlled finger indentation experiments. Second, we describe an optimization procedure that estimates the parameters of strain-limiting deformation models that match best the acquired data. Together, we achieve the characterization of finger mechanics and the design of an accurate nonlinear skin model for tactile rendering.

Back to the Program ↑

Perception - Softness and Size

Time: Tue, 10:45 ~ 12:00
Room: Northwestern Room
Session chairs: Mounia Ziat and Martha Flanders

Alexandra Lezkan, Knut Drewing
In softness exploration, participants repeatedly indent the surface. We investigated how the executed peak forces are modulated across indentations depending on the softness of the object. We assumed that movement control is determined by the available predictive and sensory signals. The results show that participants systematically apply lower forces when sensory or predictive signals indicate softer objects as compared to harder objects. Thus, we consider softness exploration to be a sensorimotor control loop in which predictive and sensory signals determine movement control. Further, predictive signals are shown to remain highly important throughout the entire exploration, even in the presence of sensory signals.
Anna Metzger, Knut Drewing
We investigated relative contributions of different haptic signals to softness perception. Subtle external vertical forces were transmitted to the human finger during the exploration of silicone-rubber stimuli to dissociate force estimates provided by kinesthetic signals and the efference copy from cutaneous force estimates. We measured Points of Subjective Equality of manipulated references to stimuli explored without external forces. PSEs shifted as a linear function of external force to higher compliances with pushing and to lower compliances with pulling force. The relative contribution of kinesthetic/efference copy information to perceived softness was 23% for rather hard and 29% for rather soft stimuli.
Femke E. van Beek, Dennis J.F. Heck, Henk Nijmeijer, Wouter M. Bergmann Tiest, Astrid M.L. Kappers
In controlling teleoperation systems subject to communication delays, unstable behavior is often prevented by injecting damping. A proper perception of hardness is required to efficiently interact with objects, but it is unknown if injecting damping influences the perceived hardness of objects. To investigate this, participants compared the hardnesses of lightly and heavily damped objects, using tasks with and without free-air movement, while their movements were recorded. The results show that adding damping decreases the perceived hardness for the former task, while it increases perceived hardness for the latter task. This shows that damping influences perceived hardness in a task-specific way.
Evan Fakhoury, Peter Culmer, Brian Henson
This paper investigates the effect of maximum indentation force and depth on people's ability to accurately discriminate compliance using indirect visual information only. Participants took part in two psychophysical experiments where they were asked to choose the 'softest' sample from a series of presented sample pairs. Participants observed a computer-actuated tip indent the pairs to one of two conditions; maximum depth (10mm) or maximum force (4N). Results suggest that participants performed best in the task where they judged samples being indented to a pre-set maximum force. Moreover, our findings highlight the effect of visual information on compliance discrimination.
Wouter Bergmann Tiest, Vincent Hayward
We have investigated differences in perceived object size when exploring the inside or outside of objects. Ten blindfolded subjects compared the size of circular disks and holes using either the index finger, two different probes, the finger-span method, or an infinitesimal virtual probe. For the large probe and the finger-span method, an object felt on the inside was perceived as smaller than an object felt on the outside. This indicates that subjects are unable to sufficiently correct for the diameter of the probe when exploring objects. Without a probe, most subjects perceived the objects to be bigger on the inside.
Knut Drewing, Steffen Bruckbauer, Dora Szoke
When small holes are felt with the tongue, they are perceived to be larger than when felt with the finger. We hypothesize that these differences in perceived size are due to differences in the effector's deformation at the edge of the explored hole, which correlate with the effector's pliability. Experiment I demonstrates that the tongue perceives holes to be larger when it exerts higher forces on them. Experiment II demonstrates that holes at the toe are perceived to be smaller than holes at the finger and considerably smaller than holes at the tongue. These findings corroborate our hypothesis.

Back to the Program ↑

Texture

Time: Tue, 16:00 ~ 17:15
Room: McCormick Auditorium
Session chairs: Yon Visell and Seokhee Jeon

Heather Culbertson, Katherine Kuchenbecker
Dragging a tool across a textured surface produces vibrations that convey information about the surface qualities. These vibrations naturally depend on the tool's normal force and tangential speed, but virtual surface textures don't always mimic this behavior. We conducted a human-subject study to analyze the importance of creating virtual texture vibrations that respond to user force and speed. Removing speed responsiveness caused a significant decrease in realism, but removing force responsiveness did not. This result indicates that realistic virtual texture vibrations should vary with user speed but may not need to vary with user force.
Athanasia Moungou, Jean-Louis Thonnard, André Mouraux
When sliding our fingertip on a surface, complex vibrations are produced in the skin. In the present study, we used electroencephalography (EEG) to record steady-state evoked brain potentials (SS-EPs) and characterize the cortical activity related to the passive tactile exploration of textured surfaces. Using square-wave gratings of different spatial period (SP), ranging from coarse to smooth, and a sinusoidal grating with a glued fabric on it, we expected that these stimuli would elicit SS-EPs at different frequencies depending on the SP. Our results suggest that SS-EPs could be used to study the brain responses regarding the tactile exploration of textures.
Séréna Bochereau, Stephen Sinclair, Vincent Hayward
One human finger explored plastic Braille dots using a variety of velocity and force profiles. Characteristics of the interaction were studied to explore the interdependence of amplitude and duration across signals. Both amplitude, defined as the maximum tangential force, and duration varied with velocity and normal force; however, the integral of the tangential force over time was independent of either variable. When three consecutive dots of varying height were examined, the tangential force integral increased in proportion to height. We propose that the nervous system may use this quantity as an invariant to recognise the same spatial asperity explored under different exploratory conditions.
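The invariant quantity proposed in this abstract, the integral of tangential force over time, can be computed from sampled force data with a simple trapezoidal rule. This is an illustrative computation only, not the authors' analysis code:

```python
def tangential_impulse(times, forces):
    """Trapezoidal integral of tangential force over time.

    times:  sample instants in seconds (monotonically increasing)
    forces: tangential force samples in newtons at those instants
    """
    total = 0.0
    for i in range(1, len(times)):
        total += (times[i] - times[i - 1]) * (forces[i] + forces[i - 1]) / 2.0
    return total
```

For a triangular force pulse peaking at 1 N over 2 s, the integral is 1 N·s regardless of how the peak is reached, which is the kind of invariance across velocity and normal-force conditions the abstract reports.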
David Meyer, Michael Peshkin, Edward Colgate
Texture modeling strives to encapsulate the important properties of texture in a concise representation for interpretation, storage, and rendering. Models for tactile texture have yet to describe a representation that is both perceptually complete and sufficiently compact. In this work, we take inspiration from models of visual and auditory texture and propose a representation of tactile texture that separates localized features from textural aspects. We investigate the length scales at which humans can localize features, and represent textures as spectrograms that capture those features. Additionally, we demonstrate a reconstruction algorithm capable of recreating texture from a spectrogram without perceptual consequence.
Sunghwan Shin, Reza Haghighi Osgouei, Ki-Duk Kim, Seungmoon Choi
This paper presents a new approach for data-driven modeling and rendering of isotropic surface textures from contact acceleration data on the basis of frequency-decomposed neural networks. We propose two neural network models in different topologies, unified and decomposed, and experimentally evaluate their performance using the acceleration data collected by a motorized 2D texture scanner we developed. The modeling performance of the unified neural network model is shown comparable to the best available in the literature. We also provide preliminary but promising modeling results for anisotropic textures.
Mohammadreza Motamedi, Jean-Philippe Roberge, Vincent Duchaine
This paper presents a robotic system that was used to study the restoration of touch sensitivity. Here, a combination of tactile sensors, robotic fingers, and a haptic interface enabled us to undertake different types of experiments on human subjects. To this end, we have conducted two separate tests on 8 human subjects in order to assess the effectiveness of the static and dynamic modalities in different detectable ranges of the skin sensitivity.

Back to the Program ↑

Sensors and Actuators

Time: Tue, 16:00 ~ 17:15
Room: Northwestern Room
Session chairs: Peter Berkelman and Masashi Konyo

Yoshihiro Tanaka, Duy Phuong Nguyen, Tomohiro Fukuda, Akihito Sano
We developed a wearable tactile sensor for measuring skin vibrations using a polyvinylidene fluoride (PVDF) film, a polymer piezoelectric material. The sensor allows users to touch with bare fingers and to conduct active touch, and it detects skin-propagated vibrations when the fingertip touches an object. A transfer function from vibrations applied at the fingertip to the sensor output was expressed using a finger model, a sensor model, and an electric model of the PVDF film. Then, the frequency response of the sensor, the estimation of vibrations, and the sensor output for three different textures were tested.
Adam Spiers, Harry Thompson, Anthony Pipe
Conventional haptic sensing technologies are impractical for clinical application to minimally invasive surgery, due to size, sterilization robustness and cost vs. tool disposability. In this work we validate the concept of remote force measurement, where force interactions at the tip of an EndoWrist surgical tool are observed via simple torque sensors near the tool's actuators. This method provides reusable sensors located outside of the human body, sidestepping many key issues that have limited practical haptic sensing in this scenario. The resulting unprocessed torque data indicates contact with synthetic soft tissue at various actuator velocities and during external shaft loading.
Tobias Bützer, Bogdan Vigaru, Roger Gassert
Fiberoptic force sensors are commonly used in fMRI applications to measure interaction forces with subjects or reduce the inherent dynamics of a haptic interface through force feedback. In this paper we propose a compact and integrated elastic probe for fiberoptics-based force sensing, developed using low-cost off-the-shelf 3D printing technology. Characterization of the sensor probe shows high linearity, repeatability and temporal stability, as well as high reproducibility in terms of the manufacturing process. The realized sensor is integrated into a linear grasper to evaluate its performance in force-feedback applications, underlining the potential of this technology for use in fMRI-compatible haptic interfaces.
Won-Hyeong Park, Tae-Heon Yang, Yongjae Yoo, Seungmoon Choi, Sang-Youn Kim
This paper presents a flexible and bendable vibrotactile actuator that can be easily applied to shape-changing mobile devices. The proposed vibrotactile actuator is made with an electro-conductive membrane based on polyurethane, a base membrane, and an air gap. Actuation is controlled by the polarity of the two charged membranes, and the actuator's performance can be modulated by increasing the level of the bias electric potential. We conducted a user experiment, which shows that the proposed actuator can provide vibrations with sufficient strength for perception.
Rebecca Jarman, Balazs Janko, William Harwin
Small DC permanent magnet electric motors are commonly used as actuators in haptic devices and tend to spend a significant period of time in a `stalled' condition where they oppose an applied force. This paper identifies the relationship between heat loss and force generation for these haptic actuators. The work then presents results on current over-stressing of small DC motors so as to understand the risks of demagnetisation against thermal damage to the armature. Results indicate that it should be possible to apply short current over-stresses to commercial DC permanent magnet motors to increase end point force.
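The heat-versus-force trade-off this abstract studies follows from two textbook DC-motor relations: output torque scales linearly with current, while stall heating scales quadratically. A minimal sketch (the parameter names and values are ours, for illustration only):

```python
def stall_torque(current_a, torque_constant_nm_per_a):
    # Output torque is proportional to current: tau = k_t * I
    return torque_constant_nm_per_a * current_a

def stall_power_dissipation(current_a, winding_resistance_ohm):
    # In stall, essentially all electrical input heats the armature: P = I^2 * R
    return current_a ** 2 * winding_resistance_ohm
```

Doubling the end-point force thus quadruples armature heating, which is why over-stress pulses must be kept short, as the abstract's results suggest.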
Alexander Russomanno, Brent Gillespie, Sile O'Modhrain, Mark Burns
We explore key design parameters for integrating fluidic logic and pneumatic actuators in a very large shape display for applications in braille and tactile graphics. We present a simple model of pressure-controlled flow valves, which are analogous to electronic transistors. The model highlights a valve design that achieves noise immunity and enables signal propagation, both critical goals for creating fluidic logic circuits. Based on the pressure-controlled valve design, we built a pressure-based latching memory unit that can be integrated with pneumatic actuators to enable the control of an arbitrary number of tactile features with only a few external electronic control valves.

Back to the Program ↑

Human-Computer Interaction

Time: Wed, 10:45 ~ 12:00
Room: McCormick Auditorium
Session chairs: Manuel Cruz and Ali Israr

Zhaoyuan Ma, Darren Edge, Leah Findlater, Hong Tan
The present study used a flat keyboard without moving keys, augmented with haptic keyclick feedback, to examine the effect of such feedback on touch-typing performance. We investigated how haptic keyclick feedback might improve typing performance in terms of typing speed, typing efficiency, and typing errors. We found that haptic feedback increased typing speed and decreased typing errors compared to a condition without haptic feedback. Furthermore, the participants preferred auditory or haptic keyclick feedback to no feedback, and haptic feedback restricted to the typing finger alone was preferred to feedback delivered over a larger area of the keyboard.
Jin Ryong Kim, Hong Z. Tan
We investigate the effect of information content in sensory feedback on typing performance using a flat keyboard. We evaluate and compare typing performance with key-press confirmation and key-correctness information through sensory feedback on the flat keyboard. Twelve participants are asked to touch-type randomly selected phrases under various combinations of visual, auditory and haptic sensory feedback conditions. The results show that typing speed is not significantly affected by the information content in sensory feedback, but the uncorrected error rate is significantly lower when key-correctness information is available. Our findings are useful for developing flat keyboards with assistive information through sensory feedback.
Yongjae Yoo, Taekbeom Yoo, Jihyun Kong, Seungmoon Choi
This paper is concerned with emotional responses to tactile icons. Using three sets of tactile icons in which four physical parameters (amplitude, frequency, duration, and envelope) were systematically varied, we estimated their valence and arousal scores in a perceptual experiment with 24 participants. Results showed that the four parameters have clear relationships to the emotional responses elicited by tactile icons. Our tactile icons spanned a large region of the valence-arousal space, but they did not elicit very positive-relaxing or very negative-relaxing emotional responses. These findings provide design guidelines for tactile icons with desired emotional properties.
Siyan Zhao, Ali Israr, Roberta Klatzky
Handheld and wearable devices engage users with simple haptic feedback, e.g., alerting and pulsating. Here we explored intermanual apparent tactile motion -- illusory movement between two hands as a means to enrich such feedback. A series of psychophysical experiments determined the control space for generating smooth and consistent motion across the hands while users held the device and a multimodal factor to match moving visual cues across the screen to moving tactile motion across hands. The results of this research are useful for media designers and developers to generate reliable motion across the hands and integrate haptic motion with visual media.
Farah Arab, Sabrina Panëels, Margarita Anastassova, Stéphanie Cœugnet, Fanny Le Morellec, Aurélie Dommes, Aline Chevalier
Haptic technologies can open up new avenues for assisting older people in their daily activities, in particular for navigation. However, older adults' specific needs for a haptic navigation aid have seldom been investigated, nor has the design of haptic patterns that would be both acceptable and efficient for them. This paper contributes to this challenge through a user evaluation that was conducted to assess patterns, designed for and by the elderly, during a navigation task in an urban environment. The results led to a number of recommendations for the design of haptic patterns adapted to older adults' needs.
Hasti Seifi, Kailun Zhang, Karon MacLean
With haptics commonplace in consumer devices, diverse user perception and aesthetic preferences confound haptic designers. End-user customization drawn from example sets is an obvious solution, but haptic collections are notoriously difficult to explore. This work addresses the provision of highly navigable access to large, diverse sets of vibrotactile stimuli, on the premise that multiple access pathways facilitate discovery and engagement. We propose and examine five organization schemes (taxonomies), describe how we created a 120-item VT library, and present and study VibViz, an interactive tool for end-user library navigation and our own investigation of how different taxonomies can assist navigation.

Back to the Program ↑

Dynamics and Interaction

Time: Wed, 10:45 ~ 12:00
Room: Northwestern Room
Session chairs: Richard Adams and Katherine J. Kuchenbecker

Ehsan Noohi, Sina Parastegari, Milos Zefran
While hand trajectory has been successfully modeled for single arm reaching movement, few works have considered the bimanual reaching movement and no study has modeled the dyadic reaching movement. In this paper, we study both bimanual and dyadic reaching movements and show that the motion trajectory follows the minimum-jerk trajectory. To the best of our knowledge, this is the first work that studies the dyadic reaching movements. Furthermore, we show that our model is consistent with the existing theories on single arm motions, when applied to each of the cooperating arms.
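The minimum-jerk trajectory against which the abstract above compares bimanual and dyadic motion is the standard fifth-order polynomial of motor control. A minimal sketch for a one-dimensional reach (an illustration of the reference model, not the authors' code):

```python
def minimum_jerk(x0, xf, T, t):
    """Minimum-jerk position at time t for a reach from x0 to xf over duration T.

    x(t) = x0 + (xf - x0) * (10*tau^3 - 15*tau^4 + 6*tau^5), tau = t/T,
    which gives zero velocity and acceleration at both endpoints.
    """
    tau = t / T
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
```

The resulting bell-shaped velocity profile is the hallmark signature that such studies look for in recorded hand trajectories.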
Colin Gallacher, John Willis, Jozsef Kovecses
In this study we investigate the role that inertia tensor coupling has on user performance during navigational tasks. We also adapt the operation- and admissible-motion space representation for haptic systems, in which forces causing deviations from a desired path can be thought of as parasitic forces that degrade a user's performance. Dynamic simulations were carried out to gain insight into the effects of navigating along paths of varying coupling using a 2-DOF five-bar mechanism, and the results were experimentally validated with a Quanser 2-DOF Pantograph device.
Domenico Buongiorno, Michele Barsotti, Edoardo Sotgiu, Claudio Loconsole, Massimiliano solazzi, Vitoantonio Bevilacqua, Antonio Frisoli
The paper presents a myoelectric control scheme for an arm exoskeleton designed for rehabilitation. A four-muscle-based NeuroMusculoSkeletal (NMS) model was implemented and optimized using genetic algorithms to adapt the model to different subjects. The NMS model is able to predict the shoulder and elbow torques, which are used by the control algorithm to ensure a minimal interaction force. The accuracy of the method is assessed through validation experiments conducted with two healthy subjects performing free movements along the pseudo-sagittal plane. The experiments show promising results, indicating the approach's potential for introduction into a rehabilitation protocol.
Anthony Chabrier, Franck Gonzalez, Florian Gosselin, Wael Bachta
Haptic interfaces aim at realistically simulating physical interactions within a Virtual Environment (VE) through the sense of touch. While this can be attained when interacting through a handle grasped in the hand, it is much more difficult for dexterous interactions with manual interfaces or exoskeletons, due to the high number of degrees of freedom and the limited space available. This paper proposes a design methodology dedicated to such devices. It is shown that a device providing interaction with the five fingertips and the side of the index finger allows natural interaction within a VE more than 50% of the time.
Laszlo Kovacs, Jozsef Kovecses
In the dynamic analysis of haptic systems the human operator is often neglected and only the uncoupled device is investigated. When considered, the human model is typically represented by passive impedance elements employing simple mass-spring-damper representations. The dynamics of coupled systems with multi-degree-of-freedom device and human-operator models have received little attention. In the present paper, we discuss reduced-order dynamic representations for such complex models. We also consider the coupled system and demonstrate the effect of the human operator on the combined dynamics. Structural flexibility, different grasping conditions, and active human stabilization with reflex delay are considered.
Aghil Jafari, Muhammad Nabeel, Jee-Hwan Ryu
Our group has proposed an Input-to-State Stable (ISS) approach, which reduces the design conservatism of passivity-based controllers by allowing greater output energy from the haptic interface than a passivity-based controller while still guaranteeing stability. This paper extends the ISS approach to multi-DoF haptic interaction. For multi-DoF haptic interaction, a penetration-depth-based rendering method using a Virtual Proxy (VP) is adopted, and the VP allows us to decouple the interaction into individual axes. We then extend the previous one-port ISS approach to a two-port ISS approach and generalize this to a multi-DoF ISS approach by augmenting each two-port analysis.

Back to the Program ↑

Recognition, Modeling, and Rendering

Time: Wed, 16:00 ~ 17:00
Room: McCormick Auditorium
Session chairs: Matthias Harders and Miguel Otaduy

Uriel Martinez-Hernandez, Nathan F. Lepora, Tony J. Prescott
We present an intrinsic motivation approach for haptics in robotics. First, a probabilistic method is employed to reduce the uncertainty present in tactile measurements. Second, tactile exploration is actively controlled by intelligently moving a robot hand towards interesting locations. The active behaviour performed with the robotic hand is achieved by an intrinsic motivation approach, which improved the accuracy over the results obtained with a fixed sequence of exploration movements. Our approach was validated in simulated and real environments with a three-fingered robotic hand. The results demonstrate that our method is robust and suitable for haptic perception in autonomous robotics.
Sunghoon Yim, Seokhee Jeon, Seungmoon Choi
This paper presents an extended data-driven haptic rendering method capable of reproducing force responses during sliding interaction over a large surface area. The core of the approach is a set of input variables for training the data interpolation model, which includes a proxy: the contact point on the surface when undeformed. The behavior of the proxy is simulated in real time based on a sliding yield surface, a surface separating the sliding and sticking regions in the external force space. During rendering, an RBF-based interpolation model over the proposed input variable set estimates force responses in real time.
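The RBF interpolation at the core of such data-driven rendering can be sketched as follows. This is an illustrative Gaussian-RBF fit on toy one-dimensional data, not the authors' implementation; the kernel choice, the input set `X`, and the width parameter `eps` are assumptions.

```python
import numpy as np

def rbf_fit(X, f, eps=1.0):
    """Fit a Gaussian RBF interpolator: solve Phi w = f for the weights w.
    X: (n, d) training inputs (e.g. proxy position, penetration, velocity),
    f: (n,) recorded force components."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    Phi = np.exp(-eps * d2)                               # Gaussian kernel matrix
    return np.linalg.solve(Phi, f)

def rbf_eval(X, w, x, eps=1.0):
    """Estimate the force response at a new input x in real time."""
    d2 = ((x[None, :] - X) ** 2).sum(-1)
    return np.exp(-eps * d2) @ w

# Toy data: force grows nonlinearly with a single input variable
X = np.linspace(0.0, 1.0, 8)[:, None]
f = 5.0 * X[:, 0] ** 2
w = rbf_fit(X, f)
# The interpolant reproduces the recorded samples at the training points
assert np.allclose(rbf_eval(X, w, X[3]), f[3])
```

Between training points the interpolant blends neighboring recorded responses smoothly, which is what makes the approach usable in a high-rate haptic loop.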
Arash Mohtat, Jozsef Kovecses
Producing a sharp feeling of impact is important for high-fidelity force feedback rendering of virtual objects. This paper studies the direct impulse-based rendering paradigm to achieve this goal. Three challenges are investigated: (a) counteracting the undesired sampled-data-induced energy dissipation; (b) meeting actuation limits by impulse distribution; and (c) rendering resting contacts by hybridizing impulse- and penalty-based formulations. A unified control-oriented framework, the generalized contact controller, is then developed for implementing the proposed formulations and solutions. Our simulation and experimental results show the promise of the framework for generating a sharper unfiltered feeling of impact at relatively low sampling rates.
Brian Tse, Alaistair Barrow, Barry Quinn, William Harwin
Using haptic interfaces to assist the training of undergraduate dentists provides a unique opportunity to advance rendering algorithms and the engineering of haptic devices. In this paper we use the dental context to explore a rendering technique called smoothed particle hydrodynamics (SPH) as a potential method to train students on appropriate techniques for inserting filling material into a previously prepared (virtual) dental cavity. The paper also considers how problems of haptic rendering might be implemented on a Graphical Processing Unit (GPU) operating in the haptics control loop. A novel smoothing function for SPH was developed, and its flexibility is presented.
Alvaro G. Perez, Daniel Lobo, Francesco Chinello, Gabriel Cirio, Monica Malvezzi, José San Martín, Domenico Prattichizzo, Miguel A. Otaduy
This paper introduces a tactile rendering algorithm for wearable cutaneous devices that stimulate the skin through local contact surface modulation. The first step in the algorithm simulates contact between a nonlinear skin model and virtual objects, and computes the contact surface to be rendered. The second step takes the desired contact surface as input, and computes the device configuration by solving an optimization problem, i.e., minimizing the deviation between the contact surface in the virtual environment and the contact surface rendered by the device. The method is implemented on a thimble-like wearable device.

Back to the Program ↑

Perception - Applied

Time: Wed, 16:00 ~ 17:00
Room: Northwestern Room
Session chairs: Gregory Gerling and Lynette Jones

Oleg Špakov, Jussi Rantala, Poika Isokoski
Haptic stimulation is a promising channel for non-visual feedback on gaze events and helps monitor tracking accuracy when using a mobile eye-tracking device. Short repetitive vibrations from four actuators applied to the user's head and neck were tested in this context and compared to stimulation of the back, which has often been used for cueing in other studies. The results showed that 1) haptic stimulation on the head and neck cues users as efficiently as stimulation of the back, and 2) sequential activation of multiple actuators is more appropriate for cueing gaze than simultaneous activation.
Anton Filatov, Ozkan Celik
In this paper, we present results from two human subject experiments focusing on the effect of prehensor stiffness on object stiffness discrimination performance with a body-powered prosthesis. We broaden the existing knowledge base by (1) using an experimental setup that mimics the control inputs of a body-powered prosthesis operated in voluntary-closing mode, and (2) exploring the impact of end-effector stiffness modulation on the quality of haptic feedback the user receives about the environment. Results indicated that tuning prehensor stiffness can help the user identify objects of varying stiffness more correctly and more easily.
Tomi Nukarinen, Jussi Rantala, Ahmed Farooq, Roope Raisamo
Navigation systems usually require visual or auditory attention. Providing the user with haptic cues could potentially decrease the cognitive demand of navigation. This study investigates the use of haptic eyeglasses in navigation. We conducted an experiment comparing directional haptic cues to visual cueing in a car navigation task. The results showed that participants reacted significantly faster to the haptic cues than to the visual text cues. Haptic cueing was also rated as less frustrating than visual cueing. The paper suggests that haptic eyeglasses can decrease the cognitive demand of navigation and have many possible applications.
Mounia Ziat, Andrea Savord, Ilja Frissen
Rumble strips (RS) offer ideal conditions for multimodality research. Designed to reduce crashes and alert drowsy or inattentive drivers, their effectiveness in crash reduction is not questioned but little is known regarding how information from tactile vibrations and auditory rumbling is integrated during low-vision driving conditions. In this paper, we report descriptive data related to participants' perceptual experience while driving on a RS road during a snow storm, as well as data collected from participants driving in a simulated snow storm environment, and suggest future research perspectives.
Sebastian Merchel, M. Ercan Altinsoy, Anna Schwendicke
The examination of auditory intensity perception has a long history, and comprehensive knowledge exists. Tactile intensity perception, however, has not been studied as thoroughly. A short literature review provides an overview of the current state of research, with a focus on perceived vibration magnitude. To broaden this knowledge, tactile intensity perception was investigated further in this study: the growth of perceived intensity of seat vibrations with increasing vibration level was compared to auditory loudness using a magnitude estimation experiment. Curves of equal vibration intensity were determined.

Back to the Program ↑

Mid-Air and Wearable

Time: Thur, 09:00 ~ 10:30
Room: McCormick Auditorium
Session chairs: Hiroyuki Kajimoto and Sriram Subramanian

Seki Inoue, Yasutoshi Makino, Hiroyuki Shinoda
A method to present volumetric haptic objects in the air using spatial modulation of ultrasound is proposed. Previous airborne ultrasonic tactile displays were based on vibrotactile radiation pressure and sensor feedback systems, which result in low spatial receptive resolution. The proposed approach produces a spatially standing haptic image using stationary ultrasonic waves, enabling users to actively touch 3D images without depending on vibrotactile stimulation or sensor feedback; it is completely silent and free of the problems caused by feedback delay and errors. This paper evaluates our algorithm for synthesizing a haptic holographic image and reports subjective experiments.
Dong-Bach Vo, Stephen Brewster
While mid-air gestures offer new possibilities to interact with or around devices, some situations, such as interacting with applications, playing games, or navigating, may require visual attention to be focused on a main task. Ultrasonic haptic feedback can provide 3D spatial haptic cues that do not demand visual attention in these contexts. In this paper, we present an initial study of active exploration of ultrasonic haptic virtual cues that investigates spatial localization with and without the use of the visual modality. Our findings will allow designers to create better mid-air interactions using this new form of haptic feedback.
Hojin Lee, Ji-Sun Kim, Seungmoon Choi, Jae-Hoon Jun, Jong-Rak Park, A-Hee Kim, Han-Byeol Oh, Hyung-Sik Kim, Soon-Cheol Chung
This paper reports that laser light radiated onto a thin light-absorbing elastic medium attached to the skin can elicit tactile sensations of mechanical tap with little individual variability. The underlying mechanism is thought to be the thermoelastic effect of the laser, which creates elastic waves in the medium. We characterize the associated stimulus by measuring its physical properties. Its perceptual identity is also confirmed by comparing the laser stimulus to mechanical and electrical stimuli using perceptual spaces. To our knowledge, this is the first study to demonstrate the possibility of indirect laser radiation for mid-air tactile rendering.
Tommaso Lisini Baldi, Mostafa Mohammadi, Stefano Scheggi, Domenico Prattichizzo
In recent years, wearable haptic technologies have become very promising, since they provide users with tactile force feedback via small and wearable interfaces. However, they have no position sensing, so additional technologies are required. In this paper we present a sensing glove based on inertial and magnetic sensors for hand tracking, which can be combined with cutaneous devices for rendering force feedback. Preliminary experiments showed the effectiveness of the proposed approach. A comparison between using the glove with and without the cutaneous devices is presented.
Daniele Leonardis, Massimiliano Solazzi, Ilaria Bortone, Antonio Frisoli
A novel wearable haptic device for modulating skin stretch at the fingertip is presented. Rendering of skin stretch in 3 degrees of freedom, with contact/no-contact capabilities, was implemented through rigid parallel kinematics. The novel 3-RSR configuration allows compact dimensions with minimal encumbrance of the hand workspace and minimal inter-finger interference. A differential method for solving the non-trivial inverse kinematics is proposed and implemented in real time for controlling the device. Experimental results showed that, with the provided haptic feedback, participants performed a virtual grasping task more precisely and with grasping forces closer to the expected natural behavior.
Irfan Hussain, Leonardo Meli, Claudio Pacchierotti, Gionata Salvietti, Domenico Prattichizzo
In this paper, we present a robotic extra finger coupled with a vibrotactile ring interface. The human user is able to control the motion of the robotic finger through a switch placed on the ring, while being provided with vibrotactile feedback about the forces exerted by the robotic finger. To understand how to control the vibrotactile interface to evoke the most effective cutaneous sensations, we executed perceptual experiments to evaluate its absolute and differential thresholds. We also carried out a pick-and-place experiment with ten subjects. Haptic feedback significantly improved the task's performance.
Ginga Kato, Yoshihiro Kuroda, Ilana Nisky, Kiyoshi Kiyokawa, Haruo Takemura
Force feedback in tool-mediated interactions with the environment is important for the successful performance of complex tasks. Stylus-based haptic devices are studied and used extensively, and most of these devices require either grounding or attachment to the body of the user. In this paper, we propose a novel method to represent the vertical forces applied at the tip of a tool using a non-grounded rotation mechanism that mimics the cutaneous sensation caused by these forces. To evaluate this method, we developed a novel chopsticks-type haptic device that renders the sensation of manipulating objects using tools.

Back to the Program ↑

Motor Control and Learning

Time: Thur, 09:00 ~ 10:30
Room: Northwestern Room
Session chairs: Emanuele Ruffaldi and James Patton

Teng Li, Dangxiao Wang, Shusheng Zhang, Yuru Zhang, Chun Yu
In this paper, humans' capability to control the absolute magnitude of fingertip force under auditory or visual feedback was investigated. Twelve participants applied a target force by pressing a force sensor with their fingers and maintained the force within various specified tolerances for a set duration. The results showed that the applied fingertip force obeyed Fitts' law in both visual and auditory feedback modes when the index of difficulty was below a threshold. This may serve as a guideline for applications that rely on accurate and rapid force control over a target region, such as multiple-tapping tasks.
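The Fitts'-law relationship the abstract refers to can be sketched numerically. The Shannon formulation below and the mapping of force magnitude to amplitude A and tolerance to width W are standard assumptions, not details taken from the paper; the coefficients `a` and `b` would be fitted from experimental data.

```python
import math

def fitts_id(target_force, tolerance):
    """Index of difficulty (bits) for reaching target_force within a
    tolerance band of width `tolerance`, using the Shannon formulation
    ID = log2(A / W + 1)."""
    return math.log2(target_force / tolerance + 1)

def predicted_time(a, b, target_force, tolerance):
    """Fitts' law: completion time MT = a + b * ID, with a and b fitted
    empirically (hypothetical values here)."""
    return a + b * fitts_id(target_force, tolerance)

# Halving the tolerance raises the index of difficulty,
# and hence the predicted time to settle on the target force.
assert fitts_id(4.0, 0.5) > fitts_id(4.0, 1.0)
assert abs(fitts_id(1.0, 1.0) - 1.0) < 1e-12  # log2(2) = 1 bit
```

The abstract's observation that the law holds only below a difficulty threshold would correspond to this linear MT-vs-ID relationship breaking down at high ID.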
Hendrik Börner, Satoshi Endo, Antonio Frisoli, Sandra Hirche
This paper investigates the effects of vibrotactile stimulus designs for spatially guiding a user in time-critical dynamical tasks. We contrast two types of vibrotactile stimulus, representing either optimal hand velocity or acceleration for the stabilization of an inverted pendulum. The analyses of the participants' stabilization and learning behavior revealed a significant improvement in performance caused by additional velocity-dependent feedback, likely due to more efficient processing for the velocity-dependent motor guidance in the central nervous system. This study suggests how human-centric vibrotactile stimuli should be designed and how they can be effectively transmitted to the human user for time-critical behavioral guidance.
Mu Xu, Dangxiao Wang, Yuru Zhang, Jian Song, Dong Wu
Compared with single-modal sensorimotor tasks, cross-modal tasks are more complex. In this paper, we compared the performance of single-sensory and multi-sensory cues in single-modal and cross-modal tasks. The results showed that for the single-modal task, performance with multi-sensory cues was slightly worse than with single-sensory cues, implying that single-sensory cues are more suitable for single-modal tasks, while multi-sensory cues might distract participants. For cross-modal control tasks, multiple cues produced better performance than single cues, implying that multi-sensory cues are more effective when learning complex cross-modal tasks.
Alejandro Melendez-Calderon, Moria Fisher, Michael Tan, Etienne Burdet, James Patton
Interactive technologies can help people acquire movement skills, and one way is by using visual distortions to boost neural adaptation. An extreme version of such approach is to train a movement without moving by creating a synesthetic illusion of movement -- displaying virtual motions when there is none. While this approach uses no proprioceptive error to drive adaptation, our results show encouraging evidence that motor skills can be acquired through such illusions of movement.
Moria Fisher, Felix Huang, Verena Klamroth-Marganska, Robert Riener, James Patton
Error feedback is critical for supporting motor adaptation in skilled tasks. Error augmentation interventions, in which participants' errors are amplified during training, have shown success over repetitive practice. Here we show that statistical error tendencies can inform the design of customized error-augmentation training forces. We hypothesized that with customized error augmentation, participants would adapt faster to a visual-motor distortion and improve more than participants receiving standard error augmentation. We found that participants receiving customized forces adapted faster and consequently changed with smaller forces. These promising results support the need for customization to target subject-specific errors.
Gaofeng Yang, Dangxiao Wang, Yuru Zhang
In this paper, we studied how the accuracy of force control can be enhanced through repetitive training. Participants were trained to apply a target force under concurrent visual feedback (Group A) or delayed visual feedback (Group B). After each training session, participants took a force test without feedback. The results show that Group B achieved the expected accuracy faster than Group A, suggesting that delayed feedback is more effective at consolidating motor memory in force control. Our finding opens a new opportunity to further explore the relationship between feedback and the cognitive processes of motor skill learning.
Caitlyn Seim, Tanya Estes, Thad Starner
Passive Haptic Learning (PHL) enables users to acquire motor skills by receiving tactile stimulation while no perceived attention is given to learning. Initial work used gloves with embedded vibration motors to passively teach users how to play simple, one-handed piano melodies. In an effort to create a more practical system for learning full piano pieces, we present research on Passive Haptic Learning.

Back to the Program ↑

Perception - Weight, Vibration, Force, and Temperature

Time: Thur, 13:30 ~ 14:45
Room: McCormick Auditorium
Session chairs: Knut Drewing and Wouter Bergmann Tiest

Takeshi Yamamoto, Koichi Hirota
The authors have investigated virtual realization of shaking interaction using a haptic device that can present inertial force. This paper reports experiments on the discrimination and estimation of content weight through shaking interaction. In the experiments, subjects' recognition of the weight of solid and liquid contents was evaluated using a real box and the haptic device. The results proved that the device can present solid content with accuracy similar to the real box. Moreover, the results suggest that the motion of the center of gravity has a significant effect on the recognition of content weight.
Stefano Papetti, Hanna Järveläinen, Gian-Marco Schmid
An experiment was performed to study the effect of actively applied forces on vibrotactile thresholds. The task consisted of pressing the fingertip against a flat rigid surface that provided broadband vibration noise of varying amplitude. Possibly due to the concurrent effects of large contact area, spectrally complex stimuli, and active pressing force, the measured sensitivity thresholds are considerably lower than those found in most of the previous literature. Moreover, significant differences in thresholds were found between the lowest and middle force levels, and between the highest and middle force levels.
Fabio Tatti, Gabriel Baud-Bovy
The haptic modality is a direct and informative communication channel when manipulating objects jointly, but it may also be difficult to interpret, since the forces generated by the partner and the environment are summed together. This work uses psychophysical techniques to investigate the ability of humans to untangle these two sources in a task where dyads had to identify the direction of a weak external force. We present a variety of force-sharing strategies adopted by the dyads and their implications for the subjects' task performance.
Anshul Singhal, Lynette Jones
Thermal stimuli provide a novel dimension to present information to users of hand-held devices provided the inputs are tailored to the properties of the sensory system. In this experiment thermal pattern identification was measured on the hand using six stimuli that varied with respect to the direction, magnitude and rate of temperature change. The individual mean scores ranged from 80% to 98% correct with an overall mean of 91%. The Information Transfer values ranged from 1.76 to 2.49 bits with a group mean of 2.26 bits. These findings indicate that thermal icons offer considerable potential for presenting information.
Christian Hatzfeld, Mario Kupnik, Roland Werthschützky
This work presents an implementation of the Psi and UML psychometric methods with unforced-choice paradigms. These paradigms have shown performance similar to forced-choice paradigms but are expected to cause less confusion for test subjects at low stimulus intensities. An implementation of an unsure test person is presented. The Psi and UML methods are compared to the UWUD method; the coefficient of variation and the sweat factor are considered as measures of repeatability and efficiency, and threshold bias is used to evaluate accuracy. Based on the simulation results, both methods appear suitable for combination with unforced-choice paradigms.
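Psi and UML are Bayesian adaptive procedures; as background, the general idea of adaptive threshold estimation can be illustrated with a much simpler 1-up/1-down staircase. This sketch is not the Psi/UML implementation discussed in the paper; the step size, reversal count, and simulated observer are all illustrative assumptions.

```python
def staircase(respond, start=1.0, step=0.1, reversals=6):
    """1-up/1-down adaptive staircase: lower the stimulus level after a
    detection, raise it after a miss; average the levels at which the
    direction reversed to estimate the threshold.
    `respond(level)` returns True if the stimulus was detected."""
    level, going_down, revs = start, None, []
    while len(revs) < reversals:
        detected = respond(level)
        if going_down is not None and detected != going_down:
            revs.append(level)          # direction changed: record a reversal
        going_down = detected
        level += -step if detected else step
    return sum(revs) / len(revs)

# Simulated deterministic observer with a true threshold of 0.45:
# the staircase oscillates around it and the reversal average recovers it.
est = staircase(lambda x: x >= 0.45)
assert abs(est - 0.45) < 1e-6
```

Bayesian methods like Psi converge with far fewer trials than such simple staircases by placing each trial where it is most informative about the psychometric function.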

Back to the Program ↑

Teleoperation

Time: Thur, 13:30 ~ 14:45
Room: Northwestern Room
Session chairs: Jee-Hwan Ryu and Domenico Prattichizzo

Amit Bhardwaj, Subhasis Chaudhuri
For haptic transmission in teleoperation, the temporal resolution Tr (the minimum time spacing required to perceive a jump discontinuity) also needs to be considered when performing perceptually adaptive sampling. In this work, we propose a statistical method to estimate the temporal resolution Tr. To achieve this, we design an appropriate experimental setup and extensively record the haptic responses of several users. We also study the effect of perceptual fatigue on the temporal resolution Tr, and validate all results using the classical psychometric approach.
Carsten Neupert, Sebastian Matich, Christian Hatzfeld, Mario Kupnik, Roland Werthschützky
Pseudo-haptic sensation is an illusion based on visual stimuli. In virtual environments it is used to simulate material properties such as stiffness, mass, and friction. Transferring the principle of pseudo-haptic feedback to real haptic teleoperation systems can provide a haptic sensation of an interaction without active haptic feedback. In this work, we discuss the usability of pseudo-haptic feedback for application in teleoperation systems. The mechanisms of pseudo-haptic feedback are explained theoretically and implemented on a one-degree-of-freedom teleoperation system. Experiments with ten subjects show that pseudo-haptic feedback is, in principle, usable for haptic teleoperation.
Michael Lin Yang, Samuel Schorr, Iris Yan, Allison Okamura
During robot-assisted surgery, the absence of environment force sensing forces surgeons to rely on the interaction forces between hand and manipulator to modulate the grip force applied to the environment. To determine the effect of master manipulator gripper stiffness on performance in a teleoperated task, we designed an open-source gripper, the OmniGrip. The OmniGrip replaces the Phantom Omni's stylus and provides user-programmable force characteristics. We conducted studies in which participants used an OmniGrip to teleoperate a Raven II in a pick-and-place task. Increasing the stiffness of the OmniGrip reduced interaction forces on the slave side.
Jan Smisek, Rene M. van Paassen, Andre Schiele
The purpose of this paper is to analyze the effects of inaccuracies in haptic guidance systems during the execution of constrained tasks. The Lawrence teleoperation framework is extended by adding an impedance-type, position-based attractive haptic guidance, and is analyzed from a control-system perspective. We focus on systems where haptic guidance is used together with position- or force-based feedback. The effect of inaccurate guidance is discussed and quantified using the proposed framework. Theoretical results are experimentally verified on a real haptic teleoperation setup.
Clemens Schuwerk, Xiao Xu, Wolfgang Freund, Eckehard Steinbach
We describe a client-server architecture for haptic interaction with deformable objects. The object deformation is computed on the server and transmitted to the clients. There, an intermediate representation of the deformable object is used to locally render haptic force feedback displayed to the user. Based on a one-dimensional deformable object, we analyze the transparency of this architecture for a single client interaction. The delay introduced by the deformation simulation and the client-server communication leads to increased rendered forces at the clients. We propose a method that adaptively adjusts the stiffness used in the local force rendering to compensate for this.
Xiao Xu, Burak Cizmeci, Clemens Schuwerk, Eckehard Steinbach
We propose a perceptual haptic data reduction approach for teleoperation systems which use the time domain passivity approach (TDPA) as their control architecture for dealing with time-varying communication delay. Our goal is to reduce the packet rate over the communication network while preserving system stability in the presence of time-varying and unknown delays. Experiments show that our proposed approach can reduce the average packet rate by up to 80%, without introducing significant distortion. In addition, the proposed approach outperforms the existing wave variable-based approaches in both packet rate reduction and subjective preference for the tested communication delays.
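Perceptual haptic data reduction schemes of this kind typically build on a Weber-style deadband: a sample is transmitted only when it differs perceptibly from the last transmitted one. The sketch below shows that generic principle only; it is not the authors' TDPA-integrated method, and the threshold fraction `k` is an illustrative assumption.

```python
def deadband_reduce(samples, k=0.1):
    """Transmit a force sample only when it deviates from the last
    transmitted value by more than a fraction k of that value
    (Weber-style perceptual deadband). Returns (index, value) pairs."""
    sent, last = [], None
    for i, s in enumerate(samples):
        if last is None or abs(s - last) > k * abs(last):
            sent.append((i, s))   # perceptible change: send over the network
            last = s              # receiver holds this value until the next update
    return sent

samples = [1.00, 1.02, 1.05, 1.20, 1.21, 1.50]
sent = deadband_reduce(samples, k=0.1)
# Only 3 of 6 samples are transmitted: a 50% packet-rate reduction
assert sent == [(0, 1.00), (3, 1.20), (5, 1.50)]
```

In a real system the receiver interpolates or holds between updates, and the control layer (here, TDPA) must keep the loop passive despite the dropped packets.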

Back to the Program ↑

Demonstrations

Hands-on demos are a vital part of any haptics conference. This year we feature nearly 60 contributed demos and exhibits, as well as 10 finalists in the first-ever Student Innovation Challenge. All demonstrations are listed below according to their position in the demo hall.

Back to the Program ↑

Mid-Air Tactile Stimulation Using Laser-Induced Thermoelastic Effects: The First Study for Indirect Radiation

Hojin Lee, Ji-Sun Kim, Seungmoon Choi, Jae-Hoon Jun, Jong-Rak Park, A-Hee Kim, Han-Byeol Oh, Sung-Jun Park, Hyung-Sik Kim, Soon-Cheol Chung
Booth: D-1

Localized Multi-finger Electrostatic Haptic Display

Senem Ezgi Emgin, Enes Selman Ege, Cagatay Basdogan
Booth: D-2

Shape and Friction Recognition of 3D Virtual Objects by Using 2-DOF Indirect Haptic Interface

Shoichiro Taniguch, Hiroaki Yano, Hiroo Iwata
Booth: D-3

TouchMusic: Music Experience System for the Hearing-Impaired

Gunhyuk Park, Yongjae Yoo, Seungmoon Choi, Changdo Song, Minjoo Cho, Giuyeol Kim, Jaehun Kim, Sangmin Lee, Kyogu Lee
Booth: D-4

Palpation System Utilizing Haptic Bidirectionality for Laparoscopic Surgery

Tomohiro Fukuda, Yoshihiro Tanaka, Michitaka Fujiwara, Akihito Sano
Booth: D-5

Multimodal Perception of Histological Images for Persons Blind or Visually Impaired

Ting Zhang, Juan Wachs, Bradley Duerstock
Booth: D-6

Intermanual Apparent Tactile Motion on Handheld Tablets

Siyan Zhao, Ali Israr, Roberta Klatzky
Booth: D-7

Soft Finger Tactile Rendering for Wearable Haptics (Demo)

Daniel Lobo, Alvaro G. Perez, Francesco Chinello, Gabriel Cirio, Monica Malvezzi, José San Martín, Domenico Prattichizzo, Miguel A. Otaduy
Booth: D-8

A Novel Tactile Display for Softness and Texture Rendering in Tele-Operation Tasks

Mattia Poggiani, Matteo Bianchi, Antonio Bicchi
Booth: D-9

Data-Driven Haptic Modeling and Rendering of Deformable Objects Including Sliding Friction

Sunghoon Yim, Seokhee Jeon, Seungmoon Choi
Booth: D-10

Alphabet Letter Display via Imitated Writing Motion

Keisuke Hasegawa, Tatsuma Sakurai, Yasutoshi Makino, Hiroyuki Shinoda
Booth: D-11

Tactile Presentation to a Back Side Finger While Operating with a Front Side Finger for Smartphone

Sugarragchaa Khurelbaatar, Hiroyuki Kajimoto, Yuriko Nakai
Booth: D-12

The Animotus: A Shape Changing Haptic Navigation Aid

Adam Spiers, Janet van der Linden, Maria Oshodi, Aaron Dollar
Booth: D-13

Ultrasonic 3D Haptic Hologram

Seki Inoue, Yasutoshi Makino, Hiroyuki Shinoda
Booth: D-14

Jorro Beat: User Study of Improvement in Music Experience by Shower Tactile Stimulation

Keisuke Hoshino, Masahiro Koge, Taku Hachisu, Ryo Kodama, Hiroyuki Kajimoto
Booth: D-15

Electroadhesive Haptic Trackpad

David Meyer, Craig Shultz, Michael Peshkin, Ed Colgate
Booth: D-16

Acoustic Touch: Fingertip Actuation for Combined Audio-Tactile Display

Craig Shultz, Michael Peshkin, Ed Colgate
Booth: D-17

Tangment

Hirokazu Tanaka, Nobuhisa Hanamitsu, Kouta Minamizawa, Susumu Tachi
Booth: D-18

Vibro-tactile recognition with Deep Convolutional Neural Networks: To search for haptic contents using vibro-tactile waves

Nobuhisa Hanamitsu, Kouta Minamizawa, Susumu Tachi
Booth: D-19

Using the Leap Motion Controller for Hand Tracking and Wearable Haptic Devices for Contact Rendering

Leonardo Meli, Stefano Scheggi, Claudio Pacchierotti, Domenico Prattichizzo
Booth: D-20

eClover: A Combined Electrostatic and Four-Tactor Wearable System for Eyes-Free Interactions

Zhaoyuan Ma, Yukang Yan, Darren Edge, Hong Tan, Yuanchun Shi, Ed Colgate
Booth: D-21

Feel Messenger: Embedded Haptics for Social Networking

Oliver Schneider, Siyan Zhao, Ali Israr
Booth: D-22

2-DOF Skin-Stretch Haptic Feedback Device

Ehren Murray, Zach Bielak, Hannah Chen, Melissa Yuan, Kensey King, Amy Blank, Marcia O'Malley
Booth: D-23

Efficient encoding of force with neural spiking models and its visual display

Gregory Gerling, Tony Lin
Booth: D-24

A mobile and reconfigurable electrical stimulation platform to elicit a subject-specific tactile response for use in upper-limb prosthetics

Gregory Gerling, Sarah Lightbody, Tony Lin
Booth: D-25

VibViz: an Interactive Visualization for Organizing and Navigating a Vibrotactile Library

Hasti Seifi, Kailun Zhang, Karon MacLean
Booth: D-26

Compact, high-efficiency thermal display using spatially divided stimuli

Katsunari Sato
Booth: D-27

Virtual Mirror Box for Lower Limbs using EMG

Andrew Shirtz, Kelly Morrow, Andrea Savord, Mounia Ziat
Booth: D-28

Visual-Haptic Hallucinations

Daniel Wilbern, Andrea Savord, Kelly Morrow, Mounia Ziat
Booth: D-29

Skin Strain Inducing Elbow Joint Flexion

Kotaro Shikata, Yasutoshi Makino, Hiroyuki Shinoda
Booth: D-30

Effect of Waist-type Hanger Reflex on Walking for Navigation

Yuki Kon, Takuto Nakamura, Michi Sato, Hiroyuki Kajimoto
Booth: D-31

Novel HMI for Needle Insertion

Antonius Hoevenaars, Jeroen Wildenbeest
Booth: D-32

Stroboscopic investigation of ultrasonic friction reduction on a vibrating plate

Rebecca Fenton Friesen, Michael Wiertlewski, Michael Peshkin, Ed Colgate
Booth: D-33

Applications of a Wearable Skin Vibration Sensor Using a PVDF film

Yoshihiro Tanaka, Tomohiro Fukuda, Duy Phuong Nguyen, Akihito Sano
Booth: D-34

The Whole Hand Haptic Glove Using Numerous Linear Resonant Actuators

Kenta Tanabe, Seiya Takei, Hiroyuki Kajimoto
Booth: D-35

A Pneumatic Tactile Display Using Microfluidic Circuitry

Alexander Russomanno, Brent Gillespie, Sile O'Modhrain, Mark Burns
Booth: D-36

Whole Body Vibrotactile Presentation with Music via the Clavicle

Rei Sakuragi, Sakiko Ikeno, Ryuta Okazaki, Hiroyuki Kajimoto
Booth: D-37

Projection-based Vibrotactile: Vibration Unit for Recognition of Shape Image Projection onto Whole Body

Haruya Uematsu, Daichi Ogawa, Ryuta Okazaki, Taku Hachisu, Hiroyuki Kajimoto
Booth: D-38

Fingertip Music Player: A Haptic Game for Training Attentional Control Skill

Teng Li, Hafiz Malik Naqash Afzal, Dangxiao Wang, Wei Chen, Xiaohan Zhao, Yuru Zhang
Booth: D-39

Ultrasonic Lubrication Tablet Computer

Thomas Sednaoui
Booth: D-40

A Reconfigurable Planar Haptic Device with Tablet Display

Richard Pringle, Colin Gallacher, Morgane Ciot, Jozsef Kovecses
Booth: D-41

Controlling the Strength of the Hanger Reflex on the Wrist by Presenting Vibration

Takuto Nakamura, Narihiro Nishimura, Taku Hachisu, Michi Sato, Hiroyuki Kajimoto
Booth: D-42

Thin and Flexible Gel-type Vibrotactile Actuator

Won-Hyeong Park, Sang-Youn Kim
Booth: D-43

Virtual Exosuit with Haptic Feedback

Mark Haara, Vivian Stark, Dallas Johnson, Jeff Horn, Mounia Ziat
Booth: D-44

Realtime Electrotactile Force Feedback from a Finger Prosthesis using a Low-Cost Pressure Sensor

Aadeel Akhtar, Joseph Sombeck, Jesse Cornman, Sam Goldfinger, Timothy Bretl
Booth: D-45

Between Smoothness and Stickiness

Masaya Takasaki, Daisuke Yamaguchi, Yoichi Ochiai, Takayuki Hoshi, Takeshi Mizuno
Booth: D-46

Haptic Implementation on deformation models based on Oriented Particles

Haiyang Ding, Hironori Mitake, Shoichi Hasegawa
Booth: D-47

Multi Degree-of-Freedom Input-to-State Stable Approach for Stable Haptic Interaction

Aghil Jafari, Muhammad Nabeel, Jee-Hwan Ryu
Booth: D-48

Back to the Program ↑

Work-in-Progress Papers

Accepted work-in-progress papers will be presented at a poster session on Tuesday evening. There will be a short poster-teaser session at the main conference venue on Tuesday, and the poster session will be a standalone event held at the Hotel Orrington in downtown Evanston. To accompany the poster presentations of late-breaking haptics research, local craft beer from Chicago and Evanston will be served.

Poster Teaser Session

Time: Tue, 17:30 ~ 18:00
Room: McCormick Auditorium
Session chairs: Masashi Nakatani and Michael Zinn

Poster Presentations

Time: Tue, 20:00 ~ 22:00
Room: Grand Ballroom at the Hilton Orrington

All work-in-progress papers to be presented at World Haptics 2015 are listed below.


Wael Ben Messaoud, Eric Vezzoli, Frederic Giraud, Betty Lemaire-Semail
Poster: WIP-1
Ultrasonic vibrating devices can modulate the friction of a finger sliding on them. Because the principles underlying this friction reduction are still unclear, this work investigates the influence of ambient pressure on the friction modulation. A dedicated tactile stimulator was used for this purpose, and the friction of a finger sliding on the device was recorded at ambient pressures of 0.5 and 1 atm, showing a significant difference under comparable experimental conditions. A comparison with the model proposed in the literature suggests that squeeze-film interaction may be present but is not solely responsible for the friction modulation in this kind of device.
Amy Blank, Cecilia Brookshier, Marcia O'Malley
Poster: WIP-2
Although the physical motion and interaction capabilities of prostheses are approaching the capability of human arms, physical touch feedback to the user is very limited, making interaction with objects difficult. We aim to address this deficit via non-invasive sensory substitution methods. Here, we describe preliminary testing in which two unimpaired subjects complete a functional task (grasp and lift of fragile objects) using a simple sEMG-controlled robotic gripper as a prosthesis proxy, with skin stretch feedback providing information about gripper aperture in the absence of visual feedback. Both subjects damaged fewer objects with skin stretch feedback available, suggesting promise for skin stretch feedback of proprioceptive information for functional tasks.
Haiyang Ding, Hironori Mitake, Shoichi Hasegawa
Poster: WIP-3
We propose a haptic rendering method for deformable models based on Oriented Particles (OP), a fast and robust deformation method well suited to simulation systems with many interacting objects. Our research focuses on implementing haptic interaction for OP. Since the configuration of particles can be treated as the bounding volume of the vertices belonging to them, we propose proxy collision detection via a two-layer collision procedure. In addition, haptic input is introduced into the simulation as an external force and projected onto the particles in proportion to their distance. Our contribution is a basic haptic interaction implementation for OP. Thanks to the fast simulation of OP, our system can handle multiple objects with fast and robust haptic interaction.
Roberta Etzi, Francesco Ferrise, Monica Bordegoni, Alberto Gallace
Poster: WIP-4
Despite the large number of studies on multisensory texture perception, very little is known regarding the effects of the interaction between vision and touch on hedonic judgments, especially when the presentation of the stimuli is mediated by a haptic device. In the present study, the participants were simultaneously presented with combinations of different pictures representing common materials (glass, plastic, rubber and steel) and different virtual tactile surfaces rendered by varying the static and dynamic friction coefficients of the haptic device (Geomagic Touch). The participants were asked to rate the pleasantness and the roughness of the surface explored. The duration of exploration was also recorded. Both the pleasantness and roughness judgments, as well as the time of exploration, varied as a function of the tactile and visual stimuli presented. Taken together, these results clearly suggest that vision modulates tactile perception, and consequently hedonic preferences, when tactile sensations are provided by means of a haptic device. Importantly, these results offer interesting suggestions for the reproduction of more pleasant, and even more realistic, multisensory virtual surfaces.
Naomi Fitter, Michelle Neuburger, Katherine Kuchenbecker
Poster: WIP-5
Many people believe robots will play a range of roles in future society, such as caring for the elderly and entertaining children, but few existing robots can interact directly and productively with humans. Hand-clapping games serve as a good case study for exploring social-physical interaction between a human and a robot. We have adapted Rethink Robotics' Baxter Research Robot, a notably collaborative platform, to play the hand-clapping game "Slide" with a human partner. This short paper describes our hardware and software choices as well as our plans for studying hand-clapping as an exemplar task in haptic human-robot interaction.
Tricia Gibo, David Abbink
Poster: WIP-6
Haptic guidance has previously been investigated to facilitate the training of motor skills, wherein a robotic device assists a trainee in executing the desired movement during practice. However, the benefits of haptic guidance over unassisted practice are unclear, as many studies have reported a null or even detrimental effect of guidance on learning. While prior studies have used haptic guidance to help refine a movement strategy, our experiment explores its effect on the discovery of new movement strategies. Subjects learn to manipulate a virtual under-actuated system via a haptic device either with or without haptic guidance. The guidance enables subjects to experience a range of movements to complete the task, rather than one strictly enforced trajectory. Subjects who train with haptic guidance appear to adopt a new movement strategy that requires greater control of the system's degrees of freedom and increases the potential for faster task completion.
David Gueorguiev, Séréna Bochereau, André Mouraux, Vincent Hayward, Jean-Louis Thonnard
Poster: WIP-7
The fingertip is astonishingly sensitive. We can easily discriminate surfaces on the basis of 10-20 micrometer features. Many mechanisms have been linked to texture discrimination: spatial coding of the surface topography, temporal coding of slip-induced vibrations, or certain types of skin deformation elicited by frictional effects. This study investigates the finger's sensitivity to flat surfaces made of different materials. We first show that flat surfaces with different chemical structures can be discriminated through touch without the help of thermal perception. Detailed measurements and analysis of the interfacial forces suggest that discrimination would be performed mostly on the basis of information available during the initial transitory phases leading to sliding.
Keisuke Hasegawa, Tatsuma Sakurai, Yasutoshi Makino, Hiroyuki Shinoda
Poster: WIP-8
In this paper we report a method to transmit symbolic information to untrained users using only their hands. Our simple concept is to present three-dimensional letter trajectories to the reader's hand via an electronically actuated stylus. Despite its simplicity, experimental participants were able to read 14 mm lower-case letters displayed at a pace of one letter per second with 71% accuracy on their first trials, improving to 91% after 5 minutes of training. These results showed small individual differences among participants. Our findings include the fact that this accuracy is retained even when the letters are downsized to 7 mm. Our method could be applied to handheld devices that would allow reading text in one's pocket when neither visual nor auditory modalities are available.
Jennifer Hui, Alexis Block, Katherine Kuchenbecker
Poster: WIP-9
Minimally invasive surgery enables faster patient healing than open surgery, but it precludes the surgeon from examining tissue with his or her fingers, a process known as palpation. This project seeks to determine whether a tactile sensor could be used to automatically detect tumorous tissue regions. A simulated tissue model was constructed from silicone rubber with 15 rigid spherical lumps of three diameters embedded at five depths. A biomimetic tactile sensor (SynTouch BioTac) mounted on a CNC milling machine pressed straight down into the model at a grid of points spaced by 2.5 mm. Nineteen electrode impedance values (corresponding to fingertip shape) and one DC pressure value were collected for two trials at each of the 6,175 palpation points. A point was categorized as Above-Center, Near-Edge, or Background by its lateral distance from the closest lump center. This data set was used to create Support Vector Machine classifiers that used pressure and electrode values or pressure alone. The electrode impedance readings were found to enable better classification than pressure alone, supporting the utility of tactile sensing during palpation.
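The distance-based labeling scheme described above can be sketched as follows; the radii and lump coordinates used here are illustrative assumptions, not values from the paper:

```python
import math

# Illustrative lateral-distance cutoffs in mm (hypothetical, for demonstration)
ABOVE_CENTER_R = 2.5
NEAR_EDGE_R = 7.5

def label_point(point, lump_centers):
    """Categorize a palpation point by lateral distance to the closest lump center."""
    d = min(math.hypot(point[0] - cx, point[1] - cy) for cx, cy in lump_centers)
    if d <= ABOVE_CENTER_R:
        return "Above-Center"
    if d <= NEAR_EDGE_R:
        return "Near-Edge"
    return "Background"

# Example: one lump at the origin, palpation points on a 2.5 mm grid
lumps = [(0.0, 0.0)]
labels = [label_point((x * 2.5, 0.0), lumps) for x in range(5)]
# → ['Above-Center', 'Above-Center', 'Near-Edge', 'Near-Edge', 'Background']
```

Labels produced this way would serve as ground truth for training the SVM classifiers on the pressure and electrode features.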
Gholamreza Ilkhani, Mohammad Aziziaghdam, Evren Samur
Poster: WIP-10
Electrovibration is a promising method to generate tactile feedback on touch screens. Although it has recently received considerable attention from the haptics community, almost all of the effort so far has focused on single-touch applications. However, localization of feedback is required to benefit from rapidly growing multi-touch systems. In this study, we propose a method and present a prototype to create localized tactile feedback using electrostatic attraction. For this purpose, custom-made electrode arrays are manufactured in a matrix form. Applying high-voltage AC signals on certain X and Y electrode lines results in a perceivable change of friction at the intersection point of the electrode lines when the finger moves over it. The prototype is tested with a tribometer in which lateral friction forces are measured. FFT analysis with respect to finger location reveals that localized haptic feedback is possible with the proposed method and the tactile display prototype. It is possible to extend this research toward multi-touch haptic feedback on touch screens.
Hiroki Ishizuka, Norihisa Miki
Poster: WIP-11
Computed tomography (CT) scans and endoscopy are widely used to detect tumors inside the body. However, these methods cannot detect small or deeply located tumors. Detecting such tumors requires intravital palpation with a tactile sensor and a high-resolution tactile display for stiffness. In this study we developed a high-resolution stiffness display based on magnetorheological (MR) fluid, a suspension whose apparent stiffness changes under an external magnetic field. The MR fluid was encapsulated in flexible, hemispherical polydimethylsiloxane (PDMS) membranes fabricated by droplet molding. The encapsulated structures were arranged with a pitch of 1 mm, and the resolution of the display was 5 mm. We conducted mechanical and sensory evaluations. The mechanical evaluation confirmed that the display could reproduce normal tissue or tumors, and the sensory evaluation with a subject indicated that the subject could perceive stiff spots on the display. These results indicate the display's potential for palpation applications.
Jeonggoo Kang, Heewon Kim, Seungmoon Choi, Ki-Duk Kim, Jeha Ryu
Poster: WIP-12
This short paper presents preliminary results from an experiment that investigated the effects of input voltage signals with a high frequency carrier on the perceived intensity of electrovibration. To measure the perceived intensity of stimuli, a fixed 6-point Effect Strength Subjective Index (ESSI) is used. Results indicated that the high frequency carrier enabled significant increases in the intensity perception, as predicted in theory. In addition, the waveform of voltage signal affected the perceived intensity of electrovibration. Although preliminary, these findings suggest simple and effective methods for reducing the high voltage requirement of electrovibration devices.
Sugarragchaa Khurelbaatar, Hiroyuki Kajimoto, Yuriko Nakai
Poster: WIP-13
In most common methods of tactile presentation on a touch screen, the tactile display is attached directly to the screen and must therefore be transparent so that it does not obstruct the view. If the tactile sensation is instead presented on the back side of the device, the tactile display need not be transparent; however, covering the entire back side with a high-density tactile display is costly. To overcome these limitations, we propose using a small, dense tactile display placed on the back side and touched with one finger. The display presents the information around the finger operating the screen to the finger on the back side of the device. This paper reports on shape-discrimination ability, comparing the cases where the device is operated with one hand and with both hands.
Roel Kuiper, Dennis Heck, Irene Kuling, David Abbink
Poster: WIP-14
Support systems that present additional task information may assist operators, but their usefulness is expected to depend on several factors, such as 1) the kind of conveyed information, 2) the modality through which it is conveyed, and 3) the task difficulty. In an exploratory experiment these three factors were manipulated to quantify their effects on operator behavior. Subjects (n=15) used a haptic manipulator to steer a virtual nonholonomic vehicle through abstract environments where obstacles needed to be avoided. A simple support was designed that conveys near-future predictions of the vehicle's path, along with a more elaborate support that continuously suggests the trajectory to be taken (factor 1). Both types of information were offered either with visual or haptic cues (factor 2). The resulting four support systems were tested in "difficult" and "easy" environments (factor 3), allowing respectively less or more variability in realized trajectories. The results show improvements for the simple support only when its information was presented visually, not when it was offered haptically. The more elaborate support led to further improvements, equally for both haptic and visual presentation. It is concluded that, for the experimental conditions studied, a suggested path is the best support, regardless of whether it is presented haptically or visually.
Irene A. Kuling, Femke E. van Beek, Winfred Mugge, Jeroen B.J. Smeets
Poster: WIP-15
Visuo-haptic matching errors are idiosyncratic and consistent over time. Therefore, it might be useful to compensate for these matching errors in the design of haptic guidance in tele-operation systems. In this study, we investigated whether compensating for visuo-haptic matching errors results in better precision and less conflict between (the guidance from) the system and the user in a reaching task. Participants had to reach for visual targets with the handle of a haptic device held in their unseen dominant hand, with haptic guidance towards the target position, towards the idiosyncratic matching error, or without haptic guidance. The results show that both types of guidance improve precision, but the disagreement between the participant and the guidance was smaller when the guidance was towards the end position of the idiosyncratic matching error. This was reflected in a lower residual force on the handle at the end position. Overall, adjusting for idiosyncratic visuo-haptic matching errors seems to have significant benefits over guidance to the visual target position.
Scinob Kuroki, Nobuhiro Hagura, Shin'ya Nishida, Patrick Haggard, Junji Watanabe
Poster: WIP-16
An Asian spice, Szechuan pepper (sanshool), is well known to induce a tingling sensation in the mouth and on the lips. A previous study revealed that the perceived frequency of this tingling is restricted to the low-frequency range, suggesting that this perception is mediated by the activation of the low-frequency mechanical input channel. To investigate the channel specificity of this sanshool-induced tingling sensation, we applied a chemical extract of sanshool on the fingers and conducted a psychophysical experiment. We measured the signal detection thresholds of low- or high-frequency vibrations with or without sanshool. Results showed that only detection threshold of low-frequency vibration was increased when sanshool was applied, suggesting that sanshool selectively activated the low-frequency channel.
Jocelyn Monnoyer, Michael Wiertlewski
Poster: WIP-17
Current touchscreen technology offers an intuitive human-computer interaction but often lacks haptic feedback. Typing text on a virtual keyboard is arguably the task that suffers most from the absence of tactile acknowledgment. The feel of the mechanical switches found in computer keyboards has previously been simulated on flat screens using vibrotactile actuation with a high degree of realism. Here, we investigate the feasibility of modulating friction via ultrasonic vibration as a function of pressing force to simulate a similar effect. Transverse vibration is known to modulate sliding friction when a finger explores the surface, but it also affects interactions involving changes in normal pressure. Using a fast rendering scheme and a friction plate that reaches 4 μm peak-to-peak displacement at 35 kHz, we modulate the friction on a finger at the instant the user comes into contact with the glass. Different profiles of friction reduction as a function of normal force produce a range of effects, some of which resemble mechanical switches.
Joe Mullenbach, Ahalya Prabhakar, Jacob Cushing, Ed Colgate
Poster: WIP-18
This work is a first step toward the goal of tactile reading on a flat touchscreen device. A set of variable-friction haptic icons were designed and assigned to letters of the alphabet. Forced-pace experiments were conducted to measure a subject's ability to learn and identify icons presented individually and side-by-side. After 6-8 hours of training, maximum information transfer rates of 2.7 and 2.9 bits per second were found for the one-icon and two-icon cases, respectively.
Taku Nakamura, Akio Yamamoto
Poster: WIP-19
This paper introduces a built-in capacitive-type sensing system that can simultaneously measure position and interaction force on a multi-user electrostatic haptic feedback system on an LCD. With multiple contact pads and a surface-insulated transparent electrode that covers an LCD, the system provides passive haptic force using electrostatically modulated friction force. Using the same setup, the system also detects pad positions, which was reported in the previous work. This work extends the sensing system such that it can detect both positions and interaction forces of the pads. To facilitate that, the system measures current ratio and total current amount, which correspond to the position and the force, respectively. A pilot experiment with a single pad demonstrates that the current amount accords with the interaction force.
Takuto Nakamura, Narihiro Nishimura, Taku Hachisu, Michi Sato, Hiroyuki Kajimoto
Poster: WIP-20
The Hanger Reflex is a phenomenon in which the head rotates involuntarily when it is fastened with a wire hanger; the same phenomenon also occurs at the wrist. The Hanger Reflex is expected to enable smaller and more accurate haptic guidance, because the device is fastened to the user by the end-effector itself and requires no separate fixture. However, conventional Hanger Reflex devices can only control the direction of rotation; fine control of the virtual force has not been realized. Establishing an efficient haptic device based on the Hanger Reflex requires control of the strength of the virtual force. In this paper, we attach vibrators to a Hanger Reflex device and describe their effect on the Hanger Reflex phenomenon at the wrist. The experimental results showed that users feel an enhanced virtual force when vibration is presented during the Hanger Reflex.
Kei Nakatsuma, Ryoma Takedomi, Takaaki Eguchi, Yasutaka Oshima, Ippei Torigoe
Poster: WIP-21
Skin-to-skin contact is considered an important factor in communication. Many research and clinical studies have demonstrated its physical and emotional effects in various treatments and in nursing care. Measuring and visualizing such touch would allow these studies to be conducted more quantitatively. Our goal is to measure skin-to-skin contact area precisely. In this manuscript, we introduce a prototype of our skin contact area sensing system and the result of a fundamental experiment. Our sensing method is based on the propagation of elastic waves through the body. The result indicates that the propagation amplitude varies with the skin contact area.
Bram Onneweer, Winfred Mugge, Alfred C. Schouten
Poster: WIP-22
When we interact with objects, the central nervous system receives information about the interaction forces. Previous research showed that humans make systematic errors when reproducing a previously learned force. The question remains where these errors originate: do we produce the wrong force because we learned the wrong force, or because we execute the wrong force? This study investigates whether the force reproduction error is a learning error or an execution error. Subjects performed a force reproduction task in which they had to reproduce a learned target force of 10 N in the same direction (reference trials) or in a direction rotated 90 degrees counterclockwise (test trials) for eight target directions. Ellipses were fitted through the reproduced forces of the reference and test trials separately. We hypothesized that the test ellipse would be similar to the reference ellipse for an execution error and rotated 90 degrees for a learning error. Results show that the test and reference ellipses have the same orientation and are more than 92% similar, indicating that the force reproduction error most likely originates during the reproduction phase. We suggest that the force reproduction error is an execution error and depends on the biomechanics of the arm.
Claudio Pacchierotti, Veronika Magdanz, Mariana Medina-Sanchez, Oliver G. Schmidt, Domenico Prattichizzo, Sarthak Misra
Poster: WIP-23
Artificial microrobots have recently shown promising results in several scenarios at the microscale, such as targeted drug delivery and micromanipulation of cells. However, none of the control systems available in the literature enable humans to intuitively and effectively steer these microrobots in the remote environment, which is a desirable feature. In this paper we present an innovative teleoperation system with force reflection that enables a human operator to intuitively control the positioning of a self-propelled microrobot, referred to as a microjet. A novel particle-filter-based visual tracking algorithm tracks at runtime the position of the microjet in the remote environment. A 6-DoF haptic interface then provides the human operator with haptic feedback about the interaction between the controlled microjet and the environment, as well as enabling the operator to intuitively control the target position of the microjet. Finally, a wireless magnetic control system regulates the orientation of the microjet to reach the target point.
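As a generic illustration of the particle-filter idea behind such visual tracking (a minimal 1-D bootstrap filter sketch, not the authors' algorithm; all parameters and the drift scenario are assumptions for demonstration):

```python
import math
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def pf_step(particles, measurement, motion_std=0.5, meas_std=1.0):
    """One predict-update-resample cycle of a 1-D bootstrap particle filter."""
    # Predict: propagate each particle through a random-walk motion model
    particles = [p + random.gauss(0.0, motion_std) for p in particles]
    # Update: weight each particle by the Gaussian likelihood of the measurement
    weights = [math.exp(-0.5 * ((measurement - p) / meas_std) ** 2)
               for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle set in proportion to the weights
    return random.choices(particles, weights=weights, k=len(particles))

# Track a simulated target drifting from 0 toward 5 with noisy observations
n = 500
particles = [random.uniform(-5.0, 5.0) for _ in range(n)]
for true_pos in [1.0, 2.0, 3.0, 4.0, 5.0]:
    z = true_pos + random.gauss(0.0, 0.3)   # noisy measurement
    particles = pf_step(particles, z)
estimate = sum(particles) / n               # posterior mean, should be near 5.0
```

In the paper's setting the measurement would come from image features of the microjet rather than a scalar sensor, but the predict-weight-resample structure is the same.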
Max Pfeiffer, Le Duy Linh Phan, Michael Rohs
Poster: WIP-24
Nowadays haptic feedback is typically placed on the device rather than on the user. In this work we present an approach that applies haptic feedback directly to the user's arm via electrical muscle stimulation. With this approach existing touch-sensitive devices or other surfaces can be equipped post-hoc with haptic feedback. We investigate types of patterns that can be used to create haptic output on flat glass-like surfaces to increase the realism of the interaction. By mapping visual features to haptic output, the haptic feedback becomes content-specific and dynamic.
Quang Trung Pham, Takayuki Hoshi, Yoshihiro Tanaka, Akihito Sano
Poster: WIP-25
Finite Element Method (FEM) analysis is widely used for micro-scale studies of mechanoreceptors. Previous studies primarily used two-dimensional (2D) models of fingertip cross-sections perpendicular to the fingerprint ridges, including the dermal papillae. Their results suggested that the 2D configuration of the dermal papillae strongly affects stress concentration at their base, where Merkel cells are distributed. Conversely, it barely affects the Meissner corpuscles located at the tips of the dermal papillae. Although these 2D models assume that the structure of the dermal papillae is uniform in the third direction, along the fingerprint ridges, in reality it is not. This paper focuses on the effect of this unevenness of the dermal papillae. Two types of 3D models are used: one that is uniform along the fingerprint ridges, as in the 2D models, and one that is uneven, as in real skin. The uneven model demonstrates stress concentration at the tips of the dermal papillae. This result indicates that the dermal papillae possibly enhance the response of FA-I afferents, which are associated with Meissner corpuscles.
Hadi Rahmat-Khah, Ehsan Zahedi, Ali Safavi, Micheal Kia, Javad Dargahi, Mehrdad Zadeh
Poster: WIP-26
This paper reports the recent work-in-progress results on a gesture-based haptic guidance approach for human-robot interaction in a haptic-enabled environment. In this approach, a virtual task has been segmented into primitive gestures and reference hidden Markov models (HMMs) have been trained according to these gestures. Applied forces have adaptively been calculated in real time with respect to gestural differences between user motions and the reference models. Our approach is partially task independent and robust to spatial variation of gestures due to the HMM-based segmentation scheme. Our experimental results show that the variable impedance control can effectively improve user performance in the presence of human behavior uncertainty during a collaborative task. This model-based haptic guidance could be extended to provide haptic guidance in real-time skill assessment, sport training, and rehabilitation exercises.
Ben Richardson, Haonan Sun, Ed Colgate
Poster: WIP-27
In single fingertip interactions with a virtual environment, the illusion of softness can be created by partially wrapping the fingertip and increasing the contact surface area between the fingertip and the device, even in the absence of substantial fingertip displacement. We found that when the fingertip explores the virtual softness environment, contacting the opposing surface in a pinch grip increases the perceived virtual softness. We built a device that used two servo motors to manipulate the contact surface. The increase of perceived softness due to contact on the opposing surface, or the point of subjective equality (PSE), was on the order of 25%.
Satoshi Saga
Poster: WIP-28
When a human places his hands near a source of heat, the hands become warm owing to thermal radiation. In this study, we employ spatially controlled thermal radiation to display a virtual region. At temperatures near the nociceptive threshold, a person tends to avoid the heated region. Using this effect, our proposed system displays a virtual shape-like region. In this paper, we describe the proposed radiation system and its control method, and evaluate the sensation displayed by the proposed method.
Shunsuke Sato, Shogo Okamoto, Yoichiro Matsuura, Yoji Yamada
Poster: WIP-29
To create an active-touch texture sensor system, we have been developing a new method for estimating the deformation of a finger pad based on the skin deformation propagated to the radial side of the fingertip. In this study, to validate the method, we compared the finger-pad deformation (and its acceleration) estimated from the acceleration measured at the radial side with that estimated from the shear force applied to the finger pad while exploring a few types of materials. The estimation errors for deformations at 40-450 Hz were smaller than human discrimination thresholds, indicating that the accuracy of our method is satisfactory relative to human perceptual sensitivity.
Kotaro Shikata, Yasutoshi Makino, Hiroyuki Shinoda
Poster: WIP-30
We present a new wearable device for inducing involuntary elbow joint flexion and extension by stretching the skin surfaces of the upper arm and forearm. The device can teach proper motions to users through haptics. In previous studies, rotational motion of the forearm was achieved by applying normal or shear deformation to the skin. Our proposed system can instruct bending motions of limbs and can be used alongside the previous rotation-inducing device. We report that users can recognize the bend/extend direction with more than 80% accuracy with our device.
Lisa Skedung, Martin Arvidsson, Lovisa Ringstad, Mark W. Rutland
Poster: WIP-31
To be able to design materials with specific haptic qualities, it is important to understand not only the contribution of physical properties from the surfaces of the materials, but also the perceptions involved in the haptic interaction with the materials. In a series of experiments we have studied the effect of micro-topography and friction on haptic perception of materials. The results show that micro-textures share dimensionality with macro-textures, that slipperiness is the main perception and that perceived roughness seems to be multidimensional in itself with a surface structure component (feeling bumps on the surfaces) and a friction component.
Kwon Joong Son, Yong Seok Ihn, Keehoon Kim
Poster: WIP-32
A piezoelectric variable-friction tactile display (VFTD) utilizes a squeeze film effect induced by ultrasonic flexural waves on the haptic panel. This paper presents a finite element analysis and an experimental evaluation of a degenerate vibration mode switching technique, in pursuit of a feasible and practical method to prevent a user's finger from crossing over a stationary nodal line. Finite element modal and harmonic analyses were conducted to confirm the existence of degenerate vibration modes and to determine the feasibility of squeeze film production in each mode. The simulation results showed good agreement with experimental measurements of natural frequencies and corresponding mode shapes.
Adam Spiers, Janet van der Linden, Maria Oshodi, Sarah Wiseman, Aaron Dollar
Poster: WIP-33
Flatland was a large scale immersive theatre production completed in March 2015 that made use of a novel shape-changing haptic navigation device, the "Animotus". Copies of this device were given to each audience member in order to guide them through a 112 m² dark space to large tactile structures accompanied by audio narration from the production's plot. The Animotus was designed to provide unobtrusive navigation feedback over extended periods of time, via modification of its natural cube shape to simultaneously indicate proximity and heading information to navigational targets. Prepared by an interdisciplinary team of blind and sighted specialists, Flatland is part performance, part in-the-wild user study. Such an environment presents a unique opportunity for testing new forms of technology and theatre concepts with large numbers of participants (94 in this case). The artistic aims of the project were to use sensory substitution facilitated exploration to investigate comparable cultural experiences for blind and sighted attendees. Technical goals were to experiment with novel haptic navigational concepts, which may be applied to various other scenarios, including typical outdoor pedestrian navigation. This short paper outlines the project aims, haptic technology design motivation and initial evaluation of resulting audience navigational ability and qualitative reactions to the Animotus.
Kenta Tanabe, Seiya Takei, Hiroyuki Kajimoto
Poster: WIP-34
Haptic feedback is crucial for enriching the experience of virtual reality content. While most haptic devices have focused on the fingertip, or have required large setups, we have developed a simple glove-type master hand with two features. First, it uses numerous actuators (52 vibrators) to cover the whole hand. Second, it employs linear resonant actuators to achieve high-speed response. We also developed a VR environment in which users can touch and feel VR objects with the glove. In this paper, we conducted an experiment to verify the significance of whole-hand tactile stimulation and low-latency feedback for the identification of contact shape. Whole-palm feedback shortened the exploration time, and a tendency toward improved accuracy was observed. This suggests that whole-hand, low-latency feedback enhances the VR touching experience.
Ozan Tokatli, Volkan Patoglu
Poster: WIP-35
We propose the use of fractional order models/controllers in haptic systems and study the effect of fractional order elements on the coupled stability of the overall sampled-data system. We show that fractional calculus generalization provides an additional degree of freedom for adjusting the dissipation behavior of the closed-loop system and generalize the well-known passivity condition to include fractional order impedances. Our results demonstrate the effect of the order of differointegration on the passivity boundary.
Ozan Tokatli, Volkan Patoglu
Poster: WIP-36
Fractional order calculus is a generalization of the familiar integer order calculus in that it allows differentiation/integration with orders of any real number. The use of fractional order calculus in systems and control applications gives the designer an extra variable, the order of differointegration, which can be tuned to improve the desired behavior of the overall system. We propose utilizing fractional order models/controllers in haptic systems and study the effect of the fractional differentiation order on the stability robustness of the overall sampled-data system. Our results demonstrate that the fractional calculus generalization has a significant impact on both the shape and the area of the stability region of a haptic system. Our results also include experimental verification of the stability regions predicted by the theoretical analysis.
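To make the flavor of this generalization concrete, here is a standard textbook-style check (an illustration, not the paper's derivation) of when a single fractional-order element with gain K > 0 is passive:

```latex
Z(s) = K s^{\alpha}, \qquad
Z(j\omega) = K \omega^{\alpha} e^{j\alpha\pi/2}, \qquad
\operatorname{Re} Z(j\omega) = K \omega^{\alpha}\cos\!\left(\tfrac{\alpha\pi}{2}\right) \ge 0
\;\Longleftrightarrow\; -1 \le \alpha \le 1 .
```

The order α directly sets the phase, and hence the dissipation, of the element; this is the extra design freedom referred to above, interpolating continuously between a spring (α = 0) and a damper (α = 1).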
Haruya Uematsu, Daichi Ogawa, Ryuta Okazaki, Taku Hachisu, Hiroyuki Kajimoto
Poster: WIP-37
Content driven by users' whole-body motion is becoming popular with the spread of low-cost whole-body motion capture systems. When presenting tactile signals to the whole body, system latency becomes a critical issue because it leads to a spatial gap between the desired and actual stimulation positions, and to a spatially blurred sensation. To reduce latency, we have proposed using a linear resonant actuator as a high-response vibrator, and using projected light as the input signal for vibrator control so as to eliminate the latency of PC-sensor communication. To evaluate this idea, we developed a vibration unit and conducted a shape-recognition experiment. The results showed that a high-response vibrator and a larger number of vibrators are both effective for achieving quick recognition.
Nick J. van de Berg, Roy J. Roesthuis, Sarthak Misra, John J. van den Dobbelsteen
Poster: WIP-38
This study investigates the effect of intra-operative haptic shared control in needle steering. A real-time path planner is used to estimate the optimal tip angle of an active tip-steered needle. The error between the planned and actual tip angle is fed back to the user by means of low-intensity force cues through a haptic interface. Targeting experiments were performed in a tissue phantom. The targeting error with visual + haptic feedback is 4.4 ± 4.0 mm (mean ± std), compared with 5.2 ± 4.2 mm for visual feedback only. Variations in user response to the supplied force stimuli were observed. These effects motivate further investigation of perception and performance for modified levels of shared-control authority and task complexity.
Jeroen van Oosterhout, Cock Heemskerk, Marco de Baar, Frans van der Helm, David Abbink
Poster: WIP-39
Certain tele-manipulation tasks require manipulation of heavy objects by two asymmetric slaves, for example a crane for load support and hoisting and a dexterous robotic arm for fine manipulation. The optimal role for human operators and the control interface design are as yet unclear: can a single operator do the task alone, or do two co-operating operators perform better? In a human factors pilot study, we made a first attempt to identify the differences in task performance between co-operated control, bi-manual control, and uni-manual control of two asymmetric slaves during a heavy load handling task. We expect bi-manual control to yield the longest task-completion time (due to conflicting spatial and temporal constraints), and uni-manual control the shortest. The results suggest this trend exists and that future HMIs should facilitate communication about the abilities and limitations of the slave, as well as present a shared internal task representation to the operator(s).
Takumi Yokosaka, Scinob Kuroki, Junji Watanabe, Shin'ya Nishida
Poster: WIP-40
Humans actively scan surfaces with eye and hand movements to acquire tactile information. It is known that different haptic rating tasks induce different exploratory movements. However, being asked to judge a tactile property is rare in daily life. Here, we conducted experiments to elucidate the relationship between active movements performed without any judgment task and tactile evaluation. Participants scanned 35 material surfaces in free-touch (that is, without any judgment task) or with one of four perceptual judgment tasks. Using multiple regression analysis, we studied how well eye and hand movements in free-touch enable us to predict the evaluated rating of a tactile texture property. Our results indicate that the movements performed while judging each tactile property could predict the evaluated ratings. Moreover, the movements in the free-touch condition could predict roughness, stickiness, and warmth ratings even though participants did not judge those tactile properties.

Back to the Program ↑

Student Innovation Challenge

The Student Innovation Challenge focuses on making the world a better place through haptics. This year, participants built mobile applications using the TPad Phone, an Android smartphone with a variable-friction haptic display. Students wrote Android applications that use the haptic display to solve real-world problems. For more info, head over to the Student Innovation Challenge page.

A list of all the entries in the challenge is shown below. To view all the Student Challenge videos as a playlist, click here.

One of the awards given to the student teams is a People's Choice Award chosen by attendees of the World Haptics Conference. VOTE HERE for the People's Choice Award.

Matti Strese, Clemens Schuwerk, Dmytro Bobkov
Technical University of Munich

Imagine you are using your mobile device to browse the Internet for new furniture, home decoration, or clothes. Today's systems provide us only with information about how the products look, but how do their surfaces feel when touched? For the future, we imagine systems that allow us to remotely enjoy both the look and the feel of products. The impact of such technology could be enormous, especially for e-commerce. Tangible example applications are product customization, selection of materials, product browsing, and virtual product showcases. The "Remote Texture Exploration" app displays surface textures on the TPad Phone; the textures can be received from a texture database or from remote smartphones. Vibration and audio feedback are included to enrich the user experience. The texture models used to display textures recreate important dimensions of human tactile perception, such as roughness and friction. New texture models are created using live recordings from the smartphone sensors (IMU, camera, microphone).

Domenico Buongiorno, Domenico Chiaradia, Massimiliano Gabardi, Michele Barsotti
Scuola Superiore Sant’Anna

Tactile Blind Photography is an app for the TPad Phone designed to help visually impaired people take pictures of their friends (or take selfies!) to improve their social experience and personal gratification.

The app improves the photography experience by guiding the user in taking a picture, a task reported as very difficult by visually impaired people. It automatically detects faces in the scene and helps the blind photographer take the picture by providing both audio and tactile feedback.

Finally, tactile feedback of the picture is provided by processing the image to reduce complexity and highlight only the relevant features (i.e., people's faces and picture edges). Two blind people tested our app and reported their impressions of interacting with the TPad Phone.

Mariacarla Memeo, Fabio Tatti, Maria Laura D'Angelo
Istituto Italiano di Tecnologia, Genova

Classic music readers for the visually impaired use a textual representation, representing notes with their Braille names preceded by possible alterations. Several technical challenges prevent this notation from being implemented on variable-friction devices such as the TPad Phone. While reading Braille on these devices is challenging, they are well suited to the exploration of tactile maps and the encoding of spatial information.

A classic music stave is itself a spatial map and provides an intuitive and efficient representation of music, whereby temporal and pitch information are encoded in a note's position. We exploited this property in an Android application for the TPad Phone that uses variable friction and vibration to create an interactive haptic stave, where textures and haptic cues allow the user to browse and read sections of music on a hand-held device.

Gabriel Figueiredo, Matheus Tura, Bruno Cattelan, Wagner Rampon
Universidade Federal do Rio Grande do Sul

Password typing and pattern drawing are widespread ways to grant access to systems. However, both are vulnerable to malicious users who can look over your shoulder while you type or drag. Current biometric solutions require the user to agree to supply very personal data, such as a fingerprint or iris pattern. Moreover, multi-user access with biometrics is limited, as a new user can only be authorized if they are present to be signed up. A haptic password, on the other hand, can provide anonymity and can be shared with others. The idea is to use friction-based barcodes. First, the user creates a numerical password. Afterwards, in a continuous, route-independent drag, the user counts high-friction tactile lines separated by random spaces. When the touch is interrupted, the current count becomes one numerical input. When all the inputs are entered in the correct order, access is granted.
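A rough sketch of how such a friction-barcode check might work (the function names, sampling format, and matching rule are our own illustrative assumptions, not the team's implementation): each touch trace is reduced to a count of high-friction line crossings, and the sequence of counts is compared against the stored passcode.

```python
def count_line_crossings(trace_xs, line_xs):
    """Count how many high-friction lines one continuous drag crosses.

    trace_xs: sampled finger x-coordinates during a single touch
    line_xs:  x-coordinates of the rendered high-friction lines
    Crossings accumulate in both directions, matching the number of
    tactile "clicks" the finger feels during the drag.
    """
    count = 0
    for a, b in zip(trace_xs, trace_xs[1:]):
        lo, hi = min(a, b), max(a, b)
        count += sum(1 for x in line_xs if lo < x <= hi)
    return count


def check_password(traces, line_xs, passcode):
    """Grant access when the per-touch crossing counts match the passcode."""
    return [count_line_crossings(t, line_xs) for t in traces] == list(passcode)
```

For example, with lines rendered at x = 2, 5, and 8, a drag from 0 to 10 counts three crossings and a drag from 0 to 3 counts one, so that pair of touches matches the passcode [3, 1]. Because the count depends only on lines crossed, not on the route taken, an onlooker watching the finger learns little about the digits.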

Hasit Mistry, Ashik Samad, Fida AlSughayer, Hsin Cheng Chen
University of Washington Bothell

Concepts of Physics are best understood when paired with practical experiences. In addition to establishing the link between the abstract and the tangible, experiments enhance the kinesthetic experience which, in turn, strengthens learning.

Physics experiments are conventionally restricted to specialized computer simulations or costly physical equipment. We are designing an educational tool that uses haptic feedback to enhance the experience of learning the fundamentals of physics through experiments.

The app will allow users to simulate complex environments, compute the forces acting on objects, and predict the results. Users can construct different scenarios by placing objects, moving them across the screen, and watching the environment react. They can adjust variables such as gravity, object weights, elevation, and surface friction.

Haptic feedback is employed to simulate these forces, giving a more realistic field of experimentation than conventional tools.

Brenna Li, Gordon Minaker, Paul Bucci, Oliver Schneider
University of British Columbia

Finger painting just wouldn't be the same without the feeling of wet paint on your fingers. We set out to enhance the experience of drawing, painting, writing, and other mark-making interactions with touchscreen devices. We've explored what mark-making tools should feel like through the lens of a drawing application: Roughsketch. The app features several mark-making tools, including a pen, paint brush, eraser, airbrush, and pan/zoom sensations. The TPad Phone helps us bring these tools to life, delivering a delightful sensation to touchscreens, both with your fingertip and with a stylus.

Cristina Ramírez-Fernández, Edgar Barreras-Sosa, Octavio Valenzuela
Universidad Autónoma de Baja California

This app is a haptic augmented reality system (HARS) for the treatment of small-animal phobia. The treatment uses systematic desensitization in the interaction with the animal. The HARS has three levels of interaction: ludic, formal, and augmented. In addition, it features three animals identified through a contextual study with 120 teenagers between 11 and 17 years old. The HARS has three main components: 1) selection of the animal (e.g., spider, cockroach, snake) and diagnosis of the phobia using the manual of mental disorders; 2) the treatment, where patients can see, touch, and hear the small animal; and 3) statistics on stress and the time spent interacting with the animal. The HARS objectively measures stress (accelerometer) and the time spent touching the animal (variable-friction haptic display). Finally, the stress data are classified and predicted using support vector machines.

Yongjae Yoo, Hojun Cha
POSTECH

Secrecy is a critical issue for smartphones. However, current input methods for access codes are vulnerable to shoulder surfers. For example, lock patterns can be easily captured or, even worse, the last keyboard entry remains unmasked and can be easily stolen. As a solution to this problem, we propose TeXecure, a texture-based on-screen secure input method. TeXecure offers several easily distinguishable textures as secure input. Instead of typing a password or drawing a pattern, a user needs to remember and select a series of textures as an access code. TeXecure can be embedded in various applications, such as bank transactions, PIN entry, or even the lock screen of Android phones.

Dennis Babu, Daniel Gongora, Seonghwan Kim, Shunya Sakata
Tohoku University

User interactions with recent smartphones and tablets are visually rich but poor in the sense of touch, and thus not fully immersive. We propose HelloHapticWorld, a haptics educational kit that uses the variable friction of the TPad to simulate the haptic sensations of a tele-operated robot on the user's fingertips. The kit contains a mobile robot controlled by the TPad and a modifiable field consisting of obstacles and road slopes of different sizes and shapes. The kit helps educate kids about haptic experiences by letting them create and explore a DIY haptic world.

Salient Features of HelloHapticWorld

  • Forward and sideways swipe gestures on the TPad screen, with variable-friction feedback, control the robot's forward and angular motion, respectively.
  • Distance mapping of obstacles, with corresponding variable-friction haptic feedback.
  • Haptic reconstruction of road slopes on the TPad screen using the variable-friction display.
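The distance-to-friction mapping in the second feature could plausibly look like the following sketch (the linear mapping, 0.5 m range, and function name are our illustrative assumptions, not the kit's actual implementation):

```python
def friction_for_distance(distance_m, max_range_m=0.5):
    """Map an obstacle's distance to a friction command in [0, 1].

    Closer obstacles produce higher friction, so the finger feels
    increasing resistance as the tele-operated robot approaches them.
    Beyond max_range_m the screen stays slippery (zero added friction).
    """
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - distance_m / max_range_m
```

For example, an obstacle at half the maximum range yields a mid-level friction command of 0.5, while an obstacle in contact (distance 0) saturates the friction at 1.0. The same shape of mapping could also drive the slope-reconstruction feature, with surface gradient substituted for distance.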