Article categories: In Other Words, Second Nature
May 24th, 2011

Abstract: As the technological capability of robots increases and interactions between humans and robots become more complex, it is important for researchers to consider the potential for an emotional connection to exist between a human and a robot. We are currently working on a new humanoid robot specifically designed to investigate intimate human-robot interactions that are mediated by touch. This paper introduces the new robot and discusses the design of its artificial “skin”. After introducing the latest developments in tactile sensing in robotics, the paper presents our preferred approach: that of electrical impedance tomography (EIT). The paper concludes with our experimental results showing EIT reconstruction of a touch stimulus on the artificial skin.

Introduction

Our group at the Australian Centre for Field Robotics (ACFR) at the University of Sydney has been working together on projects involving media art and robotics since 2003. In 2006 we established the Centre for Social Robotics, a centre dedicated to the research and understanding of human-robot interactions that occur in social spaces. Social spaces are defined as spaces that involve (untrained) members of the general public such as hospitals, galleries, airports and domestic environments.

Our current research builds on our previous robotics project, Fish-Bird (2003-2007). In Fish-Bird, two autonomous robots in the form of wheelchairs impersonate two “characters” who fall in love but cannot be together due to technical difficulties. The robots communicate with each other and their audience via movement and written text. The Fish-Bird robots are neither anthropomorphic nor cute, yet their audience interacts with them in ways that require trust and shared intimacy.

As the technological capability of robots increases and interactions between humans and robots become more complex, it is important for researchers to consider the potential for an emotional connection to exist (even momentarily) between a human and a robot. We are currently working on a project that aims to create a new humanoid robot. This five-year project aims to investigate intimate human-robot interactions in order to develop an understanding of the physicality that is possible and acceptable between a human and a robot. The project seeks to answer the key question: Can emotionally-driven human-to-human physical interactions serve as models for analogous interactions between humans and robots?

In this paper we concentrate on “affective touch” and the implementation of an artificial skin suitable for covering the entire body of our new robot. The implementation of touch sensing and the interpretation of touch together form a large and unresolved research area that we believe will play a crucial role in the development of human-robot interaction.

In this paper we first describe work in related fields to provide background to our research. We then give a detailed description of the development of the project, with emphasis on the issues involved in “artificial skin” in the context of affective touch.

Background to the field of human-robot interaction

Physicality and tactility are major elements in the research and practice of many media artists. Numerous interactive artworks have been created that respond in some way to human touch. These include Fleischmann’s Liquid Views (1994), d’Urbano’s Touch me (1995), Schiphorst’s Bodymaps: artefacts of touch (1995) and Dubois’ Tact (2001), to name but a few.

For some time now there has been research directed towards building robots that can interact with humans. A pioneering project involved developing the “whole arm manipulator” (Guertin and Townsend, 1999), a robotic arm that is able to sense contact along its whole length and yield to pressure when it contacts obstacles (or humans!). Since this early work, robotics research directed towards interaction with people has coalesced into two main areas: humanoid robotics, and devices typically described in the robotics literature as ‘robots for psychological enrichment’ (Shibata, 2004).

The fundamental premise of researchers who work in the field of humanoid robotics is that machines (robots) that are designed to operate in social spaces should have capabilities that are “human-like”. The intent is to match the robot’s attributes, such as size, strength, dexterity, etc., to structures and artefacts in the built human environment. It is often uncritically assumed that a person will be more psychologically comfortable with a human-like robot. This assumption is thrown into doubt by Mori’s (1970) theory of the “Uncanny Valley”–the paradoxical feeling of strangeness when one views a human-like entity that is “not quite perfect”.

Although early work in humanoid robotics involved the creation of robots that were, paradoxically, very machine-like in their rigid appearance and behaviour, there is growing awareness in the robotics community of the importance of aspects such as the appearance, tactile feel and “social” behaviours of humanoid robots. Roboticists and social researchers are also beginning to appreciate the importance of social, emotional and ethical issues raised by the development of humanoid robots. For example, recently there has been provocative work on social and moral relationships (Wagner, Van der Loos and Leifer, 2000; Duffy, 2003; Kahn et al., 2004); mental models and shared grounds (Hwang, Lee and Kwon, 2006); emotional interaction (Arkin et al., 2003); the concept of ‘personal space’ (Walters et al., 2005); and long-term social interaction (Gockley et al., 2005; Bickmore and Picard, 2005) between a human and a robot.

Since the first comprehensive elucidation of the concept of “sensitive skin” by Vladimir Lumelsky (2001), research in humanoid robotics has contributed several types of artificial skin based on organic transistor (Someya et al., 2005), resistive (Pan, Cui and Zhu, 2003), capacitive (Chang et al., 2006) and optical (Yamada et al., 2005) principles. Sensitive skin has been applied in a laboratory environment to humanoid robots (Odashima et al., 2006). Our research draws upon these advances and on related work that seeks to identify, using skin sensor outputs, how and where the skin is being touched (Chang et al., 2006; Stiehl and Breazeal, 2004).

Robotics research has also contributed novel flexible structures such as “snake-like” (Simaan, 2005) and “invertebrate” (Walker, 2000) robots that could provide a starting point for research into a more flexible humanoid form.

Robotic devices aimed at “psychological enrichment” usually take the form of an animal or pet, such as Paro the robotic seal (Wada, Shibata and Tanie, 2002) or Huggable the teddy bear (Stiehl and Breazeal, 2006). These devices aim first to recognise some aspect of a human user’s mental state by measuring how they handle the device, and then to respond in some physical way. There is also interesting related work on The Hug (DiSalvo et al., 2003), a cushion-like form that mediates physical interaction between people in different locations using telepresence.

We are sceptical about “realistic” representational forms such as android [1] or pet-like robots as they can lead to a state of misrecognition. For example, in Japan there have been experimental trials where humanoid robots were placed in nursing homes. Eduardo Nebot (2005) reports that some of the residents interacted for up to six minutes with the robot, misrecognising it as a real person. Rather than creating a robot that is sufficiently realistic as to be easily misrecognised as a person, our design is based on an idealised representation of a human. It will have human-like elements, such as a head, torso and arms, but its colouring and surface finishes will be chosen to prevent misrecognition as a person.

Overview of the project

The aim of this project is to explore and understand the degree of engagement–intellectual, emotional and physical–that may be possible for a human to have with a robot. The project builds on our previous robotics project Fish-Bird (Figure 1). Data collected through surveys conducted during public exhibitions and laboratory demonstrations highlight the fact that participants were attracted to the robots not because of the way they look but because of the way they behave.

The project proposes the creation of a new robotic form that will interact with humans in public environments such as museums and art galleries. Spectators will enter a wide exhibition space where a kinetic object moves about in a smooth, sinuous manner. The robot will be an impressionistic humanoid form, with a “head”, “body” and “arms”. The “skin” of the object will be smooth and will deliberately feel non-human-like. The height of the robot will be approximately 155 cm, slightly shorter than an average adult. When a spectator approaches the robot, it will respond physically by turning towards the person and gently moving closer to them. Devices for text display, such as LCD screens, will be embedded in its body as an additional communication link to the spectator/participant.

Appearance and behaviour

To date, emotional activation and/or mediation of interaction have received scant attention in robotics, with most examples (Breazeal and Scassellati, 1999; Wada, Shibata and Tanie, 2002) relying on a cute “pet-like” appearance. For example, people may want to hold Paro the robotic seal simply because it looks cute.

There is evidence in the literature that any mismatch between a person’s expectation of an entity’s appearance and behaviour and the perceived appearance and behaviour is a potent source of negative feelings towards the entity. As a machine’s appearance and behaviour more closely resemble a human’s, our expectation that it will exhibit human-like characteristics such as intelligence and emotions increases (Woods, Dautenhahn and Schulz, 2004; Hegel et al., 2008). Mori’s theory of the Uncanny Valley (1970), extended by Ishiguro (2005), predicts that there is a point where the lack of “something” produces a negative familiarity (Figure 2) that results in dislike and rejection. It is, after all, impossible to make a perfect replica of a human, and the consequent mismatch between appearance and actual behaviour will always give rise to feelings of repulsion and disapproval.

We believe strongly that, whilst any attempt to replicate a human is futile, it is important to include elements in the robot’s design that can lead a participant to identify it as an artificial “persona”. Although we have seen that people interact with almost anything that responds to them as a persona, including the wheelchair robots in Fish-Bird, in this project we are interested in understanding whether people can engage with robots in ways that are analogous to how they relate to other people.

One of our major considerations when designing this new robot is that the behaviour and “feel” of the robot when it touches or is touched by a human should match its appearance. The project proposes a model of emotional engagement that postulates that the combination of close physical proximity and intimate messages through text (on the robot’s body) could lead to the emotional activation that is necessary to support physical interaction. The feel of the robot’s skin is especially important, since an unpleasant sensation when it is touched is likely to break any emotional connection that has developed.

Artificial sensitive skin

Touch is the first of the human senses to develop and, through the skin, the largest. Our bodies are covered by millions of touch receptors, and our muscles, joints and organs are all connected to nerves that constantly send information to the brain. Almost everything we do, including walking, talking, sitting and kissing, depends on touch; it is almost impossible to imagine life without it.

The concept of “sensitive skin” for robots was first introduced by Vladimir Lumelsky (2001) to facilitate the use of unsupervised robots in unpredictable conditions. Some of its main purposes are to allow robots to be cautious and “friendly” to their environment; to improve object manipulation; and to improve human-robot interaction.

Sensitive skin is regarded by various robotics researchers as a large-area, flexible (Minato et al., 2007) and stretchable (Hoshi and Shinoda, 2006; Tada et al., 2007) array of sensors that conforms to curved robot surfaces (Nagakubo, Alirezaei and Kuniyoshi, 2007). It may have the ability to sense tactile information such as pressure (Shimojo et al., 2004), proximity (Stiehl and Breazeal, 2005), slip and texture (Hosoda, 2004), temperature (Stiehl et al., 2005) and chemical stimuli. In addition, all hardware should be embedded in the robot, and the covering material should be soft (Mukai, 2004) and feel “good” when touched (Stiehl and Breazeal, 2005).

Since the introduction of the concept in 2001, researchers have developed several types of artificial skin. These usually consist of a number of discrete sensors (Mukai et al., 2008), connected individually or in a grid configuration (Papakostas, Lima and Lowe, 2002), capable of measuring one (Rossiter and Mukai, 2006) or more (Stiehl and Breazeal, 2005) physical properties, and replicating, to some degree, the appearance and feel of human skin (Lumelsky, Shur and Wagner, 2001; Stiehl et al., 2005; Ishiguro and Nishio, 2007). Sensing methods range from artificial materials with human skin-like texture (Shirado, Nonomura and Maeno, 2006), sensors made of organic field-effect transistors (OFETs) (Someya, Sakurai and Sekitani, 2006), piezoresistive semiconductors (Mukai et al., 2008), piezoelectric materials (Ishiguro, 2005; Ishiguro and Nishio, 2007) and optics (Nicholls, 1991) to a multi-modal approach (Takamuku et al., 2007) using multiple layers of piezoresistors, piezoelectric sensors and temperature sensors (Figure 3).

 

Although human skin has a number of types of receptors that can be represented by different sensors such as piezoresistors (pressure), PVDF films (vibration), thermistors (temperature) and so on, we are not proposing to replicate human skin in all its complexity and capabilities.

In this project we seek to understand whether the social constructs and taboos relating to touch between people also extend to touch between a robot and a person. The interpretation of touch in human-to-human interaction is highly complex and is strongly influenced by the context, including the participants’ cultures, emotions, and beliefs (McDaniel and Andersen; 1998). Effective interpretation of touch therefore requires that one first understands the meaning of different tactile modalities as applied to different body areas. We discuss this in more depth in the following section.

Tactile communication

Early work on tactile communication often categorizes touch according to its function. For example, Heslin (1974) describes five types of touch as: (1) Functional/Professional, (2) Social/Polite, (3) Friendship/Warmth, (4) Love/Intimacy and (5) Sexual Arousal. His study demonstrates that touch is used to communicate different messages and the interpretation of a specific touch is affected by factors such as: the type or modality of the touch (pat, squeeze, brush, stroke, poke, etc.); its location, duration, intensity and frequency; gender (Heslin and Alper, 1983); and the degree of interpersonal involvement (McDaniel and Andersen, 1998) between the two people who are touching.

The location of touch can be divided into two classes: “non-vulnerable” body parts such as hands, arms, shoulders, and upper back, and “vulnerable” body parts such as head, neck, torso, lower back, buttocks, legs, and feet (Jones and Yarbrough, 2007). The more a touch is perceived as an invasion of privacy, the less positively (as loving, pleasant and friendly) it is interpreted.

In addition, intensity and duration can be associated with particular emotions such as anger, fear, love, gratitude and sympathy (Burwell, 1999). For example, sympathy may be associated with a prolonged but moderate touch, anger with a strong touch of moderate duration, and gratitude with a handshake.

In general, for touch to be positively accepted it should feel “good”. According to Burwell (1999), even casual touch from a stranger can be seen as positive and induce a feeling of well-being. Tiffany Field, in her book Touch (2001), relates a famous experiment conducted by Harry Harlow, who created two surrogate monkey mother machines. One was covered with a soft terry-cloth warmed with a light bulb and the other was made of bare wire mesh; as expected, the soft terry-cloth mother was preferred.

It is evident that during social interaction humans extract information from tactile stimuli that helps them interpret touch. In our robot it is therefore important to design a method for touch identification based on the integration of factors such as location, duration, intensity and modality of touch over most of the body area. In addition, we need an artificial skin that is flexible, stretchable, and contains an adequate number of sensors connected with minimal use of wires. It is for these reasons that we are proposing a sensitive skin based on the principle of electrical resistance tomography.
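Such an integration of location, duration, intensity and modality can be illustrated with a simple rule-based sketch. The feature encoding, thresholds and labels below are our own illustrative assumptions, drawn loosely from the associations reported by Jones and Yarbrough and by Burwell; they are not the identification method developed in the project.

```python
# Illustrative sketch only: a rule-based interpreter for touch features.
# The feature names, thresholds and labels are assumptions made for
# illustration, not the classifier used in the project.

def interpret_touch(location, modality, intensity, duration):
    """Map a touch described by four factors to a coarse interpretation.

    location  -- "non-vulnerable" (hands, arms, shoulders, upper back)
                 or "vulnerable" (head, neck, torso, lower back, ...)
    modality  -- e.g. "pat", "squeeze", "stroke", "poke", "handshake"
    intensity -- "light", "moderate" or "strong"
    duration  -- seconds (float)
    """
    # Touch on vulnerable body parts reads as less positive; treat it as
    # a privacy violation unless it is very light.
    if location == "vulnerable" and intensity != "light":
        return "invasive"
    # Example associations after Burwell (1999):
    if modality == "handshake":
        return "gratitude"
    if intensity == "moderate" and duration > 3.0:
        return "sympathy"
    if intensity == "strong" and duration <= 3.0:
        return "anger"
    return "neutral"
```

In a real system each factor would come from the skin's sensor outputs rather than being supplied symbolically, and the rules would be learned or tuned against observed human judgements.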

Sensitive skin with electrical impedance tomography

Electrical impedance tomography (EIT) is a technique, used mainly in medical imaging, for estimating the internal conductivity of an electrically conductive body using only measurements made at its boundary. Typically, electrodes are placed around the boundary of the conductive body (e.g. the human thorax) and an alternating current (typically 0.1–1.0 mA at 10–100 kHz) is applied to a pair of them. The current flows not only between the two electrodes but through the whole conductive body, and the resulting potentials can be measured at all electrodes. A change in the internal conductivity of the body alters the current pathways, causing changes to the electrical potentials measured at the boundary. By repeating these steps while scanning around the boundary, it is possible to estimate the conductivity distribution inside the body. If direct current is used instead of AC and the same method is applied, the technique is called electrical resistance tomography (ERT) (Holder, 2005).
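The scanning procedure described above is commonly implemented as the “adjacent” drive/measurement pattern: current is injected across each neighbouring electrode pair in turn, and voltages are read across every other neighbouring pair. The sketch below is purely illustrative (generic electrode indexing, not tied to any particular EIT hardware):

```python
# Sketch of the standard "adjacent" drive/measurement pattern for EIT.
# For n electrodes arranged in a ring, each of the n adjacent drive
# pairs allows voltage readings on the n-3 adjacent pairs that do not
# carry current, giving n*(n-3) measurements per frame.

def adjacent_pattern(n_electrodes):
    """Enumerate (drive_pair, measure_pair) combinations for one frame."""
    measurements = []
    for d in range(n_electrodes):
        drive = (d, (d + 1) % n_electrodes)
        for m in range(n_electrodes):
            measure = (m, (m + 1) % n_electrodes)
            # Skip pairs sharing an electrode with the drive pair, since
            # contact impedance corrupts voltages on current-carrying
            # electrodes.
            if set(drive) & set(measure):
                continue
            measurements.append((drive, measure))
    return measurements
```

An eight-electrode ring, for example, yields 8 × (8 − 3) = 40 boundary readings per scan.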

EIT and ERT for sensitive skin were previously reported by Nagakubo, Alirezaei and Kuniyoshi (2007), Kato et al. (2007) and Alirezaei, Nagakubo and Kuniyoshi (2009), who placed electrodes at the border of a rubber or fabric material that changes its local conductivity with applied pressure. If the conductivity in the material changes, the current distribution also changes, and the local changes in conductivity (and therefore pressure) can be identified using EIT/ERT. Since most of the sensing area is made of a single material without any wiring, a flexible and stretchable “skin” can be realised.

Experiments were performed with a conductive cloth manufactured by Less EMF Inc. This material is a highly conductive (surface resistivity approximately 1.5 Ω/square), medical-grade, silver-plated 92% Nylon 8% Dorlastan fabric that stretches in both directions; its conductivity changes as it is stretched or compressed.

Eight electrodes were positioned around a 200 mm diameter circular piece of cloth and a constant DC current of 100 mA was injected across pairs of electrodes. Figure 4 (left) shows this experiment, with pressure being applied to the conductive cloth through a square block. Figure 4 (right) depicts a false-colour reconstruction of the measured pressure, with dark blue representing areas of higher pressure.

This sensitive skin shows many improvements over earlier approaches, as it is flexible, stretchable and relatively easy to manufacture. Initial results are encouraging, although further work is needed to improve spatial resolution and discrimination.
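For readers interested in how such a reconstruction can be computed, the following is a minimal numerical sketch of one standard approach: linearized difference imaging with Tikhonov regularization. The sensitivity matrix here is a random stand-in for one derived from a physical model of the cloth, and the regularization parameter is chosen arbitrarily; this is not the code used to produce Figure 4.

```python
# Minimal sketch of linearized EIT difference imaging with Tikhonov
# regularization. J is a random stand-in for a sensitivity (Jacobian)
# matrix computed from a model of the conductive cloth; lam is an
# arbitrary illustrative regularization parameter.
import numpy as np

rng = np.random.default_rng(0)
n_meas, n_pix = 40, 64        # 8-electrode adjacent pattern, 8x8 pixel grid
J = rng.standard_normal((n_meas, n_pix))

# Simulated "pressed" conductivity change: a single perturbed pixel.
d_sigma_true = np.zeros(n_pix)
d_sigma_true[27] = 1.0
dv = J @ d_sigma_true         # boundary voltage change, v - v0

# One-step regularized least squares:
#   d_sigma = argmin ||J s - dv||^2 + lam^2 ||s||^2
lam = 1e-2
A = J.T @ J + lam**2 * np.eye(n_pix)
d_sigma = np.linalg.solve(A, J.T @ dv)

print(int(np.argmax(np.abs(d_sigma))))  # index of the strongest response
```

With a well-conditioned sensitivity matrix, the strongest reconstructed response coincides with the pressed region; in practice the regularization parameter trades spatial resolution against noise sensitivity, which is exactly the discrimination issue noted above.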

Concluding comments

Our results suggest that electrical impedance tomography (EIT) can provide a sound basis for the implementation of a sensitive artificial skin that is suitable to be applied over large areas of a robot. EIT can provide the underlying sensing technology; an even more challenging problem is the interpretation of touch on a robot. Our future work will address this problem.

Biographies

Mari Velonaki is a media artist and researcher who has worked in the field of interactive installation art since 1995. Her practice engages the spectator/participant with digital and robotic ‘characters’ in interplays stimulated by sensory triggered interfaces. Her innovative human-machine interfaces promote intimate and immersive relationships between participants and interactive artworks. Mari’s work has been exhibited widely, including at ARoS Denmark, Wood Street Galleries, Ars Electronica and China Millennium Art Museum. She was awarded a PhD in Media Arts by COFA in 2003 and an Australia Council Visual Arts Fellowship in 2007. In 2006, with David Rye, she co-founded the Centre for Social Robotics within the ACFR at the University of Sydney (www.csr.acfr.usyd.edu.au).

David Silvera Tawil is a PhD student with a deep interest in social robotics and human-robot interaction. With a background in electronics and telecommunications engineering and further studies in mechatronic engineering, David Silvera’s previous research work involved interactive interfaces and remote laboratories for distance learning. His current research contributes to the understanding of the physicality that is possible between humans and robots, particularly by identifying the location and character of human touch on the artificial ‘skin’ of a robot.

Dr. David Rye works in embedded and applied control of machinery, and in the design and implementation of computer-controlled systems. Although his background is originally in mechanical engineering, he now works principally on computerised machinery, electronics, software and systems design. Dr Rye has conducted industrial research and development projects related to automation and control of machinery, including shipboard and container-handling cranes and the system design and experimental validation of autonomous vehicles. Since 2003 he has worked on human-robot interaction in a media arts context. David is also internationally recognised as a pioneer in the introduction and development of university teaching in mechatronics, having instituted the first Australian Bachelor of Engineering in mechatronics in 1990.

Endnotes

[1] Here the term “android” is used for a robot that is built in imitation of a human, regardless of “gender”. Strictly, the term “gynoid” should be used for a “female” robot, but this term is not widely used.

References

Alirezaei, H., Nagakubo, A., & Kuniyoshi, Y. ‘A tactile distribution sensor which enables stable measurement under high and dynamic stretch’, In Proc. IEEE 2009 Symposium on 3D User Interfaces. (Mar-2009): 87-93.

Arkin, R.C., Fujita, M, Takagi, T. & Hasegawa, R. ‘An ethological and emotional basis for human–robot interaction’, Robotics and Autonomous Systems (2003): 42(3–4) 191–201.

Bickmore, T.W. & Picard, R.W. ‘Establishing and maintaining long-term human-computer relationships’, ACM Trans. Computer-Human Interaction (2005): 12(2).

Breazeal, C. & Scassellati, B. ‘A context-dependent attention system for a social robot’, In Proc. 16th Int. Joint Conf. on Artificial Intelligence (1999): 1146–1151.

Burwell, J. ‘May I touch you? Haptics in the multicultural workplace’, Gender Journal: Men and Women Working Together (1999).

Chang, W., Kim, K. E., Lee, H., Cho, J. K., Soh, B. S., Shim, J. H., Yang, G., Cho, S.-J. & Park, J. ‘Recognition of grip-patterns by using capacitive touch sensors’, In Proc. 2006 IEEE Int. Symp. Industrial Electronics (2006): 2936–2941.

DiSalvo, C., Gemperle, F., Forlizzi, J. & Montgomery, E. ‘The Hug: An exploration of robotic form for intimate communication’, In Proc. IEEE Int. Workshop on Robot and Human Interactive Communication (2003): 403–408.

Dubois, J. ‘Tact’ (2001). In Can we fall in love with a machine? (Pittsburgh Cultural Trust: C. Hart and A. Ackerman, 2006), 23.

Duffy, B.R. ‘Anthropomorphism and the social robot.’ Robotics and Autonomous Systems (2003): 42(3–4) 177–190.

Field, T. Touch (Cambridge, Mass.: MIT Press., c2001).

Fleischmann, M. ‘Liquid Views’ (1994). In Interaction 97 (Japan: Sakane, I., 1997), 37–39.

Gockley, R., Bruce, A., Forlizzi, J., Michalowski, M., Mundell, A., Rosenthal, S., Sellner, B., Simmons, R., Snipes, K., Schultz, A.C., & Wang J. ‘Designing robots for long-term social interaction’, In Proc. 2005 IEEE/RSJ Int. Conf. Intelligent Robots and Systems (2005): 1338–1343.

Guertin, J.A. & Townsend, W.T. ‘Teleoperator slave – WAM design methodology’, Industrial Robot (1999): 26(3).

Hegel, F., Krach, S., Kircher, T., Wrede, B. & Sagerer, G. ‘Understanding social robots: A user study on anthropomorphism’, In Proc. IEEE Int. Symposium on Robot and Human Interactive Communication. (Aug, 2008): 574–579.

Hertenstein, M. J., Keltner, D., App, B., Bulleit, B. A. & Jaskolka, A. R. ‘Touch communicates distinct emotions’, Emotion (2006): 6(3) 528–533.

Heslin, R. ‘Steps toward a taxonomy of touching’, paper presented to the annual meeting of the Midwestern Psychological Association, Chicago, IL (May, 1974)

Holder, D. S. Electrical Impedance Tomography (Bristol and Philadelphia, Institute of Physics Publishing, 2005).

Hoshi, T., & Shinoda, H. ‘A sensitive skin based on touch-area-evaluating tactile elements’, In Proc. 14th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, (Mar 2006): 89–94.

Hosoda, K. ‘Robot finger design for developmental tactile interaction’, in Lecture Notes in Computer Science – Embodied Artificial Intelligence (Springer: Berlin/Heidelberg, 2004), 3139 629.

Hwang, J.-H., Lee, K.-W. & Kwon, D.S. ‘The role of mental model and shared grounds in human-robot interaction’, Proc. IEEE Int. Workshop on Robot and Human Interactive Communication (2006): 623–628.

Ishiguro, H. ‘Towards a new cross-interdisciplinary framework’, Cognitive Science Society (2005).

Ishiguro, H. and Nishio, S. ‘Building artificial humans to understand humans’, Proc. Japanese Society for Artificial Organs (2007): 133–142.

Jones, S. E. and Yarbrough, E. ‘A naturalistic study of the meaning of touch’, NIMA (April, 2007).

Kahn, P.H., Freier, N.G., Friedman, B., Severson, R.L. and Feldman, E.N. ‘Social and moral relationships with robotic others?’, Proc. 13th IEEE Int. Workshop on Robot and Human Interactive Communication (2004): 545–550.

Kato, Y., Mukai, T., Hayakawa, T. & Shibata, T. ‘Tactile sensor without wire and sensing element in the tactile region based on EIT method’, IEEE Sensors (Oct, 2007): 792-795.

Lumelsky, V.J., Shur, M.S. & Wagner, S. ‘Sensitive skin’, IEEE Sensors (2001): 1(1) 41–51.

McDaniel, E. and Andersen, P. A., ‘International patterns of interpersonal tactile communication: A field study’, Nonverbal Behavior (1998): 22(1).

Minato, T., Yoshikawa, Y., Noda, T., Ikemoto, S., Ishiguro, H. & Asada, M. ‘CB2: a child robot with biomimetic body for cognitive developmental robotics’, Proc IEEE-RAS/RSJ Int. Conf Humanoid Robots (Nov 29 – Dec 1, 2007).

Mori, M. ‘Bukimi no tani.’ [The Uncanny Valley] Energy (1970): 7(4). In English: K.F. MacDorman & T. Minato (trans). See http://www.androidscience.com/theuncannyvalley/proceedings2005/uncannyvalley.html (29 Feb, 2008).

Mukai, T., Onishi, M., Odashima, T., Hirano, S. & Luo, Z. ‘Development of the tactile sensor system of a human-interactive robot RI-MAN’, IEEE Trans. Robotics (Apr, 2008): 24(2) 505–512.

Mukai, T. ‘Development of soft areal tactile sensors for symbiotic robots using semiconductor pressure sensors’, IEEE 2004 International Conference on Robotics and Biomimetics (Aug, 2004): 96–100.

Nagakubo, A., Alirezaei, H. and Kuniyoshi, Y. ‘A deformable and deformation sensitive tactile distribution sensor’, IEEE 2007 International Conference on Robotics and Biomimetics (Dec, 2007): 1301–1308.

Nicholls, H. ‘Tactile sensing for robotics’, IEE Colloquium on Robot Sensors, (Jan, 1991): 5/1–5/3.

Nebot, E.M. The University of Sydney, personal communication (Nov, 2005).

Odashima, T., Onishi, M., Tahara, K., Takagi, K., Asano, F., Kato, Y., Nakashima, H., Kobayashi, Y., Mukai, T., Luo, Z. & Hosoe, S. ‘A soft human-interactive robot RI-MAN’, Proc. 2006 IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Video V018, 2006.

Pan, Z., Cui, H. & Zhu, Z. ‘A flexible full-body tactile sensor of low cost and minimal connections’, Proc. IEEE 2003 Int. Conf. on Systems, Man and Cybernetics (2003): 2 368–373.

Papakostas, T., Lima, J. & Lowe, M. ‘A large area force sensor for smart skin applications’, IEEE Sensors (2002): 2 1620–1624.

Rossiter, J. & Mukai, T. ‘An LED-based tactile sensor for multi-sensing over large areas’, Proc. 5th IEEE Conference on Sensors (Oct, 2006): 835–838.

Simaan, N. ‘Snake-like units using flexible backbones and actuation redundancy for enhanced miniaturization’, Proc. 2005 IEEE Int. Conf. Robotics and Automation (2005): 3 012-017.

Schiphorst, T. ‘Bodymaps: Artefacts of touch’ (1995). in Can we fall in love with a machine? (Pittsburgh Cultural Trust : C. Hart and A. Ackerman, 2006), 22 and 56–59.

Shibata, T. ‘An overview of human interactive robots for psychological enrichment’, Proc. IEEE, (2004): 92(11) 1749–1758.

Shimojo, M., Namiki, A., Ishikawa, M., Makino, R. and Mabuchi K. ‘A tactile sensor sheet using pressure conductive rubber with electrical-wires stitched method’ IEEE Sensors (Oct, 2004): 4(5) 589–596.

Shirado, H., Nonomura, Y. & Maeno, T. ‘Realization of human skin-like texture by emulating surface shape pattern and elastic structure’, 14th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, (Mar, 2006): 295–296.

Someya, T., Sakurai, T. and Sekitani, T. ‘Large-area electronics based on organic transistors’, Device Research Conference, 64th (2006): 209–210.

Someya, T., Kato, Y., Sekitani, T., Iba, S., Noguchi, Y., Murase, Y., Kawaguchi, H., & Sakurai, T. ‘Conformable, flexible, large-area networks of pressure and thermal sensors with organic transistor active matrixes’, Proceedings of the National Academy of Sciences, (2005): 102(35): 12321–12325.

Stiehl, W.D. & Breazeal, C. ‘Applying a “somatic alphabet” approach to inferring orientation, motion, and direction in clusters of force sensing resistors’, Proc. 2004 IEEE/RSJ Int. Conf. Intelligent Robots and Systems (2004): 3015–3020.

Stiehl, W.D. & Breazeal, C. ‘A sensitive skin for robotic companions featuring temperature, force, and electric field sensors’, Proc. 2006 IEEE/RSJ Int. Conf. Intelligent Robots and Systems (2006): 1952–1959.

Stiehl, W. D. & Breazeal, C. ‘Affective touch for robotic companions’, in Lecture Notes in Computer Science – Affective Computing and Intelligent Interaction (Springer: Berlin / Heidelberg, 2005), Vol.3784 747–754.

Stiehl, W.D., Lieberman, J., Breazeal, C., Basel, L., Lalla, L. & Wolf, M. ‘Design of a therapeutic robotic companion for relational, affective touch’, Proc. IEEE International Workshop on Robot and Human Interactive Communication (2005): 408–415.

Tada, Y., Inoue, M., Kawasaki, T., Kawahito, Y., Ishiguro, H. & Suganuma, K. ‘A flexible and stretchable tactile sensor utilizing static electricity’, Proc. IEEE/RSJ Int. Conference on Intelligent Robots and Systems, (29 Nov, 2007): 684–689.

Takamuku, S., Gomez, G., Hosoda, K. & Pfeifer, R. ‘Haptic discrimination of material properties by a robotic hand’, Proc. IEEE 6th International Conference on Development and Learning. (July, 2007): 1–6.

d’Urbano, A. ‘Touch Me’ (1995). In Are Our Eyes Targets? (ZKM/Centre for Art and Media Karlsruhe: Schwarz, H.-P., 1997).

Wada, K., Shibata, T & Tanie, K. ‘Analysis of factors that bring mental effects to elderly people in robot assisted activity’, Proc. 2002 IEEE/RSJ Int. Conf. Intelligent Robots and Systems (2002): 1152–1157.

Wagner, J.J., Van der Loos, H.F.M. & Leifer, L.J. ‘Construction of social relationships between user and robot’, Robotics and Autonomous Systems (2000): 31(3) 185–191.

Walker, I. D. ‘Some issues in creating “invertebrate” robots.’ Proc. Int. Symp. on Adaptive Motion of Animals and Machines, Montreal, Canada (2000).

Walters, M.L., Dautenhahn, K., te Boekhorst, R., Koay, K.L., Kaouri, C., Woods, S., Nehaniv, C., Lee, D. & Werry, I. ‘The influence of subjects’ personality traits on personal spatial zones in a human-robot interaction experiment’, In Proc. IEEE Int. Workshop on Robot and Human Interactive Communication (2005): 347–352.

Woods, S., Dautenhahn, K. & Schulz, J. ‘The design space of robots: investigating children’s views’, Proc. 13th IEEE Int. Workshop on Robot and Human Interactive Communication. (Sept, 2004): 47–52.

Yamada, Y., Morizono, T., Umetani, Y. and Takahashi, H. ‘Highly soft viscoelastic robot skin with a contact object-location-sensing capability’, IEEE Trans. Industrial Electronics (2005): 52(4) 960–968.
