
Human–Robot Tactile Interaction*

Thomas Wösch, Wendelin Feiten
Corporate Technology, Siemens AG, 81730 Munich, Germany
{thomas.woesch, wendelin.feiten}@mchp.siemens.de
Abstract

In the field of service robotics, robots serve and assist human beings. It is natural for humans to interact directly with the robot via tactile interfaces. This paper introduces several kinds of tactile interaction between a user and the robot, as well as interactions of the robot with the environment. All interactions are implemented in a single paradigm: forces measured by the tactile sensors result in motion vectors at the contact points. The motion vectors from different sensors are superimposed and then determine the robot's joint velocities. We present results from our experimental setup, consisting of an 8 degrees of freedom manipulator arm mounted on a mobile platform. In the illustrated example, a human interacts with the robot using only the tactile interface.

1 Introduction

One of the most challenging areas in robotics in recent years has been, and still is, service robotics. In general, service robots perform non-repetitive tasks. The working environment is dynamic and in most cases only partially known, so powerful planning and control structures are required to move collision-free. Because these robots serve, assist and work with humans, their first priority is to guarantee human safety. The motions should look human-like: no one will trust a machine that acts in a way that seems unpredictable to an observer. The robot must offer interaction possibilities that appear natural to a user, e.g. natural language interaction or gesture recognition. The subject of this paper is the direct interaction between a human and a robot. We focus on tactile interaction using a new kind of tactile sensor, briefly described in section 3. Different interaction possibilities are introduced in section 4, ranging from special motion modes to a tactile alphabet for setting these modes. When the system gets in contact with the environment or a human, the tactile sensors provide information about the contact points and the pressures. Appropriate to the activated mode, we determine for each contact point a motion vector describing a motion for that contact point (see sections 5 and 6). How this motion vector results in a motion of the robot is described in section 7. The experimental results, together with a short description of our experimental setup, are presented in section 8. The paper is concluded in section 9.
* This work was partially funded under contract 01IN601A2 and 01IL902DO by the BMBF, Morpha project.

2 Related Work

Especially in the field of service robotics, user-robot interaction is of crucial interest. [1] introduces a robot that uses patterns of interaction similar to those found when people interact with each other; this makes the robot's motions appear more trustworthy to the humans in the robot's environment. Speech and contact panels are used to interact with the robot. Motivational systems that express the robot's mood and enable sophisticated human-robot interaction for an autonomous robot are investigated in [2, 3]. [4] uses tactile information from a force/torque sensor mounted at the tool center point of the robot to modify the tool center point's trajectory. Various interaction possibilities between a human and a robot can be found in the literature; since this paper is focused on tactile interaction, we will only consider those. [5] proposes an artificial skin for a snake-like manipulator, based on a resistive principle. [6] introduces a modularized sensitive skin using infrared sensors. An acoustic resonant tensor cell for tactile sensing is shown in [7]. [8] proposes a full-body flexible sensor suit made of conductive fabric and strings. An often-used principle is the capacitive one, as described in [9, 10]. A disadvantage of most of these sensor systems is that their shock absorbency is low. The sensor information can be used to control a robot platform or a manipulator arm. A lot of work has been done in recent years on controlling a robot system in an environment of average complexity. Reactive control schemes based on artificial forces [11, 12, 13] have been very popular for controlling a robot. A reason may be the ease with which they can be used and extended to higher-dimensional problems with more than four degrees of freedom (DOF).

In addition, they are faster than planning techniques [14, 15] and can work on-line. Their major drawback, and the reason why slower planning techniques are still required in robot motion planning, is that reactive mechanisms suffer from local minima. Frameworks [16] in which reactive control schemes are merged with planning components have been proposed; they combine the advantages of reactive controllers (speed) with those of planners (finding a solution whenever one exists).

3 Tactile Sensor

Figure 1: Experimental setup with tactile sensors.

Tactile sensor segments¹ are mounted all over the robot, as illustrated in Figure 1a. Figure 1c shows one tactile sensor segment in detail. The segment consists of two conductive foams separated by an insulating net. When pressure is exerted on the foams, the two foams come into contact; the higher the pressure, the lower the resistance between the foams at the contact spot. Both the point of pressure and the pressure itself are measured. For this purpose, each tactile sensor segment is connected to a micro-controller. When more than one sensor segment is used, the controllers are connected via a CAN bus, and the segments are referenced by unique identification numbers. Each controller can act as a master; bus sharing is done according to the master-master principle. No polling is required to obtain sensor information, which allows efficient data acquisition.

¹ Siemens patent, cat. no. 19959703.0
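To illustrate the data path from a sensor segment to the controller software, the sketch below decodes a hypothetical CAN frame into a segment id, a voltage tuple and a pressure value. The frame layout (three 16-bit fields), the scaling factors and the names are assumptions for illustration only; the actual protocol of the sensor system is not specified in the paper.

```python
import struct
from dataclasses import dataclass

@dataclass
class TactileReading:
    segment_id: int   # unique identification number of the sensor segment
    u: tuple          # voltage tuple (u_x, u_y) locating the pressure point
    pressure: float   # measured pressure k_press (normalized, assumed scale)

def decode_frame(can_id: int, data: bytes) -> TactileReading:
    """Decode one hypothetical 6-byte tactile frame: u_x, u_y, pressure as uint16."""
    u_x, u_y, raw_p = struct.unpack(">HHH", data[:6])
    # Assumed scaling: 16-bit ADC counts mapped to a 0-5 V range / a 0-1 pressure range.
    return TactileReading(
        segment_id=can_id,
        u=(u_x * 5.0 / 65535.0, u_y * 5.0 / 65535.0),
        pressure=raw_p / 65535.0,
    )

# Example: a frame broadcast by segment 17 (all values invented for illustration).
reading = decode_frame(17, struct.pack(">HHH", 31000, 12000, 52000))
print(reading)
```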

4 Tactile Interaction

Our experimental setup allows us to investigate different types of tactile interaction. These comprise interaction of the user with the robot as well as interaction of the robot with the environment. This section explains these different types of interaction with their different modes. How all of these types are implemented in a single paradigm is shown in sections 5 to 7.

4.1 User guided robot motion

Among the first implementations of user guided robot motion is the guidance of a tool using a force/torque sensor and zero-force control. The force and torque measured by the sensor are compensated for those induced by the tool; the remaining part is attributed to the user's impact on the tool and controlled to zero by appropriate robot motions. Similarly, with our tactile skin the user can apply a force to the robot and make it move. In contrast to classical zero-force control, the tactile skin is not influenced by workpieces or tools, so we do not have to compensate for those influences; the measured forces can be used directly. In the simplest case, the robot will evade the touch. How the motion is derived from the contact force is explained in section 5. However, our tactile skin allows us to implement far more complex modes of tactile interaction than this simple Push mode. If the user touches the robot while it is in Pull mode, retracting the contact point will make the robot follow to keep the contact. These modes are not restricted to single contacts, but extend to multiple contacts with the robot. Each individual sensor segment can be set to Push or Pull individually, and the resulting robot motions are superimposed. This also allows a robot link to be grasped and guided precisely, which can be used to interactively control the redundant degrees of freedom.
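The Push and Pull behaviour can be illustrated with a small sketch: each segment carries its own mode, and the contact force measured on a segment is turned into a desired motion at the contact point, either along the applied force (Push, the robot evades) or against it (Pull, the robot follows to keep contact); the contributions of several segments are superimposed, cf. section 6. The data structures and the sign convention are our own illustrative assumptions, not the authors' implementation.

```python
from enum import Enum
import numpy as np

class Mode(Enum):
    PUSH = 1   # robot evades the touch
    PULL = 2   # robot follows a retracting contact

def contact_motion(mode: Mode, force: np.ndarray) -> np.ndarray:
    """Desired motion at a contact point for one segment (sign convention assumed)."""
    return force if mode is Mode.PUSH else -force

def superimpose(contacts) -> np.ndarray:
    """Sum the motion contributions of all touched segments."""
    return sum((contact_motion(m, f) for m, f in contacts), np.zeros(3))

# Two simultaneous contacts on one link: one segment in Push mode, one in Pull mode.
total = superimpose([(Mode.PUSH, np.array([0.0, 1.0, 0.0])),
                     (Mode.PULL, np.array([0.5, 0.0, 0.0]))])
print(total)   # -> [-0.5  1.   0. ]
```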

4.2 Tactile robot-human communication on a symbolic level

Since the robot has different modes of tactile interaction, mechanisms are needed to switch between these modes. One intuitive way to set the modes is to use the tactile sensor itself. For this purpose, a meaning is assigned to certain contact points, to the time for which they are pressed and to the order in which they are pressed; this may be compared to the Morse alphabet. For example, two short impacts with a short break could set a given segment to Push mode, and two short impacts with a longer break could set it to Pull mode. Of course, the information given to the robot can also be more complex. For example, touching the inner gripper pads in an alternating fashion with a certain frequency - left-right-left-right within one second - tells the robot to close the gripper and grasp the object between the fingers, i.e. it sets the robot to Grasp mode. Along these lines, a general tactile input alphabet can be defined. However, care has to be taken that this alphabet does not interfere with other tasks of the tactile sensor, and that it is intuitive for the user. This also implies that similar motions are used to convey similar information.

Continuing the example of grasping, the robot indicates that it is ready to hand over an object by moving the object left-right-left-right within one second. Then the user grasps the object and tells the robot that he or she is now holding the object and that the robot should release it, by wriggling it left-right-left-right within one second between the robot fingers. Finding the optimal tactile alphabet requires a lot of experiments and is far from easy. In this investigation, inspiration might be found in communication theory.
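To make the tactile alphabet concrete, the following sketch classifies two short impacts on one segment by the length of the pause between them, following the example above (short break sets Push mode, longer break sets Pull mode). The timing thresholds and the event representation are illustrative assumptions; the paper does not fix concrete values.

```python
# Tap events as (start_time, end_time) in seconds for one sensor segment.
SHORT_TAP_MAX = 0.25      # assumed maximum duration of a "short impact"
SHORT_BREAK_MAX = 0.40    # assumed boundary between a short and a long break

def classify_double_tap(taps):
    """Return 'PUSH', 'PULL' or None for a sequence of two short impacts."""
    if len(taps) != 2:
        return None
    (s1, e1), (s2, e2) = taps
    if (e1 - s1) > SHORT_TAP_MAX or (e2 - s2) > SHORT_TAP_MAX:
        return None                     # not two *short* impacts
    gap = s2 - e1
    return "PUSH" if gap <= SHORT_BREAK_MAX else "PULL"

print(classify_double_tap([(0.0, 0.1), (0.3, 0.4)]))   # short break  -> PUSH
print(classify_double_tap([(0.0, 0.1), (0.8, 0.9)]))   # longer break -> PULL
```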

4.3 Tactile interaction of the robot with the environment
Obviously, obstacle detection is a very important feature for an autonomous robot. With the tactile sensor, the robot can detect contacts all over its surface. Therefore, if the robot moves autonomously (not under user supervision or tactile guidance), the tactile sensor segments are set to Obstacle Detection mode. In this mode, each contact in the direction of motion is interpreted as a collision, and the motion is stopped and then reversed. Since the contact in this case is longer than the short pulses used in the tactile alphabet, tactile communication on the symbolic level remains available even in this mode. The tactile sensor on our robot can also be used to build a model of the environment. In Exploration mode, the robot actively (though carefully) seeks contact with the environment and, with reference to the known robot platform configuration [17], inserts vertices into a 3D surface representation.
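A minimal sketch of one possible reading of the Obstacle Detection reaction: a contact is treated as a collision when the contact point currently moves against the measured contact force, i.e. the obstacle lies in the direction of motion, and the commanded velocity is then stopped and reversed. The collision test, the threshold and the reversal rule are our own assumptions for illustration.

```python
import numpy as np

def obstacle_reaction(v_contact: np.ndarray, f_contact: np.ndarray, eps: float = 1e-3):
    """Commanded velocity in Obstacle Detection mode (assumed reaction rule).

    A collision is assumed when the contact point moves against the measured
    contact force, i.e. the contact lies in the direction of motion.
    """
    if np.dot(v_contact, f_contact) < -eps:
        return -v_contact          # stop the current motion and reverse it
    return v_contact               # incidental contact: keep moving

print(obstacle_reaction(np.array([0.2, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0])))
```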

4.4 Interactive tactile sensor calibration

Due to inhomogeneities of the sensor material and variations in the mechanical construction, the tactile sensor system has to be calibrated. This calibration is based on a geometrical model of the robot, with the robot surface patches associated with tactile sensor segments. The calibration software displays this model on a screen, highlighting characteristic surface points on the robot one by one (see Figure 2a). In our experiments, we choose these points to be the corners of the rectangular surface patches, with additional points on more complicated surface patches. For each highlighted point on the screen, the user decides which spot on which tactile sensor segment corresponds to this point and exerts a force on it (see Figure 2b). From this contact, the sensor system automatically determines the corresponding sensor segment and adds the segment id and the corresponding tuple of voltages to a list. Section 5 shows how such a tuple of voltages is aligned with points on the robot.

Figure 2: Interactive tactile sensor calibration.

4.5 Mode switching

The modes of tactile interaction need to be appropriate for the current state and goals of the robot. As discussed above, an important mechanism for setting the mode of tactile interaction is a direct user command. This can be given using the tactile sensor itself, but also via other information channels such as natural language interaction, gesture recognition or keyboard commands. Tactile sensor mode switching is also part of autonomous robot operation: just as other sensor modalities are selected (together with the target state of the environment in the image of that sensor), the mode of the tactile sensor appropriate for achieving a given subgoal is part of the task description (see [18]).

5 Force generation

Tactile sensors are mounted on our whole experimental system, as shown in Figure 1a. In this section we consider only one tactile sensor segment and explain how pressing on the sensor (see Figure 3a) leads to a corresponding force (see Figure 3b) used for motion generation (see section 6). When a force is exerted on the sensor shown in Figure 3a, the sensor system evaluates the pressure point u ∈ V ⊂ R², which lies within the voltage tuple domain V ⊂ R² of the sensor, as shown in Figure 3c. The vertices

$v_i \in V \subset \mathbb{R}^2, \quad \forall i = 1, \ldots, 4$

describe the voltage tuples aligned with the corner points of the tactile sensor. The voltage tuple set forms a convex quadrangle. Introducing the vectors

$\tilde{v}_1 = v_4 - v_1, \qquad \tilde{v}_2 = v_2 - v_1, \qquad \tilde{v}_3 = v_3 - v_2 ,$

the convex combination of the pressure point can be described by

$u = v_1 + \lambda \tilde{v}_1 + \alpha \left( \tilde{v}_2 + \lambda (\tilde{v}_3 - \tilde{v}_1) \right)$   (1)

with

$\lambda, \alpha \in [0, 1]$ .   (2)

Using the notation

$u = \begin{pmatrix} u_x \\ u_y \end{pmatrix}, \quad v_1 = \begin{pmatrix} v_x \\ v_y \end{pmatrix}, \quad \tilde{v}_1 = \begin{pmatrix} a_x \\ a_y \end{pmatrix}, \quad \tilde{v}_2 = \begin{pmatrix} b_x \\ b_y \end{pmatrix}, \quad \tilde{v}_3 = \begin{pmatrix} c_x \\ c_y \end{pmatrix}$ ,   (3)

the parameters λ and α are obtained by

$\lambda = \frac{1}{2F} \left( \pm D + \sqrt{-4AB + C^2} \right), \qquad \alpha = \frac{1}{2A} \left( \pm E + \sqrt{-4AB + C^2} \right)$   (4)

with

$A = a_y b_x - a_x b_y + b_y c_x - b_x c_y$
$B = a_y u_x - a_x u_y - a_y v_x + a_x v_y$
$C = c_y u_x - c_x u_y - a_y (b_x + u_x - v_x) - c_y v_x + a_x (b_y + u_y - v_y) + c_x v_y$
$D = c_y u_x - c_x u_y + a_y (b_x - u_x + v_x) + c_x v_y - c_y v_x - a_x (b_y - u_y + v_y)$
$E = -c_y u_x + c_x u_y + a_y (b_x + u_x - v_x) + c_y v_x - a_x (b_y + u_y - v_y) - c_x v_y$
$F = a_y c_x - a_x c_y$ .

Figure 3: Force generation and contact point evaluation by pressing a tactile sensor.

The voltage domain V is aligned with a virtual convex quadrangle W ⊂ R², as illustrated in Figure 3d. The point f ∈ W ⊂ R² in the domain of the virtual surface W corresponds to the original pressure point u in the voltage tuple domain V. Using the vectors

$\tilde{w}_1 = w_4 - w_1, \qquad \tilde{w}_2 = w_2 - w_1, \qquad \tilde{w}_3 = w_3 - w_2$

and the parameters (3) and (4), the point f is given by the convex combination

$f = w_1 + \lambda \tilde{w}_1 + \alpha \left( \tilde{w}_2 + \lambda (\tilde{w}_3 - \tilde{w}_1) \right)$ .   (5)

For each corner of the virtual convex quadrangle W in Figure 3d, additional vectors

$n_i \in \mathbb{R}^3, \quad \forall i = 1, \ldots, 4$   (6)

are defined. These vectors describe a direction for the vertices w_i (∀i = 1, ..., 4) of the virtual quadrangle. When a point inside the convex quadrangle is considered, the corresponding direction vector is approximated. With the parameters (3), (4) and the vectors

$\tilde{n}_1 = n_4 - n_1, \qquad \tilde{n}_2 = n_2 - n_1, \qquad \tilde{n}_3 = n_3 - n_2 ,$

the vector n ∈ R³ describes a direction for the point f in the convex quadrangle W by

$n = n_1 + \lambda \tilde{n}_1 + \alpha \left( \tilde{n}_2 + \lambda (\tilde{n}_3 - \tilde{n}_1) \right)$ .   (7)

The vertices w_i (∀i = 1, ..., 4) of the quadrangle shown in Figure 3d are aligned with the corners

$s_i \in \mathbb{R}^3, \quad \forall i = 1, \ldots, 4$

on the robot link (see Figure 3b). A simple coordinate transformation is used to describe the alignment. When

$M = \begin{pmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \\ m_{31} & m_{32} & m_{33} & m_{34} \\ 0 & 0 & 0 & 1 \end{pmatrix}$

describes the coordinate transformation between a point on the virtual surface and the robot link, the point f (5) can be transformed into the contact point c ∈ S ⊂ R³ on the robot link by

$c = \begin{pmatrix} m_{11} & m_{12} & m_{13} \\ m_{21} & m_{22} & m_{23} \\ m_{31} & m_{32} & m_{33} \end{pmatrix} \begin{pmatrix} f \\ 0 \end{pmatrix} + \begin{pmatrix} m_{14} \\ m_{24} \\ m_{34} \end{pmatrix}$ .   (8)

Figure 3b shows the force F acting on the contact point c of the robot. Using the approximated direction vector n (7) as the direction of the force and considering the measured pressure $k_{\mathrm{press}}$, the force at the contact point can be written as

$F = k_{\mathrm{press}} \, \frac{n}{\| n \|}$ .   (9)
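The mapping of this section can be prototyped numerically: instead of the closed-form parameters (4), the sketch below inverts the bilinear map (1) with a few Newton steps, then evaluates the virtual-surface point (5), the interpolated direction (7), the contact point (8) and the contact force (9). All numerical values (corner voltages, virtual corners, corner directions, transformation) are made-up calibration data for illustration only.

```python
import numpy as np

def bilinear(p1, p2, p3, p4, lam, alpha):
    """q = p1 + lam*(p4-p1) + alpha*((p2-p1) + lam*((p3-p2)-(p4-p1))), cf. (1), (5), (7)."""
    t1, t2, t3 = p4 - p1, p2 - p1, p3 - p2
    return p1 + lam * t1 + alpha * (t2 + lam * (t3 - t1))

def invert_bilinear(corners, u, iters=20):
    """Newton iteration for (lam, alpha) with bilinear(corners, lam, alpha) = u.

    A numerical stand-in for the closed-form parameters (4) of the paper."""
    lam, alpha = 0.5, 0.5
    for _ in range(iters):
        r = bilinear(*corners, lam, alpha) - u          # residual
        d = 1e-6                                        # finite-difference step
        dlam = (bilinear(*corners, lam + d, alpha) - bilinear(*corners, lam, alpha)) / d
        dalp = (bilinear(*corners, lam, alpha + d) - bilinear(*corners, lam, alpha)) / d
        step = np.linalg.solve(np.column_stack([dlam, dalp]), r)
        lam, alpha = lam - step[0], alpha - step[1]
    return lam, alpha

# Made-up calibration data for one segment.
v = [np.array(c, float) for c in [(0.1, 0.1), (0.1, 4.9), (4.8, 4.7), (4.9, 0.2)]]  # corner voltages v1..v4
w = [np.array(c, float) for c in [(0, 0), (0, 0.05), (0.08, 0.05), (0.08, 0)]]       # virtual corners w1..w4 [m]
n = [np.array(c, float) for c in [(0, 0, 1)] * 4]                                    # corner directions n1..n4
M = np.array([[1, 0, 0, 0.30], [0, 1, 0, 0.00], [0, 0, 1, 0.55], [0, 0, 0, 1]])      # link transform, eq. (8)

u = np.array([2.5, 2.4])          # measured voltage tuple
k_press = 3.0                     # measured pressure

lam, alpha = invert_bilinear(v, u)
f = bilinear(*w, lam, alpha)                                   # point on virtual surface, eq. (5)
direction = bilinear(*n, lam, alpha)                           # interpolated direction,   eq. (7)
c = M[:3, :3] @ np.array([f[0], f[1], 0.0]) + M[:3, 3]         # contact point,            eq. (8)
F = k_press * direction / np.linalg.norm(direction)            # contact force,            eq. (9)
print(np.round(c, 3), np.round(F, 2))
```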

6 Motion in Work Space

Figure 4: Simplified robot link with two collisions.

When the tactile skin detects a contact, a force F (9) and a contact point c (8) are generated as explained in section 5. Such a force and contact point are evaluated for each tactile sensor segment. One robot link can carry more than one tactile sensor segment, but can only move in one direction at a time. Thus all forces generated on a link are summed, resulting in the overall force $F_{\mathrm{link}} \in \mathbb{R}^3$:

$F_{\mathrm{link}} = \sum_{n=1}^{N} F_n$ .   (10)

Figure 4 shows a simplified robot link. Two obstacles O1 and O2 are in contact with the link; the contact points are c1 ∈ R³ and c2 ∈ R³, and F1 ∈ R³ and F2 ∈ R³ describe the generated forces. The control point p ∈ R³,

$p = \frac{1}{N} \sum_{n=1}^{N} c_n$ ,   (11)

is used to describe a motion for the link to which the point is attached. N represents the number of contacts. The distance vectors $r_n \in \mathbb{R}^3$,

$r_n = c_n - p, \quad \forall n = 1, \ldots, N ,$   (12)

describe the distance between the contact points and the control point. Using these distances and the forces generated on the robot link, a torque $M_{\mathrm{link}} \in \mathbb{R}^3$ acting on the control point can be determined by

$M_{\mathrm{link}} = \sum_{n=1}^{N} r_n \times F_n$ .   (13)

The force (10) describes the translational part of the motion for the control point (11) and the torque (13) describes the rotational part. They are summarized in the motion vector δ ∈ R⁶:

$\delta = \begin{pmatrix} F_{\mathrm{link}} \\ M_{\mathrm{link}} \end{pmatrix}$ .   (14)

When there are R robot links to be considered, a motion vector δ_i (i = 1, ..., R) is evaluated for each link, depending on the forces acting on that link. Each link has one control point p_i (i = 1, ..., R) at which the motion vector acts.

7 Motion in Joint Space

A motion in joint space results in a movement of the robot links. Section 6 describes how a motion vector δ (14), describing a motion for a control point p (11) on a robot link, is derived. This vector has to be transformed into joint space to finally result in a motion of the robot links. [19] uses the transposed Jacobian to transform a vector into joint space. The Jacobian J_p expresses the motion of the control point p relative to the robot link to which the control point is attached. The joint torques are evaluated by

$\tau = J_p^{T} \, \delta$ .   (15)

When R robot links are to be considered, each link contributes a torque vector τ_i (i = 1, ..., R) to a total torque. The superimposed joint torques can be used to command a motion of the robot links.
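A compact sketch of the pipeline of sections 6 and 7: the contact forces on a link are summed (10), the control point is the mean of the contact points (11), the torque about the control point follows from (12)-(13), force and torque are stacked into the motion vector δ (14), and the transposed Jacobian maps δ to joint torques (15). The Jacobian here is a placeholder argument; computing it for a specific kinematic chain is outside the scope of this sketch, and the example values are invented.

```python
import numpy as np

def link_motion_vector(contact_points, forces):
    """Motion vector delta = (F_link, M_link) for one link, eqs. (10)-(14)."""
    c = np.asarray(contact_points, float)      # N x 3 contact points
    F = np.asarray(forces, float)              # N x 3 contact forces
    F_link = F.sum(axis=0)                     # eq. (10)
    p = c.mean(axis=0)                         # control point, eq. (11)
    r = c - p                                  # distance vectors, eq. (12)
    M_link = np.cross(r, F).sum(axis=0)        # torque, eq. (13)
    return np.concatenate([F_link, M_link])    # delta in R^6, eq. (14)

def joint_torques(J_p, delta):
    """Map the motion vector into joint space with the transposed Jacobian, eq. (15)."""
    return J_p.T @ delta

# Two contacts on one link (values invented); J_p is a placeholder 6 x n_joints Jacobian.
delta = link_motion_vector([[0.1, 0.0, 0.5], [0.1, 0.2, 0.5]],
                           [[0.0, 1.0, 0.0], [0.0, 0.0, -2.0]])
J_p = np.zeros((6, 8)); J_p[1, 0] = 1.0; J_p[5, 3] = 1.0
print(joint_torques(J_p, delta))
```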

8 Experimental Results

Figure 5: Human-robot interaction.

We have tested the tactile interaction on a real system. It consists of a manipulator arm with 8 DOF mounted on a non-holonomic platform. The whole arm and the sides of the platform are covered by tactile sensors; in total, 45 sensors are mounted on the robot. We used a conventional proportional controller to command the arm and the base. Limitations such as joint limits, velocity limits and acceleration limits were taken into account. The person in Figure 5a-d interacts with the robot. The implementation of the proposed method results in human-like motions, and the generated motions fit harmonically into the human-machine interaction scenario. The user is able to interact freely with the robot. A collision detection system runs in the background; potential self-collisions are detected by the system before the arm moves. The collision detection system and the tactile interaction system are based on the same principle of generating forces acting on the robot, and both systems work smoothly together at the same time.
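The paper only states that a conventional proportional controller with joint, velocity and acceleration limits commands the arm and base; a minimal sketch of such a limited P-controller for a single joint is given below. The gains and limits are illustrative values, not those of the actual system.

```python
import numpy as np

def p_control_step(q, q_des, v_prev, dt, kp=2.0,
                   q_min=-2.9, q_max=2.9, v_max=0.5, a_max=1.0):
    """One proportional control step with joint, velocity and acceleration limits (values assumed)."""
    v_cmd = kp * (q_des - q)                                            # proportional law
    v_cmd = np.clip(v_cmd, v_prev - a_max * dt, v_prev + a_max * dt)    # acceleration limit
    v_cmd = np.clip(v_cmd, -v_max, v_max)                               # velocity limit
    q_next = np.clip(q + v_cmd * dt, q_min, q_max)                      # joint limit
    return q_next, v_cmd

q, v = 0.0, 0.0
for _ in range(50):                       # drive one joint towards 1.0 rad
    q, v = p_control_step(q, 1.0, v, dt=0.02)
print(round(q, 3))
```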


9 Summary, Conclusion and Future Work

This paper introduces different tactile interaction modes. The interaction can be a human-robot interaction or an environment-robot interaction. When the robot is in Push or Pull mode, a human can guide the robot. The tactile sensor is also used as an additional source of information about the environment: in Obstacle Detection mode the robot updates an environment model when collision information is available. The modes can be switched using the tactile sensors according to a tactile alphabet. All modes are implemented in a single paradigm: forces are combined with torques, resulting in motion vectors, and the joint velocities are determined according to these motion vectors. Experimental results show motions that appear human-like and fit well into the human-robot interaction scenario. When the robot is in Push or Pull mode, currently only the amount of pressure is considered. Future work will also take the duration of contact with the tactile sensor into account; a varying amount of pressure over time might also be of interest. The introduced algorithm can fail in local minima and will therefore be combined with planning components. The redundant DOF in our setup will be utilized for tactile interaction and object manipulation at the same time. Merging the proposed tactile interaction modes with general interaction modalities, e.g. natural language interaction, into a multi-modal interaction system will also be investigated.

References
[1] S. Thrun, M. Bennewitz, W. Burgard, A. B. Cremers, F. Dellaert, D. Fox, D. Hähnel, C. R. Rosenberg, N. Roy, J. Schulte, and D. Schulz, "MINERVA: A Tour-Guide Robot that Learns", in KI - Künstliche Intelligenz, pp. 14-26, 1999.

[2] C. Breazeal, "A Motivational System for Regulating Human-Robot Interaction", in AAAI/IAAI, pp. 54-61, 1998.

[3] J. Schulte, C. Rosenberg, and S. Thrun, "Spontaneous short-term interaction with mobile robots in public places", in International Conference on Robotics and Automation, vol. 1, (Detroit, USA), pp. 658-663, May 1999.

[4] R. Voyles and P. Khosla, "Tactile gestures for human/robot interaction", in International Conference on Intelligent Robots and Systems, vol. 3, (Pittsburgh, USA), pp. 7-13, August 1995.

[5] P. Bohner and M. Hoffmann, "An artificial sensitive skin for reactive planning and execution of manipulator paths", in International Symposium on Intelligent Robotic Systems SIRS'95, (Pisa, Italy), 1995.

[6] D. Um, B. Stankovic, K. Giles, T. Hammond, and V. Lumelsky, "A Modularized Sensitive Skin for Motion Planning in Uncertain Environments", in International Conference on Robotics and Automation, vol. 1, (Leuven, Belgium), pp. 7-12, May 1998.

[7] H. Shinoda, K. Matsumoto, and S. Ando, "Acoustic resonant tensor cell for tactile sensing", in International Conference on Robotics and Automation, vol. 4, (Albuquerque, USA), pp. 3087-3092, April 1997.

[8] M. Inaba, Y. Hoshino, K. Nagasaka, T. Ninomiya, S. Kagami, and H. Inoue, "A full-body tactile sensor suit using electrically conductive fabric and strings", in International Conference on Intelligent Robots and Systems, vol. 2, (Osaka, Japan), pp. 450-457, November 1996.

[9] J. Novak and J. Feddema, "A capacitance-based proximity sensor for whole arm obstacle avoidance", in International Conference on Robotics and Automation, vol. 2, (Albuquerque, USA), pp. 1307-1314, May 1992.

[10] H. Shinoda and H. Oasa, "Passive wireless sensing element for sensitive skin", in International Conference on Intelligent Robots and Systems, vol. 2, (Takamatsu, Japan), pp. 1516-1521, October 2000.

[11] O. Khatib, "Real-time Obstacle Avoidance for Manipulators and Mobile Robots", International Journal of Robotics Research, vol. 5, no. 1, pp. 90-98, 1986.

[12] J.-C. Latombe, Robot Motion Planning. Kluwer Academic Publishers, 1991.

[13] H. P. Xie, R. V. Patel, S. Kalaycioglu, and H. Asmer, "Real-Time Collision Avoidance for a Redundant Manipulator in an Unstructured Environment", in International Conference on Intelligent Robots and Systems, vol. 3, (Victoria, Canada), pp. 1925-1930, October 1998.

[14] L. E. Kavraki and J. C. Latombe, "Randomized Preprocessing of Configuration Space for Path Planning: Articulated Robots", in International Conference on Intelligent Robots and Systems, vol. 3, (Munich, Germany), pp. 1764-1772, September 1994.

[15] C. Nissoux, T. Simeon, and J. P. Laumond, "Visibility based probabilistic roadmaps", in International Conference on Intelligent Robots and Systems, vol. 3, (Kyongju, South Korea), pp. 1316-1321, October 1999.

[16] S. Quinlan and O. Khatib, "Elastic Bands: Connecting Path Planning and Control", tech. rep., Robotics Laboratory, Computer Science Department, Stanford University, 1993.

[17] G. Lawitzky, "Das Navigationssystem SINAS", in Proceedings Robotik 2000, VDI-Berichte 1582, (Düsseldorf, Germany), pp. 77-82, VDI-Verlag, June 2000.

[18] G. v. Wichert, T. Wösch, S. Gutmann, and G. Lawitzky, "Mobman - Ein mobiler Manipulator für Alltagsumgebungen", in Autonome Mobile Systeme (AMS'00) (R. Dillmann, H. Wörn, and M. v. Ehr, eds.), Informatik aktuell, pp. 55-62, Springer Verlag, Heidelberg, 2000.

[19] M. W. Spong and M. Vidyasagar, Robot Dynamics and Control. Wiley, New York, NY, 1989.


