2011 Annual Report – Volume 2

1. Core Projects

1.1 Human System Interaction Thrust

1.2 Mobility and Manipulation Thrust

1.2.1 Soft Interaction

ERC Team Members

Lead: Chris Atkeson (CMU RI)

Faculty: Bambi Brewer (Pitt RST)

PhD Students: Siddarth Sanan (CMU RI), Ben Stephens (CMU RI)

MS Students: Heather Markham (Pitt RST)

Undergrads: Brandon Kmetz (Pitt REU, Rose-Hulman Institute of Technology), Mike Ornstein (CMU RI), David Schlesinger (CMU RI)

Goals

One goal of this project is to develop ways for robots to physically interact safely with humans. Many aspects of caregiving demand the ability to gently and safely manipulate humans, in particular transfers between bed, wheelchair, toilet, and bathing area. Feeding, dressing, and grooming also involve physical interaction with the user. The requirements for a manipulation system that touches people are quite different from those for a system that handles inanimate objects, which is why we distinguish the two. Furthermore, soft interaction should be available from a mobile robot, not just from a device rigidly mounted to a floor or wall. Another goal of this work is to enable users (including users with chronic stroke symptoms) to interact with a robot in an intuitive manner in order to accomplish tasks that are typically performed with two hands. The long-term goal is to create ways for individuals with disabilities to interact easily and intuitively with robotic systems.

Role in Support of QoLT Strategic Plan

Soft manipulation is a transformative capability for safely interacting with people, and central to the QoLTbots systems. Intuitive and safe interaction between individuals with disabilities and robotic systems is critical to many QoLT projects. In particular, the techniques developed in this project can be deployed on the HERB and PerMMA systems.

Fundamental Research Barriers; Methodologies to Address Them

One barrier is that computers crash and programs have errors, so we need ways to keep robots safe even when their software fails. We are exploring how to build intrinsically soft robots. Another barrier is that it is difficult for a robot to be simultaneously strong and soft, given current actuation technology. Many robots use high gear ratios in order to be strong, which also makes them stiff, and a great deal of effort must go into counteracting that stiffness to make robots soft and safe. We are exploring the use of skin sensing, wrist and ankle force/torque sensing, and joint force/torque sensing to implement active force control. A third barrier is the lack of intuitive ways to program robots. Many previous robotic systems have required user interaction via a joystick or keypad, which can be challenging and slow for individuals with disabilities. In addition, previous systems have not focused on the types of bimanual activities that become difficult as a result of stroke. We have addressed this by creating a skin interface for the robot arm that supports direct physical interaction.
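To make the force-control idea concrete, below is a minimal admittance-control sketch in Python, one common way to realize active force control from wrist force/torque sensing. The gain value and interface functions are hypothetical placeholders, not the actual robot API.

```python
import numpy as np

# Minimal admittance-control sketch (illustrative only): the arm yields
# in the direction it is pushed by regulating contact force toward zero.
# read_wrist_force() and command_cartesian_velocity() are hypothetical
# stand-ins for the robot's sensing and control interfaces.

ADMITTANCE_GAIN = 0.002      # m/s per N; kept low for gentle contact
FORCE_DESIRED = np.zeros(3)  # comply: regulate contact force to zero

def force_control_step(read_wrist_force, command_cartesian_velocity):
    """One cycle of an active force-control loop."""
    f_measured = read_wrist_force()       # 3-vector, Newtons (sensor frame)
    f_error = FORCE_DESIRED - f_measured
    # Map the force error to a corrective Cartesian velocity; the sign
    # convention assumes the sensor reports force applied to the arm.
    v_command = -ADMITTANCE_GAIN * f_error
    command_cartesian_velocity(v_command)
```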

Achievements

In our work on intrinsically soft robots, we have developed a series of soft robot prototypes. In CY2010 we developed a new inflatable arm structure with new joints and valves that improves payload capacity from 200 g to 500 g. We verified the safety of the inflatable arm at impact velocities up to 5 m/s; impact forces remained below 10 N in these tests. We developed a design procedure for inflatable arms that, given a desired payload and stiffness, selects appropriate geometric parameters and the operating internal pressure for each link. We also developed a control-design synthesis procedure for inflatable arms based on task requirements and safety constraints; it yields a desired bending stiffness for each link such that a performance metric is maximized subject to bounds on the arm's maximum impact force. We developed an inflatable gripper for the arm, which pressurizes to achieve a desired contact force with the object it is gripping. We developed contact detection using tendon forces for the inflatable arm; the scheme uses the joint stiffness of the arm, which is obtained experimentally. Finally, we added degrees of freedom at the base of our prototype arm and demonstrated the arm performing a simple wiping task at the Consumer Electronics Show (CES 2011).
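To illustrate the tendon-based contact detection scheme, the following sketch shows the underlying idea under stated assumptions: an experimentally identified joint stiffness predicts the tendon force needed to hold a given joint angle, and a large residual between measured and predicted force signals contact. All constants and interface names here are hypothetical.

```python
# Illustrative contact detector for a tendon-driven inflatable joint.
# The experimentally identified joint stiffness predicts the tendon
# force for a given deflection; a large residual indicates contact.

K_JOINT = 1.2      # N*m/rad, experimentally identified joint stiffness
R_PULLEY = 0.02    # m, effective tendon moment arm (hypothetical)
THRESHOLD = 2.0    # N, residual above which contact is declared

def predicted_tendon_force(theta, theta_rest=0.0):
    """Tendon force expected to hold the joint at angle theta (rad)."""
    torque = K_JOINT * (theta - theta_rest)
    return torque / R_PULLEY

def contact_detected(theta_measured, tendon_force_measured):
    """Flag contact when measured force departs from the stiffness model."""
    residual = abs(tendon_force_measured - predicted_tendon_force(theta_measured))
    return residual > THRESHOLD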

In our work on soft and safe robots that can balance, move, and help humans with physical tasks using force control, we have developed robust compliant control systems that can respond to large disturbances. In CY2010 we incorporated disturbance and modeling-error rejection into state estimation for force-based balance, and implemented it on our test robot. We also developed a step recovery controller based on model predictive control that predicts the optimal footstep location and center of mass trajectory needed to come to rest after a large balance disturbance such as a push, and implemented it as a real-time controller on our test robot. Finally, we developed a full-body force-based balance controller driven by desired behaviors; combined with Virtual Model Control, it simplifies and performs complex tasks such as helping a person pick up a table, even in the presence of substantial balance disturbances.
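The footstep-placement idea behind step recovery can be illustrated with the linear inverted pendulum "capture point", a standard simplification; the actual controller described above solves a model predictive control problem over footstep location and center of mass trajectory. A minimal sketch:

```python
import math

# Sketch of the footstep-placement concept: for a linear inverted
# pendulum, stepping to the capture point brings the model to rest
# after a push. Illustrative only; not the MPC controller itself.

G = 9.81  # m/s^2

def capture_point(x_com, xdot_com, z_com):
    """1-D capture point for a linear inverted pendulum of height z_com."""
    omega = math.sqrt(G / z_com)
    return x_com + xdot_com / omega

# Example: CoM at 0 m moving at 0.5 m/s after a push, pendulum height 0.9 m
step_target = capture_point(0.0, 0.5, 0.9)  # ~0.15 m ahead of the CoM
```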

We are developing robotic systems to assist individuals with disabilities, including stroke. We have developed a method of direct interaction with the robot by creating a skin for the robot arm. This skin acts as a switch that is closed by touch. When the skin is touched, the robot moves compliantly and compensates for the effect of gravity on itself and on any object it is grasping, so the user can easily position the robot using the non-paretic arm. When the user releases the arm, the switch opens and the robot enters a fixed mode, holding its position so that it can stabilize an object for the user. For instance, the robot can stabilize a glass while the user pours water into it with the non-paretic arm.

Our prototype system consists of a robotic arm, touch-sensitive skin, and a simple hand. This testbed will let us investigate how individuals with chronic stroke use direct interaction with a robotic arm, providing valuable information about how these individuals can successfully and easily interact with assistive robotic systems. The robotic arm is the 4-degree-of-freedom Whole Arm Manipulator (WAM™) from Barrett Technologies (Boston, MA). The skin is a fabric switch consisting of an external and an internal cover, two conductive surfaces, and an insulator that prevents unintentional contact. The simple hand (a prehensor) is the Electric Terminal Device (ETD) from Motion Control, a 1-degree-of-freedom electric hook-style prosthesis. Previous years focused on developing this prototype system, particularly the touch-sensitive skin. This year, one focus was developing software for weight compensation and voice commands for the prehensor; the other was beginning user studies. A paper describing the current system and software was submitted to the 2011 International Conference on Rehabilitation Robotics.
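The touch-switch interaction logic described above amounts to a two-mode controller. The following sketch is illustrative only; the mode names and robot interface are hypothetical placeholders:

```python
# Minimal sketch of the skin-switch interaction logic: touch puts the
# arm in a compliant, gravity-compensated mode; release holds position
# so the arm can stabilize an object for the user.

def interaction_step(robot, skin_switch_closed):
    if skin_switch_closed:
        # User is touching the skin: float under gravity compensation
        # (arm and payload), so the non-paretic arm can reposition it.
        robot.set_mode("compliant_gravity_compensated")
    else:
        # No touch: hold the current pose rigidly to stabilize objects.
        robot.set_mode("position_hold")
```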

The WAM uses a software model to implement active gravity compensation, allowing the robot to be manipulated as if the arm itself were weightless. However, the mass of the payload must also be compensated so that individuals can effortlessly move the arm while it is grasping an object. To run the weight compensation algorithm, the user grasps the desired object with the prehensor and then issues the lift command (see below). The software uses position control to lift the object a short distance (approximately 5 cm); the lift is short and slow to avoid startling the user. After the lift, the robot measures the mass of the payload and recalibrates the software model of the robot using the newly measured mass. The user can then move the WAM and the grasped object as though both were weightless.
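A minimal sketch of this weight-compensation procedure follows, with hypothetical stand-ins for the WAM's sensing and model-update calls:

```python
# Sketch of the payload weight-compensation procedure. The interface
# names (lift_under_position_control, measured_vertical_load,
# set_payload_mass) are hypothetical placeholders.

G = 9.81              # m/s^2
LIFT_DISTANCE = 0.05  # m, short lift to avoid startling the user
LIFT_SPEED = 0.02     # m/s, deliberately slow

def compensate_payload(robot):
    # 1. Lift the grasped object a short distance under position control.
    robot.lift_under_position_control(LIFT_DISTANCE, LIFT_SPEED)
    # 2. Estimate the payload mass from the extra vertical load the arm
    #    now supports beyond its own (already compensated) weight.
    extra_force = robot.measured_vertical_load()  # Newtons
    payload_mass = extra_force / G
    # 3. Fold the payload into the gravity-compensation model so the
    #    arm plus object feels weightless to the user.
    robot.set_payload_mass(payload_mass)
```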

The speech recognition system used in this project was built with Microsoft's Speech API, with the recognition engine running on a Windows PC. Three phrases are recognized: "hook open", "hook close", and "robot lift". Each phrase begins with a keyword, here "hook" or "robot", which lets the recognition engine clearly differentiate grasping commands from lifting commands. Executing the "hook open" command automatically resets the software model of the robot to remove the payload mass.
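The keyword-prefixed command structure can be sketched as a simple lookup; the real system registers these phrases as a grammar with the Speech API, which is not shown here, and the handler names below are hypothetical:

```python
# Sketch of keyword-prefixed command dispatch: the "hook"/"robot"
# keywords separate grasping commands from lifting commands.

COMMANDS = {
    ("hook", "open"): "open_prehensor_and_reset_payload",
    ("hook", "close"): "close_prehensor",
    ("robot", "lift"): "run_weight_compensation_lift",
}

def dispatch(recognized_phrase):
    """Map a recognized phrase to an action name, or None if unknown."""
    words = tuple(recognized_phrase.lower().split())
    # Ignore anything outside the three-phrase grammar.
    return COMMANDS.get(words)
```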

We are beginning initial testing with individuals with chronic stroke. Participants will come into the lab and attempt a variety of bimanual tasks with and without the robotic system. We will determine whether the system affects which tasks can be completed and how long completion takes, and we will record user feedback about ease of use and needed improvements.

Relevant Work Being Conducted Elsewhere; How this Project Differs

Mobile soft physical interaction with humans is a relatively undeveloped area. No current humanoid robot is fully back-drivable from any contact point, for example, and those that implement soft physical interaction do so only at selected sites using localized force sensing. Soft manipulation of fragile humans is a new and challenging area for robotics that we are pioneering. We are exploring both intrinsically soft robots and robots that can be compliant at any contact point. Our work on compliant control of movable-base robots that undergo significant physical interaction draws on robotics work on humanoids and mobile manipulators. Systems that interact closely with people in complex environments require controllers that comply with disturbances and uneven ground. Non-compliant robots have long used inverse kinematics and trajectory tracking to solve this problem, but that approach must keep the robot away from kinematic singularities (such as straight legs), and its performance is limited by heavily geared actuation. Compliant robots can respond faster, but must actively control the motion of all compliant joints. Virtual Model Control was proposed by Pratt as a model-free approach to controlling compliant robots. Full-body inverse dynamics can use a model to generate feed-forward torques from desired joint accelerations (Disney, USC), and the inverse dynamics can also be reformulated in terms of desired contact forces (ATR). Our approach is to put mobile compliant control in an optimal control framework.
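The Virtual Model Control idea mentioned above can be sketched briefly: attach a virtual spring-damper between a point on the robot and a goal, and map the resulting virtual force to joint torques through the Jacobian transpose. This is a generic illustration with hypothetical gains, not the controller used on our robots:

```python
import numpy as np

# Generic Virtual Model Control sketch: a virtual spring-damper force
# at a body point is realized as joint torques via tau = J^T F.

K_P = 200.0  # N/m, virtual spring stiffness (hypothetical)
K_D = 20.0   # N*s/m, virtual damper (hypothetical)

def vmc_torques(jacobian, x, x_goal, xdot):
    """Joint torques realizing a virtual spring-damper at a body point.

    jacobian: (3 x n) Jacobian of the attachment point
    x, xdot:  current position/velocity of the point (3-vectors)
    x_goal:   goal position of the point (3-vector)
    """
    f_virtual = K_P * (x_goal - x) - K_D * xdot
    return jacobian.T @ f_virtual  # tau = J^T F
```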

Our work in fundamentally safe robots focuses on structurally soft robots. Other solutions to this problem include computer control systems that never fail (expensive and still unreliable), padding on existing robot designs, lightweight robots, and robots with compliance and/or mechanical fuses in the transmission. We expect to use all these techniques for a complete robot system that is safe to use in daily life.

Many current assistive robots perform only a limited number of tasks, often a single task such as feeding. Our goal is to flexibly achieve a wide range of tasks. Many current robotic systems, such as the Assistive Robotic Manipulator (ARM) from Exact Dynamics, require user interaction via a joystick, keypad, or game controller, which can be challenging and slow for individuals with disabilities. Work has been done with the ARM to evaluate other interfaces, including speech input, a wireless mouse, and human-in-the-loop control combined with computer vision. El-E, a mobile robotic manipulator, autonomously grasps objects identified with a laser pointer. The system described here differs from previous work in assistive robotics because we focus on the use of touch to let individuals directly position the robotic manipulator. Chen and Kemp have also used direct interaction to enable nurses to position a mobile robot. By using direct interaction with individuals with disabilities, we hope to make robotic manipulation assistance easy and fast to learn, and potentially accessible even to individuals with cognitive impairments due to brain injury. We are also focusing on bimanual tasks in which the robot and a human each provide an arm and work together.

Expected Deliverables and Milestones

Year 6: robot skin; prototype soft robot; direct & teleoperation interfaces for heavy lifting; user testing of Intuitive Interface

Year 7: algorithm to preserve balance in wheeled and legged robots; controller for two-arm lifting of heavy objects

Year 8: prototype transfer robot

Contributions and Broader Impact

· We developed behavior design algorithms for compliant robots based on optimal control and dynamic programming.

· We constructed a multi-link soft robot prototype using inflatable technology, and a prototype continuum robot that moves using elastic deflection. We further developed our theory of optimal design of soft robots and extended our force control experiments.

· We developed several prototypes of direct manipulation and teleoperation interfaces for programming assistive robot arms using a skin for contact and force sensing.

· Soft interaction is critical to a wide range of applications where machines come in contact with people. We make our algorithms publicly available. We will make our designs for soft robots and skin available as they mature.

Future Plans

Soft robot: Our long-term goal is to create a usable and intrinsically safe assistive robot, which we also believe will be much cheaper than existing robots. In CY2011 we plan to extensively test our current prototype robot on tasks such as feeding and grooming. We are working with iRobot to explore commercialization of these types of robots.

Compliant control: Our work in this area has attracted new funding sources, and we will move this work to associated projects.

Intuitive Interfaces and Skin: We are completing our prototype system and plan to begin user testing in March. We will complete initial tests with 10 individuals with hemiparesis, then analyze their feedback to determine whether intuitive interaction is a viable way to interact with an assistive robot and to modify our system to better meet users' needs. For example, individuals with aphasia may prefer to operate the hook with a button or switch rather than voice commands. After modifying the system based on user feedback, we plan to test the modified system in a simulated home environment. We also plan to integrate these techniques with other assistive robotic systems in development at the Quality of Life Technology Engineering Research Center, notably the HERB and PerMMA systems, for eventual in-home testing.