Reach Out and Touch Something
Speaker: Howard Chizeck, University of Washington
Date: Thursday, May 01, 2014
Time: 11:00 AM to 12:00 PM Note: all times are in the Eastern Time Zone
Public: Yes
Location: 32-D463 (Star)
Host: Joe Voldman, MIT
Contact: Michael Posa, mposa@csail.mit.edu
Relevant URL: http://robotics.csail.mit.edu/events/
Reminders to: seminars@csail.mit.edu, roboticscenterall@csail.mit.edu, me-all@mit.edu
Reminder Subject: TALK: Reach Out and Touch Something
Haptic rendering is the application of forces arising in a virtual environment to a user interface (such as a flight control stick, a haptic glove, or the hand controls of a surgical robot). The virtual environment can represent a physical environment, it can be fully synthetic (as in a video game), or it can be an augmented-reality combination of physical and synthetic components. In this virtual environment, motions of the hand-control device interact with a virtual representation of the physical environment, and kinesthetic feedback is provided to the user. When the physical environment includes a robot controlled by the user (or some other physical system under manual control), haptic rendering facilitates co-robotic (human + robot) actions. For example, the operator of a remote robot can feel objects in the robot's environment.
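To make the rendering cycle above concrete, here is a minimal illustrative sketch (not from the talk) of the basic loop: read the pose of the hand controller, evaluate the force the virtual environment exerts at that pose, and command that force back to the device. The device handle and its read_position / send_force methods are hypothetical stand-ins for a vendor SDK, and the virtual environment is reduced to a single spring-like wall.

```python
import time
import numpy as np

STIFFNESS = 500.0   # N/m, spring constant of the virtual wall (illustrative value)
RATE_HZ = 1000      # haptic loops typically run near 1 kHz

def wall_force(position, wall_z=0.0):
    """Spring force pushing the tool back out of a virtual wall at z = wall_z."""
    penetration = wall_z - position[2]
    if penetration <= 0.0:
        return np.zeros(3)                 # not in contact: no force
    return np.array([0.0, 0.0, STIFFNESS * penetration])

def haptic_loop(device):
    """Basic render cycle: read pose, compute force, send feedback.

    `device` is a hypothetical handle exposing read_position() -> (3,) array
    and send_force(force); real devices use their own vendor SDKs.
    """
    period = 1.0 / RATE_HZ
    while True:
        position = device.read_position()
        device.send_force(wall_force(position))
        time.sleep(period)
```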
This allows 'remote touching' of moving 3D physical objects. The underlying algorithms include methods to track the surfaces of the physical objects and to avoid penetrating them. These algorithms can be used to define and enforce virtual fixtures, that is, force fields around objects as perceived by the user through the haptic interface. Virtual fixtures can prevent motion of a tool (a robot end effector) into a prohibited area, provide resistance to the operator when entering a protected area (while still allowing entry), or guide tools along the surfaces of an object or along virtual surfaces at a specified distance from the object.
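As a rough illustration of a forbidden-region virtual fixture of the kind just described, the sketch below applies a repulsive spring force when the tool tip enters a protected sphere. A moderate stiffness gives a resistive fixture (entry is still possible), while a very large stiffness approximates a hard keep-out constraint. The function, geometry, and parameter values are illustrative assumptions, not the speaker's implementation.

```python
import numpy as np

def sphere_fixture_force(tool_pos, center, radius, stiffness=800.0):
    """Repulsive force of a spherical forbidden-region virtual fixture.

    Inside the protected radius the fixture pushes the tool outward along
    the surface normal, proportional to penetration depth. A finite
    stiffness only resists entry; a very large stiffness behaves like a
    hard 'keep-out' constraint.
    """
    offset = np.asarray(tool_pos, dtype=float) - np.asarray(center, dtype=float)
    dist = np.linalg.norm(offset)
    penetration = radius - dist
    if penetration <= 0.0 or dist == 0.0:
        return np.zeros(3)                  # outside the fixture: no force
    normal = offset / dist                  # outward surface normal
    return stiffness * penetration * normal

# Example: tool tip 2 cm inside a 10 cm protective sphere around the origin
force = sphere_fixture_force([0.08, 0.0, 0.0], center=[0, 0, 0], radius=0.10)
print(force)  # ~[16, 0, 0] N with the default stiffness
```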
We have developed algorithms that perform haptic rendering of dynamically changing objects in real time, based upon point clouds. This has been done for terrestrial applications (such as robotic surgery and search-and-rescue robots) and for underwater applications (such as valve turning and connector mating). In terrestrial applications, one or more RGB-D cameras can be used; underwater applications might use sonar or other sensors. There is a wide variety of potential applications for this technology: any situation in which an operator wishes to feel the boundaries (and possibly the surface) of an object at a distance, without the use of contact sensors. This provides an extra channel for human-machine interaction.
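The following sketch, again with assumed names and values, shows the kind of proximity query that point-cloud-based rendering relies on: a KD-tree over the latest sensed cloud (from an RGB-D camera, sonar, or other sensor) returns the nearest surface point to the tool, from which a penalty force can be computed. The speaker's algorithms additionally track moving surfaces and prevent penetration; this is only a schematic of the general idea.

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_contact_force(tool_pos, cloud_points, contact_dist=0.01, stiffness=600.0):
    """Penalty force from the nearest point of a sensed point cloud.

    cloud_points: (N, 3) array of surface points. When the tool comes within
    `contact_dist` of the cloud, push it back along the direction away from
    the nearest point, proportional to how far inside the contact distance
    it is. Parameter values are illustrative.
    """
    tree = cKDTree(cloud_points)            # rebuilt whenever a new cloud arrives
    dist, idx = tree.query(tool_pos)
    if dist >= contact_dist:
        return np.zeros(3)
    direction = np.asarray(tool_pos, dtype=float) - cloud_points[idx]
    direction /= (np.linalg.norm(direction) + 1e-9)
    return stiffness * (contact_dist - dist) * direction

# Example with a synthetic flat patch of points at z = 0
xs, ys = np.meshgrid(np.linspace(-0.1, 0.1, 50), np.linspace(-0.1, 0.1, 50))
cloud = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
print(cloud_contact_force([0.0, 0.0, 0.005], cloud))   # small, mostly upward force
```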
Speaker Bio:
Howard Jay Chizeck received his B.S. (1974) and M.S. (1976) degrees from Case Western Reserve University, and the Sc.D. degree in Electrical Engineering and Computer Science from the Massachusetts Institute of Technology in 1982. From 1981 until 1998 he was a faculty member at Case Western Reserve University in Cleveland, Ohio, serving as Chair of the Department of Systems, Control and Industrial Engineering from 1995 to 1998. He was Chair of the Electrical Engineering Department at the University of Washington in Seattle from 1998 to 2003. Currently, he is a Professor of Electrical Engineering and Adjunct Professor of Bioengineering at the University of Washington, and a member of the faculty in the Neurobiology and Behavior graduate program.
His research interests are in telerobotics and neural engineering. His telerobotic research includes haptic rendering and control for robotic surgery and for underwater devices. His neural engineering work involves the design and security of brain-machine interfaces, and the development of assistive devices to restore hand and locomotion capabilities. Professor Chizeck is a research thrust leader for the NSF Engineering Research Center for Sensorimotor Neural Engineering (http://csne-erc.org).
Professor Chizeck was elected a Fellow of the IEEE in 1999 "for contributions to the use of control system theory in biomedical engineering," and he was elected to the American Institute for Medical and Biological Engineering (AIMBE) College of Fellows in 2011 for contributions to the use of control system theory in functional electrical stimulation assisted walking. From 2008 to 2012 he was a member of the Science Technology Advisory Panel of The Johns Hopkins Applied Physics Laboratory, and he currently serves on the Visiting Committee of the Case School of Engineering of Case Western Reserve University. He is a founder and member of the Board of Directors (since 1987) of Controlsoft Inc. (Ohio) (http://www.controlsoftinc.com) and is a founder of the recent spinoff BluHaptics Inc. (http://www.bluhaptics.com).