July 23, 2004



Publications on Haptics:

Websites:

BibTeX references.


Interactive Haptic Rendering of Deformable Surfaces Based on the Medial Axis Transform

Jason J. Corso, Jatin Chhugani and Allison M. Okamura

Eurohaptics 2002, Edinburgh, UK, July 8-10, 2002
Proceedings, pp. 92-98.

Websites:

PDF file.

Summary:

A new method for interactive deformation and haptic rendering of viscoelastic surfaces is presented. Objects are defined by a discretized Medial Axis Transform (MAT), which consists of an ordered set of circles (in 2D) or spheres (in 3D) whose centers are connected by a skeleton. The implementation, called DeforMAT, is appealing because it takes advantage of single-point haptic interaction to render efficiently while maintaining a very low memory footprint.
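
As a reading aid, here is a minimal Python sketch of such a discretized MAT body. The names (MedialSphere, DeforMATBody, closest_sphere) are hypothetical and not from the paper; the sketch only shows the ordered-sphere skeleton and the single-point proximity query that haptic rendering would build on.

    # Minimal sketch of a discretized MAT body (hypothetical names, not the
    # DeforMAT implementation): an ordered list of spheres whose centers
    # form the skeleton, plus the single-point query used during rendering.
    from dataclasses import dataclass
    from typing import List, Tuple
    import math

    Vec3 = Tuple[float, float, float]

    @dataclass
    class MedialSphere:
        center: Vec3   # point on the medial axis (skeleton)
        radius: float  # distance from the axis to the original surface

    @dataclass
    class DeforMATBody:
        spheres: List[MedialSphere]  # ordered along the skeleton

        def closest_sphere(self, probe: Vec3) -> Tuple[int, float]:
            """Index of the sphere nearest the haptic probe and the signed
            penetration depth (positive when the probe is inside it)."""
            best_i, best_pen = 0, -math.inf
            for i, s in enumerate(self.spheres):
                pen = s.radius - math.dist(probe, s.center)
                if pen > best_pen:
                    best_i, best_pen = i, pen
            return best_i, best_pen

    # A straight two-sphere body probed by a single haptic interaction point.
    body = DeforMATBody([MedialSphere((0, 0, 0), 1.0),
                         MedialSphere((2, 0, 0), 1.0)])
    idx, pen = body.closest_sphere((0.5, 0.8, 0.0))
    print(idx, round(pen, 3))  # pen > 0 means the probe is in contact

Storing only sphere centers and radii (rather than a full surface mesh) is what keeps the memory footprint low.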

 

Two immediate limitations:

  1. The limiting nature of non-bifurcating surfaces, and
  2. the difficulty of tying the rendered force and the inflicted deformation to mathematically accurate models, due to the nature of the MAT and the linear elastic deformation model used.

Future work:

Extend the method to bifurcating surfaces and non-ordered input, for two separate reasons. Bifurcation is important because it will allow the creation of objects with a more realistic structure (for instance, a hand or a plant). Assuming ordered input is acceptable for now, but to generalize the creation of DeforMAT objects, non-ordered MATs are necessary so that a DeforMAT object can be built from any MAT.

Deformation near the ends of the MAT bodies is under-determined (because of the underlying parametric spline structure). Capping fixes this problem by affixing a separate ("orthogonal") MAT as a cap to the end of each body.
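
Reusing the hypothetical MedialSphere/DeforMATBody sketch above, capping an end could look roughly like attaching a short MAT segment laid orthogonally to the skeleton tangent at that end. This construction is my assumption for illustration, not the paper's.

    # Sketch of end "capping" (assumption: the cap is a short, separate MAT
    # body laid out orthogonally to the skeleton tangent at the end; reuses
    # the MedialSphere/DeforMATBody sketch above; assumes >= 2 spheres).
    def make_end_cap(body: DeforMATBody, half_width: float = 0.5,
                     n: int = 3) -> DeforMATBody:
        last, prev = body.spheres[-1], body.spheres[-2]
        # Unit tangent of the skeleton at the end point.
        t = tuple(a - b for a, b in zip(last.center, prev.center))
        tnorm = math.sqrt(sum(x * x for x in t))
        t = tuple(c / tnorm for c in t)
        # A unit direction orthogonal to the tangent (via a cross product).
        up = (0.0, 0.0, 1.0) if abs(t[2]) < 0.9 else (0.0, 1.0, 0.0)
        o = (t[1] * up[2] - t[2] * up[1],
             t[2] * up[0] - t[0] * up[2],
             t[0] * up[1] - t[1] * up[0])
        onorm = math.sqrt(sum(x * x for x in o))
        o = tuple(c / onorm for c in o)
        # Lay n small spheres across the end, centered on the last skeleton point.
        offsets = [half_width * (2 * k / (n - 1) - 1) for k in range(n)]
        return DeforMATBody([MedialSphere(
            tuple(c + d * u for c, u in zip(last.center, o)),
            last.radius * 0.5) for d in offsets])

    cap = make_end_cap(body)  # cap for the two-sphere body from the sketch above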

Video:

 


Uniting Haptic Exploration and Display

Allison M. Okamura

Assistant Professor of Mechanical Engineering

The Johns Hopkins University

The Haptic Exploration Laboratory

Presented at the 10th International Symposium on Robotics Research,
Lorne, Victoria, Australia, Nov. 2001. (in press for 2002)

Abstract

This work develops a methodology for building haptic reality-based modeling systems by exploiting the complementary goals of haptic exploration and display. While the generally unstructured nature of haptic exploration makes it difficult to develop and control autonomous robotic fingers, simultaneous consideration of exploration and display provides a way to characterize the required sensing, control, finger geometry, and haptic modeling components. Both the exploring robot and haptic display must complement an appropriate virtual model, which can accurately recreate remote or dangerous environments. We present an example of haptic exploration, modeling and display for surface features explored using a three-degree-of-freedom spherical robotic fingertip with a tactile sensor.

"no general method has been developed to create autonomous haptic exploratory procedures" (end of Section 1.1).


Haptic Exploration of Unknown Objects

Allison M. Okamura

Ph.D. Thesis, Department of Mechanical Engineering, Stanford University, June 2000.

Abstract

Haptic exploration is a key mechanism humans use to learn about the surface properties of unknown objects. With specialized fingers and sensors, and the appropriate planning and control, robots can also be enabled to explore the world through touch. Haptic exploration has applications in many areas, including planetary exploration, undersea salvage, and other operations in remote or hazardous environments.

This thesis develops an approach for haptic exploration of unknown objects by robotic fingers. Because haptic exploration is coupled with manipulation, a procedure for combined manipulation and exploration using a sequence of phases is presented. Fingers alternately grasp and stabilize the object while other fingers explore the surface with rolling and sliding motions. During an exploratory phase, the goal is to move a finger's tactile sensors over the surface in a way that will elicit useful data.
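
One way to picture the alternation of grasping and exploratory phases is the toy loop below. The Finger stub, its grasp and roll_and_slide methods, and the round-robin role assignment are hypothetical placeholders, not the thesis procedure or code.

    # Rough sketch of alternating manipulation/exploration phases
    # (hypothetical stubs, not the thesis code).
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Finger:
        name: str
        log: List[str] = field(default_factory=list)

        def grasp(self) -> None:
            """Grasping phase: help stabilize the object."""
            self.log.append("grasp")

        def roll_and_slide(self) -> List[float]:
            """Exploratory phase: move tactile sensors over the surface."""
            self.log.append("explore")
            return [0.0]  # placeholder tactile samples

    def explore(fingers: List[Finger], n_phases: int = 4) -> List[float]:
        """Alternate roles: one finger explores with rolling/sliding motions
        while the remaining fingers grasp and stabilize the object."""
        samples: List[float] = []
        for phase in range(n_phases):
            explorer = phase % len(fingers)   # rotate the exploring finger
            for i, f in enumerate(fingers):
                if i == explorer:
                    samples += f.roll_and_slide()
                else:
                    f.grasp()
        return samples

    hand = [Finger("thumb"), Finger("index"), Finger("middle")]
    print(len(explore(hand)))  # one batch of tactile samples per phase -> 4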

There exist many possible objectives for haptic exploration. This work concentrates on the detection and identification of fine surface features. In the context of exploration with spherical robotic fingertips, fine surface features and macro features such as bumps, cracks and ridges are defined. Using different types of sensor data, various algorithms and experimental results for fine feature detection are presented. There are also many potential methods for actively exploring a feature on a surface in three dimensions. After a feature has been encountered on a surface, tactile sensor and position data may be used to determine the next direction of finger travel, guiding the finger around and over the feature in a way that will efficiently extract surface properties. Shape skeletons are used to create a map of features and regions on a surface.
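
As an illustration only, a single feature-guided step might be sketched by following the in-plane gradient of the tactile response; the function and the gradient-following rule below are my assumptions, not the algorithms developed in the thesis.

    # Sketch of one feature-guided exploration step (illustrative assumption:
    # follow the in-plane gradient of the tactile signal; the thesis derives
    # its own guidance from tactile and position data).
    import math

    def next_position(position, tactile_gradient, step=0.01):
        """Advance the fingertip a small step toward the strongest sensed
        feature response, i.e. along the normalized tactile gradient."""
        gx, gy = tactile_gradient
        norm = math.hypot(gx, gy) or 1.0   # avoid division by zero
        return (position[0] + step * gx / norm,
                position[1] + step * gy / norm)

    print(next_position((0.0, 0.0), (3.0, 4.0)))  # approximately (0.006, 0.008)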

Table of Contents

Chapter 1: Introduction

Chapter 2: A Procedure for Multifingered Exploration

Chapter 3: A Feature Definition for Haptic Exploration

Chapter 4: Feature-Guided Exploration

Chapter 5: Conclusions

Appendix A: Kinematics of Contact

Appendix B: Smoothing Tactile Data Using Noise Type

Appendix C: A 3DOF Robotic Finger with Tactile Sensing



Page created & maintained by Frederic Leymarie, 2002-4.
Comments, suggestions, etc., mail to: leymarie@lems.brown.edu