In recent years, many human-computer interaction and virtual environment systems have incorporated haptic devices that interface with the user through the sense of touch. However, the range of environment properties and types of interactions enabled by current interfaces is limited by feedback from artificial materials or by resolved forces in virtual reality simulators. Medical simulators that integrate the cutaneous sensations of direct interaction with kinesthetic feedback can provide a complete haptic experience. To address this need, IAI and collaborators at Stanford University and Tangible Haptics have been awarded a follow-on contract entitled, “Hand-Free Kinetic System for Medical Simulation (KineSys MedSim).” The KineSys MedSim interface comprises a six degree-of-freedom (6-DOF) cable-based kinesthetic robot with interchangeable tactile displays, a 3D stereo camera for hand tracking, and a stereoscopic 3D display. Several components, including the robot, the air jet lump display, and the variable friction surface display, were successfully demonstrated in the first phase of the project. An encountered-type display has been implemented using the 6-DOF robot and stereo camera; it operates in two modes, a tracking mode and a kinesthetic interaction mode. Cutaneous display simulations have also been conducted: a lump of varying size was rendered with the air jet display, and a variable friction surface was rendered with a piezoelectric-actuated glass surface. The next phase will improve the ergonomics and control software of the kinesthetic robot, enhance the haptic lump display, and develop an electrostatic variable friction display. Finally, the components will be integrated into a fully functional platform for haptic medical simulation, and the performance of the KineSys MedSim will be evaluated.
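The two-mode behavior of the encountered-type display can be illustrated with a minimal control-tick sketch. This is not the project's actual control software; the function names, the switching threshold, and the simple nearest-distance test are all illustrative assumptions. The idea is that the robot shadows the tracked hand while it is free, and holds the tactile display at the virtual surface once the hand comes close enough to make contact.

```python
from dataclasses import dataclass
from math import dist

# Hypothetical switching distance (meters) between the two modes.
CONTACT_THRESHOLD = 0.02

@dataclass
class DisplayCommand:
    mode: str            # "tracking" or "kinesthetic"
    target: tuple        # commanded end-effector position (x, y, z)

def control_tick(hand_pos, surface_pos):
    """One control-loop step for an encountered-type display.

    While the tracked hand is far from the virtual surface, the robot
    follows the hand (tracking mode). Once the hand nears the surface,
    the robot holds the tactile display at the surface so the hand
    physically encounters it (kinesthetic interaction mode).
    """
    if dist(hand_pos, surface_pos) > CONTACT_THRESHOLD:
        # Tracking mode: keep the display poised under the moving hand.
        return DisplayCommand("tracking", hand_pos)
    # Kinesthetic interaction mode: present the surface for contact.
    return DisplayCommand("kinesthetic", surface_pos)

# Example: a hand 0.5 m away is tracked; at 1 cm the surface is presented.
far = control_tick((0.0, 0.0, 0.5), (0.0, 0.0, 0.0))
near = control_tick((0.0, 0.0, 0.01), (0.0, 0.0, 0.0))
```

In a real system the switch would be driven by the stereo camera's hand-tracking estimate and filtered to avoid chattering near the threshold, but the core decision is this distance-based mode selection.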