TEAM BIOS
Steve Bress

Camcorder & ComputerVideo Magazine
Steve Bress serves as Chief Technologist for Telepresence Technologies. Steve brings a rich and diverse background in computer hardware, computer software, remote-control cars and airplanes, and robotics, and is that rare breed of engineer who is as comfortable with a soldering iron as with a keyboard. Steve's approach to any problem is to use the best solution, whether hardware or software, and he is adept in both disciplines.
Steve has been involved in the computer industry since the introduction of the first PCs in the late 1970s. Steve authored one of the best books on programming games for the Commodore 64 in assembly language, and through his company, Entropy Engineering, wrote games for the Atari 800, Apple, Commodore 64, and Atari 2600. Early on, he became an expert in computer video, working with the first PC dedicated to creating video: the Mindset. Steve's Video TitlerTM software was the first PC software to effectively overlay titles on video. Steve has developed a number of image-enhancement software programs, and NASA has used his software for cataloging images from both the Hubble Space Telescope and the Challenger wreckage. Steve later developed the drivers and user interface software to control STB's TV-PCI card. He has had a regular column in both Camcorder and Camcorder & ComputerVideo magazines for 12 years, and has regularly reported on the development of the digital video revolution. Steve created computer special effects that appeared in Hollywood movies such as Universal Soldier, True Lies, and Shadow Conspiracy.

VideoTitler
On the hardware side, Steve developed low-cost servos for R/C airplanes and built R/C drone aircraft for the military. While working for Optim Electronics Corp, he designed data acquisition hardware used by most of the major automobile manufacturers. Steve developed a radio-communication vision system for the RB5X Robot (one of the first kit robots available for construction by the hobbyist). Most recently, Steve helped create MyKey Technologies, Inc., which develops security devices for PCs, including its NoWriteTM hard drive write protection device and NoWrite FlashBlockTM. Steve currently has 9 patents pending for his work in this area.
With Jim Dunstan (see below), Steve developed code for LunaCorp's "Return To the Moon" CD-ROM in 1993-94, including VideoShow Assistant, the first HiColor slide show program for PCs, which allowed for the easy display of some 110 images contained on the disk. Steve was also the lead programmer on the first PC-controlled motion arcade game, Lunar Defense. For LunaCorp, Steve developed a voxel-based graphics engine that tied into a motion platform, so the driver felt every bump on the computer-generated landscape. The "Moonroll" program was featured on a number of television programs chronicling LunaCorp's efforts to fly the first private mission to the Moon to explore for ice deposits at the lunar poles. Steve was the lead hardware engineer in developing a PC-controlled rover for a LunaCorp-sponsored museum exhibit.
Jim Dunstan

Telepresence Development Center
The prototype TXP-1 telepresence portal, with a miniature Moonscape in the foreground
Jim Dunstan serves as President of Telepresence Technologies. Jim brings to TT a skill set rarely seen: a practicing attorney at Garvey Schubert Barer in Washington, D.C., Jim heads the firm's Communications and Media Group. He represents such media giants as EchoStar, and has been involved in media deals worth well in excess of $1 billion.
Jim's true passion lies in outer space, and he is a recognized expert in space law. He has written over a dozen articles on the subject, and represents many of the new generation of space entrepreneurs, including LunaCorp, where he was a founding board member, MirCorp, Constellation Services, the Foundation for the International Nongovernment Development of Space (FINDS), and Orbital Recovery.


Scenes from two RadioShack television ads: a Father's Day ad shot on the space station with Cosmonaut Yuri Usachev, and the 2002 World Series Game 1 first pitch from the International Space Station.
For MirCorp, Jim drafted and negotiated the first commercial lease of a manned space facility, the Russian Mir space station. Jim also worked with MirCorp on Lance Bass' attempt to fly to the International Space Station (ISS). For LunaCorp, Jim did the legal work to allow the first television commercial to be shot aboard the ISS, and participated in the project to have the ceremonial first pitch of the 2002 World Series thrown from aboard the ISS.
But Jim has never been satisfied just being a lawyer. While in law school he wrote computer games for the first PCs, and formed Adventures Unlimited Software with classmate Mark Jacobs. Mark went on to found Mythic Entertainment, developer of Dark Age of Camelot®, a hugely popular massively multiplayer online game. (Mythic remains one of Jim's best, and favorite, law clients.)
In 1993, Jim wrote most of the code for LunaCorp's Return to the Moon®, including the Moonflight Simulator, which was the first Windows program able to run a video file both forward and backward.
For LunaCorp's 1994 Mission: Planet Earth, Jim wrote "PlanetView," a program that overlaid space imagery on the first cloudless image of the Earth.
Since 1993, Jim has worked with the Robotics Institute at Carnegie Mellon University in support of its lunar rover initiative. He helped develop robotic interfaces for the CMU team, including a demonstration for the NASA Administrator at the grand opening of the National Robotics Engineering Consortium. In 1997, Jim wrote the code to take the realtime motion data from CMU's Nomad robot in the Atacama Desert in Chile, and feed that data into the ViRtogo motion platform. Riders in the Carnegie Science Center were able to sit on the motion platform, drive Nomad, and feel every bump and rock the rover traversed. This was the first realtime remote teleoperation involving vehicle motion.

Game players defended the moon against attack, while experiencing realtime video/audio/motion via a six-way motion platform.
Along with Steve Bress (see above), Jim helped create the first PC-controlled motion arcade game, Lunar DefenseTM. He wrote the motion code to translate the screen action into the ViRtogo motion platform embedded in the arcade buildout. For the first time in arcade history, players could feel the impact of an asteroid slamming into the Moon just a few feet behind their turret.
With the experience gained working with CMU on the Nomad Atacama Desert Trek, and helping develop Lunar DefenseTM, Jim became convinced that it was possible to more closely replicate distant environments using non-visual data. Previous work in this area had concentrated on recreating the remote environment visually. After spending countless hours on a motion platform system, Jim recognized that current systems totally ignore the other human senses, which are crucial to actually experiencing the environment. To begin testing his theories, Jim created TXP-1 for LunaCorp, in which live video of a scale-model rover running on the "moonyard" was combined with surround sound and motion to deliver a whole new "telepresence" experience. Jim delivered a paper on his concepts and current development status at the prestigious Space Studies Institute's Space Manufacturing Conference in May 2001.
Jim developed the interface software for the LunaCorp tele-scout (built by Steve), created for the museum marketplace. Jim also designed and built the software for Rover-2, which was donated to the Mesa, Arizona, School District at the Space Frontier Conference in September 2002. Rover-2 is being used in connection with elementary school space simulations.

Vote2DriveTM interface to allow multiple Internet users to simultaneously control a distant rover.
Now with Telepresence Technologies, Jim hopes to expand on his prior work, including adding further senses to the remote experience, such as force-feedback joystick control of robots, temperature, and even smell. Telepresence Technologies is currently seeking patents on a number of these technologies. Ultimately, Telepresence Technologies hopes to allow multiple users to simultaneously experience and control a robot in distant environments.

PROJECTS
MultisenseTM WiFi® Robots
The dream of an affordable, computer-controlled robot that is both fun to play with and able to serve as a platform for real research has been realized in Telepresence Technologies' new MultisenseTM Rovers. Based on over four years of serious research, the MultisenseTM Rovers represent a significant leap forward in robotic design. With a retail price of under $300, these rovers pack an incredible feature set:

  • PC control
  • On-board video and audio, displayed through the PC
  • Four-wheel independent drive means incredible maneuverability and awesome terrain-handling (a tilt sensor is necessary to keep this thing from literally climbing walls)
  • WiFi (802.11b) communications pathway allows for massive data transfers
  • Multiple sensors on-board allow for remote interaction with the environment
  • Single-board MultiplugTM construction provides a rugged platform
  • Patent-pending connectors allow for the addition of Smart SensorTM technologies for amazing expandability
  • Multiple software scenarios for "out of the box" fun
  • Open interface architecture to allow the serious researcher to expand its capabilities (see the control sketch after this list)

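For the serious researcher, driving the rover from the PC could be as simple as a short script. The sketch below illustrates the idea, assuming a hypothetical UDP command/telemetry format and rover address; the actual MultisenseTM wire protocol and interface names are not documented here and may differ.

    # Illustrative only: assumes a simple UDP packet of two signed speed bytes
    # and a JSON telemetry reply -- both hypothetical, not the shipping protocol.
    import json
    import socket
    import struct

    ROVER_ADDR = ("192.168.0.42", 5555)   # hypothetical rover IP/port on the 802.11b link

    def send_drive_command(sock, left, right):
        """Pack two signed speed values (-100..100) and send them to the rover."""
        sock.sendto(struct.pack("bb", left, right), ROVER_ADDR)

    def read_telemetry(sock):
        """Block for one telemetry packet (tilt, battery, etc.) and decode it."""
        data, _ = sock.recvfrom(1024)
        return json.loads(data.decode("utf-8"))

    if __name__ == "__main__":
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.settimeout(1.0)
        send_drive_command(sock, 60, 60)              # drive forward
        try:
            status = read_telemetry(sock)
            if abs(status.get("tilt_deg", 0)) > 30:   # the tilt sensor noted above
                send_drive_command(sock, 0, 0)        # stop before it climbs a wall
        except socket.timeout:
            send_drive_command(sock, 0, 0)            # fail safe: stop on a lost link
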
A Mars-themed unit will be available for sale in the second quarter of 2004. Expansion packs with additional sensors will be available in the third quarter of 2004.
Robo-NannyTM lets you keep an eye on your baby while you work.

Robo-NannyTM
An up-rated version of the MultisenseTM Rover, with a larger footprint and customized software, Robo-NannyTM makes the perfect surveillance platform. Imagine sitting in your office and being able to check up on your home through a browser interface displaying live video and audio from the Robo-NannyTM. Can't see anything from where the Robo-NannyTM is? Then simply drive it around your house.
Leaving your house (or beach home) for an extended period? Program Robo-NannyTM to take a daily "stroll," record the condition of the house, and upload the recording to a remote computer.
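One way such a daily "stroll" might be scripted is sketched below. The function names (record_clip, drive_route, upload) and the schedule are illustrative assumptions standing in for whatever the shipping Robo-NannyTM software exposes, not the actual API.

    # Hypothetical sketch of a scheduled Robo-NannyTM patrol.
    import datetime
    import time

    PATROL_HOUR = 14                     # take the stroll at 2:00 pm local time

    def record_clip(path):
        """Placeholder: start capturing video/audio from the on-board camera to a file."""
        print(f"recording to {path}")

    def drive_route():
        """Placeholder: replay a stored sequence of drive commands through the house."""
        print("driving patrol route...")

    def upload(path):
        """Placeholder: push the finished recording to the owner's remote computer."""
        print(f"uploading {path} to the office PC")

    def daily_stroll():
        clip = f"stroll_{datetime.date.today().isoformat()}.avi"
        record_clip(clip)                # record while the rover drives its route
        drive_route()
        upload(clip)

    if __name__ == "__main__":
        while True:
            now = datetime.datetime.now()
            if now.hour == PATROL_HOUR and now.minute == 0:
                daily_stroll()
            time.sleep(60)               # check the clock once a minute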
Available - Fourth quarter, 2004.
"Flight to the Moon"TM Ride Film
Working with ViRtogo, Telepresence Technologies is creating an entertaining and educational motion ride film for science centers, museums, and Location-Based Entertainment (LBE) venues.
"Flight to the Moon" takes the participants on a 5 minute thrilling ride to the Moon, based on actual NASA footage, and state of the art computer graphics. The creators have talked with a number of Apollo astronauts who walked (and rode) on the lunar surface, and have asked them to share not only what the saw, but what they felt as well, so we could incorporate it into this motion ride.

  • You strap into the ViRtogo 6-DOF motion platform, and the ride begins with a pan up the giant Saturn V rocket, with President Kennedy's famous challenge to a generation: "We choose to go to the Moon in this decade, not because it is easy, but because it is hard."
  • The giant F-1 engines roar to life in surround sound, and your entire world shakes as the Saturn V lifts off. After staging, you're "go for trans-lunar injection" and the Moon awaits.
  • Before you know it, you're at the Moon, and the Lunar Module pitches over to give a view of the landing site.
  • And then you're there with the Apollo astronauts as they plant the American flag, declaring that "we come in peace, for all mankind."
  • It's time to explore! You hop aboard the lunar rover, and shake, rattle, and roll over the lunar surface.
  • But you can't stay there forever, and soon it's time to light the ascent stage of the LEM and return to dock with the Apollo Command Module.
  • Then it's back to the big beautiful blue Earth, and a safe splashdown to end the experience.

Immersive User Interfaces for Planetary Robots
MultisenseTM technology is not just about fun and toys. The Telepresence Technologies team has engaged in serious research into designing immersive user interfaces for planetary robots for almost ten years.
Hardware innovations include:

  • Real-time motion platform systems
  • Surround-sound systems, including Very Low Frequency (VLF) woofers embedded in the seat to translate the "rumble" sound into feel
  • Hot and cold fans to give the user a sense of temperature changes
  • Force-feedback joystick control that "fights back" as the user tries to drive up hills or, in the case of an intelligent remote robot, attempts to steer the rover in an unsafe direction (see the sketch after this list)
  • Dry-charge scent capsules to quickly convey smell

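As a rough illustration of the "fights back" behavior, the sketch below scales joystick resistance with the slope being climbed and with a hazard rating supplied by the remote robot. The driver call and scaling constants are assumptions for illustration, not the production force-feedback code.

    # Sketch: joystick resistance grows with slope and with any hazard flagged
    # by the remote robot.  set_joystick_force() stands in for whatever
    # force-feedback driver is used; it and the constants are assumptions.

    MAX_FORCE = 1.0            # normalized full-scale resistance

    def resistance(slope_deg, hazard_score):
        """Blend slope (0-45 deg) and hazard (0-1) into one opposing-force value."""
        slope_term = min(abs(slope_deg) / 45.0, 1.0)
        return min(MAX_FORCE, 0.6 * slope_term + 0.8 * hazard_score)

    def set_joystick_force(axis, value):
        """Placeholder for the force-feedback driver call."""
        print(f"force on {axis}: {value:.2f}")

    # Example: the rover reports a 20-degree slope and its planner rates the
    # commanded heading as moderately unsafe (0.5).
    set_joystick_force("pitch", resistance(20, 0.5))
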
Software interfaces include:

  • Control of "virtual" rovers over datasets
  • Control of remote robots
  • Motion and other "sense mixer" control interfaces
  • Vote2DriveTM interface to allow large numbers of people to collaboratively operate a remote robot (see the sketch below)

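A minimal sketch of how a Vote2DriveTM-style interface might reduce many simultaneous user inputs to a single drive command is shown below. The voting window, command set, and rover interface are assumptions for illustration only.

    # Hypothetical vote aggregation: collect every user's vote over a short
    # window, then send the most popular command to the rover.
    from collections import Counter

    VOTE_WINDOW_SEC = 2.0
    COMMANDS = {"forward", "back", "left", "right", "stop"}

    def tally(votes):
        """Return the winning command from a list of (user_id, command) pairs."""
        counts = Counter(cmd for _, cmd in votes if cmd in COMMANDS)
        if not counts:
            return "stop"                      # no valid votes: stay put
        winner, _ = counts.most_common(1)[0]
        return winner

    def send_to_rover(command):
        """Placeholder for the actual rover command channel."""
        print(f"rover <- {command}")

    # Example window: three users vote, majority wins.
    window = [("alice", "forward"), ("bob", "forward"), ("carol", "left")]
    send_to_rover(tally(window))               # -> "forward"
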
The Ultimate Goal:
Ever notice when trying to view a 3-D image that sometimes it takes a second or two and then - SNAP! - you're seeing 3-D? Think about that happening not just with vision, but with multiple other senses simultaneously. Our experience shows that if all of the senses are "dialed in" correctly, the user experiences that same SNAP!, except that instead of just seeing stereo, you literally feel transported to the remote location. The difficulty is that each individual is unique, and how their various senses relate to their environment differs. Thus, "dialing in" the proper amount of each sensory stimulus is difficult. Telepresence Technologies is working on a "sensation mixer" which will allow the user to quickly become acclimated to the various sense inputs, and more quickly be transported to the remote location.
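One way such a mixer could be structured is as a set of per-sense gains applied to the raw telemetry before it drives the motion platform, woofers, fans, and scent dispenser. The sketch below is illustrative only; the channel names and ranges are assumptions rather than details of the actual system.

    # Sketch of a "sensation mixer": each sense channel gets a user-adjustable
    # gain that scales the raw telemetry feeding that output device.

    class SensationMixer:
        def __init__(self):
            # 1.0 = pass-through; the user trims these until the scene "snaps" in.
            self.gains = {"motion": 1.0, "rumble": 1.0, "temperature": 1.0, "scent": 1.0}

        def set_gain(self, channel, gain):
            self.gains[channel] = max(0.0, min(2.0, gain))   # clamp to 0..2x

        def mix(self, raw):
            """Scale each raw sense value (dict) by the user's gain for that channel."""
            return {ch: raw.get(ch, 0.0) * g for ch, g in self.gains.items()}

    mixer = SensationMixer()
    mixer.set_gain("rumble", 1.4)       # this user needs more low-frequency feel
    outputs = mixer.mix({"motion": 0.3, "rumble": 0.5, "temperature": 0.1, "scent": 0.0})
    print(outputs)
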
Ultimately the goal is to build a completely immersive system that can convey all of these senses simultaneously to a large number of people, as well as establish a protocol for the transmission of data whereby such future TXPs can be plugged into a variety of robots in remote locations.