
ISS Utilization: METERON (Multi-Purpose End-To-End Robotic Operation Network) / Telerobotics & Haptics


METERON (Multi-Purpose End-To-End Robotic Operation Network) is an ESA-led suite of technology demonstration experiments that uses the ISS (International Space Station) to validate advanced telerobotics technologies. The project aims to validate autonomous and real-time telerobotic operations from space to ground, and to answer key questions about the technologies required for future space exploration scenarios. To test various scenarios and validate the related technologies, robots on Earth are controlled from the interior of the ISS with haptic feedback and high situational awareness. The control devices used in space are force-reflecting joysticks and arm exoskeletons. 1)

Note: "Haptics" is any form of interaction involving touch (from Greek haptikos, from haptesthai, 'to grasp, touch'). Haptic technology is a technique that interfaces with the user through the sense of touch. Haptics is the science of applying touch (tactile) sensation and control to interaction with computer applications.

The primary robotics goals of METERON are:

• Technology Validation (advanced mechatronics & wearable haptic devices)

• Telepresence in the space (microgravity) environment, across operating regimes from >20 ms latency up to supervised autonomy

• Human-Robot Interaction in highly constrained environments

• Inter-operability of groups of robots and control devices

METERON is being carried out in partnership with DLR (Institute of Robotics & Mechatronics), NASA [Johnson Space Center, Ames Research Center, JPL (Jet Propulsion Laboratory)] and Roscosmos with Russian partners (RTC Institute St. Petersburg, Energia).

The ESA/ESTEC Telerobotics Laboratory is responsible for the definition, implementation and conduct of the telerobotics experiments, and in particular for the development of the haptic devices, the controllers and the full on-board control station for the ISS.

The ESA Telerobotics & Haptics Laboratory is an engineering research laboratory that performs fundamental research in the domains of telerobotics, mechatronics, haptics and human-robot interaction. The Laboratory's output is targeted at supporting novel spaceflight projects by pre-development and demonstration of critical technologies. Its research areas are (1) the design of human-centric mechatronic haptic devices, (2) control of distributed telerobotic systems, (3) the optimization of perceptive feedback for humans, and (4) the design and development of advanced computational frameworks, tools and APIs (Application Programming Interfaces) to support general telerobotics and mechatronics research.

METERON is a suite of experiments initiated by ESA and its partners. Its main goal is to explore different space telerobotics concepts, using a variety of communication links with different latencies and bandwidth specifications. 2)

Teleoperating a robot for extraterrestrial exploration allows the operator to remain at a safe distance from environments that are dangerous or unreachable for humans. For the METERON experiments, rovers and service robots will be deployed on Earth, in different simulated space and planetary scenarios, to perform a catalog of tasks. The target robot will be commanded from the International Space Station (ISS) through different HRI (Human Robot Interface) options, ranging from laptop and tablet PCs to a force-reflection joystick and an exoskeleton. Figure 1 gives a concept overview of the METERON experiment suite.

Two forms of telerobotic commands will be studied in the METERON experiment suite:

1) Full haptic-feedback telepresence, which requires low-latency communication links coupled with haptic-capable robots and HRI.

2) Supervised autonomy, which uses high-level abstract commands to communicate complex tasks to the robot. This style of telerobotics depends on the local intelligence of the robot to carry out decision making and processing on site. Supervised autonomy can significantly reduce operator workload, improve the task success rate of the human-robot team, and tolerate much higher communication latencies: multi-second delays can be accommodated. The sketch below contrasts the two command styles.
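
The following minimal sketch contrasts the two command styles. It is illustrative only: the names (TaskCommand, RobotExecutor) are hypothetical stand-ins, not METERON flight software.

```python
from dataclasses import dataclass

@dataclass
class TaskCommand:
    """Supervised autonomy: one abstract, delay-tolerant instruction."""
    action: str    # e.g. "unjam_solar_array"
    target: str    # symbolic object reference the robot already knows

class RobotExecutor:
    """Robot-side intelligence: expands one task into local sensing,
    planning and motion, so only the result crosses the delayed link."""
    def execute(self, cmd: TaskCommand) -> str:
        print(f"planning and executing '{cmd.action}' on '{cmd.target}' locally")
        return "success"   # one round trip per task, not per motion increment

# Telepresence, by contrast, streams master positions and slave forces every
# few milliseconds, which is only stable when the round-trip delay is small.
result = RobotExecutor().execute(TaskCommand("unjam_solar_array", "panel_3"))
```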

Figure 1: Space telerobotics concept with the METERON experiment suite (image credit: DLR, ESA)

The METERON project supports the implementation of a Space Internet, examines the benefits of controlling surface robots in real time from an orbiting spacecraft, and investigates how best to explore a planet through a partnership between humans and robots. This is important for planning future human exploration missions to Mars, for example, and feeds into Earth-based technologies such as medicine or the handling of radioactive material.

Haptics Laboratory

The objective of this experiment (Supvis-Justin) is to study the task capability of a dexterous humanoid robot on a planetary surface, commanded from an orbiting space station (e.g., the ISS) using only a single tablet PC. A Dell Latitude 10 tablet PC running Windows 8 was flown to the ISS on ESA's ATV-5 in July 2014 for the METERON Haptics-1 experiment. New GUI (Graphical User Interface) software will be uploaded to this tablet PC for Supvis-Justin (Ref. 2).

The intended robotic tasks include survey, navigation, inspection, and maintenance, performed on a solar farm located on a simulated extraterrestrial planetary surface. With communication delays between the ISS and Earth of up to 10 seconds, real-time telepresence is not possible. Instead, in Supvis-Justin, abstract task-level commands are processed and executed locally by the robot, which works together with the tablet PC operator on the ISS in the fashion of a co-worker rather than a remote electromechanical extension of the astronaut. Furthermore, previous studies have shown that delegating to the robot through task-space commands significantly reduces the operator's workload.

 

Haptics-1 Experiment

Haptics-1, part of METERON, is the first of a series of in-flight experiments to demonstrate advanced teleoperation technologies, in particular telecontrol with real-time force feedback of robots on Earth from space.

Launch: On July 29, 2014, the Haptics-1 device was flown to the ISS in a pouch, as a small payload on ESA's ATV-5 (Automated Transfer Vehicle), launched on an Ariane 5 ES from Europe's Spaceport in Kourou and delivering 6.6 tons of supplies. 3)

Haptics-1 is a single-degree-of-freedom force-reflective joystick (the 1-DOF Setup). It is a high-performance device with active joint impedance control, enabled by a custom torque sensor at the joint output and driven by a highly power-dense RoboDrive ILM brushless motor.

The joystick runs internally at a 2 kHz sample rate in hard real time and can stream all internally acquired data (handle position, velocity, torque, current, etc.) to external systems at the same rate. Internally, the joystick uses an Intel Atom Z530 (1.6 GHz) embedded computer that communicates with the joint motor controller via a real-time EtherCAT bus. The joystick is fully self-contained, needing only a 28 V input from the ISS and a mechanical interface on the Station's seat tracks.

The handlebar output can render any position or torque to the user in a ripple-free way. The joystick can thus render torque, stiffness and damping easily, and any combination of control modes can be used at any instant in time for the various experiments. The pouch also houses a flight-qualified tablet computer (Dell Latitude 10) that runs the Haptics-1 GUI (Graphical User Interface). The GUI is the sole entry point for the astronaut crew to load and execute experiments with the joystick.
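
As a rough illustration of such impedance control, the sketch below renders a commanded stiffness and damping at a single joint. It is a minimal sketch only, assuming placeholder sensor and actuator stubs; the real joystick closes this loop at 2 kHz in hard real time over EtherCAT with a joint torque sensor.

```python
import time

K, D, X0 = 2.0, 0.05, 0.0    # stiffness [Nm/rad], damping [Nms/rad], setpoint [rad]
DT = 1.0 / 2000.0            # 2 kHz control period

def read_position() -> float: return 0.1    # stub: joint angle [rad]
def read_velocity() -> float: return 0.0    # stub: joint velocity [rad/s]
def apply_torque(tau: float) -> None: print(f"tau = {tau:+.3f} Nm")

def control_step() -> None:
    x, v = read_position(), read_velocity()
    # Impedance law: render a virtual spring-damper at the handle. Changing
    # K, D and X0 on the fly is what lets one device support many protocols.
    apply_torque(-K * (x - X0) - D * v)

for _ in range(3):           # stand-in for the hard real-time scheduler
    control_step()
    time.sleep(DT)
```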

In the first flight experiments of Haptics-1, an extensive set of physiological data is collected that will help determine design requirements for haptic devices to be used in space in the future; such data has not been available up to now. The team suspects that perception thresholds change in microgravity. Moreover, the experiments test in which configuration the joystick works best: wall-mounted (as shown in Figure 2), or body-mounted with the joystick strapped directly onto the crew member's chest, providing body-grounded force feedback.


Figure 2: Photo of the Haptics device in a wall-mounted configuration (image credit: Telerobotics & Haptics Lab)

In seven distinct scientific conduct protocols, the project will measure crew human-factors data after exposure to microgravity for an extended duration. The measurements include, for instance, the bandwidth of human motion in position and force control tasks, the minimum stimulus thresholds for sensing torque and stiffness, the mass and stiffness of the upper extremity, and other measures that will define clearly how humans interact naturally with force-feedback systems when exposed to the microgravity environment in space. This data will help the project design the ISS payload exoskeleton, which is expected to go up by mid-2016.

Haptics-1 is a preparation for the Haptics-2 experiment, on which the project has already started working. Haptics-2, in 2015, will use the 1-DOF Setup on the ISS to teleoperate an identical 1-DOF Setup on Earth. The project plans to use a low-latency direct S-band link between space and ground, marking the first real teleoperation experiment ever conducted from space. The first haptic interaction in space will already take place this year, under Haptics-1, in October 2014 with ESA astronaut Alexander Gerst, who will be the first to install the Haptics-1 equipment on the ISS and perform a full set of the seven science protocols.

The Haptics-1 experiment (Figure 3) comes down to a deceptively simple-looking lever that can be moved freely to play simple Pong-style computer games. Behind the scenes, a complex suite of servomotors can withstand any force an astronaut operator might unleash on it, while also generating forces that the astronaut will feel in turn, just like a standard video gaming joystick as a player encounters an in-game obstacle. 4)


Figure 3: Experiment setup of Haptics-1 (image credit: ESA)

 

Haptics-2 Experiment

The Haptics devices on the ISS and in ESA's Telerobotics Lab are identical twins.

In the Haptics-2 experiment, the project is testing a series of protocol combinations with beyond state-of-the-art technology. On June 3, 2015, the project will conduct a first comparison and check of providing such force-feedback control over a system that spans space and ground.

Two routes are available to send signals back and forth between the ISS and the test setup at the ESA Telerobotics and Haptics Laboratory. The first route consists of a signaling system that relays data via a geosynchronous satellite constellation (TDRS): the NASA Ku-forward system.

The second is a low-latency direct S-band link provided by the Russian partners through S-band hardware installed on the Russian module of the ISS; the signal is received by a German ground station of DLR (German Aerospace Center). While this link is direct and features little transmission delay, it has only very limited bandwidth. The first link enables large bandwidth but requires much longer signal traveling times.

First, the project will test the more challenging link, the new NASA Ku-forward system, which should allow signals to be transmitted between the sites in less than 0.8 seconds. Real-time control with force feedback over such a delay is very hard to achieve. One question that the June 3 test addresses is therefore how much delay exactly exists between the sites through this link.
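
A link characterization of this kind typically sends timestamped packets and measures their round trip. The sketch below is illustrative only; the actual Haptics-2 test protocol, addresses and ports are assumptions.

```python
import socket, statistics, struct, time

def measure_link(host: str, port: int, n: int = 100):
    """Send n timestamped UDP packets to an echo service; return (mean RTT, jitter)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(2.0)
    rtts = []
    for seq in range(n):
        t_send = time.monotonic()
        sock.sendto(struct.pack("!Id", seq, t_send), (host, port))
        data, _ = sock.recvfrom(64)              # packet echoed back unchanged
        _, t_orig = struct.unpack("!Id", data)
        rtts.append(time.monotonic() - t_orig)   # round-trip time of this packet
    return statistics.mean(rtts), statistics.stdev(rtts)

# Hypothetical echo server: mean_rtt, jitter = measure_link("10.0.0.5", 9000)
```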

After a full characterization of the link, the project moves on to the more engaging demonstrations: a handshake from space, and control of a video camera between the two Haptics locations, so that Terry Virts can act as an 'avatar' between the sites. Finally, time permitting, a remote stiffness discrimination test will be carried out: Terry will probe objects of varying stiffness remotely from space and rate their relative stiffnesses. This will show whether the teleoperation system itself degrades his difference-sensing capability; he should be able to sense stiffness differences of at least 25% without a doubt.
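
A discrimination trial of this kind can be generated as sketched below. The code is illustrative (the names and the 200 N/m reference value are invented), showing only the randomized two-interval comparison at the 25% difference level mentioned above.

```python
import random

def make_trial(k_ref: float = 200.0, difference: float = 0.25):
    """Return (k_first, k_second, correct_answer) with randomized order."""
    k_cmp = k_ref * (1.0 + difference)       # comparison spring, 25% stiffer
    if random.random() < 0.5:
        return k_cmp, k_ref, "first"
    return k_ref, k_cmp, "second"

k1, k2, answer = make_trial()
print(f"render {k1:.0f} N/m then {k2:.0f} N/m; the stiffer one is the {answer}")
```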

 


 

Haptics mission status:

• March 15, 2016: ESA's METERON project is looking at ways not only to reduce the space-to-ground signal delay, but also to make the command path robust against signal interruptions that may occur over long distances with multiple relay stations. — Tim Peake: "On March 17 and April 25, 2016, I will be part of an experiment to drive two different rovers from Europe's Columbus laboratory. At first glance it might seem easy to operate a rover from space – the hard part would be getting astronauts and the rovers to their destination – but the communications network and robot interface need to be built from scratch and tested to work in microgravity. On March 17, I will be driving Eurobot at ESTEC in The Netherlands in a two-hour experiment. The Space Station flies at around 28 800 km/h and completes a full orbit of Earth every 90 minutes, so throughout the session my commands could be beamed directly to the car-sized rover while at other times the Space Station will be on the other side of the planet – so commands need to be relayed through satellites and multiple control centers. To keep everything going smoothly, engineers at ESTEC created a new type of 'space internet' that adapts to the changing connections and network speed. In the experiment, Eurobot will be near a mock-up lunar base that has a problem with its solar array. It is up to me to locate, grab and unjam the solar panels." 5)

• On August 14, 2015, the Haptics-2 project acquired good science results for the second time. Highly efficient interaction with the crew (astronaut Kimiya Yui) resulted in a very efficient science protocol conduct: a full set of 9 science runs was collected and a detailed Ku-forward link characterization was performed. 6)

Here is a brief wrap-up of what was completed that day:

- One full run of Protocol 1b (and one partial run), checking out the Ku-forward communication link on all ports and quantifying their delay and jitter. Three channels are used: one for telepresence, one for housekeeping data and one for video.

- Seven science data acquisitions with different controller settings to quantify the telepresence control, performed as teleoperated stiffness discrimination tasks by Kimiya. These runs completed a full conduct of Haptics-2 Protocol 2b. Due to some mishaps in the first three runs related to Ku-forward link connectivity, however, the first three measurements might not be usable.

- A time analysis and time synchronization between space and ground NTP servers. On-board GPS time was acquired and synced with ground GPS time (with UTC leap seconds) for a detailed characterization of the on-ground and on-board NTP servers. This will be important for Interact, in order to sync all telemetry (space and ground) to the same time base; a sketch of the underlying offset estimate follows this list.

- A first Space-to-Ground crew guided procedure while performing a telepresence control experiment.

- A pre-verification of our four-channel bilateral control architecture that is robust to 850 milliseconds of round-trip time delay. 7)

- A verification of the model-mediated bilateral control architecture, which is robust to arbitrary time delay. The data post-processing will reveal whether the operator in space was able to identify the correct stiffness ordering of objects presented to him in random order.
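
For the time-sync item above, the standard NTP offset and delay estimate from a single request-response exchange is shown below; the timestamp values are invented for illustration.

```python
def ntp_offset_and_delay(t1, t2, t3, t4):
    """t1: client send, t2: server receive, t3: server send, t4: client receive."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # server clock minus client clock
    delay = (t4 - t1) - (t3 - t2)            # network round-trip time
    return offset, delay

offset, delay = ntp_offset_and_delay(t1=100.000, t2=100.450, t3=100.451, t4=100.900)
print(f"offset = {offset*1000:.1f} ms, round trip = {delay*1000:.1f} ms")  # ~0.9 s RTT
```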

• On June 3, 2015, the Telerobotics Lab performed the first-ever demonstration of space-to-ground remote control with live video and force feedback, when NASA astronaut Terry Virts, orbiting Earth on the ISS (International Space Station), shook hands with ESA telerobotics specialist André Schiele in the Netherlands. 8) 9)

- Terry was testing a joystick that allows astronauts in space to 'feel' objects from hundreds of kilometers away. The joystick is a twin of the one on Earth, and moving either one makes its copy move in the same way.

- The joystick provides feedback so both users can feel the force of the other pushing or pulling. Earlier this year, NASA astronaut Butch Wilmore was the first to test the joystick in space but without a connection.

- This historic first 'handshake' from space is part of the Lab's Haptics-2 experiment, testing systems to transmit the human sense of touch to (and from) space for advanced robot control.


Figure 4: Haptics-2 on ISS: Astronaut Terry Virts on the ISS shaking hands with the head of ESA's Telerobotics Lab, André Schiele, on the ground via a force-feedback joystick on the evening of June 3, 2015 (image credit: ESA, J. Harrod) 10)

Legend to Figure 4: First the ISS joystick was moved, then the slaved joystick on the ground followed. Then André shook the ground joystick in turn, which Terry felt in orbit. This first 'handshake with space' took place as part of the Lab's Haptics-2 experiment, harnessing advanced telerobotics technology for the control of space robot systems. ESA's Telerobotics and Haptics Laboratory is based at ESA/ESTEC in Noordwijk, the Netherlands.

The Haptics-2 experiment on June 3, 2015 verified the communications network, the control technology and the software behind the connection. Each signal from Terry to André had to travel from the International Space Station to another satellite (TDRS) some 36 000 km above Earth, through Houston mission control in the USA and across the Atlantic Ocean to ESA/ESTEC in the Netherlands, taking up to 0.8 seconds in total both ways.

As the Space Station travels at 28 800 km/h, the time for each signal to reach its destination changes continuously, but the system automatically adjusts to varying time delays. In addition to the joystick, Terry had an extra screen with realtime video from the ground and augmented reality added an arrow to indicate the direction and amount of force.


Figure 5: Photo of ESA's head of the Telerobotics lab André Schiele waiting to perform the Haptics-2 experiment and the first historic space-to-ground handshake on June 3, 2015 (image credit: ESA, J. Harrod)

Legend to Figure 5: The photo was taken moments before the first space-to-ground handshake described above.

• January 8, 2015: NASA astronaut Butch Wilmore completed the first full Haptics-1 data collection onboard the ISS. Now, all seven protocols of Haptics-1 have been recorded in both wall-mount and body-mount sessions. Data mining and analysis are coming up for the Telerobotics & Haptics Lab in the coming days and weeks; they will reveal in great detail what performance human crew can achieve with haptic force-feedback devices in space, and what differences exist between performance in space and on the ground. 11)

- In the wall-mount configuration, the joystick is fixed to the racks of the ISS and the astronaut stabilizes himself with the handrails (with the other hand) and with his feet.

- In the vest-mount configuration, the joystick is fixed to the crew member's own chest. The team believes that this force closure on the human body might improve the crew member's hand-eye coordination during the tests.


Figure 6: Photo of the Haptics device in a vest (or body) mounted configuration on ISS (image credit: NASA, ESA)

• January 5, 2015, Haptics-1 experiment: In a milestone for space robotics, the International Space Station has hosted the first full run of ESA's experiment with a force-reflecting joystick. NASA crew member Barry Wilmore operated the force-feedback joystick to gather information on physiological factors such as sensitivity of feeling and perception limits. He finished a first run on New Year's Eve (Dec. 31, 2014). 12)

- Operating on a force-feedback mechanism, the Telerobotics and Haptics Lab's Haptics-1 hardware is essentially a force-reflecting joystick that mimics the functions of a video gaming joystick. Force feedback refers to the sensation felt in the fingertips and hands that helps humans carry out tasks such as typing or tying shoelaces without looking down.

- The deceptively simple-looking lever is connected to a servomotor that can withstand any force an astronaut operator might unleash on it, while generating forces that the astronaut will feel in turn – just like a standard video gaming joystick as a player encounters an in-game obstacle. The joystick measures such forces at a very high resolution.

- Harnessing that sensation for robotics would extend the human sense of touch to space or other remote areas, making robotic control much more natural and easy. Ultimately, robots could work thousands or tens of thousands of kilometers away, yet perform tasks of equal complexity to those a human operator could manage with objects immediately at hand.

- To stop weightless users from being pushed around by the forces, the Haptics-1 experiment can be mounted either to a body harness or fixed to the Station wall.


Figure 7: Force-feedback experiment on Space Station (image credit: NASA, ESA)

 


 

Interact Space Experiment

Background: In early September of 2015, Danish astronaut Andreas Mogensen will perform a groundbreaking space experiment called Interact, developed by ESA in close collaboration with the TU Delft Robotics Institute. During the 2015 ESA Short Duration Mission, Mogensen will take control of the Interact Centaur rover on Earth from the International Space Station in real-time with force-feedback. 13) 14)

The Mission: The Interact experiment, conceived and implemented by the ESA Telerobotics & Haptics Laboratory, will be the first demonstration of teleoperation of a rover from space to ground in which, during part of the experiment, the operator receives force feedback. The task set for the astronaut is to maneuver the rover, located at ESA's ESTEC technical center in Noordwijk, through a special obstacle course, to locate a mechanical task board, and to perform a mechanical assembly task. Once the task board is located and approached, the astronaut will use a specially designed haptic joystick in space to take control of one of the Centaur's robotic arms on Earth. With the arm he will execute a 'peg-in-hole' assembly task to demonstrate the ability to perform connector mating through teleoperation with tight mechanical tolerances of well below one millimeter. The haptic feedback allows the astronaut to actually feel whether the connector is correctly inserted and, if necessary, to fine-tune the insertion angle and alignment.

The complete operation is performed from on board the International Space Station, at approximately 400 km altitude, using a data connection via a geosynchronous satellite constellation at 36 000 km altitude. The communication between the haptic joystick and the ground system is bidirectional: the two systems are essentially coupled. This so-called bilateral system is particularly sensitive to time delay, which can cause instability. The satellite connection, the TDRSS (Tracking and Data Relay Satellite System), results in communication time delays as large as 0.8 seconds, which makes this experiment especially challenging. ESA copes with these challenges through specialized control algorithms developed at ESA's Telerobotics Laboratory, through augmented graphical user interfaces with predictive displays, and with 'force sensitive' robotic control algorithms on the ground. These ESA technologies allow the operator to work in real time from space on a planetary surface, as if the astronaut could extend his arm from space to ground.
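
The instability problem under delay is commonly tamed with passivity-based methods; ref. 7) describes the time-domain passivity controller developed for the four-channel architecture. The sketch below shows the generic passivity-observer/passivity-controller idea for a single port in discrete time. It is a minimal illustration of that family of methods, not ESA's flight implementation.

```python
# Sign convention: f * v > 0 means energy flows INTO the communication channel.
def po_pc_step(f_in: float, v: float, dt: float, state: dict) -> float:
    """Observe net channel energy; inject minimum damping to keep the port passive."""
    E = state.get("E", 0.0) + f_in * v * dt   # passivity observer: energy balance
    f_out = f_in
    if E < 0.0 and abs(v) > 1e-9:             # channel generated energy: not passive
        alpha = -E / (v * v * dt)             # minimal variable damping coefficient
        f_out = f_in + alpha * v              # passivity controller: oppose the motion
        E = 0.0                               # dissipation restores the balance
    state["E"] = E
    return f_out

state = {}
print(po_pc_step(f_in=-2.0, v=0.5, dt=0.001, state=state))  # damped output: 0.0 N
```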

With Interact, ESA aims to present and validate a future where humans and robots explore space together. Robots will provide their operators much wider sensory feedback over much greater distances than what can be done by terrestrial robots today. Not only in space, but also on Earth, remote controlled robotics will prove highly enabling in dangerous and inaccessible environments. They can be used in arctic conditions, in the deep sea or for robust intervention in nuclear disaster sites.

We can expect that future human exploration missions to the Moon and Mars will benefit from such advanced human-robotic operations. ESA's research in telerobotic technologies and advanced crew operations from orbit will play a key role in these coming adventures. The ESA Telerobotics and Haptics Laboratory, along with ESA's Technical and Space Exploration Directorate, are dedicated to taking the next big steps in human-robot collaboration in space.

 

Interact Centaur:

The mobile robotic platform, called the Interact Centaur, was specifically designed to maneuver through rough terrain at high speed and to have the dexterity to perform very delicate and precise manipulation tasks through remote control. The Interact Centaur rover was designed and built by the Lab team in collaboration with graduate students from Delft University of Technology, the Netherlands. The custom vehicle design was brought from concept to reality in little over a year.

COMPUTING: The robot uses seven high-performance computers running software programmed in a highly modular, model-based and distributed way.

ROBOTIC ARMS: Two KUKA (KUKA Laboratories GmbH, Augsburg, Germany) lightweight robotic arms on the front of the rover allow the operator to perform very precise manipulation tasks. The arms can be 'soft controlled' to safely interact with humans or delicate structures and can be programmed to be compliant (like a spring and/or damper) when they hit an object. The arms are equipped with highly force-sensitive sensors and can flex and adapt in a manner similar to human arms during remote control. This allows the arms to be tightly coupled to an operator located far away by means of haptic (i.e., force-transmitting) interfaces. Their operation during the Interact experiment is very intuitive, allowing delicate and dexterous remote operations across very long distances with fine force feedback to the operator despite the communication time delay.

ROVER MOBILE PLATFORM: The drivetrain and wheels for the Interact Centaur are a customized version of the remote controlled platform manufactured by AMBOT. This battery-powered, four-wheel-drive, four-wheel steering platform is weatherproof and gives the rover over 8 hours of run-time in challenging terrains.

ROBOTIC PAN-AND-TILT NECK AND HEAD: A robotic 6-degree-of-freedom neck gives the cameras in the rover's head an enormous field of view, useful both for driving and for close visual inspection tasks.

REAL-TIME CAMERAS: The rover has four dedicated real-time streaming cameras that the astronaut can use during the mission: a head pan-tilt camera that provides a general contextual overview of the situation during driving and exploration of the environment; a tool camera mounted on the right robotic arm for vision during precise tool manipulation; and two hazard cameras (front and back) to view the near-proximity area otherwise occluded by the chassis during driving.

EXTERIOR DESIGN: A custom-made exterior protects all delicate mechatronic and computing hardware from dust and ensures a good thermal design.


Figure 8: Photo of the Interact Centaur mobile platform placed in a desert-like background (image credit: ESA)


Figure 9: Photo of the Interact Rover showing the tool camera head on a neck system, a pair of highly advanced arms designed for remote force-feedback operation and a number of proximity and location sensors, including an arm laser to indicate depth close-up (image credit: ESA)

AUGMENTED REALITY: To provide extra support to the astronaut while driving the rover, an augmented reality (AR) overlay was developed. This allows for virtual markers such as predicted position markers to be displayed on top of the camera feed.
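
A predicted-position marker of this kind can be computed by dead-reckoning the rover's last known pose forward by the estimated link delay, as sketched below. The unicycle model and all values are illustrative assumptions, not the actual Interact overlay code.

```python
import math

def predict_pose(x, y, heading, v_lin, v_ang, delay_s):
    """Unicycle-model prediction of the pose when the current command takes effect."""
    if abs(v_ang) < 1e-6:                     # straight-line case
        return (x + v_lin * delay_s * math.cos(heading),
                y + v_lin * delay_s * math.sin(heading), heading)
    r = v_lin / v_ang                         # constant-curvature arc
    h2 = heading + v_ang * delay_s
    return (x + r * (math.sin(h2) - math.sin(heading)),
            y - r * (math.cos(h2) - math.cos(heading)), h2)

# Marker position for a 0.8 s delay at 0.5 m/s with a slight turn:
print(predict_pose(0.0, 0.0, 0.0, v_lin=0.5, v_ang=0.2, delay_s=0.8))
```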

LASER GUIDANCE: To visually support the astronaut when performing the mechanical alignment during the peg-in-hole assembly task, a laser has been embedded within the tool. When the tool hovers over the hole, the laser spot becomes invisible, indicating that the connection can be attempted. The laser thus creates an artificial depth impression through a dedicated depth cue, allowing such complex 3D tasks to be executed without a dedicated stereo 3D video system, which would consume excessive data bandwidth.


Figure 10: Schematic view of laser guidance assembly mounted to the tool camera (image credit: ESA)

Space-to-Ground Communications: As a complicating factor, the signals between the astronaut and the robot must travel via a dedicated and highly complex network of satellites in geostationary orbit. The signals travel from the International Space Station via NASA's TDRSS to ground facilities in the U.S. From there, they cross the Atlantic Ocean to the ESA facilities in Noordwijk, the Netherlands. Forces between the robot and its environment, as well as video and status data, travel back to the graphical user interface and the haptic joystick. In this round trip, all signals cover a distance of nearly 90,000 km, and the resulting round-trip time delay approaches one second.
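
A quick back-of-envelope check of these figures: pure light-time over the quoted ~90,000 km path is only about 0.3 s, so most of the ~1 s round trip comes from ground routing, switching and processing rather than from propagation itself.

```python
C = 299_792_458.0      # speed of light [m/s]
path_m = 90_000e3      # quoted round-trip path length [m]
print(f"propagation alone: {path_m / C:.2f} s")   # -> 0.30 s
```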

ESA developed a model-mediated control approach that allows force feedback between distributed systems with up to multiple seconds of time delay, without a noticeable reduction in performance compared with directly coupled systems. Even with this software and these control methods enabling the astronaut to perform such tasks on Earth, research suggests that humans can only handle signal transmission time delays of up to about three seconds for control tasks that require hand-eye coordination. In theory, this would allow haptic control from Earth of robotic systems as far away as the surface of the Moon.
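
The core idea of model mediation can be sketched in a few lines: the operator's device interacts with a local environment model rendered with zero delay, while the (delayed) telemetry from the remote robot only updates the model's parameters. The class and values below are illustrative, not ESA's implementation.

```python
class LocalModel:
    def __init__(self):
        self.k = 0.0         # estimated contact stiffness [N/m]
        self.x_wall = None   # estimated contact location [m]

    def update(self, k_est: float, x_wall: float) -> None:
        """Apply a (possibly seconds-late) parameter update from the remote robot."""
        self.k, self.x_wall = k_est, x_wall

    def feedback_force(self, x_master: float) -> float:
        """Instant local force: a virtual spring against the modeled wall."""
        if self.x_wall is None or x_master < self.x_wall:
            return 0.0                            # free space: no force
        return -self.k * (x_master - self.x_wall)

model = LocalModel()
model.update(k_est=500.0, x_wall=0.10)   # telemetry arriving ~1 s late is still safe
print(model.feedback_force(0.12))        # -10.0 N, rendered with zero local delay
```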


Figure 11: Schematic of the space-to-ground communications (image credit: ESA)

HAPTICS-1 JOYSTICK: On board the ISS, the astronaut will re-use equipment from the previous Telerobotics & Haptics Lab experiments, Haptics-1 and Haptics-2. For these experiments, a tablet PC and a small force-reflective joystick were flown to the ISS with the goal of evaluating human haptic perception in space and validating real-time telerobotic operations from space to ground. During Haptics-1, on December 30, 2014, haptics was first used in the microgravity environment of the ISS. During Haptics-2, on June 3, 2015 (21:00 CEST), for the first time in history, a handshake with force feedback was performed between two humans, one located in space and one on the ground (Figure 4).

 

Interact Flight Experiment

"Interact" is a new space technology demonstration experiment that will be conducted from on-board the International Space Station in 2015. In Interact, astronauts will control an advanced robotic system on Earth, for the first time, with force reflection and in real-time. Astronauts will execute a sequence of remotely operated tasks with the Interact Robot on Earth, located in a real-life outdoor environment. 15) 16)

The Interact robot (Interact Centaur) consists of a 4-drive, 4-steer mobile platform, two seven-degree-of-freedom robot arms with grippers, and a camera and head system allowing the operator to perceive optical information about the environment. On board the ISS, the crew will re-use the equipment provided by the Haptics-1 experiment, i.e. a tablet PC and a small force-reflective joystick.

For the very first time, a real-time data link will exist between a payload on the ISS and a robotic system on Earth, simulating, e.g., Mars orbit-to-surface robotic remote operations as closely as possible.

The Interact experiment is headed by ESA's Telerobotics & Haptics Laboratory and is implemented in collaboration with TU Delft Robotics Institute (Interactive Robotics Theme) as well as with the support of the Delft DREAM hall.

 


 

Status of the Interact Flight Experiment

• Sept. 11, 2015: ESA astronaut Andreas Mogensen is proving to be an expert space driver after commanding two rovers from space this week. As part of ESA's METERON project, Andreas drove a second, car-sized rover from the International Space Station to repair a mockup lunar base in the Netherlands (Figure 14). 17)

- Andreas directly controlled the Eurobot rover in a simulated troubleshooting Moon scenario. A second rover was controlled by ESA's Columbus Control Center at DLR in Oberpfaffenhofen, Germany, allowing Andreas to focus on Eurobot and intervene if necessary.

- The new user interface for operating rovers from space ran perfectly as they worked in harmony at close quarters without any problems. The experiment went so well that it was completed in one continuous session rather than over the planned three sessions on two days.

- "This experiment demonstrated that we have the means to operate lunar robots from a spacecraft orbiting the Moon, a topic ESA is studying at the moment," says METERON system engineer Jessica Grenouilleau. Landing humans on a distant object is one thing, but they will also need the fuel and equipment to work and return to Earth when done. Sending robots to scout landing sites and prepare habitats for humans is more efficient and safer, especially if the robots are controlled by astronauts who can react and adapt to situations better than computer minds.

- METERON is developing the communication networks, interfaces and hardware to operate robots from a distance in space. The Space Station is being used as testbed, with astronauts controlling rovers on Earth. The demonstration showed that robots can perform valuable tasks and two can collaborate efficiently, even if they are controlled from long distances apart.

- The experiment proved the user interface works well and that ESA's 'space internet' can stream five video signals to the Station – orbiting at 28 800 km/h some 400 km high – without significant delays and in good quality. If the link is lost for a moment, which often happens in space operations, the network adapts without a problem; a store-and-forward sketch of this idea follows below.
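
The adaptive behavior described above is the essence of delay/disruption-tolerant networking: data is never dropped on link loss, but queued and forwarded when contact resumes. The sketch below is a minimal illustration of that principle, not the METERON network stack.

```python
from collections import deque

class DTNNode:
    def __init__(self):
        self.queue = deque()       # custody of data bundles while no route exists
        self.link_up = False

    def send(self, bundle: bytes) -> None:
        self.queue.append(bundle)  # take custody instead of dropping
        self._flush()

    def on_link_change(self, up: bool) -> None:
        self.link_up = up
        self._flush()              # drain the backlog when contact resumes

    def _flush(self) -> None:
        while self.link_up and self.queue:
            print(f"forwarding {len(self.queue[0])} byte bundle")
            self.queue.popleft()

node = DTNNode()
node.send(b"rover command")        # link down: command is queued, not lost
node.on_link_change(up=True)       # contact window opens: command delivered
```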


Figure 12: Photo of the Eurobot rover (left) with the surveyor rover during the Supvis-E experiment at ESA/ESTEC in the Netherlands (image credit: ESA) 18)

- The user interface was developed for ESA by a young team from Thales Alenia Space in Italy, working against tight deadlines to be ready in time for Andreas's mission.


Figure 13: Camera image at the ESA Columbus Control Center at DLR in Oberpfaffenhofen (image credit: ESA)


Figure 14: Andreas commanded the Eurobot rover from the ISS and drove the rover to repair a mockup lunar base in the Netherlands (image credit: ESA) 19)

Legend to Figure 14: One of the tasks was to lift up insulation to inspect underneath. The Meteron engineers had placed the winning drawing of a children's competition in Denmark to reveal the winner.

Note: Andreas's stay was scheduled to last only 10 days. On the morning of September 12, 2015, at 00:51 GMT, Andreas Mogensen, Soyuz spacecraft commander Gennady Padalka and Kazakh cosmonaut Aidyn Aimbetov landed in the steppe of Kazakhstan, marking the end of their missions to the International Space Station. The trio undocked from the orbiting complex on September 11 at 21:29 GMT in an older Soyuz spacecraft, leaving the newer vessel they arrived in for the Station crew. 20)

• Sept. 8, 2015: Putting a round peg into a round hole is not hard for someone standing next to it. But yesterday (Sept. 7), ESA astronaut Andreas Mogensen did this while orbiting 400 km up aboard the ISS (International Space Station), remotely operating a rover and its robotic arm on the ground. Andreas used a force-feedback control system (Haptics-1) developed at ESA, letting him feel for himself whenever the rover's flexible arm met resistance. 21)

- These tactile sensations were essential for the success of the experiment, which involved placing a metal peg into a round hole in a 'task board' that offered < 0.15 mm of clearance. The peg needed to be inserted 4 cm into the hole to make an electrical connection.

- "We are very happy with today's results," said André Schiele, who leads the experiment and ESA's Telerobotics and Haptics Laboratory. Andreas managed two complete drive, approach, park and peg-in-hole insertions, demonstrating precision force feedback from orbit for the very first time in the history of spaceflight. "He had never operated the rover before, but its controls turned out to be very intuitive. Andreas took 45 minutes to reach the task board and then insert the pin on his first attempt, and less than 10 minutes on his follow-up attempt, showing a very steep learning curve."

- The experiment took place at ESA/ESTEC in Noordwijk, the Netherlands, watched by a contingent of media as well as eager telerobotics engineers and center personnel.

- The real challenge was achieving meaningful force feedback despite the distance the signals had to travel: from the Station, hurtling around Earth at 8 km/s, up to satellites almost 36 000 km high and then down to a US ground station in New Mexico, via NASA Houston and then through a transatlantic cable to ESTEC – and back. It added up to a round trip of more than 144 000 km. The inevitable two-way time delay approaches one second, but the team used sophisticated software based on a dedicated control method termed 'model-mediated control' to help compensate for this lag, incorporating models that keep the operator and arm from going out of sync.

- With time remaining, Andreas used a force-feedback joystick to differentiate between the stiffness of different springs, helping to measure the sensitivity of users up in orbit to very small degrees of feedback.

- The test was part of a suite of experiments Andreas is carrying out on his mission to the Space Station. Arriving last Friday (Sept. 4), he is due to return to Earth on Saturday (Sept. 12, 2015).


Figure 15: The Interact Rover at ESA/ESTEC, under the remote control of ESA astronaut Andreas Mogensen up on the ISS, during an afternoon of experiments on 7 September 2015 (image credit: ESA) 22)


Figure 16: Photo of the Interact Rover at ESA/ESTEC, under the remote control of ESA astronaut Andreas Mogensen up on the ISS, during an afternoon of experiments on 7 September 2015. Andreas was tasked with using the rover to place a metal peg into a hole with just 0.15 mm of clearance. 23)


Figure 17: Photo of the Interact Rover at ESA/ESTEC, inserting the metal peg under remote control of ESA astronaut Andreas Mogensen aboard the ISS, into a hole with just 0.15 mm of clearance. The peg needed to be inserted 4 cm to make an electrical connection (image credit: ESA) 24)

 


1) "METERON - Multi-Purpose End-To-End Robotic Operation Network," URL: http://esa-telerobotics.net/meteron

2) Neal Y. Lii, Daniel Leidner, André Schiele, Peter Birkenkampf, Ralph Bayer, Benedikt Pleintinger, Andreas Meissner, Andreas Balzer, "Simulating an extraterrestrial environment for robotic space exploration: The METERON-Supvis-Justin telerobotic experiment and the SOLEX proving ground," URL: http://meteron.dlr.de/wp-content/uploads/2015/05/P10_Lii.pdf

3) "Haptics-1 ... on it's way to ISS on ATV-5," Telerobotics & Haptics Lab, July 29, 2014, URL: http://www.esa-telerobotics.net/news/13/67/Haptics-1-on-it-s-way-to-ISS-on-ATV5

4) "Experiment setup," ESA, Feb. 2, 2014, URL: http://www.esa.int/spaceinimages/Images/2014/02/Experiment_setup

5) "Space-to-ground remote," Tim Peake's Principia Blog, March 15, 2016, URL: http://blogs.esa.int/tim-peake/2016/03/15/space-to-ground-remote/

6) André Schiele, "Post Experiment up-date 14/08/2015, 15:30, METERON Project," Live Blog, August 14, 2015, URL: http://esa-telerobotics.net/meteron/flight-experiments/haptics-2/live-blog-13th-august-2015

7) J. Rebelo, A. Schiele, "Time Domain Passivity Controller for 4-Channel Time-Delay Bilateral Teleoperation," IEEE Transactions on Haptics, Vol. 8, Issue 1, 2015, pp. 79-89, ISSN: 1939-1412, DOI: 10.1109/TOH.2014.2363466

8) "Historic Handshake between Space and Earth," ESA, June 3, 2015, URL: http://www.esa.int/Our_Activities/Human_Spaceflight/Historic_handshake_between_space_and_Earth

9) "Haptics-2 — First-time teleoperation with force-feedback from Space done on 03. June 2015," ESA, June 3, 2015, URL: http://www.esa-telerobotics.net/gallery/Meteron-Experiments/Haptics-2/79

10) "First handshake with space," ESA, June 3, 2015, URL: http://www.esa.int/spaceinimages/Images/2015/06/First_handshake_with_space

11) "First Haptics-1 dataset collection completed," Telerobotics & Haptics Lab, January 8, 2015, URL: http://www.esa-telerobotics.net/news/17/67/First-Haptics-1-dataset-collection-completed

12) "Astronaut feels the force," ESA, January 5, 2015, URL: http://www.esa.int/Our_Activities/Space_Engineering_Technology/Astronaut_feels_the_force

13) "Interact Space Experiment- Online Fact Sheet," ESA, URL: http://esa-telerobotics.net/uploads/documents/Interact%20Brochure%20-%20Online.pdf

14) "Astronaut Andreas to try sub-millimeter precision task on Earth from Orbit," ESA, Aug.27, 2015, URL: http://www.esa.int/Our_Activities/Space_Engineering_Technology/Astronaut_Andreas_to_try_sub-millimetre_precision_task_on_Earth_from_orbit

15) André Schiele, "Interact Flight Experiments," ESA, Sept. 2015, URL: http://esa-telerobotics.net/meteron/flight-experiments/interact

16) "Telerobotics head introduces Interact rover," ESA, Sept. 1, 2015, URL: http://www.esa.int/spaceinvideos/Videos/2015/09/Telerobotics_head_introduces_Interact_rover

17) "Supervising two rovers from space," ESA, Sept. 11, 2015, URL: http://www.esa.int/Our_Activities/Human_Spaceflight/iriss/Supervising_two_rovers_from_space

18) http://www.esa.int/spaceinimages/Images/2015/09/Eurobot_rover_with_surveyor

19) "Repairing Lunar Lander reveals a surprise," ESA, Sept. 11, 2015, URL: http://www.esa.int/spaceinimages/Images/2015/09/Repairing_Lunar_Lander_reveals_a_surprise

20) "Andreas Mogensen lands after a busy mission on Space Station," ESA, Sept. 12, 2015, URL: http://www.esa.int/Our_Activities/Human_Spaceflight/iriss/Andreas_Mogensen_lands_after_a_busy_mission_on_Space_Station

21) "Slam dunk for Andreas in space controlling rover on ground," ESA, Sept. 8, 2015, URL: http://www.esa.int/Our_Activities/Space_Engineering_Technology/Slam_dunk_for_Andreas_in_space_controlling_rover_on_ground

22) "Andreas controlling rover," ESA, Sept. 8, 2015, URL: http://www.esa.int/spaceinimages/Images/2015/09/Andreas_controlling_rover

23) "Rover approaching task board," ESA, Sept. 8, 2015, URL: http://www.esa.int/spaceinimages/Images/2015/09/Rover_approaching_task_board

24) "Rover inserting peg," ESA, Sept. 8, 2015, URL: http://www.esa.int/spaceinimages/Images/2015/09/Rover_inserting_peg
 


The information compiled and edited in this article was provided by Herbert J. Kramer from his documentation of: "Observation of the Earth and Its Environment: Survey of Missions and Sensors" (Springer Verlag) as well as many other sources after the publication of the 4th edition in 2002. - Comments and corrections to this article are always welcome for further updates (herb.kramer@gmx.net).
