How IBM put Watson in space

CIMON says all sorts of useful things, thanks to IBM’s cognitive computing platform

CIMON, powered by IBM Watson.

Space is a tricky thing. It may be vastly, hugely, mind-bogglingly big, as Douglas Adams once wrote, but the sophisticated tin cans that used to protect humans in space are relatively cramped. And the cost to put someone in them and keep them alive is very high indeed.

As a result, astronaut time on the International Space Station (ISS) is at a premium. Enter CIMON: An effort to test the potential of an AI-based companion to make astronauts more productive.

The first iteration of CIMON — Crew Interactive Mobile CompanioN — had its ISS debut in 2018. Although very much falling into the category of technology demonstrator — the floating 3D-printed sphere, whose smarts are courtesy of IBM Watson, is now safely back on Earth — a second version of CIMON is being prepared.

The original idea for CIMON came from Airbus Defence and Space in Germany. The company approached Germany’s space agency — the German Aerospace Centre (DLR) — with the concept and was met with both a warm reception and funding, Matthias Biniok, IBM’s lead Watson architect for Germany, Switzerland and Austria, told Computerworld.

Airbus was seeking a “strong partner” to help with the AI side of the project and went to IBM because of its experience in the area, thanks to its Watson business, Biniok said in an interview on the sidelines of IBM’s Cloud Innovation Exchange in Sydney. He said another factor was the strength of IBM’s data privacy regime.

The project kicked off in late 2016. The long-term goal is for CIMON to be able to act as “an astronaut assistant, as a real companion,” Biniok said. The initial stepping stones toward that vision have been more modest but still complex: Making CIMON able to support experiments, and fitting it out with video documentation capabilities.

CIMON is about 32 centimetres in diameter, making it a little larger than a basketball, and it weighs about five kilograms. It is fitted with 14 fans for propulsion, along with ultrasonic and infrared sensors to aid its autonomous navigation around the ISS. A stereo camera helps with navigation, a high-resolution camera supports facial recognition, and two side cameras handle video and still photography.

It has a high-definition directional microphone to facilitate voice recognition, as well as an array of eight other microphones that can help identify the direction of a sound.

“So it’s using all kinds of sensors to actually understand the environment, and then based on this understanding, he knows what to do — based on, of course, the tasks that he has,” Biniok said.

Those tasks are, essentially, to make life easier for astronauts. “Imagine you're an astronaut on the International Space Station,” Biniok said. “You have to do lots of experiments every day.”

Typically the procedure for each experiment will be set out in a 50-100 page PDF document accessed via a wall-mounted laptop.

“So as soon as you have to do this experiment, the astronauts need to float towards the laptop, they need to open the PDF document, they need to scroll down, [figure out] what they want to do, float back to the experiment station, and start working on it,” Biniok said. “If they forget something, they need to float back and it takes a lot of time.”

Astronauts are not cheap: NASA’s price list offers ISS expedition crew member time at US$17,500 per hour.

Something like CIMON can potentially save significant sums simply by shaving down the time taken to conduct an experiment, for instance by telling a crew member what the next step is.

“You can save a lot of money with an intelligent assistant,” Biniok said. “Instead of searching for it, you get your information faster.”
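The arithmetic behind that claim is easy to sketch. In this back-of-the-envelope example the hourly rate is NASA's figure quoted above; the minutes saved and the number of experiments are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope estimate of crew-time savings from an assistant
# like CIMON. The hourly rate is from NASA's price list quoted above;
# the minutes-saved and experiment-count figures are purely illustrative.

CREW_HOUR_USD = 17_500  # NASA's price for one ISS crew-member hour

def savings_usd(minutes_saved_per_experiment: float,
                experiments: int) -> float:
    """Dollar value of crew time freed up across a set of experiments."""
    return (minutes_saved_per_experiment / 60) * CREW_HOUR_USD * experiments

# e.g. shaving 10 minutes off each of 100 experiments
print(round(savings_usd(10, 100)))  # → 291667
```

Even modest per-experiment savings compound quickly at that hourly rate.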

An initial idea to address the issue was strapping a tablet device to an astronaut: That saves a bit of floating but it still requires searching and is not a hands-free solution.

“With CIMON, you can just ask,” Biniok said: “‘CIMON, what's the next step?’ Or, ‘What kind of tool do I need to use right now in this specific context?’ Or something like, ‘Why do I need to use Teflon right now and not any other material?’ CIMON is able to answer all these questions based on the documentation.”
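Watson answers such questions with far richer natural language processing than can be shown here, but the underlying idea of matching a spoken question against procedure documentation can be illustrated with a toy retrieval sketch. The procedure steps below are made-up examples, and keyword overlap is a deliberately crude stand-in for Watson's NLP:

```python
# Toy illustration of answering a question from procedure documentation.
# Ranks procedure steps by keyword overlap with the question; the steps
# themselves are invented examples, not from any real ISS procedure.
import re

PROCEDURE = [
    "Step 1: Unstow the experiment container from locker A3.",
    "Step 2: Wipe the sample tray with a Teflon cloth to avoid adhesion.",
    "Step 3: Record the crystal growth with the left side camera.",
]

def tokens(text: str) -> set:
    """Lowercased word set, punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

def best_step(question: str) -> str:
    """Return the procedure step sharing the most words with the question."""
    q = tokens(question)
    return max(PROCEDURE, key=lambda step: len(q & tokens(step)))

print(best_step("Why do I need to use Teflon?"))
# → Step 2: Wipe the sample tray with a Teflon cloth to avoid adhesion.
```

A real assistant would also need intent understanding and dialogue context, which is where Watson Assistant comes in, as described below.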

CIMON’s second key use case – mobile video documentation – is aided by its ability to autonomously navigate the station. The device is fitted with an Airbus-built guidance, navigation and control system.

“You can actually just tell CIMON, ‘CIMON go over there to experiment station three, turn 30 degrees to the right and start recording with your left camera’, for example, or you can just tell it ‘CIMON come here and start recording what I'm doing right now,’” Biniok said.

That makes documenting experiments significantly easier, he said.

Behind the scenes, IBM Watson Speech to Text is used to process crew directions to CIMON, while Watson Assistant is used to understand the intention. “So when I say something like, ‘CIMON come here’, then I would need to understand, ‘First of all he said CIMON, so this task is directed to me’, and then I would need to understand, ‘come here’,” Biniok said.

“This kind of information, natural language processing, is what we’re doing with Watson Assistant. And then on top of that, of course, sometimes you need to answer something. Sometimes it's enough if you just fly there or start recording, but sometimes you need to say something like, ‘Okay I'm on my way’ or something like that, and that's where we're using Watson Text to Speech.”
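CIMON's flight software is not public, but the loop Biniok describes (wake word, intent recognition, optional spoken reply) can be sketched. The `transcribe()` stub and the keyword-based intent table below are illustrative stand-ins for the Watson Speech to Text, Assistant and Text to Speech services, not IBM's implementation:

```python
# Minimal sketch of the command loop described above: a wake-word check,
# intent recognition, and an optional spoken acknowledgement. The real
# system uses Watson Speech to Text / Assistant / Text to Speech; these
# stubs and the keyword intent table are illustrative stand-ins.

WAKE_WORD = "cimon"

# Hypothetical intent table: phrase fragment -> (intent, spoken reply or None)
INTENTS = {
    "come here": ("navigate_to_speaker", "Okay, I'm on my way."),
    "start recording": ("start_video", None),
    "next step": ("read_procedure_step", None),
}

def transcribe(audio: str) -> str:
    """Stand-in for a speech-to-text service; 'audio' is already text here."""
    return audio.lower().strip()

def handle_utterance(audio: str):
    text = transcribe(audio)
    if not text.startswith(WAKE_WORD):
        return None  # in crew mode, ignore speech not addressed to CIMON
    command = text[len(WAKE_WORD):].strip(" ,")
    for fragment, (intent, reply) in INTENTS.items():
        if fragment in command:
            return intent, reply  # a reply would be sent to text-to-speech
    return "fallback", "Sorry, I didn't understand that."

print(handle_utterance("CIMON, come here and start recording"))
# → ('navigate_to_speaker', "Okay, I'm on my way.")
```

The wake-word gate is the same pattern CIMON's ‘crew mode’ uses when several people are in the room, as described later in the article.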

Although the environment that CIMON was designed to operate in is (figuratively but not quite literally) a million miles from the typical enterprise, it is not that far removed from terrestrial use cases for AI. Woodside, for example, has been a high-profile Watson user for a number of years.

The resources company’s chief executive, Peter Coleman, last month told IBM’s Cloud Innovation Exchange event that the company had around 25 million documents loaded into Watson. According to Coleman, some 80 per cent of Woodside employees interact with Watson on a daily basis.

Earlier this year, Gartner predicted that by 2021 up to a quarter of “digital workers” will use a “virtual employee assistant” (VEA) on a daily basis, up from 2 per cent in 2019. The analyst firm is also expecting that by 2023, 25 per cent of employee interactions with applications will be based on voice.

“We believe that the popularity of connected speakers in the home, such as the Amazon Echo, Apple HomePod and Google Home, will increase pressure on businesses to enable similar devices in the workplace,” Gartner vice president Van Baker said in January.

“While there are limitations on the actions that VPAs [virtual personal assistants] can perform, employees will readily expand the actions allowed as capabilities improve.”

Biniok said that CIMON also uses Watson’s Visual Recognition API, along with open source facial recognition software. CIMON could, in theory at least, employ Watson Tone Analyzer to understand sentiment, although that capability was not enabled during the device’s initial foray into space.

It is, however, expected to be activated in the next version of CIMON, allowing the mood of astronauts to be assessed and analysed by psychologists.

The processing to make CIMON work is done off-station: It relies on a connection from the ISS to a NASA satellite and from there to a ground station in Switzerland. From there, data is transmitted to the IBM Cloud data centre in Frankfurt.

“When I built the first prototype, I had a latency of eight seconds,” Biniok said. “You don’t want to wait eight seconds for an answer.”

IBM managed to cut that latency down to two seconds. That is not ideal, Biniok said, but “it's okay; it's reasonable to work with.”

At the end of 2018, CIMON was involved in its first experiments. It worked well, with only “minor complications,” Biniok said. Those complications were more amusing than serious, he added.

An example was CIMON not entering ‘crew mode’. Normally, directions can be given to CIMON simply by speaking to it. However, when there are multiple people in the room, crew mode requires directions to be prefaced with “CIMON”, in a similar fashion to using Amazon’s Alexa or a Google Home device. “One time, I think he understood some kind of complaint and was like, ‘Oh, don't be so mean to me’. The other time, he understood something about lunch, and just said, ‘Yeah, well I'm hungry,’” Biniok said.

“The astronauts were very happy to work with CIMON,” Biniok added, and they were keen to conduct more work with it, but it was only on the ISS for a limited period.

Biniok said that IBM worked on making conversation with CIMON feel as natural as possible, and made an effort to give it an appropriate personality. In particular IBM wanted to make sure it would work well with German astronaut Alexander Gerst, who employed the device for multiple experiments during its stay on the ISS. Biniok’s team included a personality expert who worked on making CIMON an ISTJ: Supportive but something of an introvert, he said.

For other, terrestrial Watson projects, IBM will often take a different approach, Biniok said: For example, a European bank that targets a younger market wanted a Watson-based chatbot that was “more open, more proactive, ‘more young’ than the typical chatbot you might see.”

“Based on the use case, you always need an understanding of your target that you're talking with to give them the best user experience,” he said.

The original CIMON will reside in the German Historical Museum, but work on a second, tweaked version of the device is underway.

It will have new hardware and slightly different software. Testing tone analysis will be a focus, and it will also have some different speech and movement capabilities. CIMON is the first device on the ISS that incorporates autonomous free flying, Biniok said, “so that’s really something that we need to test extensively.”

There have been simple enhancements made to dialogue and conversation flows, he added. It’s “more like CIMON 1.1” rather than “2.0” – although discussions about a “CIMON 2.0” are underway.

There are “lots of things” a bigger CIMON upgrade could potentially deliver for the ISS crew, Biniok said: “When the astronauts are sleeping, sometimes you need to check the air quality, and you need to sometimes check if something is blinking or if there is an anomaly, for example. These kind of things CIMON could check by just flying around the space station during the night.”

One additional idea is fitting a future CIMON device with a small arm that could, for example, fetch a tool for an astronaut. Another possibility being considered is having the underlying AI services running on the ISS rather than on the ground, Biniok said, though that and incorporating an arm into CIMON are not on the immediate agenda.

IBM's Matthias Biniok with CIMON. IBM

Copyright © 2019 IDG Communications, Inc.
