
It’s an exciting time for space research as NASA eyes a future of longer and more ambitious missions to the Moon, Mars and beyond through programs like Artemis. But venturing deeper into the cosmos also means astronauts will be farther from Earth—and therefore from the medical professionals, equipment and real-time advice they might need.
To bridge that gap, NASA is partnering with Google to test an A.I. medical assistant for long-duration missions. Known as the Crew Medical Officer Digital Assistant (CMO-DA), the multi-modal tool is designed to diagnose conditions, analyze symptoms and recommend treatments when contact with Earth-based medical experts isn’t possible.
“Supporting crew health through space-based medical care is becoming increasingly important as NASA missions venture deeper into space,” said Jim Kelly, vice president of federal sales for Google’s public sector arm, in a recent blog post. As the agency prepares to expand its horizons, it’s also exploring “whether remote care capabilities can deliver detailed diagnoses and treatment options if a physician is not onboard or if real-time communication with Earth is limited,” he added.
“Currently, Low Earth Orbit (LEO) missions such as those to the [ISS] have frequent and relatively robust connectivity,” said David Cruley, customer engineer for Google’s public sector division, in a statement to Observer. “However, as distance from the Earth increases, so will latency and communication gaps.”
Future NASA spaceflights will stretch far longer than most of the agency’s current missions, which are primarily trips to the International Space Station (ISS) lasting around six months. Astronauts bound for the ISS are assigned flight surgeons (physicians with specialized training in space medicine), can regularly communicate with people on Earth, have access to a pharmacy and extensive medical equipment, and can return home quickly if urgent care is needed.
These safeguards have helped NASA manage unexpected medical issues in the past. In 2019, for example, an astronaut on the ISS who discovered a blood clot was able to conduct an ultrasound guided by Earth-based radiologists, take medication stocked on the station, and receive resupplies as needed. The astronaut was asymptomatic soon after returning to Earth.
How does an A.I. space doctor work?
CMO-DA will draw on spaceflight literature, natural language processing and machine learning to provide real-time medical support. Built on Google Cloud’s Vertex A.I. platform, it was trained on open-source data covering the 250 most common medical issues encountered in space.
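NASA and Google haven’t published the assistant’s code, but the description suggests a familiar pattern for Vertex A.I. applications: a generative model queried with a structured symptom report. The minimal sketch below is purely illustrative of that pattern under those assumptions—the project ID, model name and prompt are placeholders, not CMO-DA’s actual configuration.

```python
# Illustrative sketch only: CMO-DA's internals are not public, so this assumes
# a generic Vertex A.I. generative model call. The project ID, model name and
# prompt are placeholders, not NASA's or Google's actual setup.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="example-project", location="us-central1")

# A general-purpose model stands in here for the purpose-built assistant.
model = GenerativeModel("gemini-1.5-pro")

# A crew member's symptom report, phrased as a diagnostic prompt.
prompt = (
    "A crew member reports right ankle pain and swelling after exercise "
    "in microgravity. List likely diagnoses and recommended onboard care."
)

response = model.generate_content(prompt)
print(response.text)
```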
Early trials have produced promising results. When tested on scenarios involving ankle injury, flank pain and ear pain, physicians scored the assistant’s diagnostic accuracy at 88 percent, 74 percent and 80 percent, respectively, according to a NASA presentation outlining the project.
The initiative is still in its early stages, with NASA and Google focusing on further testing and refining the system through collaboration with medical doctors. “A key goal is to make the A.I. more ‘situationally aware’ of space-specific conditions,” said Cruley, noting that Google aims to ensure future versions can account for the effects of microgravity on the human body and integrate data from onboard devices such as ultrasound imaging.
CMO-DA is meant to support—not replace—human experts, according to NASA. The tool will “assess health, provide real-time diagnostics and guide treatment until a medical professional is available,” the agency said, describing it as having “the potential to ultimately assist agency flight surgeons.”
Its uses won’t be limited to space. “The idea of an A.I. digital health assistant is portable to Earth-based applications,” said Cruley. “Lessons learned could be applied to providing quality medical care in remote or underserved areas with limited access to healthcare professionals.”