While it’s easy to imagine asking a robot for the weather forecast, it’s a little more difficult to imagine asking a robot to diagnose a disease or prescribe a treatment plan. But robots may soon play an important role in our healthcare. In San Francisco, two research teams are exploring how AI-powered robots might impact senior care, introducing the idea of “connected aging.” Think robots that can remind you to take medication or suggest you go for a walk. While these solutions are undeniably helpful, they introduce a set of questions about what we should and shouldn’t train machines to do. In a thought-provoking long-form piece, Pulitzer Prize-winning author Siddhartha Mukherjee addresses a number of these questions as they relate to the use of AI for disease diagnosis. The question, he suggests, isn’t what machines will or won’t be capable of, but what should be automated and what shouldn’t be.

The Business of Healthcare: Diagnostic Testing Devices

For most people, the most accessible way to receive or track a diagnosis is a good old-fashioned visit to the doctor or the clinic. Increasingly, home diagnostic devices and lab-developed tests (tests for which you mail your samples to a lab, as opposed to receiving a result on the spot) are making it easier to receive highly accurate results at a much lower cost. As a result, a number of new diagnostic testing device companies are rethinking diagnosis to make it more about continuous monitoring and less about that once-in-a-while visit to the clinic. While startups in the diagnostics space have traditionally faced resistance from investors, these new technological developments might signal a changing tide for the industry.

Noteworthy Outcomes: Translational Applications

How ubiquitous will mobile platforms become in research and patient care? Updates on two new projects that explore the potential of mobile health (mHealth) are widening our understanding of this area of healthcare technology. On the healthcare side of the equation, The Asthma Mobile Health Study, recently published in Nature Biotechnology, monitored the symptoms of a large cohort of asthma sufferers using a smartphone app. On the technology side, Google recently introduced a machine learning approach called Federated Learning, which “enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on device.”
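To make the idea concrete, here is a minimal sketch of federated averaging, the core technique behind approaches like Federated Learning: each simulated “phone” trains a model on its own private data, and only the model weights, never the raw data, are sent to a server that averages them into a shared model. All names and parameters here are illustrative; this is not Google’s actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=20):
    """One client's gradient-descent pass on its own private data (linear model)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

# Private datasets for three simulated devices, all drawn from y ≈ 3x.
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 1))
    y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(1)
for _ in range(5):  # communication rounds
    # Each device trains locally; only the updated weights leave the device.
    updates = [local_update(global_w, X, y) for X, y in clients]
    # The server averages the updates into a new shared model.
    global_w = np.mean(updates, axis=0)

print(float(global_w[0]))  # the shared model approaches the true slope of 3.0
```

The key property: the server only ever sees weight vectors, so each device’s training data stays on the device, which is exactly the privacy appeal for health applications like the asthma study above.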

The last few weeks have also witnessed some exciting developments in the application of data science to healthcare with the release of The Drug Repurposing Hub and The Stem Explorer. The Drug Repurposing Hub is “a next-generation drug library and information resource” that could accelerate the development of drugs at a global scale. The Allen Institute recently released a stunning new “portal into the human cell” that offers scientists an unprecedented resource for studying stem cell types, which could lead to better diagnostics down the road.

Encounters with Data: Emojis, Doodles, and Deep Learning

Emojis are serious business. That’s what the Instacart team concluded after incorporating emojis into a deep learning model they developed to predict the fastest sequence in which shoppers can pick up items in a grocery store.

Ever wish your lazy sketches looked a bit more polished? A new AI tool from Google guesses your doodle and suggests the right drawing. It’s a fun, useful, and clever application of deep learning.