DeepMind: How are companies using it?

DeepMind has attracted mixed headlines since Google paid £400 million for the UK-based AI startup in 2014. The awe inspired by DeepMind's AlphaGo system defeating Go world champion Lee Sedol was soon tempered by criticism of its controversial access to personal health records, which the UK Information Commissioner's Office (ICO) ruled had breached the Data Protection Act. Concerns grew further when Google announced it would be taking control of DeepMind Health.

Trust has wavered ever since, but the AI developed in the DeepMind lab in King's Cross, London continues to lead the world and is finding its way into some intriguing applications.

Improving wind farm efficiency

DeepMind boosted the value of the energy produced by Google's fleet of wind farms in the central United States by predicting their output 36 hours before the energy is generated.

The company trained a neural network on local weather forecasts and historical turbine data so it could recommend the optimal hourly delivery commitments to the power grid a day in advance.

DeepMind claims this has already increased the value of Google's wind energy by roughly 20 percent and intends to further refine the model to make the unpredictable energy source more commercially viable.
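
DeepMind has not published the details of this model, so the following is only a minimal sketch of the general shape of the task, using synthetic data and a plain least-squares fit in place of the real neural network:

```python
import numpy as np

# Illustrative only: fit a regressor on weather forecasts and historical
# turbine output, then predict output for each of the next 36 hours.
rng = np.random.default_rng(0)

# Synthetic training data: forecast wind speed (m/s) -> power output (MW).
wind_speed = rng.uniform(3, 15, size=500)
output_mw = 0.8 * wind_speed + rng.normal(0, 0.5, size=500)  # noisy stand-in

# Fit a least-squares line (the real system uses a neural network).
A = np.column_stack([wind_speed, np.ones_like(wind_speed)])
coef, *_ = np.linalg.lstsq(A, output_mw, rcond=None)

# "Day-ahead" commitment: predict output for each forecast hour.
forecast_36h = rng.uniform(3, 15, size=36)
predicted = coef[0] * forecast_36h + coef[1]
print(predicted.round(1))
```

A production system would use far richer features (wind direction, pressure, per-turbine history) and a model able to express the non-linear power curve of a turbine.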

Optimising app recommendations in Google Play

DeepMind has helped to personalise app recommendations in Google Play by using machine learning to find the apps that users are more likely to use and enjoy based on their previous downloads and the context in which they were used.

The work aims to attract paying customers to the Google Play store, an example of how Google can commercialise the technology produced by DeepMind.
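
Google has not published how the Play model works; one simple family of techniques ranks candidate apps by how often they are installed alongside the apps a user already has. A hypothetical sketch (all app names and counts invented):

```python
# Hypothetical co-install counts, keyed by sorted app-name pairs.
co_installs = {
    ("maps", "ride_hail"): 120,
    ("maps", "travel_guide"): 80,
    ("podcast", "ride_hail"): 15,
    ("audiobooks", "podcast"): 95,
}

def score(candidate, installed):
    """Sum co-install counts between a candidate app and each installed app."""
    return sum(co_installs.get(tuple(sorted((candidate, app))), 0)
               for app in installed)

installed = ["maps", "podcast"]
candidates = ["ride_hail", "travel_guide", "audiobooks"]
ranked = sorted(candidates, key=lambda c: score(c, installed), reverse=True)
print(ranked)  # ride_hail first: it co-occurs strongly with "maps"
```

The real system also weighs the context in which apps were used, which a co-occurrence table alone cannot capture.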

Detecting acute kidney injury in NHS patients

DeepMind developed a patient safety alert app called Streams that reviews test results for signs of sickness and sends staff instant alerts if an urgent assessment is required. The app also helps clinicians to quickly check for other serious conditions such as acute kidney injury and displays results of blood tests, scans, and x-rays at the touch of a button.
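
The source does not include Streams' internals, but AKI detection in the NHS is based on comparing a patient's latest serum creatinine result against a baseline value. A minimal sketch of that kind of ratio check, with simplified thresholds (not Streams' code):

```python
def aki_stage(latest_creatinine, baseline_creatinine):
    """Return an AKI stage (0 = no alert) from the creatinine ratio."""
    ratio = latest_creatinine / baseline_creatinine
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1
    return 0

# A doubling of creatinine against baseline triggers a stage-2 alert.
print(aki_stage(180, 90))  # -> 2
```

An app like Streams wraps a check of this kind in result ingestion, baseline lookup, and instant notification to the clinical team.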

The project is part of DeepMind Health, which launched in February 2017 with the goal of using machine learning systems to improve healthcare treatment and digitise medical processes.

Streams was first used at Royal Free NHS Trust in north London to detect acute kidney injury by analysing blood tests. It was later rolled out in a number of healthcare organisations, including Yeovil District Hospital NHS Foundation, Taunton and Somerset NHS Foundation Trust and Imperial College Healthcare NHS Trust.

Nurses said the app saved them up to two hours a day, but the data-sharing agreement soon fell foul of privacy laws. In July 2017, the Information Commissioner's Office (ICO) ruled that the Royal Free had failed to comply with the Data Protection Act when it provided patient details to DeepMind. Among its shortcomings was a failure to adequately inform patients that their data would be used in the trial.

The issue was resolved when the ICO instructed the Royal Free to sign a formal undertaking to ensure future compliance, but further concerns followed when DeepMind announced that the team behind Streams was joining Google.

DeepMind Health will now work under the newly formed Google Health led by former Geisinger CEO David Feinberg, as part of a strategy to integrate Google's various healthcare projects.

Privacy campaigners argue that the move breaks DeepMind's promise that the personal data acquired by Streams would not be used by Google. CNBC reported that DeepMind's independent review board is now likely to be scrapped as the company aims to expand the service beyond the UK.

DeepMind claims that the support of Google will help turn the app into an AI-powered assistant for all nurses and doctors, one that combines the best algorithms with intuitive design.

Breast cancer diagnosis at Imperial College London

DeepMind is collaborating with Google's AI health research team and a group of research institutions, led by the Cancer Research UK Centre at Imperial College London, to improve the detection of breast cancer.

The disease kills 500,000 people around the world every year, partly due to the challenges of detection and diagnosis. The mammogram scans used today fail to spot thousands of cancers every year and often lead to false alarms from overdiagnosis. DeepMind believes that machine learning could improve this.

DeepMind researchers will analyse historic de-identified mammograms from around 7,500 women to assess whether machine learning tools can spot signs of cancerous tissue more effectively than current screening techniques. The exploratory work has the potential to transform breast cancer testing.

Predicting patient deterioration in military veterans

DeepMind is working with the US Department of Veterans Affairs to predict patient deterioration by analysing patterns from around 700,000 historical medical records.

The project aims to determine if machine learning can identify the risk factors for patient deterioration and predict its onset to improve the treatment of a problem that causes an estimated 11 percent of all in-hospital deaths.

The research team will look for ways to improve the algorithms used to detect acute kidney injury (AKI), a common cause of patient deterioration and an area in which DeepMind has developed expertise.
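
Neither the VA project's features nor its models are public, so the following is purely an illustrative sketch of the general approach: fit a simple logistic model on synthetic records and read its weights as crude risk-factor indicators:

```python
import numpy as np

# Synthetic records only; the VA project's real features are not public.
rng = np.random.default_rng(1)
n = 2000
age = rng.uniform(40, 90, n)
creatinine = rng.uniform(60, 200, n)

# Synthetic ground truth: deterioration risk rises with both features.
logit = 0.05 * (age - 65) + 0.02 * (creatinine - 120)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

# Fit logistic regression by plain gradient descent.
X = np.column_stack([age - 65, creatinine - 120])
w = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.0001 * (X.T @ (p - y)) / n

# Positive weights flag each feature as a risk factor.
print(dict(zip(["age", "creatinine"], w.round(3))))
```

Real deterioration models work over hundreds of longitudinal variables per record, but the principle of extracting interpretable risk signals is the same.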

Personalising battery management and screen brightness on Android devices

DeepMind has created two new features for Android: Adaptive Battery, which predicts which apps you'll need next and thereby boosts battery performance, and Adaptive Brightness, which learns your brightness preferences in different surroundings to personalise your screen settings.

The features will be available later this year for devices running the Android P operating system.
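
Google has not detailed the on-device model, but the core idea of Adaptive Brightness can be sketched as learning a per-user mapping from ambient light to preferred brightness out of past manual adjustments. A toy version, fitting a line in log-lux space (data invented):

```python
import math

# (ambient lux, brightness the user chose on a 0..1 scale)
adjustments = [(5, 0.10), (50, 0.30), (500, 0.55), (5000, 0.85)]

# Fit brightness as a line in log10(lux) -- least squares by hand.
xs = [math.log10(lux) for lux, _ in adjustments]
ys = [b for _, b in adjustments]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

def preferred_brightness(lux):
    """Predict the user's preferred brightness, clamped to [0, 1]."""
    return min(1.0, max(0.0, intercept + slope * math.log10(lux)))

print(round(preferred_brightness(200), 2))
```

The log scale matters: human brightness perception is roughly logarithmic in lux, so a linear fit in log space is a reasonable first approximation.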

Identifying eye diseases at Moorfields Eye Hospital

DeepMind partnered with Moorfields Eye Hospital to develop a machine learning-based system that can recognise sight-threatening eye diseases from a digital scan of the eye. While the company's initial collaboration with the NHS at the Royal Free focused on patient care, this is its first project dedicated entirely to medical research.

The programme involves the analysis of more than one million anonymous eye scans to produce an algorithm that detects early signs of emerging eye conditions and increases the speed of diagnosis.

The idea is credited to Moorfields ophthalmologist Pearse Keane, who contacted the company after seeing its technology teach computers to play video games and believed it could be applied to images of the eye.

Treating head and neck cancers at University College London Hospital

DeepMind has also worked with the NHS to improve the treatment of head and neck cancers. Before radiotherapy can begin, clinicians currently spend around four hours preparing a detailed map of each patient's body to avoid targeting the delicate surrounding tissue that can be damaged in treatment. The information is then fed into a radiotherapy machine to target the cancer without harming the healthy tissue.

Researchers at DeepMind believe machine learning can cut this time down to an hour. The team is analysing anonymised scans from UCLH patients to develop a radiotherapy segmentation algorithm that can automate parts of the process. They hope to eventually apply the algorithm to other parts of the body.
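
A segmentation algorithm of this kind is typically scored by how well its output overlaps a clinician's hand-drawn contour, and the Dice coefficient is a standard overlap metric. A minimal sketch on toy masks (not DeepMind's evaluation code):

```python
def dice(mask_a, mask_b):
    """Dice coefficient between two flat binary masks (lists of 0/1)."""
    overlap = sum(a & b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    return 2 * overlap / total if total else 1.0

clinician = [0, 1, 1, 1, 0, 0]  # toy hand-drawn contour
model     = [0, 1, 1, 0, 0, 0]  # toy algorithm output
print(dice(clinician, model))   # -> 0.8
```

In practice the masks are 3D voxel volumes per organ-at-risk, but the scoring idea is the same: 1.0 means perfect agreement with the clinician, 0.0 means no overlap at all.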

Generating Google Assistant voices

While healthcare technology dominates the current DeepMind developments, its machine learning systems have also been extended to audio analysis. Talking machines have a long history in science fiction and are gaining mass adoption through products such as Siri, but the gap between computer and human speech remains substantial.

DeepMind has developed a text-to-speech system that can close that gap by more than 50 percent. Known as WaveNet, it uses a neural network to replicate the sound waves produced by human speakers rather than copying the language that they use. The technology is now used to generate the Google Assistant voices for US English and Japanese across all platforms.
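
WaveNet's published architecture builds its long audio context from stacks of dilated causal convolutions whose dilation doubles at each layer. A small sketch of how quickly the receptive field grows under that scheme (the exact configuration varies by deployment):

```python
def receptive_field(dilations, kernel_size=2):
    """Receptive field of stacked dilated causal convolutions, in samples."""
    # Each layer extends the receptive field by (kernel_size - 1) * dilation.
    return sum((kernel_size - 1) * d for d in dilations) + 1

# One published-style stack: dilations 1, 2, 4, ..., 512.
dilations = [2 ** i for i in range(10)]
print(receptive_field(dilations))  # -> 1024
```

Doubling the dilation makes the receptive field grow exponentially with depth, which is how the model covers enough raw audio samples to produce natural-sounding speech one sample at a time.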

Cutting electricity bills at Google

Google uses machine learning in a range of its own products, including Maps, Gmail, YouTube and Android, and believes that DeepMind technology could enhance search, robots and the Internet of Things. A DeepMind agent has already matched human performance at 49 Atari games including Pac-Man and Space Invaders, and became the first computer programme to win a game of Go against a professional player.

Google has even used DeepMind to cut the electricity bills at its huge data centres. DeepMind algorithms predicted the air conditioning needed to cool the vast number of servers powering its services, a requirement that varies with user demand. The results were efficiency savings of 40 percent in the cooling systems and a 15 percent reduction in the overall energy used in the data centres.
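
Those two figures imply roughly what share of the data centres' energy went on cooling, assuming the overall reduction came entirely from the cooling savings:

```python
cooling_savings = 0.40   # reduction in cooling energy
overall_savings = 0.15   # reduction in total data-centre energy

# Implied share of total energy that cooling represented beforehand.
cooling_share = overall_savings / cooling_savings
print(f"{cooling_share:.0%}")  # -> 38%
```

That is, cooling accounted for well over a third of the sites' energy use before the optimisation, which is why a cooling-only improvement moved the overall bill so much.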

Copyright © 2019 IDG Communications, Inc.