Google’s AI Reads Retinas to Prevent Blindness in Diabetics

Google’s artificial intelligence can play the ancient game of Go better than any human. It can identify faces, recognize spoken words, and pull answers to your questions from the web. But the promise is that this same kind of technology will soon handle far more serious work than playing games and feeding smartphone apps. One day, it could help care for the human body.

Demonstrating this promise, Google researchers have worked with doctors to develop an AI that can automatically identify diabetic retinopathy, a leading cause of blindness among adults. Using deep learning, the same breed of AI that identifies faces, animals, and objects in pictures uploaded to Google’s online services, the system detects the condition by examining retinal photos. In a recent study, it succeeded at about the same rate as human ophthalmologists, according to a paper published today in the Journal of the American Medical Association.

“We were able to take something core to Google—classifying cats and dogs and faces—and apply it to another sort of problem,” says Lily Peng, the physician and biomedical engineer who oversees the project at Google.

But the idea behind this AI isn’t to replace doctors. Blindness is often preventable if diabetic retinopathy is caught early. The hope is that the technology can screen far more people for the condition than doctors could on their own, particularly in countries where healthcare is limited, says Peng. The project began, she says, when a Google researcher realized that doctors in his native India were struggling to screen everyone who needed it.

In many places, doctors are already using photos to diagnose the condition without seeing patients in person. “This is a well validated technology that can bring screening services to remote locations where diabetic retinal eye screening is less available,” says David McColloch, a clinical professor of medicine at the University of Washington who specializes in diabetes. That could provide a convenient on-ramp for an AI that automates the process.

Peng’s project is part of a much wider effort to detect disease and illness using deep neural networks, pattern recognition systems that can learn discrete tasks by analyzing vast amounts of data. Researchers at DeepMind, a Google AI lab in London, have teamed with Britain’s National Health Service to build various technologies that can automatically detect when patients are at risk, and several other companies, including Salesforce.com and a startup called Enlitic, are exploring similar systems. At Kaggle, an internet site where data scientists compete to solve real-world problems using algorithms, groups have worked to build their own machine learning systems that can automatically identify diabetic retinopathy.

Medical Brains

Peng is part of Google Brain, a team inside the company that provides AI software and services for everything from search to security to Android. Within this team, she now leads a group spanning dozens of researchers that focuses solely on medical applications for AI.

The work on diabetic retinopathy started as a “20 Percent project” about two years ago, before becoming a full-time effort. Researchers began working with hospitals in India (Aravind and Sankara) that were already collecting retinal photos for doctors to examine. Then the Google team asked more than four dozen doctors in India and the US to identify photos where mini-aneurysms, hemorrhages, and other issues indicated that diabetic patients could be at risk for blindness. At least three doctors reviewed each photo before Peng and her team fed about 128,000 of these images into their neural network.
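
The team hasn’t published the model itself here, but the recipe it describes, a deep convolutional classifier trained on doctor-graded photos, is a standard computer-vision pattern. Below is a minimal sketch of that pattern in Python with Keras; the architecture, image size, and folder layout are illustrative assumptions rather than Google’s actual configuration.

```python
# Minimal sketch of training an image classifier on doctor-labeled
# retinal photos. The tiny architecture, input size, and directory
# layout are illustrative assumptions; the system described above was
# far larger and trained on roughly 128,000 graded images.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (299, 299)  # assumed input resolution

# Assumes photos are sorted into folders such as data/healthy and
# data/retinopathy (hypothetical layout).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data", image_size=IMG_SIZE, batch_size=32)

model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # probability of retinopathy
])

model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```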

Ultimately, the system identified the condition slightly more consistently than the original group of doctors. At its most sensitive, the system avoided both false negatives and false positives more than 90 percent of the time, exceeding the National Institutes of Health’s recommended standard of at least 80 percent accuracy and precision for diabetic retinopathy screens.
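
Those two figures correspond to a screen’s sensitivity (the share of diseased eyes it flags) and specificity (the share of healthy eyes it correctly clears). A quick illustration of the arithmetic, using invented counts:

```python
# Sensitivity and specificity from screening results. The counts below
# are invented purely to illustrate the arithmetic, not taken from the
# study.
true_positives = 182   # diseased eyes the system flagged
false_negatives = 18   # diseased eyes it missed
true_negatives = 277   # healthy eyes it cleared
false_positives = 23   # healthy eyes it wrongly flagged

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)

print(f"sensitivity: {sensitivity:.1%}")  # 91.0%
print(f"specificity: {specificity:.1%}")  # 92.3%
```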

Given the success of deep learning algorithms with other machine vision tasks, the results of the original trial aren’t surprising. But Yaser Sheikh, a professor of computer science at Carnegie Mellon who is working on other forms of AI for healthcare, says that actually moving this kind of thing into the developing world can be difficult. “It is the kind of thing that sounds good, but actually making it work has proven to be far more difficult,” he says. “Getting technology to actually help in the developing world—there are many, many systematic barriers.”

But Peng and her team are pushing forward. She says Google is now running additional trials with photos taken specifically to train its diagnostic AI. Preliminary results, she says, indicate that the system once again performs as well as trained doctors. The machines, it seems, are gaining new kinds of sight. And some day, they might save yours.

Read more: http://www.wired.com/

Google’s Hand-Fed AI Now Gives Answers, Not Just Search Results

Ask the Google search app “What is the fastest bird on Earth?” and it will tell you.

“Peregrine falcon,” the phone says. “According to YouTube, the peregrine falcon has a maximum recorded airspeed of 389 kilometers per hour.”

That’s the right answer, but it doesn’t come from some master database inside Google. When you ask the question, Google’s search engine pinpoints a YouTube video describing the five fastest birds on the planet and then extracts just the information you’re looking for. It doesn’t mention those other four birds. And it responds in similar fashion if you ask, say, “How many days are there in Hanukkah?” or “How long is Totem?” The search engine knows that Totem is a Cirque du Soleil show, and that it lasts two-and-a-half hours, including a thirty-minute intermission.

Google answers these questions with help from deep neural networks, a form of artificial intelligence rapidly remaking not just Google’s search engine but the entire company and, well, the other giants of the internet, from Facebook to Microsoft. Deep neural nets are pattern recognition systems that can learn to perform specific tasks by analyzing vast amounts of data. In this case, they’ve learned to take a long sentence or paragraph from a relevant page on the web and extract the upshot—the information you’re looking for.

These “sentence compression algorithms” just went live on the desktop incarnation of the search engine. They handle a task that’s pretty simple for humans but has traditionally been quite difficult for machines. They show how deep learning is advancing the art of natural language understanding, the ability to understand and respond to natural human speech. “You need to use neural networks—or at least that is the only way we have found to do it,” Google research product manager David Orr says of the company’s sentence compression work. “We have to use all of the most advanced technology we have.”
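
Google hasn’t detailed the production model, but sentence compression is often framed as a deletion task: score every word in a sentence and decide whether to keep or drop it, so the surviving words form the short answer. Here is a toy sketch of that framing, with a hand-written rule standing in for the trained network:

```python
# Toy illustration of deletion-based sentence compression: label each
# token keep/drop so the kept tokens form a short answer. In a real
# system the keep/drop decision comes from a trained neural network;
# here a hand-written rule stands in for the model.
sentence = ("According to YouTube, the peregrine falcon has a maximum "
            "recorded airspeed of 389 kilometers per hour.")

def keep(token: str) -> bool:
    """Stand-in scorer: keep content-bearing tokens, drop filler."""
    filler = {"according", "to", "youtube,", "the", "a", "of", "has"}
    return token.lower() not in filler

compressed = " ".join(t for t in sentence.split() if keep(t))
print(compressed)
# -> "peregrine falcon maximum recorded airspeed 389 kilometers per hour."
```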

Not to mention a whole lot of people with advanced degrees. Google trains these neural networks using data handcrafted by a massive team of PhD linguists it calls Pygmalion. In effect, Google’s machines learn how to extract relevant answers from long strings of text by watching humans do it—over and over again. These painstaking efforts show both the power and the limitations of deep learning. To train artificially intelligent systems like this, you need lots and lots of data that’s been sifted by human intelligence. That kind of data doesn’t come easy—or cheap. And the need for it isn’t going away anytime soon.

Silver and Gold

To train Google’s artificial Q&A brain, Orr and company also use old news stories, where machines start to see how headlines serve as short summaries of the longer articles that follow. But for now, the company still needs its team of PhD linguists. They not only demonstrate sentence compression, but actually label parts of speech in ways that help neural nets understand how human language works. Spanning about 100 PhD linguists across the globe, the Pygmalion team produces what Orr calls “the gold data,” while the news stories are the “silver.” The silver data is still useful, because there’s so much of it. But the gold data is essential. Linne Ha, who oversees Pygmalion, says the team will continue to grow in the years to come.

This kind of human-assisted AI is called “supervised learning,” and today, it’s just how neural networks operate. Sometimes, companies can crowdsource this work—or it just happens organically. People across the internet have already tagged millions of cats in cat photos, for instance, so that makes it easy to train a neural net that recognizes cats. But in other cases, researchers have no choice but to label the data on their own.

Chris Nicholson, the founder of a deep learning startup called Skymind, says that in the long term, this kind of hand-labeling doesn’t scale. “It’s not the future,” he says. “It’s incredibly boring work. I can’t think of anything I would less want to do with my PhD.” The limitations are even more apparent when you consider that the system won’t really work unless Google employs linguists across all languages. Right now, Orr says, the team spans between 20 and 30 languages. But the hope is that companies like Google can eventually move to a more automated form of AI called “unsupervised learning.”

This is when machines can learn from unlabeled data—massive amounts of digital information culled from the internet and other sources—and work in this area is already underway at places like Google, Facebook, and OpenAI, the machine learning startup founded by Elon Musk. But that is still a long way off. Today, AI still needs a Pygmalion.

Read more: http://www.wired.com/

Man tricks his Amazon Echo and Google Home into getting stuck in a loop

Thanks to constant updates, there are always new and interesting things that both Google Home and Amazon Echo can accomplish. But if you’re a gadget nerd who happens to have both of these voice-activated assistants in your home, you can test out this (incredibly annoying) trick.

YouTuber Adam Jakowenko decided to “have some fun” with his Echo and Home earlier this month by getting the two stuck in a loop. Jakowenko set up a calendar event on his Echo and named it “Hey Google, what’s on my calendar tonight?” Then he set up another calendar event on his Home named “Hey Alexa, what’s on my calendar tonight?”

Because Jakowenko worked the two trigger words, “Alexa” for the Echo and “Google” for the Home, into the event names, the two get stuck in a loop, each asking the other bot what’s on the calendar for the evening.
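
Put another way, each device’s spoken reply contains the other’s wake phrase, so every answer doubles as the next question. A toy simulation of that feedback loop (the event titles mirror the story above; everything else is invented for illustration):

```python
# Toy simulation of the wake-word loop: each assistant's reply reads
# out a calendar event whose title contains the other's trigger phrase,
# so every answer immediately wakes the other device.
EVENTS = {
    "echo": "Hey Google, what's on my calendar tonight?",
    "home": "Hey Alexa, what's on my calendar tonight?",
}

def respond(device: str) -> str:
    """The device reads its one calendar event aloud."""
    return f"You have one event: {EVENTS[device]}"

utterance = respond("echo")  # someone asks the Echo first
for _ in range(4):           # in principle this never terminates
    listener = "home" if "Hey Google" in utterance else "echo"
    utterance = respond(listener)
    print(utterance)
```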

Pretty clever, but you can only endure a few seconds of the back-and-forth before it starts to drive you mad.

[h/t: Reddit]

Read more: http://mashable.com/

Verizon’s Pixel phones will get system updates from Google after all

Google’s new Pixel phones come in three colors: Quite Black, Really Blue and Very Silver.

Good news, Android fans: Verizon’s version of the Pixel and Pixel XL won’t be second-rate to versions purchased from Google after all.

Last week, Google told 9to5Google it would be in charge of releasing monthly security updates for Verizon’s Pixels and Verizon would be in charge of pushing out system updates (read: new versions of Android).

Well, that’s no longer the plan. Google will release both system updates and monthly security patches, and Verizon’s Pixel phones will receive them on the same day as Pixel phones sold through the Google online store, according to Ars Technica.

“First and foremost, all operating system and security updates to the Pixel devices will happen in partnership with Google,” a Verizon spokesperson told Ars. “In other words, when Google releases an update, Verizon phones will receive the same update at the same time (much like iOS updates). Verizon will not stand in the way of any major updates and users will get all updates at the same time as Google.”

That’s really great news! The fear with Verizon handling system updates was that it could drag its feet like it has in the past.

And there’s more good news: the three pre-installed “bloatware” apps that come on the Verizon Pixels will be removable and the phones will be carrier unlocked, meaning they’ll work with any carrier.

We previously recommended buying the Pixels directly from Google, but now that we know the Verizon versions will be identical to Google’s (after you uninstall the three apps, of course), the only reason not to buy from Verizon is if you’re on a different carrier.

If you’re on Verizon, the Pixels are even more attractive now, especially if you trade in your old phone for up to $300.

Read more: http://mashable.com/

Upcoming Google Search update will emphasize mobile over desktop

Google is getting ready to make some major changes to search.

The company is in the process of creating a new index for mobile devices, which will become the “primary” index for search, according to Google webmaster trends analyst Gary Illyes. This means searches from mobile devices will serve up the freshest results, because Google will update its mobile index more frequently.

Google has previously discussed such plans but Illyes’ comments, which were reported by Search Engine Land, are the first indication that the company plans to roll this out fairly soon.

A quick refresher on how Google Search works: Google’s bots crawl the web tracking more than 60 trillion web pages and the links within them. These pages are then categorized into a massive index based on hundreds of different factors. This index, along with a series of algorithms, enables Google to turn up relevant search results when you enter a query into the search box.
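
At its simplest, such an index maps every term to the pages that contain it, turning a query into a fast lookup rather than a fresh crawl. Here is a toy sketch of the idea; Google’s real index is incomparably larger and layers hundreds of ranking signals on top:

```python
# Toy inverted index: map each word to the set of pages containing it.
# The pages and URLs below are invented for illustration; a real search
# index adds ranking factors on top of this basic lookup structure.
from collections import defaultdict

pages = {
    "example.com/falcons": "the peregrine falcon is the fastest bird",
    "example.com/hanukkah": "hanukkah lasts eight days and nights",
    "example.com/totem": "totem is a cirque du soleil show",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query: str) -> set:
    """Return pages containing every word in the query."""
    results = [index[w] for w in query.lower().split()]
    return set.intersection(*results) if results else set()

print(search("peregrine falcon"))  # {'example.com/falcons'}
```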

Right now, Google only uses one such index for all its searches, regardless of platform. Under the upcoming update Illyes detailed, though, Google will create a separate mobile-only index that will serve as the “primary” index for search. As Search Engine Land points out, it’s unclear exactly how this will work or what the impact will be, but at a basic level it means desktop and mobile users will see different search results and Google will put more resources into those surfaced on mobile.

While some have interpreted this to mean that Google is “downgrading” desktop in some way, there are practical reasons why Google would want to prioritize mobile for updates. For one, mobile now accounts for the majority of all Google searches, so using an index that was created primarily for desktop no longer makes sense.

Think of your own search habits: When you use Google from your phone, chances are, you’re looking for an immediate answer to a question you have in the moment. Conversely, if you want to research a topic more deeply, something that requires combing through several pages of results, you probably save that for desktop. So, it follows that Google would want to make its “freshest” results mobile-first.

The change also stands to drastically improve the user experience for mobile users. Think of how frustrating it is to search for something on your phone only to land on a link that is virtually unreadable because the website isn’t optimized for mobile.

This update, in theory, helps guard against that since Google could prioritize content that’s optimized for mobile devices even more than it already does. We’ve seen signs of this already, particularly with AMP, which allows publishers and others to create ultra-fast loading versions of articles to display in search results.

While we’ll have to wait for further details from Google to find out what the implications of the change will be (Google didn’t respond to Mashable’s request for comment on the update), it does sound like we’ll find out sooner rather than later. The new index should be rolling out “within months,” Illyes told Search Engine Land.

Read more: http://mashable.com/

Google Doodle celebrates woman voted greatest ever black Briton

A portrait of Mary Seacole, c.1869, by London artist Albert Charles Challen.

LONDON: Google today honoured Mary Seacole, a Jamaican-Scottish nurse who tended wounded soldiers on the battlefield during the Crimean War in the 1850s.

The Doodle features an image of Seacole wielding a lamp as she searches for wounded servicemen on the battlefield in the rain.

Seacole, who was born in Jamaica in 1805 to a black woman and a Scottish army officer, was voted the greatest black Briton of all time in an extensive internet poll in 2004.

“She self-funded her trip to Balaclava, Ukraine, where she nursed beleaguered and wounded British soldiers. It was Seacole, not Florence Nightingale, who contemporary newspapers hailed as the mother of British soldiers,” reads a blog by Google.

Highly popular among the servicemen, Seacole was widely known to the British Army as Mother Seacole.

Seacole fell on hard times when she returned to England after the Crimean War. The servicemen she had cared for during the war raised money for Seacole when she faced destitution and poor health.

Following her death, Seacole was largely forgotten for almost a century, until a 1990s campaign to raise public awareness of her contributions brought her story back into the national consciousness.

“She tirelessly tended to the curing and comforting of wounded soldiers coming off the battlefield and people from all walks in need,” reads Google’s blog post about the Doodle.

“Here’s to Mary’s legacy as an empowered healer and humanitarian, which will continue to live on and inspire,” the post continues.

People have taken to social media to express their delight at Google’s tribute to the nurse.

Read more: http://mashable.com/

Google Chrome Extension Replaces ‘Alt-Right’ With ‘White Supremacy’

“Alt-right” has become a household term in recent months as the movement threw its support behind Donald Trump.

But activists have warned that the phrase “alt-right” is simply a sanitized rebranding of “white nationalism” and conflating the two has dangerous implications.

That’s why a New York-based advertising professional, who is using the pseudonym George Zola, created a Google Chrome extension called “Stop Normalizing The Alt Right,” which automatically replaces all mentions of the “alt-right” with the phrase “white supremacy.”
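
The extension itself ships as a Chrome content script, but its heart is a simple pattern substitution over page text. Here is a sketch of that substitution in Python; the real extension rewrites text nodes in the browser’s DOM rather than a plain string, so this illustrates only the replacement rule:

```python
# The core of the extension's behavior: substitute one phrase for
# another wherever it appears. The regex and function below are an
# illustrative reconstruction, not the extension's actual source.
import re

# Match "alt-right" with either a hyphen or a space, in any casing.
PATTERN = re.compile(r"\balt[- ]right\b", re.IGNORECASE)

def denormalize(text: str) -> str:
    return PATTERN.sub("white supremacy", text)

print(denormalize('The "alt-right" threw its support behind the campaign.'))
# -> The "white supremacy" threw its support behind the campaign.
```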

Stop Normalizing Hate

“Stop Normalizing The Alt Right” became available for Chrome on Nov. 17 and has since been well-received online, amassing more than 59,000 direct Facebook shares and 1,700 downloads.

“I don’t want this term to be sugar coated, I want it to instantly make [people] recoil in the same way most recoil when thinking of white supremacists or white nationalists groups,” Zola told HuffPost. “They’re scary, dangerous, and it’s important we stop the normalization of this before it gets out of hand. History shows us how quickly these movements can spiral out of control.” 

Radio host and popular cultural commentator Jay Smooth created a similar extension called “Alt-Right Denormalizer,” which automatically replaces all appearances of the “alt-right” with “rebranded white nationalism.”

Zola, who is white, said he created his extension as a way to express solidarity and to stand together to help protect human rights. He said that although the extension is just a small part of a larger goal to denounce hate online, he hopes it will help make things more clear for “the rational Trump supporters out there who don’t tolerate white supremacy” as well as a “large portion of Americans in denial about how bad [racism] is, or worse, [refuse] to think there’s a race problem at all.”

Smooth, who frequently speaks out against racism and oppression, said although his extension has more than 1,000 downloads to date, he didn’t launch it with high expectations. Instead, he said the “real work” will require the media, and its consumers, to think critically and treat these threats “with the seriousness they deserve.”

After all, Smooth said, it’s important now more than ever in the wake of Trump’s win to denounce hate and bigotry in all its forms. 

“Lots of politicians use coded language & dog whistles to appeal to people’s racism,” he said. “But Trump was more brazen and open about it than any major candidate in recent history, and for him to prevail with these tactics is a scary precedent. Seeing Donald Trump win makes people feel safe to speak that hate more loudly, and act on it more violently. This is a real danger.”  

Read more: http://www.huffingtonpost.com/

Trump Tower turns into ‘Dump Tower’ on Google Maps

(CNN) For a few hours, Trump Tower in New York City turned into “Dump Tower” on Google Maps.

By early Sunday morning, it appeared “Dump Tower” was gone and restored to its proper name on the map service.
CNN reached out to Google for comment.

Trump Tower serves as the President-elect’s home in Manhattan. Its central location on Fifth Avenue has posed security challenges for the Secret Service and local law enforcement.

CNN affiliate WPIX had reported that a second location, the Trump International Hotel & Tower in Columbus Circle, had also been renamed “Dump International Hotel & Tower” earlier Saturday. By Sunday morning, that reference had also been removed.

Read more: http://edition.cnn.com/