If you’re into technology, I highly recommend Benedict Evans’ weekly newsletter. In this week’s email you’ll find stories like The Inside Story Behind Pebble’s Demise, Google’s blog post titled Assessing Cardiovascular Risk Factors with Computer Vision, Jean-Louis Gassée’s piece on how Intel Fights for Its Future, and Pete Warden’s story of Why Low-Power NN Accelerators Matter. All good reads.
AndroidAuthority.com has a good, detailed article titled How Google is powering the world’s AI.
Several media outlets are reporting that Google is introducing its Neural Networks API in the developer preview of Android 8.1. TechCrunch has a well-written article that includes this:
“The big highlight here is the new Neural Networks API, which brings hardware-accelerated inference to the phone for quickly executing previously trained machine learning models. Bringing these calculations to the edge can bring a lot of utility to the end user by reducing latency and load on the network, while also keeping more sensitive data on-device.”
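The idea in the quote above is that inference is just a forward pass through weights that were trained elsewhere, so it can run entirely on the device with no network round trip. Here is a toy sketch of that, assuming a tiny two-layer network; the weights and the `infer` helper are made-up placeholders for illustration, not the actual NNAPI (which is a C API that hands a model graph to a hardware-accelerated runtime).

```python
import numpy as np

# "Previously trained" parameters for a tiny 2-layer network.
# These values are hypothetical; a real app would load trained
# weights from disk rather than hard-coding them.
W1 = np.array([[0.5, -0.2],
               [0.1,  0.4]])
b1 = np.array([0.0, 0.1])
W2 = np.array([[ 1.0],
               [-1.0]])
b2 = np.array([0.05])

def infer(x):
    """Forward pass only: this is inference, not training.

    Running this locally is what keeps latency low and keeps the
    input data (e.g. a photo or a sensor reading) on the device.
    """
    h = np.maximum(x @ W1 + b1, 0.0)  # hidden layer with ReLU
    return h @ W2 + b2                # linear output layer

# Example: score one input vector entirely on-device.
y = infer(np.array([1.0, 2.0]))
print(y)  # → [0.05]
```

The point of the API is that this same forward pass, expressed as an operation graph, can be dispatched to a DSP, GPU, or dedicated neural accelerator instead of the CPU.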
Here’s a Facebook post on how AI has improved Facebook’s language translation system by 11%.
From a Phys.org article titled The thermodynamics of learning:
“The greatest significance of our work is that we bring the second law of thermodynamics to the analysis of neural networks,” Sebastian Goldt at the University of Stuttgart, Germany, told Phys.org. “The second law is a very powerful statement about which transformations are possible — and learning is just a transformation of a neural network at the expense of energy. This makes our results quite general and takes us one step towards understanding the ultimate limits of the efficiency of neural networks.”
This Google blog post explains how they put “deep learning” on Android phones. A short blurb:
“Today we announced that the Google Translate app now does real-time visual translation of 20 more languages. So the next time you’re in Prague and can’t read a menu, we’ve got your back. But how are we able to recognize these new languages? In short: deep neural nets.”