Computational Awareness on Your Mobile and Wearables


Researchers at Hiroshima City University have developed a 17-gram wearable for the ear called the Earclip-type Wearable PC. The wearable is controlled through inputs from its infrared sensors, which monitor the ear movements that accompany various eye and mouth motions. It has a microchip and storage, and it can download software. It includes GPS, Bluetooth, a compass, a gyro sensor (which senses angular velocity), a barometer, a speaker, and a microphone. Additional sensors can be installed to monitor a user’s vital health signs. If the wearable detects a health problem, it could notify the user’s family members. An accelerometer could detect if an older user falls, prompting the computer to call both the relatives and an ambulance.
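The accelerometer-based fall detection mentioned above can be sketched as a simple threshold heuristic: an impact shows up as a spike in acceleration magnitude, followed by near-stillness. The thresholds, function names, and sample trace below are illustrative assumptions, not details of the actual device:

```python
import math

# Hypothetical thresholds, for illustration only.
G = 9.81                      # gravity, m/s^2
IMPACT_THRESHOLD = 2.5 * G    # spike suggesting an impact
STILL_THRESHOLD = 1.2 * G     # magnitude close to gravity alone
STILL_SAMPLES = 5             # consecutive quiet samples after the spike

def magnitude(sample):
    """Euclidean norm of one (x, y, z) accelerometer reading."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_fall(samples):
    """Return True if an impact spike is followed by sustained stillness."""
    mags = [magnitude(s) for s in samples]
    for i, m in enumerate(mags):
        if m > IMPACT_THRESHOLD:
            after = mags[i + 1:i + 1 + STILL_SAMPLES]
            if len(after) == STILL_SAMPLES and all(a < STILL_THRESHOLD for a in after):
                return True
    return False

# A fabricated trace: normal motion, an impact spike, then lying still.
trace = [(0.0, 0.0, 9.8), (1.0, 2.0, 10.0), (15.0, 20.0, 18.0),
         (0.1, 0.2, 9.7), (0.0, 0.1, 9.8), (0.0, 0.0, 9.8),
         (0.1, 0.0, 9.8), (0.0, 0.1, 9.8)]
print(detect_fall(trace))  # prints True
```

A production detector would use windowed filtering and a trained model rather than fixed thresholds, but the spike-then-stillness pattern is the core signal.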

Mobile computing has reached an inflection point: mobile apps, smartphones, tablets, and wearables are increasingly aware of the profile and behavior of their users and of the context and environment in which they are used. Users are known through their inputs, conversations, searches, social networks, and apps. Context and environment are known through the sensors, devices, and apps of the user, combined with those of millions of other users, whether nearby or far away.

The massive datasets that emerge from mining user data, sensor data, social network data, and crowdsourced data are pushing computational awareness to new frontiers. And the latest machine learning algorithms provide the engine to process all that data.
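As a minimal illustration of machine learning over sensor data, here is a toy nearest-centroid classifier that labels a user's activity from two accelerometer features (mean magnitude and variance of magnitude). All labels, feature values, and names are fabricated assumptions for the sketch:

```python
import math

# Fabricated training data: each activity is a list of
# (mean magnitude, variance of magnitude) feature pairs.
TRAINING = {
    "resting": [(9.8, 0.1), (9.9, 0.2), (9.7, 0.1)],
    "walking": [(11.0, 2.5), (11.5, 3.0), (10.8, 2.2)],
    "running": [(14.0, 8.0), (15.0, 9.5), (13.5, 7.8)],
}

def centroid(points):
    """Component-wise mean of a list of 2-D feature points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(features):
    """Assign the activity whose centroid is closest in feature space."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

print(classify((11.2, 2.6)))  # prints "walking"
```

Real systems train far richer models on data aggregated from millions of devices, but the principle is the same: map sensor readings into features, then match them against learned patterns.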

Computational awareness may well be the technology that drives the most valuable features of your mobile platforms and your mobile apps. All the stars are aligned: mobile processing power, sensor technologies, machine learning algorithms, large-scale back-end data processing and analytics, and certainly more to come… But one major challenge remains: data privacy. We are just starting the debate on that one…

In May at its Google I/O 2015 Developer Conference in San Francisco, Google showcased “Now on Tap”:

“ If your spouse messaged you about dinner at a new restaurant, without leaving the app, Now can find menus, reviews, help you book a table, navigate there, and deep link you into relevant apps”

In June at its WWDC 2015 Developer Conference in San Francisco, Apple showcased how Siri is becoming more “proactive”:

“Plug in your headphones and iOS 9 recognizes that you might want to finish the podcast you started earlier. Or connect via Bluetooth to your car, and your favorite playlist is suggested for the ride home”

And this week, Facebook gave some insight into its new personal assistant “M” for Messenger:

“M can actually complete tasks on your behalf. It can purchase items, get gifts delivered to your loved ones, book restaurants, travel arrangements, appointments and way more”

So, I cannot wait to find, before Christmas, the perfect personal assistant and wearable that could help my 81-year-old Mom next year :)


Note: The picture above is the Earclip-type Wearable PC from Hiroshima City University.

Copyright © 2005-2016 by Serge-Paul Carrasco. All rights reserved.
Contact Us: asvinsider at gmail dot com.

Categories: Mobile