What to Expect From Google I/O This Week: Thinking Beyond Hardware to Machine Intelligence

As we watch the race for industry leadership among Google, Apple, Microsoft, Samsung, Amazon and the other large players in the developer landscape, we can make some confident predictions about what Google may announce.

For developers, the days leading up to this week's Google I/O conference feel like the build-up to the climax of a suspense film: we hold our breath waiting to hear what Google will announce at the big event. But by paying close attention to the battle for industry leadership among Google, Apple, Microsoft, Samsung, Amazon and the other large players in the developer landscape, we can make some reasonable predictions about what is coming.

This year, not only will Google be live-streaming the show, but it has also created an app so that we can follow along in real time and tune into every announcement, which only heightens the suspense. Taking into account recent developments by all of the major players, along with what Google needs to stay ahead of the pack, we can forecast that Google will begin incorporating more machine intelligence into its products and services in many forms. Making devices smarter, more relevant and more personalized will fundamentally drive value while freeing users from information overload.

Intelligent Alerts and Messages
One of the largest steps we are going to see from Google across the board is a move toward more intelligent alerts for end users. Google understands that to hold its position as a leader in the space, it will have to respond to consumers' growing demand for devices that deliver meaningful alerts and messages based on context and current interests. These alerts can appear on many devices, whether our mobile phones, smartwatches, or experimental wearables such as Google Glass.

While Google has shared very little information leading up to the conference, it did release the developer preview of the Android Wear SDK. "Android Wear is a set of APIs that extends the Android platform to a new generation of wearables," said Timothy Jordan, Google Developers' spokesperson, in a video shared by the company late last week. Hopefully these new APIs will allow developers to create applications that deliver truly intelligent alerts to the end user. That would be a positive step toward mobile devices sending only the alerts users actually want to see, tied to geographic position, time of day and personal interests. No matter what hardware Google plans to announce or update, we can count on those devices having a strong focus on providing end users with alerts that are truly relevant.
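To make that concrete, here is a minimal sketch, assuming only the Android support library's standard notification APIs that the Wear preview builds on, of how a simple context-aware alert might be posted; the nearbyStoreName value and the trigger for showing it are hypothetical, not anything Google has announced.

```java
// Hypothetical sketch: post a location-triggered alert that a paired
// Android Wear device can display. Uses the support library's
// NotificationCompat classes; the context signal (nearbyStoreName) is
// an illustrative assumption.
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.net.Uri;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;

public class ContextualAlert {

    public static void showNearbyOffer(Context context, String nearbyStoreName) {
        // Tapping the alert opens a placeholder details page.
        Intent viewIntent = new Intent(Intent.ACTION_VIEW, Uri.parse("https://example.com/offers"));
        PendingIntent viewPendingIntent =
                PendingIntent.getActivity(context, 0, viewIntent, 0);

        // A normal notification; Android Wear bridges it to the watch, and the
        // WearableExtender carries wearable-specific display options.
        NotificationCompat.Builder builder = new NotificationCompat.Builder(context)
                .setSmallIcon(android.R.drawable.ic_dialog_info)
                .setContentTitle("Offer nearby")
                .setContentText("You're close to " + nearbyStoreName)
                .setContentIntent(viewPendingIntent)
                .extend(new NotificationCompat.WearableExtender().setHintHideIcon(true));

        NotificationManagerCompat.from(context).notify(1, builder.build());
    }
}
```

The interesting part, of course, is not the notification plumbing but the intelligence deciding when an alert like this is worth sending at all.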

A More Predictive Google Now
It's no secret that Google's technologies have always relied on managing and making sense of massive amounts of data. That immense store of information gives Google a real advantage in predictive analytics, because creating usable machine intelligence is driven by access to data. Applying machine learning algorithms to well-refined data creates the opportunity to train models and then validate them against held-back examples for different tasks.
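As a toy illustration of that train-and-validate step, and not a description of anything Google has built, the sketch below simply shuffles a collection of records and holds a portion back, so that whatever model is fit on the first part can be checked against examples it never saw; the record type and the 80/20 split are illustrative assumptions.

```java
// Illustrative sketch of a train/validation split: shuffle the data,
// train on most of it, and keep the rest aside to check predictions.
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class TrainValidateSplit {

    // Returns two lists: index 0 is the training set, index 1 the validation set.
    public static <T> List<List<T>> split(List<T> records, double trainFraction) {
        List<T> shuffled = new ArrayList<>(records);
        Collections.shuffle(shuffled);

        int cut = (int) Math.round(shuffled.size() * trainFraction);
        List<List<T>> parts = new ArrayList<>();
        parts.add(new ArrayList<>(shuffled.subList(0, cut)));               // fit the model here
        parts.add(new ArrayList<>(shuffled.subList(cut, shuffled.size()))); // measure it here
        return parts;
    }

    public static void main(String[] args) {
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 100; i++) records.add(i);

        List<List<Integer>> parts = split(records, 0.8); // hypothetical 80/20 split
        System.out.println("train=" + parts.get(0).size() + " validate=" + parts.get(1).size());
    }
}
```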

We are hoping that I/O this year brings steps toward more relevant predictive analytics, beyond our devices simply telling us whether we will hit a traffic jam on the commute home. Google will very likely show a progression toward richer predictions and personalization, and we expect that to manifest most clearly within Google Now.

Microsoft's Cortana and other competitors are also building predictive virtual assistants that use ambient and contextual information to anticipate end-user wants and needs. Cortana in particular has recently upped its game, so it is likely that Google Now will make big strides in predictive services. As users grow increasingly sensitive to irrelevant predictions, Google Now will have to become more intuitive, surfacing information based on signals that actually matter to the user in order to stay relevant.

Smarter Search
A major trend among large search engines, including product search on e-commerce sites, is the growing importance of actually understanding user intent, as opposed to relying purely on statistical pattern matching. It is no longer just about parsing search queries; search engines are also making strides toward understanding the content we search for.

This progression from search engines to answer engines is crucial, particularly given our increased reliance on mobile devices, where search patterns are quite different. Mobile users want high-precision results instantly, without browsing through a long list of ranked results. Smarter search will take into account the context and intent of a user's query, as well as past interests and searches, delivering genuinely enriched search experiences. In this space it is natural to see efforts like Google's Knowledge Graph become increasingly important.

We'll begin to see more capable question-and-answer features from Google that could raise the stakes for companies like Amazon by making it easier for consumers to search for products. Expect Google to outline plans to make its search more semantic, moving toward providing answers rather than the lists of search results we are familiar with today.

Showcasing Brainy Examples
We have seen it from all of the competitors, and we should expect it from Google this week as well: showcasing. IBM used this method at the National Retail Federation's BIG Show this year, when CEO Ginni Rometty's keynote demoed Watson's technology in real time through a virtual smart shopper for The North Face brand. Facebook has taken the same approach with practical demonstrations of its facial recognition technology.

Google will undoubtedly drive conversation around its smarter technology the same way: by showing us exciting examples of machine intelligence in action. These examples can come in the simplest of forms, but they are what the developer community needs to get excited about Google's advances in the space. The underlying machine learning concepts and techniques may be abstract and require deep mathematical understanding, but showcasing translates leading-edge technology into concrete solutions we can wrap our heads around and truly understand. While it's hard to say exactly what Google will choose to showcase, we can anticipate that it will use this technique at I/O this week to show us what it's made of, and to demonstrate that it will keep innovating with the data at its fingertips to deliver ever more intelligent devices and platforms.
