The Pixel And The Google Ecosystem

The Pixel 2 isn’t Google’s first smartphone. Strictly speaking, it isn’t even its second — there were the Nexus devices of the past and, in some sense, all Android phones. They weren’t Google phones exactly, but they were at least Google-adjacent.

Perhaps more accurately, they can be categorized as the “not-Apple” phones.

While Google became the vague banner under which all Android/“not-Apple” phones could be loosely grouped, a negation of an ecosystem does not itself an ecosystem make.

As Karen Webster pointed out two years ago in her column “Who Will Run Android Pay,” the Android ecosystem was very, very different from its Cupertino-based counterpart, in ways that made it much harder to govern:

“As an open source platform, no one really ‘owns’ Android. Yes, Google comes close but not in the way that Apple owns the iPhone. That means that hardware makers and developers are free to access and use it as they wish via an open source license that they get from Google. Google acquired Android, Inc. (the company the original team of four created) in 2005 and two years later launched it, along with a consortium of hardware, software and telco companies, to advance the notion of open standards for mobile devices. Updates to the operating system are released periodically.”

Google holds the best position to control the Android ecosystem, but only to the degree it can be controlled — with tens of thousands, possibly hundreds of thousands, of different flavors of Android devices currently in the market. Those individual device makers tend to load their devices with a pile of custom mobile apps that are only compatible with other devices from the same maker.

Thus, control is a relative concept.

Device makers customized their products to push their own proprietary mobile apps and services that would compete with and crowd out Google’s equivalents. Think Android Pay and Samsung Pay. Android Pay has been around longer; Samsung Pay has nearly twice its adoption rate, according to the PYMNTS/InfoScout Mobile Wallet Usage figures.

Additionally, Android-powered phones ship with alternative voice-activated virtual assistants, rival app stores and even Microsoft products.

Which means that the release of the Pixel last year — and of the Pixel 2 line this year — even if it isn’t quite a first, marks Google’s shift away from the broader Android ecosystem and into the evolving, expanding Google ecosystem.

Gone are the oft-complained-about irregular update cycles that seem to leave every Android phone running a different dessert-themed version of the operating system. In their place is a regular, timely software update cycle optimized around the core software of Google’s ecosystem — and, in at least some cases, the unique features of the artificial intelligence (AI) and search-oriented Google brand.

While the camera upgrades have gotten a lot of press — particularly the improvements to image quality — the more interesting addition is the expanded role for Google’s visual search tool, Lens.

To use Lens, users snap a photo and squeeze the bottom of their phone to call up the virtual assistant, which identifies the object in the photo and offers up the pertinent information. Historical landmarks get brief bios; storefronts yield retail operating hours and contact data. The feature is also designed to read signs, extract things like phone numbers and web addresses, and give the user an instant means to contact the merchant.
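
Lens itself is proprietary and its internals aren’t public, but the kind of detection described above — recognizing a landmark, reading a sign, pulling out a phone number or web address — can be roughly sketched with Google’s public Cloud Vision API. The snippet below is purely illustrative (the `describe_photo` helper and the regexes are assumptions for the sake of the example, not anything Lens actually uses):

```python
# Illustrative sketch only: approximating Lens-style landmark and text
# recognition with Google's public Cloud Vision API. This is NOT how Lens
# itself is implemented; it just makes the kind of detection described
# above concrete.
import re
from google.cloud import vision


def describe_photo(path: str) -> dict:
    """Return any landmarks, phone numbers and web addresses found in a photo."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    # Landmark detection: "historical landmarks get brief bios"
    landmarks = [
        lm.description
        for lm in client.landmark_detection(image=image).landmark_annotations
    ]

    # Text detection: read signs, then pull out contact details
    texts = client.text_detection(image=image).text_annotations
    full_text = texts[0].description if texts else ""
    phones = re.findall(r"\+?\d[\d\s().-]{7,}\d", full_text)
    urls = re.findall(r"(?:https?://|www\.)\S+", full_text)

    return {"landmarks": landmarks, "phone_numbers": phones, "web_addresses": urls}
```

The point of the sketch is the pipeline rather than the particulars: recognize what’s in the frame, extract anything actionable and hand the user a way to act on it — which is essentially the consumer experience Lens promises behind a single squeeze gesture.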

By all accounts, this feature is better in theory than in execution. Some reviewers have questioned why Google worked so hard to push it out when more fine-tuning might have made Lens more reliable.

But other reviewers had better experiences, noting that despite the gaps they’ve seen so far, Lens is still a somewhat more reliable guide than Samsung’s Bixby. Most reviewers agree that the shape and scope of the play on offer is apparent.

The more it’s used, the more the tool learns and aggregates what it learns (that’s the theory, anyway). If that holds, Lens could live up to the advance billing it got at Google’s product presentation earlier this year, marking a pretty important advance in how consumers interact digitally with real-world environments.

Ultimately, that experience might be the right summary of much of what the Pixel 2 can do — and of how it might yet become a real competitor in a market currently dominated by Apple and Samsung.

The Google phone, unlike its more Android-focused forerunners, is most clearly about creating a Google toolbox for consumers who have been using some of those tools for the better part of the last two decades. Google already knows how people like to search, and a visual search that can instantly identify places, products and things with the flash of a camera would be the most useful extension of that knowledge — if Google can get it to work as seamlessly as it hopes.

Tie that to Google’s expanding competitive play in voice-activated AI — and to multi-modal search and consumer touchpoints that don’t depend on any one device, or even a single software suite, to rule them all — and the result is a more clearly defined Google garden that consumers can spend more time exploring.

Google and the Pixel 2 have a long way to go (there’s an issue for the design department if the word “workman-like” keeps coming up in product descriptions).

But those retail products and their potential — to genuinely change how consumers interact digitally with real-world environments — have managed to capture a lot of interest.

It will be worth watching in early 2018 to see how the Pixel 2 sold at Christmas and how well the smartphone works in tying that Google experience together.