Yesterday, Google announced the latest “feature drop” for its Pixel line of Android phones. It’s part of an effort to get people to realize that the Pixel gets software updates ahead of other Android phones and that some of the features it receives stay exclusive to the Pixel. And yesterday’s “drop” epitomizes so many things that are good (and bad) about Google’s hardware efforts, so I wanted to dwell on it for a moment today.
First and foremost, saying that these features were “released” yesterday is only vaguely accurate. Instead, the rollout began yesterday and should theoretically be completed for all users in a couple of weeks. That’s significantly better than the last (and first) feature drop, which trickled out to Pixel owners much more slowly.
Google has perfectly reasonable reasons for not distributing its updates to everybody on day one, but the staggered rollout undercuts whatever excitement people feel when they hear about new features, since there's an indeterminate wait. I covered all this in the newsletter last December with the first feature drop.
So let’s look at what’s new in this month’s update, courtesy of this rundown from Chris Welch. There are some basic quality-of-life (to borrow a term from video games) tweaks: dark mode can be scheduled, adaptive brightness has been improved, and you can set up little actions based on which Wi-Fi networks you’re connected to. There’s a new gesture for the Pixel 4’s Motion Sense chip, new emoji, and new AR effects for Duo video chats. All fine.
But there was one line on Google’s support page for the update that caught my eye (emphasis mine): “In addition to long press, you can now firmly press to get more help from your apps more quickly.”
“Firmly press” sets off alarm bells because it sounds a lot like the iPhone’s 3D Touch, which enables different actions depending on how hard you press on the touchscreen. It was a beloved feature for some people because it gave faster access to the cursor mode on the iPhone’s keyboard (I think long-pressing the space bar works fine for that, but I get that people love it). It’s also gone on the latest versions of the iPhone — Apple has seemingly abandoned it because the hardware to support it was too expensive/thick/complex/finicky/whatever.
But now, it seems that Google has done the same thing for the touchscreen that it does with the camera: use its software algorithms to make commodity parts do something special. That is a very Googley thing to do, but not quite as Googley as the fact that there was virtually no information about this feature to be found anywhere on the internet beyond a speculative note over at XDA Developers.
After a few hours of back and forth, I finally got more details from Google. Here’s what this feature does, according to Google:
Long Press currently works in a select set of apps and system user interfaces such as the app Launcher, Photos, and Drive. This update accelerates the press to bring up more options faster. We also plan to expand its applications to more first party apps in the near future.
Essentially, this new feature lets you press harder to bring up long-press menus faster. In fact, Google's documentation for Android's Deep Press API explicitly says a firm press should never trigger a new action; it should only be a faster way to execute a long press. The answer to why it only works in certain apps is that a lot of Android developers aren't using the standard APIs for long-press actions. Because Android.
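For the developer-curious, my understanding is that this rides on Android's standard long-press machinery: on Android 10 and up, the framework can classify a touch as a "deep press" and fire the ordinary long-press callback sooner, and apps can also peek at that classification on the MotionEvent. Here's a minimal Kotlin sketch of the app-facing side, with the caveat that it's my illustration rather than anything Google has shipped as sample code:

```kotlin
// Minimal sketch of the app-facing side of deep press (Android 10 / API 29+).
// Apps that use the stock long-press APIs get the speed-up for free; the
// classification check below is purely optional and for illustration.
import android.os.Build
import android.view.MotionEvent
import android.view.View

fun wireUpLongPress(view: View) {
    // The usual long-press path; a firm ("deep") press simply triggers this sooner.
    view.setOnLongClickListener {
        // ...show the same long-press menu as always...
        true
    }

    // Optional: inspect the system's classification of the touch.
    view.setOnTouchListener { _, event ->
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q &&
            event.classification == MotionEvent.CLASSIFICATION_DEEP_PRESS
        ) {
            // The system has decided this touch is a firm press.
        }
        false // don't consume the event; let normal long-press handling proceed
    }
}
```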
Okay, but how does it work? It turns out my hunch was correct: Google has figured out how to use machine learning algorithms to detect a firm press, something Apple had to use hardware for.
Tap your screen right now, and think about how much of your fingertip is getting registered by the capacitive sensors. Then press hard and note how your finger smushes down on the screen — more of it gets registered. The machine learning comes in because Google needs to model thousands of finger sizes and shapes, and it also measures how much that contact patch changes over a short period of time to determine how hard you're pressing. The rate of smush, if you will.
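Google hasn't published how its model actually works, but as a toy illustration of the "rate of smush" idea: Android already exposes a normalized contact-area value for each touch via MotionEvent.getSize(), so a crude, non-ML version of the concept might look like the Kotlin below. The time window and threshold are numbers I invented for the example, not anything from Google.

```kotlin
// Toy illustration of the "rate of smush" idea; this is NOT Google's model.
// MotionEvent.getSize() is a real API that reports the normalized contact area;
// the 150 ms window and the growth threshold are invented for this sketch.
import android.view.MotionEvent

class SmushRateDetector(
    private val windowMs: Long = 150L,       // assumed measurement window
    private val rateThreshold: Float = 0.8f  // assumed "firm press" growth rate (size units/second)
) {
    private var downTime = 0L
    private var downSize = 0f

    /** Returns true once the contact area is growing fast enough to call the press "firm". */
    fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downTime = event.eventTime
                downSize = event.size
            }
            MotionEvent.ACTION_MOVE -> {
                val elapsed = event.eventTime - downTime
                if (elapsed in 1..windowMs) {
                    val growthPerSecond = (event.size - downSize) * 1000f / elapsed
                    return growthPerSecond > rateThreshold
                }
            }
        }
        return false
    }
}
```

The real system presumably does far more than this (per-finger modeling, learned thresholds, and so on), which is exactly where the machine learning earns its keep.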
I have no idea if Google’s machine-learning smush detection algorithms are as precise as 3D Touch on the iPhone, but since they’re just being used for faster detection of long presses I guess it doesn’t matter too much yet. Someday, though, maybe the Pixel could start doing things that the iPhone used to be able to do.
(For the record, Apple’s GarageBand has a sort of software-based detector for how hard you are pressing, but it uses the accelerometer.)
So Google made long pressing take not so long. It also brought some updates to Google Pay — specifically, it finally figured out that people might want to switch between cards more easily, so it added a shortcut to get to them by long-pressing the power button. It's a little catch-up to Apple Wallet.
Getting passes of all kinds into Apple Wallet is easy and common — essentially every airline gives you a button to do so. It's so much better than Android's method, which requires opening the app or saving a screenshot and then hoping you can find it quickly later. Integration with Google Pay has been lacking by comparison: Google announced boarding pass support a year and a half ago, and virtually no airline uses it. (As an aside, I'd prefer it be called Google Wallet, but that brand was already used up, so they call it Google Pay. Because Google.)
This annoyance has been going on for years, but now there’s finally an answer for Pixel users that is very Google. Instead of convincing partners to also add a Google Pay button, Google lets you take a screenshot of your boarding pass in your airline’s app. When the screenshot system sees a QR code, the notification gives you a button to save the boarding pass in your Google Pay wallet. It also lets the Google Assistant know you care about that flight so it will send you updates.
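Google hasn't said exactly how the Pixel implements this, but the detection half (spotting a QR code in a screenshot) is the kind of thing its publicly available ML Kit barcode scanner already does. Here's a rough Kotlin sketch of that step only; it's my illustration, not the Pixel's actual code, and the notification and save-to-Google-Pay pieces are left out entirely.

```kotlin
// Sketch of the generic "is there a QR code in this screenshot?" step using
// Google's public ML Kit barcode scanner. Not the Pixel's implementation;
// import paths may differ slightly depending on the ML Kit version in use.
import android.graphics.Bitmap
import com.google.mlkit.vision.barcode.BarcodeScannerOptions
import com.google.mlkit.vision.barcode.BarcodeScanning
import com.google.mlkit.vision.barcode.common.Barcode
import com.google.mlkit.vision.common.InputImage

fun detectQrInScreenshot(screenshot: Bitmap, onFound: (String) -> Unit) {
    val options = BarcodeScannerOptions.Builder()
        .setBarcodeFormats(Barcode.FORMAT_QR_CODE)
        .build()
    val scanner = BarcodeScanning.getClient(options)
    val image = InputImage.fromBitmap(screenshot, 0) // 0 = no rotation applied

    scanner.process(image)
        .addOnSuccessListener { barcodes ->
            // A real implementation would surface a "save boarding pass" action here;
            // this sketch just hands back the raw payload of the first code found.
            barcodes.firstOrNull()?.rawValue?.let(onFound)
        }
        .addOnFailureListener {
            // Scanning failed; ignore the screenshot.
        }
}
```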
Both the screenshot boarding pass and the firm press detectors share a common bond: they are very clever software solutions that take unique advantage of Google’s machine learning strengths to solve problems. They are also problems that, bluntly, Apple solved via more traditional methods before Google.
Still, credit where it’s due, Google is catching up and, in some cases, innovating. The automatic car crash detection looks like it could be a literal lifesaver, for example. And in everyday things, Google is making progress on fixing Android’s little annoyances piece by piece and doing so throughout the entire year instead of in one giant operating system update. Now if it could just do a better job distributing both kinds of updates to non-Pixel owners, we’d be cooking with gas.