The Pixel 2 is a phone that’s almost five years old, but it introduced a feature that I miss more and more with each passing year. Known as Active Edge, it let you summon Google Assistant just by giving the phone a squeeze. Admittedly, that’s a slightly odd idea, but it effectively gave you something modern phones lack: a way to physically interact with your phone to get it to do something.
If you look at the Pixel 2 and 2 XL, you won’t see anything that suggests you’re getting something special. Sure, there’s a power button and a volume rocker, but otherwise the sides are bare. Give those bare edges a good squeeze, however, and there’s a subtle vibration as Google Assistant slides up from the bottom of the screen, ready to listen. There’s no need to wake the phone, long-press a physical or virtual button, or tap the screen. You just squeeze and start talking.
We’ll get to how useful that is in a moment, but I don’t want to gloss over how cool it feels. The phones have solid metal and plastic bodies, yet the Pixel can tell when I’m doing something more than just gripping it. According to iFixit’s old teardown, this is made possible by a pair of strain gauges mounted inside the phone that detect the slight flex in the case when you squeeze it. For the record, that flex is far too small for my human senses to register; I can’t tell that the phone is bending at all.
Whether you found Active Edge useful probably came down to whether you enjoyed using Google Assistant, as discussions on Reddit suggest. Personally, I used the voice assistant about once a day when I had a Pixel 2, because it really was that convenient. Part of what made the squeeze so practical is that it almost always works: even if you’re in an app that hides the navigation buttons, or your phone’s screen is completely off, Active Edge still does its job.
While that made it great for looking up bits of information or doing quick calculations and conversions, I’d argue Active Edge would have been even more useful if you could remap it. I enjoyed having Assistant a squeeze away, but if I could squeeze to turn on my flashlight instead, I’d have instant access to my phone’s most important function no matter what.
That version of the feature actually existed. The HTC U11, released a few months before the Pixel 2, had a similar but more customizable feature called Edge Sense. The two companies worked together on the Pixel and Pixel 2, which explains how the squeeze ended up on Google’s devices; Google bought a chunk of HTC’s mobile division team that same year.
Active Edge also wasn’t Google’s first attempt at giving you a way to control your phone beyond the touchscreen and physical buttons. A few years before the Pixel 2, Motorola phones let you open the camera by twisting your wrist and turn on the flashlight with a karate-chop motion, not unlike shuffling your music by shaking a 2008 iPod nano. The camera shortcut actually dates back to when Motorola was owned by Google.
Over time, though, phone manufacturers have moved away from physical ways of getting at basic functions. Take my daily driver, the iPhone 12 Mini. To summon Siri, I have to hold down the power button, which has been saddled with ever more responsibilities since Apple got rid of the home button. To turn on the flashlight, which I do several times a day, I have to wake the screen and long-press a button in its bottom-left corner. Slightly more convenient is the camera, which I can reach with a left swipe on the lock screen, though the screen still has to be on for that to work. And if I’m actually using the phone, the easiest way to get to the flashlight or camera is Control Center, which means swiping down from the top-right corner and trying to pick out a specific icon from the grid.
In other words, if my phone is off and I notice my cat doing something cute, the moment may well have passed by the time I actually get the camera open. It’s not that turning on the camera or flashlight is difficult; it’s that a dedicated button or squeeze gesture would be more convenient. Apple even tacitly acknowledged this for a while when it built a camera button into a battery case it made for the iPhone. Shaving a few seconds off here and there adds up over the life of a phone.
Just to prove the point, here’s how getting to the camera on my iPhone compares with the Samsung Galaxy S22, which lets you double-press the power button to launch it:
Neither phone handles screen recording and camera previews gracefully, but the S22 has its camera app open before I’ve even managed to tap the camera icon on the iPhone.
Unfortunately, even Google’s phones aren’t immune to the disappearance of physical controls; Active Edge stopped showing up with the Pixel 4A and Pixel 5 in 2020. Samsung has likewise dropped the dedicated button it once included for summoning a virtual assistant (which, sadly, had to be Bixby).
Phone makers have tried adding virtual buttons that you trigger by physically interacting with the device. Apple, for example, has an accessibility feature that lets you tap the back of your phone to carry out an action or even run one of your Shortcuts, and Google has added a similar feature to Pixels. To be perfectly honest, though, I haven’t found them reliable enough: a virtual button that only works some of the time isn’t a great button. Active Edge worked for me almost every time, even with my phone in a thick OtterBox case.
It’s not as though physical controls have completely disappeared from phones. As I mentioned, Apple lets you launch things like Apple Pay and Siri with a series of taps or presses of the power button, and there’s no shortage of Android phones that let you double-press the power button to launch the camera or another app.
But I’d argue that mapping a shortcut or two onto the power button still doesn’t give us easy access to everything that should be easy to reach. To be clear, I’m not advocating for phones covered in buttons, but I do think the major manufacturers should take a cue from phones of the past (and, yes, from smaller phone makers of the present; I see you, Sony fans) and bring back at least one or two physical shortcuts. As Google showed, that doesn’t necessarily mean adding an extra switch that has to be waterproofed. It could be as simple as a squeeze that gives users quick access to the feature they need most, or, in the Pixel’s case, the one Google wants them to use.