Evolutionary. Not revolutionary.
The Porsche 911 is 53 years old. Some say its design is predictable and outdated. A few even found prehistoric cave drawings of it in Germany. But people usually forget that it's the most iconic sports car in the world.
The combination of its pronounced fenders, defined shoulders, and lines that extend to the rear produces a distinguishable, successful look. Porsche has not only created a product that prioritises essentials over the superficial; they've sustained it for half a century. You could say the same of Leica's 35 mm rangefinders and Rolex's Submariner line. These companies demonstrate the value of evolving a product's original design, and the same applies to Apple's iPhone.
People gravitate towards a product that's designed well. We want a no-bullshit, beneficial, simple product. No one wants to be burdened with unwarranted change. People hate different; it's why iterated products are so popular. But too much iteration can lead to stagnation, which is why some tech writers were quick to call the iPhone 7's design boring, even after seeing this.
There are usually a billion doom-and-gloom articles when a new iPhone is revealed. A lot of tech writers endlessly preach a narrative in which Apple changes the iPhone's design for the sake of change, in which they believe innovation is synonymous with revolution. And when a new device is revealed, it's met with disappointment because, from early leaks, the writers already knew what the phone looked like and saw no dramatic difference, just a facelift.
But the iPhone 7’s design is a conscious evolution of its original design. In the same way that a Porsche 911 has to look like a Porsche, the iPhone 7 has to look like an iPhone while refining the iPhone experience. And it does just that.
Good design cannot be measured definitively. Dieter Rams, an influential 20th-century designer, created ten principles for good design. For over 20 years, these principles have influenced designers like Jony Ive, Apple's Chief Design Officer, to create innovative, honest, and minimalistic products. I believe the original iPhone's design exemplifies that ideology, and the iPhone 7's design simply continues it, without any superfluous changes.
Though subtle, the absence of the top and bottom antenna lines makes the back of the iPhone look cleaner. After a while, the antenna lines on my 6S gave the phone a dated, kitschy look, while the iPhone 7's back looks concentrated. Granted, that's all thrown out the window if you use a case on your phone, but man, would it be hard to persuade me to put a case on the jet black model.
It looks beautiful and gives the iPhone 7 a uniform look. The antennas disappear and the bezel looks nonexistent. I didn't order one, because it loves fingerprints more than I do. But if my pants had microfibre pockets, a jet black iPhone 7 would be on my table right now. On the bright side, its glossy anodised finish makes it the grippiest iPhone I've ever held.
After owning a space grey iPhone 6S for a year, I promised myself I'd order a silver iPhone 7, because my hands get greasy quickly, which is why I carry a microfibre cloth with me. I know, I'm crazy, but the silver iPhone 7 looks uniform when the screen is on (most apps have a white background), and the only fingerprints I have to worry about are on the screen and the Apple logo. And considering that I rarely put a case on my phones and don't baby them, the jet black model would make apparent everything that the silver model hides.
When I had an HTC M8 two years ago, I slowly gravitated towards the iPhone 6’s look because I felt like it had character. It looked concentrated, honest, and thoughtfully made. It was my first iPhone, and there was never a better looking phone until I saw the Nexus 6P. It had a contemporary aesthetic that was inviting and expressive. It had character. But when I got my Google Pixel—the XL variant—it had about as much character as a toaster.
Huawei did a brilliant job with the Nexus 6P, but unfortunately their smartphone partnership with Google was short-lived. The Pixel was originally intended to be built by Huawei, but Google didn't want another manufacturer's trademark on the self-branded Pixels. This caused Huawei to end the association, leaving Google with nine months to find a manufacturer for the Pixels before launch. That triggered a partnership with HTC to build a phone heavily influenced by a previous HTC device, the HTC One A9.
Companies usually have a 12-20 month pipeline to produce a smartphone. But because Google only had nine, they were stuck designing a toaster.
There's a significant amount of wasted white space on the Pixel's bottom front bezel. For a phone priced similarly to the iPhone, the side buttons feel like they'll fall off at a moment's notice. You know that this wouldn't be the case on an iPhone, and that if it were, the person responsible would be sent into the desert with a shovel, one bullet, and a revolver.
Google says the glass on the back is intended to improve antenna performance and give the Pixel personality. But if you accidentally drop the Pixel and it lands on its back, the glass will probably shatter, which can leave you with foggy images if the crack extends to the camera's lens.
The one thing the Pixel took from the Nexus 6P was the placement of its fingerprint scanner on the back of the phone, which is arguably the least convenient place to put one. The problem is that there's no way to unlock the phone without picking it up. If my iPhone 7 or Galaxy S7 Edge is on the table, I can easily unlock it by pressing the home button with my finger, something I'd taken for granted before I got the Pixel. It's funny that the Pixel's fingerprint scanner lets me register five different fingerprints to unlock the phone, when in reality I'll only ever use two: my index fingers.
To their credit, Google did as much as they could in less than a year. Since they barely had control over the design, they shifted focus to what they could handle: the software. And let me tell you, it's damn brilliant.
Apple designs their own SoC (system on a chip) and their own microarchitecture. This level of vertical integration allows Apple to control and optimise their hardware. Whenever I hear that the latest iPhone is faster than a MacBook, iPad, Ferrari, etc., I'm not surprised. Since Apple controls both hardware and software, they can use witchcraft to have iOS draw more performance from two cores than Samsung can from four.
Because the iPhone 7 introduces a quad-core processor, iOS can designate two of the four cores for low-power tasks, like texting a Tinder match. But when you need to quickly edit three 43 MB RAW images in Lightroom and export them to Dropbox, the two high-performance cores handle it while sipping daiquiris.
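To illustrate the idea (this is not Apple's actual scheduler; the task names, costs, and threshold below are all hypothetical), heterogeneous dispatch boils down to routing cheap work to efficiency cores and expensive work to performance cores:

```python
# Hypothetical sketch of big.LITTLE-style task dispatch: light tasks
# go to efficiency cores, heavy ones to performance cores.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    cost: float  # estimated work units; purely illustrative

EFFICIENCY_THRESHOLD = 10.0  # assumed cutoff, not a real A10 parameter

def pick_cluster(task: Task) -> str:
    """Route a task to a core cluster based on its estimated cost."""
    return "efficiency" if task.cost < EFFICIENCY_THRESHOLD else "performance"

tasks = [Task("send iMessage", 1.5), Task("export RAW batch", 120.0)]
assignments = {t.name: pick_cluster(t) for t in tasks}
```

The real win is that the efficiency cores handle the background chatter at a fraction of the power, so the performance cores only wake for the heavy lifting.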
That level of optimisation extends to other situations on both iOS and Android. After I restored my iPhone 7 with my 6S’s encrypted backup, I left my powered-on 6S on my bedroom counter for the next eight days, during which the battery percentage went from 60% to 14%. I only touched it once to put it in silent mode, because of the blaring push notifications. The same can’t be said for my Galaxy S7 Edge. After receiving my Pixel XL, I left my powered-on S7 Edge on my desk for three days. It had around 80% left, but when I picked it up for the first time, I noticed that it had died. It may have died on the second day, or even two hours before I picked it up. Regardless, its battery was kaput in less time than my iPhone 6S. Other than push notifications, a phone should idle when its screen is off, not watch TV and drink all my juice.
The reason the iPhone 6S was able to last that long is its custom SoC. Apple's engineers have designed iOS to draw very little power from the cores when they're idle. The result is a smooth software experience that I hadn't found on any Android device until I used the Google Pixel.
Though Google doesn't have their own custom chips yet, they do have complete control over Android. This allows them to tweak essentially every little thing in Android to maximise the performance drawn from the SoC. The performance lead for the Pixel team put it bluntly:
“we spent most of the time on performance consistency and touch latency. let's just say it paid off.” (@t_murray, October 4, 2016)
And it did. The Google Pixel is car-falling-off-a-cliff fast. That Google could extract this much performance from a chip they didn't design is remarkable. Google has already confirmed that they plan to produce their own custom chips, and I'm eager to see what they do with them.
Saying a Ferrari LaFerrari should be faster than a Porsche 918 Spyder on the Portico Circuit because it has more horsepower is blind optimism. On paper it should be faster, but there's a bevy of elements, such as weight and aerodynamics, that can hinder or help a lap time. Ultimately, the Porsche 918 Spyder delivers the faster lap because of its well-tuned system. That's what Apple and Google have shown with their latest phones. You don't have to throw in a pile of cores, like some competitors do, to get good performance. You just need to optimise the four you have.
You’ll probably grow a beard before you drain the battery in either of these phones.
The battery life in the 7 is good enough to handle a moderate day, but when I went to New York to shoot with the iPhone 7, I bought the Smart Battery Case so I could create content without any battery-related compromises. From 8 a.m. till midnight, I shot 4K video, recorded hour-long interviews, and used Apple Maps every hour, among other menial tasks, and ended the day with 50% on my phone while the battery case was depleted.
Even on days when I decided not to use the battery case, I shot gigabytes of RAW photos, used Apple Maps, streamed music, browsed Instagram and Twitter, dealt with emails and texts, and came home around midnight with 20% left. Granted, I didn't shoot video those days, but everything else was fair game.
I wasn't able to task the Pixel XL as heavily as my iPhone, since it was delivered while I was out of town, but I did as much as I could to put it through its paces in my daily life. There were days when, from 12 p.m. to 3 a.m., I'd watch a two-hour movie, stream a few 20-minute YouTube videos, deal with emails, read news for half an hour, and stream podcasts; the Pixel would be left with 35%. Even on days when I forgot to charge the phone and had 60% to spare, I'd use the Pixel XL all day without worrying about its battery life.
Can you hear me?
After owning various iPhones and Samsung Galaxys, I began to realise how little I had appreciated the front-facing stereo speakers on the HTC M8 and Nexus 6P. With those phones, I wasn't burdened with worrying about losing clarity when listening to a podcast from a few feet away, or when watching a video without my earphones. Front-facing stereo speakers are a feature you don't think you need in a phone, until you get a phone that has them.
Unfortunately, if you're upgrading to the Google Pixel from a phone with wonderful front-facing stereo speakers, you'll be forfeiting them. The Pixel's single bottom-firing speaker is about as redeemable as stepping in a puddle with socks on is enjoyable. When I hold the Pixel XL in a comfortable position, I end up muffling the sound, forcing me to re-orient the phone into an uncomfortable one. Sometimes I don't want to plug in earphones or connect a Bluetooth speaker just to hear audio, so it's disappointing to have my fingers or palm block the Pixel's speaker while I'm holding it. The other issue is that at the phone's highest volume, audio sounds like it's coming from a bucket: distorted, and lacking the aural perspective of a stereo system.
The iPhone 7 is the first iPhone with a stereo speaker setup. Previous iPhones had one speaker, placed on the bottom right. Now, with a second speaker in the earpiece, the iPhone 7's audio has greater dynamic range. There have been times when the earpiece was so loud that I had to turn the volume down during phone calls. When you listen to audio with heavy bass, the phone's chassis seems to act as a subwoofer, which makes listening to music without earphones an immersive experience.
One of the niftiest features the iPhone 7 took from the iPad Pro is dynamic audio channel delivery based on the device's orientation. Whether you hold the iPhone 7 horizontally or vertically, its speakers automatically adjust the left and right audio channels to deliver a rich, consistent stereo experience.
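As a toy illustration of the idea (the function, orientation names, and portrait behaviour are my assumptions, not Apple's API), orientation-aware channel assignment might look like this:

```python
# Toy sketch: decide which physical speaker (earpiece or bottom)
# carries each stereo channel, based on how the phone is held.

def channel_map(orientation: str) -> dict:
    """Return the physical speaker assigned to each stereo channel."""
    if orientation == "landscape_left":   # earpiece ends up on the left
        return {"left": "earpiece", "right": "bottom"}
    if orientation == "landscape_right":  # earpiece ends up on the right
        return {"left": "bottom", "right": "earpiece"}
    # portrait: assume both speakers play a mixed-down signal
    return {"left": "both", "right": "both"}
```

Rotating the phone 180 degrees flips the mapping, so "left" always sounds like it comes from your left.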
Now, I understand that stereo speakers on a phone aren't the highest priority for some people, especially when many use earphones and Bluetooth speakers. Hell, I use both frequently, but in the rare circumstances where I have to play audio from my iPhone 7's speakers, I'm never worried about hearing it loudly or clearly enough. And there are times when I prefer the phone's speakers because I don't want to bring earphones or a big speaker.
Point. And shoot.
Because smartphones are so portable, we're able to take photos that wouldn't have been possible a few years ago. I once talked to Joe Pugliese, a Los Angeles-based celebrity photographer, about mobile photography, and his insight is worth sharing:
“I've been very happy to see the point of entry to photography get so low that basically anyone can do it. I was always uncomfortable with the fact that so many voices weren't being heard around the world because of the difficulty in taking pictures, getting them developed, and presenting them to somebody else. You can find coverage on almost anything now, even if it was taken on a phone… If you have a vision and you can point that thing in the right direction at the right thing, then who gives a shit what the device is?”
And I can't agree with him enough. We've reached the point where smartphones let people pursue their artistic passions with a relatively affordable camera that fits in their pocket. Smartphones won't replace DSLRs, but they can, and will, complement them. This is why you see companies like Apple, and now Google, stress the importance of an intuitive, powerful camera in a smartphone.
The biggest draw to my first iPhone was that the camera never seemed like an afterthought. It was, and still is, the most intuitive camera I've ever used. And with the iPhone 7, that ideology of an intuitive experience is exemplified by powerful tools such as Portrait mode, which lets you approach photography with more variety, without the handicap of a botched interface.
Before you continue reading, this website has the best interactive tool for learning how camera settings affect a picture.
The iPhone 7 Plus's Portrait mode artificially reproduces bokeh: the background blur you see in photos with a shallow depth of field. It isn't the first phone feature of its kind, but I'd never seen it executed well before. The HTC One M8, one of the first phones with an artificial bokeh effect, produced it by using two separate sensors to map depth data, similar to the way 3D cameras determine depth. Unlike the iPhone 7 Plus, it offered adjustable refocus; a slider let you de-focus an image as much as you liked. For all that amazing technology, it fell flat because of its algorithm. When I used the feature on my M8, images would unreliably produce artificial bokeh, making the effect look obvious and tasteless once applied.
Apple took a similar approach to HTC's, but because of their complete control over both hardware and software, they were able to produce smooth bokeh on the iPhone 7 Plus in favourable situations, which usually means a well-lit area with a still subject no more than seven feet away. This lets the phone map depth without moving objects interfering. When I tested the feature on an iPhone 7 Plus, I noticed that Portrait mode's live view also seemed to incorporate facial tracking similar to Snapchat's face filters.
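The general principle of depth-mapped blur can be sketched crudely. Real pipelines use dense per-pixel depth maps and lens-shaped blur kernels, but this toy one-dimensional version (all names and numbers are mine) shows the core idea: keep in-focus pixels sharp and average everything else:

```python
# Crude sketch of depth-based background blur on a 1-D "image":
# pixels whose depth falls outside the in-focus band get blurred.

def fake_bokeh(pixels, depths, focus_depth, tolerance=1.0):
    """Blur pixels whose depth is outside the in-focus band."""
    out = []
    for i, (p, d) in enumerate(zip(pixels, depths)):
        if abs(d - focus_depth) <= tolerance:
            out.append(p)  # in focus: keep sharp
        else:
            # out of focus: 3-tap box blur as a stand-in for real bokeh
            window = pixels[max(0, i - 1): i + 2]
            out.append(sum(window) / len(window))
    return out
```

The hard part in practice isn't the blur; it's producing a clean depth map, which is exactly why a still, well-lit subject helps so much.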
As much as I appreciate this feature, it's not something I'd use, because I prefer my dedicated cameras for smooth, real bokeh. But if you've always wanted a DSLR and could never justify buying one, this feature alone is a reason to buy the iPhone 7 Plus.
I considered ordering the iPhone 7 Plus because of its 56 mm (equivalent) telephoto lens, but I prefer shooting at wide angles, and the iPhone 7 Plus is too big to comfortably fit in either my pockets or my hands. The fact that both iPhone 7s have identical sensors, a ƒ/1.8 aperture, and optical image stabilisation was comforting enough for me to buy the regular-sized model, which finally brings me to my experience with the iPhone 7's camera.
The iPhone has been my daily camera for nearly three years. There have been times when I'd have a Leica around my neck and an Android device in my back pocket, but the first thing I'd use to take a picture was my iPhone. That's mainly because I know what the picture will look like before I take it, and because I can share it within seconds. There have also been situations where I didn't feel comfortable carrying a $5,000 camera, and knew that if I wanted to shoot something, I could capture it with a semblance of accuracy.
One of my growing issues with the iPhone 6S was its ƒ/2.2 aperture, because I often wanted to shoot without the flash. Since smartphones have relatively small sensors, the smallest change in f-stop can yield significant results in your image. The smaller the f-number, the more light you let onto the sensor. So upgrading from ƒ/2.2 to ƒ/1.8 makes a meaningful difference on a smartphone sensor.
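The gain is easy to quantify: the light a lens gathers scales with the inverse square of the f-number, so the step from ƒ/2.2 to ƒ/1.8 works out to roughly half a stop:

```python
import math

# Light gathered scales with the inverse square of the f-number,
# so the gain from f/2.2 to f/1.8 is (2.2 / 1.8) ** 2.
gain = (2.2 / 1.8) ** 2   # ~1.49x more light
stops = math.log2(gain)   # ~0.58 of a stop
```

That roughly 49% increase in light lets you use a correspondingly faster shutter speed or lower ISO at the same exposure, which matters far more on a tiny sensor than it would on a full-frame camera.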
From being able to use a faster shutter speed without sacrificing exposure values, to getting a shallower depth-of-field on close-up subjects, the iPhone 7 made me more comfortable taking pictures in unpredictable scenarios. So when I decided to challenge the iPhone 7’s photographic capabilities, I wanted to take it somewhere where spontaneity was welcomed: New York City.
My 'Daily Camera' series explains how and why I approached the images I took with my iPhone 7, but to briefly summarise: the iPhone 7 handled most scenarios with ease. I skipped iOS's default camera app and used ProCam 4 to capture RAW images, and the amount of detail I was able to edit and modify was surprising. Pairing ProCam 4 with Adobe's Lightroom Mobile app let me tame nearly blown-out highlights and enhance already vibrant colours, among other things that stretched my editing capabilities on the iPhone 7's vibrant display.
The iPhone 7 introduces a new wide colour gamut display, which DisplayMate called the most colour-accurate display they've ever measured, with near-perfect gamut accuracy. So when it came to taking, editing, and viewing photos, I knew the colours I saw were representative of what you'd see with the naked eye.
I rarely used ProCam's manual focusing, since the iPhone 7's focus tracking rarely failed me unless I put it in a situation that demanded manual control, such as photographing a moving object at night with only one light source. That was usually where the iPhone struggled: moving subjects at night. Unless you use the flash to compensate for low exposure at a high shutter speed, the iPhone 7 isn't going to win any awards for low-light photography of moving subjects. Not to point any fingers, but do you see any images in Apple's latest iPhone 7 campaign where someone is running, jumping, or making any sort of rapid movement at night? You can get away with some scenarios, such as a huge light source close to the subject, but the iPhone 7 can be frustrating in low light. There is, however, a smartphone that makes low-light shooting seamless.
The Google Pixel has the most capable camera I've ever used in an Android phone, and the best low-light camera I've used in any smartphone. With or without the flash, I never once worried about an image being out of focus or flooded with noise (grain). The only sacrifices I noticed were inconsistent white balance and oversaturated colours. You can fix both in post, but you can't reduce noise without it being noticeable, and you can't magically make a blurred image sharp. I really wish I'd had the Pixel in New York, because I often found myself frustrated, switching to manual controls just to get a usable low-light image out of the iPhone 7. It's a problem I don't mind occasionally, since I know how to make an image sharp, well exposed, and accurate, but someone who isn't well-versed in mobile photography will be frustrated by how washed out and noisy everything looks. And there will be fleeting moments when you don't have time to open a third-party app and fiddle with settings and toggles.
Although it's illegal and stupid, I've taken a few pictures with the Pixel while driving, and 70% of them are crisp and beautifully rendered. I never once looked at the phone or tapped the screen to focus or shoot. I'd simply pull the phone out, point it in the subject's direction, double-press the lock button to activate the camera, and press the shutter with the volume rocker.
The main reason the Pixel takes amazing low-light images is its HDR+ mode, which uses software to make capturing a high-dynamic-range image as seamless as possible. By the time you press the Pixel's shutter, a burst of frames has already been captured. Traditionally, an HDR image combines an underexposed, a well-exposed, and an overexposed shot into one accurately exposed image. But the wizards at Google Research's computational photography team decided to have the Pixel's HDR+ engine underexpose every frame instead. Capturing multiple short frames lets the Pixel average away noise in the shadows and then compress the dynamic range with local tone mapping. This was crucial to me, because my subjects were sharper at night than they were on my iPhone 7.
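The core trick, averaging many short underexposed frames to cut shadow noise before tone mapping, can be sketched in a few lines. This is pure illustration (the frame count, gain, and scene values are invented); the real HDR+ pipeline also aligns tiles, rejects motion, and tone-maps locally:

```python
import random

random.seed(42)

TRUE_SCENE = [20.0, 40.0, 200.0]  # underexposed "ground truth" pixel values

def capture_frame():
    """One short, underexposed, noisy frame."""
    return [p + random.gauss(0, 8) for p in TRUE_SCENE]

def hdr_plus_merge(n_frames=8, gain=1.8):
    """Average frames to cut noise (~sqrt(n_frames)), then apply a
    flat gain as a crude stand-in for local tone mapping."""
    frames = [capture_frame() for _ in range(n_frames)]
    merged = [sum(col) / n_frames for col in zip(*frames)]
    return [min(255.0, p * gain) for p in merged]

result = hdr_plus_merge()
```

Short exposures keep every frame sharp (no motion blur), and the averaging step recovers the signal the short exposure sacrificed, which is why moving subjects at night fare so much better here.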
There are many spatial limitations to implementing better hardware in smartphone cameras, so manufacturers tend to focus on intelligent software processing to yield better photographic results. It pushes competing companies to approach camera design and development in new ways, and gives consumers on each platform a phone with a great camera. One of my biggest gripes with Android devices was the lack of smart photo processing, but seeing what Google has started with HDR+ makes me hopeful about the Pixel's successor; it should remind Apple that they're not the only ones with good witchcraft. In the end, the ones who benefit most are us, the consumers.
Every year, tech writers perpetuate the same narrative: the next iPhone is boring and doomed. Looking at this list of articles, you'd think Apple should be dead by now.
- The end of the Apple dynasty? TechCrunch, 2016
- 6 Reasons Apple is Still More Doomed Than You Think, Forbes, 2015
- Conventional wisdom watch: Apple is doomed. Google is forever, CNN Money, 2014
- There's No Question Now: Apple is Dead, The Street, 2013
- 9 Reasons Apple Is “Doomed”, Buzzfeed, 2013
Yet the iPhone 7 yielded record-breaking earnings for Apple. It seems the same writers who cry “doom” have no fucking idea what consumers want. Consumers demonstrate how tolerable evolutionary changes are by buying, or not buying, something. That's why, after five decades, Porsche still makes one of the most sought-after sports cars in the automotive market. And it's why, after ten years, Apple still makes the most iconic, desirable smartphone in the world. Could it be better? Of course. And that's what the iPhone 7 is: a better iPhone. But the Google Pixel? It's a good phone that should've been released a year ago.
The iPhone 7's Taptic Engine introduces System Haptics for the home button, software controls, and interactions. Some of these involve a gentle tap when you open Control Centre or Notification Centre. You'll also notice them when scrolling through date and time wheels.
The ‘click’ you hear from the iPhone 7’s home button is actually emitted from the bottom right speaker grille, which emulates the tactile response you’d get from clicking a physical button. Try clicking it while covering the speaker grille with your finger.
The Google Pixel has Google Assistant, the best personal assistant I've used on a phone. It's smart, fast, and just as personable as Siri. The divide between Google's and Apple's services primarily comes down to privacy. Google users presumably tolerate less privacy in return for accuracy, whereas Apple users presumably tolerate less accuracy for the sake of privacy. This is why Siri sucks.
Android devices such as the Pixel have a split-screen feature, which is a godsend. It’s incredibly useful for simultaneously digesting content and information. From reading an article on the train while monitoring exactly where your next stop is on Google Maps, to watching a YouTube video while you quickly search for something, it’s a feature that I really hope the iPhone 7 Plus’s successor will have.
Screen search is another intelligent Android software feature, activated by touching and holding the home button. It lets Google surface links, apps, and actions based on what's currently on your screen.