LIQUIFY | iPhone 12 Pro Max Review
Updated: Mar 4
Apple's iPhone 12 Pro Max features the powerful A14 Bionic, all-new design with Ceramic Shield, Pro camera system, LiDAR Scanner, and the biggest Super Retina XDR display ever on an iPhone.
The Pro Max features huge advancements in mobile photography, cinematography, high-speed mobile broadband, and processing power with the all-new A14 Bionic System-on-a-Chip (SoC).
I've had the Pro Max for 17 days now, long enough for the newness to wear off and the reality to set in. Does it stand up to the hype? Read on, and I'll tell you all about it ...
It's a Leap Year
Let me start by telling you my path with iPhones to give you some context. My first iPhone was the iPhone 7, then the 8 Plus, followed by the XR, and then the iPhone 11, so this is my first true personal experience with a Pro model of iPhone. I picked the right point to step up the ladder.
I specifically used Liquify in the blog title because it's the one word I would use to describe my experience.
This is now one of the many products in the Pro line that I currently own. My Pro devices include a 2019 16" MacBook Pro, a 2015 12.9" iPad Pro and now the iPhone 12 Pro Max. I also have a new pair of the AirPods Pro, thanks to one of my clients. I did own a couple of different 13" MacBook Pros, but I traded up to the 16" MBP, and am not looking back ...
Working with Pro-level devices is definitely a step up. The added capabilities of the Pro model products help them perform better individually, but when coupled with other Pro model devices, they have liquified my mobile, laptop, listening, entertainment, and computing experience.
Each product added to my Pro circle adds that much more functionality to my life. So much so that I feel like I am beginning to experience that idyllic future that has been promised to us for so long.
Don't worry, I'll get into all the critique and specs, but I wanted to set the stage for what's to follow.
So let's dive right in. By now you have all seen the stovetop memes, but the Pro Max has a full-sized range. As you can see in my amateur video production below, the camera system area on the Pro Max is even larger than the one on the iPhone 12 Pro. There was no difference in the camera systems between the iPhone 11 Pro and the 11 Pro Max; they were the same.
But the 2020 Pro Max packs a powerhouse of features that separates it from all the other iPhones. It is not only larger, with a bigger battery, but it also includes camera features a pro can appreciate. Now, I am not saying that I am a pro, but I aspire to be, and having pro tools is helping me ramp up.
Let's start with Optical Image Stabilization (OIS). The Pro Max uses a different OIS method than any other smartphone: Apple borrowed a technology from 35 mm cameras called IBIS.
IBIS is an acronym for In-Body Image Stabilization. It's a technology that stabilizes the iPhone's image sensor to provide both shake-free video footage and sharp still images when shooting handheld at longer shutter speeds. Other manufacturers, and even all other iPhones, stabilize images by moving the camera lens, but the Pro Max is different from all the rest.
Apple says, "Until now, sensor‑shift stabilization was only on DSLR cameras. This is the first time it’s been adapted for iPhone. Whether you’re shooting video of your kids as you chase them around the park or holding your iPhone out the window on a bumpy road, you’ll get more precise stabilization than ever." Apple engineered a stabilizing solution, called sensor-shift OIS, that moves just the sensor, keeping it even steadier than before. Believe me, it’s a game changer.
This new sensor shift readjusts the sensor on both the X and Y axes, so no matter the orientation or direction of movement while you're filming or taking stills, the sensor shifts with you.
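To make the idea concrete, here's a tiny back-of-the-napkin sketch of what a sensor-shift loop is conceptually doing. To be clear: Apple hasn't published its control loop, so the function name, inputs, and travel limit below are my own inventions, just an illustration of "move the sensor opposite the shake."

```python
# Toy model of sensor-shift stabilization (illustrative only; Apple's
# actual control loop, travel range, and sensor fusion are not public,
# and every number here is made up for the sketch).

def stabilize(shake_x_um, shake_y_um, max_travel_um=100.0):
    """Counter a measured handshake offset (in micrometers) by shifting
    the sensor the opposite way, clamped to its physical travel range."""
    def clamp(v):
        return max(-max_travel_um, min(max_travel_um, v))
    return clamp(-shake_x_um), clamp(-shake_y_um)

# A small jitter is cancelled exactly...
print(stabilize(20.0, -35.0))   # -> (-20.0, 35.0)
# ...while a shake beyond the sensor's travel is only partially corrected.
print(stabilize(250.0, 10.0))   # -> (-100.0, -10.0)
```

The point is simply that small jitters get cancelled outright, while a shake beyond the sensor's physical travel can only be partially corrected.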
Oh, BTW, I named my new iPhone 12. I call him Max, so let's just call him by his first name for the rest of this blog post. Anyway, on to the sensor itself: Max has a larger sensor than any of the other 2020 iPhone 12 models.
Max takes the pro camera experience even further. The new ƒ/1.6 aperture Wide camera boasts a 47% larger sensor with 1.7μm pixels, bigger than the sensor on the iPhone 12 Pro or any of the other models in this year's lineup.
But there is a whole lot more to this full-sized range. The new Wide lens is composed of seven individual elements for edge‑to‑edge sharpness. But wait, there are still two other camera lenses and a LiDAR Scanner to talk about. And then we'll dive into computational photography, Dolby Vision, and all the rest of the features, and how they have liquified my smartphone experience.
But first, please take a moment to soak in all the goodness of my first video for Applefanboi. I am a rank amateur, but I'll get better. If you like the content, please like the video and subscribe. And don't forget to tap the bell to get notified when I upload new videos. Feel free to leave a comment.
A New Day, For Night Mode
Apple says it has made improvements to Night mode, now expanded to the TrueDepth and Ultra Wide cameras, allowing for an even brighter picture. And Night mode Time-Lapse delivers longer exposure times for sharper videos, better light trails, and smoother exposure in low-light scenarios when used with a tripod. Deep Fusion, now better and faster, comes to all cameras, and with the new Smart HDR 3, users can expect more true-to-life images, even in complex scenes.
Well, as the Wide angle lens has the biggest sensor and lets in more light, I decided to put it to the test. I live on the banks of Lake Villa Park, and other than some property lights and the lights of the houses across the lake, there is virtually no light in the area. So I went out about 9:00 PM on a dark night and shot directly across the lake. The grass and the posts that cordon off the banks were lit up in the shot like it was the middle of the day. But somehow the rest of the shot didn't get overexposed or washed out with light, which is the effect you get when you shoot with a flash at night.
The picture was solid, and the exposure only took about two seconds. Please note that this was a handheld shot; no tripod was used. When testing the same shot with the other cameras, there was up to a 3.5-second exposure time with lesser results. If you are going to do Night mode photography, use the Wide lens setting and you'll be able to capture almost any image, even in the darkest of situations. The LiDAR Scanner found in the iPhone 12 Pro models greatly assists. The scene doesn't look like this to the naked eye at that hour; even the clouds in the distance, above the trees, are visible. When you can define clouds in pitch-black scenes, you have something incredible.
AR at the Speed of Light
An all-new LiDAR Scanner on the iPhone 12 Pro Max enables Night mode portraits, and even more realistic AR experiences. Light Detection and Ranging (LiDAR) creates a depth map in nanoseconds.
The LiDAR Scanner on Max measures how long it takes light to reflect back from objects, so it can create a depth map of any space you’re in. Because it’s ultrafast and accurate, AR apps can now transform a room into a realistic rainforest or show you exactly how a new sneaker will fit.
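If you're curious how a depth map can happen "in nanoseconds," the time-of-flight math itself is simple. This little sketch is just the textbook formula, nothing specific to Apple's scanner:

```python
# Back-of-the-envelope time-of-flight math behind any LiDAR scanner.
# This is the general principle only; Apple publishes no specs for
# its implementation.

C = 299_792_458  # speed of light in a vacuum, meters per second

def distance_m(round_trip_ns):
    """Distance to an object from the round-trip time of a light pulse."""
    round_trip_s = round_trip_ns * 1e-9
    return C * round_trip_s / 2  # halve it: the pulse travels out AND back

# A wall about 5 meters away bounces the pulse back in roughly 33 ns,
# which is why a whole depth map can be built in a blink.
print(round(distance_m(33.356), 2))  # ~5.0 meters
```

At those speeds, light covers about 30 centimeters per nanosecond, so even room-scale distances resolve in tens of nanoseconds.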
Previously, AR was more difficult to create and manage in a space. Before LiDAR, the camera was used to scan the area to find the borders before you could launch into AR mode. The new LiDAR Scanner takes over this responsibility, allowing users to jump right into the AR activity without having to scan the space first.
Apple reports that this technology delivers faster, more realistic AR experiences and improves autofocus by 6x in low-light scenes for more accuracy and reduced capture time in photos and videos. This advanced hardware, combined with the power of the Neural Engine in the A14 Bionic, also unlocks Night mode portraits, rendering a beautiful low-light bokeh effect.
So you're beginning to see how these Pro features make for higher-quality results, even if you're not a trained professional. Even at night, in the photo above, you can easily judge the distance of the houses across the lake and clearly see the leaves on the trees. Much of this is due to what Apple has dubbed computational photography. And no, it's not a marketing buzzword; this is a highly sophisticated image processing pipeline. The LiDAR 3-D depth maps are the key to great images.
So far, all we have talked about are the basics of the camera system, but I don't want to tarry too long on any one subject. Max has four cameras total: Wide, Ultra Wide, and Telephoto on the back, and a TrueDepth front-facing camera that uses infrared (IR) rather than LiDAR to scan for great portraits. You can easily flip back and forth through all four lenses from the main camera screen. But all these lenses and sensors wouldn't be able to create these fantastic images without the A14 Bionic SoC.
Designed by Apple, the A14 Bionic is the first 5-nanometer chip in the industry, with advanced components measured in mere atoms. Apple reports that forty percent more transistors rev up speeds while increasing efficiency for great battery life. It has up to a 50 percent faster Central Processing Unit (CPU) and Graphics Processing Unit (GPU) than any other smartphone chip.
The Neural Engine is said to be up to twice as fast as the previous generation's, with 16 cores of processing power designed to accelerate artificial intelligence applications, especially Machine Learning (ML) and Machine Vision (MV). In other words, everything that is input and everything that the four cameras see, like applying Deep Fusion to improve details in your photos.
The photo above was shot the morning I was setting up my iPhone 12 Pro Max. I shot this image with my iPhone 11 prior to sending it back to T-Mobile. Deep Fusion is a new image processing system that works automatically behind the scenes in certain conditions. Apple describes it as an advanced image processing system that uses the Neural Engine to capture images with dramatically better texture, detail, and reduced noise in lower light.
But more recently, Deep Fusion has gained the ability to recognize objects and output images that accurately represent them. My camera recognized this object and gave me haptic feedback when it locked into Deep Fusion mode. I had recently watched Zollotech talk about this feature and snapped the pic. No, it's not an award-winning composition, but I have to say it's one of the crispest shots I have ever caught with any camera, especially one that's on last year's iPhone model.
In attempts to increase speed, other manufacturers add more RAM and bigger batteries; Apple went smaller. They say the iPhone 12 Pro has the first 5‑nanometer chip in the industry. The components are measured in mere atoms, which means the transistors can be packed incredibly close together. So energy flows between them faster, and less of it gets lost along the way. That’s how A14 Bionic boosts performance while saving energy. There’s never been anything like A14 Bionic; like the new M1, it stands above all the rest.
What does all this mean for you? A ton. Big performance gains across the board. Smart HDR 3 for more true-to-life photos. Improved detail in high-quality video, thanks to advanced temporal noise reduction on the new image signal processor. And Dolby Vision that’s encoded while you shoot.
Apple ❤️'s Dolby
By now you may have seen quite a few examples of people shooting Dolby Vision videos and posting them online. You see the realism and vibrance, and feel the energy, of the videos being shot.
Not a lot of reviewers have really focused on the relationship between Apple and Dolby Labs. They seem to treat Dolby Vision video separately from Dolby Atmos Spatial Audio and Dolby Digital. But Apple and Dolby have a deep partnership that has been growing for about the last five years, and it has had a profound effect on Apple products that's not really being talked about. I'll change that!
There isn't much of a trail to establish exactly when Apple began working with Dolby, but I think it goes back at least a year before AirPods were released. On September 7, 2016, Apple announced the iPhone 7 and 7 Plus, Apple Watch Series 2, and the original AirPods. In the proceedings, Phil Schiller talked about audio generally before introducing the AirPods wireless headphones. First he spoke about Apple's dynamic dual-speaker audio on the iPhone 7 models.
Then Phil spoke about how EarPods were transitioning to a Lightning cable connection and how Apple was courageously ditching the analog headphone jack in favor of the digital Lightning connector. He shared that there were over 900 million users of the Lightning connector worldwide, and pointed to the number of audio accessories that use a Lightning connector, which is probably why it still exists to this day. Although I am predicting a port-less iPhone next year. The video below is cued up to begin at Phil's audio portion of the keynote presentation.
He even featured, almost enviously, a pair of JBL Reflect Aware noise-canceling headphones. He spoke about the challenges ahead and how the tether to the analog headphone jack was stopping us from moving forward. Phil shared a vision of a wireless audio future and then introduced AirPods. When running down the features of the AirPods, he spoke of beam-forming microphones, which is the same type of language used to describe how HomePods work in a space.
So I think this was the inception point of the Apple-Dolby relationship. How long had they been working together before the announcement? At least one year, I would think, and they kept going.
Since their release, AirPods have become a phenomenon. Then came Apple's original HomePod in 2018, another product that featured beam-forming and spatial awareness. The original HomePod was updated with iOS and tvOS 14.2 to provide Dolby Atmos Spatial Audio when playing back properly encoded content via your iPhone, iPad, and Mac, but also via the Apple TV 4K set-top box.
In 2019, Apple introduced the AirPods Pro, with Active Noise Cancellation (ANC) and a toggleable Transparency mode to let the outside world in when necessary. AirPods Pro was also updated alongside 14.2, and a new firmware update gave these in-ear headphones audio like you've never heard.
Audio quality has improved across Apple's product line, with the Pro line of equipment having the best sound, even though the standard stuff is way better than what the other manufacturers are doing. Now there are rumors of AirPods Studio, a pair of over-the-ear headphones, forthcoming.
So sorry to go off on a tangent there, but I wanted you to see how long I think Apple has been working with Dolby, and the improvements that partnership has brought to the Apple product line. All with very little mention of Dolby Labs until the iPhone 12 announcement, when Apple revealed that all iPhone 12 models can now capture, play back, edit, and share Dolby Vision HDR video, with 10-bit color on Max.
The Double D
I think there is a deep partnership between Apple and Dolby, as Cupertino places Dolby Labs standards into millions of users' hands. I'm referring to Dolby Vision and Dolby Atmos.
The world's top storytellers, filmmakers, and creatives all have one thing in common. They tell their stories using Dolby Vision to help audiences see what they see, and feel what they feel, creating a deeper connection with their audience.
Now for the first time ever, on any device in the world, all iPhone models can record, edit, watch, and share videos in the ultravivid picture of Dolby Vision. There’s nothing to enable and no settings to adjust so you can start recording in Dolby Vision right out of the box.
For filmmakers sharing their story with the world, travelers documenting the beauty of nature, or parents capturing highlights of the school art show, now it’s easy to capture and deliver these moments in the true-to-life color and depth of Dolby Vision, setting the new standard for content captured on smartphones. Social media platforms, and App makers are racing to catch up.
In collaboration with Apple, Dolby has a history of building technology ecosystems that enable consistent, spectacular entertainment experiences. Dolby collaborated to implement seamless, efficient, and cost-effective ways to deliver experiences across diverse platforms and devices.
The future for high-fidelity, user-generated content is here. Millions of people are now able to record, edit, watch, and share videos in Dolby Vision on iPhone 12 and iPhone 12 Pro models, empowering everyone with the same tech used by top filmmakers. But Pro Max can do more.
If you're trying to decide which model iPhone to get, here's a pro tip. The iPhone 12 and iPhone 12 mini can record in HDR at 4K at 30 frames per second, while the iPhone 12 Pro and iPhone 12 Pro Max can record in HDR at 4K at 60 frames per second. Now add in the better camera technology of the iPhone 12 Pro Max, and you have cinematic quality video, for $100 more.
The price difference between the iPhone 12 Pro ($999) and the iPhone 12 Pro Max ($1,099) led many to a bad choice, due to the staggered release. For the extra $100 you not only get the better camera with better capabilities, but you also get a much larger display: 6.7" vs. 6.1".
What is Dolby Vision?
HDR is a video technology that presents pictures with a wider range of color, brightness, and contrast, allowing them to look more natural on an HDR-enabled TV or smartphone. I'm sure you have heard of contrast ratio for TV sets. Most have an X:1 spec, with more being better. Think of it as gradations on a scale.
The more gradations you have, the greater the range of contrast. If your TV has a 1,000,000:1 contrast ratio, it has 1 million gradations. Now think of these very same gradations for each individual color in the spectrum, and then include the same number of gradations of light. So, a larger scale of color, light, and contrast. But there is a Dolby difference.
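If you want to see the gradations idea in actual numbers, here's a quick sketch. Nothing in it is specific to any TV or iPhone; it's just the bit-depth arithmetic:

```python
# The "gradations" arithmetic: each extra bit per channel doubles the
# number of brightness steps, and the steps multiply across channels.

def gradations(bits):
    """Levels (gradations) available per color channel."""
    return 2 ** bits

def total_colors(bits):
    """Total colors: red levels x green levels x blue levels."""
    return gradations(bits) ** 3

print(gradations(8), total_colors(8))    # 256 levels, 16,777,216 colors
print(gradations(10), total_colors(10))  # 1024 levels, 1,073,741,824 colors
```

That's the jump from 8-bit to 10-bit: four times the brightness steps per channel, and over a billion total colors instead of about 16.7 million.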
Dolby Vision takes things one step further. It is a version of HDR that is designed to preserve a lot more of the information flowing from the original content creation (at a Hollywood studio, for example) to that content's arrival on your TV or mobile device. This information is called metadata, and it carries the color, brightness, and contrast information for every frame of a film or TV show, so the TV (or phone or tablet) knows exactly how to display the picture through the whole movie.
Because this information is there for every frame, it's called dynamic metadata, whereas standard HDR10 only has one data point, or static metadata. In short, Dolby Vision is an HDR standard that uses dynamic metadata. These dynamics give you better visuals and improve the image quality.
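Here's a deliberately simplified sketch of static versus dynamic metadata. The real HDR10 and Dolby Vision formats carry far more than one number, and the field names below are invented, but it shows why per-frame data lets a display treat a dim scene and a bright scene differently:

```python
# Simplified model of static vs. dynamic HDR metadata (illustrative
# only; real HDR10/Dolby Vision metadata is far richer, and these
# field names are invented for the sketch).

hdr10_static = {"max_brightness_nits": 1000}  # one record for the whole movie

dolby_vision_dynamic = [                      # one record per frame (or scene)
    {"frame": 0, "max_brightness_nits": 200},   # a dim night scene
    {"frame": 1, "max_brightness_nits": 950},   # a bright daylight cut
]

def target_nits(frame, dynamic=None, static=None):
    """The display tone-maps each frame using whatever metadata it has."""
    if dynamic is not None:
        return dynamic[frame]["max_brightness_nits"]  # per-frame value
    return static["max_brightness_nits"]              # same value every frame

print(target_nits(0, dynamic=dolby_vision_dynamic))  # 200: tuned to the scene
print(target_nits(0, static=hdr10_static))           # 1000: one-size-fits-all
```

With static metadata the night scene gets the same tone-mapping target as the brightest shot in the film; with dynamic metadata each frame gets its own.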
Whereas others attempt to boost resolution with more pixels, Dolby accomplishes image quality with a greater use of color. Dolby's version of HDR is superior to other HDR standards.
Right now you're saying, "Wait, my iPhone 12 doesn't have spatial audio; it's got the same two speakers as always." This is where the Pro aspect of the devices extends the liquidity magic of the Apple experience. Dolby Atmos combined with Dolby Vision is a grand entertainment experience.
By pairing AirPods Pro with your new iPhone 12 you'll experience mind-blowing audio, and you'll also be able to call up Siri with just your voice, as well as block out the rest of the world with Active Noise Cancellation (ANC) and some other great features. But I've covered this in depth in another blog post, which I'll link to from this one if you want to dive deeper.
Each Apple Pro-level device comes with its own amazing set of features, but when you start linking them all together, each device is heightened, and the interconnected network becomes more liquified.
At this point some of you are asking, are we ever going to get to the end of this blog? Unfortunately, I don't have a choice; it's my calling and my honor to pass along this information. And I am writing long on purpose, because this year is a milestone for Apple: they are standing at the threshold of a whole new era in computing. The idyllic future we have been promised for so many years is finally coming to fruition.
This quarter, Apple led the industry by introducing two new top-quality standards that the rest of the industry is racing to catch up with. The Dolby standards were already being used in cinema and on other devices, but Apple just propelled Dolby onto the main stage of every iPhone user's everyday life. I'm not even going to dive into how the new M1 MacBooks are shattering previous barriers. It's a leap year ...
RAW is a photography format that has gained more popularity in recent years, and it's soon coming to your iPhone. Traditionally, RAW files are uncompressed digital image files that are stored on your camera without any processing. All major camera makers have their own version of RAW.
It's more of a standard, but individually proprietary to each manufacturer; Canon's is CR3, and Nikon's is NEF. Android phones that allow you to shoot RAW use the universal DNG RAW format. With iOS 14.3, Apple is introducing ProRAW, its version of the standard.
But unlike cameras that shoot RAW, the iPhone will allow you to edit and share the final image right on your phone, using the editor in the Photos app. You will now be able to edit completely unprocessed files, but with all the metadata there if you want it. You can get a helping hand from the A14 Bionic, or you can go freestyle, adding the color, light, and contrast you want.
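To give you a feel for what "developing" an unprocessed file means, here's a toy version of that pipeline. This is not Apple's ProRAW code, and the gain and gamma numbers are placeholders; it just illustrates the kind of adjustments (white balance, exposure, gamma) you get to make yourself when the data arrives linear and untouched:

```python
# Toy "develop" step for one linear RAW pixel value (illustrative only;
# real RAW pipelines also demosaic, denoise, and color-correct, and
# these gain/gamma defaults are placeholders, not Apple's numbers).

def develop(linear, wb_gain=1.0, exposure_stops=0.0, gamma=2.2):
    """Turn a linear sensor value in [0, 1] into a display value in [0, 1]."""
    v = linear * wb_gain * (2 ** exposure_stops)  # white balance + exposure
    v = max(0.0, min(1.0, v))                     # clip to the legal range
    return v ** (1 / gamma)                       # gamma-encode for display

# Brightening a dark pixel by one full stop before gamma encoding:
print(round(develop(0.10, exposure_stops=1.0), 3))
```

Because each step runs on untouched linear data, you decide the look; a JPEG, by contrast, has all of these choices baked in before you ever see the file.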
Many YouTube photographers have been beta testing this aspect of iOS 14.3, and the results are phenomenal. I am not a beta tester; I rely too much on my phone and can't have things not working perfectly - I'm wayyyy too OCD for that. But I follow these guys pretty closely and am always looking for others with new perspectives. I like to get my information from a variety of sources and then distill it for my own personal consumption.
Watching multiple reviewers and creators, I am able to pick up threads and find some commonality between the many and varied opinions. And I learn something from each one of them every day. These people are my heroes; I don't follow sports or stocks - I'm tech-centric, if that's not apparent by now. So anyway, on to the homestretch; let's close it all up with MagSafe.
Stuck on You
MagSafe is a recycling of an older Apple technology for the power cords on older MacBooks. The connection to the laptop was magnetized, so if someone tripped over the cord it would simply detach from the machine. With iPhone 12, it's been reborn as a better way to charge your device, and a whole lot more.
After putting Qi wireless charging in the iPhone 8, I'm sure Apple found it less than optimal for iPhone charging. With a variety of Qi charging solutions on the market, I'm sure many of them were not aligning with the in-device coils and were causing problems like overheating, which reduces battery longevity, among a host of other issues.
Wanting the Pro experience, I purchased the clear MagSafe case and the charging disc. But there is a lot more to the MagSafe charging system as a whole. I've talked about it some in the channels and mentioned it briefly in the amateur video I posted above, but there is a lot more to it - read on.
In the virtual presentation, Deniz Teoman, Apple's VP of Hardware Engineering, spoke about the newly designed single-turn coil, NFC, and a magnetometer that senses other magnets. Apple also put in shields to focus the charging on the coils inside the phone so there is no energy bleed-off.
The magnetometer is what senses the Apple-designed cases and chargers and prompts the display to put up the ring that indicates the case color; if a MagSafe-certified charger is detected, it will also display the charging animation. The case ring animation is separate from the charging one.
Being from the payments world, I am most interested in the NFC being front and center in the MagSafe charging area. This NFC sensor will be combined with App Clip-enabled signage for a smoother, more liquified payment experience. But again, I wrote about this forthcoming ambient payment experience in another blog post, which I will link to from this one. It's a whole new era for Apple.
Now there is a snap-on wallet too that provides room for a card and some folded-up cash, but that's about it. As far as I have been able to ascertain, the snap-on wallet does not have any special payment capabilities. I don't think it interacts with the cards in your wallet or anything like that.
It's more the first of many accessories that will be added to iPhone as Apple releases the IP around MagSafe to other manufacturers. I've heard in the channels that Apple has not yet released all the specs around MagSafe, other than pointing to some recommended vendors for MagSafe magnets.
Some of the coolest things I have seen so far are a magnetic tripod mount for your new iPhone camera from Moment, which also has a complete line of lenses, filters, and other accessories for your photo and video needs. Belkin has some of the initial accessories too: a car vent charging mount has been released, along with a really cool stainless steel charging stand for Apple Watch and iPhone that has been announced ... you'll see a plethora of MagSafe accessories as we head into 2021.
Prediction: In my amateur unboxing video, I talk about the alignment tail that currently serves no purpose other than the MagSafe Wallet. But I think AirPods 3 will be small earbuds that lie in a small magnetic oval tray that will snap to the back of your iPhone and perhaps charge them.
It has been rumored that the MagSafe charging system also supports a low-trickle reverse wireless charging capability that hasn't been activated yet. I think this will be activated to trickle-charge future wireless AirPods. This may be where the alignment tail comes in: it doesn't align the MagSafe charger, and it doesn't seem to have any effect on any of the other accessories. The images above show AirPods with a stem; it's rumored that the new AirPods will be more earbud-like.
So stay tuned; Apple has more to reveal in the coming months. We are at the dawn of a new era for Apple products and services. This year they returned to their roots with the iconic iPhone 4 design and reclaimed their soul with a new Windows-killer series of laptops and desktops.
If you're a pro-am, or even have aspirations of being a photographer or videographer, go for the Pro Max. If you're a creator, move up a rung or two with Pro gear. In the few weeks I have been using this device, my workflow is better, I produce better results, and I'm not waiting on anything ...
As a closing note: yes, the iPhone 12 has 5G across all models, but that's for down the line, a couple of years from now. When AT&T, Verizon, and T-Mobile start really rolling out true high-speed broadband with real 5G speeds, you'll be ready for it. But for now, expect 4G or 4G+ speeds, and just know you're covered for the upcoming onslaught of real 5G when it arrives here in the U.S.
If you like Applefanboi, please subscribe to the blog and look for some growth in the next few months. We're going to be adding an Applefanboi podcast and YouTube reviews, as well as white papers.
One thing I did want to mention to end this on a positive note: on the 27th of November, Flurry Analytics analyst Aman Bansel wrote that the iPhone 12 Pro Max launch dominated all Apple devices over the last three years. Flurry Analytics is used in over 1 million mobile apps, so it can provide insights from 2 billion mobile devices per month, with anonymized data, of course.
Aman reports, "This year’s iPhone 12 series captured a combined 1.9% install base during their combined launch weeks, a big jump over each of the last two years."
He continues, "Last year, the iPhone 11 series launch captured 1.2% install base, and two years ago the iPhone XS and XR series captured 1.6%. Impressively, the premium-priced iPhone 12 Pro Max captured nearly 1% of the iPhone install base in its first week, outperforming any other model in our analysis across all three years." I agree; across all iPhone 12 models, Pro Max outdoes them all.