The iPhone’s cameras get most of their superpowers from custom silicon running powerful software. The iPhone 13 is no exception, using raw computing power to create the fantastic Cinematic mode. But it also introduces hardware changes that improve all the cameras, especially in the iPhone 13 Pro. As always, there are reasons both to upgrade and to hold off.

“The only reason I’m not as interested in the cinematic mode is that it looks like it’s built upon portrait mode, so it might not be as sharp as actually having a lens focus on a subject,” photographer and app developer Chris Hannah told Lifewire via direct message. “[But] the telephoto is a main reason why I’m upgrading.”

Pro vs. Non-Pro

The most significant differences between the iPhone 13 and the 13 Pro are physical. The 13 has a 2x optical zoom range, for example, whereas the Pro gets 6x. The Pro also has a LiDAR scanner for better movies, low-light autofocus, and Portrait mode photos. And its superior lenses enable macro mode, which lets you focus from as close as two centimeters, or less than an inch.

The Pro’s better cameras also take better night shots and allow Night mode on the telephoto lens.

Feature-wise, though, the regular iPhone 13 gets pretty much all the new camera tricks, because the A15 chip powers both models. The only software feature the regular 13 won’t get is ProRes video support.

Cinematic Mode

The big news in the iPhone 13 is Cinematic mode. This is like the Portrait mode we’ve been using for years, blurring the background to make the subject pop. But with video, things get even fancier.

Cinematic mode emulates the focus-pulling technique seen in countless Hollywood movies, where a camera operator shifts focus from something near to something far, or vice versa. Done well, it moves your eye around the frame without jarring the viewer.

Apple’s take is done computationally. The iPhone generates a depth map for every single frame, at 30 frames per second. This is a 3D map of the scene, so the iPhone knows how far away everything is. It then decides who or what should be in focus and uses the map to blur the rest of the scene in a natural-looking way.

This is impressive on several levels. First, there’s the sheer power required to compute a depth map for every single frame. Then, if the example movies are anything to go on, the result is way better than the current Portrait mode for still images, free of weird artifacts like blurred gaps around spectacle frames. And the actual focus-pulling action is also pretty great, mimicking a professional operator’s pull.

Also impressive is the AI that determines who or what is the current subject. Apple says the iPhone uses depth cues, but it also looks outside the current scene (presumably using the ultra-wide camera) to see if somebody is about to enter the frame. “It will change the language of cinema, in a very positive way,” said cinematographer Greig Fraser in Apple’s promotional video.

We’ll see how good this is in practice. One of the cues that trigger a focus shift is when the currently in-focus person looks toward another person. In the demo videos, these motions were quite exaggerated, perhaps for comedic effect, but maybe because the effect requires it. It could end up that we’re all shooting movies that look like the Dramatic Chipmunk.

No matter, because the most impressive part is still to come: because all of this focusing is computational, you can adjust it after the fact. While editing, you can choose the subjects and even change the virtual lens aperture to control the blur. (The rough sketches below show how that could work in code.)

The iPhone’s cameras continue to impress, and although the emphasis for the past few models has been on video, still photography keeps improving fast. But perhaps the best part of this new round of iPhones is that the regular model gets almost all the new features of the Pro, leaving out only those that rely on new hardware. It’s a pretty neat time to be an iPhone movie maker.
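Apple hasn’t said exactly how Cinematic mode is implemented, but the core idea is easy to sketch. The Swift snippet below is purely illustrative: the DepthMap type, the function names, and the linear depth-to-blur mapping are assumptions made for the example (real bokeh follows lens optics), not Apple’s actual code. It shows why the look stays editable: the depth map is stored alongside the video, so a frame can simply be re-rendered with a new focus depth or aperture.

    import Foundation

    // Hypothetical depth map: one camera-to-subject distance (in meters)
    // per pixel, stored row-major.
    struct DepthMap {
        let width: Int
        let height: Int
        let depths: [Float]
    }

    // Blur strength for one pixel: zero at the focus plane, growing with
    // distance from it, scaled by a virtual aperture. Changing `aperture`
    // or `focusDepth` and re-rendering is the "adjust it after the fact"
    // trick, since no real lens was ever defocused.
    func blurRadius(pixelDepth: Float, focusDepth: Float, aperture: Float) -> Float {
        abs(pixelDepth - focusDepth) * aperture
    }

    // Per-pixel blur radii for a whole frame.
    func blurMask(for map: DepthMap, focusDepth: Float, aperture: Float) -> [Float] {
        map.depths.map { blurRadius(pixelDepth: $0, focusDepth: focusDepth, aperture: aperture) }
    }

    // A simulated focus pull: slide the focus plane from a near distance
    // to a far one across `frames` frames, one focus depth per frame.
    func focusPull(from near: Float, to far: Float, frames: Int) -> [Float] {
        (0..<frames).map { frame in
            let t = Float(frame) / Float(max(frames - 1, 1))
            return near + (far - near) * t
        }
    }

A one-second pull at 30 frames per second is then just focusPull(from: 1.2, to: 4.0, frames: 30), producing the per-frame focus depths used to build each frame’s blur mask.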
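For the blur itself, Apple’s Core Image framework ships a stock filter, CIMaskedVariableBlur, that applies exactly this kind of mask-driven blur: brighter mask pixels receive more blur, darker ones stay sharp. A minimal sketch, assuming the depth map has already been converted into such a grayscale mask (that conversion is the hypothetical step here):

    import CoreImage

    // Blur a video frame according to a grayscale mask. White areas of
    // `depthMask` receive up to `maxRadius` of blur; black areas stay
    // sharp. CIMaskedVariableBlur is a built-in Core Image filter;
    // producing `depthMask` from a depth map is assumed to have
    // happened already.
    func applyDepthBlur(frame: CIImage, depthMask: CIImage, maxRadius: Double) -> CIImage? {
        guard let filter = CIFilter(name: "CIMaskedVariableBlur") else { return nil }
        filter.setValue(frame, forKey: kCIInputImageKey)
        filter.setValue(depthMask, forKey: "inputMask")
        filter.setValue(maxRadius, forKey: kCIInputRadiusKey)
        return filter.outputImage
    }

Because the mask, not the lens, carries the focus decision, re-running this with a different mask (a new focus depth, or a wider virtual aperture) changes the look of footage that has already been shot.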