In iOS 15, iPadOS 15, and macOS Monterey, Live Text is everywhere: in screenshots, in your Photos library, and even in text-input areas. It recognizes words in any image and turns them into regular text that you can select, copy, share, look up, and even translate. Live Text also brings Apple's "data detectors" to the party, so you can tap a telephone number in a photo of a store sign, for example, then call it with the Phone app.

If you've ever found yourself tapping a word in a paperback to look up its meaning, long-pressing a link printed in a magazine to see a web preview, or tapping a place's name to see it on a map, then you're going to love Live Text. It makes the real world searchable, editable, and more usable.
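Under the hood, this kind of on-device OCR has been available to developers since iOS 13 through the Vision framework. As a rough sketch (the helper function and its name are mine, not Apple's), recognizing the text in an image looks something like this:

```swift
import Vision
import UIKit

// Recognize text in a UIImage using the Vision framework.
// Live Text itself is a system feature, but VNRecognizeTextRequest
// exposes the same kind of on-device text recognition to apps.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Take the top candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        // Errors are swallowed here for brevity; a real app would report them.
        try? handler.perform([request])
    }
}
```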

It Just Works

In the new Mac, iPad, and iPhone operating systems, Live Text is just there. There's no special mode; Apple has added Live Text anywhere it makes sense. A new button appears every time you take a screenshot, which lets you highlight all the text in the image, and that's the most complex it ever gets.

In most cases, the text in a photo is simply text. Say you took a photo of a product label in a store to remind yourself to check on it later. When you're looking at that picture in the Photos app, you just swipe your finger across any text to select it. All images of text are now also text, automatically. From there you can share it, copy it, use the new built-in translation feature, call the number, open a link, see an address on a map, and more.
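Those tappable phone numbers, links, and addresses come from data detectors, which Foundation exposes publicly as NSDataDetector. A minimal sketch of running detectors over a recognized string might look like this (the function and the printed actions are illustrative, not how Live Text is actually wired up):

```swift
import Foundation

// Scan a recognized string for phone numbers, links, and addresses,
// the same categories Live Text turns into tappable actions.
func detectActions(in recognizedText: String) {
    let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link, .address]
    guard let detector = try? NSDataDetector(types: types.rawValue) else { return }

    let range = NSRange(recognizedText.startIndex..., in: recognizedText)
    detector.enumerateMatches(in: recognizedText, options: [], range: range) { match, _, _ in
        guard let match = match else { return }
        switch match.resultType {
        case .phoneNumber:
            print("Call:", match.phoneNumber ?? "")
        case .link:
            print("Open:", match.url?.absoluteString ?? "")
        case .address:
            print("Map:", match.addressComponents ?? [:])
        default:
            break
        }
    }
}
```

Feed it the output of the earlier recognition sketch and a photographed store sign becomes, in effect, a list of actions.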

Search Anything

When you first install iOS 15, it scans and processes your photo library, recognizing any text it finds. This has one very powerful implication: when you search for something in Spotlight (the system-wide search tool), the results include text from your photos. For instance, you can find receipts you photographed years ago just by searching for any text that appears on them. Trying to remember where you ate that delicious rice dish on holiday in the Costa Brava? If you photographed the menu, you'll be able to find it easily. Or how about building a recipe book without even trying? Every time you see a recipe in a magazine or a cookbook, just take a picture, and you can find it any time.

Live Text fundamentally changes how you interact with the world. Suddenly, every word on any piece of paper, storefront, screenshot, or road sign becomes just as usable as text in a notes app. There are already bits and pieces of this in computing: Google's Translate app has long been able to translate text through the camera, and iOS has been able to scan and OCR documents in the Notes app for a while. But now that Apple has baked Live Text into its devices, there's no longer any distinction between kinds of text. It's all the same. Even those screenshots of text people post on Twitter are now as usable as if the text had been posted properly.
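Apple handles that indexing for you in Photos, but the mechanism is essentially what Core Spotlight offers third-party apps. As a hedged sketch, an app that ran its own OCR could make the results searchable in much the same way (the identifier and domain strings here are made up):

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// Index recognized photo text so it appears in Spotlight searches.
// Photos does this automatically; this shows the public equivalent.
func indexPhotoText(_ recognizedText: String, photoIdentifier: String) {
    let attributes = CSSearchableItemAttributeSet(contentType: .image)
    attributes.title = "Photo"
    attributes.contentDescription = recognizedText  // the searchable OCR text

    let item = CSSearchableItem(
        uniqueIdentifier: photoIdentifier,
        domainIdentifier: "com.example.photos",     // hypothetical domain
        attributeSet: attributes
    )
    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error = error {
            print("Indexing failed:", error)
        }
    }
}
```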

AR Lite

Live Text is another example of Apple diving into augmented reality. We've seen how Apple is all in on AR, from the too-long demos in its various keynotes over the years to the neat AR models of new products that let you see how the new iMac would look on your desk. Apple has also added plenty of audio AR features, reading out messages and alerts, or giving you directions via AirPods.

It's an open secret that all of this is practice for Apple's eventual AR glasses, and Live Text will likely be a big part of that product. Not only will your glasses be able to read the signs around you for better awareness, but they'll also be able to look up information as you read it.

For now, though, we're all benefiting from Apple's AR experiments. Live Text is just fantastic. I've only been using it for a couple of days, and it already feels natural. I can't wait to see what app developers do with it.