iOS 26 brought a host of new updates to Apple’s mobile interface, and one app that has received a lot of attention is Apple Maps. A new Visited Places feature lets you quickly identify restaurants, stores and other recurring locations you return to frequently.
There’s also a new Preferred Routes feature that remembers your most traveled paths and can give you alerts about your daily commute. But one of the biggest changes to how users interact with Maps in iOS 26 is a natural language search feature built right into the app.
Powered by Apple Intelligence, natural language search uses AI to make all your Apple Maps searches easier. You don’t have to dig through the settings to enable this feature; it’s built directly into Apple Maps.
Users who have updated to iOS 26 will see a pop-up when launching Maps that says “Search the way you speak.” This opens the door to more dynamic and conversational searches, with prompts like “Where is the best Chinese restaurant near me that’s open late?” or “Find a cafe with free Wi-Fi on my way to work.”
Learn more: 5 of the Best Tips and Tricks for Using Apple CarPlay
Natural language search makes Maps more human
The Apple Maps app displayed on an iPhone. sdx15/Shutterstock
Apple Intelligence is one of the most notable forces behind the inner workings of iOS 26, and its enhanced natural language tools have been integrated into many first- and third-party apps. As for how the improvement directly affects Apple Maps, users will no longer be limited to typing stilted search queries, which was especially frustrating when trying to look things up while driving.
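Apple hasn’t said how Maps resolves these queries internally, but MapKit’s long-standing public search API already accepts free-form text, which hints at the general shape of a place search. Here’s a minimal, illustrative Swift sketch using MKLocalSearch’s naturalLanguageQuery; it’s an assumption-laden stand-in, not Apple’s implementation.

```swift
import MapKit

// Illustrative only: MapKit's public MKLocalSearch API accepts a
// free-form query string, hinting at how a conversational request
// might resolve into places. How Maps itself does this is not public.
func findPlaces(matching query: String,
                near region: MKCoordinateRegion) async throws -> [MKMapItem] {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = query   // e.g. "Chinese restaurant open late"
    request.region = region                // bias results toward the user's area
    let response = try await MKLocalSearch(request: request).start()
    return response.mapItems
}
```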
When combined with the Siri voice assistant, the latest version of Apple Maps should provide a seamless experience, from your initial search to the results provided. And thanks to AI, you can continue your Maps searches with contextual tracking. For example, once Maps returns results for “Where is the best Chinese restaurant near me that’s open late?” you can follow up with “Show me the quickest way to get there” or “Is it closing in less than 10 minutes?”
Apple’s foundation models are what make this new era of natural language tools possible. Acting as the behind-the-scenes intelligence for conversational searches, Apple’s AI is capable of understanding much more than the words you speak; it also understands the overall intention behind them. This allows software like Apple Maps to provide a more human experience when delivering search results, one that feels more like chatting with a friend or family member than with a smartphone.
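iOS 26 also opens these on-device models to developers through the Foundation Models framework. As a rough illustration of the session-based, context-carrying style of interaction described above (and emphatically not how Apple Maps is actually built), a sketch might look like this; the instructions string and prompts are made up for the example.

```swift
import FoundationModels

// Rough illustration using the Foundation Models framework that iOS 26
// exposes to developers. This is NOT Apple Maps' implementation; it only
// demonstrates how a stateful session carries context across turns.
func demoConversationalSearch() async throws {
    let session = LanguageModelSession(instructions:
        "You turn casual place-search requests into concise search intents.")

    // First turn: a conversational query.
    let first = try await session.respond(
        to: "Where is the best Chinese restaurant near me that's open late?")
    print(first.content)

    // Follow-up: the session keeps prior context, so "there" resolves
    // to the restaurant discussed in the previous turn.
    let followUp = try await session.respond(
        to: "Show me the quickest way to get there.")
    print(followUp.content)
}
```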
A more complete Apple Maps experience
A person driving on the highway. anyaberkut/Getty Images
Imagine a world where all of the new features and improvements in iOS 26 Maps work in unison: You might be driving down the highway after an exhausting day at the office and say, “Take me to my usual bar.” With the Visited Places feature, Apple Intelligence will know exactly which watering hole you’re talking about, and Preferred Routes will help you get there as quickly as possible.
But let’s say there’s an accident somewhere on your route and traffic starts to build up. With natural language search, you’ll be able to say “Take me to this location using secondary roads” and the Maps app should generate a new set of directions without you having to tap or swipe.
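MapKit’s public directions API gives a sense of what such a hands-free reroute could look like underneath. The sketch below is illustrative only: it assumes the highwayPreference option Apple added to MKDirections.Request in iOS 18, and Apple hasn’t disclosed how Maps translates the spoken phrase into routing options.

```swift
import MapKit

// Illustrative only: MapKit's public directions API can already honor a
// "use secondary roads" style request via highwayPreference (iOS 18+).
// How Maps maps the spoken phrase to this option isn't public.
func reroute(from start: MKMapItem,
             to destination: MKMapItem) async throws -> MKRoute? {
    let request = MKDirections.Request()
    request.source = start
    request.destination = destination
    request.transportType = .automobile
    request.highwayPreference = .avoid    // prefer secondary roads
    let response = try await MKDirections(request: request).calculate()
    return response.routes.first          // best available alternative
}
```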
Let’s face it: we’ve all reached past our steering wheel to type a word or phrase into the Maps search box. Not only does this new feature create a more cohesive and interactive Maps app for all iOS 26 users, but it also provides a safer driving experience. With Apple Intelligence behind the wheel of our iPhones, it’s one less reason to take your eyes off the road.
Read the original article on CNET.