WWDC first look: How Apple is improving its ecosystem


While the new user interface design Apple execs highlighted at this year’s Worldwide Developers Conference (WWDC) might have been a bit of an eye-candy distraction, enterprise users were not forgotten, with announcements unveiled that could make a difference in the business world.

Apple’s WWDC keynote speech was far from the disappointment many had anticipated, given the number of changes announced. In addition to that new user interface, which will show up across all Apple products and is called Liquid Glass (we’ll look at it in more depth elsewhere), there were also APIs designed to help developers work AI into their software. The last is a nod to the company’s slow efforts to make Apple Intelligence the showcase tech promised at last year’s WWDC.

But for now I’m focused on just some of the changes Apple introduced that should have the most impact on enterprise users. Many, but not all, of these changes will be similar across all Apple’s devices, whether that’s macOS, iOS, iPadOS, or another platform.

What’s a computer? The iPad

For me, the big news concerns the iPad, which gains noteworthy improvements that should transform how it’s used. Not only can you open multiple app windows at once, but Apple has built a clever and versatile windowing system that supports window resizing, persistent window positions, and more. (You can even see all your currently open windows in Exposé.)

These multitasking and windowing improvements should make it much easier to engage in professional work on an iPad and are the kind of improvements people have wanted for years. Changes on the iPad extend to the Files app, too, which gets important updates that will make it particularly useful.

Files for the rest of us

All Apple users, not just iPad owners, should see benefits from the changes coming to Files. On the iPad, they make it easier to find and work with the files you need, and add more sorting options to the Files window; and because Files runs across Apple’s platforms, at least some of those improvements should appear on other devices. Pro users will likely also appreciate the new ability to customize folders with symbols, emoji, and colors.

The transformation of automation

Apple rolled out improvements to Shortcuts, including new ways to create them and the ability to use Spotlight to find and invoke them. And because the Shortcuts you create sync across all your devices, you can build powerful productivity shortcuts on one device and use them on another, where they’re supported. This integration is, of course, also reflected in the Liquid Glass UI, which should create a real sense of familiarity as you move from Mac to tablet to iPhone.

Spotlight is evolving

Apple also announced big changes in Spotlight search on the Mac, including intelligent actions built around Shortcuts and Apple Intelligence, as well as Spotlight actions and quick keys; the latter automatically surface actions you might want to take with a Spotlight-selected item and let you commit those actions from within the search. This is powerful: you can find an item, change it, save it, and share it, all without leaving Spotlight. The software can also learn from what you do, offering personalized suggestions for actions you might want to take.

“During a search, all results — including files, folders, events, apps, messages, and more — are now listed together and ranked intelligently based on relevance to the user,” Apple said. “New filtering options rapidly narrow searches to exactly what a user is looking for, like PDFs or Mail messages. Spotlight can also surface results for documents stored on third-party cloud drives. And when a user doesn’t know exactly what they’re searching for, Spotlight’s new browse views make it easy to scan through their apps, files, clipboard history, and more.”

You can also take hundreds of actions from directly inside Spotlight, such as sending an email or writing a note. While many of these features are being promoted as for the Mac, Spotlight operates across Apple’s platforms, so it will be interesting to see where else these features might show up as beta testing begins and future support rolls out.

App Intents

That brings me to App Intents. The framework has been around for a while, but it will now let developers make actions in their apps available across the system via Shortcuts or Spotlight. If nothing else, this should make it easier to get on with work itself, and as those improvements percolate across Apple’s ecosystem, the hardest task will probably become simply remembering which actions you can take.

App Intents also gain support for visual intelligence, which means apps will be able to provide and make use of visual search results.
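To make that concrete, here is a minimal sketch of what exposing an app action through the App Intents framework looks like. The intent itself (a hypothetical "file an expense" action) is invented for illustration; once declared like this, the system can surface it in Shortcuts and Spotlight. This is Apple-platform code and won’t compile outside an Apple SDK.

```swift
import AppIntents

// Hypothetical example: an enterprise app exposing a "File Expense" action.
// Declaring an AppIntent is what makes the action discoverable system-wide
// (Shortcuts, Spotlight) without the user opening the app first.
struct FileExpenseIntent: AppIntent {
    static var title: LocalizedStringResource = "File Expense"

    @Parameter(title: "Amount")
    var amount: Double

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would call into its own business logic here.
        return .result(dialog: "Filed an expense of \(amount)")
    }
}
```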

Etsy CTO Rafe Colburn explained what this means in a statement provided by Apple: “The ability to meet shoppers right on their iPhone with visual intelligence is a meaningful unlock and makes it easier than ever for buyers to quickly discover exactly what they’re looking for while directly supporting small businesses,” he said.

Vision Pro

Enterprises are already using Vision Pro devices, and Apple talked up the spatial computing headset during its WWDC keynote. Given the cost of these devices ($3,499), it is a welcome change that it’s now much easier to share them in workgroups; each user can save their own visionOS profile to their iPhone and then set up the headset to suit. Apple also unveiled a new Vision Pro tool that lets you save a piece of content as Protected Content. (There are new APIs app developers can use for this, too.) The intention is to ensure content shared on a Vision Pro doesn’t get shared outside your business.

There’s also support for a new Vision Pro peripheral, one that doesn’t come from Apple. Logitech Muse is a pen-like control device for spatial computing that lets you annotate and draw when working on projects with others. Look to Scroll lets users explore apps and websites using just their eyes, and they can customize the scroll speed to their liking. Not surprisingly, developers will be able to integrate Look to Scroll into their visionOS apps.

Apple Intelligence was everywhere

Reading between the lines, you can see that artificial intelligence is actually everywhere in this year’s WWDC announcements. One of the biggest announcements does hint at the contextual intelligence Apple has previously told us about: Visual Intelligence now understands what is on screen, lets you ask questions about what you are looking at, and allows you to search for specific items or add events to your calendar. Apple has, of course, made it possible for developers to build support for this feature within their apps. 

“Last year, we took the first steps on a journey to bring users intelligence that’s helpful, relevant, easy to use, and right where users need it, all while protecting their privacy. Now, the models that power Apple Intelligence are becoming more capable and efficient, and we’re integrating features in even more places across each of our operating systems,” said Craig Federighi, Apple’s senior vice president of software engineering.

What this means: developers can now use Apple Intelligence APIs to build intelligence features into their own apps, thanks to the Foundation Models framework.
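Based on what Apple showed at WWDC, using the on-device model from an app looks roughly like the sketch below. This is an assumption drawn from Apple’s announcement, not the confirmed shipping API; exact type and method names may differ in the final SDK, and the code requires an Apple platform to build.

```swift
import FoundationModels

// Sketch of calling Apple's on-device foundation model, as presented at WWDC.
// Assumed API shape: a session object that accepts a text prompt and
// returns a generated response, all processed on device.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Summarize in one sentence: \(text)")
    return response.content
}
```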

Developers can also look forward to Xcode 26, which connects large language models (LLMs) directly to the coding experience, thanks to built-in support for ChatGPT. (Developers can also use API keys from other providers, or run local models, to power these features.)

Apple Intelligence features will be coming to eight more languages by the end of the year: Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Chinese (traditional), and Vietnamese. 

Live Translation

Live Translation is profound. It’s built into Messages, FaceTime, and the Phone app, and runs entirely on device. In use, it can automatically translate messages between languages, offer translated live captions on FaceTime calls, or speak a translation aloud during a phone call. It’s a fantastic feature I can’t wait to try, and I expect to see it used in very interesting ways.

Apple also confirmed a new system to help promote accessibility in apps. Accessibility Nutrition Labels on App Store product pages will let developers tell customers which accessibility features their apps support.

Summing up: even if this was a lean year for flashy announcements, Apple’s ecosystem is now so vast that the company can still introduce changes that delight customers while also helping them get things done.

Apple rolled out the first betas of the new operating systems on Monday, with public betas following over the next few weeks. So you will be able to see for yourself which improvements make the most difference to you. Follow me through one of the networks below to learn more about these changes as I explore them.

You can follow me on social media! Join me on Bluesky, LinkedIn, and Mastodon.