Cook Puts Another Apple in the Oven

Apple’s WWDC 2025 showcased a major UI overhaul, “Liquid Glass,” a unified design language across iOS, iPadOS, macOS, and visionOS. While the Vision Pro’s visionOS received positive updates, including enhanced widgets, the core focus was on a more dynamic and layered user experience. iPadOS saw significant improvements to become a true productivity tool, catching up to macOS. However, the expected advancements in Apple Intelligence were underwhelming, raising questions about Apple’s AI strategy.

Apple’s stock has taken a beating, dropping roughly 20% this year, making it the worst performer among the “Magnificent Seven” tech giants. The Street is abuzz: When will Cupertino regain its AI footing?

The answer, as of today, isn’t exactly “now.”

At the recent Worldwide Developers Conference (WWDC) 2025, Apple unveiled updates across its entire ecosystem: iOS, iPadOS, macOS, visionOS, and tvOS.

One notable shift: the OS naming convention has been revamped. iOS, for instance, skips iOS 19 entirely and rebrands as “iOS 26.” With the year-based suffix, Apple signals a fresh start, perhaps hinting at a new era.

**Another major change? A complete UI overhaul, the first in 12 years. It’s dubbed “Liquid Glass,” a unified design language intended to permeate all platforms.**

And AI?

Well, let’s just say that iOS 26’s “Live Translation” demo, given its timing and functionality, would likely draw a few knowing chuckles from a certain Android rival.

The stark reality is that Apple failed to showcase a compelling vision for Apple Intelligence at this WWDC. Some capabilities touted at last year’s conference remain unshipped.

So why, in the middle of a generative AI arms race, has Tim Cook steered the company toward user interface innovation?

01 Liquid Glass: What Is It, Exactly?

The “Liquid Glass” concept dominated the conference. Officially, it’s described as “a new, semi-transparent material that reflects and refracts the environment, further highlighting content and bringing elements like controls, navigation, app icons, and widgets to life.”

Essentially, it’s about adding a sense of physicality, creating a more transparent, multi-layered visual experience.


The introduction of Liquid Glass in iOS 26 brings two immediate changes.

First, dynamic lighting effects. Liquid Glass can react to content and context, generating different levels of sheen and real-time light rendering to create a more lively interface compared to its static predecessors.

Second, dynamic interaction elements. Interface components shift based on user actions and device status. For example, when accessing extra options, elements morph like real glass, departing from the static designs of the past.
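Apple didn’t detail the developer-facing API on stage, but SwiftUI’s existing material system gives a feel for this kind of layered translucency. Below is a minimal sketch using the long-standing `.ultraThinMaterial` background; it only approximates the look, and the view names are invented for illustration, not Liquid Glass’s actual API.

```swift
import SwiftUI

// A minimal sketch of a translucent, layered control surface using
// SwiftUI's existing material API. This approximates the layered look
// with pre-WWDC-2025 APIs; it is not the Liquid Glass API itself.
struct GlassCard: View {
    var body: some View {
        ZStack {
            // Content layer: whatever sits "behind the glass".
            LinearGradient(colors: [.blue, .purple],
                           startPoint: .top, endPoint: .bottom)
                .ignoresSafeArea()

            // Control layer: a translucent surface that lets the
            // content underneath show through rather than covering it.
            Text("Now Playing")
                .font(.headline)
                .padding(24)
                .background(.ultraThinMaterial,
                            in: RoundedRectangle(cornerRadius: 20))
        }
    }
}
```

The difference Apple is promising is that the new material reacts to what is behind and around it in real time, where `.ultraThinMaterial` applies a mostly static blur.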

Apple’s last major UI overhaul, launched with iOS 7 in 2013, was defined by its minimalism and brighter color schemes. Going back further, we see the “skeuomorphic” design of the early iOS iterations.

**Does iOS 26 mark a greater leap than the one made with iOS 7?**

I’d argue no. iOS 7 was a direct response to user needs. However, based on the presentation at today’s WWDC, that doesn’t seem to be the driving force behind iOS 26.

Consider the shift away from skeuomorphism. That design was a product of a time when touchscreens were new. It was meant to help people understand the interaction logic of these devices. This generated the oft-quoted “even a five-year-old can use an iPhone” simplicity.

Those needs were largely obsolete by 2013. Content-driven applications were booming, and users wanted as much content displayed on the screen as possible. Apple responded by making the background translucent to emphasize the content and removed visual distractions like gradients and shadows.

Have the fundamental requirements of the graphical interface inherently changed this time? The answer is obviously no.

**Why, then, is Apple doing this? I’d argue there are a couple of reasons:**

First, it’s about matching the hardware. The A18 chip in the iPhone 16 series has a powerful neural engine capable of supporting Liquid Glass’s real-time lighting and dynamics. A richer, more layered UI does improve the user experience.

Second, it’s about “building a unified cross-platform visual experience and constructing the interaction system for the spatial computing era.” Judging by the conference, Apple is building the future around Liquid Glass.

02 Spatial Computing: The Narrative for the Next Decade

Two years ago, Apple unveiled its Vision Pro headset at WWDC, effectively kicking off the age of “spatial computing.”

Even today the Vision Pro maintains an advantage over its competitors, yet its sales, according to numerous third-party research firms, total only about 400,000–500,000 units. By Apple’s standards, that is a modest figure.

However, indications suggest Apple intends to continue developing head-mounted displays. This includes iterations on the Vision Pro and, potentially, a more affordable headset designed for mass-market appeal.

This creates an inevitable tension: maintaining a consistent experience across an ecosystem of products demands unified integration, yet a two-dimensional screen and a three-dimensional space differ so fundamentally that their user interface methods are deeply divided.

To resolve this, Apple must clear two hurdles. The first is the disparity in the underlying logic of the different visual languages. With iOS 26, Apple seems to have begun by ensuring consistent dynamic lighting and material qualities.

**The second hurdle is far harder: abstracting a uniform interaction logic.**

During the presentation, Apple briefly described Liquid Glass as a distinct functional layer that, in limited cases, sits on top of existing apps.


While details were scarce at the conference, my view is that Apple is potentially translating the concept of “spatial folding” in a three-dimensional space to “layered nesting” within a two-dimensional interface.

Undoubtedly, this is complex. It requires a new intermediate layer for dimensional conversion and a complete overhaul of existing input methods. This isn’t a project that can be completed within the iOS 26 cycle.
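To make that speculation concrete, here is a purely illustrative SwiftUI sketch of “layered nesting”: depth is faked on a flat screen by scaling and offsetting stacked panels. This is my reading of the concept, not anything Apple has documented.

```swift
import SwiftUI

// Purely illustrative: "layered nesting" as a 2D stand-in for spatial
// depth. Each panel is scaled down and pushed up according to how deep
// it sits, so a flat screen hints at the z-axis a headset renders for real.
struct NestedLayers: View {
    private let titles = ["Back", "Middle", "Front"]

    var body: some View {
        ZStack {
            // Index 0 draws first, i.e. deepest in the stack.
            ForEach(titles.indices, id: \.self) { index in
                let depth = Double(titles.count - 1 - index)
                Text(titles[index])
                    .frame(width: 220, height: 120)
                    .background(.thinMaterial,
                                in: RoundedRectangle(cornerRadius: 16))
                    .scaleEffect(1.0 - 0.1 * depth) // farther = smaller
                    .offset(y: -40 * depth)         // farther = higher
            }
        }
    }
}
```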

However, if Apple can deliver an integrated cross-platform experience spanning 2D and 3D interfaces, then even while still chasing the AI paradigm, it can tell a solid story of innovation for the next decade.

visionOS 26, meanwhile, is fairly well done.

I was impressed by the widget feature: using the Vision Pro, users can place virtual clock faces and photo albums on their interior walls.

visionOS leverages spatial anchoring to precisely remember each widget’s layout and settings, so every virtual component reappears in its original location when the user puts the headset back on.
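Apple hasn’t published the widget API behind this, but RealityKit’s existing anchoring primitives hint at the mechanics. The sketch below pins a hypothetical panel to a wall plane; the persistent, cross-session re-placement described on stage would additionally require persisted world anchors, which this sketch omits.

```swift
import SwiftUI
import RealityKit

// A hedged sketch of how a wall-pinned widget could be expressed with
// RealityKit's existing anchoring primitives. The entity names are
// illustrative; this is not Apple's actual widget implementation.
struct WallWidgetView: View {
    var body: some View {
        RealityView { content in
            // Ask the system for a vertical plane classified as a wall,
            // at least 0.5 m x 0.5 m; RealityKit resolves this to a
            // real surface at runtime.
            let wallAnchor = AnchorEntity(
                .plane(.vertical,
                       classification: .wall,
                       minimumBounds: [0.5, 0.5])
            )

            // Hypothetical widget content: a flat 40 cm x 25 cm panel.
            let panel = ModelEntity(
                mesh: .generatePlane(width: 0.4, height: 0.25),
                materials: [SimpleMaterial(color: .white, isMetallic: false)]
            )
            wallAnchor.addChild(panel)
            content.add(wallAnchor)
        }
    }
}
```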


Its spatial computing capability still sits far ahead of the competition. For Cook, this could be the product he actually wants to see shine.

03 iPadOS: Compensating for Lost Ground

Apple’s “cross-dimensional design” is exciting, but it’s also a long-term promise. The immediate iOS 26 upgrades may not be enough to persuade current users to update their software and hardware.

Apple seems to have recognized this as well. Hence, we see the most significant update to iPadOS in the system’s history.

Here are a few immediate changes to take in:

iPadOS 26 now includes window control buttons like those in macOS, as well as the ability to dynamically resize windows via the lower-right corner.


iPadOS 26 also brings over the macOS top menu bar. Users can reach the active app’s commands by swiping down from the top of the display or moving the cursor there. Notably, a search function in this menu bar rapidly locates specific features or relevant tips within applications.
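For developers, this most likely rides on SwiftUI’s long-standing Commands API, which has declared menu-bar commands since iOS 14 and macOS 11; the new iPadOS menu bar presumably surfaces the same declarations. A minimal sketch, with the app and menu names invented:

```swift
import SwiftUI

// A minimal sketch of declaring app commands with SwiftUI's Commands
// API. On the Mac these appear in the menu bar; on iPadOS 26 the new
// pull-down menu bar presumably surfaces them as well.
@main
struct SketchPadApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Canvas goes here")
        }
        .commands {
            CommandMenu("Canvas") {
                Button("Zoom to Fit") {
                    // Handle the command here.
                }
                .keyboardShortcut("0", modifiers: .command)
            }
        }
    }
}
```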

Even file management has changed: file-opening behavior and folder styles can now be customized, and those settings synchronize across devices.


Multitasking, Stage Manager, and even the appearance of the pointer have all been “fully Mac-ified.” With the update to iPadOS 26, the iPad finally looks like a true productivity tool.

This raises a question: will the MacBook Air be threatened by these capabilities moving to the iPad?

Apple may have taken this into consideration. macOS 26 Tahoe has seen its own “ultimate evolution”: **the integration of Shortcuts, Spotlight, and Apple Intelligence.**

Users can now quickly launch functions using natural-language phrases, and the system can even reach specific actions inside third-party apps, such as “remove image background” in a given app.
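Mechanically, this is the sort of thing the existing App Intents framework already enables: an app declares an action once, and Spotlight and Shortcuts can surface and invoke it. A hedged sketch, with the intent name and behavior invented for illustration:

```swift
import AppIntents

// A hypothetical App Intent exposing an in-app action to Spotlight and
// Shortcuts. The name and behavior are invented; the framework and
// protocol are real (iOS 16+).
struct RemoveBackgroundIntent: AppIntent {
    static var title: LocalizedStringResource = "Remove Image Background"
    static var description = IntentDescription(
        "Removes the background from the most recently opened image."
    )

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The real work (subject segmentation, masking) would happen here.
        return .result(dialog: "Background removed.")
    }
}
```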

In short, this year’s updates to iPadOS and macOS are truly impressive. But a fair question looms: Were these functions impossible before?

Consider the iPad Pro 2020 launch. Cook stated it was a tablet capable of replacing the PC. Only now, though, are we seeing a true “complete iPad.” From a technological standpoint, multitasking and Mac-like Stage Manager posed no real hardship.

Amid external questioning of Apple’s AI prowess, Cook has finally abandoned his “perfection-first” approach.
**That said, if Apple fails again to deliver on Apple Intelligence capabilities, what will they have to bring to the table at next year’s developers’ conference?**

Original article, Author: Tobias. If you wish to reprint this article, please indicate the source: https://aicnbc.com/2130.html
