Apple's Liquid Glass Has Nothing to Do with AR

Many think Apple has lost its mind with liquid glass. The consensus seems to be that this is either terrible UI design or some kind of elaborate preparation for augmented reality. Both takes are wrong.

I've been watching the reaction to liquid glass, and it's obvious that people are reacting to Apple's marketing rather than reading their actual documentation. If you ignore the hype and look at what Apple's design team actually built, liquid glass makes perfect sense. It's not revolutionary, it's not for AR, and it's definitely not applied everywhere. It's a targeted solution to a specific design constraint that every content app faces.

The Real Problem Apple Solved

Here's the engineering constraint that led to liquid glass: You're building an app that displays content - videos, photos, documents. Users spend most of their time consuming that content, not interacting with controls. But you still need some controls available instantly. The question becomes: how do you minimize visual interference while maintaining usability?

Before liquid glass, the least intrusive option was backdrop blur with dimming overlays. Apple asked whether they could push transparency even further. The answer required working backwards from the basic perceptual problem: how do you make something visible when it's completely transparent?

You can understand their solution with a simple thought experiment: Water is transparent, but you can see it being poured. How? Edge distortion and refraction. Your brain fills in the complete 3D shape from minimal visual cues. Apple shows this exact example in their presentation video - it's not revolutionary. Probably a designer was just taking a shower and noticed it. It's literally a shower thought: "oh yeah, duh, of course that's how we can have something that's almost completely transparent."

They applied this principle to interface design. Instead of opacity and blur, use refraction effects to indicate interactive elements with even less visual intrusion.

People Are Ignoring Apple's Actual Guidelines

Apple has a 20-minute video where the designers explain (once you boil away the marketing speak) that liquid glass should only be used when:

  1. The control floats on top of content the user is primarily consuming - video, photos, documents
  2. The control is secondary and used infrequently, not a primary interface element
  3. Users learn control locations through repeated use, so spatial memory can compensate for reduced contrast

That's it. Apple isn't suggesting you redesign your entire interface.

This Isn't About AR

Because liquid glass shares visual DNA with visionOS, people assume this is preparation for augmented reality. This is completely wrong.

Apple isn't suggesting that all AR controls should be transparent. They're solving a practical problem for phones and tablets. The visual similarity to visionOS is coincidental - both use transparency effects, but for entirely different reasons and constraints.

Addressing Real User Complaints

Looking at actual criticism, most complaints assume liquid glass will be applied broadly. But Apple's guidelines restrict it to specific scenarios:

"Everything looks disabled and gray" - Liquid glass isn't for primary interface elements, only secondary controls in content-focused apps.

"I can't read anything at default sizes" - Apple provides opacity controls and limits this to scenarios where users learn control locations through repeated use.

"This is Windows Vista all over again" - Vista tried to make everything translucent. Apple suggests using liquid glass only for infrequent controls layered on top of content. Think media consumption apps.

"Accessibility nightmare" - Apple maintains that liquid glass should only be used where spatial memory can compensate for reduced contrast.

The pattern is clear: critics assume broad application when Apple documented narrow constraints.

They have thought about this. All the knobs you could possibly want to tone down the effect for the obvious corner cases are fully implemented.

Why Apple Built This as a System Primitive

As a designer who has faced these same design constraints for controls in content apps, I've wished there was something better than what we had. I've implemented progressive gaussian blur effects with gradient dimming layers (the same approach Apple uses in their native apps), and even that was expensive. I certainly could not justify anything more sophisticated. Making this work properly in web apps is a big hack, and it requires a lot more computation.
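For concreteness, here's a rough sketch of the math behind that progressive approach: blur radius and dimming opacity both ramp up along a gradient as you approach the control region. The function names and values are illustrative, not any framework's API.

```typescript
// Progressive blur with gradient dimming: both effects ramp up as you
// approach the control region. `position` is a normalized distance from
// the content edge (0) to the control edge (1). Values are illustrative.
const clamp01 = (t: number): number => Math.min(Math.max(t, 0), 1);

function blurRadius(position: number, maxRadius = 20): number {
  // Linear ramp in px; real implementations often use an eased curve.
  return maxRadius * clamp01(position);
}

function dimOpacity(position: number, maxOpacity = 0.35): number {
  return maxOpacity * clamp01(position);
}

console.log(blurRadius(0.5)); // 10
```

In CSS terms, functions like these drive a stack of thin layers, each with its own `backdrop-filter: blur(...)` and a semi-transparent dim background - which is exactly why the web version gets expensive.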

Which brings me to why it's genuinely valuable that Apple built this as a system primitive. There's no way an individual engineer could ever justify liquid glass development to their boss. This is a "nice to have" - the kind of polish that improves the experience but would never make it through a cost-benefit analysis for a single app.

What makes liquid glass valuable is that it's an elegant solution that automatically provides minimal viable contrast against any background - even dynamic, moving content. Instead of writing conditional logic to sample colors and adjust contrast (which would be a nightmare), you get a system that handles all scenarios.
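To see why doing this by hand would be a nightmare, here's a sketch of the manual alternative: sample the pixels behind a control, compute a WCAG-style contrast ratio, and flip the foreground color accordingly. For moving video you'd have to re-run this every frame, for every control. (The helper names are mine, not any framework's.)

```typescript
type RGB = { r: number; g: number; b: number }; // 0-255 channels

// WCAG relative luminance of an sRGB color.
function luminance({ r, g, b }: RGB): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// WCAG contrast ratio between two colors (ranges from 1 to 21).
function contrastRatio(a: RGB, b: RGB): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// The manual approach: given a sampled backdrop color, pick whichever
// foreground scores the higher contrast ratio against it.
function pickForeground(backdropSample: RGB): RGB {
  const white: RGB = { r: 255, g: 255, b: 255 };
  const black: RGB = { r: 0, g: 0, b: 0 };
  return contrastRatio(white, backdropSample) >=
    contrastRatio(black, backdropSample)
    ? white
    : black;
}

console.log(pickForeground({ r: 10, g: 10, b: 10 })); // white on a dark frame
```

And this sketch only handles a single uniform sample - real content has gradients, motion, and edges behind every control, which is the conditional explosion a system primitive spares you from.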

This adds another option to the designer's toolkit. Before this, backdrop blur with dimming was the least intrusive option available. Now there's one step further - a "barely there" option that's useful to have.

This Continues a Clear Progression

Liquid glass isn't revolutionary. It continues a progression that's really about adding granularity to design options.

Think of it as a continuum from very strong separation of controls to no controls at all. You start with separate toolbars that have clear borders - that's saying "this area of the screen is totally different from the content below." Then you move through various steps, each one adding another option between "showing nothing" and "showing controls on top of content":

  1. Heavy 3D buttons with borders → Flat design with clear boundaries
  2. Prominent toolbars → Contextual controls
  3. Opaque overlays → Translucent blur with dimming
  4. Translucent blur → Refraction-based transparency

Each step gives designers more precision. We had a rough approximation before: content or controls. Now we're refining that approximation with more intermediate steps between "A or B" and just "A" (pure content).

This means designers can be more precise about the level of visual intrusion. Liquid glass doesn't replace the other options - it adds one more level of granular control to the toolkit.

If you actually watch Apple's 20-minute design presentation, it's them saying over and over: "Yes, it's very easy to think of obvious ways this won't work. You can think of corner cases where this is a bad idea. Our answer is: don't use it for those cases."

Of course there are scenarios where liquid glass would be terrible. Apple explicitly didn't design it for those scenarios. Their position amounts to: these are the guidelines, and every scenario where things go poorly sits outside the fence those guidelines set up. The actual area of this fenced-in zone is very small. Apple isn't suggesting you redesign everything with liquid glass - they're giving you one more option for very specific design problems.

Why the Reaction Is Wrong

Most criticism stems from three misconceptions:

  1. Assuming broad application - People think Apple wants liquid glass everywhere, when they documented very specific constraints
  2. The AR connection - Visual similarity to visionOS doesn't mean this is AR preparation
  3. Ignoring usage patterns - Most phone use is information consumption, not constant interaction

If you read Apple's actual guidelines instead of reacting to marketing, liquid glass makes sense. It's an engineering solution to a real design constraint, with clear limitations and specific use cases.

Summary

The only surprising thing about the reaction is that people are surprised. Apple built a specialized tool for a common design problem and documented exactly when to use it. The criticism mostly comes from people who didn't read the documentation.
