# 57 : Ultra-Wide Virtual Display is Great, But We Expect More


## Weekly Comment

The release of the visionOS 2.2 beta finally brings the long-awaited “wide” and “ultra-wide” virtual display modes to Apple Vision Pro. In ultra-wide mode, users are presented with an expansive display space that extends far beyond the limits of any physical monitor. Although Apple Vision Pro’s hardware resolution is constrained, foveated rendering makes the virtual display not only sharper than before but also capable of delivering a visual experience comparable to an 8K display.

The introduction of this feature has delighted many Apple Vision Pro users, who have eagerly retrieved their shelved devices to immerse themselves in the new ultra-wide display experience. From my personal experience, the ultra-wide virtual display creates a unique sense of immersion for development work, allowing me to maintain focus for extended periods — until the device’s weight interrupts this immersive experience.

The ultra-wide mode is likely to extend users’ daily usage time, and it may even give rise to a new category of software: macOS desktop applications optimized specifically for Apple Vision Pro’s ultra-wide display. From a certain perspective, the success of the virtual display feature seems somewhat “counterintuitive”: Apple originally envisioned Apple Vision Pro as a standalone spatial computing device, independent of other hardware. Nevertheless, the enhanced virtual display is undoubtedly a significant benefit for existing users and demonstrates Apple’s commitment to continuously improving the visionOS ecosystem.

The introduction of the ultra-wide virtual display proves that much of Apple Vision Pro’s hardware potential remains untapped. As Apple exposes more functionality through new APIs, developers may create experiences for this device that we cannot yet imagine. While these improvements may not translate into significant sales growth right away, the possibilities they open up for spatial computing are enough to demonstrate the historical value of this first-generation product.

Notably, the virtual display functionality doesn’t rely on some of Apple Vision Pro’s “premium” components, parts that add cost and weight while offering limited utility. A more affordable model that retains the core computing capabilities would undoubtedly attract potential users who are still on the fence. We look forward to Apple Vision Pro unleashing more of its exciting potential in the future.

## Originals

### Mastering Data Tracking and Notifications in Core Data and SwiftData

Fatbobman

Core Data and SwiftData, as powerful persistence frameworks in the Apple ecosystem, not only provide declarative data listening tools like @FetchRequest and @Query, but also have a complete set of data tracking and notification mechanisms built-in. Understanding and mastering these mechanisms is crucial for building robust data-driven applications. This article will take you through multi-layered solutions—from simple custom notifications to the powerful Persistent History Tracking and SwiftData History—to help you handle various complex data synchronization scenarios.
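
To make the idea concrete, here is a minimal sketch, not the article’s code, of opting in to Persistent History Tracking in Core Data; the container name `AppModel` is hypothetical, and the notification handler is only outlined:

```swift
import CoreData

// A hypothetical Core Data stack; "AppModel" stands in for your model file.
let container = NSPersistentContainer(name: "AppModel")

if let description = container.persistentStoreDescriptions.first {
    // Opt in to Persistent History Tracking.
    description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
    // Ask Core Data to post a notification whenever the store changes,
    // including changes made by other processes such as app extensions.
    description.setOption(true as NSNumber,
                          forKey: NSPersistentStoreRemoteChangeNotificationPostOptionKey)
}

container.loadPersistentStores { _, error in
    if let error { fatalError("Failed to load store: \(error)") }
}

// Observe remote change notifications and process history transactions.
let token = NotificationCenter.default.addObserver(
    forName: .NSPersistentStoreRemoteChange,
    object: container.persistentStoreCoordinator,
    queue: nil
) { _ in
    // Fetch NSPersistentHistoryTransaction records here, merge the relevant
    // changes into your contexts, then delete history you no longer need.
}
```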

## Recent Selections

### Swift Ownership and ~Copyable

Wei Wang (Onevcat)

Rust’s ownership system ensures memory safety through strict rules, making it widely used in systems programming and high-performance computing. The Swift team has also recognized the importance of ownership, introducing related features in version 5.9 and refining them further in Swift 6.0. In this article, Wei Wang (Onevcat) delves into Swift’s ownership system and keywords like ~Copyable. He points out that while understanding ~Copyable isn’t essential for all Swift developers, it can significantly enhance code stability and efficiency in scenarios involving resource exclusivity and lifecycle management.
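
For readers who haven’t tried the feature yet, here is a minimal sketch of my own (assuming Swift 5.9 or later; not Onevcat’s code): a non-copyable type modeling an exclusively owned resource.

```swift
import Foundation

// A non-copyable wrapper: the compiler guarantees exactly one owner,
// so deinit (and therefore close) runs exactly once.
struct FileDescriptor: ~Copyable {
    let raw: Int32
    init(raw: Int32) { self.raw = raw }
    deinit { close(raw) }
}

// A `consuming` parameter takes ownership; the descriptor is closed
// when this function returns.
func readAll(from fd: consuming FileDescriptor) {
    print("reading from descriptor \(fd.raw)")
}

func demo() {
    let fd = FileDescriptor(raw: open("/tmp/example.txt", O_RDONLY))
    readAll(from: fd)  // ownership moves into readAll
    // print(fd.raw)   // compile-time error: 'fd' used after consume
}
```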

### Developing in Swift with VS Code Dev Containers

Natan Rolnik

Dev Containers are a technology that uses Docker containers as a full-featured development environment, ideal for running applications, isolating development tools and dependencies, and supporting continuous integration and testing. In short, Dev Containers allow developers to run and debug executables within Docker containers, ensuring consistency between local development and remote deployment environments. In this series, Natan Rolnik explores how to leverage VS Code Dev Containers for Swift development, showcasing how this technology streamlines the development workflow.
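
For a taste of what this looks like in practice, a Dev Container is described by a `.devcontainer/devcontainer.json` file. A minimal sketch for Swift might look like the following (the image tag and extension ID are illustrative; check the current values before use):

```json
{
    "name": "Swift",
    "image": "swift:6.0",
    "customizations": {
        "vscode": {
            "extensions": ["sswg.swift-lang"]
        }
    }
}
```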

### Being Sendable with SwiftData

Leo G Dion

In SwiftData, a model’s PersistentIdentifier conforms to the Sendable protocol, so it can be passed safely across concurrency domains. Passing bare identifiers, however, loses the model’s type information. In this article, Leo G Dion presents a solution: wrapping the identifier in a phantom-type structure that retains the model type, achieving thread safety without sacrificing type identification.
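
The core idea can be sketched roughly as follows (the names `TypedIdentifier` and `Note` are illustrative, not the article’s exact API):

```swift
import SwiftData

@Model
final class Note {
    var text: String
    init(text: String) { self.text = text }
}

// The phantom parameter `M` records which model the identifier points to;
// only the Sendable PersistentIdentifier is actually stored.
struct TypedIdentifier<M: PersistentModel>: Sendable {
    let id: PersistentIdentifier
}

// Capture a typed identifier in one concurrency domain...
func identifier(for note: Note) -> TypedIdentifier<Note> {
    TypedIdentifier(id: note.persistentModelID)
}

// ...and resolve it in another without losing the model type.
func resolve<M: PersistentModel>(_ typedID: TypedIdentifier<M>,
                                 in context: ModelContext) -> M? {
    context.model(for: typedID.id) as? M
}
```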

### Swift Format in Xcode

Sarah Reichelt

In Apple’s development ecosystem, two similarly named code formatting tools coexist: SwiftFormat by Nick Lockwood, which offers rich customization, and Apple’s own swift-format, which ships built into Xcode 16. In this article, Sarah Reichelt tests swift-format and compares it with tools like SwiftLint and Prettier. She hopes that Apple or the Swift community will eventually publish an official Swift style guide, along with support for format-on-save, to further improve the development experience.
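
For context, swift-format reads its settings from a `.swift-format` JSON file in the package root; a minimal sketch (the keys shown are a small subset, and the values are illustrative):

```json
{
    "version": 1,
    "lineLength": 100,
    "indentation": {
        "spaces": 4
    },
    "rules": {
        "AlwaysUseLowerCamelCase": true
    }
}
```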

### SwiftUI Self-Sizing Flow Layouts

Keith Harrison

Starting with iOS 16, SwiftUI offers the Layout protocol, allowing developers to build custom layout containers. In this article, Keith Harrison explores how to achieve a UICollectionViewFlowLayout-like effect in SwiftUI, enabling automatic adjustment of the number of items per row or column.

Unfortunately, the Layout protocol currently lacks support for lazy loading, limiting its effectiveness for handling large datasets.
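
To give a flavor of the approach, here is a simplified sketch of the same idea (my own condensed example, not Keith Harrison’s code): a left-aligned layout that wraps items onto new rows when they would overflow the proposed width.

```swift
import SwiftUI

// A left-aligned flow layout built on the Layout protocol (iOS 16+).
struct FlowLayout: Layout {
    var spacing: CGFloat = 8

    func sizeThatFits(proposal: ProposedViewSize, subviews: Subviews, cache: inout ()) -> CGSize {
        let rows = computeRows(proposal: proposal, subviews: subviews)
        let height = rows.reduce(0) { $0 + $1.height }
            + spacing * CGFloat(max(rows.count - 1, 0))
        let width = proposal.width ?? rows.map(\.width).max() ?? 0
        return CGSize(width: width, height: height)
    }

    func placeSubviews(in bounds: CGRect, proposal: ProposedViewSize, subviews: Subviews, cache: inout ()) {
        var y = bounds.minY
        for row in computeRows(proposal: proposal, subviews: subviews) {
            var x = bounds.minX
            for index in row.indices {
                subviews[index].place(at: CGPoint(x: x, y: y), proposal: .unspecified)
                x += subviews[index].sizeThatFits(.unspecified).width + spacing
            }
            y += row.height + spacing
        }
    }

    private struct Row {
        var indices: [Int] = []
        var width: CGFloat = 0
        var height: CGFloat = 0
    }

    // Greedily fill each row, wrapping when the next item would overflow.
    private func computeRows(proposal: ProposedViewSize, subviews: Subviews) -> [Row] {
        let maxWidth = proposal.width ?? .infinity
        var rows = [Row()]
        var x: CGFloat = 0
        for index in subviews.indices {
            let size = subviews[index].sizeThatFits(.unspecified)
            if x + size.width > maxWidth, !rows[rows.count - 1].indices.isEmpty {
                rows.append(Row())
                x = 0
            }
            rows[rows.count - 1].indices.append(index)
            rows[rows.count - 1].width = x + size.width
            rows[rows.count - 1].height = max(rows[rows.count - 1].height, size.height)
            x += size.width + spacing
        }
        return rows
    }
}
```

It can then be used like any other container, for example `FlowLayout(spacing: 8) { ForEach(tags, id: \.self) { TagView($0) } }`, where `tags` and `TagView` are hypothetical.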

### On Device Llama 3.1 with Core ML

An official article from Apple. It provides an in-depth guide to deploying Llama 3.1 on Apple silicon devices and optimizing it for real-time applications. Using optimizations such as key-value caching and block-wise Int4 quantization on macOS Sequoia, the 8B-parameter Llama-3.1-8B-Instruct model achieves a decoding speed of approximately 33 tokens per second, significantly enhancing local inference efficiency.

## Event

### Let’s visionOS 2025

Let’s visionOS 2025 will be held in Shanghai, China, from February 28 to March 2, 2025. Let’s visionOS is the world’s first conference focused on spatial computing and Apple Vision Pro app development, and it was successfully held in Beijing earlier this year. The 2025 theme will further expand to include spatial computing, artificial intelligence, and iOS. For more details, please visit the official website.
