Issue #128

Is My App Stuck in Review?


Last Thursday, a user in my Discord community complained that their app had been submitted to App Store Connect for four or five days but still hadn’t entered the review process. While I was enthusiastically analyzing the possible reasons with everyone, my heart suddenly skipped a beat: it seemed that the app I submitted on Monday hadn’t received any review updates either.

Someone suggested I apply for an “Expedited Review”. However, when I opened that page, the system told me I had “no eligible apps”. On closer inspection, I realized I was simply rusty after not updating my apps for so long: although my app had completed all the prerequisite steps, I had never actually clicked the “Submit for Review” button.

Just a few hours after I finally clicked the button, the app was successfully approved and published.

While my situation was a mere false alarm, discussions in the community about Apple’s app review process slowing down have indeed been increasing recently. Many speculate that this might be related to the recent rise of Vibe Coding. Although there is no official confirmation, Vibe Coding has undeniably lowered the barrier to entry for development. In doing so, it has simultaneously amplified the volume of app submissions and the frequency of iterations in a short period, thereby passing the pressure down to the review team.

In fact, Apple has recently been holding up the review process for apps like Replit, which allow everyday consumers to engage in Vibe Coding. Even when allowing them to remain on the store, Apple has demanded compromises on core features. In Michael Tsai’s blog post covering this news, I came across a very sharp comment:

I thought the implication was that the vibe coding apps were being used to make the vibecoded apps that get submitted.

AI is not only reshaping the way we develop software but also posing new challenges to the app review and distribution systems. One might ask: if we fight magic with magic and let AI fully take over the review process, wouldn’t it be more efficient?

Apple’s review mechanism has never been entirely transparent. Sometimes, whether an app passes smoothly or not even depends on whether you “happen to” encounter a sympathetic reviewer. But looking at it from another angle, at least “humans” remain the most crucial part of this defense line. Human judgment can be flawed and biased, but it still retains a certain degree of flexibility when dealing with rigid rules.

I truly hope that the software ecosystem of the future does not devolve into a closed loop of “AI Development -> AI Review”.


CDE: An Attempt to Make Core Data Feel More Like Modern Swift

In last week’s article, I discussed the current reality of Core Data in modern projects: it hasn’t disappeared and still holds unique value, but the sense of misalignment between it and modern Swift development is becoming increasingly apparent. In this article, I continue along that line of thought and introduce an experimental project of mine: Core Data Evolution (CDE).

It is not a new framework intended to replace Core Data, nor is it an attempt to pull developers back to older technologies. More accurately, it is my own response to this misalignment: if I still recognize the value of Core Data’s object graph model, migration system, and mature runtime, can it continue to exist in modern Swift projects in a more natural way?

Recent Recommendations

Expanding Animations in SwiftUI Lists

Developers often encounter a frustrating animation issue: when dynamically changing the height of a row inside a List, the content does not expand smoothly, but instead jumps abruptly. In this article, Pavel Zak demonstrates through several experiments why common approaches such as conditional rendering with if, withAnimation, or even .transition fail to produce the desired effect inside List. While built-in solutions like DisclosureGroup can achieve smoother results, Pavel presents a more flexible approach: using Animatable combined with view size measurement to ensure that List receives continuously changing height values during the animation, resulting in a truly smooth expansion.

A key characteristic of List (which is still backed by UIKit/AppKit) is that it requires a definite row height during layout. Therefore, instead of letting List deal with structural changes, developers should, as DisclosureGroup does, transform “discrete changes” into “continuous changes” by providing interpolatable height values. This is also why developers often resort to the Animatable protocol when dealing with animation anomalies. For a deeper understanding of this protocol and its use cases, you can refer to my previous article.
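The idea can be sketched with a small Animatable modifier. This is not Pavel's exact code; the names (HeightAnimatingModifier, ExpandableRow) and the fixed heights are illustrative, and a real implementation would measure the expanded content's size instead of hardcoding it:

```swift
import SwiftUI

// Because animatableData is the row height, SwiftUI re-invokes body with
// interpolated values on every animation frame, so List always receives a
// continuously changing (never jumping) height.
struct HeightAnimatingModifier: ViewModifier, Animatable {
    var height: CGFloat

    var animatableData: CGFloat {
        get { height }
        set { height = newValue }
    }

    func body(content: Content) -> some View {
        content
            .frame(height: height, alignment: .top)
            .clipped()
    }
}

struct ExpandableRow: View {
    @State private var isExpanded = false
    // Illustrative fixed heights; a real row would measure its content.
    private let collapsedHeight: CGFloat = 44
    private let expandedHeight: CGFloat = 160

    var body: some View {
        VStack(alignment: .leading) {
            Text("Title")
            Text("Long detail text…")
        }
        .modifier(HeightAnimatingModifier(
            height: isExpanded ? expandedHeight : collapsedHeight))
        .onTapGesture {
            withAnimation(.easeInOut) { isExpanded.toggle() }
        }
    }
}
```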


SwiftUI iPad Adaptive Layout: Five Layers for Apps That Don’t Break in Split View

While Apple’s push toward multi-window capabilities in iPadOS is well-intentioned, it significantly increases the complexity of layout adaptation. Apps may appear in various forms, such as iPhone-like layouts, traditional full-screen iPad views, or Stage Manager windows. Wesley Matlock points out that relying solely on horizontalSizeClass is often insufficient in real-world scenarios. Developers need to combine container size with size classes to build a more fine-grained LayoutEnvironment, make layout branching decisions at the root view, and leverage mechanisms like ViewThatFits to let the system choose the most appropriate UI based on actual constraints, rather than assumptions about the device.
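The root-view branching can be sketched as follows. The 700-point breakpoint and the view names are my own illustrative assumptions, not values from Wesley's article:

```swift
import SwiftUI

// Combine the size class with the container's measured width, then let
// ViewThatFits choose a layout that actually fits the available space,
// rather than branching on the device model.
struct AdaptiveRootView: View {
    @Environment(\.horizontalSizeClass) private var sizeClass

    var body: some View {
        GeometryReader { proxy in
            // Illustrative breakpoint: regular size class alone is not
            // enough, since a Stage Manager window can still be narrow.
            let isWide = sizeClass == .regular && proxy.size.width > 700
            Group {
                if isWide {
                    ViewThatFits(in: .horizontal) {
                        HStack { SidebarView(); DetailView() }
                        DetailView() // fallback when the HStack can't fit
                    }
                } else {
                    VStack { DetailView() } // compact, iPhone-style layout
                }
            }
        }
    }
}

// Placeholder views for the sketch.
struct SidebarView: View { var body: some View { Text("Sidebar") } }
struct DetailView: View { var body: some View { Text("Detail") } }
```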


Pitfalls and Workarounds When Dealing with RGB HDR Gain Maps Using ImageIO

The introduction of RGB HDR Gain Map based on the ISO 21496-1 standard in iOS 18 enables richer HDR image processing, but also introduces new pitfalls. Although the relevant APIs can return auxiliary data dictionaries, in the RGB Gain Map scenario the actual bitmap data (kCGImageAuxiliaryDataInfoData) is missing, preventing further processing. In other words, ImageIO is unable to fully read the content it generates in this case. Weichao Deng proposes a hybrid approach: use Core Image to read the Gain Map as a CIImage, manually render it into bitmap data, reconstruct the missing fields, and then write it back via ImageIO. For developers working on camera or image processing apps that involve HDR Gain Maps, this article can save a significant amount of debugging time.
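The Core Image half of that workaround might look roughly like this. This is a simplified sketch, not Weichao's actual code: error handling is minimal, and reconstructing the auxiliary-data description dictionary for the write-back step is omitted:

```swift
import CoreImage

// Read the gain map via Core Image (since ImageIO's auxiliary-data
// dictionary lacks the bitmap for RGB gain maps) and render it to raw
// bytes, which can then be placed back under kCGImageAuxiliaryDataInfoData.
func extractGainMapBitmap(from url: URL) -> Data? {
    // Ask Core Image specifically for the gain map image.
    guard let gainMap = CIImage(contentsOf: url,
                                options: [.auxiliaryHDRGainMap: true]) else {
        return nil
    }
    let context = CIContext()
    let extent = gainMap.extent
    let bytesPerRow = Int(extent.width) * 4
    var bitmap = Data(count: bytesPerRow * Int(extent.height))
    bitmap.withUnsafeMutableBytes { buffer in
        // Manually render the gain map into a plain RGBA8 bitmap.
        context.render(gainMap,
                       toBitmap: buffer.baseAddress!,
                       rowBytes: bytesPerRow,
                       bounds: extent,
                       format: .RGBA8,
                       colorSpace: CGColorSpace(name: CGColorSpace.sRGB))
    }
    return bitmap
}
```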


A Vision for Networking in Swift

The Swift Ecosystem Steering Group recently published a vision document on networking, discussing the current fragmentation in Swift’s networking ecosystem and its potential future direction.

The document highlights a clear divide: URLSession, SwiftNIO, and Network.framework coexist with overlapping functionality but incompatible abstractions. Developers often need to commit early to a specific stack, making later changes costly. Additionally, most existing networking APIs were designed before Swift Concurrency and rely on completion handlers, delegates, or reactive patterns, which feel increasingly out of place in modern Swift.

The proposed direction is a unified, layered networking architecture: shared I/O primitives and buffer types at the bottom, reusable protocol implementations (TLS, HTTP/1.1/2/3, QUIC, WebSocket) in the middle, and async/await-based client and server APIs at the top. The swift-http-types package (defining HTTPRequest / HTTPResponse) can be seen as an early step in this direction. The document also emphasizes that SwiftNIO and Network.framework will not be replaced, but will gradually converge toward shared underlying primitives.
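The currency types from swift-http-types give a feel for this layering: they are plain value types with no attachment to any particular transport. A minimal sketch (the host and path are placeholders):

```swift
import HTTPTypes // from the swift-http-types package

// HTTPRequest and HTTPResponse are transport-agnostic value types.
var request = HTTPRequest(method: .get,
                          scheme: "https",
                          authority: "www.example.com",
                          path: "/feed.json")
request.headerFields[.accept] = "application/json"

let response = HTTPResponse(status: .ok)

// Because these are shared types, the same request value can in principle
// be handed to URLSession (via HTTPTypesFoundation) or to a SwiftNIO-based
// client, which is exactly the convergence the vision document describes.
```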

The vision is currently open for community feedback. You can participate here.


Preparing Your iOS Codebase for AI Agents

As AI agents (such as Codex and Claude Code) become increasingly involved in real-world development workflows, the focus is shifting from “how to use AI to write code” to “how to make codebases suitable for AI collaboration”. Hesham Salman explores this transition from an engineering perspective.

He argues that AI relies more on explicit contracts than on prompts. By structuring project conventions and behavioral rules through layered AGENTS.md documentation, using a Makefile to standardize build and test workflows, and encoding multi-step processes into reusable “skills,” implicit engineering knowledge can be transformed into structured, machine-readable systems.

One particularly insightful detail: the author requires agents to update documentation whenever they encounter undocumented conventions, while enforcing a strict rule — every change must make the document shorter or more useful. This self-maintaining mechanism prevents both documentation decay and uncontrolled growth, striking a practical balance.


iOS Conf SG 2026 Videos

iOS Conf SG 2026 was held from January 21 to 23 in Singapore, featuring dozens of developers and content creators from around the world sharing their insights and experiences in the Apple ecosystem. Last week, the full set of talks was released. I also had the opportunity to participate as a speaker, and you can explore whichever sessions interest you.

Tools

TaskGate: Managing Actor Reentrancy

While actors largely eliminate data races, their reentrant nature means that logic which appears sequential can lose its execution order after an await, leading to duplicate work or inconsistent state.

TaskGate, created by Matt Massicotte, addresses this scenario by introducing AsyncGate and AsyncRecursiveGate, which define critical sections for asynchronous code within actors. These ensure that only one task can enter a given section at a time. Unlike traditional locks, TaskGate allows safe asynchronous operations while holding the gate.
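The underlying problem is easy to reproduce without any library. The sketch below illustrates reentrancy itself, not TaskGate's API (the names are mine): the cache check and the cached write are separated by an await, so two callers can both observe a miss and duplicate the download, which is precisely the critical section a gate would serialize.

```swift
actor ImageCache {
    private var cached: [String: Data] = [:]

    func image(for key: String) async -> Data {
        if let data = cached[key] { return data }
        // Suspension point: the actor is reentrant, so another call to
        // image(for:) can run here, also see a cache miss, and start a
        // second, redundant download before this one finishes.
        let data = await download(key)
        cached[key] = data
        return data
    }

    private func download(_ key: String) async -> Data {
        try? await Task.sleep(nanoseconds: 10_000_000) // simulate network
        return Data(key.utf8)
    }
}
```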

Matt explicitly notes that this is not a replacement for well-designed actor models, but rather a supplementary tool when other approaches are insufficient. The gates are intentionally non-Sendable to reduce misuse across actors. If you’re dealing with reentrancy-related state issues or want to better understand this subtle aspect of Swift concurrency, both the library and the related Reddit discussion are worth exploring.


pico-bare-swift

When Apple created Swift, the goal was clearly broader than just app development—it was meant to evolve into a general-purpose language across domains and abstraction levels. However, for a long time, Swift has struggled to gain traction in areas traditionally dominated by C/C++ or Rust. Through this example project, Kishikawa Katsumi demonstrates another possibility: with Embedded Swift, the language is beginning to enter the domain of embedded systems.

What makes this project particularly appealing is that it turns something traditionally associated with low-level, C-centric development into a structured learning path. It goes far beyond “blinking an LED with Swift,” covering startup code, vector tables, memory initialization, register access, as well as drivers for UART, PWM, I2C, and SSD1306 OLED displays. In a sense, the value of such projects lies not in their practicality, but in how they redefine the boundaries of what Swift can do.

Related Weekly

Subscribe to Fatbobman

Weekly Swift & SwiftUI highlights. Join developers.

Subscribe Now