 WWDC 2024 highlights for developers 🧑‍💻

June 14, 2024

WWDC SwiftUI poster

The highly awaited WWDC 2024 has just wrapped up, and Apple has once again introduced a series of groundbreaking announcements set to transform the tech industry. Here is a brief overview of the major highlights from this year's event. Hope you enjoy 🙌.

This issue goes somewhat beyond the usual focus of this newsletter, but after an incredible week at WWDC I would like to thank the initial subscribers 🙏 with a summary of the highlights I believe matter most to Apple developers.

Introduction of Apple Intelligence

The next big step for Apple is here. Apple Intelligence is a personal intelligence system built into your devices to help you write, express yourself, and get things done effortlessly. It is deeply integrated into features and apps across the system, and built with privacy from the ground up.

Apple Intelligence starts with Apple's on-device foundation models and extends to the cloud with Private Cloud Compute to run larger foundation models.

Writing Tools, Genmoji 🥳 and Image Playground

Writing Tools, Genmoji and Image Playground are three powerful new Apple Intelligence features, and integrating them into your apps is a sure way to delight users.

Writing Tools help users rewrite, proofread and summarize text. The cool thing is that if you use any of the standard UI frameworks to render text fields or text views, your app gets Writing Tools automatically! If you want to customize how your app behaves while Writing Tools is active, you just need to adopt Apple's new text view delegate API.
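To give you an idea, here is a minimal sketch of that customization point. I am assuming the writingToolsBehavior property and the textViewWritingToolsWillBegin/DidEnd delegate callbacks shown in the sessions; the autosave bits are hypothetical placeholders.

```swift
import UIKit

final class NoteViewController: UIViewController, UITextViewDelegate {
    let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.delegate = self
        // Opt into the full Writing Tools experience (other options: .limited, .none).
        textView.writingToolsBehavior = .complete
        view.addSubview(textView)
    }

    // Called before Writing Tools starts rewriting: a good place to pause
    // work that could conflict with the edit, e.g. autosave or syncing.
    func textViewWritingToolsWillBegin(_ textView: UITextView) {
        // pauseAutosave() — hypothetical helper
    }

    // Called once Writing Tools has finished modifying the text.
    func textViewWritingToolsDidEnd(_ textView: UITextView) {
        // resumeAutosave() — hypothetical helper
    }
}
```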

Genmoji opens up entirely new ways to communicate, letting users create a new emoji to match any moment.

The new Image Playground API delivers a consistent and easy-to-use experience. With it, users can tweak the prompt supplied to them to create a unique image of whatever they want, and the result is stored on device. I am really excited about this because I can already see lots of useful use cases to work on! 💪
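As a rough idea, here is a sketch of how presenting the playground from SwiftUI could look. I am assuming the imagePlaygroundSheet modifier from the new ImagePlayground framework, which presents the system sheet seeded with a concept and hands back a URL to the generated image; the view and the concept string are purely illustrative.

```swift
import SwiftUI
import ImagePlayground

struct AvatarCreatorView: View {
    @State private var showPlayground = false
    @State private var avatarURL: URL?

    var body: some View {
        Button("Create avatar") { showPlayground = true }
            // Presents the system Image Playground sheet seeded with a text concept;
            // the completion hands back a URL to the image generated on device.
            .imagePlaygroundSheet(
                isPresented: $showPlayground,
                concept: "astronaut cat wearing sunglasses",
                onCompletion: { url in
                    avatarURL = url
                }
            )
    }
}
```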

Xcode 16, Code Completion and Swift Assist

Xcode 16 has many new features to make you more productive and improve the quality of your apps: a single view of your backtraces showing relevant code from all stack frames together, deeper insight into your app's performance, and enhancements to localization catalogs. Xcode 16 also marks a whole new chapter for development, as Apple infuses its tools with the power of generative models.

Code Completion is an innovative new engine that can predict the code you need. It uses your project's symbols to customize suggestions, runs locally on your Mac to keep your code private, gives you super-fast results, and even works when you are offline. Xcode can even use the comments you write as context, giving you a suggestion as soon as you start typing.

Swift Assist is a companion for all of your coding tasks and, I have to say, one of my favorite announcements this year! Swift Assist can answer your coding questions and help with tasks like experimenting with new APIs. Like Code Completion, Swift Assist uses details from your project, including symbols and the relationships between them, to create personalized code.

My concern here is that my information could somehow be exposed to train machine learning models, for example, and I definitely don't want that! However, Apple states that all of these tools are built with your privacy and security in mind. Swift Assist will be available later this year and I can't wait to try it out ✨.

Swift, Swift 6 and Swift Testing

Apple is working with the open source community to bring Swift to more platforms and domains. Swift is gaining support for Visual Studio Code and other editors, Linux support was expanded to include more distributions, and Windows support was also improved.

Swift 6 makes concurrent programming dramatically easier by introducing compile-time data-race safety. Other developments include improvements to concurrency, generics, and a new "Embedded Swift" subset for targeting highly constrained environments like operating system kernels and microcontrollers.
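To make the data-race safety point concrete, here is a small sketch of the kind of shared mutable state the Swift 6 language mode rejects at compile time, together with an actor-based fix:

```swift
// A plain class with mutable state: nothing stops two tasks from touching it at once.
final class Counter {
    var value = 0
}

// In the Swift 6 language mode, code like this no longer compiles, because
// `Counter` is not Sendable yet is captured by concurrently executing tasks:
//
// let counter = Counter()
// Task { counter.value += 1 }
// Task { counter.value += 1 }

// One fix: make the shared state an actor, which serializes all access to it.
actor SafeCounter {
    private(set) var value = 0
    func increment() { value += 1 }
}

func demo() async {
    let counter = SafeCounter()
    await counter.increment()
    await counter.increment()
    print(await counter.value) // 2
}
```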

Another important aspect of software development is writing tests. Swift Testing has expressive APIs that make it simple to write tests. Swift Testing also includes a flexible tagging system to help you organize your tests and test plans. With tags, you can selectively run tests across your test suite, like tests that use a certain module or that run on a specific device. You can easily parameterize tests so that they can be reused multiple times. Xcode 16 has full support for Swift Testing.
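Here is a small taste of the syntax, combining a tag with a parameterized test; isValidHandle is just a hypothetical function under test.

```swift
import Testing

// Declare a custom tag so related tests can be grouped and filtered.
extension Tag {
    @Tag static var parsing: Self
}

// Hypothetical function under test, only here to make the example self-contained.
func isValidHandle(_ handle: String) -> Bool {
    !handle.isEmpty && handle.count <= 30
}

// A parameterized test: each argument runs as its own test case in Xcode 16.
@Test("Valid handles are accepted", .tags(.parsing), arguments: ["alice", "bob_42", "carol.dev"])
func acceptsValidHandles(handle: String) {
    #expect(isValidHandle(handle))
}
```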

SwiftUI Previews and Interoperability between UI frameworks

This year, Apple has focused on previews, customizations and interoperability. Xcode Previews has a new dynamic linking architecture that uses the same build artifacts for previews and for build-and-run. This avoids rebuilding your project when switching between the two, making for a dramatically smoother and more productive workflow.

A new @Previewable macro makes it possible to use dynamic properties like @State directly in an Xcode preview, making it much easier to set up SwiftUI previews.
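Here is roughly what that looks like in practice:

```swift
import SwiftUI

#Preview("Notifications toggle") {
    // @Previewable lets the preview own live @State without a wrapper view.
    @Previewable @State var isOn = true
    Toggle("Enable notifications", isOn: $isOn)
        .padding()
}
```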

Many apps adopting SwiftUI also use views written with UIKit and AppKit, so great interoperability with these frameworks is critical. Now, UI frameworks share more common foundations. Gesture recognition has been factored out of UIKit, enabling you to take any built-in or custom UIGestureRecognizer and use it in your SwiftUI view hierarchy.
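As a sketch, and assuming the new UIGestureRecognizerRepresentable protocol behaves as shown in the sessions, wrapping a UIKit recognizer for SwiftUI could look like this (DoubleTapRecognizer is my own illustrative name):

```swift
import SwiftUI
import UIKit

struct DoubleTapRecognizer: UIGestureRecognizerRepresentable {
    var onDoubleTap: () -> Void

    // Create the underlying UIKit recognizer.
    func makeUIGestureRecognizer(context: Context) -> UITapGestureRecognizer {
        let recognizer = UITapGestureRecognizer()
        recognizer.numberOfTapsRequired = 2
        return recognizer
    }

    // React to the recognizer's action callbacks.
    func handleUIGestureRecognizerAction(_ recognizer: UITapGestureRecognizer, context: Context) {
        if recognizer.state == .ended { onDoubleTap() }
    }
}

struct PhotoView: View {
    var body: some View {
        Image(systemName: "photo")
            .gesture(DoubleTapRecognizer { print("Double tapped") })
    }
}
```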

Also, animations have been factored out of SwiftUI, so you can now set up animations on UIKit or AppKit views and then drive them with SwiftUI, including fully custom animations.
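A minimal sketch, assuming the UIView.animate overload that accepts a SwiftUI Animation value:

```swift
import SwiftUI
import UIKit

// Drives a UIKit view change with a SwiftUI spring animation.
func slideIn(_ badge: UIView) {
    UIView.animate(.spring(duration: 0.6, bounce: 0.3)) {
        badge.center.x += 120
        badge.alpha = 1
    }
}
```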

iOS and iPadOS 18

iOS is more customizable than ever, and it starts with Controls. They make getting to frequent tasks from your apps faster and easier and are a great way to engage with your app from more places across the system.
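Here is a rough sketch of what a control could look like, assuming the new ControlWidget API in WidgetKit; StartTimerIntent and the kind identifier are hypothetical:

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical App Intent that the control triggers.
struct StartTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Timer"
    func perform() async throws -> some IntentResult { .result() }
}

// A Control Center control that kicks off the intent with one tap.
struct StartTimerControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.start-timer") {
            ControlWidgetButton(action: StartTimerIntent()) {
                Label("Start Timer", systemImage: "timer")
            }
        }
    }
}
```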

iPadOS delivers big updates to the ways that your users interact with your apps, starting with the redesigned tab bar. It floats at the top of your app and makes it easy to jump to your favorite tabs, and it turns into a sidebar for those moments when you want to dive deeper. The most interesting iPadOS feature so far!
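Adopting it from SwiftUI should be as simple as something like this sketch, assuming the new Tab syntax and the sidebarAdaptable tab view style:

```swift
import SwiftUI

struct LibraryView: View {
    var body: some View {
        TabView {
            Tab("Home", systemImage: "house") { Text("Home") }
            Tab("Browse", systemImage: "square.grid.2x2") { Text("Browse") }
            Tab("Library", systemImage: "books.vertical") { Text("Library") }
        }
        // On iPadOS 18 this renders as the floating tab bar and can morph into a sidebar.
        .tabViewStyle(.sidebarAdaptable)
    }
}
```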

watchOS 11 and macOS Sequoia

One of the coolest new features on watchOS this year actually starts on iOS: Live Activities. Watch wearers will see your compact leading and trailing views automatically in the Smart Stack, as well as when significant event notifications occur. You can now bring your interactive widgets to watchOS using the same APIs you are currently using on iOS and macOS.
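Here is a sketch of opting a Live Activity into the watch Smart Stack, assuming the supplementalActivityFamilies modifier mentioned in the sessions; the delivery attributes are hypothetical:

```swift
import ActivityKit
import SwiftUI
import WidgetKit

// Hypothetical attributes for a delivery-tracking Live Activity.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var eta: Date
    }
}

struct DeliveryLiveActivity: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: DeliveryAttributes.self) { context in
            Text("Arriving \(context.state.eta, style: .relative)")
        } dynamicIsland: { context in
            DynamicIsland {
                DynamicIslandExpandedRegion(.center) {
                    Text("Arriving \(context.state.eta, style: .relative)")
                }
            } compactLeading: {
                Image(systemName: "box.truck")
            } compactTrailing: {
                Text(context.state.eta, style: .timer)
            } minimal: {
                Image(systemName: "box.truck")
            }
        }
        // Opts this Live Activity into the small Smart Stack presentation on Apple Watch.
        .supplementalActivityFamilies([.small])
    }
}
```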

App Intents let you create widgets with multiple interactive areas that perform actions and update state directly in the widget.
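For example, a widget view can wire a button straight to an intent; ToggleLampIntent is a hypothetical intent here:

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical intent toggled from the widget.
struct ToggleLampIntent: AppIntent {
    static var title: LocalizedStringResource = "Toggle Lamp"
    func perform() async throws -> some IntentResult {
        // Flip the lamp state in shared storage; the widget timeline then reloads.
        return .result()
    }
}

struct LampWidgetView: View {
    var isOn: Bool

    var body: some View {
        // Button(intent:) runs the App Intent and updates the widget in place.
        Button(intent: ToggleLampIntent()) {
            Label(isOn ? "On" : "Off", systemImage: "lightbulb")
        }
    }
}
```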

And for those of you eager to integrate double tap into your apps, handGestureShortcut is the modifier you have been looking for. Use this modifier to identify a Button or Toggle as the primary action in your app, widget or Live Activity to give your customers quick, one-handed control.
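A minimal sketch of marking a button as the double tap action:

```swift
import SwiftUI

struct StopwatchControls: View {
    @State private var isRunning = false

    var body: some View {
        Button(isRunning ? "Pause" : "Start") {
            isRunning.toggle()
        }
        // Identifies this button as the action that double tap triggers on Apple Watch.
        .handGestureShortcut(.primaryAction)
    }
}
```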

This year, macOS supports Apple Intelligence with features like Writing Tools, Genmoji and Image Playground that you can integrate right into your apps to create engaging experiences. It also introduces productivity features like easier window tiling and iPhone mirroring, and delivers new APIs including user-space file system support and major improvements to MapKit.

Conclusion

Apple WWDC 2024 has laid the foundation for another thrilling year within the Apple ecosystem. With major updates spanning iOS, iPadOS, watchOS and macOS, plus Apple Intelligence and big improvements to Swift and Xcode, Apple is once again pushing boundaries, equipping developers with state-of-the-art tools and delivering unmatched experiences to users. Stay tuned for more in-depth coverage and tutorials on how to leverage these new technologies in your projects!