
iOS 14 Checklist For Developers

Apple’s iOS 14 rollout, with hardly any prior notice, took the whole developer community by surprise. Regardless, it’s that time of the year again when you’ll be shipping your app updates for the latest OS.
To kick off, here are the big changes in iOS 14 that you should be aware of:

  • Introduction of Widgets and App Clips on the home screen.
  • Replacement of IDFA with a new AppTrackingTransparency framework for opt-in ad tracking (at the time of writing, Apple has delayed this until Jan 2021).
  • Enhancements in the UIKit framework — say goodbye to target-selectors and embrace UIAction instead.
  • New Vision requests for contour and trajectory detection, along with hand and body pose estimation.
  • ARKit brings a new depth API for LiDAR scanners as well as location anchors to place AR experiences in specific places.
  • Apple’s PencilKit framework introduces the ability to track the speed and force of gestures drawn on PKCanvas. There’s also Scribble support, which lets UITextField recognize handwritten text using on-device machine learning.

Now, let’s dig through some of the significant updates across different Apple frameworks and changes in APIs, so that you’re all set for app releases on iOS 14.

Enhanced Pasteboard API

Apple is a leader in data privacy, and with the latest iOS update, it shows that once again.

iOS 14 introduces a floating notification every time your app reads contents from the clipboard. To prevent your app from needlessly accessing the pasteboard, there’s an enhanced UIPasteboard API that lets you determine the kind of content present on the pasteboard before actually reading its value.

The detectPatterns(for:completionHandler:) and detectPatterns(for:inItemSet:completionHandler:) methods let you look for certain patterns without triggering the notification.

At the same time, you can leverage the UIPasteboard.DetectionPattern struct to determine if the pasteboard contains a probableWebURL (which might be relevant for deep links) or a number.
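
Here’s a minimal sketch of how the two pieces fit together, checking for a probable web URL before reading the pasteboard (the surrounding handling is illustrative):

import UIKit

let patterns: Set<UIPasteboard.DetectionPattern> = [.probableWebURL, .number]

UIPasteboard.general.detectPatterns(for: patterns) { result in
    switch result {
    case .success(let detected) where detected.contains(.probableWebURL):
        // Reading the actual value will still show the banner,
        // but now we only do it when it’s likely useful.
        DispatchQueue.main.async {
            print("Pasteboard likely holds a URL: \(UIPasteboard.general.string ?? "")")
        }
    case .success:
        print("No web URL detected on the pasteboard")
    case .failure(let error):
        print("Pattern detection failed: \(error)")
    }
}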

Picture In Picture Mode

While iPadOS did support Picture in Picture mode earlier, iOS 14 finally brings it to the iPhone.
You can check whether the device supports playing videos in Picture in Picture via AVPictureInPictureController.isPictureInPictureSupported(), and then create a controller from your player layer:

AVPictureInPictureController(playerLayer: playerView.playerLayer)

If, like me, you’re adopting PiP mode in your AVPlayer-based apps for iOS 14, you could run into strange errors, such as picture-in-picture not launching automatically when the app moves to the background.

Gladly, there’s a known fix: call AVAudioSession.sharedInstance().setActive(true) before initializing the AVPictureInPictureController.
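
Putting it together, here’s a minimal sketch of a PiP setup, assuming a view controller that owns an AVPlayerLayer (the class and method names are illustrative):

import AVFoundation
import AVKit
import UIKit

final class PlayerViewController: UIViewController, AVPictureInPictureControllerDelegate {

    private var pipController: AVPictureInPictureController?

    // Call once the player layer is ready to play.
    func setUpPictureInPicture(playerLayer: AVPlayerLayer) {
        guard AVPictureInPictureController.isPictureInPictureSupported() else { return }

        // Activating the audio session before creating the controller
        // avoids PiP failing to start automatically in the background.
        try? AVAudioSession.sharedInstance().setCategory(.playback)
        try? AVAudioSession.sharedInstance().setActive(true)

        pipController = AVPictureInPictureController(playerLayer: playerLayer)
        pipController?.delegate = self
    }
}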

Limited Photos Library Access Permission

In iOS 13 and earlier, allowing an app to access your photo library meant giving it access to all your albums and media assets. This could easily open the door to privacy breaches, as developers could upload entire libraries to their cloud servers.

With iOS 14, Apple introduces a limited photo access permission, which lets the user grant access to selected photos only, or to the entire library, thereby preserving privacy. This means iOS developers have their work cut out for them.

So, there’s a new PHAccessLevel enum that lets you specify the level of access as readWrite or addOnly:

let accessLevel: PHAccessLevel = .readWrite

To query the authorization status of the photos library, simply pass the above enum to the following function:

let authorizationStatus = PHPhotoLibrary.authorizationStatus(for: accessLevel)

Starting with iOS 14, the above authorizationStatus can return a new limited case, which means only the photos selected by the user are visible to the app. To request limited photo access permission, invoke the following function:

PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .limited:
        print("limited access granted")
    default:
        print("not implemented")
    }
}

The following piece of code presents the image selection picker UI:

PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: self)

Images selected or deselected by the user can be monitored in the photoLibraryDidChange function by conforming to the PHPhotoLibraryChangeObserver protocol and registering as an observer.
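
A minimal sketch of the observer wiring, assuming a view controller that reloads its assets whenever the selection changes (the class name and reload step are illustrative):

import Photos
import UIKit

final class PhotosViewController: UIViewController, PHPhotoLibraryChangeObserver {

    override func viewDidLoad() {
        super.viewDidLoad()
        PHPhotoLibrary.shared().register(self)
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        // Fires when the user’s limited selection (or the library) changes.
        DispatchQueue.main.async {
            // Re-fetch your PHAsset collections and reload the UI here.
        }
    }
}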

Now, to prevent automatic photo access prompts every time, set the PHPhotoLibraryPreventAutomaticLimitedAccessAlert key to true in your Info.plist file.

SwiftUI Brings New Property Wrappers, Views, Modifiers, And App Lifecycle

SwiftUI, Apple’s new declarative UI framework, was the talk of the town during WWDC 2019, and this year has been no different. In its second iteration with iOS 14, SwiftUI includes a whole lot of new UI components, ranging from VideoPlayer and Map to Label, Link, ColorPicker, and ProgressView.

More importantly, iOS 14 introduces support for lazy loading through the new LazyHStack and LazyVStack views, used in place of HStack and VStack. This means you needn’t worry about NavigationLink loading destination views immediately.
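
For instance, in this minimal sketch (the row content is illustrative), rows are only created as they scroll into view:

import SwiftUI

struct FeedView: View {
    var body: some View {
        ScrollView {
            LazyVStack(alignment: .leading) {
                ForEach(1...1000, id: \.self) { index in
                    Text("Row \(index)") // built lazily, on demand
                }
            }
        }
    }
}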

There are also new LazyVGrid and LazyHGrid components, which help replicate UICollectionView to some extent, and a matchedGeometryEffect modifier to create amazing transitions and animations.

Besides introducing SwiftUI’s own app lifecycle through brand-new property wrappers and protocols, iOS 14 also introduces the WidgetKit framework, which lets you build beautiful, powerful widgets purely in SwiftUI.
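
Here’s a minimal sketch of the new SwiftUI-only lifecycle, where the @main-annotated App type replaces the AppDelegate/SceneDelegate pair (the app and view names are illustrative):

import SwiftUI

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Hello, iOS 14!") // your root view goes here
        }
    }
}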

More Powerful CollectionView

Collection views didn’t debut in SwiftUI during WWDC 2020, but that didn’t stop UICollectionView from receiving some powerful new updates.
Here are the major changes that you can leverage in your apps on iOS 14:

  • UICollectionViewCompositionalLayout.list lets you create UITableView-like appearances in UICollectionView, further boosting the ability to customize compositional layouts. I believe this strongly indicates that table views might go obsolete in the future.
  • The UICollectionView.CellRegistration structure brings a new way to configure UICollectionView cells. You needn’t define cell identifiers anymore, as the new struct automatically takes care of cell registration when passed to dequeueConfiguredReusableCell (see the sketch after this list).
  • DiffableDataSources, which arrived with iOS 13, now bring SectionSnapshots as well, to customize and update data on a per-section basis.
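
Here’s a minimal sketch combining the list layout and the new cell registration API, assuming a simple String-based data source (the item and section types are illustrative):

import UIKit

// A UITableView-like appearance from a compositional list layout.
let layout = UICollectionViewCompositionalLayout.list(
    using: UICollectionLayoutListConfiguration(appearance: .insetGrouped)
)
let collectionView = UICollectionView(frame: .zero, collectionViewLayout: layout)

// No reuse identifiers: the registration configures the cell directly.
let cellRegistration = UICollectionView.CellRegistration<UICollectionViewListCell, String> { cell, _, item in
    var content = cell.defaultContentConfiguration()
    content.text = item
    cell.contentConfiguration = content
}

let dataSource = UICollectionViewDiffableDataSource<Int, String>(collectionView: collectionView) { collectionView, indexPath, item in
    collectionView.dequeueConfiguredReusableCell(using: cellRegistration, for: indexPath, item: item)
}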

Better Privacy With CoreLocation

While iOS 13 brought deferred “Always allow” and a new “Allow Once” permission, iOS 14 further tightens the noose by allowing the user to grant access to only an approximate location.
This means there’s a new accuracyAuthorization property of type CLAccuracyAuthorization, which has two enum cases: fullAccuracy and reducedAccuracy (the latter returns an approximate location instead of the exact one).
Also, the authorizationStatus() function now stands deprecated; you should use the locationManagerDidChangeAuthorization delegate callback instead to track the location permission status.
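
A minimal sketch of the new delegate callback, assuming a delegate object that reacts to whichever accuracy the user granted (the class name and print statements are illustrative):

import CoreLocation

final class LocationDelegate: NSObject, CLLocationManagerDelegate {
    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        switch manager.accuracyAuthorization {
        case .fullAccuracy:
            print("Precise location granted")
        case .reducedAccuracy:
            print("Approximate location only")
        @unknown default:
            break
        }
    }
}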

CoreML Model Encryption

CoreML is Apple’s machine learning framework that lets you initialize models, run inference, and even do on-device training. With iOS 14, Apple bumps up CoreML with the introduction of Model Deployment. This means you can ship updates to your machine learning models on the fly, without updating your apps.

There’s also an improved CoreML model viewer in Xcode that shows the underlying layers. But it’s model encryption that stands out. Machine learning models aren’t easy to build and at times contain sensitive information. Earlier, you could easily extract the .mlmodelc Core ML model files embedded in apps.

That’s no longer possible once you encrypt models in Xcode 12. Core ML automatically decrypts encrypted models and loads them into your app’s memory.

For handling encrypted models, iOS 14 brings a new asynchronous loading function to the generated CoreML model classes:

MyModel.load { result in
    switch result {
    case .success(let model):
        currentModel = model
    case .failure(let error):
        handleFailure(for: error)
    }
}

The model only loads once it’s been decrypted successfully. It’s worth noting that the old init() way of initializing CoreML models will be deprecated in the future.

Conclusion

While the updates above are the most significant ones to get your apps up to speed, there are other important changes as well, such as the inclusion of sentence embeddings in the Natural Language framework and support for training style transfer models using CreateML.
This sums up the major changes developers need to know for iOS 14. Thanks for reading.

By Anupam Chugh

iOS Developer exploring the depths of ML and AR on Mobile.
Loves writing about thoughts, technology, and code.
