Destination Video
This app is a multiplatform video playback app for visionOS, iOS, and tvOS. It demonstrates how
to use the unique features of Apple Vision Pro to create an immersive playback experience that
leverages 3D video and spatial audio.
For more information about the app and how it works, see Destination Video in the developer
documentation below.
Overview
Destination Video is a multiplatform video-playback app for visionOS, iOS, and tvOS. People get
a familiar media-browsing experience navigating the libraryʼs content and playing videos they
find interesting. The app provides a similar experience on supported platforms, but leverages
unique features of visionOS to create a novel, immersive playback experience.
Play video in an inline player
When you select a video in the library, Destination Video presents a view that displays
additional details about the item. The view presents controls to play the video and specify
whether to include it in your Up Next list. In visionOS, it also displays a video poster along its
leading edge. Tapping the view’s Preview button displays an inline preview of the video.
When you present an AVPlayerViewController object's interface as a child of another view, the player displays inline playback controls, such as play/pause, skip, and seek. Showing the standard playback controls in your app provides a familiar UI that automatically adapts its appearance to fit each platform, and is the recommended choice in most cases.
Destination Video uses a simple UI for the inline player view: a single button that toggles the playback state. AVPlayerViewController doesn't provide this controls style, but the app still uses it to display the video content, hiding the system controls by setting the value of its showsPlaybackControls property to false. It then overlays the custom playback controls it requires. See Destination Video's InlinePlayerView type for details on how you can implement this.
Note
AVPlayerViewController only supports displaying 2D content when embedded inline.
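The approach described above can be sketched as a SwiftUI wrapper around AVPlayerViewController. This is a minimal illustration, not the sample's actual InlinePlayerView implementation; the type name is hypothetical.

```swift
import SwiftUI
import AVKit

/// A minimal sketch of embedding AVPlayerViewController inline with the
/// system playback controls hidden, so custom controls can overlay it.
struct InlineSystemPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        // Hide the standard playback controls; the app overlays its own
        // single play/pause button instead.
        controller.showsPlaybackControls = false
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
}
```

The custom control — a single play/pause button — then sits in a ZStack above this view and drives the shared AVPlayer directly.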
Play video in a full-window player
One of the most exciting features of visionOS is its ability to play 3D video along with Spatial
Audio, which adds a deeper level of immersion to the viewing experience. Playing 3D content in
your app requires that you display AVPlayerViewController full window. When you present the
player this way, the system automatically docks it into the ideal viewing position, and presents
streamlined playback controls that keep the person’s focus on the content.
Note
In iOS or tvOS, you typically present video in a full-screen presentation using the
fullScreenCover(isPresented:onDismiss:content:) modifier. This API is available in visionOS;
however, the recommended way to present the player for full-window playback is to set it as
the root view of your app’s window group.
Destination Video’s ContentView displays the app’s library by default. It observes changes to the player model’s presentation property, which indicates whether the app requests inline or full-window playback. When the presentation state changes to fullWindow, the view redraws the UI to display the player view in place of the library.
struct ContentView: View {

    /// The library's selection path.
    @State private var navigationPath = [Video]()
    /// A Boolean value that indicates whether the app is currently presenting an immersive space.
    @State private var isPresentingSpace = false
    /// The app's player model.
    @Environment(PlayerModel.self) private var player

    var body: some View {
        switch player.presentation {
        case .fullWindow:
            // Present the player full window and begin playback.
            PlayerView()
                .onAppear {
                    player.play()
                }
        default:
            // Show the app's content library by default.
            LibraryView(path: $navigationPath, isPresentingSpace: $isPresentingSpace)
        }
    }
}
When someone selects the Play Video button on the detail view, the app calls the player model’s loadVideo(_:presentation:) method, requesting the fullWindow presentation option.
Button {
    /// Load the media item for full-window presentation.
    player.loadVideo(video, presentation: .fullWindow)
} label: {
    Label("Play Video", systemImage: "play.fill")
}
After the player model successfully loads the video content for playback, it updates its
presentation value to fullWindow, which causes the app to replace the library with PlayerView.
To dismiss the full-window player in visionOS, people tap the Back button in the player UI. To handle this action, the app’s PlayerViewControllerDelegate type adopts the AVPlayerViewControllerDelegate protocol and implements the following method.
func playerViewController(_ playerViewController: AVPlayerViewController,
                          willEndFullScreenPresentationWithAnimationCoordinator coordinator: UIViewControllerTransitionCoordinator) {
    // Reset the player model's state.
    player.reset()
}
When the delegate receives this call, it clears the media from the player model and resets the
presentation state back to its default value, which results in the Destination Video app
redisplaying the library view.
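A sketch of what the full delegate type might look like, assuming the `PlayerModel` type and `reset()` method shown above; only the delegate method itself comes from the sample.

```swift
import AVKit

/// A sketch of a delegate that resets app state when the person
/// dismisses the full-window player with the Back button.
final class PlayerViewControllerDelegate: NSObject, AVPlayerViewControllerDelegate {
    let player: PlayerModel

    init(player: PlayerModel) {
        self.player = player
    }

    func playerViewController(_ playerViewController: AVPlayerViewController,
                              willEndFullScreenPresentationWithAnimationCoordinator coordinator: UIViewControllerTransitionCoordinator) {
        // Clearing the media returns the presentation state to its default,
        // which causes the app to redisplay the library view.
        player.reset()
    }
}
```

The app assigns an instance of this type to the player view controller's delegate property, retaining it for the controller's lifetime because the delegate reference is weak.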
Configure the Spatial Audio experience
Media playback apps require common configuration of their capabilities and audio session. In
addition to performing the steps outlined in Configuring your app for media playback,
Destination Video also adopts new AVAudioSession API to customize a person’s Spatial Audio
experience.
After the app successfully loads a video for playback, it configures the Spatial Audio experience
for the current presentation. For the inline player view, it sets the experience to a small, focused
sound stage where the audio originates from the location of the view. When displaying a video
full window, it sets the experience to a large, fully immersive sound stage.
/// Configures a person's intended Spatial Audio experience to best fit the presentation.
/// - Parameter presentation: the requested player presentation.
private func configureAudioExperience(for presentation: Presentation) {
    #if os(visionOS)
    do {
        let experience: AVAudioSessionSpatialExperience
        switch presentation {
        case .inline:
            // Set a small, focused sound stage when watching trailers.
            experience = .headTracked(soundStageSize: .small, anchoringStrategy: .automatic)
        case .fullWindow:
            // Set a large sound stage size when viewing full window.
            experience = .headTracked(soundStageSize: .large, anchoringStrategy: .automatic)
        }
        try AVAudioSession.sharedInstance().setIntendedSpatialExperience(experience)
    } catch {
        logger.error("Unable to set the intended spatial experience. \(error.localizedDescription)")
    }
    #endif
}
Present an immersive space
Building video playback apps for visionOS provides new opportunities to enhance the viewing
experience beyond the bounds of the player window. To add a greater level of immersion, the
sample presents an immersive space that displays a scene around a person as they watch the
video. It defines the immersive space in the DestinationVideo app structure.
struct DestinationVideo: App {
    var body: some Scene {
        // The app's primary window.
        WindowGroup {
            ContentView()
        }

        // Defines an immersive space to present a destination in which to watch the video.
        ImmersiveSpace(for: Destination.self) { $destination in
            if let destination {
                DestinationView(destination)
            }
        }
        // Set the immersion style to progressive, so the person can use the Digital Crown to dial in their experience.
        .immersionStyle(selection: .constant(.progressive), in: .progressive)
    }
}
The immersive space presents an instance of DestinationView, which maps a texture to the
inside of a sphere that it displays around a person. The app presents it using the .progressive
immersion style, which lets someone change their amount of immersion by turning the Digital
Crown on the device.
The Destination Video app automatically presents the immersive space when a person navigates
to a video’s detail view, and dismisses it when they return to the library. To monitor these
events, the app observes its navigation path to determine when a navigation event occurs so it
can show or dismiss the space.
.onChange(of: navigationPath) {
    Task {
        // The selection path becomes empty when the person returns to the main library window.
        if navigationPath.isEmpty {
            if isSpaceOpen {
                // Dismiss the space and return the person to their real-world space.
                await dismissSpace()
                isSpaceOpen = false
            }
        } else {
            // The navigation path contains the selected video.
            guard let video = navigationPath.first else { fatalError() }
            // Await the request to open the destination and set the state accordingly.
            switch await openSpace(value: video.destination) {
            case .opened: isSpaceOpen = true
            default: isSpaceOpen = false
            }
        }
    }
}
Provide a shared viewing experience
One of the best ways to enhance your app’s playback experience is to make that experience
shareable with others. You can use the AVFoundation and GroupActivities frameworks to
build SharePlay experiences that bring people together even when they can’t be in the same
location.
The Destination Video app creates an experience where people can watch videos with others
across devices and platforms. It defines a group activity called VideoWatchingActivity that
adopts the GroupActivity protocol. When people have a FaceTime call active and they play a
video in the full-window player, it becomes eligible for playback for everyone on the call.
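A group activity is a Codable type that adopts the GroupActivity protocol and describes itself through metadata. A sketch of what VideoWatchingActivity might look like follows; the stored property and metadata values are assumptions, and only the type name and protocol come from the sample.

```swift
import GroupActivities

/// A sketch of the group activity the app defines for shared viewing.
struct VideoWatchingActivity: GroupActivity {
    /// The video to watch together. `Video` is Codable, so the activity
    /// can travel to other participants' devices.
    let video: Video

    var metadata: GroupActivityMetadata {
        var metadata = GroupActivityMetadata()
        metadata.title = video.title
        // Mark the activity as media playback so the system can
        // coordinate it with AVFoundation playback.
        metadata.type = .watchTogether
        return metadata
    }
}
```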
The app’s VideoWatchingCoordinator actor manages Destination Video’s SharePlay
functionality. It observes the activation of new VideoWatchingActivity sessions and when one
starts, it sets the GroupSession instance on the player object’s AVPlayerPlaybackCoordinator.
private var groupSession: GroupSession<VideoWatchingActivity>? {
    didSet {
        guard let groupSession else { return }
        // Set the group session on the AVPlayer object's playback coordinator
        // so it can synchronize playback with other devices.
        playbackCoordinator.coordinateWithSession(groupSession)
    }
}
With the player configured to use the group session, when the app loads new videos, they
become eligible to share with people in the FaceTime call.
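Observing session activation typically uses the activity type's async sessions() sequence. A sketch, under the assumption of the coordinator shape described above:

```swift
import GroupActivities

/// A sketch of how a coordinator might observe new shared-viewing sessions.
/// Assigning to `groupSession` triggers the `didSet` shown above, which
/// hands the session to the player's playback coordinator.
func observeSessions() async {
    for await session in VideoWatchingActivity.sessions() {
        groupSession = session
        // Join the session so this device participates in the activity.
        session.join()
    }
}
```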
DestinationVideo.swift
/*
Abstract:
The main app structure.
*/

import SwiftUI
import os

@main
struct DestinationVideo: App {

    /// An object that controls the video playback behavior.
    @State private var player = PlayerModel()
    /// An object that manages the library of video content.
    @State private var library = VideoLibrary()

    var body: some Scene {
        // The app's primary content window.
        WindowGroup {
            ContentView()
                .environment(player)
                .environment(library)
                #if !os(visionOS)
                // Use a dark color scheme on supported platforms.
                .preferredColorScheme(.dark)
                .tint(.white)
                #endif
        }

        #if os(visionOS)
        // Defines an immersive space to present a destination in which to watch the video.
        ImmersiveSpace(for: Destination.self) { $destination in
            if let destination {
                DestinationView(destination)
                    .environment(player)
            }
        }
        // Set the immersion style to progressive, so the user can use the Digital Crown to dial in their experience.
        .immersionStyle(selection: .constant(.progressive), in: .progressive)
        #endif
    }
}

/// A global logger for the app.
let logger = Logger()
ContentView.swift
/*
Abstract:
A view that presents the app's user interface.
*/

import SwiftUI

// The app uses `LibraryView` as its main UI.
struct ContentView: View {

    /// The library's selection path.
    @State private var navigationPath = [Video]()
    /// A Boolean value that indicates whether the app is currently presenting an immersive space.
    @State private var isPresentingSpace = false
    /// The app's player model.
    @Environment(PlayerModel.self) private var player

    var body: some View {
        #if os(visionOS)
        switch player.presentation {
        case .fullWindow:
            // Present the player full window and begin playback.
            PlayerView()
                .onAppear {
                    player.play()
                }
        default:
            // Show the app's content library by default.
            LibraryView(path: $navigationPath, isPresentingSpace: $isPresentingSpace)
        }
        #else
        LibraryView(path: $navigationPath)
            // A custom modifier that shows the player in a fullscreen modal presentation in iOS and tvOS.
            .fullScreenCoverPlayer(player: player)
        #endif
    }
}

#Preview {
    ContentView()
        .environment(PlayerModel())
        .environment(VideoLibrary())
}
DetailView.swift
/*
Abstract:
A view that presents a horizontal view of the video details.
*/

import SwiftUI

// The leading side of the view displays a trailer view, and the trailing side displays video information and action controls.
struct DetailView: View {

    let video: Video

    @Environment(PlayerModel.self) private var player
    @Environment(VideoLibrary.self) private var library

    let margins = 30.0

    var body: some View {
        HStack(alignment: .top, spacing: margins) {
            // A view that plays video in an inline presentation.
            TrailerView(video: video)
                .aspectRatio(16 / 9, contentMode: .fit)
                .frame(width: 620)
                .cornerRadius(20)

            VStack(alignment: .leading) {
                // Displays video details.
                VideoInfoView(video: video)
                // Action controls.
                HStack {
                    Group {
                        Button {
                            /// Load the media item for full-window presentation.
                            player.loadVideo(video, presentation: .fullWindow)
                        } label: {
                            Label("Play Video", systemImage: "play.fill")
                                .frame(maxWidth: .infinity)
                        }
                        Button {
                            // Calling this method toggles the video's inclusion state in the Up Next queue.
                            library.toggleUpNextState(for: video)
                        } label: {
                            let isUpNext = library.isVideoInUpNext(video)
                            Label(isUpNext ? "In Up Next" : "Add to Up Next",
                                  systemImage: isUpNext ? "checkmark" : "plus")
                                .frame(maxWidth: .infinity)
                        }
                    }
                    .frame(maxWidth: .infinity)
                }
                .frame(maxWidth: 420)

                Spacer()
            }
        }
        .padding(margins)
    }
}

#Preview {
    NavigationStack {
        DetailView(video: .preview)
            .environment(PlayerModel())
            .environment(VideoLibrary())
    }
}
TrailerView.swift
/*
Abstract:
A view that displays a poster image that you tap to watch a trailer.
*/

import SwiftUI

struct TrailerView: View {

    let video: Video

    @State private var isPosterVisible = true
    @Environment(PlayerModel.self) private var model

    var body: some View {
        if isPosterVisible {
            Button {
                // Loads the video for inline playback.
                model.loadVideo(video)
                withAnimation {
                    isPosterVisible = false
                }
            } label: {
                TrailerPosterView(video: video)
            }
            .buttonStyle(.plain)
        } else {
            PlayerView(controlsStyle: .custom)
                .onAppear {
                    if model.shouldAutoPlay {
                        model.play()
                    }
                }
        }
    }
}

/// A view that displays the poster image with a play button image over it.
private struct TrailerPosterView: View {

    let video: Video

    var body: some View {
        ZStack {
            Image(video.landscapeImageName)
                .resizable()
                .frame(maxWidth: .infinity, maxHeight: .infinity)

            VStack(spacing: 2) {
                Image(systemName: "play.fill")
                    .font(.system(size: 24.0))
                    .padding(12)
                    .background(.thinMaterial)
                    .clipShape(.circle)
                Text("Preview")
                    .font(.title3)
                    .shadow(color: .black.opacity(0.5), radius: 3, x: 1, y: 1)
            }
        }
    }
}

#Preview("Trailer View") {
    TrailerView(video: .preview)
        .environment(PlayerModel())
}

#Preview("Trailer Poster View") {
    TrailerPosterView(video: .preview)
}
DestinationView.swift
/*
Abstract:
A view that presents a 360° destination image in an immersive space.
*/

import SwiftUI
import RealityKit
import Combine

/// A view that displays a 360 degree scene in which to watch video.
struct DestinationView: View {

    @State private var destination: Destination
    @State private var destinationChanged = false
    @Environment(PlayerModel.self) private var model

    init(_ destination: Destination) {
        self.destination = destination
    }

    var body: some View {
        RealityView { content in
            let rootEntity = Entity()
            rootEntity.addSkybox(for: destination)
            content.add(rootEntity)
        } update: { content in
            guard destinationChanged else { return }
            guard let entity = content.entities.first else { fatalError() }
            entity.updateTexture(for: destination)
            Task { @MainActor in
                destinationChanged = false
            }
        }
        // Handle the case where the app is already playing video in a destination and:
        // 1. The user opens the Up Next tab and navigates to a new item, or
        // 2. The user presses the "Play Next" button in the player UI.
        .onChange(of: model.currentItem) { oldValue, newValue in
            if let newValue, destination != newValue.destination {
                destination = newValue.destination
                destinationChanged = true
            }
        }
        .transition(.opacity)
    }
}

extension Entity {

    func addSkybox(for destination: Destination) {
        let subscription = TextureResource.loadAsync(named: destination.imageName).sink(
            receiveCompletion: {
                switch $0 {
                case .finished: break
                case .failure(let error): assertionFailure("\(error)")
                }
            },
            receiveValue: { [weak self] texture in
                guard let self = self else { return }
                var material = UnlitMaterial()
                material.color = .init(texture: .init(texture))
                self.components.set(ModelComponent(
                    mesh: .generateSphere(radius: 1E3),
                    materials: [material]
                ))
                // Flip the sphere inside out so the texture displays on its inner surface.
                self.scale *= .init(x: -1, y: 1, z: 1)
                self.transform.translation += SIMD3<Float>(0.0, 1.0, 0.0)
                // Rotate the sphere to show the best initial view of the space.
                updateRotation(for: destination)
            }
        )
        components.set(Entity.SubscriptionComponent(subscription: subscription))
    }

    func updateTexture(for destination: Destination) {
        let subscription = TextureResource.loadAsync(named: destination.imageName).sink(
            receiveCompletion: {
                switch $0 {
                case .finished: break
                case .failure(let error): assertionFailure("\(error)")
                }
            },
            receiveValue: { [weak self] texture in
                guard let self = self else { return }
                guard var modelComponent = self.components[ModelComponent.self] else {
                    fatalError("The entity is missing its model component.")
                }
                var material = UnlitMaterial()
                material.color = .init(texture: .init(texture))
                modelComponent.materials = [material]
                self.components.set(modelComponent)
                // Rotate the sphere to show the best initial view of the space.
                updateRotation(for: destination)
            }
        )
        components.set(Entity.SubscriptionComponent(subscription: subscription))
    }

    func updateRotation(for destination: Destination) {
        // Rotate the immersive space around the y-axis to set the user's
        // initial view of the immersive scene.
        let angle = Angle.degrees(destination.rotationDegrees)
        let rotation = simd_quatf(angle: Float(angle.radians), axis: SIMD3<Float>(0, 1, 0))
        self.transform.rotation = rotation
    }

    /// A container for the subscription that comes from asynchronous texture loads.
    ///
    /// In order for async loading callbacks to work, the app needs to store
    /// the subscription somewhere. Storing it on a component keeps
    /// the subscription alive for as long as the component is attached.
    struct SubscriptionComponent: Component {
        var subscription: AnyCancellable
    }
}
LibraryView.swift
/*
Abstract:
A view that displays the list of videos the library contains.
*/

import SwiftUI

/// A view that presents the app's content library.
///
/// This view provides the app's main user interface. It displays two
/// horizontally scrolling rows of videos. The top row displays full-sized
/// cards that represent the Featured videos in the app. The bottom row
/// displays videos that the user adds to their Up Next queue.
struct LibraryView: View {

    @Environment(PlayerModel.self) private var model
    @Environment(VideoLibrary.self) private var library

    /// A path that represents the user's content navigation path.
    @Binding private var navigationPath: [Video]
    /// A Boolean value that indicates whether the app is currently presenting an immersive space.
    @Binding private var isPresentingSpace: Bool

    /// Creates a `LibraryView` with a binding to a selection path.
    ///
    /// The default value is an empty binding.
    init(path: Binding<[Video]>, isPresentingSpace: Binding<Bool> = .constant(false)) {
        _navigationPath = path
        _isPresentingSpace = isPresentingSpace
    }

    var body: some View {
        NavigationStack(path: $navigationPath) {
            // Wrap the content in a vertically scrolling view.
            ScrollView(showsIndicators: false) {
                VStack(alignment: .leading, spacing: verticalPadding) {
                    // Displays the Destination Video logo image.
                    Image("dv_logo")
                        .resizable()
                        .scaledToFit()
                        .padding(.leading, outerPadding)
                        .padding(.bottom, isMobile ? 0 : 8)
                        .frame(height: logoHeight)
                        .accessibilityHidden(true)

                    // Displays a horizontally scrolling list of Featured videos.
                    VideoListView(title: "Featured",
                                  videos: library.videos,
                                  cardStyle: .full,
                                  cardSpacing: horizontalSpacing)

                    // Displays a horizontally scrolling list of videos in the user's Up Next queue.
                    VideoListView(title: "Up Next",
                                  videos: library.upNext,
                                  cardStyle: .upNext,
                                  cardSpacing: horizontalSpacing)
                }
                .padding([.top, .bottom], verticalPadding)
                .navigationDestination(for: Video.self) { video in
                    DetailView(video: video)
                        .navigationTitle(video.title)
                        .navigationBarHidden(isTV)
                }
            }
            #if os(tvOS)
            .ignoresSafeArea()
            #endif
        }
        #if os(visionOS)
        // A custom view modifier that presents an immersive space when you navigate to the detail view.
        .updateImmersionOnChange(of: $navigationPath, isPresentingSpace: $isPresentingSpace)
        #endif
    }

    // MARK: - Platform-specific metrics.

    /// The vertical padding between views.
    var verticalPadding: Double {
        valueFor(iOS: 30, tvOS: 40, visionOS: 30)
    }

    var outerPadding: Double {
        valueFor(iOS: 20, tvOS: 50, visionOS: 30)
    }

    var horizontalSpacing: Double {
        valueFor(iOS: 20, tvOS: 80, visionOS: 30)
    }

    var logoHeight: Double {
        valueFor(iOS: 24, tvOS: 60, visionOS: 34)
    }
}

#Preview {
    NavigationStack {
        LibraryView(path: .constant([]))
            .environment(PlayerModel())
            .environment(VideoLibrary())
    }
}
VideoListView.swift
/*
Abstract:
A list of video cards.
*/

import SwiftUI

/// A view that presents a horizontally scrollable list of video cards.
struct VideoListView: View {

    typealias SelectionAction = (Video) -> Void

    private let title: String?
    private let videos: [Video]
    private let cardStyle: VideoCardStyle
    private let cardSpacing: Double
    private let selectionAction: SelectionAction?

    /// Creates a view to display the specified list of videos.
    /// - Parameters:
    ///   - title: An optional title to display above the list.
    ///   - videos: The list of videos to display.
    ///   - cardStyle: The style for the video cards.
    ///   - cardSpacing: The spacing between cards.
    ///   - selectionAction: An optional action that you can specify to directly handle the card selection.
    ///     When the app doesn't specify a selection action, the view presents the card as a `NavigationLink`.
    init(title: String? = nil, videos: [Video], cardStyle: VideoCardStyle, cardSpacing: Double, selectionAction: SelectionAction? = nil) {
        self.title = title
        self.videos = videos
        self.cardStyle = cardStyle
        self.cardSpacing = cardSpacing
        self.selectionAction = selectionAction
    }

    var body: some View {
        VStack(alignment: .leading, spacing: 0) {
            titleView
                .padding(.leading, margins)
                .padding(.bottom, valueFor(iOS: 8, tvOS: -40, visionOS: 12))

            ScrollView(.horizontal, showsIndicators: false) {
                HStack(spacing: cardSpacing) {
                    ForEach(videos) { video in
                        Group {
                            // If the app initializes the view with a selection action closure,
                            // display a video card button that calls it.
                            if let selectionAction {
                                Button {
                                    selectionAction(video)
                                } label: {
                                    VideoCardView(video: video, style: cardStyle)
                                }
                            }
                            // Otherwise, create a navigation link.
                            else {
                                NavigationLink(value: video) {
                                    VideoCardView(video: video, style: cardStyle)
                                }
                            }
                        }
                        .accessibilityLabel("\(video.title)")
                    }
                }
                .buttonStyle(buttonStyle)
                // In tvOS, add vertical padding to accommodate card resizing.
                .padding([.top, .bottom], isTV ? 60 : 0)
                .padding([.leading, .trailing], margins)
            }
        }
    }

    @ViewBuilder
    var titleView: some View {
        if let title {
            Text(title)
                #if os(visionOS)
                .font(cardStyle == .full ? .largeTitle : .title)
                #elseif os(tvOS)
                .font(cardStyle == .full ? .largeTitle.weight(.semibold) : .title2)
                #else
                .font(cardStyle == .full ? .title2.bold() : .title3.bold())
                #endif
        }
    }

    var buttonStyle: some PrimitiveButtonStyle {
        #if os(tvOS)
        .card
        #else
        .plain
        #endif
    }

    var margins: Double {
        valueFor(iOS: 20, tvOS: 50, visionOS: 30)
    }
}

#Preview("Full") {
    NavigationStack {
        VideoListView(title: "Featured", videos: .all, cardStyle: .full, cardSpacing: 80)
            .frame(height: 380)
    }
}

#Preview("Up Next") {
    NavigationStack {
        VideoListView(title: "Up Next", videos: .all, cardStyle: .upNext, cardSpacing: 20)
            .frame(height: 200)
    }
}

#Preview("Compact") {
    NavigationStack {
        VideoListView(videos: .all, cardStyle: .compact, cardSpacing: 20)
            .padding()
    }
}
VideoCardView.swift
/*
Abstract:
A view that represents a video in the library.
*/

import SwiftUI

/// Constants that represent the supported styles for video cards.
enum VideoCardStyle {

    /// A full video card style.
    ///
    /// This style presents a poster image on top and information about the video
    /// below, including video description and genres.
    case full
    /// A style for cards in the Up Next list.
    ///
    /// This style presents a medium-sized poster image on top and a title string below.
    case upNext
    /// A compact video card style.
    ///
    /// This style presents a compact-sized poster image on top and a title string below.
    case compact

    var cornerRadius: Double {
        switch self {
        case .full:
            #if os(tvOS)
            12.0
            #else
            20.0
            #endif
        case .upNext: 12.0
        case .compact: 10.0
        }
    }
}

/// A view that represents a video in the library.
///
/// A user can select a video card to view the video details.
struct VideoCardView: View {

    let video: Video
    let style: VideoCardStyle
    let cornerRadius = 20.0

    /// Creates a video card view with a video and an optional style.
    ///
    /// The default style is `.full`.
    init(video: Video, style: VideoCardStyle = .full) {
        self.video = video
        self.style = style
    }

    var image: some View {
        Image(video.landscapeImageName)
            .resizable()
            .scaledToFill()
    }

    var body: some View {
        switch style {
        case .compact:
            posterCard
                .frame(width: valueFor(iOS: 0, tvOS: 400, visionOS: 200))
        case .upNext:
            posterCard
                .frame(width: valueFor(iOS: 250, tvOS: 500, visionOS: 360))
        case .full:
            VStack {
                image
                VStack(alignment: .leading) {
                    InfoLineView(year: video.info.releaseYear,
                                 rating: video.info.contentRating,
                                 duration: video.info.duration)
                    .foregroundStyle(.secondary)
                    .padding(.top, -10)
                    Text(video.title)
                        .font(isTV ? .title3 : .title)
                    Text(video.description)
                        #if os(tvOS)
                        .font(.callout)
                        #endif
                        .lineLimit(2)
                    Spacer()
                    HStack {
                        GenreView(genres: video.info.genres)
                    }
                }
                .padding(20)
            }
            .background(.thinMaterial)
            #if os(tvOS)
            .frame(width: 550, height: 590)
            #else
            .frame(width: isVision ? 395 : 300)
            .shadow(radius: 5)
            .hoverEffect()
            #endif
            .cornerRadius(style.cornerRadius)
        }
    }

    @ViewBuilder
    var posterCard: some View {
        #if os(tvOS)
        ZStack(alignment: .bottom) {
            image
            // Material gradient
            GradientView(style: .ultraThinMaterial, height: 90, startPoint: .top)
            Text(video.title)
                .font(.caption.bold())
                .padding()
        }
        .cornerRadius(style.cornerRadius)
        #else
        VStack {
            image
                .cornerRadius(style.cornerRadius)
            Text(video.title)
                .font(isVision ? .title3 : .headline)
                .lineLimit(1)
        }
        .hoverEffect()
        #endif
    }
}
VideoInfoView.swift
/*
Abstract:
A view that displays information about a video like its title, actors, and rating.
*/

import SwiftUI

struct VideoInfoView: View {

    let video: Video

    var body: some View {
        VStack(alignment: .leading) {
            Text(video.title)
                .font(isVision ? .title : .title2)
                .padding(.bottom, 4)
            InfoLineView(year: video.info.releaseYear,
                         rating: video.info.contentRating,
                         duration: video.info.duration)
            .padding([.bottom], 4)
            GenreView(genres: video.info.genres)
                .padding(.bottom, 4)
            RoleView(role: String(localized: "Stars"), people: video.info.stars)
                .padding(.top, 1)
            RoleView(role: String(localized: "Director"), people: video.info.directors)
                .padding(.top, 1)
            RoleView(role: String(localized: "Writers"), people: video.info.writers)
                .padding(.top, 1)
                .padding(.bottom, 12)
            Text(video.description)
                .font(.headline)
                .padding(.bottom, 12)
        }
    }
}

/// A view that displays a horizontal list of the video's year, rating, and duration.
struct InfoLineView: View {

    let year: String
    let rating: String
    let duration: String

    var body: some View {
        HStack {
            Text("\(year) | \(rating) | \(duration)")
                .font(isTV ? .caption : .subheadline.weight(.medium))
        }
    }
}

/// A view that displays a comma-separated list of genres for a video.
struct GenreView: View {

    let genres: [String]

    var body: some View {
        HStack(spacing: 8) {
            ForEach(genres, id: \.self) {
                Text($0)
                    .fixedSize()
                    #if os(visionOS)
                    .font(.caption2.weight(.bold))
                    #else
                    .font(.caption)
                    #endif
                    .padding([.leading, .trailing], isTV ? 8 : 4)
                    .padding([.top, .bottom], 4)
                    .background(RoundedRectangle(cornerRadius: 5).stroke())
                    .foregroundStyle(.secondary)
            }
            // Push the list to the leading edge.
            Spacer()
        }
    }
}

/// A view that displays the name of a role, followed by one or more people who hold the position.
struct RoleView: View {

    let role: String
    let people: [String]

    var body: some View {
        VStack(alignment: .leading) {
            Text(role)
            Text(people.formatted())
                .foregroundStyle(.secondary)
        }
    }
}

#if os(visionOS)
#Preview {
    VideoInfoView(video: .preview)
        .padding()
        .frame(width: 500, height: 500)
        .background(.gray)
}
#endif
WideDetailView.swift
/*
Abstract:
A view that displays video detail in a wide layout.
*/

import SwiftUI

/// A view that displays action controls and video detail in a horizontal layout.
///
/// The detail view in iPadOS and tvOS uses this view to display the video information.
struct WideDetailView: View {

    let video: Video
    let player: PlayerModel
    let library: VideoLibrary

    var body: some View {
        // Arrange the content in a horizontal layout.
        HStack(alignment: .top, spacing: isTV ? 40 : 20) {
            VStack {
                // A button that plays the video in a full-window presentation.
                Button {
                    /// Load the media item for full-window presentation.
                    player.loadVideo(video, presentation: .fullWindow)
                } label: {
                    Label("Play Video", systemImage: "play.fill")
                        .frame(maxWidth: .infinity)
                }
                // A button to toggle whether the video is in the user's Up Next queue.
                Button {
                    // Calling this method toggles the video's inclusion state in the Up Next queue.
                    library.toggleUpNextState(for: video)
                } label: {
                    let isUpNext = library.isVideoInUpNext(video)
                    Label(isUpNext ? "In Up Next" : "Add to Up Next",
                          systemImage: isUpNext ? "checkmark" : "plus")
                        .frame(maxWidth: .infinity)
                }
            }
            .fontWeight(.semibold)
            .foregroundStyle(.black)
            .buttonStyle(.borderedProminent)
            .frame(width: isTV ? 400 : 200)
            // Make the buttons the same width.
            .fixedSize(horizontal: true, vertical: false)

            Text(video.description)

            VStack(alignment: .leading, spacing: 4) {
                RoleView(role: "Stars", people: video.info.stars)
                RoleView(role: "Director", people: video.info.directors)
                RoleView(role: "Writers", people: video.info.writers)
            }
        }
        .frame(height: isTV ? 300 : 150)
        .padding([.leading, .trailing], isTV ? 80 : 40)
        .padding(.bottom, 20)
    }
}
GradientView.swift
/*
Abstract:
A gradient view.
*/

import SwiftUI

/// A view that displays a vertical gradient.
struct GradientView: View {

    /// A fill style for the gradient.
    let style: any ShapeStyle
    /// The height of the view in points.
    let height: Double
    /// The start point of the gradient.
    ///
    /// This value can be `.top` or `.bottom`.
    let startPoint: UnitPoint

    /// Creates a gradient view.
    /// - Parameters:
    ///   - style: A fill style for the gradient.
    ///   - height: The height of the view in points.
    ///   - startPoint: The start point of the gradient. The system throws a fatal error if that value
    ///     isn't `.top` or `.bottom`.
    init(style: any ShapeStyle, height: Double, startPoint: UnitPoint) {
        guard startPoint == .top || startPoint == .bottom else { fatalError() }
        self.style = style
        self.height = height
        self.startPoint = startPoint
    }

    var body: some View {
        Rectangle()
            .fill(AnyShapeStyle(style))
            .frame(height: height)
            .mask {
                LinearGradient(colors: [.clear, .black, .black],
                               startPoint: startPoint,
                               // Set the end point to be the opposite of the start.
                               endPoint: startPoint == .top ? .bottom : .top)
            }
    }
}

#Preview {
    GradientView(style: .thinMaterial, height: 200, startPoint: .top)
}
ViewModifiers.swift
/*
Abstract:
Custom view modifiers that the app defines.
*/
import SwiftUI
extension View {
#if os(visionOS)
func updateImmersionOnChange(of path: Binding<[Video]>, isPresentingSpace:
Binding<Bool>) -> some View {
self.modifier(ImmersiveSpacePresentationModifier(navigationPath: path,
isPresentingSpace: isPresentingSpace))
}
#endif
// Only used in iOS and tvOS for full-screen modal presentation.
func fullScreenCoverPlayer(player: PlayerModel) -> some View {
self.modifier(FullScreenCoverModifier(player: player))
}
}
#if os(visionOS)
private struct ImmersiveSpacePresentationModifier: ViewModifier {
@Environment(\.openImmersiveSpace) private var openSpace
@Environment(\.dismissImmersiveSpace) private var dismissSpace
/// The current phase for the scene, which can be active, inactive, or background.
@Environment(\.scenePhase) private var scenePhase
@Binding var navigationPath: [Video]
@Binding var isPresentingSpace: Bool
func body(content: Content) -> some View {
content
.onChange(of: navigationPath) {
Task {
// The selection path becomes empty when the user returns to the main library window.
if navigationPath.isEmpty {
if isPresentingSpace {
// Dismiss the space and return the user to their real-world space.
await dismissSpace()
isPresentingSpace = false
}
} else {
guard !isPresentingSpace else { return }
// The navigation path holds at most one video; it's non-empty in this branch.
guard let video = navigationPath.first else { fatalError() }
// Await the request to open the destination and set the state accordingly.
switch await openSpace(value: video.destination) {
case .opened: isPresentingSpace = true
default: isPresentingSpace = false
}
}
}
}
// Close the space and unload media when the user backgrounds the app.
.onChange(of: scenePhase) { _, newPhase in
if isPresentingSpace, newPhase == .background {
Task {
await dismissSpace()
}
}
}
}
}
#endif
private struct FullScreenCoverModifier: ViewModifier {
let player: PlayerModel
@State private var isPresentingPlayer = false
func body(content: Content) -> some View {
content
.fullScreenCover(isPresented: $isPresentingPlayer) {
PlayerView()
.onAppear {
player.play()
}
.onDisappear {
player.reset()
}
.frame(maxWidth: .infinity, maxHeight: .infinity)
.ignoresSafeArea()
}
// Observe the player's presentation property.
.onChange(of: player.presentation, { _, newPresentation in
isPresentingPlayer = newPresentation == .fullWindow
})
}
}
Video.swift
/*
Abstract:
An object that represents a video in the app's library.
*/
import Foundation
import UIKit
struct Video: Identifiable, Hashable, Codable {
/// The unique identifier of the item.
let id: Int
/// The URL of the video, which can be local or remote.
let url: URL
/// The title of the video.
let title: String
/// The base image name.
let imageName: String
/// The description of the video.
let description: String
/// The name of the video's portrait image.
var portraitImageName: String { "\(imageName)_portrait" }
/// The name of the video's landscape image.
var landscapeImageName: String { "\(imageName)_landscape" }
/// The data for the landscape image to create a metadata item to display in the Info panel.
var imageData: Data {
UIImage(named: landscapeImageName)?.pngData() ?? Data()
}
/// Detailed information about the video like its stars and content rating.
let info: Info
/// A URL that resolves to specific local or remote media.
var resolvedURL: URL {
if url.scheme == nil {
return URL(fileURLWithPath: "\(Bundle.main.bundlePath)/\(url.path)")
}
return url
}
/// A Boolean value that indicates whether the video is hosted in a remote location.
var hasRemoteMedia: Bool {
url.scheme != nil
}
/// A destination in which to watch the video.
/// The app presents this destination in an immersive space.
var destination: Destination
/// An object that provides detailed information for a video.
struct Info: Hashable, Codable {
var releaseYear: String
var contentRating: String
var duration: String
var genres: [String]
var stars: [String]
var directors: [String]
var writers: [String]
var releaseDate: Date {
var components = DateComponents()
components.year = Int(releaseYear)
let calendar = Calendar(identifier: .gregorian)
return calendar.date(from: components)!
}
}
}
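Because `VideoLibrary` decodes this type with a `convertFromSnakeCase` key strategy, a library entry uses snake_case keys. The following sketch shows a decodable shape for `Video`; the field values are illustrative, not taken from the app's actual `Videos.json`.

```swift
import Foundation

// A sketch of JSON that decodes into `Video`. The snake_case keys
// ("image_name", "release_year") map onto the Swift property names
// `imageName` and `releaseYear` via `convertFromSnakeCase`.
let sample = """
{
  "id": 1,
  "url": "videos/beach.mov",
  "title": "Beach Day",
  "image_name": "beach",
  "description": "An illustrative entry.",
  "destination": "beach",
  "info": {
    "release_year": "2023",
    "content_rating": "NR",
    "duration": "2m",
    "genres": ["Documentary"],
    "stars": [],
    "directors": [],
    "writers": []
  }
}
""".data(using: .utf8)!

let decoder = JSONDecoder()
decoder.keyDecodingStrategy = .convertFromSnakeCase
do {
    let video = try decoder.decode(Video.self, from: sample)
    // A scheme-less URL resolves to a file inside the app bundle.
    print(video.hasRemoteMedia) // false
} catch {
    print("Decoding failed: \(error)")
}
```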
VideoLibrary.swift
/*
Abstract:
The app's video library.
*/
import Foundation
import SwiftUI
import Observation
/// An object that manages the app's video content.
///
/// The app puts an instance of this class into the environment so it can retrieve and
/// update the state of video content in the library.
@Observable class VideoLibrary {
private(set) var videos = [Video]()
private(set) var upNext = [Video]()
// A URL within the user's Documents directory to which to write their Up Next entries.
private let upNextURL = URL.documentsDirectory.appendingPathComponent("UpNext.json")
init() {
// Load all videos available in the library.
videos = loadVideos()
// The first time the app launches, set the last three videos as the default Up Next items.
upNext = loadUpNextVideos(default: Array(videos.suffix(3)))
}
/// Toggles whether the video exists in the Up Next queue.
/// - Parameter video: the video to update
func toggleUpNextState(for video: Video) {
if !upNext.contains(video) {
// Insert the video at the beginning of the list.
upNext.insert(video, at: 0)
} else {
// Remove the entry with the matching identifier.
upNext.removeAll(where: { $0.id == video.id })
}
// Persist the Up Next state to disk.
saveUpNext()
}
/// Returns a Boolean value that indicates whether the video exists in the Up Next list.
/// - Parameter video: the video to test.
/// - Returns: `true` if the item is in the Up Next list; otherwise, `false`.
func isVideoInUpNext(_ video: Video) -> Bool {
upNext.contains(video)
}
/// Finds the items to display in the video player's Up Next list.
func findUpNext(for video: Video) -> [Video] {
upNext.filter { $0.id != video.id }
}
/// Finds the next video in the Up Next list after the current video.
/// - Parameter video: the current video
/// - Returns: the next video, or `nil` if none exists.
func findVideoInUpNext(after video: Video) -> Video? {
switch upNext.count {
case 0:
// Up Next is empty. Return nil.
return nil
case 1:
// The passed-in video is the only item in `upNext`; return nil.
if upNext.first == video {
return nil
} else {
// Return the only item.
return upNext.first
}
default:
// Find the index of the passed-in video. If the video isn't in `upNext`, start at the first item.
let videoIndex = upNext.firstIndex(of: video) ?? 0
if videoIndex < upNext.count - 1 {
return upNext[videoIndex + 1]
}
return upNext[0]
}
}
/// Loads the video content for the app.
private func loadVideos() -> [Video] {
let filename = "Videos.json"
guard let url = Bundle.main.url(forResource: filename, withExtension: nil) else {
fatalError("Couldn't find \(filename) in main bundle.")
}
return load(url)
}
/// Loads the user's list of videos in their Up Next list.
private func loadUpNextVideos(`default` defaultVideos: [Video]) -> [Video] {
// If this file doesn't exist, create it.
if !FileManager.default.fileExists(atPath: upNextURL.path) {
// Create an initial file with a default value.
if !FileManager.default.createFile(atPath: upNextURL.path, contents: "\(defaultVideos.map { $0.id })".data(using: .utf8)) {
fatalError("Couldn't initialize Up Next store.")
}
}
// Load the ids of the videos in the list.
let ids: [Int] = load(upNextURL)
return videos.filter { ids.contains($0.id) }
}
/// Saves the Up Next data to disk.
///
/// The app saves the state using simple JSON persistence.
private func saveUpNext() {
do {
let encoder = JSONEncoder()
encoder.outputFormatting = .prettyPrinted
// Persist the ids only.
let data = try encoder.encode(upNext.map { $0.id })
try data.write(to: upNextURL)
} catch {
logger.error("Unable to save JSON data.")
}
}
private func load<T: Decodable>(_ url: URL, as type: T.Type = T.self) -> T {
let data: Data
do {
data = try Data(contentsOf: url)
} catch {
fatalError("Couldn't load \(url.path):\n\(error)")
}
do {
let decoder = JSONDecoder()
decoder.keyDecodingStrategy = .convertFromSnakeCase
return try decoder.decode(T.self, from: data)
} catch {
fatalError("Couldn't parse \(url.lastPathComponent) as \(T.self):\n\(error)")
}
}
}
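To make the queue behavior concrete, here's a usage sketch of `VideoLibrary`. It assumes the bundled `Videos.json` loads successfully and contains at least one video.

```swift
// A usage sketch; assumes the bundled library is non-empty.
let library = VideoLibrary()
guard let video = library.videos.first else { fatalError("Empty library.") }

// Toggling inserts the video at the front of Up Next and saves to disk.
library.toggleUpNextState(for: video)
assert(library.isVideoInUpNext(video))

// The player's Up Next tab excludes the currently playing video.
let related = library.findUpNext(for: video)
assert(!related.contains(video))

// Toggling again removes the entry and persists the change.
library.toggleUpNextState(for: video)
assert(!library.isVideoInUpNext(video))
```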
Destination.swift
/*
Abstract:
Constants that define named destinations the app supports.
*/
import Foundation
enum Destination: String, CaseIterable, Identifiable, Codable {
case beach
case camping
case creek
case hillside
case lake
case ocean
case park
var id: Self { self }
/// The environment image to load.
var imageName: String { "\(rawValue)_scene" }
/// A number of degrees to rotate the 360 "destination" image to provide the best initial view.
var rotationDegrees: Double {
switch self {
case .beach: 55
case .camping: -55
case .creek: 0
case .hillside: 0
case .lake: -55
case .ocean: 0
case .park: 190
}
}
}
PlayerView.swift
/*
Abstract:
A view that presents the video player.
*/
import SwiftUI
/// Constants that define the style of controls a player presents.
enum PlayerControlsStyle {
/// A value that indicates to use the system interface that AVPlayerViewController provides.
case system
/// A value that indicates to use compact controls that display a play/pause button.
case custom
}
/// A view that presents the video player.
struct PlayerView: View {
let controlsStyle: PlayerControlsStyle
@State private var showContextualActions = false
@Environment(PlayerModel.self) private var model
/// Creates a new player view.
init(controlsStyle: PlayerControlsStyle = .system) {
self.controlsStyle = controlsStyle
}
var body: some View {
switch controlsStyle {
case .system:
SystemPlayerView(showContextualActions: showContextualActions)
.onChange(of: model.shouldProposeNextVideo) { oldValue, newValue in
if oldValue != newValue {
showContextualActions = newValue
}
}
case .custom:
InlinePlayerView()
}
}
}
PlayerModel.swift
/*
Abstract:
A model object that manages the playback of video.
*/
import AVKit
import GroupActivities
import Combine
import Observation
/// The presentation modes the player supports.
enum Presentation {
/// Indicates to present the player as a child of a parent user interface.
case inline
/// Indicates to present the player in full-window exclusive mode.
case fullWindow
}
@Observable class PlayerModel {
/// A Boolean value that indicates whether playback is currently active.
private(set) var isPlaying = false
/// A Boolean value that indicates whether playback of the current item is complete.
private(set) var isPlaybackComplete = false
/// The presentation in which to display the current media.
private(set) var presentation: Presentation = .inline
/// The currently loaded video.
private(set) var currentItem: Video? = nil
/// A Boolean value that indicates whether the player should propose playing the next video
/// in the Up Next list.
private(set) var shouldProposeNextVideo = false
/// An object that manages the playback of a video's media.
private var player = AVPlayer()
/// The currently presented player view controller.
///
/// The life cycle of an `AVPlayerViewController` object is different than a typical view controller.
/// In addition to displaying the player UI within your app, the view controller also manages the
/// presentation of the media outside your app UI, such as when using AirPlay, Picture in Picture,
/// or docked full window. To ensure the view controller instance is preserved in these cases,
/// the app stores a reference to it here (which is an environment-scoped object).
///
/// This value is set by a call to the `makePlayerViewController()` method.
private var playerViewController: AVPlayerViewController? = nil
private var playerViewControllerDelegate: AVPlayerViewControllerDelegate? = nil
private(set) var shouldAutoPlay = true
// An object that manages the app's SharePlay implementation.
private var coordinator: VideoWatchingCoordinator! = nil
/// A token for periodic observation of the player's time.
private var timeObserver: Any? = nil
private var subscriptions = Set<AnyCancellable>()
init() {
coordinator = VideoWatchingCoordinator(playbackCoordinator:
player.playbackCoordinator)
observePlayback()
Task {
await configureAudioSession()
await observeSharedVideo()
}
}
/// Creates a new player view controller object.
/// - Returns: a configured player view controller.
func makePlayerViewController() -> AVPlayerViewController {
let delegate = PlayerViewControllerDelegate(player: self)
let controller = AVPlayerViewController()
controller.player = player
controller.delegate = delegate
// Set the model state
playerViewController = controller
playerViewControllerDelegate = delegate
return controller
}
private func observePlayback() {
// Return early if the model calls this more than once.
guard subscriptions.isEmpty else { return }
// Observe the time control status to determine whether playback is active.
player.publisher(for: \.timeControlStatus)
.receive(on: DispatchQueue.main)
.removeDuplicates()
.sink { [weak self] status in
self?.isPlaying = status == .playing
}
.store(in: &subscriptions)
// Observe this notification to know when a video plays to its end.
NotificationCenter.default
.publisher(for: .AVPlayerItemDidPlayToEndTime)
.receive(on: DispatchQueue.main)
.map { _ in true }
.sink { [weak self] isPlaybackComplete in
self?.isPlaybackComplete = isPlaybackComplete
}
.store(in: &subscriptions)
// Observe audio session interruptions.
NotificationCenter.default
.publisher(for: AVAudioSession.interruptionNotification)
.receive(on: DispatchQueue.main)
.sink { [weak self] notification in
// Wrap the notification in a helper type that extracts the interruption type and options.
guard let result = InterruptionResult(notification) else { return }
// Resume playback, if appropriate.
if result.type == .ended && result.options == .shouldResume {
self?.player.play()
}
}.store(in: &subscriptions)
// Add an observer of the player object's current time. The app observes
// the player's current time to determine when to propose playing the next
// video in the Up Next list.
addTimeObserver()
}
/// Configures the audio session for video playback.
private func configureAudioSession() async {
let session = AVAudioSession.sharedInstance()
do {
// Configure the audio session for playback. Set the `moviePlayback` mode
// to reduce the audio's dynamic range to help normalize audio levels.
try session.setCategory(.playback, mode: .moviePlayback)
} catch {
logger.error("Unable to configure audio session: \(error.localizedDescription)")
}
}
/// Monitors the coordinator's `sharedVideo` property.
///
/// If this value changes due to a remote participant sharing a new activity, load and present
/// the new video.
private func observeSharedVideo() async {
let current = currentItem
await coordinator.$sharedVideo
.receive(on: DispatchQueue.main)
// Only observe non-nil values.
.compactMap { $0 }
// Only observe updates set by a remote participant.
.filter { $0 != current }
.sink { [weak self] video in
guard let self else { return }
// Load the video for full-window presentation.
loadVideo(video, presentation: .fullWindow)
}
.store(in: &subscriptions)
}
/// Loads a video for playback in the requested presentation.
/// - Parameters:
/// - video: The video to load for playback.
/// - presentation: The style in which to present the player.
/// - autoplay: A Boolean value that indicates whether to automatically play the content when presented.
func loadVideo(_ video: Video, presentation: Presentation = .inline, autoplay: Bool = true) {
// Update the model state for the request.
currentItem = video
shouldAutoPlay = autoplay
isPlaybackComplete = false
switch presentation {
case .fullWindow:
Task { @MainActor in
// Attempt to SharePlay this video if a FaceTime call is active.
await coordinator.coordinatePlayback(of: video)
// After preparing for coordination, load the video into the player and present it.
replaceCurrentItem(with: video)
}
case .inline:
// Don't SharePlay the video when playing it from the inline player;
// just load the video into the player and present it.
replaceCurrentItem(with: video)
}
// In visionOS, configure the spatial experience for either .inline or .fullWindow playback.
configureAudioExperience(for: presentation)
// Set the presentation, which typically presents the player full window.
self.presentation = presentation
}
private func replaceCurrentItem(with video: Video) {
// Create a new player item and set it as the player's current item.
let playerItem = AVPlayerItem(url: video.resolvedURL)
// Set external metadata on the player item for the current video.
playerItem.externalMetadata = createMetadataItems(for: video)
// Set the new player item as current, and begin loading its data.
player.replaceCurrentItem(with: playerItem)
logger.debug("\(video.title) enqueued for playback.")
}
/// Clears any loaded media and resets the player model to its default state.
func reset() {
currentItem = nil
player.replaceCurrentItem(with: nil)
playerViewController = nil
playerViewControllerDelegate = nil
// Reset the presentation state on the next cycle of the run loop.
Task { @MainActor in
presentation = .inline
}
}
/// Creates metadata items from the video item's data.
/// - Parameter video: the video to create metadata for.
/// - Returns: An array of `AVMetadataItem` to set on a player item.
private func createMetadataItems(for video: Video) -> [AVMetadataItem] {
let mapping: [AVMetadataIdentifier: Any] = [
.commonIdentifierTitle: video.title,
.commonIdentifierArtwork: video.imageData,
.commonIdentifierDescription: video.description,
.commonIdentifierCreationDate: video.info.releaseDate,
.iTunesMetadataContentRating: video.info.contentRating,
.quickTimeMetadataGenre: video.info.genres
]
return mapping.compactMap { createMetadataItem(for: $0, value: $1) }
}
/// Creates a metadata item for the specified identifier and value.
/// - Parameters:
/// - identifier: an identifier for the item.
/// - value: a value to associate with the item.
/// - Returns: a new `AVMetadataItem` object.
private func createMetadataItem(for identifier: AVMetadataIdentifier,
value: Any) -> AVMetadataItem {
let item = AVMutableMetadataItem()
item.identifier = identifier
item.value = value as? NSCopying & NSObjectProtocol
// Specify "und" to indicate an undefined language.
item.extendedLanguageTag = "und"
return item.copy() as! AVMetadataItem
}
/// Configures the user's intended spatial audio experience to best fit the presentation.
/// - Parameter presentation: the requested player presentation.
private func configureAudioExperience(for presentation: Presentation) {
#if os(visionOS)
do {
let experience: AVAudioSessionSpatialExperience
switch presentation {
case .inline:
// Set a small, focused sound stage when watching trailers.
experience = .headTracked(soundStageSize: .small, anchoringStrategy: .automatic)
case .fullWindow:
// Set a large sound stage size when viewing full window.
experience = .headTracked(soundStageSize: .large, anchoringStrategy: .automatic)
}
try AVAudioSession.sharedInstance().setIntendedSpatialExperience(experience)
} catch {
logger.error("Unable to set the intended spatial experience. \(error.localizedDescription)")
}
#endif
}
// MARK: - Transport Control
func play() {
player.play()
}
func pause() {
player.pause()
}
func togglePlayback() {
player.timeControlStatus == .paused ? play() : pause()
}
// MARK: - Time Observation
private func addTimeObserver() {
removeTimeObserver()
// Observe the player's timing every 1 second
let timeInterval = CMTime(value: 1, timescale: 1)
timeObserver = player.addPeriodicTimeObserver(forInterval: timeInterval, queue: .main) { [weak self] time in
guard let self = self, let duration = player.currentItem?.duration else { return }
// Propose playing the next episode within 10 seconds of the end of the current episode.
let isInProposalRange = time.seconds >= duration.seconds - 10.0
if shouldProposeNextVideo != isInProposalRange {
shouldProposeNextVideo = isInProposalRange
}
}
}
private func removeTimeObserver() {
guard let timeObserver = timeObserver else { return }
player.removeTimeObserver(timeObserver)
self.timeObserver = nil
}
/// A coordinator that acts as the player view controller's delegate object.
final class PlayerViewControllerDelegate: NSObject, AVPlayerViewControllerDelegate {
let player: PlayerModel
init(player: PlayerModel) {
self.player = player
}
#if os(visionOS)
// The app adopts this method to reset the state of the player model when a user
// taps the back button in the visionOS player UI.
func playerViewController(_ playerViewController: AVPlayerViewController,
willEndFullScreenPresentationWithAnimationCoordinator coordinator:
UIViewControllerTransitionCoordinator) {
Task { @MainActor in
// Calling reset dismisses the full-window player.
player.reset()
}
}
#endif
}
}
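The playback flow above can be sketched from the caller's side. This is a minimal usage sketch, not app code: `video` stands in for whichever library item the user selects, and the `presentation`/`autoplay` arguments are shown with their default values.

```swift
// A sketch of driving PlayerModel from UI code.
let model = PlayerModel()
guard let video = VideoLibrary().videos.first else { fatalError("Empty library.") }

// Load for inline preview playback; no SharePlay coordination occurs
// for the .inline presentation.
model.loadVideo(video, presentation: .inline, autoplay: true)

// Toggle between play and pause from a custom control.
model.togglePlayback()

// Tear down when the player view disappears; this clears the loaded
// media and returns the presentation state to .inline.
model.reset()
```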
SystemPlayerView.swift
/*
Abstract:
A view that provides a platform-specific playback user interface.
*/
import AVKit
import SwiftUI
// This view is a SwiftUI wrapper over `AVPlayerViewController`.
struct SystemPlayerView: UIViewControllerRepresentable {
@Environment(PlayerModel.self) private var model
@Environment(VideoLibrary.self) private var library
let showContextualActions: Bool
init(showContextualActions: Bool) {
self.showContextualActions = showContextualActions
}
func makeUIViewController(context: Context) -> AVPlayerViewController {
// Create a player view controller.
let controller = model.makePlayerViewController()
// Enable PiP on iOS and tvOS.
controller.allowsPictureInPicturePlayback = true
#if os(visionOS) || os(tvOS)
// Display an Up Next tab in the player UI.
if let upNextViewController {
controller.customInfoViewControllers = [upNextViewController]
}
#endif
// Return the configured controller object.
return controller
}
func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
#if os(visionOS) || os(tvOS)
Task { @MainActor in
// Rebuild the list of related videos if necessary.
if let upNextViewController {
controller.customInfoViewControllers = [upNextViewController]
}
if let upNextAction, showContextualActions {
controller.contextualActions = [upNextAction]
} else {
controller.contextualActions = []
}
}
#endif
}
// A view controller that presents a list of Up Next videos.
var upNextViewController: UIViewController? {
guard let video = model.currentItem else { return nil }
// Find the Up Next list for this video. Return early if it's empty.
let videos = library.findUpNext(for: video)
if videos.isEmpty { return nil }
let view = UpNextView(videos: videos, model: model)
let controller = UIHostingController(rootView: view)
// AVPlayerViewController uses the view controller's title as the tab name.
// Specify the view controller's title before setting as a `customInfoViewControllers` value.
controller.title = view.title
// Set the preferred content size for the tab.
controller.preferredContentSize = CGSize(width: 500, height: isTV ? 250 : 150)
return controller
}
var upNextAction: UIAction? {
// If there's no video loaded, return nil.
guard let video = model.currentItem else { return nil }
// Find the next video to play.
guard let nextVideo = library.findVideoInUpNext(after: video) else { return nil }
return UIAction(title: "Play Next", image: UIImage(systemName: "play.fill")) { _ in
// Load the video for full-window presentation.
model.loadVideo(nextVideo, presentation: .fullWindow)
}
}
}
InlinePlayerView.swift
/*
Abstract:
A view that displays a simple inline video player with custom controls.
*/
import AVKit
import SwiftUI
struct InlinePlayerView: View {
@Environment(PlayerModel.self) private var model
var body: some View {
ZStack {
// A view that uses AVPlayerViewController to display the video content without controls.
VideoContentView()
// Custom inline controls to overlay on top of the video content.
InlineControlsView()
}
.onDisappear {
// If this view disappears, and it's not due to switching to full-window
// presentation, clear the model's loaded media.
if model.presentation != .fullWindow {
model.reset()
}
}
}
}
/// A view that defines a simple play/pause/replay button for the trailer player.
struct InlineControlsView: View {
@Environment(PlayerModel.self) private var player
@State private var isShowingControls = false
var body: some View {
VStack {
Image(systemName: player.isPlaying ? "pause.fill" : "play.fill")
.padding(8)
.background(.thinMaterial)
.clipShape(.circle)
}
.font(.largeTitle)
.frame(maxWidth: .infinity, maxHeight: .infinity)
.contentShape(Rectangle())
.onTapGesture {
player.togglePlayback()
isShowingControls = true
// Execute the code below on the next runloop cycle.
Task { @MainActor in
if player.isPlaying {
dismissAfterDelay()
}
}
}
}
func dismissAfterDelay() {
Task {
try! await Task.sleep(for: .seconds(3.0))
withAnimation(.easeOut(duration: 0.3)) {
isShowingControls = false
}
}
}
}
/// A view that presents the video content of a player object.
///
/// This class is a view controller representable type that adapts the interface
/// of AVPlayerViewController. It disables the view controller's default controls
/// so it can draw custom controls over the video content.
private struct VideoContentView: UIViewControllerRepresentable {
@Environment(PlayerModel.self) private var model
func makeUIViewController(context: Context) -> AVPlayerViewController {
let controller = model.makePlayerViewController()
// Disable the default system playback controls.
controller.showsPlaybackControls = false
return controller
}
func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {}
}
#Preview {
InlineControlsView()
.environment(PlayerModel())
}
UpNextView.swift
/*
Abstract:
A view that displays a list of movies related to the currently playing content.
*/
import SwiftUI
struct UpNextView: View {
let title = "Up Next"
let videos: [Video]
let model: PlayerModel
var body: some View {
VideoListView(videos: videos, cardStyle: .compact, cardSpacing: isTV ? 50 : 30) { video in
model.loadVideo(video, presentation: .fullWindow)
}
}
}
#Preview {
UpNextView(videos: .all, model: PlayerModel())
.padding()
}
VideoWatchingActivity.swift
/*
Abstract:
A group activity to watch a video with others.
*/
import Foundation
import Combine
import GroupActivities
import CoreTransferable
import UIKit
struct VideoWatchingActivity: GroupActivity, Transferable {
// A video to watch.
let video: Video
// Metadata that the system displays to participants.
var metadata: GroupActivityMetadata {
var metadata = GroupActivityMetadata()
metadata.type = .watchTogether
metadata.title = video.title
metadata.previewImage = previewImage
metadata.fallbackURL = fallbackURL
metadata.supportsContinuationOnTV = true
return metadata
}
var previewImage: CGImage? {
UIImage(named: video.landscapeImageName)?.cgImage
}
var fallbackURL: URL? {
// When working with remote media, specify the media's URL as the fallback.
video.hasRemoteMedia ? video.resolvedURL : nil
}
}
VideoWatchingCoordinator.swift
/*
Abstract:
An actor that manages the coordinated playback of a video with participants in a group session.
*/
import Foundation
import Combine
import GroupActivities
import AVFoundation
actor VideoWatchingCoordinator {
// Published values that the player, and other UI items, observe.
@Published private(set) var sharedVideo: Video?
// Combine subscriptions.
private var subscriptions = Set<AnyCancellable>()
// An object that determines video equality when sharing.
private let coordinatorDelegate = CoordinatorDelegate()
// A player object's playback coordinator.
private var playbackCoordinator: AVPlayerPlaybackCoordinator
init(playbackCoordinator: AVPlayerPlaybackCoordinator) {
self.playbackCoordinator = playbackCoordinator
self.playbackCoordinator.delegate = coordinatorDelegate
Task {
await startObservingSessions()
}
}
private func startObservingSessions() async {
// Await new sessions to watch video together.
for await session in VideoWatchingActivity.sessions() {
// Clean up the old session, if it exists.
cleanUpSession(groupSession)
#if os(visionOS)
// Retrieve the new session's system coordinator object to update its configuration.
guard let systemCoordinator = await session.systemCoordinator else { continue }
// Create a new configuration that enables all participants to share the same immersive space.
var configuration = SystemCoordinator.Configuration()
configuration.supportsGroupImmersiveSpace = true
// Update the coordinator's configuration.
systemCoordinator.configuration = configuration
#endif
// Set the app's active group session before joining.
groupSession = session
let stateListener = Task {
await self.handleStateChanges(groupSession: session)
}
subscriptions.insert(.init { stateListener.cancel() })
// Observe when the local user or a remote participant changes the activity on the GroupSession.
let activityListener = Task {
await self.handleActivityChanges(groupSession: session)
}
subscriptions.insert(.init { activityListener.cancel() })
// Join the session to participate in playback coordination.
session.join()
}
}
private func cleanUpSession(_ session: GroupSession<VideoWatchingActivity>?) {
// Exit early if session isn't the same instance as the player model's session.
guard groupSession === session else { return }
// Leave the session and set the session and video to nil to publish the invalidated state.
groupSession?.leave()
groupSession = nil
sharedVideo = nil
subscriptions.removeAll()
}
private var groupSession: GroupSession<VideoWatchingActivity>? {
didSet {
guard let groupSession else { return }
// Set the group session on the AVPlayer instances's playback coordinator
// so it can synchronize playback with other devices.
playbackCoordinator.coordinateWithSession(groupSession)
}
}
private func handleActivityChanges(groupSession: GroupSession<VideoWatchingActivity>) async {
for await newActivity in groupSession.$activity.values {
guard groupSession === self.groupSession else { return }
updateSharedVideo(video: newActivity.video)
}
}
private func handleStateChanges(groupSession: GroupSession<VideoWatchingActivity>) async {
for await newState in groupSession.$state.values {
if case .invalidated = newState {
cleanUpSession(groupSession)
}
}
}
/// Updates the `sharedVideo` state for this actor.
/// - Parameter video: the video to set as shared.
private func updateSharedVideo(video: Video) {
coordinatorDelegate.video = video
// Set the video as the shared video.
sharedVideo = video
}
/// Coordinates the playback of this video with others in a group session.
/// - Parameter video: the video to share
func coordinatePlayback(of video: Video) async {
// Exit early if this video is already shared.
guard video != sharedVideo else { return }
// Create a new activity for the selected video.
let activity = VideoWatchingActivity(video: video)
switch await activity.prepareForActivation() {
case .activationPreferred:
do {
// Attempt to activate the new activity.
_ = try await activity.activate()
} catch {
logger.debug("Unable to activate the activity: \(error)")
}
case .activationDisabled:
// A FaceTime session isn't active, or the user prefers to play the
// video apart from the group. Set the sharedVideo to nil.
sharedVideo = nil
default:
break
}
}
/// An implementation of `AVPlayerPlaybackCoordinatorDelegate` that determines how
/// the playback coordinator identifies local and remote media.
private class CoordinatorDelegate: NSObject, AVPlayerPlaybackCoordinatorDelegate {
var video: Video?
// Adopting this delegate method is required when playing local media,
// or any time you need a custom strategy for identifying media. Without
// implementing this method, coordinated playback won't function correctly.
func playbackCoordinator(_ coordinator: AVPlayerPlaybackCoordinator,
identifierFor playerItem: AVPlayerItem) -> String {
// Return the video id as the player item identifier.
"\(video?.id ?? -1)"
}
}
}
AVExtensions.swift
/*
Abstract:
A structure that represents the result of an audio session interruption, such as a phone call.
*/
import AVFoundation
// A simple type that unpacks the relevant values from an AVAudioSession interruption event.
struct InterruptionResult {
let type: AVAudioSession.InterruptionType
let options: AVAudioSession.InterruptionOptions
init?(_ notification: Notification) {
// Determine the interruption type and options.
guard let type = notification.userInfo?[AVAudioSessionInterruptionTypeKey] as?
AVAudioSession.InterruptionType,
let options = notification.userInfo?[AVAudioSessionInterruptionOptionKey] as?
AVAudioSession.InterruptionOptions else {
return nil
}
self.type = type
self.options = options
}
}
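`PlayerModel.observePlayback()` consumes this type when handling `AVAudioSession.interruptionNotification`. A standalone subscription that mirrors the model's resume logic looks roughly like this:

```swift
import AVFoundation
import Combine

// A sketch of consuming InterruptionResult outside the player model.
var subscriptions = Set<AnyCancellable>()
let player = AVPlayer()

NotificationCenter.default
    .publisher(for: AVAudioSession.interruptionNotification)
    .receive(on: DispatchQueue.main)
    // The failable initializer filters out notifications that lack
    // interruption type or options in their userInfo.
    .compactMap(InterruptionResult.init)
    .sink { result in
        // Resume playback only when the system signals it's appropriate,
        // such as after a phone call ends.
        if result.type == .ended && result.options == .shouldResume {
            player.play()
        }
    }
    .store(in: &subscriptions)
```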
ViewExtensions.swift
/*
Abstract:
Helper extensions to simplify multiplatform development.
*/
import SwiftUI
import UIKit
extension View {
/// A helper function that returns a platform-specific value.
func valueFor<V>(iOS: V, tvOS: V, visionOS: V) -> V {
#if os(visionOS)
visionOS
#elseif os(tvOS)
tvOS
#else
iOS
#endif
}
/// A Boolean value that indicates whether the current platform is visionOS.
var isVision: Bool {
#if os(visionOS)
true
#else
false
#endif
}
/// A Boolean value that indicates whether the current platform is iOS or iPadOS.
var isMobile: Bool {
#if os(iOS)
true
#else
false
#endif
}
/// A Boolean value that indicates whether the current platform is tvOS.
var isTV: Bool {
#if os(tvOS)
true
#else
false
#endif
}
/// A debugging function to add a border around a view.
func debugBorder(color: Color = .red, width: CGFloat = 1.0, opacity: CGFloat = 1.0) -> some View {
self
.border(color, width: width)
.opacity(opacity)
}
}
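In a view body, these helpers read as follows. This is an illustrative sketch; the view name and padding values are hypothetical, not from the app.

```swift
import SwiftUI

// A hypothetical view demonstrating the platform helpers; because
// `valueFor` and `isTV` are declared in a View extension, they're
// available directly inside any view body.
struct TitleView: View {
    var body: some View {
        Text("Destination Video")
            .font(isTV ? .largeTitle : .title)
            // Pick a platform-appropriate inset; values are illustrative.
            .padding(valueFor(iOS: 20.0, tvOS: 60.0, visionOS: 30.0))
    }
}
```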