feat: MPV player for both Android and iOS with HW decoding and PiP (with subtitles) (#1332)

Co-authored-by: Alex Kim <alexkim@Alexs-MacBook-Pro.local>
Co-authored-by: Alex <111128610+Alexk2309@users.noreply.github.com>
Co-authored-by: Simon-Eklundh <simon.eklundh@proton.me>
Author: Fredrik Burmester
Date: 2026-01-10 19:35:27 +01:00
Committed by: GitHub
Parent: df2f44e086
Commit: f1575ca48b

98 changed files with 3257 additions and 7448 deletions

@@ -15,3 +15,11 @@ This file is auto-imported into CLAUDE.md and loaded at the start of each sessio
 - **Intro modal trigger location**: The intro modal trigger logic should be in the `Home.tsx` component, not in the tabs `_layout.tsx`. Triggering modals from tab layout can interfere with native bottom tabs navigation. _(2025-01-09)_
 - **Tab folder naming**: The tab folders use underscore prefix naming like `(_home)` instead of just `(home)` based on the project's file structure conventions. _(2025-01-09)_
+- **macOS header buttons fix**: Header buttons (`headerRight`/`headerLeft`) don't respond to touches on macOS Catalyst builds when using standard React Native `TouchableOpacity`. Fix by using `Pressable` from `react-native-gesture-handler` instead. The library is already installed and `GestureHandlerRootView` wraps the app. _(2026-01-10)_
+- **Header button locations**: Header buttons are defined in multiple places: `app/(auth)/(tabs)/(home)/_layout.tsx` (SettingsButton, SessionsButton, back buttons), `components/common/HeaderBackButton.tsx` (reusable), `components/Chromecast.tsx`, `components/RoundButton.tsx`, and dynamically via `navigation.setOptions()` in `components/home/Home.tsx` and `app/(auth)/(tabs)/(home)/downloads/index.tsx`. _(2026-01-10)_
+- **useNetworkAwareQueryClient limitations**: The `useNetworkAwareQueryClient` hook uses `Object.create(queryClient)` which breaks QueryClient methods that use JavaScript private fields (like `getQueriesData`, `setQueriesData`, `setQueryData`). Only use it when you ONLY need `invalidateQueries`. For cache manipulation, use standard `useQueryClient` from `@tanstack/react-query`. _(2026-01-10)_
+- **Mark as played flow**: The "mark as played" button uses `PlayedStatus` component → `useMarkAsPlayed` hook → `usePlaybackManager.markItemPlayed()`. The hook does optimistic updates via `setQueriesData` before calling the API. Located in `components/PlayedStatus.tsx` and `hooks/useMarkAsPlayed.ts`. _(2026-01-10)_
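The `useNetworkAwareQueryClient` note above boils down to how JavaScript private fields interact with `Object.create`: methods reached through the prototype still run, but any method that touches a `#private` field throws because the field was never installed on the wrapper object. A minimal sketch of the failure mode, using an illustrative `QueryClientLike` stand-in rather than the real `@tanstack/react-query` internals:

```ts
// Illustrative stand-in for QueryClient; only the private-field behaviour matters here.
class QueryClientLike {
  #cache = new Map<string, unknown>();

  setQueryData(key: string, data: unknown) {
    // Touches the #cache private field, so `this` must be a real instance.
    this.#cache.set(key, data);
  }

  invalidateQueries(_key: string) {
    // No private-field access, so prototype delegation is enough.
  }
}

const client = new QueryClientLike();
// Roughly what the hook does: wrap the client via Object.create.
const wrapped: QueryClientLike = Object.create(client);

wrapped.invalidateQueries("items"); // works: method resolved via the prototype chain
wrapped.setQueryData("items", {}); // throws: #cache was never installed on `wrapped`
```

Hence the guidance above: reach for the network-aware wrapper only for `invalidateQueries`, and use the standard `useQueryClient` whenever cache data has to be read or written.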

.gitignore
@@ -19,7 +19,7 @@ web-build/
 /androidtv
 # Module-specific Builds
-modules/vlc-player/android/build
+modules/mpv-player/android/build
 modules/player/android
 modules/hls-downloader/android/build

@@ -54,6 +54,11 @@ The Jellyfin Plugin for Streamyfin is a plugin you install into Jellyfin that ho
 Chromecast support is currently under development. Video casting is already available, and we're actively working on adding subtitle support and additional features.
+### 🎬 MPV Player
+Streamyfin uses [MPV](https://mpv.io/) as its primary video player on all platforms, powered by [MPVKit](https://github.com/mpvkit/MPVKit). MPV is a powerful, open-source media player known for its wide format support and high-quality playback.
+Thanks to [@Alexk2309](https://github.com/Alexk2309) for the hard work building the native MPV module in Streamyfin.
 ### 🔍 Jellysearch
 [Jellysearch](https://gitlab.com/DomiStyle/jellysearch) works with Streamyfin
@@ -230,6 +235,7 @@ We also thank all other developers who have contributed to Streamyfin, your effo
 A special mention to the following people and projects for their contributions:
+- [@Alexk2309](https://github.com/Alexk2309) for building the native MPV module that integrates [MPVKit](https://github.com/mpvkit/MPVKit) with React Native
 - [Reiverr](https://github.com/aleksilassila/reiverr) for invaluable help with understanding the Jellyfin API
 - [Jellyfin TS SDK](https://github.com/jellyfin/jellyfin-sdk-typescript) for providing the TypeScript SDK
 - [Seerr](https://github.com/seerr-team/seerr) for enabling API integration with their project

@@ -6,9 +6,6 @@ module.exports = ({ config }) => {
"react-native-google-cast", "react-native-google-cast",
{ useDefaultExpandedMediaControls: true }, { useDefaultExpandedMediaControls: true },
]); ]);
// KSPlayer for iOS (GPU acceleration + native PiP)
config.plugins.push("./plugins/withKSPlayer.js");
} }
// Only override googleServicesFile if env var is set // Only override googleServicesFile if env var is set

@@ -58,7 +58,8 @@
"expo-build-properties", "expo-build-properties",
{ {
"ios": { "ios": {
"deploymentTarget": "15.6" "deploymentTarget": "15.6",
"useFrameworks": "static"
}, },
"android": { "android": {
"buildArchs": ["arm64-v8a", "x86_64"], "buildArchs": ["arm64-v8a", "x86_64"],
@@ -66,7 +67,7 @@
"targetSdkVersion": 35, "targetSdkVersion": 35,
"buildToolsVersion": "35.0.0", "buildToolsVersion": "35.0.0",
"kotlinVersion": "2.0.21", "kotlinVersion": "2.0.21",
"minSdkVersion": 24, "minSdkVersion": 26,
"usesCleartextTraffic": true, "usesCleartextTraffic": true,
"packagingOptions": { "packagingOptions": {
"jniLibs": { "jniLibs": {
@@ -84,12 +85,6 @@
"initialOrientation": "DEFAULT" "initialOrientation": "DEFAULT"
} }
], ],
[
"expo-sensors",
{
"motionPermission": "Allow Streamyfin to access your device motion for landscape video watching."
}
],
"expo-localization", "expo-localization",
"expo-asset", "expo-asset",
[ [
@@ -120,7 +115,14 @@
["./plugins/withChangeNativeAndroidTextToWhite.js"], ["./plugins/withChangeNativeAndroidTextToWhite.js"],
["./plugins/withAndroidManifest.js"], ["./plugins/withAndroidManifest.js"],
["./plugins/withTrustLocalCerts.js"], ["./plugins/withTrustLocalCerts.js"],
["./plugins/withGradleProperties.js"] ["./plugins/withGradleProperties.js"],
[
"./plugins/withGitPod.js",
{
"podName": "MPVKit-GPL",
"podspecUrl": "https://raw.githubusercontent.com/streamyfin/MPVKit/0.40.0-av/MPVKit-GPL.podspec"
}
]
], ],
"experiments": { "experiments": {
"typedRoutes": true "typedRoutes": true

@@ -1,7 +1,8 @@
 import { Feather, Ionicons } from "@expo/vector-icons";
 import { Stack, useRouter } from "expo-router";
 import { useTranslation } from "react-i18next";
-import { Platform, TouchableOpacity, View } from "react-native";
+import { Platform, View } from "react-native";
+import { Pressable } from "react-native-gesture-handler";
 import { nestedTabPageScreenOptions } from "@/components/stacks/NestedTabPageStack";
 const Chromecast = Platform.isTV ? null : require("@/components/Chromecast");
@@ -46,13 +47,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
title: t("home.downloads.downloads_title"), title: t("home.downloads.downloads_title"),
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -65,13 +66,13 @@ export default function IndexLayout() {
headerShadowVisible: false, headerShadowVisible: false,
title: t("home.downloads.tvseries"), title: t("home.downloads.tvseries"),
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -84,13 +85,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -102,13 +103,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -120,13 +121,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -138,13 +139,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -156,13 +157,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -174,13 +175,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -192,13 +193,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -210,13 +211,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -228,13 +229,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -246,13 +247,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -264,13 +265,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -282,13 +283,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -300,13 +301,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -318,13 +319,13 @@ export default function IndexLayout() {
headerTransparent: Platform.OS === "ios", headerTransparent: Platform.OS === "ios",
headerShadowVisible: false, headerShadowVisible: false,
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => _router.back()} onPress={() => _router.back()}
className='pl-0.5' className='pl-0.5'
style={{ marginRight: Platform.OS === "android" ? 16 : 0 }} style={{ marginRight: Platform.OS === "android" ? 16 : 0 }}
> >
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
}} }}
/> />
@@ -336,9 +337,9 @@ export default function IndexLayout() {
options={{ options={{
title: "", title: "",
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity onPress={() => _router.back()} className='pl-0.5'> <Pressable onPress={() => _router.back()} className='pl-0.5'>
<Feather name='chevron-left' size={28} color='white' /> <Feather name='chevron-left' size={28} color='white' />
</TouchableOpacity> </Pressable>
), ),
headerShown: true, headerShown: true,
headerBlurEffect: "prominent", headerBlurEffect: "prominent",
@@ -354,13 +355,13 @@ const SettingsButton = () => {
 const router = useRouter();
 return (
-<TouchableOpacity
+<Pressable
 onPress={() => {
 router.push("/(auth)/settings");
 }}
 >
 <Feather name='settings' color={"white"} size={22} />
-</TouchableOpacity>
+</Pressable>
 );
 };
@@ -369,7 +370,7 @@ const SessionsButton = () => {
 const { sessions = [] } = useSessions({} as useSessionsProps);
 return (
-<TouchableOpacity
+<Pressable
 onPress={() => {
 router.push("/(auth)/sessions");
 }}
@@ -380,6 +381,6 @@ const SessionsButton = () => {
 color={sessions.length === 0 ? "white" : "#9333ea"}
 size={28}
 />
-</TouchableOpacity>
+</Pressable>
 );
 };

@@ -3,13 +3,8 @@ import { useNavigation, useRouter } from "expo-router";
 import { useAtom } from "jotai";
 import { useEffect, useMemo, useRef, useState } from "react";
 import { useTranslation } from "react-i18next";
-import {
-Alert,
-Platform,
-ScrollView,
-TouchableOpacity,
-View,
-} from "react-native";
+import { Alert, Platform, ScrollView, View } from "react-native";
+import { Pressable } from "react-native-gesture-handler";
 import { useSafeAreaInsets } from "react-native-safe-area-context";
 import { toast } from "sonner-native";
 import { Text } from "@/components/common/Text";
@@ -103,12 +98,12 @@ export default function page() {
 useEffect(() => {
 navigation.setOptions({
 headerRight: () => (
-<TouchableOpacity
+<Pressable
 onPress={bottomSheetModalRef.current?.present}
 className='px-2'
 >
 <DownloadSize items={downloadedFiles?.map((f) => f.item) || []} />
-</TouchableOpacity>
+</Pressable>
 ),
 });
 }, [downloadedFiles]);

@@ -2,8 +2,8 @@ import { Platform, ScrollView, View } from "react-native";
 import { useSafeAreaInsets } from "react-native-safe-area-context";
 import { AudioToggles } from "@/components/settings/AudioToggles";
 import { MediaProvider } from "@/components/settings/MediaContext";
+import { MpvSubtitleSettings } from "@/components/settings/MpvSubtitleSettings";
 import { SubtitleToggles } from "@/components/settings/SubtitleToggles";
-import { VlcSubtitleSettings } from "@/components/settings/VlcSubtitleSettings";
 export default function AudioSubtitlesPage() {
 const insets = useSafeAreaInsets();
@@ -23,7 +23,7 @@ export default function AudioSubtitlesPage() {
 <MediaProvider>
 <AudioToggles className='mb-4' />
 <SubtitleToggles className='mb-4' />
-<VlcSubtitleSettings className='mb-4' />
+<MpvSubtitleSettings className='mb-4' />
 </MediaProvider>
 </View>
 </ScrollView>

@@ -1,7 +1,8 @@
 import { Ionicons } from "@expo/vector-icons";
 import { Stack, useRouter } from "expo-router";
 import { useTranslation } from "react-i18next";
-import { Platform, TouchableOpacity } from "react-native";
+import { Platform } from "react-native";
+import { Pressable } from "react-native-gesture-handler";
 import { nestedTabPageScreenOptions } from "@/components/stacks/NestedTabPageStack";
 import { useStreamystatsEnabled } from "@/hooks/useWatchlists";
@@ -22,14 +23,14 @@ export default function WatchlistsLayout() {
 headerShadowVisible: false,
 headerRight: streamystatsEnabled
 ? () => (
-<TouchableOpacity
+<Pressable
 onPress={() =>
 router.push("/(auth)/(tabs)/(watchlists)/create")
 }
 className='p-1.5'
 >
 <Ionicons name='add' size={24} color='white' />
-</TouchableOpacity>
+</Pressable>
 )
 : undefined,
 }}

@@ -14,7 +14,7 @@ import { router, useGlobalSearchParams, useNavigation } from "expo-router";
import { useAtomValue } from "jotai"; import { useAtomValue } from "jotai";
import { useCallback, useEffect, useMemo, useRef, useState } from "react"; import { useCallback, useEffect, useMemo, useRef, useState } from "react";
import { useTranslation } from "react-i18next"; import { useTranslation } from "react-i18next";
import { Alert, Platform, View } from "react-native"; import { Alert, Platform, useWindowDimensions, View } from "react-native";
import { useAnimatedReaction, useSharedValue } from "react-native-reanimated"; import { useAnimatedReaction, useSharedValue } from "react-native-reanimated";
import { BITRATES } from "@/components/BitrateSelector"; import { BITRATES } from "@/components/BitrateSelector";
@@ -27,7 +27,6 @@ import {
PlaybackSpeedScope, PlaybackSpeedScope,
updatePlaybackSpeedSettings, updatePlaybackSpeedSettings,
} from "@/components/video-player/controls/utils/playback-speed-settings"; } from "@/components/video-player/controls/utils/playback-speed-settings";
import { OUTLINE_THICKNESS, VLC_COLORS } from "@/constants/SubtitleConstants";
import { useHaptic } from "@/hooks/useHaptic"; import { useHaptic } from "@/hooks/useHaptic";
import { useOrientation } from "@/hooks/useOrientation"; import { useOrientation } from "@/hooks/useOrientation";
import { usePlaybackManager } from "@/hooks/usePlaybackManager"; import { usePlaybackManager } from "@/hooks/usePlaybackManager";
@@ -35,24 +34,17 @@ import usePlaybackSpeed from "@/hooks/usePlaybackSpeed";
import { useInvalidatePlaybackProgressCache } from "@/hooks/useRevalidatePlaybackProgressCache"; import { useInvalidatePlaybackProgressCache } from "@/hooks/useRevalidatePlaybackProgressCache";
import { useWebSocket } from "@/hooks/useWebsockets"; import { useWebSocket } from "@/hooks/useWebsockets";
import { import {
type PlaybackStatePayload, type MpvOnErrorEventPayload,
type ProgressUpdatePayload, type MpvOnPlaybackStateChangePayload,
type SfOnErrorEventPayload, type MpvOnProgressEventPayload,
type SfOnPictureInPictureChangePayload, MpvPlayerView,
type SfOnPlaybackStateChangePayload, type MpvPlayerViewRef,
type SfOnProgressEventPayload, type MpvVideoSource,
SfPlayerView,
type SfPlayerViewRef,
type SfVideoSource,
setHardwareDecode,
type VlcPlayerSource,
VlcPlayerView,
type VlcPlayerViewRef,
} from "@/modules"; } from "@/modules";
import { useDownload } from "@/providers/DownloadProvider"; import { useDownload } from "@/providers/DownloadProvider";
import { DownloadedItem } from "@/providers/Downloads/types"; import { DownloadedItem } from "@/providers/Downloads/types";
import { apiAtom, userAtom } from "@/providers/JellyfinProvider"; import { apiAtom, userAtom } from "@/providers/JellyfinProvider";
import { useSettings, VideoPlayerIOS } from "@/utils/atoms/settings"; import { useSettings } from "@/utils/atoms/settings";
import { getStreamUrl } from "@/utils/jellyfin/media/getStreamUrl"; import { getStreamUrl } from "@/utils/jellyfin/media/getStreamUrl";
import { import {
getMpvAudioId, getMpvAudioId,
@@ -63,29 +55,21 @@ import { generateDeviceProfile } from "@/utils/profiles/native";
import { msToTicks, ticksToSeconds } from "@/utils/time"; import { msToTicks, ticksToSeconds } from "@/utils/time";
export default function page() { export default function page() {
const videoRef = useRef<SfPlayerViewRef | VlcPlayerViewRef>(null); const videoRef = useRef<MpvPlayerViewRef>(null);
const user = useAtomValue(userAtom); const user = useAtomValue(userAtom);
const api = useAtomValue(apiAtom); const api = useAtomValue(apiAtom);
const { t } = useTranslation(); const { t } = useTranslation();
const navigation = useNavigation(); const navigation = useNavigation();
const { settings, updateSettings } = useSettings(); const { settings, updateSettings } = useSettings();
// Determine which player to use: const { width: screenWidth, height: screenHeight } = useWindowDimensions();
// - Android always uses VLC
// - iOS uses user setting (KSPlayer by default, VLC optional)
const useVlcPlayer =
Platform.OS === "android" ||
(Platform.OS === "ios" && settings.videoPlayerIOS === VideoPlayerIOS.VLC);
const [isPlaybackStopped, setIsPlaybackStopped] = useState(false); const [isPlaybackStopped, setIsPlaybackStopped] = useState(false);
const [showControls, _setShowControls] = useState(true); const [showControls, _setShowControls] = useState(true);
const [isPipMode, setIsPipMode] = useState(false); const [isPipMode, setIsPipMode] = useState(false);
const [aspectRatio, setAspectRatio] = useState< const [aspectRatio] = useState<"default" | "16:9" | "4:3" | "1:1" | "21:9">(
"default" | "16:9" | "4:3" | "1:1" | "21:9" "default",
>("default"); );
const [scaleFactor, setScaleFactor] = useState<
0 | 0.25 | 0.5 | 0.75 | 1.0 | 1.25 | 1.5 | 2.0
>(0);
const [isZoomedToFill, setIsZoomedToFill] = useState(false); const [isZoomedToFill, setIsZoomedToFill] = useState(false);
const [isPlaying, setIsPlaying] = useState(false); const [isPlaying, setIsPlaying] = useState(false);
const [isMuted, setIsMuted] = useState(false); const [isMuted, setIsMuted] = useState(false);
@@ -190,15 +174,11 @@ export default function page() {
updateSettings, updateSettings,
); );
// Apply speed to the current player // Apply speed to the current player (MPV)
setCurrentPlaybackSpeed(speed); setCurrentPlaybackSpeed(speed);
if (useVlcPlayer) { await videoRef.current?.setSpeed?.(speed);
await (videoRef.current as VlcPlayerViewRef)?.setRate?.(speed);
} else {
await (videoRef.current as SfPlayerViewRef)?.setSpeed?.(speed);
}
}, },
[item, settings, updateSettings, useVlcPlayer], [item, settings, updateSettings],
); );
/** Gets the initial playback position from the URL. */ /** Gets the initial playback position from the URL. */
@@ -311,11 +291,7 @@ export default function page() {
maxStreamingBitrate: bitrateValue, maxStreamingBitrate: bitrateValue,
mediaSourceId: mediaSourceId, mediaSourceId: mediaSourceId,
subtitleStreamIndex: subtitleIndex, subtitleStreamIndex: subtitleIndex,
deviceProfile: generateDeviceProfile({ deviceProfile: generateDeviceProfile(),
platform: Platform.OS as "ios" | "android",
player: useVlcPlayer ? "vlc" : "ksplayer",
audioMode: settings.audioTranscodeMode,
}),
}); });
if (!res) return; if (!res) return;
const { mediaSource, sessionId, url } = res; const { mediaSource, sessionId, url } = res;
@@ -407,7 +383,6 @@ export default function page() {
}); });
reportPlaybackStopped(); reportPlaybackStopped();
setIsPlaybackStopped(true); setIsPlaybackStopped(true);
// KSPlayer doesn't have a stop method, use pause instead
videoRef.current?.pause(); videoRef.current?.pause();
revalidateProgressCache(); revalidateProgressCache();
}, [videoRef, reportPlaybackStopped, progress]); }, [videoRef, reportPlaybackStopped, progress]);
@@ -465,13 +440,13 @@ export default function page() {
[], [],
); );
/** Progress handler for iOS (SfPlayer) - position in seconds */ /** Progress handler for MPV - position in seconds */
const onProgressSf = useCallback( const onProgress = useCallback(
async (data: { nativeEvent: SfOnProgressEventPayload }) => { async (data: { nativeEvent: MpvOnProgressEventPayload }) => {
if (isSeeking.get() || isPlaybackStopped) return; if (isSeeking.get() || isPlaybackStopped) return;
const { position } = data.nativeEvent; const { position } = data.nativeEvent;
// KSPlayer reports position in seconds, convert to ms // MPV reports position in seconds, convert to ms
const currentTime = position * 1000; const currentTime = position * 1000;
if (isBuffering) { if (isBuffering) {
@@ -514,63 +489,14 @@ export default function page() {
], ],
); );
/** Progress handler for Android (VLC) - currentTime in milliseconds */
const onProgressVlc = useCallback(
async (data: ProgressUpdatePayload) => {
if (isSeeking.get() || isPlaybackStopped) return;
const { currentTime } = data.nativeEvent;
// VLC reports currentTime in milliseconds
if (isBuffering) {
setIsBuffering(false);
}
progress.set(currentTime);
// Update URL immediately after seeking, or every 30 seconds during normal playback
const now = Date.now();
const shouldUpdateUrl = wasJustSeeking.get();
wasJustSeeking.value = false;
if (
shouldUpdateUrl ||
now - lastUrlUpdateTime.get() > URL_UPDATE_INTERVAL
) {
router.setParams({
playbackPosition: msToTicks(currentTime).toString(),
});
lastUrlUpdateTime.value = now;
}
if (!item?.Id) return;
const progressInfo = currentPlayStateInfo();
if (progressInfo) {
playbackManager.reportPlaybackProgress(progressInfo);
}
},
[
item?.Id,
audioIndex,
subtitleIndex,
mediaSourceId,
isPlaying,
stream,
isSeeking,
isPlaybackStopped,
isBuffering,
],
);
/** Gets the initial playback position in seconds. */ /** Gets the initial playback position in seconds. */
const startPosition = useMemo(() => { const _startPosition = useMemo(() => {
return ticksToSeconds(getInitialPlaybackTicks()); return ticksToSeconds(getInitialPlaybackTicks());
}, [getInitialPlaybackTicks]); }, [getInitialPlaybackTicks]);
/** Build video source config for iOS (SfPlayer/KSPlayer) */ /** Build video source config for MPV */
const sfVideoSource = useMemo<SfVideoSource | undefined>(() => { const videoSource = useMemo<MpvVideoSource | undefined>(() => {
if (!stream?.url || useVlcPlayer) return undefined; if (!stream?.url) return undefined;
const mediaSource = stream.mediaSource; const mediaSource = stream.mediaSource;
const isTranscoding = Boolean(mediaSource?.TranscodingUrl); const isTranscoding = Boolean(mediaSource?.TranscodingUrl);
@@ -609,15 +535,10 @@ export default function page() {
: (item?.UserData?.PlaybackPositionTicks ?? 0); : (item?.UserData?.PlaybackPositionTicks ?? 0);
const startPos = ticksToSeconds(startTicks); const startPos = ticksToSeconds(startTicks);
// For transcoded streams, the server already handles seeking via startTimeTicks,
// so we should NOT also tell the player to seek (would cause double-seeking).
// For direct play/stream, the player needs to seek itself.
const playerStartPos = isTranscoding ? 0 : startPos;
// Build source config - headers only needed for online streaming // Build source config - headers only needed for online streaming
const source: SfVideoSource = { const source: MpvVideoSource = {
url: stream.url, url: stream.url,
startPosition: playerStartPos, startPosition: startPos,
autoplay: true, autoplay: true,
initialSubtitleId, initialSubtitleId,
initialAudioId, initialAudioId,
@@ -646,167 +567,6 @@ export default function page() {
subtitleIndex, subtitleIndex,
audioIndex, audioIndex,
offline, offline,
useVlcPlayer,
]);
/** Build video source config for Android (VLC) */
const vlcVideoSource = useMemo<VlcPlayerSource | undefined>(() => {
if (!stream?.url || !useVlcPlayer) return undefined;
const mediaSource = stream.mediaSource;
const isTranscoding = Boolean(mediaSource?.TranscodingUrl);
// Get external subtitle URLs for VLC (need name and DeliveryUrl)
// - Online: prepend API base path to server URLs
// - Offline: use local file paths (stored in DeliveryUrl during download)
let externalSubs: { name: string; DeliveryUrl: string }[] | undefined;
if (!offline && api?.basePath) {
externalSubs = mediaSource?.MediaStreams?.filter(
(s) =>
s.Type === "Subtitle" &&
s.DeliveryMethod === "External" &&
s.DeliveryUrl,
).map((s) => ({
name: s.DisplayTitle || s.Title || `Subtitle ${s.Index}`,
DeliveryUrl: `${api.basePath}${s.DeliveryUrl}`,
}));
} else if (offline) {
externalSubs = mediaSource?.MediaStreams?.filter(
(s) =>
s.Type === "Subtitle" &&
s.DeliveryMethod === "External" &&
s.DeliveryUrl,
).map((s) => ({
name: s.DisplayTitle || s.Title || `Subtitle ${s.Index}`,
DeliveryUrl: s.DeliveryUrl!,
}));
}
// Build VLC init options (required for VLC to work properly)
const initOptions: string[] = [""];
// Get all subtitle and audio streams
const allSubs =
mediaSource?.MediaStreams?.filter((s) => s.Type === "Subtitle") ?? [];
const textSubs = allSubs.filter((s) => s.IsTextSubtitleStream);
const allAudio =
mediaSource?.MediaStreams?.filter((s) => s.Type === "Audio") ?? [];
// Find chosen tracks
const chosenSubtitleTrack = allSubs.find((s) => s.Index === subtitleIndex);
const chosenAudioTrack = allAudio.find((a) => a.Index === audioIndex);
// Set subtitle track
if (
chosenSubtitleTrack &&
(!isTranscoding || chosenSubtitleTrack.IsTextSubtitleStream)
) {
const finalIndex = !isTranscoding
? allSubs.indexOf(chosenSubtitleTrack)
: [...textSubs].reverse().indexOf(chosenSubtitleTrack);
if (finalIndex >= 0) {
initOptions.push(`--sub-track=${finalIndex}`);
}
}
// Set audio track
if (!isTranscoding && chosenAudioTrack) {
const audioTrackIndex = allAudio.indexOf(chosenAudioTrack);
if (audioTrackIndex >= 0) {
initOptions.push(`--audio-track=${audioTrackIndex}`);
}
}
// Add VLC subtitle styling from settings
if (settings.subtitleSize) {
initOptions.push(`--sub-text-scale=${settings.subtitleSize}`);
}
initOptions.push(`--sub-margin=${settings.vlcSubtitleMargin ?? 40}`);
// Text color
if (
settings.vlcTextColor &&
VLC_COLORS[settings.vlcTextColor] !== undefined
) {
initOptions.push(`--freetype-color=${VLC_COLORS[settings.vlcTextColor]}`);
}
// Background styling
if (
settings.vlcBackgroundColor &&
VLC_COLORS[settings.vlcBackgroundColor] !== undefined
) {
initOptions.push(
`--freetype-background-color=${VLC_COLORS[settings.vlcBackgroundColor]}`,
);
}
if (settings.vlcBackgroundOpacity !== undefined) {
initOptions.push(
`--freetype-background-opacity=${settings.vlcBackgroundOpacity}`,
);
}
// Outline styling
if (
settings.vlcOutlineColor &&
VLC_COLORS[settings.vlcOutlineColor] !== undefined
) {
initOptions.push(
`--freetype-outline-color=${VLC_COLORS[settings.vlcOutlineColor]}`,
);
}
if (settings.vlcOutlineOpacity !== undefined) {
initOptions.push(
`--freetype-outline-opacity=${settings.vlcOutlineOpacity}`,
);
}
if (
settings.vlcOutlineThickness &&
OUTLINE_THICKNESS[settings.vlcOutlineThickness] !== undefined
) {
initOptions.push(
`--freetype-outline-thickness=${OUTLINE_THICKNESS[settings.vlcOutlineThickness]}`,
);
}
// Bold text
if (settings.vlcIsBold) {
initOptions.push("--freetype-bold");
}
// For transcoded streams, the server already handles seeking via startTimeTicks,
// so we should NOT also tell the player to seek (would cause double-seeking).
// For direct play/stream, the player needs to seek itself.
const playerStartPos = isTranscoding ? 0 : startPosition;
const source: VlcPlayerSource = {
uri: stream.url,
startPosition: playerStartPos,
autoplay: true,
isNetwork: !offline,
externalSubtitles: externalSubs,
initOptions,
};
return source;
}, [
stream?.url,
stream?.mediaSource,
startPosition,
useVlcPlayer,
api?.basePath,
offline,
subtitleIndex,
audioIndex,
settings.subtitleSize,
settings.vlcTextColor,
settings.vlcBackgroundColor,
settings.vlcBackgroundOpacity,
settings.vlcOutlineColor,
settings.vlcOutlineOpacity,
settings.vlcOutlineThickness,
settings.vlcIsBold,
settings.vlcSubtitleMargin,
]); ]);
const volumeUpCb = useCallback(async () => { const volumeUpCb = useCallback(async () => {
@@ -888,9 +648,9 @@ export default function page() {
setVolume: setVolumeCb, setVolume: setVolumeCb,
}); });
/** Playback state handler for iOS (SfPlayer) */ /** Playback state handler for MPV */
const onPlaybackStateChangedSf = useCallback( const onPlaybackStateChanged = useCallback(
async (e: { nativeEvent: SfOnPlaybackStateChangePayload }) => { async (e: { nativeEvent: MpvOnPlaybackStateChangePayload }) => {
const { isPaused, isPlaying: playing, isLoading } = e.nativeEvent; const { isPaused, isPlaying: playing, isLoading } = e.nativeEvent;
if (playing) { if (playing) {
@@ -924,52 +684,9 @@ export default function page() {
[playbackManager, item?.Id, progress], [playbackManager, item?.Id, progress],
); );
/** Playback state handler for Android (VLC) */ /** PiP handler for MPV */
const onPlaybackStateChangedVlc = useCallback( const _onPictureInPictureChange = useCallback(
async (e: PlaybackStatePayload) => { (e: { nativeEvent: { isActive: boolean } }) => {
const {
state,
isBuffering: buffering,
isPlaying: playing,
} = e.nativeEvent;
if (state === "Playing" || playing) {
setIsPlaying(true);
setIsBuffering(false);
setHasPlaybackStarted(true);
setTracksReady(true); // VLC tracks are ready when playback starts
if (item?.Id) {
const progressInfo = currentPlayStateInfo();
if (progressInfo) {
playbackManager.reportPlaybackProgress(progressInfo);
}
}
if (!Platform.isTV) await activateKeepAwakeAsync();
return;
}
if (state === "Paused") {
setIsPlaying(false);
if (item?.Id) {
const progressInfo = currentPlayStateInfo();
if (progressInfo) {
playbackManager.reportPlaybackProgress(progressInfo);
}
}
if (!Platform.isTV) await deactivateKeepAwake();
return;
}
if (state === "Buffering" || buffering) {
setIsBuffering(true);
}
},
[playbackManager, item?.Id, progress],
);
/** PiP handler for iOS (SfPlayer) */
const onPictureInPictureChangeSf = useCallback(
(e: { nativeEvent: SfOnPictureInPictureChangePayload }) => {
const { isActive } = e.nativeEvent; const { isActive } = e.nativeEvent;
setIsPipMode(isActive); setIsPipMode(isActive);
// Hide controls when entering PiP // Hide controls when entering PiP
@@ -980,19 +697,6 @@ export default function page() {
[], [],
); );
/** PiP handler for Android (VLC) */
const onPipStartedVlc = useCallback(
(e: { nativeEvent: { pipStarted: boolean } }) => {
const { pipStarted } = e.nativeEvent;
setIsPipMode(pipStarted);
// Hide controls when entering PiP
if (pipStarted) {
_setShowControls(false);
}
},
[],
);
const [isMounted, setIsMounted] = useState(false); const [isMounted, setIsMounted] = useState(false);
// Add useEffect to handle mounting // Add useEffect to handle mounting
@@ -1014,96 +718,79 @@ export default function page() {
videoRef.current?.pause?.(); videoRef.current?.pause?.();
}, []); }, []);
const seek = useCallback( const seek = useCallback((position: number) => {
(position: number) => { // MPV expects seconds, convert from ms
if (useVlcPlayer) {
// VLC expects milliseconds
videoRef.current?.seekTo?.(position);
} else {
// KSPlayer expects seconds, convert from ms
videoRef.current?.seekTo?.(position / 1000); videoRef.current?.seekTo?.(position / 1000);
} }, []);
},
[useVlcPlayer],
);
const handleZoomToggle = useCallback(async () => { const handleZoomToggle = useCallback(async () => {
// Zoom toggle only supported when using SfPlayer (KSPlayer)
if (useVlcPlayer) return;
const newZoomState = !isZoomedToFill; const newZoomState = !isZoomedToFill;
await videoRef.current?.setZoomedToFill?.(newZoomState);
setIsZoomedToFill(newZoomState); setIsZoomedToFill(newZoomState);
await (videoRef.current as SfPlayerViewRef)?.setVideoZoomToFill?.(
newZoomState,
);
}, [isZoomedToFill, useVlcPlayer]);
// VLC-specific handlers for aspect ratio and scale factor // Adjust subtitle position to compensate for video cropping when zoomed
const handleSetVideoAspectRatio = useCallback( if (newZoomState) {
async (newAspectRatio: string | null) => { // Get video dimensions from mediaSource
if (!useVlcPlayer) return; const videoStream = stream?.mediaSource?.MediaStreams?.find(
const ratio = (newAspectRatio ?? "default") as (s) => s.Type === "Video",
| "default"
| "16:9"
| "4:3"
| "1:1"
| "21:9";
setAspectRatio(ratio);
await (videoRef.current as VlcPlayerViewRef)?.setVideoAspectRatio?.(
newAspectRatio,
);
},
[useVlcPlayer],
); );
const videoWidth = videoStream?.Width ?? 1920;
const videoHeight = videoStream?.Height ?? 1080;
const handleSetVideoScaleFactor = useCallback( const videoAR = videoWidth / videoHeight;
async (newScaleFactor: number) => { const screenAR = screenWidth / screenHeight;
if (!useVlcPlayer) return;
setScaleFactor(
newScaleFactor as 0 | 0.25 | 0.5 | 0.75 | 1.0 | 1.25 | 1.5 | 2.0,
);
await (videoRef.current as VlcPlayerViewRef)?.setVideoScaleFactor?.(
newScaleFactor,
);
},
[useVlcPlayer],
);
// Apply KSPlayer global settings before video loads (only when using KSPlayer) if (screenAR > videoAR) {
useEffect(() => { // Screen is wider than video - video height extends beyond screen
if (Platform.OS === "ios" && !useVlcPlayer) { // Calculate how much of the video is cropped at the bottom (as % of video height)
setHardwareDecode(settings.ksHardwareDecode); const bottomCropPercent = 50 * (1 - videoAR / screenAR);
// Only adjust by 70% of the crop to keep a comfortable margin from the edge
// (subtitles already have some built-in padding from the bottom)
const adjustmentFactor = 0.7;
const newSubPos = Math.round(
100 - bottomCropPercent * adjustmentFactor,
);
await videoRef.current?.setSubtitlePosition?.(newSubPos);
} }
}, [settings.ksHardwareDecode, useVlcPlayer]); // If videoAR >= screenAR, sides are cropped but bottom is visible, no adjustment needed
} else {
// Restore to default position (bottom of video frame)
await videoRef.current?.setSubtitlePosition?.(100);
}
}, [isZoomedToFill, stream?.mediaSource, screenWidth, screenHeight]);
// Apply subtitle settings when video loads (SfPlayer-specific) // Apply subtitle settings when video loads
useEffect(() => { useEffect(() => {
if (useVlcPlayer || !isVideoLoaded || !videoRef.current) return; if (!isVideoLoaded || !videoRef.current) return;
const sfRef = videoRef.current as SfPlayerViewRef;
const applySubtitleSettings = async () => { const applySubtitleSettings = async () => {
if (settings.mpvSubtitleScale !== undefined) { if (settings.mpvSubtitleScale !== undefined) {
await sfRef?.setSubtitleScale?.(settings.mpvSubtitleScale); await videoRef.current?.setSubtitleScale?.(settings.mpvSubtitleScale);
} }
if (settings.mpvSubtitleMarginY !== undefined) { if (settings.mpvSubtitleMarginY !== undefined) {
await sfRef?.setSubtitleMarginY?.(settings.mpvSubtitleMarginY); await videoRef.current?.setSubtitleMarginY?.(
settings.mpvSubtitleMarginY,
);
} }
if (settings.mpvSubtitleAlignX !== undefined) { if (settings.mpvSubtitleAlignX !== undefined) {
await sfRef?.setSubtitleAlignX?.(settings.mpvSubtitleAlignX); await videoRef.current?.setSubtitleAlignX?.(settings.mpvSubtitleAlignX);
} }
if (settings.mpvSubtitleAlignY !== undefined) { if (settings.mpvSubtitleAlignY !== undefined) {
await sfRef?.setSubtitleAlignY?.(settings.mpvSubtitleAlignY); await videoRef.current?.setSubtitleAlignY?.(settings.mpvSubtitleAlignY);
} }
if (settings.mpvSubtitleFontSize !== undefined) { if (settings.mpvSubtitleFontSize !== undefined) {
await sfRef?.setSubtitleFontSize?.(settings.mpvSubtitleFontSize); await videoRef.current?.setSubtitleFontSize?.(
settings.mpvSubtitleFontSize,
);
} }
// Apply subtitle size from general settings // Apply subtitle size from general settings
if (settings.subtitleSize) { if (settings.subtitleSize) {
await sfRef?.setSubtitleFontSize?.(settings.subtitleSize); await videoRef.current?.setSubtitleFontSize?.(settings.subtitleSize);
} }
}; };
applySubtitleSettings(); applySubtitleSettings();
}, [isVideoLoaded, settings, useVlcPlayer]); }, [isVideoLoaded, settings]);
// Apply initial playback speed when video loads // Apply initial playback speed when video loads
useEffect(() => { useEffect(() => {
@@ -1112,20 +799,12 @@ export default function page() {
const applyInitialPlaybackSpeed = async () => { const applyInitialPlaybackSpeed = async () => {
if (initialPlaybackSpeed !== 1.0) { if (initialPlaybackSpeed !== 1.0) {
setCurrentPlaybackSpeed(initialPlaybackSpeed); setCurrentPlaybackSpeed(initialPlaybackSpeed);
if (useVlcPlayer) { await videoRef.current?.setSpeed?.(initialPlaybackSpeed);
await (videoRef.current as VlcPlayerViewRef)?.setRate?.(
initialPlaybackSpeed,
);
} else {
await (videoRef.current as SfPlayerViewRef)?.setSpeed?.(
initialPlaybackSpeed,
);
}
} }
}; };
applyInitialPlaybackSpeed(); applyInitialPlaybackSpeed();
}, [isVideoLoaded, initialPlaybackSpeed, useVlcPlayer]); }, [isVideoLoaded, initialPlaybackSpeed]);
// Show error UI first, before checking loading/missingdata // Show error UI first, before checking loading/missingdata
if (itemStatus.isError || streamStatus.isError) { if (itemStatus.isError || streamStatus.isError) {
@@ -1160,7 +839,6 @@ export default function page() {
mediaSource={stream?.mediaSource} mediaSource={stream?.mediaSource}
isVideoLoaded={isVideoLoaded} isVideoLoaded={isVideoLoaded}
tracksReady={tracksReady} tracksReady={tracksReady}
useVlcPlayer={useVlcPlayer}
offline={offline} offline={offline}
downloadedItem={downloadedItem} downloadedItem={downloadedItem}
> >
@@ -1183,39 +861,14 @@ export default function page() {
justifyContent: "center", justifyContent: "center",
}} }}
> >
{useVlcPlayer ? ( <MpvPlayerView
<VlcPlayerView ref={videoRef}
ref={videoRef as React.RefObject<VlcPlayerViewRef>} source={videoSource}
source={vlcVideoSource!}
style={{ width: "100%", height: "100%" }} style={{ width: "100%", height: "100%" }}
onVideoProgress={onProgressVlc} onProgress={onProgress}
onVideoStateChange={onPlaybackStateChangedVlc} onPlaybackStateChange={onPlaybackStateChanged}
onPipStarted={onPipStartedVlc}
onVideoLoadEnd={() => {
// Note: VLC only fires this on error, not on successful load
// tracksReady is set in onPlaybackStateChangedVlc when state is "Playing"
setIsVideoLoaded(true);
}}
onVideoError={(e: PlaybackStatePayload) => {
console.error("Video Error:", e.nativeEvent);
Alert.alert(
t("player.error"),
t("player.an_error_occured_while_playing_the_video"),
);
writeToLog("ERROR", "Video Error", e.nativeEvent);
}}
progressUpdateInterval={1000}
/>
) : (
<SfPlayerView
ref={videoRef as React.RefObject<SfPlayerViewRef>}
source={sfVideoSource}
style={{ width: "100%", height: "100%" }}
onProgress={onProgressSf}
onPlaybackStateChange={onPlaybackStateChangedSf}
onPictureInPictureChange={onPictureInPictureChangeSf}
onLoad={() => setIsVideoLoaded(true)} onLoad={() => setIsVideoLoaded(true)}
onError={(e: { nativeEvent: SfOnErrorEventPayload }) => { onError={(e: { nativeEvent: MpvOnErrorEventPayload }) => {
console.error("Video Error:", e.nativeEvent); console.error("Video Error:", e.nativeEvent);
Alert.alert( Alert.alert(
t("player.error"), t("player.error"),
@@ -1227,7 +880,6 @@ export default function page() {
setTracksReady(true); setTracksReady(true);
}} }}
/> />
)}
{!hasPlaybackStarted && ( {!hasPlaybackStarted && (
<View <View
style={{ style={{
@@ -1263,11 +915,7 @@ export default function page() {
seek={seek} seek={seek}
enableTrickplay={true} enableTrickplay={true}
offline={offline} offline={offline}
useVlcPlayer={useVlcPlayer}
aspectRatio={aspectRatio} aspectRatio={aspectRatio}
setVideoAspectRatio={handleSetVideoAspectRatio}
scaleFactor={scaleFactor}
setVideoScaleFactor={handleSetVideoScaleFactor}
isZoomedToFill={isZoomedToFill} isZoomedToFill={isZoomedToFill}
onZoomToggle={handleZoomToggle} onZoomToggle={handleZoomToggle}
api={api} api={api}

@@ -42,7 +42,6 @@
"expo-router": "~6.0.21", "expo-router": "~6.0.21",
"expo-screen-orientation": "~9.0.8", "expo-screen-orientation": "~9.0.8",
"expo-secure-store": "^15.0.8", "expo-secure-store": "^15.0.8",
"expo-sensors": "~15.0.8",
"expo-sharing": "~14.0.8", "expo-sharing": "~14.0.8",
"expo-splash-screen": "~31.0.13", "expo-splash-screen": "~31.0.13",
"expo-status-bar": "~3.0.9", "expo-status-bar": "~3.0.9",
@@ -1054,8 +1053,6 @@
"expo-secure-store": ["expo-secure-store@15.0.8", "", { "peerDependencies": { "expo": "*" } }, "sha512-lHnzvRajBu4u+P99+0GEMijQMFCOYpWRO4dWsXSuMt77+THPIGjzNvVKrGSl6mMrLsfVaKL8BpwYZLGlgA+zAw=="], "expo-secure-store": ["expo-secure-store@15.0.8", "", { "peerDependencies": { "expo": "*" } }, "sha512-lHnzvRajBu4u+P99+0GEMijQMFCOYpWRO4dWsXSuMt77+THPIGjzNvVKrGSl6mMrLsfVaKL8BpwYZLGlgA+zAw=="],
"expo-sensors": ["expo-sensors@15.0.8", "", { "dependencies": { "invariant": "^2.2.4" }, "peerDependencies": { "expo": "*", "react-native": "*" } }, "sha512-ttibOSCYjFAMIfjV+vVukO1v7GKlbcPRfxcRqbTaSMGneewDwVSXbGFImY530fj1BR3mWq4n9jHnuDp8tAEY9g=="],
"expo-server": ["expo-server@1.0.5", "", {}, "sha512-IGR++flYH70rhLyeXF0Phle56/k4cee87WeQ4mamS+MkVAVP+dDlOHf2nN06Z9Y2KhU0Gp1k+y61KkghF7HdhA=="], "expo-server": ["expo-server@1.0.5", "", {}, "sha512-IGR++flYH70rhLyeXF0Phle56/k4cee87WeQ4mamS+MkVAVP+dDlOHf2nN06Z9Y2KhU0Gp1k+y61KkghF7HdhA=="],
"expo-sharing": ["expo-sharing@14.0.8", "", { "peerDependencies": { "expo": "*" } }, "sha512-A1pPr2iBrxypFDCWVAESk532HK+db7MFXbvO2sCV9ienaFXAk7lIBm6bkqgE6vzRd9O3RGdEGzYx80cYlc089Q=="], "expo-sharing": ["expo-sharing@14.0.8", "", { "peerDependencies": { "expo": "*" } }, "sha512-A1pPr2iBrxypFDCWVAESk532HK+db7MFXbvO2sCV9ienaFXAk7lIBm6bkqgE6vzRd9O3RGdEGzYx80cYlc089Q=="],

@@ -1,6 +1,7 @@
 import { Feather } from "@expo/vector-icons";
 import { useCallback, useEffect } from "react";
-import { Platform, TouchableOpacity } from "react-native";
+import { Platform } from "react-native";
+import { Pressable } from "react-native-gesture-handler";
 import GoogleCast, {
 CastButton,
 CastContext,
@@ -44,7 +45,7 @@ export function Chromecast({
 if (Platform.OS === "ios") {
 return (
-<TouchableOpacity
+<Pressable
 className='mr-4'
 onPress={() => {
 if (mediaStatus?.currentItemId) CastContext.showExpandedControls();
@@ -54,7 +55,7 @@
 >
 <AndroidCastButton />
 <Feather name='cast' size={22} color={"white"} />
-</TouchableOpacity>
+</Pressable>
 );
 }

@@ -209,6 +209,7 @@ export const DownloadItems: React.FC<DownloadProps> = ({
 subtitleStreamIndex: subtitleIndex ?? -1,
 maxBitrate: selectedOptions?.bitrate || defaultBitrate,
 deviceId: api.deviceInfo.id,
+audioMode: settings?.audioTranscodeMode,
 });
 return {

@@ -1,7 +1,8 @@
 import { Ionicons } from "@expo/vector-icons";
 import { BlurView } from "expo-blur";
 import type { PropsWithChildren } from "react";
-import { Platform, TouchableOpacity, type ViewProps } from "react-native";
+import { Platform, type ViewProps } from "react-native";
+import { Pressable } from "react-native-gesture-handler";
 import { useHaptic } from "@/hooks/useHaptic";
 interface Props extends ViewProps {
@@ -38,7 +39,7 @@ export const RoundButton: React.FC<PropsWithChildren<Props>> = ({
 if (Platform.OS === "ios") {
 return (
-<TouchableOpacity
+<Pressable
 onPress={handlePress}
 className={`rounded-full ${buttonSize} flex items-center justify-center ${fillColorClass}`}
 {...(viewProps as any)}
@@ -51,13 +52,13 @@ export const RoundButton: React.FC<PropsWithChildren<Props>> = ({
 />
 ) : null}
 {children ? children : null}
-</TouchableOpacity>
+</Pressable>
 );
 }
 if (fillColor)
 return (
-<TouchableOpacity
+<Pressable
 onPress={handlePress}
 className={`rounded-full ${buttonSize} flex items-center justify-center ${fillColorClass}`}
 {...(viewProps as any)}
@@ -70,12 +71,12 @@ export const RoundButton: React.FC<PropsWithChildren<Props>> = ({
 />
 ) : null}
 {children ? children : null}
-</TouchableOpacity>
+</Pressable>
 );
 if (background === false)
 return (
-<TouchableOpacity
+<Pressable
 onPress={handlePress}
 className={`rounded-full ${buttonSize} flex items-center justify-center ${fillColorClass}`}
 {...(viewProps as any)}
@@ -88,12 +89,12 @@ export const RoundButton: React.FC<PropsWithChildren<Props>> = ({
 />
 ) : null}
 {children ? children : null}
-</TouchableOpacity>
+</Pressable>
 );
 if (Platform.OS === "android")
 return (
-<TouchableOpacity
+<Pressable
 onPress={handlePress}
 className={`rounded-full ${buttonSize} flex items-center justify-center ${
 fillColor ? fillColorClass : "bg-transparent"
@@ -108,11 +109,11 @@ export const RoundButton: React.FC<PropsWithChildren<Props>> = ({
 />
 ) : null}
 {children ? children : null}
-</TouchableOpacity>
+</Pressable>
 );
 return (
-<TouchableOpacity onPress={handlePress} {...(viewProps as any)}>
+<Pressable onPress={handlePress} {...(viewProps as any)}>
 <BlurView
 intensity={90}
 className={`rounded-full overflow-hidden ${buttonSize} flex items-center justify-center ${fillColorClass}`}
@@ -127,6 +128,6 @@ export const RoundButton: React.FC<PropsWithChildren<Props>> = ({
 ) : null}
 {children ? children : null}
 </BlurView>
-</TouchableOpacity>
+</Pressable>
 );
 };

View File

@@ -1,42 +1,36 @@
import { Ionicons } from "@expo/vector-icons"; import { Ionicons } from "@expo/vector-icons";
import { BlurView, type BlurViewProps } from "expo-blur"; import { BlurView, type BlurViewProps } from "expo-blur";
import { useRouter } from "expo-router"; import { useRouter } from "expo-router";
import { import { Platform } from "react-native";
Platform, import { Pressable, type PressableProps } from "react-native-gesture-handler";
TouchableOpacity,
type TouchableOpacityProps,
} from "react-native";
interface Props extends BlurViewProps { interface Props extends BlurViewProps {
background?: "blur" | "transparent"; background?: "blur" | "transparent";
touchableOpacityProps?: TouchableOpacityProps; pressableProps?: Omit<PressableProps, "onPress">;
} }
export const HeaderBackButton: React.FC<Props> = ({ export const HeaderBackButton: React.FC<Props> = ({
background = "transparent", background = "transparent",
touchableOpacityProps, pressableProps,
...props ...props
}) => { }) => {
const router = useRouter(); const router = useRouter();
if (Platform.OS === "ios") { if (Platform.OS === "ios") {
return ( return (
<TouchableOpacity <Pressable
onPress={() => router.back()} onPress={() => router.back()}
className='flex items-center justify-center w-9 h-9' className='flex items-center justify-center w-9 h-9'
{...touchableOpacityProps} {...pressableProps}
> >
<Ionicons name='arrow-back' size={24} color='white' /> <Ionicons name='arrow-back' size={24} color='white' />
</TouchableOpacity> </Pressable>
); );
} }
if (background === "transparent" && Platform.OS !== "android") if (background === "transparent" && Platform.OS !== "android")
return ( return (
<TouchableOpacity <Pressable onPress={() => router.back()} {...pressableProps}>
onPress={() => router.back()}
{...touchableOpacityProps}
>
<BlurView <BlurView
{...props} {...props}
intensity={100} intensity={100}
@@ -49,14 +43,14 @@ export const HeaderBackButton: React.FC<Props> = ({
color='white' color='white'
/> />
</BlurView> </BlurView>
</TouchableOpacity> </Pressable>
); );
return ( return (
<TouchableOpacity <Pressable
onPress={() => router.back()} onPress={() => router.back()}
className=' rounded-full p-2' className=' rounded-full p-2'
{...touchableOpacityProps} {...pressableProps}
> >
<Ionicons <Ionicons
className='drop-shadow-2xl' className='drop-shadow-2xl'
@@ -64,6 +58,6 @@ export const HeaderBackButton: React.FC<Props> = ({
size={24} size={24}
color='white' color='white'
/> />
</TouchableOpacity> </Pressable>
); );
}; };

View File

@@ -21,9 +21,9 @@ import {
Platform, Platform,
RefreshControl, RefreshControl,
ScrollView, ScrollView,
TouchableOpacity,
View, View,
} from "react-native"; } from "react-native";
import { Pressable } from "react-native-gesture-handler";
import { useSafeAreaInsets } from "react-native-safe-area-context"; import { useSafeAreaInsets } from "react-native-safe-area-context";
import { Button } from "@/components/Button"; import { Button } from "@/components/Button";
import { Text } from "@/components/common/Text"; import { Text } from "@/components/common/Text";
@@ -118,7 +118,7 @@ export const Home = () => {
} }
navigation.setOptions({ navigation.setOptions({
headerLeft: () => ( headerLeft: () => (
<TouchableOpacity <Pressable
onPress={() => { onPress={() => {
router.push("/(auth)/downloads"); router.push("/(auth)/downloads");
}} }}
@@ -130,7 +130,7 @@ export const Home = () => {
color={hasDownloads ? Colors.primary : "white"} color={hasDownloads ? Colors.primary : "white"}
size={24} size={24}
/> />
</TouchableOpacity> </Pressable>
), ),
}); });
}, [navigation, router, hasDownloads]); }, [navigation, router, hasDownloads]);

View File

@@ -1,40 +0,0 @@
import type React from "react";
import { useCallback } from "react";
import { useTranslation } from "react-i18next";
import { Platform, Switch } from "react-native";
import { setHardwareDecode } from "@/modules/sf-player";
import { useSettings } from "@/utils/atoms/settings";
import { ListGroup } from "../list/ListGroup";
import { ListItem } from "../list/ListItem";
export const KSPlayerSettings: React.FC = () => {
const { settings, updateSettings } = useSettings();
const { t } = useTranslation();
const handleHardwareDecodeChange = useCallback(
(value: boolean) => {
updateSettings({ ksHardwareDecode: value });
setHardwareDecode(value);
},
[updateSettings],
);
if (Platform.OS !== "ios" || !settings) return null;
return (
<ListGroup
title={t("home.settings.subtitles.ksplayer_title")}
className='mt-4'
>
<ListItem
title={t("home.settings.subtitles.hardware_decode")}
subtitle={t("home.settings.subtitles.hardware_decode_description")}
>
<Switch
value={settings.ksHardwareDecode}
onValueChange={handleHardwareDecodeChange}
/>
</ListItem>
</ListGroup>
);
};

View File

@@ -0,0 +1,133 @@
import { Ionicons } from "@expo/vector-icons";
import { useMemo } from "react";
import { Platform, View, type ViewProps } from "react-native";
import { Stepper } from "@/components/inputs/Stepper";
import { Text } from "../common/Text";
import { ListGroup } from "../list/ListGroup";
import { ListItem } from "../list/ListItem";
import { PlatformDropdown } from "../PlatformDropdown";
import { useMedia } from "./MediaContext";
interface Props extends ViewProps {}
type AlignX = "left" | "center" | "right";
type AlignY = "top" | "center" | "bottom";
export const MpvSubtitleSettings: React.FC<Props> = ({ ...props }) => {
const isTv = Platform.isTV;
const media = useMedia();
const { settings, updateSettings } = media;
const alignXOptions: AlignX[] = ["left", "center", "right"];
const alignYOptions: AlignY[] = ["top", "center", "bottom"];
const alignXLabels: Record<AlignX, string> = {
left: "Left",
center: "Center",
right: "Right",
};
const alignYLabels: Record<AlignY, string> = {
top: "Top",
center: "Center",
bottom: "Bottom",
};
const alignXOptionGroups = useMemo(() => {
const options = alignXOptions.map((align) => ({
type: "radio" as const,
label: alignXLabels[align],
value: align,
selected: align === (settings?.mpvSubtitleAlignX ?? "center"),
onPress: () => updateSettings({ mpvSubtitleAlignX: align }),
}));
return [{ options }];
}, [settings?.mpvSubtitleAlignX, updateSettings]);
const alignYOptionGroups = useMemo(() => {
const options = alignYOptions.map((align) => ({
type: "radio" as const,
label: alignYLabels[align],
value: align,
selected: align === (settings?.mpvSubtitleAlignY ?? "bottom"),
onPress: () => updateSettings({ mpvSubtitleAlignY: align }),
}));
return [{ options }];
}, [settings?.mpvSubtitleAlignY, updateSettings]);
if (isTv) return null;
if (!settings) return null;
return (
<View {...props}>
<ListGroup
title='MPV Subtitle Settings'
description={
<Text className='text-[#8E8D91] text-xs'>
Advanced subtitle customization for MPV player
</Text>
}
>
<ListItem title='Subtitle Scale'>
<Stepper
value={settings.mpvSubtitleScale ?? 1.0}
step={0.1}
min={0.5}
max={2.0}
onUpdate={(value) =>
updateSettings({ mpvSubtitleScale: Math.round(value * 10) / 10 })
}
/>
</ListItem>
<ListItem title='Vertical Margin'>
<Stepper
value={settings.mpvSubtitleMarginY ?? 0}
step={5}
min={0}
max={100}
onUpdate={(value) => updateSettings({ mpvSubtitleMarginY: value })}
/>
</ListItem>
<ListItem title='Horizontal Alignment'>
<PlatformDropdown
groups={alignXOptionGroups}
trigger={
<View className='flex flex-row items-center justify-between py-1.5 pl-3'>
<Text className='mr-1 text-[#8E8D91]'>
{alignXLabels[settings?.mpvSubtitleAlignX ?? "center"]}
</Text>
<Ionicons
name='chevron-expand-sharp'
size={18}
color='#5A5960'
/>
</View>
}
title='Horizontal Alignment'
/>
</ListItem>
<ListItem title='Vertical Alignment'>
<PlatformDropdown
groups={alignYOptionGroups}
trigger={
<View className='flex flex-row items-center justify-between py-1.5 pl-3'>
<Text className='mr-1 text-[#8E8D91]'>
{alignYLabels[settings?.mpvSubtitleAlignY ?? "bottom"]}
</Text>
<Ionicons
name='chevron-expand-sharp'
size={18}
color='#5A5960'
/>
</View>
}
title='Vertical Alignment'
/>
</ListItem>
</ListGroup>
</View>
);
};
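Note on the new MpvSubtitleSettings component above: the values it stores (scale, vertical margin, X/Y alignment) line up with mpv's standard subtitle options (sub-scale, sub-margin-y, sub-align-x, sub-align-y). The sketch below is only a plausible illustration of how such settings might be forwarded to mpv; the function name and shape are hypothetical and not part of this commit.

// Hypothetical mapping from the stored settings to mpv option names (illustrative only).
type MpvSubtitlePrefs = {
  mpvSubtitleScale?: number;   // 0.5 – 2.0, stepper above
  mpvSubtitleMarginY?: number; // 0 – 100, stepper above
  mpvSubtitleAlignX?: "left" | "center" | "right";
  mpvSubtitleAlignY?: "top" | "center" | "bottom";
};

const toMpvOptions = (s: MpvSubtitlePrefs): Record<string, string> => ({
  "sub-scale": String(s.mpvSubtitleScale ?? 1.0),
  "sub-margin-y": String(s.mpvSubtitleMarginY ?? 0),
  "sub-align-x": s.mpvSubtitleAlignX ?? "center",
  "sub-align-y": s.mpvSubtitleAlignY ?? "bottom",
});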

View File

@@ -141,36 +141,6 @@ export const OtherSettings: React.FC = () => {
/> />
</ListItem> </ListItem>
{/* {(Platform.OS === "ios" || Platform.isTVOS)&& (
<ListItem
title={t("home.settings.other.video_player")}
disabled={pluginSettings?.defaultPlayer?.locked}
>
<Dropdown
data={Object.values(VideoPlayer).filter(isNumber)}
disabled={pluginSettings?.defaultPlayer?.locked}
keyExtractor={String}
titleExtractor={(item) => t(`home.settings.other.video_players.${VideoPlayer[item]}`)}
title={
<TouchableOpacity className="flex flex-row items-center justify-between py-1.5 pl-3">
<Text className="mr-1 text-[#8E8D91]">
{t(`home.settings.other.video_players.${VideoPlayer[settings.defaultPlayer]}`)}
</Text>
<Ionicons
name="chevron-expand-sharp"
size={18}
color="#5A5960"
/>
</TouchableOpacity>
}
label={t("home.settings.other.orientation")}
onSelected={(defaultPlayer) =>
updateSettings({ defaultPlayer })
}
/>
</ListItem>
)} */}
<ListItem <ListItem
title={t("home.settings.other.show_custom_menu_links")} title={t("home.settings.other.show_custom_menu_links")}
disabled={pluginSettings?.showCustomMenuLinks?.locked} disabled={pluginSettings?.showCustomMenuLinks?.locked}

View File

@@ -13,7 +13,6 @@ import { ScreenOrientationEnum, useSettings } from "@/utils/atoms/settings";
import { Text } from "../common/Text"; import { Text } from "../common/Text";
import { ListGroup } from "../list/ListGroup"; import { ListGroup } from "../list/ListGroup";
import { ListItem } from "../list/ListItem"; import { ListItem } from "../list/ListItem";
import { VideoPlayerSettings } from "./VideoPlayerSettings";
export const PlaybackControlsSettings: React.FC = () => { export const PlaybackControlsSettings: React.FC = () => {
const { settings, updateSettings, pluginSettings } = useSettings(); const { settings, updateSettings, pluginSettings } = useSettings();
@@ -231,8 +230,6 @@ export const PlaybackControlsSettings: React.FC = () => {
/> />
</ListItem> </ListItem>
</ListGroup> </ListGroup>
<VideoPlayerSettings />
</DisabledSetting> </DisabledSetting>
); );
}; };

View File

@@ -1,93 +0,0 @@
import { Ionicons } from "@expo/vector-icons";
import type React from "react";
import { useCallback, useMemo } from "react";
import { useTranslation } from "react-i18next";
import { Platform, Switch, View } from "react-native";
import { setHardwareDecode } from "@/modules/sf-player";
import { useSettings, VideoPlayerIOS } from "@/utils/atoms/settings";
import { Text } from "../common/Text";
import { ListGroup } from "../list/ListGroup";
import { ListItem } from "../list/ListItem";
import { PlatformDropdown } from "../PlatformDropdown";
export const VideoPlayerSettings: React.FC = () => {
const { settings, updateSettings } = useSettings();
const { t } = useTranslation();
const handleHardwareDecodeChange = useCallback(
(value: boolean) => {
updateSettings({ ksHardwareDecode: value });
setHardwareDecode(value);
},
[updateSettings],
);
const videoPlayerOptions = useMemo(
() => [
{
options: [
{
type: "radio" as const,
label: t("home.settings.video_player.ksplayer"),
value: VideoPlayerIOS.KSPlayer,
selected: settings?.videoPlayerIOS === VideoPlayerIOS.KSPlayer,
onPress: () =>
updateSettings({ videoPlayerIOS: VideoPlayerIOS.KSPlayer }),
},
{
type: "radio" as const,
label: t("home.settings.video_player.vlc"),
value: VideoPlayerIOS.VLC,
selected: settings?.videoPlayerIOS === VideoPlayerIOS.VLC,
onPress: () =>
updateSettings({ videoPlayerIOS: VideoPlayerIOS.VLC }),
},
],
},
],
[settings?.videoPlayerIOS, t, updateSettings],
);
const getPlayerLabel = useCallback(() => {
switch (settings?.videoPlayerIOS) {
case VideoPlayerIOS.VLC:
return t("home.settings.video_player.vlc");
default:
return t("home.settings.video_player.ksplayer");
}
}, [settings?.videoPlayerIOS, t]);
if (Platform.OS !== "ios" || !settings) return null;
return (
<ListGroup title={t("home.settings.video_player.title")} className='mt-4'>
<ListItem
title={t("home.settings.video_player.video_player")}
subtitle={t("home.settings.video_player.video_player_description")}
>
<PlatformDropdown
groups={videoPlayerOptions}
trigger={
<View className='flex flex-row items-center justify-between py-1.5 pl-3'>
<Text className='mr-1 text-[#8E8D91]'>{getPlayerLabel()}</Text>
<Ionicons name='chevron-expand-sharp' size={18} color='#5A5960' />
</View>
}
title={t("home.settings.video_player.video_player")}
/>
</ListItem>
{settings.videoPlayerIOS === VideoPlayerIOS.KSPlayer && (
<ListItem
title={t("home.settings.subtitles.hardware_decode")}
subtitle={t("home.settings.subtitles.hardware_decode_description")}
>
<Switch
value={settings.ksHardwareDecode}
onValueChange={handleHardwareDecodeChange}
/>
</ListItem>
)}
</ListGroup>
);
};

View File

@@ -1,245 +0,0 @@
import { Ionicons } from "@expo/vector-icons";
import { useMemo } from "react";
import { useTranslation } from "react-i18next";
import { Platform, View, type ViewProps } from "react-native";
import { Switch } from "react-native-gesture-handler";
import {
OUTLINE_THICKNESS_OPTIONS,
VLC_COLOR_OPTIONS,
} from "@/constants/SubtitleConstants";
import { useSettings, VideoPlayerIOS } from "@/utils/atoms/settings";
import { Text } from "../common/Text";
import { Stepper } from "../inputs/Stepper";
import { ListGroup } from "../list/ListGroup";
import { ListItem } from "../list/ListItem";
import { PlatformDropdown } from "../PlatformDropdown";
interface Props extends ViewProps {}
/**
* VLC Subtitle Settings component
* Only shown when VLC is the active player (Android always, iOS when VLC selected)
* Note: These settings are applied via VLC init options and take effect on next playback
*/
export const VlcSubtitleSettings: React.FC<Props> = ({ ...props }) => {
const { t } = useTranslation();
const { settings, updateSettings } = useSettings();
// Only show for VLC users
const isVlcPlayer =
Platform.OS === "android" ||
(Platform.OS === "ios" && settings.videoPlayerIOS === VideoPlayerIOS.VLC);
const textColorOptions = useMemo(
() => [
{
options: VLC_COLOR_OPTIONS.map((color) => ({
type: "radio" as const,
label: color,
value: color,
selected: settings.vlcTextColor === color,
onPress: () => updateSettings({ vlcTextColor: color }),
})),
},
],
[settings.vlcTextColor, updateSettings],
);
const backgroundColorOptions = useMemo(
() => [
{
options: VLC_COLOR_OPTIONS.map((color) => ({
type: "radio" as const,
label: color,
value: color,
selected: settings.vlcBackgroundColor === color,
onPress: () => updateSettings({ vlcBackgroundColor: color }),
})),
},
],
[settings.vlcBackgroundColor, updateSettings],
);
const outlineColorOptions = useMemo(
() => [
{
options: VLC_COLOR_OPTIONS.map((color) => ({
type: "radio" as const,
label: color,
value: color,
selected: settings.vlcOutlineColor === color,
onPress: () => updateSettings({ vlcOutlineColor: color }),
})),
},
],
[settings.vlcOutlineColor, updateSettings],
);
const outlineThicknessOptions = useMemo(
() => [
{
options: OUTLINE_THICKNESS_OPTIONS.map((thickness) => ({
type: "radio" as const,
label: thickness,
value: thickness,
selected: settings.vlcOutlineThickness === thickness,
onPress: () => updateSettings({ vlcOutlineThickness: thickness }),
})),
},
],
[settings.vlcOutlineThickness, updateSettings],
);
if (!isVlcPlayer) return null;
if (Platform.isTV) return null;
return (
<View {...props}>
<ListGroup
title={t("home.settings.vlc_subtitles.title")}
description={
<Text className='text-[#8E8D91] text-xs'>
{t("home.settings.vlc_subtitles.hint")}
</Text>
}
>
{/* Text Color */}
<ListItem title={t("home.settings.vlc_subtitles.text_color")}>
<PlatformDropdown
groups={textColorOptions}
trigger={
<View className='flex flex-row items-center justify-between py-1.5 pl-3'>
<Text className='mr-1 text-[#8E8D91]'>
{settings.vlcTextColor || "White"}
</Text>
<Ionicons
name='chevron-expand-sharp'
size={18}
color='#5A5960'
/>
</View>
}
title={t("home.settings.vlc_subtitles.text_color")}
/>
</ListItem>
{/* Background Color */}
<ListItem title={t("home.settings.vlc_subtitles.background_color")}>
<PlatformDropdown
groups={backgroundColorOptions}
trigger={
<View className='flex flex-row items-center justify-between py-1.5 pl-3'>
<Text className='mr-1 text-[#8E8D91]'>
{settings.vlcBackgroundColor || "Black"}
</Text>
<Ionicons
name='chevron-expand-sharp'
size={18}
color='#5A5960'
/>
</View>
}
title={t("home.settings.vlc_subtitles.background_color")}
/>
</ListItem>
{/* Background Opacity */}
<ListItem title={t("home.settings.vlc_subtitles.background_opacity")}>
<Stepper
value={Math.round(
((settings.vlcBackgroundOpacity ?? 128) / 255) * 100,
)}
step={10}
min={0}
max={100}
appendValue='%'
onUpdate={(value) =>
updateSettings({
vlcBackgroundOpacity: Math.round((value / 100) * 255),
})
}
/>
</ListItem>
{/* Outline Color */}
<ListItem title={t("home.settings.vlc_subtitles.outline_color")}>
<PlatformDropdown
groups={outlineColorOptions}
trigger={
<View className='flex flex-row items-center justify-between py-1.5 pl-3'>
<Text className='mr-1 text-[#8E8D91]'>
{settings.vlcOutlineColor || "Black"}
</Text>
<Ionicons
name='chevron-expand-sharp'
size={18}
color='#5A5960'
/>
</View>
}
title={t("home.settings.vlc_subtitles.outline_color")}
/>
</ListItem>
{/* Outline Opacity */}
<ListItem title={t("home.settings.vlc_subtitles.outline_opacity")}>
<Stepper
value={Math.round(
((settings.vlcOutlineOpacity ?? 255) / 255) * 100,
)}
step={10}
min={0}
max={100}
appendValue='%'
onUpdate={(value) =>
updateSettings({
vlcOutlineOpacity: Math.round((value / 100) * 255),
})
}
/>
</ListItem>
{/* Outline Thickness */}
<ListItem title={t("home.settings.vlc_subtitles.outline_thickness")}>
<PlatformDropdown
groups={outlineThicknessOptions}
trigger={
<View className='flex flex-row items-center justify-between py-1.5 pl-3'>
<Text className='mr-1 text-[#8E8D91]'>
{settings.vlcOutlineThickness || "Normal"}
</Text>
<Ionicons
name='chevron-expand-sharp'
size={18}
color='#5A5960'
/>
</View>
}
title={t("home.settings.vlc_subtitles.outline_thickness")}
/>
</ListItem>
{/* Bold Text */}
<ListItem title={t("home.settings.vlc_subtitles.bold")}>
<Switch
value={settings.vlcIsBold ?? false}
onValueChange={(value) => updateSettings({ vlcIsBold: value })}
/>
</ListItem>
{/* Subtitle Margin */}
<ListItem title={t("home.settings.vlc_subtitles.margin")}>
<Stepper
value={settings.vlcSubtitleMargin ?? 40}
step={10}
min={0}
max={200}
onUpdate={(value) =>
updateSettings({ vlcSubtitleMargin: Math.round(value) })
}
/>
</ListItem>
</ListGroup>
</View>
);
};

View File

@@ -19,7 +19,14 @@ export const commonScreenOptions: ICommonScreenOptions = {
headerLeft: () => <HeaderBackButton />, headerLeft: () => <HeaderBackButton />,
}; };
const routes = ["persons/[personId]", "items/page", "series/[id]"]; const routes = [
"persons/[personId]",
"items/page",
"series/[id]",
"music/album/[albumId]",
"music/artist/[artistId]",
"music/playlist/[playlistId]",
];
export const nestedTabPageScreenOptions: Record<string, ICommonScreenOptions> = export const nestedTabPageScreenOptions: Record<string, ICommonScreenOptions> =
Object.fromEntries(routes.map((route) => [route, commonScreenOptions])); Object.fromEntries(routes.map((route) => [route, commonScreenOptions]));

View File

@@ -96,9 +96,11 @@ export const BottomControls: FC<BottomControlsProps> = ({
style={[ style={[
{ {
position: "absolute", position: "absolute",
right: settings?.safeAreaInControlsEnabled ? insets.right : 0, right:
left: settings?.safeAreaInControlsEnabled ? insets.left : 0, (settings?.safeAreaInControlsEnabled ?? true) ? insets.right : 0,
bottom: settings?.safeAreaInControlsEnabled left: (settings?.safeAreaInControlsEnabled ?? true) ? insets.left : 0,
bottom:
(settings?.safeAreaInControlsEnabled ?? true)
? Math.max(insets.bottom - 17, 0) ? Math.max(insets.bottom - 17, 0)
: 0, : 0,
}, },

View File

@@ -1,4 +1,4 @@
import { useEffect, useRef } from "react"; import { useEffect, useRef, useState } from "react";
import { Platform, StyleSheet, View } from "react-native"; import { Platform, StyleSheet, View } from "react-native";
import { Slider } from "react-native-awesome-slider"; import { Slider } from "react-native-awesome-slider";
import { useSharedValue } from "react-native-reanimated"; import { useSharedValue } from "react-native-reanimated";
@@ -16,10 +16,19 @@ const BrightnessSlider = () => {
const max = useSharedValue(100); const max = useSharedValue(100);
const isUserInteracting = useRef(false); const isUserInteracting = useRef(false);
const lastKnownBrightness = useRef<number>(50); const lastKnownBrightness = useRef<number>(50);
const brightnessSupportedRef = useRef(true);
const [brightnessSupported, setBrightnessSupported] = useState(true);
// Update brightness from device // Update brightness from device
const updateBrightnessFromDevice = async () => { const updateBrightnessFromDevice = async () => {
if (isTv || !Brightness || isUserInteracting.current) return; // Check ref (not state) to avoid stale closure in setInterval
if (
isTv ||
!Brightness ||
isUserInteracting.current ||
!brightnessSupportedRef.current
)
return;
try { try {
const currentBrightness = await Brightness.getBrightnessAsync(); const currentBrightness = await Brightness.getBrightnessAsync();
@@ -31,7 +40,10 @@ const BrightnessSlider = () => {
lastKnownBrightness.current = brightnessPercent; lastKnownBrightness.current = brightnessPercent;
} }
} catch (error) { } catch (error) {
console.error("Error fetching brightness:", error); console.warn("Brightness not supported on this device:", error);
// Update both ref (stops interval) and state (triggers re-render to hide)
brightnessSupportedRef.current = false;
setBrightnessSupported(false);
} }
}; };
@@ -66,7 +78,7 @@ const BrightnessSlider = () => {
}, 100); }, 100);
}; };
if (isTv) return null; if (isTv || !brightnessSupported) return null;
return ( return (
<View style={styles.sliderContainer}> <View style={styles.sliderContainer}>
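The BrightnessSlider change above pairs a ref (read inside the polling interval, so the callback never sees a stale value) with a state flag (so the component re-renders and hides once brightness turns out to be unsupported). A minimal sketch of that pattern, not the component's actual code and with illustrative names:

// Minimal sketch: ref drives the interval's early return, state drives rendering.
import { useEffect, useRef, useState } from "react";

function useSupportFlag(probe: () => Promise<void>, intervalMs = 1000) {
  const supportedRef = useRef(true);                 // read inside the interval callback
  const [supported, setSupported] = useState(true);  // triggers re-render when flipped

  useEffect(() => {
    const id = setInterval(async () => {
      if (!supportedRef.current) return;             // ref avoids a stale-closure read
      try {
        await probe();                               // e.g. a getter that throws if unsupported
      } catch {
        supportedRef.current = false;                // stop further probing
        setSupported(false);                         // let the UI hide the control
      }
    }, intervalMs);
    return () => clearInterval(id);
  }, [probe, intervalMs]);                           // probe should be stable (useCallback)

  return supported;
}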

View File

@@ -38,8 +38,8 @@ export const CenterControls: FC<CenterControlsProps> = ({
style={{ style={{
position: "absolute", position: "absolute",
top: "50%", top: "50%",
left: settings?.safeAreaInControlsEnabled ? insets.left : 0, left: (settings?.safeAreaInControlsEnabled ?? true) ? insets.left : 0,
right: settings?.safeAreaInControlsEnabled ? insets.right : 0, right: (settings?.safeAreaInControlsEnabled ?? true) ? insets.right : 0,
flexDirection: "row", flexDirection: "row",
justifyContent: "space-between", justifyContent: "space-between",
alignItems: "center", alignItems: "center",

View File

@@ -37,7 +37,6 @@ import { useVideoTime } from "./hooks/useVideoTime";
import { useControlsTimeout } from "./useControlsTimeout"; import { useControlsTimeout } from "./useControlsTimeout";
import { PlaybackSpeedScope } from "./utils/playback-speed-settings"; import { PlaybackSpeedScope } from "./utils/playback-speed-settings";
import { type AspectRatio } from "./VideoScalingModeSelector"; import { type AspectRatio } from "./VideoScalingModeSelector";
import { type ScaleFactor } from "./VlcZoomControl";
interface Props { interface Props {
item: BaseItemDto; item: BaseItemDto;
@@ -56,13 +55,7 @@ interface Props {
startPictureInPicture?: () => Promise<void>; startPictureInPicture?: () => Promise<void>;
play: () => void; play: () => void;
pause: () => void; pause: () => void;
useVlcPlayer?: boolean;
// VLC-specific props
setVideoAspectRatio?: (aspectRatio: string | null) => Promise<void>;
aspectRatio?: AspectRatio; aspectRatio?: AspectRatio;
scaleFactor?: ScaleFactor;
setVideoScaleFactor?: (scaleFactor: number) => Promise<void>;
// KSPlayer-specific props
isZoomedToFill?: boolean; isZoomedToFill?: boolean;
onZoomToggle?: () => void; onZoomToggle?: () => void;
api?: Api | null; api?: Api | null;
@@ -87,11 +80,7 @@ export const Controls: FC<Props> = ({
showControls, showControls,
setShowControls, setShowControls,
mediaSource, mediaSource,
useVlcPlayer = false,
setVideoAspectRatio,
aspectRatio = "default", aspectRatio = "default",
scaleFactor = 0,
setVideoScaleFactor,
isZoomedToFill = false, isZoomedToFill = false,
onZoomToggle, onZoomToggle,
offline = false, offline = false,
@@ -121,7 +110,7 @@ export const Controls: FC<Props> = ({
} = useTrickplay(item); } = useTrickplay(item);
const min = useSharedValue(0); const min = useSharedValue(0);
const max = useSharedValue(item.RunTimeTicks || 0); const max = useSharedValue(ticksToMs(item.RunTimeTicks || 0));
// Animation values for controls // Animation values for controls
const controlsOpacity = useSharedValue(showControls ? 1 : 0); const controlsOpacity = useSharedValue(showControls ? 1 : 0);
@@ -483,11 +472,7 @@ export const Controls: FC<Props> = ({
goToNextItem={goToNextItem} goToNextItem={goToNextItem}
previousItem={previousItem} previousItem={previousItem}
nextItem={nextItem} nextItem={nextItem}
useVlcPlayer={useVlcPlayer}
aspectRatio={aspectRatio} aspectRatio={aspectRatio}
setVideoAspectRatio={setVideoAspectRatio}
scaleFactor={scaleFactor}
setVideoScaleFactor={setVideoScaleFactor}
isZoomedToFill={isZoomedToFill} isZoomedToFill={isZoomedToFill}
onZoomToggle={onZoomToggle} onZoomToggle={onZoomToggle}
playbackSpeed={playbackSpeed} playbackSpeed={playbackSpeed}
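The slider-max change above wraps RunTimeTicks in ticksToMs. Jellyfin reports runtimes in 100-nanosecond ticks, so converting to milliseconds is a division by 10,000; the helper's real implementation lives elsewhere in the repo, this is only the expected arithmetic:

// Jellyfin RunTimeTicks are 100 ns units: 10,000 ticks per millisecond.
const ticksToMs = (ticks: number): number => ticks / 10_000;

ticksToMs(36_000_000_000); // => 3_600_000 ms, i.e. a one-hour runtime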

View File

@@ -7,19 +7,14 @@ import { useRouter } from "expo-router";
import { type FC, useCallback, useState } from "react"; import { type FC, useCallback, useState } from "react";
import { Platform, TouchableOpacity, View } from "react-native"; import { Platform, TouchableOpacity, View } from "react-native";
import { useSafeAreaInsets } from "react-native-safe-area-context"; import { useSafeAreaInsets } from "react-native-safe-area-context";
import { PlaybackSpeedSelector } from "@/components/PlaybackSpeedSelector";
import { useHaptic } from "@/hooks/useHaptic"; import { useHaptic } from "@/hooks/useHaptic";
import { useOrientation } from "@/hooks/useOrientation"; import { useOrientation } from "@/hooks/useOrientation";
import { OrientationLock } from "@/packages/expo-screen-orientation"; import { OrientationLock } from "@/packages/expo-screen-orientation";
import { useSettings, VideoPlayerIOS } from "@/utils/atoms/settings"; import { useSettings } from "@/utils/atoms/settings";
import { ICON_SIZES } from "./constants"; import { ICON_SIZES } from "./constants";
import DropdownView from "./dropdown/DropdownView"; import DropdownView from "./dropdown/DropdownView";
import { PlaybackSpeedScope } from "./utils/playback-speed-settings"; import { PlaybackSpeedScope } from "./utils/playback-speed-settings";
import { import { type AspectRatio } from "./VideoScalingModeSelector";
type AspectRatio,
AspectRatioSelector,
} from "./VideoScalingModeSelector";
import { type ScaleFactor, VlcZoomControl } from "./VlcZoomControl";
import { ZoomToggle } from "./ZoomToggle"; import { ZoomToggle } from "./ZoomToggle";
interface HeaderControlsProps { interface HeaderControlsProps {
@@ -33,13 +28,7 @@ interface HeaderControlsProps {
goToNextItem: (options: { isAutoPlay?: boolean }) => void; goToNextItem: (options: { isAutoPlay?: boolean }) => void;
previousItem?: BaseItemDto | null; previousItem?: BaseItemDto | null;
nextItem?: BaseItemDto | null; nextItem?: BaseItemDto | null;
useVlcPlayer?: boolean;
// VLC-specific props
aspectRatio?: AspectRatio; aspectRatio?: AspectRatio;
setVideoAspectRatio?: (aspectRatio: string | null) => Promise<void>;
scaleFactor?: ScaleFactor;
setVideoScaleFactor?: (scaleFactor: number) => Promise<void>;
// KSPlayer-specific props
isZoomedToFill?: boolean; isZoomedToFill?: boolean;
onZoomToggle?: () => void; onZoomToggle?: () => void;
// Playback speed props // Playback speed props
@@ -58,11 +47,7 @@ export const HeaderControls: FC<HeaderControlsProps> = ({
goToNextItem, goToNextItem,
previousItem, previousItem,
nextItem, nextItem,
useVlcPlayer = false, aspectRatio: _aspectRatio = "default",
aspectRatio = "default",
setVideoAspectRatio,
scaleFactor = 0,
setVideoScaleFactor,
isZoomedToFill = false, isZoomedToFill = false,
onZoomToggle, onZoomToggle,
playbackSpeed = 1.0, playbackSpeed = 1.0,
@@ -109,9 +94,10 @@ export const HeaderControls: FC<HeaderControlsProps> = ({
style={[ style={[
{ {
position: "absolute", position: "absolute",
top: settings?.safeAreaInControlsEnabled ? insets.top : 0, top: (settings?.safeAreaInControlsEnabled ?? true) ? insets.top : 0,
left: settings?.safeAreaInControlsEnabled ? insets.left : 0, left: (settings?.safeAreaInControlsEnabled ?? true) ? insets.left : 0,
right: settings?.safeAreaInControlsEnabled ? insets.right : 0, right:
(settings?.safeAreaInControlsEnabled ?? true) ? insets.right : 0,
}, },
]} ]}
pointerEvents={showControls ? "auto" : "none"} pointerEvents={showControls ? "auto" : "none"}
@@ -120,7 +106,10 @@ export const HeaderControls: FC<HeaderControlsProps> = ({
<View className='mr-auto p-2' pointerEvents='box-none'> <View className='mr-auto p-2' pointerEvents='box-none'>
{!Platform.isTV && (!offline || !mediaSource?.TranscodingUrl) && ( {!Platform.isTV && (!offline || !mediaSource?.TranscodingUrl) && (
<View pointerEvents='auto'> <View pointerEvents='auto'>
<DropdownView /> <DropdownView
playbackSpeed={playbackSpeed}
setPlaybackSpeed={setPlaybackSpeed}
/>
</View> </View>
)} )}
</View> </View>
@@ -142,9 +131,7 @@ export const HeaderControls: FC<HeaderControlsProps> = ({
/> />
</TouchableOpacity> </TouchableOpacity>
)} )}
{!Platform.isTV && {!Platform.isTV && startPictureInPicture && (
startPictureInPicture &&
settings?.videoPlayerIOS !== VideoPlayerIOS.VLC && (
<TouchableOpacity <TouchableOpacity
onPress={startPictureInPicture} onPress={startPictureInPicture}
className='aspect-square flex flex-col rounded-xl items-center justify-center p-2' className='aspect-square flex flex-col rounded-xl items-center justify-center p-2'
@@ -188,47 +175,12 @@ export const HeaderControls: FC<HeaderControlsProps> = ({
/> />
</TouchableOpacity> </TouchableOpacity>
)} )}
{/* Playback Speed Control */} {/* MPV Zoom Toggle */}
{!Platform.isTV && setPlaybackSpeed && (
<PlaybackSpeedSelector
selected={playbackSpeed}
onChange={setPlaybackSpeed}
item={item}
/>
)}
{/* VLC-specific controls: Aspect Ratio and Scale/Zoom */}
{useVlcPlayer && (
<AspectRatioSelector
currentRatio={aspectRatio}
onRatioChange={async (newRatio) => {
if (setVideoAspectRatio) {
const aspectRatioString =
newRatio === "default" ? null : newRatio;
await setVideoAspectRatio(aspectRatioString);
}
}}
disabled={!setVideoAspectRatio}
/>
)}
{useVlcPlayer && (
<VlcZoomControl
currentScale={scaleFactor}
onScaleChange={async (newScale) => {
if (setVideoScaleFactor) {
await setVideoScaleFactor(newScale);
}
}}
disabled={!setVideoScaleFactor}
/>
)}
{/* KSPlayer-specific control: Zoom to Fill */}
{!useVlcPlayer && (
<ZoomToggle <ZoomToggle
isZoomedToFill={isZoomedToFill} isZoomedToFill={isZoomedToFill}
onToggle={onZoomToggle ?? (() => {})} onToggle={onZoomToggle ?? (() => {})}
disabled={!onZoomToggle} disabled={!onZoomToggle}
/> />
)}
<TouchableOpacity <TouchableOpacity
onPress={onClose} onPress={onClose}
className='aspect-square flex flex-col rounded-xl items-center justify-center p-2' className='aspect-square flex flex-col rounded-xl items-center justify-center p-2'

View File

@@ -1,121 +0,0 @@
import { Ionicons } from "@expo/vector-icons";
import React, { useMemo } from "react";
import { Platform, View } from "react-native";
import {
type OptionGroup,
PlatformDropdown,
} from "@/components/PlatformDropdown";
import { useHaptic } from "@/hooks/useHaptic";
import { ICON_SIZES } from "./constants";
export type ScaleFactor = 0 | 0.25 | 0.5 | 0.75 | 1.0 | 1.25 | 1.5 | 2.0;
interface VlcZoomControlProps {
currentScale: ScaleFactor;
onScaleChange: (scale: ScaleFactor) => void;
disabled?: boolean;
}
interface ScaleOption {
id: ScaleFactor;
label: string;
description: string;
}
const SCALE_OPTIONS: ScaleOption[] = [
{
id: 0,
label: "Fit",
description: "Fit video to screen",
},
{
id: 0.25,
label: "25%",
description: "Quarter size",
},
{
id: 0.5,
label: "50%",
description: "Half size",
},
{
id: 0.75,
label: "75%",
description: "Three quarters",
},
{
id: 1.0,
label: "100%",
description: "Original video size",
},
{
id: 1.25,
label: "125%",
description: "Slight zoom",
},
{
id: 1.5,
label: "150%",
description: "Medium zoom",
},
{
id: 2.0,
label: "200%",
description: "Maximum zoom",
},
];
export const VlcZoomControl: React.FC<VlcZoomControlProps> = ({
currentScale,
onScaleChange,
disabled = false,
}) => {
const lightHapticFeedback = useHaptic("light");
const handleScaleSelect = (scale: ScaleFactor) => {
onScaleChange(scale);
lightHapticFeedback();
};
const optionGroups = useMemo<OptionGroup[]>(() => {
return [
{
options: SCALE_OPTIONS.map((option) => ({
type: "radio" as const,
label: option.label,
value: option.id,
selected: option.id === currentScale,
onPress: () => handleScaleSelect(option.id),
disabled,
})),
},
];
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [currentScale, disabled]);
const trigger = useMemo(
() => (
<View
className='aspect-square flex flex-col rounded-xl items-center justify-center p-2'
style={{ opacity: disabled ? 0.5 : 1 }}
>
<Ionicons name='scan-outline' size={ICON_SIZES.HEADER} color='white' />
</View>
),
[disabled],
);
// Hide on TV platforms
if (Platform.isTV) return null;
return (
<PlatformDropdown
title='Zoom'
groups={optionGroups}
trigger={trigger}
bottomSheetConfig={{
enablePanDownToClose: true,
}}
/>
);
};

View File

@@ -9,19 +9,15 @@ import React, {
useContext, useContext,
useMemo, useMemo,
} from "react"; } from "react";
import type { SfPlayerViewRef, VlcPlayerViewRef } from "@/modules"; import type { MpvPlayerViewRef } from "@/modules";
import type { DownloadedItem } from "@/providers/Downloads/types"; import type { DownloadedItem } from "@/providers/Downloads/types";
// Union type for both player refs
type PlayerRef = SfPlayerViewRef | VlcPlayerViewRef;
interface PlayerContextProps { interface PlayerContextProps {
playerRef: MutableRefObject<PlayerRef | null>; playerRef: MutableRefObject<MpvPlayerViewRef | null>;
item: BaseItemDto; item: BaseItemDto;
mediaSource: MediaSourceInfo | null | undefined; mediaSource: MediaSourceInfo | null | undefined;
isVideoLoaded: boolean; isVideoLoaded: boolean;
tracksReady: boolean; tracksReady: boolean;
useVlcPlayer: boolean;
offline: boolean; offline: boolean;
downloadedItem: DownloadedItem | null; downloadedItem: DownloadedItem | null;
} }
@@ -30,12 +26,11 @@ const PlayerContext = createContext<PlayerContextProps | undefined>(undefined);
interface PlayerProviderProps { interface PlayerProviderProps {
children: ReactNode; children: ReactNode;
playerRef: MutableRefObject<PlayerRef | null>; playerRef: MutableRefObject<MpvPlayerViewRef | null>;
item: BaseItemDto; item: BaseItemDto;
mediaSource: MediaSourceInfo | null | undefined; mediaSource: MediaSourceInfo | null | undefined;
isVideoLoaded: boolean; isVideoLoaded: boolean;
tracksReady: boolean; tracksReady: boolean;
useVlcPlayer: boolean;
offline?: boolean; offline?: boolean;
downloadedItem?: DownloadedItem | null; downloadedItem?: DownloadedItem | null;
} }
@@ -47,7 +42,6 @@ export const PlayerProvider: React.FC<PlayerProviderProps> = ({
mediaSource, mediaSource,
isVideoLoaded, isVideoLoaded,
tracksReady, tracksReady,
useVlcPlayer,
offline = false, offline = false,
downloadedItem = null, downloadedItem = null,
}) => { }) => {
@@ -58,7 +52,6 @@ export const PlayerProvider: React.FC<PlayerProviderProps> = ({
mediaSource, mediaSource,
isVideoLoaded, isVideoLoaded,
tracksReady, tracksReady,
useVlcPlayer,
offline, offline,
downloadedItem, downloadedItem,
}), }),
@@ -68,7 +61,6 @@ export const PlayerProvider: React.FC<PlayerProviderProps> = ({
mediaSource, mediaSource,
isVideoLoaded, isVideoLoaded,
tracksReady, tracksReady,
useVlcPlayer,
offline, offline,
downloadedItem, downloadedItem,
], ],
@@ -87,30 +79,26 @@ export const usePlayerContext = () => {
return context; return context;
}; };
// Player controls hook - supports both SfPlayer (iOS) and VlcPlayer (Android) // Player controls hook - MPV player only
export const usePlayerControls = () => { export const usePlayerControls = () => {
const { playerRef } = usePlayerContext(); const { playerRef } = usePlayerContext();
// Helper to get SfPlayer-specific ref (for iOS-only features)
const getSfRef = () => playerRef.current as SfPlayerViewRef | null;
return { return {
// Subtitle controls (both players support these, but with different interfaces) // Subtitle controls
getSubtitleTracks: async () => { getSubtitleTracks: async () => {
return playerRef.current?.getSubtitleTracks?.() ?? null; return playerRef.current?.getSubtitleTracks?.() ?? null;
}, },
setSubtitleTrack: (trackId: number) => { setSubtitleTrack: (trackId: number) => {
playerRef.current?.setSubtitleTrack?.(trackId); playerRef.current?.setSubtitleTrack?.(trackId);
}, },
// iOS only (SfPlayer)
disableSubtitles: () => { disableSubtitles: () => {
getSfRef()?.disableSubtitles?.(); playerRef.current?.disableSubtitles?.();
}, },
addSubtitleFile: (url: string, select = true) => { addSubtitleFile: (url: string, select = true) => {
getSfRef()?.addSubtitleFile?.(url, select); playerRef.current?.addSubtitleFile?.(url, select);
}, },
// Audio controls (both players) // Audio controls
getAudioTracks: async () => { getAudioTracks: async () => {
return playerRef.current?.getAudioTracks?.() ?? null; return playerRef.current?.getAudioTracks?.() ?? null;
}, },
@@ -118,26 +106,25 @@ export const usePlayerControls = () => {
playerRef.current?.setAudioTrack?.(trackId); playerRef.current?.setAudioTrack?.(trackId);
}, },
// Playback controls (both players) // Playback controls
play: () => playerRef.current?.play?.(), play: () => playerRef.current?.play?.(),
pause: () => playerRef.current?.pause?.(), pause: () => playerRef.current?.pause?.(),
seekTo: (position: number) => playerRef.current?.seekTo?.(position), seekTo: (position: number) => playerRef.current?.seekTo?.(position),
// iOS only (SfPlayer) seekBy: (offset: number) => playerRef.current?.seekBy?.(offset),
seekBy: (offset: number) => getSfRef()?.seekBy?.(offset), setSpeed: (speed: number) => playerRef.current?.setSpeed?.(speed),
setSpeed: (speed: number) => getSfRef()?.setSpeed?.(speed),
// Subtitle positioning - iOS only (SfPlayer) // Subtitle positioning
setSubtitleScale: (scale: number) => getSfRef()?.setSubtitleScale?.(scale), setSubtitleScale: (scale: number) =>
playerRef.current?.setSubtitleScale?.(scale),
setSubtitlePosition: (position: number) => setSubtitlePosition: (position: number) =>
getSfRef()?.setSubtitlePosition?.(position), playerRef.current?.setSubtitlePosition?.(position),
setSubtitleMarginY: (margin: number) => setSubtitleMarginY: (margin: number) =>
getSfRef()?.setSubtitleMarginY?.(margin), playerRef.current?.setSubtitleMarginY?.(margin),
setSubtitleFontSize: (size: number) => setSubtitleFontSize: (size: number) =>
getSfRef()?.setSubtitleFontSize?.(size), playerRef.current?.setSubtitleFontSize?.(size),
// PiP (both players) // PiP
startPictureInPicture: () => playerRef.current?.startPictureInPicture?.(), startPictureInPicture: () => playerRef.current?.startPictureInPicture?.(),
// iOS only (SfPlayer) stopPictureInPicture: () => playerRef.current?.stopPictureInPicture?.(),
stopPictureInPicture: () => getSfRef()?.stopPictureInPicture?.(),
}; };
}; };

View File

@@ -8,7 +8,7 @@
* ============================================================================ * ============================================================================
* *
* - Jellyfin is source of truth for subtitle list (embedded + external) * - Jellyfin is source of truth for subtitle list (embedded + external)
* - KSPlayer only knows about: * - MPV only knows about:
* - Embedded subs it finds in the video stream * - Embedded subs it finds in the video stream
* - External subs we explicitly add via addSubtitleFile() * - External subs we explicitly add via addSubtitleFile()
* - UI shows Jellyfin's complete list * - UI shows Jellyfin's complete list
@@ -24,8 +24,8 @@
* - Value of -1 means disabled/none * - Value of -1 means disabled/none
* *
* 2. MPV INDEX (track.mpvIndex) * 2. MPV INDEX (track.mpvIndex)
* - KSPlayer's internal track ID * - MPV's internal track ID
* - KSPlayer orders tracks as: [all embedded, then all external] * - MPV orders tracks as: [all embedded, then all external]
* - IDs: 1..embeddedCount for embedded, embeddedCount+1.. for external * - IDs: 1..embeddedCount for embedded, embeddedCount+1.. for external
* - Value of -1 means track needs replacePlayer() (e.g., burned-in sub) * - Value of -1 means track needs replacePlayer() (e.g., burned-in sub)
* *
@@ -34,15 +34,15 @@
* ============================================================================ * ============================================================================
* *
* Embedded (DeliveryMethod.Embed): * Embedded (DeliveryMethod.Embed):
* - Already in KSPlayer's track list * - Already in MPV's track list
* - Select via setSubtitleTrack(mpvId) * - Select via setSubtitleTrack(mpvId)
* *
* External (DeliveryMethod.External): * External (DeliveryMethod.External):
* - Loaded into KSPlayer's srtControl on video start * - Loaded into MPV on video start
* - Select via setSubtitleTrack(embeddedCount + externalPosition + 1) * - Select via setSubtitleTrack(embeddedCount + externalPosition + 1)
* *
* Image-based during transcoding: * Image-based during transcoding:
* - Burned into video by Jellyfin, not in KSPlayer * - Burned into video by Jellyfin, not in MPV
* - Requires replacePlayer() to change * - Requires replacePlayer() to change
*/ */
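To make the index scheme described in the comment above concrete, here is a small worked sketch (hypothetical counts, not taken from the commit) of how Jellyfin's subtitle list maps onto MPV track IDs:

// Example: a source with 2 embedded text subs and 2 external subs.
// MPV orders tracks as [all embedded, then all external]; "Disable" is -1 on both sides.
const embeddedCount = 2;

// MPV ID for the p-th (0-based) embedded sub that is actually in the player:
const mpvIdForEmbedded = (p: number) => p + 1;                 // 1, 2
// MPV ID for the p-th (0-based) external sub added via addSubtitleFile():
const mpvIdForExternal = (p: number) => embeddedCount + p + 1; // 3, 4

mpvIdForEmbedded(0); // 1
mpvIdForExternal(1); // 4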
@@ -57,7 +57,7 @@ import {
useMemo, useMemo,
useState, useState,
} from "react"; } from "react";
import type { SfAudioTrack, TrackInfo } from "@/modules"; import type { MpvAudioTrack } from "@/modules";
import { isImageBasedSubtitle } from "@/utils/jellyfin/subtitleUtils"; import { isImageBasedSubtitle } from "@/utils/jellyfin/subtitleUtils";
import type { Track } from "../types"; import type { Track } from "../types";
import { usePlayerContext, usePlayerControls } from "./PlayerContext"; import { usePlayerContext, usePlayerControls } from "./PlayerContext";
@@ -75,7 +75,7 @@ export const VideoProvider: React.FC<{ children: ReactNode }> = ({
const [subtitleTracks, setSubtitleTracks] = useState<Track[] | null>(null); const [subtitleTracks, setSubtitleTracks] = useState<Track[] | null>(null);
const [audioTracks, setAudioTracks] = useState<Track[] | null>(null); const [audioTracks, setAudioTracks] = useState<Track[] | null>(null);
const { tracksReady, mediaSource, useVlcPlayer, offline, downloadedItem } = const { tracksReady, mediaSource, offline, downloadedItem } =
usePlayerContext(); usePlayerContext();
const playerControls = usePlayerControls(); const playerControls = usePlayerControls();
@@ -149,7 +149,7 @@ export const VideoProvider: React.FC<{ children: ReactNode }> = ({
{ {
name: downloadedTrack.DisplayTitle || "Audio", name: downloadedTrack.DisplayTitle || "Audio",
index: downloadedTrack.Index ?? 0, index: downloadedTrack.Index ?? 0,
mpvIndex: useVlcPlayer ? 0 : 1, // Only track in file mpvIndex: 1, // Only track in file (MPV uses 1-based indexing)
setTrack: () => { setTrack: () => {
// Track is already selected (only one available) // Track is already selected (only one available)
router.setParams({ audioIndex: String(downloadedTrack.Index) }); router.setParams({ audioIndex: String(downloadedTrack.Index) });
@@ -212,99 +212,12 @@ export const VideoProvider: React.FC<{ children: ReactNode }> = ({
return; return;
} }
// For VLC player, use simpler track handling with server indices // MPV track handling
if (useVlcPlayer) {
// Get VLC track info (VLC returns TrackInfo[] with 'index' property)
const vlcSubtitleData = (await playerControls
.getSubtitleTracks()
.catch(() => null)) as TrackInfo[] | null;
const vlcAudioData = (await playerControls
.getAudioTracks()
.catch(() => null)) as TrackInfo[] | null;
// VLC reverses HLS subtitles during transcoding
let vlcSubs: TrackInfo[] = vlcSubtitleData ? [...vlcSubtitleData] : [];
if (isTranscoding && vlcSubs.length > 1) {
vlcSubs = [vlcSubs[0], ...vlcSubs.slice(1).reverse()];
}
// Build subtitle tracks for VLC
const subs: Track[] = [];
let vlcSubIndex = 1; // VLC track indices start at 1 (0 is usually "Disable")
for (const sub of allSubs) {
const isTextBased =
sub.DeliveryMethod === SubtitleDeliveryMethod.Embed ||
sub.DeliveryMethod === SubtitleDeliveryMethod.Hls ||
sub.DeliveryMethod === SubtitleDeliveryMethod.External;
// Get VLC's internal index for this track
const vlcTrackIndex = vlcSubs[vlcSubIndex]?.index ?? -1;
if (isTextBased) vlcSubIndex++;
// For image-based subs during transcoding, or non-text subs, use replacePlayer
const needsPlayerRefresh =
(isTranscoding && isImageBasedSubtitle(sub)) || !isTextBased;
subs.push({
name: sub.DisplayTitle || "Unknown",
index: sub.Index ?? -1,
mpvIndex: vlcTrackIndex,
setTrack: () => {
if (needsPlayerRefresh) {
replacePlayer({ subtitleIndex: String(sub.Index) });
} else if (vlcTrackIndex !== -1) {
playerControls.setSubtitleTrack(vlcTrackIndex);
router.setParams({ subtitleIndex: String(sub.Index) });
} else {
replacePlayer({ subtitleIndex: String(sub.Index) });
}
},
});
}
// Add "Disable" option
subs.unshift({
name: "Disable",
index: -1,
mpvIndex: -1,
setTrack: () => {
playerControls.setSubtitleTrack(-1);
router.setParams({ subtitleIndex: "-1" });
},
});
// Build audio tracks for VLC
const vlcAudio: TrackInfo[] = vlcAudioData ? [...vlcAudioData] : [];
const audio: Track[] = allAudio.map((a, idx) => {
const vlcTrackIndex = vlcAudio[idx + 1]?.index ?? idx;
return {
name: a.DisplayTitle || "Unknown",
index: a.Index ?? -1,
mpvIndex: vlcTrackIndex,
setTrack: () => {
if (isTranscoding) {
replacePlayer({ audioIndex: String(a.Index) });
} else {
playerControls.setAudioTrack(vlcTrackIndex);
router.setParams({ audioIndex: String(a.Index) });
}
},
};
});
setSubtitleTracks(subs.sort((a, b) => a.index - b.index));
setAudioTracks(audio);
return;
}
// KSPlayer track handling (original logic)
const audioData = await playerControls.getAudioTracks().catch(() => null); const audioData = await playerControls.getAudioTracks().catch(() => null);
const playerAudio = (audioData as SfAudioTrack[]) ?? []; const playerAudio = (audioData as MpvAudioTrack[]) ?? [];
// Separate embedded vs external subtitles from Jellyfin's list // Separate embedded vs external subtitles from Jellyfin's list
// KSPlayer orders tracks as: [all embedded, then all external] // MPV orders tracks as: [all embedded, then all external]
const embeddedSubs = allSubs.filter( const embeddedSubs = allSubs.filter(
(s) => s.DeliveryMethod === SubtitleDeliveryMethod.Embed, (s) => s.DeliveryMethod === SubtitleDeliveryMethod.Embed,
); );
@@ -312,7 +225,7 @@ export const VideoProvider: React.FC<{ children: ReactNode }> = ({
(s) => s.DeliveryMethod === SubtitleDeliveryMethod.External, (s) => s.DeliveryMethod === SubtitleDeliveryMethod.External,
); );
// Count embedded subs that will be in KSPlayer // Count embedded subs that will be in MPV
// (excludes image-based subs during transcoding as they're burned in) // (excludes image-based subs during transcoding as they're burned in)
const embeddedInPlayer = embeddedSubs.filter( const embeddedInPlayer = embeddedSubs.filter(
(s) => !isTranscoding || !isImageBasedSubtitle(s), (s) => !isTranscoding || !isImageBasedSubtitle(s),
@@ -339,8 +252,8 @@ export const VideoProvider: React.FC<{ children: ReactNode }> = ({
continue; continue;
} }
// Calculate KSPlayer track ID based on type // Calculate MPV track ID based on type
// KSPlayer IDs: [1..embeddedCount] for embedded, [embeddedCount+1..] for external // MPV IDs: [1..embeddedCount] for embedded, [embeddedCount+1..] for external
let mpvId = -1; let mpvId = -1;
if (isEmbedded) { if (isEmbedded) {
@@ -428,7 +341,7 @@ export const VideoProvider: React.FC<{ children: ReactNode }> = ({
}; };
fetchTracks(); fetchTracks();
}, [tracksReady, mediaSource, useVlcPlayer, offline, downloadedItem]); }, [tracksReady, mediaSource, offline, downloadedItem]);
return ( return (
<VideoContext.Provider value={{ subtitleTracks, audioTracks }}> <VideoContext.Provider value={{ subtitleTracks, audioTracks }}>

View File

@@ -7,9 +7,11 @@ import {
type OptionGroup, type OptionGroup,
PlatformDropdown, PlatformDropdown,
} from "@/components/PlatformDropdown"; } from "@/components/PlatformDropdown";
import { PLAYBACK_SPEEDS } from "@/components/PlaybackSpeedSelector";
import { useSettings } from "@/utils/atoms/settings"; import { useSettings } from "@/utils/atoms/settings";
import { usePlayerContext } from "../contexts/PlayerContext"; import { usePlayerContext } from "../contexts/PlayerContext";
import { useVideoContext } from "../contexts/VideoContext"; import { useVideoContext } from "../contexts/VideoContext";
import { PlaybackSpeedScope } from "../utils/playback-speed-settings";
// Subtitle size presets (stored as scale * 100, so 1.0 = 100) // Subtitle size presets (stored as scale * 100, so 1.0 = 100)
const SUBTITLE_SIZE_PRESETS = [ const SUBTITLE_SIZE_PRESETS = [
@@ -23,9 +25,17 @@ const SUBTITLE_SIZE_PRESETS = [
{ label: "1.2", value: 120 }, { label: "1.2", value: 120 },
] as const; ] as const;
const DropdownView = () => { interface DropdownViewProps {
playbackSpeed?: number;
setPlaybackSpeed?: (speed: number, scope: PlaybackSpeedScope) => void;
}
const DropdownView = ({
playbackSpeed = 1.0,
setPlaybackSpeed,
}: DropdownViewProps) => {
const { subtitleTracks, audioTracks } = useVideoContext(); const { subtitleTracks, audioTracks } = useVideoContext();
const { item, mediaSource, useVlcPlayer } = usePlayerContext(); const { item, mediaSource } = usePlayerContext();
const { settings, updateSettings } = useSettings(); const { settings, updateSettings } = useSettings();
const router = useRouter(); const router = useRouter();
@@ -110,8 +120,7 @@ const DropdownView = () => {
})), })),
}); });
// Subtitle Size Section (KSPlayer only - VLC uses settings) // Subtitle Size Section
if (!useVlcPlayer) {
groups.push({ groups.push({
title: "Subtitle Size", title: "Subtitle Size",
options: SUBTITLE_SIZE_PRESETS.map((preset) => ({ options: SUBTITLE_SIZE_PRESETS.map((preset) => ({
@@ -123,7 +132,6 @@ const DropdownView = () => {
})), })),
}); });
} }
}
// Audio Section // Audio Section
if (audioTracks && audioTracks.length > 0) { if (audioTracks && audioTracks.length > 0) {
@@ -139,6 +147,20 @@ const DropdownView = () => {
}); });
} }
// Speed Section
if (setPlaybackSpeed) {
groups.push({
title: "Speed",
options: PLAYBACK_SPEEDS.map((speed) => ({
type: "radio" as const,
label: speed.label,
value: speed.value.toString(),
selected: playbackSpeed === speed.value,
onPress: () => setPlaybackSpeed(speed.value, PlaybackSpeedScope.All),
})),
});
}
return groups; return groups;
// eslint-disable-next-line react-hooks/exhaustive-deps // eslint-disable-next-line react-hooks/exhaustive-deps
}, [ }, [
@@ -151,7 +173,8 @@ const DropdownView = () => {
audioIndex, audioIndex,
settings.subtitleSize, settings.subtitleSize,
updateSettings, updateSettings,
useVlcPlayer, playbackSpeed,
setPlaybackSpeed,
// Note: subtitleTracks and audioTracks are intentionally excluded // Note: subtitleTracks and audioTracks are intentionally excluded
// because we use subtitleTracksKey and audioTracksKey for stability // because we use subtitleTracksKey and audioTracksKey for stability
]); ]);

View File

@@ -34,6 +34,7 @@ export const useVolumeAndBrightness = ({
const initialVolume = useRef<number | null>(null); const initialVolume = useRef<number | null>(null);
const initialBrightness = useRef<number | null>(null); const initialBrightness = useRef<number | null>(null);
const dragStartY = useRef<number | null>(null); const dragStartY = useRef<number | null>(null);
const brightnessSupported = useRef(true);
const startVolumeDrag = useCallback(async (startY: number) => { const startVolumeDrag = useCallback(async (startY: number) => {
if (Platform.isTV || !VolumeManager) return; if (Platform.isTV || !VolumeManager) return;
@@ -88,20 +89,26 @@ export const useVolumeAndBrightness = ({
}, []); }, []);
const startBrightnessDrag = useCallback(async (startY: number) => { const startBrightnessDrag = useCallback(async (startY: number) => {
if (Platform.isTV || !Brightness) return; if (Platform.isTV || !Brightness || !brightnessSupported.current) return;
try { try {
const brightness = await Brightness.getBrightnessAsync(); const brightness = await Brightness.getBrightnessAsync();
initialBrightness.current = brightness; initialBrightness.current = brightness;
dragStartY.current = startY; dragStartY.current = startY;
} catch (error) { } catch (error) {
console.error("Error starting brightness drag:", error); console.warn("Brightness not supported on this device:", error);
brightnessSupported.current = false;
} }
}, []); }, []);
const updateBrightnessDrag = useCallback( const updateBrightnessDrag = useCallback(
async (deltaY: number) => { async (deltaY: number) => {
if (Platform.isTV || !Brightness || initialBrightness.current === null) if (
Platform.isTV ||
!Brightness ||
initialBrightness.current === null ||
!brightnessSupported.current
)
return; return;
try { try {
@@ -118,7 +125,8 @@ export const useVolumeAndBrightness = ({
const brightnessPercent = Math.round(newBrightness * 100); const brightnessPercent = Math.round(newBrightness * 100);
onBrightnessChange?.(brightnessPercent); onBrightnessChange?.(brightnessPercent);
} catch (error) { } catch (error) {
console.error("Error updating brightness:", error); console.warn("Brightness not supported on this device:", error);
brightnessSupported.current = false;
} }
}, },
[onBrightnessChange], [onBrightnessChange],

View File

@@ -1,40 +0,0 @@
/**
* VLC subtitle styling constants
* These values are used with VLC's FreeType subtitle rendering engine
*/
// VLC color values (decimal representation of hex colors)
export const VLC_COLORS: Record<string, number> = {
Black: 0,
Gray: 8421504,
Silver: 12632256,
White: 16777215,
Maroon: 8388608,
Red: 16711680,
Fuchsia: 16711935,
Yellow: 16776960,
Olive: 8421376,
Green: 32768,
Teal: 32896,
Lime: 65280,
Purple: 8388736,
Navy: 128,
Blue: 255,
Aqua: 65535,
};
// VLC color names for UI display
export const VLC_COLOR_OPTIONS = Object.keys(VLC_COLORS);
// VLC outline thickness values in pixels
export const OUTLINE_THICKNESS: Record<string, number> = {
None: 0,
Thin: 2,
Normal: 4,
Thick: 6,
};
// Outline thickness options for UI
export const OUTLINE_THICKNESS_OPTIONS = Object.keys(
OUTLINE_THICKNESS,
) as Array<"None" | "Thin" | "Normal" | "Thick">;

View File

@@ -1,12 +1,12 @@
import type { BaseItemDto } from "@jellyfin/sdk/lib/generated-client/models"; import type { BaseItemDto } from "@jellyfin/sdk/lib/generated-client/models";
import { useQueryClient } from "@tanstack/react-query";
import { useCallback } from "react"; import { useCallback } from "react";
import { useNetworkAwareQueryClient } from "@/hooks/useNetworkAwareQueryClient";
import { useHaptic } from "./useHaptic"; import { useHaptic } from "./useHaptic";
import { usePlaybackManager } from "./usePlaybackManager"; import { usePlaybackManager } from "./usePlaybackManager";
import { useInvalidatePlaybackProgressCache } from "./useRevalidatePlaybackProgressCache"; import { useInvalidatePlaybackProgressCache } from "./useRevalidatePlaybackProgressCache";
export const useMarkAsPlayed = (items: BaseItemDto[]) => { export const useMarkAsPlayed = (items: BaseItemDto[]) => {
const queryClient = useNetworkAwareQueryClient(); const queryClient = useQueryClient();
const lightHapticFeedback = useHaptic("light"); const lightHapticFeedback = useHaptic("light");
const { markItemPlayed, markItemUnplayed } = usePlaybackManager(); const { markItemPlayed, markItemUnplayed } = usePlaybackManager();
const invalidatePlaybackProgressCache = useInvalidatePlaybackProgressCache(); const invalidatePlaybackProgressCache = useInvalidatePlaybackProgressCache();

View File

@@ -1,108 +0,0 @@
import { ViewStyle } from "react-native";
export type PlaybackStatePayload = {
nativeEvent: {
target: number;
state: "Opening" | "Buffering" | "Playing" | "Paused" | "Error";
currentTime: number;
duration: number;
isBuffering: boolean;
isPlaying: boolean;
};
};
export type ProgressUpdatePayload = {
nativeEvent: {
currentTime: number;
duration: number;
isPlaying: boolean;
isBuffering: boolean;
};
};
export type VideoLoadStartPayload = {
nativeEvent: {
target: number;
};
};
export type PipStartedPayload = {
nativeEvent: {
pipStarted: boolean;
};
};
export type VideoStateChangePayload = PlaybackStatePayload;
export type VideoProgressPayload = ProgressUpdatePayload;
export type VlcPlayerSource = {
uri: string;
type?: string;
isNetwork?: boolean;
autoplay?: boolean;
startPosition?: number;
externalSubtitles?: { name: string; DeliveryUrl: string }[];
initOptions?: any[];
mediaOptions?: { [key: string]: any };
};
export type TrackInfo = {
name: string;
index: number;
language?: string;
};
export type ChapterInfo = {
name: string;
timeOffset: number;
duration: number;
};
export type NowPlayingMetadata = {
title?: string;
artist?: string;
albumTitle?: string;
artworkUri?: string;
};
export type VlcPlayerViewProps = {
source: VlcPlayerSource;
style?: ViewStyle | ViewStyle[];
progressUpdateInterval?: number;
paused?: boolean;
muted?: boolean;
volume?: number;
videoAspectRatio?: string;
nowPlayingMetadata?: NowPlayingMetadata;
onVideoProgress?: (event: ProgressUpdatePayload) => void;
onVideoStateChange?: (event: PlaybackStatePayload) => void;
onVideoLoadStart?: (event: VideoLoadStartPayload) => void;
onVideoLoadEnd?: (event: VideoLoadStartPayload) => void;
onVideoError?: (event: PlaybackStatePayload) => void;
onPipStarted?: (event: PipStartedPayload) => void;
};
export interface VlcPlayerViewRef {
startPictureInPicture: () => Promise<void>;
play: () => Promise<void>;
pause: () => Promise<void>;
stop: () => Promise<void>;
seekTo: (time: number) => Promise<void>;
setAudioTrack: (trackIndex: number) => Promise<void>;
getAudioTracks: () => Promise<TrackInfo[] | null>;
setSubtitleTrack: (trackIndex: number) => Promise<void>;
getSubtitleTracks: () => Promise<TrackInfo[] | null>;
setSubtitleDelay: (delay: number) => Promise<void>;
setAudioDelay: (delay: number) => Promise<void>;
takeSnapshot: (path: string, width: number, height: number) => Promise<void>;
setRate: (rate: number) => Promise<void>;
nextChapter: () => Promise<void>;
previousChapter: () => Promise<void>;
getChapters: () => Promise<ChapterInfo[] | null>;
setVideoCropGeometry: (cropGeometry: string | null) => Promise<void>;
getVideoCropGeometry: () => Promise<string | null>;
setSubtitleURL: (url: string) => Promise<void>;
setVideoAspectRatio: (aspectRatio: string | null) => Promise<void>;
setVideoScaleFactor: (scaleFactor: number) => Promise<void>;
}

View File

@@ -1,152 +0,0 @@
import { requireNativeViewManager } from "expo-modules-core";
import * as React from "react";
import { ViewStyle } from "react-native";
import type {
VlcPlayerSource,
VlcPlayerViewProps,
VlcPlayerViewRef,
} from "./VlcPlayer.types";
interface NativeViewRef extends VlcPlayerViewRef {
setNativeProps?: (props: Partial<VlcPlayerViewProps>) => void;
}
const VLCViewManager = requireNativeViewManager("VlcPlayer");
// Create a forwarded ref version of the native view
const NativeView = React.forwardRef<NativeViewRef, VlcPlayerViewProps>(
(props, ref) => {
return <VLCViewManager {...props} ref={ref} />;
},
);
const VlcPlayerView = React.forwardRef<VlcPlayerViewRef, VlcPlayerViewProps>(
(props, ref) => {
const nativeRef = React.useRef<NativeViewRef>(null);
React.useImperativeHandle(ref, () => ({
startPictureInPicture: async () => {
await nativeRef.current?.startPictureInPicture();
},
play: async () => {
await nativeRef.current?.play();
},
pause: async () => {
await nativeRef.current?.pause();
},
stop: async () => {
await nativeRef.current?.stop();
},
seekTo: async (time: number) => {
await nativeRef.current?.seekTo(time);
},
setAudioTrack: async (trackIndex: number) => {
await nativeRef.current?.setAudioTrack(trackIndex);
},
getAudioTracks: async () => {
const tracks = await nativeRef.current?.getAudioTracks();
return tracks ?? null;
},
setSubtitleTrack: async (trackIndex: number) => {
await nativeRef.current?.setSubtitleTrack(trackIndex);
},
getSubtitleTracks: async () => {
const tracks = await nativeRef.current?.getSubtitleTracks();
return tracks ?? null;
},
setSubtitleDelay: async (delay: number) => {
await nativeRef.current?.setSubtitleDelay(delay);
},
setAudioDelay: async (delay: number) => {
await nativeRef.current?.setAudioDelay(delay);
},
takeSnapshot: async (path: string, width: number, height: number) => {
await nativeRef.current?.takeSnapshot(path, width, height);
},
setRate: async (rate: number) => {
await nativeRef.current?.setRate(rate);
},
nextChapter: async () => {
await nativeRef.current?.nextChapter();
},
previousChapter: async () => {
await nativeRef.current?.previousChapter();
},
getChapters: async () => {
const chapters = await nativeRef.current?.getChapters();
return chapters ?? null;
},
setVideoCropGeometry: async (geometry: string | null) => {
await nativeRef.current?.setVideoCropGeometry(geometry);
},
getVideoCropGeometry: async () => {
const geometry = await nativeRef.current?.getVideoCropGeometry();
return geometry ?? null;
},
setSubtitleURL: async (url: string) => {
await nativeRef.current?.setSubtitleURL(url);
},
setVideoAspectRatio: async (aspectRatio: string | null) => {
await nativeRef.current?.setVideoAspectRatio(aspectRatio);
},
setVideoScaleFactor: async (scaleFactor: number) => {
await nativeRef.current?.setVideoScaleFactor(scaleFactor);
},
}));
const {
source,
style,
progressUpdateInterval = 500,
paused,
muted,
volume,
videoAspectRatio,
nowPlayingMetadata,
onVideoLoadStart,
onVideoStateChange,
onVideoProgress,
onVideoLoadEnd,
onVideoError,
onPipStarted,
...otherProps
} = props;
const baseSource: VlcPlayerSource =
typeof source === "string"
? ({ uri: source } as unknown as VlcPlayerSource)
: source;
// Create a new object to avoid mutating frozen source
const processedSource: VlcPlayerSource = {
...baseSource,
startPosition:
baseSource.startPosition !== undefined
? Math.floor(baseSource.startPosition)
: undefined,
};
return (
<NativeView
{...otherProps}
ref={nativeRef}
source={processedSource}
style={[{ width: "100%", height: "100%" }, style as ViewStyle]}
progressUpdateInterval={progressUpdateInterval}
paused={paused}
muted={muted}
volume={volume}
videoAspectRatio={videoAspectRatio}
nowPlayingMetadata={nowPlayingMetadata}
onVideoLoadStart={onVideoLoadStart}
onVideoLoadEnd={onVideoLoadEnd}
onVideoStateChange={onVideoStateChange}
onVideoProgress={onVideoProgress}
onVideoError={onVideoError}
onPipStarted={onPipStarted}
/>
);
},
);
export default VlcPlayerView;

View File

@@ -8,39 +8,17 @@ export type {
} from "./background-downloader"; } from "./background-downloader";
export { default as BackgroundDownloader } from "./background-downloader"; export { default as BackgroundDownloader } from "./background-downloader";
// Streamyfin Player (KSPlayer-based) - GPU acceleration + native PiP (iOS) // MPV Player (iOS + Android)
export type { export type {
AudioTrack as SfAudioTrack, AudioTrack as MpvAudioTrack,
OnErrorEventPayload as SfOnErrorEventPayload, MpvPlayerViewProps,
OnLoadEventPayload as SfOnLoadEventPayload, MpvPlayerViewRef,
OnPictureInPictureChangePayload as SfOnPictureInPictureChangePayload, OnErrorEventPayload as MpvOnErrorEventPayload,
OnPlaybackStateChangePayload as SfOnPlaybackStateChangePayload, OnLoadEventPayload as MpvOnLoadEventPayload,
OnProgressEventPayload as SfOnProgressEventPayload, OnPlaybackStateChangePayload as MpvOnPlaybackStateChangePayload,
OnTracksReadyEventPayload as SfOnTracksReadyEventPayload, OnProgressEventPayload as MpvOnProgressEventPayload,
SfPlayerViewProps, OnTracksReadyEventPayload as MpvOnTracksReadyEventPayload,
SfPlayerViewRef, SubtitleTrack as MpvSubtitleTrack,
SubtitleTrack as SfSubtitleTrack, VideoSource as MpvVideoSource,
VideoSource as SfVideoSource, } from "./mpv-player";
} from "./sf-player"; export { MpvPlayerView } from "./mpv-player";
export {
getHardwareDecode,
SfPlayerView,
setHardwareDecode,
} from "./sf-player";
// VLC Player (Android)
export type {
ChapterInfo,
NowPlayingMetadata,
PipStartedPayload,
PlaybackStatePayload,
ProgressUpdatePayload,
TrackInfo,
VideoLoadStartPayload,
VideoProgressPayload,
VideoStateChangePayload,
VlcPlayerSource,
VlcPlayerViewProps,
VlcPlayerViewRef,
} from "./VlcPlayer.types";
export { default as VlcPlayerView } from "./VlcPlayerView";

View File

@@ -25,7 +25,7 @@ if (useManagedAndroidSdkVersions) {
project.android { project.android {
compileSdkVersion safeExtGet("compileSdkVersion", 36) compileSdkVersion safeExtGet("compileSdkVersion", 36)
defaultConfig { defaultConfig {
minSdkVersion safeExtGet("minSdkVersion", 24) minSdkVersion safeExtGet("minSdkVersion", 26)
targetSdkVersion safeExtGet("targetSdkVersion", 36) targetSdkVersion safeExtGet("targetSdkVersion", 36)
} }
} }
@@ -36,8 +36,22 @@ android {
defaultConfig { defaultConfig {
versionCode 1 versionCode 1
versionName "0.7.6" versionName "0.7.6"
ndk {
// Architectures supported by mpv-android
abiFilters 'arm64-v8a', 'armeabi-v7a', 'x86', 'x86_64'
}
} }
lintOptions { lintOptions {
abortOnError false abortOnError false
} }
sourceSets {
main {
jniLibs.srcDirs = ['libs']
}
}
}
dependencies {
// libmpv from Maven Central
implementation 'dev.jdtech.mpv:libmpv:0.5.1'
} }

View File

@@ -1,2 +1,9 @@
<manifest> <manifest xmlns:android="http://schemas.android.com/apk/res/android">
<!-- Required for network streaming -->
<uses-permission android:name="android.permission.INTERNET" />
<!-- Picture-in-Picture feature -->
<uses-feature
android:name="android.software.picture_in_picture"
android:required="false" />
</manifest> </manifest>

View File

@@ -0,0 +1,552 @@
package expo.modules.mpvplayer
import android.content.Context
import android.os.Handler
import android.os.Looper
import android.util.Log
import android.view.Surface
/**
* MPV renderer that wraps libmpv for video playback.
* This mirrors the iOS MPVLayerRenderer implementation.
*/
class MPVLayerRenderer(private val context: Context) : MPVLib.EventObserver {
companion object {
private const val TAG = "MPVLayerRenderer"
// Property observation format types
const val MPV_FORMAT_NONE = 0
const val MPV_FORMAT_STRING = 1
const val MPV_FORMAT_OSD_STRING = 2
const val MPV_FORMAT_FLAG = 3
const val MPV_FORMAT_INT64 = 4
const val MPV_FORMAT_DOUBLE = 5
const val MPV_FORMAT_NODE = 6
}
interface Delegate {
fun onPositionChanged(position: Double, duration: Double)
fun onPauseChanged(isPaused: Boolean)
fun onLoadingChanged(isLoading: Boolean)
fun onReadyToSeek()
fun onTracksReady()
fun onError(message: String)
fun onVideoDimensionsChanged(width: Int, height: Int)
}
var delegate: Delegate? = null
private val mainHandler = Handler(Looper.getMainLooper())
private var surface: Surface? = null
private var isRunning = false
private var isStopping = false
// Cached state
private var cachedPosition: Double = 0.0
private var cachedDuration: Double = 0.0
private var _isPaused: Boolean = true
private var _isLoading: Boolean = false
private var _playbackSpeed: Double = 1.0
private var isReadyToSeek: Boolean = false
// Video dimensions
private var _videoWidth: Int = 0
private var _videoHeight: Int = 0
val videoWidth: Int
get() = _videoWidth
val videoHeight: Int
get() = _videoHeight
// Current video config
private var currentUrl: String? = null
private var currentHeaders: Map<String, String>? = null
private var pendingExternalSubtitles: List<String> = emptyList()
private var initialSubtitleId: Int? = null
private var initialAudioId: Int? = null
val isPausedState: Boolean
get() = _isPaused
val currentPosition: Double
get() = cachedPosition
val duration: Double
get() = cachedDuration
fun start() {
if (isRunning) return
try {
MPVLib.create(context)
MPVLib.addObserver(this)
// Configure mpv options before initialization (based on Findroid)
MPVLib.setOptionString("vo", "gpu")
MPVLib.setOptionString("gpu-context", "android")
MPVLib.setOptionString("opengl-es", "yes")
// Hardware video decoding
MPVLib.setOptionString("hwdec", "mediacodec-copy")
MPVLib.setOptionString("hwdec-codecs", "h264,hevc,mpeg4,mpeg2video,vp8,vp9,av1")
// Cache settings for better network streaming
MPVLib.setOptionString("cache", "yes")
MPVLib.setOptionString("cache-pause-initial", "yes")
MPVLib.setOptionString("demuxer-max-bytes", "150MiB")
MPVLib.setOptionString("demuxer-max-back-bytes", "75MiB")
MPVLib.setOptionString("demuxer-readahead-secs", "20")
// Seeking optimization - faster seeking at the cost of less precision
// Use keyframe seeking by default (much faster for network streams)
MPVLib.setOptionString("hr-seek", "no")
// Drop frames during seeking for faster response
MPVLib.setOptionString("hr-seek-framedrop", "yes")
// Subtitle settings
MPVLib.setOptionString("sub-scale-with-window", "yes")
MPVLib.setOptionString("sub-use-margins", "no")
MPVLib.setOptionString("subs-match-os-language", "yes")
MPVLib.setOptionString("subs-fallback", "yes")
// Important: Start with force-window=no, will be set to yes when surface is attached
MPVLib.setOptionString("force-window", "no")
MPVLib.setOptionString("keep-open", "always")
MPVLib.initialize()
// Observe properties
observeProperties()
isRunning = true
Log.i(TAG, "MPV renderer started")
} catch (e: Exception) {
Log.e(TAG, "Failed to start MPV renderer: ${e.message}")
delegate?.onError("Failed to start renderer: ${e.message}")
}
}
fun stop() {
if (isStopping) return
if (!isRunning) return
isStopping = true
isRunning = false
try {
MPVLib.removeObserver(this)
MPVLib.detachSurface()
MPVLib.destroy()
} catch (e: Exception) {
Log.e(TAG, "Error stopping MPV: ${e.message}")
}
isStopping = false
}
/**
* Attach surface and re-enable video output.
* Based on Findroid's implementation.
*/
fun attachSurface(surface: Surface) {
this.surface = surface
if (isRunning) {
MPVLib.attachSurface(surface)
// Re-enable video output after attaching surface (Findroid approach)
MPVLib.setOptionString("force-window", "yes")
MPVLib.setOptionString("vo", "gpu")
Log.i(TAG, "Surface attached, video output re-enabled")
}
}
/**
* Detach surface and disable video output.
* Based on Findroid's implementation.
*/
fun detachSurface() {
this.surface = null
if (isRunning) {
try {
// Disable video output before detaching surface (Findroid approach)
MPVLib.setOptionString("vo", "null")
MPVLib.setOptionString("force-window", "no")
Log.i(TAG, "Video output disabled before surface detach")
} catch (e: Exception) {
Log.e(TAG, "Failed to disable video output: ${e.message}")
}
MPVLib.detachSurface()
}
}
/**
* Updates the surface size. Called from surfaceChanged.
* Based on Findroid's implementation.
*/
fun updateSurfaceSize(width: Int, height: Int) {
if (isRunning) {
MPVLib.setPropertyString("android-surface-size", "${width}x$height")
Log.i(TAG, "Surface size updated: ${width}x$height")
}
}
fun load(
url: String,
headers: Map<String, String>? = null,
startPosition: Double? = null,
externalSubtitles: List<String>? = null,
initialSubtitleId: Int? = null,
initialAudioId: Int? = null
) {
currentUrl = url
currentHeaders = headers
pendingExternalSubtitles = externalSubtitles ?: emptyList()
this.initialSubtitleId = initialSubtitleId
this.initialAudioId = initialAudioId
_isLoading = true
isReadyToSeek = false
mainHandler.post { delegate?.onLoadingChanged(true) }
// Stop previous playback
MPVLib.command(arrayOf("stop"))
// Set HTTP headers if provided
updateHttpHeaders(headers)
// Set start position
if (startPosition != null && startPosition > 0) {
MPVLib.setPropertyString("start", String.format("%.2f", startPosition))
} else {
MPVLib.setPropertyString("start", "0")
}
// Set initial audio track if specified
if (initialAudioId != null && initialAudioId > 0) {
setAudioTrack(initialAudioId)
}
// Set initial subtitle track if no external subs
if (pendingExternalSubtitles.isEmpty()) {
if (initialSubtitleId != null) {
setSubtitleTrack(initialSubtitleId)
} else {
disableSubtitles()
}
} else {
disableSubtitles()
}
// Load the file
MPVLib.command(arrayOf("loadfile", url, "replace"))
}
fun reloadCurrentItem() {
currentUrl?.let { url ->
load(url, currentHeaders)
}
}
private fun updateHttpHeaders(headers: Map<String, String>?) {
if (headers.isNullOrEmpty()) {
// Nothing to set; any previously applied header fields are left unchanged
return
}
val headerString = headers.entries.joinToString("\r\n") { "${it.key}: ${it.value}" }
MPVLib.setPropertyString("http-header-fields", headerString)
}
private fun observeProperties() {
MPVLib.observeProperty("duration", MPV_FORMAT_DOUBLE)
MPVLib.observeProperty("time-pos", MPV_FORMAT_DOUBLE)
MPVLib.observeProperty("pause", MPV_FORMAT_FLAG)
MPVLib.observeProperty("track-list/count", MPV_FORMAT_INT64)
MPVLib.observeProperty("paused-for-cache", MPV_FORMAT_FLAG)
// Video dimensions for PiP aspect ratio
MPVLib.observeProperty("video-params/w", MPV_FORMAT_INT64)
MPVLib.observeProperty("video-params/h", MPV_FORMAT_INT64)
}
// MARK: - Playback Controls
fun play() {
MPVLib.setPropertyBoolean("pause", false)
}
fun pause() {
MPVLib.setPropertyBoolean("pause", true)
}
fun togglePause() {
if (_isPaused) play() else pause()
}
fun seekTo(seconds: Double) {
val clamped = maxOf(0.0, seconds)
cachedPosition = clamped
MPVLib.command(arrayOf("seek", clamped.toString(), "absolute"))
}
fun seekBy(seconds: Double) {
val newPosition = maxOf(0.0, cachedPosition + seconds)
cachedPosition = newPosition
MPVLib.command(arrayOf("seek", seconds.toString(), "relative"))
}
fun setSpeed(speed: Double) {
_playbackSpeed = speed
MPVLib.setPropertyDouble("speed", speed)
}
fun getSpeed(): Double {
return MPVLib.getPropertyDouble("speed") ?: _playbackSpeed
}
// MARK: - Subtitle Controls
fun getSubtitleTracks(): List<Map<String, Any>> {
val tracks = mutableListOf<Map<String, Any>>()
val trackCount = MPVLib.getPropertyInt("track-list/count") ?: 0
for (i in 0 until trackCount) {
val trackType = MPVLib.getPropertyString("track-list/$i/type") ?: continue
if (trackType != "sub") continue
val trackId = MPVLib.getPropertyInt("track-list/$i/id") ?: continue
val track = mutableMapOf<String, Any>("id" to trackId)
MPVLib.getPropertyString("track-list/$i/title")?.let { track["title"] = it }
MPVLib.getPropertyString("track-list/$i/lang")?.let { track["lang"] = it }
val selected = MPVLib.getPropertyBoolean("track-list/$i/selected") ?: false
track["selected"] = selected
tracks.add(track)
}
return tracks
}
fun setSubtitleTrack(trackId: Int) {
Log.i(TAG, "setSubtitleTrack: setting sid to $trackId")
if (trackId < 0) {
MPVLib.setPropertyString("sid", "no")
} else {
MPVLib.setPropertyInt("sid", trackId)
}
}
fun disableSubtitles() {
MPVLib.setPropertyString("sid", "no")
}
fun getCurrentSubtitleTrack(): Int {
return MPVLib.getPropertyInt("sid") ?: 0
}
fun addSubtitleFile(url: String, select: Boolean = true) {
val flag = if (select) "select" else "cached"
MPVLib.command(arrayOf("sub-add", url, flag))
}
// MARK: - Subtitle Positioning
fun setSubtitlePosition(position: Int) {
MPVLib.setPropertyInt("sub-pos", position)
}
fun setSubtitleScale(scale: Double) {
MPVLib.setPropertyDouble("sub-scale", scale)
}
fun setSubtitleMarginY(margin: Int) {
MPVLib.setPropertyInt("sub-margin-y", margin)
}
fun setSubtitleAlignX(alignment: String) {
MPVLib.setPropertyString("sub-align-x", alignment)
}
fun setSubtitleAlignY(alignment: String) {
MPVLib.setPropertyString("sub-align-y", alignment)
}
fun setSubtitleFontSize(size: Int) {
MPVLib.setPropertyInt("sub-font-size", size)
}
// MARK: - Audio Track Controls
fun getAudioTracks(): List<Map<String, Any>> {
val tracks = mutableListOf<Map<String, Any>>()
val trackCount = MPVLib.getPropertyInt("track-list/count") ?: 0
for (i in 0 until trackCount) {
val trackType = MPVLib.getPropertyString("track-list/$i/type") ?: continue
if (trackType != "audio") continue
val trackId = MPVLib.getPropertyInt("track-list/$i/id") ?: continue
val track = mutableMapOf<String, Any>("id" to trackId)
MPVLib.getPropertyString("track-list/$i/title")?.let { track["title"] = it }
MPVLib.getPropertyString("track-list/$i/lang")?.let { track["lang"] = it }
MPVLib.getPropertyString("track-list/$i/codec")?.let { track["codec"] = it }
val channels = MPVLib.getPropertyInt("track-list/$i/audio-channels")
if (channels != null && channels > 0) {
track["channels"] = channels
}
val selected = MPVLib.getPropertyBoolean("track-list/$i/selected") ?: false
track["selected"] = selected
tracks.add(track)
}
return tracks
}
fun setAudioTrack(trackId: Int) {
Log.i(TAG, "setAudioTrack: setting aid to $trackId")
MPVLib.setPropertyInt("aid", trackId)
}
fun getCurrentAudioTrack(): Int {
return MPVLib.getPropertyInt("aid") ?: 0
}
// MARK: - Video Scaling
fun setZoomedToFill(zoomed: Boolean) {
// panscan: 0.0 = fit (letterbox), 1.0 = fill (crop)
val panscanValue = if (zoomed) 1.0 else 0.0
Log.i(TAG, "setZoomedToFill: setting panscan to $panscanValue")
MPVLib.setPropertyDouble("panscan", panscanValue)
}
// MARK: - MPVLib.EventObserver
override fun eventProperty(property: String) {
// Property changed but no value provided
}
override fun eventProperty(property: String, value: Long) {
when (property) {
"track-list/count" -> {
if (value > 0) {
Log.i(TAG, "Track list updated: $value tracks available")
mainHandler.post { delegate?.onTracksReady() }
}
}
"video-params/w" -> {
val width = value.toInt()
if (width > 0 && width != _videoWidth) {
_videoWidth = width
notifyVideoDimensionsIfReady()
}
}
"video-params/h" -> {
val height = value.toInt()
if (height > 0 && height != _videoHeight) {
_videoHeight = height
notifyVideoDimensionsIfReady()
}
}
}
}
private fun notifyVideoDimensionsIfReady() {
if (_videoWidth > 0 && _videoHeight > 0) {
Log.i(TAG, "Video dimensions: ${_videoWidth}x${_videoHeight}")
mainHandler.post { delegate?.onVideoDimensionsChanged(_videoWidth, _videoHeight) }
}
}
override fun eventProperty(property: String, value: Boolean) {
when (property) {
"pause" -> {
if (value != _isPaused) {
_isPaused = value
mainHandler.post { delegate?.onPauseChanged(value) }
}
}
"paused-for-cache" -> {
if (value != _isLoading) {
_isLoading = value
mainHandler.post { delegate?.onLoadingChanged(value) }
}
}
}
}
override fun eventProperty(property: String, value: String) {
// Handle string properties if needed
}
override fun eventProperty(property: String, value: Double) {
when (property) {
"duration" -> {
cachedDuration = value
mainHandler.post { delegate?.onPositionChanged(cachedPosition, cachedDuration) }
}
"time-pos" -> {
cachedPosition = value
mainHandler.post { delegate?.onPositionChanged(cachedPosition, cachedDuration) }
}
}
}
override fun event(eventId: Int) {
when (eventId) {
MPVLib.MPV_EVENT_FILE_LOADED -> {
// Add external subtitles now that file is loaded
if (pendingExternalSubtitles.isNotEmpty()) {
for (subUrl in pendingExternalSubtitles) {
MPVLib.command(arrayOf("sub-add", subUrl))
}
pendingExternalSubtitles = emptyList()
// Set subtitle after external subs are added
initialSubtitleId?.let { setSubtitleTrack(it) } ?: disableSubtitles()
}
if (!isReadyToSeek) {
isReadyToSeek = true
mainHandler.post { delegate?.onReadyToSeek() }
}
if (_isLoading) {
_isLoading = false
mainHandler.post { delegate?.onLoadingChanged(false) }
}
}
MPVLib.MPV_EVENT_SEEK -> {
// Seek started - show loading indicator
if (!_isLoading) {
_isLoading = true
mainHandler.post { delegate?.onLoadingChanged(true) }
}
}
MPVLib.MPV_EVENT_PLAYBACK_RESTART -> {
// Video playback has started/restarted (including after seek)
if (_isLoading) {
_isLoading = false
mainHandler.post { delegate?.onLoadingChanged(false) }
}
}
MPVLib.MPV_EVENT_END_FILE -> {
Log.i(TAG, "Playback ended")
}
MPVLib.MPV_EVENT_SHUTDOWN -> {
Log.w(TAG, "MPV shutdown")
}
}
}
}

View File

@@ -0,0 +1,220 @@
package expo.modules.mpvplayer
import android.content.Context
import android.util.Log
import android.view.Surface
import dev.jdtech.mpv.MPVLib as LibMPV
/**
* Wrapper around the dev.jdtech.mpv.MPVLib class.
* This provides a consistent interface for the rest of the app.
*/
object MPVLib {
private const val TAG = "MPVLib"
private var initialized = false
// Event observer interface
interface EventObserver {
fun eventProperty(property: String)
fun eventProperty(property: String, value: Long)
fun eventProperty(property: String, value: Boolean)
fun eventProperty(property: String, value: String)
fun eventProperty(property: String, value: Double)
fun event(eventId: Int)
}
private val observers = mutableListOf<EventObserver>()
// Library event observer that forwards to our observers
private val libObserver = object : LibMPV.EventObserver {
override fun eventProperty(property: String) {
synchronized(observers) {
for (observer in observers) {
observer.eventProperty(property)
}
}
}
override fun eventProperty(property: String, value: Long) {
synchronized(observers) {
for (observer in observers) {
observer.eventProperty(property, value)
}
}
}
override fun eventProperty(property: String, value: Boolean) {
synchronized(observers) {
for (observer in observers) {
observer.eventProperty(property, value)
}
}
}
override fun eventProperty(property: String, value: String) {
synchronized(observers) {
for (observer in observers) {
observer.eventProperty(property, value)
}
}
}
override fun eventProperty(property: String, value: Double) {
synchronized(observers) {
for (observer in observers) {
observer.eventProperty(property, value)
}
}
}
override fun event(eventId: Int) {
synchronized(observers) {
for (observer in observers) {
observer.event(eventId)
}
}
}
}
fun addObserver(observer: EventObserver) {
synchronized(observers) {
observers.add(observer)
}
}
fun removeObserver(observer: EventObserver) {
synchronized(observers) {
observers.remove(observer)
}
}
// MPV Event IDs
const val MPV_EVENT_NONE = 0
const val MPV_EVENT_SHUTDOWN = 1
const val MPV_EVENT_LOG_MESSAGE = 2
const val MPV_EVENT_GET_PROPERTY_REPLY = 3
const val MPV_EVENT_SET_PROPERTY_REPLY = 4
const val MPV_EVENT_COMMAND_REPLY = 5
const val MPV_EVENT_START_FILE = 6
const val MPV_EVENT_END_FILE = 7
const val MPV_EVENT_FILE_LOADED = 8
const val MPV_EVENT_IDLE = 11
const val MPV_EVENT_TICK = 14
const val MPV_EVENT_CLIENT_MESSAGE = 16
const val MPV_EVENT_VIDEO_RECONFIG = 17
const val MPV_EVENT_AUDIO_RECONFIG = 18
const val MPV_EVENT_SEEK = 20
const val MPV_EVENT_PLAYBACK_RESTART = 21
const val MPV_EVENT_PROPERTY_CHANGE = 22
const val MPV_EVENT_QUEUE_OVERFLOW = 24
// End file reason
const val MPV_END_FILE_REASON_EOF = 0
const val MPV_END_FILE_REASON_STOP = 2
const val MPV_END_FILE_REASON_QUIT = 3
const val MPV_END_FILE_REASON_ERROR = 4
const val MPV_END_FILE_REASON_REDIRECT = 5
/**
* Create and initialize the MPV library
*/
fun create(context: Context, configDir: String? = null) {
if (initialized) return
try {
LibMPV.create(context)
LibMPV.addObserver(libObserver)
initialized = true
Log.i(TAG, "libmpv created successfully")
} catch (e: Exception) {
Log.e(TAG, "Failed to create libmpv: ${e.message}")
throw e
}
}
fun initialize() {
LibMPV.init()
}
fun destroy() {
if (!initialized) return
try {
LibMPV.removeObserver(libObserver)
LibMPV.destroy()
} catch (e: Exception) {
Log.e(TAG, "Error destroying mpv: ${e.message}")
}
initialized = false
}
fun isInitialized(): Boolean = initialized
fun attachSurface(surface: Surface) {
LibMPV.attachSurface(surface)
}
fun detachSurface() {
LibMPV.detachSurface()
}
fun command(cmd: Array<String?>) {
LibMPV.command(cmd)
}
fun setOptionString(name: String, value: String): Int {
return LibMPV.setOptionString(name, value)
}
fun getPropertyInt(name: String): Int? {
return try {
LibMPV.getPropertyInt(name)
} catch (e: Exception) {
null
}
}
fun getPropertyDouble(name: String): Double? {
return try {
LibMPV.getPropertyDouble(name)
} catch (e: Exception) {
null
}
}
fun getPropertyBoolean(name: String): Boolean? {
return try {
LibMPV.getPropertyBoolean(name)
} catch (e: Exception) {
null
}
}
fun getPropertyString(name: String): String? {
return try {
LibMPV.getPropertyString(name)
} catch (e: Exception) {
null
}
}
fun setPropertyInt(name: String, value: Int) {
LibMPV.setPropertyInt(name, value)
}
fun setPropertyDouble(name: String, value: Double) {
LibMPV.setPropertyDouble(name, value)
}
fun setPropertyBoolean(name: String, value: Boolean) {
LibMPV.setPropertyBoolean(name, value)
}
fun setPropertyString(name: String, value: String) {
LibMPV.setPropertyString(name, value)
}
fun observeProperty(name: String, format: Int) {
LibMPV.observeProperty(name, format)
}
}

View File

@@ -2,49 +2,179 @@ package expo.modules.mpvplayer
import expo.modules.kotlin.modules.Module import expo.modules.kotlin.modules.Module
import expo.modules.kotlin.modules.ModuleDefinition import expo.modules.kotlin.modules.ModuleDefinition
import java.net.URL
class MpvPlayerModule : Module() { class MpvPlayerModule : Module() {
// Each module class must implement the definition function. The definition consists of components
// that describes the module's functionality and behavior.
// See https://docs.expo.dev/modules/module-api for more details about available components.
override fun definition() = ModuleDefinition { override fun definition() = ModuleDefinition {
// Sets the name of the module that JavaScript code will use to refer to the module. Takes a string as an argument.
// Can be inferred from module's class name, but it's recommended to set it explicitly for clarity.
// The module will be accessible from `requireNativeModule('MpvPlayer')` in JavaScript.
Name("MpvPlayer") Name("MpvPlayer")
// Defines constant property on the module.
Constant("PI") {
Math.PI
}
// Defines event names that the module can send to JavaScript. // Defines event names that the module can send to JavaScript.
Events("onChange") Events("onChange")
// Defines a JavaScript synchronous function that runs the native code on the JavaScript thread. // Defines a JavaScript synchronous function that runs the native code on the JavaScript thread.
Function("hello") { Function("hello") {
"Hello world! 👋" "Hello from MPV Player! 👋"
} }
// Defines a JavaScript function that always returns a Promise and whose native code // Defines a JavaScript function that always returns a Promise and whose native code
// is by default dispatched on the different thread than the JavaScript runtime runs on. // is by default dispatched on the different thread than the JavaScript runtime runs on.
AsyncFunction("setValueAsync") { value: String -> AsyncFunction("setValueAsync") { value: String ->
// Send an event to JavaScript. sendEvent("onChange", mapOf("value" to value))
sendEvent("onChange", mapOf(
"value" to value
))
} }
// Enables the module to be used as a native view. Definition components that are accepted as part of // Enables the module to be used as a native view.
// the view definition: Prop, Events.
View(MpvPlayerView::class) { View(MpvPlayerView::class) {
// Defines a setter for the `url` prop. // All video load options are passed via a single "source" prop
Prop("url") { view: MpvPlayerView, url: URL -> Prop("source") { view: MpvPlayerView, source: Map<String, Any?>? ->
view.webView.loadUrl(url.toString()) if (source == null) return@Prop
val urlString = source["url"] as? String ?: return@Prop
@Suppress("UNCHECKED_CAST")
val config = VideoLoadConfig(
url = urlString,
headers = source["headers"] as? Map<String, String>,
externalSubtitles = source["externalSubtitles"] as? List<String>,
startPosition = (source["startPosition"] as? Number)?.toDouble(),
autoplay = (source["autoplay"] as? Boolean) ?: true,
initialSubtitleId = (source["initialSubtitleId"] as? Number)?.toInt(),
initialAudioId = (source["initialAudioId"] as? Number)?.toInt()
)
view.loadVideo(config)
} }
// Defines an event that the view can send to JavaScript.
Events("onLoad") // Async function to play video
AsyncFunction("play") { view: MpvPlayerView ->
view.play()
}
// Async function to pause video
AsyncFunction("pause") { view: MpvPlayerView ->
view.pause()
}
// Async function to seek to position
AsyncFunction("seekTo") { view: MpvPlayerView, position: Double ->
view.seekTo(position)
}
// Async function to seek by offset
AsyncFunction("seekBy") { view: MpvPlayerView, offset: Double ->
view.seekBy(offset)
}
// Async function to set playback speed
AsyncFunction("setSpeed") { view: MpvPlayerView, speed: Double ->
view.setSpeed(speed)
}
// Function to get current speed
AsyncFunction("getSpeed") { view: MpvPlayerView ->
view.getSpeed()
}
// Function to check if paused
AsyncFunction("isPaused") { view: MpvPlayerView ->
view.isPaused()
}
// Function to get current position
AsyncFunction("getCurrentPosition") { view: MpvPlayerView ->
view.getCurrentPosition()
}
// Function to get duration
AsyncFunction("getDuration") { view: MpvPlayerView ->
view.getDuration()
}
// Picture in Picture functions
AsyncFunction("startPictureInPicture") { view: MpvPlayerView ->
view.startPictureInPicture()
}
AsyncFunction("stopPictureInPicture") { view: MpvPlayerView ->
view.stopPictureInPicture()
}
AsyncFunction("isPictureInPictureSupported") { view: MpvPlayerView ->
view.isPictureInPictureSupported()
}
AsyncFunction("isPictureInPictureActive") { view: MpvPlayerView ->
view.isPictureInPictureActive()
}
// Subtitle functions
AsyncFunction("getSubtitleTracks") { view: MpvPlayerView ->
view.getSubtitleTracks()
}
AsyncFunction("setSubtitleTrack") { view: MpvPlayerView, trackId: Int ->
view.setSubtitleTrack(trackId)
}
AsyncFunction("disableSubtitles") { view: MpvPlayerView ->
view.disableSubtitles()
}
AsyncFunction("getCurrentSubtitleTrack") { view: MpvPlayerView ->
view.getCurrentSubtitleTrack()
}
AsyncFunction("addSubtitleFile") { view: MpvPlayerView, url: String, select: Boolean ->
view.addSubtitleFile(url, select)
}
// Subtitle positioning functions
AsyncFunction("setSubtitlePosition") { view: MpvPlayerView, position: Int ->
view.setSubtitlePosition(position)
}
AsyncFunction("setSubtitleScale") { view: MpvPlayerView, scale: Double ->
view.setSubtitleScale(scale)
}
AsyncFunction("setSubtitleMarginY") { view: MpvPlayerView, margin: Int ->
view.setSubtitleMarginY(margin)
}
AsyncFunction("setSubtitleAlignX") { view: MpvPlayerView, alignment: String ->
view.setSubtitleAlignX(alignment)
}
AsyncFunction("setSubtitleAlignY") { view: MpvPlayerView, alignment: String ->
view.setSubtitleAlignY(alignment)
}
AsyncFunction("setSubtitleFontSize") { view: MpvPlayerView, size: Int ->
view.setSubtitleFontSize(size)
}
// Audio track functions
AsyncFunction("getAudioTracks") { view: MpvPlayerView ->
view.getAudioTracks()
}
AsyncFunction("setAudioTrack") { view: MpvPlayerView, trackId: Int ->
view.setAudioTrack(trackId)
}
AsyncFunction("getCurrentAudioTrack") { view: MpvPlayerView ->
view.getCurrentAudioTrack()
}
// Video scaling functions
AsyncFunction("setZoomedToFill") { view: MpvPlayerView, zoomed: Boolean ->
view.setZoomedToFill(zoomed)
}
AsyncFunction("isZoomedToFill") { view: MpvPlayerView ->
view.isZoomedToFill()
}
// Defines events that the view can send to JavaScript
Events("onLoad", "onPlaybackStateChange", "onProgress", "onError", "onTracksReady")
} }
} }
} }
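A usage sketch for the `source` prop parsed above. The field names match what the `Prop("source")` handler reads (url, headers, externalSubtitles, startPosition, autoplay, initialSubtitleId, initialAudioId); the import path, component typings, and other wrapper props are assumptions, since the TypeScript wrapper is not part of this hunk.

import { MpvPlayerView } from "@/modules"; // assumed re-export path

// Sketch only: values are illustrative.
export const PlayerSketch = ({ streamUrl, token }: { streamUrl: string; token: string }) => (
  <MpvPlayerView
    style={{ width: "100%", height: "100%" }}
    source={{
      url: streamUrl,
      headers: { Authorization: token }, // forwarded to mpv as http-header-fields
      externalSubtitles: ["https://example.org/subs.vtt"],
      startPosition: 120, // seconds
      autoplay: true,
      initialAudioId: 1,
      initialSubtitleId: -1, // a negative id disables subtitles in the renderer
    }}
  />
);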

View File

@@ -1,30 +1,398 @@
package expo.modules.mpvplayer package expo.modules.mpvplayer
import android.content.Context import android.content.Context
import android.webkit.WebView import android.graphics.Color
import android.webkit.WebViewClient import android.os.Build
import android.util.Log
import android.view.SurfaceHolder
import android.view.SurfaceView
import android.widget.FrameLayout
import expo.modules.kotlin.AppContext import expo.modules.kotlin.AppContext
import expo.modules.kotlin.viewevent.EventDispatcher import expo.modules.kotlin.viewevent.EventDispatcher
import expo.modules.kotlin.views.ExpoView import expo.modules.kotlin.views.ExpoView
class MpvPlayerView(context: Context, appContext: AppContext) : ExpoView(context, appContext) { /**
// Creates and initializes an event dispatcher for the `onLoad` event. * Configuration for loading a video
// The name of the event is inferred from the value and needs to match the event name defined in the module. */
private val onLoad by EventDispatcher() data class VideoLoadConfig(
val url: String,
val headers: Map<String, String>? = null,
val externalSubtitles: List<String>? = null,
val startPosition: Double? = null,
val autoplay: Boolean = true,
val initialSubtitleId: Int? = null,
val initialAudioId: Int? = null
)
// Defines a WebView that will be used as the root subview. /**
internal val webView = WebView(context).apply { * MpvPlayerView - ExpoView that hosts the MPV player.
layoutParams = LayoutParams(LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT) * This mirrors the iOS MpvPlayerView implementation.
webViewClient = object : WebViewClient() { */
override fun onPageFinished(view: WebView, url: String) { class MpvPlayerView(context: Context, appContext: AppContext) : ExpoView(context, appContext),
// Sends an event to JavaScript. Triggers a callback defined on the view component in JavaScript. MPVLayerRenderer.Delegate, SurfaceHolder.Callback {
onLoad(mapOf("url" to url))
} companion object {
private const val TAG = "MpvPlayerView"
/**
* Detect if running on an Android emulator.
* MPV player has EGL/OpenGL compatibility issues on emulators.
*/
private fun isEmulator(): Boolean {
return (Build.FINGERPRINT.startsWith("generic")
|| Build.FINGERPRINT.startsWith("unknown")
|| Build.MODEL.contains("google_sdk")
|| Build.MODEL.contains("Emulator")
|| Build.MODEL.contains("Android SDK built for x86")
|| Build.MANUFACTURER.contains("Genymotion")
|| (Build.BRAND.startsWith("generic") && Build.DEVICE.startsWith("generic"))
|| "google_sdk" == Build.PRODUCT
|| Build.HARDWARE.contains("goldfish")
|| Build.HARDWARE.contains("ranchu"))
} }
} }
// Event dispatchers
val onLoad by EventDispatcher()
val onPlaybackStateChange by EventDispatcher()
val onProgress by EventDispatcher()
val onError by EventDispatcher()
val onTracksReady by EventDispatcher()
private var surfaceView: SurfaceView
private var renderer: MPVLayerRenderer? = null
private var pipController: PiPController? = null
private var currentUrl: String? = null
private var cachedPosition: Double = 0.0
private var cachedDuration: Double = 0.0
private var intendedPlayState: Boolean = false
private var surfaceReady: Boolean = false
private var pendingConfig: VideoLoadConfig? = null
init { init {
// Adds the WebView to the view hierarchy. setBackgroundColor(Color.BLACK)
addView(webView)
// Create SurfaceView for video rendering
surfaceView = SurfaceView(context).apply {
layoutParams = FrameLayout.LayoutParams(
FrameLayout.LayoutParams.MATCH_PARENT,
FrameLayout.LayoutParams.MATCH_PARENT
)
holder.addCallback(this@MpvPlayerView)
}
addView(surfaceView)
// Initialize renderer
renderer = MPVLayerRenderer(context)
renderer?.delegate = this
// Initialize PiP controller with Expo's AppContext for proper activity access
pipController = PiPController(context, appContext)
pipController?.setPlayerView(surfaceView)
pipController?.delegate = object : PiPController.Delegate {
override fun onPlay() {
play()
}
override fun onPause() {
pause()
}
override fun onSeekBy(seconds: Double) {
seekBy(seconds)
}
}
// Start the renderer (skip on emulators to avoid EGL crashes)
if (isEmulator()) {
Log.w(TAG, "Running on emulator - MPV player disabled due to EGL/OpenGL compatibility issues")
// Don't start renderer on emulator, will show error when trying to play
} else {
try {
renderer?.start()
} catch (e: Exception) {
Log.e(TAG, "Failed to start renderer: ${e.message}")
onError(mapOf("error" to "Failed to start renderer: ${e.message}"))
}
}
}
private var isOnEmulator: Boolean = isEmulator()
// MARK: - SurfaceHolder.Callback
override fun surfaceCreated(holder: SurfaceHolder) {
Log.i(TAG, "Surface created")
surfaceReady = true
renderer?.attachSurface(holder.surface)
// If we have a pending load, execute it now
pendingConfig?.let { config ->
loadVideoInternal(config)
pendingConfig = null
}
}
override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {
Log.i(TAG, "Surface changed: ${width}x${height}")
// Update MPV with the new surface size (Findroid approach)
renderer?.updateSurfaceSize(width, height)
}
override fun surfaceDestroyed(holder: SurfaceHolder) {
Log.i(TAG, "Surface destroyed")
surfaceReady = false
renderer?.detachSurface()
}
// MARK: - Video Loading
fun loadVideo(config: VideoLoadConfig) {
// Block video loading on emulators
if (isOnEmulator) {
Log.w(TAG, "Cannot load video on emulator - MPV player not supported")
onError(mapOf("error" to "MPV player is not supported on emulators. Please test on a real device."))
return
}
// Skip reload if same URL is already playing
if (currentUrl == config.url) {
return
}
if (!surfaceReady) {
// Surface not ready, store config and load when ready
pendingConfig = config
return
}
loadVideoInternal(config)
}
private fun loadVideoInternal(config: VideoLoadConfig) {
currentUrl = config.url
renderer?.load(
url = config.url,
headers = config.headers,
startPosition = config.startPosition,
externalSubtitles = config.externalSubtitles,
initialSubtitleId = config.initialSubtitleId,
initialAudioId = config.initialAudioId
)
if (config.autoplay) {
play()
}
onLoad(mapOf("url" to config.url))
}
// Convenience method for simple loads
fun loadVideo(url: String, headers: Map<String, String>? = null) {
loadVideo(VideoLoadConfig(url = url, headers = headers))
}
// MARK: - Playback Controls
fun play() {
intendedPlayState = true
renderer?.play()
pipController?.setPlaybackRate(1.0)
}
fun pause() {
intendedPlayState = false
renderer?.pause()
pipController?.setPlaybackRate(0.0)
}
fun seekTo(position: Double) {
renderer?.seekTo(position)
}
fun seekBy(offset: Double) {
renderer?.seekBy(offset)
}
fun setSpeed(speed: Double) {
renderer?.setSpeed(speed)
}
fun getSpeed(): Double {
return renderer?.getSpeed() ?: 1.0
}
fun isPaused(): Boolean {
return renderer?.isPausedState ?: true
}
fun getCurrentPosition(): Double {
return cachedPosition
}
fun getDuration(): Double {
return cachedDuration
}
// MARK: - Picture in Picture
fun startPictureInPicture() {
Log.i(TAG, "startPictureInPicture called")
pipController?.startPictureInPicture()
}
fun stopPictureInPicture() {
pipController?.stopPictureInPicture()
}
fun isPictureInPictureSupported(): Boolean {
return pipController?.isPictureInPictureSupported() ?: false
}
fun isPictureInPictureActive(): Boolean {
return pipController?.isPictureInPictureActive() ?: false
}
// MARK: - Subtitle Controls
fun getSubtitleTracks(): List<Map<String, Any>> {
return renderer?.getSubtitleTracks() ?: emptyList()
}
fun setSubtitleTrack(trackId: Int) {
renderer?.setSubtitleTrack(trackId)
}
fun disableSubtitles() {
renderer?.disableSubtitles()
}
fun getCurrentSubtitleTrack(): Int {
return renderer?.getCurrentSubtitleTrack() ?: 0
}
fun addSubtitleFile(url: String, select: Boolean = true) {
renderer?.addSubtitleFile(url, select)
}
// MARK: - Subtitle Positioning
fun setSubtitlePosition(position: Int) {
renderer?.setSubtitlePosition(position)
}
fun setSubtitleScale(scale: Double) {
renderer?.setSubtitleScale(scale)
}
fun setSubtitleMarginY(margin: Int) {
renderer?.setSubtitleMarginY(margin)
}
fun setSubtitleAlignX(alignment: String) {
renderer?.setSubtitleAlignX(alignment)
}
fun setSubtitleAlignY(alignment: String) {
renderer?.setSubtitleAlignY(alignment)
}
fun setSubtitleFontSize(size: Int) {
renderer?.setSubtitleFontSize(size)
}
// MARK: - Audio Track Controls
fun getAudioTracks(): List<Map<String, Any>> {
return renderer?.getAudioTracks() ?: emptyList()
}
fun setAudioTrack(trackId: Int) {
renderer?.setAudioTrack(trackId)
}
fun getCurrentAudioTrack(): Int {
return renderer?.getCurrentAudioTrack() ?: 0
}
// MARK: - Video Scaling
private var _isZoomedToFill: Boolean = false
fun setZoomedToFill(zoomed: Boolean) {
_isZoomedToFill = zoomed
renderer?.setZoomedToFill(zoomed)
}
fun isZoomedToFill(): Boolean {
return _isZoomedToFill
}
// MARK: - MPVLayerRenderer.Delegate
override fun onPositionChanged(position: Double, duration: Double) {
cachedPosition = position
cachedDuration = duration
// Update PiP progress
if (pipController?.isPictureInPictureActive() == true) {
pipController?.setCurrentTime(position, duration)
}
onProgress(mapOf(
"position" to position,
"duration" to duration,
"progress" to if (duration > 0) position / duration else 0.0
))
}
override fun onPauseChanged(isPaused: Boolean) {
// Sync PiP playback rate
pipController?.setPlaybackRate(if (isPaused) 0.0 else 1.0)
onPlaybackStateChange(mapOf(
"isPaused" to isPaused,
"isPlaying" to !isPaused
))
}
override fun onLoadingChanged(isLoading: Boolean) {
onPlaybackStateChange(mapOf(
"isLoading" to isLoading
))
}
override fun onReadyToSeek() {
onPlaybackStateChange(mapOf(
"isReadyToSeek" to true
))
}
override fun onTracksReady() {
onTracksReady(emptyMap<String, Any>())
}
override fun onVideoDimensionsChanged(width: Int, height: Int) {
// Update PiP controller with video dimensions for proper aspect ratio
pipController?.setVideoDimensions(width, height)
}
override fun onError(message: String) {
onError(mapOf("error" to message))
}
// MARK: - Cleanup
fun cleanup() {
pipController?.stopPictureInPicture()
renderer?.stop()
surfaceView.holder.removeCallback(this)
}
override fun onDetachedFromWindow() {
super.onDetachedFromWindow()
cleanup()
} }
} }
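The payload shapes below are inferred from the maps this view dispatches through `onProgress` and `onPlaybackStateChange`; how they arrive on the JS side (for example wrapped in `nativeEvent`) depends on the TypeScript wrapper, which is not shown in this diff.

// Inferred from the dispatch maps above; illustrative typings only.
export type MpvProgressPayload = {
  position: number; // seconds
  duration: number; // seconds
  progress: number; // 0..1, position / duration (0 when duration is unknown)
};

export type MpvPlaybackStatePayload = {
  isPaused?: boolean;
  isPlaying?: boolean;
  isLoading?: boolean;
  isReadyToSeek?: boolean;
};

export const handleProgress = ({ position, duration }: MpvProgressPayload) => {
  console.log(`progress: ${position.toFixed(1)} / ${duration.toFixed(1)} s`);
};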

View File

@@ -0,0 +1,263 @@
package expo.modules.mpvplayer
import android.app.Activity
import android.app.PictureInPictureParams
import android.content.Context
import android.content.pm.PackageManager
import android.graphics.Rect
import android.os.Build
import android.util.Log
import android.util.Rational
import android.view.View
import androidx.annotation.RequiresApi
import expo.modules.kotlin.AppContext
/**
* Picture-in-Picture controller for Android.
* This mirrors the iOS PiPController implementation.
*/
class PiPController(private val context: Context, private val appContext: AppContext? = null) {
companion object {
private const val TAG = "PiPController"
private const val DEFAULT_ASPECT_WIDTH = 16
private const val DEFAULT_ASPECT_HEIGHT = 9
}
interface Delegate {
fun onPlay()
fun onPause()
fun onSeekBy(seconds: Double)
}
var delegate: Delegate? = null
private var currentPosition: Double = 0.0
private var currentDuration: Double = 0.0
private var playbackRate: Double = 1.0
// Video dimensions for proper aspect ratio
private var videoWidth: Int = 0
private var videoHeight: Int = 0
// Reference to the player view for source rect
private var playerView: View? = null
/**
* Check if Picture-in-Picture is supported on this device
*/
fun isPictureInPictureSupported(): Boolean {
return if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
context.packageManager.hasSystemFeature(PackageManager.FEATURE_PICTURE_IN_PICTURE)
} else {
false
}
}
/**
* Check if Picture-in-Picture is currently active
*/
fun isPictureInPictureActive(): Boolean {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
val activity = getActivity()
return activity?.isInPictureInPictureMode ?: false
}
return false
}
/**
* Start Picture-in-Picture mode
*/
fun startPictureInPicture() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
val activity = getActivity()
if (activity == null) {
Log.e(TAG, "Cannot start PiP: no activity found")
return
}
if (!isPictureInPictureSupported()) {
Log.e(TAG, "PiP not supported on this device")
return
}
try {
val params = buildPiPParams(forEntering = true)
activity.enterPictureInPictureMode(params)
Log.i(TAG, "Entered PiP mode")
} catch (e: Exception) {
Log.e(TAG, "Failed to enter PiP: ${e.message}")
}
} else {
Log.w(TAG, "PiP requires Android O or higher")
}
}
/**
* Stop Picture-in-Picture mode
*/
fun stopPictureInPicture() {
// On Android, exiting PiP is typically done by the user
// or by finishing the activity. We can request to move task to back.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
val activity = getActivity()
if (activity?.isInPictureInPictureMode == true) {
// Move task to back which will exit PiP
activity.moveTaskToBack(false)
}
}
}
/**
* Update the current playback position and duration
* Note: We don't update PiP params here as we're not using progress in PiP controls
*/
fun setCurrentTime(position: Double, duration: Double) {
currentPosition = position
currentDuration = duration
}
/**
* Set the playback rate (0.0 for paused, 1.0 for playing)
*/
fun setPlaybackRate(rate: Double) {
playbackRate = rate
// Update PiP params to reflect play/pause state
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
val activity = getActivity()
if (activity?.isInPictureInPictureMode == true) {
try {
activity.setPictureInPictureParams(buildPiPParams())
} catch (e: Exception) {
Log.e(TAG, "Failed to update PiP params: ${e.message}")
}
}
}
}
/**
* Set the video dimensions for proper aspect ratio calculation
*/
fun setVideoDimensions(width: Int, height: Int) {
if (width > 0 && height > 0) {
videoWidth = width
videoHeight = height
Log.i(TAG, "Video dimensions set: ${width}x${height}")
// Update PiP params if active
updatePiPParamsIfNeeded()
}
}
/**
* Set the player view reference for source rect hint
*/
fun setPlayerView(view: View?) {
playerView = view
}
private fun updatePiPParamsIfNeeded() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
val activity = getActivity()
if (activity?.isInPictureInPictureMode == true) {
try {
activity.setPictureInPictureParams(buildPiPParams())
} catch (e: Exception) {
Log.e(TAG, "Failed to update PiP params: ${e.message}")
}
}
}
}
/**
* Build Picture-in-Picture params for the current player state.
* Calculates proper aspect ratio and source rect based on video and view dimensions.
*/
@RequiresApi(Build.VERSION_CODES.O)
private fun buildPiPParams(forEntering: Boolean = false): PictureInPictureParams {
val view = playerView
val viewWidth = view?.width ?: 0
val viewHeight = view?.height ?: 0
// Display aspect ratio from view (exactly like Findroid)
val displayAspectRatio = Rational(viewWidth.coerceAtLeast(1), viewHeight.coerceAtLeast(1))
// Video aspect ratio with 2.39:1 clamping (exactly like Findroid)
// Findroid: Rational(it.width.coerceAtMost((it.height * 2.39f).toInt()),
// it.height.coerceAtMost((it.width * 2.39f).toInt()))
val aspectRatio = if (videoWidth > 0 && videoHeight > 0) {
Rational(
videoWidth.coerceAtMost((videoHeight * 2.39f).toInt()),
videoHeight.coerceAtMost((videoWidth * 2.39f).toInt())
)
} else {
Rational(DEFAULT_ASPECT_WIDTH, DEFAULT_ASPECT_HEIGHT)
}
// Source rect hint calculation (exactly like Findroid)
val sourceRectHint = if (viewWidth > 0 && viewHeight > 0 && videoWidth > 0 && videoHeight > 0) {
if (displayAspectRatio < aspectRatio) {
// Letterboxing - black bars top/bottom
val space = ((viewHeight - (viewWidth.toFloat() / aspectRatio.toFloat())) / 2).toInt()
Rect(
0,
space,
viewWidth,
(viewWidth.toFloat() / aspectRatio.toFloat()).toInt() + space
)
} else {
// Pillarboxing - black bars left/right
val space = ((viewWidth - (viewHeight.toFloat() * aspectRatio.toFloat())) / 2).toInt()
Rect(
space,
0,
(viewHeight.toFloat() * aspectRatio.toFloat()).toInt() + space,
viewHeight
)
}
} else {
null
}
val builder = PictureInPictureParams.Builder()
.setAspectRatio(aspectRatio)
sourceRectHint?.let { builder.setSourceRectHint(it) }
// On Android 12+, enable auto-enter (like Findroid)
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
builder.setAutoEnterEnabled(true)
}
return builder.build()
}
private fun getActivity(): Activity? {
// First try Expo's AppContext (preferred in React Native)
appContext?.currentActivity?.let { return it }
// Fallback: Try to get from context wrapper chain
var ctx = context
while (ctx is android.content.ContextWrapper) {
if (ctx is Activity) {
return ctx
}
ctx = ctx.baseContext
}
return null
}
/**
* Handle PiP action (called from activity when user taps PiP controls)
*/
fun handlePiPAction(action: String) {
when (action) {
"play" -> delegate?.onPlay()
"pause" -> delegate?.onPause()
"skip_forward" -> delegate?.onSeekBy(10.0)
"skip_backward" -> delegate?.onSeekBy(-10.0)
}
}
}
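On the JS side, PiP is reached through the view-scoped async functions registered in MpvPlayerModule (`isPictureInPictureSupported`, `startPictureInPicture`). A sketch under the assumption that the exported ref proxies them the same way the removed VlcPlayerView wrapper proxied its native methods; the method names on the ref are assumptions.

import { useCallback } from "react";
import type { RefObject } from "react";

// Assumed shape; the real MpvPlayerViewRef is exported from ./mpv-player.
type PipCapableRef = {
  isPictureInPictureSupported: () => Promise<boolean>;
  startPictureInPicture: () => Promise<void>;
};

export const useEnterPip = (playerRef: RefObject<PipCapableRef>) =>
  useCallback(async () => {
    if (await playerRef.current?.isPictureInPictureSupported()) {
      await playerRef.current?.startPictureInPicture();
    }
  }, [playerRef]);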

View File

@@ -1,5 +1,8 @@
{ {
"platforms": ["android", "web"], "platforms": ["apple", "android", "web"],
"apple": {
"modules": ["MpvPlayerModule"]
},
"android": { "android": {
"modules": ["expo.modules.mpvplayer.MpvPlayerModule"] "modules": ["expo.modules.mpvplayer.MpvPlayerModule"]
} }

View File

@@ -0,0 +1,726 @@
import UIKit
import MPVKit
import CoreMedia
import CoreVideo
import AVFoundation
protocol MPVLayerRendererDelegate: AnyObject {
func renderer(_ renderer: MPVLayerRenderer, didUpdatePosition position: Double, duration: Double)
func renderer(_ renderer: MPVLayerRenderer, didChangePause isPaused: Bool)
func renderer(_ renderer: MPVLayerRenderer, didChangeLoading isLoading: Bool)
func renderer(_ renderer: MPVLayerRenderer, didBecomeReadyToSeek: Bool)
func renderer(_ renderer: MPVLayerRenderer, didBecomeTracksReady: Bool)
}
/// MPV player using vo_avfoundation for video output.
/// This renders video directly to AVSampleBufferDisplayLayer for PiP support.
final class MPVLayerRenderer {
enum RendererError: Error {
case mpvCreationFailed
case mpvInitialization(Int32)
}
private let displayLayer: AVSampleBufferDisplayLayer
private let queue = DispatchQueue(label: "mpv.avfoundation", qos: .userInitiated)
private let stateQueue = DispatchQueue(label: "mpv.avfoundation.state", attributes: .concurrent)
private var mpv: OpaquePointer?
private var currentPreset: PlayerPreset?
private var currentURL: URL?
private var currentHeaders: [String: String]?
private var pendingExternalSubtitles: [String] = []
private var initialSubtitleId: Int?
private var initialAudioId: Int?
private var isRunning = false
private var isStopping = false
weak var delegate: MPVLayerRendererDelegate?
// Thread-safe state for playback
private var _cachedDuration: Double = 0
private var _cachedPosition: Double = 0
private var _isPaused: Bool = true
private var _playbackSpeed: Double = 1.0
private var _isLoading: Bool = false
private var _isReadyToSeek: Bool = false
// Thread-safe accessors
private var cachedDuration: Double {
get { stateQueue.sync { _cachedDuration } }
set { stateQueue.async(flags: .barrier) { self._cachedDuration = newValue } }
}
private var cachedPosition: Double {
get { stateQueue.sync { _cachedPosition } }
set { stateQueue.async(flags: .barrier) { self._cachedPosition = newValue } }
}
private var isPaused: Bool {
get { stateQueue.sync { _isPaused } }
set { stateQueue.async(flags: .barrier) { self._isPaused = newValue } }
}
private var playbackSpeed: Double {
get { stateQueue.sync { _playbackSpeed } }
set { stateQueue.async(flags: .barrier) { self._playbackSpeed = newValue } }
}
private var isLoading: Bool {
get { stateQueue.sync { _isLoading } }
set { stateQueue.async(flags: .barrier) { self._isLoading = newValue } }
}
private var isReadyToSeek: Bool {
get { stateQueue.sync { _isReadyToSeek } }
set { stateQueue.async(flags: .barrier) { self._isReadyToSeek = newValue } }
}
var isPausedState: Bool {
return isPaused
}
init(displayLayer: AVSampleBufferDisplayLayer) {
self.displayLayer = displayLayer
}
deinit {
stop()
}
func start() throws {
guard !isRunning else { return }
guard let handle = mpv_create() else {
throw RendererError.mpvCreationFailed
}
mpv = handle
// Logging - only warnings and errors in release, verbose in debug
#if DEBUG
checkError(mpv_request_log_messages(handle, "warn"))
#else
checkError(mpv_request_log_messages(handle, "no"))
#endif
// Detect if running on simulator
#if targetEnvironment(simulator)
let isSimulator = true
#else
let isSimulator = false
#endif
// Pass the AVSampleBufferDisplayLayer to mpv via --wid
// The vo_avfoundation driver expects this
let layerPtrInt = Int(bitPattern: Unmanaged.passUnretained(displayLayer).toOpaque())
var displayLayerPtr = Int64(layerPtrInt)
checkError(mpv_set_option(handle, "wid", MPV_FORMAT_INT64, &displayLayerPtr))
// Use AVFoundation video output - required for PiP support
checkError(mpv_set_option_string(handle, "vo", "avfoundation"))
// Enable composite OSD mode - renders subtitles directly onto video frames using GPU
// This is better for PiP as subtitles are baked into the video
checkError(mpv_set_option_string(handle, "avfoundation-composite-osd", "yes"))
// Hardware decoding with VideoToolbox
// On simulator, use software decoding since VideoToolbox is not available
// On device, use VideoToolbox with software fallback enabled
let hwdecValue = isSimulator ? "no" : "videotoolbox"
checkError(mpv_set_option_string(handle, "hwdec", hwdecValue))
checkError(mpv_set_option_string(handle, "hwdec-codecs", "all"))
checkError(mpv_set_option_string(handle, "hwdec-software-fallback", "yes"))
// Subtitle and audio settings
checkError(mpv_set_option_string(handle, "subs-match-os-language", "yes"))
checkError(mpv_set_option_string(handle, "subs-fallback", "yes"))
// Initialize mpv
let initStatus = mpv_initialize(handle)
guard initStatus >= 0 else {
throw RendererError.mpvInitialization(initStatus)
}
// Observe properties
observeProperties()
// Setup wakeup callback
mpv_set_wakeup_callback(handle, { ctx in
guard let ctx = ctx else { return }
let instance = Unmanaged<MPVLayerRenderer>.fromOpaque(ctx).takeUnretainedValue()
instance.processEvents()
}, Unmanaged.passUnretained(self).toOpaque())
isRunning = true
}
func stop() {
if isStopping { return }
if !isRunning, mpv == nil { return }
isRunning = false
isStopping = true
queue.sync { [weak self] in
guard let self, let handle = self.mpv else { return }
mpv_set_wakeup_callback(handle, nil, nil)
mpv_terminate_destroy(handle)
self.mpv = nil
}
DispatchQueue.main.async { [weak self] in
guard let self else { return }
if #available(iOS 18.0, *) {
self.displayLayer.sampleBufferRenderer.flush(removingDisplayedImage: true, completionHandler: nil)
} else {
self.displayLayer.flushAndRemoveImage()
}
}
isStopping = false
}
func load(
url: URL,
with preset: PlayerPreset,
headers: [String: String]? = nil,
startPosition: Double? = nil,
externalSubtitles: [String]? = nil,
initialSubtitleId: Int? = nil,
initialAudioId: Int? = nil
) {
currentPreset = preset
currentURL = url
currentHeaders = headers
pendingExternalSubtitles = externalSubtitles ?? []
self.initialSubtitleId = initialSubtitleId
self.initialAudioId = initialAudioId
queue.async { [weak self] in
guard let self else { return }
self.isLoading = true
self.isReadyToSeek = false
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.delegate?.renderer(self, didChangeLoading: true)
}
guard let handle = self.mpv else { return }
self.apply(commands: preset.commands, on: handle)
// Stop previous playback before loading new file
self.command(handle, ["stop"])
self.updateHTTPHeaders(headers)
// Set start position
if let startPos = startPosition, startPos > 0 {
self.setProperty(name: "start", value: String(format: "%.2f", startPos))
} else {
self.setProperty(name: "start", value: "0")
}
// Set initial audio track if specified
if let audioId = self.initialAudioId, audioId > 0 {
self.setAudioTrack(audioId)
}
// Set initial subtitle track if no external subs
if self.pendingExternalSubtitles.isEmpty {
if let subId = self.initialSubtitleId {
self.setSubtitleTrack(subId)
} else {
self.disableSubtitles()
}
} else {
self.disableSubtitles()
}
let target = url.isFileURL ? url.path : url.absoluteString
self.command(handle, ["loadfile", target, "replace"])
}
}
func reloadCurrentItem() {
guard let url = currentURL, let preset = currentPreset else { return }
load(url: url, with: preset, headers: currentHeaders)
}
func applyPreset(_ preset: PlayerPreset) {
currentPreset = preset
guard let handle = mpv else { return }
queue.async { [weak self] in
guard let self else { return }
self.apply(commands: preset.commands, on: handle)
}
}
// MARK: - Property Helpers
private func setOption(name: String, value: String) {
guard let handle = mpv else { return }
checkError(mpv_set_option_string(handle, name, value))
}
private func setProperty(name: String, value: String) {
guard let handle = mpv else { return }
let status = mpv_set_property_string(handle, name, value)
if status < 0 {
Logger.shared.log("Failed to set property \(name)=\(value) (\(status))", type: "Warn")
}
}
private func clearProperty(name: String) {
guard let handle = mpv else { return }
let status = mpv_set_property(handle, name, MPV_FORMAT_NONE, nil)
if status < 0 {
Logger.shared.log("Failed to clear property \(name) (\(status))", type: "Warn")
}
}
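// Joins the headers into a single "Key: Value" string separated by CRLF for mpv's http-header-fields property; a nil or empty dictionary clears the property instead.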
private func updateHTTPHeaders(_ headers: [String: String]?) {
guard let headers, !headers.isEmpty else {
clearProperty(name: "http-header-fields")
return
}
let headerString = headers
.map { key, value in "\(key): \(value)" }
.joined(separator: "\r\n")
setProperty(name: "http-header-fields", value: headerString)
}
private func observeProperties() {
guard let handle = mpv else { return }
let properties: [(String, mpv_format)] = [
("duration", MPV_FORMAT_DOUBLE),
("time-pos", MPV_FORMAT_DOUBLE),
("pause", MPV_FORMAT_FLAG),
("track-list/count", MPV_FORMAT_INT64),
("paused-for-cache", MPV_FORMAT_FLAG)
]
for (name, format) in properties {
mpv_observe_property(handle, 0, name, format)
}
}
private func apply(commands: [[String]], on handle: OpaquePointer) {
for command in commands {
guard !command.isEmpty else { continue }
self.command(handle, command)
}
}
private func command(_ handle: OpaquePointer, _ args: [String]) {
guard !args.isEmpty else { return }
_ = withCStringArray(args) { pointer in
mpv_command_async(handle, 0, pointer)
}
}
@discardableResult
private func commandSync(_ handle: OpaquePointer, _ args: [String]) -> Int32 {
guard !args.isEmpty else { return -1 }
return withCStringArray(args) { pointer in
mpv_command(handle, pointer)
}
}
private func checkError(_ status: CInt) {
if status < 0 {
Logger.shared.log("MPV API error: \(String(cString: mpv_error_string(status)))", type: "Error")
}
}
// MARK: - Event Handling
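// Called from mpv's wakeup callback: hop onto the worker queue and drain all pending events until MPV_EVENT_NONE (or shutdown) so the callback itself stays cheap.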
private func processEvents() {
queue.async { [weak self] in
guard let self else { return }
while self.mpv != nil && !self.isStopping {
guard let handle = self.mpv,
let eventPointer = mpv_wait_event(handle, 0) else { return }
let event = eventPointer.pointee
if event.event_id == MPV_EVENT_NONE { break }
self.handleEvent(event)
if event.event_id == MPV_EVENT_SHUTDOWN { break }
}
}
}
private func handleEvent(_ event: mpv_event) {
switch event.event_id {
case MPV_EVENT_FILE_LOADED:
// Add external subtitles now that the file is loaded
let hadExternalSubs = !pendingExternalSubtitles.isEmpty
if hadExternalSubs, let handle = mpv {
for subUrl in pendingExternalSubtitles {
command(handle, ["sub-add", subUrl])
}
pendingExternalSubtitles = []
// Set subtitle after external subs are added
if let subId = initialSubtitleId {
setSubtitleTrack(subId)
} else {
disableSubtitles()
}
}
if !isReadyToSeek {
isReadyToSeek = true
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.delegate?.renderer(self, didBecomeReadyToSeek: true)
}
}
// Notify loading ended
if isLoading {
isLoading = false
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.delegate?.renderer(self, didChangeLoading: false)
}
}
case MPV_EVENT_SEEK:
// Seek started - show loading indicator
if !isLoading {
isLoading = true
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.delegate?.renderer(self, didChangeLoading: true)
}
}
case MPV_EVENT_PLAYBACK_RESTART:
// Video playback has started/restarted (including after seek)
if isLoading {
isLoading = false
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.delegate?.renderer(self, didChangeLoading: false)
}
}
case MPV_EVENT_PROPERTY_CHANGE:
if let property = event.data?.assumingMemoryBound(to: mpv_event_property.self).pointee.name {
let name = String(cString: property)
refreshProperty(named: name, event: event)
}
case MPV_EVENT_SHUTDOWN:
Logger.shared.log("mpv shutdown", type: "Warn")
case MPV_EVENT_LOG_MESSAGE:
if let logMessagePointer = event.data?.assumingMemoryBound(to: mpv_event_log_message.self) {
let component = String(cString: logMessagePointer.pointee.prefix)
let text = String(cString: logMessagePointer.pointee.text)
let lower = text.lowercased()
if lower.contains("error") {
Logger.shared.log("mpv[\(component)] \(text)", type: "Error")
} else if lower.contains("warn") || lower.contains("warning") {
Logger.shared.log("mpv[\(component)] \(text)", type: "Warn")
}
}
default:
break
}
}
private func refreshProperty(named name: String, event: mpv_event) {
guard let handle = mpv else { return }
switch name {
case "duration":
var value = Double(0)
let status = getProperty(handle: handle, name: name, format: MPV_FORMAT_DOUBLE, value: &value)
if status >= 0 {
cachedDuration = value
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.delegate?.renderer(self, didUpdatePosition: self.cachedPosition, duration: self.cachedDuration)
}
}
case "time-pos":
var value = Double(0)
let status = getProperty(handle: handle, name: name, format: MPV_FORMAT_DOUBLE, value: &value)
if status >= 0 {
cachedPosition = value
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.delegate?.renderer(self, didUpdatePosition: self.cachedPosition, duration: self.cachedDuration)
}
}
case "pause":
var flag: Int32 = 0
let status = getProperty(handle: handle, name: name, format: MPV_FORMAT_FLAG, value: &flag)
if status >= 0 {
let newPaused = flag != 0
if newPaused != isPaused {
isPaused = newPaused
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.delegate?.renderer(self, didChangePause: self.isPaused)
}
}
}
case "paused-for-cache":
var flag: Int32 = 0
let status = getProperty(handle: handle, name: name, format: MPV_FORMAT_FLAG, value: &flag)
if status >= 0 {
let buffering = flag != 0
if buffering != isLoading {
isLoading = buffering
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.delegate?.renderer(self, didChangeLoading: buffering)
}
}
}
case "track-list/count":
var trackCount: Int64 = 0
let status = getProperty(handle: handle, name: name, format: MPV_FORMAT_INT64, value: &trackCount)
if status >= 0 && trackCount > 0 {
Logger.shared.log("Track list updated: \(trackCount) tracks available", type: "Info")
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.delegate?.renderer(self, didBecomeTracksReady: true)
}
}
default:
break
}
}
private func getStringProperty(handle: OpaquePointer, name: String) -> String? {
var result: String?
if let cString = mpv_get_property_string(handle, name) {
result = String(cString: cString)
mpv_free(cString)
}
return result
}
@discardableResult
private func getProperty<T>(handle: OpaquePointer, name: String, format: mpv_format, value: inout T) -> Int32 {
return withUnsafeMutablePointer(to: &value) { mutablePointer in
return mpv_get_property(handle, name, format, mutablePointer)
}
}
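// Copies the Swift strings into a NULL-terminated array of C strings (the layout mpv_command/mpv_command_async expect) and frees the copies once the body returns.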
@inline(__always)
private func withCStringArray<R>(_ args: [String], body: (UnsafeMutablePointer<UnsafePointer<CChar>?>?) -> R) -> R {
var cStrings = [UnsafeMutablePointer<CChar>?]()
cStrings.reserveCapacity(args.count + 1)
for s in args {
cStrings.append(strdup(s))
}
cStrings.append(nil)
defer {
for ptr in cStrings where ptr != nil {
free(ptr)
}
}
return cStrings.withUnsafeMutableBufferPointer { buffer in
return buffer.baseAddress!.withMemoryRebound(to: UnsafePointer<CChar>?.self, capacity: buffer.count) { rebound in
return body(UnsafeMutablePointer(mutating: rebound))
}
}
}
// MARK: - Playback Controls
func play() {
setProperty(name: "pause", value: "no")
}
func pausePlayback() {
setProperty(name: "pause", value: "yes")
}
func togglePause() {
if isPaused { play() } else { pausePlayback() }
}
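// Seeks clamp to zero and optimistically update cachedPosition so progress callbacks don't briefly report the pre-seek position while mpv processes the command.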
func seek(to seconds: Double) {
guard let handle = mpv else { return }
let clamped = max(0, seconds)
cachedPosition = clamped
commandSync(handle, ["seek", String(clamped), "absolute"])
}
func seek(by seconds: Double) {
guard let handle = mpv else { return }
let newPosition = max(0, cachedPosition + seconds)
cachedPosition = newPosition
commandSync(handle, ["seek", String(seconds), "relative"])
}
/// Sync timebase - no-op for vo_avfoundation (mpv handles timing)
func syncTimebase() {
// vo_avfoundation manages its own timebase
}
func setSpeed(_ speed: Double) {
playbackSpeed = speed
setProperty(name: "speed", value: String(speed))
}
func getSpeed() -> Double {
guard let handle = mpv else { return 1.0 }
var speed: Double = 1.0
getProperty(handle: handle, name: "speed", format: MPV_FORMAT_DOUBLE, value: &speed)
return speed
}
// MARK: - Subtitle Controls
func getSubtitleTracks() -> [[String: Any]] {
guard let handle = mpv else {
Logger.shared.log("getSubtitleTracks: mpv handle is nil", type: "Warn")
return []
}
var tracks: [[String: Any]] = []
var trackCount: Int64 = 0
getProperty(handle: handle, name: "track-list/count", format: MPV_FORMAT_INT64, value: &trackCount)
for i in 0..<trackCount {
guard let trackType = getStringProperty(handle: handle, name: "track-list/\(i)/type"),
trackType == "sub" else { continue }
var trackId: Int64 = 0
getProperty(handle: handle, name: "track-list/\(i)/id", format: MPV_FORMAT_INT64, value: &trackId)
var track: [String: Any] = ["id": Int(trackId)]
if let title = getStringProperty(handle: handle, name: "track-list/\(i)/title") {
track["title"] = title
}
if let lang = getStringProperty(handle: handle, name: "track-list/\(i)/lang") {
track["lang"] = lang
}
var selected: Int32 = 0
getProperty(handle: handle, name: "track-list/\(i)/selected", format: MPV_FORMAT_FLAG, value: &selected)
track["selected"] = selected != 0
Logger.shared.log("getSubtitleTracks: found sub track id=\(trackId), title=\(track["title"] ?? "none"), lang=\(track["lang"] ?? "none")", type: "Info")
tracks.append(track)
}
Logger.shared.log("getSubtitleTracks: returning \(tracks.count) subtitle tracks", type: "Info")
return tracks
}
func setSubtitleTrack(_ trackId: Int) {
Logger.shared.log("setSubtitleTrack: setting sid to \(trackId)", type: "Info")
guard mpv != nil else {
Logger.shared.log("setSubtitleTrack: mpv handle is nil!", type: "Error")
return
}
if trackId < 0 {
setProperty(name: "sid", value: "no")
} else {
setProperty(name: "sid", value: String(trackId))
}
}
func disableSubtitles() {
setProperty(name: "sid", value: "no")
}
func getCurrentSubtitleTrack() -> Int {
guard let handle = mpv else { return 0 }
var sid: Int64 = 0
getProperty(handle: handle, name: "sid", format: MPV_FORMAT_INT64, value: &sid)
return Int(sid)
}
func addSubtitleFile(url: String, select: Bool = true) {
guard let handle = mpv else { return }
let flag = select ? "select" : "cached"
commandSync(handle, ["sub-add", url, flag])
}
// MARK: - Subtitle Positioning
func setSubtitlePosition(_ position: Int) {
setProperty(name: "sub-pos", value: String(position))
}
func setSubtitleScale(_ scale: Double) {
setProperty(name: "sub-scale", value: String(scale))
}
func setSubtitleMarginY(_ margin: Int) {
setProperty(name: "sub-margin-y", value: String(margin))
}
func setSubtitleAlignX(_ alignment: String) {
setProperty(name: "sub-align-x", value: alignment)
}
func setSubtitleAlignY(_ alignment: String) {
setProperty(name: "sub-align-y", value: alignment)
}
func setSubtitleFontSize(_ size: Int) {
setProperty(name: "sub-font-size", value: String(size))
}
// MARK: - Audio Track Controls
func getAudioTracks() -> [[String: Any]] {
guard let handle = mpv else {
Logger.shared.log("getAudioTracks: mpv handle is nil", type: "Warn")
return []
}
var tracks: [[String: Any]] = []
var trackCount: Int64 = 0
getProperty(handle: handle, name: "track-list/count", format: MPV_FORMAT_INT64, value: &trackCount)
for i in 0..<trackCount {
guard let trackType = getStringProperty(handle: handle, name: "track-list/\(i)/type"),
trackType == "audio" else { continue }
var trackId: Int64 = 0
getProperty(handle: handle, name: "track-list/\(i)/id", format: MPV_FORMAT_INT64, value: &trackId)
var track: [String: Any] = ["id": Int(trackId)]
if let title = getStringProperty(handle: handle, name: "track-list/\(i)/title") {
track["title"] = title
}
if let lang = getStringProperty(handle: handle, name: "track-list/\(i)/lang") {
track["lang"] = lang
}
if let codec = getStringProperty(handle: handle, name: "track-list/\(i)/codec") {
track["codec"] = codec
}
var channels: Int64 = 0
getProperty(handle: handle, name: "track-list/\(i)/audio-channels", format: MPV_FORMAT_INT64, value: &channels)
if channels > 0 {
track["channels"] = Int(channels)
}
var selected: Int32 = 0
getProperty(handle: handle, name: "track-list/\(i)/selected", format: MPV_FORMAT_FLAG, value: &selected)
track["selected"] = selected != 0
Logger.shared.log("getAudioTracks: found audio track id=\(trackId), title=\(track["title"] ?? "none"), lang=\(track["lang"] ?? "none")", type: "Info")
tracks.append(track)
}
Logger.shared.log("getAudioTracks: returning \(tracks.count) audio tracks", type: "Info")
return tracks
}
func setAudioTrack(_ trackId: Int) {
guard mpv != nil else {
Logger.shared.log("setAudioTrack: mpv handle is nil", type: "Warn")
return
}
Logger.shared.log("setAudioTrack: setting aid to \(trackId)", type: "Info")
setProperty(name: "aid", value: String(trackId))
}
func getCurrentAudioTrack() -> Int {
guard let handle = mpv else { return 0 }
var aid: Int64 = 0
getProperty(handle: handle, name: "aid", format: MPV_FORMAT_INT64, value: &aid)
return Int(aid)
}
}
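For orientation, a minimal sketch of how a host object is expected to drive this renderer. It is not part of the commit: the driver class name, the resume position, and the Authorization header are illustrative, and it assumes MPVLayerRendererDelegate consists of the five callbacks used above (PlayerPreset and MPVLayerRendererDelegate are defined elsewhere in the module).
import AVFoundation
final class RendererDriver: MPVLayerRendererDelegate {
    private let layer = AVSampleBufferDisplayLayer()
    private lazy var renderer = MPVLayerRenderer(displayLayer: layer)
    func startPlayback(url: URL, preset: PlayerPreset, resumeAt seconds: Double) throws {
        renderer.delegate = self
        try renderer.start()                                  // creates and initializes the mpv handle
        renderer.load(url: url,
                      with: preset,
                      headers: ["Authorization": "Bearer <token>"],   // illustrative header
                      startPosition: seconds)
        renderer.play()
    }
    // MARK: - MPVLayerRendererDelegate
    func renderer(_: MPVLayerRenderer, didUpdatePosition position: Double, duration: Double) {
        // forward to UI / progress reporting
    }
    func renderer(_: MPVLayerRenderer, didChangePause isPaused: Bool) {}
    func renderer(_: MPVLayerRenderer, didChangeLoading isLoading: Bool) {}
    func renderer(_: MPVLayerRenderer, didBecomeReadyToSeek: Bool) {
        // safe to call seek(to:) from here onwards
    }
    func renderer(_: MPVLayerRenderer, didBecomeTracksReady: Bool) {
        _ = renderer.getSubtitleTracks()                      // track list is populated at this point
    }
}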

File diff suppressed because it is too large

View File

@@ -13,16 +13,21 @@ Pod::Spec.new do |s|
   s.static_framework = true
   s.dependency 'ExpoModulesCore'
-  s.dependency 'MPVKit', '~> 0.40.0'
+  s.dependency 'MPVKit-GPL'
   # Swift/Objective-C compatibility
   s.pod_target_xcconfig = {
     'DEFINES_MODULE' => 'YES',
-    # Strip debug symbols to avoid DWARF errors from MPVKit
+    'VALID_ARCHS' => 'arm64',
+    'EXCLUDED_ARCHS[sdk=iphonesimulator*]' => 'i386',
     'DEBUG_INFORMATION_FORMAT' => 'dwarf',
     'STRIP_INSTALLED_PRODUCT' => 'YES',
     'DEPLOYMENT_POSTPROCESSING' => 'YES',
   }
-  s.source_files = "**/*.{h,m,mm,swift,hpp,cpp}"
+  s.user_target_xcconfig = {
+    'EXCLUDED_ARCHS[sdk=iphonesimulator*]' => 'i386'
+  }
+  s.source_files = "*.{h,m,mm,swift,hpp,cpp}"
 end

View File

@@ -164,6 +164,15 @@ public class MpvPlayerModule: Module {
       return view.getCurrentAudioTrack()
     }
+    // Video scaling functions
+    AsyncFunction("setZoomedToFill") { (view: MpvPlayerView, zoomed: Bool) in
+      view.setZoomedToFill(zoomed)
+    }
+    AsyncFunction("isZoomedToFill") { (view: MpvPlayerView) -> Bool in
+      return view.isZoomedToFill()
+    }
     // Defines events that the view can send to JavaScript
     Events("onLoad", "onPlaybackStateChange", "onProgress", "onError", "onTracksReady")
   }

View File

@@ -38,7 +38,7 @@ struct VideoLoadConfig {
 // to apply the proper styling (e.g. border radius and shadows).
 class MpvPlayerView: ExpoView {
   private let displayLayer = AVSampleBufferDisplayLayer()
-  private var renderer: MPVSoftwareRenderer?
+  private var renderer: MPVLayerRenderer?
   private var videoContainer: UIView!
   private var pipController: PiPController?
@@ -52,6 +52,7 @@ class MpvPlayerView: ExpoView {
   private var cachedPosition: Double = 0
   private var cachedDuration: Double = 0
   private var intendedPlayState: Bool = false // For PiP - ignores transient states during seek
+  private var _isZoomedToFill: Bool = false
   required init(appContext: AppContext? = nil) {
     super.init(appContext: appContext)
@@ -83,7 +84,7 @@ class MpvPlayerView: ExpoView {
       videoContainer.bottomAnchor.constraint(equalTo: bottomAnchor)
     ])
-    renderer = MPVSoftwareRenderer(displayLayer: displayLayer)
+    renderer = MPVLayerRenderer(displayLayer: displayLayer)
     renderer?.delegate = self
     // Setup PiP
@@ -148,12 +149,14 @@ class MpvPlayerView: ExpoView {
   func play() {
     intendedPlayState = true
     renderer?.play()
+    pipController?.setPlaybackRate(1.0)
     pipController?.updatePlaybackState()
   }
   func pause() {
     intendedPlayState = false
     renderer?.pausePlayback()
+    pipController?.setPlaybackRate(0.0)
     pipController?.updatePlaybackState()
   }
@@ -267,6 +270,17 @@ class MpvPlayerView: ExpoView {
     renderer?.setSubtitleFontSize(size)
   }
+  // MARK: - Video Scaling
+  func setZoomedToFill(_ zoomed: Bool) {
+    _isZoomedToFill = zoomed
+    displayLayer.videoGravity = zoomed ? .resizeAspectFill : .resizeAspect
+  }
+  func isZoomedToFill() -> Bool {
+    return _isZoomedToFill
+  }
   deinit {
     pipController?.stopPictureInPicture()
     renderer?.stop()
@@ -274,18 +288,18 @@ class MpvPlayerView: ExpoView {
   }
 }
-// MARK: - MPVSoftwareRendererDelegate
-extension MpvPlayerView: MPVSoftwareRendererDelegate {
-  func renderer(_: MPVSoftwareRenderer, didUpdatePosition position: Double, duration: Double) {
+// MARK: - MPVLayerRendererDelegate
+extension MpvPlayerView: MPVLayerRendererDelegate {
+  func renderer(_: MPVLayerRenderer, didUpdatePosition position: Double, duration: Double) {
     cachedPosition = position
     cachedDuration = duration
     DispatchQueue.main.async { [weak self] in
       guard let self else { return }
-      // Only update PiP state when PiP is active
+      // Update PiP current time for progress bar
       if self.pipController?.isPictureInPictureActive == true {
-        self.pipController?.updatePlaybackState()
+        self.pipController?.setCurrentTimeFromSeconds(position, duration: duration)
       }
       self.onProgress([
@@ -296,21 +310,23 @@ extension MpvPlayerView: MPVSoftwareRendererDelegate {
     }
   }
-  func renderer(_: MPVSoftwareRenderer, didChangePause isPaused: Bool) {
+  func renderer(_: MPVLayerRenderer, didChangePause isPaused: Bool) {
     DispatchQueue.main.async { [weak self] in
       guard let self else { return }
       // Don't update intendedPlayState here - it's only set by user actions (play/pause)
       // This prevents PiP UI flicker during seeking
+      // Sync timebase rate with actual playback state
+      self.pipController?.setPlaybackRate(isPaused ? 0.0 : 1.0)
       self.onPlaybackStateChange([
         "isPaused": isPaused,
         "isPlaying": !isPaused,
       ])
+      // Note: Don't call updatePlaybackState() here to avoid flicker
+      // PiP queries pipControllerIsPlaying when it needs the state
     }
   }
-  func renderer(_: MPVSoftwareRenderer, didChangeLoading isLoading: Bool) {
+  func renderer(_: MPVLayerRenderer, didChangeLoading isLoading: Bool) {
     DispatchQueue.main.async { [weak self] in
       guard let self else { return }
       self.onPlaybackStateChange([
@@ -319,7 +335,7 @@ extension MpvPlayerView: MPVSoftwareRendererDelegate {
     }
   }
-  func renderer(_: MPVSoftwareRenderer, didBecomeReadyToSeek: Bool) {
+  func renderer(_: MPVLayerRenderer, didBecomeReadyToSeek: Bool) {
     DispatchQueue.main.async { [weak self] in
       guard let self else { return }
       self.onPlaybackStateChange([
@@ -328,7 +344,7 @@ extension MpvPlayerView: MPVSoftwareRendererDelegate {
     }
   }
-  func renderer(_: MPVSoftwareRenderer, didBecomeTracksReady: Bool) {
+  func renderer(_: MPVLayerRenderer, didBecomeTracksReady: Bool) {
     DispatchQueue.main.async { [weak self] in
       guard let self else { return }
       self.onTracksReady([:])
@@ -343,12 +359,14 @@ extension MpvPlayerView: PiPControllerDelegate {
     print("PiP will start")
     // Sync timebase before PiP starts for smooth transition
     renderer?.syncTimebase()
-    pipController?.updatePlaybackState()
+    // Set current time for PiP progress bar
+    pipController?.setCurrentTimeFromSeconds(cachedPosition, duration: cachedDuration)
   }
   func pipController(_ controller: PiPController, didStartPictureInPicture: Bool) {
     print("PiP did start: \(didStartPictureInPicture)")
-    pipController?.updatePlaybackState()
+    // Ensure current time is synced when PiP starts
+    pipController?.setCurrentTimeFromSeconds(cachedPosition, duration: cachedDuration)
   }
   func pipController(_ controller: PiPController, willStopPictureInPicture: Bool) {
@@ -371,12 +389,16 @@ extension MpvPlayerView: PiPControllerDelegate {
   func pipControllerPlay(_ controller: PiPController) {
     print("PiP play requested")
-    play()
+    intendedPlayState = true
+    renderer?.play()
+    pipController?.setPlaybackRate(1.0)
   }
   func pipControllerPause(_ controller: PiPController) {
     print("PiP pause requested")
-    pause()
+    intendedPlayState = false
+    renderer?.pausePlayback()
+    pipController?.setPlaybackRate(0.0)
   }
   func pipController(_ controller: PiPController, skipByInterval interval: CMTime) {
@@ -394,4 +416,8 @@ extension MpvPlayerView: PiPControllerDelegate {
   func pipControllerDuration(_ controller: PiPController) -> Double {
     return getDuration()
   }
+  func pipControllerCurrentPosition(_ controller: PiPController) -> Double {
+    return getCurrentPosition()
+  }
 }

View File

@@ -12,6 +12,7 @@ protocol PiPControllerDelegate: AnyObject {
   func pipController(_ controller: PiPController, skipByInterval interval: CMTime)
   func pipControllerIsPlaying(_ controller: PiPController) -> Bool
   func pipControllerDuration(_ controller: PiPController) -> Double
+  func pipControllerCurrentPosition(_ controller: PiPController) -> Double
 }
 final class PiPController: NSObject {
@@ -20,6 +21,13 @@ final class PiPController: NSObject {
   weak var delegate: PiPControllerDelegate?
+  // Timebase for PiP progress tracking
+  private var timebase: CMTimebase?
+  // Track current time for PiP progress
+  private var currentTime: CMTime = .zero
+  private var currentDuration: Double = 0
   var isPictureInPictureSupported: Bool {
     return AVPictureInPictureController.isPictureInPictureSupported()
   }
@@ -35,9 +43,29 @@ final class PiPController: NSObject {
   init(sampleBufferDisplayLayer: AVSampleBufferDisplayLayer) {
     self.sampleBufferDisplayLayer = sampleBufferDisplayLayer
     super.init()
+    setupTimebase()
     setupPictureInPicture()
   }
+  private func setupTimebase() {
+    // Create a timebase for tracking playback time
+    var newTimebase: CMTimebase?
+    let status = CMTimebaseCreateWithSourceClock(
+      allocator: kCFAllocatorDefault,
+      sourceClock: CMClockGetHostTimeClock(),
+      timebaseOut: &newTimebase
+    )
+    if status == noErr, let tb = newTimebase {
+      timebase = tb
+      CMTimebaseSetTime(tb, time: .zero)
+      CMTimebaseSetRate(tb, rate: 0) // Start paused
+      // Set the control timebase on the display layer
+      sampleBufferDisplayLayer?.controlTimebase = tb
+    }
+  }
   private func setupPictureInPicture() {
     guard isPictureInPictureSupported,
           let displayLayer = sampleBufferDisplayLayer else {
@@ -81,6 +109,9 @@ final class PiPController: NSObject {
   }
   func updatePlaybackState() {
+    // Only invalidate when PiP is active to avoid "no context menu visible" warnings
+    guard isPictureInPictureActive else { return }
     if Thread.isMainThread {
       pipController?.invalidatePlaybackState()
     } else {
@@ -89,6 +120,36 @@ final class PiPController: NSObject {
       }
     }
   }
+  /// Updates the current playback time for PiP progress display
+  func setCurrentTime(_ time: CMTime) {
+    currentTime = time
+    // Update the timebase to reflect current position
+    if let tb = timebase {
+      CMTimebaseSetTime(tb, time: time)
+    }
+    // Only invalidate when PiP is active to avoid unnecessary updates
+    if isPictureInPictureActive {
+      updatePlaybackState()
+    }
+  }
+  /// Updates the current playback time from seconds
+  func setCurrentTimeFromSeconds(_ seconds: Double, duration: Double) {
+    guard seconds >= 0 else { return }
+    currentDuration = duration
+    let time = CMTime(seconds: seconds, preferredTimescale: 1000)
+    setCurrentTime(time)
+  }
+  /// Updates the playback rate on the timebase (1.0 = playing, 0.0 = paused)
+  func setPlaybackRate(_ rate: Float) {
+    if let tb = timebase {
+      CMTimebaseSetRate(tb, rate: Float64(rate))
+    }
+  }
 }
 // MARK: - AVPictureInPictureControllerDelegate
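Taken together with the MpvPlayerView changes above, the PiP progress plumbing reduces to roughly the following. This is a summary sketch, not code from the commit; PiPProgressBridge is a hypothetical name and its two methods stand in for the delegate callbacks in MpvPlayerView.
import AVFoundation
final class PiPProgressBridge {
    private let pip: PiPController   // the controller defined above, which owns the CMTimebase
    init(pip: PiPController) { self.pip = pip }
    // Mirrors renderer(_:didUpdatePosition:duration:) - keeps the PiP scrubber in sync
    // while the PiP window is on screen.
    func progressDidChange(position: Double, duration: Double) {
        guard pip.isPictureInPictureActive else { return }
        pip.setCurrentTimeFromSeconds(position, duration: duration)
    }
    // Mirrors renderer(_:didChangePause:) - a timebase rate of 1.0 advances the clock,
    // 0.0 freezes it, so the system PiP UI reflects play/pause without polling mpv.
    func pauseDidChange(isPaused: Bool) {
        pip.setPlaybackRate(isPaused ? 0.0 : 1.0)
    }
}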

View File

@@ -86,6 +86,9 @@ export interface MpvPlayerViewRef {
   getAudioTracks: () => Promise<AudioTrack[]>;
   setAudioTrack: (trackId: number) => Promise<void>;
   getCurrentAudioTrack: () => Promise<number>;
+  // Video scaling
+  setZoomedToFill: (zoomed: boolean) => Promise<void>;
+  isZoomedToFill: () => Promise<boolean>;
 }
 export type SubtitleTrack = {

View File

@@ -94,6 +94,13 @@ export default React.forwardRef<MpvPlayerViewRef, MpvPlayerViewProps>(
       getCurrentAudioTrack: async () => {
        return await nativeRef.current?.getCurrentAudioTrack();
       },
+      // Video scaling
+      setZoomedToFill: async (zoomed: boolean) => {
+        await nativeRef.current?.setZoomedToFill(zoomed);
+      },
+      isZoomedToFill: async () => {
+        return await nativeRef.current?.isZoomedToFill();
+      },
     }));
     return <NativeView ref={nativeRef} {...props} />;

View File

@@ -1,14 +1,14 @@
import { MpvPlayerViewProps } from "./MpvPlayer.types"; import { MpvPlayerViewProps } from "./MpvPlayer.types";
export default function MpvPlayerView(props: MpvPlayerViewProps) { export default function MpvPlayerView(props: MpvPlayerViewProps) {
const url = props.source?.url; const url = props.source?.url ?? "";
return ( return (
<div> <div>
<iframe <iframe
title='MPV Player' title='MPV Player'
style={{ flex: 1 }} style={{ flex: 1 }}
src={url} src={url}
onLoad={() => props.onLoad?.({ nativeEvent: { url: url ?? "" } })} onLoad={() => props.onLoad?.({ nativeEvent: { url } })}
/> />
</div> </div>
); );

View File

@@ -1,71 +0,0 @@
apply plugin: 'com.android.library'
apply plugin: 'kotlin-android'
apply plugin: 'maven-publish'
group = 'expo.modules.sfplayer'
version = '1.0.0'
buildscript {
def expoModulesCorePlugin = new File(project(":expo-modules-core").projectDir.absolutePath, "ExpoModulesCorePlugin.gradle")
if (expoModulesCorePlugin.exists()) {
apply from: expoModulesCorePlugin
applyKotlinExpoModulesCorePlugin()
}
}
afterEvaluate {
publishing {
publications {
release(MavenPublication) {
from components.release
}
}
repositories {
maven {
url = mavenLocal().url
}
}
}
}
android {
compileSdkVersion safeExtGet("compileSdkVersion", 34)
def agpVersion = com.android.Version.ANDROID_GRADLE_PLUGIN_VERSION
if (agpVersion.tokenize('.')[0].toInteger() < 8) {
compileOptions {
sourceCompatibility JavaVersion.VERSION_11
targetCompatibility JavaVersion.VERSION_11
}
kotlinOptions {
jvmTarget = JavaVersion.VERSION_11.majorVersion
}
}
namespace "expo.modules.sfplayer"
defaultConfig {
minSdkVersion safeExtGet("minSdkVersion", 23)
targetSdkVersion safeExtGet("targetSdkVersion", 34)
}
lintOptions {
abortOnError false
}
publishing {
singleVariant("release") {
withSourcesJar()
}
}
}
repositories {
mavenCentral()
}
dependencies {
implementation project(':expo-modules-core')
}
def safeExtGet(prop, fallback) {
rootProject.ext.has(prop) ? rootProject.ext.get(prop) : fallback
}

View File

@@ -1,2 +0,0 @@
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
</manifest>

View File

@@ -1,120 +0,0 @@
package expo.modules.sfplayer
import expo.modules.kotlin.modules.Module
import expo.modules.kotlin.modules.ModuleDefinition
class SfPlayerModule : Module() {
override fun definition() = ModuleDefinition {
Name("SfPlayer")
View(SfPlayerView::class) {
Prop("source") { view: SfPlayerView, source: Map<String, Any>? ->
// Android stub - KSPlayer is iOS only
}
AsyncFunction("play") { view: SfPlayerView ->
}
AsyncFunction("pause") { view: SfPlayerView ->
}
AsyncFunction("seekTo") { view: SfPlayerView, position: Double ->
}
AsyncFunction("seekBy") { view: SfPlayerView, offset: Double ->
}
AsyncFunction("setSpeed") { view: SfPlayerView, speed: Double ->
}
AsyncFunction("getSpeed") { view: SfPlayerView ->
1.0
}
AsyncFunction("isPaused") { view: SfPlayerView ->
true
}
AsyncFunction("getCurrentPosition") { view: SfPlayerView ->
0.0
}
AsyncFunction("getDuration") { view: SfPlayerView ->
0.0
}
AsyncFunction("startPictureInPicture") { view: SfPlayerView ->
}
AsyncFunction("stopPictureInPicture") { view: SfPlayerView ->
}
AsyncFunction("isPictureInPictureSupported") { view: SfPlayerView ->
false
}
AsyncFunction("isPictureInPictureActive") { view: SfPlayerView ->
false
}
AsyncFunction("getSubtitleTracks") { view: SfPlayerView ->
emptyList<Map<String, Any>>()
}
AsyncFunction("setSubtitleTrack") { view: SfPlayerView, trackId: Int ->
}
AsyncFunction("disableSubtitles") { view: SfPlayerView ->
}
AsyncFunction("getCurrentSubtitleTrack") { view: SfPlayerView ->
0
}
AsyncFunction("addSubtitleFile") { view: SfPlayerView, url: String, select: Boolean ->
}
AsyncFunction("setSubtitlePosition") { view: SfPlayerView, position: Int ->
}
AsyncFunction("setSubtitleScale") { view: SfPlayerView, scale: Double ->
}
AsyncFunction("setSubtitleMarginY") { view: SfPlayerView, margin: Int ->
}
AsyncFunction("setSubtitleAlignX") { view: SfPlayerView, alignment: String ->
}
AsyncFunction("setSubtitleAlignY") { view: SfPlayerView, alignment: String ->
}
AsyncFunction("setSubtitleFontSize") { view: SfPlayerView, size: Int ->
}
AsyncFunction("getAudioTracks") { view: SfPlayerView ->
emptyList<Map<String, Any>>()
}
AsyncFunction("setAudioTrack") { view: SfPlayerView, trackId: Int ->
}
AsyncFunction("getCurrentAudioTrack") { view: SfPlayerView ->
0
}
AsyncFunction("setVideoZoomToFill") { view: SfPlayerView, enabled: Boolean ->
}
AsyncFunction("getVideoZoomToFill") { view: SfPlayerView ->
false
}
AsyncFunction("setAutoPipEnabled") { view: SfPlayerView, enabled: Boolean ->
}
Events("onLoad", "onPlaybackStateChange", "onProgress", "onError", "onTracksReady", "onPictureInPictureChange")
}
}
}

View File

@@ -1,29 +0,0 @@
package expo.modules.sfplayer
import android.content.Context
import android.view.View
import android.widget.FrameLayout
import expo.modules.kotlin.AppContext
import expo.modules.kotlin.views.ExpoView
class SfPlayerView(context: Context, appContext: AppContext) : ExpoView(context, appContext) {
private val placeholder: View = View(context).apply {
setBackgroundColor(android.graphics.Color.BLACK)
layoutParams = FrameLayout.LayoutParams(
FrameLayout.LayoutParams.MATCH_PARENT,
FrameLayout.LayoutParams.MATCH_PARENT
)
}
init {
addView(placeholder)
}
}

View File

@@ -1,9 +0,0 @@
{
"platforms": ["ios", "tvos", "android"],
"ios": {
"modules": ["SfPlayerModule"]
},
"android": {
"modules": ["expo.modules.sfplayer.SfPlayerModule"]
}
}

View File

@@ -1 +0,0 @@
export * from "./src";

View File

@@ -1,32 +0,0 @@
Pod::Spec.new do |s|
s.name = 'SfPlayer'
s.module_name = 'SfPlayer'
s.version = '1.0.0'
s.summary = 'Streamyfin Player - KSPlayer wrapper for Expo'
s.description = 'Video player with GPU acceleration and PiP support for Expo, powered by KSPlayer'
s.author = 'streamyfin'
s.homepage = 'https://github.com/streamyfin/streamyfin'
s.license = { :type => 'MPL-2.0' }
s.platforms = {
:ios => '15.1',
:tvos => '15.1'
}
s.source = { git: 'https://github.com/streamyfin/streamyfin.git' }
s.static_framework = true
s.swift_version = '5.9'
s.dependency 'ExpoModulesCore'
s.dependency 'KSPlayer'
s.dependency 'DisplayCriteria'
# KSPlayer pods are injected into the Podfile via plugins/withKSPlayer.js
s.pod_target_xcconfig = {
'DEFINES_MODULE' => 'YES',
'DEBUG_INFORMATION_FORMAT' => 'dwarf',
'STRIP_INSTALLED_PRODUCT' => 'YES',
'DEPLOYMENT_POSTPROCESSING' => 'YES',
}
s.source_files = "**/*.{h,m,mm,swift,hpp,cpp}"
end

View File

@@ -1,179 +0,0 @@
import ExpoModulesCore
public class SfPlayerModule: Module {
public func definition() -> ModuleDefinition {
Name("SfPlayer")
// Module-level functions (not tied to a specific view instance)
Function("setHardwareDecode") { (enabled: Bool) in
SfPlayerView.setHardwareDecode(enabled)
}
Function("getHardwareDecode") { () -> Bool in
return SfPlayerView.getHardwareDecode()
}
// Enables the module to be used as a native view
View(SfPlayerView.self) {
// All video load options are passed via a single "source" prop
Prop("source") { (view: SfPlayerView, source: [String: Any]?) in
guard let source = source,
let urlString = source["url"] as? String,
let videoURL = URL(string: urlString) else { return }
let config = VideoLoadConfig(
url: videoURL,
headers: source["headers"] as? [String: String],
externalSubtitles: source["externalSubtitles"] as? [String],
startPosition: source["startPosition"] as? Double,
autoplay: (source["autoplay"] as? Bool) ?? true,
initialSubtitleId: source["initialSubtitleId"] as? Int,
initialAudioId: source["initialAudioId"] as? Int
)
view.loadVideo(config: config)
}
// Playback controls
AsyncFunction("play") { (view: SfPlayerView) in
view.play()
}
AsyncFunction("pause") { (view: SfPlayerView) in
view.pause()
}
AsyncFunction("seekTo") { (view: SfPlayerView, position: Double) in
view.seekTo(position: position)
}
AsyncFunction("seekBy") { (view: SfPlayerView, offset: Double) in
view.seekBy(offset: offset)
}
AsyncFunction("setSpeed") { (view: SfPlayerView, speed: Double) in
view.setSpeed(speed: speed)
}
AsyncFunction("getSpeed") { (view: SfPlayerView) -> Double in
return view.getSpeed()
}
AsyncFunction("isPaused") { (view: SfPlayerView) -> Bool in
return view.isPaused()
}
AsyncFunction("getCurrentPosition") { (view: SfPlayerView) -> Double in
return view.getCurrentPosition()
}
AsyncFunction("getDuration") { (view: SfPlayerView) -> Double in
return view.getDuration()
}
// Picture in Picture
AsyncFunction("startPictureInPicture") { (view: SfPlayerView) in
view.startPictureInPicture()
}
AsyncFunction("stopPictureInPicture") { (view: SfPlayerView) in
view.stopPictureInPicture()
}
AsyncFunction("isPictureInPictureSupported") { (view: SfPlayerView) -> Bool in
return view.isPictureInPictureSupported()
}
AsyncFunction("isPictureInPictureActive") { (view: SfPlayerView) -> Bool in
return view.isPictureInPictureActive()
}
AsyncFunction("setAutoPipEnabled") { (view: SfPlayerView, enabled: Bool) in
view.setAutoPipEnabled(enabled)
}
// Subtitle functions
AsyncFunction("getSubtitleTracks") { (view: SfPlayerView) -> [[String: Any]] in
return view.getSubtitleTracks()
}
AsyncFunction("setSubtitleTrack") { (view: SfPlayerView, trackId: Int) in
view.setSubtitleTrack(trackId)
}
AsyncFunction("disableSubtitles") { (view: SfPlayerView) in
view.disableSubtitles()
}
AsyncFunction("getCurrentSubtitleTrack") { (view: SfPlayerView) -> Int in
return view.getCurrentSubtitleTrack()
}
AsyncFunction("addSubtitleFile") { (view: SfPlayerView, url: String, select: Bool) in
view.addSubtitleFile(url: url, select: select)
}
// Subtitle positioning
AsyncFunction("setSubtitlePosition") { (view: SfPlayerView, position: Int) in
view.setSubtitlePosition(position)
}
AsyncFunction("setSubtitleScale") { (view: SfPlayerView, scale: Double) in
view.setSubtitleScale(scale)
}
AsyncFunction("setSubtitleMarginY") { (view: SfPlayerView, margin: Int) in
view.setSubtitleMarginY(margin)
}
AsyncFunction("setSubtitleAlignX") { (view: SfPlayerView, alignment: String) in
view.setSubtitleAlignX(alignment)
}
AsyncFunction("setSubtitleAlignY") { (view: SfPlayerView, alignment: String) in
view.setSubtitleAlignY(alignment)
}
AsyncFunction("setSubtitleFontSize") { (view: SfPlayerView, size: Int) in
view.setSubtitleFontSize(size)
}
AsyncFunction("setSubtitleColor") { (view: SfPlayerView, hexColor: String) in
view.setSubtitleColor(hexColor)
}
AsyncFunction("setSubtitleBackgroundColor") { (view: SfPlayerView, hexColor: String) in
view.setSubtitleBackgroundColor(hexColor)
}
AsyncFunction("setSubtitleFontName") { (view: SfPlayerView, fontName: String) in
view.setSubtitleFontName(fontName)
}
// Audio track functions
AsyncFunction("getAudioTracks") { (view: SfPlayerView) -> [[String: Any]] in
return view.getAudioTracks()
}
AsyncFunction("setAudioTrack") { (view: SfPlayerView, trackId: Int) in
view.setAudioTrack(trackId)
}
AsyncFunction("getCurrentAudioTrack") { (view: SfPlayerView) -> Int in
return view.getCurrentAudioTrack()
}
// Video zoom
AsyncFunction("setVideoZoomToFill") { (view: SfPlayerView, enabled: Bool) in
view.setVideoZoomToFill(enabled)
}
AsyncFunction("getVideoZoomToFill") { (view: SfPlayerView) -> Bool in
return view.getVideoZoomToFill()
}
// Events that the view can send to JavaScript
Events("onLoad", "onPlaybackStateChange", "onProgress", "onError", "onTracksReady", "onPictureInPictureChange")
}
}
}

View File

@@ -1,317 +0,0 @@
import AVFoundation
import ExpoModulesCore
import UIKit
class SfPlayerView: ExpoView {
private var player: SfPlayerWrapper?
private var videoContainer: UIView!
let onLoad = EventDispatcher()
let onPlaybackStateChange = EventDispatcher()
let onProgress = EventDispatcher()
let onError = EventDispatcher()
let onTracksReady = EventDispatcher()
let onPictureInPictureChange = EventDispatcher()
private var currentURL: URL?
private var cachedPosition: Double = 0
private var cachedDuration: Double = 0
private var intendedPlayState: Bool = false
required init(appContext: AppContext? = nil) {
super.init(appContext: appContext)
setupView()
}
private func setupView() {
clipsToBounds = true
backgroundColor = .black
videoContainer = UIView()
videoContainer.translatesAutoresizingMaskIntoConstraints = false
videoContainer.backgroundColor = .black
videoContainer.clipsToBounds = true
addSubview(videoContainer)
NSLayoutConstraint.activate([
videoContainer.topAnchor.constraint(equalTo: topAnchor),
videoContainer.leadingAnchor.constraint(equalTo: leadingAnchor),
videoContainer.trailingAnchor.constraint(equalTo: trailingAnchor),
videoContainer.bottomAnchor.constraint(equalTo: bottomAnchor)
])
// Initialize player
player = SfPlayerWrapper()
player?.delegate = self
// Configure Audio Session for PiP and background playback
try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .moviePlayback)
try? AVAudioSession.sharedInstance().setActive(true)
// Add player view to container
if let playerView = player?.view {
playerView.translatesAutoresizingMaskIntoConstraints = false
videoContainer.addSubview(playerView)
NSLayoutConstraint.activate([
playerView.topAnchor.constraint(equalTo: videoContainer.topAnchor),
playerView.leadingAnchor.constraint(equalTo: videoContainer.leadingAnchor),
playerView.trailingAnchor.constraint(equalTo: videoContainer.trailingAnchor),
playerView.bottomAnchor.constraint(equalTo: videoContainer.bottomAnchor)
])
}
}
override func layoutSubviews() {
super.layoutSubviews()
player?.updateLayout(bounds: videoContainer.bounds)
}
// MARK: - Video Loading
func loadVideo(config: VideoLoadConfig) {
// Skip reload if same URL is already playing
if currentURL == config.url {
return
}
currentURL = config.url
player?.load(config: config)
if config.autoplay {
play()
}
onLoad(["url": config.url.absoluteString])
}
func loadVideo(url: URL, headers: [String: String]? = nil) {
loadVideo(config: VideoLoadConfig(url: url, headers: headers))
}
// MARK: - Playback Controls
func play() {
intendedPlayState = true
player?.play()
}
func pause() {
intendedPlayState = false
player?.pause()
}
func seekTo(position: Double) {
player?.seek(to: position)
}
func seekBy(offset: Double) {
player?.seek(by: offset)
}
func setSpeed(speed: Double) {
player?.setSpeed(speed)
}
func getSpeed() -> Double {
return player?.getSpeed() ?? 1.0
}
func isPaused() -> Bool {
return player?.getIsPaused() ?? true
}
func getCurrentPosition() -> Double {
return cachedPosition
}
func getDuration() -> Double {
return cachedDuration
}
// MARK: - Picture in Picture
func startPictureInPicture() {
player?.startPictureInPicture()
}
func stopPictureInPicture() {
player?.stopPictureInPicture()
}
func isPictureInPictureSupported() -> Bool {
return player?.isPictureInPictureSupported() ?? false
}
func isPictureInPictureActive() -> Bool {
return player?.isPictureInPictureActive() ?? false
}
func setAutoPipEnabled(_ enabled: Bool) {
player?.setAutoPipEnabled(enabled)
}
// MARK: - Subtitle Controls
func getSubtitleTracks() -> [[String: Any]] {
return player?.getSubtitleTracks() ?? []
}
func setSubtitleTrack(_ trackId: Int) {
player?.setSubtitleTrack(trackId)
}
func disableSubtitles() {
player?.disableSubtitles()
}
func getCurrentSubtitleTrack() -> Int {
return player?.getCurrentSubtitleTrack() ?? 0
}
func addSubtitleFile(url: String, select: Bool = true) {
player?.addSubtitleFile(url: url, select: select)
}
// MARK: - Subtitle Positioning
func setSubtitlePosition(_ position: Int) {
player?.setSubtitlePosition(position)
}
func setSubtitleScale(_ scale: Double) {
player?.setSubtitleScale(scale)
}
func setSubtitleMarginY(_ margin: Int) {
player?.setSubtitleMarginY(margin)
}
func setSubtitleAlignX(_ alignment: String) {
player?.setSubtitleAlignX(alignment)
}
func setSubtitleAlignY(_ alignment: String) {
player?.setSubtitleAlignY(alignment)
}
func setSubtitleFontSize(_ size: Int) {
player?.setSubtitleFontSize(size)
}
func setSubtitleColor(_ hexColor: String) {
player?.setSubtitleColor(hexColor)
}
func setSubtitleBackgroundColor(_ hexColor: String) {
player?.setSubtitleBackgroundColor(hexColor)
}
func setSubtitleFontName(_ fontName: String) {
player?.setSubtitleFontName(fontName)
}
// MARK: - Hardware Decode (static, affects all players)
static func setHardwareDecode(_ enabled: Bool) {
SfPlayerWrapper.setHardwareDecode(enabled)
}
static func getHardwareDecode() -> Bool {
return SfPlayerWrapper.getHardwareDecode()
}
// MARK: - Audio Track Controls
func getAudioTracks() -> [[String: Any]] {
return player?.getAudioTracks() ?? []
}
func setAudioTrack(_ trackId: Int) {
player?.setAudioTrack(trackId)
}
func getCurrentAudioTrack() -> Int {
return player?.getCurrentAudioTrack() ?? 0
}
// MARK: - Video Zoom
func setVideoZoomToFill(_ enabled: Bool) {
player?.setVideoZoomToFill(enabled)
}
func getVideoZoomToFill() -> Bool {
return player?.getVideoZoomToFill() ?? false
}
deinit {
player?.stopPictureInPicture()
}
}
// MARK: - SfPlayerWrapperDelegate
extension SfPlayerView: SfPlayerWrapperDelegate {
func player(_ player: SfPlayerWrapper, didUpdatePosition position: Double, duration: Double) {
cachedPosition = position
cachedDuration = duration
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.onProgress([
"position": position,
"duration": duration,
"progress": duration > 0 ? position / duration : 0,
])
}
}
func player(_ player: SfPlayerWrapper, didChangePause isPaused: Bool) {
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.onPlaybackStateChange([
"isPaused": isPaused,
"isPlaying": !isPaused,
])
}
}
func player(_ player: SfPlayerWrapper, didChangeLoading isLoading: Bool) {
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.onPlaybackStateChange([
"isLoading": isLoading,
])
}
}
func player(_ player: SfPlayerWrapper, didBecomeReadyToSeek: Bool) {
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.onPlaybackStateChange([
"isReadyToSeek": didBecomeReadyToSeek,
])
}
}
func player(_ player: SfPlayerWrapper, didBecomeTracksReady: Bool) {
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.onTracksReady([:])
}
}
func player(_ player: SfPlayerWrapper, didEncounterError error: String) {
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.onError(["error": error])
}
}
func player(_ player: SfPlayerWrapper, didChangePictureInPicture isActive: Bool) {
DispatchQueue.main.async { [weak self] in
guard let self else { return }
self.onPictureInPictureChange(["isActive": isActive])
}
}
}

View File

@@ -1,869 +0,0 @@
import AVFoundation
import AVKit
import KSPlayer
import SwiftUI
import UIKit
protocol SfPlayerWrapperDelegate: AnyObject {
func player(_ player: SfPlayerWrapper, didUpdatePosition position: Double, duration: Double)
func player(_ player: SfPlayerWrapper, didChangePause isPaused: Bool)
func player(_ player: SfPlayerWrapper, didChangeLoading isLoading: Bool)
func player(_ player: SfPlayerWrapper, didBecomeReadyToSeek: Bool)
func player(_ player: SfPlayerWrapper, didBecomeTracksReady: Bool)
func player(_ player: SfPlayerWrapper, didEncounterError error: String)
func player(_ player: SfPlayerWrapper, didChangePictureInPicture isActive: Bool)
}
/// Configuration for loading a video
struct VideoLoadConfig {
let url: URL
var headers: [String: String]?
var externalSubtitles: [String]?
var startPosition: Double?
var autoplay: Bool
var initialSubtitleId: Int?
var initialAudioId: Int?
init(
url: URL,
headers: [String: String]? = nil,
externalSubtitles: [String]? = nil,
startPosition: Double? = nil,
autoplay: Bool = true,
initialSubtitleId: Int? = nil,
initialAudioId: Int? = nil
) {
self.url = url
self.headers = headers
self.externalSubtitles = externalSubtitles
self.startPosition = startPosition
self.autoplay = autoplay
self.initialSubtitleId = initialSubtitleId
self.initialAudioId = initialAudioId
}
}
final class SfPlayerWrapper: NSObject {
// MARK: - Properties
private var playerView: IOSVideoPlayerView?
private var containerView: UIView?
private var cachedPosition: Double = 0
private var cachedDuration: Double = 0
private var isPaused: Bool = true
private var isLoading: Bool = false
private var currentURL: URL?
private var pendingExternalSubtitles: [String] = []
private var initialSubtitleId: Int?
private var initialAudioId: Int?
private var pendingStartPosition: Double?
private var progressTimer: Timer?
private var pipController: AVPictureInPictureController?
/// Scale factor for image-based subtitles (PGS, VOBSUB)
/// Default 1.0 = no scaling; setSubtitleFontSize derives scale from font size
private var subtitleScale: CGFloat = 1.0
/// When true, setSubtitleFontSize won't override the scale (user set explicit value)
private var isScaleExplicitlySet: Bool = false
/// Optional override for subtitle font family
private var subtitleFontName: String?
weak var delegate: SfPlayerWrapperDelegate?
var view: UIView? { containerView }
// MARK: - Initialization
override init() {
super.init()
setupPlayer()
}
deinit {
stopProgressTimer()
playerView?.pause()
playerView = nil
}
// MARK: - Setup
private func setupPlayer() {
// Configure KSPlayer options for hardware acceleration
KSOptions.canBackgroundPlay = true
KSOptions.isAutoPlay = false
KSOptions.isSecondOpen = true
KSOptions.isAccurateSeek = true
KSOptions.hardwareDecode = true
// Create container view
let container = UIView()
container.backgroundColor = .black
container.clipsToBounds = true
containerView = container
}
private func createPlayerView(frame: CGRect) -> IOSVideoPlayerView {
let player = IOSVideoPlayerView()
player.frame = frame
player.delegate = self
// Hide ALL KSPlayer UI elements - we use our own JS controls
player.toolBar.isHidden = true
player.navigationBar.isHidden = true
player.topMaskView.isHidden = true
player.bottomMaskView.isHidden = true
player.loadingIndector.isHidden = false
player.seekToView.isHidden = true
player.replayButton.isHidden = true
player.lockButton.isHidden = true
player.controllerView.isHidden = true
player.titleLabel.isHidden = true
// Ensure subtitle views are visible for rendering
player.subtitleBackView.isHidden = false
player.subtitleLabel.isHidden = false
// Disable all gestures - handled in JS
player.tapGesture.isEnabled = false
player.doubleTapGesture.isEnabled = false
player.panGesture.isEnabled = false
// Disable interaction on hidden elements
player.controllerView.isUserInteractionEnabled = false
applySubtitleFont()
return player
}
// MARK: - Progress Timer
private func startProgressTimer() {
stopProgressTimer()
progressTimer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
self?.updateProgress()
}
}
private func stopProgressTimer() {
progressTimer?.invalidate()
progressTimer = nil
}
private func updateProgress() {
guard let player = playerView?.playerLayer?.player else { return }
let position = player.currentPlaybackTime
let duration = player.duration
if position != cachedPosition || duration != cachedDuration {
cachedPosition = position
cachedDuration = duration
delegate?.player(self, didUpdatePosition: position, duration: duration)
}
}
// MARK: - Public API
func load(config: VideoLoadConfig) {
guard config.url != currentURL else { return }
currentURL = config.url
pendingExternalSubtitles = config.externalSubtitles ?? []
initialSubtitleId = config.initialSubtitleId
initialAudioId = config.initialAudioId
// Store start position to seek after video is ready
if let startPos = config.startPosition, startPos > 0 {
pendingStartPosition = startPos
} else {
pendingStartPosition = nil
}
isLoading = true
delegate?.player(self, didChangeLoading: true)
// Create or reset player view
if playerView == nil, let container = containerView {
let player = createPlayerView(frame: container.bounds)
player.translatesAutoresizingMaskIntoConstraints = false
container.addSubview(player)
// Pin player to all edges of container
NSLayoutConstraint.activate([
player.topAnchor.constraint(equalTo: container.topAnchor),
player.leadingAnchor.constraint(equalTo: container.leadingAnchor),
player.trailingAnchor.constraint(equalTo: container.trailingAnchor),
player.bottomAnchor.constraint(equalTo: container.bottomAnchor)
])
playerView = player
}
// Configure options for this media
let options = KSOptions()
// Set HTTP headers if provided
if let headers = config.headers, !headers.isEmpty {
for (key, value) in headers {
options.appendHeader(["key": key, "value": value])
}
}
// Note: startPosition is handled via explicit seek in readyToPlay callback
// because KSPlayer's options.startPlayTime doesn't work reliably
// Set the URL with options
playerView?.set(url: config.url, options: options)
if config.autoplay {
play()
}
}
func play() {
isPaused = false
playerView?.play()
startProgressTimer()
delegate?.player(self, didChangePause: false)
}
func pause() {
isPaused = true
playerView?.pause()
delegate?.player(self, didChangePause: true)
}
func seek(to seconds: Double) {
let time = max(0, seconds)
let wasPaused = isPaused
cachedPosition = time
playerView?.seek(time: time) { [weak self] finished in
guard let self, finished else { return }
// KSPlayer may auto-resume after seeking, so enforce the intended state
if wasPaused {
self.pause()
}
self.updateProgress()
}
}
func seek(by seconds: Double) {
let newPosition = max(0, cachedPosition + seconds)
seek(to: newPosition)
}
func setSpeed(_ speed: Double) {
playerView?.playerLayer?.player.playbackRate = Float(speed)
}
func getSpeed() -> Double {
return Double(playerView?.playerLayer?.player.playbackRate ?? 1.0)
}
func getCurrentPosition() -> Double {
return cachedPosition
}
func getDuration() -> Double {
return cachedDuration
}
func getIsPaused() -> Bool {
return isPaused
}
// MARK: - Picture in Picture
private func setupPictureInPicture() {
guard AVPictureInPictureController.isPictureInPictureSupported() else { return }
// Get the PiP controller from KSPlayer
guard let pip = playerView?.playerLayer?.player.pipController else { return }
pipController = pip
pip.delegate = self
// Enable automatic PiP when app goes to background (swipe up to home)
if #available(iOS 14.2, *) {
pip.canStartPictureInPictureAutomaticallyFromInline = true
}
}
func startPictureInPicture() {
pipController?.startPictureInPicture()
}
func stopPictureInPicture() {
pipController?.stopPictureInPicture()
}
func isPictureInPictureSupported() -> Bool {
return AVPictureInPictureController.isPictureInPictureSupported()
}
func isPictureInPictureActive() -> Bool {
return pipController?.isPictureInPictureActive ?? false
}
func setAutoPipEnabled(_ enabled: Bool) {
if #available(iOS 14.2, *) {
pipController?.canStartPictureInPictureAutomaticallyFromInline = enabled
}
}
// MARK: - Subtitle Controls
func getSubtitleTracks() -> [[String: Any]] {
var tracks: [[String: Any]] = []
// srtControl.subtitleInfos should contain ALL subtitles KSPlayer knows about
// (both embedded that were auto-detected and external that were added)
if let srtControl = playerView?.srtControl {
let allSubtitles = srtControl.subtitleInfos
let selectedInfo = srtControl.selectedSubtitleInfo
print("[SfPlayer] getSubtitleTracks - srtControl has \(allSubtitles.count) subtitles")
for (index, info) in allSubtitles.enumerated() {
let isSelected = selectedInfo?.subtitleID == info.subtitleID
let trackInfo: [String: Any] = [
"id": index + 1, // 1-based ID
"selected": isSelected,
"title": info.name,
"lang": "",
"source": "srtControl"
]
tracks.append(trackInfo)
print("[SfPlayer] [\(index + 1)]: \(info.name) (selected: \(isSelected))")
}
}
// Also log embedded tracks from player for debugging
if let player = playerView?.playerLayer?.player {
let embeddedTracks = player.tracks(mediaType: .subtitle)
print("[SfPlayer] getSubtitleTracks - player.tracks has \(embeddedTracks.count) embedded tracks")
for (i, track) in embeddedTracks.enumerated() {
print("[SfPlayer] embedded[\(i)]: \(track.name) (enabled: \(track.isEnabled))")
}
}
return tracks
}
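// Shape of one returned entry (illustrative values only, not taken from a real stream):
// ["id": 1, "selected": true, "title": "English", "lang": "", "source": "srtControl"]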
func setSubtitleTrack(_ trackId: Int) {
print("[SfPlayer] setSubtitleTrack called with trackId: \(trackId)")
// Handle disable case
if trackId < 0 {
print("[SfPlayer] Disabling subtitles (trackId < 0)")
disableSubtitles()
return
}
guard let player = playerView?.playerLayer?.player,
let srtControl = playerView?.srtControl else {
print("[SfPlayer] setSubtitleTrack - player or srtControl not available")
return
}
let embeddedTracks = player.tracks(mediaType: .subtitle)
let index = trackId - 1 // Convert to 0-based
print("[SfPlayer] setSubtitleTrack - embedded tracks: \(embeddedTracks.count), srtControl.subtitleInfos: \(srtControl.subtitleInfos.count), index: \(index)")
// Log all available subtitles for debugging
print("[SfPlayer] Available in srtControl:")
for (i, info) in srtControl.subtitleInfos.enumerated() {
print("[SfPlayer] [\(i)]: \(info.name)")
}
// KSPlayer's srtControl might contain all subtitles (embedded + external)
// Try to find and select the subtitle at the given index in srtControl
let allSubtitles = srtControl.subtitleInfos
if index >= 0 && index < allSubtitles.count {
let subtitleInfo = allSubtitles[index]
srtControl.selectedSubtitleInfo = subtitleInfo
playerView?.updateSrt()
print("[SfPlayer] Selected subtitle from srtControl: \(subtitleInfo.name)")
return
}
// Fallback: try selecting embedded track directly via player.select()
// This handles cases where srtControl doesn't have all embedded tracks
if index >= 0 && index < embeddedTracks.count {
let track = embeddedTracks[index]
player.select(track: track)
print("[SfPlayer] Fallback: Selected embedded track via player.select(): \(track.name)")
return
}
print("[SfPlayer] WARNING: index \(index) out of range")
}
func disableSubtitles() {
print("[SfPlayer] disableSubtitles called")
// Clear srtControl selection (handles both embedded and external via srtControl)
playerView?.srtControl.selectedSubtitleInfo = nil
playerView?.updateSrt()
// Also disable any embedded tracks selected via player.select()
if let player = playerView?.playerLayer?.player {
let subtitleTracks = player.tracks(mediaType: .subtitle)
for track in subtitleTracks {
if track.isEnabled {
// KSPlayer doesn't have a direct "disable" - selecting a different track would disable this one
print("[SfPlayer] Note: embedded track '\(track.name)' is still enabled at decoder level")
}
}
}
}
func getCurrentSubtitleTrack() -> Int {
guard let srtControl = playerView?.srtControl,
let selectedInfo = srtControl.selectedSubtitleInfo else {
return 0 // No subtitle selected
}
// Find the selected subtitle in srtControl.subtitleInfos
let allSubtitles = srtControl.subtitleInfos
for (index, info) in allSubtitles.enumerated() {
if info.subtitleID == selectedInfo.subtitleID {
return index + 1 // 1-based ID
}
}
return 0
}
func addSubtitleFile(url: String, select: Bool) {
print("[SfPlayer] addSubtitleFile called with url: \(url), select: \(select)")
guard let subUrl = URL(string: url) else {
print("[SfPlayer] Failed to create URL from string")
return
}
// If player is ready, add directly via srtControl
if let srtControl = playerView?.srtControl {
let subtitleInfo = URLSubtitleInfo(url: subUrl)
srtControl.addSubtitle(info: subtitleInfo)
print("[SfPlayer] Added subtitle via srtControl: \(subtitleInfo.name)")
if select {
srtControl.selectedSubtitleInfo = subtitleInfo
playerView?.updateSrt()
print("[SfPlayer] Selected subtitle: \(subtitleInfo.name)")
}
} else {
// Player not ready yet, queue for later
print("[SfPlayer] Player not ready, queuing subtitle")
pendingExternalSubtitles.append(url)
}
}
// MARK: - Subtitle Positioning
func setSubtitlePosition(_ position: Int) {
// KSPlayer subtitle positioning through options
}
func setSubtitleScale(_ scale: Double) {
subtitleScale = CGFloat(scale)
isScaleExplicitlySet = true
applySubtitleScale()
}
private func applySubtitleScale() {
guard let subtitleBackView = playerView?.subtitleBackView else { return }
// Apply scale transform to subtitle view
// This scales both text and image-based subtitles (PGS, VOBSUB)
subtitleBackView.transform = CGAffineTransform(scaleX: subtitleScale, y: subtitleScale)
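// e.g. subtitleScale = 1.2 enlarges both rendered text and PGS/VOBSUB bitmaps by 20%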
}
func setSubtitleMarginY(_ margin: Int) {
var position = SubtitleModel.textPosition
position.verticalMargin = CGFloat(margin)
SubtitleModel.textPosition = position
playerView?.updateSrt()
}
func setSubtitleAlignX(_ alignment: String) {
var position = SubtitleModel.textPosition
switch alignment.lowercased() {
case "left":
position.horizontalAlign = .leading
case "right":
position.horizontalAlign = .trailing
default:
position.horizontalAlign = .center
}
SubtitleModel.textPosition = position
playerView?.updateSrt()
}
func setSubtitleAlignY(_ alignment: String) {
var position = SubtitleModel.textPosition
switch alignment.lowercased() {
case "top":
position.verticalAlign = .top
case "center":
position.verticalAlign = .center
default:
position.verticalAlign = .bottom
}
SubtitleModel.textPosition = position
playerView?.updateSrt()
}
func setSubtitleFontSize(_ size: Int) {
// Size is now a scale value * 100 (e.g., 100 = 1.0, 60 = 0.6)
// Convert to actual scale for both text and image subtitles
let scale = CGFloat(size) / 100.0
// Set font size for text-based subtitles (SRT, ASS, VTT)
// Base font size ~50pt, scaled by user preference
SubtitleModel.textFontSize = 50.0 * scale
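// Worked example: size = 60 → scale = 0.6 → textFontSize = 30pt (and, unless a scale
// was set explicitly, an image-subtitle scale of 0.6 below)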
// Apply scale for image-based subtitles (PGS, VOBSUB)
// Only if scale wasn't explicitly set via setSubtitleScale
if !isScaleExplicitlySet {
subtitleScale = min(max(scale, 0.3), 1.5) // Clamp to 0.3-1.5
applySubtitleScale()
}
playerView?.updateSrt()
}
func setSubtitleFontName(_ name: String?) {
subtitleFontName = name
applySubtitleFont()
}
func setSubtitleColor(_ hexColor: String) {
if let color = UIColor(hex: hexColor) {
SubtitleModel.textColor = Color(color)
playerView?.subtitleLabel.textColor = color
playerView?.updateSrt()
}
}
func setSubtitleBackgroundColor(_ hexColor: String) {
if let color = UIColor(hex: hexColor) {
SubtitleModel.textBackgroundColor = Color(color)
playerView?.subtitleBackView.backgroundColor = color
playerView?.updateSrt()
}
}
// MARK: - Hardware Decode
static func setHardwareDecode(_ enabled: Bool) {
KSOptions.hardwareDecode = enabled
}
static func getHardwareDecode() -> Bool {
return KSOptions.hardwareDecode
}
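// Illustrative usage (a sketch, not from the original sources): these statics simply mirror
// KSOptions.hardwareDecode, which is assumed to apply to players created afterwards.
//   SfPlayerWrapper.setHardwareDecode(true)   // prefer hardware (VideoToolbox) decoding
//   let hwEnabled = SfPlayerWrapper.getHardwareDecode()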
// MARK: - Private helpers
private func applySubtitleFont() {
guard let playerView else { return }
let currentSize = playerView.subtitleLabel.font.pointSize
let baseFont: UIFont
if let subtitleFontName,
!subtitleFontName.isEmpty,
subtitleFontName.lowercased() != "system",
let customFont = UIFont(name: subtitleFontName, size: currentSize) {
baseFont = customFont
} else {
baseFont = UIFont.systemFont(ofSize: currentSize)
}
// Remove any implicit italic trait to avoid overly slanted rendering
let nonItalicDescriptor = baseFont.fontDescriptor
.withSymbolicTraits(baseFont.fontDescriptor.symbolicTraits.subtracting(.traitItalic))
?? baseFont.fontDescriptor
let finalFont = UIFont(descriptor: nonItalicDescriptor, size: currentSize)
playerView.subtitleLabel.font = finalFont
playerView.updateSrt()
}
// MARK: - Audio Controls
func getAudioTracks() -> [[String: Any]] {
guard let player = playerView?.playerLayer?.player else { return [] }
var tracks: [[String: Any]] = []
let audioTracks = player.tracks(mediaType: .audio)
for (index, track) in audioTracks.enumerated() {
let trackInfo: [String: Any] = [
"id": index + 1,
"selected": track.isEnabled,
"title": track.name,
"lang": track.language ?? ""
]
tracks.append(trackInfo)
}
return tracks
}
func setAudioTrack(_ trackId: Int) {
guard let player = playerView?.playerLayer?.player else { return }
let audioTracks = player.tracks(mediaType: .audio)
let index = trackId - 1
if index >= 0 && index < audioTracks.count {
let track = audioTracks[index]
player.select(track: track)
}
}
func getCurrentAudioTrack() -> Int {
guard let player = playerView?.playerLayer?.player else { return 0 }
let audioTracks = player.tracks(mediaType: .audio)
for (index, track) in audioTracks.enumerated() {
if track.isEnabled {
return index + 1
}
}
return 0
}
// MARK: - Video Zoom
func setVideoZoomToFill(_ enabled: Bool) {
// Toggle between fit (black bars) and fill (crop to fill screen)
let contentMode: UIView.ContentMode = enabled ? .scaleAspectFill : .scaleAspectFit
playerView?.playerLayer?.player.view?.contentMode = contentMode
}
func getVideoZoomToFill() -> Bool {
return playerView?.playerLayer?.player.view?.contentMode == .scaleAspectFill
}
// MARK: - Layout
func updateLayout(bounds: CGRect) {
containerView?.layoutIfNeeded()
}
}
// MARK: - PlayerControllerDelegate
extension SfPlayerWrapper: PlayerControllerDelegate {
func playerController(state: KSPlayerState) {
switch state {
case .initialized:
break
case .preparing:
isLoading = true
delegate?.player(self, didChangeLoading: true)
case .readyToPlay:
isLoading = false
delegate?.player(self, didChangeLoading: false)
delegate?.player(self, didBecomeReadyToSeek: true)
delegate?.player(self, didBecomeTracksReady: true)
// Seek to pending start position if set
// Pause first, seek, then resume to avoid showing video at wrong position
if let startPos = pendingStartPosition, startPos > 0 {
let capturedStartPos = startPos
let wasPlaying = !isPaused
pendingStartPosition = nil
// Pause to prevent showing frames at wrong position
playerView?.pause()
// Small delay then seek
DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) { [weak self] in
guard let self else { return }
self.playerView?.seek(time: capturedStartPos) { [weak self] finished in
guard let self else { return }
if finished && wasPlaying {
self.play()
}
}
}
}
// Center video content - KSAVPlayerView maps contentMode to videoGravity
playerView?.playerLayer?.player.view?.contentMode = .scaleAspectFit
// Setup PiP controller with delegate
setupPictureInPicture()
// Add embedded subtitles from player to srtControl
// This makes them available for selection and rendering via srtControl
if let player = playerView?.playerLayer?.player,
let subtitleDataSource = player.subtitleDataSouce {
print("[SfPlayer] Adding embedded subtitles from player.subtitleDataSouce")
playerView?.srtControl.addSubtitle(dataSouce: subtitleDataSource)
}
// Load pending external subtitles via srtControl
print("[SfPlayer] readyToPlay - Loading \(pendingExternalSubtitles.count) external subtitles")
for subUrlString in pendingExternalSubtitles {
print("[SfPlayer] Adding external subtitle: \(subUrlString)")
if let subUrl = URL(string: subUrlString) {
let subtitleInfo = URLSubtitleInfo(url: subUrl)
playerView?.srtControl.addSubtitle(info: subtitleInfo)
print("[SfPlayer] Added subtitle info: \(subtitleInfo.name)")
} else {
print("[SfPlayer] Failed to create URL from: \(subUrlString)")
}
}
pendingExternalSubtitles.removeAll()
// Log all available subtitles in srtControl
let allSubtitles = playerView?.srtControl.subtitleInfos ?? []
print("[SfPlayer] srtControl now has \(allSubtitles.count) subtitles:")
for (i, info) in allSubtitles.enumerated() {
print("[SfPlayer] [\(i)]: \(info.name)")
}
// Also log embedded tracks from player for reference
let embeddedTracks = playerView?.playerLayer?.player.tracks(mediaType: .subtitle) ?? []
print("[SfPlayer] player.tracks has \(embeddedTracks.count) embedded tracks")
// Apply initial track selection
print("[SfPlayer] Applying initial track selections - subId: \(String(describing: initialSubtitleId)), audioId: \(String(describing: initialAudioId))")
if let subId = initialSubtitleId {
if subId < 0 {
print("[SfPlayer] Disabling subtitles (subId < 0)")
disableSubtitles()
} else {
print("[SfPlayer] Setting subtitle track to: \(subId)")
setSubtitleTrack(subId)
}
}
if let audioId = initialAudioId {
print("[SfPlayer] Setting audio track to: \(audioId)")
setAudioTrack(audioId)
}
// Debug: Check selected subtitle after applying
if let selectedSub = playerView?.srtControl.selectedSubtitleInfo {
print("[SfPlayer] Currently selected subtitle: \(selectedSub.name)")
} else {
print("[SfPlayer] No subtitle currently selected in srtControl")
}
case .buffering:
isLoading = true
delegate?.player(self, didChangeLoading: true)
case .bufferFinished:
isLoading = false
delegate?.player(self, didChangeLoading: false)
case .paused:
isPaused = true
delegate?.player(self, didChangePause: true)
case .playedToTheEnd:
isPaused = true
delegate?.player(self, didChangePause: true)
stopProgressTimer()
case .error:
delegate?.player(self, didEncounterError: "Playback error occurred")
@unknown default:
break
}
}
func playerController(currentTime: TimeInterval, totalTime: TimeInterval) {
cachedPosition = currentTime
cachedDuration = totalTime
delegate?.player(self, didUpdatePosition: currentTime, duration: totalTime)
}
func playerController(finish error: Error?) {
if let error = error {
delegate?.player(self, didEncounterError: error.localizedDescription)
}
stopProgressTimer()
}
func playerController(maskShow: Bool) {
// UI mask visibility changed
}
func playerController(action: PlayerButtonType) {
// Button action handled
}
func playerController(bufferedCount: Int, consumeTime: TimeInterval) {
// Buffering progress
}
func playerController(seek: TimeInterval) {
// Seek completed
}
}
// MARK: - AVPictureInPictureControllerDelegate
extension SfPlayerWrapper: AVPictureInPictureControllerDelegate {
func pictureInPictureControllerWillStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
delegate?.player(self, didChangePictureInPicture: true)
}
func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
delegate?.player(self, didChangePictureInPicture: false)
}
func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: Error) {
delegate?.player(self, didEncounterError: "PiP failed: \(error.localizedDescription)")
}
func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, restoreUserInterfaceForPictureInPictureStopWithCompletionHandler completionHandler: @escaping (Bool) -> Void) {
// Called when user taps to restore from PiP - return true to allow restoration
completionHandler(true)
}
}
// MARK: - UIColor Hex Extension
extension UIColor {
convenience init?(hex: String) {
var hexSanitized = hex.trimmingCharacters(in: .whitespacesAndNewlines)
hexSanitized = hexSanitized.replacingOccurrences(of: "#", with: "")
var rgb: UInt64 = 0
var r: CGFloat = 0.0
var g: CGFloat = 0.0
var b: CGFloat = 0.0
var a: CGFloat = 1.0
let length = hexSanitized.count
guard Scanner(string: hexSanitized).scanHexInt64(&rgb) else { return nil }
if length == 6 {
r = CGFloat((rgb & 0xFF0000) >> 16) / 255.0
g = CGFloat((rgb & 0x00FF00) >> 8) / 255.0
b = CGFloat(rgb & 0x0000FF) / 255.0
} else if length == 8 {
r = CGFloat((rgb & 0xFF000000) >> 24) / 255.0
g = CGFloat((rgb & 0x00FF0000) >> 16) / 255.0
b = CGFloat((rgb & 0x0000FF00) >> 8) / 255.0
a = CGFloat(rgb & 0x000000FF) / 255.0
} else {
return nil
}
self.init(red: r, green: g, blue: b, alpha: a)
}
}
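// Illustrative usage of the hex initializer (a sketch, not part of the original file); it
// accepts 6-digit RGB and 8-digit RGBA strings, with or without a leading "#":
//   let white   = UIColor(hex: "#FFFFFF")   // opaque white
//   let halfRed = UIColor(hex: "FF000080")  // red at ~50% alpha
//   let invalid = UIColor(hex: "#FFF")      // nil, 3-digit shorthand is not handled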

View File

@@ -1,111 +0,0 @@
import type { StyleProp, ViewStyle } from "react-native";
export type OnLoadEventPayload = {
url: string;
};
export type OnPlaybackStateChangePayload = {
isPaused?: boolean;
isPlaying?: boolean;
isLoading?: boolean;
isReadyToSeek?: boolean;
};
export type OnProgressEventPayload = {
position: number;
duration: number;
progress: number;
};
export type OnErrorEventPayload = {
error: string;
};
export type OnTracksReadyEventPayload = Record<string, never>;
export type OnPictureInPictureChangePayload = {
isActive: boolean;
};
export type VideoSource = {
url: string;
headers?: Record<string, string>;
externalSubtitles?: string[];
startPosition?: number;
autoplay?: boolean;
/** Subtitle track ID to select on start (1-based, -1 to disable) */
initialSubtitleId?: number;
/** Audio track ID to select on start (1-based) */
initialAudioId?: number;
};
export type SfPlayerViewProps = {
source?: VideoSource;
style?: StyleProp<ViewStyle>;
onLoad?: (event: { nativeEvent: OnLoadEventPayload }) => void;
onPlaybackStateChange?: (event: {
nativeEvent: OnPlaybackStateChangePayload;
}) => void;
onProgress?: (event: { nativeEvent: OnProgressEventPayload }) => void;
onError?: (event: { nativeEvent: OnErrorEventPayload }) => void;
onTracksReady?: (event: { nativeEvent: OnTracksReadyEventPayload }) => void;
onPictureInPictureChange?: (event: {
nativeEvent: OnPictureInPictureChangePayload;
}) => void;
};
export interface SfPlayerViewRef {
play: () => Promise<void>;
pause: () => Promise<void>;
seekTo: (position: number) => Promise<void>;
seekBy: (offset: number) => Promise<void>;
setSpeed: (speed: number) => Promise<void>;
getSpeed: () => Promise<number>;
isPaused: () => Promise<boolean>;
getCurrentPosition: () => Promise<number>;
getDuration: () => Promise<number>;
startPictureInPicture: () => Promise<void>;
stopPictureInPicture: () => Promise<void>;
isPictureInPictureSupported: () => Promise<boolean>;
isPictureInPictureActive: () => Promise<boolean>;
setAutoPipEnabled: (enabled: boolean) => Promise<void>;
// Subtitle controls
getSubtitleTracks: () => Promise<SubtitleTrack[]>;
setSubtitleTrack: (trackId: number) => Promise<void>;
disableSubtitles: () => Promise<void>;
getCurrentSubtitleTrack: () => Promise<number>;
addSubtitleFile: (url: string, select?: boolean) => Promise<void>;
// Subtitle positioning
setSubtitlePosition: (position: number) => Promise<void>;
setSubtitleScale: (scale: number) => Promise<void>;
setSubtitleMarginY: (margin: number) => Promise<void>;
setSubtitleAlignX: (alignment: "left" | "center" | "right") => Promise<void>;
setSubtitleAlignY: (alignment: "top" | "center" | "bottom") => Promise<void>;
setSubtitleFontSize: (size: number) => Promise<void>;
setSubtitleColor: (hexColor: string) => Promise<void>;
setSubtitleBackgroundColor: (hexColor: string) => Promise<void>;
setSubtitleFontName: (fontName: string) => Promise<void>;
// Audio controls
getAudioTracks: () => Promise<AudioTrack[]>;
setAudioTrack: (trackId: number) => Promise<void>;
getCurrentAudioTrack: () => Promise<number>;
// Video zoom
setVideoZoomToFill: (enabled: boolean) => Promise<void>;
getVideoZoomToFill: () => Promise<boolean>;
}
export type SubtitleTrack = {
id: number;
title?: string;
lang?: string;
selected?: boolean;
};
export type AudioTrack = {
id: number;
title?: string;
lang?: string;
codec?: string;
channels?: number;
selected?: boolean;
};

View File

@@ -1,120 +0,0 @@
import { requireNativeView } from "expo";
import * as React from "react";
import { useImperativeHandle, useRef } from "react";
import { SfPlayerViewProps, SfPlayerViewRef } from "./SfPlayer.types";
const NativeView: React.ComponentType<SfPlayerViewProps & { ref?: any }> =
requireNativeView("SfPlayer");
export default React.forwardRef<SfPlayerViewRef, SfPlayerViewProps>(
function SfPlayerView(props, ref) {
const nativeRef = useRef<any>(null);
useImperativeHandle(ref, () => ({
play: async () => {
await nativeRef.current?.play();
},
pause: async () => {
await nativeRef.current?.pause();
},
seekTo: async (position: number) => {
await nativeRef.current?.seekTo(position);
},
seekBy: async (offset: number) => {
await nativeRef.current?.seekBy(offset);
},
setSpeed: async (speed: number) => {
await nativeRef.current?.setSpeed(speed);
},
getSpeed: async () => {
return (await nativeRef.current?.getSpeed()) ?? 1.0;
},
isPaused: async () => {
return (await nativeRef.current?.isPaused()) ?? true;
},
getCurrentPosition: async () => {
return (await nativeRef.current?.getCurrentPosition()) ?? 0;
},
getDuration: async () => {
return (await nativeRef.current?.getDuration()) ?? 0;
},
startPictureInPicture: async () => {
await nativeRef.current?.startPictureInPicture();
},
stopPictureInPicture: async () => {
await nativeRef.current?.stopPictureInPicture();
},
isPictureInPictureSupported: async () => {
return (
(await nativeRef.current?.isPictureInPictureSupported()) ?? false
);
},
isPictureInPictureActive: async () => {
return (await nativeRef.current?.isPictureInPictureActive()) ?? false;
},
setAutoPipEnabled: async (enabled: boolean) => {
await nativeRef.current?.setAutoPipEnabled(enabled);
},
getSubtitleTracks: async () => {
return (await nativeRef.current?.getSubtitleTracks()) ?? [];
},
setSubtitleTrack: async (trackId: number) => {
await nativeRef.current?.setSubtitleTrack(trackId);
},
disableSubtitles: async () => {
await nativeRef.current?.disableSubtitles();
},
getCurrentSubtitleTrack: async () => {
return (await nativeRef.current?.getCurrentSubtitleTrack()) ?? 0;
},
addSubtitleFile: async (url: string, select = true) => {
await nativeRef.current?.addSubtitleFile(url, select);
},
setSubtitlePosition: async (position: number) => {
await nativeRef.current?.setSubtitlePosition(position);
},
setSubtitleScale: async (scale: number) => {
await nativeRef.current?.setSubtitleScale(scale);
},
setSubtitleMarginY: async (margin: number) => {
await nativeRef.current?.setSubtitleMarginY(margin);
},
setSubtitleAlignX: async (alignment: "left" | "center" | "right") => {
await nativeRef.current?.setSubtitleAlignX(alignment);
},
setSubtitleAlignY: async (alignment: "top" | "center" | "bottom") => {
await nativeRef.current?.setSubtitleAlignY(alignment);
},
setSubtitleFontSize: async (size: number) => {
await nativeRef.current?.setSubtitleFontSize(size);
},
setSubtitleColor: async (hexColor: string) => {
await nativeRef.current?.setSubtitleColor(hexColor);
},
setSubtitleBackgroundColor: async (hexColor: string) => {
await nativeRef.current?.setSubtitleBackgroundColor(hexColor);
},
setSubtitleFontName: async (fontName: string) => {
await nativeRef.current?.setSubtitleFontName?.(fontName);
},
getAudioTracks: async () => {
return (await nativeRef.current?.getAudioTracks()) ?? [];
},
setAudioTrack: async (trackId: number) => {
await nativeRef.current?.setAudioTrack(trackId);
},
getCurrentAudioTrack: async () => {
return (await nativeRef.current?.getCurrentAudioTrack()) ?? 0;
},
setVideoZoomToFill: async (enabled: boolean) => {
await nativeRef.current?.setVideoZoomToFill(enabled);
},
getVideoZoomToFill: async () => {
return (await nativeRef.current?.getVideoZoomToFill()) ?? false;
},
}));
return <NativeView ref={nativeRef} {...props} />;
},
);

View File

@@ -1,15 +0,0 @@
import { requireNativeModule } from "expo-modules-core";
export * from "./SfPlayer.types";
export { default as SfPlayerView } from "./SfPlayerView";
// Module-level functions for global KSPlayer settings
const SfPlayerModule = requireNativeModule("SfPlayer");
export function setHardwareDecode(enabled: boolean): void {
SfPlayerModule.setHardwareDecode(enabled);
}
export function getHardwareDecode(): boolean {
return SfPlayerModule.getHardwareDecode();
}

View File

@@ -1,7 +0,0 @@
{
"platforms": ["ios", "tvos"],
"ios": {
"modules": ["VlcPlayer4Module"],
"appDelegateSubscribers": ["AppLifecycleDelegate"]
}
}

View File

@@ -1,32 +0,0 @@
import ExpoModulesCore
protocol SimpleAppLifecycleListener {
func applicationDidEnterBackground() -> Void
func applicationDidEnterForeground() -> Void
}
public class AppLifecycleDelegate: ExpoAppDelegateSubscriber {
public func applicationDidBecomeActive(_ application: UIApplication) {
// The app has become active.
}
public func applicationWillResignActive(_ application: UIApplication) {
// The app is about to become inactive.
}
public func applicationDidEnterBackground(_ application: UIApplication) {
VLCManager.shared.listeners.forEach { listener in
listener.applicationDidEnterBackground()
}
}
public func applicationWillEnterForeground(_ application: UIApplication) {
VLCManager.shared.listeners.forEach { listener in
listener.applicationDidEnterForeground()
}
}
public func applicationWillTerminate(_ application: UIApplication) {
// The app is about to terminate.
}
}

View File

@@ -1,4 +0,0 @@
class VLCManager {
static let shared = VLCManager()
var listeners: [SimpleAppLifecycleListener] = []
}

View File

@@ -1,22 +0,0 @@
Pod::Spec.new do |s|
s.name = 'VlcPlayer4'
s.version = '4.0.0a10'
s.summary = 'A sample project summary'
s.description = 'A sample project description'
s.author = ''
s.homepage = 'https://docs.expo.dev/modules/'
s.platforms = { :ios => '13.4', :tvos => '16' }
s.source = { git: '' }
s.static_framework = true
s.dependency 'ExpoModulesCore'
s.ios.dependency 'VLCKit', s.version
s.tvos.dependency 'VLCKit', s.version
# Swift/Objective-C compatibility
s.pod_target_xcconfig = {
'DEFINES_MODULE' => 'YES',
'SWIFT_COMPILATION_MODE' => 'wholemodule'
}
s.source_files = "*.{h,m,mm,swift,hpp,cpp}"
end

View File

@@ -1,71 +0,0 @@
import ExpoModulesCore
public class VlcPlayer4Module: Module {
public func definition() -> ModuleDefinition {
Name("VlcPlayer4")
View(VlcPlayer4View.self) {
Prop("source") { (view: VlcPlayer4View, source: [String: Any]) in
view.setSource(source)
}
Prop("paused") { (view: VlcPlayer4View, paused: Bool) in
if paused {
view.pause()
} else {
view.play()
}
}
Events(
"onPlaybackStateChanged",
"onVideoStateChange",
"onVideoLoadStart",
"onVideoLoadEnd",
"onVideoProgress",
"onVideoError",
"onPipStarted"
)
AsyncFunction("startPictureInPicture") { (view: VlcPlayer4View) in
view.startPictureInPicture()
}
AsyncFunction("play") { (view: VlcPlayer4View) in
view.play()
}
AsyncFunction("pause") { (view: VlcPlayer4View) in
view.pause()
}
AsyncFunction("stop") { (view: VlcPlayer4View) in
view.stop()
}
AsyncFunction("seekTo") { (view: VlcPlayer4View, time: Int32) in
view.seekTo(time)
}
AsyncFunction("setAudioTrack") { (view: VlcPlayer4View, trackIndex: Int) in
view.setAudioTrack(trackIndex)
}
AsyncFunction("getAudioTracks") { (view: VlcPlayer4View) -> [[String: Any]]? in
return view.getAudioTracks()
}
AsyncFunction("setSubtitleTrack") { (view: VlcPlayer4View, trackIndex: Int) in
view.setSubtitleTrack(trackIndex)
}
AsyncFunction("getSubtitleTracks") { (view: VlcPlayer4View) -> [[String: Any]]? in
return view.getSubtitleTracks()
}
AsyncFunction("setSubtitleURL") {
(view: VlcPlayer4View, url: String, name: String) in
view.setSubtitleURL(url, name: name)
}
}
}
}

View File

@@ -1,507 +0,0 @@
import ExpoModulesCore
import UIKit
import VLCKit
import os
public class VLCPlayerView: UIView {
func setupView(parent: UIView) {
self.backgroundColor = .black
self.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
self.leadingAnchor.constraint(equalTo: parent.leadingAnchor),
self.trailingAnchor.constraint(equalTo: parent.trailingAnchor),
self.topAnchor.constraint(equalTo: parent.topAnchor),
self.bottomAnchor.constraint(equalTo: parent.bottomAnchor),
])
}
public override func layoutSubviews() {
super.layoutSubviews()
for subview in subviews {
subview.frame = bounds
}
}
}
class VLCPlayerWrapper: NSObject {
private var lastProgressCall = Date().timeIntervalSince1970
public var player: VLCMediaPlayer = VLCMediaPlayer()
private var updatePlayerState: (() -> Void)?
private var updateVideoProgress: (() -> Void)?
private var playerView: VLCPlayerView = VLCPlayerView()
public weak var pipController: VLCPictureInPictureWindowControlling?
override public init() {
super.init()
player.delegate = self
player.drawable = self
player.scaleFactor = 0
}
public func setup(
parent: UIView,
updatePlayerState: (() -> Void)?,
updateVideoProgress: (() -> Void)?
) {
self.updatePlayerState = updatePlayerState
self.updateVideoProgress = updateVideoProgress
player.delegate = self
parent.addSubview(playerView)
playerView.setupView(parent: parent)
}
public func getPlayerView() -> UIView {
return playerView
}
}
// MARK: - VLCPictureInPictureDrawable
extension VLCPlayerWrapper: VLCPictureInPictureDrawable {
public func mediaController() -> (any VLCPictureInPictureMediaControlling)! {
return self
}
public func pictureInPictureReady() -> (((any VLCPictureInPictureWindowControlling)?) -> Void)!
{
return { [weak self] controller in
self?.pipController = controller
}
}
}
// MARK: - VLCPictureInPictureMediaControlling
extension VLCPlayerWrapper: VLCPictureInPictureMediaControlling {
func mediaTime() -> Int64 {
return player.time.value?.int64Value ?? 0
}
func mediaLength() -> Int64 {
return player.media?.length.value?.int64Value ?? 0
}
func play() {
player.play()
}
func pause() {
player.pause()
}
func seek(by offset: Int64, completion: @escaping () -> Void) {
player.jump(withOffset: Int32(offset), completion: completion)
}
func isMediaSeekable() -> Bool {
return player.isSeekable
}
func isMediaPlaying() -> Bool {
return player.isPlaying
}
}
// MARK: - VLCDrawable
extension VLCPlayerWrapper: VLCDrawable {
public func addSubview(_ view: UIView) {
playerView.addSubview(view)
}
public func bounds() -> CGRect {
return playerView.bounds
}
}
// MARK: - VLCMediaPlayerDelegate
extension VLCPlayerWrapper: VLCMediaPlayerDelegate {
func mediaPlayerTimeChanged(_ aNotification: Notification) {
DispatchQueue.main.async { [weak self] in
guard let self = self else { return }
let timeNow = Date().timeIntervalSince1970
if timeNow - self.lastProgressCall >= 1 {
self.lastProgressCall = timeNow
self.updateVideoProgress?()
}
}
}
func mediaPlayerStateChanged(_ state: VLCMediaPlayerState) {
DispatchQueue.main.async { [weak self] in
guard let self = self else { return }
self.updatePlayerState?()
guard let pipController = self.pipController else { return }
pipController.invalidatePlaybackState()
}
}
}
class VlcPlayer4View: ExpoView {
let logger = Logger(subsystem: Bundle.main.bundleIdentifier!, category: "VlcPlayer4View")
private var vlc: VLCPlayerWrapper = VLCPlayerWrapper()
private var progressUpdateInterval: TimeInterval = 1.0 // Update interval set to 1 second
private var isPaused: Bool = false
private var customSubtitles: [(internalName: String, originalName: String)] = []
private var startPosition: Int32 = 0
private var externalTrack: [String: String]?
private var isStopping: Bool = false // Guards against re-entrant stop() calls
private var externalSubtitles: [[String: String]]?
var hasSource = false
var initialSeekPerformed = false
// Flag determining whether the initial seek should be performed: set when the stream is transcoded (HLS) or when playing back offline/local files.
var shouldPerformInitialSeek: Bool = false
// MARK: - Initialization
required init(appContext: AppContext? = nil) {
super.init(appContext: appContext)
setupVLC()
setupNotifications()
VLCManager.shared.listeners.append(self)
}
// MARK: - Setup
private func setupVLC() {
vlc.setup(
parent: self,
updatePlayerState: updatePlayerState,
updateVideoProgress: updateVideoProgress
)
}
// Workaround: When playing an HLS video for the first time, seeking to a specific time immediately can cause a crash.
// To avoid this, we wait until the video has started playing before performing the initial seek.
func performInitialSeek() {
guard !initialSeekPerformed,
startPosition > 0,
shouldPerformInitialSeek,
vlc.player.isSeekable else { return }
initialSeekPerformed = true
logger.debug("First time update, performing initial seek to \(self.startPosition) seconds")
vlc.player.time = VLCTime(int: startPosition * 1000)
}
private func setupNotifications() {
NotificationCenter.default.addObserver(
self, selector: #selector(applicationWillResignActive),
name: UIApplication.willResignActiveNotification, object: nil)
NotificationCenter.default.addObserver(
self, selector: #selector(applicationDidBecomeActive),
name: UIApplication.didBecomeActiveNotification, object: nil)
}
// MARK: - Public Methods
func startPictureInPicture() {
self.vlc.pipController?.stateChangeEventHandler = { (isStarted: Bool) in
self.onPipStarted?(["pipStarted": isStarted])
}
self.vlc.pipController?.startPictureInPicture()
}
@objc func play() {
self.vlc.player.play()
self.isPaused = false
logger.debug("Play")
}
@objc func pause() {
self.vlc.player.pause()
self.isPaused = true
}
@objc func seekTo(_ time: Int32) {
let wasPlaying = vlc.player.isPlaying
if wasPlaying {
self.pause()
}
if let duration = vlc.player.media?.length.intValue {
logger.debug("Seeking to time: \(time) Video Duration \(duration)")
// If the specified time is greater than the duration, seek to the end
let seekTime = time > duration ? duration - 1000 : time
vlc.player.time = VLCTime(int: seekTime)
self.updatePlayerState()
// Let mediaPlayerStateChanged handle play state change
DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
if wasPlaying {
self.play()
}
}
} else {
logger.error("Unable to retrieve video duration")
}
}
@objc func setSource(_ source: [String: Any]) {
logger.debug("Setting source...")
DispatchQueue.main.async { [weak self] in
guard let self = self else { return }
if self.hasSource {
return
}
var mediaOptions = source["mediaOptions"] as? [String: Any] ?? [:]
self.externalTrack = source["externalTrack"] as? [String: String]
let initOptions: [String] = source["initOptions"] as? [String] ?? []
self.startPosition = source["startPosition"] as? Int32 ?? 0
self.externalSubtitles = source["externalSubtitles"] as? [[String: String]]
for item in initOptions {
let option = item.components(separatedBy: "=")
mediaOptions.updateValue(
option[1], forKey: option[0].replacingOccurrences(of: "--", with: ""))
}
guard let uri = source["uri"] as? String, !uri.isEmpty else {
logger.error("Invalid or empty URI")
self.onVideoError?(["error": "Invalid or empty URI"])
return
}
let autoplay = source["autoplay"] as? Bool ?? false
let isNetwork = source["isNetwork"] as? Bool ?? false
// Set shouldPerformInitialSeek when the stream is transcoded (m3u8) or when it is not a network stream
self.shouldPerformInitialSeek = uri.contains("m3u8") || !isNetwork
self.onVideoLoadStart?(["target": self.reactTag ?? NSNull()])
let media: VLCMedia!
if isNetwork {
logger.debug("Loading network file: \(uri)")
media = VLCMedia(url: URL(string: uri)!)
} else {
logger.debug("Loading local file: \(uri)")
if uri.starts(with: "file://"), let url = URL(string: uri) {
media = VLCMedia(url: url)
} else {
media = VLCMedia(path: uri)
}
}
logger.debug("Media options: \(mediaOptions)")
media.addOptions(mediaOptions)
self.vlc.player.media = media
self.setInitialExternalSubtitles()
self.hasSource = true
if autoplay {
logger.info("Playing...")
// The video is not transcoding, so it is safe to seek to the start position.
if !self.shouldPerformInitialSeek {
self.vlc.player.time = VLCTime(number: NSNumber(value: self.startPosition * 1000))
}
self.play()
}
}
}
@objc func setAudioTrack(_ trackIndex: Int) {
print("Setting audio track: \(trackIndex)")
let track = self.vlc.player.audioTracks[trackIndex]
track.isSelectedExclusively = true
}
@objc func getAudioTracks() -> [[String: Any]]? {
return vlc.player.audioTracks.enumerated().map {
return ["name": $1.trackName, "index": $0]
}
}
@objc func setSubtitleTrack(_ trackIndex: Int) {
logger.debug("Attempting to set subtitle track to index: \(trackIndex)")
if trackIndex == -1 {
logger.debug("Disabling all subtitles")
for track in self.vlc.player.textTracks {
track.isSelected = false
}
return
}
let track = self.vlc.player.textTracks[trackIndex]
track.isSelectedExclusively = true
logger.debug("Current subtitle track index after setting: \(track.trackName)")
}
@objc func setSubtitleURL(_ subtitleURL: String, name: String) {
guard let url = URL(string: subtitleURL) else {
logger.error("Invalid subtitle URL")
return
}
let result = self.vlc.player.addPlaybackSlave(url, type: .subtitle, enforce: false)
if result == 0 {
let internalName = "Track \(self.customSubtitles.count)"
self.customSubtitles.append((internalName: internalName, originalName: name))
logger.debug("Subtitle added with result: \(result) \(internalName)")
} else {
logger.debug("Failed to add subtitle")
}
}
@objc func getSubtitleTracks() -> [[String: Any]]? {
if self.vlc.player.textTracks.count == 0 {
return nil
}
logger.debug("Number of subtitle tracks: \(self.vlc.player.textTracks.count)")
let tracks = self.vlc.player.textTracks.enumerated().map { (index, track) in
if let customSubtitle = customSubtitles.first(where: {
$0.internalName == track.trackName
}) {
return ["name": customSubtitle.originalName, "index": index]
} else {
return ["name": track.trackName, "index": index]
}
}
logger.debug("Subtitle tracks: \(tracks)")
return tracks
}
@objc func stop(completion: (() -> Void)? = nil) {
logger.debug("Stopping media...")
guard !isStopping else {
completion?()
return
}
isStopping = true
// If we're not on the main thread, dispatch to main thread
if !Thread.isMainThread {
DispatchQueue.main.async { [weak self] in
self?.performStop(completion: completion)
}
} else {
performStop(completion: completion)
}
}
// MARK: - Private Methods
@objc private func applicationWillResignActive() {
}
@objc private func applicationDidBecomeActive() {
}
private func setInitialExternalSubtitles() {
if let externalSubtitles = self.externalSubtitles {
for subtitle in externalSubtitles {
if let subtitleName = subtitle["name"],
let subtitleURL = subtitle["DeliveryUrl"]
{
print("Setting external subtitle: \(subtitleName) \(subtitleURL)")
self.setSubtitleURL(subtitleURL, name: subtitleName)
}
}
}
}
private func performStop(completion: (() -> Void)? = nil) {
// Stop the media player
vlc.player.stop()
// Remove observer
NotificationCenter.default.removeObserver(self)
// Clear the video view
vlc.getPlayerView().removeFromSuperview()
isStopping = false
completion?()
}
private func updateVideoProgress() {
guard self.vlc.player.media != nil else { return }
let currentTimeMs = self.vlc.player.time.intValue
let durationMs = self.vlc.player.media?.length.intValue ?? 0
logger.debug("Current time: \(currentTimeMs)")
self.onVideoProgress?([
"currentTime": currentTimeMs,
"duration": durationMs,
])
}
private func updatePlayerState() {
let player = self.vlc.player
if player.isPlaying {
performInitialSeek()
}
self.onVideoStateChange?([
"target": self.reactTag ?? NSNull(),
"currentTime": player.time.intValue,
"duration": player.media?.length.intValue ?? 0,
"error": false,
"isPlaying": player.isPlaying,
"isBuffering": !player.isPlaying && player.state == VLCMediaPlayerState.buffering,
"state": player.state.description,
])
}
// MARK: - Expo Events
@objc var onPlaybackStateChanged: RCTDirectEventBlock?
@objc var onVideoLoadStart: RCTDirectEventBlock?
@objc var onVideoStateChange: RCTDirectEventBlock?
@objc var onVideoProgress: RCTDirectEventBlock?
@objc var onVideoLoadEnd: RCTDirectEventBlock?
@objc var onVideoError: RCTDirectEventBlock?
@objc var onPipStarted: RCTDirectEventBlock?
// MARK: - Deinitialization
deinit {
logger.debug("Deinitialization")
performStop()
VLCManager.shared.listeners.removeAll()
}
}
// MARK: - SimpleAppLifecycleListener
extension VlcPlayer4View: SimpleAppLifecycleListener {
func applicationDidEnterBackground() {
logger.debug("Entering background")
}
func applicationDidEnterForeground() {
logger.debug("Entering foreground, is player visible? \(self.vlc.getPlayerView().superview != nil)")
if !self.vlc.getPlayerView().isDescendant(of: self) {
logger.debug("Player view is missing. Adding back as subview")
self.addSubview(self.vlc.getPlayerView())
}
// Workaround for the black screen that can appear when re-entering the application
if let videoTrack = self.vlc.player.videoTracks.first(where: { $0.isSelected == true }),
!self.vlc.isMediaPlaying()
{
videoTrack.isSelected = false
videoTrack.isSelectedExclusively = true
self.vlc.player.play()
self.vlc.player.pause()
}
}
}
extension VLCMediaPlayerState {
var description: String {
switch self {
case .opening: return "Opening"
case .buffering: return "Buffering"
case .playing: return "Playing"
case .paused: return "Paused"
case .stopped: return "Stopped"
case .error: return "Error"
case .stopping: return "Stopping"
@unknown default: return "Unknown"
}
}
}

View File

@@ -1,5 +0,0 @@
import { requireNativeModule } from "expo-modules-core";
// It loads the native module object from the JSI or falls back to
// the bridge module (from NativeModulesProxy) if the remote debugger is on.
export default requireNativeModule("VlcPlayer4");

View File

@@ -1,47 +0,0 @@
plugins {
id 'com.android.library'
id 'kotlin-android'
id 'kotlin-kapt'
}
group = 'expo.modules.vlcplayer'
version = '0.6.0'
def expoModulesCorePlugin = new File(project(":expo-modules-core").projectDir.absolutePath, "ExpoModulesCorePlugin.gradle")
def kotlinVersion = findProperty('android.kotlinVersion') ?: '1.9.25'
apply from: expoModulesCorePlugin
applyKotlinExpoModulesCorePlugin()
useDefaultAndroidSdkVersions()
useCoreDependencies()
useExpoPublishing()
android {
namespace "expo.modules.vlcplayer"
compileOptions {
sourceCompatibility JavaVersion.VERSION_17
targetCompatibility JavaVersion.VERSION_17
}
kotlinOptions {
jvmTarget = "17"
}
lintOptions {
abortOnError false
}
}
dependencies {
implementation 'io.github.mengzhidaren:vlc-android-sdk:3.6.3'
implementation "org.jetbrains.kotlin:kotlin-stdlib:$kotlinVersion"
}
tasks.withType(org.jetbrains.kotlin.gradle.tasks.KotlinCompile).configureEach {
kotlinOptions {
freeCompilerArgs += ["-Xshow-kotlin-compiler-errors"]
jvmTarget = "17"
}
}

View File

@@ -1,2 +0,0 @@
<manifest>
</manifest>

View File

@@ -1,38 +0,0 @@
package expo.modules.vlcplayer
import expo.modules.core.interfaces.ReactActivityLifecycleListener
// TODO: Creating a separate package class and adding this as a lifecycle listener did not work...
// https://docs.expo.dev/modules/android-lifecycle-listeners/
object VLCManager: ReactActivityLifecycleListener {
val listeners: MutableList<ReactActivityLifecycleListener> = mutableListOf()
// override fun onCreate(activity: Activity?, savedInstanceState: Bundle?) {
// listeners.forEach {
// it.onCreate(activity, savedInstanceState)
// }
// }
//
// override fun onResume(activity: Activity?) {
// listeners.forEach {
// it.onResume(activity)
// }
// }
//
// override fun onPause(activity: Activity?) {
// listeners.forEach {
// it.onPause(activity)
// }
// }
//
// override fun onUserLeaveHint(activity: Activity?) {
// listeners.forEach {
// it.onUserLeaveHint(activity)
// }
// }
//
// override fun onDestroy(activity: Activity?) {
// listeners.forEach {
// it.onDestroy(activity)
// }
// }
}

View File

@@ -1,99 +0,0 @@
package expo.modules.vlcplayer
import androidx.core.os.bundleOf
import expo.modules.kotlin.modules.Module
import expo.modules.kotlin.modules.ModuleDefinition
class VlcPlayerModule : Module() {
override fun definition() = ModuleDefinition {
Name("VlcPlayer")
OnActivityEntersForeground {
VLCManager.listeners.forEach {
it.onResume(appContext.currentActivity)
}
}
OnActivityEntersBackground {
VLCManager.listeners.forEach {
it.onPause(appContext.currentActivity)
}
}
View(VlcPlayerView::class) {
Prop("source") { view: VlcPlayerView, source: Map<String, Any> ->
view.setSource(source)
}
Prop("paused") { view: VlcPlayerView, paused: Boolean ->
if (paused) {
view.pause()
} else {
view.play()
}
}
Events(
"onPlaybackStateChanged",
"onVideoStateChange",
"onVideoLoadStart",
"onVideoLoadEnd",
"onVideoProgress",
"onVideoError",
"onPipStarted"
)
AsyncFunction("startPictureInPicture") { view: VlcPlayerView ->
view.startPictureInPicture()
}
AsyncFunction("play") { view: VlcPlayerView ->
view.play()
}
AsyncFunction("pause") { view: VlcPlayerView ->
view.pause()
}
AsyncFunction("stop") { view: VlcPlayerView ->
view.stop()
}
AsyncFunction("seekTo") { view: VlcPlayerView, time: Int ->
view.seekTo(time)
}
AsyncFunction("setAudioTrack") { view: VlcPlayerView, trackIndex: Int ->
view.setAudioTrack(trackIndex)
}
AsyncFunction("getAudioTracks") { view: VlcPlayerView ->
view.getAudioTracks()
}
AsyncFunction("setSubtitleTrack") { view: VlcPlayerView, trackIndex: Int ->
view.setSubtitleTrack(trackIndex)
}
AsyncFunction("getSubtitleTracks") { view: VlcPlayerView ->
view.getSubtitleTracks()
}
AsyncFunction("setSubtitleURL") { view: VlcPlayerView, url: String, name: String ->
view.setSubtitleURL(url, name)
}
AsyncFunction("setVideoAspectRatio") { view: VlcPlayerView, aspectRatio: String? ->
view.setVideoAspectRatio(aspectRatio)
}
AsyncFunction("setVideoScaleFactor") { view: VlcPlayerView, scaleFactor: Float ->
view.setVideoScaleFactor(scaleFactor)
}
AsyncFunction("setRate") { view: VlcPlayerView, rate: Float ->
view.setRate(rate)
}
}
}
}

View File

@@ -1,487 +0,0 @@
package expo.modules.vlcplayer
import android.R
import android.app.Activity
import android.app.PendingIntent
import android.app.PendingIntent.FLAG_IMMUTABLE
import android.app.PendingIntent.FLAG_UPDATE_CURRENT
import android.app.PictureInPictureParams
import android.app.RemoteAction
import android.content.BroadcastReceiver
import android.content.Context
import android.content.ContextWrapper
import android.content.Intent
import android.content.IntentFilter
import android.graphics.drawable.Icon
import android.net.Uri
import android.os.Build
import android.os.Bundle
import android.os.Handler
import android.os.Looper
import android.util.Log
import android.view.View
import androidx.annotation.RequiresApi
import androidx.core.app.PictureInPictureModeChangedInfo
import androidx.core.view.isVisible
import androidx.lifecycle.Lifecycle
import androidx.lifecycle.LifecycleObserver
import androidx.lifecycle.OnLifecycleEvent
import expo.modules.core.interfaces.ReactActivityLifecycleListener
import expo.modules.core.logging.LogHandlers
import expo.modules.core.logging.Logger
import expo.modules.kotlin.AppContext
import expo.modules.kotlin.viewevent.EventDispatcher
import expo.modules.kotlin.views.ExpoView
import org.videolan.libvlc.LibVLC
import org.videolan.libvlc.Media
import org.videolan.libvlc.MediaPlayer
import org.videolan.libvlc.interfaces.IMedia
import org.videolan.libvlc.util.VLCVideoLayout
class VlcPlayerView(context: Context, appContext: AppContext) : ExpoView(context, appContext), LifecycleObserver, MediaPlayer.EventListener, ReactActivityLifecycleListener {
private val log = Logger(listOf(LogHandlers.createOSLogHandler(this::class.simpleName!!)))
private val PIP_PLAY_PAUSE_ACTION = "PIP_PLAY_PAUSE_ACTION"
private val PIP_REWIND_ACTION = "PIP_REWIND_ACTION"
private val PIP_FORWARD_ACTION = "PIP_FORWARD_ACTION"
private var libVLC: LibVLC? = null
private var mediaPlayer: MediaPlayer? = null
private lateinit var videoLayout: VLCVideoLayout
private var isPaused: Boolean = false
private var lastReportedState: Int? = null
private var lastReportedIsPlaying: Boolean? = null
private var media : Media? = null
private var timeLeft: Long? = null
private val onVideoProgress by EventDispatcher()
private val onVideoStateChange by EventDispatcher()
private val onVideoLoadEnd by EventDispatcher()
private val onPipStarted by EventDispatcher()
private var startPosition: Int? = 0
private var isMediaReady: Boolean = false
private var externalTrack: Map<String, String>? = null
private var externalSubtitles: List<Map<String, String>>? = null
var hasSource: Boolean = false
private val handler = Handler(Looper.getMainLooper())
private val updateInterval = 1000L // 1 second
private val updateProgressRunnable = object : Runnable {
override fun run() {
updateVideoProgress()
handler.postDelayed(this, updateInterval)
}
}
private val currentActivity get() = context.findActivity()
private val actions: MutableList<RemoteAction> = mutableListOf()
private val remoteActionFilter = IntentFilter()
private val playPauseIntent: Intent = Intent(PIP_PLAY_PAUSE_ACTION).setPackage(context.packageName)
private val forwardIntent: Intent = Intent(PIP_FORWARD_ACTION).setPackage(context.packageName)
private val rewindIntent: Intent = Intent(PIP_REWIND_ACTION).setPackage(context.packageName)
private var actionReceiver: BroadcastReceiver = object : BroadcastReceiver() {
override fun onReceive(context: Context?, intent: Intent?) {
when (intent?.action) {
PIP_PLAY_PAUSE_ACTION -> {
if (isPaused) play() else pause()
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
setupPipActions()
currentActivity.setPictureInPictureParams(getPipParams()!!)
}
}
PIP_FORWARD_ACTION -> seekTo((mediaPlayer?.time?.toInt() ?: 0) + 15_000)
PIP_REWIND_ACTION -> seekTo((mediaPlayer?.time?.toInt() ?: 0) - 15_000)
}
}
}
private var pipChangeListener: (PictureInPictureModeChangedInfo) -> Unit = { info ->
if (!info.isInPictureInPictureMode && mediaPlayer?.isPlaying == true) {
log.debug("Exiting PiP")
timeLeft = mediaPlayer?.time
pause()
// Setting the media after reattaching the view allows for a fast video view render
if (mediaPlayer?.vlcVout?.areViewsAttached() == false) {
mediaPlayer?.attachViews(videoLayout, null, false, false)
mediaPlayer?.media = media
mediaPlayer?.play()
timeLeft?.let { mediaPlayer?.time = it }
mediaPlayer?.pause()
}
}
onPipStarted(mapOf(
"pipStarted" to info.isInPictureInPictureMode
))
}
init {
VLCManager.listeners.add(this)
setupView()
setupPiP()
}
private fun setupView() {
log.debug("Setting up view")
setBackgroundColor(android.graphics.Color.WHITE)
videoLayout = VLCVideoLayout(context).apply {
layoutParams = LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT)
}
videoLayout.keepScreenOn = true
addView(videoLayout)
log.debug("View setup complete")
}
private fun setupPiP() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
remoteActionFilter.addAction(PIP_PLAY_PAUSE_ACTION)
remoteActionFilter.addAction(PIP_FORWARD_ACTION)
remoteActionFilter.addAction(PIP_REWIND_ACTION)
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) {
currentActivity.registerReceiver(
actionReceiver,
remoteActionFilter,
Context.RECEIVER_NOT_EXPORTED
)
}
setupPipActions()
currentActivity.apply {
setPictureInPictureParams(getPipParams()!!)
addOnPictureInPictureModeChangedListener(pipChangeListener)
}
}
}
@RequiresApi(Build.VERSION_CODES.O)
private fun setupPipActions() {
actions.clear()
actions.addAll(
listOf(
RemoteAction(
Icon.createWithResource(context, R.drawable.ic_media_rew),
"Rewind",
"Rewind Video",
PendingIntent.getBroadcast(
context,
0,
rewindIntent,
FLAG_UPDATE_CURRENT or FLAG_IMMUTABLE
)
),
RemoteAction(
if (isPaused) Icon.createWithResource(context, R.drawable.ic_media_play)
else Icon.createWithResource(context, R.drawable.ic_media_pause),
"Play",
"Play Video",
PendingIntent.getBroadcast(
context,
if (isPaused) 0 else 1,
playPauseIntent,
FLAG_UPDATE_CURRENT or FLAG_IMMUTABLE
)
),
RemoteAction(
Icon.createWithResource(context, R.drawable.ic_media_ff),
"Skip",
"Skip Forward",
PendingIntent.getBroadcast(
context,
0,
forwardIntent,
FLAG_UPDATE_CURRENT or FLAG_IMMUTABLE
)
)
)
)
}
private fun getPipParams(): PictureInPictureParams? {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
var builder = PictureInPictureParams.Builder()
.setActions(actions)
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
builder = builder.setAutoEnterEnabled(true)
}
return builder.build()
}
return null
}
fun setSource(source: Map<String, Any>) {
log.debug("setting source $source")
if (hasSource) {
log.debug("Source already set. Ignoring.")
return
}
val mediaOptions = source["mediaOptions"] as? Map<String, Any> ?: emptyMap()
val autoplay = source["autoplay"] as? Boolean ?: false
val isNetwork = source["isNetwork"] as? Boolean ?: false
externalTrack = source["externalTrack"] as? Map<String, String>
externalSubtitles = source["externalSubtitles"] as? List<Map<String, String>>
startPosition = (source["startPosition"] as? Double)?.toInt() ?: 0
val initOptions = source["initOptions"] as? MutableList<String> ?: mutableListOf()
initOptions.add("--start-time=$startPosition")
val uri = source["uri"] as? String
// Handle video load start event
// onVideoLoadStart?.invoke(mapOf("target" to reactTag ?: "null"))
libVLC = LibVLC(context, initOptions)
mediaPlayer = MediaPlayer(libVLC)
mediaPlayer?.attachViews(videoLayout, null, false, false)
mediaPlayer?.setEventListener(this)
log.debug("Loading network file: $uri")
media = Media(libVLC, Uri.parse(uri))
mediaPlayer?.media = media
log.debug("Debug: Media options: $mediaOptions")
// media.addOptions(mediaOptions)
// Set initial external subtitles immediately like iOS
setInitialExternalSubtitles()
hasSource = true
if (autoplay) {
log.debug("Playing...")
play()
}
}
fun startPictureInPicture() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
currentActivity.enterPictureInPictureMode(getPipParams()!!)
}
}
fun play() {
mediaPlayer?.play()
isPaused = false
handler.post(updateProgressRunnable) // Start updating progress
}
fun pause() {
mediaPlayer?.pause()
isPaused = true
handler.removeCallbacks(updateProgressRunnable) // Stop updating progress
}
fun stop() {
mediaPlayer?.stop()
handler.removeCallbacks(updateProgressRunnable) // Stop updating progress
}
fun seekTo(time: Int) {
mediaPlayer?.let { player ->
val wasPlaying = player.isPlaying
if (wasPlaying) {
player.pause()
}
val duration = player.length.toInt()
val seekTime = if (time > duration) duration - 1000 else time
player.time = seekTime.toLong()
if (wasPlaying) {
player.play()
}
}
}
fun setAudioTrack(trackIndex: Int) {
mediaPlayer?.setAudioTrack(trackIndex)
}
fun getAudioTracks(): List<Map<String, Any>>? {
log.debug("getAudioTracks ${mediaPlayer?.audioTracks}")
val trackDescriptions = mediaPlayer?.audioTracks ?: return null
return trackDescriptions.map { trackDescription ->
mapOf("name" to trackDescription.name, "index" to trackDescription.id)
}
}
fun setSubtitleTrack(trackIndex: Int) {
mediaPlayer?.setSpuTrack(trackIndex)
}
// fun getSubtitleTracks(): List<Map<String, Any>>? {
// return mediaPlayer?.getSpuTracks()?.map { trackDescription ->
// mapOf("name" to trackDescription.name, "index" to trackDescription.id)
// }
// }
fun getSubtitleTracks(): List<Map<String, Any>>? {
val subtitleTracks = mediaPlayer?.spuTracks?.map { trackDescription ->
mapOf("name" to trackDescription.name, "index" to trackDescription.id)
}
// Debug statement to print the result
log.debug("Subtitle Tracks: $subtitleTracks")
return subtitleTracks
}
fun setSubtitleURL(subtitleURL: String, name: String) {
log.debug("Setting subtitle URL: $subtitleURL, name: $name")
mediaPlayer?.addSlave(IMedia.Slave.Type.Subtitle, Uri.parse(subtitleURL), true)
}
fun setVideoAspectRatio(aspectRatio: String?) {
log.debug("Setting video aspect ratio: $aspectRatio")
mediaPlayer?.aspectRatio = aspectRatio
}
fun setVideoScaleFactor(scaleFactor: Float) {
log.debug("Setting video scale factor: $scaleFactor")
mediaPlayer?.scale = scaleFactor
}
fun setRate(rate: Float) {
log.debug("Setting playback rate: $rate")
mediaPlayer?.rate = rate
}
private fun setInitialExternalSubtitles() {
externalSubtitles?.let { subtitles ->
for (subtitle in subtitles) {
val subtitleName = subtitle["name"]
val subtitleURL = subtitle["DeliveryUrl"]
if (!subtitleName.isNullOrEmpty() && !subtitleURL.isNullOrEmpty()) {
log.debug("Setting external subtitle: $subtitleName $subtitleURL")
setSubtitleURL(subtitleURL, subtitleName)
}
}
}
}
override fun onDetachedFromWindow() {
log.debug("onDetachedFromWindow")
super.onDetachedFromWindow()
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
currentActivity.setPictureInPictureParams(
PictureInPictureParams.Builder()
.setAutoEnterEnabled(false)
.build()
)
}
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) {
currentActivity.unregisterReceiver(actionReceiver)
}
currentActivity.removeOnPictureInPictureModeChangedListener(pipChangeListener)
VLCManager.listeners.clear()
mediaPlayer?.stop()
handler.removeCallbacks(updateProgressRunnable) // Stop updating progress
media?.release()
mediaPlayer?.release()
libVLC?.release()
mediaPlayer = null
media = null
libVLC = null
}
override fun onEvent(event: MediaPlayer.Event) {
keepScreenOn = event.type == MediaPlayer.Event.Playing || event.type == MediaPlayer.Event.Buffering
when (event.type) {
MediaPlayer.Event.Playing,
MediaPlayer.Event.Paused,
MediaPlayer.Event.Stopped,
MediaPlayer.Event.Buffering,
MediaPlayer.Event.EndReached,
MediaPlayer.Event.EncounteredError -> updatePlayerState(event)
MediaPlayer.Event.TimeChanged -> {
// Do nothing here, as we are updating progress every 1 second
}
}
}
private fun updatePlayerState(event: MediaPlayer.Event) {
val player = mediaPlayer ?: return
val currentState = event.type
val stateInfo = mutableMapOf<String, Any>(
"target" to "null", // Replace with actual target if needed
"currentTime" to player.time.toInt(),
"duration" to (player.media?.duration?.toInt() ?: 0),
"error" to false,
"isPlaying" to (currentState == MediaPlayer.Event.Playing),
"isBuffering" to (!player.isPlaying && currentState == MediaPlayer.Event.Buffering)
)
// Todo: make enum - string to prevent this when statement from becoming exhaustive
when (currentState) {
MediaPlayer.Event.Playing ->
stateInfo["state"] = "Playing"
MediaPlayer.Event.Paused ->
stateInfo["state"] = "Paused"
MediaPlayer.Event.Buffering ->
stateInfo["state"] = "Buffering"
MediaPlayer.Event.EncounteredError -> {
stateInfo["state"] = "Error"
onVideoLoadEnd(stateInfo);
}
MediaPlayer.Event.Opening ->
stateInfo["state"] = "Opening"
}
if (lastReportedState != currentState || lastReportedIsPlaying != player.isPlaying) {
lastReportedState = currentState
lastReportedIsPlaying = player.isPlaying
onVideoStateChange(stateInfo)
}
}
private fun updateVideoProgress() {
val player = mediaPlayer ?: return
val currentTimeMs = player.time.toInt()
val durationMs = player.media?.duration?.toInt() ?: 0
if (currentTimeMs >= 0 && currentTimeMs < durationMs) {
// Set subtitle URL if available
if (player.isPlaying && !isMediaReady) {
isMediaReady = true
externalTrack?.let {
val name = it["name"]
val deliveryUrl = it["DeliveryUrl"] ?: ""
if (!name.isNullOrEmpty() && !deliveryUrl.isNullOrEmpty()) {
setSubtitleURL(deliveryUrl, name)
}
}
}
onVideoProgress(mapOf(
"currentTime" to currentTimeMs,
"duration" to durationMs
));
}
}
override fun onPause(activity: Activity?) {
log.debug("Pausing activity...")
}
override fun onResume(activity: Activity?) {
log.debug("Resuming activity...")
if (isPaused) play()
}
}
internal fun Context.findActivity(): androidx.activity.ComponentActivity {
var context = this
while (context is ContextWrapper) {
if (context is androidx.activity.ComponentActivity) return context
context = context.baseContext
}
throw IllegalStateException("Failed to find ComponentActivity")
}

View File

@@ -1,9 +0,0 @@
{
"platforms": ["ios", "tvos", "android", "web"],
"ios": {
"modules": ["VlcPlayerModule"]
},
"android": {
"modules": ["expo.modules.vlcplayer.VlcPlayerModule"]
}
}

View File

@@ -1,23 +0,0 @@
Pod::Spec.new do |s|
s.name = 'VlcPlayer'
s.version = '3.6.1b1'
s.summary = 'A sample project summary'
s.description = 'A sample project description'
s.author = ''
s.homepage = 'https://docs.expo.dev/modules/'
s.platforms = { :ios => '13.4', :tvos => '13.4' }
s.source = { git: '' }
s.static_framework = true
s.dependency 'ExpoModulesCore'
s.ios.dependency 'MobileVLCKit', s.version
s.tvos.dependency 'TVVLCKit', s.version
# Swift/Objective-C compatibility
s.pod_target_xcconfig = {
'DEFINES_MODULE' => 'YES',
'SWIFT_COMPILATION_MODE' => 'wholemodule'
}
s.source_files = "*.{h,m,mm,swift,hpp,cpp}"
end

View File

@@ -1,88 +0,0 @@
import ExpoModulesCore
public class VlcPlayerModule: Module {
public func definition() -> ModuleDefinition {
Name("VlcPlayer")
View(VlcPlayerView.self) {
Prop("source") { (view: VlcPlayerView, source: [String: Any]) in
view.setSource(source)
}
Prop("paused") { (view: VlcPlayerView, paused: Bool) in
if paused {
view.pause()
} else {
view.play()
}
}
Prop("nowPlayingMetadata") { (view: VlcPlayerView, metadata: [String: String]?) in
if let metadata = metadata {
view.setNowPlayingMetadata(metadata)
}
}
Events(
"onPlaybackStateChanged",
"onVideoStateChange",
"onVideoLoadStart",
"onVideoLoadEnd",
"onVideoProgress",
"onVideoError",
"onPipStarted"
)
AsyncFunction("startPictureInPicture") { (view: VlcPlayerView) in
view.startPictureInPicture()
}
AsyncFunction("play") { (view: VlcPlayerView) in
view.play()
}
AsyncFunction("pause") { (view: VlcPlayerView) in
view.pause()
}
AsyncFunction("stop") { (view: VlcPlayerView) in
view.stop()
}
AsyncFunction("seekTo") { (view: VlcPlayerView, time: Int32) in
view.seekTo(time)
}
AsyncFunction("setAudioTrack") { (view: VlcPlayerView, trackIndex: Int) in
view.setAudioTrack(trackIndex)
}
AsyncFunction("getAudioTracks") { (view: VlcPlayerView) -> [[String: Any]]? in
return view.getAudioTracks()
}
AsyncFunction("setSubtitleURL") { (view: VlcPlayerView, url: String, name: String) in
view.setSubtitleURL(url, name: name)
}
AsyncFunction("setSubtitleTrack") { (view: VlcPlayerView, trackIndex: Int) in
view.setSubtitleTrack(trackIndex)
}
AsyncFunction("setVideoAspectRatio") { (view: VlcPlayerView, aspectRatio: String?) in
view.setVideoAspectRatio(aspectRatio)
}
AsyncFunction("setVideoScaleFactor") { (view: VlcPlayerView, scaleFactor: Float) in
view.setVideoScaleFactor(scaleFactor)
}
AsyncFunction("getSubtitleTracks") { (view: VlcPlayerView) -> [[String: Any]]? in
return view.getSubtitleTracks()
}
AsyncFunction("setRate") { (view: VlcPlayerView, rate: Float) in
view.setRate(rate)
}
}
}
}

View File

@@ -1,725 +0,0 @@
import ExpoModulesCore
import MediaPlayer
import AVFoundation
#if os(tvOS)
import TVVLCKit
#else
import MobileVLCKit
#endif
class VlcPlayerView: ExpoView {
private var mediaPlayer: VLCMediaPlayer?
private var videoView: UIView?
private var progressUpdateInterval: TimeInterval = 1.0 // Update interval set to 1 second
private var isPaused: Bool = false
private var currentGeometryCString: [CChar]?
private var lastReportedState: VLCMediaPlayerState?
private var lastReportedIsPlaying: Bool?
private var customSubtitles: [(internalName: String, originalName: String)] = []
private var startPosition: Int32 = 0
private var externalSubtitles: [[String: String]]?
private var externalTrack: [String: String]?
private var progressTimer: DispatchSourceTimer?
private var isStopping: Bool = false // Define isStopping here
private var lastProgressCall = Date().timeIntervalSince1970
var hasSource = false
var isTranscoding = false
private var initialSeekPerformed: Bool = false
private var nowPlayingMetadata: [String: String]?
private var artworkImage: UIImage?
private var artworkDownloadTask: URLSessionDataTask?
// MARK: - Initialization
required init(appContext: AppContext? = nil) {
super.init(appContext: appContext)
setupView()
setupNotifications()
setupRemoteCommandCenter()
setupAudioSession()
}
// MARK: - Setup
private func setupView() {
DispatchQueue.main.async {
self.backgroundColor = .black
self.videoView = UIView()
self.videoView?.translatesAutoresizingMaskIntoConstraints = false
if let videoView = self.videoView {
self.addSubview(videoView)
NSLayoutConstraint.activate([
videoView.leadingAnchor.constraint(equalTo: self.leadingAnchor),
videoView.trailingAnchor.constraint(equalTo: self.trailingAnchor),
videoView.topAnchor.constraint(equalTo: self.topAnchor),
videoView.bottomAnchor.constraint(equalTo: self.bottomAnchor),
])
}
}
}
private func setupNotifications() {
NotificationCenter.default.addObserver(
self, selector: #selector(applicationWillResignActive),
name: UIApplication.willResignActiveNotification, object: nil)
NotificationCenter.default.addObserver(
self, selector: #selector(applicationDidBecomeActive),
name: UIApplication.didBecomeActiveNotification, object: nil)
#if !os(tvOS)
// Handle audio session interruptions (e.g., incoming calls, other apps playing audio)
NotificationCenter.default.addObserver(
self, selector: #selector(handleAudioSessionInterruption),
name: AVAudioSession.interruptionNotification, object: nil)
#endif
}
private func setupAudioSession() {
#if !os(tvOS)
do {
let audioSession = AVAudioSession.sharedInstance()
try audioSession.setCategory(.playback, mode: .moviePlayback, options: [])
try audioSession.setActive(true)
print("Audio session configured for media controls")
} catch {
print("Failed to setup audio session: \(error)")
}
#endif
}
private func setupRemoteCommandCenter() {
#if !os(tvOS)
let commandCenter = MPRemoteCommandCenter.shared()
// Play command
commandCenter.playCommand.isEnabled = true
commandCenter.playCommand.addTarget { [weak self] _ in
self?.play()
return .success
}
// Pause command
commandCenter.pauseCommand.isEnabled = true
commandCenter.pauseCommand.addTarget { [weak self] _ in
self?.pause()
return .success
}
// Toggle play/pause command
commandCenter.togglePlayPauseCommand.isEnabled = true
commandCenter.togglePlayPauseCommand.addTarget { [weak self] _ in
guard let self = self, let player = self.mediaPlayer else {
return .commandFailed
}
if player.isPlaying {
self.pause()
} else {
self.play()
}
return .success
}
// Seek forward command
commandCenter.skipForwardCommand.isEnabled = true
commandCenter.skipForwardCommand.preferredIntervals = [15]
commandCenter.skipForwardCommand.addTarget { [weak self] event in
guard let self = self, let player = self.mediaPlayer else {
return .commandFailed
}
let skipInterval = (event as? MPSkipIntervalCommandEvent)?.interval ?? 15
let currentTime = player.time.intValue
self.seekTo(currentTime + Int32(skipInterval * 1000))
return .success
}
// Seek backward command
commandCenter.skipBackwardCommand.isEnabled = true
commandCenter.skipBackwardCommand.preferredIntervals = [15]
commandCenter.skipBackwardCommand.addTarget { [weak self] event in
guard let self = self, let player = self.mediaPlayer else {
return .commandFailed
}
let skipInterval = (event as? MPSkipIntervalCommandEvent)?.interval ?? 15
let currentTime = player.time.intValue
self.seekTo(max(0, currentTime - Int32(skipInterval * 1000)))
return .success
}
// Change playback position command (scrubbing)
commandCenter.changePlaybackPositionCommand.isEnabled = true
commandCenter.changePlaybackPositionCommand.addTarget { [weak self] event in
guard let self = self,
let event = event as? MPChangePlaybackPositionCommandEvent else {
return .commandFailed
}
let positionTime = event.positionTime
self.seekTo(Int32(positionTime * 1000))
return .success
}
print("Remote command center configured")
#endif
}
private func cleanupRemoteCommandCenter() {
#if !os(tvOS)
let commandCenter = MPRemoteCommandCenter.shared()
// Remove all command targets to prevent memory leaks
commandCenter.playCommand.removeTarget(nil)
commandCenter.pauseCommand.removeTarget(nil)
commandCenter.togglePlayPauseCommand.removeTarget(nil)
commandCenter.skipForwardCommand.removeTarget(nil)
commandCenter.skipBackwardCommand.removeTarget(nil)
commandCenter.changePlaybackPositionCommand.removeTarget(nil)
// Disable commands
commandCenter.playCommand.isEnabled = false
commandCenter.pauseCommand.isEnabled = false
commandCenter.togglePlayPauseCommand.isEnabled = false
commandCenter.skipForwardCommand.isEnabled = false
commandCenter.skipBackwardCommand.isEnabled = false
commandCenter.changePlaybackPositionCommand.isEnabled = false
print("Remote command center cleaned up")
#endif
}
// MARK: - Public Methods
func startPictureInPicture() {}
@objc func play() {
DispatchQueue.main.async {
self.mediaPlayer?.play()
self.isPaused = false
self.updateNowPlayingInfo()
print("Play")
}
}
@objc func pause() {
DispatchQueue.main.async {
self.mediaPlayer?.pause()
self.isPaused = true
self.updateNowPlayingInfo()
}
}
@objc func handleAudioSessionInterruption(_ notification: Notification) {
#if !os(tvOS)
guard let userInfo = notification.userInfo,
let typeValue = userInfo[AVAudioSessionInterruptionTypeKey] as? UInt,
let type = AVAudioSession.InterruptionType(rawValue: typeValue) else {
return
}
switch type {
case .began:
// Interruption began - pause the video
print("Audio session interrupted - pausing video")
self.pause()
case .ended:
// Interruption ended - check if we should resume
if let optionsValue = userInfo[AVAudioSessionInterruptionOptionKey] as? UInt {
let options = AVAudioSession.InterruptionOptions(rawValue: optionsValue)
if options.contains(.shouldResume) {
print("Audio session interruption ended - can resume")
// Don't auto-resume - let user manually resume playback
} else {
print("Audio session interruption ended - should not resume")
}
}
@unknown default:
break
}
#endif
}
@objc func seekTo(_ time: Int32) {
DispatchQueue.main.async {
guard let player = self.mediaPlayer else { return }
let wasPlaying = player.isPlaying
if wasPlaying {
player.pause()
}
if let duration = player.media?.length.intValue {
print("Seeking to time: \(time) Video Duration \(duration)")
// If the specified time is greater than the duration, seek to the end
let seekTime = time > duration ? duration - 1000 : time
player.time = VLCTime(int: seekTime)
if wasPlaying {
player.play()
}
self.updatePlayerState()
self.updateNowPlayingInfo()
} else {
print("Error: Unable to retrieve video duration")
}
}
}
@objc func setSource(_ source: [String: Any]) {
DispatchQueue.main.async { [weak self] in
guard let self = self else { return }
if self.hasSource {
return
}
let mediaOptions = source["mediaOptions"] as? [String: Any] ?? [:]
self.externalTrack = source["externalTrack"] as? [String: String]
var initOptions = source["initOptions"] as? [Any] ?? []
self.startPosition = source["startPosition"] as? Int32 ?? 0
self.externalSubtitles = source["externalSubtitles"] as? [[String: String]]
guard let uri = source["uri"] as? String, !uri.isEmpty else {
print("Error: Invalid or empty URI")
self.onVideoError?(["error": "Invalid or empty URI"])
return
}
self.isTranscoding = uri.contains("m3u8")
if !self.isTranscoding, self.startPosition > 0 {
initOptions.append("--start-time=\(self.startPosition)")
}
let autoplay = source["autoplay"] as? Bool ?? false
let isNetwork = source["isNetwork"] as? Bool ?? false
self.onVideoLoadStart?(["target": self.reactTag ?? NSNull()])
self.mediaPlayer = VLCMediaPlayer(options: initOptions)
self.mediaPlayer?.delegate = self
self.mediaPlayer?.drawable = self.videoView
self.mediaPlayer?.scaleFactor = 0
self.initialSeekPerformed = false
let media: VLCMedia
if isNetwork {
print("Loading network file: \(uri)")
media = VLCMedia(url: URL(string: uri)!)
} else {
print("Loading local file: \(uri)")
if uri.starts(with: "file://"), let url = URL(string: uri) {
media = VLCMedia(url: url)
} else {
media = VLCMedia(path: uri)
}
}
print("Debug: Media options: \(mediaOptions)")
media.addOptions(mediaOptions)
self.mediaPlayer?.media = media
self.setInitialExternalSubtitles()
self.hasSource = true
if autoplay {
print("Playing...")
self.play()
}
}
}
@objc func setAudioTrack(_ trackIndex: Int) {
self.mediaPlayer?.currentAudioTrackIndex = Int32(trackIndex)
}
@objc func getAudioTracks() -> [[String: Any]]? {
guard let trackNames = mediaPlayer?.audioTrackNames,
let trackIndexes = mediaPlayer?.audioTrackIndexes
else {
return nil
}
return zip(trackNames, trackIndexes).map { name, index in
return ["name": name, "index": index]
}
}
@objc func setSubtitleTrack(_ trackIndex: Int) {
print("Debug: Attempting to set subtitle track to index: \(trackIndex)")
self.mediaPlayer?.currentVideoSubTitleIndex = Int32(trackIndex)
print(
"Debug: Current subtitle track index after setting: \(self.mediaPlayer?.currentVideoSubTitleIndex ?? -1)"
)
}
@objc func setSubtitleURL(_ subtitleURL: String, name: String) {
guard let url = URL(string: subtitleURL) else {
print("Error: Invalid subtitle URL")
return
}
let result = self.mediaPlayer?.addPlaybackSlave(url, type: .subtitle, enforce: false)
if let result = result {
let internalName = "Track \(self.customSubtitles.count)"
print("Subtitle added with result: \(result) \(internalName)")
self.customSubtitles.append((internalName: internalName, originalName: name))
} else {
print("Failed to add subtitle")
}
}
private func setInitialExternalSubtitles() {
if let externalSubtitles = self.externalSubtitles {
for subtitle in externalSubtitles {
if let subtitleName = subtitle["name"],
let subtitleURL = subtitle["DeliveryUrl"]
{
print("Setting external subtitle: \(subtitleName) \(subtitleURL)")
self.setSubtitleURL(subtitleURL, name: subtitleName)
}
}
}
}
@objc func getSubtitleTracks() -> [[String: Any]]? {
guard let mediaPlayer = self.mediaPlayer else {
return nil
}
let count = mediaPlayer.numberOfSubtitlesTracks
print("Debug: Number of subtitle tracks: \(count)")
guard count > 0 else {
return nil
}
var tracks: [[String: Any]] = []
if let names = mediaPlayer.videoSubTitlesNames as? [String],
let indexes = mediaPlayer.videoSubTitlesIndexes as? [NSNumber]
{
for (index, name) in zip(indexes, names) {
if let customSubtitle = customSubtitles.first(where: { $0.internalName == name }) {
tracks.append(["name": customSubtitle.originalName, "index": index.intValue])
} else {
tracks.append(["name": name, "index": index.intValue])
}
}
}
print("Debug: Subtitle tracks: \(tracks)")
return tracks
}
@objc func setVideoAspectRatio(_ aspectRatio: String?) {
DispatchQueue.main.async {
if let aspectRatio = aspectRatio {
// Convert String to C string for VLC
let cString = strdup(aspectRatio)
self.mediaPlayer?.videoAspectRatio = cString
} else {
// Reset to default (let VLC determine aspect ratio)
self.mediaPlayer?.videoAspectRatio = nil
}
}
}
@objc func setVideoScaleFactor(_ scaleFactor: Float) {
DispatchQueue.main.async {
self.mediaPlayer?.scaleFactor = scaleFactor
print("Set video scale factor: \(scaleFactor)")
}
}
@objc func setRate(_ rate: Float) {
DispatchQueue.main.async {
self.mediaPlayer?.rate = rate
print("Set playback rate: \(rate)")
}
}
@objc func setNowPlayingMetadata(_ metadata: [String: String]) {
// Cancel any existing artwork download to prevent race conditions
artworkDownloadTask?.cancel()
artworkDownloadTask = nil
self.nowPlayingMetadata = metadata
print("[NowPlaying] Metadata received: \(metadata)")
// Load artwork asynchronously if provided
if let artworkUri = metadata["artworkUri"], let url = URL(string: artworkUri) {
print("[NowPlaying] Loading artwork from: \(artworkUri)")
artworkDownloadTask = URLSession.shared.dataTask(with: url) { [weak self] data, _, error in
guard let self = self else { return }
if let error = error as NSError?, error.code == NSURLErrorCancelled {
print("[NowPlaying] Artwork download cancelled")
return
}
if let error = error {
print("[NowPlaying] Artwork loading error: \(error)")
DispatchQueue.main.async {
self.updateNowPlayingInfo()
}
} else if let data = data, let image = UIImage(data: data) {
print("[NowPlaying] Artwork loaded successfully, size: \(image.size)")
self.artworkImage = image
DispatchQueue.main.async {
self.updateNowPlayingInfo()
}
} else {
print("[NowPlaying] Failed to create image from data")
// Update Now Playing info without artwork on failure
DispatchQueue.main.async {
self.updateNowPlayingInfo()
}
}
}
artworkDownloadTask?.resume()
} else {
// No artwork URI provided - update immediately
print("[NowPlaying] No artwork URI provided")
artworkImage = nil
DispatchQueue.main.async {
self.updateNowPlayingInfo()
}
}
}
@objc func stop(completion: (() -> Void)? = nil) {
guard !isStopping else {
completion?()
return
}
isStopping = true
// If we're not on the main thread, dispatch to main thread
if !Thread.isMainThread {
DispatchQueue.main.async { [weak self] in
self?.performStop(completion: completion)
}
} else {
performStop(completion: completion)
}
}
// MARK: - Private Methods
@objc private func applicationWillResignActive() {
}
@objc private func applicationDidBecomeActive() {
}
private func performStop(completion: (() -> Void)? = nil) {
// Stop the media player
mediaPlayer?.stop()
// Cancel any in-flight artwork downloads
artworkDownloadTask?.cancel()
artworkDownloadTask = nil
artworkImage = nil
// Cleanup remote command center targets
cleanupRemoteCommandCenter()
#if !os(tvOS)
// Deactivate audio session to allow other apps to use audio
do {
try AVAudioSession.sharedInstance().setActive(false, options: .notifyOthersOnDeactivation)
print("Audio session deactivated")
} catch {
print("Failed to deactivate audio session: \(error)")
}
// Clear Now Playing info
MPNowPlayingInfoCenter.default().nowPlayingInfo = nil
#endif
// Remove observer
NotificationCenter.default.removeObserver(self)
// Clear the video view
videoView?.removeFromSuperview()
videoView = nil
// Release the media player
mediaPlayer?.delegate = nil
mediaPlayer = nil
isStopping = false
completion?()
}
private func updateVideoProgress() {
guard let player = self.mediaPlayer else { return }
let currentTimeMs = player.time.intValue
let durationMs = player.media?.length.intValue ?? 0
print("Debug: Current time: \(currentTimeMs)")
if currentTimeMs >= 0 && currentTimeMs < durationMs {
if self.isTranscoding, !self.initialSeekPerformed, self.startPosition > 0 {
player.time = VLCTime(int: self.startPosition * 1000)
self.initialSeekPerformed = true
}
self.onVideoProgress?([
"currentTime": currentTimeMs,
"duration": durationMs,
])
}
// Update Now Playing info to sync elapsed playback time
// iOS needs periodic updates to keep progress indicator in sync
DispatchQueue.main.async {
self.updateNowPlayingInfo()
}
}
private func updateNowPlayingInfo() {
#if !os(tvOS)
guard let player = self.mediaPlayer else { return }
var nowPlayingInfo = [String: Any]()
// Playback rate (0.0 = paused, 1.0 = playing at normal speed)
nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = player.isPlaying ? player.rate : 0.0
// Current playback time in seconds
let currentTimeSeconds = Double(player.time.intValue) / 1000.0
nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = currentTimeSeconds
// Total duration in seconds
if let duration = player.media?.length.intValue {
let durationSeconds = Double(duration) / 1000.0
nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = durationSeconds
}
// Add metadata if available
if let metadata = self.nowPlayingMetadata {
if let title = metadata["title"] {
nowPlayingInfo[MPMediaItemPropertyTitle] = title
print("[NowPlaying] Setting title: \(title)")
}
if let artist = metadata["artist"] {
nowPlayingInfo[MPMediaItemPropertyArtist] = artist
print("[NowPlaying] Setting artist: \(artist)")
}
if let albumTitle = metadata["albumTitle"] {
nowPlayingInfo[MPMediaItemPropertyAlbumTitle] = albumTitle
print("[NowPlaying] Setting album: \(albumTitle)")
}
}
// Add artwork if available
if let artwork = self.artworkImage {
print("[NowPlaying] Setting artwork with size: \(artwork.size)")
let artworkItem = MPMediaItemArtwork(boundsSize: artwork.size) { _ in
return artwork
}
nowPlayingInfo[MPMediaItemPropertyArtwork] = artworkItem
}
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
#endif
}
// MARK: - Expo Events
@objc var onPlaybackStateChanged: RCTDirectEventBlock?
@objc var onVideoLoadStart: RCTDirectEventBlock?
@objc var onVideoStateChange: RCTDirectEventBlock?
@objc var onVideoProgress: RCTDirectEventBlock?
@objc var onVideoLoadEnd: RCTDirectEventBlock?
@objc var onVideoError: RCTDirectEventBlock?
@objc var onPipStarted: RCTDirectEventBlock?
// MARK: - Deinitialization
deinit {
performStop()
}
}
extension VlcPlayerView: VLCMediaPlayerDelegate {
func mediaPlayerTimeChanged(_ aNotification: Notification) {
// self?.updateVideoProgress()
let timeNow = Date().timeIntervalSince1970
if timeNow - lastProgressCall >= 1 {
lastProgressCall = timeNow
updateVideoProgress()
}
}
func mediaPlayerStateChanged(_ aNotification: Notification) {
self.updatePlayerState()
}
private func updatePlayerState() {
guard let player = self.mediaPlayer else { return }
let currentState = player.state
var stateInfo: [String: Any] = [
"target": self.reactTag ?? NSNull(),
"currentTime": player.time.intValue,
"duration": player.media?.length.intValue ?? 0,
"error": false,
]
if player.isPlaying {
stateInfo["isPlaying"] = true
stateInfo["isBuffering"] = false
stateInfo["state"] = "Playing"
} else {
stateInfo["isPlaying"] = false
stateInfo["state"] = "Paused"
}
if player.state == VLCMediaPlayerState.buffering {
stateInfo["isBuffering"] = true
stateInfo["state"] = "Buffering"
} else if player.state == VLCMediaPlayerState.error {
print("player.state ~ error")
stateInfo["state"] = "Error"
self.onVideoLoadEnd?(stateInfo)
} else if player.state == VLCMediaPlayerState.opening {
print("player.state ~ opening")
stateInfo["state"] = "Opening"
}
if self.lastReportedState != currentState
|| self.lastReportedIsPlaying != player.isPlaying
{
self.lastReportedState = currentState
self.lastReportedIsPlaying = player.isPlaying
self.onVideoStateChange?(stateInfo)
}
}
}
extension VlcPlayerView: VLCMediaDelegate {
// Implement VLCMediaDelegate methods if needed
}
extension VLCMediaPlayerState {
var description: String {
switch self {
case .opening: return "Opening"
case .buffering: return "Buffering"
case .playing: return "Playing"
case .paused: return "Paused"
case .stopped: return "Stopped"
case .ended: return "Ended"
case .error: return "Error"
case .esAdded: return "ESAdded"
@unknown default: return "Unknown"
}
}
}

View File

@@ -1,5 +0,0 @@
import { requireNativeModule } from "expo-modules-core";
// It loads the native module object from the JSI or falls back to
// the bridge module (from NativeModulesProxy) if the remote debugger is on.
export default requireNativeModule("VlcPlayer");

View File

@@ -61,7 +61,6 @@
"expo-router": "~6.0.21", "expo-router": "~6.0.21",
"expo-screen-orientation": "~9.0.8", "expo-screen-orientation": "~9.0.8",
"expo-secure-store": "^15.0.8", "expo-secure-store": "^15.0.8",
"expo-sensors": "~15.0.8",
"expo-sharing": "~14.0.8", "expo-sharing": "~14.0.8",
"expo-splash-screen": "~31.0.13", "expo-splash-screen": "~31.0.13",
"expo-status-bar": "~3.0.9", "expo-status-bar": "~3.0.9",

24
plugins/withGitPod.js Normal file
View File

@@ -0,0 +1,24 @@
const { withPodfile } = require("@expo/config-plugins");
const withGitPod = (config, { podName, podspecUrl }) => {
return withPodfile(config, (config) => {
const podfile = config.modResults.contents;
const podLine = ` pod '${podName}', :podspec => '${podspecUrl}'`;
// Check if already added
if (podfile.includes(podLine)) {
return config;
}
// Insert after "use_expo_modules!"
config.modResults.contents = podfile.replace(
"use_expo_modules!",
`use_expo_modules!\n${podLine}`,
);
return config;
});
};
module.exports = withGitPod;
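For reference, a minimal sketch of how such a plugin could be wired up from the Expo config. This is not part of the commit; the pod name and podspec URL are placeholders, and the app name/slug are illustrative only.

// app.config.ts (illustrative only; assumes the plugin lives at plugins/withGitPod.js)
import type { ExpoConfig } from "expo/config";

const config: ExpoConfig = {
  name: "Streamyfin",
  slug: "streamyfin",
  plugins: [
    [
      "./plugins/withGitPod.js",
      {
        podName: "SomeNativePod", // hypothetical pod name
        podspecUrl: "https://example.com/SomeNativePod.podspec", // hypothetical podspec URL
      },
    ],
  ],
};

export default config;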

View File

@@ -1,38 +0,0 @@
const { withDangerousMod } = require("@expo/config-plugins");
const fs = require("node:fs");
const path = require("node:path");
const withKSPlayer = (config) => {
return withDangerousMod(config, [
"ios",
async (config) => {
const podfilePath = path.join(
config.modRequest.platformProjectRoot,
"Podfile",
);
let podfileContent = fs.readFileSync(podfilePath, "utf8");
// KSPlayer and its dependencies
const ksPlayerPods = `
# KSPlayer dependencies (GPU acceleration + native PiP)
pod 'KSPlayer', :git => 'https://github.com/kingslay/KSPlayer.git', :tag => '2.3.4', :modular_headers => true
pod 'DisplayCriteria', :git => 'https://github.com/kingslay/KSPlayer.git', :tag => '2.3.4', :modular_headers => true
pod 'FFmpegKit', :git => 'https://github.com/kingslay/FFmpegKit.git', :tag => '6.1.3', :modular_headers => true
pod 'Libass', :git => 'https://github.com/kingslay/FFmpegKit.git', :tag => '6.1.3', :modular_headers => true
`;
// Only add if not already present
if (!podfileContent.includes("pod 'KSPlayer'")) {
podfileContent = podfileContent.replace(
/use_expo_modules!/,
`use_expo_modules!\n${ksPlayerPods}`,
);
fs.writeFileSync(podfilePath, podfileContent);
}
return config;
},
]);
};
module.exports = withKSPlayer;

View File

@@ -7,7 +7,7 @@ import type React from "react";
 import { createContext, useCallback, useContext, useState } from "react";
 import { Platform } from "react-native";
 import type { Bitrate } from "@/components/BitrateSelector";
-import { settingsAtom, VideoPlayerIOS } from "@/utils/atoms/settings";
+import { settingsAtom } from "@/utils/atoms/settings";
 import { getStreamUrl } from "@/utils/jellyfin/media/getStreamUrl";
 import { generateDeviceProfile } from "@/utils/profiles/native";
 import { apiAtom, userAtom } from "./JellyfinProvider";
@@ -78,17 +78,10 @@ export const PlaySettingsProvider: React.FC<{ children: React.ReactNode }> = ({
     }
     try {
-      // Determine which player is being used:
-      // - Android always uses VLC
-      // - iOS uses user setting (VLC is default)
-      const useVlcPlayer =
-        Platform.OS === "android" ||
-        (Platform.OS === "ios" &&
-          settings.videoPlayerIOS === VideoPlayerIOS.VLC);
+      // Generate device profile for MPV player
       const native = generateDeviceProfile({
        platform: Platform.OS as "ios" | "android",
-        player: useVlcPlayer ? "vlc" : "ksplayer",
+        player: "mpv",
        audioMode: settings.audioTranscodeMode,
      });
      const data = await getStreamUrl({

View File

@@ -134,18 +134,16 @@ export enum VideoPlayer {
 MPV = 0,
 }

-// iOS video player selection
-export enum VideoPlayerIOS {
-  KSPlayer = "ksplayer",
-  VLC = "vlc",
-}
-
 // Audio transcoding mode - controls how surround audio is handled
+// This controls server-side transcoding behavior for audio streams.
+// MPV decodes via FFmpeg and supports most formats, but mobile devices
+// can't passthrough to external receivers, so this primarily affects
+// bandwidth usage and server load.
 export enum AudioTranscodeMode {
-  Auto = "auto", // Platform/player defaults (recommended)
+  Auto = "auto", // Platform defaults (recommended)
   ForceStereo = "stereo", // Always transcode to stereo
   Allow51 = "5.1", // Allow up to 5.1, transcode 7.1+
-  AllowAll = "passthrough", // Direct play all (for external DAC users)
+  AllowAll = "passthrough", // Direct play all audio formats
 }

 export type Settings = {
@@ -192,20 +190,6 @@ export type Settings = {
   mpvSubtitleAlignX?: "left" | "center" | "right";
   mpvSubtitleAlignY?: "top" | "center" | "bottom";
   mpvSubtitleFontSize?: number;
-  // KSPlayer settings
-  ksHardwareDecode: boolean;
-  ksSubtitleColor: string;
-  ksSubtitleBackgroundColor: string;
-  ksSubtitleFontName: string;
-  // VLC subtitle settings
-  vlcTextColor?: string;
-  vlcBackgroundColor?: string;
-  vlcBackgroundOpacity?: number;
-  vlcOutlineColor?: string;
-  vlcOutlineOpacity?: number;
-  vlcOutlineThickness?: "None" | "Thin" | "Normal" | "Thick";
-  vlcIsBold?: boolean;
-  vlcSubtitleMargin?: number;
   // Gesture controls
   enableHorizontalSwipeSkip: boolean;
   enableLeftSideBrightnessSwipe: boolean;
@@ -215,8 +199,6 @@ export type Settings = {
   usePopularPlugin: boolean;
   showLargeHomeCarousel: boolean;
   mergeNextUpAndContinueWatching: boolean;
-  // iOS video player selection
-  videoPlayerIOS: VideoPlayerIOS;
   // Appearance
   hideRemoteSessionButton: boolean;
   hideWatchlistsTab: boolean;
@@ -292,20 +274,6 @@ export const defaultValues: Settings = {
   mpvSubtitleAlignX: undefined,
   mpvSubtitleAlignY: undefined,
   mpvSubtitleFontSize: undefined,
-  // KSPlayer defaults
-  ksHardwareDecode: true,
-  ksSubtitleColor: "#FFFFFF",
-  ksSubtitleBackgroundColor: "#00000080",
-  ksSubtitleFontName: "System",
-  // VLC subtitle defaults
-  vlcTextColor: "White",
-  vlcBackgroundColor: "Black",
-  vlcBackgroundOpacity: 128,
-  vlcOutlineColor: "Black",
-  vlcOutlineOpacity: 255,
-  vlcOutlineThickness: "Normal",
-  vlcIsBold: false,
-  vlcSubtitleMargin: 40,
   // Gesture controls
   enableHorizontalSwipeSkip: true,
   enableLeftSideBrightnessSwipe: true,
@@ -315,8 +283,6 @@ export const defaultValues: Settings = {
   usePopularPlugin: true,
   showLargeHomeCarousel: false,
   mergeNextUpAndContinueWatching: false,
-  // iOS video player selection - default to VLC
-  videoPlayerIOS: VideoPlayerIOS.VLC,
   // Appearance
   hideRemoteSessionButton: false,
   hideWatchlistsTab: false,
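To make the new AudioTranscodeMode comment concrete, here is a small sketch (not part of the commit) of the channel cap each mode implies, mirroring the switch added to utils/profiles/native.js later in this diff:

// Illustrative only - MPV decodes every codec; the mode merely caps MaxAudioChannels.
type AudioTranscodeModeValue = "auto" | "stereo" | "5.1" | "passthrough";

const maxChannelsFor = (mode: AudioTranscodeModeValue): string => {
  switch (mode) {
    case "stereo":
      return "2"; // anything above stereo is transcoded by the server
    case "5.1":
      return "6"; // 7.1+ sources get transcoded down
    case "passthrough":
      return "8"; // direct play everything (external DAC/receiver setups)
    default:
      return "6"; // "auto" defaults to a 5.1 cap
  }
};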

View File

@@ -1,7 +1,7 @@
 import type { Api } from "@jellyfin/sdk";
 import type { MediaSourceInfo } from "@jellyfin/sdk/lib/generated-client/models";
 import { getMediaInfoApi } from "@jellyfin/sdk/lib/utils/api";
-import native from "@/utils/profiles/native";
+import trackPlayerProfile from "@/utils/profiles/trackplayer";

 export interface AudioStreamResult {
   url: string;
@@ -26,7 +26,7 @@ export const getAudioStreamUrl = async (
     method: "POST",
     data: {
       userId,
-      deviceProfile: native,
+      deviceProfile: trackPlayerProfile,
       startTimeTicks: 0,
       isPlayback: true,
       autoOpenLiveStream: true,

View File

@@ -4,7 +4,10 @@ import type {
   MediaSourceInfo,
 } from "@jellyfin/sdk/lib/generated-client/models";
 import { Bitrate } from "@/components/BitrateSelector";
-import { generateDeviceProfile } from "@/utils/profiles/native";
+import {
+  type AudioTranscodeModeType,
+  generateDeviceProfile,
+} from "@/utils/profiles/native";
 import { getDownloadStreamUrl, getStreamUrl } from "./getStreamUrl";

 export const getDownloadUrl = async ({
@@ -16,6 +19,7 @@ export const getDownloadUrl = async ({
   audioStreamIndex,
   subtitleStreamIndex,
   deviceId,
+  audioMode = "auto",
 }: {
   api: Api;
   item: BaseItemDto;
@@ -25,6 +29,7 @@ export const getDownloadUrl = async ({
   audioStreamIndex: number;
   subtitleStreamIndex: number;
   deviceId: string;
+  audioMode?: AudioTranscodeModeType;
 }): Promise<{
   url: string | null;
   mediaSource: MediaSourceInfo | null;
@@ -39,7 +44,7 @@ export const getDownloadUrl = async ({
     audioStreamIndex,
     subtitleStreamIndex,
     deviceId,
-    deviceProfile: generateDeviceProfile(),
+    deviceProfile: generateDeviceProfile({ audioMode }),
   });

   if (maxBitrate.key === "Max" && !streamDetails?.mediaSource?.TranscodingUrl) {
@@ -59,6 +64,7 @@ export const getDownloadUrl = async ({
     maxStreamingBitrate: maxBitrate.value,
     audioStreamIndex,
     subtitleStreamIndex,
+    audioMode,
   });

   return {

View File

@@ -5,7 +5,8 @@ import type {
} from "@jellyfin/sdk/lib/generated-client/models"; } from "@jellyfin/sdk/lib/generated-client/models";
import { BaseItemKind } from "@jellyfin/sdk/lib/generated-client/models/base-item-kind"; import { BaseItemKind } from "@jellyfin/sdk/lib/generated-client/models/base-item-kind";
import { getMediaInfoApi } from "@jellyfin/sdk/lib/utils/api"; import { getMediaInfoApi } from "@jellyfin/sdk/lib/utils/api";
import download from "@/utils/profiles/download"; import { generateDownloadProfile } from "@/utils/profiles/download";
import type { AudioTranscodeModeType } from "@/utils/profiles/native";
interface StreamResult { interface StreamResult {
url: string; url: string;
@@ -265,6 +266,7 @@ export const getDownloadStreamUrl = async ({
subtitleStreamIndex = undefined, subtitleStreamIndex = undefined,
mediaSourceId, mediaSourceId,
deviceId, deviceId,
audioMode = "auto",
}: { }: {
api: Api | null | undefined; api: Api | null | undefined;
item: BaseItemDto | null | undefined; item: BaseItemDto | null | undefined;
@@ -274,6 +276,7 @@ export const getDownloadStreamUrl = async ({
subtitleStreamIndex?: number; subtitleStreamIndex?: number;
mediaSourceId?: string | null; mediaSourceId?: string | null;
deviceId?: string | null; deviceId?: string | null;
audioMode?: AudioTranscodeModeType;
}): Promise<{ }): Promise<{
url: string | null; url: string | null;
sessionId: string | null; sessionId: string | null;
@@ -292,7 +295,7 @@ export const getDownloadStreamUrl = async ({
method: "POST", method: "POST",
data: { data: {
userId, userId,
deviceProfile: download, deviceProfile: generateDownloadProfile(audioMode),
subtitleStreamIndex, subtitleStreamIndex,
startTimeTicks: 0, startTimeTicks: 0,
isPlayback: true, isPlayback: true,
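A hedged sketch of how a caller might thread the user's setting through the new audioMode parameter. Apart from the fields visible in the hunks above (audioStreamIndex, subtitleStreamIndex, deviceId, audioMode), the argument names here are assumptions, not code from this commit:

// Illustrative call site only
const { url, mediaSource } = await getDownloadUrl({
  api,
  item,
  userId: user.Id,          // assumed field
  maxBitrate,               // assumed field, e.g. { key: "Max", value: undefined }
  audioStreamIndex: 0,
  subtitleStreamIndex: -1,
  deviceId,
  audioMode: settings.audioTranscodeMode, // "auto" | "stereo" | "5.1" | "passthrough"
});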

View File

@@ -3,111 +3,79 @@
 * License, v. 2.0. If a copy of the MPL was not distributed with this
 * file, You can obtain one at http://mozilla.org/MPL/2.0/.
 */
-import MediaTypes from "../../constants/MediaTypes";
+import { generateDeviceProfile } from "./native";

 /**
- * Device profile for Native video player
+ * @typedef {"auto" | "stereo" | "5.1" | "passthrough"} AudioTranscodeModeType
 */
-export default {
-  Name: "1. Vlc Player",
-  MaxStaticBitrate: 20_000_000,
-  MaxStreamingBitrate: 20_000_000,
-  CodecProfiles: [
-    {
-      Type: MediaTypes.Video,
-      Codec: "h264,h265,hevc,mpeg4,divx,xvid,wmv,vc1,vp8,vp9,av1",
-    },
-    {
-      Type: MediaTypes.Audio,
-      Codec: "aac,ac3,eac3,mp3,flac,alac,opus,vorbis,pcm,wma",
-    },
-  ],
-  DirectPlayProfiles: [
-    {
-      Type: MediaTypes.Video,
-      Container: "mp4,mkv,avi,mov,flv,ts,m2ts,webm,ogv,3gp,hls",
-      VideoCodec:
-        "h264,hevc,mpeg4,divx,xvid,wmv,vc1,vp8,vp9,av1,avi,mpeg,mpeg2video",
-      AudioCodec: "aac,ac3,eac3,mp3,flac,alac,opus,vorbis,wma",
-    },
-    {
-      Type: MediaTypes.Audio,
-      Container: "mp3,aac,flac,alac,wav,ogg,wma",
-      AudioCodec:
-        "mp3,aac,flac,alac,opus,vorbis,wma,pcm,mpa,wav,ogg,oga,webma,ape",
-    },
-  ],
-  TranscodingProfiles: [
-    {
-      Type: MediaTypes.Video,
-      Context: "Streaming",
-      Protocol: "hls",
-      Container: "ts",
-      VideoCodec: "h264, hevc",
-      AudioCodec: "aac,mp3,ac3",
-      CopyTimestamps: false,
-      EnableSubtitlesInManifest: true,
-    },
-    {
-      Type: MediaTypes.Audio,
-      Context: "Streaming",
-      Protocol: "http",
-      Container: "mp3",
-      AudioCodec: "mp3",
-      MaxAudioChannels: "2",
-    },
-  ],
-  SubtitleProfiles: [
-    // Official foramts
+/**
+ * Download-specific subtitle profiles.
+ * These are more permissive than streaming profiles since we can embed subtitles.
+ */
+const downloadSubtitleProfiles = [
+  // Official formats
   { Format: "vtt", Method: "Encode" },
   { Format: "webvtt", Method: "Encode" },
   { Format: "srt", Method: "Encode" },
   { Format: "subrip", Method: "Encode" },
   { Format: "ttml", Method: "Encode" },
   { Format: "dvdsub", Method: "Encode" },
   { Format: "ass", Method: "Encode" },
   { Format: "idx", Method: "Encode" },
   { Format: "pgs", Method: "Encode" },
   { Format: "pgssub", Method: "Encode" },
   { Format: "ssa", Method: "Encode" },
   // Other formats
   { Format: "microdvd", Method: "Encode" },
   { Format: "mov_text", Method: "Encode" },
   { Format: "mpl2", Method: "Encode" },
   { Format: "pjs", Method: "Encode" },
   { Format: "realtext", Method: "Encode" },
   { Format: "scc", Method: "Encode" },
   { Format: "smi", Method: "Encode" },
   { Format: "stl", Method: "Encode" },
   { Format: "sub", Method: "Encode" },
   { Format: "subviewer", Method: "Encode" },
   { Format: "teletext", Method: "Encode" },
   { Format: "text", Method: "Encode" },
   { Format: "vplayer", Method: "Encode" },
   { Format: "xsub", Method: "Encode" },
-  ],
-};
+];
+
+/**
+ * Generates a device profile optimized for downloads.
+ * Uses the same audio codec logic as streaming but with download-specific bitrate limits.
+ *
+ * @param {AudioTranscodeModeType} [audioMode="auto"] - Audio transcoding mode
+ * @returns {Object} Jellyfin device profile for downloads
+ */
+export const generateDownloadProfile = (audioMode = "auto") => {
+  // Get the base profile with proper audio codec configuration
+  const baseProfile = generateDeviceProfile({ audioMode });
+
+  // Override with download-specific settings
+  return {
+    ...baseProfile,
+    Name: "1. MPV Download",
+    // Limit bitrate for downloads (20 Mbps)
+    MaxStaticBitrate: 20_000_000,
+    MaxStreamingBitrate: 20_000_000,
+    // Use download-specific subtitle profiles
+    SubtitleProfiles: downloadSubtitleProfiles,
+    // Update transcoding profiles with download-specific settings
+    TranscodingProfiles: baseProfile.TranscodingProfiles.map((profile) => {
+      if (profile.Type === "Video") {
+        return {
+          ...profile,
+          CopyTimestamps: false,
+          EnableSubtitlesInManifest: true,
+        };
+      }
+      return profile;
+    }),
+  };
+};
+
+// Default export for backward compatibility
+export default generateDownloadProfile();
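A quick sketch (not part of the commit) of what the new factory returns relative to the base MPV profile; property names follow the Jellyfin device-profile schema used above:

import { generateDownloadProfile } from "@/utils/profiles/download";

const profile = generateDownloadProfile("stereo");

console.log(profile.Name);             // "1. MPV Download"
console.log(profile.MaxStaticBitrate); // 20_000_000 (20 Mbps download cap)
// Codec and channel limits still come from generateDeviceProfile({ audioMode: "stereo" }),
// while SubtitleProfiles are swapped for the download-specific list above.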

View File

@@ -5,7 +5,7 @@
 */
 export type PlatformType = "ios" | "android";
-export type PlayerType = "vlc" | "ksplayer";
+export type PlayerType = "mpv";
 export type AudioTranscodeModeType = "auto" | "stereo" | "5.1" | "passthrough";

 export interface ProfileOptions {
@@ -18,3 +18,6 @@ export interface ProfileOptions {
 }

 export function generateDeviceProfile(options?: ProfileOptions): any;
+
+declare const _default: any;
+export default _default;

View File

@@ -9,22 +9,22 @@ import { getSubtitleProfiles } from "./subtitles";
 /**
  * @typedef {"ios" | "android"} PlatformType
- * @typedef {"vlc" | "ksplayer"} PlayerType
+ * @typedef {"mpv"} PlayerType
  * @typedef {"auto" | "stereo" | "5.1" | "passthrough"} AudioTranscodeModeType
  *
  * @typedef {Object} ProfileOptions
  * @property {PlatformType} [platform] - Target platform
- * @property {PlayerType} [player] - Video player being used
+ * @property {PlayerType} [player] - Video player being used (MPV only)
  * @property {AudioTranscodeModeType} [audioMode] - Audio transcoding mode
  */

 /**
- * Audio profiles for react-native-track-player based on platform capabilities.
- * iOS uses AVPlayer, Android uses ExoPlayer - each has different codec support.
+ * Audio direct play profiles for standalone audio items in MPV player.
+ * These define which audio file formats can be played directly without transcoding.
  */
 const getAudioDirectPlayProfile = (platform) => {
   if (platform === "ios") {
-    // iOS AVPlayer supported formats
+    // iOS audio formats supported by MPV
     return {
       Type: MediaTypes.Audio,
       Container: "mp3,m4a,aac,flac,alac,wav,aiff,caf",
@@ -32,7 +32,7 @@
     };
   }
-  // Android ExoPlayer supported formats
+  // Android audio formats supported by MPV
   return {
     Type: MediaTypes.Audio,
     Container: "mp3,m4a,aac,ogg,flac,wav,webm,mka",
@@ -40,16 +40,20 @@
   };
 };

+/**
+ * Audio codec profiles for standalone audio items in MPV player.
+ * These define codec constraints for audio file playback.
+ */
 const getAudioCodecProfile = (platform) => {
   if (platform === "ios") {
-    // iOS AVPlayer codec constraints
+    // iOS audio codec constraints for MPV
     return {
       Type: MediaTypes.Audio,
       Codec: "aac,ac3,eac3,mp3,flac,alac,opus,pcm",
     };
   }
-  // Android ExoPlayer codec constraints
+  // Android audio codec constraints for MPV
   return {
     Type: MediaTypes.Audio,
     Codec: "aac,ac3,eac3,mp3,flac,vorbis,opus,pcm",
@@ -57,72 +61,61 @@
   };
 };

 /**
- * Gets the video audio codec configuration based on platform, player, and audio mode.
+ * Gets the video audio codec configuration based on platform and audio mode.
  *
- * Key insight: VLC handles AC3/EAC3/DTS downmixing fine.
- * Only TrueHD and DTS-HD MA (lossless 7.1) cause issues on mobile devices
- * because VLC's internal downmixing from 7.1 to stereo fails on some Android audio pipelines.
+ * MPV (via FFmpeg) can decode all audio codecs including TrueHD and DTS-HD MA.
+ * The audioMode setting only controls the maximum channel count - MPV will
+ * decode and downmix as needed.
  *
  * @param {PlatformType} platform
- * @param {PlayerType} player
  * @param {AudioTranscodeModeType} audioMode
  * @returns {{ directPlayCodec: string, maxAudioChannels: string }}
  */
-const getVideoAudioCodecs = (platform, player, audioMode) => {
-  // Base codecs that work everywhere
+const getVideoAudioCodecs = (platform, audioMode) => {
+  // Base codecs
   const baseCodecs = "aac,mp3,flac,opus,vorbis";
-  // Surround codecs that VLC handles well (downmixes properly)
+  // Surround codecs
   const surroundCodecs = "ac3,eac3,dts";
-  // Lossless HD codecs that cause issues with VLC's downmixing on mobile
+  // Lossless HD codecs - MPV decodes these and downmixes as needed
   const losslessHdCodecs = "truehd";
   // Platform-specific codecs
   const platformCodecs = platform === "ios" ? "alac,wma" : "wma";

-  // Handle explicit user settings first
+  // MPV can decode all codecs - only channel count varies by mode
+  const allCodecs = `${baseCodecs},${surroundCodecs},${losslessHdCodecs},${platformCodecs}`;
+
   switch (audioMode) {
     case "stereo":
-      // Force stereo transcoding - only allow basic codecs
+      // Limit to 2 channels - MPV will decode and downmix
       return {
-        directPlayCodec: `${baseCodecs},${platformCodecs}`,
+        directPlayCodec: allCodecs,
         maxAudioChannels: "2",
       };
     case "5.1":
-      // Allow up to 5.1 - include surround codecs but not lossless HD
+      // Limit to 6 channels
       return {
-        directPlayCodec: `${baseCodecs},${surroundCodecs},${platformCodecs}`,
+        directPlayCodec: allCodecs,
         maxAudioChannels: "6",
       };
     case "passthrough":
-      // Allow all codecs - for users with external DAC/receiver
+      // Allow up to 8 channels - for external DAC/receiver setups
       return {
-        directPlayCodec: `${baseCodecs},${surroundCodecs},${losslessHdCodecs},${platformCodecs}`,
+        directPlayCodec: allCodecs,
         maxAudioChannels: "8",
       };
     default:
-      // Auto mode: platform and player-specific defaults
-      break;
-  }
-
-  // Auto mode logic based on platform and player
-  if (player === "ksplayer" && platform === "ios") {
-    // KSPlayer on iOS handles all codecs well, including TrueHD
-    return {
-      directPlayCodec: `${baseCodecs},${surroundCodecs},${losslessHdCodecs},${platformCodecs}`,
-      maxAudioChannels: "8",
-    };
-  }
-  // VLC on Android or iOS - don't include TrueHD (causes 7.1 downmix issues)
-  // DTS core is fine, VLC handles it well. Only lossless 7.1 formats are problematic.
-  return {
-    directPlayCodec: `${baseCodecs},${surroundCodecs},${platformCodecs}`,
-    maxAudioChannels: "6",
-  };
-};
+      // Auto mode: default to 5.1 (6 channels)
+      return {
+        directPlayCodec: allCodecs,
+        maxAudioChannels: "6",
+      };
+  }
+};

 /**
@@ -133,22 +126,18 @@ const getVideoAudioCodecs = (platform, player, audioMode) => {
  */
 export const generateDeviceProfile = (options = {}) => {
   const platform = options.platform || Platform.OS;
-  const player = options.player || "vlc";
   const audioMode = options.audioMode || "auto";

   const { directPlayCodec, maxAudioChannels } = getVideoAudioCodecs(
     platform,
-    player,
     audioMode,
   );

-  const playerName = player === "ksplayer" ? "KSPlayer" : "VLC Player";
-
   /**
-   * Device profile for Native video player
+   * Device profile for MPV player
    */
   const profile = {
-    Name: `1. ${playerName}`,
+    Name: "1. MPV",
     MaxStaticBitrate: 999_999_999,
     MaxStreamingBitrate: 999_999_999,
     CodecProfiles: [
@@ -210,3 +199,6 @@ export const generateDeviceProfile = (options = {}) => {
   return profile;
 };
+
+// Default export for backward compatibility
+export default generateDeviceProfile();
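For clarity, a sketch of the call shape used by the call sites earlier in this commit (PlaySettingsProvider and getDownloadUrl). Only platform, player and audioMode are options; the returned object is the Jellyfin device profile built above:

import { Platform } from "react-native";
import { generateDeviceProfile } from "@/utils/profiles/native";

const profile = generateDeviceProfile({
  platform: Platform.OS as "ios" | "android",
  player: "mpv",            // the only PlayerType after this change
  audioMode: "passthrough", // lifts the channel cap to 8 for external receivers
});

console.log(profile.Name); // "1. MPV"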

19
utils/profiles/trackplayer.d.ts vendored Normal file
View File

@@ -0,0 +1,19 @@
/**
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
export type PlatformType = "ios" | "android";
export interface TrackPlayerProfileOptions {
/** Target platform */
platform?: PlatformType;
}
export function generateTrackPlayerProfile(
options?: TrackPlayerProfileOptions,
): any;
declare const _default: any;
export default _default;

View File

@@ -0,0 +1,95 @@
/**
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
import { Platform } from "react-native";
import MediaTypes from "../../constants/MediaTypes";
/**
* @typedef {"ios" | "android"} PlatformType
*
* @typedef {Object} TrackPlayerProfileOptions
* @property {PlatformType} [platform] - Target platform
*/
/**
* Audio direct play profiles for react-native-track-player.
* iOS uses AVPlayer, Android uses ExoPlayer - each has different codec support.
*
* @param {PlatformType} platform
*/
const getDirectPlayProfile = (platform) => {
if (platform === "ios") {
// iOS AVPlayer supported formats
return {
Type: MediaTypes.Audio,
Container: "mp3,m4a,aac,flac,alac,wav,aiff,caf",
AudioCodec: "mp3,aac,alac,flac,opus,pcm",
};
}
// Android ExoPlayer supported formats
return {
Type: MediaTypes.Audio,
Container: "mp3,m4a,aac,ogg,flac,wav,webm,mka",
AudioCodec: "mp3,aac,flac,vorbis,opus,pcm",
};
};
/**
* Audio codec profiles for react-native-track-player.
*
* @param {PlatformType} platform
*/
const getCodecProfile = (platform) => {
if (platform === "ios") {
// iOS AVPlayer codec constraints
return {
Type: MediaTypes.Audio,
Codec: "aac,ac3,eac3,mp3,flac,alac,opus,pcm",
};
}
// Android ExoPlayer codec constraints
return {
Type: MediaTypes.Audio,
Codec: "aac,ac3,eac3,mp3,flac,vorbis,opus,pcm",
};
};
/**
* Generates a device profile for music playback via react-native-track-player.
*
* This profile is specifically for standalone audio playback using:
* - AVPlayer on iOS
* - ExoPlayer on Android
*
* @param {TrackPlayerProfileOptions} [options] - Profile configuration options
* @returns {Object} Jellyfin device profile for track player
*/
export const generateTrackPlayerProfile = (options = {}) => {
const platform = options.platform || Platform.OS;
return {
Name: "Track Player",
MaxStaticBitrate: 320_000_000,
MaxStreamingBitrate: 320_000_000,
CodecProfiles: [getCodecProfile(platform)],
DirectPlayProfiles: [getDirectPlayProfile(platform)],
TranscodingProfiles: [
{
Type: MediaTypes.Audio,
Context: "Streaming",
Protocol: "http",
Container: "mp3",
AudioCodec: "mp3",
MaxAudioChannels: "2",
},
],
SubtitleProfiles: [],
};
};
// Default export for convenience
export default generateTrackPlayerProfile();
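A small usage sketch (not part of the commit): the factory can be called with an explicit platform, which is what the platform-specific branches above differ on; the default export simply uses Platform.OS:

import { generateTrackPlayerProfile } from "@/utils/profiles/trackplayer";

const iosProfile = generateTrackPlayerProfile({ platform: "ios" });
const androidProfile = generateTrackPlayerProfile({ platform: "android" });

console.log(iosProfile.DirectPlayProfiles[0].Container);     // "mp3,m4a,aac,flac,alac,wav,aiff,caf"
console.log(androidProfile.DirectPlayProfiles[0].Container); // "mp3,m4a,aac,ogg,flac,wav,webm,mka"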