I'm trying to provide my audioHandler to the Player class, but something strange is happening.
When I enter the screen, the StreamBuilder kicks in fine, but if I pop and navigate to the screen again, the stream connection stays in the "waiting" state forever unless I play the audio. This causes some odd behavior. What am I doing wrong?
Relevant code
Player class
final audioHandlerProvider = Provider<AudioHandler>((ref) {
AudioHandler _audioHandler = ref.read(audioHandlerServiceProvider);
return _audioHandler;
});
class _PlayerClicVozzState extends State<PlayerClicVozz> {
@override
Widget build(BuildContext context) {
return Scaffold(
extendBodyBehindAppBar: true,
backgroundColor: Color(0xff131313),
appBar: AppBar(
automaticallyImplyLeading: false,
actions: [
IconButton(
icon: Icon(Icons.clear, color: Colors.white),
onPressed: () => Navigator.of(context).pop(),
),
],
backgroundColor: Colors.transparent,
elevation: 0,
),
body: Center(
child: Consumer(builder: (context, watch, child) {
final res = watch(audioHandlerProvider);
return StreamBuilder<MediaState>(
stream: _mediaStateStream(res),
builder: (context, snapshot) {
final mediaState = snapshot.data;
return SeekBar(
duration: mediaState?.mediaItem?.duration ?? Duration.zero,
position: mediaState?.position ?? Duration.zero,
onChangeEnd: (newPosition) {
res.seek(newPosition);
},
);
},
);
...
Audio service initialization
late AudioHandler _audioHandler;
final audioHandlerServiceProvider = Provider<AudioHandler>((ref) {
return _audioHandler;
});
Future<void> main() async {
_audioHandler = await AudioService.init(
builder: () => AudioPlayerHandler(),
config: AudioServiceConfig(
androidNotificationChannelId: 'com.mycompany.myapp.channel.audio',
androidNotificationChannelName: 'Audio playback',
androidNotificationOngoing: true,
),
);
...
My audio handler is exactly the same as the plugin example
import 'package:audio_service/audio_service.dart';
import 'package:just_audio/just_audio.dart';
class AudioPlayerHandler extends BaseAudioHandler with SeekHandler {
static final _item = MediaItem(
id: 'https://s3.amazonaws.com/scifri-episodes/scifri20181123-episode.mp3',
album: "Science Friday",
title: "A Salute To Head-Scratching Science",
artist: "Science Friday and WNYC Studios",
duration: const Duration(milliseconds: 5739820),
artUri: Uri.parse(
'https://media.wnyc.org/i/1400/1400/l/80/1/ScienceFriday_WNYCStudios_1400.jpg'),
);
final _player = AudioPlayer();
/// Initialise our audio handler.
AudioPlayerHandler() {
// So that our clients (the Flutter UI and the system notification) know
// what state to display, here we set up our audio handler to broadcast all
// playback state changes as they happen via playbackState...
_player.playbackEventStream.map(_transformEvent).pipe(playbackState);
// ... and also the current media item via mediaItem.
mediaItem.add(_item);
// Load the player.
_player.setAudioSource(AudioSource.uri(Uri.parse(_item.id)));
}
// In this simple example, we handle only 4 actions: play, pause, seek and
// stop. Any button press from the Flutter UI, notification, lock screen or
// headset will be routed through to these 4 methods so that you can handle
// your audio playback logic in one place.
@override
Future<void> play() => _player.play();
@override
Future<void> pause() => _player.pause();
@override
Future<void> seek(Duration position) => _player.seek(position);
@override
Future<void> stop() => _player.stop();
/// Transform a just_audio event into an audio_service state.
///
/// This method is used from the constructor. Every event received from the
/// just_audio player will be transformed into an audio_service state so that
/// it can be broadcast to audio_service clients.
PlaybackState _transformEvent(PlaybackEvent event) {
return PlaybackState(
controls: [
MediaControl.rewind,
if (_player.playing) MediaControl.pause else MediaControl.play,
MediaControl.stop,
MediaControl.fastForward,
],
systemActions: const {
MediaAction.seek,
MediaAction.seekForward,
MediaAction.seekBackward,
},
androidCompactActionIndices: const [0, 1, 3],
processingState: const {
ProcessingState.idle: AudioProcessingState.idle,
ProcessingState.loading: AudioProcessingState.loading,
ProcessingState.buffering: AudioProcessingState.buffering,
ProcessingState.ready: AudioProcessingState.ready,
ProcessingState.completed: AudioProcessingState.completed,
}[_player.processingState]!,
playing: _player.playing,
updatePosition: _player.position,
bufferedPosition: _player.bufferedPosition,
speed: _player.speed,
queueIndex: event.currentIndex,
);
}
}
MediaStateStream and QueueStateStream
import 'package:rxdart/rxdart.dart';

Stream<MediaState> _mediaStateStream(AudioHandler audioHandler) {
  return Rx.combineLatest2<MediaItem?, Duration, MediaState>(
      audioHandler.mediaItem,
      AudioService.position,
      (mediaItem, position) => MediaState(mediaItem, position));
}

Stream<QueueState> _queueStateStream(AudioHandler audioHandler) {
  return Rx.combineLatest2<List<MediaItem>?, MediaItem?, QueueState>(
      audioHandler.queue,
      audioHandler.mediaItem,
      (queue, mediaItem) => QueueState(queue, mediaItem));
}
Answer

When you subscribe to a stream, you only start receiving the new events emitted from the moment you subscribe onward, so you may have to wait a while for the next event to arrive.
In your implementation of _mediaStateStream you are using AudioService.position, which only emits events while the position is actually changing (i.e. not while playback is paused or stalled). So even though the stream may have emitted position events in the past, if you subscribe to it again while playback is paused or stalled, you will sit in the waiting state until the next position event arrives, which is only after playback resumes.
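To make the replay difference concrete, here is a tiny standalone illustration (not from the question) of how a plain broadcast stream drops events emitted before a listener attaches, while a BehaviorSubject hands its latest value to every new listener:

import 'dart:async';

import 'package:rxdart/rxdart.dart';

Future<void> main() async {
  // A plain broadcast controller: events emitted with no listener are lost,
  // and a late subscriber has to wait for the *next* event.
  final plain = StreamController<int>.broadcast();
  plain.add(1); // nobody is listening yet, so this event is dropped
  plain.stream.listen((e) => print('plain: $e')); // prints nothing until a new add

  // A BehaviorSubject remembers its latest value and replays it to new listeners.
  final subject = BehaviorSubject<int>.seeded(1);
  subject.listen((e) => print('subject: $e')); // prints "subject: 1"

  await Future<void>.delayed(Duration.zero);
  await plain.close();
  await subject.close();
}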
I suggest wrapping the stream in rxdart's BehaviorSubject so that it keeps the last event in memory and re-emits it to each new listener. In addition, you can seed this BehaviorSubject with an initial value to ensure there is no waiting period even for the very first listener:
// Seed with the current media item and position so new listeners get a
// value immediately, then forward all future MediaState events.
_mediaStateSubject = BehaviorSubject.seeded(MediaState(
    handler.mediaItem.valueOrNull,
    handler.playbackState.value.position))
  ..addStream(_mediaStateStream(handler));
You can then listen to _mediaStateSubject instead of _mediaStateStream.
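Building on that, a rough sketch of the wiring (reusing the widget, provider, and helper names from the question; where exactly you obtain the handler is up to you, and context.read here assumes the pre-1.0 Riverpod API that the question's Consumer syntax suggests) could look like this:

import 'dart:async';

import 'package:rxdart/rxdart.dart';

class _PlayerClicVozzState extends State<PlayerClicVozz> {
  late final BehaviorSubject<MediaState> _mediaStateSubject;
  late final StreamSubscription<MediaState> _mediaStateSubscription;

  @override
  void initState() {
    super.initState();
    final handler = context.read(audioHandlerProvider);
    // Seed with the current media item and position so the first build gets
    // data immediately instead of sitting in ConnectionState.waiting.
    _mediaStateSubject = BehaviorSubject.seeded(MediaState(
        handler.mediaItem.valueOrNull,
        handler.playbackState.value.position));
    // Forward live updates into the subject; listen/add (rather than
    // addStream) makes it straightforward to cancel in dispose.
    _mediaStateSubscription =
        _mediaStateStream(handler).listen(_mediaStateSubject.add);
  }

  @override
  void dispose() {
    _mediaStateSubscription.cancel();
    _mediaStateSubject.close();
    super.dispose();
  }

  // In build(), point the StreamBuilder at the subject instead of calling
  // _mediaStateStream(res) on every rebuild:
  //   StreamBuilder<MediaState>(stream: _mediaStateSubject, ...)
}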