Provide a simple Audio Visualizer #97
I also think this feature would be a good idea, although it is not the highest on my personal priority list. I will of course welcome pull requests. The underlying Android API for implementing this is https://developer.android.com/reference/android/media/audiofx/Visualizer and it is relatively straightforward to use. On iOS, however, it does not appear to be as straightforward.
Thanks for your answer! I look forward to this feature.
@sachaarbonel thanks!
Any chance you might add this? It's not present in any sound libraries for Flutter currently, and it would be really nice to be able to build visualizers. Edit: looking into it a bit further, I think we just need this value on iOS; is that a big lift?
That would allow for a rudimentary visualiser, although what we probably want is something equivalent to Android's API, so we'd want to do an FFT on the audio signal. After accidentally stumbling upon it, it seems there is a way to do this. First, we create an `MTAudioProcessingTap`.
@ryanheise I think the best solution for audio visualization is providing a clean way to subscribe to or retrieve the samples/buffers of audio data. Providing a way to pull in the raw PCM data would allow for more than just FFT analysis and would open the door to nearly any other type of audio analysis.
@pstromberg98 that's possibly a good idea. Looking at the Android Visualizer API, it actually provides both the FFT data and the waveform data, so we could do the same. Although you could argue we only need the latter, both Android and iOS provide accelerated implementations of FFT which we should take advantage of.
@ryanheise Totally! It would be super nice for the library to provide the FFT and I don't see any harm in providing that. I was mainly saying that, in the case of having either one or the other, it would probably be better to provide the raw waveform just in case users wanted to do other analysis and transforms on the data. But having both would be slick!
I've just implemented the Android side on the visualizer branch. Usage:

```dart
samplingRate = player.startVisualizer(
    enableWaveform: true,
    enableFft: true,
    captureRate: 10000,
    captureSize: 1024);
```

Then listen to data on `visualizerWaveformStream`. The waveform data is in 8-bit unsigned PCM (i.e. subtract 128 from each byte to get it zero-centred). The FFT is in the Android format, and I'm not sure yet whether the iOS native format will be different, so that particular part of the API may be subject to change. It may take a bit longer for me to get the iOS side working, but in the meantime would anyone consider contributing a pull request that adds a very minimalistic visualiser widget to the example, demonstrating how to make use of the captured data?
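As a minimal sketch of consuming this stream, assuming the `visualizerWaveformStream` and `event.data` names shown in the snippets in this thread; the RMS computation is purely illustrative and not part of the plugin:

```dart
import 'dart:math';

import 'package:just_audio/just_audio.dart';

void listenToWaveform(AudioPlayer player) {
  player.visualizerWaveformStream.listen((event) {
    // The waveform is 8-bit unsigned PCM: subtract 128 to zero-centre it.
    final samples = event.data.map((b) => b - 128).toList();
    // A rough RMS level, normalised to roughly 0.0-1.0; enough for a meter.
    final sumSquares = samples.fold<int>(0, (sum, s) => sum + s * s);
    final rms = sqrt(sumSquares / samples.length);
    print('level: ${(rms / 128).toStringAsFixed(2)}');
  });
}
```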
I've changed the API slightly to include the sampling rate in the captured data, and included a very simple visualiser widget in the example. The iOS side will be more difficult and I have some higher priority things to switch to for the moment, but help is welcome. In particular, you could give feedback on the API, contribute a better visualiser widget example, or even help get started on the iOS implementation using the documentation linked above.
To those interested in this feature, would you like a separate method to start the permission request flow to record audio for the visualizer? Currently the permission request is initiated automatically when the visualizer is started.
@ryanheise Personally I like when libraries provide more fine-grained control, but I can see the argument for both. Perhaps it would be best to have a separate method for requesting the permission.
I second the comment from @pstromberg98.
I've made an initial implementation of the visualizer for iOS. Note that this is definitely not production ready; some problems remain to be investigated.
Thanks, @pstromberg98, for the suggestion. I agree, and I'll try to implement that. As before, I unfortunately need to work on some other issues for a while, particularly null safety, but hopefully this commit provides a good starting foundation to build on. Contributions are also welcome, so consider this a "help wanted".
Has anyone been able to give this a whirl yet? I think this would be a really useful feature to include in the next release, so I'd like to make it a priority, although for that to happen it would definitely help to get some feedback on the iOS side in terms of memory efficiency and stability. I will of course eventually add the option to start the permission flow on Android on demand, but I think the iOS stability will be the most critical thing to be confident about before I include this in a release, along with the iOS FFT implementation. Of course, I could just document it as experimental and unstable and release it that way, which might actually not be a bad idea to get more eyes on it.
@ryanheise If I can find time I will take a look at the iOS side and give my thoughts and feedback. I appreciate your efforts on it so far and am eager to jump in and help when I find time 👍. I think marking the feature as experimental would make a lot of sense.
I pulled it over and merged the changes to check out the Android version, but I can't seem to get it working. Mic permission is granted (although the app crashes after the prompt is accepted), but it prevents my player from playing anything. I even tried wrapping it in a future to ensure the player has data before calling. Any ideas?

```dart
Future.delayed(Duration(seconds: 2), () {
  var samplingRate = activeState.player.startVisualizer(
      enableWaveform: true,
      enableFft: true,
      captureRate: 48000,
      captureSize: 1024);
  activeState.player.visualizerWaveformStream.listen((event) {
    print(event);
    this.add(AudioManagerVizUpdateEvent(vizData: event.data));
  });
});
```
Thanks for testing that. It turns out there is another change to the behaviour of ExoPlayer that affects this.
Perhaps. With the null safety release of Flutter soon to reach stable, I'm not sure whether I'd like to do this before or after that. Currently I'm maintaining two branches, which is a bit inconvenient to keep in sync. We'll see how things pan out, but first I may need to focus on getting the null safety releases ready.
Thanks for jumping on this. Things are working great so far, but the FFT buffer visual is a bit different from what I expected based on using my custom visualizer with other FFT sources. I'll try to take a deeper look and report back the bug if I find it.
Unfortunately, for CI at the moment I think you'll have to fork the repo, make the necessary changes so that dependency overrides aren't necessary, and then use your fork as a dependency. Of course that's not ideal. This branch will still exist as its own branch for quite a while before being considered stable enough to merge into the master branch and publish on pub.dev. Perhaps I should do something similar to what I did with audio_service during the year-long development of an experimental branch: make the pubspec.yaml files in git refer to just_audio_platform_interface via git references rather than relative paths within the repository, and keep the alternative path-based dependencies there, commented out (because the plugin developers still need to work based on those). Anyway, for now I suggest the fork approach.
I found this: https://developer.apple.com/forums/thread/45966 However, I also found some good news, although I have not looked into it yet.
That sounds familiar... I thought I was already doing something like that.
Hi all, @Eittipat's PR is now merged into the visualizer branch. For anyone who was already using the FFT visualizer on Android, note that I also just changed the plugin to convert the platform data to `Int8List` on the Dart side. This is still not ready to be merged into the public release. I think some improvements should be made to when the Android implementation prompts the user for permissions, and on the iOS side the TAP code should be reviewed and possibly refactored to allow for future uses of the TAP.
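As a hedged sketch of interpreting that data: the layout below follows the format documented for Android's Visualizer.getFft() (DC and Nyquist real parts first, then real/imaginary pairs); whether the iOS side of this branch emits exactly the same layout was still an open question above.

```dart
import 'dart:math';
import 'dart:typed_data';

/// Convert Android-format FFT bytes (as a signed Int8List) into per-bin
/// magnitudes. Per Visualizer.getFft(): data[0] is the DC real part,
/// data[1] the Nyquist real part, then alternating real/imaginary pairs
/// for bins 1..n/2-1.
List<double> fftMagnitudes(Int8List data) {
  final n = data.length;
  final magnitudes = List<double>.filled(n ~/ 2 + 1, 0);
  magnitudes[0] = data[0].abs().toDouble(); // DC component
  magnitudes[n ~/ 2] = data[1].abs().toDouble(); // Nyquist component
  for (var k = 1; k < n ~/ 2; k++) {
    final re = data[2 * k].toDouble();
    final im = data[2 * k + 1].toDouble();
    magnitudes[k] = sqrt(re * re + im * im);
  }
  return magnitudes;
}
```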
Any idea when the visualizer will be done?
Did you try to use it? It works in a lot of situations.
@ryanheise I updated your library and changed my code to use Int8List, but it's still not working.
I will try to fix it in native code.
I just don't understand how to use it in the iOS native code.
I've already looked at it. It doesn't work. I think you have to wait for the AVAudioEngine version (which is still at an early stage; see #334).
Curious what the current plans are for the visualizer branch. Is it still planned to be merged in, or is it now waiting for #784 before further updates?
Hi @spakanati, no, it is not waiting for #784. Going forward there will probably be both the current AVQueuePlayer-based and the AVAudioEngine-based implementations available, since they may end up supporting different feature sets. What this branch is waiting on is a finalisation of the API (particularly for requesting permissions and for starting/stopping the visualizer), and also a code review and perhaps refactoring on the iOS side to handle the TAP code more cleanly. I would be a bit nervous about merging this TAP code before it has been well tested, so this branch will remain the way for people to experiment with the visualizer until the final code has been tested and I am confident that it will not break anything. Of course, to help speed this up, people are welcome to help on any of the above points, either through code or by contributing thoughts and ideas through discussion.
Thanks for the clarification! I've been able to use the visualizer branch successfully. As far as the permissions, I agree that it might be common to want more control over the timing of the request, especially because a microphone recording request is a little confusing/jarring for users. This was pretty easy for me to get around, though: I just did my own permission request before ever calling startVisualizer.
First and foremost, I would like to express my sincere gratitude for the considerable effort and dedication that you and the rest of the contributors have devoted to this project.
I added the following code to add a stop button for testing. Click stop while audio is playing, then click pause, then click play: it will play without sound.
- Changing to the following code: it will crash the app when the song is playing, the visualizer is running, and stop is called.
- Changing to the following code: it will crash the app because stopVisualizer has not finished before stop is called.
- Changing to the following code: it works because stopVisualizer had time to finish before stop was called.

Crash log: ensureTap tracks:13
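Based on that finding, a sketch of the working ordering, under the assumption that stopVisualizer returns a Future that completes once the tap has been torn down:

```dart
import 'package:just_audio/just_audio.dart';

/// Hypothetical helper: wait for the visualizer to finish stopping
/// before stopping the player, avoiding the crash described above.
Future<void> stopPlayerSafely(AudioPlayer player) async {
  await player.stopVisualizer();
  await player.stop();
}
```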
Thanks @karrarkazuya, this is exactly the sort of feedback I was hoping for, since this branch is quite experimental and can't be merged until it is sufficiently tested and becomes stable. Since you didn't mention which platform you were testing on, could you confirm which one it is? I would guess iOS or macOS, since the Tap is an Apple concept.
The test was actually made on the iOS simulator, as shown in the terminal log. However, since you mentioned this, I have now also tested on the iOS simulator, an iOS device (iPhone XR), and a real Android device (SD 8 Gen 1). The results were as follows.
On iOS device:
On iOS simulator:
This comment doesn't seem to work for me; I'm still getting an error.
This is in my pubspec.yaml:
Can somebody help me set up this branch? Thanks!
I think this is because on this branch the pubspec.yaml files refer to the platform interface via relative paths within the repository. It should probably be using a dependency_override instead. I am not sure what the best way to proceed is until this gets addressed in some way: either clone this repo locally instead of using a git URL, or fork this repo and fix the pubspec files, I think.
That's correct, there is a chicken-and-egg problem with developing plugins within the federated plugin architecture that is quite inconvenient to deal with. As long as this branch is in development and hasn't been published, it will continue to be inconvenient. Running a local dependency definitely works; that's obviously what I do as a plugin developer. I should probably bump up the priority of this branch so that it gets released. In order to do that, there are a couple of things I need to look at first.
I am just getting started in Dart, but couldn't you specify the local path inside a dependency_overrides section?
You could try it, and if you find something that would be lower maintenance, you would be welcome to make a pull request.
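For what it's worth, a sketch of what git-based overrides might look like in an app's pubspec.yaml; the branch name `visualizer` and the package sub-paths are assumptions based on this thread and the repository layout:

```yaml
dependencies:
  just_audio:
    git:
      url: https://github.com/ryanheise/just_audio.git
      ref: visualizer # assumed branch name
      path: just_audio

dependency_overrides:
  just_audio_platform_interface:
    git:
      url: https://github.com/ryanheise/just_audio.git
      ref: visualizer
      path: just_audio_platform_interface
```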
Makes sense! By the way, I came across one thing, though I'm not sure how relevant it is or whether it should be mentioned anywhere in the docs.
That is true; the example shows this, but I haven't written the documentation yet, and won't until I finalise how the permission API will actually work. I think rather than the permission request being initiated by the plugin, the app should be able to initiate it at a time of its choosing.
In the latest commit, permission handling is separated from the plugin, so your app can start the permission request flow at a suitable time before starting the visualizer. I've updated the example and the docs. The remaining issue before merging is to review the TAP code mentioned earlier. Extra eyes on it are welcome.
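To illustrate, a sketch of what an app might now do, using the third-party permission_handler package (the package choice is mine, not something this branch prescribes):

```dart
import 'package:just_audio/just_audio.dart';
import 'package:permission_handler/permission_handler.dart';

Future<void> startVisualizerWithPermission(AudioPlayer player) async {
  // Run the permission flow at a moment the app chooses; on Android the
  // visualizer needs the record-audio permission.
  final status = await Permission.microphone.request();
  if (status.isGranted) {
    player.startVisualizer(
        enableWaveform: true,
        enableFft: true,
        captureRate: 10000,
        captureSize: 1024);
  }
}
```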
Unfortunately I have no idea about the TAP processor, but we just ran into an issue when trying to use this branch with background audio on Android: we're getting an error about unimplemented visualizer event streams.
I don't quite understand where those are even supposed to be implemented, so any tips on how to approach this would be welcome. In the meantime, I will also try simply not subscribing to those events.
I tried to implement the missing methods and got part of the way there, though I'm not sure if it is correct or if there's anything missing. I haven't actually tested it properly, as we are moving away from using the visualizer and instead doing offline pre-processing to generate a spectral analysis of our audio files.
Are you using just_audio_background? If so, you're getting the error because just_audio_background hasn't implemented that part of the platform interface. If you look inside that plugin's code, you'll see it already implements two of the other event streams, so the implementation of this new event stream would be similar:

```dart
class _JustAudioPlayer extends AudioPlayerPlatform {
  final eventController = StreamController<PlaybackEventMessage>.broadcast();
  final playerDataController = StreamController<PlayerDataMessage>.broadcast();
  ...
  @override
  Stream<PlaybackEventMessage> get playbackEventMessageStream =>
      eventController.stream;
  @override
  Stream<PlayerDataMessage> get playerDataMessageStream =>
      playerDataController.stream;
  ...
}
```

The implementation should forward all the visualizer events to the main plugin via a third stream, overridden in the same way.
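Following that pattern, the third stream's override might look something like the sketch below; the message type and getter names here are hypothetical, so check the platform interface on the branch for the real ones:

```dart
// Hypothetical names: substitute the actual visualizer message type and
// stream getter declared in just_audio_platform_interface on this branch.
final visualizerController =
    StreamController<VisualizerCaptureMessage>.broadcast();

@override
Stream<VisualizerCaptureMessage> get visualizerCaptureMessageStream =>
    visualizerController.stream;
```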
The permission handling change has been working well for me. Is there a recommended way to use this branch in a project that also targets web (or a path for web in general, if the branch is hopefully close to merging)? I understand the visualizer isn't implemented yet for web, but all playback breaks on web because of calls to the unimplemented visualizerWaveformStream, even if startVisualizer is never used.
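One app-side mitigation, assuming the failing calls come from the app's own subscriptions rather than from inside the plugin (the latter would need a fix in the plugin itself), is to guard them with kIsWeb:

```dart
import 'package:flutter/foundation.dart' show kIsWeb;
import 'package:just_audio/just_audio.dart';

void maybeListenToWaveform(AudioPlayer player) {
  // Skip the visualizer streams on web, where they are not implemented.
  if (!kIsWeb) {
    player.visualizerWaveformStream.listen((event) {
      // ... feed a visualizer widget here
    });
  }
}
```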
When will just_audio_background support the visualizer branch?
Provide a simple Audio Visualizer (showing low, medium, and high frequencies) that offers the possibility of increasing or decreasing the number of bars. This is useful for the user to make sure that the audio is being played or recorded in the application, since sometimes the volume or mic can be at a minimum.
Example: https://dev.to/armen101/audiorecordview-3jn5
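A sketch of the bar-count idea, under the assumption that per-bin magnitudes are already available (e.g. from an FFT as discussed in the comments); grouping adjacent bins lets the bar count grow or shrink independently of the FFT size:

```dart
/// Group per-bin magnitudes into [barCount] bars by averaging adjacent
/// bins, so the visualizer can show more or fewer bars as desired.
List<double> groupIntoBars(List<double> magnitudes, int barCount) {
  final bars = List<double>.filled(barCount, 0);
  final binsPerBar = magnitudes.length / barCount;
  for (var i = 0; i < magnitudes.length; i++) {
    final bar = (i / binsPerBar).floor().clamp(0, barCount - 1).toInt();
    bars[bar] += magnitudes[i];
  }
  for (var b = 0; b < barCount; b++) {
    bars[b] /= binsPerBar; // average magnitude per bar
  }
  return bars;
}
```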