
Audio streaming

This package provides the AudioStreamTrack class for streaming audio, which allows sending and receiving audio from an AudioSource. Raw audio data from sources other than an AudioSource can also be sent by using the SetData method.

[!NOTE] The package samples contain the Audio scene, which demonstrates the audio features of the package.

Sending audio

In order to stream audio, you first need to create an AudioStreamTrack instance.

// Create `AudioStreamTrack` instance with `AudioSource`.
var inputAudioSource = GetComponent<AudioSource>();
var track = new AudioStreamTrack(inputAudioSource);

// Add a track to the `RTCPeerConnection` instance.
var sendStream = new MediaStream();
var sender = peerConnection.AddTrack(track, sendStream);

The RTCRtpSender instance returned by AddTrack can be used with the RemoveTrack method to discard the track.

// Remove a track from the `RTCPeerConnection` instance.
peerConnection.RemoveTrack(sender);

There are two AudioStreamTrack constructors: one that takes an AudioSource argument and one that takes no arguments. Use the parameterless constructor if you want to use an AudioListener.

The SetData method is used to send audio data. SetData is called automatically and internally when an AudioSource is passed to the constructor, but when the parameterless constructor is used, SetData must be called manually. Note that the SetData method is supposed to be called on the audio thread, not the main thread. See OnAudioFilterRead for details.

[RequireComponent(typeof(AudioListener))]
class AudioSender : MonoBehaviour
{
    AudioStreamTrack track;
    const int sampleRate = 48000;

    // The initialization process has been omitted for brevity; a possible version is sketched after this example.

    // This method is called on the audio thread.
    private void OnAudioFilterRead(float[] data, int channels)
    {
        track.SetData(data, channels, sampleRate);
    }
}

[!NOTE] As with the AudioListener component, a script that uses the OnAudioFilterRead method must be attached to a GameObject.
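
For reference, the omitted initialization might look like the following sketch. This is only a sketch, not part of the sample: the peerConnection field is an assumption standing in for an RTCPeerConnection that your application creates and negotiates elsewhere.

// A sketch of the initialization omitted from `AudioSender` above.
void Start()
{
    // Use the parameterless constructor together with `AudioListener`;
    // the audio data is pushed into the track by `SetData` in `OnAudioFilterRead`.
    track = new AudioStreamTrack();

    // `peerConnection` is assumed to be an `RTCPeerConnection`
    // created and negotiated elsewhere in your application.
    var sendStream = new MediaStream();
    peerConnection.AddTrack(track, sendStream);
}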

Receiving audio

You can use AudioStreamTrack to receive audio. The track for receiving audio is obtained from the OnTrack event of the RTCPeerConnection instance. If the Kind of the MediaStreamTrack passed to the event is TrackKind.Audio, the Track instance can be cast to the AudioStreamTrack class.

var receivedAudioSource = GetComponent<AudioSource>();
var receiveStream = new MediaStream();
receiveStream.OnAddTrack = e => {
    if (e.Track is AudioStreamTrack track)
    {
        // `AudioSource.SetTrack` is an extension method which is available
        // when using the `Unity.WebRTC` namespace.
        receivedAudioSource.SetTrack(track);

        // Please do not forget to turn on the `loop` flag.
        receivedAudioSource.loop = true;
        receivedAudioSource.Play();
    }
    else if (e.Track is VideoStreamTrack videoTrack)
    {
        // This track is for video.
    }
};

var peerConnection = new RTCPeerConnection();
peerConnection.OnTrack = (RTCTrackEvent e) => {
    if (e.Track.Kind == TrackKind.Audio)
    {
        // Add track to MediaStream for receiver.
        // This process triggers `OnAddTrack` event of `MediaStream`.
        receiveStream.AddTrack(e.Track);
    }
};