Local Media
To stream media in a video conference, you need to produce audio and video. The audio and video captured and sent by the current user is called local media; the audio and video sent by other participants is called remote media. This tutorial teaches you how to handle local media. The next tutorial covers remote media.
You'll learn the following:
- Create local media to capture a local user's audio and video.
- Encode and decode Opus for audio transmission.
- Encode and decode VP8, VP9, and H.264 for video transmission.
- Create and enable Acoustic Echo Cancellation (AEC) to remove echo from a user's audio.
- Start and stop local media.
Important
For web, we don't implement a local media class because LiveSwitch provides one that works with the major browsers. However, we still need to implement the start and stop local media logic in TypeScript.
Prerequisites
This tutorial builds on the Unregister and Reregister app you created earlier.
Create Acoustic Echo Cancellation
If you ever hear yourself speaking a few hundred milliseconds after you have spoken, most likely it's because the remote peer doesn't have Acoustic Echo Cancellation (AEC). The remote peer is playing your audio stream through the speakers, then picking it up with the microphone and streaming it back to you.
Before creating the local media, we need to implement AEC and enable it in the local media so that the local user's echo is removed for the remote users.
Note
You only need to implement AEC for .NET and Android. iOS provides built-in echo cancellation. Browsers handle AEC automatically.
We implement AEC with the following methods:
- The `CreateOutputMixerSink` method, which creates a sink to play the mixed audio output. This corresponds to your system's speakers.
- The `CreateProcessor` method, which creates and returns an instance of `FM.LiveSwitch.AudioProcessing.AecProcessor` configured with the audio settings to use for echo cancellation.
Paste the following code into the `HelloWorld` namespace in the `AecContext.cs` file.
```csharp
// A simple implementation of AecContext.
public class AecContext : FM.LiveSwitch.AecContext
{
    // Plays the mixed audio output through the system speakers.
    protected override AudioSink CreateOutputMixerSink(AudioConfig config)
    {
        return new FM.LiveSwitch.NAudio.Sink(config);
    }

    // Creates the echo-cancellation processor with the audio settings to use.
    protected override AecPipe CreateProcessor()
    {
        return new FM.LiveSwitch.AudioProcessing.AecProcessor(new AudioConfig(16000, 1));
    }
}
```
Create Local Media
To create local media, do the following:
1. **Define Local Media**: The first step in capturing local media is to define how this capture is performed by extending the `RtcLocalMedia<T>` class. The generic type `T` represents the type of object used to display the video preview. Each local media implementation is usually associated with a specific set of inputs, such as a camera and a microphone or a user's screen. We capture the user's inputs using an implementation of `RtcLocalMedia` named `LocalMedia`.
2. **Enable AEC**: We've already created the `AecContext` class. We now need to enable it in our local media. To enable AEC, accept an `AecContext` object in the constructor and pass it to the base class's constructor.
3. **Capture Local Audio**: To enable audio, we implement the following:
   - The `CreateAudioRecorder` method, which records audio.
   - The `CreateAudioSource` method, which returns the audio source used for recording audio.
   - The `CreateOpusEncoder` method, which enables the Opus encoder that encodes the user's audio for transmission to other participants. This method is optional; however, without it, your app falls back to the lower-quality PCMA/PCMU audio codecs.
4. **Capture Local Video**: To enable video, we implement the following:
   - The `CreateVideoRecorder` method, which records video.
   - The `CreateVp8Encoder`, `CreateVp9Encoder`, and `CreateH264Encoder` methods, which specify the video codecs to use by encoding VP8, VP9, and H.264.
   - The `CreateImageConverter` method, which provides a minor image-formatting utility: a tool that converts between color spaces (different ways of representing colors). This is needed because webcams don't capture data in the I420 color space, which the LiveSwitch video encoders require. The method must return an instance of `FM.LiveSwitch.Yuv.ImageConverter`.
Paste the following code into the `LocalMedia.cs` file.
```csharp
public abstract class LocalMedia : RtcLocalMedia<System.Windows.Controls.Image>
{
    // Enable AEC by passing the AecContext through to the base constructor.
    public LocalMedia(bool disableAudio, bool disableVideo, AecContext aecContext)
        : base(disableAudio, disableVideo, aecContext)
    {
    }

    // Local Audio

    // Records the local audio to a Matroska (.mkv) file.
    protected override AudioSink CreateAudioRecorder(AudioFormat inputFormat)
    {
        return new FM.LiveSwitch.Matroska.AudioSink(Id + "-local-audio-" + inputFormat.Name.ToLower() + ".mkv");
    }

    // Prefer the DMO voice-capture source where supported; fall back to NAudio.
    protected override AudioSource CreateAudioSource(AudioConfig config)
    {
        if (FM.LiveSwitch.Dmo.VoiceCaptureSource.IsSupported())
        {
            return new FM.LiveSwitch.Dmo.VoiceCaptureSource(!AecDisabled);
        }
        else
        {
            return new FM.LiveSwitch.NAudio.Source(config);
        }
    }

    protected override AudioEncoder CreateOpusEncoder(AudioConfig config)
    {
        return new FM.LiveSwitch.Opus.Encoder(config);
    }

    // Local Video

    // Records the local video to a Matroska (.mkv) file.
    protected override VideoSink CreateVideoRecorder(VideoFormat inputFormat)
    {
        return new FM.LiveSwitch.Matroska.VideoSink(Id + "-local-video-" + inputFormat.Name.ToLower() + ".mkv");
    }

    protected override VideoEncoder CreateVp8Encoder()
    {
        return new FM.LiveSwitch.Vp8.Encoder();
    }

    protected override VideoEncoder CreateVp9Encoder()
    {
        return new FM.LiveSwitch.Vp9.Encoder();
    }

    protected override VideoEncoder CreateH264Encoder()
    {
        // Returning null disables H.264 for this app.
        return null;
    }

    // Converts captured frames to the I420 color space required by the encoders.
    protected override VideoPipe CreateImageConverter(VideoFormat outputFormat)
    {
        return new FM.LiveSwitch.Yuv.ImageConverter(outputFormat);
    }
}
```
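Because `CreateH264Encoder` returns `null`, this app negotiates only VP8 and VP9. If your project references an H.264 encoder assembly, you could return an encoder instance instead. The class name below is an assumption made by analogy with the VP8/VP9 wrappers; check your LiveSwitch SDK for the exact type before using it.

```csharp
// Hypothetical sketch: enabling H.264 when an encoder assembly is available.
// FM.LiveSwitch.OpenH264.Encoder is an assumed name, modeled on the
// Vp8/Vp9 encoder wrappers; verify it against your SDK.
protected override VideoEncoder CreateH264Encoder()
{
    return new FM.LiveSwitch.OpenH264.Encoder();
}
```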
Create Camera Local Media
We've created a general type of local media to handle a local user's inputs. Next, we'll create a `CameraLocalMedia` class derived from the `LocalMedia` class to capture the user's camera. We'll create it in a separate file with the following:
- The video configuration with the width, height, and frame rate.
- The `CreateVideoSource` method, which creates the video source that captures video data from the user's camera.
- The `CreateViewSink` method, which shows a preview of the user's camera. This method must return a view of the same type as the type parameter of your `LocalMedia` class.

Paste the following code into the `CameraLocalMedia.cs` file.
```csharp
// An implementation of LocalMedia that captures the user's camera.
public class CameraLocalMedia : LocalMedia
{
    // 640x480 video at 30 frames per second.
    private VideoConfig _CameraConfig = new VideoConfig(640, 480, 30);

    public CameraLocalMedia(bool disableAudio, bool disableVideo, AecContext aecContext)
        : base(disableAudio, disableVideo, aecContext)
    {
        Initialize();
    }

    // Captures video data from the user's camera.
    protected override VideoSource CreateVideoSource()
    {
        return new FM.LiveSwitch.AForge.CameraSource(_CameraConfig);
    }

    // Shows a scaled, mirrored preview of the user's camera.
    protected override ViewSink<System.Windows.Controls.Image> CreateViewSink()
    {
        return new FM.LiveSwitch.Wpf.ImageSink()
        {
            ViewScale = LayoutScale.Contain,
            ViewMirror = true
        };
    }
}
```
Start and Stop Local Media
Next, we implement the logic to start and stop the local user's microphone and camera. To do so, we implement the following:
- To start capturing media, invoke the `Start` method of the `LocalMedia` implementation. `Start` returns a promise that resolves when the instance begins to capture data from the user's camera and microphone. If media can't be captured, the promise is rejected, which surfaces as an exception when awaited.
- To preview the local media:
  - Create an instance of `FM.LiveSwitch.LayoutManager`, a class that manages the local and remote video feeds for a video conference, passing in the container that holds the video feeds.
  - Retrieve the local view by accessing the `View` property of the `LocalMedia` instance.
  - Assign the local view to the layout manager by invoking the `SetLocalView` method.
- To stop capturing camera and microphone data, invoke the `Stop` method of the `LocalMedia` class. We also need to remove the local preview by invoking the layout manager's `UnsetLocalView` method.
Paste the following code into the `HelloWorldLogic` class:
```csharp
public LocalMedia LocalMedia { get; private set; }

private AecContext _AecContext = new AecContext();
private Dispatcher _Dispatcher = null;
private FM.LiveSwitch.Wpf.LayoutManager _LayoutManager = null;

// Start local media, using MainWindow for the dispatcher and video container.
public async Task StartLocalMedia(MainWindow mainWindow)
{
    _Dispatcher = mainWindow.Dispatcher;
    _LayoutManager = new FM.LiveSwitch.Wpf.LayoutManager(mainWindow.videoContainer);

    LocalMedia = new CameraLocalMedia(false, false, _AecContext);
    await LocalMedia.Start();

    // UI elements must be updated on the dispatcher thread.
    _Dispatcher.Invoke(() =>
    {
        var localView = LocalMedia.View;
        if (localView != null)
        {
            localView.Name = "LocalView";
            _LayoutManager.SetLocalView(localView);
        }
    });
    Log.Info("Successfully started local media.");
}

// Stop local media and remove the local preview from the layout.
public async Task StopLocalMedia()
{
    await LocalMedia.Stop();

    var layoutManager = _LayoutManager;
    if (layoutManager != null)
    {
        layoutManager.RemoveRemoteViews();
        layoutManager.UnsetLocalView();
        _LayoutManager = null;
    }

    if (LocalMedia != null)
    {
        LocalMedia.Destroy();
        LocalMedia = null;
    }
    Log.Info("Successfully stopped local media.");
}
```
Uncomment UI Components
Now, go to the files for the UI components and uncomment the code for starting and stopping local media.
In the `MainWindow.xaml.cs` file, uncomment all the code between the `<LocalMedia>` and `</LocalMedia>` tags.
Note
There are multiple instances of these tags. Uncomment all the code between those instances.
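The uncommented code wires the UI into the `HelloWorldLogic` methods you just wrote. As a rough sketch of what that wiring typically looks like (the handler names, `RoutedEventArgs` signature, and `HelloWorldLogic.Instance` accessor are assumptions; your generated project may differ):

```csharp
// Hypothetical sketch of the uncommented wiring in MainWindow.xaml.cs.
// Handler names and HelloWorldLogic.Instance are assumed; check your project.
private async void JoinButton_Click(object sender, RoutedEventArgs e)
{
    // Start capturing and previewing the camera and microphone.
    await HelloWorldLogic.Instance.StartLocalMedia(this);
}

private async void LeaveButton_Click(object sender, RoutedEventArgs e)
{
    // Stop capture and tear down the local preview.
    await HelloWorldLogic.Instance.StopLocalMedia();
}
```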
Run Your App
Note
LiveSwitch recommends running the mobile apps on a physical phone rather than an emulator. The apps do run on an emulator, but the camera doesn't work.
Run your app in your project IDE and click Join. You should see yourself on the app window. Congratulations, you've built a LiveSwitch app to handle local media!
Note
For TypeScript, it's recommended that you clear your browser cache before clicking Join.