    Local Media

    To stream media in a video conference, you need to produce audio and video. The audio and video produced and sent by the current user is called local media. The audio and video sent by other participants is called remote media. This tutorial teaches you how to handle local media; the next tutorial covers remote media.

    You'll learn how to:

    • Create local media to capture a local user's audio and video.
    • Encode and decode Opus for audio transmission.
    • Encode and decode VP8, VP9, and H.264 for video transmission.
    • Create and enable Acoustic Echo Cancellation (AEC) to remove echo from a user's audio.
    • Start and stop local media.
    Important

    For web, we don't implement local media because LiveSwitch provides an implementation that works with the major browsers. However, we still need to implement the start and stop local media logic in TypeScript.

    Prerequisites

    This tutorial requires the Unregister and Reregister app you created earlier.

    Create Acoustic Echo Cancellation

    If you ever hear yourself speaking a few hundred milliseconds after you have spoken, it's most likely because the remote peer doesn't have Acoustic Echo Cancellation (AEC). The remote peer plays your audio stream through the speakers, picks it up with the microphone, and streams it back to you.

    Before creating the local media, we need to implement AEC and enable it in the local media so that the local user's echo is removed for the remote users.
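The echo path described above can be simulated in a few lines. The sketch below is a simplified illustration with hypothetical helper names; a real AEC processor estimates the echo's delay and gain adaptively rather than knowing them exactly.

```typescript
// Simulate the echo problem: the microphone picks up the local voice
// plus a delayed, attenuated copy of the speaker output (the echo).
function simulateMicrophone(nearEnd: number[], farEnd: number[], delay: number, gain: number): number[] {
  return nearEnd.map((s, i) => s + (i >= delay ? gain * farEnd[i - delay] : 0));
}

// An ideal canceller subtracts its estimate of the echo from the
// microphone signal, leaving only the near-end voice.
function cancelEcho(mic: number[], farEnd: number[], delay: number, gain: number): number[] {
  return mic.map((s, i) => s - (i >= delay ? gain * farEnd[i - delay] : 0));
}
```

With perfect knowledge of the delay and gain, the residual equals the near-end signal exactly; in practice the "tail length" passed to the AEC processor bounds how much delay the adaptive filter can account for.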

    Note

    You only need to implement AEC for .NET and Android. iOS provides built-in echo cancellation. Browsers handle AEC automatically.

    We implement AEC with the following methods:

    • The CreateOutputMixerSink method that creates a sink to play the mixed audio output. This corresponds to your system's speakers.
    • The CreateProcessor method that creates and returns an instance of FM.LiveSwitch.AudioProcessing.AecProcessor. We specify the audio settings and the tail length to use for echo cancellation.
    • CSharp
    • Android
    • iOS
    • TypeScript

    Paste the following code into the HelloWorld namespace in the AecContext.cs file.

    // Simple implementation of AecContext
    public class AecContext : FM.LiveSwitch.AecContext
    {
        protected override AudioSink CreateOutputMixerSink(AudioConfig config)
        {
            return new FM.LiveSwitch.NAudio.Sink(config);
        }
    
        protected override AecPipe CreateProcessor()
        {
            return new FM.LiveSwitch.AudioProcessing.AecProcessor(new AudioConfig(16000, 1));
        }
    }
    

    Paste the following code into the AecContext.java file.

    public class AecContext extends fm.liveswitch.AecContext {
    
        @Override
        protected AudioSink createOutputMixerSink(AudioConfig audioConfig) {
            return new AudioTrackSink(audioConfig);
        }
    
        @Override
        protected AecPipe createProcessor() {
            AudioConfig config = new AudioConfig(48000, 2);
            return new AecProcessor(config, AudioTrackSink.getBufferDelay(config) + AudioRecordSource.getBufferDelay(config));
        }
    }
    

    Not required.

    Not required.

    Create Local Media

    To create local media, do the following:

    • Define Local Media: The first step in capturing local media is to define how the capture is performed by extending the RtcLocalMedia<T> class. The generic type T represents the type of object used to display the video preview. Each local media implementation is usually associated with a specific set of inputs, such as a camera and a microphone, or a user's screen. We capture the user's inputs with an implementation of RtcLocalMedia named LocalMedia.

    • Enable AEC: We've already created the AecContext class. We now need to enable it in our local media. To enable AEC, simply take in an AecContext object in the constructor and pass the AecContext object to its base class's constructor.

    • Capture Local Audio: To enable audio, we implement the following:

      • The CreateAudioRecorder method that records audio.
      • The CreateAudioSource method that returns an audio source used for recording audio.
      • The CreateOpusEncoder method that creates the Opus encoder, which encodes the user's audio for transmission to other participants. This method is optional; however, without it, your app falls back to the lower-quality PCMA/PCMU audio codecs.
    • Capture Local Video: To enable video, we implement the following:

      • The CreateVideoRecorder method that records video.
      • Specify which video codecs to use. We implement the CreateVp8Encoder, CreateVp9Encoder, and CreateH264Encoder methods that encode VP8, VP9, and H.264.
      • The CreateImageConverter method that provides an image conversion utility. It converts between color spaces, which are different ways of representing colors. This is needed because webcams generally don't capture data in the i420 color space, which the LiveSwitch video encoders require. The method must return an instance of FM.LiveSwitch.Yuv.ImageConverter.
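To make the color-space conversion concrete, here is a sketch of a single-pixel RGB-to-YCbCr transform using the BT.601 full-range coefficients. This is our own illustration of the kind of math an image converter performs, not the actual ImageConverter implementation.

```typescript
// Convert one RGB pixel to YCbCr (BT.601 full range). In the i420
// layout, the Y (luma) plane is stored at full resolution while the
// U (Cb) and V (Cr) planes are subsampled 2x2.
function rgbToYuv(r: number, g: number, b: number): [number, number, number] {
  const y = Math.round(0.299 * r + 0.587 * g + 0.114 * b);       // luma
  const u = Math.round(-0.169 * r - 0.331 * g + 0.5 * b + 128);  // Cb, centered on 128
  const v = Math.round(0.5 * r - 0.419 * g - 0.081 * b + 128);   // Cr, centered on 128
  return [y, u, v];
}
```

For example, pure white maps to full luma with neutral chroma (128), which is why grayscale images have flat U and V planes.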
    • CSharp
    • Android
    • iOS
    • TypeScript

    Paste the following code into the LocalMedia.cs file.

    public abstract class LocalMedia : RtcLocalMedia<System.Windows.Controls.Image>
    {
        // Enable AEC
        public LocalMedia(bool disableAudio, bool disableVideo, AecContext aecContext)
        : base(disableAudio, disableVideo, aecContext)
        {           
        }
    
        // Local Audio
        protected override AudioSink CreateAudioRecorder(AudioFormat inputFormat)
        {
            return new FM.LiveSwitch.Matroska.AudioSink(Id + "-local-audio-" + inputFormat.Name.ToLower() + ".mkv");
        }
    
        protected override AudioSource CreateAudioSource(AudioConfig config)
        {
            if (FM.LiveSwitch.Dmo.VoiceCaptureSource.IsSupported())
            {
                return new FM.LiveSwitch.Dmo.VoiceCaptureSource(!AecDisabled);
            }
            else
            {
                return new FM.LiveSwitch.NAudio.Source(config);
            }
        }
    
        protected override AudioEncoder CreateOpusEncoder(AudioConfig config)
        {
            return new FM.LiveSwitch.Opus.Encoder(config);
        }
    
        // Local Video       
        protected override VideoSink CreateVideoRecorder(VideoFormat inputFormat)
        {
            return new FM.LiveSwitch.Matroska.VideoSink(Id + "-local-video-" + inputFormat.Name.ToLower() + ".mkv");
        }
    
        protected override VideoEncoder CreateVp8Encoder()
        {
            return new FM.LiveSwitch.Vp8.Encoder();
        }
    
        protected override VideoEncoder CreateVp9Encoder()
        {
            return new FM.LiveSwitch.Vp9.Encoder();
        }
    
        protected override VideoEncoder CreateH264Encoder()
        {
            return null;
        }
    
        protected override VideoPipe CreateImageConverter(VideoFormat outputFormat)
        {
            return new FM.LiveSwitch.Yuv.ImageConverter(outputFormat);
        }
    }
    

    Paste the following code into the LocalMedia.java file.

    public abstract class LocalMedia<TView> extends RtcLocalMedia<TView> {
    
        protected Context context;
    
        // Enable AEC
        public LocalMedia(Context context, boolean disableAudio, boolean disableVideo, AecContext aecContext) {
            super(disableAudio, disableVideo, aecContext);
            this.context = context;
        }
    
        // Local Audio
        @Override
        protected AudioSink createAudioRecorder(AudioFormat audioFormat) {
            return new fm.liveswitch.matroska.AudioSink(getId() + "-local-audio-" + audioFormat.getName().toLowerCase() + ".mkv");
        }
    
        @Override
        protected AudioSource createAudioSource(AudioConfig audioConfig) {
            return new AudioRecordSource(context, audioConfig);
        }
    
        @Override
        protected AudioEncoder createOpusEncoder(AudioConfig audioConfig) {
            return new fm.liveswitch.opus.Encoder(audioConfig);
        }
    
        // Local Video
        @Override
        protected VideoSink createVideoRecorder(VideoFormat videoFormat) {
            return new fm.liveswitch.matroska.VideoSink(getId() + "-local-video-" + videoFormat.getName().toLowerCase() + ".mkv");
        }
    
        @Override
        protected VideoEncoder createVp8Encoder() {
            return new fm.liveswitch.vp8.Encoder();
        }
    
        @Override
        protected VideoEncoder createVp9Encoder() {
            return new fm.liveswitch.vp9.Encoder();
        }
    
        @Override
        protected VideoEncoder createH264Encoder() {
            return null;
        }
    
        @Override
        protected VideoPipe createImageConverter(VideoFormat videoFormat) {
            return new fm.liveswitch.yuv.ImageConverter(videoFormat);
        }
    }
    

    Paste the following code into the LocalMedia.swift file.

    class LocalMedia: FMLiveSwitchRtcLocalMedia {
        // Enable AEC
        override init!(disableAudio: Bool, disableVideo: Bool, aecContext: FMLiveSwitchAecContext!) {
            super.init(disableAudio: disableAudio, disableVideo: disableVideo, aecContext: aecContext)
        }
        
        override func doStart() -> FMLiveSwitchFuture!  {
            // Starting Audio Session
            if (!self.audioDisabled()) {
                do {
                    if #available(iOS 10.0, *) {
                        try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth, (AVAudioSession.CategoryOptions.defaultToSpeaker)])
                    } else {
                        AVAudioSession.sharedInstance().perform(NSSelectorFromString("setCategory:withOptions:error"), with: AVAudioSession.Category.playAndRecord, with: [.allowBluetooth, AVAudioSession.CategoryOptions.defaultToSpeaker])
                    }
                } catch {
                    return FMLiveSwitchPromise.rejectNow(withEx: NSException.init(fmMessage: "Could not set audio session category for local media"))
                }
                
                do {
                    try AVAudioSession.sharedInstance().setActive(true)
                } catch {
                    return FMLiveSwitchPromise.rejectNow(withEx: NSException.init(fmMessage: "Could not activate audio session for local media."))
                }
            }
            return super.doStart()
        }
        
        // Local Audio
        override func createAudioRecorder(withInputFormat inputFormat: FMLiveSwitchAudioFormat!) -> FMLiveSwitchAudioSink!  {
            return FMLiveSwitchMatroskaAudioSink(path: "local-audio-\(String(describing: inputFormat.name())).mkv")
        }
        
        override func createAudioSource(with config: FMLiveSwitchAudioConfig!) -> FMLiveSwitchAudioSource {
            return FMLiveSwitchCocoaAudioUnitSource(config: config)
        }
        
        override func createOpusEncoder(with config: FMLiveSwitchAudioConfig!) -> FMLiveSwitchAudioEncoder!  {
            return FMLiveSwitchOpusEncoder(config:  config)
        }
        
        // Local Video
        override func createVideoRecorder(withInputFormat inputFormat: FMLiveSwitchVideoFormat!) -> FMLiveSwitchVideoSink!  {
            return FMLiveSwitchMatroskaVideoSink(path: "local-video-\(String(describing: inputFormat.name())).mkv")
        }
        
        override func createVp8Encoder() -> FMLiveSwitchVideoEncoder!  {
            return FMLiveSwitchVp8Encoder()
        }
        
        override func createVp9Encoder() -> FMLiveSwitchVideoEncoder!  {
            return FMLiveSwitchVp9Encoder()
        }
        
        override func createH264Encoder() -> FMLiveSwitchVideoEncoder!  {
            return nil
        }
        
        override func createImageConverter(withOutputFormat outputFormat: FMLiveSwitchVideoFormat!) -> FMLiveSwitchVideoPipe {
            return FMLiveSwitchYuvImageConverter(outputFormat: outputFormat)
        }
    }
    

    Not required.

    Create Camera Local Media

    We've created a general type of local media to handle a local user's inputs. Next, we'll create a CameraLocalMedia class derived from the LocalMedia class to capture the user's camera. We'll create it in a separate file with the following:

    1. The video configuration with the width, height, and frame rate.
    2. The CreateVideoSource method that creates the video source and captures video data from the user's camera.
    3. The CreateViewSink method that shows a preview of the user's camera. This method must return a view of the same type as the type parameter of your LocalMedia class.
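A quick back-of-the-envelope calculation (our own arithmetic, not an SDK API) shows why the video encoders matter: uncompressed i420 video at a typical camera configuration is far too large to send over a network.

```typescript
// Raw bitrate of uncompressed i420 video for a given configuration.
// i420 stores a full-resolution Y plane plus quarter-resolution U and V
// planes, i.e. 1.5 bytes per pixel.
function rawI420BitsPerSecond(width: number, height: number, frameRate: number): number {
  const bytesPerFrame = width * height * 1.5;
  return bytesPerFrame * frameRate * 8; // bits per second before compression
}
```

At 640x480 and 30 frames per second, the raw stream is over 110 Mbps; VP8, VP9, or H.264 typically bring this down to a few Mbps or less.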
    • CSharp
    • Android
    • iOS
    • TypeScript

    Paste the following code into the CameraLocalMedia.cs file.

    // An implementation of Local Media with Camera
    public class CameraLocalMedia : LocalMedia
    {
        private VideoConfig _CameraConfig = new VideoConfig(640, 480, 30);
    
        public CameraLocalMedia(bool disableAudio, bool disableVideo, AecContext aecContext) : base(disableAudio, disableVideo, aecContext)
        {
            Initialize();
        }
        
        protected override VideoSource CreateVideoSource()
        {
            return new FM.LiveSwitch.AForge.CameraSource(_CameraConfig);
        }
    
        protected override ViewSink<System.Windows.Controls.Image> CreateViewSink()
        {
            return new FM.LiveSwitch.Wpf.ImageSink()
            {
                ViewScale = LayoutScale.Contain,
                ViewMirror = true
            };
        }
        
    }
    

    Paste the following code into the CameraLocalMedia.java file.

    public class CameraLocalMedia extends LocalMedia<View> {
    
        private final CameraPreview viewSink;
        private final VideoConfig videoConfig = new VideoConfig(640, 480, 30);
    
        public CameraLocalMedia(Context context, boolean disableAudio, boolean disableVideo, AecContext aecContext) {
            super(context, disableAudio, disableVideo, aecContext);
    
            this.context = context;
            viewSink = new CameraPreview(context, LayoutScale.Contain);
    
            super.initialize();
        }
    
        @Override
        protected VideoSource createVideoSource() {
            return new Camera2Source(viewSink, videoConfig);
        }
    
        @Override
        protected ViewSink<View> createViewSink() {
            return null;
        }
    
        // Return an Android View for local preview rather than using ViewSink.
        public View getView() {
            return viewSink.getView();
        }
    }
    

    Paste the following code into the CameraLocalMedia.swift file.

    class CameraLocalMedia : LocalMedia {
        
        var _videoConfig: FMLiveSwitchVideoConfig?
        var _preview: FMLiveSwitchCocoaAVCapturePreview?
        
        override init!(disableAudio: Bool, disableVideo: Bool, aecContext: FMLiveSwitchAecContext!) {
            super.init(disableAudio: disableAudio, disableVideo: disableVideo, aecContext: aecContext)
            
            self._videoConfig = FMLiveSwitchVideoConfig(width: 640, height: 480, frameRate: 30)
            self._preview = FMLiveSwitchCocoaAVCapturePreview()
            
            self.initialize()
        }
        
        override func createVideoSource() -> FMLiveSwitchVideoSource!  {
            return FMLiveSwitchCocoaAVCaptureSource(preview: _preview, config: _videoConfig)
        }
        
        override func createViewSink() -> FMLiveSwitchViewSink!  {
            return nil
        }
        
        override func view() -> Any! {
            return self._preview
        }
    }
    

    Not required.

    Start and Stop Local Media

    Next, we implement the logic to start and stop the local user's microphone and camera. To do so, we implement the following:

    1. To start capturing media, invoke the Start method of the LocalMedia implementation. The Start method returns a promise that resolves when the instance begins capturing data from the user's camera and microphone. If media can't be captured, the promise is rejected. We specify a reject action to report errors.
    2. To preview the local media:
      1. Create an instance of FM.LiveSwitch.LayoutManager, which is a class to manage the local and remote video feeds for a video conference.
      2. Pass the layout manager instance to a container.
      3. Retrieve the local view by accessing the View property of the LocalMedia instance.
      4. Assign the local view to the layout manager by invoking the SetLocalView method.
    3. To stop capturing camera and microphone data, invoke the Stop method of the LocalMedia class. We also need to remove the local preview from the layout manager by invoking the UnsetLocalView method of the layout manager.

    Paste the following code into the HelloWorldLogic class:

    • CSharp
    • Android
    • iOS
    • TypeScript
    public LocalMedia LocalMedia { get; private set; }
    private AecContext _AecContext = new AecContext();
    private Dispatcher _Dispatcher = null;
    private FM.LiveSwitch.Wpf.LayoutManager _LayoutManager = null;
    
    // Starting Local Media by passing in MainWindow
    public async Task StartLocalMedia(MainWindow mainWindow)
    {
        _Dispatcher = mainWindow.Dispatcher;
        _LayoutManager = new FM.LiveSwitch.Wpf.LayoutManager(mainWindow.videoContainer);
    
        LocalMedia = new CameraLocalMedia(false, false, _AecContext);
    
        await LocalMedia.Start();
        _Dispatcher.Invoke(() =>
         {
             var localView = LocalMedia.View;
             if (localView != null)
             {
                 localView.Name = "LocalView";
                 _LayoutManager.SetLocalView(LocalMedia.View);
             }
         });
        Log.Info("Successfully started Local Media.");
    }
    
    // Stopping Local Media
    public async Task StopLocalMedia()
    {
        await LocalMedia.Stop();
        var layoutManager = _LayoutManager;
        if (layoutManager != null)
        {
            layoutManager.RemoveRemoteViews();
            layoutManager.UnsetLocalView();
            _LayoutManager = null;
        }
    
        if (LocalMedia != null)
        {
            LocalMedia.Destroy();
            LocalMedia = null;
        }
        Log.Info("Successfully stopped local media.");
    }
    
    private LocalMedia<View> localMedia;
    private LayoutManager layoutManager;
    private final AecContext aecContext = new AecContext();
    
    public Future<Object> startLocalMedia(final Activity activity, final RelativeLayout container) {
        final Promise<Object> promise = new Promise<>();
    
        activity.runOnUiThread(() -> {
            // Create a new local media with audio and video enabled.
            localMedia = new CameraLocalMedia(context, false, false, aecContext);
    
            // Set local media in the layout.
            layoutManager = new LayoutManager(container);
            layoutManager.setLocalView(localMedia.getView());
    
            // Start capturing local media.
            localMedia.start().then(localMedia -> {
                promise.resolve(null);
    
            }, promise::reject);
        });
    
        return promise;
    }
    
    public Future<Object> stopLocalMedia() {
        final Promise<Object> promise = new Promise<>();
    
        if (localMedia == null) {
            promise.resolve(null);
        } else {
            // Stop capturing local media.
            localMedia.stop().then(result -> {
                if (layoutManager != null) {
                    // Remove views from the layout.
                    layoutManager.removeRemoteViews();
                    layoutManager.unsetLocalView();
                    layoutManager = null;
                }
    
                if (localMedia != null) {
                    localMedia.destroy();
                    localMedia = null;
                }
    
                promise.resolve(null);
    
            }, promise::reject);
        }
    
        return promise;
    }
    
     var _localMedia: LocalMedia?
     var _layoutManager: FMLiveSwitchCocoaLayoutManager?
         
     func startLocalMedia(container: UIView) -> FMLiveSwitchFuture {
         let promise = FMLiveSwitchPromise()
         
         self._layoutManager = FMLiveSwitchCocoaLayoutManager(container: container)
         self._localMedia = CameraLocalMedia(disableAudio: false, disableVideo: false, aecContext: nil)
         _layoutManager!.setLocalView(self._localMedia!.view())
         
         self._localMedia?.start()?.then(resolveActionBlock: { (obj: Any?) in
             promise!.resolve(withResult: obj as! NSObject)
         }, rejectActionBlock: { (e: NSException?) in
             promise!.reject(with: e)
         })
         return promise!
     }
    
     func stopLocalMedia() -> FMLiveSwitchFuture {
         let promise = FMLiveSwitchPromise()
         
         // Cleaning up Layout Manager and Local Media
         if (_localMedia != nil) {
             self._localMedia?.stop()?.then(resolveActionBlock: { [weak self](obj: Any?) in
                
                 if (self?._layoutManager != nil) {
                     DispatchQueue.main.async {
                         self?._layoutManager?.removeRemoteViews()
                         self?._layoutManager?.unsetLocalView()
                         self?._layoutManager = nil
                         if (self?._localMedia != nil) {
                             self?._localMedia?.destroy()
                             self?._localMedia = nil
                         }
                     }
                 }
                 promise?.resolve(withResult: nil)
             }, rejectActionBlock: { (e: NSException?) in
                 promise!.reject(with: e)
             })
         } else {
             if (self._layoutManager != nil) {
                 DispatchQueue.main.async {
                     self._layoutManager?.removeRemoteViews()
                     self._layoutManager = nil
                 }
             }
             promise?.resolve(withResult: nil)
         }
         return promise!
     }
    

    We didn't implement a custom local media class for the web; instead, we create an instance of fm.liveswitch.LocalMedia directly.

    public localMedia: fm.liveswitch.LocalMedia;
    private layoutManager = new fm.liveswitch.DomLayoutManager(document.getElementById("my-container"));
    
    public startLocalMedia(): fm.liveswitch.Future<Object> {
        const promise = new fm.liveswitch.Promise<Object>();
    
        if (this.localMedia == null) {
            // Create local media with audio and video enabled.
            const audioEnabled = true;
            const videoEnabled = true;
            this.localMedia = new fm.liveswitch.LocalMedia(audioEnabled, videoEnabled);
    
            // Set local media in the layout.
            this.layoutManager.setLocalMedia(this.localMedia);
        }
    
        // Start capturing local media.
        this.localMedia.start()
            .then(() => {
                fm.liveswitch.Log.debug("Media capture started.");
                promise.resolve(null);
            })
            .fail(ex => {
                fm.liveswitch.Log.error(ex.message);
                promise.reject(ex);
            });
    
        return promise;
    }
    
    public stopLocalMedia(): fm.liveswitch.Future<Object> {
        const promise = new fm.liveswitch.Promise<Object>();
    
        // Stop capturing local media.
        this.localMedia.stop()
            .then(() => {
                fm.liveswitch.Log.debug("Media capture stopped.");
                promise.resolve(null);
            })
            .fail(ex => {
                fm.liveswitch.Log.error(ex.message);
                promise.reject(ex);
            });
    
        return promise;
    }
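The asynchronous start/stop pattern above can be sketched independently of the SDK. The hypothetical MediaToggle below (all names are illustrative, not LiveSwitch APIs) serializes transitions behind a promise chain so that rapid start/stop calls, such as a fast double-click on a Join button, never overlap.

```typescript
// Stand-in for the LocalMedia classes used in this tutorial.
interface MediaLike {
  start(): Promise<void>;
  stop(): Promise<void>;
}

class MediaToggle {
  private chain: Promise<void> = Promise.resolve();
  private started = false;

  constructor(private media: MediaLike) {}

  // Queue each transition behind the previous one so the started flag
  // always reflects the last completed start or stop. For simplicity,
  // a rejection is not recovered here; a real app would reset the chain.
  toggle(): Promise<void> {
    this.chain = this.chain.then(() => {
      const next = this.started ? this.media.stop() : this.media.start();
      return next.then(() => { this.started = !this.started; });
    });
    return this.chain;
  }

  isStarted(): boolean { return this.started; }
}
```

The same idea applies on every platform: treat start and stop as state transitions that must complete before the next one begins.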
    

    Uncomment UI Components

    Now, go to the files for the UI components and uncomment the code for starting and stopping local media.

    • CSharp
    • Android
    • iOS
    • TypeScript

    In the MainWindow.xaml.cs file, uncomment all the code between the <LocalMedia> and </LocalMedia> tags.

    Note

    There are multiple instances of these tags. Uncomment all the code between those instances.

    In the StartingFragment.java file, uncomment all the code between the <LocalMedia> and </LocalMedia> tags.

    Note

    There are multiple instances of these tags. Uncomment all the code between those instances.

    In the ViewModel.swift file, uncomment all the code between the <LocalMedia> and </LocalMedia> tags.

    Note

    There are multiple instances of these tags. Uncomment all the code between those instances.

    In the index.ts file, uncomment all the code between the <LocalMedia> and </LocalMedia> tags.

    Note

    There are multiple instances of these tags. Uncomment all the code between those instances.

    Run Your App

    Note

    LiveSwitch recommends that you use a phone and not an emulator to run the mobile apps. It's possible to run the mobile apps on an emulator, but the camera doesn't work.

    Run your app in your project IDE and click Join. You should see yourself on the app window. Congratulations, you've built a LiveSwitch app to handle local media!

    Note

    For TypeScript, it's recommended that you clear your browser cache before clicking Join.

    Copyright © LiveSwitch Inc. All Rights Reserved. Documentation for LiveSwitch Version 1.24.0