Capture a User's Screen
In the section on Creating Streams and Connections, you learned how to capture input from a user's camera and microphone. You are not limited to a user's camera; you can also use LiveSwitch to perform a screen share. This section demonstrates how to allow your users to capture their screen data and share it with others in a session.
In other sections, you learned that to capture a user's microphone and camera, you must provide a class derived from the LiveSwitch LocalMedia class. In this class, you had to override the CreateVideoSource function and with it create and return a CameraSource. If you want to support screen capture, you need to do something similar, but instead of returning a CameraSource video source, you return a ScreenSource, as in the code below:
public class LocalScreenMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    public override FM.LiveSwitch.VideoSource CreateVideoSource()
    {
        // Capture the screen at 3 frames per second.
        return new FM.LiveSwitch.WinForms.ScreenSource(3);
    }

    public override FM.LiveSwitch.ViewSink<...> CreateViewSink()
    {
#if WINFORMS
        return new FM.LiveSwitch.WinForms.PictureBoxSink
#else // WPF
        return new FM.LiveSwitch.Wpf.ImageSink
#endif
        {
            ViewScale = LayoutScale.Contain
        };
    }
}
To capture a user's screen instead of their camera, simply use this new LocalScreenMedia class instead of the LocalCameraMedia class. It's as simple as that.
In other sections, you learned that to capture a user's microphone and camera, you must provide a class derived from the LiveSwitch LocalMedia class. In this class, you had to override the CreateVideoSource function and with it create and return a CameraSource. If you want to support screen capture, you need to do something similar, but instead of returning a CameraSource video source, you return a ScreenSource, as in the code below:
- Android uses a media projection to support screen sharing. This is loosely demonstrated here for illustration purposes, but in production, you need to do something a bit different. You can always look at the Android example app that ships with the SDK for a full example.
public class LocalScreenMedia extends fm.liveswitch.RtcLocalMedia<...> {
    @Override
    public fm.liveswitch.VideoSource createVideoSource() {
        // mediaProjection and context are provided by the host activity.
        return new MediaProjectionSource(mediaProjection, context, 1);
    }
    ...
}
To capture a user's screen instead of their camera, simply use this new LocalScreenMedia class instead of the LocalCameraMedia class. It's as simple as that.
In other sections, you learned that to capture a user's microphone and camera, you must provide a class derived from the LiveSwitch LocalMedia class. In this class, you had to override the CreateVideoSource function and with it create and return a CameraSource. If you want to support screen capture, you need to do something similar, but instead of returning a CameraSource video source, you return a ScreenSource, as in the code below:
- iOS uses the FMLiveSwitchCocoaScreenSource, which supports capturing the screen of your app only. This is due to restrictions that iOS places on the underlying API used by the source.
public class LocalScreenMedia: FMLiveSwitchRtcLocalMedia {
    override func createVideoSource() -> FMLiveSwitchVideoSource {
        // Capture the app's screen at 3 frames per second.
        return FMLiveSwitchCocoaScreenSource(frameRate: 3)
    }
}
To capture a user's screen instead of their camera, simply use this new LocalScreenMedia class instead of the LocalCameraMedia class. It's as simple as that.
In other sections, you learned that to capture a user's microphone and camera, you must create an fm.liveswitch.LocalMedia instance. The first two parameters that you pass to this are boolean values that indicate whether or not to capture audio and whether or not to capture video, as in the code below:
- Screen capture in Chrome versions prior to 72 requires installation of a Chrome extension. This can be managed easily using LiveSwitch's Web Plugin class. For a discussion of how to use the Web Plugin class to manage screen capture extensions, see the following section.
var localMedia = new fm.liveswitch.LocalMedia(true, true);
To capture a user's screen instead of their camera, simply provide one additional parameter. The LocalMedia constructor's optional third parameter is a boolean value that determines whether or not the user's screen is captured. It defaults to false, which tells LiveSwitch to capture the user's camera. In this case, set it to true, and LiveSwitch tries to capture the user's screen instead:
var localMedia = new fm.liveswitch.LocalMedia(true, true, true);
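Three positional booleans are easy to transpose at a call site. As a purely illustrative convenience, a small wrapper can map named options onto the positional flags; note that the helper name and its option names below are our own, not part of the LiveSwitch API:

```javascript
// Illustrative helper: maps named options onto the positional flags of
// fm.liveswitch.LocalMedia(captureAudio, captureVideo, captureScreen).
// The function and option names here are our own, not LiveSwitch API.
function localMediaArgs({ audio = true, video = true, screen = false } = {}) {
  return [audio, video, screen];
}

// Usage (spread into the real constructor):
// var localMedia = new fm.liveswitch.LocalMedia(...localMediaArgs({ screen: true }));
```

With this, a screen-share call site reads as intent rather than as a row of anonymous booleans.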
In other sections, you learned that to capture a user's microphone and camera, you must provide a class derived from the LiveSwitch LocalMedia class. In this class, you had to override the CreateVideoSource function and with it create and return a CameraSource. If you want to support screen capture, you need to do something similar, but instead of returning a CameraSource video source, you return a ScreenSource, as in the code below:
public class LocalScreenMedia: FMLiveSwitchRtcLocalMedia {
    override func createVideoSource() -> FMLiveSwitchVideoSource {
        // Capture the app's screen at 3 frames per second.
        return FMLiveSwitchCocoaScreenSource(frameRate: 3)
    }
}
To capture a user's screen instead of their camera, simply use this new LocalScreenMedia class instead of the LocalCameraMedia class. It's as simple as that.
In other sections, you learned that to capture a user's microphone and camera, you must provide a class derived from the LiveSwitch LocalMedia class. In this class, you had to override the CreateVideoSource function and with it create and return a CameraSource. If you want to support screen capture, you need to do something similar, but instead of returning a CameraSource video source, you return a ScreenSource, as in the code below:
- Android uses a media projection to support screen sharing. This is loosely demonstrated here for illustration purposes, but in production you need to do something a bit different. You can always look at the example app that ships with the SDK for a full example.
- iOS uses the FM.LiveSwitch.Cocoa.ScreenSource, which supports capturing the screen of your app only. This is due to restrictions that iOS places on the underlying API used by the source.
#if __ANDROID__
public class LocalScreenMedia : FM.LiveSwitch.RtcLocalMedia<View>
{
    public override FM.LiveSwitch.VideoSource CreateVideoSource()
    {
        // mediaProjection and context are provided by the host activity.
        return new MediaProjectionSource(mediaProjection, context, 1);
    }
}
#elif __IOS__
public class LocalScreenMedia : FM.LiveSwitch.RtcLocalMedia<FM.LiveSwitch.Cocoa.UIImageView>
{
    public override FM.LiveSwitch.VideoSource CreateVideoSource()
    {
        // Capture the app's screen at 3 frames per second.
        return new FM.LiveSwitch.Cocoa.ScreenSource(3);
    }

    protected override ViewSink<UIImageView> CreateViewSink()
    {
        return new ImageViewSink();
    }
}
#endif
To capture a user's screen instead of their camera, simply use this new LocalScreenMedia class instead of the LocalCameraMedia class. It's as simple as that.
In other sections, you learned that to capture a user's microphone and camera, you must provide a class derived from the LiveSwitch LocalMedia class. In this class, you had to override the CreateVideoSource function and with it create and return a CameraSource. If you want to support screen capture, you need to do something similar, but instead of returning a CameraSource video source, you return a ScreenSource, as in the code below:
public abstract class LocalMedia : RtcLocalMedia<UnityEngine.RectTransform>
{
    ...
}

public class LocalScreenMedia : LocalMedia
{
    public LocalScreenMedia(bool disableAudio, bool disableVideo, AecContext aecContext)
        : base(disableAudio, disableVideo, aecContext)
    {
        Initialize();
    }

    protected override VideoSource CreateVideoSource()
    {
        return new LSUnity.ScreenSource();
    }

    protected override ViewSink<UnityEngine.RectTransform> CreateViewSink()
    {
        // Screen capture doesn't generally need a preview.
        // If you want one, return a new RectTransformSink here.
        return null;
    }
}
To capture a user's screen instead of their camera, simply use this new LocalScreenMedia class instead of the LocalCameraMedia class. It's as simple as that.
Now you can establish a connection as you normally would, but instead of the camera, you are now sharing the user's screen. But what if you want to share both the user's camera and their screen? This is a common scenario. The best way to address this is to first connect with everyone as you normally would. Then, when a user wishes to share their screen, create a new connection specifically for the screen share, and disable audio, like so:
Note
About the code examples below:
- For .NET MAUI and Unity, use the C# code.
- For macOS, use the iOS code.
bool disableAudio = true;
bool disableVideo = false;
var localMedia = new LocalScreenMedia(disableAudio, disableVideo, ...);
MediaProjectionManager manager = (MediaProjectionManager) this.getSystemService(MEDIA_PROJECTION_SERVICE);
...
boolean disableAudio = true;
boolean disableVideo = false;
localMedia = new LocalScreenMedia(manager.getMediaProjection(...), context, disableAudio, disableVideo, ...);
self._localMedia = LocalScreenMedia(disableAudio: true, disableVideo: false, ...)
var localMedia = new fm.liveswitch.LocalMedia(false, true, true);
#if __ANDROID__
MediaProjectionManager manager = GetSystemService(MediaProjectionService).JavaCast<MediaProjectionManager>();
...
bool disableAudio = true;
bool disableVideo = false;
var localMedia = new LocalScreenMedia(manager.GetMediaProjection(...), Context, disableAudio, disableVideo, ...);
#endif
Once you have the screen media, establish a connection with each user in the session to share your screen.
Best Practice
If you wish to share video from the camera and video from screen capture at the same time, the best practice is to create two connections. One connection manages video from the camera (and audio, if needed), and the other manages video from the screen capture (disable audio in this connection). This makes it trivial to end the screen share at any point and also allows users to selectively mute the screen share if it is using too much bandwidth. Also, browsers do not allow more than one video stream per connection, so if you want to support this use case with browser clients, you must manage camera and screen concerns in separate connections.
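The two-connection pattern above can be sketched as follows. This is only a sketch: the plain objects stand in for real LiveSwitch connections, and openConnection is a placeholder injected by the caller (in a real app it would create and open an upstream connection for the given media), not a LiveSwitch API.

```javascript
// Sketch of the two-connection best practice. "openConnection" is a
// caller-supplied placeholder, not a LiveSwitch API; it takes media plus
// options and returns a connection-like object with a close() method.
class ShareManager {
  constructor(openConnection) {
    this.openConnection = openConnection; // (media, options) => connection
    this.cameraConnection = null;
    this.screenConnection = null;
  }

  startCamera(cameraMedia) {
    // The camera connection carries audio and video.
    this.cameraConnection = this.openConnection(cameraMedia, { audio: true, video: true });
  }

  startScreenShare(screenMedia) {
    // The screen share gets its own audio-free connection, so it can be
    // closed or muted later without touching the camera stream.
    this.screenConnection = this.openConnection(screenMedia, { audio: false, video: true });
  }

  stopScreenShare() {
    if (this.screenConnection) {
      this.screenConnection.close();
      this.screenConnection = null;
    }
  }
}
```

Because the screen share lives in its own connection, ending it is a single close() call that leaves the camera connection untouched.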
You've now learned how to capture and share a user's screen. When sharing screens, remember that a screen share can consume significantly more bandwidth than a typical camera stream. This is not usually a problem on modern networks, but make sure that you don't allow every user to share their screen at once, or the video quality of the conference will suffer.
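To see why unrestricted sharing scales badly, it helps to count media streams rather than bits. The back-of-the-envelope model below assumes an SFU-style topology (one upstream leg per sharer, plus one downstream leg per share for every other participant); it is our own simplification for illustration, not a LiveSwitch-specific formula.

```javascript
// Total screen-share legs in an SFU-style topology: each sharer sends one
// upstream, and every other participant pulls one downstream per share.
// A rough illustrative model, not part of any LiveSwitch API.
function screenShareStreams(participants, sharers) {
  return sharers + sharers * (participants - 1);
}

// With 10 participants, one sharer creates 10 legs in this model; if all
// 10 share at once, the server is suddenly routing 100 screen-share legs.
```

Stream count grows linearly with the number of sharers, which is why limiting concurrent screen shares is the easiest lever for protecting conference quality.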