Learn how to use Wowza Streaming Engine™ media server software to ingest a non-WebRTC source stream and play it back over WebRTC, either on its own or alongside scalable HTTP-based streaming protocols like HLS.
About non-WebRTC to WebRTC workflows
Wowza Streaming Engine™ media server software version 4.7.7 and later supports WebRTC streaming; however, we recommend updating to version 4.8.5 or later to take advantage of expanded functionality and improved publisher reliability. Wowza Streaming Engine can ingest RTSP, SRT, and RTMP streams and output them as WebRTC content for playback on mobile and desktop browsers that support WebRTC APIs. Supported browsers include the latest versions of Chrome, Firefox, and Safari, as well as Edge version 79 and later. This workflow provides flexibility for publishing streams to Wowza Streaming Engine while maintaining the benefits of WebRTC for playback: lower latency than HTTP-based streaming, browser-based playback without plugins or protocol-specific proprietary players, and the robust feature set of the open source WebRTC framework.
Wowza Streaming Engine also supports hybrid workflows: ingesting streams from RTSP, SRT, or RTMP to deliver the streams over WebRTC and HTTP-based protocols like HLS and MPEG-DASH or any playback protocol that Wowza Streaming Engine supports. This allows a scaled approach for delivery using a content delivery network (CDN) in addition to direct WebRTC browser connections.
When planning a non-WebRTC to WebRTC workflow, consider whether you'll need to transcode the stream to achieve WebRTC playback. Transcoding is required when the ingest source stream has a different audio codec, video codec, or video encoding profile than the WebRTC output; for example, AAC audio must be transcoded to Opus for WebRTC playback. The workflows in this article provide a few possibilities for non-WebRTC ingest to WebRTC playback, but they aren't an exhaustive list.
Set up Wowza Streaming Engine for WebRTC playback
Before setting up your IP camera or encoder, you'll need to configure Wowza Streaming Engine for WebRTC. You can configure WebRTC streaming in XML files at the VHost and application level. For instructions using Wowza Streaming Engine Manager to configure WebRTC instead, see Set up WebRTC streaming with Wowza Streaming Engine Manager. The following three sections are required for WebRTC setup.
Configure SSL/TLS for Wowza Streaming Engine
Encryption is required for all components of the WebRTC workflow. You must have a secure HTTP (HTTPS) connection to a web camera for WebRTC publishing and playback. Because of cross-domain restrictions, you'll also need to configure an SSL certificate to secure the connection between the browser and Wowza Streaming Engine for the SDP data exchange.
We recommend the free Wowza StreamLock certificate to secure this connection. The instructions in this article assume you've configured port 443 with an SSL certificate.
If you plan to use your own SSL certificate, you'll need to convert it to the Java KeyStore (JKS) format that Wowza Streaming Engine requires. Although it's possible to use self-signed certificates with WebRTC, make sure the browser you're using for testing accepts traffic encrypted with any self-signed certificates in use.
Configure the HTTP provider
You must configure the HTTPWebRTCExchangeSessionInfo HTTP provider to support an SDP exchange for the WebRTC session. This is equivalent to selecting Use WebRTC for host port 443 in Wowza Streaming Engine Manager. If you have already done this, continue to Configure a live application.
- Navigate to [install-dir]/conf/ and open VHost.xml in a text editor.
- Locate the <HostPort> container for port 443 with SSL and add the following XML snippet to the <HTTPProviders> container. The new HTTP provider must be the second-to-last provider in the list.
<HTTPProvider>
    <BaseClass>com.wowza.wms.webrtc.http.HTTPWebRTCExchangeSessionInfo</BaseClass>
    <RequestFilters>*webrtc-session.json</RequestFilters>
    <AuthenticationMethod>none</AuthenticationMethod>
</HTTPProvider>
- Save your changes to VHost.xml.
Note: The 443 host port with SSL is commented out by default in VHost.xml. When you're ready to stream over WebRTC, uncomment it by removing the <!-- 443 with SSL -->, <!--, and --> lines from around the <HostPort> container.
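To illustrate the placement, here's an abbreviated sketch of the 443 <HostPort> container after the edit. Most default elements and providers are omitted, and the final catch-all provider shown here reflects the HTTPServerVersion provider that ships in the default VHost.xml; your file may differ slightly by version.
<HostPort>
    <Port>443</Port>
    ...
    <HTTPProviders>
        ...
        <!-- New WebRTC provider, second-to-last in the list -->
        <HTTPProvider>
            <BaseClass>com.wowza.wms.webrtc.http.HTTPWebRTCExchangeSessionInfo</BaseClass>
            <RequestFilters>*webrtc-session.json</RequestFilters>
            <AuthenticationMethod>none</AuthenticationMethod>
        </HTTPProvider>
        <!-- Existing catch-all provider remains last -->
        <HTTPProvider>
            <BaseClass>com.wowza.wms.http.HTTPServerVersion</BaseClass>
            <RequestFilters>*</RequestFilters>
            <AuthenticationMethod>none</AuthenticationMethod>
        </HTTPProvider>
    </HTTPProviders>
</HostPort>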
Configure a live application
After the HTTP provider is added to your VHost.xml file, configure a live application for WebRTC streaming. This example uses the live application that's included in the default Wowza Streaming Engine installation.
Note: These instructions assume a new installation of Wowza Streaming Engine 4.7.7 or later. If you've updated from an earlier version, you must copy and paste the <WebRTC> container from the sample WebRTC Application.xml file into your [install-dir]/conf/[applicationName]/Application.xml file.
- Open [install-dir]/conf/[applicationName]/Application.xml in a text editor, and configure the following properties in the <WebRTC> container.
Name | Type | Description
EnablePublish | Boolean | Set to true to enable WebRTC publishing to this application.
EnablePlay | Boolean | Set to true to enable WebRTC playback from this application.
EnableQuery | Boolean | Set to true to enable querying of published stream names for this application.
IceCandidateIpAddresses | String | The IP address, transport, and port to use for WebRTC streaming. For User Datagram Protocol (UDP) delivery, set the value to [wowza-streaming-engine-external-ip-address],udp, where [wowza-streaming-engine-external-ip-address] is the external IP address of the Wowza Streaming Engine instance; the port is dynamically assigned. For Transmission Control Protocol (TCP) delivery, set the value to [wowza-streaming-engine-external-ip-address],tcp,[port], where [port] is one of the non-SSL streaming HostPort entries defined in [install-dir]/conf/VHost.xml. For example, to stream over port 1935, the entry would be 66.175.168.127,tcp,1935. To specify multiple IP addresses, separate the entries with a pipe character. See the notes after this table.
UDPBindAddress | String | The local IP address of the network card to use for WebRTC UDP traffic. This value isn't used when streaming WebRTC over TCP. For UDP delivery, it's generally fine to leave this property blank; it's only needed when the server has multiple network interfaces. In some network situations, such as running on a cloud instance, use 0.0.0.0 instead of the local IP address of the network card to prevent connection problems.
PreferredCodecsAudio | String | A comma-separated list of audio codecs, in order of preference, for stream ingestion. The default is opus,pcmu,pcma.
PreferredCodecsVideo | String | A comma-separated list of video codecs, in order of preference, for stream ingestion. Valid values are vp8, vp9, and h264; the default is vp8,h264. To stream in Chrome using VP9, use vp9,vp8,h264. If no value is specified, the WebRTC stream won't be ingested.
DebugLog | Boolean | Set to true to enable WebRTC debug logging.

Notes for IceCandidateIpAddresses:
- Firefox requires UDP; TCP isn't supported.
- Port 554 isn't supported for TCP ICE candidates.
- If you use UDP ICE candidates, enabling NACK messages is recommended to allow retransmission of lost packets. See the optional RTP properties for information about how to enable NACK.
- Full Session Traversal Utilities for NAT (STUN) negotiation isn't supported at this time. Wowza Streaming Engine only supports traversal of symmetric NATs, and a single STUN transport configuration (TCP or UDP) must be supplied using IceCandidateIpAddresses. TURN servers aren't supported at this time.
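Taken together, a minimal sketch of the <WebRTC> container in Application.xml might look like the following. The element names mirror the property names above; the IceCandidateIpAddresses value is a placeholder, and exact defaults can vary by Wowza Streaming Engine version.
<WebRTC>
    <EnablePublish>true</EnablePublish>
    <EnablePlay>true</EnablePlay>
    <EnableQuery>true</EnableQuery>
    <IceCandidateIpAddresses>[wowza-streaming-engine-external-ip-address],tcp,1935</IceCandidateIpAddresses>
    <UDPBindAddress></UDPBindAddress>
    <PreferredCodecsAudio>opus,pcmu,pcma</PreferredCodecsAudio>
    <PreferredCodecsVideo>vp8,h264</PreferredCodecsVideo>
    <DebugLog>false</DebugLog>
    <Properties>
    </Properties>
</WebRTC>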
- If needed, in the [install-dir]/conf/[applicationName]/Application.xml file, add the following optional <Property> definitions, including the Name, Type, and Value, to the <WebRTC>/<Properties> container.
Name | Type | Description
webrtcCodecUpdates | Boolean | Set to true to allow codec updates for WebRTC connections to pass through the system. The default value is true. Enabling this property can lead to frequent calls to the onCodecInfoVideo method of the IMediaStreamActionNotify3 interface, depending on the publishing client's source encoder. See Listen for stream events and codec information with the Wowza Streaming Engine Java API.
webrtcFIRMessageInterval | Integer | The interval between Full Intra Request (FIR) messages. If webrtcFIRMessageScheme is set to time, the value is in milliseconds and the default is 1000 (1 second). If webrtcFIRMessageScheme is set to frame, the value is in frames and the default is 30.
webrtcFIRMessageScheme | String | The scheme for the FIR message interval. Use time to send a FIR message after a specified period of time, or frame to send a FIR message after a specified number of frames are received. The default is time. FIR messages, which are enabled by default, provide resiliency against packet loss while publishing WebRTC streams to Wowza Streaming Engine.
webrtcIdleTimeout | Integer | The length of time, in milliseconds, after which a WebRTC session closes if there is no publish or playback activity. The default and recommended value is 10000 (10 seconds).
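For example, to set the idle timeout explicitly, a <Property> entry in the <WebRTC>/<Properties> container might look like this (10000 is the documented default; adjust the value as needed):
<Properties>
    <Property>
        <Name>webrtcIdleTimeout</Name>
        <Value>10000</Value>
        <Type>Integer</Type>
    </Property>
</Properties>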
Finally, configure the live application's optional RTP properties.
- To handle out-of-order and retransmitted packets for WebRTC publishing, add the following optional RTP <Property> definitions, including Name, Type (where applicable), and Value, to the <RTP>/<Properties> container in the [install-dir]/conf/[applicationName]/Application.xml file.
Name | Type | Description
jitterBufferDebug | Boolean | Set to true to log information about out-of-order packets that are received and inserted in time. Logged information includes the length of time to recover a packet, which can help determine an appropriate value for jitterBufferDelay. The default is false.
jitterBufferDelay | Integer | The delay, in milliseconds, added by the jitter buffer. This delay allows retransmitted and out-of-order packets to be played into the system in order, provided they're received in time. If useNack is true, the recommended delay is 2 times the round-trip time (RTT) plus overhead for processing. If useNack is false, the recommended delay is 100. The default is 500.
logPacketLoss | Boolean | Set to true to log when packets are lost and not recovered by the time they should have been played into the system. The default is false.
rtpDePacketizerWrapper | — | Set to com.wowza.wms.rtp.depacketizer.JitterBuffer to enable a jitter buffer for WebRTC that handles out-of-order and retransmitted packets. The jitter buffer is recommended for UDP-based WebRTC streams and adds a delay to the stream. The jitter buffer must be enabled for the useNack property to take effect.
useNack | Boolean | Set to true to send Negative Acknowledgement (NACK) messages to publishing clients to allow retransmission of lost packets. This property applies to WebRTC streams only. The default is false. NACK messages are recommended for UDP-based WebRTC streams and add a delay to the stream. The jitter buffer (rtpDePacketizerWrapper) must also be enabled for this property to take effect.
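As an illustration, here's a sketch of how the jitter buffer and NACK properties might be combined in the <RTP>/<Properties> container. The jitterBufferDelay value is an example only; per the guidance above, tune it to roughly two times your round-trip time plus processing overhead when useNack is true.
<Properties>
    <Property>
        <Name>rtpDePacketizerWrapper</Name>
        <Value>com.wowza.wms.rtp.depacketizer.JitterBuffer</Value>
    </Property>
    <Property>
        <Name>useNack</Name>
        <Value>true</Value>
        <Type>Boolean</Type>
    </Property>
    <Property>
        <Name>jitterBufferDelay</Name>
        <!-- Example value; tune to roughly 2x RTT plus processing overhead when useNack is true -->
        <Value>200</Value>
        <Type>Integer</Type>
    </Property>
</Properties>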
- If you have known issues with playback compatibility related to SDP information, add the following optional RTP <Property> definitions, including Name, Type, and Value, to the <RTP>/<Properties> container in the [install-dir]/conf/[applicationName]/Application.xml file. The properties affect WebRTC streams only.
Name | Type | Description
rtpForceH264Constraint | Boolean | Set to true to allow the returned SDP information to contain different H.264 constraint values than the stream contains. The default value is false.
rtpForceH264ConstraintValue | Integer | Sets the H.264 constraint fields in the SDP information when rtpForceH264Constraint is true. The default value, 192, works in most circumstances. Other valid values are 128, 224, and 240.
rtpUseLowestH264Constraint | Boolean | Set to true to compare the initial codec data and the first video unit to the profile data and use the lowest value to determine the H.264 video profile in the SDP information. Use this or rtpUseHighestH264Constraint to help play the stream when an encoder doesn't send consistent codec information.
rtpUseHighestH264Constraint | Boolean | Set to true to compare the initial codec data and the first video unit to the profile data and use the highest value to determine the H.264 video profile in the SDP information. Use this or rtpUseLowestH264Constraint to help play the stream when an encoder doesn't send consistent codec information.
Note: The rtpUseLowestH264Constraint and rtpUseHighestH264Constraint properties can't be used simultaneously. If both are set to true, rtpUseHighestH264Constraint is used.
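As an illustration, the force-constraint properties might be added to the <RTP>/<Properties> container like this. Here rtpForceH264Constraint is set to true to enable the override, and 192 is the documented default constraint value.
<Property>
    <Name>rtpForceH264Constraint</Name>
    <Value>true</Value>
    <Type>Boolean</Type>
</Property>
<Property>
    <Name>rtpForceH264ConstraintValue</Name>
    <Value>192</Value>
    <Type>Integer</Type>
</Property>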
Connect a non-WebRTC source to Wowza Streaming Engine
Next, set up a source to publish the stream to Wowza Streaming Engine and configure a transcoder template, when necessary, to prepare for playback using WebRTC. Select one workflow from the following options.
Set up an RTSP source for a passthrough video-only stream
In this example workflow, you'll ingest a stream from an IP camera or RTSP encoder into Wowza Streaming Engine with an output of video-only WebRTC playback. The example RTSP source stream has H.264 video and AAC audio. If you require audio playback, you need to transcode the audio stream in Wowza Streaming Engine from the AAC audio codec to the Opus audio codec for WebRTC output. In this example scenario, Wowza Streaming Engine ingests an RTSP stream, passes through the video and audio, and makes a video-only WebRTC stream available for playback. By default, Wowza Streaming Engine can also package the stream into HLS, MPEG-DASH, and other protocols for playback at scale.
Configure the RTSP encoder or IP camera
Keep the following in mind when setting up an IP camera or RTSP encoder to prepare for playback using WebRTC:
- For the greatest interoperability with all potential playback clients, set the profile setting on your IP camera or encoder to Baseline. If you choose to set the profile to Main or High, ensure that the camera doesn’t encode with B-frames. If you’re unable to disable B-frames in the source stream, you’ll need to send the video source through the Wowza Streaming Engine transcoder to remove them.
- To maintain low latency, you may need to tune your RTSP source for low latency streaming. If your camera doesn't have B-frame specific tuning, low latency tuning may also result in a bitstream without B-frames. Refer to documentation for your IP camera or encoder on how to tune for low latency.
- For the greatest playback compatibility when encoding with H.264, the recommended resolution is 720p with a frame rate of 30 fps.
To send a stream from an RTSP source like an IP camera into Wowza Streaming Engine, use one of the following configurations.
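For example, one common configuration (sketched here with a hypothetical file name and placeholder values) is to re-stream the camera through a .stream file. Create a file such as [install-dir]/content/ipcamera.stream whose only content is the camera's RTSP URL:
rtsp://[camera-ip-address]:554/[stream-path]
Then connect to the stream file from your live application using the rtp MediaCaster type (for example, from the Stream Files page in Wowza Streaming Engine Manager), and use ipcamera.stream as the stream name for playback.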
If latency is a concern, see Tune for latency for additional options.
After you’ve configured your RTSP source, jump to Test playback for next steps.
Set up an SRT source for a transcoded audio, passthrough video stream
In this example workflow, you'll ingest an SRT stream from an encoder into Wowza Streaming Engine with an output of WebRTC video and audio playback. The example SRT stream has AAC audio and H.264 video. You’ll need to use Wowza Streaming Engine to transcode the AAC audio to Opus for WebRTC output. Wowza Streaming Engine ingests the SRT stream, transcodes the audio, passes through the video, and makes a WebRTC stream available for playback. By default, Wowza Streaming Engine can package the stream into HLS, MPEG-DASH, and other protocols for playback at scale. Transcoding introduces latency into the media delivery pipeline, so this transcoded workflow will have a longer startup time and higher latency than a passthrough video-only workflow.
Configure the SRT encoder
To set up an SRT encoder to prepare for playback using WebRTC, start by configuring a stream file to re-stream an MPEG-TS-based SRT stream. Note that SRT ingest is only available in Linux and Windows Wowza Streaming Engine installations.
To ingest an SRT stream, you must create a stream file that's configured with SRT-specific properties. When ingesting the stream, Wowza Streaming Engine functions in SRT listener mode when establishing a handshake with a peer.
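For reference, the contents of an SRT stream file (for example, mySRTSource.stream, a hypothetical name) are just the source URI, in the form:
srt://[address]:[port]
where [address] and [port] are placeholders for the address and port described in the steps below; SRT-specific behavior is then controlled through the stream file's properties.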
- First, follow the instructions in Create and use .stream files in Wowza Streaming Engine to create the stream file. Note that it is important to use the IP address of the server hosting the SRT stream as the Stream URI.
- In Wowza Streaming Engine Manager, click the Application tab at the top of the page, and then click Stream Files in the left panel.
- Click the name of the stream file you want to configure in the Stream Files list, and then click the Properties tab.
Note: Access to the Properties tab is limited to administrators with advanced permissions. For more information about how to configure access, see Manage credentials.
- On the Properties tab, there are two sets of properties: Common properties, which can be configured for any stream type, and SRT-specific properties, which are unique for the source stream URI that's specified in the stream file. Depending on how you've configured your SRT stream, you may need to use both sets of properties. For more information, see Specify SRT stream (srt://) settings.
To enable a property, click Edit and then select the Enabled check box for the property. The property will be enabled with a default value that you can change in the Value box.
- Click Save.
Configure Wowza Streaming Engine for audio-only transcoding
- Using a text editor, open [install-dir]/conf/[application-name]/Application.xml and set the <Transcoder>/<LiveStreamTranscoder> element to transcoder:
<LiveStreamTranscoder>transcoder</LiveStreamTranscoder>
- Next, set the <Transcoder>/<Templates> element to a transcoder template file name that you'll create in the next step, for example, audioonly-webrtc.xml.
<Templates>audioonly-webrtc.xml</Templates>
- Navigate to [install-dir]/transcoder/templates, duplicate the audioonly.xml file, and rename it. For example: audioonly-webrtc.xml.
- Open the new transcoder template file in a text editor and make the following adjustments:
- In the <Encode>/<Audio> container, set <Codec> to Opus and enter a valid value for <Bitrate>.
- If your source doesn't deliver audio at a 48000 Hz sample rate, enable the <Resample> block in your <Audio> encode and set <SampleRate> to 48000, as shown below.
<Audio>
    <Codec>Opus</Codec>
    <Bitrate>96000</Bitrate>
    <Resample>
        <Enable>true</Enable>
        <SampleRate>48000</SampleRate>
        <Channels>2</Channels>
    </Resample>
    <Parameters>
    </Parameters>
</Audio>
- If your source isn't configured to send stereo audio, in the <Decode>/<Audio> container, set the default.scaleChannels parameter to 2 to force any mono audio frames to stereo by duplicating the mono stream for both the left and right channels.
<Decode>
    <Audio>
        <Parameters>
            <Parameter>
                <Name>default.scaleChannels</Name>
                <Value>2</Value>
                <Type>Integer</Type>
            </Parameter>
        </Parameters>
    </Audio>
</Decode>
- Save the changes.
If latency is a concern, see Tune for latency for additional options.
After you’ve configured your SRT source and the Transcoder, jump to Test playback for next steps.
Set up an RTMP source for a transcoded video and audio stream
In this example workflow, you'll ingest an RTMP stream from a camera or encoder into Wowza Streaming Engine with an output of WebRTC video and audio playback. The example RTMP stream has AAC audio and H.264 video with a High profile. You’ll need to transcode the AAC audio to Opus, and, for this example, you'll also transcode the H.264 video to adjust the profile from High to Main. Wowza Streaming Engine ingests the RTMP stream, transcodes the video and audio, and makes a WebRTC stream available for playback. By default, Wowza Streaming Engine can package the stream into HLS, MPEG-DASH, and other protocols for playback at scale. Transcoding introduces latency into the media delivery pipeline, so this transcoded workflow will have a longer startup time and higher latency than a passthrough video-only workflow.
Configure the RTMP encoder
When setting up an RTMP encoder to prepare for playback using WebRTC, enter the following application connection settings in your encoder:
- Server URL – rtmp://[wowza-ip-address]/[application-name]
- Stream Name – myStream
- User – publisherName
- Password – [password]
Configure Wowza Streaming Engine for video and audio transcoding
- Using a text editor, open [install-dir]/conf/[application-name]/Application.xml and set the <Transcoder>/<LiveStreamTranscoder> element to transcoder:
<LiveStreamTranscoder>transcoder</LiveStreamTranscoder>
- Next, set the <Transcoder>/<Templates> element to a transcoder template file name that you'll create in the next step, for example, transcode-webrtc.xml.
<Templates>transcode-webrtc.xml</Templates>
- Navigate to [install-dir]/transcoder/templates, duplicate the transcode.xml file, and rename it. For example: transcode-webrtc.xml.
- Open the new transcoder template file in a text editor and make the following adjustments:
- Depending on the quality of your incoming stream, enable the template's <Encode> block for 720p or 360p by setting <Enable> to true.
- In the <Encode>/<Video> container, ensure <Codec> is set to H.264, and set <Profile> to your desired setting. For this example, use main.
- If you’re unable to turn off B-frames in your RTMP encoder, using the default MainConcept or NVIDIA NVENC encoding implementation will remove them. In the <Encode>/<Video> container, ensure that <Implementation> is set to default for the MainConcept software encoder or to NVENC for NVIDIA NVENC hardware-accelerated encoding. If you use QuickSync encoding, note that B-frames are only removed in the Baseline profile.
<Video>
    <Codec>H.264</Codec>
    <Implementation>default</Implementation>
    <GPUID>-1</GPUID>
    <FrameSize>
        <FitMode>fit-height</FitMode>
        <Width>1280</Width>
        <Height>720</Height>
    </FrameSize>
    <Profile>main</Profile>
    <Bitrate>1300000</Bitrate>
    <KeyFrameInterval>
        <FollowSource>false</FollowSource>
        <Interval>60</Interval>
    </KeyFrameInterval>
    ...
</Video>
- In the <Encode>/<Audio> container, set <Codec> to Opus and enter a valid value for <Bitrate>.
- If your source doesn't deliver audio at a 48000 Hz sample rate, enable the <Resample> block in your <Audio> encode and set <SampleRate> to 48000, as shown below.
<Audio>
    <Codec>Opus</Codec>
    <Bitrate>96000</Bitrate>
    <Resample>
        <Enable>true</Enable>
        <SampleRate>48000</SampleRate>
        <Channels>2</Channels>
    </Resample>
    <Parameters>
    </Parameters>
</Audio>
- If your source isn't configured to send stereo audio, in the <Decode>/<Audio> container, set the default.scaleChannels parameter to 2 to force any mono audio frames to stereo by duplicating the mono stream for both the left and right channels.
<Decode>
    <Audio>
        <Parameters>
            <Parameter>
                <Name>default.scaleChannels</Name>
                <Value>2</Value>
                <Type>Integer</Type>
            </Parameter>
        </Parameters>
    </Audio>
</Decode>
- Save the changes.
If latency is a concern, see Tune for latency for additional options.
After you’ve configured your RTMP source and the Transcoder, go to Test playback for next steps.
Tune for latency
The expected latency of passthrough streams that output WebRTC is one second or less. Transcoding, however, introduces latency into the media delivery pipeline. Streams that require audio or video transcoding have an expected latency of two seconds or less. While some amount of latency is expected due to frame buffering in your encoder, there are steps you can take to tune the WebRTC stream for low latency, depending on the source and whether it needs to be transcoded.
For an RTSP source stream:
- In [install-dir]/conf/[application-name]/Application.xml, in the <Streams> container, set <StreamType> to rtp-live-lowlatency.
<Streams>
    <StreamType>rtp-live-lowlatency</StreamType>
    ...
</Streams>
For an RTMP source stream:
- In [install-dir]/conf/[application-name]/Application.xml, in the <Streams> container, set <StreamType> to live-lowlatency.
<Streams>
    <StreamType>live-lowlatency</StreamType>
    ...
</Streams>
For any transcoded stream:
- In your transcoder template in [install-dir]/transcoder/templates, disable the <SortBuffer> in the <PostProcess> container element by setting <Enable> to false.
<PostProcess>
    <SortBuffer>
        <Enable>false</Enable>
        ...
    </SortBuffer>
</PostProcess>
Test playback
Now that your custom live application and source stream ingest are set up, you can test your workflow.
Test WebRTC playback
In production environments, a WebRTC playback page must be hosted on a web server utilizing SSL/TLS encryption. For testing and learning purposes, Wowza provides a hosted WebRTC playback test page so you can see WebRTC in action more quickly.
Note:
- Wowza Streaming Engine 4.8.5 or later is required for the hosted WebRTC playback test page.
- You can use the Wowza hosted WebRTC test pages with the latest versions of Chrome, Firefox, and Safari, as well as Microsoft Edge version 79 and later.
- Ensure that your IP camera or encoder is sending a source stream to Wowza Streaming Engine.
- In a browser tab, go to the WebRTC play hosted test page.
- In the Signaling URL field, enter the secure WebSocket URL to connect to the Wowza Streaming Engine WebRTC sessions listener:
wss://[ssl-certificate-domain-name]/webrtc-session.json
where [ssl-certificate-domain-name] is the secure domain name for your Wowza Streaming Engine instance.
If using Wowza StreamLock, for example, the Signaling URL looks something like this:
wss://5ab4321c0d123.streamlock.net/webrtc-session.json
If you are connecting WebRTC sessions using a port other than the standard SSL/TLS port 443, you must include that non-standard port in the Signaling URL:
wss://5ab4321c0d123.streamlock.net:[SSL-port-number]/webrtc-session.json
- Enter an Application Name that matches the WebRTC live application you configured.
- For Stream Name, enter the name of the stream you're sending to Wowza Streaming Engine, such as myStream.
- To play the WebRTC stream from Wowza Streaming Engine, click Play.
- To test playback in a different browser or with a different device, click Copy config to copy the configuration settings and share them.
For more advanced learning and testing, Wowza Media Systems provides WebRTC examples on GitHub that demonstrate how to play WebRTC streams with Wowza Streaming Engine. See Use WebRTC example pages with Wowza Streaming Engine.
Test playback for HLS or other protocols
By default, Wowza Streaming Engine packages incoming source streams into multiple playback protocols including HLS and MPEG-DASH. See the Wowza Video Test Players page to test playback using these protocols from Wowza Streaming Engine.
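For example, assuming the default live application and a stream named myStream, an HLS playback URL typically looks like the following (substitute your server's address, and use https with your SSL port if you play back over the secured host port configured earlier):
http://[wowza-ip-address]:1935/live/myStream/playlist.m3u8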
More resources