A measure of resolution for 16:9 video whose frame size is 3840x2160, or roughly 4000 pixels in width. 4K video is also sometimes called Ultra HD (UHD), or ultra high definition video; or 2160p video. 4K streaming provides a very sharp, clear picture but requires sufficient bandwidth in order to play smoothly.



Advanced Audio Coding (AAC) is an audio compression format used in live and VOD streams. Commonly supported versions include AAC-Low Complexity (AAC-LC), High Efficiency AAC (also called HE-AAC, AAC+, or aacPlus), and High Efficiency AACv2 (also called enhanced AAC+ or aacPlus v2).


Dolby Digital 5.1 Surround Sound (AC-3) and Dolby Digital Plus (also called Enhanced AC-3 or E-AC-3) are audio compression technologies that support up to six channels of sound.

adaptive bitrate streaming

Adaptive bitrate (ABR) streaming lets players switch between bitrate renditions of a stream to accommodate changing network conditions, CPU constraints, and display capabilities. Players control switching based on client-side factors, and Wowza Streaming Engine or Wowza Streaming Cloud perform the server-side switching transparently to viewers.

ABR streaming requires multiple versions, or renditions, of a file or live stream, each encoded at a different bitrate. Keyframes must be aligned across renditions in order for switching to work.
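The client-side selection logic can be sketched in a few lines. This is a hypothetical illustration, not Wowza player code; the rendition ladder and the 0.8 safety factor are assumptions chosen for the example:

```python
# Hypothetical sketch of client-side ABR rendition selection.
# Bitrates in kilobits per second, sorted high to low.
RENDITIONS_KBPS = [4500, 2500, 1200, 600]

def pick_rendition(measured_bandwidth_kbps):
    """Return the highest bitrate rendition that fits within ~80% of
    the measured bandwidth, falling back to the lowest rendition."""
    budget = measured_bandwidth_kbps * 0.8
    for bitrate in RENDITIONS_KBPS:
        if bitrate <= budget:
            return bitrate
    return RENDITIONS_KBPS[-1]

print(pick_rendition(3500))  # 2500
print(pick_rendition(400))   # 600
```

Real players refine this with buffer occupancy and switch-frequency heuristics, but the core decision is the same: never request a rendition whose bitrate exceeds what the network can sustain.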

aspect ratio

The width of the video frame divided by its height. The aspect ratio of a 640x480 frame is 1.33 or, when expressed as a proportion of width to height, 4:3 (the frame is 4 pixels wide for every 3 pixels of height).

Two aspect ratios are commonly used in video streaming: 4:3, which was the standard for traditional CRT television sets and is still used by DVDs; and 16:9, which is the newer, widescreen aspect ratio used with high-definition (HD) playback.
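The calculation is simple arithmetic; a minimal sketch:

```python
from math import gcd

def aspect_ratio(width, height):
    """Return the aspect ratio of a frame both as a decimal and as a
    width:height proportion reduced to lowest terms."""
    divisor = gcd(width, height)
    return round(width / height, 2), f"{width // divisor}:{height // divisor}"

print(aspect_ratio(640, 480))    # (1.33, '4:3')
print(aspect_ratio(1920, 1080))  # (1.78, '16:9')
```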



The measure of the throughput of streaming bits, typically measured in kilobits per second, or Kbps. Higher-quality streams (with better resolution and larger frame sizes) have a higher bitrate than lower-quality streams. Client bandwidth may limit the bitrate at which streams can play smoothly.
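Bitrate times duration gives the amount of data a stream moves, which is useful for estimating bandwidth and storage needs. A minimal sketch of the arithmetic:

```python
def stream_size_mb(bitrate_kbps, duration_seconds):
    """Approximate data moved by a stream: bitrate (kilobits per
    second) times duration, converted to megabytes
    (8 bits per byte, 1000 kilobytes per megabyte)."""
    kilobits = bitrate_kbps * duration_seconds
    return kilobits / 8 / 1000

# A 2,000 Kbps stream playing for 60 seconds moves about 15 MB.
print(stream_size_mb(2000, 60))  # 15.0
```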



A content delivery network (CDN) is a distribution system that scales the delivery of content such as streaming media across the Internet. CDNs replicate content in caching servers located in geographically dispersed data centers (edge servers), and deliver content to clients from the nearest physical server.


Common Encryption (CENC) is an open standard for applying AES encryption and key mapping methods to live and VOD streaming media content and providing header information that allows DRM systems to decrypt the content.


Wowza Streaming Engine packetizers create chunks of indexable, keyframe-aligned content for HLS, HDS, and MPEG-DASH live streams. Those protocols use the term segment to describe the same unit of content.


The Wowza Streaming Engine implementation of an Apple HLS media playlist is called a chunklist. It lists the chunks (or segments) for a single bitrate rendition referenced in the master playlist.


Common Media Application Format (CMAF) is an ISO standard wrapper for encoded multimedia content that can be delivered using both MPEG-DASH and HLS. When combined with the chunked-transfer encoding data-transfer mechanism in the HTTP 1.1 protocol, CMAF can be used to achieve low latency streaming. Support for CMAF in Wowza Streaming Engine is in development.


A codec (encoder-decoder) is a hardware or software algorithm that compresses audio and video into a file format that can be efficiently transmitted, and then decodes the file so it can be played.

Lossy codecs are typically used in streaming because they produce smaller files that require less bandwidth. Lossy video codecs include H.263, H.264, and H.265. Lossy audio codecs include Dolby AC3, AAC, and MP3.

Lossless codecs, in contrast, are mostly used to store or archive media, because they retain all of the information in the original stream. Lossless video codecs include FFV1, Motion JPEG 2000, and H.264 Lossless. Lossless audio codecs include MPEG-4 ALS and FLAC.


A wrapper for the components of a stream: the encoded audio and video, the subtitles, the metadata, and so on. Container formats, sometimes called container file formats, support varying contents and codecs. Examples of container formats include MP4, MPEG-TS, Ogg, and QuickTime.



Digital rights management (DRM) refers to technology that protects copyrighted content from unauthorized distribution. Wowza Streaming Engine supports DRM key management systems from BuyDRM, EZDRM, and Verimatrix. Wowza Streaming Cloud doesn't support DRM.


Digital video recording (DVR) is the capability to pause, rewind, and resume playing live streaming content. The nDVR feature in Wowza Streaming Engine brings DVR capabilities to live streaming.


embedded player

An application that you embed into an HTML page to play streaming media.


The process of converting digital audio and video files into a different format. Encoding is performed by a codec.



A software framework that includes libraries and command-line tools for encoding, decoding, transcoding, muxing, and streaming video content. FFmpeg supports many standard container and codec formats, including various flavors of MPEG and MP3, and it can produce HTTP-based output, among other formats.


Adobe Flash is a multimedia platform for producing and playing rich media content. Flash has been deprecated and will reach end of life by the end of 2020.

Adobe Flash Player is a browser plug-in that can play live and VOD streams. Third-party players may also support Flash video playback.

frame rate

The number of frames in 1 second of video. The slower the frame rate, the lower the video quality. High-definition devices typically capture video at 30 or 60 frames per second (fps). The standard for video streaming is 29.97 fps in the United States and Japan, and 25 fps in the rest of the world.

frame size

The dimensions of a frame of video, measured in pixels. Width comes first, followed by height: The frame size 640x480 is 640 pixels wide by 480 pixels high.

Encodings and bitrate renditions are often named for the frame height, such as 1080p or 1080i, where p refers to progressive (noninterlaced) scanning and i refers to interlaced scanning. The p does not mean pixels.



The capability in Wowza Streaming Cloud to limit or control where a live stream can be viewed by geographical region.


Group of pictures (GOP) refers to the successive frames in an encoded stream from one keyframe interval to the next. For example, if keyframes occur at frames 1 and 30, the GOP consists of frames 1 through 29.
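GOP length follows directly from the keyframe interval and the frame rate. A minimal sketch (the interval and frame rate values are illustrative):

```python
def gop_frames(keyframe_interval_seconds, fps):
    """Number of frames in each group of pictures: one keyframe
    followed by the delta frames up to the next keyframe."""
    return round(keyframe_interval_seconds * fps)

# A 2-second keyframe interval at 30 fps yields a 60-frame GOP.
print(gop_frames(2, 30))  # 60
```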



A compression standard that was developed for low-bitrate videoconferencing but also supports Flash and RTSP streaming. It has been largely superseded by H.264.


A compression standard that provides higher-quality video with lower bitrates than earlier formats such as H.263. The terms H.264, AVC (Advanced Video Coding), and MPEG-4 Part 10 are equivalent and interchangeable. H.264 is the current standard for video streaming and is the most commonly used format for encoding live streams and multimedia files for on-demand streaming. It's the default format used by Transcoder in Wowza Streaming Engine and by Wowza Streaming Cloud.


H.265, also called High Efficiency Video Coding (HEVC), is a compression standard that delivers much higher-resolution video at the same bitrate as H.264. Its data compression ratio is roughly double that of H.264. Support for H.265 is in preview release in Wowza Streaming Engine 4.1 and later.


High-definition (HD) video has a 16:9 aspect ratio and a frame size of 1280x720 pixels. It’s sometimes called 720p.


HTTP Dynamic Streaming (HDS) is an HTTP-based specification developed by Adobe Systems for delivering adaptive bitrate live and video-on-demand streams. HDS streams require a Flash-based player that's built using the Open Source Media Framework (OSMF).

HDS streams contain multimedia content called segments; Wowza Streaming Engine packetization code calls these segments chunks. HTTP clients use a manifest file to request the segments for playback. Wowza Streaming Engine breaks segments on keyframes, although not usually on every keyframe; however, each segment contains at least one keyframe.


HTTP Live Streaming (HLS) is an HTTP-based specification developed by Apple for delivering adaptive bitrate live and video-on-demand streams from servers to players. HLS streams can be played in the Safari browser; on Apple iOS devices; on a wide assortment of players, including Wowza Player; and on devices including set-top boxes and Android phones and tablets.

HLS streams contain keyframe-aligned groups of MPEG-TS or fragmented MP4 multimedia content called media segments, or as Wowza Streaming Engine packetization code calls them, chunks. HTTP clients request the server-generated manifest file, called a master playlist, and use it to request the segments in order for playback.

Apple has announced an extension to HLS that enables low latency streaming through the use of partial media segments, playlist delta updates, blocking playlist reload, and rendition reports. Apple Low-Latency HLS also uses HTTP/2 for delivery. Support for Apple Low-Latency HLS in Wowza Streaming Engine is in development.

HTTP streaming

HTTP streaming refers collectively to the various specifications that enable the progressive download of streaming media over the HTTP protocol. Designed for adaptive bitrate streaming, HTTP streaming specifications include HDS, HLS, Microsoft Smooth Streaming, and MPEG-DASH.



Short for intraframe, an I-frame is a frame of video that is a complete image and can be decoded independently of other frames.


Internet Protocol Television (IPTV) is the delivery of on-demand television content over an IP network.



A keyframe is a complete frame of video (an I-frame) that marks the start of a group of compressed video frames. Keyframe intervals, also called I-frame intervals, are also used as seek points in players. A seek point indicates where an adaptive bitrate rendition can be switched in a stream. Too many keyframes can result in poor (stuttered) playback.



The time it takes from the moment content is encoded by the source until it appears for a viewer in the player. Latency can range from tens of seconds to near real time; for live streaming, particularly applications such as gaming and auctions, which require interactivity, lower latency is better. Streaming protocols have inherently different amounts of latency, with some being much lower than others. However, some of the higher-latency protocols, such as Apple HLS, can be optimized to reduce latency. Ultra low latency refers to latency of under 3 seconds.

live streaming

The broadcasting and playing of video and/or audio content in real time over an Internet protocol. A live stream is always in progress and doesn't support pause, seek, or rewind functions. Live streams originate from an encoder or from a static file.

load balancing

The distribution of workloads across computers and networks. Load balancing is used to distribute playback client connections among edge servers to scale live and on-demand streaming.



Used in HTTP streaming, a manifest is a file that contains (indexes) the content of the stream and the information required to play it. When a player requests the stream, the server responds with the stream's manifest, which the HTTP client uses to order requests for the stream segments and to get metadata, bitrate, and codec information. MPEG-DASH manifest URLs end in .mpd, an acronym for media presentation description. In HLS, HDS, and SRT streaming, the manifest is called a playlist.


A re-streaming feature for Wowza Streaming Engine that allows the server to request a source stream from a video or audio source. The media server polls to see if a stream is available instead of the source initiating the connection with the server. MediaCaster enables re-streaming of IP camera streams (RTSP/RTP streams), SHOUTcast/Icecast streams, streaming output from native RTP or MPEG-TS encoders, Secure reliable transport (SRT) streams, and RTMP streams from another Wowza Streaming Engine server (live repeater streams). MediaCaster uses .stream files to configure a URI for the source stream and set properties that control the connection.


Information about a stream that accompanies the stream as it's encoded and decoded. Examples of a stream's metadata are its codec and its resolution. Players use metadata to display the stream properly.

Timed metadata is metadata that's accompanied by a timecode. When the stream is encoded, the timed metadata is synchronized to audio and video keyframes. Timed metadata flows through a server to the client where, during playback, the timecode serves as a cue point to invoke an action on the data. Two common timed metadata formats used in streaming and supported by Wowza technologies are Action Message Format (AMF), a binary format developed by Adobe Systems for exchanging messages between servers, and ID3, a tagging specification that carries timed metadata in Apple HLS streams.

Microsoft Smooth Streaming

An HTTP-based specification developed by Microsoft for delivering adaptive-bitrate live or video-on-demand streams from servers to clients for playback using Microsoft Silverlight technology.


Moving Picture Experts Group (MPEG) is the ISO working group that defines standards for the coded representation of digital audio, video, 3D graphics, and other data. Wowza Streaming Engine and other Wowza technologies support many MPEG standards, including H.264 (MPEG-4 Part 10), MPEG-DASH, and MPEG-TS.


Dynamic Adaptive Streaming over HTTP (DASH), or MPEG-DASH, is an ISO standard for HTTP-based adaptive bitrate streaming. Like the proprietary HTTP-based technologies Apple HLS, Adobe HDS, and Microsoft Smooth Streaming, MPEG-DASH delivers sequenced segments of audio and video content from a server to a player. In Wowza Streaming Engine packetization, MPEG-DASH segments are called chunks.

MPEG-DASH streams include a list of the available segment URLs in a manifest, which describes segment information and media characteristics. Clients request segments sequentially based on network conditions, device capabilities, and other factors to enable uninterrupted playback of the adaptive bitrate stream.


MPEG Transport Stream (MPEG-TS) is a container format used for delivering and storing streams. Defined in the MPEG-2 standard, MPEG-TS is designed for sending data across a network where there’s a potential for data loss.

Wowza Streaming Engine can ingest MPEG-TS live streams as well as distribute them using a stream target. Wowza Streaming Engine can also write .ts files to support playback of live Apple HLS streams and CDN caching infrastructures that support Wowza HTTP Origin features. These .ts files are kept in memory, not written to disk, so they're not easily co-opted for other purposes.


The simultaneous sending of one stream from a server to multiple destinations in a single transmission. With multicast streaming, a single IP router creates optimal one-to-many distribution paths to a multicast destination address. The multicast stream is published to a multicast address in a virtual LAN. A live encoder publishes one stream to the multicast address and playback clients stream from the multicast address, not from the source. The virtual LAN replicates multicast streams for each playback client. Multicast streaming isn't supported over the public Internet. Wowza Streaming Engine supports re-streaming from a multicast address, and its Stream Targets feature supports multicast publishing.



Over-the-top (OTT) refers to the delivery of video and multimedia content to viewers over the public Internet, bypassing the telecommunications and broadcast television platforms that traditionally controlled the distribution of such content.


passthrough streaming

The delivery of a single encoded stream from a server directly to a destination, such as a hosted webpage or a stream target. Passthrough streams are not transcoded but they may be transmuxed. Passthrough streaming is useful for such workflows as IP camera live feeds or routing streams to specific destinations.


Used by the HDS, HLS, and SRT streaming protocols, a playlist is an index of the segments of content in the stream.

HLS streaming uses two types of playlists: master and media.

The master playlist contains a list of the URLs to variant media playlists (one for each bitrate rendition) in the HLS stream. It also contains descriptive tags to control the playback behavior of the stream.

A media playlist is an individual playlist for each bitrate rendition, or variant, inside the HLS master playlist. It contains a list of URLs pointing to the bitrate rendition's media segments.

The Wowza Streaming Engine Apple HLS packetizer, cupertinostreaming, calls an HLS media playlist a chunklist.
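A master playlist is a short text file in M3U8 format. The following is an illustrative example only; the rendition paths, bandwidth values, and codec strings are hypothetical:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720,CODECS="avc1.4d401f,mp4a.40.2"
720p/chunklist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=842x480,CODECS="avc1.4d401e,mp4a.40.2"
480p/chunklist.m3u8
```

A player reads the master playlist, picks the rendition that fits its bandwidth, and then requests that rendition's media playlist (chunklist) to get the URLs of the media segments.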


A standardized method for delivering audio and video over a network or the internet. See HDS, HLS, MPEG-DASH, WebRTC, SRT, RTMP, RTSP/RTP, and WOWZ.

push publishing

The capability in Wowza Streaming Engine to send a live stream to an external destination, such as a CDN, for delivery to viewers. In Wowza Streaming Engine Manager, push publishing is exposed through the Stream Targets feature. Push publishing can also be managed by adding destination profiles to the server's PushPublishing.xml map file.



The act of a server broadcasting an encoded video file as though it were a live stream, for example, a re-broadcast of a concert or other event. With the MediaCaster feature of Wowza Streaming Engine, re-streaming also allows the server to request a live source stream from a video or audio source. The media server polls to see if a stream is available instead of the source initiating the connection with the server. MediaCaster enables re-streaming of IP camera streams (RTSP/RTP streams), SHOUTcast/Icecast streams, streaming output from native RTP or MPEG-TS encoders, Secure reliable transport (SRT) streams, and RTMP streams from another Wowza Streaming Engine server (live repeater streams).


A version of an adaptive bitrate stream, including the stream's video and audio components, at a specific bitrate. Renditions are also sometimes called bitrate renditions, output renditions, or transcoded renditions.


The total number of pixels in a frame of video: the width times the height of (or number of lines in) the frame. The resolution of a 640x480 frame, for example, is 307,200 pixels. Higher resolutions mean sharper images.

4K (8.3 megapixels) refers to the resolution of 3840x2160 video. 4K is one of two ultra high resolutions used in consumer television. The other is 8K, which is 7680x4320 video, or 33.2 megapixels.
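The figures above are just width times height; a minimal sketch of the arithmetic:

```python
def resolution_megapixels(width, height):
    """Total pixels in a frame, expressed in megapixels."""
    return round(width * height / 1_000_000, 1)

print(640 * 480)                          # 307200 pixels
print(resolution_megapixels(3840, 2160))  # 8.3 (4K)
print(resolution_megapixels(7680, 4320))  # 33.2 (8K)
```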


Real Time Messaging Protocol (RTMP) is a low latency, TCP-based specification from Adobe Systems for live and on-demand streaming to Adobe Flash platform technologies, including Adobe Flash Player. RTMP, which uses TCP port 1935, provides persistent connections between the streaming server and the player, which means audio, video, and data can move bidirectionally in RTMP connections.

RTMP comprises several subtypes: RTMPE, which uses a security mechanism for encryption; RTMPS, which runs over a secure TLS/SSL connection; and RTMPT, a tunneling variant that allows packets to traverse firewalls.


Real Time Streaming Protocol (RTSP) is a network protocol that controls streaming media servers. It establishes and maintains sessions between a source or player and the streaming server. RTSP uses RTP and RTCP for the delivery of the media.

Real-time Transport Protocol (RTP) is a standard packet format for delivering audio and video over IP networks.

Real-time Transport Control Protocol (RTCP) is the companion control channel to RTP; it carries delivery statistics and synchronization information for an RTP stream.

RTSP maintains session state by using control sequences, which are commands exchanged between client and server with identifiers that track concurrent sessions. Most RTSP control sequences are sent by the client to the server, but some are initiated by the server and sent to the client. The default RTSP transport layer port number is 554.
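As a sketch, an RTSP control sequence is a plain-text request/response exchange; the URL, ports, and session ID below are illustrative:

```
SETUP rtsp://example.com/live/myStream/trackID=1 RTSP/1.0
CSeq: 3
Transport: RTP/AVP;unicast;client_port=8000-8001

RTSP/1.0 200 OK
CSeq: 3
Session: 12345678
Transport: RTP/AVP;unicast;client_port=8000-8001;server_port=9000-9001
```

The CSeq header pairs each request with its response, and the Session identifier lets the server track concurrent sessions.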



In HTTP-based streaming, a segment is a small, keyframe-aligned and indexable bit of a stream, usually between 2 and 10 seconds long, that’s transferred from a server to a client. Stream segments are reconstructed sequentially on playback. In adaptive bitrate streaming, switching occurs at segment breaks.

HLS streaming uses the more specific term media segment. Wowza Streaming Engine HLS, HDS, and MPEG-DASH live stream packetizers call segments chunks.


The simultaneous broadcasting of one program across two channels, traditionally radio and television. Wowza ClearCaster supports the simulcasting of a live stream to Facebook Live and to a Wowza Streaming Cloud destination such as a hosted webpage, a custom target, or a Wowza CDN edge resource.


Synchronized Multimedia Integration Language (SMIL) files are XML files used by Wowza Streaming Engine for adaptive bitrate streaming with HDS, HLS, MPEG-DASH, or Smooth Streaming playback. SMIL files can use transcoded or transrated live streams created by Transcoder or encoded live or VOD streams from an external device as their source.


Secure Reliable Transport (SRT) is an open source video transport protocol developed by the SRT Alliance, which was founded by Haivision and Wowza Media Systems. SRT is designed to deliver secure, high-quality, low latency live streams over unpredictable network conditions.

Wowza Streaming Engine 4.7.3 and later supports SRT ingest and stream target delivery on Linux and Windows server installations. Wowza Streaming Cloud also supports SRT ingest.


The process of sending compressed audio and video files or live broadcasts across a network without downloading the content or saving it locally. Clients play the stream as it's received or from a client-side buffer.



The conversion of encoded audio or video from one codec to another. Transcoding allows content that was originally encoded in one format to be played by a client in another format. In addition, streams that are transcoded in Wowza Streaming Engine, Wowza Streaming Cloud, and Wowza ClearCaster all have aligned keyframes, which enables adaptive bitrate streaming.


Transcode-multiplexing (transmuxing) is the process of changing the format of an audio or video file while keeping some or all of the stream information from the original. In other words, transmuxing converts to a different container format without changing the file contents.

By default, Wowza Streaming Engine transmuxes live and VOD streams to support a variety of playback types; the Transcoder feature is required for streams to be transrated or transcoded.


The process of changing an encoded video or audio file from one bitrate to another without changing the file format. Wowza Streaming Engine and Wowza Streaming Cloud both perform transrating as well as transcoding as needed to create multiple stream renditions for adaptive bitrate streaming.

trick play

The capability to fast-forward and rewind on-demand streams. Wowza Streaming Engine supports trick play for Flash RTMP clients and for .flv files that contain Sorenson Spark or VP6 video. Wowza Streaming Engine doesn't support trick play for H.264 video or in other client types.



User Datagram Protocol (UDP) is a connectionless protocol that, unlike TCP, doesn't perform error checking. That makes UDP more efficient but also more prone to network issues, packet loss, and packets arriving out of order. Wowza Streaming Engine can publish and re-stream MPEG-TS and multicast streams over UDP.

unicast streaming

A type of streaming in which each playback client is served its own stream. With unicast streaming, server and network utilization increase linearly as playback load increases, so it's critical to be aware of the stream bitrate, the peak concurrent load, and the server's network capacity.

Wowza Streaming Engine supports unicast streaming for live streams using an origin/edge repeater. For VOD, Wowza Streaming Engine supports unicast delivery using Media Cache.


video-on-demand streaming

Video-on-demand (VOD) streaming uses a multimedia file as the source. It's always available, always plays from the beginning (or other specified start point), and has a fixed duration. By comparison, a live stream is always played in progress, is only available while it's live, and has an unknown duration. VOD streaming is synonymous with on-demand streaming.


An open and royalty-free block-based transform compression standard similar to H.264. It's used primarily with WebRTC.


An open and royalty-free compression standard optimized for high-resolution video, with support for lossless compression. Unlike HEVC, which offers similar compression benefits, VP9 is widely supported by modern web browsers.



Web Real-Time Communication (WebRTC) is an open framework that enables real-time communication of audio, video, and data in web browsers and apps. WebRTC is designed for peer-to-peer connections but includes fallbacks in case direct connections fail.


An IETF standard protocol that allows an interactive communication session between a browser and a web server over a single TCP connection. In streaming, WebSocket connections can be used for text, chat, out-of-band metadata, and control data. Wowza Streaming Engine supports WebSocket connections through an HTTP provider. Wowza Streaming Cloud combines WebSocket with WOWZ to achieve ultra low latency streaming.


A TCP-based messaging protocol developed by Wowza and built in to Wowza Streaming Engine for server-to-server communications. WOWZ is also supported by Wowza Streaming Cloud ultra low latency stream targets, the Wowza GoCoder SDK, the Wowza GoCoder app, and Wowza Player. WOWZ supports SSL, so a WOWZ URL may begin with wowz:// or wowzs://.