
Thread: Live Record 1000 simultaneous streams

  1. #1

    Question Live Record 1000 simultaneous streams

    We have a scenario in which we would have to record up to 1000 simultaneous streams with failover capabilities.
    Since there is no way any single HD will suffice, and we want to be able to scale horizontally, we are trying to come up with a viable, highly available solution.

    Here is the main concern:
    - record up to 1000 live streams simultaneously.
    - have hot spares available in case server crashes.
    - in case a server crashes while recording, another server should be able to continue the stream and append to the same file.
    - the streaming of these is not a concern as we will be leveraging a CDN (limelight / edgecast) for this.

    Here are the possible architectures we have come up with:

    Live Repeater Origin -> multiple Live Repeater Edges -> all edges write to a SAN / NAS
    - we would mount a folder from the NAS / SAN into the wowza content folder via sshfs and write directly to it.
    We would use startupstreams.xml on the edges to auto connect and record. But we need this completely automated.
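    As a sketch of what that auto-connect config could look like (application name "liveedge" and stream name "stream1" are placeholders; one StartupStream entry would be needed per stream), the conf/StartUpStreams.xml format is:

```xml
<Root>
	<StartupStreams>
		<StartupStream>
			<!-- application/instance on this edge that should pull the stream -->
			<Application>liveedge/_definst_</Application>
			<!-- pull via the liverepeater MediaCaster from the configured origin -->
			<MediaCasterType>liverepeater</MediaCasterType>
			<StreamName>stream1</StreamName>
		</StartupStream>
	</StartupStreams>
</Root>
```

    Note that this file is read at server startup, so fully dynamic automation (streams coming and going at runtime) would still need custom code or an API call rather than a static file.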

    Hardware Load Balancer (ip bound at firewall) -> multiple Wowza Servers -> all write to a SAN / NAS.
    If one server dies, there are always other servers to easily fail over.

    Are there better ways to do this? What is the best way to automate this?
    User1 -> livestream stream1 -> need to record
    User2 -> livestream stream2 -> need to record
    User3 -> livestream stream3 -> need to record
    User 1000 -> livestream stream 1000 -> need to record

    Thank you

  2. #2
    Join Date
    Dec 2007



    The most live streams I have heard of on one server is about 100. RAM and CPU are the limiting factors, not bandwidth as with playback clients. We recommend a late-model dual quad-core CPU, 8-16 GB RAM, 64-bit OS/Java, a 1-4 Gbps NIC, and RAID 0 with as many disks as you can get (fast SSD drives are good).

    To record many streams at once you should have RAID 0 or 1+0. Writing directly to NAS/SAN might work, but the storage would have to be close; otherwise, consider moving the recordings to the NAS/SAN after they are done instead.

    I'm not sure that the liverepeater system will be of much help. It is useful for scaling one or a few live streams to many clients. Will there be many clients? If the liverepeater is useful, you should record on the origin. You can use the StreamType "liverepeater-origin-record". If there is some good reason to record on a liverepeater edge, it can be done with custom code.
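    Enabling origin-side recording is a one-line StreamType change in the origin application's config. A sketch of the relevant section (the StorageDir shown is the stock Wowza content path; other children of Streams are omitted):

```xml
<!-- conf/[origin-app]/Application.xml -->
<Streams>
	<StreamType>liverepeater-origin-record</StreamType>
	<StorageDir>${com.wowza.wms.context.VHostConfigHome}/content</StorageDir>
</Streams>
```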

    If a server crashes, and you are recording to NAS/SAN, and the file is not corrupt (which is very possible in this scenario), and you are able to start another server and write to the same NAS/SAN location and stream name, then it should append. But that will be difficult to accomplish; there are a lot of "ifs".


  3. #3


    Hi Richard,

    Yes, we are planning on using dual quad-core CPUs with 16 GB of RAM as origin servers.
    The choice between writing locally to the filesystem on RAID 1+0 vs. the SAN is up in the air.
    I have tested the liverepeater and it works, but the idea is to distribute the incoming streams for recording purposes, not the outgoing ones (since we can use a CDN to offset that). Yes, there are about 100k users concurrently watching all these streams, but since we plan to use a CDN, we are not worried about that.

    I can record on the origin, but not on the edge, which is what I'm trying to get working right now. The example works, but I want it automated so that the entire stream is recorded start to finish. Do you think StartupStreams.xml would help here?
    Is there a stream type like liverepeater-origin-record for the edge, e.g. liverepeater-edge-record? Or can we create it? I guess the issue is here:

    Since CPU, RAM, and HD write speed are the limitations, we would need multiple origins, right?
    What if we have a scenario like this:
    hardware load balancer -> 3 liverepeater origins -> 20 live repeater edges (this is where we will record)
    What if each stream is recorded twice on 2 different edges.

    If the liverepeater is not that useful for distributing incoming streams (to avoid hitting the CPU/RAM caps on each machine), what other modules would work? We can make changes to existing modules to do this.

    We have moving the file after recording working, but I'm wondering if writing/appending to the SAN is better. But if there are a lot of ifs, as you said, then unless we record each incoming stream multiple times there is no failover-type capability, right? Are there any other ways?

    What kind of architecture would you recommend for this? We can use multiple late-model CPU machines with 16 GB of RAM each.
    Last edited by arpan_synapse; 07-18-2012 at 12:39 PM.

  4. #4
    Join Date
    Dec 2007


    You might need 10 or more origin servers with 100 incoming streams each. I suppose you could use liverepeater edge servers to re-stream and record.

    Obviously, this is going to require a lot of load testing, and network engineering that is not my expertise.


  5. #5


    Thanks Richard,

    Could you point me in the right direction of recording entire streams automatically on the liverepeater edge, the same way liverepeater-origin-record works?

  6. #6
    Join Date
    Dec 2007


    There is no built-in equivalent on the edge. You could use the LiveStreamRecord API to start recording from IMediaStreamActionNotify3.onPublish, which I think runs when the edge starts re-streaming from an origin.

    Take a look at the module example in the LiveStreamRecord package to get started with its API.
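    A sketch of what that edge-side module could look like. This is illustrative only: the module name ModuleEdgeAutoRecord is made up, and the recorder calls follow the pattern of the Wowza 3 LiveStreamRecord AddOn example, so the exact class names and signatures should be verified against the module example bundled with the package:

```java
// Sketch only -- verify recorder API against the LiveStreamRecord package example.
public class ModuleEdgeAutoRecord extends ModuleBase
{
	class StreamListener implements IMediaStreamActionNotify3
	{
		public void onPublish(IMediaStream stream, String streamName, boolean isRecord, boolean isAppend)
		{
			// Fires when the edge starts re-streaming from the origin:
			// start an MP4 recorder on the re-streamed stream, appending so a
			// restarted edge can continue the same file on shared storage.
			ILiveStreamRecord recorder = new LiveStreamRecorderMP4();
			IApplicationInstance appInstance = stream.getStreams().getAppInstance();
			String outputPath = appInstance.getStreamStoragePath() + "/" + streamName + ".mp4";
			recorder.startRecording(stream, outputPath, true); // append = true
		}
		// onUnPublish and the other IMediaStreamActionNotify3 callbacks would
		// stop the recorder; omitted here for brevity.
	}

	public void onStreamCreate(IMediaStream stream)
	{
		// Attach the listener to every stream created in this application.
		stream.addClientListener(new StreamListener());
	}
}
```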


  7. #7


    A reply from support:

    - To load balance incoming streams, I would look at building a software load balancer as part of Wowza. The connection arrives at a fixed Wowza server, and custom code then redirects the incoming stream to the least-loaded ingest server. You would need to check that whichever encoder you are using supports RTMP redirects. This gives you more control for load balancing incoming streams, and it can be based on incoming rather than overall connections. If a stream gets disconnected and the redirect works, then on reconnecting it should be redirected to a working server and come back up. This would need some testing.
    - With the approach above, you would need some further custom work to push the stream, or have it pulled, to your edge servers dynamically when the stream is published to a specific server. Publishing a stream on an edge server with liverepeater-edge is very similar to an encoder connecting, so again you will be limited to about 100 incoming streams per edge server.
    - Unless the encoder is going to publish to multiple servers, you will always have a single point of failure (the encoder and the entry server); however, this can be reduced a little.
    - Some possible architecture choices:
        - An incoming Wowza server that receives the encoder only. This reduces the risk of streams being unavailable, as the server only manages incoming streams plus one connection each from the edge and recording servers.
        - A recording server that pulls the stream from the encoder server. It has no other job but to record. You can build a sufficiently large server so it can record the number of streams required, and you should be able to scale this relatively simply. You can use the Live Record module rather than the live-record stream type; a tutorial can be found here.
        - You have already described that you will have an edge layer.
    - Monitoring of bandwidth, CPU, and memory will be critical to make sure any expected capacity numbers (i.e. 100 streams per incoming server) are correct.
    - Across your edge servers you will need a load balancing solution so that the correct incoming streams can be mapped to the correct outgoing push-publishing stream into LimeLight.
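    The redirect-on-connect idea from the first bullet could be sketched as a small Wowza module. This is an assumption-laden sketch: it presumes the encoder honors RTMP redirects, and pickLeastLoadedHost() is a hypothetical helper (not a Wowza API) that would query whatever service the servers report their load to:

```java
// Sketch only -- pickLeastLoadedHost() and "ingest2.example.com" are hypothetical.
public class ModuleIngestRedirect extends ModuleBase
{
	public void onConnect(IClient client, RequestFunction function, AMFDataList params)
	{
		// Send the encoder an RTMP redirect to the least-loaded ingest server;
		// "live" stands in for the real application name.
		String host = pickLeastLoadedHost();
		client.redirectConnection("rtmp://" + host + "/live");
	}

	private String pickLeastLoadedHost()
	{
		// Placeholder: in practice, query the endpoint the load reports go to.
		return "ingest2.example.com";
	}
}
```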


  8. #8


    We are writing a load balancer that hits an application server instead of Wowza, because the incoming stream is run through a .NET application. It would be easier for them to integrate an RTMP redirect via an API call. Here is the code we have so far; is this safe? I am worried about the Thread.sleep. All this does is report current connections to a web server every 10 seconds.

    import java.io.OutputStreamWriter;
    import java.net.URL;
    import java.net.URLConnection;

    public class LoadMonitor extends ModuleBase implements IServerNotify2 {
    	private String loadMonitorURL = "";
    	private volatile boolean running = false;

    	public void onServerInit(final IServer server) {
    		WMSProperties props = server.getProperties();
    		this.loadMonitorURL = props.getPropertyStr("LoadMonitorURL");
    		if (this.loadMonitorURL == null)
    			return;
    		running = true;
    		Thread reporter = new Thread(new Runnable() {
    			public void run() {
    				while (running) {
    					// report load
    					try {
    						getLogger().info("Synapse -- Report Load to " + loadMonitorURL);
    						long connections = server.getConnectionCounter().getTotal();
    						String data = "count=" + connections;
    						URLConnection conn = new URL(loadMonitorURL).openConnection();
    						conn.setDoOutput(true);
    						OutputStreamWriter wr = new OutputStreamWriter(conn.getOutputStream());
    						wr.write(data);
    						wr.close();
    						conn.getInputStream().close(); // complete the request
    					} catch (Exception e) {
    						getLogger().error("Synapse -- Report Load failed: " + e.getMessage());
    					}
    					// sleep between reports
    					try {
    						Thread.sleep(10000);
    					} catch (InterruptedException e) {
    						return;
    					}
    				}
    			}
    		}, "load-monitor");
    		reporter.setDaemon(true); // never blocks server shutdown
    		reporter.start();
    	}

    	public void onServerShutdownStart(IServer server) {
    		running = false; // stop the reporting loop
    	}

    	public void onServerShutdownComplete(IServer server) { }

    	public void onServerConfigLoaded(IServer server) { }

    	public void onServerCreate(IServer server) { }
    }
    Last edited by arpan_synapse; 07-19-2012 at 09:40 AM.
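    On the Thread.sleep worry: the hand-rolled sleep loop can be avoided entirely with a scheduled executor. A minimal, Wowza-independent sketch (the class name LoadReporter, the buildPayload helper, and the LongSupplier hook for the connection count are illustrative names, not Wowza API):

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.LongSupplier;

public class LoadReporter {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor(r -> {
                Thread t = new Thread(r, "load-reporter");
                t.setDaemon(true); // never blocks JVM/server shutdown
                return t;
            });
    private final String reportUrl;
    private final LongSupplier connectionCount; // hook for the server's connection counter

    public LoadReporter(String reportUrl, LongSupplier connectionCount) {
        this.reportUrl = reportUrl;
        this.connectionCount = connectionCount;
    }

    // The body POSTed to the monitoring URL, e.g. "count=42".
    public static String buildPayload(long connections) {
        return "count=" + connections;
    }

    public void start(long periodSeconds) {
        // Fixed-rate schedule replaces the hand-rolled while/sleep loop.
        scheduler.scheduleAtFixedRate(this::reportOnce, 0, periodSeconds, TimeUnit.SECONDS);
    }

    void reportOnce() {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(reportUrl).openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            byte[] body = buildPayload(connectionCount.getAsLong()).getBytes(StandardCharsets.UTF_8);
            try (OutputStream out = conn.getOutputStream()) {
                out.write(body);
            }
            conn.getResponseCode(); // drain the response so the request completes
            conn.disconnect();
        } catch (Exception e) {
            // swallow and retry on the next tick; one failed report must not kill the schedule
        }
    }

    public void stop() {
        scheduler.shutdownNow();
    }
}
```

    In the module, onServerInit would construct LoadReporter with a supplier reading server.getConnectionCounter().getTotal() and call start(10); onServerShutdownStart calls stop().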

  9. #9


    Good post
    Last edited by MARY2006; 12-03-2012 at 11:19 AM.

  10. #10


    Quote Originally Posted by rrlanham View Post
    The most live streams I have heard of on one server is about 100. RAM and CPU are the limiting factors, not bandwidth as with playback clients.

    I know this is an older topic, but is 100 incoming live streams still around the max you can get on one server? We are getting around that currently with our setups, but I wanted to check to see if anything has changed with newer Wowza and Java versions and new hardware options.



