I noticed one interesting thing this weekend: the choice of Java runtime is *very* important! The only question left is which Java to choose - Sun, IBM, OpenJDK...?
Here is what I found:
All 4 servers run:
- Wowza 3.1 on Ubuntu Linux, with default Java settings
- 1 ORIGIN server and 3 EDGE servers
Livestreaming with 6000 concurrent clients spread across the three EDGE servers.
The stream worked flawlessly on the EDGE1 server, which handled 4000 users all by itself! Server load and CPU usage stayed stable for the entire stream. The configuration of this server is:
x86_64, 16 cores at 2.93 GHz and 12 GB of RAM, Java version:
java version "1.6.0"
Java(TM) SE Runtime Environment (build pxa6460sr10fp1-20120321_01(SR10 FP1))
IBM J9 VM (build 2.4, JRE 1.6.0 IBM J9 2.4 Linux amd64-64 jvmxa6460sr10fp1-20120202_101568 (JIT enabled, AOT enabled)
J9VM - 20120202_101568
JIT - r9_20111107_21307ifx1
GC - 20120202_AA)
JCL - 20120320_01
The other two servers, EDGE2 and EDGE3, performed poorly (Java started throwing errors once an edge reached 1000 concurrent clients, and the stream died for a minute or so), even though they are much stronger machines running a different Java:
x86_64, 24 cores at 2.00 GHz and 16 GB of RAM, Java version:
java version "1.6.0_31"
Java(TM) SE Runtime Environment (build 1.6.0_31-b04)
Java HotSpot(TM) 64-Bit Server VM (build 20.6-b01, mixed mode)
So from my example I conclude that the choice of Java matters - what else could it be? I changed the Java on EDGE2 and EDGE3 to the IBM Java that is running on EDGE1, and I really hope this solves the problem.
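For anyone wanting to try the same swap, here is roughly how I pointed the servers at the other JVM on Ubuntu (a sketch, not a definitive recipe; the IBM JDK install path below is an assumption - adjust it to wherever your JDK actually lives):

```shell
# Assumed install path for the IBM JDK - change this to match your system.
IBM_JAVA=/opt/ibm/java-x86_64-60

# Register the IBM java binary with Ubuntu's alternatives system
# and make it the system default (the priority value 100 is arbitrary).
sudo update-alternatives --install /usr/bin/java java "$IBM_JAVA/jre/bin/java" 100
sudo update-alternatives --set java "$IBM_JAVA/jre/bin/java"

# Confirm which JVM now answers as the default.
java -version
```

Note that Wowza's startup scripts may resolve the JVM through JAVA_HOME rather than the system default, so if the server still launches the old JVM after this, check the environment those scripts set as well.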
And my question is: how do you know which Java to choose? I assumed the latest version would be best, but I didn't know there were such differences between Sun, IBM, and OpenJDK... What is the best practice here? I want to squeeze the maximum out of those 3 EDGE servers because they are VERY powerful!