Forgive my ignorance, as this is all pretty new to me. We currently use an analog camera connected to a VBrick that sends an RTSP stream to a Darwin server, which serves up a .mov stream that we embed in a web page. That .mov file points a player like QuickTime at the RTSP stream.
We are refreshing the solution end to end: digital camera to H.264 encoder to Wowza. That part all makes sense to me. The part I'm confused about is "what then?" Do I need a "player" like JW Player to embed the video in a website? Does the website need to decide what format the device supports, or does the player do that (e.g., HLS to iOS and Flash to Windows)?
I know this part is probably better asked on a JW Player forum, but is it licensed per server doing the embedding or per streaming (aka Wowza) server?
You are correct: you will need to implement a player technology, like JW Player, that embeds in your website. You can implement JavaScript that sniffs the user agent and delivers the appropriate stream for that client. This is helpful when deciding whether a client is a web browser using Flash, an iPhone, or an Android device.