Add Live Video Streaming to a Mobile App (Webinar)

August 23, 2018

Today, most people have a camera in their pocket at all times, which is capable of both capturing and playing back high-quality live video: their smartphone. As a result, mobile apps with live video streaming are becoming more popular for a wide variety of use cases.

The Wowza GoCoder SDK can help you quickly and easily develop a mobile app that includes live video streaming. In this webinar recording, we’ll cover:

  • Developing your own app for live streaming at scale across the globe.
  • How to create a playback mobile app for viewing live broadcasts.
  • Considerations for two-way metadata communication within an app.
  • Tips and tricks from Wowza engineers on making the best app possible.

Watch the video below, and download the slides here.

Full Video Transcript:

Russ Fustino:

Okay, I’d like to welcome everybody to our Wowza GoCoder SDK session here, Add Live Video Streaming to a Mobile App. Go ahead, Benji, you can go forward one slide.

Everybody follow our Wowza Dev Twitter handle. My name is Russ Fustino, Developer Evangelist at Wowza. Today we have Benji Brown. Good morning Vietnam, Benji, how you doing?

Benji Brown:

Hey, buddy. I’m good. Thanks, everybody, for showing up for this, and hopefully we’ll be able to present some really great ways of implementing GoCoder into your apps.

Russ Fustino:

Cool, so Benji’s our Lead Mobile Engineer, and he is currently out in Vietnam on a temporary stay out there. Barry is up next, Barry Owen, VP of Engineering. Barry, go ahead and say hello.

Barry Owen:

Hey folks, thanks for joining us today. As Benji said, we hope we can give you a lot of information, so fasten your seat belts, there’s going to be a lot of information pretty quickly here. Please ask questions as they come up, and we will do our best to get them answered either during the webinar, or certainly follow up after if we can’t get to them all.

Russ Fustino:

Okay, we’re going to do a couple of questions here, right before we get going, to kind of get an experience level set for everybody on board here, so we can tailor our presentation accordingly. I’ve got two sets of two-part questions, so the first question here is, “Are you a software developer?” If you can, please participate, and select one of the following: yes, I am a software developer, or no, I am not. We’ll give you a second or two to finish that up … and we’ll share the results with you.

Okay, looks like everybody’s voted. That’s great, thanks for your participation. Looks like we got about a 70/30 percent split, Barry and Benji, so about 70% devs here, so that’s good. Let’s do another question here. Those that are devs, or the 70% that answered yes, can you please share what kind of development you primarily do? Do you do App Dev, where you’re making apps, like Android and Windows Store apps? Or Web development, or API development, or Backend, or primarily something else? I know a lot of you overlap, so pick your primary type of development.

Oh, we got some good answers coming in here. This is good, you’re going to like this. 70% are doing App Dev, 7% Web, and 21% API, so you folks are in the right place, that’s for sure. We got a lot of good API information here, and SDK information for building apps today, so that’s perfect.

All right, let’s go ahead and go on to the next question. Are you currently streaming videos in your apps and web sites? Everybody please participate in this one. Yes I am; maybe in the future here, one to six months out; seven months to a year; or a year or more out.

Getting pretty good. Oh, this is great, a hundred percent voted, that’s great. Thank you very much. The numbers are in: 45% are currently using video streaming in their apps, and then 25% within the next six months, 10% after that. Let’s go ahead and follow this up with one more question. For those that answered that you are using video streaming now: are you currently using Wowza products to do that?

All right, so it looks like we got 70% yes and 30% will be new, so that’s our polling today, and Barry, I’m going to turn it over to you now to continue on with the deck. Go ahead, Benji, go forward one, and then we’ll go ahead and get going here. Thanks everybody.

Barry Owen:

Cool. As those of you building apps know, building apps is all about engagement, and how to increase the engagement with your applications and drive your business. One of the ways to do that is adding relevant new features, such as live streaming. There are certainly obstacles to overcome in adding streaming to your app. One of the big goals of the SDK is to help make that easier, and give you a high-quality experience with proven components, without having you reinvent the wheel.

As you all know, being app developers, mobile devices are everywhere, and it’s a way to both create and consume content, anywhere, anytime. It’s becoming more and more prevalent for people to have mobile only workflows, and less and less tied to a desktop.

One of the things we also strive to do in the SDK is to provide the latest in technology: a very solid broadcast that endeavors to be as good as it can over varying network conditions. Then, on the Playback side, a high-quality stream with minimal latency, lower than you might get from HLS or some other stream that you might traditionally send out to the native players you get on iOS or Android.

Let’s go ahead and move one forward there, Benji. Wowza has developed an end-to-end ecosystem here. Whether it’s professionally created content or user-generated content, you can do both, and you can integrate easily into your mobile experience. Traditionally, we’ve had the Broadcast side for a few years now. More recently, we’ve added the ability to do playback. Playback works especially well with our low-latency cloud service, and Broadcast works with any of our Engine or cloud products.
We’ve added things to help you make a richer experience with your application, in that not only can you do Broadcast and Playback end to end, you can leverage those, whether it’s creating content from a bunch of different people and figuring out how to aggregate that, or professionally generated content that is viewed by many, many people.

We also have things like bi-directional data event support, so you can have events coming in from the encoder, from the broadcast, and you can also display those events, or react to those events, on both the server side, if you’re using Engine, and also on the device itself. We’ll pass those events through and allow you direct [inaudible 00:07:35]. We’ll talk a lot more about how that works as we move forward. That’s one of our newest releases, the bi-directional event support, as I mentioned; it’s in the current release, which went out a couple of months ago now. Then, integrated low latency playback, which we think is a really interesting service for many, many use cases; it’s tuned to work in conjunction with our Wowza Cloud Ultra Low Latency service, which was also launched a couple of months back.

I’m sure you guys have a bunch of varying use cases, and I’d love to hear more about them. Here are some of the ones we currently have in our portfolio of people who have been using the product. People are doing live crowdsourcing of news and weather. That may go into a broadcast facility, and they can then leverage that to actually go on the air with some of it if they choose. Lots of, what we’ll term, vanity broadcasting. We have folks like the Kardashians, and people like that, using this as part of their app to broadcast to their fans, and share snippets of their life, and things like that. We also have companies using this to do interview sessions with athletes, which is cool. The athletes are flying home from a game on a plane, and they can talk to the fans on the way home.

We do have support for external cameras, so there’s a bit of lifting required there. You can integrate things like drones, or higher-quality cameras than your phone might have, and broadcast over the phone. People are using it for sports streaming at every level, from professional on down to high school, where you may aggregate a bunch of streams from a bunch of people watching a high school game, and be able to do clip extraction and cool stuff like that to produce highlight reels.

Certainly, the low latency aspect plays there. We have a customer using this for auctions, where they add video to auctions, and then broadcast that video out to help the auctions be more effective.
The last one that’s interesting, and this plays into some of the data event stuff as well, is the HQ Trivia-like experience, where you’re having a video stream going out, and you’re synchronizing it with events, such as the questions popping up and things like that. Those are all things you can do with the combination of Broadcast, Playback, and data events.

How do you get the SDK, you might ask. Well, there’s a free trial available. You can go to the link shown on your screen, and you’ll get some instructions on how and where to download the SDK. That trial can easily be converted into a full fledged license when you’re ready.

Now I’m going to turn it over to Benji, and he’s going to walk you through a few things. Then, we’re going to show you some real live code about how you integrate the SDK into your existing project, and some of the first steps you have to take, too. We’re going to divide this up into three sections. We’re going to show you the bit of code you need to do a broadcast. We’re going to show you a bit of code on what you need to create Playback. Then, we’re going to spend a little time at the end talking about the data event support. Benji’s going to walk you through some of that, and I’ll keep an eye on questions and answer as many of them as I can.

Russ Fustino:

Yeah, actually, before Benji gets going, we do have one question here, Barry. We got an Android developer, and he wants to know if he’s able to send custom event metadata to the server from Android; he needs to listen to those events by registering an event listener. How can he listen in some other web players (i.e., he’s required to listen in an open C player)?

Barry Owen:

Let me answer that the best I can. We’ll definitely cover this more in the events section, and you’ll see some code to do this. Yes, for sure you can send custom events from the SDK to Streaming Engine. On the Streaming Engine side, you can either pass those events directly through, and they will persist in the Streaming Engine output. If you’re viewing these in, for example, an RTMP stream coming out the other side, you would pull those events out just like you would normally in any sort of a Flash player.

If you’re viewing these as an HLS stream or something like that on the outbound side, you would need a module on the Engine (we actually have a stock module that you would modify very slightly for your event) to then embed that event in the HLS stream, for example. Then, you could process that event, and it’s fairly easy with HLS to pull those events out with either JavaScript or a player that has an event model in it. Hopefully that answers your question. We’ll go into more detail on that as we get down to the data events.

Russ Fustino:

Cool, thanks. Go ahead, Benji. We’ll answer a few more when you get done with your demo. Go ahead.

Benji Brown:

Cool. We’re going to go through some slides about the iOS side and the Android side. Then, I’ll show you some live coding, kind of just me working in Xcode, demoing both Broadcast and Playback, and finally a metadata app. The first thing that you do is add the framework. You pull down the trial framework from that link that we provided earlier. You open up a new project, and you drag and drop this SDK framework into your project. That allows you, then, to access all the files so that you can implement GoCoder within your app.

Russ Fustino:

Benji, we’re not seeing your screen here, so I don’t know if you got …

Benji Brown:

Oh no, it’s …

Barry Owen:

We’re good.

Russ Fustino:

You doing the PowerPoint yet? Okay, very good.

Benji Brown:

Okay. Yeah, kind of the same kind of thing. On the Android side, you just have to set up your dependencies and create a build file. Also, link in the actual AAR files for the library. Kind of simple on that. Then, the next steps are really just the step-by-step to set up the environment within your application, which I can demo in the code. Pretty much, you attach a view to your camera preview. You then configure the GoCoder camera. Then, on your GoCoder config file, you specify video and audio stream settings. Then, you set up your callback methods, and who is going to be responding to them on various status notifications that happen from the GoCoder instance. Finally, you start live streaming your camera feed by tapping a button and starting the actual WOWZ broadcast flow. That’s how that works with live streaming. We’ll demo this with little snippets here.
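The setup order Benji walks through (attach the preview, configure the camera and stream settings, register callbacks, then start broadcasting) can be sketched as a tiny state machine. This is an illustrative model of the flow only; every class and method name below is hypothetical, not the actual GoCoder SDK API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical model of the broadcast setup flow; not the real GoCoder SDK API.
class BroadcastFlow {
    enum State { IDLE, PREVIEWING, STREAMING }

    private State state = State.IDLE;
    private boolean configured = false;
    private boolean callbackRegistered = false;
    private final List<String> steps = new ArrayList<>();

    // Step 1: attach a view to the camera preview.
    void attachCameraPreview() { steps.add("preview"); state = State.PREVIEWING; }

    // Steps 2-3: configure the camera and the video/audio stream settings.
    void configureStream(String host, int port, String app, String stream) {
        steps.add("config");
        configured = true;
    }

    // Step 4: register who responds to status notifications.
    void registerStatusCallback(Runnable onStatus) {
        steps.add("callback");
        callbackRegistered = true;
    }

    // Step 5: start the WOWZ broadcast; only valid once everything is set up.
    boolean startStreaming() {
        if (state != State.PREVIEWING || !configured || !callbackRegistered) return false;
        steps.add("start");
        state = State.STREAMING;
        return true;
    }

    State state() { return state; }
    List<String> steps() { return steps; }
}
```

The point of the sketch is the ordering: starting the stream is rejected until the preview, config, and callback steps have all happened, which mirrors the step-by-step Benji demos next.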

On iOS, you pretty much can set your root view as the camera view. You could also nest a couple of these so that you can make your own custom layout using Auto Layout constraints, whatever you want to do. Then, you start the preview, and this is the way you can actually start seeing the camera output from your iOS device. Similar kind of thing on the Android side of things. You set up your camera view associated with the resource. Then, you set the camera view to that resource, and then you start the camera preview.

We’ve kind of wrapped the native camera with some helper methods, so here are some examples of how you could set up continuous autofocus if you wanted, on both platforms. Pretty much, you grab the WOWZ camera from the camera preview, then you toggle values such as the focus mode on these camera properties.

The next thing we do is set up the config file. This kind of describes to the GoCoder instance how it should be delivering video as a broadcaster. There’s a default mode if you just instantiate it in GoCoder. Then you can set different frame presets. You can turn audio and video on and off; you can do a bunch of things. There’s a lot of stuff to dig into if you look at the config file headers. Then, the next part that’s really required is the host address and stream name, along with some other values, like port number. Sometimes, if you have authentication set on your Wowza Streaming Engine, the endpoint that you’re going to, you might have to add in a username or password so that you can actually broadcast to it. You then assign this config file to the GoCoder instance. Do the same thing on Android. Pretty much the same parity there.

The next thing to do is register for the protocols to respond to status callbacks from the GoCoder instance. For example, something within GoCoder errors out, or you want to know when buffering is happening, or you want to know when it’s starting, or you want to know when the GoCoder instance is actually starting to broadcast. You can register for these different states, and then do something, like update the UI, turn on a spinner, turn off the spinner, enable a button, those kinds of things. Or, respond to errors, and surface those to your users so that they can then respond. You definitely want to register for those callbacks, and implement those methods that are part of the protocols.
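The callback pattern described here, where each status change drives a UI reaction, might look roughly like this. The enum values and the interface are illustrative stand-ins inspired by the SDK's WOWZStatusCallback protocol, not its real signatures:

```java
// Illustrative status-callback pattern; names mimic, but are not, the SDK's API.
enum BroadcastState { IDLE, STARTING, RUNNING, BUFFERING, STOPPED }

interface StatusCallback {
    void onStatus(BroadcastState state);
    void onError(String message);
}

// A fake UI layer that reacts to each state the way the talk describes:
// spinner on while starting or buffering, off once running or stopped.
class SpinnerUi implements StatusCallback {
    boolean spinnerVisible = false;
    String lastError = null;

    @Override
    public void onStatus(BroadcastState state) {
        switch (state) {
            case STARTING:
            case BUFFERING:
                spinnerVisible = true;
                break;
            case RUNNING:
            case STOPPED:
                spinnerVisible = false;
                break;
            default:
                break;
        }
    }

    @Override
    public void onError(String message) {
        lastError = message;   // surface the error so the user can respond
        spinnerVisible = false;
    }
}
```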

The next thing is actually starting and stopping your GoCoder instance. After you’ve instantiated it, and you have it all configged up, you actually just send a message to it to start. You register self, or whatever class you want to be the one that responds to those status callbacks, and then you start streaming. You can also listen, at this point, to different status events, and do something to your UI in that case.

I’m going to show you how that looks in Xcode. I had a little test broadcast application I opened up, so if you click on here … Hang on. There you go. If you scroll down here, I just drag and drop the framework from my Finder into the embedded binaries area, and then it links everything up. Then, I have my view controller here. I imported Wowza GoCoder, and I registered to be part of this WOWZStatusCallback protocol, and I then implemented those status methods further down. I normally set the log level on the Wowza GoCoder to default or verbose, so I can start seeing a bunch of stuff in the console logs and do debugging, and then just set it to default, or turn it completely off, when I ship the app. Then, I instantiate a Wowza config file. I then register the license.

Every app that you build currently needs to have a license associated with it, even a trial license. Normally, you think about what app name and bundle identifier you want to provide for your app, and that’s what you would submit into our web system, which then provides you a trial license, or a production license if you purchase one. That’s what you would put in this area right here. I normally then listen for the error to make sure I didn’t type it in wrong, or something like that, and populate an error in case there was a licensing error on that.

Next thing is, I grab the shared instance. I then check the [inaudible 00:20:02], and then I do permissions checking against the GoCoder instance. I make sure I have camera permissions, I make sure I have microphone permissions. This is important, because you also need to make sure that you set up your Info.plist, at least on iOS, to have privacy notifications, like NSMicrophoneUsageDescription or NSCameraUsageDescription. You need to have these, because otherwise your app won’t pass App Store review, and you’ll see it also crashes. If you ever have any crashes, you’ll see in the console it says, “Oh, you didn’t register for NSCameraUsageDescription.” So, make sure you go back and do that. Once you have these permissions, I then normally enable video and audio, because I like to stream both audio and video.
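For reference, the two privacy keys Benji mentions live in the app’s Info.plist. The description strings below are placeholders you would replace with your own wording:

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to broadcast live video.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to capture audio for the broadcast.</string>
```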

Then, I set up the config, which, if you just look at this helper method that I wrote, has my host address for the localhost Wowza Streaming Engine that I have on my computer now. Then, the port number, and then the application name, live; and for this demo the stream name is myStream. Now, I have this GoCoder config property, and I set that on the actual GoCoder instance. I then set my view as the camera view so I can start previewing what my camera feed is providing me. Then, I start the preview.

Then I set up, on my storyboard over here, a broadcast button, and you can see that I’ve hooked it up to my IBAction, didTapBroadcast. Inside my view controller, I have a method called didTapBroadcast. This lets me, first of all, validate my config, make sure it’s good. Then, second, I check to see if GoCoder’s status state is running. If it’s running, then I use my button to stop the stream. Otherwise, I start the stream. This is how I just start broadcasting. Once I’ve pressed that button, everything starts happening. One thing I kind of add in here often is setting the shared application idle timer to disabled, so that the app doesn’t fall asleep while broadcasting.

The next thing that I have is pretty rudimentary right here, but I have implemented the onWOWZError and onWOWZStatus methods. This is where I would listen for what state my GoCoder is in. Maybe it’s starting, maybe it’s stopping, maybe it’s buffering, which we can see later when I show you the metadata example. That is kind of how Broadcast happens, and that’s the end of this demo. We’ll go back to the PowerPoint presentation. Okay.

Barry Owen:

Great, that was Broadcast in a nutshell. I know that was fast, but hopefully that gives you an idea that it’s relatively straightforward to do this, and as experienced developers this is something you can handle. We do have sample code on GitHub, and sample applications that you can also use to follow all of this workflow. Pretty much everything Benji is showing here today, we have complete sample applications for.

That’s Broadcast, and now we’re going to go on to Playback. I’ve seen a handful of Playback questions come up, and I’m not sure my answers are getting to you when I type them, so I’m just going to answer them here. Currently, Playback does not support adaptive bitrate. It’s a single bitrate playback, which is optimized for low latency. There is an HLS fallback feature that Benji will talk about. The HLS fallback does, in fact, support adaptive bitrate, so you have to choose either low latency with single bitrate, or if you’re looking for adaptive bitrate you can use the HLS fallback feature. We are looking at doing some things with adaptive bitrates, with the low latency playback. It’s a bit more involved when you’re trying to get down to one to two seconds of latency on the end. That’s something we’ll continue to evaluate for the future, but I do not have a timeline on when that might be available.
Go ahead, Benji.

Benji Brown:

Okay. I’m going to talk about Live Playback opportunities. You could be streaming from GoCoder as a publishing endpoint, or you could use Wirecast, or OBS, or some encoder that’s producing frames, pushing them up to a Wowza Streaming Engine, or Wowza Cloud, or the Wowza Ultra Low Latency service. Then, on the other side, you have implemented a playback solution using GoCoder so that you can play back those live streamed videos. I’m going to be going through those steps. It’s very similar to the publish side of things.

Pretty much, you add a Player view to the app; you then implement, once again, the WOWZStatusCallback methods; you configure the connection for your Player; then you can configure an HLS fallback. HLS fallback is really us wrapping the native HLS players that are provided with iOS or with Android. We just kind of set them up as a fallback, and we provide the ability for you to inject an HLS string or URL payload so that we can at least fall back to them in case your normal stream is not working. We do have the capability for you to do HLS playback, but we definitely recommend you use the GoCoder Playback capabilities first and foremost.

You can then set the play options on your player, and there are a few options. Like, you could change the fit/fill style, the player gravity of the player. You could set buffering amounts, like a pre-roll buffer. For example, you’re streaming something and you want to make sure that you have the most uninterrupted playback, so you set the pre-roll buffer to three seconds. Now, in this low-latency situation, you still have latency from encoder to playback of one to three seconds. You’d have that one to three seconds, plus the pre-roll buffer that you set. If you had a pre-roll buffer of three seconds, you’d have a potential total latency of six seconds. This is so that you have a lot of frames stored up on the player side, so that in case there’s a drop in network connectivity you won’t have an interrupted playback experience.
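The latency arithmetic in that example is simple addition: delivery latency plus pre-roll buffer gives the total glass-to-glass delay. A trivial sketch (the names are ours, not the SDK’s):

```java
// Illustrative arithmetic only; names are ours, not the SDK's.
class LatencyBudget {
    // Total glass-to-glass delay: encoder-to-player latency plus pre-roll buffer.
    static double totalSeconds(double encoderToPlayerSeconds, double preRollSeconds) {
        return encoderToPlayerSeconds + preRollSeconds;
    }
}
```

The trade-off to notice is that a larger pre-roll buffer absorbs network hiccups, but every buffered second is added directly to the viewer’s latency.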

That’s how those work. Now, we can go through each little snippet here. Pretty much, you instantiate a new WOWZ player, and you assign it to a local property. You can set a pre-roll duration. You can also register for data syncs, which we’ll talk about most in the last example. Do the same thing on Android. Pretty much, you set up a view, then you set the resource on it. You then implement the status callback so you can know about the state of the player: whether it’s starting, running, stopping, buffering. This is so that you can inform your UI, in case you want to turn on a spinner, or you want to have some cool animation when things start, that kind of thing. Same thing on Android. Just implement the same callbacks.

Then, you want to do the setup for the config file. Pretty much the same information that you did on the publish side, where you instantiate a config file and you set the host address. For example, if you’re using the low latency service from Wowza, you would be inputting the edge information from the API callback that you get, not the origin, because the edge is the thing that you consume as a player. The origin is what you would be sending frames to if you were an encoder, a publisher from GoCoder. You set a stream name, you set an application name, and audio and video enabled, and then you assign it to the GoCoder Player instance.

This is also how you can set up your HLS URLs so that you can have a fallback. I think our default right now is three failures; it then falls back to HLS. Anyway, this is the HLS backup URL that you can set. Then, you pretty much set up a button, and you press it, and you have it start playing. You can check the state of the player, whether it’s playing or not, and do the opposite. You definitely want to set yourself as a callback so you can receive those WOWZ status callback notifications.
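The failover behavior described, switching to the HLS backup URL after three consecutive connection failures, can be modeled like this. The class, the URL formats, and the threshold constant are all illustrative; as noted later in the Q&A, the SDK does not expose the threshold as a setting:

```java
// Hypothetical model of the failover logic; names and URLs are illustrative,
// and the three-failure threshold mirrors the default described in the talk.
class FallbackPlayer {
    private static final int FAILURE_THRESHOLD = 3;

    private final String wowzUrl;
    private final String hlsFallbackUrl;
    private int consecutiveFailures = 0;

    FallbackPlayer(String wowzUrl, String hlsFallbackUrl) {
        this.wowzUrl = wowzUrl;
        this.hlsFallbackUrl = hlsFallbackUrl;
    }

    // The source the player should use right now.
    String currentSource() {
        return consecutiveFailures >= FAILURE_THRESHOLD ? hlsFallbackUrl : wowzUrl;
    }

    void onConnectionFailed() { consecutiveFailures++; }

    void onConnected() { consecutiveFailures = 0; }  // a success resets the count
}
```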

Now I’m going to go into the demo. It’s pretty simple. Once again, you start with your new app; you have your bundle identifier that you then input into the web tool, and that provides you a trial license. You download the framework, you drag and drop it into your embedded binaries area, and then, in your view controller, you import the Wowza GoCoder SDK. You register for the protocol methods for status callbacks. You then create some local properties for your config file and for your player. Then, within viewDidLoad, I set my log level so that I can get some information in the logs. Just in case I’m running into problems, I can get some information from Wowza about what those problems are.

I’ve instantiated and assigned my config file to my local property. I set video and audio enabled on those, just because I want to hear and see. Potentially, you could just do video, or you could do just audio. That’s up to you. I then make sure the app doesn’t fall asleep by setting the idle timer to disabled. Here’s some information about Wowza that I like to log for myself. The next thing I do is make sure I check my license. I input my license into the GoCoder SDK, and check if I have an error. If I have an error, like, for example, I entered the wrong value, then I definitely do something about that error. Otherwise, I instantiate a new player, I register a data sync on that player, and then I set up the config, which includes the host address for the edge endpoint that I want to be playing back from. Port number 1935; the application name, currently, is live; and the stream name is just called myStream, is what I’ve set.

Then, in my storyboard, I set up my preview view. I just made a property that I linked against for an IBOutlet. Then, I have a play button that does an IBAction to a command to start playback. That’s all this does. I’ll show you in the status callback how we set the player view. I like to turn my player view on when I know that the player’s actually starting. We’ve gotten past the negotiations with the server endpoint, and now we’re about to get frames. That’s when I set my preview, which is my root view in this case, to my player view, so that I can start watching the frames that are coming through. That’s about it. We have our errors, and we have our status. That’s about all we do for this example, so I’ll go back to the slides.

Barry Owen:

Hey Benji, can we pause for a couple questions?

Benji Brown:

Yes.

Barry Owen:

We have a couple of Playback questions I wanted to address, and one of them was about the HLS fallback: can you set that up to essentially be automatic?

Benji Brown:

Currently, no. Right now it defaults to three times. If you are wanting to just-

Barry Owen:

No, I think that’s what he’s asking. If the connection fails or drops, will it switch?

Benji Brown:

Yes, it will automatically switch after three failures. We don’t currently have a property exposed that lets you modulate that value. It could be a request that you could send in to support, and see if they can do something about that. We don’t currently just let you choose to do HLS playback instead of GoCoder playback. Normally, if you wanted to do that, you could just implement the native iOS or Android players to do that yourself.

Barry Owen:

Yeah, so it is fail over only.

Benji Brown:

Yeah. [crosstalk 00:33:02]

Barry Owen:

You’d set up your WOWZ stream to be coming into the player with an HLS fallback, and you’d have an HLS fallback URL. When the WOWZ stream became unstable or dropped, you would switch over to HLS.

Benji Brown:

If you do subscribe to the Ultra Low Latency service that we have, when you spawn the ULL endpoint using the API, you will see that it provides an origin URL; an edge URL, for consumption by GoCoder; and finally, a fallback HLS URL. You’ll get those three URLs, and you can put them into your app, so that when you build an app on the ULL service you can have that fallback HLS URL. Any other questions?

Barry Owen:

Think we’re good on Player. From there on we’re going to start talking a little bit about the bi-directional event support, and there are a handful of questions there that we’ll try to answer at the end as well.

Benji Brown:

Okay, cool. With bi-directional data event support, we have the ability to register a data sync on both iOS and Android, where pretty much you define a certain AMF message that you’re going to be listening for, and what you’re going to be sending. We have several methods that allow you to send from the Broadcaster. Really, sending data events only goes from the Broadcaster, so there’s never a data event being sent from the Playback to the Broadcaster. It’s a one-way kind of flow for the data events. The Broadcaster can trigger and make an event happen, and all the people playing it back will receive that data event in the stream.

The flow is: in Broadcast, you create a data map, which can be an arbitrarily advanced data structure, and then you can send that data event as an action. Then, on the Playback side of things, people will be registering for that data sync and will receive that data event. Then, they can parse that data event and do something with it.
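That flow, building a named data map on the broadcast side and registering a sync for that event name on the playback side, can be sketched in plain code. These types are hypothetical stand-ins for the SDK’s data map and data-sink registration, not its actual classes (the "onTextData" event name follows the example used later in the demo):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the SDK's data map: a named event with a
// key/value payload whose values can themselves be nested maps.
class DataEvent {
    final String eventName;
    final Map<String, Object> payload = new HashMap<>();

    DataEvent(String eventName) { this.eventName = eventName; }

    DataEvent put(String key, Object value) {
        payload.put(key, value);
        return this;
    }
}

// A playback-side sink that only reacts to the event name it registered for.
class PlaybackSink {
    private final String registeredName;
    String lastText = null;

    PlaybackSink(String eventName) { this.registeredName = eventName; }

    void onDataEvent(DataEvent event) {
        if (event.eventName.equals(registeredName)) {
            lastText = (String) event.payload.get("text");
        }
    }
}
```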

Barry Owen:

We’re not going to go into the details here, but for those of you who are using Streaming Engine, there’s a ton of stuff you can do with modules on the Streaming Engine side to process or manipulate those events, or even inject new events for Playback only. There are lots of resources on how to do that on our website and in our GitHub account. We’re intentionally staying a bit high level here, but just know that there are a lot more capabilities on the Streaming Engine side to interact with these things.

Benji Brown:

On the ULL side, it pretty much just passes the events through, but you can definitely do some combination of ULL and Streaming Engine to do some interesting stuff. Okay, I’ll move on. Pretty much, this is a demo of how to create a data map. You create the data map object. You then set a string in for a key. You set an integer. It can pretty much be a nested dictionary of many things. It’s pretty much just like JSON, but as an object. You can put a bunch of stuff inside this data map, on both Android and iOS. You then set up the module that you’re going to be assigning it as, which in this case would be a stream module, or a data scope module. It really depends upon how you want to packetize it. Then, you frame it with the event name. A lot of times your consumers will only be listening for certain event names, so you want to make sure that you have those nicely defined for both your Broadcast side, and then your Playback or receiving side of the event.

You can register for data syncs, which means you register for certain data events, and then you’ll be listening for those. When they happen, the callback will occur, and this onData callback will get the data event, and you can do something with it. In this case, we’re just logging it to the console to prove that it occurred.

Barry Owen:

The events are tied to their names, and you can have multiple events in any stream with different names and handle them differently as you like. It’s very flexible.

Benji Brown:

They’re going through that same socket that all the video and audio are going through, so they arrive.

Barry Owen:

Yeah, and they arrive in sync with the frame you sent them at. The important part about this is that all these data events are time stamped as they go through, so you can synchronize them with particular frames in the video.
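
A rough sketch of that idea, assuming each event carries a millisecond timestamp (the SDK delivers events already synchronized; this only illustrates the concept of holding an event until the playhead reaches its frame):

```java
import java.util.*;

// Sketch: data events carry a timestamp so playback can hold each one
// until the video playhead reaches the frame it was sent with.
// Illustrative only; not the SDK's actual delivery mechanism.
class TimedEventQueue {
    // Each entry is { timestampMs, eventId }, ordered by timestamp.
    private final PriorityQueue<long[]> queue =
        new PriorityQueue<>(Comparator.comparingLong((long[] e) -> e[0]));

    public void enqueue(long timestampMs, long eventId) {
        queue.add(new long[] { timestampMs, eventId });
    }

    // Return the ids of all events due at or before the current playhead.
    public List<Long> due(long playheadMs) {
        List<Long> ready = new ArrayList<>();
        while (!queue.isEmpty() && queue.peek()[0] <= playheadMs) {
            ready.add(queue.poll()[1]);
        }
        return ready;
    }
}
```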

Benji Brown:

Which is valuable for some use cases, where you need a timed metadata experience: something occurs within the video, and you want your user interface to populate from that event in a synchronized fashion. You don’t necessarily want your UI to pop something up before the video frame actually arrives, and then [inaudible 00:38:34] screen. In that case, I will just show it.

Okay, in this case it’s a little more complicated. I have two view controllers here: a Broadcast one and a Playback one. The same setup applies. You get your bundle identifier, you get your license from the web tool, and you drag and drop the framework into your embedded binaries area. Then, in your view controllers, you import GoCoder as a library, and you set your license in your view [inaudible 00:39:20]. I’ll just start from there.

Yeah, I set my logging level and get my config. I then license my GoCoder instance and respond to the error if it occurs. Otherwise, I get the GoCoder instance and request camera and microphone permissions, making sure that my Info.plist is set up with the microphone and camera usage descriptions, so it doesn’t crash on me. Once that happens, I register for … in this case I’m registering for other sinks that aren’t valuable for this use case, but I do register for this data sink here. This time I used onTextData, which was the event name that I registered for. I then set up my config, I set up my camera view for broadcast, and I turn on that camera view and start previewing it, so that I can see the video frames before I actually start broadcasting.

Then there’s some other ancillary stuff that makes the app a little more usable: UI hook-ins. Anyway, when I tap the broadcast button from the storyboard that I’ve hooked together, I first validate that the config is legit and doesn’t have erroneous values in there, which would screw up the Broadcast. Otherwise, I check the state of the GoCoder Broadcast. If it’s running, I stop the stream, so it’s kind of a start/stop button. Otherwise, I start streaming, and because I’ve registered for status events, I’ll see whether it’s idle, starting, running, or stopping, and I can respond to those.
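
The start/stop logic described above can be sketched as a small state machine (the states mirror the idle/starting/running/stopping status events mentioned; names are illustrative, not the SDK’s actual status API):

```java
// Sketch of a start/stop broadcast toggle driven by status events.
// State names mirror the transcript's idle/starting/running/stopping;
// this is illustrative, not the real GoCoder SDK API.
class BroadcastToggle {
    public enum State { IDLE, STARTING, RUNNING, STOPPING }

    private State state = State.IDLE;

    // One button: validate the config first, then stop if running,
    // otherwise start. Invalid config leaves the state untouched.
    public State tapBroadcastButton(boolean configIsValid) {
        if (!configIsValid) return state;
        if (state == State.RUNNING) state = State.STOPPING;
        else if (state == State.IDLE) state = State.STARTING;
        return state;
    }

    // The registered status callback would advance STARTING -> RUNNING
    // and STOPPING -> IDLE as the SDK reports progress.
    public void onStatus(State newState) { state = newState; }
}
```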

The thing that I did for this specific app is that, on the Broadcast side of things, I added a little metadata button. When I tap it, right here, didTapMetaDataButton, I first make sure that GoCoder is in the running state. If it is, I produce a data map, as I described in the previous slides, and put in a bunch of information here, just to show everything. Then I send it on the stream data scope with onFooBar. Because our metadata is kind of a protected space, I actually changed this around so it says onFooBar; I needed it named a little differently. Then I send that event name, and that happens when I press the button. The event then gets sent through the stream.

Then, on the Playback side that I’ve implemented, I implement the WOWZ data sink, and you can see down here that I’m pretty much just logging the event when it comes through while I’m playing; it writes to my console here. On the Playback side of things, it’s the same flow I showed you for all Playback: you start playing, and because I’ve registered for the data sink, the value comes through here. At that moment I could respond: I could update some UI, make a congratulations happen, make an alert happen, or anything that needs to happen that’s tied to the data event when it comes through at the same frame the video is showing. That’s how metadata works. Anybody have any questions?

Barry Owen:

Yeah, there are a few questions.

Benji Brown:

Okay.

Barry Owen:

One of the questions is: is bi-directional metadata like video chat? I think the right answer to that is, you could use the data events as a sort of data channel that’s both in and out. If you wanted to do a many-in, many-out, or many-to-many kind of video chat thing, you’re going to have to do some work on the server side to manage the sessions and things like that, so that you know who gets to see what. You can certainly send events in from a single encoder and broadcast those events out to many viewers. If you want those viewers to be able to respond back and see that on the encoder side, you can in fact do that, but it’s going to require a little bit of application architecture on the server side to manage the flow of those events back and forth.

Benji Brown:

Definitely. I would even recommend, at that point, maybe wanting to use a database of some type, or a …

Barry Owen:

Absolutely.

Benji Brown:

… A Redis kind of structure.

Barry Owen:

Yeah, a common use case would be, think of HQ Trivia, where you want to synchronize the questions showing up in the UI with the timing of the broadcaster. When the broadcaster says, “Now it’s time for question one,” you can send an event down to your clients so that when they see that event, they can display question one on the screen, and it will sync up with the video saying, “Here’s question one.”

Benji Brown:

The reason this is valuable is that, say, for example, you have HQ Trivia, and you have thousands of people all using your app, and your back-end servers are trying to handle all this information. Maybe there’s a bottleneck, so the host of the show has to wait a little bit, because they’re receiving all the answers from everybody, and wants to wait before revealing the true answers and ending the round. That’s how you would use the timed metadata: driving the UI from the live stream at the point where the host feels comfortable triggering that moment.

Barry Owen:

Another question: is there a way to record the chat stream, like Facebook Live or Periscope, when recording a broadcast in Wowza Streaming Cloud? I’m assuming you want to record the incoming events, or the bi-directional events, while you’re recording the broadcast. In Wowza Streaming Cloud there currently is not a way to do that; you would need a module to handle the events. While you could do that with Streaming Engine, you cannot currently do it with Wowza Cloud.

Benji Brown:

Yeah, I would recommend … One approach that could work is that you could consume the stream from Engine.

Barry Owen:

That’s true, Benji, yeah, that makes sense.

Benji Brown:

Potentially use Streaming Engine for creating VOD content, offline content, or content that would be viewed post-stream.

Barry Owen:

Streaming Engine would essentially be another client viewing the stream, and processing and recording the data.

Benji Brown:

Exactly. You can also record on the GoCoder device. You can create MP4 files, save them on the device, and then upload them. I understand that’s more of a-

Barry Owen:

That question did come up. Can I record locally, and/or play back local files? That question was asked.

Benji Brown:

Indeed, you can. You have to be careful about how you handle the audio session, because you’re kind of juggling some audio sessions there; it’s definitely something to be careful with. If you’re going to try to do things simultaneously, you have to do it correctly. You can definitely record local files, and then you could potentially push those files up to your own server on a different thread, or a different process.

I don’t know if this is specifically what the question was asking: are the data events stored into a custom … When you do that MP4 writing locally, are they stored into the file? They are not. You would need to store those data events separately, in some local data event file structure that you create, if you wanted to play those events back later in succession.
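
One way to sketch such a separate data event file structure, assuming a simple tab-separated sidecar format that is entirely our own invention (not an SDK feature):

```java
import java.util.*;

// Sketch: since locally recorded MP4s don't include data events, keep a
// sidecar log of (timestampMs, eventName, payload) to replay later.
// The format and class are invented for illustration.
class EventSidecar {
    private final List<String> lines = new ArrayList<>();

    // Record one data event with the playback-relative timestamp it
    // should fire at, so it can be re-synchronized on replay.
    public void record(long timestampMs, String eventName, String payloadJson) {
        lines.add(timestampMs + "\t" + eventName + "\t" + payloadJson);
    }

    // In a real app you'd write this next to the MP4 (e.g. clip.mp4 plus
    // a clip.events file) and replay entries as the playhead advances.
    public String serialize() {
        return String.join("\n", lines);
    }
}
```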

Barry Owen:

That is a reasonable backlog request, though, for the local files to include the data events. Streaming Engine will persist data events in an MP4 file if you choose that format.
Let’s see, a couple more questions here. Is the API available in PHP? No, there is currently no PHP wrapper for the API. It is available in Java, Swift, and Objective-C. We are working on a Xamarin wrapper, and potentially something with React Native as well.

Russ Fustino:

Also Unity.

Barry Owen:

Thank you, Russ, I forgot about that one.

Benji Brown:

Are you talking about the API or the SDK?

Barry Owen:

I’m assuming they were talking about the SDK.

Russ Fustino:

Correct.

Barry Owen:

If that’s not true, please respond back and we’ll try to answer it again. Next question: does GoCoder make the app heavy? What’s the footprint of the SDK, Benji?

Benji Brown:

The footprint of the SDK: I think it’s 6 megabytes for Android. I think it’s 30 megabytes on iOS, but that’s because of Bitcode; you can strip out the frameworks, and it’s pretty much about 8 megabytes on iOS. That’s about the footprint of the SDK. As for usage, on the Broadcast side of things battery usage can be high, sometimes very high, depending upon the frame size you are broadcasting. If you’re broadcasting 4K video, it’s going to leverage the GPU as much as it can to process those frames, so it will use a good amount of battery, comparable to other broadcasting libraries for sure. There is also a way for you to inject custom frames, that is, not use the GoCoder camera object. You can just take your own custom frames and inject them in, and that is …

Barry Owen:

Those could be frames from an external camera, or they could be just frames that you’re creating and rendering within your app.

Benji Brown:

Correct.

Barry Owen:

You could also composite; if you want to do graphics or titling, you could do compositing as well before you send the frame off to the encoder.

Benji Brown:

In other words, we provide a source-agnostic capability to inject data into our H.264 encoder and then into the broadcast flow.

Barry Owen:

Let’s see. We’ll take one or two more here. Benji, can we use the GoCoder SDK to develop a video chat application?

Benji Brown:

You could, but you have to think about latency and how it affects conversations. For a two-way, three-way, or eight-way conversation, latency really comes into play. As you add more people, the latency can become cumulative across the conversation. It’s not necessarily that if you have eight people with a latency of a second each, you’d have a 10-second latency; it’s that, cumulatively, the interactive responses of everybody would feel latent.

GoCoder, in and of itself, just broadcasts frames up through a tuned WOWZ protocol that we have. Then you need to leverage your server and cloud infrastructure to have a low-latency experience. That’s why we [inaudible 00:52:33] through the ultra-low latency capabilities. It’s not a sub-500-millisecond real-time communication metric; it’s one to three seconds. We’re much better suited for a one-to-very-many kind of solution, where you are broadcasting and there are thousands of people watching. A chat would require multiple GoCoders with multiple broadcasts, each with several playback instances instantiated, rather than a singular broadcast. I don’t think you’ll have a very successful video chatting experience, because of the latency of the network you’d be dealing with.

Barry Owen:

Cool, thank you, Benji. This may be a question-

Benji Brown:

It’s definitely something that we’re really interested in. Let’s just say that.

Barry Owen:

We do have customers doing it. It’s typically with a limited number of participants. We just mentioned Xamarin support. Is that working now? I’m going to let Russ jump in and answer that one.

Russ Fustino:

Yeah, I’m actually pedal to the metal on that one. I’m writing a wrapper as we speak, doing the Objective-C binding for iOS first, and then I’ll do it for Android. Stay tuned for more information on that. It seems to be going pretty well, so I hope to have that completed pretty soon.

Barry Owen:

Cool. Thanks, Russ. Let’s see, one question: is there a way you could show us a complete end-to-end demo with server and GoCoder? Unfortunately, that’s out of scope for this particular webinar, but we’ll certainly take that in. What I would encourage you to do is download a trial of the SDK and/or Streaming Engine or Wowza Cloud; it’s pretty simple to get it set up and see it all working from end to end.

Russ Fustino:

It looks like we may have technical difficulties there, guys, so we’re going to end the broadcast and wrap it up. Any parting comments from either of you? Benji? Barry?

Benji Brown:

Just want to say thank you to everybody for showing up, and feel free to contact our support if you have any questions. We’re really thankful that you’re looking at Wowza and GoCoder to be your streaming solution.

Barry Owen:

Couldn’t have said it better myself. Thanks everybody, appreciate your time.

Russ Fustino:

Thanks a lot, see you next time. Follow @wowzadev. Thank you.