Wowza Community

AIR, iOS, and HLS

I’ve made an adaptation of the videochat example… it publishes H.264 video (no audio, at present) to rtmp://[myserver]:1935/myApp/myStream. When I export the project as a SWF and view it in the browser, I can successfully subscribe to the published stream (using a separate NetStream instance) and display the two videos side by side (the local webcam that I’m publishing to the server, and the same feed streamed back from the server)… as expected, there is a slight delay between the two videos (maybe 1 second).
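For context, the publish/play-back setup described above looks roughly like this (a minimal sketch, not the actual app code; the server URL, stream name, and camera settings are placeholder assumptions):

```actionscript
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.Video;

// Sketch: publish the local webcam and subscribe to the same stream
// over one NetConnection. URL and stream name are assumptions.
var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://myserver:1935/myApp");

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        // Publish the local camera to the server
        var cam:Camera = Camera.getCamera();
        cam.setMode(320, 240, 15);
        var pubStream:NetStream = new NetStream(nc);
        pubStream.attachCamera(cam);
        pubStream.publish("myStream", "live");

        // Subscribe to the same stream with a second NetStream
        var playStream:NetStream = new NetStream(nc);
        var video:Video = new Video(320, 240);
        video.attachNetStream(playStream);
        addChild(video);
        playStream.play("myStream");
    }
}
```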

When I package this as an AIR for iOS app and run it on my iPad 2, it works in exactly the same way (surprisingly, as I have read that iOS is not capable of accepting RTMP streams, only HTTP/HLS). When I publish a stream from my iPad, I am able to view it in my SWF (in the browser). However, when I publish from the SWF (in the browser), I am unable to view the stream in the iPad app (despite being able to publish from the iPad app and view that stream in both the iPad app and the SWF app).

It is also worth mentioning that I’m able to view the stream (published from the SWF) in the iPad app if I use the AS3 StageWebView and provide the URL: http://[myserver]:1935/myApp/myStream/playlist.m3u8 . While interesting, this isn’t really a solution, as the goal of my project is to create a cross-browser / iOS / Android compatible real-time audio/video chat application, and the absolute minimum latency I’m able to achieve with Apple’s HLS protocol is around 7 seconds (too much).
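For anyone trying the same workaround, the StageWebView playback mentioned above is roughly this (a sketch; the viewport size and URL are placeholders, and the device’s built-in player handles the .m3u8):

```actionscript
import flash.media.StageWebView;
import flash.geom.Rectangle;

// Sketch: play an HLS playlist inside an AIR for iOS app via
// StageWebView. iOS's native player renders the .m3u8 stream.
var webView:StageWebView = new StageWebView();
webView.stage = this.stage;
webView.viewPort = new Rectangle(0, 0, 320, 240);
webView.loadURL("http://myserver:1935/myApp/myStream/playlist.m3u8");
```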

I’ve been scouring the internet for a week, but I haven’t yet been able to come up with any answers (most specifically, as to why the iPad app can publish/stream video from/to itself, and the SWF can accept the stream from the iPad app, but the iPad app can’t read the stream from the SWF). Could it be that the video published by the iPad app is somehow encoded differently from that of the SWF? If so, is there any way to analyze the two streams and identify any differences?

I realize that AS3, AIR for iOS, and the inner workings of Apple devices are all rather specialist topics, but I didn’t really know where else to post my question :)

Any ideas? I’d love to be able to make this work.

Thanks.

To publish in Flash and play back in iOS with a video-only stream, you can set the Flash application to use H.264 video instead of Spark or VP6. Take a look at how to set up a Flash app for H.264 in this guide:

https://www.wowza.com/docs/how-to-convert-flash-player-11-output-from-h-264-speex-audio-to-h-264-aac-audio-using-wowza-transcoder

The above guide goes on to set the audio codec to Speex and use the Wowza Transcoder to transcode the stream to H.264/AAC for iOS and other clients. But if you are not using audio, you do not need to follow the audio steps or use the Transcoder.
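Switching the publish codec to H.264 in a Flash Player 11+ / AIR app looks roughly like this (a sketch; the profile/level and camera settings are illustrative assumptions, and `nc` is assumed to be an already-connected NetConnection):

```actionscript
import flash.media.Camera;
import flash.media.H264Level;
import flash.media.H264Profile;
import flash.media.H264VideoStreamSettings;
import flash.net.NetStream;

// Assumes nc is a connected NetConnection.
var cam:Camera = Camera.getCamera();
cam.setMode(320, 240, 15);

// Configure H.264 encoding instead of the default Sorenson Spark.
var h264:H264VideoStreamSettings = new H264VideoStreamSettings();
h264.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3);

var pubStream:NetStream = new NetStream(nc);
pubStream.videoStreamSettings = h264;
pubStream.attachCamera(cam);
pubStream.publish("myStream", "live");
```

Note that `videoStreamSettings` must be set on the NetStream before calling `publish()` for the encoder settings to take effect.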

Richard

Paul,

Sorry, I did miss the h.264 mention at the start.

Hmm, first, I’m not exactly sure what is happening with this AIR export to iOS, but it is not a SWF and it is not RTMP, because those are not supported, and it is almost certainly Cupertino (HLS), because that is supported.

When you publish a stream to Wowza with a LiveStreamPacketizers setting that includes “cupertinostreamingpacketizer”, Wowza logs some detailed info about the video codec and (though not in this case) the audio codec. Take a look in the access log at the codec info for Cupertino packetizing. For example:

INFO server comment - LiveStreamPacketizerCupertino.handlePacket[live/definst/Stream1][avc1.66.30]: H.264 Video info: {H264CodecConfigInfo: codec:H264, profile:Baseline, level:3.0, frameSize:424x240, displaySize:424x240, frameRate:24.0, crop: l:0 r:4 t:0 b:0}

Richard

Hi Richard,

Thanks for your reply, but it didn’t really address my questions.

To publish in Flash and playback in iOS with a video only stream, you can set the Flash application to use h.264 video instead of Spark or VP6

Yes, as stated in my post, I’m already doing that.

Take a look at how to setup Flash app for h.264 in this guide:

http://www.wowza.com/forums/content…anscoder-AddOn

Yes, I’ve already got a functioning app, as stated in my post.

The above guide goes on to set the audio codec to Speex and use the Wowza Transcoder to transcode the stream to h.264/AAC for iOS and other clients. But if you are not using audio, you do not need to follow the audio steps or use the Transcoder.

That’s right, I’m not interested in transcoding audio.

My question is this (without repeating my initial post too much)… I’ve made a chat app (as per https://www.wowza.com/docs/how-to-convert-flash-player-11-output-from-h-264-speex-audio-to-h-264-aac-audio-using-wowza-transcoder). It’s working well: I can broadcast from the SWF (in the browser), stream the same live feed (webcam) back to the same SWF, and display the two side by side with a 1-second delay (or less). If I package the app using AIR for iOS and run it on my iPad 2, the behavior is identical (it broadcasts H.264 video from the iPad and streams it back to the iPad, with no problem displaying the video feed).

When I broadcast from the iPad (AIR for iOS app), I’m able to stream the feed to the SWF (in a browser).

My issue, however, is that when I broadcast from the SWF (in the browser), I can’t stream the feed to the iPad… the video simply doesn’t show up. I don’t understand why the iPad CAN receive the stream that it broadcasts from itself (H.264 video, using RTMP, and a separate NetStream/NetConnection), but it can NOT receive a stream published from a SWF (in a browser). It’s the same app, same code; one is running in the browser, the other is packaged as AIR for iOS.

I should make it clear that I’m ONLY using RTMP, not HLS or HDS. Could it be that the iPad somehow encodes the video it publishes to the server differently from how the SWF does? If so, is there any tool for analyzing/debugging an incoming RTMP stream (to check for encoding differences)?

I appreciate that it’s a difficult question - any insight would be much appreciated.

Thanks,

Paul.