What I'm doing now is: I compress the video coming from my Android phone's camera using ffmpeg compiled for the phone, then stream it to Wowza. When the video is compressed as Sorenson H.263, I take an H.263 frame from ffmpeg and prepend a single byte, as the FLV format specifies (that byte is the first byte of the VideoData field), then use juv-rtmp-client (a product from smaxe.com) to send the frame to Wowza. This indeed works when I play the live stream with a Flex program.
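For reference, that single prepended byte packs the frame type in the upper nibble and the FLV codec ID (2 for Sorenson H.263) in the lower nibble. A minimal sketch (class and method names are mine):

```java
// Sketch of the single FLV VideoData header byte for Sorenson H.263:
// upper nibble = frame type, lower nibble = codec ID, per the FLV spec.
public class FlvH263 {
    static final int CODEC_ID_SORENSON_H263 = 2; // FLV CodecID for Sorenson H.263
    static final int FRAME_TYPE_KEY = 1;         // keyframe (seekable)
    static final int FRAME_TYPE_INTER = 2;       // inter frame

    // The byte prepended to each raw H.263 frame before sending over RTMP.
    static byte videoTagHeader(boolean keyframe) {
        int frameType = keyframe ? FRAME_TYPE_KEY : FRAME_TYPE_INTER;
        return (byte) ((frameType << 4) | CODEC_ID_SORENSON_H263);
    }
}
```

So a keyframe gets 0x12 and an inter frame gets 0x22 in front of it.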
But now I need to change the codec to H.264. According to the FLV format specification, it seems I need to add more bytes in front of the raw NALUs coming from ffmpeg (x264). What's more, it seems I need to send an RTMP packet telling the server about the SPS/PPS info before sending the NALUs.
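Concretely, as I read the spec, an AVC VideoData payload is five header bytes (frame type + codec ID 7, AVCPacketType, 3-byte composition time) followed by the NALUs in length-prefixed (AVCC) form rather than Annex-B start codes. A sketch of wrapping one NALU, assuming 4-byte length prefixes (names are mine):

```java
import java.io.ByteArrayOutputStream;

// Sketch of wrapping one raw H.264 NALU into an FLV VideoData payload:
// five header bytes, then the NALU with a 4-byte big-endian length prefix.
public class FlvAvc {
    static final int CODEC_ID_AVC = 7;   // FLV CodecID for AVC/H.264
    static final int PKT_SEQ_HEADER = 0; // AVCPacketType: AVCDecoderConfigurationRecord
    static final int PKT_NALU = 1;       // AVCPacketType: one or more NALUs

    static byte[] naluPayload(byte[] nalu, boolean keyframe, int compositionTimeMs) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        int frameType = keyframe ? 1 : 2;
        out.write((frameType << 4) | CODEC_ID_AVC);  // byte 0: frame type + codec ID
        out.write(PKT_NALU);                         // byte 1: AVCPacketType
        out.write((compositionTimeMs >> 16) & 0xFF); // bytes 2-4: composition time offset
        out.write((compositionTimeMs >> 8) & 0xFF);
        out.write(compositionTimeMs & 0xFF);
        out.write((nalu.length >> 24) & 0xFF);       // 4-byte big-endian NALU length
        out.write((nalu.length >> 16) & 0xFF);
        out.write((nalu.length >> 8) & 0xFF);
        out.write(nalu.length & 0xFF);
        out.write(nalu, 0, nalu.length);             // raw NALU (no Annex-B start code)
        return out.toByteArray();
    }
}
```

The SPS/PPS packet mentioned above uses the same five-byte header but with AVCPacketType 0 and an AVCDecoderConfigurationRecord as the body.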
So juv-rtmp-client seems unable to handle this by itself. Has anyone done this before? Are there any tools like juv-rtmp-client that can do this job? As far as I know, librtmp looks promising, but I don't know whether it can be ported to Android.
This is not really the forum for this question (try Stack Overflow), but from taking a look at juv-rtmp-client you should be able to do what you want. The SPS/PPS NALUs go in the AVC sequence header, the first video tag of the FLV stream. If you are using ffmpeg/x264, this is done for you when ffmpeg emits the byte stream in an FLV container; the SPS/PPS NALUs will be in the extradata field of the AVCodecContext. Personally I would run x264 separately from ffmpeg and then push its byte stream into ffmpeg for container generation. You should be able to build ffmpeg and link it against librtmp; that way you may be able to write the whole pipeline in C (I am not an Android dev).
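If you end up assembling the sequence header yourself instead of letting ffmpeg do it, the body is an AVCDecoderConfigurationRecord built from the SPS and PPS. A minimal sketch assuming exactly one SPS and one PPS and 4-byte NALU length prefixes (names are mine; the field layout follows the MPEG-4 AVC file format):

```java
import java.io.ByteArrayOutputStream;

// Sketch of an AVCDecoderConfigurationRecord from one SPS and one PPS NALU.
// This record is the body of the first AVC video tag (AVCPacketType 0),
// which is what carries the SPS/PPS info to the server.
public class AvcConfigRecord {
    static byte[] build(byte[] sps, byte[] pps) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(1);                        // configurationVersion
        out.write(sps[1] & 0xFF);            // AVCProfileIndication (copied from SPS)
        out.write(sps[2] & 0xFF);            // profile_compatibility
        out.write(sps[3] & 0xFF);            // AVCLevelIndication
        out.write(0xFF);                     // reserved + lengthSizeMinusOne = 3 (4-byte lengths)
        out.write(0xE1);                     // reserved + numOfSequenceParameterSets = 1
        out.write((sps.length >> 8) & 0xFF); // SPS length (16-bit big-endian)
        out.write(sps.length & 0xFF);
        out.write(sps, 0, sps.length);       // the SPS NALU itself
        out.write(1);                        // numOfPictureParameterSets = 1
        out.write((pps.length >> 8) & 0xFF); // PPS length (16-bit big-endian)
        out.write(pps.length & 0xFF);
        out.write(pps, 0, pps.length);       // the PPS NALU itself
        return out.toByteArray();
    }
}
```

With ffmpeg, this exact byte sequence is what you will find in AVCodecContext.extradata, so you can also just copy it from there.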