
How can I run command-line FFmpeg and accept multiple pipes (video and audio) without blocking on the first input?

I’m trying to mux h264 and aac created with MediaCodec using FFmpeg, and also use FFmpeg’s RTMP support to send to YouTube. I’ve created two pipes, and am writing from Java (Android) through WritableByteChannels. I can send to one pipe just fine (generating silent audio with aevalsrc) like this:

./ffmpeg -f lavfi -i aevalsrc=0 -i "files/camera-test.h264" -acodec aac -vcodec copy -bufsize 512k -f flv "rtmp://a.rtmp.youtube.com/live2/XXXX"

YouTube streaming works perfectly (but I have no audio). Using two pipes, this is my command:

./ffmpeg \
-i "files/camera-test.h264" \
-i "files/audio-test.aac" \
-vcodec copy \
-acodec copy \
-map 0:v:0 -map 1:a:0 \
-f flv "rtmp://a.rtmp.youtube.com/live2/XXXX"

The pipes are created with mkfifo, and opened from Java like this:

pipeWriterVideo = Channels.newChannel(new FileOutputStream(outputFileVideo.toString()));
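
For context, here is a minimal sketch of that setup as I understand it (the helper name is hypothetical and the path is an example), assuming the FIFOs are created with mkfifo before ffmpeg starts:

import java.io.FileOutputStream;
import java.nio.channels.Channels;
import java.nio.channels.WritableByteChannel;

public class FifoSetup {
    // Hypothetical helper: create a FIFO at 'path' and open a write channel on it.
    static WritableByteChannel openFifoChannel(String path) throws Exception {
        // Create the named pipe; mkfifo fails harmlessly if it already exists.
        Runtime.getRuntime().exec(new String[]{"mkfifo", path}).waitFor();
        // Caution: this constructor blocks until a reader (ffmpeg) opens
        // the other end of the FIFO.
        return Channels.newChannel(new FileOutputStream(path));
    }
}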

The order of execution (for now, in my test phase) is: create the FIFOs, start ffmpeg (through adb shell), and then start recording, which opens the channels. ffmpeg immediately opens the h264 stream and then waits; since ffmpeg is reading from that pipe, the first channel open (for video) succeeds. When I try to open the audio channel the same way, it fails, because ffmpeg has not yet started reading from that pipe. If I open a second terminal window and cat the audio file, my app spits out what I hope is encoded AAC, but ffmpeg fails, usually just sitting there waiting. Here is the verbose output:

ffmpeg version N-78385-g855d9d2 Copyright (c) 2000-2016 the FFmpeg
developers
  built with gcc 4.8 (GCC)
  configuration: --prefix=/home/dev/svn/android-ffmpeg-with-rtmp/src/ffmpeg/android/arm 
    --enable-shared --disable-static --disable-doc --disable-ffplay 
    --disable-ffprobe --disable-ffserver --disable-symver 
    --cross-prefix=/home/dev/dev/android-ndk-r10e/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64/bin/arm-linux-androideabi- 
    --target-os=linux --arch=arm --enable-cross-compile 
    --enable-librtmp --enable-pic --enable-decoder=h264 
    --sysroot=/home/dev/dev/android-ndk-r10e/platforms/android-19/arch-arm 
    --extra-cflags='-Os -fpic -marm' 
    --extra-ldflags='-L/home/dev/svn/android-ffmpeg-with-rtmp/src/openssl-android/libs/armeabi ' 
    --extra-ldexeflags=-pie --pkg-config=/usr/bin/pkg-config
  libavutil      55. 17.100 / 55. 17.100
  libavcodec     57. 24.102 / 57. 24.102
  libavformat    57. 25.100 / 57. 25.100
  libavdevice    57.  0.101 / 57.  0.101
  libavfilter     6. 31.100 /  6. 31.100
  libswscale      4.  0.100 /  4.  0.100
  libswresample   2.  0.101 /  2.  0.101
 matched as AVOption 'debug' with argument 'verbose'.
Trailing options were found on the commandline.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option async (audio sync method) with argument 1.
Successfully parsed a group of options.
Parsing a group of options: input file files/camera-test.h264.
Successfully parsed a group of options.
Opening an input file: files/camera-test.h264.
[file @ 0xb503b100] Setting default whitelist 'file'

I think if I could just get ffmpeg to start listening to both pipes, the rest would work out!

Thanks for your time.

EDIT: I’ve made progress by decoupling the audio pipe connection from the encoding, but now as soon as the video stream has been consumed it errors on audio. I started a separate thread to create the WritableByteChannel for audio, and it never gets past the FileOutputStream creation.

matched as AVOption 'debug' with argument 'verbose'.
Trailing options were found on the commandline.
Finished splitting the commandline.
Parsing a group of options: global .
Successfully parsed a group of options.
Parsing a group of options: input file files/camera-test.h264.
Successfully parsed a group of options.
Opening an input file: files/camera-test.h264.
[file @ 0xb503b100] Setting default whitelist 'file'
[h264 @ 0xb503c400] Format h264 probed with size=2048 and score=51
[h264 @ 0xb503c400] Before avformat_find_stream_info() pos: 0 bytes read:15719 seeks:0
[h264 @ 0xb5027400] Current profile doesn't provide more RBSP data in PPS, skipping
[h264 @ 0xb503c400] max_analyze_duration 5000000 reached at 5000000 microseconds st:0
[h264 @ 0xb503c400] After avformat_find_stream_info() pos: 545242 bytes read:546928 seeks:0 frames:127
Input #0, h264, from 'files/camera-test.h264':
  Duration: N/A, bitrate: N/A
    Stream #0:0, 127, 1/1200000: Video: h264 (Baseline), 1 reference frame, yuv420p(left), 854x480 (864x480), 1/50, 25 fps, 25 tbr, 1200k tbn, 50 tbc
Successfully opened the file.
Parsing a group of options: input file files/audio-test.aac.
Applying option vcodec (force video codec ('copy' to copy stream)) with argument copy.
Successfully parsed a group of options.
Opening an input file: files/audio-test.aac.
Unknown decoder 'copy'
[AVIOContext @ 0xb5054020] Statistics: 546928 bytes read, 0 seeks

Here is where I attempt to open the audio pipe.

new Thread(){
     public void run(){
          Log.d("Audio", "pre thread");
          FileOutputStream fs = null;
          try {
               // Opening the write end of a FIFO blocks until a reader
               // (ffmpeg) opens the other end, so this constructor can
               // hang here indefinitely.
               fs = new FileOutputStream("/data/data/android.com.android.grafika/files/audio-test.aac");
          } catch (FileNotFoundException e) {
               e.printStackTrace();
               return;  // don't hand a null stream to Channels.newChannel()
          }
          Log.d("Audio", "made fileoutputstream");  // never hits here
          mVideoEncoder.pipeWriterAudio = Channels.newChannel(fs);
          Log.d("Audio", "made it past opening audio pipe");
     }
}.start();
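
(For what it’s worth, the hang at the FileOutputStream is consistent with FIFO semantics: opening a FIFO for writing blocks until some process opens it for reading, and the log above shows ffmpeg only reaches "Opening an input file: files/audio-test.aac" after the video probe completes. That also matches the earlier cat experiment, where attaching a reader let things proceed.)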

Thanks.


Answer

Your explanation isn’t very clear. I can see you’re trying to explain exactly what you’re doing, but it isn’t coming across.

First: state the actual issue up front. I had to read halfway through your post, into an 8-line paragraph, before it became apparent that you seem to be saying ffmpeg is hanging. Is that the issue? You really want to be explicit about that.

Secondly: how do you pipe data into the FIFOs? This matters. Your post is unclear on this point: you seem to suggest that ffmpeg reads the entire video file and then moves on to the audio. Is that correct, or are both streams fed to ffmpeg concurrently?

Lastly: if ffmpeg hangs, it’s likely because one of your input pipes is blocking (you’re pushing data into FIFO-1 while its buffer is full, but ffmpeg wants data from FIFO-2, whose buffer is empty). Both FIFOs need to be kept independently filled with data at all times.
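
As an illustration of that last point, here is a minimal sketch (hypothetical names, error handling simplified) of feeding each FIFO from its own thread, so that a stalled write on one stream never blocks the other:

import java.io.FileOutputStream;
import java.nio.ByteBuffer;
import java.nio.channels.Channels;
import java.nio.channels.WritableByteChannel;
import java.util.concurrent.BlockingQueue;

public class FifoFeeder {
    // One writer thread per FIFO: each drains its own queue, so a full
    // pipe buffer on one stream never stalls the other.
    static Thread startFeeder(final String fifoPath,
                              final BlockingQueue<ByteBuffer> queue) {
        Thread t = new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    // Blocks until ffmpeg opens the read end of this FIFO.
                    WritableByteChannel ch =
                            Channels.newChannel(new FileOutputStream(fifoPath));
                    while (!Thread.currentThread().isInterrupted()) {
                        ByteBuffer buf = queue.take(); // next encoder chunk
                        while (buf.hasRemaining()) {
                            ch.write(buf);
                        }
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        });
        t.start();
        return t;
    }
}

With something like this, the MediaCodec callbacks just enqueue their output buffers onto the video and audio queues, and both FIFOs stay independently fed.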
