I didn’t think this would be so difficult.
All I want to do is play live video on my netbook (from /dev/video1:input=1:norm=NTSC) and record it at the same time. Without introducing lag.
mplayer plays the video fine (no noticeable lag).
mencoder records it fine.
The mplayer FAQ says you can do it this way:
mencoder tv:// -tv driver=v4l:width=324:height=248:outfmt=rgb24:device=/dev/video0:adevice=hw.1,0 -oac mp3lame -lameopts cbr:br=128 -flip -ovc lavc -lavcopts threads=2 -o >( tee filename.avi | mplayer -)
But that doesn’t work.
You can’t record and play at the same time because there is only one /dev/video1 device, and once either mencoder or mplayer is using it, the device is “busy” to any other program that wants to read the video stream.
I spent lots of time with mplayer, mencoder, ffmpeg, avconv, and vlc; as far as I can tell none of them can do it, directly or indirectly. There are ways that work if you don’t mind 200 or 300 ms of extra latency over mplayer alone. But I’m doing an FPV teleoperation thing and that’s too much latency for remote control.
I found a way that sort of works. Here’s a bash script (works in Linux Mint 15, which is like Ubuntu):
#!/bin/bash
mplayer tv:// -tv device=/dev/video1:input=1:norm=NTSC -fs&
outfile=$(date +%Y-%m-%d-%H%M%S)$1.mp4
avconv -f x11grab -s 800x600 -i :0.0+112,0 -b 10M -vcodec mpeg4 "$outfile"
This works by running mplayer to send the live video to the screen (full screen), then running avconv at the same time to grab the video back off the display (-f x11grab) and encode it. It doesn’t add latency, but grabbing video off the display is slow – I end up with around 10 fps instead of 30.
There must be some straightforward way to “tee” /dev/video1 into two virtual devices, so both mplayer and mencoder can read them at the same time (without one of them complaining that the device is “busy”). But I haven’t found anybody who knows how. I even asked on Stack Overflow and have exactly zero responses after a day.
(If you know how, please post a comment!)
Addendum for Linux newbies (like me):
After you put the script in file “video.sh”, you have to:
chmod +x video.sh # to make it executable (just the first time), then
./video.sh # to run the script (each time you want to run it)
You’ll probably want to tweak the script, so you should know that I’m using a KWorld USB2800D USB video capture device, which puts the composite video on input=1 (the default input=0 is for S-Video) and requires you to do norm=NTSC or it’ll assume the source is PAL.
-fs makes mplayer show the video fullscreen. Since I’m doing this on my Samsung N130 netbook with a 1024×600 screen, the 4:3 video is the 800×600 pixels in the middle of the screen (starting at (1024-800)/2 = 112 pixels from the left).
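That offset arithmetic generalizes to other screen/video widths; a quick shell check, using the values from my setup:

```shell
# Compute the x11grab x-offset for a horizontally centered video window.
# screen_w and video_w match my netbook setup; adjust for your display.
screen_w=1024
video_w=800
offset=$(( (screen_w - video_w) / 2 ))
echo "$offset"   # prints 112
```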
Also, many thanks to Compn on the #mplayer IRC for trying really hard to help with this.
Update 2013-11-02:
I haven’t given up on this, so I’ll use this space to record progress (or non-progress).
I started a Stack Exchange thread on this.
On IRC I was told that VLC can do this. I got as far as getting it to display the video at 720×96 (yes ninety-six) resolution, with a lot of lag (the source is VGA, 640×480). Googling about it, it seems the resolution problem is probably fixable with VLC, but the lag isn’t. So I gave up on that.
The most promising approaches at the moment seem to be:
- This page about ffmpeg which gives ways to create multiple outputs from a single video input device – exactly what I need. But I haven’t found any way to get ffmpeg to read from input=1:norm=NTSC (as mplayer can).
- This thread on Stack Exchange seems to describe ways to “tee” the video from one device into 2 (or more) other devices. One way using V4L2VD, the other using v4l2loopback. I haven’t figured out how to get either working.
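For what it’s worth, the v4l2loopback idea would presumably look something like this – an untested sketch, assuming the v4l2loopback module builds and loads on your kernel (the device numbers 2 and 3 are arbitrary, and the input=1/NTSC selection problem from the first bullet still applies):

```shell
# Untested sketch: create two loopback devices and feed both from the
# capture device with a single ffmpeg process.
sudo modprobe v4l2loopback devices=2 video_nr=2,3   # creates /dev/video2 and /dev/video3
ffmpeg -f video4linux2 -i /dev/video1 \
       -f v4l2 /dev/video2 \
       -f v4l2 /dev/video3
# Then, in separate terminals, each reader gets its own device:
#   mplayer tv:// -tv device=/dev/video2
#   mencoder tv:// -tv device=/dev/video3 -ovc lavc -o out.avi
```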
Update 2013-11-03:
Pygame has the ability to read and display video streams, but ‘nrp’ (one of the developers of pygame) told me on IRC that he never implemented reading from anything other than the default input 0 (zero). He suggested that the info needed to update the pygame code to do that is here, and the source code is here. I’m not really up for doing that myself, but maybe somebody else will (I posted this bug on it, per nrp’s suggestion).
Another idea I had was to just buy a different USB video capture device, that works with the default input 0 and norm. So far I haven’t found one that does that.
But I’ve got two new leads:
Update 2013-11-03 #2:
I think I made a sort of breakthrough.
v4l2-ctl can be used to control the video4linux2 driver after the app that reads the video stream has started. So even if the app mis-configures /dev/video1, once the app is running you can configure it properly.
The magic word for me is:
v4l2-ctl -d 1 -i 1 -s ntsc
That sets /dev/video1 (-d 1) to input 1 (-i 1) and NTSC (-s ntsc).
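Putting that together with the script above: start the player first, then reconfigure the driver from outside once the device is open. A sketch (the 2-second sleep is a guess at how long mplayer takes to open the device):

```shell
#!/bin/bash
# Start mplayer first (it may open /dev/video1 with the wrong input/norm)...
mplayer tv:// -tv device=/dev/video1 -fs &
# ...then fix the driver settings from outside, after the device is open.
sleep 2   # guess; adjust if mplayer needs longer to open the device
v4l2-ctl -d 1 -i 1 -s ntsc   # device 1, input 1, NTSC
```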
Not only that, but I (finally) found out how to get avconv to configure video4linux2 correctly (and maybe ffmpeg too).
For avconv, “-channel n” sets the input channel, and “-standard NTSC” sets NTSC mode. I think the equivalents in ffmpeg are “-vc n” and “-tvstd ntsc” respectively, but I haven’t tried those yet.
But this works:
avplay -f video4linux2 -standard NTSC -channel 1 /dev/video1
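Presumably the same flags work for recording, not just playback. A sketch, untested as written (the bitrate and codec are copied from my x11grab script):

```shell
# Untested sketch: record directly from /dev/video1 using the same
# input/norm flags that work for avplay, instead of grabbing the X display.
outfile=$(date +%Y-%m-%d-%H%M%S).mp4
avconv -f video4linux2 -standard NTSC -channel 1 -i /dev/video1 \
       -b 10M -vcodec mpeg4 "$outfile"
```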
Now I can try to ‘tee’ the output from /dev/video1….
Update 2014-06-01:
I gave up on this, but eventually got it working in Python with Windows (see this post); maybe that method will also work in Linux (I haven’t tried it).
For what it’s worth, this guy claims he has it working this way:
vlc -vvv v4l2:///dev/video1:input=1:norm=PAL-I:width=720:height=576 --input-slave=alsa://plughw:1,0 --v4l2-standard=PAL_I --sout '#duplicate{dst=display,dst="transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}: std{access=file,mux=mp4,dst=test.mp4}"}'
I’m doubtful (esp. re latency), but you could try it.