Categories
Linux Python Raspberry Pi

vlc command-line tips

Intro

I'm looking to test my old Raspberry Pi model 3 to see if it can play mp4 videos I recorded on my Samsung Galaxy A51 smartphone. I had assumed it would get overwhelmed and give up, but I haven't tried in many years, so… The first couple of videos did play, sort of. I was using vlc. Now, if you've seen any of my posts, you know I've written a zillion posts on running a dynamic slideshow on the RPi. Though the most important of those posts was written years ago, it still runs, and runs well, to this day, amazingly enough. Usually technology changes or hardware breaks, but that didn't happen. Every day I enjoy a brand new slideshow in my kitchen.

In most of my posts I use the old stalwart program fbi. In fact, I don't even have XWindows installed; it's not a requirement if you know what you're doing. But as far as I can see, good ol' fbi doesn't handle streaming media such as videos in mp4 format. As far as I know, vlc is more modern and, most importantly, better supported. So after a FAIL trying with mplayer (still haven't diagnosed that one), I switched to trials with vlc.

I haven't gotten very far, and that's why I wanted to share my learnings. There's so much you can do with vlc that even the most common things anyone would want to do are very hard to find working examples for. So that's where I plan to contribute to the community. As I figure out an "easy" thing, I will add it here. And if I'm the only one who ever refers to this post, so be it. I love my own My favorite python tips post, for instance; it has everything I use on a regular basis. So I'm thinking this will be similar.
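To get the ball rolling, here is the sort of bare-bones invocation I've been experimenting with. Treat it as a sketch rather than gospel: the video path is just a placeholder, and the options shown are simply the ones I know about.

$ cvlc --fullscreen --play-and-exit /home/pi/Videos/test.mp4

cvlc is the console (no GUI) flavor of vlc. The --play-and-exit switch makes it quit when the video ends instead of sitting on an empty playlist.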

References and related

My RPi slideshow blog post

My favorite python tips – everything I need!

Categories
Network Technologies Raspberry Pi

Making the Raspberry Pi camera look like an Axis ethernet camera

Intro
I can’t add much to this excellent guide:

http://blog.miguelgrinberg.com/post/how-to-build-and-run-mjpg-streamer-on-the-raspberry-pi

except a few customizations and suggestions.

I think we will be able to make the Raspberry Pi plus its camera behave like an Axis ethernet camera. This can be useful for First Robotics. But I haven't proven it out yet; I'm just anticipating it can be done.

I will also mention that there is a better way to get real-time, true motion video (see the references), and there are certainly a lot of ways to not even come close. I know because I tried a bunch of dead ends before I hit on a good way to do this! I will try to share some of my failures so others can avoid things like vlc, motion, raspistill, etc.

The details
The Axis camera (I think it's model 206) sends output via MJPG (motion JPEG). The Raspberry Pi camera can be made to do the same, with a little tweak here and there.

For instance, mjpg-streamer's default port is 8080, but you can change it to 80, just like the Axis camera.

Instead of

$ LD_LIBRARY_PATH=/usr/local/lib mjpg_streamer -i "input_file.so -f /tmp/stream -n pic.jpg" -o "output_http.so -w /usr/local/www"

do this:

$ LD_LIBRARY_PATH=/usr/local/lib mjpg_streamer -i "input_file.so -f /run/shm -n pic.jpg" -o "output_http.so -p 80 -w /usr/local/www"

You'd better make sure you don't have an Apache server or something else already listening on port 80, however.
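A quick way to check is to see what, if anything, is already listening on port 80. ss should be present on any recent Raspbian image (old-school netstat works too), and note that binding to a port below 1024 normally requires root:

$ sudo ss -tlnp | grep ':80 '

If that prints nothing, port 80 is free for mjpg_streamer.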

Our enemy – lag
Although this command provides some helpful insight into running raspistill efficiently:

$ raspistill --nopreview -w 640 -h 480 -q 5 -o /run/shm/pic.jpg -tl 100 -t 9999999 -th 0:0:0 &

it is not sufficient by itself to eliminate all lag, unfortunately. I think the -q switch (which lowers the JPEG quality, and hence the file size) is a big help, however. In my testing, lag seems to be under a second. So, OK, but nothing to write home about. And it's easy to make it worse than that…
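For the record, here is roughly how the two pieces fit together. The raspistill invocation above continuously rewrites pic.jpg in the /run/shm ramdisk, and the mjpg_streamer command from the previous section picks it up from there and serves it over HTTP:

$ raspistill --nopreview -w 640 -h 480 -q 5 -o /run/shm/pic.jpg -tl 100 -t 9999999 -th 0:0:0 &
$ LD_LIBRARY_PATH=/usr/local/lib mjpg_streamer -i "input_file.so -f /run/shm -n pic.jpg" -o "output_http.so -p 80 -w /usr/local/www"

If memory serves, the live stream then shows up at http://<your-pi's-address>/?action=stream and a single frame at /?action=snapshot, which are the standard mjpg-streamer endpoints.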

I settled on this testing methodology to get more precise results about lag and frames per second (fps). I held my smartphone, with its stopwatch app running, next to the computer screen, with the Pi camera close by and pointed at the phone. That way my field of view took in both the actual phone and the Pi's image of the phone on the laptop. This test was very helpful in illuminating what is actually going on.

No matter how many fps I requested (e.g., 10 fps by setting -tl 100), the best I could do was a frame every 0.6 seconds (about 1.6 fps). Because of the stopwatch app I know this pretty precisely! The other interesting thing is that, contrary to what I thought before doing this more quantitative test, the lag actually isn't all that bad: maybe 0.2 s. What made the lag seem larger is that you often get "unlucky," and your motion seems delayed because there are so few frames per second repainting the screen.

To be continued…

Other bad approaches
Supposedly, I was assured, true motion video can be achieved by following this recipe. It uses a package called motion. There is no use of either raspistill or raspivid, which is probably a good thing. When I first heard about it I had yet to try it out, and apparently it also solves the lag problem, at least at a low frame rate. In the end, though, this turned out to be one of those dead ends for me: yes, it more or less works, but not in real time and not with smooth motion.

References
I finally achieved true motion video and documented it in great detail in this post.