Intro
In my previous post I outlined how to get real-time video from your Raspberry Pi with its camera, and how to make it somewhat robust. In the conclusion I mentioned that it would be nice to superimpose (overlay) a grid over that image, and speculated that OpenCV might be just the tool to do it. Here I demonstrate how I have done it, and what compromises I had to make along the way.
The details
Well, let’s talk about why I didn’t go the OpenCV route. I began to bring down the source code for raspivid and raspistill, as outlined in this series of blog posts. And I did get it to compile, but it’s a lot of packages to bring down, and then I still needed to add in the OpenCV stuff. He provided one example of a hacked source file, but for raspistill, and I needed raspivid, which is slightly different. Then there was cmake to master, which I had never used before. And then I would have needed to figure out OpenCV, which in turn might require programming in C++, in which I have only the most basic skills. And after all that, my fear was that it would slow down the video to the point where we would lose the real-time aspect! So the barriers were many, the risk was great, and the reward not that great.
Logic dictates there should be another way
I reasoned as follows. Windows displays the video in a window, and something decides what pixels to draw in that window; this is true for every program, including mplayer. So if we can get control of whatever decides how those pixels are drawn, we can draw our grid on the client side in Windows rather than on the encoder side on the Pi. So I looked for a way to superimpose an image using mplayer. Though the documentation doesn’t use that term, I was soon drawn to something that sounded similar: the -vf (video filter) switch with its post-processing capability. I don’t know how to bring up the mplayer documentation in Windows, but on the Pi it’s just
$ man mplayer
and you’ll get a whole long listing. Under -vf are different filters, none of which sounded very promising. Then I came across geq (general equation). That sounded pretty good to me. I searched for examples on the web and came across this very helpful discussion of how to use it, with examples.
So, off I went. A lot of the stuff I tried initially didn’t work. Then, when it did work, it lost the real-time feature that’s so important to us. Or the convergence to real-time took too long. I finally settled on this string for my mplayer:
mplayer -vf geq=p(X\,Y)*(1-gt(mod(X/SW\,100)\,98))*(1-gt(mod(Y/SH\,100)\,98)) -ontop -fps 27 -vo gl -cache 1024 -geometry 600:50 -noborder -msglevel all=0 -
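To unpack what that expression does: p(X\,Y) is the current pixel value, and each gt(mod(…)\,98) term turns on (becomes 1) when the pixel falls in the last couple of positions of a 100-pixel cell, so multiplying by (1 - gt(…)) zeroes those pixels to black. Here is a little Python sketch of the same mask logic (the function and variable names are mine, not mplayer’s; sw/sh stand in for geq’s per-plane scale factors, which are 1.0 for the luma plane):

```python
def grid_mask(x, y, sw=1.0, sh=1.0):
    """Mimic the geq expression: the pixel is multiplied by 0 on grid
    lines and by 1 everywhere else. sw/sh play the role of geq's SW/SH
    plane scale factors (1.0 on the luma plane)."""
    on_vertical_line = (x / sw) % 100 > 98
    on_horizontal_line = (y / sh) % 100 > 98
    return 0 if (on_vertical_line or on_horizontal_line) else 1

# Pixel (50, 50) sits inside a cell, so its value is kept;
# pixel (99, 50) lands on a vertical grid line, so it is zeroed (black).
print(grid_mask(50, 50))  # 1
print(grid_mask(99, 50))  # 0
```

Since the mask has to be evaluated for every single pixel of every frame, you can already see where the CPU cost discussed below comes from.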
in combination with these switches on raspivid:
raspivid -n -o - -t 9999999 -rot 180 -w 560 -h 420 -b 1000000 -fps 9
And, voilà, my grid of black lines appears at 100-pixel intervals. Convergence takes about 30 seconds and we have preserved the real-timeyness of the video!
But it was a series of compromises and tuning that got me there. For my desired 640 x 480 video I could get real-time video at about 7 fps (frames per second). I think my PC, a Dell Inspiron 660, just can’t keep up at higher frame rates. When you think about it, the filter has to do calculations for each and every pixel, which must introduce quite some overhead. Perhaps things will go better on PCs that don’t need the -vo gl switch of mplayer, which I have to use with my Dell display. So I kept the pixels per second constant and calculated what area I would have to shrink the picture to in order to increase the fps to a value that gave me sufficient real-timeyness. I decided there was a small but noticeable difference between 7 fps and 9 fps.
So
pixels/second = fps * Area,
so keeping that constant,
7 fps * A7 = 9 fps * A9
A = w*h = w*((3/4)*w)
So after some math you arrive at:
w9 = w7*sqrt(7/9) ≈ 640 * 0.882 ≈ 564, which I rounded down to 560 pixels
and h9 = w9*3/4 = 420 pixels
And that worked out! So a slightly smaller width gives us fewer pixels to have to calculate, and allows us to converge to real-time and have almost unnoticeable lag.
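As a quick sanity check on that arithmetic (just the math from the post, nothing mplayer-specific):

```python
import math

# Hold pixels/second constant: 7 fps * A7 = 9 fps * A9, with h = (3/4) * w,
# so the width scales by sqrt(7/9).
w7 = 640
w9 = w7 * math.sqrt(7 / 9)
print(round(w9))   # 564

# Rounded down to 560 (conveniently also a multiple of 16, which
# H.264 encoders tend to prefer), the 4:3 height comes out exactly:
h9 = 560 * 3 / 4
print(h9)          # 420.0
```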
What we have now
On the Pi, /etc/init.d/raspi-vid now includes this key line:
raspivid -n -o - -t 9999999 -rot 180 -w 560 -h 420 -b 1000000 -fps 9 | nc -l 443
and on my PC the key line in my .bat file now looks like this:
c:\apps\netcat\nc 192.168.0.90 443 | c:\apps\smplayer\mplayer\mplayer -vf geq=p(X\,Y)*(1-gt(mod(X/SW\,100)\,98))*(1-gt(mod(Y/SH\,100)\,98)) -ontop -fps 27 -vo gl -cache 1024 -geometry 600:50 -noborder -msglevel all=0 -
For the full versions of the files, and more discussion about the switches I chose, go back to my previous article about screaming streaming on the Pi, and just substitute in these lines in the obvious place. Adjust the IP to your Pi’s IP address.
No tick marks?
My original goal was to include tick marks, but given the per-pixel calculations required, I see that that’s going to be a lot more complicated and could only further slow us down (or force us to reduce the picture size further). So for now I think I’ll stop here.
A word on YUV coloring
I am much more comfortable with RGB, but that doesn’t seem to be what the Pi video stream uses. I gather raspivid encodes using YUV. I haven’t mastered the representation of YUV, but here are a couple of words on it anyways! It’s closely related to YCbCr, which is described here. Because groups of pixels share a color, if you change the expression above to gt(mod(X/SW\,101)\,99), for instance, you get alternating green and black grid lines as you go from even to odd pixels. That is my vague understanding at this point. I learned just enough to get my black grid lines but no more…
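One plausible explanation for green rather than black: in YCbCr, “no colour” is Cb = Cr = 128, not 0, so if the filter zeroes the chroma planes (or misaligned chroma survives from a neighbouring pixel, since 4:2:0 subsampling shares one Cb/Cr pair among a block of pixels), the “black” line picks up a colour cast. The sketch below uses the full-range BT.601 conversion matrix; the exact matrix in the raspivid/mplayer pipeline may differ, so treat this as an illustration rather than a statement about what these tools actually do:

```python
def ycbcr_to_rgb(y, cb, cr):
    """Full-range BT.601 (JPEG-style) YCbCr -> RGB conversion."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

# Zero luma with neutral chroma is true black:
print(ycbcr_to_rgb(0, 128, 128))  # (0, 0, 0)
# Zeroing all three planes instead yields green:
print(ycbcr_to_rgb(0, 0, 0))      # (0, 135, 0)
```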
Unsolved Mystery
Although the approach outlined above does generally work and can be real-time, I find that it also gets laggy when I leave the room and there is no motion. I’m not sure why. When I introduce motion it converges again to real-time. I don’t think this behaviour was so noticeable before I added the grid lines, but I need more tests.
Conclusion
We’ve shown how to overlay a black grid on the video output of our Raspberry Pi using built-in functionality of mplayer, while keeping the stream real-time with almost unnoticeable lag. The filter does appreciably slow things down, though, so your mileage may vary depending on your hardware.
References
The original Screaming Streaming on the Raspberry Pi article.