VDPAU Playback Issue (Problem Solved).

One of the facts which applies to Linux computing is that NVIDIA created an API, which allows certain video streams to be played back in a way accelerated by the GPU, instead of all the video decoding taking place on the CPU. And, users don’t necessarily need to have an NVIDIA graphics card, in order for certain graphics drivers to offer this feature, which is called ‘VDPAU’, an acronym that stands for “Video Decode and Presentation API for Unix”. Simultaneously, what some Linux users can do is to experiment with shell-scripts that allow us to click on a specific Application Window, in order to perform screen-capture on that Window for a specified number of seconds, and then to compress the resulting stream into MP4, AVI, or MPG -Files, once the screen-capture has finished. This latter piece of magic can be performed using elaborate ‘ffmpeg’ commands, which would need to be part of the script in question. And in recent days, I’ve been tweaking such scripts.
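Just to sketch the general idea, and not to reproduce my actual script, the following is a minimal example of that kind of capture, in which the 40-frame count, the quarter-second interval, the ‘/tmp’ file-names, and the choice of H.264 / MP4 as the output are all just placeholders:

#!/bin/bash
# Ask the user to click on a window, and remember its X11 window ID:
WINID=$(xwininfo | awk '/Window id:/ {print $4}')

# Grab 40 frames, one every quarter-second, i.e., roughly 10 seconds of capture:
for i in $(seq -f '%03g' 1 40); do
    xwd -id "$WINID" -out "/tmp/capture${i}.xwd"
    sleep 0.25
done

# Compress the captured frames into an H.264-encoded MP4-File.
# (If the window's width or height happens to be odd, the frames would still
#  need to be resized first, which is where the rest of this posting comes in.)
ffmpeg -framerate 4 -i /tmp/capture%03d.xwd \
    -r 24 -c:v libx264 -pix_fmt yuv420p /tmp/capture.mp4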

But then an odd behaviour cropped up. My NVIDIA graphics card supports the real-time playback of MPEG-1, MPEG-2, DIVX and H.264 -encoded streams, with GPU-acceleration. Yet, when I clicked on the resulting animations, depending on which player I chose to play them with, I’d either obtain the video stream, or just a grey rectangle in its place. And what I do know is that which of these results I obtain depends on whether I’m playing back the video stream purely with a software decoder, or whether I’m choosing to have the stream played back with GPU-acceleration.
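As an illustration of what I mean, and assuming ‘mpv’ as the player (which is just one possible choice, with ‘capture.mp4’ as a placeholder file-name), the two decoding paths can be forced explicitly from the command-line:

# Force pure software decoding:
mpv --hwdec=no capture.mp4

# Force VDPAU-accelerated decoding on the GPU:
mpv --hwdec=vdpau capture.mp4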

I’ve run into the same problem before, but this time, the cause was elsewhere.

Basically, this result will often mean that the player application first asks the graphics card whether the latter can decode the stream in question, and, when the VDPAU API responds ‘Yes’, hands the relevant processing over to the GPU, but then, for some unknown reason, the GPU fails to decode the stream. This result can sometimes have a different meaning, but I knew I needed to focus my attention on this interpretation.
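In fact, what the driver claims it can decode can be listed directly, using the ‘vdpauinfo’ utility, if that happens to be installed:

# Print the VDPAU driver's advertised capabilities, including which codec
# profiles it can decode, and the maximum width and height for each:
vdpauinfo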

Linux users will often need to have some sort of file-format, in which they can store arbitrary video-clips, that do not need to conform to strict broadcasting and distribution standards, even when the goal is ‘just to monkey around with video clips’.

I finally found what the culprit was…

(Updated 8/15/2019, 22h15 … )

(As of 8/13/2019 : )

The Window which I was capturing had a pixel-size of 941×934 pixels. My software decoders could handle that format, as long as, for ‘Xvid’ and ‘H.264’ encoding, my shell-script first resized the capture-rectangle to dimensions that were multiples of 4 pixels in each direction. Further, my software decoders could also handle 941×934 pixel rectangles, the size of which had not been rounded in any way, if the encoding had been in MPEG-2 format. But with VDPAU, the result is quite a different story.

Even when the size of the rectangle was rounded to a multiple of 4 when encoding, VDPAU refused to decode it, because its size did not conform to a standard screen-size used in commercial broadcasting and distribution. I had noticed this quite early in my trials because, with MPEG-2 streams, one of the ‘ffmpeg’ options available is to target a DVD-format corresponding strictly to the NTSC or the PAL standard, and when I put this option on the command-line, the captured screen-rectangle would be squished to the standard size, but would play back with VDPAU! I knew that there had to be something which the standard did, and which VDPAU requires, but I could not be sure of what exactly this was.
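For reference, the option in question was ffmpeg’s ‘-target’. A command roughly along these lines, in which ‘$KIMDIR’ is the directory holding the captured frames, as in my script further below, and in which the output file-name is just a placeholder, squishes the output to the strict DVD frame-size:

# Encode strictly to the PAL DVD standard (720x576 pixels, 25 FPS);
# 'ntsc-dvd' would target 720x480 pixels at 29.97 FPS instead:
ffmpeg -framerate 4 -i "$KIMDIR"/capture%3d.xwd -an -target pal-dvd dvd-test.mpg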

At one point I even experimented with rounding the size of the encoded rectangle, for use with MPEG-2, to multiples of 8 pixels in each direction, but only obtained the same result. (:1)

Yet, with MP4-Files, as long as their size was rounded to a multiple of 4 pixels in each direction, a rectangle of arbitrary dimensions could be made to play back. And, while my GPU’s ability to decode ‘DIVX’ streams is indicated, streams that have been encoded using ‘Xvid’ are also chosen as equivalent candidates by the player application, sent to the GPU to decode, and will produce a grey rectangle, even though their width and height have been rounded to a multiple of 4.

I now know that this behaviour is entirely due to the pixel-size of the rectangle in question. The way I finally found this out was to run my script and, instead of clicking on a specific application-window to capture, to click on the root window of the Desktop itself. This can be a valid thing to do, and, because the monitor on the computer in question has a resolution of 1920×1080 pixels, which corresponds to a broadcasting and/or distribution standard, the MPG as well as the AVI -Files that my script generated would play back just fine, using GPU-acceleration, via VDPAU. And this success took place even though I had not modified the scripts in any relevant way.


 

1:)

I was able to encode rectangle-sizes of 580×792 pixels to H.264 -encoded MP4-Files that play back fine with VDPAU. And their distinguishing characteristic would be the fact that 580 isn’t divisible by 8. Therefore, divisibility of the rectangle-dimensions by 8 was never really an issue.
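An encode of that general shape would look something like this; the file-names are placeholders, and this is not the exact command I used:

# Re-encode a clip to 580x792 pixels (580 being divisible by 4 but not by 8),
# as an H.264-encoded MP4-File:
ffmpeg -i input.avi -vf "scale=580:792" -c:v libx264 -pix_fmt yuv420p test-580x792.mp4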

In this earlier posting, I had essentially defined what a ‘Macroblock’ is. That posting should not be read quite as narrowly as I wrote it. All it really implies is that a macroblock needs to be twice the size of a ‘transform block’, in each direction. 4×4 pixel transform blocks are just as feasible as 8×8 pixel transform blocks.

But I could now take my coding exercises one step further, and test whether MPEG-2 streams are expected to have pixel-sizes that are multiples of 16 in each direction. Yet, because the short-term goal was to click on an oblique shape on my desktop and capture it, if I used my present technique to distort its size into multiples of 16 pixels, this would simply result in sizes that are too distorted, or at least, not guaranteed not to be too distorted. And so my next step went as indicated below…

 


 

2:)

Even though it might just be nice to know that MPEG-2 clips need to have standard rectangle-sizes, in order for VDPAU to play them, this does not solve the initial problem: that I wanted my shell-script to be able to capture an arbitrary application window, and then to encode it, so that the resulting MPG-File will always play. The way I was able to adapt my script to my newfound knowledge was to add the following piece of code to the ‘ffmpeg’ command-line:

 


-vf "scale='min(1920,iw)':min'(1080,ih)':force_original_aspect_ratio=decrease,pad=1920:1080:(ow-iw)/2:(oh-ih)/2"

 

What that specification does is to tell ‘ffmpeg’ that the resulting frame-size should be 1920×1080 (I should not be capturing anything larger than that anyway). If the input is larger, it is scaled down to fit within that frame, while keeping its original aspect ratio; if it is smaller, it is kept at its original size. In either case, the image is then letter-boxed or pillar-boxed with padding, to fill out the 1920×1080 frame exactly. Keeping smaller images at their original size may be useful because to stretch the size of the details in the stream would also cause it to compress less efficiently.

When giving this command-line parameter, care should be taken not to invoke a video filter elsewhere in the same command, using different parameters. Apparently, only one such filter option is honoured for a given output, on one command-line. Hence, while it would have been nice also to give ‘-filter:v fps=fps=24‘, to obtain a frame-rate of 24FPS in the fanciest way possible, this command-line parameter cannot be given simultaneously with the parameter which has just become more important, so that the way to set the frame-rate is now ‘-r 24‘, after the input has been defined, and before the output is defined.
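In principle, the conflict could also be avoided by appending the frame-rate filter to the same, single filter chain, as in the line below, although I find ‘-r 24‘ simpler:

-vf "scale='min(1920,iw)':'min(1080,ih)':force_original_aspect_ratio=decrease,pad=1920:1080:(ow-iw)/2:(oh-ih)/2,fps=24"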


 

3:)

It was a custom which Linux computers had adopted in the past, to use the .MPG Filename Extension to denote MPEG-1 streams, and the .MPEG Filename Extension to denote MPEG-2 streams, because the older desktop managers were only able to distinguish the contents of a file based on its filename extension. But by now, the actual application, as well as the desktop manager, can distinguish ‘what type of file it is’ most of the time, based on the first few hundred bytes of data stored in the file. Therefore, today, the .MPG Filename Extension is being used for both MPEG-1 and MPEG-2 streams. Either way, the application was making this distinction, before handing either type of stream to the GPU to assist decoding.
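That same, content-based detection can be observed from the command-line, using the ‘file’ utility, the file-name here being just a placeholder:

# Report what kind of stream the file actually contains, based on its
# leading bytes, regardless of the filename extension:
file capture.mpg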

In certain rare cases, an .M2V Filename Extension is used to tell certain programs that an MPEG-2 stream is required, and those are usually command-line programs, such as the ImageMagick command-line ‘convert‘. And in reality, the use of the .M2V -Extension confuses desktop managers more often than it helps anything.


 

(Update 8/15/2019, 22h15 : )

I have since followed up with a simple exercise, to encode an MPEG-1 File out of a video with the dimensions of 1392×768 pixels, again using ‘ffmpeg’, and found that the generated video plays fine with VDPAU. This confirms that, even though some software will play videos with other dimensions, MPEG-1/2 videos are supposed to be multiples of 16 pixels wide and high.

The script which I was originally undertaking to program needed to be rewritten like so:

 



# Read the width and height of the first captured frame, using ImageMagick:
W=`convert "$KIMDIR"/capture001.xwd -format "%w" info:`
H=`convert "$KIMDIR"/capture001.xwd -format "%h" info:`

# Round both dimensions up to the nearest multiple of 16
# (exact multiples stay as they are):
W1=$[${W}+15]
H1=$[${H}+15]

WO16=$[${W1}/16]
HO16=$[${H1}/16]

W=$[${WO16}*16]
H=$[${HO16}*16]

# Pad each frame out to the rounded size, centring the original image:
SC="pad=${W}:${H}:(ow-iw)/2:(oh-ih)/2"

# Encode the padded frames to MPEG-2, asking the user where to save
# the resulting .mpg File:
ffmpeg -framerate 4 -i "$KIMDIR"/capture%3d.xwd -an \
    -vf "$SC" \
    -r 24 -qscale:v 2 -c:v mpeg2video -g 48 -maxrate 4000k \
    "`kdialog --getsavefilename ~ '*.mpg'`"


 

 

Dirk

 
