Sunday, August 03, 2008

How I hacked mplayer to work on Motorola ROKR - Part 2

It has been over a month since part 1, so let me start off quickly and finish this part. I will try to explain how I got 19 bpp working in mplayer. It was straightforward, really. I had to edit the following files from the mplayer CVS (you can download the code from http://code.google.com/p/j2me-ctunes/):
libmpcodecs/img_format.c
libmpcodecs/img_format.h
libmpcodecs/vf_scale.c
libvo/osd.c
libvo/osd.h
libvo/vo_fbdev.c
mplayer.c
postproc/yuv2rgb.c
postproc/swscale.c
In libmpcodecs/img_format.c, libmpcodecs/img_format.h and libmpcodecs/vf_scale.c I added a few lines to report that 19 bpp is supported in both RGB and BGR formats.

libvo/osd.c is interesting: I added a function called vo_draw_alpha_rgb19() which draws the OSD alpha layer for 19 bpp.

In libvo/vo_fbdev.c, I disabled some code that was causing problems on the ROKR and turned off clipping because it was done elsewhere. In addition, I added code to initialize the variables for the 19-bpp format (the r, g and b offsets and lengths: each color channel is 6 bits long, with 1 bit for alpha) and to draw the alpha (using the function previously added in osd.c).

I also fixed a few things in mplayer.c because of a weird state change after EOF that did not work correctly on the ROKR. I don't remember exactly what was wrong, but for some reason I added a PAUSE state after a file has been played. I use the slave mode of mplayer (see man mplayer for more details about slave mode) to play media files: mplayer is always running, and when it is time to play a new media file, I just pass the new filename from Java to the running process that controls the mplayer process. When a file finished playing, mplayer used to end up in the PLAYING state; I did not want it in that state, so after playback ends I change the state to PAUSE.

postproc/yuv2rgb.c contains the most important hack of all: it is the one that converts YUV data to 19-bpp RGB, and it was the most fun code to write. For each pixel, take the YUV data, convert it to RGB, and then fit it into 19 bpp. A lookup table is created up front and used to do the conversions quickly in real time while the video is playing. I don't know much about the MMX or SSE extensions (and I'm not sure ARM supports them anyway), so I had to do all the conversion in C, but it wasn't bad at all performance-wise: I got 24 fps for most video formats on a 320 MHz XScale processor.

The final change is in postproc/swscale.c which converts YUV to packed RGB format.

That is the end of part 2 :). Pretty simple, isn't it? Next time I will explain some audio hacks.