Media player
Ever since I first started trying homebrew on my PS2 a year ago, I've meant to work on a media player. For a while, it looked like PS2Reality would obviate the need for one, so I held off work on my own player. It's pretty clear by now that PS2Reality has gone as far as it's going to. It's also clear that the PS2Reality source won't be coming out anytime soon.
Because of this, I've decided to go ahead and start on a player. I'll post more later on how I plan to set up the program, but I'm starting by making something available that some people might find useful: the QCast versions of ffmpeg and mad. Previously, someone posted the patch files for ffmpeg and mad, which QCast was required to make available under the terms of the ffmpeg and mad licenses. Patch files alone aren't very helpful to some folks, so I got the proper versions of ffmpeg and mad and applied the patches. You can now get zips of the two patched libraries via these links:
http://www.geocities.com/jlfenton65/Pro ... broadq.zip
http://www.geocities.com/jlfenton65/Pro ... broadq.zip
Initially, files for this project will be on my Geocities site. Once it reaches a runnable state, I'll see about getting it added to the PS2Dev CVS. Everything, from start to finish, will be completely open. Any suggestions or help are appreciated.
The broadq patches are already patched into 0.15, but that's not enough to actually output audio on a PS2. If you want a port of madplay that actually outputs audio, have a look at version 0.15 (ps2dev cvs, ps2sdk-ports/), specifically the audio_sjpcm.c driver.
Last edited by rinco on Wed Mar 09, 2005 1:13 pm, edited 1 time in total.
Rather than just stick ffmpeg and mad together with a few ps2 libs, I'm trying to design a PS2 player from the ground up. ffmpeg and mad are designed for systems like the PC. A PS2 isn't a PC, and treating it like one hobbles a program. For some things, that's no big deal. The PS2 is pretty powerful (not by modern standards, but you know what I mean), so many things can just be converted directly from PC to PS2.
A player program is not one of those. It demands every resource the PS2 has if you wish to do better than low-res/low-bitrate video. PS2Reality and QCast both demonstrate that just porting ffmpeg and mad doesn't cut it. This doesn't mean I plan to write the decoders from scratch. Most of the code will come from existing player programs; I'm not up to writing my own mpeg4 decoder. However, the player must be designed to utilize the PS2 more fully.
The EE should have nothing to do with reading data at all. The IOP should read the data, split it into elementary streams, then DMA the video data to the EE side for decoding. Simple audio formats (PCM, ADPCM...) should never go to the EE side at all. More complex audio formats which do need decoding by the EE should try to make use of a VU when possible. For example, mp1/mp2/mp3 audio should be able to use a VU for subband synthesis at the very least. The VU can also be used for resampling to 48 kHz and for filtering. Every little bit the EE isn't doing is that much more the EE can devote to video.
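For illustration, here's what the resampling step might look like in plain C. This is just a scalar sketch of the arithmetic a VU microprogram would perform; linear interpolation is assumed for simplicity (a real player would probably want a proper polyphase filter), and `resample_linear` is a made-up name, not code from any of the players mentioned:

```c
#include <stddef.h>
#include <stdint.h>

/* Scalar sketch of linear-interpolation resampling (e.g. 44100 Hz
 * to 48000 Hz).  On the PS2, this inner loop is the sort of work
 * that could be moved onto a VU; this plain-C version just shows
 * the arithmetic involved.  Returns the number of output samples. */
size_t resample_linear(const int16_t *in, size_t in_len,
                       int16_t *out, size_t out_max,
                       unsigned in_rate, unsigned out_rate)
{
    /* Step through the input in 16.16 fixed point. */
    uint64_t step = ((uint64_t)in_rate << 16) / out_rate;
    uint64_t pos = 0;
    size_t n = 0;

    while (n < out_max && (size_t)(pos >> 16) + 1 < in_len) {
        size_t  i    = (size_t)(pos >> 16);
        int32_t frac = (int32_t)(pos & 0xFFFF);
        int32_t a = in[i], b = in[i + 1];
        /* Interpolate between neighbouring samples.  Dividing
         * instead of shifting avoids implementation-defined
         * behaviour when (b - a) is negative. */
        out[n++] = (int16_t)(a + ((int64_t)(b - a) * frac) / 65536);
        pos += step;
    }
    return n;
}
```

On a VU, the same interpolation would be done four samples at a time in the vector registers, with the EE only setting up the DMA transfers.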
A simple calculation shows that the VU is fast enough to do brightness, contrast, and sharpness calculations on the decoded video data. I'm currently checking if it's fast enough to do deblocking. Any postprocessing that can be off-loaded from the EE is that much more that can be used for video decoding.
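As a rough illustration of how simple those per-pixel operations are, here's a scalar C sketch of a combined brightness/contrast pass over 8-bit luma samples. The fixed-point formula and the function name are assumptions made for illustration, not code from any player discussed here:

```c
#include <stddef.h>
#include <stdint.h>

/* Scalar sketch of a combined brightness/contrast pass on 8-bit
 * luma: out = clamp(contrast * (in - 128) + 128 + brightness).
 * contrast_q8 is 8.8 fixed point, so 256 means 1.0 (no change).
 * A VU version would do the same arithmetic on many pixels at once. */
static uint8_t clamp_u8(int32_t v)
{
    if (v < 0)   return 0;
    if (v > 255) return 255;
    return (uint8_t)v;
}

void adjust_luma(const uint8_t *in, uint8_t *out, size_t n,
                 int32_t contrast_q8, int32_t brightness)
{
    for (size_t i = 0; i < n; i++) {
        /* Contrast pivots around mid-grey (128); dividing by 256
         * undoes the 8.8 fixed-point scaling. */
        int32_t v = ((int32_t)in[i] - 128) * contrast_q8 / 256
                    + 128 + brightness;
        out[i] = clamp_u8(v);
    }
}
```

It's one multiply, two adds, and a clamp per pixel, which is exactly the kind of regular, branch-light arithmetic the VUs are built for.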
J.F. wrote: PS2Reality and QCast both demonstrate that just converting ffmpeg and mad doesn't hack it.
Sorry, but PS2Reality doesn't use mad, only ffmpeg.
J.F. wrote: The EE should have nothing to do with inputting data at all. The IOP should read the data, split the data into elementary streams, then DMA the video data to the EE side for decoding. Simple audio formats should never go to the EE side at all (PCM, ADPCM...). More complex audio formats which need decoding by the EE should try to make use of a VU when possible.
Well, you always read the data on the IOP side, so nothing new there, but the IOP has limited resources and already does a lot of things in projects like this; don't forget that. Letting the VU do the work is fine, but you need to feed it data and get the results back quickly or you'll have trouble (plus the extra stuff to synchronize the EE/GS/VUs), so you must evaluate it.
J.F. wrote: A simple calculation shows that the VU is fast enough to do brightness, contrast, and sharpness calculations on the decoded video data. I'm currently checking if it's fast enough to do deblocking. Any postprocessing that can be off-loaded from the EE is that much more that can be used for video decoding.
That's a lot of VU, man. You need to put it all together first to see whether the VUs actually give you more performance or not. The IPU and DMAC are your friends; don't forget them, they're much more important for this kind of project. So our advice is to go to the EE user's manuals from the PS2 Linux kit and make yourself their friend as well.
Again, good luck. I hope that this time we'll see your player at last.
bigboss wrote: PS2Reality don't use mad sorry, only ffmpeg.
I know. I figured as much when both PS2Reality and QCast put back changes to ffmpeg, but only QCast did for mad. I could have been more specific about who used what, but I figured I didn't have to go into it that much. :)
bigboss wrote: well you read always on iop side the data so nothing new, iop has limited resources and it makes too much things in projects like this don't forget that. Leave the vu make the work ok, but you need give it data to process and get result quickly or you will have a trouble(plus extra stuff to syncro ee/gs/vus) so you must to evaluate it.
I plan to do some test proggies to evaluate how well some things work as I go along - like using the VU to help decode audio. Like you said, I need to be careful about how I handle passing off data to the different units so I don't waste more time than if I decoded it with the EE anyway. I'll probably have both and try it both ways to see which really works better.
bigboss wrote: too much vu man you must put all together first to see if vus can give you more perfomance or not, IPU and DMAC are your friends don't forget them are many more important for this kind of project
Yep. DMA will be a big part of it. Like I said above, I do plan to check if it's really helping to use the VUs. Theoretically speaking, it should. Whether it does in reality is another thing altogether. :)
I recently purchased PS2 Linux (discs) so that I have the manuals. They're very helpful. I plan to make as much use of the IPU as possible. Colorspace conversion is probably one of the more time-consuming aspects of video, so any help there is a big help indeed. I'm also looking into whether the IDCT in the IPU could be made to work with formats other than MPEG. Supposedly, the PS2 camera uses it to accelerate decoding of its JPEG frames, so I suspect it can be used for general 8x8 IDCT with a little work.
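To show why software colorspace conversion is so costly, here's a plain-C sketch of the per-pixel BT.601 YCbCr-to-RGB math that the IPU performs in hardware. The integer coefficients are a common scaled-by-256 approximation, an assumption for illustration rather than the IPU's actual fixed-point format:

```c
#include <stdint.h>

/* Plain-C sketch of BT.601 YCbCr -> RGB conversion, the per-pixel
 * work the IPU's colorspace-conversion hardware offloads from the EE:
 *   R = Y + 1.402*(Cr-128)
 *   G = Y - 0.344*(Cb-128) - 0.714*(Cr-128)
 *   B = Y + 1.772*(Cb-128)
 * Coefficients are scaled by 256 for integer math. */
static uint8_t sat8(int32_t v)
{
    return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v);
}

void ycbcr_to_rgb(uint8_t y, uint8_t cb, uint8_t cr,
                  uint8_t *r, uint8_t *g, uint8_t *b)
{
    int32_t c = y;
    int32_t d = (int32_t)cb - 128;
    int32_t e = (int32_t)cr - 128;
    *r = sat8(c + 359 * e / 256);            /* 1.402 * 256 ~= 359 */
    *g = sat8(c - (88 * d + 183 * e) / 256); /* 0.344, 0.714 * 256 */
    *b = sat8(c + 454 * d / 256);            /* 1.772 * 256 ~= 454 */
}
```

Multiply that handful of operations by every pixel of every frame and it's easy to see why keeping this off the EE matters so much.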
bigboss wrote: Again good luck i wait that this time we see your player at last
Thanks. I appreciate the advice, especially as you are one of the few people to actually get a working player out.
About the EyeToy and the IPU: yes, the EyeToy captures IPU frames directly, so no extra libs are needed to display them. It's always better to have the PS2 working for you instead of you working for your PS2. The driver just sends the IPU frames to the EE using the sifcmd stuff; the EE then sends them straight to the IPU over its DMA channel and receives them back in RGBA32, ready to display. I suppose we will release something about it soon. Meanwhile, your PS2 Linux manuals are the best source for understanding it.
bigboss wrote: about eyetoy and IPU , yes eyetoy capture directly IPU frames so it's not needed extra libs to show them ... i suppose that we will release something soon about it
That would certainly interest a lot of folks. There isn't much on the EyeToy yet, or the IPU for that matter. What's in the Linux manual is it as far as I know.
Folx, I split out the controversial postings into a separate thread so that the serious discussion in J.F.'s thread won't get hijacked by dev political soap operas. Please continue constructive dev-related discussion in this thread. Please send vitriol and dramatic scenes to the other.
You will find it in the "off topic" section.
http://forums.ps2dev.org/viewtopic.php?t=1159