PSP User agent and Overflow?
- Posts: 4
- Joined: Fri Mar 25, 2005 12:30 am
Hello All,
I'm wondering if the following user-agent string gives us a little information: User-Agent: PSPUpdate-agent/1.0.0 libhttp/1.0.0
Is this an HTTP library developed internally by SCEI, or one nabbed from the web?
Hick.org has a library with a suspiciously similar name:
http://www.hick.org/code/skape/libhttp/ ... 0.0.tar.gz
Of course this could just be a coincidence, but having worked with Sony on a web-connected device a while back, I can tell you that the developers in Tokyo took shortcuts, using open-source or public-domain code to save time and quickly meet a requirement.
I'm going to try to do some comparisons tonight. If this is the same code, then maybe the hick code should be examined a bit more closely to see if there is anything that can be exploited; this *could* be a potential way to overflow the device, since the firmware update infrastructure can easily be spoofed in the home/lab.
Of course Sony firmware would need to be decrypted and checksummed before execution, but I'm thinking that maybe we can overflow *before* that point using problems in the libhttp code.
Of course these are all big "ifs" and someone might have investigated this already.
Another tack I was thinking about: if they did use the hick libhttp code and it was covered by the GPL, then asking them for a tarball of source modifications to any GPL'd code could be another angle. Unfortunately, libhttp doesn't seem to be covered by the GPL :(
Maybe this is old news and has been ruled out, if it is, tell me to shut up and I'll go back to lurk mode ;)
Hick's libhttp is quite bulletproof... It does some intelligent reading by realloc()ing a buffer on demand.
pixel: A mischievous magical spirit associated with screen displays. The computer industry has frequently borrowed from mythology. Witness the sprites in computer graphics, the demons in artificial intelligence and the trolls in the marketing department.
No idea. You could still give it a try with "let's flood the PSP with fake HTTP answers", but, well, they would have taken great care not to let any overflow bug in here, since it's quite a sensitive point.
About the open-source stuff -- apparently, Sony used parts of NetBSD for the PSP's "communications functions". A link to the NetBSD license is found in the user's manual (http://www.scei.co.jp/psp-license/pspnet.txt).
If at first you don't succeed, skydiving is not for you.
Hi people, I had a quick look at the libhttp code.
It is definitely not free of bugs (although, as pixel mentioned, they did pay a little attention to security issues, and excessively long URLs won't allow exploits). I have not finished reading the source code, but I have found two bugs so far:
- one that should not be exploitable; it just makes libhttp not as fool-proof as Hick intended, and it would require rather specific user code from Sony to be usable
- one that would at least allow us to retrieve uninitialized data from RAM, depending on Sony's code: if the browser tries to send large POST forms (which could be done with JavaScript, although I don't know if the browser has a built-in limit for this kind of thing), then the following bug occurs:
- the browser has successfully allocated POST data, and is going to call libhttp to add the POST data to the HTTP request
- if there is not enough free RAM for libhttp to make an extra copy, then there is a bug in libhttp that will leave the POST content length intact (i.e. libhttp will still try to send the original content length, even though it doesn't remember the actual content)
- but if, by the time the browser wants to send the HTTP request, there is enough free RAM to make the extra copy, then libhttp will send the POST request with uninitialized data instead of the original POST content
The problem is that for this to be exploitable, the browser code has to free some RAM between the time it adds the POST content and the time it asks libhttp to make the request (e.g. freeing the buffer containing the original POST request). So I can't tell whether this will work or not; it really depends on Sony's code.
I don't have a PSP, but it would be interesting to try and see what happens when you ask for large POST requests (plus, it might uncover more serious flaws in Sony's code), which might be exploitable -- contrarily to large GET requests (which are blocked by Sony's browser). If you would like to test it on your own PSP and want me to write a small HTML page for this purpose, just ask :)
Honestly, don't expect this exploit to work; chances are it will fail. I do think it's worth trying, though.
I don't know if the built-in browser supports getElementById() or CSS, or if it's going to be sluggish for large input sizes, but have a look at this:
(save it to hugepost.html, or rename the action= field accordingly)
Code:
<html><body>
<script language="javascript">
function go() {
document.getElementById("preparing").style.display = "inline";
var nIter = document.getElementById("kilobytes").value*8;
s = ""; // don't use 'var', we don't want it to be freed by the JS engine after the function exits!
for (var i=0; i<nIter; i++)
s += "123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456\r\n";
document.getElementById("preparing").style.display = "none";
document.getElementById("sending").style.display = "inline";
document.getElementById("t3h_t3xt").value = s;
document.getElementById("t3h_f0rm").submit();
}
</script>
<form onsubmit="go(); return false">
<span id="preparing" style="display:none">Preparing...<br /></span>
<span id="sending" style="display:none">Sending...<br /></span>
<b>POST</b> size (kb):<br /><textarea id="kilobytes">1024</textarea>
<input name="post" value="Try" type="submit" />
</form>
<form id="t3h_f0rm" action="hugepost.html" method="post" style="display:none">
<textarea name="txt" id="t3h_t3xt"></textarea>
<input name="post" value="Submit" type="submit" />
</form>
</body></html>
Basically, you should try it with different values for the POST size (start with small values); if it doesn't run out of memory, the page simply reloads, resetting the default size -- you can use a binary search (dichotomy) to pinpoint which size causes a problem. It would also be useful to have some kind of proxy that logs all the POST data sent to the server, just to check whether it is indeed valid :)
I tried getting file upload to work... It didn't. I used the W3C Code Validator as a test upload site. Probably not the best thing to use as a test, but it's all I could get at short notice.
http://validator.w3.org/
It displays the Browse button with some Japanese characters. When I click on it, the browser seems to lock up.
Testing your html, here's what I found:
If I submit 1024 KB quickly, it posts fairly quickly.
After 54 times though, it slows way down. It takes several seconds for it to post. Pausing for several seconds, and then continuing seems to clear a little memory, as it will post quickly once, then go back to taking several seconds to post.
Waiting a minute or so allows me to post 4 quickly, then it goes back to being sluggish.
I changed the size to 11059.2 and did it (surprisingly) 14 times before it started getting sluggish.
I am not sure if this is truly the PSP or possibly my web server doing some kind of limiting (stock FC1 Apache), but it's definitely overloading something.