web application security lab

Reflected and Untraceable XSS Attacks

Kuza55 has a really interesting article on his blog, building on a conversation Trev started about how you can modify document.domain and steal cookies if you can run an XSS on the site. However, Kuza55 shows that by using an iframe pointed at a meta refresh (which sanitizes the referring URL), you can make the attack completely untraceable back to the origin server/logging server where the attack originated. Very cool.
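
The bounce page can be sketched in a few lines. This is a hypothetical illustration, not Kuza55's actual code; the hostnames are made up, and the key assumption (true of the browsers discussed here) is that navigations triggered by a meta refresh are sent without a Referer header:

```javascript
// Builds the intermediary page the attacker frames: it immediately
// meta-refreshes the victim to the target URL. Because meta-refresh
// navigations carry no Referer header, the target's logs show no
// trace of the page hosting the attack. All names are hypothetical.
function refererStrippingPage(targetUrl) {
  return '<html><head>' +
         '<meta http-equiv="refresh" content="0;url=' + targetUrl + '">' +
         '</head><body></body></html>';
}

// The attacker's page would then embed something like:
//   <iframe src="http://attacker.example/bounce"></iframe>
// where /bounce serves refererStrippingPage('http://victim.example/xss?...')
```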

Now the real question is this: why is there any circumstance in which a page will not send a referring URL? Can someone explain how disallowing it benefits browser security at all? Sure, there are places where you want to clear your referrer due to privacy issues, but from an attacker’s perspective this is one of the few ways to hide what you are doing from a victim. The positives don’t seem to outweigh the negatives in terms of security. I’m definitely open to hearing people’s thoughts on this though, as I’m sure there are other reasons people can think up for why it still might have some use.

7 Responses to “Reflected and Untraceable XSS Attacks”

  1. Shawn Says:

    I would hazard a guess (without reviewing any specs or source code before I write this, mind you) that it stemmed originally from the code simply not performing that way. Unless they properly abstract URL changes in such a way that they all default to sending the Referer header, I’d bet someone simply didn’t implement it, and then developers got used to things “working” like that.

    In short, I’d guess laziness over active development.

  2. kuza55 Says:

    The main use of this that I’ve seen has been to create link anonymisers for forums, so that the forum URL is NOT sent to any external sites, while still allowing users to click links rather than have to copy/paste URLs into the browser.

    There’s a bug about this (for completely different reasons) for Firefox here: https://bugzilla.mozilla.org/show_bug.cgi?id=266554

    If you cannot be bothered reading the bug, then the argument against sending referers is as follows:

    We can’t break this behaviour because sites like Gmail (which have auth data in the URL) use it to stop the URL being sent in the referer. We also can’t break expected behaviour for sites which rely on this to help people’s privacy.

    The counter argument is that this form of security is stupid, and there shouldn’t be authentication data in the URL anyway. If people need to get rid of the referer, they can do it manually by copying and pasting the link or typing the URL. Furthermore, if this method didn’t exist then hotlinking could be stopped, because a referer would always be sent.

    So, frankly, I doubt it’s going to get fixed at all; there are sites which rely on the behaviour, and since no one who knows anything about browsers trusts the referer for security, there is no benefit.

    I’m not quite sure which side of the fence I would come down on here; I think everyone agrees it helps the bad guys a lot, but I do see the usefulness to sites which would rather stay off the map, yet have users who aren’t very careful and do not want to have to stop using links.

    Oh, and according to the bug, Opera doesn’t clear referers, so this isn’t even universally supported.

  3. RSnake Says:

    I completely agree with both comments. However, honestly, I couldn’t care less about Gmail (I know it was an example, but it’s a good one). Google cannot rely exclusively on what is realistically a browser quirk for the security of its users. That’s not good practice, and it’s potentially putting its users at risk (as in the case of Opera you mentioned).

    Anyway, putting my distaste for the advertising agency’s privacy issues on hold for a moment, I am actually on the fence on this one. There are times when it is useful to be able to hide referring URLs, but this can also be done by sending users to a transitional page somewhere that “cleanses” the URL by bouncing them off that domain. Sure, it’s more work, and removing the quirk would require changes on both IE’s and Firefox’s side, so that’s going to cause problems.

    As a side note, and I say this with the best possible intentions, the fact that Google has such a vested interest in Firefox means it is very unlikely that Firefox will change first. However, as we can see in the case of Opera, sending the referrers isn’t exactly causing the internet to break.

    It still needs more thought, but I’m leaning towards throwing the referrers back into any query.

  4. Awesome AnDrEw Says:

    It took me a second glance, and a bowl of Chef Boyardee, to allow the data to process through my brain, and let me tell you that I feel this is an amazing concept. Basically this allows cross-site restrictions to be violated as long as one of the URLs is vulnerable, correct? I thought document.domain was a read-only property. Well, great job on this find, and if I was correct in my assumption about how this should work, then a lot of vulnerable social networking sites could be impacted dramatically by third-party sites providing questionable content (e.g. supposed MySpace trackers).

  5. kuza55 Says:

    First of all; I messed up, I forgot to test this in IE and Opera (beyond testing that you *can* set the document.domain property to com.), and have found that both IE and Opera treat target.com and target.com. as separate domains, and so store cookies, etc, separately as well. But luckily I have figured out a way to overcome this, which I’ve posted here: http://kuza55.blogspot.com/2007/03/non-persistent-untraceable-xss-attacks.html

    @RSnake
    I don’t think utilising browser quirks is a bad idea; if it helps, then why not? httpOnly is a browser quirk, yet we still encourage its use. But having auth data in a URL definitely is a bad idea, so there’s no defending Gmail here. So while I don’t think the counter-measure they are attempting to use is a bad one, I think they shouldn’t have to use any countermeasures, because the system should be designed better.

    Oh, and what Trev found was many times more interesting than being able to “modify the domain and steal cookies if you can run an XSS on the site”. He found that any site which tries to let its subdomains communicate, by using the JavaScript split() function to reduce document.domain to the last two levels of the hostname, can be communicated with without it having an XSS hole at all, e.g. MySpace. Sadly this was only really effective in Firefox, but it’s still much more valuable than what I posted.

    @Awesome AnDrEw
    document.domain isn’t read-only; you can set it to any higher-level domain, e.g. xxx.www.test.com could set its document.domain property to www.test.com or test.com. It cannot set it to just “com”, but xxx.www.test.com. can set it to “com.”, as can www.evil.com. - but frankly I can’t see how people providing MySpace trackers would benefit from this.
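
Both trailing-dot quirks in this comment can be sketched in a few lines of JavaScript. This is a simplified model under stated assumptions, not any browser's actual implementation, and the hostnames are hypothetical:

```javascript
// 1) The naive "keep the last two labels" extraction that Trev's finding
//    abuses: a trailing dot leaves an empty final label, so the computed
//    "domain" collapses to "com.".
function lastTwoLabels(hostname) {
  return hostname.split('.').slice(-2).join('.');
}
// lastTwoLabels('profile.myspace.com')  -> 'myspace.com'  (as intended)
// lastTwoLabels('profile.myspace.com.') -> 'com.'         (oops)

// 2) A rough model of the document.domain relaxation rule: the new value
//    must be a suffix of the hostname on a label boundary, and a bare
//    top-level label like 'com' is rejected - but 'com.' contains a dot
//    and slips past a naive check like this one.
function canSetDocumentDomain(hostname, candidate) {
  if (candidate === hostname) return true;          // setting it to itself is fine
  if (!hostname.endsWith('.' + candidate)) return false; // must be a suffix on a dot
  return candidate.indexOf('.') !== -1;             // reject bare TLDs like 'com'
}
// canSetDocumentDomain('xxx.www.test.com', 'test.com') -> true
// canSetDocumentDomain('xxx.www.test.com', 'com')      -> false
// canSetDocumentDomain('www.evil.com.',    'com.')     -> true
```

The second function is only meant to show why the trailing-dot form sneaks through: “com.” looks like a two-label domain to a dot-counting check, even though it names the same registry-level zone as “com”.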

  6. Wladimir Palant Says:

    Kuza55, really nice article. As to the Firefox bug, you made a very good case for fixing it, and I think the dangers of leaking credentials are exaggerated, so there should be a good chance of it getting fixed. The document.domain issues should be fixed pretty soon as well.

  7. Wladimir Palant Says:

    To clarify my last point - I meant https://bugzilla.mozilla.org/show_bug.cgi?id=368700 and https://bugzilla.mozilla.org/show_bug.cgi?id=368702
    There is another issue blocking these patches at the moment, but maybe we can proceed anyway; I’ll check on that.