Hat tip to WhiteAcid for the new naming convention for the post (it’s easier to name it in terms of powers, rather than to type it out by hand). But yes, yet again, MySpace is vulnerable, and yes, again, to the exact same thing as before. This cat-and-mouse game makes them look pretty foolish. Instead of just doing it right, they are writing one-off blacklist/whitelist strings that are trivial to circumvent. digi7al64 found yet another way around the same XSS filters that attempt to stop the non-alpha-non-digit XSS vector that affects Firefox. Here is the string:
<body <script onload<script=alert('xss');> which, once the filter strips out the embedded <script strings, turns into <body .. onload..=alert('xss');> and that works in Firefox.
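To make the mechanics concrete, here is a minimal Python sketch of the kind of single-pass filter being described. The exact substitution (swapping each literal <script for ..) is my assumption, inferred from the output shown above; it is not MySpace’s actual code.

```python
# Hypothetical single-pass blacklist filter (NOT MySpace's actual code):
# assume it swaps every literal "<script" for ".." in one pass.
def naive_filter(markup):
    return markup.replace("<script", "..")

payload = "<body <script onload<script=alert('xss');>"

# The filter's own substitution is what assembles the live event handler:
print(naive_filter(payload))
# -> <body .. onload..=alert('xss');>
```

The point is that the filter itself does the attacker’s work: the input is inert until the blacklist substitution splices the pieces into a tag with a working onload handler.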
What did MySpace do in response to the last filter evasion? They simply stopped stripping in the one case where they found the offending string. No, they didn’t get rid of it, block it, enumerate through a while loop, or anything else; they just did nothing, which made that particular string fail only because it depended on the stripping to assemble itself (not a particularly great defense there). This is silly and embarrassing to watch. I feel bad for them, I really do. The only reason I can think of for not writing a while loop is concern over CPU, but piling on more and more one-off filters doesn’t help CPU either, and there are other ways to solve CPU-bound search issues (I know because I’ve designed them before).
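For reference, the “while loop” idea is simply: keep filtering until the output stops changing, so one pass of stripping can’t splice a banned token back together out of its own fragments. A rough, hypothetical sketch follows; it is not MySpace’s code, and it still doesn’t make up for blacklisting instead of properly parsing and whitelisting the HTML.

```python
def filter_until_stable(markup, banned=("<script",)):
    """Hypothetical 'strip in a while loop' filter: keep removing banned
    tokens until the markup stops changing, so removal can't reassemble
    a banned token out of its own fragments."""
    previous = None
    while previous != markup:
        previous = markup
        for token in banned:
            markup = markup.replace(token, "")
    return markup

# Classic splice: one pass of stripping rebuilds "<script"; looping does not.
spliced = "<scr<scriptipt>alert(1)</script>"
print(spliced.replace("<script", ""))   # -> <script>alert(1)</script>
print(filter_until_stable(spliced))     # -> >alert(1)</script>
```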
It’s kind of comical at this point: anyone want to take bets on how many more tries it will take for them to get it right? This is the trap you fall into when you have to allow HTML for your business to survive.