For reasons I won’t go into here, the original implementors of the XMLHttpRequest object built in a “same-origin” restriction. That means that when you make an HTTP request through XMLHttpRequest, it can only go back to the same domain that served up the original HTML page.
So we are looking at the CORS spec that’s implemented in most current browsers. The CORS spec boils down to a set of headers that allow the *server* to relax the same-origin policy. If the server sends an HTML page, and includes a CORS header that allows access to “http://woo.com” and “http://haha.com”, then the browser will allow requests to those domains in addition to the origin server.
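Here’s a minimal sketch of what that looks like on the server side. The allowlist and the handler shape are my own illustration, not from the spec; the one detail worth knowing is that the `Access-Control-Allow-Origin` response header names a single origin (or `*`), so a server with several allowed origins typically echoes back the request’s `Origin` header when it’s on the list:

```javascript
// Origins the server is willing to share responses with (illustrative).
var ALLOWED_ORIGINS = ["http://woo.com", "http://haha.com"];

// Access-Control-Allow-Origin carries one origin per response, so we
// echo the request's Origin back only if it is on the allowlist.
function corsHeaderFor(requestOrigin) {
  if (ALLOWED_ORIGINS.indexOf(requestOrigin) !== -1) {
    return { "Access-Control-Allow-Origin": requestOrigin };
  }
  return {}; // no header: the browser blocks the cross-origin read
}
```

A request arriving with `Origin: http://woo.com` would get the header back; one from any other origin would get nothing, and the browser would refuse to hand the response to the page.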
I am digging into the spec in more detail later today, but here’s what I know so far.
The problem is, CORS support is spotty. Chrome and Safari are on board, and implement it as specified. Firefox is mostly there, but there are some bugs in handling non-success codes. IE has taken its own spin on things: it hasn’t implemented CORS in the XMLHttpRequest object, but has implemented it in the (new) XDomainRequest object, which is a divergence from everyone else, as usual, and isn’t supported in some libraries, notably jQuery.
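That divergence means client code has to feature-detect which object to use. Here’s a rough sketch of that detection; I’ve made the environment a parameter so the logic can be exercised outside a browser (in a real page you’d pass `window`), and the `withCredentials` check is the usual way to spot a CORS-capable XHR:

```javascript
// Pick a cross-origin transport for the current browser.
// `env` stands in for `window`; passing it in keeps this testable.
function createCrossOriginRequest(env) {
  if (typeof env.XMLHttpRequest !== "undefined" &&
      "withCredentials" in new env.XMLHttpRequest()) {
    return new env.XMLHttpRequest(); // CORS-capable XHR (Chrome, Safari, Firefox)
  }
  if (typeof env.XDomainRequest !== "undefined") {
    return new env.XDomainRequest(); // IE's separate cross-domain object
  }
  return null; // no cross-origin support; fall back to a proxy
}
```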
Aside from the ways you can bypass XMLHttpRequest in the browser, the old-fashioned way to get around the same-origin restriction is to add a pass-through proxy to the origin server. So rather than having your HTML page make requests to “http://original.com” and “http://woohaha.com”, you have it make *all* requests to the original, and let original relay the requests along.
So the calls would go to something like “http://original.com” and “http://original.com/woohaha”. The site would act as a proxy, and send the requests on to the “http://woohaha.com” server for you.
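The core of that proxy is just a path-rewriting rule. Here’s a small sketch of it, using the post’s example route; the route table and function shape are my own illustration:

```javascript
// Map a local path prefix to the upstream server it proxies for.
var PROXY_ROUTES = { "/woohaha": "http://woohaha.com" };

// Rewrite an incoming path to its upstream URL, or return null if
// the path is not a proxied route and should be served locally.
function resolveUpstream(path) {
  for (var prefix in PROXY_ROUTES) {
    if (path.indexOf(prefix) === 0) {
      return PROXY_ROUTES[prefix] + path.slice(prefix.length);
    }
  }
  return null;
}
```

So a request to “http://original.com/woohaha/api/items” becomes an upstream request to “http://woohaha.com/api/items”, while “http://original.com/index.html” is served as usual.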
The problem is, if you open up that proxy with no restrictions, then you’ve basically just opened up your entire internal network. There’s nothing to stop someone from making a request to “http://original.com/useraccounts” or “http://original.com/financials” and the proxy blindly passing the request along. Even worse, the proxy might allow forwarding requests to servers *outside* your network, in which case it becomes an anonymizing server that scrubs the original source and makes requests on someone’s behalf. Hackers would have a field day with that.
So a pass-through proxy deployed alongside your CORS setup is really nice, but it’s very important that the CORS configuration be lined up identically with the pass-through configuration. That is, if the CORS header allows the client to talk to “http://here.com” and “http://there.com”, then the pass-through proxy *must* honor the exact same restriction, and only forward requests to those servers.
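One way to keep the two policies from drifting apart is to drive both off a single allowlist. This is a hedged sketch of the idea, with hypothetical host names; both the CORS response and the proxy’s forwarding decision consult the same list:

```javascript
// The one source of truth for both CORS and the proxy (illustrative).
var ALLOWED = ["http://here.com", "http://there.com"];

// CORS side: echo the origin back only if it is allowlisted.
function corsAllowOrigin(origin) {
  return ALLOWED.indexOf(origin) !== -1 ? origin : null;
}

// Proxy side: forward only to an allowlisted origin.
function proxyMayForward(targetUrl) {
  return ALLOWED.some(function (origin) {
    return targetUrl === origin || targetUrl.indexOf(origin + "/") === 0;
  });
}
```

With one list feeding both checks, tightening or loosening the policy happens in exactly one place, which is the whole point of keeping them identical.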
So until CORS is fully adopted and correctly implemented in all the browsers, a pass-through proxy is a good way to make your clients work, without using browser tricks or hacks. Just make sure your CORS policy and pass-through proxy policy are identical, or else you’re opening up a more-or-less infinite security hole for the Internets to take advantage of.
A great article on the subject:
The CORS spec itself: