Cross-origin AJAX, CORS and Pass-through Proxies

We’re putting together more and more AJAX solutions here. That is, client applications that run in the browser using Javascript. After loading the initial HTML into the browser, these clients make HTTP requests back to the server, and operate like a desktop application might.

For some reason, which I won’t go into too much here, the original implementors of the XMLHttpRequest object imposed a “same-origin” restriction. That means that when you make an HTTP request through XMLHttpRequest, it can only go back to the same domain that served up the original HTML page.

(Briefly, I think the idea was that they didn’t want someone dropping malicious JavaScript into an HTML page that sent a lot of local information to a bunch of untrusted servers. The problem is, only XMLHttpRequest observes that restriction, and there are about a million ways to work around it. Anyway….)

So we are looking at the CORS spec that’s implemented in most current browsers. The CORS spec boils down to a header which allows the *server* to relax the same-origin policy. If the server sends an HTML page, and includes a CORS header that allows access to “” and “”, then the browser will allow requests to those domains in addition to the origin server.
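As a rough sketch of how that header works on the server side — the origins here are made-up examples, and the helper is hypothetical — the server looks at the incoming Origin header and echoes it back in Access-Control-Allow-Origin only if it trusts it:

```javascript
// Hypothetical server-side helper. Given the Origin header of an incoming
// request and a list of origins this server trusts, decide what (if anything)
// to send back in the Access-Control-Allow-Origin response header.
// The origins below are made-up examples.
const allowedOrigins = ['http://app.example.com', 'http://api.example.com'];

function corsHeaderFor(requestOrigin, allowed) {
  // Echo the origin back only if it is on the trusted list; if we send
  // nothing, the browser refuses to hand the response to the page.
  return allowed.indexOf(requestOrigin) !== -1 ? requestOrigin : null;
}
```

The point is that the *server* opts in per origin; the browser just enforces whatever the header says.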

I am digging into the spec in more detail later today, but here’s what I know so far.

The problem is, browser implementation of CORS is spotty. Chrome and Safari are on board, and implement it as specified. Firefox is mostly there, but there are some bugs in handling non-success status codes. IE has taken its own spin on things: it hasn’t implemented CORS in XMLHttpRequest, but has implemented it in the (new) XDomainRequest object, which is a divergence from everyone else, as usual, and isn’t supported in some libraries, notably jQuery.
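If you do want to paper over the IE divergence yourself, the usual pattern is plain feature detection. A minimal sketch (the factory name is mine; `root` stands in for the browser’s global `window` object so nothing here is tied to a real browser):

```javascript
// Hypothetical factory that smooths over the IE divergence: use the
// IE-only XDomainRequest when it exists, and XMLHttpRequest everywhere else.
function createCrossDomainRequest(root) {
  if (typeof root.XDomainRequest !== 'undefined') {
    return new root.XDomainRequest(); // IE's own cross-domain object
  }
  return new root.XMLHttpRequest();   // everyone else
}
```

Libraries that don’t do this check — jQuery among them, as noted above — simply fail on IE for cross-domain calls.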

Aside from the ways you can bypass XMLHttpRequest in the browser, the old-fashioned way to get around the same-origin restriction is to add a pass-through proxy to the origin server. So rather than having your HTML page make requests to “” and “”, you have it make *all* requests to the origin server, and let the origin server relay the requests along.

So the calls would go to something like “” and “”. The site would act as a proxy, and send the requests on to the “http://woohaha” server for you.
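The relaying boils down to a small routing table plus a path rewrite. A sketch of just that rewriting step — the `/proxy/<name>/` URL shape and the route names are made-up conventions for illustration:

```javascript
// Hypothetical route table for a pass-through proxy. The page calls the
// origin server at /proxy/<name>/..., and the proxy rewrites that path
// to the real backend. The route name is a made-up example.
const routes = {
  serviceA: 'http://woohaha', // the backend server from the example above
};

function rewrite(path, table) {
  // e.g. "/proxy/serviceA/users/1" becomes "http://woohaha/users/1"
  const match = /^\/proxy\/([^\/]+)(\/.*)?$/.exec(path);
  if (!match || !table[match[1]]) {
    return null; // unknown route: refuse rather than forward blindly
  }
  return table[match[1]] + (match[2] || '/');
}
```

Note that the proxy only forwards to backends it has explicitly named — which matters for the next point.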

The problem is, if you open up that proxy with no restrictions, then you’ve basically just opened up your entire internal network. There’s nothing to stop someone from making a request to “” or “” and the proxy blindly passing the request along. Even worse, the proxy might allow forwarding requests to servers *outside* your network, in which case it becomes an anonymizing server that scrubs the original source and makes requests on someone’s behalf. Hackers would have a field day with that.

So a pass-through proxy deployed alongside CORS is really nice, but it’s very important that the CORS configuration be lined up identically with the pass-through configuration. That is, if the CORS header allows the client to talk to “” and “”, then the pass-through proxy *must* honor the exact same restriction, and only forward requests to those servers.
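The simplest way to keep the two configurations from drifting apart is to drive both from one list. A sketch of that idea, with made-up hosts and hypothetical helper names:

```javascript
// One allowlist, two consumers: the same list that drives the CORS decision
// also gates what the pass-through proxy will forward. Hosts are made-up.
const allowedHosts = ['http://app.example.com', 'http://api.example.com'];

// Consumer 1: the CORS side — echo an origin back only if it's on the list.
function originHeaderValue(requestOrigin, hosts) {
  return hosts.indexOf(requestOrigin) !== -1 ? requestOrigin : null;
}

// Consumer 2: the proxy side — forward only to URLs under a listed host.
function proxyMayForwardTo(targetUrl, hosts) {
  return hosts.some(function (host) {
    return targetUrl === host || targetUrl.indexOf(host + '/') === 0;
  });
}
```

Because both checks read `allowedHosts`, tightening or loosening the policy happens in exactly one place.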

So until CORS is fully adopted and correctly implemented in all the browsers, a pass-through proxy is a good way to make your clients work, without using browser tricks or hacks. Just make sure your CORS policy and pass-through proxy policy are identical, or else you’re opening up a more-or-less infinite security hole for the Internets to take advantage of.

A great article on the subject:

The CORS spec itself:


Filed under open standards, REST, servlets
