Ripley – Microsoft Research’s Security Solution For Web 2.0 Applications & Users
November 11, 2009

One of the biggest pains in the neck in network security, or any security really, is that a user in most cases won't be proactive about ensuring his computer is secure. On most occasions the user isn't even aware that his PC has been compromised. The browser is the user's gateway, so to speak, to the Internet. Technologies like JavaScript and AJAX improve the user's browsing experience, but the computation is done at the user's end, which means that a compromised client PC can manipulate the data that is sent to the server: potentially dangerous when dealing with sensitive information, and risky for the servers. So three researchers, one each from IIT Delhi, Cornell University and Microsoft Research, have proposed a possible solution called Ripley: Automatically Securing Distributed Web Applications Through Replicated Execution.
Ripley, in simplest terms, clones the client on the server, runs the computation there, verifies the output, and forwards the response to the user. It is another layer added between the user and the server, but one that lives at the server end: a clone of the client plus a module the research paper calls the RIPLEY checker. So the AJAX computation that used to run at the user's end for better performance is pulled back to the server's end, while aiming to maintain the same level of performance.
Ripley’s Architecture (from the research paper):
Legend: S = server; C' = client; C = cloned client
- Capture user events: RIPLEY augments the client to capture user events within the browser.
- Transmit events to the server for replay: The client run-time is modified to transmit user events to the client's replica C for replay (a rough sketch of this capture-and-transmit path follows the list).
- Compare server and client results: The server component S is augmented with a RIPLEY checker that compares arriving RPCs m' and m, received from the client C' and the server-based client replica C respectively, looking for discrepancies.
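To make the first two steps a bit more concrete, here is a rough TypeScript sketch of what an event-capturing client run-time could look like. The CapturedEvent shape, the event queue and the /ripley/events endpoint are all invented for illustration; the actual RIPLEY client code is generated by the Volta compiler and is not this simple.

```typescript
// Hypothetical shape of a captured user event; the real RIPLEY client
// run-time defines its own serialization format.
interface CapturedEvent {
  type: string;       // e.g. "click", "input"
  targetId: string;   // id of the DOM element the event fired on
  value?: string;     // current input value, if any
  timestamp: number;
}

const eventQueue: CapturedEvent[] = [];

// Step 1: capture user events inside the browser by hooking the DOM.
function captureEvent(e: Event): void {
  const target = e.target as HTMLElement;
  eventQueue.push({
    type: e.type,
    targetId: target.id,
    value: (target as HTMLInputElement).value,
    timestamp: Date.now(),
  });
}

document.addEventListener("click", captureEvent, true);
document.addEventListener("input", captureEvent, true);

// Step 2: ship the queued events to the server so the replica C can
// replay them. The "/ripley/events" endpoint is made up for this sketch.
async function flushEvents(): Promise<void> {
  if (eventQueue.length === 0) return;
  const batch = eventQueue.splice(0, eventQueue.length);
  await fetch("/ripley/events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(batch),
  });
}

setInterval(flushEvents, 500);
```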
Essentially, the user's actions and the application are replicated on C. The interaction between S and C is checked by the RIPLEY checker and the result is given back to the user. The user's PC still runs the application, but I guess it's no longer trusted to do so on its own. Something like taking the concept of thin clients to the Web, maybe?
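And here is an equally rough sketch of the checker idea from the third step. The real checker runs on the server; this TypeScript version, with its invented Rpc type and ripleyCheck function, only illustrates the compare-then-dispatch logic.

```typescript
// Invented types for illustration: the RPC m' arriving from the real
// client C' is only handed to the server logic S if it matches the RPC m
// produced by the server-side replica C.
interface Rpc {
  method: string;
  args: unknown[];
}

function rpcsMatch(fromClient: Rpc, fromReplica: Rpc): boolean {
  // Structural comparison of the serialized RPCs; good enough for a sketch.
  return JSON.stringify(fromClient) === JSON.stringify(fromReplica);
}

function ripleyCheck(mPrime: Rpc, m: Rpc, dispatch: (rpc: Rpc) => void): void {
  if (rpcsMatch(mPrime, m)) {
    // No discrepancy: the client behaved exactly as its replica predicts,
    // so the RPC goes through to the real application logic.
    dispatch(m);
  } else {
    // Discrepancy: the client is out of sync or has been tampered with,
    // so the request is rejected instead of being trusted.
    throw new Error("RIPLEY checker: client RPC diverges from replica RPC");
  }
}
```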
All this supposedly happens without the user knowing. I know the first question that came to your mind (it came to mine as well): doesn't this increase the server load, replicating so many users, verifying and then sending the response? Well, the researchers say that what would initially have been a 50-60 MB footprint for each browser instance has been reduced drastically by running only a lightweight browser emulator that performs the tasks essential to Ripley. They also state that performance isn't affected for the user, and in some cases is even improved, since at the server end the computation happens in .NET, which is a lot faster than JavaScript running on the client's PC. The research papers say that Ripley is built on Live Labs' Volta. Now, whether this will be browser-specific or not, I don't really know, since I haven't properly gone through all the research docs.
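That lightweight "browser emulator" is easier to picture with a toy example: no rendering, no layout, just enough DOM-like state to replay the captured events against the replicated application logic. Everything here (EmulatedNode, ReplicaPage, apply) is invented for illustration and is not the actual emulator used by RIPLEY.

```typescript
// A toy stand-in for a server-side browser emulator: it keeps only the
// state needed to replay captured events, nothing a real browser needs
// for rendering.
interface EmulatedNode {
  id: string;
  value: string;
}

class ReplicaPage {
  private nodes = new Map<string, EmulatedNode>();

  constructor(ids: string[]) {
    for (const id of ids) {
      this.nodes.set(id, { id, value: "" });
    }
  }

  // Replay one captured event against the emulated DOM state. Clicks and
  // key presses would invoke the replicated application handlers here;
  // only input events are handled to keep the sketch small.
  apply(event: { type: string; targetId: string; value?: string }): void {
    const node = this.nodes.get(event.targetId);
    if (!node) return; // unknown element: ignored in this toy version
    if (event.type === "input" && event.value !== undefined) {
      node.value = event.value;
    }
  }

  read(id: string): string | undefined {
    return this.nodes.get(id)?.value;
  }
}

// Usage: replay a batch of events received from the real client.
const page = new ReplicaPage(["amount", "recipient"]);
page.apply({ type: "input", targetId: "amount", value: "100" });
```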
What happens at the server end is basically verification by comparing the incoming data with ideal data generated at the receiver, a concept a lot of engineers are familiar with. Ripley's method of security might be the future, Believe It Or Not.
For the technically (in)sane: Technology Review & Microsoft Research