Standing on the Shoulders of Cloud Computers

By Jason Krause

For science geeks, the real Fourth of July fireworks were the announcement that the Large Hadron Collider in Switzerland had found a particle likely to be the Higgs boson, the last unobserved particle needed to complete the Standard Model of physics.
The discovery is a massive achievement, the culmination of centuries of research, insight, sacrifice, and ingenuity. The Large Hadron Collider (LHC) near Geneva, Switzerland cost $10 billion to build and uses thousands of superconducting magnets to accelerate subatomic particles to nearly the speed of light around a 17-mile underground tunnel. The particles are smashed together inside enormously sensitive detectors, each weighing thousands of tons, designed to identify the atomic debris from these collisions.
Supercomputers for Supercolliders
The European Organization for Nuclear Research’s (CERN) supercollider is one of the largest, most complex, and most expensive science experiments in the history of the human race. But the real achievement is not simply smashing particles. We’ve been doing that for decades, with varying degrees of precision. This week’s announcement is in large part a triumph of computing power over massive amounts of data.

See the Higgs boson? Supercomputers say it’s there. Probably.

CERN is at the center of the Worldwide LHC Computing Grid, which connects scientific institutions in 34 countries through a distributed computing and data storage infrastructure. To give a sense of the computing power needed: the Large Hadron Collider produced on the order of 1 million collisions per second for extended periods, in search of a particle that appears in perhaps 1 in a trillion collisions. At the end of 2010, the Grid harnessed roughly 200,000 processing cores to crunch 150 petabytes of collected data.
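To make those numbers concrete, here is a rough back-of-envelope sketch using only the figures quoted above (1 million collisions per second, a signature appearing in roughly 1 in a trillion collisions). It is order-of-magnitude arithmetic only, ignoring real-world factors like detector efficiency and trigger selection:

```python
# Back-of-envelope arithmetic from the figures above (illustrative only;
# real analyses must account for detector efficiency, triggers, etc.).
collisions_per_second = 1_000_000   # ~10**6 collisions per second
signal_probability = 1e-12          # ~1 in a trillion collisions

expected_events_per_second = collisions_per_second * signal_probability
seconds_per_event = 1 / expected_events_per_second
days_per_event = seconds_per_event / 86_400  # seconds in a day

print(f"Expected candidate events per second: {expected_events_per_second:.0e}")
print(f"Average wait for one candidate event: ~{days_per_event:.1f} days")
```

Even at a million collisions per second, a one-in-a-trillion signature shows up, on average, about once every eleven and a half days of continuous running, which is why sifting the data demands so much aggregate computing power.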

Physics for Lawyers
There are important lessons the rest of the world, including the legal community, can glean from this effort. The scientific world has long been a leader in pooling computing resources; it’s not an accident that the World Wide Web was invented at CERN. Today, CERN is driving the development of next-generation cloud computing platforms with advanced research into grid computing.
The Large Hadron Collider was able to succeed by pooling the knowledge and resources of thousands of the best and most highly trained minds in the world, coupled with immense computing resources. Lawyers will likely never need to process the same volumes of data the LHC generates, but in many ways, their challenge is more complex. Lawyers are dealing with information generated by humans, which is arguably more unpredictable and subtle than subatomic collisions. (At least scientists know they’re looking for a particle with certain characteristics. No one knows what a smoking gun email looks like until they find it.)
When faced with a massive data processing problem, too many law firms are still trying to buy the computers necessary to crunch the data themselves. That’s no longer the answer, not when commercial providers offer computing resources that can be deployed on demand. The scientific community understands the value of shared resources and cloud computing, and thanks to its efforts the technology is advancing at a rapid pace. The legal world must follow that example or lose the battle with big data.