
Please Note: The content on this page is not maintained after the colloquium event is completed.  As such, some links may no longer be functional.


Ian Foster
Computation Institute, Argonne National Laboratory & University of Chicago
Big Process for Big Data

Wednesday, October 31, 2012
Building 3 Auditorium - 11:00 AM
(Coffee and cookies at 10:30 AM)

We have made much progress over the past decade toward effectively harnessing the collective power of IT resources distributed across the globe. In fields such as high-energy physics, astronomy, and climate, thousands benefit daily from tools that manage and analyze large quantities of data produced and consumed by large collaborative teams.

But we now face a far greater challenge: exploding data volumes and powerful simulation tools mean that far more researchers, perhaps ultimately most, will soon require capabilities not so different from those used by these big-science teams. How is the general population of researchers and institutions to meet these needs? Must every lab be filled with computers loaded with sophisticated software, and every researcher become an information technology (IT) specialist? Can we possibly afford to equip our labs in this way, and where would we find the experts to operate them?

Consumers and businesses face similar challenges, and industry has responded by moving IT out of homes and offices to so-called cloud providers (e.g., Google, Netflix, Amazon, Salesforce), slashing costs and complexity. I suggest that by similarly moving research IT out of the lab, we can realize comparable economies of scale and reductions in complexity. More importantly, we can free researchers from the burden of managing IT, giving them back their time to focus on research and empowering them to go beyond the scope of what was previously possible.

I describe work we are doing at the Computation Institute to realize this approach, focusing initially on research data lifecycle management. I present promising results obtained to date with the Globus Online system, and suggest a path towards large-scale delivery of these capabilities. For more information see: www.ci.uchicago.edu, www.globusonline.org

Ian Foster is the Arthur Holly Compton Distinguished Service Professor of Computer Science at the University of Chicago and an Argonne Distinguished Fellow at Argonne National Laboratory. He is also the Director of the Computation Institute, a joint unit of Argonne and the University. His research is concerned with the acceleration of discovery in a networked world. Foster was a leader in the development and promulgation of concepts and methods that underpin grid computing. These methods allow computing to be delivered reliably and securely on demand, as a service, and permit the formation and operation of virtual organizations linking people and resources worldwide. These results, and the associated Globus open source software, have helped advance discovery in such areas as high energy physics, environmental science, and biomedicine. Grid computing methods have also proved influential outside the world of science, contributing to the emergence of cloud computing. His new Globus Online project seeks to outsource complex and time-consuming research management processes to software-as-a-service providers; the goal here is to make the discovery potential of massive data, exponentially faster computers, and deep interdisciplinary collaboration accessible to every researcher, not just a select few "big science" projects. Dr. Foster is a fellow of the American Association for the Advancement of Science, the Association for Computing Machinery, and the British Computer Society. His awards include the British Computer Society's Lovelace Medal, honorary doctorates from the University of Canterbury, New Zealand, and CINVESTAV, Mexico, and the IEEE Tsutomu Kanai award.

IS&T Colloquium Committee Host: Jim Fischer

Sign language interpreter upon request: 301-286-7040