The Tradeoffs of Large-Scale Machine Learning
Wednesday, October 21, 2009
Building 3 Auditorium - 11:00 AM
(Coffee at 10:30 AM)
During the last decade, data sizes have outgrown processor speed, making computing time the bottleneck. The first part of the presentation theoretically uncovers qualitatively different tradeoffs for small-scale and large-scale learning problems. The large-scale case involves the computational complexity of the underlying optimization algorithms in non-trivial ways. Unlikely optimization algorithms such as stochastic gradient descent show amazing performance on large-scale machine learning problems. The second part presents a detailed overview of stochastic gradient learning algorithms, with both simple and complex examples.
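To make the abstract's point concrete, here is a minimal sketch of stochastic gradient descent for one-dimensional least-squares regression. The function name, data, and hyperparameters are illustrative choices, not taken from the talk; what the sketch shows is the property that makes SGD attractive at scale: each update touches a single example, so the cost per step does not grow with the dataset size.

```python
import random

def sgd_linear_regression(data, lr=0.01, epochs=50, seed=0):
    """Stochastic gradient descent for 1-D least-squares regression.

    Each update uses the gradient of the squared loss on a single
    (x, y) example, so the per-step cost is independent of len(data).
    """
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)                  # visit examples in random order
        for x, y in data:
            err = (w * x + b) - y          # prediction error on one example
            w -= lr * err * x              # gradient step for the weight
            b -= lr * err                  # gradient step for the bias
    return w, b

# Toy noiseless data generated from y = 2x + 1
data = [(x / 10.0, 2 * (x / 10.0) + 1) for x in range(-20, 21)]
w, b = sgd_linear_regression(data)         # recovers w ≈ 2, b ≈ 1
```

A batch gradient method would instead sum the gradient over all examples before each step; when the dataset is very large, the many cheap, noisy steps of SGD typically reduce the test error faster per unit of computing time.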
Léon Bottou received a Diplôme from Ecole Polytechnique, Paris in 1987, a Magistère en Mathématiques Fondamentales et Appliquées et Informatiques from Ecole Normale Supérieure, Paris in 1988, and a PhD in Computer Science from Université de Paris-Sud in 1991. He worked at AT&T Bell Labs from 1991 to 1992 and at AT&T Labs from 1995 to 2002. Between 1992 and 1995 he was chairman of Neuristique in Paris, a small company pioneering machine learning for data mining applications. He has been with NEC Labs America in Princeton since 2002. Léon's primary research interest is machine learning. His contributions to this field address theory, algorithms, and large-scale applications.
Léon's secondary research interest is data compression and coding. His best-known contribution in this field is the DjVu document compression technology (http://www.djvuzone.org). Léon serves on the boards of the Journal of Machine Learning Research and of IEEE Transactions on Pattern Analysis and Machine Intelligence. He also serves on the scientific advisory board of Kxen Inc (http://www.kxen.com). He won the New York Academy of Sciences Blavatnik Award for Young Scientists in 2007.
IS&T Colloquium Committee Host: Tony Gualtieri