Large Scale Matrix Analysis and Inference
Reza Zadeh · Gunnar Carlsson · Michael Mahoney · Manfred K. Warmuth · Wouter M Koolen · Nati Srebro · Satyen Kale · Malik Magdon-Ismail · Ashish Goel · Matei A Zaharia · David Woodruff · Ioannis Koutis · Benjamin Recht

Mon Dec 09 07:30 AM -- 06:30 PM (PST) @ Harvey's Tallac
Event URL: http://largematrix.org

Much of Machine Learning is based on Linear Algebra.
Often, the prediction is a function of a dot product between
the parameter vector and the feature vector. This essentially
assumes some kind of independence between the features.
In contrast, matrix parameters can be used to learn interrelations
between features: the (i,j)th entry of the parameter matrix
represents how feature i is related to feature j.
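The contrast above can be sketched in a few lines of NumPy. This is an illustrative toy (random data, hypothetical names): a vector parameter w yields the dot-product prediction, while a matrix parameter W yields a bilinear form whose (i,j)th entry weights the interaction between features i and j.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4)          # feature vector
w = rng.normal(size=4)          # vector parameter: one weight per feature
W = rng.normal(size=(4, 4))     # matrix parameter: W[i, j] weighs the
                                # interaction between features i and j

linear_pred = w @ x             # dot product: features enter independently
bilinear_pred = x @ W @ x       # sums W[i, j] * x[i] * x[j] over all pairs

# the bilinear form is exactly the weighted sum over feature pairs
manual = sum(W[i, j] * x[i] * x[j] for i in range(4) for j in range(4))
assert np.isclose(bilinear_pred, manual)
```

The assertion at the end spells out what the matrix parameter buys: every pairwise feature interaction gets its own weight, which the single dot product cannot express.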

This richer modeling has become very popular. In some applications,
like PCA and collaborative filtering, the explicit goal is inference
of a matrix parameter. In others, like direction learning and
topic modeling, the matrix parameter instead arises within the
algorithms as the natural tool to represent uncertainty.
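As a concrete instance of "inference of a matrix parameter," PCA can be viewed as recovering the best low-rank approximation of a centered data matrix. The sketch below (synthetic data; a minimal illustration, not a method proposed by the workshop) uses the truncated SVD, which by the Eckart-Young theorem gives the optimal rank-k factors.

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic data: 100 samples of a nearly rank-2 signal plus small noise
basis = rng.normal(size=(2, 10))
X = rng.normal(size=(100, 2)) @ basis + 0.01 * rng.normal(size=(100, 10))
X -= X.mean(axis=0)             # center the data, as PCA requires

# truncated SVD: the rank-2 factors are the matrix parameter PCA infers
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X2 = U[:, :2] * s[:2] @ Vt[:2]  # best rank-2 approximation (Eckart-Young)

# nearly all variance is captured by the two leading components
assert np.linalg.norm(X - X2) / np.linalg.norm(X) < 0.05
```

At this scale the full SVD is trivial; the workshop's subject is precisely what replaces it when the matrix no longer fits on one machine (randomized sketches, distributed factorizations, and the like).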

The emergence of large matrices in many applications has
brought with it a slew of new algorithms and tools.
Over the past few years, matrix analysis and numerical linear
algebra on large matrices have become a thriving field.
Manipulating such large matrices also makes it necessary
to think about computer-systems issues.

This workshop aims to bring together researchers in large-scale
machine learning and large-scale numerical linear algebra to
foster cross-talk between the two fields. The goal is to
encourage machine learning researchers to work on numerical
linear algebra problems, to inform machine learning researchers
about new developments in large-scale matrix analysis, and to
identify unique challenges and opportunities. The workshop will
conclude with a session of contributed posters.


Author Information

Reza Zadeh (Matroid)

Reza Bosagh Zadeh is Founder CEO at Matroid and an Adjunct Professor at Stanford University. His work focuses on Machine Learning, Distributed Computing, and Discrete Applied Mathematics. Reza received his PhD in Computational Mathematics from Stanford under the supervision of Gunnar Carlsson. His awards include a KDD Best Paper Award and the Gene Golub Outstanding Thesis Award. He has served on the Technical Advisory Boards of Microsoft and Databricks. As part of his research, Reza built the Machine Learning Algorithms behind Twitter's who-to-follow system, the first product to use Machine Learning at Twitter. Reza is the initial creator of the Linear Algebra Package in Apache Spark. Through Apache Spark, Reza's work has been incorporated into industrial and academic cluster computing environments. In addition to research, Reza designed and teaches two PhD-level classes at Stanford: Distributed Algorithms and Optimization (CME 323), and Discrete Mathematics and Algorithms (CME 305).

Gunnar Carlsson (Stanford University)
Michael Mahoney (Stanford University)
Manfred K. Warmuth (Univ. of Calif. at Santa Cruz)
Wouter M Koolen (Centrum Wiskunde & Informatica)
Nati Srebro (TTI-Chicago)
Satyen Kale (Google)
Malik Magdon-Ismail (RPI)
Ashish Goel (Stanford University)
Matei A Zaharia (UC Berkeley)
David Woodruff (IBM Research)
Ioannis Koutis (University of Puerto Rico - Rio Piedras)
Benjamin Recht (UW-Madison)