Position: Ph.D. Student

Current Institution: University of California, Berkeley

Abstract:
CoCoA: A Framework for Communication-Efficient Distributed Optimization

Communication remains the most significant bottleneck in the performance of distributed optimization algorithms for large-scale machine learning. In light of this, we propose a communication-efficient framework, CoCoA, that uses local computation in a primal-dual setting to dramatically reduce the amount of necessary communication. We provide a strong convergence rate analysis for this class of algorithms, as well as experiments on real-world distributed datasets with implementations in Apache Spark. We demonstrate the flexibility and empirical performance of CoCoA as compared to state-of-the-art distributed methods, for common objectives such as SVMs, ridge regression, Lasso, sparse logistic regression, and elastic net-regularized objectives.
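The core idea of trading extra local computation for fewer communication rounds can be illustrated with a minimal single-process sketch. This is not the paper's implementation: it assumes ridge regression as the objective, squared-loss stochastic dual coordinate ascent (SDCA) as the local solver, and the conservative averaging form of aggregation; the data partitioning, hyperparameters, and the `local_sdca` helper are all illustrative choices. Each simulated worker runs many cheap dual updates against a stale copy of the shared primal vector, and the workers' primal deltas are combined only once per outer round.

```python
import numpy as np

def local_sdca(X_k, y_k, alpha_k, w, lam, n, H, rng):
    """Run H SDCA steps on one worker's partition (squared loss),
    starting from the last communicated primal vector w.
    Returns the worker's accumulated primal update. (Illustrative
    helper, not the framework's actual local-solver interface.)"""
    w_local = w.copy()
    delta_w = np.zeros_like(w)
    for _ in range(H):
        i = rng.integers(len(y_k))
        x = X_k[i]
        # Closed-form dual coordinate step for squared loss.
        dalpha = (y_k[i] - x @ w_local - alpha_k[i]) / (1.0 + (x @ x) / (lam * n))
        alpha_k[i] += dalpha
        update = dalpha * x / (lam * n)  # primal-dual mapping w = X^T alpha / (lam n)
        w_local += update
        delta_w += update
    return delta_w

rng = np.random.default_rng(0)
n, d, K = 400, 10, 4                      # samples, features, simulated workers
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.01 * rng.standard_normal(n)
lam = 0.1

parts = np.array_split(np.arange(n), K)   # partition examples across workers
alphas = [np.zeros(len(p)) for p in parts]
w = np.zeros(d)

for _ in range(50):                       # one communication per outer round
    deltas = [local_sdca(X[p], y[p], alphas[k], w, lam, n, H=50, rng=rng)
              for k, p in enumerate(parts)]
    w += sum(deltas) / K                  # conservative averaging aggregation
```

The communication pattern is the point: per outer round each worker exchanges only one d-dimensional vector, regardless of how many local coordinate steps H it takes, which is what lets local computation substitute for network traffic.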

Bio:

Virginia Smith is a 5th-year Ph.D. student in the EECS Department at UC Berkeley, where she works jointly with Michael I. Jordan and David Culler as a member of the AMPLab. Her research interests are in large-scale machine learning and distributed optimization. She is actively working to increase the presence of women in computer science, most recently by co-founding the Women in Technology Leadership Round Table (WiT). Virginia has won several awards and fellowships while at Berkeley, including the NSF Fellowship, the Google Anita Borg Memorial Scholarship, the NDSEG Fellowship, and the Tong Leong Lim Pre-Doctoral Prize.