Events

Events and Talks at the Henry and Marilyn Taub Faculty of Computer Science

Speaker: Dan Garber (M.Sc. Thesis Seminar)
Date: Wednesday, 21.03.2012, 16:00
Location: Taub 701
Advisor: Dr. E. Hazan
Semidefinite programming (SDP) is a fundamental problem in convex programming with numerous applications. In combinatorial optimization, many approximation algorithms that rely on SDP have been discovered in the past two decades, starting with the work of Goemans and Williamson on MAX-CUT. In machine learning, solving SDPs is at the heart of many learning tasks, such as learning a distance metric and matrix completion. In machine learning in particular, the amounts of data are nowadays huge and the data are often noisy; fast approximation algorithms are therefore preferable to exact generic solvers, such as interior-point methods, whose running times and memory demands are impractical at this scale. In this work we present an approximation algorithm for semidefinite programming that produces low-rank solutions and whose running time is sublinear in the size of the data. Our method builds on ideas developed in the theoretical computer science community for approximately solving Lagrangian relaxations of linear programs. We combine these ideas with recent results in online learning and random sampling to derive our sublinear algorithm. We also present lower bounds showing that the running time of our algorithm is close to optimal in some cases.
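The abstract does not spell out the sublinear algorithm itself; as a rough illustration of the primal-dual flavor it builds on, the sketch below shows a standard matrix multiplicative weights loop for an SDP feasibility problem. This is an assumption-laden illustration, not the speaker's method: it uses an exact most-violated-constraint oracle and an exact matrix exponential, which a sublinear-time algorithm would replace with sampled estimates; the function name mmw_sdp_feasibility and the toy constraints are hypothetical.

import numpy as np
from scipy.linalg import expm

def mmw_sdp_feasibility(constraints, targets, eta=0.1, T=500):
    """Illustrative sketch (not the talk's algorithm): approximately find a
    density matrix X >= 0 with Tr(X) = 1 and Tr(A_j X) >= b_j for all j,
    using matrix multiplicative weights with an exact constraint oracle.
    The averaged iterate approximately satisfies the constraints when the
    feasibility problem has a solution with enough slack."""
    n = constraints[0].shape[0]
    cumulative_loss = np.zeros((n, n))
    avg_X = np.zeros((n, n))
    for t in range(T):
        # Density-matrix iterate: X_t = exp(-eta * L_{t-1}) / Tr(exp(-eta * L_{t-1}))
        W = expm(-eta * cumulative_loss)
        X = W / np.trace(W)
        avg_X += X / T
        # Oracle: find the constraint most violated by the current iterate
        gaps = [b - np.trace(A @ X) for A, b in zip(constraints, targets)]
        j = int(np.argmax(gaps))
        if gaps[j] <= 0:
            continue  # all constraints hold for X_t; keep averaging
        # Loss matrix -A_j boosts directions that raise Tr(A_j X) in later iterates
        cumulative_loss += -constraints[j]
    return avg_X

if __name__ == "__main__":
    # Hypothetical toy instance: unit-trace PSD matrix with two large diagonal entries
    A1 = np.diag([1.0, 0.0, 0.0])
    A2 = np.diag([0.0, 1.0, 0.0])
    X = mmw_sdp_feasibility([A1, A2], targets=[0.45, 0.45])
    print(np.round(X, 3))

In this flavor of algorithm, each iteration needs one call to a violated-constraint oracle and one matrix-exponential computation; the sublinear-time approach described in the talk is aimed precisely at avoiding such full passes over the data by estimating these quantities from random samples.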