Samet Oymak
Verified email at ece.ucr.edu - Homepage
Title | Cited by | Year
Simultaneously structured models with application to sparse and low-rank matrices
S Oymak, A Jalali, M Fazel, YC Eldar, B Hassibi
IEEE Transactions on Information Theory 61 (5), 2886-2908, 2015
Cited by 263, 2015
Towards moderate overparameterization: global convergence guarantees for training shallow neural networks
S Oymak, M Soltanolkotabi
IEEE Journal on Selected Areas in Information Theory, 2020
Cited by 153, 2020
Recovery of sparse 1-D signals from the magnitudes of their Fourier transform
K Jaganathan, S Oymak, B Hassibi
2012 IEEE International Symposium on Information Theory Proceedings, 1473-1477, 2012
Cited by 151, 2012
Gradient descent with early stopping is provably robust to label noise for overparameterized neural networks
M Li, M Soltanolkotabi, S Oymak
The 23rd International Conference on Artificial Intelligence and Statistics, 2020
Cited by 130, 2020
The squared-error of generalized lasso: A precise analysis
S Oymak, C Thrampoulidis, B Hassibi
51st Annual Allerton Conference on Communication, Control, and Computing …, 2013
Cited by 115, 2013
Regularized linear regression: A precise analysis of the estimation error
C Thrampoulidis, S Oymak, B Hassibi
Conference on Learning Theory, 1683-1709, 2015
Cited by 107, 2015
Sparse phase retrieval: Convex algorithms and limitations
K Jaganathan, S Oymak, B Hassibi
IEEE International Symposium on Information Theory, 2013
Cited by 107, 2013
Non-asymptotic identification of LTI systems from a single trajectory
S Oymak, N Ozay
The American Control Conference, 2019
Cited by 102, 2019
A simplified approach to recovery conditions for low rank matrices
S Oymak, K Mohan, M Fazel, B Hassibi
2011 IEEE International Symposium on Information Theory Proceedings, 2318-2322, 2011
Cited by 99, 2011
Universality laws for randomized dimension reduction, with applications
S Oymak, JA Tropp
Information and Inference: A Journal of the IMA 7 (3), 337-446, 2018
Cited by 95, 2018
Overparameterized Nonlinear Learning: Gradient Descent Takes the Shortest Path?
S Oymak, M Soltanolkotabi
36th International Conference on Machine Learning, 2019
Cited by 94, 2019
Sharp Time–Data Tradeoffs for Linear Inverse Problems
S Oymak, B Recht, M Soltanolkotabi
IEEE Transactions on Information Theory, 2018
Cited by 92, 2018
Sharp MSE bounds for proximal denoising
S Oymak, B Hassibi
Foundations of Computational Mathematics 16 (4), 965-1029, 2016
Cited by 89*, 2016
Sparse phase retrieval: Uniqueness guarantees and recovery algorithms
K Jaganathan, S Oymak, B Hassibi
IEEE Transactions on Signal Processing 65 (9), 2402-2410, 2017
Cited by 82, 2017
New null space results and recovery thresholds for matrix rank minimization
S Oymak, B Hassibi
arXiv preprint arXiv:1011.6326, 2010
Cited by 81, 2010
Sharp MSE Bounds for Proximal Denoising
S Oymak, B Hassibi
Foundations of Computational Mathematics, 2013
Cited by 80, 2013
Parallel correlation clustering on big graphs
X Pan, D Papailiopoulos, S Oymak, B Recht, K Ramchandran, MI Jordan
Neural Information Processing Systems (NIPS), 2015
Cited by 73, 2015
Finding dense clusters via low rank + sparse decomposition
S Oymak, B Hassibi
arXiv preprint arXiv:1104.5186, 2011
Cited by 66, 2011
Isometric sketching of any set via the Restricted Isometry Property
S Oymak, B Recht, M Soltanolkotabi
Information and Inference: A Journal of the IMA, 2018
Cited by 38, 2018
Graph clustering with missing data: Convex algorithms and analysis
R Korlakai Vinayak, S Oymak, B Hassibi
Advances in Neural Information Processing Systems 27, 2996-3004, 2014
Cited by 38, 2014