I am an Associate Professor at the Department of Automatic Control. My main research interests lie within optimization and its wide range of applications.
P. Giselsson, Nonlinear Forward-Backward Splitting with Projection Correction. Submitted. 2019.
M. Morin and P. Giselsson, SVAG: Unified Convergence Results for SAG-SAGA Interpolation with Stochastic Variance Adjusted Gradient Descent. Submitted. 2019.
E. Ryu, A. Taylor, C. Bergeling, and P. Giselsson, Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection. Submitted.
C. Grussler and P. Giselsson, Efficient Proximal Mapping Computation for Unitarily Invariant Low-Rank Inducing Norms. Submitted. 2018.
C. Grussler and P. Giselsson, Low-Rank Inducing Norms with Optimality Interpretations. SIAM Journal on Optimization, 28(4):3057-3078, 2018.
M. Fält and P. Giselsson, Optimal Convergence Rates for Generalized Alternating Projections. In Proceedings of the 56th Conference on Decision and Control, Melbourne, Australia, Dec 2017.
P. Giselsson, Tight Global Linear Convergence Rate Bounds for Douglas-Rachford Splitting. Journal of Fixed Point Theory and Applications, 2017. doi:10.1007/s11784-017-0417-1.
P. Giselsson and M. Fält, Envelope Functions: Unifications and Further Properties. Journal of Optimization Theory and Applications, 178(3):673-698, 2018.
C. Grussler, A. Rantzer, and P. Giselsson, Low-Rank Optimization with Convex Constraints. IEEE Transactions on Automatic Control, 63(11):4000-4007, 2018.
P. Giselsson and S. Boyd, Linear Convergence and Metric Selection in Douglas-Rachford Splitting and ADMM. IEEE Transactions on Automatic Control, 62(2):532-544, February 2017.
P. Giselsson, M. Fält, and S. Boyd, Line Search for Averaged Operator Iteration. In Proceedings of the 55th Conference on Decision and Control, Las Vegas, USA, Dec 2016.
P. Giselsson and S. Boyd, Metric Selection in Fast Dual Forward-Backward Splitting. Automatica, 62:1-10, December 2015.
During the late fall of 2015, I taught a course on large-scale convex optimization.
In May 2019, I taught one of three modules, called Training Algorithms, in the WASP course on Deep Learning and GANs.