Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)
Haim Kaplan, Yishay Mansour, Uri Stemmer, Eliad Tsfadia
We present a differentially private learner for halfspaces over a finite grid $G$ in $\mathbb{R}^d$ with sample complexity $\approx d^{2.5}\cdot 2^{\log^*|G|}$, which improves the state-of-the-art result of [Beimel et al., COLT 2019] by a $d^2$ factor. The building block for our learner is a new differentially private algorithm for approximately solving the linear feasibility problem: Given a feasible collection of $m$ linear constraints of the form $Ax \ge b$, the task is to {\em privately} identify a solution $x$ that satisfies {\em most} of the constraints. Our algorithm is iterative, where each iteration determines the next coordinate of the constructed solution $x$.
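To make the coordinate-by-coordinate idea concrete, the following is a minimal, hypothetical sketch, not the paper's actual algorithm: each coordinate of $x$ is chosen from the grid via the exponential mechanism, scoring a candidate value by the number of constraints that could still be satisfied by some completion of the remaining coordinates. The function name, the naive per-coordinate budget split, and the "still-satisfiable" score are all illustrative assumptions.

```python
import numpy as np

def private_feasibility_sketch(A, b, grid, eps, rng=None):
    """Toy illustration of a coordinate-by-coordinate private solver.

    Each coordinate of x is drawn from `grid` with the exponential
    mechanism; the score of a candidate value is the number of
    constraints that remain satisfiable given the coordinates fixed
    so far. This is an assumed sketch, not the algorithm of the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    m, d = A.shape
    lo, hi = grid.min(), grid.max()
    partial = np.zeros(m)        # running value of A[:, :j] @ x[:j]
    x = np.zeros(d)
    eps_step = eps / d           # naive per-coordinate privacy budget split

    for j in range(d):
        scores = []
        rest = A[:, j + 1:]
        # Best-case contribution of the still-unset coordinates k > j.
        best_rest = np.maximum(rest * lo, rest * hi).sum(axis=1)
        for v in grid:
            satisfiable = partial + A[:, j] * v + best_rest >= b
            scores.append(satisfiable.sum())
        scores = np.array(scores, dtype=float)
        # Exponential mechanism; the score has sensitivity 1 when one
        # constraint corresponds to one individual's data.
        probs = np.exp(eps_step * (scores - scores.max()) / 2.0)
        probs /= probs.sum()
        v = grid[rng.choice(len(grid), p=probs)]
        x[j] = v
        partial = partial + A[:, j] * v
    return x
```

For instance, calling `private_feasibility_sketch(A, b, np.linspace(-1, 1, 21), eps=1.0)` on a small feasible system returns a grid point satisfying many (though, due to the noise required for privacy, typically not all) of the constraints; the paper's actual algorithm achieves this with much stronger utility and sample-complexity guarantees.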