

Poster

Replicable Learning of Large-Margin Halfspaces

Alkis Kalavasis · Amin Karbasi · Kasper Green Larsen · Grigoris Velegkas · Felix Zhou


Abstract: We provide an efficient replicable algorithm for the problem of learning large-margin halfspaces. Our results improve upon the algorithms provided by Impagliazzo, Lei, Pitassi, and Sorrell (STOC, 2022). We design the first dimension-independent replicable algorithm for this task which runs in polynomial time, is proper, and has strictly improved sample complexity compared to the one achieved by Impagliazzo et al. (STOC, 2022) with respect to all the relevant parameters. Moreover, our algorithm has sample complexity that is optimal with respect to the accuracy parameter $\epsilon$. Departing from the requirement of polynomial-time algorithms, using the DP-to-Replicability reduction of Bun et al. (STOC, 2023), we show how to obtain a replicable algorithm for large-margin halfspaces with improved sample complexity with respect to the margin parameter $\tau$, but with running time doubly exponential in $1/\tau^2$ and worse sample complexity dependence on $\epsilon$ than our previous algorithm. We then design an improved algorithm with better sample complexity than both of our previous algorithms and running time exponential in $1/\tau^2$.
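As background for the problem setting (this is not the paper's replicable algorithm), a minimal sketch of learning a large-margin halfspace: the classic perceptron on unit-norm examples. By Novikoff's theorem, if the data are separable by a unit vector with margin $\tau$, the perceptron makes at most $1/\tau^2$ updates, independent of the dimension; this dimension-independence is what makes the large-margin setting tractable. The toy data below are an illustrative assumption, not from the paper.

```python
import numpy as np

def perceptron(X, y, max_passes=100):
    """Classic perceptron on unit-norm examples.

    If the data are separable with margin tau by a unit vector w*,
    the number of updates is at most 1/tau^2 (Novikoff's bound),
    regardless of the ambient dimension.
    """
    w = np.zeros(X.shape[1])
    mistakes = 0
    for _ in range(max_passes):
        updated = False
        for x, label in zip(X, y):
            if label * np.dot(w, x) <= 0:  # misclassified (or on the boundary)
                w += label * x             # standard perceptron update
                mistakes += 1
                updated = True
        if not updated:                    # full pass with no errors: done
            break
    return w, mistakes

# Toy data (hypothetical example): unit-circle points separated by
# w* = (1, 0) with margin at least cos(1) ~ 0.54, so the mistake
# bound guarantees at most 1/cos(1)^2 ~ 3 updates.
rng = np.random.default_rng(0)
angles = rng.uniform(-1.0, 1.0, size=40)   # radians, bounded away from the boundary
X = np.column_stack([np.cos(angles), np.sin(angles)])
y = np.ones(40)
X = np.vstack([X, -X])                     # mirror the points for the negative class
y = np.concatenate([y, -y])

w, mistakes = perceptron(X, y)
print(mistakes, bool(np.all(y * (X @ w) > 0)))
```

The replicability requirement studied in the paper is a separate constraint on top of this: two runs of the learner on independent samples (with shared internal randomness) must output the same hypothesis with high probability, which the plain perceptron above does not provide.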
