
Malgorzata Bogdan

Professor


The Strong Screening Rule For SLOPE

Authors

  • Johan Larsson
  • Malgorzata Bogdan
  • Jonas Wallin

Summary, in English

Extracting relevant features from data sets where the number of observations n is much smaller than the number of predictors p is a major challenge in modern statistics. Sorted L-One Penalized Estimation (SLOPE), a generalization of the lasso, is a promising method within this setting. Current numerical procedures for SLOPE, however, lack the efficiency that the corresponding tools for the lasso enjoy, particularly in the context of estimating a complete regularization path. A key component in the efficiency of the lasso is predictor screening rules: rules that allow predictors to be discarded before estimating the model. This is the first paper to establish such a rule for SLOPE. We develop a screening rule for SLOPE by examining its subdifferential and show that this rule is a generalization of the strong rule for the lasso. Our rule is heuristic, which means that it may discard predictors erroneously. In our paper, however, we show that such situations are rare and easily safeguarded against by a simple check of the optimality conditions. Our numerical experiments show that the rule performs well in practice, leading to improvements by orders of magnitude for data in the p >> n domain, as well as incurring no additional computational overhead when n > p.
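The strong rule for the lasso, which the paper's SLOPE rule generalizes, can be sketched as follows. This is a minimal NumPy illustration under standard assumptions (standardized predictors, squared-error loss), not the authors' SLOPE implementation; the function and variable names are my own:

```python
import numpy as np

def strong_rule_lasso(X, residual, lam_new, lam_prev):
    """Strong screening rule for the lasso.

    A predictor j is discarded before fitting at penalty lam_new if
    |x_j^T r| < 2 * lam_new - lam_prev, where r is the residual from
    the fit at the previous (larger) penalty lam_prev. Returns a
    boolean mask of predictors to KEEP. The rule is heuristic, so a
    solver should verify the optimality conditions afterwards.
    """
    c = np.abs(X.T @ residual)          # gradient correlations at lam_prev
    return c >= 2 * lam_new - lam_prev  # keep only likely-active predictors

# Toy example with n = 5 observations and p = 3 predictors.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
y = rng.standard_normal(5)
residual = y                            # null-model residual at lam_max
lam_prev = np.max(np.abs(X.T @ y))      # lam_max: all coefficients are zero
keep = strong_rule_lasso(X, residual, 0.9 * lam_prev, lam_prev)
```

For SLOPE, the penalty is a decreasing sequence rather than a scalar, so the paper's generalization replaces this componentwise threshold with a check on cumulative sums of the sorted gradient against the sorted penalty sequence; the safeguard against erroneous discards (re-checking the optimality conditions) carries over unchanged.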

Department(s)

  • Department of Statistics

Publication year

2020-12

Language

English

Pages

1-12

Publication/Journal/Series

Advances in Neural Information Processing Systems

Document type

Journal article

Publisher

Morgan Kaufmann Publishers

Subject

  • Probability Theory and Statistics
  • Computational Mathematics

Keywords

  • screening rules
  • lasso
  • regression
  • regularization

Conference name

Neural Information Processing Systems

Status

Published

Project

  • Optimization and Algorithms for Sparse Regression

ISBN/ISSN/Other

  • ISSN: 1049-5258