Controlling the local false discovery rate in the adaptive Lasso. Academic Article

abstract

  • The Lasso shrinkage procedure achieved its popularity, in part, by its tendency to shrink estimated coefficients to zero, and its ability to serve as a variable selection procedure. Using data-adaptive weights, the adaptive Lasso modified the original procedure to increase the penalty terms for those variables estimated to be less important by ordinary least squares. Although this modified procedure attained the oracle properties, the resulting models tend to include a large number of "false positives" in practice. Here, we adapt the concept of local false discovery rates (lFDRs) so that it applies to the sequence, λn, of smoothing parameters for the adaptive Lasso. We define the lFDR for a given λn to be the probability that the variable added to the model by decreasing λn to λn − δ is not associated with the outcome, where δ is a small value. We derive the relationship between the lFDR and λn, show that the lFDR approaches 1 for traditional smoothing parameters, and show how to select λn so as to achieve a desired lFDR. We compare the smoothing parameters chosen to achieve a specified lFDR and those chosen to achieve the oracle properties, as well as their resulting estimates for model coefficients, with both simulation and an example from a genetic study of prostate-specific antigen.
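    The sketch below is a rough illustration of the mechanics the abstract describes, not code from the paper: it fits an adaptive Lasso by rescaling columns with OLS-based weights and records the order in which variables enter the model as the smoothing parameter decreases. The simulated data, the weight exponent gamma = 1, and the use of scikit-learn's lasso_path are illustrative assumptions only; estimating the lFDR itself is the subject of the paper and is not attempted here.

    ```python
    # Illustrative sketch (assumed setup, not the paper's code): adaptive Lasso
    # via column rescaling, plus the order in which variables enter as the
    # smoothing parameter lambda decreases along the path.
    import numpy as np
    from sklearn.linear_model import LinearRegression, lasso_path

    rng = np.random.default_rng(0)
    n, p, p_true = 200, 20, 5              # hypothetical dimensions
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:p_true] = 1.0                    # first 5 variables are truly associated
    y = X @ beta + rng.standard_normal(n)

    # Step 1: ordinary least squares supplies the data-adaptive weights.
    ols = LinearRegression().fit(X, y)
    gamma = 1.0                            # assumed weight exponent
    w = np.abs(ols.coef_) ** gamma         # larger |OLS coef| => smaller penalty

    # Step 2: adaptive Lasso = plain Lasso on rescaled columns X_j * w_j,
    # with coefficients mapped back by the same factor afterwards.
    X_tilde = X * w
    lambdas, coefs, _ = lasso_path(X_tilde, y, n_alphas=100)   # lambdas descend
    coefs_adaptive = coefs * w[:, None]    # back-transform to the original scale

    # Step 3: record the largest lambda at which each variable enters the model.
    # The paper's lFDR concerns the chance that the variable entering as
    # lambda_n drops to lambda_n - delta is a null (unassociated) variable.
    entry_lambda = np.zeros(p)             # stays 0 if the variable never enters
    for j in range(p):
        nonzero = np.nonzero(coefs_adaptive[j] != 0)[0]
        if nonzero.size:
            entry_lambda[j] = lambdas[nonzero[0]]

    order = np.argsort(-entry_lambda)
    print("variables in order of entry (largest lambda first):", order)
    ```

    In this framing, truly associated variables tend to enter at large values of the smoothing parameter, while the variables added as it is lowered further are increasingly likely to be null, which is why controlling the lFDR amounts to choosing where along this path to stop.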

published proceedings

  • Biostatistics

author list (cited authors)

  • Sampson, J. N., Chatterjee, N., Carroll, R. J., & Müller, S.

citation count

  • 9

complete list of authors

  • Sampson, Joshua N||Chatterjee, Nilanjan||Carroll, Raymond J||Müller, Samuel

publication date

  • September 2013