Group Variable Selection Methods with Quantile Regression: A Simulation Study.

Section: Research Paper
Published: May 1, 2025
Pages: 114–126

Abstract

In many applications, covariates have a grouping structure that can be exploited to identify important groups and the significant members within those groups. This paper reviews group variable selection methods that utilize quantile regression. Through a simulation study, it compares seven previously proposed group variable selection methods: the group Lasso estimate, the quantile group Lasso (median group Lasso) estimate, the quantile group adaptive Lasso estimate, the sparse group Lasso estimate, the group SCAD estimate, the group MCP estimate, and the group GEL estimate. The simulation study helps determine which of these methods perform best across the considered linear regression scenarios.
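As a concrete illustration (not drawn from the paper itself), the quantile group Lasso combines the quantile check loss with a group-wise L2 penalty. The minimal sketch below, assuming a toy design matrix and hypothetical group assignments, evaluates that penalized objective in Python with NumPy:

```python
import numpy as np

def check_loss(u, tau):
    """Quantile (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0).astype(float))

def qgl_objective(beta, X, y, groups, lam, tau=0.5):
    """Quantile group Lasso objective:
    mean check loss + lam * sum_g sqrt(p_g) * ||beta_g||_2,
    where p_g is the size of group g."""
    resid = y - X @ beta
    loss = np.mean(check_loss(resid, tau))
    penalty = 0.0
    for g in np.unique(groups):
        beta_g = beta[groups == g]
        penalty += np.sqrt(beta_g.size) * np.linalg.norm(beta_g)
    return loss + lam * penalty

# Toy data: 4 observations, 4 predictors split into 2 groups (hypothetical values).
X = np.eye(4)
y = np.array([1.0, -2.0, 0.0, 3.0])
groups = np.array([0, 0, 1, 1])

print(qgl_objective(np.zeros(4), X, y, groups, lam=0.1, tau=0.5))  # → 0.75 (penalty is zero at beta = 0)
```

With tau = 0.5 the check loss reduces to half the absolute residual, so the objective at beta = 0 is simply the median-regression loss of y; a nonzero group adds sqrt(p_g) times its Euclidean norm, scaled by lam. In practice these estimates are computed with dedicated solvers such as the rqPen R package cited in the references, not by direct objective evaluation.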

References

  1. Arnold, T. B., and Tibshirani, R. J. (2016). Efficient implementations of the generalized Lasso dual-path algorithm. Journal of Computational and Graphical Statistics, 25(1), 1–27.
  2. Bakin, S. (1999). Adaptive regression and model selection in data mining problems (Ph.D. thesis), The Australian National University.
  3. Breheny, P., and Huang, J. (2015). Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors. Statistics and Computing, 25, 173–187.
  4. Breheny, P. (2015). The group exponential lasso for bi-level variable selection. Biometrics, 71, 731–740.
  5. Cai, T. T., Zhang, A. R., and Zhou, Y. (2022). Sparse group lasso: Optimal sample complexity, convergence rate, and statistical inference. IEEE Transactions on Information Theory, 68, 5975–6002.
  6. Christidis, A.-A., Lakshmanan, L., Smucler, E., and Zamar, R. (2020). Split regularized regression. Technometrics, 62(3), 330–338.
  7. Ciuperca, G. (2019). Adaptive group LASSO selection in quantile models. Statistical Papers, 60, 173–197.
  8. Fan, J., and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association, 96(456), 1348–1360.
  9. Fujisawa, H., and Eguchi, S. (2008). Robust parameter estimation with a small bias against heavy contamination. Journal of Multivariate Analysis, 99(9), 2053–2081.
  10. Hashem, H., Vinciotti, V., Alhamzawi, R., and Yu, K. (2016). Quantile regression with group lasso for classification. Advances in Data Analysis and Classification, 10, 375–390.
  11. Huang, J., Breheny, P., and Ma, S. (2012). A selective review of group selection in high-dimensional models. Statistical Science, 27, 481–499.
  12. Huang, Y., Tibbe, T., Tang, A., and Montoya, A. (2023). Lasso and group Lasso with categorical predictors: Impact of coding strategy on variable selection and prediction. Journal of Behavioral Data Science, 3(2), 15–42.
  13. Huling, J. D., and Chien, P. (2018). Fast penalized regression and cross-validation for tall data with the OEM package. Journal of Statistical Software.
  14. Kato, K. (2011). Group Lasso for high dimensional sparse quantile regression models. arXiv:1103.1458v2 [stat.ME].
  15. Koenker, R., and Bassett, G. W. (1978). Regression quantiles. Econometrica, 46, 33–50.
  16. Lambert-Lacroix, S., and Zwald, L. (2011). Robust regression through the Huber's criterion and adaptive lasso penalty. Electronic Journal of Statistics, 5, 1015–1053.
  17. Li, J., Liang, K., and Song, X. (2022). Logistic regression with adaptive sparse group Lasso penalty and its application in acute leukemia diagnosis. Computers in Biology and Medicine, 141.
  18. Li, Q., Xi, R., and Lin, N. (2010). Bayesian regularized quantile regression. Bayesian Analysis, 5, 1–24.
  19. McDonald, D. J. (2022). sparsegl: An R package for estimating sparse group Lasso. https://arxiv.org/pdf/2208.02942.pdf.
  20. Qin, Y., Li, S., and Yu, Y. (2017). Penalized maximum tangent likelihood estimation and robust variable selection. https://arxiv.org/pdf/1708.05439.pdf.
  21. Rosset, S., and Zhu, J. (2007). Piecewise linear regularized solution paths. The Annals of Statistics, 35(3), 1012–1030.
  22. Sherwood, B., Li, S., and Maidman, A. (2016). rqPen: Penalized Quantile Regression. R package version 1.4.
  23. Taddy, M. (2017). One-step estimator paths for concave regularization. Journal of Computational and Graphical Statistics, 1–12.
  24. Tibshirani, R. (1996). Regression shrinkage and selection via the Lasso. Journal of the Royal Statistical Society, Series B, 58, 267–288.
  25. Tibshirani, R. J., and Taylor, J. (2011). The solution path of the generalized Lasso. The Annals of Statistics, 39(3), 1335–1371.
  26. Wang, H., Li, G., and Jiang, G. (2007). Robust regression shrinkage and consistent variable selection through the LAD-Lasso. Journal of Business & Economic Statistics, 25, 347–355.
  27. Xiong, S., Dai, B., Huling, J., and Qian, P. Z. (2016). Orthogonalizing EM: A design-based least squares algorithm. Technometrics, 58(3), 285–293.
  28. Yi, C., and Huang, J. (2016). Semismooth Newton coordinate descent algorithm for elastic-net penalized Huber loss regression and quantile regression. Journal of Computational and Graphical Statistics, 26(3), 547–557.
  29. Yu, K., Chen, C. W. S., Reed, C., and Dunson, D. B. (2013). Bayesian variable selection in quantile regression. Statistics and Its Interface, 6, 261–274.
  30. Yuan, M., and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 68, 49–67.
  31. Zhu, W., Levy-Leduc, C., and Ternès, N. (2021). A variable selection approach for highly correlated predictors in high-dimensional genomic data. Bioinformatics, 37(16), 2238–2244.
  32. Zou, H. (2006). The adaptive Lasso and its oracle properties. Journal of the American Statistical Association, 101, 1418–1429.

How to Cite

A. Hashem, H. (2025). Group Variable Selection Methods with Quantile Regression: A Simulation Study. IRAQI JOURNAL OF STATISTICAL SCIENCES, 22(1), 114–126. https://doi.org/10.33899/iqjoss.2025.187759