Statistica Sinica 33 (2023), 1219-1232
Zhouping Li, Jinfeng Xu, Na Zhao and Wang Zhou
Abstract: The jackknife empirical likelihood (JEL) is an attractive approach for statistical inference with nonlinear statistics, such as U-statistics. However, most contemporary problems involve high-dimensional model selection, and the feasibility of this approach, in both theory and practice, remains largely unexplored when the number of parameters diverges to infinity. In this paper, we propose a penalized JEL method that preserves the main advantages of the JEL and leads to reliable variable selection based on estimating equations with a U-statistic structure in high-dimensional settings. Under certain regularity conditions, we establish the asymptotic theory and oracle property for the JEL and its penalized version when the numbers of estimating equations and parameters increase with the sample size. Simulation studies and a real-data analysis are used to examine the performance of the proposed methods and illustrate their practical utility.
Keywords: Estimating equations, high-dimensional data analysis, jackknife empirical likelihood, penalized likelihood, U-statistics, variable selection.
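For readers unfamiliar with the JEL, the display below sketches its standard construction for a univariate U-statistic of degree two; this is the classical fixed-dimensional setup that the paper's penalized, growing-dimension method builds on, and the notation here is illustrative rather than taken from the paper. Given an i.i.d. sample $X_1,\dots,X_n$ and a U-statistic $U_n=\binom{n}{2}^{-1}\sum_{1\le i<j\le n}h(X_i,X_j)$ estimating $\theta_0=\mathrm{E}\,h(X_1,X_2)$, the jackknife pseudo-values are
\[
\widehat V_i = nU_n-(n-1)U_n^{(-i)},\qquad i=1,\dots,n,
\]
where $U_n^{(-i)}$ denotes the same U-statistic computed with $X_i$ deleted. Treating the $\widehat V_i$ as approximately independent, the JEL ratio at a candidate value $\theta$ is
\[
R(\theta)=\max\Bigl\{\prod_{i=1}^n np_i:\ p_i\ge 0,\ \sum_{i=1}^n p_i=1,\ \sum_{i=1}^n p_i\bigl(\widehat V_i-\theta\bigr)=0\Bigr\},
\]
and in the fixed-dimensional case $-2\log R(\theta_0)$ is asymptotically $\chi^2_1$, which yields tests and confidence regions without explicit variance estimation.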