🤖 AI Summary
To address low accuracy in feature selection and inefficiency in hyperparameter tuning for the Cox model, this paper proposes a direct optimization method for the regularization parameter based on the square-root Cox partial likelihood. We introduce the square-root partial likelihood into survival analysis for the first time, making the regularization parameter asymptotically pivotal and revealing a phase-transition phenomenon, akin to compressed sensing, in correct variable recovery. The method combines LASSO regularization with either the linear Cox model or neural network-based joint modeling, and circumvents high-variance tuning strategies such as cross-validation and information criteria. Empirical results demonstrate that our approach significantly outperforms CV-LASSO and BIC-based subset selection in both exact variable identification probability and model stability, while preserving statistical interpretability and delivering competitive predictive performance.
📝 Abstract
We revisit Cox's proportional hazards model and the LASSO with the aim of improving feature selection in survival analysis. Unlike traditional methods relying on cross-validation or BIC, the penalty parameter $λ$ is tuned directly for feature selection and is asymptotically pivotal because we take the square root of Cox's partial likelihood. Substantially improving over both cross-validated LASSO and BIC subset selection, our approach exhibits a phase transition in the probability of retrieving all and only the relevant features, as in compressed sensing. The method can be employed with linear models as well as with artificial neural networks.
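To make the objective concrete: the method penalizes the *square root* of the negative Cox partial log-likelihood with an ℓ1 term, so that a single fixed $λ$ (rather than a cross-validated one) drives feature selection. The sketch below is a minimal illustration of that idea, not the authors' implementation: it fits $\sqrt{-\ell(\beta)} + λ\|\beta\|_1$ by proximal gradient descent, where $\ell$ is the Breslow partial log-likelihood without tie correction. The optimizer, step size, and choice of $λ$ are all illustrative assumptions.

```python
import numpy as np

def neg_cox_loglik(beta, X, time, event):
    """Negative Cox partial log-likelihood (Breslow form, no tie handling)."""
    order = np.argsort(-time)              # sort by decreasing time
    X, event = X[order], event[order].astype(bool)
    eta = X @ beta
    log_risk = np.log(np.cumsum(np.exp(eta)))   # log-sum over each risk set
    return float(np.sum(log_risk[event] - eta[event]))

def grad_neg_cox_loglik(beta, X, time, event):
    """Analytic gradient of the negative partial log-likelihood."""
    order = np.argsort(-time)
    X, event = X[order], event[order].astype(bool)
    w = np.exp(X @ beta)
    cum_w = np.cumsum(w)                        # risk-set weight sums
    cum_wx = np.cumsum(w[:, None] * X, axis=0)  # risk-set weighted covariates
    return np.sum(cum_wx[event] / cum_w[event, None] - X[event], axis=0)

def sqrt_cox_lasso(X, time, event, lam, n_iter=500, lr=0.1):
    """Proximal-gradient sketch: minimize sqrt(neg loglik) + lam * ||beta||_1.

    The square root rescales the smooth part so that a fixed `lam`
    (no cross-validation) can act as the selection threshold.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        nll = neg_cox_loglik(beta, X, time, event)
        g = grad_neg_cox_loglik(beta, X, time, event) / (2.0 * np.sqrt(nll) + 1e-12)
        z = beta - lr * g
        beta = np.sign(z) * np.maximum(np.abs(z) - lr * lam, 0.0)  # soft-threshold
    return beta
```

On synthetic data with two active covariates among five (e.g. exponential survival times with hazard $\exp(x^\top\beta)$), a fixed $λ$ of roughly $0.2$ recovers the two active coefficients with the correct signs while shrinking the noise coefficients toward zero; the specific data-generating process and $λ$ value here are assumptions for illustration only.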