Nonbifurcating Phylogenetic Tree Inference via the Adaptive LASSO
From MaRDI portal
Publication: 4999164
DOI: 10.1080/01621459.2020.1778481
zbMath: 1464.62480
arXiv: 1805.11073
OpenAlex: W3035455049
Wikidata: Q112651981
Scholia: Q112651981
MaRDI QID: Q4999164
Vu Dinh, Cheng Zhang, Frederick A. Matsen IV
Publication date: 6 July 2021
Published in: Journal of the American Statistical Association
Full work available at URL: https://arxiv.org/abs/1805.11073
Mathematics Subject Classification:
- 62J07 Ridge regression; shrinkage estimators (Lasso)
- 62P10 Applications of statistics to biology and medical sciences; meta analysis
- 62F35 Robustness and adaptive procedures (parametric inference)
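The classification above places this work under shrinkage estimators (Lasso). As an illustration only — not the paper's algorithm, which applies the penalty to phylogenetic branch lengths — here is a minimal NumPy sketch of the adaptive LASSO of Zou (cited below): an initial estimate supplies per-coefficient weights for a weighted L1 penalty, minimized here by cyclic coordinate descent. Function names (`lasso_cd`, `adaptive_lasso`) are hypothetical.

```python
import numpy as np

def lasso_cd(X, y, lam, weights, n_iter=200):
    # Weighted LASSO via cyclic coordinate descent:
    #   minimize (1/2n) * ||y - X b||^2 + lam * sum_j weights[j] * |b_j|
    n, p = X.shape
    b = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n  # per-column curvature
    r = y - X @ b                        # current residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]          # remove coordinate j from the fit
            rho = X[:, j] @ r / n        # univariate least-squares direction
            thr = lam * weights[j]       # per-coefficient soft threshold
            b[j] = np.sign(rho) * max(abs(rho) - thr, 0.0) / col_norm[j]
            r -= X[:, j] * b[j]          # restore residual with updated b_j
    return b

def adaptive_lasso(X, y, lam, gamma=1.0):
    # Step 1: an initial consistent estimate (OLS, assuming n > p).
    b_init, *_ = np.linalg.lstsq(X, y, rcond=None)
    # Step 2: weights w_j = 1 / |b_init_j|^gamma penalize small
    # initial coefficients heavily, which is what yields the oracle property.
    w = 1.0 / (np.abs(b_init) ** gamma + 1e-8)
    return lasso_cd(X, y, lam, w)

# Usage sketch: a sparse linear model with true support {0, 3}.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
beta_true = np.array([3.0, 0.0, 0.0, 1.5, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
b = adaptive_lasso(X, y, lam=0.1)
# The adaptive weights typically drive the three null coefficients exactly to zero.
```

The rescaling-by-weights step is what distinguishes this from a plain LASSO: coordinates with small initial estimates face a much larger threshold and are eliminated, while large coefficients are shrunk only slightly.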
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Support recovery without incoherence: a case for nonconvex regularization
- One-step sparse estimates in nonconcave penalized likelihood models
- Discussion: One-step sparse estimates in nonconcave penalized likelihood models
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Full reconstruction of Markov models on evolutionary trees: identifiability and consistency
- Relaxed sparse eigenvalue conditions for sparse estimation via non-convex regularized regression
- Least angle regression. (With discussion)
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- A molecular sequence metric and evolutionary trees
- Adaptive restart for accelerated gradient schemes
- Invariants of some probability models used in phylogenetic inference
- Simultaneous analysis of Lasso and Dantzig selector
- Consistency and convergence rate of phylogenetic inference via regularization
- Calibrating nonconvex penalized regression in ultra-high dimension
- Identifying evolutionary trees and substitution parameters for the general Markov model with invariable sites
- High-dimensional graphs and variable selection with the Lasso
- Strong oracle optimality of folded concave penalized estimation
- Variable selection using MM algorithms
- SparseNet: Coordinate Descent With Nonconvex Penalties
- A Global Łojasiewicz Inequality for Algebraic Varieties
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Identifiability of a Markovian model of molecular evolution with gamma-distributed rates
- Smoothly Clipped Absolute Deviation on High Dimensions
- Signal Recovery by Proximal Forward-Backward Splitting
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- A general theory of concave regularization for high-dimensional sparse estimation problems
This page was built for publication: Nonbifurcating Phylogenetic Tree Inference via the Adaptive LASSO