Laplacian-based semi-supervised learning in multilayer hypergraphs by coordinate descent
Publication: 6491346
DOI: 10.1016/J.EJCO.2023.100079
MaRDI QID: Q6491346
Francesco Tudisco, Francesco Rinaldi, Andrea Cristofari, Sara Venturini
Publication date: 24 April 2024
Published in: EURO Journal on Computational Optimization
Cites Work
- Parallel coordinate descent methods for big data optimization
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- The 2-coordinate descent method for solving double-sided simplex constrained minimization problems
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- Networks beyond pairwise interactions: structure and dynamics
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On the convergence of the coordinate descent method for convex differentiable minimization
- A nodal domain theorem and a higher-order Cheeger inequality for the graph \(p\)-Laplacian
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- A decomposition method for Lasso problems with zero-sum constraint
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- Analysis and algorithms for \(\ell_p\)-based semi-supervised learning on graphs
- Block coordinate descent for smooth nonconvex constrained minimization
- Randomness and permutations in coordinate descent methods
- Coordinate descent algorithms
- On the convergence of inexact block coordinate descent methods for constrained optimization
- Random block coordinate descent methods for linearly constrained optimization over networks
- An almost cyclic 2-coordinate descent method for singly linearly constrained problems
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- A convergent decomposition algorithm for support vector machines
- Nodal domain count for the generalized graph \(p\)-Laplacian
- A Fast Active Set Block Coordinate Descent Algorithm for \(\ell_1\)-Regularized Least Squares
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Graph-Based Semi-Supervised Learning
- Globally convergent block-coordinate techniques for unconstrained optimization
- Community Detection in Networks via Nonlinear Modularity Eigenvectors
- Total Variation Based Community Detection Using a Nonlinear Optimization Approach
- Analysis of \(p\)-Laplacian Regularization in Semisupervised Learning
- On the convergence of sequential minimization algorithms
- Benchmarking optimization software with performance profiles