Local entropy as a measure for sampling solutions in constraint satisfaction problems
Publication: 3302534
DOI: 10.1088/1742-5468/2016/02/023301
zbMath: 1456.94029
arXiv: 1511.05634
OpenAlex: W2262732936
Wikidata: Q61444395 (Scholia: Q61444395)
MaRDI QID: Q3302534
Alessandro Ingrosso, Riccardo Zecchina, Carlo Baldassi, Luca Saglietti, Carlo Lucibello
Publication date: 11 August 2020
Published in: Journal of Statistical Mechanics: Theory and Experiment
Full work available at URL: https://arxiv.org/abs/1511.05634
Classification (MSC):
- Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.) (68T20)
- Measures of information, entropy (94A17)
- Stochastic analysis in statistical mechanics (82M60)
Related Items
- Deep relaxation: partial differential equations for optimizing deep neural networks
- Entropy-SGD: biasing gradient descent into wide valleys
- Clustering of solutions in the symmetric binary perceptron
- Shaping the learning landscape in neural networks around wide flat minima
- Biased landscapes for random constraint satisfaction problems
- Biased measures for random constraint satisfaction problems: larger interaction range and asymptotic expansion
- Wide flat minima and optimal generalization in classifying high-dimensional Gaussian mixtures
- Optimization of the dynamic transition in the continuous coloring problem
- Entropic gradient descent algorithms and wide flat minima
Cites Work
- Generalization learning in a perceptron with binary synapses
- Entropy landscape of solutions in the binary perceptron problem
- A Max-Sum algorithm for training discrete neural networks
- Information, Physics, and Computation
- Survey propagation: An algorithm for satisfiability
- Determining computational complexity from characteristic ‘phase transitions’
- Gibbs states and the set of solutions of random constraint satisfaction problems
- Statistical mechanics methods and phase transitions in optimization problems