PAC-learning gains of Turing machines over circuits and neural networks
DOI: 10.1016/j.physd.2022.133585 · OpenAlex: W3137745286 · MaRDI QID: Q2111729
Raphaël M. Jungers, Brieuc Pinon, Jean-Charles Delvenne
Publication date: 17 January 2023
Published in: Physica D
Full work available at URL: https://arxiv.org/abs/2103.12686
Keywords: computational complexity; Kolmogorov complexity; minimum description length; PAC-learning; deep learning; program induction
Cites Work
- Boolean function complexity. Advances and frontiers.
- Reviewing bounds on the circuit size of the hardest functions
- Occam's razor
- The network complexity and the Turing machine complexity of finite functions
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Decision trees do not generalize to new variations
- Circuit-size lower bounds and non-reducibility to sparse sets
- Relations Among Complexity Measures
- Computational Complexity
- Some Results on Average-Case Hardness Within the Polynomial Hierarchy
- Understanding Machine Learning
- Two-Tape Simulation of Multitape Turing Machines
- A formal theory of inductive inference. Part I
- A formal theory of inductive inference. Part II
- An introduction to Kolmogorov complexity and its applications