Synthesizing context-free grammars from recurrent neural networks
From MaRDI portal
Publication: 2044212
DOI: 10.1007/978-3-030-72016-2_19 · zbMath: 1467.68077 · arXiv: 2101.08200 · OpenAlex: W3135979110 · MaRDI QID: Q2044212
Publication date: 4 August 2021
Full work available at URL: https://arxiv.org/abs/2101.08200
Computational learning theory (68Q32) ⋮ Learning and adaptive systems in artificial intelligence (68T05) ⋮ Formal languages and automata (68Q45) ⋮ Grammars and rewriting systems (68Q42)
Related Items (4)
Unsupervised and few-shot parsing from pretrained language models ⋮ Learning finite state models from recurrent neural networks ⋮ A survey of model learning techniques for recurrent neural networks ⋮ Synthesizing context-free grammars from recurrent neural networks
Cites Work
- Learning regular sets from queries and counterexamples
- On the computational power of neural nets
- Synthesizing context-free grammars from recurrent neural networks
- A Polynomial Algorithm for the Inference of Context Free Languages
- Inductive inference of formal languages from positive data
- Rule Extraction from Recurrent Neural Networks: A Taxonomy and Review
- Language identification in the limit