Prudence and other conditions on formal language learning
Publication: 912648
DOI: 10.1016/0890-5401(90)90042-G
zbMath: 0698.68072
OpenAlex: W2073020732
MaRDI QID: Q912648
Publication date: 1990
Published in: Information and Computation
Full work available at URL: https://doi.org/10.1016/0890-5401(90)90042-g
Learning and adaptive systems in artificial intelligence (68T05); Formal languages and automata (68Q45)
Related Items (59)
Learning Families of Closed Sets in Matroids
Learnability: Admissible, co-finite, and hypersimple languages
Learning in the presence of inaccurate information
A map of update constraints in inductive inference
Parallel learning of automatic classes of languages
Learning languages in a union
Strongly non-U-shaped language learning results by general techniques
Characterization of language learning from informant under various monotonicity constraints
Learning all subfunctions of a function
Classes with easily learnable subclasses
Prudence in vacillatory language identification
Recursion theoretic models of learning: Some results and intuitions
Iterative learning from texts and counterexamples using additional information
Gold-Style Learning Theory
Prescribed Learning of R.E. Classes
Learning in Friedberg Numberings
Optimal language learning from positive data
Iterative Learning of Simple External Contextual Languages
Numberings Optimal for Learning
Learning how to separate.
When unlearning helps
Learning by switching type of information.
Non-U-shaped vacillatory and team learning
Learning in the presence of partial explanations
Learning in Friedberg numberings
Monotonic and dual monotonic language learning
Iterative learning from positive data and negative counterexamples
A general comparison of language learning from examples and from queries
Language learning without overgeneralization
Learning languages from positive data and a limited number of short counterexamples
Characterizing language identification in terms of computable numberings
Robust separations in inductive inference
Numberings optimal for learning
Learning languages from positive data and a finite number of queries
Hypothesis spaces for learning
On the learnability of recursively enumerable languages from good examples
Learning one-variable pattern languages very efficiently on average, in parallel, and by asking queries
Iterative learning of simple external contextual languages
Incremental learning with temporary memory
Variations on U-shaped learning
On some open problems in monotonic and conservative learning
Control structures in hypothesis spaces: The influence on learning
Input-dependence in function-learning
Hypothesis Spaces for Learning
U-shaped, iterative, and iterative-with-counter learning
Language learning without overgeneralization
Algorithms for learning regular expressions from positive data
Prescribed learning of r.e. classes
Set-driven and rearrangement-independent learning of recursive languages
Learning with refutation
Learning from Streams
Priced Learning
The synthesis of language learners.
Inductive inference of approximations for recursive concepts
Relations between Gold-style learning and query learning
Language learning from texts: Degrees of intrinsic complexity and their characterizations
On the role of update constraints and text-types in iterative learning
Learning languages with decidable hypotheses
Mapping monotonic restrictions in inductive inference