Intermediate value linearizability: a quantitative correctness criterion
From MaRDI portal
DOI: 10.4230/LIPIcs.DISC.2020.2
zbMATH Open: 1543.68457
MaRDI QID: Q6534999
Publication date: 2 November 2023
Mathematics Subject Classification: Data structures (68P05); Distributed algorithms (68W15); Computational aspects of data analysis and big data (68T09)
Cites Work
- The time complexity of updating snapshot memories
- Approximate counting: a detailed analysis
- On interprocess communication. II: Algorithms
- Synopses for Massive Data: Samples, Histograms, Wavelets, Sketches
- Quantitative relaxation of concurrent data structures
- Mergeable summaries
- Algorithms for distributed functional monitoring
- Counting large numbers of events in small registers
- Unifying Concurrent Objects and Distributed Tasks
- An improved data stream summary: the count-min sketch and its applications
- Strongly Linearizable Implementations of Snapshots and Other Types
- Database Theory - ICDT 2005
- The complexity of updating multi-writer snapshot objects
- Linearizable implementations do not suffice for randomized distributed computation