Computer vision. Models, learning, and inference. Foreword by Andrew Fitzgibbon. (Q5891604)
From MaRDI portal
scientific article; zbMATH DE number 6046673
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Computer vision. Models, learning, and inference. Foreword by Andrew Fitzgibbon. | scientific article; zbMATH DE number 6046673 | |
Statements
15 June 2012
computer vision
machine learning
probabilistic graphical models
Computer vision. Models, learning, and inference. Foreword by Andrew Fitzgibbon. (English)
The main goal of this book is to provide an overview of computer-vision algorithms that subdivides these algorithms into three main elements: a model, an inference procedure, and a learning procedure. I think this is a very commendable goal: the computer-vision literature is an enormous zoo of different algorithms, which makes it hard for students to see the relations between the different techniques. Because of its consistent subdivision of algorithms into these three elements, the book does an excellent job of pointing out the relations between the various computer-vision algorithms, which in my view makes it a must-read for anyone in computer vision.

The book consists of two main parts, each of which is in turn subdivided into three subparts. The first part presents probabilistic machine learning with applications to computer vision, whereas the second part covers more traditional computer-vision topics.

The first part of the book covers probabilistic machine learning in a similar way to the book by Bishop from 2006. In particular, it comprises (1) a general introduction to probability theory and to learning in probabilistic models, with a focus on topics such as Bayes' rule, conjugate priors, and learning via maximum a posteriori estimation; (2) an introduction to standard probabilistic machine-learning models such as (Bayesian) logistic regression, linear and Gaussian process regression, mixture-of-experts models and expectation-maximization, as well as to non-probabilistic models such as boosting, decision trees, and random forests; and (3) probabilistic graphical models with applications in computer vision, such as conditional and Markov random fields, as well as popular inference techniques such as graph cuts and the alpha-expansion algorithm. Each chapter in the first part of the book concludes with a section that discusses applications of the presented techniques to computer vision.

The second part of the book covers more traditional computer-vision topics that one would also find in, e.g., the book by Szeliski from 2010. Specifically, it contains (1) an introduction to image-processing techniques such as histogram equalization, linear filtering, and interest-point detection and description; (2) an introduction to geometric computer-vision models, such as the pinhole camera model, image homographies, epipolar geometry, 3D reconstruction, and bundle adjustment; and (3) a collection of computer-vision models including deformable template models, articulated models, Gaussian process latent variable models, bilinear models, Kalman and particle filters, and visual-word models (with a strong focus on topic models).

Because of the book's strong focus on probabilistic graphical models, and because the more traditional computer-vision topics are deferred to the second part, the book is most suited for students and practitioners who already have some prior exposure to image processing and computer vision. By contrast, undergraduate students may initially miss the connection between all the theory that is presented and the computer-vision problems they were expecting to learn about.

Each chapter contains a number of exercises as well as references to additional reading, and the book includes helpful appendices on linear algebra and optimization. All topics are covered comprehensively, and the text contains many beautiful and helpful illustrations, which makes the book a pleasure to read.
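As a brief illustration not taken from the review or the book itself, the following sketch shows the kind of model/learning/inference framing the review describes, using maximum a posteriori learning with a conjugate prior for a Bernoulli observation model; the Beta-Bernoulli pairing and the symbols (theta, x_i, alpha, beta, N) are illustrative assumptions rather than the book's own notation.

```latex
% Minimal sketch (illustrative, not from the book): MAP learning with a conjugate prior.
% Model: Bernoulli likelihood on observations x_1..x_N with parameter \theta,
% and a Beta(\alpha, \beta) prior, which is conjugate to the Bernoulli likelihood.
\[
  \underbrace{p(\theta \mid x_{1:N})}_{\text{posterior}}
  \;\propto\;
  \underbrace{\prod_{i=1}^{N} \theta^{x_i} (1-\theta)^{1-x_i}}_{\text{likelihood}}
  \;\cdot\;
  \underbrace{\theta^{\alpha-1} (1-\theta)^{\beta-1}}_{\text{Beta}(\alpha,\beta)\ \text{prior}}
\]
% Learning step: the MAP estimate is the mode of the resulting Beta posterior.
\[
  \hat{\theta}_{\mathrm{MAP}}
  \;=\; \arg\max_{\theta} \, p(\theta \mid x_{1:N})
  \;=\; \frac{\sum_{i=1}^{N} x_i + \alpha - 1}{N + \alpha + \beta - 2}.
\]
```

Setting alpha = beta = 1 (a flat prior) recovers the maximum likelihood estimate, which shows how the prior acts as a regularizer in the learning step.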