Pattern Recognition, 2ed, An Indian Adaptation

Richard O. Duda, Peter E. Hart, David G. Stork

ISBN: 9789354244391

496 pages

INR 749

Description

Pattern Recognition is a classic reference in the field that has been an invaluable resource for students, academics, researchers, and other interested readers for more than four decades. Starting with the introductory concepts of pattern classification, the book lays the theoretical foundations of Bayesian decision theory and then focuses on key topics such as parameter estimation, discriminant functions, neural networks, and nonmetric methods. It concludes with algorithm-independent machine learning, unsupervised learning, and clustering techniques. The book incorporates a host of pedagogical features, including worked examples, extensive graphics, expanded exercises, and computer project topics.

 

1 | INTRODUCTION TO PATTERN RECOGNITION

1.1 Machine Perception

1.2 An Example

1.3 Approaches to Pattern Classification

1.4 Pattern Recognition Systems

1.5 The Design Cycle

1.6 Learning and Adaptation

1.7 Conclusion

2 | BAYESIAN DECISION THEORY

2.1 Introduction

2.2 Bayesian Decision Theory—Continuous Features

2.3 Minimum-Error-Rate Classification

2.4 Classifiers, Discriminant Functions, and Decision Surfaces

2.5 The Normal Density

2.6 Discriminant Functions for the Normal Density

2.7 Error Probabilities and Integrals

2.8 Error Bounds for Normal Densities

2.9 Bayesian Decision Theory—Discrete Features

2.10 Missing and Noisy Features

2.11 Bayesian Belief Networks

2.12 Compound Bayesian Decision Theory and Context

 

3 | MAXIMUM-LIKELIHOOD AND BAYESIAN PARAMETER ESTIMATION

3.1 Introduction

3.2 Maximum-Likelihood Estimation

3.3 Bayesian Estimation

3.4 Bayesian Parameter Estimation: Gaussian Case

3.5 Bayesian Parameter Estimation: General Theory

3.6 Problems of Dimensionality

3.7 Component Analysis and Discriminants

3.8 Expectation-Maximization (EM)

3.9 Hidden Markov Models

 

4 | NONPARAMETRIC TECHNIQUES

4.1 Introduction

4.2 Density Estimation

4.3 Parzen Windows

4.4 kₙ-Nearest-Neighbor Estimation

4.5 The Nearest-Neighbor Rule

4.6 Metrics and Nearest-Neighbor Classification

4.7 Fuzzy Classification

4.8 Reduced Coulomb Energy Networks

4.9 Approximations by Series Expansions

 

5 | LINEAR DISCRIMINANT FUNCTIONS

5.1 Introduction

5.2 Linear Discriminant Functions and Decision Surfaces

5.3 Generalized Linear Discriminant Functions

5.4 The Two-Category Linearly Separable Case

5.5 Minimizing the Perceptron Criterion Function

5.6 Relaxation Procedures

5.7 Nonseparable Behavior

5.8 Minimum Squared-Error Procedures

5.9 The Ho-Kashyap Procedures

5.10 Support Vector Machines

 

6 | ARTIFICIAL NEURAL NETWORKS

6.1 Introduction

6.2 Feedforward Operation and Classification

6.3 Backpropagation Algorithm

6.4 Error Surfaces

6.5 Backpropagation as Feature Mapping

6.6 Backpropagation, Bayes Theory, and Probability

6.7 Practical Techniques for Improving Backpropagation

6.8 Additional Networks and Training Methods

6.9 Deep Neural Networks for Pattern Recognition

6.10 Regularization, Complexity Adjustment, and Pruning

 

7 | NONMETRIC METHODS

7.1 Introduction

7.2 Decision Trees

7.3 CART

7.4 Other Tree Methods

7.5 Recognition with Strings

7.6 Grammatical Methods

7.7 Grammatical Inference

7.8 Rule-Based Methods

 

8 | ALGORITHM-INDEPENDENT MACHINE LEARNING

8.1 Introduction

8.2 Lack of Inherent Superiority of Any Classifier

8.3 Bias and Variance

8.4 Resampling for Estimating Statistics

8.5 Resampling for Classifier Design

8.6 Performance Metrics

8.7 Estimating and Comparing Classifiers

8.8 Combining Classifiers

 

9 | UNSUPERVISED LEARNING AND CLUSTERING

9.1 Introduction

9.2 Mixture Densities and Identifiability

9.3 Maximum-Likelihood Estimates

9.4 Application to Normal Mixtures

9.5 Unsupervised Bayesian Learning

9.6 Data Description and Clustering

9.7 Criterion Functions for Clustering

9.8 Hierarchical Clustering

9.9 On-Line Clustering

9.10 Graph-Theoretic Methods

9.11 Component Analysis

9.12 Low-Dimensional Representations and Multidimensional Scaling (MDS)

 

Summary

Bibliographical and Historical Remarks

Problems

Computer Exercises

Multiple Choice Questions

References

 

A | MATHEMATICAL FOUNDATIONS

A.1 Notation

A.2 Linear Algebra

A.3 Lagrange Optimization

A.4 Probability Theory

A.5 Gaussian Derivatives and Integrals

A.6 Hypothesis Testing

A.7 Information Theory

A.8 Computational Complexity

 

Bibliographical Remarks

References

INDEX

 

 
