- Machine Learning - ML
- Note
- 24 Topics

- Introduction - ( 7 - 7 )
- Core material and classification - ( 8 - 12 )
- Linear classifiers and perceptrons - ( 13 - 23 )
- Soft-margin support vector machines; features - ( 24 - 30 )
- Machine learning abstractions and numerical optimization - ( 31 - 36 )
- Decision theory; generative and discriminative models - ( 37 - 42 )
- Gaussian discriminant analysis, including QDA and LDA - ( 43 - 46 )
- Eigenvectors and the anisotropic multivariate normal distribution - ( 47 - 51 )
- Anisotropic Gaussians, maximum likelihood estimation, QDA and LDA - ( 52 - 60 )
- Regression, including least-squares linear and logistic regression - ( 61 - 65 )
- More regression; Newton's method; ROC curves - ( 66 - 71 )
- Statistical justifications; the bias-variance decomposition - ( 72 - 77 )
- Shrinkage: ridge regression, subset selection, and lasso - ( 78 - 82 )
- The kernel trick - ( 83 - 87 )
- Decision trees - ( 88 - 100 )
- Neural networks - ( 101 - 108 )
- Neurons; variations on neural networks - ( 109 - 115 )
- Better neural network training; convolutional neural networks - ( 116 - 123 )
- Unsupervised learning and principal components analysis - ( 124 - 132 )
- The singular value decomposition; clustering - ( 133 - 140 )
- Spectral graph clustering - ( 141 - 148 )
- Learning theory - ( 149 - 154 )
- Multiple eigenvectors; latent factor analysis; nearest neighbors - ( 163 - 164 )
- Faster nearest neighbors; Voronoi diagrams and k-d trees - ( 163 - 164 )


**Concise Machine Learning**

Jonathan Richard Shewchuk
May 4, 2017
Department of Electrical Engineering and Computer Sciences
University of California at Berkeley
Berkeley, California 94720

**Abstract**

This report contains lecture notes for UC Berkeley's introductory class on Machine Learning. It covers many methods for classification and regression, and several methods for clustering and dimensionality reduction. It is concise because not a word is included that cannot be written or spoken in a single semester's lectures (with whiteboard lectures and almost no slides!) and because the choice of topics is limited to a small selection of particularly useful, popular algorithms.

Supported in part by the National Science Foundation under Award CCF-1423560, in part by the University of California Lab Fees Research Program, and in part by an Alfred P. Sloan Research Fellowship. The claims in this document are those of the author. They are not endorsed by the sponsors or the U.S. Government.

Keywords: machine learning, classification, regression, density estimation, dimensionality reduction, clustering, perceptrons, support vector machines (SVMs), Gaussian discriminant analysis, linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), logistic regression, decision trees, neural networks, convolutional neural networks (CNNs, ConvNets), nearest neighbor search, least-squares linear regression, polynomial regression, ridge regression, Lasso, maximum likelihood estimation (MLE), principal components analysis (PCA), singular value decomposition (SVD), latent factor analysis, latent semantic indexing, k-means clustering, hierarchical clustering, spectral graph clustering

**Contents**

1. Introduction (p. 1)
2. Linear Classifiers and Perceptrons (p. 7)
3. Perceptron Learning; Maximum Margin Classifiers (p. 13)
4. Soft-Margin Support Vector Machines; Features (p. 18)
5. Machine Learning Abstractions and Numerical Optimization (p. 25)
6. Decision Theory; Generative and Discriminative Models (p. 31)
7. Gaussian Discriminant Analysis, including QDA and LDA (p. 36)
8. Eigenvectors and the Anisotropic Multivariate Normal Distribution (p. 41)
9. Anisotropic Gaussians, Maximum Likelihood Estimation, QDA, and LDA (p. 46)
10. Regression, including Least-Squares Linear and Logistic Regression (p. 54)
11. More Regression; Newton's Method; ROC Curves (p. 59)
12. Statistical Justifications; the Bias-Variance Decomposition (p. 65)
13. Shrinkage: Ridge Regression, Subset Selection, and Lasso (p. 71)
14. The Kernel Trick (p. 76)
15. Decision Trees (p. 81)
16. More Decision Trees, Ensemble Learning, and Random Forests (p. 86)
17. Neural Networks (p. 94)
18. Neurons; Variations on Neural Networks (p. 101)
19. Better Neural Network Training; Convolutional Neural Networks (p. 108)
20. Unsupervised Learning and Principal Components Analysis (p. 116)
21. The Singular Value Decomposition; Clustering (p. 125)
22. Spectral Graph Clustering (p. 133)
23. Learning Theory (p. 141)
24. Multiple Eigenvectors; Latent Factor Analysis; Nearest Neighbors (p. 147)
25. Faster Nearest Neighbors: Voronoi Diagrams and k-d Trees (p. 154)
