MACHINE LEARNING

Regulation 2023

978-81-19432-30-1

UNIT – I: INTRODUCTION TO MACHINE LEARNING 

Review of Linear Algebra for Machine Learning: Introduction and Motivation for
Machine Learning – Examples of Machine Learning Applications,
Vapnik-Chervonenkis (VC) dimension – Probably Approximately Correct (PAC)
Learning – Hypothesis Spaces, Inductive Bias, Generalization, Bias-Variance Trade-off.
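As a pointer to the PAC learning topic above, here is a minimal sketch of the standard sample-complexity bound for a consistent learner over a finite hypothesis space. The function name and the example values of |H|, ε, and δ are illustrative, not part of the syllabus:

```python
import math

def pac_sample_complexity(h_size, epsilon, delta):
    """Number of training examples m sufficient so that, with probability
    at least 1 - delta, a consistent learner over a finite hypothesis
    space of size |H| outputs a hypothesis with true error at most epsilon:
        m >= (1 / epsilon) * (ln|H| + ln(1 / delta))
    """
    return math.ceil((1.0 / epsilon) * (math.log(h_size) + math.log(1.0 / delta)))

# e.g. |H| = 2**10 Boolean hypotheses, 5% error tolerance, 95% confidence
m = pac_sample_complexity(2 ** 10, epsilon=0.05, delta=0.05)
print(m)
```

Note how the bound grows only logarithmically in |H| and 1/δ, but linearly in 1/ε.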

UNIT – II: SUPERVISED LEARNING 

Linear Regression Models: Least squares, single & multiple variables, Bayesian Linear
Regression, Gradient Descent, Linear Classification Models, Discriminant function –
Perceptron Algorithm – Probabilistic Discriminative Model – Logistic Regression –
Probabilistic Generative Model – Naïve Bayes – Maximum Margin Classifier – Support
Vector Machine – Decision Tree – Random Forests.
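For the least-squares topic that opens this unit, a minimal sketch of ordinary least squares via the normal equations (the helper name and toy data are illustrative):

```python
import numpy as np

def least_squares_fit(X, y):
    """Ordinary least squares: augment X with a bias column and solve
    the normal equations X^T X w = X^T y for the weight vector w."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)  # numerically stable solver
    return w                                    # w[0] = intercept, w[1:] = slopes

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = 2.0 * X[:, 0] + 1.0                         # noiseless line y = 2x + 1
w = least_squares_fit(X, y)
print(np.round(w, 6))
```

With noiseless data the fit recovers the intercept 1 and slope 2 exactly; with noisy data the same call returns the minimum-squared-error line.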

UNIT – III: ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING 

Combining Multiple Learners: Model combination schemes, Voting, Ensemble
Learning – Bagging – Boosting – Unsupervised Learning: K-means, Instance Based
Learning, KNN, Gaussian mixture models and Expectation maximization.
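For the K-means topic above, a minimal NumPy sketch of Lloyd's algorithm, using a naive deterministic initialization (first k points) for reproducibility; the function name and toy blobs are illustrative:

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Plain k-means: alternate assigning each point to its nearest
    centroid and recomputing each centroid as the mean of its cluster."""
    centroids = X[:k].copy()  # naive init: first k points as centroids
    for _ in range(iters):
        # assignment step: index of the closest centroid for every point
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # update step: move each centroid to the mean of its cluster
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# two well-separated blobs
X = np.array([[0.0, 0.0], [0.1, 0.2], [-0.1, 0.1],
              [5.0, 5.0], [5.2, 4.9], [4.9, 5.1]])
labels, centroids = kmeans(X, k=2)
print(labels)
```

In practice k-means++ initialization and multiple restarts are preferred, since the objective is non-convex and sensitive to the starting centroids.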

UNIT – IV: NEURAL NETWORKS 

Multilayer Perceptron (MLP) – Activation Function – Network Training – Gradient Descent
Optimisation – Stochastic Gradient Descent, Error Backpropagation, from Shallow Networks
to Deep Networks – Unit Saturation (aka the Vanishing Gradient Problem) – ReLU –
Hyperparameter Tuning, Batch Normalization, Regularization, Dropout.
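To illustrate the vanishing-gradient / ReLU contrast listed above: by the chain rule, the gradient reaching early layers is a product of per-layer activation derivatives. A toy sketch (function name and setup are illustrative, with unit weights and a single path through the network):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def backprop_gradient_scale(depth, act, x=0.0):
    """Magnitude of the gradient surviving `depth` stacked activations
    f(f(...f(x))) with unit weights: the chain-rule product of the
    activation derivatives along the path."""
    scale = 1.0
    for _ in range(depth):
        if act == "sigmoid":
            s = sigmoid(x)
            scale *= s * (1.0 - s)  # derivative is at most 0.25, so the
            x = s                   # product shrinks geometrically
        else:  # ReLU: derivative is 1 for x > 0, 0 otherwise
            scale *= 1.0 if x > 0 else 0.0
            x = max(x, 0.0)
    return scale

print(backprop_gradient_scale(10, "sigmoid"))       # shrinks toward 0
print(backprop_gradient_scale(10, "relu", x=1.0))   # stays at 1.0
```

Ten sigmoid layers already attenuate the gradient below 0.25**10 ≈ 1e-6, which is why ReLU (derivative 1 on the active region) mitigates unit saturation in deep networks.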

UNIT – V: DESIGN AND ANALYSIS OF MACHINE LEARNING
EXPERIMENTS

Guidelines for Machine Learning Experiments – Cross Validation (CV) and
Resampling – K-fold CV, bootstrapping, Measuring Classifier Performance – Assessing
a Single Classification Algorithm and Comparing Two Classification Algorithms – t test,
McNemar’s test, K-fold CV paired t test.
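For the K-fold CV and paired t test topics above, a minimal pure-Python sketch: a fold splitter plus the paired t statistic computed on per-fold score differences (function names and example scores are illustrative):

```python
import math

def k_fold_indices(n, k):
    """Split indices 0..n-1 into k contiguous folds of near-equal size;
    each fold serves once as the validation set."""
    sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for s in sizes:
        folds.append(list(range(start, start + s)))
        start += s
    return folds

def paired_t_statistic(scores_a, scores_b):
    """K-fold CV paired t test: t statistic on the per-fold differences.
    Under H0 (equal performance) it follows a t distribution with K-1
    degrees of freedom."""
    d = [a - b for a, b in zip(scores_a, scores_b)]
    k = len(d)
    mean = sum(d) / k
    var = sum((x - mean) ** 2 for x in d) / (k - 1)  # unbiased variance
    return mean * math.sqrt(k) / math.sqrt(var)

folds = k_fold_indices(10, 3)
print(folds)
t = paired_t_statistic([0.90, 0.85, 0.88], [0.80, 0.80, 0.80])
print(round(t, 3))
```

In practice the two algorithms are trained and scored on identical folds, so the pairing cancels fold-to-fold variation before the t statistic is compared against the critical value.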
