Artificial Intelligence #3: kNN & Bayes Classification Methods
MP4 | Video: AVC 1280x720 | Audio: AAC 44KHz 2ch | Duration: 2 Hours | Lec: 20 | 200 MB
Genre: eLearning | Language: English
Classification methods for students and professionals. Learn k-Nearest Neighbors & Naive Bayes classification and code them in Python.
In this course you will learn the k-Nearest Neighbors and Naive Bayes classification methods.
In pattern recognition, the k-nearest neighbors algorithm (k-NN) is a non-parametric method used for classification and regression.
k-NN is a type of instance-based learning, or lazy learning, where the function is only approximated locally and all computation is deferred until classification. The k-NN algorithm is among the simplest of all machine learning algorithms.
For classification, a useful technique is to assign weights to the contributions of the neighbors, so that nearer neighbors contribute more to the prediction than more distant ones.
The neighbors are taken from a set of objects for which the class is known. This set can be thought of as the training set for the algorithm, though no explicit training step is required.
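As a rough illustration of the idea described above, a from-scratch k-NN classifier can be written in a few lines of NumPy. The function name knn_predict, the inputs X_train, y_train, x and k, and the optional inverse-distance weighting are illustrative choices, not taken from the course materials.

# Minimal k-NN classifier sketch (NumPy only); X_train, y_train, x and k are assumed inputs.
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3, weighted=False):
    # Euclidean distance from the query point to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k nearest neighbors
    nearest = np.argsort(dists)[:k]
    if not weighted:
        # Plain majority vote among the k nearest labels
        return Counter(y_train[nearest]).most_common(1)[0][0]
    # Inverse-distance weighting: nearer neighbors contribute more to the vote
    votes = {}
    for i in nearest:
        w = 1.0 / (dists[i] + 1e-12)   # small constant avoids division by zero
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + w
    return max(votes, key=votes.get)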
In machine learning, naive Bayes classifiers are a family of simple probabilistic classifiers based on applying Bayes' theorem with strong (naive) independence assumptions between the features.
Naive Bayes classifiers are highly scalable, requiring a number of parameters linear in the number of variables (features/predictors) in a learning problem. Maximum-likelihood training can be done by evaluating a closed-form expression, which takes linear time, rather than by expensive iterative approximation as used for many other types of classifiers.
In the statistics and computer science literature, Naive Bayes models are known under a variety of names, including simple Bayes and independence Bayes. All these names reference the use of Bayes' theorem in the classifier's decision rule, but naive Bayes is not (necessarily) a Bayesian method.
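To make the closed-form maximum-likelihood step concrete, here is a minimal Gaussian naive Bayes sketch: class priors and per-feature means and variances are computed directly from the training data, and prediction sums per-feature Gaussian log-likelihoods under the independence assumption. All names and the variance smoothing constant are illustrative.

import numpy as np

def fit_gaussian_nb(X, y):
    # Closed-form maximum-likelihood estimates, computed per class
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (
            len(Xc) / len(X),          # class prior P(c)
            Xc.mean(axis=0),           # per-feature means
            Xc.var(axis=0) + 1e-9,     # per-feature variances (smoothed)
        )
    return params

def predict_gaussian_nb(params, x):
    # Pick the class maximizing log P(c) + sum of per-feature Gaussian log-likelihoods
    best, best_logp = None, -np.inf
    for c, (prior, mu, var) in params.items():
        logp = np.log(prior) - 0.5 * np.sum(
            np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        if logp > best_logp:
            best, best_logp = c, logp
    return best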
In this course you will learn how to classify datasets with the k-Nearest Neighbors method, finding the correct class for each data point and reducing error. Going further, you will learn how to classify the output of a model using the Naive Bayes method.
In the first section you will learn how to use Python to estimate the output of your system; a short usage sketch follows the list below. In this section you can classify:
Python Dataset
IRIS Flowers
Make your own k Nearest Neighbors Algorithm
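The course builds its own k-NN algorithm; purely as a point of comparison, the same Iris classification task can be sketched with scikit-learn (assuming scikit-learn is installed; k = 5 is an arbitrary example value).

# Classify the Iris flowers with an off-the-shelf k-NN classifier
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print("Iris test accuracy:", knn.score(X_test, y_test))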
In the second section you will learn how to use Python to classify the output of a system with a nonlinear structure; a short usage sketch follows the list below. In this section you can classify:
IRIS Flowers
Pima Indians Diabetes Database
Make your own Naive Bayes Algorithm
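Again as a supplementary sketch rather than the course's own implementation, scikit-learn's Gaussian naive Bayes can classify the Iris flowers in a few lines; the Pima Indians Diabetes data would follow the same fit/score pattern once loaded into a feature matrix and label vector (its file path and format are not specified here).

# Classify the Iris flowers with scikit-learn's Gaussian naive Bayes
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

nb = GaussianNB()
nb.fit(X_train, y_train)
print("Iris test accuracy:", nb.score(X_test, y_test))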