Classification
The problem of predicting a discrete class label, as opposed to Regression, which predicts a continuous quantity. ...
Confusion Matrix
An NxN table that summarizes how successful a classification model's predictions were; that is, the correlation between the actual labels and the model's classifications. One axis of a confusion matrix is the label the model predicted, and the other axis is the actual label. N represents the number of classes.
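As a minimal sketch, the tally can be built directly from paired actual/predicted labels; the labels and counts below are purely illustrative:

```python
def confusion_matrix(actual, predicted, labels):
    """Build an NxN table: rows are the actual label, columns the predicted one."""
    index = {label: i for i, label in enumerate(labels)}
    matrix = [[0] * len(labels) for _ in labels]
    for a, p in zip(actual, predicted):
        matrix[index[a]][index[p]] += 1
    return matrix

actual = ["cat", "cat", "dog", "dog", "dog"]
predicted = ["cat", "dog", "dog", "dog", "cat"]
print(confusion_matrix(actual, predicted, labels=["cat", "dog"]))
# → [[1, 1], [1, 2]]: one cat correctly classified, one cat called a dog,
#   one dog called a cat, two dogs correct.
```

The diagonal holds the correct predictions; everything off the diagonal is a specific kind of mistake, which is why the matrix is more informative than accuracy alone.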
Class Imbalance
A problem that occurs in classification datasets when the class labels appear with very different frequencies. This can hurt or bias training.
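A quick frequency check exposes the issue; the 5/95 spam/ham split below is a made-up example:

```python
from collections import Counter

labels = ["spam"] * 5 + ["ham"] * 95  # illustrative 5% / 95% split
counts = Counter(labels)
total = sum(counts.values())
for label, count in counts.items():
    print(f"{label}: {count}/{total} = {count / total:.0%}")
# With this imbalance a model that always predicts "ham" already reaches
# 95% accuracy, which is why accuracy alone is a misleading metric here.
```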
Area under the ROC Curve (AUC)
An evaluation metric that considers all possible classification thresholds. The area under the ROC curve is the probability that the classifier assigns a higher score to a randomly chosen positive example than to a randomly chosen negative example.
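That pairwise-probability interpretation can be computed directly, without tracing the curve itself; the scores below are invented for illustration:

```python
def auc(scores_pos, scores_neg):
    """Fraction of (positive, negative) pairs in which the positive example
    receives the higher score; ties count as half a win."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos
        for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

# 3 positives x 2 negatives = 6 pairs; the positive wins 5 of them.
print(auc([0.9, 0.8, 0.4], [0.7, 0.3]))  # → 0.8333...
```

Because only the ordering of scores matters, AUC is unchanged by any monotonic rescaling of the classifier's outputs.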
Neural Network
A neural network is a collection of connected units with weights (the "neurons") that are adjusted during training. It's a type of machine learning model used in many state-of-the-art systems. Networks with many layers are called deep neural networks and are trained with deep learning.
Sentiment Analysis
Using statistical or machine learning algorithms to determine the sentiment, positive or negative, of a piece of text, an image, or another data point. ...
VC dimension
In statistical learning theory, the Vapnik-Chervonenkis (VC) dimension is a measure of the complexity of a (possibly infinite) hypothesis class, such as the class of linear classifiers. While it often corresponds to the number of parameters in a model, this isn't always the case. There is a general bound on the generalization error in terms of the VC dimension ...
Transductive Learning
A semi-supervised learning paradigm. Transduction is reasoning from observed, specific training cases to specific test cases; inductive reasoning, in contrast, learns general rules. Transductive learning aims to apply the trained classifier to the unlabeled instances ...
Surrogate Loss Function
A proxy loss function used to approximate the 0-1 loss and make the optimization problem convex. Examples include the hinge loss and the logistic loss. Used, for example, in Support Vector Machines.
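The contrast is easy to see numerically. Writing the margin as y * f(x), a sketch of the 0-1 loss next to two common convex surrogates:

```python
import math

def zero_one_loss(margin):
    """The true classification loss: 1 if the margin y*f(x) is nonpositive.
    Nonconvex, with zero gradient almost everywhere, so hard to optimize."""
    return 1.0 if margin <= 0 else 0.0

def hinge_loss(margin):
    """Convex surrogate used by SVMs: max(0, 1 - margin)."""
    return max(0.0, 1.0 - margin)

def logistic_loss(margin):
    """Smooth convex surrogate used by logistic regression."""
    return math.log(1.0 + math.exp(-margin))

for m in (-1.0, 0.0, 0.5, 2.0):
    print(f"margin={m:+.1f}  0-1={zero_one_loss(m)}  "
          f"hinge={hinge_loss(m)}  logistic={logistic_loss(m):.3f}")
```

Both surrogates penalize misclassified points (negative margin) heavily and shrink toward zero as the margin grows, while remaining convex and differentiable enough for gradient-based optimization.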
Regression
The problem of predicting a continuous quantity, as opposed to Classification, which predicts a discrete class label. ...
Support Vector Machines (SVM)
Supervised learning models that perform classification by finding a separating hyperplane in the input space whose position depends only on a subset of the training observations, called the "support vectors". SVMs are a kernel method.
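As a rough sketch of the idea (not a production SVM, which would use kernels and a specialized solver), a linear SVM can be trained by subgradient descent on the regularized hinge loss; the 2-D points and all hyperparameters below are illustrative:

```python
def train_linear_svm(points, labels, lam=0.01, epochs=200, lr=0.1):
    """Subgradient descent on the regularized hinge loss for a linear SVM.
    Labels must be +1 / -1. Only points inside the margin (the support
    vector candidates) trigger a hinge update."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            margin = y * (w[0] * x1 + w[1] * x2 + b)
            if margin < 1:  # inside the margin: hinge loss is active
                w[0] += lr * (y * x1 - lam * w[0])
                w[1] += lr * (y * x2 - lam * w[1])
                b += lr * y
            else:           # outside the margin: only regularization shrinks w
                w[0] -= lr * lam * w[0]
                w[1] -= lr * lam * w[1]
    return w, b

# Two linearly separable clusters (made-up data).
points = [(2, 2), (3, 3), (-2, -2), (-3, -1)]
labels = [1, 1, -1, -1]
w, b = train_linear_svm(points, labels)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else -1 for x1, x2 in points]
print(preds)  # → matches labels on this separable toy set
```

Points that end up well outside the margin never influence the final weights, which is exactly the sense in which the decision boundary is determined only by the support vectors.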