Categories

Popular Article

Supervised Learning: Classification and Regression Problems

In supervised learning, a piece of information (called a label) is attached to an object/observation (called training data). The training data consists of a set of training examples/labels. Each example is a pair consisting of an input object and the desired...
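For illustration, here is a minimal sketch of these (input, label) pairs with scikit-learn; the tiny dataset and the choice of LogisticRegression and LinearRegression are assumptions for the example, not taken from the article.

```python
# Minimal sketch of supervised learning as (input, label) pairs,
# using scikit-learn; the toy data and estimators are illustrative only.
from sklearn.linear_model import LinearRegression, LogisticRegression

# Each training example is a pair: an input object and its desired output.
X = [[1.0], [2.0], [3.0], [4.0]]     # input objects (one feature each)
y_class = [0, 0, 1, 1]               # labels for a classification problem
y_reg = [1.5, 3.1, 4.4, 6.2]         # targets for a regression problem

LogisticRegression().fit(X, y_class)  # classification: predict a category
LinearRegression().fit(X, y_reg)      # regression: predict a number
```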

Machine Learning


Feature Selection with scikit-learn Library’s Inbuilt Methods

While working on any Machine Learning project, there are many phases...

Data Leakage: Do not apply Transformation before Splitting Training – Test set

This applies to transformations such as normalization, standardization, and TF-IDF. For example,...

L1 and L2 Regularization Guide: Lasso and Ridge Regression

This article is about Lasso Regression and Ridge Regression, also...

Programming and Coding


Why Bubble Sort is Named So?

Bubble sort belongs to a family of sorting algorithms. It is...

Python Script for Facebook Auto Post Bot

How do you make a Facebook bot? Facebook bots have become a very popular topic,...

Latest articles

Feature Selection with scikit-learn Library’s Inbuilt Methods

While working on any Machine Learning project, there are many phases before you reach model training, and one of the major ones is feature selection. Here, I want to share a few common methods for feature selection with the scikit-learn library. Recursive feature elimination: as the name suggests, it runs the selected algorithm recursively until we reach our desired number of features. Core idea...
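As a quick illustration, here is a minimal sketch of recursive feature elimination with scikit-learn's RFE; the toy dataset, the LogisticRegression estimator, and the target of 4 features are illustrative assumptions.

```python
# Minimal sketch: recursive feature elimination (RFE) with scikit-learn.
# The estimator and the number of features kept are illustrative choices.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

# Toy dataset with 10 features, only a few of which are informative.
X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=4, random_state=0)

# RFE repeatedly fits the estimator and drops the weakest feature(s)
# until only n_features_to_select remain.
selector = RFE(estimator=LogisticRegression(max_iter=1000),
               n_features_to_select=4)
selector.fit(X, y)

print(selector.support_)   # boolean mask of the selected features
print(selector.ranking_)   # rank 1 marks the features that were kept
```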

Data Leakage: Do not apply Transformation before Splitting Training – Test set

This applies to transformations such as normalization, standardization, and TF-IDF. For example, suppose you have numerical data and apply min-max normalization before splitting; the normalization has now seen all the data, so every test value will fall in the 0-1 range, which may not hold in real cases. Consider another situation where you have text data and apply TF-IDF before splitting the data into training and test sets....
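To make the leak-free order concrete, here is a minimal sketch that splits first and fits the transformer on the training split only; the random toy data and the MinMaxScaler choice are illustrative assumptions.

```python
# Minimal sketch of the leak-free order: split first, then fit the
# transformer on the training split only. Toy data is illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

X = np.random.rand(100, 3) * 50          # toy numerical data
y = np.random.randint(0, 2, 100)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

scaler = MinMaxScaler()
X_train_scaled = scaler.fit_transform(X_train)  # fit on training data only
X_test_scaled = scaler.transform(X_test)        # reuse the training min/max

# Test values can now fall outside [0, 1], just as truly unseen data might.
```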

L1 and L2 Regularization Guide: Lasso and Ridge Regression

This article is about Lasso Regression and Ridge Regression, also called L1 and L2 regularization; here we will learn and discuss L1 vs L2 regularization for Lasso and Ridge Regression. The key difference between L1 and L2 regularization is the penalty term, that is, how the weights enter it: L2 adds the sum of the squares of the weights, while L1 adds the sum of their absolute...
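As a small illustration, here is a minimal sketch contrasting the two penalties with scikit-learn's Lasso and Ridge; the toy regression data and the alpha value are illustrative assumptions.

```python
# Minimal sketch contrasting L1 (Lasso) and L2 (Ridge) penalties with
# scikit-learn; the alpha value and toy data are illustrative.
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge

X, y = make_regression(n_samples=100, n_features=8, n_informative=3,
                       noise=5.0, random_state=0)

# Lasso penalizes the sum of absolute weights (L1),
# which tends to drive some coefficients to exactly zero.
lasso = Lasso(alpha=1.0).fit(X, y)

# Ridge penalizes the sum of squared weights (L2),
# which shrinks coefficients toward zero without zeroing them out.
ridge = Ridge(alpha=1.0).fit(X, y)

print("Lasso coefficients:", lasso.coef_)  # typically sparse
print("Ridge coefficients:", ridge.coef_)  # typically small but non-zero
```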

Important Python Libraries for Machine Learning and Data Science

We have mentioned a few highly used and vital libraries that you will need while working in the Machine Learning, Artificial Intelligence, and Data Science fields with Python. The libraries include NumPy, Pandas, Matplotlib, scikit-learn, TensorFlow, and PyTorch.

Best 5 YouTube Channels for Machine Learning, Artificial Intelligence and Data Science

The Internet can work as the world's biggest university for us, but only if we know how to use it efficiently. Like the Internet itself, YouTube is one of the best places to learn many things. In this article, we will mention some of the best YouTube channels for Machine Learning and Artificial Intelligence. (The list presented is in no particular order, as everyone has...