Instance-Based and Model-Based Learning in Machine Learning (Data Science)

Instance-Based and Model-Based Learning


1. Instance-Based Learning 


Instance-based learning, also known as memory-based learning, is a technique where the model doesn’t generalize the training data into patterns but instead retains the actual data instances. When making a prediction, it directly compares new data points to specific examples stored from the training set. The algorithm identifies similar instances in the training data to make decisions.


For example, the k-Nearest Neighbors (k-NN) algorithm compares a new data point against the stored training examples, finds the k closest points, and uses their labels to predict the label of the new point. Every prediction works directly from the stored training data.

Example:


Imagine you have a group of friends and you meet someone new. You compare the new person with your existing friends and predict what kind of friend they are likely to be, based on whoever they resemble most. The code sketch below mirrors this idea with k-NN.
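Here is a minimal sketch of this idea using scikit-learn's KNeighborsClassifier. The "friend" features (shared hobbies, messages per week) and labels are made-up illustrative values, not real data from this post.

```python
# Instance-based learning with k-NN: the algorithm memorizes the training
# instances and compares each new point against them at prediction time.
from sklearn.neighbors import KNeighborsClassifier

# Each row is an existing friend: [shared hobbies, messages per week]
X_train = [[5, 20], [4, 15], [1, 2], [0, 1], [3, 10], [1, 3]]
y_train = ["close friend", "close friend", "acquaintance",
           "acquaintance", "close friend", "acquaintance"]

# "Training" here just stores the examples for later comparison.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)

# A new person is classified by looking at their 3 nearest stored neighbors.
new_person = [[4, 12]]
print(knn.predict(new_person))  # e.g. ['close friend']
```

Because the stored examples are consulted for every prediction, there is almost no training cost, but prediction gets slower and more memory-hungry as the dataset grows.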



2. Model-Based Learning 


Model-based learning, in contrast, involves creating a generalized model that captures patterns and relationships within the data. Instead of storing each training instance, the algorithm trains a model that learns from the training data and abstracts these insights into a set of parameters or a mathematical function. This model is then used to make predictions on new data.


Now imagine you want to predict something like the likelihood of rain. You look at weather records from past years and identify the conditions under which it tends to rain.


Then you train a model on all of this data, so you no longer need to go back to the old records for every prediction. The model captures the relationship once and uses it to predict future weather. For instance, a weather app can take today's temperature, humidity, and other indicators and return a model-based prediction of rain.
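The sketch below illustrates this with a logistic regression, assuming made-up weather records (temperature in °C, humidity in %) where 1 means it rained. The numbers are purely illustrative.

```python
# Model-based learning: the training data is compressed into a few learned
# parameters (coefficients and an intercept), and the raw records are no
# longer needed at prediction time.
from sklearn.linear_model import LogisticRegression

X_train = [[30, 40], [25, 60], [22, 85], [18, 90], [28, 50], [20, 95]]
y_train = [0, 0, 1, 1, 0, 1]  # 1 = it rained, 0 = it did not

model = LogisticRegression()
model.fit(X_train, y_train)

print(model.coef_, model.intercept_)     # the learned parameters
print(model.predict([[21, 88]]))         # rain prediction for new conditions
print(model.predict_proba([[21, 88]]))   # estimated probability of rain
```

Training takes effort up front, but afterwards predictions are fast and depend only on the learned parameters, not on the original dataset.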



Both approaches turn past data into predictions for new situations: instance-based learning does so by comparing directly against stored examples, while model-based learning does so through a trained, generalized model.
