Linear Regression
Regression analysis is one of the most important fields in statistics and
machine learning, and many regression methods are available; linear regression is one of them. Regression searches for relationships among variables. In statistical modeling and machine learning, that relationship is used to forecast the outcome of future events.
Linear Regression
Linear regression is probably one of the most important and widely used regression techniques. It’s among the simplest regression methods. One of its main advantages is the ease of interpreting results.
Linear regression models the relationship between two variables by fitting a linear equation to observed data. One variable is considered an explanatory (independent) variable, and the other is considered a dependent variable.
Simple Linear Regression: Simple linear regression is the simplest case of linear regression, with a single independent variable, x.
Multiple Linear Regression: Multiple linear regression is a case of linear regression with more than one independent variable.
Polynomial Regression: Polynomial regression is a generalized case of linear regression: one assumes a polynomial dependence between the output and the inputs and, consequently, a polynomial estimated regression function.
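The three model families above can be written as estimated regression functions, where the coefficients b₀, b₁, … are what the fitting procedure estimates from the data:

```latex
% Simple linear regression: one input x
f(x) = b_0 + b_1 x

% Multiple linear regression: r inputs x_1, ..., x_r
f(x_1, \dots, x_r) = b_0 + b_1 x_1 + \dots + b_r x_r

% Polynomial regression (degree 2, one input): still linear in the coefficients b
f(x) = b_0 + b_1 x + b_2 x^2
```

Note that polynomial regression remains a linear model in the coefficients, which is why it can be fitted with the same machinery as linear regression.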
Implementing Linear Regression in Python
Python Packages for Linear Regression:
The package NumPy is a fundamental Python scientific package that allows many high-performance operations on single- and multi-dimensional arrays. It also offers many mathematical routines. It is open source.
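As a quick illustration of the kind of array operations NumPy provides (the array values here are arbitrary):

```python
import numpy as np

# Create a one-dimensional array and reshape it into a 2x3 two-dimensional array
x = np.array([5, 15, 25, 35, 45, 55])
m = x.reshape(2, 3)

# Vectorized math: operations apply element-wise without explicit loops
print(x.mean())   # arithmetic mean of all elements
print(x * 2)      # element-wise multiplication
print(m.shape)    # dimensions of the reshaped array: (2, 3)
```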
The package scikit-learn is a widely used Python library for machine learning, built on top of NumPy and some other packages. It provides the means for preprocessing data, reducing dimensionality, implementing regression, classification, clustering, and more. Like NumPy, scikit-learn is also open source.
Simple Linear Regression with scikit-learn: Let’s start with the simplest case, which is simple linear regression.
There are five basic steps when you’re implementing linear regression:
- Import the packages and classes that are needed.
- Provide the data to work with and apply appropriate transformations.
- Create a regression model and fit it with existing data.
- Check the results of model fitting to know whether the model is satisfactory.
- Apply the model for predictions.
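The five steps above can be sketched with scikit-learn as follows; the x and y values here are made up purely for illustration:

```python
# Step 1: import the packages and classes that are needed
import numpy as np
from sklearn.linear_model import LinearRegression

# Step 2: provide the data; scikit-learn expects x as a 2-D array
# of shape (n_samples, n_features)
x = np.array([5, 15, 25, 35, 45, 55]).reshape(-1, 1)
y = np.array([5, 20, 14, 32, 22, 38])

# Step 3: create the regression model and fit it with the existing data
model = LinearRegression()
model.fit(x, y)

# Step 4: check the results of model fitting
r_sq = model.score(x, y)          # coefficient of determination, R^2
print("R^2:", r_sq)
print("intercept:", model.intercept_)
print("slope:", model.coef_)

# Step 5: apply the model for predictions
y_pred = model.predict(x)
print("predicted responses:", y_pred)
```

The coefficient of determination returned by `score` tells you how much of the variation in y the fitted line explains, which is one simple way to judge whether the model is satisfactory.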
Let's see an example where we predict the speed of a 10-year-old car.
- Import the modules needed.
- Create the arrays that represent the values of the x and y axis:
- Execute a method that returns some important key values of linear regression:
- Create a function that uses the slope and intercept values to return a new value. This new value represents where on the y-axis the corresponding x value will be placed:
- Run each value of the x array through the function. This will result in a new array with new values for the y-axis:
- Draw the original scatter plot:
- Draw the line of linear regression:
- Display the diagram: plt.show()
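Putting the steps above together, here is a sketch of the car example using scipy.stats.linregress and Matplotlib; the car ages and speeds below are invented sample data:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; drop this line to show the window
import matplotlib.pyplot as plt
from scipy import stats

# Car ages (years) on the x-axis and recorded speeds on the y-axis
x = [5, 7, 8, 7, 2, 17, 2, 9, 4, 11, 12, 9, 6]
y = [99, 86, 87, 88, 111, 86, 103, 87, 94, 78, 77, 85, 86]

# linregress returns the key values of linear regression,
# including the slope and intercept of the fitted line
slope, intercept, r, p, std_err = stats.linregress(x, y)

# Use the slope and intercept to map an x value onto the y-axis
def myfunc(x):
    return slope * x + intercept

# Run each value of the x array through the function to get
# a new array of fitted y-axis values
mymodel = list(map(myfunc, x))

# Predict the speed of a 10 year old car
print("predicted speed:", myfunc(10))

# Draw the original scatter plot, then the line of linear
# regression, then display the diagram
plt.scatter(x, y)
plt.plot(x, mymodel)
plt.show()
```

With this data the slope is negative, reflecting that older cars in the sample tend to be recorded at lower speeds.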
Conclusion:
Linear regression is easy to implement, and its output coefficients are easy to interpret. When you know that the relationship between the independent and dependent variables is linear, this algorithm is a good choice because of its low complexity compared to other algorithms. Linear regression is a great tool for analyzing the relationships among variables, but it isn’t recommended for many practical applications because it over-simplifies real-world problems by assuming a linear relationship among the variables.