Linear regression may be defined as a statistical model that analyzes the linear relationship between a dependent variable and a given set of independent variables. A linear relationship between variables means that when the value of one or more independent variables changes (increases or decreases), the value of the dependent variable changes accordingly (increases or decreases).

Mathematically, the relationship can be represented with the help of the following equation −

Y = mX + b

Here, Y is the dependent variable we are trying to predict, and X is the independent variable we are using to make predictions. m is the slope of the regression line, which represents the effect X has on Y, and b is a constant known as the Y-intercept: if X = 0, Y would be equal to b.

Furthermore, the linear relationship can be positive or negative in nature, as explained below.

Positive Linear Relationship
A linear relationship is called positive if both the independent and the dependent variable increase.

Negative Linear Relationship
A linear relationship is called negative if the dependent variable decreases as the independent variable increases.

Types of Linear Regression
Linear regression is of the following two types −

Simple Linear Regression (SLR) − The most basic version of linear regression, which predicts a response using a single feature. The assumption in SLR is that the two variables are linearly related.

Multiple Linear Regression (MLR) − An extension of SLR that predicts a response using two or more features.

Implementation in Python
We can implement SLR in Python in two ways: one is to provide your own dataset, and the other is to use a dataset from the scikit-learn Python library.

Example 1 − In the following Python implementation example, we use our own dataset.

First, import the necessary packages. Next, define a function that calculates the important values for SLR: the number of observations n, the means of the x and y vectors, and the cross-deviation and deviation about x, from which the regression coefficients b_0 and b_1 are estimated.

Next, define a function that plots the regression line and predicts the response vector. The following line plots the actual points as a scatter plot −

plt.scatter(x, y, color = "m", marker = "o", s = 30)

After computing the predicted response vector, plot the regression line and put labels on the axes. At last, define a main() function that provides the dataset, calls the functions defined above, and prints the estimated coefficients, for example −

print("Estimated coefficients:\nb_0 = {}\nb_1 = {}".format(b[0], b[1]))

Example 2 − In the second approach, we fit a regressor reg on a dataset from scikit-learn, evaluate it on held-out test data with reg.score(X_test, y_test), and visualise the residual errors for both the training and the test set −

plt.scatter(reg.predict(X_train), reg.predict(X_train) - y_train,
   color = "green", s = 10, label = 'Train data')
plt.scatter(reg.predict(X_test), reg.predict(X_test) - y_test,
   color = "blue", s = 10, label = 'Test data')
plt.hlines(y = 0, xmin = 0, xmax = 50, linewidth = 2)

Assumptions
The following are some assumptions about the dataset that are made by the linear regression model −

Multi-collinearity − The linear regression model assumes that there is very little or no multi-collinearity in the data. Basically, multi-collinearity occurs when the independent variables or features have dependency among them.

Auto-correlation − Another assumption the linear regression model makes is that there is very little or no auto-correlation in the data. Basically, auto-correlation occurs when there is dependency between the residual errors.
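The step-by-step walkthrough of Example 1 above can be collected into one runnable sketch. The function names (estimate_coef, plot_regression_line) and the ten-point sample dataset are illustrative assumptions, not taken verbatim from the original page −

```python
# Simple Linear Regression from scratch, assuming a small made-up dataset.
import numpy as np
import matplotlib.pyplot as plt

def estimate_coef(x, y):
    # number of observations n
    n = np.size(x)
    # mean of x and y vector
    m_x, m_y = np.mean(x), np.mean(y)
    # cross-deviation and deviation about x
    SS_xy = np.sum(y * x) - n * m_y * m_x
    SS_xx = np.sum(x * x) - n * m_x * m_x
    # regression coefficients: slope b_1 and intercept b_0
    b_1 = SS_xy / SS_xx
    b_0 = m_y - b_1 * m_x
    return (b_0, b_1)

def plot_regression_line(x, y, b):
    # plot the actual points as a scatter plot
    plt.scatter(x, y, color = "m", marker = "o", s = 30)
    # predicted response vector
    y_pred = b[0] + b[1] * x
    # plot the regression line and put labels on the axes
    plt.plot(x, y_pred, color = "g")
    plt.xlabel('x')
    plt.ylabel('y')
    plt.show()

def main():
    # provide the dataset (illustrative values)
    x = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
    y = np.array([1, 3, 2, 5, 7, 8, 8, 9, 10, 12])
    b = estimate_coef(x, y)
    print("Estimated coefficients:\nb_0 = {}\nb_1 = {}".format(b[0], b[1]))
    plot_regression_line(x, y, b)

if __name__ == "__main__":
    main()
```

The closed-form coefficients follow from minimising the sum of squared residuals: b_1 = SS_xy / SS_xx and b_0 = mean(y) - b_1 * mean(x).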
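For the scikit-learn route (Example 2), a minimal sketch might look as follows. The choice of the diabetes dataset, the single feature kept, and the train/test split parameters are assumptions made for illustration; the residual-plot lines mirror the fragments quoted above −

```python
# Sketch of SLR with a scikit-learn dataset; dataset choice is an assumption.
import matplotlib.pyplot as plt
from sklearn import datasets, linear_model
from sklearn.model_selection import train_test_split

X, y = datasets.load_diabetes(return_X_y=True)
X = X[:, 2:3]  # keep a single feature so this remains simple linear regression

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1)

reg = linear_model.LinearRegression()
reg.fit(X_train, y_train)

# R^2 (variance) score on the held-out test data
print('Variance score: {}'.format(reg.score(X_test, y_test)))

# residual errors for train and test data, with a zero reference line
plt.scatter(reg.predict(X_train), reg.predict(X_train) - y_train,
            color = "green", s = 10, label = 'Train data')
plt.scatter(reg.predict(X_test), reg.predict(X_test) - y_test,
            color = "blue", s = 10, label = 'Test data')
plt.hlines(y = 0, xmin = 0, xmax = 300, linewidth = 2)
plt.legend(loc = 'upper right')
plt.title("Residual errors")
plt.show()
```

Note the hlines span here is widened to the prediction range of this dataset; a residual plot scattered evenly around the zero line, with no visible pattern, is consistent with the no-auto-correlation assumption discussed above.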