# How to use the LINEST function

**What is the LINEST function?**

The LINEST function returns an array of values describing the straight line that best fits your data according to the "least squares" method, namely the slope (m) and intercept (b). By combining LINEST with other functions you can also fit polynomial, logarithmic, exponential, and power curves. The function must be entered as an array formula.

The LINEST function may also calculate the following:

- Standard error of the slope
- R^{2} (coefficient of determination)
- F statistic
- Regression sum of squares (ssreg)
- Standard error of the intercept
- Standard error of the estimated y-values
- Degrees of freedom
- Residual sum of squares (ssresid)


## 1. Introduction

**What is the least square method?**

The least squares method is a standard approach to fitting regression lines and curves to data by minimizing the sum of the squares of the residuals. It finds the "best" line according to this squared error criterion.
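To see the method in action outside a spreadsheet, here is a minimal sketch in Python using NumPy. The data is hypothetical (it is not the worksheet data from this article); `np.polyfit` with degree 1 performs exactly the ordinary least-squares line fit described above, returning the same m and b that LINEST would.

```python
import numpy as np

# Hypothetical sample data; any paired x/y observations would do.
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 5, 4, 5], dtype=float)

# Degree-1 polyfit minimizes the sum of squared residuals,
# returning the slope m and intercept b of the best-fit line.
m, b = np.polyfit(x, y, 1)
print(m, b)  # ≈ 0.6 and 2.2 for this data
```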

**Related functions**

| Function | Description |
|---|---|
| LINEST(known_y's, [known_x's], [const], [stats]) | Returns statistics for a linear trendline fitted to the data |
| SLOPE(known_y's, known_x's) | Returns the slope of the linear regression line |
| INTERCEPT(known_y's, known_x's) | Returns the y-intercept of the linear regression line |
| TREND(known_y's, [known_x's], [new_x's], [const]) | Returns predicted y-values along a linear trend |

## 2. LINEST Function Syntax

LINEST(*known_y's, [known_x's], [const], [stats]*)

## 3. LINEST Function Arguments

y = mx + b is the equation of the straight line that LINEST fits to the data using the "least squares" method, where m is the slope and b is the y-intercept.

| Argument | Description |
|---|---|
| known_y's | Required. The known y-values. In a single column, each row is a separate observation; in a single row, each column is a separate observation. |
| [known_x's] | Optional. The known x-values; the default values are 1, 2, 3, ... |
| [const] | Optional. A boolean value determining whether the constant b is forced to 0 (zero). TRUE (default) - b is calculated normally. FALSE - b is forced to 0 (zero). |
| [stats] | Optional. A boolean value determining whether to calculate additional regression statistics. TRUE - returns the array {mn, mn-1, ..., m1, b; sen, sen-1, ..., se1, seb; r2, sey; F, df; ssreg, ssresid}. FALSE - returns only m and b. |

The following table shows what the LINEST function returns if [stats] argument is TRUE.

| Statistic | Description |
|---|---|
| se1, se2, ..., sen | The standard error values for the coefficients m1, m2, ..., mn. |
| seb | The standard error for the constant b. seb returns #N/A when the [const] argument is FALSE. |
| r^{2} | The coefficient of determination, based on comparing the actual y-values with the y-values that LINEST estimates. 1 means a perfect correlation; 0 (zero) means no correlation. |
| sey | The standard error for the estimated y-values. |
| F | The F statistic, or F-observed value. Use it to determine whether the observed relationship between the dependent and independent variables occurs by chance. |
| df | The degrees of freedom. Use them to look up F-critical values, then compare those to the F statistic to judge the confidence level of the model. |
| ssreg | The regression sum of squares. |
| ssresid | The residual sum of squares. |
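All of the statistics in the table above can be reproduced with NumPy for the simple one-variable case. This sketch uses hypothetical data; the formulas match what LINEST returns when [stats] is TRUE.

```python
import numpy as np

# Hypothetical data, not from the article's worksheet.
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 5, 4, 5], dtype=float)
n = len(x)

# Fit y = m*x + b by ordinary least squares.
m, b = np.polyfit(x, y, 1)
y_hat = m * x + b

ssresid = np.sum((y - y_hat) ** 2)        # residual sum of squares
ssreg = np.sum((y_hat - y.mean()) ** 2)   # regression sum of squares
r2 = ssreg / (ssreg + ssresid)            # coefficient of determination
df = n - 2                                # n minus fitted parameters (m, b)
F = (ssreg / 1) / (ssresid / df)          # F statistic, one predictor
sey = np.sqrt(ssresid / df)               # std. error of estimated y-values
sxx = np.sum((x - x.mean()) ** 2)
se_m = sey / np.sqrt(sxx)                              # std. error of slope
se_b = sey * np.sqrt(1 / n + x.mean() ** 2 / sxx)      # std. error of b
```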

*What is the R-square value?*

The R-square value indicates the goodness of fit, with a value of 1 indicating a perfect fit.

**What are the F-statistic and degrees of freedom?**

The F-statistic and degrees of freedom provided by the LINEST function are used to perform a hypothesis test to assess the overall significance of the regression model.

The F-statistic is a measure of how well the regression model fits the data compared to a model with no predictor variables. A larger F-statistic indicates that the regression model is a better fit than a model with no predictors.

The degrees of freedom represent the number of independent pieces of information used to calculate the statistic. In the case of the LINEST function, the degrees of freedom represent:

- Degrees of freedom for the regression: This is the number of predictor variables in the model.
- Degrees of freedom for the residuals: This is the number of data points minus the number of parameters estimated (including the intercept if included).

Together, the F-statistic and the degrees of freedom are used to calculate a p-value, which determines the statistical significance of the regression model.

A low p-value (typically less than 0.05) indicates that the regression model is statistically significant, meaning that at least one of the predictor variables has a significant relationship with the dependent variable.

On the other hand, a high p-value (greater than 0.05) suggests that the regression model is not statistically significant, and the predictor variables do not have a significant relationship with the dependent variable.

By evaluating the p-value based on the F-statistic and degrees of freedom, you can determine whether the regression model is reliable for making predictions or if the predictor variables are not useful for explaining the variation in the dependent variable.
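The p-value described above can be computed from the F statistic and the degrees of freedom using the survival function of the F distribution. This sketch assumes SciPy is available and uses the same hypothetical data as before.

```python
import numpy as np
from scipy.stats import f as f_dist  # assumes SciPy is installed

# Hypothetical data, not from the article's worksheet.
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 5, 4, 5], dtype=float)
n = len(x)

m, b = np.polyfit(x, y, 1)
y_hat = m * x + b
ssreg = np.sum((y_hat - y.mean()) ** 2)
ssresid = np.sum((y - y_hat) ** 2)
df = n - 2                          # residual degrees of freedom
F = (ssreg / 1) / (ssresid / df)    # one regression degree of freedom

# Survival function (1 - CDF) of the F distribution gives the p-value.
p_value = f_dist.sf(F, 1, df)
print(p_value)  # compare against your significance level, e.g. 0.05
```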

Note that the F value returned by the LINEST function differs from what the FTEST function returns:

- LINEST function: the F statistic itself
- FTEST function: a probability

**What is the regression sum of squares?**

The regression sum of squares (ssreg) measures how much of the variation in the dependent variable is explained by the regression line. It is the sum of the squared differences between the values predicted by the regression line and the mean of the observed values:

ssreg = Σ(ŷ_{i} - ȳ)^{2}

**What is the residual sum of squares?**

The residual sum of squares (ssresid) is a measure of the discrepancy between the observed values of the dependent variable and the values predicted by the regression model. It is the sum of the squared residuals, where a residual is the difference between an observed value and its corresponding predicted value from the regression line:

ssresid = Σ(y_{i} - ŷ_{i})^{2}

- y_{i} is the observed value of the dependent variable for observation i
- ŷ_{i} is the predicted value of the dependent variable for observation i, from the estimated regression line
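The two sums of squares are linked by a useful identity: when the model includes an intercept, the regression sum of squares and the residual sum of squares add up to the total sum of squares Σ(y_{i} - ȳ)². A quick numeric check on hypothetical data:

```python
import numpy as np

# Hypothetical data, not from the article's worksheet.
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 5, 4, 5], dtype=float)
m, b = np.polyfit(x, y, 1)
y_hat = m * x + b

ssreg = np.sum((y_hat - y.mean()) ** 2)   # Σ(ŷ_i - ȳ)²
ssresid = np.sum((y - y_hat) ** 2)        # Σ(y_i - ŷ_i)²
sstot = np.sum((y - y.mean()) ** 2)       # total sum of squares

# With an intercept in the model, ssreg + ssresid equals sstot.
print(ssreg + ssresid, sstot)
```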

## 4. LINEST Function Example

Array formula in cell range C31:D31:

To enter an array formula, type the formula in a cell, press and hold CTRL + SHIFT simultaneously, then press Enter once. Release all keys.

The formula bar now shows the formula enclosed in curly brackets, telling you that you entered the array formula successfully. Don't type the curly brackets yourself.

Array formula in cell range E19:F23:

The image shows a straight line based on the m and b parameters that the LINEST function calculated in cell C31 and D31.

The blue dots are the known y's in column B in the top image.

The SLOPE and INTERCEPT functions also let you find the parameters m and b needed to describe the straight line function.

y = mx + b

The TREND function calculates the y-values needed to plot the straight line function; the GROWTH function does the same for exponential curves.
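What TREND does for a linear fit can be sketched in Python: fit m and b, then evaluate y = mx + b at new x-values (the equivalent of TREND's [new_x's] argument). The data and the new x's here are hypothetical.

```python
import numpy as np

# Hypothetical data, not from the article's worksheet.
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 5, 4, 5], dtype=float)
m, b = np.polyfit(x, y, 1)

# New x's to predict for, like TREND's [new_x's] argument.
new_x = np.array([6, 7, 8], dtype=float)
trend = m * new_x + b  # predicted y-values along the fitted line
print(trend)
```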

### Functions in 'Statistical' category

The LINEST function is one of 73 functions in the 'Statistical' category.
