Statistics By Jim

Making statistics intuitive

Regression

Understanding Historians’ Rankings of U.S. Presidents using Regression Models

By Jim Frost 6 Comments

Historians rank the U.S. Presidents from best to worst using all the historical knowledge at their disposal. Frequently, groups such as C-SPAN ask these historians to rank the Presidents and average the results together to help reduce bias. The idea is to produce a set of rankings that incorporates a broad range of historians, a vast array of information, and a historical perspective. These rankings include informed assessments of each President’s effectiveness, leadership, moral authority, administrative skills, economic management, vision, and so on. [Read more…] about Understanding Historians’ Rankings of U.S. Presidents using Regression Models

Filed Under: Regression Tagged With: analysis example, graphs, interpreting results

Proxy Variables: The Good Twin of Confounding Variables

By Jim Frost 6 Comments

Proxy variables are easily measurable variables that analysts include in a model in place of a variable that cannot be measured or is difficult to measure. A proxy variable might not be of any great interest itself, but it has a close correlation with the variable of interest. [Read more…] about Proxy Variables: The Good Twin of Confounding Variables
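
To make the idea concrete, here is a minimal simulation sketch, not taken from the post, in which an easily measured test score stands in for an unmeasured ability variable; all names and numbers are invented for illustration.

```python
# Illustrative simulation: substituting a measurable proxy for an unmeasured variable.
# The variable names ('ability', 'test_score') are invented for this sketch.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
ability = rng.normal(size=n)                           # assumed unmeasurable in practice
test_score = ability + rng.normal(scale=0.3, size=n)   # measurable and highly correlated
y = 2.0 * ability + rng.normal(size=n)                 # outcome actually driven by ability

# Fit the model using the proxy in place of the unmeasured variable.
fit = sm.OLS(y, sm.add_constant(test_score)).fit()
print(fit.params)  # slope has the correct sign and is near 2, slightly attenuated by proxy noise
```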

Filed Under: Regression Tagged With: conceptual

Variance Inflation Factors (VIFs)

By Jim Frost 15 Comments

Variance Inflation Factors (VIFs) measure the correlation among independent variables in least squares regression models. Statisticians refer to this type of correlation as multicollinearity. Excessive multicollinearity can cause problems for regression models.

In this post, I focus on how VIFs detect multicollinearity, why they’re better than pairwise correlations, how to calculate VIFs yourself, and how to interpret them. If you need a refresher about the types of problems that multicollinearity causes and how to fix them, read my post: Multicollinearity: Problems, Detection, and Solutions. [Read more…] about Variance Inflation Factors (VIFs)
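
As a preview of the calculation, here is a minimal sketch that computes each VIF by regressing one predictor on the others and applying VIF = 1 / (1 - R^2); the data and column names are simulated for illustration.

```python
# Sketch of calculating VIFs by hand: regress each predictor on the rest and
# use VIF = 1 / (1 - R^2). The data and column names are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.5, size=200)   # strongly correlated with x1 -> multicollinearity
x3 = rng.normal(size=200)
X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})

def vif(df, col):
    """VIF for one predictor: 1 / (1 - R^2) from regressing it on the other predictors."""
    others = sm.add_constant(df.drop(columns=col))
    r_squared = sm.OLS(df[col], others).fit().rsquared
    return 1.0 / (1.0 - r_squared)

for col in X.columns:
    print(col, round(vif(X, col), 2))   # x1 and x2 show inflated values; x3 stays near 1
```

statsmodels also ships a variance_inflation_factor helper in statsmodels.stats.outliers_influence that automates this calculation.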

Filed Under: Regression Tagged With: assumptions, conceptual, interpreting results

How to Perform Regression Analysis using Excel

By Jim Frost 14 Comments

Excel can perform various statistical analyses, including regression analysis. It is a great option because nearly everyone can access Excel. This post is an excellent introduction to performing and interpreting regression analysis, even if Excel isn’t your primary statistical software package.

[Read more…] about How to Perform Regression Analysis using Excel
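
If you want to double-check Excel's output with other software, a rough sketch like the one below, using a hypothetical workbook and placeholder column names, fits the same kind of least squares model in Python.

```python
# Hypothetical cross-check (not from the post): fit the same OLS model that
# Excel's regression tool would fit. The file and column names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_excel("regression_data.xlsx")              # placeholder workbook
fit = smf.ols("output ~ input1 + input2", data=df).fit()
print(fit.summary())   # coefficients, R-squared, and p-values to compare against Excel's table
```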

Filed Under: Regression Tagged With: analysis example, Excel, interpreting results

New eBook Release! Regression Analysis: An Intuitive Guide

By Jim Frost 79 Comments

I’m thrilled to announce the release of my first ebook! Regression Analysis: An Intuitive Guide for Using and Interpreting Linear Models.

If you like the clear writing style I use on this website, you’ll love this book! The end of the post displays the entire table of contents! You can also download a Free Sample that includes the complete Table of Contents and the first two chapters. Go to My Store to download the ebook sample. [Read more…] about New eBook Release! Regression Analysis: An Intuitive Guide

Filed Under: Regression Tagged With: ebook

Confounding Variables Can Bias Your Results

By Jim Frost 61 Comments

Omitted variable bias occurs when a regression model leaves out relevant independent variables, which are known as confounding variables. This condition forces the model to attribute the effects of omitted variables to variables that are in the model, which biases the coefficient estimates. [Read more…] about Confounding Variables Can Bias Your Results
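
A small simulation makes the mechanism visible; the variable names below are invented for illustration, not taken from the post.

```python
# Small simulation of omitted variable bias: leaving out a confounder shifts
# the coefficient of the included variable.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 5000
confounder = rng.normal(size=n)
x = 0.8 * confounder + rng.normal(size=n)           # x is correlated with the confounder
y = 1.0 * x + 2.0 * confounder + rng.normal(size=n)

full = sm.OLS(y, sm.add_constant(np.column_stack([x, confounder]))).fit()
omitted = sm.OLS(y, sm.add_constant(x)).fit()

print(full.params[1])     # close to the true coefficient of 1.0
print(omitted.params[1])  # biased upward because x absorbs the confounder's effect
```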

Filed Under: Regression Tagged With: assumptions, conceptual

The Gauss-Markov Theorem and BLUE OLS Coefficient Estimates

By Jim Frost 24 Comments

The Gauss-Markov theorem states that if your linear regression model satisfies the first six classical assumptions, then ordinary least squares (OLS) regression produces coefficient estimates that are unbiased and have the smallest variance of all possible linear unbiased estimators. [Read more…] about The Gauss-Markov Theorem and BLUE OLS Coefficient Estimates
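
In standard textbook notation (this formulation is mine, not a quotation from the post), the BLUE property reads:

```latex
% Gauss-Markov in standard notation (requires amsmath/amssymb). For the linear
% model y = X\beta + \varepsilon with E[\varepsilon | X] = 0 and
% Var(\varepsilon | X) = \sigma^2 I, the OLS estimator is best (minimum
% variance) among all linear unbiased estimators.
\begin{gather*}
  \hat{\beta}_{\mathrm{OLS}} = (X^{\top}X)^{-1}X^{\top}y, \\
  \operatorname{Var}(\tilde{\beta} \mid X) - \operatorname{Var}(\hat{\beta}_{\mathrm{OLS}} \mid X) \succeq 0
  \quad \text{for every linear unbiased estimator } \tilde{\beta},
\end{gather*}
% where "\succeq 0" means the difference of covariance matrices is positive semi-definite.
```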

Filed Under: Regression Tagged With: assumptions

7 Classical Assumptions of Ordinary Least Squares (OLS) Linear Regression

By Jim Frost 111 Comments


Ordinary Least Squares (OLS) is the most common estimation method for linear models—and that’s true for a good reason. As long as your model satisfies the OLS assumptions for linear regression, you can rest easy knowing that you’re getting the best possible estimates. [Read more…] about 7 Classical Assumptions of Ordinary Least Squares (OLS) Linear Regression

Filed Under: Regression Tagged With: assumptions

Regression Tutorial with Analysis Examples

By Jim Frost 78 Comments


Regression analysis mathematically describes the relationship between independent variables and the dependent variable. It also allows you to predict the mean value of the dependent variable when you specify values for the independent variables. In this regression tutorial, I gather together a wide range of posts that I’ve written about regression analysis. My tutorial helps you go through the regression content in a systematic and logical order. [Read more…] about Regression Tutorial with Analysis Examples

Filed Under: Regression Tagged With: guide

Choosing the Correct Type of Regression Analysis

By Jim Frost 449 Comments


Regression analysis mathematically describes the relationship between a set of independent variables and a dependent variable. There are numerous types of regression models that you can use. This choice often depends on the kind of data you have for the dependent variable and the type of model that provides the best fit. In this post, I cover the more common types of regression analyses and how to decide which one is right for your data. [Read more…] about Choosing the Correct Type of Regression Analysis

Filed Under: Regression Tagged With: choosing analysis, data types

Understanding Interaction Effects in Statistics

By Jim Frost 411 Comments


Interaction effects occur when the effect of one variable depends on the value of another variable. Interaction effects are common in regression analysis, ANOVA, and designed experiments. In this blog post, I explain interaction effects, how to interpret them in statistical designs, and the problems you will face if you don’t include them in your model. [Read more…] about Understanding Interaction Effects in Statistics
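
For readers who want to see what fitting an interaction looks like in practice, here is a minimal sketch with simulated data and invented variable names (the post itself does not use this code).

```python
# Minimal sketch of fitting an interaction with the statsmodels formula interface.
# The data are simulated and the column names are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
df = pd.DataFrame({"temp": rng.normal(size=300), "pressure": rng.normal(size=300)})
# The effect of temp on strength depends on the value of pressure (the interaction).
df["strength"] = (1.5 * df.temp + 0.5 * df.pressure
                  + 2.0 * df.temp * df.pressure + rng.normal(size=300))

fit = smf.ols("strength ~ temp * pressure", data=df).fit()  # '*' adds main effects plus the interaction
print(fit.params)  # the temp:pressure coefficient estimates the interaction effect
```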

Filed Under: Regression Tagged With: analysis example, conceptual, graphs, interpreting results

When Should I Use Regression Analysis?

By Jim Frost 159 Comments

Use regression analysis to describe the relationships between a set of independent variables and the dependent variable. Regression analysis produces a regression equation where the coefficients represent the relationship between each independent variable and the dependent variable. You can also use the equation to make predictions.

As a statistician, I should probably tell you that I love all statistical analyses equally—like parents with their kids. But, shhh, I have a secret! Regression analysis is my favorite because it provides tremendous flexibility, which makes it useful in so many different circumstances. In fact, I’ve described regression analysis as taking correlation to the next level!

In this blog post, I explain the capabilities of regression analysis, the types of relationships it can assess, how it controls the variables, and generally why I love it! You’ll learn when you should consider using regression analysis. [Read more…] about When Should I Use Regression Analysis?
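
As a bare-bones illustration of those two uses, describing relationships through coefficients and making predictions, here is a minimal sketch with simulated data and invented variable names.

```python
# Minimal fit-and-predict sketch; the data and names are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
df = pd.DataFrame({"hours": rng.uniform(0, 10, 200), "prep": rng.uniform(0, 5, 200)})
df["score"] = 50 + 3 * df.hours + 2 * df.prep + rng.normal(scale=5, size=200)

fit = smf.ols("score ~ hours + prep", data=df).fit()
print(fit.params)                                  # coefficients describe each relationship
new = pd.DataFrame({"hours": [8], "prep": [3]})
print(fit.predict(new))                            # predicted mean score for the new values
```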

Filed Under: Regression Tagged With: conceptual

Using Log-Log Plots to Determine Whether Size Matters

By Jim Frost 3 Comments

Log-log plots display data in two dimensions where both axes use logarithmic scales. When one variable changes as a constant power of another, a log-log graph shows the relationship as a straight line. In this post, I’ll show you why these graphs are valuable and how to interpret them. [Read more…] about Using Log-Log Plots to Determine Whether Size Matters
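
The straight line comes directly from the algebra of logarithms; in standard notation:

```latex
% A power relationship becomes linear after taking logs of both sides, so on
% log-log axes the data fall on a line with slope b and intercept log(a).
\begin{equation*}
  y = a x^{b}
  \;\Longrightarrow\;
  \log y = \log a + b \log x .
\end{equation*}
```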

Filed Under: Regression Tagged With: analysis example, graphs, interpreting results

When Do You Need to Standardize the Variables in a Regression Model?

By Jim Frost 67 Comments

Standardization is the process of putting different variables on the same scale. In regression analysis, there are some scenarios where it is crucial to standardize your independent variables or risk obtaining misleading results.

In this blog post, I show when and why you need to standardize your variables in regression analysis. Don’t worry, this process is simple and helps ensure that you can trust your results. In fact, standardizing your variables can reveal essential findings that you would otherwise miss! [Read more…] about When Do You Need to Standardize the Variables in a Regression Model?
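
For reference, here is a minimal sketch of the mechanics, subtracting each predictor's mean and dividing by its standard deviation before fitting; the data and column names are invented for illustration.

```python
# Minimal sketch of standardizing predictors (z-scores) before fitting.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
df = pd.DataFrame({"dose_mg": rng.normal(200, 30, 150), "age_yrs": rng.normal(40, 12, 150)})
y = 0.02 * df.dose_mg - 0.1 * df.age_yrs + rng.normal(size=150)

z = (df - df.mean()) / df.std()                 # standardized predictors
fit = sm.OLS(y, sm.add_constant(z)).fit()
print(fit.params)  # coefficients are now per one standard deviation, so they are comparable
```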

Filed Under: Regression Tagged With: analysis example, interpreting results

Why Are There No P Values in Nonlinear Regression?

By Jim Frost 27 Comments

Nonlinear regression analysis cannot calculate P values for the independent variables in your model. Why not? And, what do you use instead? Those are the topics of this blog post. [Read more…] about Why Are There No P Values in Nonlinear Regression?

Filed Under: Regression Tagged With: conceptual

Five Regression Analysis Tips to Avoid Common Problems

By Jim Frost 15 Comments

Regression is a very powerful statistical analysis. It allows you to isolate and understand the effects of individual variables, model curvature and interactions, and make predictions. Regression analysis offers high flexibility but presents a variety of potential pitfalls. Great power requires great responsibility!

In this post, I offer five tips that will not only help you avoid common problems but also make the modeling process easier. I’ll close by showing you the difference between the modeling process that a top analyst uses versus the procedure of a less rigorous analyst. [Read more…] about Five Regression Analysis Tips to Avoid Common Problems

Filed Under: Regression Tagged With: conceptual

Understand Precision in Predictive Analytics to Avoid Costly Mistakes

By Jim Frost 8 Comments

Precision in predictive analytics refers to how close the model’s predictions are to the observed values. The more precise the model, the closer the data points are to the predictions. When you have an imprecise model, the observations tend to be further away from the predictions, thereby reducing the usefulness of the predictions. If you have a model that is not sufficiently precise, you risk making costly mistakes! [Read more…] about Understand Precision in Predictive Analytics to Avoid Costly Mistakes
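
One common way to put a number on that precision is a prediction interval; here is a minimal sketch with simulated data (not from the post) using statsmodels.

```python
# Minimal sketch of gauging prediction precision with a prediction interval.
# The data are simulated; a noisier model would produce wider intervals.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
x = rng.uniform(0, 10, 200)
y = 3 + 2 * x + rng.normal(scale=4, size=200)

fit = sm.OLS(y, sm.add_constant(x)).fit()

new_x = np.array([2.0, 8.0])
new_X = np.column_stack([np.ones_like(new_x), new_x])   # constant column plus the new x values
pred = fit.get_prediction(new_X)
print(pred.summary_frame(alpha=0.05))  # obs_ci_lower/upper bound where a new observation should fall
```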

Filed Under: Regression Tagged With: analysis example, conceptual, graphs, interpreting results

Heteroscedasticity in Regression Analysis

By Jim Frost 57 Comments

Heteroscedasticity means unequal scatter. In regression analysis, we talk about heteroscedasticity in the context of the residuals or error term. Specifically, heteroscedasticity is a systematic change in the spread of the residuals over the range of measured values. Heteroscedasticity is a problem because ordinary least squares (OLS) regression assumes that all residuals are drawn from a population that has a constant variance (homoscedasticity).

To satisfy the regression assumptions and be able to trust the results, the residuals should have a constant variance. In this blog post, I show you how to identify heteroscedasticity, explain what produces it and the problems it causes, and work through an example that demonstrates several solutions. [Read more…] about Heteroscedasticity in Regression Analysis
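
As a preview, here is a minimal sketch, using simulated data rather than the post's example, that produces the classic fan-shaped residual plot and runs a Breusch-Pagan test.

```python
# Minimal sketch with simulated data: the residual plot shows the classic fan
# shape, and the Breusch-Pagan test checks for non-constant variance.
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(7)
x = rng.uniform(1, 10, 300)
y = 2 + 3 * x + rng.normal(scale=x)         # the error spread grows with x

X = sm.add_constant(x)
fit = sm.OLS(y, X).fit()

plt.scatter(fit.fittedvalues, fit.resid, s=10)
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.show()                                  # a fan shape suggests heteroscedasticity

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(fit.resid, X)
print(lm_pvalue)                            # a small p-value indicates non-constant variance
```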

Filed Under: Regression Tagged With: assumptions, conceptual, graphs

How to Choose Between Linear and Nonlinear Regression

By Jim Frost 31 Comments

As you fit regression models, you might need to make a choice between linear and nonlinear regression models. The field of statistics can be weird. Despite their names, both forms of regression can fit curvature in your data. So, how do you choose? In this blog post, I show you how to choose between linear and nonlinear regression models. [Read more…] about How to Choose Between Linear and Nonlinear Regression

Filed Under: Regression Tagged With: analysis example, assumptions, choosing analysis, conceptual, interpreting results

Model Specification: Choosing the Correct Regression Model

By Jim Frost 50 Comments

Model specification is the process of determining which independent variables to include and exclude from a regression equation. How do you choose the best regression model? The world is complicated, and trying to explain it with a small sample doesn’t help. In this post, I’ll show you how to select the correct model. I’ll cover statistical methods, difficulties that can arise, and provide practical suggestions for selecting your model. Often, the variable selection process is a mixture of statistics, theory, and practical knowledge. [Read more…] about Model Specification: Choosing the Correct Regression Model

Filed Under: Regression Tagged With: conceptual
