Fixed and Random Factors

By Jim Frost

In ANOVA, factors are either fixed or random. In general, if the investigator controls the levels of a factor, the factor is fixed. The investigator gathers data for all factor levels she is interested in.

On the other hand, if the investigator randomly samples the levels of a factor from a larger population of levels, the factor is random. A random factor has many possible levels, and the investigator is interested in all of them, but she can collect data for only a random sample of those levels.

Suppose you have a factor called “operator,” and it has ten levels. If you intentionally select these ten operators and want your results to apply to just these operators, then the factor is fixed. However, if you randomly sample ten operators from a larger number of operators, and you want your results to apply to all operators, then the factor is random.

These two types of factors require different types of analyses. The conclusions that you draw from an analysis can be incorrect if you specify the type of factor incorrectly.
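To make the distinction concrete, here is a minimal sketch in Python, assuming the pandas and statsmodels libraries and a small hypothetical dataset, that fits the operator example both ways: once with operator as a fixed factor in an ordinary ANOVA, and once with operator as a random factor in a mixed model.

```python
# A minimal sketch, assuming pandas and statsmodels are available. The data
# values and column names below are hypothetical and only illustrate how the
# model specification changes with the type of factor.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical measurements: three parts measured by each of five operators.
data = pd.DataFrame({
    "operator": ["A", "A", "A", "B", "B", "B", "C", "C", "C",
                 "D", "D", "D", "E", "E", "E"],
    "strength": [40.1, 41.3, 39.8, 42.0, 40.7, 41.9, 38.9, 40.2,
                 41.1, 42.3, 39.5, 40.8, 41.6, 40.4, 39.9],
})

# Fixed factor: operator enters the model as ordinary terms, and the
# conclusions apply only to these five specific operators.
fixed_model = smf.ols("strength ~ C(operator)", data=data).fit()
print(sm.stats.anova_lm(fixed_model, typ=2))

# Random factor: operator enters as a grouping (random) effect, and the
# estimated variance component describes operator-to-operator variation in
# the larger population from which these five operators were sampled.
random_model = smf.mixedlm("strength ~ 1", data=data,
                           groups=data["operator"]).fit()
print(random_model.summary())
```

In the fixed-factor fit, the F-test compares these particular operators. In the random-factor fit, the group variance component estimates how much operators vary in the wider population, which is the quantity of interest when the factor is random.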

Related

Synonyms:
Random factors
Related Articles:
  • Repeated Measures Designs: Benefits and an ANOVA Example
