Intraclass Correlation Coefficient (ICC) Calculator
Calculate reliability from ANOVA mean squares. Select the ICC model, enter your values, and click Calculate ICC.
What Is an ICC Calculator?
An ICC calculator estimates the Intraclass Correlation Coefficient, a reliability index that shows how strongly measurements in the same group resemble each other. In plain English: if multiple raters score the same person (or the same instrument measures the same person repeatedly), ICC quantifies how consistent those measurements are.
ICC estimates are bounded above by 1.0 but can be negative, which happens when within-group variability exceeds between-group variability. Higher values indicate better reliability. Researchers in psychology, medicine, education, sports science, and machine learning commonly report ICC when evaluating observer agreement or test reliability.
Which ICC Model Should You Use?
ICC(1,1) and ICC(1,k): One-Way Random Effects
Use this when each target is rated by a different random set of raters, so the model separates only target variability from residual variability. ICC(1,1) applies to a single rating, while ICC(1,k) applies to the average of k ratings.
ICC(2,1) and ICC(2,k): Two-Way Random Effects, Absolute Agreement
Use this when both targets and raters are random effects and you care about exact agreement. If one rater consistently scores 5 points higher than another, absolute agreement penalizes that difference.
ICC(3,1) and ICC(3,k): Two-Way Mixed Effects, Consistency
Use this when raters are fixed (specific raters in your study) and you care about rank-order consistency rather than exact matching. This is common when the same panel of experts always rates all subjects.
Inputs Required by This Tool
- n: number of targets/subjects (required for the ICC(2,*) formulas)
- k: number of raters or repeated measurements
- MSB / MSW: between-targets and within-targets mean squares from a one-way ANOVA
- MSR / MSC / MSE: rows (targets), columns (raters), and residual mean squares from a two-way ANOVA
If your stats software outputs ANOVA tables, you can copy these mean square values directly into this calculator.
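If you have the raw ratings table rather than an ANOVA output, the mean squares can be computed directly. The sketch below (function and variable names are illustrative, not part of this calculator) derives all five mean squares from a complete n-by-k table with targets as rows and raters as columns:

```python
def anova_mean_squares(ratings):
    """Return (MSB, MSW, MSR, MSC, MSE) for a complete n x k ratings table.

    ratings: list of n rows (targets), each with k values (raters).
    """
    n = len(ratings)           # number of targets
    k = len(ratings[0])        # number of raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]

    ss_rows = k * sum((m - grand) ** 2 for m in row_means)     # between targets
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)     # between raters
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_within = sum((ratings[i][j] - row_means[i]) ** 2
                    for i in range(n) for j in range(k))
    ss_error = ss_total - ss_rows - ss_cols                    # two-way residual

    msb = ss_rows / (n - 1)                # one-way: between targets
    msw = ss_within / (n * (k - 1))        # one-way: within targets
    msr = ss_rows / (n - 1)                # two-way: rows (equals MSB)
    msc = ss_cols / (k - 1)                # two-way: columns
    mse = ss_error / ((n - 1) * (k - 1))   # two-way: residual
    return msb, msw, msr, msc, mse
```

Note that MSB and MSR coincide for complete data; they differ only in which model's error term they are paired with.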
Interpretation Guidelines
There is no universal rule, but a common guideline is:
- < 0.50: poor reliability
- 0.50 to 0.75: moderate reliability
- 0.75 to 0.90: good reliability
- > 0.90: excellent reliability
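The cutoffs above can be expressed as a small helper (a sketch; the exact handling of boundary values is a choice, not a standard):

```python
def interpret_icc(icc):
    """Map an ICC point estimate to the common verbal guideline."""
    if icc < 0.50:
        return "poor"
    if icc < 0.75:
        return "moderate"
    if icc <= 0.90:
        return "good"
    return "excellent"
```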
Always interpret ICC in context. A value considered acceptable in early-stage exploratory work may be too low for high-stakes clinical or engineering decisions.
Formulas Used
One-way models
- ICC(1,1) = (MSB − MSW) / (MSB + (k − 1)MSW)
- ICC(1,k) = (MSB − MSW) / MSB
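The one-way formulas translate directly into code (a minimal sketch; function names are illustrative):

```python
def icc_1_1(msb, msw, k):
    """One-way random effects, single rating."""
    return (msb - msw) / (msb + (k - 1) * msw)

def icc_1_k(msb, msw):
    """One-way random effects, average of k ratings."""
    return (msb - msw) / msb
```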
Two-way models
- ICC(2,1) = (MSR − MSE) / (MSR + (k − 1)MSE + k(MSC − MSE)/n)
- ICC(2,k) = (MSR − MSE) / (MSR + (MSC − MSE)/n)
- ICC(3,1) = (MSR − MSE) / (MSR + (k − 1)MSE)
- ICC(3,k) = (MSR − MSE) / MSR
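Likewise for the two-way formulas (a minimal sketch; note that only the ICC(2,*) forms use n, because only they carry the rater-variance term):

```python
def icc_2_1(msr, msc, mse, k, n):
    """Two-way random effects, absolute agreement, single rating."""
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def icc_2_k(msr, msc, mse, n):
    """Two-way random effects, absolute agreement, average of k ratings."""
    return (msr - mse) / (msr + (msc - mse) / n)

def icc_3_1(msr, mse, k):
    """Two-way mixed effects, consistency, single rating."""
    return (msr - mse) / (msr + (k - 1) * mse)

def icc_3_k(msr, mse):
    """Two-way mixed effects, consistency, average of k ratings."""
    return (msr - mse) / msr
```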
Common Mistakes to Avoid
- Choosing a model based on convention rather than study design.
- Confusing consistency (ICC3) with absolute agreement (ICC2).
- Reporting only point estimates without confidence intervals.
- Using too few targets; ICC can be unstable in small samples.
- Ignoring systematic rater bias when agreement is the real goal.
Practical Notes
This page is great for quick checks and learning. For publication-grade analysis, also compute confidence intervals and verify assumptions using statistical software such as R, SPSS, SAS, Python, or Stata.
Still, if you already have mean squares from an ANOVA table, this ICC calculator gives a fast and transparent reliability estimate in seconds.