
How can ICC reliability be higher for absolute agreement than for consistency?

Question & Answer


Question

I used the Reliability procedure in SPSS (Analyze->Scale->Reliability Analysis) and requested intraclass correlations (ICCs) with a 2-way mixed model. For comparison purposes, I ran this model once with the absolute agreement definition and once with the consistency definition. I was surprised to see that the ICC was higher for the absolute agreement definition than for the consistency definition. As I considered absolute agreement to be the more stringent definition, this result seems counterintuitive. Please provide some definitions of these criteria that will explain this result.

Answer

For definitions of ICCs for consistency and absolute agreement, see the following article and its correction, which are an excellent (and probably essential) resource for understanding the ICC output from SPSS.

McGraw, K. O., & Wong, S. P. (1996a). Forming inferences about some intraclass correlation coefficients. Psychological Methods, 1, 30-46.

McGraw, K. O., & Wong, S. P. (1996b). Correction to McGraw and Wong (1996). Psychological Methods, 1, 390.


(Note that the corrections in 1996b do not involve the formula cited below.)

Here are a few quotes from the 1996a article that highlight the distinction between consistency and agreement ICCs.

"Understanding the conceptual difference between them begins by noting their formal distinction, which is in the definition of the ICC denominator. For consistency measures, column variance is excluded from denominator variance, and for absolute agreement it is not. Column variance is excluded from the denominators of consistency measures because it is deemed to be an irrelevant source of variance." (p. 33)

"In this case [absolute agreement], when measurements disagree in absolute value, regardless of the reason, they are viewed as disagreements. Thus, paired scores (2,4), (4,6), and (6,8) are in perfect agreement using a consistency definition [ICC(C,1)=1.00] but not an absolute agreement definition [ICC(A,1)=.67]." (p. 34)

The RELIABILITY algorithms are available via Help>Algorithms in IBM SPSS Statistics.

See the definitions of the single-measure ICC for a 2-way model under the consistency definition (Type C) and under the absolute agreement definition (Type A). The formula for the absolute agreement ICC has an extra term in the denominator:

k(MSbm - MSres)/W

where k is the number of variables in the scale and W is the sum of case weights, which equals N when no weight variable is assigned. Usually MSbm (the mean square between measures) is larger than MSres (the residual mean square), so adding this term enlarges the denominator and lowers the ICC, giving a smaller reliability under absolute agreement than under consistency. However, if MSres is larger than MSbm, the term is negative, the denominator shrinks, and the absolute agreement reliability exceeds the consistency reliability.
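Continuing the sketch above, a small hypothetical dataset in which the two raters share the same mean (so MSbm is zero) but disagree case by case (so MSres exceeds MSbm) reproduces the counterintuitive result from the question:

```python
# Reusing icc_single() from the sketch above. Column means are equal,
# so MSbm = 0 while MSres > 0: the extra denominator term is negative.
data = [[1, 2], [2, 1], [3, 4], [4, 3]]
icc_c, icc_a = icc_single(data)
print(f"ICC(C,1) = {icc_c:.3f}")  # 0.600
print(f"ICC(A,1) = {icc_a:.3f}")  # 0.667 -- higher than consistency
```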

[{"Product":{"code":"SSLVMB","label":"IBM SPSS Statistics"},"Business Unit":{"code":"BU059","label":"IBM Software w\/o TPS"},"Component":"Not Applicable","Platform":[{"code":"PF025","label":"Platform Independent"}],"Version":"Not Applicable","Edition":"","Line of Business":{"code":"LOB10","label":"Data and AI"}}]

Historical Number

25326

Document Information

Modified date:
16 April 2020

UID

swg21477361