What is Hedges’ g and what is its definition and example?

Hedges’ g is a statistical measure used to determine the effect size of a treatment or intervention in a research study. It is a standardized mean difference that compares the average scores of two groups on a continuous variable. Because it accounts for the variability within the groups, it gives a more meaningful estimate of the size of the effect than the raw difference in means alone.

Hedges’ g is calculated by dividing the difference between the two group means by the pooled standard deviation. The result is a unitless value that expresses the difference between the two groups in standard deviation units.

For example, in a study comparing the effectiveness of two different teaching methods on student test scores, Hedges’ g can be used to quantify the magnitude of the difference between the average scores of the two groups. A larger g value indicates a stronger effect, while a smaller g value indicates a weaker one.

What is Hedges’ g? (Definition & Example)

In statistics, we often use hypothesis tests to determine if there is a statistically significant difference between two groups.

However, while a p-value can tell us whether or not there is a statistically significant difference between two groups, an effect size can tell us the size of this difference.

One of the most common ways to measure effect size is to use Hedges’ g, which is calculated as follows:

g = (x̄1 – x̄2) / √[((n1-1)*s1² + (n2-1)*s2²) / (n1+n2-2)]

where:

  • x̄1, x̄2: The sample 1 mean and sample 2 mean, respectively
  • n1, n2: The sample 1 size and sample 2 size, respectively
  • s1², s2²: The sample 1 variance and sample 2 variance, respectively
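
To make the formula concrete, here is a minimal Python sketch of the calculation (the function name hedges_g and its signature are illustrative choices, not a standard library API):

```python
import math

def hedges_g(x1, x2, s1, s2, n1, n2):
    """Hedges' g: mean difference divided by the pooled standard deviation."""
    # Pool the two variances, weighting each by its degrees of freedom (n - 1)
    pooled_var = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    # Standardize the mean difference by the pooled standard deviation
    return (x1 - x2) / math.sqrt(pooled_var)
```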

The following example shows how to calculate Hedges’ g for two samples.

Example: Calculating Hedges’ g

Suppose we have the following two samples:

Sample 1:

  • x̄1: 15.2
  • s1: 4.4
  • n1: 39

Sample 2:

  • x̄2: 14
  • s2: 3.6
  • n2: 34

Here is how to calculate Hedges’ g for these two samples:

  • g = (x̄1 – x̄2) / √[((n1-1)*s1² + (n2-1)*s2²) / (n1+n2-2)]
  • g = (15.2 – 14) / √[((39-1)*4.4² + (34-1)*3.6²) / (39+34-2)]
  • g = 1.2 / 4.04788
  • g = 0.29645

Hedges’ g turns out to be 0.29645.
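
Plugging the two samples into the hypothetical hedges_g sketch from above reproduces the same arithmetic:

```python
g = hedges_g(x1=15.2, x2=14, s1=4.4, s2=3.6, n1=39, n2=34)
print(round(g, 5))  # 0.29645
```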

Bonus: an online Hedges’ g calculator can be used to automatically calculate Hedges’ g for any two samples.

How to Interpret Hedges’ g

As a rule of thumb, here is how to interpret Hedges’ g:

  • 0.2 = Small effect size
  • 0.5 = Medium effect size
  • 0.8 = Large effect size

In our example, an effect size of 0.29645 would likely be considered a small effect size. This means that even if the difference between the two group means is statistically significant, the actual difference between the group means is small in practical terms.
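
If you want to apply this rule of thumb in code, one possible sketch is below; note that the cutoffs are conventions, and how to label values that fall between them is a judgment call:

```python
def interpret_g(g):
    """Label |g| using the conventional 0.2 / 0.5 / 0.8 thresholds."""
    g = abs(g)
    if g < 0.2:
        return "negligible"
    if g < 0.5:
        return "small"
    if g < 0.8:
        return "medium"
    return "large"

print(interpret_g(0.29645))  # small
```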

Hedges’ g vs. Cohen’s d

Another common way to measure effect size is known as Cohen’s d, which uses the following formula:

d = (x̄1 – x̄2) / √[(s1² + s2²) / 2]

The only difference between Cohen’s d and Hedges’ g is that Hedges’ g weights each sample’s variance by its sample size (through the n1-1 and n2-1 terms) when pooling, whereas Cohen’s d simply averages the two variances.

Thus, it’s recommended to use Hedges’ g to calculate effect size when the two sample sizes are not equal.

If the two sample sizes are equal, then Hedges’ g and Cohen’s d will be exactly the same value.
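
A quick numerical check of this claim, using the hypothetical hedges_g sketch from above together with a matching sketch of Cohen’s d:

```python
import math

def cohens_d(x1, x2, s1, s2):
    """Cohen's d as defined above: mean difference over the unweighted
    root mean square of the two standard deviations."""
    return (x1 - x2) / math.sqrt((s1**2 + s2**2) / 2)

# Equal sample sizes: the two measures agree exactly
print(hedges_g(15.2, 14, 4.4, 3.6, 30, 30))  # 0.2985...
print(cohens_d(15.2, 14, 4.4, 3.6))          # 0.2985...

# Unequal sample sizes: Hedges' g diverges from Cohen's d
print(hedges_g(15.2, 14, 4.4, 3.6, 39, 34))  # 0.2965...
```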
