What is considered a good AIC value?

AIC, or the Akaike information criterion, is a measure of the relative quality of a statistical model. A good AIC value is one that is lower than the AIC of the competing models fit to the same data, indicating that the model fits the data well while using as few parameters as possible. This is desirable because a simpler model is less likely to overfit, and therefore more likely to make accurate predictions when applied to new data.


The Akaike information criterion (AIC) is a metric that is used to compare the fit of different regression models.

It is calculated as:

AIC = 2K – 2ln(L)

where:

  • K: The number of model parameters.
  • ln(L): The log-likelihood of the model. This tells us how likely the model is, given the data.

Once you’ve fit several regression models, you can compare the AIC value of each model. The model with the lowest AIC offers the best fit.
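To make this concrete, here is a minimal sketch in Python using statsmodels; the toy dataset and variable names are illustrative assumptions, not from the article. It fits two candidate regression models to the same data, reads off the log-likelihood, and computes AIC = 2K – 2ln(L) by hand alongside the AIC reported by the library.

```python
import numpy as np
import statsmodels.api as sm

# Toy dataset (illustrative): y depends on x1, while x2 is pure noise.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 3 + 2 * x1 + rng.normal(scale=1.0, size=n)

def manual_aic(results):
    """AIC = 2K - 2ln(L), counting the estimated coefficients (slopes + intercept) as K.

    Note: some software also counts the error variance in K, which shifts every
    model's AIC by the same constant and does not change the ranking.
    """
    K = results.df_model + 1           # slopes + intercept
    return 2 * K - 2 * results.llf     # results.llf is the log-likelihood

# Model 1: y ~ x1          Model 2: y ~ x1 + x2
X1 = sm.add_constant(np.column_stack([x1]))
X2 = sm.add_constant(np.column_stack([x1, x2]))
fit1 = sm.OLS(y, X1).fit()
fit2 = sm.OLS(y, X2).fit()

for name, fit in [("y ~ x1", fit1), ("y ~ x1 + x2", fit2)]:
    print(f"{name}: log-likelihood = {fit.llf:.2f}, "
          f"AIC (manual) = {manual_aic(fit):.2f}, AIC (statsmodels) = {fit.aic:.2f}")

# Whichever model prints the lowest AIC offers the best fit among the candidates.
```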

One question students often have about AIC is: What is considered a good AIC value?

The simple answer: There is no AIC value that can be considered “good” or “bad” on its own, because we simply use AIC to compare regression models. The model with the lowest AIC offers the best fit; the absolute value of the AIC is not important.

For example, if Model 1 has an AIC of 730.5 and Model 2 has an AIC of 456.3, then Model 2 offers the better fit, no matter how large either value looks on its own.
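As a tiny illustration of that selection rule, using the AIC values from the example above (the model labels are just placeholders):

```python
# AIC values from the example above; only the ranking matters, not the magnitudes.
aic_values = {"Model 1": 730.5, "Model 2": 456.3}

best = min(aic_values, key=aic_values.get)
print(best)  # Model 2 -- the lowest AIC wins
```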

A useful reference on this topic comes from page 402 of one textbook:

As with likelihood, the absolute value of AIC is largely meaningless (being determined by the arbitrary constant). As this constant depends on the data, AIC can only be used to compare models fitted on identical samples.

 

The best model from the set of plausible models being considered is therefore the one with the smallest AIC value (the least information loss relative to the true model).

As noted in the textbook, the absolute value of the AIC is not important. We simply use AIC values to compare the fit of candidate models, and the model with the lowest AIC value is best.

How to Determine if a Model Fits a Dataset Well

The AIC value is a useful way to determine which regression model, among a list of potential models, fits a dataset best, but it doesn’t actually quantify how well that model fits the data in absolute terms.

For example, a particular regression model might have the lowest AIC value among a list of potential models, yet still fit the data poorly.

To determine if a model fits a dataset well, we can use the following two metrics:

  • Mallows’ Cp: A metric that quantifies the amount of bias in a regression model; values close to the number of model parameters suggest little bias.
  • Adjusted R-squared: The proportion of the variance in the response variable that can be explained by the predictor variables in the model, adjusted for the number of predictor variables in the model.

In practice, the workflow has two steps (see the sketch below):

  • First, identify the model with the lowest AIC value.
  • Then, fit this regression model to the data and calculate the Mallows’ Cp and adjusted R-squared of the model to quantify how well it actually fits the data.

This approach allows you to identify the best-fitting model and quantify how well that model actually fits the data.
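Here is a hedged sketch of that two-step workflow in Python with statsmodels. The dataset and variable names are made up for illustration, and Mallows’ Cp is computed from the usual formula Cp = SSE_p / MSE_full – (n – 2p), where MSE_full comes from the model containing all candidate predictors and p counts the parameters (including the intercept) of the model being evaluated.

```python
import numpy as np
import statsmodels.api as sm

def mallows_cp(candidate, full, n):
    """Cp = SSE_p / MSE_full - (n - 2p); values near p suggest little bias."""
    p = candidate.df_model + 1                      # parameters incl. intercept
    return candidate.ssr / full.mse_resid - (n - 2 * p)

# Illustrative data: y depends on x1 and x2; x3 is pure noise.
rng = np.random.default_rng(1)
n = 200
X_all = rng.normal(size=(n, 3))
y = 1.5 + 2.0 * X_all[:, 0] - 1.0 * X_all[:, 1] + rng.normal(size=n)

# Model containing all candidate predictors (used only to estimate MSE_full).
full = sm.OLS(y, sm.add_constant(X_all)).fit()

# Step 1 (assumed already done): suppose the model with x1 and x2 had the lowest AIC.
best = sm.OLS(y, sm.add_constant(X_all[:, :2])).fit()

# Step 2: quantify how well the chosen model actually fits the data.
p = best.df_model + 1
print(f"Adjusted R-squared: {best.rsquared_adj:.3f}")
print(f"Mallows' Cp: {mallows_cp(best, full, n):.2f} (compare to p = {p:.0f})")
```

An adjusted R-squared close to 1 and a Mallows’ Cp close to the number of parameters indicate that the lowest-AIC model also fits the data well in absolute terms.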
