The Akaike information criterion, or AIC, is a way to compare how well different statistical models describe the same data. It assigns a score to each candidate model that rewards goodness of fit but penalizes model complexity, and the model with the lowest score is estimated to be the best. The score is calculated as AIC = 2k − 2 ln(L̂), where k is the number of fitted parameters and L̂ is the maximized value of the model's likelihood. Because the penalty grows with the number of parameters, AIC discourages overfitting: a more complex model is preferred only if it improves the fit enough to outweigh its extra parameters. AIC is often used in research to help decide which of several candidate models is best supported by the data.
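As a minimal sketch of how the comparison works, the example below fits two polynomial models to the same synthetic data and computes AIC for each, assuming Gaussian errors so the maximized log-likelihood can be written in terms of the residual sum of squares. The helper name gaussian_aic and the synthetic data are illustrative, not from any particular library.

```python
import numpy as np

def gaussian_aic(y, y_pred, n_params):
    """AIC for a least-squares fit, assuming Gaussian errors.

    Uses the maximized log-likelihood with the error variance estimated
    as RSS / n, and counts that variance as one extra fitted parameter.
    """
    n = len(y)
    rss = np.sum((y - y_pred) ** 2)
    sigma2 = rss / n                       # ML estimate of the error variance
    log_likelihood = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = n_params + 1                       # coefficients plus the variance itself
    return 2 * k - 2 * log_likelihood

# Synthetic data: a noisy straight line.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=2.0, size=x.size)

# Candidate models: a straight line and a degree-5 polynomial.
for degree in (1, 5):
    coeffs = np.polyfit(x, y, degree)
    y_pred = np.polyval(coeffs, x)
    aic = gaussian_aic(y, y_pred, n_params=degree + 1)
    print(f"degree {degree}: AIC = {aic:.1f}")
```

Under these assumptions the degree-5 polynomial fits the noise slightly better, but its extra parameters usually push its AIC above that of the straight line, so the simpler model is the one AIC selects.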