# Introduction

In this lab, we will learn how to perform model selection with Gaussian Mixture Models (GMM) using information-theoretic criteria. Model selection concerns both the covariance type and the number of components in the model. We will use the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) to select the best model.

We will generate two components by randomly sampling the standard normal distribution. One component is kept spherical but shifted and re-scaled. The other is deformed to have a more general covariance matrix. A minimal sketch of this workflow is given at the end of this section.

## VM Tips

After the VM startup is done, click the top left corner to switch to the **Notebook** tab to access Jupyter Notebook for practice.

Sometimes, you may need to wait a few seconds for Jupyter Notebook to finish loading. The validation of operations cannot be automated because of limitations in Jupyter Notebook.

If you face issues during learning, feel free to ask Labby. Provide feedback after the session, and we will promptly resolve the problem for you.
Click the virtual machine below to start practicing.
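Below is a minimal sketch, not the lab's exact code, of the workflow described in the introduction, assuming scikit-learn is available. The sample sizes, shift, scale, and deformation matrix are illustrative choices: one component is a spherical, shifted and re-scaled standard-normal sample, the other is deformed by a linear map to obtain a general covariance, and BIC (or, analogously, AIC) is used to pick the covariance type and number of components.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)

# Component 1: spherical, shifted and re-scaled standard-normal draws.
spherical = 0.5 * rng.randn(300, 2) + np.array([3.0, 3.0])

# Component 2: standard-normal draws deformed by a linear map,
# giving a more general (full) covariance matrix.
deform = np.array([[1.0, 0.6], [-0.4, 0.8]])  # illustrative deformation
stretched = rng.randn(300, 2) @ deform

X = np.vstack([spherical, stretched])

# Search over covariance types and component counts, keeping the model
# with the lowest BIC (gmm.aic(X) can be used the same way).
best_bic, best_gmm = np.inf, None
for cov_type in ["spherical", "tied", "diag", "full"]:
    for n_components in range(1, 7):
        gmm = GaussianMixture(
            n_components=n_components,
            covariance_type=cov_type,
            random_state=0,
        ).fit(X)
        bic = gmm.bic(X)  # lower is better
        if bic < best_bic:
            best_bic, best_gmm = bic, gmm

print("Best model:", best_gmm.covariance_type, best_gmm.n_components, best_bic)
```

With this setup, the search typically favors a two-component model with a covariance type flexible enough to capture the deformed component.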