The training error decreases as the model complexity increases.
Correct Answer: A
As the model complexity increases (for example, by adding more layers to a neural network or increasing the depth of a decision tree), the training error tends to decrease. This is because more complex models are able to fit the training data better, possibly even capturing noise. However, increasing complexity often leads to overfitting, where the model performs well on the training data but poorly on unseen test data.
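The following is a minimal illustrative sketch (not from the HCIA AI materials) that assumes scikit-learn is available. It fits decision trees of increasing depth on synthetic noisy data to show training error shrinking as complexity grows while test error eventually worsens, which is the overfitting pattern described above.

```python
# Illustrative sketch: training error falls as tree depth (model complexity)
# increases, while test error eventually rises due to overfitting.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(300, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.3, size=300)  # noisy target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for depth in [1, 2, 4, 8, 16]:
    model = DecisionTreeRegressor(max_depth=depth, random_state=0)
    model.fit(X_train, y_train)
    train_err = mean_squared_error(y_train, model.predict(X_train))
    test_err = mean_squared_error(y_test, model.predict(X_test))
    print(f"max_depth={depth:2d}  train MSE={train_err:.3f}  test MSE={test_err:.3f}")
```

Running this typically shows the training MSE decreasing monotonically with depth, while the test MSE stops improving and begins to increase once the tree starts fitting noise.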
The relationship between model complexity and performance is covered extensively in Huawei HCIA AI's discussion of overfitting and underfitting, which explains how increasing model complexity affects generalization.
HCIA AI Reference:
Machine Learning Overview: Explains model complexity and its effect on training and testing error curves.
Deep Learning Overview: Discusses the balance between model capacity, overfitting, and underfitting in deep learning architectures.