Question
K-Fold Cross-Validation - Why use k-fold cross-validation instead of leave-one-out?
Answer
Leave-one-out is really just a special case of k-fold cross-validation with k equal to the dataset size n. The main reason to prefer a smaller k (typically 5 or 10) is the bias-variance trade-off: with larger validation sets of roughly n/k observations each, the per-fold error estimates are less correlated with one another, so k-fold reduces the variance often seen with the leave-one-out technique. The price is a small amount of bias, because each model is trained on a somewhat smaller fraction of the data. K-fold is also far cheaper, requiring k model fits instead of n.
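The special-case relationship can be seen directly in how the fold indices are generated. Below is a minimal sketch (the function name `kfold_indices` is illustrative, not from any particular library) that splits n indices into k train/validation pairs; setting k = n reproduces leave-one-out, where every validation set holds a single point.

```python
def kfold_indices(n, k):
    """Split indices 0..n-1 into k folds; each fold serves once as validation."""
    # Distribute any remainder so fold sizes differ by at most one.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        folds.append((train, val))
        start += size
    return folds

# 5-fold on 10 points: each validation set has 2 observations.
print(kfold_indices(10, 5)[0])   # ([2, 3, 4, 5, 6, 7, 8, 9], [0, 1])
# Leave-one-out is the special case k = n: every validation set has size 1.
print(all(len(val) == 1 for _, val in kfold_indices(10, 10)))  # True
```

In a real project you would typically reach for `sklearn.model_selection.KFold` and `LeaveOneOut`, which implement the same splitting logic plus shuffling and stratification options.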