7 important Cross-validation techniques: when to use them?

Toshiba Kamruzzaman
Aug 18, 2020


Cross-validation is a technique used to evaluate a machine learning model and estimate how well it will perform on unseen data.

First, we divide the data into two parts: 70% for training and 30% for testing. We then randomly split the training portion (the 70%) into two further segments, using one subset to train the algorithm and the remaining data to tune the hyperparameters. A sketch of this split is shown below.
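Here is a minimal sketch of that split, assuming scikit-learn and a small toy dataset; the array sizes and random seeds are illustrative only.

```python
import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(200).reshape(100, 2)  # toy features (hypothetical data)
y = np.arange(100) % 2              # toy labels

# 70% for training, 30% held out for testing
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, random_state=42)

# Split the 70% portion again: one part to fit the model,
# the rest as a validation set for tuning hyperparameters
X_fit, X_val, y_fit, y_val = train_test_split(
    X_train, y_train, test_size=0.25, random_state=42)

print(len(X_fit), len(X_val), len(X_test))  # roughly 52, 18, 30
```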

Advantage of using Cross-Validation

In cross-validation, the data that serves as the validation set in one fold is used for training in the other folds, so no data is wasted. The k-fold sketch below illustrates this.
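A minimal sketch, assuming scikit-learn: with 5-fold cross-validation every sample appears in the validation set exactly once and in the training set for the remaining four folds. The toy data here is purely illustrative.

```python
import numpy as np
from sklearn.model_selection import KFold

X = np.arange(20).reshape(10, 2)   # toy data (hypothetical)
kf = KFold(n_splits=5, shuffle=True, random_state=0)

# Each sample index shows up once as validation and four times as training
for fold, (train_idx, val_idx) in enumerate(kf.split(X)):
    print(f"fold {fold}: train={train_idx}, validation={val_idx}")
```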

Common cross-validation techniques include (a few are sketched in code after this list):

  • k-fold cross-validation
  • Holdout cross-validation
  • Leave-one-out cross-validation
  • Repeated random sub-sampling cross-validation
  • Stratified cross-validation
  • Resubstitution cross-validation
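Below is a minimal sketch of three of these techniques, assuming scikit-learn and the built-in Iris dataset; the model and its parameters are illustrative choices, not part of the original article.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import (
    cross_val_score, StratifiedKFold, LeaveOneOut, ShuffleSplit)

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# Stratified k-fold: preserves class proportions in every fold
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
print("stratified 5-fold:", cross_val_score(model, X, y, cv=skf).mean())

# Leave-one-out: a single sample is held out in each iteration
loo = LeaveOneOut()
print("leave-one-out:", cross_val_score(model, X, y, cv=loo).mean())

# Repeated random sub-sampling (Monte Carlo CV) via ShuffleSplit
ss = ShuffleSplit(n_splits=10, test_size=0.3, random_state=0)
print("repeated random sub-sampling:", cross_val_score(model, X, y, cv=ss).mean())
```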
