Sorry to bother you!
I am wondering: did you train a separate MLP for each fold?
If there were 10 MLP models for the 10 folds, how could the models fit well on the entire dataset?
I am really confused about this. Looking forward to your reply, thank you!
For each fold, we trained the MLP on [dataset - fold] and computed the accuracy on [fold].
We reported the average accuracy over the 10 folds. I hope that makes things clearer.
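The protocol described above (train a fresh model on everything except the held-out fold, score it on that fold, average over all 10 folds) could be sketched roughly as follows. This is a minimal illustration with synthetic data and a simple classifier standing in for the MLP, not the repository's actual code.

```python
# Minimal sketch of the 10-fold protocol described above.
# Synthetic data and LogisticRegression are placeholders for illustration;
# the real code trains an MLP fine-tuned from pre-trained autoencoders.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(int)

kf = KFold(n_splits=10, shuffle=True, random_state=0)
fold_accuracies = []
for train_idx, test_idx in kf.split(X):
    # A fresh model is trained on [dataset - fold] ...
    model = LogisticRegression().fit(X[train_idx], y[train_idx])
    # ... and its accuracy is computed on the held-out [fold].
    fold_accuracies.append(model.score(X[test_idx], y[test_idx]))

# The reported number is the mean accuracy over the 10 folds.
average_accuracy = float(np.mean(fold_accuracies))
print(len(fold_accuracies), round(average_accuracy, 3))
```

Note that the model inside the loop is re-created on every iteration, so the 10 trained models are independent of one another.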
Thanks for your quick reply.
I'm a newbie to machine learning, so I may be wrong.
I read the code and found that you pre-trained two AEs in each individual fold and fine-tuned the MLP in each individual fold. It seems you trained 10 independent MLPs and reported the mean accuracy of those 10 MLPs as the average accuracy.
My question is: how can I pre-train AEs and fine-tune the model using cross-validation? I would have expected the parameters of a single model to be updated continually across the folds.
Thanks again for your time.
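The per-fold pretrain-then-fine-tune structure the comment above describes could be sketched as below. PCA stands in for the autoencoder pretraining step (an assumption, purely for illustration); the point is that both the unsupervised step and the classifier are re-initialized inside every fold, so no parameters are carried over between folds.

```python
# Sketch of per-fold pretraining + fine-tuning under cross-validation.
# PCA is a hypothetical stand-in for autoencoder pretraining; nothing
# is shared across folds, which is why the 10 models are independent.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = (X[:, :2].sum(axis=1) > 0).astype(int)

scores = []
for train_idx, test_idx in KFold(n_splits=10, shuffle=True, random_state=1).split(X):
    # Unsupervised "pretraining" fitted on the training split only.
    reducer = PCA(n_components=4).fit(X[train_idx])
    # Supervised "fine-tuning" on the same training split.
    clf = LogisticRegression().fit(reducer.transform(X[train_idx]), y[train_idx])
    # Evaluate on the held-out fold.
    scores.append(clf.score(reducer.transform(X[test_idx]), y[test_idx]))

print(len(scores))  # one independent model per fold
```

Cross-validation here estimates how well the *procedure* (pretrain, then fine-tune) generalizes; it does not produce one model whose parameters accumulate across folds.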