||Development and Optimization of Deep Belief Networks Applied for Academic Performance Prediction with Larger Datasets
||(Phauk Sokkhey) ; (Takeo Okazaki)
|| Student performance; Deep learning; Restricted Boltzmann machine; Deep belief network; Feature selection; Hyperparameters; Regularization
||Deep learning has recently attracted increasing interest across several applications, and great progress has been made. This paper introduces a novel application of a deep learning framework, the deep belief network (DBN), to predict academic performance on larger datasets. First, unsupervised training is performed in the so-called pre-training stage: a stack of restricted Boltzmann machines (RBMs) is trained to obtain learned weights in place of random initialization. Subsequently, supervised learning using back-propagation is adopted in the fine-tuning stage to classify student performance levels. An optimization approach for improving the classification performance of the proposed DBN is also introduced. It consists of applying a feature selection method to obtain the optimal feature subset, tuning the DBN with optimal hyperparameter values, and applying L2 regularization for weight decay. The experiment was carried out in two phases. Phase 1 was implemented on an actual dataset, and Phase 2 on four artificial datasets of increasing size. Multiple experiments were performed independently on each dataset. On the larger datasets, the improved DBN achieved the highest accuracy and the lowest root mean square error, outperforming the other compared algorithms in both accuracy and effectiveness.
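The pipeline described in the abstract (feature selection, unsupervised RBM pre-training, and a supervised classifier with L2 weight decay) can be sketched with scikit-learn. This is a minimal illustration under assumed settings, not the authors' implementation: a single `BernoulliRBM` followed by L2-regularized logistic regression only approximates a fully fine-tuned multi-layer DBN, the dataset is synthetic, and all parameter values are illustrative.

```python
# Hedged sketch of the abstract's pipeline: feature selection ->
# RBM pre-training -> L2-regularized classifier. Synthetic data and
# all hyperparameter values are assumptions, not the paper's settings.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

# Synthetic stand-in for a student-performance dataset with 3 levels.
X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                           n_classes=3, n_clusters_per_class=1,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = Pipeline([
    ("select", SelectKBest(f_classif, k=10)),        # feature selection step
    ("scale", MinMaxScaler()),                       # RBM expects [0, 1] inputs
    ("rbm", BernoulliRBM(n_components=64, learning_rate=0.05,
                         n_iter=20, random_state=0)),  # unsupervised pre-training
    ("clf", LogisticRegression(penalty="l2", C=1.0,
                               max_iter=1000)),      # L2 weight decay
])
model.fit(X_train, y_train)
acc = model.score(X_test, y_test)
print(f"test accuracy: {acc:.3f}")
```

A genuine DBN would stack several RBMs and fine-tune all layers jointly with back-propagation; here the logistic-regression head stands in for that supervised stage.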