
REFERENCES

[1] M. Majumder, "Artificial Neural Network," pp. 49-54, 2015.
[2] "Loss function," Wikipedia (accessed Mar. 29, 2023).
[3] "Gradient descent," Wikipedia (accessed Mar. 30, 2023).
[4] "Backpropagation," Wikipedia (accessed Mar. 30, 2023).
[5] S. A. Kalogirou, "Applications of artificial neural-networks for energy systems," Appl. Energy, vol. 67, no. 1-2, pp. 17-35, 2000.
[6] L. da Fontoura Costa and G. Travieso, "Fundamentals of neural networks," Neurocomputing, vol. 10, no. 2, pp. 205-207, 1996.
[7] G. Inghelbrecht, R. Pintelon, and K. Barbe, "Large-Scale Regression: A Partition Analysis of the Least Squares Multisplitting," IEEE Trans. Instrum. Meas., vol. 69, no. 6, pp. 2635-2647, 2020.
[8] L. Wang and C. Pan, "Groupwise Retargeted Least-Squares Regression," IEEE Trans. Neural Networks Learn. Syst., vol. 29, no. 4, pp. 1352-1358, 2018.
[9] N. Dengen, Haviluddin, L. Andriyani, M. Wati, E. Budiman, and F. Alameka, "Medicine Stock Forecasting Using Least Square Method," Proc. 2nd East Indones. Conf. Comput. Inf. Technol. Internet Things Ind. (EIConCIT), pp. 100-103, 2018.
[10] S. Zhao, B. Zhang, and S. Li, "Discriminant and Sparsity Based Least Squares Regression with L1 Regularization for Feature Representation," Proc. IEEE Int. Conf. Acoust. Speech Signal Process. (ICASSP), pp. 1504-1508, 2020.
[11] R. O. Ogundokun, A. F. Lukman, G. B. M. Kibria, J. B. Awotunde, and B. B. Aladeitan, "Predictive modelling of COVID-19 confirmed cases in Nigeria," Infect. Dis. Model., vol. 5, pp. 543-548, 2020.
[12] S. Rath, A. Tripathy, and A. Ranjan, "Prediction of new active cases of coronavirus disease (COVID-19) pandemic using multiple linear regression model," Diabetes Metab. Syndr. Clin. Res. Rev., vol. 14, no. 5, 2020.
[13] L. Qin et al., "Prediction of number of cases of 2019 novel coronavirus (COVID-19) using social media search index," Int. J. Environ. Res. Public Health, vol. 17, no. 7, 2020.
[14] R. Gan, J. Tan, L. Mo, Y. Li, and D. Huang, "Using Partial Least Squares Regression to Fit Small Data of H7N9 Incidence Based on the Baidu Index," IEEE Access, vol. 8, pp. 60392-60400, 2020.
[15] K. Jampachaisri, K. Tinochai, S. Sukparungsee, and Y. Areepong, "Empirical Bayes based on squared error loss and precautionary loss functions in sequential sampling plan," IEEE Access, vol. 8, pp. 51460-51469, 2020.
[16] P. Anand and A. Bharti, "A combined reward-penalty loss function based extreme learning machine for binary classification," Proc. 2nd Int. Conf. Adv. Comput. Commun. Paradig. (ICACCP), 2019.
[17] S. Ma, D. Li, T. Hu, Y. Xing, Z. Yang, and W. Nai, "Huber Loss Function Based on Variable Step Beetle Antennae Search Algorithm with Gaussian Direction," Proc. 12th Int. Conf. Intell. Human-Machine Syst. Cybern. (IHMSC), vol. 1, pp. 248-251, 2020.
[18] B. S. Lee, R. Phattharaphon, S. Yean, J. Liu, and M. Shakya, "Euclidean Distance based Loss Function for Eye-Gaze Estimation," Proc. IEEE Sensors Appl. Symp. (SAS), 2020.
[19] T. H. Phan and K. Yamamoto, "Resolving Class Imbalance in Object Detection with Weighted Cross Entropy Losses," arXiv, 2020.
[20] D. Rengasamy, B. Rothwell, and G. P. Figueredo, "Asymmetric Loss Functions for Deep Learning Early Predictions of Remaining Useful Life in Aerospace Gas Turbine Engines," Proc. Int. Jt. Conf. Neural Networks (IJCNN), 2020.
[21] A. Gomez-Alanis, J. A. Gonzalez-Lopez, and A. M. Peinado, "A Kernel Density Estimation Based Loss Function and its Application to ASV-Spoofing Detection," IEEE Access, vol. 8, pp. 108530-108543, 2020.
[22] L. Xu, X. Zhou, X. Lin, Y. Ren, Y. Qin, and J. Liu, "A New Loss Function for Traffic Classification Task on Dramatic Imbalanced Datasets," Proc. IEEE Int. Conf. Commun. (ICC), 2020.
[23] H. Seo, M. Bassenne, and L. Xing, "Closing the Gap between Deep Neural Network Modeling and Biomedical Decision-Making Metrics in Segmentation via Adaptive Loss Functions," IEEE Trans. Med. Imaging, vol. 40, no. 2, pp. 585-593, 2021.
[24] N. Zhang et al., "Robust T-S Fuzzy Model Identification Approach Based on FCRM Algorithm and L1-Norm Loss Function," IEEE Access, vol. 8, pp. 33792-33805, 2020.
[25] Z. Li, J. F. Cai, and K. Wei, "Towards the optimal construction of a loss function without spurious local minima for solving quadratic equations," arXiv, vol. 66, no. 5, pp. 3242-3260, 2018.
[26] D. Zou, Y. Cao, D. Zhou, and Q. Gu, "Gradient descent optimizes over-parameterized deep ReLU networks," Mach. Learn., vol. 109, no. 3, pp. 467-492, 2020.
[27] J. Flynn et al., "DeepView: View synthesis with learned gradient descent," Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 2362-2371, 2019.
[28] J. Lee et al., "Wide neural networks of any depth evolve as linear models under gradient descent," J. Stat. Mech. Theory Exp., vol. 2020, no. 12, 2020.
[29] T. Lin, C. Jin, and M. I. Jordan, "On gradient descent ascent for nonconvex-concave minimax problems," arXiv, 2019.
[30] M. M. Amiri and D. Gündüz, "Machine learning at the wireless edge: Distributed stochastic gradient descent over-the-air," arXiv, vol. 68, pp. 2155-2169, 2019.
[31] S. Goel, A. Gollakota, Z. Jin, S. Karmalkar, and A. Klivans, "Superpolynomial Lower Bounds for Learning One-Layer Neural Networks using Gradient Descent," arXiv, 2020.
[32] E. Amid, M. K. Warmuth, J. Abernethy, and S. Agarwal, "Winnowing with Gradient Descent," Proc. Mach. Learn. Res., vol. 125, pp. 1-20, 2020.
[33] M. Farajtabar, N. Azizan, A. Mott, and A. Li, "Orthogonal gradient descent for continual learning," arXiv, vol. 108, 2019.
[34] M. Li, M. Soltanolkotabi, and S. Oymak, "Gradient descent with early stopping is provably robust to label noise for overparameterized neural networks," arXiv, 2019.
[35] C. Cheng, N. Emirov, and Q. Sun, "Preconditioned gradient descent algorithm for inverse filtering on spatially distributed networks," arXiv, vol. 27, pp. 1834-1838, 2020.
[36] W. Sun and C. Huang, "A carbon price prediction model based on secondary decomposition algorithm and optimized back propagation neural network," J. Clean. Prod., vol. 243, p. 118671, 2020.
[37] A. Mukherjee, D. K. Jain, P. Goswami, Q. Xin, L. Yang, and J. J. P. C. Rodrigues, "Back Propagation Neural Network Based Cluster Head Identification in MIMO Sensor Networks for Intelligent Transportation Systems," IEEE Access, vol. 8, pp. 28524-28532, 2020.
[38] L. Wang, P. Wang, S. Liang, Y. Zhu, J. Khan, and S. Fang, "Monitoring maize growth on the North China Plain using a hybrid genetic algorithm-based back-propagation neural network model," Comput. Electron. Agric., vol. 170, p. 105238, 2020.
[39] Z. Jie and M. Qiurui, "Establishing a Genetic Algorithm-Back Propagation model to predict the pressure of girdles and to determine the model function," Text. Res. J., vol. 90, no. 21-22, pp. 2564-2578, 2020.
[40] W. Yang, X. Liu, K. Wang, J. Hu, G. Geng, and J. Feng, "Sex determination of three-dimensional skull based on improved backpropagation neural network," Comput. Math. Methods Med., vol. 2019, 2019.
[41] L. P. Huelsman, for Engineers. McGraw-Hill Science/Engineering/Math, 1990.
[42] "Analysis of the vulnerability estimation and neighbor value prediction in autonomous systems," Sci. Rep. (accessed Mar. 30, 2023).
[43] J. Brownlee, "Loss and Loss Functions for Training Deep Learning Neural Networks," Machine Learning Mastery, 2019.
[44] H. D. Learning et al., "Perceptron," pp. 1-9, 2020.
[45] G. C. Mqef, "Mathematics for," Quant. Lit. Why Numer. Matters Sch., pp. 533-540, 2009.
[46] S. Shalev-Shwartz and S. Ben-David, Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014.
[47] G. Lawson, "Maxima and minima," Edinburgh Math. Notes, vol. 32, pp. xxii-xxiii, 1940.
[48] P. Dawkins, "Section 3-5: Lagrange Multipliers," Paul's Online Notes, 2020.