
List of Deep Learning Models

Amir Mosavi¹,², Sina Ardabili³, and Annamária R. Várkonyi-Kóczy⁴

¹ Institute of Automation, Kalman Kando Faculty of Electrical Engineering, Obuda University, Budapest, Hungary
  [email protected]
² School of the Built Environment, Oxford Brookes University, Oxford OX3 0BP, UK
³ Institute of Advanced Studies Koszeg, Koszeg, Hungary
⁴ Department of Mathematics and Informatics, J. Selye University, Komarno, Slovakia

Abstract. Deep learning (DL) algorithms have recently emerged from machine learning and soft computing techniques. Since then, several DL algorithms have been introduced to scientific communities and applied in various application domains. Today the usage of DL has become essential due to its intelligence, efficient learning, accuracy, and robustness in model building. However, a comprehensive list of DL algorithms has not yet been introduced in the scientific literature. This paper provides a list of the most popular DL algorithms, along with their application domains.

Keywords: Deep learning · Machine learning · Convolutional neural networks (CNN) · Recurrent neural networks (RNN) · Denoising autoencoder (DAE) · Deep belief networks (DBNs) · Long short-term memory (LSTM)

1 Introduction

There has been an enormous evolution in system modeling and intelligence since the introduction of the early models for deep learning [1–8]. Deep learning methods rapidly emerged and expanded their applications in various scientific and engineering domains. Health informatics, energy, urban informatics, safety, security, hydrological systems modeling, economics, bioinformatics, and computational mechanics have been among the early application domains of deep learning. State-of-the-art surveys on data-driven methods and machine learning algorithms, e.g., [9–26], indicate that deep learning, along with ensemble and hybrid machine learning methods, is the future of data science. Further comparative studies, e.g., [26–42], report that deep learning models and hybrid machine learning models often outperform conventional machine learning models. Figure 1 represents the rapid rise in the applications of various deep learning methods during the past five years.
Deep learning methods are fast evolving toward higher performance. The literature includes adequate review papers on the progressing algorithms in particular application domains, e.g., renewable energy forecasting, cardiovascular image analysis,
© Springer Nature Switzerland AG 2020
A. R. Várkonyi-Kóczy (Ed.): INTER-ACADEMIA 2019, LNNS 101, pp. 202–214, 2020.
https://doi.org/10.1007/978-3-030-36841-8_20

super-resolution imaging, radiology, 3D sensed data classification, multimedia analytics, sentiment classification, text detection, transportation systems, activity recognition in radar, hyperspectral imaging, medical ultrasound analysis, image cytometry, and Apache Spark [43–59]. However, a simplified list of deep learning methods has not been communicated so far. Thus, there is a gap in research in introducing the deep learning methods and summarizing them and their applications in a brief, yet communicative paper. Consequently, this paper aims at providing a comprehensive list of the most popular deep learning methods and their notable applications. In every section, one deep learning method is introduced and the notable applications related to that method are listed. The description of each deep learning method and the function of each building block is explained.

Fig. 1. The rapid increase of using DL models in various application domains (source: Web of Science)

2 Deep Learning Methods

Convolutional neural networks (CNN), recurrent neural networks (RNN), denoising autoencoders (DAE), deep belief networks (DBNs), and long short-term memory (LSTM) are the most popular deep learning methods that have been widely used. In this section, each method is described along with its notable applications.

2.1 Convolutional Neural Network (CNN)


CNN is one of the best-known architectures of DL techniques. This technique is generally employed for image processing applications. CNN contains three types of layers: convolutional, pooling, and fully connected layers (Fig. 2). In each CNN, there are two stages in the training process: the feed-forward stage and the back-propagation stage. The most common CNN architectures are ZFNet [60], GoogLeNet [61], VGGNet [62], AlexNet [63], and ResNet [64] (Table 1).
Although CNN is primarily known for image processing applications, the literature includes other application domains, e.g., energy, computational mechanics, electronic systems, and remote sensing.
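As an illustrative sketch (not from the paper), the convolutional and pooling layer types described above can be expressed in plain NumPy; the image, kernel, and sizes below are invented for demonstration:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation (the 'convolution' used in CNNs), stride 1."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for r in range(oh):
        for col in range(ow):
            out[r, col] = np.sum(image[r:r + kh, col:col + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling; trims edges that do not fit."""
    h = fmap.shape[0] - fmap.shape[0] % size
    w = fmap.shape[1] - fmap.shape[1] % size
    return fmap[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# A 6x6 "image" with a vertical edge, and a hand-made edge-detecting kernel.
image = np.zeros((6, 6))
image[:, 3:] = 1.0
kernel = np.array([[-1.0, 1.0]])

features = np.maximum(conv2d(image, kernel), 0.0)  # convolutional layer + ReLU
pooled = max_pool(features)                        # pooling layer
print(pooled.shape)  # (3, 2)
```

A fully connected layer would then flatten `pooled` and apply an ordinary dense weight matrix; the back-propagation stage trains all three layer types jointly.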

Fig. 2. CNN architecture

Table 1. The CNN notable applications

References              Application                                           Journal
Kong et al. [65]        Condition monitoring of wind turbines                 Renewable Energy
Lossau et al. [66]      Motion estimation and correction of medical imaging   Computerized Medical Imaging and Graphics
Bhatnagar et al. [67]   Prediction of aerodynamic flow                        Computational Mechanics
Nevavuori et al. [68]   Crop yield prediction                                 Computers and Electronics in Agriculture
Ajami et al. [69]       Advanced image processing                             Remote Sensing

2.2 Recurrent Neural Networks (RNN)


RNN is designed to recognize sequences and patterns in speech, handwriting, text, and similar applications. RNN benefits from cyclic connections in its structure, which employ recurrent computations to sequentially process the input data [70]. RNN is basically a standard neural network that has been extended across time by having edges which feed into the next time step instead of into the next layer in the same time step. Each of the previous inputs is kept in a state vector in the hidden units, and these state vectors are utilized to compute the outputs. Figure 3 shows the architecture of RNN (Table 2).
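A minimal sketch of this recurrent update, with invented sizes and random weights (the state vector `h` is the hidden-unit memory described above):

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 3, 4, 2  # illustrative sizes
W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden -> hidden (recurrent edge)
W_hy = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden -> output

def rnn_forward(inputs):
    """Unroll across time; h is the state vector kept in the hidden units."""
    h = np.zeros(n_hid)
    outputs = []
    for x in inputs:                      # one step per sequence element
        h = np.tanh(W_xh @ x + W_hh @ h)  # edges feed into the next time step
        outputs.append(W_hy @ h)          # output computed from the state vector
    return outputs, h

sequence = [rng.normal(size=n_in) for _ in range(5)]
outputs, final_state = rnn_forward(sequence)
print(len(outputs), final_state.shape)  # 5 (4,)
```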

Fig. 3. RNN architecture

Table 2. Notable RNN applications

References             Application                             Journal
Zhu et al. [71]        Wind speed prediction                   Energy Conversion and Management
Pan et al. [72]        Tropical cyclone intensity prediction   Electronics Letters
Bisharad et al. [73]   Music genre recognition                 Expert Systems
Zhong et al. [74]      Ship trajectory restoration             Journal of Navigation
Jarrah et al. [75]     Stock price trend prediction            International Journal of Advanced Computer Science and Applications

RNN is a relatively new deep learning method. This is why its application domains are still young and plenty of room remains for research and exploration. Energy, hydrological prediction, expert systems, navigation, and economics are the current applications reported in the literature.

2.3 Denoising AutoEncoder (DAE)


DAE has been extended from the autoencoder (AE) as a neural network for learning features from noisy datasets. DAE consists of three main layers: input, encoding, and decoding layers [76]. DAEs can be aggregated to capture high-level features. The stacked denoising autoencoder (SDAE), an unsupervised algorithm, is built from DAEs and can be employed for nonlinear dimensionality reduction. This method is a type of feed-forward neural network and employs a deep architecture with multiple hidden layers and a pre-training strategy [77, 78]. Figure 4 presents the architecture of the DAE methodology (Table 3).
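The three layers and the denoising objective can be sketched as below; the toy dataset, layer sizes, and learning rate are assumptions for illustration. The key point is that the network reconstructs the clean input from a corrupted copy:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: points on a 1-D curve embedded in 5-D (invented for illustration).
t = rng.uniform(-1, 1, size=(200, 1))
clean = np.hstack([t, t**2, np.sin(3 * t), -t, 0.5 * t])

n_in, n_hidden = 5, 2
W_enc = rng.normal(scale=0.1, size=(n_hidden, n_in))
b_enc = np.zeros(n_hidden)
W_dec = rng.normal(scale=0.1, size=(n_in, n_hidden))
b_dec = np.zeros(n_in)

def forward(x):
    code = np.tanh(W_enc @ x + b_enc)  # encoding layer
    recon = W_dec @ code + b_dec       # decoding layer
    return code, recon

lr = 0.01
for epoch in range(50):
    for x in clean:
        noisy = x + rng.normal(scale=0.1, size=n_in)  # corrupt the input
        code, recon = forward(noisy)
        err = recon - x                # target is the *clean* sample
        # Gradients of the squared reconstruction error.
        d_code = (W_dec.T @ err) * (1 - code**2)      # tanh derivative
        W_dec -= lr * np.outer(err, code)
        b_dec -= lr * err
        W_enc -= lr * np.outer(d_code, noisy)
        b_enc -= lr * d_code

mse = np.mean([(forward(x + rng.normal(scale=0.1, size=n_in))[1] - x) ** 2
               for x in clean])
print(round(float(mse), 4))
```

Stacking several such encoder layers and pre-training them one at a time yields the SDAE described above.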
DAE is slowly starting to be known among researchers as an efficient DL algorithm. DAE has already been used in various application domains with promising

Fig. 4. DAE architecture

Table 3. The notable DAE applications

References            Application                        Journal
Chen et al. [79]      Improving cyber-physical systems   EURASIP Journal on Wireless Communications and Networking
Liu et al. [80]       Electric load forecasting          Energies
Nicolai et al. [81]   Laser-based scan registration      IEEE Robotics and Automation Letters
Yue et al. [82]       Collaborative filtering            Journal of Computer Science and Technology
Roy et al. [83]       Noisy image classification         Journal of Information and Communication Technology
Tan et al. [84]       Robust speaker verification        IEEE/ACM Transactions on Audio, Speech, and Language Processing

results. Energy forecasting, cybersecurity, banking, fraud detection, image classification, and speaker verification are among the current popular applications of DAE.

2.4 The Deep Belief Networks (DBNs)


DBNs are employed for learning high-dimensional manifolds of data. This method contains multiple layers, with connections between the layers but not between units within each layer. DBNs can be considered a hybrid multi-layered neural network, including directed and undirected connections. DBNs contain restricted Boltzmann machines (RBMs), which are trained in a greedy manner. Each RBM layer communicates with both the previous and subsequent layers [78, 85, 86]. This model consists of a feed-forward network and several layers of restricted

Boltzmann machines (RBMs) as feature extractors [87]. An RBM has only two layers: a visible layer and a hidden layer [88]. Figure 5 presents the architecture of the DBN method (Table 4).
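The RBM building block and its greedy training can be sketched with one step of contrastive divergence (CD-1); the binary toy patterns and hyperparameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

n_visible, n_hidden = 6, 3
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)  # visible-layer bias
b_h = np.zeros(n_hidden)   # hidden-layer bias

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    return (rng.random(p.shape) < p).astype(float)

# Toy binary data: two repeated patterns (invented for illustration).
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]] * 50, dtype=float)

lr = 0.1
for epoch in range(100):
    for v0 in data:
        # Positive phase: infer hidden units from the data.
        p_h0 = sigmoid(v0 @ W + b_h)
        h0 = sample(p_h0)
        # Negative phase: one Gibbs step (CD-1 reconstruction).
        v1 = sample(sigmoid(h0 @ W.T + b_v))
        p_h1 = sigmoid(v1 @ W + b_h)
        # Contrastive-divergence update.
        W += lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
        b_v += lr * (v0 - v1)
        b_h += lr * (p_h0 - p_h1)

# Reconstruction probability for a known pattern.
v = np.array([1, 1, 1, 0, 0, 0], dtype=float)
p_v = sigmoid(sample(sigmoid(v @ W + b_h)) @ W.T + b_v)
print(np.round(p_v, 2))
```

A DBN stacks several such RBMs, training each greedily on the hidden activations of the layer below.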

Fig. 5. DBN architecture

Table 4. The notable DBN applications

References           Application                    Journal
Hassan et al. [89]   Human emotion recognition      Information Fusion
Cheng et al. [90]    Time series prediction         IEEE Internet of Things Journal
Yu et al. [91]       Wind speed prediction          IEEJ Transactions on Electrical and Electronic Engineering
Zheng et al. [92]    Exchange rate forecasting      Neural Computing and Applications
Ahmad et al. [93]    Automatic liver segmentation   IEEE Access
Ronoud et al. [94]   Breast cancer diagnosis        Soft Computing

DBN is one of the most reliable deep learning methods, with high accuracy and computational efficiency. Thus, its application domains have been diverse, including exciting applications in a wide range of engineering and scientific problems. Human emotion detection, time series prediction, renewable energy prediction, economic forecasting, and cancer diagnosis have been among the popular application domains.

2.5 Long Short-Term Memory (LSTM)


LSTM is an RNN method which benefits from feedback connections, allowing it to be used as a general-purpose computer. This method can be used for both sequence and pattern recognition and for image processing applications. In general, LSTM contains three central units: input, output, and forget gates. LSTM can decide when to let the input enter the neuron and what to remember from what was computed in the previous time step. One of the main strengths of the LSTM method is that it decides all of this based on the current input itself. Figure 6 presents the architecture of the LSTM method (Table 5).
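One LSTM step with the input, output, forget, and input-modulation gates named above can be sketched as follows (sizes and random weights are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
n_in, n_hid = 3, 4  # illustrative sizes

def init():
    return rng.normal(scale=0.1, size=(n_hid, n_in + n_hid))

# One weight matrix per gate, acting on the concatenated [input, hidden state].
W_i, W_f, W_o, W_g = init(), init(), init(), init()
b_i, b_o, b_g = np.zeros(n_hid), np.zeros(n_hid), np.zeros(n_hid)
b_f = np.ones(n_hid)  # common practice: bias the forget gate open initially

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c):
    z = np.concatenate([x, h])
    i = sigmoid(W_i @ z + b_i)  # input gate: when to let the input in
    f = sigmoid(W_f @ z + b_f)  # forget gate: what to keep from the past
    o = sigmoid(W_o @ z + b_o)  # output gate: which part of the cell to expose
    g = np.tanh(W_g @ z + b_g)  # input modulation: candidate cell content
    c = f * c + i * g           # cell state carries the long-term memory
    h = o * np.tanh(c)          # hidden state is this step's output
    return h, c

h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):  # process a 5-step sequence
    h, c = lstm_step(x, h, c)
print(h.shape, c.shape)  # (4,) (4,)
```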

Fig. 6. LSTM architecture

Table 5. The notable applications of LSTM

References             Application                          Journal
Ghimire et al. [95]    Solar radiation forecasting          Applied Energy
Liu [3]                Volatility forecasting               Expert Systems with Applications
Hong et al. [96]       Fault prognosis of battery systems   Applied Energy
Krishan et al. [97]    Air quality prediction               Air Quality, Atmosphere & Health
Zhang et al. [98]      Structural seismic prediction        Computers and Structures
Hua et al. [99]        Time series prediction               IEEE Communications Magazine
Zhang et al. [100]     Wind turbine power prediction        Applied Energy
Vardaan et al. [101]   Earthquake trend prediction          International Journal of Electrical and Computer Engineering

LSTM has shown great potential in environmental applications, e.g., geological modeling, hydrological prediction, air quality, and hazard modeling. Due to the generalization ability of the LSTM architecture, it can be suitable for many application domains. Energy demand and consumption, the wind energy industry, and solar power modeling are the other application domains of LSTM. Further investigation is essential to explore new deep learning methods and their application domains, as has been done for machine learning methods [102–109].

3 Conclusions

Deep learning methods are fast-evolving. Some of them have advanced to become specialized in a particular application domain. However, there is a gap in research in introducing the deep learning methods and summarizing the methods and their applications in a single paper. Consequently, this paper aims at providing a comprehensive list of the most popular deep learning methods along with their notable applications. CNN, RNN, DAE, DBNs, and LSTM have been identified as the most popular deep learning methods. The description of each deep learning method and the function of each of its building blocks is explained.

Acknowledgements. This publication has been supported by the Project: “Support of research
and development activities of the J. Selye University in the field of Digital Slovakia and creative
industry” of the Research & Innovation Operational Programme (ITMS code: NFP313010T504)
co-funded by the European Regional Development Fund.

References
1. Diamant, A., et al.: Deep learning in head & neck cancer outcome prediction. Scientific
Reports 9(1) (2019)
2. Dong, Y., et al.: Bandgap prediction by deep learning in configurationally hybridized
graphene and boron nitride. npj Comput. Mater. 5(1) (2019)
3. Liu, Y.: Novel volatility forecasting using deep learning–long short term memory recurrent
neural networks. Expert Syst. Appl. 132, 99–109 (2019)
4. Ludwiczak, J., et al.: PiPred – a deep-learning method for prediction of p-helices in protein
sequences. Scientific Reports 9(1) (2019)
5. Matin, R., Hansen, C., Mølgaard, P.: Predicting distresses using deep learning of text
segments in annual reports. Expert Syst. Appl. 132, 199–208 (2019)
6. Nguyen, D., et al.: A feasibility study for predicting optimal radiation therapy dose
distributions of prostate cancer patients from patient anatomy using deep learning.
Scientific Reports 9(1) (2019)
7. Shickel, B., et al.: DeepSOFA: a continuous acuity score for critically ill patients using
clinically interpretable deep learning. Scientific Reports 9(1) (2019)
8. Wang, K., Qi, X., Liu, H.: A comparison of day-ahead photovoltaic power forecasting
models based on deep learning neural network. Appl. Energy 251 (2019)
9. Aram, F., et al.: Design and validation of a computational program for analysing mental
maps: Aram mental map analyzer. Sustainability (Switzerland) 11(14) (2019)

10. Asadi, E., et al.: Groundwater quality assessment for drinking and agricultural purposes in
Tabriz Aquifer, Iran (2019)
11. Asghar, M.Z., Subhan, F., Imran, M., Kundi, F.M., Shamshirband, S., Mosavi, A., Csiba,
P., Várkonyi-Kóczy, A.R.: Performance evaluation of supervised machine learning
techniques for efficient detection of emotions from online content. Preprints 2019,
2019080019 https://doi.org/10.20944/preprints201908.0019.v1
12. Bemani, A., Baghban, A., Shamshirband, S., Mosavi, A., Csiba, P., Várkonyi-Kóczy, A.R.:
Applying ANN, ANFIS, and LSSVM Models for Estimation of Acid Solvent Solubility in
Supercritical CO2. Preprints 2019, 2019060055 https://doi.org/10.20944/preprints201906.
0055.v2
13. Choubin, B., et al.: Snow avalanche hazard prediction using machine learning methods.
J. Hydrol. 577 (2019)
14. Choubin, B., et al.: An ensemble prediction of flood susceptibility using multivariate
discriminant analysis, classification and regression trees, and support vector machines. Sci.
Total Environ. 651, 2087–2096 (2019)
15. Dehghani, M., et al.: Prediction of hydropower generation using Grey wolf optimization
adaptive neuro-fuzzy inference system. Energies 12(2), 289 (2019)
16. Dineva, A., et al.: Review of soft computing models in design and control of rotating
electrical machines. Energies 12(6) (2019)
17. Dineva, A., et al.: Multi-label classification for fault diagnosis of rotating electrical
machines (2019)
18. Farzaneh-Gord, M., et al.: Numerical simulation of pressure pulsation effects of a snubber
in a CNG station for increasing measurement accuracy. Eng. Appl. Comput. Fluid Mech.
13(1), 642–663 (2019)
19. Ghalandari, M., et al.: Investigation of submerged structures’ flexibility on sloshing
frequency using a boundary element method and finite element analysis. Eng. Appl.
Comput. Fluid Mech. 13(1), 519–528 (2019)
20. Ghalandari, M., et al.: Flutter speed estimation using presented differential quadrature
method formulation. Eng. Appl. Comput. Fluid Mech. 13(1), 804–810 (2019)
21. Karballaeezadeh, N., et al.: Prediction of remaining service life of pavement using an
optimized support vector machine (case study of Semnan-Firuzkuh road). Eng. Appl.
Comput. Fluid Mech. 13(1), 188–198 (2019)
22. Menad, N.A., et al.: Modeling temperature dependency of oil—water relative permeability
in thermal enhanced oil recovery processes using group method of data handling and gene
expression programming. Eng. Appl. Comput. Fluid Mech. 13(1), 724–743 (2019)
23. Mohammadzadeh, S., et al.: Prediction of compression index of fine-grained soils using a
gene expression programming model. Infrastructures 4(2), 26 (2019)
24. Mosavi, A., Edalatifar, M.: A Hybrid Neuro-Fuzzy Algorithm for Prediction of Reference
Evapotranspiration, in Lecture Notes in Networks and Systems, pp. 235–243. Springer
(2019)
25. Mosavi, A., Lopez, A., Varkonyi-Koczy, A.R.: Industrial applications of big data: state of
the art survey, D. Luca, L. Sirghi, and C. Costin, Editors, pp. 225–232. Springer (2018)
26. Mosavi, A., Ozturk, P., Chau, K.W.: Flood prediction using machine learning models:
literature review. Water (Switzerland) 10(11) (2018)
27. Mosavi, A., Rabczuk, T.: Learning and intelligent optimization for material design
innovation, D.E. Kvasov, et al., Editors, pp. 358–363. Springer (2017)
28. Mosavi, A., Rabczuk, T., Varkonyi-Koczy, A.R.: Reviewing the novel machine learning
tools for materials design, D. Luca, L. Sirghi, and C. Costin, Editors, pp. 50–58. Springer
(2018)

29. Mosavi, A., et al.: State of the art of machine learning models in energy systems, a
systematic review. Energies 12(7) (2019)
30. Mosavi, A., et al.: Prediction of multi-inputs bubble column reactor using a novel hybrid
model of computational fluid dynamics and machine learning. Eng. Appl. Comput. Fluid
Mech. 13(1), 482–492 (2019)
31. Mosavi, A., Varkonyi-Koczy, A.R.: Integration of machine learning and optimization for
robot learning, R. Jablonski and R. Szewczyk, Editors, pp. 349–355. Springer (2017)
32. Nosratabadi, S., et al.: Sustainable business models: a review. Sustainability (Switzerland)
11(6) (2019)
33. Qasem, S.N., et al.: Estimating daily dew point temperature using machine learning
algorithms. Water (Switzerland) 11(3) (2019)
34. Rezakazemi, M., Mosavi, A., Shirazian, S.: ANFIS pattern for molecular membranes
separation optimization. J. Mol. Liq. 274, 470–476 (2019)
35. Riahi-Madvar, H., et al.: Comparative analysis of soft computing techniques RBF, MLP, and ANFIS with MLR and MNLR for predicting grade-control scour hole geometry. Eng. Appl. Comput. Fluid Mech. 13(1), 529–550 (2019)
36. Shabani, S., Samadianfard, S., Taghi Sattari, M., Shamshirband, S., Mosavi, A., Kmet, T.,
Várkonyi-Kóczy, A.R.: Modeling daily pan evaporation in humid climates using gaussian
process regression. Preprints 2019, 2019070351 https://doi.org/10.20944/preprints201907.
0351.v1
37. Shamshirband, S., Hadipoor, M., Baghban, A., Mosavi, A., Bukor J., Varkonyi-Koczy, A.
R.: Developing an ANFIS-PSO model to predict mercury emissions in combustion flue
gases. Preprints 2019, 2019070165 https://doi.org/10.20944/preprints201907.0165.v1
38. Shamshirband, S., et al.: Ensemble models with uncertainty analysis for multi-day ahead
forecasting of chlorophyll a concentration in coastal waters. Eng. Appl. Comput. Fluid
Mech. 13(1), 91–101 (2019)
39. Shamshirband, S., Mosavi, A., Rabczuk, T.: Particle swarm optimization model to predict
scour depth around bridge pier. arXiv preprint arXiv:1906.08863 (2019)
40. Taherei Ghazvinei, P., et al.: Sugarcane growth prediction based on meteorological
parameters using extreme learning machine and artificial neural network. Eng. Appl.
Comput. Fluid Mech. 12(1), 738–749 (2018)
41. Torabi, M., et al.: A Hybrid clustering and classification technique for forecasting short-
term energy consumption. Environ. Progr. Sustain. Energy 38(1), 66–76 (2019)
42. Torabi, M., et al.: A Hybrid Machine Learning Approach for Daily Prediction of Solar
Radiation, in Lecture Notes in Networks and Systems, pp. 266–274. Springer (2019)
43. Biswas, M., et al.: State-of-the-art review on deep learning in medical imaging. Front.
Biosci. Landmark 24(3), 392–426 (2019)
44. Bote-Curiel, L., et al.: Deep learning and big data in healthcare: a double review for critical
beginners. Appl. Sci. (Switzerland) 9(11) (2019)
45. Feng, Y., Teh, H.S., Cai, Y.: Deep learning for chest radiology: a review. Curr. Radiol.
Reports 7(8) (2019)
46. Griffiths, D., Boehm, J.: A Review on deep learning techniques for 3D sensed data
classification. Remote Sens. 11(12) (2019)
47. Gupta, A., et al.: Deep learning in image cytometry: a review. Cytom. Part A 95(4), 366–
380 (2019)
48. Ha, V.K., et al.: Deep learning based single image super-resolution: a survey. Int. J. Autom.
Comput. 16(4), 413–426 (2019)
49. Jiang, W., Zhang, C.S., Yin, X.C.: Deep learning based scene text detection: a survey. Tien
Tzu Hsueh Pao/Acta Electronica Sinica 47(5), 1152–1161 (2019)

50. Johnsirani Venkatesan, N., Nam, C., Shin, D.R.: Deep learning frameworks on apache
spark: a review. IETE Techn. Rev. (Inst. Electron. Telecommun. Eng., India) 36(2), 164–
177 (2019)
51. Li, X., He, Y., Jing, X.: A survey of deep learning-based human activity recognition in
radar. Remote Sens. 11(9) (2019)
52. Litjens, G., et al.: State-of-the-art deep learning in cardiovascular image analysis. JACC:
Cardiovasc. Imaging 12(8P1), 1549–1565 (2019)
53. Liu, S., et al.: Deep learning in medical ultrasound analysis: a review. Engineering 5(2),
261–275 (2019)
54. Mazurowski, M.A., et al.: Deep learning in radiology: an overview of the concepts and a
survey of the state of the art with focus on MRI. J. Magn. Reson. Imaging 49(4), 939–954
(2019)
55. Narendra, G., Sivakumar, D.: Deep learning based hyperspectral image analysis-a survey.
J. Comput. Theor. Nanosci. 16(4), 1528–1535 (2019)
56. Wang, H., et al.: A review of deep learning for renewable energy forecasting. Energy
Convers. Manag. 198 (2019)
57. Wang, Y., et al.: Enhancing transportation systems via deep learning: a survey.
Transp. Res. Part C: Emerg. Technol. 99, 144–163 (2019)
58. Zhang, W., et al.: Deep learning-based multimedia analytics: a review. ACM Trans.
Multimed. Comput. Commun. Appl. 15(1s) (2019)
59. Zhou, J., et al.: Deep learning for aspect-level sentiment classification: survey, vision, and
challenges. IEEE Access 7, 78454–78483 (2019)
60. Zeiler, M.D., Fergus, R.: Visualizing and understanding convolutional networks. In:
European Conference on Computer Vision. Springer (2014)
61. Szegedy, C., et al.: Going deeper with convolutions. In: Proceedings of the IEEE
Conference on Computer Vision and Pattern Recognition (2015)
62. Simonyan, K., Zisserman, A.: Very deep convolutional networks for large-scale image
recognition. arXiv preprint arXiv:1409.1556 (2014)
63. Krizhevsky, A., Sutskever, I., Hinton, G.E.: Imagenet classification with deep convolu-
tional neural networks. In: Advances in Neural Information Processing Systems (2012)
64. He, K., et al.: Deep residual learning for image recognition. In Proceedings of the IEEE
Conference on Computer Vision and Pattern Recognition (2016)
65. Kong, Z., et al.: Condition monitoring of wind turbines based on spatio-temporal fusion of
SCADA data by convolutional neural networks and gated recurrent units. Renew. Energy
146, 760–768 (2020)
66. Lossau, T., et al.: Motion estimation and correction in cardiac CT angiography images
using convolutional neural networks. Computerized Med. Imaging Graphics 76 (2019)
67. Bhatnagar, S., et al.: Prediction of aerodynamic flow fields using convolutional neural
networks. Comput. Mech. 64(2), 525–545 (2019)
68. Nevavuori, P., Narra, N., Lipping, T.: Crop yield prediction with deep convolutional neural
networks. Comput. Electron. Agric. 163 (2019)
69. Ajami, A., et al.: Identifying a slums’ degree of deprivation from VHR images using
convolutional neural networks. Remote Sens. 11(11) (2019)
70. Min, S., Lee, B., Yoon, S.: Deep learning in bioinformatics. Brief. Bioinform. 18(5), 851–
869 (2017)
71. Zhu, S., et al.: Gaussian mixture model coupled recurrent neural networks for wind speed
interval forecast. Energy Convers. Manag. 198 (2019)
72. Pan, B., Xu, X., Shi, Z.: Tropical cyclone intensity prediction based on recurrent neural
networks. Electron. Lett. 55(7), 413–415 (2019)

73. Bisharad, D., Laskar, R.H.: Music genre recognition using convolutional recurrent neural
network architecture. Expert Syst. (2019)
74. Zhong, C., et al.: Inland ship trajectory restoration by recurrent neural network. J. Navig.
(2019)
75. Jarrah, M., Salim, N.: A recurrent neural network and a discrete wavelet transform to
predict the Saudi stock price trends. Int. J. Adv. Comput. Sci. Appl. 10(4), 155–162 (2019)
76. Al Rahhal, M.M., et al.: Deep learning approach for active classification of electrocar-
diogram signals. Inf. Sci. 345, 340–354 (2016)
77. Yin, Z., Zhang, J.: Cross-session classification of mental workload levels using EEG and an
adaptive deep learning model. Biomed. Signal Process. Control 33, 30–47 (2017)
78. Sun, W., Zheng, B., Qian, W.: Automatic feature learning using multichannel ROI based on
deep structured algorithms for computerized lung cancer diagnosis. Comput. Biol. Med. 89,
530–539 (2017)
79. Chen, Y., et al.: Indoor location method of interference source based on deep learning of
spectrum fingerprint features in Smart Cyber-Physical systems. Eurasip J. Wirel. Commun.
Netw. 2019(1) (2019)
80. Liu, P., Zheng, P., Chen, Z.: Deep learning with stacked denoising auto-encoder for short-
term electric load forecasting. Energies 12(12) (2019)
81. Nicolai, A., Hollinger, G.A.: Denoising autoencoders for laser-based scan registration.
IEEE Robot. Autom. Lett. 3(4), 4391–4398 (2018)
82. Yue, L., et al.: Multiple auxiliary information based deep model for collaborative filtering.
J. Comput. Sci. Technol. 33(4), 668–681 (2018)
83. Roy, S.S., Ahmed, M., Akhand, M.A.H.: Noisy image classification using hybrid deep
learning methods. J. Inf. Commun. Technol. 17(2), 233–269 (2018)
84. Tan, Z., et al.: Denoised senone i-vectors for robust speaker verification. IEEE/ACM Trans.
Audio Speech Lang. Process. 26(4), 820–830 (2018)
85. Zhang, Q., et al.: Deep learning based classification of breast tumors with shear-wave
elastography. Ultrasonics 72, 150–157 (2016)
86. Wulsin, D., et al.: Modeling electroencephalography waveforms with semi-supervised deep
belief nets: fast classification and anomaly measurement. J. Neural Eng. 8(3), 036015
(2011)
87. Patterson, J., Gibson, A.: Deep Learning: A Practitioner's Approach. O'Reilly Media, Inc. (2017)
88. Vieira, S., Pinaya, W.H., Mechelli, A.: Using deep learning to investigate the neuroimaging
correlates of psychiatric and neurological disorders: methods and applications. Neurosci.
Biobehav. Rev. 74, 58–75 (2017)
89. Hassan, M.M., et al.: Human emotion recognition using deep belief network architecture.
Inf. Fusion 51, 10–18 (2019)
90. Cheng, Y., et al.: Deep belief network for meteorological time series prediction in the
internet of things. IEEE Int. Things J. 6(3), 4369–4376 (2019)
91. Yu, Y., et al.: Forecasting a short-term wind speed using a deep belief network combined
with a local predictor. IEEJ Trans. Electr. Electron. Eng. 14(2), 238–244 (2019)
92. Zheng, J., Fu, X., Zhang, G.: Research on exchange rate forecasting based on deep belief
network. Neural Comput. Appl. 31, 573–582 (2019)
93. Ahmad, M., et al.: Deep belief network modeling for automatic liver segmentation. IEEE
Access 7, 20585–20595 (2019)
94. Ronoud, S., Asadi, S.: An evolutionary deep belief network extreme learning-based for
breast cancer diagnosis. Soft Comput. (2019)
95. Ghimire, S., et al.: Deep solar radiation forecasting with convolutional neural network and
long short-term memory network algorithms. Appl. Energy (2019)

96. Hong, J., Wang, Z., Yao, Y.: Fault prognosis of battery system based on accurate voltage
abnormity prognosis using long short-term memory neural networks. Appl. Energy (2019)
97. Krishan, M., et al.: Air quality modelling using long short-term memory (LSTM) over
NCT-Delhi, India. Air Qual. Atmos. Health 12(8), 899–908 (2019)
98. Zhang, R., et al.: Deep long short-term memory networks for nonlinear structural seismic
response prediction. Comput. Struct. 220, 55–68 (2019)
99. Hua, Y., et al.: Deep learning with long short-term memory for time series prediction. IEEE
Commun. Mag. 57(6), 114–119 (2019)
100. Zhang, J., et al.: Short-term forecasting and uncertainty analysis of wind turbine power
based on long short-term memory network and Gaussian mixture model. Appl. Energy,
229–244 (2019)
101. Vardaan, K., et al.: Earthquake trend prediction using long short-term memory RNN. Int.
J. Electr. Comput. Eng. 9(2), 1304–1312 (2019)
102. Mesri Gundoshmian, T., Ardabili, S., Mosavi, A., Varkonyi-Koczy, A.: Prediction of combine harvester performance using hybrid machine learning modeling and response surface methodology. Preprints 2019
103. Ardabili, S., Mosavi, A., Varkonyi-Koczy, A.: Systematic review of deep learning and
machine learning models in biofuels research, Preprints 2019
104. Ardabili, S., Mosavi, A., Varkonyi-Koczy, A.: Advances in machine learning modeling
reviewing hybrid and ensemble methods, Preprints 2019
105. Ardabili, S., Mosavi, A., Varkonyi-Koczy, A.: Building Energy information: demand and
consumption prediction with Machine Learning models for sustainable and smart cities,
Preprints 2019
106. Ardabili, S., Mosavi, A., Dehghani, M., Varkonyi-Koczy, A., Deep learning and machine
learning in hydrological processes climate change and earth systems a systematic review,
Preprints 2019
107. Mohammadzadeh, D., Karballaeezadeh, N., Mohemmi, M., Mosavi, A., Varkonyi-Koczy
A.: Urban train soil-structure interaction modeling and analysis, Preprints 2019
108. Mosavi, A., Ardabili, S., Varkonyi-Koczy, A.: List of deep learning models, Preprints 2019
109. Nosratabadi, S., Mosavi, A., Keivani, R., Ardabili, S., Aram, F.: State of the art survey of
deep learning and machine learning models for smart cities and urban sustainability,
Preprints 2019
