Applied Neural Networks with TensorFlow 2
API Oriented Deep Learning with Python
Orhan Gazi Yalçın
Table of Contents

Chapter 1: Introduction
Python as Programming Language
Timeline of Python
Python 2 vs. Python 3
Why Python?
TensorFlow As Deep Learning Framework
Timeline of TensorFlow
Why TensorFlow?
What's New in TensorFlow 2.x
TensorFlow Competitors
Installation and Environment Setup
Interactive Programming Environments: IPython, Jupyter Notebook, and Google Colab
IPython
Jupyter Notebook
Google Colab
Hardware Options and Requirements
Perceptron
A Modern Deep Neural Network
Activation Functions
Loss (Cost or Error) Functions
Optimization in Deep Learning
Backpropagation
Optimization Algorithms
Optimization Challenges
Overfitting and Regularization
Overfitting
Regularization
Feature Scaling
Final Evaluations
RNN Types
Simple RNNs
Long Short-Term Memory (LSTM)
Gated Recurrent Units (GRUs)
Case Study | Sentiment Analysis with IMDB Reviews
Preparing Our Colab for GPU Accelerated Training
IMDB Reviews
Preparing the Dataset
Building the Recurrent Neural Network
Compiling and Fitting the Model
Evaluating the Model
Making New Predictions
Saving and Loading the Model
Conclusion
Video Games
Malicious Applications and Deep Fake
Miscellaneous Applications
Case Study | Digit Generation with MNIST
Initial Imports
Load and Process the MNIST Dataset
Build the GAN Model
Train the GAN Model
Animate Generated Digits During the Training
Conclusion
Index
About the Author
Orhan Gazi Yalçın is a joint PhD candidate at
the University of Bologna and the Polytechnic
University of Madrid. After completing
his double major in business and law, he
began his career in Istanbul, working for a
city law firm, Allen & Overy, and a global
entrepreneurship network, Endeavor. During
his academic and professional career, he
taught himself programming and excelled
in machine learning. He currently conducts
research on hotly debated law and AI topics
such as explainable artificial intelligence
and the right to explanation by combining his technical and legal skills.
In his spare time, he enjoys free diving, swimming, exercising, as well as
discovering new countries, cultures, and cuisines.
www.orhangaziyalcin.com
www.linkedin.com/in/orhangaziyalcin
About the Technical Reviewer
Vishwesh Ravi Shrimali graduated from BITS Pilani in 2018, where
he studied mechanical engineering. Since then, he has worked with
BigVision LLC on deep learning and computer vision and was involved in
creating official OpenCV AI courses. Currently, he is working at Mercedes
Benz Research and Development India Pvt. Ltd. He has a keen interest
in programming and AI and has applied that interest in mechanical
engineering projects. He has also written multiple blogs on OpenCV and
deep learning on LearnOpenCV, a leading blog on computer vision. He has
also coauthored Machine Learning for OpenCV4 (second edition) by Packt.
When he is not writing blogs or working on projects, he likes to go on long
walks or play his acoustic guitar.
Acknowledgments
This book was written during a global lockdown due to the Covid-19
pandemic, which created a new normal that I had never experienced
before. Writing a book in the middle of a global crisis was a very intense
experience, and I was uncertain about taking this responsibility for a
long time. Thanks to my family and friends, I was able to complete the
book even earlier than scheduled. Now I am glad that I accepted Aaron’s
invitation, who guided me throughout the whole process. Thank you very
much for reaching out to me in the first place and making it possible to
have this book written.
I would like to thank Jessica Vakili for coordinating the entire project
and for being there whenever I needed help. I would also like to thank Vishwesh
Ravi Shrimali for reviewing every single line of the book and providing me
with all the valuable comments, which helped to improve the quality of the
book tremendously.
Being surrounded by people who all have a positive attitude made
this experience very fruitful, and I am looking forward to working with
them in the future. Thank you all very much!
CHAPTER 1
Introduction
In this book, we dive into the realms of deep learning (DL) and cover
several deep learning concepts along with several case studies. These case
studies range from image recognition to recommender systems, from art
generation to object clustering. Deep learning is part of a broader family
of machine learning (ML) methods based on artificial neural networks
(ANNs) with representation learning. These neural networks mimic human brain cells, or neurons, for algorithmic learning, and they can learn certain tasks far faster than humans can. Several deep learning
methods offer solutions to different types of machine learning problems:
(i) supervised learning, (ii) unsupervised learning, (iii) semi-supervised
learning, and (iv) reinforcement learning.
This book is structured to also include an introduction to
the discipline of machine learning so that the reader may be acquainted
with the general rules and concepts of machine learning. Then, a detailed
introduction to deep learning is provided to familiarize the reader with the
sub-discipline of deep learning.
After covering the fundamentals of deep learning, the book covers
different types of artificial neural networks with their potential real-life
applications (i.e., case studies). Therefore, in each chapter, this book (i)
introduces the concept of a particular neural network architecture with
details on its components and then (ii) provides a tutorial on how to
apply this network structure to solve a particular artificial intelligence
(AI) problem.
Since the goal of this book is to provide case studies for deep learning applications, competency in several technologies and libraries is required for a satisfactory learning experience.
Before diving into machine learning and deep learning, we start with an introduction to the technologies used in this book. This introduction includes the latest developments and the reasoning as to why these technologies were selected. Finally, this chapter also covers how to install these technologies and prepare your environment with a minimum amount of hassle. The technologies at the center of this book are Python as the programming language, TensorFlow as the deep learning framework, and interactive programming environments such as IPython, Jupyter Notebook, and Google Colab.
Please note that this book assumes that you use Google Colab, which
requires almost no environment setup. The chapter also includes a local
Jupyter Notebook installation guide if you prefer a local environment. You
may skip the Jupyter Notebook installation section if you decide to use
Google Colab.
Timeline of Python
Let’s take a look at the timeline of Python:
Why Python?
Compared to other programming languages, there are several reasons
for Python’s popularity among data scientists and machine learning
engineers. The 2019 Kaggle Machine Learning and Data Science Survey
revealed that Python is by far the most popular programming language for
data science and machine learning; see Figure 1-1.
Figure 1-1. 2019 Kaggle Machine Learning and Data Science Survey
Ease of Learning
One of the main reasons for newcomers to choose Python as their primary
programming language is its ease of learning. When compared to other
programming languages, Python offers a shorter learning curve so that
programmers can achieve a good level of competency in a short amount
of time. Python’s syntax is easier to learn, and the code is more readable
compared to other popular programming languages. A common example
to show this is the amount of code required by different programming
languages to print out “Hello, World!”. For instance, to be able to print out
Hello, World! in Java, you need the following code:
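public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, World!");
    }
}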
The same result may be achieved with a single line of code in Python:
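print("Hello, World!")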
Community Support
The powerful community support is another advantage of Python over
other programming languages. More and more volunteers are releasing
Python libraries, and this practice has made Python a language rich in modern and powerful libraries. In addition, a large number of seasoned Python programmers are always ready to help others with their problems on online community channels such as Stack Overflow.
Visualization Options
Data visualization is an important discipline to extract insights from raw
data, and Python offers several useful visualization options. The good old
Matplotlib is always there with the most customizable options. In addition,
Seaborn and Pandas Plot API are powerful libraries that streamline the
most common visualization tasks used by data scientists. Additionally,
libraries like Plotly and Dash allow users to create interactive plots and
sophisticated dashboards to be served on the Web. With these libraries,
data scientists may easily create charts, draw graphical plots, and facilitate
feature extraction.
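As a minimal sketch of how little code this takes, the following lines plot a small, made-up dataset with the Pandas Plot API on top of Matplotlib; the column names and values are purely illustrative:

import pandas as pd
import matplotlib.pyplot as plt

# A tiny, made-up dataset used only to illustrate the plotting workflow
df = pd.DataFrame({
    "year": [2017, 2018, 2019, 2020],
    "users": [150, 320, 610, 900],
})

# The Pandas Plot API wraps Matplotlib, so one call produces a line chart
df.plot(x="year", y="users", kind="line", title="Hypothetical user growth")
plt.show()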
Now that we have covered why Python is the favorite language of data scientists, we can move on to why we use TensorFlow as our machine learning framework.
TensorFlow As Deep Learning Framework
Google released TensorFlow under the Apache License 2.0 in November
2015, which made it an open source library.1 Although the use cases of
TensorFlow are not limited to machine learning applications, machine
learning is the field where we see TensorFlow’s strength.
The two programming languages with stable and official
TensorFlow APIs are Python and C. Also, C++, Java, JavaScript, Go, and
Swift are other programming languages where developers may find
limited-to-extensive TensorFlow compatibility. Finally, there are third-
party TensorFlow APIs for C#, Haskell, Julia, MATLAB, R, Scala, Rust,
OCaml, and Crystal.
Timeline of TensorFlow
Although this book focuses on TensorFlow 2.x with Python API, there
are several complementary TensorFlow libraries released by Google.
Understanding the development of the TensorFlow platform is essential to
see the full picture. The timeline of the milestones achieved by Google as
part of the TensorFlow project may be summarized as follows:
1 Google Just Open Sourced TensorFlow, Its Artificial Intelligence Engine | WIRED, www.wired.com/2015/11/google-open-sources-its-artificial-intelligence-engine/ (last visited Jun 5, 2020)
Why TensorFlow?
There are more than two dozen deep learning libraries developed by
tech giants, tech foundations, and academic institutions that are available
to the public. While each framework has its advantage in a particular sub-
discipline of deep learning, this book focuses on TensorFlow with Keras
API. The main reason for choosing TensorFlow over other deep learning
frameworks is its popularity. This does not mean, however, that the other – less popular – frameworks are inferior to TensorFlow. Especially with the introduction of version 2.0, TensorFlow
strengthened its power by addressing the issues raised by the deep
learning community. Today, TensorFlow may be seen as the most popular
deep learning framework, which is very powerful and easy to use and has
excellent community support.
What's New in TensorFlow 2.x
Run and Debug with Eager Execution, Then Use AutoGraph API
for the Benefits of Graphs
TensorFlow 1.x versions prioritized TensorFlow graphs, which was not friendly to newcomers. Even though this methodology was kept in TensorFlow 2.0, eager execution – the contrasting approach – was made the default. Google explained the initial reasoning for this change with the
following statement:
2 Google AI Blog: Eager Execution: An imperative, define-by-run interface to TensorFlow, https://ai.googleblog.com/2017/10/eager-execution-imperative-define-by.html (last visited Jun 8, 2020)
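As a minimal sketch of the difference (assuming TensorFlow 2.x is installed), the snippet below first runs an operation eagerly and then wraps the same computation with tf.function, which uses AutoGraph to build a graph behind the scenes:

import tensorflow as tf

# Eager execution (the default in TensorFlow 2.x): operations run immediately
x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
print(tf.reduce_sum(x))  # returns a concrete tensor right away

# The same computation compiled into a graph via the AutoGraph machinery
@tf.function
def total(t):
    return tf.reduce_sum(t)

print(total(x))  # traced into a graph on the first call, then reused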
Export to SavedModel
After training a model, developers may export it to SavedModel. The tf.saved_model API may be used to build a complete TensorFlow program with
weights and computations. This standardized SavedModel can be used
interchangeably across different TensorFlow deployment libraries such as
(i) TensorFlow Serving, (ii) TensorFlow Lite, (iii) TensorFlow.js, and (iv)
TensorFlow Hub.
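A minimal sketch of this workflow might look as follows; the tiny model built here is only a placeholder for a trained network, and the export path is arbitrary:

import tensorflow as tf

# Placeholder model standing in for a trained tf.keras model
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

# Export the weights and computation graph to the SavedModel format
tf.saved_model.save(model, "saved_model/my_model")

# The exported directory can later be reloaded, or picked up by Serving, Lite, js, or Hub tooling
restored = tf.saved_model.load("saved_model/my_model")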
TensorFlow Serving
TensorFlow Serving is a flexible and high-performance TensorFlow library
that allows models to be served over HTTP/REST or gRPC/Protocol
Buffers. This system is platform- and language-neutral, as you may make an HTTP call using any programming language.
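For illustration, once a SavedModel is served locally by TensorFlow Serving on its default REST port (8501), a prediction request can be sent from Python; the model name and input values below are hypothetical and must match the served model's signature:

import json
import requests

# Hypothetical model name and input; adjust to the served model's signature
url = "http://localhost:8501/v1/models/my_model:predict"
payload = {"instances": [[1.0, 2.0, 3.0, 4.0]]}

response = requests.post(url, data=json.dumps(payload))
print(response.json())  # e.g., {"predictions": [...]}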
TensorFlow Lite
TensorFlow Lite is a lightweight deep learning framework to deploy models
to mobile devices (iOS and Android) or embedded devices (Raspberry Pi
or Edge TPUs). Developers may pick a trained model, convert the model
into a compressed FlatBuffer (a .tflite file), and deploy it to a mobile or embedded device
with TensorFlow Lite.
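A minimal sketch of the conversion step, assuming the SavedModel directory exported earlier, could look like this:

import tensorflow as tf

# Convert a SavedModel directory into a compressed FlatBuffer (.tflite) file
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model/my_model")
tflite_model = converter.convert()

# Write the FlatBuffer to disk so it can be bundled with a mobile or embedded app
with open("model.tflite", "wb") as f:
    f.write(tflite_model)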