Module-V CPU, TPU, GPU

CPU, TPU, GPU

Dr. Bahubali Shiragapur
Introduction
• TPUs and GPUs are two of the most popular processors for machine learning applications, each offering distinct benefits depending on the specific requirements of a project.
• CPUs (Central Processing Units) have been the traditional processor choice for many years, but GPUs (Graphics Processing Units) and TPUs (Tensor Processing Units) offer specialized features designed to maximize performance for certain types of calculations.
GPU
• GPUs were originally designed for graphics processing but are also widely used for deep learning. They are highly parallel processors with thousands of cores optimized for vector and matrix operations, which makes them versatile enough to handle graphics rendering, simulations, scientific computing, and neural network computations. They have an extensive ecosystem of software tools, such as CUDA, cuDNN, TensorFlow, and PyTorch, built around them. However, they consume more power than TPUs and can be expensive, especially for small businesses or individual researchers who require high-performance GPUs.
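
As a minimal sketch of the kind of vector/matrix work a GPU accelerates, the PyTorch snippet below (assuming PyTorch with CUDA support is installed) offloads a large matrix multiplication to a GPU when one is available:

```python
import torch

# Use a CUDA-capable GPU if one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Two large matrices; on a GPU the multiplication is spread across thousands of cores.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

c = torch.matmul(a, b)      # executed on the selected device
print(c.device, c.shape)    # e.g. "cuda:0 torch.Size([4096, 4096])"
```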
TPUs

• TPUs, on the other hand, are purpose-built for machine learning, with tensor operations as the core building blocks of their neural network computations. Their streamlined architecture is tailored to accelerate tensor calculations, giving them higher performance than CPUs or GPUs on many machine learning workloads while consuming less energy than GPUs. The tradeoff is that they offer fewer memory options than GPUs but higher memory bandwidth, which makes them better suited to large batches of data or large models where computational efficiency is key. Furthermore, they are integrated with Google's cloud platform, so projects that require scalability can take advantage of this without worrying about hardware availability, supply-chain limits, or the cost of setting up one's own infrastructure from scratch.
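
As a rough illustration of how a Cloud TPU is used from TensorFlow (for example in a Colab TPU runtime, as in the links later in this module), the sketch below connects to an attached TPU and builds a model under a TPUStrategy. It assumes a TPU runtime is already attached; this is a sketch of the common pattern, not a definitive setup:

```python
import tensorflow as tf

# Connect to the attached Cloud TPU; an empty string means "use the TPU
# address provided by the environment" (as in a Colab TPU runtime).
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# TPUStrategy replicates computation across the TPU cores.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # Any model built inside this scope is placed on the TPU cores.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(784,)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```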
How to decide
• The choice between a GPU and a TPU ultimately comes down to the specific requirements of a project: budget constraints, development environment capabilities, and so on. For instance, if numerical precision matters, a GPU may be preferred, since GPUs offer greater flexibility in precision. If the time from model conception to deployment is critical, a TPU might be favored, given its faster inference times compared to a GPU and its better energy efficiency at scale over long periods.
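
To make the precision point concrete, the sketch below uses the Keras mixed-precision API: float16 is the usual mixed-precision choice on GPUs, while TPUs are built around bfloat16. This is an illustrative snippet, not a recommendation for any particular project:

```python
import tensorflow as tf

# GPUs support a range of numeric precisions; mixed float16 is a common
# choice for speed on recent NVIDIA hardware.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# TPUs are designed around bfloat16, so the corresponding policy would be:
# tf.keras.mixed_precision.set_global_policy("mixed_bfloat16")

print(tf.keras.mixed_precision.global_policy())  # e.g. <Policy "mixed_float16">
```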
CPU vs GPU vs TPU: Understanding the basics
• CPUs are the most common processor used in modern computers, and they
are designed to handle general tasks such as data processing, calculations,
and input/output.
• GPUs, or graphics processing units, are specialized processors designed for
graphical rendering tasks such as gaming and video editing.
• TPUs, or tensor processing units, are a newer type of processor tailored
specifically for machine learning workloads.
• All three types of processors can be used to perform AI tasks; however, their
architectures differ significantly.
• CPUs have a general-purpose architecture, GPUs offer flexibility and a range of precision options, and TPUs are optimized for tensor operations.
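
A quick way to see which of these processor types a TensorFlow runtime can actually use is to list the visible devices, as in the sketch below (TPU devices only appear after the TPU system has been initialized):

```python
import tensorflow as tf

# Report how many devices of each type the current runtime exposes.
for kind in ("CPU", "GPU", "TPU"):
    devices = tf.config.list_logical_devices(kind)
    print(f"{kind}: {len(devices)} device(s)")
```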
CPU & TPU
• https://cloud.google.com/tpu/docs/intro-to-tpu
TPU & CPU
• https://colab.research.google.com/drive/1L8hhkgZX2OYP_LOaMdtCjufhCD-jWOGe?authuser=1#scrollTo=oiFxIrMqyYnW
TPU: In this example, we'll work through training a model to classify images of
flowers on Google's lightning-fast Cloud TPUs. Our model will take as input a photo of a
flower and return whether it is a daisy, dandelion, rose, sunflower, or tulip.

https://colab.research.google.com/notebooks/tpu.ipynb#scrollTo=ovFDeMgtjqW4
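
As a rough, hypothetical stand-in for the notebook's classifier (not its actual architecture), the sketch below builds a small CNN that maps a flower photo to one of the five classes mentioned above:

```python
import tensorflow as tf

# Hypothetical example: a small CNN mapping a 224x224 RGB photo to one of
# five flower classes, mirroring the task in the linked notebook.
CLASSES = ["daisy", "dandelion", "rose", "sunflower", "tulip"]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(len(CLASSES), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```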
