
MNIST database

The MNIST database (Modified National Institute of Standards and Technology database[1]) is a large
database of handwritten digits that is commonly used for training various image processing systems.[2][3]
The database is also widely used for training and testing in the field of machine learning.[4][5] It was
created by "re-mixing" the samples from NIST's original datasets.[6] The creators felt that since NIST's
training dataset was taken from American Census Bureau employees, while the testing dataset was taken
from American high school students, it was not well-suited for machine learning experiments.[7]
Furthermore, the black and white images from NIST were normalized to fit into a 28x28 pixel bounding
box and anti-aliased, which introduced grayscale levels.[7]
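The effect of that normalization step can be imitated in a few lines: averaging a high-resolution black-and-white image over the cells of a 28x28 grid acts as a crude anti-aliasing filter and produces intermediate gray values. This is an illustrative sketch only, not the exact NIST pipeline (which also centered each digit in the box by its center of mass):

```python
import numpy as np

def downsample_antialias(binary_img, size=28):
    """Average each cell of a size x size grid over the input pixels it
    covers. The averaging acts as a crude anti-aliasing filter, so a
    purely black-and-white input comes out with intermediate gray
    levels. Assumes the input is at least size x size."""
    h, w = binary_img.shape
    row_bin = (np.arange(h) * size) // h      # output row of each input row
    col_bin = (np.arange(w) * size) // w      # output column of each input column
    total = np.zeros((size, size))
    count = np.zeros((size, size))
    for i in range(h):
        for j in range(w):
            total[row_bin[i], col_bin[j]] += binary_img[i, j]
            count[row_bin[i], col_bin[j]] += 1
    return total / count

# A 128x128 binary image with a one-pixel diagonal stroke.
img = np.zeros((128, 128))
img[np.arange(128), np.arange(128)] = 1.0
small = downsample_antialias(img)
print(small.shape)                 # (28, 28)
print(np.unique(small).size > 2)   # True: gray levels appeared
```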

The MNIST database contains 60,000 training images and 10,000 testing images.[8] Half of the training set and half of the test set were taken from NIST's training dataset, while the other half of the training set and the other half of the test set were taken from NIST's testing dataset.[9] The original creators of the database keep a list of some of the methods tested on it.[7] In their original paper, they use a support-vector machine to get an error rate of 0.8%.[10]

[Figure: Sample images from the MNIST test dataset]
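The training and test images are distributed as files in a simple big-endian IDX binary format: a 16-byte header of four 32-bit integers followed by the raw pixel bytes. A minimal reader, assuming the standard layout documented on the dataset's website, might look like:

```python
import struct
import numpy as np

def read_idx_images(path):
    """Read an IDX image file (e.g. train-images-idx3-ubyte).

    The header is four big-endian 32-bit integers: a magic number
    (2051 for image files, 2049 for label files), the number of
    items, the row count, and the column count; the pixel bytes
    follow in row-major order, one byte per pixel."""
    with open(path, "rb") as f:
        magic, n, rows, cols = struct.unpack(">IIII", f.read(16))
        if magic != 2051:
            raise ValueError(f"unexpected magic number {magic}")
        data = np.frombuffer(f.read(), dtype=np.uint8)
        return data.reshape(n, rows, cols)

# images = read_idx_images("train-images-idx3-ubyte")  # -> (60000, 28, 28)
```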

Extended MNIST (EMNIST) is a newer dataset developed and released by NIST to be the (final) successor to MNIST.[11][12] MNIST included images only of handwritten digits. EMNIST includes all the images from NIST Special Database 19, a large database of handwritten uppercase and lowercase letters as well as digits.[13][14] The images in EMNIST were converted into the same 28x28 pixel format, by the same process, as the MNIST images. Accordingly, tools which work with the older, smaller MNIST dataset will likely work unmodified with EMNIST.

History
The set of images in the MNIST database was created in 1994[15] as a combination of two of NIST's databases: Special Database 1 and Special Database 3.
Special Database 1 and Special Database 3 consist of digits written by high school students and employees of the United States Census Bureau, respectively.[7]

The original dataset was a set of 128x128 binary images, processed into 28x28 grayscale images. Both the training set and the testing set originally contained 60,000 samples, but 50,000 of the test samples were discarded. See [16] for a detailed history and a reconstruction of the discarded test set.

Performance
Some researchers have achieved "near-human performance" on the MNIST database, using a committee of neural networks; in the same paper, the authors achieve
performance double that of humans on other recognition tasks.[17] The highest error rate listed[7] on the original website of the database is 12 percent, which is
achieved using a simple linear classifier with no preprocessing.[10]
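A linear baseline of this kind feeds the 784 raw pixel values of each flattened 28x28 image straight into a linear model. The sketch below trains a multinomial logistic (softmax) classifier with plain gradient descent on synthetic data standing in for MNIST, so it runs without downloading the dataset; it illustrates the model class, not the 12 percent figure itself:

```python
import numpy as np

# Toy stand-in for MNIST: synthetic 784-dimensional "images" whose
# mean depends on the class label, so a linear model can separate them.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 784))
y = rng.integers(0, 10, size=500)
onehot = np.eye(10)[y]
X += onehot @ rng.normal(size=(10, 784))          # class-dependent shift

W = np.zeros((784, 10))
b = np.zeros(10)
for _ in range(300):                              # plain gradient descent
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)             # softmax probabilities
    grad = (p - onehot) / len(X)                  # cross-entropy gradient
    W -= 0.02 * (X.T @ grad)
    b -= 0.02 * grad.sum(axis=0)

acc = (np.argmax(X @ W + b, axis=1) == y).mean()
print(acc)  # well above the 10% chance level on this separable toy data
```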

In 2004, a best-case error rate of 0.42 percent was achieved on the database by researchers using a new classifier called the LIRA, which is a neural classifier with
three neuron layers based on Rosenblatt's perceptron principles.[18]

Some researchers have tested artificial intelligence systems using the database put under random distortions. The systems in these cases are usually neural networks
and the distortions used tend to be either affine distortions or elastic distortions.[7] Sometimes, these systems can be very successful; one such system achieved an
error rate on the database of 0.39 percent.[19]
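An elastic distortion shifts every pixel by a smooth random displacement field before resampling. The sketch below is a deliberately crude, numpy-only stand-in for the Gaussian-smoothed fields used in the literature: the field here is a coarse grid of random offsets enlarged by repetition, and resampling is nearest-neighbour rather than bilinear.

```python
import numpy as np

def elastic_distort(image, alpha=4.0, grid=7, rng=None):
    """Shift every pixel of a square image by a random displacement
    field and resample with nearest-neighbour lookup. The field is a
    coarse grid x grid array of offsets in [-alpha, alpha], enlarged
    by repetition - a crude stand-in for a Gaussian-smoothed field."""
    rng = rng or np.random.default_rng()
    h, w = image.shape
    rep = -(-h // grid)                      # ceil(h / grid)
    tile = np.ones((rep, rep))
    dx = np.kron(rng.uniform(-1, 1, (grid, grid)), tile)[:h, :w] * alpha
    dy = np.kron(rng.uniform(-1, 1, (grid, grid)), tile)[:h, :w] * alpha
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_y = np.clip(np.rint(ys + dy), 0, h - 1).astype(int)
    src_x = np.clip(np.rint(xs + dx), 0, w - 1).astype(int)
    return image[src_y, src_x]               # permutes pixel values only

digit = np.zeros((28, 28))
digit[6:22, 13:15] = 1.0                     # a crude vertical stroke
warped = elastic_distort(digit, rng=np.random.default_rng(0))
print(warped.shape)                          # (28, 28)
```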

In 2011, an error rate of 0.27 percent, improving on the previous best result, was reported by researchers using a similar system of neural networks.[20] In 2013, an approach based on regularization of neural networks using DropConnect was claimed to achieve a 0.21 percent error rate.[21] In 2016, the best performance of a single convolutional neural network was a 0.25 percent error rate,[22] and as of August 2018 this remained the best reported result for a single convolutional neural network trained on the MNIST training data without data augmentation.[22][23] The Parallel Computing Center (Khmelnytskyi, Ukraine) obtained an ensemble of only five convolutional neural networks that performs on MNIST at a 0.21 percent error rate.[24][25] Some images in the testing dataset are barely readable and may prevent reaching a test error rate of 0%.[26] In 2018, researchers from the Department of Systems and Information Engineering at the University of Virginia announced a 0.18% error rate using three kinds of neural networks stacked simultaneously (fully connected, recurrent, and convolutional neural networks).[27]

Classifiers
This is a table of some of the machine learning methods used on the dataset and their error rates, by type of classifier:
Type | Classifier | Distortion | Preprocessing
Linear classifier | Pairwise linear classifier | None | Deskewing
K-Nearest Neighbors | K-NN with rigid transformations | None | None
K-Nearest Neighbors | K-NN with non-linear deformation (P2DHMDM) | None | Shiftable edges
Boosted Stumps | Product of stumps on Haar features | None | Haar features
Non-linear classifier | 40 PCA + quadratic classifier | None | None
Random Forest | Fast Unified Random Forests for Survival, Regression, and Classification (RF-SRC)[31] | None | Simple statistical pixel importance
Support-vector machine (SVM) | Virtual SVM, deg-9 poly, 2-pixel jittered | None | Deskewing
Deep neural network (DNN) | 2-layer 784-800-10 | None | None
Deep neural network | 2-layer 784-800-10 | Elastic distortions | None
Deep neural network | 6-layer 784-2500-2000-1500-1000-500-10 | Elastic distortions | None
Convolutional neural network (CNN) | 6-layer 784-40-80-500-1000-2000-10 | None | Expansion of the training data
Convolutional neural network | 6-layer 784-50-100-500-1000-10-10 | None | Expansion of the training data
Convolutional neural network (CNN) | 13-layer 64-128(5x)-256(3x)-512-2048-256-256-10 | None | None
Convolutional neural network | Committee of 35 CNNs, 1-20-P-40-P-150-10 | Elastic distortions | Width normalizations
Convolutional neural network | Committee of 5 CNNs, 6-layer 784-50-100-500-1000-10-10 | None | Expansion of the training data
Random Multimodel Deep Learning (RMDL) | 10 NN-10 RNN-10 CNN | None | None
Convolutional neural network | Committee of 20 CNNs with Squeeze-and-Excitation Networks[38] | None | Data augmentation
Convolutional neural network | Ensemble of 3 CNNs with varying kernel sizes | None | Data augmentation consisting of rotation and translation
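To make the architecture strings above concrete, the "2-layer 784-800-10" entry denotes 784 flattened pixel inputs, one hidden layer of 800 units, and 10 output classes. A minimal forward pass with those shapes, using randomly initialized weights and a ReLU as a stand-in activation (the cited experiments may have used other nonlinearities), might look like:

```python
import numpy as np

# Randomly initialized weights with the 784-800-10 shapes from the table.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 0.01, size=(784, 800)), np.zeros(800)
W2, b2 = rng.normal(0, 0.01, size=(800, 10)), np.zeros(10)

def forward(x):
    """Forward pass for a batch of flattened 28x28 images."""
    h = np.maximum(0.0, x @ W1 + b1)   # hidden layer, 800 units
    return h @ W2 + b2                 # class logits, 10 outputs

batch = rng.normal(size=(32, 784))     # a fake batch of 32 "images"
print(forward(batch).shape)            # (32, 10)
```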

See also
List of datasets for machine learning research
Caltech 101
LabelMe
OCR

References
1. "THE MNIST DATABASE of handwritten digits" (http://yann.lecun.com/exdb/mnist/). Yann LeCun (Courant Institute, NYU), Corinna Cortes (Google Labs, New York), Christopher J.C. Burges (Microsoft Research, Redmond).
2. "Support vector machines speed pattern recognition". Vision Systems Design. Retrieved 17 August 2013.
3. Gangaputra, Sachin. "Handwritten digit database" (http://cis.jhu.edu/~sachin/digit/digit.html). Retrieved 17 August 2013.
4. Qiao, Yu (2007). "THE MNIST DATABASE of handwritten digits". Retrieved 18 August 2013.
5. Platt, John C. (1999). "Using analytic QP and sparseness to speed training of support vector machines". Advances in Neural Information Processing Systems: 557–563. Retrieved 18 August 2013.
6. Grother, Patrick J. "NIST Special Database 19 – Handprinted Forms and Characters Database". National Institute of Standards and Technology.
7. LeCun, Yann; Cortes, Corinna; Burges, Christopher J.C. "The MNIST Handwritten Digit Database" (http://yann.lecun.com/exdb/mnist/). Retrieved 30 April 2020.
8. Kussul, Ernst; Baidyk, Tatiana (2004). "Improved method of handwritten digit recognition tested on MNIST database". Image and Vision Computing. 22 (12): 971–981. doi:10.1016/j.imavis.2004.03.008.
9. Zhang, Bin; Srihari, Sargur N. (2004). "Fast k-Nearest Neighbor Classification Using Cluster-Based Trees". IEEE Transactions on Pattern Analysis and Machine Intelligence. 26 (4): 525–528. doi:10.1109/TPAMI.2004.1265868. PMID 15382657. Retrieved 20 April 2020.
10. LeCun, Yann; Bottou, Léon; Bengio, Yoshua; Haffner, Patrick (1998). "Gradient-Based Learning Applied to Document Recognition". Proceedings of the IEEE. 86 (11): 2278–2324. doi:10.1109/5.726791. Retrieved 18 August 2013.
11. NIST (4 April 2017). "The EMNIST Dataset" (https://www.nist.gov/itl/products-and-services/emnist-dataset). Retrieved 11 April 2022.
12. NIST (27 August 2010). "NIST Special Database 19" (https://www.nist.gov/srd/nist-special-database-19). Retrieved 11 April 2022.
13. Cohen, G.; Afshar, S.; Tapson, J.; van Schaik, A. (2017). "EMNIST: an extension of MNIST to handwritten letters". arXiv:1702.05373.
14. Cohen, G.; Afshar, S.; Tapson, J.; van Schaik, A. (2017). "EMNIST: an extension of MNIST to handwritten letters". arXiv:1702.05373v1.
15. http://yann.lecun.com/exdb/publis/pdf/bottou-94.pdf
16. Yadav, Chhavi; Bottou, Léon (2019). "Cold Case: The Lost MNIST Digits". Advances in Neural Information Processing Systems. 32. arXiv:1905.10498.
17. Cireşan, Dan; Meier, Ueli; Schmidhuber, Jürgen (2012). "Multi-column deep neural networks for image classification". 2012 IEEE Conference on Computer Vision and Pattern Recognition. pp. 3642–3649. arXiv:1202.2745. doi:10.1109/CVPR.2012.6248110. ISBN 978-1-4673-1228-8.
18. Kussul, Ernst; Baidyk, Tatiana (2004). "Improved method of handwritten digit recognition tested on MNIST database". Image and Vision Computing. 22 (12): 971–981. doi:10.1016/j.imavis.2004.03.008. Retrieved 20 September 2013.
19. Ranzato, Marc'Aurelio; Poultney, Christopher; Chopra, Sumit; LeCun, Yann (2006). "Efficient Learning of Sparse Representations with an Energy-Based Model". Advances in Neural Information Processing Systems. 19: 1137–1144. Retrieved 20 September 2013.
20. Cireşan, Dan Claudiu; Meier, Ueli; Gambardella, Luca Maria; Schmidhuber, Jürgen (2011). "Convolutional neural network committees for handwritten character classification". 2011 International Conference on Document Analysis and Recognition (ICDAR). pp. 1135–1139. doi:10.1109/ICDAR.2011.229. ISBN 978-1-4577-1350-7.
21. Wan, Li; Zeiler, Matthew; Zhang, Sixin; LeCun, Yann; Fergus, Rob (2013). "Regularization of Neural Networks using DropConnect". International Conference on Machine Learning (ICML).
22. SimpleNet (2016). "Lets Keep it simple, Using simple architectures to outperform deeper and more complex architectures" (https://github.com/Coderx7/SimpleNet). arXiv:1608.06037. Retrieved 3 December 2020.
23. SimpNet (2018). "Towards Principled Design of Deep Convolutional Networks: Introducing SimpNet" (https://github.com/Coderx7/SimpNet). arXiv:1802.06205. Retrieved 3 December 2020.
24. Romanuke, Vadim. "Parallel Computing Center (Khmelnytskyi, Ukraine) represents an ensemble of 5 convolutional neural networks which performs on MNIST at 0.21 percent error rate". Retrieved 24 November 2016.
25. Romanuke, Vadim (2016). "Training data expansion and boosting of convolutional neural networks for reducing the MNIST dataset error rate". Research Bulletin of NTUU "Kyiv Polytechnic Institute". 6 (6): 29–34. doi:10.20535/1810-0546.2016.6.84115.
26. "Classify MNIST digits using Convolutional Neural Networks" (https://github.com/j05t/mnist). GitHub. Retrieved 3 August 2018.
27. Kowsari, Kamran; Heidarysafa, Mojtaba; Brown, Donald E.; Meimandi, Kiana Jafari; Barnes, Laura E. (2018). "RMDL: Random Multimodel Deep Learning for Classification". Proceedings of the 2018 International Conference on Information System and Data Mining. arXiv:1805.01890. doi:10.1145/3206098.3206111.
28. Lindblad, Joakim; Sladoje, Nataša (January 2014). "Linear time distances between fuzzy sets with applications to pattern matching and classification". IEEE Transactions on Image Processing. 23 (1): 126–136. doi:10.1109/TIP.2013.2286904. PMID 24158476.
29. Keysers, Daniel; Deselaers, Thomas; Gollan, Christian; Ney, Hermann (August 2007). "Deformation models for image recognition". IEEE Transactions on Pattern Analysis and Machine Intelligence. 29 (8): 1422–1435. doi:10.1109/TPAMI.2007.1153. PMID 17568145.
30. Kégl, Balázs; Busa-Fekete, Róbert (2009). "Boosting products of base classifiers". Proceedings of the 26th Annual International Conference on Machine Learning: 497–504. doi:10.1145/1553374.1553439. ISBN 9781605585161. Retrieved 27 August 2013.
31. "RandomForestSRC: Fast Unified Random Forests for Survival, Regression, and Classification (RF-SRC)" (https://cran.r-project.org/web/packages/randomForestSRC/). 21 January 2020.
32. "Mehrad Mahmoudian / MNIST with RandomForest" (https://gitlab.com/mehrad/mnist-with-randomforest).
33. Decoste, Dennis; Schölkopf, Bernhard (2002). "Training Invariant Support Vector Machines". Machine Learning. 46 (1–3): 161–190. doi:10.1023/A:1012454411458. ISSN 0885-6125.
34. Simard, Patrice Y.; Steinkraus, Dave; Platt, John C. (2003). "Best Practices for Convolutional Neural Networks Applied to Visual Document Analysis". Proceedings of the Seventh International Conference on Document Analysis and Recognition. Vol. 1. p. 958. doi:10.1109/ICDAR.2003.1227801. ISBN 978-0-7695-1960-9.
35. Cireşan, Dan Claudiu; Meier, Ueli; Gambardella, Luca Maria; Schmidhuber, Jürgen (December 2010). "Deep Big Simple Neural Nets Excel on Handwritten Digit Recognition". Neural Computation. 22 (12): 3207–3220. arXiv:1003.0358. doi:10.1162/NECO_a_00052. PMID 20858131.
36. Romanuke, Vadim. "The single convolutional neural network best performance in 18 epochs on the expanded training data at Parallel Computing Center, Khmelnytskyi, Ukraine". Retrieved 16 November 2016.
37. Romanuke, Vadim. "Parallel Computing Center (Khmelnytskyi, Ukraine) gives a single convolutional neural network performing on MNIST at 0.27 percent error rate". Retrieved 24 November 2016.
38. Hu, Jie; Shen, Li; Albanie, Samuel; Sun, Gang; Wu, Enhua (2019). "Squeeze-and-Excitation Networks". IEEE Transactions on Pattern Analysis and Machine Intelligence. 42 (8): 2011–2023. arXiv:1709.01507. doi:10.1109/TPAMI.2019.2913372. PMID 31034408.
39. "MNIST-0.17: MNIST classifier with average 0.17% error" (https://github.com/Matuzas77/MNIST-0.17.git). GitHub. 25 February 2020.
40. An, Sanghyeon; Lee, Minjun; Park, Sanglee; Yang, Heerin; So, Jungmin (2020). "An Ensemble of Simple Convolutional Neural Network Models for MNIST Digit Recognition". arXiv:2008.10400.

Further reading
Cireşan, Dan; Meier, Ueli; Schmidhuber, Jürgen (June 2012). "Multi-column deep neural networks for image classification" (http://repository.supsi.ch/5145/1/IDSIA-04-12.pdf). 2012 IEEE Conference on Computer Vision and Pattern Recognition. New York, NY: Institute of Electrical and Electronics Engineers. pp. 3642–3649. arXiv:1202.2745. doi:10.1109/CVPR.2012.6248110. ISBN 9781467312264. OCLC 812295155. Retrieved 2013-12-09.

External links
Official website (http://yann.lecun.com/exdb/mnist/)
Visualization of the MNIST database (https://github.com/mbornet-hl/MNIST/tree/master/IMAGES/GROUPS) – groups of images of MNIST
handwritten digits on GitHub

