This document introduces the paper 'Deep Compression' by S. Han et al. Reading these slides should make the paper easy to understand. Japanese only.
Slides by Amaia Salvador at the UPC Computer Vision Reading Group.
Source document on GDocs with clickable links:
https://docs.google.com/presentation/d/1jDTyKTNfZBfMl8OHANZJaYxsXTqGCHMVeMeBe5o1EL0/edit?usp=sharing
Based on the original work:
Ren, Shaoqing, Kaiming He, Ross Girshick, and Jian Sun. "Faster R-CNN: Towards real-time object detection with region proposal networks." In Advances in Neural Information Processing Systems, pp. 91-99. 2015.
The document discusses metric learning for clustering. It motivates metric learning by showing how must-link and cannot-link pairwise constraints can steer clustering algorithms toward better solutions. It explains that metric learning learns a distance metric that respects the pairwise constraints, assigning small distances to similar pairs and larger distances to dissimilar pairs. The document outlines the MPCK-means algorithm, which learns an individual metric for each cluster while allowing different weights for individual constraints; a simplified sketch of the constrained assignment step follows below.
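As a minimal illustration of how pairwise constraints can enter a k-means-style assignment step, here is a Python sketch. It is a deliberate simplification of full MPCK-means: it uses a single Euclidean metric instead of per-cluster learned metrics, assigns points greedily, and uses one shared penalty weight w. All function and variable names are illustrative, not from the original slides.

```python
import numpy as np

def assign_with_constraints(X, centers, must_link, cannot_link, w=1.0):
    """Greedy assignment step of a PCK-means-style algorithm (sketch).

    Simplification vs. full MPCK-means: one fixed Euclidean metric
    (no per-cluster metric learning) and a single penalty weight w.
    """
    labels = -np.ones(len(X), dtype=int)   # -1 marks "not yet assigned"
    for i in range(len(X)):
        cost = np.sum((centers - X[i]) ** 2, axis=1)  # cost per cluster
        for a, b in must_link:
            j = b if a == i else a if b == i else -1
            if j >= 0 and labels[j] >= 0:
                cost += w                  # violating the link costs w...
                cost[labels[j]] -= w       # ...unless we join the partner
        for a, b in cannot_link:
            j = b if a == i else a if b == i else -1
            if j >= 0 and labels[j] >= 0:
                cost[labels[j]] += w       # sharing a cluster costs w
        labels[i] = int(np.argmin(cost))
    return labels

# Tiny usage example: the cannot-link pair (1, 2) is already satisfied,
# while the must-link pair (0, 1) keeps the first two points together.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
print(assign_with_constraints(X, centers, must_link=[(0, 1)],
                              cannot_link=[(1, 2)], w=10.0))
```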
This document summarizes a 2010 tutorial on metric learning given by Brian Kulis at the University of California, Berkeley. The tutorial introduces metric learning problems and algorithms. It discusses how metric learning can learn feature weights or linear/nonlinear transformations from data to improve distance metrics for tasks like clustering and classification. Key topics covered include Mahalanobis distance metrics, linear and nonlinear metric learning methods, and applications. The tutorial aims to explain both theoretical concepts and practical considerations for metric learning.
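For concreteness, the Mahalanobis metrics covered in the tutorial take the form d_M(x, y)^2 = (x - y)^T M (x - y) for a positive semidefinite matrix M. The short sketch below (variable names are mine, not the tutorial's) also checks the standard identity that such a metric is plain Euclidean distance after a learned linear map L with M = L^T L.

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance d_M(x, y)^2 = (x - y)^T M (x - y).
    M must be positive semidefinite; M = I recovers squared Euclidean."""
    d = x - y
    return float(d @ M @ d)

# Any PSD M factors as M = L.T @ L, so the learned metric is just
# Euclidean distance after the linear map x -> L x.
rng = np.random.default_rng(0)
L = rng.normal(size=(2, 2))          # illustrative transform (assumption)
M = L.T @ L                          # PSD by construction
x, y = rng.normal(size=2), rng.normal(size=2)
assert np.isclose(mahalanobis_sq(x, y, M),
                  np.sum((L @ x - L @ y) ** 2))
```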
Distance metric learning is a technique to learn a distance metric from training data to improve the performance of algorithms like classification and clustering. Large Margin Nearest Neighbor (LMNN) is an approach that learns a Mahalanobis distance metric for k-nearest neighbor classification by formulating it as a semidefinite program to minimize a cost function. It aims to bring similar examples closer while pushing dissimilar examples farther apart with a margin of at least 1 unit. Large Margin Component Analysis (LMCA) extends LMNN to high dimensional data by directly optimizing the objective with respect to a non-square dimensionality reduction matrix rather than a square distance metric matrix.
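The LMNN objective combines a "pull" term over same-label target neighbors with a hinge "push" term over differently labeled impostors. Here is a minimal Python sketch of that cost for a fixed metric matrix M; the helper names and the dictionary format for impostors are assumptions, and the published method minimizes this cost over PSD matrices M as a semidefinite program rather than evaluating it once.

```python
import numpy as np

def lmnn_loss(X, M, target_neighbors, impostors, c=0.5):
    """Sketch of the LMNN cost for a fixed Mahalanobis matrix M.

    target_neighbors: list of (i, j) pairs sharing the same label.
    impostors: dict mapping index i to differently labeled indices l.
    The hinge enforces d(i, l)^2 >= d(i, j)^2 + 1 (the unit margin).
    """
    def d2(i, j):
        diff = X[i] - X[j]
        return diff @ M @ diff
    pull = sum(d2(i, j) for i, j in target_neighbors)       # pull term
    push = sum(max(0.0, 1.0 + d2(i, j) - d2(i, l))          # push term
               for i, j in target_neighbors
               for l in impostors.get(i, []))
    return pull + c * push
```

In these terms, LMCA's move is to substitute M = L^T L for a rectangular L and optimize L directly by gradient descent, which is what makes the high-dimensional case tractable.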
Deep Auto-Encoder Neural Networks in Reinforcement Learning (9th Deep Learn... (Ohsawa Goodfellow)
We are Deep Learning Japan @ the University of Tokyo (東大)
http://www.facebook.com/DeepLearning
https://sites.google.com/site/deeplearning2013/
Paper introduction: Dueling network architectures for deep reinforcement learning (Kazuki Adachi)
Wang, Ziyu, et al. "Dueling network architectures for deep reinforcement learning." Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1995-2003, 2016.