[Repost] Letting the data speak: a PyTorch deep dive into NLLLoss and CrossEntropyLoss
https://www.cnblogs.com/jiading/p/11979391.html
NLL_Loss & CrossEntropyLoss (cross entropy)
https://blog.csdn.net/Tabbyddd/article/details/106101759
PyTorch study notes — the difference between softmax and log_softmax, the difference between CrossEntropyLoss() and NLLLoss(), and the log-likelihood cost function
https://blog.csdn.net/hao5335156/article/details/80607732
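The links above all circle the same point: PyTorch's CrossEntropyLoss is equivalent to applying LogSoftmax to the logits and then NLLLoss to the result. A minimal pure-Python sketch of that relationship (the helper names log_softmax, nll_loss, and cross_entropy here are illustrative stand-ins, not the PyTorch APIs themselves):

```python
import math

def log_softmax(logits):
    # Log-softmax with the max-subtraction trick for numerical stability:
    # log(exp(z_i) / sum_j exp(z_j)) = z_i - (m + log(sum_j exp(z_j - m)))
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(z - m) for z in logits))
    return [z - log_sum for z in logits]

def nll_loss(log_probs, target):
    # Negative log-likelihood: pick out (and negate) the log-probability
    # of the true class index, just like NLLLoss does for one sample.
    return -log_probs[target]

def cross_entropy(logits, target):
    # Cross entropy = NLLLoss applied to LogSoftmax output,
    # mirroring CrossEntropyLoss(x, t) == NLLLoss(LogSoftmax(x), t).
    return nll_loss(log_softmax(logits), target)

logits = [2.0, 1.0, 0.1]
target = 0
print(cross_entropy(logits, target))  # ~0.417 for these logits
```

In PyTorch terms, this is why a network whose last layer is nn.LogSoftmax should be trained with nn.NLLLoss, while a network that outputs raw logits should use nn.CrossEntropyLoss directly; mixing the two double-applies the softmax.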
A good video explanation of NLLLoss
Why use the logarithm as the loss (cost) function?
https://www.cnblogs.com/hum0ro/p/10243115.html
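The question in the last link, why the logarithm appears in the cost function, can be illustrated numerically: -log(p) grows without bound as the probability assigned to the true class shrinks, so confident wrong predictions are penalized far more heavily than mildly uncertain ones. A small illustrative snippet:

```python
import math

# -log(p) for a few predicted probabilities of the true class:
# p = 0.9  -> ~0.105  (nearly correct, tiny loss)
# p = 0.5  -> ~0.693  (unsure, moderate loss)
# p = 0.01 -> ~4.605  (confidently wrong, large loss)
for p in (0.9, 0.5, 0.1, 0.01):
    print(f"p = {p:5.2f}   -log(p) = {-math.log(p):6.3f}")
```

This steep penalty near p = 0 is also what gives the log loss large gradients exactly when the model is most wrong, which is part of why it trains classifiers better than squared error on probabilities.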