
Tensors for Data Processing
Theory, Methods, and Applications

Edited by
Yipeng Liu
School of Information and Communication Engineering
University of Electronic Science and Technology
of China (UESTC)
Chengdu, China
Academic Press is an imprint of Elsevier
125 London Wall, London EC2Y 5AS, United Kingdom
525 B Street, Suite 1650, San Diego, CA 92101, United States
50 Hampshire Street, 5th Floor, Cambridge, MA 02139, United States
The Boulevard, Langford Lane, Kidlington, Oxford OX5 1GB, United Kingdom
Copyright © 2022 Elsevier Inc. All rights reserved.

MATLAB® is a trademark of The MathWorks, Inc. and is used with permission.


The MathWorks does not warrant the accuracy of the text or exercises in this book.
This book’s use or discussion of MATLAB® software or related products does not constitute
endorsement or sponsorship by The MathWorks of a particular pedagogical approach or particular use
of the MATLAB® software.
No part of this publication may be reproduced or transmitted in any form or by any means, electronic
or mechanical, including photocopying, recording, or any information storage and retrieval system,
without permission in writing from the publisher. Details on how to seek permission, further
information about the Publisher’s permissions policies and our arrangements with organizations such
as the Copyright Clearance Center and the Copyright Licensing Agency, can be found at our website:
www.elsevier.com/permissions.
This book and the individual contributions contained in it are protected under copyright by the
Publisher (other than as may be noted herein).
Notices
Knowledge and best practice in this field are constantly changing. As new research and experience
broaden our understanding, changes in research methods, professional practices, or medical treatment
may become necessary.
Practitioners and researchers must always rely on their own experience and knowledge in evaluating
and using any information, methods, compounds, or experiments described herein. In using such
information or methods they should be mindful of their own safety and the safety of others, including
parties for whom they have a professional responsibility.
To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors, assume
any liability for any injury and/or damage to persons or property as a matter of products liability,
negligence or otherwise, or from any use or operation of any methods, products, instructions, or ideas
contained in the material herein.

Library of Congress Cataloging-in-Publication Data


A catalog record for this book is available from the Library of Congress

British Library Cataloguing-in-Publication Data


A catalogue record for this book is available from the British Library

ISBN: 978-0-12-824447-0

For information on all Academic Press publications


visit our website at https://www.elsevier.com/books-and-journals

Publisher: Mara Conner


Acquisitions Editor: Tim Pitts
Editorial Project Manager: Charlotte Rowley
Production Project Manager: Prem Kumar Kaliamoorthi
Designer: Miles Hitchen
Typeset by VTeX
Contents

List of contributors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xiii


Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xix
CHAPTER 1 Tensor decompositions: computations,
applications, and challenges . . . . . . . . . . . . . . . . . . . . 1
Yingyue Bi, Yingcong Lu, Zhen Long, Ce Zhu, and
Yipeng Liu
1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1.1 What is a tensor? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1.2 Why do we need tensors? . . . . . . . . . . . . . . . . . . . . . 2
1.2 Tensor operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.2.1 Tensor notations . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.2.2 Matrix operators . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2.3 Tensor transformations . . . . . . . . . . . . . . . . . . . . . . . 6
1.2.4 Tensor products . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.2.5 Structural tensors . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.2.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
1.3 Tensor decompositions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
1.3.1 Tucker decomposition . . . . . . . . . . . . . . . . . . . . . . . . 13
1.3.2 Canonical polyadic decomposition . . . . . . . . . . . . . . . 14
1.3.3 Block term decomposition . . . . . . . . . . . . . . . . . . . . . 16
1.3.4 Tensor singular value decomposition . . . . . . . . . . . . . 18
1.3.5 Tensor network . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
1.4 Tensor processing techniques . . . . . . . . . . . . . . . . . . . . . . . . . 24
1.5 Challenges . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
CHAPTER 2 Transform-based tensor singular value
decomposition in multidimensional image recovery 31
Tai-Xiang Jiang, Michael K. Ng, and Xi-Le Zhao
2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
2.2 Recent advances of the tensor singular value decomposition . . 34
2.2.1 Preliminaries and basic tensor notations . . . . . . . . . . . 34
2.2.2 The t-SVD framework . . . . . . . . . . . . . . . . . . . . . . . . 35
2.2.3 Tensor nuclear norm and tensor recovery . . . . . . . . . . 38
2.2.4 Extensions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
2.2.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
2.3 Transform-based t-SVD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
2.3.1 Linear invertible transform-based t-SVD . . . . . . . . . . 45


2.3.2 Beyond invertibility and data adaptivity . . . . . . . . . . . 47


2.4 Numerical experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
2.4.1 Examples within the t-SVD framework . . . . . . . . . . . 49
2.4.2 Examples of the transform-based t-SVD . . . . . . . . . . 51
2.5 Conclusions and new guidelines . . . . . . . . . . . . . . . . . . . . . . . 53
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
CHAPTER 3 Partensor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
Paris A. Karakasis, Christos Kolomvakis, George Lourakis,
George Lykoudis, Ioannis Marios Papagiannakos,
Ioanna Siaminou, Christos Tsalidis, and
Athanasios P. Liavas
3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
3.1.1 Related work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
3.1.2 Notation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
3.2 Tensor decomposition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
3.2.1 Matrix least-squares problems . . . . . . . . . . . . . . . . . . 65
3.2.2 Alternating optimization for tensor decomposition . . . 69
3.3 Tensor decomposition with missing elements . . . . . . . . . . . . . 70
3.3.1 Matrix least-squares with missing elements . . . . . . . . 71
3.3.2 Tensor decomposition with missing elements: the
unconstrained case . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
3.3.3 Tensor decomposition with missing elements: the
nonnegative case . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
3.3.4 Alternating optimization for tensor decomposition
with missing elements . . . . . . . . . . . . . . . . . . . . . . . . 75
3.4 Distributed memory implementations . . . . . . . . . . . . . . . . . . . 75
3.4.1 Some MPI preliminaries . . . . . . . . . . . . . . . . . . . . . . 75
3.4.2 Variable partitioning and data allocation . . . . . . . . . . . 77
3.4.3 Tensor decomposition . . . . . . . . . . . . . . . . . . . . . . . . 79
3.4.4 Tensor decomposition with missing elements . . . . . . . 81
3.4.5 Some implementation details . . . . . . . . . . . . . . . . . . . 82
3.5 Numerical experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
3.5.1 Tensor decomposition . . . . . . . . . . . . . . . . . . . . . . . . 83
3.5.2 Tensor decomposition with missing elements . . . . . . . 84
3.6 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
Acknowledgment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
CHAPTER 4 A Riemannian approach to low-rank tensor
learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Hiroyuki Kasai, Pratik Jawanpuria, and Bamdev Mishra
4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
4.2 A brief introduction to Riemannian optimization . . . . . . . . . . 93

4.2.1 Riemannian manifolds . . . . . . . . . . . . . . . . . . . . . . . . 94


4.2.2 Riemannian quotient manifolds . . . . . . . . . . . . . . . . . 95
4.3 Riemannian Tucker manifold geometry . . . . . . . . . . . . . . . . . 97
4.3.1 Riemannian metric and quotient manifold structure . . 97
4.3.2 Characterization of the induced spaces . . . . . . . . . . . . 100
4.3.3 Linear projectors . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
4.3.4 Retraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
4.3.5 Vector transport . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
4.3.6 Computational cost . . . . . . . . . . . . . . . . . . . . . . . . . . 104
4.4 Algorithms for tensor learning problems . . . . . . . . . . . . . . . . 104
4.4.1 Tensor completion . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
4.4.2 General tensor learning . . . . . . . . . . . . . . . . . . . . . . . 106
4.5 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
4.5.1 Choice of metric . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
4.5.2 Low-rank tensor completion . . . . . . . . . . . . . . . . . . . 109
4.5.3 Low-rank tensor regression . . . . . . . . . . . . . . . . . . . . 113
4.5.4 Multilinear multitask learning . . . . . . . . . . . . . . . . . . 115
4.6 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
CHAPTER 5 Generalized thresholding for low-rank tensor
recovery: approaches based on model and
learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
Fei Wen, Zhonghao Zhang, and Yipeng Liu
5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
5.2 Tensor singular value thresholding . . . . . . . . . . . . . . . . . . . . . 123
5.2.1 Proximity operator and generalized thresholding . . . . 123
5.2.2 Tensor singular value decomposition . . . . . . . . . . . . . 126
5.2.3 Generalized matrix singular value thresholding . . . . . 128
5.2.4 Generalized tensor singular value thresholding . . . . . . 129
5.3 Thresholding based low-rank tensor recovery . . . . . . . . . . . . . 131
5.3.1 Thresholding algorithms for low-rank tensor recovery 132
5.3.2 Generalized thresholding algorithms for low-rank
tensor recovery . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134
5.4 Generalized thresholding algorithms with learning . . . . . . . . . 136
5.4.1 Deep unrolling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
5.4.2 Deep plug-and-play . . . . . . . . . . . . . . . . . . . . . . . . . . 140
5.5 Numerical examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141
5.6 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
CHAPTER 6 Tensor principal component analysis . . . . . . . . . . . . . 153
Pan Zhou, Canyi Lu, and Zhouchen Lin
6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153

6.2 Notations and preliminaries . . . . . . . . . . . . . . . . . . . . . . . . . . 155


6.2.1 Notations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
6.2.2 Discrete Fourier transform . . . . . . . . . . . . . . . . . . . . . 157
6.2.3 T-product . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
6.2.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160
6.3 Tensor PCA for Gaussian-noisy data . . . . . . . . . . . . . . . . . . . 161
6.3.1 Tensor rank and tensor nuclear norm . . . . . . . . . . . . . 161
6.3.2 Analysis of tensor PCA on Gaussian-noisy data . . . . . 165
6.3.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
6.4 Tensor PCA for sparsely corrupted data . . . . . . . . . . . . . . . . . 166
6.4.1 Robust tensor PCA . . . . . . . . . . . . . . . . . . . . . . . . . . 167
6.4.2 Tensor low-rank representation . . . . . . . . . . . . . . . . . 172
6.4.3 Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 186
6.4.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
6.5 Tensor PCA for outlier-corrupted data . . . . . . . . . . . . . . . . . . 191
6.5.1 Outlier robust tensor PCA . . . . . . . . . . . . . . . . . . . . . 192
6.5.2 The fast OR-TPCA algorithm . . . . . . . . . . . . . . . . . . 196
6.5.3 Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
6.5.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 206
6.6 Other tensor PCA methods . . . . . . . . . . . . . . . . . . . . . . . . . . . 207
6.7 Future work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
6.8 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 209
CHAPTER 7 Tensors for deep learning theory . . . . . . . . . . . . . . . . . 215
Yoav Levine, Noam Wies, Or Sharir, Nadav Cohen, and
Amnon Shashua
7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 215
7.2 Bounding a function’s expressivity via tensorization . . . . . . . . 217
7.2.1 A measure of capacity for modeling input
dependencies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
7.2.2 Bounding correlations with tensor matricization ranks 220
7.3 A case study: self-attention networks . . . . . . . . . . . . . . . . . . . 223
7.3.1 The self-attention mechanism . . . . . . . . . . . . . . . . . . 223
7.3.2 Self-attention architecture expressivity questions . . . . 227
7.3.3 Results on the operation of self-attention . . . . . . . . . . 230
7.3.4 Bounding the separation rank of self-attention . . . . . . 235
7.4 Convolutional and recurrent networks . . . . . . . . . . . . . . . . . . 242
7.4.1 The operation of convolutional and recurrent networks 243
7.4.2 Addressed architecture expressivity questions . . . . . . 243
7.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 245
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 245
CHAPTER 8 Tensor network algorithms for image classification 249
Cong Chen, Kim Batselier, and Ngai Wong

8.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 249


8.2 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
8.2.1 Tensor basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
8.2.2 Tensor decompositions . . . . . . . . . . . . . . . . . . . . . . . 253
8.2.3 Support vector machines . . . . . . . . . . . . . . . . . . . . . . 256
8.2.4 Logistic regression . . . . . . . . . . . . . . . . . . . . . . . . . . 257
8.3 Tensorial extensions of support vector machine . . . . . . . . . . . 258
8.3.1 Supervised tensor learning . . . . . . . . . . . . . . . . . . . . . 258
8.3.2 Support tensor machines . . . . . . . . . . . . . . . . . . . . . . 260
8.3.3 Higher-rank support tensor machines . . . . . . . . . . . . . 263
8.3.4 Support Tucker machines . . . . . . . . . . . . . . . . . . . . . . 265
8.3.5 Support tensor train machines . . . . . . . . . . . . . . . . . . 269
8.3.6 Kernelized support tensor train machines . . . . . . . . . . 275
8.4 Tensorial extension of logistic regression . . . . . . . . . . . . . . . . 284
8.4.1 Rank-1 logistic regression . . . . . . . . . . . . . . . . . . . . . 285
8.4.2 Logistic tensor regression . . . . . . . . . . . . . . . . . . . . . 286
8.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 288
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 289
CHAPTER 9 High-performance tensor decompositions for
compressing and accelerating deep neural
networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 293
Xiao-Yang Liu, Yiming Fang, Liuqing Yang, Zechu Li, and
Anwar Walid
9.1 Introduction and motivation . . . . . . . . . . . . . . . . . . . . . . . . . . 294
9.2 Deep neural networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
9.2.1 Notations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
9.2.2 Linear layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
9.2.3 Fully connected neural networks . . . . . . . . . . . . . . . . 298
9.2.4 Convolutional neural networks . . . . . . . . . . . . . . . . . . 300
9.2.5 Backpropagation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 303
9.3 Tensor networks and their decompositions . . . . . . . . . . . . . . . 305
9.3.1 Tensor networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305
9.3.2 CP tensor decomposition . . . . . . . . . . . . . . . . . . . . . . 308
9.3.3 Tucker decomposition . . . . . . . . . . . . . . . . . . . . . . . . 310
9.3.4 Hierarchical Tucker decomposition . . . . . . . . . . . . . . 313
9.3.5 Tensor train and tensor ring decomposition . . . . . . . . 315
9.3.6 Transform-based tensor decomposition . . . . . . . . . . . 318
9.4 Compressing deep neural networks . . . . . . . . . . . . . . . . . . . . 321
9.4.1 Compressing fully connected layers . . . . . . . . . . . . . . 321
9.4.2 Compressing the convolutional layer via CP
decomposition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 322
9.4.3 Compressing the convolutional layer via Tucker
decomposition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 325

9.4.4 Compressing the convolutional layer via TT/TR
decompositions . . . . . . . . . . . . . . . . . . . . . . 327
9.4.5 Compressing neural networks via transform-based
decomposition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 330
9.5 Experiments and future directions . . . . . . . . . . . . . . . . . . . . . 333
9.5.1 Performance evaluations using the MNIST dataset . . . 333
9.5.2 Performance evaluations using the CIFAR10 dataset . 336
9.5.3 Future research directions . . . . . . . . . . . . . . . . . . . . . 337
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 338
CHAPTER 10 Coupled tensor decompositions for data fusion . . . . 341
Christos Chatzichristos, Simon Van Eyndhoven,
Eleftherios Kofidis, and Sabine Van Huffel
10.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 341
10.2 What is data fusion? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342
10.2.1 Context and definition . . . . . . . . . . . . . . . . . . . . . . . . 342
10.2.2 Challenges of data fusion . . . . . . . . . . . . . . . . . . . . . . 343
10.2.3 Types of fusion and data fusion strategies . . . . . . . . . . 347
10.3 Decompositions in data fusion . . . . . . . . . . . . . . . . . . . . . . . . 348
10.3.1 Matrix decompositions and statistical models . . . . . . . 350
10.3.2 Tensor decompositions . . . . . . . . . . . . . . . . . . . . . . . 351
10.3.3 Coupled tensor decompositions . . . . . . . . . . . . . . . . . 352
10.4 Applications of tensor-based data fusion . . . . . . . . . . . . . . . . 355
10.4.1 Biomedical applications . . . . . . . . . . . . . . . . . . . . . . . 355
10.4.2 Image fusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 357
10.5 Fusion of EEG and fMRI: a case study . . . . . . . . . . . . . . . . . . 358
10.6 Data fusion demos . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 361
10.6.1 SDF demo – approximate coupling . . . . . . . . . . . . . . 361
10.7 Conclusion and prospects . . . . . . . . . . . . . . . . . . . . . . . . . . . . 363
Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
CHAPTER 11 Tensor methods for low-level vision . . . . . . . . . . . . . . 371
Tatsuya Yokota, Cesar F. Caiafa, and Qibin Zhao
11.1 Low-level vision and signal reconstruction . . . . . . . . . . . . . . . 371
11.1.1 Observation models . . . . . . . . . . . . . . . . . . . . . . . . . . 372
11.1.2 Inverse problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . 374
11.2 Methods using raw tensor structure . . . . . . . . . . . . . . . . . . . . 378
11.2.1 Penalty-based tensor reconstruction . . . . . . . . . . . . . . 379
11.2.2 Tensor decomposition and reconstruction . . . . . . . . . . 393
11.3 Methods using tensorization . . . . . . . . . . . . . . . . . . . . . . . . . . 409
11.3.1 Higher-order tensorization . . . . . . . . . . . . . . . . . . . . . 411
11.3.2 Delay embedding/Hankelization . . . . . . . . . . . . . . . . 413
11.4 Examples of low-level vision applications . . . . . . . . . . . . . . . 415

11.4.1 Image inpainting with raw tensor structure . . . . . . . . . 415


11.4.2 Image inpainting using tensorization . . . . . . . . . . . . . 416
11.4.3 Denoising, deblurring, and superresolution . . . . . . . . 417
11.5 Remarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 419
Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 420
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 420
CHAPTER 12 Tensors for neuroimaging . . . . . . . . . . . . . . . . . . . . . . . 427
Aybüke Erol and Borbála Hunyadi
12.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 427
12.2 Neuroimaging modalities . . . . . . . . . . . . . . . . . . . . . . . . . . . . 429
12.3 Multidimensionality of the brain . . . . . . . . . . . . . . . . . . . . . . 431
12.4 Tensor decomposition structures . . . . . . . . . . . . . . . . . . . . . . 433
12.4.1 Product operations for tensors . . . . . . . . . . . . . . . . . . 434
12.4.2 Canonical polyadic decomposition . . . . . . . . . . . . . . . 435
12.4.3 Tucker decomposition . . . . . . . . . . . . . . . . . . . . . . . . 435
12.4.4 Block term decomposition . . . . . . . . . . . . . . . . . . . . . 437
12.5 Applications of tensors in neuroimaging . . . . . . . . . . . . . . . . 437
12.5.1 Filling in missing data . . . . . . . . . . . . . . . . . . . . . . . . 438
12.5.2 Denoising, artifact removal, and dimensionality
reduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 441
12.5.3 Segmentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 444
12.5.4 Registration and longitudinal analysis . . . . . . . . . . . . 445
12.5.5 Source separation . . . . . . . . . . . . . . . . . . . . . . . . . . . 447
12.5.6 Activity recognition and source localization . . . . . . . . 451
12.5.7 Connectivity analysis . . . . . . . . . . . . . . . . . . . . . . . . . 456
12.5.8 Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 462
12.5.9 Feature extraction and classification . . . . . . . . . . . . . . 463
12.5.10 Summary and practical considerations . . . . . . . . . . . . 468
12.6 Future challenges . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 471
12.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 472
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 473
CHAPTER 13 Tensor representation for remote sensing images . 483
Yang Xu, Fei Ye, Bo Ren, Liangfu Lu, Xudong Cui,
Jocelyn Chanussot, and Zebin Wu
13.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 483
13.2 Optical remote sensing: HSI and MSI fusion . . . . . . . . . . . . . 488
13.2.1 Tensor notations and preliminaries . . . . . . . . . . . . . . . 488
13.2.2 Nonlocal patch tensor sparse representation for
HSI-MSI fusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 488
13.2.3 High-order coupled tensor ring representation for
HSI-MSI fusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 496
13.2.4 Joint tensor factorization for HSI-MSI fusion . . . . . . . 504

13.3 Polarimetric synthetic aperture radar: feature extraction . . . . . 517


13.3.1 Brief description of PolSAR data . . . . . . . . . . . . . . . . 518
13.3.2 The tensorial embedding framework . . . . . . . . . . . . . 519
13.3.3 Experiment and analysis . . . . . . . . . . . . . . . . . . . . . . 522
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 532
CHAPTER 14 Structured tensor train decomposition for speeding
up kernel-based learning . . . . . . . . . . . . . . . . . . . . . . . . 537
Yassine Zniyed, Ouafae Karmouda, Rémy Boyer,
Jérémie Boulanger, André L.F. de Almeida, and
Gérard Favier
14.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 538
14.2 Notations and algebraic background . . . . . . . . . . . . . . . . . . . . 540
14.3 Standard tensor decompositions . . . . . . . . . . . . . . . . . . . . . . . 541
14.3.1 Tucker decomposition . . . . . . . . . . . . . . . . . . . . . . . . 542
14.3.2 HOSVD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 542
14.3.3 Tensor networks and TT decomposition . . . . . . . . . . . 543
14.4 Dimensionality reduction based on a train of low-order tensors 545
14.4.1 TD-train model: equivalence between a high-order TD
and a train of low-order TDs . . . . . . . . . . . . . . . . . . . 546
14.5 Tensor train algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 548
14.5.1 Description of the TT-HSVD algorithm . . . . . . . . . . . 548
14.5.2 Comparison of the sequential and the hierarchical
schemes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 549
14.6 Kernel-based classification of high-order tensors . . . . . . . . . . 551
14.6.1 Formulation of SVMs . . . . . . . . . . . . . . . . . . . . . . . . 552
14.6.2 Polynomial and Euclidean tensor-based kernel . . . . . . 553
14.6.3 Kernel on a Grassmann manifold . . . . . . . . . . . . . . . . 553
14.6.4 The fast kernel subspace estimation based on tensor
train decomposition (FAKSETT) method . . . . . . . . . . 554
14.7 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 555
14.7.1 Datasets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 555
14.7.2 Classification performance . . . . . . . . . . . . . . . . . . . . . 557
14.8 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 558
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 560
Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 565
List of contributors

Kim Batselier
Delft Center for Systems and Control, Delft University of Technology, Delft,
The Netherlands
Yingyue Bi
School of Information and Communication Engineering, University of Electronic
Science and Technology of China (UESTC), Chengdu, China
Jérémie Boulanger
CRIStAL, Université de Lille, Villeneuve d’Ascq, France
Rémy Boyer
CRIStAL, Université de Lille, Villeneuve d’Ascq, France
Cesar F. Caiafa
Instituto Argentino de Radioastronomía – CCT La Plata, CONICET / CIC-PBA /
UNLP, Villa Elisa, Argentina
RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
Jocelyn Chanussot
LJK, CNRS, Grenoble INP, Inria, Université Grenoble, Alpes, Grenoble, France
Christos Chatzichristos
KU Leuven, Department of Electrical Engineering (ESAT), STADIUS Center for
Dynamical Systems, Signal Processing and Data Analytics, Leuven, Belgium
Cong Chen
Department of Electrical and Electronic Engineering, The University of Hong
Kong, Pokfulam Road, Hong Kong
Nadav Cohen
School of Computer Science, Hebrew University of Jerusalem, Jerusalem, Israel
Xudong Cui
School of Mathematics, Tianjin University, Tianjin, China
André L.F. de Almeida
Department of Teleinformatics Engineering, Federal University of Fortaleza,
Fortaleza, Brazil
Aybüke Erol
Circuits and Systems, Department of Microelectronics, Delft University of
Technology, Delft, The Netherlands


Yiming Fang
Department of Computer Science, Columbia University, New York, NY,
United States

Gérard Favier
Laboratoire I3S, Université Côte d’Azur, CNRS, Sophia Antipolis, France

Borbála Hunyadi
Circuits and Systems, Department of Microelectronics, Delft University of
Technology, Delft, The Netherlands

Pratik Jawanpuria
Microsoft, Hyderabad, India

Tai-Xiang Jiang
School of Economic Information Engineering, Southwestern University of
Finance and Economics, Chengdu, Sichuan, China

Paris A. Karakasis
School of Electrical and Computer Engineering, Technical University of Crete,
Chania, Greece

Ouafae Karmouda
CRIStAL, Université de Lille, Villeneuve d’Ascq, France

Hiroyuki Kasai
Waseda University, Tokyo, Japan

Eleftherios Kofidis
Dept. of Statistics and Insurance Science, University of Piraeus, Piraeus, Greece

Christos Kolomvakis
School of Electrical and Computer Engineering, Technical University of Crete,
Chania, Greece

Yoav Levine
School of Computer Science, Hebrew University of Jerusalem, Jerusalem, Israel

Zechu Li
Department of Computer Science, Columbia University, New York, NY,
United States

Athanasios P. Liavas
School of Electrical and Computer Engineering, Technical University of Crete,
Chania, Greece

Zhouchen Lin
Key Lab. of Machine Perception, School of EECS, Peking University, Beijing,
China
Xiao-Yang Liu
Department of Computer Science and Engineering, Shanghai Jiao Tong
University, Shanghai, China
Department of Electrical Engineering, Columbia University, New York, NY,
United States
Yipeng Liu
School of Information and Communication Engineering, University of Electronic
Science and Technology of China (UESTC), Chengdu, China
Zhen Long
School of Information and Communication Engineering, University of Electronic
Science and Technology of China (UESTC), Chengdu, China
George Lourakis
Neurocom, S.A, Athens, Greece
Canyi Lu
Carnegie Mellon University, Pittsburgh, PA, United States
Liangfu Lu
School of Mathematics, Tianjin University, Tianjin, China
Yingcong Lu
School of Information and Communication Engineering, University of Electronic
Science and Technology of China (UESTC), Chengdu, China
George Lykoudis
Neurocom, S.A, Athens, Greece
Bamdev Mishra
Microsoft, Hyderabad, India
Michael K. Ng
Department of Mathematics, The University of Hong Kong, Pokfulam,
Hong Kong
Ioannis Marios Papagiannakos
School of Electrical and Computer Engineering, Technical University of Crete,
Chania, Greece
Bo Ren
Key Laboratory of Intelligent Perception and Image Understanding of Ministry of
Education of China, Xidian University, Xi’an, China

Or Sharir
School of Computer Science, Hebrew University of Jerusalem, Jerusalem, Israel
Amnon Shashua
School of Computer Science, Hebrew University of Jerusalem, Jerusalem, Israel
Ioanna Siaminou
School of Electrical and Computer Engineering, Technical University of Crete,
Chania, Greece
Christos Tsalidis
Neurocom, S.A, Athens, Greece
Simon Van Eyndhoven
KU Leuven, Department of Electrical Engineering (ESAT), STADIUS Center for
Dynamical Systems, Signal Processing and Data Analytics, Leuven, Belgium
icometrix, Leuven, Belgium
Sabine Van Huffel
KU Leuven, Department of Electrical Engineering (ESAT), STADIUS Center for
Dynamical Systems, Signal Processing and Data Analytics, Leuven, Belgium
Anwar Walid
Nokia Bell Labs, Murray Hill, NJ, United States
Fei Wen
Department of Electronic Engineering, Shanghai Jiao Tong University, Shanghai,
China
Noam Wies
School of Computer Science, Hebrew University of Jerusalem, Jerusalem, Israel
Ngai Wong
Department of Electrical and Electronic Engineering, The University of Hong
Kong, Pokfulam Road, Hong Kong
Zebin Wu
School of Computer Science and Engineering, Nanjing University of Science
and Technology, Nanjing, China
Yang Xu
School of Computer Science and Engineering, Nanjing University of Science
and Technology, Nanjing, China
Liuqing Yang
Department of Computer Science, Columbia University, New York, NY,
United States

Fei Ye
School of Computer Science and Engineering, Nanjing University of Science
and Technology, Nanjing, China
Tatsuya Yokota
Nagoya Institute of Technology, Aichi, Japan
RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
Zhonghao Zhang
School of Information and Communication Engineering, University of Electronic
Science and Technology of China (UESTC), Chengdu, China
Qibin Zhao
RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
Guangdong University of Technology, Guangzhou, China
Xi-Le Zhao
School of Mathematical Sciences/Research Center for Image and Vision
Computing, University of Electronic Science and Technology of China, Chengdu,
Sichuan, China
Pan Zhou
SEA AI Lab, Singapore, Singapore
Ce Zhu
School of Information and Communication Engineering, University of Electronic
Science and Technology of China (UESTC), Chengdu, China
Yassine Zniyed
Université de Toulon, Aix-Marseille Université, CNRS, LIS, Toulon, France
Preface

This book provides an overview of tensors for data processing, covering computing
theories, processing methods, and engineering applications. The tensor extensions
of a series of classical multidimensional data processing techniques are discussed
in this book. Many thanks go to all the contributors. Students can read this book to
get an overall understanding, researchers can update their knowledge on the recent
research advances in the field, and engineers can refer to implementations on various
applications.
The first chapter is an introduction to tensor decomposition. In the following, the
book provides variants of tensor decompositions with their efficient and effective so-
lutions, including some parallel algorithms, Riemannian algorithms, and generalized
thresholding algorithms. Some tensor-based machine learning methods are summa-
rized in detail, including tensor completion, tensor principal component analysis,
support tensor machine, tensor-based kernel learning, tensor-based deep learning, etc.
To demonstrate that tensors can effectively and systematically enhance performance
in practical engineering problems, this book gives implemental details of many ap-
plications, such as signal recovery, recommender systems, climate forecasting, image
clustering, image classification, network compression, data fusion, image enhance-
ment, neuroimaging, and remote sensing.
I sincerely hope this book can serve to introduce tensors to more data scientists
and engineers. As a natural representation of multidimensional data, tensors can be
used to substantially avoid the information loss in matrix representations of multiway
data, and tensor operators can model more connections than their matrix counterparts.
The related advances in applied mathematics allow us to move from matrices to tensors
for data processing. We hope this book will motivate novel tensor theories and new
data processing methods, and stimulate the development of a wide range of practical
applications.

Yipeng Liu
Chengdu, China
Aug. 10, 2021

CHAPTER 1

Tensor decompositions: computations, applications, and challenges
Yingyue Bi, Yingcong Lu, Zhen Long, Ce Zhu, and Yipeng Liu
School of Information and Communication Engineering, University of Electronic Science and
Technology of China (UESTC), Chengdu, China

CONTENTS
1.1 Introduction..................................................................................... 1
1.1.1 What is a tensor? ............................................................... 1
1.1.2 Why do we need tensors? ..................................................... 2
1.2 Tensor operations ............................................................................. 3
1.2.1 Tensor notations ................................................................ 3
1.2.2 Matrix operators ................................................................ 4
1.2.3 Tensor transformations ........................................................ 6
1.2.4 Tensor products................................................................. 7
1.2.5 Structural tensors .............................................................. 11
1.2.6 Summary ........................................................................ 13
1.3 Tensor decompositions ....................................................................... 13
1.3.1 Tucker decomposition ......................................................... 13
1.3.2 Canonical polyadic decomposition........................................... 14
1.3.3 Block term decomposition .................................................... 16
1.3.4 Tensor singular value decomposition ........................................ 18
1.3.5 Tensor network .................................................................. 19
1.4 Tensor processing techniques............................................................... 24
1.5 Challenges ...................................................................................... 25
References............................................................................................ 26

1.1 Introduction
1.1.1 What is a tensor?
The tensor can be seen as a higher-order generalization of vector and matrix, which
normally has three or more modes (ways) [1]. For example, a color image is a third-
order tensor. It has two spatial modes and one channel mode. Similarly, a color video
is a fourth-order tensor; its extra mode denotes time.
Tensors for Data Processing. https://doi.org/10.1016/B978-0-12-824447-0.00007-8
Copyright © 2022 Elsevier Inc. All rights reserved.

As special forms of tensors, vector a ∈ RI is a first-order tensor whose i-th entry
(scalar) is ai, and matrix A ∈ RI×J is a second-order tensor whose (i, j)-th element
is ai,j. A general N-th-order tensor can be mathematically denoted as
A ∈ RI1×I2×···×IN and its (i1, i2, ··· , iN)-th entry is ai1,i2,··· ,iN. For example, a
third-order tensor A ∈ RI1×I2×I3 is illustrated in Fig. 1.1.

FIGURE 1.1
A third-order tensor A ∈ RI1 ×I2 ×I3 .

1.1.2 Why do we need tensors?


Tensors play important roles in a number of applications, such as signal processing,
machine learning, biomedical engineering, neuroscience, computer vision, communi-
cation, psychometrics, and chemometrics. They can provide a concise mathematical
framework for formulating and solving problems in those fields.
Here are a few cases involving tensor frameworks:
• Many spatial-temporal signals in speech and image processing are multidimen-
sional. Tensor factorization-based techniques can effectively extract features for
enhancement, classification, regression, etc. For example, nonnegative canonical
polyadic (CP) decomposition can be used for speech signal separation where the
first two components of CP decomposition represent frequency and time structure
of the signal and the last component is the coefficient matrix [2].
• The fluorescence excitation–emission data, commonly used in chemistry, medicine,
and food science, has several chemical components with different concentrations.
It can be denoted as a third-order tensor; its three modes represent sample, exci-
tation, and emission. Taking advantage of CP decomposition, the tensor can be
factorized into three factor matrices: relative excitation spectral matrix, relative
emission spectral matrix, and relative concentration matrix. In this way, tensor
decomposition can be applied to analyze the components and corresponding con-
centrations in each sample [3].

• Social data often have multidimensional structures, which can be exploited by
tensor-based techniques for data mining. For example, the three modes of chat
data are user, keyword, and time. Tensor analysis can reveal the communication
patterns and the hidden structures in social networks, and this can benefit tasks
like recommender systems [4].

1.2 Tensor operations


In this section, we first introduce tensor notations, i.e., fibers and slices, and then
demonstrate how to represent tensors in a graphical way. Before we discuss tensor
operations, several matrix operations are reviewed.

1.2.1 Tensor notations


Subtensors, such as fibers and slices, can be formed from the original tensor. A fiber
is defined by fixing all the indices but one and a slice is defined by fixing all but
two indices. For a third-order tensor A ∈ RI1×I2×I3, its mode-1, mode-2, and mode-3
fibers are denoted by A(:, i2 , i3 ), A(i1 , :, i3 ), and A(i1 , i2 , :), where i1 = 1, · · · , I1 ,
i2 = 1, · · · , I2 and i3 = 1, · · · , I3 , which are illustrated in Fig. 1.2. Its horizontal
slices A(i1 , :, :), i1 = 1, · · · , I1 , lateral slices A(:, i2 , :), i2 = 1, · · · , I2 , and frontal
slices A(:, :, i3 ), i3 = 1, · · · , I3 , are shown in Fig. 1.3. For ease of denotation, we
refer to the frontal slice of A as A(·) in some formulas.

FIGURE 1.2
The illustration of mode-1 fibers A(:, i2 , i3 ), mode-2 fibers A(i1 , :, i3 ), and mode-3 fibers
A(i1 , i2 , :) with i1 = 1, · · · , I1 , i2 = 1, · · · , I2 and i3 = 1, · · · , I3 .

Other than the aforementioned notations, there is another way to denote tensors
and their operations [5]. Taking advantage of graphical representations, tensors can
be denoted by nodes and edges in a straightforward way. Graphical representations
for scalars, vectors, matrices, and tensors are shown in Fig. 1.4. The number next to
the edge represents the indices of the corresponding mode.

FIGURE 1.3
The illustration of horizontal slices A(i1 , :, :) i1 = 1, · · · , I1 , lateral slices A(:, i2 , :)
i2 = 1, · · · , I2 , and frontal slices A(:, :, i3 ) i3 = 1, · · · , I3 .

FIGURE 1.4
Graphical representations of scalar, vector, matrix and tensor.

1.2.2 Matrix operators


Definition 1.2.1. (Matrix trace [6]) The trace of matrix A ∈ RI×I is obtained by
summing all the diagonal entries of A, i.e., tr(A) = ∑_{i=1}^{I} a_{i,i}.

Definition 1.2.2. (ℓp-norm [6]) For matrix A ∈ RI×J, its ℓp-norm is defined as

‖A‖_p = ( ∑_{i=1}^{I} ∑_{j=1}^{J} |a_{i,j}|^p )^{1/p}.    (1.1)

Definition 1.2.3. (Matrix nuclear norm [7]) The nuclear norm of matrix A is denoted
as ‖A‖_* = ∑_i σ_i(A), where σ_i(A) is the i-th largest singular value of A.
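These three quantities are straightforward to verify numerically. A small sketch (we use numpy here purely as a convenient illustration; the chapter itself prescribes no library):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [2.0, 4.0]])

# Matrix trace (Definition 1.2.1): sum of the diagonal entries.
trace_A = np.trace(A)                      # 3 + 4 = 7

# Entry-wise l_p norm (Definition 1.2.2); p = 2 recovers the Frobenius norm.
p = 2
lp_norm = (np.abs(A) ** p).sum() ** (1.0 / p)

# Nuclear norm (Definition 1.2.3): sum of the singular values.
nuclear_norm = np.linalg.svd(A, compute_uv=False).sum()
```

For this 2 × 2 example the nuclear norm equals √50, since (σ1 + σ2)² = ‖A‖_F² + 2|det(A)| when there are exactly two singular values.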

Definition 1.2.4. (Hadamard product [8]) The Hadamard product for matrices A ∈
RM×N and B ∈ RM×N is defined element-wise as A ⊛ B ∈ RM×N with

        ⎡ a1,1 b1,1   a1,2 b1,2   ···   a1,N b1,N ⎤
A ⊛ B = ⎢ a2,1 b2,1   a2,2 b2,2   ···   a2,N b2,N ⎥ .    (1.2)
        ⎢     ⋮           ⋮        ⋱        ⋮     ⎥
        ⎣ aM,1 bM,1   aM,2 bM,2   ···   aM,N bM,N ⎦

Definition 1.2.5. (Kronecker product [9]) The Kronecker product of matrices A =
[a1, a2, ··· , aN] ∈ RM×N and B = [b1, b2, ··· , bQ] ∈ RP×Q is defined as A ⊗ B ∈
RMP×NQ, which can be written mathematically as

        ⎡ a1,1 B   a1,2 B   ···   a1,N B ⎤
A ⊗ B = ⎢ a2,1 B   a2,2 B   ···   a2,N B ⎥
        ⎢    ⋮        ⋮      ⋱       ⋮   ⎥
        ⎣ aM,1 B   aM,2 B   ···   aM,N B ⎦
      = [a1 ⊗ b1   a1 ⊗ b2   a1 ⊗ b3   ···   aN ⊗ bQ−1   aN ⊗ bQ].    (1.3)

Based on the Kronecker product, a lot of useful properties can be derived. Given
matrices A, B, C, D of compatible sizes, we have

(A ⊗ B)(C ⊗ D) = AC ⊗ BD,
(A ⊗ B)† = A† ⊗ B†,                                    (1.4)
(A ⊗ B)T = AT ⊗ BT,

where AT and A† represent the transpose and Moore–Penrose inverse of matrix A.
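The identities in Eq. (1.4) are easy to confirm with numpy's np.kron (the shapes below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((4, 5))
C = rng.standard_normal((3, 2))
D = rng.standard_normal((5, 4))

# Kronecker product (Definition 1.2.5): an MP x NQ block matrix.
K = np.kron(A, B)                    # shape (2*4, 3*5) = (8, 15)

# Mixed-product property: (A kron B)(C kron D) = (AC) kron (BD).
lhs = K @ np.kron(C, D)
rhs = np.kron(A @ C, B @ D)

# Transpose property: (A kron B)^T = A^T kron B^T.
lhs_t = K.T
rhs_t = np.kron(A.T, B.T)
```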

Definition 1.2.6. (Khatri–Rao product [10]) The Khatri–Rao product of matrices A ∈
RM×N and B ∈ RL×N is the column-wise Kronecker product

A ⊙ B = [a1 ⊗ b1   a2 ⊗ b2   ···   aN ⊗ bN] ∈ RML×N.    (1.5)

Similar to the Kronecker product, the Khatri–Rao product also has some convenient
properties, such as

(A ⊛ B)T = AT ⊛ BT,
A ⊙ B ⊙ C = (A ⊙ B) ⊙ C = A ⊙ (B ⊙ C),
(A ⊙ B)T (A ⊙ B) = AT A ⊛ BT B,                        (1.6)
(A ⊙ B)† = (AT A ⊛ BT B)† (A ⊙ B)T,

where ⊛ is the Hadamard product of Definition 1.2.4.
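numpy ships np.kron but no Khatri–Rao routine; the short helper below (the name khatri_rao is ours) builds it column-wise and checks the third identity in Eq. (1.6):

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker product (Definition 1.2.6)."""
    assert A.shape[1] == B.shape[1], "factors need the same number of columns"
    # T[m, l, n] = A[m, n] * B[l, n]; the reshape stacks a_n (x) b_n as columns.
    M, N = A.shape
    L = B.shape[0]
    return np.einsum('mn,ln->mln', A, B).reshape(M * L, N)

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((5, 3))
P = khatri_rao(A, B)                 # shape (20, 3)

# (A kr B)^T (A kr B) = (A^T A) * (B^T B), with * the Hadamard product.
lhs = P.T @ P
rhs = (A.T @ A) * (B.T @ B)
```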

1.2.3 Tensor transformations


Definition 1.2.7. (Tensor transpose [11]) Given a tensor A ∈ RI1 ×I2 ×I3 , whose
frontal slices are A(:, :, i3 ) (i3 = 1, · · · , I3 ), its transpose AT is acquired by first
transposing each of the frontal slices and then placing them in the order of AT (:, :, 1),
AT (:, :, I3 ), AT (:, :, I3 − 1), · · · , AT (:, :, 2) along the third mode.

FIGURE 1.5
A graphical illustration of the tensor transpose on A ∈ RI1 ×I2 ×5 .

Fig. 1.5 demonstrates the tensor transpose of A ∈ RI1 ×I2 ×5 .


Definition 1.2.8. (Tensor mode-n matricization [1]) For tensor A ∈ RI1 ×···×IN , its
matricization along the n-th mode is denoted as A(n) ∈ RIn ×I1 I2 ···In−1 In+1 ···IN , as
shown in Fig. 1.6. It rearranges fibers on the n-th mode to form the columns of A(n) .
For instance, there exists a third-order tensor A ∈ R3×3×2 whose frontal slices are
⎡ ⎤ ⎡ ⎤
1 4 5 2 6 2
A(:, :, 1) = ⎣ 2 8 7 ⎦ , A(:, :, 2) = ⎣ 8 1 3 ⎦ . (1.7)
9 5 3 7 5 6

Thus, its mode-1, mode-2, and mode-3 matricizations can be written as

       ⎛ 1 4 5 2 6 2 ⎞
A(1) = ⎜ 2 8 7 8 1 3 ⎟ ,    (1.8)
       ⎝ 9 5 3 7 5 6 ⎠

       ⎛ 1 2 9 2 8 7 ⎞
A(2) = ⎜ 4 8 5 6 1 5 ⎟ ,    (1.9)
       ⎝ 5 7 3 2 3 6 ⎠

       ⎛ 1 2 9 4 8 5 5 7 3 ⎞
A(3) = ⎝ 2 8 7 6 1 5 2 3 6 ⎠ .    (1.10)

FIGURE 1.6
A graphical illustration of tensor mode-n matricization for A ∈ RI1 ×···×IN .
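The matricizations in Eqs. (1.8)–(1.10) can be reproduced with a short numpy helper; reshaping in Fortran ('F') order matches the column ordering used above:

```python
import numpy as np

def unfold(T, n):
    """Mode-n matricization (Definition 1.2.8): mode-n fibers become columns."""
    return np.reshape(np.moveaxis(T, n, 0), (T.shape[n], -1), order='F')

# The 3 x 3 x 2 tensor of Eq. (1.7), built from its two frontal slices.
A = np.zeros((3, 3, 2))
A[:, :, 0] = [[1, 4, 5], [2, 8, 7], [9, 5, 3]]
A[:, :, 1] = [[2, 6, 2], [8, 1, 3], [7, 5, 6]]

A1, A2, A3 = unfold(A, 0), unfold(A, 1), unfold(A, 2)   # Eqs. (1.8)-(1.10)
```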

Definition 1.2.9. (Tensor n-th canonical matricization [12]) For a fixed index n =
1, 2, · · · , N, the n-th canonical matricization of tensor A ∈ RI1 ×I2 ×···×IN can be de-
fined as
(A<n> )i1 i2 ···in , in+1 ···iN = ai1 ,i2 ,··· ,iN , (1.11)

where i1 i2 · · · in , in+1 · · · iN are multiindices and A<n> ∈ RI1 I2 ···In ×In+1 ···IN .
Take the multiindex i = i1 i2 · · · iN as an example, in = 1, 2, · · · , In , n = 1, · · · , N.
It can either be defined using the little-endian convention (reverse lexicographic or-
dering) [13]

i1 i2 · · · iN = i1 + (i2 − 1)I1 + (i3 − 1)I1 I2 + · · · + (iN − 1)I1 · · · IN −1 , (1.12)

or the big-endian convention (colexicographic ordering)

i1 i2 · · · iN = iN +(iN −1 −1)IN +(iN −2 −1)IN IN −1 +· · ·+(i1 −1)I2 · · · IN . (1.13)
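In 0-based numpy indexing the two conventions are exactly the familiar memory layouts: little-endian is Fortran ('F') ordering and big-endian is C ordering (dims and idx below are arbitrary illustrations):

```python
import numpy as np

dims = (3, 4, 5)        # (I1, I2, I3)
idx = (1, 2, 3)         # (i1, i2, i3), 0-based here

# Little-endian, Eq. (1.12): the first index varies fastest ('F' order).
little = np.ravel_multi_index(idx, dims, order='F')   # i1 + i2*I1 + i3*I1*I2

# Big-endian, Eq. (1.13): the last index varies fastest ('C' order).
big = np.ravel_multi_index(idx, dims, order='C')      # i3 + i2*I3 + i1*I2*I3
```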

1.2.4 Tensor products


Definition 1.2.10. (Tensor inner product [1]) The inner product of two tensors A ∈
RI1×I2×···×IN and B ∈ RI1×I2×···×IN , shown in Fig. 1.7, is expressed as

⟨A, B⟩ = ∑_{i1=1}^{I1} ∑_{i2=1}^{I2} ··· ∑_{iN=1}^{IN} a_{i1,i2,··· ,iN} b_{i1,i2,··· ,iN}.    (1.14)

Definition 1.2.11. (Tensor norm [1]) The norm of a tensor A ∈ RI1×I2×···×IN is the
square root of the summation over the square of all its elements, which can be
expressed as

‖A‖ = ( ∑_{i1=1}^{I1} ∑_{i2=1}^{I2} ··· ∑_{iN=1}^{IN} (a_{i1,i2,··· ,iN})² )^{1/2}.    (1.15)

FIGURE 1.7
A graphical illustration of the tensor inner product.
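Both quantities reduce to element-wise array operations; a quick numpy check:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4, 5))
B = rng.standard_normal((3, 4, 5))

# Tensor inner product (Definition 1.2.10): sum of element-wise products.
inner = np.sum(A * B)

# Tensor norm (Definition 1.2.11): square root of the sum of squared
# entries, i.e., sqrt(<A, A>).
norm_A = np.sqrt(np.sum(A ** 2))
```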

Definition 1.2.12. (Tensor mode-n product with a matrix [1]) The tensor mode-n
product of A ∈ RI1×I2×···×IN and matrix B ∈ RK×In is denoted as

X = A ×n B ∈ RI1×···×In−1×K×In+1×···×IN ,    (1.16)

or element-wisely,

x_{i1,··· ,k,··· ,iN} = ∑_{in=1}^{In} a_{i1,··· ,in,··· ,iN} b_{k,in}.    (1.17)

A visual illustration is shown in Fig. 1.8.


Taking advantage of tensor matricization, Eq. (1.16) can also be expressed in an
unfolded form as
X(n) = BA(n) . (1.18)

FIGURE 1.8
A graphical illustration of the tensor mode-n product.

 
For example, given tensor A (Eq. (1.7)) and matrix

B = ⎡ 1 2 3 ⎤
    ⎣ 4 5 6 ⎦ ,

the mode-1 product A ×1 B will yield a tensor X ∈ R2×3×2, whose frontal slices are

X(:, :, 1) = ⎡ 32 35 28 ⎤ ,    X(:, :, 2) = ⎡ 39 23 26 ⎤ .    (1.19)
             ⎣ 68 86 73 ⎦                   ⎣ 90 59 59 ⎦
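The worked example can be checked with a generic mode-n product built on the unfolded form X(n) = B A(n); a numpy sketch (the helper name mode_n_product is ours):

```python
import numpy as np

def mode_n_product(T, M, n):
    """Mode-n product (Definition 1.2.12): contract mode n of T with the
    columns of M, then move the new axis back to position n."""
    return np.moveaxis(np.tensordot(M, T, axes=(1, n)), 0, n)

A = np.zeros((3, 3, 2))
A[:, :, 0] = [[1, 4, 5], [2, 8, 7], [9, 5, 3]]   # Eq. (1.7)
A[:, :, 1] = [[2, 6, 2], [8, 1, 3], [7, 5, 6]]
B = np.array([[1, 2, 3], [4, 5, 6]])

X = mode_n_product(A, B, 0)                      # shape (2, 3, 2)
```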

Definition 1.2.13. (Tensor mode-n product with a vector [1]) The tensor mode-n
product of the tensor A ∈ RI1×I2×···×IN and vector b ∈ RIn is denoted as

X = A ×n b ∈ RI1×···×In−1×In+1×···×IN ,    (1.20)

with entries

x_{i1,··· ,in−1,in+1,··· ,iN} = ∑_{in=1}^{In} a_{i1,··· ,in−1,in,in+1,··· ,iN} b_{in}.    (1.21)

For example, given tensor A in Eq. (1.7) and vector b = [1 2 3]T, we have

A ×2 b = ⎡ 24 20 ⎤
         ⎢ 39 19 ⎥ .    (1.22)
         ⎣ 28 35 ⎦

It can be clearly seen that the operation of multiplying a tensor by a matrix will
not change the number of ways of the tensor. However, if a tensor is multiplied by a
vector, the number of ways will decrease.
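The order-reducing effect of the vector case is visible directly in code: contracting mode 2 of the Eq. (1.7) tensor with b collapses the third-order tensor to a matrix, cf. Eq. (1.22):

```python
import numpy as np

A = np.zeros((3, 3, 2))
A[:, :, 0] = [[1, 4, 5], [2, 8, 7], [9, 5, 3]]   # Eq. (1.7)
A[:, :, 1] = [[2, 6, 2], [8, 1, 3], [7, 5, 6]]
b = np.array([1, 2, 3])

# Mode-2 product with a vector (Definition 1.2.13): mode 2 is summed away,
# leaving a 3 x 2 matrix -- one order lower than A.
X = np.tensordot(A, b, axes=(1, 0))
```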

Definition 1.2.14. (t-product [11]) The t-product of A ∈ RI1×I2×I3 and C ∈ RI2×L×I3
is defined as

X = A ∗ C = fold( circ(A) MatVec(C) ),    (1.23)

where X ∈ RI1×L×I3, MatVec(C) = [C(1)T C(2)T ··· C(I3)T]T ∈ RI2I3×L represents
the block matrix [11] of C, fold(·) reverses the MatVec(·) operation, and

          ⎡ A(1)    A(I3)      ···   A(2) ⎤
          ⎢ A(2)    A(1)       ⋱     A(3) ⎥
circ(A) = ⎢   ⋮       ⋮        ⋱       ⋮  ⎥ ∈ RI1I3×I2I3
          ⎣ A(I3)   A(I3 − 1)  ···   A(1) ⎦

is the block-circulant matrix [11] of A, where C(i3) and A(i3), i3 = 1, ··· , I3, repre-
sent the i3-th frontal slice of C and A, respectively.
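Forming circ(A) explicitly is costly. A standard equivalent route uses the fact that block-circulant matrices are diagonalized by the DFT along the third mode, so the t-product becomes one matrix product per Fourier slice. The FFT formulation below is our implementation choice, not part of the definition above:

```python
import numpy as np

def t_product(A, C):
    """t-product (Definition 1.2.14) of A (I1 x I2 x I3) with C (I2 x L x I3),
    computed slice-wise in the Fourier domain along the third mode."""
    I1, _, I3 = A.shape
    L = C.shape[1]
    Af = np.fft.fft(A, axis=2)
    Cf = np.fft.fft(C, axis=2)
    Xf = np.empty((I1, L, I3), dtype=complex)
    for k in range(I3):                  # one matrix product per frequency
        Xf[:, :, k] = Af[:, :, k] @ Cf[:, :, k]
    # For real inputs the inverse FFT is real up to rounding error.
    return np.real(np.fft.ifft(Xf, axis=2))

rng = np.random.default_rng(3)
A = rng.standard_normal((2, 3, 4))
C = rng.standard_normal((3, 5, 4))
X = t_product(A, C)                      # shape (2, 5, 4)
```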

Definition 1.2.15. (Tensor contraction [5]) Given two tensors A ∈ RI1×I2×···×IM
and B ∈ RJ1×J2×···×JN , suppose they have L equal indices {K1, K2, ··· , KL} in
{I1, I2, ··· , IM} and {J1, J2, ··· , JN}. The contraction of these two tensors yields
an (M + N − 2L)-th-order tensor X = ⟨A, B⟩_L, whose entries can be calculated by

∑_{k1=1}^{K1} ··· ∑_{kL=1}^{KL} a_{i1,··· ,iM} b_{j1,··· ,jN}.    (1.24)

A graphical illustration of tensor contraction is shown in Fig. 1.9.

FIGURE 1.9
Graphical representation of contraction of two tensors, A ∈ RI1 ×···×IM and B ∈ RJ1 ×···×JN ,
where {K1 , K2 , · · · , KL } denotes the L equal indices in {I1 , I2 , · · · , IM } and
{J1 , J2 , · · · , JN }.

For example, given tensors A ∈ R3×4×2×6×7 and B ∈ R2×5×7×8×4, based on the
aforementioned definition, we can conclude that L = 3, K1 = I2 = J5 = 4, K2 = I3 =
J1 = 2, and K3 = I5 = J3 = 7. As shown in Fig. 1.10, the result of tensor contraction
X = ⟨A, B⟩_3 is of the size of 3 × 6 × 5 × 8, and its entries are

x_{i1,i4,j2,j4} = ∑_{k1=1}^{4} ∑_{k2=1}^{2} ∑_{k3=1}^{7} a_{i1,k1,k2,i4,k3} b_{k2,j2,k3,j4,k1}.    (1.25)
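Contractions of this kind map directly onto np.einsum, where shared subscript letters mark the contracted index pairs. Reproducing the example's shapes (random entries, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 4, 2, 6, 7))   # modes (i1, i2, i3, i4, i5)
B = rng.standard_normal((2, 5, 7, 8, 4))   # modes (j1, j2, j3, j4, j5)

# Eq. (1.25): contract i2 with j5, i3 with j1, and i5 with j3;
# the free modes (i1, i4, j2, j4) survive, giving a 3 x 6 x 5 x 8 tensor.
X = np.einsum('abcde,cfegb->adfg', A, B)
```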

Consider a special case when L = 1 and K1 = Im = Jn , as demonstrated in Fig. 1.11.

FIGURE 1.10
The contraction of two tensors, A ∈ R3×4×2×6×7 and B ∈ R2×5×7×8×4 , where
K1 = I2 = J5 = 4, K2 = I3 = J1 = 2, K3 = I5 = J3 = 7, I1 = 3, I4 = 6, J2 = 5, and J4 = 8.

FIGURE 1.11
A graphical representation of contraction over two tensors, A ∈ RI1 ×I2 ×···×IM and
B ∈ RJ1 ×J2 ×···×JN , where K1 = Im = Jn .

The contraction of tensors A ∈ RI1×I2×···×IM and B ∈ RJ1×J2×···×JN results in an
(M + N − 2)-th-order tensor X = ⟨A, B⟩_1, whose entries can be calculated by

x_{i1,··· ,im−1,im+1,··· ,iM,j1,··· ,jn−1,jn+1,··· ,jN}
    = ∑_{k1=1}^{K1} a_{i1,··· ,im−1,k1,im+1,··· ,iM} b_{j1,··· ,jn−1,k1,jn+1,··· ,jN}.    (1.26)

1.2.5 Structural tensors


Definition 1.2.16. (Identity tensor [11]) An identity tensor I is a tensor whose first
frontal slice is an identity matrix and the rest are zero matrices.
Definition 1.2.17. (Orthogonal tensor [14]) Using the t-product, an orthogonal tensor
H is defined as
H ∗ HT = HT ∗ H = I. (1.27)
Definition 1.2.18. (Rank-1 tensor [1]) A rank-1 tensor A ∈ RI1×I2×I3 is formed by
the outer product of vectors, as shown in Fig. 1.12. Its mathematical formulation can
be written as

A = a(1) ◦ a(2) ◦ a(3),    (1.28)

where ◦ means the outer product. Therefore, the entries of A can be written as
a_{i1,i2,i3} = a(1)_{i1} a(2)_{i2} a(3)_{i3}. Generalizing it to the N-th-order tensor
A ∈ RI1×I2×···×IN , we have

A = a(1) ◦ a(2) ◦ ··· ◦ a(N).    (1.29)
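A rank-1 tensor is cheap to represent (only the vectors need storing); constructing the full array in numpy is a single outer product:

```python
import numpy as np

a1 = np.array([1.0, 2.0])
a2 = np.array([3.0, 4.0, 5.0])
a3 = np.array([6.0, 7.0])

# Rank-1 tensor (Definition 1.2.18): entry (i1, i2, i3) = a1[i1]*a2[i2]*a3[i3].
A = np.einsum('i,j,k->ijk', a1, a2, a3)    # outer product a1 o a2 o a3
```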

Definition 1.2.19. (Diagonal tensor [1]) Tensor A is a diagonal tensor if and only if
all its nonzero elements are on the superdiagonal line. Specifically, if A ∈
RI1×I2×···×IN is a diagonal tensor, then A(i1, ··· , iN) ≠ 0 only if i1 = i2 = ··· = iN .
A graphical illustration of a third-order diagonal tensor is demonstrated in Fig. 1.13.

FIGURE 1.12
A rank-1 tensor A = a(1) ◦ a(2) ◦ a(3) ∈ RI1 ×I2 ×I3 .

FIGURE 1.13
A third-order diagonal tensor A ∈ RI1 ×I2 ×I3 .

FIGURE 1.14
An f -diagonal tensor A ∈ RI1 ×I2 ×I3 .

Definition 1.2.20. (f -diagonal tensor [14]) An f -diagonal tensor A is a tensor with


diagonal frontal slices. A third-order f -diagonal tensor is visualized in Fig. 1.14.

1.2.6 Summary
In this section, we first briefly described some notations of tensor representations.
Then by giving basic operations of matrices, we discussed several common tensor
operations, including tensor transformations and tensor products. Concepts of struc-
tural tensors such as orthogonal tensor, diagonal tensor, and f -diagonal tensor are
also given. It is worth noting that we only focus on the most commonly used defini-
tions; for more information, please refer to [1], [5], and [6].

1.3 Tensor decompositions


The idea of tensor decomposition was first put forward by Hitchcock in 1927 and has
since been developed by many scholars. Traditionally, it was applied in psychometrics
and chemometrics. With the growing prosperity of tensor decomposition in [15–18],
it began to draw attention in other fields, including signal processing [19–21],
numerical linear algebra [22,23], computer vision [24], numerical analysis [25,26],
and data mining [27–29]. Meanwhile, different decomposition approaches were
developed to meet various requirements.
In this section, we first discuss two cornerstones, Tucker decomposition and CP
decomposition, and go through some other methods like block term decomposition
(BTD), tensor singular value decomposition (t-SVD), and tensor networks (TNs).

1.3.1 Tucker decomposition


In 1963, Tucker decomposition was first proposed in [30] by Tucker and later refined
by Levin and Tucker. In 2000, the name higher-order singular value decomposition
(HOSVD) was put forward by De Lathauwer [31]. Nowadays, the terms
Tucker decomposition and HOSVD are used alternatively to refer to Tucker decom-
position.
Taking advantage of the mode-n product, Tucker decomposition can be defined
as a multiplication of a core tensor and the matrix along each mode.
Definition 1.3.1. (Tucker decomposition) Given a tensor X ∈ RI1×···×IN , its Tucker
decomposition is

X = G ×1 U(1) ×2 U(2) ··· ×N U(N) = ⟦G; U(1), U(2), ··· , U(N)⟧,    (1.30)

where U(n) ∈ RIn ×Rn (n = 1, · · · , N ) are semi-orthogonal factor matrices that satisfy
U(n)T U(n) = IRn and G ∈ RR1 ×R2 ×···×RN is the core tensor. Even though the core
tensor is usually dense, it is generally much smaller than X , i.e., Rn ≪ In.
We can also write Tucker decomposition in an element-wise style as

x_{i1,i2,··· ,iN} = ∑_{r1=1}^{R1} ∑_{r2=1}^{R2} ··· ∑_{rN=1}^{RN} g_{r1,r2,··· ,rN} u(1)_{i1,r1} u(2)_{i2,r2} ··· u(N)_{iN,rN},    (1.31)

where U(n) = [u(n)_1, ··· , u(n)_{Rn}].
Fig. 1.15 is an illustration of Tucker decomposition on a third-order tensor, i.e.,
T = ⟦G; A, B, C⟧.
Fig. 1.15 is an illustration of Tucker decomposition on a third-order tensor, i.e.,
T = G; A, B, C.

FIGURE 1.15
An illustration of Tucker decomposition on a third-order tensor T . The core tensor is
G ∈ RR1 ×R2 ×R3 and factor matrices are A, B, C ∈ RIn ×Rn , n = 1, 2, 3.

Based on Tucker decomposition, we can derive the definition of Tucker rank.

Definition 1.3.2. (Tucker rank) The Tucker rank of a given tensor X ∈ RI1 ×···×IN
is defined as an N-tuple (R1 , · · · , RN ) comprised of n-rank Rn . The n-rank Rn =
rank(X(n) ), a.k.a. the multilinear rank, is the dimension of the vector space spanned
by the mode-n fibers. In other words, the n-rank is the column rank of X(n) .
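A minimal HOSVD-style sketch of Eq. (1.30) in numpy: each factor is taken from the left singular vectors of a mode-n unfolding, and the core follows by mode-n products with the transposed factors. This is the classical HOSVD construction; at truncated ranks it is generally not the best low-multilinear-rank approximation:

```python
import numpy as np

def unfold(T, n):
    return np.reshape(np.moveaxis(T, n, 0), (T.shape[n], -1), order='F')

def mode_n_product(T, M, n):
    return np.moveaxis(np.tensordot(M, T, axes=(1, n)), 0, n)

def hosvd(T, ranks):
    """Truncated HOSVD: U(n) = leading left singular vectors of T_(n);
    core G = T x1 U(1)^T x2 U(2)^T ... xN U(N)^T."""
    factors = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
               for n, r in enumerate(ranks)]
    G = T
    for n, U in enumerate(factors):
        G = mode_n_product(G, U.T, n)   # shrink mode n from I_n to R_n
    return G, factors

rng = np.random.default_rng(5)
T = rng.standard_normal((4, 5, 6))
G, Us = hosvd(T, (4, 5, 6))             # full multilinear ranks: lossless

# Reconstruct T = [[G; U(1), U(2), U(3)]].
R = G
for n, U in enumerate(Us):
    R = mode_n_product(R, U, n)
```

With the full multilinear ranks the factors are orthogonal square matrices, so the reconstruction is exact.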

Note that Tucker decomposition is not unique; if we apply invertible matrices R ∈
RR1×R1, P ∈ RR2×R2, and Q ∈ RR3×R3 to each mode of the core tensor G, we can
always find the corresponding factor matrices by applying the reverse operations on
them. Specifically, taking a third-order tensor T as an example, we have

T = ⟦G; A, B, C⟧ = ⟦G ×1 R ×2 P ×3 Q; AR−1, BP−1, CQ−1⟧,

where R−1, P−1, Q−1 represent the inverse matrices of R, P, Q.


This property of Tucker decomposition enables us to select the appropriate core
tensor according to the situations.

1.3.2 Canonical polyadic decomposition


CP decomposition was invented in 1927 as polyadic decomposition by Hitchcock
[32], whose idea is to express a tensor as the summation of rank-1 tensors. In 1970,
polyadic decomposition was renamed by Carroll and Chang [33] as canonical de-
composition (CANDECOMP), while in the meantime, Harshman [34] named it as
parallel factors analysis (PARAFAC). For a long time, different articles used differ-
ent names to refer to CP decomposition and it was quite confusing. In 2000, Kiers

[35] suggested calling it CP decomposition uniformly. Today, even though we refer to


it as CP decomposition, it can be seen as both CANDECOMP/PARAFAC decompo-
sition and CP decomposition.
Definition 1.3.3. (CP decomposition) CP decomposition of an N-th-order tensor
X ∈ RI1×···×IN is represented as the summation of rank-1 tensors

X = ∑_{r=1}^{R} u(1)_r ◦ u(2)_r ◦ ··· ◦ u(N)_r = ⟦U(1), U(2), ··· , U(N)⟧,    (1.32)

where U(n) = [u(n)_1, u(n)_2, ··· , u(n)_R], n = 1, ··· , N, are factor matrices and R
denotes the number of rank-1 components. The entries in X can be computed
individually as

x_{i1,i2,··· ,iN} = ∑_{r=1}^{R} u(1)_{i1,r} u(2)_{i2,r} ··· u(N)_{iN,r}.    (1.33)

Similar to Tucker decomposition, there is a tensor rank in terms of CP decomposition.
Definition 1.3.4. (Tensor rank) Tensor rank is defined as the minimal number of
rank-1 components that ensure Eq. (1.32) holds.
Taking a third-order tensor T ∈ RI1×I2×I3 as an example, it can be decomposed as

T = ∑_{r=1}^{R} a_r ◦ b_r ◦ c_r = ⟦A, B, C⟧,    (1.34)

where each component a_r ◦ b_r ◦ c_r is a third-order tensor of rank 1. If R is the
smallest number among all the possible values that fit in Eq. (1.34), then we say R is
the tensor rank of T . Fig. 1.16 illustrates the CP decomposition for T intuitively.

FIGURE 1.16
A demonstration of CP decomposition of a third-order tensor. Each dotted rectangle
represents a rank-1 tensor resulting from ar ◦ br ◦ cr , r = 1, · · · , R.
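Given factor matrices, both the tensor of Eq. (1.34) and its mode-1 unfolding can be formed directly; the Khatri–Rao identity T(1) = A (C ⊙ B)T used below is a standard workhorse of CP fitting algorithms such as alternating least squares. A numpy sketch (random factors, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
R = 3                                       # number of rank-1 terms
A = rng.standard_normal((4, R))
B = rng.standard_normal((5, R))
C = rng.standard_normal((6, R))

# CP reconstruction, Eq. (1.34): T = sum_r a_r o b_r o c_r.
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Khatri-Rao form of the mode-1 unfolding: T_(1) = A (C kr B)^T.
KR = np.einsum('kr,jr->kjr', C, B).reshape(-1, R)   # C kr B, shape (30, 3)
T1 = A @ KR.T
```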

CP decomposition and Tucker decomposition are not independent. According to
Eq. (1.30), when the core tensor G is an identity tensor I ∈ RR×R×···×R, then
Eq. (1.30) is indeed a CP decomposition. Specifically,

X = ⟦I; U(1), U(2), ··· , U(N)⟧ = ⟦U(1), U(2), ··· , U(N)⟧.

In this sense, CP decomposition can be regarded as a special case of Tucker decomposition. However, different from Tucker decomposition, CP decomposition is unique up to permutation indeterminacy and scaling indeterminacy. Permutation indeterminacy means that we can arbitrarily change the order of the rank-1 tensors in Eq. (1.32) and the decomposition still holds. Taking a third-order tensor as an example, the mathematical form of the permutation is as follows:

$$\mathcal{T} = [\![\mathbf{A}, \mathbf{B}, \mathbf{C}]\!] = [\![\mathbf{A}\mathbf{P}, \mathbf{B}\mathbf{P}, \mathbf{C}\mathbf{P}]\!], \qquad (1.35)$$

where $\mathbf{P} \in \mathbb{R}^{R \times R}$ is a permutation matrix. Scaling indeterminacy means that we can scale the column vectors by different parameters as long as the product of those parameters equals one. That is to say,

$$\mathcal{T} = \sum_{r=1}^{R} \mathbf{a}_r \circ \mathbf{b}_r \circ \mathbf{c}_r = \sum_{r=1}^{R} (\alpha_r \mathbf{a}_r) \circ (\beta_r \mathbf{b}_r) \circ (\gamma_r \mathbf{c}_r), \qquad (1.36)$$

where the factors satisfy $\alpha_r \beta_r \gamma_r = 1$ for $r = 1, \cdots, R$.

The uniqueness of CP decomposition means that, for a given tensor $\mathcal{X}$, there is only one set of rank-1 tensors, up to these indeterminacies, that satisfies Eq. (1.32).
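The two indeterminacies in Eqs. (1.35) and (1.36) are easy to verify numerically. The sketch below (all sizes and names are illustrative) checks that permuting the rank-1 terms and rescaling the columns leaves the reconstructed tensor unchanged:

```python
import numpy as np

# Random CP factors of a third-order tensor with R = 2 terms.
I, J, K, R = 3, 4, 5, 2
rng = np.random.default_rng(1)
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))
cp = lambda A, B, C: np.einsum('ir,jr,kr->ijk', A, B, C)

# Permutation indeterminacy (Eq. (1.35)): reorder terms with a permutation P.
P = np.eye(R)[:, ::-1]                      # swaps the two columns
assert np.allclose(cp(A, B, C), cp(A @ P, B @ P, C @ P))

# Scaling indeterminacy (Eq. (1.36)): alpha_r * beta_r * gamma_r = 1.
alpha = np.array([2.0, 0.5])
beta = np.array([4.0, 1.0])
gamma = 1.0 / (alpha * beta)
assert np.allclose(cp(A, B, C), cp(A * alpha, B * beta, C * gamma))
```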

1.3.3 Block term decomposition

As introduced above, CP decomposition and Tucker decomposition involve different notions of rank. By unifying the ideas behind these two ranks, block term decomposition (BTD) was proposed [36,37].

Definition 1.3.5. (Block term decomposition) For a tensor $\mathcal{T} \in \mathbb{R}^{I_1 \times I_2 \times \cdots \times I_N}$, its BTD is denoted as

$$\mathcal{T} = \sum_{r=1}^{R} \mathcal{S}_r \times_1 \mathbf{W}_r^{(1)} \times_2 \mathbf{W}_r^{(2)} \cdots \times_N \mathbf{W}_r^{(N)}, \qquad (1.37)$$

where $\mathbf{W}_r^{(n)} \in \mathbb{R}^{I_n \times M_r^{(n)}}$ represents the $n$-th factor of the $r$-th term and $\mathcal{S}_r \in \mathbb{R}^{M_r^{(1)} \times M_r^{(2)} \times \cdots \times M_r^{(N)}}$ represents the corresponding core tensor.

Considering BTD on third-order tensors, we can give the definition of the (L, M, N)-decomposition.

Definition 1.3.6. (Rank-(L, M, N)) A third-order tensor is of rank-(L, M, N) if its mode-1 rank, mode-2 rank, and mode-3 rank equal $L$, $M$, and $N$, respectively.

Definition 1.3.7. ((L, M, N)-decomposition) Given a third-order tensor $\mathcal{T} \in \mathbb{R}^{I \times J \times K}$, the (L, M, N)-decomposition is the summation of rank-(L, M, N) terms as follows:

$$\mathcal{T} = \sum_{r=1}^{R} \mathcal{S}_r \times_1 \mathbf{A}_r \times_2 \mathbf{B}_r \times_3 \mathbf{C}_r, \qquad (1.38)$$

where $\mathcal{S}_r \in \mathbb{R}^{L \times M \times N}$ is of full rank-(L, M, N) and $\mathbf{A}_r \in \mathbb{R}^{I \times L}$, $\mathbf{B}_r \in \mathbb{R}^{J \times M}$, and $\mathbf{C}_r \in \mathbb{R}^{K \times N}$ are of full column rank ($I \geq L$, $J \geq M$, $K \geq N$, $1 \leq r \leq R$).

A graphical illustration of (L, M, N)-decomposition is shown in Fig. 1.17.

FIGURE 1.17
The illustration of block term decomposition for a third-order tensor.
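Each term of Eq. (1.38) is a small core multiplied along every mode by a tall factor matrix, which can be written as one `einsum` contraction. The sketch below is illustrative only (all sizes and names are assumptions, not from the book):

```python
import numpy as np

# Illustrative (L, M, N)-decomposition with R = 2 block terms.
I, J, K = 6, 7, 8
L, M, N, R = 2, 3, 4, 2
rng = np.random.default_rng(7)

T = np.zeros((I, J, K))
for _ in range(R):
    S = rng.standard_normal((L, M, N))        # small full-rank core S_r
    A = rng.standard_normal((I, L))           # full column rank factors
    B = rng.standard_normal((J, M))
    C = rng.standard_normal((K, N))
    # S_r x_1 A_r x_2 B_r x_3 C_r as a single contraction
    T += np.einsum('lmn,il,jm,kn->ijk', S, A, B, C)
```

With $R = 1$ this reduces to a plain Tucker decomposition; with $L = M = N = 1$ it reduces to CP.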

Following this idea, we give three special cases of the (L, M, N)-decomposition: the (L, L, 1)-decomposition, the $(L_r, L_r, 1)$-decomposition, and the (L, M, $\cdot$)-decomposition.

Definition 1.3.8. ((L, L, 1)-decomposition) For a tensor $\mathcal{T} \in \mathbb{R}^{I \times J \times K}$, its (L, L, 1)-decomposition is the summation of rank-(L, L, 1) terms as follows:

$$\mathcal{T} = \sum_{r=1}^{R} \left( \mathbf{A}_r \mathbf{B}_r^{\mathsf{T}} \right) \circ \mathbf{c}_r, \qquad (1.39)$$

where $\mathbf{A}_r \in \mathbb{R}^{I \times L}$ and $\mathbf{B}_r \in \mathbb{R}^{J \times L}$ ($r = 1, \cdots, R$) are rank-$L$ matrices.
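In Eq. (1.39), each block term is a rank-$L$ matrix $\mathbf{A}_r \mathbf{B}_r^{\mathsf{T}}$ "extruded" along the vector $\mathbf{c}_r$ by an outer product. A minimal sketch (sizes and names are illustrative):

```python
import numpy as np

# Illustrative (L, L, 1)-decomposition with R = 3 terms of matrix rank L = 2.
I, J, K, L, R = 4, 5, 6, 2, 3
rng = np.random.default_rng(6)
A = [rng.standard_normal((I, L)) for _ in range(R)]
B = [rng.standard_normal((J, L)) for _ in range(R)]
c = [rng.standard_normal(K) for _ in range(R)]

# Eq. (1.39): sum_r (A_r B_r^T) o c_r
T = sum(np.multiply.outer(A[r] @ B[r].T, c[r]) for r in range(R))

# With L = 1 for every term, this reduces to an ordinary CP decomposition.
```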

However, the terms may not all share the same rank $L$ on some occasions. Therefore, the $(L_r, L_r, 1)$-decomposition was proposed, as shown in Fig. 1.18.

Definition 1.3.9. ($(L_r, L_r, 1)$-decomposition) For a tensor $\mathcal{T} \in \mathbb{R}^{I \times J \times K}$, its $(L_r, L_r, 1)$-decomposition is the summation of rank-$(L_r, L_r, 1)$ terms, which can be defined as

$$\mathcal{T} = \sum_{r=1}^{R} \left( \mathbf{A}_r \mathbf{B}_r^{\mathsf{T}} \right) \circ \mathbf{c}_r, \qquad (1.40)$$

where $\mathbf{A}_r \in \mathbb{R}^{I \times L_r}$ and $\mathbf{B}_r \in \mathbb{R}^{J \times L_r}$ ($r = 1, \cdots, R$) are rank-$L_r$ matrices.



FIGURE 1.18
The illustration of (Lr , Lr , 1)-decomposition for a third-order tensor.

Definition 1.3.10. ((L, M, $\cdot$)-decomposition) For a tensor $\mathcal{T} \in \mathbb{R}^{I \times J \times K}$, its (L, M, $\cdot$)-decomposition can be denoted as

$$\mathcal{T} = \sum_{r=1}^{R} \mathcal{S}_r \times_1 \mathbf{A}_r \times_2 \mathbf{B}_r, \qquad (1.41)$$

where $\mathcal{S}_r \in \mathbb{R}^{L \times M \times K}$ is the core tensor and $\mathbf{A}_r \in \mathbb{R}^{I \times L}$, $\mathbf{B}_r \in \mathbb{R}^{J \times M}$ are of full column rank ($I \geq L$, $J \geq M$, $1 \leq r \leq R$).

FIGURE 1.19
The illustration of (L, M, ·)-decomposition for a third-order tensor.

A visual representation of (L, M, ·)-decomposition is shown in Fig. 1.19.

1.3.4 Tensor singular value decomposition

In addition to the decomposition methods mentioned above, Kilmer et al. [14] proposed a new type of tensor decomposition, named t-SVD. It is a generalization of the matrix singular value decomposition (SVD) and can be applied in multiple applications such as data compression.

Definition 1.3.11. (t-SVD) For a tensor $\mathcal{T} \in \mathbb{R}^{I_1 \times I_2 \times I_3}$, the t-SVD is defined as

$$\mathcal{T} = \mathcal{U} * \mathcal{S} * \mathcal{V}^{\mathsf{T}}, \qquad (1.42)$$

where $\mathcal{U} \in \mathbb{R}^{I_1 \times I_1 \times I_3}$ and $\mathcal{V} \in \mathbb{R}^{I_2 \times I_2 \times I_3}$ are orthogonal tensors and $\mathcal{S} \in \mathbb{R}^{I_1 \times I_2 \times I_3}$ is an $f$-diagonal tensor.

An illustration of t-SVD for tensor T is shown in Fig. 1.20.



FIGURE 1.20
The illustration of t-SVD for a third-order tensor.

Definition 1.3.12. (Tensor tubal rank [38]) The tubal rank of a third-order tensor T
is defined as the number of nonzero tubes in S.
It is well known that block-circulant matrices can be block-diagonalized by the Fourier transform [39]. Mathematically, for $\mathcal{T} \in \mathbb{R}^{I_1 \times I_2 \times I_3}$, we have

$$(\mathbf{F} \otimes \mathbf{I})\,\mathrm{circ}(\mathcal{T})\,(\mathbf{F}^{*} \otimes \mathbf{I}) = \mathrm{bdiag}(\bar{\mathcal{T}}) = \begin{bmatrix} \bar{\mathbf{T}}^{(1)} & & \\ & \ddots & \\ & & \bar{\mathbf{T}}^{(I_3)} \end{bmatrix}, \qquad (1.43)$$

where $\mathrm{circ}(\mathcal{T})$ is the block-circulant matrix (see Eq. (1.23)) of $\mathcal{T}$, $\mathbf{F} \in \mathbb{C}^{I_3 \times I_3}$ is a normalized discrete Fourier transform (DFT) matrix, $\bar{\mathcal{T}}$ is the fast Fourier transform (FFT) of $\mathcal{T}$ along the third mode, and $\bar{\mathbf{T}}^{(i_3)} \in \mathbb{C}^{I_1 \times I_2}$ ($i_3 = 1, \cdots, I_3$) represents its $i_3$-th frontal slice.
In this way, instead of directly calculating $\mathrm{circ}(\cdot)$, we are able to employ the FFT to solve Eq. (1.42). Specifically, we apply the matrix SVD to every frontal slice of $\bar{\mathcal{T}}$ to acquire $\bar{\mathbf{U}}^{(i_3)}$, $\bar{\mathbf{S}}^{(i_3)}$, and $\bar{\mathbf{V}}^{(i_3)}$, $i_3 = 1, \cdots, I_3$. Then, by performing the inverse FFT along the third mode on the resulting $\bar{\mathcal{U}}$, $\bar{\mathcal{S}}$, and $\bar{\mathcal{V}}$, the t-SVD of $\mathcal{T}$ is obtained. From this point of view, t-SVD can be regarded as performing the matrix SVD on each frontal slice of the tensor in the frequency domain.

1.3.5 Tensor network


In recent years, data collected from various fields often have high dimensions. This gives rise to a tricky problem known as the curse of dimensionality. In Tucker decomposition, if we assume the core tensor has dimensions $R_1 \times \cdots \times R_N$ with $R_1 = \cdots = R_N = R$, then $\mathcal{G}$ has $R^N$ entries. The number of entries scales exponentially with the number of tensor modes, which results in a tremendous computational burden that standard numerical algorithms may not be able to handle. To ease this issue, we can resort to tensor networks (TNs), which are able to decompose a high-order tensor into a set of sparsely interconnected lower-order core tensors and matrices. The core tensors are connected through tensor contractions. Apart from the aforementioned decompositions, there are also many other kinds of decompositions under the TN framework. Here we mainly introduce hierarchical Tucker (HT) decomposition and its special case, tensor train (TT) decomposition.
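The exponential-versus-linear scaling is easy to see by counting parameters. The sketch below (formulas follow the standard storage costs; the chosen sizes are illustrative) compares a dense Tucker core plus factors against a TT representation with the same mode size and rank:

```python
# Parameter counts for an order-N tensor with every mode size I and rank R.
def tucker_params(N, I, R):
    # R^N core entries plus N factor matrices of size I x R
    return R**N + N * I * R

def tt_params(N, I, R):
    # two boundary cores of size I x R plus N-2 cores of size R x I x R
    return 2 * I * R + (N - 2) * R * I * R

# Tucker's core grows exponentially with N; TT grows linearly.
for N in (4, 8, 16):
    print(N, tucker_params(N, 10, 5), tt_params(N, 10, 5))
```

Already at $N = 8$ with $I = 10$, $R = 5$, the Tucker core alone dwarfs the entire TT representation.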

FIGURE 1.21
A possible binary tree $\mathbb{T}$ of a fourth-order tensor. Each node corresponds to a mode set of the tensor $\mathcal{A} \in \mathbb{R}^{I_1 \times I_2 \times I_3 \times I_4}$.

1.3.5.1 Hierarchical Tucker decomposition


HT decomposition is also known as the tensor tree network (TTN) with rank-3 tensors
in quantum physics. It was originally proposed in [40,41].
The main idea of HT is to decompose a tensor in a hierarchical way according to a
binary tree T (dimension tree) whose nodes indicate subsets of modes in the original
tensor and the root node contains all the modes. Fig. 1.21 demonstrates a possible
dimension tree for a fourth-order tensor A ∈ RI1 ×I2 ×I3 ×I4 , where C1 , C2 , C3 denote
the interior nodes and C4 , C5 , C6 , C7 denote the leaf nodes. Grelier et al. discussed
how to select the optimal structure for a given tensor [42].
Definition 1.3.13. (Matricization for dimension tree) Given a tensor $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$, dimension indices $C_q \subset C_1$, and the complement $\bar{C}_q := C_1 \backslash C_q$ in the corresponding binary tree (where $q$ is the ordinal label assigned to the node), the matricization is

$$\mathbf{X}_{[q]} \in \mathbb{R}^{I_{C_q} \times I_{\bar{C}_q}},$$

where

$$I_{C_q} := \prod_{c \in C_q} I_c, \qquad I_{\bar{C}_q} := \prod_{\bar{c} \in \bar{C}_q} I_{\bar{c}}.$$

For example, given the tensor $\mathcal{X}$ and dimension indices $C_q = C_2 = \{1, 2\}$ in Fig. 1.21, the matricization $\mathbf{X}_{[q]} = \mathbf{X}_{[2]}$ is of size $I_1 I_2 \times I_3 I_4$.
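For the example above, the matricization is just a reshape that groups the modes in $C_q$ into rows and the rest into columns. A minimal sketch with illustrative sizes:

```python
import numpy as np

# X in R^{I1 x I2 x I3 x I4}, matricized with C_q = {1, 2}.
I1, I2, I3, I4 = 2, 3, 4, 5
X = np.random.default_rng(3).standard_normal((I1, I2, I3, I4))

# Modes in C_q become rows, modes in its complement become columns.
# (For a non-contiguous C_q, transpose those modes to the front first.)
X_q = X.reshape(I1 * I2, I3 * I4)
assert X_q.shape == (6, 20)
```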
Definition 1.3.14. (Hierarchical rank) The hierarchical rank of a tensor $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$ is defined as

$$\mathbf{K} = \left\{ K_{C_q} \;\middle|\; K_{C_q} = \mathrm{rank}(\mathbf{X}_{[q]}), \ \forall\, C_q \subset \mathbb{T} \right\}, \qquad (1.44)$$

where $C_q$ refers to the dimension index in the related binary tree and $\mathbf{X}_{[q]}$ is the matricization according to $C_q$.

Let $\mathcal{A}_{\mathbf{K}}$ denote the set of tensors whose hierarchical rank is no more than $\mathbf{K}$, i.e., $\mathcal{A}_{\mathbf{K}} = \{\mathcal{A} \in \mathbb{R}^{I_1 \times \cdots \times I_N} \,|\, \mathrm{rank}(\mathbf{A}_{[q]}) \leq K_{C_q}, \ \forall\, C_q \subset \mathbb{T}\}$. Using the nestedness property, we can define HT decomposition properly.

Lemma 1.3.1. (Nestedness property) Suppose $\mathcal{A} \in \mathcal{A}_{\mathbf{K}}$. Then for every $C_q$ and its complement $\bar{C}_q$ we have a corresponding subspace

$$U_{C_q} := \mathrm{span}\left\{ \mathbf{x} \in \mathbb{R}^{I_{C_q}} \;\middle|\; \mathbf{x} \text{ is a left singular vector of } \mathbf{A}_{[q]} \right\},$$

where $\mathbf{A}_{[q]} \in \mathbb{R}^{I_{C_q} \times I_{\bar{C}_q}}$. For each interior node with two successors, whose dimension indices are $C_q$, $C_{q_1}$, and $C_{q_2}$, respectively, the space $U_{C_q}$ naturally decouples into

$$U_{C_q} = U_{C_{q_1}} \otimes U_{C_{q_2}},$$

where $\otimes$ denotes the Kronecker product.


Definition 1.3.15. (HT decomposition) Suppose A ∈ AK . We can represent A[q] as

A[q] = UCq VTCq ,

×K
where UCq ∈ RICq ×KCq , VCq ∈ R C̄q Cq , KCq is the hierarchical rank of A[q] , and
I

Cq ⊂ T. Therefore, for Cq = {Cq1 , Cq2 }, the column vectors UCq (:, l) of UCq satisfy
the nestedness property when 1  l  KCq , that is,

KCq KCq
1 2
UCq (:, l) = GCq (l, l1 , l2 )UCq1 (:, l1 ) ⊗ UCq2 (:, l2 ), (1.45)
l1 =1 l2 =1

where GCq (l, l1 , l2 ) represents the entry in GCq . It is the coefficient in the linear com-
bination of vectors. Here UCq1 (:, l1 ) and UCq2 (:, l2 ) are the column vectors of UCq1
and UCq2 . Therefore, GCq and UCq are the factors of HT decomposition of A. Ac-
cording to the dimension tree in Fig. 1.21, the HT decomposition for a fourth-order
tensor is illustrated in Fig. 1.22.

1.3.5.2 Tensor train decomposition

TT decomposition was proposed in [43] and is also known as matrix product state (MPS) in the area of quantum physics. Since it avoids the recursive computation over binary trees and is mathematically easy to handle thanks to its compact form, it has attracted a lot of attention in recent years.

The main idea of TT decomposition is to factorize a tensor $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$ into $N$ factors, where the head and tail factors are matrices and the rest are smaller third-order tensors.
Definition 1.3.16. (TT decomposition) Given $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$, the TT decomposition is of the form

$$\mathcal{X} = \sum_{r_1=1}^{R_1} \cdots \sum_{r_{N+1}=1}^{R_{N+1}} \mathcal{G}^{(1)}(r_1, :, r_2) \circ \mathcal{G}^{(2)}(r_2, :, r_3) \circ \cdots \circ \mathcal{G}^{(N)}(r_N, :, r_{N+1}), \qquad (1.46)$$

FIGURE 1.22
Hierarchical Tucker decomposition of a fourth-order tensor $\mathcal{A} \in \mathbb{R}^{I_1 \times I_2 \times I_3 \times I_4}$, where $\mathcal{G}_{\{1,2,3,4\}} \in \mathbb{R}^{K_{\{1,2\}} \times K_{\{3,4\}} \times 1}$, $\mathcal{G}_{\{1,2\}} \in \mathbb{R}^{K_{\{1,2\}} \times K_{\{1\}} \times K_{\{2\}}}$, $\mathcal{G}_{\{3,4\}} \in \mathbb{R}^{K_{\{3,4\}} \times K_{\{3\}} \times K_{\{4\}}}$, $\mathbf{U}_{\{1\}} \in \mathbb{R}^{K_{\{1\}} \times I_1}$, $\mathbf{U}_{\{2\}} \in \mathbb{R}^{K_{\{2\}} \times I_2}$, $\mathbf{U}_{\{3\}} \in \mathbb{R}^{K_{\{3\}} \times I_3}$, and $\mathbf{U}_{\{4\}} \in \mathbb{R}^{K_{\{4\}} \times I_4}$ are the factors of hierarchical Tucker decomposition.

FIGURE 1.23
Top: tensor train decomposition of an $N$-th-order tensor $\mathcal{X}$. Each yellow dot (light gray in print version) $\mathcal{G}^{(n)} \in \mathbb{R}^{R_n \times I_n \times R_{n+1}}$ is a core tensor; leaf components (not drawn) are identities in this case and thus need not be stored. Bottom: the corresponding core tensors.

or in an element-wise style,

$$x_{i_1, i_2, \cdots, i_N} = \mathcal{G}^{(1)}(:, i_1, :)\, \mathcal{G}^{(2)}(:, i_2, :) \cdots \mathcal{G}^{(N)}(:, i_N, :), \qquad (1.47)$$

where $\mathcal{G}^{(n)}$ ($n = 1, \cdots, N$) are the $R_n \times I_n \times R_{n+1}$ core factors. Note that we set $R_1 = R_{N+1} = 1$.

A graphical illustration of TT decomposition on an N-th-order tensor is shown in


Fig. 1.23.
Similar to other decompositions, TT decomposition also has its rank.

Definition 1.3.17. (Tensor train rank) The TT rank of $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$ is an $(N+1)$-tuple

$$\mathrm{rank}_{\mathrm{TT}}(\mathcal{X}) = (R_1, \cdots, R_{N+1}), \qquad (1.48)$$

where $R_n = \mathrm{rank}(\mathbf{X}_{<n-1>})$ and $R_1 = R_{N+1} = 1$.


Different from the aforementioned ranks, TT rank can be affected by the permutation of tensor modes. For instance, given a tensor $\mathcal{A} \in \mathbb{R}^{I_1 \times I_2 \times I_3}$, if we generate $\mathcal{A}' \in \mathbb{R}^{I_3 \times I_2 \times I_1}$ by swapping the first and the third mode of $\mathcal{A}$, the TT rank can be different. That leads to the question of how to choose a desirable permutation for a given tensor.

1.3.5.3 Tensor ring decomposition

Zhao et al. [12] generalized TT to tensor ring (TR) decomposition (a.k.a. tensor chain [TC]). It employs the trace operation to create a symmetric structure and can be treated as TT with periodic boundary conditions (PBC). An illustration of TR decomposition is shown in Fig. 1.24.

FIGURE 1.24
Tensor ring decomposition of an $N$-th-order tensor $\mathcal{X}$, where $\mathcal{G}^{(n)} \in \mathbb{R}^{R_n \times I_n \times R_{n+1}}$ are the core tensors.

From Fig. 1.24, we can see that TR decomposes an $N$-th-order tensor into $N$ third-order factor tensors whose dimensions are much smaller than those of the original tensor. The constraint $R_{N+1} = R_1$ still exists, but the shared value no longer needs to equal one. Therefore, the definition of TR decomposition can be given as follows.
Definition 1.3.18. (TR decomposition) The TR decomposition of $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$ is

$$\mathcal{X} = \sum_{r_1=1}^{R_1} \cdots \sum_{r_N=1}^{R_N} \mathcal{G}^{(1)}(r_1, :, r_2) \circ \mathcal{G}^{(2)}(r_2, :, r_3) \circ \cdots \circ \mathcal{G}^{(N)}(r_N, :, r_1). \qquad (1.49)$$

Using the trace operation, it can be written element-wise as follows:

$$x_{i_1, i_2, \cdots, i_N} = \mathrm{tr}\left( \mathcal{G}^{(1)}(:, i_1, :)\, \mathcal{G}^{(2)}(:, i_2, :) \cdots \mathcal{G}^{(N)}(:, i_N, :) \right), \qquad (1.50)$$

where $\mathcal{G}^{(n)} \in \mathbb{R}^{R_n \times I_n \times R_{n+1}}$, $n = 1, \cdots, N$, are the core tensors.

Similar to TT rank, TR rank can be defined as a tuple, i.e., $\mathrm{rank}_{\mathrm{TR}}(\mathcal{X}) = (R_1, \cdots, R_N)$.
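The only change from the TT sketch is that the boundary bond now has size $R_1 = R_{N+1} > 1$ and the chain of slice products is closed with a trace, as in Eq. (1.50). All sizes below are illustrative:

```python
import numpy as np

# Random TR cores for a 3 x 4 x 5 tensor; the "ring" bond satisfies
# ranks[0] == ranks[-1] = 2.
I, ranks = (3, 4, 5), (2, 3, 4, 2)
rng = np.random.default_rng(5)
cores = [rng.standard_normal((ranks[n], I[n], ranks[n + 1])) for n in range(3)]

def tr_entry(cores, idx):
    """Eq. (1.50): trace of the product of core slices."""
    out = np.eye(cores[0].shape[0])
    for G, i in zip(cores, idx):
        out = out @ G[:, i, :]
    return np.trace(out)

# Full tensor via a cyclic contraction (index a closes the ring),
# for a consistency check.
X = np.einsum('aib,bjc,cka->ijk', *cores)
assert np.isclose(X[1, 2, 3], tr_entry(cores, (1, 2, 3)))
```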

1.3.5.4 Other variants

Even though TNs are designed to substantially reduce the computational cost, HT and TT may fail to achieve an outstanding effect when the data size increases. TR was proposed to solve the ordering issue induced by TT.

Besides TR, projected entangled pair states (PEPS) [44] was also introduced for the same reason. PEPS is a hierarchical 2D TT model employing fifth/sixth-order tensors as cores. This strategy can be viewed as a trade-off between rank and computational complexity. However, when high estimation accuracy is required, the rank of PEPS also grows rapidly.
Honeycomb lattice (HCL) [45] and multiscale entanglement renormalization
ansatz (MERA) [46] were put forward to resolve this dilemma. Similar to TR and
PEPS, HCL and MERA also contain loop structures that cause a high computational
burden in the contraction. However, due to the fact that core tensors for HCL and
MERA are only of third order and third/fourth order, respectively, they are still able
to attain a desirable effect.
Recently, a new idea has been proposed by combining traditional decomposition
approaches. For example, TR and t-SVD are merged in a hierarchical way in [47].

1.4 Tensor processing techniques

Tensor decomposition plays a key role in machine learning, signal processing, and data mining. A number of tensor-based data processing techniques are listed as follows:
• Tensor dictionary learning [48,49]
Tensor dictionary learning aims to sparsely represent a high-order tensor in a tensor factorization form, where some factors can be regarded as learned dictionaries with respect to the remaining sparse factors. Benefiting from its data-driven characteristic and high-order sparse representation ability, it has been widely used in various data processing applications, such as image denoising, enhancement, and classification, texture synthesis, unsupervised clustering, and biomedical data analysis.
• Tensor completion [50–54]
Tensor completion fills in the missing entries of a partially observed tensor, which
is popularly applied in recommender systems, image recovery, knowledge graph
completion, and traffic flow prediction.
• Tensor robust principal component analysis (TRPCA) [55–57]
TRPCA separates additive low-rank and sparse components from multiway data.
It can be used for dimensionality reduction, background extraction, small target
detection, and anomaly detection.
Other documents randomly have
different content
“Read it, then you will see,” replied Savelitch. Pougatcheff took the
paper and examined it for a long time with a consequential look.
“Why do you write so illegibly?” said he at last. “Our lucid eyes[1]
cannot decipher a word. Where is my chief secretary?”
A young man, in the uniform of a corporal, immediately ran up to
Pougatcheff.
“Read it aloud,” said the usurper, giving him the paper.
I was exceedingly curious to know what my follower could have
written to Pougatcheff about. The chief secretary, in a loud voice,
began to spell out as follows:
“Two dressing-gowns, one of linen and one of striped silk, six
roubles.”
“What does this mean?” said Pougatcheff, frowning.
“Order him to read on,” replied Savelitch coolly.
The chief secretary continued:
“One uniform coat of fine green cloth, seven roubles.
“One pair of white cloth breeches, five roubles.
“Twelve Holland linen shirts with ruffles, ten roubles.
“A chest and tea-service, two roubles and a half....”
“What is all this nonsense?” exclaimed Pougatcheff. “What are these
chests and breeches with ruffles to do with me?”
Savelitch cleared his throat and began to explain.
“This, my father, you will please to understand is a list of my
master’s goods that have been stolen by those scoundrels——”
“What scoundrels?” said Pougatcheff, threateningly.
“I beg your pardon, that was a slip on my part,” replied Savelitch.
“They were not scoundrels, but your fellows, who have rummaged
and plundered everything. Do not be angry: the horse has got four
legs, and yet he stumbles. Order him to read to the end.”
“Read on to the end,” said Pougatcheff.
The secretary continued:
“One chintz counterpane, another of taffety quilted with cotton wool,
four roubles.
“A fox-skin pelisse, covered with red flannel, forty roubles.
“Likewise a hare-skin morning-gown, presented to your Grace at the
inn on the steppe, fifteen roubles.”
“What’s that’!” exclaimed Pougatcheff, his eyes flashing fire.
I confess that I began to feel alarmed for my poor servant. He was
about to enter again into explanations, but Pougatcheff interrupted
him.
“How dare you pester me with such nonsense!” he cried, snatching
the paper out of the secretary’s hands and flinging it in Savelitch’s
face. “Stupid old man! You have been robbed; what a misfortune!
Why, old greybeard, you ought to be eternally praying to God for me
and my lads, that you and your master are not hanging yonder
along with the other traitors to me.... A hare-skin morning-gown! Do
you know that I could order you to be flayed alive and have your
skin made into a morning-gown?”
“As you please,” replied Savelitch; “but I am not a free man, and
must be answerable for my lord’s goods.”
Pougatcheff was evidently in a magnanimous humour. He turned
round and rode off without saying another word. Shvabrin and the
chiefs followed him. The troops marched out of the fortress in order.
The crowd pressed forward to accompany Pougatcheff. I remained in
the square alone with Savelitch. My servant held in his hand the list
of my things and stood looking at it with an air of deep regret.
Seeing me on such good terms with Pougatcheff, he thought that he
might take advantage of the circumstance; but his sage scheme did
not succeed. I was on the point of scolding him for his misplaced
zeal, but I could not restrain myself from laughing.
“Laugh away, my lord,” replied Savelitch: “laugh away; but when the
time comes for you to procure a new outfit, we shall see if you will
laugh then.”
I hastened to the priest’s house to see Maria Ivanovna. The priest’s
wife met me with sad news. During the night Maria Ivanovna had
been seized with a violent attack of fever. She lay unconscious and in
a delirium. The priest’s wife conducted me into her room. I softly
approached her bed. The change in her face startled me. She did
not recognize me. For a long time I stood beside her without paying
any heed either to Father Gerasim or to his good wife, who
endeavoured to console me. Gloomy thoughts took possession of
me. The condition of the poor defenceless orphan, left alone in the
midst of the lawless rebels, as well as my own powerlessness,
terrified me. But it was the thought of Shvabrin more than anything
else that filled my imagination with alarm. Invested with power by
the usurper, and entrusted with the command of the fortress, in
which the unhappy girl—the innocent object of his hatred—
remained, he was capable of any villainous act. What was I to do?
How should I help her? How could I rescue her out of the hands of
the brigands? There remained only one way. I resolved to set out
immediately for Orenburg, in order to hasten the deliverance of
Bailogorsk, and, as far as possible, to co-operate in the undertaking.
I took leave of the priest and of Akoulina Pamphilovna,
recommending to their care her whom I already considered as my
wife. I seized the hand of the poor girl and kissed it, bedewing it
with my tears.
“Farewell,” said the pope’s wife to me, accompanying me to the door
“farewell, Peter Andreitch. Perhaps we shall see each other again in
happier times. Do not forget us, and write to us often. Poor Maria
Ivanovna has nobody now, except you, to console and protect her.”
On reaching the square, I stopped for a moment and looked at the
gibbet, then, bowing my head before it, I quitted the fortress and
took the road to Orenburg, accompanied by Savelitch, who had not
left my side.
I was walking on, occupied with my reflections, when suddenly I
heard behind me the trampling of horses’ feet. Looking round, I saw,
galloping out of the fortress, a Cossack, holding a Bashkir horse by
the rein and making signs to me from afar. I stopped and soon
recognized our orderly. Galloping up to us, he dismounted from his
own horse, and giving me the rein of the other, said:
“Your lordship! our father sends you a horse, and a pelisse from his
own shoulders.” (To the saddle was attached a sheepskin pelisse.)
“Moreover,” continued the orderly with some hesitation, “he sends
you—half-a-rouble—but I have lost it on the road; be generous and
pardon me.”
Savelitch eyed him askance and growled out:
“You lost it on the road! What is that chinking in your pocket, then,
you shameless rascal!”
“What is that chinking in my pocket?” replied the orderly, without
being in the least confused. “God be with you, old man! It is a
horse’s bit, and not half-a-rouble.”
“Very well,” said I, putting an end to the dispute. “Give my thanks to
him who sent you; and as you go back, try and find the lost half-
rouble and keep it for drink-money.”
“Many thanks, your lordship,” replied he, turning his horse round; “I
will pray to God for you without ceasing.” With these words he
galloped back again, holding one hand to his pocket, and in about a
minute he was hidden from sight.
I put on the pelisse and mounted the horse, taking Savelitch up
behind me.
“Now do you see, my lord,” said the old man, “that I did not give the
petition to the rascal in vain? The robber felt ashamed of himself.
Although this lean-looking Bashkir jade and this sheepskin pelisse
are not worth half of what the rascals stole from us, and what you
chose to give him yourself, they may yet be of some use to us; from
a vicious dog, even a tuft of hair.”

[1] An allusion to the customary form of speech on presenting a


petition to the Czar: “I strike the earth with my forehead, and
present my petition to your lucid eyes.”

CHAPTER X.

THE SIEGE.

In approaching Orenburg, we saw a crowd of convicts, with shaven


heads, and with faces disfigured by the hangman’s pincers. They
were at work on the fortifications, under the direction of the soldiers
of the garrison. Some were carrying away in wheel-barrows the
earth and refuse which filled the moat, others with shovels were
digging up the ground; on the rampart the masons were carrying
stones and repairing the walls. The sentinels stopped us at the gate
and demanded our passports. As soon as the sergeant heard that I
came from Bailogorsk, he took me straight to the General’s house.
I found him in the garden. He was inspecting the apple-trees, which
the autumn winds had stripped of their leaves, and, with the help of
an old gardener, was carefully covering them with straw. His face
expressed tranquillity, health, and good-nature. He was much
pleased to see me, and began questioning me about the terrible
events of which I had been an eye-witness. I related everything to
him. The old man listened to me with attention, and continued the
meantime to lop off the dry twigs.
“Poor Mironoff!” said he, when I had finished my sad story; “I feel
very sorry for him, he was a good officer; and Madame Mironoff was
a good woman,—how clever she was at pickling mushrooms! And
what has become of Masha, the Captain’s daughter?”
I replied that she was still at the fortress in the hands of the pope
and his wife.
“That is bad, very bad. Nobody can place any dependence upon the
discipline of robbers. What will become of the poor girl?”
I replied that the fortress of Bailogorsk was not far off and that,
without doubt, his Excellency would not delay in sending thither a
detachment of soldiers to deliver the poor inhabitants.
The General shook his head dubiously.
“We shall see, we shall see,” said he, “we have plenty of time to talk
about that. Do me the pleasure of taking a cup of tea with me: a
council of war is to be held at my house this evening. You may be
able to give us some trustworthy information concerning this rascal
Pougatcheff and his army. And now go and rest yourself for a little
while.”
I went to the quarter assigned to me, where Savelitch had already
installed himself, and where I awaited with impatience the appointed
time. The reader will easily imagine that I did not fail to make my
appearance at the council which was to have such an influence upon
my fate At the appointed hour I repaired to the General’s house.
I found with him one of the civil officials of the town, the director of
the custom-house, if I remember rightly, a stout, red-faced old man
in a silk coat. He began to question me about the fate of Ivan
Kouzmitch, whom he called his gossip, and frequently interrupted my
discourse with additional questions and moral observations, which, if
they did not prove him to be a man well versed in military matters,
showed at least that he possessed sagacity and common sense. In
the meantime the other persons who had been invited to the council
had assembled. When they were all seated, and a cup of tea had
been handed round to each, the General entered into a clear and
detailed account of the business in question.
“And now, gentlemen,” continued he, “we must decide in what way
we are to act against the rebels: offensively or defensively? Each of
these methods has its advantages and disadvantages. Offensive
warfare holds out a greater prospect of a quicker extermination of
the enemy; defensive action is safer and less dangerous....
Therefore let us commence by putting the question to the vote in
legal order, that is, beginning with the youngest in rank. Ensign,”
continued he, turning to me, “will you please favour us with your
opinion?”
I rose, and after having described, in a few words, Pougatcheff and
his followers, I expressed my firm opinion that the usurper was not
in a position to withstand disciplined troops.
My opinion was received by the civil officials with evident
dissatisfaction. They saw in it only the rashness and temerity of a
young man. There arose a murmur, and I distinctly heard the word
“greenhorn” pronounced in a whisper. The General turned to me and
said with a smile:
“Ensign, the first voices in councils of war are generally in favour of
adopting offensive measures. We will now continue and hear what
others have to say. Mr. Counsellor of the College, tell us your
opinion.”
The little old man in the silk coat hastily swallowed his third cup of
tea, into which he had poured some rum, and then replied:
“I think, your Excellency, that we ought to act neither offensively nor
defensively.”
“How, Sir Counsellor?” replied the astonished General. “Tactics
present no other methods of action; offensive action or defensive....”
“Your Excellency, act diplomatically.”
“Ah! your idea is a very sensible one. Diplomatic action is allowed by
the laws of tactics, and we will profit by your advice. We might offer
for the head of the rascal ... seventy or even a hundred roubles ...
out of the secret funds....”
“And then,” interrupted the Director of the Customs, “may I become
a Kirghis ram, and not a College Counsellor, if these robbers do not
deliver up to us their leader, bound hand and foot.”
“We will think about it, and speak of it again,” replied: the General.
“But, in any case, we must take military precautions. Gentlemen,
give your votes in regular order.”
The opinions of all were contrary to mine. All the civil officials
expatiated upon the untrustworthiness of the troops, the uncertainty
of success, the necessity of being cautious, and the like. All agreed’
that it was more prudent to remain behind the stone walls of the
fortress under the protection of the cannon, than to try the fortune
of arms in the open field. At length the General, having heard all
their opinions, shook the ashes from his pipe and spoke as follows:
“Gentlemen, I must declare to you that, for my part, I am entirely of
the same opinion as the ensign; because this opinion is founded
upon sound rules of tactics, which nearly always give the preference
to offensive action rather than to defensive.”
Then he paused and began to fill his pipe. My vanity triumphed. I
cast a proud glance at the civil officials, who were whispering among
themselves with looks of displeasure and uneasiness.
“But, gentlemen,” continued the General, heaving a deep sigh, and
emitting at the same time a thick cloud of tobacco smoke, “I dare
not take upon myself such a great responsibility, when it is a
question of the safety of the provinces confided to me by Her
Imperial Majesty, my Most Gracious Sovereign. Therefore it is that I
fall in with the views of the majority, who have decided that it is
safer and more prudent to await the siege inside the town, and to
repel the attack of the enemy by the use of artillery and—if possible
—by sallies.”
The officials in their turn now glanced at me ironically. The council
separated. I could not but deplore the weakness of this estimable
soldier, who, contrary to his own conviction, resolved to follow the
advice of ignorant and inexperienced persons.
Some days after this memorable council we heard that Pougatcheff,
faithful to his promise, was marching on Orenburg. From the lofty
walls of the town I observed the army of the rebels. It seemed to
me that their numbers had increased since the last assault, of which
I had been a witness. They had with them also some pieces of
artillery which had been taken by Pougatcheff from the small
fortresses that had been conquered by him. Remembering the
decision of the council, I foresaw a long incarceration within the
walls of Orenburg, and I was almost ready to weep with vexation.
I do not intend to describe the siege of Orenburg, which belongs to
history and not to family memoirs. I will merely observe that this
siege, through want of caution on the part of the local authorities,
was a disastrous one for the inhabitants, who had to endure hunger
and every possible privation. It can easily be imagined that life in
Orenburg was almost unbearable. All awaited in melancholy anxiety
the decision of fate; all complained of the famine, which was really
terrible. The inhabitants became accustomed to the cannon-balls
falling upon their houses; even Pougatcheff’s assaults no longer
produced any excitement. I was dying of ennui. Time wore on. I
received no letters from Bailogorsk. All the roads were cut off.
Separation from Marla Ivanovna became insupportable to me.
Uncertainty with respect to her fate tortured me. My only diversion
consisted in making excursions outside the city. Thanks to the
kindness of Pougatcheff, I had a good horse, with which I shared my
scanty allowance of food, and upon whose back I used to ride out
daily beyond the walls and open fire upon Pougatcheff’s partisans. In
these skirmishes the advantage was generally on the side of the
rebels, who had plenty to eat and drink, and possessed good horses.
Our miserable cavalry were unable to cope with them. Sometimes
our famished infantry made a sally; but the depth of the snow
prevented their operations being successful against the flying cavalry
of the enemy. The artillery thundered in vain from the summit of the
ramparts, and had it been in the field, it could not have advanced on
account of our emaciated horses. Such was our style of warfare! And
this was what the civil officials of Orenburg called prudence and
foresight!
One day, when we had succeeded in dispersing and driving off a
tolerably large body of the enemy, I came up with a Cossack who
had remained behind his companions, and I was just about to strike
him with my Turkish sabre, when he suddenly took off his cap and
cried out:
“Good day, Peter Andreitch; how do you do?”
I looked at him and recognized our orderly. I cannot say how
delighted I was to see him.
“Good day, Maximitch,” said I to him. “How long is it since you left
Bailogorsk?”
“Not long, Peter Andreitch; I only returned from there yesterday. I
have a letter for you.”
“Where is it?” cried I, perfectly beside myself with excitement.
“I have it here,” replied Maximitch, placing his hand upon his bosom.
“I promised Palasha that I would give it to you somehow.”
He then gave me a folded paper and immediately galloped off. I
opened it and, deeply agitated, read the following lines:

“It has pleased God to deprive me suddenly of both father and
mother: I have now on earth neither a relation nor a protector. I
therefore turn to you, because I know that you have always
wished me well, and that you are ever ready to help others. I
pray to God that this letter may reach you in some way!
Maximitch has promised to give it to you. Palasha has also
heard from Maximitch that he has frequently seen you from a
distance in the sorties, and that you do not take the least care
of yourself, not thinking about those who pray to God for you in
tears. I was ill a long time, and, when I recovered, Alexei
Ivanovitch, who commands here in place of my deceased father,
compelled Father Gerasim to deliver me up to him, threatening
him with Pougatcheff’s anger if he refused. I live in our house
which is guarded by a sentry. Alexei Ivanovitch wants to compel
me to marry him. He says that he saved my life because he did
not reveal the deception practised by Akoulina Pamphilovna,
who told the rebels that I was her niece. But I would rather die
than become the wife of such a man as Alexei Ivanovitch. He
treats me very cruelly, and threatens that if I do not change my
mind and agree to his proposal, he will conduct me to the
rebels’ camp, where I shall suffer the same fate as Elizabeth
Kharloff.[1] I have begged Alexei Ivanovitch to give me time to
reflect. He has consented to give me three days longer, and if at
the end of that time I do not agree to become his wife, he will
show me no further mercy. Oh, Peter Andreitch! you are my
only protector; save a poor helpless girl! Implore the General
and all the commanders to send us help as soon as possible,
and come yourself if you can.
“I remain your poor obedient orphan,
“MARIA MIRONOFF.”

The reading of this letter almost drove me out of my mind. I
galloped back to the town, spurring my poor horse without mercy.
On the way I turned over in my mind one plan and another for the
rescue of the poor girl, but I could not come to any definite
conclusion. On reaching the town I immediately repaired to the
General’s, and presented myself before him without the least delay.
He was walking up and down the room, smoking his meerschaum
pipe. On seeing me he stopped. Probably he was struck by my
appearance, for he anxiously inquired the reason of my hasty visit.
“Your Excellency,” said I to him, “I come to you as I would to my
own father: for Heaven’s sake, do not refuse my request; the
happiness of my whole life depends upon it!”
“What is the matter?” asked the astonished old soldier. “What can I
do for you? Speak!”
“Your Excellency, allow me to take a battalion of soldiers and a
company of Cossacks to recapture the fortress of Bailogorsk.”
The General looked at me earnestly, imagining, without doubt, that I
had taken leave of my senses—and, for the matter of that, he was
not very far out in his supposition.
“How?—what? Recapture the fortress of Bailogorsk?” said he at last.
“I will answer for the success of the undertaking,” I replied with
ardour; “only let me go.”
“No, young man,” said he, shaking his head. “At such a great
distance the enemy would easily cut off your communication with
the principal strategical point, and gain a complete victory over you.
Communication being cut off....”
I became alarmed when I perceived that he was about to enter
upon a military dissertation, and I hastened to interrupt him.
“The daughter of Captain Mironoff has written a letter to me,” I said
to him; “she asks for help: Shvabrin wants to compel her to become
his wife.”
“Indeed! Oh, this Shvabrin is a great rascal, and if he should fall into
my hands I will order him to be tried within twenty-four hours, and
we will have him shot on the parapet of the fortress. But in the
meantime we must have patience.”
“Have patience!” I cried, perfectly beside myself. “But in the
meantime he will force Maria Ivanovna to become his wife!”
“Oh!” exclaimed the General. “But even that would be no great
misfortune for her. It would be better for her to become the wife of
Shvabrin, he would then take her under his protection; and when we
have shot him we will soon find a sweetheart for her, please God.
Pretty widows do not remain single long; I mean that a widow finds
a husband much quicker than a spinster.”
“I would rather die,” said I in a passion, “than resign her to
Shvabrin.”
“Oh, oh!” said the old man, “now I understand. You are evidently in
love with Maria Ivanovna, and that alters the case altogether. Poor
fellow! But, for all that, I cannot give you a battalion of soldiers and
fifty Cossacks. Such an expedition would be the height of folly, and I
cannot take the responsibility of it upon myself.”
I cast down my head; despair took possession of me. Suddenly a
thought flashed through my mind: what it was, the reader will
discover in the following chapter, as the old romance writers used to
say.

[1] A Commandant’s daughter, whom Pougatcheff outraged and
then put to death.

CHAPTER XI.

THE REBEL ENCAMPMENT.

I left the General and hastened to my own quarters. Savelitch
received me with his usual admonitions.
“What pleasure do you find, my lord, in fighting against drunken
robbers? Is that the kind of occupation for a nobleman? All hours are
not alike, and you will sacrifice your life for nothing. It would be all
well and good if you were fighting against the Turks or the Swedes,
but it is a shame to mention the name of the enemy that you are
dealing with now.”
I interrupted him in his speech by the question:
“How much money have I left?”
“You have a tolerably good sum still left,” he replied, with a look of
satisfaction. “In spite of their searching and rummaging, I succeeded
in hiding it from the robbers.” So saying, he drew from his pocket a
long knitted purse, filled with silver pieces.
“Well, Savelitch,” said I to him, “give me half of what you have, and
keep the rest yourself. I am going to Fortress Bailogorsk.”
“My little father, Peter Andreitch!” said my good old servant in a
trembling voice; “do not tempt God! How can you travel at the
present time, when none of the roads are free from the robbers?
Have compassion upon your parents, if you have no pity for yourself.
Where do you want to go? And why? Wait a little while. The troops
will soon be here and will quickly make short work of the robbers.
Then you may go in whatever direction you like.” But my resolution
was not to be shaken.
“It is too late to reflect,” I said to the old man. “I must go, I cannot
do otherwise than go. Do not grieve, Savelitch: God is merciful,
perhaps we may see each other again. Have no scruples about
spending the money, and don’t be sparing of it. Buy whatever you
require, even though you have to pay three times the value of it. I
give this money to you. If in three days I do not return——”
“What are you talking about, my lord?” said Savelitch, interrupting
me. “Do you think that I could let you go alone? Do not imagine
anything of the kind. If you have resolved to go, I will accompany
you, even though it be on foot; I will not leave you. The idea of my
sitting down behind a stone wall without you! Do you think then that
I have gone out of my mind? Do as you please, my lord, but I will
not leave you.”
I knew that it was useless to dispute with Savelitch, and I allowed
him to prepare for the journey. In half an hour I was seated upon
the back of my good horse, while Savelitch was mounted upon a
lean and limping jade, which one of the inhabitants of the town had
given to him for nothing, not having the means to keep it any longer.
We reached the gates of the town; the sentinels allowed us to pass,
and we left Orenburg behind us.
It was beginning to grow dark. My road led past the village of Berd,
one of Pougatcheff’s haunts. The way was covered with snow, but
over the whole of the steppe could be seen the footprints of horses,
renewed every day. I rode forward at a quick trot. Savelitch could
hardly keep pace with me, and kept calling out:
“Not so fast, my lord, for Heaven’s sake, not so fast! My accursed
hack cannot keep up with your long-legged devil. Where are you off
to in such a hurry? It would be all very well if we were going to a
feast, but we are more likely going to run our heads into a noose....
Peter Andreitch ... little father ... Peter Andreitch! Lord God! the child
is rushing to destruction!”
We soon caught sight of the fires of Berd glimmering in the distance.
We approached some ravines, which served as natural defences to
the hamlet. Savelitch still followed me, and did not cease to utter his
plaintive entreaties. I hoped to be able to ride round the village
without being observed, when suddenly I perceived through the
darkness, straight in front of me, five peasants armed with clubs; it
was the advanced guard of Pougatcheff’s camp. They challenged us.
Not knowing the password, I wanted to ride on without saying
anything; but they immediately surrounded me, and one of them
seized hold of my horse’s bridle. I drew my sword and struck the
peasant on the head. His cap saved him, but he staggered and let
the reins fall from his hand. The others grew frightened and took to
their heels; I seized the opportunity, and, setting spurs to my horse,
I galloped off.
The increasing darkness of the night might have saved me from
further dangers, but, turning round all at once, I perceived that
Savelitch was no longer with me. The poor old man, with his lame
horse, had not been able to get clear of the robbers. What was to be
done? After waiting a few minutes for him, and feeling convinced
that he had been stopped, I turned my horse round to hasten to his
assistance.
Approaching the ravine, I heard in the distance confused cries, and
the voice of my Savelitch. I quickened my pace, and soon found
myself in the midst of the peasants who had stopped me a few
minutes before. Savelitch was among them. With loud shouts they
threw themselves upon me and dragged me from my horse in a
twinkling. One of them, apparently the leader of the band, informed
us that he was going to conduct us immediately before the Czar.
“And our father,” added he, “will decide whether you shall be hanged
immediately or wait till daylight.”
I offered no resistance; Savelitch followed my example, and the
sentinels led us away in triumph.
We crossed the ravine and entered the village. In all the huts fires
were burning. Noise and shouts resounded on every side. In the
streets I met a large number of people; but nobody observed us in
the darkness, and no one recognized in me an officer from
Orenburg. We were conducted straight to a cottage which stood at
the corner where two streets met. Before the door stood several
wine-casks and two pieces of artillery.
“This is the palace,” said one of the peasants; “we will announce you
at once.”
He entered the cottage. I glanced at Savelitch: the old man was
making the sign of the cross and muttering his prayers to himself.
I waited a long time; at last the peasant returned and said to me:
“Come inside; our father has given orders for the officer to be
brought before him.”
I entered the cottage, or the palace, as the peasants called it. It was
lighted by two tallow candles, and the walls were covered with gilt
paper; otherwise, the benches, the table, the little wash-hand basin
suspended by a cord, the towel hanging on a nail, the oven-fork in
the corner, the broad shelf loaded with pots—everything was the
same as in an ordinary cottage. Pougatcheff was seated under the
holy picture,[1] dressed in a red caftan and wearing a tall cap, and
with his arms set akimbo in a very self-important manner. Around
him stood several of his principal followers, with looks of feigned
respect and submission upon their faces. It was evident that the
news of the arrival of an officer from Orenburg had awakened a
great curiosity among the rebels, and that they had prepared to
receive me with as much pomp as possible. Pougatcheff recognized
me at the first glance. His assumed importance vanished all at once.
“Ah! your lordship!” said he gaily. “How do you do? What, in
Heaven’s name, has brought you here?”
I replied that I was travelling on my own business, and that his
people had stopped me.
“What business?” asked he.
I knew not what to reply. Pougatcheff, supposing that I did not like
to explain in the presence of witnesses, turned to his companions
and ordered them to go out of the room. All obeyed, except two,
who did not stir from their places.
“Speak boldly before them,” said Pougatcheff, “I do not hide
anything from them.”
I glanced stealthily at the impostor’s confidants. One of them, a
weazen-faced, crooked old man, with a short grey beard, had
nothing remarkable about him except a blue riband, which he wore
across his grey tunic. But never shall I forget his companion. He was
a tall, powerful, broad-shouldered man, and seemed to me to be
about forty-five years of age. A thick red beard, grey piercing eyes, a
nose without nostrils, and reddish scars upon his forehead and
cheeks, gave to his broad, pock-marked face an indescribable
expression. He had on a red shirt, a Kirghis robe, and Cossack
trousers. The first, as I learned afterwards, was the runaway
corporal Bailoborodoff; the other, Afanassy Sokoloff, surnamed
Khlopousha,[2] a condemned criminal, who had three times escaped
from the mines of Siberia. In spite of the feelings of agitation which
so exclusively occupied my mind at that time, the society in the
midst of which I so unexpectedly found myself awakened my
curiosity in a powerful degree. But Pougatcheff soon recalled me to
myself by his question:
“Speak! on what business did you leave Orenburg?”
A strange thought came into my head: it seemed to me that
Providence, by conducting me a second time into the presence of
Pougatcheff, gave me the opportunity of carrying my project into
execution. I determined to take advantage of it, and, without any
further reflection, I replied to Pougatcheff’s question:
“I was going to the fortress of Bailogorsk to rescue an orphan who is
oppressed there.”
Pougatcheff’s eyes sparkled.
“Which of my people dares to oppress the orphan?” cried he. “Were
he seven feet high he should not escape my judgment. Speak! who
is the culprit?”
“Shvabrin is the culprit,” replied I. “He holds captive the young girl
whom you saw ill at the priest’s house, and wants to force her to
marry him.”
“I will soon put Shvabrin in his right place,” said Pougatcheff fiercely.
“He shall learn what it is to oppress my people according to his own
will and pleasure. I will have him hanged.”
“Allow me to speak a word,” said Khlopousha in a hoarse voice.
“You were in too great a hurry in appointing Shvabrin to the
command of the fortress, and now you are in too great a hurry to
hang him. You have already offended the Cossacks by placing a
nobleman over them as their chief; do not now alarm the nobles by
hanging them at the first accusation.”
“They ought neither to be pitied nor favoured,” said the little old
man with the blue riband. “To hang Shvabrin would be no great
misfortune, neither would it be amiss to put this officer through a
regular course of questions. Why has he deigned to pay us a visit? If
he does not recognize you as Czar, he cannot come to seek justice
from you; and if he does recognize you, why has he remained up to
the present time in Orenburg along with your enemies? Will you not
order him to be conducted to the court-house, and have a fire lit
there?[3] It seems to me that his Grace is sent to us from the
generals in Orenburg.”
The logic of the old rascal seemed to me to be plausible enough. A
shudder passed through the whole of my body, when I thought into
whose hands I had fallen. Pougatcheff observed my agitation.
“Well, your lordship,” said he to me, winking his eyes; “my Field-
Marshal, it seems to me, speaks to the point. What do you think?”
Pougatcheff’s raillery restored my courage. I calmly replied that I
was in his power, and that he could deal with me in whatever way
he pleased.
“Good,” said Pougatcheff. “Now tell me, in what condition is your
town?”
“Thank God!” I replied, “everything is all right.”
“All right!” repeated Pougatcheff, “and the people are dying of
hunger!”
The impostor spoke the truth; but in accordance with the duty
imposed upon me by my oath, I assured him that what he had heard
were only idle reports, and that in Orenburg there was a sufficiency
of all kinds of provisions.
“You see,” observed the little old man, “that he deceives you to your
face. All the deserters unanimously declare that famine and sickness
are rife in Orenburg, that they are eating carrion there and think
themselves fortunate to get it to eat; and yet his Grace assures us
that there is plenty of everything there. If you wish to hang
Shvabrin, then hang this young fellow on the same gallows, that
they may have nothing to reproach each other with.”
The words of the accursed old man seemed to produce an effect
upon Pougatcheff. Fortunately, Khlopousha began to contradict his
companion.
“That will do, Naoumitch,” said he to him: “you only think of
strangling and hanging. What sort of a hero are you? To look at you,
one is puzzled to imagine how your body and soul contrive to hang
together. You have one foot in the grave yourself, and you want to
kill others. Haven’t you enough blood on your conscience?”
“And what sort of a saint are you?” replied Bailoborodoff. “Whence
this compassion on your side?”
“Without doubt,” replied Khlopousha, “I also am a sinner, and this
hand”—here he clenched his bony fist and, pushing back his sleeve,
disclosed his hairy arm—“and this hand is guilty of having shed
Christian blood. But I killed my enemy, and not my guest; on the
open highway or in a dark wood, and not in the house, sitting
behind the stove; with the axe and club, and not with old woman’s
chatter.”
The old man turned round and muttered the words: “Slit nostrils!”
“What are you muttering, you old greybeard?” cried Khlopousha. “I
will give you slit nostrils. Just wait a little, and your turn will come
too. Heaven grant that your nose may smell the pincers.... In the
meantime, take care that I don’t pull out your ugly beard by the
roots.”
“Gentlemen, generals!” said Pougatcheff loftily, “there has
been enough of this quarrelling between you. It would be no great
misfortune if all the Orenburg dogs were hanging by the heels from
the same crossbeam; but it would be a very great misfortune if our
own dogs were to begin devouring each other. So now make it up
and be friends again.”
Khlopousha and Bailoborodoff said not a word, but glared furiously
at each other. I felt the necessity of changing the subject of a
conversation which might end in a very disagreeable manner for me,
and turning to Pougatcheff, I said to him with a cheerful look:
“Ah! I had almost forgotten to thank you for the horse and pelisse.
Without you I should never have reached the town, and I should
have been frozen to death on the road.”
My stratagem succeeded. Pougatcheff became good-humoured
again.
“The payment of a debt is its beauty,” said he, winking his eyes.
“And now tell me, what have you to do with this young girl whom
Shvabrin persecutes? Has she kindled a flame in your young heart,
eh?”
“She is my betrothed,” I replied, observing a favourable change in
the storm, and not deeming it necessary to conceal the truth.
“Your betrothed!” exclaimed Pougatcheff. “Why did you not say so
before? We will marry you, then, and have some merriment at your
wedding!”
Then turning to Bailoborodoff:
“Listen, Field-Marshal!” said he to him: “his lordship and I are old
friends; let us sit down to supper; morning’s judgment is wiser than
that of evening—so we will see to-morrow what is to be done with
him.”
I would gladly have declined the proposed honour, but there was no
help for it. Two young Cossack girls, daughters of the owner of the
cottage, covered the table with a white cloth, and brought in some
bread, fish-soup, and several bottles of wine and beer, and for the
second time I found myself seated at the same table with
Pougatcheff and his terrible companions.
The drunken revel, of which I was an involuntary witness, continued
till late into the night. At last, intoxication began to overcome the
three associates. Pougatcheff fell off to sleep where he was sitting:
his companions rose and made signs to me to leave him where he
was. I went out with them. By order of Khlopousha, the sentinel
conducted me to the justice-room, where I found Savelitch, and
where they left me shut up with him. My servant was so astonished
at all he saw and heard, that he could not ask me a single question.
He lay down in the dark, and continued to sigh and moan for a long
time; but at length he began to snore, and I gave myself up to
meditations, which hindered me from obtaining sleep for a single
minute during the whole of the night.
The next morning, Pougatcheff gave orders for me to be brought
before him. I went to him. In front of his door stood a kibitka, with
three Tartar horses harnessed to it. The crowd filled the street. I
encountered Pougatcheff in the hall. He was dressed for a journey,
being attired in a fur cloak and a Kirghis cap. His companions of the
night before stood around him, exhibiting an appearance of
submission, which contrasted strongly with everything that I had
witnessed the previous evening. Pougatcheff saluted me in a
cheerful tone, and ordered me to sit down beside him in the kibitka.
We took our seats.
“To the fortress of Bailogorsk!” said Pougatcheff to the broad-
shouldered Tartar who drove the vehicle. My heart beat violently.
The horses broke into a gallop, the little bell tinkled, and the kibitka
flew over the snow.
“Stop! stop!” cried a voice which I knew only too well, and I saw
Savelitch running towards us.
Pougatcheff ordered the driver to stop.
“Little father, Peter Andreitch!” cried my servant; “do not leave me in
my old age among these scoun——”
“Ah, old greybeard!” said Pougatcheff to him. “It is God’s will that we
should meet again. Well, spring up behind.”
“Thanks, Czar, thanks, my own father!” replied Savelitch, taking his
seat. “May God give you a hundred years of life and good health for
deigning to cast your eyes upon and console an old man. I will pray
to God for you all the days of my life, and I will never again speak
about the hareskin pelisse.”
This allusion to the hareskin pelisse might have made Pougatcheff
seriously angry. Fortunately, the usurper did not hear, or pretended
not to hear, the misplaced remark. The horses again broke into a
gallop; the people in the streets stood still and made obeisance.
Pougatcheff bowed his head from side to side. In about a minute we
had left the village behind us and were flying along over the smooth
surface of the road.
One can easily imagine what my feelings were at that moment. In a
few hours I should again set eyes upon her whom I had already
considered as lost to me for ever. I pictured to myself the moment of
our meeting.... I thought also of the man in whose hands lay my
fate, and who, by a strange concourse of circumstances, had
become mysteriously connected with me. I remembered the
thoughtless cruelty and the bloodthirsty habits of him, who now
constituted himself the deliverer of my beloved. Pougatcheff did not
know that she was the daughter of Captain Mironoff; the
exasperated Shvabrin might reveal everything to him; it was also
possible that Pougatcheff might find out the truth in some other
way.... Then what would become of Maria Ivanovna? A shudder
passed through my frame, and my hair stood on end.
Suddenly Pougatcheff interrupted my meditations, by turning to me
with the question:
“What is your lordship thinking of?”
“What should I not be thinking of,” I replied. “I am an officer and a
gentleman; only yesterday I was fighting against you, and now to-
day I am riding side by side with you in the same carriage, and the
happiness of my whole life depends upon you.”
“How so?” asked Pougatcheff. “Are you afraid?”
I replied that, having already had my life spared by him,
I hoped, not only for his mercy, but even for his assistance.
“And you are right; by God, you are right!” said the impostor. “You
saw that my fellows looked askant at you; and this morning the old
man persisted in his statement that you were a spy, and that it was
necessary that you should be interrogated by means of torture and
then hanged. But I would not consent to it,” he added, lowering his
voice, so that Savelitch and the Tartar should not be able to hear
him, “because I remembered your glass of wine and hareskin
pelisse. You see now that I am not such a bloodthirsty creature as
your brethren maintain.”
I recalled to mind the capture of the fortress of Bailogorsk, but I did
not think it advisable to contradict him, and so I made no reply.
“What do they say of me in Orenburg?” asked Pougatcheff, after a
short interval of silence.
“They say that it will be no easy matter to get the upper hand of
you; and there is no denying that you have made yourself felt.”
The face of the impostor betokened how much his vanity was
gratified by this remark.
“Yes,” said he, with a look of self-satisfaction, “I wage war to some
purpose. Do you people in Orenburg know about the battle of
Youzeiff?[4] Forty general officers killed, four armies taken captive.
Do you think the King of Prussia could do as well as that?”
The boasting of the brigand appeared to me to be somewhat
amusing.
“What do you think about it yourself?” I said to him: “do you think
that you could beat Frederick?”
“Fedor Fedorovitch?[5] And why not? I beat your generals, and they
have beaten him. My arms have always been successful up till now.
But only wait awhile, you will see something very different when I
march to Moscow.”
“And do you intend marching to Moscow?”
The impostor reflected for a moment and then said in a low voice:
“God knows. My road is narrow; my will is weak. My followers do not
obey me. They are scoundrels. I must keep a sharp look-out; at the
first reverse they will save their own necks at the expense of my
head.”
“That is quite true,” I said to Pougatcheff. “Would it not be better for
you to separate yourself from them in good time, and throw yourself
upon the mercy of the Empress?”
Pougatcheff smiled bitterly.
“No,” replied he: “it is too late for me to repent now. There would be
no pardon for me. I will go on as I have begun. Who knows?
Perhaps I shall be successful. Grishka Otrepieff was made Czar at
Moscow.”
“And do you know what his end was? He was flung out of a window,
his body was cut to pieces and burnt, and then his ashes were
placed in a cannon and scattered to the winds!”
“Listen,” said Pougatcheff with a certain wild inspiration. “I will tell
you a tale which was told to me in my childhood by an old Calmuck.
‘The eagle once said to the crow: “Tell me, crow, why is it that you
live in this bright world for three hundred years, and I only for thirty-
three years?” “Because, little father,” replied the crow, “you drink live
blood, and I live on carrion.”—The eagle reflected for a little while
and then said: “Let us both try and live on the same food.”—“Good!
agreed!” The eagle and the crow flew away. Suddenly they caught
sight of a fallen horse, and they alighted upon it. The crow began to
pick its flesh and found it very good. The eagle tasted it once, then a
second time, then shook its pinions and said to the crow: “No,
brother crow; rather than live on carrion for three hundred years, I
would prefer to drink live blood but once, and trust in God for what
might happen afterwards!”’ What do you think of the Calmuck’s
story?”
“It is very ingenious,” I replied. “But to live by murder and robbery
is, in my opinion, nothing else than living on carrion.”
Pougatcheff looked at me in astonishment and made no reply. We
both became silent, each being wrapped in his own thoughts. The
Tartar began to hum a plaintive song. Savelitch, dozing, swayed
from side to side. The kibitka glided along rapidly over the smooth
frozen road.... Suddenly I caught sight of a little village on the steep
bank of the Yaik, with its palisade and belfry, and about a quarter of
an hour afterwards we entered the fortress of Bailogorsk.

[1] The picture of some saint, usually painted on wood. There is
generally one of them hung in the corner of every room in the
houses of the Russians.
[2] The name of a celebrated bandit of the last century, who for a
long time offered resistance to the Imperial troops.
[3] For the purpose of torture.
[4] An engagement in which Pougatcheff had the advantage.
[5] The name given to Frederick the Great by the Russian
soldiers.

CHAPTER XII.

THE ORPHAN.

The kibitka drew up in front of the Commandant’s house. The
inhabitants had recognized Pougatcheff’s little bell, and came
crowding around us. Shvabrin met the impostor at the foot of the
steps. He was dressed as a Cossack, and had allowed his beard to
grow. The traitor helped Pougatcheff to alight from the kibitka,
expressing, in obsequious terms, his joy and zeal. On seeing me, he
became confused; but quickly recovering himself, he stretched out
his hand to me, saying:
“And are you also one of us? You should have been so long ago!”
I turned away from him and made no reply.
My heart ached when we entered the well-known room, on the wall
of which still hung the commission of the late Commandant, as a
mournful epitaph of the past. Pougatcheff seated himself upon the
same sofa on which Ivan Kouzmitch was accustomed to fall asleep,
lulled by the scolding of his wife. Shvabrin himself brought him some
brandy. Pougatcheff drank a glass, and said to him, pointing to me:
“Give his lordship a glass.”
Shvabrin approached me with his tray, but I turned away from him a
second time. He seemed to have become quite another person. With
his usual sagacity, he had certainly perceived that Pougatcheff was
dissatisfied with him. He cowered before him, and glanced at me
with distrust.
Pougatcheff asked some questions concerning the condition of the
fortress, the reports referring to the enemy’s army, and the like.
Then suddenly and unexpectedly he said to him:
“Tell me, my friend, who is this young girl that you hold a prisoner
here? Show her to me.”
Shvabrin turned as pale as death.
“Czar,” said he, in a trembling voice... “Czar, she is not a prisoner ...
she is ill ... she is in bed.”
“Lead me to her,” said the impostor, rising from his seat.
Refusal was impossible. Shvabrin conducted Pougatcheff to Maria
Ivanovna’s room. I followed behind them.
Shvabrin stopped upon the stairs.
“Czar,” said he: “you may demand of me whatever you please; but
do not permit a stranger to enter my wife’s bedroom.”
I shuddered.
“So you are married!” I said to Shvabrin, ready to tear him to pieces.
“Silence!” interrupted Pougatcheff: “that is my business. And you,”
he continued, turning to Shvabrin, “keep your airs and graces to
yourself: whether she be your wife or whether she be not, I will take
to her whomsoever I please. Your lordship, follow me.”
At the door of the room Shvabrin stopped again, and said in a
faltering voice:
“Czar, I must inform you that she is in a high fever, and has been
raving incessantly for the last three days.”
“Open the door!” said Pougatcheff.
Shvabrin began to search in his pockets and then said that he had
not brought the key with him. Pougatcheff pushed the door with his
foot; the lock gave way, the door opened, and we entered.
I glanced round the room—and nearly fainted away. On the floor,
clad in a ragged peasant’s dress, sat Maria Ivanovna, pale, thin, and
with dishevelled hair. Before her stood a pitcher of water, covered
with a piece of bread. Seeing me, she shuddered and uttered a
piercing cry. What I felt at that moment I cannot describe.
Pougatcheff looked at Shvabrin and said with a sarcastic smile:
“You have a very nice hospital here!”
Then approaching Maria Ivanovna:
“Tell me, my little dove, why does your husband punish you in this
manner?”
“My husband!” repeated she. “He is not my husband. I will never be
his wife! I would rather die, and I will die, if I am not set free.”
Pougatcheff cast a threatening glance at Shvabrin.
“And you have dared to deceive me!” he said to him. “Do you know,
scoundrel, what you deserve?”
Shvabrin fell upon his knees.... At that moment contempt
extinguished within me all feelings of hatred and resentment. I
looked with disgust at the sight of a nobleman grovelling at the feet
of a runaway Cossack.
Pougatcheff relented.
“I forgive you this time,” he said to Shvabrin: “but bear in mind that
the next time you are guilty of an offence, I will remember this one
also.”
Then he turned to Maria Ivanovna and said to her kindly:
“Go, my pretty girl; I give you your liberty. I am the Czar.”
Maria Ivanovna glanced rapidly at him, and intuitively divined that
before her stood the murderer of her parents. She covered her face
with both hands and fainted away. I hastened towards her; but at
that moment my old acquaintance, Palasha, very boldly entered the
room, and began to attend to her young mistress. Pougatcheff
quitted the apartment, and we all three entered the parlour.
“Well, your lordship,” said Pougatcheff smiling, “we have set the
pretty girl free! What do you say to sending for the pope and making
him marry his niece to you? If you like, I will act as father, and
Shvabrin shall be your best man. We will then smoke and drink and
make ourselves merry to our hearts’ content!”
What I feared took place. Shvabrin, hearing Pougatcheff’s proposal,
was beside himself with rage.
“Czar!” he exclaimed, in a transport of passion, “I am guilty; I have
lied to you; but Grineff is deceiving you also. This young girl is not
the pope’s niece: she is the daughter of Ivan Mironoff, who was
hanged at the taking of the fortress.”
Pougatcheff glanced at me with gleaming eyes.
“What does this mean?” he asked in a gloomy tone.
“Shvabrin has told you the truth,” I replied in a firm voice.
“You did not tell me that,” replied Pougatcheff, whose face had
become clouded.
“Judge of the matter yourself,” I replied: “could I, in the presence of
your people, declare that she was the daughter of Mironoff? They
would have torn her to pieces! Nothing would have saved her!”
“You are right,” said Pougatcheff smiling. “My drunkards would not
have spared the poor girl; the pope’s wife did well to deceive them.”
“Listen,” I continued, seeing him so well disposed; “I know not what
to call you, and I do not wish to know.... But God is my witness that
I would willingly repay you with my life for what you have done for
me. But do not demand of me anything that is against my honour
and my Christian conscience. You are my benefactor. End as you
have begun: let me go away with that poor orphan wherever God
will direct us. And wherever you may be, and whatever may happen
to you, we will pray to God every day for the salvation of your
soul....”
Pougatcheff’s fierce soul seemed touched.
“Be it as you wish!” said he. “Punish thoroughly or pardon
thoroughly: that is my way. Take your beautiful one, take her
wherever you like, and may God grant you love and counsel!”
Then he turned to Shvabrin and ordered him to give me a safe
conduct for all barriers and fortresses subjected to his authority.
Shvabrin, completely dumbfounded, stood as if petrified. Pougatcheff
then went off to inspect the fortress. Shvabrin accompanied him,
and I remained behind under the pretext of making preparations for
my departure.
I hastened to Maria’s room. The door was locked. I knocked.
“Who is there?” asked Palasha.
I called out my name. The sweet voice of Maria Ivanovna sounded
from behind the door:
“Wait a moment, Peter Andreitch. I am changing my dress. Go to
Akoulina Pamphilovna; I shall be there presently.”
I obeyed and made my way to the house of Father Gerasim. He and
his wife came forward to meet me; Savelitch had already informed
them of what had happened.
“You are welcome, Peter Andreitch,” said the pope’s wife. “God has
ordained that we should meet again. And how are you? Not a day
has passed without our talking about you. And Maria Ivanovna, the
poor little dove, what has she not suffered while you have been
away! But tell us, little father, how did you manage to arrange
matters with Pougatcheff? How was it that he did not put you to
death? The villain be thanked for that, at all events!”
“Enough, old woman,” interrupted Father Gerasim. “Don’t babble
about everything that you know. There is no salvation for
chatterers. Come in, Peter Andreitch, I beg of you. It is a long, long
time since we saw each other.” The pope’s wife set before me
everything that she had in the house, without ceasing to chatter
away for a single moment. She related to me in what manner
Shvabrin had compelled them to deliver Maria Ivanovna up to him;
how the poor girl wept and did not wish to be parted from them;
how she had kept up a constant communication with them by
means of Palashka[1] (a bold girl who compelled the orderly himself
to dance to her pipe); how she had advised Maria Ivanovna to write
a letter to me, and so forth.
I then, in my turn, briefly related to them my story. The pope and
his wife made the sign of the cross on hearing that Pougatcheff had
become acquainted with their deception.
“The power of the Cross defend us!” ejaculated Akoulina
Pamphilovna. “May God grant that the cloud will pass over. Well,
well, Alexei Ivanitch, you are a very nice fellow: there is no denying
that!”
At that moment the door opened, and Maria Ivanovna entered the
room with a smile upon her pale face. She had doffed her peasant’s
dress, and was attired as before, plainly and becomingly.
I grasped her hand and for some time could not utter a single word.
We were both silent from fulness of heart. Our hosts felt that their
presence was unnecessary to us, and so they withdrew. We were
left by ourselves. Everything else was forgotten. We talked and
talked and could not say enough to each other. Maria related to me
all that had happened to her since the capture of the fortress; she
described to me all the horror of her situation, all the trials which
she had experienced at the hands of the detestable Shvabrin. We
recalled to mind the happy days of the past, and we could not
prevent the tears coming into our eyes. At last I began to explain to
her my project. For her to remain in the fortress, subjected to
Pougatcheff and commanded by Shvabrin, was impossible. Neither
could I think of taking her to Orenburg, just then undergoing all the
calamities of a siege. She had not a single relative in the whole
world. I proposed to her that she should seek shelter with my
parents. She hesitated at first: my father’s unfriendly disposition
towards her frightened her. I made her mind easy on that score. I
knew that my father would consider himself bound in honour to
receive into his house the daughter of a brave and deserving soldier
who had lost his life in the service of his country.
“Dear Maria Ivanovna,” I said at last: “I look upon you as my wife.
Strange circumstances have united us together indissolubly; nothing
in the world can separate us.”
Maria Ivanovna listened to me without any assumption of
affectation. She felt that her fate was linked with mine. But she
repeated that she would never be my wife, except with the consent
of my parents. I did not contradict her. We kissed each other
fervently and passionately, and in this manner everything was
resolved upon between us.
About an hour afterwards, the orderly brought me my safe conduct,
inscribed with Pougatcheff’s scrawl, and informed me that his master
wished to see me. I found him ready to set out on his road. I cannot
describe what I felt on taking leave of this terrible man, this outcast,
so villainously cruel to all except myself alone. But why should I not
tell the truth? At that moment I felt drawn towards him by a
powerful sympathy. I ardently wished to tear him away from the
midst of the scoundrels, whom he commanded, and save his head
while there was yet time. Shvabrin, and the crowd gathered around
us, prevented me from giving expression to all that filled my heart.
We parted as friends. Pougatcheff, catching sight of Akoulina
Pamphilovna among the crowd, threatened her with his finger and
winked significantly; then he seated himself in his kibitka[2] and
gave orders to return to Berd; and when the horses started off, he
leaned out of the carriage and cried out to me: “Farewell, your
lordship! Perhaps we shall see each other again!”
We did indeed see each other again, but under what circumstances!
Pougatcheff was gone. I stood for a long time gazing across the
white steppe, over which his troika[6] went gliding rapidly. The
crowd dispersed. Shvabrin disappeared. I returned to the pope’s
house. Everything was ready for our departure; I did not wish to
delay any longer. Our luggage had already been deposited in the
Commandant’s old travelling carriage. The horses were harnessed in
a twinkling. Maria Ivanovna went to pay a farewell visit to the graves
of her parents, who were buried behind the church. I wished to
accompany her, but she begged of me to let her go alone. After a
few minutes she returned silently weeping. The carriage was ready.
Father Gerasim and his wife came out upon the steps. Maria
Ivanovna, Palasha and I took our places inside the kibitka, while
Savelitch seated himself in the front.
“Farewell, Maria Ivanovna, my little dove; farewell, Peter Andreitch,
my fine falcon!” said the pope’s good wife. “A safe journey, and may
God bless you both and make you happy!”
We drove off. At the window of the Commandant’s house I perceived
Shvabrin standing. His face wore an expression of gloomy malignity.
I did not wish to triumph over a defeated enemy, so I turned my
eyes the other way.
At last we passed out of the gate, and left the fortress of Bailogorsk
behind us for ever.

[1] Diminutive of Palasha.


[2] An open vehicle drawn by three horses yoked abreast.

CHAPTER XIII.

THE ARREST.

United so unexpectedly with the dear girl, about whom I was so
terribly uneasy that very morning, I could scarcely believe the
evidence of my senses, and imagined that everything that had
happened to me was nothing but an empty dream. Maria Ivanovna
gazed thoughtfully, now at me, now at the road, and seemed as if
she had not yet succeeded in recovering her senses. We were both
silent. Our hearts were too full of emotion. The time passed almost
imperceptibly, and after journeying for about two hours, we reached
the next fortress, which was also subject to Pougatcheff. Here we
changed horses. By the rapidity with which this was effected, and by
the obliging manner of the bearded Cossack who had been
appointed Commandant by Pougatcheff, I perceived that, thanks to
the gossip of our driver, I was taken for a favourite of their master.
We continued our journey. It began to grow dark. We approached a
small town, where, according to the bearded Commandant, there
was a strong detachment on its way to join the impostor. We were
stopped by the sentries. In answer to the challenge: “Who goes
there?” our driver replied in a loud voice: “The Czar’s friend with his
little wife.”
Suddenly a troop of hussars surrounded us, uttering the most
terrible curses.
“Step down, friend of the devil!” said a moustached sergeant-major.
“We will make it warm for you and your little wife!”
I got out of the kibitka and requested to be brought before their
commander. On seeing my officer’s uniform, the soldiers ceased their
imprecations, and the sergeant conducted me to the major.
Savelitch followed me, muttering:
“So much for your being a friend of the Czar! Out of the frying-pan
into the fire. Lord Almighty! how is all this going to end?”
The kibitka followed behind us at a slow pace.
In about five minutes we arrived at a small, well-lighted house. The
sergeant-major left me under a guard and entered to announce me.
He returned immediately and informed me that his Highness had no
time to receive me, but that he had ordered that I should be taken
to prison, and my wife conducted into his presence.