% vim: tw=50
% 07/03/2022 10AM
\newpage
\section{Tensors}
\subsection{What is a Tensor?}
Not any list of $n$ numbers constitutes a vector in $\RR^n$. They come with certain responsibilities.

\myskip
We start with a point $\bf{x} \in \RR^n$. To attach some coordinates to this, we first introduce a basis $\{\bf{e}_i, i = 1, \dots, n\}$ such that \[ \bf{e}_i \cdot \bf{e}_j = \delta_{ij} \] and we write $\bf{x} = x_i \bf{e}_i$. We call the set of components $x_i = (x_1, \dots, x_n)$ a ``vector''. It's a set of labels to specify $\bf{x}$.

\myskip
Alternatively, we could use the basis \[ \bf{e}_i' = R_{ij} \bf{e}_j \tag{$*$} \] We insist that $\bf{e}_i' \cdot \bf{e}_j' = \delta_{ij}$. \[ \implies R_{ik} R_{jl} \bf{e}_k \cdot \bf{e}_l = R_{ik} R_{jl} \delta_{kl} = R_{ik} R_{jk} = \delta_{ij} \] \[ \implies RR^\top = \mathbbm{1} \] Such matrices are called \emph{orthogonal}. We write $R \in O(n)$. We have \[ \det(RR^\top) = (\det R)^2 = 1 \implies \det R = \pm 1 \] If $\det R = +1$, then $R$ corresponds to a rotation and we write $R \in SO(n)$ (special orthogonal). \\
If $\det R = -1$, it is a rotation combined with a reflection.

\myskip
Under a change of basis, $\bf{x}$ doesn't change. We have \[ \bf{x} = x_i \bf{e}_i = x_i' \bf{e}_i' = x_i' R_{ij} \bf{e}_j \] Taking the dot product with $\bf{e}_k$, \[ \bf{x} \cdot \bf{e}_k = x_k = x_i' R_{ik} \] Inverting (multiply by $R_{jk}$ and use $R_{ik} R_{jk} = \delta_{ij}$): \[ \implies x_i' = R_{ij} x_j \]

\myskip
A \emph{tensor} $T$ is a generalisation of these ideas to an object with more indices. When measured with respect to the basis $\{\bf{e}_i\}$, a \emph{tensor of rank $p$} (or \emph{$p$-tensor}) has components \[ T_{i_1 \cdots i_p} \] Under a change of basis ($*$) we have the \emph{tensor transformation rule} \[ T_{i_1 \cdots i_p}' = R_{i_1 j_1} \cdots R_{i_p j_p} T_{j_1 \cdots j_p} \]
\begin{note*}
A 0-tensor is a number. \\
A 1-tensor is a vector. \\
A 2-tensor is a matrix such that $T_{ij}' = R_{ik} R_{jl} T_{kl}$.
\end{note*}
\begin{example*}
There is one special rank 2 tensor in $\RR^n$: \[ \delta_{ij} = \begin{cases} 1 & i = j \\ 0 & \text{otherwise} \end{cases} \] This is the same in all bases since \[ \delta_{ij}' = R_{ik} R_{jl} \delta_{kl} = R_{ik} R_{jk} = \delta_{ij} \] It's an example of an \emph{invariant} tensor.
\end{example*}

\subsubsection*{Tensors as Maps}
There is an equivalent, coordinate independent view. A $p$-tensor is a multi-linear map \[ T : \ub{\RR^n \times \cdots \times \RR^n}_{p} \to \RR \] such that \[ T(\bf{a}, \bf{b}, \cdots, \bf{c}) = T_{i_1 \cdots i_p} a_{i_1} b_{i_2} \cdots c_{i_p} \] (multi-linear = linear in each entry separately).

\myskip
The tensor transformation rule ensures that the map is independent of the choice of basis: using $R^\top R = \mathbbm{1}$, i.e. $R_{ij} R_{ik} = \delta_{jk}$,
\begin{align*}
T(\bf{a}, \bf{b}, \cdots, \bf{c}) &= T_{i_1 \cdots i_p}' a_{i_1}' b_{i_2}' \cdots c_{i_p}' \\
&= (R_{i_1 j_1} \cdots R_{i_p j_p}) T_{j_1 \cdots j_p} \times (R_{i_1 k_1} a_{k_1}) \cdots (R_{i_p k_p} c_{k_p}) \\
&= T_{j_1 \cdots j_p} a_{j_1} b_{j_2} \cdots c_{j_p}
\end{align*}
Alternatively, we can think of a tensor as a map between lower rank tensors. For example, a $p$-tensor can be viewed as a map \[ T : \ub{\RR^n \times \cdots \times \RR^n}_{p - 1} \to \RR^n \] The map is \[ a_i = T_{i j_1 \cdots j_{p - 1}} b_{j_1} \cdots c_{j_{p - 1}} \] This is the way that tensors originally appear in maths and physics, typically as a map from vectors to vectors. \[ \bf{u} = T \bf{v} \implies u_i = T_{ij} v_j \] $T$ is a matrix but, importantly, transforms as a tensor, so the equation holds in all bases: \[ T_{ij}' = R_{ik} R_{jl} T_{kl} \] or \[ T' = RTR^\top \]

\subsubsection*{Tensor Operations}
\begin{itemize}
\item If $S$, $T$ are tensors of the same rank, then so are $S + T$ and $\lambda T$ for $\lambda \in \RR$.
\item If $S$ is a $p$-tensor and $T$ is a $q$-tensor, then we can form a $(p + q)$-tensor known as the \emph{tensor product}: \[ (S \otimes T)_{i_1 \cdots i_p j_1 \cdots j_q} = S_{i_1 \cdots i_p} T_{j_1 \cdots j_q} \] For example, given two vectors $\bf{a}$ and $\bf{b}$ we can form the matrix \[ (\bf{a} \otimes \bf{b})_{ij} = a_i b_j \]
\item If $T$ is a $p$-tensor then we can construct a $(p - 2)$-tensor by \emph{contraction}: \[ \delta_{ij} T_{ij k_1 \cdots k_{p - 2}} = T_{ii k_1 \cdots k_{p - 2}} \] For example, \[ \operatorname{Tr} T = T_{ii} \] for a 2-tensor.
\end{itemize}

\myskip
We can combine the tensor product and contraction. If $P$ is a $p$-tensor and $Q$ is a $q$-tensor, we can form a $(p + q - 2)$-tensor. For example, contraction on the first index gives \[ P_{i k_1 \cdots k_{p - 1}} Q_{i l_1 \cdots l_{q - 1}} \] For example, given vectors $\bf{a}, \bf{b}$, \[ \delta_{ij} a_i b_j = \bf{a} \cdot \bf{b} \] is a 0-tensor. This is just the usual inner product. Another example is matrix multiplication: $(PQ)_{ij} = P_{ik} Q_{kj}$ is the contraction of $(P \otimes Q)_{ikli}$ over the second index of $P$ and the first index of $Q$.
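The transformation rules above can be checked explicitly in a standard worked example (not from the lecture: the $2 \times 2$ rotation matrix, written here as a sketch in the notation of this section).

```latex
\begin{example*}
% A concrete check in $\RR^2$: the rotation matrix is orthogonal,
% and a 2-tensor transformed by $T' = RTR^\top$ maps vectors
% consistently in both bases.
Take
\[ R = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
   \implies RR^\top = \begin{pmatrix} \cos^2\theta + \sin^2\theta & 0 \\ 0 & \sin^2\theta + \cos^2\theta \end{pmatrix} = \mathbbm{1} \]
with $\det R = \cos^2\theta + \sin^2\theta = +1$, so $R \in SO(2)$.
Now suppose $u_i = T_{ij} v_j$. Since $v_j' = R_{jk} v_k$ inverts
(using $R^\top R = \mathbbm{1}$) to $v_k = R_{lk} v_l'$, we have
\[ u_i' = R_{ij} u_j = R_{ij} T_{jk} v_k = R_{ij} T_{jk} R_{lk} v_l'
        = (RTR^\top)_{il} v_l' = T_{il}' v_l' \]
so the equation $\bf{u} = T\bf{v}$ holds in the primed basis too,
exactly because $T$ transforms as a 2-tensor.
\end{example*}
```

The same index manipulation works for any $p$-tensor: each primed index on $T'$ absorbs one factor of $R$ from each transformed vector via $R_{ij} R_{ik} = \delta_{jk}$.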