7. Tensor Algebra#
This section introduces the basic concepts of tensor algebra.
Warning
This introduction is meant for spaces equipped with an inner product, which allows one to introduce quite naturally the useful concept of the reciprocal basis of a given basis of \(\mathscr{V}\), belonging to the same space \(\mathscr{V}\). This avoids the complications coming from the introduction of the dual space \(\mathscr{V}^*\), its bases, and everything else that approach requires.
7.1. Vector space \(\mathscr{V}\)#
A vector space is an algebraic structure with:
a set \(\mathscr{V}\), whose elements are called vectors, here indicated with bold symbols \(\mathbf{v} \in \mathscr{V}\)
a field \(K\) (usually \(\mathbb{R}\) or \(\mathbb{C}\)), whose elements are called scalars,
2 operations, closed w.r.t. the set \(\mathscr{V}\):
vector sum, \(\mathbf{u} + \mathbf{v} \in \mathscr{V}\) for all \(\mathbf{u}, \ \mathbf{v} \in \mathscr{V}\)
multiplication of a vector by a scalar, \(a \mathbf{v} \in \mathscr{V}\) for all \(a \in K, \ \mathbf{v} \in \mathscr{V}\)
with properties discussed below todo …
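The two operations can be sketched numerically. This is a minimal NumPy illustration of \(\mathbb{R}^3\) as a vector space over \(\mathbb{R}\); the vectors and scalars are illustrative values, not taken from the text.

```python
import numpy as np

# Vectors of R^3 represented as NumPy arrays, scalars as floats.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
a = 2.0

s = u + v   # vector sum: the result is again in R^3
w = a * v   # multiplication by a scalar: again in R^3

# A couple of the vector-space axioms, checked numerically:
assert np.allclose(u + v, v + u)                # commutativity of the sum
assert np.allclose(a * (u + v), a * u + a * v)  # distributivity
```
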
7.1.1. Operations (I)#
7.1.1.1. Sum#
…
7.1.1.2. Multiplication by a scalar#
…
Definition 7.1 (Linear combination)
A linear combination of \(n\) vectors \(\{ \mathbf{v}_i \}_{i=1:n}\), \(\mathbf{v}_i \in \mathscr{V}\), is the weighted sum
having used Einstein’s summation convention over repeated indices. Here the position of the indices (upper or lower) has no particular meaning, but it will acquire one in the following sections.
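The implied sum over the repeated index maps directly onto `numpy.einsum`. A small sketch with illustrative values, writing the linear combination \(a^i \mathbf{v}_i\) of \(n = 3\) vectors of \(\mathbb{R}^2\):

```python
import numpy as np

# Row i of v is the vector v_i; a[i] is the weight a^i.
v = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
a = np.array([2.0, 3.0, 1.0])

# Linear combination a^i v_i: the repeated index i is summed over.
w = np.einsum('i,id->d', a, v)   # -> 2*[1,0] + 3*[0,1] + 1*[1,1]
```
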
7.1.1.3. Inner product#
The existence of an inner product is not a requirement of a vector space.
…
An inner product on a vector space \(\mathscr{V}\) over the field \(K\) is an operation \(\cdot: \mathscr{V} \times \mathscr{V} \rightarrow K\),
with the following properties: todo
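As a numerical sketch, the standard inner product of \(\mathbb{R}^d\) satisfies the properties expected of an inner product over \(\mathbb{R}\) (symmetry, linearity in each argument, positive definiteness); the vectors below are illustrative values:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
w = np.array([0.5, 4.0])
a = 2.0

dot = np.dot(u, v)   # u . v = 1*3 + 2*(-1)

# Symmetry, linearity, positive definiteness (for u != 0):
assert np.isclose(np.dot(u, v), np.dot(v, u))
assert np.isclose(np.dot(a * u + w, v), a * np.dot(u, v) + np.dot(w, v))
assert np.dot(u, u) > 0
```
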
7.1.2. Basis of a vector space \(\mathscr{V}\)#
Definition 7.2 (Basis)
A basis is a minimal set of vectors of \(\mathscr{V}\) that can represent every element of \(\mathscr{V}\) as a linear combination.
Definition 7.3 (Reciprocal basis)
In an inner product space, the reciprocal basis of a given basis \(\{ \mathbf{b}_a \}_{a=1:d}\) is the set of vectors \(\{ \mathbf{b}^{b} \}_{b=1:d}\), s.t. \(\mathbf{b}^{b} \cdot \mathbf{b}_a = \delta^{b}_{a}\).
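In \(\mathbb{R}^d\) with the standard inner product, the defining relation \(\mathbf{b}^{b} \cdot \mathbf{b}_a = \delta^{b}_{a}\) can be solved with a matrix inverse: if the rows of a matrix `B` are the basis vectors \(\mathbf{b}_a\), the rows of \((B^{-1})^T\) are the reciprocal vectors. A sketch with an illustrative non-orthogonal basis of \(\mathbb{R}^2\):

```python
import numpy as np

# Row a of B is the basis vector b_a (non-orthogonal, illustrative).
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Reciprocal basis: rows of inv(B)^T, so that b^b . b_a = delta^b_a.
B_rec = np.linalg.inv(B).T

delta = B_rec @ B.T   # entry (b, a) is the dot product b^b . b_a
```
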
Definition 7.4 (“Metric tensor”)
The following holds
“Proof”
Taking the dot product of relation (7.1)(1) with \(\mathbf{b}_c\),
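Numerically, the metric components \(g_{ab} = \mathbf{b}_a \cdot \mathbf{b}_b\) and their inverse \(g^{ab}\) relate the two bases: raising the index, \(\mathbf{b}^a = g^{ab} \mathbf{b}_b\), reproduces the reciprocal basis. A sketch with the same kind of illustrative basis:

```python
import numpy as np

# Rows of B are the basis vectors b_a.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

g = B @ B.T                 # metric components g_ab = b_a . b_b
g_inv = np.linalg.inv(g)    # inverse metric g^ab

# Raising the index with g^ab gives the reciprocal basis: b^a = g^ab b_b.
B_rec = g_inv @ B
assert np.allclose(B_rec @ B.T, np.eye(2))   # b^a . b_b = delta^a_b
```
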
7.1.3. Change of basis#
Let \(T_{a}^b\) be the elements of the change-of-basis matrix, representing the vectors of the basis \(\{ \widetilde{\mathbf{b}}_a \}\) as linear combinations of the vectors of the basis \(\{ \mathbf{b}_b \}\),
Inverse transformation. Let \(\widetilde{T}_{a}^{b}\) be the elements of the inverse transformation,
and thus
Transformation of the reciprocal basis. todo
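The composition of the direct and inverse transformations must give the identity, \(\widetilde{T}_{a}^{c} T_{c}^{b} = \delta_{a}^{b}\). A NumPy sketch with an illustrative invertible matrix:

```python
import numpy as np

# Rows of B are the old basis vectors b_b (Cartesian here, for simplicity);
# T[a, b] holds the element T_a^b of the change-of-basis matrix.
B = np.eye(2)
T = np.array([[2.0, 1.0],
              [1.0, 1.0]])

B_new = T @ B               # new basis: each row is b~_a = T_a^b b_b
T_inv = np.linalg.inv(T)    # elements of the inverse transformation

# Composing the two transformations gives the identity.
assert np.allclose(T_inv @ T, np.eye(2))
```
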
7.1.3.1. Transformation of components#
A vector \(\mathbf{v}\) can be represented in different bases, as different linear combinations of the elements of those bases,
Given the rules of change of basis, the rule of transformation of the components immediately follows
Proof
or
It’s clear that the vectors of the bases and the components follow inverse transformations, so as to preserve the invariance of the vector w.r.t. a change of basis: the vector \(\mathbf{v}\) doesn’t change if we change our description of it by changing the basis.
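This invariance can be checked numerically: the components transform with the inverse transformation, so that \(\mathbf{v} = v^a \mathbf{b}_a = \widetilde{v}^a \widetilde{\mathbf{b}}_a\) is unchanged. A sketch with illustrative values:

```python
import numpy as np

# Rows of B are the old basis vectors; T is an illustrative change of basis.
B = np.eye(2)
T = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B_new = T @ B                        # b~_a = T_a^b b_b

v_old = np.array([3.0, 5.0])         # components v^b in the old basis
v_new = v_old @ np.linalg.inv(T)     # components transform with the inverse

# The vector itself is invariant under the change of basis:
assert np.allclose(v_old @ B, v_new @ B_new)
```
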
Matrix of change of basis as a tensor (todo maybe later? Tensor not introduced yet here)
The rule of transformation between different bases can be interpreted using the dot product between tensors and vectors
Inverse and transpose
Interpreting the indices of the transformation matrix as the row and column indices of a square matrix, the transformation of components involves the transpose of the inverse matrix, as the indices \(a\), \(b\) are swapped.
Example 7.1
Example 7.2 (Rotation)
As the inverse of a rotation is its transpose, \(\widetilde{T}^{b}_{a} = T^{a}_{b}\), the components transform with the same rule as the basis vectors. As an example, let the transformation between two Cartesian bases be the rotation
Let a vector
and thus
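The orthogonality property used in this example is easy to verify numerically; the rotation matrix below uses an illustrative angle and sign convention:

```python
import numpy as np

# Rotation by an illustrative angle theta between two Cartesian bases.
theta = np.pi / 6
T = np.array([[ np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

# For a rotation the inverse is the transpose, so the components follow
# the same transformation rule as the basis vectors (indices swapped).
assert np.allclose(np.linalg.inv(T), T.T)
```
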
7.1.4. Operations#
7.1.4.1. Tensor product of vectors#
or writing each vector as a linear combination of the elements of a basis \(\mathbf{b}_a\),
The result of the tensor product of \(r\) vectors is a rank-\(r\) tensor, \(\mathbb{T}\), as will become clear below, with components
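For \(r = 2\) vectors the components of the tensor product are \(T^{ab} = u^a v^b\), i.e. the outer product of the component arrays. A sketch with illustrative values:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])

# Components of the rank-2 tensor u (x) v: T^{ab} = u^a v^b.
Tuv = np.einsum('a,b->ab', u, v)   # equivalent to np.outer(u, v)
```
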
7.2. Space of tensors#
Definition 7.5
7.2.1. Operations (I)#
7.2.1.1. Sum#
7.2.1.2. Multiplication by a scalar#
Vector space of tensors
The set of tensors of a given rank with the operations of sum and multiplication by a scalar defined above forms a vector space.
7.2.2. Basis#
7.2.2.1. Change of basis and rule of transformation of components - classical definition of a tensor#
with
Definition 7.6 (Classical definition of a tensor)
Relation (7.2) is the “historical” definition of a tensor, through the law of transformation of its components following a change of basis.
7.2.3. Operations (II)#
7.2.3.1. Tensor product#
The tensor product of a \(p\)-rank tensor \(\mathbf{A}\) and a \(q\)-rank tensor \(\mathbf{B}\) is the \((p+q)\)-rank tensor \(\mathbf{A} \otimes \mathbf{B} = \mathbf{A} \mathbf{B}\), that can be defined using component representation in a given basis,
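In components, \((\mathbf{A} \otimes \mathbf{B})^{abcd} = A^{ab} B^{cd}\) for two rank-2 tensors: every pair of components of \(\mathbf{A}\) multiplies every pair of components of \(\mathbf{B}\). A sketch with illustrative component arrays:

```python
import numpy as np

A = np.arange(4.0).reshape(2, 2)   # rank-2 tensor components A^{ab}
B = np.eye(2)                      # rank-2 tensor components B^{cd}

# Tensor product: a rank-(2+2) = rank-4 tensor.
AB = np.einsum('ab,cd->abcd', A, B)
```
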
7.2.3.2. Dot product#
The dot product of a \(p\)-rank tensor \(\mathbf{A}\) and a \(q\)-rank tensor \(\mathbf{B}\) is the \((p+q-2)\)-rank tensor \(\mathbf{A} \cdot \mathbf{B}\), that can be defined using component representation in a given basis,
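The dot product contracts the last index of \(\mathbf{A}\) with the first index of \(\mathbf{B}\); for two rank-2 tensors in a Cartesian basis this reduces to matrix multiplication of the component arrays. A sketch with illustrative values:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# (A . B)^{ac} = A^{ab} B_b^{c}: rank (2 + 2 - 2) = 2.
C = np.einsum('ab,bc->ac', A, B)   # same as A @ B for rank-2 tensors
```
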
7.2.3.3. Contraction#
Contraction of a pair of indices of a \(p\)-rank tensor \(\mathbf{A}\) returns a \((p-2)\)-rank tensor defined as
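For a rank-2 tensor in a Cartesian basis, contracting its two indices gives the trace, a rank-0 tensor (a scalar). A minimal sketch:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Contraction over the repeated index a: a rank-(2-2) = rank-0 result.
c = np.einsum('aa->', A)   # equivalent to np.trace(A)
```
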
7.2.3.4. Exterior product#
todo see exterior algebra