Note.
We will focus our discussion on finite-dimensional vector spaces.
By default, vector spaces introduced in this section are finite-dimensional vector spaces over the field $F$, with dimension $n\in N$.
Notation.
In this section, we will use the symbol $\cong$ to denote natural isomorphism between vector spaces.
Note that the corresponding canonical isomorphisms must be linear bijections, i.e. isomorphisms of vector spaces.
And we may implicitly use naturally isomorphic vector spaces interchangeably,
with corresponding vectors representing each other.
Kronecker delta
We define a function $\delta:N\times N\to N$ by
$$\delta^i_j =
\begin{cases}
1 & i=j\\
0 & i\neq j
\end{cases}$$
Dual space
Let $V$ be a vector space. Denote $L(V,F)$ by $V^*$, then $V^*$ is clearly a vector space over $F$,
called the dual space of $V$.
Elements of $V^*$ are called covectors.
Note that the constant function $0$ is the zero covector of $V^*$.
Notation.
For this section, we will use
- lower indexes to denote vectors in $V$ and components with respect to bases of $V^*$;
- upper indexes to denote vectors in $V^*$ and components with respect to bases of $V$.
Dual basis
Let $B=(e_1,\ldots,e_n)$ be a basis of $V$,
and let $\phi$ be the coordinate map with respect to $B$.
Define $\mu^j:V\to F$ by $\mu^j(v)=\phi(v)^j$ for all $j$ from $1$ to $n$,
then $(\mu^1,\ldots,\mu^n)$ is a basis of $V^*$
(show proof),
Proof.
Since coordinate maps are linear, $\mu^i\in V^*$ for all $i$.
Suppose $\sum_j c_j\mu^j=0$. Then
$$c_i=\sum_j c_j\phi(e_i)^j=\sum_j c_j\mu^j(e_i)=0(e_i)=0$$
for all $i$.
We have shown that $(\mu^1,\ldots,\mu^n)$ is independent.
Suppose we have $f\in V^*$.
Define $g=\sum_j f(e_j)\mu^j$, then $g\in V^*$,
and for all $v\in V$,
$$g(v)=\sum_i\phi(v)^ig(e_i)=\sum_i\phi(v)^i\sum_j f(e_j)\mu^j(e_i)=\sum_i\phi(v)^i\sum_j f(e_j)\phi(e_i)^j=\sum_i\phi(v)^if(e_i)=f(v)$$
Hence $(\mu^1,\ldots,\mu^n)$ spans $V^*$.
Therefore, $(\mu^1,\ldots,\mu^n)$ is a basis of $V^*$.
$\blacksquare$
called the dual basis for $B$. Hence $$\dim V^*=\dim V$$
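As a numerical illustration (not part of the formal development), take $V=F^n$ with $F$ the reals and a basis stored as the columns of an invertible matrix; then the dual basis covectors are the rows of the inverse matrix, since $\mu^j(e_i)=\delta^i_j$. The matrix and dimension below are arbitrary choices:

```python
import numpy as np

# Basis B = (e_1, e_2, e_3) of R^3, stored as the columns of E (arbitrary choice).
E = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])

# The coordinate map is phi(v) = E^{-1} v, so mu^j(v) = (E^{-1} v)_j:
# the dual basis covectors are the rows of E^{-1}.
M = np.linalg.inv(E)

# Duality relation mu^j(e_i) = delta^j_i, i.e. M E = I.
assert np.allclose(M @ E, np.eye(3))
```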
Lemma.
Let $v\in V$. If $f(v)=0$ for all $f\in V^*$, then $v=0$.
(show proof)
Proof.
Let $B=(e_1,\ldots,e_n)$ be a basis of $V$, and let $(\mu^1,\ldots,\mu^n)$ be the dual basis for $B$.
Then $\mu^j(v)=0$ for all $j$. Letting $\phi$ be the coordinate map with respect to $B$,
we have $\phi(v)^j=\mu^j(v)=0$ for all $j$, implying $v=\sum_i0e_i=0$.
$\blacksquare$
Proposition.
$$V^{**}\cong V$$
in the sense that there exists a unique canonical isomorphism $\Phi:V\to V^{**}$
such that for all $v\in V$, for all $v^*\in V^*$, $\Phi(v)(v^*)=v^*(v)$.
(show proof)
Proof.
Let $v\in V$, then $w\mapsto w(v)$ defines a function $u_v:V^*\to F$.
Let $w,w'\in V^*$ and $c\in F$,
then $$u_v(w+w')=(w+w')(v)=w(v)+w'(v)=u_v(w)+u_v(w')$$
$$u_v(cw)=(cw)(v)=cw(v)=cu_v(w)$$
Thus $u_v\in V^{**}$.
Then $v\mapsto u_v$ defines a function $\Phi:V\to V^{**}$.
Note that for all $v\in V$, for all $v^*\in V^*$, $\Phi(v)(v^*)=v^*(v)$.
Let $v,v'\in V$ and $c\in F$,
then for all $w\in V^*$,
$$\Phi(v+v')(w)=u_{v+v'}(w)=w(v+v')=w(v)+w(v')=u_v(w)+u_{v'}(w)=\Phi(v)(w)+\Phi(v')(w)=(\Phi(v)+\Phi(v'))(w)$$
$$\Phi(cv)(w)=u_{cv}(w)=w(cv)=cw(v)=cu_v(w)=c\Phi(v)(w)=(c\Phi(v))(w)$$
Thus $\Phi$ is linear.
Suppose $v,v'\in V$ and $\Phi(v)=\Phi(v')$.
Then for all $w\in V^*$, $w(v-v')=w(v)-w(v')=u_v(w)-u_{v'}(w)=\Phi(v)(w)-\Phi(v')(w)=0$.
Thus $v-v'=0$, implying $v=v'$.
Hence $\Phi$ is injective.
Since $\dim V=\dim V^{**}$, $\Phi$ is bijective.
Since $\Phi$ is linear and bijective, it is an isomorphism.
Uniqueness is immediate, since the condition $\Phi(v)(v^*)=v^*(v)$ determines $\Phi(v)$ on all of $V^*$, and hence determines $\Phi$.
$\blacksquare$
Proposition.
The canonical isomorphism $\Phi:V\to V^{**}$ sends any basis of $V$ to the dual basis of its dual basis.
(show proof)
Proof.
Let $(e_1,\ldots,e_n)$ be a basis of $V$, let $(\mu^1,\ldots,\mu^n)$ be its dual basis, and let $(\omega_1,\ldots,\omega_n)$ be the dual basis of the dual basis.
Let $j\in\{1,\ldots,n\}$.
Let $v^*\in V^*$, then $v^*=\sum_ic_i\mu^i$ for some $c_1,\ldots,c_n\in F$.
Then $$\Phi(e_j)(v^*)=v^*(e_j)=\sum_ic_i\mu^i(e_j)=c_j=\sum_ic_i\omega_j(\mu^i)=\omega_j\p{\sum_ic_i\mu^i}=\omega_j(v^*)$$
Thus $\Phi(e_j)=\omega_j$.
$\blacksquare$
Proposition.
Let $\mathcal B$ denote the set of bases of $V$ and $\mathcal B^*$ the set of bases of $V^*$.
Then the function $\mathcal F:\mathcal B\to\mathcal B^*$ that maps a basis to its dual basis is a bijection.
And given a basis $B^*$ of $V^*$, the basis of $V^{**}$ dual to $B^*$ corresponds to $\mathcal F^{-1}(B^*)$ by the canonical isomorphism.
(show proof)
Proof.
Let $(e_1,\ldots,e_n),(e'_1,\ldots,e'_n)\in\mathcal B$ and $\mathcal F(e_1,\ldots,e_n)=\mathcal F(e'_1,\ldots,e'_n)$.
Let $(\mu^1,\ldots,\mu^n)$ denote their common dual basis.
Then $$\phi(e_i)^j=\mu^j(e_i)=\mu^j(e'_i)=\phi(e'_i)^j$$
where $\phi$ is the coordinate map defined with $(e_1,\ldots,e_n)$.
Thus $(e_1,\ldots,e_n)=(e'_1,\ldots,e'_n)$.
We have shown that $\mathcal F$ is injective.
Let $\Phi:V\to V^{**}$ be the canonical isomorphism.
Define the function $\mathcal F^*:\mathcal B^*\to\mathcal B$ that maps a basis to its dual basis, then transforms it into a basis of $V$ by applying $\Phi^{-1}$ on each entry.
Let $B\in\mathcal B$, let $B^*$ denote its dual, and let $B^{**}$ denote its double dual,
then $\mathcal F^*(\mathcal F(B))$ is a basis of $V$ whose double dual is also $B^{**}$.
Since taking dual bases is injective (shown above for $\mathcal F$, and likewise for bases of $V^*$), so is taking double duals; hence $\mathcal F^*(\mathcal F(B))=B$.
Now let $B^*\in\mathcal B^*$ and let $B^{**}$ denote its dual.
Then $\mathcal F^*(B^*)$ is a basis of $V$ whose double dual is $B^{**}$.
Again, by injectivity of dual maps, $B^*$ is the dual of $\mathcal F^*(B^*)$,
thus $\mathcal F(\mathcal F^*(B^*))=B^*$.
We have shown that $\mathcal F$ is bijective, and $\mathcal F^{-1}=\mathcal F^*$.
$\blacksquare$
Coordinate representation of covector
Let $(\mu^1,\ldots,\mu^n)$ be a basis of $V^*$ and let $\phi^*$ be the corresponding coordinate map.
Then given $w\in V^*$, the $1\times n$ matrix $A$ such that $A_{1j}=\phi^*(w)_j$ is the coordinate representation of $w$
under the basis $(\mu^1,\ldots,\mu^n)$.
We may implicitly use $\phi^*(w)$ to denote the matrix $A$, just as the coordinate representation of a vector can be implicitly viewed as an $n\times 1$ matrix.
Proposition.
Let $e,\mu$ be bases of $V,V^*$ respectively that are dual to each other.
Let $\phi,\phi^*$ denote the corresponding coordinate maps.
Then given any $v\in V$ and $w\in V^*$,
$$w(v)=\phi^*(w)\phi(v)$$
(show proof)
Proof.
$$
w(v)
=\sum_jw_j\mu^j\p{\sum_iv^ie_i}
=\sum_jw_jv^j
=\sum_j\phi^*(w)_j\phi(v)^j
=\phi^*(w)\phi(v)
$$
$\blacksquare$
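The identity $w(v)=\phi^*(w)\phi(v)$ can be checked numerically, taking $F$ to be the reals; the basis, vector, and covector below are arbitrary choices:

```python
import numpy as np

# An arbitrary basis of R^2, stored as the columns of E.
E = np.array([[2., 0.],
              [1., 1.]])
M = np.linalg.inv(E)            # rows of M are the dual basis covectors

v = np.array([3., -1.])         # a vector, in standard coordinates
w = np.array([1., 4.])          # a covector, acting as v -> w @ v

phi_v = M @ v                   # coordinates of v w.r.t. the basis (a column)
psi_w = w @ E                   # coordinates of w w.r.t. the dual basis (a row)

# w(v) = phi*(w) phi(v): a 1 x n row times an n x 1 column.
assert np.isclose(w @ v, psi_w @ phi_v)
```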
Change of basis on dual space
Suppose $B_1,B_2$ are bases of $V^*$.
Let $\varphi_{B_1},\varphi_{B_2}$ be the coordinate maps with respect to $B_1,B_2$,
and define $\psi_{B_2}:{V^*}^n\to F^{n\times n}$ such that $\psi_{B_2}(w^1,\ldots,w^n)_{ij}=\varphi_{B_2}(w^i)_j$.
Then $\psi_{B_2}(B_1)$ is called the
transition matrix from $B_1$ to $B_2$.
Given $w\in V^*$, $$\varphi_{B_1}(w)\psi_{B_2}(B_1)=\varphi_{B_2}(w)$$
(show proof)
Proof.
Let $B_1=(\mu^1,\ldots,\mu^n)$.
Given $w\in V^*$, there exist $c_1,\ldots,c_n$ such that $w=\sum_ic_i\mu^i$.
Thus $$\varphi_{B_1}(w)\psi_{B_2}(B_1)
=\varphi_{B_1}\p{\sum_ic_i\mu^i}\psi_{B_2}(B_1)
=\sum_ic_i\varphi_{B_1}(\mu^i)\psi_{B_2}(B_1)
=\sum_ic_i\psi_{B_2}(B_1)_{i,*}
=\sum_ic_i\varphi_{B_2}(\mu^i)
=\varphi_{B_2}\p{\sum_ic_i\mu^i}
=\varphi_{B_2}(w)$$
$\blacksquare$
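A numerical sketch of this transition rule, taking $F$ to be the reals and representing covectors by rows; the random bases and seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)   # fixed seed; the bases below are arbitrary
n = 3
# Two bases B1, B2 of (R^n)^*, with basis covectors stored as rows.
B1 = rng.normal(size=(n, n))
B2 = rng.normal(size=(n, n))

w = rng.normal(size=n)           # a covector, in standard coordinates

# w = coords @ B (covectors as rows), so coords = w @ B^{-1}.
c1 = w @ np.linalg.inv(B1)
c2 = w @ np.linalg.inv(B2)

# Transition matrix from B1 to B2: row i holds the B2-coordinates of B1's i-th covector.
P = B1 @ np.linalg.inv(B2)

# varphi_{B1}(w) psi_{B2}(B1) = varphi_{B2}(w)
assert np.allclose(c1 @ P, c2)
```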
Proposition.
Let $e,e'$ be bases of $V$ and $\mu,\mu'$ be their dual bases of $V^*$.
Let $A$ be the transition matrix from $e$ to $e'$ and $B$ be the transition matrix from $\mu$ to $\mu'$.
Then $AB=I$.
(show proof)
Proof.
Let $\phi,\phi'$ be the coordinate maps corresponding to $e,e'$,
and $\psi,\psi'$ to $\mu,\mu'$. Then
$$
(BA)_{ji}
=B_{j,*}A_{*,i}
=\psi(\mu^j)BA\phi(e_i)
=\psi'(\mu^j)\phi'(e_i)
=\mu^j(e_i)
=I_{ji}
$$
Thus $BA=I$; since $A$ and $B$ are square, this implies $AB=BA=I$.
$\blacksquare$
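Numerically, with $F$ the reals and bases stored as matrix columns, the two transition matrices are explicit and mutually inverse; the bases and seed below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)   # arbitrary (almost surely invertible) bases
n = 3
E  = rng.normal(size=(n, n))     # basis e  (columns)
Ep = rng.normal(size=(n, n))     # basis e' (columns)

# Vector coordinates transform as phi'(v) = A phi(v), so A = Ep^{-1} E.
A = np.linalg.inv(Ep) @ E

# Dual bases are the rows of E^{-1} and Ep^{-1}; covector coordinates
# transform as psi'(w) = psi(w) B with B = E^{-1} Ep.
B = np.linalg.inv(E) @ Ep

assert np.allclose(A @ B, np.eye(n))
assert np.allclose(B @ A, np.eye(n))
```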
Tensor
Given a natural number $k$, suppose $V_1,\ldots,V_k$ and $W$ are vector spaces,
then a tensor $T$, also called a
multilinear function, with respect to $V_1,\ldots,V_k$ and $W$ is a function from $V_1\times\ldots\times V_k$ to $W$,
with the properties that
- for all $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k$, $i\in\{1,\ldots,k\}$, and $v_i'\in V_i$, $$T(v_1,\ldots,v_i+v_i',\ldots,v_k)=T(v_1,\ldots,v_i,\ldots,v_k)+T(v_1,\ldots,v_i',\ldots,v_k)$$
- for all $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k$, $i\in\{1,\ldots,k\}$, and $c\in F$, $$T(v_1,\ldots,cv_i,\ldots,v_k)=cT(v_1,\ldots,v_i,\ldots,v_k)$$
We denote the set of tensors with respect to $V_1,\ldots,V_k$ and $W$ as $L(V_1,\ldots,V_k;W)$, which is clearly a vector space over $F$.
Given a natural number $k$ and a vector space $V$,
the collection of tensors from $V^k$ to $F$ is denoted $L(V^k)$.
When $k=1$, we may just write $L(V)$.
Note.
- $L(V^0)$ is naturally isomorphic to $F$.
- $L(V)$ is naturally isomorphic to $V^*$.
- $L(V^*)$ is naturally isomorphic to $V^{**}$ and hence $V$.
Tensor product
Given vector spaces $V_1,\ldots,V_k$ and $W_1,\ldots,W_m$,
we define the operation $\otimes:L(V_1,\ldots,V_k;F)\times L(W_1,\ldots,W_m;F)\to L(V_1,\ldots,V_k,W_1,\ldots,W_m;F)$ by
$$S\otimes T(v_1,\ldots,v_k,w_1,\ldots,w_m)=S(v_1,\ldots,v_k)T(w_1,\ldots,w_m)$$
Properties of tensor product
Let $S,S_1,S_2\in L(V_1,\ldots,V_k;F)$, $T\in L(W_1,\ldots,W_l;F)$, $U\in L(X_1,\ldots,X_m;F)$, and $c\in F$,
where $V_1,\ldots,V_k,W_1,\ldots,W_l,X_1,\ldots,X_m$ are vector spaces, then
$$(S_1+S_2)\otimes T=S_1\otimes T+S_2\otimes T$$
$$T\otimes(S_1+S_2)=T\otimes S_1+T\otimes S_2$$
$$(cS)\otimes T=S\otimes(cT)=c(S\otimes T)$$
$$(S\otimes T)\otimes U=S\otimes(T\otimes U)$$
(show proof)
Proof.
Trivial.
$\blacksquare$
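Representing tensors in $L(V_1,\ldots,V_k;F)$ by their component arrays (see the basis result below), the tensor product becomes an outer product of arrays, and the properties above can be spot-checked numerically over the reals; the shapes and seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(3)
# Component arrays of tensors; shapes are arbitrary illustrative choices.
S  = rng.normal(size=(2, 3))
S2 = rng.normal(size=(2, 3))
T  = rng.normal(size=(4,))
U  = rng.normal(size=(2, 2))

# Tensor product on components = outer product of arrays.
otimes = lambda a, b: np.tensordot(a, b, axes=0)

# Bilinearity, compatibility with scalars, and associativity:
assert np.allclose(otimes(S + S2, T), otimes(S, T) + otimes(S2, T))
assert np.allclose(otimes(2.0 * S, T), otimes(S, 2.0 * T))
assert np.allclose(otimes(otimes(S, T), U), otimes(S, otimes(T, U)))
```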
Note.
Repeated tensor product on a tuple $(T_1,\ldots,T_m)$ of tensors can be denoted as $$\bigotimes_{i=1}^mT_i$$
When $m=0$, we define $\bigotimes_{i=1}^mT_i$ to be the tensor $1\in L(;F)$.
Basis of tensor space
Let $V_1,\ldots,V_k$ be vector spaces of dimensions $n_1,\ldots,n_k$.
For each $j$, let $(e_1^j,\ldots,e_{n_j}^j)$ be a basis of $V_j$ and let $(\mu_j^1,\ldots,\mu_j^{n_j})$ be the corresponding dual basis of $V_j^*$.
Let $I$ be the set of multidimensional indexes with dimensions $(n_1,\ldots,n_k)$.
Then $$\p{\bigotimes_{j=1}^k\mu_j^{i_j}}_{i\in I}$$ is a basis of $L(V_1,\ldots,V_k;F)$,
implying it has dimension $\prod_jn_j$.
Given $T\in L(V_1,\ldots,V_k;F)$ and $i\in I$, the coordinate of $T$ with respect to this basis is
$$T_i=T(e_{i_1}^1,\ldots,e_{i_k}^k)$$
(show proof)
Proof.
Let $T\in L(V_1,\ldots,V_k;F)$.
Then for all $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k$,
writing $v_j^i$ for the $i$-th coordinate of $v_j$ for each $j$,
we have
$$
\sum_{i\in I}T(e_{i_1}^1,\ldots,e_{i_k}^k)\bigotimes_j\mu_j^{i_j}(v_1,\ldots,v_k)
=\sum_{i\in I}T(e_{i_1}^1,\ldots,e_{i_k}^k)\prod_j\mu_j^{i_j}(v_j)
=\sum_{i\in I}T(e_{i_1}^1,\ldots,e_{i_k}^k)\prod_jv_j^{i_j}
=T\p{\sum_iv_1^ie_i^1,\ldots,\sum_iv_k^ie_i^k}
=T(v_1,\ldots,v_k)
$$
Thus $(\bigotimes_j\mu_j^{i_j})_{i\in I}$ spans $L(V_1,\ldots,V_k;F)$.
Let $(c_i)_{i\in I}$ be elements of $F$ such that $\sum_{i\in I}c_i\bigotimes_j\mu_j^{i_j}=0$.
Then for all $i'\in I$,
$$
c_{i'}
=\sum_{i\in I}c_i\prod_j\mu_j^{i_j}(e_{i'_j}^j)
=\sum_{i\in I}c_i\bigotimes_j\mu_j^{i_j}(e_{i'_1}^1,\ldots,e_{i'_k}^k)
=0
$$
Thus $(\bigotimes_j\mu_j^{i_j})_{i\in I}$ is linearly independent.
$\blacksquare$
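As a numerical sketch of the coordinate formula $T_i=T(e_{i_1}^1,\ldots,e_{i_k}^k)$, take $k=2$ over the reals and a bilinear form $T(u,v)=u^\top Gv$; the matrices, dimensions, and seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)
n1, n2 = 2, 3
G  = rng.normal(size=(n1, n2))        # bilinear T(u, v) = u^T G v on R^2 x R^3
E1 = rng.normal(size=(n1, n1))        # basis of V_1 (columns)
E2 = rng.normal(size=(n2, n2))        # basis of V_2 (columns)

# Components T_i = T(e^1_{i_1}, e^2_{i_2}) for every multi-index i = (i_1, i_2).
T = np.einsum('ai,ab,bj->ij', E1, G, E2)

# Reconstruction: T(u, v) = sum_i T_i u^{i_1} v^{i_2} in coordinates.
u, v = rng.normal(size=n1), rng.normal(size=n2)
cu, cv = np.linalg.solve(E1, u), np.linalg.solve(E2, v)
assert np.isclose(u @ G @ v, cu @ T @ cv)
```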
Notation.
Let $V_1,\ldots,V_k$ be vector spaces.
Given $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k$,
we use $v_1\otimes\ldots\otimes v_k$ to denote $\Phi_1(v_1)\otimes\ldots\otimes\Phi_k(v_k)$,
where $\Phi_j$ is the canonical isomorphism from $V_j$ to $V_j^{**}$,
so that $v_1\otimes\ldots\otimes v_k\in L(V_1^*,\ldots,V_k^*;F)$.
Tensor product space
Let $V_1,\ldots,V_k$ be vector spaces.
For any set $S$, define $\mathcal F(S)$ to be the set of functions $f:S\to F$
such that $\{x\in S|f(x)\neq0\}$ is finite, then $\mathcal F(S)$ is a vector space.
Let $x\in S$, we denote the function
$$
f(t) =
\begin{cases}
1 & t=x\\
0 & t\neq x
\end{cases}
$$
as $x_{\mathcal F}$, so that $x_{\mathcal F}\in\mathcal F(S)$.
Let $\mathcal R$ denote the subspace of $\mathcal F(V_1\times\ldots\times V_k)$
spanned by the union of the following:
$$\{(v_1,\ldots,cv_i,\ldots,v_k)_{\mathcal F}-c(v_1,\ldots,v_k)_{\mathcal F}:(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k,i\in\{1,\ldots,k\},c\in F\}$$
$$\{(v_1,\ldots,v_i+v_i',\ldots,v_k)_{\mathcal F}-((v_1,\ldots,v_k)_{\mathcal F}+(v_1,\ldots,v_i',\ldots,v_k)_{\mathcal F}):(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k,i\in\{1,\ldots,k\},v_i'\in V_i\}$$
Then we use $V_1\otimes\ldots\otimes V_k$ to denote the quotient
$$\mathcal F(V_1\times\ldots\times V_k)/\mathcal R$$
called the tensor product space of $V_1,\ldots,V_k$.
Given $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k$,
we denote the coset $(v_1,\ldots,v_k)_{\mathcal F}+\mathcal R$ as $v_1\otimes\ldots\otimes v_k$,
so that $$v_1\otimes\ldots\otimes v_k\in V_1\otimes\ldots\otimes V_k$$
Note that we used the same notation as the tensor product here.
We will call this the abstract tensor product of $v_1,\ldots,v_k$ to distinguish it from the tensor product.
Note that if $k=0$, then the tensor product space is naturally isomorphic to $F$.
Let $\pi$ denote the map $(v_1,\ldots,v_k)\mapsto v_1\otimes\ldots\otimes v_k$.
Then $\pi$ is clearly multilinear, and $\{\pi(u):u\in V_1\times\ldots\times V_k\}$ spans $V_1\otimes\ldots\otimes V_k$.
Lemma.
Let $S$ be a set; then, for any field $F$, $\{u_{\mathcal F}:u\in S\}$ is a basis of $\mathcal F(S)$.
(show proof)
Proof.
Trivial.
$\blacksquare$
Lemma.
Let $S$ be a set and $X$ be a vector space.
Then for every function $A:S\to X$, there exists a unique linear function $\overline A:\mathcal F(S)\to X$
such that $A(u)=\overline A(u_{\mathcal F})$ for all $u\in S$.
(show proof)
Proof.
Given $f\in\mathcal F(S)$, $\sum_{\{u\in S|f(u)\neq0\}}f(u)A(u)\in X$.
This defines a function $\overline A:\mathcal F(S)\to X$, which clearly satisfies the requirements.
Suppose we have another linear function $\overline A':\mathcal F(S)\to X$ that satisfies the requirements,
then for all $f\in\mathcal F(S)$,
$$\overline A(f)
=\sum_{\{u\in S|f(u)\neq0\}}f(u)A(u)
=\sum_{\{u\in S|f(u)\neq0\}}f(u)\overline A'(u_{\mathcal F})
=\overline A'\p{\sum_{\{u\in S|f(u)\neq0\}}f(u)u_{\mathcal F}}
=\overline A'(f)$$
Thus $\overline A=\overline A'$.
$\blacksquare$
Lemma.
Let $X$ and $V_1,\ldots,V_k$ be vector spaces,
then for every tensor $T\in L(V_1,\ldots,V_k;X)$,
there uniquely exists a linear map $\tilde T:V_1\otimes\ldots\otimes V_k\to X$
such that $\tilde T\circ\pi=T$.
(show proof)
Proof.
We will denote the map $u\mapsto u_{\mathcal F}$ as $\varphi$ and
the map $v\mapsto v+\mathcal R$ as $\psi$,
then $\pi=\psi\circ\varphi$.
By the above lemma, there exists a unique linear function $\overline T:\mathcal F(V_1\times\ldots\times V_k)\to X$ such that $T(u)=\overline T(u_{\mathcal F})$ for all $u\in V_1\times\ldots\times V_k$,
implying $T=\overline T\circ\varphi$.
Let $u\in V_1\otimes\ldots\otimes V_k$, then for some $v_u\in\mathcal F(V_1\times\ldots\times V_k)$,
$u=v_u+\mathcal R$, and we have $\overline T(v_u)\in X$.
This defines a function $\tilde T:V_1\otimes\ldots\otimes V_k\to X$.
Let $u\in\mathcal F(V_1\times\ldots\times V_k)$, then $\tilde T(u+\mathcal R)=\overline T(v)$,
where $v\in\mathcal F(V_1\times\ldots\times V_k)$ and $u+\mathcal R=v+\mathcal R$, implying $v-u\in\mathcal R$.
Let $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k,i\in\{1,\ldots,k\},c\in F$, then
$$\overline T((v_1,\ldots,cv_i,\ldots,v_k)_{\mathcal F})=T(v_1,\ldots,cv_i,\ldots,v_k)=cT(v_1,\ldots,v_k)
=c\overline T((v_1,\ldots,v_k)_{\mathcal F})=\overline T(c(v_1,\ldots,v_k)_{\mathcal F})$$
Let $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k,i\in\{1,\ldots,k\},v_i'\in V_i$, then
$$\overline T((v_1,\ldots,v_i+v_i',\ldots,v_k)_{\mathcal F})=T(v_1,\ldots,v_i+v_i',\ldots,v_k)=T(v_1,\ldots,v_k)+T(v_1,\ldots,v_i',\ldots,v_k)
=\overline T((v_1,\ldots,v_k)_{\mathcal F})+\overline T((v_1,\ldots,v_i',\ldots,v_k)_{\mathcal F})
=\overline T((v_1,\ldots,v_k)_{\mathcal F}+(v_1,\ldots,v_i',\ldots,v_k)_{\mathcal F})$$
Since $\overline T$ is linear and vanishes on a spanning set of $\mathcal R$, we have $\overline T(w)=0$ for all $w\in\mathcal R$.
Thus $\tilde T(u+\mathcal R)=\overline T(v)=\overline T(u+(v-u))=\overline T(u)$.
We have shown that $\tilde T\circ\psi=\overline T$,
thus $$\tilde T\circ\pi=\tilde T\circ\psi\circ\varphi=\overline T\circ\varphi=T$$
Let $u,w\in V_1\otimes\ldots\otimes V_k$,
then $v_{u+w}+\mathcal R=u+w=(v_u+\mathcal R)+(v_w+\mathcal R)=(v_u+v_w)+\mathcal R$.
Thus $$\tilde T(u+w)=\overline T(v_{u+w})=\overline T(v_u+v_w)=\overline T(v_u)+\overline T(v_w)=\tilde T(u)+\tilde T(w)$$
Let $u\in V_1\otimes\ldots\otimes V_k$ and $c\in F$,
then $v_{cu}+\mathcal R=cu=c(v_u+\mathcal R)=(cv_u)+\mathcal R$.
Thus $$\tilde T(cu)=\overline T(v_{cu})=\overline T(cv_u)=c\overline T(v_u)=c\tilde T(u)$$
We have shown that $\tilde T$ is linear.
Suppose we have $\tilde T'$ that also satisfies the conditions.
Let $u\in V_1\otimes\ldots\otimes V_k$, then there exists $v\in\mathcal F(V_1\times\ldots\times V_k)$ such that $u=\psi(v)$,
and there exist $x_1,\ldots,x_l\in V_1\times\ldots\times V_k$ and $c^1,\ldots,c^l\in F$ for some $l\in N$,
such that $v=\sum_ic^i\varphi(x_i)$.
Thus
$$\tilde T(u)=\tilde T\p{\psi\p{\sum_ic^i\varphi(x_i)}}=\tilde T\p{\sum_ic^i\pi(x_i)}=\sum_ic^iT(x_i)$$
Similarly, we have $\tilde T'(u)=\sum_ic^iT(x_i)=\tilde T(u)$.
We have shown that $\tilde T'=\tilde T$.
$\blacksquare$
Basis of tensor product space
Let $V_1,\ldots,V_k$ be vector spaces of dimensions $n_1,\ldots,n_k$.
For each $j$, let $(e_1^j,\ldots,e_{n_j}^j)$ be a basis of $V_j$.
Let $I$ be the set of multidimensional indexes with dimensions $(n_1,\ldots,n_k)$.
Then $$\p{\bigotimes_{j=1}^ke_{i_j}^j}_{i\in I}$$ is a basis of $V_1\otimes\ldots\otimes V_k$,
implying it has dimension $\prod_jn_j$.
(show proof)
Proof.
For each $j$, let $(\mu_j^1,\ldots,\mu_j^{n_j})$ denote the dual basis of $(e_1^j,\ldots,e_{n_j}^j)$.
Let $T\in V_1\otimes\ldots\otimes V_k$.
Then for some $u_1,\ldots,u_l\in V_1\times\ldots\times V_k$ where $l\in N$,
$T$ is a linear combination of $\pi(u_1),\ldots,\pi(u_l)$,
each of which is in turn a linear combination of $\p{\bigotimes_{j=1}^ke_{i_j}^j}_{i\in I}$,
implying $T$ is a linear combination of $\p{\bigotimes_{j=1}^ke_{i_j}^j}_{i\in I}$.
We have shown that $\p{\bigotimes_{j=1}^ke_{i_j}^j}_{i\in I}$ spans $V_1\otimes\ldots\otimes V_k$.
Let $(c^i)_{i\in I}$ be elements of $F$ such that $\sum_{i\in I}c^i\bigotimes_je_{i_j}^j=0$.
Let $m\in I$, given $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k$,
we have $\prod_j\mu_j^{m_j}(v_j)\in F$, which defines a function $\tau^m:V_1\times\ldots\times V_k\to F$ that is multilinear.
By the above lemma, we have linear $\tilde{\tau^m}:V_1\otimes\ldots\otimes V_k\to F$
such that $\tilde{\tau^m}\circ\pi=\tau^m$.
Then $$c^m=\sum_{i\in I}c^i\tau^m(e_{i_1}^1,\ldots,e_{i_k}^k)=\sum_{i\in I}c^i\tilde{\tau^m}(e_{i_1}^1\otimes\ldots\otimes e_{i_k}^k)
=\tilde{\tau^m}\p{\sum_{i\in I}c^i\bigotimes_je_{i_j}^j}=0$$
We have shown that $\p{\bigotimes_{j=1}^ke_{i_j}^j}_{i\in I}$ is linearly independent.
$\blacksquare$
Proposition.
Let $V,W$ be vector spaces, then $$V\otimes W\cong W\otimes V$$
in the sense that there exists a unique canonical isomorphism such that for all $v\in V$ and $w\in W$, $v\otimes w$ corresponds to $w\otimes v$.
(show proof)
Proof.
Let $n,m$ denote $\dim V,\dim W$. Let $(a_1,\ldots,a_n),(b_1,\ldots,b_m)$ be bases of $V,W$ respectively.
Note that the map $(v,w)\mapsto w\otimes v$ from $V\times W$ to $W\otimes V$, denoted $A$, is multilinear.
Thus there exists linear $\tilde A:V\otimes W\to W\otimes V$ such that $\tilde A\circ\pi=A$.
Then for all $v\in V$ and $w\in W$, $$\tilde A(v\otimes w)=w\otimes v$$
Let $T\in W\otimes V$, then $T=\sum_{j,i}T^{ji}b_j\otimes a_i$,
implying
$$
\tilde A\p{\sum_{j,i}T^{ji}a_i\otimes b_j}
=\sum_{j,i}T^{ji}\tilde A(a_i\otimes b_j)
=\sum_{j,i}T^{ji}b_j\otimes a_i
=T
$$
Thus $\tilde A$ is surjective.
Since $V\otimes W$ and $W\otimes V$ share the same dimensionality, $\tilde A$ is bijective.
Hence $\tilde A$ is an isomorphism.
Suppose $\tilde A'$ is another such isomorphism, then $\tilde A$ and $\tilde A'$ agree on $a_i\otimes b_j$ for each $(i,j)$,
implying they are equal.
$\blacksquare$
Proposition.
Let $m_1,\ldots,m_k\in N$, let $n_j$ denote $\sum_{i=1}^{j}m_i$ (so that $n_0=0$), and let $V_1,\ldots,V_{n_k}$ be vector spaces,
then $$\bigotimes_{i=1}^{n_k}V_i\cong\bigotimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}V_i$$
in the sense that there exists a unique canonical isomorphism such that for all $(v_1,\ldots,v_{n_k})\in V_1\times\ldots\times V_{n_k}$,
$\bigotimes_{i=1}^{n_k}v_i$ corresponds to $\bigotimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}v_i$.
(show proof)
Proof.
Note that the map $(v_1,\ldots,v_{n_k})\mapsto\bigotimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}v_i$, denoted $A$, is multilinear.
Thus there exists linear $\tilde A:\bigotimes_{i=1}^{n_k}V_i\to\bigotimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}V_i$ such that $\tilde A\circ\pi=A$.
Then for all $(v_1,\ldots,v_{n_k})\in V_1\times\ldots\times V_{n_k}$,
$$\tilde A\p{\bigotimes_{i=1}^{n_k}v_i}=\bigotimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}v_i$$
Note that $\{\bigotimes_{j=1}^ku_j:(u_1,\ldots,u_k)\in\bigtimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}V_i\}$ spans $\bigotimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}V_i$.
And for each $(u_1,\ldots,u_k)\in\bigtimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}V_i$,
$\bigotimes_{j=1}^ku_j$ is spanned by $\{\bigotimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}v_i:(v_1,\ldots,v_{n_k})\in\bigtimes_{i=1}^{n_k}V_i\}$.
Thus $\bigotimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}V_i$ is spanned by $\{\tilde A(\bigotimes_{i=1}^{n_k}v_i):(v_1,\ldots,v_{n_k})\in\bigtimes_{i=1}^{n_k}V_i\}$.
Hence $\tilde A$ is surjective.
Since $\bigotimes_{i=1}^{n_k}V_i$ and $\bigotimes_{j=1}^k\bigotimes_{i=n_{j-1}+1}^{n_j}V_i$ share the same dimensionality,
$\tilde A$ is bijective.
We have shown that $\tilde A$ is an isomorphism.
Suppose $\tilde A'$ is another such isomorphism, then $\tilde A$ and $\tilde A'$ agree on a basis of $\bigotimes_{i=1}^{n_k}V_i$,
implying they are equal.
$\blacksquare$
Proposition.
Let $V_1,\ldots,V_k$ and $W_1,\ldots,W_k$ be vector spaces and let $\Phi_j:V_j\to W_j$ be an isomorphism for each $j$,
then $$V_1\otimes\ldots\otimes V_k\cong W_1\otimes\ldots\otimes W_k$$
in the sense that there exists a unique canonical isomorphism such that for all $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k$,
$v_1\otimes\ldots\otimes v_k$ corresponds to $\Phi_1(v_1)\otimes\ldots\otimes\Phi_k(v_k)$.
(show proof)
Proof.
Let $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k$,
then $\Phi_1(v_1)\otimes\ldots\otimes\Phi_k(v_k)\in W_1\otimes\ldots\otimes W_k$.
This defines a function $A:V_1\times\ldots\times V_k\to W_1\otimes\ldots\otimes W_k$, which is multilinear.
Thus we have linear $\tilde A:V_1\otimes\ldots\otimes V_k\to W_1\otimes\ldots\otimes W_k$ such that $\tilde A\circ\pi=A$.
Then for all $(v_1,\ldots,v_k)\in V_1\times\ldots\times V_k$,
$$\tilde A(v_1\otimes\ldots\otimes v_k)=\Phi_1(v_1)\otimes\ldots\otimes\Phi_k(v_k)$$
Note that for each $j$, $\Phi_j$ sends a basis of $V_j$ to a basis of $W_j$.
This means that $\tilde A$ sends a basis of $V_1\otimes\ldots\otimes V_k$ to a basis of $W_1\otimes\ldots\otimes W_k$,
implying it is surjective.
Since $V_1\otimes\ldots\otimes V_k$ and $W_1\otimes\ldots\otimes W_k$ share the same dimensionality,
$\tilde A$ is bijective, and thus is an isomorphism.
Suppose $\tilde A'$ is another such isomorphism, then $\tilde A$ and $\tilde A'$ agree on a basis of $V_1\otimes\ldots\otimes V_k$,
implying they are equal.
$\blacksquare$
Note.
Let $V_1,\ldots,V_k$ be vector spaces and let $\sigma\in S_k$, then
$$V_1\otimes\ldots\otimes V_k\cong V_{\sigma_1}\otimes\ldots\otimes V_{\sigma_k}$$
And there uniquely exists a canonical isomorphism that corresponds $v_1\otimes\ldots\otimes v_k$ to $v_{\sigma_1}\otimes\ldots\otimes v_{\sigma_k}$.
Proposition.
Let $V_1,\ldots,V_k$ be vector spaces,
then $$V_1^*\otimes\ldots\otimes V_k^*\cong L(V_1,\ldots,V_k;F)$$
in the sense that there exists a unique canonical isomorphism such that for all $(w^1,\ldots,w^k)\in V_1^*\times\ldots\times V_k^*$,
the abstract tensor product $w^1\otimes\ldots\otimes w^k$ corresponds to the tensor product $w^1\otimes\ldots\otimes w^k$.
(show proof)
Proof.
Let $(w^1,\ldots,w^k)\in V_1^*\times\ldots\times V_k^*$,
then $w^1\otimes\ldots\otimes w^k\in L(V_1,\ldots,V_k;F)$.
This defines a function $A:V_1^*\times\ldots\times V_k^*\to L(V_1,\ldots,V_k;F)$, which is multilinear.
Thus we have linear $\tilde A:V_1^*\otimes\ldots\otimes V_k^*\to L(V_1,\ldots,V_k;F)$ such that $\tilde A\circ\pi=A$.
Then for all $(w^1,\ldots,w^k)\in V_1^*\times\ldots\times V_k^*$,
the abstract tensor product $w^1\otimes\ldots\otimes w^k$ corresponds to the tensor product $w^1\otimes\ldots\otimes w^k$ by $\tilde A$.
This means that $\tilde A$ sends the basis of $V_1^*\otimes\ldots\otimes V_k^*$ formed from abstract tensor products of dual basis covectors to the corresponding basis of $L(V_1,\ldots,V_k;F)$,
implying it is surjective.
Since $V_1^*\otimes\ldots\otimes V_k^*$ and $L(V_1,\ldots,V_k;F)$ share the same dimensionality,
$\tilde A$ is bijective, and thus is an isomorphism.
Suppose $\tilde A'$ is another such isomorphism, then $\tilde A$ and $\tilde A'$ agree on a basis of $V_1^*\otimes\ldots\otimes V_k^*$,
implying they are equal.
$\blacksquare$
Note.
Let $V_1,\ldots,V_k$ be vector spaces.
Then $$V_1\otimes\ldots\otimes V_k\cong V_1^{**}\otimes\ldots\otimes V_k^{**}\cong L(V_1^*,\ldots,V_k^*;F)$$
And there uniquely exists a canonical isomorphism that corresponds the abstract tensor product $v_1\otimes\ldots\otimes v_k$ to the tensor product $v_1\otimes\ldots\otimes v_k$,
thus justifying the coinciding notation.
Covariance and contravariance
Let $V$ be a vector space.
We denote $\bigotimes_{j=1}^kV$ as $T^k(V)$ and $\bigotimes_{j=1}^lV^*$ as $T_l(V)$.
We call $T^k(V)$ the $k$-contravariant tensor space of $V$, and $T_l(V)$ the $l$-covariant tensor space of $V$.
And we denote $T^k(V)\otimes T_l(V)$ as $T^k_l(V)$ or $T^{(k,l)}(V)$, which is called the $(k,l)$-tensor space of $V$.
Note that $$T^k_l(V)\cong V\otimes\ldots\otimes V\otimes V^*\otimes\ldots\otimes V^*$$
where there are $k$ copies of $V$ and $l$ copies of $V^*$.
And we have
- $T^0_0(V)\cong T^0(V)\cong T_0(V)\cong F$;
- $T^1_0(V)\cong T^1(V)\cong T_1(V^*)\cong V$;
- $T^0_1(V)\cong T^1(V^*)\cong T_1(V)\cong V^*$;
- $T^k_0(V)\cong T^k(V)\cong T_k(V^*)\cong L({V^*}^k)$;
- $T^0_l(V)\cong T^l(V^*)\cong T_l(V)\cong L(V^l)$.
Component notation
Let $V$ be a vector space with a basis $\mathcal B$ and let $\tau$ be a $k$-tuple of $\{1,-1\}$ serving as a signature.
Define a $k$-tuple $\mathcal V$ of $\{V,V^*\}$ such that
$$
\mathcal V_j =
\begin{cases}
V & \tau_j=1\\
V^* & \tau_j=-1
\end{cases}
$$
Let $\mathcal W$ be a $k$-tuple of $\{V,V^*\}$ such that
$$
\mathcal W_j =
\begin{cases}
V^* & \tau_j=1\\
V & \tau_j=-1
\end{cases}
$$
Then we have
$$
\mathcal V_1\otimes\ldots\otimes\mathcal V_k\cong L(\mathcal W_1,\ldots,\mathcal W_k;F)
$$
where coordinates are preserved under standard bases of each.
Thus given $T\in\mathcal V_1\otimes\ldots\otimes\mathcal V_k$ and a $k$-tuple $I$ of $\{1,\ldots,n\}$,
$T_I=T(\varepsilon^1_{I_1},\ldots,\varepsilon^k_{I_k})$, where $\varepsilon^j$ is the basis of $\mathcal W_j$ constructed from $\mathcal B$.
In terms of notation, if $k$ is obtained from meta-logic, then we may write the entries of $I$ explicitly, using variable symbols,
with superscripts for contravariant components and subscripts for covariant components.
Such notations are called component notations, which represent both the tensors and their components.
Note that we can manipulate $\mathcal V_1\otimes\ldots\otimes\mathcal V_k$ under an isomorphism so that all contravariant components are on the left and all covariant components are on the right.
If $k$ is obtained from meta-logic, then for some meta-logically obtained natural numbers $p,q$ where $p+q=k$,
$\mathcal V_1\otimes\ldots\otimes\mathcal V_k$ is isomorphic to $T^p_q(V)$, and we may denote $T$, as a $(p,q)$-tensor, with the component notation
$$T^{i_1\ldots i_p}_{j_1\ldots j_q}$$
where $i_1,\ldots,i_p,j_1,\ldots,j_q$ are distinct variable symbols.
Suppose we have tensors $T^{i_1\ldots i_p}_{j_1\ldots j_q}$ and $S^{i'_1\ldots i'_{p'}}_{j'_1\ldots j'_{q'}}$,
then $T^{i_1\ldots i_p}_{j_1\ldots j_q}S^{i'_1\ldots i'_{p'}}_{j'_1\ldots j'_{q'}}$ denotes their tensor product.
Also, in terms of components, we have that
$$(T\otimes S)^{i_1\ldots i_pi'_1\ldots i'_{p'}}_{j_1\ldots j_qj'_1\ldots j'_{q'}}=T^{i_1\ldots i_p}_{j_1\ldots j_q}S^{i'_1\ldots i'_{p'}}_{j'_1\ldots j'_{q'}}$$
Tensor contraction
Let $V$ be a vector space with a basis $\mathcal B$ and let $\tau$ be a $k$-tuple of $\{1,-1\}$ serving as a signature.
Define $\mathcal V$ as above, and let $T\in\mathcal V_1\otimes\ldots\otimes\mathcal V_k$.
Suppose we have injective $l$-tuples $\alpha,\beta$ of $\{1,\ldots,k\}$,
such that each $\tau_{\alpha_j}=1$ and each $\tau_{\beta_j}=-1$.
Then we define the contraction $T'$ of $T$ on the indexes $\alpha,\beta$ to be
a tensor in $\mathcal V'_1\otimes\ldots\otimes\mathcal V'_{k-2l}$,
where $\mathcal V'$ is defined with the $(k-2l)$-tuple $\tau'$ of $\{1,-1\}$ formed from $\tau$ by removing the indexes in $\alpha$ and $\beta$,
such that the components of $T'$ are obtained from the components of $T$ by summing over each pair $(\alpha_j,\beta_j)$ of indexes.
In terms of component notation, we use the same variable on each pair of indexes to be summed over.
Consider a special case where $\mathcal V$ is $T^p_q(V)$,
and there is only one pair of indexes to be summed over,
then the notation $$T^{i_1\ldots a\ldots i_{p-1}}_{j_1\ldots a\ldots j_{q-1}}$$
denotes a tensor $T'$ of $T^{p-1}_{q-1}(V)$,
where $$T'^{i_1\ldots i_{p-1}}_{j_1\ldots j_{q-1}}=\sum_aT^{i_1\ldots a\ldots i_{p-1}}_{j_1\ldots a\ldots j_{q-1}}$$
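This special case can be illustrated numerically with `np.einsum`, whose repeated-index convention matches component notation; we take the reals, $p=q=2$, and arbitrary components:

```python
import numpy as np

rng = np.random.default_rng(4)
# A (2,2)-tensor on a 3-dimensional space, components T^{i1 i2}_{j1 j2}
# (axes ordered contravariant first); the values are arbitrary.
T = rng.normal(size=(3, 3, 3, 3))

# Contract the 2nd contravariant index with the 1st covariant index:
# T'^i_j = sum_a T^{i a}_{a j}.  einsum repeats 'a' to sum over it.
Tc = np.einsum('iaaj->ij', T)

# The same sum written explicitly.
manual = sum(T[:, a, a, :] for a in range(3))
assert np.allclose(Tc, manual)
```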
Notation.
Given a basis $e$ of a vector space, we use $e_j$ to denote an entry of $e$ and $e^j$ to denote an entry of the dual basis to $e$.
Given a matrix $A$, we may use ${A^i}_j$ to denote $A_{ij}$.
Note that the notation ${A^i}_j$ does not imply that $A$ is a tensor.
However, this notation can be used in component notations for tensors, and
repeated variables are to be summed over as usual.
Change of basis on tensors
Let $V$ be a vector space and let $e,e'$ be its bases.
Let $A$ denote the transition matrix from $e$ to $e'$ and $B$ from the dual of $e$ to the dual of $e'$.
Let $T\in T^k_l(V)$.
Then
$$T^{I'}_{J'}=\sum_{I,J}T^I_J\prod_i{A^{I'_i}}_{I_i}\prod_j{B^{J_j}}_{J'_j}$$
where $T^I_J$ is the $(I,J)$ coordinate of $T$ with respect to $e$ and
$T^{I'}_{J'}$ is the $(I',J')$ coordinate of $T$ with respect to $e'$
(show proof).
Proof.
$$
T
=\sum_{I,J}T^I_Je_{I_1}\otimes\ldots\otimes e_{I_k}\otimes e^{J_1}\otimes\ldots\otimes e^{J_l}
=\sum_{I,J}T^I_J\bigotimes_i\p{\sum_{i'}{A^{i'}}_{I_i}e'_{i'}}\otimes\bigotimes_j\p{\sum_{j'}{B^{J_j}}_{j'}e'^{j'}}
=\sum_{I,J,I',J'}T^I_J\bigotimes_i\p{{A^{I'_i}}_{I_i}e'_{I'_i}}\otimes\bigotimes_j\p{{B^{J_j}}_{J'_j}e'^{J'_j}}
=\sum_{I',J'}\p{\sum_{I,J}T^I_J\prod_i{A^{I'_i}}_{I_i}\prod_j{B^{J_j}}_{J'_j}}e'_{I'_1}\otimes\ldots\otimes e'_{I'_k}\otimes e'^{J'_1}\otimes\ldots\otimes e'^{J'_l}
$$
$\blacksquare$
This can be easily generalized to $\mathcal V_1\otimes\ldots\otimes\mathcal V_k$ as defined above,
and with any pair of bases for each component.
Suppose we have a tensor $T^{i_1\ldots i_p}_{j_1\ldots j_q}$.
If we want to change the $k$-th contravariant basis, let $A$ denote the transition matrix for the corresponding bases,
then $$T^{i_1\ldots a\ldots i_p}_{j_1\ldots j_q}=T^{i_1\ldots i_p}_{j_1\ldots j_q}{A^a}_{i_k}$$
If we want to change the $l$-th covariant basis, let $B$ denote the transition matrix for the corresponding dual bases,
then $$T^{i_1\ldots i_p}_{j_1\ldots b\ldots j_q}=T^{i_1\ldots i_p}_{j_1\ldots j_q}{B^{j_l}}_b$$
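The transformation rule can be sanity-checked numerically for a $(1,1)$-tensor over the reals, using the basis-invariance of the full contraction $T^i_jw_iv^j$; the bases, components, and seed below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 3
E  = rng.normal(size=(n, n))              # basis e  (columns)
Ep = rng.normal(size=(n, n))              # basis e' (columns)
A  = np.linalg.inv(Ep) @ E                # transition matrix from e to e'
B  = np.linalg.inv(E) @ Ep                # transition matrix for the dual bases

T = rng.normal(size=(n, n))               # components T^i_j w.r.t. e

# T'^{i'}_{j'} = sum_{i,j} T^i_j A^{i'}_i B^j_{j'}
Tp = np.einsum('pi,ij,jq->pq', A, T, B)

# The full contraction T^i_j w_i v^j is basis independent:
cv = rng.normal(size=n)                   # coordinates of a vector w.r.t. e
cw = rng.normal(size=n)                   # coordinates of a covector w.r.t. the dual of e
assert np.isclose(cw @ T @ cv, (cw @ B) @ Tp @ (A @ cv))
```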
Proposition.
Tensor contraction is coordinate independent.
(show proof)
Proof.
We will show the special case where the contraction is performed on $T\in T^k_l(V)$ over a single pair of indexes, the last contravariant and the first covariant.
The general case follows similarly.
We will use the same notations as above,
and use $S,S'$ to denote the tensors resulting from the contraction computed with $e,e'$ respectively.
Then
$$
\sum_b{T'^{I'b}}_{bJ'}
=\sum_b\sum_{I,a,a',J}{T^{Ia}}_{a'J}\prod_i{A^{I'_i}}_{I_i}\p{{A^b}_a{B^{a'}}_{b}}\prod_j{B^{J_j}}_{J'_j}
=\sum_{I,a,a',J}\p{\sum_b\p{{B^{a'}}_{b}{A^b}_a}}{T^{Ia}}_{a'J}\prod_i{A^{I'_i}}_{I_i}\prod_j{B^{J_j}}_{J'_j}
=\sum_{I,a,a',J}\delta^{a'}_a{T^{Ia}}_{a'J}\prod_i{A^{I'_i}}_{I_i}\prod_j{B^{J_j}}_{J'_j}
=\sum_{I,J}\sum_a{T^{Ia}}_{aJ}\prod_i{A^{I'_i}}_{I_i}\prod_j{B^{J_j}}_{J'_j}
$$
Thus
$$
S
=\sum_{I,J}\p{\sum_a{T^{Ia}}_{aJ}}e_{I_1}\otimes\ldots\otimes e_{I_{k-1}}\otimes e^{J_1}\otimes\ldots\otimes e^{J_{l-1}}
=\sum_{I',J'}\p{\sum_{I,J}\sum_a{T^{Ia}}_{aJ}\prod_i{A^{I'_i}}_{I_i}\prod_j{B^{J_j}}_{J'_j}}e'_{I'_1}\otimes\ldots\otimes e'_{I'_{k-1}}\otimes e'^{J'_1}\otimes\ldots\otimes e'^{J'_{l-1}}
=\sum_{I',J'}\p{\sum_b{T'^{I'b}}_{bJ'}}e'_{I'_1}\otimes\ldots\otimes e'_{I'_{k-1}}\otimes e'^{J'_1}\otimes\ldots\otimes e'^{J'_{l-1}}
=S'
$$
$\blacksquare$
Pullback of covariant tensor
Suppose $V$ and $W$ are vector spaces and $A\in L(V,W)$.
Given any natural number $k$, we can define a function $A^*:T^k(W^*)\to T^k(V^*)$
such that for all $T\in T^k(W^*)$ and $(v_1,\ldots,v_k)\in V^k$, $$A^*T(v_1,\ldots,v_k)=T(Av_1,\ldots,Av_k)$$
We call $A^*T$ the pullback of $T$ by $A$.
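In coordinates, if $A$ has matrix $({A^a}_i)$, the pullback of a covariant $2$-tensor has coordinates $(A^*T)_{ij}=\sum_{a,b}T_{ab}{A^a}_i{A^b}_j$. A NumPy sketch of this (illustrative names, 0-based indices):

```python
import numpy as np

rng = np.random.default_rng(2)
dim_V, dim_W = 3, 4
A = rng.standard_normal((dim_W, dim_V))  # matrix of a linear map V -> W
T = rng.standard_normal((dim_W, dim_W))  # a covariant 2-tensor on W

# (A*T)_{ij} = sum_{a,b} T_{ab} A^a_i A^b_j
pullT = np.einsum("ab,ai,bj->ij", T, A, A)

# Defining property: (A*T)(v1, v2) = T(A v1, A v2)
v1, v2 = rng.standard_normal(dim_V), rng.standard_normal(dim_V)
lhs = np.einsum("ij,i,j->", pullT, v1, v2)
rhs = np.einsum("ab,a,b->", T, A @ v1, A @ v2)
assert np.isclose(lhs, rhs)
```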
Alternating tensor
Let $k\in N$ and $T\in T^k(V^*)$. If for all $(v_1,\ldots,v_k)\in V^k$ and $\sigma\in S_k$ such that $\sigma$ is a transposition,
we have $T(v_1,\ldots,v_k)=-T(v_{\sigma_1},\ldots,v_{\sigma_k})$,
then $T$ is said to be alternating. Clearly, the set of alternating tensors in $T^k(V^*)$, denoted $\Lambda^k(V^*)$, is a subspace of $T^k(V^*)$.
We will also define a function $\Alt:T^k(V^*)\to\Lambda^k(V^*)$ by
$$\Alt(T)(v_1,\ldots,v_k)=\frac{1}{k!}\sum_{\sigma\in S_k}\sgn(\sigma)T(v_{\sigma_1},\ldots,v_{\sigma_k})$$
which is well-defined
(show proof).
Proof.
Clearly, given $T\in T^k(V^*)$, $(v_1,\ldots,v_k)\mapsto\frac{1}{k!}\sum_{\sigma\in S_k}\sgn(\sigma)T(v_{\sigma_1},\ldots,v_{\sigma_k})$
defines a function $A_T:V^k\to F$ such that $A_T\in T^k(V^*)$.
Let $(v_1,\ldots,v_k)\in V^k$ and $\tau\in S_k$ be a transposition,
then the function $\sigma\mapsto\tau\sigma$ is clearly a bijection from $S_k$ to $S_k$. Hence
$$A_T(v_1,\ldots,v_k)
=\frac{1}{k!}\sum_{\sigma\in S_k}\sgn(\sigma)T(v_{\sigma_1},\ldots,v_{\sigma_k})
=\frac{1}{k!}\sum_{\sigma\in S_k}\sgn(\tau\sigma)T(v_{\tau\sigma_1},\ldots,v_{\tau\sigma_k})
=-\frac{1}{k!}\sum_{\sigma\in S_k}\sgn(\sigma)T(v_{\tau\sigma_1},\ldots,v_{\tau\sigma_k})
=-A_T(v_{\tau_1},\ldots,v_{\tau_k})$$
We have shown that $A_T\in\Lambda^k(V^*)$.
Thus $T\mapsto A_T$ defines a function $\Alt:T^k(V^*)\to\Lambda^k(V^*)$,
and for all $T\in T^k(V^*)$, for all $(v_1,\ldots,v_k)\in V^k$,
$$\Alt(T)(v_1,\ldots,v_k)=\frac{1}{k!}\sum_{\sigma\in S_k}\sgn(\sigma)T(v_{\sigma_1},\ldots,v_{\sigma_k})$$
$\blacksquare$
Note.
Clearly, $\Alt:T^k(V^*)\to\Lambda^k(V^*)$ is linear.
Proposition.
For all $\omega\in\Lambda^k(V^*)$, $\Alt(\omega)=\omega$.
(show proof)
Proof.
Let $\sigma\in S_k$. If $\sgn(\sigma)=1$, then $\sigma$ is a composition of an even number of elementary transpositions,
hence $\omega(v_{\sigma_1},\ldots,v_{\sigma_k})=\omega(v_1,\ldots,v_k)$ for all $(v_1,\ldots,v_k)\in V^k$.
If $\sgn(\sigma)=-1$, then $\sigma$ is a composition of an odd number of elementary transpositions,
hence $\omega(v_{\sigma_1},\ldots,v_{\sigma_k})=-\omega(v_1,\ldots,v_k)$ for all $(v_1,\ldots,v_k)\in V^k$.
In both cases, $\sgn(\sigma)\omega(v_{\sigma_1},\ldots,v_{\sigma_k})=\omega(v_1,\ldots,v_k)$.
Hence $\Alt(\omega)(v_1,\ldots,v_k)=\frac{1}{k!}\sum_{\sigma\in S_k}\sgn(\sigma)\omega(v_{\sigma_1},\ldots,v_{\sigma_k})=\frac{1}{k!}\cdot k!\cdot\omega(v_1,\ldots,v_k)=\omega(v_1,\ldots,v_k)$.
$\blacksquare$
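On coordinate arrays, $\Alt$ is a signed average over axis permutations. A NumPy sketch (illustrative names, 0-based indices) checking both that $\Alt(T)$ is alternating and that $\Alt$ fixes alternating tensors:

```python
import math
from itertools import permutations
import numpy as np

def perm_sign(p):
    """Sign of a permutation given as a tuple of 0-based indices."""
    sign, seen = 1, [False] * len(p)
    for i in range(len(p)):
        if not seen[i]:
            j, length = i, 0
            while not seen[j]:
                seen[j] = True
                j, length = p[j], length + 1
            if length % 2 == 0:
                sign = -sign
    return sign

def alt(T):
    """Alt on a covariant k-tensor stored as its coordinate array."""
    k = T.ndim
    return sum(perm_sign(p) * np.transpose(T, p)
               for p in permutations(range(k))) / math.factorial(k)

rng = np.random.default_rng(3)
T = rng.standard_normal((4, 4, 4))  # an arbitrary covariant 3-tensor
w = alt(T)

# Alt(T) is alternating: swapping two arguments flips the sign
assert np.allclose(w, -w.transpose(1, 0, 2))
# Alt fixes alternating tensors: Alt(Alt(T)) = Alt(T)
assert np.allclose(alt(w), w)
```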
Notation.
Suppose $I$ is a $k$-tuple of $n$, $J$ is an $l$-tuple of $n$, and $\sigma$ is a $k$-permutation.
We use $I\sigma$ to denote $I\circ\sigma$ and $IJ$ to denote $(I_1,\ldots,I_k,J_1,\ldots,J_l)$.
If $I$ is injective, then it is a bijection onto its range, and $I^{-1}$ denotes the inverse of this bijection.
Elementary alternating tensor
Let $I$ be a $k$-tuple of $n$.
Let $(e_i)$ be a basis of $V$ and let $(\mu^i)$ be its dual basis.
Define $$e^I(v_1,\ldots,v_k)=\det\p{\p{\mu^{I_i}(v_j)}_{ij}}$$
Then $e^I\in\Lambda^k(V^*)$.
Note that if $k=0$, $e^{()}$ is equivalent to the scalar $1$;
if $k=1$, $e^{(i)}$ is equivalent to $\mu^i$.
Lemma.
Let $I,J$ be $k$-tuples of $n$.
- If $I$ is not injective, then $e^I=0$.
- If $I$ and $J$ are injective and their ranges are equal, then $e^J=e^I\sgn(I^{-1}J)$.
(show proof)
Proof.
If $I$ is not injective, then given $v_1,\ldots,v_k\in V$, $\p{\mu^{I_i}(v_j)}_{ij}$ has repeated rows, thus $e^I=0$.
Suppose $I$ and $J$ are injective and their ranges are equal.
Suppose $\sigma$ is a permutation of $k$ and $\tau$ is an elementary transposition of $k$, and $e^{I\sigma}=e^I\sgn(\sigma)$,
then given $v_1,\ldots,v_k\in V$, $e^{I\sigma\tau}(v_1,\ldots,v_k)=\det\p{\p{\mu^{{I\sigma\tau}_i}(v_j)}_{ij}}=-\det\p{\p{\mu^{{I\sigma}_i}(v_j)}_{ij}}=-e^{I\sigma}(v_1,\ldots,v_k)$.
Thus $e^{I\sigma\tau}=-e^{I\sigma}=-e^I\sgn(\sigma)=e^I\sgn(\sigma\tau)$.
By induction, we have $e^J=e^{II^{-1}J}=e^I\sgn(I^{-1}J)$.
$\blacksquare$
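For the standard basis of $F^n$ the matrix $\p{\mu^{I_i}(v_j)}_{ij}$ just selects components, so $e^I$ can be tabulated directly. A NumPy sketch (0-based tuples, illustrative names) confirming both parts of the lemma:

```python
from itertools import product
import numpy as np

def elem(I, n):
    """Coordinate array of e^I for the standard basis of F^n (0-based I)."""
    k = len(I)
    E = np.zeros((n,) * k)
    for J in product(range(n), repeat=k):
        M = [[1.0 if I[a] == J[b] else 0.0 for b in range(k)]
             for a in range(k)]
        E[J] = np.linalg.det(np.array(M))
    return E

n = 4
# Non-injective I gives the zero tensor
assert np.allclose(elem((1, 1, 2), n), 0)
# Permuting the entries of I only changes the sign: (2, 0, 1) is an even
# rearrangement of (0, 1, 2), while (1, 0, 2) is odd
assert np.allclose(elem((2, 0, 1), n), elem((0, 1, 2), n))
assert np.allclose(elem((1, 0, 2), n), -elem((0, 1, 2), n))
```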
Definition.
We define a function $\delta$ from pairs of $k$-tuples of $n$ to $F$ by
$$\delta^I_J=\det\p{(\delta^{I_i}_{J_j})_{ij}}$$
Lemma.
Let $I,J$ be $k$-tuples of $n$.
If $I$ or $J$ is not injective, or the range of $I$ does not equal the range of $J$, then $\delta^I_J=0$;
otherwise, $\delta^I_J=\sgn(I^{-1}J)$.
(show proof)
Proof.
If $I$ or $J$ is not injective, then $(\delta^{I_i}_{J_j})_{ij}$ has repeated rows or columns, thus $\delta^I_J=0$.
If the range of $I$ does not equal the range of $J$, then $(\delta^{I_i}_{J_j})_{ij}$ has a zero row or a zero column, thus $\delta^I_J=0$.
Suppose otherwise, then $(\delta^{I_i}_{I_j})_{ij}$ is the identity matrix, thus $\delta^I_I=1=\sgn(e)$ where $e$ is the identity permutation of $k$.
Suppose $\sigma$ is a permutation of $k$ and $\tau$ is an elementary transposition of $k$, and $\delta^I_{I\sigma}=\sgn(\sigma)$,
then $\delta^I_{I\sigma\tau}=\det\p{(\delta^{I_i}_{I\sigma\tau_j})_{ij}}=-\det\p{(\delta^{I_i}_{I\sigma_j})_{ij}}=-\delta^I_{I\sigma}=-\sgn(\sigma)=\sgn(\sigma\tau)$.
By induction, $\delta^I_J=\delta^I_{II^{-1}J}=\sgn(I^{-1}J)$.
$\blacksquare$
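A direct tabulation of $\delta^I_J$ matches the case analysis of the lemma (a sketch with 0-based tuples; names are illustrative):

```python
import numpy as np

def delta(I, J):
    """delta^I_J = det of the matrix with entries delta^{I_i}_{J_j}."""
    k = len(I)
    M = [[1.0 if I[a] == J[b] else 0.0 for b in range(k)]
         for a in range(k)]
    return np.linalg.det(np.array(M))

# Zero when I or J repeats an entry, or when the ranges differ
assert np.isclose(delta((0, 0), (0, 1)), 0.0)
assert np.isclose(delta((0, 1), (0, 2)), 0.0)
# Otherwise the sign of the permutation relating I and J
assert np.isclose(delta((0, 1, 2), (0, 1, 2)), 1.0)
assert np.isclose(delta((0, 1, 2), (1, 0, 2)), -1.0)
```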
Basis of alternating tensor space
Let $(e_1,\ldots,e_n)$ be a basis of $V$, and let $\mathcal J$ denote the set of strictly increasing $k$-tuples of $n$.
Then the elementary alternating tensors $$(e^J)_{J\in\mathcal J}$$
form a basis of $\Lambda^k(V^*)$.
Hence $$\dim\Lambda^k(V^*)=\binom{n}{k}$$
Given $\omega\in\Lambda^k(V^*)$ and $J\in\mathcal J$, the coordinate of $\omega$ with respect to this basis is
$$\omega_J=\omega(e_{J_1},\ldots,e_{J_k})$$
(show proof)
Proof.
Let $\mathcal I$ denote the set of $k$-tuples of $n$.
Let $\omega\in\Lambda^k(V^*)$.
Let $I\in\mathcal I$.
- If $I$ is not injective, then
$$
\sum_{J\in\mathcal J}\omega(e_{J_1},\ldots,e_{J_k})e^J(e_{I_1},\ldots,e_{I_k})
=\sum_{J\in\mathcal J}\omega(e_{J_1},\ldots,e_{J_k})\delta^J_I
=0
=\omega(e_{I_1},\ldots,e_{I_k})
$$
- If $I$ is injective, then
$$
\sum_{J\in\mathcal J}\omega(e_{J_1},\ldots,e_{J_k})e^J(e_{I_1},\ldots,e_{I_k})
=\sum_{J\in\mathcal J}\omega(e_{J_1},\ldots,e_{J_k})\delta^J_I
=\omega(e_{J'_1},\ldots,e_{J'_k})\delta^{J'}_I
=\omega(e_{I_1},\ldots,e_{I_k})\sgn(I^{-1}J')\sgn({J'}^{-1}I)
=\omega(e_{I_1},\ldots,e_{I_k})
$$
where $J'\in\mathcal J$ is the unique strictly increasing tuple with the same range as $I$.
Thus given any $(v_1,\ldots,v_k)\in V^k$,
$$
\sum_{J\in\mathcal J}\omega(e_{J_1},\ldots,e_{J_k})e^J(v_1,\ldots,v_k)
=\sum_{J\in\mathcal J}\omega(e_{J_1},\ldots,e_{J_k})\sum_{I\in\mathcal I}e^J(e_{I_1},\ldots,e_{I_k})c^I
=\sum_{I\in\mathcal I}\sum_{J\in\mathcal J}\omega(e_{J_1},\ldots,e_{J_k})e^J(e_{I_1},\ldots,e_{I_k})c^I
=\sum_{I\in\mathcal I}\omega(e_{I_1},\ldots,e_{I_k})c^I
=\omega(v_1,\ldots,v_k)
$$
where $c^I$ is the product of the coordinates $v_1^{I_1},\ldots,v_k^{I_k}$.
Hence $$\sum_{J\in\mathcal J}\omega(e_{J_1},\ldots,e_{J_k})e^J=\omega$$
We have shown that $(e^J)_{J\in\mathcal J}$ spans $\Lambda^k(V^*)$.
Suppose for some scalars $(c_J)_{J\in\mathcal J}$,
$\sum_{J\in\mathcal J}c_Je^J=0$.
Then for each $I\in\mathcal J$,
$$
c_I
=\sum_{J\in\mathcal J}c_J\delta^J_I
=\sum_{J\in\mathcal J}c_Je^J(e_{I_1},\ldots,e_{I_k})
=0(e_{I_1},\ldots,e_{I_k})
=0
$$
We have shown that $(e^J)_{J\in\mathcal J}$ is independent.
$\blacksquare$
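The expansion in the proof can be checked numerically: build a random alternating tensor, read off its coordinates on strictly increasing tuples, and reconstruct it. A NumPy sketch reusing the `alt` and `elem` helpers sketched earlier (illustrative names, 0-based indices):

```python
import math
from itertools import combinations, permutations, product
import numpy as np

def perm_sign(p):
    sign, seen = 1, [False] * len(p)
    for i in range(len(p)):
        if not seen[i]:
            j, length = i, 0
            while not seen[j]:
                seen[j] = True
                j, length = p[j], length + 1
            if length % 2 == 0:
                sign = -sign
    return sign

def alt(T):
    k = T.ndim
    return sum(perm_sign(p) * np.transpose(T, p)
               for p in permutations(range(k))) / math.factorial(k)

def elem(I, n):
    k = len(I)
    E = np.zeros((n,) * k)
    for J in product(range(n), repeat=k):
        E[J] = np.linalg.det(np.array(
            [[1.0 if I[a] == J[b] else 0.0 for b in range(k)]
             for a in range(k)]))
    return E

n, k = 5, 3
rng = np.random.default_rng(4)
w = alt(rng.standard_normal((n,) * k))  # a random alternating k-tensor

# Strictly increasing k-tuples index the basis; dim = C(n, k)
increasing = list(combinations(range(n), k))
assert len(increasing) == math.comb(n, k)

# Coordinate for J is w(e_{J_1}, ..., e_{J_k}), i.e. the array entry w[J]
recon = sum(w[J] * elem(J, n) for J in increasing)
assert np.allclose(recon, w)
```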
Wedge product
We define the operation $\wedge:T^k(V^*)\times T^l(V^*)\to\Lambda^{k+l}(V^*)$ by
$$\omega\wedge\eta=\frac{(k+l)!}{k!l!}\Alt(\omega\otimes\eta)$$
Proposition.
Let $(e_1,\ldots,e_n)$ be a basis of $V$, let $I$ be a $k$-tuple of $n$, and let $J$ be an $l$-tuple of $n$.
Then $$e^I\wedge e^J=e^{IJ}$$
(show proof)
Proof.
Let $K$ be a $(k+l)$-tuple of $n$.
Then $e^{IJ}(e_{K_1},\ldots,e_{K_{k+l}})=\delta^{IJ}_K$ and
$$
e^I\wedge e^J(e_{K_1},\ldots,e_{K_{k+l}})
=\frac{(k+l)!}{k!l!}\Alt(e^I\otimes e^J)(e_{K_1},\ldots,e_{K_{k+l}})
=\frac{(k+l)!}{k!l!}\frac{1}{(k+l)!}\sum_{\sigma\in S_{k+l}}\sgn(\sigma)(e^I\otimes e^J)(e_{K\sigma_1},\ldots,e_{K\sigma_{k+l}})
=\frac{1}{k!l!}\sum_{\sigma\in S_{k+l}}\sgn(\sigma)e^I(e_{K\sigma_1},\ldots,e_{K\sigma_k})e^J(e_{K\sigma_{k+1}},\ldots,e_{K\sigma_{k+l}})
=\frac{1}{k!l!}\sum_{\sigma\in S_{k+l}}\sgn(\sigma)\delta^I_{K\sigma[1:k]}\delta^J_{K\sigma[k+1:k+l]}
$$
Suppose $K$ is not injective, then both $e^I\wedge e^J(e_{K_1},\ldots,e_{K_{k+l}})$ and $e^{IJ}(e_{K_1},\ldots,e_{K_{k+l}})$ are clearly $0$.
Now suppose $K$ is injective and $IJ$ is not injective, then for some distinct indices $a,b$, $(IJ)_a=(IJ)_b$, and $e^{IJ}(e_{K_1},\ldots,e_{K_{k+l}})=0$.
- If $a,b\in\{1,\ldots,k\}$, then each $\delta^I_{K\sigma[1:k]}$ is $0$, thus $e^I\wedge e^J(e_{K_1},\ldots,e_{K_{k+l}})=0$.
- If $a,b\in\{k+1,\ldots,k+l\}$, then each $\delta^J_{K\sigma[k+1:k+l]}$ is $0$, thus $e^I\wedge e^J(e_{K_1},\ldots,e_{K_{k+l}})=0$.
- If one of $a$ and $b$ is in $\{1,\ldots,k\}$ and the other in $\{k+1,\ldots,k+l\}$, then for each $\sigma\in S_{k+l}$,
either $\delta^I_{K\sigma[1:k]}=0$ or $\delta^J_{K\sigma[k+1:k+l]}=0$, thus $e^I\wedge e^J(e_{K_1},\ldots,e_{K_{k+l}})=0$.
Now suppose both $K$ and $IJ$ are injective.
If the ranges of $K$ and $IJ$ do not match, then for each $\sigma\in S_{k+l}$, the ranges of $K\sigma$ and $IJ$ also do not match.
Thus both $e^I\wedge e^J(e_{K_1},\ldots,e_{K_{k+l}})$ and $e^{IJ}(e_{K_1},\ldots,e_{K_{k+l}})$ are $0$.
Now suppose the range of $K$ equals the range of $IJ$, then $K^{-1}(IJ)$ is a $(k+l)$-permutation.
Let $\mathcal S$ denote the subset of $S_{k+l}$ such that $\sigma\in\mathcal S$ if and only if the restriction of $\sigma$ on $\{1,\ldots,k\}$ is a $k$-permutation.
Then
$$
e^I\wedge e^J(e_{K_1},\ldots,e_{K_{k+l}})
=\frac{1}{k!l!}\sum_{\sigma\in S_{k+l}}\sgn(K^{-1}(IJ)\sigma)\delta^I_{(IJ)\sigma[1:k]}\delta^J_{(IJ)\sigma[k+1:k+l]}
=\frac{1}{k!l!}\sum_{\sigma\in\mathcal S}\sgn(K^{-1}(IJ)\sigma)\delta^I_{(IJ)\sigma[1:k]}\delta^J_{(IJ)\sigma[k+1:k+l]}
=\frac{1}{k!l!}\sum_{\tau\in S_k}\sum_{\eta\in S_l}\sgn(K^{-1}(IJ))\sgn(\tau)\sgn(\eta)\delta^I_{I\tau}\delta^J_{J\eta}
$$ $$
=\sgn(K^{-1}(IJ))\p{\frac{1}{k!}\sum_{\tau\in S_k}\sgn(\tau)\delta^I_{I\tau}}\p{\frac{1}{l!}\sum_{\eta\in S_l}\sgn(\eta)\delta^J_{J\eta}}
=\sgn(K^{-1}(IJ))\Alt(e^I)(e_{I_1},\ldots,e_{I_k})\Alt(e^J)(e_{J_1},\ldots,e_{J_l})
=\sgn(K^{-1}(IJ))e^I(e_{I_1},\ldots,e_{I_k})e^J(e_{J_1},\ldots,e_{J_l})
=\sgn((IJ)^{-1}K)
=e^{IJ}(e_{K_1},\ldots,e_{K_{k+l}})
$$
We have shown that for every $(k+l)$-tuple $K$ of $n$, $e^I\wedge e^J(e_{K_1},\ldots,e_{K_{k+l}})=e^{IJ}(e_{K_1},\ldots,e_{K_{k+l}})$.
Given any $(v_1,\ldots,v_{k+l})\in V^{k+l}$,
$$
e^I\wedge e^J(v_1,\ldots,v_{k+l})
=\sum_{K\in\mathcal K}e^I\wedge e^J(e_{K_1},\ldots,e_{K_{k+l}})c^K
=\sum_{K\in\mathcal K}e^{IJ}(e_{K_1},\ldots,e_{K_{k+l}})c^K
=e^{IJ}(v_1,\ldots,v_{k+l})
$$
where $\mathcal K$ is the set of $(k+l)$-tuples of $n$, and each $c^K$ is the product of the coordinates $v_1^{K_1},\ldots,v_{k+l}^{K_{k+l}}$.
Therefore, $$e^I\wedge e^J=e^{IJ}$$
$\blacksquare$
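With the helpers sketched earlier, the proposition $e^I\wedge e^J=e^{IJ}$ can be spot-checked numerically (illustrative names, 0-based tuples):

```python
import math
from itertools import permutations, product
import numpy as np

def perm_sign(p):
    sign, seen = 1, [False] * len(p)
    for i in range(len(p)):
        if not seen[i]:
            j, length = i, 0
            while not seen[j]:
                seen[j] = True
                j, length = p[j], length + 1
            if length % 2 == 0:
                sign = -sign
    return sign

def alt(T):
    k = T.ndim
    return sum(perm_sign(p) * np.transpose(T, p)
               for p in permutations(range(k))) / math.factorial(k)

def elem(I, n):
    k = len(I)
    E = np.zeros((n,) * k)
    for J in product(range(n), repeat=k):
        E[J] = np.linalg.det(np.array(
            [[1.0 if I[a] == J[b] else 0.0 for b in range(k)]
             for a in range(k)]))
    return E

def wedge(w, e):
    """w wedge e = ((k+l)!/(k! l!)) Alt(w tensor e) on coordinate arrays."""
    k, l = w.ndim, e.ndim
    factor = math.factorial(k + l) / (math.factorial(k) * math.factorial(l))
    return factor * alt(np.multiply.outer(w, e))

n = 4
I, J = (0, 2), (1, 3)
assert np.allclose(wedge(elem(I, n), elem(J, n)), elem(I + J, n))
```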
Note.
Repeated wedge product on a tuple $(\omega_1,\ldots,\omega_m)$ of alternating tensors can be denoted as $$\bigwedge_{i=1}^m\omega_i$$
When $m=0$, we define $\bigwedge_{i=1}^m\omega_i$ to be the tensor $1\in L(;F)$.
Note.
By the above proposition, given $\omega\in\Lambda^k(V^*)$,
$$\omega=\sum_I\omega(e_{I_1},\ldots,e_{I_k})\mu^{I_1}\wedge\ldots\wedge\mu^{I_k}$$
where the sum ranges over all strictly increasing $k$-tuples $I$ of $n$.
Properties of wedge product
Let $\omega,\omega_1,\omega_2\in\Lambda^k(V^*)$, $\eta,\eta_1,\eta_2\in\Lambda^l(V^*)$, $\theta\in\Lambda^m(V^*)$, and $c\in F$, then
$$(\omega_1+\omega_2)\wedge\eta=\omega_1\wedge\eta+\omega_2\wedge\eta$$
$$\omega\wedge(\eta_1+\eta_2)=\omega\wedge\eta_1+\omega\wedge\eta_2$$
$$(c\omega)\wedge\eta=\omega\wedge(c\eta)=c(\omega\wedge\eta)$$
$$(\omega\wedge\eta)\wedge\theta=\omega\wedge(\eta\wedge\theta)$$
$$\omega\wedge\eta=(-1)^{kl}\eta\wedge\omega$$
(show proof)
Proof.
$$
(\omega_1+\omega_2)\wedge\eta
=\frac{(k+l)!}{k!l!}\Alt((\omega_1+\omega_2)\otimes\eta)
=\frac{(k+l)!}{k!l!}\Alt(\omega_1\otimes\eta+\omega_2\otimes\eta)
=\frac{(k+l)!}{k!l!}\Alt(\omega_1\otimes\eta)+\frac{(k+l)!}{k!l!}\Alt(\omega_2\otimes\eta)
=\omega_1\wedge\eta+\omega_2\wedge\eta
$$
$$
\omega\wedge(\eta_1+\eta_2)
=\frac{(k+l)!}{k!l!}\Alt(\omega\otimes(\eta_1+\eta_2))
=\frac{(k+l)!}{k!l!}\Alt(\omega\otimes\eta_1+\omega\otimes\eta_2)
=\frac{(k+l)!}{k!l!}\Alt(\omega\otimes\eta_1)+\frac{(k+l)!}{k!l!}\Alt(\omega\otimes\eta_2)
=\omega\wedge\eta_1+\omega\wedge\eta_2
$$
$$
(c\omega)\wedge\eta
=\frac{(k+l)!}{k!l!}\Alt((c\omega)\otimes\eta)
=\frac{(k+l)!}{k!l!}\Alt(c(\omega\otimes\eta))
=c\frac{(k+l)!}{k!l!}\Alt(\omega\otimes\eta)
=c(\omega\wedge\eta)
$$
$$
\omega\wedge(c\eta)
=\frac{(k+l)!}{k!l!}\Alt(\omega\otimes(c\eta))
=\frac{(k+l)!}{k!l!}\Alt(c(\omega\otimes\eta))
=c\frac{(k+l)!}{k!l!}\Alt(\omega\otimes\eta)
=c(\omega\wedge\eta)
$$
Let $(e_1,\ldots,e_n)$ be a basis of $V$, and expand $\omega,\eta,\theta$ in the corresponding elementary alternating tensors, with $I,J,K$ ranging over strictly increasing tuples.
$$
(\omega\wedge\eta)\wedge\theta
=\p{\sum_I\omega_Ie^I\wedge\sum_J\eta_Je^J}\wedge\sum_K\theta_Ke^K
=\sum_I\sum_J\sum_K\omega_I\eta_J\theta_K((e^I\wedge e^J)\wedge e^K)
=\sum_I\sum_J\sum_K\omega_I\eta_J\theta_Ke^{IJK}
=\sum_I\sum_J\sum_K\omega_I\eta_J\theta_K(e^I\wedge(e^J\wedge e^K))
=\sum_I\omega_Ie^I\wedge\p{\sum_J\eta_Je^J\wedge\sum_K\theta_Ke^K}
=\omega\wedge(\eta\wedge\theta)
$$
$$
\omega\wedge\eta
=\sum_I\omega_Ie^I\wedge\sum_J\eta_Je^J
=\sum_I\sum_J\omega_I\eta_Je^{IJ}
=\sum_J\sum_I\eta_J\omega_Ie^{JI}(-1)^{kl}
=(-1)^{kl}\p{\sum_J\eta_Je^J\wedge\sum_I\omega_Ie^I}
=(-1)^{kl}\eta\wedge\omega
$$
$\blacksquare$
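Associativity and graded anticommutativity can likewise be spot-checked with the same helpers (illustrative names; here $k=1$, $l=3$, so the sign is $(-1)^{3}=-1$):

```python
import math
from itertools import permutations
import numpy as np

def perm_sign(p):
    sign, seen = 1, [False] * len(p)
    for i in range(len(p)):
        if not seen[i]:
            j, length = i, 0
            while not seen[j]:
                seen[j] = True
                j, length = p[j], length + 1
            if length % 2 == 0:
                sign = -sign
    return sign

def alt(T):
    k = T.ndim
    return sum(perm_sign(p) * np.transpose(T, p)
               for p in permutations(range(k))) / math.factorial(k)

def wedge(w, e):
    k, l = w.ndim, e.ndim
    factor = math.factorial(k + l) / (math.factorial(k) * math.factorial(l))
    return factor * alt(np.multiply.outer(w, e))

n = 4
rng = np.random.default_rng(5)
w = rng.standard_normal(n)                 # a covector (k = 1)
eta = alt(rng.standard_normal((n, n, n)))  # an alternating 3-tensor

# w wedge eta = (-1)^{1*3} eta wedge w
assert np.allclose(wedge(w, eta), -wedge(eta, w))

# Associativity, checked on covectors
a, b, c = rng.standard_normal((3, n))
assert np.allclose(wedge(wedge(a, b), c), wedge(a, wedge(b, c)))
```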
Proposition.
Let $(v_1,\ldots,v_k)\in V^k$ and $(w^1,\ldots,w^k)\in {V^*}^k$.
Then $$(w^1\wedge\ldots\wedge w^k)(v_1,\ldots,v_k)=\det\p{\p{w^i(v_j)}_{ij}}$$
(show proof)
Proof.
Let $(e_1,\ldots,e_n)$ be a basis of $V$ and let $(\mu^1,\ldots,\mu^n)$ be the corresponding dual basis.
Write each $w^i=\sum_j w^i_j\mu^j$ and let $c_I=\prod_i w^i_{I_i}$ for each $k$-tuple $I$ of $n$.
By multilinearity,
$$
(w^1\wedge\ldots\wedge w^k)(v_1,\ldots,v_k)
=\sum_Ic_I(\mu^{I_1}\wedge\ldots\wedge\mu^{I_k})(v_1,\ldots,v_k)
=\sum_Ic_Ie^I(v_1,\ldots,v_k)
=\sum_Ic_I\det\p{\p{\mu^{I_i}(v_j)}_{ij}}
=\det\p{\p{w^i(v_j)}_{ij}}
$$
where the last equality follows from multilinearity of the determinant in its rows.
$\blacksquare$
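Finally, the determinant formula can be verified numerically with the same helpers (illustrative names, 0-based indices):

```python
import math
from itertools import permutations
import numpy as np

def perm_sign(p):
    sign, seen = 1, [False] * len(p)
    for i in range(len(p)):
        if not seen[i]:
            j, length = i, 0
            while not seen[j]:
                seen[j] = True
                j, length = p[j], length + 1
            if length % 2 == 0:
                sign = -sign
    return sign

def alt(T):
    k = T.ndim
    return sum(perm_sign(p) * np.transpose(T, p)
               for p in permutations(range(k))) / math.factorial(k)

def wedge(w, e):
    k, l = w.ndim, e.ndim
    factor = math.factorial(k + l) / (math.factorial(k) * math.factorial(l))
    return factor * alt(np.multiply.outer(w, e))

n, k = 5, 3
rng = np.random.default_rng(6)
W = rng.standard_normal((k, n))  # rows: covectors w^i in the dual basis
V = rng.standard_normal((k, n))  # rows: vectors v_j in the basis e

# w^1 wedge w^2 wedge w^3 as a coordinate array
prod = W[0]
for row in W[1:]:
    prod = wedge(prod, row)

# Evaluate on (v_1, v_2, v_3) and compare with det((w^i(v_j))_{ij})
val = np.einsum("abc,a,b,c->", prod, V[0], V[1], V[2])
assert np.isclose(val, np.linalg.det(W @ V.T))
```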