Multilinear tensors

using TensorDec
using LinearAlgebra

We consider a multilinear tensor of size 3 × 5 × 4, which is the sum of r = 4 tensor products of the random column vectors of the matrices A0, B0, C0, with weights w0:

r=4
w0 = rand(r)
A0 = rand(3,r)
B0 = rand(5,r)
C0 = rand(4,r)

T0 = tensor(w0, A0, B0, C0)
3×5×4 Array{Float64,3}:
[:, :, 1] =
 0.222402  0.283308  0.249751  0.357905  0.445553
 0.48642   0.249264  0.429532  0.267033  0.514896
 0.484621  0.409858  0.520992  0.446184  0.701548

[:, :, 2] =
 0.100051  0.0767232  0.0951218  0.0922465  0.138464
 0.380406  0.0940893  0.284429   0.0865987  0.280005
 0.33198   0.131513   0.278601   0.129391   0.302344

[:, :, 3] =
 0.0633582  0.121002  0.121238  0.104648   0.150986
 0.170347   0.117575  0.195356  0.0910639  0.200025
 0.164727   0.206409  0.263668  0.16259    0.288677

[:, :, 4] =
 0.0952716  0.0730845  0.0707191  0.112236   0.141138
 0.276662   0.0692249  0.190936   0.0838133  0.212043
 0.253229   0.0953878  0.180919   0.127586   0.237287

We compute its decomposition:

w, A, B, C = decompose(T0);

We obtain a decomposition of rank 4 with weights:

w
4-element Array{Float64,1}:
 1.05137 
 0.608236
 0.657986
 0.199031

The r = 4 vectors of norm 1 forming the first components of the decomposition are the columns of the matrix A:

A
3×4 Array{Float64,2}:
 0.135676  0.383353  0.64105   0.720917
 0.767565  0.435744  0.397167  0.292406
 0.626447  0.814351  0.656744  0.628313

The r = 4 vectors of norm 1 forming the second components are the columns of the matrix B:

B
5×4 Array{Float64,2}:
 -0.747703   -0.0622569  -0.287695  0.285055
 -0.0822081  -0.511699   -0.334426  0.49955 
 -0.499925   -0.527881   -0.187468  0.329212
 -0.0517652  -0.360528   -0.606854  0.507575
 -0.42612    -0.570657   -0.634015  0.550618

The r = 4 vectors of norm 1 forming the third components are the columns of the matrix C:

C
4×4 Array{Float64,2}:
 -0.651003  -0.768702   -0.901631   0.792054
 -0.591965  -0.226033   -0.225253   0.154   
 -0.244873  -0.597034   -0.0443904  0.566911
 -0.407198  -0.0394597  -0.366542   0.165971
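
As a quick check, the columns of A, B and C can be verified to have norm 1. This is a minimal sketch using the norm function from the standard LinearAlgebra package loaded above:

# Each column of A, B and C should be a unit vector.
[norm(A[:,i]) for i in 1:r]
[norm(B[:,i]) for i in 1:r]
[norm(C[:,i]) for i in 1:r]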

This corresponds to the tensor $\sum_{i=1}^{r} w_i \, A[:,i] \otimes B[:,i] \otimes C[:,i]$:

T = tensor(w, A, B, C)
3×5×4 Array{Float64,3}:
[:, :, 1] =
 0.222402  0.283308  0.249751  0.357905  0.445553
 0.48642   0.249264  0.429532  0.267033  0.514896
 0.484621  0.409858  0.520992  0.446184  0.701548

[:, :, 2] =
 0.100051  0.0767232  0.0951218  0.0922465  0.138464
 0.380406  0.0940893  0.284429   0.0865987  0.280005
 0.33198   0.131513   0.278601   0.129391   0.302344

[:, :, 3] =
 0.0633582  0.121002  0.121238  0.104648   0.150986
 0.170347   0.117575  0.195356  0.0910639  0.200025
 0.164727   0.206409  0.263668  0.16259    0.288677

[:, :, 4] =
 0.0952716  0.0730845  0.0707191  0.112236   0.141138
 0.276662   0.0692249  0.190936   0.0838133  0.212043
 0.253229   0.0953878  0.180919   0.127586   0.237287
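
As a sanity check, the same array can be assembled directly from the formula above by accumulating the weighted outer products. This is a minimal sketch that reuses only the objects computed so far:

# Accumulate w[i] * A[:,i] ⊗ B[:,i] ⊗ C[:,i] entry by entry.
S = zeros(size(A,1), size(B,1), size(C,1))
for i in 1:r, j1 in 1:size(A,1), j2 in 1:size(B,1), j3 in 1:size(C,1)
    S[j1,j2,j3] += w[i] * A[j1,i] * B[j2,i] * C[j3,i]
end
norm(S - T)   # expected to be of the order of machine precision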

We compute the $L^2$ norm of the difference between $T$ and $T_0$:

norm(T-T0)
5.81504149525808e-15
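
The reconstruction error is at the level of machine precision. Since the rank-4 decomposition of a generic tensor of this size is unique up to permutation and rescaling of the rank-one terms, the recovered weights should also match the original weights w0 once those are rescaled by the norms of the corresponding columns of A0, B0, C0. A minimal sketch of this check, under that genericity assumption:

# Original weights rescaled by the column norms absorbed into the unit vectors.
w0s = [w0[i] * norm(A0[:,i]) * norm(B0[:,i]) * norm(C0[:,i]) for i in 1:r]

# Compare up to permutation and sign of the recovered rank-one terms.
sort(abs.(w)) ≈ sort(w0s)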