ETH Zurich AI researchers present “tntorch”: a Python tensor learning library powered by PyTorch that supports multiple decompositions under a unified interface

Tensors are an efficient way to represent and manipulate multidimensional arrays of data, but storing and computing with them in full quickly becomes expensive as the number of dimensions grows. Tensor decompositions address this limitation by factoring a tensor into smaller components, and they have become important in machine learning, for example for compressing the weights of neural networks. This research presents tntorch, an open-source Python package for tensor learning that supports multiple decompositions through a single user interface. Unlike existing packages, which typically tie their API to one particular format, tntorch emphasizes an easy-to-use interface that is independent of the underlying decomposition.

tntorch supports several decomposition models that are important for machine learning, such as CANDECOMP/PARAFAC (CP), the Tucker decomposition, and the tensor train (TT). Figure 1 illustrates examples of tensor networks that tntorch can assemble. A tntorch tensor may also mix more than one format: for example, one can interleave CP and TT cores, connect Tucker factors to TT cores, or combine the three formats in other ways.
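To make the supported formats concrete, here is a minimal sketch of how such tensors can be constructed. The `tn.rand` argument names follow the tntorch documentation, but the exact signatures should be treated as assumptions:

```python
import tntorch as tn

# Random tensors in different formats (4 modes of size 32 each)
t_cp = tn.rand([32, 32, 32, 32], ranks_cp=10)         # CP with rank 10
t_tt = tn.rand([32, 32, 32, 32], ranks_tt=5)          # tensor train, TT-rank 5
t_tucker = tn.rand([32, 32, 32, 32], ranks_tucker=4)  # Tucker with core size 4

# A hybrid format: Tucker factors attached to TT cores
t_hybrid = tn.rand([32, 32, 32, 32], ranks_tt=5, ranks_tucker=4)

print(t_tt)  # prints the shape, ranks, and compression ratio of the network
```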

[Figure 1: examples of tensor networks that tntorch can assemble. Source: https://arxiv.org/pdf/2206.11128v1.pdf]

The basic decompositions in tntorch are CP, TT, and Tucker: CP is stored as a sequence of 2D factor matrices, TT as a sequence of 3D tensor-train cores, and Tucker as TT-like cores with attached factor matrices. Table 1 compares the features of tntorch with those of six related libraries. tntorch uses the cross-approximation technique to extract a compressed TT tensor from a black-box function; for discrete problems, cross-approximation has also been adapted as a gradient-free global optimizer. The library additionally implements TT-matrix and CP-matrix decompositions; CP matrices live in a separate CPMatrix class because they require custom operations that are not available for CP tensors.
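As an illustration of the black-box compression described above, the following sketch follows the cross-approximation usage shown in the tntorch documentation; the `tn.cross` call and its arguments are an assumption based on that documentation:

```python
import torch
import tntorch as tn

# A 4D grid: 64 points per axis on [0, 1]
domain = [torch.linspace(0, 1, 64) for _ in range(4)]

# Black-box function; cross-approximation evaluates it only at adaptively
# chosen samples, passing one vector of coordinates per dimension
def f(x, y, z, w):
    return torch.sqrt(x**2 + y**2 + z**2 + w**2)

t = tn.cross(function=f, domain=domain)  # compressed TT approximation
print(t)
```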

[Table 1: feature comparison of tntorch and six related libraries. Source: https://arxiv.org/pdf/2206.11128v1.pdf]

tntorch can be used to learn incomplete tensors, tensors under constraints, or models with additional loss terms. Tensors can be accessed through a variety of methods, including basic indexing, advanced indexing, indexing with NumPy arrays, and the insertion of dummy dimensions. The library supports a wide range of operations, including tensor-vector and tensor-matrix products, elementwise operations, dot products, convolution, concatenation, mode reordering, rank padding, orthogonalization, and truncation. The TT-SVD algorithm is implemented in batched form, so several tensors can be decomposed into TT cores simultaneously. Several TT-matrix routines are also provided, including fast matrix inversion, linear-algebra operations, and determinant algorithms for rank-1 TT matrices, which are equivalent to Kronecker products.
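For instance, learning an incomplete tensor can be done with ordinary PyTorch optimizers acting directly on the decomposition's cores. The following is a minimal sketch, assuming (as the tntorch documentation suggests) that tensors expose trainable cores via `.cores` and a densification method `.torch()`:

```python
import torch
import tntorch as tn

# Ground-truth data with only ~20% of entries observed
full = torch.randn(16, 16, 16)
mask = torch.rand(16, 16, 16) < 0.2

# TT model with trainable cores
t = tn.rand([16, 16, 16], ranks_tt=4, requires_grad=True)
opt = torch.optim.Adam(t.cores, lr=0.05)

for step in range(500):
    opt.zero_grad()
    # Reconstruction loss only over the observed entries
    loss = ((t.torch() - full)[mask] ** 2).mean()
    loss.backward()
    opt.step()
```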

Four configurations are tested in this work: CPU versus GPU, and looped versus vectorized batch processing in each case. Except for the TT-SVD experiment, which uses N = 4, all experiments use randomly initialized tensors with TT-rank R = 20, physical dimension sizes I = 15, ..., 45, and N = 8 dimensions. The authors used PyTorch 1.13.0a0+git87148f2 and NumPy 1.22.4 on an Intel(R) Core(TM) i7-7700K processor with 64 GB of RAM and an NVIDIA GeForce RTX 3090 GPU. The results show that the GPU performs best in both batch and non-batch mode, and that tntorch scales better with tensor size than the baselines (or comparably, in the case of cross-approximation), making it a suitable choice for data-intensive ML applications.
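A rough idea of such a CPU-versus-GPU comparison can be obtained with a micro-benchmark like the one below. This is a hypothetical sketch, not the paper's exact protocol, and it assumes tntorch tensors can be rebuilt from a list of cores and contracted with `tn.dot`:

```python
import time
import torch
import tntorch as tn

# Random TT tensor: N = 8 modes of size I = 30, TT-rank R = 20
t_cpu = tn.rand([30] * 8, ranks_tt=20)

def bench(t):
    # Time a full network contraction (squared norm via a dot product)
    if t.cores[0].is_cuda:
        torch.cuda.synchronize()
    start = time.perf_counter()
    tn.dot(t, t)
    if t.cores[0].is_cuda:
        torch.cuda.synchronize()
    return time.perf_counter() - start

print(f"CPU: {bench(t_cpu):.4f}s")

if torch.cuda.is_available():
    # Move the tensor to the GPU by transferring its cores
    t_gpu = tn.Tensor([c.cuda() for c in t_cpu.cores])
    bench(t_gpu)  # warm-up
    print(f"GPU: {bench(t_gpu):.4f}s")
```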

To conclude, the PyTorch-powered library integrates many capable tensor formats under a single user interface and provides a wide range of analysis tools and methods. It gives machine learning practitioners access to the power of low-rank tensor decompositions while preserving the familiar look and feel of PyTorch tensors. The library includes many standard features of modern machine learning frameworks, such as automatic differentiation, GPU and batch processing, and advanced indexing.

This article is a paper summary written by Marktechpost Research Staff based on the paper 'TNTORCH: TENSOR NETWORK LEARNING WITH PYTORCH'. All credit for this research goes to the researchers on this project. Check out the paper and GitHub.


James G. Williams