tensor compression seems to not work · Issue #50 · rballester/tntorch
tensor compression seems to not work #50
Open
@LukGross

Description

The straightforward TT decomposition of a full tensor does not work properly for me.

Minimal example:

import tntorch as tn
import torch
import numpy as np

X, Y, Z = np.meshgrid(range(128), range(128), range(128))
full = torch.Tensor(
    np.sqrt(np.sqrt(X) * (Y + Z) + Y * Z**2) * (X + np.sin(Y) * np.cos(Z))
)  # Some analytical 3D function
print(full.shape)

t = tn.Tensor(full, ranks_tt=3, requires_grad=True)  # You can also pass a list of ranks


def metrics():
    print(t)
    print(
        "Compression ratio: {}/{} = {:g}".format(
            full.numel(), t.numel(), full.numel() / t.numel()
        )
    )
    print("Relative error:", tn.relative_error(full, t))
    print("RMSE:", tn.rmse(full, t))
    print("R^2:", tn.r_squared(full, t))


metrics()

Output:

torch.Size([128, 128, 128])
3D TT tensor:

 128 128 128
  |   |   |
 (0) (1) (2)
 / \ / \ / \
1   3   3   1

Compression ratio: 2097152/2097152.0 = 1
Relative error: tensor(0.0005, grad_fn=<DivBackward0>)
RMSE: tensor(22.0728, grad_fn=<DivBackward0>)
R^2: tensor(1.0000, grad_fn=<RsubBackward1>)

The expected output would be the one given in the tutorial.
In particular, the compression ratio should be much greater than 1.

I experience this behavior with both Python 3.9.6 and 3.12.2 on an M1 MacBook under macOS Sonoma 14.4.1.
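For reference, the parameter count a rank-3 TT decomposition of a 128×128×128 tensor should have can be computed directly from the core shapes shown in the printout above (each core k stores r_{k-1} · n_k · r_k entries). This is a plain-Python sketch of that count, not tntorch's own accounting:

```python
# TT cores for shape (128, 128, 128) with TT-ranks (1, 3, 3, 1):
# core k has r_{k-1} * n_k * r_k entries, boundary ranks are always 1.
shape = [128, 128, 128]
ranks = [1, 3, 3, 1]

tt_params = sum(ranks[k] * shape[k] * ranks[k + 1] for k in range(len(shape)))
full_params = 128 * 128 * 128

print(tt_params)                # 1920
print(full_params)              # 2097152
print(full_params / tt_params)  # ≈ 1092.3
```

So `t.numel()` should report on the order of 1920 parameters and a compression ratio around 1092, not the 2097152.0 / ratio-1 seen in the output above.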
