❓ Questions and Help
I did set up XRT_TPU_CONFIG with the IP address of the TPU.
This is my test.py script:
```python
import os
import torch
import torch_xla.core.xla_model as xm

os.environ['XRT_TPU_CONFIG'] = "tpu_worker;0;10.128.0.29:8470"

dev = xm.xla_device()  # Error while executing this line
t1 = torch.randn(3, 3, device=dev)
t2 = torch.randn(3, 3, device=dev)
print(t1 + t2)
```
Here's the error:

I don't know what I'm doing wrong. Can someone give me a possible fix?

Do you require a specific PyTorch/XLA version, or is it fine to use the most recent stable version (2.0)? If you are fine with version 2.0, can you remove the line `os.environ['XRT_TPU_CONFIG'] = "tpu_worker;0;10.128.0.29:8470"` and retry?

Also, `XRT_TPU_CONFIG`, as the name suggests, uses the XRT runtime, which we plan to drop support for in the near future.
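For reference, here is a minimal sketch of what the retried script could look like on PyTorch/XLA 2.0. The `PJRT_DEVICE` setting is an assumption for a Cloud TPU VM and may be unnecessary if the runtime is picked up automatically:

```python
import os
import torch
import torch_xla.core.xla_model as xm

# Assumption: on PyTorch/XLA 2.0 the PJRT runtime replaces XRT, so no
# XRT_TPU_CONFIG address is needed; on a Cloud TPU VM this selects the local TPU.
os.environ.setdefault('PJRT_DEVICE', 'TPU')

dev = xm.xla_device()  # should now resolve to an XLA/TPU device
t1 = torch.randn(3, 3, device=dev)
t2 = torch.randn(3, 3, device=dev)
print(t1 + t2)
```

`setdefault` is used here so that a `PJRT_DEVICE` value already exported in the shell is not overridden.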