Description
Hi, I tried to run the demo (BLIVA_Vicuna 7B) on my local machine (V100, 16 GB), and it fails with an OOM error:
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 86.00 MiB (GPU 0; 15.77 GiB total capacity; 12.18 GiB already allocated; 54.88 MiB free; 12.38 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
So what is the minimum amount of GPU memory required? Or is there any way to reduce GPU memory usage, for example something like the sketch below?
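To clarify what I mean, here is a rough sketch of the kind of changes I was thinking of trying: fp16 weights plus the max_split_size_mb setting that the error message suggests. I don't know the demo's actual loading code, so the model below is just a dummy stand-in for BLIVA, not the real API:

```python
# Rough sketch of the memory-reduction ideas (not tested against the real demo).
# The model here is a dummy placeholder standing in for the loaded BLIVA checkpoint.
import os

# Set before the first CUDA allocation, as the error message suggests,
# to reduce fragmentation in PyTorch's caching allocator.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch
import torch.nn as nn

# Placeholder model: in the real demo this would be the BLIVA_Vicuna 7B model.
model = nn.Sequential(nn.Linear(4096, 4096), nn.GELU(), nn.Linear(4096, 4096))

# fp16 weights take half the memory of fp32.
model = model.half().to("cuda")

# Inference only: no_grad avoids keeping activations around for backprop.
with torch.no_grad():
    x = torch.randn(1, 4096, dtype=torch.float16, device="cuda")
    y = model(x)

print(torch.cuda.memory_allocated() / 1024**2, "MiB allocated")
```

My rough math is that even in fp16 the 7B language model alone is about 14 GiB of weights (7B parameters × 2 bytes), plus the vision encoder and other components on top, so 16 GiB may still be tight. If the demo can load the LLM in 8-bit (e.g. via bitsandbytes) that would presumably help, but I'm not sure whether that's supported.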
Thanks a lot!