Advice: don't include CUDA dependencies in your container if you don't need them
By: Jardvanroest on May 21, 2024, 4:15 p.m.
I made a model that uses torch and torchvision, which resulted in a 3.1 GB container, because the default wheels bundle all CUDA dependencies. If you do not need CUDA, you can use the CPU-only builds of torch and torchvision by putting this in your requirements.txt:
--extra-index-url https://download.pytorch.org/whl/cpu
torch==2.3.0+cpu
torchvision==0.18.0+cpu
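If you want to confirm the CPU-only wheels actually ended up in the image, a small sanity check like the sketch below (not part of the original post, just a generic PyTorch check) can be run inside the container:

# check_torch.py (hypothetical name): verify the CPU-only builds are installed
import torch
import torchvision

print(torch.__version__)          # expected to end in "+cpu"
print(torchvision.__version__)    # expected to end in "+cpu"
print(torch.cuda.is_available())  # expected: False for the CPU-only build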