Running an AI model on a GPU requires enough VRAM to hold the model; otherwise inference falls back to the CPU, which is much slower. Mac minis use unified memory shared between the CPU and GPU, so you can get a Mac mini with a lot of shared memory for far less than a discrete GPU with a comparable amount of VRAM.
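As a back-of-the-envelope sketch of why memory capacity matters: a model's weight footprint is roughly parameter count times bytes per weight. The helper below and its ~20% overhead figure (for activations and the KV cache) are my own rough assumptions, not a precise rule.

```python
# Rough memory estimate for running an LLM:
# parameters x bytes per weight, plus ~20% overhead (assumed) for
# activations and the KV cache.
def estimate_vram_gb(params_billion: float, bytes_per_weight: float,
                     overhead: float = 0.2) -> float:
    weights_gb = params_billion * bytes_per_weight  # 1e9 params * bytes ~ GB
    return weights_gb * (1 + overhead)

for name, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
    fp16 = estimate_vram_gb(params, 2.0)  # 16-bit weights
    q4 = estimate_vram_gb(params, 0.5)    # 4-bit quantized weights
    print(f"{name}: ~{fp16:.0f} GB at fp16, ~{q4:.0f} GB at 4-bit")
```

By this estimate a 70B model at fp16 needs on the order of 170 GB, which no consumer GPU offers, but a Mac mini configured with enough unified memory can hold a quantized version entirely in GPU-accessible memory.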
Thanks, this is the answer I was looking for.