I stole this fair and square. Hope this hasn’t been posted yet.

  • WolfLink@sh.itjust.works · 10 days ago

    Running an AI model on a GPU requires enough VRAM to hold the whole model; otherwise it falls back to the CPU, which is much slower. Mac minis share RAM between the CPU and GPU (unified memory), and you can get a Mac mini with a lot of shared RAM for far less than a GPU with the same amount of VRAM.
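
    To put rough numbers on that, here is a back-of-the-envelope sketch of how much memory a model's weights need. The 1.2× overhead factor (for KV cache and runtime buffers) and the example model sizes are my own assumptions, not from the comment above:

    ```python
    def model_memory_gb(params_billions, bits_per_weight, overhead=1.2):
        """Rough estimate of RAM/VRAM needed to hold a model's weights.

        params_billions: parameter count in billions (e.g. 70 for a 70B model)
        bits_per_weight: precision (16 for fp16, 4 for 4-bit quantization)
        overhead: assumed multiplier for KV cache and runtime buffers
        """
        weight_bytes = params_billions * 1e9 * bits_per_weight / 8
        return weight_bytes * overhead / 1e9  # decimal gigabytes

    print(model_memory_gb(70, 16))  # ~168 GB: far beyond any consumer GPU's VRAM
    print(model_memory_gb(70, 4))   # ~42 GB: fits in a Mac mini's 64 GB shared RAM
    ```

    By this estimate, even a 4-bit-quantized 70B model overflows a 24 GB consumer GPU and spills onto the CPU, while a high-RAM Mac mini keeps it entirely in GPU-accessible memory.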