Something to handle code, text and math.

  • monovergent@lemmy.ml · edited 3 hours ago

    16 GB VRAM GPU, models stored on SSD; the rest of the computer doesn’t have to be crazy. Intel Arc is the best bang for the buck at the moment. You can get LLMs running on 8 GB cards or even on the CPU, but IMO such small models are more novelties than workhorses. I personally use Debian, but you’ll be fine as long as your distro’s repo has drivers recent enough for your GPU.
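    As a rough back-of-the-envelope rule (my own rule of thumb, not any vendor's spec), a model's weights alone take about params × bits ÷ 8 bytes of VRAM, plus headroom for the KV cache and context. A quick sketch of why 16 GB is the sweet spot:

```python
def weight_vram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate VRAM (GB) needed just to hold model weights.

    params_billions: parameter count in billions (e.g. 13 for a 13B model)
    bits_per_weight: quantization level (16 = fp16, 4 = a typical Q4 quant)
    """
    # bits / 8 gives bytes per parameter; billions of params map to GB directly
    return params_billions * bits_per_weight / 8

# A 13B model at 4-bit quantization fits in 16 GB with room for context,
# while the same model at fp16 would not:
print(weight_vram_gb(13, 4))   # 6.5 GB
print(weight_vram_gb(13, 16))  # 26.0 GB
```

    This ignores runtime overhead, so treat it as a lower bound when deciding what fits on a card.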

    For perspective, I’m using such a build to help with boilerplate code, single-use scripts that I don’t have the patience to trial-and-error (like ones that have to deal with directory structures and special characters), getting an idea of what’s what when decompiling and reverse engineering, brainstorming tip-of-the-tongue ideas, and upscaling images.