



I successfully ran local Llama with llama.cpp and an old AMD GPU. I’m not sure why you think there’s no other option.


Here’s a task for you: how do you convert a folder of 5000 images from PNG to JPG, while ensuring they are scaled to at most 1024x768 and carry a semi-transparent watermark? I know how to do it quickly on the command line (see the sketch below), but I have no idea how to do it with a GUI.
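The commenter doesn’t show the actual command (ImageMagick or similar would be the usual tool), so here is a minimal sketch of the same batch job in Python with Pillow. The folder names `input/` and `output/` and the `watermark.png` file are assumptions for illustration, not anything from the original comment:

```
from pathlib import Path
from PIL import Image

SRC = Path("input")                # hypothetical folder holding the 5000 PNGs
DST = Path("output")               # hypothetical destination folder for the JPGs
WATERMARK = Path("watermark.png")  # hypothetical RGBA watermark image

DST.mkdir(exist_ok=True)

# Load the watermark once and halve its alpha channel so it reads as semi-transparent.
mark = Image.open(WATERMARK).convert("RGBA")
alpha = mark.getchannel("A").point(lambda a: a // 2)
mark.putalpha(alpha)

for src in SRC.glob("*.png"):
    img = Image.open(src).convert("RGBA")
    # thumbnail() only ever shrinks and keeps the aspect ratio,
    # which matches the "at most 1024x768" requirement.
    img.thumbnail((1024, 768))
    # Paste the watermark in the bottom-right corner, using its alpha as the mask.
    x = img.width - mark.width
    y = img.height - mark.height
    img.paste(mark, (x, y), mark)
    # JPEG has no alpha channel, so flatten to RGB before saving.
    img.convert("RGB").save(DST / (src.stem + ".jpg"), quality=90)
```

This is one possible scripted equivalent of the kind of one-liner the commenter presumably has in mind, not their actual workflow.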
Llama.cpp now supports Vulkan, so it doesn’t matter what card you’re using.