

Wait hasn’t DDG been the default in Safari for a few years now?
You can run your own LLM chatbot with https://ollama.com/
They have some really small ones that only require like 1GB of VRAM, but you’ll generally get better results if you pick the biggest model that fits on your GPU.
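For anyone who wants to try it: here's a rough sketch of talking to a locally running Ollama server from Python over its HTTP API. It assumes the server is up on its default port 11434 and that you've already pulled a model (e.g. with ollama pull); the model tag below is just an example of a small one, swap in whatever fits your GPU.

    # Minimal sketch: one-shot generation against a local Ollama server.
    import json
    import urllib.request

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({
            "model": "llama3.2:1b",   # example small model tag; use whatever you pulled
            "prompt": "Why is the sky blue?",
            "stream": False,          # return a single JSON object instead of a stream
        }).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])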
Can you not just brew install sshfs on a Mac? (Assuming you’ve already installed Homebrew.)
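(If it helps, once sshfs is installed the basic flow is just mount, use the files, unmount. Here's a rough Python-wrapped sketch; user@host:/remote/path and the mount point are placeholders for your own setup.)

    # Sketch: mount a remote directory over SSH, then unmount it.
    import pathlib
    import subprocess

    mount_point = pathlib.Path.home() / "mnt" / "remote"
    mount_point.mkdir(parents=True, exist_ok=True)

    # Everything under user@host:/remote/path appears at ~/mnt/remote
    subprocess.run(["sshfs", "user@host:/remote/path", str(mount_point)], check=True)

    # ... read/write the remote files as if they were local ...

    # Unmount when done (macOS; on Linux use fusermount -u instead)
    subprocess.run(["umount", str(mount_point)], check=True)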
I’ve worked on a library that’s written in Python because its users are used to Python.
The original version of the project made heavy use of numpy, so the actual performance-sensitive code was effectively C and Fortran, which is what numpy is under the hood.
We eventually replaced the performance-sensitive part of the code with Rust (and still some Fortran, because BLAS), which ended up being about 10x faster.
The outermost layer of code is still Python though.
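To make the numpy point concrete, here's a small self-contained sketch (not code from the project, just an illustration) of why the hot path is effectively compiled code: the numpy inner product is a single call into BLAS, while the pure-Python version pays interpreter overhead on every multiply and add.

    import time
    import numpy as np

    n = 1_000_000
    a = np.random.rand(n)
    b = np.random.rand(n)

    # Pure-Python inner product: every multiply/add goes through the interpreter.
    start = time.perf_counter()
    total = 0.0
    for x, y in zip(a.tolist(), b.tolist()):
        total += x * y
    py_time = time.perf_counter() - start

    # numpy inner product: one call that dispatches to a compiled BLAS routine.
    start = time.perf_counter()
    total_np = a @ b
    np_time = time.perf_counter() - start

    print(f"pure Python: {py_time:.3f}s  numpy/BLAS: {np_time:.4f}s")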