shared from: https://sopuli.xyz/post/26841469
Meme transcription:
An obviously exhausted Spongebob is raising his arms in a rejoicing gesture. His face shows great tiredness, but also happiness.
Title: Finally finding your stupidity after hours of debugging.
On Friday I spent over an hour trying to fix my Firefox tabs - I could no longer drag them to reorder or move them to a new window, and Ctrl+Shift+T didn’t restore tabs, though I could still do it via the menus. I thought it might be something to do with the new tab-island stuff and tried FF safe mode, restarting the computer, confirming about:config options, etc.
Turns out my headphones were resting on my Esc key.
Light debugging I actually use an LLM for. Yes, I know, I know. But when you know it’s a syntax issue or something simple, and a quick skim through produces no results, AI be like, “You used a single quote instead of a double quote on line 154, so it’s passing a literal string instead of the variable’s value. Also, there’s a typo in the source name on line 93, because you spelled it like this everywhere else.”
By design, LLMs do be good at syntax, whether it’s a natural language or a programming one.
Nothing worse than going through line by line, only to catch the obvious mistake on the third “Am I losing my sanity?!” run-through.
If you’re stupid, look into static analysis for additional warnings that let you know when you’re fucking up as you code, instead of at runtime.
Edit: It’s also good if you’re not stupid, but you know, you’re probably already using it.
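A minimal sketch of the kind of check a static analyzer does, using only Python’s standard `ast` module (the buggy snippet and the variable names are hypothetical): it walks the parse tree and flags names that are read but never assigned, catching exactly the “typo in the name” class of bug before anything runs.

```python
import ast
import builtins

# Hypothetical buggy snippet: 'colour' is assigned, but 'color' is used.
source = """
colour = "red"
print(color)
"""

tree = ast.parse(source)

# Names that are assigned somewhere in the snippet.
assigned = {
    node.id
    for node in ast.walk(tree)
    if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store)
}

# Flag names that are read but never assigned (builtins like print are fine).
for node in ast.walk(tree):
    if (
        isinstance(node, ast.Name)
        and isinstance(node.ctx, ast.Load)
        and node.id not in assigned
        and not hasattr(builtins, node.id)
    ):
        print(f"line {node.lineno}: undefined name '{node.id}'")
```

Real linters (pyflakes, ruff, etc.) do a much more careful version of this, scope-aware and all, which is why they catch these typos instantly.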
I was trying to get dual 3090 passthrough working in a VM on my Proxmox server for about a week. The VM would not detect both GPUs, just the first, and everything the forums and Reddit threads on this issue said to check checked out. As far as I could tell, nothing was amiss. And yet the VM would only ever detect one GPU and one GPU audio controller.
I spent a week on it before realizing I had put a .1 in the PCIe ID field in the VM hardware settings instead of a .0. The .1 function is the audio controller of the second GPU, not the GPU itself lolol
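For anyone hitting the same wall: in PCI addressing (bus:device.function), function .0 is usually the GPU itself and .1 its HDMI audio controller. A sketch of what the Proxmox `hostpci` lines should look like, with hypothetical PCI IDs (check your own with `lspci`):

```
# /etc/pve/qemu-server/<vmid>.conf  (IDs are examples only)
hostpci0: 0000:01:00.0,pcie=1   # first GPU
hostpci1: 0000:02:00.0,pcie=1   # second GPU (.0 = the GPU function)
# hostpci1: 0000:02:00.1        # the bug: .1 is only the audio controller
```

Passing the whole device (e.g. `0000:02:00` with no function suffix) hands over all its functions at once, which sidesteps this mistake entirely.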