Ryan
deab79dafa1c2be4b4a6d3aca1357b6caa0b744bf46ad529a5ae464288579e68
Play Flappy Nostrich @ flappy-nostrich.vercel.app/ Est. 776032 💜🫂🤙
Replying to node

#⬛⬛⬛str

For a local LLM, ollama is a good choice; it runs from the terminal.

https://ollama.com/

If you want a GUI, try LM Studio.

https://lmstudio.ai/

I've used both on my Fedora machine with good results 👍
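
Once ollama is running locally, you can also hit it from a script instead of the terminal. A minimal sketch, assuming the ollama server is up on its default port (11434) and that you've already pulled a model — the model name here is just a placeholder for whatever you actually have:

```python
# Minimal sketch: query a local ollama server over its HTTP API.
# Assumes `ollama serve` is running and a model has been pulled.
import json
import urllib.request

payload = {
    "model": "llama3",  # assumption: swap in whichever model you've pulled
    "prompt": "Explain Nostr in one sentence.",
    "stream": False,    # request a single JSON response instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # ollama's default local endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])
```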

I've been doing Windows and Linux support & sysadmin work for over 20 years, and I also use both. I disagree with your view: for lots of people, Linux gaming works great these days and keeps improving. I guess we'd better tell all those Steam Deck owners that gaming on Linux doesn't work 😂

Too many short-term thinkers 💯

You seem pretty triggered by people choosing to use an OS different from your own preference, judging by your responses in this thread 🤭

I didn't realize it was a static page. So you could just run it locally using 'python -m http.server'?
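
A quick sketch of the same thing done from a script, assuming the game's HTML/JS/assets all sit in one folder and you run this from inside it — this is just the standard-library equivalent of 'python -m http.server':

```python
# Minimal sketch: serve the static game files from the current directory.
# Run from the folder containing index.html, then open http://localhost:8000
import http.server
import socketserver

PORT = 8000  # assumption: any free port works

with socketserver.TCPServer(("", PORT), http.server.SimpleHTTPRequestHandler) as httpd:
    print(f"Serving on http://localhost:{PORT}")
    httpd.serve_forever()
```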

TFW you thought you'd be decorating your Citadel by now.

"I like the ads"