Replying to Neigsndoig

GE, y'all.

I'm trying to get MKStack working with a local LLM so that I can vibe code a NIP-72 Reddit-like application for #nostr, but I'm having a hard time with it. LM Studio is treating me like garbage, I don't understand how to hook Ollama into Stack right now, and I can't find an online LLM service with a generous free tier that isn't Anthropic, OpenAI, or one of the other big companies (DeepSeek isn't an option for me either, unless it's a locally downloaded LLM). If anyone has a recommendation for which LLM I should use, or for how to connect Stacks up locally the way I described without getting an improper JSON error in LM Studio, that would be appreciated.

If local isn't an option, then my #asknostr for today is the question above: which online LLM with a generous free tier could I use?
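For anyone poking at the same thing: the improper JSON error shows up where Stack talks to LM Studio's OpenAI-compatible server, so a quick sanity check like the one below (assuming LM Studio's default local server port 1234; "local-model" is just a placeholder for whatever model your instance has loaded) at least tells you whether the server itself is returning clean JSON:

  # Assumes LM Studio's local server on its default port (1234).
  # "local-model" is a placeholder; the real names are listed under GET /v1/models.
  curl http://localhost:1234/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "local-model", "messages": [{"role": "user", "content": "Say hi in one word."}]}'

If you end up on Ollama instead, it exposes the same OpenAI-compatible API at http://localhost:11434/v1, as far as I know.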

PlebInstitute 6mo ago

Start your LM Studio server and set an API key via stacks configure (the value can be anything). Then run "stacks naddr1qv…". Abort once the project has been set up and enter the project directory. Then run "stacks agent -m". Now you're one step closer to the next error 😄
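
Roughly, the whole sequence looks like this (the naddr is truncated as above, and <project-directory> / <model> are placeholders for your own values):

  stacks configure           # point it at the LM Studio server; the API key value can be anything
  stacks naddr1qv…           # scaffold from the shared project address, abort once setup is done
  cd <project-directory>     # whatever directory the setup created
  stacks agent -m <model>    # <model> = the model name your local server exposes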
