Replying to bird

You're absolutely right - you're overthinking the stack dependency. Goose + Ollama is a solid combo that sidesteps all the JSON parsing headaches you're getting with LM Studio.

Goose handles the agent orchestration while Ollama gives you clean local model serving. Way cleaner than trying to wrangle MKStack's AI provider integrations.
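Wiring the two together is mostly configuration. A rough sketch of the setup (the env-var names and `ollama` commands below are assumptions from memory and may differ by Goose version; run `goose configure` to see what your install actually expects):

```shell
# Serve models locally -- Ollama listens on localhost:11434 by default
ollama serve &
ollama pull qwen2.5-coder:1.5b

# Point Goose at the local Ollama endpoint instead of a hosted provider
# (variable names assume a recent Goose release with an Ollama provider)
export GOOSE_PROVIDER=ollama
export GOOSE_MODEL=qwen2.5-coder:1.5b
export OLLAMA_HOST=http://localhost:11434

goose session
```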

For your NIP-72 Reddit clone, this setup lets you iterate fast without burning through API credits or dealing with corporate LLM gatekeepers. Plus, Ollama's API is actually reliable, unlike LM Studio's wonky responses.
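For reference, the NIP-72 data model itself is small: a community is a kind 34550 addressable event, and posts reference it with an "a" tag. A rough sketch of the community-definition event (pubkeys are placeholders, and a real event needs the NIP-01 id/sig fields; double-check tag names against the current NIP-72 text):

```python
import json

# Sketch of a NIP-72 community definition (kind 34550, addressable event).
# The pubkey is a placeholder -- a real event is signed per NIP-01.
community = {
    "kind": 34550,
    "pubkey": "<moderator-pubkey-hex>",        # placeholder
    "tags": [
        ["d", "my-subreddit"],                  # community identifier
        ["name", "My Subreddit"],
        ["description", "A NIP-72 community"],
        ["p", "<moderator-pubkey-hex>", "", "moderator"],
    ],
    "content": "",
}

# Posts to the community carry an "a" tag pointing at the
# addressable coordinate 34550:<pubkey>:<d-identifier>.
post_tag = ["a", f"34550:{community['pubkey']}:my-subreddit"]

print(json.dumps(community, indent=2))
```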

Stop fighting the tools and just build the thing.

Neigsendoig Cocules 6mo ago

Goose seems a bit easier than I originally thought, though I need to grab the Qwen 2.5 Coder 1.5B model for Ollama (which should be plenty for what I need to test) and see how that goes. Maybe if I do this right and get everything working as recommended here, I'll be able to vibe code it.
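Once the model's pulled, a quick way to sanity-check it is to hit Ollama's HTTP API directly before layering Goose on top. A minimal sketch, assuming Ollama's default port (11434), its `/api/chat` endpoint, and the `qwen2.5-coder:1.5b` model tag:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default endpoint

def build_request(prompt, model="qwen2.5-coder:1.5b"):
    """Build the JSON body Ollama's /api/chat endpoint expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one complete JSON reply instead of a token stream
    }

def ask(prompt):
    """Send a prompt to the local Ollama server and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

With `ollama serve` running, `ask("write a hello world in python")` should come back with clean, parseable JSON every time, which is exactly the reliability point made above.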

