Sitting in a long taxi ride, running DeepSeek on my laptop and cranking through complex Rust database abstractions.

The future is here. 😎

Discussion

And roasting your chestnuts? 😅

Definitely getting a bit toasty 😂

Self-hosted via Ollama, or using the hosted version of DeepSeek?

Self-hosted. I've been tinkering with LM Studio for hosting and managing models locally.
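For anyone curious how "talking to a local model" looks in practice: by default LM Studio's local server exposes an OpenAI-compatible API at `http://localhost:1234/v1`. A minimal sketch, assuming that default port and a hypothetical model name (use whatever model you actually have loaded):

```python
# Query a local LM Studio server via its OpenAI-compatible endpoint.
# Assumptions: LM Studio's default base URL http://localhost:1234/v1,
# and a placeholder model name -- swap in the model you have loaded.
import json
import urllib.request


def build_chat_request(prompt: str, model: str = "deepseek-model") -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }


def ask_local_model(prompt: str, base_url: str = "http://localhost:1234/v1") -> str:
    """POST the payload to the local server and return the reply text."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Since the server speaks the OpenAI wire format, the same payload works offline in a taxi or against a hosted provider by changing only `base_url`.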

Did you try Msty.app? I found it really cool and useful. You can switch between models really quickly and split the conversation. Worth trying out.

I haven’t. Thanks for the tip.

What's your laptop config?

Left curve. Using LM Studio and downloading models. It’s a very recent MacBook Pro, and it seems to handle it pretty damn well.

I’ve also got Goose set up, but that talks to OpenRouter, so it doesn’t work when I don’t have internet.

You're moonlighting as a cab driver now?