
I mean, you can always go the extra mile if you want, but it doesn't matter as much because they're offline by design

LM Studio was proposed in another comment; that's why I mentioned it

What I don't like about Ollama is that you can't choose your own quantization level/method (or I just haven't figured out how). Nevertheless, it's a great server
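For what it's worth, the Ollama library does publish many models under multiple quantization tags, so you may be able to pick one explicitly instead of taking the default. A sketch, not a verified recipe; the exact tag below is an assumption, check the library page for what actually exists:

```
# Hypothetical quantization tag -- verify against the Ollama library listing
TAG="llama3.1:8b-instruct-q4_K_M"
echo "pulling $TAG"
# only attempt the pull if ollama is actually installed
command -v ollama >/dev/null 2>&1 && ollama pull "$TAG" || true
```

If the model page lists tags like `q4_K_M`, `q5_K_M`, or `q8_0`, pulling by tag gets you that specific quant.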

hello, everynyan! how are you? fine, thank you. I wish I were a bird.

okok, you caught me, here you go

```
user@localhost:~$ rm -rf ./~
user@localhost:~$
```

sure

```
./~: No such file or directory
```

this is most likely overkill, I haven't seen an LLM server/client that collects telemetry yet

I'd rather not use proprietary (closed-source) apps like LM Studio

it kinda depends on what you're looking for

you can try abliterated models, I think they're the ones worth looking into

you can also take a look at Big-Tiger-Gemma2

the only truly private way is to self-host it. Censorship is usually applied at the model level tho, so you'll have to find an uncensored model
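Once it's self-hosted, every request stays on your machine. A minimal sketch against Ollama's default local API (port 11434 is the Ollama default; the model name is an assumption, use whatever you pulled):

```
# request body for Ollama's /api/generate endpoint; "stream":false returns one JSON reply
REQ='{"model":"llama3.1","prompt":"Why is the sky blue?","stream":false}'
echo "$REQ"
# the request only ever goes to localhost -- nothing leaves your machine
command -v curl >/dev/null 2>&1 && curl -s http://localhost:11434/api/generate -d "$REQ" || true
```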