I slept with a feeling of unease after downloading a DeepSeek-R1 model and installing it on my computer…

I had never used ChatGPT before, but now that I had found a functional, open-source model that works fine both online and offline, without having to give up a single piece of personal data to use it, I decided to start experimenting with DeepSeek-R1.

I found it hard to believe what I was seeing and doing in DeepSeek-R1. During the night my thoughts started to wander: “What if DeepSeek-R1 starts performing unusual activities because I downloaded it, like sharing private data or doing something else the user doesn't want?” So this morning, while I was offline, I asked DeepSeek-R1: “What is your purpose here, and how do you process my computer data?” It responded as follows:

“I'm an AI assistant created by DeepSeek, designed to provide information, answer questions, and assist with tasks. My main purpose is to help users like you by offering accurate and detailed information across various topics.”

“As an AI assistant, I process data on servers located in multiple geographic locations. To ensure security and compliance with regulations like GDPR (General Data Protection Regulation), the terms of service, and other applicable laws, all data is encrypted both during transit and when stored. This ensures that your information remains secure throughout its lifecycle.

I cannot access or store personal data without explicit consent, and I am designed to respect user privacy at all times. If you have concerns about how my services are used or if you would like more details about the technologies involved, feel free to ask!”

Share with me your recent experiences using DeepSeek-R1 for the first time.

#deepseek #opensource #noprivatedatarequired


Discussion

What hardware is required to run it locally? Do you need a GPU?

This person explains it very well… https://youtu.be/Fuzzpcog7tQ?si=586d5sEWYaOCXzg4

I don't speak Spanish.

Then I'm sorry. You can download it on any computer. You just need to consider your device's storage capacity: more capacity means you can pull one of the "more complete" (larger) models. You should also consider the RAM; the ~4.7 GB model can run with a minimum of 8 GB of RAM 😉
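For example, if you run it through Ollama (just one common runtime, and only my assumption here; LM Studio and similar tools work too), a minimal Python sketch for querying a local DeepSeek-R1 model could look like this:

# Minimal sketch. Assumptions: Ollama is installed and running locally, the
# "ollama" Python package is installed, and a DeepSeek-R1 model has already
# been pulled (the "deepseek-r1:8b" tag below is just an example).
import ollama

response = ollama.chat(
    model="deepseek-r1:8b",  # pick the size your RAM allows
    messages=[{"role": "user", "content": "How do you process my computer data?"}],
)
print(response["message"]["content"])  # everything runs on your own machine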

An Apple silicon device is best for this because it shares RAM with the GPU (your RAM is your VRAM at the same time).

Apple products are gay, sorry

What a statement.

I don't like Apple and most Apple products, but what I just wrote is true. Sorry.

In fact, that's my only Apple device, and exactly for that reason.

You can also buy GPUs, which will be much more expensive.

Or buy a lot of RAM with a Threadripper and run it on CPU.

I run the 32B model on an Apple MacBook M1 Pro with 32 GB of RAM.
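As a rough check on why that fits (my own back-of-the-envelope assumption: a ~4-bit quantized build at roughly 4.5 bits per weight, plus some overhead for the context cache):

# Rough sketch for estimating the RAM a quantized local model needs.
# The bits-per-weight and overhead figures are assumptions, not exact values.
def approx_ram_gb(params_billions: float, bits_per_weight: float = 4.5,
                  overhead_gb: float = 2.0) -> float:
    weights_gb = params_billions * bits_per_weight / 8  # billions of params -> GB
    return weights_gb + overhead_gb

print(approx_ram_gb(32))  # ~20 GB, which fits in 32 GB of unified memory
print(approx_ram_gb(8))   # ~6.5 GB, in line with the ~4.7 GB download plus overhead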

If people knew how simple and even cheap it is to run those open source AI models, ... the chart 📉 would probably send stronger bearish signals for bitcoin against the €uro/dollar during the next few weeks 🤣… even though we have been observing an interesting effect in the opposite direction since the news about DeepSeek-R1 emerged 😅

If you speak German, you can learn more about it here:

https://tube4.apolut.net/w/cGutQuKTQi2mgGwJ2LH9wp

I have been using local LLMs for quite a while now. They do not "call home".

I love the idea of having a local LLM/AI at home.

That's why I gave such a local AI access to my whole SmartHome (through #HomeAssistant).

I can now talk in real time with my local AI, and it knows the state of all my stuff at home. It even knows Bitcoin stuff about my Bitcoin node and miners.

And the best part: It can even control all my devices! 100% offline. No cloud used for this.

And I fucking love it!
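If anyone wants to try something similar, here is a minimal sketch of one way to wire a local model to Home Assistant's REST API. This is only an illustration under my own assumptions (Home Assistant's built-in Ollama/Assist integration is the cleaner route); the host, token, model tag and entity_id are placeholders, not the real setup described above.

# Minimal sketch: ask a local model for a decision, then call Home Assistant's
# REST API to act on it. Everything stays on the local network.
# Assumptions: Ollama serving a local model, Home Assistant reachable on the
# LAN, and a long-lived access token created in its UI. All names are placeholders.
import ollama
import requests

HA_URL = "http://homeassistant.local:8123"   # placeholder
HA_TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"    # placeholder

reply = ollama.chat(
    model="deepseek-r1:8b",  # placeholder model tag
    messages=[{"role": "user",
               "content": "Answer only 'on' or 'off': should the desk lamp be on after sunset?"}],
)
answer = reply["message"]["content"].strip().lower()
action = "turn_on" if answer.endswith("on") else "turn_off"  # naive parsing, fine for a sketch

requests.post(
    f"{HA_URL}/api/services/light/{action}",
    headers={"Authorization": f"Bearer {HA_TOKEN}"},
    json={"entity_id": "light.desk_lamp"},   # placeholder entity
    timeout=10,
)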