StellarStoic
000000005e9dda01479c76c5f4fccbaebe4e7856e02f8e85adba05ad62ad6927
Interested in everything interesting. I also fly occasionally 🪂 If you feel lucky, try the Bitcoin Lightning #lottery www.satoshi.si Signal: nonce.01 XMR: 86JLdrP1GECSSWpJr18RaFdGnSXh2ubaP7oue8f2NDEj8YQytJLXd6thv7NQdz3yv1gAeMfQaM2kujUBPQtGyHE3AyokRcC

It's perfect as it is 👌

Replying to StellarStoic

Oh, I thought you posted about industrial pollution. The GIF is irrelevant if it's because of the fires.

I wish everyone affected lots of rain soon.

I did some quick research. ChatGPT still has way, way more parameters.

- ChatGPT-3.5 and GPT-4 are Large Language Models (LLMs). ChatGPT has 175 billion parameters.

> Parameters include weights and biases.

> The number of parameters in a neural network is directly related to the number of neurons and the number of connections between them (a quick counting sketch follows after this list).

> Needs supercomputers for training and inference

- LLaMA (by Facebook): Open and Efficient Foundation Language Models

> [llama.cpp](https://github.com/ggerganov/llama.cpp)

> a collection of foundation language models ranging from 7B to 65B parameters

> Facebook says LLaMA-13B outperforms GPT-3 while being more than 10x smaller

> If it is 10x smaller, then we don't need a supercomputer

- alpaca.cpp (Stanford Alpaca is an instruction-following LLaMA model)

> It is a fine-tuned version of Facebook's LLaMA 7B model

> It is trained on 52K instruction-following demonstrations.

> uses around 4GB of RAM.
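To make the weights-and-biases point above concrete, here is a minimal Python sketch of how a parameter count gets tallied. The layer sizes are made up for illustration; real models like GPT-3 or LLaMA are transformers, but the counting rule is the same idea.

```
# Minimal sketch: counting the parameters of a small fully connected network.
# The layer sizes are hypothetical; the rule (one weight per connection plus
# one bias per neuron) is the same idea behind the 175B / 7B / 65B figures.

layer_sizes = [512, 2048, 2048, 512]  # hypothetical neurons per layer

total_params = 0
for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
    weights = n_in * n_out  # one weight per connection between adjacent layers
    biases = n_out          # one bias per neuron in the receiving layer
    total_params += weights + biases

print(f"total parameters: {total_params:,}")  # 6,296,064 for these sizes
```

Scale that same counting up to the far wider and deeper transformer stacks of GPT-3 or LLaMA and you reach the billions of parameters quoted above, which is why the full-size models need datacenter hardware while a 7B model run through llama.cpp fits in ordinary RAM.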

Google Lens is giving me suggestions for wall tiles 😄

Running [Onyx](https://github.com/TonyGiorgio/onyx): an Amethyst fork without censorship

```
=IFERROR(TEXT(TIMEVALUE(IFERROR(MID(E2, FIND(" ", E2)+1, 8),E2)),"HH:MM:SS"),"")
```
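In case it's useful, my reading of that formula: it takes the text in E2, grabs the 8 characters after the first space (or the whole cell if there is no space), parses that as a time with TIMEVALUE, formats it as HH:MM:SS, and returns an empty string if anything fails. Handy for pulling the time part out of a "date time" cell.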

So relaxing 🚶

I really want to see a car company startup with only the basic features you just mentioned. I want a car that works. No fancy electric stuff; I'd happily open the windows manually. No sensors, no glamorous lamps that cost a fortune if they need to be repaired.

I want a 25-year-old car, but new!

When Saifedean's book is translated into my language, I'm buying 🤙

#grownostr #reading #book #FiatStandard

But I updated it 2 days ago 😂 I can't keep up. You are shipping new releases too fast 😎

A possible bug in Amethyst 0.54.2-play.

nostr:npub1gcxzte5zlkncx26j68ez60fzkvtkm9e0vrwdcvsjakxf9mu9qewqlfnj5z

While scrolling the home (following) feed, opening a random post and then going back always puts you at the beginning of the feed and forces you to scroll all the way back down to continue checking older posts.

While in the global feed, going back to the main global feed lets you continue scrolling from the post you've checked.