How much storage do these models take? Considering this from an information/knowledge compression perspective, how small can we make "the internet's knowledge"?


Discussion

It’s a good question. I don’t know. But people are walking around with multiple terabytes of storage in their pockets these days, which I think is more than enough to make these models very useful.

From ggerganov's repo:

**Memory/Disk Requirements**

As the models are currently fully loaded into memory, you will need adequate disk space to save them and sufficient RAM to load them. At the moment, memory and disk requirements are the same.

| model | original size | quantized size (4-bit) |
|-------|---------------|------------------------|
| 7B    | 13 GB         | 3.9 GB                 |
| 13B   | 24 GB         | 7.8 GB                 |
| 30B   | 60 GB         | 19.5 GB                |
| 65B   | 120 GB        | 38.5 GB                |

The "B" in the model names stands for "billion": a 7B model has roughly 7 billion parameters.
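The sizes in the table follow directly from parameter count times bits per weight. A rough sketch of that arithmetic (the real files come out slightly larger than the bare math because of quantization scale factors, vocabulary, and metadata, and some listed figures may be GiB rather than GB):

```python
def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Back-of-the-envelope on-disk size: parameters x bits, in decimal GB."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

# 7B stored at fp16 (16 bits/weight) -> ~14 GB, close to the 13 GB listed.
print(model_size_gb(7, 16))

# 7B quantized to 4 bits/weight -> ~3.5 GB; the table's 3.9 GB includes
# per-block quantization scales and other overhead on top of this.
print(model_size_gb(7, 4))
```

So 4-bit quantization cuts storage to roughly a quarter of the fp16 footprint, which is why a 65B model fits in under 40 GB.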