I’m going to try to get LLaMA (the leaked version) running locally on my new computer, so I’m not bound by constraints on what it’s trained on or limits on how much info I can feed it.
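For anyone curious, here’s roughly what I expect the local setup to look like. This is just a sketch, assuming the weights have already been converted to the Hugging Face format (transformers ships a `convert_llama_weights_to_hf.py` script for that), and the model path below is a placeholder:

```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

model_dir = "/path/to/llama-7b-hf"  # placeholder path to converted weights

tokenizer = LlamaTokenizer.from_pretrained(model_dir)
model = LlamaForCausalLM.from_pretrained(
    model_dir,
    torch_dtype=torch.float16,  # half precision to fit on a consumer GPU
    device_map="auto",          # requires the `accelerate` package
)

prompt = "Explain why running a language model locally matters:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```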
Very curious how these perform as local versions. I think there’s a race going on right now between open-source AI and closed-source, centralized platforms, one that matters a lot for freedom, and it’s critical that we win it, imo.