Llama 3.3 70B is impressive
https://simonwillison.net/2024/Dec/9/llama-33-70b/
This is pretty cool. Running an LLM to filter events for your local relay or client is getting far more practical.
nostr:nevent1qqsdzmw643h8cwle9f7namfsynhkjrjn5zht0kpgg6gx85dd4m2s8qcprpmhxue69uhhyetvv9ujucnjv9ukgmmw9e3k7mf0qgs9pk20ctv9srrg9vr354p03v0rrgsqkpggh2u45va77zz4mu5p6ccrqsqqqqqpqgz59k
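
A minimal sketch of what that filtering could look like, assuming a local Llama 3.3 instance served by Ollama at its default endpoint (http://localhost:11434); the model name, prompt, and spam heuristic here are illustrative assumptions, not something from the linked post:

```python
# Hedged sketch: ask a locally hosted Llama 3.3 (via Ollama's /api/generate
# endpoint) whether a Nostr event should be kept or dropped by a relay/client.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint
MODEL = "llama3.3"  # assumes the model has already been pulled locally

def keep_event(event: dict) -> bool:
    """Return True to keep the event, False if the model flags it as spam."""
    prompt = (
        "You are a relay moderation filter. Answer with exactly KEEP or DROP.\n"
        f"Event content:\n{event.get('content', '')}\n"
        "Answer:"
    )
    body = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        answer = json.loads(resp.read())["response"].strip().upper()
    return answer.startswith("KEEP")

# Example: screen a batch of kind-1 text notes before showing them in a client
events = [
    {"kind": 1, "content": "gm nostr, shipped a new relay patch today"},
    {"kind": 1, "content": "FREE BITCOIN!!! click this link now!!!"},
]
kept = [e for e in events if keep_event(e)]
print(f"kept {len(kept)} of {len(events)} events")
```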