I’m on a paid API account but goose keeps getting rate limited. GPT-4 and Claude both.
How can I fix this?
nostr:npub16l0ck0s5zened29dsaqtqm6z0t4fmk2mwtszw64fz7fppcnls8mss3yj9s #asknostr
Spend $20-30k on your own GPU server with 192GB+ of VRAM.
That’s surprisingly affordable.
Yeah, prices have come down.
RTX Pro 6000: MSRP $8.5k, 96GB of VRAM.
RTX 5090: MSRP $2k, 32GB of VRAM.
You will need some connections and patience to get MSRP, but you shouldn't pay more than 20% over MSRP.
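A quick sanity check on the math (a minimal Python sketch; the MSRP, VRAM, and 20%-markup figures come from the posts above, and it counts only the cards, not the rest of the server):

# Rough cost of reaching 192GB of VRAM with the cards listed above.
# Prices are the MSRPs quoted in this thread; CPU, board, PSU and
# chassis are not included.
TARGET_VRAM_GB = 192
MARKUP = 1.20  # "shouldn't pay more than 20% over MSRP"

cards = {
    "RTX Pro 6000": {"msrp": 8_500, "vram_gb": 96},
    "RTX 5090": {"msrp": 2_000, "vram_gb": 32},
}

for name, spec in cards.items():
    count = -(-TARGET_VRAM_GB // spec["vram_gb"])  # ceiling division
    at_msrp = count * spec["msrp"]
    at_markup = at_msrp * MARKUP
    print(f"{name}: {count} cards -> ${at_msrp:,} at MSRP, "
          f"${at_markup:,.0f} at 20% over")

Two RTX Pro 6000s come out to roughly $17k-20.4k for the cards alone, which lines up with the $20-30k estimate once the rest of the box is added.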