The DVM currently only does Speech to Text for Nostr events, but I can update it to work with URLs if the PDF is available online.

Full disclosure, though: the cost is $0.36 per 1,000 characters (not words), so for a full-length book it could be more than $100, depending on the length.
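
To put that rate in perspective, here is a quick back-of-the-envelope estimate in Python; the character counts are assumptions for illustration (a typical full-length book runs roughly 80,000-100,000 words, on the order of 500,000 characters including spaces).

    # Rough cost estimate at $0.36 per 1,000 characters.
    RATE_PER_1000_CHARS = 0.36

    def estimate_cost(num_characters: int) -> float:
        """Estimated service cost in dollars for a text of the given length."""
        return num_characters / 1000 * RATE_PER_1000_CHARS

    for chars in (300_000, 500_000, 700_000):
        print(f"{chars:>9,} characters -> ${estimate_cost(chars):,.2f}")
    # 300,000 characters -> $108.00
    # 500,000 characters -> $180.00
    # 700,000 characters -> $252.00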

Discussion

I see. Is it that expensive to do inference on the GPU?

It's an API wrapped as a DVM, and that's the service cost of the API.

I'm not running my own model or hardware.

I can make a much cheaper DVM, but the largest part of the expense is the voice-cloning feature of the service I'm using.
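
To make "an API wrapped as a DVM" concrete, here is a minimal Python sketch of that shape: the DVM performs no inference itself; it forwards each job's text to a paid third-party text-to-speech / voice-cloning service and passes that per-character cost straight through. The names call_tts_api and publish_job_result are hypothetical placeholders, not a real library, and the details of NIP-90 event handling are omitted.

    # Sketch of a DVM that only wraps a paid external API (no local model or hardware).
    RATE_PER_1000_CHARS = 0.36  # the upstream service's price, not a compute cost

    def call_tts_api(text: str) -> bytes:
        """Placeholder for the third-party text-to-speech / voice-cloning API call."""
        raise NotImplementedError("wire up the real provider here")

    def publish_job_result(request_id: str, audio: bytes) -> None:
        """Placeholder for publishing a NIP-90 job result back to Nostr relays."""
        raise NotImplementedError("wire up a Nostr client here")

    def handle_job_request(request_id: str, text: str) -> None:
        """Handle one incoming job: quote the pass-through cost, call the API, return the audio."""
        cost_usd = len(text) / 1000 * RATE_PER_1000_CHARS
        print(f"Job {request_id}: {len(text):,} characters, ~${cost_usd:.2f} in API fees")
        audio = call_tts_api(text)
        publish_job_result(request_id, audio)

A cheaper variant would replace call_tts_api with a less expensive provider or a self-hosted model, dropping the voice-cloning feature that drives most of the cost.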