Do you know what’s more expensive than data storage? Data out… This is mainly why I have to be so limiting with View All.
Bad news, bill is 5x expectations…
Good news, accounts and donations are doing great!
If any devs have tricks to limit data-out, LMK. I am already compressing images and using thumbnails to view.
Using ETags?
Never heard of them, looking now. Will it help?
I don't know what you are using for hosting or caching but yes, most browsers honor ETags for cache control to improve performance by not redownloading if they already have.
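A minimal sketch of the conditional-request logic the browser does for you, using stdlib only (the image bytes and function names here are illustrative, not anyone's real server code):

```python
import hashlib

def make_etag(body: bytes) -> str:
    # Strong ETag derived from a hash of the content
    return '"' + hashlib.md5(body).hexdigest() + '"'

def respond(body: bytes, if_none_match):
    """Return (status, payload) for a GET, honoring If-None-Match."""
    etag = make_etag(body)
    if if_none_match == etag:
        # Client's cached copy is current: send no body, save the egress
        return 304, b""
    return 200, body

img = b"fake image bytes"
status, payload = respond(img, None)               # first visit: full download
status2, payload2 = respond(img, make_etag(img))   # revisit: 304, zero bytes out
```

The 304 path is the whole point: repeat viewers re-validate with a tiny header exchange instead of re-downloading the file.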
Ok, so basically just some html code. Even if a few support it, will still help.
Was also thinking to compress more. I’m way too generous keeping max size at 1MB per jpg, and no limit for png or gif… Most people probably only need 200-500 KB for this free use case. Save uncompressed for a paid account..
I had a feeling this would be your biggest expense. I always make sure to compress my animated GIF images as much as possible. It’s a pretty inefficient format as it is. Are you also scaling down the originals?
Not touching the gifs.. I need to find a new compressor :)
https://www.cloudflare.com/en-gb/products/cloudflare-images/
No egress fees...
Maybe increase the cache TTL?
Probably wouldn’t change much tho. Best thing to do is just maximize compression
Oh thanks, didn’t know I could do that, let me look into it.
Is file expiration a future option? Could limit cost of data out? Would that provide more “privacy” by default?
It is not; ideally we keep track of every single file link for future reference.. you gave me an idea though, maybe I just compress the hell out of everything older than ‘xx’ months.. File and content is still there, just like 20k instead of a meg, I like the idea 👊
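That "recompress everything older than xx months" idea could be sketched like this with stdlib only — the record format and function name are made up for illustration; the actual re-encode step would run on whatever this returns:

```python
from datetime import datetime, timedelta

def due_for_recompression(uploads, months=6, now=None):
    """Return filenames uploaded more than `months` ago.

    `uploads` is a list of (filename, upload_datetime) tuples.
    Links stay valid because files are recompressed in place,
    never deleted.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=30 * months)
    return [name for name, ts in uploads if ts < cutoff]

now = datetime(2023, 6, 1)
uploads = [
    ("new.jpg", datetime(2023, 5, 20)),
    ("old.jpg", datetime(2022, 9, 1)),
]
stale = due_for_recompression(uploads, months=6, now=now)  # only old.jpg
```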
Assuming these images are stored on S3 and delivered via CloudFront, how about setting a policy on the S3 bucket serving these images to automatically move old images to S3 infrequent access or glacier tiers?
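Assuming S3, that tiering suggestion is just a lifecycle rule on the bucket — something like this (bucket prefix and day thresholds are placeholders; it would be applied via boto3's `put_bucket_lifecycle_configuration`):

```python
# Lifecycle rule moving objects to cheaper storage tiers as they age.
# Prefix and day thresholds are illustrative only.
lifecycle = {
    "Rules": [
        {
            "ID": "tier-down-old-images",
            "Filter": {"Prefix": "images/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 90, "StorageClass": "STANDARD_IA"},
                {"Days": 365, "StorageClass": "GLACIER"},
            ],
        }
    ]
}
# Applied with something like:
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-image-bucket", LifecycleConfiguration=lifecycle)
```

One caveat: these tiers cut storage cost, not data-out — infrequent-access and Glacier classes add per-GB retrieval fees, so they only make sense for images that really are rarely fetched.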
Just curious: how much?
This is a 2 month trend, you see where my data-out goes over at the end of the month. I need to up my plan and limit my data more..
Month 1 - $6
Month 2 - $45
Month 3 - $240

But at least your accounts and donations are keeping you afloat. Maybe it's time to upgrade your data storage AND your budget! 😂💸👀
Doing fine, just wasn’t expecting it. It motivates me to find solutions.
I’m glad to hear people are sending you sats.
Your hosting is super important and shitposting wouldn’t be the same without it.
Especially like the metadata stripping right out the gate.
Very generous Zappers 🙏⚡️💜
Support your buidlers. 🏗️🧱🚧
#[0]
🍀🤙🫂
I probably account for half your storage 😂 feel free to delete my 💩
Never, you have the best 💩
Maybe use badges and make it paid monthly to use?
I’m compressing images a bit more and actually doing pretty good so far with accts.. I want to keep the basic upload free as long as possible.
Can’t remember your deployment architecture. That would help with suggestions
Basically just an AWS instance.
Not sure how much data you’re pushing out, but look to connect with an AWS reseller. You can aggregate their usage and drop your fees from $0.09/GB to something like $0.05/GB. No cost or commitments to use a reseller (I used to be one before I was acquired)
If I go through a reseller as opposed to direct I can get a big discount?!? Looking into it. Yes, paying $0.09/GB out
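Rough arithmetic on what the reseller rate would mean, assuming the $240 month was mostly egress (the flat-rate function is a simplification; real AWS egress pricing is tiered):

```python
def monthly_egress_cost(gb_out: float, rate_per_gb: float) -> float:
    """Flat-rate estimate; actual AWS data-transfer pricing is tiered."""
    return gb_out * rate_per_gb

# Back out a rough volume from a ~$240 month at $0.09/GB,
# then compare against the quoted ~$0.05/GB reseller rate.
gb = 240 / 0.09                           # roughly 2667 GB out
direct = monthly_egress_cost(gb, 0.09)    # ~$240
reseller = monthly_egress_cost(gb, 0.05)  # ~$133, a ~44% cut
```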
Yes - not just on data transfer. You will also get discounts on compute, support etc. If you need recommendations, I can send you ones that won’t fuck you over. (I was 1 of 40 premier AWS partners in the country so know most of them)
Also one more thing! If you’re building on AWS, be sure to apply for AWS usage credits! They give free money to cover your usage, especially to people just getting started on the platform.
AWS activate is one program and is super easy to apply.
Start by setting up CloudFront in front of it and setting cache headers so caching actually works. That plus ETags will help a lot
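The cache-header piece is just a response header on the image routes — a stdlib-only sketch, with the max-age value as a placeholder you'd tune:

```python
def cache_headers(max_age_seconds: int = 86400) -> dict:
    """Headers that let browsers and CloudFront cache image responses.

    With Cache-Control set, CloudFront serves repeat requests from its
    edge cache instead of pulling from the origin instance every time,
    and browsers skip re-fetching entirely until max-age expires.
    """
    return {
        "Cache-Control": f"public, max-age={max_age_seconds}",
    }

headers = cache_headers(max_age_seconds=604800)  # cache for a week
```

Since the image URLs here never change content, a long max-age is safe; the longer it is, the fewer re-fetches you pay for.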
You will also want to set up a savings plan for the compute of the instance. I suggest a savings plan versus a reserved instance just because you will likely change the size and shape of that instance over time, and the savings plan gives you more flexibility for almost as much savings.
One final comment about the compute. If you’re not already using ARM instances, you should be; there’s a significant discount for better performance per dollar.
A few people talking about using CloudFront, researching, thank you.