It's all yours, go get it!
GM everyone, I've finally summoned the coffee. How do you take yours? I use an egregious amount of sweetened condensed milk like a real man
Oh fuck yeah wordle friend
GM, need to replace Government with beavers
Unicode 6.0 has 994 characters. If they're all visually distinct enough for image recognition to reliably identify, you could store a 64-bit integer in 7 characters instead of 64 ones and zeros?
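The digit-count arithmetic here checks out, for what it's worth. A quick sketch, taking the 994-glyph figure above at face value:

```python
import math

# Assuming an alphabet of 994 visually distinct glyphs (the figure
# quoted above). Each glyph carries log2(994) ~= 9.96 bits, so a
# 64-bit integer needs ceil(64 / log2(994)) glyphs.
BASE = 994
glyphs_needed = math.ceil(64 / math.log2(BASE))
print(glyphs_needed)  # 7
```

So 7 glyphs is right: 994^7 is a bit over 2^69, which comfortably covers every 64-bit value.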
I need help making my day job V4V. I produce estimates for handyman/general-contractor projects. Some commercial, but mostly residential. My company has high labor prices to cover overhead, and the laborers themselves don't make as much as they should after the company's cut; the struggle of small businesses. Management keeps it as fair as possible, I think, but it's dog eat dog and all.
Anyway, I've been trying to figure out a model where the estimates basically get put out as high quality instructables, and/or projects get posted to local job/guild board.
There are lots of capable people who can do complex projects with the right guide, University of YouTube type mindset. So a customer with a problem could find or commission a professional scope of work and pay the writer their fee. Then freelance labor could follow the scope.
There could be sponsor/mentor arrangements as well; there's tons of skilled trades workers aging out of manual labor who still have a ton of value to bring to the table in knowledge and tools.
#brainstorm
The higher the base of a number system, the fewer digits it takes to write very large numbers, right? So what if you counted numerals, lowercase and uppercase letters, and emojis all as digits for a base-300 system or whatever? Probably less than the full set of options, to eliminate any visually ambiguous characters. Then could you compress a file by writing it in this base, printing it to a text file, and passing that to an AI to visually identify each glyph and convert it back to a binary string as it goes? Like reading an analog record?
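The base-conversion half of this is a standard, well-defined operation (it's basically how base58 or base64 work, just with a bigger alphabet). A minimal sketch; the 62-symbol alphabet here is just a stand-in for the larger glyph set you're describing:

```python
import string

# Stand-in alphabet: digits + letters = 62 symbols. The scheme above
# would extend this toward ~300 visually unambiguous glyphs.
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase
BASE = len(ALPHABET)

def encode(data: bytes) -> str:
    # Treat the whole byte string as one big integer. The 0x01
    # sentinel byte preserves any leading zero bytes in the file.
    n = int.from_bytes(b"\x01" + data, "big")
    digits = []
    while n:
        n, r = divmod(n, BASE)
        digits.append(ALPHABET[r])
    return "".join(reversed(digits))

def decode(text: str) -> bytes:
    # Rebuild the integer digit by digit, then strip the sentinel.
    n = 0
    for ch in text:
        n = n * BASE + ALPHABET.index(ch)
    raw = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return raw[1:]
```

One caveat worth flagging: the text string really is shorter in *characters* than the binary digits, but each glyph still costs bytes to store or render, so this relocates the information rather than shrinking it.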
Oh man I'm so excited! I'd like to thank the Academy. And Beyonce I guess out of self preservation
I don't know how to search for this to see if it's a real thing; whether it already exists, is something computers already do, or is a severe misunderstanding of what would even be practical. I feel like it's "Terrence Howard 1x1=2" level misunderstanding/theorizing, but I guess it doesn't hurt to ask.
Do we have a way to bypass large uploads and downloads and just algorithmically generate the binary data of a file, given some expression and CPU time?
So at some point, a file is a sequence of ones and zeros. An immense number, but a discrete number.
Say we popped a "0." onto the front; now it's an ultra-precise value between 0 and 1, with buttloads of significant digits.
How hard would it be to generate some f(x) and an input that would spit out a rational or irrational value that truncates to our desired bit string, and pass that expression as text instead of the whole file itself? There would be infinitely many non-unique answers, right?
I'm only worried about saving tons of bandwidth on the transmission, not the amount of work done to find the function to encode or to produce the final file.
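The "file as a number between 0 and 1" part is completely well-defined, at least. A quick sketch of that piece (names are mine, just for illustration):

```python
from fractions import Fraction

def file_as_fraction(data: bytes) -> Fraction:
    # "Pop a 0. onto the front": read the whole bit string as a
    # binary fraction in [0, 1).
    n = int.from_bytes(data, "big")
    return Fraction(n, 2 ** (8 * len(data)))

def fraction_to_file(frac: Fraction, num_bytes: int) -> bytes:
    # Invert: shift the fraction back up and truncate to the original
    # length (the length has to travel with the expression, since
    # leading zero bytes vanish into the fraction).
    n = int(frac * 2 ** (8 * num_bytes))
    return n.to_bytes(num_bytes, "big")
```

The catch is the other half, finding a *short* f(x). You're right that infinitely many functions hit any given value, but a pigeonhole argument says most N-bit files have no description much shorter than N bits: there are 2^N files but far fewer expressions shorter than N bits to go around. That's essentially the idea behind Kolmogorov complexity, which might be the search term you're after.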
The more I think about this and try to write it down the more I feel like I'm some asshole who thinks they've innovated by slicing bread.
#mathstr #nodumbquestions #somedumbquestions
That looks like a solid way to get in way over my head lol
I dunno, seems undercomplicated. You're the pro here though


