Uh oh, I'm too scared now...
There are two ways to get to it. One is probabilistic and I got it from thinking about probabilities in bitcoin hashing, and that's by far the more interesting way to get there, but also I haven't thought about it in a long time, so I don't think I can do it justice now.
The other is just an analogy with data compression in computers, and with how in a game you impose limits on things like speed and view distance so your graphics card doesn't freeze up or melt. If you start with the position that there's a finite amount of data in a given space, then you get all sorts of cool ideas, like an equivalence between data density per volume and matter density, as in the closeness of atoms to each other. I feel like that's a reasonable equivalence, but I wouldn't want to posit exactly how much information is possible per volume, since it could be the case that a thing like an atom or a wave is only a compression of data, only meaningful to our consciousness, and may look entirely different when "unzipped." But the very simplest way to say it is: the speed of one thing relative to another is information, and if there's a limit on that relative speed, which cannot be broken (but interestingly can be massaged), then the limit may actually constitute a file size limit for whatever contains this entire frame.
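Just to make that last bit concrete, here's a toy sketch of the idea, nothing rigorous, and the cap and resolution numbers are placeholder assumptions I'm making up for illustration: if a relative speed is clamped to some hard cap and can only be resolved in steps of some size, then each pairwise relation can only encode so many bits, and the "frame" as a whole gets a crude upper bound on its size.

```python
import math

# Toy illustration only: assume relative speed between any two objects is
# clamped to [0, SPEED_CAP] and the "engine" can only distinguish steps of
# RESOLUTION. Both numbers are arbitrary stand-ins, not claims about physics.
SPEED_CAP = 299_792_458.0   # placeholder hard cap on relative speed
RESOLUTION = 1.0            # placeholder smallest distinguishable step

def bits_per_relation(cap: float, resolution: float) -> float:
    """Upper bound on bits a single clamped relative-speed value can carry."""
    distinguishable_values = cap / resolution + 1
    return math.log2(distinguishable_values)

def frame_size_bits(num_objects: int) -> float:
    """Crude 'file size' of one frame if every pairwise relation is stored."""
    pairs = num_objects * (num_objects - 1) // 2
    return pairs * bits_per_relation(SPEED_CAP, RESOLUTION)

if __name__ == "__main__":
    print(f"bits per relation: {bits_per_relation(SPEED_CAP, RESOLUTION):.1f}")
    print(f"toy frame size for 1000 objects: {frame_size_bits(1000):,.0f} bits")
```

The point isn't the numbers, it's just that a speed cap plus a finite resolution behaves like a size limit on the file.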
And if that made no sense, I don't blame you!