they sure aren't as fragile as they used to be; it's mostly the sheer number of components, plus the nature of the statistical sampling used to test them
failure rates can be reduced by testing a larger fraction of each production batch, but that raises the cost, because very often the tests are destructive
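just to put rough numbers on that trade-off, here's a little Python sketch (the unit cost and defect rate are made up for illustration): a bigger sample gives you better odds of catching a bad batch, but every sampled unit is scrap you paid for

```python
# rough sketch of the destructive-sampling trade-off (numbers are invented):
# the bigger the sample pulled from a batch, the more likely a bad batch gets
# caught, but every sampled unit is destroyed and adds cost.

def chance_of_catching_defect(defect_rate: float, sample_size: int) -> float:
    """Probability that at least one defective unit shows up in the sample."""
    return 1 - (1 - defect_rate) ** sample_size

UNIT_COST = 2.50      # assumed cost of one destroyed unit
DEFECT_RATE = 0.01    # assumed 1% of units in a bad batch are defective

for n in (10, 50, 100, 500):
    p_catch = chance_of_catching_defect(DEFECT_RATE, n)
    print(f"sample {n:>3} units: {p_catch:6.1%} chance of catching the defect, "
          f"${n * UNIT_COST:,.2f} of product destroyed")
```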
same thing with CPUs: they make a batch of chips, test them at a baseline clock, then keep escalating the clock until the failure rate crosses a target percentage, and the whole batch gets rated at the last speed where the failure rate stayed under that target
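in code that binning loop looks roughly like this toy sketch; everything here is made up (test_chip is just a stand-in for the real stress test, the clocks and thresholds are invented), it's only the escalate-until-the-failure-rate-crosses-the-line idea

```python
import random

# toy sketch of batch binning: step the clock up, test a sample at each step,
# rate the whole batch at the last speed whose failure rate stayed under target.

TARGET_FAILURE_RATE = 0.02   # assumed acceptable failure rate (2%)
SAMPLE_SIZE = 200            # chips pulled from the batch at each clock step

def test_chip(clock_mhz: float) -> bool:
    """Pretend stress test: chips get flakier as the clock rises."""
    fail_chance = max(0.0, (clock_mhz - 3200) / 5000)
    return random.random() > fail_chance   # True means the chip passed

def bin_batch(start_mhz: float = 3000, step_mhz: float = 100) -> float:
    clock = start_mhz
    rated = start_mhz
    while True:
        failures = sum(not test_chip(clock) for _ in range(SAMPLE_SIZE))
        if failures / SAMPLE_SIZE > TARGET_FAILURE_RATE:
            return rated              # last clock that stayed under the target
        rated = clock
        clock += step_mhz

print(f"batch rated at {bin_batch():.0f} MHz")
```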
so i'm sure you can see the combinatorial problem with the thousands of components in a CPU: if each component has a 0.001% chance of failing, it doesn't take many of them before the probability that at least one fails gets high
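here's the quick arithmetic, with the component counts just picked for illustration:

```python
# how a 0.001% per-component failure chance compounds across many parts:
# P(at least one failure) = 1 - (1 - p)^N

PER_COMPONENT_FAILURE = 0.00001   # 0.001%

for n_components in (1_000, 10_000, 100_000, 1_000_000):
    p_any_failure = 1 - (1 - PER_COMPONENT_FAILURE) ** n_components
    print(f"{n_components:>9,} components -> "
          f"{p_any_failure:6.1%} chance at least one fails")
```

at 10,000 components you're already around a 10% chance of at least one failure, and at 100,000 it's past 60%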