i don't think data structures have this kind of trade-off anyway. usually it's one simple structure (e.g. a signed integer) with 5-10 functions, then a larger compound structure, built out of the simple structures, with another 5-20 functions.
at least in my experience, the data-structure-to-method ratio is fairly consistent, increasing slightly with the complexity of the structure.
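something like this rough sketch in rust (all the types are made up, purely to illustrate the ratio i mean):

```rust
#![allow(dead_code)]

// simple structure: one signed integer, with a handful of functions
#[derive(Clone, Copy)]
struct Celsius(i32);

impl Celsius {
    fn new(v: i32) -> Self { Celsius(v) }
    fn value(self) -> i32 { self.0 }
    fn to_fahrenheit(self) -> i32 { self.0 * 9 / 5 + 32 }
    fn is_freezing(self) -> bool { self.0 <= 0 }
}

// compound structure, built out of the simple ones,
// with a somewhat larger (but not wildly larger) set of functions
struct Reading { temp: Celsius, station_id: u32 }
struct Log { readings: Vec<Reading> }

impl Log {
    fn new() -> Self { Log { readings: Vec::new() } }
    fn push(&mut self, r: Reading) { self.readings.push(r); }
    fn len(&self) -> usize { self.readings.len() }
    fn max_temp(&self) -> Option<Celsius> {
        self.readings.iter().map(|r| r.temp).max_by_key(|c| c.value())
    }
    fn count_for_station(&self, id: u32) -> usize {
        self.readings.iter().filter(|r| r.station_id == id).count()
    }
}

fn main() {
    let mut log = Log::new();
    log.push(Reading { temp: Celsius::new(21), station_id: 7 });
    assert_eq!(log.len(), 1);
    assert!(!log.max_temp().unwrap().is_freezing());
    assert_eq!(log.count_for_station(7), 1);
}
```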
also, considering the scope of transcendentals on floating point numbers, you've got your 100 functions for a simple 64-bit data structure right there once you add in all the stuff hiding behind the infix, prefix, and postfix operators.
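for a taste of how that count adds up on a plain f64 in rust (the operations below are a small sample of the real std ones; the tally to roughly a hundred is my own back-of-the-envelope estimate):

```rust
fn main() {
    let x: f64 = 2.0;

    // infix operators: +, -, *, /, %, plus the comparison set
    let _ = x + 1.0;
    let _ = x % 0.7;
    let _ = x < 3.0;

    // prefix operator
    let _ = -x;

    // a small sample of the transcendental / rounding methods on f64;
    // the full std set (sin/cos/tan, their inverse and hyperbolic
    // variants, exp/ln/log2/log10, powf/powi, sqrt/cbrt, floor/ceil/
    // round/trunc, abs, hypot, atan2, ...) plus the operators above
    // runs to roughly a hundred distinct operations on this one
    // 64-bit structure
    let _ = x.sin();
    let _ = x.exp();
    let _ = x.ln();
    let _ = x.powf(1.5);
    let _ = x.hypot(3.0);
}
```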
i'd go so far as to say that this is a complete red herring because the purpose of a data structure dictates how many functions it needs. a large data structure often needs accessors to control access and/or prevent race conditions.
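for instance, a hypothetical sketch of what i mean by accessors controlling access: callers never touch the fields directly, only the accessors, so the locking discipline lives in one place and two threads can't race on the underlying data.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// hypothetical shared structure behind accessor methods
#[derive(Clone)]
struct Counters {
    inner: Arc<Mutex<Vec<u64>>>,
}

impl Counters {
    fn new(n: usize) -> Self {
        Counters { inner: Arc::new(Mutex::new(vec![0; n])) }
    }

    // the accessor takes and releases the lock, so concurrent callers
    // can't race on the underlying vector
    fn increment(&self, idx: usize) {
        let mut v = self.inner.lock().unwrap();
        if let Some(c) = v.get_mut(idx) {
            *c += 1;
        }
    }

    fn get(&self, idx: usize) -> Option<u64> {
        self.inner.lock().unwrap().get(idx).copied()
    }
}

fn main() {
    let counters = Counters::new(4);
    let other = counters.clone();
    let t = thread::spawn(move || other.increment(0));
    counters.increment(0);
    t.join().unwrap();
    assert_eq!(counters.get(0), Some(2));
}
```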
and then there's the fact that access to large data structures tends to be wrapped in FIFO pipes and/or network interfaces, and many of these can be aggregated into one application. just look at a typical web service API for an example of one data structure with hundreds of methods.
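a toy sketch of that shape (all names hypothetical, std-only, with the network layer stubbed out as a string router): one shared structure, and every "endpoint" is just another method on it.

```rust
use std::sync::{Arc, Mutex};

// hypothetical resource: the one data structure behind the service
#[derive(Default)]
struct UserStore {
    names: Vec<String>,
}

type Shared = Arc<Mutex<UserStore>>;

// each handler is effectively another method on the same structure;
// a real web API accumulates hundreds of these over time
fn create_user(store: &Shared, name: &str) -> String {
    store.lock().unwrap().names.push(name.to_string());
    format!("created {name}")
}

fn count_users(store: &Shared) -> String {
    format!("{}", store.lock().unwrap().names.len())
}

// stand-in for the network layer: route a request to a handler
fn route(store: &Shared, method: &str, path: &str, body: &str) -> String {
    match (method, path) {
        ("POST", "/users") => create_user(store, body),
        ("GET", "/users/count") => count_users(store),
        _ => "404".to_string(),
    }
}

fn main() {
    let store: Shared = Arc::new(Mutex::new(UserStore::default()));
    println!("{}", route(&store, "POST", "/users", "ada"));
    println!("{}", route(&store, "GET", "/users/count", ""));
}
```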
not meaning to nitpick exactly, but the ratio of data to the methods that operate on it isn't really quantifiable in this way.