Replying to Super Testnet

Thoughts about Moore's law

The chart below is on a logarithmic scale:

https://qph.cf2.quoracdn.net/main-qimg-c4d853586c9387861f0f219700476c85-lq

It charts the growth of CPU speeds relative to the VAX 11/780, and it suggests that Moore’s law stopped applying around 2003. I decided to continue the chart using gigaflops as a metric for CPU speed.

I don’t have data on how many floating point operations the VAX 11/780 could do, but I suspect the answer is “none,” because I think the first Intel chip advertised as being capable of floating point operations was the 8087 coprocessor in 1980.

According to Stack Exchange it could do 50 kiloflops, i.e. 0.05 megaflops or 0.00005 gigaflops. It was in 1975 that Moore revised his prediction; the popular version of it is that computers double in speed every eighteen months, which is the same as saying they quadruple every three years. If that had been what happened, these would be the numbers (there is a short script after the list that reproduces the arithmetic):

1980: 0.00005 gigaflops

1983: 0.0002 gigaflops

1986: 0.0008 gigaflops

1989: 0.0032 gigaflops

1992: 0.0128 gigaflops

1995: 0.0512 gigaflops

1998: 0.2048 gigaflops

2001: 0.8192 gigaflops

2004: 3.2768 gigaflops

2007: 13.1072 gigaflops [note that in 2007 the Intel Core 2 Quad Q6600 was a high-end CPU, and it could do 25 gigaflops, so we were a bit ahead of Moore’s law]

2010: 52.4288 gigaflops [note that in 2010 the Intel Core i7-980X was a high-end CPU, and it could do 98.4 gigaflops, so we were a bit ahead of Moore’s law]

2013: 209.7152 gigaflops [note that in 2013 the Intel Core i7-4770K was a high-end CPU, and it could only do 177 gigaflops, so we fell behind Moore’s law at about this time]

2016: 838.8608 gigaflops

2019: 3,355.4432 gigaflops

2022: 13,421.7728 gigaflops

2025: 53,687.0912 gigaflops
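
If you want to check the arithmetic yourself, here is a quick Python sketch that reproduces the table above (the starting point is the 8087’s rough 50 kiloflops in 1980, and the assumption is one doubling every 18 months):

```python
# Sketch: project peak gigaflops assuming speed doubles every 18 months,
# i.e. quadruples every 3 years, starting from the Intel 8087's ~50 kiloflops.
START_YEAR = 1980
START_GFLOPS = 0.00005  # 50 kiloflops expressed in gigaflops

for year in range(START_YEAR, 2026, 3):
    doublings = (year - START_YEAR) / 1.5  # one doubling per 1.5 years
    gflops = START_GFLOPS * 2 ** doublings
    print(f"{year}: {gflops:,.5f} gigaflops")
```

The last row lands on the same 53,687 gigaflops as the table.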

But in 2025 we are actually at 1,696 gigaflops with the Intel Core i9-10900K. We are now significantly underperforming Moore’s law. It slowed down around 2013, by my numbers.
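
For a rough sense of how far behind that schedule we are, here is a small sketch (taking my 1,696 gigaflops figure at face value):

```python
import math

# Sketch: how many 18-month doublings separate the projected 2025 figure
# from the actual one? (1,696 gigaflops is the peak figure I used above.)
projected_2025 = 53687.0912
actual_2025 = 1696

doublings_behind = math.log2(projected_2025 / actual_2025)
years_behind = doublings_behind * 1.5  # 18 months per doubling
print(f"{doublings_behind:.1f} doublings behind (~{years_behind:.1f} years)")
```

That works out to roughly five doublings, or about seven and a half years, behind the eighteen-month schedule.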

At a glance, it looks more like a power law distribution. CPU performance is running into the physical limits of transistors. I never thought about floating point operations; why that instead of CPU performance?


Discussion

I don't understand much about how computers work but I figured gigaflops *was* a measurement of CPU performance. What other number should I look at instead?

I am not saying it's not. I just never thought to use it as a benchmark. I would bet it translates somehow to CPU cycles / instructions per cycle, just not sure how. I think in my mind I always associated gigaflops with GPUs?
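
My rough guess at the translation, sketched out below: theoretical peak FLOPS is roughly cores × clock × floating point ops per cycle. The specific numbers here (10 cores, 5.3 GHz, 32 single-precision ops per cycle with AVX2 FMA) are just assumptions for illustration that happen to reproduce the 1,696 gigaflops figure above.

```python
# Sketch: peak FLOPS as cores x clock x FLOPs per cycle per core.
# The numbers below are assumptions for illustration: a 10-core chip
# boosting to 5.3 GHz, doing 32 single-precision FLOPs per cycle
# (2 FMA units x 8 FP32 lanes x 2 ops per fused multiply-add).
cores = 10
clock_ghz = 5.3
flops_per_cycle = 32

peak_gflops = cores * clock_ghz * flops_per_cycle
print(f"theoretical peak: {peak_gflops:.0f} gigaflops")  # ~1,696
```

So it does come down to clock speed and per-cycle throughput, plus core count; GPUs just have far more of the per-cycle part, which is probably why they come to mind first.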