Testing the flat Earth idea with broadcast FM radio signals (roughly 100 MHz):
Suppose the transmitting antenna is on a mountain near Los Angeles, say 300 meters high, and the receiving antenna is on a Pacific island, on a hill about 300 feet high, with only open ocean in between, so there are no mountains or other obstacles to complicate the calculation.
Let us say that the FM broadcast transmitter is running 50,000 Watts (+77 dBm) into a unity-gain antenna, the receive antenna is also unity gain, and the goal is a 20 dB signal-to-noise ratio, good enough for decent high-fidelity reception. This would typically require a signal strength of -73 dBm at the receiver antenna terminals. Therefore, the allowable path loss is 77 dBm - (-73 dBm) = 150 dB.
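As a quick sanity check on that link budget, here is a minimal Python sketch of the dBm conversion and the resulting path-loss allowance; the variable names and printed output are purely illustrative, not taken from any particular tool.

```python
import math

# Transmitter power: 50,000 W expressed in dBm (decibels relative to 1 mW)
tx_power_w = 50_000.0
tx_power_dbm = 10 * math.log10(tx_power_w * 1000)   # about +77 dBm

# Signal needed at the receiver antenna terminals for a 20 dB signal-to-noise ratio
rx_required_dbm = -73.0

# With unity-gain antennas on both ends, the entire margin is available as path loss
allowable_path_loss_db = tx_power_dbm - rx_required_dbm
print(f"TX power: {tx_power_dbm:.0f} dBm, allowable path loss: {allowable_path_loss_db:.0f} dB")
```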
The question then is, how far away could the island be from Los Angeles?
On a flat Earth, you would be limited only by path loss; the horizon could never block anything at surface level or above. Path loss then strictly follows the square-law rule (loss increases by 6 dB every time you double the distance). Running the numbers, the island could be about 4,700 miles away and still meet the 20 dB signal-to-noise goal.
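The flat-Earth figure can be reproduced by inverting the standard free-space path loss relation, FSPL = 20*log10(4*pi*d/lambda), for the 150 dB budget, assuming a carrier right at 100 MHz; the constants and names below are again illustrative.

```python
import math

C_M_PER_S = 299_792_458.0   # speed of light
FREQ_HZ = 100e6             # roughly the middle of the FM broadcast band
MAX_PATH_LOSS_DB = 150.0    # allowable path loss from the link budget above

# Invert free-space path loss, FSPL = 20*log10(4*pi*d/lambda), to solve for distance d
wavelength_m = C_M_PER_S / FREQ_HZ
distance_m = (wavelength_m / (4 * math.pi)) * 10 ** (MAX_PATH_LOSS_DB / 20)
print(f"Flat-Earth (free-space) range: {distance_m / 1609.344:,.0f} miles")   # about 4,700 miles
```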
On a spherical Earth, the radio horizon must be taken into account. For the antenna heights given, the radio horizon is at about 60 miles. The signal would still be more than adequately strong at 60 miles, but would fall off very rapidly beyond that distance. So, on the real Earth, the island could be no more than about 60 miles away.
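For the spherical-Earth limit, the sketch below uses the simple geometric horizon approximation, d = sqrt(2*R*h), for each antenna and adds the two distances; with an assumed mean Earth radius of 6,371 km and the 300 m / 300 ft heights given, it comes out to roughly 60 miles (atmospheric refraction would stretch the radio horizon somewhat, but not by enough to change the conclusion).

```python
import math

EARTH_RADIUS_M = 6_371_000.0   # mean Earth radius (assumed value)

def horizon_distance_m(antenna_height_m: float) -> float:
    """Geometric distance to the horizon, d = sqrt(2*R*h), for an antenna at the given height."""
    return math.sqrt(2 * EARTH_RADIUS_M * antenna_height_m)

tx_height_m = 300.0            # transmitter on a 300 m mountain
rx_height_m = 300 * 0.3048     # receiver on a 300 ft hill, converted to meters

line_of_sight_m = horizon_distance_m(tx_height_m) + horizon_distance_m(rx_height_m)
print(f"Spherical-Earth limit: {line_of_sight_m / 1609.344:.0f} miles")   # about 60 miles
```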