Last night I went out and took a picture of the full moon. I do not have a telescope with a camera adaptor, nor do I have an extremely long lens. But with nothing but the 55-200mm zoom that came with my Nikon D3300 DSLR, I was able to come up with a pretty good estimate for the angular size of the moon.
The shot is not very sharp, because although the sensor is 24 megapixels, I used autofocus (I was pressed for time) and shot handheld. Here is the original (scaled for the blog):
Obviously, not very close, but good enough to get a pretty good reading on the pixel width, which turned out to be 435 pixels. Here's the image cropped:
According to this calculator, with the lens at 200mm the angle of view is six degrees, forty-three minutes, thirty seconds, and each pixel represents 4.043 seconds of arc. So 435 pixels times 4.043 seconds is about 1759 seconds, or about 29 and a half minutes of arc.
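The arithmetic above can be sketched in a few lines. This is a minimal check, assuming the field-of-view figure from the calculator and the D3300's 6000-pixel-wide sensor; the per-pixel figure comes out slightly different from 4.043″ depending on rounding:

```python
# Field of view at 200mm, per the online calculator: 6 deg 43 min 30 sec
fov_arcsec = 6 * 3600 + 43 * 60 + 30     # 24,210 arcseconds

# The 24 MP D3300 sensor is 6000 x 4000 pixels
sensor_width_px = 6000
arcsec_per_px = fov_arcsec / sensor_width_px   # roughly 4.04 arcsec per pixel

moon_px = 435                                  # measured width of the moon in the photo
moon_arcsec = moon_px * arcsec_per_px          # roughly 1755 arcseconds
moon_arcmin = moon_arcsec / 60                 # roughly 29.3 arcminutes
print(f"{moon_arcmin:.1f} arcminutes")
```

The point is not the third decimal place; it is that a kit lens and a ruler on a screenshot get you within a fraction of an arcminute of the published value.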
According to another online calculator at calsky.com, the moon's angular size was 29 minutes, 49 seconds, so my estimate from this simple picture was only about half an arcminute off.
What is the purpose of this exercise? Foremost, it is to show that these measurements given by astronomers are not "received wisdom" that we just accept without challenge. Everyone has access to at least some way to verify that they are true. And this is true of most measurements involving Earth, the moon, and the sun.
A secondary purpose is to get anyone tempted by the flat-Earth idea to do a little mathematical thinking. It can be shown mathematically that an object with an angular size of about 29.75 minutes is about 115 times its width away from the observer. You can even test this experimentally by taking a picture of, say, a basketball from 30 yards away.
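That distance-to-width ratio falls straight out of small-angle trigonometry. Here is a quick sketch (the 9.5-inch basketball diameter is my assumed figure):

```python
import math

angular_size_arcmin = 29.75
theta = math.radians(angular_size_arcmin / 60)   # about 0.00865 radians

# Distance / width = 1 / tan(theta); for angles this small, 1 / theta works too
ratio = 1 / math.tan(theta)                      # roughly 115

# Check with a basketball, about 9.5 inches across:
basketball_in = 9.5
distance_yd = ratio * basketball_in / 36         # roughly 30 yards
print(f"ratio ~ {ratio:.1f}, basketball at ~{distance_yd:.0f} yards")
```

So a basketball photographed from 30 yards away should subtend almost exactly the same angle as the full moon, which is something anyone can try.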
Now, if a bunch of people all over the world do the exercise to measure the angular size of the moon at the same time, and they all get similar results, this presents a big problem for the flat-Earth notion. Because if I take the above picture at 9:30 in New England, and someone else takes a picture in, say, England at 2:30 in the morning, and we both get the same result, then the moon must be very far away.
Otherwise, one of us would be seeing a much smaller moon. It's just the mathematics of the situation. Again, this is something that can be modeled to scale to verify the theory.
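The scale of the problem is easy to model numerically. Below is a hedged sketch: the real figures (moon about 2,159 miles across, about 239,000 miles away) are standard values, and the "close moon" numbers (32 miles wide, 3,000 miles up) are illustrative flat-Earth-style figures, not anything from a specific model:

```python
import math

def angular_size_arcmin(width, distance):
    """Apparent angular size, in arcminutes, of an object of a given width."""
    return math.degrees(2 * math.atan(width / (2 * distance))) * 60

# Real moon: essentially the same angular size for every observer on Earth
print(angular_size_arcmin(2159, 239_000))     # about 31 arcminutes

# Hypothetical close moon over a flat plane: one observer directly
# underneath, another 4,000 miles away (slant distance 5,000 miles)
near = angular_size_arcmin(32, 3_000)
far = angular_size_arcmin(32, math.hypot(3_000, 4_000))
print(near, far)   # the distant observer sees it about 40% smaller
```

With a distant moon, moving a few thousand miles changes your distance to it by a few percent at most, so everyone measures the same size. With a close moon, the difference would be glaring.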
If the moon is very far away, then it must be quite large, certainly more than the 32 miles or whatever that flat-Earthers claim. And if it's big and far away from a flat plane, then how can it possibly be ten degrees above horizontal, as it was when I photographed it last night?
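The elevation-angle problem can be sketched too. Assuming the real Earth-moon distance of roughly 239,000 miles, a moon hovering that far above a flat plane would sit nearly overhead for every observer on the plane; to see it at only 10 degrees you would need an absurd horizontal offset:

```python
import math

MOON_DISTANCE_MI = 239_000   # roughly the real Earth-moon distance

def elevation_deg(height, horizontal_offset):
    """Elevation angle, in degrees, of an object over a flat plane."""
    return math.degrees(math.atan2(height, horizontal_offset))

# Even 4,000 miles away along the plane, such a moon would be
# almost directly overhead:
print(elevation_deg(MOON_DISTANCE_MI, 4_000))    # about 89 degrees

# To see it only 10 degrees above horizontal, you would need to be
# over 1.3 million miles away along the plane:
offset = MOON_DISTANCE_MI / math.tan(math.radians(10))
print(f"{offset:,.0f} miles")
```

No flat-Earth map has room for an observer over a million miles from the point under the moon, yet a 10-degree moon is something anyone can see on an ordinary evening.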
Flat-Earthers will be tempted to invoke some kind of refraction, similar to Rob Skiba's version of atmospheric lensing. But before you go down that road, you should spend a little more time with lenses. If a lens magnifies something, it doesn't do so selectively. A lens isn't going to make the moon look bigger as it gets farther away without also making it seem higher off the ground at the same time.
The next temptation might be to say: "Well, the moon isn't a real object; it's a self-illuminating projection of some sort." I can tell you several reasons why that can't be true, but there's no need. Because even a self-illuminating projection can't defy the simple truth of angular size.
In short, there is no way that any model of a flat Earth can match what we see in reality. It just doesn't add up (or divide or multiply, for that matter).
If you are going to go around telling people to "trust their senses," then you should be prepared to follow through with some kind of proposition that actually matches what our senses tell us.
Or better yet, trust your senses. They will tell you that the Earth is, indeed, a globe.