“What is this new theory?” the long-retired New York University cognitive psychologist Lloyd Kaufman asked me. We were sitting behind the wooden desk of his cozy home office. A stack of all his papers on the moon illusion, freshly printed, waited for me on the adjacent futon. But I couldn’t think of a better way to start our discussion than to have him respond to the latest thesis claiming to explain what has gone unexplained for thousands of years: Why does the moon look bigger when it’s near the horizon?

He scooted closer to his iMac, tilted his head, and began to read the MIT Technology Review article I had pulled up.1 I thought I’d have a few moments, as he read, to appreciate the view of New York City outside the 28th-floor window of his Floral Park apartment, but within half a minute he told me, “Well, it’s clearly wrong.” It wasn’t even my theory, yet I felt astonished.

The article described two researchers at Susquehanna University in Pennsylvania, Joseph Antonides (an undergraduate) and Toshiro Kubota (a computer scientist), who had constructed a perceptual model in which the sky is contiguous with the horizon, so that the moon is placed, as it were, in front of the sky, occluding it.2 Since our depth perception also places the moon farther from us than the horizon, we face a perceptual dilemma. The researchers reasoned that the horizon moon’s apparent enlargement is a product of the brain trying to resolve this dilemma.

It’s wrong, Kaufman told me, because “you can get the illusion if you have only one eye. Simple!”

Full article via Nautilus.