According to scientist and photographer Dr. Roger Clark, the resolution of the human eye is 576 megapixels. That’s huge when you compare it to the 12 megapixels of an iPhone 7’s camera. But what does this mean, really? Is the human eye really analogous to a camera?
A 576-megapixel resolution means that in order to create a screen with a picture so sharp and clear that you can’t distinguish the individual pixels, you would have to pack 576 million pixels into an area the size of your field of view. To get his number, Dr. Clark assumed optimal visual acuity across the entire field of view; that is, he assumed your eyes are moving around the scene before you. But in a single snapshot-length glance, the resolution drops to a fraction of that: around 5–15 megapixels.
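The arithmetic behind the 576-megapixel figure can be sketched in a few lines. This is a back-of-the-envelope version assuming the inputs commonly attributed to Dr. Clark’s estimate: a 120° × 120° field of view and a visual acuity of 0.3 arc-minutes per pixel.

```python
# Back-of-the-envelope version of the 576 MP estimate.
# Assumed inputs: 120 deg x 120 deg field of view, 0.3 arc-minute acuity.
field_deg = 120          # field of view per side, in degrees
acuity_arcmin = 0.3      # smallest resolvable detail, in arc-minutes

# 60 arc-minutes per degree, so pixels needed along one side:
pixels_per_side = field_deg * 60 / acuity_arcmin

megapixels = pixels_per_side ** 2 / 1e6
print(f"{pixels_per_side:.0f} px per side -> {megapixels:.0f} MP")
# prints "24000 px per side -> 576 MP"
```

Note how sensitive the result is to the assumptions: shrink the field to the few degrees covered by the fovea and the count collapses, which is exactly why a single glance yields only a handful of megapixels.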
That’s because your eyes have a lot of flaws that wouldn’t be acceptable in a camera. You only see in high resolution in a very small area at the center of your vision, called the fovea. You have a blind spot where your optic nerve meets your retina. You move your eyes around a scene not only to take in more information but to correct for these imperfections in your visual system.
Really, though, the megapixel resolution of your eyes is the wrong question. The eye isn’t a camera lens, taking snapshots to save in your memory bank. It’s more like a detective, collecting clues from your surrounding environment, then taking them back to the brain to put the pieces together and form a complete picture. There’s certainly a screen resolution at which our eyes can no longer distinguish pixels — and according to some, it already exists — but when it comes to our daily visual experience, talking in megapixels is way too simple.
Check out my related post: Are people who wear glasses more intelligent?