Humans are a visual species, so it's not surprising that our eyes work pretty well - though we don't compare to avians. ClarkVision compares the eye to a digital camera, and claims a resolution equivalence of about 580 megapixels, a relatively mediocre ISO 800 sensitivity (and only in grayscale at that), roughly f/3.5 at about a 20mm focal length, and an awesome (albeit complex) visual range. (link via Kottke)
It's a great set of references from a photographer and professional astronomer*. I'm not sure how this translates into realtime perception, however, and that's the bit that matters. I recall reading that the pathways between the retina and the visual cortex have pretty limited bandwidth, and that the visual connections to the prefrontal cortex are astoundingly weak. It's as though the world's best camera were connected to your computer by an RS-232 serial cable. There has to be an incredible amount of pre-processing and lossy compression to get any useful realtime work done, and for us only realtime counts. At the other end of the circuit, the brain is doing a lot of informed guessing to create its simulacrum of "reality".
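To get a feel for the mismatch, here's a back-of-envelope sketch. All of the numbers are rough assumptions for illustration (photoreceptor count is a common textbook figure, the "effective frame rate" and bits-per-sample are guesses, and the optic nerve budget is a published ballpark estimate), not measured values:

```python
# Back-of-envelope: raw retinal data rate vs. the "serial cable" to the brain.
# Every constant here is a rough assumption, used only for order-of-magnitude scale.

PHOTORECEPTORS = 126_000_000   # ~120M rods + ~6M cones (textbook ballpark)
FRAMES_PER_SEC = 10            # assumed effective "refresh" rate
BITS_PER_SAMPLE = 8            # assumed intensity resolution per receptor

raw_bits_per_sec = PHOTORECEPTORS * FRAMES_PER_SEC * BITS_PER_SAMPLE

OPTIC_NERVE_BITS_PER_SEC = 10_000_000  # ~10 Mbit/s, a ballpark estimate
RS232_BITS_PER_SEC = 115_200           # fastest common RS-232 rate

compression_factor = raw_bits_per_sec / OPTIC_NERVE_BITS_PER_SEC

print(f"raw retinal stream:  ~{raw_bits_per_sec / 1e9:.1f} Gbit/s")
print(f"optic nerve budget:  ~{OPTIC_NERVE_BITS_PER_SEC / 1e6:.0f} Mbit/s")
print(f"implied compression: ~{compression_factor:.0f}x before leaving the eye")
```

Under these assumptions the retina has to throw away roughly three orders of magnitude of data before anything reaches the cortex - which is the point of the serial-cable analogy.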
This is why a human studying a photograph will get much more from the image than they could ever perceive in a realtime glance. The eye is a marvelous camera, but evolution has had a harder time optimizing the neural interfaces.
By the way, how good might the eye/brain be at lossy compression and re-representation of image input? One clue is how successfully living organisms store their "construction specifications" and startup machinery in a single cell (the egg; the sperm could be eliminated). That's a level of relatively lossless data compression/packing orders of magnitude beyond anything we can achieve with current technologies.
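For scale, a quick estimate of the raw information content of that "construction spec" (the genome size is a standard textbook figure; treating each base as exactly two bits is a simplification that ignores ploidy, epigenetics, and everything else in the cell):

```python
# Rough scale of the genetic "construction spec" packed into one cell.
# Ballpark figures only - this ignores epigenetic state and cellular machinery.

BASE_PAIRS = 3_100_000_000   # human genome, ~3.1 billion base pairs
BITS_PER_BASE = 2            # 4 possible bases -> 2 bits each

genome_bits = BASE_PAIRS * BITS_PER_BASE
genome_megabytes = genome_bits / 8 / 1_000_000

print(f"genome as raw data: ~{genome_megabytes:.0f} MB")
```

Under a gigabyte of raw data, self-packed into a nucleus a few microns across, along with the machinery to boot itself - hard to match with any storage technology we've built.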
* I've noticed less repetition lately of the absurd "bloggers are ignorant fools" meme.