Quote:
* Human eyes can process 36,000 bits of information every hour.
Can someone explain this to me? What do they mean by "process" here? Do they mean how fast the retina transmits information to the brain?

If so, I am assuming that they don't mean "binary digits" when they say "bits", because that would be a pathetically low throughput: 36,000 bits per hour works out to 10 bits per second, and even an old 14.4 kbps modem is more than a thousand times faster than that.

A bit of googling (not an in-depth investigation in any way) tells me that some research puts the retina's transmission speed at roughly 10 million bits per second (i.e. roughly the theoretical max speed of 10BASE-T Ethernet), which is a million times greater than the figure quoted above.

So what unit of information are they referring to when they say "bits"? If it's just the common English word "bit" meaning "a small piece", then... well, that's a bit (-_-) vague, isn't it?
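For what it's worth, here's a quick back-of-the-envelope check of the ratios above (the 10 Mbit/s retina figure is just the rough number from googling, not an established constant):

```python
# Sanity-check the arithmetic behind the comparisons in the question.
quoted_bits_per_hour = 36_000
quoted_bits_per_sec = quoted_bits_per_hour / 3600  # = 10.0 bits/s

modem_bps = 14_400  # an old 14.4 kbps modem
print(modem_bps / quoted_bits_per_sec)  # -> 1440.0, i.e. >1000x faster

retina_bps = 10_000_000  # rough googled estimate, ~10 Mbit/s
retina_bits_per_hour = retina_bps * 3600
print(retina_bits_per_hour / quoted_bits_per_hour)  # -> 1,000,000x the quoted figure
```

So the quoted number really is about a million times smaller than the research estimate, if "bits" means binary digits in both cases.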