Well, if we're going to get into the science then why not, I'll bite.
I don't see why there is an assumption that cybereyes are wholly digital. Electrically driven does not mean digital. Nowhere in the rules is this stated, and a wholly digital eye makes less sense to me than a combination of analog and digital parts.
Mostly I base my reasoning on the fact that the brain wouldn't know how to interpret a digital signal. A stream of zeros and ones could mean anything to the visual cortex, which is used to taking input in the form of electrical stimuli (analog signals) from the retina (I believe; high school bio was a long time ago). Additionally, it is Shadowrun canon that the brain cannot directly interpret digital signals; you need a simsense module to translate digital signals into sensations. The cybereye does not require a simsense module to use, so clearly the signal reaching the brain must be analog, though I'm willing to hear a counterargument.
This leaves two options for how the cybereye works: either a digital camera that translates the information it picks up back into analog form, or an analog camera (possibly with certain digitized components).
Though either could work, translating light into a digital signal and then back into analog again seems somewhat redundant. That doesn't rule it out, but what doesn't make any sense with this setup is the other visual modes. Thermographic, low-light, vision magnification, vision enhancement: all of these take up either essence or capacity. If the information is digital and then converted back to analog, why should any of that take up capacity? Shouldn't it be software driven? Is the software so complex that each mode needs its own storage space? And if you say it's purely a game balance thing, then why is it okay for THIS to be a game balance call that makes no sense, while the ruling on Radar Sensor and spellcasting isn't?
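Just to make the comparison concrete, here's a toy sketch of the two signal chains I'm picturing. None of this is from the books; the function names and numbers are made up purely for illustration.

```python
# Toy sketch of the two candidate cybereye signal chains.
# Purely illustrative: nothing here is canon, all names/numbers are made up.

def digital_roundtrip_eye(photons):
    """Option 1: light -> ADC -> digital processing -> DAC -> analog signal to the nerve."""
    samples = [int(p * 255) for p in photons]          # analog-to-digital conversion
    enhanced = [min(s + 10, 255) for s in samples]     # e.g. a software 'vision enhancement' pass
    return [s / 255.0 for s in enhanced]               # digital-to-analog, back to nerve-friendly form

def optical_eye(photons):
    """Option 2: the light stays analog the whole way; 'modes' are physical optics."""
    return [p * 1.2 for p in photons]                  # e.g. a better lens simply passes more light

incoming_light = [0.1, 0.5, 0.9]
print(digital_roundtrip_eye(incoming_light))   # round trip: digitize, tweak, convert back
print(optical_eye(incoming_light))             # no conversion step at all
```

Which is exactly why the round trip feels redundant to me: the second path gets the same light to the brain without ever touching a converter.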
To me it makes far more sense, given these things, that it is an optical camera system. AR can be achieved by putting in a contact-lens-sized transparent screen; the light from the AR display would be an analog signal. Magnification is a physical lens movement. Low-light is achieved with an artificial tapetum lucidum (the part of a cat's eye that allows enhanced night vision: basically a layer behind the retina that reflects a portion of light back through it). I'm at a loss for the moment on how to get thermographic, but if it exists naturally (Trolls, Dwarfs, etc.) then it should be possible to duplicate artificially. Recording could be an actual digital camera that records the image seen; it need not be placed in the path of the visible light. Smartlink would function just like an AR Image Link, and Eye enhancement would be superior lenses. Totally possible as an analog system.
This leads to the big point: when you see a spellcasting target with cybereyes, the information relayed to your brain need not have been digitized at all. That is not the case with Radar Sense or Ultrasound; those HAVE to be converted from a digital form at some point (analog radar systems do exist, but size is an issue, plotting time another, resolution another, etc. At least I think so; I only spent about a minute confirming the info on the net). Anyhow, the fact that the information was digitized and then reproduced as an image means what you're seeing is a picture representing the object, no different than wanting to target via a TacNet's AR display.
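For contrast, here's an equally made-up sketch of why the radar picture has to be a reconstruction: the sensor only receives echo timings, and a displayable image has to be computed from them before anyone can look at it.

```python
# Toy sketch (again, purely illustrative, not from any sourcebook): a radar
# display is computed data, not the object itself.

SPEED_OF_LIGHT_M_PER_S = 3.0e8

def radar_ranges(echo_delays_s):
    """Turn raw echo delays into estimated distances; the 'image' is built from these numbers."""
    return [delay * SPEED_OF_LIGHT_M_PER_S / 2 for delay in echo_delays_s]

# What the user ends up looking at is a rendering of these computed ranges,
# not light that bounced off the target.
print(radar_ranges([1e-6, 4e-6]))   # -> [150.0, 600.0] metres
```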
Anyhow, this is how I see it when I analyze the tech aspects, combined with the rule system.