This light wasn't sufficient for my Eakins IMX290 camera. If I wanted a good image I needed both lights cranked up to max and placed within about an inch of the board, which is too close for comfort. That's why I opened it up and started to look into "turbo charging" the light. But the fault there is really with the camera: it just doesn't have the right sensor for use on a trinocular scope.
Yeah, the thing I don't really understand is that the IMX290 is supposed to be a very well-performing low-light sensor... probably even better (for low light) than the one in the RPi HQ camera (the IMX477, I believe).
So it's like... well, what's the problem then? Is it the firmware running on the Eakins blue camera? Or is the camera somehow stuck in a "mode" that doesn't use the full capabilities of the sensor? For example, maybe that mode is not exposed to the user, or it's a technical restriction of the way the MCU in the camera communicates with the sensor module, etc.
It does not really make sense to me why the IMX290 (the sensor alone) would be inadequate and underperforming. I don't get what I'm missing here.
I mean, it's great that the RPi camera can do such a good job at lower cost. But that doesn't explain why the Eakins itself is doing such an inferior job, when it has a (slightly better?) sensor in it.
Or maybe there are some post-processing steps going on in the RPi?
Ah wait, now I remember! The RPi camera can only use 2 of the MIPI CSI lanes: half the bandwidth. So that limits the fps to 1080p30, whereas the Eakins is doing twice the refresh rate at 1080p60. So since each frame can be exposed for double the time (on the RPi), maybe that makes the image about twice as bright?
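To put some rough numbers on that hunch, here's a small sketch. The big assumption (mine, not confirmed for either camera) is that the sensor exposes for close to the full frame period, so light gathered per frame scales with 1/fps:

```python
# Sketch of the exposure/brightness reasoning above.
# ASSUMPTION: exposure time per frame ~= the full frame period (1/fps),
# with no gain applied to compensate. Real cameras may expose for less.

def max_exposure_ms(fps):
    """Longest possible exposure per frame at a given frame rate."""
    return 1000.0 / fps

exp_30 = max_exposure_ms(30)  # RPi at 1080p30
exp_60 = max_exposure_ms(60)  # Eakins at 1080p60

ratio = exp_30 / exp_60  # light gathered per frame, 30 fps vs 60 fps

print(f"30 fps: up to {exp_30:.1f} ms exposure per frame")
print(f"60 fps: up to {exp_60:.1f} ms exposure per frame")
print(f"Brightness ratio (30 fps vs 60 fps): {ratio:.1f}x")
```

If that assumption holds, the 30 fps camera gets roughly a 2x brightness advantage per frame, which would line up with what I'm seeing.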
So what would happen if you could limit the Eakins IMX290 sensor to a "true" 30 fps, rather than have its MCU just throw away half of the frames (while still capturing at 60 fps regardless)?
The reason I bring this up is because some dude out there has also been working on an FPGA adaptor for the RPi HQ camera, which uses all 4 CSI lanes. With that implementation (very much a prototype rather than a finished product), he can set any resolution or framerate he likes, up to 1000 fps.
And basically, increasing the FPS always makes the image significantly DIMMER, as of course you would expect from any camera.
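Just to illustrate how steep that dimming gets across the frame rates his prototype covers, here's a rough sketch. Same assumption as before, mine and unverified: exposure time equals the full frame period, with no gain compensation. I'm expressing the loss in photographic stops (each stop = half the light):

```python
import math

# ASSUMPTION: light per frame scales with the frame period (1/fps),
# so relative to a 30 fps baseline, higher fps costs log2(fps/30) stops.

BASELINE_FPS = 30

def stops_lost(fps, baseline=BASELINE_FPS):
    """Exposure lost (in stops) going from the baseline frame rate to fps."""
    return math.log2(fps / baseline)

for fps in (30, 60, 120, 250, 500, 1000):
    print(f"{fps:>4} fps: frame period {1000 / fps:6.2f} ms, "
          f"{stops_lost(fps):4.1f} stops dimmer than 30 fps")
```

By this back-of-envelope math, 1000 fps is about 5 stops (roughly 33x) dimmer per frame than 30 fps, which would match his images getting dramatically darker as he cranks the rate up.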
Hope this helps! Let us know if you can do any further tests to confirm (or deny) with your own cameras. Good day!
And of course this speaks to the difference between the brightness through the eyepieces compared to the camera image. Perhaps other people just artificially boost the brightness on the captured image, at a loss of quality? But of course it doesn't really work like that, due to the metal reflections making it hard to see the solder joints, etc.
Please let us know if you can elaborate any better! My own opinions expressed here are mostly guesswork, and therefore speculative. It would be nice to know better.