No AA filter? More marketing hype than substance

Back in 2012, when the D800 was released, Nikon tweaked its antialiasing filter to create the higher-resolution D800E: a pair of birefringent crystals is arranged anti-parallel, rotated 180 degrees, so that their displacements cancel out. But was it worth it? Having disassembled more cameras since then, I decided to write a post on how these filter stacks are organized.

ChipMod sent me the pair of filters from a Nikon D600, whose IMX128 sensor was scraped during a monochrome mod.

Filter set

Filters from D600: UV-IR, CMOS Cover Glass, Color Correction Stack

Back on the D7000, I showed that the filter set consists of an antialiasing layer with a UV-IR coating, plus an ICF stack built from a wave plate, a color correction glass, and another AA layer. Upon receiving these filters, I initially suspected the same arrangement. On closer examination, however, the color correction glass turned out to be a single thin layer with no wave plate glued to it. On a micrometer, it registered 0.484mm thick.

Without a wave plate, it is impossible to spread a point into four, since the two light rays are polarized in orthogonal directions. I thought a workaround might be to orient the AA filter at 45 degrees instead of 0 or 90. (Here I refer to the orientation of the direction along which the two rays separate; the AA filter is always cut perpendicular to the optical axis, or Z-axis, of the birefringent crystal.) That way, blue could be mixed with red. However, inspection under a microscope rebutted this idea as well. It turned out that the first UV-IR layer blurs only in the vertical direction, leaving moiré untouched in the horizontal direction.

AA under Microscope

Calibration slide between objective and AA: 1mm in 100 divisions

Stage setup with micrometer ruler in the vertical direction

The spread from this filter is around 5 microns, wider than that of the D7000, corresponding to a thicker crystal at 0.8mm. Now we know for sure the D600 blurs only vertically, which gives it slightly higher resolution in the horizontal direction. DPReview's excellent resolution test confirms the case: the D600 resolves horizontally well beyond 36, albeit with accompanying color moiré, but blurs out at around 34 in the vertical direction.

Do any other cameras do this? It turns out many follow the same trend. To name a few: the Sony A7R II, the Nikon D5100, and possibly other low-end DSLRs all have a single AA layer glued to a color correction filter. One possible motive is to suppress the already strong false color that arises from row skipping during video live view. However, I would argue the effect is minimal, given a spread distance close to the pixel pitch.

The material for the AA filter and wave plate is usually crystalline quartz. Many websites cite lithium niobate, and that is incorrect. An argument floats around that quartz's birefringence is too small, requiring too thick a slice. That was true in the early days of digital imaging, when pixel pitches were huge (>10um)! But once a proper calculation is done, the 0.8mm-thick material above happens to give close to a 5um displacement. Were lithium niobate used, the plate would be far too thin to manufacture. Another interesting property of quartz, like fused silica, is its UV transparency. Based on the above transmission spectrum scan, the AA substrate material permits UV to pass when measured at the corner; lithium niobate would absorb strongly in UV, just like those ICFs. Notice that without any coating, the glass itself reflects 10% of the light. Again, for emission nebula imaging, you could keep the UV-IR filter.
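That calculation can be sketched in a few lines. This assumes typical visible-band quartz indices (n_o ≈ 1.5443, n_e ≈ 1.5534, not from any datasheet) and the standard walk-off formula d = t·(n_e² − n_o²)/(2·n_o·n_e) for a plate whose optic axis is inclined 45° to the surface normal:

```python
# Beam-separation sketch for a quartz AA filter (assumed index values)
n_o = 1.5443  # ordinary-ray index of quartz
n_e = 1.5534  # extraordinary-ray index of quartz

def aa_spread_um(thickness_mm: float) -> float:
    """Walk-off separation in microns for a uniaxial plate with its
    optic axis at 45 degrees to the surface normal:
    d = t * (n_e^2 - n_o^2) / (2 * n_o * n_e)"""
    return thickness_mm * 1000.0 * (n_e**2 - n_o**2) / (2 * n_o * n_e)

print(aa_spread_um(0.8))  # ~4.7 um, close to the ~5 um measured spread
```

With quartz's small birefringence, a 0.8mm plate lands right at the measured ~5um; a strongly birefringent material like lithium niobate would indeed require an impractically thin slice.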

Cooled CMOS Camera – P3: Image Quality

In the previous post, I successfully obtained a test pattern with the custom VDMA core. The next step is to implement an operating system and the software on the host machine; to achieve real-time live view and control, both need to be developed in parallel. In this post, though, let's take a look at image quality with a simple bare-metal application.

The sensor is capable of 10FPS at 14-bit, 30FPS at 12-bit, or 70FPS at 10-bit ADC resolution. For astrophotography, 14-bit provides the best dynamic range and achieves unity gain at the default setting. The sensor's IR filter holder and the camera mounting plate are still being designed, so for now I will only provide a glimpse of some bias and dark images.

To facilitate dark current estimation, the cover glass protective tape was glued to a piece of cardboard, and the whole sensor was then shielded from light with a metal can lid. Lastly, the camera assembly was placed inside a box and exposed to the -15°C winter temperature. During the process, the camera continuously acquired 2-minute dark frames for 2 hours, followed by 50 bias frames.

Bias Hist

Pixel Intensity distribution for a 2×4 repeating block (Magenta, Green, Blue for odd rows)

The above distribution reflects a RAW bias frame. It appears each readout bank is built with a different bias voltage. The banks are assigned in a 2-row by 4-column repeating pattern, one color per channel. Spikes in the histogram at certain intervals imply a scaling factor is applied to odd rows after digitization to correct for uneven gain between the top and bottom ADCs.
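The 2×4 bank layout can be probed directly from a RAW frame with strided slicing. Below is a minimal sketch on a synthetic frame; the per-bank offsets and the noise level are made-up values for illustration, not measured D600 data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic bias frame: 8 readout banks in a 2-row x 4-column repeating
# pattern, each with its own bias offset (hypothetical values).
offsets = np.arange(8).reshape(2, 4) * 5 + 600
frame = rng.normal(0.0, 4.0, size=(64, 64))
for r in range(2):
    for c in range(4):
        frame[r::2, c::4] += offsets[r, c]

# Recover each bank's bias level from the median of its strided slice
bank_levels = np.array([[np.median(frame[r::2, c::4]) for c in range(4)]
                        for r in range(2)])
print(np.round(bank_levels))
```

On a real RAW frame, unequal medians across the eight slices would reproduce the multi-peaked bias histogram shown above.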

Read Noise Distribution

Read Noise – Mode 3.12, Median 4.13, Mean 4.81

The read noise distribution is obtained by taking, for each pixel, the standard deviation across the 50 bias frames, then plotting the distribution to look at its mode, median, and mean. The result is much better than a typical CCD.
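A sketch of that measurement, with synthetic Gaussian frames standing in for the 50 real bias frames (the ~4 e- noise level is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for 50 bias frames: Gaussian read noise of ~4 e- per pixel
stack = rng.normal(600.0, 4.0, size=(50, 128, 128))

# Per-pixel read noise: standard deviation along the frame axis
read_noise = stack.std(axis=0, ddof=1)

# Distribution statistics; the mode is estimated from the histogram peak
hist, edges = np.histogram(read_noise, bins=200)
mode = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
print(f"mode {mode:.2f}  median {np.median(read_noise):.2f}  "
      f"mean {read_noise.mean():.2f}")
```

On real data, hot and telegraph-noise pixels drag the mean above the median, which is why all three statistics are worth reporting.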

Dark_current_minus_15

Finally, the dark current in a series of 2-minute exposures is measured after subtracting the master bias frame. Two interesting observations: 1. The density plot gets sharper (taller and narrower) as temperature decreases, corresponding to an even lower dark generation rate at colder temperatures. 2. The bias drifts with temperature. This could originate in my voltage regulator, in the sensor, or a combination of the two.

The bias drift is usually compensated internally by a clamping circuit before the ADC, but I had to turn that calibration off due to a specific issue with this particular sensor design, which I will elaborate on in a later post. To measure the dark generation rate, I therefore use the FWHM of the noise distribution and compare it against that of a bias frame. At temperature stabilization, the dark frame's FWHM registered 8.774 e-, while the corrected bias frame's was 8.415 e-. For a Gaussian distribution, the FWHM is 2.3548 times sigma, so the variance of the accumulated dark current is 1.113, given independent noise sources. The dark generation rate at this temperature is therefore less than 0.01 eps. Excellent!
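The arithmetic above can be checked directly: convert both FWHMs to sigma, subtract the variances (independent noise sources add in variance), and note that for Poisson-distributed dark current the variance equals the accumulated signal in electrons:

```python
# Dark-current estimate from FWHM widths (values from the measurement above)
FWHM_TO_SIGMA = 2.3548   # FWHM = 2.3548 * sigma for a Gaussian

fwhm_dark = 8.774        # dark frame FWHM, e-
fwhm_bias = 8.415        # bias frame FWHM, e-
exposure_s = 120.0       # 2-minute exposure

var_dark = (fwhm_dark / FWHM_TO_SIGMA) ** 2
var_bias = (fwhm_bias / FWHM_TO_SIGMA) ** 2

# Shot-noise variance of the dark signal equals the signal itself (Poisson)
dark_electrons = var_dark - var_bias       # ~1.11 e- accumulated
rate_eps = dark_electrons / exposure_s     # ~0.009 e-/s
print(f"{dark_electrons:.3f} e-, {rate_eps:.4f} eps")
```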

Preliminary Summary

The sensor performs well in terms of noise. For long exposures, the dark generation rate of this CMOS is more sensitive to temperature change than that of CCDs: the dark current drops massively when cooled below the freezing point, and the doubling temperature is below 5°C.
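The doubling behavior follows the usual exponential model D(T) = D(T0) · 2^((T − T0)/Td). A sketch using the sub-0.01 eps rate measured near -15°C and an assumed doubling temperature of 5°C (the exact Td for this sensor was not pinned down, only bounded):

```python
def dark_rate(rate_ref: float, t_ref: float, t: float,
              t_double: float = 5.0) -> float:
    """Dark generation rate (eps) at temperature t (deg C), scaled from a
    reference rate using an assumed doubling temperature."""
    return rate_ref * 2.0 ** ((t - t_ref) / t_double)

# Assumed reference: ~0.01 eps around -15 C (upper bound from the test)
for t in (-15, -10, -5, 0, 20):
    print(f"{t:>4} C: {dark_rate(0.01, -15, t):.3f} eps")
```

With Td = 5°C, warming from -15°C to room temperature raises the rate by 2^7, about two orders of magnitude, which is why sub-freezing operation pays off so handsomely.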

LEXP_001

An uncorrected dark frame after 120s exposure showing visible column bias and hot pixels

Scraping the Bayer, Gain or Loss? – A quantitative analysis of mono-mod sensitivity

When you are deep into astrophotography, you will probably start doing monochromatic deep-sky imaging, and the typical choice is a cooled CCD imager. These CCD cameras come in a variety of formats and architectures: the most affordable are the interline CCDs offered by Sony and Kodak (now ON Semi), followed by the expensive full-frame CCDs from Kodak, which require a mechanical shutter. Now, however, as most of my previous posts and other similar studies have pointed out, CMOS holds a clear edge over CCD. The only problem is that not many monochromatic CMOS devices are out there to choose from.

CMOSIS CMV12000

One option is the sCMOS from e2v and Fairchild, but I would imagine those to be expensive. Then there is CMOSIS, which offers global shutter monochrome sensors in various formats, but their dark current (~125 eps) and read noise (>10 e-) figures are not clearly competitive with CCDs in any way. Sony makes small-format B/W CMOS, but nothing bigger than the 1-inch format. As a result, in recent years we have seen many specialized conversion services that scrape away the Bayer filter layer. Unfortunately, doing so also removes the microlens array that boosts the quantum efficiency. So in this post, I am going to investigate the QE loss and gain from such a modification.

Data is kindly provided by ChipMod for this study.

The modification steps involve camera disassembly and filter stack removal, followed by prying open the cover glass, protecting the bonding wires, and finally scratching the pixel array. The scratching in that last step actually happens in layers. We will use the IMX071 cross-section EM image from Chipworks again for illustration.

image

The surface texture of an image sensor, as described by ChipMod, varies in its resistance to scratching. The first layer to come off is the microlens array, indicated by the green arrow; this layer is usually made of polymer. Applying further force strips away the RGB Bayer filter as well, indicated by the red arrow. The yellow region represents the pixel pitch, with the blue defining the photodiode boundary. Comparing the length of blue to yellow, we can estimate a fill factor of 50%. Accounting for the channel stop and overflow drain along the other axis, the fill factor is typically 40%. The gapless microlenses above focus the light rays onto the photodiode, bringing the effective fill factor close to 90%.

image

The sensor was scraped into 3 vertical regions. From top to bottom: A, with the microlens array removed; B, with both layers removed; and C, the original. Comparing A/B tells you how much light the color dye absorbs at that wavelength; A/C tells you how effective the microlenses are; and B/C gives you the net gain/loss after the mod.
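Given the mean intensities of the three regions, the comparison reduces to three ratios. A minimal helper, with hypothetical ADU values for illustration only (not the measured data):

```python
def mod_ratios(mean_a: float, mean_b: float, mean_c: float) -> dict:
    """A: microlens removed, CFA intact; B: both removed; C: original.
    Returns the three diagnostic ratios described above."""
    return {
        "dye_transmission (A/B)": mean_a / mean_b,
        "microlens_effect (A/C)": mean_a / mean_c,
        "net_after_mod (B/C)":    mean_b / mean_c,
    }

# Hypothetical mean ADU values of the three flat-field regions
print(mod_ratios(mean_a=400.0, mean_b=640.0, mean_c=1000.0))
```

With these made-up numbers, B/C = 0.64: the modded region keeps 64% of the original signal because removing the absorbing dye partially offsets the microlens loss.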

An identical test condition was set up with a 50mm F6.5 ED telescope in front of a white screen. Two wavelengths, Ha and OIII, were tested with 7nm-FWHM filters at the back. The field is sufficiently flat, so center regions are used to calculate mean intensity.

image

Test result

The microlens array performs as expected, typically boosting QE by 2x in its native channel. Even in non-native color channels, the uLens still boosts signal by 50% or more, so losing the uLens array is a major downside. But considering that the color dye absorbs even at its peak transmission, stripping the CFA actually minimizes the QE loss. For example, in the red channel at H-alpha, the signal stayed at 64% even though losing the uLens alone should cut the QE by more than half. The effect is more apparent at the OIII wavelength: because the green channel peaks at 550nm, the absorption at 500nm is nearly half for this particular sensor, so the net result is no different from the original sensor.

In conclusion, the mono-mod sacrifices some QE in exchange for resolution and full-spectrum sensitivity. My estimate puts the final peak QE at around the photodiode fill factor, or roughly 45%. State-of-the-art CMOS processes maximize the photodiode aperture, making such a mod less prone to QE loss after microlens removal. This is in vast contrast with the Kodak interline CCD structure, where a five-fold QE penalty would result should the microlenses be stripped away. The mod should perform well for narrowband imaging, especially of emission nebulae; however, a fully microlensed monochromatic sensor is still preferred for broadband imaging.

National Park Time Lapse – Tranquility

Since my last astrophotography road trip to California two and a half years ago, I really have not spent any time writing about travel and photography. Amidst the camera project and my PhD, I have somehow accumulated a pile of decent photographs yet to be processed or released. But all that hard work serves to produce better images, doesn't it? So I took a break over the past few weeks to finish off some of that leftover photo work.

Please enjoy my second time lapse compilation – Tranquility

Included are some of the time lapses I took in Big Bend NP, Mojave National Preserve, Death Valley, Pictured Rocks National Lakeshore, and Shenandoah NP. Then there is also Jiuzhai Valley in Sichuan, China!

In terms of astrophotography, I only have a few images left on the hard drive for release. The road trips I have covered recently were on the east coast; with light pollution and bad weather along the way, there really were not many stars to be seen, let alone deep-space imaging.

Cygnus

Wide Field Milky Way Center shot in Death Valley

As for 360 panoramas, they have become routine for me now that the pipeline for 3×6 stitching is well established. In the meantime, I have started to incorporate the floor image into the stitching process.

Carlsbad Caverns · The Window · White Sand · Big Bend · Porcupine Mountain · Tybee Island Lighthouse · Shenandoah · Death Valley

Mouse over for location, Click for 360 View

The link to my first time lapse compilation is here:

The Making of a Cooled CMOS Camera – P1

As my last post suggested, I have been working on a camera design. Right now the "prototype", as I would call it, is in the test phase. The project actually dates back 3 years, to when we envisioned a large-focal-area CCD imager customized for deep-sky astrophotography. At that time, the price of such a commercial camera was prohibitive. The most suitable monochromatic chip was the interline KAI-11002, with a size of 36 x 24mm^2. Unlike full-frame CCDs, which necessitate a mechanical shutter for exposure control, interline CCDs handle this electronically. However, the addition of a shielded VCCD region greatly impacts the quantum efficiency and full well capacity. Beyond that, Kodak CCDs do not seem to recover QE well with microlenses, peaking at 50% and reaching only 30% at 650nm on a B/W device. Later on we dug deeper into the datasheet and soon abandoned the project: the dark current accumulated in the VCCD was simply too high at the slow readout speed required for a decent read noise level.

KAL-11k

The KAI-11002-ABA in the original plan

What happened next was dramatic. After getting my hands on the D7000 and hacking it, I was shocked by how well its CMOS sensor performs. I soon realized the era of the CCD in astronomy might be coming to an end; sooner or later, telescopes too will embrace the low-noise CMOS. When Kodak spun off its imaging division into Truesense, it soon released its first CMOS sensor with sub-4e- read noise and CCD-like dark current. We decided to give it a try.

KAC

With the sensor in hand, big challenges lay ahead. To speed things up, I decided to use the MicroZed SOM board as the embedded controller, at least for the prototype, so only the power supplies and a connecting PCB had to be designed. The Zynq-7010 configures the sensor over SPI through MIO pins on the ARM PS side. The data is received by the FPGA programmable logic (PL) and relayed to the PS DDR3 memory, where it can undergo complex calibration and be saved to SD card or transferred over GbE/USB.

microZed

The microZed SOM with 1GB DDR3 and various I/O

The board was then designed and fabricated, with a Socket 754 CPU socket mounting the sensor. The main PCB contains the voltage regulators, the oscillator, and the temperature sensing circuits.

Main_PCB

Stack-up

The data lines go through a relay board, which also provides power to the Zynq PL I/O banks. The whole stack was then triple-checked before applying power. After weeks of hardware and software debugging, the sensor was finally configured and running at the designated frame rate. Now it is time to work on the Verilog to receive the data; I am going to cover that in the next part.

Astrophotography in pure darkness

In Michigan, I can only see one nebula: the "Michigan Nebula". Nah, that is just a joke in the local amateur astronomy community about the frequency of cloudy nights in this state. For me, the complaint is real. I do not have an observatory for regular imaging, and packing a heavyweight EQ mount out to some dark rural site only to find clouds building up is frustrating and unacceptable. Now it seems a road trip every half year offers me better opportunities at the best dark sites in the States.

So here are some examples. During Christmas of 2013, I went to Big Bend National Park in Texas. There is absolutely no light pollution from almost any direction, except for some desert towns outside the park, and the terrain should perfectly shade that local glare.

We entered the park at dusk, but the drive to where we were staying took about an hour. The surroundings lost their colorful appearance as the last patch of sky turned completely black. The headlights of our vehicle and of passing cars prevented us from dark adapting, but when we stepped out of the car, the brilliant zodiacal light immediately caught my attention. It was so bright that even under the streetlights of a parking lot I could see it reaching 30 degrees high in the sky. The clouds kept me blinded for a day and a half; it was not until the third night that I could view it in its full majesty. Even at midnight that day, the zodiacal light was still bright on the horizon.

Zodiac Light

Zodiacal Light

This time, all the clouds moved away to the west, offering a clear night for astrophotography. I picked a spot near the park entrance to set up my tracking rig, plus another camera for time lapse. Orion's Belt was my imaging priority: in 2 hours and 40 minutes of total exposure, I was able to reveal all the dark nebulae and dust bands adjacent to the bright M42 and the Horsehead.

Orion

Meanwhile, we considered the sunset at Rio Grande Village to be the most scenic sight after 3 days of lonely driving through the desert.

Rio_grande

360 panorama – Sunset of Rio Grande

Six months later, another opportunity took me to the Mojave Desert in California. This time, I had substituted the optical glass inside the camera with one bearing an antireflection coating, so all the glare surrounding bright stars and nebula cores was gone. About 10 minutes' drive from the small desert town of Baker, I set up my AstroTrac on a sandy road in the Mojave National Preserve. It was dry and hot at such low altitude; besides the intermittent wind blowing against you, there was the occasional sound of some unknown animal sheltering in the wasteland. With the glare from Baker and the headlights of passing cars on I-15 to my north, the Rho Ophiuchi Nebula was a perfect target. Yet in the dry heat, trying to sleep inside a car was exhausting. I managed to get 100 minutes of exposure in total.

Rho Oph

The Rho Ophiuchi Cloud Complex

This time I used the hacked firmware that preserves the raw output from the sensor. With the custom calibration pipeline now developed, I can achieve perfect preprocessing before the actual alignment and stacking.

Meteor and Milky Way

A meteor captured during the time lapse that same night. Rho Ophiuchi gradually set into the light dome from southern California as my TT-320X tracked it; the background light still impacts the SNR in the dark nebulae.

Some 360 panoramas along the way, click to pan and zoom.

image

Devil's Postpile

image

At Mono Lake, I took a panorama of the sky, which proved more challenging to process. The sky was divided into 7 areas of 4 subframes each, and airglow greatly increased the sky background near the horizon that night.

Monolake

Teaser: Nikon DSLR Black Point Hack for Astrophotography

Heads up, astrophotographers: Canon no longer makes the best cameras for astrophotography image quality. Today we, the Nikon Hackers, are the first to extract the real, authentic RAW image from the Nikon D7000. This clears the last hurdle toward serious astro-imaging, especially for people doing narrowband, where the background is very dark. It also promises better bias and dark calibration.

This is an exciting moment, for me as an amateur astronomer at least. Here is a quick peek at the dark frame image.

Preview

Here is the image straight out of the camera. The DSP engine still treats 0 as the black point, hence it looks pink on the screen. The histogram also looks weird because its X-axis is gamma-corrected for the JPEG preview.

statistics

The average is now brought back to around 600 ADU, the on-sensor black level setting.

As for image quality, the Sony Exmor CMOS has far less readout noise and FPN than Canon's sensors. The dark current is also in the range of 0.15 eps stabilized at room temperature; under typical winter conditions, the dark current is so low it is comparable to cooled CCDs.

Two options are now available to get sensor data without any pre-processing. One: get the firmware patch called "True dark current". The drawback is that the camera will not use calibrated data, so Gr and Gb pixels will not be at the same conversion gain; and currently it exists only for the D5100 and D7000, as we do not have time to dig into the assembly code for other DSLR models. The second option is to get my "Dark Current Enable Tool". The downside is that it is only transient: the camera returns to normal once power-cycled or when the metering system goes to sleep.

Thus, if you have a computer during imaging and use your camera for daylight photography, the second option is best. Otherwise, if you travel like me, go for the first option and keep both firmware copies on your smartphone: copy the desired version over USB OTG and flash the camera with a charged battery.

5/17/2015 Update:

We released a new firmware patch for D5100/D7000/D800, which trades a menu entry called “Color Space” into one that can activate the original sensor data. Thus you can use your DSLR during travel for both astrophotography and daily photography without the need to flash different firmware or with a computer tether. And here’s a demo: