Does it look like that when you look in the telescope?
Many people wonder if the view through the eyepiece looks like the images presented here. The answer is simple.
Do you see the same amount of detail across the room in candlelight as you do in bright sunlight? No.
What you're seeing here is a similar scenario. The camera being used is capable of time exposures up to several MINUTES in length. Your eye, on the other hand, can become dark adapted, but it is still much less sensitive, especially to color at low light levels (the color receptors in the eye do not function in dim light).
A significantly less impressive approximation of a visual view (12mm eyepiece)
Even the camera itself has its flaws that must be overcome:
The first problem with CCDs is "hot pixels". These generate overexposed points in the image data - tiny, abnormally bright pixels of red, green, blue, or any combination thereof. They're like individual pixel receptors that are hypersensitive to light. They can be inherent to the CCD, or the result of the CCD's temperature. Either way, they can be eliminated from the "true data" by creating a "dark frame" - a picture of the same duration, at the same temperature, of total darkness (through the scope with the protective cap in place) - which is then subtracted from each data image, leaving the "real data".
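The dark-frame subtraction described above can be sketched in a few lines. This is a toy simulation with made-up numbers (the frame size, signal level, and hot-pixel values are all hypothetical), not the actual camera's calibration pipeline, but the arithmetic is the same one applied to real frames:

```python
import numpy as np

# Hypothetical 8x8 "light" frame: uniform real signal plus two hot pixels.
signal = np.full((8, 8), 100.0)      # the real sky data
hot = np.zeros((8, 8))
hot[2, 3] = 5000.0                   # hot pixels repeat identically
hot[6, 1] = 4200.0                   # in every exposure at this temperature
light = signal + hot                 # what the CCD actually records

# Dark frame: same duration and temperature, cap on, so it records
# only the hot pixels and thermal signal - no sky data.
dark = hot.copy()

# Subtracting the dark frame leaves the "real data".
calibrated = light - dark
print(calibrated.max())              # 100.0 - the hot pixels are gone
```

In practice the dark frame is itself an average of several exposures, so its own random noise doesn't get added back into the calibrated image.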
A single (noisy) 20-second exposure frame from the camera
The CCD chip records random "noise" along with the image data - kind of like taking a picture through a rained-on windshield. The longer the exposure, the more noise (rain) is captured. This noise has to be removed from the image to leave the "real data". Fortunately, this is relatively easy to do, since the noise is truly random. All we have to do is take a NUMBER of pictures and average the data. The noise then effectively disappears in the resulting average, since it's always random. The greater our sampling of pictures, the lower the noise, and the better the quality of the final image. This is not a linear relationship - the noise falls with the square root of the number of frames, so averaging 46 frames cuts the noise by a factor of almost 7.
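A quick simulation shows the square-root relationship at work. The frame count, noise level, and image size here are hypothetical stand-ins for real exposures:

```python
import numpy as np

rng = np.random.default_rng(42)
true_value = 50.0

# 46 hypothetical frames of a flat scene, each with random
# noise of standard deviation 10 added by the "camera".
frames = true_value + rng.normal(0.0, 10.0, size=(46, 64, 64))

single = frames[0]                 # one noisy exposure
stacked = frames.mean(axis=0)      # average of all 46

print(single.std())                # roughly 10
print(stacked.std())               # roughly 10 / sqrt(46), about 1.5
```

The stacked image's noise is near 10/&#8730;46, so each additional frame helps, but with diminishing returns.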
"Tracking" of the telescope can be a problem with longer exposure images - showing up as image
drift or "star trails". This can be a real challenge to get rid of - an accurate polar alignment and either good tracking gears
(low "periodic error") or guiding (either visually or automatically with a computer) are essential to trail-free imaging. These "bad frames"
are de-selected, and the good frames are used and averaged together or "stacked".
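De-selecting the trailed frames and stacking the rest can be sketched as below. The per-frame sharpness scores and the 0.5 cutoff are purely hypothetical (in real processing you might score frames by star FWHM or eccentricity), but the select-then-average logic is the same:

```python
import numpy as np

rng = np.random.default_rng(1)
frames = rng.normal(100.0, 5.0, size=(4, 16, 16))  # four hypothetical exposures

# Hypothetical per-frame sharpness scores; frame 2 trailed badly.
scores = np.array([0.90, 0.85, 0.20, 0.88])
good = scores > 0.5                # threshold chosen for illustration

# Stack only the good frames - the "bad frames" are simply dropped.
stacked = frames[good].mean(axis=0)
print(int(good.sum()))             # 3 frames kept
```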
46 frames averaged together yields this result
The last step in imaging is essentially balancing the image. This can include boosting contrast or brightness
(a normal function of a camera exposure) to make up for imaging shortfalls, adjusting color to correct for off-balance
CCD data (a normal function of photographic processing), or resizing (also available for normal photos). The end result
is still REAL data, but it's displayed in a way that essentially maximizes the sensitivity of the camera AFTER the fact.
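The balancing step can be sketched as a per-channel color correction plus a contrast stretch. The image, the red-channel cast, and the choice of a "white star" reference pixel are all hypothetical, but the idea matches the text: scale the channels so white stars come out white, then stretch the range after the fact:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical stacked RGB image with a red cast from off-balance CCD channels.
img = rng.uniform(0.1, 0.6, size=(32, 32, 3))
img[..., 0] *= 1.3                         # red channel runs hot

# Color balance: scale each channel so a reference "white star"
# pixel comes out neutral - white stars stay white.
ref = img[5, 5].copy()
balanced = img * (ref.mean() / ref)

# Contrast/brightness: stretch the data range onto 0..1 after the fact.
lo, hi = balanced.min(), balanced.max()
stretched = (balanced - lo) / (hi - lo)
```

No pixel values are invented here - every output value is a rescaling of the real data, which is why the result still counts as REAL data.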
The final image after sensitivity adjustments (yet keeping white stars white for proper color mix)
Using all these properly, we end up with HIGHER SENSITIVITY to light, and especially color, than the human eye can ever achieve.