Re: M27 and filters: It's the colors, Dumbbell!


Howard Ritter
 

The way to see a faint DSO the way you’d see it from much closer in space is to use an eyepiece that gives, with a given telescope, an exit pupil that matches your own fully dilated pupil, say 6 mm. If you look at the Orion Nebula through a 12” f/5 Dob with a 31mm TermiNagler eyepiece, the exit pupil will be 31mm/5 or ~6 mm and the magnification ~50x. Since its distance is ~1350 LY, it would appear in the EP as big as it would appear from 50 times closer, as it were, or ~27 LY. And because the exit pupil of the EP matches the entrance pupil of your eye, it appears to have exactly the same surface brightness in both situations – or, indeed, the same as it appears to your dark-adapted eye in a dark-sky location. Getting closer doesn’t make it look brighter, and neither does looking at it with optical aid. They both only make it look bigger, which exactly reciprocates the increase in total light gathered. No optical system can increase the perceived areal brightness of the scene it’s focused on compared to the naked-eye view. Pity!
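For anyone who wants to check the arithmetic, here is a minimal Python sketch of the bookkeeping above, using the same rough figures (12" f/5, 31 mm eyepiece, M42 at ~1350 light-years). The eye-pupil value and the tidy cancellation describe the idealized matched-pupil case, not measurements.

# Back-of-the-envelope check of the exit-pupil / magnification figures above.
# Assumed inputs: 12" f/5 Dob, 31 mm eyepiece, M42 at ~1350 light-years.
aperture_mm = 12 * 25.4          # ~305 mm
focal_ratio = 5
eyepiece_fl_mm = 31
distance_ly = 1350

focal_length_mm = aperture_mm * focal_ratio          # ~1524 mm
magnification = focal_length_mm / eyepiece_fl_mm     # ~49x
exit_pupil_mm = eyepiece_fl_mm / focal_ratio         # ~6.2 mm
effective_distance_ly = distance_ly / magnification  # ~27 ly, the "Enterprise view"

# Surface-brightness bookkeeping: the light gathered scales with (aperture / eye pupil)^2,
# but the image is spread over an area that scales with magnification^2. When the exit
# pupil equals the eye pupil, magnification = aperture / eye_pupil, so the two factors
# cancel and the areal brightness matches the naked-eye view.
eye_pupil_mm = exit_pupil_mm
light_gain = (aperture_mm / eye_pupil_mm) ** 2
area_gain = magnification ** 2
print(f"magnification ~{magnification:.0f}x, exit pupil {exit_pupil_mm:.1f} mm")
print(f"effective distance ~{effective_distance_ly:.0f} ly")
print(f"light gain {light_gain:.0f}x vs. area gain {area_gain:.0f}x (ratio {light_gain/area_gain:.2f})")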

When people look at a closeup view of the Moon through a large scope with a large exit pupil and exclaim that it’s blindingly bright, remind them that it’s actually less bright than a beach scene on Earth, because of the lower albedo of moondust than of beach sand. It’s just that their eyes are dark-adapted and the Moon’s brightness is overwhelming under the circumstances.
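As a rough sanity check on that claim, here is a tiny Python comparison using assumed, typical albedo values (roughly 0.12 for lunar regolith and 0.4 for dry beach sand; the exact figures vary by source):

# Rough comparison of sunlit moondust vs. a sunlit beach under the same solar
# illumination. The albedo values are illustrative, not measured data.
moon_albedo = 0.12   # lunar regolith is quite dark
sand_albedo = 0.40   # dry, light-colored beach sand

# Reflected brightness scales roughly with albedo for the same illumination.
ratio = sand_albedo / moon_albedo
print(f"A beach scene is roughly {ratio:.1f}x brighter than sunlit moondust.")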

So when SF movies depict starship crews gazing on an emission nebula blazing brightly in lurid colors from up close, it’s pure artistic license. I call what you see with the eyepiece + telescope pair that gives a 6-7mm exit pupil “the Starship Enterprise view”.

I think it’s fair to speak of the true color of DSOs, since their spectrum does indeed correspond to an actual perceived color, as we would see it if it were many times brighter – like the way a piece of colored clothing appears to go from grey in the moonlight to blue when you go back in the house. If the multiple variables of a sensor and a monitor or printer are selected to yield a depiction of the piece of clothing that we perceive as true-color, then the same settings will yield a true-color depiction of a DSO.
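As a loose illustration of "the spectrum corresponds to a perceivable color," here is a small Python sketch that maps a few common nebular emission lines to approximate display colors. It uses a simple piecewise wavelength-to-RGB approximation rather than a proper CIE transform, so the hues are only indicative:

# Crude illustration: map a DSO's emission lines to approximate display colors.
# This is a piecewise wavelength-to-RGB approximation, not a rigorous
# CIE XYZ -> sRGB transform, so the output only hints at the perceived hue.

def wavelength_to_rgb(nm):
    """Approximate RGB (0..1) for a visible wavelength in nanometres."""
    if 380 <= nm < 440:
        r, g, b = -(nm - 440) / 60, 0.0, 1.0
    elif 440 <= nm < 490:
        r, g, b = 0.0, (nm - 440) / 50, 1.0
    elif 490 <= nm < 510:
        r, g, b = 0.0, 1.0, -(nm - 510) / 20
    elif 510 <= nm < 580:
        r, g, b = (nm - 510) / 70, 1.0, 0.0
    elif 580 <= nm < 645:
        r, g, b = 1.0, -(nm - 645) / 65, 0.0
    elif 645 <= nm <= 780:
        r, g, b = 1.0, 0.0, 0.0
    else:
        r, g, b = 0.0, 0.0, 0.0
    return (r, g, b)

# Common nebular emission lines (wavelengths in nm).
for name, nm in [("H-alpha", 656.3), ("[O III]", 500.7), ("H-beta", 486.1)]:
    print(name, nm, "nm ->", tuple(round(c, 2) for c in wavelength_to_rgb(nm)))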

—howard


On Aug 5, 2022, at 5:25 PM, ROBERT WYNNE <robert-wynne@...> wrote:

You confirm what I've suspected. I think that if one could travel light-years, most objects would appear grey, much like the moon, except for those objects that emit wavelengths of light in the visible spectrum. You mentioned the Andromeda Galaxy, and I have long thought the photos were very heavily processed. I had not considered that its dynamic range was beyond human vision, or how that is possible for those wavelengths within human perception. -Best, Robert
On 08/04/2022 8:32 PM W Hilmo <y.groups@...> wrote:


On 8/4/22 4:21 PM, ROBERT WYNNE wrote:
...I don't know if the image I've captured is how the object would realistically appear in space to the human eye. The moon and planets are straightforward, but DSOs are entirely another matter. -Best, Robert

No processed deep sky image shows how the object would realistically appear in space to the human eye.  The human eye simply doesn't capture the really faint stuff very well.  Getting closer to another galaxy, for example, would not make it brighter, and it would not reveal color to the naked eye.  For a demonstration of this, consider the Milky Way.  We are inside of it, yet it only appears as a faint, grey band across the sky.  The Andromeda Galaxy is another good example.  Even if you got closer to it, it would mostly look like it does from a super dark site with very transparent skies.

It's reasonable to process deep sky images in a way that is "true to the data".  By that, I mean that it is possible to calibrate the colors so that they match the photons that the sensor picked up.  For narrow band, it's possible to map the data to the RGB channels to reproduce their "proper" color.  I actually try to do that with my images, but the colors just don't pop the way that they do if I exercise some artistic license.
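To make the narrowband point concrete, here is a minimal sketch of one common style of mapping, assuming two calibrated, normalized frames named ha and oiii; the HOO-style channel assignment and weights are illustrative choices, not a fixed standard:

import numpy as np

# Sketch of a narrowband-to-RGB mapping in the spirit described above: assign the
# H-alpha signal to red and [O III] to green and blue ("HOO"), roughly matching the
# lines' visual hues. The channel weights here are illustrative, not a standard.

def hoo_to_rgb(ha, oiii):
    """ha, oiii: 2-D float arrays, already calibrated and normalized to 0..1."""
    rgb = np.dstack([
        ha,            # red   <- H-alpha (~656 nm, deep red)
        oiii,          # green <- [O III] (~501 nm, teal)
        0.85 * oiii,   # blue  <- mostly [O III], slightly down-weighted
    ])
    return np.clip(rgb, 0.0, 1.0)

# Tiny synthetic example:
ha = np.random.rand(4, 4)
oiii = np.random.rand(4, 4)
print(hoo_to_rgb(ha, oiii).shape)  # (4, 4, 3)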

The other thing that we do in processing that is very different from what the naked eye would see is how we stretch the brightness. The dynamic range of many deep sky targets is huge.  Your eyes are much better than a computer monitor at handling this, but in most cases, your eye would not be capable of managing the full dynamic range.  If the faintest parts of a galaxy's spiral arms were somehow bright enough to be visible to the naked eye, the core would be painfully bright.
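As an illustration of that kind of stretch, here is a minimal numpy sketch of an asinh stretch, one common way to compress the dynamic range; the stretch factor is an arbitrary knob chosen for the example:

import numpy as np

# An asinh stretch compresses a huge dynamic range so faint spiral arms and a
# bright core can share one display. The "stretch" factor is an illustrative knob.

def asinh_stretch(img, stretch=500.0):
    """img: float array normalized to 0..1; larger stretch lifts the faint end more."""
    return np.arcsinh(stretch * img) / np.arcsinh(stretch)

# A galaxy core can be thousands of times brighter than the outer arms:
linear = np.array([1.0, 0.01, 0.001])   # core, inner arm, faint outer arm
print(asinh_stretch(linear))            # faint values are pulled up toward the core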
