M27 and filters: It's the colors, Dumbbell!


Howard Ritter
 

More experimenting now that I have a functional mount and a place to leave the Sweet 16 set up. Love it!
These are some test images of the Dumbbell, with and without the 0.75x focal reducer/field flattener that Optec makes just for the Meade 16" non-ACF SCT. Unfortunately, or maybe not, it has no provision for a filter, so the first is with the Nikon D810A alone. I'm surprised by how different it turned out from the filtered images made without the FF. The colors are not constrained to what can be formed from blending Ha red and OIII blue-green, and I think they look more natural. And so many more stars! I suppose this is an effect of not losing 90+% of stars' light to a narrowband filter.

Meade 16” SCT w/Optec TC/FF, 1600GTO mount, Nikon D810A full-frame DSLR. Stack of 60 x 60s, unguided, Bortle 7, processed in PS and DeNoise. For overall quality, there's no question that the unfiltered file has the others beat.
Some of the differences between images are due to my ad-hoc and non-systematic approach to processing. I hope this will improve when I migrate to PixInsight! But not all of it: the two filtered images have much noisier, mottled backgrounds. I think that's because the filters removed most of the sky/light-pollution background, which let me squeeze the data harder to bring out fainter nebulosity. The two filtered images go deeper, but at the price of amplifying noise. Looking forward to 10x the integration time with the cooled camera!
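
For a rough sense of what that buys (back-of-the-envelope only; the rates below are made-up numbers, not measurements from these frames), the stacked SNR grows roughly as the square root of total exposure once you're sky-limited:

    import math

    def stack_snr(n_subs, sub_exp_s, obj_rate, sky_rate, dark_rate, read_noise):
        # Rough per-pixel SNR of a stacked image; rates in electrons/s/pixel,
        # read_noise in electrons per sub-exposure.
        t = n_subs * sub_exp_s
        signal = obj_rate * t
        noise = math.sqrt(signal + sky_rate * t + dark_rate * t + n_subs * read_noise**2)
        return signal / noise

    base = stack_snr(60, 60, obj_rate=0.5, sky_rate=5.0, dark_rate=0.02, read_noise=6.0)
    more = stack_snr(600, 60, obj_rate=0.5, sky_rate=5.0, dark_rate=0.02, read_noise=6.0)
    print(f"10x integration gives about {more / base:.1f}x the SNR")   # ~3.2x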

—howard

FF/TC, no filter


L-eNhance filter, no FF/TC


L-eXtreme filter, no FF/TC


Brian Kaine
 

Howard,

I like all three images, but do prefer the first unfiltered rendition of M27. The colors are
natural, and the starry background really puts the nebula into the context of space. From
an artistic standpoint, I have never cared much for false-color images; I see them as being
arbitrary and artificial. But as a former NASA research scientist, I do appreciate their great
scientific value!

And that first image brings back memories. The first serious astronomy book my parents
bought for me was "Astronomy" by Fred Hoyle; I still have it. The dust jacket is a photo
of the Dumbbell, exactly as in your image.

Thanks for posting.

Brian


Howard Ritter
 

Hi, Brian—

Thanks for the input. I was frankly surprised at how good a result I could get from an unfiltered image made in a Bortle 7 area, especially with my limited processing skills and using only Photoshop. I’d like to see what an experienced image wrangler could do with the file using their choice of software.

I’m with you on using the Hubble palette and other arbitrary color mappings. They were developed for scientific purposes, and I don’t understand the fascination with using them for images that are only made for esthetic enjoyment. I certainly don’t criticize anyone’s taste along these lines, just saying I don’t understand it. De gustibus and all that.

The only thing everybody might agree on is that the best possible results would come from going to a Bortle 1 site for a week of imaging two or three times a year, leaving the narrowband filters behind.

—howard




ROBERT WYNNE
 

I have long wondered about post-processed images and whether they are true to the eye of the beholder, or whether it's the eye of the beholder that drives how and what the imager wants his image to appear. With Photoshop you can make a photo look almost any way you want, apart from PS limitations. There doesn't seem to be anything like a NIST standard for astrophotographs. It's one reason I've held back from posting the few photos I have: I don't know whether what I've captured is how the object would realistically appear in space to the human eye. The Moon and planets are straightforward, but DSOs are another matter entirely. -Best, Robert



Stuart
 

Robert, this is a major can of worms but the best I can offer, as someone who's been banging his head in this dept, is: 

You can shoot RGB with a monochrome camera, add Luminance if you want (ideally it should span the RGB range exactly), then do one of a variety of G2V calibrations to ensure that star colours are consistent with scientific measurements. After that, the rest is the same as any daytime photo: some sharpening, some noise reduction, boost the contrast, ...
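
The G2V step amounts to scaling the channels so a sun-like star comes out neutral; a minimal sketch of the idea, with invented flux numbers:

    import numpy as np

    # Sketch only: scale the channels so a measured G2V (sun-like) star comes out
    # neutral (R = G = B). Flux numbers below are invented for illustration.
    def g2v_gains(star_r, star_g, star_b):
        return star_g / star_r, 1.0, star_g / star_b   # normalised to green

    def apply_gains(r, g, b, gains):
        kr, kg, kb = gains
        return r * kr, g * kg, b * kb

    gains = g2v_gains(15200.0, 13100.0, 11800.0)       # integrated star fluxes (ADU)
    r_img = np.full((4, 4), 0.5); g_img = np.full((4, 4), 0.5); b_img = np.full((4, 4), 0.5)
    r_cal, g_cal, b_cal = apply_gains(r_img, g_img, b_img, gains)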

BUT, whether our eyes or brains like it or not, much of the very interesting stuff up there has its own mind about what wavelengths to radiate at, so if you wanna see it you gotta have a filter designed to let it through. Now, per your comment about Photoshop (or any other software), we humans render colour photos in a handful of colour spaces, but our eyes are still those that evolved under our sun, so RGB or equivalent is where you usually land (not discounting printing, where LAB and CMYK come in). 

The two approaches I'm familiar with, and am by no means expert at, are straight-up narrowband, where you pick an order for the SII, H-Alpha and OIII to go in (SHO for Hubble fans) and you're kinda done. You can layer real RGB (G2V calibrated) over this to get star colours or not - user choice. But this mapping is entirely contrived.
OR
Blend the NB data into the RGB data close to where it really lands. For H-Alpha you'd drop it into the R (carefully) and some into the B (to represent H-Beta for which most of us don't have a filter). Then OIII straddles Green and Blue so why not drop it in both? This "enhanced" RGB can look quite realistic but you're still playing fast and loose with the mappings. SII is even more red so what to do with that? Oh, wait, I don't have an SII filter for one of my cameras. Problem solved there!
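
In code, that blend might look something like this toy sketch (weights purely illustrative, not a recipe):

    import numpy as np

    # Toy blend of narrowband into RGB, roughly where the lines fall.
    def blend_nb_into_rgb(r, g, b, ha, oiii, w_ha=0.7, w_hb=0.15, w_oiii=0.5):
        # All inputs: numpy arrays normalised to [0, 1].
        r_out = np.clip(r + w_ha * ha, 0, 1)                   # H-alpha into red
        g_out = np.clip(g + w_oiii * oiii, 0, 1)               # OIII straddles green...
        b_out = np.clip(b + w_oiii * oiii + w_hb * ha, 0, 1)   # ...and blue, plus a touch of Ha standing in for H-beta
        return r_out, g_out, b_out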




Pete Lardizabal
 

I’m someone with a color vision deficiency. Perception and analytically objective data collection are two different considerations. 

The Courts have touched on the subject of how real or correct an image is. Generally, if an image is “unaltered” (straight out of the camera) or an audit trail/edit log is available, the image is acceptable at trial. Interesting in that how the camera is set up can shift “reality”. Then consider images of evidence captured in wavelengths we humans can’t perceive, such as UV and IR. Astronomical hobbyist imagers need not worry about passing muster in the courts; however, astronomical researchers do face peer review. 

I can perceive “green”; however, my resolution of greens is very poor… before my color vision was properly tested (when I started working in Forensic labs in 1980) I couldn’t fathom why a box of Crayola crayons had so many green choices with different names but they ALL LOOKED THE SAME! 😆

Dynamic range also comes into play. I would argue few can perceive features on the surface of our Sun without filtration; conversely, how many can resolve colors in nebulae?

When processing my own images (primarily terrestrial daytime) I’ll try to image a “white” card as a reference for color balance. If I didn’t have a chance to do so, I’ll select a color temperature based on a close fit (shade, cloudy, full sun, etc.) and then season to taste. My Wife has been tested as a Superior Color Discriminator and I’ll often ask for her input with respect to color. 😉
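
The card trick boils down to something like this rough sketch (the frame and box coordinates are placeholders, not my actual workflow):

    import numpy as np

    # Rough sketch: neutralise an RGB frame using the average colour of a patch
    # covering the white/grey card.
    def balance_from_card(image, card_box):
        # image: float array (H, W, 3) in [0, 1]; card_box: (y0, y1, x0, x1).
        y0, y1, x0, x1 = card_box
        card_rgb = image[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)  # average card colour
        gains = card_rgb.mean() / card_rgb                          # per-channel gains
        return np.clip(image * gains, 0.0, 1.0)

    frame = np.random.rand(100, 100, 3)              # stand-in for a real frame
    balanced = balance_from_card(frame, (10, 30, 10, 30))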

I find the art and science of imaging so challenging and enjoyable. 

😎

Pete



W Hilmo
 

On 8/4/22 4:21 PM, ROBERT WYNNE wrote:
...I don't know whether what I've captured is how the object would realistically appear in space to the human eye. The Moon and planets are straightforward, but DSOs are another matter entirely. -Best, Robert

No processed deep sky image shows how the object would realistically appear in space to the human eye.  The human eye simply doesn't capture the really faint stuff very well.  Getting closer to another galaxy, for example, would not make it brighter, and it would not reveal color to the naked eye.  For a demonstration of this, consider the Milky Way.  We are inside of it, yet it only appears as a faint, grey band across the sky.  The Andromeda Galaxy is another good example.  Even if you got closer to it, it would mostly look like it does from a super dark site with very transparent skies.

It's reasonable to process deep sky images in a way that is "true to the data".  By that, I mean that it is possible to calibrate the colors so that they match the photons that the sensor picked up.  For narrow band, it's possible to map the data to the RGB channels to reproduce their "proper" color.  I actually try to do that with my images, but the colors just don't pop the way that they do if I exercise some artistic license.

The other thing that we do in processing that is very different from what the naked eye would see is how we stretch the brightness. The dynamic range of many deep sky targets is huge.  Your eyes are much better than a computer monitor at handling this, but in most cases, your eye would not be capable of managing the full dynamic range.  If the faintest parts of a galaxy's spiral arms were somehow bright enough to be visible to the naked eye, the core would be painfully bright.
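
As an illustration of that kind of stretch (just an example asinh curve, not any particular program's implementation):

    import numpy as np

    # Example non-linear stretch used to compress a huge dynamic range for display.
    def asinh_stretch(image, softening=0.02):
        # image: linear data normalised to [0, 1]; smaller softening = harder stretch.
        return np.arcsinh(image / softening) / np.arcsinh(1.0 / softening)

    # A pixel at 1% of full scale comes out around 10%, and one at 10% around 50%:
    print(asinh_stretch(np.array([0.01, 0.1, 1.0])))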


Brian Kaine
 

The use of color in astronomical imaging, and especially regarding DSOs, is a very complicated subject.
Even in terms of simple RGB imaging where we are trying to render objects as they “naturally” occur, there
are countless variables involved. Let’s consider the following:

The optics of our telescopes obviously affect the colors we record. Reflector vs. refractor? Different types
of glasses used in lens manufacture? Coatings? Surely we have all heard of different brands of refractors
yielding images that are warmer or cooler to the eye.

Much the same can be said for RGB filters. Which brand do you care to use? Astrodon? Astronomik?
Baader? Chroma? ZWO? They all differ to some degree in the wavelengths they pass for each channel.

What about the atmospheric conditions when we record our images? Urban or country location? Manmade
light pollution? Particulate matter in the atmosphere? Are there fires raging out west?

How about the software we use to process our images? MaxIm DL? PixInsight? Photoshop? StarTools?
Like it or not, they all influence how we do things and how they turn out.

As Pete has mentioned previously, what about our own individual eyesight? No two people perceive
color exactly the same. Even considering my own vision, I perceive colors as slightly cooler with my left
eye. Granted, it’s a subtle difference, but it’s there!

And regardless of how hard we try, we will never know if we are getting things right. Robert has it absolutely
correct; we don’t know how DSOs truly look up close in space, and getting there anytime soon to check
on it isn’t very likely. The best we can do is to make our images aesthetically pleasing, but unfortunately
"pleasing" isn't necessarily the truth.

Personally, I sometimes wonder if all of the effort I put into RGB imaging is really worth it to me. The
most inspiring astronomical images I ever saw were backlit transparencies of Messier objects at a
long-gone gallery in Chicago’s Adler Planetarium. They were monochrome, each and every one.

Perhaps one day I’ll switch to working in luminance alone; just taking nice deep images. Maybe Ansel
Adams had it right after all.

Brian


ROBERT WYNNE
 

We can rely upon you or your wife for color discrimination over the Pantone color swatch cards? They used to be the standard against which inks were compounded for color printing back in the day.

Apple has done several projects, as has Adobe (who vends Photoshop), regarding human perception vs. reality, both for DPI resolution and for color-shift discrimination by the human eye. I was part of a study that determined human visual cognition of moving objects and their direction with regard to velocity and pixel size. Thanks for the interesting post. -Best, Robert



ROBERT WYNNE
 

You confirm what I've suspected. I think that if one could travel across light-years, most objects would appear grey, much like the Moon, except for those that emit light at wavelengths in the visible spectrum. You mentioned the Andromeda Galaxy, and I have long thought those photos were very heavily processed. I had not considered that its dynamic range is beyond human vision, or how that can be so even for the wavelengths we can perceive. -Best, Robert



ROBERT WYNNE
 

Adams and Group f/64. Each photograph must be in complete focus throughout its depth of field, each photo must use the full range of light available to the photographer, and that full range must be seen in the final image. After Man Ray came along, everything went to H and few learned basic fine or technical photography. Not that I don't like the occasional photo with the subject swinging a flashlight around to record the tracings, but to me it's not reality. Perhaps I am a purist. Being new to astrophotography, this has been a lingering question since I began. My thanks to all who have responded. I've learned a lot from all of you. -Best, Robert



Pete Lardizabal
 

Hi Robert!

“We can rely upon you or your wife for color discrimination over the Pantone color swatch cards?”

Some people have gifted color vision… 

Kim (my Wife) has taken the Farnsworth Munsell 100 Hue Test on multiple occasions and has scored 0 transpositions on two tests and always scored at the “Superior” rating on other tests. I test “Low” with poor resolution ability with respect to anything “green” 🤣.

Reference Standards and Calibrations are a must for commercial work but I’m a hobbyist who shares some works with others. I will check my MacBook Pro monitor for gray scale and have Kim check the “color” here and there. 

On occasion I’ll help out as a backup photographer for a local business, but I’ll hand over the RAW files and let their calibrated monitors and certified processing pros take care of the collected images. 

Usually my at-home renderings are pretty close to theirs… but I’m just a processing neophyte who is somewhat color blind, having some fun with the hobby. 

😉

Pete




Howard Ritter
 

The way to see a faint DSO the way you’d see it from much closer in space is to use an eyepiece that gives, with a given telescope, an exit pupil that matches your own fully dilated pupil, say 6 mm. If you look at the Orion Nebula through a 12” f/5 Dob with a 31mm TermiNagler eyepiece, the exit pupil will be 31mm/5 or ~6 mm and the magnification ~50x. Since its distance is ~1350 LY, it would appear in the EP as big as it would appear from 50 times closer, as it were, or ~27 LY. And because the exit pupil of the EP matches the entrance pupil of your eye, it appears to have exactly the same surface brightness in both situations – or, indeed, the same as it appears to your dark-adapted eye in a dark-sky location. Getting closer doesn’t make it look brighter, and neither does looking at it with optical aid. They both only make it look bigger, which exactly reciprocates the increase in total light gathered. No optical system can increase the perceived areal brightness of the scene it’s focused on compared to the naked-eye view. Pity!
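
For anyone who wants to plug in their own scope, the arithmetic is simple (the numbers below are just my 12" f/5 + 31 mm example):

    # The arithmetic behind the "Starship Enterprise view".
    def exit_pupil_mm(eyepiece_fl_mm, focal_ratio):
        return eyepiece_fl_mm / focal_ratio

    def magnification(aperture_mm, focal_ratio, eyepiece_fl_mm):
        return aperture_mm * focal_ratio / eyepiece_fl_mm   # scope FL / eyepiece FL

    aperture_mm = 12 * 25.4                                # ~305 mm
    pupil = exit_pupil_mm(31.0, 5.0)                       # ~6.2 mm, about a dark-adapted eye
    mag = magnification(aperture_mm, 5.0, 31.0)            # ~49x
    print(f"exit pupil ~{pupil:.1f} mm, magnification ~{mag:.0f}x")
    print(f"M42 at ~1350 ly looks as large as from ~{1350 / mag:.0f} ly")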

When people look at a closeup view of the Moon through a large scope with a large exit pupil and exclaim that it’s blindingly bright, remind them that it’s actually less bright than a beach scene on Earth, because of the lower albedo of moondust than of beach sand. It’s just that their eyes are dark-adapted and the Moon’s brightness is overwhelming under the circumstances.

So when SF movies depict starship crews gazing on an emission nebula blazing brightly in lurid colors from up close, it’s pure artistic license. I call what you see with the eyepiece + telescope pair that gives a 6-7mm exit pupil “the Starship Enterprise view”.

I think it’s fair to speak of the true color of DSOs, since their spectrum does indeed correspond to an actual perceived color, as we would see if it were many times brighter, like the way a piece of colored clothing appears to go from grey in the moonlight to blue when you go back in the house. If the multiple variables applying to a sensor and a monitor or a printer are selected to yield a depiction of the piece of clothing that we perceive to be true-color, then the same will yield a true-color depiction of a DSO.

—howard




Jay Freeman
 

It might be illuminating (pardon me) to look at the moon in full daylight, using a telescope/eyepiece combination that produces an exit pupil the same diameter as the pupil of your eye in daylight. Keep both eyes open and do not use an eyepiece cup. See whether the moon's sunlit surface looks dark or bright then.
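
To pick the eyepiece for that experiment, exit pupil is just eyepiece focal length divided by focal ratio (an illustrative helper, with assumed numbers):

    # Which eyepiece gives a chosen exit pupil on a given scope?
    def eyepiece_for_exit_pupil(exit_pupil_mm, focal_ratio):
        return exit_pupil_mm * focal_ratio   # eyepiece focal length in mm

    print(eyepiece_for_exit_pupil(2.5, 10.0))   # e.g. ~25 mm eyepiece on an f/10 SCT for a ~2.5 mm daytime pupil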

Some DSOs of course show noticeable color when viewed through a telescope. Among those are the Ring Nebula, the Orion Nebula, and the Trifid Nebula. For the latter, ask viewers whether the two lobes show color. (Typical reaction: "No...") Then ask if the two lobes are the same color. (Typical reaction: "Ohhhhh...!")

-- Jay Reynolds Freeman
---------------------
Jay_Reynolds_Freeman@...
http://JayReynoldsFreeman.com (personal web site)


ROBERT WYNNE
 

Thanks for sheddig some light (pardon me) on the subject. -Best, Robert


ROBERT WYNNE
 

Shedding not shedig. My apologies for typing too fast there have been a number of other typos that slipped through and apologies for those as well. - Best, Robert



Barbara Harris
 

I love the unfiltered image because I love seeing it with the many background stars. BTW, I have a similar setup: a Meade LX200 16” with the Optec focal reducer on an AP1200. I have photometric filters, since most of my work is photometry.

Barbara