
Summer Skies gifted with Solar Prominence

Jeff Ball
 

Hello APUG,

Our family vacationed in Surfside Beach, SC this past week. I had envisioned a video production celebrating the “Summer Sky”. I planned on capturing the beauty of the sky with clouds, stars, the Milky Way, the Sun, and the Moon. I brought my Astro-Physics Stowaway and Daystar Quark Chromosphere eyepiece along for hydrogen-alpha views and images of the Sun. Boy, was I in for a treat. I viewed/imaged the Sun every morning and was taken aback by the views on August 3. The prominence was fantastic and lingered well into the next day. Below are the images and links to the video production. I will soon have a video reviewing and discussing viewing/imaging with the Daystar Quark Chromosphere eyepiece. So far, it has been a blast since I picked it up in December. Clear skies, and thanks for stopping by.

Link to blog with images and video

Astrobin

Sincerely,
Jeff Ball


Spacing options?

Brent Boshart
 

My camera setup is a ZWO 2600MM camera bolted to the ZWO EFW 7x36, bolted to a ZWO OAG-L, bolted to the ZWO tilt adapter. From the tilt adapter I have a PreciseParts adapter attached to the 92FF on my Stowaway. I needed to move the sensor further away, so I put spacers between the ZWO tilt adapter and the PP adapter. Unfortunately I need just a little more length, but I am not comfortable adding any more spacing between the tilt adapter and the PP adapter, as there would not be many threads left engaged. Has anyone used spacers between the 92FF and the PP adapter? Is that possible? I am trying to avoid re-purchasing the PP adapter - it should have worked, but there must be some variability in the 2600MM's specs?
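One way to reason about questions like this is to write the whole spacing budget down before ordering parts. A throwaway Python sketch of the idea (every number below is a placeholder for illustration, not a real spec for these parts; substitute each component's published optical length):

```python
# Hypothetical back-focus budget for an imaging train (all lengths in mm).
# All values are placeholders -- check each part's published optical length.
train = {
    "camera, sensor to flange": 17.5,  # placeholder
    "filter wheel": 20.0,              # placeholder
    "off-axis guider": 16.5,           # placeholder
    "tilt adapter": 5.0,               # placeholder
    "custom adapter": 6.0,             # placeholder
}

required = 66.4  # placeholder: the flattener's specified back focus, mm

used = sum(train.values())
shortfall = required - used  # positive means more spacing is needed
print(f"stack length: {used:.1f} mm, shortfall: {shortfall:+.1f} mm")
```

With the budget written out, it is obvious where in the stack any extra millimeter has to come from, and whether a tolerance on one component could plausibly explain the discrepancy.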


Re: M27 and filters: It's the colors, Dumbbell!

ROBERT WYNNE
 

Shedding, not shedig. My apologies for typing too fast; there have been a number of other typos that slipped through, and apologies for those as well. - Best, Robert

On 08/05/2022 5:20 PM ROBERT WYNNE <robert-wynne@...> wrote:


Thanks for sheddig some light (pardon me) on the subject. -Best, Robert
On 08/05/2022 5:15 PM Jay Freeman via groups.io <jay_reynolds_freeman@...> wrote:


It might be illuminating (pardon me) to look at the moon in full daylight, using a telescope/eyepiece combination that produces an exit pupil the same diameter as the pupil of your eye in daylight. Keep both eyes open and do not use an eyepiece cup. See whether the moon's sunlit surface looks dark or bright then.

Some DSOs of course show noticeable color when viewed through a telescope. Among those are the Ring Nebula, the Orion Nebula, and the Trifid Nebula. For the latter, ask viewers whether the two lobes show color. (Typical reaction: "No...") Then ask if the two lobes are the same color. (Typical reaction: "Ohhhhh...!")

-- Jay Reynolds Freeman
---------------------
Jay_Reynolds_Freeman@...
http://JayReynoldsFreeman.com (personal web site)






Re: M27 and filters: It's the colors, Dumbbell!

ROBERT WYNNE
 

Thanks for sheddig some light (pardon me) on the subject. -Best, Robert

On 08/05/2022 5:15 PM Jay Freeman via groups.io <jay_reynolds_freeman@...> wrote:


It might be illuminating (pardon me) to look at the moon in full daylight, using a telescope/eyepiece combination that produces an exit pupil the same diameter as the pupil of your eye in daylight. Keep both eyes open and do not use an eyepiece cup. See whether the moon's sunlit surface looks dark or bright then.

Some DSOs of course show noticeable color when viewed through a telescope. Among those are the Ring Nebula, the Orion Nebula, and the Trifid Nebula. For the latter, ask viewers whether the two lobes show color. (Typical reaction: "No...") Then ask if the two lobes are the same color. (Typical reaction: "Ohhhhh...!")

-- Jay Reynolds Freeman
---------------------
Jay_Reynolds_Freeman@...
http://JayReynoldsFreeman.com (personal web site)





Re: Vikas Chander APOD

ROBERT WYNNE
 

I think we agree on this point. There may even be a million ways to render this image. Some edit digital images with 16 million different colors down to the pixel to suit their own taste, even eliminating what are regarded as "unwanted" pixels. The beauty of one version of an image over another only proves the old saw that beauty is in the eye of the beholder. But my question remains: is the captured image true to what the human eye would actually see, within reasonably close parameters, or is it a matter of individual subjective taste gone wild? In the medical industry there are a lot of ways to enhance photos of certain medical conditions to arrive at a precise diagnosis, especially with staining and contrast media, and yet only one way to arrive at a realistic photo, what one post referred to as a legally accurate photograph. Perhaps a comparison to X-ray imaging is cleaner. A radiologist interpreting an X-ray really has only one variable to work with, contrast, along with light and dark, since B&W is the only palette in that form of imaging. If astrographs are allowed the full palette of variables in color, we are much like sheep who have lost their way: shown only a "finished, processed image," we don't know much about the original data. I agree with those who have posted that using filters for specific bandwidths, where the color of each bandwidth is known, is the only way to produce true-to-the-original images, particularly if the time spent with each filter is compensated for. -Best, Robert

On 08/05/2022 3:59 PM Roland Christen via groups.io <chris1011@...> wrote:


There must be 1000 ways to present or take this data. Here's my version taken with a small chip CCD:
https://www.astrobin.com/full/bjp0m9/B/
https://www.astrobin.com/full/bjp0m9/B/?mod=&real=

Rolando


-----Original Message-----
From: ROBERT WYNNE <robert-wynne@...>
To: main@ap-ug.groups.io; Roland Christen via groups.io <chris1011@...>; main@ap-gto.groups.io <main@ap-gto.groups.io>
Sent: Fri, Aug 5, 2022 4:46 pm
Subject: Re: [ap-ug] Vikas Chander APOD

That is special and gorgeous. I have to wonder how many ways one could process data for this image and produce a different but equally gorgeous photo? -Best, Robert
On 08/05/2022 11:27 AM Roland Christen via groups.io <chris1011@...> wrote:


Congrats to Mr. Chander for his gorgeous image of the Trifid nebula. https://apod.nasa.gov/apod/ap220805.html

Taken with his 150 TOA refractor on a 1600AE mount.

Rolando


Re: M27 and filters: It's the colors, Dumbbell!

Jay Freeman
 

It might be illuminating (pardon me) to look at the moon in full daylight, using a telescope/eyepiece combination that produces an exit pupil the same diameter as the pupil of your eye in daylight. Keep both eyes open and do not use an eyepiece cup. See whether the moon's sunlit surface looks dark or bright then.

Some DSOs of course show noticeable color when viewed through a telescope. Among those are the Ring Nebula, the Orion Nebula, and the Trifid Nebula. For the latter, ask viewers whether the two lobes show color. (Typical reaction: "No...") Then ask if the two lobes are the same color. (Typical reaction: "Ohhhhh...!")

-- Jay Reynolds Freeman
---------------------
Jay_Reynolds_Freeman@...
http://JayReynoldsFreeman.com (personal web site)


Re: M27 and filters: It's the colors, Dumbbell!

Howard Ritter
 

The way to see a faint DSO the way you’d see it from much closer in space is to use an eyepiece that gives, with a given telescope, an exit pupil that matches your own fully dilated pupil, say 6 mm. If you look at the Orion Nebula through a 12” f/5 Dob with a 31mm TermiNagler eyepiece, the exit pupil will be 31mm/5, or ~6.2 mm, and the magnification ~50x. Since its distance is ~1350 LY, it would appear in the EP as big as it would appear from 50 times closer, as it were, or ~27 LY. And because the exit pupil of the EP matches the entrance pupil of your eye, it appears to have exactly the same surface brightness in both situations – or, indeed, the same as it appears to your dark-adapted eye in a dark-sky location. Getting closer doesn’t make it look brighter, and neither does looking at it with optical aid. They both only make it look bigger, which exactly reciprocates the increase in total light gathered. No optical system can increase the perceived areal brightness of the scene it’s focused on compared to the naked-eye view. Pity!
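The arithmetic in the paragraph above checks out, and is easy to verify in a few lines of Python (using only the numbers from the example: exit pupil = eyepiece focal length / focal ratio, magnification = telescope focal length / eyepiece focal length):

```python
# Exit pupil and magnification for the example above:
# a 12" f/5 Dob with a 31 mm eyepiece.
aperture_mm = 12 * 25.4   # ~304.8 mm
focal_ratio = 5.0
eyepiece_mm = 31.0

focal_length_mm = aperture_mm * focal_ratio      # ~1524 mm
exit_pupil_mm = eyepiece_mm / focal_ratio        # ~6.2 mm
magnification = focal_length_mm / eyepiece_mm    # ~49x

# Equivalently, magnification = aperture / exit pupil,
# which is why a pupil-matched exit pupil fixes the surface brightness.
assert abs(magnification - aperture_mm / exit_pupil_mm) < 1e-6
print(f"exit pupil {exit_pupil_mm:.1f} mm, magnification {magnification:.0f}x")
```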

When people look at a closeup view of the Moon through a large scope with a large exit pupil and exclaim that it’s blindingly bright, remind them that it’s actually less bright than a beach scene on Earth, because of the lower albedo of moondust than of beach sand. It’s just that their eyes are dark-adapted and the Moon’s brightness is overwhelming under the circumstances.

So when SF movies depict starship crews gazing on an emission nebula blazing brightly in lurid colors from up close, it’s pure artistic license. I call what you see with the eyepiece + telescope pair that gives a 6-7mm exit pupil “the Starship Enterprise view”.

I think it’s fair to speak of the true color of DSOs, since their spectrum does indeed correspond to an actual perceived color, as we would see if it were many times brighter, like the way a piece of colored clothing appears to go from grey in the moonlight to blue when you go back in the house. If the multiple variables applying to a sensor and a monitor or a printer are selected to yield a depiction of the piece of clothing that we perceive to be true-color, then the same will yield a true-color depiction of a DSO.

—howard


On Aug 5, 2022, at 5:25 PM, ROBERT WYNNE <robert-wynne@...> wrote:

You confirm what I've suspected. I think that if one could travel light-years, most objects would appear grey, much like the moon, except for those objects that emit wavelengths of light in the visible spectrum. You mentioned the Andromeda Galaxy, and I have long thought the photos were very heavily processed. I had not considered that its dynamic range was beyond human vision, or how that is possible for those wavelengths within human perception. -Best, Robert
On 08/04/2022 8:32 PM W Hilmo <y.groups@...> wrote:


On 8/4/22 4:21 PM, ROBERT WYNNE wrote:
...I don't know if the image I've captured is how the object would realistically appear in space to the human eye. The moon and planets are straightforward, but DSOs are entirely another matter. -Best, Robert

No processed deep sky image shows how the object would realistically appear in space to the human eye.  The human eye simply doesn't capture the really faint stuff very well.  Getting closer to another galaxy, for example, would not make it brighter, and it would not reveal color to the naked eye.  For a demonstration of this, consider the Milky Way.  We are inside of it, yet it only appears as a faint, grey band across the sky.  The Andromeda Galaxy is another good example.  Even if you got closer to it, it would mostly look like it does from a super dark site with very transparent skies.

It's reasonable to process deep sky images in a way that is "true to the data".  By that, I mean that it is possible to calibrate the colors so that they match the photons that the sensor picked up.  For narrow band, it's possible to map the data to the RGB channels to reproduce their "proper" color.  I actually try to do that with my images, but the colors just don't pop the way that they do if I exercise some artistic license.

The other thing that we do in processing that is very different from what the naked eye would see is how we stretch the brightness. The dynamic range of many deep sky targets is huge.  Your eyes are much better than a computer monitor at handling this, but in most cases, your eye would not be capable of managing the full dynamic range.  If the faintest parts of a galaxy's spiral arms were somehow bright enough to be visible to the naked eye, the core would be painfully bright.
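A minimal sketch of the kind of brightness stretch described above, assuming an asinh curve (one common choice among many; the softening value here is arbitrary). It compresses a huge dynamic range so faint spiral arms and a bright core can share one display:

```python
import numpy as np

def asinh_stretch(img, softening=0.01):
    """Map linear data in [0, 1] to [0, 1] along an asinh curve.

    Small values are boosted strongly; values near 1 are compressed,
    so faint and bright features become visible simultaneously.
    """
    return np.arcsinh(img / softening) / np.arcsinh(1.0 / softening)

# Illustrative pixel values spanning 10,000:1, faint arms ... bright core.
linear = np.array([1e-4, 1e-3, 1e-2, 1e-1, 1.0])
stretched = asinh_stretch(linear)
print(np.round(stretched, 3))
```

The stretch is monotonic (brightness ordering is preserved), which is what makes it defensible as "true to the data" even though it is nothing like what the eye would see.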


Re: Vikas Chander APOD

Ross Elkins
 

Rolando,

Yup, that's what I remembered; my memory is not totally gone!!! I may not remember why I walked into the bedroom, but I never forget great Astro art!


Ross


Re: Vikas Chander APOD

Roland Christen
 

Yes, I just posted the link.

Rolando

-----Original Message-----
From: Ross Elkins <rossmon1@...>
To: main@ap-ug.groups.io
Sent: Fri, Aug 5, 2022 5:15 pm
Subject: Re: [ap-ug] Vikas Chander APOD

That’s gorgeous!

Rolando, didn't you do one that was also spectacular, possibly in Yellow?

Ross






Re: Vikas Chander APOD

Roland Christen
 

There must be 1000 ways to present or take this data. Here's my version taken with a small chip CCD:
https://www.astrobin.com/full/bjp0m9/B/
https://www.astrobin.com/full/bjp0m9/B/?mod=&real=

Rolando


-----Original Message-----
From: ROBERT WYNNE <robert-wynne@...>
To: main@ap-ug.groups.io; Roland Christen via groups.io <chris1011@...>; main@ap-gto.groups.io <main@ap-gto.groups.io>
Sent: Fri, Aug 5, 2022 4:46 pm
Subject: Re: [ap-ug] Vikas Chander APOD

That is special and gorgeous. I have to wonder how many ways one could process data for this image and produce a different but equally gorgeous photo? -Best, Robert
On 08/05/2022 11:27 AM Roland Christen via groups.io <chris1011@...> wrote:


Congrats to Mr. Chander for his gorgeous image of the Trifid nebula. https://apod.nasa.gov/apod/ap220805.html

Taken with his 150 TOA refractor on a 1600AE mount.

Rolando


Re: Vikas Chander APOD

Bill Long
 

Very nice images.

Interesting that his image scored an APOD, yet not even a nomination for IOTD on ABin... Oh well. 


From: main@ap-ug.groups.io <main@ap-ug.groups.io> on behalf of Roland Christen via groups.io <chris1011@...>
Sent: Friday, August 5, 2022 1:43 PM
To: main@ap-ug.groups.io <main@ap-ug.groups.io>
Subject: Re: [ap-ug] Vikas Chander APOD
 
He has the eye of a professional. More images on his website: https://www.vikaschander.com/

Rolando

-----Original Message-----
From: Arun <arun.k.hegde@...>
To: main@ap-ug.groups.io
Sent: Fri, Aug 5, 2022 3:35 pm
Subject: Re: [ap-ug] Vikas Chander APOD

I have been following Vikas Chander on Astrobin. His widefield images are absolutely spectacular, probably the best I have seen. See, for example, this one:

https://www.astrobin.com/hn5dj3/

Among many other standout images.




On Fri, Aug 5, 2022 at 02:27 PM, Roland Christen wrote:
https://apod.nasa.gov/apod/ap220805.html


Re: Stowaway, 92TCC and corner stars

Bill Long
 

The Astronomik filters have been great: the Deep Sky RGB line, the Luminance (L2 in my case), and the MaxFR line. I have not used any other filters of theirs, though. I think there are other brands that are 1mm thick that have all sorts of halo and field-artifact problems, mostly with their narrowband filters.

I have a set of the 3mm thick Chroma filters as well, and I have not seen much of a quality difference. Those are 3nm narrowband filters, which have some benefits over the 6nm ones in some cases. The color filters, I see no difference at all with.


From: main@ap-ug.groups.io <main@ap-ug.groups.io> on behalf of Chris White <chris.white@...>
Sent: Friday, August 5, 2022 3:13 PM
To: main@ap-ug.groups.io <main@ap-ug.groups.io>
Subject: Re: [ap-ug] Stowaway, 92TCC and corner stars
 
The good news is that all of the problems you've seen in 1mm thick filters were eliminated with the newest Astronomik line.

Bill is using these filters with his Epsilon and shared a recent IC 1396 image he made. I'd challenge anyone to find anything deficient in the data that is filter related.

He also shared a broadband image made with these newer astronomik filters and the same scope. Again, no filter artifacts. 

Sorry for the tangent. 

Ultimately it was to make the point that for a 1mm thick filter you only need to add 0.33 mm of additional spacing. 


Re: Vikas Chander APOD

Ross Elkins
 

That’s gorgeous!

Rolando, didn't you do one that was also spectacular, possibly in Yellow?

Ross


Re: M27 and filters: It's the colors, Dumbbell!

Pete Lardizabal
 

On Aug 5, 2022, at 6:10 PM, Pete Lardizabal <p14@...> wrote:


Hi Robert!

“We can rely upon you or your wife for color discrimination over the Pantone color swatch cards?”

Some people have gifted color vision… 

Kim (my Wife) has taken the Farnsworth Munsell 100 Hue Test on multiple occasions and has scored 0 transpositions on two tests and always scored at the “Superior” rating on other tests. I test “Low” with poor resolution ability with respect to anything “green” 🤣.

Reference Standards and Calibrations are a must for commercial work but I’m a hobbyist who shares some works with others. I will check my MacBook Pro monitor for gray scale and have Kim check the “color” here and there. 

On occasion I’ll help out as a backup photographer for a local business but I’ll hand over the RAW files and let them process the files with all the calibrated monitors and certified processing pros to take care of the collected images. 

Usually my at home renderings are pretty close to theirs… but I’m just a processing neophyte who is somewhat color blind having some fun with the hobby. 

😉

Pete

On Aug 5, 2022, at 5:14 PM, ROBERT WYNNE <robert-wynne@...> wrote:


We can rely upon you or your wife for color discrimination over the Pantone color swatch cards? They used to be the standard against which inks were compounded for color printing back in the day.

Apple has done several projects, as has Adobe, who vends Photoshop, regarding human perception vs. reality, both for DPI resolution and color-shift discrimination by the human eye. I was part of a study that determined human visual cognition of moving objects and their direction with respect to velocity and pixel size. Thanks for the interesting post. -Best, Robert
On 08/04/2022 6:01 PM Pete Lardizabal <p14@...> wrote:



I’m someone with a color vision deficiency. Perception versus analytically objective data collection are two different considerations. 

The Courts have touched on the subject of how real or correct an image is. Generally, if an image is “unaltered” (straight out of the camera) or an audit trail/edit log is available, the image is acceptable at trial. Interesting in that how the camera is set up can shift “reality”. Then consider images of evidence captured in wavelengths we humans can’t perceive, such as UV and IR. Astronomical hobbyist imagers need not worry about passing muster with the Courts; however, astronomical researchers face peer review. 

I can perceive “green”; however, my resolution of greens is very poor… before my color vision was properly tested (when I started working in Forensic labs in 1980) I couldn’t fathom why a box of Crayola crayons had so many green choices with different names but they ALL LOOKED THE SAME! 😆

Dynamic range also comes into play. I would argue few can perceive features on the surface of our Sun without filtration, and conversely, how many can resolve colors in nebulae?

When processing my own images (primarily terrestrial daytime) I’ll try and image a “white” card as a reference for color balance. If I didn’t have a chance to do so I’ll try and select temps based on a close fit (shade, cloudy, full sun, etc) and then season to taste. My Wife has been tested as a Superior Color Discriminator and I’ll often ask for her input with respect to color. 😉

I find the art and science of imaging so challenging and enjoyable. 

😎

Pete

On Aug 4, 2022, at 7:21 PM, ROBERT WYNNE <robert-wynne@...> wrote:

I have long wondered about post-processed images and whether they are true to the eye of the beholder, or if it's the eye of the beholder that drives how and what the imager wants his image to be. With Photoshop you can make any photo appear as one would want, apart from PS limitations. There doesn't seem to be anything like a NIST standard for astrophotographs. It's one reason I've held back from posting the few photos I have, as I don't know if the image I've captured is how the object would realistically appear in space to the human eye. The moon and planets are straightforward, but DSOs are entirely another matter. -Best, Robert
On 08/04/2022 12:59 PM Howard Ritter via groups.io <howard.ritter@...> wrote:


Hi, Brian—

Thanks for the input. I was frankly surprised at how good a result I could get from an unfiltered image made in a Bortle 7 area, especially with my limited processing skills and using only Photoshop. I’d like to see what an experienced image wrangler could do with the file using their choice of software.

I’m with you on using the Hubble palette and other arbitrary color mappings. They were developed for scientific purposes, and I don’t understand the fascination with using them for images that are made only for esthetic enjoyment. I certainly don’t criticize anyone’s taste along these lines, just saying I don’t understand it. De gustibus and all that. 

The only thing everybody might agree on is that the best possible results would come from going to a Bortle 1 site for a week of imaging two or three times a year, leaving the narrowband filters behind.

—howard


On Aug 4, 2022, at 1:43 PM, Brian Kaine <briankaine@...> wrote:
Howard,

I like all three images, but do prefer the first unfiltered rendition of M27. The colors are
natural, and the starry background really puts the nebula into the context of space. From
an artistic standpoint, I have never cared much for false-color images; I see them as being
arbitrary and artificial. But as a former NASA research scientist, I do appreciate their great
scientific value!

And that first image brings back memories. The first serious astronomy book my parents
bought for me was "Astronomy" by Fred Hoyle; I still have it. The dust jacket is a photo
of the Dumbbell, exactly as in your image.

Thanks for posting.

Brian


Re: Stowaway, 92TCC and corner stars

Chris White
 

The good news is that all of the problems you've seen in 1mm thick filters were eliminated with the newest Astronomik line.

Bill is using these filters with his Epsilon and shared a recent IC 1396 image he made. I'd challenge anyone to find anything deficient in the data that is filter related.

He also shared a broadband image made with these newer astronomik filters and the same scope. Again, no filter artifacts. 

Sorry for the tangent. 

Ultimately it was to make the point that for a 1mm thick filter you only need to add 0.33 mm of additional spacing. 
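The 0.33 mm figure above follows from the standard focus shift introduced by a plane-parallel glass plate, delta = t * (1 - 1/n), assuming n ≈ 1.5 for typical filter glass. A quick Python check:

```python
# Focus shift from a plane-parallel plate (e.g. a filter) in a converging
# beam: delta = t * (1 - 1/n), where t is the plate thickness and n its
# refractive index (~1.5 for typical optical glass, so delta ~ t/3).
def filter_focus_shift(thickness_mm, n=1.5):
    return thickness_mm * (1.0 - 1.0 / n)

print(filter_focus_shift(1.0))  # 1 mm filter -> ~0.33 mm extra spacing
print(filter_focus_shift(3.0))  # 3 mm filter -> ~1.0 mm extra spacing
```

This is also why swapping between 1mm and 3mm filters changes the required back focus by roughly 0.67 mm, which matters at fast focal ratios.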


Re: M27 and filters: It's the colors, Dumbbell!

Pete Lardizabal
 

Hi Robert!

“We can rely upon you or your wife for color discrimination over the Pantone color swatch cards?”

Some people have gifted color vision… 

Kim (my Wife) has taken the Farnsworth Munsell 100 Hue Test on multiple occasions and has scored 0 transpositions on two tests and always scored at the “Superior” rating on other tests. I test “Low” with poor resolution ability with respect to anything “green” 🤣.

Reference Standards and Calibrations are a must for commercial work but I’m a hobbyist who shares some works with others. I will check my MacBook Pro monitor for gray scale and have Kim check the “color” here and there. 

On occasion I’ll help out as a backup photographer for a local business but I’ll hand over the RAW files and let them process the files with all the calibrated monitors and certified processing pros to take care of the collected images. 

Usually my at home renderings are pretty close to theirs… but I’m just a processing neophyte who is somewhat color blind having some fun with the hobby. 

😉

Pete

On Aug 5, 2022, at 5:14 PM, ROBERT WYNNE <robert-wynne@...> wrote:


We can rely upon you or your wife for color discrimination over the Pantone color swatch cards? They used to be the standard against which inks were compounded for color printing back in the day.

Apple has done several projects, as has Adobe, who vends Photoshop, regarding human perception vs. reality, both for DPI resolution and color-shift discrimination by the human eye. I was part of a study that determined human visual cognition of moving objects and their direction with respect to velocity and pixel size. Thanks for the interesting post. -Best, Robert
On 08/04/2022 6:01 PM Pete Lardizabal <p14@...> wrote:



I’m someone with a color vision deficiency. Perception versus analytically objective data collection are two different considerations. 

The Courts have touched on the subject of how real or correct an image is. Generally, if an image is “unaltered” (straight out of the camera) or an audit trail/edit log is available, the image is acceptable at trial. Interesting in that how the camera is set up can shift “reality”. Then consider images of evidence captured in wavelengths we humans can’t perceive, such as UV and IR. Astronomical hobbyist imagers need not worry about passing muster with the Courts; however, astronomical researchers face peer review. 

I can perceive “green”; however, my resolution of greens is very poor… before my color vision was properly tested (when I started working in Forensic labs in 1980) I couldn’t fathom why a box of Crayola crayons had so many green choices with different names but they ALL LOOKED THE SAME! 😆

Dynamic range also comes into play. I would argue few can perceive features on the surface of our Sun without filtration, and conversely, how many can resolve colors in nebulae?

When processing my own images (primarily terrestrial daytime) I’ll try and image a “white” card as a reference for color balance. If I didn’t have a chance to do so I’ll try and select temps based on a close fit (shade, cloudy, full sun, etc) and then season to taste. My Wife has been tested as a Superior Color Discriminator and I’ll often ask for her input with respect to color. 😉

I find the art and science of imaging so challenging and enjoyable. 

😎

Pete

On Aug 4, 2022, at 7:21 PM, ROBERT WYNNE <robert-wynne@...> wrote:

I have long wondered about post-processed images and whether they are true to the eye of the beholder, or if it's the eye of the beholder that drives how and what the imager wants his image to be. With Photoshop you can make any photo appear as one would want, apart from PS limitations. There doesn't seem to be anything like a NIST standard for astrophotographs. It's one reason I've held back from posting the few photos I have, as I don't know if the image I've captured is how the object would realistically appear in space to the human eye. The moon and planets are straightforward, but DSOs are entirely another matter. -Best, Robert
On 08/04/2022 12:59 PM Howard Ritter via groups.io <howard.ritter@...> wrote:


Hi, Brian—

Thanks for the input. I was frankly surprised at how good a result I could get from an unfiltered image made in a Bortle 7 area, especially with my limited processing skills and using only Photoshop. I’d like to see what an experienced image wrangler could do with the file using their choice of software.

I’m with you on using the Hubble palette and other arbitrary color mappings. They were developed for scientific purposes, and I don’t understand the fascination with using them for images that are made only for esthetic enjoyment. I certainly don’t criticize anyone’s taste along these lines, just saying I don’t understand it. De gustibus and all that. 

The only thing everybody might agree on is that the best possible results would come from going to a Bortle 1 site for a week of imaging two or three times a year, leaving the narrowband filters behind.

—howard


On Aug 4, 2022, at 1:43 PM, Brian Kaine <briankaine@...> wrote:
Howard,

I like all three images, but do prefer the first unfiltered rendition of M27. The colors are
natural, and the starry background really puts the nebula into the context of space. From
an artistic standpoint, I have never cared much for false-color images; I see them as being
arbitrary and artificial. But as a former NASA research scientist, I do appreciate their great
scientific value!

And that first image brings back memories. The first serious astronomy book my parents
bought for me was "Astronomy" by Fred Hoyle; I still have it. The dust jacket is a photo
of the Dumbbell, exactly as in your image.

Thanks for posting.

Brian


Re: Vikas Chander APOD

ROBERT WYNNE
 

That is special and gorgeous. I have to wonder how many ways one could process data for this image and produce a different but equally gorgeous photo? -Best, Robert

On 08/05/2022 11:27 AM Roland Christen via groups.io <chris1011@...> wrote:


Congrats to Mr. Chander for his gorgeous image of the Trifid nebula. https://apod.nasa.gov/apod/ap220805.html

Taken with his 150 TOA refractor on a 1600AE mount.

Rolando


Re: M27 and filters: It's the colors, Dumbbell!

ROBERT WYNNE
 

Ansel Adams and the f/64 Club. Each photograph must be in complete focus throughout its depth of field; each photo must use the full range of light available to the photographer, and that full range must be seen in the final image. After Man Ray came along everything went to H, and few learned basic fine or technical photography. Not that I don't like the occasional photo with the subject swinging a flashlight around to record the tracings, but to me it's not reality. Perhaps I am a purist, and new to astrophotography; this has been a lingering question since I began. My thanks to all who have responded. I've learned a lot from all of you. -Best, Robert

On 08/04/2022 9:57 PM Brian Kaine <briankaine@...> wrote:


The use of color in astronomical imaging, and especially regarding DSOs, is a very complicated subject.
Even in terms of simple RGB imaging where we are trying to render objects as they “naturally” occur, there
are countless variables involved. Let’s consider the following:

The optics of our telescopes obviously affect the colors we record. Reflector vs. refractor? Different types
of glasses used in lens manufacture? Coatings? Surely we have all heard of different brands of refractors
yielding images that are warmer or cooler to the eye.

Much the same can be said for RGB filters. Which brand do you care to use? Astrodon? Astronomik?
Baader? Chroma? ZWO? They all differ to some degree in the wavelengths they pass for each channel.

What about the atmospheric conditions when we record our images? Urban or country location? Manmade
light pollution? Particulate matter in the atmosphere? Are there fires raging out west?

How about the software we use to process our images? MaxIm DL? PixInsight? Photoshop? StarTools?
Like it or not, they all influence how we do things and how they turn out.

As Pete has mentioned previously, what about our own individual eyesight? No two people perceive
color exactly the same. Even considering my own vision, I perceive colors as slightly cooler with my left
eye. Granted, it’s a subtle difference, but it’s there!

And regardless of how hard we try, we will never know if we are getting things right. Robert has it absolutely
correct; we don’t know how DSOs truly look up close in space, and getting there anytime soon to check
on it isn’t very likely. The best we can do is to make our images aesthetically pleasing, but unfortunately
"pleasing" isn't necessarily the truth.

Personally, I sometimes wonder if all of the effort I put into RGB imaging is really worth it to me. The
most inspiring astronomical images I ever saw were backlit transparencies of Messier objects at a long
gone gallery in Chicago’s Adler Planetarium. They were monochrome, each and every one.

Perhaps one day I’ll switch to working in luminance alone; just taking nice deep images. Maybe Ansel
Adams had it right after all.

Brian


Re: Stowaway, 92TCC and corner stars

Roland Christen
 

I have seen some strange artifacts in images taken by similar 1mm thick filters. I don't understand the point of making filters that thin. All the filters that I have used over many years are 3mm and none of them have ghost images or weird artifacts. Even the cheap ZWO filters that I'm using with their 6200 camera work well, except for the fact that they are not 3nm bandpass.

Rolando

-----Original Message-----
From: Henry Kwok <henry.ck.kwok@...>
To: main@ap-ug.groups.io
Sent: Fri, Aug 5, 2022 4:24 pm
Subject: Re: [ap-ug] Stowaway, 92TCC and corner stars

I can attest to the quality of Astronomik filters, which are 1mm thick. I use them as a much cheaper alternative to pricier ones, delivering >95% of the performance. It does make it a pain to adjust for back focus, as I have to add 0.3 mm, and because it is not a whole-millimeter increment I always have to resort to some kind of shims. 


Re: M27 and filters: It's the colors, Dumbbell!

ROBERT WYNNE
 

You confirm what I've suspected. I think that if one could travel light-years, most objects would appear grey, much like the moon, except for those objects that emit wavelengths of light in the visible spectrum. You mentioned the Andromeda Galaxy, and I have long thought the photos were very heavily processed. I had not considered that its dynamic range was beyond human vision, or how that is possible for those wavelengths within human perception. -Best, Robert

On 08/04/2022 8:32 PM W Hilmo <y.groups@...> wrote:


On 8/4/22 4:21 PM, ROBERT WYNNE wrote:
...I don't know if the image I've captured is how the object would realistically appear in space to the human eye. The moon and planets are straightforward, but DSOs are entirely another matter. -Best, Robert

No processed deep sky image shows how the object would realistically appear in space to the human eye.  The human eye simply doesn't capture the really faint stuff very well.  Getting closer to another galaxy, for example, would not make it brighter, and it would not reveal color to the naked eye.  For a demonstration of this, consider the Milky Way.  We are inside of it, yet it only appears as a faint, grey band across the sky.  The Andromeda Galaxy is another good example.  Even if you got closer to it, it would mostly look like it does from a super dark site with very transparent skies.

It's reasonable to process deep sky images in a way that is "true to the data".  By that, I mean that it is possible to calibrate the colors so that they match the photons that the sensor picked up.  For narrow band, it's possible to map the data to the RGB channels to reproduce their "proper" color.  I actually try to do that with my images, but the colors just don't pop the way that they do if I exercise some artistic license.

The other thing that we do in processing that is very different from what the naked eye would see is how we stretch the brightness. The dynamic range of many deep sky targets is huge.  Your eyes are much better than a computer monitor at handling this, but in most cases, your eye would not be capable of managing the full dynamic range.  If the faintest parts of a galaxy's spiral arms were somehow bright enough to be visible to the naked eye, the core would be painfully bright.
