Astronomy

A Noob's question on declination

At the equator on the equinox, the Sun's position at noon is perpendicular to the local x-y plane, so the Sun's declination on that day is 0°. Today the Sun's declination is close to 23°. Can that be taken as a virtual equator, so that the Tropic of Cancer is at 46.5° N (by adding 23.5° to it) and the Tropic of Capricorn is at 0.5° S? Or is my assumption wrong?


I don't think this is a particularly useful way to think about it. The equator is the equator; the Tropic of Cancer is at 23.4° North, and the Tropic of Capricorn at 23.4° South. The equator is a great circle on the sphere of the sky, but the Tropic of Cancer is not a great circle. These are fixed.

Now (in Northern Hemisphere summer) the Sun is directly above a point on (or near) the Tropic of Cancer. This doesn't make the Tropic of Cancer a "virtual equator". It just makes it the Tropic of Cancer in summer.

It makes a difference because the path of the Sun seen from the equator at midsummer is not the same as the path of the Sun seen from the Tropic of Capricorn on the equinox. On the equinox, the Sun traces a great circle in the sky (from wherever you view it). On midsummer's day the Sun doesn't trace a great circle (anywhere).
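
One way to see this: an object's daily track lies on a plane whose distance from the centre of the celestial sphere is sin(dec), so the track is a great circle only when dec = 0. A minimal Python sketch of that check (the 23.44° value is the Sun's solstice declination):

```python
# An object's diurnal track sits on a plane at distance sin(dec) from the
# centre of the celestial sphere, so it is a great circle only when dec = 0.
import numpy as np

def diurnal_track(dec_deg, n=360):
    """Unit vectors of an object's daily track at constant declination."""
    dec = np.radians(dec_deg)
    H = np.radians(np.linspace(0.0, 360.0, n, endpoint=False))  # hour angle
    return np.column_stack((np.cos(dec) * np.cos(H),
                            np.cos(dec) * np.sin(H),
                            np.full(n, np.sin(dec))))

for dec in (0.0, 23.44):              # equinox Sun vs. solstice Sun
    track = diurnal_track(dec)
    offset = abs(track[:, 2]).max()   # plane's distance from sphere centre
    print(f"dec={dec:5.2f} deg: plane offset={offset:.3f} "
          f"({'great circle' if offset < 1e-9 else 'small circle'})")
```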


Noob question on EAA Viewing

Have a little one who is very interested in seeing things in the night sky, but is still too put off by eyepiece viewing.

I've tried the basic EAA smart phone adapters, but the result is too small and dark (at least with my device), and super finicky to keep centered.

What's a cost-effective way to get live digital viewing similar to an eyepiece?
Ideally on something like a tablet, but a laptop would be OK too.

Would like to keep the budget under $200 for all required parts, but could be persuaded to go a little higher if there is going to be a huge performance jump.

Current Equipment:
- StarPro AZ 90mm aperture, 600mm focal length
- Focuser adapter has T42 threads

- Have an old DSLR and T42 ring

#2 alphatripleplus

Under $200 would be hard. A small-sensor CMOS camera, such as those made by a number of astronomy vendors, would likely be closer to $300. The camera is connected via a USB cable to a laptop and controlled by EAA software (e.g. SharpCap), and the camera is then attached to the scope (replacing the eyepiece).

You could try using the DSLR in place of the CMOS camera, but to use EAA software like SharpCap you'll need to make sure that there is an ASCOM driver for your DSLR.

For EAA, as you will be live stacking, your mount should track. Doing EAA without a tracking mount is (marginally) possible, but not easy: without tracking, the image drifts across the camera in seconds, so EAA stacking software can only produce very short stacks, whereas tracking allows for longer ones. Not sure if your scope has a goto mount, but I would recommend one for EAA.
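
To put rough numbers on the drift (the camera specs below are hypothetical, chosen to represent a typical small-sensor CMOS on the 90/600 scope mentioned above):

```python
# Rough numbers behind why untracked stacks stay short.
focal_mm = 600.0        # the StarPro AZ 90/600 mentioned above
pixel_um = 3.75         # assumed pixel size (hypothetical camera)
width_px = 1280         # assumed sensor width in pixels

scale = 206.265 * pixel_um / focal_mm   # arcsec per pixel
drift = 15.04                           # arcsec/s of sidereal motion at dec = 0
print(f"image scale: {scale:.2f} arcsec/px")
print(f"star trails beyond 1 px after {scale / drift:.2f} s untracked")
print(f"crosses the whole field in {width_px * scale / drift / 60:.1f} min")
```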


Worse-than-a-noob question about EAA, dark-field correction, and light pollution

So I've been starting to look more and more at EAA, mostly due to light pollution in my backyard and a very recent purchase (should be arriving today) of a NexStar 6SE.

I understand the need for dark-field correction to minimize sensor artifacts, but that doesn't deal with the background of light pollution around the objects being imaged. So I was wondering, for DSOs: why, instead of a dark-field correction, couldn't one use a defocused image of the same sky area, taken with the same exposure sequence as the focused image, and use that for the correction? This assumes that the defocused light-pollution signal would be the same as when the telescope is focused, which may or may not be the case depending on the time to acquire each image and the changes in local seeing/transparency between the first image and the second. The originally focused light of the DSO would now be spread (diffused?) over the whole imager. And IF the DSO signal in each pixel is much less than the light-pollution signal in the same pixel, you would wind up with something similar to the dark-field correction, but the light pollution along with other sensor artifacts would be subtracted out, leaving the defocused DSO background plus the focused DSO in the final image data, hopefully with lower noise than dark-field alone.

I'm sitting here trying to draw focused and defocused optical paths in my head as best I can, and I can see issues related to having less light-pollution and DSO signal hitting the imager, because the total available light is now spread over a larger area at the plane of the imager surface. So this correction would still leave some light pollution plus the now-diffuse DSO signal, but it would still seem better than using just the dark-field alone. If the total area at the sensor plane of the focused and defocused light can be measured, or calculated, I would think one could correct for the reduction in signal between the two conditions and get a more exact correction for the light pollution. I suppose the hard part would be having some kind of "standard" defocus condition one could use to do the area corrections.
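
In code terms, the proposal amounts to using the defocused frame in place of the dark frame. A minimal numpy sketch, valid only under the assumptions above (stable light pollution, DSO light spread thin); the scale factor k for light lost to the larger blur circle is hypothetical:

```python
import numpy as np

def lp_subtract(focused, defocused, k=1.0):
    """Use the defocused sky frame in place of a dark: it carries the same
    sensor artifacts plus the (assumed unchanged) light-pollution signal.
    k rescales for light lost to defocus (hypothetical, to be calibrated)."""
    corrected = focused.astype(float) - k * defocused.astype(float)
    return np.clip(corrected, 0, None)   # clamp negative residuals to zero
```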

Maybe I just haven't read enough yet and something like this is already done, but so far everything I've read says you subtract a dark-field from the image. If so that's my bad.

Mostly I'm just trying to find out where I went wrong in my thinking if this isn't a reasonable thing to do.

Edited by rnyboy, 13 November 2019 - 03:28 PM.

#2 bignerdguy

Why go through all that? Why not just get a skyglow filter to block the light pollution instead? Skyglow or light-pollution-blocking filters block the majority of the wavelengths that are prevalent in pollution from electric lights of various types. This leaves you with the wavelengths at which most DSOs emit, and will let you take some impressive images from your back yard in the city. Keep in mind, though, that you may have to do stacking rather than really long single shots, but the effect is the same either way.

#3 Stelios

Moving to EAA for a better fit.

#4 rnyboy

Thanks for replying. I guess the reason for not using a filter is that the filter can reduce light pollution but, and I don't know if this is the case or not, anything from the DSO in the filter's cutoff regions will also be filtered out. There is also a bit of light loss overall from putting another optical element in the way, and with a lower-quality filter probably even more light lost across the visible spectrum.

#5 Rickster

That is an interesting proposition, and at least in theory I think it would work. The question is: would it work better than adjusting the black level slider in SharpCap? Both methods provide a global way of subtracting the background light pollution. Your method would subtract an average background level based on observation; the SharpCap method sets a cutoff at an adjustable background level. In my opinion, the slider is better for two reasons. First, it is easier. And second, it offers more flexibility, since you can adjust it to your liking in real time.

I assume that you have yet to try SharpCap. If this is correct, I think you will be very impressed with how well the SharpCap method works.

#6 rnyboy

I assume that adjusting the background level in SharpCap (and no, I haven't used it; I've only read a little about it) just sets a fixed, but user-adjustable, level that is subtracted from the total accumulated signal in each pixel for each frame taken (the "real time" reference, I hope). If so, then that also throws out legitimate signal from the DSO. I have absolutely no idea how the magnitude of that discarded DSO signal compares to the passed DSO signal. This is all a battle of both maximizing the signal and minimizing the noise as much as feasible.

It just kind of seems to me that if the total signal from light pollution (the BIG assumption is that it stays constant during both image captures) is, say, 1/2 that of the total signal from the DSO, then being able to throw out as much as possible of that relatively constant light-pollution-only signal would add back up to half of the total available DSO signal in the final image. In theory that is up to a doubling of S/N, yielding improved contrast and better counting statistics for any further study of the now higher-S/N data.

Being very "noob" I'm kind of surprised nobody's tried this, so I figure there must be a reason why it's not done. I mean it's no added work when compared to acquiring just the dark-field image. I wish somebody would try it just for s**ts and grins to see if there's some merit to the idea or if it, putting it plainly, sucks.

Maybe SharpCap, or other post-processing software, can (does?) automatically select the 5%, 10%, ?% of the total pixels in an image that have the lowest signals (the pixels containing mostly light pollution alone), subtract the appropriate dark-field signal from each of those pixels, take the average of those differences, and then subtract that average from every pixel in the image, leaving, hopefully, most of the "real" DSO signal in the final image. This would be pretty similar to what I'm suggesting, maybe even better in that it doesn't rely on the assumption of stable light pollution over the time needed to acquire two separate images. Maybe this is already part of how the post-processing stuff works, which would make my original post and everything else in these threads kind of moot. Oh well, that's a noob for you. Enough knowledge to be dangerous at times.
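
That percentile idea can be written down directly. A minimal sketch with a hypothetical 10% fraction; this is the scheme described above, not anything SharpCap is known to do:

```python
import numpy as np

def subtract_percentile_background(light, dark, fraction=0.10):
    """Estimate the sky background from the darkest `fraction` of pixels
    (assumed to be mostly light pollution) and remove that single offset."""
    img = light.astype(float) - dark.astype(float)   # sensor artifacts out
    cutoff = np.quantile(img, fraction)              # darkest 10% of pixels
    background = img[img <= cutoff].mean()           # assumed sky-glow level
    return np.clip(img - background, 0, None)
```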

Edited by rnyboy, 14 November 2019 - 08:21 AM.

#7 Rickster

I like the way you think. It reminds me of the way I think. (yikes) Exploring ideas, even if they don't pan out, is a great way to learn. And sometimes the ideas that seem so crazy on the surface lead to a true breakthrough.

I can envision that what (I think) you are describing could be done at a remote automated facility, but it has some pragmatic problems for EAA. The biggest one is that you would have to defocus your rig, take your series of background images, and then refocus for your true image. You would have to do this for each target, and if your exposures were long, you might have to do it repeatedly as the target progressed across the sky. For example, my skies are much darker to the east than to the west. And then there is the moon. And so forth. Focusing is time consuming, and precise focus makes a huge difference in image quality; by precise, I am talking microns. This process would increase your time spent on each target by some multiple greater than 2, because your background exposure would have to match your target exposure in all parameters, and you would spend at least that much time again refocusing. In my mind, you could get a much better S/N by increasing the length of your total target exposure (stack) instead. S/N improves with the square root of the total exposure.
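
The square-root rule is easy to verify with simulated frames (the signal and noise values below are arbitrary test numbers):

```python
# Stacking N frames of fixed per-frame noise improves SNR by sqrt(N).
import numpy as np

rng = np.random.default_rng(0)
signal, noise_sigma, n_pix = 10.0, 5.0, 100_000
for n_frames in (1, 4, 16):
    frames = signal + noise_sigma * rng.standard_normal((n_frames, n_pix))
    stack = frames.mean(axis=0)            # average the stack
    print(f"{n_frames:2d} frames: SNR = {stack.mean() / stack.std():.1f} "
          f"(sqrt rule predicts {signal / noise_sigma * np.sqrt(n_frames):.1f})")
```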

Additionally, as you may already know, the signal and the noise are not absolute values; they are overlapping statistical distributions. Nor is sky fog the only noise source. Jon Rista wrote a great primer on the subject: https://jonrista.com. phy-basics/snr/ And yes, to get a pretty EAA image with an inky black background, you will have to set SharpCap to clip faint data. This is why I (and others) will leave some background noise in the image when we want to see the full extension of an object. On the other hand, the AP wizards, like Rista, have gotten very good at bringing the signal out of the noise, but they are willing to spend orders of magnitude more time on an image than we do with EAA. The beauty of SharpCap is that you can instantly change the dark point to suit each target and how you want it displayed, and you can use it to explore features in the object in near real time.

I don't know what type of algorithm SharpCap uses for the dark point (although I would like to know). I don't know whether it is a simple cliff-edged arithmetic subtraction of a single global value, or something more sophisticated, for example having a sloped edge or a variable based on a statistical computation. There is a SharpCap forum where questions like that can be asked. If you figure it out, I would be all ears.

Keep thinking. I am reminded of one of my favorite quotes. "We have not succeeded in answering all our problems. The answers we have found only serve to raise a whole set of new questions. In some ways we feel we are as confused as ever, but we believe we are confused on a higher level and about more important things."

#8 barbarosa

Histogram adjustments really do work very well in more than just SharpCap, but SC has two very good histogram-related features.

Here is the explanation of the SC histogram and how to read it. The Pro (small license fee) version has another tool, the Smart Histogram. This tool measures the sensor response of your camera and then measures the sky brightness of any area that you select. From this it calculates camera settings based upon user input (max dynamic range or unity gain, allowable noise level, and so on), and then sets the camera's exposure, gain, etc., depending upon what the manufacturer exposes in the SDK/drivers.

#9 rnyboy

@Rickster: You pretty much have my thoughts correct. To do it right is the way you suggest: very short intervals between a bunch of focused and defocused images. I was thinking of one focused and one defocused image, and hoping the light pollution would stay constant in both. I was thinking this could work if neither image is too long an exposure; I had total exposure times on the order of a few minutes in mind, not hours. I actually wasn't thinking about the moon, but that led me to add one more defocused image before, and one after, the focused image and average the two, assuming whatever is changing in the night sky changes in a fairly monotonic fashion.

I have no idea what the contribution to total signal from light pollution is compared to the typical signal from DSOs, particularly the fainter regions of DSOs. I mean, "surface brightness" seems a bit of a kludge for comparing DSO brightness to other, unwanted sources of light.

Somehow it just seems to me that some salvageable finer detail is still being thrown out with the noise, and that acquiring the data differently, or treating the data set slightly differently, could do the salvaging.

A dark-field is great for removing sensor noise but that's it.

Oh well, maybe someday I'll have enough of a setup to try some of the "crazy" ideas and see how they work out, or not.

Hi again, barbarosa. I'll have to go look at your link. I guess one thing to try would be to take a good (meaning not amateur and not Hubble quality) "real" image set, including a dark-field, visible spectrum only, of a DSO whose visible-spectrum structure is well characterized, then fool around with small changes to the darkest level of the histogram of the "good" set, to get a feeling for how sensitive the fine detail in the "good" image is compared to what shows up in the "gold standard" reference.

#10 NaNuu

As far as I understand, this method is similar to some of the post-processing tools for adjusting the background colour or gradient. Not the same, since you somewhat combine the dark subtraction with the colour-gradient adjustment. I think you could do some experiments using dark subtraction followed by subtraction of a (Gaussian-)blurred picture. You might even do it during your observing session like this: take some darks and some lights, average them to get a master dark and a "master light", convolve the master light, add the master dark to that convolved master light, and then tell SharpCap to use the result for dark subtraction. It would be interesting to see the outcome.
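
A minimal sketch of that synthetic-dark experiment, assuming numpy/scipy and a hypothetical blur width; whether SharpCap will accept such a frame as a dark is untested:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_dark(master_dark, master_light, sigma_px=50):
    """Master dark plus a heavily blurred master light: subtracting this
    removes both sensor artifacts and the smooth sky background.
    sigma_px is a hypothetical blur width you would tune by eye."""
    smooth_sky = gaussian_filter(master_light.astype(float) - master_dark,
                                 sigma=sigma_px)   # stars/DSO smeared away
    return master_dark.astype(float) + smooth_sky
```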

One question arises for me: isn't that defocused image also carrying averaged information about the DSO, which one would then subtract? So it's somehow similar to playing with the histogram's black point.

#11 rnyboy

First, I just received a 6SE and have no EAA equipment to try anything with. Mostly I've just been reading about it and had these thoughts, so I followed with the questions.

And yes, the defocused image would have the DSO info averaged in, as it would be smeared across the imaged area as well. That's one of the questions I asked: how much light is from the DSO versus from the light pollution in the defocused image? Right now I'm assuming that, when spread out, it is a smaller contributor to the total signal in each pixel than the light pollution. Since there shouldn't be any structure in the light pollution, not much should change except for what I assume is a lowering of the absolute signal due to being spread over a larger circle at the plane of the imager when defocused. If the spread-out defocused DSO signal is less than that from the light pollution, then I would think there would be an improvement in S/N for the final image by subtracting defocused images instead of just a simple dark-field.

"One question arising to me: isn't that defocussed image also carrying averaged information of the DSO, which one would subtract? So it's somehow similar to playing with the histogram's back point." I mostly think so, but it removes the "playing with the histogram" part, which is subjective in terms of where one stops vs the degree of the "real" finer details in the images. It kind of relates to the joke about something reported to have been seen was actually seen in that astronomer's "averted imagination".

I was also thinking about my previous response to Rickster about taking before and after defocused images for the correction, and averaging them, to try to compensate if the light pollution gradually changes while the focused image is being taken. I'm now also wondering if it would be better to have the software subtract out the light pollution on an individual frame-by-frame basis: take the average of the first frames of the before and after defocused exposure sets, subtract that from the first frame in the focused exposure set, and repeat frame by frame until reaching the last frame in the set. If instead one integrated stack of all the individual focused images is used, then simply subtracting the average accumulated signal in each pixel over the two defocused sets would do the same thing.

There is another issue I alluded to in this post, and mentioned in my first post: correcting for what is probably a reduction in signal hitting the plane of the image sensor due to the defocusing. One would have to pick a fixed "standard" amount of defocus for all the defocused images, because anything else would lead to different fractional amounts of the maximum focused light in the defocused images. I think that reduction in signal with defocus could fairly easily be obtained with available optical ray-tracing software, giving a numerical constant for the correction: divide the signal in each pixel by the fractional loss of light for each defocused set, and it is now corrected for use in the subtraction from the focused set. I just realized one would also need a dark-field image to subtract from the defocused images, leaving only the light pollution plus DSO signal for that defocus signal correction.

It's certainly more work now, in that it requires the two additional before and after defocused images. It just seems that if this could pull out additional finer detail, then it could be worth it for those of us suffering from ****-retentiveness. It's a matter of PC horsepower to some extent, but computers are really good at the repetitive, boring, point-by-point data manipulation required once one imports the four image sets into the appropriate software. Aside from the now increased level of work, I've not had any response that shows a fatal flaw that shoots this idea down in flames.

EDIT: Is there a way to turn off the auto-correction for spelling? It puts in more errors than my actual misspellings.

Edited by rnyboy, 15 November 2019 - 12:09 PM.

#12 NaNuu

I like your ideas and the way you think about this problem! I feel like I would try this (provided there is a time without clouds; the forecast is bad for at least another two weeks).

Just one more remark: light pollution of the background usually (at least for me) has a gradient and some sort of structure (streetlamps, as well as the halo of the city), which unfortunately changes with the position of the objects you observe. When doing EAA with relatively short exposure times on an object, it might stay roughly the same during the whole observation (of that particular object), but once you follow an object for several tens of minutes, you will have to change the averaged image.

So I think focus/defocus would be a bit too difficult to do for every frame you take, hence my idea of using convolution of the frames for that averaged background signal.

I'm really curious what your outcome will be!

btw: I think there was a switch to turn autospell off - no idea where I've seen it, though.

#13 rnyboy

You'll be waiting a lonnng time for me to have an "outcome". Like I said in my first post, I just purchased a 6SE and I have nothing in the way of EAA equipment at the moment. Right now I'm just starting to gather the information to eventually make intelligent purchases, and to save up for those purchases. These postings are just part of an ongoing thought experiment, so to speak.

Edited by rnyboy, 16 November 2019 - 07:22 AM.

#14 jprideaux

#15 rnyboy

I was thinking last night that on my 6SE it takes a lot of turns to go from stop to stop, so maybe pick an appropriate number of turns for the defocus, count them off to defocus, then count the same turns back towards focus, and you're off by just the little bit needed to refocus properly.

The nice thing about your lens idea is that, using the lens's focal length, I would think an optical program could calculate a new defocused "focal length", and from that I think one could get the size of the defocused circle at the image plane, to be used for correcting the reduction in light due to its being spread over a larger area.

I also seem to remember reading an article about how the focus changes per turn on one of the Celestron SE focusers. I don't remember which size SE, but a few searches should find it again, and if it was for the 6SE one could plug in the change to obtain the defocused "focal length" and then get the same correction for the change in light at the plane of the imager.

I keep coming back to this posting just to see if somebody happens to try it and if it improves the final image at all.

Edited by rnyboy, 21 November 2019 - 02:08 PM.

#16 tomb1

Good thinking on your defocus/focus technique. I'll give it a try next time I have clear skies. (Rain storm going through right now.) One problem I see that could be a deal breaker, though, has to do with how a camera "sees" colour. It determines colour from the relative intensities of red/green/blue. These are three distinct frequency ranges of light, not a continuum. Light pollution comprises various frequency spikes (mercury, sodium, etc.) that the camera will not be able to discern. Filters are available that do discriminate these specific frequencies very well. For this reason I believe filters will do a better job than a broadband, level-oriented approach, which is what your technique ends up being. We'll see!

#17 Howie1

Re your statement in Post #1: "Mostly I'm just trying to find out where I went wrong in my thinking"

* Blackpoint adjustment will give you exactly what you are asking for, in a similar way too... except the blackpoint adjustment avoids the problems of your proposal: having to take defocused LP shots, then figuring out how to apply them to the images of the object, then dealing with the now-defocused blurry stars and nebula details etc., and then dealing with the fact that the whole image is much less bright after having had data subtracted by the new frame.

I'll go through each of the statements in your posts... and most likely do this over several posts. OK.

In Post #1 you said: "why, instead of a dark-field correction, couldn't one use a defocused image of the same sky area, taken with the same exposure sequence as the focused image, and use that for the correction?"

* You cannot load such a frame into EAA software. There are only lights, darks, bias, flats, etc. options. Someone would have to create such an option in order to load them into the normal EAA software and treat them as you describe. Each option to load darks, flats, etc. treats the data in those frames using an algorithm which specifically corrects the aberration it is designed for, with the least possible destruction or subtraction of data. That last bit is important! We do not wish to halve or lose any data if we can avoid it.

* BTW, further down, when I answer another of your statements, I will tell you how dark frames work. It is not how you think it is.
* BTW, if you used some other software to merge the defocused LP frames with your light frames (the ones of the actual object), then that would be classed as post-processing - see the rules on the forum about that.
* You also cannot try loading the defocused LP frames into a stack to get them to take effect. Stacking looks for sharp stars in every frame it is stacking so it can overlay each frame star-on-top-of-same-star... so it rejects frames with no stars or blurry stars. No stars means it cannot shift or rotate frames with respect to the next, as there are no stars to figure it out from. And as the centroid of blurry stars is hard to determine, it could lead to everything in the stacked image being blurred due to incorrect placement of the stars over each other. So if frames are blurry beyond some set point, it will reject them.

Also in Post #1 you stated: "This assumes that the defocused light-pollution signal would be the same as when the telescope is focused, which may or may not be the case depending on the time to acquire each image and the changes in local seeing/transparency between the first image and the second."

* LP varies according to altitude above the horizon, the azimuth you're pointing at, the time, and the Moon's phase and position. You'd have to shoot those LP frames every time you went to a new object. The whole thing about EAA is rapid acquisition... and as noted, there are already better, less time-consuming ways, as per my comment right at the top of this post.

Again in Post #1 you said: "And IF the DSO signal in each pixel (in the defocused LP frame) is much less than the light-pollution signal in the same pixel, you would wind up with something similar to the dark-field correction, but the light pollution along with other sensor artifacts would be subtracted out, leaving the defocused DSO background plus the focused DSO in the final image data, hopefully with lower noise than dark-field alone."

* Nope. When software processes a dark frame: because the scope was capped, the image should be totally black. It won't be, so the software looks for the exact pixels which are "warm" - not black in the image / on the sensor. It notes which exact pixels had that problem, and when it displays a normal light frame (the ones you shoot of the actual object with the scope uncapped), it replaces those exact pixel positions with an average of the signal values of the surrounding pixels in the light frame. It does not subtract nor change the signal (data) in any of the other "good" pixels. It only corrects / adds data to the pixels which are warm (see the sketch after these points).
* Dark frames do not reduce noise either - as per above, they just replace warm pixels with data from surrounding good pixels. In single exposures you reduce noise by shooting at lower gain, or by shooting shorter exposures. If by lowering the gain / shortening exposures you can no longer 'see' the object at good brightness in a single frame, then you have to stack frames. Stacking reduces noise as per Rick's posts... stack 4 and you will halve the noise of a single frame, thereby increasing the SNR by a factor of two, enabling you to stretch out finer details too. Stack 16 and you will quarter the noise of a single frame, thereby increasing the SNR by a factor of 4, which is even better for stretching out the finer details. And so on. You can do the math.
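A minimal sketch of that warm-pixel patching (the threshold is a hypothetical choice; real software will differ in detail):

```python
import numpy as np
from scipy.ndimage import median_filter

def fix_warm_pixels(light, dark, threshold=50):
    """Find pixels that read high in the capped dark frame, then patch the
    same positions in the light frame with the local 3x3 neighbourhood value.
    No other pixel in the light frame is touched."""
    warm = dark > threshold                 # pixels that should have read black
    patched = light.copy()
    patched[warm] = median_filter(light, size=3)[warm]
    return patched
```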

Ok, that's only the first few points from your posts answered. I'll post up the next ones in my next post. Don't forget... we are leading towards why the blackpoint DOES do what you need... only better!

#18 Howie1

In your Post #1 you said: "so this (defocused LP frame) correction would still leave some light pollution plus the now-diffuse DSO signal, but it would still seem better than using just the dark-field alone"

* Nope. You now have blurry starlight, blurry nebula detail, and blurry galaxy detail, which would affect the overall blurriness of your combined defocused LP frame and normal light frame (which had sharper details!).

* Ok, says you... we'll adjust the Sharpness slider. But that will introduce pixellation / blocky stars, so we use it very sparingly. All of us doing EAA wish we could get sharper images with the gear we own. I've owned and done EAA with a 6SE, and I can tell you that you will get blurry stars to start with. Over time you will fiddle with software backlash and mechanical mods to the gears for backlash, and fiddle with the spacing of camera to reducers, just to get sharper images. The last thing you will want to do is make them less sharp by adding in a defocused shot - if you could add it in, that is - which you can't, as per above.

Also in Post #1 you said: "so far everything I've read says you subtract a dark-field from the image"

* Nope. Many do not subtract dark frames. Even if they own uncooled cameras, many do not take dark frames. If you get heaps of very bright red, green or blue dots, then take darks IF, and only if, the bright RGB dots in the final image worry you. Or buy a cooled camera, which will reduce the warm pixels to almost nonexistent. If the ones which still show bother you, then take darks with your cooled camera. And being cooled, you'll only have to take one set for each gain, as the temperature of the sensor will be held at the one low below-zero temp by the cooler. With uncooled cameras you have to shoot darks at both different gains and different sensor temps (usually in 5-degree increments)... heaps of work, hence many do NOT bother with darks.

Now in Post #4 you said: "I guess the reason for not using a filter is that the filter can reduce light pollution but, and I don't know if this is the case or not, anything from the DSO in the filter's cutoff regions will also be filtered out. There is also a bit of light loss overall from putting another optical element in the way, and with a lower-quality filter probably even more light lost across the visible spectrum"

* This is totally at odds with other things you have said. You are not OK with losing an impossible-to-see bit of colour due to the cutoff of a very narrow band of light from a nebula, or losing a really tiny bit of light due to another optical element in the way... but you propose that adding an artificial new black cutoff point, based on a defocused LP greyish hue with blurry stars and blurry nebula, is the way to go? Don't think so.

* The LP filter will cut out the whole problem you are trying to solve, and affect the overall light across the entire image heaps less than what you propose.

Ok, there's one more big post to come, as it covers the actual blackpoint and how it does what you are chasing (with or without a filter) while doing less damage to the images than what you propose. Having said that... it is great that people think about what we are all doing. But some ideas will work and some won't. And... usually we actually try it out and post details and images to prove it works. Then folk get interested, as they are no longer debating whether it will work but in fact wanting to try it out too... having seen the evidence. Hope no offense is taken there... but I very strongly suggest you get your new rig and go learn how to use it all... that is going to give you heaps more pleasure at the moment. A wonderful experience awaits, and your time will be best spent asking on the forum just how you actually use all those pesky sliders and terms in the software. Anyway, on to how a histogram and blackpoint work in the next post.

#19 Howie1

In Post #6 you say: "I assume that adjusting the background level in SharpCap (and no, I haven't used it; I've only read a little about it) just sets a fixed, but user-adjustable, level that is subtracted from the total accumulated signal in each pixel for each frame taken"

* Nope, not quite as bad. LOL. Each pixel gets a certain amount of signal (as you call it) from the light striking it from the object in space. Extremely dark areas trigger a bit of signal in the pixels pointing at that area; let's say that signal has a value of 1 for a very dark bit of sky. Other pixels point at bits of sky which are a bit brighter; let's say they have a signal value of 5. In brighter bits elsewhere, the pixels getting that light may be at signal value 10, and so on, up to the cores of bright stars, where the pixels may be at 50.

* Now the bit to read carefully... when you first take the shot, the blackpoint level is set to 1... and that tells the software to display every pixel which has a signal of 1 as totally black. All other pixels, of value 2 and above, are still displayed at whatever level of signal they have. If you move the blackpoint to 5, it tells the software to display every pixel which has a signal of 5 and below as totally black. All other pixels above the value of 5 are still displayed. And so on. Move the blackpoint to 50 and even those bright stars will disappear, along with all the nebula and so on, as every pixel of signal 50 and below will be displayed as black.
* So from the above you can see that the idea of subtracting a defocused LP frame, where the grey level of the pixels has been set to black (so LP is effectively black), from the light frames (of the object itself) will, as you pointed out in your posts, reduce the signal in all pixels right across the image. But the blackpoint slider does not do that. If it is set to a low value like 5 (in this example), then you'll lose stuff at 5 or less, but you will not lose any level of signal above that.
* So... that is why we use the blackpoint slider and our eyeballs to correct for LP and/or over-exposure/gain. You sneak the blackpoint up until the LP greyish bits between stars, where there is no nebula etc., become dark... and all the while you make that slow adjustment, you are watching the faint wispy bits of nebula... checking at what point they start to disappear. It is totally up to your taste, and to what details you are trying to see, how far you adjust. And if you do get an LP filter, the blackpoint adjustment is heaps faster and better and easier than without a filter... as the filter does much of the work for you. Not all of the work, but certainly most of it... and as per earlier posts, it lets through a lot of faint stuff which is of a different wavelength than the LP. I.e., in effect it hunts down those pixels of level 5 (say) which ARE only due to LP and makes them black, all the while leaving those pixels which are also level 5 but caused by photons from the nebula. Bingo... you get both benefits... less LP shows up, and more nebulosity. Which is why a couple of us suggest not worrying about this new idea... just get an LP filter and use the histogram if necessary. (The sketch below shows the threshold-versus-subtraction difference with these example numbers.)
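A two-line illustration (numpy), using the example levels above (blackpoint 5, star cores near 50):

```python
import numpy as np

img = np.array([1, 5, 10, 50])           # the example signal levels above
blackpoint = np.where(img <= 5, 0, img)  # 1 and 5 go black; 10 and 50 untouched
subtracted = np.clip(img - 5, 0, None)   # every pixel loses 5 units instead
print(blackpoint, subtracted)            # [ 0  0 10 50] [ 0  0  5 45]
```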

So... it is the histogram and its blackpoint setting/slider which is the normal way to adjust out LP and over-exposure/gain. For all the reasons above, and also...
* One last big plus for blackpoint adjustment: a defocused LP frame used as the blackpoint is then set; it is not easy to unset or adjust. If you find you lose too much faint stuff in the galaxy or nebula for your needs or taste... you have to shoot another, or somehow tweak it a tad brighter, and then reload it into the software. Again, with EAA we are after ease of doing this hobby and quick acquisition of the image. The blackpoint adjustment, by contrast, is both non-destructive and instantaneous. If you tweak a tad too far and lose some faint wisp of nebula which you wanted to keep... just back off a tad and it will reappear. But naturally the dark space between the stars will get a tad greyer in backing off the blackpoint. Nothing we can do about that.
* With experience you will find, with your scope and your camera and your workflows in the software, that you'll know when to use brightness, contrast, gamma, blackpoint, whitepoint, saturation, exposure time, and gain to brighten or darken different bits of your image. No one can tell you what all the "correct" settings are, as we each have different skies, different LP, and shoot at different times of Moon, different times in the night, with different equipment... you just have to get out there and practise.

BTW, I've done this hobby for 7+ years with all types of OTA (camera lens, refractors, SCTs, Newtonians), all types of mounts (push-to rather than GoTo, AltAz, AltAz with wedge, GEM), and all types of cameras (no NV gear... yet! LOL). My first camera was a DIY webcam, then an SCB2000, then several Mallincams, then ZWO, then DSLR. I have fooled with a couple of Atiks too, including the Infinity. Software I've used: Nebulosity, early Mallincam stuff, Miloslick, SharpCap since it came out, Astrotoaster since it came out. On the normal non-EAA photography front: iPhoto, Windows gallery editor PRO, several others I cannot recall, and many versions of Adobe PS and LR.

#20 Howie1

Ah yes... here is a trick which may interest you, though... it occurred to me as I sat with a rum and coke out on the back patio, just thinking about all the cash and time and effort I'd spent over the past 7 years on all those cameras and software! LOL

About 6 years ago the Miloslick software (used with Mallincams) put in an HDR feature. HDR is a method from normal daytime photography. A single shot at some exposure/gain might show your home and friends standing in front of it, all at good exposure... but the white clouds may be overexposed and bloated artificially white, and the shadows under the porch might appear totally dark. But your eyeball when you took the shot would have seen fluffy white clouds, people and house, and also the details of the chairs and tables on the porch. Our eyes have much better dynamic range than camera sensors. HDR takes three shots, all at the same gain but at three different exposure times... one at the speed where people and house etc. are perfectly exposed, one faster frame so that the clouds are not over-exposed (but people and house are a tad underexposed now and the porch is still dark), and one slower one so that the contents of the porch in shadow are now exposed (but the people and house are a tad overexposed and the clouds are really overblown). All three are combined as one shot. By looking at what's under/over/normally exposed in those three shots, I'm sure you'll see how the final stacked image ends up containing a much larger (higher-range) set of signal values. I.e., it is additive rather than subtractive.

Many have since tried that in SharpCap and Astrotoaster and other software... for astro / EAA photos. It is simple to do, and all three shots will stack! (As all three shots have stars, the stacking will be able to overlay each frame star-upon-same-star.) But while the stacked, combined final image does now contain a lot more signal (data) in each pixel, tweaking it out is a bit of an issue... still, it could help you down the track. And most certainly everyone comes up with it as an idea at some point in their EAA.
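
A minimal sketch of that HDR combine, under simple assumptions (frames already aligned; the saturation level is hypothetical):

```python
import numpy as np

def hdr_combine(frames, exposures_s, saturation=255):
    """Scale each frame to a common rate (counts per second) and average,
    ignoring pixels that are blown out in a given frame."""
    acc = np.zeros(frames[0].shape)
    weight = np.zeros(frames[0].shape)
    for frame, t in zip(frames, exposures_s):
        ok = frame < saturation              # skip saturated pixels
        acc += np.where(ok, frame / t, 0.0)  # photons per second
        weight += ok
    return acc / np.maximum(weight, 1)       # avoid divide-by-zero
```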

The reason I and others find it, try it, and then forget about it... is that the histogram blackpoint and middle point etc. are far easier to use once you learn them. And because all (three) frames shot normally will have good amounts of signal (data) in the final stacked image, proper blackpoint, midpoint and whitepoint use gives much better results than the HDR method. But it is exciting to try out, and there is definitely an effect on the background LP. Just not as good as biting the bullet and learning the histogram, blackpoint, middlepoint and whitepoint sliders and how to read the histogram etc.

Cheers and wishing you good times ahead mate!

Edited by Howie1, 22 November 2019 - 03:03 AM.

#21 Matt Harmston

This is an interesting thread. Upon reading it, a thought experiment rolled through my head. When applying dark frames in EAA, the illumination histogram often shifts toward the black point. The degree of the shift depends in part on the illumination of the dark itself: more aggressive darks mean more pronounced histogram shifts after being applied... the kind of effect consistent with reducing light throughput by inserting a filter. Granted, filters prevent light from hitting the sensor, whereas darks applied in a live EAA session address illumination after photons have hit the sensor; I'm simply talking about what happens to the histogram. That said, if we were to subtract a pervasive light source across an entire field... which includes diffuse light from targets of interest, skyglow, amp glow, hot pixels, etc.... I would expect an illumination histogram shift consistent with that produced by introducing a filter.

Note, I'm specifically referencing "illumination" because varied filters will impact RGB differently.

That said, though caused by very different processes, using a filter with reasonable light throughput may ultimately have less impact on illumination as measured by the histogram than subtracting the "LP-dark".

#22 Howie1

Interesting, Matt. If that is with SharpCap, then I do not know for sure how SC handles darks. It may be the case in SC.

But I do know that in all the other astro processing software I've used, darks are handled as I posted earlier. I.e., the dark routine looks at the image from the capped scope (which should be totally black) and finds the pixels on the sensor/image which are not totally black. Then, for each of those pixel positions found to be "warm" / not black in the dark frame, it replaces the equivalent pixel position in the light frames with data/signal from the surrounding pixels. So bright red, green, and blue pixel spots have their data replaced by the average of the data in the surrounding non-bright RGB pixels. All the other pixel positions in the light frame, which were not found to be warm/not black on the sensor, are not touched.

I'm pretty sure it is the stacking of the lights which is moving the histogram towards the black.

I know for sure that stacking light frames by themselves, without darks, in all the software I've used always makes the background darker as they stack. That is because noise is decreased, so the background always gets darker with stacking. As the "count" of the number of pixels in the stacked image which are now darker than your first few light frames goes up, the hump on the histogram will indeed move leftwards.

Easy test, Matt... next time, don't stack with a dark and see what happens to the histogram. I'm pretty sure the hump will move left as the count of those darker pixels increases. Visually you will see warm pixels in your stacked image, and also see the background smoothing over and getting darker and darker as you stack the lights without any darks.

Unless SC handles darks differently to most software... which it might, as it is heavily tweaked for doing EAA rather than AP.


Question about R.A., Declination and targeting objects

As I guess for many of us, quarantine allowed time to dedicate to those hobbies we previously wished we had time for.

So I started to research my SkyWatcher Evostar 90 (910/90), and here is my question:

I did the polar alignment, and after that I wanted to observe Jupiter later at night (I had to give up because the sky became cloudy). I have the mount's altitude set for the latitude of the region I live in.

So, does polar alignment mean that I don't have to move the altitude (and azimuth) controls anymore? Will all the objects in the sky now be found using only R.A. and declination?

If so, how do I target an object that is, let's say, under the North Star? Would I have to move the scope to another nearby place in order to target that object without decreasing the altitude?

Why would then polar alignment be necessary?

And finally, is there any way you could easily explain the purpose of the fixed indicator on the R.A. disk, in the middle image below?

I know these can be regarded as very basic questions.

#2 DHEB

When you polar align you set your mount's polar axis parallel to the Earth's axis. This means that you can track a celestial object by only moving the telescope in the right ascension direction (about the polar axis).

#3 Dequinho

Thank you. So I just have to adjust the disks to the specific RA and Dec for that object from now on? Or do I have to first 'manually' find and center the object, and then follow it through the sky with R.A. adjustments?

#4 DHEB

Thank you. So I just have to adjust the disks to the specific RA and Dec for that object from now on? Or do I have to first 'manually' find and center the object, and then follow it through the sky with R.A. adjustments?

This one, marked in red. The RA/Dec dials are, in my experience, just coarse indicators, unsuitable for finding anything. Better to do this: unlock the clutches, find your objects by star hopping (I guess you do not have goto) using a finder, center the object in the eyepiece field, lock the clutches again, and continue tracking while you observe.

#5 Dequinho

In that case I don't understand why the polar alignment. Pure visual searching can be really difficult, but the more you know the sky, the easier it will get.

Thanks a lot, and would be great to hear more opinions.

#6 kathyastro

Polar alignment has nothing to do with finding objects. Polar alignment ensures that, after you have found an object, you can track it easily with just one movement.

The setting circles have nothing to do with polar alignment. They are strictly for finding objects. You do need to ensure that they are correctly calibrated. The Dec setting circle just needs to be calibrated once. It should hold its calibration forever unless disturbed.

Depending on the design of your mount, the RA setting circle probably needs frequent recalibration. Point the scope at a known object, then rotate the RA dial to match the object's RA. Then you can find another object by moving the scope so that the new target's RA (and Dec) coordinates are shown on the dials. After a few minutes of tracking a target, the RA setting circle will no longer show the correct coordinate, so you will need to recalibrate the dial before moving on to the next target.
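
The rate at which the dial goes stale is easy to estimate. A minimal sketch, assuming a mount whose RA circle does not turn with the drive: the sky's RA under a fixed pointer advances at the sidereal rate, about one minute of RA per minute of clock time.

```python
# Dial error after tracking for a while (sidereal day is about 23 h 56 m).
minutes_tracked = 20
error_min_of_ra = minutes_tracked * (24 * 60) / (23 * 60 + 56)
print(f"after {minutes_tracked} min, the dial is off by "
      f"about {error_min_of_ra:.1f} min of RA (about 5 deg of sky)")
```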

#7 SeaBee1

As Kathy and others have pointed out, polar alignment and finding an object are two separate operations. Polar alignment lines the mount up on the north celestial pole and ensures accurate tracking of an object once found, merely by adjusting the RA axis periodically. If your scope has an RA axis motor, just turn it on and tracking is a hands-free operation; otherwise just turn the RA knob. Easy peasy, and with an accurate polar alignment and a motorized RA axis, a target will stay in the eyepiece for nearly as long as one would wish.

The RA and Dec scales are used for finding an object, but be aware: most of the scales on modern mounts are hardly accurate enough to precisely locate an object in a BIG sky. Consider them as guides only; they can get you into the ballpark, and some are better than others, but most of us ignore them and star hop instead.

If you want to try yours out for finding stuff: use your longest focal length eyepiece, target and center an easy-to-find bright star, set the RA scale to its coordinates, then move the RA and Dec axes to a new target, and (hopefully) your target should be somewhere in the field of view.

#8 Dequinho

Thanks Kathy, I understand better now. However, does that mean I will always have to adjust altitude and azimuth as well whenever moving to another object, or is it possible to point the telescope to any RA/Dec coordinates without moving altitude and azimuth?

Thank you for your answers!

#9 kathyastro

Thanks Kathy, I understand better now. However, does that mean I will always have to adjust altitude and azimuth as well whenever moving to another object, or is it possible to point the telescope to any RA/Dec coordinates without moving altitude and azimuth?

Thank you for your answers!

You must never use the alt and az adjustments for anything but polar alignment. You polar align once at the beginning of a session, then you never touch those controls again. All subsequent movement of the scope is done using the RA and Dec movements.

You need to practise in your living room. If you do, you will see that it is easy to point the scope anywhere in the sky without using the alt and az polar alignment adjustments.

Edited by kathyastro, 26 April 2020 - 09:16 AM.

#10 clearwaterdave

Hello. For me, I find that if I polar align, I can then swing my scope to face in a southerly direction. Whether I want to view east or west of the meridian (due south) will determine which side the scope is on. I know roughly where my target is, but first I point the scope at a known bright object and set the declination dial to that object's declination (there is a mark on the scope to set it by). Next I move the scope until the dial reads the Dec I want for my target, then sweep slowly in the area of the target. I am able to always get the target into a 5° finder, and very often into a 2.5° FOV eyepiece.

A good polar alignment makes it so I can locate most things this way, just using the Dec dial. Good luck.

The suggestion to use the mount inside, following a vid or something, is a great idea: learn how it moves and where your clutches are before you're "in the dark".

#11 Dequinho

Alright, thanks for your great elucidations. I have already seen and applied those calibration tips, and I did a successful polar alignment yesterday. So this will definitely help me try some more "informed" star hopping tonight.

If I read it correctly, a red dot finder would probably be a good next purchase. It would at least make things easier than my standard analog finderscope.

#12 vtornado

The altitude scale is set once, to your latitude. You must make sure that your scope is leveled; otherwise it is not pointing at the celestial pole.

Point your mount to true north, and point the scope to be parallel to the mount. If your tripod is level, your scope should be pointing very close to the north star. In my yard I leveled and put down three patio stones, and put marks on them where the scope points to true north. Now I can take my scope out and just set it on the stones. Instant polar alignment.

For the RA circle, find something easy in the sky like Capella, Arcturus, or something else bright. Get that in your main scope and set the RA circle to match the object's RA. When you are done looking at it, make sure the RA circle still reads correctly, then use it to move to your next target. As noted above, there is a lot of slop in the circle, but it can be used to move about 3 hrs. If you have to move more than that, you will have to find an intermediate object.

#13 Dequinho

Today I removed the whole EQ mount plus the telescope and rebuilt it, having realized I made an "orientation" mistake when I first put the whole thing together.

Now I have the fine-control tubes in the right place, with none of them heading towards the aperture of the scope. I have done another calibration and leveling of the mount. I will now align the finderscope and main scope, and later do yet another north pole alignment.

I need to ask something, and hopefully you can help me again: I can rotate the whole RA axis, and with that movement rotate the tube as well. Or I can rotate the disk only, manually, keeping the tube in the same position.

Is there a simple way to explain when I should use one or the other method?

Once again, thank you for your patience and help

#14 dmgriff

Once you have your polar alignment for the viewing session, you can find objects with an RA sweep.

Once your declination circle is calibrated, set the desired object's declination and lock that axis. You do not set your RA circle for this.

Using a low-power eyepiece, sweep the scope through the RA axis until you find the object, or a large object nearby, and go from there. Lock your RA axis and use your slow-motion controls.

A hand held planisphere can assist you.

#15 kathyastro

I am having a little trouble visualizing what you are asking; I think it is a terminology issue. The "disk" you are talking about - do you mean the setting circle? Yes, the setting circle can be turned separately from the RA axis. This is necessary in order to calibrate it.

#16 Dequinho

The altitude scale is set once, to your latitude. You must make sure that your scope is leveled; otherwise it is not pointing at the celestial pole.

Point your mount to true north, and point the scope to be parallel to the mount. If your tripod is level, your scope should be pointing very close to the north star. In my yard I leveled and put down three patio stones, and put marks on them where the scope points to true north. Now I can take my scope out and just set it on the stones. Instant polar alignment.

For the RA circle, find something easy in the sky like Capella, Arcturus, or something else bright. Get that in your main scope and set the RA circle to match the object's RA. When you are done looking at it, make sure the RA circle still reads correctly, then use it to move to your next target. As noted above, there is a lot of slop in the circle, but it can be used to move about 3 hrs. If you have to move more than that, you will have to find an intermediate object.

good luck.

VT.

My question arises after this reply. It sounds like I "manually" point the scope at the star and then rotate the disk to match the RA coordinate of that star.

#17 kathyastro

My question arises after this reply. It sounds like I "manually" point the scope at the star and then rotate the disk to match the RA coordinate of that star.

Correct: that is how you calibrate the setting circle. Once it has been calibrated, you can use it for a few minutes to find another target by its coordinates. When you have tired of looking at target #2, recalibrate it, and then use it to find target #3.

#18 Dequinho

Thank you all. It got much better last night, as I could hop a little from bright star to bright star by readjusting the RA from an initial one (I started with Regulus). From there I hopped at least into the area I wanted, and I also had a very good alignment between the finderscope and the main scope.

And, earlier in the night, I got great views of the Moon.

#19 clearwaterdave

On my Orion mount the RA dial has a "locking thumbscrew". This is an important part of using the RA dial correctly. Here is how it works:

After polar alignment I turn to face south. I will be viewing to the east of the meridian (due south), so the weight is on the left side of the tripod and the scope on the right. Your counterweight bar is pointing east, and when it's level with the horizon the scope will be pointing due south. You can set your RA dial by looking up what RA is on the meridian at that time, and setting the dial. Now get the RA and Dec for your target. Unlock the thumbscrew and both clutches, tilt the scope to set the Dec, turn in RA until it is right, and view. NOTE: I haven't locked anything yet. If the target is visible, center it; if not, poke around slowly until you find it, then center. Then lock the clutches and tighten the RA thumbscrew. Now, as you track your target with the RA slow-motion control, the RA dial stays still: you are moving the scope, but it is still pointing at the same RA, so the dial stays under the pointer.

Now when you go to the next target: set the new Dec, unlock the RA thumbscrew and clutch, and move the scope until the RA is set. View, center, lock all 3 things, enjoy.

I hope this makes sense, lol.

You need to understand the meridian flip: you need to flip if your weight is getting higher than the scope. It's a bit of a pain but not too bad, and it helps to plan your observing to be in one area. Good luck.
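"What RA is on the meridian at that time" is just the local sidereal time. If you would rather compute it than look it up, here is a rough Python sketch using the standard low-precision GMST formula, good to a few minutes, which is plenty for setting circles; the function name is mine.

    from datetime import datetime, timezone

    def approx_lst_hours(when_utc, east_longitude_deg):
        """Approximate local sidereal time (the RA on the meridian), in
        hours, from the common low-precision GMST formula."""
        j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
        d = (when_utc - j2000).total_seconds() / 86400.0  # days since J2000.0
        gmst_deg = 280.46061837 + 360.98564736629 * d     # Greenwich sidereal
        return ((gmst_deg + east_longitude_deg) % 360.0) / 15.0

    # Example: the RA due south right now, seen from longitude 75 W.
    lst = approx_lst_hours(datetime.now(timezone.utc), -75.0)
    print(f"RA on the meridian: {int(lst)}h {int(lst % 1 * 60):02d}m")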


Stars

A star's direction remains nearly fixed due to its vast distance, but its right ascension and declination do change gradually due to precession of the equinoxes and proper motion, and cyclically due to annual parallax. The declinations of Solar System objects change very rapidly compared to those of stars, due to orbital motion and close proximity.

As seen from locations in the Earth's Northern Hemisphere, celestial objects with declinations greater than 90° − φ (where φ = observer's latitude) appear to circle daily around the celestial pole without dipping below the horizon, and are therefore called circumpolar stars. This similarly occurs in the Southern Hemisphere for objects with declinations less (i.e. more negative) than −90° − φ (where φ is always a negative number for southern latitudes). An extreme example is the pole star, which has a declination near +90°, so it is circumpolar as seen from anywhere in the Northern Hemisphere except very close to the equator.

Circumpolar stars never dip below the horizon. Conversely, there are other stars that never rise above the horizon, as seen from any given point on the Earth's surface (except extremely close to the equator; on flat terrain the point has to be within approximately 2 km of the equator, although this varies with the observer's altitude and surrounding terrain). Generally, if a star whose declination is δ is circumpolar for some observer (where δ is either positive or negative), then a star whose declination is −δ never rises above the horizon, as seen by the same observer. (This neglects the effect of atmospheric refraction.) Likewise, if a star is circumpolar for an observer at latitude φ, then it never rises above the horizon as seen by an observer at latitude −φ.

Neglecting atmospheric refraction, for an observer at latitude φ, declination is always 0° at the east and west points of the horizon. At the north point, it is 90° − |φ|, and at the south point, −90° + |φ|. From the poles, declination is uniform around the entire horizon, approximately 0°.

Stars visible by latitude

Observer's latitude (°)            Declination of          Declination of              Declination of
                                   circumpolar stars (°)   non-circumpolar stars (°)   stars not visible (°)
90 (Pole)                          90 to 0                 N/A                         0 to 90
66.5 (Arctic/Antarctic Circle)     90 to 23.5              +23.5 to −23.5              23.5 to 90
45 (midpoint)                      90 to 45                +45 to −45                  45 to 90
23.5 (Tropic of Cancer/Capricorn)  90 to 66.5              +66.5 to −66.5              66.5 to 90
0 (Equator)                        N/A                     +90 to −90                  N/A

(Declinations in the circumpolar column are + for north latitudes, − for south; in the not-visible column, − for north latitudes, + for south.)

Non-circumpolar stars are visible only during certain days or seasons of the year.
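The rules above compress into two inequalities: a star is circumpolar when |φ + δ| > 90°, and never rises when |φ − δ| > 90°. A minimal Python sketch of this classification, neglecting refraction as the table does:

    def visibility(declination_deg, latitude_deg):
        """Classify a star for an observer at a signed latitude,
        neglecting atmospheric refraction."""
        if abs(latitude_deg + declination_deg) > 90.0:
            return "circumpolar (never sets)"
        if abs(latitude_deg - declination_deg) > 90.0:
            return "never rises"
        return "rises and sets"

    # Matching the 45-degree row of the table:
    print(visibility(50.0, 45.0))    # circumpolar (never sets)
    print(visibility(-50.0, 45.0))   # never rises
    print(visibility(0.0, 45.0))     # rises and sets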

The Sun's declination varies with the seasons. As seen from arctic or antarctic latitudes, the Sun is circumpolar near the local summer solstice, leading to the phenomenon of it being above the horizon at midnight, which is called midnight sun. Likewise, near the local winter solstice, the Sun remains below the horizon all day, which is called polar night.
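For a back-of-the-envelope feel for how the Sun's declination moves through the year, the common cosine approximation is enough; a sketch (accurate to roughly a degree, which suffices here):

    import math

    def sun_declination_deg(day_of_year):
        """Rough solar declination in degrees from the standard
        cosine approximation; accurate to about one degree."""
        return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

    # Near the June solstice (~day 172) the Sun sits near +23.4 deg, so it is
    # circumpolar (midnight sun) above about 66.6 N; near day 355 it is close
    # to -23.4 deg and stays below the horizon there (polar night).
    print(f"{sun_declination_deg(172):+.1f}")   # about +23.4
    print(f"{sun_declination_deg(355):+.1f}")   # about -23.4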


Learn About Astrology

Once you have learned the basic concepts of astrology and understand the planets, the houses, the rulerships and aspects, you are ready to learn about the refinements of astrology based on astronomy. Some of these include the use of declinations and antiscion points. Unless you utilize declinations and antiscia, you are missing much of the chart and a great deal of valuable information. As we explore the old traditions of astrology through groups such as Project Hindsight and review the past wisdom handed down by the astrologers of ancient times, it becomes obvious that we have gone far from our astronomical heritage.

We tend to forget that astrology is based upon astronomy and observation of the "heavens". In our quest for instant gratification in our fast-paced, psychologically oriented society, we have inclined towards using interpretations based purely on the signs and longitude placements.

Just as the astronauts would not use merely the zodiacal degrees to plot a course into space, we astrologers also need to use other calculations to fully understand the charts before us.

One of the basic maxims of traditional astrology was to make note of the declinations. All of the older, well-established ephemeris publishers, such as Raphael's, provide these positions. Very simply, if two planets were within a degree of each other by declination, they were considered to be either parallel (when both are in the same direction, either North or South) or contraparallel (one North and the other South). These were said to denote intensity of the natal aspect and to act as mild conjunctions or oppositions. It was taught that these were very important and that these aspects should be noted as we delineated a chart.

We seem to have drifted away from using these in recent times, but those who utilize this knowledge today have found errors in the traditional teachings regarding orbs and have refined the techniques.

Declination is the measurement of a planet's placement above or below the celestial equator; more precisely, it is the angular measurement north or south of the celestial equator as measured along a great circle passing through the celestial poles. Unfortunately, some ephemerides do not list these positions.

Emphasis has been placed upon the use of zodiacal longitude to view planetary placements, which is a circular representation rather than a north-south alignment. However, the zodiacal viewpoint misses an important piece of astronomical information: the declinations.

Declinations are used to view the relationships between the planets from another dimension of space in relation to the earth. We must use both the zodiacal longitude positions as well as the declinations to properly locate a celestial body as well as delineate a chart.

Planets at the same degree of declination may be said to be either parallel or contraparallel and act as mild conjunctions or oppositions. The effective orbs will vary dependent upon the proximity to the celestial equator.
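As a sketch of the traditional rule only (the fixed one-degree orb, not the refined sliding scale discussed later in this article), in Python, with illustrative names:

    def declination_aspect(dec1_deg, dec2_deg, orb_deg=1.0):
        """Classify two bodies as parallel, contraparallel, or neither,
        using a fixed orb in degrees of declination."""
        if dec1_deg * dec2_deg >= 0 and abs(dec1_deg - dec2_deg) <= orb_deg:
            return "parallel"        # same side of the celestial equator
        if dec1_deg * dec2_deg < 0 and abs(dec1_deg + dec2_deg) <= orb_deg:
            return "contraparallel"  # opposite sides, equal distance
        return "none"

    # The article's later example: Sun at 23 N 13 and Mars at 19 N 56 are
    # conjunct by longitude but over 3 degrees apart in declination.
    print(declination_aspect(23 + 13/60, 19 + 56/60))   # none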

Many come to astrology without an astronomy background and use the tools at hand without understanding their basis. Let's try to look at declinations from the perspective of the chart wheel we are familiar with.

The origin of the chart wheel is the ecliptic, the path of the Sun through the sky, which our chart wheels represent and which ancient astronomers drew. Through the years this has become a perfect circle, even though the actual ecliptic is shaped more like a football. And the chart wheel concentrates on the twelve zodiac signs and the twelve houses, the divisions of space that help us locate planetary bodies.

Our printed chart forms show lovely, evenly spaced sections even though the zodiac is not perfectly aligned and there are even things called signs of long ascension and short ascension. (See Note #1)

Since the earth is tilted approximately 23° 27.5', the Sun's path marks a great circle around the earth called the ecliptic. (Actually, the earth turns and views the Sun from this angle.)

At the times of the equinoxes, the Sun is apparently traveling along the line of the equator and is at 0° declination, but at the time of the solstices (the longest day or night), the Sun is 23° 27.5' from the equator, thus at 23° 27.5' declination. So the maximum declination of the Sun is 23° 27.5' North or South of the celestial equator, and those 23° 27.5' maximum areas are marked on our globes as the tropics of Cancer and Capricorn.

When you place a planet in the chart wheel you are placing it in the vast area of that house by zodiacal longitude, showing you only on which of 360 degrees the planet is manifesting, each degree of the circle being similar to a ray emanating out into space.

Zodiacal longitude, however, does not pinpoint how far along that ray the planet lies, or how close it is to the equator or the horizon (the ascendant in a birth chart). As an example, create a chart wheel and place the Sun at 22 degrees Gemini. Now place Mars at 22 degrees Gemini.

Naturally it is impossible for both bodies to occupy the same place in space. The Sun has a declination of 23° N 13' and Mars 19° N 56'. Mars is closer to the equator (or the ascendant) than the Sun. Although by longitude they appear to be conjunct, by declination they are over 3 degrees apart. (See Illustration #1)

Combustion is the term used when another planet, primarily Mercury or Venus, is conjunct the Sun, and is said to denote that the planet is weakened and the Sun overpowers it. This does not hold true unless the two are also conjunct by declination.

Although the Sun will NEVER exceed 23° 27.5' declination, the other planets may and often do. Planets at declinations greater than this are said to be Out Of Bounds, and may be interpreted as acting out of the ordinary. To further use the standard chart wheel in the example as a viewing aid, an out-of-bounds planet would manifest OUTSIDE the chart wheel.

Note #1: Every 24 hours all signs ascend over the eastern horizon; however, some signs rise across the horizon faster than others. The reason is that all rising occurs in relation to the celestial equator, and since the ecliptic is at an angle to the equator, some signs take longer than others to complete their ascension.

In the northern middle latitudes, the signs Cancer to Sagittarius are referred to as signs of Long Ascension, and the signs Capricorn to Gemini are called those of Short Ascension. In the southern middle latitudes it is the opposite. The signs in the middle of these groups of six rise more uniformly. These terms do not apply in the equatorial or polar regions.

ECLIPSES AND DECLINATIONS

The new moon is the monthly conjunction of the Sun and Moon: both are in the same degree of longitude. An eclipse of the Sun occurs when, at the monthly conjunction with the Moon, the latter has no latitude, so the two are precisely in line with one another as seen from earth; the Moon blocks the light of the Sun. An eclipse of the Moon takes place when the two, at their monthly opposition (the Full Moon), are again lined up, with the Earth between them. The earth blocks the light of the Sun, so that the Moon is in the earth's shadow and does not appear to have any reflected light from the Sun.

One reason to note the importance of declinations is that Eclipses ONLY occur when the Sun and Moon are in the SAME degree of declination AND longitude. They may be in the same degree of longitude but no eclipse will occur unless they have the same declination.

Occultation is the term commonly used when the Moon and a planet are in the same degree of declination. However, any celestial body can "occult", or hide, another from view. These are similar to an eclipse of the planet and are noted by serious astrologers.

A significant point to ponder is that the Moon's Nodes (see Note #2), in their approximately 19-year cycle, are tied into declination. When the Moon's nodal orbit coincides with the 0 point of Aries, the ascending Node will be crossing the ecliptic at this point of the celestial equator, and the Moon then achieves its greatest declination of its 19-year cycle, about 28° 35'. However, it continues for years afterwards to orbit at a high declination, to almost 29° at times; then, 9.5 years later, when the Node crosses the autumnal equinox point (0 Libra), the Moon will be at its minimum declination (approximately 18°) and remain within the bounds of the ecliptic for a length of time.

Note #2: The Moon's Nodes are the two points of intersection of its orbit with the ecliptic, as the Moon moves from north to south latitude during its orbit.

The question of allowable orbs in standard beginning astrology (astrology that does not acknowledge declinations) is clarified by the use of declinations. The relative strengths of an aspect by conjunction or opposition can be easily viewed by adding declinations to the equation.

Unlike the traditional teachings, which recommended a standard one-degree orb for the parallel or contraparallel, there is a theory that two bodies within one degree of declination close to the equator are to be considered within orb of influence, but that the further north or south of the equator they lie, the more the orb must be adjusted downwards on a sliding scale, to only minutes of orb.

An example of the reasoning behind this would be a planet at 0° Aries (declination 0° North) and another at 7° 30' Aries (declination 2° N 59'). They are conjunct by the commonly used standards of allowable orbs of zodiacal longitude, yet almost 3 degrees apart in declination. These two planets are conjunct but not parallel. The other end of the spectrum is a planet at 0° Cancer (23° S 27.5' declination) and another at 7° 30' Cancer (23° S 15' declination). Although 7½ degrees is the same distance apart in longitude, the Aries planets are NOT within orb of parallel by declination, but the Cancer planets are.

Another viewpoint would be to look at planets within the traditional orb for the parallel but outside the allowable standard orb for an aspect by conjunction. A planet at 27° Sagittarius, for instance, at 23° S 25' declination, compared to a planet at 13° 23' Sagittarius at 22° S 25' declination, is within the one-degree traditional orb for the parallel, yet the parallel would not be valid because the two planets are over 13 degrees apart in longitude. This pair is neither parallel nor conjunct.
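The numbers in these two examples follow directly from the standard relation sin(declination) = sin(obliquity) × sin(ecliptic longitude) for a point on the ecliptic (planets with little latitude sit close to these values). A quick Python check, using the article's 23° 27.5' for the obliquity; the function name is mine:

    import math

    OBLIQUITY = 23.0 + 27.5 / 60   # the article's value, in degrees

    def ecliptic_declination_deg(longitude_deg):
        """Declination of a point on the ecliptic,
        from sin(dec) = sin(obliquity) * sin(longitude)."""
        s = math.sin(math.radians(OBLIQUITY)) * math.sin(math.radians(longitude_deg))
        return math.degrees(math.asin(s))

    # The same 7.5-degree step in longitude, taken near the equator and
    # near the solstice point:
    near_aries = ecliptic_declination_deg(7.5) - ecliptic_declination_deg(0.0)
    near_cancer = ecliptic_declination_deg(90.0) - ecliptic_declination_deg(97.5)
    print(f"{near_aries:.2f} deg")    # about 2.98 (nearly 3 degrees)
    print(f"{near_cancer:.2f} deg")   # about 0.21 (roughly 13 minutes)

The same longitude gap thus maps to very different declination gaps, which is the whole case for the sliding-scale orb.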

If you use more than 12' of orb at these higher declinations, the planets will be much further apart in alignment in space, because the ecliptic flattens out along its northern and southern extremities. Along the equatorial regions (0° declination), the ecliptic forms a more uniform arc. You could almost say space is more compressed along the northern and southern extremities. (See Illustration #2)

Using declinations is a useful tool for weighing the strengths and influences of any existing aspects in a chart. Should there be several planets within orb of a conjunction, the refinement of using declinations can aid you in giving more weight to the ones closest by declination.

Also consider two planets not in aspect to one another but in mutual reception and also parallel. This is going to be an important exchange of planetary energies, as there is a direct line along which the energy can travel by way of the declination. Two planets in mutual reception, but not related by aspect or declination will have a more difficult time expressing their energies along a useful track. However, just the addition of the parallel or contraparallel notation steps up the power of any planetary influence. Planets already in aspect will have their energies strengthened by the addition of a parallel or contraparallel in the manner of longitudinal aspect. So two planets in square and also parallel, will render the square more powerful.

Look again at the illustrations included with this article and notice how declinations are measurable avenues by which planetary energies can be exchanged.

SOLSTICE POINTS and THE ANTISCIA

The word Solstice comes from the Latin solstitium (sol, the Sun; sistere, to make stand). The Sun at the solstices is at the turning point of its apparent course, and its declination remains essentially the same for three days. At the winter or summer solstice the Sun turns back towards the equator. A body on an antiscion point of another will make an exchange of energy by way of declination and its position by common relationship to the Solstice Points and the Sun's path. It could be termed a Solar Parallel.

Understand the Sun's path and the solstice points and you can understand the antiscion points for any planet. The entry of the Sun into the Cardinal signs is called the Solar Ingress. Mundane astrologers use the charts of these points as maps of minor beginnings to analyze current events.

Remember that the entries of the Sun into the Cardinal signs reflect the turning points of the Sun on its path around the ecliptic. As the Sun starts on its path in the spring at 0 Aries (around March 21), it is also at 0 degrees declination. The Sun's warmth as received by earth increases (in the Northern Hemisphere) as the Sun travels through Taurus and Gemini, until it finally reaches 0° Cancer (around June 21, the summer solstice).

The summer solstice is the longest day, and thereafter the days shorten until they equal the nights in the fall at the equinox (see Note #3). At 0° Cancer the Sun has achieved its maximum declination North. Traveling from 0 Aries to 0 Cancer, the Sun's declination has gone from 0° to 23° 27.5' North. From June 21 until September 21 the Sun is still at a northern declination, but traveling south, back to the 0° point of Libra and 0° declination.

After September 21, when the Sun crosses 0 Libra and the equator, the declination will be South, and the Sun's declination will continue to increase in south declination until it reaches 0 Capricorn, December 21, the winter solstice. 0 degrees of Cancer and Capricorn are called the Solstice Points. The Sun at 0 degrees Libra or 0 degrees Aries will be the same distance from these points. As the Sun moves from 0 Aries to 0 Cancer it will cover 0 to 23° 27.5' in declination moving North. On its path back to the 0° declination point of the equator at 0 Libra, but traveling South, it will be at the SAME degrees of declination, the SAME distance from the equator, as it was on its way North. Likewise, once past 0 Libra and traveling South in south declination towards 0 Capricorn, and then "turning" back towards 0 Aries, it will cover the same degrees of declination.

Any two points equidistant from 0 Cancer or 0 Capricorn, where the Sun would be at the SAME degree of declination north or south though traveling in a different direction, are called the "antiscia", also known as the solstice point positions. So 0 Aries has a solstice point of 0 Libra, 1 Aries = 29 Virgo, 2 Aries = 28 Virgo, etc. The easy way to check that the antiscion of a body is correct is to observe that the degrees will always add up to 30.

A method to visualize this concept is to draw a "natural" chart, i.e. a chart having 0 Aries rising and 0 Capricorn on the MC. Then draw lines parallel to the Aries-Libra axis.

A planet in Aries will correspond to Virgo; Libra = Pisces.
Taurus will correspond to Leo; Scorpio = Aquarius.
Gemini will correspond to Cancer; Sagittarius = Capricorn.

To find the antiscion of any planet, find its longitude, for instance 10° 19' Pisces, then subtract that from 30 degrees. What is left over is 19° 41'. We see above that Pisces corresponds to Libra, so the antiscion will be 19° 41' Libra.
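This mirror-across-the-solstice-axis rule is equivalent to antiscion longitude = 180° − longitude (mod 360°). A small Python sketch of the calculation; the helper name is mine:

    SIGNS = ["Aries", "Taurus", "Gemini", "Cancer", "Leo", "Virgo",
             "Libra", "Scorpio", "Sagittarius", "Capricorn", "Aquarius", "Pisces"]

    def antiscion(sign, deg, minute=0):
        """Antiscion (solstice point) of a zodiacal position: the
        longitude mirrored across the 0 Cancer / 0 Capricorn axis."""
        lon = SIGNS.index(sign) * 30 + deg + minute / 60.0
        mirror = (180.0 - lon) % 360.0
        sign_idx, rem = divmod(mirror, 30.0)
        d, m = divmod(round(rem * 60), 60)   # split into degrees and minutes
        return f"{int(d)} {SIGNS[int(sign_idx)]} {int(m):02d}'"

    # The article's example: 10 Pisces 19' -> 19 Libra 41'.
    print(antiscion("Pisces", 10, 19))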

Let's look at a transit of a planet by the Sun. The concept behind the use of antiscion points is that a planet crossing one side of a pair of antiscia will be on a point that would be in parallel of declination were it the Sun. Let's choose Venus as an example, at 6 Scorpio (and also 13° South declination). It is on a point along the Sun's path that will be triggered when the Sun reaches that same point by transit, naturally.

The Sun in transit at 6 Scorpio will be at 13° S. The Sun will oppose 6° Scorpio when it is at 6° Taurus (by declination, the Sun at 13° North), about six months apart in time. The declinations will be the same but in opposite directions, thus a contraparallel. However, in February, when the Sun is at 24° Aquarius, it will also be at 13° South declination (24 degrees Aquarius is the antiscion, or solstice point equivalent, of 6 degrees Scorpio). This would then be a parallel.

So, ANY planet that transits 24 Aquarius will be able to vibrate along the ecliptic path across to 6 Scorpio.
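A quick numerical check of this transit example, again using the standard relation sin(dec) = sin(obliquity) × sin(longitude) for the Sun; the sign indices and function name are illustrative:

    import math

    def sun_declination_at(sign_index, degrees_in_sign, obliquity_deg=23.44):
        """Sun's declination at a zodiacal longitude, counting signs
        from Aries = 0."""
        lon = math.radians(sign_index * 30 + degrees_in_sign)
        s = math.sin(math.radians(obliquity_deg)) * math.sin(lon)
        return math.degrees(math.asin(s))

    TAURUS, SCORPIO, AQUARIUS = 1, 7, 10
    print(f"{sun_declination_at(SCORPIO, 6):+.1f}")    # about -13.5 (13 S)
    print(f"{sun_declination_at(TAURUS, 6):+.1f}")     # about +13.5 -> contraparallel
    print(f"{sun_declination_at(AQUARIUS, 24):+.1f}")  # about -13.5 -> parallel

The 6 Taurus and 24 Aquarius values come out equal in magnitude to the 6 Scorpio value, which is exactly the contraparallel/parallel pairing the paragraph describes.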

Note #3: An interesting side note is that most astrologers/astronomers were from the northern latitudes, and they observed the Sun's intensity in August; thus Leo, when the Sun's heat is strongest, is ruled by the Sun. And it is old, tried and true astrological wisdom that any aspect that is formed is stronger in effect after its exactitude. The Sun reaches its maximum point in June, but the effects are seen for several months afterwards in the heat of the summer.

To go further in your studies of declinations, one can convert the declinations to degrees of longitude, since one can utilize the distance from the equator and relate it to the ecliptic by drawing lines parallel to the celestial equator. Just as the antiscia use the Sun's declinations as a guide, so also do Declination Conversions to Longitude. An astrology computer program including this technique is available from Halloran's Astrol Deluxe.



The Dachshund leads a quiet life
Not far above the ground

He takes an elongated wife,
They travel all around.

They leave the lighted metropole
Nor turn to look behind
Upon the headlands of the soul,
The tundras of the mind.

They climb together through the dusk
To ask the Lost-and-Found
For information on the stars
Not far above the ground.

The Dachshunds seem to journey on:
And following them, I
Take up my monocle, the Moon,
And gaze into the sky.

Pursuing them with comic art
Beyond the cosmic goal,
I see the whole within the part,
The part within the whole

See planets wheeling overhead,
Mysterious and slow,
While morning buckles on his red,
And on the Dachshunds go.

Mother Goose's Garland, by Archibald MacLeish

Around, around the sun we go:
The moon goes round the earth.
We do not die of death:
We die of vertigo.

Escape at Bedtime, by Robert Louis Stevenson

The lights from the parlour and kitchen shone out
Through the blinds and the windows and bars
And high overhead and all moving about,
There were thousands of millions of stars.
There ne'er were such thousands of leaves on a tree
Nor of people in church or the Park,
As the crowds of the stars that looked down upon me,
And that glittered and winked in the dark.

The Dog, and the Plough, and the Hunter, and all,
And the star of the sailor, and Mars,
These shone in the sky, and the pail by the wall
Would be half full of water and stars.
They saw me at last, and they chased me with cries,
And they soon had me packed into bed
But the glory kept shining and bright in my eyes,
And the stars going round in my head.

Reason Has Moons, by Ralph Hodgson

Reason has moons, but moons not hers,
Lie mirror'd on her sea,
Confounding her astronomers,
But O! delighting me.

The Rabbits' Song Outside the Tavern, by Elizabeth Coatsworth

We, who play under the pines,
We, who dance in the snow
That shines blue in the light of the moon,
Sometimes halt as we go-
Stand with our ears erect,
Our noses testing the air,
To gaze at the golden world
Behind the windows there.

Suns they have in a cave,
Stars, each on a tall white stem,
And the thought of a fox or an owl
Seems never to trouble them.
They laugh and eat and are warm,
Their food is ready at hand,
While hungry out in the cold
We little rabbits stand.

But they never dance as we dance!
They haven't the speed nor the grace.
We scorn the dog and the cat
Who lie by their fireplace.
We scorn them licking their paws
Their eyes on an upraised spoon-
We who dance hungry and wild
Under a winter's moon.

