|
Post by howie1 on May 12, 2020 2:30:16 GMT
I have yet to try it out; I intentionally did this to let the guys give it a go and give their feedback. I'm glad to hear there is an easy-install version for Windows. Like most, I want an easy interface to operate, with a logical workflow. As said, it seems to follow what is out there already, and, like the guys say, what about something different? The guys seem to like the ZWO ASIAIR and Pro, and I have no idea if this may be what they refer to. But if they like it, it will be good software, and with it being based on the Raspberry Pi 4 4GB, some genius could do better. True, for my tastes in short-exposure VA/EAA, the AIR is very close to perfect ... multi-platform ... great features for outreach and sharing with mates using Android, Apple or Linux screen sharing. It even has HDMI output to a 12V HD camping LED TV if you have one, or to a small 12V projector system. Ticks all those boxes. But there is no way at all I'd recommend the AIR to a beginner. There is no manual, and it does so much that you have to have experience to use it. Paul and I, with years of VA under our belts, have had to use YouTube to figure it out. And it only works with ZWO equipment. And, with just a small tablet screen area, it lacks a bunch of stuff which many people wish was there. Like all "simple, easy to use" software, it cannot be simple and easy to use and do everything ... which unfortunately is what people want as they get into the hobby. But to give a perfect example of the stuff I would love to see the ALS development team do with their software, here is one feature of the AIR which is totally brilliant! An example of automating a workflow which all experienced VA/EAA/AP people do ... Beginners, experienced users, those doing outreach ... we ALL love and need very accurate GoTos.
If you land off target and spend just 2 minutes trying to find the darn thing, it shortens the number of targets I can get to in my short cloud-free "observing" time using very short exposures. Worse ... a beginner will often spend an hour trying to find the target. Also very bad: if you are doing outreach or sharing with friends, then after 5 minutes they have all lost interest in what you are doing and wander off. Forums tell us the standard way to fix this is to load platesolvers and ASCOM or INDI to control the mount, and then refine the GoTo based on platesolved coordinates. But those forums are full of people (a) having problems downloading the correct plates for their equipment, (b) hitting performance or disk-space issues with the platesolving, (c) having no idea how to then use the info from platesolving to actually do a precise GoTo, and (d) then realising that their software won't do that, so they have to find more software to do the precise GoTo. So experienced people have a tough time ... and beginners should just totally avoid it, in my opinion. What has ZWO done with the AIR? You open its built-in object database. It shows you what the object looks like, its altitude in the sky, its brightness and size. If you say to yourself "Yes, that is visible, in the right spot of the sky, a good size for my scope and camera combination" and so you tap to select it ... the mount automatically performs the GoTo ... automatically platesolves on finishing the GoTo ... and automatically from that platesolve does a precise GoTo to place the target smack bang in the centre of the camera frame. That's smart! All built into the one downloadable app. There's got to be more automated stuff which the ALS team can do. For example, the surface brightness, altitude and azimuth of the target object in the database can easily be used to set a starting point for exposure time and gain.
The exposure time is also easily set if an Alt-Az mount is used, as the object's azimuth and altitude yield the degrees of field rotation in whatever part of the sky the object is in at that time, and thus the maximum possible exposure time (short, as it is an Alt-Az mount). Then that first shot will produce a histogram which can (or should) be very simple to interrogate automatically, to decide whether the image is clipped in the black left-hand area or the white right-hand area (highly unlikely), and then adjust exposure and gain based on that in a quick iteration to yield a correct histogram and allow stacking to commence. Experienced people would simply click a stop-auto-process button so they can use their experience to set it, but for outreach and beginners it would be a super feature. It is simply an automation of the workflow which all experienced people use when setting up quick focus-and-frame images, before then setting our normal (from experience) exposure, gain and white balance to permit stacking. On white balance ... that too is another example of what should be automated, as most people I know doing VA/EAA adjust the RGB curve peaks to align for the starting point, then do the final tweak using the RGB sliders. But 100% of beginners have absolutely no idea how to colour balance to start with. The Tarantula nebula, Orion nebula and Flame nebula all come out red ... when they are not red at all! If they've fiddled with the settings they often have a green hue over everything, as the RGGB pixel array has two green pixels which overpower everything. Again they have no idea how to easily fix it. So automate it ... then give control for the final tweak if they so desire. Anyway ... off my lecture podium ... sorry everyone. Cheers, Howie.
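The Alt-Az exposure cap described above follows from the standard field-rotation formula (sidereal rate times cos(latitude) times cos(azimuth) over cos(altitude)). A minimal sketch of how a tool could use it; the 0.05-degree tolerance is an assumed example value, not anything from ALS or the AIR:

```python
import math

SIDEREAL_RATE_DEG_PER_HR = 15.04  # Earth's rotation rate, degrees per hour

def field_rotation_rate(lat_deg, az_deg, alt_deg):
    """Approximate field rotation rate in degrees/hour on an Alt-Az mount."""
    return (SIDEREAL_RATE_DEG_PER_HR
            * math.cos(math.radians(lat_deg))
            * math.cos(math.radians(az_deg))
            / math.cos(math.radians(alt_deg)))

def max_exposure_seconds(lat_deg, az_deg, alt_deg, max_rot_deg=0.05):
    """Longest exposure before rotation exceeds max_rot_deg.
    The tolerance is an assumed example; tune it for your sensor and focal length."""
    rate = abs(field_rotation_rate(lat_deg, az_deg, alt_deg))
    if rate == 0:
        return float("inf")
    return max_rot_deg / rate * 3600.0
```

Near the zenith the denominator shrinks and the allowed exposure collapses, which matches the practical advice to avoid targets overhead on Alt-Az mounts.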
|
|
|
Post by davy on May 12, 2020 6:14:21 GMT
Welcome to the forum steffou. And thanks for the input; as said, it's good to talk to developers and give our feedback on software. Thanks, but just to be fair, I'm not a developer, Fred is... I'm a contributor, just a user with ideas, a bit of a geek, zero budget, and deeply involved in testing what very talented devs are making with this project. By the way, I just want to mention that I love my default avatar on this forum, which reminds me of a hilarious manga character called Saïtama (if you know)... Just one is enough.
That's why some other mates in the team made a tool called NAFAbox, based on scripts that install all the classical astro tools (INDI, KStars/Ekos, Siril, Stellarium, ocapture, astrometry index files, CCDciel, etc.). That makes it a lot easier to prepare.
But buying a StellarMate image is also a simple way. You are right, that's why we needed a dedicated tool. As Fred said, it's important to think very seriously about that, as with some other features on the list. My 2 cents, but I think it's important to think cross-platform. Very important, and wise. That's what I'll do with this project with my club: using it for public sessions to "show" the eye's limits, and share our passion with the next generation, stuck on their phones... you see that guy trying to take a picture with his smartphone through the eyepiece, to take a "souvenir" from that night? With the sharing server, people could just take a screenshot directly on the phone. Thanks in advance for your tests.
I have already tested out this approach with my LattePanda. I have had three computers linked to my computer at the one time, and had thought this was a good approach for outreach.
|
|
|
Post by davy on May 12, 2020 6:35:49 GMT
I don't have an ASIAIR or Pro; my thoughts were, can it be done outwith ZWO? So far I've tried the Raspberry Pi, and it isn't good enough at the present time; a great opportunity for developers to make it better. So I'm using the LattePanda, and it's closer to doing what the ASIAIR does, but with my preferred Windows-based software. Howie is spot on: if new software can pinpoint a target with plate solving to a good degree, simply using software, then that's stage one. Stage two: an exposure to check it is correct, with a split screen showing the reference image against the captured image. A simple mount-control feature to move the scope, and a track button. Simple slider adjustment bars to control the camera operations. And when you are happy with all your settings, capture to folder; that will rough-stack your images but give you editing options to remove a bad image from the stack.
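The platesolve-then-refine GoTo loop discussed in this thread can be sketched in a few lines. This is a hypothetical outline, not ALS or ASIAIR code; `goto` and `solve` are stand-ins for whatever mount driver and plate solver you actually use:

```python
def precise_goto(goto, solve, target_ra, target_dec, tol_deg=0.02, max_iters=5):
    """Iteratively refine a GoTo using plate-solved coordinates.

    goto(ra, dec) -- slew the mount (stand-in for your mount driver)
    solve()       -- plate-solve the current frame, returning (ra, dec) in degrees
    Returns the final pointing error in degrees.
    """
    cmd_ra, cmd_dec = target_ra, target_dec
    goto(cmd_ra, cmd_dec)
    err = float("inf")
    for _ in range(max_iters):
        ra, dec = solve()
        err_ra, err_dec = target_ra - ra, target_dec - dec
        err = (err_ra ** 2 + err_dec ** 2) ** 0.5
        if err <= tol_deg:
            break
        # nudge the commanded position by the measured pointing error
        cmd_ra += err_ra
        cmd_dec += err_dec
        goto(cmd_ra, cmd_dec)
    return err
```

A mount with a constant pointing offset converges in one correction; real mounts may need two or three iterations, which is why the loop is bounded rather than open-ended.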
|
|
|
Post by davy on May 12, 2020 6:45:10 GMT
Okay, downloaded, clicked the exe, bypassed/OK'd the W10 warnings and waited, waited..., Aha! Nice, so that was as easy as I thought, good one guys. Hmm, a bit bloody dark to make out what's what on screen. 1/ Pick a different or brighter display output; it's too dark-on-dark for my old(er) eyes to make out the details. I had to really concentrate to select elements on-screen. 2/ The preferences box/icon you select to access the Preferences options could be a bit larger, for the reasons in (1). Otherwise at this stage all was good. Fiddled with the Preferences/Scan-Folder and then copied images in one at a time to see the resulting build-up of an image in ALS. Cool. I was able to start the server and browse to it after noticing the address displayed in the inspection window in ALS. That's not stunningly obvious, and I would suggest making it an output display in the 'Server' area of ALS? Not everyone is going to think of scrolling back in the inspection box to find it, if they would even think it would be there anyway... So this was done after (a late) breakfast, and I would have done more maybe, but the power was cut... Meh. Overall I thought it was pretty painless, and I can see it would be a GREAT tool for clubs and outreach particularly. Good work guys. Have to say too, your English (both of you) is nothing to be apologetic about, although Fred's sounds like a native's; sure you're not a misplaced Brit, Fred? Or maybe you're from Brittany... Best regards, ian. Think he might be more Scottish, mate. Good review and suggestions for Fred. I like his passion, latching on to all the suggestions for making the software better and not seeing the comments as negative. The coding must be pretty impressive and well laid out to be able to go in and address some of the issues. I've also been following the progress on SGL. I'm back at work now and flat out, so pretty tired when I get home, but hopefully I can get a look at the software soon and review it.
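The "copy images in one at a time and watch the image build up" behaviour ian describes is, at its core, an incremental running average. A small numpy sketch of the idea (not ALS's actual stacker; folder watching and alignment are omitted):

```python
import numpy as np

class LiveStack:
    """Running-average stack: each new frame nudges the result toward the
    mean of all frames so far, which is why the image visibly builds up
    as files land in the scan folder."""

    def __init__(self):
        self.stack = None
        self.count = 0

    def add(self, frame):
        frame = np.asarray(frame, dtype=np.float64)
        self.count += 1
        if self.stack is None:
            self.stack = frame.copy()
        else:
            # incremental mean: equivalent to averaging every frame,
            # without keeping them all in memory
            self.stack += (frame - self.stack) / self.count
        return self.stack
```

Noise falls roughly as the square root of the frame count, so the first handful of frames gives the most visible improvement, which matches what you see on screen during a live session.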
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on May 12, 2020 7:11:23 GMT
What a joy to wake up and see you guys did your homework. Brilliant suggestions!
iannz:
- You are absolutely right: the webserver information should be more obvious. Working on it before I shoot the demo video.
- A switch for night/day display mode will follow shortly.
- Regarding launch time: it is a delay you'll have to accept for a few more weeks. We'll have a true Windows installer in the "near" future. As of now, the executable you guys download is merely a 'glorified ZIP'. Your computer has to unzip all the files into your temp folder and launch ALS from there...
howie1: Great ideas. Capture-settings optimization on the fly, according to the first few shots, would be a nice feature to make ALS maybe also stand for 'A Lot Simpler'. It might have to be tamed down on small machines, but surely a great aid.
And yes guys, I grew up just across the Channel. My mind must have been "polluted". Have the nicest day, mates.
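The capture-settings optimization Fred mentions could start from the histogram check howie described. A hedged sketch of one possible approach; the clipping thresholds and the double-or-halve exposure rule are assumed example values, not ALS behaviour:

```python
def histogram_verdict(pixels, bins=256, low_frac=0.02, high_frac=0.001):
    """Classify an 8-bit frame: 'under' if too many pixels pile up at the
    black end, 'over' if the white end clips, else 'ok'.
    Thresholds are assumed example values."""
    n = len(pixels)
    lo = sum(1 for p in pixels if p <= 1) / n          # near-black fraction
    hi = sum(1 for p in pixels if p >= bins - 2) / n   # near-white fraction
    if hi > high_frac:
        return "over"
    if lo > low_frac:
        return "under"
    return "ok"

def auto_expose(capture, exposure_ms, max_iters=5):
    """Iterate the exposure until the histogram stops clipping.
    capture(exposure_ms) is a stand-in returning a list of 8-bit pixel values."""
    for _ in range(max_iters):
        verdict = histogram_verdict(capture(exposure_ms))
        if verdict == "ok":
            break
        exposure_ms = exposure_ms * 2 if verdict == "under" else exposure_ms / 2
    return exposure_ms
```

A real tool would adjust gain as well as exposure and respect the Alt-Az exposure cap, but the clip-check-and-retry loop is the heart of it.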
|
|
|
Post by Deleted on May 13, 2020 16:44:23 GMT
Hi again, guys
A presentation video has been put up:
A bit raw and long, but you'll get the basics in no time
|
|
iannz
Member
Posts: 71
home town/country: Tarurutangi, NZ
time zone gmt +/-: +12
|
Post by iannz on May 13, 2020 20:39:32 GMT
Hi Fred, nice explanation of the many options and layout.
Have to say again that it is hard to follow on screen when the text/titling is so dark. If you weren't explaining where you were and what you were doing, along with the mouse icon, it would be hard to follow along. I think you need to make the (red) text colouring much lighter, or give it better contrast anyway.
Nice to see the 'quick key' commands etc too.
So on finishing a 'capture', does the program save the final stacked image to the ALS working folder, so you have a record of the total result?
Another thought, à la Astroberry: an option to have the host act as a hotspot, for groups or settings (informal star parties?) that don't have, or are unable to set up (available knowledge, resources, temporary locations), a dedicated server host/network. I don't belong to any groups locally and am just starting out in video/imaging astronomy, but it seems to me to be a great tool to share the experience with others with (possibly) just their own cellphones etc.
thanks ian
|
|
|
Post by davy on May 13, 2020 21:47:50 GMT
Good presentation Fred, looking good. Being into lunar imaging: would it stack lunar images? I was listening: 25 stars for alignment. I tried the ZWO ASIStudio software and it would not stack, because obviously no stars. Could ALS be configured to use reference points on the Moon's surface and stack?
|
|
|
Post by davy on May 13, 2020 21:57:44 GMT
Hi Ian, Astroberry should be able to do this. I used my smartphone hotspot to connect Astroberry on the RPi4 and my laptop, with VNC. I never tested hooking up multiple laptops etc. on this system. But using my LattePanda in exactly the same manner, only using Windows instead of Linux, and TightVNC, I connected three devices and it worked great. So ideal for star parties, with a few tweaks.
|
|
|
Post by steffou on May 13, 2020 22:14:51 GMT
Hi Ian,
Yes it works; this is exactly the idea behind it. I have an "Astroberry-like" setup (I use a single-board computer called the Up! Board, which is a bit more powerful than the Raspberry Pi 4) with INDI/KStars/Ekos and ALS, working as a hotspot. People can just connect their smartphone to the hotspot and reach the ALS server directly. The webpage has an auto-refresh to show the stacking result.
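The auto-refreshing result page steffou describes needs nothing more than a standard HTML meta-refresh tag. A sketch of the markup a server could emit (not ALS's actual page; the file name and interval are assumed examples):

```python
def refresh_page(image_name, interval_s=5):
    """Minimal HTML for a page that reloads itself every few seconds,
    so phones on the hotspot always show the latest stack result.
    A sketch of the idea, not ALS's actual page."""
    return (
        "<!DOCTYPE html><html><head>"
        f'<meta http-equiv="refresh" content="{interval_s}">'
        "</head><body>"
        f'<img src="{image_name}" style="width:100%">'
        "</body></html>"
    )
```

Because the refresh happens client-side, the server only has to keep overwriting one image file; every connected phone picks up the change on its next reload.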
|
|
|
Post by steffou on May 13, 2020 22:30:08 GMT
Hi Davy,
For the alignment, it's not about 25 "stars", but 25 "groups of stars".
About alignment with planetary images: it isn't possible at the moment, but I think Fred can answer you in better words than me.
Steff
|
|
|
Post by iannz on May 13, 2020 22:39:01 GMT
ok thanks Steffou
|
|
|
Post by iannz on May 13, 2020 22:41:24 GMT
I'd suppose the moon requires something like facial recognition?
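Ian's "facial recognition" intuition is close: lunar and planetary stackers register frames on surface detail rather than stars. One classic technique is phase correlation, sketched here with numpy for whole-pixel shifts (this is not something ALS does, per the thread; it is just an illustration of the registration idea):

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the (row, col) translation of img relative to ref by
    phase correlation -- registration on surface detail, no stars needed."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(img)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.fft.ifft2(cross).real         # sharp peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # indices past the halfway point wrap around to negative shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)
```

Real lunar stackers (AutoStakkert!, RegiStax) go further with multi-point alignment to handle atmospheric distortion, but translation estimation like this is the first building block.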
|
|
|
Post by steffou on May 13, 2020 23:01:29 GMT
You are welcome.
There is a button to save the currently displayed result, or every frame.
I think Fred has opened an issue to save the image to a time-stamped file on session stop, so it's on the list.
|
|
|
Post by howie1 on May 14, 2020 0:58:55 GMT
May I ask why you need to stack when shooting the Moon at crazy low gain and crazy fast shutter speed? Low gain means low noise. You stack to get rid of noise ... not to sharpen. At very low gain there is very, very little noise, and the very fast shutter speed required at those low gains to shoot the Moon freezes the movement. Yes, you will get atmospheric wobble and such, but set up a small capture run using the unstacked module in Studio ... the centre module/app with the single-frame galaxy icon/button ... set up a small run sequence in there of single shots with maybe 30 seconds between those very low-gain, fast shots, and you will find that one of those frames shot with 30-second gaps "catches" a clear-sky moment, for which you can then save the image (so it doesn't get overwritten by the next single shot 30 seconds later). Easy peasy, dave.
E.g. unity gain is considered to be the lowest-noise, best-dynamic-range setting for AP use ... like typical 3, 5, 10 minute shots. My ASI120 mono camera's unity gain is 35, but with that camera I shoot the Moon at a gain of 5!!!! My ASI224 colour's unity gain is 135, and I shoot the Moon with the 224 at gain 20! Both those cameras, when set to these almost-zero gain settings, produce no noise, no mottle in the frames. So there is no need to stack if there is no noise. And when set to those almost-zero gains, the exposure times are still in the millisecond range on the Moon. I think it is around 1 ms with the 120 mono and half a millisecond with the 224. So fast that the shots freeze any mount wobble. So fast that even an untracking camera mount produces a sharp image! But ... if you are watching the live feed (at 1000 fps!!!) you will see it looks a tad blurry ... because of atmospheric conditions, which your eye 'sees' as a blurry overall image: your eye blends the sub-second video frames together in your brain and 'sees' blur. But by using the AP module (the middle Studio app with the single-frame galaxy icon) and setting up in that app a sequence run of single frames at gain 10 and an exposure time of 1000 ms, with a 30-second pause between each shot ... you will see in the preview window that each low-gain, very fast shot looks sharp as a tack! But every other one or so will be a bit blurry in some random part of the image, due to atmospheric conditions. A very short 30-second wait for the next single shot to appear, and you will come across a total-clarity, sharp-as-a-tack shot ... that's when you click the Save image button ... to save that as-viewed nice sharp frame. Hope that helps you, davy ... post some up. And for completeness' sake ... if you shoot Saturn, Jupiter, Mars etc., then the Studio planetary module/app (the one on the left with the Saturn icon) shoots video for stacking with RegiStax etc. AND DOES HAVE surface-feature guiding using ST4. Open that module up and you will see the ST4 pulse settings page and the icon to ST4-guide, and in the doco/help how it works. I haven't used it, but it is in there. Cheers, Howie
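Howie's "wait for the frame that caught a clear moment" step can even be automated: variance of the Laplacian is a common sharpness metric in lucky imaging. A numpy sketch (the metric choice is mine for illustration, not anything from Studio or BYEOS):

```python
import numpy as np

def sharpness(img):
    """Variance of the Laplacian: high for crisp surface detail, low for
    atmosphere-blurred frames. A common 'lucky imaging' focus metric."""
    img = np.asarray(img, dtype=np.float64)
    # 3x3 Laplacian over the valid interior region, via array slicing
    lap = (img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:]
           - 4.0 * img[1:-1, 1:-1])
    return lap.var()

def pick_sharpest(frames):
    """Return the frame with the most fine detail from a capture run."""
    return max(frames, key=sharpness)
```

Run it over the 20-shot sequence and the frame it picks is usually the one you would have saved by eye; the metric only ranks frames, so it works regardless of absolute brightness.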
|
|
|
Post by Deleted on May 14, 2020 6:58:50 GMT
Guys,
iannz: each time a new frame is stacked, the stack is then processed (stretch + levels + RGB balance), and this processing result is shown and saved as 'stack_image' in your work folder. So at the end of a session, 'stack_image' contains the last result. But indeed, if you start a new session without putting this result in a safe place, it is overwritten by the result of the new session's iterations. So we are adding a new feature, as mentioned by Steffou. See: github.com/gehelem/als/issues/119
The simple fact you had to ask is a sign this presentation video missed part of its goals. We'll improve on that point.
Regarding aligning planetary shots: indeed, ALS is aimed at deep-sky imaging as it is, so it will quite surely always fail at aligning those shots. Clear skies!
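The time-stamped save Fred refers to (issue 119) boils down to copying the current result under a dated name before the next session can overwrite it. A sketch; the 'stack_image.tif' file name is an assumed example, not necessarily what ALS writes:

```python
import shutil
from datetime import datetime
from pathlib import Path

def archive_stack(work_dir, name="stack_image.tif"):
    """Copy the current stack result to a time-stamped file so the next
    session cannot overwrite it. File name is an assumed example."""
    src = Path(work_dir) / name
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    dst = src.with_name(f"{src.stem}_{stamp}{src.suffix}")
    shutil.copy2(src, dst)  # copy2 preserves the original timestamps
    return dst
```

Calling this on session stop gives each observing run its own dated file while 'stack_image' keeps its fixed name for the live display.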
|
|
|
Post by davy on May 14, 2020 7:34:43 GMT
Hi howie, I tried stacking the Moon a few years back and I never saw any benefit; I had single exposures better than stacked images. I've seen posts on Facebook groups where folk have said theirs were stacked images, so I took it that I must have done it wrong, and I never bothered to try again. I like using a DSLR for lunar images and I set my camera up in BYEOS, but I've been doing it for so long I know what shutter speed/ISO I need to be close. On video I mostly just live-view and don't record.
|
|
|
Post by davy on May 14, 2020 7:40:42 GMT
|
|
|
Post by davy on May 14, 2020 7:41:36 GMT
This is what I meant
|
|
|
Post by howie1 on May 15, 2020 5:00:56 GMT
Sorry dave ... one of my unhelpfully way-too-long posts there ... that dumb long answer of mine was saying exactly that: you set such a low ISO/gain and such a fast shutter speed that a single snapshot yields a very sharp, clear shot. Stacking does not help sharpness. Taking a single very fast, low ISO/gain shot does the best job of taking a sharp Moon image. But use a BYEOS run sequence to do it like I said ... set it to take, say, 20 shots with 30 seconds between shots ... quite a few of those 20 shots will be really sharp. The 30-second break between shots just allows some of the atmospheric wobbles, wisps of cloud or seeing conditions to change a bit between frames. Thus there is more chance that one of those 20 frames, shot 30 seconds apart, will be crystal clear.
|
|