Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Feb 3, 2017 21:05:40 GMT
Oops ... should have said a bit more, perhaps. Here goes:
The Stick has an HDMI out plug and comes with an HDMI cable. I ran that cable to the HDMI "in" port on my TV set. I had an old Apple wireless keyboard that I hooked up using the Stick's Bluetooth, and I also hooked up an old USB mouse to one of the Stick's USB ports. It happens that my TV is connected to my home network, so with a bit of nudging I was able to have the Stick join the network, and I then downloaded EOS, BYE, PHD2 and such. After that initial loading, because of the RDC interface, the hookup to the TV and the wireless keyboard and USB mouse are no longer necessary. And if I need to download other stuff you guys invent, I can do that via the RDC connection, provided the Stick has internet access. Hope this helps.
J
|
|
|
Post by davy on Feb 3, 2017 21:24:46 GMT
Hi Jack, yeah, I had thought about how it may have been set up. The Bluetooth keyboard and an HDMI monitor would have been used for the scope's initial setup and to test the mount etc. at the mount. The monitor could be left at the scope to look at the camera image in BYEOS or AstroToaster before heading into the house, shed or observatory warm room. Yes, a cheap laptop could do the same, but I personally like the idea of a small HDMI screen connected to the PC stick, with my DSLR plugged into it and the telescope mount operating from it as well.
|
|
robrj
Member
Posts: 248
home town/country: Escondido, CA
|
Post by robrj on Feb 13, 2017 22:12:12 GMT
I have one that's similar. It works. It goes to sleep fast (3 minutes), requiring you to hit a button to wake it up, but that isn't a big deal. I originally had my computer mounted on the OTA, which caused connection problems with the keyboard, as its wireless link is weak. If the scope came between the keyboard and the computer, it wouldn't work, requiring me to walk around the scope to get it to talk. I wanted a full-size keyboard, so I wound up buying a Logitech K830. It was a little better with regard to the wireless interference. I've since moved the computer to the mount itself. However, the mini-usb charging port on the Logitech has since come dislodged, meaning I can't charge it, so I may revert to the small handheld unit until I can see if the Logitech is fixable. One nice thing about the small one is that it fits in my coat pocket (which helps with dew). One other thing to note: on my stick PC, I had to have a wired keyboard to set it up. You need a keyboard to log in, and you need to log in before you can turn on the Bluetooth.
|
|
|
Post by ChrisV on Feb 14, 2017 0:30:01 GMT
I seem to get occasional dropouts in the connection to the PC stick, so what you say about the weak WiFi makes sense, Rob. I wonder if it's worth adding a USB WiFi dongle to the PC stick?
|
|
robrj
Member
Posts: 248
home town/country: Escondido, CA
|
Post by robrj on Feb 14, 2017 3:37:58 GMT
Perhaps. You could try just connecting via USB 2.0 and see if that makes it better. I've read that some have issues connecting USB 3.0 gear to the Compute Stick via the 3.0 port; they said it can cause WiFi issues because the port is near the WiFi antenna. I've not had any issues with things like my ZWO camera or the hub connected. I think it's just the big metal tube interfering with the keyboard (8" Newtonian). If it were the USB issue, it would happen all the time. It's not as bad with the computer sitting on the mount, under the tube. The only time I've had issues there was when the tube was at an angle that put it between the keyboard and the computer.
|
|
robrj
Member
Posts: 248
home town/country: Escondido, CA
|
Post by robrj on Feb 14, 2017 3:43:11 GMT
I managed to get my computer (Surface 3) to connect to the ComputeStick using RDPWrapper 1.6. Since I needed it at both ends, I tested it both ways and can connect from either computer to the other (Stick to Surface, or Surface to Stick).
I renamed the computers to something easy to remember (AstronomyPC for my ComputeStick). I was getting an "Access Denied" error when I ran RDPCheck, but when I renamed the computer and connected to it in spite of the failed check, it worked fine.
|
|
|
Post by ChrisV on Feb 14, 2017 5:50:02 GMT
Interesting about the WiFi issue with USB 3. Must look that up.
Trouble is, my cameras do much better with USB 3 than 2. But it's early days and I'm still playing around with things. I'm sure it gets worse when my son is in full online gaming mode, slamming the WiFi!!!
|
|
robrj
Member
Posts: 248
home town/country: Escondido, CA
|
Post by robrj on Feb 14, 2017 17:59:18 GMT
If you think the WiFi signal itself is the bottleneck, and not the router, and you have an outlet near your setup, you could try one of those powerline WiFi adapters (not a booster). It turns your electrical wiring into a network: one end plugs into an outlet and connects directly to your router, and the other plugs into an outlet near your setup and acts as a WiFi access point. You can name it as a separate network and connect to that for astronomy, or give it the same SSID as your home wireless to expand your existing network. I put one in my house to boost WiFi reception to all areas of the house and the front patio where I set up. They're not suitable at the end of an extension cord. This is the one I use (US-style outlet plugs): www.netgear.com/home/products/networking/powerline/PLW1000.aspx
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Feb 15, 2017 22:12:15 GMT
Howie and Others:
I am still studying and enjoying Howie's YT vids. A question about processing: What appeals to me (though I have toyed with Pixinsight and my pals are all long stack guys who do wonderful work) is the Near Real Time notion. Hence, I am not keen to fuss too much in the field making darks, flats and such.
That said, it appears that DSS has the capacity, working in concert with AT, to exploit such calibration techniques. I was re-watching Howie's intro to AT, and noted that he had a library of darks that he said he captured on a rainy day. Moreover, he said DSS / AT would choose the best of the master darks and apply it. Very nice.
However, I was under the impression that darks had to be at the same TEMP as the lights. This being so, I assumed the only useful dark frames had to be shot in the field in the same session as the lights. Maybe I have that wrong.
If I created a set of darks (on my next rainy afternoon) with the same ISO, exposure, and such, without regard to TEMP, would they still work to improve the final stack?
Any wisdom appreciated!
Jack
|
|
|
Post by davy on Feb 15, 2017 22:41:28 GMT
Lol... the reason I laugh is that we have started the migration to astrophotography. Video astronomy has changed, as we all know, and most of us, including myself, do what is in my opinion a hybrid form: not quite video and not quite astrophotography, but a combination. EAA seems to cover it now, as it takes in all of electronic visual, video and crossover astrophotography.
Master darks, in my view: a general master dark added into a near-live capture, just to make the image better for visual viewing.
Whereas if you are taking darks before video capture, it is more in line with astrophotography, enhancing the image as a captured image.
Nothing wrong with either, but how do we now define the forum? I don't think we strictly do only video now. Maybe we should open a new thread and discuss the implications of the new technology and methodology of video astronomy.
|
|
|
Post by howie1 on Feb 16, 2017 0:09:11 GMT
Hi Jack. When I first started out with Mallincams, and even with my ZWO camera, I'd say 95%+ of those broadcasting live captures on NSN were using darks. So I did too. I accepted waiting several minutes for the final image to complete, even though each frame might only be 20 to 40 seconds. So when I started into DSLR and Toaster, I continued doing EAA that way. That video was from waaaay back when I started out, and sure enough, I had a library of darks and used them a few times. But I no longer do darks.

So to answer your question, Jack: if doing astrophotography (AP), then temp is important, but then so is the signal/amount of data (looooong exposures!) and care and attention to detail like alignment, guiding, flats, bias frames and so on. Some folk carefully check the stars in the final image against numerous sources, looking for whether a hot-pixel correction via a dark frame has covered over one of the stars in the field of view. So, is that what we are doing in EAA/NRT? I think the answer is obviously no, it is not. We are shooting much shorter exposures and just trying to get a decent image to look at out in the field at night. In my experience so far with EAA, I cannot tell the difference between applying darks from a library and darks shot at night at the same temp.

WARNING: if you do darks, they have to be in a RAW file format, and Toaster and DSS take much longer to process RAW files. Secondly, all your shots of the objects (the lights) also have to be in RAW format, so DSS/Toaster take even longer to process, as the darks have to be applied to the lights. That is why I now shoot Small Fine JPEGs 100% of the time and don't bother with darks: fast capture and quick processing. That last live desktop HD capture I did is JPEG Small Fine format. See below. Watch it on a big widescreen HD TV and tell me you can see so many hot pixels that you simply couldn't bear to do EAA out in the field at night!
LOL! cheers Howie (the Aussie one)
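For anyone wondering what "applying a dark from a library" amounts to, here's a rough sketch in Python with numpy. This is only an illustration of the general technique (median-combine the darks into a master dark, then subtract it from each light), not what DSS or AstroToaster literally do internally, and the frame values are made up:

```python
import numpy as np

def master_dark(darks):
    # Median-combine a stack of dark frames into one master dark.
    return np.median(np.stack(darks), axis=0)

def subtract_dark(light, master):
    # Subtract the master dark, clipping so no pixel goes negative.
    return np.clip(light.astype(np.int32) - master.astype(np.int32), 0, None)

# Toy 2x2 frames: one hot pixel at (0, 0) that adds 50 ADU of dark current.
darks = [np.array([[50, 5], [5, 5]]) for _ in range(10)]
light = np.array([[150, 105], [105, 105]])   # uniform sky of 100 + dark current
calibrated = subtract_dark(light, master_dark(darks))
print(calibrated)   # every pixel comes out at the sky value, 100
```

The median combine is why a library of many darks beats a single dark: random read noise averages out while the fixed hot pixels stay put.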
|
|
|
Post by howie1 on Feb 16, 2017 1:10:20 GMT
Davy, I started out with Mallincams, which were kinda the ultimate in VA (underline Video Astronomy) and in a way kick-started the whole NRT thing (IMHO). 95% of those broadcasting were using Miloslick and darks and guiding. Yet it has always made me laugh that it still seems to be called VA. What is the difference?

1. A "true" VA (video) cam needs both power and signal cables. It takes heaps of fps but then "integrates" those frames into up to a 60-second (my VSS+ exposure limit) "single" frame, and later models can apply dark frames. But IMPORTANTLY, both those steps are done IN-camera, using software/firmware programmed by some clever programmer dude. It outputs an analogue feed to a monitor, which needs both power and signal cables for viewing out in the field.

2. A non-"VA" cam (CCD, DSLR or CMOS imager) needs one power and one video cable. It takes one single longer-exposure frame which you can stack/integrate, and you can take darks to apply. But IMPORTANTLY, both those steps are done OUT-of-camera (on the laptop), using software/firmware programmed by some clever programmer dude. You view on the laptop screen, which has internal battery power for viewing out in the field.

Personally, the only distinction I see is that one does the processing in-camera and the other out-of-camera. Aha, says someone, but with a 'true' video cam I can be on M42 and move the mount with the arrow buttons and literally "see" M42 move jerkily "live" across the screen. Ok, I can set ISO 12,800 and a single 2-sec shot and, without using Toaster, "see" M42 move on the back LCD screen of the camera. But more importantly, are those jerky, fuzzy images what I want to see when I replace the eyepiece of my telescope with an electronic device? Nope. They sure aren't!
If I do outreach with the kids from my daughter's (teacher) school, or at local astro club outings, or within my unit complex, are those jerky, fuzzy images what they want to see? Or to put it another way, do they care that the image can actually be moved across the monitor "live"? Nope. Not at all. So maybe VA is a defunct term, no longer applicable? Cos all the technologies 'integrate' and can apply darks; one just (maybe) does it all internally while the other uses an external processor. Hmmm, I've just come to the conclusion that while I love the friendship and folk and openness of the VA forum... maybe I am actually on the wrong forum? Can you change the name to EAA forum?
|
|
|
Post by ChrisV on Feb 16, 2017 2:37:23 GMT
+1. Look at Howie's shots without darks/flats.
DARKS. Why don't you take some darks at a similar temp and the same ISO/exposure time and see what it looks like? If you only see some hot pixels, who cares. I haven't seen much of a difference with darks on my 550D. In fact, if you take them at a different temp it can make things worse. I do take darks when I use my 224MC, as it has huge amp glow which shows up really quickly when you start stacking/stretching. The DSLRs don't do this very much.
FLATS. If you've got a small sensor you won't have vignetting, so they're not needed, but they do clear up dust bunnies! The easy thing about flats is that they're not temperature dependent, so you can keep a library. BUT they change with focus, with what's in your optical train and even with camera orientation. I've been using them with my ASI071 (in SharpCap); it's just 10 x 0.1-sec shots with the scope pointed at the sky in daylight before I start a session. I've used them in AT with my DSLR when I shoot dimmer objects, as the vignetting shows up when you stretch the image. But again, not needed for bright objects. And Howie's shots look way better than mine without flats.
And as Howie says, you'll have to use RAWs, which is slower and disk-space intensive. So basically, ignore my rave: don't use flats/darks and see how it goes.
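For anyone curious what darks and flats actually do to a frame, here's a minimal numpy sketch of the classic (light - dark) / flat correction. The frames and values are toy numbers for illustration, not any particular program's pipeline:

```python
import numpy as np

def calibrate(light, dark, flat):
    # Classic calibration: (light - dark) divided by the normalized flat.
    # The dark removes hot pixels/amp glow; the flat removes vignetting
    # and dust shadows by dividing out the optical attenuation pattern.
    flat = flat.astype(np.float64)
    flat = flat / flat.mean()            # normalize so the flat averages 1.0
    return np.clip((light - dark) / flat, 0, None)

# Toy 2x2 frame: the (0, 0) corner is vignetted and passes half the light.
dark = np.full((2, 2), 10.0)
flat = np.array([[0.5, 1.0], [1.0, 1.0]])
light = dark + 100.0 * flat              # uniform sky of 100, dimmed by vignetting
calibrated = calibrate(light, dark, flat)
# After calibration the sky is uniform again: the vignetted corner is restored.
```

This also shows why flats aren't temperature dependent: the flat enters as a ratio (how much light each pixel receives), not as an additive thermal signal like the dark.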
|
|
|
Post by ChrisV on Feb 16, 2017 3:12:03 GMT
Davy, I didn't look closely, but in the post above it looks like Jack has AT and BYEOS (or EOS Utilities) running on his PC stick. Isn't that what you wanted? Chris
|
|
|
Post by davy on Feb 16, 2017 6:46:34 GMT
Yes, I've been looking at posts both here and on Cloudy Nights about the PC stick.
|
|
robrj
Member
Posts: 248
home town/country: Escondido, CA
|
Post by robrj on Feb 16, 2017 14:43:13 GMT
I do limited darks with Starlight Live. I usually just take 10 darks before I start my session, using what I think will be the longest exposure. I don't fuss too much about matching the exposure time to every shot. I use a filter wheel and have a dark blank in one of the slots. So, for instance, if I think I'm going to shoot 30-second exposures, I'll take 10 darks of 30 seconds. Then for the rest of the night I use those, even if I drop the exposure down to 10 seconds. I'm mostly doing it to remove hot pixels. When you apply a dark that is too long, the hot pixels come out darker than the surrounding sky. That would be obvious in an astrophotography shot, but it doesn't really affect the view much in video astronomy. I don't do flats, because SLL doesn't have that capability yet and it would require some kind of light box (another thing to carry out). My cameras aren't very noisy even without darks, though. I've done plenty of sessions without them.
Sharpcap 2.10 beta has the ability to take darks and flats.
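Rob's point about a too-long dark leaving hot pixels darker than the sky falls out of simple arithmetic, assuming dark current grows roughly linearly with exposure time. A toy sketch (the SKY and HOT_RATE numbers are made up for illustration):

```python
import numpy as np

SKY = 100.0       # signal a normal pixel records from the sky in the light frame
HOT_RATE = 3.0    # extra ADU per second accumulated by one hot pixel

def dark_frame(exposure_s):
    # Toy 2x2 dark frame: one hot pixel at (0, 0), everything else clean.
    d = np.zeros((2, 2))
    d[0, 0] = HOT_RATE * exposure_s
    return d

light_10s = SKY + dark_frame(10)         # a 10 s light: sky plus dark current
matched  = light_10s - dark_frame(10)    # matched dark: hot pixel back at sky level
too_long = light_10s - dark_frame(30)    # 30 s dark on a 10 s light: overcorrected
# matched[0, 0] equals the sky; too_long[0, 0] ends up darker than the sky,
# which is the dark-speckle effect Rob describes.
```

In an EAA view a few dark speckles are easy to ignore, which is why reusing the longest-exposure darks all night is a reasonable shortcut.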
|
|
|
Post by davy on Feb 16, 2017 18:16:53 GMT
Talking about darks: does anyone who uses a filter wheel black out a section for taking darks?
|
|
robrj
Member
Posts: 248
home town/country: Escondido, CA
|
Post by robrj on Feb 16, 2017 20:00:57 GMT
I mentioned it above. Orion sells a "Dark Filter" blank ($20). ScopeStuff has one as well ($17). It just screws in like a filter. I bought the ScopeStuff one. It's thick but still fits in my filter wheel. Martin Meredith mentioned on SGL that he just glued a piece of flocking to an unused cheapie filter and uses that for darks. I don't know if the ScopeStuff one would have fit in my cheapie filter wheel, as the Orion filters were a tight fit and the ScopeStuff filter is thicker than those.
|
|
|
Post by davy on Feb 16, 2017 21:45:46 GMT
Was just one of those things that came into my head lol..
|
|
|
Post by howie1 on Feb 16, 2017 23:19:22 GMT
So the next thing, Robert, is: can you try something out for me? I emailed the idea to a US filter/optics manufacturer, but they replied saying it wouldn't work because a Bahtinov mask works at the objective end, not the EP/focuser end, of the light train. I don't get their answer, though: a filter wheel is down at the EP/focuser end of the light train, and that works!?
Anyway my idea was ...
Take a clear filter which you can "donate" to the cause. Cut a piece of flyscreen mesh into a circle that slips onto the clear filter, press it down flat, and put a few dabs of superglue on a few strands at the extreme edges so the mesh is held to the metal rim inside the filter. Bingo: you have a mini Bahtinov mask. In a filter wheel you can then get focus at any time, simply by rotating it into place.
I know flyscreen mesh works as a Bahtinov mask, cos I've been making them for a couple of years now. My YouTube channel has a couple of videos showing how to make them, and another video showing how to get around the problem of focusing with a camera mounted in a Hyperstar'd SCT (by placing a small mask on just the smaller annulus area between the Hyperstar and the edge of the SCT). People with a Hyperstar usually buy another mask, cut it and glue velcro strips to it so they can open it when slipping it around the Hyperstar, then close it down over the annulus. But you don't need the mask right around the annulus: so long as the light passes straight edges somewhere, it will throw a diffraction spike. It doesn't have to be centred or cover the whole aperture.
So, are you able to do that test with your filter wheel and a donor clear filter, Robert?
cheers Howie (the aussie one)
|
|