Journey to Mars 2020
2020 promised to be a bumper year for planetary imaging in the UK, with the reappearance of Jupiter, Saturn and the closest pass of Mars until 2035. This is my journey from ‘not doing planetary’ to successfully imaging the planets with a few tips along the way. I don’t claim to be an expert and consider myself at the start of this journey, but my progress over a year is significant.
Strictly a deep sky kinda guy
At the start of 2020 I would have described my imaging as strictly deep sky; galaxies, nebulae and star clusters. I enjoy the challenge of pulling out the detail that the eye cannot see, and not knowing what you’ve got until you process your data the next day. However, I was missing seeing the planets and lunar craters up close. My previous dalliances with planetary imaging were with my first scope, a Meade ETX-90, holding the camera (pre-smartphone) up to the eyepiece. A pretty agricultural method that resulted in no more than indistinct reddish beachballs. To try planetary work I’d need different equipment, as my observatory-based rig is set up for galaxies and my travel rig is for wide-field star fields.
What do you need?
To get into planetary imaging there are a few essentials you’ll need, plus a few nice to haves:
Essential:
a long focal length telescope, or the ability to extend your scope’s focal length significantly (and without compromise) using a Barlow (2x, 3x, maybe 5x). I’m talking 2000-3000mm focal length and getting on for the f/20-30 range.
a sturdy tracking mount
a high frame rate camera with ROI (region of interest) image capture ability
Bahtinov mask for focussing
a computer
Processing software (mainly free, see below)
Useful extras:
filters, which depending on the glass in front of the camera sensor may include: UV/IR cut, IR pass, and LRGB colour filters if you have a monochrome camera.
ADC (atmospheric dispersion corrector)
Other things I found useful:
a red dot finder
a Skywatcher guidescope alt/az adjuster
a flip mirror and eyepiece for focussing
a standalone image capture device (ASIair)
a 10m USB 3 cable.
The Theory
I dug out about 10 years’ worth of magazines and went through them all for planetary imaging tips. I had probably never read these before as I had no need. I made a load of notes and started to form a general approach based on what the experts were saying. I also watched some YouTube videos, including an excellent talk by Damian Peach to an astronomical group in which he explains how he captures his incredible images. So in summary, the theory is:
Using a high frame rate camera, capture video files of the planetary disc. Use free software like PIPP and Autostakkert! to sort and rate the best frames, followed by stacking using the best [your choice] % selection. Take the resulting image into the free software Registax and use the Wavelet function to sharpen and denoise it. Do final processing in Photoshop (size, exposure, contrast, colour balance, saturation, sharpness, noise etc.), possibly overlaying RGB data with infrared or other data for a combined final image.
Point and shoot it ain’t! There does not appear to be a need for calibration frames (flats, darks, bias) as with other forms of astrophotography (maybe these become useful at the advanced level).
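The sort-rate-stack step can be sketched in a few lines of Python. This is only a toy illustration of the ‘lucky imaging’ idea, not what Autostakkert! actually does internally (it has its own quality metric and alignment-point stacking); the Laplacian-variance sharpness score here is just one common choice:

```python
import numpy as np

def sharpness(frame):
    """Variance of the Laplacian: higher means sharper detail.
    np.roll wraps at the edges, which is fine for a rough score."""
    lap = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
           + np.roll(frame, 1, 1) + np.roll(frame, -1, 1) - 4 * frame)
    return lap.var()

def stack_best(frames, keep_fraction=0.25):
    """Rate every frame, keep the sharpest fraction, average them."""
    scores = [sharpness(f) for f in frames]
    order = np.argsort(scores)[::-1]               # best first
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = [frames[i] for i in order[:n_keep]]
    return np.mean(best, axis=0)

# e.g. stack the best 15% of a list of 2D frame arrays:
# result = stack_best(frames, keep_fraction=0.15)
```

The trade-off is exactly the one described later for Autostakkert!: a larger `keep_fraction` averages down the noise, a smaller one keeps only the sharpest moments of seeing.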
First light
I started in March & April on the Moon and Venus, as the technique would be largely the same for Mars, Jupiter & Saturn in the autumn/winter. Venus was perfectly located in the western part of the sky for evening (or even twilight) image capture. Lunar craters were good challenges for focus and framing the camera view. I did various tests of maybe 500 or 800 frames of each object in order to make a video and select only the best 15% or so to stack together. I managed some half-decent images of Venus in UV/IR and IR and some of my best lunar detail. It was quite hard work though, and I was learning what worked and what didn’t.
The Challenges
Come the late summer, Jupiter and Saturn were just visible on my southern horizon (after I had the hedge cut). Many of my first attempts at this were failures in terms of achieving an image, but taught me a lot about the process and what I needed to do to improve the image capture workflow. In no particular order:
finding the planet in the scope is really hard if you have a long focal length and a small camera sensor (the planets are generally pretty tiny).
the controls of a Goto mount may be too jumpy for making minute adjustments at high magnification.
dealing with atmospheric disturbance when low in the sky (in summer in the UK, the ecliptic is low in the south)
suffering air turbulence and effects of the jet stream resulting in poor seeing
matching the focal length and the camera pixel scale to ensure the images are correctly sampled (this was a rabbit hole I didn’t even know existed…)
using the right filters to ensure the correct detail is captured. For Jupiter & Saturn tests I was using an IR685nm filter, which may have helped steady the seeing by removing all the colour. Twitter friends helped point me back to a simple UV/IR cut filter.
getting the image in focus (the BIG one)
operating maybe 4 new types of software
reminding yourself you are not Damian Peach.
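On the sampling rabbit hole: the arithmetic itself is short. A widely quoted rule of thumb (an assumption on my part here; recommendations vary from roughly 3.5x to 5x the pixel size depending on camera and seeing) is to aim for an f-ratio of about five times the pixel size in microns:

```python
def pixel_scale(pixel_um, focal_mm):
    """Image scale in arcseconds per pixel."""
    return 206.265 * pixel_um / focal_mm

def target_f_ratio(pixel_um, factor=5.0):
    """Rule of thumb: f-ratio of about 5x the pixel size in microns."""
    return factor * pixel_um

# ASI290MC has 2.9um pixels; the OMC 140 is 140mm aperture at f/14 native
pixel = 2.9
print(target_f_ratio(pixel))                       # 14.5 - native f/14 is close
print(pixel_scale(pixel, 140 * 14))                # ~0.31"/pixel at native
print(pixel_scale(pixel, 140 * 14 * 1.3))          # ~0.23"/pixel with 1.3x Barlow
```

By this rule the f/14 Mak is already near correctly sampled for that camera, which matches my finding that only a weak Barlow helped and a 2.25x was far too much.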
Let’s start with The Rig
In Feb 2020 I purchased a secondhand scope which would become my planetary scope: the OMC 140 Maksutov-Cassegrain. As the name suggests, it is a 140mm (5.5”) aperture f/14 Mak with 2000mm of focal length packed into a 450mm tube length. It is made of carbon fibre, therefore fairly light and suitable for use on my iEQ30 mount. Adding a Barlow lens could push this to 3000-4000mm, subject to not pushing the small aperture too far.
Portability was important, as my southern aspect at home is poor, living next to a hill. I have few locations from where a southern aspect is possible for observing. The design of the BLT Observatory does not allow for low summer observing along the ecliptic and is better suited to high deep sky objects in winter.
What may not be immediately obvious is what is between the mount and the scope. I have used a Skywatcher guidescope bracket fixed to a long Vixen dovetail bar to allow for small corrections without having to use the hand controller. Because I sometimes get backlash, and I never know if left is really left or actually up, I find the manual adjustment so much faster for centring the object on the sensor. If the camera sensor is aligned to the bracket, it gives a true up/down, left/right adjustment, which is a huge timesaver.
Using a combination of the Goto, the red dot finder and the guide scope bracket, I can now quickly centre the target.
The Image Train
There appears to be some sort of consensus over which bits go where, but it will ultimately depend on achieving back focus, and connecting everything together with the adapters you have. The image train above has undergone many configurations over the summer and this is what I think is my optimum:
OMC 140 scope with 1.25” back. Note I could connect this with a thread & adapter to the 2” back for a stronger connection, however you need to be able to rotate the ADC to locate the bubble level on top - not possible if ‘hard fixed’.
ZWO Atmospheric Dispersion Corrector (ADC). This device has two adjustable prism arms, set by trial and error, to eliminate the effects of atmospheric dispersion, which manifest as blue and yellow hues on opposite sides of the planetary disc. The ADC must be rotated to have the bubble level on top, and adjusted maybe every half hour or so. Most useful for low objects, I have found it helpful on all planetary sessions. Using over-exposed test frames helps show the blue and yellow more clearly, so keep adjusting the arms until the hues disappear.
Baader Barlow 1.3x (not visible). This is screwed into a 1.25” nosepiece on the front of the flip mirror. The Barlow unit is sold as 2.25x but can be disassembled and used at a lower strength. My tests found that 2.25x was way too much magnification, but 1.3x was good under the right conditions.
Baader Flip Mirror 2. This was my eureka moment and deals with two of the biggest challenges: getting the planet centred and getting it in focus. Using the flip mirror means you can either have the image pass straight through onto the camera, or be deflected via a 45-degree mirror up to the eyepiece, in this case a 15mm illuminated EP with crosshairs. Using a bright star, I can focus with a Bahtinov mask and check the camera and EP are parfocal using my ASIair. Then when I slew to a planet I have a good starting point for final focus tweaks (if any).
Altair Astro filter holder with removable filter tray. I think this is M48 to T2, although they make a few different versions, including a T2-T2, depending on what you need. The filter tray holds 1.25” or 2” filters, and although they might not be 100% parfocal owing to different glass thicknesses, they are close.
ZWO ASI290MC. This is an uncooled CMOS colour camera with an anti-reflective (AR) window, meaning it is sensitive to IR light. If you want IR - great! If you don’t, then you’ll need a filter like a UV/IR cut to block the IR, which gives you a colour image. If you capture the IR using something like an IR685 or IR850, the image will be washed out and monochrome.
Variations: it would be possible to have both the filter and the ADC either in front of or behind the flip mirror. However, both of these are variables in achieving focus between the camera and eyepiece, so placing the ADC in front of the flip mirror means adjusting it affects both paths equally. And because a heavy filter like a high IR pass would show little or nothing through the eyepiece, the filter is placed after the flip mirror, in front of the camera only.
Having found the target, focused and centred it in the view, I swap the cable connecting the camera to the ASIair and connect it to a computer for capturing video files.
Capture Software
Image capture is done through software like FireCapture or SharpCap. As a Mac user, FireCapture is my only option. ZWO issue their own software, ASIStudio, but I have not found it very stable when playing around in the daytime.
I do not have a laptop to run the software on, relying on a long cable to an internal computer - this limits where I can set up and is far from ideal. I cannot justify a laptop purely for this, given I use the ASIair for deep sky images. (Update on this below…)
The image you’ll see on screen will be grainy and jumpy, depending on the quality of the seeing. Hopefully it will show some semblance of being in focus and allow you to capture some frames.
Settings
You will need to experiment with settings for your own set-up. There are many variables, but there are some basic rules of thumb that others have recommended:
If shooting in colour, select the correct Bayer matrix for your camera; mine is RGGB
Start with the defaults for each planet which helps set the basic range for exposures
Use a gain of c. 66%
For ZWO cameras use the standard gamma 50 setting and adjust the exposure time instead
I believe there is also an ADC menu to help you fine tune your ADC.
Assign a filter, as the software records this for later reference as part of the video file metadata
Use a region of interest (ROI) by drawing a rectangle around the object. Expect it to move a bit so don’t go in too close.
Select the frame length (in seconds) or number of frames to capture and record in either AVI or SER format.
The first time, I’d recommend doing lots of tests with different frame counts or durations. More frames will help reduce noise in the final stack and give you more chance of getting sharp frames to keep (hence ‘lucky imaging’). Planetary rotation should be taken into account too, so you may only want 2-3 mins per video max.
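The 2-3 minute limit comes from the planet rotating under you while you record. A rough back-of-the-envelope check (the Jupiter figures are approximate, and the acceptable smear is a judgment call):

```python
import math

def max_capture_seconds(period_hours, disc_arcsec, blur_arcsec):
    """Time before a feature at the disc centre smears by blur_arcsec.
    A centre-of-disc feature drifts fastest, at roughly omega * (disc radius)."""
    omega = 2 * math.pi / (period_hours * 3600)    # rotation rate, rad/s
    drift_rate = omega * (disc_arcsec / 2)         # arcsec/s at disc centre
    return blur_arcsec / drift_rate

# Jupiter: ~9.9 h rotation, ~45" disc; allow ~0.5" of smear
print(max_capture_seconds(9.9, 45, 0.5))           # ~126 s, about 2 minutes
```

Jupiter is the worst case, being both large and fast-rotating; Mars and Saturn give you rather longer before rotation bites.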
Processing
So you’ve shot your frames and now you’re inside with a beer and the dog has brought you your slippers. Processing begins. There are countless video tutorials on these pieces of software so I’ll simply describe the fundamentals.
PIPP (free, Windows, Mac via that wine bottler thingy…) - This allows you to take a RAW file or files and run them through a series of operations including debayering, cropping, adjusting levels, sorting by quality and exporting to new formats. I have found it useful for combining multiple .tiffs into one AVI, and for combining 2 or 3 AVI files of different frame lengths (but using the same settings) into one combined video file to take into the next piece of software.
Autostakkert! (free, Windows) - is the go-to software for stacking lunar detail or planetary discs. Having loaded in a file (of multiple frames) or multiple files, it analyses for quality, sorts, has you set alignment points and stacks the best frames. Based on the analysis graph, you determine which frames to ditch and what proportion of those left to stack. A higher percentage means a smoother, less noisy image but sacrifices sharpness of detail; a lower percentage uses only the sharpest frames but leaves more noise. Luckily you can ask it to run four options at once, say 50%, 25%, 15% and 10%, so you can choose. It also does a first pass at sharpening and can use ‘drizzle’ to enlarge the image by 1.5x or 3x.
Registax (free, Windows) - also used for lunar stacking but more commonly for its final operation, ‘Wavelets’. Loading in a single image will jump you to this menu, which presents a series of sliders to increase the sharpness at the macro (6) or micro (1) level. This is very much trial and error, though if you hit a winning formula you can save those settings for another time. Further adjustments can be made for general brightness, contrast, aligning RGB colours, rotation etc. ‘Do All’ is the command that runs all the adjustments before you save a .tiff image.
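Conceptually, the Wavelets sliders split the image into detail layers at different scales and let you boost each one independently. Here is a toy numpy version of the idea - Registax’s actual wavelet implementation differs, and the repeated box blur is just a cheap stand-in for proper scale separation:

```python
import numpy as np

def blur(img, passes):
    """Cheap separable box blur; repeated passes approximate a Gaussian."""
    out = img.astype(float)
    for _ in range(passes):
        out = (out + np.roll(out, 1, 0) + np.roll(out, -1, 0)) / 3
        out = (out + np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 3
    return out

def wavelet_sharpen(img, gains=(1.5, 1.0, 0.5)):
    """Split the image into detail layers of increasing scale, then
    re-assemble with per-layer gains - one gain per 'slider'."""
    base = img.astype(float)
    layers = []
    for k in range(len(gains)):
        smoother = blur(base, 2 ** k)     # each layer is coarser than the last
        layers.append(base - smoother)    # detail lost at this scale
        base = smoother
    out = base.copy()
    for gain, detail in zip(gains, layers):
        out += gain * detail              # gain > 1 boosts that scale
    return out
```

With every gain set to 1.0 the image reconstructs unchanged; pushing the first (finest) gain up is the equivalent of dragging slider 1, and also amplifies noise, which is why a well-stacked, low-noise input matters so much.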
Photoshop (Mac or Windows, subscription) - Final adjustments can be made here including rescaling, adjusting saturation, sharpness, noise and annotating your efforts so the world knows how much effort you put into it.
This will have taken you a full day since you were outside imaging, so it’s dark again now - get back outside and try it again!
ASIAIR Pro Video Capture
I have recently upgraded my ASIAIR to the Pro version because it can handle video capture. There are a number of menu settings (not immediately obvious) that allow for capture in AVI or MPEG to either the onboard 64GB stick or to your phone. Having worked this out (and how to remove the horrid ‘space muzak’ from the default video files), I took a number of videos on the night of the 3rd November to produce the images below. The combined image brings out the best of the steady detail from the IR filter and the colour from the UV/IR cut filter. Every time I use it I learn some more tricks, and I feel my technique is improving.