The Goodman Spectroscopic Pipeline is designed to be simple to use; however, simple is not always best for everyone, so The Goodman Pipeline is also flexible.
- Getting Help.

This manual is intended to be the preferred way to get help. However, the quickest option is using the `--help` (or `-h`) flag of `redccd`, which will print the list of arguments along with a quick explanation and their default values. The same applies to `redspec`.
Prepare Data for Reduction
If you did a good job preparing and carrying out the observations, this should be an easy step. Either way, keep in mind the following:
- Remove all focus sequences.
- Remove all target acquisition or test frames.
- Using your observation’s log remove all unwanted files.
- Make sure all data have the same gain (`GAIN`) and readout noise (`RDNOISE`).
- Make sure all data have the same Region Of Interest, or ROI.
The pipeline does not modify the original files unless there are problems with FITS compliance; even so, it is never a bad idea to keep copies of your original data in a safe place.
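The homogeneity checks above can be sketched as a small consistency test. This is an illustration only: the dictionaries stand in for FITS headers you would read from disk (e.g. with `astropy.io.fits`), and the keyword names (`GAIN`, `RDNOISE`, `ROI`) are assumptions of this sketch, not guaranteed header names.

```python
# Minimal sketch: verify that all frames share the same GAIN, RDNOISE and ROI.
# The dictionaries below stand in for FITS headers read from disk.
def check_consistency(headers, keys=("GAIN", "RDNOISE", "ROI")):
    """Return (values found per key, list of keys with more than one value)."""
    found = {key: {h.get(key) for h in headers} for key in keys}
    # A key is consistent when every frame reports exactly one value.
    inconsistent = [k for k, v in found.items() if len(v) != 1]
    return found, inconsistent

headers = [
    {"GAIN": 1.48, "RDNOISE": 3.89, "ROI": "Spectroscopic 1x1"},
    {"GAIN": 1.48, "RDNOISE": 3.89, "ROI": "Spectroscopic 1x1"},
]
values, bad = check_consistency(headers)
print(bad)  # an empty list means the dataset is homogeneous
```

If `bad` is not empty, the dataset mixes observing configurations and should be split before reduction.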
Processing your 2D images

This is the first step in the reduction process; the main tasks are listed below.
- Create master bias
- Create master flats
- Apply corrections:
  - Trim image
  - Detect slit and trim out non-illuminated areas
  - Bias correction
  - Normalized flat field correction
  - Cosmic ray rejection
Some older Goodman HTS data have headers that are not FITS compliant. In such cases the headers are fixed, and that is the only modification done to the raw data.
The 2D images are initially reduced using `redccd`. You can simply move to the directory where your raw data is located and run `redccd`, though you can modify its behavior in several ways.

`redccd` will create a directory called `RED` where it will put your reduced data. If you run it again, it will prevent you from accidentally removing your already reduced data unless you use `--auto-clean`, which tells the pipeline to delete the `RED` directory and start over.
A summary of the most important command line arguments is presented below.
- `--cosmic <method>`: Lets you select the method for cosmic ray removal.
- `--debug`: Shows extended messages and plots of intermediate steps.
- `--flat-normalize <method>`: Lets you select the method for flat normalization.
- `--flat-norm-order <order>`: Sets the order of the model used for flat normalization. Default 15.
- `--ignore-bias`: Ignores the existence, or lack, of bias frames.
- `--ignore-flats`: Ignores the existence, or lack, of flat frames.
- `--raw-path <path>`: Sets the directory where the raw data is located; can be relative.
- `--red-path <path>`: Sets the directory where the reduced data will be stored. Default `RED`.
- `--saturation <saturation>`: Sets the saturation level. Flats exceeding the saturation level will be discarded. Default 65,000 ADU.
The pipeline is intended to work with both spectroscopic and imaging data, which is why the process is split in two.
Extracting the spectra

After you are done processing your 2D images, it is time to extract the spectrum into a wavelength-calibrated 1D file. The script is called `redspec`. The tasks performed are the following:
- Classifies data and matches science targets with their comparison lamps (`COMP`), if any exist.
- Identifies targets
- Extracts targets
- Saves extracted targets to 1D spectrum
- Finds wavelength solution automatically
- Linearizes data
- Saves wavelength calibrated file
First you have to move into the `RED` directory; this is a precautionary measure to avoid unintended deletion of your raw data. Then you can simply run `redspec` and the pipeline should work its magic. Since this might not be the desired behavior for every user or science case, we have implemented the set of command line arguments listed below.
- `--data-path <path>`: Folder where the data to be processed is located. Default is the current working directory.
- `--proc-path <path>`: Folder where the processed data will be stored. Default is the current working directory.
- `--search-pattern <pattern>`: Prefix for picking up files. Default `cfzsto`. See File Prefixes.
- `--extraction <method>`: Selects the extraction method. The only one implemented at the moment is `fractional`.
- `--reference-files <path>`: Folder where the reference lamps are located.
- `--debug`: Shows extended and more detailed messages, as well as plots of intermediate steps.
- `--max-targets <value>`: Maximum number of targets to detect in a single image. Default is 3.
- `--save-plots`: Saves plots as described in Plotting.
- `--plot-results`: Shows plots during execution.
The mathematical model used to define the wavelength solution is recorded in the header for record-keeping purposes, even though the data have been linearized.
Description of custom keywords

The pipeline adds several keywords to keep track of the process and, in general, to keep important information available. The following tables describe all the keywords added by The Goodman Pipeline, though not all of them are added to every image.
General Purpose Keywords

These keywords are used for record-keeping purposes, except for `GSP_FNAM`, which is used to keep track of the current file name.
| Keyword | Description |
| --- | --- |
| `GSP_ONAM` | Original file name, first read. |
| `GSP_PNAM` | Parent file name. |
| `GSP_FNAM` | Current file name. |
| `GSP_PATH` | Path from where the file was read. |
| `GSP_TECH` | Observing technique. Imaging or Spectroscopy. |
| `GSP_DATE` | Date of processing. |
| `GSP_SLIT` | Slit trim section, from the slit-illuminated area. |
| `GSP_BIAS` | Master bias file used. |
| `GSP_FLAT` | Master flat file used. |
| `GSP_NORM` | Master flat normalization method. |
| `GSP_COSM` | Cosmic ray rejection method. |
| `GSP_WRMS` | Wavelength solution RMS error. |
| `GSP_WPOI` | Number of points used to calculate the RMS error. |
| `GSP_WREJ` | Number of points rejected from the RMS error calculation. |
| `GSP_DCRR` | Reference paper for the DCR software (cosmic ray rejection). |
Non-linear wavelength solution

Since writing non-linear wavelength solutions to the headers using the FITS standard (reference) is extremely complex and not necessarily well documented, we came up with the solution of simply describing the mathematical model in the header. This keeps the data untouched while maintaining a reliable description of the wavelength solution. The current implementation can write any polynomial model; reading is implemented only for `Chebyshev1D`, which is the default model.
| Keyword | Description |
| --- | --- |
| `GSP_FUNC` | Name of the mathematical model from astropy's `modeling`. |
| `GSP_ORDR` | Order of the model used. |
| `GSP_NPIX` | Number of pixels. |
| `GSP_C000` | Value of parameter `c0`. |
| `GSP_C001` | Value of parameter `c1`. |
| `GSP_C002` | Value of parameter `c2`. |
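As a sketch of how such a header can be turned back into a wavelength solution, the snippet below rebuilds a Chebyshev model from the `GSP_*` keywords. All header values are invented for illustration, `numpy.polynomial.Chebyshev` stands in for astropy's `Chebyshev1D`, and the pixel-to-domain mapping is an assumption of this sketch, not necessarily what the pipeline uses.

```python
import numpy as np
from numpy.polynomial import Chebyshev

# Hypothetical header values; a real file would be read with astropy.io.fits.
header = {
    "GSP_FUNC": "Chebyshev1D",
    "GSP_ORDR": 2,
    "GSP_NPIX": 4060,
    "GSP_C000": 5000.0,   # c0
    "GSP_C001": 1000.0,   # c1
    "GSP_C002": 10.0,     # c2
}

order = header["GSP_ORDR"]
coeffs = [header["GSP_C%03d" % i] for i in range(order + 1)]
npix = header["GSP_NPIX"]

# Map pixel indices onto the Chebyshev domain and evaluate the model.
model = Chebyshev(coeffs, domain=[0, npix - 1])
pixels = np.arange(npix)
wavelengths = model(pixels)
print(wavelengths[0], wavelengths[-1])
```

The resulting `wavelengths` array gives one wavelength per pixel, which is exactly the information the linearization step bakes into the data.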
Every image used in a combination of images is recorded in the header of the resulting one. The order is not important, but most likely the header of the first one will be used.
The combination is made using the `combine()` method. At this moment its parameters are not user-configurable.
| Keyword | Description |
| --- | --- |
| `GSP_IC01` | First image used to create the combined one. |
| `GSP_IC02` | Second image used to create the combined one. |
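The bookkeeping above can be sketched as follows. `record_combined` is a hypothetical helper (not part of the pipeline), the file names are invented, and a plain dict stands in for a FITS header.

```python
def record_combined(header, image_names):
    """Record every input image of a combination as GSP_IC01, GSP_IC02, ...

    `header` is a plain dict standing in for a FITS header.
    """
    for i, name in enumerate(image_names, start=1):
        header["GSP_IC%02d" % i] = name
    return header

header = record_combined({}, ["flat_0001.fits", "flat_0002.fits"])
print(header["GSP_IC01"])  # flat_0001.fits
```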
The reference lamps in the library are kept non-linearized, and they also carry a record of each detected line's pixel value and its equivalent in angstroms. The following table shows a three-line lamp.
| Keyword | Description |
| --- | --- |
| `GSP_P001` | Pixel value for the first line detected. |
| `GSP_P002` | Pixel value for the second line detected. |
| `GSP_P003` | Pixel value for the third line detected. |
| `GSP_A001` | Angstrom value for the first line detected. |
| `GSP_A002` | Angstrom value for the second line detected. |
| `GSP_A003` | Angstrom value for the third line detected. |
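A minimal sketch of how such pixel-to-angstrom pairs define a wavelength solution: the line values below are invented, and `numpy.polynomial.Chebyshev.fit` stands in for the pipeline's astropy-based fitting.

```python
import numpy as np
from numpy.polynomial import Chebyshev

# Invented line list: pixel positions and their laboratory wavelengths.
pixels = np.array([402.0, 1510.0, 3001.0])      # GSP_P001..GSP_P003
angstroms = np.array([4500.1, 5460.7, 6965.4])  # GSP_A001..GSP_A003

# Three points determine a degree-2 Chebyshev exactly.
solution = Chebyshev.fit(pixels, angstroms, deg=2)

# The fit reproduces the reference lines to numerical precision.
residuals = angstroms - solution(pixels)
print(np.abs(residuals).max())
```

With more lines than coefficients, the fit becomes a least-squares solution and the residuals feed quantities like `GSP_WRMS`.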
Cosmic Ray Removal

`--cosmic <method>` has four options, but only two are actual removal methods. Different methods work differently for different binnings, so if `<method>` is set to `default` the pipeline will decide which method to use based on the binning (binning `3x3` has not been tested).
- It was already said that this method works better for binning `1x1`. More information can be found in Installing DCR. The disadvantage of this method is that it is a program written in C and it requires writing the file to disk, processing it, and reading it back again. Still, it is faster than the Astroscrappy-based method.
The parameters for running `dcr` are written in a file called `dcr.par`. A lookup table and a file generator have been implemented, but you can pass custom parameters by placing a `dcr.par` file in a different directory and pointing to it with the corresponding command line argument.
- This is the preferred method for files with binning `3x3`. It is Astroscrappy's implementation and is run with the default parameters. Future versions might include some parameter adjustment.
- Skips the cosmic ray removal process.
Asymmetric binnings have not been tested, but the pipeline only takes the dispersion axis into consideration when deciding. This does not mean that the spatial binning does not impact the performance of either method; we just don't know yet.
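As an illustration of the `default` decision, the sketch below picks a method from the dispersion-axis binning alone, as described above. Both the helper and the exact threshold are assumptions of this sketch, not the pipeline's actual code.

```python
def choose_cosmic_method(dispersion_binning):
    """Pick a cosmic-ray method from the dispersion-axis binning.

    Assumption for this sketch: dcr for unbinned data (where it is said to
    work better), the Astroscrappy-based method otherwise.
    """
    return "dcr" if dispersion_binning == 1 else "lacosmic"

print(choose_cosmic_method(1))  # dcr
print(choose_cosmic_method(3))  # lacosmic
```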
There are three possible `<method>` options for normalizing the master flats. For the methods using a model, the default order is `15`; it can be set with `--flat-norm-order <order>`.
- Calculates the mean of the image using numpy's `mean()` and divides the image by it.
- Collapses the master flat across the spatial direction, fits a `Chebyshev1D` model of order `15`, and divides the full image by this fitted model.
- Fits a `Chebyshev1D` model to every line/column (dispersion axis) and divides it by the fitted model. This method takes too long to process and has been left in the code for experimentation purposes only.
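The second method above can be sketched as follows. Synthetic data and `numpy.polynomial.Chebyshev` stand in for a real master flat and astropy's `Chebyshev1D`, and a low order is used so this tiny example stays well-conditioned.

```python
import numpy as np
from numpy.polynomial import Chebyshev

# Synthetic master flat: a smooth lamp profile along the dispersion axis
# (columns) with small pixel-to-pixel variations.
rng = np.random.default_rng(0)
ny, nx = 50, 200
profile = 1000.0 * (1.0 - 0.5 * np.linspace(-1, 1, nx) ** 2)
flat = profile[np.newaxis, :] * (1.0 + 0.01 * rng.standard_normal((ny, nx)))

# Collapse across the spatial direction (rows) ...
collapsed = flat.mean(axis=0)

# ... fit a low-order Chebyshev model to the collapsed profile ...
x = np.arange(nx)
model = Chebyshev.fit(x, collapsed, deg=4)

# ... and divide the full image by the fitted model.
normalized = flat / model(x)[np.newaxis, :]
print(float(normalized.mean()))  # close to 1.0
```

After normalization the large-scale lamp shape is gone and only the pixel-to-pixel response (near 1.0 everywhere) remains, which is what you want to divide your science frames by.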
`--extraction <method>` has two options, but only one is implemented at the moment.
- Fractional pixel extraction differs from a simple, rough extraction in how it deals with the edges of the region.
- Unfortunately this method has not been implemented yet.
There are several ways one could do this, but we chose adding prefixes to the file name because they are easy to add and also easy to filter, for instance in a terminal or in Python:

```python
import glob

file_list = glob.glob('cfzsto*fits')
```
So what do all those letters mean? Here is a table explaining them.
| Prefix | Meaning |
| --- | --- |
| `o` | Overscan correction applied |
| `t` | Trim correction applied |
| `s` | Slit trim correction applied |
| `z` | Bias correction applied |
| `f` | Flat correction applied |
| `c` | Cosmic rays removed |
| `e` | Spectrum extracted to 1D |
| `w` | 1D spectrum wavelength calibrated |
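The table above can also be used to decode a file name programmatically. The helper below is a hypothetical illustration, not part of the pipeline, and the file names are invented; it simply reads prefix letters until the first character it does not recognize.

```python
# Mapping of prefix letters to processing steps, from the table above.
STEPS = {
    "o": "overscan correction applied",
    "t": "trim correction applied",
    "s": "slit trim correction applied",
    "z": "bias correction applied",
    "f": "flat correction applied",
    "c": "cosmic rays removed",
    "e": "spectrum extracted to 1D",
    "w": "1D spectrum wavelength calibrated",
}

def decode_prefix(filename):
    """List the processing steps encoded in a file-name prefix."""
    steps = []
    for letter in filename:
        if letter not in STEPS:
            break  # stop at the first non-prefix character
        steps.append(STEPS[letter])
    return steps

print(decode_prefix("cfzsto_0001.fits"))
```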
So, for an original file, a name starting with the prefix `o` means the file has been overscan corrected, while a prefix that includes `e` but not `f` means the spectrum has been extracted to a 1D file without having been flat fielded (`f`).
Ideally, after running `redccd` the file name should carry the prefix `cfzsto`, and after running `redspec` it should additionally carry `e` (extracted) and `w` (wavelength calibrated).
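The prefix accumulation can be sketched as repeatedly prepending one letter per processing step. The step order below is inferred from the `cfzsto` prefix and is an assumption of this illustration, as is the sample file name.

```python
# Steps in the (assumed) order redccd applies them; each prepends its letter.
REDCCD_STEPS = ["o", "t", "s", "z", "f", "c"]  # overscan ... cosmic rays
REDSPEC_STEPS = ["e", "w"]                     # extraction, wavelength cal.

def apply_steps(filename, steps):
    """Prepend one prefix letter per processing step."""
    for letter in steps:
        filename = letter + filename
    return filename

name = apply_steps("file.fits", REDCCD_STEPS)
print(name)  # cfzstofile.fits
name = apply_steps(name, REDSPEC_STEPS)
print(name)  # wecfzstofile.fits
```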