AO telemetry¶
How to take telemetry¶
The RTC can save a wide variety of AO telemetry types. There are several different ways to command the RTC to get this data.
Common single measurements¶
If you want just a single measurement type, there are several functions in
the ultimate/ directory. These all take an average over a short period
and return one data type. In particular use:
phase = ult_get_phase()
centroids = ult_get_cents()
intensity = ult_get_intensity()
references = ult_get_refs()
rot_and_focus = ult_get_rotfoc()
Arbitrary data for up to six seconds¶
The master function here is rtc_retrieve_dumpdata. This function takes a large number of arguments for the data types. It lets you specify several options for the data as a function in time, including an average of all measurements, only the last measurement, data at the full frame rate, and data decimated from the full frame rate. See the documentation at the link above for all the types.
res = rtc_retrieve_dumpdata(centroids=centroids, /grablast) ;; most recent measurement
res = rtc_retrieve_dumpdata(centroids=centroids, /fullbuffer, interval=3) ;; 3 seconds, all measurements
Standard telemetry dump for GPIES¶
The function ult_telem is the top-level code that gets all the data that you want. Call this simply as
ult_telem()
This does several things
- Takes data for a fixed set of data types, as well as storing configuration parameters
- Takes this data for 22 seconds (by issuing several 5.5 second dumps that are synchronized)
- Archives all the measurements in multiple fits files. See details in next section about where this is saved to.
- Extracts current GPIES settings for storage in the header.
- Prints out a _When_201X.XX.XX_XX.XX.XX tag that identifies the saved telemetry
How to access and analyze telemetry¶
Every telem set is identified in its file name with a “When” tag that marks
the time it was taken. This format is _When_YYYY.MM.DD_HH.MM.SS, where
any fields with a single digit are not padded with a leading 0, e.g.
_When_2015.1.30_23.9.21.
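For reference, a tag in this format can be parsed programmatically; a minimal sketch in Python (parse_when_tag is a hypothetical helper, not part of gpilib):

```python
from datetime import datetime

def parse_when_tag(tag):
    """Parse an unpadded _When_YYYY.MM.DD_HH.MM.SS tag into a datetime."""
    # Strip the leading "_When_" prefix, then split the date and time parts.
    body = tag[len("_When_"):]
    date_part, time_part = body.split("_")
    year, month, day = (int(x) for x in date_part.split("."))
    hour, minute, second = (int(x) for x in time_part.split("."))
    return datetime(year, month, day, hour, minute, second)

print(parse_when_tag("_When_2015.1.30_23.9.21"))
# 2015-01-30 23:09:21
```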
To have access to the data, you have several options:
- Log in on the VPN and use cpogpi03 - this only has the most recent data.
- Sync with Dropbox to have a complete copy of the archive (warning - this is nearly 1.5 TB and counting).
- Sync a subset manually from one of the master sources, such as palan. Get an account on palan.astro.berkeley.edu and rsync all the files over to your own local copy.
To generically load in the telemetry files, we recommend using only the function load_telem, as
load_telem, '_When_201X.XX.XX_XX.XX.XX'
Warning
The gpilib code is set up to assume that the AO telemetry is in a specific location. This does not necessarily match with what is on Dropbox. Pay attention below to the directories and locations!
Data at Gemini¶
AO telemetry on the Gemini VMs used to be stored in
gpilib/private/psds/
This was actually a link to a separate file system. All standard AO telemetry
files (i.e. what you get with ult_telem) are now stored in
gpilib/aotelem/Raw/YYYYMMDD/
Data on any computer using gpilib to read and reduce data¶
AO telemetry used to be stored in
gpilib/private/psds/
All standard AO telemetry
files (i.e. what you get with ult_telem) are now stored in
gpilib/aotelem/Raw/YYYYMMDD/
All Reduced data products will be stored in
gpilib/aotelem/Reduced/YYYYMMDD/
Warning
If you are using the manual sync option (not Dropbox), Lisa has a perl script that will reorganize all your data. Email her to get a copy.
Data on Dropbox and synced computers (e.g. palan)¶
AO telemetry used to be stored in
Dropbox/GPI_Telemetry/gpilib_private/psds/
The new folders are
Dropbox/GPI_Telemetry/Raw/YYYYMMDD
Dropbox/GPI_Telemetry/Reduced/YYYYMMDD
Warning
If you are using Dropbox to sync your data it’s your responsibility to
make sure that the synced folders map correctly, that is, that
Dropbox/GPI_Telemetry/Raw/ is matched to your local
gpilib/aotelem/Raw/.
Proper software calls to get paths¶
All access to the file paths for loading and saving AO telemetry is now
handled by a new function ultimate/ult_get_telem_path.pro.
This function takes a _When_ tag and returns the
path to either the raw or reduced data. If you are saving data
(either when taking fresh telemetry at Gemini or for a reduced data product)
this function will automatically create the YYYYMMDD directory for you if needed.
To get the path to load in a Raw file use the following. This returns everything except the suffix (e.g. phase.fits), since that varies.
path = ult_get_telem_path('_When_2015.11.6_4.43.46')
To get the path to load in a Reduced file use the following.
path = ult_get_telem_path('_When_2015.11.6_4.43.46', /red)
To get the path to save a Reduced file, creating the directory if necessary, do
path = ult_get_telem_path('_When_2015.11.6_4.43.46', /red, /query)
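The directory layout above can be summarized as a small path helper. The following is a hypothetical Python analogue of ult_get_telem_path, shown only to make the mapping explicit; the real function is the IDL one in ultimate/:

```python
import os

def telem_path(when_tag, root="gpilib/aotelem", reduced=False, create=False):
    """Map a _When_ tag to the Raw or Reduced YYYYMMDD directory (sketch)."""
    date_part = when_tag[len("_When_"):].split("_")[0]   # e.g. "2015.11.6"
    year, month, day = (int(x) for x in date_part.split("."))
    subdir = "Reduced" if reduced else "Raw"
    path = os.path.join(root, subdir, "%04d%02d%02d" % (year, month, day))
    if create:
        os.makedirs(path, exist_ok=True)   # mirrors the /query behavior
    return path

print(telem_path("_When_2015.11.6_4.43.46"))
# gpilib/aotelem/Raw/20151106 (on POSIX)
```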
Integration with web thingie¶
Outline - DB¶
The When tag is supplemented in AORAW _info files with the WHENSTR keyword, which is of the format YYYYMMDDHHMMSS. This does have zero-padding, e.g. 20150130230921. This keyword is either in the header already or autogenerated by the DB ingester.
Webthingie’s DB has five new tables:
- AORAW: info on each telem “set” overall (i.e. _info)
- AORAW_FILES: stores all the files that comprise the sets (i.e. _phase)
- AORED: info on every unique reduced product (e.g. aored_When_...lapeb.fits)
- AORAW2RED: many to many mapping of AORAW to AORED
- AO2IFS: many to many mapping of AORAW to RawDataProducts
These can be queried directly by table name, e.g.
SELECT whenstr from AORAW
Let’s say that you want to get all the names of telem files from a specific When tag. There are two ways to do this.
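The zero-padded WHENSTR can be derived from an unpadded When tag; a minimal Python sketch (when_to_whenstr is a hypothetical helper, not part of the DB ingester):

```python
def when_to_whenstr(tag):
    """Convert an unpadded _When_ tag to the zero-padded WHENSTR form."""
    date_part, time_part = tag[len("_When_"):].split("_")
    fields = [int(x) for x in date_part.split(".") + time_part.split(".")]
    return "%04d%02d%02d%02d%02d%02d" % tuple(fields)

print(when_to_whenstr("_When_2015.1.30_23.9.21"))
# 20150130230921
```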
First, you can just query on the datafile name with the tag, e.g.
SELECT datafile from AORAW_FILES where datafile like "%_When_2015.1.30_23.9.21%"
Second, you can use the linking UID/RAWID combo with a join.
SELECT datafile from AORAW_FILES left join AORAW on AORAW.UID = AORAW_FILES.RAWID where whenstr like "20150130230921"
To get a list of all AORAW sets, sorted by date, do
SELECT AORAW_FILES.datafile from AORAW left join AORAW_FILES on AORAW.UID = AORAW_FILES.RAWID where datafile like "%_phase" order by WHENSTR
If you want to get a list of all of the telem sets that have not yet been analyzed by the error budget code, filter on there not being an entry in the AORAW2RED table. (This query style will break as new reduced types are added into the table. LAP needs to improve the logic.) You need to filter on the object name. (The object name will be null or Zenith when we are not observing a star. Otherwise it is the target name.) You can optionally add a filter for a certain date range by using whenstr.
SELECT AORAW_FILES.datafile, objname from AORAW left join AORAW_FILES on AORAW.UID = AORAW_FILES.RAWID where AORAW.UID not in (SELECT RAWID from AORAW2RED) and objname not like 'Zenith' and datafile like "%_phase" and cast(WHENSTR as unsigned) > 20150129120000 order by WHENSTR
You can explore relationships between reduced and raw AO data, and between AO data and IFS data easily with joins.
The ingestion script automatically copies any keywords from AORAW_FILES into the linked AORAW entry, so you don’t need a join for that. However, none of those keywords are stored for the AORAW_FILES. So to get the SAMPLES keyword for a specific telemetry file in AORAW_FILES, do the following:
SELECT samples from AORAW left join AORAW_FILES on AORAW.UID = AORAW_FILES.RAWID where datafile like "%_When_2015.1.30_23.9.21_phase"
The AORAW and AORED are linked when the AORED is ingested by examining specific keywords in the header of the AORED file. To get information from the AORAW file that’s relevant to the AORED file, use the linker table. For example, to retrieve a listing of all lap_eb AO reduced products with a list of star names (OBJNAME), do the following:
SELECT AORED.datafile, objname from AORAW inner join AORAW2RED on AORAW.UID = AORAW2RED.RAWID inner join AORED on AORAW2RED.REDID = AORED.UID where AORED.datafile like "%lap_eb"
The AORAW and IFS images can be linked (see section below on how this is calculated). This is queried using the AO2IFS table. For example, to get a list of all lap_eb reduced AO products that have an associated IFS image that is not marked bad, do the following:
SELECT AORAW.whenstr, RawDataProducts.datafile, OBSNOTES.markedbad from RawDataProducts left join OBSNOTES on RawDataProducts.UID = OBSNOTES.UID inner join AO2IFS on RawDataProducts.UID = AO2IFS.IFSID inner join AORAW on AO2IFS.AOID = AORAW.UID where OBSNOTES.markedbad is NULL
The AORED and IFS images can be linked (see section below on how this is calculated). This is queried using the AO2IFS table. For example, to get a list of all lap_eb reduced AO products that have an IFS image associated with them (which could be good or bad), do the following:
SELECT AORED.datafile, RawDataProducts.datafile from RawDataProducts inner join AO2IFS on RawDataProducts.UID = AO2IFS.IFSID inner join AORAW on AO2IFS.AOID = AORAW.UID inner join AORAW2RED on AORAW.UID = AORAW2RED.RAWID inner join AORED on AORAW2RED.REDID = AORED.UID where AORED.datafile like "%lap_eb"
To generate a list of previous performance on a specific target, using parameters from the error budget code, do
SELECT ReducedDataProducts.CONTR025, whenstr, AORED.WFE_BAND
from ReducedDataProducts
inner join Raw2Reduced on ReducedDataProducts.UID = Raw2Reduced.REDID
inner join RawDataProducts on Raw2Reduced.RAWID = RawDataProducts.UID
left join OBSNOTES on RawDataProducts.UID = OBSNOTES.UID
inner join AO2IFS on RawDataProducts.UID = AO2IFS.IFSID
inner join AORAW on AO2IFS.AOID = AORAW.UID
inner join AORAW2RED on AORAW.UID = AORAW2RED.RAWID
inner join AORED on AORAW2RED.REDID = AORED.UID
left join AORAW_FILES on AORAW.UID = AORAW_FILES.RAWID
where AORED.datafile like "%lap_eb"
and AORAW_FILES.datafile like "%phase"
and ReducedDataProducts.datafile like "%spdc_distorcorr.fits"
and ReducedDataProducts.datapath like "%Spec%"
and OBSNOTES.markedbad is NULL
and AORAW.objname like "%c Eri%"
order by whenstr
To generate the data for a plot of IFS contrast at 250 mas with AO bandwidth error, ignoring measurements that have been marked bad, do the following. Since there may be multiple reduced versions, choose the most recent one from the data cruncher by the directory name.
SELECT ReducedDataProducts.CONTR025, ReducedDataProducts.datafile, AORAW_FILES.datafile, AORED.WFE_BAND
from ReducedDataProducts
inner join Raw2Reduced on ReducedDataProducts.UID = Raw2Reduced.REDID
inner join RawDataProducts on Raw2Reduced.RAWID = RawDataProducts.UID
left join OBSNOTES on RawDataProducts.UID = OBSNOTES.UID
inner join AO2IFS on RawDataProducts.UID = AO2IFS.IFSID
inner join AORAW on AO2IFS.AOID = AORAW.UID
inner join AORAW2RED on AORAW.UID = AORAW2RED.RAWID
inner join AORED on AORAW2RED.REDID = AORED.UID
left join AORAW_FILES on AORAW.UID = AORAW_FILES.RAWID
where AORED.datafile like "%lap_eb"
and AORAW_FILES.datafile like "%phase"
and ReducedDataProducts.datafile like "%spdc_distorcorr.fits"
and ReducedDataProducts.datapath like "%Spec%"
and OBSNOTES.markedbad is NULL
Outline - AORAW¶
- AORAW products (the 18 files generated by ult_telem) will now have a new file (which will be automatically created in the future, but for now will be retrofitted) that contains the full IFS header and information on all data files in the set.
- This file is called ugp_When_YYYY.MM.DD_HH.MM.SS_info.fits
- The header contains the keyword SAMPLES, which is the number of time samples in the full-extent set (e.g. phase, ttdata)
- The header contains keywords DTYPE1, DTYPE2, etc., which have the phase1, ttdata suffixes for all 18 files.
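As an illustration of those keywords, here is a sketch using a plain dict to stand in for the _info.fits FITS header (the suffix list and the SAMPLES value are made-up examples, not the real 18-file set):

```python
# Plain dict standing in for the _info.fits FITS header.
# Suffixes and the SAMPLES value are illustrative examples only.
suffixes = ["phase", "ttdata", "cents"]

info_header = {"SAMPLES": 11000}           # number of time samples (example)
for i, suffix in enumerate(suffixes, start=1):
    info_header["DTYPE%d" % i] = suffix    # DTYPE1, DTYPE2, ... keywords

print(info_header)
```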
Outline - AORED¶
An AO reduced data product from a particular piece of code will be a single FITS file. This file has the following requirements:
- Its name is unique
- All single-value products are stored as key/value pairs in the header.
- All array-type products are stored as extensions to the main fits file.
To facilitate finding and loading these, the header needs the following
keywords:
- NUMEXTS, which is a positive integer giving the number of extensions
- EXT1, EXT2, etc.: for each extension, the value is a string which describes the product and can serve as a suggestion for what to load the extension in as.
To facilitate linking to original AORAW data files, the header also includes a keyword for each file used, as FILE_1, FILE_2, etc. for each AORAW file. The value is a string which is the filename (no file paths) without the .fits suffix. E.g. the header looks like:
FILE_1 = 'ugp_When_2015.1.30_23.9.21_ttdata' /AORaw file used in this analysis
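Given a FILE_n value of that form, the originating When tag can be recovered by stripping the ugp prefix and the data-type suffix; a Python sketch (extract_when is a hypothetical helper, not part of the DB code):

```python
def extract_when(file_value):
    """Pull the _When_... tag out of a FILE_n header value (sketch)."""
    # Values look like "ugp_When_2015.1.30_23.9.21_ttdata" (no path, no .fits).
    start = file_value.index("_When_")
    end = file_value.rindex("_")     # strip the trailing data-type suffix
    return file_value[start:end]

print(extract_when("ugp_When_2015.1.30_23.9.21_ttdata"))
# _When_2015.1.30_23.9.21
```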
We also recommend that the header contain information on how the products were generated. Approved keywords for that are:
CODE = 'webthingie_eb.pro' /name of IDL code used to reduce the data
A_UTDATE= '2015-11-19' /Analysis of data - date conducted (UT)
A_UTTIME= '21:42:23.000' /Analysis of data - time conducted (UT)
AUTHOR = 'Lisa Poyneer' /Person who wrote this code
There is no need to copy any keywords from the AORAW into the AORED file. The DB will take care of all of that.
Right now the error budget analysis has set the keywords in the AORED schema. If you do a new type of analysis that produces new single-element parameters, you can add in new keywords to the schema. This will require that you email Dmitry a list of the parameters, in the following format:
CTPERSUB DOUBLE COMMENT 'Counts (DN) per subaperture on WFS CCD per frame'
NOLQG INT COMMENT 'if 1, not using LQG'
CGAINADJ DOUBLE COMMENT 'Multiplier to get right centroid gain (provision'
S_UTDATE DATE COMMENT 'Start of telemetry - date (UT)'
Outline - matching AORAW to IFSRAW¶
- Previously the matching of AO telem sets to IFS was cobbled together.
- Now we have a single python script which sends MySQL queries and matches files.
- If a match is found, those files are linked in the AORAW2IFS table.
Python notes¶
Get the gpi codebase from the seti SVN repo as gpi/database:
svn checkout https://repos.seti.org/gpi/database --username username1
For your python installation with macports, do selfupdate, then grab
sudo port install py27-mysql and any other packages that you don’t have.
A simple query looks like
import numpy as np
import datetime
from dateutil import parser
import gpifilesdb
# in a terminal you have already opened a tunnel with the command
# ssh username@atuin.coecis.cornell.edu -L 3306:localhost:3306
db = gpifilesdb.GPIFilesDB(server='127.0.0.1', username='username')
aoraw_tag = "_When_2015.1.30_23.9.21"
aoraw_datafile = "ugp"+aoraw_tag+"_info"
aorawquery1 = "SELECT DATEOBS from AORAW where datafile like '%s'"%(aoraw_datafile)
savedatera = db._do_query(aorawquery1)
savedate = savedatera[0][0]
aorawquery2 = "SELECT UTSTART from AORAW where datafile like '%s'"%(aoraw_datafile)
savetimera = db._do_query(aorawquery2)
savetime = savetimera[0][0]
uttime = parser.parse(savedate + ' ' + savetime)
aorawquery4 = "SELECT SAMPLES from AORAW where datafile like '%s'"%(aoraw_datafile)
samplesra = db._do_query(aorawquery4)
samples = samplesra[0][0]
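The three round trips above could be collapsed into a single query. A sketch, assuming the same gpifilesdb wrapper as in the example (only the query string is built here, so it runs without a live DB connection):

```python
# Sketch: fetch all three keywords in one round trip instead of three.
# Assumes the same gpifilesdb wrapper as above; only the query string is
# constructed here, so no live DB connection is needed.
aoraw_tag = "_When_2015.1.30_23.9.21"
aoraw_datafile = "ugp" + aoraw_tag + "_info"
query = ("SELECT DATEOBS, UTSTART, SAMPLES from AORAW "
         "where datafile like '%s'" % aoraw_datafile)
print(query)
# then: savedate, savetime, samples = db._do_query(query)[0]
```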
Steps for testing and using new system¶
Step 0: Prototyping¶
- Done: Lisa and Dmitry experiment with a few files and figure out all of the above.
- See below: Then move on to migrating and testing.
Step 1: Code prep¶
- DONE [Lisa] Adapt local eb.pro to process the data and save the new specified products with new headers into the correct location.
- DONE [Lisa] Write stand-alone version of code that takes an AO telemetry file and queries the DB to find a match for an IFS image. Final goal is to have a code snippet in python that will be plugged into existing routines on palan.
- DONE [Lisa] Write script to retrofit all the telem dumps with the _info.fits file.
- DONE [Lisa] Edit ult_telem to save this new _info.fits file with proper new keywords in the header!
- Done [Lisa] Write perl script to move all of the files to new Raw and Red directories!
- DONE [Lisa] Edit ult_get_headervals to have a new keyword in the default header of:
  WHENSTR char 14 Repackaged when string in format YYYYMMDDHHMMSS
- Done [Lisa] Change ult_telem and associated codes to save raw data products to the new location.
- Done [Lisa] Change load_telem and associated codes to load these raw files from the correct location.
Step 2: Lisa’s computer¶
- AORAW
  - Done Create new file location and move files (perl script)
- AORED
  - Done [Lisa] Run those sets through new webthingie_eb to make reduced products.
Step 3: Small sample¶
- All categories
- DONE (syncing stopped morning of Dec 7, 2015): Notify Jeff to stop dropbox syncing
- Done Create new file locations and to move a few files over by hand.
- Done sync over reduced data that matches Raw from drop.
- Done Ingest this sample set into DB.
- Done [Lisa] Try querying the raw products through the web interface.
- Done [Lisa] Try querying the reduced products through the web interface.
- Done [Lisa] Query DB and verify that linking and getting header values is working correctly. E.g. get the AORED bandwidth error and the IFS Red contrast at 250, 400 and 800 and check if it’s right.
Step 4: Migrate all old data¶
- All categories
- Done Run script on palan to create new file locations and to move remaining files over
- Done sync over all reduced data from drop.
- Done Ingest all this into DB
- Done [Lisa] Try querying the raw products through the web interface.
- Done [Lisa] Try querying the reduced products through the web interface.
- Done [Lisa] Query DB and verify that linking and getting header values is working correctly. E.g. get the AORED bandwidth error and the IFS Red contrast at 250, 400 and 800 and check if it’s right.
- Paper verification
- Todo [Lisa] Update wrap_eb.pro to work reading in the reduced products and obtaining IFS header information (contrast, Imag) entirely from DB queries.
- Todo [Lisa] Verify the old code (local, kludged) and new code (using new data storage locations and WebThingie queries) produce the same results!
Step 5: Permanent switch to new system¶
When everything is working, switch over to saving everything to this location by default.
- Scheduled Tuesday 12/15: Test all of these changes in the real system. Commit on drop and then check out on cpogpi01: ult_telem and ult_get_headervals.
- Jan 2016 on gpicruncher [?]: Figure out how to automatically run the AO reduced products and the linking (preferably this is not Lisa minding this, but a cron job on palan or Vanessa keeping an eye on it)