A requirement of the data receipt pipeline is the ability to process DDF products delivered on tape in addition to the normal network delivery mechanism. This capability is intended for short-term emergencies in which network connectivity is unavailable.
Inputs
DDF products that have been extracted from tape and placed under a single directory. The extracted files come in pairs with a standardized naming convention:
hst_????_lz_????-??-?????-??-???_v01.dat1
hst_????_lz_????-??-?????-??-???_v01.sfdu
Outputs
Renamed pairs of DDF product files of the form:
ql_????????????????????.raw
ql_????????????????????.dan
appropriate for input to the pi_swap task.
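The per-pair renaming step can be sketched as follows. This is a minimal POSIX-shell sketch, not the actual tape2ql.csh: the 20-character ql identifier (which the real script derives from the product itself), the .dat1-to-.raw / .sfdu-to-.dan pairing, and the dummy input names are all assumptions here, and OSF creation is omitted.

```shell
#!/bin/sh
# Sketch of the tape2ql renaming step (hypothetical; illustration only).

# Create one dummy DDF pair to operate on, loosely following the
# documented hst_*_v01.{dat1,sfdu} convention.
touch hst_0001_lz_1994-09-11T17-16-56_v01.dat1
touch hst_0001_lz_1994-09-11T17-16-56_v01.sfdu

for sfdu in hst_*_v01.sfdu; do
    base=${sfdu%.sfdu}
    dat1=$base.dat1
    # Each product must arrive as a complete .dat1/.sfdu pair.
    [ -f "$dat1" ] || { echo "missing .dat1 partner for $sfdu" >&2; continue; }

    # Placeholder identifier: the real script derives a 20-character id
    # (e.g. a2c419940911t171656z) from the product; hard-coded here.
    id=a2c419940911t171656z

    # Assumed pairing: .dat1 becomes the .raw file, .sfdu the .dan file.
    cp "$dat1" "ql_$id.raw"
    cp "$sfdu" "ql_$id.dan"
    echo "...creating ql_$id.raw"
    echo "...creating ql_$id.dan"
done
```

The resulting ql_*.raw / ql_*.dan pair is what pi_swap expects as input.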
opus_login.csh must be sourced first to set up the appropriate environment variables for OPUS.
Two examples of using the script are provided below. In both cases, the DDF products were extracted from tape to the current working directory, and the script was then run from that directory. Example 1 demonstrates a successful run for a single pair of DDF products. Example 2 demonstrates the same operation, but this time OSF creation fails, illustrating the type of error reporting to expect from the script.
Example 1 (no errors):

% tape2ql.csh blue.path .
OPUS_DDF_DIR: Found 1 input datasets in ..
...creating ql_a2c419940911t171656z.raw
...creating ql_a2c419940911t171656z.dan
...creating OSF
Processing complete.

Example 2 (errors):

% tape2ql.csh red.path .
OPUS_DDF_DIR: Found 1 input datasets in ..
...creating ql_a2c419940911t171656z.raw
...creating ql_a2c419940911t171656z.dan
...creating OSF
Unable to set SW to w
Processing complete.
There were 1 errors:
...failed copies to .raw file:
...failed copies to .dan file:
...failed OSF creations: ql_a2c419940911t171656z