gwpy provides a nice way to access data remotely, including proprietary data. This note adds some additional steps needed to get things set up. First, you'll need some libraries:
$ conda install -c conda-forge python-ldas-tools-framecpp
$ conda install -c conda-forge python-nds2-client
$ conda install -c anaconda krb5
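A quick sanity check that the installs worked, as a sketch; the module names here are my assumption of what these packages provide (`nds2` from python-nds2-client, `LDAStools.frameCPP` from python-ldas-tools-framecpp):

```python
import importlib

# Try importing each data-access dependency and record its status.
status = {}
for name in ("nds2", "LDAStools.frameCPP", "gwpy"):
    try:
        importlib.import_module(name)
        status[name] = "OK"
    except ImportError:
        status[name] = "missing"

for name, state in status.items():
    print(f"{name}: {state}")
```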
Next, you need to initialise Kerberos for authentication. To do this, first write a configuration file:
[libdefaults]
    default_realm = LIGO.ORG
    dns_lookup_realm = true
    dns_lookup_kdc = true
    ticket_lifetime = 24h
    forwardable = yes
    kdc_timesync = 0

[domain_realm]
    ligo.org = LIGO.ORG
    .ligo.org = LIGO.ORG
    ligo.caltech.edu = LIGO.ORG
    .ligo.caltech.edu = LIGO.ORG
    ligo-wa.caltech.edu = LIGO.ORG
    .ligo-wa.caltech.edu = LIGO.ORG
    ligo-la.caltech.edu = LIGO.ORG
    .ligo-la.caltech.edu = LIGO.ORG
    phys.uwm.edu = LIGO.ORG
    .phys.uwm.edu = LIGO.ORG
(Note: I took this verbatim from the /etc/krb5.conf file on a cluster set up by LIGO Data Grid experts.) Next, point Kerberos at this file as its configuration:
$ export KRB5_CONFIG="/PATH/TO/.krb5.conf"
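The same thing can be done from inside Python before importing gwpy. Note that KRB5_CONFIG is the variable libkrb5 reads for the configuration file, while KRB5CCNAME names the credential cache, which kinit overwrites; the path below is a placeholder of my choosing:

```python
import os

# Point libkrb5 at the custom configuration file written above.
# KRB5_CONFIG selects the config file; KRB5CCNAME (a different
# variable) selects the credential cache -- don't mix them up.
os.environ.setdefault("KRB5_CONFIG", os.path.expanduser("~/.krb5.conf"))
print(os.environ["KRB5_CONFIG"])
```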
Now, you just need to initialise your credentials:
$ kinit albert.einstein@LIGO.ORG
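You can confirm a ticket was granted with klist. A small helper, sketched under the assumption that the krb5 client tools are on your PATH:

```python
import shutil
import subprocess

# Run `klist` to show the current Kerberos tickets, if the
# client tools are installed; otherwise say so.
klist = shutil.which("klist")
if klist:
    result = subprocess.run([klist], capture_output=True, text=True)
    print(result.stdout or result.stderr)
else:
    print("klist not found on PATH")
```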
After this, you should be able to access remote data using
>>> from gwpy.timeseries import TimeSeries
>>> TimeSeries.get('H1:DCH-CLEAN_STRAIN_C02', start=118700800.0, end=118700801.0, dtype='float64')
Note, I haven't given any thought to how to make things persist between sessions; this is just a first crack which seemed to work. Also, weirdly, the krb5.conf file seemed to be getting corrupted. One likely cause: if KRB5CCNAME is pointed at the configuration file, kinit will overwrite it with the credential cache; KRB5_CONFIG is the variable for the configuration file itself.
Be warned that proprietary data is proprietary, so you shouldn't use this on clusters or laptops which are unsecured.