 40m Log, Page 71 of 339
6394   Fri Mar 9 15:48:56 2012   Ryan Fisher   Summary   Computer Scripts / Programs   Alterations to base epics install for installing aLIGO conlog:

I decided to make a backup of the database and then delete it and make a new database:

cd ~/ryan/database_dumpMar92012
mysqldump -u root -p C1_conlog > C1_conlog.dump.Mar92012 
Note: it appears this failed the first time; thankfully this wasn't a production service yet. In the future, do not trust this backup method for important data!
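A quick way to catch this kind of silent failure: a successful mysqldump writes a "-- Dump completed" footer at the end of the file, so a small check script can flag a truncated or empty dump. This is a sketch I'm adding for illustration, not part of the original procedure; the file name in the comment is the one used above.

```python
import os

def dump_looks_complete(path):
    """Heuristic check that a mysqldump output file is plausibly complete:
    it must exist, be non-empty, and end with the standard
    '-- Dump completed' footer that mysqldump writes on success."""
    if not os.path.isfile(path) or os.path.getsize(path) == 0:
        return False
    with open(path, "rb") as f:
        # Only read the tail of the file; dumps can be large.
        f.seek(max(0, os.path.getsize(path) - 4096))
        tail = f.read().decode("utf-8", errors="replace")
    return "-- Dump completed" in tail

# Example: dump_looks_complete("C1_conlog.dump.Mar92012")
```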

Next, log into mysql as root, drop the database, remake it, and grant privileges again:
(This is saved in megatron:~/ryan/restore_database.txt)
megatron:~/ryan>mysql -u root -p
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 174
Server version: 5.1.41-3ubuntu12.10 (Ubuntu)

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> list databases;
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'list databases' at line 1
mysql> list users;
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'list users' at line 1
mysql> use C1_conlog
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> list users;
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'list users' at line 1
mysql> select User from mysql.user;
+------------------+
| User             |
+------------------+
| php              |
| C1_conlog_epics  |
| c1_conlog_epics  |
| root             |
| C1_conlog_epics  |
| c1_conlog_epics  |
| debian-sys-maint |
| root             |
| root             |
+------------------+
9 rows in set (0.00 sec)

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| C1_conlog          |
| mysql              |
+--------------------+
3 rows in set (0.00 sec)

mysql> drop database C1_conlog ;
Query OK, 2 rows affected (0.56 sec)

mysql> create database C1_conlog;
Query OK, 1 row affected (0.00 sec)

mysql> use C1_conlog ;
Database changed
mysql> SET SQL_MODE="NO_AUTO_VALUE_ON_ZERO";
Query OK, 0 rows affected (0.00 sec)

mysql>
mysql> CREATE TABLE channels (
->   channel_id mediumint(8) unsigned NOT NULL AUTO_INCREMENT,
->   channel_name varchar(60) NOT NULL,
->   PRIMARY KEY (channel_id),
->   UNIQUE KEY channel_name (channel_name)
-> ) ENGINE=MyISAM  DEFAULT CHARSET=latin1;
Query OK, 0 rows affected (0.04 sec)

mysql>
mysql> CREATE TABLE data (
->   acquire_time decimal(26,6) NOT NULL,
->   channel_id mediumint(8) unsigned NOT NULL,
->   value varchar(40) DEFAULT NULL,
->   status tinyint(3) unsigned DEFAULT NULL,
->   connected tinyint(1) unsigned NOT NULL,
->   PRIMARY KEY (channel_id,acquire_time)
-> ) ENGINE=MyISAM DEFAULT CHARSET=latin1;
Query OK, 0 rows affected (0.03 sec)
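For illustration, here is the same two-table schema exercised in SQLite (a stand-in for MySQL/MyISAM; the types are simplified and the sample rows are made up) together with the typical conlog-style query: the latest recorded value of a channel, which the (channel_id, acquire_time) primary key serves directly.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Simplified SQLite versions of the two tables created above.
cur.execute("""CREATE TABLE channels (
    channel_id INTEGER PRIMARY KEY AUTOINCREMENT,
    channel_name TEXT NOT NULL UNIQUE)""")
cur.execute("""CREATE TABLE data (
    acquire_time REAL NOT NULL,
    channel_id INTEGER NOT NULL,
    value TEXT,
    status INTEGER,
    connected INTEGER NOT NULL,
    PRIMARY KEY (channel_id, acquire_time))""")

# Fabricated example rows for one EPICS channel.
cur.execute("INSERT INTO channels (channel_name) VALUES ('C1:PSL-PMC_RAMP')")
cid = cur.lastrowid
for t, v in [(1000.0, '-3.2'), (1001.0, '-3.1'), (1002.0, '-3.0')]:
    cur.execute("INSERT INTO data VALUES (?, ?, ?, 0, 1)", (t, cid, v))

# Latest value for one channel: served efficiently by the
# (channel_id, acquire_time) composite primary key.
cur.execute("""SELECT value FROM data WHERE channel_id = ?
               ORDER BY acquire_time DESC LIMIT 1""", (cid,))
print(cur.fetchone()[0])  # -3.0
```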

mysql> grant select, insert, update, execute on * to 'c1_conlog_epics'@'127.0.0.1';
Query OK, 0 rows affected (0.00 sec)

mysql> grant select, insert, update, execute on * to 'C1_conlog_epics'@'127.0.0.1';
Query OK, 0 rows affected (0.00 sec)

mysql> grant select, insert, update, execute on * to 'c1_conlog_epics'@'localhost';
Query OK, 0 rows affected (0.00 sec)

mysql> grant select, insert, update, execute on * to 'C1_conlog_epics'@'localhost';
mysql> grant select, insert, update, execute on * to 'C1_conlog_epics'@'localhost';
Query OK, 0 rows affected (0.00 sec)

mysql> grant select on C1_conlog to 'php'@'%';
ERROR 1146 (42S02): Table 'C1_conlog.C1_conlog' doesn't exist
mysql> grant select on * to 'php'@'%';
Query OK, 0 rows affected (0.00 sec)

mysql> select * from mysql.users
-> ;
ERROR 1146 (42S02): Table 'mysql.users' doesn't exist
mysql> select User from mysql.user;
+------------------+
| User             |
+------------------+
| php              |
| C1_conlog_epics  |
| c1_conlog_epics  |
| c1_conlog_epics  |
| root             |
| C1_conlog_epics  |
| c1_conlog_epics  |
| debian-sys-maint |
| root             |
| root             |
+------------------+
9 rows in set (0.00 sec)

mysql> Bye



Next, I decided that I want an index on acquire_time in addition to the existing primary key on (channel_id, acquire_time) (I think it makes a lot of sense for several query types, and especially for debugging the conlog!):
mysql> create index acquire_time_index on data(acquire_time);
Query OK, 0 rows affected (0.04 sec)
Records: 0  Duplicates: 0  Warnings: 0
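The motivation can be sketched in SQLite (a stand-in for the real MySQL database, with fabricated sample rows): a pure time-window query over all channels has no leading channel_id constraint, so the composite primary key can't serve it well, while a dedicated acquire_time index can.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE data (
    acquire_time REAL NOT NULL,
    channel_id INTEGER NOT NULL,
    value TEXT,
    PRIMARY KEY (channel_id, acquire_time))""")
cur.execute("CREATE INDEX acquire_time_index ON data(acquire_time)")

# Fabricated samples from three channels, staggered in time.
rows = [(t + 0.1 * c, c, str(t)) for c in (1, 2, 3) for t in range(100)]
cur.executemany("INSERT INTO data VALUES (?, ?, ?)", rows)

# "What changed in this time window, across all channels?" --
# the query type the new acquire_time index is meant to speed up.
cur.execute("SELECT COUNT(*) FROM data WHERE acquire_time BETWEEN 10 AND 12")
print(cur.fetchone()[0])

# The planner's choice can be inspected with EXPLAIN QUERY PLAN.
cur.execute("EXPLAIN QUERY PLAN SELECT * FROM data WHERE acquire_time BETWEEN 10 AND 12")
print(cur.fetchall())
```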


## Next Fix:

The above worked well, but when I restarted the conlog, I had to re-execute "remove_channels" from the MEDM screen, because initially all channels were being loaded (use_channel_names still had all the channels).
Additionally, a lot of channels with "*RMS*" in the name were being recorded and were changing relatively quickly, so I have added those to the remove_channel_names file.
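This kind of channel triage is easy to script. A hedged sketch (the file names follow the entry above, but the glob-matching rule is my guess at the intent, not the actual conlog code):

```python
import fnmatch

def split_channels(channels, remove_patterns):
    """Partition channel names into (keep, drop) lists, dropping any
    name that matches one of the glob-style patterns (e.g. '*RMS*')."""
    keep, drop = [], []
    for name in channels:
        if any(fnmatch.fnmatch(name, p) for p in remove_patterns):
            drop.append(name)
        else:
            keep.append(name)
    return keep, drop

# Fabricated example channel names:
chans = ["C1:PEM-RMS_STS1X_0p1_0p3", "C1:PSL-PMC_RAMP"]
keep, drop = split_channels(chans, ["*RMS*"])
print(keep, drop)
```

The keep list would go into use_channel_names and the drop list into remove_channel_names.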

I am going to:

1. Back up the files in /ligo/caltech/data/conlog/c1
2. Edit use_channel_names to only have the good channels.
3. Dump the database again.
4. Stop conlog.
5. Wipe the database again.
6. Remake the database again (with permissions and the new index).
7. Restart the conlog and hope!

## The fix above seems to be in place and working. The database has the initial entries for the channels it monitors and is not growing without operators changing EPICS values.

6396   Fri Mar 9 16:28:10 2012   Ryan Fisher   Summary   Computer Scripts / Programs   Alterations to base epics install for installing aLIGO conlog:
I created a page on the wiki for the new EPICS log (conlog):
https://wiki-40m.ligo.caltech.edu/aLIGO%20EPICs%20log%20%28conlog%29

I also edited this with restart instructions:
https://wiki-40m.ligo.caltech.edu/Computer_Restart_Procedures#megatron
6399   Sat Mar 10 15:29:47 2012   Zach   HowTo   Computer Scripts / Programs   ModeMatchr

For your mode matching pleasure, I have added a tool called "ModeMatchr" to the SVN under /trunk/zach/tools/modematchr/

It uses the usual fminsearch approach, but tolerates a fully astigmatic input (i.e., w0ix ≠ w0iy, z0ix ≠ z0iy) and allows for transforming to an elliptical waist  (i.e., w0fx ≠ w0fy, but z0fx = z0fy). It would be straightforward to allow for z0fx ≠ z0fy, but I have never seen a case when we actually wanted this. On the other hand, the elliptical output ability is nice for coupling to wide-angle ring cavities.

It also does the looping through available lenses for you, and retains the best solution for each lens combination in an output cell, which can then be combed with another function (getOtherSol). fminsearch is incredibly fast: with a 10-lens bank, it finds all 100 best solutions on my crappy MacBook in <10 s.

I have also included the functionality to constrain the length of the total MMT to within some percentage of the optimal distance, which helps to sift through the muck.

6415   Wed Mar 14 13:27:15 2012   Zach   HowTo   Computer Scripts / Programs   ModeMatchr

I have added to ModeMatchr the capability to fix the total MMT distance. This is nice if you are coupling to a cavity some fixed distance away. The blurb from the help:

% Note: for any total length constraint dtot_tol > 0, ModeMatchr will use
% fminsearch to find the best solutions near your nominal dtot, and then
% omit solutions whose dtot lie outside your tolerance. For dtot_tol = 0,
% ModeMatchr actively constrains dtot to your value, and then finds the
% best solution. Therefore, set dtot_tol = 0 if you have a fixed distance
% into which to put a MMT.


 Quote: For your mode matching pleasure, I have added a tool called "ModeMatchr" to the SVN under /trunk/zach/tools/modematchr/ It uses the usual fminsearch approach, but tolerates a fully astigmatic input (i.e., w0ix ≠ w0iy, z0ix ≠ z0iy) and allows for transforming to an elliptical waist  (i.e., w0fx ≠ w0fy, but z0fx = z0fy). It would be straightforward to allow for z0fx ≠ z0fy, but I have never seen a case when we actually wanted this. On the other hand, the elliptical output ability is nice for coupling to wide-angle ring cavities. It also does the looping through available lenses for you , and retains the best solution for each lens combination in an output cell, which can then be combed with another function (getOtherSol). fminsearch is incredibly fast: with a 10-lens bank, it finds all 100 best solutions on my crappy MacBook in <10s. I have also included the functionality to constrain the length of the total MMT to within some percentage of the optimal distance, which helps to sift through the muck .

6534   Fri Apr 13 16:09:43 2012   Suresh   Update   Computer Scripts / Programs   ACAD 2002 installed on C21530

I have installed ACAD 2002 on one of the Windows machines in the Control Room.    It is on the machine which has Solid Works (called C21530).

I hope we will be able to open our optical layout diagrams with this and update them even though it is an old version.

6550   Thu Apr 19 16:21:04 2012   Zach   Update   Computer Scripts / Programs   Arbcav updated, made badass

I have modified Arbcav to be way cooler than it used to be.

Main modifications:

• Can now truly model an arbitrary cavity geometry
  • The previous version could only handle a few different topologies. In each case, it would unfold the cavity into the equivalent linear cavity and use the g-parameter method to calculate gouy phases, etc.
  • The new model uses the closed cavity propagation matrix to find the supported mode, and then explicitly calculates the accumulated gouy phase by propagating the beam through the full cavity. This is done analytically with zR, so there is negligible slow-down.
• Now plots a diagram of the cavity geometry, both to help you and for you to verify that it is calculating the right thing (<-- this is the cool part)
  • Plots the beam path and mirror locations
  • Specifies whether mirrors are curved or flat
  • Prints mirror parameters next to them
  • Finds all intracavity waist locations and plots them
  • Gives waist information (size in X, Y)

Since the information is already there, I will have the output structure include things like the input beam q parameter, which could then be fed directly to mode matching tools like ModeMatchr.

The function takes as input the same arguments as before. Example for a square cavity:

out = arbcav([200e-6 50e-6 200e-6 50e-6],[0.75 0.75 0.75 0.75],[1e10 9 1e10 9],[45 45 45 45],29.189e6,10e-6,1064e-9,1000);

i.e.,

out = arbcav(transmissivity_list, length_list, RoC_list, angle_list, modulation_freq, loss_list_or_loss_per_mirror, wavelength, num_pts_for_plot);

If you don't give it a modulation frequency, it will just plot carrier HOMs. If you don't give it RoCs and angles, it will just plot the transmission spectrum.

I'm still fine-tuning some functionality, but I should have it up on the SVN relatively soon. Comments or suggestions are welcome!

Some screenshots:

Cavity geometry plots (linear, triangular, square, bowtie):

Transmission and HOM spectra (these correspond to the square cavity at lower left, above):

6570   Wed Apr 25 21:24:10 2012   Den   Update   Computer Scripts / Programs   c1oaf

The C1OAF model, code, and MEDM screens are updated. All relevant files are committed to svn and updated at the new model path.

6686   Fri May 25 19:13:10 2012   Duncan Macleod   Summary   Computer Scripts / Programs   40m summary webpages

# 40m summary webpages

The aLIGO-style summary webpages are now running on 40m data! They are running on megatron so can be viewed from within the martian network at:

http://192.168.113.209/~controls/summary

At the moment I have configured the 5 seismic BLRMS bands, and a random set of PSL channels taken from a strip tool.

## Technical notes

• The code is in Python, depending heavily on the LSCSoft PyLAL and GLUE modules.
  • /home/controls/public_html/summary/bin/summary_page.py
• The HTML is supported by a CSS script and a JS script which are held locally in the run directory, and JQuery linked from the Google repo.
  • /home/controls/public_html/summary/summary_page.css
  • /home/controls/public_html/summary/pylaldq.js
• The configuration is controlled via a single INI-format file.
  • /home/controls/public_html/summary/share/c1_summary_page.ini

### Getting frames

Since there are no segments or triggers for C1, the only data sources are GWF frames. These are mounted from the framebuilder under /frames on megatron. There is a python script that takes a pair of GPS times and a frame type and locates the frames for you. This is how you use it to find T type frames (second trends) for May 25 2012:

python /home/controls/public_html/summary/bin/framecache.py --ifo C1 --gps-start-time 1021939215 --gps-end-time 1022025615 --type T -o framecache.lcf
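The GPS times in that command can be sanity-checked by hand: GPS time counts seconds since 1980-01-06 00:00:00 UTC, plus the accumulated leap seconds (15 of them as of May 2012). A rough sketch of the conversion, for illustration only (tconvert is the proper tool, since it tracks the leap-second table):

```python
from datetime import datetime

GPS_EPOCH = datetime(1980, 1, 6)
LEAP_SECONDS = 15  # GPS-UTC offset valid for May 2012; this changes over time

def utc_to_gps(utc):
    """Approximate UTC datetime -> GPS seconds, using a fixed leap-second
    count (only correct for dates where LEAP_SECONDS is right)."""
    return int((utc - GPS_EPOCH).total_seconds()) + LEAP_SECONDS

print(utc_to_gps(datetime(2012, 5, 25)))  # 1021939215, the --gps-start-time above
```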

If you don't have GPS times, you can use the tconvert tool to generate them

$ tconvert May 25
1021939215

The available frame types, as far as I'm aware, are R (raw), T (second trends), and M (minute trends).

## Running the code

The code is designed to be fairly easy to use, with most of the options set in the INI file. The code has three modes - day, month, or GPS start-stop pair. The month mode is a little sketchy, so don't expect too much from it. To run in day mode:

python /home/controls/public_html/summary/bin/summary_page.py --ifo C1 --config-file /home/controls/public_html/summary/share/c1_summary_page.ini --output-dir . --verbose --data-cache framecache.lcf -SRQDUTAZBVCXH --day 20120525

Please forgive the large, apparently arbitrary collection of letters: since the 40m doesn't use segments or triggers, these options disable processing of those elements, and there are quite a few of them. They correspond to --skip-something options in long form. To see all the options, run:

python /home/controls/public_html/summary/bin/summary_page.py --help

There is also a convenient shell script that will run over today's data in day mode, doing everything for you. It runs framecache.py to find the frames, then runs summary_page.py to generate the results in the correct output directory. To use it, run:

bash /home/controls/public_html/summary/bin/c1_summary_page.sh

## Configuration

Different data tabs are disabled via command-line --skip-this-tab style options, but the content of tabs is controlled via the INI file. I'll try to give an overview of how to use these. The only configuration required for the Seismic BLRMS 0.1-0.3 Hz tab is the following section:

[data-Seismic 0.1-0.3 Hz]
channels = C1:PEM-RMS_STS1X_0p1_0p3,C1:PEM-RMS_STS1Y_0p1_0p3,C1:PEM-RMS_STS1Z_0p1_0p3
labels = STS1X,STS1Y,STS1Z
frame-type = R
plot-dataplot1 =
plot-dataplot3 =
amplitude-log = True
amplitude-lim = 1,500
amplitude-label = BLRMS motion ($\mu$m/s)

The entries can be explained as follows:

1. '[data-Seismic 0.1-0.3 Hz]' - This is the section heading. The 'data-' mark identifies this as data, and is a relic of how the code is written; the 'Seismic 0.1-0.3 Hz' part is the name of the tab to be displayed in the output.
2. 'channels = ...' - A comma-separated list of channels as they are named in the frames. These must be exact so the code knows how to find them.
3. 'labels = STS1X,STS1Y,STS1Z' - A comma-separated list of labels mapping channel names to something more readable for the plots; this is optional.
4. 'frame-type = R' - This tells the code what frame type the channels are, so it can determine from which frames to read them; this is not optional, I think.
5. 'plot-dataplotX' - This tells the code I want to run dataplotX for this tab. Each 'dataplot' is defined in its own section, and if none of these options are given, the code tries to use all of them. In this configuration 'plot-dataplot1' tells the code I want to display the time series of data for this tab.
6. 'amplitude-XXX = YYY' - This gives the plotter specific information about this tab that overrides the defaults defined in the dataplotX section. The options in this example tell the plotter that when plotting amplitude on any plot, that axis should be log-scale, with a limit of 1-500 and with a specific label. The possible plotting configurations for this style of option are 'lim', 'log', and 'label', I think.

Other compatible options not used in this example are:

• scale = X,Y,Z - a comma-separated list of scale factors to apply to the data. This can either be a single entry for all channels, or one per channel, nothing in between.
• offset = X,Y,Z - another comma-separated list of DC offsets to apply to the data (before scaling, by default). DAQ noise may mean a channel that should read zero during quiet times is offset by some fixed amount, so you can correct that here. Again, either one for all channels, or one per channel.
• transform = lambda x: f(x) - a Python-format lambda function. This is basically any mathematical function that can be applied to each data sample. By default the code constructs the function 'lambda d: scale * (d - offset)', i.e. it calibrates the data by removing the offset and applying the scale.
• band = fmin,fmax - a low,high pair of frequencies within which to bandpass the data. Sketchy at best...
• ripple_db = X - the ripple in the stopband of the bandpass filter
• width = X - the width in the passband of the bandpass filter
• rms_average = X - number of seconds in a single RMS average (combine with band to make BLRMS)
• spectrum-segment-length = X - the length of FFT to use when calculating the spectrum, as a number of samples
• spectrum-overlap = X - the overlap (samples) between neighbouring FFTs when calculating the spectrum
• spectrum-time-step = X - the length (seconds) of a single median-mean average for the spectrogram

At the moment a package version issue means the spectrogram doesn't work, but the spectrum should. At the time of writing, to use the spectrum simply add 'plot-dataplot2'. You can view the configuration file within the webpage via the 'About' link off any page.

Please e-mail any suggestions/complaints/praise to duncan.macleod@ligo.org.

6687   Fri May 25 20:45:25 2012   Duncan Macleod   Summary   Computer Scripts / Programs   40m summary webpages

There is now a job in the crontab that will run the shell wrapper every hour, so the pages _should_ take care of themselves. If you make adjustments to the configuration file, they will get picked up on the hour, or you can just run the script by hand at any time.

$ crontab -l
# m h  dom mon dow   command
0 */1 * * * bash /home/controls/public_html/summary/bin/c1_summary_page.sh > /dev/null 2>&1
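The INI sections described above map naturally onto Python's configparser; here is a minimal sketch of reading one (the section and option names come from the example above, but the real summary_page.py parser surely does more). Stripping whitespace around each channel name makes "a, b" and "a,b" behave the same:

```python
import configparser
from io import StringIO

ini_text = """
[data-Seismic 0.1-0.3 Hz]
channels = C1:PEM-RMS_STS1X_0p1_0p3, C1:PEM-RMS_STS1Y_0p1_0p3
labels = STS1X,STS1Y
frame-type = R
amplitude-log = True
"""

cfg = configparser.ConfigParser()
cfg.read_file(StringIO(ini_text))
sec = cfg["data-Seismic 0.1-0.3 Hz"]

# Strip whitespace around each comma-separated entry.
channels = [c.strip() for c in sec["channels"].split(",")]
labels = [l.strip() for l in sec["labels"].split(",")]
print(channels, sec["frame-type"], sec.getboolean("amplitude-log"))
```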

6757   Tue Jun 5 21:09:40 2012   yuta   Update   Computer Scripts / Programs   hacked ezca tools

Currently, the ezca tools are flaky and fail too often.
So, I hacked the ezca tools just like Yoichi did in 2009 (see elog #1368).

For now,

/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcastep
/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcaswitch
/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcawrite

are wrapper scripts that repeat the ezca call until it succeeds (or fails more than 5 times).
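The same retry-until-success pattern can be written in a few lines of Python. This is just an illustrative sketch of the idea, not the actual wrapper; the command in the comment is a made-up example.

```python
import subprocess

def retry(cmd, retries=5):
    """Run cmd (a list, e.g. ['ezcawrite', 'C1:FOO', '1']) until it
    exits 0, giving up after `retries` attempts.
    Returns the final exit code."""
    for attempt in range(1, retries + 1):
        rc = subprocess.call(cmd)
        if rc == 0:
            return 0
        print("retrying (%d/%d)..." % (attempt, retries))
    return rc

# Example with a command that always succeeds:
print(retry(["true"]))  # 0
```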

Of course, this is just a temporary solution to do tonight's work.
To stop this hack, run /users/yuta/scripts/ezhack/stophacking.cmd. To hack, run /users/yuta/scripts/ezhack/starthacking.cmd.

Original binary files are located in /ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcabackup/ directory.
Wrapper scripts live in /users/yuta/scripts/ezhack directory.

I wish I could alias the ezca tools to my wrapper scripts so that I don't have to touch the original files. However, alias settings don't work in our scripts.
Do you have any idea?

6768   Wed Jun 6 18:04:22 2012   Jamie   Update   Computer Scripts / Programs   hacked ezca tools

 Quote: Currently, ezca tools are flakey and fails too much. So, I hacked ezca tools just like Yoichi did in 2009 (see elog #1368). For now, /ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcaread /ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcastep /ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcaswitch /ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcawrite are wrapper scripts that repeats ezca stuff until it succeeds (or fails more than 5 times). Of course, this is just a temporary solution to do tonight's work. To stop this hack, run /users/yuta/scripts/ezhack/stophacking.cmd. To hack, run /users/yuta/scripts/ezhack/starthacking.cmd. Original binary files are located in /ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcabackup/ directory. Wrapper scripts live in /users/yuta/scripts/ezhack directory. I wish I could alias ezca tools to my wrapper scripts so that I don't have to touch the original files. However, alias settings doesn't work in our scripts. Do you have any idea?

I didn't like this solution, so I hacked up something else.  I made a new single wrapper script to handle all of the utils.  It then executes the correct command based on the zeroth argument (see below).

I then moved all the binaries to give them .bin suffixes, and made links to the new wrapper script.  Now everything should work as expected, with this new retry feature.

controls@rosalba:/ligo/apps/linux-x86_64/gds-2.15.1/bin 0$ for pgm in ezcaread ezcawrite ezcaservo ezcastep ezcaswitch; do mv $pgm{,.bin}; ln ezcawrapper $pgm; done
controls@rosalba:/ligo/apps/linux-x86_64/gds-2.15.1/bin 0$ cat ezcawrapper
#!/bin/bash

retries=5

pgm="$0"
run="${pgm}.bin"

if ! [ -e "$run" ] ; then
    cat <<EOF >&2
This is the ezca wrapper script.  It should be hardlinked in place of the
ezca commands (ezcaread, ezcawrite, etc.), and executes the original
binaries (that have been moved to *.bin) with $retries failure retries.
EOF
    exit -1
fi

if [ -z "$@" ] || [[ "$1" == '-h' ]] ; then
    "$run"
    exit
fi

for try in $(seq 1 "$retries") ; do
    if "$run" "$@"; then
        exit
    else
        echo "retrying ($try/$retries)..." >&2
    fi
done

echo "$(basename $pgm) failed after $retries retries." >&2
exit 1


6769   Wed Jun 6 18:22:52 2012   Jamie   Update   Computer Scripts / Programs   hacked ezca tools

 Quote: I didn't like this solution, so I hacked up something else.  I made a new single wrapper script to handle all of the utils.  It then executes the correct command based on the zeroth argument (see below). I think moved all the binaries to give them .bin suffixes, and the made links to the new wrapper script.  Now everything should work as expected, with this new retry feature.

Yuta and I added a feature such that it will not retry if the environment variable EZCA_NORETRY is set, e.g.

EZCA_NORETRY=true ezcaread FOOBAR 6803 Tue Jun 12 13:49:32 2012 JamieConfigurationComputer Scripts / Programstconvert A nicer, better maintained version of tconvert is now supplied by the lalapps package. It's called lalapps_tconvert. I installed lalapps on all the workstations and aliased tconvert to point to lalapps_tconvert. 6863 Sun Jun 24 23:42:31 2012 yutaUpdateComputer Scripts / ProgramsPMC locker I made a python script for relocking PMC. It currently lives in /opt/rtcds/caltech/c1/scripts/PSL/PMC/PMClocker.py. I think the hardest part for this kind of locker is the scan speed. I could make the scan relatively fast by using pyNDS. The basic algorithm is as follows. 1. Turns off the servo by C1:PSL-PMC_SW1. 2. Scans C1:PSL-PMC_RAMP using ezcastep.bin. Default settings for ezcastep is ezcastep.bin C1:PSL-PMC_RAMP -s 0.1 0.01 10000 So, it steps by 0.01 for 10000 times with interval of 0.1 sec. 3. Get C1:PSL-PMC_PMCTRANSPD and C1:PSL-PMC_RAMP online 1 sec data using pyNDS. 4. If it finds a tall peak in C1:PSL-PMC_PMCTRANSPD, kills ezcastep.bin process, sets C1:PSL-PMC_RAMP to the value where the tall peak was found, and then turns on the servo. 5. If tall peak wasn't found, go back to 3 and get data again. 6. If C1:PSL-PMC_RAMP reaches near -7 V or 0 V, it kills previous ezcastep.bin process and turns the sign of the scan. I tested this script several times. It sometimes passes over TEM00 (because of the dead time in online pyNDS?), but it locks PMC with in ~10 sec. Currently, you have to run this to relock PMC because I don't know how to make this an autolocker. I think use of pyNDS can be applied for finding IR resonance using ALS, too. I haven't checked it yet becuase c1ioo is down, but ALS version lives in /users/yuta/scripts/findIRresonance.py. ALS may be easier in that we can use fast channels and nice filter modules. Other scripts: I updated /opt/rtcds/caltech/c1/scripts/general/toggler.py. It now has "lazymode". 
When lazymode, it toggles automatically with interval of 1 sec until you Ctrl-c. Also, I moved damprestore.py from my users directory to /opt/rtcds/caltech/c1/scripts/SUS/damprestore.py. It restores suspension damping of a specified mirror when watchdog shuts down the damping. 6871 Mon Jun 25 17:48:27 2012 yutaUpdateComputer Scripts / Programsscript for finding IR resonance using ALS I made a python script for finding IR resonance using ALS. It currently lives in /opt/rtcds/caltech/c1/scripts/ALS/findIRresonance.py. The basic algorism is as follows. 1. Scan the arm by putting an offset to the phase output of the phase tracker(Step C1:ALS-BEAT(X|Y)_FINE_OFFSET_OFFSET by 10 deg with 3 sec ramp time). 2. Fetch TR(X|Y) and OFFSET online data using pyNDS during the step. 3. If it finds a tall peak, sets OFFSET to the value where the tall peak was found. 4. If tall peak wasn't found, go back to 1 and step OFFSET again. The time series data of how he did is plotted below. I ran the script for Y arm, but it is compatible for both X and Y arm. 6872 Mon Jun 25 21:54:52 2012 DenUpdateComputer Scripts / ProgramsPMC locker  Quote: I made a python script for relocking PMC. It currently lives in /opt/rtcds/caltech/c1/scripts/PSL/PMC/PMClocker.py. I thought we rewrite auto lockers once per year, but this time it took us only a month. I wrote it for PMC on May 24. Is it not working? Could someone make it more clear why some scripts are written on bash, others on sh or python? I think we should elaborate a strict order. Masha and I can work on it if anyone else considers this issue as a problem. 6873 Tue Jun 26 00:52:18 2012 yutaUpdateComputer Scripts / ProgramsPMC locker  Quote: I thought we rewrite auto lockers once per year, but this time it took us only a month. I wrote it for PMC on May 24. Is it not working? I know. I just wanted to use pyNDS for this kind of scanning & locking situation. 
c1ioo was down for the weekend and I couldn't test my script for ALS, so I used it for PMC. But I think PMClocker.py can relock PMC faster because it can sweep C1:PSL-PMC_RAMP continuously and can get continuous data of C1:PSL-PMC_PMCTRANSPD. 6880 Wed Jun 27 11:35:06 2012 SashaSummaryComputer Scripts / ProgramsSURF - Week 1 - Summary I started playing with matlab for the first time, accurately simulated a coupled harmonic oscillator (starting from the basic differential equations, if anyone's curious), wrote a program to get a bode plot out of any simulation (regardless of the number of inputs/outputs), and read a lot. I'm currently going through the first stage of simulating an ideal Fabry-Perot cavity (I technically started yesterday, but yesterday's work turned out to be wrong, so fresh start!), and other than yesterday's setback, its going okay. I attached a screenshot of my simulation of the pitch/pendulum motion of one of the mirrors LIGO uses. The bode plots for this one are turning out a little weird, but I'm fairly certain its just a computational error and can be ignored (as the simulation matlab rendered without the coupling was really accurate - down to a floating point error). I have also attached these bode plots. The first bode is based on the force input, while the second is based on the torque input. It makes sense that there are two resonant frequencies, since there ought to be one per input. 
Attachment 1: Screen_Shot_2012-06-27_at_11.27.10_AM.png Attachment 2: Screen_Shot_2012-06-27_at_11.26.57_AM.png Attachment 3: Screen_Shot_2012-06-27_at_11.27.29_AM.png 6883 Wed Jun 27 15:10:34 2012 JamieUpdateComputer Scripts / Programs40m summary webpages move I have moved the summary pages stuff that Duncan set up to a new directory that it accessible to the nodus web server and is therefore available from the outside world: /users/public_html/40-summary which is available at: https://nodus.ligo.caltech.edu:30889/40m-summary/ I updated the scripts, configurations, and crontab appropriately: /users/public_html/40m-summary/bin/c1_summary_page.sh /users/public_html/40m-summary/share/c1_summary_page.ini 6885 Wed Jun 27 23:54:21 2012 yutaUpdateComputer Scripts / Programsimage capturing script Mike J. came tonight and he fixed Sensoray (elog #6645). He recompiled it and fixed it. I made a python wrapper script for Sensoray scripts. It currently lives in /users/yuta/scripts/videocapture.py. If you run something like ./videocapture.py AS it saves image capture of AS to /users/yuta/scripts/SensorayCapture/ directory with the GPS time. Below is the example output of AS when MI is aligned. We still see some clipping in the right. This clipping is there when one arm is mis-aligned and clipping moves together with the main beam spot. So, this might be from the incident beam, probably at the Faraday. Currently, videocapture.py runs only on pianosa, since Sensoray 2253S is connected to pianosa. Also, it can only capture MON4. My script changes MON4 automatically. 6956 Wed Jul 11 09:48:24 2012 LizSummaryComputer Scripts / ProgramsUpdate/daily summary testing I have been working on configuration of the Daily Summary webpages and have been attempting to create a "PSL health" page. This page will display the PMC power, the temperature on the PSL table and the PSL table microphone levels. 
Thus far, I have managed to make the extra PSL tab and configure the graph of the interior temperature, using channel C1:PSL-FSS_RMTEMP. I have been attempting to make a spectrogram for one of the PMC channels, but there is an issue with the spectrogram setup, as Duncan Macleod noted in ELOG 6686: "At the moment a package version issue means the spectrogram doesn't work, but the spectrum should. At the time of writing, to use the spectrum simple add 'plot-dataplot2'." Because of this issue, I have also been trying to make the spectrogram plots work. Thus far, I have fixed the issue with one of the spectrogram plots, but there are several problems with the other four that I need to address. I have also been looking at the microphone channels and trying to make the plot for them work. I checked which microphone was on the PSL table and plotted it in matplotlib to make sure it was working. However, when I tried to incorporate it into the daily summary pages, the script stops at that point! It might simply be taking an excessively long time, but I have to figure out why this is the case. (I am using channel C1:PEM-MIC_6_IN1_DQ, if this is blatantly wrong, please let me know!!) The main point of this ELOG is that I have working test-daily summary pages online! They can be found here: Also, if anyone has more requests for what they would like to see on the finalized summary pages site, please respond to this post or email me at: endavison@umail.ucsb.edu 6986 Wed Jul 18 10:08:01 2012 LizUpdateComputer Scripts / ProgramsWeek 5 update/progress Over the past week, I have been focusing on the issues I brought up in my last ELOG, 6956. I spent quite a while attempting to modify the script and create my own spectrogram function within the existing code. I also checked out the channels on the PSL table for the PSL health page and produced a spectrogram plot of the PMC reflected, transmitted, and input powers, the PZT Voltage and the laser output power. 
When I was entering these channels into the configuration script, I came across an issue with the way the python script parses this. If there were spaces between the channel names (for example: C1:PSL-PMC_INPUT_DC, C1:PSL-PMC_RFPDDC... etc) the program would not recognize the channels. I made some alterations to the parsing script such that all white spaces at the beginning and end of the channels were stripped and the program could find them. The next thing that I worked on was attempting to see if the microphone channels were actually stopping the program or just taking an extraordinarily long time. I tried running the program with shorter time samples and that seemed to work quite well! However, I had to leave it running overnight in order to finish. I am sure that this difference comes from the fact that the microphone channels are fast channels. I would like to somehow make it run more quickly, and am thinking about how best to do this. I finally got my spectrogram function to work after quite a bit of trouble. There were issues with mismatched data and limit sets that I discovered came from times when only a few frames (one or two) were in one block. I added some code to ignore small data blocks like those and the program works very well now! It seems like the best way to get the right limits is to let the program automatically set the limits (they are nicely log-scaled and everything) but there are some issues that produce questionable results. I spent a while adding a colormap option to the script so that the spectrogram colors can be adjusted! This mostly took so long because, on Monday night, some strange things were happening with the PMC that made the program fail (zeros were being output, which caused an uproar in the logarithmic data limits). I was incredibly worried about this and thought that I had somehow messed up the script (this happened in the middle of when I was tinkering with the cmap option) so I undid all of my work! 
It was only when I realized it was still going on, and Masha and Jenne were talking about the PMC issues, that I figured out it was an external problem. I then went in and set manual limits so that a blank spectrogram would be produced, and redid everything. The spectrogram is now operational and the colormap can be customized. I still need to fix the problem with the autoscaled axes (perhaps by adding a lower bound?) so that the program does not crash when there is an issue.

Yesterday, I spoke with Rana about what my next step should be. He advised me to look at ELOGs from Steve (6678) and Koji (6675) about what they wanted to see on the site. These gave me a good map of what is needed on the site and where I will go next. I need to find out what is going on with the weather channels and figure out how to calibrate the microphones. I will also be making sure there are correct units on all of the plots and figure out how to take only a short section of data for the microphone channels. I have already modified the tab template so that it is similar to Koji's ELOG idea and will be making further changes to the layout of the summary pages themselves. I will also be working on having the right plots up consistently on the site.

6994   Fri Jul 20 11:59:27 2012 ranaUpdateComputer Scripts / ProgramsCONLOG not running

We tried to use the new conlog today and discovered that:
1) No one at the 40m uses conlog because they don't know that it ever ran and don't know how to use regexp.
2) It has not been running since the last time Megatron was rebooted (probably a power outage).
3) We could not get it to run using the instructions that Syracuse left in our wiki.
Emails are flying.
7001   Mon Jul 23 07:39:55 2012 Ryan FisherSummaryComputer Scripts / ProgramsAlterations to base epics install for installing aLIGO conlog:

Note: The Conlog install instructions that I started from were located here: https://awiki.ligo-wa.caltech.edu/aLIGO/Conlog

7012   Mon Jul 23 20:19:01 2012 LizUpdateComputer Scripts / ProgramsInput Needed (From everyone!)

The summary pages are now online (Daily Summary), and will eventually be found on the 40m Wiki page under "LOGS-Daily Summary". (Currently, the linked website is the former summary page site.) At present, all of the IFO and Acoustic channels have placeholders (they are not showing the real data yet) and the Weather channels are not working, although the Weather Station in the interferometer room is working (I am looking into this; any theories as to why would be appreciated!).

I am looking for advice on what else to include in these pages. It would be fantastic if everyone could take a moment to look over what I have so far (especially the completed page from July 23, 2012) and give me their opinions on:
1. What else you would like to see included
2. Any specific applications to your area of work that I have overlooked
3. What the most helpful parts of the pages are
4. Any ways that I could make the existing pages more helpful
5. Any other questions, comments, clarifications or suggestions

Finally, are the hourly subplots actually helpful? It seems to me like they would be superfluous if the whole page were updating every 1-2 hours (as it theoretically eventually will). These subplots can be seen on the July 24, 2012 page. My email address is endavison@umail.ucsb.edu. Thank you!

7023   Wed Jul 25 11:22:39 2012 LizUpdateComputer Scripts / ProgramsWeek 6 update

This week, I made several modifications to the Summary page scripts, made preliminary Microphone BLRMS channels and, with Rana's help, got the Weather Station working again.
I changed the spectrogram and spectrum options in the Summary Pages so that, given the sampling frequency (which is gathered by the program), the NFFT and overlap are calculated internally. This is an improvement over user-entered values because it saves having to know the sampling frequency for each desired plot.

In addition, I set up another .sh file that can generate summary pages for any given day. Although this will probably not be useful in the final site, it is quite helpful now because I can go back and populate the pages. The current summary pages file is called "c1_summary_page.sh" and the one that is set up to get a specific day is called "liz_c1_summary_page.sh". I also made a few adjustments to the .css file for the webpage so that plots completely show up (they were getting cut off on the edges before) and are easier to see.

I also figured out that the minute and second trend options weren't working because the channel names have to be modified to CHANNEL.mean, CHANNEL.min and CHANNEL.max. That is all in working order now, although I'm not sure if I should just use the mean trends or look at all of them (the plots could get crowded if I do).

Another modification I made to the python summary page script was adding an option to have an image on one of the pages. This was useful because I can now put requested MEDM screens up on the site. The image option can be accessed if, in the configuration file, you use "image-" instead of "data-" for the first word of the section header.

I also added a link to the final summary page website on the 40 meter wiki page (my summary pages are currently located in the summary-test pages, but they will be moved over once they are more finalized). I fleshed out the graphs on the summary pages as well, and have useful plots for the OSEM and OPLEV channels. Instead of using the STS BLRMS channels, I have decided to use the GUR BLRMS channels that Masha made.
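Deriving the FFT parameters from the sampling frequency alone, as described above, can be sketched like this (the 1-second FFT length and 50% overlap are illustrative assumptions; the actual values chosen by the summary-page code may differ):

```python
def fft_params(fs, seconds_per_fft=1.0, overlap_frac=0.5):
    """Pick NFFT and overlap from the sampling frequency fs (Hz).

    Assumes a 1-second FFT with 50% overlap by default -- an
    illustrative choice, not necessarily the summary pages' own.
    """
    nfft = int(fs * seconds_per_fft)
    overlap = int(nfft * overlap_frac)
    return nfft, overlap

# e.g. a 2048 Hz fast channel -> 2048-point FFTs, 1024-sample overlap
nfft, overlap = fft_params(2048)
```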
I ELOGged about my progress and asked for any advice or recommendations a few days ago (7012), and it would still be great if everyone could take a look at what I currently have up on the website and tell me what they think! July 22 and 23 are the most finalized pages thus far, so they are probably the best to look at. https://nodus.ligo.caltech.edu:30889/40m-summary-test/archive_daily/20120723/

This week, I also tried to fix the problems with the Weather Station, which had not been operational since 2010. All of the channels on the weather station monitor seemed to be producing accurate data except the rain gauge, so I went on the roof of the Machine Shop to see if anything was blatantly wrong with it. Other than a lot of dust and spiders, it was in working condition. I plan on going up again to clean it, because the manual recommends that the rain collector be cleaned every one to two years. I also cleared the "daily rain" option on the monitor and set all rain-related things to zero. Rana and I then traced the cables from c1pem1 to the weather station monitor, and found that they were disconnected. In fact, the connector was broken apart and the pins were bent. After we reconnected them, the weather station was once again operational! In order to prevent accidental disconnection in the future, it may be wise to secure this connection with cable ties. It went out of order again briefly on Tuesday, but I reconnected it and now it is in much sturdier shape!

The most recent thing I have been doing for my project is making BLRMS channels for the MIC channels. With Jenne's assistance, I made the channels, compiled and ran the model on c1sus, made filters, and included the channels on the PEM MEDM screen. I still have a few modifications to make. One issue that I have come across is that the sampling rate for the PEM system is 2 kHz, while the audio frequencies range all the way up to 20 kHz.
Because of this, I am only taking BLRMS data in the 1-1000 Hz range. This may be problematic because some of these channels may only show noise (for example, 1-3 and 3-10 Hz may be completely useless).

The pictures below are of the main connections in the Weather Station. The first is the one that Rana and I connected (it is now better connected and looks like a small beige box), located near the beam-splitter chamber; the second is the c1pem1 rack. For more information on the subject, there is a convenient wiki page: https://wiki-40m.ligo.caltech.edu/Weather_Station

Attachment 1: P7230026.JPG
Attachment 2: P7230031.JPG

7032   Wed Jul 25 17:35:44 2012 LizUpdateComputer Scripts / ProgramsSummary Pages are in the right place!

The summary pages can now be accessed from the "Daily Summary" link under LOGS on the 40 meter Wiki page.

7063   Wed Aug 1 10:07:16 2012 LizUpdateComputer Scripts / ProgramsWeek 7 Update

Over the past week, I have continued refining the summary pages. They are now online in their final home, and can be easily accessed from the 40 meter Wiki page (via the Daily Summary link under "LOGS"). I have one final section to add plots to (the IFO section is currently still only "dummy" plots) but the rest are showing correct data! I have many edits to make in order for them to be more intelligible, but they are available for browsing if anyone feels so inclined.

I also spent quite a while formatting the pages so that the days are in PDT time instead of UTC time. This process was quite time consuming and required modifications in several files, but I tracked my changes with git so they are easy to pinpoint. I also did a bit of css editing and rewriting of a few html generation functions so that the website is more appealing. (One example of this is that the graphs on each individual summary page are now full sized instead of a third of the size.)

This week, I also worked with the BLRMS mic channels I made.
I edited the band pass and low pass filters that I had created last week and made coherence plots of the channels. I encountered two major issues while doing this. Firstly, the coherence of the channels decreases dramatically above 40 Hz. I will look at this more today, but am wondering why it is the case. If nothing can be done about it, it would render three of my channels ineffective. The other issue is that the Nyquist frequency is at 1000 Hz, which is the upper limit of my highest frequency channel (300-1000 Hz). I am not sure if this really affects the channel, but it looks very different from all of the other channels. I am also wondering whether the channels below 20 Hz are useful at all, or whether they are just showing noise.

The microphone calibration has been something I have been trying to figure out for quite some time, but I recently found a sensitivity value on the website of the company that makes the EM172 microphones. From this sensitivity I determined a transfer factor of 39.8107 mV/Pa, although I am not sure if all of the mics will be consistent with this.

7104   Tue Aug 7 15:01:38 2012 JenneUpdateComputer Scripts / Programsmedmrun now allows args to pass to scripts

Previously, medmrun didn't accept arguments to pass along to the script it was going to run. Jamie has graciously taken a moment from fixing the computer disaster to help me update the medmrun script. Now the ASS scripts are callable from the screen.

7108   Tue Aug 7 18:38:50 2012 LizUpdateComputer Scripts / ProgramsDaily Summary Pages are in their final form!

Please check the summary pages out at the link below and let me know if there are any modifications I should make! All existing pages are up to date and contain all of the pages I have. Questions, comments, and suggestions will be appreciated!
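Applying the EM172 transfer factor quoted in the Week 7 update above is a one-line conversion. This sketch assumes the signal has already been converted from counts to volts (the sensitivity constant is from the entry above; everything else is illustrative):

```python
# 39.8107 mV/Pa, the EM172 sensitivity quoted above (assumed the same
# for every mic, which the entry itself flags as uncertain).
EM172_SENS_V_PER_PA = 39.8107e-3

def volts_to_pascal(v):
    """Convert a microphone signal from volts to pascal using the
    EM172 transfer factor. Assumes the ADC counts->volts step is done."""
    return v / EM172_SENS_V_PER_PA

# A 39.8107 mV signal corresponds to 1 Pa of sound pressure.
p = volts_to_pascal(39.8107e-3)
```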
Contact me at endavison@umail.ucsb.edu

https://nodus.ligo.caltech.edu:30889/40m-summary/

7115   Wed Aug 8 10:38:43 2012 LizUpdateComputer Scripts / ProgramsWeek 8/Summary Pages update

Over the past week, I have been working on my progress report and finalizing the summary pages. I have a few more things to address in the pages (such as starting at 6 AM, including spectrograms where necessary and generating plots for the days more than ~a week ago) but they are mostly finalized. I added all of the existing acoustic and seismic channels, so the PEM page is up to date. The microphone plots include information about the transfer factor that I found on their information sheet (http://www.primomic.com/). If there are any plots that are missing or need editing, please let me know!

I also modified the c1_summary_page.sh script to run either the daily plots or the currently updating plots by taking an argument on the command line. It can be run as ./c1_summary_page.sh 2012/07/27, or as ./c1_summary_page.sh now to generate the current day's pages. (Essentially, I combined the two scripts I had been running separately.) I have been commenting my code so it is more easily understandable, and have been writing a file that explains how to run the code and the main alterations I made.

The most exciting thing that has taken place this week is that the script went from taking ~6 hours to run to taking less than 5 minutes. This was done by using minute trends for all of the channels and limiting the spectrum plot data. The summary pages for each day now contain only the most essential plots that give a good overview of the state of the interferometer and its environment, instead of every plot that is created for that day. I am waiting for Duncan to send me some spectrogram updates he has made that downsample the timeseries data before plotting the spectrogram. This will make it run much more quickly and introduce a more viable spectrogram option.
Today's Summary Pages can be accessed by the link on the wiki page or at: https://nodus.ligo.caltech.edu:30889/40m-summary/archive_daily/20120808/

7120   Wed Aug 8 13:37:46 2012 KojiUpdateComputer Scripts / ProgramsWeek 8/Summary Pages update

Hey, the pages got significantly nicer than before. I will continue to give you comments if I find anything. So far: there are many 10^-100 values in the logarithmic plots. Once they are removed, we should be able to see the seismic excitation during these recent earthquakes? Incidentally, where is the script located? "./" isn't an absolute path description.

Quote: Over the past week, I have been working on my progress report and finalizing the summary pages. I have a few more things to address in the pages (such as starting at 6 AM, including spectrograms where necessary and generating plots for the days more than ~a week ago) but they are mostly finalized. I added all of the existing acoustic and seismic channels so the PEM page is up to date. The microphone plots include information about the transfer factor that I found on their information sheet (http://www.primomic.com/). If there are any plots that are missing or need editing, please let me know! I also modified the c1_summary_page.sh script to run either the daily plots or current updating plots by taking in an argument in the command line. It can be run ./c1_summary_page.sh 2012/07/27 or ./c1_summary_page.sh now to generate the current day's pages. (Essentially, I combined the two scripts I had been running separately.) I have been commenting my code so it is more easily understandable and have been working on writing a file that explains how to run the code and the main alterations I made. The most exciting thing that has taken place this week is that the script went from taking ~6 hours to run to taking less than 5 minutes. This was done by using minute trends for all of the channels and limiting the spectrum plot data.
The summary pages for each day now contain only the most essential plots that give a good overview of the state of the interferometer and its environment instead of every plot that is created for that day. I am waiting for Duncan to send me some spectrogram updates he has made that downsample the timeseries data before plotting the spectrogram. This will make it run much more quickly and introduce a more viable spectrogram option. Today's Summary Pages can be accessed by the link on the wiki page or at: https://nodus.ligo.caltech.edu:30889/40m-summary/archive_daily/20120808/

7181   Tue Aug 14 16:33:51 2012 SashaUpdateComputer Scripts / ProgramsSimPlant indicator added

I added an indicator to the watchdog screen so that a little "SP" icon appears whenever the SimPlant is on. Since we only have one SimPlant (ETMX), only ETMX has the SimPlant indicator. However, since asymmetry is ugly, I moved all of the OL icons over so that they're in a line and there is room for future SP icons. I also fixed the link to the Watchdogs on the main SUS screens (it was dead, but now it is ALIVE).

7192   Wed Aug 15 13:23:34 2012 LizSummaryComputer Scripts / ProgramsLast Weekly Update

Over the past week I have been continuing to finalize the daily summary pages, attempting to keep the total run time under half an hour so that they can be run frequently. I have had many hang-ups with the spectrograms and am currently using second trends (with this method, the entire script takes 15 minutes to run). I also have a backup method that takes 3 minutes of data for every 12 minutes, but I could not implement any interpolation correctly. This might be a future focus, or the summary pages could be configured to run in parallel so that full data can be used for the spectrograms. I configured Steve's tab to include one page of images and one page of plots, and fixed the scripts so that they correct for daylight saving time (at the beginning of a run, the program prints 'DST' or 'Not DST').
Right now, I am focusing on making coherence plots in a spectrogram style (similar to the matlab 'coh_carpet' function) and a spectrogram depicting Gaussianity (similar to the plots made by the RayleighMonitor). I have also been working on my final paper and presentation.

7203   Thu Aug 16 13:04:36 2012 LizSummaryComputer Scripts / ProgramsDaily Summary Details

I just wrote a short description of how to run the daily summary pages and of the configuration process for making changes to the site. It can be found in /users/public_html/40m-summary and is named README.txt. If I need to clarify anything, please let me know! The configuration process should be relatively straightforward, so it will be easy to add plots or change them when there are changes at the 40 meter.

7214   Fri Aug 17 05:29:04 2012 YoichiConfigurationComputer Scripts / ProgramsC1configure scripts

I noticed that the IFO restore scripts have some problems. They use burt request files to store and restore the settings. However, the request files contain old channel names. In particular, channels with _TRIG_THRES_ON/OFF are now _TRIG_THRESH_ON/OFF; note the extra "H". These scripts reside in /opt/rtcds/caltech/c1/burt/c1ifoconfigure/. I fixed the PRMI_SBres and MI scripts. Someone should fix all the other files.

7238   Tue Aug 21 00:02:05 2012 ranaSummaryComputer Scripts / ProgramsGDS/DTT bug: 10 digit GPS times not accepted

I've noticed that we're experiencing this bug, which was previously seen at LHO. We cannot enter 10 digit GPS times into the time fields for DTT due to a limit in TLGEntry.cc, which Jim Batch fixed in September of last year. It seems like we're running an old version of the GDS tools. I checked the Lidax tool (which you can get from the GDS Mainmenu). It does, in fact, allow 10 digit entries.

7611   Wed Oct 24 18:42:39 2012 ManasaUpdateComputer Scripts / ProgramsPhase map summary of LaserOptik mirrors

Quote: Raji took the optics over.
They were all measured at 0 deg incidence angle, although we will use them at the angles required for the recycling folding mirrors. Here's the summary from GariLynn:

In general all six pieces have a radius of curvature of around -700 meters. They all fall off rapidly past 40 mm diameter. Within the 40 mm diameter the rms is ~10 nm for most. I can get finer analysis if you have something specific that you want to know. All data are saved in Wyko format at the following location: http://www.ligo.caltech.edu/~coreopt/40MCOC/Oct24-2012/ Gari

After a long search, I've found a way to finally read and analyze(?) the Wyko opd format data using Image SXM, an image analysis program that runs only on Mac OS X. I am attaching the images (in tiff) and profile plots of all 6 mirrors.

Attachment 1: sn1Laseroptik_profile
Attachment 2: sn2Laseroptik_profile
Attachment 3: sn3Laseroptik_profile
Attachment 4: sn4Laseroptik_profile
Attachment 5: sn5Laseroptik_profile
Attachment 6: sn6Laseroptik_profile
Attachment 7: sn1.png
Attachment 8: sn2.png
Attachment 9: sn3.png
Attachment 10: sn4.png
Attachment 11: sn5.png
Attachment 12: sn6.png

7615   Wed Oct 24 22:48:46 2012 janoschUpdateComputer Scripts / ProgramsPhase map summary of LaserOptik mirrors

Quote: After a long search, I've found a way to finally read and analyze(?) the Wyko opd format data using Image SXM, an image analysis software working only on mac osx. I am attaching the images (in tiff) and profile plot of all the 6 mirrors.

Great; however, unless you can save the images in FITS format, we still need another reader for the opd images.
7616   Thu Oct 25 02:01:15 2012 KojiUpdateComputer Scripts / ProgramsPhase map summary of LaserOptik mirrors

Previous phasemap data and analysis for the new 40m COC are summarized on the following page: https://nodus.ligo.caltech.edu:30889/40m_phasemap/ (use traditional LVC authentication, not albert.einstein). The actual files can also be found on nodus below the following directory: /cvs/cds/caltech/users/public_html/40m_phasemap

The programs for the analysis are found in /cvs/cds/caltech/users/public_html/40m_phasemap/40m_PRM/mat. The main program is RunThis.m. Basically this program takes ascii files converted from opd by Vision32 (i.e. you need to go to Downs). Then the matlab program takes care of the plots and curvature analyses.

7758   Wed Nov 28 21:42:21 2012 ranaFrogsComputer Scripts / Programsdataviewer font error

An error this evening on rossa: dataviewer not working due to some font errors:

controls@rossa:~ 0$ dataviewer
Connecting.... done
Warning: Not all children have same parent in XtManageChildren
Warning: Not all children have same parent in XtManageChildren
Warning: Not all children have same parent in XtManageChildren
Warning: Not all children have same parent in XtManageChildren
Warning: Not all children have same parent in XtManageChildren
Warning:
Name: FilterText
Class: XmTextField
Character '\52' not supported in font.  Discarded.

Warning:
Name: FilterText
Class: XmTextField
Character '\56' not supported in font.  Discarded.

Warning:
Name: FilterText
Class: XmTextField
Character '\170' not supported in font.  Discarded.

Warning:

etc.............

7768   Fri Nov 30 14:21:18 2012 ranaHowToComputer Scripts / ProgramsThe mystery of PDFs and you. As deep as the mystery of Rasputin.

This is how to post PDF:

From DTT, print the plot as a postscript file.

Then use ps2pdf to make an archival PDF version (the flag is the key!). Example:

ps2pdf -dPDFX /home/controls/Desktop/darm.ps

Attachment 1: darm.pdf
7818   Wed Dec 12 20:22:03 2012 JamieUpdateComputer Scripts / Programsilluminators fixed and added to VIDEO screen

I fixed the illuminator setup.  ETMY was not hooked up, and the screen wasn't configured quite right.  The ITMX illuminator still needs to be hooked up to the vertex switch.

I made an updated illuminator script that works more like the videoswitch scripts, with a saner interface, and is located here:

/opt/rtcds/caltech/c1/scripts/general/illuminator


I also fixed up the illuminator MEDM interface a bit and added it to the VIDEO screen:

While I was at it, I cleaned up the sitemap a bit:

I hope everyone won't be too confused.

7993   Mon Feb 4 15:26:10 2013 JamieUpdateComputer Scripts / ProgramsNew "getdata" program to pull NDS channel data, including test points

I've added a new program called getdata (at scripts/general/getdata) that will conveniently pull arbitrary data from an NDS server, either DQ or online (i.e. testpoints).

Start times and durations may be specified.  If past data is requested, you must of course be requesting DQ channels.  If no start time is specified, data will be pulled "online", in which case you can specify testpoints.

If an output directory is specified, the retrieved data will be stored in that directory in files named after the channels.  If an output directory is not specified, no output will be written.

Help usage:

controls@pianosa:~ 0$ /opt/rtcds/caltech/c1/scripts/general/getdata --help
usage: getdata [-h] [-s START] [-d DURATION] [-o OUTDIR] channel [channel ...]

Pull online or DQ data from an NDS server. Use NDSSERVER environment variable
to specify host:port.

positional arguments:
  channel               Acquisition channel. Multiple channels may be
                        specified and acquired at once.

optional arguments:
  -h, --help            show this help message and exit
  -s START, --start START
                        GPS start time. If omitted, online data will be
                        fetched. When specified must also specify duration.
  -d DURATION, --duration DURATION
                        Length of data to acquire.
  -o OUTDIR, --outdir OUTDIR
                        Output directory. Data from each channel stored as
                        '<channel>.txt'. Any existing data files will be
                        automatically overwritten.
controls@pianosa:~ 0$
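The NDSSERVER environment variable mentioned in the help text is a host:port pair. A hedged sketch of how such a value might be parsed (the function name and default values are illustrative assumptions, not getdata's actual internals):

```python
import os

def nds_server(default_host="localhost", default_port=8088):
    """Parse the NDSSERVER environment variable ('host:port').

    Falls back to illustrative defaults when the variable is unset
    or the port is missing; not the actual getdata behaviour.
    """
    value = os.environ.get("NDSSERVER", "")
    if not value:
        return default_host, default_port
    host, _, port = value.partition(":")
    return host, int(port) if port else default_port
```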


8097   Mon Feb 18 00:03:46 2013 ZachUpdateComputer Scripts / ProgramsARBCAV v3.0

I have uploaded ARBCAV v3.0 to the SVN. The major change in this release, as I mentioned, is the input/output handling. The input and output are now contained in a single 'model' structure. To define the cavity, you fill in the substructure 'model.in' (e.g., model.in.T = [0.01 10e-6 0.01]; etc.) and call the function as:

model = arbcav(model);

Note: the old syntax is maintained as legacy for backward compatibility, and the function automatically creates a ".in" substructure in the output, so that the user can still use the single-line calling style, which can be convenient. Then any individual parameter can be changed by changing the appropriate field, and the function can be rerun using the new, simpler syntax from then on.

The function then somewhat intelligently decides what to compute based on what information you give it. Using a simple option string as a second argument, you can choose what you want plotted (or not) when you call. Alternatively, you can program the desired functionality into a sub-substructure 'model.in.funct'.

The outputs are created as substructures of the output object. Here is an example:

>> th = 0.5*acos(266/271) *180 /pi;

OMC.in.theta = [-th -th th th];

OMC.in.L = [0.266 0.284 0.275 0.271];

OMC.in.RoC = [1e10 2 1e10 2];

OMC.in.lambda = 1064e-9;

OMC.in.T = 1e-6 * [8368 25 8297 33];

OMC.in.f_mod = 24.5e6;

>> OMC

OMC =

in: [1x1 struct]

>> OMC = arbcav(OMC,'noplot')

Warning: No loss given--assuming lossless mirrors

> In arbcav at 274

OMC =

in: [1x1 struct]

FSR: 2.7353e+08

Lrt: 1.0960

finesse: 374.1568

buildup: 119.6956

df: [1000x1 double]

coefs: [1000x4 double]

HOM: [1x1 struct]

>> OMC.HOM

ans =

f: [1x1 struct]

pwr: [1x1 struct]

>> OMC.HOM.pwr

ans =

carr: [15x15 double]

SBp: [15x15 double]

SBm: [15x15 double]

Some other notes:

• The annoying Mdo.m has been internalized; it is no longer needed.
• For the next release, I am working on including:
• Finite mirror thickness/intracavity refractive elements - If, for god knows what reason, you decide to put a mirror substrate within a cavity
• Mode overlap - Calculating the overlap of an input beam to the cavity
• Mode matching - Calculating a mode matching telescope into the cavity for some defined input beam
• Anything else?

I have added lots of information to the help header, so check there for more details. As always, your feedback is greatly appreciated.

8209   Fri Mar 1 18:23:28 2013 JamieUpdateComputer Scripts / Programsupdated version of "getdata"

  /opt/rtcds/caltech/c1/scripts/general/getdata


It now writes the data to disk incrementally while it's downloading from the server, so it doesn't fill up memory.

I also added a couple new options:

* --append allows for appending to existing data files
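Incremental, append-capable writing of per-channel data files, as described above, can be sketched like this (the file layout and function name are illustrative assumptions, not getdata's actual internals):

```python
import os
import tempfile

def write_samples(path, samples, append=False):
    """Write one block of samples to a channel's text file, one value
    per line, creating the file or appending to it as requested.
    Writing block-by-block keeps memory use bounded."""
    mode = "a" if append else "w"
    with open(path, mode) as f:
        for x in samples:
            f.write("%.9g\n" % x)

# Demo: the first block overwrites; later blocks append incrementally.
demo_path = os.path.join(tempfile.mkdtemp(), "demo_channel.txt")
write_samples(demo_path, [1.0, 2.0])
write_samples(demo_path, [3.0], append=True)
lines = open(demo_path).read().splitlines()
```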

8254   Thu Mar 7 18:48:43 2013 yutaUpdateComputer Scripts / Programsreleasing my secret scripts

I released/updated my secret scripts to real scripts directory.
I checked that they run (though they may not all be working correctly).

burtlookup.py
in ./scripts/general/burtlookup.py

It returns a value of a specified channel in the past using burt snapshots.
Help is available.

GRtoggler.py
in ./scripts/ALS/GRtoggler.py

Toggles green shutter until it locks TEM00.
Help is available. Threshold setting is critical.

MCbeeper.py
in ./scripts/MC/MCbeeper.py

Beeps when MC is unlocked.

yutalib.py
in ./scripts/pylibs/yutalib.py

I think it's well commented.

pyezcalib.py
in ./scripts/pylibs/pyezcalib.py

Python library for ezca stuff.
It has functions for recording and resetting default channel values in case of interrupt.

./scripts/PRCmodescan
Python scripts for PRC modescan. Not well commented. Not organized.
See elog #8012

./scripts/Alignment
Python and shell scripts for alignment work. Not well commented.
See elog #8164

./scripts/SUS/OplevCalibration
Python scripts for oplev calibration. Not well commented.
See elog #8221

./scripts/dither/gfactormeasurement
Python scripts for g-factor measurement. Not well commented.
See elog #8230

./scripts/SUS/ActuatorCalib
Python scripts for calibrating actuators. Not well commented.
See elog #8242

8524   Thu May 2 19:59:34 2013 JamieUpdateComputer Scripts / Programslookback: new program to look at recent past testpoint data

To aid in lock-loss studies, I made a new program called 'lookback', similar to 'getdata', to look at past data.

When called with channel name arguments, it runs continuously, storing all channel data in a ring buffer.  When the user hits Ctrl-C, all the data in the ring buffer is displayed.  There is an option to store the data in the ring buffer to disk as well.

controls@rosalba:/opt/rtcds/caltech/c1/scripts/general 0$ ./lookback -h
usage: lookback [-h] [-l LENGTH] [-o OUTDIR] channel [channel ...]

Lookback on testpoint data. The specified amount of data is stored in a ring
buffer. When Ctrl-C is hit, all data in the ring buffer is plotted. Both 'DQ'
and 'online' test point data is available. Use NDSSERVER environment variable
to specify host:port.

positional arguments:
  channel               Acquisition channel. Multiple channels may be
                        specified and acquired at once.

optional arguments:
  -h, --help            show this help message and exit
  -l LENGTH, --lookback LENGTH
                        Lookback time in seconds. This amount of data will be
                        stored in a ring buffer, and plotted on Ctrl-C.
                        Default is 10 seconds.
  -o OUTDIR, --outdir OUTDIR
                        Output directory to write data (will be created if it
                        doesn't exist). Data from each channel stored as
                        '<channel>.txt'. Any existing data files will be
                        overwritten.
controls@rosalba:/opt/rtcds/caltech/c1/scripts/general 0$
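The ring-buffer behaviour described above maps directly onto collections.deque with a maxlen: samples older than the lookback window fall off automatically. A minimal sketch (lookback itself may be implemented differently):

```python
from collections import deque

def make_ring(fs, seconds):
    """Ring buffer holding the most recent fs*seconds samples.
    Appending beyond maxlen silently discards the oldest samples,
    which is exactly the lookback-window behaviour described above."""
    return deque(maxlen=int(fs * seconds))

# Demo: a 2-second buffer at 16 samples/s holds only the newest 32 samples.
ring = make_ring(fs=16, seconds=2)
for x in range(100):
    ring.append(x)
```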

8588   Thu May 16 02:34:38 2013 ranaUpdateComputer Scripts / ProgramsAutoRUN GUI resurrected

We talked about the thing that watches the scripts for autolocking during the meeting today.

I've resurrected the Perl-Tk GUI that we used through i/eLIGO for watching the IFO and running the appropriate scripts. This is not meant to be a replacement for aLIGO stuff, but just something to get us going for now. I expect that we will make some new fanciness which will eclipse this, but I brought it back so that we don't start off with some 'Advanced' system which is worse than the old one.

You can run it from scripts/c1/ by typing ./AutoRUN.pl. It pops up the GUI and starts in a Disabled mode where it watches and does nothing.

I have done some editing of the GUI's code so that it uses caget / caput instead of ezca binaries. New stuff is in the SVN.

Next up is to start testing it and fixing it up so that it uses the thresholds set in the LSC screens rather than some hardcoded values.

Eventually we should also convert all of its daughter scripts from tcsh to bash to keep Jamie's blood pressure in the low hundreds...

Attachment 1: Autorun.png
8608   Tue May 21 18:18:28 2013 JamieUpdateComputer Scripts / ProgramsnetGPIB stuff update/modernized/cleanedup/improved

I did a bunch of cleanup work on the netGPIB stuff:

• Removed extensions from all executable scripts (executables should not have language extensions)
• fixed execution permissions on executables and modules
• committed HP8590.py and HP3563A.py instrument modules, which were there but not included in the svn
• committed NWAG4395A (was AG4395A_Run.py) to svn, and removed old "custom" copies (bad people!)
• cleaned up, modernized, and fixed the netgpibdata program
• removed plotting from netgpibdata, since it was only available for one instrument, there's already a separate program to handle it, and it's just plotting the saved data anyway
• added a netgpibcmd program for sending basic commands to instruments.