ID   Date   Author   Type   Category   Subject
  6148   Fri Dec 23 15:55:12 2011   rana   Summary   Computer Scripts / Programs   CONLOG: not working since Oct 1

Often people say "I don't use conlog because it's real slow". It's a little like not driving because your car has no gas.

I looked into what's going on with conlog. No one has fixed its channel list in ~1 year, so its channel list no longer made much sense. Also, since Oct 1 of this year, its leap-seconds table has expired and it has been waiting for someone to look at the log file and update the list of leap seconds.

Some issues:

  • Don't use phrases like OUTPUT, OUTMON, OUT16, or INMON as part of a regular channel name. These are filter module words which we use to exclude channels from conlog. Please fix ALL of the LOCKIN screens to get rid of the OUTPUT filter banks.
  • If you use an EPICS channel in a servo so that it's getting changed 16 times a second, make sure to add it to the conlog exclude list.

There are a bunch of bad channels which are screwing up various tools (DV, DTT, etc.):

Examples: C1:LSC-Subsystem_NPRO_SW1, C1:-DOF2PD_MTRX_0_0_SW1, C1:BAD-BAR_CRAZY_2_RSET, C1:C1L-DOF2PD_MTRX_3_14_SW2, C1:DUB-SEIS_GUR2_Z_LIMIT, etc.

  • There are a bunch of old, unused directories in c1/medm/. EVERYONE take a look in there and delete the OLD dead ones so that we don't keep recording those channels.

 To fix up some of these issues, I have deleted several MEDM directories which I thought were old (there are several extras left from Aidan's Green time). I also have put a bunch of exclude variables into the conlog 'scan_adls' script to prevent it from adding some of the new worthless channels. Finally, I have started this command

../bin/strip_out_channels '.*STAT.*','.*_ALIVE.*','C1:PEM.*','.*_Name.*','C1:UCT.*','C1:MCP.*','C1:SP.*','C1:DU.*','C1:RF.*','C1:NIO.*','C1:TST.*','C1:SUP.*','C1:X.*','C1:FEC.*','.*_LFSERVO.*','.*FSS_SLOWDC.*','C1:LSC-LA_MTRX_21','C1:LSC-PD.*OFFSET','C1:LSC-ETM.*OFFSET' conlog*.log

which should strip lots of the excess conlog data out of the conlog directory. The only downside is that it's setting all of the timestamps of the .log files to today instead of the historical times, but I don't think we'll care about this too much. Hopefully it will speed things up to have less than 450 GB of conlog files...
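If the historical timestamps of the .log files turn out to matter, the mtimes could be snapshotted before the strip and put back afterwards. A minimal sketch (this is not what was actually run, and CONLOG_DIR is a placeholder path, not the real location):

# Record and restore the modification times of the conlog .log files, so a bulk
# rewrite like strip_out_channels doesn't clobber the historical timestamps.
# Sketch only -- CONLOG_DIR is a hypothetical path.
import glob, os

CONLOG_DIR = "/path/to/conlog"

def snapshot_times(pattern="conlog*.log"):
    files = glob.glob(os.path.join(CONLOG_DIR, pattern))
    return {f: (os.stat(f).st_atime, os.stat(f).st_mtime) for f in files}

def restore_times(snapshot):
    for f, times in snapshot.items():
        if os.path.exists(f):
            os.utime(f, times)

# usage: times = snapshot_times(); <run strip_out_channels>; restore_times(times)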

update: 12 hours later, it's still running and has removed ~100 GB so far. It will probably take the rest of the weekend to finish.

  6162   Tue Jan 3 18:40:08 2012   Aidan   Summary   Computer Scripts / Programs   medm directory clean-up

 I moved the following old and obsolete MEDM directories to an archive /cvs/cds/rtcds/caltech/c1/medm

  • c1vga
  • c1vgl
  • c1gpt
  • c1gpv
  • c1nio
  • c1spy
  • c1spx

None of these models were present in /cvs/cds/rtcds/caltech/c1/target

  6169   Wed Jan 4 11:47:08 2012   Jamie   Summary   Computer Scripts / Programs   medm directory clean-up

Quote:

 I moved the following old and obsolete MEDM directories to an archive /cvs/cds/rtcds/caltech/c1/medm

  • c1vga
  • c1vgl
  • c1gpt
  • c1gpv
  • c1nio
  • c1spy
  • c1spx

None of these models were present in /cvs/cds/rtcds/caltech/c1/target

Remember that we don't use /cvs/cds/rtcds anymore.  Everything should be in /opt/rtcds now.

  6185   Wed Jan 11 14:06:28 2012   Leo Singer   HowTo   Computer Scripts / Programs   HowTo for getting data from NDS off site

 This may or may not be general knowledge already, but Jamie and I added a HowTo explaining how to retrieve channel data from the frame builder via NDS from off site, on one's own computer.  See the Wiki page:

https://wiki-40m.ligo.caltech.edu/How_To/NDS
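For reference, a minimal off-site fetch could look something like the sketch below, using the nds2 Python client. The server name, port, GPS span, and channel are placeholders here, so check the wiki page above for the real connection details.

# Fetch one channel from the 40m frame builder over NDS2 and report its length.
# Sketch only: hostname/port, GPS times, and channel name are assumptions --
# see the wiki HowTo for the actual values.
import nds2

conn = nds2.connection('nds40.ligo.caltech.edu', 31200)   # hypothetical server/port
start, stop = 1021939215, 1021939815                      # example 600 s GPS span
bufs = conn.fetch(start, stop, ['C1:PSL-FSS_RMTEMP'])     # any frame channel works

data = bufs[0].data                      # numpy array of samples
fs = bufs[0].channel.sample_rate
print(len(data), "samples at", fs, "Hz")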

  6373   Wed Mar 7 13:59:07 2012   Ryan Fisher   Summary   Computer Scripts / Programs   Alterations to base epics install for installing aLIGO conlog:

In order to install the necessary extensions to epics to make the aLIGO conlog work, I have edited one file in the base epics install that affects makefiles:

/cvs/cds/caltech/apps/linux64/epics/base/configure/CONFIG_COMMON

Jamie said he prefers diffs, so I regenerated the original file and did a diff against the current file:

megatron:configure>diff CONFIG_COMMON.orig.reconstructedMar72012 CONFIG_COMMON.bck.Mar72012
206,207c206,210
< USR_CPPFLAGS =
< USR_DBDFLAGS =
---
> USR_CPPFLAGS = -I $(EPICS_BASE)/include
> USR_CPPFLAGS += -I $(EPICS_BASE)/include/os/Linux/
> USR_CPPFLAGS += -I $(EPICS_BASE)/../modules/archive/lib/linux-x86_64/
> USR_DBDFLAGS = -I $(EPICS_BASE)/dbd
> USR_DBDFLAGS += -I $(EPICS_BASE)/../modules/archive/dbd

This is saved in CONFIG_COMMON.diff.Mar72012_1

  6377   Wed Mar 7 18:00:39 2012   Ryan Fisher   Summary   Computer Scripts / Programs   Alterations to base epics install for installing aLIGO conlog:

Quote:

In order to install the necessary extensions to epics to make the aLIGO conlog work, I have edited one file in the base epics install that affects makefiles:

/cvs/cds/caltech/apps/linux64/epics/base/configure/CONFIG_COMMON

Jamie said he prefers diffs, so I regenerated the original file and did a diff against the current file:

megatron:configure>diff CONFIG_COMMON.orig.reconstructedMar72012 CONFIG_COMMON.bck.Mar72012
206,207c206,210
< USR_CPPFLAGS =
< USR_DBDFLAGS =
---
> USR_CPPFLAGS = -I $(EPICS_BASE)/include
> USR_CPPFLAGS += -I $(EPICS_BASE)/include/os/Linux/
> USR_CPPFLAGS += -I $(EPICS_BASE)/../modules/archive/lib/linux-x86_64/
> USR_DBDFLAGS = -I $(EPICS_BASE)/dbd
> USR_DBDFLAGS += -I $(EPICS_BASE)/../modules/archive/dbd

This is saved in CONFIG_COMMON.diff.Mar72012_1

 After I spent a while working with Patrick Thomas trying to make the EPICS extensions work within the currently installed EPICS, he decided that we should just start over with a fresh install of EPICS 3.14.10.

I am installing this in /ligo/apps/linux-x86_64/epics/base-3.14.10/

Prior to all of this, I had done a lot of installation and configuration of the packages needed to make LAMP work:

sudo apt-get install lamp-server^

sudo apt-get install phpmyadmin

I then continued to follow the instructions on Patrick's wiki:

https://awiki.ligo-wa.caltech.edu/aLIGO/Conlog#EDCU_library

I installed the c_string library into /ligo/apps/linux-x86_64/ according to his instructions.  (prior to my installs, there was no /ligo/ on this machine at all, so I made the needed parent directories with the correct permissions).

That should be everything up to installing epics (working on that now).

  6387   Thu Mar 8 17:18:19 2012   Ryan Fisher   Summary   Computer Scripts / Programs   Alterations to base epics install for installing aLIGO conlog:
I have the aLIGO Conlog 'working' at http://192.168.113.209/conlog/conlog.php

The process is running inside a screen on megatron.

To start it running, you need to set your environment and then run the startup.c1conlog script:

cd /cvs/cds/rtcds/caltech/c1/target/conlog/conlogepics

source conlog_environment.txt

./startup.c1conlog

This will leave you at an epics prompt, which means the code is running. (that's why I left it running in a screen for now).

To change the list of channels while the conlog is running, you need to edit the file(s):

/ligo/caltech/data/conlog/c1/add_channel_names
/ligo/caltech/data/conlog/c1/remove_channel_names

Then start up medm as follows:

cd ~/ryan/

medm -x -macro "IFO=C1" medm/CONLOG.adl

Then click the Add channel list button or Remove channel list button.

To change the channels before running the conlog from a blank database, you would edit:
/ligo/caltech/data/conlog/c1/use_channel_names (I believe this should be read whenever the conlog is restarted, but I'm only sure it does the first time you start conlog).




Documenting the rest of the installation:


Successful? Installation of Fresh EPICS and Extensions



Fresh Copy of EPICS 3.14.10


* We restarted (on Patrick's suggestion) with a fresh copy of EPICS 3.14.10 in:
/ligo/apps/linux-x86_64/epics
* I had to set a clean environment:
* Then I downloaded the tarball, unpacked it, and simply ran make within it, and it worked!
* Next, I followed Patrick's wiki instructions with only modifications to the configure/RELEASE files for the archive and ioc/conlog extensions.
* Then I realized that I had to rebuild the conlog ioc after adding a directory:
/ligo/apps/linux-x86_64/epics/iocs/conlog/iocBoot/iocc1_conlog
* I had to copy the files from the h1 directory and modify them so that all references to h1 now point to c1 in the new directory
* I then rebuilt the conlog ioc (I had to make sure setenv SCRIPTS was run again because I had logged out and back in, and I reset the whole environment properly again)

Rest of Install


* I was able to fairly trivially follow through the rest of the installation steps on Patrick's wiki, up to the "Design" section.
* Now, there is no obvious way to move forward (nothing is actually running I believe).

New Install Instructions:


"
You want to create the EPICS IOC boot directory by doing the following:

In the top level of the IOC (.../epics/iocs/conlog) with the appropriate
paths:

/epics/base/bin/linux-x86_64/makeBaseApp.pl -i -t ioc c1_conlog
What application should the IOC(s) boot?
The default uses the IOC's name, even if not listed above.
Application name? conlog

Then you will have to update the st.cmd it creates. You can compare the
st.cmd and st.cmd.backup files in the other directories to see what needs
to be changed.

I don't know if just copying the directory will work, it might.

You will also probably want to change the following line in
/epics/modules/archive/archiveApp/src/drvArchive.c:

queue = epicsMessageQueueCreate(100000000, sizeof(struct message_type));

and reduce 100000000 to something smaller depending on the amount of ram
available to the computer. I think sizeof(struct message_type) is
something like 112 bytes. Then recompile.

You basically put a file with the list of channels to use in the directory
path for "use channel list filepath" in the following command in st.cmd:
drvArchive_read_channel_list_filepaths <add channel list filepath>,<remove
channel list filepath>,<use channel list filepath>
I can send you the script that I currently use to generate that channel
list if you want, but it may need to be changed for your setup.

Once you start the ioc, open the medm which can be checked out from
subversion here: cds_user_apps/trunk/cds/common/medm/CONLOG.adl
with macro substitution for IFO: medm -x -macro "IFO=C1"
and click on the button for "Use channel list".
The number of monitored channels should increase to the number of channels
in the file.

-Patrick

...

The perl script and example file are attached, just redirect the output of
the perl script to a file. It scans autoBurt.req files in a particular
directory and its subdirectories for channel names that meet certain
criteria. All the file contains is a list of channel names, one on each
line. To start the IOC, go to the target directory and run
./startup.c1conlog.

"

{{rpfisher:scan_autoburt.pl.txt|scan_autoburt.pl.txt}}



{{rpfisher:use_channel_names.txt|use_channel_names.txt}}



My Notes Regarding These Instructions:


* Throughout the installation instructions, it probably should have been made clear that the ifo is lowercase: eg c1 (but in the end the installation mixed C1 and c1 in different places)
* Also throughout, one should be careful to replace lho with your site (eg caltech) wherever it appears
* After running the first perl script to generate the iocBoot/iocc1_conlog directory, the goal is to rebuild the whole conlog ioc by running make from epics/iocs/conlog, but before doing that:
* I needed to change the suggested line in the archive module to match the correct RAM size of the machine I am installing on (I actually gave it just less than half the free RAM), then:
* Remake the archive module
* Change into the ioc/conlog directory, remove the iocBoot I had made before for c1, rerun the perl script above, then run make from the ioc/conlog directory.
* Once that is done, you need to edit the file st.cmd to add lines for the reading and writing of channels, such as:
dbLoadRecords("db/conlog.db","IFO=C1")
drvArchive_mysql "C1_conlog","/data/mysql/C1_conlog_epics_user"
drvArchive_read_channel_list_filepaths "/ligo/caltech/data/conlog/c1/add_channel_names","/ligo/caltech/data/conlog/c1/remove_channel_names","/ligo/caltech/data/conlog/c1/use_channel_names"
drvArchive_write_channel_list_filepaths "/ligo/caltech/data/conlog/c1/channel_names","/ligo/caltech/data/conlog/c1/monitored_channel_names","/ligo/caltech/data/conlog/c1/unmonitored_channel_names"
* I also had to rerun this set of commands once that was all done:
controls: cd /opt/rtcds/<site>/<ifo>/target/conlog/conlogepics
controls: cp -r /ligo/apps/linux-x86_64/epics/iocs/conlog/db ./
controls: cp -r /ligo/apps/linux-x86_64/epics/iocs/conlog/dbd ./
controls: cp /ligo/apps/linux-x86_64/epics/iocs/conlog/iocBoot/ioc<ifo>_conlog/envPaths ./
controls: cp /ligo/apps/linux-x86_64/epics/iocs/conlog/iocBoot/ioc<ifo>_conlog/st.cmd ./
controls: echo ./bin/linux-x86_64/conlog st.cmd > startup.<ifo>conlog
controls: chmod a+x startup.<ifo>conlog
* This set of instructions seemed to be lacking, so I added this:

cp -r /ligo/apps/linux-x86_64/epics/iocs/conlog/bin/ ./

* Now the executable runs but doesn't work:
* Fixes needed:
* Need to use root permissions and make sure the files in /data/mysql have the right names for the users the code expects and also have the right passwords. (Have to match capitalization appropriately for the <ifo> tag everywhere.)
* Might need to go into mysql and add a new user with the proper capitalization also
* Need to edit LD_LIBRARY_PATH to point to the new epics libraries (see the suggested environment below)
* Now, the code seems to work, but dumps me at an "epics> " prompt; I'm asking Patrick what to do next.
* I was impatient and loaded up the medm screen, and found out that the one channel I had picked was not readable (unmonitored)
* I ran a modified version of Patrick's perl script to search autoBurt files for channels, and replaced my use_channel_names file with the output of the script
* Now, the medm screen shows lots of monitored channels, and the conlog is filling up! (can see it from phpmyadmin)
* Next step: I wanted to get the php pages working, so I edited the files inside /var/www/conlog:
megatron:~/ryan>diff -u /var/www/conlog/conlog.php /var/www/conlog/conlog.php.bck.Mar82012_1
--- /var/www/conlog/conlog.php  2012-03-08 15:31:53.152547771 -0800
+++ /var/www/conlog/conlog.php.bck.Mar82012_1   2012-03-08 15:28:23.062704171 -0800
@@ -19,7 +19,7 @@
        <form action="query.php" method="post">
                <h3><label for="database">Database:</label></h3>
                <select id="database" name="database">
-                       <option value="C1_conlog">C1</option>
+                       <option value="h2_conlog">h2</option>
                </select>
 
                <h3><label for="included_channels">Included channels:</label></h3>

megatron:~/ryan>diff -u /var/www/conlog/query.php /var/www/conlog/query.php.bck.Mar82012_1
--- /var/www/conlog/query.php   2012-03-08 15:33:45.122550303 -0800
+++ /var/www/conlog/query.php.bck.Mar82012_1    2012-03-08 15:32:31.772554679 -0800
@@ -168,8 +168,8 @@
        }
 
        $database_name = $_POST["database"];
-       if ($database_name == 'C1_conlog') {
-               $server = '192.168.113.209';
+       if ($database_name == 'h2_conlog') {
+               $server = 'cdsconlog';
        }
        else {
                die('Unknown database.');

* Finally, the mysql server was denying access from outside queries, so I fixed that:
megatron:~/ryan>diff -u /etc/mysql/my.cnf /etc/mysql/my.cnf.bck.Mar82012_1
--- /etc/mysql/my.cnf   2012-03-08 15:35:57.122548370 -0800
+++ /etc/mysql/my.cnf.bck.Mar82012_1    2012-03-08 15:35:10.652559315 -0800
@@ -49,7 +49,7 @@
 #
 # Instead of skip-networking the default is now to listen only on
 # localhost which is more compatible and is not less secure.
-#bind-address          = 127.0.0.1
+bind-address           = 127.0.0.1
 #
 # * Fine Tuning
 #
megatron:~/ryan>
* Now, I think everything is working... almost:
* It seems that when you first start the conlog up, it finds all the variables and inserts values of "Null" for everything, but after that it detects changes properly!


Conlog Environment


Need to source this to use the new environment:

megatron:~>cat ~/ryan/conlog_enviroment.txt
  6391   Fri Mar 9 11:02:56 2012   Ryan Fisher   Summary   Computer Scripts / Programs   Alterations to base epics install for installing aLIGO conlog:

Too Many Fast-Changing EPICS Channels in New Conlog


I have been monitoring the new conlog, and it already has far too many rows.

I'm going through the list of channels to exclude in the update_channels script for the conlog that is currently running and removing them from
the monitored channels in the new conlog using the remove_channel_names file and the medm screen (we may want to just wipe out the tables
and start over after this is set properly, but for now I'm keeping them):
#-- Exclude a few uninteresting or obsolete categories
if ( $chan =~ m/^[BIJ]$/ ||
$chan =~ m/IOO-MC_PWR_IN/ ||
$chan eq "C1:PSL-FSS_SLOWDC" ||
$chan =~ m/PSL-STAT_.*_BITS/ ||
$chan =~ m/:IOO-PZTM[12]_(PIT|YAW)_BIAS$/ ||
$chan =~ m/DAQ.*_cycle/ ||
$chan =~ m/DAQ.*rtSeconds/ ||
$chan =~ m/C1:-.*/ ||
$chan =~ m/C1:SUP/ ||
$chan =~ m/C1:SP/ ||
$chan =~ m/C1:X/ ||
$chan =~ m/C1:TST/ ||
$chan =~ m/C1:RF/ ||
$chan =~ m/C1:UCT/ ||
$chan =~ m/C1:DU/ ||
$chan =~ m/C1:MCP/ ||
$chan =~ m/C1:MCS/ ||
$chan =~ m/C1:FEC/ ||
$chan =~ m/C1:PEM/ ||
$chan =~ m/C1:LSP/ ||
$chan =~ m/C1:NIO/ ||
$chan =~ m/C1:WFS/ ||
$chan =~ m/C1:ASC-WFS/ ||
$chan =~ m/C1:ASC-SP/ ||
$chan =~ m/C1:VG/ ||
$chan =~ m/C1:IOO-DOF/ ||
$chan =~ m/C1:IOO-EO/ ||
$chan =~ m/Name/ ||
$chan =~ m/DEFAULTNAME/ ||
$chan =~ m/:IOO-PZT.*OFFSET/ ||
$chan =~ m/PD_VAR$/ ||
$chan =~ m/_INMON$/ ||
$chan =~ m/_EXCMON$/ ||
$chan =~ m/_OUT16$/ ||
$chan =~ m/_OUTMON$/ ||
$chan =~ m/_OUTPUT$/ ||
$chan =~ m/_RSET$/ ||
$chan =~ m/_ALIVE$/ ||
$chan =~ m/VMon$/ ||
$chan =~ m/PDMon$/ ||
$chan =~ m/(BiasVMon|FE_PPOLL|MASTER_OVERFLOW|FSS_TIDALSET|CPU_LOAD|CDM_STAT|State_Bits|INDCOFFSET)/ )

With these removals, only 15493 channels are being monitored now.
  6394   Fri Mar 9 15:48:56 2012   Ryan Fisher   Summary   Computer Scripts / Programs   Alterations to base epics install for installing aLIGO conlog:

I decided to make a backup of the database and then delete it and make a new database:
 
cd ~/ryan/database_dumpMar92012
mysqldump -u root -p C1_conlog > C1_conlog.dump.Mar92012 
Note: it appears this failed the first time; thankfully this wasn't a production service yet. In the future, do not trust this backup method for important data!
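A quick way to catch a truncated dump is to check for the completion footer mysqldump writes at the end of the file. A sketch, assuming the default mysqldump options that emit the "-- Dump completed" comment:

# Sanity-check a mysqldump output file: a complete dump normally ends with a
# "-- Dump completed" comment line.  Sketch only; an empty or cut-off file fails.
import sys

def dump_looks_complete(path):
    last = ""
    with open(path) as f:
        for line in f:
            if line.strip():
                last = line.strip()
    return last.startswith("-- Dump completed")

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "C1_conlog.dump.Mar92012"
    print(path, "looks complete" if dump_looks_complete(path) else "looks TRUNCATED")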

Next, log into mysql as root, drop the database, remake it, and grant privileges again:
(This is saved in megatron:~/ryan/restore_database.txt)
megatron:~/ryan>mysql -u root -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 174
Server version: 5.1.41-3ubuntu12.10 (Ubuntu)

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> list databases;
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'list databases' at line 1
mysql> list users;      ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'list users' at line 1
mysql> use C1_conlog
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> list users;
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'list users' at line 1
mysql> select User from mysql.user;
+------------------+
| User             |
+------------------+
| php              |
| C1_conlog_epics  |
| c1_conlog_epics  |
| root             |
| C1_conlog_epics  |
| c1_conlog_epics  |
| debian-sys-maint |
| root             |
| root             |
+------------------+
9 rows in set (0.00 sec)

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| C1_conlog          |
| mysql              |
+--------------------+
3 rows in set (0.00 sec)

mysql> drop database C1_conlog ;
Query OK, 2 rows affected (0.56 sec)

mysql> create database C1_conlog;
Query OK, 1 row affected (0.00 sec)

mysql> use C1_conlog ;
Database changed
mysql> SET SQL_MODE="NO_AUTO_VALUE_ON_ZERO";
Query OK, 0 rows affected (0.00 sec)

mysql>
mysql> CREATE TABLE `channels` (
    ->   `channel_id` mediumint(8) unsigned NOT NULL AUTO_INCREMENT,
    ->   `channel_name` varchar(60) NOT NULL,
    ->   PRIMARY KEY (`channel_id`),
    ->   UNIQUE KEY `channel_name` (`channel_name`)
    -> ) ENGINE=MyISAM  DEFAULT CHARSET=latin1;
Query OK, 0 rows affected (0.04 sec)

mysql>
mysql> CREATE TABLE `data` (
    ->   `acquire_time` decimal(26,6) NOT NULL,
    ->   `channel_id` mediumint(8) unsigned NOT NULL,
    ->   `value` varchar(40) DEFAULT NULL,
    ->   `status` tinyint(3) unsigned DEFAULT NULL,
    ->   `connected` tinyint(1) unsigned NOT NULL,
    ->   PRIMARY KEY (`channel_id`,`acquire_time`)
    -> ) ENGINE=MyISAM DEFAULT CHARSET=latin1;
Query OK, 0 rows affected (0.03 sec)

mysql> grant select, insert, update, execute on * to 'c1_conlog_epics'@'127.0.0.1';
Query OK, 0 rows affected (0.00 sec)

mysql> grant select, insert, update, execute on * to 'C1_conlog_epics'@'127.0.0.1';
Query OK, 0 rows affected (0.00 sec)

mysql> grant select, insert, update, execute on * to 'c1_conlog_epics'@'localhost';
Query OK, 0 rows affected (0.00 sec)

mysql> grant select, insert, update, execute on * to 'C1_conlog_epics'@'localhost';
Query OK, 0 rows affected (0.00 sec)

mysql> grant select on C1_conlog to 'php'@'%';
ERROR 1146 (42S02): Table 'C1_conlog.C1_conlog' doesn't exist
mysql> grant select on * to 'php'@'%';
Query OK, 0 rows affected (0.00 sec)

mysql> select * from mysql.users
    -> ;
ERROR 1146 (42S02): Table 'mysql.users' doesn't exist
mysql> select User from mysql.user;        
| C1_conlog_epics  |
| c1_conlog_epics  |
| root             |
| C1_conlog_epics  |
| c1_conlog_epics  |
| debian-sys-maint |
| root             |
| root             |
+------------------+
9 rows in set (0.00 sec)

mysql> Bye 



Next, I decided that I want to index on the acquire_time instead of the combination of channel_id and acquire_time (I think it makes a lot of sense for several query types, and especially debugging the conlog!):
mysql> create index acquire_time_index on data(acquire_time);
Query OK, 0 rows affected (0.04 sec)
Records: 0  Duplicates: 0  Warnings: 0

Next Fix:


The above worked well, but when I restarted the conlog, I had to re-execute the "remove_channels" from the medm, because initially all channels were being loaded (use_channel_names had all the channels still).
Additionally, there were a lot of channels with "*RMS*" in the name that were being recorded, and were changing relatively quickly, so I have added those to the remove_channel_names file.

I am going to: Backup the files in /ligo/caltech/data/conlog/c1
Edit use_channel_names to only have the good channels.
Dump the database again
Stop conlog.
Wipe the database again.
Remake the database again (with permissions and the new index).
Restart the conlog and hope!

The fix above seems to be in place and working. The database has the initial entries for the channels it monitors and is not growing without operators changing EPICS values.

  6396   Fri Mar 9 16:28:10 2012   Ryan Fisher   Summary   Computer Scripts / Programs   Alterations to base epics install for installing aLIGO conlog:
I created a page on the wiki for the new EPICS log (conlog):
https://wiki-40m.ligo.caltech.edu/aLIGO%20EPICs%20log%20%28conlog%29

I also edited this with restart instructions:
https://wiki-40m.ligo.caltech.edu/Computer_Restart_Procedures#megatron
  6399   Sat Mar 10 15:29:47 2012   Zach   HowTo   Computer Scripts / Programs   ModeMatchr

For your mode matching pleasure, I have added a tool called "ModeMatchr" to the SVN under /trunk/zach/tools/modematchr/

It uses the usual fminsearch approach, but tolerates a fully astigmatic input (i.e., w0ix ≠ w0iy, z0ix ≠ z0iy) and allows for transforming to an elliptical waist  (i.e., w0fx ≠ w0fy, but z0fx = z0fy). It would be straightforward to allow for z0fx ≠ z0fy, but I have never seen a case when we actually wanted this. On the other hand, the elliptical output ability is nice for coupling to wide-angle ring cavities.

It also does the looping through available lenses for you , and retains the best solution for each lens combination in an output cell, which can then be combed with another function (getOtherSol). fminsearch is incredibly fast: with a 10-lens bank, it finds all 100 best solutions on my crappy MacBook in <10s.

I have also included the functionality to constrain the length of the total MMT to within some percentage of the optimal distance, which helps to sift through the muck .

MMrLogo.png

  6415   Wed Mar 14 13:27:15 2012   Zach   HowTo   Computer Scripts / Programs   ModeMatchr

I have added to ModeMatchr the capability to fix the total MMT distance. This is nice if you are coupling to a cavity some fixed distance away. The blurb from the help:

% Note: for any total length constraint dtot_tol > 0, ModeMatchr will use
% fminsearch to find the best solutions near your nominal dtot, and then
% omit solutions whose dtot lie outside your tolerance. For dtot_tol = 0,
% ModeMatchr actively constrains dtot to your value, and then finds the
% best solution. Therefore, set dtot_tol = 0 if you have a fixed distance
% into which to put a MMT.

Quote:

For your mode matching pleasure, I have added a tool called "ModeMatchr" to the SVN under /trunk/zach/tools/modematchr/

It uses the usual fminsearch approach, but tolerates a fully astigmatic input (i.e., w0ix ≠ w0iy, z0ix ≠ z0iy) and allows for transforming to an elliptical waist  (i.e., w0fx ≠ w0fy, but z0fx = z0fy). It would be straightforward to allow for z0fx ≠ z0fy, but I have never seen a case when we actually wanted this. On the other hand, the elliptical output ability is nice for coupling to wide-angle ring cavities.

It also does the looping through available lenses for you , and retains the best solution for each lens combination in an output cell, which can then be combed with another function (getOtherSol). fminsearch is incredibly fast: with a 10-lens bank, it finds all 100 best solutions on my crappy MacBook in <10s.

I have also included the functionality to constrain the length of the total MMT to within some percentage of the optimal distance, which helps to sift through the muck .

MMrLogo.png

 

  6534   Fri Apr 13 16:09:43 2012   Suresh   Update   Computer Scripts / Programs   ACAD 2002 installed on C21530

I have installed ACAD 2002 on one of the Windows machines in the Control Room.    It is on the machine which has Solid Works (called C21530). 

The installation files are in MyDocuments under Acad2002.  This is a shared LIGO license which Christian Cepada had with him.

I hope we will be able to open our optical layout diagrams with this and update them even though it is an old version.

 

 

  6550   Thu Apr 19 16:21:04 2012   Zach   Update   Computer Scripts / Programs   Arbcav updated, made badass

I have modified Arbcav to be way cooler than it used to be.

Main modifications:

  • Can now truly model an arbitrary cavity geometry
    • The previous version could only handle a few different topologies. In each case, it would unfold the cavity into the equivalent linear cavity and use the g-parameter method to calculate gouy phases, etc.
    • The new model uses the closed cavity propagation matrix to find the supported mode, and then explicitly calculates the accumulated gouy phase by propagating the beam through the full cavity. This is done analytically with zR, so there is negligible slow-down.
  • Now plots a diagram of the cavity geometry, both to help you and for you to verify that it is calculating the right thing (<-- this is the cool part)
    • Plots the beam path and mirror locations
    • Specifies whether mirrors are curved or flat
    • Prints mirror parameters next to them
    • Finds all intracavity waist locations and plots them
    • Gives waist information (size in X, Y)

Since the information is already there, I will have the output structure include things like the input beam q parameter, which could then be fed directly to mode matching tools like ModeMatchr.

The function takes as input the same arguments as before. Example for a square cavity:

out = arbcav([200e-6 50e-6 200e-6 50e-6],[0.75 0.75 0.75 0.75],[1e10 9 1e10 9],[45 45 45 45],29.189e6,10e-6,1064e-9,1000);

i.e.,

out = arbcav(transmissivity_list, length_list, RoC_list, angle_list, modulation_freq, loss_list_or_loss_per_mirror, wavelength, num_pts_for_plot);

If you don't give it a modulation frequency, it will just plot carrier HOMs. If you don't give it RoCs and angles, it will just plot the transmission spectrum.

 

I'm still fine-tuning some functionality, but I should have it up on the SVN relatively soon. Comments or suggestions are welcome!

 

Some screenshots:

Cavity geometry plots (linear, triangular, square, bowtie):

linear.png  triangular.png  gyro.png  bowtie.png

 

Transmission and HOM spectra (these correspond to the square cavity at lower left, above):

gyro_spect.png  gyro_HOM.png

  6570   Wed Apr 25 21:24:10 2012   Den   Update   Computer Scripts / Programs   c1oaf

C1OAF model, codes and medm screens are updated. All proper files are committed to svn and updated at the new model path.

  6686   Fri May 25 19:13:10 2012   Duncan Macleod   Summary   Computer Scripts / Programs   40m summary webpages

40m summary webpages

 The aLIGO-style summary webpages are now running on 40m data! They are running on megatron so can be viewed from within the martian network at:

http://192.168.113.209/~controls/summary

At the moment I have configured the 5 seismic BLRMS bands, and a random set of PSL channels taken from a strip tool.

Technical notes

  • The code is in Python, depending heavily on the LSCSoft PyLAL and GLUE modules.
    • /home/controls/public_html/summary/bin/summary_page.py
  • The HTML is supported by a CSS script and a JS script which are held locally in the run directory, and JQuery linked from the google repo.
    • /home/controls/public_html/summary/summary_page.css
    • /home/controls/public_html/summary/pylaldq.js
  • The configuration is controlled via a single INI format file
    • /home/controls/public_html/summary/share/c1_summary_page.ini

Getting frames

Since there are no segments or triggers for C1, the only data sources are GWF frames. These are mounted from the framebuilder under /frames on megatron. There is a python script that takes in a pair of GPS times and a frame type that will locate the frames for you. This is how you use it to find T type frames (second trends) for May 25 2012:

python /home/controls/public_html/summary/bin/framecache.py --ifo C1 --gps-start-time 1021939215 --gps-end-time 1022025615 --type T -o framecache.lcf

If you don't have GPS times, you can use the tconvert tool to generate them

$ tconvert May 25
1021939215

The available frame types, as far as I'm aware, are R (raw), T (second trends), and M (minute trends).

Running the code

The code is designed to be fairly easy to use, with most of the options set in the ini file. The code has three modes - day, month, or GPS start-stop pair. The month mode is a little sketchy so don't expect too much from it. To run in day mode:

python /home/controls/public_html/summary/bin/summary_page.py --ifo C1 --config-file /home/controls/public_html/summary/share/c1_summary_page.ini --output-dir . --verbose --data-cache framecache.lcf -SRQDUTAZBVCXH --day 20120525

Please forgive the large, apparently arbitrary collection of letters: since the 40m doesn't use segments or triggers, these options disable processing of those elements, and there are quite a few of them. They correspond to --skip-something options in long form. To see all the options, run

python /home/controls/public_html/summary/bin/summary_page.py --help

There is also a convenient shell script that will run over today's data in day mode, doing everything for you. This will run framecache.py to find the frames, then run summary_page.py to generate the results in the correct output directory. To use this, run

bash /home/controls/public_html/summary/bin/c1_summary_page.sh

Configuration

Different data tabs are disabled via command-line --skip-this-tab style options, but the content of tabs is controlled via the ini file. I'll try to give an overview of how to use these. The only configuration required for the Seismic BLRMS 0.1-0.3 Hz tab is the following section:

 

[data-Seismic 0.1-0.3 Hz]
channels = C1:PEM-RMS_STS1X_0p1_0p3,C1:PEM-RMS_STS1Y_0p1_0p3,C1:PEM-RMS_STS1Z_0p1_0p3
labels = STS1X,STS1Y,STS1Z
frame-type = R
plot-dataplot1 =
plot-dataplot3 =
amplitude-log = True
amplitude-lim = 1,500
amplitude-label = BLRMS motion ($\mu$m/s)

The entries can be explained as follows:

  1. '[data-Seismic 0.1-0.3 Hz]' - This is the section heading. The 'data-' prefix identifies this as data and is a relic of how the code is written; the 'Seismic 0.1-0.3 Hz' part is the name of the tab to be displayed in the output.
  2. 'channels = ...' - This is a comma-separated list of channels as they are named in the frames. These must be exact so the code knows how to find them.
  3. 'labels = STS1X,STS1Y,STS1Z' - This is a comma-separated list of labels mapping channel names to something more readable for the plots, this is optional.
  4. 'frame-type = R' - This tells the code what frame type the channels are, so it can determine from which frames to read them; this is not optional, I think.
  5. 'plot-dataplotX' - This tells the code I want to run dataplotX for this tab. Each 'dataplot' is defined in its own section, and if none of these options are given, the code tries to use all of them. In this configuration 'plot-dataplot1' tells the code I want to display the time-series of data for this tab.
  6. 'amplitude-XXX = YYY' - This gives the plotter specific information about this tab that overrides the defaults defined in the dataplotX section. The options in this example tell the plotter that when plotting amplitude on any plot, that axis should be log-scale, with a limit of 1-500 and with a specific label. The possible plotting configurations for this style of option are: 'lim', 'log', 'label', I think.

Other compatible options not used in this example are:

 

  • scale = X,Y,Z - a comma-separated list of scale factors to apply to the data. This can either be a single entry for all channels, or one per channel, nothing in between.
  • offset = X,Y,Z - another comma-separated list of DC offsets to apply to the data (before scaling, by default). DAQ noise may mean a channel that should read zero during quiet times is offset by some fixed amount, so you can correct that here. Again, either one for all channels, or one per channel.
  • transform = lambda x: f(x) - a python format lambda function. This is basically any mathematical function that can be applied to each data sample. By default the code constructs the function 'lambda d: scale * (d-offset)', i.e. it calibrates the data by removing the offset and applying the scale.
  • band = fmin, fmax - a low,high pair of frequencies within which to bandpass the data. Sketchy at best...
  • ripple_db = X - the ripple in the stopband of the bandpass filter
  • width = X - the width in the passband of the bandpass filter
  • rms_average = X - number of seconds in a single RMS average (combine with band to make BLRMS)
  • spectrum-segment-length = X - the length of FFT to use when calculating the spectrum, as a number of samples
  • spectrum-overlap = X - the overlap (samples) between neighbouring FFTs when calculating the spectrum
  • spectrum-time-step = X - the length (seconds) of a single median-mean average for the spectrogram

At the moment a package version issue means the spectrogram doesn't work, but the spectrum should. At the time of writing, to use the spectrum simply add 'plot-dataplot2'.

You can view the configuration file within the webpage via the 'About' link off any page.

Please e-mail any suggestions/complaints/praise to duncan.macleod@ligo.org.

  6687   Fri May 25 20:45:25 2012   Duncan Macleod   Summary   Computer Scripts / Programs   40m summary webpages

There is now a job in the crontab that will run the shell wrapper every hour, so the pages _should_ take care of themselves. If you make adjustments to the configuration file they will get picked up on the hour, or you can just run the script by hand at any time.

$ crontab -l
# m h  dom mon dow   command
0 */1 * * * bash /home/controls/public_html/summary/bin/c1_summary_page.sh > /dev/null 2>&1

  6757   Tue Jun 5 21:09:40 2012   yuta   Update   Computer Scripts / Programs   hacked ezca tools

Currently, the ezca tools are flaky and fail too much.
So, I hacked ezca tools just like Yoichi did in 2009 (see elog #1368).

For now,

/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcaread
/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcastep
/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcaswitch
/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcawrite

are wrapper scripts that repeat the ezca command until it succeeds (or fails more than 5 times).

Of course, this is just a temporary solution to do tonight's work.
To stop this hack, run /users/yuta/scripts/ezhack/stophacking.cmd. To hack, run /users/yuta/scripts/ezhack/starthacking.cmd.

Original binary files are located in /ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcabackup/ directory.
Wrapper scripts live in /users/yuta/scripts/ezhack directory.

I wish I could alias the ezca tools to my wrapper scripts so that I don't have to touch the original files. However, alias settings don't work in our scripts.
Do you have any idea?

  6768   Wed Jun 6 18:04:22 2012   Jamie   Update   Computer Scripts / Programs   hacked ezca tools

Quote:

Currently, the ezca tools are flaky and fail too much.
So, I hacked ezca tools just like Yoichi did in 2009 (see elog #1368).

For now,

/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcaread
/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcastep
/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcaswitch
/ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcawrite

are wrapper scripts that repeat the ezca command until it succeeds (or fails more than 5 times).

Of course, this is just a temporary solution to do tonight's work.
To stop this hack, run /users/yuta/scripts/ezhack/stophacking.cmd. To hack, run /users/yuta/scripts/ezhack/starthacking.cmd.

Original binary files are located in /ligo/apps/linux-x86_64/gds-2.15.1/bin/ezcabackup/ directory.
Wrapper scripts live in /users/yuta/scripts/ezhack directory.

I wish I could alias the ezca tools to my wrapper scripts so that I don't have to touch the original files. However, alias settings don't work in our scripts.
Do you have any idea?

I didn't like this solution, so I hacked up something else.  I made a new single wrapper script to handle all of the utils.  It then executes the correct command based on the zeroth argument (see below).

I then moved all the binaries to give them .bin suffixes, and then made links to the new wrapper script.  Now everything should work as expected, with this new retry feature.

controls@rosalba:/ligo/apps/linux-x86_64/gds-2.15.1/bin 0$ for pgm in ezcaread ezcawrite ezcaservo ezcastep ezcaswitch; do mv $pgm{,.bin}; ln ezcawrapper $pgm; done
controls@rosalba:/ligo/apps/linux-x86_64/gds-2.15.1/bin 0$ cat ezcawrapper
#!/bin/bash

retries=5

pgm="$0"
run="${pgm}.bin"

if ! [ -e "$run" ] ; then
    cat >&2 <<EOF
This is the ezca wrapper script.  It should be hardlinked in place of
the ezca commands (ezcaread, ezcawrite, etc.), and executing the
original binaries (that have been moved to *.bin) with $retries
failure retries.
EOF
    exit -1
fi

if [ -z "$@" ] || [[ "$1" == '-h' ]] ; then
    "$run"
    exit
fi

for try in $(seq 1 "$retries") ; do
    if "$run" "$@"; then
	exit
    else
	echo "retrying ($try/$retries)..." >&2
    fi
done
echo "$(basename $pgm) failed after $retries retries." >&2
exit 1

  6769   Wed Jun 6 18:22:52 2012   Jamie   Update   Computer Scripts / Programs   hacked ezca tools

Quote:

I didn't like this solution, so I hacked up something else.  I made a new single wrapper script to handle all of the utils.  It then executes the correct command based on the zeroth argument (see below).

I then moved all the binaries to give them .bin suffixes, and then made links to the new wrapper script.  Now everything should work as expected, with this new retry feature.

Yuta and I added a feature such that it will not retry if the environment variable EZCA_NORETRY is set, e.g.

$ EZCA_NORETRY=true ezcaread FOOBAR

  6803   Tue Jun 12 13:49:32 2012   Jamie   Configuration   Computer Scripts / Programs   tconvert

A nicer, better maintained version of tconvert is now supplied by the lalapps package.  It's called lalapps_tconvert.  I installed lalapps on all the workstations and aliased tconvert to point to lalapps_tconvert.

  6863   Sun Jun 24 23:42:31 2012   yuta   Update   Computer Scripts / Programs   PMC locker

I made a python script for relocking PMC.
It currently lives in /opt/rtcds/caltech/c1/scripts/PSL/PMC/PMClocker.py.

I think the hardest part for this kind of locker is the scan speed. I could make the scan relatively fast by using pyNDS.
The basic algorithm is as follows.

1. Turns off the servo by C1:PSL-PMC_SW1.

2. Scans C1:PSL-PMC_RAMP using ezcastep.bin. The default setting for ezcastep is

ezcastep.bin C1:PSL-PMC_RAMP -s 0.1 0.01 10000

So, it steps by 0.01, 10000 times, with an interval of 0.1 sec.

3. Get C1:PSL-PMC_PMCTRANSPD and C1:PSL-PMC_RAMP online 1 sec data using pyNDS.

4. If it finds a tall peak in C1:PSL-PMC_PMCTRANSPD, kills ezcastep.bin process, sets C1:PSL-PMC_RAMP to the value where the tall peak was found, and then turns on the servo.

5. If tall peak wasn't found, go back to 3 and get data again.

6. If C1:PSL-PMC_RAMP reaches near -7 V or 0 V, it kills previous ezcastep.bin process and turns the sign of the scan.
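
A minimal sketch of the peak-detection part of steps 3-4, working on arrays that have already been fetched (illustrative only; the threshold is made up and this is not the code in PMClocker.py):

# Given ~1 s of transmission and ramp-voltage samples (e.g. fetched with pyNDS),
# decide whether a tall TEM00 peak was crossed and, if so, return the ramp value
# to jump back to before re-engaging the servo.
import numpy as np

def find_lock_point(trans, ramp, threshold=0.5):
    # Return the ramp voltage at the tallest transmission peak above threshold,
    # or None if no such peak is in this stretch of data.
    trans = np.asarray(trans)
    ramp = np.asarray(ramp)
    i = np.argmax(trans)
    if trans[i] < threshold:
        return None
    return ramp[i]

# usage sketch:
#   lock_v = find_lock_point(trans_1sec, ramp_1sec)
#   if lock_v is not None:
#       kill the ezcastep.bin process, set C1:PSL-PMC_RAMP to lock_v, turn the servo back on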

I tested this script several times. It sometimes passes over TEM00 (because of the dead time in online pyNDS?), but it locks the PMC within ~10 sec.
Currently, you have to run this to relock PMC because I don't know how to make this an autolocker.

I think use of pyNDS can be applied for finding IR resonance using ALS, too.
I haven't checked it yet because c1ioo is down, but the ALS version lives in /users/yuta/scripts/findIRresonance.py. ALS may be easier in that we can use fast channels and nice filter modules.

Other scripts:
 I updated /opt/rtcds/caltech/c1/scripts/general/toggler.py. It now has "lazymode". In lazymode, it toggles automatically with an interval of 1 sec until you Ctrl-C.

 Also, I moved damprestore.py from my users directory to /opt/rtcds/caltech/c1/scripts/SUS/damprestore.py. It restores suspension damping of a specified mirror when watchdog shuts down the damping.

  6871   Mon Jun 25 17:48:27 2012   yuta   Update   Computer Scripts / Programs   script for finding IR resonance using ALS

I made a python script for finding IR resonance using ALS. It currently lives in /opt/rtcds/caltech/c1/scripts/ALS/findIRresonance.py.

The basic algorithm is as follows.

1. Scan the arm by putting an offset to the phase output of the phase tracker(Step C1:ALS-BEAT(X|Y)_FINE_OFFSET_OFFSET by 10 deg with 3 sec ramp time).

2. Fetch TR(X|Y) and OFFSET online data using pyNDS during the step.

3. If it finds a tall peak, sets OFFSET to the value where the tall peak was found.

4. If tall peak wasn't found, go back to 1 and step OFFSET again.

The time series data showing how the script did is plotted below.
I ran the script for the Y arm, but it works for both the X and Y arms.

findIRresonance20120625.png

  6872   Mon Jun 25 21:54:52 2012   Den   Update   Computer Scripts / Programs   PMC locker

Quote:

I made a python script for relocking PMC.
It currently lives in /opt/rtcds/caltech/c1/scripts/PSL/PMC/PMClocker.py.

I thought we rewrite auto lockers once per year, but this time it took us only a month. I wrote it for PMC on May 24. Is it not working?

Could someone make it more clear why some scripts are written in bash, others in sh or python? I think we should establish a strict convention. Masha and I can work on it if anyone else considers this issue a problem.

  6873   Tue Jun 26 00:52:18 2012   yuta   Update   Computer Scripts / Programs   PMC locker

Quote:

I thought we rewrite auto lockers once per year, but this time it took us only a month. I wrote it for PMC on May 24. Is it not working?

I know.
I just wanted to use pyNDS for this kind of scanning & locking situation.
c1ioo was down for the weekend and I couldn't test my script for ALS, so I used it for PMC.

But I think PMClocker.py can relock PMC faster because it can sweep C1:PSL-PMC_RAMP continuously and can get continuous data of C1:PSL-PMC_PMCTRANSPD.

  6880   Wed Jun 27 11:35:06 2012   Sasha   Summary   Computer Scripts / Programs   SURF - Week 1 - Summary

I started playing with matlab for the first time, accurately simulated a coupled harmonic oscillator (starting from the basic differential equations, if anyone's curious), wrote a program to get a bode plot out of any simulation (regardless of the number of inputs/outputs), and read a lot.

I'm currently going through the first stage of simulating an ideal Fabry-Perot cavity (I technically started yesterday, but yesterday's work turned out to be wrong, so fresh start!), and other than yesterday's setback, it's going okay.

I attached a screenshot of my simulation of the pitch/pendulum motion of one of the mirrors LIGO uses. The bode plots for this one are turning out a little weird, but I'm fairly certain it's just a computational error and can be ignored (as the simulation matlab rendered without the coupling was really accurate - down to a floating point error). I have also attached these bode plots. The first bode plot is based on the force input, while the second is based on the torque input. It makes sense that there are two resonant frequencies, since there ought to be one per input.

 

Attachment 1: Screen_Shot_2012-06-27_at_11.27.10_AM.png
Attachment 2: Screen_Shot_2012-06-27_at_11.26.57_AM.png
Attachment 3: Screen_Shot_2012-06-27_at_11.27.29_AM.png
  6883   Wed Jun 27 15:10:34 2012   Jamie   Update   Computer Scripts / Programs   40m summary webpages move

I have moved the summary pages stuff that Duncan set up to a new directory that is accessible to the nodus web server and is therefore available from the outside world:

/users/public_html/40m-summary

which is available at:

https://nodus.ligo.caltech.edu:30889/40m-summary/

I updated the scripts, configurations, and crontab appropriately:

/users/public_html/40m-summary/bin/c1_summary_page.sh
/users/public_html/40m-summary/share/c1_summary_page.ini

 

  6885   Wed Jun 27 23:54:21 2012   yuta   Update   Computer Scripts / Programs   image capturing script

Mike J. came tonight and fixed Sensoray (elog #6645) by recompiling it.

I made a python wrapper script for Sensoray scripts. It currently lives in /users/yuta/scripts/videocapture.py.
If you run something like
  ./videocapture.py AS
it saves image capture of AS to /users/yuta/scripts/SensorayCapture/ directory with the GPS time.
Below is an example output of AS when MI is aligned. We still see some clipping on the right. This clipping is there when one arm is mis-aligned, and the clipping moves together with the main beam spot. So, this might be from the incident beam, probably at the Faraday.

Currently, videocapture.py runs only on pianosa, since Sensoray 2253S is connected to pianosa. Also, it can only capture MON4. My script changes MON4 automatically.

AS_1024901004.bmp

  6956   Wed Jul 11 09:48:24 2012   Liz   Summary   Computer Scripts / Programs   Update/daily summary testing

I have been working on configuration of the Daily Summary webpages and have been attempting to create a "PSL health" page.  This page will display the PMC power, the temperature on the PSL table and the PSL table microphone levels.  Thus far, I have managed to make the extra PSL tab and configure the graph of the interior temperature, using channel C1:PSL-FSS_RMTEMP.

I have been attempting to make a spectrogram for one of the PMC channels, but there is an issue with the spectrogram setup, as Duncan Macleod noted in ELOG 6686:

"At the moment a package version issue means the spectrogram doesn't work, but the spectrum should. At the time of writing, to use the spectrum simple add 'plot-dataplot2'."

 

 

Because of this issue, I have also been trying to make the spectrogram plots work.  Thus far, I have fixed the issue with one of the spectrogram plots, but there are several problems with the other four that I need to address.  I have also been looking at the microphone channels and trying to make the plot for them work.  I checked which microphone was on the PSL table and plotted it in matplotlib to make sure it was working.  However, when I tried to incorporate it into the daily summary pages, the script stops at that point!  It might simply be taking an excessively long time, but I have to figure out why this is the case.

                                                 (I am using channel C1:PEM-MIC_6_IN1_DQ, if this is blatantly wrong, please let me know!!)

 

The main point of this ELOG is that I have working test-daily summary pages online!  They can be found here:

https://nodus.ligo.caltech.edu:30889/40m-summary-test/archive_daily/20120710/

Also, if anyone has more requests for what they would like to see on the finalized summary pages site, please respond to this post or email me at: endavison@umail.ucsb.edu

  6986   Wed Jul 18 10:08:01 2012   Liz   Update   Computer Scripts / Programs   Week 5 update/progress

Over the past week, I have been focusing on the issues I brought up in my last ELOG,  6956.  I spent quite a while attempting to modify the script and create my own spectrogram function within the existing code.  I also checked out the channels on the PSL table for the PSL health page and produced a spectrogram plot of the PMC reflected, transmitted, and input powers, the PZT Voltage and the laser output power.  When I was entering these channels into the configuration script, I came across an issue with the way the python script parses this.  If there were spaces between the channel names (for example: C1:PSL-PMC_INPUT_DC, C1:PSL-PMC_RFPDDC... etc) the program would not recognize the channels.  I made some alterations to the parsing script such that all white spaces at the beginning and end of the channels were stripped and the program could find them.
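The change involved is small; something along these lines handles the spaces (illustrative of the fix, not the actual summary_page.py code):

# Parse a comma-separated channel list from the INI file, tolerating spaces
# around the commas.  Illustrative sketch of the parsing fix.
def parse_channel_list(value):
    return [c.strip() for c in value.split(',') if c.strip()]

# parse_channel_list("C1:PSL-PMC_INPUT_DC, C1:PSL-PMC_RFPDDC , C1:PSL-PMC_PMCTRANSPD")
# -> ['C1:PSL-PMC_INPUT_DC', 'C1:PSL-PMC_RFPDDC', 'C1:PSL-PMC_PMCTRANSPD']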

 The next thing that I worked on was attempting to see if the microphone channels were actually stopping the program or just taking an extraordinarily long time.  I tried running the program with shorter time samples and that seemed to work quite well!  However, I had to leave it running overnight in order to finish.  I am sure that this difference comes from the fact that the microphone channels are fast channels.  I would like to somehow make it run more quickly, and am thinking about how best to do this.

I finally got my spectrogram function to work after quite a bit of trouble.  There were issues with mismatched data and limit sets that I discovered came from times when only a few frames (one or two) were in one block.  I added some code to ignore small data blocks like those, and the program works very well now!  It seems like the best way to get the right limits is to let the program automatically set the limits (they are nicely log-scaled and everything), but there are some issues that produce questionable results.  I spent a while adding a colormap option to the script so that the spectrogram colors can be adjusted!  This mostly took so long because, on Monday night, some strange things were happening with the PMC that made the program fail (zeros were being output, which caused an uproar in the logarithmic data limits).  I was incredibly worried about this and thought that I had somehow messed up the script (this happened in the middle of when I was tinkering with the cmap option) so I undid all of my work!  It was only when I realized it was still going on and Masha and Jenne were talking about the PMC issues that I figured out that it was an external issue.  I then went in and set manual limits so that a blank spectrogram would be produced instead, and redid everything.

The spectrogram is now operational and the colormap can be customized.  I need to fix the problem with the autoscaled axes (perhaps adding a lower bound?) so that the program does not crash when there is an issue.

Yesterday, I spoke with Rana about what my next step should be.  He advised me to look at ELOGs from Steve (6678) and Koji (6675) about what they wanted to see on the site.  These gave me a good map of what is needed on the site and where I will go next.

I need to find out what is going on with the weather channels and figure out how to calibrate the microphones.  I will also be making sure there are correct units on all of the plots and figure out how to take only a short section of data for the microphone channels.  I have already modified the tab template so that it is similar to Koji's ELOG idea and will be making further changes to the layout of the summary pages themselves.  I will also be working on having the right plots up consistently on the site.

 

  6994   Fri Jul 20 11:59:27 2012   rana   Update   Computer Scripts / Programs   CONLOG not running

WE tried to use the new conlog today and discovered that:

1) No one at the 40m uses conlog because they don't know that it ever ran and don't know how to use regexp.

2) It has not been running since the last time Megatron was rebooted (probably a power outage).

3) We could not get it to run using the instructions that Syracuse left in our wiki.

Emails are flying.

  7001   Mon Jul 23 07:39:55 2012   Ryan Fisher   Summary   Computer Scripts / Programs   Alterations to base epics install for installing aLIGO conlog:
Note: The Conlog install instructions that I started from were located here:
https://awiki.ligo-wa.caltech.edu/aLIGO/Conlog
  7012   Mon Jul 23 20:19:01 2012   Liz   Update   Computer Scripts / Programs   Input Needed (From everyone!)

The summary pages are now online (Daily Summary), and will eventually be found on the 40m Wiki page under "LOGS-Daily Summary". (Currently, the linked website is the former summary page site)

Currently, all of the IFO and Acoustic channels have placeholders (they are not showing the real data yet) and the Weather channels are not working, although the Weather Station in the interferometer room is working (I am looking into this - any theories as to why this is would be appreciated!!).

I am looking for advice on what else to include in these pages.  It would be fantastic if everyone could take a moment to look over what I have so far (especially the completed page from July 23, 2012) and give me their opinions on:

 

1.  What else you would like to see included

2.  Any specific applications to your area of work that I have overlooked

3.  What the most helpful parts of the pages are

4.  Any ways that I could make the existing pages more helpful

5.  Any other questions, comments, clarifications or suggestions

Finally, are the hourly subplots actually helpful?  It seems to me like they would be superfluous if the whole page were updating every 1-2 hours (as it theoretically eventually will).  These subplots can be seen on the July 24, 2012 page.

My email address is endavison@umail.ucsb.edu.

 Thank you!  

 

  7023   Wed Jul 25 11:22:39 2012   Liz   Update   Computer Scripts / Programs   Week 6 update

This week, I made several modifications to the Summary page scripts, made preliminary Microphone BLRMS channels and, with Rana's help, got the Weather Station working again. 

I changed the spectrogram and spectrum options in the Summary Pages so that, given the sampling frequency (which is gathered by the program), the NFFT and overlap are calculated internally.  This is an improvement over user-entered values because it saves the time of having to know the sampling frequency for each desired plot.  In addition, I set up another .sh file that can generate summary pages for any given day.  Although this will probably not be useful in the final site, it is quite helpful now because I can go back and populate the pages.  The current summary pages file is called "c1_summary_page.sh" and the one that is set up to get a specific day is called "liz_c1_summary_page.sh".  I also made a few adjustments to the .css file for the webpage so that plots completely show up (they were getting cut off on the edges before) and are easier to see.  I also figured out that the minute and second trend options weren't working because the channel names have to be modified to CHANNEL.mean, CHANNEL.min and CHANNEL.max.  So that is all in working order now, although I'm not sure if I should just use the mean trends or look at all of them (the plots could get crowded if I choose to do this).  Another modification I made to the python summary page script was adding an option to have an image on one of the pages.  This was useful because I can now put requested MEDM screens up on the site.  The image option can be accessed if, in the configuration file, you use "image-" instead of "data-" for the first word of the section header.
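As an example of the kind of rule involved, the FFT parameters could be derived from the sampling rate roughly like this (an illustrative sketch; the exact choice in the script may differ):

# Pick an FFT length and overlap for the spectrogram from the channel's sampling
# rate, so the user doesn't have to enter them in the INI file.  Rule-of-thumb
# sketch only, not necessarily what the summary page code does.
def fft_parameters(sample_rate, seconds_per_fft=8, overlap_fraction=0.5):
    nfft = int(sample_rate * seconds_per_fft)
    overlap = int(nfft * overlap_fraction)
    return nfft, overlap

# fft_parameters(2048) -> (16384, 8192)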

I also added a link to the final summary page website on the 40 meter wiki page (my summary pages are currently located in the summary-test pages, but they will be moved over once they are more finalized).  I fleshed out the graphs on the summary pages as well, and have useful plots for the OSEM and OPLEV channels.  Instead of using the STS BLRMS channels, I have decided to use the GUR BLRMS channels that Masha made.  I ELOGged about my progress and asked for any advice or recommendations a few days ago (7012) and it would still be great if everyone could take a look at what I currently have up on the website and tell me what they think!  July 22 and 23 are the most finalized pages thus far, so they are probably the best to look at.

https://nodus.ligo.caltech.edu:30889/40m-summary-test/archive_daily/20120723/

 

This week, I also tried to fix the problems with the Weather Station, which had not been operational since 2010.  All of the channels on the weather station monitor seemed to be producing accurate data except the rain gauge, so I went on the roof of the Machine Shop to see if anything was blatantly wrong with it.  Other than a lot of dust and spiders, it was in working condition.  I plan on going up again to clean it because, in the manual, it is recommended that the rain collector be cleaned every one to two years...  I also cleared the "daily rain" option on the monitor and set all rain-related things to zero.  Rana and I then traced the cables from c1pem1 to the weather station monitor and found that they were disconnected.  In fact, the connector was broken apart and the pins were bent.  After we reconnected them, the weather station was once again operational!  In order to prevent accidental disconnection in the future, it may be wise to secure this connection with cable ties.  It went out of order again briefly on Tuesday, but I reconnected it and now it is in much sturdier shape!

 

The most recent thing that I have been doing in relation to my project has been making BLRMS channels for the MIC channels.  With Jenne's assistance, I made the channels, compiled and ran the model on c1sus, made filters, and included the channels on the PEM MEDM screen.  I still have a few modifications to make.  One issue that I have come across is that the sampling rate of the PEM system is 2 kHz, while audio frequencies range all the way up to 20 kHz.  Because of this, I am only taking BLRMS data in the 1-1000 Hz range.  This may be problematic because some of these channels may only show noise (for example, the 1-3 and 3-10 Hz bands may be completely useless).
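For reference, the offline equivalent of one BLRMS channel is just band-pass, square, low-pass, square root.  A rough Python/scipy sketch (the filter orders and the smoothing corner here are assumptions, not the filters actually loaded into the c1sus model):

import numpy as np
from scipy import signal

def blrms(x, fs, f_lo, f_hi, f_smooth=0.1):
    # Band-limited RMS of x (sampled at fs Hz) in the band f_lo to f_hi Hz.
    b, a = signal.butter(4, [f_lo / (fs / 2.0), f_hi / (fs / 2.0)], btype='band')
    xb = signal.filtfilt(b, a, x)                       # band-pass
    b2, a2 = signal.butter(2, f_smooth / (fs / 2.0))    # low-pass to smooth the square
    return np.sqrt(np.clip(signal.filtfilt(b2, a2, xb**2), 0, None))

# e.g. the 1-3 Hz band of a microphone channel sampled at 2048 Hz (~2 kHz):
# rms_1_3 = blrms(mic_data, 2048.0, 1.0, 3.0)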

 

The pictures below are of the main connections in the Weather Station.  The first is the one that Rana and I connected (it is now better connected and looks like a small beige box), located near the beam-splitter chamber, and the second is the c1pem1 rack.  For more information on the subject, there is a convenient wiki page: https://wiki-40m.ligo.caltech.edu/Weather_Station

Attachment 1: P7230026.JPG
P7230026.JPG
Attachment 2: P7230031.JPG
P7230031.JPG
  7032   Wed Jul 25 17:35:44 2012 LizUpdateComputer Scripts / ProgramsSummary Pages are in the right place!

The summary pages can now be accessed from the "Daily Summary" link under LOGS on the 40 meter Wiki page.

  7063   Wed Aug 1 10:07:16 2012 LizUpdateComputer Scripts / ProgramsWeek 7 Update

Over the past week, I have continued refining the summary pages.  They are now online in their final home and can be easily accessed from the 40 meter Wiki page (via the Daily Summary link under "LOGS").  I have one final section to add plots to (the IFO section still contains only "dummy" plots), but the rest are showing correct data!  I have many edits to make in order to make them more intelligible, but they are available for browsing if anyone feels so inclined.

I also spent quite a while formatting the pages so that the days are in PDT time instead of UTC time.  This process was quite time-consuming and required modifications in several files, but I tracked my changes with git so they are easy to pinpoint.  I also did a bit of css editing and rewriting of a few html generation functions so that the website is more appealing.  (One example of this is that the graphs on each individual summary page are now full sized instead of a third of the size.)

This week, I also worked with the BLRMS mic channels I made.  I edited the band-pass and low-pass filters that I had created last week and made coherence plots of the channels.  I encountered two major issues while doing this.  Firstly, the coherence of the channels decreases dramatically above 40 Hz.  I will look at this more today, but am wondering why this is the case.  If nothing can be done about it, it would render three of my channels ineffective.  The other issue is that the Nyquist frequency is at 1000 Hz, which is the upper limit of my highest-frequency channel (300-1000 Hz).  I am not sure whether this really affects the channel, but it looks very different from all of the other channels.  I am also wondering whether the channels below 20 Hz are useful at all, or whether they are just showing noise.
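To chase the coherence drop above 40 Hz, the coherence itself is cheap to compute offline; a sketch (the channel data and the NFFT choice are placeholders):

import matplotlib.mlab as mlab

def mic_coherence(x, y, fs, nfft=2048):
    # Magnitude-squared coherence between two microphone time series.
    cxy, freqs = mlab.cohere(x, y, NFFT=nfft, Fs=fs, noverlap=nfft // 2)
    return freqs, cxy

# freqs, cxy = mic_coherence(mic1, mic2, 2048.0)
# Coherence near 1 means the mics see the same acoustic field; a drop toward
# 1/(number of averages) above 40 Hz would mean they do not.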

The microphone calibration has been something I have been trying to figure out for quite some time, but I recently found a sensitivity value on the website of the company that makes the EM172 microphones.  From this sensitivity I determined the transfer factor to be 39.8107 mV/Pa, although I am not sure whether all of the mics will be consistent with this.
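(For what it's worth, 39.8107 mV/Pa is exactly what a sensitivity quoted as -28 dB re 1 V/Pa works out to, so that is presumably the datasheet number.  The conversion:)

def sensitivity_to_mV_per_Pa(dB_re_1V_per_Pa):
    # Convert a mic sensitivity in dB re 1 V/Pa to a transfer factor in mV/Pa.
    return 1e3 * 10 ** (dB_re_1V_per_Pa / 20.0)

print(sensitivity_to_mV_per_Pa(-28.0))   # -> 39.8107... mV/Pa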

  7104   Tue Aug 7 15:01:38 2012 JenneUpdateComputer Scripts / Programsmedmrun now allows args to pass to scripts

Previously, medmrun didn't accept arguments to pass along to the script it was going to run.  Jamie has graciously taken a moment from fixing the computer disaster to help me update the medmrun script.

Now the ASS scripts are callable from the screen.

  7108   Tue Aug 7 18:38:50 2012 LizUpdateComputer Scripts / ProgramsDaily Summary Pages are in their final form!

Please check the summary pages out at the link below and let me know if there are any modifications I should make!  All existing pages are up to date and include all of the plots I have made so far.

Questions, comments, and suggestions will be appreciated! Contact me at endavison@umail.ucsb.edu

https://nodus.ligo.caltech.edu:30889/40m-summary/

  7115   Wed Aug 8 10:38:43 2012 LizUpdateComputer Scripts / ProgramsWeek 8/Summary Pages update

Over the past week, I have been working on my progress report and finalizing the summary pages.  I have a few more things to address in the pages (such as starting at 6 AM, including spectrograms where necessary and generating plots for the days more than ~a week ago) but they are mostly finalized.  I added all of the existing acoustic and seismic channels so the PEM page is up to date.  The microphone plots include information about the transfer factor that I found on their information sheet (http://www.primomic.com/).  If there are any plots that are missing or need editing, please let me know!

I also modified the c1_summary_page.sh script to produce either the daily plots or the continuously updating plots, depending on a command-line argument.  It can be run as "./c1_summary_page.sh 2012/07/27" to generate the pages for a specific day, or as "./c1_summary_page.sh now" to generate the current day's pages.  (Essentially, I combined the two scripts I had been running separately.)  I have been commenting my code so it is more easily understandable and have been working on writing a file that explains how to run the code and the main alterations I made.  The most exciting development this week is that the script went from taking ~6 hours to run to taking less than 5 minutes.  This was done by using minute trends for all of the channels and limiting the spectrum plot data.
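The argument handling boils down to turning either "now" or a YYYY/MM/DD string into the day to process.  A minimal Python sketch of that logic (the real script is a shell wrapper, so this is illustration only):

from datetime import datetime, timedelta

def day_to_process(arg):
    # arg is either 'now' or a date formatted as YYYY/MM/DD.
    if arg == 'now':
        day = datetime.now().date()     # current local (PDT) day
    else:
        day = datetime.strptime(arg, '%Y/%m/%d').date()
    start = datetime(day.year, day.month, day.day)
    return start, start + timedelta(days=1)

# day_to_process('2012/07/27') -> (2012-07-27 00:00, 2012-07-28 00:00)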

The summary pages for each day now contain only the most essential plots that give a good overview of the state of the interferometer and its environment instead of every plot that is created for that day.

I am waiting for Duncan to send me some spectrogram updates he has made that downsample the timeseries data before plotting the spectrogram.  This will make it run much more quickly and introduce a more viable spectrogram option.
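The downsampling idea is simple enough to sketch (the decimation factor and the function name are mine, not Duncan's code):

from scipy import signal
import matplotlib.pyplot as plt

def fast_specgram(x, fs, factor=8, nfft=1024):
    # Decimate (anti-alias filter + downsample) before making the spectrogram.
    # Only sensible if you only care about frequencies below fs/(2*factor).
    xd = signal.decimate(x, factor)
    return plt.specgram(xd, NFFT=nfft, Fs=fs / float(factor), noverlap=nfft // 2)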

 

Today's Summary Pages can be accessed by the link on the wiki page or at:

https://nodus.ligo.caltech.edu:30889/40m-summary/archive_daily/20120808/

  7120   Wed Aug 8 13:37:46 2012 KojiUpdateComputer Scripts / ProgramsWeek 8/Summary Pages update

Hey, the pages got significantly nicer than before. I will continue to give you comments if I find anything.

So far: there are many 10^-100 values in the logarithmic plots. Once they are removed, we should be able to see the seismic excitation during the recent earthquakes?
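(Those 10^-100 points are presumably data dropouts; one way to keep them from ruining the autoscale on a log axis is to mask them before plotting, e.g.:)

import numpy as np

def mask_dropouts(y, floor=1e-30):
    # Replace absurdly small values (dropouts) with NaN so that matplotlib
    # ignores them when autoscaling a logarithmic axis.
    y = np.asarray(y, dtype=float)
    return np.where(y > floor, y, np.nan)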

Incidentally, where is the script located? "./" isn't an absolute path.

Quote:

Over the past week, I have been working on my progress report and finalizing the summary pages.  I have a few more things to address in the pages (such as starting at 6 AM, including spectrograms where necessary and generating plots for the days more than ~a week ago) but they are mostly finalized.  I added all of the existing acoustic and seismic channels so the PEM page is up to date.  The microphone plots include information about the transfer factor that I found on their information sheet (http://www.primomic.com/).  If there are any plots that are missing or need editing, please let me know!

I also modified the c1_summary_page.sh script to produce either the daily plots or the continuously updating plots, depending on a command-line argument.  It can be run as "./c1_summary_page.sh 2012/07/27" to generate the pages for a specific day, or as "./c1_summary_page.sh now" to generate the current day's pages.  (Essentially, I combined the two scripts I had been running separately.)  I have been commenting my code so it is more easily understandable and have been working on writing a file that explains how to run the code and the main alterations I made.  The most exciting development this week is that the script went from taking ~6 hours to run to taking less than 5 minutes.  This was done by using minute trends for all of the channels and limiting the spectrum plot data.

The summary pages for each day now contain only the most essential plots that give a good overview of the state of the interferometer and its environment instead of every plot that is created for that day.

I am waiting for Duncan to send me some spectrogram updates he has made that downsample the timeseries data before plotting the spectrogram.  This will make it run much more quickly and introduce a more viable spectrogram option.

 

Today's Summary Pages can be accessed by the link on the wiki page or at:

https://nodus.ligo.caltech.edu:30889/40m-summary/archive_daily/20120808/

 

  7181   Tue Aug 14 16:33:51 2012 SashaUpdateComputer Scripts / ProgramsSimPlant indicator added

I added an indicator to the watchdog screen so that a little "SP" icon appears whenever the SimPlant is on. Since we only have one SimPlant (ETMX), only ETMX has the SimPlant indicator. However, since asymmetry is ugly, I moved all of the OL icons over so that they're in a line and so that there is room for future SP icons.

I also fixed the link to the Watchdogs on the main SUS screens (it was dead, but now it is ALIVE).

  7192   Wed Aug 15 13:23:34 2012 LizSummaryComputer Scripts / ProgramsLast Weekly Update

Over the past week I have been continuing to finalize the daily summary pages, attempting to keep the total run time under half an hour so that they can be run frequently.  I have had many hang-ups with the spectrograms and am currently using second trends (with this method, the entire script takes 15 minutes to run).  I also have a backup method that takes 3 minutes of data out of every 12 minutes, but I could not get the interpolation to work correctly.  This might be a future focus, or the summary pages could be configured to run in parallel so that the full data can be used for the spectrograms.  I configured Steve's tab to include one page of images and one page of plots, and fixed the scripts so that they correct for daylight saving time (at the start of each run, the program prints 'DST' or 'Not DST').
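The DST check itself can be as simple as asking the C library; a sketch (it assumes the machine's local timezone is Pacific, which is why the offset is 7 or 8 hours):

import time

def utc_offset_hours():
    # Report the DST status and return the local UTC offset in hours
    # (7 for PDT, 8 for PST).
    is_dst = time.localtime().tm_isdst > 0
    print('DST' if is_dst else 'Not DST')
    return 7 if is_dst else 8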

Right now, I am focusing on making coherence plots in a spectrogram style (similar to the matlab 'coh_carpet' function) and a spectrogram depicting Gaussianity (similar to the plots made by the RayleighMonitor).  I have also been working on my  final paper and presentation.
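The coherence "carpet" is just an ordinary coherence computed over successive time chunks and stacked into an image; a rough sketch (the chunk length, NFFT, and plotting details are assumptions, not the coh_carpet implementation):

import numpy as np
import matplotlib.mlab as mlab
import matplotlib.pyplot as plt

def coherence_carpet(x, y, fs, chunk=60.0, nfft=1024):
    # Coherence of x and y versus time (columns) and frequency (rows).
    n = int(chunk * fs)
    columns, times = [], []
    for i in range(0, min(len(x), len(y)) - n + 1, n):
        cxy, freqs = mlab.cohere(x[i:i+n], y[i:i+n], NFFT=nfft, Fs=fs,
                                 noverlap=nfft // 2)
        columns.append(cxy)
        times.append(i / float(fs))
    carpet = np.array(columns).T
    plt.imshow(carpet, aspect='auto', origin='lower', vmin=0, vmax=1,
               extent=[times[0], times[-1] + chunk, freqs[0], freqs[-1]])
    plt.xlabel('Time [s]'); plt.ylabel('Frequency [Hz]'); plt.colorbar()
    return times, freqs, carpet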

  7203   Thu Aug 16 13:04:36 2012 LizSummaryComputer Scripts / ProgramsDaily Summary Details

I just wrote a short description of how to run the daily summary pages and the configuration process for making changes to the site.  It can be found in /users/public_html/40m-summary and is named README.txt.  If I need to clarify anything, please let me know!  The configuration process should be relatively straightforward, so it will be easy to add plots or change them when there are changes at the 40 meter.

  7214   Fri Aug 17 05:29:04 2012 YoichiConfigurationComputer Scripts / ProgramsC1configure scripts

 I noticed that the IFO restore scripts have some problems. They use burt request files to store and restore the settings. However, the request files contain old channel names.

In particular, channels with _TRIG_THRES_ON/OFF are now _TRIG_THRESH_ON/OFF (note the extra "H").

These scripts reside in /opt/rtcds/caltech/c1/burt/c1ifoconfigure/.

I fixed the PRMI_SBres and MI scripts. Someone should fix all other files.
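Since the rename is purely mechanical, the remaining request files could be fixed with a few lines of script instead of by hand; a sketch (it assumes the request files end in .req and it writes a .bak copy first, so check it before trusting it):

import glob, shutil

old, new = '_TRIG_THRES_', '_TRIG_THRESH_'
for req in glob.glob('/opt/rtcds/caltech/c1/burt/c1ifoconfigure/*.req'):
    with open(req) as f:
        text = f.read()
    if old in text:
        shutil.copy(req, req + '.bak')       # keep a backup of the original
        with open(req, 'w') as f:
            f.write(text.replace(old, new))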

  7238   Tue Aug 21 00:02:05 2012 ranaSummaryComputer Scripts / ProgramsGDS/DTT bug: 10 digit GPS times not accepted

I've noticed that we're experiencing this bug, which was previously seen at LHO. We cannot enter 10-digit GPS times into the time fields for DTT due to a limit in TLGEntry.cc, which Jim Batch fixed in September of last year. It seems we're running an old version of the GDS tools.

I checked the Lidax tool (which you can get from the GDS Mainmenu). It does, in fact, allow 10-digit entries.

  7611   Wed Oct 24 18:42:39 2012 ManasaUpdateComputer Scripts / ProgramsPhase map summary of LaserOptik mirrors

Quote:

 

 Raji took the optics over. They were all measured at 0 deg incidence angle, although we will use them at the angles required for the recycling folding mirrors.  Here's the summary from GariLynn:

In general all six pieces have a radius of curvature of around -700 meters.

They all fall off rapidly past 40 mm diameter.  Within the 40 mm diameter the rms is ~10 nm for most.  I can get finer analysis if you have something specific that you want to know. 
 
All data are saved in Wyko format at the following location:
Gari

After a long search, I've finally found a way to read and analyze(?) the Wyko opd-format data using Image SXM, an image-analysis program that runs only on Mac OS X.

I am attaching the images (in tiff) and profile plot of all the 6 mirrors.

Attachment 1: sn1Laseroptik_profile
Attachment 2: sn2Laseroptik_profile
Attachment 3: sn3Laseroptik_profile
Attachment 4: sn4Laseroptik_profile
Attachment 5: sn5Laseroptik_profile
Attachment 6: sn6Laseroptik_profile
Attachment 7: sn1.png
sn1.png
Attachment 8: sn2.png
sn2.png
Attachment 9: sn3.png
sn3.png
Attachment 10: sn4.png
sn4.png
Attachment 11: sn5.png
sn5.png
Attachment 12: sn6.png
sn6.png
  7615   Wed Oct 24 22:48:46 2012 janoschUpdateComputer Scripts / ProgramsPhase map summary of LaserOptik mirrors

Quote:

After a long search, I've found a way to finally read and analyze(?)  the Wyko opd format data using Image SXM, an image analysis software working only on mac osx.

I am attaching the images (in tiff) and profile plot of all the 6 mirrors.

Great. However, unless you can save the images in FITS format, we still need another reader for the opd images.

  7616   Thu Oct 25 02:01:15 2012 KojiUpdateComputer Scripts / ProgramsPhase map summary of LaserOptik mirrors

Previous phasemap data and analysis for the new 40m COC are summarized on the following page

https://nodus.ligo.caltech.edu:30889/40m_phasemap/

(Use traditional LVC authentication (not albert.einstein))

The actual instance of the files can also be found on nodus below the following directory:

/cvs/cds/caltech/users/public_html/40m_phasemap

The programs for the analysis are found in

/cvs/cds/caltech/users/public_html/40m_phasemap/40m_PRM/mat

The main program is RunThis.m

Basically, this program takes ASCII files converted from the opd files by Vision32 (i.e., you need to go to Downs).
The MATLAB program then takes care of the plots and curvature analyses.
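For anyone without MATLAB, the curvature part is easy to reproduce: fit a paraboloid z = (x^2 + y^2)/(2R) + tilt + offset to the surface map and read off R.  A rough Python sketch, assuming the ASCII export is a plain 2-D array of heights in meters with a known pixel size (both are assumptions; check against the Wyko header):

import numpy as np

def radius_of_curvature(height_map, pixel_size):
    # Least-squares paraboloid fit to a surface map.
    # height_map : 2-D array of surface heights [m], NaN for invalid pixels
    # pixel_size : size of one pixel [m]
    z = np.asarray(height_map, dtype=float)
    ny, nx = z.shape
    y, x = np.mgrid[0:ny, 0:nx].astype(float)
    x = (x - nx / 2.0) * pixel_size
    y = (y - ny / 2.0) * pixel_size
    good = np.isfinite(z)
    # Model: z = a*(x^2 + y^2) + b*x + c*y + d, with a = 1/(2R)
    A = np.column_stack([(x**2 + y**2)[good], x[good], y[good], np.ones(good.sum())])
    coeffs = np.linalg.lstsq(A, z[good], rcond=None)[0]
    return 1.0 / (2.0 * coeffs[0])   # sign follows the height map's convention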

  7758   Wed Nov 28 21:42:21 2012 ranaFrogsComputer Scripts / Programsdataviewer font error

An error this evening on rossa: dataviewer not working due to some font errors:

controls@rossa:~ 0$ dataviewer
Connecting.... done
Warning: Not all children have same parent in XtManageChildren
Warning: Not all children have same parent in XtManageChildren
Warning: Not all children have same parent in XtManageChildren
Warning: Not all children have same parent in XtManageChildren
Warning: Not all children have same parent in XtManageChildren
Warning:
    Name: FilterText
    Class: XmTextField
    Character '\52' not supported in font.  Discarded.

Warning:
    Name: FilterText
    Class: XmTextField
    Character '\56' not supported in font.  Discarded.

Warning:
    Name: FilterText
    Class: XmTextField
    Character '\170' not supported in font.  Discarded.

Warning:

etc.............

  7768   Fri Nov 30 14:21:18 2012 ranaHowToComputer Scripts / ProgramsThe mystery of PDFs and you. As deep as the mystery of Rasputin.

This is how to post PDF:

From DTT, print the plot as a PostScript file.

Then use ps2pdf to make an archival PDF version (the flag is the key!). Example:

ps2pdf -dPDFX /home/controls/Desktop/darm.ps

Attachment 1: darm.pdf
darm.pdf