This afternoon, I was toying around with reducing either the CARM or the DARM offset (i.e., put in a CARM offset with DARM left at zero, lock the PRMI, then reduce the CARM offset to zero; or put in a DARM offset with CARM left at zero, lock the PRMI, then reduce the DARM offset to zero).
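
To make the ramp step concrete, here is a minimal sketch of reducing one offset to zero, assuming pyepics and a made-up channel name (C1:LSC-CARM_OFFSET); the step count and dwell time are arbitrary, so treat this only as an illustration of the idea, not the script that was actually run:

import time
import numpy as np
from epics import caget, caput

OFFSET_CHANNEL = 'C1:LSC-CARM_OFFSET'   # hypothetical channel name
N_STEPS = 20                            # arbitrary number of ramp steps
DWELL = 2.0                             # seconds to sit at each step

start = caget(OFFSET_CHANNEL)           # current (nonzero) offset
for value in np.linspace(start, 0.0, N_STEPS + 1)[1:]:
    caput(OFFSET_CHANNEL, value)        # walk the offset toward zero
    time.sleep(DWELL)                   # let the loops settle between steps
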
Looking at the data, I see that the MICH error signal gets fuzzier as the arms get close to resonance. (Note that because I forgot to zero the CARM offset before finding the resonances, -3 is the zero point for this plot and the next.)
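
One way to quantify the "fuzziness" would be a rolling RMS of the MICH error signal versus time. A sketch, assuming numpy, with the data fetch (nds2 or frame files) and the sample rate left as placeholders:

import numpy as np

def rolling_rms(x, fs, window_sec=1.0):
    """RMS of x in consecutive non-overlapping windows of length window_sec."""
    n = int(fs * window_sec)
    n_windows = len(x) // n
    chunks = np.asarray(x)[:n_windows * n].reshape(n_windows, n)
    return np.sqrt(np.mean(chunks**2, axis=1))

# mich_err = ...  # MICH error signal time series (data fetch not shown)
# fs = 2048       # assumed sample rate
# rms = rolling_rms(mich_err, fs)
# Plotting rms against the CARM offset (remembering the -3 zero point)
# should show the error-signal RMS growing as the arms approach resonance.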

Here is a zoomed-in view of the last part of this time series, now with both TRX and TRY plotted (along with POPDC, CARM_ERR, and DARM_ERR). You can see a momentary power buildup of more than 100 transmission counts, which is about 20% of our final expected power.
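
For reference, the arithmetic implied above (100 counts being about 20% of the final expected power) puts the full buildup at roughly 500 transmission counts. A quick sketch of checking the observed peak against that number, with the TRX/TRY arrays and how they are fetched left as assumptions:

import numpy as np

EXPECTED_FULL_COUNTS = 100 / 0.20   # ~500 counts, from the 20% figure above

# trx, tr_y = ...  # TRX and TRY transmission time series (counts)
# peak = max(np.max(trx), np.max(tr_y))
# print(f"peak transmission: {peak:.0f} counts, "
#       f"{100 * peak / EXPECTED_FULL_COUNTS:.0f}% of expected full buildup")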

Here is a different time series, showing a reduction of the DARM offset; as the offset approaches zero, the MICH error signal gets noticeably fuzzier. Somewhere near the 240-second mark, I lose PRMI lock.
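
To pin down the lock-loss time more precisely than "somewhere near 240 seconds", one could threshold POPDC, which drops when the PRMI unlocks. A sketch, with the threshold choice, sample rate, and data fetch all assumptions:

import numpy as np

def find_lock_loss(popdc, fs, threshold):
    """Time (s) of the first sample where POPDC falls below threshold, else None."""
    below = np.flatnonzero(np.asarray(popdc) < threshold)
    return below[0] / fs if below.size else None

# popdc = ...  # POPDC time series, t=0 aligned with the plot above
# t_loss = find_lock_loss(popdc, fs=2048, threshold=0.5 * np.median(popdc))
# print(f"PRMI lock lost around t = {t_loss:.1f} s")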
