I finally got the ezcaPut command working, so the camera code can now talk directly to the EPICS channels. However, after repeated calls, ezcaPut begins to report timeouts even though it continues to write values to the channel successfully (EPICS is receiving the new value for the channel, but failing to reply back to the program in time, I think). It has also seg-faulted once, so the stability cannot yet be trusted for long-term running. For now, however, it works well enough to test a servo in the short term. The current approach simply runs ezcaservo in a terminal with the pitch and yaw offset channels of ETMX and the channels that the camera code outputs to. This hasn't actually been tested yet, since we haven't had enough time with the x-arm locked.
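For reference, a minimal sketch of the kind of guarded ezcaPut call involved, assuming the standard ezca extension API (ezcaSetTimeout, ezcaSetRetryCount, ezcaPut, ezcaPerror); the channel name and timeout/retry values are placeholders, not what the camera code actually uses:

/* Guarded single-element put to an EPICS channel via ezca.
 * Build against the ezca extension (link with -lezca and the CA libs). */
#include <ezca.h>

int main(void)
{
    double centroid_x = 0.0;                  /* value computed by the camera code */
    char *chan = "PLACEHOLDER:CAMERA_OUTPUT"; /* placeholder channel name */

    ezcaSetTimeout(0.5f);                     /* seconds per CA attempt (placeholder) */
    ezcaSetRetryCount(10);                    /* retries before giving up (placeholder) */

    if (ezcaPut(chan, ezcaDouble, 1, &centroid_x) != EZCA_OK) {
        /* The put can still reach the IOC even when the reply times out,
         * so report the error rather than aborting the camera loop. */
        ezcaPerror("ezcaPut");
    }
    return 0;
}

Loosening the timeout and retry count like this is one plausible way to tame the spurious timeout reports, though it doesn't address the occasional seg-fault.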
Tested various fixed focal-length lenses on the camera, since the one we were previously using was too heavy for its mount and likely more expensive than necessary. The 16mm lens gets a good picture of the beam and the optic together, though the beam spot is a little too small in the image to reliably fit a Gaussian to. The 24mm lens zooms in too far to see the whole optic, but the beam profile itself is much clearer; the 24mm lens is currently on the camera.
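For reference, the beam-spot fit mentioned above is just the standard two-dimensional Gaussian intensity profile; the symbols (amplitude A, centroid (x_0, y_0), spot radius w, background B) are generic fit parameters and assume a circularly symmetric spot, not names from the camera code:

I(x, y) = A \exp\!\left( -\frac{2\left[ (x - x_0)^2 + (y - y_0)^2 \right]}{w^2} \right) + B

With the 16mm lens the spot spans too few pixels for this fit to be reliable, which is the trade-off against the 24mm lens losing the full view of the optic.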
Scanned the PZT voltage of the PMC across its full offset range to obtain a plot of voltage vs. transmitted intensity. I used DTT's triggered time series response system to measure the outputs of the slow PZT voltage and transmission intensity channels, and used the triangle wave script to drive the PZT ramp channel slowly over its full range (I couldn't get DTT to output to the channel). Clear resonances did appear (PMCScanWide.tif), but the number of data points per peak was far too small to reliably fit a Lorentzian to (PMCScanSinglePeak.tif). When I decreased the scanning range and increased the scan time in order to collect a large number of points on a few peaks, the resulting data was too messy to fit a Lorentzian to (PMCSlowSinglePeak.tif).
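Since DTT couldn't drive the ramp channel directly, the sweep had to come from outside. Below is a rough sketch of how one slow triangle sweep could be driven over EPICS with ezcaPut (the actual scan used the existing triangle wave script); the channel name, voltage limits, step size, and dwell time are all placeholders, not the values actually used:

/* Slow triangle sweep of a PZT ramp channel via ezca; DTT records the
 * PZT voltage and transmission channels while this runs. */
#include <unistd.h>
#include <ezca.h>

int main(void)
{
    char  *ramp_chan = "PLACEHOLDER:PMC_RAMP";   /* placeholder channel name */
    double lo = 0.0, hi = 100.0;                 /* placeholder scan limits [V] */
    double step = 0.05;                          /* smaller step -> more points per peak */
    double v;

    ezcaSetTimeout(0.5f);
    ezcaSetRetryCount(10);

    for (v = lo; v <= hi; v += step) {           /* sweep up */
        if (ezcaPut(ramp_chan, ezcaDouble, 1, &v) != EZCA_OK)
            ezcaPerror("ezcaPut");
        usleep(100000);                          /* 0.1 s dwell per step */
    }
    for (v = hi; v >= lo; v -= step) {           /* sweep back down */
        if (ezcaPut(ramp_chan, ezcaDouble, 1, &v) != EZCA_OK)
            ezcaPerror("ezcaPut");
        usleep(100000);
    }
    return 0;
}

The trade-off in the scan is visible in the loop parameters: a small step and long dwell give more samples per resonance, but stretching the sweep out that far is presumably what let drift and noise make the slow single-peak data too messy to fit.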