I'm still not satisfied with (or done with) the solution to this, but it has gone too long without an update, and someone else may well have a direction to take it that keeps me from spinning my wheels on solved or basic questions.
The full story will have to wait for the elog, but I've put it in the Jupyter notebook. Basically:
It's clear to me that there is a way to optimize the OMC, but the normalization of my DARM-referred noise is clearly wrong: the input-referred noise comes out to at least 4e-11 m/rt(Hz), which seems too large to believe.
Indeed, I was computing the noise the wrong way, a pretty basic mistake. I'm glad I found it, at least. I'll post some plots and update the git repository tomorrow.
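For reference, here is a minimal sketch (not the actual analysis, which is in the notebook) of the normalization I'm using now. The sample rate, signal, and calibration function below are all placeholders I've made up for illustration; the point is just that a density-scaled Welch PSD gives units^2/Hz, so its square root is an ASD in units/rt(Hz), which can then be referred to DARM through a counts-to-meters calibration.

```python
import numpy as np
from scipy.signal import welch

fs = 16384.0                              # sample rate [Hz] (assumed)
omc_dcpd = np.random.randn(int(60 * fs))  # stand-in for the real DCPD time series

# One-sided, density-scaled Welch PSD; scaling="density" is what makes the
# normalization come out in counts^2/Hz rather than counts^2 per bin.
f, pxx = welch(omc_dcpd, fs=fs, window="hann",
               nperseg=int(8 * fs), scaling="density")
asd_counts = np.sqrt(pxx)                 # ASD in counts/rt(Hz)

def darm_cal(freq):
    """Hypothetical counts -> meters calibration magnitude (flat stand-in)."""
    return 1e-12 * np.ones_like(freq)

asd_darm = asd_counts * darm_cal(f)       # DARM-referred ASD in m/rt(Hz)
```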