Lately I've been trying to sort out a discrepancy I noticed between the values read on the spectrum analyzer's display and those we get through the GPIB interface.
It turns out that the discrepancy originates from the two different data vectors that the display and the GPIB interface acquire. Whereas the display shows data in "RAW" format, the GPIB interface, because of the way the netgpibdata script is written, acquires the so-called "error-corrected data". That is, the data downloaded over GPIB is post-processed and corrected for some internal calibration factors of the instrument.
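To see the two vectors side by side, one can query the instrument directly. Below is a minimal sketch using pyvisa (the actual netgpibdata script uses its own transport); the GPIB address is a hypothetical example, and the FORM4/OUTPDATA?/OUTPRAW1? commands, as well as the exact layout of the arrays they return (e.g. real/imaginary pairs per point), should be checked against the 4395A programming manual before trusting the comparison.

import pyvisa

rm = pyvisa.ResourceManager()
sa = rm.open_resource("GPIB0::17::INSTR")  # hypothetical GPIB address

sa.write("FORM4")  # ASCII data transfer format

# Error-corrected data array -- apparently what netgpibdata downloads
corrected = [float(x) for x in sa.query("OUTPDATA?").split(",")]

# Raw (uncorrected) data array -- closer to what the display shows
raw = [float(x) for x in sa.query("OUTPRAW1?").split(",")]

# Compare point by point; any systematic offset here would account
# for the display/GPIB discrepancy
for c, r in zip(corrected, raw):
    print(c, r, c - r)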
I also noticed that someone (it wasn't me) edited the wiki page about netgpibdata under my name, saying:
* A4395 Spectrum Units
"Independently of which units the A4395 spectrum analyzer displays on the screen, the data is saved in Watts/rtHz"
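If that statement is accurate, a quick sanity check is to convert the saved trace back into the display units and compare against the screen readout. Here is a minimal sketch, assuming a two-column saved file (frequency in Hz, power-like values in W as the wiki claims) and a display set to dBm; the file name and column layout are hypothetical:

import numpy as np

# Hypothetical netgpibdata output file: frequency [Hz], value [W/rtHz] (assumed)
freq, pw = np.loadtxt("spectrum.dat", unpack=True)

# dBm is referenced to 1 mW: 10*log10(P / 1e-3)
dbm = 10.0 * np.log10(pw / 1e-3)

for f, d in zip(freq[:5], dbm[:5]):
    print(f"{f:.1f} Hz  {d:.2f} dBm")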