Compare to the previous bench test (MCM without detector).

I connected another MCM to the test setup in 218 and collected 1500 data packets. This one had a kapton cable and a bad detector (inner bottom, 8.5 cm cable, broken silicon, no bias voltage).

The data were collected in 250-event chunks and appended to the same (ASCII) file. The events were collected at a rate of about 0.7 Hz, over a period of about 2.5 hours.
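In case it helps with the bookkeeping, here is a minimal sketch (Python/NumPy) for splitting the appended file back into its 250-event chunks. The file name and the ASCII layout (one event per line, whitespace-separated ADC words for the 8x32 channels) are assumptions, not a description of the actual format.

    import numpy as np

    EVENTS_PER_CHUNK = 250        # chunk size used during data taking
    N_CHANNELS = 8 * 32           # 8 groups of 32 channels

    # Hypothetical file name and layout: one event per line.
    data = np.loadtxt("mcm_with_detector.dat")

    # Reshape into (chunk, event-within-chunk, channel) for per-chunk studies.
    n_chunks = data.shape[0] // EVENTS_PER_CHUNK
    chunks = data[:n_chunks * EVENTS_PER_CHUNK].reshape(
        n_chunks, EVENTS_PER_CHUNK, N_CHANNELS)
    print(n_chunks, "chunks of", EVENTS_PER_CHUNK, "events")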

[Figure: ps]


[Figure: ps, page 2]

The 8 panels above show ADC vs AMU channel, for each of the 8 groups of 32 channels. We expected a pronounced 4-cell periodicity, as in the real data, since our guess was that that pattern is caused by feedback from the MCM picked up by the kapton cable. That is not what I see here.
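To put a number on this, one could fold the per-cell mean ADC modulo 4 and compare the spread of the four phase averages with the overall cell-to-cell RMS. Below is a hedged sketch (Python/NumPy); the array mean_adc is a random stand-in for one group's per-AMU-cell means, and the cell count (64) is an assumption.

    import numpy as np

    # Stand-in for the per-AMU-cell mean ADC of one group of 32 channels.
    rng = np.random.default_rng(0)
    mean_adc = rng.normal(200.0, 2.0, size=64)

    # Fold the cells modulo 4: a real 4-cell pattern shows up as a spread of
    # the four phase averages that is large compared to the cell-to-cell RMS.
    phases = np.array([mean_adc[p::4].mean() for p in range(4)])
    print("phase means:", phases)
    print("phase spread / overall RMS:", np.ptp(phases) / mean_adc.std())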

Notes: The real-data measurements linked above were taken on packet 2003, which is an inner middle (IM) detector. However, the noise levels for IM are the same as for IB; see this noise vs cable length plot. Also, in the real MVD there are 60 MCMs radiating in synchrony, whereas here we have only one. The inner enclosure may add to the noise as well.

What is striking is the drift of the ADC on the scale of a few hours. Note that the bare MCM settled in ~10-15 minutes and was then steady over 5 hours. Toshi looked up the leakage current for this detector (before it broke); it showed a ~15% variation with a 24-hour periodicity. Can the ADC drift be due to this slow drift in the leakage current?
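One way to test this would be to overlay the chunk-averaged ADC vs time with the logged leakage current vs time and look at the correlation. Below is a hedged sketch (Python/NumPy) with made-up stand-in time series; the real check would use the measured drifts over the overlapping time window.

    import numpy as np

    rng = np.random.default_rng(1)
    t_hours = np.linspace(0.0, 2.5, 1500)        # event times over the run
    # Stand-ins: a slow modulation plus noise on the mean ADC, and a
    # leakage current with the ~15% / 24 h variation mentioned above.
    adc_mean = (210 + 3 * np.sin(2 * np.pi * t_hours / 24)
                + rng.normal(0, 0.5, t_hours.size))
    i_leak = 1.0 + 0.15 * np.sin(2 * np.pi * t_hours / 24)

    # Pearson correlation; a value near +/-1 over the overlap would support
    # the leakage-current hypothesis, near 0 would argue against it.
    r = np.corrcoef(adc_mean, i_leak)[0, 1]
    print("correlation(ADC drift, leakage current) =", round(r, 3))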

Below are 2 examples of leakage current vs time, with ambient temperature also recorded:


Last update 7 Nov 2001 - HvH