Can you fix charge-transfer inefficiency without a theory-driven model?

The Gaia mission needs to centroid stars with accuracies at the 10⁻³-pixel level. At the same time, the detectors will suffer charge-transfer inefficiency (CTI) degradation as the instrument is battered by cosmic radiation; this causes significant magnitude-dependent centroid shifts. The team has shown that with reasonable models of charge-transfer inefficiency, they can reach their scientific goals. One question I am interested in—a boring but very important question—is whether it is possible to diagnose and fix the CTI issues without a good model up front. (I anticipate that the model won't be accurate, although the team is analyzing lab CCDs subjected to sensible, realistic damage.) The shape and magnitude of the effects on the point-spread function and positional offsets will be functions of stellar magnitude (brightness) and position on the chip. They might also depend on which stars have crossed the chip ahead of the current star. The idea is to build a non-trivial fake data stream and then analyze it without knowing what was put in: can you recover and model all the effects at sufficient precision after learning the time-evolving, non-trivial model on the science data themselves? The answer—which I expect to be yes—has implications for Gaia and every precision experiment to follow.
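To make the mechanism concrete, here is a toy sketch of how CTI produces magnitude-dependent centroid shifts. It is not the Gaia team's model: it clocks a one-dimensional charge packet through a serial register in which each pixel has a single made-up trap species (all parameter values are invented). Because each pixel's traps saturate at a fixed number of electrons, bright stars lose a smaller *fraction* of their charge than faint stars, so the trailing-induced centroid shift depends on flux.

```python
import numpy as np

def gaussian_psf(x, center, flux, sigma=1.5):
    """Pixel-sampled 1-D Gaussian star image (toy PSF, electrons per pixel)."""
    return flux * np.exp(-0.5 * ((x - center) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def readout_with_traps(image, n_traps=50.0, capture_frac=0.01, release_frac=0.2):
    """Clock charge out of a 1-D register one pixel per cycle.  Each pixel
    holds a fixed trap capacity (n_traps electrons); traps capture a fraction
    of the passing packet, up to capacity, and re-release it exponentially,
    depositing a trail into the packets that follow.  Toy parameters only."""
    image = image.astype(float).copy()
    n = len(image)
    trapped = np.zeros(n)  # charge currently held in traps, per (stationary) pixel
    out = np.zeros(n)
    for t in range(n):
        out[t] = image[0]              # read the packet at the register end
        image = np.roll(image, -1)     # clock all packets one pixel toward readout
        image[-1] = 0.0
        capture = np.minimum(capture_frac * image, n_traps - trapped)
        release = release_frac * trapped
        image += release - capture     # traps exchange charge with current packets
        trapped += capture - release
    return out

def centroid(x, img):
    return np.sum(x * img) / np.sum(img)

x = np.arange(80)
for flux in (1e5, 1e3):  # bright star, then faint star
    star = gaussian_psf(x, 40.0, flux)
    shift = centroid(x, readout_with_traps(star)) - centroid(x, star)
    print(f"flux {flux:8.0f} e-: centroid shift {shift:+.3f} px")
```

Because the trap capacity is fixed while the packet size scales with flux, the faint star's centroid is dragged noticeably further in the readout direction than the bright star's—exactly the kind of magnitude dependence a data-driven fit would have to capture.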

To work on such questions I built a one-dimensional (yes, in it the sky is a circle, not a 2-sphere) Gaia simulator. It currently doesn't do what is needed, so fork it and start coding! Or build your own. Or get serious and make a full mission simulator. But my point is not "Will Gaia work?"; it is "Can we make Gaia analysis less dependent on mechanistic CCD models?" In the process we might make it more precise overall. Enhanced goal: analyze all of Gaia's mission choices with the model.
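The "learn the model on the science data themselves" step can be sketched in miniature. The snippet below is a hypothetical illustration, not anything from my simulator: it generates a fake data stream of stars whose observed centroids are perturbed by a hidden shift-versus-flux law (invented for this example), then fits a purely flexible model—a low-order polynomial in log flux, with no CCD physics in it—and checks that the residuals are down at the noise level.

```python
import numpy as np

rng = np.random.default_rng(42)

def true_shift(flux):
    """Hidden 'truth': CTI centroid shift in pixels vs stellar flux.
    A made-up law, unknown to the fitting code below."""
    return 0.05 / np.sqrt(flux / 1e3)  # fainter stars shift more

# Fake data stream: 2000 stars, fluxes spanning three decades, observed
# centroid shifts = hidden CTI law + photon-limited centroiding noise.
flux = 10 ** rng.uniform(3, 6, size=2000)
noise = rng.normal(0.0, 0.02 / np.sqrt(flux / 1e3))
observed_shift = true_shift(flux) + noise

# Data-driven calibration: flexible polynomial in log-flux, no physics input.
logf = np.log10(flux)
coeffs = np.polyfit(logf, observed_shift, deg=3)
model = np.polyval(coeffs, logf)

resid = observed_shift - model
print(f"rms residual after self-calibration: {resid.std():.4f} px")
```

In this toy the flexible model recovers the hidden law without ever being told its functional form; the real problem is far harder because the shifts also depend on chip position, time, and the preceding stars, but the logic is the same.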

1 comment:

  1. A related problem, for which there are reams of data, is CTE in the Hubble ACS camera. In the Panchromatic Hubble Andromeda Treasury (PHAT) survey, CTE is a pretty big issue: it varies with background rate, and since we cover a huge range of unresolved background light levels (from the core of Andromeda to its outskirts), getting it wrong leads to false gradients.

    STScI has a "cosmetic" CTE repair algorithm, but I don't think it is justified, or even that it uses a halfway-realistic model of the effect.

    I understand the idea here is to correct for this effect *without* a model, but since in the HST world there isn't even a model-based solution (that I know of), that would be a good start.