## Section: Scientific Foundations

### Identification

See module 6.1.

The behavior of the monitored continuous system is assumed to be described by
a parametric model $\{\mathbf{P}_{\theta},\ \theta \in \Theta\}$,
where the distribution of the observations $(Z_0, \dots, Z_N)$
is characterized by the parameter vector $\theta$.
An *estimating function* $\mathcal{K}_N(\theta)$, for example of the form:

$$\mathcal{K}_N(\theta) = \frac{1}{N} \sum_{k=0}^{N} K(\theta, Z_k)$$

is such that $\mathbf{E}_{\theta}\left[\mathcal{K}_N(\theta)\right] = 0$ for all $\theta$.
In many situations, $K$ is the gradient of a
function to be minimized: squared prediction error,
log-likelihood (up to a sign), and so on.
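As a minimal numerical sketch (the scalar Gaussian location model and all names below are chosen for illustration, not taken from the source), the gradient of the squared prediction error $(\theta - Z_k)^2/2$ gives an estimating function whose expectation vanishes at the true parameter:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_star = 2.0                          # true parameter (example value)
Z = rng.normal(theta_star, 1.0, 10_000)   # observations Z_0, ..., Z_N

# Gradient of the squared prediction error (theta - Z_k)^2 / 2:
def K(theta, Z):
    return theta - Z

# For Z ~ N(theta, sigma^2), E_theta[K(theta, Z)] = 0, so the empirical
# mean of K evaluated at the true parameter is close to zero.
print(np.mean(K(theta_star, Z)))
```

The same check can be run with any other model for which the gradient of the criterion is available in closed form.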
For performing model identification on the basis of observations
$(Z_0, \dots, Z_N)$,
an estimate of the unknown parameter is then [43]:

$$\widehat{\theta}_N = \arg\left\{\, \mathcal{K}_N(\theta) = 0 \,\right\}$$

Assuming that $\theta^{*}$ is the true parameter value,
and that $\mathbf{E}_{\theta^{*}}\left[K(\theta, Z_k)\right] = 0$
if and only if $\theta = \theta^{*}$ with $\theta^{*}$ fixed
(identifiability condition),
then $\widehat{\theta}_N$ converges towards $\theta^{*}$.
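This consistency can be illustrated in the same scalar location model used above (an assumed example, not the source's system): solving the estimating equation there yields the sample mean, which approaches the true value as the number of observations grows.

```python
import numpy as np

rng = np.random.default_rng(1)
theta_star = 2.0  # true parameter (example value)

# In the scalar location model, solving (1/N) sum_k (theta - Z_k) = 0
# gives the sample mean as the root of the estimating equation.
for N in (100, 10_000, 1_000_000):
    Z = rng.normal(theta_star, 1.0, N)
    theta_hat = Z.mean()
    print(N, theta_hat)   # estimate tightens around theta_star as N grows
```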
Thanks to the central limit theorem, the vector
$\sqrt{N}\,\mathcal{K}_N(\theta^{*})$
is asymptotically Gaussian with zero mean, with covariance matrix $\Sigma(\theta^{*})$
which can be either computed or estimated.
If, additionally, the matrix
$\mathcal{J}(\theta^{*}) = -\,\mathbf{E}_{\theta^{*}}\!\left[\partial \mathcal{K}_N(\theta^{*}) / \partial \theta\right]$
is invertible, then using a Taylor expansion and
the constraint $\mathcal{K}_N(\widehat{\theta}_N) = 0$,
the asymptotic normality of the estimate is obtained:

$$\sqrt{N}\left(\widehat{\theta}_N - \theta^{*}\right) \;\longrightarrow\; \mathcal{N}\!\left(0,\ \mathcal{J}^{-1}\,\Sigma\,\mathcal{J}^{-T}\right)$$
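A small Monte Carlo experiment makes the asymptotic normality concrete. In the illustrative scalar location model (an assumption of this sketch, with hypothetical parameter values), the Jacobian is $1$ and the CLT covariance is $\sigma^2$, so the normalized error should have mean near zero and variance near $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(2)
theta_star, sigma, N, trials = 2.0, 1.0, 500, 4_000  # example values

# For K(theta, Z) = theta - Z: Jacobian J = E[dK/dtheta] = 1 and
# Sigma = Var(K(theta_star, Z)) = sigma^2, so the asymptotic covariance
# J^{-1} Sigma J^{-T} reduces to sigma^2.
errors = np.empty(trials)
for t in range(trials):
    Z = rng.normal(theta_star, sigma, N)
    errors[t] = np.sqrt(N) * (Z.mean() - theta_star)

print(errors.mean(), errors.var())   # close to 0 and sigma^2
```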
In many applications, such an approach must be improved in the following directions: