..
   Copyright (C) 2008-2024 EDF R&D

   This file is part of SALOME ADAO module.

   This library is free software; you can redistribute it and/or
   modify it under the terms of the GNU Lesser General Public
   License as published by the Free Software Foundation; either
   version 2.1 of the License, or (at your option) any later version.

   This library is distributed in the hope that it will be useful,
   but WITHOUT ANY WARRANTY; without even the implied warranty of
   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
   Lesser General Public License for more details.

   You should have received a copy of the GNU Lesser General Public
   License along with this library; if not, write to the Free Software
   Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA

   See http://www.salome-platform.org/ or email : webmaster.salome@opencascade.com

   Author: Jean-Philippe Argaud, jean-philippe.argaud@edf.fr, EDF R&D
.. index:: single: LinearLeastSquares
.. _section_ref_algorithm_LinearLeastSquares:

Calculation algorithm "*LinearLeastSquares*"
--------------------------------------------

.. ------------------------------------ ..
.. include:: snippets/Header2Algo01.rst
This algorithm performs a "Least Squares" linear type estimation of the state
of a system. It is similar to a :ref:`section_ref_algorithm_Blue`, without its
background part.
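Concretely, such an estimate minimizes the quadratic observation misfit. The following sketch writes the standard weighted linear least squares formula with NumPy; it is illustrative only, does not use the ADAO API, and the names ``H``, ``R``, ``y`` are simply the usual data assimilation notation:

```python
import numpy as np

# Illustrative sketch of the weighted linear least squares estimate
# (not the ADAO implementation):
#   J(x) = (y - H x)^T R^{-1} (y - H x)
#   xa   = (H^T R^{-1} H)^{-1} H^T R^{-1} y

H = np.array([[1., 0.], [0., 2.], [1., 1.]])  # linear observation operator
R = np.diag([0.1, 0.1, 0.1])                  # observation error covariance
y = np.array([1., 4., 3.])                    # observation vector

Rinv = np.linalg.inv(R)
xa = np.linalg.solve(H.T @ Rinv @ H, H.T @ Rinv @ y)  # analysis state

residual = y - H @ xa
J = float(residual @ Rinv @ residual)  # cost function value at the optimum
```

Note that no background state or background error covariance appears in the formula, which is precisely what distinguishes it from the *Blue* estimate.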
This algorithm is always the fastest of all the optimization algorithms of
ADAO. It is theoretically reserved for observation operator cases which are
explicitly linear, even if it sometimes works in "slightly" non-linear cases.
One can verify the linearity of the observation operator with the help of a
:ref:`section_ref_algorithm_LinearityTest`.
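As an aside, the linearity property in question can be checked numerically in the same spirit: a linear operator must satisfy superposition. This sketch is illustrative only, is not the ADAO *LinearityTest* implementation, and the function name ``check_linearity`` is hypothetical:

```python
import numpy as np

# Rough numerical linearity check (illustrative, not ADAO's LinearityTest):
# a linear operator H satisfies H(a*x1 + b*x2) == a*H(x1) + b*H(x2).

def check_linearity(H, n, trials=10, tol=1e-8):
    rng = np.random.default_rng(0)
    for _ in range(trials):
        x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
        a, b = rng.standard_normal(2)
        lhs = H(a * x1 + b * x2)
        rhs = a * H(x1) + b * H(x2)
        if not np.allclose(lhs, rhs, atol=tol):
            return False
    return True

M = np.array([[1., 2.], [3., 4.]])
linear_op = lambda x: M @ x            # explicitly linear
nonlinear_op = lambda x: M @ x + x**2  # "slightly" non-linear
```

A "slightly" non-linear operator fails this test even when the least squares estimate may still give usable results in practice.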
This algorithm is naturally written for a single estimate, without any dynamic
or iterative notion (there is no need in this case for an incremental evolution
operator, nor for an evolution error covariance). In ADAO, it can also be used
on a succession of observations, placing the estimate in a recursive framework
partly similar to a Kalman Filter. A standard estimate is made at each
observation step on the state predicted by the incremental evolution model.
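One possible reading of this recursive use can be sketched as follows. This is purely illustrative and not the ADAO implementation; the evolution model ``M`` and the helper ``lls_analysis`` are hypothetical names introduced only for the sketch:

```python
import numpy as np

# Illustrative sketch (not the ADAO implementation) of repeating a
# single-step least squares estimate over a succession of observations:
# at each step a hypothetical evolution model advances the true state,
# and a standard least squares analysis is recomputed from that step's
# observation.

H = np.array([[1., 0.], [0., 1.], [1., 1.]])  # linear observation operator
Rinv = np.eye(3)                              # inverse observation covariance
M = 0.9 * np.eye(2)                           # hypothetical evolution model

def lls_analysis(y):
    # Standard linear least squares estimate from one observation vector.
    return np.linalg.solve(H.T @ Rinv @ H, H.T @ Rinv @ y)

x_true = np.array([2., 1.])
analyses = []
for _ in range(3):
    x_true = M @ x_true               # the system evolves one step
    y = H @ x_true                    # noise-free observation at this step
    analyses.append(lls_analysis(y))  # per-step standard estimate
```

Unlike a true Kalman Filter, no background or forecast error information enters each per-step analysis here, which is why the framework is only "partly similar" to one.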
In all cases, it is recommended to prefer at least a
:ref:`section_ref_algorithm_Blue`, or a
:ref:`section_ref_algorithm_ExtendedBlue`, or a
:ref:`section_ref_algorithm_3DVAR`.
.. ------------------------------------ ..
.. include:: snippets/Header2Algo12.rst

.. include:: snippets/FeaturePropLocalOptimization.rst

.. include:: snippets/FeaturePropDerivativeNeeded.rst

.. include:: snippets/FeaturePropParallelDerivativesOnly.rst

.. include:: snippets/FeaturePropConvergenceOnStatic.rst

.. ------------------------------------ ..
.. include:: snippets/Header2Algo02.rst

.. include:: snippets/Observation.rst

.. include:: snippets/ObservationError.rst

.. include:: snippets/ObservationOperator.rst

.. ------------------------------------ ..
.. include:: snippets/Header2Algo03AdOp.rst

.. include:: snippets/EstimationOf_Parameters.rst
StoreSupplementaryCalculations
  .. index:: single: StoreSupplementaryCalculations

  *List of names*. This list indicates the names of the supplementary
  variables, that can be available during or at the end of the algorithm, if
  they are initially required by the user. Their availability involves,
  potentially, costly calculations or memory consumption. The default is then
  a void list, none of these variables being calculated and stored by default
  (excepted the unconditional variables). The possible names are in the
  following list (the detailed description of each named variable is given in
  the following part of this specific algorithmic documentation, in the
  sub-section "*Information and variables available at the end of the
  algorithm*"): [
  "Analysis",
  "CostFunctionJ",
  "CostFunctionJAtCurrentOptimum",
  "CostFunctionJb",
  "CostFunctionJbAtCurrentOptimum",
  "CostFunctionJo",
  "CostFunctionJoAtCurrentOptimum",
  "CurrentOptimum",
  "CurrentState",
  "CurrentStepNumber",
  "ForecastState",
  "InnovationAtCurrentAnalysis",
  "OMA",
  "SimulatedObservationAtCurrentOptimum",
  "SimulatedObservationAtCurrentState",
  "SimulatedObservationAtOptimum",
  ].

  Example :
  ``{"StoreSupplementaryCalculations":["CurrentState", "Residu"]}``
*Tips for this algorithm:*

    As the *"Background"* and *"BackgroundError"* commands are required for
    ALL the calculation algorithms in the interface, you have to provide a
    value, even if these commands are not required for this algorithm, and
    will not be used. The simplest way is to give "1" as a STRING for both.
.. ------------------------------------ ..
.. include:: snippets/Header2Algo04.rst

.. include:: snippets/Analysis.rst

.. include:: snippets/CostFunctionJ.rst

.. include:: snippets/CostFunctionJb.rst

.. include:: snippets/CostFunctionJo.rst

.. ------------------------------------ ..
.. include:: snippets/Header2Algo05.rst

.. include:: snippets/Analysis.rst

.. include:: snippets/CostFunctionJ.rst

.. include:: snippets/CostFunctionJAtCurrentOptimum.rst

.. include:: snippets/CostFunctionJb.rst

.. include:: snippets/CostFunctionJbAtCurrentOptimum.rst

.. include:: snippets/CostFunctionJo.rst

.. include:: snippets/CostFunctionJoAtCurrentOptimum.rst

.. include:: snippets/CurrentOptimum.rst

.. include:: snippets/CurrentState.rst

.. include:: snippets/CurrentStepNumber.rst

.. include:: snippets/ForecastState.rst

.. include:: snippets/InnovationAtCurrentAnalysis.rst

.. include:: snippets/OMA.rst

.. include:: snippets/SimulatedObservationAtCurrentOptimum.rst

.. include:: snippets/SimulatedObservationAtCurrentState.rst

.. include:: snippets/SimulatedObservationAtOptimum.rst

.. ------------------------------------ ..
.. _section_ref_algorithm_LinearLeastSquares_examples:

.. include:: snippets/Header2Algo06.rst

- :ref:`section_ref_algorithm_Blue`
- :ref:`section_ref_algorithm_ExtendedBlue`
- :ref:`section_ref_algorithm_3DVAR`
- :ref:`section_ref_algorithm_LinearityTest`