..
   Copyright (C) 2008-2020 EDF R&D

   This file is part of SALOME ADAO module.

   This library is free software; you can redistribute it and/or
   modify it under the terms of the GNU Lesser General Public
   License as published by the Free Software Foundation; either
   version 2.1 of the License, or (at your option) any later version.

   This library is distributed in the hope that it will be useful,
   but WITHOUT ANY WARRANTY; without even the implied warranty of
   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
   Lesser General Public License for more details.

   You should have received a copy of the GNU Lesser General Public
   License along with this library; if not, write to the Free Software
   Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307 USA

   See http://www.salome-platform.org/ or email : webmaster.salome@opencascade.com

   Author: Jean-Philippe Argaud, jean-philippe.argaud@edf.fr, EDF R&D
.. index:: single: 3DVAR
.. _section_ref_algorithm_3DVAR:

Calculation algorithm "*3DVAR*"
-------------------------------

.. ------------------------------------ ..
.. include:: snippets/Header2Algo01.rst

This algorithm performs a state estimation by variational minimization of the
classical :math:`J` discrepancy functional of static data assimilation:

.. math:: J(\mathbf{x})=(\mathbf{x}-\mathbf{x}^b)^T.\mathbf{B}^{-1}.(\mathbf{x}-\mathbf{x}^b)+(\mathbf{y}^o-H(\mathbf{x}))^T.\mathbf{R}^{-1}.(\mathbf{y}^o-H(\mathbf{x}))

which is usually referred to as the "*3D-VAR*" functional (see for example
[Talagrand97]_).
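
As a purely illustrative sketch (outside ADAO itself), the functional above can
be evaluated for a small two-dimensional case, assuming a linear identity
observation operator :math:`H` and diagonal matrices :math:`\mathbf{B}` and
:math:`\mathbf{R}` so that the inverses are trivial. All numerical values here
are hypothetical.

```python
# Illustrative (non-ADAO) evaluation of the 3D-VAR cost functional J(x)
# for a 2-dimensional state, with diagonal B and R given by their diagonals
# and a user-supplied observation operator H.

def cost_3dvar(x, xb, B_diag, yo, R_diag, H):
    """J(x) = (x-xb)^T B^-1 (x-xb) + (yo-H(x))^T R^-1 (yo-H(x))."""
    dxb = [xi - xbi for xi, xbi in zip(x, xb)]
    jb = sum(d * d / b for d, b in zip(dxb, B_diag))   # background term
    dyo = [yi - hi for yi, hi in zip(yo, H(x))]
    jo = sum(d * d / r for d, r in zip(dyo, R_diag))   # observation term
    return jb + jo

# Hypothetical data: background xb, observation yo, unit variances,
# identity observation operator.
xb = [0.0, 0.0]
yo = [1.0, 2.0]
H = lambda x: list(x)

# With B = R = I, the minimum of J lies halfway between xb and yo.
print(cost_3dvar([0.5, 1.0], xb, [1.0, 1.0], yo, [1.0, 1.0], H))  # → 2.5
```

With unit variances the background and observation terms weigh equally, which
is why the optimal state here is simply the average of :math:`\mathbf{x}^b`
and :math:`\mathbf{y}^o`.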

.. ------------------------------------ ..
.. include:: snippets/Header2Algo02.rst

.. include:: snippets/Background.rst

.. include:: snippets/BackgroundError.rst

.. include:: snippets/Observation.rst

.. include:: snippets/ObservationError.rst

.. include:: snippets/ObservationOperator.rst

.. ------------------------------------ ..
.. include:: snippets/Header2Algo03AdOp.rst

.. include:: snippets/BoundsWithNone.rst

.. include:: snippets/CostDecrementTolerance.rst

.. include:: snippets/GradientNormTolerance.rst

.. include:: snippets/MaximumNumberOfSteps.rst

Minimizer
  .. index:: single: Minimizer

  This key allows changing the minimizer used by the optimizer. The default
  choice is "LBFGSB", and the possible choices are "LBFGSB" (nonlinear
  constrained minimization, see [Byrd95]_, [Morales11]_ and [Zhu97]_), "TNC"
  (nonlinear constrained minimization), "CG" (nonlinear unconstrained
  minimization), "BFGS" (nonlinear unconstrained minimization), and "NCG"
  (Newton conjugate gradient minimization). It is strongly recommended to
  keep the default value.

  Example:
  ``{"Minimizer":"LBFGSB"}``

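
To give an idea of what the default "LBFGSB" choice does, the following sketch
(not ADAO code) minimizes a simplified quadratic 3D-VAR-like functional with
SciPy's L-BFGS-B implementation, assuming SciPy is available. The data are
hypothetical, with :math:`\mathbf{B}` and :math:`\mathbf{R}` taken as identity
matrices and :math:`H` as the identity operator.

```python
# Illustrative sketch: L-BFGS-B minimization of a simplified 3D-VAR
# functional with B = R = I and H = Id. All values are hypothetical.
import numpy as np
from scipy.optimize import minimize

xb = np.array([0.0, 0.0])   # background state
yo = np.array([1.0, 2.0])   # observation

def J(x):
    # J(x) = (x-xb)^T (x-xb) + (yo-x)^T (yo-x)
    return float(np.sum((x - xb) ** 2) + np.sum((yo - x) ** 2))

res = minimize(J, xb, method="L-BFGS-B")
print(res.x)  # the analysis, close to (xb + yo) / 2 = [0.5, 1.0]
```

The ``bounds`` argument of ``scipy.optimize.minimize`` is what makes L-BFGS-B
a constrained method, which is why the "Bounds" key above is honored by the
default minimizer.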
.. include:: snippets/NumberOfSamplesForQuantiles.rst

.. include:: snippets/ProjectedGradientTolerance.rst

.. include:: snippets/Quantiles.rst

.. include:: snippets/SetSeed.rst

.. include:: snippets/SimulationForQuantiles.rst

StoreSupplementaryCalculations
  .. index:: single: StoreSupplementaryCalculations

  This list indicates the names of the supplementary variables that can be
  available at the end of the algorithm, if they are initially required by
  the user. It potentially involves costly calculations or storage. The
  default is a void list, none of these variables being calculated and
  stored by default, except the unconditional variables. The possible names
  are in the following list: [
  "Analysis",
  "APosterioriCorrelations",
  "APosterioriCovariance",
  "APosterioriStandardDeviations",
  "APosterioriVariances",
  "BMA",
  "CostFunctionJ",
  "CostFunctionJAtCurrentOptimum",
  "CostFunctionJb",
  "CostFunctionJbAtCurrentOptimum",
  "CostFunctionJo",
  "CostFunctionJoAtCurrentOptimum",
  "CurrentIterationNumber",
  "CurrentOptimum",
  "CurrentState",
  "IndexOfOptimum",
  "Innovation",
  "InnovationAtCurrentState",
  "JacobianMatrixAtBackground",
  "JacobianMatrixAtOptimum",
  "KalmanGainAtOptimum",
  "MahalanobisConsistency",
  "OMA",
  "OMB",
  "SigmaObs2",
  "SimulatedObservationAtBackground",
  "SimulatedObservationAtCurrentOptimum",
  "SimulatedObservationAtCurrentState",
  "SimulatedObservationAtOptimum",
  "SimulationQuantiles",
  ].

  Example:
  ``{"StoreSupplementaryCalculations":["BMA", "CurrentState"]}``

.. ------------------------------------ ..
.. include:: snippets/Header2Algo04.rst

.. include:: snippets/Analysis.rst

.. include:: snippets/CostFunctionJ.rst

.. include:: snippets/CostFunctionJb.rst

.. include:: snippets/CostFunctionJo.rst

.. ------------------------------------ ..
.. include:: snippets/Header2Algo05.rst

.. include:: snippets/Analysis.rst

.. include:: snippets/APosterioriCorrelations.rst

.. include:: snippets/APosterioriCovariance.rst

.. include:: snippets/APosterioriStandardDeviations.rst

.. include:: snippets/APosterioriVariances.rst

.. include:: snippets/BMA.rst

.. include:: snippets/CostFunctionJ.rst

.. include:: snippets/CostFunctionJAtCurrentOptimum.rst

.. include:: snippets/CostFunctionJb.rst

.. include:: snippets/CostFunctionJbAtCurrentOptimum.rst

.. include:: snippets/CostFunctionJo.rst

.. include:: snippets/CostFunctionJoAtCurrentOptimum.rst

.. include:: snippets/CurrentIterationNumber.rst

.. include:: snippets/CurrentOptimum.rst

.. include:: snippets/CurrentState.rst

.. include:: snippets/IndexOfOptimum.rst

.. include:: snippets/Innovation.rst

.. include:: snippets/InnovationAtCurrentState.rst

.. include:: snippets/JacobianMatrixAtBackground.rst

.. include:: snippets/JacobianMatrixAtOptimum.rst

.. include:: snippets/KalmanGainAtOptimum.rst

.. include:: snippets/MahalanobisConsistency.rst

.. include:: snippets/OMA.rst

.. include:: snippets/OMB.rst

.. include:: snippets/SigmaObs2.rst

.. include:: snippets/SimulatedObservationAtBackground.rst

.. include:: snippets/SimulatedObservationAtCurrentOptimum.rst

.. include:: snippets/SimulatedObservationAtCurrentState.rst

.. include:: snippets/SimulatedObservationAtOptimum.rst

.. include:: snippets/SimulationQuantiles.rst

.. ------------------------------------ ..
.. include:: snippets/Header2Algo09.rst

.. literalinclude:: scripts/simple_3DVAR.py

.. include:: snippets/Header2Algo10.rst

.. literalinclude:: scripts/simple_3DVAR.res

.. ------------------------------------ ..
.. include:: snippets/Header2Algo06.rst

- :ref:`section_ref_algorithm_Blue`
- :ref:`section_ref_algorithm_ExtendedBlue`
- :ref:`section_ref_algorithm_LinearityTest`

.. ------------------------------------ ..
.. include:: snippets/Header2Algo07.rst

- [Byrd95]_
- [Morales11]_
- [Talagrand97]_
- [Zhu97]_