DOCLANG = en
SPHINXOPTS =
SPHINXBUILD = sphinx-build
-PAPER =
+PAPER = letter
BUILDDIR = $(top_builddir)/doc/build/$(DOCLANG)
SRCDIR = $(top_srcdir)/doc/$(DOCLANG)
-EXTRA_DIST = conf.py advanced.rst examples.rst index.rst intro.rst theory.rst using.rst \
+EXTRA_DIST = conf.py advanced.rst index.rst intro.rst theory.rst tutorials_in_salome.rst gui_in_salome.rst \
resources/ADAO.png \
resources/ADAO_small.png \
resources/ADAO_small_rouge.png \
# Internal variables.
-PAPEROPT_a4 = -D latex_paper_size=a4
-PAPEROPT_letter = -D latex_paper_size=letter
+PAPEROPT_a4 = -D latex_elements.papersize='a4paper'
+PAPEROPT_letter = -D latex_elements.papersize='letterpaper'
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) $(SRCDIR)
.PHONY: help clean html dirhtml pickle json htmlhelp qthelp latex changes linkcheck doctest
+++ /dev/null
-..
- Copyright (C) 2008-2019 EDF R&D
-
- This file is part of SALOME ADAO module.
-
- This library is free software; you can redistribute it and/or
- modify it under the terms of the GNU Lesser General Public
- License as published by the Free Software Foundation; either
- version 2.1 of the License, or (at your option) any later version.
-
- This library is distributed in the hope that it will be useful,
- but WITHOUT ANY WARRANTY; without even the implied warranty of
- MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- Lesser General Public License for more details.
-
- You should have received a copy of the GNU Lesser General Public
- License along with this library; if not, write to the Free Software
- Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
-
- See http://www.salome-platform.org/ or email : webmaster.salome@opencascade.com
-
- Author: Jean-Philippe Argaud, jean-philippe.argaud@edf.fr, EDF R&D
-
-.. _section_examples:
-
-================================================================================
-**[DocU]** Tutorials on using the ADAO module
-================================================================================
-
-.. |eficas_new| image:: images/eficas_new.png
- :align: middle
- :scale: 50%
-.. |eficas_save| image:: images/eficas_save.png
- :align: middle
- :scale: 50%
-.. |eficas_saveas| image:: images/eficas_saveas.png
- :align: middle
- :scale: 50%
-.. |eficas_yacs| image:: images/eficas_yacs.png
- :align: middle
- :scale: 50%
-
-This section presents some examples of using the ADAO module in SALOME. The
-first one shows how to build a simple data assimilation case, defining explicitly
-all the required input data through the GUI. The second one shows, on the same
-case, how to define input data using external sources through scripts. We
-always describe Python scripts here because they can be directly inserted in
-YACS script nodes, but external files can use other languages.
-
-The mathematical notations used afterward are explained in the section
-:ref:`section_theory`.
-
-Building an estimation case with explicit data definition
----------------------------------------------------------
-
-This simple example is a demonstration one, and describes how to set a BLUE
-estimation framework in order to get the *fully weighted least square estimated
-state* of a system from an observation of the state and from an *a priori*
-knowledge (or background) of this state. In other words, we look for the
-weighted middle between the observation and the background vectors. All the
-numerical values of this example are arbitrary.
-
-Experimental setup
-++++++++++++++++++
-
-We choose to operate in a 3-dimensional space. 3D is chosen in order to restrict
-the size of the numerical objects that the user has to enter explicitly, but the
-problem does not depend on the dimension and can be set in dimension 10, 100,
-1000... The observation :math:`\mathbf{y}^o` has a value of 1 in each direction,
-so::
-
- Yo = [1 1 1]
-
-The background state :math:`\mathbf{x}^b`, which represents some *a priori*
-knowledge or a mathematical regularization, has a value of 0 in each direction,
-which is::
-
- Xb = [0 0 0]
-
-Data assimilation requires information on the error covariances :math:`\mathbf{R}`
-and :math:`\mathbf{B}`, respectively for observation and background variables.
-We choose here to have uncorrelated errors (that is, diagonal matrices) and to
-have the same variance of 1 for all variables (that is, identity matrices). We
-set::
-
- B = R = [1 0 0 ; 0 1 0 ; 0 0 1]
-
-Last, we need an observation operator :math:`\mathbf{H}` to convert the
-background value into the space of observation values. Here, because the space
-dimensions are the same, we can choose the identity as the observation
-operator::
-
- H = [1 0 0 ; 0 1 0 ; 0 0 1]
-
-With such choices, the "Best Linear Unbiased Estimator" (BLUE) will be the
-average vector between :math:`\mathbf{y}^o` and :math:`\mathbf{x}^b`, named the
-*analysis*, denoted by :math:`\mathbf{x}^a`, and its value is::
-
- Xa = [0.5 0.5 0.5]
-
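-As a quick numerical check of this value, one can reproduce the computation with
-a few lines of Python (a standalone sketch using the classical BLUE formula
-:math:`\mathbf{x}^a=\mathbf{x}^b+\mathbf{B}\mathbf{H}^T(\mathbf{H}\mathbf{B}\mathbf{H}^T+\mathbf{R})^{-1}(\mathbf{y}^o-\mathbf{H}\mathbf{x}^b)`,
-independent of the ADAO module itself)::
-
-    import numpy
-    #
-    Xb = numpy.zeros(3)              # background
-    Yo = numpy.ones(3)               # observation
-    B = R = H = numpy.identity(3)    # covariances and observation operator
-    K  = B @ H.T @ numpy.linalg.inv(H @ B @ H.T + R)   # gain matrix
-    Xa = Xb + K @ (Yo - H @ Xb)
-    print(Xa)                        # -> [0.5 0.5 0.5]
-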
-As an extension of this example, one can change the variances represented by
-:math:`\mathbf{B}` or :math:`\mathbf{R}` independently, and the analysis
-:math:`\mathbf{x}^a` will move towards :math:`\mathbf{y}^o` or towards
-:math:`\mathbf{x}^b`, in inverse proportion to the variances in
-:math:`\mathbf{B}` and :math:`\mathbf{R}`. As another extension, it is also
-equivalent to search for the analysis through a BLUE algorithm or a 3DVAR one.
-
-Using the graphical interface (GUI) to build the ADAO case
-++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
-
-First, you have to activate the ADAO module by choosing the appropriate module
-button or menu of SALOME, and you will see:
-
- .. _adao_activate2:
- .. image:: images/adao_activate.png
- :align: center
- :width: 100%
- .. centered::
- **Activating the module ADAO in SALOME**
-
-Choose the "*New*" button in this window. You will directly get the embedded
-case editor interface for variables definition, along with the SALOME "*Object
-browser*". You can then click on the "*New*" button |eficas_new| to create a new
-ADAO case, and you will see:
-
- .. _adao_viewer:
- .. image:: images/adao_viewer.png
- :align: center
- :width: 100%
- .. centered::
- **The embedded editor for cases definition in module ADAO**
-
-Then, fill in the variables to build the ADAO case by using the experimental
-setup described above. All the technical information given above will be
-directly inserted in the ADAO case definition, by using the *String* type for
-all the variables. When the case definition is ready, save it to a
-"*JDC (\*.comm)*" native file somewhere in your path. Remember that other files
-will also be created next to this first one, so it is better to make a specific
-directory for your case, and to save the file inside. The name of the file will
-appear in the "*Object browser*" window, under the "*ADAO*" menu. The final case
-definition looks like this:
-
- .. _adao_jdcexample01:
- .. image:: images/adao_jdcexample01.png
- :align: center
- :width: 100%
- .. centered::
- **Definition of the experimental setup chosen for the ADAO case**
-
-To go further, we need now to generate the YACS scheme from the ADAO case
-definition. In order to do that, right click on the name of the file case in the
-"*Object browser*" window, and choose the "*Export to YACS*" sub-menu (or the
-"*Export to YACS*" button |eficas_yacs|) as below:
-
- .. _adao_exporttoyacs00:
- .. image:: images/adao_exporttoyacs.png
- :align: center
- :scale: 75%
- .. centered::
- **"Export to YACS" sub-menu to generate the YACS scheme from the ADAO case**
-
-This command will generate the YACS scheme, activate the YACS module in SALOME,
-and open the new scheme in the GUI of the YACS module [#]_. After reordering the
-nodes if necessary, by using the "*arrange local nodes*" sub-menu of the YACS
-graphical view of the scheme, you get the following representation of the
-generated ADAO scheme:
-
- .. _yacs_generatedscheme:
- .. image:: images/yacs_generatedscheme.png
- :align: center
- :width: 100%
- .. centered::
- **YACS generated scheme from the ADAO case**
-
-After that point, all the modifications, executions and post-processing of the
-data assimilation scheme will be done in the YACS module. In order to check the
-result in a simple way, we use the "*UserPostAnalysis*" node (or we create here
-a new YACS node by using the "*in-line script node*" sub-menu of the YACS
-graphical view).
-
-This script node will retrieve the data assimilation analysis from the
-"*algoResults*" output port of the computation bloc (which gives access to a
-SALOME Python Object), and will print it on the standard output.
-
-To obtain this, the in-line script node needs to have an input port of type
-"*pyobj*", named "*Study*" for example, that has to be linked graphically to
-the "*algoResults*" output port of the computation bloc. Then, the code to fill
-in the script node is::
-
- Xa = Study.getResults().get("Analysis")[-1]
-
- print()
- print("Analysis =",Xa)
- print()
-
-The (initial or augmented) YACS scheme can be saved (overwriting the generated
-scheme if the "*Save*" command or button is used, or with a new name through
-the "*Save as*" command). Ideally, such a post-processing procedure can first be
-implemented and tested in YACS, and then entirely saved in one Python script
-that can be integrated in the ADAO case by using the keyword
-"*UserPostAnalysis*".
-
-Then, classically in YACS, the scheme has to be compiled in order to be run, and
-then executed. After completion, the printing on standard output is available in
-the "*YACS Container Log*", obtained through the right click menu of the
-"*proc*" window in the YACS scheme as shown below:
-
- .. _yacs_containerlog:
- .. image:: images/yacs_containerlog.png
- :align: center
- :width: 100%
- .. centered::
- **YACS menu for Container Log, and dialog window showing the log**
-
-We verify that the result is correct by checking that the log dialog window
-contains the following line::
-
- Analysis = [0.5, 0.5, 0.5]
-
-as shown in the image above.
-
-As a simple extension of this example, one can notice that the same problem
-solved with a 3DVAR algorithm gives the same result. This algorithm can be
-chosen at the ADAO case building step, before entering the YACS step. The
-ADAO 3DVAR case will look completely similar to the BLUE algorithmic case, as
-shown in the following figure:
-
- .. _adao_jdcexample02:
- .. image:: images/adao_jdcexample02.png
- :align: center
- :width: 100%
- .. centered::
- **Defining an ADAO 3DVAR case looks completely similar to a BLUE case**
-
-Only one command changes, with the "*3DVAR*" value in the "*Algorithm*"
-field instead of "*Blue*".
-
-Building an estimation case with external data definition by scripts
---------------------------------------------------------------------
-
-It is useful to get part or all of the data from an external definition, using
-Python script files to provide access to the data. As an example, we build here
-an ADAO case representing the same experimental setup as in the above example
-`Building an estimation case with explicit data definition`_, but using data
-from a single external Python script file.
-
-First, we write the following script file, using conventional names for the
-required variables. Here, all the input variables are defined in the same
-script, but the user can choose to split the file into several ones, or to mix
-explicit data definition in the ADAO GUI and implicit data definition by
-external files. The present script file looks like::
-
- import numpy
- #
- # Definition of the Background as a vector
- # ----------------------------------------
- Background = [0, 0, 0]
- #
- # Definition of the Observation as a vector
- # -----------------------------------------
- Observation = "1 1 1"
- #
- # Definition of the Background Error covariance as a matrix
- # ---------------------------------------------------------
- BackgroundError = numpy.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
- #
- # Definition of the Observation Error covariance as a matrix
- # ----------------------------------------------------------
- ObservationError = numpy.matrix("1 0 0 ; 0 1 0 ; 0 0 1")
- #
- # Definition of the Observation Operator as a matrix
- # --------------------------------------------------
- ObservationOperator = numpy.identity(3)
-
-The names of the Python variables above are mandatory, in order to define the
-right case variables, but the Python script can be bigger and define classes,
-functions, file or database access, etc. with other names. Moreover, the above
-script shows different ways to define arrays and matrices, using a list, a
-string (as in Numpy or Octave), the Numpy array type or the Numpy matrix type,
-and Numpy special functions. All of these syntaxes are valid.
-
-After saving this script in a file (named here "*script.py*" for the example)
-somewhere in your path, we use the graphical interface (GUI) to build the ADAO
-case. The procedure to fill in the case is similar to the previous example
-except that, instead of selecting the "*String*" option for the "*FROM*" keyword
-of each variable, we select the "*Script*" one. This leads to a
-"*SCRIPT_DATA/SCRIPT_FILE*" entry in the graphical tree, allowing to choose a
-file as:
-
- .. _adao_scriptentry01:
- .. image:: images/adao_scriptentry01.png
- :align: center
- :width: 100%
- .. centered::
- **Defining an input value using an external script file**
-
-Other steps and results are exactly the same as in the previous example
-`Building an estimation case with explicit data definition`_.
-
-In fact, this script methodology is the easiest way to retrieve data from
-in-line or previous calculations, from static files, from a database or from a
-stream, all of them inside or outside of SALOME. It also allows to easily modify
-some input data, for example for debugging purposes or for repetitive execution
-processes, and it is the most versatile method in order to parametrize the input
-data. **But be careful, the script methodology is not a "safe" procedure, in the
-sense that erroneous data, or errors in calculations, can be directly injected
-into the YACS scheme execution. The user has to carefully verify the content of
-his scripts.**
-
-Adding parameters to control the data assimilation algorithm
-------------------------------------------------------------
-
-One can add some optional parameters to control the data assimilation algorithm
-calculation. This is done by using optional parameters in the
-"*AlgorithmParameters*" command of the ADAO case definition, which is a keyword
-of the "*ASSIMILATION_STUDY*" general command. This keyword requires an explicit
-definition of the values from default ones, or from a Python dictionary
-containing some key/value pairs. The list of possible optional parameters is
-given in the section :ref:`section_reference` and its subsections. The
-recommendation is to use the explicit definition of values from the default list
-of optional parameters, as here with the "*MaximumNumberOfSteps*":
-
- .. _adao_scriptentry02:
- .. image:: images/adao_scriptentry02.png
- :align: center
- :width: 100%
- .. centered::
- **Adding parameters to control the algorithm and the outputs**
-
-This dictionary can be defined, for example, in an external Python script
-file, using the mandatory variable name "*AlgorithmParameters*" for the
-dictionary. All the keys inside the dictionary are optional, they all have
-default values, and can exist without being used. For example::
-
- AlgorithmParameters = {
- "Minimizer" : "LBFGSB", # Recommended
- "MaximumNumberOfSteps" : 10,
- }
-
-If no bounds at all are required on the control variables, then one can choose
-the "*BFGS*" or "*CG*" minimization algorithm for all the variational data
-assimilation or optimization algorithms. For constrained optimization, the
-minimizer "*LBFGSB*" is often more robust, but the "*TNC*" is sometimes more
-effective. In a general way, the "*LBFGSB*" algorithm choice is recommended.
-Then the script can be added to the ADAO case, in a file entry describing the
-"*Parameters*" keyword.
-
-Other steps and results are exactly the same as in the previous example
-`Building an estimation case with explicit data definition`_. The dictionary
-can also be directly given in the string-type input field associated with the
-keyword.
-
-Building a complex case with external data definition by scripts
-----------------------------------------------------------------
-
-This more complex and complete example has to be considered as a framework for
-the treatment of user inputs, that needs to be tailored for each real
-application. Nevertheless, the file skeletons are sufficiently general to have
-been used for various applications in neutronics, fluid mechanics... Here, we
-will not focus on the results, but more on the user control of inputs and
-outputs in an ADAO case. As previously, all the numerical values of this example
-are arbitrary.
-
-The objective is to set up the input and output definitions of a physical
-estimation case by external Python scripts, using a general non-linear operator,
-adding control on parameters and so on... The complete framework scripts can be
-found in the ADAO skeletons examples directory under the name
-"*External_data_definition_by_scripts*".
-
-Experimental setup
-++++++++++++++++++
-
-We continue to operate in a 3-dimensional space, in order to restrict
-the size of the numerical objects shown in the scripts, but the problem
-does not depend on the dimension.
-
-We choose a twin experiment context, using a known true state
-:math:`\mathbf{x}^t` but of arbitrary value::
-
- Xt = [1 2 3]
-
-The background state :math:`\mathbf{x}^b`, which represents some *a priori*
-knowledge of the true state, is built as a normal random perturbation of 20% of
-the true state :math:`\mathbf{x}^t` for each component, which is::
-
- Xb = Xt + normal(0, 20%*Xt)
-
-To describe the background error covariance matrix :math:`\mathbf{B}`, we make
-as previously the hypothesis of uncorrelated errors (that is, a diagonal matrix,
-of size 3x3 because :math:`\mathbf{x}^b` is of length 3) with the same
-variance of 0.1 for all variables. We get::
-
- B = 0.1 * diagonal( length(Xb) )
-
-We suppose that there exists an observation operator :math:`\mathbf{H}`, which
-can be non-linear. In real calibration procedures or inverse problems, the
-physical simulation codes are embedded in the observation operator. We also need
-to know its gradient with respect to each calibrated variable, which is rarely
-known information with industrial codes. But we will see later how to obtain an
-approximated gradient in this case.
-
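-For the record, such an approximated gradient is classically obtained by finite
-differences of the direct operator. The following minimal sketch (an
-illustration only, not the internal ADAO implementation, with a hypothetical
-``ApproximatedGradient`` helper) shows the principle::
-
-    import numpy
-    #
-    def ApproximatedGradient( DirectOperator, X, increment = 0.01 ):
-        "Finite-difference Jacobian of the operator around the point X"
-        X  = numpy.asarray(X, dtype=float)
-        FX = numpy.asarray(DirectOperator(X))
-        Jacobian = numpy.zeros((FX.size, X.size))
-        for j in range(X.size):
-            dX    = numpy.zeros(X.size)
-            dX[j] = increment * max(abs(X[j]), 1.)
-            Jacobian[:, j] = (numpy.asarray(DirectOperator(X + dX)) - FX) / dX[j]
-        return Jacobian
-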
-Being in a twin experiment context, the observation :math:`\mathbf{y}^o` and its
-error covariance matrix :math:`\mathbf{R}` are generated by using the true state
-:math:`\mathbf{x}^t` and the observation operator :math:`\mathbf{H}`::
-
- Yo = H( Xt )
-
-and, with an arbitrary standard deviation of 1% on each error component::
-
- R = 0.0001 * diagonal( length(Yo) )
-
-All the information required for estimation by data assimilation is then
-defined.
-
-Skeletons of the scripts describing the setup
-+++++++++++++++++++++++++++++++++++++++++++++
-
-We give here the essential parts of each script used afterward to build the
-ADAO case. Remember that using these scripts in real Python files requires
-correctly defining the path to imported modules or codes (even if the module is
-in the same directory as the importing Python file; we indicate the path
-adjustment using the mention ``"# INSERT PHYSICAL SCRIPT PATH"``), the encoding
-if necessary, etc. The indicated file names for the following scripts are
-arbitrary. Examples of complete script files are available in the ADAO examples
-standard directory.
-
-We first define the true state :math:`\mathbf{x}^t` and some convenient matrix
-building function, in a Python script file named
-``Physical_data_and_covariance_matrices.py``::
-
- import numpy
- #
- def True_state():
- """
- Arbitrary values and names, as a tuple of two series of same length
- """
- return (numpy.array([1, 2, 3]), ['Para1', 'Para2', 'Para3'])
- #
- def Simple_Matrix( size, diagonal=None ):
- """
- Diagonal matrix, with either 1 or a given vector on the diagonal
- """
- if diagonal is not None:
- S = numpy.diag( diagonal )
- else:
- S = numpy.matrix(numpy.identity(int(size)))
- return S
-
-We can then define the background state :math:`\mathbf{x}^b` as a random
-perturbation of the true state, adding at the end of the script the definition
-of a *required ADAO variable*, in order to export the defined value. It is done
-in a Python script file named ``Script_Background_xb.py``::
-
- from Physical_data_and_covariance_matrices import True_state
- import numpy
- #
- xt, names = True_state()
- #
- Standard_deviation = 0.2*xt # 20% for each variable
- #
- xb = xt + abs(numpy.random.normal(0.,Standard_deviation,size=(len(xt),)))
- #
- # Creating the required ADAO variable
- # -----------------------------------
- Background = list(xb)
-
-In the same way, we define the background error covariance matrix
-:math:`\mathbf{B}` as a diagonal matrix, of the same diagonal length as the
-background state (itself of the same length as the true state), using the
-convenient function already defined. It is done in a Python script file named
-``Script_BackgroundError_B.py``::
-
- from Physical_data_and_covariance_matrices import True_state, Simple_Matrix
- #
- xt, names = True_state()
- #
- B = 0.1 * Simple_Matrix( size = len(xt) )
- #
- # Creating the required ADAO variable
- # -----------------------------------
- BackgroundError = B
-
-To continue, we need the observation operator :math:`\mathbf{H}` as a function
-of the state. It is here defined in an external file named
-``"Physical_simulation_functions.py"``, which should contain one function
-conveniently named here ``"DirectOperator"``. This function is a user-defined
-one, representing the :math:`\mathbf{H}` operator as a programming function. We
-suppose this function is then given by the user. A simple skeleton is given here
-for convenience::
-
-    import numpy
-    #
-    def DirectOperator( XX ):
- """ Direct non-linear simulation operator """
- #
- # --------------------------------------> EXAMPLE TO BE REMOVED
- if type(XX) is type(numpy.matrix([])): # EXAMPLE TO BE REMOVED
- HX = XX.A1.tolist() # EXAMPLE TO BE REMOVED
- elif type(XX) is type(numpy.array([])): # EXAMPLE TO BE REMOVED
- HX = numpy.matrix(XX).A1.tolist() # EXAMPLE TO BE REMOVED
- else: # EXAMPLE TO BE REMOVED
- HX = XX # EXAMPLE TO BE REMOVED
- # --------------------------------------> EXAMPLE TO BE REMOVED
- #
- return numpy.array( HX )
-
-We do not need the linear companion operators ``"TangentOperator"`` and
-``"AdjointOperator"`` because they will be approximated using ADAO capabilities.
-
-We insist on the fact that the non-linear operator ``"DirectOperator"``, the
-tangent operator ``"TangentOperator"`` and the adjoint operator
-``"AdjointOperator"`` come from the physical knowledge, include the reference
-physical simulation code, and have to be carefully set up by the data
-assimilation or optimization user. The simulation errors or misuses of the
-operators cannot be detected or corrected by the data assimilation and
-optimization ADAO framework alone.
-
-In this twin experiment framework, the observation :math:`\mathbf{y}^o` and its
-error covariance matrix :math:`\mathbf{R}` can be generated. It is done in two
-Python script files, the first one being named ``Script_Observation_yo.py``::
-
- from Physical_data_and_covariance_matrices import True_state
- from Physical_simulation_functions import DirectOperator
- #
- xt, noms = True_state()
- #
- yo = DirectOperator( xt )
- #
- # Creating the required ADAO variable
- # -----------------------------------
- Observation = list(yo)
-
-and the second one named ``Script_ObservationError_R.py``::
-
- from Physical_data_and_covariance_matrices import True_state, Simple_Matrix
- from Physical_simulation_functions import DirectOperator
- #
- xt, names = True_state()
- #
- yo = DirectOperator( xt )
- #
- R = 0.0001 * Simple_Matrix( size = len(yo) )
- #
- # Creating the required ADAO variable
- # -----------------------------------
- ObservationError = R
-
-As in previous examples, it can be useful to define some parameters for the data
-assimilation algorithm. For example, if we use the standard "*3DVAR*" algorithm,
-the following parameters can be defined in a Python script file named
-``Script_AlgorithmParameters.py``::
-
- # Creating the required ADAO variable
- # -----------------------------------
- AlgorithmParameters = {
- "Minimizer" : "LBFGSB", # Recommended
- "MaximumNumberOfSteps" : 15, # Number of global iterative steps
- "Bounds" : [
- [ None, None ], # Bound on the first parameter
- [ 0., 4. ], # Bound on the second parameter
- [ 0., None ], # Bound on the third parameter
- ],
- }
-
-Finally, it is common to post-process the results, retrieving them after the
-data assimilation phase in order to analyze, print or show them. It requires
-using an intermediary Python script file in order to extract these results at
-the end of the data assimilation or optimization process. The following example
-Python script file, named ``Script_UserPostAnalysis.py``, illustrates this::
-
- from Physical_data_and_covariance_matrices import True_state
- import numpy
- #
- xt, names = True_state()
- xa = ADD.get("Analysis")[-1]
- x_series = ADD.get("CurrentState")[:]
- J = ADD.get("CostFunctionJ")[:]
- #
- # Verifying the results by printing
- # ---------------------------------
- print()
- print("xt = %s"%xt)
- print("xa = %s"%numpy.array(xa))
- print()
- for i in range( len(x_series) ):
- print("Step %2i : J = %.5e and X = %s"%(i, J[i], x_series[i]))
- print()
-
-At the end, we get a description of the whole case setup through a set of files
-listed here:
-
-#. ``Physical_data_and_covariance_matrices.py``
-#. ``Physical_simulation_functions.py``
-#. ``Script_AlgorithmParameters.py``
-#. ``Script_BackgroundError_B.py``
-#. ``Script_Background_xb.py``
-#. ``Script_ObservationError_R.py``
-#. ``Script_Observation_yo.py``
-#. ``Script_UserPostAnalysis.py``
-
-We insist here that all these scripts are written by the user and cannot be
-automatically tested by ADAO. So the user is required to verify the scripts (and
-in particular their input/output) in order to limit the difficulty of debugging.
-We recall: **the script methodology is not a "safe" procedure, in the sense that
-erroneous data, or errors in calculations, can be directly injected into the
-YACS scheme execution.**
-
-Building the case with external data definition by scripts
-++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
-
-All these scripts can then be used to define the ADAO case with external data
-definition by Python script files. It is entirely similar to the method
-described in the previous section `Building an estimation case with external
-data definition by scripts`_. For each variable to be defined, we select the
-"*Script*" option of the "*FROM*" keyword, which leads to a
-"*SCRIPT_DATA/SCRIPT_FILE*" entry in the tree. For the "*ObservationOperator*"
-keyword, we choose the "*ScriptWithOneFunction*" form and keep the default
-differential increment.
-
-The other steps to build the ADAO case are exactly the same as in the previous
-section `Building an estimation case with explicit data definition`_.
-
-Using the simple linear operator :math:`\mathbf{H}` from the Python script file
-``Physical_simulation_functions.py`` in the ADAO examples standard directory,
-the results will look like::
-
- xt = [1 2 3]
- xa = [ 1.000014 2.000458 3.000390]
-
- Step 0 : J = 1.81750e+03 and X = [1.014011, 2.459175, 3.390462]
- Step 1 : J = 1.81750e+03 and X = [1.014011, 2.459175, 3.390462]
- Step 2 : J = 1.79734e+01 and X = [1.010771, 2.040342, 2.961378]
- Step 3 : J = 1.79734e+01 and X = [1.010771, 2.040342, 2.961378]
- Step 4 : J = 1.81909e+00 and X = [1.000826, 2.000352, 3.000487]
- Step 5 : J = 1.81909e+00 and X = [1.000826, 2.000352, 3.000487]
- Step 6 : J = 1.81641e+00 and X = [1.000247, 2.000651, 3.000156]
- Step 7 : J = 1.81641e+00 and X = [1.000247, 2.000651, 3.000156]
- Step 8 : J = 1.81569e+00 and X = [1.000015, 2.000432, 3.000364]
- Step 9 : J = 1.81569e+00 and X = [1.000015, 2.000432, 3.000364]
- Step 10 : J = 1.81568e+00 and X = [1.000013, 2.000458, 3.000390]
- ...
-
-The state at the first step is the randomly generated background state
-:math:`\mathbf{x}^b`. During calculation, this printing on standard output is
-available in the "*YACS Container Log*" window, obtained through the right click
-menu of the "*proc*" window in the YACS executed scheme.
-
-.. [#] For more information on YACS, see the *YACS module* and its integrated help available from the main menu *Help* of the SALOME platform.
--- /dev/null
+..
+ Copyright (C) 2008-2019 EDF R&D
+
+ This file is part of SALOME ADAO module.
+
+ This library is free software; you can redistribute it and/or
+ modify it under the terms of the GNU Lesser General Public
+ License as published by the Free Software Foundation; either
+ version 2.1 of the License, or (at your option) any later version.
+
+ This library is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ Lesser General Public License for more details.
+
+ You should have received a copy of the GNU Lesser General Public
+ License along with this library; if not, write to the Free Software
+ Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+
+ See http://www.salome-platform.org/ or email : webmaster.salome@opencascade.com
+
+ Author: Jean-Philippe Argaud, jean-philippe.argaud@edf.fr, EDF R&D
+
+.. _section_gui_in_salome:
+
+================================================================================
+**[DocR]** Graphical User Interface for ADAO (GUI/EFICAS)
+================================================================================
+
+.. |eficas_new| image:: images/eficas_new.png
+ :align: middle
+ :scale: 50%
+.. |eficas_save| image:: images/eficas_save.png
+ :align: middle
+ :scale: 50%
+.. |eficas_saveas| image:: images/eficas_saveas.png
+ :align: middle
+ :scale: 50%
+.. |eficas_yacs| image:: images/eficas_yacs.png
+ :align: middle
+ :scale: 50%
+.. |yacs_compile| image:: images/yacs_compile.png
+ :align: middle
+ :scale: 50%
+
+This section presents the usage of the ADAO module in the SALOME platform. Here
+we describe the general progression for establishing an ADAO case, the details
+being given in the following chapters. It is completed by the detailed
+description of all the commands and keywords in the section
+:ref:`section_reference`, by advanced usage procedures in the section
+:ref:`section_advanced`, and by examples in the section
+:ref:`section_tutorials_in_salome`.
+
+Logical procedure to build an ADAO case
+---------------------------------------
+
+The construction of an ADAO case follows a simple approach to define the set of
+input data, and then generates a complete executable block diagram used in YACS
+[#]_. Many variations exist for the definition of input data, but the logical
+sequence remains unchanged.
+
+First of all, the user is assumed to know the personal input data needed to
+set up the data assimilation study, following :ref:`section_methodology`. These
+data may or may not already be available in SALOME.
+
+Basically, the procedure of using ADAO involves the following steps:
+
+- :ref:`section_u_step1`
+- :ref:`section_u_step2`
+- :ref:`section_u_step3`
+- :ref:`section_u_step4`
+- :ref:`section_u_step5`
+
+Each step will be detailed in the next section.
+
+Detailed procedure to build an ADAO case
+----------------------------------------
+
+.. _section_u_step1:
+
+STEP 1: Activate the ADAO module and use the editor GUI
++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+
+As always for a module, it has to be activated by choosing the appropriate
+module button (or the menu) in the toolbar of SALOME. If there is no SALOME
+study loaded, a popup appears, allowing one to choose between creating a new
+study, or opening an already existing one:
+
+ .. _adao_activate1:
+ .. image:: images/adao_activate.png
+ :align: center
+ .. centered::
+ **Activating the module ADAO in SALOME**
+
+By choosing the "*New*" button, an embedded case editor [#]_ will be opened,
+along with the standard "*Object browser*". You can then click on the "*New*"
+button |eficas_new| (or choose the "*New*" entry in the "*ADAO*" main menu) to
+create a new ADAO case, and you will see:
+
+ .. _adao_viewer:
+ .. image:: images/adao_viewer.png
+ :align: center
+ :width: 100%
+ .. centered::
+ **The embedded editor for cases definition in module ADAO**
+
+.. _section_u_step2:
+
+STEP 2: Build and modify the ADAO case, and save it
++++++++++++++++++++++++++++++++++++++++++++++++++++
+
+To build a case using the embedded editor, you have to go through a series of
+sub-steps, by selecting, at each sub-step, a keyword and then filling in its
+value. Note that it is in this step that, among other things, the call to the
+simulation code used in the observation or evolution operators describing the
+problem has to be defined [#]_.
+
+The structured editor indicates hierarchical types, and the allowed values or
+keywords. Incomplete or incorrect keywords are identified by a red visual error
+flag. Possible values are indicated for keywords defined with a limited list of
+values, and adapted entries are given for the other keywords. Some help messages
+are contextually provided in the places reserved for them in the editor.
+
+A new case is set up with the minimal list of commands. All the mandatory
+commands or keywords are already present, and none of them can be removed.
+Optional keywords can be added by choosing them in a list of suggestions of
+allowed ones for the main command, for example the "*ASSIMILATION_STUDY*"
+command. As an example, one can add parameters in the "*AlgorithmParameters*"
+keyword, as described in the last part of the section
+:ref:`section_tutorials_in_salome` and sketched just below.
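+
+For illustration, such an optional entry can simply contain a Python dictionary;
+a minimal sketch, reusing the parameters shown in the tutorials (the available
+keys depend on the chosen algorithm), is::
+
+    AlgorithmParameters = {
+        "Minimizer"            : "LBFGSB",
+        "MaximumNumberOfSteps" : 10,
+        }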
+
+At the end, when all fields or keywords have been correctly defined, each line
+of the commands tree must have a green flag. This indicates that the whole case
+is valid and completed (and can be saved).
+
+ .. _adao_jdcexample00:
+ .. image:: images/adao_jdcexample01.png
+ :align: center
+ :scale: 75%
+ .. centered::
+ **Example of a valid ADAO case**
+
+Finally, you have to save your ADAO case by pushing the "*Save*" button
+|eficas_save|, or the "*Save as*" button |eficas_saveas|, or by choosing the
+"*Save/Save as*" entry in the "*ADAO*" menu. You will be prompted for a location
+in your file tree and a name, which will be completed by a "*.comm*" extension
+used for the embedded case editor. This will generate a pair of files describing
+the ADAO case, with the same base name, the first one being completed by a
+"*.comm*" extension and the second one by a "*.py*" extension [#]_.
+
+.. _section_u_step3:
+
+STEP 3: Export the ADAO case as a YACS scheme
++++++++++++++++++++++++++++++++++++++++++++++
+
+When the ADAO case is completed, you have to export it as a YACS scheme in order
+to execute the data assimilation calculation. This can be easily done by using
+the "*Export to YACS*" button |eficas_yacs|, or equivalently by choosing the
+"*Export to YACS*" entry in the "*ADAO*" main menu, or in the contextual case
+menu in the SALOME object browser.
+
+ .. _adao_exporttoyacs01:
+ .. image:: images/adao_exporttoyacs.png
+ :align: center
+ :scale: 75%
+ .. centered::
+ **"Export to YACS" sub-menu to generate the YACS scheme from the ADAO case**
+
+This will automatically generate a YACS scheme, and open the YACS module
+on this scheme. The YACS file, associated with the scheme, will be stored in the
+same directory and with the same base name as the saved ADAO case, only changing
+its extension to "*.xml*". Be careful, *if the XML file name already exists, the
+file will be overwritten without any prompt for replacing it*.
+
+.. _section_u_step4:
+
+STEP 4: Supplement and modify the YACS scheme, and save it
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+
+.. index:: single: Analysis
+
+When the YACS scheme is generated and opened in SALOME through the YACS module
+GUI, you can modify or supplement the scheme like any standard YACS scheme.
+Nodes or blocs can be added, copied or modified to elaborate a complex analysis,
+or to insert data assimilation or optimization capabilities into more complex
+YACS calculation schemes. It is recommended to save the modified scheme with a
+new name, in order to preserve the XML file in case you re-export the ADAO
+case to YACS.
+
+The main supplement needed in the YACS scheme is a post-processing step. The
+evaluation of the results has to be done in the physical context of the
+simulation used by the data assimilation procedure. The post-processing can be
+provided through the "*UserPostAnalysis*" ADAO keyword as a script or a string,
+by templates, or can be built as YACS nodes. These ways of building the
+post-processing can use all the SALOME possibilities. See the part describing
+:ref:`section_ref_output_variables`, or the help for each algorithm, for the
+full description of these elements.
+
+In practice, the YACS scheme has an "*algoResults*" output port of the
+computation bloc, which gives access to a structured object named hereafter
+"*ADD*" for example, containing all the calculation results. These results can
+be obtained by retrieving the named variables stored along the calculation. The
+main information is the "*Analysis*" one, which can be obtained by the Python
+command (for example in an in-line script node or a script provided through the
+"*UserPostAnalysis*" keyword)::
+
+ Analysis = ADD.get("Analysis")[:]
+
+"*Analysis*" is a complex object, similar to a list of values calculated at each
+step of data assimilation calculation. In order to get and print the optimal
+data assimilation state evaluation, in a script provided through the
+"*UserPostAnalysis*" keyword, one can use::
+
+ Xa = ADD.get("Analysis")[-1]
+ print("Optimal state:", Xa)
+ print()
+
+This ``Xa`` variable is a vector of values, which represents the solution of the
+data assimilation or optimization evaluation problem, noted as
+:math:`\mathbf{x}^a` in the section :ref:`section_theory`.
+
+Such a method can be used to print results, or to convert them into structures
+that can be used in the native or external SALOME post-processing. A simple
+example is given in the section :ref:`section_tutorials_in_salome`.
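+
+For instance, still in a script provided through the "*UserPostAnalysis*"
+keyword, one could gather the series of intermediate states in a NumPy array for
+further processing (a minimal sketch, assuming that the "*CurrentState*" variable
+has been stored during the calculation)::
+
+    import numpy
+    states = numpy.array(ADD.get("CurrentState")[:])  # one row per iteration
+    print("Number of stored states:", states.shape[0])
+    print("Last stored state:", states[-1])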
+
+.. _section_u_step5:
+
+STEP 5: Execute the YACS case and obtain the results
+++++++++++++++++++++++++++++++++++++++++++++++++++++
+
+The YACS scheme is now complete and can be executed. Parametrization and
+execution of this YACS case is fully compliant with the standard way to deal
+with a YACS scheme, as described in the *YACS module User's Guide*.
+
+To recall the simplest way to proceed, the YACS scheme has to be compiled using
+the button |yacs_compile|, or the equivalent YACS menu entry, to prepare the
+scheme to run. Then the compiled scheme can be started, executed step by step or
+using breakpoints, etc.
+
+The standard output will be pushed into the "*YACS Container Log*", obtained
+through the right click menu of the "*proc*" window in the YACS GUI. The errors
+are shown either in the "*YACS Container Log*", or at the command line in the
+terminal window (if SALOME has been launched by its explicit command, and not by
+a menu or a desktop icon). As an example, the output of the above simple case is
+of the following form::
+
+ Entering in the assimilation study
+ Name is set to........: Test
+ Algorithm is set to...: Blue
+ Launching the analysis
+
+ Optimal state: [0.5, 0.5, 0.5]
+
+as shown in the "*YACS Container Log*".
+
+The execution can also be done using a Shell script, as described in the section
+:ref:`section_advanced`.
+
+.. [#] For more information on YACS, see the *YACS module* and its integrated help available from the main menu *Help* of the SALOME platform.
+
+.. [#] For more information on the embedded case editor, see the *EFICAS module* and its integrated help available from the main menu *Help* of the SALOME platform.
+
+.. [#] The use of physical simulation code in the data assimilation elementary operators is illustrated or described in the following main parts.
+
+.. [#] This intermediary Python file can also be used as described in the section :ref:`section_advanced`.
--- /dev/null
+..
+ Copyright (C) 2008-2019 EDF R&D
+
+ This file is part of SALOME ADAO module.
+
+ This library is free software; you can redistribute it and/or
+ modify it under the terms of the GNU Lesser General Public
+ License as published by the Free Software Foundation; either
+ version 2.1 of the License, or (at your option) any later version.
+
+ This library is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ Lesser General Public License for more details.
+
+ You should have received a copy of the GNU Lesser General Public
+ License along with this library; if not, write to the Free Software
+ Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+
+ See http://www.salome-platform.org/ or email : webmaster.salome@opencascade.com
+
+ Author: Jean-Philippe Argaud, jean-philippe.argaud@edf.fr, EDF R&D
+
+.. index:: single: TabuSearch
+.. _section_ref_algorithm_TabuSearch:
+
+Calculation algorithm "*TabuSearch*"
+------------------------------------
+
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo01.rst
+
+This algorithm realizes an estimation of the state of a system by minimization
+of a cost function :math:`J` without gradient. It is a method that does not use
+the derivatives of the cost function. It falls in the same category as the
+:ref:`section_ref_algorithm_DerivativeFreeOptimization`, the
+:ref:`section_ref_algorithm_ParticleSwarmOptimization` or the
+:ref:`section_ref_algorithm_DifferentialEvolution`.
+
+This is an optimization method allowing for global minimum search of a general
+error function :math:`J` of type :math:`L^1`, :math:`L^2` or :math:`L^{\infty}`,
+with or without weights. The default error function is the augmented weighted
+least squares function, classically used in data assimilation.
+
+It works by iterative random exploration of the surroundings of the current
+point, to choose the state that minimizes the error function. To avoid
+returning to a point already explored, the algorithm's memory mechanism makes it
+possible to exclude (hence the name *tabu*) the return to the last explored
+states. Positions already explored are kept in a list of finite length.
+
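+For illustration only, independently of the actual ADAO implementation, a
+minimal Python sketch of such a tabu search on a generic cost function ``J``
+could look like::
+
+    import numpy
+
+    def tabu_search(J, x0, steps=50, tabu_length=50, half_range=0.03, trials=5):
+        "Minimal sketch: random neighbourhood exploration with a finite tabu memory"
+        x = numpy.asarray(x0, dtype=float)
+        x_best, j_best = x.copy(), J(x)
+        tabu = [tuple(x)]                              # finite list of visited states
+        for _ in range(steps):
+            # Uniform centred perturbations around the current point
+            candidates = [x + numpy.random.uniform(-half_range, half_range, x.size)
+                          for _ in range(trials)]
+            candidates = [c for c in candidates if tuple(c) not in tabu]
+            if not candidates:
+                continue
+            x = min(candidates, key=J)                 # best allowed neighbour
+            tabu = (tabu + [tuple(x)])[-tabu_length:]  # memory of finite length
+            if J(x) < j_best:
+                x_best, j_best = x.copy(), J(x)
+        return x_best
+
+    # Example: quadratic error around the state [1, 2, 3]
+    print(tabu_search(lambda v: float(sum((v - [1., 2., 3.])**2)), [0., 0., 0.]))
+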
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo02.rst
+
+.. include:: snippets/Background.rst
+
+.. include:: snippets/BackgroundError.rst
+
+.. include:: snippets/Observation.rst
+
+.. include:: snippets/ObservationError.rst
+
+.. include:: snippets/ObservationOperator.rst
+
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo03AdOp.rst
+
+.. include:: snippets/LengthOfTabuList.rst
+
+.. include:: snippets/MaximumNumberOfSteps_50.rst
+
+.. include:: snippets/NoiseAddingProbability.rst
+
+.. include:: snippets/NoiseDistribution.rst
+
+.. include:: snippets/NoiseHalfRange.rst
+
+.. include:: snippets/NumberOfElementaryPerturbations.rst
+
+.. include:: snippets/QualityCriterion.rst
+
+.. include:: snippets/SetSeed.rst
+
+.. include:: snippets/StandardDeviation.rst
+
+StoreSupplementaryCalculations
+ .. index:: single: StoreSupplementaryCalculations
+
+ This list indicates the names of the supplementary variables that can be
+ available at the end of the algorithm. It involves potentially costly
+ calculations or memory consumptions. The default is a void list, none of
+ these variables being calculated and stored by default. The possible names
+ are in the following list: [
+ "BMA",
+ "OMA",
+ "OMB",
+ "CurrentState",
+ "CostFunctionJ",
+ "CostFunctionJb",
+ "CostFunctionJo",
+ "Innovation",
+ "SimulatedObservationAtBackground",
+ "SimulatedObservationAtCurrentState",
+ "SimulatedObservationAtOptimum",
+ ].
+
+ Example :
+ ``{"StoreSupplementaryCalculations":["BMA", "Innovation"]}``
+
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo04.rst
+
+.. include:: snippets/Analysis.rst
+
+.. include:: snippets/CostFunctionJ.rst
+
+.. include:: snippets/CostFunctionJb.rst
+
+.. include:: snippets/CostFunctionJo.rst
+
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo05.rst
+
+.. include:: snippets/BMA.rst
+
+.. include:: snippets/OMA.rst
+
+.. include:: snippets/OMB.rst
+
+.. include:: snippets/CurrentState.rst
+
+.. include:: snippets/Innovation.rst
+
+.. include:: snippets/SimulatedObservationAtBackground.rst
+
+.. include:: snippets/SimulatedObservationAtCurrentState.rst
+
+.. include:: snippets/SimulatedObservationAtOptimum.rst
+
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo06.rst
+
+- :ref:`section_ref_algorithm_DerivativeFreeOptimization`
+- :ref:`section_ref_algorithm_DifferentialEvolution`
+- :ref:`section_ref_algorithm_ParticleSwarmOptimization`
+
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo07.rst
+
+- [Glover89]_
+- [Glover90]_
+- [WikipediaTS]_
Checking algorithm "*TangentTest*"
----------------------------------
-Description
-+++++++++++
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo01.rst
This algorithm allows one to check the quality of the tangent operator, by
calculating a residue with known theoretical properties.
One takes :math:`\mathbf{dx}_0=Normal(0,\mathbf{x})` and
:math:`\mathbf{dx}=\alpha*\mathbf{dx}_0`. :math:`F` is the calculation code.
-Optional and required commands
-++++++++++++++++++++++++++++++
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo02.rst
-The general required commands, available in the editing user interface, are the
-following:
+.. include:: snippets/CheckingPoint.rst
- .. include:: snippets/CheckingPoint.rst
+.. include:: snippets/ObservationOperator.rst
- .. include:: snippets/ObservationOperator.rst
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo03Chck.rst
-The general optional commands, available in the editing user interface, are
-indicated in :ref:`section_ref_assimilation_keywords`. Moreover, the parameters
-of the command "*AlgorithmParameters*" allow to choose the specific options,
-described hereafter, of the algorithm. See
-:ref:`section_ref_options_Algorithm_Parameters` for the good use of this
-command.
+.. include:: snippets/AmplitudeOfInitialDirection.rst
-The options of the algorithm are the following:
+.. include:: snippets/EpsilonMinimumExponent.rst
- .. include:: snippets/AmplitudeOfInitialDirection.rst
+.. include:: snippets/InitialDirection.rst
- .. include:: snippets/EpsilonMinimumExponent.rst
+.. include:: snippets/SetSeed.rst
- .. include:: snippets/InitialDirection.rst
+StoreSupplementaryCalculations
+ .. index:: single: StoreSupplementaryCalculations
- .. include:: snippets/SetSeed.rst
+ This list indicates the names of the supplementary variables that can be
+ available at the end of the algorithm. It involves potentially costly
+ calculations or memory consumptions. The default is a void list, none of
+ these variables being calculated and stored by default. The possible names
+ are in the following list: [
+ "CurrentState",
+ "Residu",
+ "SimulatedObservationAtCurrentState",
+ ].
- StoreSupplementaryCalculations
- .. index:: single: StoreSupplementaryCalculations
+ Example :
+ ``{"StoreSupplementaryCalculations":["CurrentState"]}``
- This list indicates the names of the supplementary variables that can be
- available at the end of the algorithm. It involves potentially costly
- calculations or memory consumptions. The default is a void list, none of
- these variables being calculated and stored by default. The possible names
- are in the following list: ["CurrentState", "Residu",
- "SimulatedObservationAtCurrentState"].
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo04.rst
- Example :
- ``{"StoreSupplementaryCalculations":["CurrentState"]}``
+.. include:: snippets/Residu.rst
-Information and variables available at the end of the algorithm
-+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo05.rst
-At the output, after executing the algorithm, there are variables and
-information originating from the calculation. The description of
-:ref:`section_ref_output_variables` show the way to obtain them by the method
-named ``get`` of the variable "*ADD*" of the post-processing. The input
-variables, available to the user at the output in order to facilitate the
-writing of post-processing procedures, are described in the
-:ref:`subsection_r_o_v_Inventaire`.
+.. include:: snippets/CurrentState.rst
-The unconditional outputs of the algorithm are the following:
+.. include:: snippets/SimulatedObservationAtCurrentState.rst
- .. include:: snippets/Residu.rst
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo06.rst
-The conditional outputs of the algorithm are the following:
-
- .. include:: snippets/CurrentState.rst
-
- .. include:: snippets/SimulatedObservationAtCurrentState.rst
-
-See also
-++++++++
-
-References to other sections:
- - :ref:`section_ref_algorithm_FunctionTest`
- - :ref:`section_ref_algorithm_LinearityTest`
- - :ref:`section_ref_algorithm_AdjointTest`
- - :ref:`section_ref_algorithm_GradientTest`
+- :ref:`section_ref_algorithm_FunctionTest`
+- :ref:`section_ref_algorithm_LinearityTest`
+- :ref:`section_ref_algorithm_AdjointTest`
+- :ref:`section_ref_algorithm_GradientTest`
+- :ref:`section_ref_algorithm_LocalSensitivityTest`
Calculation algorithm "*UnscentedKalmanFilter*"
-----------------------------------------------
-Description
-+++++++++++
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo01.rst
This algorithm realizes an estimation of the state of a dynamic system by an
"unscented" Kalman Filter, avoiding having to perform the tangent and adjoint
operators to evaluate it on small systems. One can verify the linearity of the
operators with the help of the :ref:`section_ref_algorithm_LinearityTest`.
-Optional and required commands
-++++++++++++++++++++++++++++++
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo02.rst
-The general required commands, available in the editing user interface, are the
-following:
+.. include:: snippets/Background.rst
- .. include:: snippets/Background.rst
+.. include:: snippets/BackgroundError.rst
- .. include:: snippets/BackgroundError.rst
+.. include:: snippets/EvolutionError.rst
- .. include:: snippets/EvolutionError.rst
+.. include:: snippets/EvolutionModel.rst
- .. include:: snippets/EvolutionModel.rst
+.. include:: snippets/Observation.rst
- .. include:: snippets/Observation.rst
+.. include:: snippets/ObservationError.rst
- .. include:: snippets/ObservationError.rst
+.. include:: snippets/ObservationOperator.rst
- .. include:: snippets/ObservationOperator.rst
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo03AdOp.rst
-The general optional commands, available in the editing user interface, are
-indicated in :ref:`section_ref_assimilation_keywords`. Moreover, the parameters
-of the command "*AlgorithmParameters*" allows to choose the specific options,
-described hereafter, of the algorithm. See
-:ref:`section_ref_options_Algorithm_Parameters` for the good use of this
-command.
+.. include:: snippets/BoundsWithExtremes.rst
-The options of the algorithm are the following:
+.. include:: snippets/ConstrainedBy.rst
- .. include:: snippets/BoundsWithExtremes.rst
+.. include:: snippets/EstimationOf.rst
- .. include:: snippets/ConstrainedBy.rst
+.. include:: snippets/AlphaBeta.rst
- .. include:: snippets/EstimationOf.rst
+StoreSupplementaryCalculations
+ .. index:: single: StoreSupplementaryCalculations
- Alpha, Beta, Kappa, Reconditioner
- .. index:: single: Alpha
- .. index:: single: Beta
- .. index:: single: Kappa
- .. index:: single: Reconditioner
+ This list indicates the names of the supplementary variables that can be
+ available at the end of the algorithm. It involves potentially costly
+ calculations or memory consumptions. The default is a void list, none of
+ these variables being calculated and stored by default. The possible names
+ are in the following list: [
+ "APosterioriCorrelations",
+ "APosterioriCovariance",
+ "APosterioriStandardDeviations",
+ "APosterioriVariances",
+ "BMA",
+ "CostFunctionJ",
+ "CostFunctionJb",
+ "CostFunctionJo",
+ "CurrentState",
+ "Innovation",
+ ].
- These keys are internal scaling parameters. "Alpha" requires a value between
- 1.e-4 and 1. "Beta" has an optimal value of 2 for Gaussian *a priori*
- distribution. "Kappa" requires an integer value, and the right default is
- obtained by setting it to 0. "Reconditioner" requires a value between 1.e-3
- and 10, it defaults to 1.
+ Example :
+ ``{"StoreSupplementaryCalculations":["BMA", "Innovation"]}``
- Example :
- ``{"Alpha":1,"Beta":2,"Kappa":0,"Reconditioner":1}``
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo04.rst
- StoreSupplementaryCalculations
- .. index:: single: StoreSupplementaryCalculations
+.. include:: snippets/Analysis.rst
- This list indicates the names of the supplementary variables that can be
- available at the end of the algorithm. It involves potentially costly
- calculations or memory consumptions. The default is a void list, none of
- these variables being calculated and stored by default. The possible names
- are in the following list: ["APosterioriCorrelations",
- "APosterioriCovariance", "APosterioriStandardDeviations",
- "APosterioriVariances", "BMA", "CostFunctionJ", "CostFunctionJb",
- "CostFunctionJo", "CurrentState", "Innovation"].
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo05.rst
- Example :
- ``{"StoreSupplementaryCalculations":["BMA", "Innovation"]}``
+.. include:: snippets/APosterioriCorrelations.rst
-Information and variables available at the end of the algorithm
-+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+.. include:: snippets/APosterioriCovariance.rst
-At the output, after executing the algorithm, there are variables and
-information originating from the calculation. The description of
-:ref:`section_ref_output_variables` show the way to obtain them by the method
-named ``get`` of the variable "*ADD*" of the post-processing. The input
-variables, available to the user at the output in order to facilitate the
-writing of post-processing procedures, are described in the
-:ref:`subsection_r_o_v_Inventaire`.
+.. include:: snippets/APosterioriStandardDeviations.rst
-The unconditional outputs of the algorithm are the following:
+.. include:: snippets/APosterioriVariances.rst
- .. include:: snippets/Analysis.rst
+.. include:: snippets/BMA.rst
-The conditional outputs of the algorithm are the following:
+.. include:: snippets/CostFunctionJ.rst
- .. include:: snippets/APosterioriCorrelations.rst
+.. include:: snippets/CostFunctionJb.rst
- .. include:: snippets/APosterioriCovariance.rst
+.. include:: snippets/CostFunctionJo.rst
- .. include:: snippets/APosterioriStandardDeviations.rst
+.. include:: snippets/CurrentState.rst
- .. include:: snippets/APosterioriVariances.rst
+.. include:: snippets/Innovation.rst
- .. include:: snippets/BMA.rst
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo06.rst
- .. include:: snippets/CostFunctionJ.rst
+- :ref:`section_ref_algorithm_KalmanFilter`
+- :ref:`section_ref_algorithm_ExtendedKalmanFilter`
+- :ref:`section_ref_algorithm_EnsembleKalmanFilter`
- .. include:: snippets/CostFunctionJb.rst
+.. ------------------------------------ ..
+.. include:: snippets/Header2Algo07.rst
- .. include:: snippets/CostFunctionJo.rst
-
- .. include:: snippets/CurrentState.rst
-
- .. include:: snippets/Innovation.rst
-
-See also
-++++++++
-
-References to other sections:
- - :ref:`section_ref_algorithm_KalmanFilter`
- - :ref:`section_ref_algorithm_ExtendedKalmanFilter`
-
-Bibliographical references:
- - [WikipediaUKF]_
+- [WikipediaUKF]_
--- /dev/null
+.. index:: single: Alpha
+.. index:: single: Beta
+.. index:: single: Kappa
+.. index:: single: Reconditioner
+
+Alpha, Beta, Kappa, Reconditioner
+ These keys are internal scaling parameters. "Alpha" requires a value between
+ 1.e-4 and 1. "Beta" has an optimal value of 2 for a Gaussian *a priori*
+ distribution. "Kappa" requires an integer value, and the right default is
+ obtained by setting it to 0. "Reconditioner" requires a value between 1.e-3
+ and 10, and it defaults to 1.
+
+ Example :
+ ``{"Alpha":1,"Beta":2,"Kappa":0,"Reconditioner":1}``
CrossOverProbability_CR
This key is used to define the probability of recombination or crossover
during the differential evolution. This variable is usually noted as ``CR``
- in the literature. The default value is 0.7, and it is recommended to change
- it if necessary.
+ in the literature, and it is required to be between 0 and 1. The default
+ value is 0.7, and it is recommended to change it if necessary.
Example:
``{"CrossOverProbability_CR":0.7}``
--- /dev/null
+.. index:: single: LengthOfTabuList
+
+LengthOfTabuList
+ This key indicates the length of the tabu list, that is, the maximum number
+ of previously executed perturbations kept on record. The default is 50, and
+ it is recommended to adapt it to the needs of real problems.
+
+ Example :
+ ``{"LengthOfTabuList":50}``
--- /dev/null
+.. index:: single: NoiseAddingProbability
+
+NoiseAddingProbability
+ This key indicates the probability of perturbing a state component. It is a
+ mandatory value between 0 and 1. The default is 1, and it is not recommended
+ to change it.
+
+ Example :
+ ``{"NoiseAddingProbability":1.}``
--- /dev/null
+.. index:: single: NoiseDistribution
+
+NoiseDistribution
+ This key indicates the type of the distribution used to generate the state
+ perturbations. This distribution can be of "Uniform" or "Gaussian" type. The
+ default is a distribution of "Uniform" type, and it is recommended to adapt
+ it to the needs of real problems.
+
+ Example :
+ ``{"NoiseDistribution":"Uniform"}``
--- /dev/null
+.. index:: single: NoiseHalfRange
+
+NoiseHalfRange
+ This key indicates, only in the case of a "Uniform" distribution type asked
+ through the keyword "*NoiseDistribution*", the half-amplitude of the uniform
+ centred state perturbations for each component of the state. The default is
+ an empty list, so this key must be filled in when a "Uniform" distribution
+ is used. A simple way to do this is to give a list of the length of the
+ desired state with identical half-amplitudes, as in the example below with
+ half-amplitudes of 3%. It is recommended to take half-amplitudes of a few
+ percent at most.
+
+ Example :
+ ``{"NoiseHalfRange":<longueur de l'état>*[0.03]}``
--- /dev/null
+.. index:: single: NumberOfElementaryPerturbations
+
+NumberOfElementaryPerturbations
+ This key indicates the number of elementary perturbations that will be
+ performed to select a complete state perturbation. The default is 1, and it
+ is recommended to adapt it carefully to the needs of real problems, without
+ choosing too many elementary perturbations.
+
+ Example :
+ ``{"NumberOfElementaryPerturbations":1}``
+.. index:: single: Observer
.. index:: single: Observers
+.. index:: single: Observer Template
Observers
*List of functions linked to variables*. This command allows to set internal
--- /dev/null
+.. index:: single: SampleAsExplicitHyperCube
+
+SampleAsExplicitHyperCube
+ This key describes the calculation points as a hyper-cube, from a given
+ list of explicit samplings of each variable. It is then a list of lists,
+ each of them potentially of a different size.
+
+ Example : ``{"SampleAsExplicitHyperCube":[[0.,0.25,0.5,0.75,1.], [-2,2,1]]}`` for a state space of dimension 2
--- /dev/null
+.. index:: single: SampleAsIndependantRandomVariables
+
+SampleAsIndependantRandomVariables
+ This key describes the calculation points as a hyper-cube, for which the
+ points on each axis come from an independent random sampling of the axis
+ variable, under the specification of the distribution, its parameters and
+ the number of points in the sample, as a list ``['distribution',
+ [parameters], number]`` for each axis. The possible distributions are
+ 'normal' with parameters (mean,std), 'lognormal' with parameters
+ (mean,sigma), 'uniform' with parameters (low,high), or 'weibull' with
+ parameter (shape). It is then a list of the same size as the state.
+
+ Example :
+ ``{"SampleAsIndependantRandomVariables":[ ['normal',[0.,1.],3], ['uniform',[-2,2],4]]`` for a state space of dimension 2
--- /dev/null
+.. index:: single: SampleAsMinMaxStepHyperCube
+
+SampleAsMinMaxStepHyperCube
+ This key describes the calculation points as a hyper-cube, from a given
+ list of implicit samplings of each variable by a triplet *[min,max,step]*.
+ It is then a list of the same size as the state. The bounds are included.
+
+ Example :
+ ``{"SampleAsMinMaxStepHyperCube":[[0.,1.,0.25],[-1,3,1]]}`` for a state space of dimension 2
--- /dev/null
+.. index:: single: SampleAsnUplet
+
+SampleAsnUplet
+ This key describes the calculations points as a list of n-uplets, each
+ n-uplet being a state.
+
+ Example :
+ ``{"SampleAsnUplet":[[0,1,2,3],[4,3,2,1],[-2,3,-4,5]]}`` for 3 points in a state space of dimension 4
--- /dev/null
+.. index:: single: StandardDeviation
+
+StandardDeviation
+ This key indicates, only in the case of a "Gaussian" distribution type asked
+ through the keyword "*NoiseDistribution*", the standard deviation of the
+ state Gaussian perturbations for each state component. The default is an
+ empty list, so this key must be filled in when a "Gaussian" distribution is
+ used. A simple way to do this is to give a list of the length of the desired
+ state with identical standard deviations, as in the example below with
+ standard deviations of 5%. It is recommended to take standard deviations of
+ a few percent at most.
+
+ Example :
+ ``{"StandardDeviation":<longueur de l'état>*[0.05]}``
.. _section_tui:
================================================================================
-**[DocR]** Textual Application Programming Interface for the user (API/TUI)
+**[DocR]** Textual User Interface for ADAO (TUI/API)
================================================================================
This section presents advanced usage of the ADAO module using its text
**setObserver** (*Variable, Template, String, Script, Info*)
This command allows to set an *observer* on the current or final
calculation variable. Reference should be made to the description of the
- ':ref:`ref_observers_requirements` for their list and content, and to the
- :ref:`section_reference` to know what are the observable quantities. One
- defines as "*String*" the *observer* body, using a string including if
+ :ref:`section_ref_observers_requirements` for their list and content, and
+ to the :ref:`section_reference` to know what are the observable quantities.
+ One defines as "*String*" the *observer* body, using a string including if
necessary line breaks. It is recommended to use the patterns available by
the argument "*Template*". In the case of a definition as "*Script*", the
file must contain only the body of the function, as described in the
- :ref:`ref_observers_requirements`. The "*Info*" variable contains an
- information string or can be void.
+ :ref:`section_ref_observers_requirements`. The "*Info*" variable contains
+ an information string or can be void.
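+
+ As an indicative sketch, an *observer* based on one of the available
+ patterns can for instance be attached to a computed variable as follows
+ (the template name "ValuePrinter" is assumed here for illustration)::
+
+     case.setObserver( Variable = "Analysis", Template = "ValuePrinter" )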
Perform the calculation
+++++++++++++++++++++++
+++ /dev/null
-..
- Copyright (C) 2008-2019 EDF R&D
-
- This file is part of SALOME ADAO module.
-
- This library is free software; you can redistribute it and/or
- modify it under the terms of the GNU Lesser General Public
- License as published by the Free Software Foundation; either
- version 2.1 of the License, or (at your option) any later version.
-
- This library is distributed in the hope that it will be useful,
- but WITHOUT ANY WARRANTY; without even the implied warranty of
- MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- Lesser General Public License for more details.
-
- You should have received a copy of the GNU Lesser General Public
- License along with this library; if not, write to the Free Software
- Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
-
- See http://www.salome-platform.org/ or email : webmaster.salome@opencascade.com
-
- Author: Jean-Philippe Argaud, jean-philippe.argaud@edf.fr, EDF R&D
-
-.. _section_using:
-
-================================================================================
-**[DocU]** Using the ADAO module
-================================================================================
-
-.. |eficas_new| image:: images/eficas_new.png
- :align: middle
- :scale: 50%
-.. |eficas_save| image:: images/eficas_save.png
- :align: middle
- :scale: 50%
-.. |eficas_saveas| image:: images/eficas_saveas.png
- :align: middle
- :scale: 50%
-.. |eficas_yacs| image:: images/eficas_yacs.png
- :align: middle
- :scale: 50%
-.. |yacs_compile| image:: images/yacs_compile.png
- :align: middle
- :scale: 50%
-
-This section presents the usage of the ADAO module in SALOME platform. Here we
-describe the general progression to establish an ADAO case, the details being
-given in the following chapters. It is completed by the detailed description of
-all the commands and keywords in the section :ref:`section_reference`, by
-advanced usage procedures in the section :ref:`section_advanced`, and by
-examples in the section :ref:`section_examples`.
-
-Logical procedure to build an ADAO case
----------------------------------------
-
-The construction of an ADAO case follows a simple approach to define the set of
-input data, and then generates a complete executable block diagram used in YACS
-[#]_. Many variations exist for the definition of input data, but the logical
-sequence remains unchanged.
-
-First of all, the user is considered to know its personal input data needed to
-set up the data assimilation study, following :ref:`section_methodology`. These
-data can already be available in SALOME or not.
-
-Basically, the procedure of using ADAO involves the following steps:
-
- - :ref:`section_u_step1`
- - :ref:`section_u_step2`
- - :ref:`section_u_step3`
- - :ref:`section_u_step4`
- - :ref:`section_u_step5`
-
-Each step will be detailed in the next section.
-
-Detailed procedure to build an ADAO case
-----------------------------------------
-
-.. _section_u_step1:
-
-STEP 1: Activate the ADAO module and use the editor GUI
-+++++++++++++++++++++++++++++++++++++++++++++++++++++++
-
-As always for a module, it has to be activated by choosing the appropriate
-module button (or the menu) in the toolbar of SALOME. If there is no SALOME
-study loaded, a popup appears, allowing to choose between creating a new study,
-or opening an already existing one:
-
- .. _adao_activate1:
- .. image:: images/adao_activate.png
- :align: center
- .. centered::
- **Activating the module ADAO in SALOME**
-
-Choosing the "*New*" button, an embedded case editor [#]_ will be opened, along
-with the standard "*Object browser*". You can then click on the "*New*" button
-|eficas_new| (or choose the "*New*" entry in the "*ADAO*" main menu) to create a
-new ADAO case, and you will see:
-
- .. _adao_viewer:
- .. image:: images/adao_viewer.png
- :align: center
- :width: 100%
- .. centered::
- **The embedded editor for cases definition in module ADAO**
-
-.. _section_u_step2:
-
-STEP 2: Build and modify the ADAO case, and save it
-+++++++++++++++++++++++++++++++++++++++++++++++++++
-
-To build a case using the embedded editor, you have to go through a series of
-sub-steps, by selecting, at each sub-step, a keyword and then filling in its
-value. It is noted that it is in this step that is needed, among other things,
-to define the call to the simulation code used in observation or evolution
-operators describing the problem [#]_.
-
-The structured editor indicates hierarchical types, values or keywords allowed.
-Incomplete or incorrect keywords are identified by a visual error red flag.
-Possible values are indicated for keywords defined with a limited list of
-values, and adapted entries are given for the other keywords. Some help messages
-are contextually provided in the editor reserved places.
-
-A new case is set up with the minimal list of commands. All the mandatory
-commands or keywords are already present, none of them can be suppressed.
-Optional keywords can be added by choosing them in a list of suggestions of
-allowed ones for the main command, for example the "*ASSIMILATION_STUDY*"
-command. As an example, one can add parameters in the "*AlgorithmParameters*"
-keyword, as described in the last part of the section :ref:`section_examples`.
-
-At the end, when all fields or keywords have been correctly defined, each line
-of the commands tree must have a green flag. This indicates that the whole case
-is valid and completed (and can be saved).
-
- .. _adao_jdcexample00:
- .. image:: images/adao_jdcexample01.png
- :align: center
- :scale: 75%
- .. centered::
- **Example of a valid ADAO case**
-
-Finally, you have to save your ADAO case by pushing the "*Save*" button
-|eficas_save|, or the "*Save as*" button |eficas_saveas|, or by choosing the
-"*Save/Save as*" entry in the "*ADAO*" menu. You will be prompted for a location
-in your file tree and a name, that will be completed by a "*.comm*" extension
-used for the embedded case editor. This will generate a pair of files describing
-the ADAO case, with the same base name, the first one being completed by a
-"*.comm*" extension and the second one by a "*.py*" extension [#]_.
-
-.. _section_u_step3:
-
-STEP 3: Export the ADAO case as a YACS scheme
-+++++++++++++++++++++++++++++++++++++++++++++
-
-When the ADAO case is completed, you have to export it as a YACS scheme in order
-to execute the data assimilation calculation. This can be easily done by using
-the "*Export to YACS*" button |eficas_yacs|, or equivalently choose the "*Export
-to YACS*" entry in the "*ADAO*" main menu, or in the contextual case menu in the
-SALOME object browser.
-
- .. _adao_exporttoyacs01:
- .. image:: images/adao_exporttoyacs.png
- :align: center
- :scale: 75%
- .. centered::
- **"Export to YACS" sub-menu to generate the YACS scheme from the ADAO case**
-
-This will lead to automatically generate a YACS scheme, and open the YACS module
-on this scheme. The YACS file, associated with the scheme, will be stored in the
-same directory and with the same base name as the ADAO saved case, only changing
-its extension to "*.xml*". Be careful, *if the XML file name already exist, the
-file will be overwritten without prompting for replacing the XML file*.
-
-.. _section_u_step4:
-
-STEP 4: Supplement and modify the YACS scheme, and save it
-++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
-
-.. index:: single: Analysis
-
-When the YACS scheme is generated and opened in SALOME through the YACS module
-GUI, you can modify or supplement the scheme like any standard YACS scheme.
-Nodes or blocs can be added, copied or modified to elaborate complex analysis,
-or to insert data assimilation or optimization capabilities into more complex
-YACS calculation schemes. It is recommended to save the modified scheme with a
-new name, in order to preserve the XML file in the case you re-export the ADAO
-case to YACS.
-
-The main supplement needed in the YACS scheme is a post-processing step. The
-evaluation of the results has to be done in the physical context of the
-simulation used by the data assimilation procedure. The post-processing can be
-provided through the "*UserPostAnalysis*" ADAO keyword as a script or a string,
-by templates, or can be build as YACS nodes. These two ways of building the
-post-processing can use all the SALOME possibilities. See the part describing
-:ref:`section_ref_output_variables`, or the help for each algorithm, for the
-full description of these elements.
-
-In practice, the YACS scheme has an "*algoResults*" output port of the
-computation bloc, which gives access to a structured object named hereafter
-"*ADD*" for example, containing all the calculation results. These results can
-be obtained by retrieving the named variables stored along the calculation. The
-main information is the "*Analysis*" one, that can be obtained by the python
-command (for example in an in-line script node or a script provided through the
-"*UserPostAnalysis*" keyword)::
-
- Analysis = ADD.get("Analysis")[:]
-
-"*Analysis*" is a complex object, similar to a list of values calculated at each
-step of data assimilation calculation. In order to get and print the optimal
-data assimilation state evaluation, in a script provided through the
-"*UserPostAnalysis*" keyword, one can use::
-
- Xa = ADD.get("Analysis")[-1]
- print("Optimal state:", Xa)
- print()
-
-This ``Xa`` variable is a vector of values, that represents the solution of the
-data assimilation or optimization evaluation problem, noted as
-:math:`\mathbf{x}^a` in the section :ref:`section_theory`.
-
-Such method can be used to print results, or to convert these ones to
-structures that can be used in the native or external SALOME post-processing. A
-simple example is given in the section :ref:`section_examples`.
-
-.. _section_u_step5:
-
-STEP 5: Execute the YACS case and obtain the results
-++++++++++++++++++++++++++++++++++++++++++++++++++++
-
-The YACS scheme is now complete and can be executed. Parametrization and
-execution of this YACS case is fully compliant with the standard way to deal
-with a YACS scheme, as described in the *YACS module User's Guide*.
-
-To recall the simplest way to proceed, the YACS scheme has to be compiled using
-the button |yacs_compile|, or the equivalent YACS menu entry, to prepare the
-scheme to run. Then the compiled scheme can be started, executed step by step or
-using breakpoints, etc.
-
-The standard output will be pushed into the "*YACS Container Log*", obtained
-through the right click menu of the "*proc*" window in the YACS GUI. The errors
-are shown either in the "*YACS Container Log*", or at the command line in the
-terminal window (if SALOME has been launched by its explicit command, and not by
-a menu or a desktop icon). As an example, the output of the above simple case is
-of the following form::
-
- Entering in the assimilation study
- Name is set to........: Test
- Algorithm is set to...: Blue
- Launching the analysis
-
- Optimal state: [0.5, 0.5, 0.5]
-
-shown in the "*YACS Container Log*".
-
-The execution can also be done using a Shell script, as described in the section
-:ref:`section_advanced`.
-
-.. [#] For more information on YACS, see the *YACS module* and its integrated help available from the main menu *Help* of the SALOME platform.
-
-.. [#] For more information on the embedded case editor, see the *EFICAS module* and its integrated help available from the main menu *Help* of the SALOME platform.
-
-.. [#] The use of physical simulation code in the data assimilation elementary operators is illustrated or described in the following main parts.
-
-.. [#] This intermediary python file can also be used as described in the section :ref:`section_advanced`.