\page ghs3dprl_hypo_page MG-Tetra Parallel Parameters hypothesis

\n MG-Tetra Parallel Parameters hypothesis works only with the <b>MG-Tetra</b>
meshing algorithm, which uses the <b>MG-Tetra-hpc</b> code (formerly tepal),
the parallel implementation of the MG-Tetra (formerly TetMesh-GHS3D) algorithm.
This algorithm is commercial DISTENE software; its use requires a license.
See http://www.distene.com and http://www.meshgems.com/volume-meshing-meshgems-tetra.html.
\n MG-Tetra-hpc (in fact Tepal V3) makes it possible to generate a partitioned
mesh with more than 200 million tetrahedrons on computers using MPI.
The launch of this version is described below.
\n It is a serious alternative to MG-Tetra, which requires a much less common
configuration with 64 GB of RAM just to attempt partitioning a mesh with
200 million tetrahedrons, with no guaranteed result (as of 2010).
\note The plug-in does not load the supposedly large resulting meshes into memory.
The meshes are saved in MED files and can be imported from the user-defined location via the menu File - Import - MED Files.
\n Note that the Salome GUI needs 2 GB of RAM to load a MED
file with 5 million tetrahedrons.
\image html ghs3dprl_parameters_basic.png
<b>Name</b> - allows to define the name of the hypothesis ("MG-Tetra Parallel Parameters" by default).
<b>MED Name</b> - allows to define the path and the prefix of the
resulting MED files ("DOMAIN" by default).
If the path is not defined, the environment variable $SALOME_TMP_DIR
is used. If $SALOME_TMP_DIR is not defined either, the environment
variable $TMP is used.
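The directory fallback described above can be sketched in Python (resolve_med_dir is a hypothetical helper written for illustration, not part of the plug-in API):

```python
import os

def resolve_med_dir(med_name):
    # Sketch of the documented fallback: use the explicit path if the
    # MED Name contains one, else $SALOME_TMP_DIR, else $TMP.
    head, prefix = os.path.split(med_name)
    if head:
        return head
    return os.environ.get("SALOME_TMP_DIR") or os.environ.get("TMP", "")
```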
<b>Nb Partitions</b> - allows to define the number of generated MED files.
The initial skin (triangles) is meshed (tetrahedrons) and partitioned
into Nb_Part parts by the elementary algorithm implemented in Tepal.<br>
<b>Keep Files</b> - if this box is checked, the input files of MG-Tetra-hpc
(GHS3DPRL.points and GHS3DPRL.faces) are not deleted after use (provided the
background mode was not used).
<b>Tetra_hpc in Background</b> - if this box is checked, the MG-Tetra-hpc execution
and the MED file generation are launched in background mode, and the user
can even exit Salome. Note that in this case the MG-Tetra-hpc algorithm runs
independently of "killSalome.py", and sometimes on another host.
<b>Merge subdomains</b> - if this box is checked, the sub-domains are merged
into one mesh and the output .mesh(b) file is written.
<b>Tag subdomains</b> - if this box is checked, the parallel sub-domain
index is used as a tag in the merged output mesh (used in combination with the
<b>Merge subdomains</b> option).
<b>Output interfaces</b> - if this box is checked, the parallel
sub-domain interface triangles are written into the merged output mesh (used in
combination with the <b>Merge subdomains</b> option).
<b>Discard subdomains</b> - if this box is checked, the parallel sub-domain
information (mesh, global numbering and interfaces) is discarded.
<h1>Modifying MG-Tetra-hpc Advanced Parameters</h1><br>
The MG-Tetra Parallel plug-in launches a standalone binary
executable, <b>tetrahpc2med</b>.<br>
tetrahpc2med launches MG-Tetra-hpc, waits for the end of the computation, and
converts the resulting output files into MED files.<br>
Some advanced optional parameters are accessible as arguments.<br>

If the <b>Keep Files</b> option is checked, it is possible to re-launch
\a tetrahpc2med or MG-Tetra-hpc in a terminal as a command with
custom parameters.<br>
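Such a manual re-launch can also be scripted. Below is a minimal sketch using Python's subprocess module; the flag values are only an illustration (the flags themselves are documented by tetrahpc2med --help), and the executable must be on $PATH with the kept input files still on disk:

```python
import subprocess

# Re-launch tetrahpc2med by hand with custom parameters (illustrative values).
cmd = ["tetrahpc2med",
       "--casename=/tmp/GHS3DPRL",
       "--number=2",
       "--medname=DOMAIN",
       "--verbose=6"]
# subprocess.call(cmd)  # uncomment on a machine where the plug-in is installed
print(" ".join(cmd))
```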
<b>Advanced tetrahpc2med Parameters</b> - type <b>tetrahpc2med --help</b> in the terminal. <p>
myname@myhost > /export/home/myname/salome_7/GHS3DPRLPLUGIN/bin/salome/tetrahpc2med --help
tetrahpc2med V3.0 (MED3+tetra-hpc) Available options:
  --help : produces this help message
  --casename : path and name of input tetrahpc2med files which are
      - output files of GHS3DPRL_Plugin .mesh
      - output file of GHS3DPRL_Plugin casename_skin.med (optional)
        with initial skin and its initial groups
  --number : number of partitions
  --medname : path and name of output MED files
  --limitswap : max size of working cpu memory (Mo) (before swapping on .temp files)
  --verbose : trace of execution (0->6)
  --test : more tests about joints, before generation of output files
  --menu : a GUI menu for option number
  --launchtetra : also launch tetra-hpc on files casename.mesh and option number
  --merge_subdomains : merge the subdomains into one mesh and write the output .mesh(b) file
  --tag_subdomains : use the parallel subdomain index as tag into the merged output mesh
      to identify the parallel subdomains (used in combination with the merge_subdomains option)
  --output_interfaces : write the parallel subdomains interface triangles into the merged output mesh
      (used in combination with the merge_subdomains option)
  --discard_subdomains : discard the parallel subdomains informations output (mesh, global numbering and interfaces)
  --background : force background mode from launch tetra-hpc and generation of final MED files (big meshes)
  --deletegroups : regular expression (see QRegExp) which matches unwanted groups in final MED files
      (try --deletegroups="(\bJOINT)")
      (try --deletegroups="(\bAll_Nodes|\bAll_Faces)")
      (try --deletegroups="((\bAll_|\bNew_)(N|F|T))")

tetrahpc2med --casename=/tmp/GHS3DPRL --number=2 --medname=DOMAIN --limitswap=1000 --verbose=0 --test=yes --menu=no --launchtetra=no
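The --deletegroups patterns are QRegExp expressions; \b is a word boundary, with the same meaning as in Python's re module. A rough Python sketch (the group names here are made up for illustration) shows which names such a pattern would remove:

```python
import re

# \b behaves the same way in QRegExp and in Python's re: this pattern
# matches group names beginning with All_Nodes or All_Faces.
pattern = re.compile(r"(\bAll_Nodes|\bAll_Faces)")
groups = ["All_Nodes", "All_Faces", "JOINT_1_2", "My_All_Faces"]
deleted = [g for g in groups if pattern.search(g)]
print(deleted)  # ['All_Nodes', 'All_Faces']
```

Note that "My_All_Faces" is not matched: the underscore before "All" is a word character, so there is no word boundary there.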
<b>Advanced tetra_hpc parameters (2014)</b> <p>
Usage: tetra_hpc.exe [options]

Short option (if it exists)

  --in <input mesh file name>

  --out <output mesh file name>

  --merge_subdomains <merge>
    Describes whether to merge the subdomains into one mesh and write the
    output .mesh(b) file or not.
    yes : the subdomains will be merged into one mesh and written to
          the output file,
    no  : the subdomains will not be merged.

  --tag_subdomains <tag>
    Describes whether to use the parallel subdomain index as tag into the
    merged output mesh or not (used in combination with the
    merge_subdomains option).
    yes : the tags of the tetrahedra in the merged output will
          identify the parallel subdomains,
    no  : the tag will keep its standard meaning of volume domain.

  --output_interfaces <output_interfaces>
    Describes whether to write the parallel subdomains interface
    triangles into the merged output mesh or not (used in combination
    with the merge_subdomains option).
    if <output_interfaces> is
    yes : the parallel subdomains interface triangles will be written
          into the merged output mesh,
    no  : they will not be added to the merged output mesh.

    Set the verbosity level, increasing from 0 to 10.
    <verbose> values are increasing from 0 to 10 :

  --discard_subdomains <discard>
    Describes whether to discard the parallel subdomains (mesh, global
    numbering and interfaces) or not.
    yes : the subdomain informations (mesh, global numbering and
          interfaces) will be discarded,
    no  : they will be written to disk as output.
<h1>Saving user's preferred MG-Tetra Parallel Advanced Parameters</h1><br>
The MG-Tetra Parallel plug-in launches the standalone binary executable tetrahpc2med.<br>
You may rename the file tetrahpc2med as tetrahpc2med.exe, for example, and replace
tetrahpc2med by a shell script of your choosing to override parameters
<br>... or else modify $PATH... .<br>

<b>Advanced tetrahpc2med Parameters</b> - overriding the deletegroups parameter<p>
You may rename tetrahpc2med as tetrahpc2med.exe, for example.
#script tetrahpc2med overriding the deletegroups parameter
#we have renamed the binary executable tetrahpc2med as tetrahpc2med.exe
#echo tetrahpc2med initial parameters are $1 $2 $3 $4 ... or $*
tetrahpc2med.exe "$@" --deletegroups="(\bAll_Nodes|\bAll_Faces)"
<h1>tetra_hpc and MPI use</h1><br>
This 2014 beta-version needs MPI (openmpi was used). To use it, proceed as below.
<b>Obsolete example tepal_v2_mpirun.</b><p>
#script tepal overriding the launch of Tepal_V2.0 with MPI (tepal run 64 bits only).
#we have renamed the binary executable tepal as tepal64_v2.exe.
#typical command to launch tepal v1:
#tepal -f /tmp/myname/GHS3DPRL -n 16 > /tmp/myname/tepal.log
#this file is an example of transforming this call for tepal v2.0
# (beta version using a .mesh input file);
#you have to adapt it to your needs.

#the first problem is to convert the v1 input files GHS3DPRL.faces and GHS3DPRL.points
# to the v2 input file GHS3DPRL.mesh.
#the second problem is to launch on a heterogeneous linux cluster of
# 2 hosts (64 bits) of 8 nodes (for example)
# with 2 different executables linked against 2 different
# openmpi shared libraries.
#the third problem is to convert the tepal v2 output files GHS3DPRL*.mesh
# to v1 input files GHS3DPRL*.faces and GHS3DPRL*.points.

#input and output files have to be on the same physical disk and the same path: $SAME_DIR
#executable files (and shared libraries) have to be on different physical disks
# but with the same path and name: $DIFF_DIR
echo "parameter 0="$0
echo "parameter 1="$1
echo "parameter 2="$2
echo "parameter 3="$3
echo "parameter 4="$4
export SAME_DIR=/same_physical_disk_and_same_path/tmp
export DIFF_DIR=/different_physical_disk_but_same_path/myname
#copy input local files from the local current directory (something like /tmp/myname)
#in this case /tmp/myname and $SAME_DIR need to be different

export IN_FILES=`basename $2`
export IN_DIR=`dirname $2`
#create .mesh from .faces and .points
/through_salome_path/facespoints2mesh.py $IN_FILES

#there are 2 openmpi executables and libraries through 2 physical DIFF_DIR
export PATH=$DIFF_DIR/openmpi-1.3.1_install/bin:${PATH}
export LD_LIBRARY_PATH=$DIFF_DIR/openmpi-1.3.1_install/lib:${LD_LIBRARY_PATH}

#there are 2 tepal_v2 executables through 2 physical DIFF_DIR
export LD_LIBRARY_PATH=$DIFF_DIR/tepal-2.0.0/bin/Linux_64:${LD_LIBRARY_PATH}
export PATH=$DIFF_DIR/tepal-2.0.0/bin/Linux_64:$PATH

#small test between friends
#mpirun -n $4 hostname >> hostnames.log

#set the license environment variables required by tepal v2
export DISTENE_LICENSE_FILE="Use global envvar: DLIM8VAR"
export DLIM8VAR="dlim8 1:1:29030@is142356/0016175ef08c::a1ba...9e19"
export SIMULOGD_LICENSE_FILE=29029@is142356
export LICENSE_FILE=/product/distene/dlim8.var.sh

#mpirun with the necessary environment exported
export TMP_ENV="-x PATH -x LD_LIBRARY_PATH -x DISTENE_LICENSE_FILE -x DLIM8VAR \
                -x SIMULOGD_LICENSE_FILE -x LICENSE_FILE"
#mpirun $TMP_ENV -n $4 which tepal64_v2.exe >> hostnames.log

#real mpirun: uncomment after verifying the small test
mpirun $TMP_ENV -n $4 tepal64_v2.exe --in $IN_FILES.mesh --out $IN_FILES.mesh --verbose 100

#convert output files to tepal v1 format
/through_salome_path/mesh2facespoints.py $IN_FILES

#copy output files from $SAME_DIR to the local current directory (something like /tmp/myname)
cp -f hostnames.log $IN_DIR
cp -f $IN_FILES* $IN_DIR

#cat $SAME_DIR/hostnames.log
#cat /tmp/myname/tepal.log
<h1>TUI use</h1><br>

<b>Example ex30_tepal.py.</b><p>
import os
import salome
salome.salome_init()

from salome.geom import geomBuilder
geompy = geomBuilder.New(salome.myStudy)

from salome.smesh import smeshBuilder
smesh = smeshBuilder.New(salome.myStudy)

radius = 5.   # example value
height = 50.  # example value

base = geompy.MakeVertex(0, 0, 0)
direction = geompy.MakeVectorDXDYDZ(0, 0, 1)

cylinder = geompy.MakeCylinder(base, direction, radius, height)

geompy.addToStudy(cylinder, "Cylinder")

# Define a mesh on a geometry
# ---------------------------

m = smesh.Mesh(cylinder)

# 2D mesh with BLSURF
# -------------------

algo2d = m.Triangle(smeshBuilder.BLSURF)

algo2d.SetPhysicalMesh(1)
algo2d.SetGeometricMesh(0)

# 3D mesh with tetra-hpc (formerly tepal v3 (2014))
# -------------------------------------------------

algo3d = m.Tetrahedron(smeshBuilder.MG_Tetra_Parallel)

results = "DOMAIN"  # path and prefix of the resulting MED files
algo3d.SetMEDName(results)

algo3d.SetBackground(False)
algo3d.SetKeepFiles(False)
algo3d.SetToMergeSubdomains(False)
algo3d.SetToTagSubdomains(False)
algo3d.SetToOutputInterfaces(False)
algo3d.SetToDiscardSubdomains(False)

m.Compute()

if os.access(results+".xml", os.F_OK):
    print "Ok: tetra_hpc"
else:
    print "KO: tetra_hpc"