Hello
I am trying to build a parallel release of CA, but I get an error with the last command (./waf install -p, see https://sites.google.com/site/codeaster … 4-english). The error appears at the step 'build the elements catalog elem.1 using installed aster (from cata_ele.ojb)', and apparently I would need ~30 GB of memory to get past this step.
# ------------------------------------------------------------------------------------------
# Printing of the contents of the command file to execute:
# ------------------------------------------------------------------------------------------
DEBUT(CATALOGUE=_F(FICHIER='CATAELEM', UNITE=4),
ERREUR=_F(ERREUR_F='ABORT'), PAR_LOT='NON')
MAJ_CATA(ELEMENT=_F())
FIN()
# ------------------------------------------------------------------------------------------
!----------------------------------------------------------------------------------------!
! <F> <JEVEUX1_71>                                                                        !
!                                                                                         !
! The total memory of 4000.00 MB allocated to the study is insufficient; at least         !
! 29450.75 MB is required just to start the execution.                                    !
!                                                                                         !
! This error is fatal. The code stops.                                                    !
!----------------------------------------------------------------------------------------!
Does this mean that I need a machine with more than 30 GB of memory to generate a parallel version of CA?
nicorannou
There is probably another problem. Which version do you want to compile?
Regards
Mac
Hello
As you can see in the topic, the CA version to compile is 13.2.0.
nicorannou
Hello,
I think the origin of the problem is an incompatibility with the MPI includes (mpi.h).
But I haven't investigated further yet...
MC
Code_Aster release : last unstable on Ubuntu 16.04 64 bits - GNU Compilers
Please do not forget to tag your first post as *SOLVED* when it is!
Hello,
@nicorannou
Which version of openmpi are you using?
$ mpirun --version
Thanks
stephane
Hello @stephaneberger,
The openmpi version installed on my machine is 1.10.2.
Nicolas
Hello,
The latest code_aster compiles without error with openmpi-1.8.8.
The problem has existed since openmpi-1.10.0; I also checked that it has not been "fixed" in the latest 1.10.5.
During the last stage of the build process, code_aster is called directly. If code_aster is called through mpirun instead, it works as expected.
A temporary fix is to add:
self.env['CATALO_CMD'] = 'mpirun'
in the 'configure' function of the file used via --use-config.
Another fix is simply to use the openmpi 1.8 wrappers by changing your PATH environment variable.
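For illustration, a minimal sketch of what such a --use-config file could look like with this workaround applied; the file name my_config.py is only a placeholder, and the rest of your configure function would keep whatever settings you already use:
# my_config.py -- hypothetical name; pass it with './waf configure --use-config=my_config'
def configure(self):
    # ... your existing configure settings (compilers, prerequisites, paths, ...) ...
    # Workaround: run the elements catalog build step through mpirun instead of
    # calling the installed aster executable directly.
    self.env['CATALO_CMD'] = 'mpirun'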
MC
Hello Mr Courtois,
Thanks a lot! Indeed, parallel CA compiles correctly after adding:
self.env['CATALO_CMD'] = 'mpirun'
in the configuration file.
I then ran a job, but an error occurs:
Of the 4041 cells of the mesh MAIL, assignment was requested for 4041 and 4041 could be assigned
Modelling     Cell type   Finite element   Count
COQUE_3D      QUAD9       MEC3QU9H         477
COQUE_3D      TRIA7       MEC3TR7H         18
3D            QUAD9       MECA_FACE9       522
3D            HEXA27      MECA_HEXA27      3024
-- NUMBER OF CELLS: 4041
-- NUMBER OF NODES: 27913
!--------------------------------------------------------------------------------!
! <EXCEPTION> <UTILITAI5_1>                                                       !
!                                                                                 !
! The file named fort.99.part.2 associated with logical unit ??? does not exist.  !
!--------------------------------------------------------------------------------!
Destruction of the concept 'MODELE'.
Processor #0 has issued an error message.
Processor #1 is asked to stop or to raise an exception.
Thanks in advance for your help.
Nicolas
Hello,
Indeed, the fix will be to prepend the command with 'mpirun -np 1' (because without arguments, mpirun may start one process per available core...).
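Applied to the workaround above, the line in the hypothetical configure function would then read (same assumptions as in the earlier sketch):
def configure(self):
    # Launch exactly one MPI process for the catalog build, so that mpirun does
    # not spawn one process per available core.
    self.env['CATALO_CMD'] = 'mpirun -np 1'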
MC
@nicorannou: please send a testcase to reproduce the problem.
MC
Hello
Here is a testcase which shows the problem.
Nicolas