#1 Re: Code_Aster usage » Odd error in PROJ_CHAMP, nonlinear transient thermal-stress analysis » 2021-09-16 18:31:11

Hi,
Why not try solid hex elements with a few layers through the thickness of the shell?

#2 Re: Code_Aster usage » Odd error in PROJ_CHAMP, nonlinear transient thermal-stress analysis » 2021-09-15 19:49:10

Hello,
First, please introduce yourself in the introduction section.
I think there will be an error with PROJ_CHAMP because you changed the mesh and added a node at the center that did not exist before; it is far from all the other nodes.
Regards
Anirudh
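As a hedged sketch of this advice (keyword names recalled from the PROJ_CHAMP documentation, u4.72.05 — please verify): the collocation method accepts distance controls so that a node lying far from the source mesh is rejected or flagged instead of silently extrapolated. TEMP_RES, MO_OLD, and MO_NEW are placeholder names, not taken from the original post.

```python
# Hedged sketch, not from the original post; check keyword names against
# the PROJ_CHAMP documentation (u4.72.05) before use.
# TEMP_RES / MO_OLD / MO_NEW are placeholders for the thermal result and
# the models on the original and modified meshes.
TEMP_PRJ = PROJ_CHAMP(RESULTAT=TEMP_RES,
                      MODELE_1=MO_OLD,          # model on the original mesh
                      MODELE_2=MO_NEW,          # model on the modified mesh
                      METHODE='COLLOCATION',
                      DISTANCE_MAX=1.0e-3,      # reject matches farther than this
                      DISTANCE_ALARME=1.0e-4)   # only warn beyond this distance
```

This is a solver-input fragment and only runs inside a Code_Aster study; tune the two distances to the element size of your mesh.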

#3 Re: Code_Aster installation » Code_aster 15.4 MPI installation problem » 2021-08-01 21:15:51

Hi
I installed the sequential version 15.2, whose installation is easy: just download the code_aster stable package and run python3 setup.py.
I have not yet been able to install the sequential 15.4 version.

Regards
Anirudh
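For reference, the sequential install from the aster-full package can be sketched roughly as below; the archive name and the --prefix are assumptions, not taken from the original post — adjust them to your download and target directory.

```shell
# Hedged sketch of a sequential aster-full install.
# Archive name and prefix are assumptions; adjust to your setup.
tar xf aster-full-src-15.2.0-1.noarch.tar.gz
cd aster-full-src-15.2.0
python3 setup.py install --prefix=$HOME/aster
```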

#4 Code_Aster usage » Erreur numérique (floating point exception). » 2021-07-26 07:43:05

Anirudh
Replies: 1

Hi to all,
I recently managed to install the parallel 15.4 version of code_aster. But whenever I run a test case, I get a floating point exception during AFFE_MODELE. I tried changing the partitioner from METIS to SCOTCH, but the error persists in every test case. Below is the error.


# Command #0005 of ./CurrentCase.comm.changed.py, line 51
MODE = AFFE_MODELE(AFFE=_F(MODELISATION='3D',
                           PHENOMENE='MECANIQUE',
                           TOUT='OUI'),
                   DISTRIBUTION=_F(METHODE='SOUS_DOMAINE',
                                   PARTITIONNEUR='SCOTCH'),
                   INFO=1,
                   MAILLAGE=MAILL,
                   VERI_JACOBIEN='OUI',
                   VERI_NORM_IFS='OUI')

Of the 149 cells of mesh 00000001, assignment was requested for 149; 149 could be
assigned.
Modelling        Formulation      Cell type    Finite element   Count
_                _                SEG2         MECA_ARETE2      27
_                _                QUAD4        MECA_FACE4       98
3D               _                HEXA8        MECA_HEXA8       24

╔══════════════════════════════════════════════╗
║ <F> <DVP_2>                                  ║
║                                              ║
║ Numerical error (floating point exception).  ║
║                                              ║
║ This error is fatal. The code stops.         ║
║ There is probably a programming error.       ║
║ Please contact your technical support.       ║
╚══════════════════════════════════════════════╝

Traceback returned by GNU libc (last 25 stack frames):
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibc.so(print_trace_+0x23) [0x7f6f2d79eb03]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibfor.so(utmess_core_+0x978) [0x7f6f2c244858]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibfor.so(utmess_+0x7ec) [0x7f6f2c243b6c]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibfor.so(utmfpe_+0x39) [0x7f6f2c3653d9]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibc.so(hanfpe+0xc) [0x7f6f2d79edec]
/lib/x86_64-linux-gnu/libc.so.6(+0x46210) [0x7f6f2e0f7210]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibfor.so(calcul_+0x1120) [0x7f6f2b462240]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibfor.so(modelcheck_+0x3b1) [0x7f6f2bab8c01]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibfor.so(op0018_+0x748) [0x7f6f2bc91718]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibfor.so(execop_+0x18a) [0x7f6f2c03156a]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibfor.so(expass_+0x14) [0x7f6f2c031634]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibcxx.so(_Z9call_operP7_objecti+0x28) [0x7f6f2a97c8d8]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibcxx.so(_ZN5boost6python7objects23caller_py_function_implINS0_6detail6callerIPFvP7_objectiENS0_21default_call_policiesENS_3mpl7vector3IvS6_iEEEEEclES6_S6_+0x58) [0x7f6f2a97e9c8]
/lib/x86_64-linux-gnu/libboost_python38.so.1.71.0(_ZNK5boost6python7objects8function4callEP7_objectS4_+0x2cb) [0x7f6f274b1eab]
/lib/x86_64-linux-gnu/libboost_python38.so.1.71.0(+0x2115c) [0x7f6f274b215c]
/lib/x86_64-linux-gnu/libboost_python38.so.1.71.0(_ZNK5boost6python6detail17exception_handlerclERKNS_9function0IvEE+0x6b) [0x7f6f274b72eb]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibcxx.so(_ZN5boost6detail8function21function_obj_invoker2INS_3_bi6bind_tIbNS_6python6detail19translate_exceptionI8ErrorCppILi6EEPFvRKS9_EEENS3_5list3INS_3argILi1EEENSG_ILi2EEENS3_5valueISD_EEEEEEbRKNS6_17exception_handlerERKNS_9function0IvEEE6invokeERNS1_15function_bufferESP_ST_+0x1c) [0x7f6f2a9c07bc]
/lib/x86_64-linux-gnu/libboost_python38.so.1.71.0(_ZNK5boost6python6detail17exception_handlerclERKNS_9function0IvEE+0x3b) [0x7f6f274b72bb]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibcxx.so(_ZN5boost6detail8function21function_obj_invoker2INS_3_bi6bind_tIbNS_6python6detail19translate_exceptionI8ErrorCppILi5EEPFvRKS9_EEENS3_5list3INS_3argILi1EEENSG_ILi2EEENS3_5valueISD_EEEEEEbRKNS6_17exception_handlerERKNS_9function0IvEEE6invokeERNS1_15function_bufferESP_ST_+0x1c) [0x7f6f2a9c075c]
/lib/x86_64-linux-gnu/libboost_python38.so.1.71.0(_ZNK5boost6python6detail17exception_handlerclERKNS_9function0IvEE+0x3b) [0x7f6f274b72bb]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibcxx.so(_ZN5boost6detail8function21function_obj_invoker2INS_3_bi6bind_tIbNS_6python6detail19translate_exceptionI8ErrorCppILi4EEPFvRKS9_EEENS3_5list3INS_3argILi1EEENSG_ILi2EEENS3_5valueISD_EEEEEEbRKNS6_17exception_handlerERKNS_9function0IvEEE6invokeERNS1_15function_bufferESP_ST_+0x1c) [0x7f6f2a9c06fc]
/lib/x86_64-linux-gnu/libboost_python38.so.1.71.0(_ZNK5boost6python6detail17exception_handlerclERKNS_9function0IvEE+0x3b) [0x7f6f274b72bb]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibcxx.so(_ZN5boost6detail8function21function_obj_invoker2INS_3_bi6bind_tIbNS_6python6detail19translate_exceptionI8ErrorCppILi3EEPFvRKS9_EEENS3_5list3INS_3argILi1EEENSG_ILi2EEENS3_5valueISD_EEEEEEbRKNS6_17exception_handlerERKNS_9function0IvEEE6invokeERNS1_15function_bufferESP_ST_+0x1c) [0x7f6f2a9c069c]
/lib/x86_64-linux-gnu/libboost_python38.so.1.71.0(_ZNK5boost6python6detail17exception_handlerclERKNS_9function0IvEE+0x3b) [0x7f6f274b72bb]
/home/anirudh/aster_15.2_mpi/PAR15.4/lib/aster/libbibcxx.so(_ZN5boost6detail8function21function_obj_invoker2INS_3_bi6bind_tIbNS_6python6detail19translate_exceptionI8ErrorCppILi2EEPFvRKS9_EEENS3_5list3INS_3argILi1EEENSG_ILi2EEENS3_5valueISD_EEEEEEbRKNS6_17exception_handlerERKNS_9function0IvEEE6invokeERNS1_15function_bufferESP_ST_+0x1c) [0x7f6f2a9c063c]

EXECUTION_CODE_ASTER_EXIT_154972=6


I attach the .mess file. If someone could help with this error, it would be greatly appreciated.

Thanks and Regards
Anirudh Nehra

#5 Re: Code_Aster installation » Code_aster 15.4 MPI installation problem » 2021-07-26 02:36:01

Hi,
I tried again, but now I am getting this error:
Waf: Leaving directory `/home/anirudh/Desktop/extracts/code_aster_15.4_tip/codeaster-src-703aca508196b74ad104112bfb72a87d1dba78df/build/std/release'
+ build the elements catalog elem.1 using installed aster (from cata_ele.ojb)
stdout: MPI_Init...
calling MPI_Init...

stderr: [anirudh-Inspiron-7591:51568] *** Process received signal ***
[anirudh-Inspiron-7591:51568] Signal: Segmentation fault (11)
[anirudh-Inspiron-7591:51568] Signal code: Address not mapped (1)
[anirudh-Inspiron-7591:51568] Failing at address: 0x90
[anirudh-Inspiron-7591:51568] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x46210)[0x7f4bf924c210]
[anirudh-Inspiron-7591:51568] [ 1] /lib/x86_64-linux-gnu/libblas.so.3(cblas_sdot+0x24)[0x7f4bf08284a4]
[anirudh-Inspiron-7591:51568] [ 2] /usr/lib/python3/dist-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x3e925)[0x7f4be61b0925]
[anirudh-Inspiron-7591:51568] [ 3] /usr/lib/python3/dist-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x237ff7)[0x7f4be63a9ff7]
[anirudh-Inspiron-7591:51568] [ 4] /usr/lib/python3/dist-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xeb997)[0x7f4be625d997]
[anirudh-Inspiron-7591:51568] [ 5] /usr/lib/python3/dist-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xe6d28)[0x7f4be6258d28]
[anirudh-Inspiron-7591:51568] [ 6] /usr/bin/python3[0x504939]
[anirudh-Inspiron-7591:51568] [ 7] /usr/bin/python3(_PyEval_EvalFrameDefault+0x906)[0x56acb6]
[anirudh-Inspiron-7591:51568] [ 8] /usr/bin/python3(_PyFunction_Vectorcall+0x1b6)[0x5f5956]
[anirudh-Inspiron-7591:51568] [ 9] /usr/bin/python3(_PyEval_EvalFrameDefault+0x72f)[0x56aadf]
[anirudh-Inspiron-7591:51568] [10] /usr/bin/python3(_PyEval_EvalCodeWithName+0x26a)[0x568d9a]
[anirudh-Inspiron-7591:51568] [11] /usr/bin/python3(PyEval_EvalCode+0x27)[0x68cdc7]
[anirudh-Inspiron-7591:51568] [12] /usr/bin/python3[0x5ff5d4]
[anirudh-Inspiron-7591:51568] [13] /usr/bin/python3[0x5c3cb0]
[anirudh-Inspiron-7591:51568] [14] /usr/bin/python3(PyVectorcall_Call+0x58)[0x5f2168]
[anirudh-Inspiron-7591:51568] [15] /usr/bin/python3(_PyEval_EvalFrameDefault+0x6552)[0x570902]
[anirudh-Inspiron-7591:51568] [16] /usr/bin/python3(_PyEval_EvalCodeWithName+0x26a)[0x568d9a]
[anirudh-Inspiron-7591:51568] [17] /usr/bin/python3(_PyFunction_Vectorcall+0x393)[0x5f5b33]
[anirudh-Inspiron-7591:51568] [18] /usr/bin/python3(_PyEval_EvalFrameDefault+0x57d7)[0x56fb87]
[anirudh-Inspiron-7591:51568] [19] /usr/bin/python3(_PyFunction_Vectorcall+0x1b6)[0x5f5956]
[anirudh-Inspiron-7591:51568] [20] /usr/bin/python3(_PyEval_EvalFrameDefault+0x906)[0x56acb6]
[anirudh-Inspiron-7591:51568] [21] /usr/bin/python3(_PyFunction_Vectorcall+0x1b6)[0x5f5956]
[anirudh-Inspiron-7591:51568] [22] /usr/bin/python3(_PyEval_EvalFrameDefault+0x72f)[0x56aadf]
[anirudh-Inspiron-7591:51568] [23] /usr/bin/python3(_PyFunction_Vectorcall+0x1b6)[0x5f5956]
[anirudh-Inspiron-7591:51568] [24] /usr/bin/python3(_PyEval_EvalFrameDefault+0x72f)[0x56aadf]
[anirudh-Inspiron-7591:51568] [25] /usr/bin/python3(_PyFunction_Vectorcall+0x1b6)[0x5f5956]
[anirudh-Inspiron-7591:51568] [26] /usr/bin/python3[0x5f34b1]
[anirudh-Inspiron-7591:51568] [27] /usr/bin/python3(_PyObject_CallMethodIdObjArgs+0x198)[0x5f3938]
[anirudh-Inspiron-7591:51568] [28] /usr/bin/python3(PyImport_ImportModuleLevelObject+0x822)[0x551f52]
[anirudh-Inspiron-7591:51568] [29] /usr/bin/python3[0x4f6cb8]
[anirudh-Inspiron-7591:51568] *** End of error message ***
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec noticed that process rank 0 with PID 0 on node anirudh-Inspiron-7591 exited on signal 11 (Segmentation fault)

Could someone help with this error?

Thank you very much
Anirudh

#6 Re: Code_Aster installation » Code_aster 15.4 MPI installation problem » 2021-07-24 05:38:18

Hi,
Could someone provide a clue as to what's going wrong?
Do I perhaps need to upgrade gfortran and gcc?

Regards
Anirudh

#7 Re: Code_Aster installation » Code_aster 15.4 MPI installation problem » 2021-07-22 13:25:43

Hi,
Yes, you are right: the problem was that PETSc needs to be built as a shared object.
I got the build to 100%, but there is another problem:

stdout: 
stderr: Traceback (most recent call last):
  File "/home/anirudh/Desktop/extracts/code_aster_mpi_15.2/aster-full-src-15.2.0/SRC/aster-15.2.0/build/std/release/catalo/fort.1", line 1, in <module>
    from code_aster.Commands import DEBUT, MAJ_CATA, FIN
  File "/home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/code_aster/Commands/__init__.py", line 30, in <module>
    from ..Supervis import CO
  File "/home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/code_aster/Supervis/__init__.py", line 27, in <module>
    from .CommandSyntax import CommandSyntax
  File "/home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/code_aster/Supervis/CommandSyntax.py", line 61, in <module>
    from ..Objects import DataStructure
  File "/home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/code_aster/Objects/__init__.py", line 27, in <module>
    from libaster import *
ImportError: /home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/libbibcxx.so: undefined symbol: isdeco_
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[36661,1],0]
  Exit code:    1

I attach the screenshot too.
Please help.

Regards
Anirudh

#8 Re: Code_Aster installation » Code_aster 15.4 MPI installation problem » 2021-07-21 15:59:11

Hello,
It would be greatly appreciated if someone could help with this error.

Regards
Anirudh

#9 Re: Code_Aster installation » Code_aster 15.4 MPI installation problem » 2021-07-20 16:00:14

Hi,
Could someone help me? I always get this error:
/usr/bin/ld: /home/anirudh/aster_libs/petsc-3.11.4/linux-metis-mumps/lib/libpetsc.a(pinit.o): relocation R_X86_64_PC32 against symbol `ompi_mpi_2int' can not be used when making a shared object; recompile with -fPIC
/usr/bin/ld: final link failed: bad value
collect2: error: ld returned 1 exit status

I tried:
export LINKFLAGS=-shared
export CFLAGS=-fPIC

but it does not work.

How do I specify additional flags to waf during the build?

Thanks and Regards
Anirudh
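A hedged note on the waf question (based on how waf generally captures its environment, not on anything aster-specific): waf snapshots compiler flags at configure time, so exporting them just before ./waf install has no effect. Something like the following, done before reconfiguring, is worth trying; the variable names are the standard ones, so verify them against the Ubuntu_gnu_mpi config file.

```shell
# waf reads these at *configure* time, so export them before
# ./waf configure, not before ./waf install.
export CFLAGS="-fPIC"
export CXXFLAGS="-fPIC"
export FCFLAGS="-fPIC"
# Then re-run the configure + install commands from the posts above:
# ./waf configure --use-config-dir=$ASTER_ROOT/15.2/share/aster \
#     --use-config=Ubuntu_gnu_mpi --prefix=$ASTER_ROOT/PAR15.4 --without-hg
# ./waf install -p --jobs=6
```

That said, since libpetsc.a here is a static archive built without -fPIC, the more robust fix is to rebuild PETSc itself with -fPIC or as a shared library, as later posts in this thread confirm.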

#11 Re: Code_Aster installation » Code_aster 15.4 MPI installation problem » 2021-07-19 23:07:27

Hi,
I see that the full error output is not visible in the post, so I attach a screenshot.

Regards
Anirudh

#12 Code_Aster installation » Code_aster 15.4 MPI installation problem » 2021-07-19 23:03:30

Anirudh
Replies: 16

Hi,
I am trying to install the parallel version of Code_Aster from SourceForge. I am not using Mercurial.
My $ASTER_ROOT=/home/anirudh/aster_15.2_mpi
I have already installed sequential 15.2 successfully with PETSc 3.11.4, but when I try to install the parallel version 15.4 (commit 703...), I get the following error:

/usr/bin/ld: /home/anirudh/aster_libs/petsc-3.11.4/linux-metis-mumps/lib/libpetsc.a(pinit.o): relocation R_X86_64_PC32 against symbol `ompi_mpi_2int' can not be used when making a shared object; recompile with -fPIC
/usr/bin/ld: final link failed: bad value
collect2: error: ld returned 1 exit status

I tried to install using

 ./waf install -p --jobs=6 -v CFLAGS="-fPIC"

but it doesn't work. How can I pass the -fPIC flag to it, and why doesn't it pick it up automatically?
I configured with these options:

source /home/anirudh/aster_15.2_mpi/15.2/share/aster/profile.sh
source $ASTER_ROOT/etc/codeaster/profile.sh (both of the above produce the same result)
./waf configure --use-config-dir=$ASTER_ROOT/15.2/share/aster --use-config=Ubuntu_gnu_mpi --prefix=$ASTER_ROOT/PAR15.4 --without-hg

This is profile.sh under /home/anirudh/aster_15.2_mpi/15.2/share/aster:

# created by waf using data/wscript


# This file set the environment for code_aster
# This configuration is generated by aster-full package.

# keep path to this file
export WAFBUILD_ENV=$(readlink -n -f ${BASH_SOURCE})

# DEVTOOLS_COMPUTER_ID avoids waf to re-source the environment
export DEVTOOLS_COMPUTER_ID=aster_full

# generic environment: compilers, python
export PATH=/usr/bin:${PATH}
export LD_LIBRARY_PATH=/usr/lib/python3.8/config-3.8-x86_64-linux-gnu:${LD_LIBRARY_PATH}

# seems needed for gcc>=9
export LINKFLAGS="-Wl,--no-as-needed"

# custom configuration
export CONFIG_PARAMETERS_addmem=800

# prerequisites paths
export PYPATH_NUMPY="/usr/lib/python3/dist-packages"
export PYPATH_ASRUN="/home/anirudh/aster_15.2_mpi/lib/python3.8/site-packages"

export LIBPATH_HDF5="/home/anirudh/aster_15.2_mpi/public/hdf5-1.10.3/lib"
export INCLUDES_HDF5="/home/anirudh/aster_15.2_mpi/public/hdf5-1.10.3/include"

export LIBPATH_MED="/home/anirudh/aster_15.2_mpi/public/med-4.0.0/lib"
export INCLUDES_MED="/home/anirudh/aster_15.2_mpi/public/med-4.0.0/include"

export LIBPATH_METIS="/home/anirudh/aster_15.2_mpi/public/metis-5.1.0/lib"
export INCLUDES_METIS="/home/anirudh/aster_15.2_mpi/public/metis-5.1.0/include"

export LIBPATH_SCOTCH="/home/anirudh/aster_15.2_mpi/public/scotch-6.0.4/lib"
export INCLUDES_SCOTCH="/home/anirudh/aster_15.2_mpi/public/scotch-6.0.4/include"

export LIBPATH_MUMPS="/home/anirudh/aster_15.2_mpi/public/mumps-5.2.1/lib"
export INCLUDES_MUMPS="/home/anirudh/aster_15.2_mpi/public/mumps-5.2.1/include /home/anirudh/aster_15.2_mpi/public/mumps-5.2.1/include_seq"

export TFELHOME="/home/anirudh/aster_15.2_mpi/public/tfel-3.2.1"
export TFELVERS="3.2.1"
export LIBPATH_MFRONT="/home/anirudh/aster_15.2_mpi/public/tfel-3.2.1/lib"
export INCLUDES_MFRONT="${TFELHOME}/include"
export PYPATH_MFRONT="${TFELHOME}/lib/python3.8/site-packages"

export INCLUDES_BOOST="/usr/lib/x86_64-linux-gnu/include"
export LIBPATH_BOOST="/usr/lib/x86_64-linux-gnu"
export LIB_BOOST="boost_python38"


export LD_LIBRARY_PATH=${LIBPATH_HDF5}:${LIBPATH_MED}:${LIBPATH_METIS}:${LIBPATH_SCOTCH}:${LIBPATH_MUMPS}:${LIBPATH_MFRONT}:${LIBPATH_BOOST}:${LD_LIBRARY_PATH}

export PYTHONPATH=${PYPATH_NUMPY}:${PYPATH_ASRUN}:${PYPATH_MFRONT}:${PYTHONPATH}

# may be needed: gmsh, miss3d, ecrevisse, salome
export PATH=/home/anirudh/aster_15.2_mpi/public/med-4.0.0/bin:/home/anirudh/aster_15.2_mpi/public/homard-11.12:${TFELHOME}/bin:${PATH}



remove_path()
{
    # usage: remove_path value pattern1 [pattern2 [...]]
    #     Returns the 'value' with excluding the given patterns.
    #     Example of use: export PATH=$(remove_path "${PATH}" ${HOME}/local)
    if [ ${#} -lt 2 ]
    then
        echo ${1}
        return
    fi
    local values=${1}
    shift

    local i
    for i in ${@}
    do
        values=$(echo ${values} | tr ":" "\n" | grep -v -F ${i} | tr "\n" ":" | sed -e "s%:\+%:%g;s%^:%%g;s%:$%%g")
    done

    echo ${values}
}

LD_LIBRARY_PATH=/home/anirudh/aster_15.2_mpi/15.2/lib/aster:${LD_LIBRARY_PATH}:.
export LD_LIBRARY_PATH

# Exclude paths to 'python2.7'.
PYTHONPATH=$(remove_path "${PYTHONPATH}" python2.7)

PYTHONPATH=/home/anirudh/aster_15.2_mpi/15.2/lib/aster:${PYTHONPATH}:.
export PYTHONPATH

# sometimes one should not change PYTHONHOME under SALOME environment...
PYTHONHOME=/usr
export PYTHONHOME

# as PYTHONHOME is changed, path to 'python' must preceed all others if a
# subprocess calls it
PATH=/usr/bin:${PATH}
export PATH

ASTER_LIBDIR=/home/anirudh/aster_15.2_mpi/15.2/lib/aster
export ASTER_LIBDIR

ASTER_DATADIR=/home/anirudh/aster_15.2_mpi/15.2/share/aster
export ASTER_DATADIR

ASTER_LOCALEDIR=/home/anirudh/aster_15.2_mpi/15.2/share/locale/aster
export ASTER_LOCALEDIR

ASTER_ELEMENTSDIR=/home/anirudh/aster_15.2_mpi/15.2/lib/aster
export ASTER_ELEMENTSDIR

This is profile.sh located under $ASTER_ROOT/etc/codeaster:

# AUTOMATICALLY GENERATED - DO NOT EDIT !
# Put all your changes in profile_local.sh in the same directory
#
# profile.sh : initialize the environment for as_run services
# (sh, ksh, bash syntax)
#
# If variables are depending on Code_Aster version, use ENV_SH in
# the corresponding 'config.txt' file.
#

if [ -z "${ASTER_ROOT}" ]; then
    [ -z "${BASH_SOURCE[0]}" ] && here="$0" || here="${BASH_SOURCE[0]}"
    ASTER_ROOT=`dirname $(dirname $(dirname $(readlink -f ${here})))`
    export ASTER_ROOT
fi

if [ -z "${ASTER_ETC}" ]; then
    ASTER_ETC="${ASTER_ROOT}"/etc
    if [ "${ASTER_ROOT}" = "/usr" ]; then
        ASTER_ETC=/etc
    fi
    export ASTER_ETC
fi

if [ -z "${PATH}" ]; then
    export PATH="$ASTER_ROOT"/bin:"${ASTER_ROOT}"/outils
else
    export PATH="${ASTER_ROOT}"/bin:"${ASTER_ROOT}"/outils:"${PATH}"
fi

if [ -z "${LD_LIBRARY_PATH}" ]; then
    export LD_LIBRARY_PATH="/usr"/lib
else
    export LD_LIBRARY_PATH="/usr"/lib:"${LD_LIBRARY_PATH}"
fi

if [ -z "${PYTHONPATH}" ]; then
    export PYTHONPATH="$ASTER_ROOT/lib/python3.8/site-packages"
else
    export PYTHONPATH="$ASTER_ROOT/lib/python3.8/site-packages":"${PYTHONPATH}"
fi

export PYTHONEXECUTABLE="/usr/bin/python3"

# this may be required if PYTHONHOME is defined to another location
if [ ! -z "${PYTHONHOME}" ]; then
    export PYTHONHOME="/usr"
fi

export WISHEXECUTABLE="/usr/bin/wish"

# define the default temporary directory
# Use profile_local.sh if you need change it!
if [ -z "${ASTER_TMPDIR}" ]; then
    export ASTER_TMPDIR=/tmp
fi

# source local profile
if [ -e "${ASTER_ETC}"/codeaster/profile_local.sh ]; then
    . "${ASTER_ETC}"/codeaster/profile_local.sh
fi

This is my config output:

./waf configure --use-config-dir=$ASTER_ROOT/15.2/share/aster --use-config=Ubuntu_gnu_mpi --prefix=$ASTER_ROOT/PAR15.4 --without-hg
checking environment... already set: /home/anirudh/aster_15.2_mpi/15.2/share/aster/profile.sh
executing: ./waf.engine configure --use-config-dir=/home/anirudh/aster_15.2_mpi/15.2/share/aster --use-config=Ubuntu_gnu_mpi --prefix=/home/anirudh/aster_15.2_mpi/PAR15.4 --without-hg --out=build/std --jobs=4
Setting top to                           : /home/anirudh/Desktop/extracts/code_aster_15.4_tip/codeaster-src-703aca508196b74ad104112bfb72a87d1dba78df 
Setting out to                           : /home/anirudh/Desktop/extracts/code_aster_15.4_tip/codeaster-src-703aca508196b74ad104112bfb72a87d1dba78df/build/std 
Setting prefix to                        : /home/anirudh/aster_15.2_mpi/PAR15.4 
Searching configuration 'Ubuntu_gnu_mpi'... 
Checking for configuration               : Ubuntu_gnu_mpi 
Checking for 'gcc' (C compiler)          : mpicc 
Checking for 'g++' (C++ compiler)        : mpicxx 
Checking for 'gfortran' (Fortran compiler) : mpif90 
Checking mpicc package (collect configuration flags) : yes 
Checking mpif90 package (collect configuration flags) : yes 
Checking for header mpi.h                : yes 
Checking for C compiler version          : gcc 9.3.0 
Checking for Fortran compiler version    : gfortran 9.3.0 
fortran link verbose flag                : -v 
Checking for OpenMP flag -fopenmp        : yes 
Checking for program 'python'            : /usr/bin/python3 
Checking for python version >= 3.5.0     : 3.8.5 
python-config                            : /usr/bin/python3-config 
Asking python-config for pyembed '--cflags --libs --ldflags --embed' flags : yes 
Testing pyembed configuration            : yes 
Asking python-config for pyext '--cflags --libs --ldflags' flags : yes 
Testing pyext configuration              : yes 
Checking for python module 'numpy'       : 1.17.4 
Checking for numpy include               : ['/usr/lib/python3/dist-packages/numpy/core/include'] 
Checking for python module 'asrun'       : 2020.0 
Checking for python module 'mpi4py'      : not found 
Getting platform                         : ASTER_PLATFORM_LINUX64 
Checking for library m                   : yes 
Checking for library z                   : yes 
Checking for number of cores             : 6 
Checking for library openblas            : yes 
Checking for library superlu             : yes 
Setting libm after files                 : nothing done 
Checking for library hdf5                : yes 
Checking for library z                   : yes 
Checking for header hdf5.h               : yes 
Checking hdf5 version                    : 1.10.3 
Checking for API hdf5 v18                : default v18 
Checking size of hid_t integers          : 8 
Checking for library med                 : yes 
Checking for library stdc++              : yes 
Checking for header med.h                : yes 
Checking size of med_int integers        : 4 
Checking size of med_idt integers        : 8 
Checking med version                     : 4.0.0 
Checking for python module 'med'         : not found 
Checking for python module 'medcoupling' : not found 
Checking for library metis               : yes 
Checking for header metis.h              : yes 
Checking metis version                   : 5.1.0 
Checking for library parmetis            : yes 
Checking parmetis version                : 4.0.3 
Checking for smumps_struc.h              : yes 
Checking for dmumps_struc.h              : yes 
Checking for cmumps_struc.h              : yes 
Checking for zmumps_struc.h              : yes 
Checking for mpif.h                      : yes 
Checking mumps version                   : 5.2.1 
Checking size of Mumps integer           : 4 
Checking for library dmumps              : yes 
Checking for library zmumps              : yes 
Checking for library smumps              : yes 
Checking for library cmumps              : yes 
Checking for library mumps_common        : yes 
Checking for library pord                : yes 
Checking for library metis               : yes 
Checking for library scalapack           : yes 
Checking for library openblas            : yes 
Checking for library esmumps             : yes 
Checking for library scotch              : yes 
Checking for library scotcherr           : yes 
Checking for header scotch.h             : yes 
Checking scotch version                  : 6.0.4 
Checking for library ['esmumps', 'scotch', 'scotcherr', 'ptscotch', 'ptscotcherr'] : yes 
Checking for library petsc               : yes 
Checking for library HYPRE               : yes 
Checking for library ml                  : yes 
Checking for header petsc.h              : yes 
Checking for header petscconf.h          : yes 
Checking petsc version                   : 3.11.4p0 
Checking size of PETSc integer           : 4 
Checking value of ASTER_PETSC_64BIT_INDICES : no 
Checking value of ASTER_PETSC_HAVE_ML    : 1 
Checking value of ASTER_PETSC_HAVE_HYPRE : 1 
Checking value of ASTER_PETSC_HAVE_SUPERLU : no 
Checking value of ASTER_PETSC_HAVE_MUMPS : 1 
Checking for python module 'petsc4py'    : not found 
Reading build preferences from ~/.hgrc   : not found 
Checking for 'gfortran' (Fortran compiler) : mpif90 
Compiling a simple fortran app           : yes 
Detecting whether we need a dummy main   : yes main 
Checking for fortran option              : yes (-fdefault-double-8 -fdefault-integer-8 -fdefault-real-8) 
Checking for fortran option              : yes (-Wimplicit-interface) 
Checking for fortran option              : yes (-Wintrinsic-shadow) 
Checking for fortran option              : yes (-fno-aggressive-loop-optimizations) 
Checking for fortran option              : yes (-ffree-line-length-none) 
Setting fortran compiler flags           : ['-fPIC', '-fdefault-double-8', '-fdefault-integer-8', '-fdefault-real-8', '-Wimplicit-interface', '-Wintrinsic-shadow', '-fno-aggressive-loop-optimizations', '-ffree-line-length-none'] 
Getting fortran mangling scheme          : ok ('_', '', 'lower-case') 
Checking size of integer4                : 4 
Checking the matching C type             : int 
Checking size of default integer         : 8 
Checking the matching C type             : long 
Checking size of logical                 : 1 
Checking size of simple precision real   : 4 
Checking the matching C type             : float 
Checking size of double precision real   : 8 
Checking the matching C type             : double 
Checking size of double complex          : 16 
Setting type for fortran string length   : unsigned int 
Setting size of blas/lapack integers     : 4 
Checking size of MPI_Fint integers       : 4 
Checking fpp stringify using #           : no 
Checking fpp stringify using ""          : yes 
Checking compilation with long lines     : yes 
Check for backtrace feature              : yes 
Check for tracebackqq feature            : no 
Getting code_aster version               : [(0, 0, 1), 'n/a', 'n/a', '20/07/2021', 'n/a', 1] 
Checking for 'g++' (C++ compiler)        : mpicxx 
Checking for compiler flags -std=c++11   : yes 
Checking for library stdc++              : yes 
Checking size of C++ bool                : 1 
Checking for program 'dpkg-architecture' : /usr/bin/dpkg-architecture 
Checking boost includes                  : 1.71.0 
Checking boost libs                      : ok 
Checking for boost linkage               : ok 
Checking for 'gcc' (C compiler)          : mpicc 
Getting C compiler flags                 : ['-fPIC'] 
Check for msgfmt programs                : ['/usr/bin/msgfmt'] 
Check for xgettext programs              : ['/usr/bin/xgettext'] 
Check for msgmerge programs              : ['/usr/bin/msgmerge'] 
Check for lrelease programs              : not found 
Set parameters for 'config.json'         : done 
. use 'tmpdir'                           : /tmp 
. use 'addmem'                           : 800 
. use 'python'                           : python3 
. use 'python_interactive'               : python3 
. use 'mpiexec'                          : mpiexec -n {mpi_nbcpu} --tag-output {program} 
. use 'mpi_get_rank'                     : echo ${OMPI_COMM_WORLD_RANK} 
Checking measure of VmSize during MPI_Init : ok (162116 kB) 
Checking for program 'gmsh'              : /home/anirudh/aster_libs/gmsh-4.8.4-Linux64/bin/gmsh 
Checking for program 'salome'            : not found 
Checking for program 'salome'            : not found 
Checking for program 'run_miss3d'        : not found 
Checking for program 'run_miss3d'        : not found 
Checking for program 'homard'            : /home/anirudh/aster_15.2_mpi/public/homard-11.12/ASTER_HOMARD/homard 
Checking for program 'ecrevisse'         : not found 
Checking for program 'ecrevisse'         : not found 
Checking for program 'mfront'            : not found 
Checking for program 'mfront'            : not found 
Checking for program 'xmgrace'           : /usr/bin/xmgrace 
Checking for program 'gracebat'          : /usr/bin/gracebat 
Checking for program 'mdump'             : /home/anirudh/aster_15.2_mpi/public/med-4.0.0/bin/mdump 
Checking for 'data' repository           : /home/anirudh/Desktop/extracts/code_aster_15.4_tip/data not found 
Store execution environment              : yes 
Build list of testcases                  : done 
Checking for program 'dot'               : not found 
Checking for 'validation' repository     : /home/anirudh/Desktop/extracts/code_aster_15.4_tip/validation not found 
Setting C debug flags                    : ['-g', '-O0'] 
Setting C optimization flags             : ['-O2'] 
Setting C++ debug flags                  : ['-g', '-O0'] 
Setting C++ optimization flags           : ['-O2'] 
Setting fortran debug flags              : ['-g', '-O0'] 
Checking loop optimization with LOC      : VOLATILE is required 
Checking implicit loop in write          : '-fno-frontend-optimize' is required 
Getting fortran optimization flags       : ['-O2', '-fno-frontend-optimize'] 
Write config file                        : debug/asterf_config.h 
Write config file                        : debug/asterc_config.h 
Write config file                        : debug/aster_config.py 
Write config file                        : release/asterf_config.h 
Write config file                        : release/asterc_config.h 
Write config file                        : release/aster_config.py

Another issue is that these installation attempts keep consuming disk space; up to 4 GB are lost so far. Could someone tell me how to free that space and remove the temporary files generated by the build?
Any help in this regard would be appreciated.

Thanks
Anirudh Nehra

#13 Re: Code_Aster installation » Latest version of code aster » 2021-07-09 15:32:21

Hi,
Thanks for the reply. Is the 'default' branch the same as the 'tip' branch?
Is 'tip' the latest rolling release?

Regards
Anirudh

#14 Code_Aster installation » Latest version of code aster » 2021-07-09 02:06:27

Anirudh
Replies: 2

Hi,
I want to install the latest 15.3.xx release of code aster, which has the new solid shell elements, but I don't know where to download it from.
I saw various commits on SourceForge but don't know which branch is the latest.
Please help.

Regards
Anirudh Nehra

#15 Code_Aster development » Solid shell elements in Code Aster » 2021-03-12 19:48:02

Anirudh
Replies: 3

Hi,
Great work by developers indeed.
I just wanted to know when the new non-linear solid shell elements will be introduced in the latest release of Code Aster; I have been waiting for them for some time.

Thanks and Regards
Anirudh Nehra

#16 Re: Code_Aster usage » Different stiffness in compression and tension for K_T_D_L » 2021-01-27 22:47:02

Hi,
Thanks for the reply.
I will check soon and update my findings here.

Regards
Anirudh

#17 Re: Code_Aster usage » Different stiffness in compression and tension for K_T_D_L » 2021-01-26 17:33:57

Hi Mr. Sameer,
It's a simple problem which should have a simple answer. I don't want to turn anything off; it's the physics that puts the spring into compression or tension. You are very new to the forum yet talk in a rude manner. And read my previous posts if you think I have any inclination towards paid software like ANSYS.

#18 Re: Code_Aster usage » Different stiffness in compression and tension for K_T_D_L » 2021-01-25 19:25:22

Hi, thanks for reply.
I don't know how that would work. Suppose that during the simulation both the compression and tension forces are active; how can I separate them or turn them off? They act by themselves.

Regards
Anirudh

#19 Code_Aster usage » Different stiffness in compression and tension for K_T_D_L » 2021-01-24 01:54:36

Anirudh
Replies: 10

Hi,
Is it possible, while using K_T_D_L on SEG2 elements, to assign different stiffness values depending on whether the spring is being pulled in tension or compressed by external forces? I hope I am clear. Say 100 N/mm in tension and 50 N/mm in compression.
Thanks and regards

Anirudh

#20 Re: Code_Aster installation » Code_Aster doesn't quit after the calculus' end » 2020-08-19 19:14:20

Hi,
It's highly likely that you are using PAR_LOT='NON' in the DEBUT command.
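For reference, the keyword in question sits at the top of the .comm command file. A minimal sketch of how it typically appears (Code_Aster command syntax, for illustration only — not runnable outside an Aster study):

```
# Top of the .comm command file
DEBUT(PAR_LOT='NON')

# ... study commands ...

FIN()
```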

Anirudh

#21 Re: Code_Aster usage » Parallel run of parametric study » 2020-08-08 11:00:36

Hi,
Indeed. Your mpi_nbcpu is set to 1; it should be set to the number of MPI processes you want to run on.
Also, since you have 80 cores, it's unlikely that all of them are located on just one chip/processor.
If you set mpi_nbnoeud to 1, Code_Aster will use only one processor chip (otherwise called a node).
In that case, set this value to >1 to use all cores on all available nodes.
Say you have 2 nodes with 40 cores each, so 40 × 2 = 80 cores; then mpi_nbnoeud should be 2 and mpi_nbcpu should be 40.
Also, set ncpus to 2 to use two threads on each core (activating second-level parallelism). It won't cost more RAM, but runs speed up by a factor of roughly 1.6.
Finally, check that you have sufficient RAM. The parameter memory_limit is the maximum memory available to Aster per core, so for 80 cores you should have at least 80 × 10 GB = 800 GB available. I could be wrong in some aspects, though.
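For reference, these parameters live in the astk/as_run export file of the study. A sketch under the 2-node × 40-core assumption above (parameter names as used by as_run; the values are illustrative, and memory_limit is in MB):

```
P mpi_nbnoeud 2
P mpi_nbcpu 40
P ncpus 2
P memory_limit 10240
```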

Regards
Anirudh

#22 Re: Code_Aster usage » [Solved]Advice needed on <EXCEPTION> <FACTOR_78> (DYNA_NON_LINE) » 2020-06-21 16:14:55

@hkboondogle
Please introduce yourself in the introduction section. Your username is not enough to understand what you do.

Regards
Anirudh

#23 Re: Code_Aster usage » Display all the information in the log file. » 2020-05-31 23:36:46

Hi,
I don't think you can do it from the Code_Aster supervisor.
However, you can time each step yourself using Python's time module; something like this around a typical command:

import time

t1 = time.time()
# ... commands to be timed ...
t2 = time.time()
print(t2 - t1)

Code_Aster will then print the time taken after each of the above commands is executed, both in the terminal and in the message file.
Note that you have to use PAR_LOT='NON' in DEBUT for this to take effect; it makes the Code_Aster interpreter work line by line, and it also disables command-file syntax checking.
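The same idea can be tried in plain Python outside Code_Aster. A small context-manager variant (my own sketch, not part of Code_Aster) keeps the timing code tidy and reusable:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    # Measure wall-clock time around a block of statements
    t1 = time.perf_counter()
    yield
    t2 = time.perf_counter()
    print(f"{label}: {t2 - t1:.3f} s")

# Example: time an arbitrary piece of work
with timed("sleep"):
    time.sleep(0.1)
```

time.perf_counter() is preferred over time.time() for intervals because it is monotonic and has higher resolution.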

Regards
Anirudh

#24 Re: Code_Aster usage » State of Mortar Contact method in Code Aster » 2020-04-06 05:28:15

Hi,
Thanks for the reply.
Three years is too long. Is there a way for me to contribute to the mortar friction method to speed up the process? Is there any textual reference that could be used?

Regards
Anirudh Nehra

#25 Re: Code_Aster usage » State of Mortar Contact method in Code Aster » 2020-03-24 15:32:38

Hi,
Thanks for the reply. The CONTINUE LAC method is a mortar method, but it does not support friction.

Regards
Anirudh