Hi,
I am trying to install the parallel version of Code_Aster from SourceForge. I am not using Mercurial.
My $ASTER_ROOT=/home/anirudh/aster_15.2_mpi
I have already installed the sequential 15.2 successfully with PETSc 3.11.4, but when I try to install the parallel version 15.4 (commit 703...), I get the following error:
/usr/bin/ld: /home/anirudh/aster_libs/petsc-3.11.4/linux-metis-mumps/lib/libpetsc.a(pinit.o): relocation R_X86_64_PC32 against symbol `ompi_mpi_2int' can not be used when making a shared object; recompile with -fPIC
/usr/bin/ld: final link failed: bad value
collect2: error: ld returned 1 exit status
I tried to install using
./waf install -p --jobs=6 -v CFLAGS="-fPIC"
but it doesn't work. How can I pass the -fPIC flag to waf, and why doesn't it pick it up automatically?
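For reference, waf normally picks such flags up from the environment at configure time rather than at 'waf install' (a minimal sketch, assuming the standard waf behaviour; note that the linker complains about libpetsc.a itself, so rebuilding PETSc with -fPIC may be the real fix):
source $ASTER_ROOT/15.2/share/aster/profile.sh
export CFLAGS="-fPIC"
export CXXFLAGS="-fPIC"
export FCFLAGS="-fPIC"
./waf configure --use-config-dir=$ASTER_ROOT/15.2/share/aster --use-config=Ubuntu_gnu_mpi --prefix=$ASTER_ROOT/PAR15.4 --without-hg
./waf install -p --jobs=6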
I configured with these options:
source /home/anirudh/aster_15.2_mpi/15.2/share/aster/profile.sh
source $ASTER_ROOT/etc/codeaster/profile.sh (both of the above produce the same result)
./waf configure --use-config-dir=$ASTER_ROOT/15.2/share/aster --use-config=Ubuntu_gnu_mpi --prefix=$ASTER_ROOT/PAR15.4 --without-hg
This is the profile.sh under /home/anirudh/aster_15.2_mpi/15.2/share/aster:
# created by waf using data/wscript
# This file set the environment for code_aster
# This configuration is generated by aster-full package.
# keep path to this file
export WAFBUILD_ENV=$(readlink -n -f ${BASH_SOURCE})
# DEVTOOLS_COMPUTER_ID avoids waf to re-source the environment
export DEVTOOLS_COMPUTER_ID=aster_full
# generic environment: compilers, python
export PATH=/usr/bin:${PATH}
export LD_LIBRARY_PATH=/usr/lib/python3.8/config-3.8-x86_64-linux-gnu:${LD_LIBRARY_PATH}
# seems needed for gcc>=9
export LINKFLAGS="-Wl,--no-as-needed"
# custom configuration
export CONFIG_PARAMETERS_addmem=800
# prerequisites paths
export PYPATH_NUMPY="/usr/lib/python3/dist-packages"
export PYPATH_ASRUN="/home/anirudh/aster_15.2_mpi/lib/python3.8/site-packages"
export LIBPATH_HDF5="/home/anirudh/aster_15.2_mpi/public/hdf5-1.10.3/lib"
export INCLUDES_HDF5="/home/anirudh/aster_15.2_mpi/public/hdf5-1.10.3/include"
export LIBPATH_MED="/home/anirudh/aster_15.2_mpi/public/med-4.0.0/lib"
export INCLUDES_MED="/home/anirudh/aster_15.2_mpi/public/med-4.0.0/include"
export LIBPATH_METIS="/home/anirudh/aster_15.2_mpi/public/metis-5.1.0/lib"
export INCLUDES_METIS="/home/anirudh/aster_15.2_mpi/public/metis-5.1.0/include"
export LIBPATH_SCOTCH="/home/anirudh/aster_15.2_mpi/public/scotch-6.0.4/lib"
export INCLUDES_SCOTCH="/home/anirudh/aster_15.2_mpi/public/scotch-6.0.4/include"
export LIBPATH_MUMPS="/home/anirudh/aster_15.2_mpi/public/mumps-5.2.1/lib"
export INCLUDES_MUMPS="/home/anirudh/aster_15.2_mpi/public/mumps-5.2.1/include /home/anirudh/aster_15.2_mpi/public/mumps-5.2.1/include_seq"
export TFELHOME="/home/anirudh/aster_15.2_mpi/public/tfel-3.2.1"
export TFELVERS="3.2.1"
export LIBPATH_MFRONT="/home/anirudh/aster_15.2_mpi/public/tfel-3.2.1/lib"
export INCLUDES_MFRONT="${TFELHOME}/include"
export PYPATH_MFRONT="${TFELHOME}/lib/python3.8/site-packages"
export INCLUDES_BOOST="/usr/lib/x86_64-linux-gnu/include"
export LIBPATH_BOOST="/usr/lib/x86_64-linux-gnu"
export LIB_BOOST="boost_python38"
export LD_LIBRARY_PATH=${LIBPATH_HDF5}:${LIBPATH_MED}:${LIBPATH_METIS}:${LIBPATH_SCOTCH}:${LIBPATH_MUMPS}:${LIBPATH_MFRONT}:${LIBPATH_BOOST}:${LD_LIBRARY_PATH}
export PYTHONPATH=${PYPATH_NUMPY}:${PYPATH_ASRUN}:${PYPATH_MFRONT}:${PYTHONPATH}
# may be needed: gmsh, miss3d, ecrevisse, salome
export PATH=/home/anirudh/aster_15.2_mpi/public/med-4.0.0/bin:/home/anirudh/aster_15.2_mpi/public/homard-11.12:${TFELHOME}/bin:${PATH}
remove_path()
{
# usage: remove_path value pattern1 [pattern2 [...]]
# Returns the 'value' with excluding the given patterns.
# Example of use: export PATH=$(remove_path "${PATH}" ${HOME}/local)
if [ ${#} -lt 2 ]
then
echo ${1}
return
fi
local values=${1}
shift
local i
for i in ${@}
do
values=$(echo ${values} | tr ":" "\n" | grep -v -F ${i} | tr "\n" ":" | sed -e "s%:\+%:%g;s%^:%%g;s%:$%%g")
done
echo ${values}
}
LD_LIBRARY_PATH=/home/anirudh/aster_15.2_mpi/15.2/lib/aster:${LD_LIBRARY_PATH}:.
export LD_LIBRARY_PATH
# Exclude paths to 'python2.7'.
PYTHONPATH=$(remove_path "${PYTHONPATH}" python2.7)
PYTHONPATH=/home/anirudh/aster_15.2_mpi/15.2/lib/aster:${PYTHONPATH}:.
export PYTHONPATH
# sometimes one should not change PYTHONHOME under SALOME environment...
PYTHONHOME=/usr
export PYTHONHOME
# as PYTHONHOME is changed, path to 'python' must preceed all others if a
# subprocess calls it
PATH=/usr/bin:${PATH}
export PATH
ASTER_LIBDIR=/home/anirudh/aster_15.2_mpi/15.2/lib/aster
export ASTER_LIBDIR
ASTER_DATADIR=/home/anirudh/aster_15.2_mpi/15.2/share/aster
export ASTER_DATADIR
ASTER_LOCALEDIR=/home/anirudh/aster_15.2_mpi/15.2/share/locale/aster
export ASTER_LOCALEDIR
ASTER_ELEMENTSDIR=/home/anirudh/aster_15.2_mpi/15.2/lib/aster
export ASTER_ELEMENTSDIR
This is the profile.sh located under $ASTER_ROOT/etc/codeaster:
# AUTOMATICALLY GENERATED - DO NOT EDIT !
# Put all your changes in profile_local.sh in the same directory
#
# profile.sh : initialize the environment for as_run services
# (sh, ksh, bash syntax)
#
# If variables are depending on Code_Aster version, use ENV_SH in
# the corresponding 'config.txt' file.
#
if [ -z "${ASTER_ROOT}" ]; then
[ -z "${BASH_SOURCE[0]}" ] && here="$0" || here="${BASH_SOURCE[0]}"
ASTER_ROOT=`dirname $(dirname $(dirname $(readlink -f ${here})))`
export ASTER_ROOT
fi
if [ -z "${ASTER_ETC}" ]; then
ASTER_ETC="${ASTER_ROOT}"/etc
if [ "${ASTER_ROOT}" = "/usr" ]; then
ASTER_ETC=/etc
fi
export ASTER_ETC
fi
if [ -z "${PATH}" ]; then
export PATH="$ASTER_ROOT"/bin:"${ASTER_ROOT}"/outils
else
export PATH="${ASTER_ROOT}"/bin:"${ASTER_ROOT}"/outils:"${PATH}"
fi
if [ -z "${LD_LIBRARY_PATH}" ]; then
export LD_LIBRARY_PATH="/usr"/lib
else
export LD_LIBRARY_PATH="/usr"/lib:"${LD_LIBRARY_PATH}"
fi
if [ -z "${PYTHONPATH}" ]; then
export PYTHONPATH="$ASTER_ROOT/lib/python3.8/site-packages"
else
export PYTHONPATH="$ASTER_ROOT/lib/python3.8/site-packages":"${PYTHONPATH}"
fi
export PYTHONEXECUTABLE="/usr/bin/python3"
# this may be required if PYTHONHOME is defined to another location
if [ ! -z "${PYTHONHOME}" ]; then
export PYTHONHOME="/usr"
fi
export WISHEXECUTABLE="/usr/bin/wish"
# define the default temporary directory
# Use profile_local.sh if you need change it!
if [ -z "${ASTER_TMPDIR}" ]; then
export ASTER_TMPDIR=/tmp
fi
# source local profile
if [ -e "${ASTER_ETC}"/codeaster/profile_local.sh ]; then
. "${ASTER_ETC}"/codeaster/profile_local.sh
fi
This is my config output:
./waf configure --use-config-dir=$ASTER_ROOT/15.2/share/aster --use-config=Ubuntu_gnu_mpi --prefix=$ASTER_ROOT/PAR15.4 --without-hg
checking environment... already set: /home/anirudh/aster_15.2_mpi/15.2/share/aster/profile.sh
executing: ./waf.engine configure --use-config-dir=/home/anirudh/aster_15.2_mpi/15.2/share/aster --use-config=Ubuntu_gnu_mpi --prefix=/home/anirudh/aster_15.2_mpi/PAR15.4 --without-hg --out=build/std --jobs=4
Setting top to : /home/anirudh/Desktop/extracts/code_aster_15.4_tip/codeaster-src-703aca508196b74ad104112bfb72a87d1dba78df
Setting out to : /home/anirudh/Desktop/extracts/code_aster_15.4_tip/codeaster-src-703aca508196b74ad104112bfb72a87d1dba78df/build/std
Setting prefix to : /home/anirudh/aster_15.2_mpi/PAR15.4
Searching configuration 'Ubuntu_gnu_mpi'...
Checking for configuration : Ubuntu_gnu_mpi
Checking for 'gcc' (C compiler) : mpicc
Checking for 'g++' (C++ compiler) : mpicxx
Checking for 'gfortran' (Fortran compiler) : mpif90
Checking mpicc package (collect configuration flags) : yes
Checking mpif90 package (collect configuration flags) : yes
Checking for header mpi.h : yes
Checking for C compiler version : gcc 9.3.0
Checking for Fortran compiler version : gfortran 9.3.0
fortran link verbose flag : -v
Checking for OpenMP flag -fopenmp : yes
Checking for program 'python' : /usr/bin/python3
Checking for python version >= 3.5.0 : 3.8.5
python-config : /usr/bin/python3-config
Asking python-config for pyembed '--cflags --libs --ldflags --embed' flags : yes
Testing pyembed configuration : yes
Asking python-config for pyext '--cflags --libs --ldflags' flags : yes
Testing pyext configuration : yes
Checking for python module 'numpy' : 1.17.4
Checking for numpy include : ['/usr/lib/python3/dist-packages/numpy/core/include']
Checking for python module 'asrun' : 2020.0
Checking for python module 'mpi4py' : not found
Getting platform : ASTER_PLATFORM_LINUX64
Checking for library m : yes
Checking for library z : yes
Checking for number of cores : 6
Checking for library openblas : yes
Checking for library superlu : yes
Setting libm after files : nothing done
Checking for library hdf5 : yes
Checking for library z : yes
Checking for header hdf5.h : yes
Checking hdf5 version : 1.10.3
Checking for API hdf5 v18 : default v18
Checking size of hid_t integers : 8
Checking for library med : yes
Checking for library stdc++ : yes
Checking for header med.h : yes
Checking size of med_int integers : 4
Checking size of med_idt integers : 8
Checking med version : 4.0.0
Checking for python module 'med' : not found
Checking for python module 'medcoupling' : not found
Checking for library metis : yes
Checking for header metis.h : yes
Checking metis version : 5.1.0
Checking for library parmetis : yes
Checking parmetis version : 4.0.3
Checking for smumps_struc.h : yes
Checking for dmumps_struc.h : yes
Checking for cmumps_struc.h : yes
Checking for zmumps_struc.h : yes
Checking for mpif.h : yes
Checking mumps version : 5.2.1
Checking size of Mumps integer : 4
Checking for library dmumps : yes
Checking for library zmumps : yes
Checking for library smumps : yes
Checking for library cmumps : yes
Checking for library mumps_common : yes
Checking for library pord : yes
Checking for library metis : yes
Checking for library scalapack : yes
Checking for library openblas : yes
Checking for library esmumps : yes
Checking for library scotch : yes
Checking for library scotcherr : yes
Checking for header scotch.h : yes
Checking scotch version : 6.0.4
Checking for library ['esmumps', 'scotch', 'scotcherr', 'ptscotch', 'ptscotcherr'] : yes
Checking for library petsc : yes
Checking for library HYPRE : yes
Checking for library ml : yes
Checking for header petsc.h : yes
Checking for header petscconf.h : yes
Checking petsc version : 3.11.4p0
Checking size of PETSc integer : 4
Checking value of ASTER_PETSC_64BIT_INDICES : no
Checking value of ASTER_PETSC_HAVE_ML : 1
Checking value of ASTER_PETSC_HAVE_HYPRE : 1
Checking value of ASTER_PETSC_HAVE_SUPERLU : no
Checking value of ASTER_PETSC_HAVE_MUMPS : 1
Checking for python module 'petsc4py' : not found
Reading build preferences from ~/.hgrc : not found
Checking for 'gfortran' (Fortran compiler) : mpif90
Compiling a simple fortran app : yes
Detecting whether we need a dummy main : yes main
Checking for fortran option : yes (-fdefault-double-8 -fdefault-integer-8 -fdefault-real-8)
Checking for fortran option : yes (-Wimplicit-interface)
Checking for fortran option : yes (-Wintrinsic-shadow)
Checking for fortran option : yes (-fno-aggressive-loop-optimizations)
Checking for fortran option : yes (-ffree-line-length-none)
Setting fortran compiler flags : ['-fPIC', '-fdefault-double-8', '-fdefault-integer-8', '-fdefault-real-8', '-Wimplicit-interface', '-Wintrinsic-shadow', '-fno-aggressive-loop-optimizations', '-ffree-line-length-none']
Getting fortran mangling scheme : ok ('_', '', 'lower-case')
Checking size of integer4 : 4
Checking the matching C type : int
Checking size of default integer : 8
Checking the matching C type : long
Checking size of logical : 1
Checking size of simple precision real : 4
Checking the matching C type : float
Checking size of double precision real : 8
Checking the matching C type : double
Checking size of double complex : 16
Setting type for fortran string length : unsigned int
Setting size of blas/lapack integers : 4
Checking size of MPI_Fint integers : 4
Checking fpp stringify using # : no
Checking fpp stringify using "" : yes
Checking compilation with long lines : yes
Check for backtrace feature : yes
Check for tracebackqq feature : no
Getting code_aster version : [(0, 0, 1), 'n/a', 'n/a', '20/07/2021', 'n/a', 1]
Checking for 'g++' (C++ compiler) : mpicxx
Checking for compiler flags -std=c++11 : yes
Checking for library stdc++ : yes
Checking size of C++ bool : 1
Checking for program 'dpkg-architecture' : /usr/bin/dpkg-architecture
Checking boost includes : 1.71.0
Checking boost libs : ok
Checking for boost linkage : ok
Checking for 'gcc' (C compiler) : mpicc
Getting C compiler flags : ['-fPIC']
Check for msgfmt programs : ['/usr/bin/msgfmt']
Check for xgettext programs : ['/usr/bin/xgettext']
Check for msgmerge programs : ['/usr/bin/msgmerge']
Check for lrelease programs : not found
Set parameters for 'config.json' : done
. use 'tmpdir' : /tmp
. use 'addmem' : 800
. use 'python' : python3
. use 'python_interactive' : python3
. use 'mpiexec' : mpiexec -n {mpi_nbcpu} --tag-output {program}
. use 'mpi_get_rank' : echo ${OMPI_COMM_WORLD_RANK}
Checking measure of VmSize during MPI_Init : ok (162116 kB)
Checking for program 'gmsh' : /home/anirudh/aster_libs/gmsh-4.8.4-Linux64/bin/gmsh
Checking for program 'salome' : not found
Checking for program 'salome' : not found
Checking for program 'run_miss3d' : not found
Checking for program 'run_miss3d' : not found
Checking for program 'homard' : /home/anirudh/aster_15.2_mpi/public/homard-11.12/ASTER_HOMARD/homard
Checking for program 'ecrevisse' : not found
Checking for program 'ecrevisse' : not found
Checking for program 'mfront' : not found
Checking for program 'mfront' : not found
Checking for program 'xmgrace' : /usr/bin/xmgrace
Checking for program 'gracebat' : /usr/bin/gracebat
Checking for program 'mdump' : /home/anirudh/aster_15.2_mpi/public/med-4.0.0/bin/mdump
Checking for 'data' repository : /home/anirudh/Desktop/extracts/code_aster_15.4_tip/data not found
Store execution environment : yes
Build list of testcases : done
Checking for program 'dot' : not found
Checking for 'validation' repository : /home/anirudh/Desktop/extracts/code_aster_15.4_tip/validation not found
Setting C debug flags : ['-g', '-O0']
Setting C optimization flags : ['-O2']
Setting C++ debug flags : ['-g', '-O0']
Setting C++ optimization flags : ['-O2']
Setting fortran debug flags : ['-g', '-O0']
Checking loop optimization with LOC : VOLATILE is required
Checking implicit loop in write : '-fno-frontend-optimize' is required
Getting fortran optimization flags : ['-O2', '-fno-frontend-optimize']
Write config file : debug/asterf_config.h
Write config file : debug/asterc_config.h
Write config file : debug/aster_config.py
Write config file : release/asterf_config.h
Write config file : release/asterc_config.h
Write config file : release/aster_config.py
Another issue is that these installation attempts keep consuming disk space; about 4 GB is lost so far. Could someone tell me how to free that space and remove the temporary files generated by the build above?
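For reference, the bulk of that space normally sits in the waf build tree inside the source directory and under ASTER_TMPDIR (/tmp in the profile above); a minimal clean-up sketch:
cd /home/anirudh/Desktop/extracts/code_aster_15.4_tip/codeaster-src-703aca508196b74ad104112bfb72a87d1dba78df
./waf distclean                       # or simply: rm -rf build/
du -sh /tmp/* 2>/dev/null | sort -h   # inspect ASTER_TMPDIR and delete old run directories by hand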
Could someone please help with this?
Thanks
Anirudh Nehra
Also attached: the asterbibc.log.
Hi,
could someone help me?
I always get this error:
/usr/bin/ld: /home/anirudh/aster_libs/petsc-3.11.4/linux-metis-mumps/lib/libpetsc.a(pinit.o): relocation R_X86_64_PC32 against symbol `ompi_mpi_2int' can not be used when making a shared object; recompile with -fPIC
/usr/bin/ld: final link failed: bad value
collect2: error: ld returned 1 exit status
I tried:
export LINKFLAGS=-shared
export CFLAGS=-fPIC
but it does not work.
How do I specify additional flags to waf during the build?
Thanks and Regards
Anirudh
Hello,
It would be greatly appreciated if someone could help with this error.
Regards
Anirudh
Hello,
if it is of any help: I think there is something wrong with your installation of PETSc. Did you follow the guide by Hitoricae (I saw a filename like the one in the description below)? I remember there is a lot to edit when installing PETSc (see the link below under PETSc).
I do not know if this
h_ttps://hitoricae.com/2020/10/31/code_aster-14-6-parallel-version-with-petsc/
is still usable with 15.4.
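For what it is worth, that relocation error usually means the static libpetsc.a was built without position-independent code. A rebuild sketch for petsc-3.11.x is shown below; the option list is an assumption, so keep whatever options your original build used and only add --with-fpic=1:
cd /home/anirudh/aster_libs/petsc-3.11.4
./configure PETSC_ARCH=linux-metis-mumps \
    --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 \
    --with-fpic=1 --with-debugging=0 --with-shared-libraries=0 \
    --download-hypre --download-ml
make PETSC_DIR=$PWD PETSC_ARCH=linux-metis-mumps all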
Do you have another (working) installation of an MPI version? It might be better to start with a 'clean' system. Personally, I will wait for the full working package of 15.4 with all the correct prerequisites and then try to build the MPI version. Sadly, there is no support from EDF for people like you and me. Also, there are people making money from installations of the MPI version, so it is very unlikely you will get help.
Mario.
Hi,
first of all I have the greatest respect for your project.
I was thinking about the error message:
"... can not be used when making a shared object ..."
In my opinion, this means PETSc needs a "distributed memory object", not a "shared memory object".
Distributed-memory objects are needed for MPI parallelization; shared-memory objects, in contrast, are needed for OpenMP parallelization.
I know the MUMPS solver can deal with both kinds of objects, but I'm not sure that PETSc can do the same.
So one hint is to comment out the PETSc solver, or to try to compile the PETSc solver for both memory objects...
Good luck! Please let me know when you have any progress in your project. Thank you.
Volker
METIS produces a shared-memory object, and
SCOTCH, in my opinion, produces both, or at least also a distributed-memory object...
Hi,
Yes, you are right. The problem was that PETSc needs to be a shared-memory object.
I was able to make it to 100%, but there is another problem. The problem is:
stdout:
stderr: Traceback (most recent call last):
File "/home/anirudh/Desktop/extracts/code_aster_mpi_15.2/aster-full-src-15.2.0/SRC/aster-15.2.0/build/std/release/catalo/fort.1", line 1, in <module>
from code_aster.Commands import DEBUT, MAJ_CATA, FIN
File "/home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/code_aster/Commands/__init__.py", line 30, in <module>
from ..Supervis import CO
File "/home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/code_aster/Supervis/__init__.py", line 27, in <module>
from .CommandSyntax import CommandSyntax
File "/home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/code_aster/Supervis/CommandSyntax.py", line 61, in <module>
from ..Objects import DataStructure
File "/home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/code_aster/Objects/__init__.py", line 27, in <module>
from libaster import *
ImportError: /home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/libbibcxx.so: undefined symbol: isdeco_
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
Process name: [[36661,1],0]
Exit code: 1
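If it helps to narrow this down, isdeco_ is presumably a Fortran routine from the aster libraries, so a quick check with binutils (paths taken from the traceback) would be:
nm -D --undefined-only /home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/libbibcxx.so | grep -i isdeco
nm -D -A --defined-only /home/anirudh/aster_15.2_mpi/PAR15.2/lib/aster/*.so | grep -i isdeco   # shows which library, if any, defines the symbol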
I attach the screenshot too.
Please help.
Regards
Anirudh
Hi,
Could someone provide a clue as to what's going wrong?
Do I perhaps need to upgrade gfortran and gcc?
Regards
Anirudh
Hi,
I tried again, but now I am getting this error:
Waf: Leaving directory `/home/anirudh/Desktop/extracts/code_aster_15.4_tip/codeaster-src-703aca508196b74ad104112bfb72a87d1dba78df/build/std/release'
+ build the elements catalog elem.1 using installed aster (from cata_ele.ojb)
stdout: MPI_Init...
calling MPI_Init...
stderr: [anirudh-Inspiron-7591:51568] *** Process received signal ***
[anirudh-Inspiron-7591:51568] Signal: Segmentation fault (11)
[anirudh-Inspiron-7591:51568] Signal code: Address not mapped (1)
[anirudh-Inspiron-7591:51568] Failing at address: 0x90
[anirudh-Inspiron-7591:51568] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x46210)[0x7f4bf924c210]
[anirudh-Inspiron-7591:51568] [ 1] /lib/x86_64-linux-gnu/libblas.so.3(cblas_sdot+0x24)[0x7f4bf08284a4]
[anirudh-Inspiron-7591:51568] [ 2] /usr/lib/python3/dist-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x3e925)[0x7f4be61b0925]
[anirudh-Inspiron-7591:51568] [ 3] /usr/lib/python3/dist-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x237ff7)[0x7f4be63a9ff7]
[anirudh-Inspiron-7591:51568] [ 4] /usr/lib/python3/dist-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xeb997)[0x7f4be625d997]
[anirudh-Inspiron-7591:51568] [ 5] /usr/lib/python3/dist-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xe6d28)[0x7f4be6258d28]
[anirudh-Inspiron-7591:51568] [ 6] /usr/bin/python3[0x504939]
[anirudh-Inspiron-7591:51568] [ 7] /usr/bin/python3(_PyEval_EvalFrameDefault+0x906)[0x56acb6]
[anirudh-Inspiron-7591:51568] [ 8] /usr/bin/python3(_PyFunction_Vectorcall+0x1b6)[0x5f5956]
[anirudh-Inspiron-7591:51568] [ 9] /usr/bin/python3(_PyEval_EvalFrameDefault+0x72f)[0x56aadf]
[anirudh-Inspiron-7591:51568] [10] /usr/bin/python3(_PyEval_EvalCodeWithName+0x26a)[0x568d9a]
[anirudh-Inspiron-7591:51568] [11] /usr/bin/python3(PyEval_EvalCode+0x27)[0x68cdc7]
[anirudh-Inspiron-7591:51568] [12] /usr/bin/python3[0x5ff5d4]
[anirudh-Inspiron-7591:51568] [13] /usr/bin/python3[0x5c3cb0]
[anirudh-Inspiron-7591:51568] [14] /usr/bin/python3(PyVectorcall_Call+0x58)[0x5f2168]
[anirudh-Inspiron-7591:51568] [15] /usr/bin/python3(_PyEval_EvalFrameDefault+0x6552)[0x570902]
[anirudh-Inspiron-7591:51568] [16] /usr/bin/python3(_PyEval_EvalCodeWithName+0x26a)[0x568d9a]
[anirudh-Inspiron-7591:51568] [17] /usr/bin/python3(_PyFunction_Vectorcall+0x393)[0x5f5b33]
[anirudh-Inspiron-7591:51568] [18] /usr/bin/python3(_PyEval_EvalFrameDefault+0x57d7)[0x56fb87]
[anirudh-Inspiron-7591:51568] [19] /usr/bin/python3(_PyFunction_Vectorcall+0x1b6)[0x5f5956]
[anirudh-Inspiron-7591:51568] [20] /usr/bin/python3(_PyEval_EvalFrameDefault+0x906)[0x56acb6]
[anirudh-Inspiron-7591:51568] [21] /usr/bin/python3(_PyFunction_Vectorcall+0x1b6)[0x5f5956]
[anirudh-Inspiron-7591:51568] [22] /usr/bin/python3(_PyEval_EvalFrameDefault+0x72f)[0x56aadf]
[anirudh-Inspiron-7591:51568] [23] /usr/bin/python3(_PyFunction_Vectorcall+0x1b6)[0x5f5956]
[anirudh-Inspiron-7591:51568] [24] /usr/bin/python3(_PyEval_EvalFrameDefault+0x72f)[0x56aadf]
[anirudh-Inspiron-7591:51568] [25] /usr/bin/python3(_PyFunction_Vectorcall+0x1b6)[0x5f5956]
[anirudh-Inspiron-7591:51568] [26] /usr/bin/python3[0x5f34b1]
[anirudh-Inspiron-7591:51568] [27] /usr/bin/python3(_PyObject_CallMethodIdObjArgs+0x198)[0x5f3938]
[anirudh-Inspiron-7591:51568] [28] /usr/bin/python3(PyImport_ImportModuleLevelObject+0x822)[0x551f52]
[anirudh-Inspiron-7591:51568] [29] /usr/bin/python3[0x4f6cb8]
[anirudh-Inspiron-7591:51568] *** End of error message ***
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec noticed that process rank 0 with PID 0 on node anirudh-Inspiron-7591 exited on signal 11 (Segmentation fault)
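The crash happens inside the generic libblas.so.3 called from numpy, while the aster build itself links openblas, so checking which BLAS numpy actually loads may help (a sketch, with the path taken from the backtrace):
python3 -c "import numpy; numpy.show_config()"
ldd /usr/lib/python3/dist-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so | grep -i -E 'blas|lapack'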
Could someone help with this error?
Thank you very much
Anirudh
Dear Anirudh,
I am a new Code_Aster user.
I was trying to compile from source on Ubuntu 20.04,
but I see there are no official instructions to follow.
I was watching the videos by Cyprien Rusu on YouTube,
but I think the procedure has changed.
Currently I have a problem with Python: when I use Python 3.6 I cannot install tfel,
and when I use Python 3.8 I cannot install aster.
Also, in the YouTube video the tfel installation fails.
I will need to use MFront, therefore I need tfel compiled properly.
Could you please share the procedure you used for the sequential installation on Ubuntu 20.04 from source?
Thank you very much in advance
Best Regards,
Nicolò
I attach the installation instructions I followed
Another set of installation instructions that were unsuccessful
Hi,
I installed the sequential version 15.2, whose installation is easy: just download the code_aster stable (aster-full) package and run python3 setup.py.
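Roughly (a sketch; adjust the prefix to your own setup):
cd aster-full-src-15.2.0
python3 setup.py install --prefix=$HOME/aster_15.2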
I am not yet able to install the 15.4 sequential version.
Regards
Anirudh
Hi Anirudh,
for which applications and tasks is the PETSc solver compulsory for you?
Could you also live with the MUMPS solver (for OpenMP and MPI calculations) together with the MULT_FRONT solver (only for OpenMP calculations)?
Greetings Volker
Where can I download the 15.4 version?