Dear AsterO'dactyle,
Thank you for your answer. My question was not meant as criticism; it was a genuine question about the "modern developments" you mentioned. Where are they posted and released? Is the website to be considered outdated? Where should a generic user look for news, documentation and official versions?
About the forum I completely agree with you: it is run by enthusiasts and experts in their free time. So the next question is: how should a user (an academic one, in my case) look for more structured support? Is that possible with Code_Aster, for instance in the form of a formal agreement or paid support?
I am sorry for the long list of questions, but I would really like to understand.
Kind regards,
Corrado
Hello there,
I have been using Code_Aster since 2009. I love the software, and I especially love the documentation that comes with it, which allows everyone to understand the theory behind it. For some time I used the Windows version; then I went back to Linux systems and, I must say, to all the hassle that comes with the installation.
I have noticed that the download page has not been updated since 2020 (version 15.2), while, browsing the forum, I came across this GitLab repository:
https://gitlab.com/codeaster-opensource-documentation
where it seems that many changes have occurred since 2020 (Code_Aster seems to have largely moved to containers; a closer relationship with Salome-Meca; a Python API and a new architecture), which is all a bit confusing. It seems strange that there is no mention of any of this on the official website. So my basic questions are: is the website still maintained? Has Code_Aster been forked by a new team? Where should a generic user look for news, documentation and official versions?
Thank you very much for any kind answer.
Corrado
mess files
Input and result files are at the link "drive google com/drive/folders/1y9RUTvllAZnwOIlHIv4WGgEf7eWfir_K?usp=sharing" (replace the spaces with dots).
Screenshot of the second machine.
Dear All,
I have noticed a strange behaviour with POURSUITE when running the same project on two different machines. The resulting rmed files are different, even though nothing unusual appears in the .mess files.
I attach the files and two screenshots of the resulting files.
I hope someone can help me.
Kind regards,
Corrado
Dear jlucas, thank you for your reply. As I said, this does not work even though the operation I kill is within a try/except block...
Dear All,
I would like to know whether there is a way to write a base folder upon killing a job. I have noticed that CTRL+C kills the job and still allows the output files to be written (at least .mess and .csv), but I have not been able to get the base folder written, which would help me restart the analysis.
My .comm file ends with:
FIN(RETASSAGE='OUI',)
and my .export file has the following line:
R base test.base R 0
If the analysis runs to the end, the base folder is written; if I kill it, this does not happen, even though the operation I kill is inside a try/except block (which, in my understanding, should catch the error raised by the kill and let the analysis reach the FIN command).
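For clarity, this is the overall structure I mean, as a simplified sketch (placeholder names, not my actual file; I am not even sure which exception class the kill actually raises, hence the very broad except clause):

DEBUT(PAR_LOT='NON')    # PAR_LOT='NON' so that Python try/except around commands is allowed

# ... mesh, model, material and loads defined here ...

try:
    RESU = STAT_NON_LINE(MODELE=MODEL,           # the long-running operation that gets killed
                         CHAM_MATER=MAT,
                         EXCIT=_F(CHARGE=LOAD),
                         INCREMENT=_F(LIST_INST=TIMES),)
except BaseException:
    pass                                         # swallow the error and fall through to FIN

FIN(RETASSAGE='OUI')    # expected to write the base, which is exactly what does not happen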
Any help?
Thank you and best regards.
Dear Paolo,
Yes, it is indeed strange. Maybe you need to double-check your boundary conditions? If you are applying a traction in x on the yzp plane, you should fix the yzn plane in x and block any rigid-body motion (for instance, in addition, n1 should be blocked in x, y and z, and n4 in z); a sketch is below. Any other BCs may jeopardise the expected results.
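For what it is worth, this is the kind of constraint set I mean, as a sketch (yzn, n1 and n4 are the names from your description; MODEL and the groups must of course be adapted to your model):

BC = AFFE_CHAR_MECA(MODELE=MODEL,
                    DDL_IMPO=(
                        _F(GROUP_MA='yzn', DX=0.0),                 # face opposite the loaded one, fixed in x
                        _F(GROUP_NO='n1', DX=0.0, DY=0.0, DZ=0.0),  # point constraints to remove
                        _F(GROUP_NO='n4', DZ=0.0),                  # the remaining rigid-body motions
                    ),)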
Best regards,
Corrado
Dear Mario,
I analysed the file you sent. It is true that the integral evaluated by POST_ELEM does not correspond to the total volume, but the sum (including sign) of the W components in the field still gives the right total volume. Also, if I transform your quadratic mesh into a linear mesh, the weight of the single Gauss point of each element contains the element volume, and summing them up I again obtain 8E6, which is the total volume (200 x 200 x 200).
If, however, I project the field onto the nodes:
VOLCH = CREA_CHAMP(TYPE_CHAM='NOEU_GEOM_R',
                   OPERATION='DISC',
                   MODELE=model,
                   CHAM_GD=field,)
the sum of the weights at each node does not correspond to the total volume.
Any idea why this projection disrupts the volume evaluation?
Hello,
I re-open this old post to ask a question on the use of Python.
My question is about the visibility of the objects created in a Python function.
Let us consider a function which creates a field:
def function1():
    Y0CH = CREA_CHAMP(TYPE_CHAM='NOEU_NEUT_R',
                      OPERATION='AFFE',
                      MODELE=MODEL,
                      AFFE=_F(GROUP_MA='all',
                              NOM_CMP=('X1', 'X2', 'X3'),
                              VALE=(0.0, 0.0, 0.0),),)
How can I access Y0CH (including destroying it!) outside the function?
- Option 1 (regular Python): the object is not visible unless I return it at the end of the function with return Y0CH (see the sketch below).
- Option 2 (similar to the INCLUDE operator): the object is already visible; I can access it by its name without returning it.
- Option 3: the object can be returned, but it will be a shallow copy. Any action on the object outside the function (including destroying it) will not change the original object, which keeps the same name as in the function. This means that it is not possible to re-use that name, because it is not possible to destroy it with DETRUIRE.
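To make Option 1 concrete, this is the plain-Python pattern I mean (same function1 as above; MODEL is defined earlier in the command file, and the DETRUIRE syntax is the one I use in my version):

def function1():
    Y0CH = CREA_CHAMP(TYPE_CHAM='NOEU_NEUT_R',
                      OPERATION='AFFE',
                      MODELE=MODEL,
                      AFFE=_F(GROUP_MA='all',
                              NOM_CMP=('X1', 'X2', 'X3'),
                              VALE=(0.0, 0.0, 0.0),),)
    return Y0CH                      # hand the concept back to the caller

Y0 = function1()                     # the caller now holds a reference to the field
DETRUIRE(CONCEPT=_F(NOM=Y0))         # whether this actually frees the concept is
                                     # exactly what Option 3 puts in question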
From the tests I am doing, my understanding is that the correct answer is the last one, which basically prevents the use of functions to create objects that need to be destroyed.
Can anyone explain this point to me in more detail?
Thank you and best regards,
Corrado
Thank you, this is helpful. However, in this way I obtain the total volume of the mesh group, while I would like the volume of each element. I think it is not feasible to create a GROUP_MA for each element and run the POST_ELEM command for each of them. Do you think there is a workaround?
Dear Mario,
Thank you for your reply. To clarify, the "weights" I mentioned are the elementary volumes associated with each Gauss point of the element, whose sum should return the total volume of the element (I am not interested in the mass, just the volume). Hence, it makes perfect sense that altering the density does not affect this value.
My question is whether there is a way to obtain a table or a cham_elem containing the individual element volumes. In other words, if my supposition above is correct, I would only need to sum, element by element, the values I have extracted at the Gauss points. Do you know how to do so? (A sketch of what I have in mind is below.)
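For what it is worth, the kind of post-processing I have in mind is to dump the COOR_ELGA field into a table and sum the W component element by element in Python. A sketch only; the column names 'MAILLE' and 'W' are my assumption of what the table contains and need to be checked:

# dump the Gauss-point field (MCHAM from CALC_CHAM_ELEM / OPTION='COOR_ELGA') into a table
TABW = CREA_TABLE(RESU=_F(CHAM_GD=MCHAM,
                          TOUT='OUI',
                          TOUT_CMP='OUI',),)

cols = TABW.EXTR_TABLE().values()          # dict: column name -> list of values
vol = {}                                   # element name -> summed Gauss-point weights
for elem, w in zip(cols['MAILLE'], cols['W']):
    vol[elem] = vol.get(elem, 0.0) + w     # accumulate the GP weights per element

If there is a cleaner way to obtain this directly as a cham_elem, even better.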
Thank you,
Corrado
Dear All,
Any idea on the topic?
Thank you and best regards,
Corrado
Dear All,
I would like to extract the volume of each element in the mesh in a CHAMP. So far, I have learnt how to extract the weights at each Gauss point with:
MCHAM = CALC_CHAM_ELEM(MODELE=MODEL,
                       OPTION='COOR_ELGA',
                       GROUP_MA='my_group',)
but I do not know how to sum them up.
Thank you in advance for any help,
Corrado
Dear All,
I would like to know how to calibrate the parameter D_SIGM_EPSI in ENDO_ISOT_BETON based on more physical parameters (the fracture energy, for instance). From my understanding, this parameter governs the mechanical behaviour in terms of stress-strain rather than stress-displacement, hence my uncertainty.
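To make the question concrete: my current reasoning is the usual crack-band argument, i.e. relate the fracture energy to the energy dissipated per unit volume through a characteristic element size h, and then back out the softening slope. For a linear stress-strain softening branch this would give something like

G_f = h \, g_f, \qquad g_f = \frac{f_t^2}{2E} + \frac{f_t^2}{2|H|} \quad\Rightarrow\quad |H| = \frac{f_t^2}{2 G_f / h - f_t^2 / E}

with f_t the tensile strength (SYT), E the Young's modulus and H the softening slope that D_SIGM_EPSI would represent (negative). This is only my assumption: I do not know whether ENDO_ISOT_BETON uses exactly this convention or regularises the softening internally, which is precisely what I would like to clarify.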
I also tried the non-local feature, but, as it requires quadratic elements, it is not compatible with the IMPLEX method. Can anyone confirm this?
Thank you and best regards,
Corrado
Dear All,
I am trying to perform a push-over analysis on a 3D solid structure, modelled with BETON_DOUBLE_DP material. The evolution of the analysis is controlled by "LONG_ARC" pilotage.
At some point of the analysis, I get the error:
<Erreur> Échec in control
Even when subdividing the time step or increasing the COEF_MULT of my analysis, I cannot go forward.
Can anyone explain to me what it exactly means and how I can overcome it?
Thinking it was due to an ill-conditioned tangent matrix, in case of non-convergence at a step (with substepping) I added a further switch to modified Newton-Raphson with the initial elastic matrix and an increased COEF_MULT, but apparently this does not solve the problem. I attach the .mess file.
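For reference, a generic sketch of the control block I am referring to (names and values are placeholders, not my actual file):

RESU = STAT_NON_LINE(MODELE=MODEL,
                     CHAM_MATER=MAT,
                     EXCIT=_F(CHARGE=PUSH, TYPE_CHARGE='FIXE_PILO'),      # piloted load
                     COMPORTEMENT=_F(RELATION='BETON_DOUBLE_DP', TOUT='OUI'),
                     INCREMENT=_F(LIST_INST=TIMES),
                     PILOTAGE=_F(TYPE='LONG_ARC',
                                 COEF_MULT=10.0,            # the coefficient I keep increasing
                                 GROUP_NO='ctrl_grp',       # nodes measuring the arc length
                                 NOM_CMP=('DX', 'DY', 'DZ'),),
                     NEWTON=_F(MATRICE='TANGENTE', REAC_ITER=1),)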
Thank you for your help.
Corrado
Thank you for your reply, but note that these are also found:
Checking for liblapack.a... /galileo/prod/opt/libraries/lapack/3.6.1/gnu--6.1.0/lib/liblapack.a
Checking for libblas.a... /galileo/prod/opt/libraries/blas/3.6.0/gnu--6.1.0/lib/libblas.a
Dear all,
I am trying to compile Code_Aster on CentOS (HPC). I cannot get past the med installation:
--------------------------------------------------------------------------------
Code_Aster Setup version 14.4.0-1
Copyright (c) 2001-2019 EDF R&D - http://www.code-aster.org
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
Command line :
/cineca/prod/opt/compilers/python/3.6.4/none/bin/python3.6 setup.py --nocache --prefix=/galileo/home/userexternal/cchisari/bin/aster install hdf5 med
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
Reading config file '/galileo/home/userexternal/cchisari/bin/aster-full-src-14.4.0/setup.cfg'...
ASTER_ROOT (from cfg) : '/opt/aster'
PREFER_COMPILER (from cfg) : 'GNU'
MAXDEPTH (from cfg) : 5
USE_LOCATE (from cfg) : False
PREFER_SHARED_LIBS (from cfg) : False
--------------------------------------------------------------------------------
ASTER_ROOT (from arguments) : /galileo/home/userexternal/cchisari/bin/aster
--------------------------------------------------------------------------------
Installation on :
Kernel on an
Linux r033c01s03 3.10.0-693.21.1.el7.x86_64 #1 SMP Wed Mar 7 19:03:37 UTC 2018 x86_64
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
Checking for max command length... 32768.0
Checking for file... /usr/bin/file
Checking for ar... /usr/bin/ar
Checking for architecture... Linux / posix / x86_64
Checking for number of processors (core)... 36 (will use: make -j 8)
Checking for Code_Aster platform type... LINUX64
Checking for bash... /usr/bin/bash
Checking for Python version... 3.6.4
Checking for python3.6... no
Checking for libpython3.6.so... /galileo/prod/opt/compilers/python/3.6.4/none/lib/libpython3.6.so
Checking for Python.h... /usr/include/python2.7/Python.h
Checking for gcc... /usr/bin/gcc
Checking for /usr/bin/gcc configured installation directory... /usr/lib/gcc/x86_64-redhat-linux/4.8.5, /usr/bin
Checking for libpthread.so... /usr/lib/x86_64-redhat-linux6E/lib64/libpthread.so
Checking for libz.so... /usr/lib64/libz.so
Checking for libdl.so... /usr/lib/x86_64-redhat-linux6E/lib64/libdl.so
Checking for libutil.so... /usr/lib/x86_64-redhat-linux6E/lib64/libutil.so
Checking for libm.so... /usr/lib/x86_64-redhat-linux6E/lib64/libm.so
--------------------------------------------------------------------------------
Checking for default compiler (for all products)...
Checking for GNU compilers...
Checking for gcc... /usr/bin/gcc
Checking for compiler version... gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-16)
Checking for g++... /usr/bin/g++
Checking for compiler version... g++ (GCC) 4.8.5 20150623 (Red Hat 4.8.5-16)
Checking for gfortran... /usr/bin/gfortran
Checking for compiler version... GNU Fortran (GCC) 4.8.5 20150623 (Red Hat 4.8.5-16)
Checking for pthread... -L/usr/lib/x86_64-redhat-linux6E/lib64 -lpthread (already found)
Checking for z... -L/usr/lib64 -lz (already found)
Checking for liblapack.a... /galileo/prod/opt/libraries/lapack/3.6.1/gnu--6.1.0/lib/liblapack.a
Checking for libblas.a... /galileo/prod/opt/libraries/blas/3.6.0/gnu--6.1.0/lib/libblas.a
Checking for libstdc++.so... /usr/lib/gcc/x86_64-redhat-linux/4.8.2/libstdc++.so
<E>_ABNORMAL_ABORT error reading profile : /tmp/system.36354/temp.opt_env
Checking for CC (/usr/bin/gcc) supports '-fno-stack-protector' option... yes
Checking for fortran program if the gcc bug #51267 is fixed (using VOLATILE)... no
Checking for F90 (/usr/bin/gfortran) supports '-fno-tree-dse' option... yes
Checking for fortran program if the gcc bug #51267 is fixed with -fno-tree-dse option... no
---------- ERROR MESSAGE ----------
/usr/lib/gcc/x86_64-redhat-linux/4.8.2/libstdc++.so: undefined reference to `memcpy@GLIBC_2.14'
/usr/lib/gcc/x86_64-redhat-linux/4.8.2/libgfortran.so: undefined reference to `clock_gettime@GLIBC_2.17'
/usr/lib/gcc/x86_64-redhat-linux/4.8.2/libgfortran.so: undefined reference to `secure_getenv@GLIBC_2.17'
collect2: error: ld returned 1 exit status
-------------------------------------------------------------------------------
WARNING :
The fortran test program checking the LOC function with a loop failed.
Reasons :
- it is known to fail using GNU Fortran 4.6.1 (and may be other releases)
but it should be fixed using '-fno-tree-dse option'.
Code_Aster will be compiled without error but will be unusable!
You must choose another compiler or change the optimization level.
You can cancel now or make the changes later in the config.txt file of
Code_Aster and rebuild it.
-------------------------------------------------------------------------------
Checking for C/fortran program using blas/lapack... no
---------- ERROR MESSAGE ----------
/usr/lib/gcc/x86_64-redhat-linux/4.8.2/libstdc++.so: undefined reference to `memcpy@GLIBC_2.14'
/usr/lib/gcc/x86_64-redhat-linux/4.8.2/libgfortran.so: undefined reference to `clock_gettime@GLIBC_2.17'
/usr/lib/gcc/x86_64-redhat-linux/4.8.2/libgfortran.so: undefined reference to `secure_getenv@GLIBC_2.17'
collect2: error: ld returned 1 exit status
-------------------------------------------------------------------------------
WARNING :
The C/fortran test program calling blas and lapack subroutines failed.
Reasons :
- unable to find suitable C/fortran compilers
- blas/lapack libraries (or required by them) missing
- incorrect compilation options
Nevertheless the compilation of Code_Aster may work !
If it failed, you must help the setup by setting CC, CFLAGS, MATHLIB...
-------------------------------------------------------------------------------
Checking for GNU compilers... yes
Checking for global values...
Compiler variables (set as environment variables):
export CC='/usr/bin/gcc'
export CFLAGS='-O2 -fno-stack-protector -fPIC'
export CFLAGS_DBG='-g -fno-stack-protector -fPIC'
export CFLAGS_OPENMP='-fopenmp'
export CXX='/usr/bin/g++'
export CXXLIB='-L/usr/lib/gcc/x86_64-redhat-linux/4.8.2 -lstdc++'
export DEFINED='LINUX64 _USE_OPENMP'
export F90='/usr/bin/gfortran'
export F90FLAGS='-O2 -fPIC -fno-tree-dse'
export F90FLAGS_DBG='-g -fPIC'
export F90FLAGS_I8=' -fdefault-double-8 -fdefault-integer-8 -fdefault-real-8'
export F90FLAGS_OPENMP=' -fopenmp'
export FFLAGS_I8=' -fdefault-double-8 -fdefault-integer-8 -fdefault-real-8'
export LD='/usr/bin/gfortran'
export LDFLAGS_OPENMP=' -fopenmp'
export MATHLIB='/galileo/prod/opt/libraries/lapack/3.6.1/gnu--6.1.0/lib/liblapack.a /galileo/prod/opt/libraries/blas/3.6.0/gnu--6.1.0/lib/libblas.a'
export OTHERLIB='-L/usr/lib/x86_64-redhat-linux6E/lib64 -lpthread -L/usr/lib64 -lz'
# Environment settings :
--------------------------------------------------------------------------------
Checking for ps... /usr/bin/ps
Checking for xterm... /usr/bin/xterm
Checking for nedit... no
Checking for geany... no
Checking for gvim... no
Checking for gedit... /usr/bin/gedit
Checking for gdb... /usr/bin/gdb
Checking for ddd... no
Checking for flex... /usr/bin/flex
Checking for ranlib... /usr/bin/ranlib
Checking for bison... /usr/bin/bison
Checking for cmake... /galileo/prod/opt/tools/cmake/3.12.0/none/bin/cmake
--------------------------------------------------------------------------------
Checking for host name... r033c01s03
Checking for network domain name... galileo.cineca.it
Checking for full qualified network name... r033c01s03.galileo.cineca.it
--------------------------------------------------------------------------------
Checking for dependencies and required variables for '__main__'... [ OK ]
--------------------------------------------------------------------------------
Checking for dependencies and required variables for '__cfg__'... [ OK ]
Filling cache... [ OK ]
--------------------------------------------------------------------------------
Check if found values seem correct. If not you can change them using 'setup.cfg'.
--------------------------------------------------------------------------------
Compiler variables for hdf5 (set as environment variables):
export CC='/usr/bin/gcc'
export CFLAGS='-O2 -fno-stack-protector -fPIC'
export CFLAGS_DBG='-g -fno-stack-protector -fPIC'
export CFLAGS_OPENMP='-fopenmp'
export CXX='/usr/bin/g++'
export CXXLIB='-L/usr/lib/gcc/x86_64-redhat-linux/4.8.2 -lstdc++'
export DEFINED='LINUX64 _USE_OPENMP'
export F90='/usr/bin/gfortran'
export F90FLAGS='-O2 -fPIC -fno-tree-dse'
export F90FLAGS_DBG='-g -fPIC'
export F90FLAGS_I8=' -fdefault-double-8 -fdefault-integer-8 -fdefault-real-8'
export F90FLAGS_OPENMP=' -fopenmp'
export FFLAGS_I8=' -fdefault-double-8 -fdefault-integer-8 -fdefault-real-8'
export LD='/usr/bin/gfortran'
export LDFLAGS_OPENMP=' -fopenmp'
export MATHLIB='/galileo/prod/opt/libraries/lapack/3.6.1/gnu--6.1.0/lib/liblapack.a /galileo/prod/opt/libraries/blas/3.6.0/gnu--6.1.0/lib/libblas.a'
export OTHERLIB='-L/usr/lib/x86_64-redhat-linux6E/lib64 -lpthread -L/usr/lib64 -lz'
# Environment settings :
--------------------------------------------------------------------------------
Checking for dependencies and required variables for 'hdf5'... [ OK ]
--------------------------------------------------------------------------------
Installation of : hdf5 1.10.3
HDF5 is a Hierarchical Data Format product consisting of a data format
specification and a supporting library implementation. HDF5 is designed to
address some of the limitations of the older HDF product and to address current
and anticipated requirements of modern systems and applications.
Archive filename : hdf5-1.10.3
Destination : /galileo/home/userexternal/cchisari/bin/aster/public/hdf5-1.10.3
Working directory : /tmp/install_hdf5.36354
--------------------------------------------------------------------------------
Filling cache... [ OK ]
Checking permissions... [ OK ]
>>> Extraction <<<
entering directory '/tmp/install_hdf5.36354'
Extracting hdf5-1.10.3.tar.gz... [ OK ]
--- 3206 files extracted
leaving directory '/tmp/install_hdf5.36354'
>>> Configuration <<<
entering directory '/tmp/install_hdf5.36354/hdf5-1.10.3'
Command line : unset LD ; CFLAGS=-std=gnu9x ./configure --prefix=/galileo/home/userexternal/cchisari/bin/aster/public/hdf5-1.10.3
configure hdf5 installation...
Command output :
configure hdf5 installation... [ OK ]
leaving directory '/tmp/install_hdf5.36354/hdf5-1.10.3'
>>> Building the product <<<
entering directory '/tmp/install_hdf5.36354/hdf5-1.10.3'
Command line : make -j 8
compiling hdf5...
Command output :
compiling hdf5... [ OK ]
leaving directory '/tmp/install_hdf5.36354/hdf5-1.10.3'
>>> Installation <<<
entering directory '/tmp/install_hdf5.36354/hdf5-1.10.3'
Command line : make install
installing hdf5 to /galileo/home/userexternal/cchisari/bin/aster/public/hdf5-1.10.3...
Command output :
installing hdf5 to /galileo/home/userexternal/cchisari/bin/aster/public/hdf5-1.10.3...
[ OK ]
leaving directory '/tmp/install_hdf5.36354/hdf5-1.10.3'
>>> Installation <<<
entering directory '/galileo/home/userexternal/cchisari/bin/aster/public/hdf5-1.10.3'
leaving directory '/galileo/home/userexternal/cchisari/bin/aster/public/hdf5-1.10.3'
>>> Clean temporary objects <<<
entering directory '/tmp/install_hdf5.36354'
deleting /tmp/install_hdf5.36354/hdf5-1.10.3... [ OK ]
deleting /tmp/install_hdf5.36354...
leaving directory '/tmp/install_hdf5.36354'
Filling cache... [ OK ]
--------------------------------------------------------------------------------
Installation of hdf5 1.10.3 successfully completed
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
Compiler variables for med (set as environment variables):
export CC='/usr/bin/gcc'
export CFLAGS='-O2 -fno-stack-protector -fPIC'
export CFLAGS_DBG='-g -fno-stack-protector -fPIC'
export CFLAGS_OPENMP='-fopenmp'
export CXX='/usr/bin/g++'
export CXXLIB='-L/usr/lib/gcc/x86_64-redhat-linux/4.8.2 -lstdc++'
export DEFINED='LINUX64 _USE_OPENMP'
export F90='/usr/bin/gfortran'
export F90FLAGS='-O2 -fPIC -fno-tree-dse'
export F90FLAGS_DBG='-g -fPIC'
export F90FLAGS_I8=' -fdefault-double-8 -fdefault-integer-8 -fdefault-real-8'
export F90FLAGS_OPENMP=' -fopenmp'
export FFLAGS_I8=' -fdefault-double-8 -fdefault-integer-8 -fdefault-real-8'
export LD='/usr/bin/gfortran'
export LDFLAGS_OPENMP=' -fopenmp'
export MATHLIB='/galileo/prod/opt/libraries/lapack/3.6.1/gnu--6.1.0/lib/liblapack.a /galileo/prod/opt/libraries/blas/3.6.0/gnu--6.1.0/lib/libblas.a'
export OTHERLIB='-L/usr/lib/x86_64-redhat-linux6E/lib64 -lpthread -L/usr/lib64 -lz'
# Environment settings :
--------------------------------------------------------------------------------
Checking for dependencies and required variables for 'med'... [ OK ]
--------------------------------------------------------------------------------
Installation of : med 4.0.0
MED-fichier (Modelisation et Echanges de Donnees, in English Modelisation
and Data Exchange) is a library to store and exchange meshed data or computation results.
It uses the HDF5 file format to store the data.
Archive filename : med-4.0.0
Destination : /galileo/home/userexternal/cchisari/bin/aster/public/med-4.0.0
Working directory : /tmp/install_med.36354
--------------------------------------------------------------------------------
Filling cache... [ OK ]
Checking permissions... [ OK ]
>>> Extraction <<<
entering directory '/tmp/install_med.36354'
Extracting med-4.0.0.tar.gz... [ OK ]
--- 6706 files extracted
leaving directory '/tmp/install_med.36354'
>>> Configuration <<<
entering directory '/tmp/install_med.36354/med-4.0.0'
Command line : unset LD ; export LDFLAGS='-Wl,--no-as-needed -L/usr/lib/x86_64-redhat-linux6E/lib64 -lpthread -L/usr/lib64 -lz -L/usr/lib/gcc/x86_64-redhat-linux/4.8.2 -lstdc++' ; export F77=$F90; export CXXFLAGS='-std=gnu++98'; export PATH=/tmp/tmpiida9huw:${PATH} ; ./configure --disable-mesgerr --with-hdf5=/galileo/home/userexternal/cchisari/bin/aster/public/hdf5-1.10.3 --prefix=/galileo/home/userexternal/cchisari/bin/aster/public/med-4.0.0
configure med installation...
Command output :
configure med installation... [FAILED]
Exit code : 77
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /usr/bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
checking whether make supports nested variables... yes
checking whether UID '32460' is supported by ustar format... yes
checking whether GID '25200' is supported by ustar format... yes
checking how to create a ustar tar archive... gnutar
configure: Trying /galileo/home/userexternal/cchisari/bin/aster/public/hdf5-1.10.3 home path for searching H5pubconf.h file.
checking for a sed that does not truncate output... /usr/bin/sed
checking whether to compile C using MPI... no
checking for style of include used by make... GNU
checking for gcc... /usr/bin/gcc
checking whether the C compiler works... no
configure: error: in `/tmp/install_med.36354/med-4.0.0':
configure: error: C compiler cannot create executables
See `config.log' for more details
EXIT_COMMAND_36354_00000074=77
*** Exception raised : error during configure
--------------------------------------------------------------------------------
SUMMARY OF INSTALLATION
--------------------------------------------------------------------------------
Installation of : hdf5 1.10.3
Destination : /galileo/home/userexternal/cchisari/bin/aster/public/hdf5-1.10.3
Elapsed time : 119.79 s
[ OK ]
Installation of : med 4.0.0
Destination : /galileo/home/userexternal/cchisari/bin/aster/public/med-4.0.0
Elapsed time : 3.68 s
*** Exception None raised : None
See detailed traceback in the logfile
[FAILED]
Exit code : 4
Installation of : Code_Aster + 2 of its prerequisites
Destination : /galileo/home/userexternal/cchisari/bin/aster
Elapsed time : 139.21 s
[ OK ]
I cannot understand what the error is about. Is anyone able to help please?
I also attach the setup.dbg file.
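In case it helps: following the hint in the warning above ("you must help the setup by setting CC, CFLAGS, MATHLIB..."), I suppose these could be overridden in setup.cfg with something like the lines below (the key names are my assumption, taken from the installer's message; the paths are the ones it detected), but I am not sure this is the right place, hence my question.

# hypothetical overrides in setup.cfg (Python syntax, like ASTER_ROOT above)
CC = '/usr/bin/gcc'
CFLAGS = '-O2 -fno-stack-protector -fPIC'
MATHLIB = ('/galileo/prod/opt/libraries/lapack/3.6.1/gnu--6.1.0/lib/liblapack.a '
           '/galileo/prod/opt/libraries/blas/3.6.0/gnu--6.1.0/lib/libblas.a')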
Thank you very much,
Corrado
Any suggestions? I would like to model a quasi-static, monotonic applied displacement. From my previous experience, I have found that a nonlinear dynamic analysis lets me overcome many of the convergence issues I encounter with static procedures, thanks to the stabilising effect of the mass matrix. To do so, I need to convert the displacement history into an acceleration history with appropriate initial velocity conditions.
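For clarity, the conversion itself is straightforward; this is what I mean, as a plain Python/numpy sketch (the linear ramp is just an illustrative example):

import numpy as np

# example: quasi-static linear ramp of the imposed displacement
t = np.linspace(0.0, 100.0, 1001)   # time axis
d = 0.1 * t                         # displacement history d(t)

v = np.gradient(d, t)               # velocity by numerical differentiation
a = np.gradient(v, t)               # acceleration history to be applied
v0 = v[0]                           # initial velocity to impose via ETAT_INIT

For a linear ramp the acceleration is zero and only the initial velocity matters; for a general displacement history, though, I also need to apply a(t) itself, which is where I am stuck.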
Has anyone done the same before?
Thanks,
Corrado
Dear All,
I would like to apply an acceleration history plus an initial velocity to some nodes of my model in a DYNA_NON_LINE command. I applied the initial velocity by means of ETAT_INIT=_F(VITE=H_VEL), where:
H_VEL = CREA_CHAMP(TYPE_CHAM='NOEU_DEPL_R',
                   OPERATION='AFFE',
                   MODELE=MODEL,
                   AFFE=(_F(TOUT='OUI', NOM_CMP=('DX', 'DY', 'DZ'), VALE=(0., 0., 0.),),
                         _F(GROUP_NO=load_g, NOM_CMP=('DX',), VALE=(0.1,),),),)
However, I cannot understand how to apply the acceleration history; the only relevant topic on the forum seems to be this one: /forum2/viewtopic.php?id=19190 but it ended without any suggestion.
Could anyone help me?
Thank you and best regards,
Corrado
Many thanks!
Corrado
Dear All,
I would like to create a system of forces to be applied in a push-over analysis. Such a system of forces should correspond to a mode of vibration, i.e. it should be equal to:
f = M phi
where f is the system of forces, M is the mass matrix and phi is the mode of vibration. After a modal analysis, I have extracted the mode:
disp1 = CREA_CHAMP(TYPE_CHAM='NOEU_DEPL_R',
                   OPERATION='EXTR',
                   RESULTAT=MODES,
                   NOM_CHAM='DEPL',
                   NUME_MODE=1,)
I then create a force system:
MODE1p = AFFE_CHAR_MECA(MODELE=MODEL,
                        VECT_ASSE=disp1,)
However, in my understanding, this creates a force system that is proportional to the first mode but lacks the multiplication by the mass matrix. How can I achieve this? (A sketch of what I am thinking of is below.)
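What I have in mind, as an untested sketch (the assembly step and the PROD_MATR_CHAM usage are my assumptions; CARA_ELEM/CHARGE may be needed in ASSEMBLAGE depending on the model):

# assemble the mass matrix on the same numbering as the modal analysis,
# multiply it by the extracted mode, and use the product as an assembled load
ASSEMBLAGE(MODELE=MODEL,
           CHAM_MATER=MAT,
           NUME_DDL=CO('NUMDDL'),
           MATR_ASSE=_F(MATRICE=CO('MASSE'), OPTION='MASS_MECA'),)

FMODE = PROD_MATR_CHAM(MATR_ASSE=MASSE,   # f = M * phi
                       CHAM_NO=disp1,)

MODE1p = AFFE_CHAR_MECA(MODELE=MODEL,
                        VECT_ASSE=FMODE,)

Is this the intended way, or is there a more direct option?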
Thank you in advance and best regards,
Corrado
Dear Voulet2,
Yes, you are right. The smaller it is, the larger the arc-length step, so potentially the faster the analysis. Of course this means that at "difficult" time steps convergence becomes harder to achieve. I tried a smaller COEF_MULT, but I could not get convergence any more than with larger steps. My issue is within the single step, where convergence is not quadratic but apparently linear; increasing the step will not help, in my opinion.
Thank you,
Corrado
And here is my force-displacement plot.