357 results for Voronoi Meshes
Abstract:
Post-processing a finite element solution is a well-known technique consisting in a recalculation of the originally obtained quantities so that the rate of convergence increases without the need for expensive remeshing techniques. Post-processing is especially effective in problems where better accuracy is required for derivatives of nodal variables in regions where a Dirichlet (essential) boundary condition is imposed strongly. Consequently, such an approach is particularly well suited to the modelling of resin infiltration under the quasi-steady-state assumption with remeshing techniques and explicit time integration, because only the free-front normal velocities are needed to advance the resin front to its next position. The new contribution is the post-processing analysis and implementation of the free-boundary velocities for meso-level infiltration analysis. This implementation ensures better accuracy even on coarser meshes, which reduces the computational time and also allows larger time steps to be employed.
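As a rough sketch of the explicit front-advance step this abstract relies on (the front geometry, the nodal normal velocities and the time step below are hypothetical placeholders, not quantities from the paper), each vertex of a polygonal resin front can be moved along its outward normal with an explicit Euler update:

```python
import numpy as np

def advance_front(points, v_n, dt):
    """Explicit Euler advance of a closed polygonal front.

    points : (N, 2) array of front vertices, ordered counter-clockwise
    v_n    : (N,) array of normal velocities at the vertices
    dt     : time step
    """
    # Central-difference tangents along the closed polyline
    tangents = np.roll(points, -1, axis=0) - np.roll(points, 1, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    # Outward unit normals (rotate tangents by -90 degrees for CCW ordering)
    normals = np.column_stack((tangents[:, 1], -tangents[:, 0]))
    # Move each vertex along its normal by v_n * dt
    return points + dt * v_n[:, None] * normals

# Hypothetical example: a circular front expanding with unit normal velocity
theta = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
front = np.column_stack((np.cos(theta), np.sin(theta)))
front = advance_front(front, v_n=np.ones(len(front)), dt=0.01)
```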
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
Dissertation submitted to obtain the Degree of Master in Electrical and Computer Engineering.
Abstract:
The models to be analysed with the Finite Element Method are increasingly complex and, nowadays, it would be unthinkable to carry out such analyses without computational support. A wide range of programs exists for this purpose, covering tasks that range from the drawing of structures, static load analysis, and dynamic and vibration analysis to the real-time visualization of physical behaviour (deformations), all of which allow the structure to be optimized. This thesis arises with the goal of allowing any user to analyse simple structures with the Finite Element Method: a program with a graphical interface is built from scratch in the MATLAB® environment for the analysis of simple structures with two types of finite element, the constant-strain triangle and the linear-strain quadrilateral. The developed software, verified by comparison with a dedicated commercial package, meshes the geometry with two-dimensional triangular and quadrilateral elements, solves user-defined models with the Finite Element Method, and presents the results graphically and in tabular form.
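As a minimal illustration of the constant-strain triangle used in the thesis (written in Python rather than MATLAB®, and not the thesis code; the material values below are arbitrary placeholders), the element stiffness matrix under plane-stress assumptions can be assembled as follows:

```python
import numpy as np

def cst_stiffness(xy, E, nu, t):
    """Stiffness matrix (6x6) of a 3-node constant-strain triangle, plane stress.

    xy : (3, 2) array of nodal coordinates (counter-clockwise)
    E  : Young's modulus, nu : Poisson's ratio, t : thickness
    """
    (x1, y1), (x2, y2), (x3, y3) = xy
    # Twice the signed area of the triangle
    A2 = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    # Shape-function derivative coefficients
    b = np.array([y2 - y3, y3 - y1, y1 - y2])
    c = np.array([x3 - x2, x1 - x3, x2 - x1])
    # Strain-displacement matrix B (3x6): strains are constant over the element
    B = np.zeros((3, 6))
    B[0, 0::2] = b
    B[1, 1::2] = c
    B[2, 0::2] = c
    B[2, 1::2] = b
    B /= A2
    # Plane-stress constitutive matrix
    D = E / (1.0 - nu**2) * np.array([[1.0, nu, 0.0],
                                      [nu, 1.0, 0.0],
                                      [0.0, 0.0, (1.0 - nu) / 2.0]])
    # K = t * A * B^T D B
    return t * 0.5 * A2 * B.T @ D @ B

# Hypothetical element: unit right triangle with steel-like properties
K = cst_stiffness(np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]),
                  E=210e9, nu=0.3, t=0.01)
```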
Abstract:
Dissertation presented to obtain the Degree of Master in Molecular Genetics and Biomedicine at Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.
Abstract:
The vulnerability of masonry infill walls has been highlighted in recent earthquakes, in which severe in-plane damage and out-of-plane collapse developed, justifying the investment in strengthening solutions aimed at improving the seismic performance of these construction elements. Therefore, this work presents an innovative strengthening solution to be applied to masonry infill walls, in order to avoid brittle failure and thus minimize material damage and human losses. The textile-reinforced mortar (TRM) technique has been shown to improve the out-of-plane resistance of masonry and to enhance its ductility, and here an innovative reinforcing mesh composed of braided composite rods is proposed. The external part of the rod is composed of braided polyester whose structure is defined so that the bond with the mortar is optimized. The mechanical performance of the strengthening technique in improving the out-of-plane behaviour of brick masonry is assessed through experimental bending tests. Additionally, a comparison of the mechanical behaviour of the proposed meshes with commercial meshes is provided. The aim is for the proposed meshes to be effective in avoiding brittle collapse and premature disintegration of brick masonry during seismic events.
Abstract:
In the present work, the benefits of using graphics processing units (GPU) to aid the design of complex-geometry profile extrusion dies are studied. For that purpose, a 3D finite volume based code that employs unstructured meshes to solve and couple the continuity, momentum and energy conservation equations governing the fluid flow, together with a constitutive equation, was used. To evaluate the possibility of reducing the time spent on the numerical calculations, the numerical code was parallelized on the GPU, using a simple programming approach without complex memory manipulations. For verification purposes, simulations were performed for three benchmark problems: Poiseuille flow, lid-driven cavity flow and flow around a cylinder. Subsequently, the code was used in the design of two real-life extrusion dies, for the production of a medical catheter and of a wood-plastic composite decking profile. To evaluate the benefits, the results obtained with the GPU-parallelized code were compared, in terms of speedup, with a serial implementation of the same code that traditionally runs on the central processing unit (CPU). The results obtained show that, even with the simple parallelization approach employed, it was possible to obtain a significant reduction of the computation times.
Abstract:
The usual high cost of commercial codes, and some technical limitations, clearly limit the employment of numerical modelling tools in both industry and academia. Consequently, the number of companies that use numerical codes is limited, and a lot of effort is put into the development and maintenance of in-house academic codes. Having in mind the potential of numerical modelling tools as a design aid for both products and processes, different research teams have been contributing to the development of open-source codes/libraries. In this framework, any individual can take advantage of the available code capabilities and/or implement additional features based on their specific needs. These types of codes are usually developed by large communities, which provide improvements and new features in their specific fields of research, thus significantly accelerating the code development process. Among others, the OpenFOAM® multi-physics computational library, developed by a very large and dynamic community, nowadays comprises several features usually only available in its commercial counterparts: dynamic meshes, a large diversity of complex physical models, parallelization and multiphase models, to name just a few. This computational library is developed in C++ and makes use of most of the language's capabilities to facilitate the implementation of new functionalities. In the field of computational rheology, OpenFOAM® solvers were recently developed to deal with the most relevant differential viscoelastic rheological models, and stabilization techniques are currently being verified. This work describes the implementation of a new solver in the OpenFOAM® library, able to cope with integral viscoelastic models based on the deformation field method. The implemented solver is verified through the comparison of the predicted results with analytical solutions, with results published in the literature, and by using the Method of Manufactured Solutions.
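As a toy illustration of the Method of Manufactured Solutions mentioned above, applied here to a simple 1D diffusion problem rather than to the viscoelastic solver: an analytical solution is chosen, the source term it must satisfy is derived by substitution into the governing equation, and the observed order of convergence of the discrete solver is then checked against the expected one.

```python
import numpy as np

def solve_poisson_1d(n, source):
    """Second-order finite-difference solve of -u'' = f on (0, 1), u(0) = u(1) = 0."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    return x, np.linalg.solve(A, source(x))

# Manufactured solution and the source term obtained by substituting it into -u'' = f
u_exact = lambda x: np.sin(np.pi * x)
f_manufactured = lambda x: np.pi**2 * np.sin(np.pi * x)

# Errors on two successively refined grids give the observed order of convergence
errors, hs = [], []
for n in (40, 80):
    x, u_h = solve_poisson_1d(n, f_manufactured)
    errors.append(np.max(np.abs(u_h - u_exact(x))))
    hs.append(1.0 / (n + 1))
order = np.log(errors[0] / errors[1]) / np.log(hs[0] / hs[1])
print(f"observed order of convergence ≈ {order:.2f}")  # expected ≈ 2 for this scheme
```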
Abstract:
[Excerpt] The advantages resulting from the use of numerical modelling tools to support the design of processing equipment are almost consensual. The design of calibration systems in profile extrusion is not an exception. However, the complex geometries and heat exchange phenomena involved in this process require numerical solvers able to model the heat exchange in more than one domain (calibrator and polymer), to ensure the compatibility of the heat transfer at the profile-calibrator interface, and to deal with complex geometries. The combination of all these features is usually hard to find in commercial software. Moreover, the size of the meshes required to obtain accurate results leads to computational times that are prohibitive for industrial application. (...)
Abstract:
The reinforcement of soil is recognized as an effective and reliable technique to improve strength and stability. For this purpose, the use of natural fibers has been common. Over the past years, a series of studies has been performed to investigate the influence of randomly oriented fibers, especially in compressible clayey soils. However, less attention has been given to the reinforcement of sandy materials, as well as to the use of meshes of oriented fibers to improve mechanical behaviour. The main aim of this study is to identify the influence that different percentages of fibers, as well as the use of meshes of oriented fibers, have on soil mechanical behaviour. For this purpose, unconfined compression tests with local strain measurements were performed on a silty sand reinforced with sisal fibers, and a comparative study between randomly oriented fibers and fibers oriented at 0° and 90° is presented.
Abstract:
Among the various possible embodiments of Advanced Therapies, and of Tissue Engineering in particular, the use of temporary scaffolds to regenerate tissue defects is one of the key issues. The scaffolds should be specifically designed to create environments that promote tissue development and not merely support the maintenance of communities of cells. To achieve that goal, highly functional scaffolds may combine specific morphologies and surface chemistry with the local release of bioactive agents. Many biomaterials have been proposed to produce scaffolds aimed at the regeneration of a wealth of human tissues. We have a particular interest in developing systems based on nanofibrous biodegradable polymers [1,2]. Those demanding applications require a combination of mechanical properties, processability, cell-friendly surfaces and tunable biodegradability that needs to be tailored for the specific application envisioned. These biomaterials are usually processed by different routes into devices with a wide range of morphologies, such as biodegradable fibers and meshes, films or particles, adaptable to different biomedical applications. In our approach, we populate the temporary scaffolds with therapeutically relevant communities of cells to generate a hybrid implant. For that, we have explored different sources of adult and also embryonic stem cells. We are exploring the use of adult MSCs [3], namely obtained from the bone marrow, for the development of autologous-based therapies. We also develop strategies based on extra-embryonic tissues, such as amniotic fluid (AF) and the perivascular region of the umbilical cord [4] (Wharton's Jelly, WJ). These tissues offer many advantages over both embryonic and other adult stem cell sources. They are frequently discarded at parturition, and their extracorporeal nature facilitates tissue donation by the patients. The comparatively large volume of tissue and ease of physical manipulation facilitate the isolation of larger numbers of stem cells. Fetal stem cells appear to have more pronounced immunomodulatory properties than adult MSCs. This allogeneic escape mechanism may be of therapeutic value, because the transplantation of readily available allogeneic human MSCs would be preferable to the expansion stage (involving both time and logistic effort) required for autologous cells. Topics to be covered: this talk will review our latest developments on nanostructured biomaterials and scaffolds in combination with stem cells for bone and cartilage tissue engineering.
Abstract:
Mo-Si-B alloys, Real microstructures, Voronoi structures, Microstructural characterization, Modelling and finite element simulations, Effective material properties, Damage and Crack growth, tensile strength, fracture toughness
Abstract:
The present thesis is a contribution to the debate on the applicability of mathematics; it examines the interplay between mathematics and the world, using historical case studies. The first part of the thesis consists of four small case studies. In chapter 1, I criticize "ante rem structuralism", proposed by Stewart Shapiro, by showing that his so-called "finite cardinal structures" are in conflict with mathematical practice. In chapter 2, I discuss Leonhard Euler's solution to the Königsberg bridges problem. I propose interpreting Euler's solution both as an explanation within mathematics and as a scientific explanation. I put the insights from the historical case to work against recent philosophical accounts of the Königsberg case. In chapter 3, I analyze the predator-prey model proposed by Lotka and Volterra. I extract some interesting philosophical lessons from Volterra's original account of the model, such as: Volterra's remarks on mathematical methodology; the relation between mathematics and idealization in the construction of the model; some relevant details in the derivation of the Third Law; and notions of intervention that are motivated by one of Volterra's main mathematical tools, phase spaces. In chapter 4, I discuss scientific and mathematical attempts to explain the structure of the bee's honeycomb. In the first part, I discuss a candidate explanation, based on the mathematical Honeycomb Conjecture, presented in Lyon and Colyvan (2008). I argue that this explanation is not scientifically adequate. In the second part, I discuss other mathematical, physical and biological studies that could contribute to an explanation of the bee's honeycomb. The upshot is that most of the relevant mathematics is not yet sufficiently understood, and there is also an ongoing debate as to the biological details of the construction of the bee's honeycomb. The second part of the thesis is a bigger case study from physics: the genesis of general relativity (GR). Chapter 5 is a short introduction to the history, physics and mathematics relevant to the genesis of GR. Chapter 6 discusses the historical question of what Marcel Grossmann contributed to the genesis of GR. I examine the so-called "Entwurf" paper, an important joint publication by Einstein and Grossmann, containing the first tensorial formulation of GR. By comparing Grossmann's part with the mathematical theories he used, we can gain a better understanding of what is involved in the first steps of assimilating a mathematical theory to a physical question. In chapter 7, I introduce and discuss a recent account of the applicability of mathematics to the world, the Inferential Conception (IC), proposed by Bueno and Colyvan (2011). I give a short exposition of the IC, offer some critical remarks on the account, discuss potential philosophical objections, and propose some extensions of the IC. In chapter 8, I put the Inferential Conception (IC) to work in the historical case study: the genesis of GR. I analyze three historical episodes, using the conceptual apparatus provided by the IC. In episode one, I investigate how the starting point of the application process, the "assumed structure", is chosen. Then I analyze two small application cycles that led to revisions of the initial assumed structure. In episode two, I examine how the application of "new" mathematics - the application of the Absolute Differential Calculus (ADC) to gravitational theory - meshes with the IC.
In episode three, I take a closer look at two of Einstein's failed attempts to find a suitable differential operator for the field equations, and apply the conceptual tools provided by the IC so as to better understand why he erroneously rejected both the Ricci tensor and the November tensor in the Zurich Notebook.
Stabilized Petrov-Galerkin methods for the convection-diffusion-reaction and the Helmholtz equations
Abstract:
We present two new stabilized high-resolution numerical methods for the convection–diffusion–reaction (CDR) and the Helmholtz equations, respectively. The work embarks upon an a priori analysis of some consistency-recovery procedures for stabilization methods belonging to the Petrov–Galerkin framework. It was found that the use of some standard practices (e.g. M-matrices theory) for the design of essentially non-oscillatory numerical methods is not feasible when consistency-recovery methods are employed. Hence, with respect to convective stabilization, such recovery methods are not preferred. Next, we present the design of a high-resolution Petrov–Galerkin (HRPG) method for the 1D CDR problem. The problem is studied from a fresh point of view, including practical implications for the formulation of the maximum principle, M-matrices theory, monotonicity and total variation diminishing (TVD) finite volume schemes. The method follows earlier methods that may be viewed as an upwinding plus a discontinuity-capturing operator. Finally, some remarks are made on the extension of the HRPG method to multidimensions. Next, we present a new numerical scheme for the Helmholtz equation resulting in quasi-exact solutions. The focus is on the approximation of the solution to the Helmholtz equation in the interior of the domain using compact stencils. Piecewise linear/bilinear polynomial interpolation is considered on a structured mesh/grid. The only a priori requirement is to provide a mesh/grid resolution of at least eight elements per wavelength. No stabilization parameters are involved in the definition of the scheme. The scheme consists of taking the average of the equation stencils obtained by the standard Galerkin finite element method and the classical finite difference method. Dispersion analyses in 1D and 2D illustrate the quasi-exact properties of this scheme. Finally, some remarks are made on the extension of the scheme to unstructured meshes by designing a method within the Petrov–Galerkin framework.
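As a rough 1D illustration of the stencil-averaging idea described in this abstract (a sketch assuming linear elements on a uniform grid, not the authors' implementation), the following compares the numerical wavenumber of the standard finite-difference stencil, the Galerkin stencil with consistent mass, and their average, for the 1D Helmholtz equation u'' + k²u = 0:

```python
import numpy as np

def numerical_wavenumber(k, h, w0, w1):
    """Numerical wavenumber k_h of the interior stencil
       (u[i-1] - 2u[i] + u[i+1])/h^2 + k^2*(w1*u[i-1] + w0*u[i] + w1*u[i+1]) = 0,
       obtained by inserting the plane wave u[j] = exp(1j * k_h * x[j])."""
    K = k * h
    c = (2.0 - K**2 * w0) / (2.0 + 2.0 * K**2 * w1)   # cos(k_h * h)
    return np.arccos(c) / h

k = 2.0 * np.pi                   # wavenumber of the exact solution
wavelength = 2.0 * np.pi / k
h = wavelength / 10.0             # ten elements per wavelength (abstract asks for at least eight)

# Mass weights (w0, w1) of each interior stencil
stencils = {
    "finite differences (lumped)": (1.0, 0.0),
    "Galerkin FEM (consistent)":   (2.0 / 3.0, 1.0 / 6.0),
    "average of the two":          (5.0 / 6.0, 1.0 / 12.0),
}
for name, (w0, w1) in stencils.items():
    kh = numerical_wavenumber(k, h, w0, w1)
    print(f"{name:30s} relative phase error = {abs(kh - k) / k:.2e}")
```

With about ten elements per wavelength, the averaged stencil shows a phase error roughly two orders of magnitude smaller than either constituent stencil, consistent with the quasi-exact behaviour described in the abstract.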