977 results for Static analysis
Abstract:
Osteoporotic proximal femur fractures are caused by low-energy trauma, typically when falling on the hip from standing height. Finite element simulations, widely used to predict the fracture load of femora in a fall, usually include neither mass-related inertial effects nor the viscous part of bone's material behavior. The aim of this study was to elucidate whether quasi-static non-linear homogenized finite element analyses can predict the in vitro mechanical properties of proximal femora assessed in dynamic drop tower experiments. The case-specific numerical models of thirteen femora predicted the strength (R2=0.84, SEE=540 N, 16.2%), stiffness (R2=0.82, SEE=233 N/mm, 18.0%) and fracture energy (R2=0.72, SEE=3.85 J, 39.6%), and provided fair qualitative matches with the fracture patterns. The influence of material anisotropy was negligible for all predictions. These results suggest that quasi-static homogenized finite element analysis may be used to predict mechanical properties of proximal femora in the dynamic sideways fall situation.
Abstract:
One of the common pathologies of brickwork masonry structural elements and walls is the cracking associated with differential settlements and/or excessive deflections of the slabs along the life of the structure. The limited capacity of masonry to accompany the movements of the structural elements that surround it, such as floors, beams or foundations, makes brickwork masonry an element that frequently presents this kind of problem. This is a fracture problem, in which the wall cracks under mixed-mode fracture: a combination of tensile and shear stresses under static loading. Consequently, it is necessary to advance in the simulation and prediction of the mechanical behaviour of brickwork masonry under tensile and shear loading. The quasi-brittle behaviour of brickwork masonry can be studied using the cohesive crack model, whose application to other quasi-brittle materials such as concrete has traditionally provided very satisfactory results.
Abstract:
Flexible spacecraft with attached solar panels may exhibit undesired vibrations and structural deformations. These types of vehicles show an intrinsic coupling of the elements of the structure. The attitude maneuvers performed by flexible spacecraft may cause undesired deflections of attached flexible elements. Any attitude and orbit control system generally solves these problems using filters that are designed to attenuate the relative deflections of flexible appendages. In this paper, we propose a method for designing static attitude controllers using an eigenstructure assignment (EA) method. A set of requirements was specified from our understanding of the system modes in open loop. Exhaustive theoretical and numerical simulations were performed on special cases to verify the controller design procedure. In the design of the controller, we considered all of the aspects that relate to the eigenstructure assignment. The primary objective of this paper is to demonstrate the feasibility of obtaining a high degree of decoupling for some selected modes via the application of an EA method. Finally, a robustness analysis of the system together with the designed controller is performed by means of a mu-analysis.
Abstract:
Cable-stayed bridges nowadays represent key points in transport networks, and their seismic behavior needs to be fully understood, even beyond the elastic range of materials. Both nonlinear dynamic (NL-RHA) and static (pushover) procedures are currently available to face this challenge, each with intrinsic advantages and disadvantages, and their applicability in the study of the nonlinear seismic behavior of cable-stayed bridges is discussed here. The seismic response of a large number of finite element models with different span lengths, tower shapes and classes of foundation soil is obtained with the different procedures and compared. Several features of the original Modal Pushover Analysis (MPA) are modified in light of cable-stayed bridge characteristics; furthermore, an extension of MPA and a new coupled pushover analysis (CNSP) are suggested to estimate the complex inelastic response of such outstanding structures subjected to multi-axial strong ground motions.
Abstract:
Static analyses of object-oriented programs usually rely on intermediate representations that respect the original semantics while having a more uniform and basic syntax. Most work involving object-oriented languages and abstract interpretation omits the description of that intermediate language or just refers to the Control Flow Graph (CFG) it represents. However, this lack of formalization results, on the one hand, in an absence of assurances regarding the correctness of the transformation and, on the other, typically couples the analysis strongly to the source language. In this work we present a framework for the analysis of object-oriented languages in which, in a first phase, we transform the input program into a representation based on Horn clauses. This allows, on the one hand, proving the transformation correct subject to a simple condition and, on the other, applying an existing analyzer for (constraint) logic programming to automatically derive a safe approximation of the semantics of the original program. The approach is flexible in the sense that the first phase decouples the analyzer from most language-dependent features, and correct because the set of Horn clauses returned by the transformation phase safely approximates the standard semantics of the input program. The resulting analysis is also reasonably scalable due to the use of mature, modular (C)LP-based analyzers. The overall approach allows us to report results for medium-sized programs.
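The flavor of such a Horn-clause representation can be sketched as follows. This is a hypothetical toy encoding, not the paper's actual translator: each branch of a method becomes one clause for the same predicate, with the branch guard and body as the clause body.

```python
# Toy translation of method branches into Horn-clause text (hypothetical encoding).
def branches_to_clauses(pred, params, branches):
    """branches: list of (guard_literals, body_literals) pairs, one per branch."""
    head = f"{pred}({', '.join(params)})"
    return [f"{head} :- {', '.join(guard + body)}." for guard, body in branches]

# Example source method: int abs(int x) { if (x >= 0) return x; else return -x; }
clauses = branches_to_clauses(
    "abs", ["X", "R"],
    [(["X >= 0"], ["R = X"]),
     (["X < 0"], ["R = -X"])])
# clauses[0] is "abs(X, R) :- X >= 0, R = X."
```

A (C)LP analyzer can then process such clauses uniformly, regardless of the object-oriented source language they came from.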
Abstract:
We propose a general framework for assertion-based debugging of constraint logic programs. Assertions are linguistic constructions for expressing properties of programs. We define several assertion schemas for writing (partial) specifications for constraint logic programs using quite general properties, including user-defined programs. The framework is aimed at detecting deviations of the program behavior (symptoms) with respect to the given assertions, either at compile-time (i.e., statically) or run-time (i.e., dynamically). We provide techniques for using information from global analysis both to detect at compile-time assertions which do not hold in at least one of the possible executions (i.e., static symptoms) and assertions which hold for all possible executions (i.e., statically proved assertions). We also provide program transformations which introduce tests in the program for checking at run-time those assertions whose status cannot be determined at compile-time. Both the static and the dynamic checking are provably safe in the sense that all errors flagged are definite violations of the specifications. Finally, we report briefly on the currently implemented instances of the generic framework.
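The run-time checking transformation described above can be illustrated with a small sketch. The `check` helper below is a hypothetical stand-in for the framework's program transformation, not its actual assertion syntax: assertions whose status could not be determined at compile-time become tests wrapped around the call.

```python
# Hypothetical sketch: turn unproved calls/success assertions into run-time tests.
def check(pre, post):
    def deco(f):
        def wrapped(*args):
            if not pre(*args):                      # calls assertion
                raise AssertionError(f"precondition of {f.__name__} violated")
            result = f(*args)
            if not post(*args, result):             # success assertion
                raise AssertionError(f"postcondition of {f.__name__} violated")
            return result
        return wrapped
    return deco

@check(pre=lambda n: n >= 0, post=lambda n, r: r >= 1)
def fact(n):
    return 1 if n == 0 else n * fact(n - 1)
```

A call such as `fact(4)` passes both checks, while `fact(-1)` is flagged as a definite violation of the specification rather than looping forever.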
Abstract:
We present a method for the static resource usage analysis of MiniZinc models. The analysis can infer upper bounds on the usage that a MiniZinc model will make of some resources such as the number of constraints of a given type (equality, disequality, global constraints, etc.), the number of variables (search variables or temporary variables), or the size of the expressions before calling the solver. These bounds are obtained from the models independently of the concrete input data (the instance data) and are in general functions of sizes of such data. In our approach, MiniZinc models are translated into Ciao programs which are then analysed by the CiaoPP system. CiaoPP includes a parametric analysis framework for resource usage in which the user can define resources and express the resource usage of library procedures (and certain program constructs) by means of a language of assertions. We present the approach and report on a preliminary implementation, which shows the feasibility of the approach, and provides encouraging results.
Abstract:
Automatic cost analysis of programs has been traditionally concentrated on a reduced number of resources such as execution steps, time, or memory. However, the increasing relevance of analysis applications such as static debugging and/or certification of user-level properties (including for mobile code) makes it interesting to develop analyses for resource notions that are actually application-dependent. This may include, for example, bytes sent or received by an application, number of files left open, number of SMSs sent or received, number of accesses to a database, money spent, energy consumption, etc. We present a fully automated analysis for inferring upper bounds on the usage that a Java bytecode program makes of a set of application programmer-definable resources. In our context, a resource is defined by programmer-provided annotations which state the basic consumption that certain program elements make of that resource. From these definitions our analysis derives functions which return an upper bound on the usage that the whole program (and individual blocks) make of that resource for any given set of input data sizes. The analysis proposed is independent of the particular resource. We also present some experimental results from a prototype implementation of the approach covering a significant set of interesting resources.
Abstract:
Automatic cost analysis of programs has been traditionally studied in terms of a number of concrete, predefined resources such as execution steps, time, or memory. However, the increasing relevance of analysis applications such as static debugging and/or certification of user-level properties (including for mobile code) makes it interesting to develop analyses for resource notions that are actually application-dependent. This may include, for example, bytes sent or received by an application, number of files left open, number of SMSs sent or received, number of accesses to a database, money spent, energy consumption, etc. We present a fully automated analysis for inferring upper bounds on the usage that a Java bytecode program makes of a set of application programmer-definable resources. In our context, a resource is defined by programmer-provided annotations which state the basic consumption that certain program elements make of that resource. From these definitions our analysis derives functions which return an upper bound on the usage that the whole program (and individual blocks) make of that resource for any given set of input data sizes. The analysis proposed is independent of the particular resource. We also present some experimental results from a prototype implementation of the approach covering an ample set of interesting resources.
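The resource model described above can be mimicked by hand in a short sketch. This is an illustration of the idea only (the example resources, names, and costs are our own assumptions, not the analysis output): programmer annotations give the basic consumption of primitive operations, and the derived bounds are functions of the input data size.

```python
# Hypothetical annotations: basic consumption of each primitive operation.
COST = {"sms_sent": 1, "files_open": 1}

def bound_sms_sent(n):       # bound a size-n input would receive: one SMS per contact
    return n * COST["sms_sent"]

def bound_files_open(n):     # size-independent bound: the log is opened once
    return COST["files_open"]

# A concrete run can be instrumented and checked against the derived bounds.
def notify_all(contacts, counters):
    counters["files_open"] += 1          # open the log once
    for _ in contacts:
        counters["sms_sent"] += 1        # one SMS per contact

counters = {"sms_sent": 0, "files_open": 0}
notify_all(["ana", "bo", "cy"], counters)
assert counters["sms_sent"] <= bound_sms_sent(3)
assert counters["files_open"] <= bound_files_open(3)
```

The point of the static analysis is that such bound functions are inferred from the annotations automatically, without running the program.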
Abstract:
Algorithms for distributed agreement are a powerful means for formulating distributed versions of existing centralized algorithms. We present a toolkit for this task and show how it can be used systematically to design fully distributed algorithms for static linear Gaussian models, including principal component analysis, factor analysis, and probabilistic principal component analysis. These algorithms do not rely on a fusion center, require only low-volume local (1-hop neighborhood) communications, and are thus efficient, scalable, and robust. We show how they are also guaranteed to asymptotically converge to the same solution as the corresponding existing centralized algorithms. Finally, we illustrate the functioning of our algorithms on two examples, and examine the inherent cost-performance tradeoff.
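The basic building block behind such fully distributed algorithms can be sketched with plain average consensus (our illustrative example, not the paper's specific toolkit): each node repeatedly mixes its local value with those of its 1-hop neighbors, and all nodes converge to the network-wide average without any fusion center.

```python
# Average consensus on a 4-node ring: only 1-hop neighborhood communication.
def consensus_step(values, neighbors, eps=0.3):
    # Each node moves toward its neighbors; eps must be below 1/max_degree.
    return [v + eps * sum(values[j] - v for j in neighbors[i])
            for i, v in enumerate(values)]

neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # ring topology
values = [4.0, 0.0, 2.0, 6.0]                              # local statistics; mean is 3.0
for _ in range(100):
    values = consensus_step(values, neighbors)
# all entries are now (numerically) 3.0, the global average
```

Running consensus on local sufficient statistics is what lets each node of a static linear Gaussian model reach the same estimate a centralized algorithm would compute.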
Abstract:
We present a seismic hazard study of the island of "La Hispaniola" in connection with the land tenure situation in the region, in order to define high-risk priority areas for which some land management recommendations are proposed. The seismic hazard assessment has been carried out following the probabilistic method, with a seismogenic zonation and including the major faults of the region as independent units. In order to identify the priority areas, besides the seismic hazard study, the map of static Coulomb failure stress changes and the landslide hazard map have been taken into account.
Abstract:
This paper presents a numerical implementation of the cohesive crack model for the analysis of quasi-brittle materials based on the strong discontinuity approach in the framework of the finite element method. A simple central force model is used for the stress versus crack opening curve. The additional degrees of freedom defining the crack opening are determined at the crack level, thus avoiding the need for performing a static condensation at the element level. The need for a tracking algorithm is avoided by using a consistent procedure for the selection of the separated nodes. The model is then implemented into a commercial program by means of a user subroutine and contrasted with experimental results. The model takes into account the anisotropy of the material. Numerical simulations of well-known experiments are presented to show the ability of the proposed model to simulate the fracture of quasi-brittle materials such as mortar, concrete and masonry.
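The stress versus crack opening curve at the heart of the cohesive crack model can be illustrated with one common softening law. The linear softening below is our own illustrative choice (the paper uses a simple central force model, whose exact curve we do not reproduce); it is calibrated so that the area under the curve equals the fracture energy Gf.

```python
# Linear softening cohesive law (illustrative, not the paper's central force model).
def cohesive_stress(w, ft, Gf):
    """Stress transmitted across a crack of opening w (ft: tensile strength,
    Gf: fracture energy). Critical opening wc chosen so ft*wc/2 == Gf."""
    wc = 2.0 * Gf / ft
    return max(0.0, ft * (1.0 - w / wc))
```

For concrete-like values (ft = 3 MPa, Gf = 100 N/m) the crack stops transmitting stress beyond an opening of about 0.067 mm, which is what makes the response quasi-brittle rather than ideally brittle.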
Abstract:
Nitrous oxide emissions from a network of agricultural experiments in Europe were used to explore the relative importance of site and management controls of emissions. At each site, a selection of management interventions were compared within replicated experimental designs in plot-based experiments. Arable experiments were conducted at Beano in Italy, El Encin in Spain, Foulum in Denmark, Logarden in Sweden, Maulde in Belgium CE1, Paulinenaue in Germany, and Tulloch in the UK. Grassland experiments were conducted at Crichton, Nafferton and Peaknaze in the UK, Godollo in Hungary, Rzecin in Poland, Zarnekow in Germany and Theix in France. Nitrous oxide emissions were measured at each site over a period of at least two years using static chambers. Emissions varied widely between sites and as a result of manipulation treatments. Average site emissions (throughout the study period) varied between 0.04 and 21.21 kg N2O-N ha−1 yr−1, with the largest fluxes and variability associated with the grassland sites. Total nitrogen addition was found to be the single most important determinant of emissions, accounting for 15 % of the variance (using linear regression) in the data from the arable sites (p<0.0001), and 77 % in the grassland sites. The annual emissions from arable sites were significantly greater than those that would be predicted by IPCC default emission factors. Variability of N2O emissions within sites that occurred as a result of manipulation treatments was greater than that resulting from site-to-site and year-to-year variation, highlighting the importance of management interventions in contributing to greenhouse gas mitigation.
Abstract:
This paper provides an analytical static approach for analysing buried tunnels under seismic surface waves (Rayleigh and Love waves) propagating parallel to the tunnel's axis. In the proposed method, the tunnel is considered as a beam on an elastic foundation, using a Winkler model to represent the subgrade reaction and the soil-structure interaction. The seismic load is imposed by prescribing at the base of the soil springs a configuration corresponding to the free-field motion. From the solution of the governing differential equations of the problem, results are obtained in the form of relative displacements between points of the tunnel, from which the seismic bending moments and shearing forces acting on the tunnel cross section can be computed.
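For the special case of a sinusoidal free-field motion, the beam-on-Winkler-foundation idea admits a simple closed form, which we sketch below as an illustration (the simplification and the numerical values are our own assumptions, not the paper's solution). With a ground displacement u_g(x) = u0*sin(l*x) imposed at the spring bases, the governing equation EI*w'''' + k*(w − u_g) = 0 yields a sinusoidal tunnel response of amplitude w0 = u0 / (1 + EI*l**4 / k).

```python
import math

# Closed-form response of a tunnel (beam on Winkler springs) to a sinusoidal
# free-field displacement of amplitude u0 and given wavelength (illustrative).
def tunnel_response(u0, wavelength, EI, k):
    lam = 2.0 * math.pi / wavelength      # wavenumber of the surface wave
    w0 = u0 / (1.0 + EI * lam**4 / k)     # tunnel displacement amplitude
    M0 = EI * lam**2 * w0                 # peak bending moment, EI * |w''|
    rel = u0 - w0                         # peak relative soil-tunnel displacement
    return w0, M0, rel

# Purely illustrative numbers (not from the paper):
w0, M0, rel = tunnel_response(u0=0.05, wavelength=100.0, EI=5e9, k=2e7)
```

Note how a stiffer tunnel (larger EI) or softer springs (smaller k) reduce w0 below u0, increasing the relative displacement and hence the seismic internal forces.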
Abstract:
To our knowledge, no current software development methodology explicitly describes how to transition from the analysis model to the software architecture of the application. This paper presents a method to derive the software architecture of a system from its analysis model, using MDA. Both the analysis model and the architectural model are PIMs described with UML 2. The model type mapping designed consists of several rules (expressed using OCL and natural language) that, when applied to the analysis artifacts, generate the software architecture of the application. Specifically, the rules act on elements of the UML 2 metamodel (metamodel mapping). We have developed a tool (using Smalltalk) that permits the automatic application of these rules to an analysis model defined in Rose to generate the application architecture expressed in the architectural style C2.
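The shape of such a model-type mapping can be sketched as a rule table. This toy is entirely hypothetical (the artifact kinds, rule table and C2 element names are our own illustration, not the paper's OCL rules): each kind of analysis artifact is mapped to a kind of C2 architectural element.

```python
# Toy metamodel mapping: analysis artifact kind -> C2 architectural element kind.
RULES = {
    "boundary_class": "c2_component",
    "control_class":  "c2_component",
    "entity_class":   "c2_component",
    "association":    "c2_connector",
}

def apply_mapping(analysis_model):
    """analysis_model: list of (name, kind) analysis artifacts."""
    return [(name, RULES[kind]) for name, kind in analysis_model]

arch = apply_mapping([("LoginForm", "boundary_class"),
                      ("Session", "control_class"),
                      ("User", "entity_class"),
                      ("uses", "association")])
```

In the actual method the rules operate on UML 2 metamodel elements and are applied automatically by the tool, but the principle is the same: architecture is derived, not hand-crafted, from analysis artifacts.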