17 results for Engineering simulation
at Universidad Politécnica de Madrid
Abstract:
The purpose of this report is to build a model that represents, as faithfully as possible, the seismic behavior of a pile cap bridge foundation by means of a nonlinear static (pushover) analysis procedure. It consists of a reproduction of a specimen already built in the laboratory. The model carries out a pseudo-static horizontal pushover test, applied to the pile cap until failure of the structure occurs, that is, until a plastic hinge forms in the piles due to the horizontal deformation. The pushover test consists of increasing the horizontal load on the pile cap until the target horizontal displacement at the height of the pile cap is reached. The output of the model is a skeleton curve plotting the lateral load (kN) against the displacement (m), from which the maximum movement the pile cap foundation can undergo before failure can be calculated. Failure is taken as the point at which the load at a given displacement has dropped to 85% of the maximum. The pile cap foundation finite element model was based on a pile cap built for a laboratory experiment previously carried out by the Master's student Deming Zhang at Tongji University. Two different pile caps were tested, differing in their height above ground level: one rises 0.3 m above the ground, the other 0.8 m. The computer model was calibrated using the experimental results. The pile cap foundation was programmed in a finite element environment called OpenSees (Open System for Earthquake Engineering Simulation [28]). OpenSees is free software developed at the University of California, Berkeley, specialized, as its name says, in the study of earthquakes and their effects on structures. This specialization is the main reason why it was chosen for building this model, as it makes it possible to build any finite element model and perform several analyses in order to obtain the desired results. The development of OpenSees is sponsored by the Pacific Earthquake Engineering Research Center through the National Science Foundation engineering and education centers program. OpenSees models are scripted in the Tcl language, while the framework itself is implemented in C++.
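The 85% failure criterion described above is easy to sketch; the following is a minimal illustration (the function name and the sample data are our own, not taken from the report):

```python
import numpy as np

def ultimate_displacement(disp, load):
    """Find the displacement at which the lateral load, after peaking,
    drops to 85% of its maximum (the failure criterion used above).

    disp, load: 1-D arrays from the pushover (skeleton) curve,
    displacement in m and lateral load in kN."""
    disp, load = np.asarray(disp), np.asarray(load)
    i_peak = np.argmax(load)
    threshold = 0.85 * load[i_peak]
    # Scan the post-peak (softening) branch for the first crossing.
    for i in range(i_peak, len(load) - 1):
        if load[i] >= threshold >= load[i + 1]:
            # Linear interpolation between the bracketing points.
            t = (threshold - load[i]) / (load[i + 1] - load[i])
            return disp[i] + t * (disp[i + 1] - disp[i])
    return None  # the curve never softened to 85% of the peak

# Example on illustrative data: elastic rise, peak, softening branch.
d = [0.00, 0.01, 0.02, 0.03, 0.04, 0.05]
F = [0.0, 120.0, 200.0, 230.0, 210.0, 180.0]
print(ultimate_displacement(d, F))  # displacement where F falls to 195.5 kN
```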
Abstract:
Nowadays, computer simulators are becoming basic tools for education and training in many engineering fields. In the nuclear industry, the role of simulation in the training of nuclear power plant operators is also recognized as being of the utmost relevance. As an example, the International Atomic Energy Agency sponsors the development of nuclear reactor simulators for education and arranges the supply of such simulation programs. Aware of this, in 2008 Gas Natural Fenosa, a Spanish gas and electric utility that owns and operates nuclear power plants and promotes university education in the nuclear technology field, provided the Department of Nuclear Engineering of Universidad Politécnica de Madrid with the Interactive Graphic Simulator (IGS) of the “José Cabrera” (Zorita) nuclear power plant, an industrial facility whose commercial operation ceased definitively in April 2006. It is a state-of-the-art full-scope real-time simulator that was used for the training and qualification of the plant's control room operators, as well as to understand and analyse the plant dynamics and to develop, qualify and validate its emergency operating procedures.
Abstract:
A finite element model was used to simulate timber beams with defects and predict their maximum load in bending. Taking into account the elastoplastic constitutive law of timber, the prediction of the fracture load gives information about the mechanisms of timber failure, particularly with regard to the influence of knots, and their local grain deviation, on the fracture. A finite element model was constructed using the ANSYS element Plane42 in a plane-stress 2D analysis, with the element thickness equated to the width of the section so as to create a mesh as uniform as possible. Three sub-models reproduced the bending test according to UNE EN 408: i) timber with holes caused by knots; ii) timber with adherent knots which have structural continuity with the rest of the beam material; iii) timber with knots in only partial contact with the beam, which was artificially simulated by means of contact springs between the two materials (see the sketch below). The model was validated using ten 45 × 145 × 3000 mm beams of Pinus sylvestris L. which presented knots and grain deviation. The fracture stress data obtained were compared with the results of numerical simulations, resulting in an adjustment error of less than 9.7%.
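A penalty-style contact spring of the kind mentioned in sub-model iii) can be sketched in a few lines; the stiffness value and function name below are illustrative assumptions, not taken from the paper:

```python
def contact_spring_force(gap, k=1e8):
    """Unilateral (penalty) contact spring between knot and beam surfaces:
    it resists interpenetration (gap < 0) but transmits no force when the
    surfaces separate (gap >= 0), so the knot can detach from the beam.
    gap: surface separation in m; k: penalty stiffness in N/m (illustrative)."""
    return -k * gap if gap < 0 else 0.0
```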
Abstract:
All meta-analyses should include a heterogeneity analysis. Even so, it is not easy to decide whether a set of studies is homogeneous or heterogeneous because of the low statistical power of the statistics used (usually the Q test). Objective: Determine a set of rules enabling software engineering (SE) researchers to find out, based on the characteristics of the experiments to be aggregated, whether or not it is feasible to accurately detect heterogeneity. Method: Evaluate the statistical power of heterogeneity detection methods using a Monte Carlo simulation process. Results: The Q test is not powerful when the meta-analysis contains up to a total of about 200 experimental subjects and the effect size difference is less than 1. Conclusions: The Q test cannot be used as a decision-making criterion for meta-analysis in small-sample settings like SE. Random effects models should be used instead of fixed effects models. Caution should be exercised when applying Q test-mediated decomposition into subgroups.
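The kind of Monte Carlo power evaluation described can be sketched as follows; the study counts, effect sizes, and the variance approximation for a standardized mean difference are illustrative assumptions, not the paper's exact simulation design:

```python
import numpy as np
from scipy import stats

def cochran_q(effects, variances):
    """Cochran's Q statistic for k study effect sizes."""
    w = 1.0 / np.asarray(variances)
    pooled = np.sum(w * effects) / np.sum(w)
    return np.sum(w * (effects - pooled) ** 2)

def q_test_power(true_effects, n_per_group, reps=10_000, alpha=0.05):
    """Monte Carlo power of the Q test: simulate standardized mean
    differences, count how often Q exceeds the chi-square critical value."""
    k = len(true_effects)
    crit = stats.chi2.ppf(1 - alpha, df=k - 1)
    # Approximate sampling variance of a standardized mean difference d
    # for two groups of size n: 2/n + d^2 / (4n).
    var = 2 / n_per_group + np.square(true_effects) / (4 * n_per_group)
    rng = np.random.default_rng(0)
    rejections = 0
    for _ in range(reps):
        d = rng.normal(true_effects, np.sqrt(var))
        if cochran_q(d, var) > crit:
            rejections += 1
    return rejections / reps

# Example: two subgroups of 5 studies whose true effects differ by 0.5,
# 10 subjects per group (a small-sample setting typical of SE experiments).
print(q_test_power([0.0] * 5 + [0.5] * 5, n_per_group=10))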
Abstract:
Goal-level Independent and-parallelism (IAP) is exploited by scheduling two or more goals, which will not interfere with each other at run time, for simultaneous execution. This can be done safely even if such goals can produce multiple answers. The most successful IAP implementations to date have used recomputation of answers and sequentially ordered backtracking. While in principle simplifying the implementation, recomputation can be very inefficient if the granularity of the parallel goals is large enough and they produce several answers, while sequentially ordered backtracking limits parallelism. And, despite the expected simplification, the implementation of the classic schemes has proved to involve complex engineering, with the consequent difficulty for system maintenance and expansion, and still frequently runs into the well-known trapped-goal and garbage-slot problems. This work presents ideas about an alternative parallel backtracking model for IAP, together with a simulation study. The model features parallel out-of-order backtracking and relies on answer memoization to reuse and combine answers. Whenever a parallel goal backtracks, its siblings also perform backtracking, but after storing the bindings generated by previous answers. The bindings are then reinstalled when combining answers. In order not to unnecessarily penalize forward execution, non-speculative and-parallel goals which have not been executed yet take precedence over sibling goals which could be backtracked over. Using a simulator, we show that this approach can bring significant performance advantages over classical approaches.
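The answer-combination idea can be conveyed with a toy sketch: once each independent and-parallel goal has memoized its answers as binding sets, a joint answer is any element of their cross product, with no recomputation. The goals and bindings below are illustrative placeholders:

```python
from itertools import product

# Memoized answer stores for two independent and-parallel goals g1(X), g2(Y):
# each answer is recorded as a set of variable bindings rather than recomputed.
answers_g1 = [{"X": 1}, {"X": 2}]            # g1 produced two answers
answers_g2 = [{"Y": "a"}, {"Y": "b"}]        # g2 produced two answers

# Combining answers = reinstalling one stored binding set per goal; since the
# goals are independent, every pair is a valid joint answer.
joint_answers = [{**b1, **b2} for b1, b2 in product(answers_g1, answers_g2)]
print(joint_answers)
```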
Abstract:
Light detection and ranging (LiDAR) technology is beginning to have an impact on agriculture. Canopy volume and/or fruit tree leaf area can be estimated using terrestrial laser sensors based on this technology. However, these devices may be used with different settings depending on the resolution and scanning mode. As a consequence, data accuracy and LiDAR-derived parameters are affected by the sensor configuration, and may vary according to the vegetative characteristics of the tree crops. Given this scenario, users and suppliers of these devices need to know how to use the sensor in each case. This paper presents a computer program to determine the best configuration, allowing simulation and evaluation of different LiDAR configurations in various tree structures (or training systems). The ultimate goal is to optimise the use of laser scanners in field operations. The software presented generates a virtual orchard and then allows the scanning simulation with a laser sensor. Trees are created using a hidden Markov tree (HMT) model. Varying the foliar structure of the orchard, the LiDAR simulation was applied to twenty different artificially created orchards, with or without leaves, from two positions (lateral and zenith). To validate the laser sensor configuration, the leaf surface of the simulated trees was compared with the parameters obtained by LiDAR measurements: the impacted leaf area, the impacted total area (leaves and wood), and the impacted area in the three outer layers of leaves.
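A much-simplified 2-D version of such a scanning simulation can show how angular resolution changes the impacted area; the disc-shaped leaves, sensor position, and parameter values below are illustrative stand-ins for the paper's HMT-generated orchards:

```python
import numpy as np

def scan_hits(leaf_centers, leaf_radius, sensor_xy=(0.0, 0.0),
              angular_res_deg=0.25, fov_deg=180.0):
    """Toy 2-D scan: cast rays from the sensor at a given angular
    resolution and return, per ray, the distance to the nearest leaf
    (modelled as a disc) or None for a miss."""
    sx, sy = sensor_xy
    angles = np.deg2rad(np.arange(-fov_deg / 2, fov_deg / 2, angular_res_deg))
    hits = []
    for a in angles:
        d, best = np.array([np.cos(a), np.sin(a)]), None
        for cx, cy in leaf_centers:
            rel = np.array([cx - sx, cy - sy])
            t = rel @ d                       # projection on the ray
            if t <= 0:
                continue
            miss2 = rel @ rel - t * t         # squared ray-to-centre distance
            if miss2 <= leaf_radius ** 2:
                t_hit = t - np.sqrt(leaf_radius ** 2 - miss2)
                best = t_hit if best is None else min(best, t_hit)
        hits.append(best)
    return hits

# Fraction of rays that impact a leaf for two angular resolutions.
rng = np.random.default_rng(1)
leaves = rng.uniform(1.0, 4.0, size=(200, 2))
for res in (0.5, 0.25):
    h = scan_hits(leaves, leaf_radius=0.03, angular_res_deg=res)
    print(res, sum(x is not None for x in h) / len(h))
```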
Abstract:
The need to simulate spectrum-compatible earthquake time histories has existed since earthquake engineering for complicated structures began. Beyond the safety of the main structure, the behaviour of equipment (piping, racks, etc.) can only be assessed on the basis of the time history of the floor on which it is mounted. This paper presents several methods for calculating simulated spectrum-compatible earthquakes, as well as a comparison between them. As a result of this comparison, the use of the phase content of real earthquakes, as proposed by Ohsaki, appears to be an effective alternative to the classical methods. With this method, it is possible to establish an approach without the arbitrary modulation commonly used in other methods. The different procedures are described, as is the influence of the various parameters which appear in the analysis. Several numerical examples are also presented, and the effectiveness of Ohsaki's method is confirmed.
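The core of the phase-content idea can be sketched as follows; this is a simplified illustration in which the real record is replaced by a noise stand-in and the iterative response-spectrum matching of a real implementation is omitted:

```python
import numpy as np

def synthesize_with_real_phases(real_accel, target_amp):
    """Toy illustration of the phase-content idea attributed to Ohsaki:
    keep the Fourier *phases* of a recorded accelerogram and replace its
    Fourier *amplitudes* with ones shaped to a target (the iterative
    response-spectrum matching of a full implementation is omitted)."""
    spectrum = np.fft.rfft(real_accel)
    phases = np.angle(spectrum)
    # target_amp: desired one-sided Fourier amplitudes, same length as spectrum
    return np.fft.irfft(target_amp * np.exp(1j * phases), n=len(real_accel))

# Example with a synthetic stand-in for a real record (white noise here)
# and a flat target amplitude spectrum band-limited to 1-10 Hz, dt = 0.01 s.
dt, n = 0.01, 4096
record = np.random.default_rng(2).standard_normal(n)
freqs = np.fft.rfftfreq(n, dt)
amp = np.where((freqs >= 1.0) & (freqs <= 10.0), 1.0, 0.0)
accel = synthesize_with_real_phases(record, amp)
```

Because the temporal envelope of a real accelerogram is largely encoded in its phase spectrum, borrowing the phases yields a naturally non-stationary signal, which is why no arbitrary modulating function is needed.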
Abstract:
BioMet®Tools is a set of software applications developed for the biometrical characterization of voice in different fields, such as voice quality evaluation in laryngology, speech therapy and rehabilitation, education of the singing voice, forensic voice analysis in court, emotion detection in voice, secure access to facilities and services, etc. Initially it was conceived as plain research code to estimate the glottal source from voice and obtain the biomechanical parameters of the vocal folds from the spectral density of the estimate. This code grew into what is now the Glottex®Engine package (G®E). Further demands from users in the medical and forensic fields motivated the development of different Graphic User Interfaces (GUIs) to encapsulate user interaction with the G®E. This required the personalized design of different GUIs handling the same G®E; in this way, development costs and time could be saved. The development model is described in detail, leading to commercial production and distribution. Case studies from its application to the fields of laryngology and speech therapy are given and discussed.
Abstract:
This article describes the simulation and characterization of an ultrasonic transducer using a new material called Rexolite as the matching element. The transducer was simulated using a commercial piezoelectric ceramic, PIC255, at 8 MHz. Rexolite, the new material, presents excellent acoustic matching, especially in terms of the acoustic impedance of water. Finite element simulations were used in this work. Rexolite was considered a suitable material for the construction of the transducer due to its malleability and acoustic properties. To validate the simulations, a prototype transducer was constructed. Experimental measurements were used to determine the resonance frequency of the prototype transducer. Simulated and experimental results were very similar, showing that Rexolite may be an excellent matching material, particularly for medical applications.
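For context, the standard textbook design relations for a single quarter-wave matching layer (general design rules, not taken from the paper) are:

```latex
% Z_p, Z_w: acoustic impedances of the piezoceramic and of the load (water);
% c_m: longitudinal sound speed in the matching layer; f: operating frequency
% (8 MHz in this article).
\[
  Z_m = \sqrt{Z_p \, Z_w}, \qquad
  t = \frac{\lambda_m}{4} = \frac{c_m}{4 f},
\]
```

where \(Z_m\) is the ideal matching-layer impedance and \(t\) its quarter-wave thickness.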
Abstract:
Stochastic model updating must be considered in order to quantify the uncertainties inherently existing in real-world engineering structures. By this means the statistical properties, instead of deterministic values, of structural parameters can be sought, indicating the parameter variability. However, the implementation of stochastic model updating is much more complicated than that of deterministic methods, particularly in terms of theoretical complexity and computational cost. This study attempts to propose a simple and cost-efficient method by decomposing a stochastic updating process into a series of deterministic ones with the aid of response surface models and Monte Carlo simulation. The response surface models are used as surrogates for the original FE models in the interest of programming simplification, fast response computation and easy inverse optimization. Monte Carlo simulation is adopted for generating samples from the assumed or measured probability distributions of responses. Each sample corresponds to an individual deterministic inverse process predicting the deterministic values of the parameters. The parameter means and variances can then be statistically estimated from the parameter predictions obtained by running all the samples. Meanwhile, the analysis of variance approach is employed to evaluate the significance of the parameter variability. The proposed method is demonstrated first on a numerical beam and then on a set of nominally identical steel plates tested in the laboratory. It is found that, compared with existing stochastic model updating methods, the proposed method presents similar accuracy, while its primary merits are its simple implementation and its cost efficiency in response computation and inverse optimization.
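The decomposition into deterministic inverse problems can be sketched as follows; the surrogate function, parameter dimensions, and response distributions below are illustrative stand-ins, not the paper's models:

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch of the decomposition described above:
# 1) a response surface fitted beforehand replaces the FE model,
# 2) Monte Carlo samples of the responses are drawn,
# 3) each sample is inverted deterministically,
# 4) parameter statistics are estimated from all inversions.

def response_surface(theta):
    """Stand-in surrogate mapping 2 structural parameters to 2 responses
    (in practice: polynomials fitted to FE runs over a design of experiments)."""
    k, m = theta
    return np.array([k + 0.1 * k * m, m + 0.05 * k ** 2])

rng = np.random.default_rng(3)
resp_mean, resp_cov = np.array([1.1, 2.0]), np.diag([0.01, 0.02])

estimates = []
for y in rng.multivariate_normal(resp_mean, resp_cov, size=500):
    # One deterministic inverse problem per Monte Carlo response sample.
    sol = least_squares(lambda th: response_surface(th) - y, x0=[1.0, 2.0])
    estimates.append(sol.x)

estimates = np.array(estimates)
print("parameter means:    ", estimates.mean(axis=0))
print("parameter variances:", estimates.var(axis=0, ddof=1))
```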
Abstract:
The mechanisms of growth of a circular void by plastic deformation were studied by means of molecular dynamics in two dimensions (2D). While previous molecular dynamics (MD) simulations in three dimensions (3D) have been limited to small voids (up to ≈10 nm in radius), this strategy allows us to study the behavior of voids of up to 100 nm in radius. The MD simulations showed that plastic deformation was triggered by the nucleation of dislocations at the atomic steps of the void surface over the whole range of void sizes studied. The yield stress, defined as the stress necessary to nucleate stable dislocations, decreased with temperature, but the void growth rate was not very sensitive to this parameter. Simulations under uniaxial tension, uniaxial deformation and biaxial deformation showed that the void growth rate increased very rapidly with multiaxiality but did not depend on the initial void radius. These results were compared with previous 3D MD and 2D dislocation dynamics simulations to establish a map of mechanisms and size effects for plastic void growth in crystalline solids.
Abstract:
Electrical protection systems and Automatic Voltage Regulators (AVR) are essential components of modern power plants. Their installation and setting are performed during commissioning and require extensive experience, since any failure in this process, or in the settings themselves, may entail risks not only for the generator of the power plant but also for the reliability of the power grid. In this paper, a real-time power plant simulation platform is presented as a tool for improving the training and learning process on electrical protections and automatic voltage regulators. The activities of the commissioning procedure which can be practiced are described, and the applicability of this tool to improving the comprehension of this important part of power plants is discussed. A commercial AVR and a multifunction protective relay have been tested with satisfactory results.
Abstract:
Two mathematical models are used to simulate pollution in the Bay of Santander. The first is the hydrodynamic model, which provides the velocity field and the height of the water. The second computes the resulting pollutant concentration field. Both models are formulated as two-dimensional equations. Linear triangular finite elements are used in the Galerkin procedure for the spatial discretization, and a finite difference scheme is used for the time integration. At each time step the calculated results of the first model are input to the second model as field data. The efficiency and accuracy of the models are tested by applying them to a simple illustrative example. Finally, a case study simulating the evolution of pollution in the Bay of Santander is presented.
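The step-by-step data flow between the two models can be sketched with a much-reduced 1-D stand-in; the paper uses 2-D Galerkin finite elements, so the finite-difference transport update and the oscillating velocity below are illustrative assumptions only:

```python
import numpy as np

# Minimal 1-D stand-in for the coupling described above: at each time step
# the hydrodynamic model supplies a velocity field, which drives an
# explicit advection-diffusion update of the pollutant concentration.

nx, dx, dt, D = 200, 10.0, 1.0, 5.0            # grid, step sizes, diffusivity
c = np.zeros(nx); c[nx // 2] = 100.0           # initial pollutant spike

def hydrodynamic_velocity(step):
    """Stand-in for the first model: a tidally oscillating velocity field."""
    return 0.5 * np.sin(2 * np.pi * step / 400.0) * np.ones(nx)

for step in range(2000):
    u = hydrodynamic_velocity(step)            # model 1 output -> model 2 input
    adv = -u[1:-1] * (c[2:] - c[:-2]) / (2 * dx)        # central advection
    dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2    # diffusion
    c[1:-1] += dt * (adv + dif)
    c[0] = c[-1] = 0.0                         # open-boundary stand-in

print("peak concentration after 2000 steps:", c.max())
```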
Abstract:
This document contains a detailed description of the design and implementation of a multi-agent application controlling traffic lights in a city, together with a system for simulating traffic and testing. The goal of this thesis is to design and build a simplified, intelligent and distributed solution to the traffic problem of big cities, following good practices so as to allow future refinement of the model of the real world. Traffic in big cities remains an unsolved problem. Traffic jams are caused not only by the increasing number of cars but also by the way the traffic is organized. Usually, intersections with traffic lights are replaced by roundabouts or interchanges to increase the number of cars that can cross the intersection in a given time. But there are still places where the infrastructure cannot be changed and traffic lights are the only way to control the car flows. In real life, traffic lights follow a predefined switching plan, or they receive information from a centralized system about when and how to change. But what if the traffic lights could cooperate and decide on their own when and how to change? Using this problem, the purpose of the thesis is to explore different agent-based software engineering approaches to design and build a non-conventional distributed system. From the software engineering point of view, the goal of the thesis is to apply the knowledge and use the skills acquired during the various courses of the Master's program in Software Engineering while solving a practical and complex problem such as traffic in cities.
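The cooperative, decentralized decision-making alluded to above can be illustrated with a toy agent sketch; the agent design, message protocol, and numbers are our own illustration, not the thesis architecture:

```python
# Toy decentralized traffic-light cooperation: each agent serves the approach
# with the longest queue and tells its downstream neighbour how many cars it
# is about to release, instead of following a fixed or centralized plan.

class TrafficLightAgent:
    def __init__(self, name):
        self.name = name
        self.queues = {"NS": 0, "EW": 0}   # cars waiting per approach
        self.green = "NS"
        self.neighbours = {}               # direction -> downstream agent

    def notify_incoming(self, direction, cars):
        """Message from an upstream agent: cars are heading our way."""
        self.queues[direction] += cars

    def step(self, cars_per_green=5):
        # Local decision: switch the green to the longer queue.
        self.green = max(self.queues, key=self.queues.get)
        released = min(self.queues[self.green], cars_per_green)
        self.queues[self.green] -= released
        # Cooperation: warn the downstream neighbour on the green axis.
        downstream = self.neighbours.get(self.green)
        if downstream is not None:
            downstream.notify_incoming(self.green, released)

a, b = TrafficLightAgent("A"), TrafficLightAgent("B")
a.neighbours["EW"] = b
a.queues.update(NS=3, EW=9)
for _ in range(3):
    a.step(); b.step()
    print(a.name, a.queues, "green:", a.green, "|", b.name, b.queues)
```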
Abstract:
Shading reduces the power output of a photovoltaic (PV) system. The design engineering of PV systems requires modeling and evaluating shading losses. Some PV systems are affected by complex shading scenes whose resulting PV energy losses are very difficult to evaluate with current modeling tools. Several specialized PV design and simulation software packages include the possibility of evaluating shading losses. They generally possess a Graphical User Interface (GUI) through which the user can draw a 3D shading scene and then evaluate its corresponding PV energy losses. The complexity of the objects that these tools can handle is relatively limited. We have created a software solution, 3DPV, which allows evaluating the energy losses induced by complex 3D scenes on PV generators. The 3D objects can be imported from specialized 3D modeling software or from a 3D object library. The shadows cast by this 3D scene on the PV generator are then evaluated directly on the Graphics Processing Unit (GPU). Thanks to the recent development of GPUs for the video game industry, the shadows can be evaluated with a very high spatial resolution, reaching well beyond the PV cell level, in very short calculation times. A PV simulation model then translates the geometrical shading into PV energy output losses. 3DPV has been implemented using WebGL, which allows it to run directly in a Web browser, without requiring any local installation by the user. This also allows taking full advantage of information already available on the Internet, such as 3D object libraries. This contribution describes, step by step, the method that allows 3DPV to evaluate the PV energy losses caused by complex shading. We then illustrate the results of this methodology with several application cases encountered in the world of PV systems design. Keywords: 3D, modeling, simulation, GPU, shading, losses, shadow mapping, solar, photovoltaic, PV, WebGL
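The final translation step, from a geometrical shading map to an energy loss, can be sketched naively as follows; this linear-irradiance model is an illustrative simplification that ignores the electrical mismatch of series-connected cells, which a full PV simulation model such as the one in 3DPV would account for. All array shapes and data are illustrative:

```python
import numpy as np

def shading_loss(shadow_fraction, plane_irradiance):
    """Naive translation of geometrical shading into PV energy loss:
    output is assumed proportional to the irradiance each cell receives.

    shadow_fraction: (timesteps, cells) array from the shadow-map pass,
                     1.0 = fully shaded cell.
    plane_irradiance: (timesteps,) in-plane irradiance in W/m^2."""
    received = (1.0 - shadow_fraction) * plane_irradiance[:, None]
    unshaded = np.broadcast_to(plane_irradiance[:, None], shadow_fraction.shape)
    return 1.0 - received.sum() / unshaded.sum()

# Example: 24 hourly steps, 60 cells, random shading sweeping the generator.
rng = np.random.default_rng(4)
G = np.clip(1000 * np.sin(np.linspace(0, np.pi, 24)), 0, None)
shade = np.clip(rng.uniform(-0.5, 0.5, (24, 60)), 0, 1)
print(f"shading loss (toy data): {shading_loss(shade, G):.1%}")
```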