925 results for loss, PBEE, PEER method, earthquake engineering
Abstract:
Stochastic model updating must be considered when quantifying the uncertainties inherent in real-world engineering structures. By this means the statistical properties, rather than deterministic values, of structural parameters can be sought, indicating the parameter variability. However, implementing stochastic model updating is much more complicated than implementing deterministic methods, particularly in terms of theoretical complexity and computational cost. This study proposes a simple and cost-efficient method that decomposes a stochastic updating process into a series of deterministic ones with the aid of response surface models and Monte Carlo simulation. The response surface models serve as surrogates for the original FE models, simplifying programming, speeding up response computation, and easing inverse optimization. Monte Carlo simulation generates samples from the assumed or measured probability distributions of the responses. Each sample corresponds to an individual deterministic inverse process that predicts deterministic parameter values. The parameter means and variances can then be estimated statistically from the parameter predictions obtained by running all the samples. Meanwhile, the analysis of variance approach is employed to evaluate the significance of parameter variability. The proposed method is demonstrated first on a numerical beam and then on a set of nominally identical steel plates tested in the laboratory. Compared with existing stochastic model updating methods, the proposed method achieves similar accuracy, while its primary merits are its simple implementation and its cost efficiency in response computation and inverse optimization.
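The sampling loop this abstract describes lends itself to a short sketch: each Monte Carlo response sample defines one deterministic inverse problem solved through a surrogate. The quadratic response surface, its coefficients, and the measured response statistics below are all hypothetical stand-ins, not the paper's actual models.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical quadratic response surface fitted beforehand to FE runs:
# maps structural parameters p to responses (e.g. natural frequencies).
# Coefficients are illustrative only.
def response_surface(p):
    p1, p2 = p
    return np.array([
        2.0 + 1.5 * p1 + 0.3 * p2 + 0.1 * p1 ** 2,
        5.0 + 0.4 * p1 + 2.2 * p2 + 0.2 * p2 ** 2,
    ])

rng = np.random.default_rng(0)
mean_resp = np.array([3.9, 7.8])          # assumed measured response means
cov_resp = np.diag([0.05, 0.08]) ** 2     # assumed measured response covariance

# Each sampled response vector is inverted deterministically.
predictions = []
for _ in range(2000):
    target = rng.multivariate_normal(mean_resp, cov_resp)
    sol = least_squares(lambda p: response_surface(p) - target, x0=[1.0, 1.0])
    predictions.append(sol.x)

predictions = np.asarray(predictions)
print("parameter means:    ", predictions.mean(axis=0))
print("parameter variances:", predictions.var(axis=0, ddof=1))
```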
Abstract:
A theory is developed of an electrostatic probe in a fully ionized plasma in the presence of a strong magnetic field. The ratio of the electron Larmor radius to the probe's transverse dimension is assumed to be small. Poisson's equation, together with kinetic equations for ions and electrons, is considered. An asymptotic perturbation method of multiple scales is used, based on the characteristic lengths appearing in the problem. The leading behavior of the solution is found. The results obtained appear to apply to weaker fields as well, agreeing with the solutions known in the limit of no magnetic field. The range of potentials for which results are presented is limited. The basic effects produced by the field are a depletion of the plasma near the probe and a non-monotonic potential surrounding the probe. The ion saturation current is unchanged, but changes appear in both the floating potential Vf and the slope of the current-voltage diagram at Vf. The transition region extends beyond the space potential Vs, at which point the current is largely reduced. The diagram does not have an exponential form in this region, as is commonly assumed. Electron collection does saturate. The extent to which the plasma is disturbed is determined. A cylindrical probe has no solution because of a logarithmic singularity at infinity. Extensions of the theory are considered.
Abstract:
A method to reduce the bruise susceptibility of apples by controlling the moisture loss of the fruit was evaluated. Previous research indicates that reducing the relative humidity of the storage air has an immediate effect on weight loss and on skin properties, and leads to a lower bruise susceptibility of apples. The diffusion equation is used to determine the water potential profile inside the fruit during storage. Characteristics of the water potential distribution in the fruit are related to measured bruise volumes. The results indicate how this model can be used to control bruise susceptibility.
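A minimal sketch of the kind of calculation the abstract implies: an explicit finite-difference solution of the diffusion equation for water potential in a spherically symmetric fruit, with a drier boundary value at the skin standing in for reduced storage-air humidity. The diffusivity, radius, and potential values are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

# Hypothetical values: diffusivity D (m^2/s), fruit radius R (m),
# initial water potential inside, and a drier value imposed at the skin.
D, R = 1e-9, 0.035
nr, nt = 70, 200_000
dr = R / (nr - 1)
dt = 0.1 * dr ** 2 / D            # within the explicit stability limit
r = np.linspace(0.0, R, nr)

psi = np.full(nr, -0.1)           # initial water potential (MPa), uniform
psi_skin = -2.0                   # boundary value at the skin (MPa)

for _ in range(nt):
    lap = np.zeros_like(psi)
    # spherical symmetry at the centre: Laplacian ~ 6*(psi[1]-psi[0])/dr^2
    lap[0] = 6.0 * (psi[1] - psi[0]) / dr ** 2
    lap[1:-1] = ((psi[2:] - 2 * psi[1:-1] + psi[:-2]) / dr ** 2
                 + (2.0 / r[1:-1]) * (psi[2:] - psi[:-2]) / (2 * dr))
    psi = psi + D * dt * lap
    psi[-1] = psi_skin            # fixed water potential at the skin

print("water potential profile (centre -> skin):", psi[::10])
```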
Abstract:
Bruise damage is a major cause of quality loss for apples. It would be very useful to establish a method of characterizing bruise susceptibility in order to improve fruit handling; at present, Magness-Taylor firmness is sometimes used as an indirect guide to handling requirements. The objective of the present work was to achieve a better prediction of bruise susceptibility.
Resumo:
Dimensionality Reduction (DR) is attracting more attention these days as a result of the increasing need to handle huge amounts of data effectively. DR methods allow the number of initial features to be reduced considerably until a set of them is found that allows the original properties of the data to be kept. However, their use entails an inherent loss of quality that is likely to affect the understanding of the data, in terms of data analysis. This loss of quality could be determinant when selecting a DR method, because of the nature of each method. In this paper, we propose a methodology that allows different DR methods to be analyzed and compared as regards the loss of quality produced by them. This methodology makes use of the concept of preservation of geometry (quality assessment criteria) to assess the loss of quality. Experiments have been carried out by using the most well-known DR algorithms and quality assessment criteria, based on the literature. These experiments have been applied on 12 real-world datasets. Results obtained so far show that it is possible to establish a method to select the most appropriate DR method, in terms of minimum loss of quality. Experiments have also highlighted some interesting relationships between the quality assessment criteria. Finally, the methodology allows the appropriate choice of dimensionality for reducing data to be established, whilst giving rise to a minimum loss of quality.
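As a concrete illustration of comparing DR methods under a neighbourhood-preservation criterion, the sketch below scores two standard algorithms with scikit-learn's trustworthiness measure on a real dataset. This is one criterion only; the paper's methodology combines several, so treat this as a minimal example rather than the full procedure.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap, trustworthiness

# Load a real dataset and compare two DR methods under one
# quality criterion (trustworthiness: how well low-dimensional
# neighbourhoods reflect the original geometry).
X, _ = load_digits(return_X_y=True)

methods = {
    "PCA": PCA(n_components=2),
    "Isomap": Isomap(n_components=2, n_neighbors=10),
}

for name, reducer in methods.items():
    X_low = reducer.fit_transform(X)
    t = trustworthiness(X, X_low, n_neighbors=10)
    print(f"{name}: trustworthiness = {t:.3f}")
```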
Abstract:
Fission product yields are fundamental parameters for several nuclear engineering calculations, in particular for burn-up/activation problems. The impact of their uncertainties was widely studied in the past and evaluations were released, although still incomplete. Recently, the nuclear community expressed the need for full fission yield covariance matrices to produce inventory calculation results that take the complete uncertainty data into account. In this work, we studied and applied a Bayesian/generalised least-squares method for covariance generation, and compared the generated uncertainties to the original data stored in the JEFF-3.1.2 library. We then focused on the effect of fission yield covariance information on fission pulse decay heat results for thermal fission of 235U. Calculations were carried out using different codes (ACAB and ALEPH-2) after introducing the new covariance values, and the results were compared with those obtained with the uncertainty data currently provided by the library. The uncertainty quantification was performed with the Monte Carlo sampling technique. Indeed, correlations between fission yields strongly affect the statistics of decay heat.

Introduction
Nowadays, any engineering calculation performed in the nuclear field should be accompanied by an uncertainty analysis, in which different sources of uncertainty are taken into account. Works such as those performed under the UAM project (Ivanov, et al., 2013) treat nuclear data as a source of uncertainty, in particular cross-section data, for which uncertainties in the form of covariance matrices are already provided in the major nuclear data libraries. Meanwhile, fission yield uncertainties were often neglected or treated shallowly, because their effects were considered of second order compared to cross-sections (Garcia-Herranz, et al., 2010). However, the Working Party on International Nuclear Data Evaluation Co-operation (WPEC)
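The role of covariance information in the Monte Carlo sampling step can be sketched briefly: draw correlated yield vectors from a multivariate distribution and propagate them through a decay-heat response. The yields, correlation matrix, and heat kernel below are toy stand-ins, not JEFF-3.1.2 data, and a real calculation would run a full inventory code.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative stand-ins: mean yields for a handful of fission products
# and an assumed covariance with off-diagonal terms.
y_mean = np.array([0.062, 0.058, 0.031, 0.020])
corr = np.array([
    [ 1.0, -0.5,  0.1,  0.0],
    [-0.5,  1.0,  0.0,  0.1],
    [ 0.1,  0.0,  1.0, -0.3],
    [ 0.0,  0.1, -0.3,  1.0],
])
sigma = 0.05 * y_mean                      # 5 % relative uncertainty
cov = corr * np.outer(sigma, sigma)

# Toy decay-heat kernel: contribution of each nuclide at one cooling time.
heat_per_yield = np.array([1.3, 0.9, 2.1, 0.4])   # arbitrary units

samples = rng.multivariate_normal(y_mean, cov, size=10_000)
decay_heat = samples @ heat_per_yield
print(f"decay heat: mean = {decay_heat.mean():.4f}, "
      f"std = {decay_heat.std(ddof=1):.4f}")

# With correlations zeroed out the spread changes, which is why full
# covariance matrices matter for the statistics of decay heat.
uncorr = rng.multivariate_normal(y_mean, np.diag(sigma ** 2), size=10_000)
print(f"uncorrelated std = {(uncorr @ heat_per_yield).std(ddof=1):.4f}")
```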
Abstract:
The Software Engineering (SE) community has historically focused on models that represent functionality and persistence, pushing interaction modelling into the background, where it has been covered by the Human-Computer Interaction (HCI) community. Recently, adequately modelling interaction, and specifically usability, has come to be considered a key factor in user acceptance, making the integration of the SE and HCI communities more necessary. If we focus on the Model-Driven Development (MDD) paradigm, we notice a lack of proposals for dealing with usability features from the very first steps of the software development process. In general, usability features are implemented manually once the code has been generated from models. This contradicts the MDD paradigm, which claims that all the analysts' effort should be focused on building models, with code generation relegated to model-to-code transformations. Moreover, usability features related to functionality may involve important changes in the system architecture if they are not considered from the early steps. We argue that these usability features related to functionality can be represented abstractly in a conceptual model, and that their implementation can be carried out automatically.
Abstract:
Comparing the different bids submitted in the tender for a project under the traditional contract system of open re-measurement and fixed unit rates requires analysis tools capable of discriminating between proposals that, while having a similar overall amount, may have very different economic impacts during execution. One situation not easily detected by traditional methods is the behaviour of the actual cost in response to deviations of the quantities actually executed on site from those estimated in the project. This paper proposes to address this situation through a quantitative risk analysis technique, the Monte Carlo method. This procedure, as is well known, lets the input data defining the problem vary within defined probability functions, generates a large number of test cases, and treats the results statistically to obtain the most probable final values, together with the parameters needed to measure the reliability of the estimate. We present a model for the comparison of bids, designed so that it can be applied to real cases by subjecting the known data to variation conditions that are easy to set up by the professionals who perform these tasks.
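A minimal sketch of the bid-comparison idea under stated assumptions: two hypothetical bids with nearly equal totals at the estimated quantities, executed quantities varied stochastically, and the resulting cost distributions compared. The quantities, rates, and lognormal variation model are all illustrative choices a practitioner would replace.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical tender: estimated quantities per work item and the unit
# rates offered by two bidders. Totals at estimated quantities are close.
qty_est = np.array([1000.0, 500.0, 250.0])
rates_a = np.array([12.0, 30.0, 80.0])
rates_b = np.array([15.0, 33.0, 60.2])
print("bid A at estimate:", qty_est @ rates_a)
print("bid B at estimate:", qty_est @ rates_b)

# Let executed quantities vary around the estimate, e.g. lognormally
# with ~20 % dispersion (an assumption to be tuned per item).
n = 50_000
qty = qty_est * rng.lognormal(mean=0.0, sigma=0.2, size=(n, 3))

cost_a = qty @ rates_a
cost_b = qty @ rates_b

for name, cost in (("A", cost_a), ("B", cost_b)):
    print(f"bid {name}: mean = {cost.mean():,.0f}, "
          f"P90 = {np.percentile(cost, 90):,.0f}")
print("P(A cheaper than B) =", (cost_a < cost_b).mean())
```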
Abstract:
Enabling Subject Matter Experts (SMEs) to formulate knowledge without the intervention of Knowledge Engineers (KEs) requires providing SMEs with methods and tools that abstract away the underlying knowledge representation and allow them to focus on modeling activities. Bridging the gap between SME-authored models and their representation is challenging, especially for complex knowledge types like processes, where aspects such as frame management, data flow, and control flow need to be addressed. In this paper, we describe how SME-authored process models can be given an operational semantics and grounded in a knowledge representation language like F-logic in order to support process-related reasoning. The main results of this work include a formalism for process representation and a mechanism for automatically translating process diagrams into executable code following that formalism. Of all the process models authored by SMEs during evaluation, 82% were well-formed, and all of those executed correctly. Additionally, the two optimizations applied to the code generation mechanism produced performance improvements at reasoning time of 25% and 30%, respectively, with respect to the base case.
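To make the idea of giving process diagrams an operational semantics concrete, here is a deliberately simplified sketch: the paper grounds models in F-logic, whereas this toy represents a diagram as plain data and walks it with a tiny interpreter. The Step/Process structures are hypothetical illustrations, not the paper's formalism.

```python
from dataclasses import dataclass, field

# A simplified stand-in for an SME-authored process diagram: named
# steps, each with an action and the next step to follow.
@dataclass
class Step:
    name: str
    action: callable
    next_step: str | None = None

@dataclass
class Process:
    steps: dict = field(default_factory=dict)
    start: str = "start"

    def add(self, step: Step):
        self.steps[step.name] = step

    def run(self, context: dict):
        current = self.steps.get(self.start)
        while current is not None:
            current.action(context)
            current = self.steps.get(current.next_step)
        return context

proc = Process()
proc.add(Step("start", lambda ctx: ctx.update(total=0), "accumulate"))
proc.add(Step("accumulate",
              lambda ctx: ctx.update(total=sum(ctx["items"])), "report"))
proc.add(Step("report", lambda ctx: print("total:", ctx["total"])))

proc.run({"items": [3, 4, 5]})   # prints: total: 12
```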
Abstract:
In order to evaluate the ground shaking characteristics due to surface soil layers in the urban area of Port-au-Prince, short-period ambient noise observations were performed on an approximately 500 x 500 m grid. The HVSR method was applied to this set of 36 ambient noise measurement points to produce a map of predominant soil periods. This map reveals a general increasing trend in the period values, from the Miocene conglomerates in the northern and southern parts of the town to the central and western zones formed of Pleistocene and Holocene alluvial deposits respectively, where the shallow geological materials covering the basement increase in thickness. Shorter predominant periods (less than 0.3 s) were found in mountainous and neighbouring zones, where the sediments are thinner, whereas longer periods (greater than 0.5 s) appear in Holocene alluvial fans, where the sediments are thicker. The shallow shear-wave velocity structure has been estimated by inverting Rayleigh wave dispersion data obtained from vertical-component array records of ambient noise. The measurements were carried out in one open space located on Holocene alluvial deposits, using three regular pentagonal arrays of 5, 10 and 20 m, respectively. Reliable dispersion curves were retrieved for frequencies between 4.0 and 14 Hz, with phase velocities ranging from 420 m/s down to 270 m/s. Finally, the average shear-wave velocity of the upper 30 m (VS30) was derived from the inverted profile to characterize this geological unit.
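The VS30 value mentioned at the end follows a standard definition: the travel-time average of shear-wave velocity over the top 30 m. The short sketch below applies it to a purely illustrative layered model, not the inverted Port-au-Prince profile.

```python
# VS30 = 30 / sum(h_i / v_i), the travel-time average over the top 30 m.
# The layered model below is illustrative only.
layers = [  # (thickness in m, shear-wave velocity in m/s)
    (4.0, 180.0),
    (10.0, 270.0),
    (16.0, 420.0),
]

assert abs(sum(h for h, _ in layers) - 30.0) < 1e-9
vs30 = 30.0 / sum(h / v for h, v in layers)
print(f"VS30 = {vs30:.0f} m/s")
```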
Abstract:
In recent years, many techniques from acoustic signal processing have been used for different applications. In most cases, these sensor systems are based on determining the times of flight of the signals arriving at each transducer. This paper presents a generalization of flat-plate methods for impact detection and location to structures based on linear links or bars. The use of three piezoelectric sensors allows the impact position and impact time to be obtained, while additional sensors cover a larger detection area and help avoid erroneous timing-difference measurements. An experimental setup and some experimental results are briefly presented.
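The three-sensor case can be sketched as a small inverse problem: with a known, constant propagation speed, each sensor's arrival time constrains the unknown impact point and impact time. The geometry, wave speed, and solver choice below are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical setup: three piezoelectric sensors on a flat plate and a
# known, constant propagation speed. Unknowns: impact point (x, y) and
# impact time t0; each sensor i gives |p - s_i| = v * (t_i - t0).
v = 1500.0                                  # assumed wave speed, m/s
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])

def arrival_times(x, y, t0):
    return t0 + np.linalg.norm(sensors - [x, y], axis=1) / v

t_meas = arrival_times(0.3, 0.4, 2.0e-3)    # synthetic measurements

def residuals(u):
    x, y, t0 = u
    return arrival_times(x, y, t0) - t_meas

sol = least_squares(residuals, x0=[0.5, 0.5, 0.0])
x, y, t0 = sol.x
print(f"impact at ({x:.3f}, {y:.3f}) m, t0 = {t0 * 1e3:.3f} ms")
```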
Abstract:
In this letter to the editor, Professor Enrique Alarcón Álvarez comments on the article by Thomas J. Rudolphi, "An implementation of the Boundary Element Method for zoned media with stress discontinuities", published in the "International Journal for Numerical Methods in Engineering", Vol. 19, No. 1, pp. 1-15, January 1983.
Abstract:
The bankability of CPV projects is an important issue in paving the way toward swift and sustained growth of this technology. The bankability of a PV plant is generally addressed through modeling its energy yield under a baseline loss scenario, followed by an on-site measurement campaign aimed at verifying its energetic behavior. The main difference between PV and CPV resides in the CPV modules themselves, in particular in the inclusion of optical elements and III-V multijunction cells that are much more sensitive to spectral variations than xSi cells, while the rest of the system behaves in a way that has many points in common with xSi technology. Modeling the DC power output of a CPV system therefore requires several important second-order parameters to be considered, mainly related to optics, the spectrum of direct solar radiation, wind speed, tracker accuracy, and heat dissipation of the cells.
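One common way to fold such second-order effects into a DC power estimate is a multiplicative loss-factor model; the sketch below is a minimal version of that idea, in which every coefficient, the function name, and the reference irradiance are illustrative assumptions rather than a validated CPV parameterization.

```python
# A minimal multiplicative loss-factor sketch for CPV DC output;
# every coefficient below is an illustrative assumption.
def cpv_dc_power(dni, p_rated=500.0, dni_ref=900.0,
                 f_optics=0.82,      # optical efficiency of the module
                 f_spectral=0.97,    # spectral match vs. reference spectrum
                 f_tracker=0.99,     # pointing/tracking accuracy losses
                 f_thermal=0.98):    # cell heat-dissipation derating
    """DC power (W) of one module at direct normal irradiance dni (W/m2)."""
    return (p_rated * (dni / dni_ref)
            * f_optics * f_spectral * f_tracker * f_thermal)

print(f"DC power at 850 W/m2: {cpv_dc_power(850.0):.0f} W")
```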
Abstract:
The aim of this paper is to develop a probabilistic modeling framework for segmenting structures of interest from a collection of atlases. Given a subset of atlases registered to the target image for a particular Region of Interest (ROI), a statistical model of appearance and shape is computed for fusing the labels. Segmentations are obtained by minimizing an energy function associated with the proposed model, using a graph-cut technique. We test different label fusion methods on publicly available MR images of human brains.
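For context on what "label fusion" means in practice, the sketch below implements the simplest baseline, majority voting over registered atlases; the paper's framework replaces this with a statistical appearance/shape model and graph-cut energy minimization. The random label maps are stand-ins for propagated atlas segmentations.

```python
import numpy as np

# Majority-vote label fusion over atlases already registered to the
# target image (random stand-ins here for propagated label maps).
rng = np.random.default_rng(1)
n_atlases, h, w = 5, 4, 4
labels = rng.integers(0, 2, size=(n_atlases, h, w))   # 0 = background, 1 = ROI

votes = labels.sum(axis=0)                 # how many atlases say "ROI"
fused = (votes > n_atlases / 2).astype(np.uint8)
print(fused)
```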
Abstract:
The design of retaining walls subjected to seismic loads has traditionally been carried out with pseudo-static procedures such as the Mononobe-Okabe method, which on certain occasions has led to unsafe designs and to the failure of many retaining walls under earthquake action. This paper proposes a method for the design of retaining walls in different soils, subjected to the action of an earthquake, based on Performance-Based Seismic Design.
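For reference, the classical Mononobe-Okabe method the abstract criticizes reduces to a closed-form active earth-pressure coefficient; a short implementation follows, with all input values (soil properties, seismic coefficients, wall geometry) chosen purely for illustration.

```python
import math

# Classical Mononobe-Okabe active earth-pressure coefficient.
def mononobe_okabe_kae(phi, delta, beta, i, kh, kv):
    """All angles in radians: phi = soil friction angle, delta = wall-soil
    friction, beta = wall back-face inclination from vertical, i = backfill
    slope; kh, kv = horizontal/vertical seismic coefficients."""
    theta = math.atan(kh / (1.0 - kv))   # seismic inertia angle
    num = math.cos(phi - theta - beta) ** 2
    root = math.sqrt(
        (math.sin(phi + delta) * math.sin(phi - theta - i))
        / (math.cos(delta + beta + theta) * math.cos(i - beta))
    )
    den = (math.cos(theta) * math.cos(beta) ** 2
           * math.cos(delta + beta + theta) * (1.0 + root) ** 2)
    return num / den

deg = math.radians
kae = mononobe_okabe_kae(phi=deg(34), delta=deg(17), beta=0.0,
                         i=0.0, kh=0.2, kv=0.0)

# Total active thrust on a wall of height H retaining soil of unit
# weight gamma: P_AE = 0.5 * K_AE * gamma * H^2 * (1 - kv).
gamma, H = 18.0, 6.0                      # kN/m3, m (hypothetical)
print(f"K_AE = {kae:.3f}, P_AE = {0.5 * kae * gamma * H**2:.1f} kN/m")
```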