11 results for Probabilistic Projections
at Universidade do Minho
Abstract:
The assessment of existing timber structures is often limited to information obtained from non- or semi-destructive testing, as mechanical testing is in many cases not possible due to its destructive nature. Therefore, the available data provide only an indirect measurement of the reference mechanical properties of timber elements, often obtained through empirically based correlations. Moreover, the data must result from the combination of different tests, so as to provide a reliable source of information for a structural analysis. Even if general guidelines are available for each typology of testing, there is still a need for a global methodology that allows combining information from different sources and inferring upon that information in a decision process. In this scope, the present work presents the implementation of a probabilistic framework for the safety assessment of existing timber elements. This methodology combines information gathered at different scales and follows a probabilistic framework that allows the structural assessment of existing timber elements, with the possibility of inference on and updating of their mechanical properties through Bayesian methods. The framework is based on four main steps: (i) scale of information; (ii) measurement data; (iii) probability assignment; and (iv) structural analysis. In this work, the proposed methodology is implemented in a case study. Data were obtained through a multi-scale experimental campaign carried out on old chestnut timber beams, accounting for correlations of non- and semi-destructive tests with mechanical properties. Finally, different inference scenarios are discussed, aiming at the characterization of the safety level of the elements.
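To make the Bayesian updating step concrete, the following minimal sketch (our illustration, not the paper's implementation) updates a Normal prior for a timber reference property, assumed to come from an NDT-based empirical correlation, with a few direct measurements of known noise; all names and numbers are hypothetical.

import numpy as np

def update_normal(prior_mu, prior_sd, data, data_sd):
    # Conjugate Normal-Normal update with known measurement noise:
    # posterior precision is the sum of prior and data precisions.
    n = len(data)
    prec_prior = 1.0 / prior_sd ** 2
    prec_data = n / data_sd ** 2
    post_var = 1.0 / (prec_prior + prec_data)
    post_mu = post_var * (prec_prior * prior_mu + prec_data * np.mean(data))
    return post_mu, post_var ** 0.5

# Hypothetical case: prior bending strength (MPa) from drilling resistance,
# updated with two semi-destructive core measurements.
mu, sd = update_normal(40.0, 8.0, np.array([34.5, 37.2]), 5.0)
print(f"posterior bending strength: {mu:.1f} +/- {sd:.1f} MPa")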
Abstract:
A novel framework for probabilistic structural assessment of existing structures, which combines model identification and reliability assessment procedures while considering different sources of uncertainty in an objective way, is presented in this paper. A short description of structural assessment applications reported in the literature is given first. Then, the developed model identification procedure, supported by a robust optimization algorithm, is presented. Special attention is given to both experimental and numerical errors, which are considered in this algorithm's convergence criterion. An updated numerical model is obtained from this process. The reliability assessment procedure, which considers a probabilistic model of the structure under analysis, is then introduced, incorporating the results of the model identification procedure. The developed model is then updated, as new data are acquired, through a Bayesian inference algorithm that explicitly addresses statistical uncertainty. Finally, the developed framework is validated with a set of reinforced concrete beams that were loaded up to failure in the laboratory.
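As an illustration of the reliability assessment step (a sketch under our own assumptions, not the paper's algorithm), the snippet below estimates a failure probability and reliability index by Monte Carlo sampling of a limit state g = R - S, with the resistance R taken from a hypothetical updated model and the load effect S assumed lognormal.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 1_000_000
R = rng.normal(42.0, 5.0, n)                 # updated resistance (hypothetical)
S = rng.lognormal(np.log(25.0), 0.2, n)      # load effect (hypothetical)
pf = np.mean(R - S < 0.0)                    # probability of failure
beta = -norm.ppf(pf)                         # reliability index
print(f"pf = {pf:.2e}, beta = {beta:.2f}")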
Abstract:
In recent decades, increased interest has been evident in research on multi-scale hierarchical modelling in the field of mechanics, and also in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scales. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, to the representation of timber's mechanical properties and their inference, accounting for prior information obtained at different scales of importance. These methods allow the analysis of distinct timber reference properties, such as density, bending stiffness and strength, and hierarchically consider information obtained through different non-, semi- or fully destructive tests. The basis and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require an expert's knowledge to assess the correct statistical fitting and to define the correlation arrangement between properties.
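By way of example, the following sketch (illustrative values, not the chapter's data) carries out the kind of maximum-likelihood fitting the chapter discusses: fitting candidate distributions to a timber reference property and comparing them by log-likelihood.

import numpy as np
from scipy import stats

density = np.array([545.0, 590.0, 610.0, 575.0, 630.0, 560.0, 600.0, 585.0])  # kg/m^3

shape, loc, scale = stats.lognorm.fit(density, floc=0.0)   # MLE, location fixed at 0
ll_logn = stats.lognorm.logpdf(density, shape, loc, scale).sum()

mu, sd = stats.norm.fit(density)                           # Normal MLE for comparison
ll_norm = stats.norm.logpdf(density, mu, sd).sum()

print(f"lognormal: sigma={shape:.3f}, median={scale:.1f} kg/m^3, logL={ll_logn:.1f}")
print(f"normal:    mu={mu:.1f}, sd={sd:.1f} kg/m^3, logL={ll_norm:.1f}")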
Resumo:
PhD thesis in Bioengineering
Abstract:
Master's dissertation in Industrial Engineering
Abstract:
The currently available clinical imaging methods do not provide highly detailed information about the location and severity of axonal injury, or about the expected recovery time of patients with traumatic brain injury [1]. High-Definition Fiber Tractography (HDFT) is a novel imaging modality that allows visualizing and directly quantifying the degree of axonal damage, predicting functional deficits due to traumatic axonal injury and loss of cortical projections. This imaging modality is based on diffusion technology [2]. The lack of a phantom able to properly mimic the human brain hinders the possibility of testing, calibrating and validating these medical imaging techniques. Most research done in this area falls short on key points, such as the minimum reproducible size of brain fibers and the quick and easy reproducibility of phantoms [3]. For that reason, it is necessary to develop similar structures matching the micron scale of axon tubes. Flexible textiles can play an important role, since they allow producing controlled packing densities and crossing structures that closely match the crossing patterns of the human brain. To build a brain phantom, several parameters must be taken into account regarding materials selection, such as hydrophobicity, density and fiber diameter, since these factors directly influence the values of fractional anisotropy. Fiber cross-section shape is another important parameter. Earlier studies showed that synthetic fibrous materials are a good choice for building a brain phantom [4]. The present work is integrated in a broader project that aims to develop a brain phantom made of fibrous materials to validate and calibrate HDFT. Due to the similarity between axons and the thousands of hollow multifilaments in a fibrous arrangement, such as a yarn, low-twist polypropylene multifilament yarns were selected for this development. In this sense, extruded hollow filaments were analysed in a scanning electron microscope to characterize their main dimensions and shape. In order to approximate the dimensional scale of human axons, five types of polypropylene yarns with different linear density (denier) were used, aiming to understand the effect of linear density on the filament inner and outer areas. Moreover, in order to achieve the required dimensions, the cross-section of the polypropylene filaments was reduced in the drawing stage of a filament extrusion line. Subsequently, tensile tests were performed to characterize the mechanical behaviour of the hollow filaments and to evaluate the differences between stretched and non-stretched filaments. In general, an increase in linear density causes an increase in the size of the filament cross-section, as the worked example below illustrates. With the increase of the structural orientation of the filaments induced by stretching, breaking tenacity increases and elongation at break decreases. The production of hollow fibers with the required characteristics is one of the key steps to create a brain phantom that properly mimics the human brain and may be used for the validation and calibration of HDFT, an imaging approach that is expected to contribute significantly to brain-related research.
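The worked example below (a back-of-envelope sketch with handbook values, not the study's measurements) makes the linear-density claim concrete: denier is mass in grams per 9000 m of filament, so for a given polymer density the cross-sectional area grows linearly with denier.

RHO_PP = 0.91  # typical polypropylene density, g/cm^3

def solid_area_um2(denier, rho=RHO_PP):
    # Polymer cross-section area (um^2) of a solid filament:
    # mass per unit length divided by density.
    g_per_cm = denier / 9000.0 / 100.0
    return (g_per_cm / rho) * 1e8          # cm^2 -> um^2

for den in (1.0, 3.0, 6.0):
    a = solid_area_um2(den)
    d = (4.0 * a / 3.141592653589793) ** 0.5
    print(f"{den:.0f} den -> area {a:.0f} um^2, equivalent diameter {d:.1f} um")

# A hollow filament of the same denier has a larger outer diameter, since the
# same polymer area forms an annulus around the lumen.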
Abstract:
Master's dissertation in Industrial Engineering
Abstract:
In the trend towards tolerating hardware unreliability, accuracy is exchanged for cost savings. Running on less reliable machines, functionally correct code becomes risky, and one needs to know how risk propagates so as to mitigate it. Risk estimation, however, seems to live outside the average programmer's technical competence and core practice. In this paper we propose that program design by source-to-source transformation be risk-aware, in the sense of making probabilistic faults visible and supporting equational reasoning on the probabilistic behaviour of programs caused by faults. This reasoning is carried out in a linear algebra extension to the standard, à la Bird-de Moor, algebra of programming. This paper studies, in particular, the propagation of faults across the standard program transformation techniques known as tupling and fusion, enabling the fault of the whole to be expressed in terms of the faults of its parts.
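A toy rendering of the idea (our sketch, not the paper's linear-algebra-of-programming formalisation): total functions on a finite type become 0/1 column-stochastic matrices, a faulty component is a probabilistic choice between intended and faulty behaviour, and composition is matrix multiplication, so the fault of a fused pipeline is computed from the faults of its parts; tupling would correspond to a column-wise Kronecker (Khatri-Rao) product.

import numpy as np

def fun_to_matrix(f, src, dst):
    # Column-stochastic matrix of a total function f : src -> dst.
    M = np.zeros((dst, src))
    for x in range(src):
        M[f(x), x] = 1.0
    return M

N = 4
inc = fun_to_matrix(lambda x: (x + 1) % N, N, N)   # intended behaviour
stuck = fun_to_matrix(lambda x: 0, N, N)           # fault: output stuck at 0
faulty = 0.95 * inc + 0.05 * stuck                 # probabilistic choice, p = 0.95

pipeline = faulty @ faulty                         # fused pipeline (g . f)
x0 = np.eye(N)[:, 1]                               # point distribution at input 1
print(pipeline @ x0)   # mass 0.9025 on the correct output 3, rest on fault paths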
Abstract:
Software reconfigurability became increasingly relevant to the architectural process due to the growing dependency of modern societies on reliable and adaptable systems. Such systems are supposed to adapt themselves to surrounding environmental changes with minimal service disruption, if any. This paper introduces an engine that statically applies reconfigurations to (formal) models of software architectures. Reconfigurations are specified using a domain-specific language, ReCooPLa, which targets the manipulation of software coordination structures, typically used in service-oriented architectures (SOA). The engine is responsible for the compilation of ReCooPLa instances and their application to the relevant coordination structures. The resulting configurations are amenable to formal analysis of qualitative and quantitative (probabilistic) properties.
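As a rough analogue of what the engine does (ReCooPLa's actual syntax and semantics are those of the paper; everything below is a hypothetical stand-in), a coordination structure can be modelled as a set of typed channels, and a reconfiguration as a static rewrite producing a new configuration that can then be handed to analysis tools.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Channel:
    name: str
    src: str
    dst: str
    kind: str                     # e.g. "sync", "lossy", "fifo"

def reconfigure(channels, rule):
    # Statically apply a channel-rewriting rule; no runtime interaction.
    return frozenset(rule(c) for c in channels)

# Hypothetical rule: harden every lossy channel into a reliable sync one.
harden = lambda c: replace(c, kind="sync") if c.kind == "lossy" else c

cfg = frozenset({Channel("ab", "a", "b", "lossy"), Channel("bc", "b", "c", "sync")})
print(sorted(c.kind for c in reconfigure(cfg, harden)))    # ['sync', 'sync']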
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
First published online: December 16, 2014.