21 results for Probabilistic metrics
at Universidade do Minho
Abstract:
The assessment of existing timber structures is often limited to information obtained from non- or semi-destructive testing, as mechanical testing is in many cases not possible due to its destructive nature. The available data therefore provide only an indirect measurement of the reference mechanical properties of the timber elements, often obtained through empirically based correlations. Moreover, the data must result from the combination of different tests in order to provide a reliable source of information for a structural analysis. Even if general guidelines are available for each type of test, a global methodology is still needed to combine information from different sources and draw inferences from that information in a decision process. In this scope, the present work presents the implementation of a probabilistic framework for the safety assessment of existing timber elements. This methodology combines information gathered at different scales and follows a probabilistic framework allowing for the structural assessment of existing timber elements, with the possibility of inference on, and updating of, their mechanical properties through Bayesian methods. The probabilistic framework is based on four main steps: (i) scale of information; (ii) measurement data; (iii) probability assignment; and (iv) structural analysis. In this work, the proposed methodology is implemented in a case study. Data were obtained through a multi-scale experimental campaign on old chestnut timber beams, accounting for correlations of non- and semi-destructive tests with mechanical properties. Finally, different inference scenarios are discussed, aiming at the characterization of the safety level of the elements.
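A minimal sketch of the kind of Bayesian updating step such a framework involves: a conjugate normal-normal update of a reference property (here, bending strength in MPa), where the prior comes from an empirical NDT correlation and new semi-destructive measurements refine it. All numbers, and the conjugate-normal choice itself, are illustrative assumptions, not values or models from the paper.

```python
import numpy as np

# Prior for bending strength from an assumed empirical NDT correlation (MPa).
prior_mean, prior_sd = 40.0, 8.0
meas_sd = 5.0  # assumed measurement uncertainty of the semi-destructive test

# Hypothetical new test results, converted to MPa through a correlation model.
data = np.array([34.0, 38.5, 36.2])

# Conjugate normal-normal update of the mean (measurement variance known).
n = data.size
post_var = 1.0 / (1.0 / prior_sd**2 + n / meas_sd**2)
post_mean = post_var * (prior_mean / prior_sd**2 + data.sum() / meas_sd**2)

print(f"posterior mean = {post_mean:.1f} MPa, sd = {post_var**0.5:.1f} MPa")
```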
Abstract:
A novel framework for the probabilistic structural assessment of existing structures, which combines model identification and reliability assessment procedures and considers different sources of uncertainty in an objective way, is presented in this paper. A short description of structural assessment applications reported in the literature is given first. Then, the developed model identification procedure, supported by a robust optimization algorithm, is presented. Special attention is given to both experimental and numerical errors, which are considered in the algorithm's convergence criterion. An updated numerical model is obtained from this process. The reliability assessment procedure, which considers a probabilistic model of the structure under analysis, is then introduced, incorporating the results of the model identification procedure. The developed model is then updated, as new data are acquired, through a Bayesian inference algorithm that explicitly addresses statistical uncertainty. Finally, the developed framework is validated with a set of reinforced concrete beams that were loaded up to failure in the laboratory.
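As an illustration of the reliability-assessment step, the sketch below estimates a failure probability and the corresponding reliability index by crude Monte Carlo for a simple resistance-minus-load limit state. The distributions and their parameters are assumed for illustration; the paper's probabilistic model is more elaborate.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N = 1_000_000

# Assumed probabilistic model: lognormal resistance, normal load effect.
R = rng.lognormal(mean=np.log(50.0), sigma=0.10, size=N)  # resistance
S = rng.normal(loc=30.0, scale=5.0, size=N)               # load effect

# Failure whenever the limit state g = R - S is negative.
p_f = np.mean(R - S < 0.0)
beta = -norm.ppf(p_f)
print(f"p_f ≈ {p_f:.2e}, beta ≈ {beta:.2f}")
```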
Abstract:
In recent decades, increased interest has been evidenced in research on multi-scale hierarchical modelling in the field of mechanics, and also in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scale. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, in the representation of timber's mechanical properties and their inference, accounting for prior information obtained at different scales of importance. These methods make it possible to analyse distinct timber reference properties, such as density, bending stiffness and strength, and to hierarchically consider information obtained through different non-destructive, semi-destructive or destructive tests. The basis and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fit and to define the correlation arrangement between properties.
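A small sketch of the Maximum Likelihood step mentioned above: fitting candidate distributions to a sample of timber densities and comparing them by total log-likelihood. The sample values and the two candidate models are invented for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical chestnut density sample (kg/m3); values are invented.
density = np.array([545.0, 560.0, 610.0, 498.0, 575.0,
                    530.0, 590.0, 615.0, 505.0, 570.0])

# Maximum Likelihood fits of two candidate models.
ln_shape, ln_loc, ln_scale = stats.lognorm.fit(density, floc=0)  # location fixed at 0
n_mu, n_sigma = stats.norm.fit(density)

# Compare the fitted models by total log-likelihood.
ll_lognorm = stats.lognorm.logpdf(density, ln_shape, ln_loc, ln_scale).sum()
ll_norm = stats.norm.logpdf(density, n_mu, n_sigma).sum()
print(f"log-likelihood: lognormal = {ll_lognorm:.1f}, normal = {ll_norm:.1f}")
```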
Abstract:
Occupational risks in nanotechnology research laboratories are an important topic, since a great number of researchers are involved in this area. Risk assessment performed by both qualitative and quantitative methods is a necessary step for the management of occupational risks. Risk assessment can be performed by qualitative methods that have gathered consensus in the scientific community. It is also possible to use quantitative methods, based on different techniques and metrics, as indicative exposure limits are being established by several institutions. While performing the risk assessment, the information on the materials used is very important and, if it is not up to date, it can bias the assessment results. The risk of exposure to TiO2 nanoparticles was assessed in a research laboratory using a quantitative exposure method and qualitative risk assessment methods. It was found that the results from the direct-reading Condensation Particle Counter (CPC) equipment and the CB Nanotool seem to be related and aligned, while the results obtained with the Stoffenmanager Nano seem to indicate a higher risk level.
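A hedged sketch of the quantitative comparison implied above: a background-corrected CPC particle-number concentration checked against an indicative reference value. The reference value of 40,000 particles/cm3 (an IFA-style nano reference value for lower-density nanomaterials such as TiO2) and the readings are assumptions for illustration, not data from the study.

```python
def exposure_ratio(task_conc: float, background_conc: float,
                   reference_value: float = 40_000.0) -> float:
    """Ratio of the net particle-number concentration (particles/cm3)
    to the assumed nano reference value."""
    return max(task_conc - background_conc, 0.0) / reference_value

# Hypothetical CPC readings during the task and in the background.
ratio = exposure_ratio(task_conc=55_000.0, background_conc=12_000.0)
print(f"net exposure at {ratio:.0%} of the reference value")
```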
Abstract:
Human activity is very dynamic and subtle, and most physical environments are also highly dynamic, supporting a vast range of social practices that do not map directly into any immediate ubiquitous computing functionality. Identifying what is valuable to people is very hard and obviously leads to great uncertainty regarding the type of support needed and the type of resources needed to create such support. We have addressed the issues of system development through the adoption of a crowdsourced software development model [13]. We have designed and developed Anywhere places, an open and flexible system support infrastructure for ubiquitous computing that is based on a balanced combination of global services and applications with situated devices. Evaluation, however, is still an open problem. The characteristics of ubiquitous computing environments make their evaluation very complex: there are no globally accepted metrics, and it is very difficult to evaluate large-scale and long-term environments in real contexts. In this paper, we describe a first proposal for a hybrid 3D simulated prototype of Anywhere places that combines simulated and real components to generate a mixed reality which can be used to assess the envisaged ubiquitous computing environments [17].
Abstract:
This paper presents a proposal for a management model based on reliability requirements for Cloud Computing (CC). The proposal is based on a literature review focused on the problems, challenges and ongoing studies related to the safety and reliability of Information Systems (IS) in this technological environment. The review examined the existing obstacles and challenges from the point of view of respected authors on the subject. The main issues are addressed and structured as a model, called the "Trust Model for Cloud Computing environment". This is a proactive proposal that aims to organize and discuss management solutions for the CC environment, seeking improved reliability of IS application operation for both providers and their customers. Central to trust, one of the CC challenges is the development of models for mutual audit management agreements, so that a formal relationship can be established involving the relevant legal responsibilities. To establish and control the appropriate contractual requirements, it is necessary to adopt technologies that can collect the data needed to inform risk decisions, such as access usage, security controls, location and other references related to the use of the service. In this process, cloud service providers and consumers themselves must have metrics and controls to support cloud-use management in compliance with the SLAs agreed between the parties. The organization of these studies, and their dissemination in the market as a conceptual model able to establish parameters to regulate a reliable relationship between provider and user of IT services in the CC environment, is an interesting instrument to guide providers, developers and users in delivering secure and reliable services and applications.
Abstract:
Although most of the accidents occurring in Olive Oil Mills (OOM) result from "basic" risks, there is a need to apply adequate tools to support risk decisions that can meet the specificities of this sector. This study aims to analyse the views of Occupational Safety & Health (OSH) practitioners on the risk assessment process in OOM, identifying the key difficulties inherent to the risk assessment process in this sector, as well as some improvements to current practice. This analysis was based on a questionnaire that was developed and applied to 13 OSH practitioners working at OOM. The results showed that the time available to perform the risk assessment is the most frequent limitation. The practitioners believe that the methodologies available are not an important limitation to this process. However, a specific risk assessment methodology, including acceptance criteria adjusted to the OOM reality and using risk metrics supported by accident frequency and workdays lost, was also indicated as an important contribution to improving the process. A semi-quantitative approach, complemented with the use of sector accident statistics, can be a good solution for this sector. However, further strategies should also be adopted, mainly those that can lead to an easy application of the risk assessment process.
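A minimal sketch of the kind of semi-quantitative approach suggested above, scoring risk from sector accident frequency and workdays lost. The class boundaries and acceptance bands are illustrative assumptions, not values from the study.

```python
def frequency_class(accidents_per_year: float) -> int:
    """Map sector accident frequency to a 1-4 class (assumed boundaries)."""
    bounds = [0.5, 2.0, 5.0]
    return 1 + sum(accidents_per_year > b for b in bounds)

def severity_class(workdays_lost: float) -> int:
    """Map average workdays lost per accident to a 1-4 class (assumed boundaries)."""
    bounds = [3.0, 30.0, 90.0]
    return 1 + sum(workdays_lost > b for b in bounds)

def risk_level(accidents_per_year: float, workdays_lost: float) -> str:
    """Combine the two classes into an assumed acceptance band."""
    score = frequency_class(accidents_per_year) * severity_class(workdays_lost)
    return "acceptable" if score <= 4 else "tolerable" if score <= 8 else "unacceptable"

print(risk_level(accidents_per_year=1.2, workdays_lost=45.0))  # -> tolerable
```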
Abstract:
Doctoral thesis in Industrial and Systems Engineering (PDEIS)
Abstract:
Prognostic Health Management (PHM) has been asserting itself as the most promising methodology to enhance the effective reliability and availability of a product or system during its life cycle, by detecting current and approaching failures and thus mitigating system risks with reduced logistics and support costs. However, PHM is at an early stage of development, and there are concerns about possible shortcomings of its methods, tools, metrics and standardization. These factors have severely restricted the applicability of PHM and its adoption by industry. This paper presents a comprehensive literature review of the main general weaknesses of PHM. Exploring the research opportunities present in some recent publications, general guidelines for addressing these issues are discussed and outlined.
Abstract:
The main features of most components consist of simple basic functional geometries: planes, cylinders, spheres and cones. Shape and position recognition of these geometries is essential for the dimensional characterization of components and represents an important contribution to the life cycle of the product, concerning in particular the manufacturing and inspection processes of the final product. This work aims to establish an algorithm to automatically recognize such geometries, without operator intervention. Using differential geometry, large volumes of data can be processed and the basic functional geometries recognized. The original data can be obtained by rapid acquisition methods, such as 3D surveying or photography, and then converted into Cartesian coordinates. The satisfaction of intrinsic decision conditions allows different geometries to be quickly identified, without operator intervention. Since inspection is generally a time-consuming task, this method reduces operator intervention in the process. The algorithm was first tested using geometric data generated in MATLAB and then with a set of data points acquired by measuring real physical surfaces with a coordinate measuring machine and a 3D scanner. A comparison of the time spent in measuring is presented to show the advantage of the method. The results validated the suitability and potential of the proposed algorithm.
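The sketch below illustrates the kind of intrinsic decision conditions described above, classifying a surface from the per-point principal curvatures that a differential-geometry pipeline would estimate from the point cloud. The rules and the tolerance are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def classify_surface(k1: np.ndarray, k2: np.ndarray, tol: float = 1e-3) -> str:
    """Classify a surface from per-point principal curvatures (|k1| >= |k2|)."""
    kmax, kmin = np.abs(k1), np.abs(k2)
    if (kmax < tol).all():
        return "plane"        # both curvatures vanish everywhere
    if (kmin < tol).all() and np.ptp(kmax) < tol:
        return "cylinder"     # one curvature zero, the other constant
    if (kmin < tol).all():
        return "cone"         # one zero, the other varies along the axis
    if (np.abs(kmax - kmin) < tol).all() and np.ptp(kmax) < tol:
        return "sphere"       # equal, constant curvatures
    return "other"            # no simple quadric matched

# A cylinder of radius 20 mm: k1 = 1/20 = 0.05 everywhere, k2 = 0.
print(classify_surface(np.full(100, 0.05), np.zeros(100)))  # -> cylinder
```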
Abstract:
Integrated master's dissertation in Industrial Engineering and Management
Abstract:
The research aimed to establish tyre-road noise models using a Data Mining approach that made it possible to build a predictive model and assess the importance of the tested input variables. The data modelling took into account three learning algorithms and three metrics to define the best predictive model. The variables tested included basic properties of pavement surfaces, macrotexture, megatexture, and unevenness and, for the first time, damping. The importance of those variables was also measured using a sensitivity analysis procedure. Two types of models were built: one with basic variables and another with complex variables, such as megatexture and damping, all as a function of vehicle speed. More detailed models were additionally built for each speed level. As a result, several models with very good tyre-road noise predictive capacity were achieved. The most relevant variables were Speed, Temperature, Aggregate size, Mean Profile Depth, and Damping, which had the highest importance, even though influenced by speed. Megatexture and IRI had the lowest importance. The models developed in this work are applicable to truck tyre-noise prediction, represented by the AVON V4 test tyre, at the early stage of road pavement use. Therefore, the obtained models are highly useful for the design of pavements and for noise prediction by road authorities and contractors.
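The sketch below illustrates the model-selection loop described above: comparing several learning algorithms under multiple metrics with cross-validation. The synthetic data stand in for the paper's variables (speed, temperature, aggregate size, mean profile depth, damping); all names, numbers and algorithm choices are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_validate
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Synthetic stand-ins for speed, temperature, aggregate size, MPD, damping.
X = rng.uniform(size=(200, 5))
# Synthetic noise level in dB(A), driven mainly by "speed" and "MPD".
y = 70.0 + 25.0 * X[:, 0] + 5.0 * X[:, 3] + rng.normal(0.0, 1.0, 200)

models = {
    "linear": LinearRegression(),
    "random_forest": RandomForestRegressor(random_state=0),
    "mlp": MLPRegressor(max_iter=2000, random_state=0),
}
scoring = ("neg_root_mean_squared_error", "neg_mean_absolute_error", "r2")

# Rank the candidate models by cross-validated error.
for name, model in models.items():
    cv = cross_validate(model, X, y, scoring=scoring, cv=5)
    rmse = -cv["test_neg_root_mean_squared_error"].mean()
    print(f"{name}: RMSE = {rmse:.2f} dB(A), R2 = {cv['test_r2'].mean():.2f}")
```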
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management