983 results for Probabilistic analysis
Abstract:
Until recently, the use of biometrics was restricted to high-security environments and criminal identification applications, for economic and technological reasons. In recent years, however, biometric authentication has become part of people's daily lives. The large-scale use of biometrics has shown that users within a system may exhibit different degrees of accuracy: some people may have trouble authenticating, while others may be particularly vulnerable to imitation. Recent studies have investigated and identified these types of users, naming them after animals: Sheep, Goats, Lambs, Wolves, Doves, Chameleons, Worms and Phantoms. The aim of this study is to evaluate the existence of these user types in a fingerprint database and to propose a new way of investigating them, based on the verification performance between subjects' samples. After introducing some basic concepts of biometrics and fingerprints, we present the biometric menagerie and how to evaluate it.
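The menagerie labels above are conventionally assigned from per-user genuine and impostor match-score statistics. The following is a generic, illustrative sketch of that classical analysis on simulated scores (all data and thresholds are assumptions); it is not the subject-versus-subject evaluation proposed in this work.

```python
# Generic sketch of a "biometric menagerie" classification from per-user
# score statistics (goats, lambs, wolves). Scores are simulated; thresholds
# (10th/90th percentiles) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(6)
n_users, n_genuine, n_impostor = 100, 10, 50

user_skill = rng.normal(0.0, 0.3, n_users)                           # latent per-user effect (assumed)
genuine = rng.normal(2.0 + user_skill[:, None], 0.5, (n_users, n_genuine))
impostor_as_target = rng.normal(-1.0 - user_skill[:, None], 0.5, (n_users, n_impostor))
impostor_as_attacker = rng.normal(-1.0, 0.5, (n_users, n_impostor))

g_mean = genuine.mean(axis=1)               # mean genuine score per user
lt_mean = impostor_as_target.mean(axis=1)   # mean impostor score when this user is the target
la_mean = impostor_as_attacker.mean(axis=1) # mean impostor score when this user attacks others

goats = np.where(g_mean <= np.percentile(g_mean, 10))[0]    # chronically low genuine scores
lambs = np.where(lt_mean >= np.percentile(lt_mean, 90))[0]  # easy to imitate as targets
wolves = np.where(la_mean >= np.percentile(la_mean, 90))[0] # unusually successful as attackers
print(len(goats), len(lambs), len(wolves))
```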
Abstract:
Assessing the safety of existing timber structures is of paramount importance for taking reliable decisions on repair actions and their extent. The results obtained through semi-probabilistic methods are unrealistic, as the partial safety factors in codes are calibrated for the uncertainty present in new structures. To overcome these limitations, and also to include the effects of decay in the safety analysis, probabilistic methods based on Monte Carlo simulation are applied here to assess the safety of existing timber structures. In particular, the impact of decay on structural safety is analyzed and discussed using a simple structural model, similar to that used for current semi-probabilistic analysis.
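As a rough illustration of this kind of Monte Carlo safety check, the sketch below evaluates a simple bending limit state with decay modelled as a reduction of the section modulus. All distributions and parameter values are assumptions for illustration, not the model used in the work above.

```python
# Illustrative Monte Carlo reliability check for a decayed timber member
# (bending limit state g = f_m * W - M). All distributions are assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 1_000_000

f_m = rng.lognormal(mean=np.log(30.0), sigma=0.2, size=n)   # bending strength [N/mm^2] (assumed)
decay = rng.beta(2, 8, size=n)                              # relative loss of section depth (assumed)
W = 1.0e6 * (1.0 - decay) ** 2                              # residual section modulus [mm^3]
M = rng.gumbel(loc=12.0e6, scale=1.5e6, size=n)             # bending moment from loads [N.mm] (assumed)

g = f_m * W - M                       # limit state: resistance minus load effect
pf = np.mean(g < 0.0)                 # Monte Carlo estimate of the failure probability
print(f"pf ≈ {pf:.3e}, reliability index beta ≈ {-norm.ppf(pf):.2f}")
```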
Abstract:
In this paper, a framework based on the decomposition of the first-order optimality conditions is described and applied to solve the Probabilistic Power Flow (PPF) problem in a coordinated but decentralized way in the context of multi-area power systems. The purpose of the decomposition framework is to solve the problem by iteratively solving smaller subproblems associated with each area of the power system. This strategy allows the probabilistic analysis of the variables of interest in a particular area without explicit knowledge of the network data of the other interconnected areas; only border information related to the tie-lines between areas needs to be exchanged. An efficient method for probabilistic analysis, considering uncertainty in n system loads, is applied. The proposal is to use a particular case of the point estimate method, known as the Two-Point Estimate Method (TPM), rather than the traditional approach based on Monte Carlo simulation. The main feature of the TPM is that it only requires solving 2n power flows to obtain the behavior of any random variable. An iterative coordination algorithm between areas is also presented. This algorithm solves the multi-area PPF problem in a decentralized way, ensures the independent operation of each area, and integrates the decomposition framework and the TPM appropriately. The IEEE RTS-96 system is used to show the operation and effectiveness of the proposed approach, and Monte Carlo simulations are used to validate the results. © 2011 IEEE.
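For readers unfamiliar with the TPM, the sketch below shows Hong's 2n-point scheme on a toy stand-in for the power flow solver: each uncertain load is perturbed to two locations, and weighted model outputs reproduce the first two moments of the variable of interest. The toy_power_flow function and all numbers are illustrative assumptions, not the paper's system.

```python
# Sketch of the two-point estimate method (Hong's 2n scheme). The "power
# flow" here is a toy nonlinear function standing in for an AC power flow
# solver; means, standard deviations and skewness values are assumed.
import numpy as np

def toy_power_flow(loads):
    """Stand-in for a power flow solve; returns one variable of interest."""
    return 1.05 - 0.02 * loads.sum() - 0.001 * (loads ** 2).sum()

mu = np.array([1.0, 0.8, 1.2])       # assumed load means [p.u.]
sigma = np.array([0.1, 0.08, 0.12])  # assumed load standard deviations
skew = np.zeros_like(mu)             # assumed zero skewness
n = len(mu)

m1 = m2 = 0.0                        # accumulators for E[Z] and E[Z^2]
for k in range(n):
    half = skew[k] / 2.0
    xi1, xi2 = half + np.sqrt(n + half**2), half - np.sqrt(n + half**2)
    w1, w2 = -xi2 / (n * (xi1 - xi2)), xi1 / (n * (xi1 - xi2))
    for xi, w in ((xi1, w1), (xi2, w2)):
        x = mu.copy()
        x[k] = mu[k] + xi * sigma[k]     # perturb only the k-th load
        z = toy_power_flow(x)            # one of the 2n model evaluations
        m1 += w * z
        m2 += w * z**2

print(f"E[Z] ≈ {m1:.4f}, Std[Z] ≈ {np.sqrt(m2 - m1**2):.4f}")
```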
Abstract:
The inherent stochastic character of most of the physical quantities involved in engineering models has led to an ever-increasing interest in probabilistic analysis. Many approaches to stochastic analysis have been proposed. However, it is widely acknowledged that the only universal method available to solve accurately any kind of stochastic mechanics problem is Monte Carlo simulation. One of the key parts in the implementation of this technique is the accurate and efficient generation of samples of the random processes and fields involved in the problem at hand. In the present thesis, an original method for the simulation of homogeneous, multi-dimensional, multi-variate, non-Gaussian random fields is proposed. The algorithm has proved to be very accurate in matching both the target spectrum and the marginal probability. The computational efficiency and robustness are very good too, even when dealing with strongly non-Gaussian distributions. Moreover, the resulting samples possess all the relevant, well-defined and desired properties of "translation fields", including crossing rates and distributions of extremes. The topic of the second part of the thesis lies in the field of non-destructive parametric structural identification. Its objective is to evaluate the mechanical characteristics of the constituent bars in existing truss structures, using static loads and strain measurements. In the cases of missing data and of damage that affects only a small portion of a bar, Genetic Algorithms have proved to be an effective tool to solve the problem.
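A minimal one-dimensional sketch of the translation-field idea follows: a Gaussian sample is generated by spectral representation and then mapped through the standard normal CDF and the inverse CDF of the target marginal (here lognormal). This illustrates the general concept only; the spectrum and marginal are assumptions, and the thesis's iterative spectrum-matching algorithm is not reproduced.

```python
# Minimal 1-D translation-field sketch: Gaussian sample by spectral
# representation, then mapped to a lognormal marginal via Phi and the
# target inverse CDF. Spectrum, grid and marginal are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
N, dw = 256, 0.05                      # number of harmonics, frequency step
w = (np.arange(N) + 0.5) * dw          # discretised frequencies
S = np.exp(-w**2)                      # assumed one-sided power spectrum of the Gaussian field
x = np.linspace(0.0, 100.0, 2000)      # spatial grid

phases = rng.uniform(0.0, 2.0 * np.pi, size=N)
A = np.sqrt(2.0 * S * dw)              # harmonic amplitudes
g = np.sqrt(2.0) * (A[:, None] * np.cos(np.outer(w, x) + phases[:, None])).sum(axis=0)
g /= g.std()                           # standardise the underlying Gaussian sample

u = stats.norm.cdf(g)                  # map to uniform via the standard normal CDF
f = stats.lognorm(s=0.5).ppf(u)        # translation to the target non-Gaussian marginal
print(f.mean(), f.std())
```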
Abstract:
The Pacaya volcanic complex is part of the Central American volcanic arc, which is associated with the subduction of the Cocos tectonic plate under the Caribbean plate. Located 30 km south of Guatemala City, Pacaya is situated on the southern rim of the Amatitlan Caldera. It is the largest post-caldera volcano, and has been one of Central America's most active volcanoes over the last 500 years. Between 400 and 2000 years B.P., the Pacaya volcano experienced a huge collapse, which resulted in the formation of a horseshoe-shaped scarp that is still visible. In recent years, several smaller collapses have been associated with the activity of the volcano (in 1961 and 2010), affecting its northwestern flanks; these are likely induced by local and regional stress changes. The similar orientation of dry and volcanic fissures and the distribution of new vents would likely explain the reactivation of the pre-existing stress configuration responsible for the old collapse. This paper presents the first stability analysis of the Pacaya volcanic flank. The inputs for the geological and geotechnical models were defined based on stratigraphical, lithological and structural data, and on material properties obtained from field surveys and laboratory tests. According to their mechanical characteristics, three lithotechnical units were defined: Lava, Lava-Breccia and Breccia-Lava. The Hoek-Brown failure criterion was applied to each lithotechnical unit, and the rock mass friction angle, apparent cohesion, and strength and deformation characteristics were computed over a specified stress range. Further, the stability of the volcano was evaluated by two-dimensional analyses performed with the Limit Equilibrium Method (LEM, ROCSCIENCE) and the Finite Element Method (FEM, PHASE 2 7.0). The stability analysis mainly focused on the modern Pacaya volcano built inside the collapse amphitheatre of "Old Pacaya". The volcanic instability was assessed based on the variability of the safety factor using deterministic, sensitivity and probabilistic analyses, considering gravitational instability and the effects of external forces such as magma pressure and seismicity as potential triggering mechanisms of lateral collapse. The preliminary results provide two insights: first, the least stable sector is on the south-western flank of the volcano; second, the lowest safety factor value suggests that the edifice is stable under gravity alone, and that an external triggering mechanism would be a likely destabilizing factor.
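For reference, a sketch of the generalised Hoek-Brown criterion (2002 form) is given below, with a simple Monte Carlo over the rock-mass inputs to illustrate how a probabilistic strength envelope can feed a stability analysis. The parameter distributions are illustrative assumptions, not the values of the Lava, Lava-Breccia or Breccia-Lava units.

```python
# Generalised Hoek-Brown criterion with a simple Monte Carlo on the
# rock-mass inputs. Parameter values are illustrative only.
import numpy as np

def hoek_brown_sigma1(sigma3, sigma_ci, GSI, mi, D=0.0):
    """Major principal stress at failure for a given confinement sigma3."""
    mb = mi * np.exp((GSI - 100.0) / (28.0 - 14.0 * D))
    s = np.exp((GSI - 100.0) / (9.0 - 3.0 * D))
    a = 0.5 + (np.exp(-GSI / 15.0) - np.exp(-20.0 / 3.0)) / 6.0
    return sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a

rng = np.random.default_rng(2)
n = 100_000
sigma_ci = rng.normal(40.0, 8.0, n).clip(min=5.0)   # intact strength [MPa] (assumed)
GSI = rng.normal(45.0, 5.0, n).clip(10.0, 90.0)     # Geological Strength Index (assumed)
mi = rng.normal(13.0, 2.0, n).clip(min=4.0)         # intact material constant (assumed)

sigma1 = hoek_brown_sigma1(sigma3=1.0, sigma_ci=sigma_ci, GSI=GSI, mi=mi)
print("sigma1 at sigma3 = 1 MPa: mean %.1f MPa, 5th percentile %.1f MPa"
      % (sigma1.mean(), np.percentile(sigma1, 5)))
```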
Abstract:
Introduction: Different modalities of palliation for obstructive symptoms in patients with unresectable esophageal cancer (EC) exist. However, these therapeutic alternatives differ significantly in cost and effectiveness. Methods: A Markov model was designed to compare the cost-effectiveness (CE) of self-expandable stents (SES), brachytherapy and laser in the palliation of unresectable EC. Patients were assigned to one of the strategies, and the improvement in swallowing function was compared given the treatment efficacy, probability of survival, and risks of complications associated with each strategy. Probabilities and distribution parameters were based on a 9-month time frame. Results: Under the base-case scenario, laser has the lowest CE ratio, followed by brachytherapy at an incremental cost-effectiveness ratio (ICER) of $4,400.00, and SES is a dominated strategy. In the probabilistic analysis, laser is the strategy with the highest probability of cost-effectiveness for willingness-to-pay (WTP) values lower than $3,201, and brachytherapy for all WTP values yielding a positive net health benefit (NHB) (threshold $4,440). The highest probability of cost-effectiveness for brachytherapy is 96%; consequently, selection of suboptimal strategies can lead to opportunity losses for the US health system ranging from US$4.32 to US$38.09 million over the next 5-20 years. Conclusion: Conditional on the WTP and current US Medicare costs, palliation of unresectable esophageal cancers with brachytherapy provides the largest amount of NHB and is the strategy with the highest probability of CE. However, some level of uncertainty remains, and wrong decisions will be made until further knowledge is acquired.
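To make the decision metrics concrete, the snippet below computes the ICER and the net health benefit (NHB = E - C/WTP) for two strategies. The costs and effects are hypothetical placeholders, chosen only so that the ICER equals the $4,400 figure quoted above; they are not the study's inputs.

```python
# Worked illustration of the ICER and NHB metrics. Costs and effects are
# hypothetical placeholders, not data from the study.
costs = {"laser": 12_000.0, "brachytherapy": 17_500.0}   # assumed cost per patient [$]
effects = {"laser": 3.0, "brachytherapy": 4.25}          # assumed effect (e.g. dysphagia-free months)

d_cost = costs["brachytherapy"] - costs["laser"]
d_effect = effects["brachytherapy"] - effects["laser"]
icer = d_cost / d_effect                                 # $ per extra unit of effect

wtp = 5_000.0                                            # willingness to pay per unit of effect (assumed)
nhb = {k: effects[k] - costs[k] / wtp for k in costs}    # NHB = E - C / WTP

print(f"ICER (brachytherapy vs laser): ${icer:,.0f} per unit of effect")
print("NHB:", {k: round(v, 2) for k, v in nhb.items()})
```

With a WTP above the ICER, the incremental strategy yields the higher NHB, which is the logic behind the threshold behaviour described in the abstract.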
Abstract:
Final Master's dissertation submitted for the degree of Master in Civil Engineering, in the specialization area of Structures.
Abstract:
The European Court of Justice has held that as from 21 December 2012 insurers may no longer charge men and women differently on the basis of scientific evidence that is statistically linked to their sex, effectively prohibiting the use of sex as a factor in the calculation of premiums and benefits for the purposes of insurance and related financial services throughout the European Union. This ruling marks a sharp turn away from the traditional view that insurers should be allowed to apply just about any risk assessment criterion, so long as it is sustained by the findings of actuarial science. The ruling exposed the naïveté behind the assumption that insurers' recourse to statistical data and probabilistic analysis, given their scientific nature, would suffice to keep them out of harm's way. In this article I look at the flaws of this assumption and question whether this judicial decision, whilst constituting a most welcome landmark in the pursuit of equality between men and women, has nonetheless gone too far by saying too little on the million-dollar question of what separates admissible criteria of differentiation from inadmissible forms of discrimination.
Abstract:
Critical real-time embedded (CRTE) systems require safe and tight worst-case execution time (WCET) estimations to provide the required safety levels and keep costs low. However, CRTE systems need increasing performance to satisfy the demands of existing and new features. Such performance can only be achieved by means of more aggressive hardware architectures, which are much harder to analyze from a WCET perspective; the main features considered include cache memories and multi-core processors. Thus, although such features provide higher performance, current WCET analysis methods are unable to provide tight WCET estimations for them. In fact, WCET estimations become worse than for simpler and less powerful hardware. The main reason is that hardware behavior is deterministic but unknown and, therefore, the worst-case behavior must be assumed most of the time, leading to large WCET estimations. The purpose of this project is to develop new hardware designs together with WCET analysis tools able to provide tight and safe WCET estimations. To do so, those pieces of hardware whose behavior is not easily analyzable, due to a lack of accurate information during WCET analysis, will be enhanced to produce probabilistically analyzable behavior. Thus, even if the worst-case behavior cannot be removed, its probability can be bounded and, hence, a safe and tight WCET can be provided for a particular safety level, in line with the safety levels of the remaining components of the system. During the first year of the project we developed most of the evaluation infrastructure as well as the hardware techniques to analyze cache memories. During the second year those techniques were evaluated, and new purely-software techniques were developed.
Abstract:
This thesis is divided into two parts. The first part presents and studies telegraph processes, Poisson processes with a telegraph compensator, and telegraph processes with jumps. The study presented in this first part includes the calculation of the distribution of each process, their means and variances, and their moment generating functions, among other properties. Using these properties, the second part studies option pricing models based on jump-telegraph processes. This part describes how to compute the risk-neutral measures, establishes the no-arbitrage condition for this type of model, and finally computes the prices of European call and put options.
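A minimal simulation sketch of a jump-telegraph process is given below: the drift alternates between two velocities at exponentially distributed switching times, and a fixed jump is added at each switch. The rates, velocities and jump sizes are illustrative assumptions, and the thesis's risk-neutral pricing results are not reproduced.

```python
# Minimal jump-telegraph path simulation. All parameters are illustrative;
# no option pricing is performed here.
import numpy as np

def jump_telegraph_path(T, c=(1.0, -0.5), lam=(2.0, 3.0), h=(-0.05, 0.08),
                        x0=0.0, state=0, seed=3):
    """Simulate X on [0, T]; returns switching times and process values."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, values = [0.0], [x0]
    while True:
        tau = rng.exponential(1.0 / lam[state])    # holding time in the current regime
        if t + tau >= T:                           # no more switches before T
            times.append(T)
            values.append(x + c[state] * (T - t))
            return np.array(times), np.array(values)
        t += tau
        x += c[state] * tau + h[state]             # drift over the holding time plus a jump
        times.append(t)
        values.append(x)
        state = 1 - state                          # switch to the other velocity regime

times, values = jump_telegraph_path(T=5.0)
print(f"{len(times) - 2} switches, X_T = {values[-1]:.4f}")
```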
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
DSL (Digital Subscriber Line) technology allows universal broadband access, with lower cost and shorter deployment time than other access networks. DSL can be regarded as a broadband network of great capillarity, since it combines the existing telephony infrastructure with a transmission technology. Nevertheless, the DSL network needs to be thoroughly understood, so that its main advantages are not outweighed by overall system inefficiency. This thesis presents an approach based on strategies for planning communication networks that were not originally deployed to support Triple Play flows (a mandatory requirement given the current profile of Internet users). It is shown that, using real measurements and probabilistic analyses, it is possible to plan communication networks taking into account physical and logical parameters such as the transmission path, the influence of noise on the communication, the protocols used, and the Triple Play traffic (voice, video and data).
Abstract:
This paper addresses the probabilistic analysis of the corrosion initiation time in reinforced concrete structures exposed to chloride ion penetration. Structural durability is an important criterion which must be evaluated for every type of structure, especially when structures are built in aggressive atmospheres. For reinforced concrete members, the chloride diffusion process is widely used to evaluate durability; therefore, by modelling this phenomenon, corrosion of the reinforcement can be better estimated and prevented. The corrosion process begins when a threshold chloride concentration is reached at the steel bars of the reinforcement. Despite the robustness of several models proposed in the literature, deterministic approaches fail to predict the corrosion initiation time accurately, due to the inherent randomness observed in this process. In this regard, durability can be represented more realistically using probabilistic approaches. A probabilistic analysis of chloride ion penetration is presented in this paper. Chloride penetration is simulated using Fick's second law of diffusion, which represents the chloride diffusion process considering time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the First Order Reliability Method (FORM) with a direct coupling approach. Some examples are considered in order to study these phenomena, and a simplified method is proposed to determine optimal values for the concrete cover.
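A compact sketch of this kind of corrosion-initiation check follows: the error-function solution of Fick's second law (constant diffusion coefficient) gives the chloride concentration at the rebar depth, and crude Monte Carlo estimates the probability that it exceeds the critical threshold. All distributions are illustrative assumptions, not the paper's data.

```python
# Probability of chloride-induced corrosion initiation via the erf solution
# of Fick's second law and crude Monte Carlo. Distributions are assumed.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(4)
n = 1_000_000
t = 50.0 * 365.25 * 24 * 3600                             # 50 years in seconds

cover = rng.normal(40.0e-3, 5.0e-3, n).clip(min=10e-3)    # concrete cover [m] (assumed)
D = rng.lognormal(np.log(1.0e-12), 0.3, n)                # diffusion coefficient [m^2/s] (assumed)
C_s = rng.normal(0.60, 0.10, n).clip(min=0.05)            # surface chloride content (assumed)
C_cr = rng.normal(0.40, 0.05, n).clip(min=0.05)           # critical threshold (assumed)

C_cover = C_s * (1.0 - erf(cover / (2.0 * np.sqrt(D * t))))  # concentration at the rebar depth
p_init = np.mean(C_cover >= C_cr)                            # probability corrosion has initiated
print(f"P(corrosion initiation within 50 years) ≈ {p_init:.3f}")
```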
Abstract:
SETTING: Kinshasa Province, Democratic Republic of Congo. OBJECTIVE: To identify and validate register-based indicators of acid-fast bacilli (AFB) microscopy quality. DESIGN: Selection of laboratories based on the reliability of and variation in routine smear rechecking results. Calculation of relative sensitivity (RS) compared to recheckers, and of its correlation coefficient (R) with candidate indicators, based on a fully probabilistic analysis incorporating vague prior information using WinBUGS. RESULTS: The proportion of positive follow-up smears correlated well (median R 0.81, 95% credibility interval [CI] 0.58-0.93), and the proportion of first smear-positive cases fairly (median R 0.70, 95% CI 0.38-0.89), with RS. The proportions of both positive suspect and low positive case smears showed poor correlations (median R 0.27 and -0.22, respectively, with ranges including zero). CONCLUSIONS: The proportion of positives in follow-up smears is the most promising indicator of AFB smear sensitivity, while the proportion of positive suspects may be more indicative of accessibility and suspect selection. Both can be obtained from simple reports, and should be used for internal and external monitoring and as guidance for supervision. As the proportion of low positive suspect smears and the consistency within case series are more difficult to interpret, they should be used only on-site by laboratory professionals. All indicators require more research to define their optimal range in various settings.
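As a rough frequentist counterpart to the analysis above, the sketch below computes each laboratory's relative sensitivity against the recheckers and its Pearson correlation with a candidate register-based indicator, using simulated data. The fully Bayesian WinBUGS formulation with vague priors is not reproduced here.

```python
# Relative sensitivity per laboratory and its correlation with a candidate
# register-based indicator, on simulated data (all counts are assumed).
import numpy as np

rng = np.random.default_rng(5)
n_labs = 30

true_pos = rng.integers(40, 200, n_labs)                   # positives confirmed by recheckers (assumed)
lab_detected = rng.binomial(true_pos, rng.uniform(0.6, 0.98, n_labs))
rel_sensitivity = lab_detected / true_pos                  # RS: lab positives / rechecker positives

# Candidate indicator: proportion of positive follow-up smears per lab (simulated).
followup_smears = rng.integers(100, 400, n_labs)
followup_pos = rng.binomial(followup_smears, 0.15 + 0.2 * rel_sensitivity)
indicator = followup_pos / followup_smears

R = np.corrcoef(rel_sensitivity, indicator)[0, 1]          # correlation between RS and the indicator
print(f"Pearson R between RS and the candidate indicator: {R:.2f}")
```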