9 results for probability of occurrence

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is unproductive. A risk-based decision support system (DSS) that reduces the amount of time spent on inspection is presented. The risk-based DSS uses the analytic hierarchy process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and analyzes their effects by determining the probability of occurrence of these risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The model optimizes the cost of pipeline operations by reducing subjectivity in selecting a specific inspection method, identifying and prioritizing the right pipeline segments for inspection and maintenance, deriving budget allocations, providing guidance on deploying the right mix of labor for inspection and maintenance, planning emergency preparation, and deriving a logical insurance plan. The proposed methodology also helps derive an inspection and maintenance policy for the entire pipeline system and suggests a design, operational philosophy, and construction methodology for new pipelines.
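As an illustration of the AHP weighting step described above, priority weights can be derived as the principal eigenvector of a pairwise comparison matrix. The factor names and judgements below are hypothetical, not taken from the study; this is only a minimal sketch of the technique.

```python
import numpy as np

# Hypothetical pairwise comparison of three pipeline risk factors
# (corrosion, third-party damage, ground movement) on Saaty's 1-9 scale;
# A[i, j] expresses how strongly factor i dominates factor j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priority weights: principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency ratio: flags self-contradictory judgements (should be < 0.1).
n = A.shape[0]
ci = (eigvals.real[principal] - n) / (n - 1)
cr = ci / 0.58  # 0.58 is Saaty's random consistency index for n = 3
print(weights.round(3), round(cr, 3))
```

Here the first factor receives the largest weight; in a DSS of this kind, each factor's weight would then be combined with its probability of occurrence and the cost established by consequence analysis.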

Relevance:

100.00%

Publisher:

Abstract:

We investigate the pattern-dependent decoding failures in full-field electronic dispersion compensation (EDC) by offline processing of experimental signals, and find that the performance of such an EDC receiver may be degraded by an isolated "1" bit surrounded by long strings of consecutive "0s". By reducing the probability of occurrence of this kind of isolated "1" and using a novel adaptive threshold decoding method, we greatly improve the compensation performance to achieve 10-Gb/s on-off keyed signal transmission over 496-km field-installed single-mode fiber without optical dispersion compensation.
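The adaptive-threshold idea can be sketched as follows; the amplitudes, threshold values, and two-pass scheme below are illustrative assumptions, not the paper's actual algorithm.

```python
# Toy two-pass decoder: an isolated "1" between long runs of "0"s arrives
# attenuated after dispersion, so the decision threshold is lowered for a
# bit whose first-pass neighbours were both decided as "0".
def adaptive_decode(samples, base_thr=0.5, isolated_thr=0.3):
    first = [1 if s > base_thr else 0 for s in samples]  # fixed-threshold pass
    decoded = []
    for i, s in enumerate(samples):
        left = first[i - 1] if i > 0 else 0
        right = first[i + 1] if i < len(samples) - 1 else 0
        thr = isolated_thr if (left == 0 and right == 0) else base_thr
        decoded.append(1 if s > thr else 0)
    return decoded

# An isolated "1" attenuated to 0.4 is missed by the fixed threshold
# but recovered by the adaptive one.
print(adaptive_decode([0.1, 0.05, 0.4, 0.08, 0.1]))  # → [0, 0, 1, 0, 0]
```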


Relevance:

100.00%

Publisher:

Abstract:

Effective clinical decision making depends upon identifying possible outcomes for a patient, selecting relevant cues, and processing the cues to arrive at accurate judgements of each outcome's probability of occurrence. These activities can be considered as classification tasks. This paper describes a new model of psychological classification that explains how people use cues to determine class or outcome likelihoods. It proposes that clinicians respond to conditional probabilities of outcomes given cues and that these probabilities compete with each other for influence on classification. The model explains why people appear to respond to base rates inappropriately, thereby overestimating the occurrence of rare categories, and a clinical example is provided for predicting suicide risk. The model makes an effective representation for expert clinical judgements and its psychological validity enables it to generate explanations in a form that is comprehensible to clinicians. It is a strong candidate for incorporation within a decision support system for mental-health risk assessment, where it can link with statistical and pattern recognition tools applied to a database of patients. The symbiotic combination of empirical evidence and clinical expertise can provide an important web-based resource for risk assessment, including multi-disciplinary education and training.
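The base-rate effect the model addresses can be illustrated numerically. The probabilities below are invented for illustration, not drawn from the paper: a judgement driven only by the competing cue-outcome conditionals neglects the base rate and overestimates a rare outcome relative to the Bayesian posterior.

```python
# Toy illustration of base-rate neglect (our numbers, not the paper's).
base_rate = 0.01            # rare outcome, e.g. high suicide risk
p_cue_given_outcome = 0.80  # cue sensitivity
p_cue_given_not = 0.10      # cue false-positive rate

# Bayesian posterior P(outcome | cue), which respects the base rate:
p_cue = p_cue_given_outcome * base_rate + p_cue_given_not * (1 - base_rate)
posterior = p_cue_given_outcome * base_rate / p_cue

# Base-rate-neglecting judgement driven only by the competing conditionals
# P(cue | outcome) versus P(cue | not outcome):
judged = p_cue_given_outcome / (p_cue_given_outcome + p_cue_given_not)

# The rare outcome is hugely overestimated when the base rate is ignored.
print(round(posterior, 3), round(judged, 3))
```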

Relevance:

100.00%

Publisher:

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes the need for a greater dependency on the potential pollution sources, rather than the traditional approach where assessment is based mainly on the intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation methods, whereby random pollution events are generated to the same distribution as historically occurring events or an a priori potential probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as the risk. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The method was set up by integrating MODFLOW-2000, MT3DMS, and a FORTRAN-coded risk model, automated using a DOS batch-processing file; GIS software was employed to produce the input files and to present the results. The functionalities of the method, as well as its sensitivity to model grid sizes, contaminant loading rates, lengths of stress periods, and the historical frequencies of occurrence of pollution events, were evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. The results of the applications of the method are presented in the form of tables, graphs, and spatial maps.
Varying the model grid sizes indicated no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents was also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appeared to increase with grid size. The migration of the contaminant plume also advanced faster with coarse grid sizes than with finer ones. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increased in non-linear proportion to the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset of the method, and a large benefit over contemporary risk and vulnerability methods.
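The core sampling loop of the method can be sketched as follows. The event probability, source-term magnitudes, and the toy concentration accumulation below are our own assumptions; the real method drives MODFLOW-2000/MT3DMS transport simulations rather than summing source terms directly.

```python
import random

# For each realisation, a daily pollution event fires when a random number
# falls below its historical probability of occurrence; the risk is the
# fraction of realisations in which a user-defined concentration magnitude
# is exceeded at the monitoring point.
def risk_of_exceedance(p_event, threshold, n_realisations=2000, seed=1):
    rng = random.Random(seed)
    exceeded = 0
    for _ in range(n_realisations):
        concentration = 0.0
        for _day in range(365):
            if rng.random() < p_event:
                concentration += rng.uniform(0.0, 1.0)  # toy source term
        if concentration > threshold:
            exceeded += 1
    return exceeded / n_realisations

print(risk_of_exceedance(p_event=0.01, threshold=2.0))
```

Repeating this at every monitoring point, for each user-defined concentration range, yields the values that the thesis assembles into risk maps.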

Relevance:

100.00%

Publisher:

Abstract:

The detection of signals in the presence of noise is one of the most basic and important problems encountered by communication engineers. Although the literature abounds with analyses of communications in Gaussian noise, relatively little work has appeared dealing with communications in non-Gaussian noise. In this thesis several digital communication systems disturbed by non-Gaussian noise are analysed. The thesis is divided into two main parts. In the first part, a filtered-Poisson impulse noise model is utilized to calculate the error probability characteristics of a linear receiver operating in additive impulsive noise. First, the effect that non-Gaussian interference has on the performance of a receiver that has been optimized for Gaussian noise is determined. The factors affecting the choice of modulation scheme so as to minimize the detrimental effects of non-Gaussian noise are then discussed. In the second part, a new theoretical model of impulsive noise that fits well with the observed statistics of noise in radio channels below 100 MHz is developed. This empirical noise model is applied to the detection of known signals in the presence of noise to determine the optimal receiver structure. The performance of such a detector is assessed and is found to depend on the signal shape and the time-bandwidth product, as well as the signal-to-noise ratio. The optimal signal to minimize the probability of error of the detector is determined. Attention is then turned to the problem of threshold detection. Detector structure, large-sample performance, and robustness against errors in the detector parameters are examined. Finally, estimators of such parameters as the occurrence of an impulse and the parameters of an empirical noise model are developed for the case of an adaptive system with slowly varying conditions.
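The first part's setup can be illustrated with a small Monte Carlo experiment. The two-state impulse model, amplitudes, and probabilities below are a crude stand-in of our own for a filtered-Poisson process, not the thesis's model; the point is only that a sign detector optimised for Gaussian noise degrades when rare large impulses are added.

```python
import random

# Antipodal signalling with a Gaussian-optimal sign detector; impulses are
# modelled as rare high-variance Gaussian bursts added to the base noise.
def ber(impulse_prob, n_bits=20000, snr_amp=2.0, seed=7):
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        bit = rng.choice([-1, 1])
        noise = rng.gauss(0.0, 1.0)
        if rng.random() < impulse_prob:
            noise += rng.gauss(0.0, 10.0)  # occasional large impulse
        decision = 1 if bit * snr_amp + noise > 0 else -1
        errors += decision != bit
    return errors / n_bits

# The impulsive channel shows a markedly higher error rate than the
# purely Gaussian one at the same signal amplitude.
print(ber(0.0), ber(0.05))
```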

Relevance:

100.00%

Publisher:

Abstract:

This work presents a two-dimensional risk assessment method based on the quantification of the probability of occurrence of contaminant source terms, as well as the assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation methods, whereby synthetic contaminant source terms are generated to the same distribution as historically occurring pollution events or an a priori potential probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated from repeated realisations using integrated mathematical models. The number of times user-defined ranges of concentration magnitudes are exceeded is quantified as the risk. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together was evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. This capability to generate synthetic pollution events from numerous potential sources of pollution, based on the historical frequency of their occurrence, proved to be a great asset of the method and a large benefit over contemporary methods.
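If the individual source risks at a monitoring point were independent, the combined risk of at least one source polluting it could be sketched as the complement of none occurring. This closed form is an illustrative simplification of our own; the work itself evaluates the combination by joint Monte Carlo simulation.

```python
# Combined risk from several independent potential sources, each with its
# own probability of polluting the monitoring point (hypothetical values).
def combined_risk(source_risks):
    p_none = 1.0
    for p in source_risks:
        p_none *= 1.0 - p  # probability that this source does not pollute
    return 1.0 - p_none

print(round(combined_risk([0.1, 0.05, 0.2]), 3))  # → 0.316
```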

Relevance:

100.00%

Publisher:

Abstract:

Everyday human behaviour relies on our ability to predict outcomes on the basis of moment-by-moment information. Long-range neural phase synchronization has been hypothesized as a mechanism by which ‘predictions’ can exert an effect on the processing of incoming sensory events. Using magnetoencephalography (MEG) we have studied the relationship between the modulation of phase synchronization in a cerebral network of areas involved in visual target processing and the predictability of target occurrence. Our results reveal a striking increase in the modulation of phase synchronization associated with an increased probability of target occurrence. These observations are consistent with the hypothesis that long-range phase synchronization plays a critical functional role in humans' ability to effectively employ predictive heuristics.
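A standard index of the phase synchronization discussed here is the phase-locking value (PLV): the magnitude of the trial-averaged unit phasor of the phase difference between two recording sites. The sketch below uses synthetic phase differences, since the abstract does not specify the study's actual analysis pipeline or parameters.

```python
import cmath
import math
import random

# PLV across trials: 1 means a perfectly consistent phase relation between
# the two sites, values near 0 mean the phase difference is random.
def plv(phase_diffs):
    mean_phasor = sum(cmath.exp(1j * d) for d in phase_diffs) / len(phase_diffs)
    return abs(mean_phasor)

rng = random.Random(0)
# Phase-locked trials: a consistent lag of 0.5 rad with small jitter.
locked = [0.5 + rng.gauss(0.0, 0.2) for _ in range(200)]
# Unsynchronized trials: phase differences uniform over the circle.
unlocked = [rng.uniform(-math.pi, math.pi) for _ in range(200)]

print(round(plv(locked), 2), round(plv(unlocked), 2))
```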

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: To analyze, in a general population sample, clustering of delusional and hallucinatory experiences in relation to environmental exposures and clinical parameters. METHOD: General population-based household surveys of randomly selected adults between 18 and 65 years of age were carried out. SETTING: 52 countries participating in the World Health Organization's World Health Survey were included. PARTICIPANTS: 225 842 subjects (55.6% women), from nationally representative samples, with an individual response rate of 98.5% within households participated. RESULTS: Compared with isolated delusions and hallucinations, co-occurrence of the two phenomena was associated with poorer outcome including worse general health and functioning status (OR = 0.93; 95% CI: 0.92-0.93), greater severity of symptoms (OR = 2.5; 95% CI: 2.0-3.0), higher probability of lifetime diagnosis of psychotic disorder (OR = 12.9; 95% CI: 11.5-14.4), lifetime treatment for psychotic disorder (OR = 19.7; 95% CI: 17.3-22.5), and depression during the last 12 months (OR = 11.6; 95% CI: 10.9-12.4). Co-occurrence was also associated with adversity and hearing problems (OR = 2.0; 95% CI: 1.8-2.3). CONCLUSION: The results suggest that the co-occurrence of hallucinations and delusions in populations is not random but instead can be seen, compared with either phenomenon in isolation, as the result of more etiologic loading leading to a more severe clinical state.
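The associations above are reported as odds ratios with 95% confidence intervals; one such estimate can be sketched from a 2x2 table using Woolf's log-odds method. The cell counts below are hypothetical, not the survey's data.

```python
import math

# Odds ratio and 95% CI from a 2x2 table:
# a: exposed cases, b: exposed non-cases,
# c: unexposed cases, d: unexposed non-cases.
def odds_ratio_ci(a, b, c, d):
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # Woolf's SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical table: outcome much more common among the exposed group.
print(odds_ratio_ci(40, 60, 10, 90))
```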