924 results for probabilistic Hough transform


Relevance: 20.00%

Abstract:

BACKGROUND Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. Reuse of individual health-related data faces several problems: either a unique personal identifier, such as a social security number, is not available, or non-unique person-identifiable information, such as names, is privacy protected and cannot be accessed. A solution for protecting privacy in probabilistic record linkage is to encrypt this sensitive information. Unfortunately, encrypted hash codes of two names differ completely even if the plain names differ by only a single character, so standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method. METHODS The P3RL method applies a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. It consists of three main steps: pre-processing, encryption and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve the information (i.e. data structure) needed to create these templates without ever accessing plain person-identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enables the calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center performs probabilistic record linkage with encrypted person-identifiable information and plain non-sensitive variables.
RESULTS In this paper we describe step by step how to link existing health-related data using encryption methods that preserve the privacy of the persons in the study. CONCLUSION Privacy Preserving Probabilistic Record Linkage expands record linkage to settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. The method is suitable not only for epidemiological research but for any setting with similar challenges.
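The Bloom-filter idea in this abstract can be illustrated with a minimal sketch (the function names, q-gram size, number of hash functions, and filter size are illustrative assumptions, not the authors' implementation): each name is split into q-grams, each q-gram sets a few hash-derived bit positions, and the similarity of two encrypted names is measured with a Dice coefficient over the bit sets, so near-identical names stay similar even after encoding.

```python
import hashlib

def bloom_encode(name, num_hashes=2, size=128, q=2):
    """Encode a string's q-grams as a set of Bloom-filter bit positions."""
    padded = f"_{name.lower()}_"
    qgrams = [padded[i:i + q] for i in range(len(padded) - q + 1)]
    bits = set()
    for gram in qgrams:
        for k in range(num_hashes):
            # keyed hash per q-gram; a real deployment would use a secret key
            digest = hashlib.sha256(f"{k}:{gram}".encode()).hexdigest()
            bits.add(int(digest, 16) % size)
    return bits

def dice_similarity(a, b):
    """Dice coefficient between two bit sets: 2|A ∩ B| / (|A| + |B|)."""
    return 2 * len(a & b) / (len(a) + len(b))
```

Names differing by one character (e.g. "miller" vs. "muller") share most of their q-grams, so their filters overlap heavily, whereas unrelated names overlap only by chance hash collisions.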

Relevance: 20.00%

Abstract:

In this article, we introduce the probabilistic justification logic PJ, a logic in which we can reason about the probability of justification statements. We present its syntax and semantics, and establish a strong completeness theorem. Moreover, we investigate the relationship between PJ and the logic of uncertain justifications.

Relevance: 20.00%

Abstract:

Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When CHR instruments are used for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly, practical guide summarising the key concepts that support clinicians in probabilistic prognostic reasoning in the CHR state. We review risk (cumulative incidence) of psychosis, person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values, Bayes' theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may influence risk management, especially as regards initiating or withholding treatment.
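The dependence of predictive values on prevalence, one of the key concepts this guide reviews, can be shown with a worked sketch (the numbers below are hypothetical, not CHR data): the positive predictive value follows from sensitivity, specificity, and prevalence via Bayes' theorem.

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Bayes' theorem: P(outcome | positive test)."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

def positive_likelihood_ratio(sensitivity, specificity):
    """LR+ = sensitivity / (1 - specificity)."""
    return sensitivity / (1 - specificity)
```

With a hypothetical sensitivity of 0.90 and specificity of 0.85, a 20% base rate in a specialised CHR clinic gives a PPV of 0.60, while a 2% base rate in an unselected population gives a PPV of about 0.11: the same instrument yields very different predictive value depending on where it is applied.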

Relevance: 20.00%

Abstract:

The logic PJ is a probabilistic logic defined by adding (non-iterated) probability operators to the basic justification logic J. In this paper we establish upper and lower bounds for the complexity of the derivability problem in the logic PJ. The main result of the paper is that the complexity of the derivability problem in PJ remains the same as that of the derivability problem in the underlying logic J, namely Π^p_2-complete. This implies that the probability operators do not increase the complexity of the logic, although they arguably enrich the expressiveness of the language.

Relevance: 20.00%

Abstract:

We present a probabilistic justification logic, PPJ, to study rational belief, degrees of belief and justifications. We establish soundness and completeness for PPJ and show that its satisfiability problem is decidable. In the last part we use PPJ to provide a solution to the lottery paradox.

Relevance: 20.00%

Abstract:

We investigated whether a pure perceptual stream is sufficient for probabilistic sequence learning to occur within a single session, whether correlated streams are necessary, whether learning is affected by the transition probability between sequence elements, and how sequence length influences learning. In each of three experiments, we used six horizontally arranged stimulus displays consisting of randomly ordered bigrams xo and ox. The probability of the next possible target location (out of two) was either .50/.50 or .75/.25, and the target was marked by an underline. In Experiment 1, a left vs. right key response to the x of the marked bigram was required in the pure perceptual learning condition, whereas a key press corresponding to the marked bigram's location (out of six) was required in the correlated streams condition (i.e., the ring, middle, or index finger of the left or right hand, respectively). The same probabilistic 3-element sequence was used in both conditions. Learning occurred only in the correlated streams condition. In Experiment 2, we investigated whether sequence length affected learning of correlated sequences by contrasting the 3-element sequence with a 6-element sequence. Significant sequence learning occurred in all conditions. In Experiment 3, we removed a potential confound, namely the sequence of hand changes. Under these conditions, learning occurred for the 3-element sequence only, and transition probability did not affect the amount of learning. Together, these results indicate that correlated streams are necessary for probabilistic sequence learning within a single session and that greater sequence length can reduce the chances of learning.
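The trial structure of such a probabilistic sequence can be sketched as follows (the base sequence, deviant rule, and function name are illustrative assumptions, not the authors' exact paradigm): on each trial the expected element of a repeating base sequence appears with probability .75, and the alternative of the two possible locations appears otherwise.

```python
import random

def generate_targets(base=(0, 2, 4), p_expected=0.75, n_trials=60,
                     n_locations=6, seed=1):
    """Generate target locations for a probabilistic sequence:
    with probability p_expected the next element of the repeating
    base sequence is shown; otherwise a deviant location appears."""
    rng = random.Random(seed)
    targets = []
    for i in range(n_trials):
        expected = base[i % len(base)]
        if rng.random() < p_expected:
            targets.append(expected)
        else:
            # deviant trial: the alternative possible location
            targets.append((expected + 1) % n_locations)
    return targets
```

Setting p_expected to .50 yields the chance condition, in which the "sequence" carries no predictive information at all.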

Relevance: 20.00%

Abstract:

Every x-ray attenuation curve inherently contains all the information necessary to extract the complete energy spectrum of a beam. To date, attempts to obtain accurate spectral information from attenuation data have been inadequate. This investigation presents a mathematical pair model, grounded in physical reality by the Laplace transformation, to describe the attenuation of a photon beam and the corresponding bremsstrahlung spectral distribution. In addition, the Laplace model has been mathematically extended to include characteristic radiation in a physically meaningful way, and a method to determine the fraction of characteristic radiation in any diagnostic x-ray beam was introduced for use with the extended model. This work examined the reconstructive capability of the Laplace pair model for photon beams ranging from 50 kVp to 25 MV, using both theoretical and experimental methods. In the diagnostic region, excellent agreement between a wide variety of experimental spectra and those reconstructed with the Laplace model was obtained when the atomic composition of the attenuators was accurately known. The model successfully reproduced a 2 MV spectrum but had difficulty accurately reconstructing orthovoltage and 6 MV spectra. The 25 MV spectrum was successfully reconstructed, although agreement with the spectrum obtained by Levy was poor. An analysis of errors, performed with diagnostic-energy data, demonstrated the relative insensitivity of the model to typical experimental errors and confirmed that the model can be used to derive accurate spectral information from experimental attenuation data.
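The forward half of such a pair relationship can be sketched numerically (the spectrum, attenuation coefficients, and function name below are illustrative placeholders, not the dissertation's model): the transmission curve is a discrete Laplace-type sum over the energy bins of the spectrum, T(x) = Σ S_i exp(-μ_i x), and recovering S from measured T(x) is the corresponding inverse problem.

```python
import math

def attenuation_curve(spectrum, mu, thicknesses):
    """Forward model: relative transmitted intensity vs. attenuator thickness.
    spectrum    -- relative fluence per energy bin
    mu          -- linear attenuation coefficient (1/cm) per energy bin
    thicknesses -- attenuator thicknesses (cm)
    Each value is a discrete Laplace-type sum, normalised so T(0) = 1."""
    total = sum(spectrum)
    return [sum(s * math.exp(-m * x) for s, m in zip(spectrum, mu)) / total
            for x in thicknesses]
```

For a monochromatic beam the curve is a single exponential; a polychromatic spectrum produces the characteristic beam-hardening curvature, which is exactly the information the inverse model exploits.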

Relevance: 20.00%

Abstract:

We introduce two probabilistic, data-driven models that predict a ship's speed and the situations in which a ship is likely to get stuck in ice, based on the joint effect of ice features such as the thickness and concentration of level ice, ice ridges, and rafted ice; ice compression is also considered. Two datasets were utilized to develop the models. First, data from the Automatic Identification System about the performance of a selected ship were used. Second, a numerical ice model, HELMI, developed at the Finnish Meteorological Institute, provided information about the ice field. The relations between the ice conditions and ship movements were established using Bayesian learning algorithms. The case study presented in this paper considers a single, unassisted trip of an ice-strengthened bulk carrier between two Finnish ports in challenging ice conditions that varied in time and space. The obtained results show good predictive power of the models: on average 80% accuracy for predicting the ship's speed within specified bins, and above 90% for predicting cases where a ship may get stuck in ice. We expect this new approach to facilitate safe and effective route selection in ice-covered waters, where ship performance is reflected in the objective function.
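The Bayesian-learning idea can be illustrated with a minimal sketch (the discretized features, labels, and Laplace smoothing are illustrative assumptions; the paper's actual models are far richer): count how often discretized ice features co-occur with each outcome in the training data, then pick the most probable outcome for new ice conditions.

```python
from collections import defaultdict

def train_naive_bayes(samples):
    """samples: list of (features, label) pairs with discrete features,
    e.g. ({'thickness': 'high', 'concentration': 'high'}, 'stuck')."""
    label_counts = defaultdict(int)
    feat_counts = defaultdict(int)   # (label, feature, value) -> count
    for feats, label in samples:
        label_counts[label] += 1
        for f, v in feats.items():
            feat_counts[(label, f, v)] += 1
    return label_counts, feat_counts

def predict(model, feats):
    """Return the label with the highest naive-Bayes posterior score."""
    label_counts, feat_counts = model
    total = sum(label_counts.values())
    best, best_p = None, -1.0
    for label, n in label_counts.items():
        p = n / total
        for f, v in feats.items():
            # Laplace smoothing avoids zero probabilities for unseen values
            p *= (feat_counts[(label, f, v)] + 1) / (n + 2)
        if p > best_p:
            best, best_p = label, p
    return best
```

In this toy form the features are assumed conditionally independent given the outcome; the Bayesian networks learned in the paper instead capture the joint effect of the ice features.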

Relevance: 20.00%

Abstract:

Existing models that estimate oil spill costs at sea are based on historical data and usually lack a systematic approach. This makes them reactive and limits their ability to forecast how changes in the oil-combating fleet or the location of a spill affect the oil spill costs. In this paper we take a step towards a probabilistic and systematic model for estimating the costs of clean-up operations in the Gulf of Finland. For this purpose we utilize expert knowledge along with the available data and information from the literature. The obtained information is then combined into a framework using Bayesian Belief Networks. Owing to the lack of data, we validate the model by comparing its results with those of existing models, with which we find good agreement. We anticipate that the presented model can contribute to cost-effective oil-combating fleet optimization for the Gulf of Finland. It can also facilitate the estimation of accident consequences within the framework of formal safety assessment (FSA).
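The core of such a framework can be sketched as a toy discrete network (all probabilities and costs below are hypothetical placeholders, not the paper's elicited values): parent nodes for spill size and location feed a cost node, and the expected clean-up cost is obtained by marginalising over their joint distribution.

```python
def expected_cleanup_cost(p_size, p_location, cost):
    """E[cost] = sum over sizes s and locations l of P(s) * P(l) * cost[s, l],
    assuming size and location are independent parent nodes of the cost node."""
    return sum(p_size[s] * p_location[l] * cost[(s, l)]
               for s in p_size for l in p_location)
```

Because the model is probabilistic, a "what if" question (e.g. relocating part of the combating fleet) is answered by editing a conditional table and re-marginalising, which is exactly what a purely historical cost regression cannot do.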

Relevance: 20.00%

Abstract:

Gabbroic cumulates drilled south of the Kane Transform Fault on the slow-spread Mid-Atlantic Ridge preserve up to three discrete magnetization components. Here we use absolute age constraints derived from the paleomagnetic data to develop a model for the magmatic construction of this section of the lower oceanic crust. By comparing the paleomagnetic data with mineral compositions, and based on thermal models of local reheating, we infer that magmas that began crystallizing in the upper mantle intruded into the lower oceanic crust and formed meter-scale sills. Some of these magmas were crystal-laden, and the subsequent expulsion of interstitial liquid from them produced "cumulus" sills. These small-scale magmatic injections took place over at least 210,000 years and at distances of ~3 km from the ridge axis and may have formed much of the lower crust. This model explains many of the complexities described in this area and can be used to help understand the general formation of oceanic crust at slow-spread ridges.