900 results for probabilistic reasoning


Relevance:

10.00%

Publisher:

Abstract:

Human leukocyte antigen (HLA) haplotypes are frequently evaluated for population history inferences and association studies. However, the available typing techniques for the main HLA loci usually do not allow the determination of the allele phase and the constitution of a haplotype, which may be obtained by a very time-consuming and expensive family-based segregation study. Without the family-based study, computational inference by probabilistic models is necessary to obtain haplotypes. Several authors have used the expectation-maximization (EM) algorithm to determine HLA haplotypes, but high levels of erroneous inferences are expected because of the genetic distance among the main HLA loci and the presence of several recombination hotspots. In order to evaluate the efficiency of computational inference methods, 763 unrelated individuals stratified into three different datasets had their haplotypes manually defined in a family-based study of HLA-A, -B, -DRB1 and -DQB1 segregation, and these haplotypes were compared with the data obtained by the following three methods: the EM and Excoffier-Laval-Balding (ELB) algorithms as implemented in the Arlequin 3.11 software, and the PHASE method. When comparing the methods, we observed that all algorithms performed poorly for haplotype reconstruction with distant loci, estimating incorrect haplotypes for 38%-57% of the samples across all algorithms and datasets. We suggest that computational haplotype inferences involving low-resolution HLA-A, HLA-B, HLA-DRB1 and HLA-DQB1 haplotypes should be considered with caution.
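
The EM approach mentioned above can be illustrated with a deliberately simplified two-locus sketch; the function name, data layout and fixed iteration count below are illustrative assumptions, not taken from the study or from the Arlequin or PHASE implementations.

```python
def em_haplotype_freqs(genotypes, n_iter=100):
    """Estimate two-locus haplotype frequencies from unphased genotypes
    with the EM algorithm.  Each genotype is a pair of (allele, allele)
    tuples, one tuple per locus; alleles are any hashable labels."""
    # Candidate haplotypes: every allele combination seen in the data.
    haps = sorted({(a, b) for g in genotypes for a in g[0] for b in g[1]})
    freqs = {h: 1.0 / len(haps) for h in haps}              # uniform start
    for _ in range(n_iter):
        counts = {h: 0.0 for h in haps}
        for (a1, a2), (b1, b2) in genotypes:
            # Both phase resolutions of this genotype (they coincide
            # when the individual is homozygous at either locus).
            pairs = {((a1, b1), (a2, b2)), ((a1, b2), (a2, b1))}
            weights = {p: freqs[p[0]] * freqs[p[1]] for p in pairs}
            total = sum(weights.values()) or 1.0
            for (h1, h2), w in weights.items():             # E-step
                counts[h1] += w / total
                counts[h2] += w / total
        n_chrom = 2 * len(genotypes)
        freqs = {h: c / n_chrom for h, c in counts.items()}  # M-step
    return freqs

# e.g. em_haplotype_freqs([(("A*01", "A*02"), ("B*08", "B*44")),
#                          (("A*01", "A*01"), ("B*08", "B*08"))])
```

The abstract's cautionary point is that, for loci as far apart as the main HLA loci, this kind of likelihood-based phasing (and the more sophisticated ELB and PHASE models) can still assign a wrong haplotype pair to a large fraction of individuals.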

Relevance:

10.00%

Publisher:

Abstract:

The spread of an infectious disease in a population involves interactions leading to an epidemic outbreak through a network of contacts. Extending the work of Watts and Strogatz (1998), who showed that short-distance connections create a small-world effect, a model combining short- and long-distance, probabilistic and regularly updated contacts helps account for spatial heterogeneity. The method is based on cellular automata. The presence of long-distance connections accelerates the small-world effect, as if the world shrank in proportion to their total number.
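
A minimal cellular-automaton sketch of the short- plus long-distance contact idea follows; the grid size, infection probabilities and one-step recovery rule are illustrative assumptions, not the paper's parameterization.

```python
import random

def step(grid, p_local=0.3, p_long=0.02):
    """One update of a toy SIR cellular automaton on an n x n grid.
    States: 0 = susceptible, 1 = infected, 2 = recovered.
    Each infected cell infects its 4 neighbours with probability p_local
    and one uniformly random cell (a long-distance contact) with
    probability p_long, then recovers."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] != 1:
                continue
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # short-range
                x, y = (i + di) % n, (j + dj) % n
                if grid[x][y] == 0 and random.random() < p_local:
                    new[x][y] = 1
            if random.random() < p_long:                        # long-range
                x, y = random.randrange(n), random.randrange(n)
                if grid[x][y] == 0:
                    new[x][y] = 1
            new[i][j] = 2                                       # recover
    return new

# grid = [[0] * 50 for _ in range(50)]; grid[25][25] = 1
# for _ in range(30): grid = step(grid)
```

Raising p_long spreads the infection to distant regions of the grid much sooner, which is the shrinking-world effect the abstract describes.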

Relevance:

10.00%

Publisher:

Abstract:

Purpose: To determine the prevalence of trachoma in Sao Gabriel da Cachoeira (SGC), the only urban community of the upper Rio Negro Basin of the Amazon state in Brazil, near the Colombian border, and to investigate the risk factors associated with the active forms of the disease. Methods: A total of 1702 people (440 children up to 9 years and 1069 adults aged 15 years and above) were examined. The sample was selected by a probabilistic household sampling procedure based on census data and a previous study of trachoma prevalence in Sao Gabriel da Cachoeira. A two-stage probabilistic household cluster sample was drawn, and household units were randomly selected within each cluster. A variety of socioeconomic and hygiene variables were studied in order to determine the risk factors for active trachoma in a household. Results: The total prevalence of trachoma was 8.9%. Prevalence of active trachoma (TF and/or TI) in children aged 1-9 years was 11.1%, and trachomatous trichiasis in adults aged 15 years and above was 0.19%. Trachomatous scarring reached a peak of 22.4% for subjects between 50 and 60 years of age. Corneal opacity occurred in subjects aged 50 years and older with a prevalence of 2.0%. No sex effect was found on the overall prevalence of trachoma in SGC. Risk factors associated with active trachoma were mainly related to poor socioeconomic indicators. Conclusions: Despite the ubiquitous presence of water, the analysis of the risk factors associated with the active forms of the disease supports the idea that a low personal standard of hygiene, and not water availability per se, is the key factor associated with trachoma.
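
A two-stage household cluster draw of the kind described in the Methods can be sketched as follows; equal-probability selection at both stages and the helper names are assumptions made for illustration, since the survey's actual selection weights and stratification are not given here.

```python
import random

def two_stage_cluster_sample(clusters, n_clusters, n_households):
    """Draw a two-stage probabilistic household sample: first select
    clusters (e.g. census sectors) at random, then select households
    within each chosen cluster.  `clusters` maps a cluster id to the
    list of its household ids; sample sizes are illustrative."""
    chosen = random.sample(list(clusters), n_clusters)        # stage 1
    sample = {}
    for c in chosen:                                          # stage 2
        k = min(n_households, len(clusters[c]))
        sample[c] = random.sample(clusters[c], k)
    return sample

# two_stage_cluster_sample({"sector_1": ["h1", "h2", "h3"],
#                           "sector_2": ["h4", "h5"]}, 1, 2)
```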

Relevance:

10.00%

Publisher:

Abstract:

Objective. To estimate physical violence between intimate partners and to examine the association between violence and sociodemographic variables, use of alcohol, and other related factors. Method. This epidemiologic survey included a stratified probabilistic sample representative of the population of the city of Sao Paulo in economic and educational terms. The Gender, Alcohol and Culture: An International Study (GENACIS) questionnaire was employed. The sampling unit was the home, where all individuals older than 18 years were candidates for interview. The final sample included 1631 people. Statistical analysis employed the Rao-Scott test and logistic regression. Results. The response rate was 74.5%. Most participants were female (58.8%), younger than 40 years of age (52%), and had 5 to 12 years of schooling. Of the overall group, 5.4% reported having been victims of physical violence by an intimate partner and 5.4% declared having been aggressors of intimate partners in the past 2 years. Most men declared that none of those involved had ingested alcohol at the moment of aggression. Most women reported that nobody, or only the man, had drunk. Being a victim or an aggressor was associated with younger age and having a heavy-drinking partner. Women suffered more serious aggression, requiring medical care, and expressed more anger and disgust at the aggression than men. Conclusions. The results underscore the importance of the association between alcohol use and the risk of aggression between intimate partners, and may contribute to the design of public policies aimed at controlling this situation.

Relevance:

10.00%

Publisher:

Abstract:

Recent interest in the development and evolution of theory of mind has provided a wealth of information about representational skills in both children and animals. According to J. Perner (1991), children begin to entertain secondary representations in the 2nd year of life. This advance manifests itself in their passing hidden displacement tasks, engaging in pretense and means-ends reasoning, interpreting external representations, displaying mirror self-recognition and empathic behavior, and showing an early understanding of mind and imitation. New data show a cluster of mental accomplishments in great apes that is very similar to that observed in 2-year-old humans. It is suggested that it is most parsimonious to assume that this cognitive profile is of homologous origin and that great apes possess secondary representational capacity. Evidence from animals other than apes is scant. This analysis leads to a number of predictions for future research.

Relevance:

10.00%

Publisher:

Abstract:

This article examines Simpson's paradox as applied to the theory of probabilities and percentages. The author discusses possible flaws in the paradox and compares it to the Sure Thing Principle, statistical inference, causal inference and probabilistic analyses of causation.
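
The reversal at issue can be reproduced with a few lines of arithmetic; the numbers below are the classic kidney-stone treatment figures often used to illustrate the paradox, not data from this article.

```python
# Treatment A beats B within each group, yet loses overall when the
# groups are pooled -- the reversal at the heart of Simpson's paradox.
groups = {
    "small": {"A": (81, 87),   "B": (234, 270)},   # (successes, trials)
    "large": {"A": (192, 263), "B": (55, 80)},
}

for name, g in groups.items():
    for t, (s, n) in g.items():
        print(f"{name:6s} {t}: {s}/{n} = {s/n:.0%}")

for t in ("A", "B"):
    s = sum(groups[g][t][0] for g in groups)
    n = sum(groups[g][t][1] for g in groups)
    print(f"pooled {t}: {s}/{n} = {s/n:.0%}")
```

Here A wins in both the small-stone and large-stone groups (93% vs 87%, 73% vs 69%) but loses after pooling (78% vs 83%), because the groups differ in size and difficulty across treatments.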

Relevance:

10.00%

Publisher:

Abstract:

In the limit state design (LSD) method, each design criterion is formally stated and assessed using a performance function. The performance function defines the relationship between the design parameters and the design criterion. In practice, LSD involves factoring up loads and factoring down calculated strengths and material parameters. This provides a convenient way to carry out routine probability-based design. The factors are statistically calculated to produce a design with an acceptably low probability of failure. Hence the ultimate load and the design material properties are mathematical concepts that have no physical interpretation; they may be physically impossible. Similarly, the appropriate analysis model is also defined by the performance function and may not describe the real behaviour at the perceived physical equivalent limit condition. These points must be understood to avoid confusion in the discussion and application of partial-factor LSD methods.
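
The relationship between the routine factored check and the underlying probability of failure can be sketched as follows; the factor values, distributions and parameters are illustrative assumptions, not factors from any design standard.

```python
import random

def partial_factor_check(load_nominal, resistance_nominal,
                         gamma_load=1.5, phi=0.8):
    """Routine partial-factor check: the factored load must not exceed
    the factored resistance.  Factor values here are illustrative."""
    return gamma_load * load_nominal <= phi * resistance_nominal

def failure_probability(n=100_000):
    """Monte Carlo estimate of the probability of failure that the
    factors are meant to keep acceptably low, assuming (for illustration
    only) normally distributed load and resistance."""
    fails = sum(
        random.gauss(100, 20) > random.gauss(200, 25)   # load > resistance
        for _ in range(n)
    )
    return fails / n

print(partial_factor_check(100, 200), failure_probability())
```

The factored quantities in the first function are design artefacts with no physical counterpart; only the second calculation speaks about probability, which is the distinction the abstract stresses.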

Relevance:

10.00%

Publisher:

Abstract:

Teaching ethics involves the teaching of knowledge as well as skills and attitudes. Each of these requires different teaching and assessment methods. A core curriculum of ethics knowledge must address both the foundations of ethics and specific ethical topics. Ethical skills teaching focuses on the development of ethical awareness, moral reasoning, communication and collaborative action skills. Attitudes that are important for medical students to develop include honesty, integrity and trustworthiness, empathy and compassion, respect, and responsibility, as well as critical self-appraisal and commitment to lifelong education.

Relevance:

10.00%

Publisher:

Abstract:

1 Previous studies have demonstrated that chronic pre-synaptic inhibition of transmitter release by morphine evokes a counter-adaptive response in the sympathetic nerve terminals that manifests itself as an increase in transmitter release during acute withdrawal. In the present study we examined the possibility that other pre-synaptically acting drugs, such as clonidine, also evoke a counter-adaptive response in the sympathetic nerve terminals. 2 In chronically saline-treated (CST) preparations, clonidine (0.5 µM) completely abolished evoked transmitter release from sympathetic varicosities bathed in an extracellular calcium concentration ([Ca2+]o) of 2 mM. The inhibitory effect of clonidine was reduced by increasing [Ca2+]o from 2 to 4 mM and the stimulation frequency from 0.1 to 1 Hz. 3 The nerve terminal impulse (NTI) was not affected by concentrations of clonidine that completely abolished evoked transmitter release. 4 Sympathetic varicosities developed a tolerance to clonidine (0.5 µM) following 7-9 days of chronic exposure to clonidine. 5 Acute withdrawal of preparations following chronic clonidine treatment (CCT) resulted in a significant (P

Relevance:

10.00%

Publisher:

Abstract:

In computer simulation of dynamical systems, the phase space is replaced by the finite set of machine arithmetic. Rounding state values of the continuous system to this grid yields a spatially discrete dynamical system, often with different dynamical behaviour. Discretization of an invertible smooth system gives a system with set-valued negative semitrajectories. As the grid is refined, the asymptotic behaviour of the semitrajectories follows probabilistic laws which correspond to a set-valued Markov chain, whose transition probabilities can be explicitly calculated. The results are illustrated for two-dimensional dynamical systems obtained by discretization of fractional linear transformations of the unit disc in the complex plane.
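
A minimal sketch of the construction, assuming an illustrative Mobius map of the unit disc and a square grid standing in for machine arithmetic: rounding keeps the forward map single-valued, but its inverse becomes set-valued, which is what the negative semitrajectories above capture.

```python
def discretize(f, z, step):
    """Apply f and round the result to a square grid of spacing `step`,
    mimicking finite machine arithmetic."""
    w = f(z)
    return complex(round(w.real / step) * step, round(w.imag / step) * step)

def preimages(f, w, step, radius=1.0):
    """Set-valued inverse of the discretized map: all grid points inside
    the disc of the given radius that the rounded map sends to w."""
    n = int(radius / step)
    pts = (complex(i * step, j * step)
           for i in range(-n, n + 1) for j in range(-n, n + 1))
    return [z for z in pts if abs(z) < radius and discretize(f, z, step) == w]

# Illustrative fractional linear transformation of the unit disc
# (a Blaschke factor with |a| < 1); the value of a is arbitrary.
a = 0.3 + 0.2j
mobius = lambda z: (z - a) / (1 - a.conjugate() * z)
print(preimages(mobius, discretize(mobius, 0.5 + 0j, 0.05), 0.05))
```

Counting how many grid points fall into each preimage set as the step shrinks is, informally, what the abstract's set-valued Markov chain formalizes.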

Relevance:

10.00%

Publisher:

Abstract:

In this paper we describe a distributed object-oriented logic programming language in which an object is a collection of threads deductively accessing and updating a shared logic program. The key features of the language, such as static and dynamic object methods and multiple inheritance, are illustrated through a series of small examples. We show how we can implement object servers, allowing remote spawning of objects, which we can use as staging posts for mobile agents. As an example, we give an information-gathering mobile agent that can be queried about the information it has gathered so far while it is gathering new information. Finally, we define a class of co-operative reasoning agents that can perform resource-bounded inference for full first-order predicate logic, handling multiple queries and information updates concurrently. We believe that the combination of the concurrent OO and LP programming paradigms produces a powerful tool for quickly implementing rational multi-agent applications on the internet.
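
The core idea, an object as a set of threads sharing a dynamically updated logic program, can be caricatured in a few lines; the class and method names below (assertz, query) are invented for illustration and are not the paper's language or API.

```python
import threading

class LogicObject:
    """Toy illustration: an 'object' is a collection of threads that
    query and update a shared store of facts under a lock."""
    def __init__(self):
        self._facts = set()
        self._lock = threading.Lock()

    def assertz(self, fact):              # dynamic update of the program
        with self._lock:
            self._facts.add(fact)

    def query(self, predicate):           # query against the current facts
        with self._lock:
            return [f for f in self._facts if f[0] == predicate]

obj = LogicObject()
gatherer = threading.Thread(
    target=lambda: obj.assertz(("found", "http://example.org")))
gatherer.start()
gatherer.join()
print(obj.query("found"))    # queries see whatever has been gathered so far
```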

Relevance:

10.00%

Publisher:

Abstract:

In computer simulations of smooth dynamical systems, the original phase space is replaced by machine arithmetic, which is a finite set. The resulting spatially discretized dynamical systems do not inherit all functional properties of the original systems, such as surjectivity and existence of absolutely continuous invariant measures. This can lead to computational collapse to fixed points or short cycles. The paper studies loss of such properties in spatial discretizations of dynamical systems induced by unimodal mappings of the unit interval. The problem reduces to studying set-valued negative semitrajectories of the discretized system. As the grid is refined, the asymptotic behavior of the cardinality structure of the semitrajectories follows probabilistic laws corresponding to a branching process. The transition probabilities of this process are explicitly calculated. These results are illustrated by the example of the discretized logistic mapping.
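
A minimal experiment in the spirit of the abstract, assuming an illustrative starting point and illustrative grid sizes: iterate the logistic map while rounding every state to a uniform grid, and record how short a cycle the discretized trajectory collapses onto.

```python
def logistic_cycle_length(x0=0.123, r=4.0, grid=1_000, burn=10_000):
    """Iterate the logistic map x -> r*x*(1-x) while rounding every state
    to a uniform grid on [0, 1], and return the length of the cycle the
    discretized trajectory eventually collapses onto."""
    q = lambda x: round(x * grid) / grid          # spatial discretization
    x = q(x0)
    for _ in range(burn):                         # let transients die out
        x = q(r * x * (1 - x))
    seen, step = {}, 0
    while x not in seen:                          # detect the cycle
        seen[x] = step
        x = q(r * x * (1 - x))
        step += 1
    return step - seen[x]

for g in (100, 1_000, 10_000, 100_000):
    print(g, logistic_cycle_length(grid=g))
```

The cycles are typically far shorter than the chaotic behaviour of the continuous map would suggest; the abstract's branching-process result describes how the cardinality structure of the corresponding negative semitrajectories behaves as the grid is refined.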

Relevance:

10.00%

Publisher:

Abstract:

A comprehensive probabilistic model for simulating microstructure formation and evolution during solidification has been developed, based on coupling a Finite Difference Method (FDM) for macroscopic modelling of heat diffusion to a modified Cellular Automaton (mCA) for microscopic modelling of nucleation, growth of microstructures and solute diffusion. The mCA model is similar to Nastac's model in its handling of solute redistribution in the liquid and solid phases, curvature and growth anisotropy, but differs in the treatment of nucleation and growth. The aim is to improve understanding of the relationship between the solidification conditions and microstructure formation and evolution. A numerical algorithm coupling the FDM and mCA was developed: at each coarse-scale step, temperatures at the FDM nodes are calculated, while the nucleation-growth simulation is carried out at a finer scale, with the temperature at the cell locations interpolated from those of the coarser volumes. This model takes account of thermal, curvature and solute diffusion effects. Therefore, it can not only simulate microstructures of alloys both on the scale of the grain size (macroscopic level) and of the dendrite tip length (mesoscopic level), but also investigate nucleation mechanisms and growth kinetics of alloys solidified with various solute concentrations and solidification morphologies. The calculated results are compared with the grain sizes and solidification morphologies of microstructures obtained from a set of casting experiments on Al-Si alloys in graphite crucibles.
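
A heavily simplified one-dimensional sketch of this coupling is given below: explicit finite-difference heat diffusion on a coarse grid drives a probabilistic cellular automaton for nucleation and growth on a finer grid whose temperatures are interpolated from the coarse nodes. All parameter values, and the omission of latent heat, solute diffusion and curvature effects, are simplifying assumptions for illustration, not the paper's model.

```python
import random

def solidify(n_coarse=20, refine=5, steps=200, dt=0.01,
             alpha=0.5, t_melt=1.0, nucl_rate=0.02):
    """1-D toy coupling of FDM heat diffusion (coarse grid) to a
    probabilistic CA for nucleation and growth (fine grid)."""
    T = [t_melt + 0.2] * n_coarse           # coarse temperature field
    T[0] = T[-1] = 0.0                      # chilled ends
    n_fine = n_coarse * refine
    solid = [False] * n_fine                # CA state: liquid / solid
    for _ in range(steps):
        # FDM step: explicit heat diffusion on the coarse grid.
        T = ([T[0]] +
             [T[i] + alpha * dt * (T[i-1] - 2*T[i] + T[i+1])
              for i in range(1, n_coarse - 1)] +
             [T[-1]])
        # CA step: interpolate temperature to the fine cells, nucleate
        # with a probability growing with undercooling, and grow from
        # already-solid neighbours.
        for j in range(n_fine):
            x = j / refine
            i = min(int(x), n_coarse - 2)
            tj = T[i] + (x - i) * (T[i+1] - T[i])
            under = t_melt - tj
            if solid[j] or under <= 0:
                continue
            neighbour_solid = (j > 0 and solid[j-1]) or \
                              (j < n_fine - 1 and solid[j+1])
            if neighbour_solid or random.random() < nucl_rate * under:
                solid[j] = True
    return sum(solid), T

print(solidify())
```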

Relevance:

10.00%

Publisher:

Abstract:

Fixed-point roundoff noise in digital implementation of linear systems arises from overflow, quantization of coefficients and input signals, and arithmetical errors. In uniform white-noise models, the last two types of roundoff errors are regarded as uniformly distributed independent random vectors on cubes of suitable size. For input signal quantization errors, the heuristic model is justified by a quantization theorem, which cannot be directly applied to arithmetical errors because of the complicated input dependence of those errors. The complete uniform white-noise model is shown to be valid in the sense of weak convergence of probability measures as the lattice step tends to zero, provided that the matrices realizing the system in the state space satisfy certain nonresonance conditions and the finite-dimensional distributions of the input signal are absolutely continuous.
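
The uniform model for input-signal quantization errors is easy to probe empirically; the toy signal, lattice step and sample size below are illustrative assumptions.

```python
import math
import random

def quantization_errors(n=50_000, step=2**-10):
    """Quantize samples of a smooth random signal to a lattice of the
    given step and return the roundoff errors, which the uniform
    white-noise model treats as independent and uniform on
    [-step/2, step/2]."""
    errors = []
    for _ in range(n):
        x = math.sin(random.uniform(0, 2 * math.pi)) * random.gauss(0, 1)
        q = round(x / step) * step
        errors.append(x - q)
    return errors

step = 2**-10
errs = quantization_errors(step=step)
mean = sum(errs) / len(errs)
var = sum(e * e for e in errs) / len(errs)
print(f"mean {mean:.2e}, variance {var:.3e}, step^2/12 = {step**2/12:.3e}")
```

For an absolutely continuous input the error variance approaches step^2/12, as the uniform model predicts; the abstract's point is that the same conclusion for arithmetical errors needs the nonresonance conditions on the state-space realization.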

Relevance:

10.00%

Publisher:

Abstract:

This paper tests the explanatory capacities of different versions of new institutionalism by examining the Australian case of a general transition in central banking practice and monetary politics: namely, the increased emphasis on low inflation and central bank independence. Standard versions of rational choice institutionalism largely dominate the literature on the politics of central banking, but this approach (here termed RC1) fails to account for the Australian empirics. RC1 tends to establish actor preferences exogenously to the analysis; actors' motives are also assumed a priori; and actors' preferences are depicted in relatively static, ahistorical terms. There is also a tendency, even a methodological requirement, to assume relatively simple motives and preference sets among actors, in part because of the game-theoretic nature of RC1 reasoning. It is possible to build a more accurate rational choice model by re-specifying and essentially updating the context, incentives and choice sets that have driven rational choice in this case. Enter RC2. However, this move subtly introduces methodological shifts and new theoretical challenges. By contrast, historical institutionalism uses an inductive methodology. Compared with deduction, it is arguably better able to deal with complexity and nuance. It also utilises a dynamic, historical approach and specifies (dynamically) endogenous preference formation by interpretive actors. Historical institutionalism can also more easily incorporate a wider set of key explanatory variables and wider social aggregates. Hence, it is argued that historical institutionalism is the preferred explanatory theory and methodology in this case.