10 results for Biases

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 10.00%

Abstract:

The Sznajd model is a sociophysics model of opinion propagation and consensus formation in societies. Its main feature is that its rules favor bigger groups of agreeing people. In a previous work, we generalized the bounded confidence rule in order to model biases and prejudices in discrete opinion models, applied this modification to the Sznajd model and presented some preliminary results. The present work extends that paper. We present results linking many properties of the mean-field fixed points to only a few qualitative aspects of the confidence rule (the biases and prejudices modeled), finding an interesting connection with graph theory problems. More precisely, we link the existence of fixed points with the notion of strongly connected graphs, and the stability of fixed points with the problem of finding the maximal independent sets of a graph. We state these results and present comparisons between the mean field and simulations on Barabási-Albert networks, followed by the main mathematical ideas and appendices with the rigorous proofs of our claims and some graph theory concepts, together with examples. We also show that there is no qualitative difference in the mean-field results if we require that a group of q > 2 agreeing agents, instead of a pair, be formed before they attempt to convince other sites (for the mean field, this would coincide with the q-voter model).
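The graph-theoretic criteria mentioned above can be made concrete with a small sketch. The snippet below is illustrative only (not the paper's code): it encodes a hypothetical confidence rule as a directed graph and checks strong connectivity (linked to the existence of fixed points) and maximal independent sets (linked to their stability), using networkx.

```python
# Minimal sketch (not the paper's code): a discrete-opinion "confidence rule"
# represented as a directed graph, plus the two graph properties the abstract
# links to mean-field fixed points.
import networkx as nx

# Hypothetical bias rule: opinion i can be pulled toward opinion j
# iff (i, j) is an edge. The specific edges are illustrative only.
confidence = nx.DiGraph([(0, 1), (1, 2), (2, 0), (2, 1)])

# Existence of fixed points is tied to strong connectivity of the rule graph.
print("strongly connected:", nx.is_strongly_connected(confidence))

# Stability is tied to maximal independent sets; these are the maximal
# cliques of the complement of the undirected version of the rule graph.
undirected = confidence.to_undirected()
max_ind_sets = list(nx.find_cliques(nx.complement(undirected)))
print("maximal independent sets:", max_ind_sets)
```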

Relevance: 10.00%

Abstract:

Background: Malaria caused by Plasmodium vivax is an experimentally neglected severe disease with a substantial burden on human health. Because of technical limitations, little is known about the biology of this important human pathogen. Whole-genome analysis methods applied to patient-derived material are thus likely to have a substantial impact on our understanding of P. vivax pathogenesis and epidemiology. For example, they will allow study of the evolution and population biology of the parasite, allow parasite transmission patterns to be characterized, and may facilitate the identification of new drug resistance genes. Because parasitemias are typically low and the parasite cannot be readily cultured, on-site leukocyte depletion of blood samples is typically needed to remove human DNA, which may be 1000X more abundant than parasite DNA. These constraints have precluded the analysis of archived blood samples and require laboratories in close proximity to the field collection sites for optimal pre-cryopreservation sample preparation. Results: Here we show that in-solution hybridization capture can be used to separate P. vivax DNA from contaminating human DNA in the laboratory, without the need for on-site leukocyte filtration. Using a whole-genome capture method, we were able to enrich P. vivax DNA in bulk genomic DNA from less than 0.5% to a median of 55% (range 20%-80%). This level of enrichment allows efficient analysis of the samples by whole-genome sequencing and does not introduce any gross biases into the data. With this method, we obtained greater than 5X coverage across 93% of the P. vivax genome for four P. vivax strains from Iquitos, Peru, which is similar to our results using leukocyte filtration (greater than 5X coverage across 96% of the genome). Conclusion: The whole-genome capture technique will enable more efficient whole-genome analysis of P. vivax from a larger geographic region and from valuable archived sample collections.
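As a rough illustration of the coverage figure reported above, the sketch below computes the fraction of genome positions covered at 5X or more from a per-base depth table such as the output of `samtools depth -a`; the file name and the genome size are assumptions for the example, not values from the study.

```python
# Minimal sketch (assumed inputs, not the authors' pipeline): fraction of the
# genome covered at >= 5X from per-base depths, e.g. the three-column output
# of `samtools depth -a` (chrom, pos, depth).
GENOME_SIZE = 26_800_000   # approximate P. vivax genome size, illustration only
MIN_DEPTH = 5

covered = 0
with open("pvivax_depth.txt") as handle:   # hypothetical file name
    for line in handle:
        chrom, pos, depth = line.split()
        if int(depth) >= MIN_DEPTH:
            covered += 1

print(f"fraction of genome at >= {MIN_DEPTH}X: {covered / GENOME_SIZE:.1%}")
```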

Relevance: 10.00%

Abstract:

Background: Cryptococcus neoformans causes meningitis and disseminated infection in healthy individuals, but more commonly in hosts with defective immune responses. Cell-mediated immunity is an important component of the immune response to a great variety of infections, including yeast infections. We aimed to evaluate a specific lymphocyte transformation assay to Cryptococcus neoformans in order to identify immunodeficiency associated with neurocryptococcosis (NCC) as the primary cause of the mycosis. Methods: Healthy volunteers, poultry growers, and HIV-seronegative patients with neurocryptococcosis were tested for cellular immune response. Cryptococcal meningitis was diagnosed by India ink staining of cerebrospinal fluid and a cryptococcal antigen test (Immunomycol-Inc, SP, Brazil). Isolated peripheral blood mononuclear cells were stimulated with C. neoformans antigen, C. albicans antigen, and pokeweed mitogen. The amount of 3H-thymidine incorporated was assessed, and the results were expressed as the stimulation index (SI) and log SI; sensitivity, specificity, and cut-off values were derived from receiver operating characteristic curves. We applied unpaired Student t tests to compare data and considered differences significant at p < 0.05. Results: The lymphocyte transformation assay showed a low capacity, with all the stimuli, for classifying patients as responders and non-responders. The response to heat-killed antigen in patients with neurocryptococcosis was not affected by CD4+ T cell count, and the intensity of the response did not correlate with the clinical evolution of neurocryptococcosis. Conclusion: Responses to the lymphocyte transformation assay should be analyzed against a normal range and using more than one stimulus. The use of a cut-off value to classify patients with neurocryptococcosis is inadequate. Statistical analysis should be based on the log transformation of SI. A more purified antigen for evaluating the specific response to C. neoformans is needed.
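A minimal sketch of the statistics described, with made-up numbers: the stimulation index is the ratio of antigen-stimulated to unstimulated 3H-thymidine counts, responses are compared on a log scale with an unpaired t test, and candidate cut-offs come from an ROC curve.

```python
# Minimal sketch (illustrative values only) of the analysis described:
# stimulation index (SI), log transform, unpaired t test, and ROC cut-offs.
import numpy as np
from scipy import stats
from sklearn.metrics import roc_curve

# SI = counts with antigen / counts without antigen (3H-thymidine CPM).
def stimulation_index(stimulated_cpm, unstimulated_cpm):
    return stimulated_cpm / unstimulated_cpm

# Hypothetical log SI values for patients and healthy controls.
patients = np.log10([1.2, 2.5, 1.1, 3.0, 1.8])
controls = np.log10([4.1, 3.5, 5.2, 2.9, 4.8])

t_stat, p_value = stats.ttest_ind(patients, controls)   # unpaired Student t test
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# ROC curve for candidate cut-offs (1 = patient, 0 = control);
# log SI is negated so that lower responses score as "patient".
labels = np.r_[np.ones_like(patients), np.zeros_like(controls)]
scores = np.r_[patients, controls]
fpr, tpr, thresholds = roc_curve(labels, -scores)
print("candidate cut-offs (negated log SI):", thresholds)
```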

Relevance: 10.00%

Abstract:

The frequency distributions of SNPs and haplotypes in the ABCB1, SLCO1B1 and SLCO1B3 genes vary widely among continental populations. This variation can lead to biases in pharmacogenetic studies conducted in admixed populations such as those from Brazil and other Latin American countries. The aim of this study was to evaluate the influence of self-reported colour, geographical origin and genomic ancestry on the distributions of ABCB1, SLCO1B1 and SLCO1B3 polymorphisms and derived haplotypes in admixed Brazilian populations. A total of 1039 healthy adults from the north, north-east, south-east and south of Brazil were recruited for this investigation. The c.388A>G (rs2306283), c.463C>A (rs11045819) and c.521T>C (rs4149056) SNPs in the SLCO1B1 gene and the c.334T>G (rs4149117) and c.699G>A (rs7311358) SNPs in the SLCO1B3 gene were determined by TaqMan 5'-nuclease assays. The ABCB1 c.1236C>T (rs1128503), c.2677G>T/A (rs2032582) and c.3435C>T (rs1045642) polymorphisms were genotyped using a previously described single-base extension/termination method. The results showed that genotype and haplotype distributions are highly variable among populations of the same self-reported colour and geographical region, and that these distributions are better explained when genomic ancestry is treated as a continuous variable. The influence of ancestry on allele and haplotype frequencies was more evident for variants with large differences in allele frequency between European and African populations. The design and interpretation of pharmacogenetic studies involving these transporter genes should include genomic controls to avoid spurious conclusions based on improper matching of study cohorts from Brazilian and other highly admixed populations.
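The recommendation to use genomic controls can be illustrated with a small sketch (synthetic data, hypothetical variable names, not the study's analysis): instead of stratifying by self-reported colour or region, individual genomic ancestry enters the model as a continuous covariate.

```python
# Minimal sketch (synthetic data): allele dosage modeled against a continuous
# per-individual ancestry proportion rather than discrete population labels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1039
african_ancestry = rng.beta(2, 5, size=n)            # hypothetical proportions
# Simulated allele count (0, 1, 2) whose frequency tracks ancestry.
allele_count = rng.binomial(2, 0.10 + 0.30 * african_ancestry)

# Per-individual dosage regressed on ancestry (continuous model).
X = sm.add_constant(african_ancestry)
model = sm.OLS(allele_count, X).fit()
print(model.params)    # slope: how allele frequency changes with ancestry
```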

Relevance: 10.00%

Abstract:

This study analyzes important aspects of the tropical Atlantic Ocean in simulations from the fourth version of the Community Climate System Model (CCSM4): the mean sea surface temperature (SST) and wind stress, the Atlantic warm pools, the principal modes of SST variability, and the heat budget in the Benguela region. The main goal was to assess the similarities and differences between the CCSM4 simulations and observations. The results indicate that the tropical Atlantic is, overall, realistic in CCSM4. However, there are still significant biases in the CCSM4 Atlantic SSTs, with a colder tropical North Atlantic and a warmer tropical South Atlantic, that are related to biases in the wind stress. These biases are also reflected in the Atlantic warm pools in April and September, whose volume is greater than observed in April and smaller than observed in September. The variability of SSTs in the tropical Atlantic is well represented in CCSM4. However, in the equatorial and tropical South Atlantic regions, CCSM4 has two distinct modes of variability, in contrast to the observed behavior. A model heat budget analysis of the Benguela region indicates that the variability of the upper-ocean temperature is dominated by vertical advection, followed by meridional advection.
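For reference, a heat budget of the kind mentioned decomposes the upper-ocean temperature tendency into advective contributions; the sketch below (assumed variable names and grid layout, simple centered differences, not the study's diagnostics) computes the meridional and vertical advection terms from gridded model fields.

```python
# Minimal sketch: meridional (-v dT/dy) and vertical (-w dT/dz) advection of
# temperature, the two terms highlighted in the Benguela heat budget.
import numpy as np

def advection_terms(T, v, w, dy, dz):
    """T, v, w are 3-D arrays ordered (depth, lat, lon); dy, dz are grid
    spacings in metres. Returns the meridional and vertical advection terms."""
    dTdy = np.gradient(T, dy, axis=1)
    dTdz = np.gradient(T, dz, axis=0)
    return -v * dTdy, -w * dTdz

# With model output loaded as arrays, the two terms can be averaged over the
# Benguela box and compared with the upper-ocean temperature tendency.
```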

Relevance: 10.00%

Abstract:

Objectives: The clinical significance and management of prenatal hydronephrosis (PNH) are sources of debate. Existing studies are limited by biased cohorts or inconsistent follow-up. We aimed to evaluate the incidence of pathology in a large cohort of PNH and to assess the biases and outcomes of this population. Methods: We reviewed 1034 charts of fetuses with PNH. Records of delivered offspring were reviewed at a pediatric center and analyzed with respect to prenatal and postnatal pathology and management. Results: Prenatal resolution of hydronephrosis occurred in 24.7% of pregnancies. On the first postnatal ultrasound, some degree of dilatation was present in 80%, 88% and 95% of mild, moderate and severe PNH cases, respectively. At the end of follow-up, hydronephrosis persisted in 10%, 25% and 72% of these children, respectively. The incidence of vesicoureteral reflux did not correlate with the severity of PNH. Children with a postnatal workup had more severe PNH than those without. Conclusions: Although prenatal resolution occurred in about 25% of cases, pelvic dilatation persisted on the first postnatal imaging in most cases, justifying postnatal ultrasound evaluation. Whereas most mild cases resolved spontaneously, a quarter of moderate and more than half of severe cases required surgery. Patients with postnatal imaging and referral had more severe PNH, which could result in overestimation of pathology. (c) 2012 John Wiley & Sons, Ltd.

Relevance: 10.00%

Abstract:

A set of predictor variables is said to be intrinsically multivariate predictive (IMP) for a target variable if all properly contained subsets of the predictor set are poor predictors of the target but the full set predicts the target with great accuracy. In a previous article, the main properties of IMP Boolean variables were described analytically, including the introduction of the IMP score, a metric based on the coefficient of determination (CoD) as a measure of predictiveness with respect to the target variable. It was shown that the IMP score depends on four main properties: the logic of connection, the predictive power, the covariance between predictors, and the marginal predictor probabilities (biases). This paper extends that work to a broader context, in an attempt to characterize the properties of discrete Bayesian networks that contribute to the presence of variables (network nodes) with high IMP scores. We have found that there is a relationship between the IMP score of a node and its territory size, i.e., its position along a pathway with one source: nodes far from the source display larger IMP scores than those closer to the source, and longer pathways display larger maximum IMP scores. This appears to be a consequence of the fact that nodes with small territories have a larger probability of having highly covariant predictors, which leads to smaller IMP scores. In addition, a larger number of XOR and NXOR predictive logic relationships has a positive influence on the maximum IMP score found in the pathway. This work presents analytical results based on a simple-structure network and an analysis involving random networks constructed by computational simulations. Finally, results from a real Bayesian network application are provided. (C) 2012 Elsevier Inc. All rights reserved.
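A simplified sketch of the quantities involved (not the authors' exact definitions): a table-lookup CoD for Boolean predictors and an IMP-style score defined as the full-set CoD minus the best CoD over properly contained subsets, shown on an XOR target, where the score is maximal.

```python
# Minimal sketch: coefficient of determination (CoD) for Boolean predictors
# and an IMP-style score (full-set CoD minus best proper-subset CoD).
from itertools import combinations
import numpy as np

def cod(X, y):
    """CoD = (e0 - e) / e0, with e0 the error of the best constant predictor
    and e the error of the best table lookup on the predictor columns X."""
    e0 = min(np.mean(y == 0), np.mean(y == 1))
    if e0 == 0:
        return 1.0
    err = 0.0
    for pattern in {tuple(row) for row in X}:
        mask = np.all(X == pattern, axis=1)
        # error of the best Boolean guess for this predictor pattern
        err += min(np.mean(y[mask] == 0), np.mean(y[mask] == 1)) * np.mean(mask)
    return (e0 - err) / e0

def imp_score(X, y):
    full = cod(X, y)
    subsets = [cod(X[:, list(c)], y)
               for r in range(1, X.shape[1])
               for c in combinations(range(X.shape[1]), r)]
    return full - max(subsets)

# XOR-like target: each predictor alone is useless, the pair is perfect.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]] * 25)
y = X[:, 0] ^ X[:, 1]
print(imp_score(X, y))   # close to 1 for an intrinsically multivariate target
```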

Relevance: 10.00%

Abstract:

Background: With the development of DNA hybridization microarray technologies, it is now possible to simultaneously assess the expression levels of thousands to tens of thousands of genes. Quantitative comparison of microarrays uncovers distinct patterns of gene expression, which define different cellular phenotypes or cellular responses to drugs. Because of technical biases, normalization of the intensity levels is a prerequisite to performing further statistical analyses. Therefore, choosing a suitable approach to normalization can be critical and deserves judicious consideration. Results: Here, we considered three commonly used normalization approaches, namely Loess, Splines and Wavelets, and two non-parametric regression methods that have not yet been used for normalization, namely Kernel smoothing and Support Vector Regression. The results were compared using artificial microarray data and benchmark studies. They indicate that Support Vector Regression is the most robust to outliers and that Kernel smoothing is the worst normalization technique, while no practical differences were observed between Loess, Splines and Wavelets. Conclusion: In light of these results, Support Vector Regression is favored for microarray normalization because of its robustness in estimating the normalization curve.
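As an illustration of the favored approach, the sketch below (synthetic two-channel intensities, not the paper's code) fits the intensity-dependent trend of the log ratios with Support Vector Regression and subtracts it, the usual way a fitted normalization curve is applied.

```python
# Minimal sketch: SVR-based intensity-dependent normalization of an MA plot.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
red = rng.lognormal(8, 1, 2000)
# Green channel with a synthetic intensity-dependent dye bias plus noise.
green = red * np.exp(0.03 * np.log2(red)) * rng.lognormal(0, 0.2, 2000)

A = 0.5 * np.log2(red * green)    # average log intensity
M = np.log2(red / green)          # log ratio to be normalized

curve = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(A.reshape(-1, 1), M)
M_normalized = M - curve.predict(A.reshape(-1, 1))
print(M.mean(), M_normalized.mean())
```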

Relevance: 10.00%

Abstract:

Background: The evolutionary advantages of selective attention are unclear. Since the study of selective attention began, it has been suggested that the nervous system only processes the most relevant stimuli because of its limited capacity [1]. An alternative proposal is that action planning requires the inhibition of irrelevant stimuli, which forces the nervous system to limit its processing [2]. An evolutionary approach might provide additional clues to clarify the role of selective attention. Methods: We developed Artificial Life simulations in which animals were repeatedly presented with two objects, "left" and "right", each of which could be "food" or "non-food". The animals' neural networks (multilayer perceptrons) had two input nodes, one for each object, and two output nodes that determined whether the animal ate each of the objects. The neural networks also had a variable number of hidden nodes, which determined whether or not they had enough capacity to process both stimuli (Table 1). The evolutionary relevance of the left and right food objects could also vary, depending on how much the animal's fitness was increased by ingesting them (Table 1). We compared sensory processing in animals with or without limited capacity that evolved in simulations in which the objects had the same or different relevances. Table 1: nine sets of simulations were performed, varying the values of the food objects and the number of hidden nodes in the neural networks; the values of the left and right food objects were swapped during the second half of the simulations; non-food objects were always worth -3. The evolution of the neural networks was simulated by a simple genetic algorithm. Fitness was a function of the number of food and non-food objects each animal ate, and the chromosomes determined the node biases and synaptic weights. During each simulation, 10 populations of 20 individuals each evolved in parallel for 20,000 generations; the relevance of the food objects was then swapped and the simulation was run for another 20,000 generations. The neural networks were evaluated by their ability to identify the two objects correctly. The detectability (d') for the left and right objects was calculated using Signal Detection Theory [3]. Results and conclusion: When both stimuli were equally relevant, networks with two hidden nodes only processed one stimulus and ignored the other. With four or eight hidden nodes, they could correctly identify both stimuli. When the stimuli had different relevances, the d' for the more relevant stimulus was higher than the d' for the less relevant one, even when the networks had four or eight hidden nodes. We conclude that selection mechanisms arose in our simulations depending not only on the size of the neural networks but also on the stimuli's relevance for action.
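The detectability measure cited above has a standard form; the sketch below computes d' from hit and false-alarm counts, with a small correction to avoid infinite z-scores (the counts are illustrative, not values from the simulations).

```python
# Minimal sketch of the Signal Detection Theory measure used:
# d' = z(hit rate) - z(false alarm rate).
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' with a log-linear correction so rates never equal exactly 0 or 1."""
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    hit_rate = (hits + 0.5) / (n_signal + 1)
    fa_rate = (false_alarms + 0.5) / (n_noise + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Example: an animal that eats left food 90/100 times it appears and
# wrongly eats left non-food 20/100 times.
print(d_prime(90, 10, 20, 80))
```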

Relevance: 10.00%

Abstract:

It has consistently been shown that agents judge the intervals between their actions and their outcomes as compressed in time, an effect named intentional binding. In the present work, we investigated whether this effect is the result of prior biases volunteers have about the timing of the consequences of their actions, or whether it is due to learning that occurs during the experimental session. Volunteers made temporal estimates of the interval between their action and target onset (Action conditions), or between two events (No-Action conditions). Our results show that temporal estimates become shorter throughout each experimental block in both conditions. Moreover, we found that observers judged intervals between actions and outcomes as shorter even in the very early trials of each block. To quantify the decrease of temporal judgments within experimental blocks, exponential functions were fitted to the participants' temporal judgments. The fitted parameters suggest that observers had different prior biases about intervals between events in which their own action was involved. These findings suggest that prior biases might play a more important role in this effect than calibration-type learning processes.
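A minimal sketch (synthetic judgments, assumed three-parameter form, not the authors' fitting code) of the exponential fit described: judged intervals decay across trials toward an asymptote, and the fitted parameters summarize each block.

```python
# Minimal sketch: fitting an exponential decay to temporal judgments
# across trials within a block.
import numpy as np
from scipy.optimize import curve_fit

def exponential(trial, initial, rate, asymptote):
    # Judged interval decays from (asymptote + initial) toward the asymptote.
    return asymptote + initial * np.exp(-rate * trial)

trials = np.arange(60)
judgments = (650 * np.exp(-0.05 * trials) + 350
             + np.random.default_rng(2).normal(0, 30, 60))

params, _ = curve_fit(exponential, trials, judgments, p0=(600, 0.05, 400))
print(dict(zip(("initial", "rate", "asymptote"), params)))
```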