959 results for false personation


Relevance:

10.00%

Publisher:

Abstract:

The statement that pairs of individuals from different populations are often more genetically similar than pairs from the same population is a widespread idea inside and outside the scientific community. Witherspoon et al. ["Genetic similarities within and between human populations," Genetics 176:351-359 (2007)] proposed an index called the dissimilarity fraction (omega) to assess quantitatively the validity of this statement for genetic systems. Witherspoon et al. demonstrated that, as the number of loci increases, omega decreases to the point where, when enough sampling is available, the statement is false. In this study, we applied the dissimilarity fraction to Howells's craniometric database to establish whether or not similar results are obtained for cranial morphological traits. Although in genetic studies thousands of loci are available, Howells's database provides no more than 55 metric traits, making the contribution of each variable important. To cope with this limitation, we developed a routine that takes this effect into consideration when calculating omega. Contrary to what was observed for the genetic data, our results show that cranial morphology asymptotically approaches a mean omega of 0.3 and therefore supports the initial statement (that is, that individuals from the same geographic region do not form clear and discrete clusters), further questioning the idea of the existence of discrete biological clusters in the human species. Finally, by assuming that cranial morphology is under an additive polygenic model, we can say that the population history signal of human craniometric traits presents the same resolution as a neutral genetic system dependent on no more than 20 loci.
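
A minimal sketch of how a dissimilarity-fraction estimate of this kind can be computed (my own illustration in Python with hypothetical inputs, not the authors' routine, which additionally corrects for the small number of traits): omega is taken here as the frequency with which a between-population pair of individuals is more similar, i.e. has a smaller Euclidean distance, than a within-population pair.

```python
# Illustrative sketch only: estimate the dissimilarity fraction (omega) by
# Monte Carlo sampling of within- vs. between-population pairs.
# Assumes `data` is an (n_individuals, n_traits) array and `labels` gives the
# population of each individual, with at least two individuals per population.
import numpy as np

def dissimilarity_fraction(data, labels, n_pairs=10000, seed=0):
    rng = np.random.default_rng(seed)
    data, labels = np.asarray(data, float), np.asarray(labels)
    pops = np.unique(labels)
    hits = 0
    for _ in range(n_pairs):
        pop = rng.choice(pops)
        within_idx = np.where(labels == pop)[0]
        between_idx = np.where(labels != pop)[0]
        a, b = rng.choice(within_idx, size=2, replace=False)    # same population
        c, d = rng.choice(within_idx), rng.choice(between_idx)  # different populations
        d_within = np.linalg.norm(data[a] - data[b])
        d_between = np.linalg.norm(data[c] - data[d])
        hits += d_between < d_within
    # A mean of about 0.3, as reported above, means a between-population pair
    # is the more similar one in roughly 30% of comparisons.
    return hits / n_pairs
```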

Relevance:

10.00%

Publisher:

Abstract:

Data obtained during routine diagnosis of human T-cell lymphotropic virus type 1 (HTLV-1) and 2 (HTLV-2) in "at-risk" individuals from Sao Paulo, Brazil, using signal-to-cutoff (S/C) values obtained by first-, second-, and third-generation enzyme immunoassay (EIA) kits, were compared. The highest S/C values were obtained with third-generation EIA kits, but no correlation was detected between these values and specific antibody reactivity to HTLV-1, HTLV-2, or untyped HTLV (p = 0.302). In addition, use of these third-generation kits resulted in HTLV-1/2 false-positive samples. In contrast, first- and second-generation EIA kits showed high specificity, and the second-generation EIA kits showed the highest efficiency, despite lower S/C values. Using first- and second-generation EIA kits, significant differences in specific antibody detection of HTLV-1, relative to HTLV-2 (p = 0.019 for first-generation and p < 0.001 for second-generation EIA kits) and relative to untyped HTLV (p = 0.025 for first-generation EIA kits), were observed. These results were explained by the composition and format of the assays. In addition, using receiver operating characteristic (ROC) analysis, a slight adjustment in cutoff values for third-generation EIA kits improved their specificities and should be used when HTLV "at-risk" populations from this geographic area are to be evaluated. (C) 2009 Elsevier B.V. All rights reserved.
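
As a rough illustration of the kind of ROC-based cutoff adjustment mentioned above, the sketch below assumes confirmed infection status and S/C values are available as arrays and uses Youden's J to pick the threshold; this is one common selection rule, not necessarily the one used in the study.

```python
# Sketch: pick an adjusted S/C cutoff for a third-generation EIA from ROC analysis.
# `y_true` (1 = confirmed HTLV infection, 0 = not) and `s_c` (signal-to-cutoff
# values) are hypothetical inputs.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def adjusted_cutoff(y_true, s_c):
    fpr, tpr, thresholds = roc_curve(y_true, s_c)
    youden_j = tpr - fpr                      # sensitivity + specificity - 1
    best = int(np.argmax(youden_j))
    return thresholds[best], roc_auc_score(y_true, s_c)

# cutoff, auc = adjusted_cutoff(y_true, s_c)
# Samples with S/C >= cutoff would then be flagged as reactive.
```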

Relevance:

10.00%

Publisher:

Abstract:

Background: Human capillariasis caused by Capillaria hepatica (syn. Calodium hepaticum) is a rare disease, with no more than 40 cases registered around the world. Classically, the disease has severe symptoms that mimic acute hepatitis. Natural reservoirs of C. hepatica are urban rodents (Mus musculus and Rattus norvegicus) that harbor its eggs in the liver. After examining the feces of 6 riverine inhabitants of the Rio Preto area (8°03'S 62°53'W to 8°14'S 62°52'W), State of Rondonia, Brazil, and identifying C. hepatica eggs in their feces, the authors decided to investigate the real dimension of these findings by looking for two positive signals. Methods: Between June 1st and 15th, 2008, 246 out of 304 individuals were clinically examined. Blood samples were collected, kept at -20°C, and tested by the indirect immunofluorescence technique. Results: The first positive signal was the presence of specific antibodies at 1:150 dilution, which indicates that the person is likely to have been exposed to eggs, most likely non-infective eggs, passing through the food chain or via contaminated food (total prevalence of 34.1%). A second, more specific signal was the presence of antibodies at higher titers, indicating true infection. Conclusions: The authors concluded that only two subjects were truly infected (prevalence of 0.81%); the rest were false positives that were sensitized after consuming non-embryonated eggs. The present study is the first one carried out in a native Amazonian population and indicates the presence of antibodies against C. hepatica in this population. The results further suggest that transmission of the parasite occurs by the ingestion of embryonated eggs from human feces and/or carcasses of wild animals. The authors propose a novel mode of transmission, describing the disease as one of low pathogenicity and low infectivity.
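
The prevalence figures above follow from simple arithmetic; a small sketch is given below. Note that the count of 84 sera reactive at 1:150 is back-calculated from the 34.1% figure and the 246 examined subjects, so it is an inference rather than a number stated in the abstract.

```python
# Sketch: point prevalences as percentages of the 246 examined individuals.
def prevalence_pct(positives, examined):
    return 100.0 * positives / examined

print(round(prevalence_pct(84, 246), 1))  # reactive at 1:150 dilution -> ~34.1%
print(round(prevalence_pct(2, 246), 2))   # higher titers (true infection) -> ~0.81%
```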

Relevance:

10.00%

Publisher:

Abstract:

We search for planar deviations of statistical isotropy in the Wilkinson Microwave Anisotropy Probe (WMAP) data by applying a recently introduced angular-planar statistic both to full-sky and to masked temperature maps, including in our analysis the effects of residual foreground contamination and of systematics in the foreground-removal process as sources of error. We confirm earlier findings that full-sky maps exhibit anomalies at the planar (l) and angular (ℓ) scales (l, ℓ) = (2, 5), (4, 7), and (6, 8), which seem to be due to unremoved foregrounds, since these features are present in the full-sky map but not in the masked maps. On the other hand, our test detects slightly anomalous results at the scales (l, ℓ) = (10, 8) and (2, 9) in the masked maps but not in the full-sky one, indicating that the foreground-cleaning procedure (used to generate the full-sky map) could not only be creating false anomalies but also hiding existing ones. We also find a significant trace of an anomaly in the full-sky map at the scale (l, ℓ) = (10, 5), which is still present when we consider galactic cuts of 18.3% and 28.4%. As regards the quadrupole (l = 2), we find a coherent over-modulation over the whole celestial sphere, for all full-sky and cut-sky maps. Overall, our results seem to indicate that current CMB maps derived from WMAP data do not show significant signs of anisotropies, as measured by our angular-planar estimator. However, we have detected a curious coherence of planar modulations at angular scales of the order of the galaxy's plane, which may be an indication of residual contamination in the full- and cut-sky maps.
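
For readers who want to experiment with the effect of a galactic cut, the sketch below is not the angular-planar estimator itself, only a simplified full-sky versus cut-sky comparison using the healpy package; the input file name is hypothetical and the map is assumed to be in galactic coordinates.

```python
# Sketch: apply a simple |b| < b_cut galactic mask to a CMB temperature map and
# compare low-multipole power for the full sky vs. the cut sky.
import numpy as np
import healpy as hp

cmb = hp.read_map("wmap_ilc_map.fits")            # hypothetical file name
nside = hp.get_nside(cmb)

b_cut_deg = 10.0                                  # half-width of the cut, in degrees
theta, _ = hp.pix2ang(nside, np.arange(hp.nside2npix(nside)))
gal_lat = 90.0 - np.degrees(theta)                # galactic latitude of each pixel
mask = np.abs(gal_lat) > b_cut_deg                # keep pixels away from the plane
print("sky fraction removed: %.1f%%" % (100.0 * (1.0 - mask.mean())))

# Rough comparison only: no fsky correction is applied to the cut-sky spectrum.
cl_full = hp.anafast(hp.remove_dipole(cmb), lmax=10)
cl_cut = hp.anafast(hp.remove_dipole(cmb * mask), lmax=10)
print("quadrupole power: full sky %.3e, cut sky %.3e" % (cl_full[2], cl_cut[2]))
```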

Relevance:

10.00%

Publisher:

Abstract:

This paper proposes a method to locate and track people by combining evidence from multiple cameras using the homography constraint. The proposed method uses foreground pixels from simple background subtraction to compute evidence of the location of people on a reference ground plane. The algorithm computes the amount of support that basically corresponds to the "foreground mass" above each pixel; therefore, pixels that correspond to ground points have more support. The support is normalized to compensate for perspective effects and accumulated on the reference plane for all camera views. The detection of people on the reference plane then becomes a search for regions of local maxima in the accumulator. Many false positives are filtered by checking the visibility consistency of the detected candidates against all camera views. The remaining candidates are tracked using Kalman filters and appearance models. Experimental results using challenging data from PETS'06 show good performance of the method in the presence of severe occlusion. Ground truth data also confirms the robustness of the method. (C) 2010 Elsevier B.V. All rights reserved.
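
A condensed sketch of the accumulation step is shown below (my own simplification using OpenCV; it assumes per-view foreground masks and ground-plane homographies are already available and omits the perspective normalization and visibility-consistency checks described above).

```python
# Sketch: project each view's foreground mask onto a common reference plane,
# sum the evidence, and take local maxima as candidate people locations.
import cv2
import numpy as np

def ground_plane_support(fg_masks, homographies, plane_size=(400, 400)):
    acc = np.zeros(plane_size, dtype=np.float32)
    for mask, H in zip(fg_masks, homographies):
        warped = cv2.warpPerspective(mask.astype(np.float32), H,
                                     (plane_size[1], plane_size[0]))
        acc += warped                        # accumulate "foreground mass"
    return acc

def detect_people(acc, min_support, neighborhood=15):
    kernel = np.ones((neighborhood, neighborhood), np.uint8)
    local_max = (acc == cv2.dilate(acc, kernel)) & (acc >= min_support)
    return np.argwhere(local_max)            # (row, col) positions on the plane
```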

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: A major problem in Chagas disease donor screening is the high frequency of samples with inconclusive results. The objective of this study was to describe patterns of serologic results among donors at the three Brazilian REDS-II blood centers and correlate them with epidemiologic characteristics. STUDY DESIGN AND METHODS: The centers screened donor samples with one Trypanosoma cruzi lysate enzyme immunoassay (EIA). EIA-reactive samples were tested with a second lysate EIA, a recombinant-antigen-based EIA, and an immunofluorescence assay. Based on the serologic results, samples were classified as confirmed positive (CP), probable positive (PP), possible other parasitic infection (POPI), and false positive (FP). RESULTS: In 2007 to 2008, a total of 877 of 615,433 donations were discarded due to Chagas assay reactivity. The prevalences (95% confidence intervals [CIs]) among first-time donors for the CP, PP, POPI, and FP patterns were 114 (99-129), 26 (19-34), 10 (5-14), and 96 (82-110) per 100,000 donations, respectively. CP and PP had similar patterns of prevalence when analyzed by age, sex, education, and location, suggesting that PP cases represent true T. cruzi infections; in contrast, the demographics of donors with POPI were distinct and likely unrelated to Chagas disease. No CP cases were detected among 218,514 repeat donors followed for a total of 718,187 person-years. CONCLUSION: We have proposed a classification algorithm that may have practical importance for donor counseling and epidemiologic analyses of T. cruzi-seroreactive donors. The absence of incident T. cruzi infections is reassuring with respect to the risk of window-phase infections within Brazil and of travel-related infections in nonendemic countries such as the United States.
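
For orientation, the rates quoted above are counts per 100,000 donations; a sketch of the arithmetic follows, applied only to the overall discard figure, since the per-category denominators are not given in the abstract.

```python
# Sketch: express a count of reactive/discarded donations as a rate per 100,000.
def rate_per_100k(cases, donations):
    return 100000.0 * cases / donations

print(round(rate_per_100k(877, 615433), 1))  # overall discards: ~142.5 per 100,000
```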

Relevance:

10.00%

Publisher:

Abstract:

Cytochrome P450 (CYP450) is a class of enzymes for which substrate identification is particularly important: it would help medicinal chemists design drugs with fewer side effects due to drug-drug interactions and to extensive genetic polymorphism. Herein, we discuss the application of 2D and 3D similarity searches in identifying reference structures with a higher capacity to retrieve substrates of three important CYP enzymes (CYP2C9, CYP2D6, and CYP3A4). On the basis of the complementarity of multiple reference structures selected by different similarity search methods, we proposed the fusion of their individual Tanimoto scores into a consensus Tanimoto score (T(consensus)). Using this new score, true positive rates of 63% (CYP2C9) and 81% (CYP2D6) were achieved with false positive rates of 4% for the CYP2C9-CYP2D6 data set. Extended similarity searches were carried out on a validation data set, and the results showed that by using the T(consensus) score, not only did the area under the ROC curve increase, but also more substrates were recovered at the beginning of a ranked list.
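
A minimal sketch of score fusion of this kind is given below, under my own assumptions: binary fingerprints represented as NumPy arrays, and the consensus taken as the mean of the per-reference Tanimoto scores, which may differ from the exact T(consensus) definition used in the paper.

```python
# Sketch: Tanimoto similarity for binary fingerprints and a simple consensus
# score over multiple reference structures, used to rank a screening library.
import numpy as np

def tanimoto(fp_a, fp_b):
    a, b = fp_a.astype(bool), fp_b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

def consensus_tanimoto(candidate, references):
    return float(np.mean([tanimoto(candidate, ref) for ref in references]))

def rank_library(library, references):
    scored = [(i, consensus_tanimoto(fp, references)) for i, fp in enumerate(library)]
    return sorted(scored, key=lambda x: x[1], reverse=True)  # best candidates first
```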

Relevance:

10.00%

Publisher:

Abstract:

I consider the case for genuinely anonymous web searching. Big data seems to have it in for privacy. The story is well known, particularly since the dawn of the web. Vastly more personal information, monumental and quotidian, is gathered than in the pre-digital days. Once gathered it can be aggregated and analyzed to produce rich portraits, which in turn permit unnerving prediction of our future behavior. The new information can then be shared widely, limiting prospects and threatening autonomy. How should we respond? Following Nissenbaum (2011) and Brunton and Nissenbaum (2011 and 2013), I will argue that the proposed solutions—consent, anonymity as conventionally practiced, corporate best practices, and law—fail to protect us against routine surveillance of our online behavior. Brunton and Nissenbaum rightly maintain that, given the power imbalance between data holders and data subjects, obfuscation of one’s online activities is justified. Obfuscation works by generating “misleading, false, or ambiguous data with the intention of confusing an adversary or simply adding to the time or cost of separating good data from bad,” thus decreasing the value of the data collected (Brunton and Nissenbaum, 2011). The phenomenon is as old as the hills. Natural selection evidently blundered upon the tactic long ago. Take a savory butterfly whose markings mimic those of a toxic cousin. From the point of view of a would-be predator the data conveyed by the pattern is ambiguous. Is the bug lunch or potential last meal? In the light of the steep costs of a mistake, the savvy predator goes hungry. Online obfuscation works similarly, attempting for instance to disguise the surfer’s identity (Tor) or the nature of her queries (Howe and Nissenbaum 2009). Yet online obfuscation comes with significant social costs. First, it implies free riding. If I’ve installed an effective obfuscating program, I’m enjoying the benefits of an apparently free internet without paying the costs of surveillance, which are shifted entirely onto non-obfuscators. Second, it permits sketchy actors, from child pornographers to fraudsters, to operate with near impunity. Third, online merchants could plausibly claim that, when we shop online, surveillance is the price we pay for convenience. If we don’t like it, we should take our business to the local brick-and-mortar and pay with cash. Brunton and Nissenbaum have not fully addressed the last two costs. Nevertheless, I think the strict defender of online anonymity can meet these objections. Regarding the third, the future doesn’t bode well for offline shopping. Consider music and books. Intrepid shoppers can still find most of what they want in a book or record store. Soon, though, this will probably not be the case. And then there are those who, for perfectly good reasons, are sensitive about doing some of their shopping in person, perhaps because of their weight or sexual tastes. I argue that consumers should not have to pay the price of surveillance every time they want to buy that catchy new hit, that New York Times bestseller, or a sex toy.

Relevance:

10.00%

Publisher:

Abstract:

When using digital halftone proofing systems, a closer print match can be achieved than was earlier possible with analogue proofing systems. At the same time, the ability of these systems to produce accurate print matches can just as well lead to bad print matches, since several print-related parameters can be adjusted manually in the system by the user. Therefore, more advanced knowledge of graphic arts technology is required of the user of the system. The prepress company Colorcraft AB wishes to verify that their color proofs always have the right quality. This project was started with the purpose of finding a quality control method for Colorcraft's digital halftone proofing system (Kodak Approval XP4). Using software that supports spectral measurement, combined with a spectrophotometer and a control bar, a quality control system was assembled. This system detects variations that lie outside the proofing system's natural deviation. The prerequisite for this quality control system is that the tolerances are defined with the proofing system's natural deviations taken into account. Otherwise the quality control system will generate unnecessary false alarms and therefore will not be reliable.
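
One way such a control could work in practice is sketched below (illustrative only; the thesis does not specify the color-difference metric, so CIE76 delta E and a single tolerance value are assumptions on my part).

```python
# Sketch: compare measured control-bar patches against reference Lab values and
# flag patches whose deviation exceeds the tolerance derived from the proofing
# system's natural variation.
import math

def delta_e_cie76(lab_ref, lab_meas):
    return math.sqrt(sum((r - m) ** 2 for r, m in zip(lab_ref, lab_meas)))

def check_control_bar(reference, measured, tolerance=3.0):
    alarms = []
    for patch, lab_ref in reference.items():
        de = delta_e_cie76(lab_ref, measured[patch])
        if de > tolerance:
            alarms.append((patch, round(de, 2)))
    return alarms  # an empty list means the proof is within tolerance

# Example: check_control_bar({"cyan": (62, -30, -45)}, {"cyan": (58, -27, -43)})
```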

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study is to find similarities and differences between male and female fiction-writing. The data has been collected from pupils at an upper secondary school in Central Sweden. They were given an extract from a novel by Bernard MacLaverty and from that they were supposed to continue the story.

Theories that have evolved during the last centuries claim that the language use of men and women differs in many aspects. Women, it is said, use a more emotional language than men do, while men use more expletives than women. Likewise, the language is said to differ in the use of adverbs, verbs and adjectives. It has also been claimed that men and women have different topic developments and that women write longer sentences than men.

The results of the current study show that most of these claims are false, or at least not true in this specific context. In most cases there is little or no difference between the male writing and the female writing. There are also cases where the opposite is true – for example, the female participants write shorter sentences than the male participants. A general conclusion of the study is that the writing of the two groups is quite similar – or at least that similarities are present to a larger extent than differences.

Relevance:

10.00%

Publisher:

Abstract:

In this thesis, a new algorithm has been proposed to segment the foreground of the fingerprint from the image under consideration. The algorithm uses three features: mean, variance, and coherence. Based on these features, a rule system is built to help the algorithm segment the image efficiently. In addition, the proposed algorithm combines split-and-merge with a modified Otsu method. Enhancement techniques such as Gaussian filtering and histogram equalization are applied to enhance and improve the quality of the image. Finally, a post-processing technique is implemented to counter undesirable effects in the segmented image. Fingerprint recognition is one of the oldest recognition systems in biometrics. Everyone has a unique and unchangeable fingerprint. Based on this uniqueness and distinctness, fingerprint identification has been used in many applications for a long period. A fingerprint image is a pattern that consists of two regions, foreground and background. The foreground contains all the important information needed in automatic fingerprint recognition systems, whereas the background is a noisy region that contributes to the extraction of false minutiae. To avoid the extraction of false minutiae, several steps should be followed, such as preprocessing and enhancement. One of these steps is the transformation of the fingerprint image from a gray-scale image to a black-and-white image; this transformation is called segmentation or binarization. The aim of fingerprint segmentation is to separate the foreground from the background. Due to the nature of fingerprint images, segmentation becomes an important and challenging task. The proposed algorithm is applied to the FVC2000 database. Manual examination by human experts shows that the proposed algorithm provides efficient segmentation results, as demonstrated in diverse experiments.
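
A stripped-down sketch of block-wise segmentation follows (variance thresholding only; the combination with mean, coherence, split-and-merge, the modified Otsu method and the post-processing step described above is not reproduced here).

```python
# Sketch: mark 16x16 blocks with high gray-level variance (ridge/valley
# structure) as fingerprint foreground, low-variance blocks as background.
import numpy as np

def segment_foreground(image, block=16, var_threshold=100.0):
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            patch = image[y:y + block, x:x + block].astype(np.float64)
            if patch.var() >= var_threshold:
                mask[y:y + block, x:x + block] = True
    return mask  # True = foreground, False = background
```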

Relevance:

10.00%

Publisher:

Abstract:

This work aims at combining the postulates of chaos theory with the classification and predictive capability of artificial neural networks in the field of financial time series prediction. Chaos theory provides valuable qualitative and quantitative tools for deciding on the predictability of a chaotic system. Quantitative measures based on chaos theory are used to decide a priori whether a time series, or a portion of a time series, is predictable, while qualitative tools based on chaos theory are used to provide further observations and analysis of predictability in cases where the measures give negative answers. Phase space reconstruction is achieved by time-delay embedding, resulting in multiple embedded vectors. The suggested cognitive approach is inspired by the capability of some chartists to predict the direction of an index by looking at the price time series. Thus, in this work, the calculation of the embedding dimension and the separation in Takens' embedding theorem for phase space reconstruction is not limited to False Nearest Neighbors, differential entropy, or any other specific method; rather, this work is interested in all embedding dimensions and separations, regarded as different ways of looking at a time series by different chartists, based on their expectations. Prior to the prediction, the embedded vectors of the phase space are classified with Fuzzy ART; then, for each class, a back-propagation neural network is trained to predict the last element of each vector, with all previous elements of the vector used as features.
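
A small sketch of the phase-space reconstruction step is given below (time-delay embedding in the sense of Takens' theorem; the dimension and delay are left as free parameters because the work deliberately explores many such choices rather than fixing them with False Nearest Neighbors or differential entropy).

```python
# Sketch: embed a scalar time series into phase-space vectors
# x_t = (s_t, s_{t+delay}, ..., s_{t+(dimension-1)*delay}).
import numpy as np

def delay_embed(series, dimension, delay):
    s = np.asarray(series, dtype=float)
    n = len(s) - (dimension - 1) * delay
    if n <= 0:
        raise ValueError("series too short for this (dimension, delay) choice")
    return np.column_stack([s[i * delay : i * delay + n] for i in range(dimension)])

# Example: vectors = delay_embed(prices, dimension=4, delay=2)
# The last element of each vector would be the prediction target and the
# preceding elements the features, as in the scheme described above.
```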

Relevance:

10.00%

Publisher:

Abstract:

Defects are often present in rolled products, such as wire rod, and the market's demand for wire rod without any defects has increased. In the final wire rod products, defects originating from steel making, casting, pre-rolling of billets, and the wire rod rolling itself can appear. In this work, artificial V-shaped longitudinal surface cracks have been analysed experimentally and by means of FEM. The results indicate that the experiments and the FEM calculations show the same tendency, except in two cases where instability due to fairly "round" false-round bars disturbed the experiment. FE studies in combination with practical experiments are necessary in order to understand the behaviour of the material flow in the groove and to explain whether the crack will open up into a V-shape or close into an I-shape.

Relevance:

10.00%

Publisher:

Abstract:

Negotiating experience in the court: How do judges assess witness credibility, and how do they proceed to reach sustainable conclusions in a criminal court? This article is based on discussions in four focus groups with lay judges in Swedish district courts. In criminal court trials, a version of an event is generally reinforced if it is confirmed by witnesses. However, if their narratives are too similar, none of them is found trustworthy. The focus group participants agreed that if witnesses were suspected of having discussed their individual experiences of an event and accommodated them into a common story, their testimonies were not considered credible. While testimonies should ideally be untainted by other people's impressions and opinions, other rules govern the truth of the court. The lay judges appreciated their deliberations, including negotiations on impressions and memories of the trial, and they sometimes adjusted their perceptions in the light of information provided by other members of the court. However, if the lay judges are viewed as witnesses of what takes place in the trial, this gives rise to a paradox: while witness negotiations on experiences are regarded as a means to construct a false or biased story, the same kind of interaction between the judges is considered necessary to establish a consensual truth of what actually happened.