20 results for Artifact

in CentAUR: Central Archive University of Reading - UK


Relevance: 20.00%

Abstract:

The node-density effect is an artifact of phylogeny reconstruction that can cause branch lengths to be underestimated in areas of the tree with fewer taxa. Webster, Payne, and Pagel (2003, Science 301:478) introduced a statistical procedure (the "delta" test) to detect this artifact, and here we report the results of computer simulations that examine the test's performance. In a sample of 50,000 random data sets, we find that the delta test detects the artifact in 94.4% of cases in which it is present. When the artifact is not present (n = 10,000 simulated data sets), the test shows a type I error rate of approximately 1.69%, incorrectly reporting the artifact in 169 data sets. Three measures of tree shape or "balance" failed to predict the size of the node-density effect. This may reflect the relative homogeneity of our randomly generated topologies, but it emphasizes that nearly any topology can suffer from the artifact; the effect is not confined to highly unevenly sampled or otherwise imbalanced trees. The ability to screen phylogenies for the node-density artifact is important for phylogenetic inference and for researchers using phylogenetic trees to infer evolutionary processes, including their use in molecular clock dating. [Delta test; molecular clock; molecular evolution; node-density effect; phylogenetic reconstruction; speciation; simulation.]
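
The two performance figures quoted above are simple proportions over the simulated data sets. A minimal sketch (not the authors' code) of that calculation:

```python
# A minimal sketch: the detection rate and the type I error rate are
# proportions of simulated data sets flagged by the delta test.

def proportion_flagged(test_flags):
    """Fraction of data sets in which the delta test reports the artifact."""
    return sum(test_flags) / len(test_flags)

# Type I error rate quoted in the abstract: 169 false positives out of
# 10,000 artifact-free data sets.
print(proportion_flagged([True] * 169 + [False] * (10_000 - 169)))  # 0.0169
```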

Relevance: 20.00%

Abstract:

Contamination of the electroencephalogram (EEG) by artifacts greatly reduces the quality of the recorded signals. There is a need for automated artifact removal methods. However, such methods are rarely evaluated against one another via rigorous criteria, with results often presented based upon visual inspection alone. This work presents a comparative study of automatic methods for removing blink, electrocardiographic, and electromyographic artifacts from the EEG. Three methods are considered: wavelet-, blind source separation (BSS)-, and multivariate singular spectrum analysis (MSSA)-based correction. These are applied to data sets containing mixtures of artifacts. Metrics are devised to measure the performance of each method. The BSS method is seen to be the best approach for artifacts of high signal-to-noise ratio (SNR). By contrast, MSSA performs well at low SNRs, but at the expense of a large number of false positive corrections.
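
As a hedged illustration of the BSS approach described above (a generic sketch, not the paper's pipeline; the use of FastICA and a kurtosis threshold are assumptions made here), artifact components can be identified after unmixing and zeroed before reconstructing the EEG:

```python
# Generic BSS-style correction sketch: unmix the multichannel EEG with ICA,
# zero components that look artifactual (judged here by a simple kurtosis
# threshold), and remix into channel space.
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def remove_artifact_components(eeg, kurtosis_threshold=5.0):
    """eeg: array of shape (n_samples, n_channels)."""
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)                  # independent components
    artifactual = kurtosis(sources, axis=0) > kurtosis_threshold
    sources[:, artifactual] = 0.0                     # discard flagged components
    return ica.inverse_transform(sources)             # cleaned EEG

# Usage: cleaned = remove_artifact_components(raw_eeg_window)
```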

Relevance: 20.00%

Abstract:

A fully automated and online artifact removal method for the electroencephalogram (EEG) is developed for use in brain-computer interfacing (BCI). The method (FORCe) is based upon a novel combination of wavelet decomposition, independent component analysis, and thresholding. FORCe is able to operate on a small channel set during online EEG acquisition and does not require additional signals (e.g. electrooculogram signals). Evaluation of FORCe is performed offline on EEG recorded from 13 BCI participants with cerebral palsy (CP) and online with three healthy participants. The method outperforms the state-of-the-art automated artifact removal methods lagged auto-mutual information clustering (LAMIC) and fully automated statistical thresholding (FASTER), and is able to remove a wide range of artifact types, including blink, electromyogram (EMG), and electrooculogram (EOG) artifacts.
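
A rough sketch of how wavelet decomposition, independent component analysis (ICA), and thresholding can be combined for this purpose is given below; it is not the FORCe algorithm itself, and the wavelet family, decomposition level, and universal-threshold rule are assumptions chosen for illustration:

```python
# Generic wavelet + ICA + thresholding sketch (not FORCe): unmix the EEG,
# soft-threshold the wavelet coefficients of each component, then remix.
import numpy as np
import pywt
from sklearn.decomposition import FastICA

def wavelet_denoise(component, wavelet="db4", level=4):
    coeffs = pywt.wavedec(component, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(component)))   # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(component)]

def clean_eeg(eeg):
    """eeg: array of shape (n_samples, n_channels)."""
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)
    denoised = np.column_stack([wavelet_denoise(s) for s in sources.T])
    return ica.inverse_transform(denoised)
```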

Relevance: 20.00%

Abstract:

Research on the ratio bias—according to which individuals prefer to bet on probabilities expressed as a ratio of large numbers rather than on normatively equivalent or superior probabilities expressed as a ratio of small numbers—has recently gained momentum, with researchers, especially in health economics, emphasizing the policy importance of the phenomenon. Although the bias has been replicated several times, some doubts remain about its economic significance. Our two experiments show that the bias disappears once order effects are excluded and once salient and dominant incentives are provided. This holds true for both choice and valuation tasks. Also, adding context to the decision problem does not alter this finding. No ratio bias could be found in between-subject tests either, which leads us to the conclusion that the policy relevance of the phenomenon is doubtful at best.

Relevance: 10.00%

Abstract:

Several global quantities are computed from the ERA40 reanalysis for the period 1958-2001 and explored for trends. These are discussed in the context of changes to the global observing system. Temperature, integrated water vapor (IWV), and kinetic energy are considered. The ERA40 global mean temperature in the lower troposphere has a trend of +0.11 K per decade over the period 1979-2001, which is slightly higher than the MSU measurements, but within the estimated error limit. For the period 1958-2001 the warming trend is 0.14 K per decade, but this is likely to be an artifact of changes in the observing system. When this is corrected for, the warming trend is reduced to 0.10 K per decade. The global trend in IWV for the period 1979-2001 is +0.36 mm per decade. This is about twice as high as the trend determined from the Clausius-Clapeyron relation assuming conservation of relative humidity. It is also larger than results from free climate model integrations driven by the same observed sea surface temperature as used in ERA40. It is suggested that the large trend in IWV does not represent a genuine climate trend but an artifact caused by changes in the global observing system, such as the use of SSM/I and more satellite soundings in later years. Recent results are in good agreement with GPS measurements. The IWV trend for the period 1958-2001 is still higher, but reduced to +0.16 mm per decade when corrected for changes in the observing systems. Total kinetic energy shows an increasing global trend. Results from data assimilation experiments strongly suggest that this trend is also incorrect and is mainly caused by the huge changes in the global observing system in 1979. When this is corrected for, no significant change in global kinetic energy from 1958 onward can be found.
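
For illustration, a decadal trend of this kind is simply the slope of a least-squares linear fit of the annual global means against time, scaled to ten-year units (a generic sketch, not the ERA40 processing chain):

```python
# Least-squares trend of an annual series, expressed per decade.
import numpy as np

def trend_per_decade(years, annual_means):
    slope, _intercept = np.polyfit(years, annual_means, deg=1)
    return slope * 10.0

# Usage: trend_per_decade(np.arange(1979, 2002), global_mean_temperature)
```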

Relevance: 10.00%

Abstract:

M. R. Banaji and A. G. Greenwald (1995) demonstrated a gender bias in fame judgments—that is, an increase in judged fame due to prior processing that was larger for male than for female names. They suggested that participants shift criteria between judging men and women, using the more liberal criterion for judging men. This "criterion-shift" account appeared problematic for a number of reasons. In this article, 3 experiments are reported that were designed to evaluate the criterion-shift account of the gender bias in the false-fame effect against a distribution-shift account. The results were consistent with the criterion-shift account, and they helped to define more precisely the situations in which people may be ready to shift their response criterion on an item-by-item basis. In addition, the results were incompatible with an interpretation of the criterion shift as an artifact of the experimental situation in the experiments reported by M. R. Banaji and A. G. Greenwald.

Relevance: 10.00%

Abstract:

We need a new definition for the kilogram! The present definition was sanctioned by the first General Conference on Weights and Measures (CGPM, Conférence Générale des Poids et Mesures) in 1889, with a minor revision to the words in 1901, and remains unchanged after 116 years. It is the only base unit of the International System of Units (the SI) that is still defined in terms of a prototype artifact, the International Prototype of the Kilogram (IPK), which is kept in a safe at the International Bureau of Weights and Measures (the BIPM, Bureau International des Poids et Mesures) in Sèvres, near Paris.

Relevance: 10.00%

Abstract:

We investigate the performance of phylogenetic mixture models in reducing a well-known and pervasive artifact of phylogenetic inference known as the node-density effect, comparing them to partitioned analyses of the same data. The node-density effect refers to the tendency for the amount of evolutionary change in longer branches of phylogenies to be underestimated compared to that in regions of the tree where there are more nodes and thus branches are typically shorter. Mixture models allow more than one model of sequence evolution to describe the sites in an alignment without prior knowledge of the evolutionary processes that characterize the data or how they correspond to different sites. If multiple evolutionary patterns are common in sequence evolution, mixture models may be capable of reducing node-density effects by characterizing the evolutionary processes more accurately. In gene-sequence alignments simulated to have heterogeneous patterns of evolution, we find that mixture models can reduce node-density effects to negligible levels or remove them altogether, performing as well as partitioned analyses based on the known simulated patterns. The mixture models achieve this without knowledge of the patterns that generated the data and even in some cases without specifying the full or true model of sequence evolution known to underlie the data. The latter result is especially important in real applications, as the true model of evolution is seldom known. We find the same patterns of results for two real data sets with evidence of complex patterns of sequence evolution: mixture models substantially reduced node-density effects and returned better likelihoods compared to partitioning models specifically fitted to these data. We suggest that the presence of more than one pattern of evolution in the data is a common source of error in phylogenetic inference and that mixture models can often detect these patterns even without prior knowledge of their presence in the data. Routine use of mixture models alongside other approaches to phylogenetic inference may often reveal hidden or unexpected patterns of sequence evolution and can improve phylogenetic inference.
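
Schematically (a generic formulation, not reproduced from the paper), a mixture model evaluates the likelihood of each alignment site as a weighted sum over the component substitution models, whereas a partitioned analysis assigns a single model to each predefined block of sites:

```latex
% Per-site likelihood under a K-component mixture model, for site data D_i,
% tree T, component models M_k, and mixture weights w_k:
L_i = \sum_{k=1}^{K} w_k \, P(D_i \mid T, M_k), \qquad \sum_{k=1}^{K} w_k = 1 .
% A partitioned analysis instead fixes one model M_{k(i)} per site,
% giving L_i = P(D_i \mid T, M_{k(i)}).
```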

Relevance: 10.00%

Abstract:

Background: The upper outer quadrant (UOQ) of the breast is the most frequent site for incidence of breast cancer, but the reported disproportionate incidence in this quadrant appears to rise with year of publication. Materials and Methods: In order to determine whether this increasing incidence in the UOQ is an artifact of different study populations or is chronological, data have been analysed for annual quadrant incidence of female breast cancer recorded nationally in England and Wales between 1979 and 2000 and in Scotland between 1980 and 2001. Results: In England and Wales, the recorded incidence of female breast cancer in the UOQ rose from 47.9% in 1979 to 53.3% in 2000, and has done so linearly over time with a correlation coefficient R of +0.71 +/- SD 0.01 (p < 0.001). Analysis of independent data from Scotland showed a similar trend in that recorded female breast cancer had also increased in the UOQ from 38.3% in 1980 to 54.7% in 2001, with a correlation coefficient R for the linear annual increase of +0.80 +/- SD 0.03 (p < 0.001). Conclusion: These results are inconsistent with current views that the high level of UOQ breast cancer is due solely to a greater amount of target epithelial tissue in that region. Identification of the reasons for such a disproportionate site-specific increase could provide clues as to causative factors in breast cancer.
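
As a minimal sketch of the kind of calculation behind the quoted correlation coefficients (not the authors' analysis; the argument names are placeholders), one can correlate calendar year with the annual percentage of cancers recorded in the UOQ:

```python
# Pearson correlation between year and annual UOQ percentage.
from scipy.stats import pearsonr

def annual_trend_correlation(years, uoq_percentages):
    r, p_value = pearsonr(years, uoq_percentages)
    return r, p_value

# Usage: annual_trend_correlation(list(range(1979, 2001)), uoq_percent_by_year)
```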

Relevance: 10.00%

Abstract:

13C-2H correlation NMR spectroscopy (13C-2H COSY) permits the identification of 13C and 2H nuclei which are connected to one another by a single chemical bond via the sizeable 1JCD coupling constant. The practical development of this technique is described using a 13C-2H COSY pulse sequence which is derived from the classical 13C-1H correlation experiment. An example is given of the application of 13C-2H COSY to the study of the biogenesis of natural products from the anti-malarial plant Artemisia annua, using a doubly-labelled precursor molecule. Although the biogenesis of artemisinin, the anti-malarial principle from this species, has been extensively studied over the past twenty years, there is still no consensus as to the true biosynthetic route to this important natural product – indeed, some published experimental results are directly contradictory. One possible reason for this confusion may be the ease with which some of the metabolites from A. annua undergo spontaneous autoxidation, as exemplified by our recent in vitro studies of the spontaneous autoxidation of dihydroartemisinic acid, and the application of 13C-2H COSY to this biosynthetic problem has been important in helping to mitigate such processes. In this in vivo application of 13C-2H COSY, [15-13C2H3]-dihydroartemisinic acid (the doubly-labelled analogue of the natural product from this species, which was obtained through synthesis) was fed to A. annua plants and was shown to be converted into several natural products which have been described previously, including artemisinin. It is proposed that all of these transformations occurred via a tertiary hydroperoxide intermediate, which is derived from dihydroartemisinic acid. This intermediate was observed directly in this feeding experiment by the 13C-2H COSY technique; its observation by more traditional procedures (e.g., chromatographic separation, followed by spectroscopic analysis of the purified product) would have been difficult owing to the instability of the hydroperoxide group (as had been established previously by our in vitro studies of the spontaneous autoxidation of dihydroartemisinic acid). This same hydroperoxide has been reported as the initial product of the spontaneous autoxidation of dihydroartemisinic acid in our previous in vitro studies. Its observation in this feeding experiment by the 13C-2H COSY technique, a procedure which requires a minimum of sample manipulation in order to achieve a reliable identification of metabolites (based on both 13C and 2H chemical shifts at the 15-position), provides the best possible evidence for its status as a genuine biosynthetic intermediate, rather than merely as an artifact of the experimental procedure.

Relevance: 10.00%

Abstract:

The evolvability of a software artifact is its capacity for producing heritable or reusable variants; the inverse quality is the artifact's inertia or resistance to evolutionary change. Evolvability in software systems may arise from engineering and/or self-organising processes. We describe our 'Conditional Growth' simulation model of software evolution and show how it can be used to investigate evolvability from a self-organisation perspective. The model is derived from the Bak-Sneppen family of 'self-organised criticality' simulations. It shows good qualitative agreement with Lehman's 'laws of software evolution' and reproduces phenomena that have been observed empirically. The model suggests interesting predictions about the dynamics of evolvability and implies that much of the observed variability in software evolution can be accounted for by comparatively simple self-organising processes.
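
For readers unfamiliar with that family of models, the classic Bak-Sneppen update rule is sketched below (a generic illustration using the usual one-dimensional, periodic-boundary formulation, not the authors' 'Conditional Growth' model): repeatedly find the least-fit site and redraw its fitness together with that of its neighbours.

```python
# Classic Bak-Sneppen self-organised criticality model (generic sketch).
import random

def bak_sneppen(n_sites=100, n_steps=10_000, seed=0):
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n_sites)]
    for _ in range(n_steps):
        weakest = min(range(n_sites), key=fitness.__getitem__)
        for i in (weakest - 1, weakest, weakest + 1):   # weakest site and neighbours
            fitness[i % n_sites] = rng.random()         # periodic boundary
    return fitness
```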

Relevance: 10.00%

Abstract:

The effects of nano-scale and micro-scale zerovalent iron (nZVI and mZVI) particles on general (dehydrogenase and hydrolase) and specific (ammonia oxidation potential, AOP) activities mediated by the microbial community in an uncontaminated soil were examined. nZVI (diameter 12.5 nm; 10 mg g−1 soil) apparently inhibited AOP, and both nZVI and mZVI apparently stimulated dehydrogenase activity but had minimal influence on hydrolase activity. Sterile experiments revealed that the apparent inhibition of AOP could not be interpreted as such due to the confounding action of the particles, whereas the nZVI-enhanced dehydrogenase activity could represent either the genuine response of a stimulated microbial population or an artifact of ZVI reactivity. Overall, there was no evidence for negative effects of nZVI or mZVI on the processes studied. When examining the impact of redox-active particles such as ZVI on microbial oxidation–reduction reactions, potential confounding effects of the test particles on assay conditions should be considered.

Relevance: 10.00%

Abstract:

In a recent paper, Hu et al. (2011) suggest that the recovery of stratospheric ozone during the first half of this century will significantly enhance free tropospheric and surface warming caused by the anthropogenic increase of greenhouse gases, with the effects being most pronounced in Northern Hemisphere middle and high latitudes. These surprising results are based on a multi-model analysis of CMIP3 model simulations with and without prescribed stratospheric ozone recovery. Hu et al. suggest that in order to properly quantify the tropospheric and surface temperature response to stratospheric ozone recovery, it is necessary to run coupled atmosphere-ocean climate models with stratospheric ozone chemistry. The results of such an experiment are presented here, using a state-of-the-art chemistry-climate model coupled to a three-dimensional ocean model. In contrast to Hu et al., we find a much smaller Northern Hemisphere tropospheric temperature response to ozone recovery, which is of opposite sign. We suggest that their result is an artifact of the incomplete removal of the large effect of greenhouse gas warming between the two different sets of models.