989 results for A-not-B error


Relevance: 40.00%

Publisher:

Abstract:

Over the last several decades, debates on the 'tempo and mode' of evolution have centered on the question of whether morphological evolution preferentially occurs gradually or in a punctuated fashion, i.e., with long periods of stasis alternating with short periods of rapid morphological change and generation of new species. Another major debate focuses on the question of whether long-term evolution is driven, or at least strongly influenced, by changes in the environment or by interaction with other life forms. Microfossils offer a unique opportunity to obtain both the large datasets and the precise dating of successive samples needed to study these two questions. We present high-resolution analyses of selected calcareous nannofossils from the deep-sea section recovered at ODP Site 1262 (Leg 208) in the southeastern Atlantic. The studied section encompasses nannofossil Zones NP4-NP12 (equivalent to CP3-CP10) and Chrons C27r-C24n. We document more than 70 biohorizons occurring over an interval of about 10 Myr (~62.5 Ma to ~52.5 Ma) and discuss their reliability and reproducibility with respect to previous data, thus providing an improved biostratigraphic framework, which we relate to magnetostratigraphic information and present according to two possible options of a new Paleocene stratigraphic framework based on cyclostratigraphy. This new framework enabled us to tentatively reconstruct steps in the evolution of early Paleogene calcareous nannoplankton through documentation of transitional morphotypes between genera and/or species and of the phylogenetic relations between the genera Fasciculithus, Heliolithus, Discoasteroides and Discoaster, as well as between Rhomboaster and Tribrachiatus. The exceptional record provided by the continuous, composite sequence recovered at Walvis Ridge allows us to describe the mode of evolution among calcareous nannoplankton: new genera and/or species usually originated through branching of lineages via gradual, but relatively rapid, morphological transitions, as documented by the presence of intermediate forms between the end-member ancestral and descendant forms. Significant modifications in the calcareous nannofossil assemblages are often related to significant changes in environmental conditions, but the appearance of structural innovations and radiations within a single genus also occurred during "stable" environmental conditions. These lines of evidence suggest that nannoplankton evolution is not always directly triggered by stressed environmental conditions but could also be driven by endogenous biotic control.

Relevance: 40.00%

Publisher:

Abstract:

The reliability of measurement refers to unsystematic error in observed responses. Investigations of the prevalence of random error in stated estimates of willingness to pay (WTP) are important to an understanding of why tests of validity in contingent valuation (CV) can fail. However, published reliability studies have tended to adopt empirical methods that have practical and conceptual limitations when applied to WTP responses. This contention is supported by a review of contingent valuation reliability studies that demonstrates important limitations of existing approaches to WTP reliability. It is argued that empirical assessments of the reliability of contingent values may be better dealt with by using multiple indicators to measure the latent WTP distribution. This latent-variable approach is demonstrated with data obtained from a WTP study for stormwater pollution abatement. Attitude variables were employed as a way of assessing the reliability of open-ended WTP (with benchmarked payment cards) for stormwater pollution abatement. The results indicated that participants' decisions to pay were reliably measured, but not the magnitude of their WTP bids. This finding highlights the need to better discern what is actually being measured in WTP studies.
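
As a hedged illustration of the multiple-indicator idea, the sketch below computes Cronbach's alpha, a classical internal-consistency reliability estimate, from several attitude indicators. It is not the latent-variable model used in the study, and the data it runs on are simulated for illustration only.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_indicators) array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each indicator
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical attitude indicators driven by one latent attitude (illustrative data, not the study's)
rng = np.random.default_rng(0)
latent = rng.normal(size=200)                                            # unobserved attitude
indicators = np.column_stack([latent + rng.normal(scale=0.8, size=200)  # noisy observed indicators
                              for _ in range(4)])
print(round(cronbach_alpha(indicators), 2))  # values near 1 indicate reliable measurement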

Relevance: 40.00%

Publisher:

Abstract:

Paradoxically, while peripheral self-tolerance exists for constitutively presented somatic self-Ag, self-peptide recognized in the context of MHC class II has been shown to sensitize T cells for subsequent activation. We have shown that MHC class II⁺CD86⁺CD40⁻ dendritic cells (DC), which can be generated from bone marrow in the presence of an NF-κB inhibitor, and which constitutively populate peripheral tissues and lymphoid organs in naive animals, can induce Ag-specific tolerance. In this study, we show that CD40⁻ human monocyte-derived DC, generated in the presence of an NF-κB inhibitor, signal phosphorylation of TCRζ but induce little proliferation or IFN-γ in vitro. Proliferation is arrested in the G₁/G₀ phase of the cell cycle. Surprisingly, the responding T cells are neither anergic nor regulatory, but are sensitized for subsequent IFN-γ production. The data indicate that signaling through NF-κB determines the capacity of DC to stimulate T cell proliferation. Functionally, NF-κB⁻CD40⁻class II⁺ DC may either tolerize or sensitize T cells. Thus, while CD40⁻ DC appear to prime or prepare T cells, the data imply that signals derived from other cells drive the generation of either Ag-specific regulatory or effector cells in vivo.

Relevance: 40.00%

Publisher:

Abstract:

We demonstrate a quantum error correction scheme that protects against accidental measurement, using a parity encoding where the logical state of a single qubit is encoded into two physical qubits using a nondeterministic photonic controlled-NOT gate. For the single-qubit input states |0⟩, |1⟩, |0⟩ ± |1⟩, and |0⟩ ± i|1⟩, our encoder produces the appropriate two-qubit encoded state with an average fidelity of 0.88 ± 0.03, and the single-qubit decoded states have an average fidelity of 0.93 ± 0.05 with the original state. We are able to decode the two-qubit state (up to a bit flip) by performing a measurement on one of the qubits in the logical basis; we find that the 64 one-qubit decoded states arising from 16 real and imaginary single-qubit superposition inputs have an average fidelity of 0.96 ± 0.03.
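
As a minimal numerical sketch of the parity encoding described above (an ideal CNOT acting on state vectors in NumPy, not the nondeterministic photonic gate used in the experiment), the code below maps an input a|0⟩ + b|1⟩ plus an ancilla prepared in |0⟩ to the encoded state a|00⟩ + b|11⟩ and evaluates its fidelity with the ideal target; the experiment reported an average encoding fidelity of 0.88 ± 0.03 for such states.

import numpy as np

# Ideal CNOT in the basis |00>, |01>, |10>, |11>, with the first (input) qubit as control
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def parity_encode(psi):
    """Encode a single-qubit state a|0> + b|1> into the two-qubit state a|00> + b|11>."""
    ancilla = np.array([1, 0], dtype=complex)   # ancilla prepared in |0>
    return CNOT @ np.kron(psi, ancilla)         # CNOT copies the logical value onto the ancilla

def fidelity(state, target):
    """Pure-state fidelity |<target|state>|^2."""
    return abs(np.vdot(target, state)) ** 2

psi = np.array([1, 1], dtype=complex) / np.sqrt(2)          # the |0> + |1> input (normalized)
ideal = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
print(fidelity(parity_encode(psi), ideal))                  # 1.0 for the ideal gate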

Relevance: 40.00%

Publisher:

Abstract:

Operator quantum error correction is a recently developed theory that provides a generalized and unified framework for active error correction and passive error-avoiding schemes. In this Letter, we describe these codes using the stabilizer formalism. This is achieved by adding a gauge group to stabilizer codes that defines an equivalence class between encoded states. Gauge transformations leave the encoded information unchanged; their effect is absorbed by virtual gauge qubits that do not carry useful information. We illustrate the construction by identifying a gauge symmetry in Shor's 9-qubit code that allows us to remove 3 of its 8 stabilizer generators, leading to a simpler decoding procedure and a wider class of logical operations without affecting its essential properties. This opens a path to possible improvements of the error threshold of fault-tolerant quantum computing.
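
For orientation (the generators are not listed in the abstract, and the qubit labelling here follows the common textbook convention rather than anything stated in the Letter), Shor's 9-qubit code is usually written with the eight stabilizer generators

Z₁Z₂, Z₂Z₃, Z₄Z₅, Z₅Z₆, Z₇Z₈, Z₈Z₉, X₁X₂X₃X₄X₅X₆, X₄X₅X₆X₇X₈X₉,

so that 9 - 8 = 1 encoded qubit remains as the logical qubit. Removing 3 generators, as described above, leaves 5; the standard stabilizer counting n - s then gives 9 - 5 = 4 encoded qubits, of which one is the logical qubit and the other three are the virtual gauge qubits that carry no useful information.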