64 results for "sequencing error"


Relevance: 20.00%

Abstract:

The nitrogen removal capacity of a suspended-culture system treating mature landfill leachate was investigated. Leachate containing high ammonium levels of 300-900 mg N/L was nitrified in a bench-scale sequencing batch reactor. Leachate from four different landfills was treated over a two-year period for the removal of nitrogen. In this time, a highly specific nitrifying culture was attained that delivered exceptionally high rates of ammonia removal. To increase throughput, no sludge was wasted from the system, and up to 13 g/L of MLSS was obtained. Settleability of the purely nitrifying biomass was excellent, with an SVI of less than 40 mL/g even at the high sludge concentrations. Nitrification rates of up to 246 mg N/(L h) (5.91 g N/(L d)) and specific nitrification rates of 36 mg N/(g VSS h) (880 mg N/(g VSS d)) were obtained. The loading to the system at this time allowed complete nitrification of the leachate with a hydraulic retention time of only 5 hours. Following these successful treatability studies, a full-scale plant was designed and built at one of the landfills investigated.
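The two rate figures quoted above are the same measurements expressed per hour and per day. A quick arithmetic check of the conversions (the small discrepancies against the abstract's daily figures suggest the hourly values are rounded; the function names are mine, not from the paper):

```python
# Sanity check of the rate conversions reported in the abstract:
# a volumetric rate in mg N/(L h) converts to g N/(L d) by
# multiplying by 24 h/d and dividing by 1000 mg/g.

def mg_per_l_h_to_g_per_l_d(rate_mg_l_h):
    """Convert mg N/(L h) to g N/(L d)."""
    return rate_mg_l_h * 24 / 1000

def mg_per_gvss_h_to_mg_per_gvss_d(rate_mg_gvss_h):
    """Convert mg N/(g VSS h) to mg N/(g VSS d)."""
    return rate_mg_gvss_h * 24

volumetric = mg_per_l_h_to_g_per_l_d(246)      # about 5.90 g N/(L d)
specific = mg_per_gvss_h_to_mg_per_gvss_d(36)  # 864 mg N/(g VSS d)
print(volumetric, specific)
```

246 mg N/(L h) works out to 5.90 g N/(L d) and 36 mg N/(g VSS h) to 864 mg N/(g VSS d), consistent with the quoted 5.91 and 880 once rounding of the hourly figures is allowed for.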

Relevance: 20.00%

Abstract:

Simultaneous nitrification and denitrification (SND) via the nitrite pathway and anaerobic-anoxic enhanced biological phosphorus removal (EBPR) are two processes that can significantly reduce the energy and COD demand of nitrogen and phosphorus removal. Combining these two processes has the potential to achieve simultaneous nitrogen and phosphorus removal with a minimal COD requirement. A lab-scale sequencing batch reactor (SBR) was operated in alternating anaerobic-aerobic mode with a low dissolved oxygen (DO) concentration (0.5 mg/L) during the aerobic period, and was demonstrated to accomplish nitrification, denitrification, and phosphorus removal. Under anaerobic conditions, COD was taken up and converted to poly-hydroxyalkanoates (PHAs), accompanied by phosphorus release. In the subsequent aerobic stage, PHA was oxidized and phosphorus was taken up to […]

Relevance: 20.00%

Abstract:

Recent advances in molecular biology have made it possible to use the trace amounts of DNA in faeces to non-invasively sample endangered species for genetic studies. Here we use faeces as a source of DNA and mtDNA sequence data to elucidate the relationships among Spanish and Moroccan populations of great bustards. A total of 834 bp of combined control-region and cytochrome-b mtDNA fragments revealed four variable sites that defined seven closely related haplotypes in 54 individuals. Morocco was fixed for a single mtDNA haplotype that occurs at moderate frequency (28%) in Spain. We could not differentiate among the sampled Spanish populations of Caceres and Andalucia, but these combined populations were differentiated from the Moroccan population. Estimates of gene flow (Nm = 0.82) are consistent with extensive observations on the southern Iberian Peninsula indicating that few individuals fly across the Strait of Gibraltar. We demonstrate that both this sea barrier and mountain barriers in Spain limit dispersal among adjacent great bustard populations to a similar extent. The Moroccan population is of high ornithological significance as it holds the only population of great bustards in Africa. This population is critically small, and genetic and observational data indicate that it is unlikely to be recolonised via immigration from Spain should it be extirpated. In light of the evidence presented here it deserves the maximum level of protection.

Relevance: 20.00%

Abstract:

Dizziness and/or unsteadiness are common symptoms of chronic whiplash-associated disorders. This study aimed to report the characteristics of these symptoms and to determine whether there was any relationship to cervical joint position error. Joint position error, the accuracy of return to the natural head posture following extension and rotation, was measured in 102 subjects with persistent whiplash-associated disorder and 44 control subjects. Whiplash subjects completed a neck pain index and answered questions about the characteristics of dizziness. The results indicated that subjects with whiplash-associated disorders had significantly greater joint position errors than control subjects. Within the whiplash group, those with dizziness had greater joint position errors following rotation than those without dizziness (rotation (R) 4.5° (0.3) vs 2.9° (0.4); rotation (L) 3.9° (0.3) vs 2.8° (0.4), respectively) and a higher neck pain index (55.3% (1.4) vs 43.1% (1.8)). Characteristics of the dizziness were consistent with those reported for a cervical cause, but no characteristics could predict the magnitude of joint position error. Cervical mechanoreceptor dysfunction is a likely cause of dizziness in whiplash-associated disorder.

Relevance: 20.00%

Abstract:

The reliability of measurement refers to unsystematic error in observed responses. Investigations of the prevalence of random error in stated estimates of willingness to pay (WTP) are important to an understanding of why tests of validity in contingent valuation (CV) can fail. However, published reliability studies have tended to adopt empirical methods that have practical and conceptual limitations when applied to WTP responses. This contention is supported in a review of CV reliability studies that demonstrates important limitations of existing approaches to WTP reliability. It is argued that empirical assessments of the reliability of contingent values may be better dealt with by using multiple indicators to measure the latent WTP distribution. This latent-variable approach is demonstrated with data obtained from a WTP study for stormwater pollution abatement. Attitude variables were employed as a way of assessing the reliability of open-ended WTP (with benchmarked payment cards) for stormwater pollution abatement. The results indicated that participants' decisions to pay were reliably measured, but not the magnitude of the WTP bids. This finding highlights the need to better discern what is actually being measured in WTP studies.

Relevance: 20.00%

Abstract:

The use of presence/absence data in wildlife management and biological surveys is widespread. There is a growing interest in quantifying the sources of error associated with these data. Using simulated data, we show that false-negative errors (failure to record a species when in fact it is present) can have a significant impact on statistical estimation of habitat models. We then introduce an extension of logistic modelling, the zero-inflated binomial (ZIB) model, that permits the estimation of the rate of false-negative errors, and the correction of estimates of the probability of occurrence, by using repeated visits to the same site. Our simulations show that even relatively low rates of false negatives bias statistical estimates of habitat effects. With three repeated visits the method eliminates the bias, but estimates are relatively imprecise. Six repeated visits improve precision to levels comparable to those achieved with conventional statistics in the absence of false-negative errors. In general, when error rates are less than or equal to 50%, greater efficiency is gained by adding more sites, whereas when error rates are greater than 50% it is better to increase the number of repeated visits. We highlight the flexibility of the method with three case studies, clearly demonstrating the effect of false-negative errors for a range of commonly used survey methods.
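The effect described above can be sketched with a small simulation. This is an illustrative toy model, not the authors' ZIB estimator: it assumes a constant per-visit detection probability `p` at occupied sites (so the per-visit false-negative rate is `1 - p`) and shows how repeated visits shrink the false-negative bias in naive occupancy estimates. All parameter values and names here are invented for illustration:

```python
import random

def simulate_naive_occupancy(psi, p, n_sites, n_visits, rng):
    """Fraction of sites with at least one detection across visits.

    psi: true probability a site is occupied
    p:   per-visit detection probability at an occupied site
    """
    detected = 0
    for _ in range(n_sites):
        occupied = rng.random() < psi
        if occupied and any(rng.random() < p for _ in range(n_visits)):
            detected += 1
    return detected / n_sites

rng = random.Random(42)
# With one visit the apparent occupancy is roughly psi * p;
# extra visits push the estimate back toward the true psi = 0.5.
for k in (1, 3, 6):
    est = simulate_naive_occupancy(psi=0.5, p=0.7,
                                   n_sites=20000, n_visits=k, rng=rng)
    print(k, round(est, 3))
```

Analytically the apparent occupancy is psi * (1 - (1 - p)^k), which is why a handful of repeated visits recovers most of the truth while a single visit understates it badly.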

Relevance: 20.00%

Abstract:

A quantum circuit implementing 5-qubit quantum error correction on a linear-nearest-neighbor architecture is described. The canonical decomposition is used to construct fast and simple gates that incorporate the necessary swap operations, allowing the circuit to achieve the same depth as the current least-depth circuit. Simulations of the circuit's performance when subjected to discrete and continuous errors are presented. The relationship between the error rate of a physical qubit and that of a logical qubit is investigated, with emphasis on determining the concatenated error-correction threshold.

Relevance: 20.00%

Abstract:

We describe a scheme for quantum-error correction that employs feedback and weak measurement rather than the standard tools of projective measurement and fast controlled unitary gates. The advantage of this scheme over previous protocols [for example, Ahn Phys. Rev. A 65, 042301 (2001)], is that it requires little side processing while remaining robust to measurement inefficiency, and is therefore considerably more practical. We evaluate the performance of our scheme by simulating the correction of bit flips. We also consider implementation in a solid-state quantum-computation architecture and estimate the maximal error rate that could be corrected with current technology.
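For orientation, the baseline that bit-flip-correction schemes like this are measured against can be sketched with the standard three-qubit repetition code, where majority-vote decoding suppresses the logical error rate from p to 3p^2 - 2p^3. This sketch is not the feedback/weak-measurement protocol of the abstract; it only illustrates that baseline relationship between physical and logical error rates:

```python
import random

def logical_error_rate(p, trials, rng):
    """Monte Carlo estimate of the logical error rate of the
    three-qubit bit-flip code: each qubit flips independently with
    probability p, and majority-vote decoding fails when two or
    more of the three qubits have flipped."""
    errors = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(3))
        if flips >= 2:  # majority vote decodes to the wrong bit
            errors += 1
    return errors / trials

def analytic_rate(p):
    """Exact logical error rate: 3p^2(1-p) + p^3 = 3p^2 - 2p^3."""
    return 3 * p**2 - 2 * p**3

rng = random.Random(0)
p = 0.1
print(round(logical_error_rate(p, 100000, rng), 4), analytic_rate(p))
```

For p = 0.1 the logical rate is 0.028, below the physical rate, which is the sense in which encoding helps whenever p < 1/2.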

Relevance: 20.00%

Abstract:

An enhanced biological phosphorus removal (EBPR) system was developed in a sequencing batch reactor (SBR) using propionate as the sole carbon source. The microbial community was followed using fluorescence in situ hybridization (FISH) techniques, and Candidatus 'Accumulibacter phosphatis' was quantified from the start-up of the reactor until steady state. A series of SBR cycle studies was performed when 55% of the SBR biomass was Accumulibacter, a confirmed polyphosphate-accumulating organism (PAO), and when Candidatus 'Competibacter phosphatis', a confirmed glycogen-accumulating organism (GAO), was essentially undetectable. These experiments evaluated two different carbon sources (propionate and acetate), and in every case two different P-release rates were detected. The highest rate took place while there was volatile fatty acid (VFA) in the mixed liquor; after the VFA was depleted, a second P-release rate was observed. This second rate was very similar to the one detected in experiments performed without added VFA. A kinetic and stoichiometric model, developed as a modification of Activated Sludge Model 2 (ASM2) to include glycogen economy, was fitted to the experimental profiles. The calibration and validation of this model were carried out with the cycle-study experiments performed using both VFAs. The effect of pH from 6.5 to 8.0 on anaerobic P-release and VFA-uptake and on aerobic P-uptake was also studied using propionate. The optimal overall working pH was around 7.5. This is the first study of the microbial community involved in EBPR developed with propionate as the sole carbon source, along with detailed process performance investigations of the propionate-utilizing PAOs.

Relevance: 20.00%

Abstract:

In vitro evolution imitates the natural evolution of genes and has been applied very successfully to the modification of coding sequences, but it has not yet been applied to promoter sequences. We propose an alternative method for functional promoter analysis by applying an in vitro evolution scheme consisting of rounds of error-prone PCR, followed by DNA shuffling and selection of mutant promoter activities. We modified the activity in embryogenic sugarcane cells of the promoter region of the Goldfinger isolate of banana streak virus and obtained mutant promoter sequences that showed an average mutation rate of 2.5% after one round of error-prone PCR and DNA shuffling. Selection and sequencing of promoter sequences with decreased or unaltered activity allowed us to rapidly map the position of one cis-acting element that influenced promoter activity in embryogenic sugarcane cells and to discover neutral mutations that did not affect promoter function. Applying the selective-shotgun approach of this promoter analysis method immediately after the promoter boundaries have been defined by 5' deletion analysis dramatically reduces the labor associated with traditional linker-scanning deletion analysis to reveal the position of functional promoter domains. Furthermore, this method allows the entire promoter to be investigated at once, rather than selected domains or nucleotides, increasing the prospect of identifying interacting promoter regions.
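The per-base error model behind error-prone PCR can be sketched as follows. This toy simulation is not the authors' protocol; it only shows that an independent per-base substitution probability of 0.025 reproduces the ~2.5% average mutation rate reported in the abstract. The sequence, seed, and function name are invented for illustration:

```python
import random

BASES = "ACGT"

def mutagenize(seq, mu, rng):
    """Return a copy of seq in which each base is substituted,
    independently with probability mu, by one of the three other
    bases (a simple substitution-only error model)."""
    out = []
    for base in seq:
        if rng.random() < mu:
            out.append(rng.choice([b for b in BASES if b != base]))
        else:
            out.append(base)
    return "".join(out)

rng = random.Random(7)
promoter = "".join(rng.choice(BASES) for _ in range(1000))  # dummy sequence
mutant = mutagenize(promoter, 0.025, rng)
diffs = sum(a != b for a, b in zip(promoter, mutant))
print(diffs / len(promoter))  # observed rate, close to mu = 0.025
```

In a real error-prone PCR library the rate varies between clones and transitions outnumber transversions, so this uniform model is only the roughest baseline for the 2.5% figure.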

Relevance: 20.00%

Abstract:

We demonstrate a quantum error correction scheme that protects against accidental measurement, using a parity encoding where the logical state of a single qubit is encoded into two physical qubits using a nondeterministic photonic controlled-NOT gate. For the single-qubit input states |0⟩, |1⟩, |0⟩ ± |1⟩, and |0⟩ ± i|1⟩ our encoder produces the appropriate two-qubit encoded state with an average fidelity of 0.88 ± 0.03, and the single-qubit decoded states have an average fidelity of 0.93 ± 0.05 with the original state. We are able to decode the two-qubit state (up to a bit flip) by performing a measurement on one of the qubits in the logical basis; we find that the 64 one-qubit decoded states arising from 16 real and imaginary single-qubit superposition inputs have an average fidelity of 0.96 ± 0.03.

Relevance: 20.00%

Abstract:

We describe an implementation of quantum error correction that operates continuously in time and requires no active interventions such as measurements or gates. The mechanism for carrying away the entropy introduced by errors is a cooling procedure. We evaluate the effectiveness of the scheme by simulation, and remark on its connections to some recently proposed error prevention procedures.

Relevance: 20.00%

Abstract:

Vector error-correction models (VECMs) have become increasingly important in their application to financial markets. Standard full-order VECM models assume non-zero entries in all their coefficient matrices. However, applications of VECM models to financial market data have revealed that zero entries are often a necessary part of efficient modelling. In such cases, the use of full-order VECM models may lead to incorrect inferences. Specifically, if indirect causality or Granger non-causality exists among the variables, the use of over-parameterised full-order VECM models may weaken the power of statistical inference. In this paper, it is argued that the zero–non-zero (ZNZ) patterned VECM is a more straightforward and effective means of testing for both indirect causality and Granger non-causality. For a ZNZ patterned VECM framework for time series of integrated order two, we provide a new algorithm to select cointegrating and loading vectors that can contain zero entries. Two case studies are used to demonstrate the usefulness of the algorithm in tests of purchasing power parity and a three-variable system involving the stock market.