986 results for SAMPLE ERROR


Relevance:

20.00%

Publisher:

Abstract:

The application of biological effect monitoring for the detection of environmental chemical exposure in domestic animals is still in its infancy. This study investigated blood sample preparations in vitro for their use in biological effect monitoring. When peripheral blood mononuclear cells (PBMCs), isolated following the collection of multiple blood samples from sheep in the field, were cryopreserved and subsequently cultured for 24 hours, a reduction in cell viability (<80%) was attributed to delays in processing following collection. Alternative blood sample preparations using rat and sheep blood demonstrated that 3- to 5-hour incubations can be undertaken without significant alterations in lymphocyte viability; however, a substantial reduction in viability was observed after 24 hours in frozen blood. Early and late apoptosis, as well as increased levels of reactive oxygen species (ROS), were detectable in frozen sheep blood samples. The addition of ascorbic acid partly reversed this effect and reduced the loss in cell viability. The response of the rat and sheep blood sample preparations to genotoxic compounds ex vivo showed that EMS caused comparable dose-dependent genotoxic effects in all sample preparations (fresh and frozen), as detected by the Comet assay. In contrast, the effects of CdCl2 depended on both the duration of exposure and the sample preparation. Analysis of leukocyte subsets in frozen sheep blood showed no alterations in the percentages of T and B lymphocytes but a major decrease in the percentage of granulocytes compared with fresh samples. The percentages of IFN-γ- and IL-4-, but not IL-6-, positive cells were comparable between fresh and frozen sheep blood after 4-hour stimulation with phorbol 12-myristate 13-acetate and ionomycin (PMA+I). These results show that frozen blood gives responses comparable to fresh blood samples in the toxicological and immune assays used.

Relevance:

20.00%

Publisher:

Abstract:

This thesis explores the use of electromagnetics for both steering and tracking of medical instruments in minimally invasive surgeries. The end application is virtual navigation of the lung for biopsy of early-stage cancer nodules. Navigation to the peripheral regions of the lung is difficult due to the physical dimensions of the bronchi, and current methods have low success rates for accurate diagnosis. Firstly, the potential use of DC magnetic fields for the actuation of catheter devices with permanently magnetised distal attachments is investigated. Catheter models formed from various materials and magnetic tip formations are used to examine the usefulness of relatively low-power, compact electromagnets. The force and torque that can be exerted on a small permanent magnet are shown to be extremely limited. Hence, after this initial investigation we turn our attention to electromagnetic tracking, in the development of a novel, low-cost implementation of a GPS-like system for navigating within a patient. A planar magnetic transmitter, formed on a printed circuit board for low-profile, low-cost manufacture, is used to generate a low-frequency magnetic field distribution which is detected by a small induction coil sensor. The field transmitter is controlled by a novel closed-loop system that ensures a highly stable magnetic field with reduced interference from one transmitter coil to another. Efficient demodulation schemes are presented which utilise synchronous detection of each magnetic field component experienced by the sensor. The overall tracking error of the system is shown to be less than 2 mm, with an orientation error of less than 1°. A novel demodulation implementation using a unique undersampling approach allows the use of reduced sample rates without loss of tracking accuracy, which is advantageous for embedded microcontroller implementations of EM tracking systems.
The EM tracking system is demonstrated in the pre-clinical environment of a breathing lung phantom. The airways of the phantom are successfully navigated using the system in combination with a 3D computer model rendered from CT data. Registration is achieved using both a landmark rigid registration method and a hybrid fiducial-free approach. The design of a planar magnetic shield structure for blocking the effects of metallic distortion from below the transmitter is presented, which successfully blocks the impact of large ferromagnetic objects such as operating tables. A variety of shielding materials are analysed, with MuMetal and ferrite both providing excellent shielding performance and an increased signal-to-noise ratio. Finally, the effect of conductive materials and human tissue on magnetic field measurements is presented. Error due to induced eddy currents and capacitive coupling is shown to severely affect EM tracking accuracy at higher frequencies.
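The synchronous detection mentioned above can be sketched as a basic lock-in measurement: mix the sensed signal with in-phase and quadrature references at the transmitter frequency and average. A minimal numerical sketch, with the frequency, amplitude, and noise level invented for illustration rather than taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: one transmitter coil at 1 kHz, sensor signal
# sampled at 20 kHz over 0.1 s (an integer number of carrier cycles).
fs, f0 = 20_000, 1_000
t = np.arange(0, 0.1, 1 / fs)
amplitude, phase = 0.37, 0.6
signal = amplitude * np.cos(2 * np.pi * f0 * t + phase) \
         + 0.05 * rng.standard_normal(t.size)         # additive noise

# Synchronous (lock-in) detection: mix with quadrature references, average.
i_comp = 2 * np.mean(signal * np.cos(2 * np.pi * f0 * t))   # in-phase
q_comp = -2 * np.mean(signal * np.sin(2 * np.pi * f0 * t))  # quadrature
est_amplitude = float(np.hypot(i_comp, q_comp))
est_phase = float(np.arctan2(q_comp, i_comp))
```

Averaging over many carrier cycles is what suppresses the noise and the interfering frequency components; a real multi-coil system would repeat this per transmitter frequency.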

Relevance:

20.00%

Publisher:

Abstract:

Background: Spirituality is fundamental to all human beings, existing within a person and developing until death. This research sought to operationalise spirituality in a sample of individuals with chronic illness. A review of the conceptual literature identified three dimensions of spirituality: connectedness, transcendence, and meaning in life. A review of the empirical literature identified one instrument that measures the three dimensions together. Yet, recent appraisals of this instrument highlighted issues with item formulation and limited evidence of reliability and validity. Aim: The aim of this research was to develop a theoretically-grounded instrument to measure spirituality – the Spirituality Instrument-27 (SpI-27). A secondary aim was to psychometrically evaluate this instrument in a sample of individuals with chronic illness (n=249). Methods: A two-phase design was adopted. Phase one consisted of the development of the SpI-27 based on item generation from a concept analysis, a literature review, and an instrument appraisal. The second phase established the psychometric properties of the instrument and included: a qualitative descriptive design to establish content validity; a pilot study to evaluate the mode of administration; and a descriptive correlational design to assess the instrument’s reliability and validity. Data were analysed using SPSS (Version 18). Results: Exploratory factor analysis yielded a final five-factor solution with 27 items. These five factors were labelled: Connectedness with Others, Self-Transcendence, Self-Cognisance, Conservationism, and Connectedness with a Higher Power. Cronbach’s alpha coefficients ranged from 0.823 to 0.911 for the five factors, and 0.904 for the overall scale, indicating high internal consistency. Paired-sample t-tests, intra-class correlations, and weighted kappa values supported the temporal stability of the instrument over 2 weeks.
A significant positive correlation was found between the SpI-27 and the Spirituality Index of Well-Being, providing evidence for convergent validity. Conclusion: This research addresses a call for a theoretically-grounded instrument to measure spirituality.
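Cronbach's alpha, the internal-consistency measure reported above, can be computed directly from an item-score matrix as α = k/(k−1) · (1 − Σ var(item_i) / var(total)). A minimal sketch with invented toy data (not the study's SpI-27 responses):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 respondents answering 3 correlated Likert items.
scores = np.array([
    [4, 5, 4],
    [3, 3, 3],
    [5, 5, 5],
    [2, 2, 3],
    [4, 4, 5],
])
alpha = cronbach_alpha(scores)   # high alpha: items move together
```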

Relevance:

20.00%

Publisher:

Abstract:

New compensation methods are presented that can greatly reduce the slit errors (i.e. transition location errors) and interval errors induced by non-idealities in optical incremental (square-wave) encoders. An M/T-type, constant sample-time digital tachometer (CSDT) is selected for measuring the velocity of the sensor drives. Using this data, three encoder compensation techniques (two pseudoinverse-based methods and an iterative method) are presented that improve velocity measurement accuracy. The methods do not require precise knowledge of shaft velocity. During the initial learning stage of the compensation algorithm (possibly performed in situ), slit errors/interval errors are calculated either through pseudoinverse-based solutions of simple approximate linear equations, which can provide fast solutions, or through an iterative method that requires very little memory storage. Subsequent operation of the motion system utilizes the adjusted slit positions for more accurate velocity calculation. In the theoretical analysis of the compensation of encoder errors, error sources such as random electrical noise and error in the estimated reference velocity are considered. Initially, the proposed learning compensation techniques are validated by implementing the algorithms in MATLAB, showing a 95% to 99% improvement in velocity measurement. However, it is also observed that the efficiency of the algorithm decreases with a higher presence of non-repetitive random noise and/or with errors in the reference velocity calculations. The performance improvement in velocity measurement is also demonstrated experimentally using motor-drive systems, each of which includes a field-programmable gate array (FPGA) for CSDT counting/timing purposes and a digital signal processor (DSP). Results from open-loop velocity measurement and closed-loop servo-control applications, on three optical incremental square-wave encoders and two motor drives, are compiled.
While implementing these algorithms experimentally on different drives (with and without a flywheel) and on encoders of different resolutions, slit error reductions of 60% to 86% are obtained (typically approximately 80%).
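The pseudoinverse-based learning stage can be illustrated on a toy model: at constant velocity, each measured edge-to-edge interval is a linear function of the adjacent slit errors, so the errors follow from a least-squares solve. A sketch with invented parameters (slit count, speed, error and noise magnitudes are not the paper's, and the true velocity is assumed known here, which the paper's methods do not require):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical N-slit encoder spinning at a known constant angular velocity.
N, revs = 16, 50
omega = 2 * np.pi * 10                  # rad/s (10 rev/s)
true_err = rng.normal(0, 1e-3, N)       # slit position errors (rad)
true_err -= true_err.mean()             # zero-mean convention

# Simulated intervals over many revolutions, plus timing noise:
# tau_i = (2*pi/N + e_i - e_{i-1}) / omega.
nominal = 2 * np.pi / N
i = np.arange(N)
tau_one_rev = (nominal + true_err - true_err[i - 1]) / omega
tau = np.tile(tau_one_rev, revs) + rng.normal(0, 1e-7, N * revs)

# Average per slit, then solve D e = b with the pseudoinverse, where D is
# the circulant first-difference matrix (rank N-1); pinv returns the
# minimum-norm, i.e. zero-mean, slit-error estimate.
tau_mean = tau.reshape(revs, N).mean(axis=0)
b = omega * tau_mean - nominal
D = np.eye(N) - np.roll(np.eye(N), -1, axis=1)   # row i: e_i - e_{i-1}
est_err = np.linalg.pinv(D) @ b
```

Subtracting `est_err` from the nominal slit positions then gives the corrected transition locations used in later velocity calculations.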

Relevance:

20.00%

Publisher:

Abstract:

The Leaving Certificate (LC) is the national, standardised state examination in Ireland necessary for entry to third-level education – this presents a massive, raw corpus of data with the potential to yield invaluable insight into the phenomena of learner interlanguage. With samples of official LC Spanish examination data, this project has compiled a digitised corpus of learner Spanish comprising the written and oral production of 100 candidates. This corpus was then analysed using a specific investigative corpus technique, Computer-aided Error Analysis (CEA; Dagneaux et al., 1998). CEA is a powerful apparatus in that it greatly facilitates the quantification and analysis of a large learner corpus in digital format. The corpus was both compiled and analysed with the use of UAM Corpus Tool (O’Donnell, 2013). This tool allows for the recording of candidate-specific variables such as grade, examination level, task type, and gender, therefore allowing for critical analysis of the corpus as one unit, as separate written and oral sub-corpora, and of performance per task, level, and gender. This is an interdisciplinary work combining aspects of Applied Linguistics, Learner Corpus Research, and Foreign Language (FL) Learning. Beginning with a review of the context of FL learning in Ireland and Europe, I go on to discuss the disciplinary context and theoretical framework for this work and outline the methodology applied. I then perform detailed quantitative and qualitative analyses before combining all research findings and outlining the principal conclusions. This investigation does not make a priori assumptions about the data set, the LC Spanish examination, the context of FLs, or any aspect of learner competence. It undertakes to provide the linguistic research community and the domain of Spanish language learning and pedagogy in Ireland with an empirical, descriptive profile of real learner performance, characterising learner difficulty.
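At its simplest, the quantification step of a CEA-style analysis reduces to cross-tabulating error tags against candidate variables. A minimal sketch with an invented tagset and records (not the project's actual annotation scheme or UAM Corpus Tool export format):

```python
from collections import Counter

# Hypothetical error annotations: (candidate_id, exam_level, error_tag).
annotations = [
    ("c01", "Higher", "GND"),    # gender agreement
    ("c01", "Higher", "VT"),     # verb tense
    ("c02", "Ordinary", "GND"),
    ("c02", "Ordinary", "ORT"),  # orthography
    ("c03", "Higher", "GND"),
]

# Error frequencies overall and per examination level.
overall = Counter(tag for _, _, tag in annotations)
by_level = Counter((level, tag) for _, level, tag in annotations)
```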

Relevance:

20.00%

Publisher:

Abstract:

We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user. © 2010 The American Physical Society.
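The trace distance invoked above has the standard definition, stated here for reference:

```latex
D(\rho, \sigma) \;=\; \tfrac{1}{2}\,\mathrm{Tr}\,\lvert \rho - \sigma \rvert
             \;=\; \tfrac{1}{2}\,\lVert \rho - \sigma \rVert_{1},
\qquad
\lvert A \rvert \equiv \sqrt{A^{\dagger} A} .
```

The "value set by the user" then corresponds to a chosen error threshold ε, and the available computation time is the interval over which D(ρ_real, ρ_ideal) ≤ ε holds.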

Relevance:

20.00%

Publisher:

Abstract:

The ground state structure of C(4N+2) rings is believed to exhibit a geometric transition from angle alternation (N ≤ 2) to bond alternation (N > 2). All previous density functional theory (DFT) studies on these molecules have failed to reproduce this behavior by predicting either that the transition occurs at too large a ring size, or that the transition leads to a higher symmetry cumulene. Employing the recently proposed perspective of delocalization error within DFT we rationalize this failure of common density functional approximations (DFAs) and present calculations with the rCAM-B3LYP exchange-correlation functional that show an angle-to-bond-alternation transition between C(10) and C(14). The behavior exemplified here manifests itself more generally as the well known tendency of DFAs to bias toward delocalized electron distributions as favored by Hückel aromaticity, of which the C(4N+2) rings provide a quintessential example. Additional examples are the relative energies of the C(20) bowl, cage, and ring isomers; we show that the results from functionals with minimal delocalization error are in good agreement with CCSD(T) results, in contrast to other commonly used DFAs. An unbiased DFT treatment of electron delocalization is a key for reliable prediction of relative stability and hence the structures of complex molecules where many structure stabilization mechanisms exist.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: The rate of emergence of human pathogens is steadily increasing; most of these novel agents originate in wildlife. Bats, remarkably, are the natural reservoirs of many of the most pathogenic viruses in humans. There are two bat genome projects currently underway, a circumstance that promises to speed the discovery of host factors important in the coevolution of bats with their viruses. These genomes, however, are not yet assembled and one of them will provide only low coverage, making the inference of most genes of immunological interest error-prone. Many more wildlife genome projects are underway and intend to provide only shallow coverage. RESULTS: We have developed a statistical method for the assembly of gene families from partial genomes. The method takes full advantage of the quality scores generated by base-calling software, incorporating them into a complete probabilistic error model, to overcome the limitation inherent in the inference of gene family members from partial sequence information. We validated the method by inferring the human IFNA genes from the genome trace archives, and used it to infer 61 type-I interferon genes, and single type-II interferon genes, in each of the bats Pteropus vampyrus and Myotis lucifugus. We confirmed our inferences by direct cloning and sequencing of IFNA, IFNB, IFND, and IFNK in P. vampyrus, and by demonstrating transcription of some of the inferred genes by known interferon-inducing stimuli. CONCLUSION: The statistical trace assembler described here provides a reliable method for extracting information from the many available and forthcoming partial or shallow genome sequencing projects, thereby facilitating the study of a wider variety of organisms with ecological and biomedical significance to humans than would otherwise be possible.
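The quality scores the error model builds on map to error probabilities via the Phred convention, P_err = 10^(−Q/10). A toy quality-aware consensus at a single position, in the spirit of such a probabilistic model (the calls and qualities are invented, and this naive independent-error model is not the paper's actual trace assembler):

```python
import math

def phred_to_perror(q: int) -> float:
    """Phred quality Q -> probability the base call is wrong: 10**(-Q/10)."""
    return 10 ** (-q / 10)

# Hypothetical position covered by three reads: (called base, Phred quality).
calls = [("A", 30), ("A", 20), ("C", 10)]

# Posterior over the true base: flat prior, independent calls, and a
# miscalled base assumed uniform over the other three nucleotides.
loglik = {}
for b in "ACGT":
    ll = 0.0
    for called, q in calls:
        p = phred_to_perror(q)
        ll += math.log(1 - p if called == b else p / 3)
    loglik[b] = ll
norm = math.log(sum(math.exp(v) for v in loglik.values()))
post = {b: math.exp(v - norm) for b, v in loglik.items()}
best = max(post, key=post.get)   # consensus call
```

Here the two high-quality 'A' calls dominate the single low-quality 'C', which is the behavior that makes quality-weighted inference more robust than simple voting.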

Relevance:

20.00%

Publisher:

Abstract:

Droplet-based digital microfluidics technology has now come of age, and software-controlled biochips for healthcare applications are starting to emerge. However, today's digital microfluidic biochips suffer from the drawback that there is no feedback to the control software from the underlying hardware platform. Due to the lack of precision inherent in biochemical experiments, errors are likely during droplet manipulation; error recovery based on the repetition of experiments leads to wastage of expensive reagents and hard-to-prepare samples. By exploiting recent advances in the integration of optical detectors (sensors) into a digital microfluidics biochip, we present a physical-aware system reconfiguration technique that uses sensor data at intermediate checkpoints to dynamically reconfigure the biochip. A cyberphysical resynthesis technique is used to recompute electrode-actuation sequences, thereby deriving new schedules, module placement, and droplet routing pathways, with minimum impact on the time-to-response. © 2012 IEEE.
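The checkpointing idea can be sketched as a control loop: run each droplet operation, verify checkpointed steps against the sensor, and invoke a recovery sequence rather than repeating the whole assay. The step names, sensor interface, and retry limit below are invented for illustration; this is not the paper's biochip model or resynthesis algorithm:

```python
# A control-loop sketch of sensor-checkpointed execution.
def run_with_checkpoints(steps, sensor, recover, max_retries=3):
    """Run each (name, op, checkpointed) step; after checkpointed steps,
    verify via the sensor and run a recovery routine on failure instead of
    repeating the whole assay."""
    for name, op, checkpointed in steps:
        op()
        if not checkpointed:
            continue
        retries = 0
        while not sensor(name):              # e.g. optical detector off-spec
            if retries == max_retries:
                raise RuntimeError(f"step {name!r}: unrecoverable error")
            recover(name)                    # resynthesized recovery sequence
            retries += 1

# Toy run: the "mix" checkpoint fails once, triggering one recovery pass.
log = []
fail_once = {"mix": 1}
def sensor(name):
    if fail_once.get(name, 0) > 0:
        fail_once[name] -= 1
        return False
    return True

steps = [("dispense", lambda: log.append("dispense"), False),
         ("mix", lambda: log.append("mix"), True)]
run_with_checkpoints(steps, sensor, lambda name: log.append(f"recover:{name}"))
# log is now ["dispense", "mix", "recover:mix"]
```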

Relevance:

20.00%

Publisher:

Abstract:

Single-molecule sequencing instruments can generate multikilobase sequences with the potential to greatly improve genome and transcriptome assembly. However, the error rates of single-molecule reads are high, which has limited their use thus far to resequencing bacteria. To address this limitation, we introduce a correction algorithm and assembly strategy that uses short, high-fidelity sequences to correct the error in single-molecule sequences. We demonstrate the utility of this approach on reads generated by a PacBio RS instrument from phage, prokaryotic and eukaryotic whole genomes, including the previously unsequenced genome of the parrot Melopsittacus undulatus, as well as for RNA-Seq reads of the corn (Zea mays) transcriptome. Our long-read correction achieves >99.9% base-call accuracy, leading to substantially better assemblies than current sequencing strategies: in the best example, the median contig size was quintupled relative to high-coverage, second-generation assemblies. Greater gains are predicted if read lengths continue to increase, including the prospect of single-contig bacterial chromosome assembly.
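The correction idea (vote high-fidelity short reads against each covered position of a long, noisy read) can be shown on a toy example in which the short-read alignments are simply given; real pipelines compute those alignments, so this majority-vote sketch is an illustration of the principle, not the paper's algorithm:

```python
from collections import Counter

def correct_long_read(long_read, short_reads):
    """short_reads: list of (offset, sequence) high-fidelity fragments
    already aligned to the long read; vote per covered position."""
    corrected = list(long_read)
    votes = [Counter() for _ in long_read]
    for offset, seq in short_reads:
        for i, base in enumerate(seq):
            votes[offset + i][base] += 1
    for pos, counter in enumerate(votes):
        if counter:                               # covered position: vote
            corrected[pos] = counter.most_common(1)[0][0]
    return "".join(corrected)

# Invented example: position 4 of the noisy read should be 'T', not 'A'.
noisy = "ACGTAACGT"
shorts = [(2, "GTTA"), (3, "TTAC"), (4, "TACG")]
fixed = correct_long_read(noisy, shorts)   # -> "ACGTTACGT"
```

Uncovered positions fall back to the original base call, which is why deep short-read coverage matters for the accuracy figures quoted above.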

Relevance:

20.00%

Publisher:

Abstract:

An analysis is carried out, using the prolate spheroidal wave functions, of certain regularized iterative and noniterative methods previously proposed for the achievement of object restoration (or, equivalently, spectral extrapolation) from noisy image data. The ill-posedness inherent in the problem is treated by means of a regularization parameter, and the analysis shows explicitly how the deleterious effects of the noise are then contained. The error in the object estimate is also assessed, and it is shown that the optimal choice for the regularization parameter depends on the signal-to-noise ratio. Numerical examples are used to demonstrate the performance of both unregularized and regularized procedures and also to show how, in the unregularized case, artefacts can be generated from pure noise. Finally, the relative error in the estimate is calculated as a function of the degree of superresolution demanded for reconstruction problems characterized by low space–bandwidth products.
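The role of the regularization parameter can be illustrated with a generic Tikhonov-regularized inversion of a smoothing operator. This shows the same stabilization idea, not the paper's prolate spheroidal analysis; the operator, noise level, and parameter value are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 60
x_true = np.zeros(n)
x_true[20:40] = 1.0                                  # simple "object"

# Gaussian blurring operator: damps high frequencies -> ill-posed inverse.
idx = np.arange(n)
A = np.exp(-((idx[:, None] - idx[None, :]) ** 2) / (2 * 3.0 ** 2))
A /= A.sum(axis=1, keepdims=True)

y = A @ x_true + 0.01 * rng.standard_normal(n)       # noisy "image" data

def tikhonov(A, y, lam):
    """x_hat = argmin ||A x - y||^2 + lam * ||x||^2 (normal equations)."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

x_reg = tikhonov(A, y, 1e-3)                         # regularized estimate
x_naive = np.linalg.solve(A, y)                      # unregularized solve
err_reg = float(np.linalg.norm(x_reg - x_true))
err_naive = float(np.linalg.norm(x_naive - x_true))  # noise blown up
```

The unregularized solve amplifies the noise through the near-zero singular values of the blur, while the regularized estimate contains it at the cost of some resolution; choosing λ against the signal-to-noise ratio is exactly the trade-off the abstract describes.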

Relevance:

20.00%

Publisher:

Abstract:

Thin-layer and high-performance thin-layer chromatography (TLC/HPTLC) methods for assaying compound(s) in a sample must be validated to ensure that they are fit for their intended purpose and, where applicable, meet the strict regulatory requirements for controlled products. Two validation approaches are identified in the literature: the classic approach and the alternative approach using accuracy profiles. Detailed procedures for the two approaches are discussed based on the validation of methods for pharmaceutical analysis, an area considered to have stricter requirements. Estimation of the measurement uncertainty from the validation approach using accuracy profiles is also described. Examples of HPTLC methods, developed and validated to assay sulfamethoxazole and trimethoprim on the one hand, and lamivudine, stavudine, and nevirapine on the other, in their fixed-dose combination tablets, are further elaborated.

Relevance:

20.00%

Publisher:

Abstract:

It is utopian to think that there are solutions that will close the gap between secondary school and university. Nevertheless, work on the gap problem does make sense, in order to understand and bring closer the ideals and expectations held by the different educational institutions. At the Universidad de los Andes it was evident that such work could be oriented in different directions, with emphasis on the institution, the teachers, or the students. Possible topics included curriculum design, teachers' and students' beliefs and attitudes, teaching methods, conceptions of teaching and learning, learning difficulties and errors, among others. After several missteps in choosing a research topic, we finally decided to explore learning and to focus the study on first-year students, since they are the ones who actually live through the transition from school to university. Furthermore, we restricted ourselves to the area of precalculus, motivated in part by the higher failure rate in that course. Specifically, the general objective proposed was to describe a profile of mathematical learning of the Precálculo student at the moment of entering the university. From this objective the main problem of the project was derived: to define the conceptual elements with which to articulate the description of that profile. The presentation is divided into four parts: the first sets out a conceptual framework presenting the elements with which the profile will be described; the second and third deal, respectively, with the research methodology and the results obtained; and the last with the conclusions of the work.