10 results for "Error serial correlation"

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance:

30.00%

Publisher:

Abstract:

Pre-treatment HCV quasispecies complexity and diversity may predict response to interferon-based anti-viral therapy. The objective of this study was to retrospectively (1) examine temporal changes in quasispecies prior to the start of therapy and (2) investigate extensively quasispecies evolution in a group of 10 chronically infected patients with genotype 3a, treated with pegylated interferon alpha-2a and ribavirin. The degree of sequence heterogeneity within hypervariable region 1 was assessed by analyzing 20-30 individual clones in serial serum samples. Genetic parameters, including amino acid Shannon entropy, Hamming distance and genetic distance, were calculated for each sample. Treatment outcome was divided into (1) sustained virological responders (SVR) and (2) treatment failure (TF). Our results indicate that (1) quasispecies complexity and diversity are lower in the SVR group, (2) quasispecies vary temporally and (3) genetic heterogeneity at baseline can be used to predict treatment outcome. We discuss the results from the perspective of replicative homeostasis.
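The genetic parameters named above are straightforward to compute from aligned clone sequences. A minimal sketch (the clone sequences are hypothetical, and normalising by ln N is one common convention for quasispecies Shannon entropy):

```python
from collections import Counter
from math import log

def shannon_entropy(clones):
    """Normalised Shannon entropy Sn = -sum(p_i * ln p_i) / ln N,
    where p_i is the frequency of each distinct sequence and N is
    the number of clones sampled (a complexity measure)."""
    n = len(clones)
    counts = Counter(clones)
    s = -sum((c / n) * log(c / n) for c in counts.values())
    return s / log(n) if n > 1 else 0.0

def hamming(a, b):
    """Number of positions at which two aligned sequences differ."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to equal length")
    return sum(x != y for x, y in zip(a, b))

# Hypothetical HVR1 amino acid clones from one serum sample
clones = ["QTTVVGGSQS", "QTTVVGGAQS", "QTTVVGGSQS", "HTTVVGGAQS"]
print(shannon_entropy(clones))        # complexity of the sample
print(hamming(clones[0], clones[1]))  # one pairwise diversity term
```

Genetic distance would be obtained analogously by averaging (corrected) pairwise distances over all clone pairs.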

Relevance:

20.00%

Publisher:

Abstract:

Two classes of techniques have been developed to whiten the quantization noise in digital delta-sigma modulators (DDSMs): deterministic and stochastic. In this two-part paper, a design methodology for reduced-complexity DDSMs is presented. The design methodology is based on error masking. Rules for selecting the word lengths of the stages in multistage architectures are presented. We show that the hardware requirement can be reduced by up to 20% compared with a conventional design, without sacrificing performance. Simulation and experimental results confirm theoretical predictions. Part I addresses MultistAge noise SHaping (MASH) DDSMs; Part II focuses on single-quantizer DDSMs.
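For orientation, the basic building block of the MASH architectures treated in Part I is a first-order accumulator stage: the carry out is the quantised output bit, and the retained residue is the quantisation error passed on. A minimal sketch of a single stage (input value and word length are illustrative, not taken from the paper):

```python
def ddsm_first_order(x, n_bits, n_samples):
    """First-order digital delta-sigma modulator stage (error-feedback
    form). Each cycle the constant input x (0 <= x < 2**n_bits) is added
    to the stored residue; the carry out is the 1-bit output."""
    mod = 1 << n_bits
    acc = 0
    out = []
    for _ in range(n_samples):
        acc += x
        out.append(acc // mod)  # carry (0 or 1)
        acc %= mod              # retained quantisation error
    return out

# Long-run mean of the output bits equals x / 2**n_bits (here 5/16)
bits = ddsm_first_order(5, 4, 1600)
print(sum(bits) / len(bits))  # → 0.3125
```

In a MASH DDSM, several such stages are cascaded (each fed the previous stage's residue) and their outputs combined through a noise-cancellation network; the word-length selection rules in the paper concern `n_bits` of each stage.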

Relevance:

20.00%

Publisher:

Abstract:

Wind energy is the energy source that contributes most to the renewable energy mix of European countries. While there are good wind resources throughout Europe, the intermittency of the wind represents a major problem for the integration of wind energy into electricity networks. To ensure grid security, a Transmission System Operator today needs, for each kilowatt of wind energy, either an equal amount of spinning reserve or a forecasting system that can predict the amount of energy that will be produced from wind over a period of 1 to 48 hours. In the range from 5 m/s to 15 m/s, a wind turbine's production increases with the cube of the wind speed. For this reason, a Transmission System Operator requires an accuracy of 1 m/s for wind speed forecasts in this range. Forecasting wind energy with a numerical weather prediction model in this context forms the background of this work. The author's goal was to present a pragmatic solution to this specific problem in the "real world". This work therefore has to be seen in a technical context and neither provides nor intends to provide a general overview of the benefits and drawbacks of wind energy as a renewable energy source. In the first part of this work, the accuracy requirements of the energy sector for wind speed predictions from numerical weather prediction models are described and analysed. A unique set of numerical experiments was carried out in collaboration with the Danish Meteorological Institute to investigate the forecast quality of an operational numerical weather prediction model for this purpose. The results of this investigation revealed that the accuracy requirements for wind speed and wind power forecasts from today's numerical weather prediction models can only be met at certain times. This means that the uncertainty of the forecast quality becomes a parameter as important as the wind speed and wind power themselves.
Quantifying the uncertainty of a forecast valid for tomorrow requires an ensemble of forecasts. In the second part of this work, such an ensemble of forecasts was designed and verified for its ability to quantify the forecast error. This was accomplished by correlating the measured error with the forecasted uncertainty of area-integrated wind speed and wind power in Denmark and Ireland. A correlation of 93% was achieved in these areas. This method cannot by itself meet the accuracy requirements of the energy sector. By knowing the uncertainty of the forecasts, however, the focus can be put on the accuracy requirements at times when it is possible to accurately predict the weather. This result therefore presents a major step forward in making wind energy a compatible energy source in the future.
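The verification step, correlating the measured forecast error against the forecasted uncertainty (ensemble spread), reduces to a sample correlation. A sketch with hypothetical numbers (the 93% figure in the text comes from the actual Danish and Irish data, not from this toy series):

```python
from math import sqrt

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical area-integrated wind speed series (m/s): forecasted
# ensemble spread vs. measured absolute forecast error, per forecast day
spread = [0.4, 1.1, 0.7, 2.0, 1.5, 0.9]
error  = [0.3, 1.2, 0.8, 1.9, 1.6, 0.7]
print(pearson(spread, error))  # high correlation: spread predicts error size
```

A correlation near 1 means the ensemble spread is a usable predictor of how wrong the deterministic forecast will be, which is exactly what lets the operator trust the forecast selectively.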

Relevance:

20.00%

Publisher:

Abstract:

The atom pencil we describe here is a versatile tool that writes arbitrary structures by atomic deposition in a serial lithographic process. This device consists of a transversely laser-cooled and collimated cesium atomic beam that passes through a 4-pole atom-flux concentrator and impinges onto micron- and sub-micron-sized apertures. The aperture translates above a fixed substrate and enables the writing of sharp features with sizes down to 280 nm. We have investigated the writing and clogging properties of an atom pencil tip fabricated from silicon oxide pyramids perforated at the tip apex with a sub-micron aperture.

Relevance:

20.00%

Publisher:

Abstract:

Colorectal cancer is the most common cause of death due to malignancy in non-smokers in the western world. In 1995 there were 1,757 cases of colon cancer in Ireland. Most colon cancer is sporadic; however, ten percent of cases occur where there is a previous family history of the disease. In an attempt to understand the tumorigenic pathway in Irish colon cancer patients, a number of genes associated with colorectal cancer development were analysed in Irish sporadic and HNPCC colon cancer patients. The hereditary forms of colon cancer include Familial adenomatous polyposis coli (FAP) and Hereditary Non-Polyposis Colon Cancer (HNPCC). Genetic analysis of the gene responsible for FAP (the APC gene) has previously been performed on Irish families; however, the genetic analysis of HNPCC families is limited. In an attempt to determine the mutation spectrum in Irish HNPCC pedigrees, the hMSH2 and hMLH1 mismatch repair genes were screened in 18 Irish HNPCC families. Using SSCP analysis followed by DNA sequencing, five mutations were identified: four novel and one previously reported. In families where a mutation was detected, younger asymptomatic members were screened for the presence of the predisposing mutation (where possible). Detection of mutations is particularly important for the identification of at-risk individuals, as the early diagnosis of cancer can vastly improve the prognosis. The sensitive and efficient detection of multiple different mutations and polymorphisms in DNA is of prime importance for genetic diagnosis and the identification of disease genes. A novel mutation detection technique has recently been developed in our laboratory. In order to assess the efficacy and application of the methodology in the analysis of cancer-associated genes, a protocol for the analysis of the K-ras gene was developed and optimised.
Matched normal and tumour DNA from twenty sporadic colon cancer patients was analysed for K-ras mutations using the Glycosylase Mediated Polymorphism Detection (GMPD) technique. Five mutations of the K-ras gene were detected using this technology. Sequencing analysis verified the presence of the mutations, and SSCP analysis of the same samples did not identify any additional mutations. The GMPD technology proved to be highly sensitive, accurate and efficient in the identification of K-ras gene mutations. In order to investigate the role of the replication error phenomenon in Irish colon cancer, three polyA tract repeat loci were analysed. The repeat loci included a 10 bp intragenic repeat of the TGF-β-RII gene. TGF-β-RII is involved in the TGF-β epithelial cell growth pathway, and mutation of the gene is thought to play a role in cell proliferation and tumorigenesis. Due to the presence of a repeat sequence within the gene, TGF-β-RII defects are associated with tumours that display the replication error phenomenon. Analysis of the TGF-β-RII 10 bp repeat failed to identify mutations in any colon cancer patients. Analysis of the Bat26 and Bat40 polyA repeat sequences in the sporadic and HNPCC families revealed that instability is associated with HNPCC tumours harbouring mismatch repair defects and with 20% of sporadic colon cancer tumours. No correlation between K-ras gene mutations and the RER+ phenotype was detected in sporadic colon cancer tumours.

Relevance:

20.00%

Publisher:

Abstract:

For two multinormal populations with equal covariance matrices, the likelihood ratio discriminant function, an alternative allocation rule to the sample linear discriminant function when n1 ≠ n2, is studied analytically. With the assumption of a known covariance matrix, its distribution is derived and the expectations of its actual and apparent error rates are evaluated and compared with those of the sample linear discriminant function. This comparison indicates that the likelihood ratio allocation rule is robust to unequal sample sizes. The quadratic discriminant function is studied, its distribution reviewed and the evaluation of its probabilities of misclassification discussed. For known covariance matrices, the distribution of the sample quadratic discriminant function is derived. When the known covariance matrices are proportional, exact expressions for the expectations of its actual and apparent error rates are obtained and evaluated. The effectiveness of the sample linear discriminant function for this case is also considered. Estimation of the true log-odds for two multinormal populations with equal or unequal covariance matrices is studied. The estimative, Bayesian predictive and kernel methods are compared by evaluating their biases and mean square errors. Some algebraic expressions for these quantities are derived. With equal covariance matrices the predictive method is preferable. The source of this superiority is investigated by considering its performance for various levels of fixed true log-odds. It is also shown that the predictive method is sensitive to n1 ≠ n2. For unequal but proportional covariance matrices, the unbiased estimative method is preferred. Product normal kernel density estimates are used to give a kernel estimator of the true log-odds. The effect of correlation between the variables with product kernels is considered. With equal covariance matrices the kernel and parametric estimators are compared by simulation. For moderately correlated variables and large dimension sizes, the product kernel method is a good estimator of the true log-odds.
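For reference, the linear discriminant allocation rule against which the likelihood ratio rule is compared has a closed form: allocate x to population 1 when W(x) = (m1 - m2)' S⁻¹ (x - (m1 + m2)/2) > 0. A minimal pure-Python sketch for the 2-D case (all parameter values are hypothetical):

```python
def linear_discriminant_2d(x, m1, m2, cov):
    """Linear discriminant W(x) = (m1 - m2)' S^-1 (x - (m1 + m2)/2)
    for two populations sharing a 2x2 covariance matrix S.
    Allocate x to population 1 when W(x) > 0, else to population 2."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # 2x2 inverse of S
    diff = [m1[0] - m2[0], m1[1] - m2[1]]
    mid = [(m1[0] + m2[0]) / 2.0, (m1[1] + m2[1]) / 2.0]
    # row vector (m1 - m2)' S^-1
    w = [diff[0] * inv[0][0] + diff[1] * inv[1][0],
         diff[0] * inv[0][1] + diff[1] * inv[1][1]]
    return w[0] * (x[0] - mid[0]) + w[1] * (x[1] - mid[1])

# Hypothetical population parameters
m1, m2 = [0.0, 0.0], [2.0, 2.0]
cov = [[1.0, 0.3], [0.3, 1.0]]
print(linear_discriminant_2d([0.2, 0.1], m1, m2, cov) > 0)  # True: allocate to population 1
```

In the sample version studied in the thesis, m1, m2 and S are replaced by their estimates from samples of sizes n1 and n2, which is where the n1 ≠ n2 sensitivity arises.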

Relevance:

20.00%

Publisher:

Abstract:

Practical realisation of quantum information science is a challenge being addressed by researchers employing various technologies. One of them is based on quantum dots (QDs), usually referred to as artificial atoms. Capable of emitting single and polarization-entangled photons, they are attractive as sources of quantum bits (qubits) and can be relatively easily integrated into photonic circuits using conventional semiconductor technologies. However, the dominant self-assembled QD systems suffer from asymmetry-related problems which modify the energetic structure. The main issue is the lifting of the degeneracy (the fine-structure splitting, FSS) of an optically allowed neutral exciton state which participates in a polarization-entanglement realisation scheme. The FSS complicates polarization-entanglement detection unless a particular FSS manipulation technique is utilised to reduce it to vanishing values, or a careful selection of intrinsically good candidates from the vast number of QDs is carried out, precluding the possibility of constructing vast arrays of emitters on the same sample. In this work, site-controlled InGaAs QDs grown on (111)B-oriented GaAs substrates prepatterned with 7.5 μm pitch tetrahedrons were studied in order to overcome QD asymmetry-related problems. By exploiting their intrinsically high rotational symmetry, pyramidal QDs were shown to act as polarization-entangled photon sources, emitting photons with a fidelity to the expected maximally entangled state as high as 0.721. This is the first site-controlled QD system of entangled photon emitters. Moreover, the density of such emitters was found to be as high as 15% in some areas: a density much higher than in any other QD system. The associated physical phenomena (e.g., carrier dynamics, QD energetic structure) were studied as well, by different techniques: photon correlation spectroscopy, polarization-resolved microphotoluminescence and magneto-photoluminescence.

Relevance:

20.00%

Publisher:

Abstract:

Very Long Baseline Interferometry (VLBI) polarisation observations of the relativistic jets from Active Galactic Nuclei (AGN) allow the magnetic field environment around the jet to be probed. In particular, multi-wavelength observations of AGN jets allow the creation of Faraday rotation measure maps which can be used to gain an insight into the magnetic field component of the jet along the line of sight. Recent polarisation and Faraday rotation measure maps of many AGN show possible evidence for the presence of helical magnetic fields. The detection of such evidence is highly dependent both on the resolution of the images and the quality of the error analysis and statistics used in the detection. This thesis focuses on the development of new methods for high resolution radio astronomy imaging in both of these areas. An implementation of the Maximum Entropy Method (MEM) suitable for multi-wavelength VLBI polarisation observations is presented and the advantage in resolution it possesses over the CLEAN algorithm is discussed and demonstrated using Monte Carlo simulations. This new polarisation MEM code has been applied to multi-wavelength imaging of the Active Galactic Nuclei 0716+714, Mrk 501 and 1633+382, in each case providing improved polarisation imaging compared to the case of deconvolution using the standard CLEAN algorithm. The first MEM-based fractional polarisation and Faraday-rotation VLBI images are presented, using these sources as examples. Recent detections of gradients in Faraday rotation measure are presented, including an observation of a reversal in the direction of a gradient further along a jet. Simulated observations confirming the observability of such a phenomenon are conducted, and possible explanations for a reversal in the direction of the Faraday rotation measure gradient are discussed. These results were originally published in Mahmud et al. (2013). 
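Faraday rotation rotates the observed polarisation angle linearly in wavelength squared, χ(λ) = χ₀ + RM·λ², and a rotation measure map is built by fitting this slope pixel by pixel across observing wavelengths. A minimal sketch of the per-pixel fit (wavelengths and angles are hypothetical, and nπ ambiguities are assumed already resolved):

```python
def fit_rm(lambdas_m, chis_rad):
    """Least-squares fit of chi = chi0 + RM * lambda**2.
    Returns (RM in rad/m^2, chi0 in rad)."""
    x = [l * l for l in lambdas_m]          # regress against lambda^2
    n = len(x)
    mx = sum(x) / n
    my = sum(chis_rad) / n
    rm = sum((xi - mx) * (yi - my) for xi, yi in zip(x, chis_rad)) \
         / sum((xi - mx) ** 2 for xi in x)
    chi0 = my - rm * mx                      # intrinsic polarisation angle
    return rm, chi0

# Hypothetical polarisation angles at three observing wavelengths,
# generated with RM = 20 rad/m^2 and chi0 = 0.5 rad
lams = [0.04, 0.06, 0.13]                    # metres
chis = [0.5 + 20 * l * l for l in lams]
rm, chi0 = fit_rm(lams, chis)
print(rm, chi0)
```

A transverse gradient in the fitted RM across the jet width is the helical-field signature the thesis tests for; its reliability depends on the resolution and error analysis discussed above.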
Finally, a new error model for the CLEAN algorithm is developed which takes into account correlation between neighbouring pixels. Comparison of error maps calculated using this new model with Monte Carlo maps shows striking similarities when the sources considered are well resolved, indicating that the method correctly reproduces at least some component of the overall uncertainty in the images. The calculation of many useful quantities using this model is demonstrated, and the advantages it offers over traditional single-pixel calculations are illustrated. The limitations of the model as revealed by Monte Carlo simulations are also discussed; unfortunately, the error model does not work well when applied to compact regions of emission.

Relevance:

20.00%

Publisher:

Abstract:

The training and ongoing education of medical practitioners has undergone major incremental changes over the past 15 years. These changes have been driven by patient safety, educational, economic and legislative/regulatory factors. In the near future, training in procedural skills will undergo a paradigm shift to proficiency-based progression, with associated requirements for competence-based programmes, valid and reliable assessment tools, and simulation technology. Before training begins, the learning outcomes require clear definition; any form of assessment applied should include measurement of these outcomes. Currently, training in a procedural skill often takes place on an ad hoc basis. The number of attempts necessary to attain a defined degree of proficiency varies from procedure to procedure. Convincing evidence exists that simulation training helps trainees to acquire skills more efficiently than relying on opportunities in their clinical practice. Simulation provides a safe, stress-free environment for skill acquisition, generalisation and transfer via deliberate practice. The work described in this thesis contributes to a greater understanding of how medical procedures can be performed more safely and effectively through education. Feedback based on knowledge of performance, provided to novices in a standardized setting on a bench model, was associated with an increase in the speed of skill acquisition and a decrease in error rate during initial learning. The timing of feedback was also associated with effective learning of the skill. A marked attrition of skills (independent of the type of feedback provided) was demonstrable 24 hours after they had first been learned.
Using the principles of feedback described above, the effect of an intense training programme (i.e. the format of present training courses: an intense training day for one or more procedures) was then studied in novices with varying years of experience in anaesthesia. There was a marked attrition of skill at 24 hours, with a significant correlation with increasing years of experience; there also appeared to be an inverse relationship between years of experience in anaesthesia and performance. The greater the number of years of practice experience, the longer it took a learner to acquire a new skill. The findings of the studies described in this thesis may have important implications for trainers, trainees and training bodies in the design and implementation of training courses and the formats of delivery of changing curricula. Both curricula and training modalities will need to take account of the characteristics of individual learners and the dynamic nature of procedural healthcare.

Relevance:

20.00%

Publisher:

Abstract:

The Leaving Certificate (LC) is the national, standardised state examination in Ireland necessary for entry to third-level education; this presents a massive, raw corpus of data with the potential to yield invaluable insight into the phenomena of learner interlanguage. With samples of official LC Spanish examination data, this project has compiled a digitised corpus of learner Spanish comprising the written and oral production of 100 candidates. This corpus was then analysed using a specific investigative corpus technique, Computer-aided Error Analysis (CEA; Dagneaux et al., 1998). CEA is a powerful apparatus in that it greatly facilitates the quantification and analysis of a large learner corpus in digital format. The corpus was both compiled and analysed with the UAM Corpus Tool (O'Donnell, 2013). This tool allows for the recording of candidate-specific variables such as grade, examination level, task type and gender, therefore allowing for critical analysis of the corpus as one unit, as separate written and oral subcorpora, and also of performance per task, level and gender. This is an interdisciplinary work combining aspects of Applied Linguistics, Learner Corpus Research and Foreign Language (FL) Learning. Beginning with a review of the context of FL learning in Ireland and Europe, I go on to discuss the disciplinary context and theoretical framework for this work and outline the methodology applied. I then perform detailed quantitative and qualitative analyses before combining all research findings and outlining the principal conclusions. This investigation does not make a priori assumptions about the data set, the LC Spanish examination, the context of FLs or any aspect of learner competence. It undertakes to provide the linguistic research community and the domain of Spanish language learning and pedagogy in Ireland with an empirical, descriptive profile of real learner performance, characterising learner difficulty.