944 results for Genotyping uncertainty
Abstract:
Background: Great efforts have been made to increase the accessibility of HIV antiretroviral therapy (ART) in low- and middle-income countries. The threat of wide-scale emergence of drug resistance could severely hamper ART scale-up efforts. Population-based surveillance of transmitted HIV drug resistance ensures the use of appropriate first-line regimens to maximize the efficacy of ART programs where drug options are limited. However, traditional HIV genotyping is extremely expensive, posing a cost barrier to wide-scale and frequent HIV drug resistance surveillance. Methods/Results: We have developed a low-cost, laboratory-scale, next-generation sequencing-based genotyping method to monitor drug resistance. We designed primers specifically to amplify protease and reverse transcriptase from Brazilian HIV subtypes and developed a multiplexing scheme using multiplex identifier tags to minimize cost while providing more robust data than traditional genotyping techniques. Using this approach, we characterized drug resistance from plasma collected from 81 HIV-infected individuals in Sao Paulo, Brazil. We describe the complexities of analyzing next-generation sequencing data and present a simplified open-source workflow for analyzing drug resistance data. From these data, we identified drug resistance mutations in 20% of treatment-naive individuals in our cohort, a frequency similar to those identified by traditional genotyping in Brazilian patient samples. Conclusion: The ultra-wide sequencing approach described here allows multiplexing of at least 48 patient samples per sequencing run, 4 times more than the current genotyping method. This method is also 4-fold more sensitive (5% minimal detection frequency vs. 20%) at a cost 3-5 times lower than the traditional Sanger-based genotyping method. Lastly, by using a benchtop next-generation sequencer (Roche/454 GS Junior), this approach can be more easily implemented in low-resource settings. These data provide proof-of-concept that next-generation HIV drug resistance genotyping is a feasible and low-cost alternative to current genotyping methods and may be particularly beneficial for in-country surveillance of transmitted drug resistance.
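As an illustration of the minority-variant threshold reported above, the following sketch flags drug-resistance mutations whose per-codon read frequency reaches 5%. It is a minimal example under assumed inputs: the pileup counts, codon positions, and the `call_resistance_mutations` helper are hypothetical and are not part of the published pipeline.

```python
# Minimal sketch: call minority drug-resistance variants from per-codon
# amino-acid read counts. The positions, counts, and resistance sets are
# hypothetical, for illustration only.

MIN_FREQUENCY = 0.05  # the 5% minimal detection frequency reported above

def call_resistance_mutations(counts, resistance_aas):
    """counts: {codon: {amino_acid: reads}};
    resistance_aas: {codon: set of resistance-associated amino acids}."""
    calls = []
    for codon, aa_counts in counts.items():
        depth = sum(aa_counts.values())
        for aa, n in aa_counts.items():
            if depth and aa in resistance_aas.get(codon, set()) \
                    and n / depth >= MIN_FREQUENCY:
                calls.append((codon, aa, round(n / depth, 3)))
    return calls

# Hypothetical pileup at two reverse-transcriptase codons
counts = {103: {"K": 940, "N": 60}, 184: {"M": 990, "V": 10}}
resistance_aas = {103: {"N"}, 184: {"V"}}
print(call_resistance_mutations(counts, resistance_aas))
# [(103, 'N', 0.06)] -- K103N at 6% is called; M184V at 1% is not
```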
Abstract:
Human parvovirus B19 (B19V) infection can be a life-threatening condition among patients with hereditary (chronic) hemolytic anemias. Our objective was to characterize the infection molecularly among patients with sickle cell disease and thalassemia. Forty-seven patients (37 with sickle cell disease and 10 with beta-thalassemia major) as well as 47 healthy blood donors were examined for B19V infection by anti-B19V IgG enzyme immunoassay, quantitative PCR (which detects all B19V genotypes), and DNA sequencing. B19V viremia was documented in nine patients (19.1%): two displayed acute infection, and the rest had low-titre viremia (mean 3.4 x 10^4 copies/mL). All donors were negative for B19V DNA. Anti-B19V IgG was detected in 55.3% of the patients and 57.4% of the donors. Based on partial NS1 fragments, all patient isolates were classified as genotype 1, subgenotype 1A. Evolutionary analysis of the examined partial NS1 gene sequences showed no evidence of positive selection. Quantification of all B19V genotypes with a single hydrolysis probe is a technically useful method, but it is difficult to establish relationships between B19V sequence characteristics and infection outcome.
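For context on how qPCR titres such as the mean of 3.4 x 10^4 copies/mL are derived, here is a minimal sketch of back-calculating viral load from a Ct value via a linear standard curve; the slope, intercept, and volume parameters are hypothetical placeholders, not the study's calibration.

```python
# Minimal sketch: back-calculate plasma viral load (copies/mL) from a qPCR Ct
# value with a linear standard curve, log10(copies) = (Ct - intercept) / slope.
# Slope, intercept, and volumes are hypothetical placeholders.

SLOPE = -3.32     # Ct per log10 dilution (~100% amplification efficiency)
INTERCEPT = 38.0  # hypothetical Ct for one input copy per reaction

def ct_to_copies_per_ml(ct, eluate_ul=50.0, input_ul=5.0, plasma_ml=0.2):
    copies_per_reaction = 10 ** ((ct - INTERCEPT) / SLOPE)
    copies_in_eluate = copies_per_reaction * (eluate_ul / input_ul)
    return copies_in_eluate / plasma_ml

print(f"Ct 26.0 -> {ct_to_copies_per_ml(26.0):.2e} copies/mL")
```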
Abstract:
Categorical data cannot be interpolated directly because they are outcomes of discrete random variables. Thus, the categories are transformed into indicator functions that can be handled by interpolation methods, and the interpolated indicator values are then back-transformed to the original categories. However, aspects such as the variability and uncertainty of interpolated values of categorical data have never been considered. In this paper we show that the interpolation variance can be used to map an uncertainty zone around boundaries between categories. Moreover, it is shown that the interpolation variance is a component of the total variance of the categorical variables, as measured by the coefficient of unalikeability.
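As a pointer to the summary statistic named above, the sketch below computes the coefficient of unalikeability in its common form u = 1 - sum_k p_k^2, where p_k is the proportion of category k; the sample data are illustrative, and this is not the paper's interpolation code.

```python
from collections import Counter

# Minimal sketch of the coefficient of unalikeability, u = 1 - sum_k p_k^2:
# u = 0 when all observations share one category and grows as categories mix.
# The sample below is illustrative only.

def unalikeability(values):
    n = len(values)
    return 1.0 - sum((c / n) ** 2 for c in Counter(values).values())

soil_types = ["clay", "clay", "sand", "loam", "clay", "sand"]
print(round(unalikeability(soil_types), 3))  # 0.611 for this mixed sample
```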
Abstract:
Santos C.R., Mori E., Leao D.A. & Maiorka P.C. 2012. [Genotyping of polymorphisms in the prnp gene in Santa Ines sheep in the State of Sao Paulo, Brazil.] Genotipagem de polimorfismos no gene prnp em ovinos Santa Ines no Estado de Sao Paulo. Pesquisa Veterinaria Brasileira 32(3):221-226. Laboratorio de Neuropatologia Experimental e Comparada, Departamento de Patologia, Faculdade de Medicina Veterinaria e Zootecnia, Universidade de Sao Paulo, Av. Prof. Dr. Orlando Marques de Paiva 87, Cidade Universitaria, Sao Paulo, SP 05508-270, Brazil. E-mail: caio-patologia@usp.br Enzootic paraplexia, or scrapie, is a fatal neurodegenerative disease affecting mainly sheep and, rarely, goats. The disease is influenced by polymorphisms at codons 136, 154 and 171 of the prnp gene, which encodes the prion protein. Animals may be susceptible or resistant to the development of the disease according to the alleles observed at these codons. In Brazil, scrapie has occurred only in imported animals; the country is therefore considered free of the disease. This study genotyped polymorphisms associated with the development of scrapie and, based on these findings, categorized the animals as resistant or susceptible. A total of 118 samples from Santa Ines sheep raised on farms in the State of Sao Paulo were sequenced. From these samples, 6 alleles and 11 genotypes were identified (ARQ/ARQ, ARR/ARQ, ARQ/AHQ, ARQ/VRQ, AHQ/AHQ, ARR/ARR, ARR/AHQ, VRQ/VRQ, ARQ/TRQ, TRR/TRR, TRQ/TRQ), with the ARQ/ARQ genotype at a frequency of 56.7%. The presence of tyrosine at codon 136 was also detected, which may be considered a rare observation, since there are no previous reports of this polymorphism in the Santa Ines breed. These results show great genetic variability in Santa Ines sheep in Sao Paulo, and only 1.69% of the observed genotypes are highly resistant to scrapie. These data demonstrate that Santa Ines sheep can be considered potentially susceptible to scrapie.
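To illustrate the genotype-to-risk categorization described above, here is a minimal sketch that maps codon 136/154/171 haplotype pairs to scrapie risk groups. The three-tier grouping (ARR/ARR resistant, VRQ-bearing genotypes most susceptible) follows the conventional ranking but is a simplification for illustration, not the study's own scheme.

```python
# Minimal sketch: map prnp haplotype pairs (codons 136/154/171) to scrapie
# risk groups. The simplified three-tier grouping below is illustrative,
# not the study's classification.

RESISTANT = {frozenset(["ARR"])}                  # ARR/ARR homozygotes
HIGHLY_SUSCEPTIBLE = {frozenset(["VRQ"]),         # VRQ/VRQ
                      frozenset(["ARQ", "VRQ"])}  # ARQ/VRQ

def scrapie_risk(allele1, allele2):
    genotype = frozenset([allele1, allele2])
    if genotype in RESISTANT:
        return "resistant"
    if genotype in HIGHLY_SUSCEPTIBLE:
        return "highly susceptible"
    return "intermediate"

for pair in [("ARR", "ARR"), ("ARQ", "ARQ"), ("ARQ", "VRQ")]:
    print("/".join(pair), "->", scrapie_risk(*pair))
```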
Abstract:
In this paper, the effects of uncertainty and of the expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative that properly models the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results depend on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and the optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs while increasing total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors or by failure probability constraints, but depends on the actual structural configuration.
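A worked toy version of the risk-optimization objective may help: total expected cost = construction cost + failure probability x failure cost, scanned over a safety factor. Both the cost models and the reliability model below are hypothetical placeholders, not the paper's formulation.

```python
from math import erf, log, sqrt

# Minimal sketch of a risk-optimization (RO) objective: total expected cost =
# construction cost + P_f * failure cost, scanned over a safety factor.
# The cost and reliability models are toy placeholders.

def failure_probability(safety_factor, cov=0.2):
    beta = log(safety_factor) / cov             # crude reliability index
    return 0.5 * (1.0 - erf(beta / sqrt(2.0)))  # standard normal tail

def total_expected_cost(sf, c_build=1.0, c_fail=100.0):
    return c_build * sf + failure_probability(sf) * c_fail

cost, sf = min((total_expected_cost(s / 100), s / 100)
               for s in range(105, 300, 5))
print(f"optimum safety factor ~ {sf:.2f}, total expected cost {cost:.3f}")
```

In this toy setting the optimum safety factor balances the marginal construction cost against the marginal reduction in expected failure cost, which is exactly the economy-versus-safety trade-off RO formalizes.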
Abstract:
Background: Genotyping of hepatitis C virus (HCV) has become an essential tool for prognosis and for predicting treatment duration. The aim of this study was to compare two HCV genotyping methods: reverse hybridization line probe assay (LiPA v.1) and partial sequencing of the NS5B region. Methods: Plasma samples from 171 patients with chronic hepatitis C were screened using both a commercial method (LiPA HCV Versant, Siemens, Tarrytown, NY, USA) and different primers targeting the NS5B region for PCR amplification and sequencing analysis. Results: Comparison of the HCV genotyping methods showed no difference in classification at the genotype level. However, a total of 82/171 samples (47.9%), comprising misclassified, non-subtypable, discrepant and inconclusive results, were not classified by LiPA at the subtype level but could be discriminated by NS5B sequencing. Of these samples, 34 of genotype 1a and 6 of genotype 1b were classified at the subtype level by NS5B sequencing. Conclusions: Sequence analysis of NS5B for HCV genotyping provides precise genotype and subtype identification and an accurate epidemiological representation of circulating viral strains.
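The genotype-level versus subtype-level comparison can be made concrete with a small sketch that tabulates concordance between paired calls from the two methods; the sample calls are hypothetical, with a bare "1" standing for a LiPA result that could not be subtyped.

```python
# Minimal sketch: concordance between two HCV genotyping methods at the
# genotype and subtype levels. Calls are hypothetical; a bare "1" is a
# LiPA result without a subtype.

def concordance(method_a, method_b):
    geno = sum(a[0] == b[0] for a, b in zip(method_a, method_b))
    sub = sum(a == b for a, b in zip(method_a, method_b))
    n = len(method_a)
    return geno / n, sub / n

lipa = ["1a", "1", "3a", "1b", "1"]
ns5b = ["1a", "1b", "3a", "1b", "1a"]
g, s = concordance(lipa, ns5b)
print(f"genotype concordance {g:.0%}, subtype concordance {s:.0%}")
# genotype concordance 100%, subtype concordance 60%
```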
Abstract:
The Brazilian network for genotyping is composed of 21 laboratories that perform and analyze genotyping tests for all HIV-infected patients within the public system, performing approximately 25,000 tests per year. We assessed the interlaboratory and intralaboratory reproducibility of genotyping systems by creating and implementing a local external quality control evaluation. Plasma samples from HIV-1-infected individuals (with low and intermediate viral loads) or RNA viral constructs with specific mutations were used. This evaluation included analyses of the sensitivity and specificity of the tests based on qualitative and quantitative criteria, which scored laboratory performance on a 100-point system. Five evaluations were performed from 2003 to 2008, with 64% of laboratories scoring over 80 points in 2003, 81% doing so in 2005, 56% in 2006, 91% in 2007, and 90% in 2008 (Kruskal-Wallis, p = 0.003). Performance gains were aided by retraining laboratories that had specific deficiencies. The results emphasize the importance of investing in laboratory training and in the interpretation of DNA sequencing results, especially in developing countries where public (and scarce) resources are used to manage the AIDS epidemic.
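For readers unfamiliar with the reported test, a minimal sketch of a Kruskal-Wallis comparison of proficiency scores across evaluation rounds follows, using SciPy; the scores are invented placeholders, not the network's data.

```python
from scipy.stats import kruskal

# Minimal sketch: Kruskal-Wallis test across evaluation rounds, as in the
# comparison reported above (p = 0.003). Scores are invented placeholders.

scores_by_round = {
    2003: [62, 75, 81, 90, 78],
    2005: [70, 84, 88, 92, 80],
    2008: [85, 91, 94, 88, 96],
}
statistic, p_value = kruskal(*scores_by_round.values())
print(f"H = {statistic:.2f}, p = {p_value:.3f}")
```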
Abstract:
Doctoral Program: Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería (Intelligent Systems and Numerical Applications in Engineering)
Abstract:
In the context of a testing laboratory, one of the most important aspects to deal with is the measurement result. Whenever decisions are based on measurement results, it is important to have some indication of the quality of those results. Many standards are available in every area of noise measurement, but without an expression of uncertainty it is impossible to judge whether two results are in compliance or not. ISO/IEC 17025 is an international standard on the competence of calibration and testing laboratories. It contains the requirements that testing and calibration laboratories have to meet if they wish to demonstrate that they operate a quality system, are technically competent, and are able to generate technically valid results. ISO/IEC 17025 deals specifically with the requirements for the competence of laboratories performing testing and calibration and for the reporting of results, which may or may not contain opinions and interpretations. The standard requires appropriate methods of analysis to be used for estimating the uncertainty of measurement. From this point of view, for a testing laboratory performing sound power measurements according to specific ISO standards and European Directives, the evaluation of uncertainty is the most important factor to deal with. Sound power level measurement according to ISO 3744:1994, performed with a limited number of microphones distributed over a surface enveloping a source, is affected by a certain systematic error and a related standard deviation. Comparing measurements carried out with different microphone arrays is difficult because the results are affected by systematic errors and standard deviations that are peculiar to the number of microphones placed on the surface, their spatial positions, and the complexity of the sound field. A statistical approach can give an overview of the differences between sound power levels evaluated with different microphone arrays and an evaluation of the errors that afflict this kind of measurement. In contrast to the classical approach, which tends to follow the ISO GUM, this thesis presents a different point of view on the problem of comparing results obtained from different microphone arrays.
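As a minimal sketch of the underlying computation, assuming free-field conditions over a reflecting plane as in ISO 3744, the sound power level is obtained by energy-averaging the sound pressure levels over the measurement surface and adding the surface-area term, Lw = Lp + 10 log10(S/S0). The microphone readings and surface area below are hypothetical.

```python
from math import log10

# Minimal sketch: ISO 3744-style sound power level. Energy-average the sound
# pressure levels over the enveloping surface, then add the surface-area term:
# Lw = Lp_avg + 10*log10(S/S0), with S0 = 1 m^2. Readings and area are
# hypothetical.

def sound_power_level(spl_db, surface_m2, s0_m2=1.0):
    mean_energy = sum(10 ** (lp / 10.0) for lp in spl_db) / len(spl_db)
    lp_avg = 10.0 * log10(mean_energy)
    return lp_avg + 10.0 * log10(surface_m2 / s0_m2)

readings = [78.2, 79.1, 77.5, 80.3, 78.8, 79.6]  # dB, six microphone positions
print(f"Lw = {sound_power_level(readings, surface_m2=6.28):.1f} dB")
```

The energy (rather than arithmetic) average is the reason different microphone arrays can yield systematically different results when the sound field is complex, which is precisely the comparison problem the thesis addresses.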
Abstract:
Hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a highly relevant issue, owing to the severe human and economic losses that may be caused by flooding or by water in general. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damage can be reduced if they are predicted sufficiently far in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with residual uncertainty about what will actually happen. This type of uncertainty is what this thesis discusses and analyzes. In operational problems, the ultimate aim of forecasting systems is not to reproduce the river's behavior; that is only a means of reducing the uncertainty about what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to define clearly what is meant by uncertainty, since the literature is often confused on this issue. Therefore, the first objective of this thesis is to clarify this concept, starting from a key question: should the choice of the intervention strategy be based on evaluating the model prediction by its ability to represent reality, or on evaluating what will actually happen on the basis of the information given by the model forecast? Once this idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support by making objective and realistic risk evaluations possible. In particular, such a tool should provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time available to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. It is therefore necessary to quantify the flooding probability within a time horizon related to that required to implement the intervention strategy, and also to assess the probability distribution of the flooding time.
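One simple way to quantify a flooding probability tied to a decision horizon, in the spirit described above, is to count the fraction of ensemble forecast members that exceed the flooding threshold at any lead time within the horizon. The sketch below does exactly that; the ensemble trajectories and threshold are hypothetical.

```python
# Minimal sketch: flooding probability within a decision horizon, estimated as
# the fraction of ensemble members whose forecast level exceeds the flooding
# threshold at any step up to the horizon. All numbers are hypothetical.

def flooding_probability(ensemble, threshold, horizon_steps):
    """ensemble: list of member trajectories (water level per time step)."""
    hits = sum(any(level >= threshold for level in member[:horizon_steps])
               for member in ensemble)
    return hits / len(ensemble)

ensemble = [
    [2.1, 2.6, 3.2, 3.9],  # exceeds 3.5 m at step 4
    [2.0, 2.3, 2.7, 3.0],  # stays below
    [2.2, 2.9, 3.6, 4.1],  # exceeds 3.5 m at step 3
]
print(flooding_probability(ensemble, threshold=3.5, horizon_steps=4))  # ~0.667
```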
Abstract:
Environmental computer models are deterministic models devoted to predicting environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, of unknown calibration, and not equipped with any information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast in real time the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and the predictions for the next three hours. We propose a Bayesian downscaler model based on first differences, with a flexible coefficient structure and an efficient computational strategy for fitting the model parameters. Model validation for the eastern United States shows a substantial improvement of our fully inferential approach over the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain the spatially varying uncertainty associated with numerical model output. We show how such uncertainty can be learned through suitable stochastic data fusion modeling using external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
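The 8-hour average ozone definition above is simple enough to show directly: the mean of the four previous observed hours, the current hour, and the three predicted hours. The values below are hypothetical (in ppb), and the function name is ours, not the paper's.

```python
# Minimal sketch of the 8-hour average ozone defined above: the mean of the
# previous four observed hours, the current hour, and the next three predicted
# hours. Values are hypothetical (ppb); the function name is ours.

def eight_hour_average(past_four, current, next_three):
    if len(past_four) != 4 or len(next_three) != 3:
        raise ValueError("expect 4 past hours and 3 predicted hours")
    window = list(past_four) + [current] + list(next_three)
    return sum(window) / 8.0

obs_past = [52.0, 55.0, 58.0, 61.0]  # previous four hourly observations
obs_now = 63.0                       # current hour
model_next = [64.0, 62.0, 59.0]      # next three hours from the numerical model
print(f"{eight_hour_average(obs_past, obs_now, model_next):.2f} ppb")  # 59.25
```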
Abstract:
In this thesis we address a collection of Network Design problems strongly motivated by applications from Telecommunications, Logistics and Bioinformatics. In most cases we justify the need to take uncertainty in some of the problem parameters into account, and different Robust Optimization models are used to hedge against it. Mixed integer linear programming formulations, along with sophisticated algorithmic frameworks, are designed, implemented and rigorously assessed for the majority of the studied problems. The obtained results yield the following observations: (i) relevant real problems can be effectively represented as (discrete) optimization problems within the framework of network design; (ii) uncertainty can be appropriately incorporated into the decision process if a suitable robust optimization model is considered; (iii) optimal, or nearly optimal, solutions can be obtained for large instances if a tailored algorithm that exploits the structure of the problem is designed; (iv) a systematic and rigorous experimental analysis allows one to understand both the characteristics of the obtained (robust) solutions and the behavior of the proposed algorithm.
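As a flavor of how robust optimization hedges against cost uncertainty, here is a minimal sketch of a budgeted (Bertsimas-Sim-style) worst-case evaluation, one common robustness model; the thesis studies several models, and the edge data and gamma below are hypothetical.

```python
# Minimal sketch of budgeted (Bertsimas-Sim-style) robustness: the worst-case
# cost of a chosen edge set when an adversary may raise at most gamma edge
# costs to their upper deviations. Edge data are hypothetical.

def worst_case_cost(chosen, nominal, deviation, gamma):
    base = sum(nominal[e] for e in chosen)
    worst = sorted((deviation[e] for e in chosen), reverse=True)
    return base + sum(worst[:gamma])

nominal = {"ab": 4.0, "bc": 3.0, "ac": 6.0}
deviation = {"ab": 2.0, "bc": 1.5, "ac": 0.5}
tree = ["ab", "bc"]  # a spanning tree on nodes {a, b, c}
print(worst_case_cost(tree, nominal, deviation, gamma=1))  # 4 + 3 + 2 = 9.0
```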
Abstract:
Accidents can lead to difficult boundary situations, which often arise in emergency units. The medical team thus often, and inevitably, faces professional uncertainty in its decision-making. It is essential to communicate these uncertainties within the medical team, instead of downplaying or overriding existential hurdles in decision-making. Acknowledging uncertainties can lead to alert and prudent decisions. Uncertainty can thus have ethical value in the provision or withdrawal of treatment. It does not need to be covered up by evidence-based arguments, especially as some singular situations of individual tragedy cannot be grasped in terms of evidence-based medicine.