944 results for Quantitative methods


Relevance:

40.00%

Publisher:

Abstract:

Linkage disequilibrium methods can be used to find genes influencing quantitative trait variation in humans. They can require smaller sample sizes than linkage equilibrium methods, such as the variance component approach, to find loci of a specific effect size. This increase in power comes at the expense of requiring more markers to be typed to scan the entire genome. This thesis compares different linkage disequilibrium methods to determine which factors influence the power to detect disequilibrium. The costs of disequilibrium and equilibrium tests were compared to determine whether the savings in phenotyping costs when using disequilibrium methods outweigh the additional genotyping costs.

Nine linkage disequilibrium tests were examined by simulation. Five tests involve selecting isolated unrelated individuals, while four involve selecting parent-child trios (TDT). All nine tests were found to identify disequilibrium at the correct significance level in Hardy-Weinberg populations. Increasing linked genetic variance and trait allele frequency increased the power to detect disequilibrium, while increasing the number of generations and the distance between marker and trait loci decreased it. Discordant sampling was used for several of the tests; the more stringent the sampling, the greater the power to detect disequilibrium in a sample of a given size. The power to detect disequilibrium was not affected by the presence of polygenic effects.

When the trait locus had more than two trait alleles, the power of the tests reached a maximum below one. For the simulation methods used here, when there were more than two trait alleles there was a probability, equal to 1 minus the heterozygosity of the marker locus, that both trait alleles were in disequilibrium with the same marker allele, leaving the marker uninformative for disequilibrium.

The five tests using isolated unrelated individuals showed excess error rates when there was disequilibrium due to population admixture. Increased error rates also resulted from increased unlinked major gene effects, discordant trait allele frequency, and increased disequilibrium. Polygenic effects did not affect the error rates. Tests based on the TDT (Transmission Disequilibrium Test) were not liable to any increase in error rates.

For all sample ascertainment costs, for recent mutations (<100 generations), linkage disequilibrium tests were less expensive to carry out than the variance component test. Candidate gene scans saved even more money. The use of recently admixed populations also decreased the cost of performing a linkage disequilibrium test.
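
Several of the trio-based tests above build on the classic TDT statistic. As a minimal sketch (the function name and the counts are illustrative, not taken from the thesis), the McNemar-type form can be computed as:

```python
from math import nan
from scipy.stats import chi2

def tdt_statistic(b, c):
    """McNemar-type TDT statistic from parent-child trios.

    b: heterozygous parents transmitting the marker allele to the
    affected child; c: heterozygous parents not transmitting it.
    Under no linkage/association, (b - c)**2 / (b + c) follows a
    chi-square distribution with 1 degree of freedom.
    """
    if b + c == 0:
        return nan, nan
    stat = (b - c) ** 2 / (b + c)
    return stat, chi2.sf(stat, df=1)

# Illustrative counts only:
print(tdt_statistic(62, 38))  # -> (5.76, ~0.016)
```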

Relevance:

40.00%

Publisher:

Abstract:

BACKGROUND Current guidelines for evaluating cleft palate treatments are mostly based on two-dimensional (2D) evaluation, but the use of three-dimensional (3D) imaging methods to assess treatment outcome is steadily rising. OBJECTIVE To identify 3D imaging methods for quantitative assessment of soft tissue and skeletal morphology in patients with cleft lip and palate. DATA SOURCES Literature was searched using PubMed (1948-2012), EMBASE (1980-2012), Scopus (2004-2012), Web of Science (1945-2012), and the Cochrane Library. The last search was performed September 30, 2012. Reference lists were hand searched for potentially eligible studies. There was no language restriction. STUDY SELECTION We included publications using 3D imaging techniques to assess facial soft tissue or skeletal morphology in patients older than 5 years with cleft lip, with or without cleft palate. We reviewed studies involving the facial region in which the sample included at least 10 subjects with at least one cleft type. Only primary publications were included. DATA EXTRACTION Independent extraction of data and quality assessments were performed by two observers. RESULTS Five hundred full-text publications were retrieved, of which 144 met the inclusion criteria and 63 were of high quality. Differences in study designs, topics studied, patient characteristics, and outcome measures meant that only a systematic review, not a meta-analysis, could be conducted. The main 3D techniques used in cleft lip and palate patients are CT, CBCT, MRI, stereophotogrammetry, and laser surface scanning. These techniques are mainly used for soft tissue analysis, evaluation of bone grafting, and changes in the craniofacial skeleton. Digital dental casts are used to evaluate treatment and changes over time. CONCLUSION Available evidence implies that 3D imaging methods can be used for documentation of CLP patients. No data are available yet showing that 3D methods are more informative than conventional 2D methods. Further research is warranted to elucidate this.

Relevance:

40.00%

Publisher:

Abstract:

Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantitation method that has been widely used in the biological and biomedical fields. The methods currently used for qPCR data analysis, including the threshold cycle (CT) method and linear and non-linear model fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence is usually inaccurate and can therefore distort results. Here, we propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for each pair of consecutive PCR cycles, we subtracted the fluorescence of the earlier cycle from that of the later cycle, transforming n cycles of raw data into n-1 differences. Linear regression was then applied to the natural logarithm of the transformed data. Finally, amplification efficiencies and initial DNA molecule numbers were calculated for each PCR run. To evaluate this new method, we compared it, in terms of accuracy and precision, with the original linear regression method under three background corrections: the mean of cycles 1-3, the mean of cycles 3-7, and the minimum fluorescence. Three criteria (threshold identification, maximum R², and maximum slope) were employed to select the target data points. Because PCR data are time series data, we also applied linear mixed models. Overall, when the threshold identification criterion was applied and when the linear mixed model was adopted, the taking-difference linear regression method was superior, giving an accurate estimate of the initial DNA amount and a reasonable estimate of the PCR amplification efficiency. When the criteria of maximum R² and maximum slope were used, the original linear regression method gave an accurate estimate of the initial DNA amount. The taking-difference linear regression method avoids the error of subtracting an unknown background and is thus theoretically more accurate and reliable. The method is easy to perform, and the taking-difference strategy can be extended to all current methods for qPCR data analysis.
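
A minimal sketch of the taking-difference step, assuming the selected cycle window lies in the exponential phase, where fluorescence behaves as F_i ≈ B + A·E^i (the function name and window handling are illustrative):

```python
import numpy as np

def taking_difference_qpcr(fluorescence):
    """Estimate amplification efficiency E and a quantity proportional
    to the initial DNA amount from exponential-phase fluorescence.

    If F_i = B + A * E**i, then D_i = F_{i+1} - F_i = A * (E - 1) * E**i,
    so the unknown background B cancels and ln(D_i) is linear in i.
    """
    f = np.asarray(fluorescence, dtype=float)
    d = np.diff(f)                        # n readings -> n-1 differences
    cycles = np.arange(1, len(f))         # cycle index of each difference
    keep = d > 0                          # the log needs positive differences
    slope, intercept = np.polyfit(cycles[keep], np.log(d[keep]), 1)
    efficiency = np.exp(slope)            # E, ideally close to 2
    a0 = np.exp(intercept) / (efficiency - 1.0)  # proportional to initial DNA
    return efficiency, a0
```

In practice the regression would be restricted to the data points selected by one of the three criteria above (threshold identification, maximum R², or maximum slope).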

Relevance:

40.00%

Publisher:

Abstract:

Quantitative descriptive analysis (QDA) is used to describe the nature and intensity of sensory properties from a single evaluation of a product, whereas temporal dominance of sensations (TDS) is primarily used to identify dominant sensory properties over time. Previous studies with TDS have focused on model systems; this is the first study to use a sequential approach (QDA, then TDS) to measure the sensory properties of a commercial product category, using the same set of trained assessors (n = 11). The main objectives of this study were (1) to investigate the benefits of using QDA and TDS sequentially and (2) to explore the impact of sample composition on taste and flavour perception in blackcurrant squashes. The study proposes an alternative way of choosing attributes for TDS measurement, based on data from previous QDA studies where available. Both methods indicated that the flavour profile was primarily influenced by the level of dilution and the complexity of sample composition combined with blackcurrant juice content. In addition, artificial sweeteners were found to modify the quality of sweetness and could also contribute bitter notes. Using QDA and TDS in tandem was shown to be more beneficial than either method alone, enabling a more complete sensory profile of the products.

Relevance:

40.00%

Publisher:

Abstract:

Objective: Qualitative research is increasingly valued as part of the evidence for policy and practice, but how it should be appraised is contested. Various appraisal methods, including checklists and other structured approaches, have been proposed but rarely evaluated. We aimed to compare three methods for appraising qualitative research papers that were candidates for inclusion in a systematic review of evidence on support for breast-feeding. Method: A sample of 12 research papers on support for breast-feeding was appraised by six qualitative reviewers using three appraisal methods: unprompted judgement, based on expert opinion; a UK Cabinet Office quality framework; and CASP, a Critical Appraisal Skills Programme tool. Following appraisal, papers were assigned to one of five categories, which were dichotomized to indicate whether or not papers should be included in a systematic review. Patterns of agreement in the categorization of papers were assessed quantitatively using κ statistics, and qualitatively using cross-case analysis. Results: Agreement in categorizing papers across the three methods was slight (κ = 0.13; 95% CI 0.06-0.24). Structured approaches did not appear to yield higher agreement than unprompted judgement. Qualitative analysis revealed reviewers' dilemmas in weighing the potential impact of findings against the quality of the research execution or reporting. Structured instruments appeared to make reviewers more explicit about the reasons for their judgements. Conclusions: Structured approaches may not produce greater consistency in judgements about whether to include qualitative papers in a systematic review. Future research should address how appraisals of qualitative research should be incorporated into systematic reviews.
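
For multiple raters assigning items to categories, agreement of this kind is often summarized with Fleiss' kappa. A minimal sketch (the paper does not state which κ variant was used, and the counts below are purely illustrative):

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for agreement among multiple raters.

    counts: (N items x k categories) array; counts[i, j] is how many
    of the n raters assigned item i to category j (rows sum to n).
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum(axis=1)[0]            # raters per item (assumed constant)
    p_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))
    p_bar = p_i.mean()                   # mean observed agreement
    p_j = counts.sum(axis=0) / counts.sum()
    p_e = np.square(p_j).sum()           # chance agreement
    return (p_bar - p_e) / (1 - p_e)

# Example: 12 papers, 6 reviewers, dichotomized include/exclude decisions
# (columns: include, exclude); purely illustrative numbers.
demo = np.array([[4, 2], [6, 0], [3, 3], [5, 1], [2, 4], [6, 0],
                 [1, 5], [4, 2], [3, 3], [5, 1], [2, 4], [4, 2]])
print(fleiss_kappa(demo))
```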

Relevance:

40.00%

Publisher:

Abstract:

This work presents a two-dimensional risk assessment approach based on quantifying the probability of occurrence of contaminant source terms as well as assessing the resulting impacts. Risk is calculated using Monte Carlo simulation, whereby synthetic contaminant source terms are generated from the same distribution as historically occurring pollution events, or from an a priori probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated over repeated realisations using integrated mathematical models. Risk is quantified as the number of times user-defined ranges of concentration magnitudes are exceeded. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together was evaluated. The results are presented as charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as trends within the study area. The capability to generate synthetic pollution events from numerous potential sources, based on the historical frequency of their occurrence, proved to be a great asset of the method and a significant advantage over contemporary methods.
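
A minimal sketch of the exceedance-counting idea, with a toy attenuation factor standing in for the integrated transport models and all distributions and numbers chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical set-up: source strengths follow the historical event
# distribution (here lognormal), and a simple attenuation factor stands
# in for the integrated groundwater transport model used in the paper.
n_runs = 10_000
threshold = 0.05      # user-defined concentration magnitude (mg/L)

source = rng.lognormal(mean=-1.0, sigma=0.8, size=n_runs)  # synthetic source terms
attenuation = rng.uniform(0.01, 0.2, size=n_runs)          # toy transport surrogate
conc_at_borehole = source * attenuation                    # simulated concentration

# Risk = fraction of realisations in which the threshold is exceeded
risk = np.mean(conc_at_borehole > threshold)
print(f"P(exceedance) at monitoring point: {risk:.3f}")
```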

Relevance:

40.00%

Publisher:

Abstract:

Major food adulteration and contamination events occur with alarming regularity and are known to be episodic; the question is not if but when another large-scale food safety/integrity incident will occur. Indeed, the challenges of maintaining food security are now internationally recognised. The ever-increasing scale and complexity of food supply networks can make them significantly more vulnerable to fraud and contamination, and potentially dysfunctional. This can make the task of deciding which analytical methods are most suitable for collecting and analysing (bio)chemical data within complex food supply chains, at targeted points of vulnerability, that much more challenging. It is evident that those working within and associated with the food industry are seeking rapid, user-friendly methods to detect food fraud and contamination, and rapid/high-throughput screening methods for the analysis of food in general. In addition to being robust and reproducible, these methods should be portable, ideally as handheld and/or remote sensor devices that can be taken to, or positioned on- or at-line at, points of vulnerability along complex food supply networks, and should require a minimal amount of background training to acquire information-rich data rapidly (ergo, point-and-shoot). Here we briefly discuss a range of spectrometry- and spectroscopy-based approaches, many of which are commercially available, as well as other methods currently under development. We offer a future perspective on how this range of detection methods in the growing sensor portfolio, along with developments in computational and information sciences such as predictive computing and the Internet of Things, will together form systems- and technology-based approaches that significantly reduce the areas of vulnerability to food crime within food supply chains. Food fraud is a problem of systems, and therefore requires systems-level solutions and thinking.

Relevance:

40.00%

Publisher:

Abstract:

Drowsy driving impairs motorists' ability to operate vehicles safely, endangering both the drivers and other people on the road. The purpose of this project is to find the most effective wearable device for detecting drowsiness. Existing research has demonstrated several options for drowsiness detection, such as electroencephalogram (EEG) brain-wave measurement, eye tracking, head motions, and lane deviations. However, there are no detailed trade-off analyses of the cost, accuracy, detection time, and ergonomics of these methods. We chose to use two different EEG headsets, the NeuroSky MindWave Mobile (single-electrode) and the Emotiv EPOC (14-electrode), and also tested a camera and a gyroscope-accelerometer device. We can successfully determine drowsiness after five minutes of training using both the single- and the multi-electrode EEG. Devices were evaluated using the following criteria: time needed to achieve an accurate reading, accuracy of prediction, rate of false positives vs. false negatives, and ergonomics and portability. This research will help improve detection devices and reduce the number of future accidents due to drowsy driving.
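
As a hedged sketch of how such a detector might be trained (the abstract does not specify features or classifier; band-power features and a random forest are common choices and are assumptions here, and the data below are placeholders):

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

FS = 128                                 # sampling rate (Hz), illustrative
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(eeg_window, fs=FS):
    """Mean spectral power in the classic EEG bands for one signal window."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs * 2)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

# Placeholder training data standing in for the ~5-minute calibration:
# windows of raw EEG with alert (0) / drowsy (1) labels.
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, FS * 4))   # 200 four-second windows
labels = rng.integers(0, 2, 200)               # illustrative labels
X = np.array([band_powers(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100).fit(X, labels)
```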

Relevance:

40.00%

Publisher:

Abstract:

Natural events are a widely recognized hazard for industrial sites where relevant quantities of hazardous substances are handled, due to the possible generation of cascading events resulting in severe technological accidents (Natech scenarios). Natural events may damage storage and process equipment containing hazardous substances, whose release can lead to major accident scenarios. The need to assess the risk associated with Natech scenarios is growing, and methodologies have been developed to quantify Natech risk, considering both point sources and linear sources such as pipelines. A key element of these procedures is the use of vulnerability models, which estimate the damage probability of an equipment item or pipeline segment as a result of the impact of the natural event. The first aim of the PhD project was therefore to outline the state of the art of vulnerability models for equipment and pipelines subject to natural events such as floods, earthquakes, and wind. The project also aimed at developing new vulnerability models to fill gaps in the literature; in particular, vulnerability models were developed for vertical equipment subject to wind and to flood. Finally, to improve the calculation of Natech risk for linear sources, an original methodology was developed for the quantitative risk assessment of pipelines subject to earthquakes. Overall, the results obtained are a step forward in the quantitative risk assessment of Natech accidents. The tools developed open the way to the inclusion of new equipment in the analysis of Natech events, and the methodology for assessing linear risk sources such as pipelines provides an important tool for a more accurate and comprehensive assessment of Natech risk.
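
Vulnerability (fragility) models of the kind surveyed here are commonly expressed as lognormal curves relating natural-event intensity to damage probability. A minimal sketch, with parameter values that are purely illustrative rather than taken from the thesis:

```python
from math import log
from statistics import NormalDist

def damage_probability(intensity, median, beta):
    """Lognormal fragility curve, a common form of vulnerability model:
    P(damage | intensity) = Phi(ln(intensity / median) / beta).

    median and beta (the logarithmic standard deviation) are fitted per
    equipment class; the values used below are purely illustrative.
    """
    return NormalDist().cdf(log(intensity / median) / beta)

# Illustrative: probability that an atmospheric tank is damaged by a 2 m
# flood, assuming a hypothetical median depth of 1.5 m and beta of 0.5.
print(damage_probability(2.0, median=1.5, beta=0.5))
```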

Relevance:

30.00%

Publisher:

Abstract:

A flow injection method for the quantitative analysis of ketoconazole in tablets, based on the reaction with iron(III) ions, is presented. Ketoconazole forms a red complex with iron ions in an acid medium, with maximum absorbance at 495 nm. The detection limit was estimated to be 1×10⁻⁴ mol L⁻¹, the quantitation limit is about 3×10⁻⁴ mol L⁻¹, and approximately 30 determinations can be performed per hour. The results were compared with those obtained with a reference HPLC method. Statistical comparisons were made using the Student's t procedure and the F test. Complete agreement was found at the 0.95 significance level between the proposed flow injection and HPLC procedures. The two methods show similar precision: the mean relative standard deviation was ca. 1.2% for HPLC and ca. 1.6% for FIA.
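
A minimal sketch of the reported statistical comparison (the recovery values are illustrative placeholders, not data from the paper):

```python
import numpy as np
from scipy import stats

# Illustrative paired determinations (% recovery) by the two methods.
fia  = np.array([99.1, 101.3, 98.7, 100.4, 99.8, 100.9])
hplc = np.array([99.5, 100.8, 99.0, 100.1, 99.6, 100.5])

# Student's t procedure: do the two methods give the same mean result?
t_stat, t_p = stats.ttest_ind(fia, hplc)

# F test: do the two methods have comparable precision (variances)?
f_stat = np.var(fia, ddof=1) / np.var(hplc, ddof=1)
f_p = 2 * min(stats.f.sf(f_stat, len(fia) - 1, len(hplc) - 1),
              stats.f.cdf(f_stat, len(fia) - 1, len(hplc) - 1))

print(f"t = {t_stat:.2f} (p = {t_p:.2f}), F = {f_stat:.2f} (p = {f_p:.2f})")
```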

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Open access publishing is becoming increasingly popular within the biomedical sciences. SciELO, the Scientific Electronic Library Online, is a digital library covering a selected collection of Brazilian scientific journals, many of which provide open access to full-text articles. This library includes a number of dental journals, some of which may publish reports of clinical trials in English, Portuguese and/or Spanish. Thus, SciELO could play an important role as a source of evidence for dental healthcare interventions, especially if it yields a sizeable number of high-quality reports. OBJECTIVE: The aim of this study was to identify reports of clinical trials by handsearching dental journals accessible through SciELO, and to assess the overall quality of these reports. MATERIAL AND METHODS: Electronic versions of six Brazilian dental journals indexed in SciELO were handsearched at www.scielo.br in September 2008. Reports of clinical trials were identified and classified as controlled clinical trials (CCTs: prospective, experimental studies comparing two or more healthcare interventions in human beings) or randomized controlled trials (RCTs: a random allocation method is clearly reported), according to Cochrane eligibility criteria. Criteria to assess methodological quality included the method of randomization, concealment of treatment allocation, blinded outcome assessment, handling of withdrawals and losses, and whether an intention-to-treat analysis had been carried out. RESULTS: The search retrieved 33 CCTs and 43 RCTs. The majority of the reports described neither the method of randomization (75.3%) nor the concealment of the allocation sequence (84.2%). Participants and outcome assessors were reported as blinded in only 31.2% of the reports. Withdrawals and losses were clearly described in only 6.5% of the reports, and none mentioned an intention-to-treat analysis or any similar procedure. CONCLUSIONS: The results of this study indicate that a substantial number of reports of trials are available in the dental journals listed in SciELO, and these could provide valuable evidence for clinical decision making. However, the quality of a number of these reports is of some concern, and improvements in the conduct and reporting of these trials could be achieved if authors adhered to internationally accepted guidelines, e.g. the CONSORT statement.