970 results for Statistical Method
Abstract:
BACKGROUND: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods and compared them with newly developed data processing strategies in terms of resolution, precision and robustness. RESULTS: Our results indicate that simple methods that do not rely on the estimation of the efficiency of the PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods based on sigmoidal or exponential curve fitting were generally of both poor resolution and precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and, to a lesser extent, on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. CONCLUSION: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness and reliability. These methods allow reliable estimation of relative expression ratios of two-fold or higher, and our analysis provides an estimate of the number of biological samples that must be analyzed to achieve a given precision.
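For readers who want the arithmetic, the efficiency-based strategies referred to above follow the general efficiency-corrected ratio of Pfaffl (2001); the sketch below is a minimal illustration of that ratio, with hypothetical efficiencies and Ct differences rather than values from the study:

```python
# Minimal sketch of an efficiency-corrected relative expression ratio
# (Pfaffl-style); efficiencies and Ct values below are hypothetical.

def expression_ratio(e_target, e_ref, d_ct_target, d_ct_ref):
    """Relative expression of a target gene versus a reference gene.

    e_*    : amplification efficiency (2.0 = perfect doubling per cycle)
    d_ct_* : Ct(control) - Ct(treated) for that amplicon
    """
    return (e_target ** d_ct_target) / (e_ref ** d_ct_ref)

# Averaged per-amplicon efficiencies, in the spirit of the strategies
# evaluated in the study (illustrative numbers only):
ratio = expression_ratio(e_target=1.95, e_ref=1.98,
                         d_ct_target=3.1, d_ct_ref=0.2)
print(f"relative expression ratio: {ratio:.2f}")  # ~6.9-fold
```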
Abstract:
A regularization method based on the non-extensive maximum entropy principle is devised. Special emphasis is given to the q = 1/2 case. We show that, when the residual principle is considered as a constraint, the q = 1/2 generalized distribution of Tsallis yields a regularized solution for ill-conditioned problems. The regularized distribution thus devised is endowed with a component corresponding to the well-known regularized solution of Tikhonov (1977).
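For context, the Tikhonov solution mentioned above is the standard one; in generic notation (not the paper's), for an ill-conditioned linear problem Ax = b with noise level δ:

```latex
% Standard Tikhonov regularization (background, not the paper's derivation)
x_\lambda = \arg\min_x \left\{ \lVert Ax - b \rVert^2
            + \lambda \lVert x \rVert^2 \right\}
          = \left( A^\top A + \lambda I \right)^{-1} A^\top b,
\qquad \text{with } \lambda \text{ chosen so that }
\lVert A x_\lambda - b \rVert = \delta
\quad \text{(the residual, or discrepancy, principle).}
```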
Abstract:
The application of statistics to science is not a neutral act. Statistical tools have shaped and were also shaped by its objects. In the social sciences, statistical methods fundamentally changed research practice, making statistical inference its centerpiece. At the same time, textbook writers in the social sciences have transformed rivaling statistical systems into an apparently monolithic method that could be used mechanically. The idol of a universal method for scientific inference has been worshipped since the "inference revolution" of the 1950s. Because no such method has ever been found, surrogates have been created, most notably the quest for significant p values. This form of surrogate science fosters delusions and borderline cheating and has done much harm, creating, for one, a flood of irreproducible results. Proponents of the "Bayesian revolution" should be wary of chasing yet another chimera: an apparently universal inference procedure. A better path would be to promote both an understanding of the various devices in the "statistical toolbox" and informed judgment to select among these.
Abstract:
PURPOSE: Proper delineation of ocular anatomy in 3-dimensional (3D) imaging is a major challenge, particularly when developing treatment plans for ocular diseases. Magnetic resonance imaging (MRI) is presently used in clinical practice for diagnosis confirmation and treatment planning for retinoblastoma in infants, where it serves as a source of information complementary to fundus or ultrasonographic imaging. Here we present a framework to fully automatically segment the eye anatomy in MRI based on 3D active shape models (ASM); we validate the results and present a proof of concept for automatically segmenting pathological eyes. METHODS AND MATERIALS: Manual and automatic segmentation were performed on 24 images of healthy children's eyes (3.29 ± 2.15 years of age). Imaging was performed using a 3-T MRI scanner. The ASM comprises the lens, the vitreous humor, the sclera, and the cornea. The model was fitted by first automatically detecting the position of the eye center, the lens, and the optic nerve, and then aligning the model and fitting it to the patient. We validated our segmentation method using leave-one-out cross-validation. The segmentation results were evaluated by measuring the overlap, using the Dice similarity coefficient (DSC), and the mean distance error. RESULTS: We obtained a DSC of 94.90 ± 2.12% for the sclera and the cornea, 94.72 ± 1.89% for the vitreous humor, and 85.16 ± 4.91% for the lens. The mean distance error was 0.26 ± 0.09 mm. The entire process took 14 seconds on average per eye. CONCLUSION: We provide a reliable and accurate tool that enables clinicians to automatically segment the sclera, the cornea, the vitreous humor, and the lens using MRI. We additionally present a proof of concept for fully automatically segmenting eye pathology. This tool reduces the time needed for eye shape delineation and thus can help clinicians when planning eye treatment and confirming the extent of the tumor.
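The Dice similarity coefficient used above is defined as DSC = 2|A ∩ B| / (|A| + |B|); a minimal sketch over binary masks (mask shapes and values are illustrative, not data from the study):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks:
    DSC = 2|A intersect B| / (|A| + |B|), from 0 (no overlap) to 1."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:           # both masks empty: treat as perfect agreement
        return 1.0
    return 2.0 * np.logical_and(a, b).sum() / denom

# Illustrative 3D masks (e.g., manual vs. automatic lens segmentation)
manual = np.zeros((64, 64, 64), dtype=bool)
manual[20:40, 20:40, 20:40] = True
auto = np.zeros((64, 64, 64), dtype=bool)
auto[22:42, 20:40, 20:40] = True
print(f"DSC = {dice_coefficient(manual, auto):.4f}")  # ~0.90 here
```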
Abstract:
A statistical indentation method has been employed to study the hardness of fire-refined high-conductivity copper, using the nanoindentation technique. The Joslin and Oliver approach was used with the aim of separating the hardness (H) contribution of the copper matrix from that of inclusions and grain boundaries. This approach relies on a large array of imprints (around 400 indentations) performed at 150 nm of indentation depth. A statistical study using a cumulative distribution function fit and simulated Gaussian distributions shows that H for each phase can be extracted when the indentation depth is much smaller than the size of the secondary phases. It is found that the thermal treatment produces a hardness increase, due to the partial re-dissolution of the inclusions (mainly Pb and Sn) in the matrix.
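The per-phase extraction can be pictured as deconvolving the hardness histogram into overlapping populations; a minimal sketch using a two-component Gaussian mixture (scikit-learn here; the component count, hardness values and units are illustrative assumptions, not the study's fit):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical hardness values (GPa) from a ~400-imprint grid: a soft
# matrix population plus a harder inclusion/boundary population.
h = np.concatenate([rng.normal(1.2, 0.10, 320),
                    rng.normal(1.8, 0.15, 80)]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(h)
for mean, cov, w in zip(gmm.means_.ravel(),
                        gmm.covariances_.ravel(), gmm.weights_):
    print(f"phase H = {mean:.2f} GPa, sigma = {np.sqrt(cov):.2f}, "
          f"population fraction = {w:.2f}")
```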
Abstract:
Construction of multiple sequence alignments is a fundamental task in bioinformatics. Multiple sequence alignments are used as a prerequisite in many bioinformatics methods, and consequently the quality of such methods can depend critically on the quality of the alignment. However, automatic construction of a multiple sequence alignment for a set of remotely related sequences does not always provide biologically relevant alignments. Therefore, there is a need for an objective approach to evaluating the quality of automatically aligned sequences. The profile hidden Markov model is a powerful approach in comparative genomics. In the profile hidden Markov model, the symbol probabilities are estimated at each conserved alignment position. This can increase the dimension of the parameter space and cause an overfitting problem. These two research problems are both related to conservation. We have developed statistical measures for quantifying the conservation of multiple sequence alignments. Two types of methods are considered: those identifying conserved residues in an alignment position, and those calculating positional conservation scores. The positional conservation score was exploited in a statistical prediction model for assessing the quality of multiple sequence alignments. The residue conservation score was used as part of the emission probability estimation method proposed for profile hidden Markov models. The predicted alignment quality scores correlated highly with the correct alignment quality scores, indicating that our method is reliable for assessing the quality of any multiple sequence alignment. Comparison of the emission probability estimation method with the maximum likelihood method showed that the number of estimated parameters in the model was dramatically decreased, while the same level of accuracy was maintained. To conclude, we have shown that conservation can be successfully used in the statistical model for alignment quality assessment and in the estimation of emission probabilities in profile hidden Markov models.
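As one concrete example of a positional conservation score (a common entropy-based formulation, not necessarily the exact measure developed in the thesis):

```python
import math
from collections import Counter

def positional_conservation(column, alphabet_size=20):
    """Conservation of one alignment column as 1 - H/H_max, where H is
    the Shannon entropy of the residue frequencies (gaps ignored):
    1.0 = fully conserved, ~0.0 = uniform over the amino acid alphabet."""
    residues = [c for c in column.upper() if c != '-']
    if not residues:
        return 0.0
    n = len(residues)
    entropy = -sum((k / n) * math.log2(k / n)
                   for k in Counter(residues).values())
    return 1.0 - entropy / math.log2(alphabet_size)

# One string per alignment column (sequences stacked vertically):
print(positional_conservation("LLLLLLLL"))  # 1.0, fully conserved
print(positional_conservation("LIVMAGST"))  # ~0.31, highly variable
```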
Abstract:
A simple and rapid precipitation titration method was developed and validated to determine the sulfate ion content in indinavir sulfate raw material. A 0.1 mol L⁻¹ lead nitrate volumetric solution was used as titrant, with potentiometric endpoint determination using a lead-specific electrode. The United States Pharmacopoeia Forum indicates a potentiometric method for sulfate ion quantitation using 0.1 mol L⁻¹ lead perchlorate as titrant. Both methods were validated with respect to linearity, precision and accuracy, yielding good results. The sulfate ion contents found by the two validated methods were compared by Student's t-test, indicating no statistically significant difference between the methods.
Abstract:
The identifiability of the parameters of a heat exchanger model without phase change was studied in this Master's thesis using synthetic data. A fast, two-step Markov chain Monte Carlo (MCMC) method was tested on a couple of case studies and on a heat exchanger model. The two-step MCMC method worked well and reduced the computation time compared to the traditional MCMC method. The effect of the measurement accuracy of certain control variables on the identifiability of the parameters was also studied; the accuracy used did not seem to have a notable effect on identifiability. The use of the posterior distribution of the parameters across different heat exchanger geometries was studied. It would be computationally most efficient to use the same posterior distribution among different geometries in the optimisation of heat exchanger networks. According to the results, this was possible when the frontal surface areas were the same among the different geometries. In the other cases the same posterior distribution can still be used for optimisation, but it will yield a wider predictive distribution. For condensing surface heat exchangers, the numerical stability of the simulation model was studied, and a stable algorithm was developed.
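As background on the sampling machinery (a plain random-walk Metropolis sampler, not the two-step variant developed in the thesis; the posterior and parameter values are placeholders):

```python
import numpy as np

def metropolis(log_post, theta0, n_samples=5000, step=0.005, seed=0):
    """Random-walk Metropolis sampler; log_post returns the log posterior."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_samples, theta.size))
    for i in range(n_samples):
        proposal = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(proposal)
        if np.log(rng.random()) < lp_prop - lp:  # accept/reject step
            theta, lp = proposal, lp_prop
        chain[i] = theta
    return chain

# Placeholder Gaussian log posterior around hypothetical heat-transfer
# correlation parameters (illustrative values only):
true = np.array([0.023, 0.8])
log_post = lambda t: -0.5 * np.sum(((t - true) / 0.01) ** 2)
chain = metropolis(log_post, theta0=[0.02, 0.7])
print("posterior mean:", chain[2500:].mean(axis=0))  # near [0.023, 0.8]
```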
Abstract:
A flow injection method for the quantitative analysis of ketoconazole in tablets, based on the reaction with iron(III) ions, is presented. Ketoconazole forms a red complex with iron ions in an acid medium, with maximum absorbance at 495 nm. The detection limit was estimated to be 1×10⁻⁴ mol L⁻¹, the quantitation limit is about 3×10⁻⁴ mol L⁻¹, and approximately 30 determinations can be performed per hour. The results were compared with those obtained with a reference HPLC method. Statistical comparisons were made using Student's t procedure and the F test. Complete agreement was found at the 95% confidence level between the proposed flow injection and HPLC procedures. The two methods show similar precision: the mean relative standard deviation was ca. 1.2% for HPLC and ca. 1.6% for FIA.
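The statistical comparison described (Student's t for the means, F test for the variances) can be reproduced along these lines with scipy; the replicate values below are hypothetical:

```python
import numpy as np
from scipy import stats

# Hypothetical replicate determinations (% of label claim) by each method
fia = np.array([99.2, 100.8, 98.5, 101.1, 99.9, 100.4])
hplc = np.array([99.6, 100.2, 99.1, 100.7, 99.8, 100.3])

# Student's t test for equality of means (equal variances assumed)
t_stat, t_p = stats.ttest_ind(fia, hplc)

# Two-sided F test for equality of variances
f_stat = np.var(fia, ddof=1) / np.var(hplc, ddof=1)
df = len(fia) - 1, len(hplc) - 1
f_p = 2 * min(stats.f.sf(f_stat, *df), stats.f.cdf(f_stat, *df))

print(f"t = {t_stat:.3f} (p = {t_p:.3f}); F = {f_stat:.3f} (p = {f_p:.3f})")
# p > 0.05 for both -> no significant difference at the 95% level
```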
Abstract:
A spectrophotometric method based on the formation of an ion-pair complex between haloperidol and eriochrome black T (EBT) at pH 1.85 is described. The complex formed was extracted quantitatively into chloroform and measured at 510 nm. Infrared (IR) studies were performed to confirm the formation of the ion-pair complex. Beer's law was obeyed in the concentration range of 2.0–9.0 µg mL⁻¹, with a molar absorptivity of 2.67 × 10⁴ L mol⁻¹ cm⁻¹. The detection limit was found to be 0.18 µg mL⁻¹. Statistical comparison of the results of the proposed method with those of the reference method shows excellent agreement and indicates no significant difference in accuracy and precision.
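As a consistency check on the figures reported above, Beer's law with the stated molar absorptivity predicts absorbances in a comfortably measurable range; taking haloperidol's molar mass as ≈ 375.9 g mol⁻¹ and a 1 cm path length (both assumptions, not stated in the abstract):

```latex
% Beer–Lambert law: A = \varepsilon b c
% At the upper end of the linear range, 9.0 µg mL⁻¹:
c = \frac{9.0 \times 10^{-3}\ \mathrm{g\,L^{-1}}}
         {375.9\ \mathrm{g\,mol^{-1}}}
  \approx 2.39 \times 10^{-5}\ \mathrm{mol\,L^{-1}},
\qquad
A = (2.67 \times 10^{4})(1)(2.39 \times 10^{-5}) \approx 0.64 .
```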
Abstract:
In the present study, a reversed-phase high-performance liquid chromatographic (RP-HPLC) procedure was developed and validated for the simultaneous determination of seven water-soluble vitamins (thiamine, riboflavin, niacin, cyanocobalamin, ascorbic acid, folic acid, and p-aminobenzoic acid) and four fat-soluble vitamins (retinol acetate, cholecalciferol, α-tocopherol, and phytonadione) in multivitamin tablets. The linearity of the method was excellent (R² > 0.999) over the concentration range of 10–500 ng mL⁻¹. The statistical evaluation of the method was carried out by determining the intra- and inter-day precision. The accuracy of the method was tested by measuring the average recovery; recoveries ranged from 87.4% to 98.5%, acceptable quantitative results that corresponded with the label claims.
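The linearity and recovery checks described can be sketched with an ordinary least-squares calibration fit (scipy here; the calibration points and spike level are hypothetical):

```python
import numpy as np
from scipy import stats

# Hypothetical calibration: concentrations (ng/mL) vs. peak areas
conc = np.array([10, 50, 100, 200, 300, 400, 500], dtype=float)
area = np.array([1.02e3, 5.10e3, 1.01e4, 2.03e4, 3.00e4, 4.05e4, 5.02e4])

fit = stats.linregress(conc, area)
print(f"slope = {fit.slope:.1f}, intercept = {fit.intercept:.1f}, "
      f"R^2 = {fit.rvalue**2:.4f}")        # linearity criterion R^2 > 0.999

# Recovery of a spiked sample, back-calculated from the calibration line
measured_area = 2.48e4
found = (measured_area - fit.intercept) / fit.slope
print(f"recovery = {100 * found / 250:.1f}% of a 250 ng/mL spike")
```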
Abstract:
A novel, sensitive and relatively selective kinetic method is presented for the determination of V(V), based on its catalytic effect on the oxidation of Ponceau Xylidine by potassium bromate in the presence of 5-sulfosalicylic acid (SSA) as activator. The reaction was monitored spectrophotometrically by measuring the decrease in absorbance of Ponceau Xylidine at 640 nm between 0.5 and 7 min (the fixed-time method) in H₃PO₄ medium at 25 °C. The effects of various parameters, such as the concentrations of H₃PO₄, SSA, bromate and Ponceau Xylidine, temperature and ionic strength, on the net reaction rate were studied. The decrease in absorbance is proportional to the concentration of V(V) over the entire concentration range tested (1–15 ng mL⁻¹), with a detection limit of 0.46 ng mL⁻¹ (according to the statistical 3S_blank/k criterion) and a coefficient of variation (CV) of 1.8% (for ten replicate measurements at the 95% confidence level). The method is free from most interferences, notably from large amounts of V(IV), and suffers only a few, such as Cr(VI) and Hg(II) ions. The method was successfully applied to the determination of V(V) in tap water, drinking water and bottled mineral water samples, and in the certified standard reference material SRM-1640, with satisfactory results. The vanadium contents of the water samples were also determined by FAAS for comparison. The recovery of spiked vanadium(V) was found to be quantitative and the reproducibility was satisfactory. The results for SRM-1640 were in good agreement with the certified value.
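The quoted detection limit follows the common 3S_blank/k definition (three times the blank standard deviation divided by the calibration slope); a minimal sketch with hypothetical blank readings and slope:

```python
import numpy as np

# Hypothetical replicate blank signals and calibration slope k
# (absorbance decrease per ng mL⁻¹); not the study's raw data.
blank_signals = np.array([0.0021, 0.0024, 0.0019, 0.0023, 0.0020,
                          0.0022, 0.0025, 0.0018, 0.0021, 0.0023])
k = 0.0014  # calibration sensitivity

s_blank = blank_signals.std(ddof=1)   # sample standard deviation
lod = 3 * s_blank / k                 # 3*S_blank/k criterion
print(f"LOD = {lod:.2f} ng mL^-1")
```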
Abstract:
Thermal and air conditions inside animal facilities change throughout the day under the influence of the external environment. For statistical and geostatistical analyses to be representative, a large number of points spatially distributed over the facility area must be monitored. This work proposes that the temporal variation of environmental variables of interest for animal production, monitored within an animal facility, can be modeled accurately from discrete-time records. The aim of this study was to develop a numerical method to correct for the temporal variation of these environmental variables, transforming the data so that the observations are independent of the time spent during the measurement. The proposed method adjusts values recorded with time delays toward those expected at the exact moment of interest, as if the data had been measured simultaneously at all spatially distributed points. The correction model was validated for the air temperature variable; the values corrected by the method did not differ, by Tukey's test at 5% significance, from the actual values recorded by data loggers.
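One way such a correction could be realized (an illustrative sketch only, not the paper's model; it assumes the facility-wide temporal trend is tracked by a fixed reference logger and applied as an additive shift to each walked measurement):

```python
import numpy as np

def correct_to_instant(values, times, ref_times, ref_values, t0):
    """Shift spatially distributed readings taken at different times to a
    common instant t0, using a fixed reference sensor to estimate the
    temporal drift (additive correction)."""
    ref_at_t = np.interp(times, ref_times, ref_values)  # drift at each reading
    ref_at_t0 = np.interp(t0, ref_times, ref_values)    # drift at target time
    return np.asarray(values) + (ref_at_t0 - ref_at_t)

# Hypothetical walk through 5 points over 20 min while the barn warms up
times = np.array([0, 5, 10, 15, 20])               # min since walk start
vals = np.array([24.1, 24.6, 25.0, 25.3, 25.9])    # °C at each point
ref_times = np.arange(0, 21)                       # fixed logger, 1-min data
ref_values = 24.0 + 0.08 * ref_times               # steady warming trend
print(correct_to_instant(vals, times, ref_times, ref_values, t0=10))
# -> readings as if all 5 points had been measured at t = 10 min
```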
Abstract:
Radiation balance (Rn) is the fraction of the solar radiation incident on the Earth's surface that is available to drive several natural processes, such as biological metabolism, water loss by vegetated surfaces, temperature variation in farming systems, and organic decomposition. The present study aimed to assess and validate the performance of two estimation models for Rn in Ponta Grossa, Paraná State, Brazil. To this end, radiometric data were collected from 04/01/2008 to 04/30/2011 by an automatic weather station installed at the Experimental Station of the State University of Ponta Grossa. We performed a linear regression analysis comparing net radiometer measurements with Rn estimates obtained from the classical Brunt method and from the proposed method. Both models showed excellent performance, as confirmed by the statistical parameters applied. However, the alternative method has the advantage of requiring only global solar radiation, air temperature, and relative humidity values.
Abstract:
This study aimed to develop a methodology based on multivariate statistical analysis (principal component analysis and cluster analysis) in order to identify the most representative variables in studies of minimum streamflow regionalization, and to optimize the identification of hydrologically homogeneous regions for the Doce river basin. Ten variables referring to the climatic and morphometric characteristics of the river basin were used, individualized for each of the 61 gauging stations: three dependent variables indicative of minimum streamflow (Q7,10, Q90 and Q95), and seven independent variables describing climatic and morphometric characteristics of the basin (total annual rainfall, Pa; total semiannual rainfall of the dry and of the rainy season, Pss and Psc; watershed drainage area, Ad; length of the main river, Lp; total length of the rivers, Lt; and average watershed slope, SL). The principal component analysis indicated that the variable SL was the least representative for the study, and it was therefore discarded. The most representative independent variables were Ad and Psc. The best divisions into hydrologically homogeneous regions for the three studied streamflow characteristics were obtained using the Mahalanobis similarity matrix and the complete linkage clustering method. The cluster analysis enabled the identification of four hydrologically homogeneous regions in the Doce river basin.
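The two stages described (variable screening by principal components, then grouping the gauging stations by Mahalanobis distance with complete linkage) can be sketched as follows; the station-by-variable matrix is a random placeholder, not the Doce basin data:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Placeholder matrix: 61 gauging stations x 7 candidate variables
# (Pa, Pss, Psc, Ad, Lp, Lt, SL), assumed standardized.
X = rng.standard_normal((61, 7))

# 1) PCA screening: variables with the smallest loadings on the leading
#    components are candidates for discarding (as SL was in the study).
pca = PCA().fit(X)
weights = np.abs(pca.components_[:2]).sum(axis=0)
print("variable weights on PC1+PC2:", np.round(weights, 2))

# 2) Complete-linkage clustering on Mahalanobis distances
d = pdist(X, metric="mahalanobis")    # inverse covariance estimated from X
tree = linkage(d, method="complete")
regions = fcluster(tree, t=4, criterion="maxclust")  # 4 homogeneous regions
print("stations per region:", np.bincount(regions)[1:])
```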