21 results for quantitative methods
Abstract:
Cefadroxil is a semi-synthetic first-generation oral cephalosporin used in the treatment of mild to moderate infections of the respiratory and urinary tracts and of the skin and soft tissues. In this work a simple, rapid, economical and sensitive HPLC-UV method is described for the quantitative determination of cefadroxil in human plasma samples using lamivudine as internal standard. Sample pre-treatment was accomplished through protein precipitation with acetonitrile, and chromatographic separation was performed with a mobile phase consisting of a mixture of sodium dihydrogen phosphate monohydrate solution, methanol and acetonitrile in the ratio of 90:8:2 (v/v/v) at a flow rate of 1.0 mL/min. The proposed method is linear from 0.4 to 40.0 µg/mL, and its average recovery is 102.21% for cefadroxil and 97.94% for lamivudine. The method is simple, sensitive, reproducible and less time-consuming for the determination of cefadroxil in human plasma. It can therefore be recommended for pharmacokinetic studies, including bioavailability and bioequivalence studies.
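The calibration and recovery figures quoted above follow from routine internal-standard arithmetic. The following minimal Python sketch illustrates that arithmetic with invented peak-area ratios and concentrations; it is not the authors' data processing, and every numeric value in it is an assumption.

```python
# Hypothetical sketch of internal-standard calibration and recovery; the
# concentrations, peak-area ratios and spiked-sample values are invented.
import numpy as np

# Calibration standards spanning the reported linear range (0.4-40.0 µg/mL).
conc = np.array([0.4, 2.0, 5.0, 10.0, 20.0, 40.0])                # µg/mL (assumed)
area_ratio = np.array([0.021, 0.103, 0.262, 0.518, 1.04, 2.09])   # analyte/IS ratio (assumed)

# Least-squares fit of the calibration line: ratio = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area_ratio, 1)

def back_calculate(ratio):
    """Convert an observed analyte/IS peak-area ratio into a concentration (µg/mL)."""
    return (ratio - intercept) / slope

# Recovery: back-calculated concentration of a spiked plasma sample divided by
# the nominal spiked concentration, expressed as a percentage.
nominal = 10.0                      # µg/mL spiked (assumed)
measured = back_calculate(0.53)     # observed ratio for the spiked sample (assumed)
recovery_pct = 100.0 * measured / nominal
print(f"slope={slope:.4f}, intercept={intercept:.4f}, recovery={recovery_pct:.1f}%")
```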
Abstract:
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between the hard sciences and the humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of the humanities, from which concepts such as dialectics and opposition are given formal mathematical definitions. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in the humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
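As a rough illustration of the bootstrap step described above, the sketch below resamples a small feature matrix with replacement and adds jitter to generate artificial samples. The 7 x 8 matrix, the noise scale and the number of artificial samples are assumptions made for illustration, not the authors' procedure.

```python
# Hedged sketch of bootstrap-style augmentation of a small feature matrix.
# The 7x8 matrix (7 "composers", 8 features), noise scale and sample count
# are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)
original = rng.normal(size=(7, 8))   # placeholder for the 7 x 8 feature matrix

def bootstrap_augment(data, n_artificial=500, noise_scale=0.1):
    """Draw rows with replacement and add Gaussian jitter to create artificial samples."""
    idx = rng.integers(0, data.shape[0], size=n_artificial)
    jitter = rng.normal(scale=noise_scale * data.std(axis=0),
                        size=(n_artificial, data.shape[1]))
    return data[idx] + jitter

artificial = bootstrap_augment(original)
print(artificial.shape)              # (500, 8): hundreds of artificial "composers"
```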
Abstract:
Human mesenchymal stem cells (hMSCs) are adult multipotent cells that have high therapeutic potential due to their immunological properties. They can be isolated from several different tissues, with bone marrow (BM) being the most common source. Because the isolation procedure is invasive, other tissues such as the human umbilical cord vein (UCV) have been considered. However, their interchangeability remains unclear. In the present study, total protein extracts of BM-hMSCs and UCV-hMSCs were quantitatively compared using gel-LC-MS/MS. A previous SAGE analysis of the same cells was re-annotated to enable comparison and combination of the two data sets. We observed a correlation of more than 63% between the proteomic and transcriptomic data. In silico analysis of genes highly expressed in cells of both origins suggests that they can be modulated by microRNAs, which can change protein abundance. Our results showed that MSCs from both tissues share high similarity in metabolic and functional processes relevant to their therapeutic potential, especially immune system processes, response to stimuli, and processes related to the delivery of hMSCs to a given tissue, such as migration and adhesion. Hence, our results support the idea that the more accessible UCV could be a less invasive source of MSCs.
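The proteome-transcriptome comparison above reports a correlation of more than 63%. The sketch below shows one plausible way to compute such a correlation from matched abundance tables; the values and the choice of Spearman's rank correlation are assumptions for illustration, not the authors' pipeline.

```python
# Hypothetical sketch: correlating protein abundances with transcript counts
# for the same genes. Values are invented; the original study used
# gel-LC-MS/MS data and re-annotated SAGE tags.
from scipy.stats import spearmanr

# Matched measurements for the same genes (assumed numbers).
protein_abundance = [12.0, 3.5, 8.1, 0.9, 5.5, 20.3, 2.2]
transcript_tags   = [150,  40,  95,  10,  60,  240,  35]

rho, p_value = spearmanr(protein_abundance, transcript_tags)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3g})")
```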
Abstract:
Background With the development of DNA hybridization microarray technologies, it is nowadays possible to simultaneously assess the expression levels of thousands to tens of thousands of genes. Quantitative comparison of microarrays uncovers distinct patterns of gene expression, which define different cellular phenotypes or cellular responses to drugs. Due to technical biases, normalization of the intensity levels is a prerequisite to performing further statistical analyses. Therefore, choosing a suitable approach for normalization can be critical and deserves judicious consideration. Results Here, we considered three commonly used normalization approaches, namely Loess, Splines and Wavelets, and two non-parametric regression methods that have yet to be used for normalization, namely Kernel smoothing and Support Vector Regression. The results obtained were compared using artificial microarray data and benchmark studies. The results indicate that Support Vector Regression is the most robust to outliers and that Kernel smoothing is the worst normalization technique, while no practical differences were observed between Loess, Splines and Wavelets. Conclusion In light of our results, Support Vector Regression is favored for microarray normalization owing to its robustness in estimating the normalization curve compared with the other methods.
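As an illustration of regression-based normalization of the kind compared in this study, the sketch below fits a Support Vector Regression curve to a simulated MA-plot and subtracts the fitted trend. The simulated intensities and the SVR hyperparameters are assumptions for illustration, not the settings evaluated by the authors.

```python
# Hedged sketch of intensity-dependent normalization with Support Vector
# Regression: fit a smooth trend of M (log-ratio) against A (average log
# intensity) and remove it. Simulated data and SVR parameters are assumptions.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
A = rng.uniform(4, 14, size=1000)                    # average log2 intensity
true_bias = 0.4 * np.sin(A / 2.0)                    # intensity-dependent dye bias
M = true_bias + rng.normal(scale=0.3, size=A.size)   # observed log2 ratios

# Fit the normalization curve M ~ f(A) with an RBF-kernel SVR.
svr = SVR(kernel="rbf", C=1.0, epsilon=0.05, gamma="scale")
svr.fit(A.reshape(-1, 1), M)

M_normalized = M - svr.predict(A.reshape(-1, 1))     # remove the estimated trend
print(f"mean M before: {M.mean():.3f}, after: {M_normalized.mean():.3f}")
```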
Abstract:
Background The present study examined absolute alpha power using quantitative electroencephalography (qEEG) in bilateral temporal and parietal cortices in novice soldiers under the influence of methylphenidate (MPH) during the preparatory aiming period of a practical pistol-shooting task. We anticipated higher bi-hemispheric cortical activation in the preparatory period, relative to the pre-shot baseline, in the methylphenidate group compared with the control group, because methylphenidate has been shown to enhance task-related cognitive functions. Methods Twenty healthy novice soldiers were equally distributed into a control group (CG; n = 10) and a 10 mg MPH group (MG; n = 10) using a randomized, double-blind design. Subjects performed a pistol-shooting task while electroencephalographic activity was acquired. Results We found main effects of group and practice blocks on the behavioral measures, and interactions between group and phases on the electroencephalographic measures for electrodes T3, T4, P3 and P4. Regarding the behavioral measures, the MPH group demonstrated significantly poorer shooting performance than the control group, and significant increases in the scores over practice blocks were found in both groups. Regarding the electroencephalographic data, we observed a significant increase in alpha power over practice blocks, but alpha power was significantly lower in the MPH group than in the control group. Moreover, we observed a significant decrease in alpha power at electrodes T4 and P4 during PTM. Conclusion Although we found no correlation between the behavioral and EEG data, our findings show that MPH did not prevent learning of the task in healthy subjects. However, during the practice blocks (PBs) it also did not favor performance relative to the control group. It seems that the CNS effects of MPH demanded an initial readjustment period of integrated operations related to the sensorimotor system. In other words, MPH seems to provoke a period of initial instability due to a possible modulation of neural activity, which can be explained by lower levels of alpha power (i.e., higher cortical activity). However, after the end of PB1 a new stabilization was established in the neural circuits, owing to repetition of the task, resulting in higher cortical activity during the task. In conclusion, MPH group performance was not initially superior to that of the control group, but eventually exceeded it, albeit without achieving statistical significance.
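The study's central quantity is absolute alpha power at temporal and parietal electrodes. The sketch below shows a common way to estimate absolute alpha power for a single channel with Welch's method; the simulated signal, the 256 Hz sampling rate and the 8-12 Hz band are assumptions, not the study's acquisition settings.

```python
# Hedged sketch: absolute alpha-band power of one EEG channel via Welch's PSD.
# The simulated signal, 256 Hz sampling rate and 8-12 Hz alpha band are
# assumptions for illustration, not the study's acquisition parameters.
import numpy as np
from scipy.signal import welch

fs = 256.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)                # 10 s of data
rng = np.random.default_rng(2)
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 10e-6 * rng.standard_normal(t.size)

# Power spectral density, then integrate over the alpha band (8-12 Hz).
freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
alpha_mask = (freqs >= 8) & (freqs <= 12)
df = freqs[1] - freqs[0]
absolute_alpha_power = psd[alpha_mask].sum() * df   # band power in V^2
print(f"absolute alpha power: {absolute_alpha_power:.3e} V^2")
```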
Abstract:
Introduction Toxoplasmosis may be life-threatening in fetuses and in immune-deficient patients. Conventional laboratory diagnosis of toxoplasmosis is based on the presence of IgM and IgG anti-Toxoplasma gondii antibodies; however, molecular techniques have emerged as alternative tools owing to their increased sensitivity. The aim of this study was to compare the performance of 4 PCR-based methods for the laboratory diagnosis of toxoplasmosis. One hundred pregnant women who seroconverted during pregnancy were included in the study. The definition of cases was based on a 12-month follow-up of the infants. Methods Amniotic fluid samples were subjected to DNA extraction and amplification by the following 4 techniques, all performed with primers for the parasite's B1 gene: conventional PCR, nested-PCR, multiplex-nested-PCR, and real-time PCR. Seven parameters were analyzed: sensitivity (Se), specificity (Sp), positive predictive value (PPV), negative predictive value (NPV), positive likelihood ratio (PLR), negative likelihood ratio (NLR) and efficiency (Ef). Results Fifty-nine of the 100 infants had toxoplasmosis; 42 (71.2%) had IgM antibodies at birth but were asymptomatic, and the remaining 17 cases had non-detectable IgM antibodies but high IgG antibody titers that were associated with retinochoroiditis in 8 (13.5%) cases, abnormal cranial ultrasound in 5 (8.5%) cases, and signs/symptoms suggestive of infection in 4 (6.8%) cases. The conventional PCR assay detected 50 cases (9 false-negatives), nested-PCR detected 58 cases (1 false-negative and 4 false-positives), multiplex-nested-PCR detected 57 cases (2 false-negatives), and real-time PCR detected 58 cases (1 false-negative). Conclusions The real-time PCR assay was the best-performing technique based on the parameters of Se (98.3%), Sp (100%), PPV (100%), NPV (97.6%), PLR (∞), NLR (0.017), and Ef (99%).
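The seven performance parameters reported above follow directly from the counts of true and false positives and negatives. The sketch below computes them from a generic confusion matrix, illustrated with the counts implied by the abstract's real-time PCR results (58 true positives, 1 false negative, 41 true negatives, 0 false positives among the 100 samples).

```python
# Sketch of the standard diagnostic-performance formulas used in the abstract,
# illustrated with the real-time PCR counts implied there.
import math

def diagnostic_metrics(tp, fp, tn, fn):
    """Return Se, Sp, PPV, NPV, PLR, NLR and efficiency as a dict."""
    se = tp / (tp + fn)
    sp = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    plr = se / (1 - sp) if sp < 1 else math.inf   # infinite when there are no false positives
    nlr = (1 - se) / sp
    ef = (tp + tn) / (tp + fp + tn + fn)
    return {"Se": se, "Sp": sp, "PPV": ppv, "NPV": npv, "PLR": plr, "NLR": nlr, "Ef": ef}

print(diagnostic_metrics(tp=58, fp=0, tn=41, fn=1))
# Se=0.983, Sp=1.0, PPV=1.0, NPV=0.976, PLR=inf, NLR=0.017, Ef=0.99
```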