944 results for Quantitative methods
Abstract:
Background: The study of myofiber reorganization in the remote zone after myocardial infarction has been performed in 2D. Microstructural reorganization in remodeled hearts, however, can only be fully appreciated by considering myofibers as continuous 3D entities. The aim of this study was therefore to develop a technique for quantitative 3D diffusion CMR tractography of the heart, and to apply this method to quantify fiber architecture in the remote zone of remodeled hearts. Methods: Diffusion Tensor CMR of normal human, sheep, and rat hearts, as well as infarcted sheep hearts was performed ex vivo. Fiber tracts were generated with a fourth-order Runge-Kutta integration technique and classified statistically by the median, mean, maximum, or minimum helix angle (HA) along the tract. An index of tract coherence was derived from the relationship between these HA statistics. Histological validation was performed using phase-contrast microscopy. Results: In normal hearts, the subendocardial and subepicardial myofibers had a positive and negative HA, respectively, forming a symmetric distribution around the midmyocardium. However, in the remote zone of the infarcted hearts, a significant positive shift in HA was observed. The ratio between negative and positive HA variance was reduced from 0.96 +/- 0.16 in normal hearts to 0.22 +/- 0.08 in the remote zone of the remodeled hearts (p<0.05). This was confirmed histologically by the reduction of HA in the subepicardium from -52.03 degrees +/- 2.94 degrees in normal hearts to -37.48 degrees +/- 4.05 degrees in the remote zone of the remodeled hearts (p < 0.05). Conclusions: A significant reorganization of the 3D fiber continuum is observed in the remote zone of remodeled hearts. The positive (rightward) shift in HA in the remote zone is greatest in the subepicardium, but involves all layers of the myocardium. 
Tractography-based quantification, performed here for the first time in remodeled hearts, may provide a framework for assessing regional changes in the left ventricle following infarction.
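The tract-wise helix-angle (HA) statistics and the negative/positive variance ratio described above can be sketched as follows. The abstract does not give the exact formula for the ratio, so the version below (population variance of negative median HAs over that of positive median HAs) is an assumed, illustrative formulation:

```python
import numpy as np

def ha_statistics(helix_angles):
    """Summarize the helix angle (HA) along one fiber tract by its
    median, mean, maximum and minimum -- the four statistics used to
    classify tracts in the study."""
    ha = np.asarray(helix_angles, dtype=float)
    return {"median": float(np.median(ha)), "mean": float(np.mean(ha)),
            "max": float(np.max(ha)), "min": float(np.min(ha))}

def ha_variance_ratio(median_has):
    """Ratio of the variance of negative to positive median HAs across
    a set of tracts: close to 1 for the symmetric transmural distribution
    of normal hearts, and well below 1 after the rightward HA shift in
    the remote zone. (Illustrative definition, not taken verbatim from
    the paper.)"""
    ha = np.asarray(median_has, dtype=float)
    return float(np.var(ha[ha < 0]) / np.var(ha[ha > 0]))
```

A perfectly symmetric set of median HAs, e.g. `[-30, -10, 10, 30]`, gives a ratio of 1.0, mirroring the 0.96 +/- 0.16 reported for normal hearts.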
Abstract:
Introduction: This research project examined the influence of doctors' speciality on primary health care (PHC) problem solving in Belo Horizonte (BH), Brazil, comparing homeopathic doctors with family health (FH) doctors from both the management's and the patients' viewpoints. In BH, both FH and homeopathic doctors work in PHC. The index of resolvability (IR) is used to compare the resolution of problems by doctors. Methods: The present research compared IR using official data from the Secretariat of Health, test requests made by the doctors, and 482 structured interviews with patients. A total of 217,963 consultations by 14 homeopaths and 67 FH doctors between 1 July 2006 and 30 June 2007 were analysed. Results: The results show significantly greater problem resolution by homeopaths than by FH doctors. Conclusion: In BH, the medical speciality, homeopathy or FH, has an impact on problem solving, from both the managers' and the patients' points of view. Homeopaths request fewer tests and have a better IR than FH doctors. Specialisation in homeopathy is an independent positive factor in problem solving at the PHC level in BH, Brazil. Homeopathy (2012) 101, 44-50.
Abstract:
Purpose: One of the most common problems of the surgical management of Graves upper eyelid retraction is the occurrence of eyelid contour abnormalities. In the present study, the postoperative contour of a large sample of eyelids of patients with Graves orbitopathy was measured. Methods: The postoperative upper eyelid contour of 62 eyes of 43 patients with Graves orbitopathy was subjectively classified by 3 experienced surgeons in 3 categories: poor, fair, and good. The shape of the eyelid contours in each category was then measured with recently developed custom-made software by measuring multiple midpupil eyelid distances every 15 degrees along the palpebral fissure. The upper eyelid contour of 60 normal subjects was also quantified as a control group. Results: The mean ratio between the sum of the lateral and medial midpupil eyelid distances (lateral/medial ratio) was 1.10 +/- 0.11 standard deviation in controls and 1.15 +/- 0.13 standard deviation in patients. Postoperatively, the mean midpupil eyelid distance at 90 degrees was 4.16 +/- 1.13 mm standard deviation. The distribution of lateral/medial ratios of the eyelids judged as having good contours was similar to the distribution of the controls, with a modal value centered on the interval between 1.0 and 1.10. The distribution of lateral/medial ratios of the eyelids judged as having poor contour was bimodal, with eyelids with low and high lateral/medial ratios. Low lateral/medial ratios occurred when there was a lateral overcorrection, giving the eyelid a flat or a medial ptosis appearance. High lateral/medial ratios were due to a central or medial overcorrection or a lateral peak maintenance. Conclusions: Postoperative upper eyelid contour abnormalities can be quantified by comparing the sum of multiple midpupil eyelid distances of the lateral and medial sectors of the eyelid. Low and high lateral/medial ratios are anomalous and judged as unpleasant. (Ophthal Plast Reconstr Surg 2012;28:429-433)
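The lateral/medial ratio described above can be computed directly from midpupil eyelid distances (MPLDs) sampled every 15 degrees. A minimal sketch, assuming the convention that 90 degrees is the midpupil vertical, with angles below 90 medial and above 90 lateral:

```python
def lateral_medial_ratio(mpld_by_angle):
    """mpld_by_angle: {angle in degrees: midpupil eyelid distance in mm}.
    Angles > 90 are treated as lateral and < 90 as medial (assumed
    convention); the ratio is the sum of lateral MPLDs over the sum of
    medial MPLDs, as described in the study."""
    lateral = sum(d for a, d in mpld_by_angle.items() if a > 90)
    medial = sum(d for a, d in mpld_by_angle.items() if a < 90)
    return lateral / medial

# hypothetical contour sampled every 15 degrees along the palpebral fissure
contour = {45: 3.0, 60: 3.5, 75: 4.0, 90: 4.2, 105: 4.1, 120: 3.8, 135: 3.2}
ratio = lateral_medial_ratio(contour)
```

Ratios with a modal value between 1.0 and 1.10 correspond to the "good contour" group reported above; markedly low or high values flag the flat/medial-ptosis or peaked contours, respectively.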
Abstract:
Cefadroxil is a semi-synthetic first-generation oral cephalosporin used in the treatment of mild to moderate infections of the respiratory and urinary tracts, and of skin and soft tissue. In this work a simple, rapid, economical and sensitive HPLC-UV method is described for the quantitative determination of cefadroxil in human plasma samples using lamivudine as internal standard. Sample pre-treatment was accomplished through protein precipitation with acetonitrile, and chromatographic separation was performed with a mobile phase consisting of a mixture of sodium dihydrogen phosphate monohydrate solution, methanol and acetonitrile in the ratio of 90:8:2 (v/v/v) at a flow rate of 1.0 mL/min. The proposed method is linear from 0.4 to 40.0 µg/mL, and its average recovery is 102.21% for cefadroxil and 97.94% for lamivudine. The method is simple, sensitive, reproducible and less time-consuming for the determination of cefadroxil in human plasma. It can therefore be recommended for pharmacokinetic studies, including bioavailability and bioequivalence studies.
Abstract:
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between the hard sciences and the humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in the humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set; with it, hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in the humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
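The bootstrap step, in which hundreds of artificial composers or philosophers are generated from the original 7-by-8 feature matrix, can be sketched as below. The exact resampling-plus-perturbation scheme is not specified in the abstract, so the Gaussian perturbation here is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_samples(features, n_artificial=500, noise_scale=0.1):
    """features: (n_subjects, n_features) array (here 7 x 8).
    Each artificial subject is an original subject drawn with replacement
    plus small Gaussian noise -- a simplified stand-in for the smoothing
    used in the study (the exact scheme is not given in the abstract)."""
    features = np.asarray(features, dtype=float)
    idx = rng.integers(0, len(features), size=n_artificial)
    noise = rng.normal(0.0, noise_scale,
                       size=(n_artificial, features.shape[1]))
    return features[idx] + noise
```

The enlarged sample can then be fed to the multivariate statistics and pattern recognition steps without the bias of a 7-subject data set.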
Abstract:
Human mesenchymal stem cells (hMSCs) are adult multipotent cells that have high therapeutic potential due to their immunological properties. They can be isolated from several different tissues with bone marrow (BM) being the most common source. Because the isolation procedure is invasive, other tissues such as human umbilical cord vein (UCV) have been considered. However, their interchangeability remains unclear. In the present study, total protein extracts of BM-hMSCs and UCV-hMSCs were quantitatively compared using gel-LC-MS/MS. Previous SAGE analysis of the same cells was re-annotated to enable comparison and combination of these two data sets. We observed a more than 63% correlation between proteomic and transcriptomic data. In silico analysis of highly expressed genes in cells of both origins suggests that they can be modulated by microRNA, which can change protein abundance. Our results showed that MSCs from both tissues shared high similarity in metabolic and functional processes relevant to their therapeutic potential, especially in the immune system process, response to stimuli, and processes related to the delivery of the hMSCs to a given tissue, such as migration and adhesion. Hence, our results support the idea that the more accessible UCV could be a potentially less invasive source of MSCs.
Abstract:
Abstract Background With the development of DNA hybridization microarray technologies, it is nowadays possible to simultaneously assess the expression levels of thousands to tens of thousands of genes. Quantitative comparison of microarrays uncovers distinct patterns of gene expression, which define different cellular phenotypes or cellular responses to drugs. Due to technical biases, normalization of the intensity levels is a prerequisite to performing further statistical analyses. Therefore, choosing a suitable approach for normalization can be critical, deserving judicious consideration. Results Here, we considered three commonly used normalization approaches, namely Loess, Splines and Wavelets, and two non-parametric regression methods which have yet to be used for normalization, namely kernel smoothing and Support Vector Regression. The results obtained were compared using artificial microarray data and benchmark studies. They indicate that Support Vector Regression is the most robust to outliers and that kernel smoothing is the worst normalization technique, while no practical differences were observed between Loess, Splines and Wavelets. Conclusion In light of these results, Support Vector Regression is favored for microarray normalization due to its robustness in estimating the normalization curve.
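The normalization setting compared above can be sketched generically: compute the M (log-ratio) and A (average log-intensity) values of a two-channel array, fit the intensity-dependent trend M ~ f(A) with some regressor, and subtract it. The smoother below is a trivial median-centering placeholder; the study's favored variant would plug a Support Vector Regression fit (e.g. via `sklearn.svm.SVR`, a hypothetical library choice) into the same slot:

```python
import numpy as np

def normalize_ma(red, green, smoother):
    """Intensity-dependent normalization of a two-channel microarray:
    M = log2(red/green), A = mean log2 intensity; fit M ~ f(A) with any
    regressor `smoother(A, M) -> fitted M` and subtract the trend."""
    red, green = np.asarray(red, float), np.asarray(green, float)
    M = np.log2(red) - np.log2(green)
    A = 0.5 * (np.log2(red) + np.log2(green))
    return M - smoother(A, M)

def median_smoother(A, M):
    # placeholder regressor: a constant (global median) fit; an SVR fit
    # over A would replace this for the approach favored in the study
    return np.full_like(M, np.median(M))
```

With a uniform two-fold dye bias (red = 2 x green), M is 1 everywhere and even the constant smoother centers it back to 0.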
Abstract:
Abstract Background The present study examined absolute alpha power using quantitative electroencephalogram (qEEG) over bilateral temporal and parietal cortices in novice soldiers under the influence of methylphenidate (MPH) during the preparatory aiming period of a practical pistol-shooting task. We anticipated higher bi-hemispheric cortical activation in the preparatory period relative to the pre-shot baseline in the methylphenidate group compared with the control group, because methylphenidate has been shown to enhance task-related cognitive functions. Methods Twenty healthy, novice soldiers were equally distributed into control (CG; n = 10) and MPH 10 mg (MG; n = 10) groups using a randomized, double-blind design. Subjects performed a pistol-shooting task while electroencephalographic activity was acquired. Results We found main effects of group and practice blocks on behavioral measures, and interactions between group and phases on electroencephalographic measures for electrodes T3, T4, P3 and P4. Regarding the behavioral measures, the MPH group demonstrated significantly poorer shooting performance than the control group, and significant increases in the scores over practice blocks were found in both groups. Regarding the electroencephalographic data, we observed a significant increase in alpha power over practice blocks, but alpha power was significantly lower in the MPH group than in the placebo group. Moreover, we observed a significant decrease in alpha power at electrodes T4 and P4 during PTM. Conclusion Although we found no correlation between behavioral and EEG data, our findings show that MPH did not prevent learning of the task in healthy subjects. However, during the practice blocks (PBs) it also did not favor performance compared with the control group.
It seems that the CNS effects of MPH demanded an initial readjustment period of integrated operations related to the sensorimotor system. In other words, MPH seems to provoke a period of initial instability due to a possible modulation of neural activity, reflected in lower levels of alpha power (i.e., higher cortical activity). However, after the end of PB1 a new stabilization was established in the neural circuits, due to repetition of the task, resulting in higher cortical activity during the task. In conclusion, MPH group performance was not initially superior to that of the control group, but eventually exceeded it, albeit without achieving statistical significance.
Abstract:
Introduction Toxoplasmosis may be life-threatening in fetuses and in immune-deficient patients. Conventional laboratory diagnosis of toxoplasmosis is based on the presence of IgM and IgG anti-Toxoplasma gondii antibodies; however, molecular techniques have emerged as alternative tools due to their increased sensitivity. The aim of this study was to compare the performance of 4 PCR-based methods for the laboratory diagnosis of toxoplasmosis. One hundred pregnant women who seroconverted during pregnancy were included in the study. The definition of cases was based on a 12-month follow-up of the infants. Methods Amniotic fluid samples were submitted to DNA extraction and amplification by the following 4 Toxoplasma techniques performed with parasite B1 gene primers: conventional PCR, nested-PCR, multiplex-nested-PCR, and real-time PCR. Seven parameters were analyzed: sensitivity (Se), specificity (Sp), positive predictive value (PPV), negative predictive value (NPV), positive likelihood ratio (PLR), negative likelihood ratio (NLR) and efficiency (Ef). Results Fifty-nine of the 100 infants had toxoplasmosis; 42 (71.2%) had IgM antibodies at birth but were asymptomatic, and the remaining 17 cases had non-detectable IgM antibodies but high IgG antibody titers, associated with retinochoroiditis in 8 (13.5%) cases, abnormal cranial ultrasound in 5 (8.5%) cases, and signs/symptoms suggestive of infection in 4 (6.8%) cases. The conventional PCR assay detected 50 cases (9 false-negatives), nested-PCR detected 58 cases (1 false-negative and 4 false-positives), multiplex-nested-PCR detected 57 cases (2 false-negatives), and real-time PCR detected 58 cases (1 false-negative). Conclusions The real-time PCR assay was the best-performing technique based on the parameters of Se (98.3%), Sp (100%), PPV (100%), NPV (97.6%), PLR (∞), NLR (0.017), and Ef (99%).
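The seven parameters compared above follow from a standard 2x2 confusion matrix. The sketch below uses the standard definitions and reproduces, for instance, the real-time PCR figures (58 true positives, 1 false negative, 0 false positives, 41 true negatives):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard test-performance parameters from a 2x2 confusion matrix:
    tp/fp = true/false positives, fn/tn = false/true negatives."""
    se = tp / (tp + fn)                              # sensitivity
    sp = tn / (tn + fp)                              # specificity
    ppv = tp / (tp + fp)                             # positive predictive value
    npv = tn / (tn + fn)                             # negative predictive value
    plr = se / (1 - sp) if sp < 1 else float("inf")  # positive likelihood ratio
    nlr = (1 - se) / sp                              # negative likelihood ratio
    ef = (tp + tn) / (tp + fp + fn + tn)             # efficiency (accuracy)
    return {"Se": se, "Sp": sp, "PPV": ppv, "NPV": npv,
            "PLR": plr, "NLR": nlr, "Ef": ef}
```

`diagnostic_metrics(58, 0, 1, 41)` yields Se ≈ 0.983, Sp = 1.0, PPV = 1.0, NPV ≈ 0.976, PLR = ∞, NLR ≈ 0.017 and Ef = 0.99, matching the values reported above for real-time PCR.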
Abstract:
Type Ia supernovae have been successfully used as standardized candles to study the expansion history of the Universe. In the past few years, these studies led to the exciting result of an accelerated expansion caused by the repelling action of some sort of dark energy. This result has been confirmed by measurements of the cosmic microwave background radiation, the large-scale structure, and the dynamics of galaxy clusters. The combination of all these experiments points to a “concordance model” of the Universe with flat large-scale geometry and a dominant component of dark energy. However, several points related to supernova measurements need careful analysis in order to establish the validity of the concordance model beyond doubt. As the amount and quality of data increase, the need to control possible systematic effects which may bias the results becomes crucial. Also important is improving our knowledge of the physics of supernova events, to secure, and possibly refine, their calibration as standardized candles. This thesis addresses some of those issues through the quantitative analysis of supernova spectra. Emphasis is placed on careful treatment of the data and on the definition of spectral measurement methods. The comparison of measurements for a large set of spectra from nearby supernovae is used to study their homogeneity and to search for spectral parameters which may further refine the calibration of the standardized candle. One such parameter is found to reduce the dispersion in the distance estimation of a sample of supernovae to below 6%, a precision comparable with the current lightcurve-based calibration, obtained in an independent manner. Finally, the comparison of spectral measurements from nearby and distant objects is used to test for possible evolution of the intrinsic brightness of type Ia supernovae with cosmic time.
Abstract:
In this thesis some multivariate spectroscopic methods for the analysis of solutions are proposed. Spectroscopy and multivariate data analysis form a powerful combination for obtaining both quantitative and qualitative information and it is shown how spectroscopic techniques in combination with chemometric data evaluation can be used to obtain rapid, simple and efficient analytical methods. These spectroscopic methods consisting of spectroscopic analysis, a high level of automation and chemometric data evaluation can lead to analytical methods with a high analytical capacity, and for these methods, the term high-capacity analysis (HCA) is suggested. It is further shown how chemometric evaluation of the multivariate data in chromatographic analyses decreases the need for baseline separation. The thesis is based on six papers and the chemometric tools used are experimental design, principal component analysis (PCA), soft independent modelling of class analogy (SIMCA), partial least squares regression (PLS) and parallel factor analysis (PARAFAC). The analytical techniques utilised are scanning ultraviolet-visible (UV-Vis) spectroscopy, diode array detection (DAD) used in non-column chromatographic diode array UV spectroscopy, high-performance liquid chromatography with diode array detection (HPLC-DAD) and fluorescence spectroscopy. The methods proposed are exemplified in the analysis of pharmaceutical solutions and serum proteins. In Paper I a method is proposed for the determination of the content and identity of the active compound in pharmaceutical solutions by means of UV-Vis spectroscopy, orthogonal signal correction and multivariate calibration with PLS and SIMCA classification. Paper II proposes a new method for the rapid determination of pharmaceutical solutions by the use of non-column chromatographic diode array UV spectroscopy, i.e. a conventional HPLC-DAD system without any chromatographic column connected. 
In Paper III an investigation is made of the ability of a control sample of known content and identity to diagnose and correct errors in multivariate predictions, something that, together with the use of multivariate residuals, can make it possible to use the same calibration model over time. In Paper IV a method is proposed for the simultaneous determination of serum proteins with fluorescence spectroscopy and multivariate calibration. Paper V proposes a method for the determination of chromatographic peak purity by means of PCA of HPLC-DAD data. In Paper VI, PARAFAC is applied to decompose the DAD data of some partially separated peaks into the pure chromatographic, spectral and concentration profiles.
Abstract:
This thesis is based on five papers addressing variance reduction in different ways. The papers have in common that they all present new numerical methods. Paper I investigates quantitative structure-retention relationships from an image processing perspective, using an artificial neural network to preprocess three-dimensional structural descriptions of the studied steroid molecules. Paper II presents a new method for computing free energies. Free energy is the quantity that determines chemical equilibria and partition coefficients. The proposed method may be used for estimating, e.g., chromatographic retention without performing experiments. Two papers (III and IV) deal with correcting deviations from bilinearity by so-called peak alignment. Bilinearity is a theoretical assumption about the distribution of instrumental data that is often violated by measured data. Deviations from bilinearity lead to increased variance, both in the data and in inferences from the data, unless invariance to the deviations is built into the model, e.g., by the use of the method proposed in Paper III and extended in Paper IV. Paper V addresses a generic problem in classification, namely how to measure the goodness of different data representations so that the best classifier may be constructed. Variance reduction is one of the pillars on which analytical chemistry rests. This thesis considers two aspects of variance reduction: before and after experiments are performed. Before experimenting, theoretical predictions of experimental outcomes may be used to decide which experiments to perform, and how to perform them (Papers I and II). After experiments are performed, the variance of inferences from the measured data is affected by the method of data analysis (Papers III-V).
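Papers III and IV concern peak alignment; a minimal stand-in for the idea is to shift each measured profile by the integer lag that maximizes its cross-correlation with a reference. The actual methods are more elaborate, and this circular-shift version is only illustrative:

```python
import numpy as np

def align_peak(reference, signal):
    """Align `signal` to `reference` by the integer shift that maximizes
    their cross-correlation. np.roll wraps samples around at the edges,
    a simplification acceptable only when the peak sits well inside the
    window."""
    ref = np.asarray(reference, dtype=float)
    sig = np.asarray(signal, dtype=float)
    corr = np.correlate(sig, ref, mode="full")
    shift = int(np.argmax(corr)) - (len(ref) - 1)
    return np.roll(sig, -shift)
```

Aligning retention-time-shifted profiles this way restores the bilinear structure that downstream multivariate models assume.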
Abstract:
This thesis is focused on the development of heteronuclear correlation methods in solid-state NMR spectroscopy, where the spatial dependence of the dipolar coupling is exploited to obtain structural and dynamical information on solids. Quantitative results on dipolar coupling constants are extracted by means of spinning-sideband analysis in the indirect dimension of the two-dimensional experiments. The principles of sideband analysis were established, and are currently widely used, in the group of Prof. Spiess for the special case of homonuclear 1H double-quantum spectroscopy. The generalization of these principles to the heteronuclear case is presented, with special emphasis on naturally abundant 13C-1H systems. For proton spectroscopy in the solid state, line-narrowing is of particular importance, and is achieved here by very fast magic-angle spinning (MAS), with frequencies up to 35 kHz. As a consequence, the heteronuclear dipolar couplings are suppressed and have to be recoupled in order to achieve an efficient excitation of the observed multiple-quantum modes. Heteronuclear recoupling is most straightforwardly accomplished by performing the known REDOR experiment, in which π-pulses are applied every half rotor period. This experiment was modified by the insertion of an additional spectroscopic dimension, such that heteronuclear multiple-quantum experiments can be carried out which, as shown experimentally and theoretically, closely resemble homonuclear double-quantum experiments. Variants are presented which are well suited for the recording of high-resolution 13C-1H shift-correlation and spinning-sideband spectra, by means of which spatial proximities and quantitative dipolar coupling constants, respectively, of heteronuclear spin pairs can be determined. Spectral editing of 13C spectra is shown to be feasible with these techniques. Moreover, order phenomena and dynamics in columnar mesophases with 13C in natural abundance were investigated.
Two further modifications of the REDOR concept allow the correlation of 13C with quadrupolar nuclei, such as 2H. The spectroscopic handling of these nuclei is challenging in that they cover large frequency ranges, and with the new experiments it is shown how the excitation problem can be tackled or circumvented altogether, respectively. As an example, one of the techniques is used for the identification of a yet unknown motional process of the H-bonded protons in the crystalline parts of poly(vinyl alcohol).
Abstract:
Great strides have been made in the last few years in the pharmacological treatment of neuropsychiatric disorders, with the introduction into therapy of several new and more efficient agents, which have improved the quality of life of many patients. Despite these advances, a large percentage of patients are still considered “non-responders” to therapy, deriving no benefit from it. Moreover, these patients have a peculiar therapeutic profile, due to the very frequent use of polypharmacy in the attempt to obtain satisfactory remission of the multiple aspects of psychiatric syndromes. Therapy is heavily individualised, and switching from one therapeutic agent to another is quite frequent. One of the main problems of this situation is the possibility of unwanted or unexpected pharmacological interactions, which can occur both during polypharmacy and during switching. Simultaneous administration of psychiatric drugs can easily lead to interactions if one of the administered compounds influences the metabolism of the others. Impaired CYP450 function due to inhibition of the enzyme is frequent, and other metabolic pathways, such as glucuronidation, can also be affected. Therapeutic Drug Monitoring (TDM) of psychotropic drugs is an important tool for treatment personalisation and optimisation. It deals with the determination of the plasma levels of parent drugs and metabolites, in order to monitor them over time and to compare these findings with clinical data. This allows chemical-clinical correlations to be established (such as those between administered dose and therapeutic and side effects), which are essential to obtain maximum therapeutic efficacy while minimising side and toxic effects. Hence the importance of developing sensitive and selective analytical methods for the determination of the administered drugs and their main metabolites, in order to obtain reliable data that can correctly support clinical decisions. During the three years of the Ph.D.
program, some analytical methods based on HPLC have been developed, validated and successfully applied to the TDM of psychiatric patients undergoing treatment with drugs belonging to the following classes: antipsychotics, antidepressants and anxiolytic-hypnotics. The biological matrices processed were blood, plasma, serum, saliva, urine, hair and rat brain. Among antipsychotics, both atypical and classical agents have been considered, such as haloperidol, chlorpromazine, clotiapine, loxapine, risperidone (and 9-hydroxyrisperidone), clozapine (as well as N-desmethylclozapine and clozapine N-oxide) and quetiapine. While the need for accurate TDM of schizophrenic patients is increasingly recognized by psychiatrists, only in the last few years has the same attention been paid to the TDM of depressed patients. This is leading to the acknowledgment that depression pharmacotherapy can greatly benefit from the accurate application of TDM. For this reason, the research activity has also been focused on first- and second-generation antidepressant agents, such as tricyclic antidepressants, trazodone and m-chlorophenylpiperazine (m-cpp), paroxetine and its three main metabolites, venlafaxine and its active metabolite, and the most recent antidepressant introduced onto the market, duloxetine. Among anxiolytic-hypnotics, benzodiazepines are very often involved in the pharmacotherapy of depression for the relief of anxious components; for this reason, it is useful to monitor these drugs, especially in cases of polypharmacy. The results obtained during these three years of the Ph.D. program are reliable, and the developed HPLC methods are suitable for the qualitative and quantitative determination of CNS drugs in biological fluids for TDM purposes.
Abstract:
Here I will focus on three main topics that best represent the projects I have been working on during the three-year PhD period that I spent in different research laboratories, addressing both computationally and practically important problems, all related to modern molecular genomics. The first topic is the use of a livestock species (the pig) as a model of obesity, a complex human dysfunction. My efforts here concern the detection and annotation of Single Nucleotide Polymorphisms (SNPs). I developed a pipeline for mining human and porcine sequences: starting from a set of human genes related to obesity, the platform returns a list of annotated porcine SNPs extracted from a new set of potential obesity genes. 565 of these SNPs were analyzed on an Illumina chip to test their involvement in obesity in a population of more than 500 pigs. Results will be discussed. All the computational analyses and experiments were done in collaboration with the Biocomputing group and Dr. Luca Fontanesi, respectively, under the direction of Prof. Rita Casadio at the University of Bologna, Italy. The second topic concerns developing a methodology, based on Factor Analysis, to simultaneously mine information from different levels of biological organization. With specific test cases, we develop models of the complexity of the mRNA-miRNA molecular interaction in brain tumors, measured indirectly by microarray and quantitative PCR. This work was done under the supervision of Prof. Christine Nardini at the “CAS-MPG Partner Institute for Computational Biology” in Shanghai, China (co-founded jointly by the Max Planck Society and the Chinese Academy of Sciences). The third topic concerns the development of a new method to overcome the variety of PCR technologies routinely adopted to characterize unknown flanking DNA regions of a viral integration locus of the human genome after clinical gene therapy.
This new method is entirely based on next-generation sequencing; it reduces the time required to detect insertion sites and decreases the complexity of the procedure. This work was done in collaboration with the group of Dr. Manfred Schmidt at the Nationales Centrum für Tumorerkrankungen (Heidelberg, Germany), supervised by Dr. Annette Deichmann and Dr. Ali Nowrouzi. Furthermore, I include as an Appendix the description of an R package for gene network reconstruction that I helped to develop for scientific use (http://www.bioconductor.org/help/bioc-views/release/bioc/html/BUS.html).