932 results for Mini-scale method
Abstract:
In situ and simultaneous measurement of the three most abundant isotopologues of methane using mid-infrared laser absorption spectroscopy is demonstrated. A field-deployable, autonomous platform is realized by coupling a compact quantum cascade laser absorption spectrometer (QCLAS) to a preconcentration unit, called trace gas extractor (TREX). This unit enhances CH4 mole fractions by a factor of up to 500 above ambient levels and quantitatively separates interfering trace gases such as N2O and CO2. The analytical precision of the QCLAS isotope measurement on the preconcentrated (750 ppm, parts-per-million, µmole mole−1) methane is 0.1 and 0.5 ‰ for δ13C- and δD-CH4 at 10 min averaging time. Based on repeated measurements of compressed air during a 2-week intercomparison campaign, the repeatability of the TREX–QCLAS was determined to be 0.19 and 1.9 ‰ for δ13C- and δD-CH4, respectively. In this intercomparison campaign the new in situ technique is compared to isotope-ratio mass spectrometry (IRMS) based on glass flask and bag sampling and to real-time CH4 isotope analysis by two commercially available laser spectrometers. Both laser-based analyzers were limited to methane mole fraction and δ13C-CH4 analysis, and only one of them, a cavity ring-down spectrometer, was capable of delivering meaningful data on the isotopic composition. After correcting for scale offsets, the average differences between TREX–QCLAS data and bag/flask sampling–IRMS values are within the extended WMO compatibility goals of 0.2 and 5 ‰ for δ13C- and δD-CH4, respectively. This also demonstrates the potential to improve interlaboratory compatibility based on the analysis of a reference air sample with accurately determined isotopic composition.
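The δ13C and δD values above are reported in delta notation relative to a reference ratio. As a reminder of the convention, a minimal generic helper (not part of the TREX–QCLAS software; the reference ratio and the measured ratio below are assumed, illustrative values):

```python
# Generic per-mil delta-notation helper; R_VPDB is an assumed literature
# value for the 13C/12C ratio of the VPDB reference, not from this study.
def delta_permil(r_sample, r_reference):
    """Delta value in per mil of a measured isotope ratio."""
    return (r_sample / r_reference - 1.0) * 1000.0

R_VPDB = 0.0111802                        # assumed 13C/12C of VPDB
d13c = delta_permil(0.0110700, R_VPDB)    # hypothetical measured ratio
print(round(d13c, 2))                     # a value near -9.86 per mil
```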
Abstract:
BACKGROUND: There is increasing evidence that a history of childhood abuse and neglect is not uncommon among individuals who experience mental disorder and that childhood trauma experiences are associated with adult psychopathology. Although several interview and self-report instruments for retrospective trauma assessment have been developed, many focus on sexual abuse (SexAb) rather than on multiple types of trauma or adversity. METHODS: Within the European Prediction of Psychosis Study, the Trauma and Distress Scale (TADS) was developed as a new self-report assessment of multiple types of childhood trauma and distressing experiences. The TADS includes 43 items and, following previous measures including the Childhood Trauma Questionnaire, focuses on five core domains: emotional neglect (EmoNeg), emotional abuse (EmoAb), physical neglect (PhyNeg), physical abuse (PhyAb), and SexAb. This study explores the psychometric properties of the TADS (internal consistency and concurrent validity) in 692 participants drawn from the general population who completed a mailed questionnaire, including the TADS, a depression self-report, and questions on help-seeking for mental health problems. Inter-method reliability was examined in a random sample of 100 responders who were reassessed in telephone interviews. RESULTS: After minor revisions of PhyNeg and PhyAb, internal consistencies were good for the TADS total and the domain raw score sums. Intra-class coefficients for the TADS total score and the five revised core domains were all good to excellent when compared to the interviewed TADS as a gold standard. In the concurrent validity analyses, the total TADS and all of its core domains were significantly associated with depression and with help-seeking for mental health problems as proxy measures for traumatisation. In addition, robust cutoffs for the total TADS and its domains were calculated.
CONCLUSIONS: Our results suggest that the TADS is a valid, reliable, and clinically useful instrument for assessing retrospectively reported childhood traumatisation.
Abstract:
Pharmacokinetic and pharmacodynamic properties of a chiral drug can differ significantly between application of the racemate and of the single enantiomers. During drug development, the characteristics of candidate compounds have to be assessed prior to clinical testing. Since biotransformation significantly influences drug action in an organism, metabolism studies represent a crucial part of such tests. Hence, an optimized and economical capillary electrophoretic method for on-line studies of enantioselective drug metabolism mediated by cytochrome P450 enzymes was developed. It comprises a diffusion-based procedure, which enables mixing of the enzyme with virtually any compound inside the nanoliter-scale capillary reactor without the need for additional optimization of mixing conditions. For CYP3A4, with ketamine as probe substrate and highly sulfated γ-cyclodextrin as chiral selector, improved separation conditions for the ketamine and norketamine enantiomers compared to a previously published electrophoretically mediated microanalysis method were elucidated. The new approach was thoroughly validated for the CYP3A4-mediated N-demethylation pathway of ketamine and applied to the determination of its kinetic parameters and its inhibition characteristics in the presence of ketoconazole and dexmedetomidine. The determined parameters were found to be comparable to literature data obtained with different techniques. The presented method constitutes a miniaturized and cost-effective tool, which should be suitable for assessing the stereoselective aspects of kinetic and inhibition studies of cytochrome P450-mediated metabolic steps within the early stages of development of a new drug.
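Kinetic parameters such as those mentioned for the N-demethylation of ketamine are typically obtained by fitting a Michaelis–Menten model to rate-versus-concentration data. A generic fitting sketch on synthetic numbers (none of the values are from the study):

```python
# Generic Michaelis-Menten fitting sketch on synthetic data; the Vmax and
# Km values below are illustrative, not the paper's results.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    # v = Vmax * S / (Km + S)
    return vmax * s / (km + s)

S = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])   # substrate conc.
v = michaelis_menten(S, 10.0, 30.0)                    # noiseless synthetic rates
(vmax_fit, km_fit), _ = curve_fit(michaelis_menten, S, v, p0=(1.0, 1.0))
print(round(vmax_fit, 2), round(km_fit, 2))            # recovers 10.0 and 30.0
```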
Abstract:
Random Forests™ is reported to be one of the most accurate classification algorithms in complex data analysis. It shows excellent performance even when most predictors are noisy and the number of variables is much larger than the number of observations. In this thesis, Random Forests was applied to a large-scale lung cancer case-control study. A novel way of automatically selecting prognostic factors was proposed, and a synthetic positive control was used to validate the Random Forests method. Throughout this study we showed that Random Forests can deal with a large number of weak input variables without overfitting and can account for non-additive interactions between these input variables. Random Forests can also be used for variable selection without being adversely affected by collinearities. Random Forests can handle large-scale data sets without rigorous data preprocessing, and it has a robust variable importance ranking measure. We propose a novel variable selection method, in the context of Random Forests, that uses the data noise level as the cut-off value to determine the subset of important predictors. This new approach enhances the ability of the Random Forests algorithm to automatically identify important predictors in complex data. The cut-off value can also be adjusted based on the results of the synthetic positive control experiments. When the data set had a high variables-to-observations ratio, Random Forests complemented the established logistic regression. This study suggests that Random Forests is well suited to such high-dimensional data: one can use Random Forests to select the important variables and then use logistic regression or Random Forests itself to estimate the effect sizes of the predictors and to classify new observations. We also found that the mean decrease of accuracy is a more reliable variable ranking measure than the mean decrease of Gini.
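The noise-level cutoff idea above can be sketched as follows, assuming one appends pure-noise predictors and keeps only real predictors whose permutation importance (a mean-decrease-of-accuracy-style measure) beats the largest importance among the noise columns; the data, thresholds, and forest settings are all illustrative, not the thesis code:

```python
# Sketch of variable selection with a synthetic-noise cutoff: real
# predictors are kept only if their importance exceeds every pure-noise
# predictor's importance (all settings illustrative).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)
noise = rng.standard_normal((X.shape[0], 20))       # pure-noise block
X_aug = np.hstack([X, noise])

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_aug, y)
imp = permutation_importance(rf, X_aug, y, n_repeats=10,
                             random_state=0).importances_mean
cutoff = imp[X.shape[1]:].max()                     # the data noise level
selected = [j for j in range(X.shape[1]) if imp[j] > cutoff]
print(selected)                                     # indices of kept predictors
```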
Abstract:
In biomedical studies, the most common data structures are the matched (paired) and unmatched designs. Recently, many researchers have become interested in meta-analysis to obtain a better understanding of a medical treatment from several sets of clinical data. The hybrid design, which combines the two data structures, raises fundamental questions for statistical methods and challenges for statistical inference. The applicable methods depend on the underlying distribution: if the outcomes are normally distributed, we would use the classic paired and two-independent-sample T-tests on the matched and unmatched cases; if not, we can apply the Wilcoxon signed-rank and rank-sum tests to each case. To assess an overall treatment effect in a hybrid design, we can apply the inverse variance weight method used in meta-analysis. In the nonparametric case, we can use a test statistic that combines the two Wilcoxon test statistics. However, these two statistics are not on the same scale. We propose the Hybrid Test Statistic based on the Hodges-Lehmann estimates of the treatment effects, which are medians on the same scale. To compare the proposed method, we use the classic meta-analysis T-test statistic, which combines the estimates of the treatment effects from the two T-test statistics. Theoretically, the efficiency of two unbiased estimators of a parameter is the ratio of their variances. Using the concept of Asymptotic Relative Efficiency (ARE) developed by Pitman, we derive the ARE of the hybrid test statistic relative to the classic meta-analysis T-test statistic using the Hodges-Lehmann estimators associated with the two test statistics. From several simulation studies, we calculate the empirical type I error rate and power of the test statistics. The proposed statistic would provide an effective tool to evaluate and understand treatment effects in various public health studies as well as clinical trials.
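The combination step can be sketched numerically: Hodges-Lehmann estimates on the matched and unmatched parts of a hybrid design, pooled with inverse-variance weights. Everything below is an illustrative sketch on synthetic data (the bootstrap variance estimate and all sample sizes are assumptions, not the thesis method):

```python
# Hodges-Lehmann estimates on the matched (paired differences) and
# unmatched (two-arm) parts of a hybrid design, combined with
# inverse-variance weights; variances come from a simple bootstrap.
import numpy as np

def hl_paired(d):
    # median of the Walsh averages (d_i + d_j) / 2, i <= j
    w = (d[:, None] + d[None, :]) / 2.0
    return np.median(w[np.triu_indices(len(d))])

def hl_two_sample(x, y):
    # median of all pairwise differences y_j - x_i
    return np.median(y[None, :] - x[:, None])

def boot_var(est, args, B=500, rng=None):
    rng = rng or np.random.default_rng(0)
    reps = []
    for _ in range(B):
        res = [a[rng.integers(0, len(a), len(a))] for a in args]
        reps.append(est(*res))
    return np.var(reps, ddof=1)

rng = np.random.default_rng(1)
diffs = rng.normal(0.5, 1.0, 40)                       # matched-pair differences
x, y = rng.normal(0.0, 1.0, 50), rng.normal(0.5, 1.0, 50)  # unmatched arms

t1, t2 = hl_paired(diffs), hl_two_sample(x, y)
w1, w2 = 1.0 / boot_var(hl_paired, (diffs,)), 1.0 / boot_var(hl_two_sample, (x, y))
combined = (w1 * t1 + w2 * t2) / (w1 + w2)             # pooled effect estimate
print(combined)
```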
Abstract:
Development of homology modeling methods will remain an area of active research. These methods aim to develop increasingly accurate three-dimensional models of as-yet uncrystallized, therapeutically relevant proteins, e.g. Class A G-Protein Coupled Receptors. Incorporating protein flexibility is one way to achieve this goal. Here, I discuss the enhancement and validation of ligand-steered modeling, originally developed by Dr. Claudio Cavasotto, via cross-modeling of the newly crystallized GPCR structures. This method uses known ligands and known experimental information to optimize the relevant protein binding sites by incorporating protein flexibility. The ligand-steered models were able to reasonably reproduce the binding sites and the co-crystallized native ligand poses of the β2 adrenergic and Adenosine 2A receptors using a single template structure. They also performed better than the template and crude models in small-scale high-throughput docking experiments and compound selectivity studies. Next, the application of this method to develop high-quality homology models of Cannabinoid Receptor 2, an emerging non-psychotic pain management target, is discussed. These models were validated by their ability to rationalize structure-activity relationship data for two series of compounds, inverse agonists and agonists. The method was also applied to improve the virtual screening performance of the β2 adrenergic crystal structure by optimizing the binding site using β2-specific compounds. These results show the feasibility of optimizing only the pharmacologically relevant protein binding sites and the applicability of the approach to structure-based drug design projects.
Abstract:
Multiple holes were cored at Ocean Drilling Program Leg 178 Sites 1098 and 1099 in two subbasins of the Palmer Deep in order to recover complete and continuous records of sedimentation. By correlating measured properties of cores from different holes at a site, we have established a common depth scale, referred to as the meters composite depth scale (mcd), for all cores from Site 1098. For Site 1098, distinct similarities in the magnetic susceptibility records obtained from three holes provide tight constraints on between-hole correlation. Additional constraints come from lithologic features. Specific intervals from other data sets, particularly gamma-ray attenuation bulk density, magnetic intensity, and color reflectance, contain distinctive anomalies that correlate well when placed into the preferred composite depth scale, confirming that the scale is accurate. Coring in two holes at Site 1099 provides only a few meters of overlap. None of the data sets within this limited overlap region provide convincing correlations. Thus, the preferred composite depth scale for Site 1099 is the existing depth scale in meters below seafloor (mbsf).
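The between-hole correlation described above can be illustrated with a toy alignment of two synthetic magnetic-susceptibility records: slide one hole's record against the other and pick the depth offset with the highest correlation. This is an illustrative sketch of the general idea, not the Leg 178 workflow or data:

```python
# Toy between-hole depth correlation: hole B's synthetic record is offset
# by 1.2 m; a correlation sweep over trial offsets should recover it.
import numpy as np

rng = np.random.default_rng(0)
depth = np.arange(0.0, 50.0, 0.1)
signal = np.sin(depth) + 0.5 * np.sin(3.1 * depth)    # shared "stratigraphy"
hole_a = signal + 0.05 * rng.standard_normal(depth.size)
hole_b = np.interp(depth, depth + 1.2, signal)        # 1.2 m deeper in hole B
hole_b = hole_b + 0.05 * rng.standard_normal(depth.size)

offsets = np.arange(-3.0, 3.0, 0.1)
corrs = [np.corrcoef(hole_a, np.interp(depth, depth - d, hole_b))[0, 1]
         for d in offsets]
best = offsets[int(np.argmax(corrs))]                 # recovered offset, ~1.2 m
print(best)
```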
Abstract:
Within the framework of the Collaborative Project for a European Sodium Fast Reactor, the reactor physics group at UPM is working on the extension of its in-house multi-scale advanced deterministic code COBAYA3 to Sodium Fast Reactors (SFR). COBAYA3 is a 3D multigroup neutron kinetics diffusion code that can be used either as a pin-by-pin code or as a stand-alone nodal code by using the analytic nodal diffusion solver ANDES. It is coupled with thermal-hydraulics codes such as COBRA-TF and FLICA, allowing transient analysis of LWRs at both fine-mesh and coarse-mesh scales. In order to also enable 3D pin-by-pin and nodal coupled NK-TH simulations of SFRs, different developments are in progress. This paper presents the first steps towards the application of COBAYA3 to this type of reactor. The ANDES solver, already extended to triangular-Z geometry, has been applied to fast reactor steady-state calculations. The required cross section libraries were generated with the ERANOS code for several configurations. The limitations encountered in the application of the Analytic Coarse Mesh Finite Difference (ACMFD) method (implemented inside ANDES) to fast reactors are presented, and the sensitivity of the method when using a high number of energy groups is studied. ANDES performance is assessed by comparison with the results provided by ERANOS, using a mini-core model in 33 energy groups. Furthermore, a benchmark from the NEA for a small 3D FBR in hexagonal-Z geometry and 4 energy groups is also employed to verify the behavior of the code with few energy groups.
Abstract:
The analysis of modes and natural frequencies is of primary interest in the computation of the response of bridges. In this article the transfer matrix method is applied to this problem to provide a computer code to calculate the natural frequencies and modes of bridge-like structures. The Fortran computer code is suitable for running on small computers and results are presented for a railway bridge.
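The transfer-matrix idea can be illustrated on a toy lumped mass-spring chain (a Holzer-style frequency sweep, not the article's bridge formulation or Fortran code): propagate the state (displacement, force) through each element at a trial frequency and locate the frequencies where the free-end boundary condition F = 0 is satisfied.

```python
# Toy transfer-matrix / Holzer sweep for a fixed-free chain of 3 equal
# masses and springs; natural frequencies are roots of the free-end force.
import numpy as np

k, m, n = 1.0, 1.0, 3                     # stiffness, mass, number of elements

def end_force(omega):
    # fixed base: x = 0, unit force in the first spring; a natural
    # frequency makes the force after the last mass vanish (free tip)
    x, F = 0.0, 1.0
    for _ in range(n):
        x = x + F / k                     # spring field transfer
        F = F - omega**2 * m * x          # point-mass transfer (equilibrium)
    return F

ws = np.linspace(0.01, 2.2, 2000)         # frequency sweep
vals = [end_force(w) for w in ws]
roots = []
for a, b, fa, fb in zip(ws, ws[1:], vals, vals[1:]):
    if fa * fb < 0:                       # sign change brackets a root
        for _ in range(60):               # bisection refinement
            c = 0.5 * (a + b)
            if end_force(a) * end_force(c) <= 0:
                b = c
            else:
                a = c
        roots.append(0.5 * (a + b))
print(roots)                              # the chain's 3 natural frequencies
```

For this chain the exact values are 2 sin((2r-1)π/14) for r = 1, 2, 3, so the sweep should return approximately 0.445, 1.247, and 1.802.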
Abstract:
The Pseudo-Dynamic Test Method (PDTM) is currently being developed as an alternative to shaking table testing of large-size models. However, the stepped, slow execution of this type of test has been found to be a source of important errors arising from stress relaxation. A new continuous test method, which allows the selection of a suitable time-scale factor in the response in order to control these errors, is proposed here. Such a scaled-time response is theoretically obtained by simply augmenting the mass of the structure, for which some practical solutions are proposed.
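The mass-augmentation idea can be checked on a single-degree-of-freedom oscillator: under the similitude relation assumed here, running the test λ times slower corresponds to multiplying the mass by λ²; the numbers below are illustrative, not from the paper.

```python
# Illustrative check (assumed similitude relation): multiplying the mass
# by lam**2 divides the natural frequency by lam, so the response unfolds
# lam times slower in laboratory time.
import math

k, m, lam = 4.0e6, 1.0e3, 10.0              # stiffness [N/m], mass [kg], factor
omega = math.sqrt(k / m)                    # prototype natural frequency
omega_scaled = math.sqrt(k / (lam**2 * m))  # augmented-mass test structure
print(round(omega / omega_scaled, 6))       # prints 10.0: lam times slower
```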
Abstract:
We introduce a second-order-in-time modified Lagrange–Galerkin (MLG) method for the time-dependent incompressible Navier–Stokes equations. The main ingredient of the new method is the scheme proposed to calculate, in a more efficient manner, the Galerkin projection of the functions transported along the characteristic curves of the transport operator. We present error estimates for velocity and pressure in the framework of mixed finite elements when either the mini-element or the $P2/P1$ Taylor–Hood element is used.
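As a reminder of the characteristic framework behind such schemes (a standard second-order form, not necessarily the exact variant analyzed in the paper): the material derivative is discretized along the characteristics $X(x, t^{n+1}; t)$, defined by $dX/dt = u(X, t)$ with $X(x, t^{n+1}; t^{n+1}) = x$, via a backward-difference formula,

$$\left.\frac{Du}{Dt}\right|_{t^{n+1}} \approx \frac{3\,u^{n+1}(x) - 4\,u^{n}\big(X^{n}(x)\big) + u^{n-1}\big(X^{n-1}(x)\big)}{2\,\Delta t},$$

where $X^{n}(x)$ denotes the foot at time $t^{n}$ of the characteristic arriving at $x$ at time $t^{n+1}$; the Galerkin projection of the transported values $u^{n}(X^{n}(x))$ is precisely the quantity the MLG scheme computes more efficiently.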
Abstract:
Species–habitat associations may contribute to the maintenance of species richness in tropical forests, but previous research has been conducted almost exclusively in lowland forests and has emphasized the importance of topography and edaphic conditions. Is the distribution of woody plant species in a Peruvian cloud forest determined by microhabitat conditions? What is the role of environmental characteristics and forest structure in habitat partitioning in a tropical cloud forest? We examined species–habitat associations in three 1-ha plots using the torus-translation method. We used three different criteria to define habitats for habitat partitioning analyses, based on microtopography, forest structure and both sets of factors. The number of species associated either positively or negatively with each habitat was assessed. Habitats defined on the basis of environmental conditions and forest structure discriminated a greater number of positive and negative associations at the scale of our analyses in a tropical cloud forest. Both topographic conditions and forest structure contribute to small-scale microhabitat partitioning of woody plant species in a Peruvian tropical cloud forest. Nevertheless, canopy species were most correlated with the distribution of environmental variables, while understorey species displayed associations with forest structure.
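The torus-translation test can be sketched on a toy grid (illustrative only, not the plot data of the study): the habitat map is shifted cyclically in both directions, and the species' observed density within a habitat is compared against the distribution of densities over all translations.

```python
# Toy torus-translation test: a species concentrated in habitat 1 should
# yield a small positive-association p-value over all cyclic shifts.
import numpy as np

rng = np.random.default_rng(0)
habitat = np.zeros((10, 10), dtype=int)
habitat[:, :4] = 1                          # habitat 1 = left strip of quadrats
stems = rng.integers(0, 5, (10, 10))        # background stem counts per quadrat
stems[:, :4] += 3                           # species concentrated in habitat 1

def density(h):
    # mean stem count over the quadrats assigned to habitat 1
    return stems[h == 1].mean()

obs = density(habitat)
null = [density(np.roll(habitat, (i, j), axis=(0, 1)))
        for i in range(10) for j in range(10)]        # all torus translations
p_pos = np.mean([n >= obs for n in null])   # positive-association p-value
print(obs, p_pos)
```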