897 results for Quantitative stratigraphy
Abstract:
A quantitative analysis of the individual compounds in tobacco essential oils was performed by comprehensive two-dimensional gas chromatography (GC x GC) combined with a flame ionization detector (FID). A time-of-flight mass spectrometer (TOF/MS) was coupled to GC x GC for the identification of the resolved peaks. The response of the flame ionization detector to different compound classes was calibrated using multiple internal standards. In total, 172 compounds were identified with good match quality, and 61 compounds with high probability values were reliably quantified. For comparative purposes, the essential oil sample was also quantified by one-dimensional gas chromatography-mass spectrometry (GC/MS) using the multiple internal standards method. The results showed close agreement between the two methods when the peak purity and match quality in one-dimensional GC/MS were high enough. (c) 2005 Elsevier B.V. All rights reserved.
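The multiple-internal-standard calibration described above reduces to a per-class response-factor calculation. Below is a minimal sketch of that arithmetic; the function names, peak areas, concentrations, and RRF values are hypothetical, for illustration only.

```python
# Internal-standard quantification: a minimal sketch of the calculation
# described in the abstract. All numeric values are hypothetical.

def rrf(area_std, conc_std, area_is, conc_is):
    """Relative response factor of a calibration standard vs. its internal standard."""
    return (area_std / conc_std) * (conc_is / area_is)

def quantify(area_analyte, area_is, conc_is, rrf_class):
    """Analyte concentration from its peak area, the co-injected internal
    standard, and the RRF calibrated for the analyte's compound class."""
    return (area_analyte / area_is) * conc_is / rrf_class

# RRF is calibrated once per compound class, then applied to unknowns.
rrf_terpenes = rrf(area_std=1.52e5, conc_std=10.0, area_is=1.48e5, conc_is=10.0)
c = quantify(area_analyte=8.3e4, area_is=1.48e5, conc_is=10.0, rrf_class=rrf_terpenes)
print(f"estimated concentration: {c:.2f} ug/mL")
```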
Abstract:
AIM: To evaluate the suitability of reference genes in gastric tissue samples and cell lines. METHODS: The suitability of the genes ACTB, B2M, GAPDH, RPL29, and 18S rRNA was assessed in 21 matched pairs of neoplastic and adjacent non-neoplastic gastric tissues from patients with gastric adenocarcinoma, 27 normal gastric tissues from patients without cancer, and 4 cell lines using reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR). The ranking of the best single reference gene and the best combination of reference genes was determined by NormFinder, geNorm, BestKeeper, and DataAssist. In addition, GenEx software was used to determine the optimal number of reference genes. To validate the results, the mRNA expression of a target gene, DNMT1, was quantified using the different reference gene combinations suggested by the various software packages for normalization. RESULTS: ACTB was the best single reference gene for all gastric tissues, for the cell lines, and for all gastric tissues plus cell lines. GAPDH + B2M or ACTB + B2M was the best combination of reference genes for all the gastric tissues. On the other hand, ACTB + B2M was the best combination for all the cell lines tested and was also the best combination for analyses involving all the gastric tissues plus cell lines. According to the GenEx software, 2 or 3 genes were the optimal number of reference genes for all the gastric tissues. The relative quantification of DNMT1 showed similar patterns when normalized by each combination of reference genes. The level of expression of DNMT1 in neoplastic, adjacent non-neoplastic, and normal gastric tissues did not differ when these samples were normalized using GAPDH + B2M (P = 0.32), ACTB + B2M (P = 0.61), or GAPDH + B2M + ACTB (P = 0.44). CONCLUSION: GAPDH + B2M or ACTB + B2M is the best combination of reference genes for all the gastric tissues, and ACTB + B2M is the best combination for the cell lines tested. (C) 2013 Baishideng Publishing Group Co., Limited. All rights reserved.
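The multi-reference-gene normalization validated above is commonly implemented by taking the target gene's Cq against the mean of the reference genes' Cq values (with equal amplification efficiencies, this equals normalizing by the geometric mean of their relative quantities). A minimal sketch with hypothetical Cq values:

```python
# Relative quantification of a target gene (e.g. DNMT1) against a pair of
# reference genes (e.g. ACTB + B2M). Cq values are hypothetical; 100%
# amplification efficiency (factor of 2 per cycle) is assumed.
from statistics import fmean

def relative_expression(cq_target, cq_refs):
    """2^-dCq, with dCq taken against the mean of the reference Cq values."""
    return 2 ** -(cq_target - fmean(cq_refs))

tumor = relative_expression(cq_target=26.1, cq_refs=[18.9, 21.3])
normal = relative_expression(cq_target=27.4, cq_refs=[19.0, 21.2])
print(f"fold change, tumor vs normal: {tumor / normal:.2f}")
```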
Abstract:
There are a variety of guidelines and methods available to measure and assess survey quality. Most of these are based on qualitative descriptions; in practice they are not easy to implement, and it is very difficult to make comparisons between surveys. Hence there is a theoretical and pragmatic demand for a mainly quantitative survey assessment tool. This research aimed to meet this need and to contribute to the evaluation and improvement of survey quality. Acknowledging the critical importance of measurement issues in survey research, this thesis starts with a comprehensive introduction to measurement theory and identifies the types of measurement errors associated with measurement procedures through three experiments. It then describes the concepts, guidelines and methods available for measuring and assessing survey quality. Combining these with measurement principles leads to the development of a quantitative, statistically based holistic tool to measure and assess survey quality. The criteria, weights and subweights for the assessment tool are determined using Multi-Criteria Decision-Making (MCDM) and a survey questionnaire based on the Delphi method. Finally the model is applied to a database of surveys which was constructed to develop methods of classification, assessment and improvement of survey quality. The model developed in this thesis enables survey researchers and/or commissioners to make a holistic assessment of the value of a particular survey or surveys. The model is an Excel-based audit which takes a holistic approach, following all stages of the survey from inception through design, construction, execution, analysis and dissemination. At each stage a set of criteria is applied to assess quality; scores attained against these criteria are weighted by their importance and summed to give an overall assessment of the stage. The total score for a survey is obtained by combining the scores for each stage, weighted again by the importance of that stage. The result is a means of survey assessment which can be used in a diagnostic manner to assess and improve survey quality.
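The stage-wise weighted scoring described above has a simple computational core. The sketch below illustrates it with hypothetical stages, criteria, weights, and scores; in the thesis, the actual criteria and weights come from the MCDM/Delphi procedure.

```python
# Weighted, stage-wise survey quality scoring: a minimal sketch.
# Stage names, criteria, weights, and scores are hypothetical.

stages = {
    "design":    {"weight": 0.3, "criteria": [("sampling frame", 0.6, 4), ("question wording", 0.4, 3)]},
    "execution": {"weight": 0.4, "criteria": [("response rate", 0.7, 2), ("interviewer training", 0.3, 5)]},
    "analysis":  {"weight": 0.3, "criteria": [("weighting adjustments", 1.0, 4)]},
}

def stage_score(criteria):
    """Criterion scores weighted by criterion importance and summed."""
    return sum(w * score for _, w, score in criteria)

# Overall quality: stage scores weighted again by stage importance.
total = sum(s["weight"] * stage_score(s["criteria"]) for s in stages.values())
for name, s in stages.items():
    print(f"{name}: {stage_score(s['criteria']):.2f}")
print(f"overall: {total:.2f}")
```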
Abstract:
Ellis, D. I., Broadhurst, D., Kell, D. B., Rowland, J. J. & Goodacre, R. (2002). Rapid and quantitative detection of the microbial spoilage of meat by Fourier transform infrared spectroscopy and machine learning. Applied and Environmental Microbiology, 68(6), 2822-2828. Sponsorship: BBSRC
Abstract:
Cumbers, B., Urquhart, C. & Durbin, J. (2006). Evaluation of the KA24 (Knowledge Access 24) service for health and social care staff in London and the South-East of England. Part 1: Quantitative. Health Information and Libraries Journal, 23(2), 133-139 Sponsorship: KA24 - NHS Trusts, London
Abstract:
Jenkins, T., Hayton, D. J., Bedson, T. R. & Palmer, R. E. (2001). Quantitative evaluation of electron beam writing in passivated gold nanoclusters. Applied Physics Letters, 78, 1921-1923. RAE2008
Abstract:
Background: Single nucleotide polymorphisms (SNPs) have been used extensively in genetics and epidemiology studies. Traditionally, SNPs that did not pass the Hardy-Weinberg equilibrium (HWE) test were excluded from these analyses. Many investigators have addressed possible causes for departure from HWE, including genotyping errors, population admixture and segmental duplication. Recent large-scale surveys have revealed abundant structural variations in the human genome, including copy number variations (CNVs). This suggests that a significant number of SNPs must fall within these regions, which may cause deviation from HWE. Results: We performed a Bayesian analysis of the potential effect of copy number variation, segmental duplication and genotyping errors on the behavior of SNPs. Our results suggest that copy number variation is a major cause of HWE violation for SNPs with a small minor allele frequency when the sample size is large and the genotyping error rate is 0-1%. Conclusions: Our study provides the posterior probability that a SNP falls in a CNV or a segmental duplication, given the observed allele frequency of the SNP, the sample size and the significance level of the HWE test.
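For context, the HWE test whose violations are modeled above is a one-degree-of-freedom goodness-of-fit test on genotype counts. A minimal sketch with hypothetical counts (a SNP sitting in a CNV or duplicated region often shows an excess of apparent heterozygotes):

```python
# Hardy-Weinberg equilibrium chi-square test: a minimal sketch.
# Genotype counts are hypothetical.
from scipy.stats import chi2

def hwe_chi2_p(n_aa, n_ab, n_bb):
    """Chi-square goodness-of-fit p-value against HWE expectations."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)  # allele frequency of A
    exp = [n * p * p, 2 * n * p * (1 - p), n * (1 - p) * (1 - p)]
    stat = sum((o - e) ** 2 / e for o, e in zip((n_aa, n_ab, n_bb), exp))
    return chi2.sf(stat, df=1)

# Heterozygote excess of the kind a CNV/duplication can produce:
print(f"p = {hwe_chi2_p(n_aa=250, n_ab=600, n_bb=150):.3g}")
```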
Abstract:
The topic of this thesis is an acoustic scattering technique for determining the compressibility and density of individual particles. The particles, which have diameters on the order of 10 µm, are modeled as fluid spheres. Ultrasonic tone bursts of 2 µs duration and 30 MHz center frequency scatter from individual particles as they traverse the focal region of two confocally positioned transducers. One transducer acts as a receiver while the other both transmits and receives acoustic signals. The resulting scattered bursts are detected at 90° and at 180° (backscattered). Using either the long-wavelength (Rayleigh) or the weak-scatterer (Born) approximation, it is possible to determine the compressibility and density of the particle provided we possess a priori knowledge of the particle size and the host properties. The detected scattered signals are digitized and stored in computer memory. With this information we can compute the mean compressibility and density averaged over a population of particles (typically 1000 particles) or display histograms of scattered amplitude statistics. An experiment was first run to assess the feasibility of using polystyrene polymer microspheres to calibrate the instrument. A second study was performed on the buffy coat harvested from whole human blood. Finally, Chinese hamster ovary cells subjected to hyperthermia treatment were studied in order to see if the instrument could detect heat-induced membrane blebbing.
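For reference, in the long-wavelength (Rayleigh) limit the scattered amplitude of a fluid sphere is conventionally written as follows (notation assumed here: a particle radius, k wavenumber, κ compressibility, ρ density, subscripts p and 0 for particle and host):

```latex
\Phi(\theta) = \frac{k^{2}a^{3}}{3}
\left[\frac{\kappa_p-\kappa_0}{\kappa_0}
  + \frac{3\,(\rho_p-\rho_0)}{2\rho_p+\rho_0}\cos\theta\right].
```

Since cos 90° = 0, the 90° measurement isolates the compressibility contrast; combining it with the backscattered (180°) measurement then yields the density, which is presumably why those two detection angles were chosen.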
Abstract:
Neoplastic tissue is typically highly vascularized, contains abnormal concentrations of extracellular proteins (e.g. collagen, proteoglycans) and has a high interstitial fluid pressure compared to most normal tissues. These changes result in an overall stiffening typical of most solid tumors. Elasticity Imaging (EI) is a technique which uses imaging systems to measure relative tissue deformation and thus noninvasively infer its mechanical stiffness. Stiffness is recovered from measured deformation by using an appropriate mathematical model and solving an inverse problem. The integration of EI with existing imaging modalities can improve their diagnostic and research capabilities. The aim of this work is to develop and evaluate techniques to image and quantify the mechanical properties of soft tissues in three dimensions (3D). To that end, this thesis presents and validates a method by which three-dimensional ultrasound images can be used to image and quantify the shear modulus distribution of tissue-mimicking phantoms. This work is presented to motivate and justify the use of this elasticity imaging technique in a clinical breast cancer screening study. The imaging methodologies discussed are intended to improve the specificity of mammography practices in general. During the development of these techniques, several issues concerning the accuracy and uniqueness of the result were elucidated. Two new algorithms for 3D EI are designed and characterized in this thesis. The first provides three-dimensional motion estimates from ultrasound images of the deforming material. Its novel features include finite element interpolation of the displacement field, inclusion of prior information and the ability to enforce physical constraints. The roles of regularization, mesh resolution and an incompressibility constraint in the accuracy of the measured deformation are quantified. The estimated signal-to-noise ratios of the measured displacement fields are approximately 1800, 21 and 41 for the axial, lateral and elevational components, respectively. The second algorithm recovers the shear elastic modulus distribution of the deforming material by efficiently solving the three-dimensional inverse problem as an optimization problem. This method utilizes finite element interpolations, the adjoint method to evaluate the gradient and a quasi-Newton BFGS method for optimization. Its novel features include the use of the adjoint method and TVD regularization with piecewise-constant interpolation. A source of non-uniqueness in this inverse problem is identified theoretically, demonstrated computationally, explained physically and overcome practically. Both algorithms were tested on ultrasound data of independently characterized tissue-mimicking phantoms. The recovered elastic modulus was in all cases within 35% of the reference elastic contrast. Finally, the preliminary application of these techniques to tomosynthesis images showed the feasibility of imaging an elastic inclusion.
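The second algorithm's structure, a gradient computed cheaply (via the adjoint method in the thesis) and fed to a quasi-Newton optimizer, can be sketched schematically. The toy below substitutes a linear operator for the finite element forward solve and a smooth penalty for TVD regularization; it illustrates the optimization pattern only, not the thesis implementation.

```python
# Schematic of modulus recovery as regularized optimization: minimize data
# mismatch plus a penalty over the modulus field, gradient supplied to a
# BFGS-family optimizer. Everything here is an illustrative stand-in.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_params, n_meas = 20, 60
F = rng.normal(size=(n_meas, n_params))                  # stand-in for the FE forward solve
mu_true = np.where(np.arange(n_params) < 10, 1.0, 3.0)   # stiff "inclusion"
u_meas = F @ mu_true + 0.01 * rng.normal(size=n_meas)    # measured displacements

alpha = 1e-3  # regularization weight

def objective(mu):
    r = F @ mu - u_meas
    reg = np.sum(np.diff(mu) ** 2)  # smooth stand-in for TVD regularization
    return 0.5 * (r @ r) + alpha * reg

def gradient(mu):
    # In the real problem this is where the adjoint method pays off: one extra
    # PDE solve gives the full gradient. For this linear toy it is F^T r.
    r = F @ mu - u_meas
    d = np.diff(mu)
    g_reg = np.zeros_like(mu)
    g_reg[:-1] -= 2 * d
    g_reg[1:] += 2 * d
    return F.T @ r + alpha * g_reg

res = minimize(objective, np.ones(n_params), jac=gradient, method="L-BFGS-B",
               bounds=[(0.1, None)] * n_params)
print("recovered modulus:", np.round(res.x, 2))
```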
Abstract:
Much sensory-motor behavior develops through imitation, as during the learning of handwriting by children. Such complex sequential acts are broken down into distinct motor control synergies, or muscle groups, whose activities overlap in time to generate continuous, curved movements that obey an inverse relation between curvature and speed. The Adaptive Vector Integration to Endpoint (AVITEWRITE) model of Grossberg and Paine (2000) proposed how such complex movements may be learned through attentive imitation. The model suggests how frontal, parietal, and motor cortical mechanisms, such as difference vector encoding, under volitional control from the basal ganglia, interact with adaptively timed, predictive cerebellar learning during movement imitation and predictive performance. Key psychophysical and neural data about learning to make curved movements were simulated, including a decrease in writing time as learning progresses; generation of unimodal, bell-shaped velocity profiles for each movement synergy; size scaling with isochrony, and speed scaling with preservation of the letter shape and the shapes of the velocity profiles; an inverse relation between curvature and tangential velocity; and a Two-Thirds Power Law relation between angular velocity and curvature. However, the model learned from letter trajectories of only one subject, and only qualitative kinematic comparisons were made with previously published human data. The present work describes a quantitative test of AVITEWRITE through direct comparison of a corpus of human handwriting data with the model's performance when it learns by tracing human trajectories. The results show that model performance was variable across subjects, with an average correlation between the model and human data of 89 ± 10%. The present simulations using the AVITEWRITE model highlight some of its strengths while focusing attention on areas, such as novel shape learning in children, where all models of handwriting and of learning other complex sensory-motor skills would benefit from further research.
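For reference, the Two-Thirds Power Law mentioned above is conventionally stated as

```latex
A(t) = K\,C(t)^{2/3}
\quad\Longleftrightarrow\quad
v(t) = K\,C(t)^{-1/3},
```

where A is angular velocity, C curvature, v tangential velocity, and K the velocity gain factor; since A = vC, the two forms are equivalent, and the second makes the inverse curvature-speed relation explicit.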
Abstract:
The administration of psychotropic and psychoactive medication to persons with learning disability and accompanying mental illness and/or challenging behaviour has undergone much critical review over the past two decades. Assessment and diagnosis of mental illness in this population continues to be problematic; criticisms of psychopharmacological treatment include polypharmacy, irrational prescription procedures and frequent over-prescription. It is clear that all forms of treatment, including non-pharmacological interventions, need to be driven by accurate and appropriate diagnoses. Where a psychiatric diagnosis has been identified, it greatly aids the selection of appropriate medication, although a specific medication for each diagnosis, as was once hoped, is simply no longer a reality in practice. Part one of the present thesis addresses many of the current issues in mental health problems and pharmacological treatment to date. The author undertook a drug prevalence study within both residential and community facilities for persons with learning disability in the Mid-West region of Ireland in order to ascertain the current level of prescribing of psychotropic and psychoactive medications for this population. While many attempts have been made to account for the variation in prescribing, little systematic and empirical research has been undertaken to investigate the factors thought to influence such prescribing. While studies investigating the prescribing behaviours of General Practitioners (GPs) have illustrated the complex nature of the decision-making process in the context of general practice, no similar efforts have yet been directed at examining the prescribing behaviours of Consultant Psychiatrists. Using the Critical Incident Technique, the author interviewed Consultant Psychiatrists in the Republic of Ireland to gather information relating not only to their patterns of prescribing for learning disabled populations, but also to the reasons influencing their prescribing, in addition to several related factors. Part two of this thesis presents the findings from this study, and a number of issues are raised, not only in relation to accounting for the findings from part one of the thesis, but also with respect to implications for improved management and clinical practice.
The psychology of immersion and development of a quantitative measure of immersive response in games
Abstract:
This study sets out to investigate the psychology of immersion and the immersive response of individuals in relation to video and computer games. Initially, an exhaustive review of literature is presented, including research into games, player demographics, personality and identity. Play in traditional psychology is also reviewed, as well as previous research into immersion and attempts to define and measure this construct. An online qualitative study was carried out (N=38), and the data were analysed using content analysis. A definition of immersion emerged, as well as a classification of two separate types of immersion, namely, vicarious immersion and visceral immersion. A survey study (N=217) verified the discrete nature of these categories and rejected the null hypothesis that there was no difference between individuals' interpretations of vicarious and visceral immersion. The primary aim of this research was to create a quantitative instrument which measures the immersive response as experienced by the player in a single game session. The IMX Questionnaire was developed using data from the initial qualitative study and quantitative survey. Exploratory Factor Analysis was carried out on data from 300 participants for the IMX Version 1, and Confirmatory Factor Analysis was conducted on data from 380 participants on the IMX Version 2. IMX Version 3 was developed from the results of these analyses. This questionnaire was found to have high internal consistency reliability and validity.
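The internal-consistency figure reported for such questionnaires is conventionally Cronbach's alpha; a minimal sketch of the computation on hypothetical item scores (not the IMX data):

```python
# Cronbach's alpha on simulated questionnaire items sharing one latent factor.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 1))                 # shared "immersion" factor
scores = latent + 0.8 * rng.normal(size=(300, 5))  # five correlated items
print(f"alpha = {cronbach_alpha(scores):.2f}")
```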
Abstract:
The bifunctional Ru(II) complex [Ru(BPY)2POQ-Nmet]2+ (1), in which the metallic unit is tethered by an aliphatic chain to an organic DNA binder, was designed in order to increase the affinity toward nucleic acids. The interaction of 1 with DNA was characterised from luminescence and absorption data and compared with the binding of its monofunctional metallic and organic analogues, [Ru(BPY)2(ac)phen]2+ (2) and Nmet-quinoline (3). The bifunctional complex has a binding affinity one order of magnitude higher than that of each of its separate moieties. Absorption changes induced upon addition of DNA at different pH values indicate protonation of the organic sub-unit upon interaction with DNA under neutral conditions. The combination of the luminescence data under steady-state and time-resolved conditions shows that the attachment of the organic unit in 1 modifies the association modes of the metallic unit, owing to the presence of the aliphatic chain, which probably hinders the binding of the metallic moiety. The salt dependence of the binding constants was analysed in order to compare the thermodynamic parameters describing the association with DNA for each complex. This study demonstrates the value of derivatising a Ru(II) complex with an organic moiety (via the bifunctional ligand POQ-Nmet) for the development of high-affinity DNA probes or photoreactive agents.
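Salt-dependence analyses of DNA binding constants of this kind conventionally follow counterion-condensation (Record) theory, in which the slope of log K against log [Na+] estimates the number of ionic contacts Z made with the phosphate backbone:

```latex
\frac{\partial \log K_{\mathrm{obs}}}{\partial \log [\mathrm{Na^{+}}]} \approx -Z\psi,
```

with ψ ≈ 0.88 per phosphate for B-form DNA. That this is the exact treatment used here is an assumption; the abstract states only that the salt dependence was analysed to compare thermodynamic parameters.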
Abstract:
Quantitative models are required to engineer biomaterials with environmentally responsive properties. With this goal in mind, we developed a model that describes the pH-dependent phase behavior of a class of stimulus responsive elastin-like polypeptides (ELPs) that undergo reversible phase separation in response to their solution environment. Under isothermal conditions, charged ELPs can undergo phase separation when their charge is neutralized. Optimization of this behavior has been challenging because the pH at which they phase separate, pHt, depends on their composition, molecular weight, concentration, and temperature. To address this problem, we developed a quantitative model to describe the phase behavior of charged ELPs that uses the Henderson-Hasselbalch relationship to describe the effect of side-chain ionization on the phase-transition temperature of an ELP. The model was validated with pH-responsive ELPs that contained either acidic (Glu) or basic (His) residues. The phase separation of both ELPs fit this model across a range of pH. These results have important implications for applications of pH-responsive ELPs because they provide a quantitative model for the rational design of pH-responsive polypeptides whose transition can be triggered at a specified pH.
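The coupling between side-chain ionization and the transition temperature can be sketched directly from the Henderson-Hasselbalch relation. In the sketch below, the pKa, baseline transition temperature Tt0, the sensitivity m, and the linear form of the coupling are all hypothetical placeholders, not the fitted model from the paper.

```python
# Henderson-Hasselbalch coupling of ionization to the ELP transition
# temperature Tt: phase separation occurs where Tt(pH) crosses the
# operating temperature. pKa, Tt0, and m are hypothetical.
def ionized_fraction(pH, pKa, acidic=True):
    """Henderson-Hasselbalch: fraction of side chains carrying a charge."""
    x = 10 ** (pH - pKa)
    f = x / (1 + x)                # deprotonated fraction
    return f if acidic else 1 - f  # basic residues are charged when protonated

def transition_temp(pH, pKa, Tt0=25.0, m=40.0, acidic=True):
    """Tt rises as the chain acquires charge (illustrative linear coupling)."""
    return Tt0 + m * ionized_fraction(pH, pKa, acidic)

# A Glu-containing ELP (acidic, pKa ~ 4.5 assumed) held at 37 C phase
# separates roughly where Tt(pH) drops below 37 C:
for pH in (3.0, 4.0, 4.5, 5.0, 6.0):
    Tt = transition_temp(pH, pKa=4.5)
    print(f"pH {pH}: Tt = {Tt:.1f} C -> {'soluble' if Tt > 37 else 'phase-separated'}")
```

Charge neutralization (low pH for the Glu construct, high pH for the His construct) lowers Tt below the operating temperature and triggers phase separation, consistent with the isothermal behavior described above.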