910 results for Automated quantification
Abstract:
Mitochondrial DNA (mtDNA) mutations are an important cause of genetic disease and have been proposed to play a role in the ageing process. Quantification of total mtDNA mutation load in ageing tissues is difficult, as mutational events are rare in a background of wild-type molecules and detection of individual mutated molecules is beyond the sensitivity of most sequencing-based techniques. The methods currently most commonly used to document the incidence of mtDNA point mutations in ageing include post-PCR cloning, single-molecule PCR and the random mutation capture assay. The mtDNA mutation load obtained by these different techniques varies by orders of magnitude, but a direct comparison of the three techniques on the same ageing human tissue has not been performed. We assess the procedures and practicalities involved in each of these three assays and discuss the results obtained by investigating mutation loads in colonic mucosal biopsies from ten human subjects.
Abstract:
Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can be either because they represent what the author believes a paper is about, not what it actually is, or because they include keyphrases which are more classificatory than explanatory, e.g. “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes two possible solutions that examine the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. Using three different freely available thesauri, the work undertaken examines two different methods of producing keywords and compares the outcomes across multiple strands in the timeline. The primary method explores taking n-grams of the source document phrases and examining their synonyms, while the secondary considers grouping outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work. In addition, the different qualities of the thesauri are examined, and it is concluded that the more entries a thesaurus has, the better it is likely to perform. Neither the age of the thesaurus nor the size of each entry correlates with performance.
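For illustration, a minimal sketch of the primary method as described (taking n-grams of document phrases and examining their synonyms), assuming WordNet via nltk as a stand-in for the three thesauri used in the work; the function name and parameters are hypothetical:

```python
# Sketch of the n-gram/synonym theme-finding idea, assuming WordNet via
# nltk as the thesaurus (requires nltk.download('wordnet') beforehand).
from nltk.corpus import wordnet
from nltk.util import ngrams

def synonym_keyphrases(tokens, n=2):
    """Map each n-gram whose words have synonyms to the set of synonym
    lemmas, treated here as candidate theme keyphrases."""
    themes = {}
    for gram in ngrams(tokens, n):
        syns = set()
        for word in gram:
            for synset in wordnet.synsets(word):
                syns.update(synset.lemma_names())
        if syns:
            themes[" ".join(gram)] = syns
    return themes
```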
Abstract:
The GABase assay is widely used to rapidly and accurately quantify levels of extracellular γ-aminobutyric acid (GABA). Here we demonstrate a modification of this assay that enables quantification of intracellular GABA in bacterial cells. Cells are lysed by boiling, and ethanolamine-O-sulphate, a GABA transaminase inhibitor, is used to distinguish between GABA and succinate semialdehyde.
Abstract:
The proteome of Salmonella enterica serovar Typhimurium was characterized by 2-dimensional HPLC mass spectrometry to provide a platform for subsequent proteomic investigations of low-level multiple antibiotic resistance (MAR). Bacteria (2.15 ± 0.23 × 10^10 cfu; mean ± s.d.) were harvested from liquid culture and proteins differentially fractionated, on the basis of solubility, into preparations representative of the cytosol, cell envelope and outer membrane proteins (OMPs). These preparations were digested by treatment with trypsin and peptides separated into fractions (n = 20) by strong cation exchange chromatography (SCX). Tryptic peptides in each SCX fraction were further separated by reversed-phase chromatography and detected by mass spectrometry. Peptides were assigned to proteins and consensus rank listings compiled using SEQUEST. A total of 816 ± 11 individual proteins were identified, which included 371 ± 33, 565 ± 15 and 262 ± 5 from the cytosolic, cell envelope and OMP preparations, respectively. A significant correlation was observed (r^2 = 0.62 ± 0.10; P < 0.0001) between consensus rank position for duplicate cell preparations, and an average of 74 ± 5% of proteins were common to both replicates. A total of 34 outer membrane proteins were detected, 20 of these from the OMP preparation. A range of proteins (n = 20) previously associated with the mar locus in E. coli were also found, including the key MAR effectors AcrA, TolC and OmpF.
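As a hedged sketch of the replicate-agreement statistics quoted above (the fraction of proteins common to both replicates and the squared correlation of consensus rank positions), assuming each replicate is represented as a dict mapping protein identifiers to consensus ranks:

```python
# Illustrative computation of the two replicate-agreement figures
# reported in the abstract; the input format is an assumption.
import numpy as np

def replicate_agreement(ranks_a, ranks_b):
    """Return (% proteins common to both replicates, r^2 between the
    consensus rank positions of the shared proteins)."""
    common = sorted(set(ranks_a) & set(ranks_b))
    pct_common = 100.0 * len(common) / len(set(ranks_a) | set(ranks_b))
    r = np.corrcoef([ranks_a[p] for p in common],
                    [ranks_b[p] for p in common])[0, 1]
    return pct_common, r ** 2
```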
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method’s authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters’ News Corpus. The findings show that authors of Reuters’ news articles provide good keyphrases, but that more often than not they do not provide any keyphrases.
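For reference, minimal sketches of the two best-performing baselines named above, Term Frequency and Inverse Document Frequency; the toy input format (lists of tokens) is an assumption, and candidate extraction and normalisation are omitted:

```python
# Baseline scorers corresponding to the TF and IDF methods compared in
# the paper; higher scores indicate stronger keyphrase candidates.
import math
from collections import Counter

def tf_scores(doc_tokens):
    """Term Frequency: raw counts of candidate terms within one document."""
    return Counter(doc_tokens)

def idf_scores(corpus_tokens):
    """Inverse Document Frequency across a corpus of token lists:
    terms appearing in fewer documents score higher."""
    n_docs = len(corpus_tokens)
    df = Counter(t for doc in corpus_tokens for t in set(doc))
    return {t: math.log(n_docs / df[t]) for t in df}
```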
Abstract:
During long-range transport, many distinct processes – including photochemistry, deposition, emissions and mixing – contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset and quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3chem) over the whole simulation is greater than net physical processing (ΔO3phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower tropospheric transport) or production (an upper tropospheric biomass burning case). However, physical processes influence O3 indirectly through addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3chem than ΔO3phys. Despite its smaller magnitude, the physical processing distinguishes the lower tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases. Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-average OH following air masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). Therefore, it is a useful new method to simulate air mass evolution and variability, and its sensitivity to process parameters.
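For orientation, the budget underlying the framework can be written as below; this is a hedged sketch of the usual Lagrangian notation (P and L denoting photochemical production and loss along the trajectory), not necessarily the paper's exact formulation:

```latex
\Delta\mathrm{O}_3 = \Delta\mathrm{O}_3^{\mathrm{chem}} + \Delta\mathrm{O}_3^{\mathrm{phys}},
\qquad
\Delta\mathrm{O}_3^{\mathrm{chem}} = \int_{t_0}^{t_1} \big( P_{\mathrm{O}_3}(t) - L_{\mathrm{O}_3}(t) \big)\,\mathrm{d}t
```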
Abstract:
• UV-B radiation currently represents c. 1.5% of incoming solar radiation. However, significant changes are known to have occurred in the amount of incoming radiation both on recent and on geological timescales. Until now it has not been possible to reconstruct a detailed measure of UV-B radiation beyond c. 150 yr ago.
• Here, we studied the suitability of fossil Pinus spp. pollen to record variations in UV-B flux through time. In view of the large size of the grain and its long fossil history, we hypothesized that this grain could provide a good proxy for recording past variations in UV-B flux.
• Two key objectives were addressed: to determine whether there was, similar to other studied species, a clear relationship between UV-B-absorbing compounds in the sporopollenin of extant pollen and the magnitude of UV-B radiation to which it had been exposed; and to determine whether these compounds could be extracted from a small enough sample size of fossil pollen to make reconstruction of a continuous record through time a realistic prospect.
• Preliminary results indicate the excellent potential of this species for providing a quantitative record of UV-B through time. Using this technique, we present the first record of UV-B flux during the last 9500 yr from a site near Bergen, Norway.
Abstract:
Slabakova (2006b) poses and directly addresses the question of whether there is a maturational effect (a critical/sensitive period) that affects the semantic component. She demonstrates that there is no empirical evidence suggesting that adults are unable to acquire phrasal semantic properties, even when access to semantic universals is conditioned on the acquisition of L2 morphosyntactic features (see Dekydtspotter and Sprouse 2001; Slabakova and Montrul 2003). In light of this, the authors test for interpretive properties associated with the higher (outer) aspectual projection AspP in advanced adult English learners of L2 Portuguese via their knowledge of [+/- accidental] related nuances in adverbially quantified preterit and imperfect sentences (Lenci and Bertinetto 2000; Menéndez-Benito 2002). In two experiments, the authors test for L2 knowledge of this [+/- accidental] distinction via semantic felicity judgments of adverbially quantified preterit and imperfect sentences relative to a supporting context, as well as related restrictions on subject DP interpretations. Overall, the data show that advanced learners acquire this distinction. As the authors discuss, the present data support Full Access theories (White 1989; Schwartz and Sprouse 1996; Duffield and White 1999) and the No Critical Period for semantics position (Slabakova 2006b), demonstrating that the syntax-semantics interface is not an inevitable locus for fossilization.
Abstract:
This technique paper describes a novel method for quantitatively and routinely identifying auroral breakup following substorm onset using the Time History of Events and Macroscale Interactions During Substorms (THEMIS) all-sky imagers (ASIs). Substorm onset is characterised by a brightening of the aurora that is followed by auroral poleward expansion and auroral breakup. This breakup can be identified by a sharp increase in the auroral intensity i(t) and in the time derivative of the auroral intensity i'(t). Utilising both i(t) and i'(t), we have developed an algorithm for identifying the time interval and spatial location of auroral breakup during the substorm expansion phase within the field of view of ASI data, based solely on quantifiable characteristics of the optical auroral emissions. We compare the time interval determined by the algorithm to independently identified auroral onset times from three previously published studies. In each case the interval determined by the algorithm is within error of the onset independently identified by the prior studies. We further show the utility of the algorithm by comparing the breakup intervals determined by the automated algorithm to an independent list of substorm onset times. We demonstrate that up to 50% of the breakup intervals characterised by the algorithm are within the uncertainty of the times identified in the independent list. The quantitative description and routine identification of an interval of auroral brightening during the substorm expansion phase provides a foundation for unbiased statistical analysis of the aurora, and a new scientific tool for probing the physics of the auroral substorm and identifying the processes that lead to its onset.
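A minimal sketch of the detection idea described above, thresholding both i(t) and i'(t); the baseline statistics and the k-sigma thresholds are illustrative assumptions, not the published algorithm's values:

```python
# Identify a candidate breakup interval from an ASI intensity series by
# requiring both i(t) and its derivative i'(t) to exceed thresholds.
import numpy as np

def breakup_interval(t, i, k_i=2.0, k_di=2.0):
    """Return (t_start, t_end) of the interval where the intensity and
    its time derivative both exceed mean + k*std, or None if none found."""
    t = np.asarray(t, dtype=float)
    i = np.asarray(i, dtype=float)
    di = np.gradient(i, t)                      # i'(t)
    mask = ((i > i.mean() + k_i * i.std())
            & (di > di.mean() + k_di * di.std()))
    if not mask.any():
        return None
    idx = np.flatnonzero(mask)
    return t[idx[0]], t[idx[-1]]
```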
Abstract:
A method of automatically identifying and tracking polar-cap plasma patches, utilising data inversion and feature-tracking methods, is presented. A well-established and widely used 4-D ionospheric imaging algorithm, the Multi-Instrument Data Assimilation System (MIDAS), inverts slant total electron content (TEC) data from ground-based Global Navigation Satellite System (GNSS) receivers to produce images of the free electron distribution in the polar-cap ionosphere. These are integrated to form vertical TEC maps. A flexible feature-tracking algorithm, TRACK, previously used extensively in meteorological storm-tracking studies, is used to identify and track maxima in the resulting 2-D data fields. Various criteria are used to discriminate between genuine patches and "false-positive" maxima, such as the continuously moving day-side maximum, which results from the Earth's rotation rather than plasma motion. Results for a 12-month period at solar minimum, when extensive validation data are available, are presented. The method identifies 71 separate structures consistent with patch motion during this time. The limitations of solar minimum and the consequent small number of patches make climatological inferences difficult, but the feasibility of the method for patches larger than approximately 500 km in scale is demonstrated, and a larger study incorporating other parts of the solar cycle is warranted. Possible further optimisation of the discrimination criteria, particularly regarding the definition of a patch in terms of its plasma concentration enhancement over the surrounding background, may improve results.
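As a sketch of the identification step (locating TEC maxima and rejecting weak ones), assuming a 2-D vertical TEC map as a NumPy array; the neighbourhood size and the enhancement factor over the background are illustrative, not the study's actual discrimination criteria:

```python
# Locate candidate patch maxima in a vertical TEC map and keep only
# those clearly enhanced over a smoothed local background.
import numpy as np
from scipy.ndimage import maximum_filter, uniform_filter

def candidate_patches(tec, size=9, enhancement=2.0):
    """Return (row, col) indices of local maxima exceeding the local
    background by the given enhancement factor."""
    local_max = tec == maximum_filter(tec, size=size)
    background = uniform_filter(tec, size=5 * size)   # smoothed background
    return np.argwhere(local_max & (tec > enhancement * background))
```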
Abstract:
Currently, infrared filters for astronomical telescopes and satellite radiometers are based on multilayer thin-film stacks of alternating high and low refractive index materials. However, the choice of suitable layer materials is limited, and this places limitations on the filter performance that can be achieved. The ability to design materials with arbitrary refractive index allows filter performance to be greatly increased, but also increases the complexity of design. Here a differential evolution algorithm was used for the optimised design of filters with arbitrary refractive indices, and materials were then designed to these specifications as mono-materials with sub-wavelength structures using Bruggeman’s effective material approximation (EMA).
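For concreteness, Bruggeman's EMA for a two-component sub-wavelength structure can be solved numerically as below; the permittivities and fill fraction in the example are hypothetical design inputs:

```python
# Solve Bruggeman's effective medium approximation for a binary mixture:
#   f*(eps_a - e)/(eps_a + 2e) + (1 - f)*(eps_b - e)/(eps_b + 2e) = 0
import numpy as np
from scipy.optimize import brentq

def bruggeman_eps(eps_a, eps_b, f):
    """Effective permittivity for fill fraction f of component a
    (real-valued permittivities assumed, eps_a != eps_b)."""
    g = lambda e: (f * (eps_a - e) / (eps_a + 2 * e)
                   + (1 - f) * (eps_b - e) / (eps_b + 2 * e))
    # The root is bracketed by the two component permittivities.
    return brentq(g, min(eps_a, eps_b), max(eps_a, eps_b))

# Example: target refractive index from a 30% fill of a high-index material.
n_eff = np.sqrt(bruggeman_eps(eps_a=4.0, eps_b=1.0, f=0.3))
```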
Abstract:
Skillful and timely streamflow forecasts are critically important to water managers and emergency protection services. To provide these forecasts, hydrologists must predict the behavior of complex coupled human–natural systems using incomplete and uncertain information and imperfect models. Moreover, operational predictions often integrate anecdotal information and unmodeled factors. Forecasting agencies face four key challenges: 1) making the most of available data, 2) making accurate predictions using models, 3) turning hydrometeorological forecasts into effective warnings, and 4) administering an operational service. Each challenge presents a variety of research opportunities, including the development of automated quality-control algorithms for the myriad of data used in operational streamflow forecasts, data assimilation, and ensemble forecasting techniques that allow for forecaster input, methods for using human-generated weather forecasts quantitatively, and quantification of human interference in the hydrologic cycle. Furthermore, much can be done to improve the communication of probabilistic forecasts and to design a forecasting paradigm that effectively combines increasingly sophisticated forecasting technology with subjective forecaster expertise. These areas are described in detail to share a real-world perspective and focus for ongoing research endeavors.
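As one concrete instance of the automated quality-control opportunity listed above, a hedged sketch that flags out-of-range values and implausible jumps in a gauge series; the bounds and step tolerance are illustrative assumptions:

```python
# Basic automated QC for a streamflow series: flag non-finite values,
# physically implausible magnitudes, and sudden spikes.
import numpy as np

def qc_flags(flow, lo=0.0, hi=1.0e5, max_step=500.0):
    """Return a boolean mask of suspect observations (units m^3/s)."""
    flow = np.asarray(flow, dtype=float)
    bad_range = ~np.isfinite(flow) | (flow < lo) | (flow > hi)
    step = np.abs(np.diff(flow, prepend=flow[0]))   # jump from previous obs
    return bad_range | (step > max_step)
```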
Abstract:
Liquid Chromatography Mass Spectrometry (LC-MS) was used to obtain glucosinolate and flavonol content for 35 rocket accessions and commercial varieties. Thirteen glucosinolates and 11 flavonol compounds were identified. Semi-quantitative methods were used to estimate concentrations of both groups of compounds. Minor glucosinolate composition differed between accessions, and concentrations varied significantly. Flavonols showed differentiation between genera, with Diplotaxis accumulating quercetin glucosides and Eruca accumulating kaempferol glucosides. Several compounds were detected in each genus that have only previously been reported in the other. We highlight how knowledge of phytochemical content and concentration can be used to breed new, nutritionally superior varieties. We also demonstrate the effects of controlled-environment conditions on the accumulation of glucosinolates and flavonols and explore the reasons for differences from previous studies. We stress the importance of consistent experimental design between research groups to effectively compare and contrast results.