911 results for QUANTITATIVE PROTEOMICS
Abstract:
This paper examines the quantitative effects of gender gaps in entrepreneurship and labor force participation on aggregate productivity and income per capita. We simulate an occupational choice model with heterogeneous agents in entrepreneurial ability, where agents choose to be workers, self-employed or employers. The model assumes that men and women have the same talent distribution, but we impose several frictions on women's opportunities and pay in the labor market. In particular, we restrict the fraction of women participating in the labor market.
Abstract:
BACKGROUND: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods, and compared them with newly developed data processing strategies in terms of resolution, precision and robustness. RESULTS: Our results indicate that simple methods that do not rely on the estimation of the efficiency of the PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods based on sigmoidal or exponential curve fitting were generally of both poor resolution and precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and to a lesser extent on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. CONCLUSION: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness and reliability. These methods allow reliable estimations of relative expression ratio of two-fold or higher, and our analysis provides an estimation of the number of biological samples that have to be analyzed to achieve a given precision.
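Efficiency-based relative quantification of the kind evaluated in this study is commonly expressed as a ratio of amplification efficiencies raised to Ct differences. A minimal sketch of that idea (a Pfaffl-style efficiency-corrected ratio, shown for illustration only; the specific data-processing strategies compared in the paper are not reproduced here):

```python
def relative_expression_ratio(e_target, e_ref,
                              ct_target_control, ct_target_sample,
                              ct_ref_control, ct_ref_sample):
    """Efficiency-corrected relative expression ratio.

    e_target / e_ref are per-cycle amplification efficiencies
    (2.0 = perfect doubling each cycle).
    """
    delta_ct_target = ct_target_control - ct_target_sample
    delta_ct_ref = ct_ref_control - ct_ref_sample
    return (e_target ** delta_ct_target) / (e_ref ** delta_ct_ref)

# Perfect efficiencies; target crosses threshold one cycle earlier in the
# treated sample, reference gene unchanged:
ratio = relative_expression_ratio(2.0, 2.0, 25.0, 24.0, 20.0, 20.0)
# ratio == 2.0, i.e. a two-fold up-regulation
```

With imperfect efficiencies (e.g. 1.9 instead of 2.0 per cycle) the same ΔCt yields a smaller ratio, which is why the paper's finding that efficiency depends mostly on the amplicon matters for precision.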
Abstract:
Abstract (translated from the French résumé): Previous developments at the Institute of Geophysics in Lausanne made it possible to develop seismic acquisition techniques and to interpret 2D and 3D seismic data in order to study the geology of the region, in particular the different sedimentary sequences of Lake Geneva. To allow a quantitative interpretation of the seismic data by determining physical parameters of the sediments, the AVO (Amplitude Versus Offset) method was applied. Two lacustrine seismic surveys, 2D and 3D, were acquired in order to test the AVO method on the river deltas of the Grand Lac. The acquisition geometry was redesigned to record data at large offsets. The seismic streamers, deployed end to end, reached incidence angles of about 40°. GPS receivers specially developed for this purpose and placed along the streamer made it possible, after post-processing of the data, to determine the position of the streamer with an accuracy of ±0.5 m. Calibration of our hydrophones in an anechoic chamber characterized their amplitude response as a function of frequency; a maximum variation of 10 dB was found between the streamer sensors and the reference signal. Amplitude-preserving seismic processing was applied to the lake data, and a surface-consistent algorithm corrected the amplitude variations of the air-gun shots. The intercept and gradient sections obtained on the Aubonne and Dranse deltas were used to produce cross-plots, a representation that classifies amplitude anomalies according to sediment type and potential gas content. One of the attributes that can be extracted from 3D data is the amplitude of the reflectivity of a seismic interface, which adds a quantitative component to the geological interpretation of an interface.
The water bottom on the Aubonne delta shows amplitude anomalies that characterize the channels. Inversion of the Zoeppritz equation with the Levenberg-Marquardt algorithm was programmed in order to extract the physical parameters of the sediments on this delta. A statistical study of the inversion results makes it possible to simulate the variation of amplitude with offset. We obtained a model whose first layer is water and whose second layer has VP = 1461 m/s, ρ = 1182 kg/m³ and VS = 383 m/s.
Abstract: A system to record very high resolution (VHR) seismic data on lakes in 2D and 3D was developed at the Institute of Geophysics, University of Lausanne. Several seismic surveys carried out on Lake Geneva helped us to better understand the geology of the area and to identify sedimentary sequences. More sophisticated analysis of the data, such as the AVO (Amplitude Versus Offset) method, provides a means of deciphering the detailed structure of the complex Quaternary sedimentary fill of the Lake Geneva trough. To study the physical parameters of the sediments, we applied the AVO method at selected places: the Aubonne and Dranse river deltas, where the configurations of the strata are relatively smooth and the discontinuities between them are easy to pick. A specific layout was developed to acquire large incidence angles. 2D and 3D seismic data were acquired with streamers, deployed end to end, providing incidence angles of up to 40°. One or more GPS antennas attached to the streamer enabled us to calculate individual hydrophone positions with an accuracy of 50 cm after post-processing of the navigation data. To ensure that our system provides correct amplitude information, the streamer sensors were calibrated in an anechoic chamber using a loudspeaker as a source. Amplitude variations between hydrophones were of the order of 10 dB. An amplitude correction for each hydrophone was computed and applied before amplitude-preserving processing was carried out. Intercept vs. gradient cross-plots enabled us to determine that both geological discontinuities (lacustrine sediments/moraine and moraine/molasse) have well-defined trends. A 3D volume collected on the Aubonne river delta was processed in order to obtain AVO attributes. Quantitative interpretation using amplitude maps was carried out, and the maps revealed high reflectivity in channels. Inversion of the Zoeppritz equation at the water bottom using the Levenberg-Marquardt algorithm was carried out to estimate VP, VS and ρ of the sediments immediately under the lake bottom. Real-data inversion gave, under the water layer, a mud layer with VP = 1461 m/s, ρ = 1182 kg/m³ and VS = 383 m/s.
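The AVO intercept discussed above reduces, at normal incidence, to the acoustic reflection coefficient of the interface. A minimal sketch applying it to the inverted water-bottom model (the mud-layer values VP = 1461 m/s and ρ = 1182 kg/m³ come from the abstract; the water velocity and density are assumed typical lake values, not taken from the thesis):

```python
def normal_incidence_reflectivity(vp1, rho1, vp2, rho2):
    """P-wave reflection coefficient at normal incidence:
    R0 = (Z2 - Z1) / (Z2 + Z1), with acoustic impedance Z = rho * Vp.
    R0 is the 'intercept' term of an AVO intercept/gradient analysis.
    """
    z1 = rho1 * vp1
    z2 = rho2 * vp2
    return (z2 - z1) / (z2 + z1)

# Water layer over the inverted mud layer; water properties are assumptions:
r0 = normal_incidence_reflectivity(vp1=1450.0, rho1=1000.0,
                                   vp2=1461.0, rho2=1182.0)
# r0 ≈ 0.087 — a weak positive water-bottom reflection
```

Away from normal incidence the full Zoeppritz equations (with VS of the lower layer, 383 m/s here) govern the amplitude-versus-offset behaviour that the Levenberg-Marquardt inversion fits.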
Abstract:
Agricultural practices, such as spreading liquid manure or using land as animal pasture, can result in faecal contamination of water resources. Rhodococcus coprophilus is used in microbial source tracking to indicate animal faecal contamination in water. Methods previously described for detecting R. coprophilus in water were neither sensitive nor specific. Therefore, the aim of this study was to design and validate a new quantitative polymerase chain reaction (qPCR) assay to improve the detection of R. coprophilus in water. The new PCR assay was based on the R. coprophilus 16S rRNA gene. Validation showed that the new approach was specific and sensitive for deoxyribonucleic acid from the target host species. Compared with the other PCR assays tested in this study, the detection limit of the new qPCR was 1 to 3 log units lower. The method, including a filtration step, was further validated and successfully used in a field investigation in Switzerland. Our work demonstrated that the new detection method is sensitive and robust for detecting R. coprophilus in surface and spring water. Compared with the PCR assays available in the literature and with the culture-dependent method, the new molecular approach improves the detection of R. coprophilus.
Abstract:
For the detection and management of osteoporosis and osteoporosis-related fractures, quantitative ultrasound (QUS) is emerging as a relatively low-cost and readily accessible alternative to dual-energy X-ray absorptiometry (DXA) measurement of bone mineral density (BMD) in certain circumstances. The following is a brief but thorough review of the existing literature on the use of QUS in six settings: 1) assessing fragility fracture risk; 2) diagnosing osteoporosis; 3) initiating osteoporosis treatment; 4) monitoring osteoporosis treatment; 5) osteoporosis case finding; and 6) quality assurance and control. Many QUS devices exist that differ considerably in the parameters they measure and in the strength of empirical evidence supporting their use. In general, heel QUS appears to be the most tested and most effective. Overall, some, but not all, heel QUS devices are effective in assessing fracture risk in some, but not all, populations, the evidence being strongest for Caucasian females over 55 years old. Otherwise, the evidence is fair with respect to certain devices allowing accurate diagnosis of the likelihood of osteoporosis, and generally fair to poor regarding the use of QUS when initiating or monitoring osteoporosis treatment. A reasonable protocol is proposed herein for case-finding purposes, which relies on a combined assessment of clinical risk factors (CRF) and heel QUS. Finally, several recommendations are made for quality assurance and control.
Abstract:
The objective of this work was to validate, by real-time quantitative PCR (RT-qPCR), genes to be used as references in studies of gene expression in soybean under drought stress. Four genes commonly used in soybean were evaluated: Gmβ-actin, GmGAPDH, GmLectin and GmRNAr18S. Total RNA was extracted from six samples: three from roots in a hydroponic system with different drought intensities (0, 25, 50, 75 and 100 minutes of water stress), and three from leaves of plants grown in sand with different soil moistures (15, 5 and 2.5% gravimetric humidity). The raw cycle threshold (Ct) data were analyzed, and the efficiency of each primer pair was calculated for an overall analysis of the Ct range among the different samples. The geNorm application was used to identify the best reference gene according to its stability. GmGAPDH was the least stable gene, with the highest mean expression-stability value (M), and the most stable genes, with the lowest M values, were Gmβ-actin and GmRNAr18S, when both root and leaf samples were tested. These genes can be used as reference genes for expression analysis by RT-qPCR.
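The geNorm stability measure M mentioned above is defined as, for each candidate gene, the average standard deviation of the log-transformed expression ratios against every other candidate across samples; the lower M, the more stable the gene. A minimal sketch (the relative-quantity values below are invented for illustration, not data from the study):

```python
import math

def genorm_m(expression):
    """geNorm stability M for each candidate reference gene.

    `expression` maps gene name -> list of relative quantities (one per sample).
    M_j = mean over all other genes k of stdev(log2(a_j / a_k)) across samples.
    """
    def stdev(xs):
        mean = sum(xs) / len(xs)
        return (sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

    m_values = {}
    for j, aj in expression.items():
        spreads = []
        for k, ak in expression.items():
            if k == j:
                continue
            log_ratios = [math.log2(x / y) for x, y in zip(aj, ak)]
            spreads.append(stdev(log_ratios))
        m_values[j] = sum(spreads) / len(spreads)
    return m_values

# Hypothetical relative quantities over three samples: a wildly varying gene
# should receive the highest (worst) M.
quantities = {
    "Gmb_actin": [1.0, 1.1, 0.9],
    "GmGAPDH": [1.0, 3.0, 0.3],
    "GmRNAr18S": [1.0, 1.05, 0.95],
}
m_values = genorm_m(quantities)
```

In this toy data the ratio of the two stable genes barely changes between samples, so their pairwise log-ratio spread, and hence their M, stays small, mirroring the paper's ranking of Gmβ-actin and GmRNAr18S over GmGAPDH.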
Abstract:
PPARs are members of the nuclear hormone receptor superfamily and are primarily involved in lipid metabolism. The expression patterns of all 3 PPAR isotypes in 22 adult rat organs were analyzed by a quantitative ribonuclease protection assay. The data obtained allowed comparison of the expression of each isotype to the others and provided new insight into the less studied PPAR beta (NR1C2) expression and function. This isotype shows a ubiquitous expression pattern and is the most abundant of the three PPARs in all analyzed tissues except adipose tissue. Its expression is especially high in the digestive tract, in addition to kidney, heart, diaphragm, and esophagus. After an overnight fast, PPAR beta mRNA levels are dramatically down-regulated in liver and kidney by up to 80% and are rapidly restored to control levels upon refeeding. This tight nutritional regulation is independent of the circulating glucocorticoid levels and the presence of PPAR alpha, whose activity is markedly up-regulated in the liver and small intestine during fasting. Finally, PPAR gamma 2 mRNA levels are decreased by 50% during fasting in both white and brown adipose tissue. In conclusion, fasting can strongly influence PPAR expression, but in only a few selected tissues.
Abstract:
Rapid assessment methods are valuable tools for collecting information about the quality and status of natural systems; however, they are not a substitute for detailed surveys of those systems. Users of this method should consider that it may under- or over-score the site being assessed, especially when the site is not a typical fen (e.g. a sedge meadow could score lower than fens but in fact be a relatively high-quality sedge meadow). The assessment can be used throughout most of the spring, summer and fall; however, the ideal "index period" runs from late May through early October, when native plant communities, as well as invasive species, are most apparent.
Abstract:
Quantitative approaches in ceramology are gaining ground in excavation reports, archaeological publications and thematic studies. Hence, a wide variety of methods are being used depending on the researchers' theoretical premises, the type of material examined, the context of discovery and the questions addressed. The round table that took place in Athens in November 2008 was intended to offer the participants the opportunity to present a selection of case studies on the basis of which methodological approaches were discussed. The aim was to define a set of guidelines for quantification that would prove to be of use to all researchers. Contents: 1) Introduction (Samuel Verdan); 2) Isthmia and beyond. How can quantification help the analysis of EIA sanctuary deposits? (Catherine Morgan); 3) Approaching aspects of cult practice and ethnicity in Early Iron Age Ephesos using quantitative analysis of a Protogeometric deposit from the Artemision (Michael Kerschner); 4) Development of a ceramic cultic assemblage: Analyzing pottery from Late Helladic IIIC through Late Geometric Kalapodi (Ivonne Kaiser, Laura-Concetta Rizzotto, Sara Strack); 5) 'Erfahrungsbericht' on the application of different quantitative methods at Kalapodi (Sara Strack); 6) The Early Iron Age sanctuary at Olympia: counting sherds from the Pelopion excavations (1987-1996) (Birgitta Eder); 7) L'aire du pilier des Rhodiens à Delphes: Essai de quantification du mobilier (Jean-Marc Luce); 8) A new approach in ceramic statistical analyses: Pit 13 on Xeropolis at Lefkandi (David A. Mitchell, Irene S. Lemos); 9) Households and workshops at Early Iron Age Oropos: A quantitative approach to the fine, wheel-made pottery (Vicky Vlachou); 10) Counting sherds at Sindos: Pottery consumption and construction of identities in the Iron Age (Stefanos Gimatzidis); 11) Analyse quantitative du mobilier céramique des fouilles de Xombourgo à Ténos et le cas des supports de caisson (Jean-Sébastien Gros); 12) Defining a typology of pottery from Gortyn: The material from a pottery workshop pit (Emanuela Santaniello); 13) Quantification of ceramics from Early Iron Age tombs (Antonis Kotsonas); 14) Quantitative analysis of the pottery from the Early Iron Age necropolis of Tsikalario on Naxos (Xenia Charalambidou); 15) Finding the Early Iron Age in field survey: Two case studies from Boeotia and Magnesia (Vladimir Stissi); 16) Pottery quantification: Some guidelines (Samuel Verdan).
Abstract:
What follows are the refined guidelines from the Thin Maintenance Surface: Phase II Report. For that report, test sections were created and monitored along with some existing test sections. From the monitoring and evaluation of these test sections, literature reviews, and the experience and knowledge of the authors, the following guidelines were created. More information about thin maintenance surfaces and their uses can be found in the above-mentioned report.
Abstract:
In bottom-up proteomics, rapid and efficient protein digestion is crucial for data reliability. However, sample preparation remains one of the rate-limiting steps in proteomics workflows. In this study, we compared the conventional trypsin digestion procedure with two accelerated digestion protocols, based on shorter reaction times and on microwave-assisted digestion, for the preparation of membrane-enriched protein fractions of the human pathogenic bacterium Staphylococcus aureus. The resulting peptides were analyzed by Shotgun IPG-IEF, a methodology that separates peptides by IPG-IEF before the conventional LC-MS/MS steps of shotgun proteomics. Data obtained on two LC-MS/MS platforms showed that the accelerated digestion protocols, especially the one relying on microwave irradiation, enhanced the cleavage specificity of trypsin and thus improved digestion efficiency, particularly for hydrophobic and membrane proteins. Combining high-throughput proteomics with accelerated and efficient sample preparation should enhance the practicability of proteomics by reducing the time from sample collection to results.
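Digestion efficiency of the kind compared above is commonly assessed by counting missed tryptic cleavage sites in the identified peptides: trypsin cleaves after lysine (K) or arginine (R) except when the next residue is proline (P). A minimal sketch of that metric (the peptide sequences are invented for illustration; this is a generic rule-of-thumb, not the scoring used in the study):

```python
def missed_cleavages(peptide):
    """Count internal tryptic sites (K or R not followed by P) left uncut.

    Fully cleaved peptides have 0 missed cleavages; a rising share of
    peptides with > 0 missed cleavages indicates less efficient digestion.
    """
    return sum(
        1
        for i, residue in enumerate(peptide[:-1])
        if residue in "KR" and peptide[i + 1] != "P"
    )

# Hypothetical peptide sequences:
assert missed_cleavages("PEPTIDEK") == 0  # C-terminal K is the cut site itself
assert missed_cleavages("PEKTIDKR") == 2  # two internal K/R sites left uncut
assert missed_cleavages("LEKPIDER") == 0  # K before P is not a tryptic site
```

Aggregating this count over all identified peptides per protocol gives a simple, comparable digestion-efficiency figure per sample.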
Abstract:
Multi-center studies using magnetic resonance imaging facilitate studying small effect sizes, global population variance and rare diseases. The reliability and sensitivity of these multi-center studies crucially depend on the comparability of the data generated at different sites and time points. The level of inter-site comparability is still controversial for conventional anatomical T1-weighted MRI data. Quantitative multi-parameter mapping (MPM) was designed to provide MR parameter measures that are comparable across sites and time points, i.e., 1 mm high-resolution maps of the longitudinal relaxation rate (R1 = 1/T1), effective proton density (PD*), magnetization transfer saturation (MT) and effective transverse relaxation rate (R2* = 1/T2*). MPM was validated at 3T for use in multi-center studies by scanning five volunteers at three different sites. We determined the inter-site bias and the inter-site and intra-site coefficients of variation (CoV) for typical morphometric measures [i.e., gray matter (GM) probability maps used in voxel-based morphometry] and for the four quantitative parameters. The inter-site bias and CoV were smaller than 3.1% and 8%, respectively, except for the inter-site CoV of R2* (<20%). The GM probability maps based on the MT parameter maps had a 14% higher inter-site reproducibility than maps based on conventional T1-weighted images. The low inter-site bias and variance in the parameters and derived GM probability maps confirm the high comparability of the quantitative maps across sites and time points. The reliability, short acquisition time, high resolution and detailed insight into brain microstructure provided by MPM make it an efficient tool for multi-center imaging studies.
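The coefficient of variation reported above is simply the standard deviation divided by the mean of a parameter measured repeatedly, across sites (inter-site CoV) or within one site (intra-site CoV). A minimal sketch (the R1 values are hypothetical, not data from the study):

```python
def coefficient_of_variation(values):
    """CoV = sample standard deviation / mean.

    For multi-center QA, compute it per subject and per parameter over the
    values obtained at the different sites (inter-site CoV) or over repeated
    scans at one site (intra-site CoV).
    """
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return sd / mean

# Hypothetical R1 values (1/s) for one subject scanned at three sites:
inter_site_cov = coefficient_of_variation([0.98, 1.01, 1.00])
# roughly 1.5%, comfortably within the <8% figure quoted in the abstract
```

Averaging such per-subject CoVs over subjects and voxels yields the summary inter-site figures the study reports per parameter map.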