226 results for "Relation quantitative propriété-propriété"
Abstract:
A comparative analysis of the formation and effects of institutional regimes for natural resources in Switzerland. Starting from the observation of a significant and widespread increase in the consumption of natural resources, the project aims to examine, in the case of Switzerland, which types of institutional regimes (regimes comprising the full set of property, disposition and use rights applying to the various natural resources, together with the public exploitation and protection policies regulating them) are capable of preventing the overexploitation and degradation of these resources. Within this research project, funded by the Swiss National Science Foundation (Fonds national suisse de la recherche scientifique, FNRS), the first step is to analyse the historical trajectories of adaptation and change of the institutional regimes of the various resources over a period of roughly a century (1900-2000). This is the purpose of the various screenings. In a second step, using case studies, these transformations of (or within) institutional regimes are analysed in terms of their effects on the state of the resource. The ultimate ambition of this research is to understand the conditions for the emergence of "integrated regimes" capable of taking into account a growing number of user groups acting at different (geographical and institutional) levels and making increasingly heterogeneous and competing uses of these resources. The empirical scope of the research covers five resources in particular: water, air, soil, landscape and forest.
Abstract:
We analyze here the relation between alternative splicing and gene duplication in light of recent genomic data, with a focus on the human genome. We show that the previously reported negative correlation between level of alternative splicing and family size no longer holds true. We clarify this pattern and show that it is sufficiently explained by two factors. First, genes progressively gain new splice variants with time. The gain is consistent with a selectively relaxed regime, until purifying selection slows it down as aging genes accumulate a large number of variants. Second, we show that duplication does not lead to a loss of splice forms, but rather that genes with low levels of alternative splicing tend to duplicate more frequently. This leads us to reconsider the role of alternative splicing in duplicate retention.
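The family-size/alternative-splicing relationship discussed above is a rank-correlation question. As a minimal pure-Python sketch (the gene counts below are invented for illustration and are not the study's data), a Spearman correlation between splice-variant counts and family sizes can be computed as Pearson correlation on average ranks:

```python
# Hypothetical sketch: rank correlation between splice-variant count and gene
# family size. All numbers are invented toy data, not results from the paper.

def _ranks(values):
    # Average ranks (ties share the mean of their rank positions).
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank-transformed data."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Invented example: 6 genes where larger families carry fewer splice variants.
variants = [1, 2, 3, 5, 8, 12]
family_size = [9, 7, 6, 4, 3, 2]
rho = spearman(variants, family_size)  # perfectly anti-monotonic toy data
```

With real genomic data, ties and confounders (notably gene age, as the abstract argues) would of course need to be handled before interpreting such a coefficient.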
Abstract:
In the past decade, aquifers have increasingly become palaeoclimatic archives in their own right, alongside ice cores, sediments and other proxy records. The main tool for this task has been the noble gas palaeo-thermometer in combination with quantitative groundwater dating using radionuclides. Noble gas radionuclides play a unique role in environmental studies: their chemical inertness and low natural concentrations make them ideal tracers, but the same properties make them difficult to measure at natural concentration levels. For decades, therefore, low-level counting (LLC) was the only method for detecting radioisotopes of argon and krypton at atmospheric levels. More recently, with growing interest and a widening range of potential applications, analytical efforts with novel detection methods have intensified. In this talk, noble gas groundwater dating techniques spanning time scales from decades to millions of years are discussed in relation to noble gas palaeo-records at different locations in Europe and elsewhere.
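The radionuclide dating underlying this kind of work rests on the standard decay-age equation, t = (t_half / ln 2) * ln(C0 / C), where C/C0 is the measured fraction of the atmospheric (recharge) concentration. A small sketch, with approximate literature half-lives (the abstract itself gives no numbers, so treat these as illustrative assumptions):

```python
import math

# Illustrative age equation for radionuclide groundwater dating.
# Half-lives are approximate literature values, given here as assumptions.
HALF_LIFE_YEARS = {
    "Ar-39": 269.0,      # dates waters of roughly 50-1000 years
    "Kr-81": 229_000.0,  # dates waters up to about a million years
}

def groundwater_age(fraction_remaining, tracer):
    """Age in years from the measured C/C0 ratio of a decaying tracer."""
    t_half = HALF_LIFE_YEARS[tracer]
    return t_half / math.log(2) * math.log(1.0 / fraction_remaining)

# A sample retaining half its initial Ar-39 is one half-life old.
age = groundwater_age(0.5, "Ar-39")  # -> 269.0 years
```

The wide spread of half-lives between tracers is exactly why the talk can cover time scales from decades (85Kr) to millions of years (81Kr) within one methodological framework.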
Abstract:
This article describes the composition of fingermark residue as being a complex system with numerous compounds coming from different sources and evolving over time from the initial composition (corresponding to the composition right after deposition) to the aged composition (corresponding to the evolution of the initial composition over time). This complex system will additionally vary due to effects of numerous influence factors grouped in five different classes: the donor characteristics, the deposition conditions, the substrate nature, the environmental conditions and the applied enhancement techniques. The initial and aged compositions as well as the influence factors are thus considered in this article to provide a qualitative and quantitative review of all compounds identified in fingermark residue up to now. The analytical techniques used to obtain these data are also enumerated. This review highlights the fact that despite the numerous analytical processes that have already been proposed and tested to elucidate fingermark composition, advanced knowledge is still missing. Thus, there is a real need to conduct future research on the composition of fingermark residue, focusing particularly on quantitative measurements, aging kinetics and effects of influence factors. The results of future research are particularly important for advances in fingermark enhancement and dating technique developments.
Abstract:
Introduction: Intraoperative EMG-based neurophysiological monitoring is increasingly used to assist pedicle screw insertion. We carried out a study comparing the final screw position in the pedicle, measured on CT images, with the corresponding intraoperative compound muscle action potential (CMAP) values. Material and methods: A total of 189 screws were inserted in the thoracolumbar spines of 31 patients during instrumented fusion under EMG control. An observer, blinded to the CMAP value, assessed the horizontal and vertical 'screw edge to pedicle edge' distance perpendicular to the longitudinal axis of the screw on reformatted CT reconstructions using OsiriX software. These distances were analysed together with their corresponding CMAP values. Data from 62 thoracic and 127 lumbar screws were processed separately. Interobserver reliability of the distance measurements was assessed. Results: No patient suffered neurological injury secondary to screw insertion. Distance measurements were reliable (paired t-test, P = 0.13/0.98 horizontal/vertical). Two screws had their position altered due to low CMAP values suggesting close proximity of nerve tissue. Seventy-five percent of screws had CMAP results above 10 mA, with an average distance of 0.35 cm (SD 0.23) horizontally and 0.46 cm (SD 0.26) vertically from the pedicle edge. A further 12% had a distance from the edge of the pedicle of less than 0 mm, indicating cortical breach, but had CMAP values above 10 mA. A poor correlation between CMAP values and screw position was found. Discussion: In this study, CMAP values above 10 mA indicated correct screw position in the majority of cases. The 10-20 mA CMAP zone carries the highest risk of a misplaced screw despite a high CMAP value (17% of screws in this CMAP range). Further research, including improvement of probing techniques, is warranted to improve the predictive accuracy of EMG.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
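The gradual-deformation idea mentioned above can be sketched in a few lines (an illustrative sketch, not the thesis implementation; field size and theta are assumptions). Two independent Gaussian realizations are combined as m(theta) = m1*cos(theta) + m2*sin(theta), which again has the same Gaussian mean and covariance for any theta; a small theta therefore yields a small, statistics-preserving perturbation of the current MCMC state:

```python
import math
import random

# Sketch of a gradual-deformation MCMC proposal: combine the current Gaussian
# realization with an independent one; theta controls perturbation strength.

def gradual_deform(m1, m2, theta):
    """theta = 0 returns m1 unchanged; theta = pi/2 returns m2."""
    c, s = math.cos(theta), math.sin(theta)
    return [c * a + s * b for a, b in zip(m1, m2)]

rng = random.Random(42)
current = [rng.gauss(0.0, 1.0) for _ in range(1000)]      # current chain state
independent = [rng.gauss(0.0, 1.0) for _ in range(1000)]  # fresh realization

proposal = gradual_deform(current, independent, theta=0.1)  # gentle move
```

Because cos^2 + sin^2 = 1, the proposal stays a valid draw from the same Gaussian prior, which is what makes theta a clean tuning knob for the chain's step size, and hence for the number of iterations needed to reach independent posterior samples.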
Abstract:
Aim To explore the respective power of climate and topography to predict the distribution of reptiles in Switzerland, hence at a mesoscale level. A more detailed knowledge of these relationships, in combination with maps of the potential distribution derived from the models, is a valuable contribution to the design of conservation strategies. Location All of Switzerland. Methods Generalized linear models are used to derive predictive habitat distribution models from eco-geographical predictors in a geographical information system, using species data from a field survey conducted between 1980 and 1999. Results The maximum amount of deviance explained by climatic models is 65%, and 50% by topographical models. Low values were obtained with both sets of predictors for three species that are widely distributed in all parts of the country (Anguis fragilis, Coronella austriaca, and Natrix natrix), a result that suggests that including other important predictors, such as resources, should improve the models in further studies. With respect to topographical predictors, low values were also obtained for two species where we anticipated a strong response to aspect and slope, Podarcis muralis and Vipera aspis. Main conclusions Overall, both models and maps derived from climatic predictors more closely match the actual reptile distributions than those based on topography. These results suggest that the distributional limits of reptile species with a restricted range in Switzerland are largely set by climatic, predominantly temperature-related, factors.
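The "deviance explained" scores quoted above (65% climatic, 50% topographical) are the D-squared statistic for binomial GLMs: D2 = 1 - D_model / D_null. A hedged sketch of how it is computed, with invented presence/absence records and fitted probabilities (not the survey data):

```python
import math

# Sketch of deviance explained (D^2) for a binomial habitat model.
# Toy observations and fitted probabilities, invented for illustration.

def binomial_deviance(y, p):
    """-2 * log-likelihood of presence/absence data under probabilities p."""
    eps = 1e-12
    return -2.0 * sum(
        yi * math.log(max(pi, eps)) + (1 - yi) * math.log(max(1 - pi, eps))
        for yi, pi in zip(y, p)
    )

def deviance_explained(y, p):
    p_null = sum(y) / len(y)  # intercept-only (null) model: the prevalence
    d_null = binomial_deviance(y, [p_null] * len(y))
    return 1.0 - binomial_deviance(y, p) / d_null

y = [1, 1, 0, 0, 1, 0]              # presence/absence records (invented)
p = [0.9, 0.8, 0.2, 0.1, 0.7, 0.3]  # fitted climatic-model probabilities
d2 = deviance_explained(y, p)
```

A D2 of 0 means the model predicts no better than the species' overall prevalence; 1 means a perfect fit, which is the scale on which the 65% and 50% figures above should be read.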
Abstract:
OBJECTIVE: Prospective studies have shown that quantitative ultrasound (QUS) techniques predict the risk of fracture of the proximal femur with standardised risk ratios similar to those of dual-energy x-ray absorptiometry (DXA). Few studies have investigated these devices for the prediction of vertebral fractures. The Basel Osteoporosis Study (BOS) is a population-based prospective study to assess the performance of QUS devices and DXA in predicting incident vertebral fractures. METHODS: 432 women aged 60-80 years were followed up for 3 years. Incident vertebral fractures were assessed radiologically. Bone measurements using DXA (spine and hip) and QUS measurements (calcaneus and proximal phalanges) were performed. Measurements were assessed for their value in predicting incident vertebral fractures using logistic regression. RESULTS: QUS measurements at the calcaneus and DXA measurements discriminated between women with and without incident vertebral fracture (20% height reduction). The relative risks (RRs) for vertebral fracture, adjusted for age, were 2.3 for the Stiffness Index (SI) and 2.8 for the Quantitative Ultrasound Index (QUI) at the calcaneus, and 2.0 for bone mineral density at the lumbar spine. The predictive value (AUC (95% CI)) of QUS measurements at the calcaneus remained highly significant (0.70 for SI, 0.72 for QUI, and 0.67 for DXA at the lumbar spine) even after adjustment for other confounding variables. CONCLUSIONS: QUS of the calcaneus and bone mineral density measurements were shown to be significant predictors of incident vertebral fracture. The RRs for QUS measurements at the calcaneus are of similar magnitude to those for DXA measurements.
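The AUC values reported above (0.70, 0.72, 0.67) can be understood via the Mann-Whitney formulation: the AUC is the probability that a randomly chosen fracture case has a lower (worse) measurement than a randomly chosen non-case. A sketch with invented Stiffness Index values (not BOS data):

```python
# Sketch of the AUC statistic as a Mann-Whitney probability.
# All measurement values below are invented for illustration.

def auc_lower_is_worse(cases, controls):
    """Fraction of (case, control) pairs where the case's measurement is
    below the control's; ties count one half."""
    wins = 0.0
    for c in cases:
        for k in controls:
            if c < k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(cases) * len(controls))

fracture_si = [62.0, 70.0, 75.0]          # QUS Stiffness Index, fracture group
no_fracture_si = [80.0, 85.0, 74.0, 90.0]  # no incident fracture
auc = auc_lower_is_worse(fracture_si, no_fracture_si)
```

An AUC of 0.5 would mean no discrimination; the study's values around 0.7 indicate moderate but significant separation between the fracture and non-fracture groups.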
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
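The conditional-expectation step implied by such a kernel density can be sketched simply (a sketch under stated assumptions, not the authors' code): with product-Gaussian kernels, the mean of log K conditional on the observed log electrical conductivity reduces to a Nadaraya-Watson weighted average over the collocated calibration pairs. Bandwidth and the tiny calibration set below are illustrative assumptions:

```python
import math

# Sketch: conditional mean of log10(K) given log10(sigma), from a bivariate
# product-Gaussian kernel density over collocated calibration pairs.
# Bandwidth and data are invented for illustration.

def conditional_mean_logK(log_sigma, pairs, h_sigma=0.15):
    """E[log K | log sigma] implied by a product-Gaussian kernel density
    (the Nadaraya-Watson estimate)."""
    num = den = 0.0
    for ls, lk in pairs:
        w = math.exp(-0.5 * ((log_sigma - ls) / h_sigma) ** 2)
        num += w * lk
        den += w
    return num / den

# Collocated borehole calibration data (invented): (log10 sigma, log10 K).
calib = [(-2.0, -4.5), (-1.8, -4.2), (-1.5, -3.8), (-1.2, -3.4), (-1.0, -3.1)]

logK = conditional_mean_logK(-1.5, calib)  # estimate at an ERT-only location
```

In the actual algorithm this relation feeds a sequential simulation that also draws from the conditional spread, not just its mean, so that the simulated K field carries realistic small-scale variability.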
Abstract:
While illness is an individual matter, health belongs to the collective: it has become a societal issue and an analyser of the social that lies at the heart of anthropological questioning. Having become a reference value, health is defined as a socially and politically elaborated construction whose limits are constantly renegotiated. This process is unfolding in a context unprecedented in human history. Indeed, we are currently witnesses to, as well as actors in, novel and fundamental transformations: the ongoing globalisation, characterised by the simultaneous emergence of diverse social and cultural innovations, is accompanied by a considerable shift in references. This evolution is transforming the logic of supply and demand, particularly with regard to the issue of health and migration. From this perspective, the care of asylum seekers must be accompanied by an analysis of the political, ethical and cultural ends underlying the crisis and the solutions sought within health systems.
Abstract:
Despite the central role of quantitative PCR (qPCR) in the quantification of mRNA transcripts, most analyses of qPCR data are still delegated to the software that comes with the qPCR apparatus. This is especially true for the handling of the fluorescence baseline. This article shows that baseline estimation errors are directly reflected in the observed PCR efficiency values and are thus propagated exponentially in the estimated starting concentrations as well as 'fold-difference' results. Because of the unknown origin and kinetics of the baseline fluorescence, the fluorescence values monitored in the initial cycles of the PCR reaction cannot be used to estimate a useful baseline value. An algorithm that estimates the baseline by reconstructing the log-linear phase downward from the early plateau phase of the PCR reaction was developed and shown to lead to very reproducible PCR efficiency values. PCR efficiency values were determined per sample by fitting a regression line to a subset of data points in the log-linear phase. The variability, as well as the bias, in qPCR results was significantly reduced when the mean of these PCR efficiencies per amplicon was used in the calculation of an estimate of the starting concentration per sample.
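The per-sample efficiency fit described above can be sketched as follows (synthetic, noise-free data with an assumed efficiency of 1.9 and starting fluorescence of 1e-6; baseline subtraction is assumed already done, and this is an illustration of the fitting idea, not the authors' published algorithm). Under the model F_c = F0 * E^c, log10(F) is linear in the cycle number, so the slope of a regression line through the log-linear phase gives E = 10^slope and the intercept an estimate of F0:

```python
import math

# Sketch: estimate PCR efficiency E and starting fluorescence F0 by fitting
# a line to log10(fluorescence) over the log-linear window of cycles.
# Synthetic noise-free data; window choice and parameters are assumptions.

def fit_efficiency(cycles, fluorescence):
    """Least-squares line through (cycle, log10 F); returns (E, F0)."""
    logf = [math.log10(f) for f in fluorescence]
    n = len(cycles)
    mx = sum(cycles) / n
    my = sum(logf) / n
    slope = sum((c - mx) * (y - my) for c, y in zip(cycles, logf)) / \
            sum((c - mx) ** 2 for c in cycles)
    intercept = my - slope * mx
    return 10.0 ** slope, 10.0 ** intercept

cycles = list(range(18, 25))                # assumed log-linear window
signal = [1e-6 * 1.9 ** c for c in cycles]  # synthetic data: E=1.9, F0=1e-6
efficiency, f0 = fit_efficiency(cycles, signal)
```

Because the estimated E enters the back-calculation of starting concentration as an exponent, even small baseline-induced errors in the slope propagate exponentially, which is precisely the failure mode the article quantifies.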