80 results for Financial Integration
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field.
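As a rough illustration of the two-step idea described above, the following Python sketch first spreads a coarse, ERT-style electrical-conductivity profile onto a fine grid using the small-scale variability seen in collocated borehole logs, and then converts each downscaled value into a hydraulic-conductivity sample via a relationship calibrated on those logs. All data, grid sizes and noise levels are assumptions for demonstration; this is not the thesis' Bayesian sequential simulation algorithm, which also honours the spatial correlation structure of the fields.

```python
# Toy two-step sketch (illustrative only; not the thesis' Bayesian sequential
# simulation algorithm). Step 1 "downscales" a coarse, ERT-style electrical
# conductivity profile by adding the small-scale variability observed in
# collocated borehole logs; step 2 converts each downscaled value into a
# hydraulic conductivity sample via a regression calibrated on those logs.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical collocated borehole data: log10 electrical (sigma) and hydraulic (K) conductivity
n_bh = 200
log_sigma_bh = rng.normal(-2.0, 0.5, n_bh)
log_K_bh = 1.5 * log_sigma_bh - 1.0 + rng.normal(0.0, 0.3, n_bh)

# Step 1: downscale a coarse ERT-style profile to a fine grid
coarse_log_sigma = np.array([-2.4, -2.0, -1.6, -2.1])   # one value per coarse block
cells_per_block = 10
resid_sd = 0.2                                           # assumed sub-block variability
fine_log_sigma = (np.repeat(coarse_log_sigma, cells_per_block)
                  + rng.normal(0.0, resid_sd, coarse_log_sigma.size * cells_per_block))

# Step 2: relate the downscaled sigma values to K using the borehole-calibrated regression
slope, intercept = np.polyfit(log_sigma_bh, log_K_bh, 1)
resid = log_K_bh - (slope * log_sigma_bh + intercept)
fine_log_K = (slope * fine_log_sigma + intercept
              + rng.normal(0.0, resid.std(), fine_log_sigma.size))

print(fine_log_K.round(2))
```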
The results also indicate that this novel data integration approach is remarkably flexible and robust, and hence can be expected to be applicable to a wide range of geophysical and hydrological data across all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proves to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
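A minimal sketch of the gradual-deformation idea, under the assumption of a zero-mean multivariate Gaussian prior and a toy linear forward model (neither taken from the thesis): the current realization m and an independent prior realization z are combined as m' = m·cos(θ) + z·sin(θ), which leaves the prior covariance unchanged, so the Metropolis acceptance test only needs the likelihood ratio, and θ sets the perturbation strength.

```python
# Conceptual gradual-deformation proposal inside a Metropolis-Hastings loop
# (illustrative assumptions throughout; not the thesis code). Because the
# proposal preserves the Gaussian prior, acceptance uses the likelihood ratio.
import numpy as np

rng = np.random.default_rng(1)

n = 50                                                    # number of model parameters (toy size)
cov = np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 10.0)
L = np.linalg.cholesky(cov)                               # exponential prior covariance

def prior_realization():
    return L @ rng.standard_normal(n)                     # correlated Gaussian prior sample

G = rng.standard_normal((20, n)) / np.sqrt(n)             # toy linear forward operator
d_obs = G @ prior_realization() + 0.05 * rng.standard_normal(20)

def log_likelihood(m, sd=0.05):
    r = d_obs - G @ m
    return -0.5 * np.sum((r / sd) ** 2)

theta = 0.2                                               # perturbation strength (radians)
m, ll, accepted = prior_realization(), None, 0
ll = log_likelihood(m)
for it in range(5000):
    z = prior_realization()
    m_prop = m * np.cos(theta) + z * np.sin(theta)        # gradual-deformation proposal
    ll_prop = log_likelihood(m_prop)
    if np.log(rng.random()) < ll_prop - ll:               # prior terms cancel in the ratio
        m, ll = m_prop, ll_prop
        accepted += 1
print("acceptance rate:", accepted / 5000)
```

Smaller θ values give higher acceptance rates but more strongly correlated chain states; the trade-off between the two governs how many iterations are needed per effectively independent posterior sample.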
Abstract:
In humans, spatial integration develops slowly, continuing through childhood into adolescence. On the assumption that this protracted course depends on the formation of networks with slowly developing top-down connections, we compared effective connectivity in the visual cortex between 13 children (age 7-13) and 14 adults (age 21-42) using a passive perceptual task. The subjects were scanned while viewing bilateral gratings, which either obeyed Gestalt grouping rules [colinear gratings (CG)] or violated them [non-colinear gratings (NG)]. The regions of interest for dynamic causal modeling were determined from activations in functional MRI contrasts stimuli > background and CG > NG. They were symmetrically located in V1 and V3v areas of both hemispheres. We studied a common model, which contained reciprocal intrinsic and modulatory connections between these regions. An analysis of effective connectivity showed that top-down modulatory effects generated at an extrastriate level and interhemispheric modulatory effects between primary visual areas (all inhibitory) are significantly weaker in children than in adults, suggesting that the formation of feedback and interhemispheric effective connections continues into adolescence. These results are consistent with a model in which spatial integration at an extrastriate level results in top-down messages to the primary visual areas, where they are supplemented by lateral (interhemispheric) messages, making perceptual encoding more efficient and less redundant. Abnormal formation of top-down inhibitory connections can lead to the reduction of habituation observed in migraine patients.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the simulated hydraulic conductivity fields. Our results indicate that the proposed procedure allows for obtaining remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
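The kernel-density step can be pictured with a short sketch: a bivariate Gaussian KDE is fitted to collocated log electrical/hydraulic conductivity pairs, and the conditional distribution of log K at a given log σ is obtained by evaluating the joint density along a K grid and renormalizing. The synthetic data and the default bandwidth below are assumptions for illustration; the study's own estimator and conditioning scheme are not reproduced here.

```python
# Illustrative non-parametric (kernel-density) description of the relationship
# between collocated electrical and hydraulic conductivities. The "borehole"
# data are synthetic stand-ins, not the site data.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(42)

# Hypothetical collocated borehole logs (log10 values)
log_sigma = rng.normal(-2.0, 0.5, 300)
log_K = 1.2 * log_sigma - 0.5 + rng.normal(0.0, 0.25, 300)

joint_kde = gaussian_kde(np.vstack([log_sigma, log_K]))

def conditional_logK_pdf(sigma_value, k_grid):
    """Evaluate p(log K | log sigma = sigma_value) on k_grid, normalized to integrate to 1."""
    pts = np.vstack([np.full_like(k_grid, sigma_value), k_grid])
    dens = joint_kde(pts)
    return dens / np.trapz(dens, k_grid)

k_grid = np.linspace(log_K.min() - 1.0, log_K.max() + 1.0, 400)
pdf = conditional_logK_pdf(-1.8, k_grid)
print("conditional mean of log K at log sigma = -1.8:", np.trapz(k_grid * pdf, k_grid))
```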
Abstract:
The MIGCLIM R package is a function library for the open source R software that enables the implementation of species-specific dispersal constraints into projections of species distribution models under environmental change and/or landscape fragmentation scenarios. The model is based on a cellular automaton and the basic modeling unit is a cell that is inhabited or not. Model parameters include dispersal distance and kernel, long distance dispersal, barriers to dispersal, propagule production potential and habitat invasibility. The MIGCLIM R package has been designed to be highly flexible in the parameter values it accepts, and to offer good compatibility with existing species distribution modeling software. Possible applications include the projection of future species distributions under environmental change conditions and modeling the spread of invasive species.
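A conceptual sketch of the kind of cellular-automaton dispersal step the abstract describes is given below. This is not the MIGCLIM R package's interface; the grid, kernel and parameter names are invented for illustration, and the package's explicit parameters (dispersal kernel, long-distance dispersal, propagule production, invasibility) are collapsed here into a single distance-decaying colonization probability.

```python
# Conceptual cellular-automaton dispersal step (illustration only; not the
# MIGCLIM API). Occupied cells attempt to colonize suitable cells within
# `max_dist`, with a colonization probability that decays with distance.
import numpy as np

rng = np.random.default_rng(7)

suitable = rng.random((50, 50)) > 0.3        # hypothetical habitat suitability mask
occupied = np.zeros_like(suitable, dtype=bool)
occupied[25, 25] = True                      # single initial colonized cell

def dispersal_step(occupied, suitable, max_dist=3, p0=0.4):
    new = occupied.copy()
    rows, cols = np.nonzero(occupied)
    for r, c in zip(rows, cols):
        for dr in range(-max_dist, max_dist + 1):
            for dc in range(-max_dist, max_dist + 1):
                rr, cc = r + dr, c + dc
                d = np.hypot(dr, dc)
                if (0 < d <= max_dist and 0 <= rr < occupied.shape[0]
                        and 0 <= cc < occupied.shape[1] and suitable[rr, cc]):
                    if rng.random() < p0 / d:  # colonization probability decays with distance
                        new[rr, cc] = True
    return new

for step in range(5):                        # five dispersal steps (e.g. five time steps/years)
    occupied = dispersal_step(occupied, suitable)
print("occupied cells after 5 steps:", int(occupied.sum()))
```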
Abstract:
Fraud is as old as mankind. An enormous number of historical documents attest to the interplay between truth and untruth, so it is not really surprising that the prevalence of publication discrepancies is increasing. More surprising is the astonishment that new cases, especially in the medical field, still generate. Financial mathematics offers a statistical tool for fraud detection that builds on the observations of Newcomb and Benford regarding the distribution of naturally occurring numbers: the distribution of leading digits is not uniform, and lower digits occur more frequently than higher ones. In this investigation, all numbers contained in the blinded abstracts submitted to the 2009 annual meeting of the Swiss Society of Anesthesia and Resuscitation (SGAR) were recorded and analyzed with respect to this distribution. A deliberately manipulated (faked) abstract was also included in the investigation. The χ²-test was used to determine statistical differences between expected and observed counts of numbers, with p<0.05 considered significant. The distribution of the 1,800 numbers in the 77 submitted abstracts followed Benford's law. The manipulated abstract was detected by statistical means (expected versus observed difference, p<0.05). Statistics cannot prove whether content is true or not, but they can give serious hints that conspicuous material warrants closer scrutiny. These are the first results of a test of the distribution of numbers presented in medical research.
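The test described above can be reproduced in outline as follows: observed first-digit frequencies are compared against the Benford expectation log10(1 + 1/d) with a χ² goodness-of-fit test. The counts below are invented for illustration; they are not the counts from the SGAR 2009 abstracts.

```python
# Sketch of a Benford's-law check on first digits using a chi-square
# goodness-of-fit test. The observed counts are made up for illustration.
import numpy as np
from scipy.stats import chisquare

digits = np.arange(1, 10)
benford_p = np.log10(1.0 + 1.0 / digits)                        # expected first-digit probabilities

observed = np.array([301, 176, 125, 97, 79, 67, 58, 51, 46])    # hypothetical first-digit counts
expected = benford_p * observed.sum()

stat, pval = chisquare(observed, f_exp=expected)
print(f"chi2 = {stat:.2f}, p = {pval:.3f}")                     # p < 0.05 would flag a suspicious distribution
```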
Abstract:
QUESTION UNDER STUDY: Thirty-day readmissions can be classified as potentially avoidable (PARs) or not avoidable (NARs) by following a specific algorithm (SQLape®). We wanted to assess, for PARs at our hospital, the financial impact of the Swiss-DRG system, which regroups some readmissions occurring within 18 days of discharge into the initial hospital stay. METHODS: First, PARs were identified from all hospitalisations recorded in 2011 at our university hospital. Second, 2012 Swiss-DRG readmission rules were applied, regrouped readmissions (RRs) were identified, and their financial impact was computed. Third, RRs were classified as potentially avoidable (PARRs), not avoidable (NARRs), and other causes (OCRRs). The characteristics of PARR patients and stays were retrieved, and the financial impact of PARRs was computed. RESULTS: A total of 36,777 hospitalisations were recorded in 2011, of which 3,140 were considered readmissions (8.5%): 1,470 PARs (46.8%) and 1,733 NARs (53.2%). The 2012 Swiss-DRG rules would have resulted in 910 RRs (2.5% of hospitalisations, 29% of readmissions): 395 PARRs (43% of RRs), 181 NARRs (20%), and 334 OCRRs (37%). The loss in reimbursement would have amounted to CHF 3.157 million (0.6% of total reimbursement). As many as 95% of the 395 PARR patients lived at home. In total, 28% of PARRs occurred within 3 days after discharge, and 58% lasted less than 5 days; 79% of the patients were discharged home again. The loss in reimbursement would amount to CHF 1.771 million. CONCLUSION: PARs represent a sizeable share of 30-day readmissions, as do PARRs of 18-day RRs under the 2012 Swiss-DRG system. They should be the focus of attention, as the PARRs represent an avoidable loss in reimbursement.
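Purely to illustrate the bookkeeping involved (the actual Swiss-DRG regrouping rules and the SQLape® avoidability classification are considerably more detailed and are not reproduced here), a sketch like the following flags stays that are readmissions within 18 days of the previous discharge for the same patient and sums the reimbursement that would be at stake if they were regrouped.

```python
# Toy sketch of flagging readmissions within an 18-day window per patient and
# summing the reimbursement at stake. Data, field names and the single-window
# rule are illustrative assumptions, not the real Swiss-DRG or SQLape rules.
from datetime import date

stays = [  # (patient_id, admission, discharge, reimbursement in CHF)
    ("P1", date(2011, 1, 3), date(2011, 1, 10), 12000),
    ("P1", date(2011, 1, 20), date(2011, 1, 24), 8000),    # readmitted 10 days after discharge
    ("P2", date(2011, 2, 1), date(2011, 2, 5), 6000),
    ("P2", date(2011, 3, 15), date(2011, 3, 18), 7000),     # readmitted 38 days later: not regrouped
]

stays.sort(key=lambda s: (s[0], s[1]))                      # sort by patient, then admission date
regrouped, at_stake_chf = [], 0
for prev, cur in zip(stays, stays[1:]):
    same_patient = prev[0] == cur[0]
    gap_days = (cur[1] - prev[2]).days
    if same_patient and 0 <= gap_days <= 18:
        regrouped.append(cur)
        at_stake_chf += cur[3]                              # reimbursement at stake if regrouped

print(len(regrouped), "regrouped readmissions, CHF", at_stake_chf, "at stake")
```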
Abstract:
In its fifth decade of existence, the construct of schizotypy is recapturing the early scientific interest it attracted when Paul E. Meehl (1920-2003), who coined the term, pioneered the field of schizotypy research. The International Lemanic Workshop on Schizotypy, hosted at the University of Geneva in December 2013, recently offered an opportunity to address some of the fundamental questions in contemporary schizotypy research and situate the construct in the greater scheme of future scientific projects on schizophrenia and psychological health research. What kind of knowledge has schizotypy research provided in furthering our understanding of schizophrenia? What types of questions can schizotypy research tackle, and which are the conceptual and methodological frameworks to address them? How will schizotypy research contribute to future scientific endeavors? The International Lemanic Workshop brought together leading experts in the field around the tasks of articulating the essential findings in schizotypy research, as well as providing some key insights and guidance to face scientific challenges of the future. The current supplement contains 8 position articles, 4 research articles, and 1 invited commentary that outline the state of the art in schizotypy research today.