913 results for "integration of methods"


Relevance:

100.00%

Publisher:

Abstract:

ABSTRACT: Nigeria has an estimated population of about 170 million people, but the number of mental health professionals is very small, with about 150 psychiatrists. This roughly translates to a psychiatrist-to-population ratio of more than 1:1 million people. The National Mental Health Policy of 1991 recognized this deficiency and recommended the integration of mental health into the primary health care (PHC) delivery system. After more than two decades, this policy has yet to be implemented. This study aimed to map out the organizational structure of the mental health systems in Nigeria, and to explore the challenges and barriers preventing the successful integration of mental health into primary health care, from the perspective of primary health care workers.
A mixed-methods exploratory sequential study design was employed, which entails sequential timing in the combined methods of data collection. A combination of qualitative and quantitative approaches, applied in sequence, was used to understand the problems of mental health services integration into PHC in Nigeria. The initial qualitative phase used free-listing interviews with 30 PHC workers, followed by two focus group discussions with primary care workers from two Local Government Areas (LGAs) of Oyo State, to gain insight into the local perspectives of PHC workers on the challenges and barriers preventing successful integration of mental health care services into PHC. Subsequently, four key informant interviews with PHC coordinators and mental health experts were carried out. The findings from the qualitative study were used to develop a quantitative questionnaire to capture the opinions of a larger and more representative sample of PHC staff in two more LGAs of Oyo State, as well as two LGAs from Osun State. The common barriers identified in this study include stigma and misconceptions about mental illness, inadequate training of PHC staff in mental health, low government priority, fear of aggression and violence on the part of PHC staff, and non-availability of medications. Recommendations for overcoming these challenges include improved and sustained mental health advocacy to gain governmental attention and support, organized training and retraining for primary care staff, establishment of referral and support networks with neighbouring tertiary facilities, and community engagement to improve service utilization and rehabilitation of mentally ill persons. These findings provide useful insight into the barriers to the successful integration of mental health into PHC, while recommending a holistic and comprehensive approach.
This information can guide future attempts to implement the integration of mental health into primary care in Nigeria.


Ground-penetrating radar (GPR) and microgravimetric surveys have been conducted in the southern Jura mountains of western Switzerland in order to map subsurface karstic features. The study site, La Grande Rolaz cave, is an extensive system in which many portions have been mapped. By using small station spacing and careful processing of the geophysical data, and by modeling these data with topographic information from within the cave, accurate interpretations have been achieved. The constraints on the interpreted geologic models are better when combining the geophysical methods than when using only one of them, despite the general limitations of two-dimensional (2D) profiling. For example, microgravimetry can complement GPR methods in accurately delineating a shallow cave section approximately 10 × 10 m in size. Conversely, GPR methods can be complementary in determining cavity depths and in verifying the presence of off-line features and numerous areas of small cavities and fractures, which may be difficult to resolve in microgravimetric data.
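The microgravimetric side of such a survey rests on the gravity effect of a density deficit at depth. As a rough illustration (not taken from the paper), the classic buried-sphere approximation shows why a cavity of about this size produces an anomaly of tens of microGal, within microgravimeter resolution; all numbers below (depth, radius, rock density) are assumed:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_anomaly_ugal(x, depth, radius, drho):
    """Vertical gravity anomaly (in microGal) at surface offset x
    from a buried sphere of density contrast drho (kg/m^3)."""
    mass = (4.0 / 3.0) * math.pi * radius**3 * drho
    dg = G * mass * depth / (x**2 + depth**2) ** 1.5  # in m/s^2
    return dg * 1e8  # 1 m/s^2 = 1e8 microGal

# Air-filled cavity: density contrast ~ -2700 kg/m^3 vs. limestone
peak = sphere_anomaly_ugal(0.0, depth=10.0, radius=5.0, drho=-2700.0)
print(f"peak anomaly: {peak:.1f} microGal")  # ~ -94 microGal
```

The negative sign reflects the mass deficit; off-line cavities reduce the peak roughly with the 3/2 power of the offset, which is why small, scattered voids can be hard to resolve gravimetrically.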


Global positioning systems (GPS) offer a cost-effective and efficient method to input and update transportation data. The spatial locations of objects provided by GPS are easily integrated into geographic information systems (GIS), and the storage, manipulation, and analysis of spatial data are relatively simple in a GIS. However, many data storage and reporting methods at transportation agencies rely on linear referencing methods (LRMs); consequently, GPS data must be able to link with linear referencing. Unfortunately, the two systems are fundamentally incompatible in the way data are collected, integrated, and manipulated. For spatial data collected using GPS to be integrated into a linear referencing system or shared among LRMs, a number of issues need to be addressed. This report documents and evaluates several of those issues and offers recommendations. To evaluate the issues associated with integrating GPS data with an LRM, a pilot study was created: point features, a linear datum, and a spatial representation of an LRM were created for six test roadway segments located within the boundaries of the pilot study conducted by the Iowa Department of Transportation linear referencing system project team. Various issues in integrating point features with an LRM, or between LRMs, are discussed and recommendations provided. The accuracy of the GPS is discussed, including issues such as point features mapping to the wrong segment. Another topic is the loss of spatial information that occurs when a three-dimensional or two-dimensional spatial point feature is converted to a one-dimensional representation on an LRM. Recommendations such as storing point features as spatial objects where necessary, or preserving information such as coordinates and elevation, are suggested. The lack of spatial accuracy characteristic of most cartography, on which LRMs are often based, is another topic discussed.
The associated issues include linear and horizontal offset error. The final topic discussed is some of the issues in transferring point feature data between LRMs.
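The measure/offset conversion the report discusses can be sketched with a hypothetical snapping routine: a 2-D GPS point is projected onto the route polyline, yielding a 1-D measure along the route, while the perpendicular offset (and any elevation) is exactly the information lost in the linear representation. Function and variable names are illustrative, not from the report:

```python
import math

def project_to_route(point, polyline):
    """Snap a 2-D point to a polyline and return (measure, offset):
    measure = distance along the route to the snapped location,
    offset  = perpendicular distance discarded by the 1-D representation."""
    px, py = point
    best = (None, float("inf"))   # (measure, offset)
    run = 0.0                     # cumulative length of earlier segments
    for (ax, ay), (bx, by) in zip(polyline, polyline[1:]):
        dx, dy = bx - ax, by - ay
        seg_len = math.hypot(dx, dy)
        # parameter of the perpendicular foot, clamped to the segment
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len**2))
        fx, fy = ax + t * dx, ay + t * dy
        off = math.hypot(px - fx, py - fy)
        if off < best[1]:
            best = (run + t * seg_len, off)
        run += seg_len
    return best

# An L-shaped test route and a GPS fix 7.5 m off the first segment
route = [(0.0, 0.0), (100.0, 0.0), (100.0, 80.0)]
measure, offset = project_to_route((40.0, 7.5), route)
print(measure, offset)  # 40.0 along the route; the 7.5 m offset is lost
```

A GPS error larger than half the spacing between parallel routes would make `off` smallest on the wrong polyline, which is the mis-assignment problem the report raises.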


This paper describes advances in ground-based thermodynamic profiling of the lower troposphere through sensor synergy. The well-documented integrated profiling technique (IPT), which uses a microwave profiler, a cloud radar, and a ceilometer to simultaneously retrieve vertical profiles of temperature, humidity, and liquid water content (LWC) of nonprecipitating clouds, is further developed toward enhanced performance in the boundary layer and lower troposphere. For a more accurate temperature profile, this is accomplished by including an elevation-scanning measurement mode of the microwave profiler. Height-dependent RMS accuracies of temperature (humidity) ranging from 0.3 to 0.9 K (0.5–0.8 g m−3) in the boundary layer are derived from retrieval simulations and confirmed experimentally with measurements at distinct heights taken during the 2005 International Lindenberg Campaign for Assessment of Humidity and Cloud Profiling Systems and its Impact on High-Resolution Modeling (LAUNCH) of the German Weather Service. Temperature inversions, especially in the lower boundary layer, are captured in a very satisfactory way by using the elevation-scanning mode. To improve the quality of liquid water content measurements in clouds, the authors incorporate a sophisticated target classification scheme developed within the European cloud observing network CloudNet. It allows the detailed discrimination between different types of backscatterers detected by cloud radar and ceilometer. Finally, to allow IPT application also to drizzling cases, an LWC profiling method is integrated. This technique classifies the detected hydrometeors into three different size classes using thresholds determined by radar reflectivity and/or ceilometer extinction profiles. By inclusion into IPT, the retrieved profiles are made consistent with the measurements of the microwave profiler and an a priori LWC profile.
Results of IPT application to 13 days of the LAUNCH campaign are analyzed, and the importance of integrated profiling for model evaluation is underlined.


Integrating evidence from different imaging modalities is important to overcome specific limitations of any given imaging method, such as insensitivity of the EEG to unsynchronized neural events, or the lack of fMRI sensitivity to events of low metabolic demand. Processes that are visible in one modality may be related in a nontrivial way to other processes visible in another modality and insight may only be obtained by integrating both methods through a common analysis. For example, brain activity at rest seems to be at least partly determined by an interaction of cortical rhythms (visible to EEG but not to fMRI) with sub-cortical activity (visible to fMRI, but usually not to EEG without averaging). A combination of EEG and fMRI data during rest may thus be more informative than the sum of two separate analyses in both modalities. Integration is also an important source of converging evidence about specific aspects and general principles of neural functions and their dysfunctions in certain pathologies. This is because not only electrical, but also energetic, biochemical, hemodynamic and metabolic processes characterize neural states and functions, and because brain structure provides crucial constraints upon neural functions. Focusing on multimodal integration of functional data should not distract from the privileged status of the electric field as the primary direct, noninvasive real-time measure of neural transmission. The preceding chapters illustrate how electrical neuroimaging has turned scalp EEG into an imaging modality which directly captures the full temporal dynamics of neural activity in the brain.


The main drivers for the development and evolution of Cyber Physical Systems (CPS) are the reduction of development costs and time along with the enhancement of the designed products. The aim of this survey paper is to provide an overview of different types of systems and the associated transition process from mechatronics to CPS and cloud-based (IoT) systems. It further considers the requirement that methodologies for CPS design should be part of a multi-disciplinary development process within which designers should focus not only on the separate physical and computational components, but also on their integration and interaction. Challenges related to CPS design are therefore considered in the paper from the perspectives of the physical processes, computation, and integration, respectively. Illustrative case studies are selected from different system levels, starting with the description of the overarching concept of Cyber Physical Production Systems (CPPSs). The analysis and evaluation of the specific properties of a sub-system using a condition monitoring system, important for maintenance purposes, is then given for a wind turbine.


Here, we study the stable integration of real-time optimization (RTO) with model predictive control (MPC) in a three-layer structure. The intermediate layer is a quadratic programming (QP) problem whose objective is to compute reachable targets for the MPC layer that lie at the minimum distance from the optimum set points produced by the RTO layer. The lower layer is an infinite-horizon MPC with guaranteed stability, with additional constraints that enforce the feasibility and convergence of the target calculation layer. The case in which there is polytopic uncertainty in the steady-state model used in the target calculation is also considered. The dynamic part of the MPC model is likewise assumed unknown, but representable by one member of a discrete set of models. The efficiency of the methods presented here is illustrated with the simulation of a low-order system.
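The role of the intermediate target-calculation layer can be illustrated with a deliberately minimal single-input, single-output sketch (not the paper's formulation): with a scalar steady-state gain and box constraints, the distance-minimizing QP collapses to clamping onto the reachable interval. All names and numbers below are assumed:

```python
def reachable_target(u_rto, gain, u_bounds, y_bounds):
    """Toy target-calculation layer for a SISO steady-state model y = gain*u:
    return the input target closest to the RTO optimum u_rto that keeps
    both the input and the predicted steady-state output within bounds."""
    lo_u, hi_u = u_bounds
    lo_y, hi_y = y_bounds
    # Map the output bounds into input bounds through the steady-state gain
    lo = max(lo_u, min(lo_y / gain, hi_y / gain))
    hi = min(hi_u, max(lo_y / gain, hi_y / gain))
    if lo > hi:
        raise ValueError("no reachable steady state under the constraints")
    # Minimizing distance to u_rto over an interval reduces to clamping
    u_t = min(max(u_rto, lo), hi)
    return u_t, gain * u_t

# RTO asks for u = 5, but the output constraint y <= 6 caps u at 3
u_t, y_t = reachable_target(u_rto=5.0, gain=2.0,
                            u_bounds=(0.0, 10.0), y_bounds=(0.0, 6.0))
print(u_t, y_t)  # 3.0 6.0 — the RTO optimum pulled back to the reachable set
```

In the multivariable case of the paper the same projection is a genuine QP; handing the MPC an already-reachable target is what makes the stability and convergence arguments of the lower layer go through.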


This paper presents a method of formally specifying, refining, and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling the complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification, we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.


Nowadays, various standards exist for individual management systems (MSs), at least one for each stakeholder, and new ones will be published. An integrated management system (IMS) aims to integrate some or all components of the business into one coherent and efficient MS. Maximizing integration is increasingly a strategic priority, as it constitutes an opportunity to eliminate and/or reduce potential factors of value destruction for organizations, to be more competitive, and consequently to promote sustainable success. A preliminary investigation was conducted at a Portuguese industrial company which, over the years, has gradually adopted, in whole or in part, individual management system standards (MSSs). A questionnaire-based survey was performed with the objective of developing, in a real business environment, an adequate and efficient IMS-QES (quality, environment, and safety) model, and of laying the groundwork for a future generic IMS model able to integrate other MSSs. The strategy and research methods followed a case-study approach. A set of relevant conclusions was obtained from the statistical analysis of the survey responses. Globally, the investigation results justified and prioritized the conception of an IMS-QES development model and the consequent definition and validation of a structure for an IMS-QES model, to be implemented at the small- and medium-sized enterprise (SME) where the investigation was conducted.


INTRODUCTION: The aim of this study was to assess the epidemiological and operational characteristics of the Leprosy Program before and after its integration into the Primary Healthcare Services of the municipality of Aracaju, Sergipe, Brazil. METHODS: Data were drawn from the national database. The study periods were divided into preintegration (1996-2000) and postintegration (2001-2007). Annual epidemiological detection rates were calculated. Frequency data on clinico-epidemiological variables of cases detected and treated in the two periods were compared using the chi-squared (χ²) test, adopting a 5% level of significance. RESULTS: Detection rates overall, and in subjects younger than 15 years, were greater for the postintegration period and were higher than rates recorded for Brazil as a whole during the same periods. A total of 780 and 1,469 cases were registered during the preintegration and postintegration periods, respectively. Observations for the postintegration period were as follows: I) a higher proportion of cases with disability grade assessed at diagnosis, with an increase from 60.9% to 78.8% (p < 0.001), and at end of treatment, from 41.4% to 44.4% (p < 0.023); II) an increase in the proportion of cases detected by contact examination, from 2.1% to 4.1% (p < 0.001); and III) a lower level of treatment default, with a decrease from 5.64 to 3.35 (p < 0.008). Only 34% of cases registered from 2001 to 2007 were examined. CONCLUSIONS: The shift observed in detection rates overall, and in subjects younger than 15 years, during the postintegration period indicates an increased level of health care access. The fall in the number of patients abandoning treatment indicates greater adherence to treatment. However, previous shortcomings in key actions, pivotal to attaining the outcomes and impact envisaged for the program, persisted in the postintegration period.
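The period comparison of clinico-epidemiological proportions can be sketched with a Pearson chi-squared test on a 2×2 table. The counts below are reconstructed approximately from the reported totals (780 and 1,469 cases) and percentages (60.9% vs. 78.8% with disability grade assessed at diagnosis), so the statistic is illustrative rather than the study's exact value:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic for a 2x2 contingency table
    [[a, b], [c, d]] (no continuity correction, 1 degree of freedom)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Disability grade assessed at diagnosis, counts reconstructed from
# the reported proportions: ~60.9% of 780 vs. ~78.8% of 1,469 cases
pre_yes, pre_no = 475, 305
post_yes, post_no = 1158, 311
stat = chi2_2x2(pre_yes, pre_no, post_yes, post_no)
print(f"chi2 = {stat:.1f}")          # far above the 5% critical value
print("significant:", stat > 3.841)  # 3.841 is the 5% cutoff at 1 d.o.f.
```

A statistic this large corresponds to p < 0.001, consistent with the significance the study reports for this comparison.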


This paper aims at developing a collision prediction model for three-leg junctions located on national roads (NR) in Northern Portugal. The focus is to identify factors that contribute to collision-type crashes at those locations, mainly factors related to road geometric consistency, since the literature is scarce on those, and to research the impact of three modeling methods on the factors of those models: generalized estimating equations, random-effects negative binomial models, and random-parameters negative binomial models. The database used included data published between 2008 and 2010 for 177 three-leg junctions. It was split into three groups of contributing factors which were tested sequentially for each of the adopted models: first, traffic only; then, traffic and the geometric characteristics of the junctions within their area of influence; and, lastly, factors capturing the difference between the geometric characteristics of the segments bordering the junctions' area of influence and the segment included in that area. The choice of the best modeling technique was supported by the result of a cross-validation made to ascertain the best model for the three sets of researched contributing factors. The models fitted with random-parameters negative binomial models had the best performance in this process. In the best models obtained for every modeling technique, the characteristics of the road environment, including proxy measures for geometric consistency, along with traffic volume, contribute significantly to the number of collisions. Both the variables concerning the junctions and the various national highway segments in their area of influence, as well as variations from those characteristics in the roadway segments bordering that area of influence, have proven their relevance; there is therefore a rightful need to incorporate the effect of geometric consistency in safety studies of three-leg junctions.
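The negative binomial family shared by the three modeling methods exists to handle the overdispersion typical of crash counts. A minimal sketch of the NB2 parameterization (variance μ + αμ²) follows; the mean and dispersion values are assumed for illustration, and the random-effects and random-parameters extensions of the paper are not reproduced here:

```python
import math

def nb2_logpmf(y, mu, alpha):
    """Log-pmf of the NB2 negative binomial used in crash count modeling:
    mean mu, variance mu + alpha*mu^2 (alpha > 0 means overdispersion)."""
    r = 1.0 / alpha  # shape parameter of the underlying gamma mixing
    return (math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
            + r * math.log(r / (r + mu)) + y * math.log(mu / (r + mu)))

# A junction expecting 2 collisions over the study period, alpha = 0.5
mu, alpha = 2.0, 0.5
pmf = [math.exp(nb2_logpmf(y, mu, alpha)) for y in range(60)]
mean = sum(y * p for y, p in enumerate(pmf))
var = sum((y - mean) ** 2 * p for y, p in enumerate(pmf))
print(round(mean, 3), round(var, 3))  # ~2.0 and ~4.0 = mu + alpha*mu^2
```

A Poisson model would force the variance to equal the mean (here 2.0); the extra αμ² term is what lets these models absorb junction-to-junction heterogeneity.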


Under the framework of constraint-based modeling, genome-scale metabolic models (GSMMs) have been used for several tasks, such as metabolic engineering and phenotype prediction. More recently, their application in health-related research has spanned drug discovery, biomarker identification, and host-pathogen interactions, targeting diseases such as cancer, Alzheimer's disease, obesity, or diabetes. In recent years, the development of novel techniques for genome sequencing and other high-throughput methods, together with advances in bioinformatics, has allowed the reconstruction of GSMMs for human cells. Considering the diversity of cell types and tissues present in the human body, it is imperative to develop tissue-specific metabolic models. Methods to automatically generate these models, based on generic human metabolic models and a plethora of omics data, have been proposed. However, their results have not yet been adequately and critically evaluated and compared. This work presents a survey of the most important tissue- or cell-type-specific metabolic model reconstruction methods, which use literature, transcriptomics, proteomics, and metabolomics data, together with a global template model. As a case study, we analyzed the consistency between several omics data sources and reconstructed distinct metabolic models of hepatocytes using different methods and data sources as inputs. The results show that the omics data sources overlap poorly and, in some cases, are even contradictory. Additionally, the hepatocyte metabolic models generated are in many cases unable to perform metabolic functions known to be present in liver tissue. We conclude that reliable methods for a priori omics data integration are required to support the reconstruction of complex models of human cells.
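Overlap between omics-derived gene sets can be quantified with, for example, the Jaccard index; the abstract does not specify the metric used, and the gene sets below are hypothetical stand-ins for hepatocyte activity calls from different sources:

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of active-gene identifiers."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical hepatocyte gene calls from three evidence sources
transcriptomics = {"CYP3A4", "ALB", "APOA1", "G6PC", "PCK1"}
proteomics      = {"CYP3A4", "ALB", "F2", "SERPINA1"}
literature      = {"ALB", "APOA1", "F2", "CPS1"}

pairs = {"tx/prot": (transcriptomics, proteomics),
         "tx/lit":  (transcriptomics, literature),
         "prot/lit": (proteomics, literature)}
for name, (x, y) in pairs.items():
    print(name, round(jaccard(x, y), 2))  # all pairs agree on < 40%
```

Low pairwise Jaccard values of this kind are one concrete way the "poor overlap" between data sources reported in the case study would manifest.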


The objective of this work is to present a multitechnique approach to define the geometry, the kinematics, and the failure mechanism of a retrogressive large landslide (upper part of the La Valette landslide, South French Alps) by the combination of airborne and terrestrial laser scanning data and ground-based seismic tomography data. The advantage of combining different methods is to constrain the geometrical and failure mechanism models by integrating different sources of information. Because of an important point density at the ground surface (4.1 points m−2), a small laser footprint (0.09 m) and an accurate three-dimensional positioning (0.07 m), airborne laser scanning data are adapted as a source of information to analyze morphological structures at the surface. Seismic tomography surveys (P-wave and S-wave velocities) may highlight the presence of low-seismic-velocity zones that characterize the presence of dense fracture networks at the subsurface. The surface displacements measured from the terrestrial laser scanning data over a period of 2 years (May 2008–May 2010) allow one to quantify the landslide activity at the direct vicinity of the identified discontinuities. An important subsidence of the crown area with an average subsidence rate of 3.07 m year−1 is determined. The displacement directions indicate that the retrogression is controlled structurally by the preexisting discontinuities. A conceptual structural model is proposed to explain the failure mechanism and the retrogressive evolution of the main scarp. Uphill, the crown area is affected by planar sliding included in a deeper wedge failure system constrained by two preexisting fractures. Downhill, the landslide body acts as a buttress for the upper part. Consequently, the progression of the landslide body downhill allows the development of dip-slope failures, and coherent blocks start sliding along planar discontinuities.
The volume of the failed mass in the crown area is estimated at 500,000 m³ with the sloping local base level method.


Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high- resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. 
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
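The gradual-deformation proposal can be sketched in a stripped-down setting: for a Gaussian prior, combining the current model with an independent prior draw using sin/cos weights leaves the prior distribution invariant, and the angle sets the perturbation strength. This toy uses an i.i.d. N(0,1) prior and omits the Metropolis accept/reject step and the spatial covariance of the actual method:

```python
import math, random

random.seed(0)

def gradual_deformation(current, theta):
    """Gradual-deformation proposal: blend the current Gaussian field with
    an independent prior draw so that the N(0,1) prior is preserved
    (cos^2 + sin^2 = 1). theta in (0, pi/2) sets perturbation strength."""
    c, s = math.cos(theta), math.sin(theta)
    return [c * m + s * random.gauss(0.0, 1.0) for m in current]

# Start from a prior draw and chain many small-strength proposals
model = [random.gauss(0.0, 1.0) for _ in range(2000)]
for _ in range(50):
    model = gradual_deformation(model, theta=0.2)

var = sum(m * m for m in model) / len(model)
print(round(var, 2))  # stays near 1.0: every proposal still samples the prior
```

Because each proposal is prior-preserving, θ can be tuned freely to trade acceptance rate against step size, which is the flexibility in perturbation strength claimed for the approach; sequential resampling, by contrast, ties the perturbation size to the resampled block.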