966 results for quantitative method
Abstract:
Doctoral thesis in Biology presented to the Faculdade de Ciências da Universidade do Porto, 2015.
Abstract:
A clinical trial involving 80 patients of both sexes, aged 15 to 55 years, with chronic intestinal or hepatointestinal schistosomiasis mansoni was carried out to evaluate the therapeutic efficacy of different dose regimens of praziquantel. The patients were randomly allocated into four groups of equal size and treated with one of the following dosages: 60 mg/kg for 1 day; 60 mg/kg daily for 2 days; 60 mg/kg daily for 3 days; or 30 mg/kg daily for 6 days. The assessment of parasitological cure was based on the quantitative oogram technique, performed on rectal mucosa biopsies taken prior to treatment and 1, 2, 4 and 6 months post-treatment. Concurrently, stool examinations by the qualitative Hoffman, Pons & Janer (HPJ) method and the quantitative Kato-Katz (K-K) method were also performed. The best tolerability was observed with 30 mg/kg daily for 6 days, whereas the highest incidence of side effects (mainly dizziness and nausea) was found with 60 mg/kg daily for 3 days. No serious adverse drug reaction occurred. The cure rates achieved were: 25% with 60 mg/kg for 1 day; 60% with 60 mg/kg daily for 2 days; 89.5% with 60 mg/kg daily for 3 days; and 90% with 30 mg/kg daily for 6 days. In parallel, the median number of viable S. mansoni ova per gram of tissue fell by 64%, 73%, 87% and 84%, respectively, so a clear direct dose-effect relationship could be seen. The corresponding cure rates according to stool examinations were 39%, 80%, 100% and 95% by HPJ, and 89%, 100%, 100% and 100% by K-K. This discrepancy among the three parasitological methods is attributable to their unequal accuracy. In fact, when the number of viable eggs per gram of tissue fell below 5,000, the difference in the percentage of false-negative findings between HPJ (28%) and K-K (80%) became significant. When this number dropped below 2,000, the percentage of false-negative results obtained with HPJ (49%) also became significant relative to the oogram. In conclusion, praziquantel proved to be a highly efficacious agent against S. mansoni infections: administered at a total dose of 180 mg/kg divided over either 3 or 6 days, it yields a 90% cure rate, and a total dose of 240 mg/kg might possibly achieve 100%. Furthermore, it was confirmed that the quantitative oogram technique is the most reliable parasitological method for evaluating the efficacy of new drugs in schistosomiasis mansoni.
Abstract:
The pathogenesis of the renal lesion following snakebite envenomation has been related to myolysis, hemolysis, hypotension and/or direct nephrotoxicity of the venom. Both primary and continuous cell culture systems provide an in vitro alternative for quantitative evaluation of the toxicity of snake venoms. Crude Crotalus vegrandis venom was fractionated by molecular exclusion chromatography. The toxicity of C. vegrandis crude venom and of its hemorrhagic and neurotoxic fractions was evaluated on mouse primary renal cells and on a continuous line of Vero cells maintained in vitro. Cells isolated from murine renal cortex were grown in 96-well plates with Dulbecco's Modified Essential Medium (DMEM) and challenged with the crude venom and the venom fractions. The murine renal cortex cells exhibited epithelial morphology and the majority expressed smooth muscle actin, as determined by immunostaining. Cytotoxicity was evaluated by the tetrazolium colorimetric method. Cell viability was lowest for the crude venom, followed by the hemorrhagic and neurotoxic fractions, with CT50 values of 4.93, 18.41 and 50.22 µg/mL, respectively. The Vero cell cultures appeared to be more sensitive, with CT50 values of 2.9 and 1.4 µg/mL for the crude venom and the hemorrhagic peak, respectively. The results of this study show the potential of cell culture systems for evaluating venom toxicity.
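The CT50 values above summarize concentration-response viability curves obtained with the tetrazolium assay. As a rough illustration of how such a value can be estimated, the sketch below fits a four-parameter logistic model to hypothetical viability readings (the data points are illustrative, not the study's):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ct50, hill):
    """Four-parameter logistic concentration-response curve (viability vs. dose)."""
    return bottom + (top - bottom) / (1.0 + (conc / ct50) ** hill)

# hypothetical tetrazolium viability readings, % of untreated control
conc = np.array([0.5, 1, 2, 5, 10, 20, 50, 100])       # venom, µg/mL
viability = np.array([95, 88, 70, 49, 32, 18, 9, 5])    # % viable cells

p0 = [5, 100, 5, 1]   # initial guesses: bottom, top, CT50, Hill slope
bounds = ([0, 50, 0.01, 0.1], [20, 110, 200, 5])
popt, _ = curve_fit(four_pl, conc, viability, p0=p0, bounds=bounds)
print(f"Estimated CT50 ~ {popt[2]:.2f} µg/mL")
```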
Abstract:
Objective: To assess quantitative real-time polymerase chain reaction (q-PCR) on sputum smears for the diagnosis of pulmonary tuberculosis (PTB) in patients living with HIV/AIDS with a clinical suspicion of PTB. Method: This was a prospective study of the accuracy of a diagnostic test, conducted on 140 sputum specimens from 140 patients living with HIV/AIDS with a clinical suspicion of PTB, attended at two referral hospitals for people living with HIV/AIDS in the city of Recife, Pernambuco, Brazil. Löwenstein-Jensen medium culture and 7H9 broth were used as the gold standard. Results: Of the 140 sputum samples, 47 (33.6%) were positive by the gold standard. q-PCR was positive in 42 (30%) of the 140 patients; only one (0.71%) of these did not correspond to a positive culture. The sensitivity, specificity and accuracy of q-PCR were 87.2%, 98.9% and 95%, respectively. In 39 (93%) of the 42 q-PCR-positive cases, the threshold cycle (CT) was 37 or lower. Conclusion: q-PCR performed on sputum smears from patients living with HIV/AIDS demonstrated satisfactory sensitivity, specificity and accuracy, and may therefore be recommended as a method for diagnosing PTB.
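For reference, the reported figures follow directly from the 2x2 table implied by the abstract (41 true positives, 1 false positive, 6 false negatives, 92 true negatives); a short sketch of the arithmetic:

```python
# 2x2 contingency table implied by the abstract: culture (gold standard) vs q-PCR
tp, fp = 41, 1           # q-PCR positive: 42 total, of which 1 discordant with culture
fn, tn = 6, 92           # remaining culture-positive and culture-negative samples

sensitivity = tp / (tp + fn)                 # 41/47  -> 87.2%
specificity = tn / (tn + fp)                 # 92/93  -> 98.9%
accuracy = (tp + tn) / (tp + fp + fn + tn)   # 133/140 -> 95.0%

print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}, accuracy={accuracy:.1%}")
```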
Abstract:
INTRODUCTION: Spontaneous sedimentation is an important procedure for stool examination. A modification of this technique using conical tubes was performed and evaluated. METHODS: Fifty fecal samples were processed in sedimentation glasses and in polypropylene conical tubes. Another 50 samples were used for quantitative evaluation of protozoan cysts. RESULTS: Although no significant differences occurred in the frequency of protozoa and helminths detected, significant differences in protozoan cyst counts did occur. CONCLUSIONS: The use of the tube provides a shorter sedimentation path for the sample, increases the concentration of parasites for microscopic analysis, minimizes the risk of contamination, reduces odor, and optimizes the workspace.
Abstract:
Molecular monitoring of BCR/ABL transcripts by real-time quantitative reverse transcription PCR (qRT-PCR) is an essential technique for the clinical management of patients with BCR/ABL-positive CML and ALL. Although quantitative BCR/ABL assays are performed in hundreds of laboratories worldwide, results among these laboratories cannot be reliably compared due to heterogeneity in test methods, data analysis, reporting, and the lack of quantitative standards. Recent efforts towards standardization have been limited in scope. Aliquots of RNA were sent to clinical test centers worldwide in order to evaluate methods and reporting for e1a2, b2a2, and b3a2 transcript levels using their own qRT-PCR assays. Total RNA was isolated from tissue culture cells that expressed each of the different BCR/ABL transcripts. Serial log dilutions, ranging from 10^0 to 10^-5, were prepared in RNA isolated from HL60 cells. Laboratories performed 5 independent qRT-PCR reactions for each sample type at each dilution. In addition, 15 qRT-PCR reactions of the 10^-3 b3a2 RNA dilution were run to assess reproducibility within and between laboratories. Participants were asked to run the samples following their standard protocols and to report cycle threshold (Ct), quantitative values for BCR/ABL and housekeeping genes, and ratios of BCR/ABL to housekeeping genes for each sample RNA. Thirty-seven (n=37) participants submitted qRT-PCR results for analysis (36, 37, and 34 labs generated data for b2a2, b3a2, and e1a2, respectively). The limit of detection for this study was defined as the lowest dilution at which a Ct value could be detected for all 5 replicates. For b2a2, 15, 16, 4, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. For b3a2, 20, 13, and 4 labs showed a limit of detection at the 10^-5, 10^-4, and 10^-3 dilutions, respectively. For e1a2, 10, 21, 2, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. Log %BCR/ABL ratio values provided a method for comparing results between the different laboratories for each BCR/ABL dilution series. Linear regression analysis revealed concordance among the majority of participant data over the 10^-1 to 10^-4 dilutions. The overall slope values showed comparable results among the majority of b2a2 (mean=0.939; median=0.9627; range 0.399-1.1872), b3a2 (mean=0.925; median=0.922; range 0.625-1.140), and e1a2 (mean=0.897; median=0.909; range 0.5174-1.138) laboratory results (Fig. 1-3). Thirty-four (n=34) of the 37 laboratories reported Ct values for all 15 replicates, and only those with a complete data set were included in the inter-laboratory calculations. Eleven laboratories either did not report their copy number data or used other reporting units such as nanograms or cell numbers; therefore, only 26 laboratories were included in the overall analysis of copy numbers. The median copy number was 348.4, with a range from 15.6 to 547,000 copies (approximately a 4.5 log difference); the median intra-laboratory %CV was 19.2%, with a range from 4.2% to 82.6%. While our international performance evaluation using serially diluted RNA samples reinforced the fact that heterogeneity exists among clinical laboratories, it also demonstrated that performance within a laboratory is, overall, very consistent. Accordingly, the availability of defined BCR/ABL RNAs may facilitate the validation of all phases of quantitative BCR/ABL analysis and may be extremely useful as a tool for monitoring assay performance.
Ongoing analyses of these materials, along with the development of additional control materials, may solidify consensus around their application in routine laboratory testing and their possible integration into worldwide efforts to standardize quantitative BCR/ABL testing.
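As a minimal illustration of the per-laboratory summaries described above (slope of the log %BCR/ABL ratio against log dilution, and intra-laboratory %CV of replicates), the sketch below uses made-up replicate ratios rather than participant data:

```python
import numpy as np

# hypothetical BCR/ABL : housekeeping-gene ratios for one laboratory,
# five replicates at each dilution of the b3a2 cell-line RNA (10^-1 to 10^-4)
log_dilution = np.array([-1, -2, -3, -4], dtype=float)
ratios = np.array([
    [9.8e-2, 1.1e-1, 1.0e-1, 9.5e-2, 1.05e-1],
    [1.0e-2, 9.0e-3, 1.2e-2, 1.1e-2, 9.5e-3],
    [1.1e-3, 9.0e-4, 1.0e-3, 1.3e-3, 8.5e-4],
    [1.2e-4, 8.0e-5, 1.1e-4, 9.0e-5, 1.0e-4],
])

# slope of mean log10(%BCR/ABL) versus log10 dilution; ideal assay linearity gives slope ~ 1
mean_log_ratio = np.log10(ratios.mean(axis=1) * 100)   # expressed as %BCR/ABL
slope, intercept = np.polyfit(log_dilution, mean_log_ratio, 1)

# intra-laboratory %CV of the replicate ratios at the 10^-3 dilution
cv_percent = ratios[2].std(ddof=1) / ratios[2].mean() * 100

print(f"slope = {slope:.3f}, intra-lab %CV at 10^-3 = {cv_percent:.1f}%")
```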
Abstract:
This article presents a brief systemic intervention (ISB) method consisting of six sessions, developed in an outpatient service for couples and families, together with two research projects conducted in collaboration with the Institute for Psychotherapy of the University of Lausanne. The first project is quantitative and aims at evaluating the effectiveness of ISB. One of its main features is that outcomes are assessed at different levels of individual and family functioning: 1) symptoms and individual functioning; 2) quality of the marital relationship; 3) parental and co-parental relationships; 4) familial relationships. The second project is a qualitative case study of a marital therapy, identifying and analysing significant moments of the therapeutic process from the patients' perspective. The methodology was largely inspired by Daniel Stern's work on "moments of meeting" in psychotherapy. Results show that patients' theories about relationships and change are important elements that deepen our understanding of the change process in couple and family therapy. The interest of associating clinicians and researchers in the development and validation of a new clinical model is discussed.
Abstract:
Q-sort is a research method that allows profiles of attitudes to be defined with respect to a set of statements ordered in relation to one another. As part of Q Methodology, the qualitative analysis of the Q-sorts rests on quantitative techniques. This method is of particular interest for research in the health professions, a field in which the attitudes of patients and professionals are very important. The method is presented in this article, along with an example of its application in nursing in old-age psychiatry.
Abstract:
Matrix effects, which represent an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. For quantitative analysis, the use of a stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to those of the analyte is recommended. In this paper, an example of the choice of a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effects in the quantification of chiral (R,S)-methadone in plasma is reported. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer, with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). From the results obtained during validation, a control chart process covering 52 series of routine analyses was established using both the intermediate-precision standard deviation and the FDA acceptance criteria. The results of routine quality control samples generally fell within ±15% of the target value, and mostly within the two-standard-deviation interval, illustrating the long-term stability of the method. The intermediate-precision variability estimated during method validation was found to be consistent with the routine use of the method. During this period, 257 trough-concentration and 54 peak-concentration plasma samples from patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
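A small sketch of the control-chart logic described above, flagging a routine QC result that falls outside ±15% of its nominal value (the FDA acceptance criterion) or outside two intermediate-precision standard deviations; the nominal concentration and SD below are illustrative, not the study's values:

```python
def check_qc(measured, nominal, sd_intermediate):
    """Flag a QC result against ±15% of nominal and a ±2 SD control interval."""
    within_fda = abs(measured - nominal) <= 0.15 * nominal
    within_2sd = abs(measured - nominal) <= 2.0 * sd_intermediate
    return within_fda, within_2sd

# illustrative QC level: nominal 200 ng/mL, intermediate-precision SD of ~5%
nominal, sd = 200.0, 10.0
for value in [195.0, 214.0, 228.0, 176.0]:
    fda_ok, sd_ok = check_qc(value, nominal, sd)
    print(f"{value:6.1f} ng/mL  FDA ±15%: {'pass' if fda_ok else 'fail'}  ±2 SD: {'in' if sd_ok else 'out'}")
```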
Abstract:
Aging is ubiquitous to the human condition. The MRI correlates of healthy aging have been extensively investigated using a range of modalities, including volumetric MRI, quantitative MRI (qMRI), and diffusion tensor imaging. Despite this, reported brainstem-related changes remain sparse, in part because of the technical and methodological limitations in quantitatively assessing and statistically analyzing this region. By utilizing a new method of brainstem segmentation, a large cohort of 100 healthy adults was assessed in this study for the effects of aging within the human brainstem in vivo. Using qMRI, tensor-based morphometry (TBM), and voxel-based quantification (VBQ), the volumetric and quantitative changes across healthy adults between 19 and 75 years of age were characterized. In addition to increased R2* in the substantia nigra, consistent with increasing iron deposition with age, several novel findings are reported in the current study. These include selective volumetric loss of the brachium conjunctivum, with a corresponding decrease in magnetization transfer and increase in proton density (PD), accounting for the previously described "midbrain shrinkage." Additionally, we found increases in R1 and PD in several pontine and medullary structures. We consider these changes in the context of well-characterized functional age-related changes and propose potential biophysical mechanisms. This study provides a detailed quantitative analysis of the internal architecture of the brainstem and provides a baseline for further studies of neurodegenerative diseases that are characterized by early, pre-clinical involvement of the brainstem, such as Parkinson's and Alzheimer's diseases.
Abstract:
The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The methodologies compared were: two iterative single-orientation methods minimizing, respectively, the l2 and l1TV norms of the prior knowledge of the edges of the object; one over-determined multiple-orientation method (COSMOS); and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in vivo high-resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically varied in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF = 0.95, r^2_MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of extremely fast computation. The l-curve method gave the most visually satisfactory balance between the reduction of streaking artifacts and over-regularization, with the latter being overemphasized when the COSMOS susceptibility maps were used as ground truth. R2* and susceptibility maps calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
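The l-curve method mentioned above chooses the regularization weight at the corner of the trade-off curve between data misfit and regularization norm. The sketch below illustrates the idea on a generic Tikhonov-regularized toy problem (not a QSM reconstruction):

```python
import numpy as np

# Toy ill-posed linear problem y = A x + noise (a smoothing kernel, not QSM)
rng = np.random.default_rng(0)
n = 60
i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
A = np.exp(-((i - j) / 6.0) ** 2)
x_true = np.sin(np.linspace(0, 3 * np.pi, n))
y = A @ x_true + 0.01 * rng.standard_normal(n)

# Sweep the regularization weight and record the two competing norms
lambdas = np.logspace(-6, 2, 60)
res_norm, sol_norm = [], []
for lam in lambdas:
    # Tikhonov (l2-regularized) solution: argmin ||Ax - y||^2 + lam^2 ||x||^2
    x_lam = np.linalg.solve(A.T @ A + lam ** 2 * np.eye(n), A.T @ y)
    res_norm.append(np.linalg.norm(A @ x_lam - y))
    sol_norm.append(np.linalg.norm(x_lam))

# L-curve: the log-log curve of solution norm vs. residual norm; the "corner"
# (point of largest curvature magnitude) balances misfit against over-regularization.
lr, ls = np.log(res_norm), np.log(sol_norm)
d1r, d1s = np.gradient(lr), np.gradient(ls)
d2r, d2s = np.gradient(d1r), np.gradient(d1s)
curvature = (d1r * d2s - d2r * d1s) / (d1r ** 2 + d1s ** 2) ** 1.5
best = int(np.argmax(np.abs(curvature[2:-2]))) + 2   # skip unreliable edge estimates
print(f"L-curve corner at lambda ~ {lambdas[best]:.3g}")
```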
Abstract:
Purpose: To evaluate the sensitivity of the perfusion parameters derived from intravoxel incoherent motion (IVIM) MR imaging to hypercapnia-induced vasodilatation and hyperoxygenation-induced vasoconstriction in the human brain. Materials and Methods: This study was approved by the local ethics committee, and informed consent was obtained from all participants. Images were acquired with a standard pulsed-gradient spin-echo sequence (Stejskal-Tanner) in a clinical 3-T system using 16 b values ranging from 0 to 900 sec/mm². Seven healthy volunteers were examined while they inhaled four different gas mixtures known to modify brain perfusion (pure oxygen, ambient air, 5% CO₂ in ambient air, and 8% CO₂ in ambient air). Diffusion coefficient (D), pseudodiffusion coefficient (D*), perfusion fraction (f), and blood-flow-related parameter (fD*) maps were calculated on the basis of the IVIM biexponential model, and the parametric maps were compared among the four gas mixtures. Paired, one-tailed Student t tests were performed to assess for statistically significant differences. Results: Signal decay curves were biexponential in the brain parenchyma of all volunteers. Compared with inhaled ambient air, the IVIM perfusion parameters D*, f, and fD* increased as the concentration of inhaled CO₂ was increased (for the entire brain, P = .01 for f, D*, and fD* with 5% CO₂; P = .02 for f and P = .01 for D* and fD* with 8% CO₂), and a trend toward a reduction was observed when participants inhaled pure oxygen (although P > .05). D remained globally stable. Conclusion: The IVIM perfusion parameters were reactive to hyperoxygenation-induced vasoconstriction and hypercapnia-induced vasodilatation. Accordingly, IVIM imaging was found to be a valid and promising method to quantify brain perfusion in humans. © RSNA, 2012.
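The IVIM biexponential model referred to above is commonly written as S(b) = S0[f exp(-b D*) + (1 - f) exp(-b D)]. A small fitting sketch on synthetic data follows (the parameter values and b-value spacing are illustrative, not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, s0, f, d_star, d):
    """IVIM biexponential signal model: perfusion (f, D*) plus tissue diffusion (D)."""
    return s0 * (f * np.exp(-b * d_star) + (1 - f) * np.exp(-b * d))

# 16 illustrative b values (sec/mm^2) spanning 0-900, as in the protocol
b = np.array([0, 10, 20, 40, 80, 110, 140, 170, 200, 300,
              400, 500, 600, 700, 800, 900], dtype=float)

# synthetic signal with typical brain-like parameter values, plus noise
rng = np.random.default_rng(1)
s = ivim(b, s0=1.0, f=0.08, d_star=10e-3, d=0.8e-3) + 0.005 * rng.standard_normal(b.size)

p0 = [1.0, 0.1, 10e-3, 1e-3]
bounds = ([0, 0, 1e-3, 1e-4], [2, 0.5, 1e-1, 3e-3])
popt, _ = curve_fit(ivim, b, s, p0=p0, bounds=bounds)
s0_hat, f_hat, d_star_hat, d_hat = popt
print(f"f={f_hat:.3f}, D*={d_star_hat:.2e}, D={d_hat:.2e}, fD*={f_hat * d_star_hat:.2e}")
```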
Abstract:
The epithelial amiloride-sensitive sodium channel (ENaC) controls transepithelial Na+ movement in Na+-transporting epithelia and is associated with Liddle syndrome, an autosomal dominant form of salt-sensitive hypertension. Detailed analysis of ENaC channel properties and of the functional consequences of mutations causing Liddle syndrome has so far been limited by the lack of a method allowing specific and quantitative detection of cell-surface-expressed ENaC. We have developed a quantitative assay based on the binding of an 125I-labeled M2 anti-FLAG monoclonal antibody (M2Ab*) directed against a FLAG reporter epitope introduced into the extracellular loop of each of the alpha, beta, and gamma ENaC subunits. Insertion of the FLAG epitope into the ENaC sequences did not change the channel's functional and pharmacological properties. The binding specificity and affinity (Kd = 3 nM) allowed us to correlate, in individual Xenopus oocytes, the macroscopic amiloride-sensitive sodium current (INa) with the number of ENaC wild-type and mutant subunits expressed at the cell surface. These experiments demonstrate that: (i) only heteromultimeric channels made of alpha, beta, and gamma ENaC subunits are maximally and efficiently expressed at the cell surface; (ii) the overall ENaC open probability is one order of magnitude lower than previously observed in single-channel recordings; (iii) the mutation causing Liddle syndrome (beta R564stop) enhances channel activity by two mechanisms, i.e., by increasing ENaC cell-surface expression and by changing the channel open probability. This quantitative approach provides new insights into the molecular mechanisms underlying one form of salt-sensitive hypertension.
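The antibody-binding assay rests on a one-site saturation isotherm, B = Bmax[L]/(Kd + [L]). A brief sketch of estimating Kd from binding data (the abstract reports Kd = 3 nM; the numbers below are hypothetical, for illustration only):

```python
import numpy as np
from scipy.optimize import curve_fit

def one_site(ligand_nM, bmax, kd_nM):
    """Specific binding for a single-site saturation isotherm."""
    return bmax * ligand_nM / (kd_nM + ligand_nM)

# hypothetical specific antibody binding (arbitrary units per oocyte)
ligand = np.array([0.3, 1, 3, 10, 30, 100])            # free antibody, nM
bound = np.array([950, 2600, 5100, 7800, 9300, 9900])   # specific binding, a.u.

popt, _ = curve_fit(one_site, ligand, bound, p0=[10000, 3])
print(f"Bmax ~ {popt[0]:.0f} a.u., Kd ~ {popt[1]:.1f} nM")
```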
Abstract:
Research project carried out during a stay at the University of Calgary, Canada, between December 2007 and February 2008. The project consisted of analysing data from a study in the field of the psychology of music, specifically on how music influences attention through the person's emotional and energetic states. Video was used during the research sessions, providing visual and auditory data to complement the quantitative data obtained from the attention tests administered. The analysis was carried out using qualitative methods and techniques learned during the stay. The work also deepened the understanding of the qualitative paradigm as a valid paradigm that genuinely complements the quantitative one. It focused especially on conversation analysis from an interpretative point of view, as well as on the analysis of body and facial language based on video observation, formulating descriptors and sub-descriptors of the behaviour related to the hypothesis. Some descriptors had been formulated prior to the analysis, on the basis of other studies and the researcher's background; others were discovered during the analysis. The behavioural descriptors and sub-descriptors relate to the emotional and energetic states of the different participants. The analysis was conducted as a case study, with an exhaustive person-by-person analysis aimed at finding intrapersonal and interpersonal reaction patterns. The observed patterns will be contrasted with the quantitative information, triangulating the data to identify possible points of support or contradiction between them. Preliminary results indicate a relationship between the type of music and behaviour: music of negative emotionality is associated with the person closing up, whereas when the music is energetic the participants become activated (as observed behaviourally) and smile if it is positive.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. 
The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
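A minimal sketch of a gradual-deformation-type MCMC proposal of the kind described above, in which the current Gaussian model realization is blended with an independent one so that a single angle controls the perturbation strength (a generic illustration under a standard-normal prior, not the author's implementation):

```python
import numpy as np

rng = np.random.default_rng(42)

def gradual_deformation_proposal(m_current, theta, rng):
    """Blend the current standard-normal field with an independent realization.

    For independent standard-normal fields, cos(theta)*m_current + sin(theta)*m_new
    is again standard normal, so the prior is preserved while theta controls the
    perturbation strength (theta=0: no change; theta=pi/2: fully independent redraw).
    With a spatially correlated prior, m_new would be drawn from the same
    geostatistical model instead.
    """
    m_new = rng.standard_normal(m_current.shape)
    return np.cos(theta) * m_current + np.sin(theta) * m_new

# illustrative use inside an MCMC loop (standard-normal parameter field)
m = rng.standard_normal(500)          # current model parameters
theta = 0.15                          # small angle -> gentle, correlated proposal
for _ in range(5):
    m_proposal = gradual_deformation_proposal(m, theta, rng)
    # ... evaluate the likelihood of m_proposal and accept/reject (Metropolis-Hastings) ...
    m = m_proposal                    # placeholder: accept unconditionally here
```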