82 results for Target Held Method


Relevance:

20.00%

Publisher:

Abstract:

STAT transcription factors are expressed in many cell types and bind to similar sequences. However, different STAT gene knock-outs show very distinct phenotypes. To determine whether differences between the binding specificities of STAT proteins account for these effects, we compared the sequences bound by STAT1, STAT5A, STAT5B, and STAT6. One sequence set was selected from random oligonucleotides by recombinant STAT1, STAT5A, or STAT6. For another set including many weak binding sites, we quantified the relative affinities to STAT1, STAT5A, STAT5B, and STAT6. We compared the results to the binding sites in natural STAT target genes identified by others. The experiments confirmed the similar specificity of different STAT proteins. Detailed analysis indicated that STAT5A specificity is more similar to that of STAT6 than that of STAT1, as expected from the evolutionary relationships. The preference of STAT6 for sites in which the half-palindromes (TTC) are separated by four nucleotides (N(4)) was confirmed, but analysis of weak binding sites showed that STAT6 binds fairly well to N(3) sites. As previously reported, STAT1 and STAT5 prefer N(3) sites; however, STAT5A, but not STAT1, weakly binds N(4) sites. None of the STATs bound to half-palindromes. There were no specificity differences between STAT5A and STAT5B.

One of the main problems in combating tuberculosis is the poor penetration of drugs into mycobacterial cells. A prodrug approach, via activation inside mycobacterial cells, is a possible strategy to overcome this hurdle and achieve efficient drug uptake. Esters are attractive candidates for such a strategy, and we and others have previously reported the activity of esters of weak organic acids against mycobacteria. However, very little is known about ester hydrolysis by mycobacteria, and no biological model is available to study the activation of prodrugs by these microorganisms. To begin filling this gap, we have embarked on a project to develop an in vitro method to study prodrug activation by mycobacteria using Mycobacterium smegmatis homogenates. The model ester substrates were ethyl nicotinate and ethyl benzoate, whose hydrolysis was monitored and characterized kinetically. Our studies showed that in M. smegmatis most esterase activity is associated with the soluble fraction (cytosol) and is preserved by storage at 5°C or at room temperature for one hour, or by storage at -80°C for up to one year. In the range of homogenate concentrations studied (5-80% in buffer), k(obs) varied linearly with homogenate concentration for both substrates. We also found that the homogenates showed Michaelis-Menten kinetic behavior with both prodrugs. Since ethyl benzoate is a good substrate for the mycobacterial esterases, this compound can be used to standardize the esterase activity of homogenates, allowing the results of incubations of prodrugs with homogenates from different batches to be readily compared.
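
The Michaelis-Menten behaviour described above can be sketched numerically. The following is a minimal illustration, with invented rate constants and concentrations (only the substrate name ethyl benzoate is from the study), of how Vmax and Km might be estimated from initial-rate data via a Lineweaver-Burk double-reciprocal fit; the paper does not specify its actual fitting procedure, so this is one classical option, not the authors' method:

```python
# Hypothetical sketch: Michaelis-Menten kinetics of ester hydrolysis by a
# mycobacterial homogenate. All numeric values are illustrative, not the
# paper's measured parameters.

def mm_rate(s, vmax, km):
    """Michaelis-Menten initial rate v = Vmax*[S] / (Km + [S])."""
    return vmax * s / (km + s)

def fit_lineweaver_burk(substrate, rates):
    """Estimate Vmax and Km from the Lineweaver-Burk linearization:
    1/v = (Km/Vmax)*(1/[S]) + 1/Vmax, via ordinary least squares."""
    xs = [1.0 / s for s in substrate]
    ys = [1.0 / v for v in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vmax = 1.0 / intercept
    km = slope * vmax
    return vmax, km

# Synthetic, noise-free data for ethyl benzoate hydrolysis (invented units)
true_vmax, true_km = 12.0, 0.8          # e.g. nmol/min and mM (assumed)
conc = [0.1, 0.2, 0.5, 1.0, 2.0, 5.0]   # substrate concentrations, mM
v = [mm_rate(s, true_vmax, true_km) for s in conc]
vmax_est, km_est = fit_lineweaver_burk(conc, v)
```

With noise-free data the linearization recovers the true parameters exactly; with real measurements a nonlinear fit would usually be preferred because the reciprocal transform inflates errors at low substrate concentrations.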

Ewing's sarcoma family tumors (ESFT) are the second most common bone malignancy in children and young adults, characterized by unique chromosomal translocations that in 85% of cases lead to expression of the EWS-FLI-1 fusion protein. EWS-FLI-1 functions as an aberrant transcription factor that can both induce and suppress members of its target gene repertoire. We have recently demonstrated that EWS-FLI-1 can alter microRNA (miRNA) expression and that miRNA145 is a direct EWS-FLI-1 target whose suppression is implicated in ESFT development. Here, we use miRNA arrays to compare the global miRNA expression profile of human mesenchymal stem cells (MSC) and ESFT cell lines, and show that ESFT display a distinct miRNA signature that includes induction of the oncogenic miRNA 17-92 cluster and repression of the tumor suppressor let-7 family. We demonstrate that direct repression of let-7a by EWS-FLI-1 participates in the tumorigenic potential of ESFT cells in vivo. The mechanism whereby let-7a expression regulates ESFT growth is shown to be mediated by its target gene HMGA2, as let-7a overexpression and HMGA2 repression both block ESFT cell tumorigenicity. Consistent with these observations, systemic delivery of synthetic let-7a into ESFT-bearing mice restored its expression in tumor cells, decreased HMGA2 expression levels and resulted in ESFT growth inhibition in vivo. Our observations provide evidence that deregulation of let-7a target gene expression participates in ESFT development and identify let-7a as a promising new therapeutic target for one of the most aggressive pediatric malignancies.

PURPOSE: Postmortem computed tomography angiography (PMCTA) was introduced into forensic investigations a few years ago. It provides reliable images that can be consulted at any time. Conventional autopsy remains the reference standard for defining the cause of death, but provides only limited possibility of a second examination. This study compares these two procedures and discusses findings that can be detected exclusively using each method. MATERIALS AND METHODS: This retrospective study compared radiological reports from PMCTA to reports from conventional autopsy for 50 forensic autopsy cases. Reported findings from autopsy and PMCTA were extracted and compared to each other. PMCTA was performed using a modified heart-lung machine and the oily contrast agent Angiofil® (Fumedica AG, Muri, Switzerland). RESULTS: PMCTA and conventional autopsy would have drawn similar conclusions regarding causes of death. Nearly 60 % of all findings were visualized with both techniques. PMCTA demonstrates a higher sensitivity for identifying skeletal and vascular lesions. However, vascular occlusions due to postmortem blood clots could be falsely assumed to be vascular lesions. In contrast, conventional autopsy does not detect all bone fractures or the exact source of bleeding. Conventional autopsy provides important information about organ morphology and remains the only way to diagnose a vital vascular occlusion with certitude. CONCLUSION: Overall, PMCTA and conventional autopsy provide comparable findings. However, each technique presents advantages and disadvantages for detecting specific findings. To correctly interpret findings and clearly define the indications for PMCTA, these differences must be understood.

Purpose: New treatments for long-lasting uveitis need to be tested. Our aim was to develop a six-week model of uveitis in rabbits. Methods: Rabbits were presensitized with an s.c. injection of Mycobacterium tuberculosis H37RA emulsified with TiterMax® Gold adjuvant. Uveitis was induced at days 28 and 50 by intravitreal challenges with antigen suspension. Ocular inflammation was assessed until euthanasia at day 71 after the s.c. injection of M. tuberculosis H37RA by: (a) the number of inflammatory cells in aqueous humor (AH); (b) the protein concentration in AH; (c) the clinical score (mean of conjunctival hyperaemia, conjunctival chemosis, oedema and secretion); (d) the microscopic score (mean presence of fibrin and synechiae, aqueous cell density and aqueous flare grade, as scored by slit lamp). Results: At the presensitization injection sites, rabbits presented flat nodules, which progressively vanished. The first challenge induced a significant increase in all four parameters (p < 0.05, Wilcoxon/Kruskal-Wallis test). The AH contained 764 ± 82 cells/µl and 32 ± 0.77 mg protein/ml. During the following days, the inflammatory parameters decreased slightly. The second intravitreal challenge increased inflammation (3564 ± 228 cells/µl AH and 31 ± 1 mg protein/ml), which remained at a high level for a longer period of time. Conclusion: We developed a model of long-term uveitis that could be maintained in rabbits for at least six weeks. Such a model could be used to test the efficacy of new drugs or of various drug delivery systems intended to deliver active agents over a few months.

Diagnostic information on children is typically elicited from both children and their parents. The aims of the present paper were to: (1) compare prevalence estimates according to maternal reports, paternal reports and direct interviews of children [major depressive disorder (MDD), anxiety and attention-deficit and disruptive behavioural disorders]; (2) assess mother-child, father-child and inter-parental agreement for these disorders; (3) determine the association between several child, parent and familial characteristics and the degree of diagnostic agreement or the likelihood of parental reporting; (4) determine the predictive validity of diagnostic information provided by parents and children. Analyses were based on 235 mother-offspring, 189 father-offspring and 128 mother-father pairs. Diagnostic assessment included the Kiddie-schedule for Affective Disorders and Schizophrenia (K-SADS) (offspring) and the Diagnostic Interview for Genetic Studies (DIGS) (parents and offspring at follow-up) interviews. Parental reports were collected using the Family History - Research Diagnostic Criteria (FH-RDC). Analyses revealed: (1) prevalence estimates for internalizing disorders were generally lower according to parental information than according to the K-SADS; (2) mother-child and father-child agreement was poor and within similar ranges; (3) parents with a history of MDD or attention deficit hyperactivity disorder (ADHD) reported these disorders in their children more frequently; (4) in a sub-sample followed-up into adulthood, diagnoses of MDD, separation anxiety and conduct disorder at baseline concurred with the corresponding lifetime diagnosis at age 19 according to the child rather than according to the parents. 
In conclusion, our findings reveal large discrepancies in the diagnostic information provided by parents and children, with generally lower reporting of internalizing disorders by parents, and differential reporting of depression and ADHD by parental disease status. The follow-up data also support the validity of information provided by adolescent offspring.
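
The kind of parent-child agreement discussed above is typically quantified with a chance-corrected statistic such as Cohen's kappa. The sketch below illustrates the computation on invented binary diagnoses; the counts are hypothetical, not the study's data, and the study itself does not state which agreement coefficient it used:

```python
# Illustrative Cohen's kappa for two informants (e.g. parent vs. child)
# giving binary diagnoses (1 = disorder present). Data are invented.

def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary raters."""
    n = len(a)
    po = sum(1 for x, y in zip(a, b) if x == y) / n   # observed agreement
    pa1 = sum(a) / n                                  # rater A "present" rate
    pb1 = sum(b) / n                                  # rater B "present" rate
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)            # agreement by chance
    return (po - pe) / (1 - pe)

# Hypothetical parent vs. child reports of MDD for ten offspring
parent = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
child  = [1, 1, 0, 0, 0, 1, 0, 1, 0, 1]
kappa = cohens_kappa(parent, child)   # low kappa despite 60% raw agreement
```

This also shows why raw percentage agreement can be misleading: here the informants agree on 6 of 10 children, yet kappa is only 0.2 once chance agreement is removed.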

Objectives: To investigate the associations between falls before hospital admission, falls during hospitalization, and length of stay in elderly people admitted to post-acute geriatric rehabilitation. Method: History of falling in the 12 months before admission was recorded among 249 older persons (mean age 82.3 ± 7.4 years, 69.1% women) consecutively admitted to post-acute rehabilitation. Data on medical, functional and cognitive status were collected upon admission. Falls during hospitalization and length of stay were recorded at discharge. Results: Overall, 92 (40.4%) patients reported no fall in the 12 months before admission; 63 (27.6%) reported 1 fall, and 73 (32.0%) reported multiple falls. Previous fall occurrence (one or more falls) was significantly associated with in-stay falls (19.9% of previous fallers fell during the stay vs 7.6% of patients without a history of falling, P = .01), and with a longer length of stay (22.4 ± 10.1 days vs 27.1 ± 14.3 days, P = .01). In multivariate robust regression controlling for gender, age, and functional and cognitive status, history of falling remained significantly associated with a longer rehabilitation stay (2.8 days more than non-fallers for single fallers, p = .05, and 3.3 days more for multiple fallers, p = .01). Conclusion: A history of falling in the 12 months prior to post-acute geriatric rehabilitation is independently associated with a longer rehabilitation length of stay. Previous fallers also have an increased risk of falling during the rehabilitation stay. This suggests that hospital fall prevention measures should particularly target these high-risk patients.

Among the various determinants of treatment response, the achievement of sufficient blood levels is essential for curing malaria. To help improve our current understanding of the pharmacokinetics, efficacy and toxicity of antimalarial drugs, we have developed a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method requiring 200 µl of plasma for the simultaneous determination of 14 antimalarial drugs and their metabolites, the components of the current first-line combination treatments for malaria (artemether, artesunate, dihydroartemisinin, amodiaquine, N-desethyl-amodiaquine, lumefantrine, desbutyl-lumefantrine, piperaquine, pyronaridine, mefloquine, chloroquine, quinine, pyrimethamine and sulfadoxine). Plasma is purified by a combination of protein precipitation, evaporation and reconstitution in methanol/20 mM ammonium formate (pH 4.0) 1:1. Reverse-phase chromatographic separation of the antimalarial drugs is obtained using a gradient elution of 20 mM ammonium formate and acetonitrile, both containing 0.5% formic acid, followed by rinsing and re-equilibration to the initial solvent composition up to 21 min. Analyte quantification, using matrix-matched calibration samples, is performed by electrospray ionization-triple quadrupole mass spectrometry with selected reaction monitoring detection in positive mode. The method was validated according to FDA recommendations, including assessment of extraction yield, matrix effect variability, overall process efficiency, standard addition experiments, and the short- and long-term stability of antimalarials in plasma. The reactivity of endoperoxide-containing antimalarials in the presence of hemolysis was tested both in vitro and in samples from malaria patients. With this method, the signal intensity of artemisinin decreased by about 20% in the presence of 0.2% hemolysed red blood cells in plasma, whereas its derivatives were essentially unaffected. 
The method is precise (inter-day CV: 3.1-12.6%) and sensitive (lower limits of quantification 0.15-3.0 ng/ml for basic/neutral antimalarials and 0.75-5 ng/ml for artemisinin derivatives). This is the first broad-range LC-MS/MS assay covering the antimalarials currently in use. It improves on previous methods in convenience (a single extraction procedure for 14 major antimalarials and metabolites, significantly reducing analysis time), sensitivity, selectivity and throughput. While its main limitation is the investment cost of the equipment, plasma samples can be collected in the field and kept at 4°C for up to 48 h before storage at -80°C. The method is suited to detecting the presence of drug in subjects for screening purposes and to quantifying drug exposure after treatment. It may contribute to filling the current knowledge gaps in the pharmacokinetic/pharmacodynamic relationships of antimalarials and help better define therapeutic dose ranges in different patient populations.
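
The matrix-matched calibration step mentioned above amounts to fitting a response-versus-concentration line from spiked plasma standards and back-calculating unknowns from it. The sketch below shows that idea with invented concentrations and detector responses; it is not the paper's calibration data, and real assays would typically use weighted regression of area ratios against an internal standard:

```python
# Sketch of a matrix-matched calibration: fit a straight line to standards
# spiked in plasma, then convert a measured response to a concentration.
# All numeric values are invented for illustration.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Calibration standards: concentration (ng/ml) vs. detector response
conc = [1, 5, 10, 50, 100]
resp = [0.9, 5.1, 10.2, 49.8, 100.1]   # hypothetical peak responses
slope, intercept = fit_line(conc, resp)

def back_calculate(response):
    """Convert a measured response into a concentration via the line."""
    return (response - intercept) / slope

unknown_ng_ml = back_calculate(25.0)   # an unknown sample's concentration
```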

This paper presents reflections on statistical considerations in illicit drug profiling, and more specifically on the calculation of a threshold for determining whether seizures are linked or not. The specific case of heroin and cocaine profiling is presented, with the necessary details on the selected target profiling variables (major alkaloids) and the analytical method used. A statistical approach to comparing illicit drug seizures is also presented, with the introduction of different scenarios involving different data pre-treatments or transformations of the variables. The main aim is to demonstrate the influence of data pre-treatment on the statistical outputs. A thorough study of the evolution of the true positive rate (TP) and the false positive rate (FP) in heroin and cocaine comparisons is then proposed to investigate this specific topic and to demonstrate that there is no universal approach available and that the calculations have to be re-evaluated for each new application.
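
The TP/FP trade-off discussed above can be made concrete with a small sketch: given distance scores between seizure pairs known to be linked or unlinked, each candidate threshold yields one (TP, FP) operating point. The distance values below are invented; in practice they would come from comparing alkaloid profiles under a chosen pre-treatment:

```python
# Minimal sketch of how a linkage threshold trades off true and false
# positive rates when comparing drug seizure profiles. Scores are invented.

def rates_at_threshold(linked, unlinked, threshold):
    """Pairs with a distance below the threshold are declared 'linked'.
    Returns (true positive rate, false positive rate)."""
    tp = sum(1 for d in linked if d < threshold) / len(linked)
    fp = sum(1 for d in unlinked if d < threshold) / len(unlinked)
    return tp, fp

# Hypothetical distance scores between seizure pairs
linked_pairs   = [0.05, 0.10, 0.12, 0.20, 0.35]   # truly same batch
unlinked_pairs = [0.15, 0.40, 0.55, 0.70, 0.90]   # truly different batches

tp, fp = rates_at_threshold(linked_pairs, unlinked_pairs, threshold=0.30)
```

Sweeping the threshold over all observed scores traces out the full TP/FP curve, which is exactly what changes when a different data pre-treatment reshapes the distance distributions.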

Estimating the time since the last discharge of firearms and/or spent cartridges may be a useful piece of information in forensic firearm-related cases. The current approach consists of studying the diffusion of selected volatile organic compounds (such as naphthalene) released during the shooting using solid phase micro-extraction (SPME). However, this technique works poorly on handgun cartridges because the extracted quantities quickly fall below the limit of detection. In order to find more effective solutions and further investigate the aging of organic gunshot residue after the discharge of handgun cartridges, an extensive study was carried out in this work using a novel approach based on high-capacity headspace sorptive extraction (HSSE). By adopting this technique, 51 gunshot residue (GSR) volatile organic compounds could, for the first time, be simultaneously detected in fired handgun cartridge cases. Application to aged specimens showed that many of these compounds presented significant and complementary aging profiles. Compound-to-compound ratios were also tested and proved beneficial both in reducing the variability of the aging curves and in enlarging the useful time window from a forensic casework perspective. The results obtained are thus particularly promising for the development of a complete forensic dating methodology.
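
The variability-reducing effect of compound-to-compound ratios can be illustrated with a toy calculation: if two compounds share the same multiplicative run-to-run fluctuation (e.g. extraction efficiency), the fluctuation cancels in their ratio. The numbers below are invented, not the study's measurements:

```python
# Sketch of the compound-to-compound ratio idea: correlated variability in
# two GSR compound signals cancels in their ratio. Values are illustrative.

def coefficient_of_variation(values):
    """Relative spread: standard deviation divided by the mean."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return (var ** 0.5) / mean

# Hypothetical peak areas over five replicate extractions; both compounds
# share the same extraction-efficiency fluctuation per replicate.
efficiency = [0.8, 1.2, 0.9, 1.1, 1.0]
compound_a = [100 * e for e in efficiency]
compound_b = [40 * e for e in efficiency]
ratios = [a / b for a, b in zip(compound_a, compound_b)]

cv_raw = coefficient_of_variation(compound_a)    # ~14% spread in raw signal
cv_ratio = coefficient_of_variation(ratios)      # ~0: fluctuation cancels
```

Real aging data would of course retain some residual variability in the ratios, since the two compounds' fluctuations are only partially correlated.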

We present a novel hybrid (or multiphysics) algorithm, which couples pore-scale and Darcy descriptions of two-phase flow in porous media. The flow at the pore-scale is described by the Navier-Stokes equations, and the Volume of Fluid (VOF) method is used to model the evolution of the fluid-fluid interface. An extension of the Multiscale Finite Volume (MsFV) method is employed to construct the Darcy-scale problem. First, a set of local interpolators for pressure and velocity is constructed by solving the Navier-Stokes equations; then, a coarse mass-conservation problem is constructed by averaging the pore-scale velocity over the cells of a coarse grid, which act as control volumes; finally, a conservative pore-scale velocity field is reconstructed and used to advect the fluid-fluid interface. The method relies on the localization assumptions used to compute the interpolators (which are quite straightforward extensions of the standard MsFV) and on the postulate that the coarse-scale fluxes are proportional to the coarse-pressure differences. By numerical simulations of two-phase problems, we demonstrate that these assumptions provide hybrid solutions that are in good agreement with reference pore-scale solutions and are able to model the transition from stable to unstable flow regimes. Our hybrid method can naturally take advantage of several adaptive strategies and allows considering pore-scale fluxes only in some regions, while Darcy fluxes are used in the rest of the domain. Moreover, since the method relies on the assumption that the relationship between coarse-scale fluxes and pressure differences is local, it can be used as a numerical tool to investigate the limits of validity of Darcy's law and to understand the link between pore-scale quantities and their corresponding Darcy-scale variables.
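
The control-volume averaging step described above can be sketched in one dimension: pore-scale velocity samples are averaged over coarse cells to produce the Darcy-scale fluxes of the coarse mass-conservation problem. This is a drastic simplification of the actual 3-D MsFV-style scheme, with invented grid sizes and values, intended only to show the coarsening operation itself:

```python
# Toy 1-D sketch of the coarsening step: average a fine (pore-scale)
# velocity field over non-overlapping coarse control volumes. Values and
# grid sizes are invented; the paper's scheme is multi-dimensional.

def coarsen(fine, block):
    """Average a 1-D fine-scale field over blocks of `block` cells, each
    block acting as one coarse control volume."""
    assert len(fine) % block == 0, "fine grid must tile the coarse grid"
    return [sum(fine[i:i + block]) / block
            for i in range(0, len(fine), block)]

fine_velocity = [1.0, 1.2, 0.8, 1.0, 2.0, 2.2, 1.8, 2.0]  # pore-scale samples
coarse_flux = coarsen(fine_velocity, block=4)              # two coarse cells
```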

BACKGROUND: The provision of sufficient basal insulin to normalize fasting plasma glucose levels may reduce cardiovascular events, but such a possibility has not been formally tested. METHODS: We randomly assigned 12,537 people (mean age, 63.5 years) with cardiovascular risk factors plus impaired fasting glucose, impaired glucose tolerance, or type 2 diabetes to receive insulin glargine (with a target fasting blood glucose level of ≤95 mg per deciliter [5.3 mmol per liter]) or standard care and to receive n-3 fatty acids or placebo with the use of a 2-by-2 factorial design. The results of the comparison between insulin glargine and standard care are reported here. The coprimary outcomes were nonfatal myocardial infarction, nonfatal stroke, or death from cardiovascular causes and these events plus revascularization or hospitalization for heart failure. Microvascular outcomes, incident diabetes, hypoglycemia, weight, and cancers were also compared between groups. RESULTS: The median follow-up was 6.2 years (interquartile range, 5.8 to 6.7). Rates of incident cardiovascular outcomes were similar in the insulin-glargine and standard-care groups: 2.94 and 2.85 per 100 person-years, respectively, for the first coprimary outcome (hazard ratio, 1.02; 95% confidence interval [CI], 0.94 to 1.11; P=0.63) and 5.52 and 5.28 per 100 person-years, respectively, for the second coprimary outcome (hazard ratio, 1.04; 95% CI, 0.97 to 1.11; P=0.27). New diabetes was diagnosed approximately 3 months after therapy was stopped among 30% versus 35% of 1456 participants without baseline diabetes (odds ratio, 0.80; 95% CI, 0.64 to 1.00; P=0.05). Rates of severe hypoglycemia were 1.00 versus 0.31 per 100 person-years. Median weight increased by 1.6 kg in the insulin-glargine group and fell by 0.5 kg in the standard-care group. There was no significant difference in cancers (hazard ratio, 1.00; 95% CI, 0.88 to 1.13; P=0.97). 
CONCLUSIONS: When used to target normal fasting plasma glucose levels for more than 6 years, insulin glargine had a neutral effect on cardiovascular outcomes and cancers. Although it reduced new-onset diabetes, insulin glargine also increased hypoglycemia and modestly increased weight. (Funded by Sanofi; ORIGIN ClinicalTrials.gov number, NCT00069784.).

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. 
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. 
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
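
The gradual-deformation idea mentioned above can be sketched in a few lines: two independent standard-Gaussian realizations are combined with a rotation parameter theta, so that every intermediate model keeps the same mean and covariance while theta tunes how far the proposal moves from the current model. The vectors and step size below are illustrative, not the thesis's actual fields:

```python
# Sketch of a gradual-deformation MCMC proposal: rotate between two
# standard-Gaussian realizations. For any theta the combination remains
# standard Gaussian with unchanged covariance (cos^2 + sin^2 = 1).
import math

def gradual_deformation(m_current, m_proposal, theta):
    """Combine two Gaussian realizations with tunable perturbation strength:
    small theta -> small step, theta = pi/2 -> full replacement."""
    c, s = math.cos(theta), math.sin(theta)
    return [c * a + s * b for a, b in zip(m_current, m_proposal)]

current = [0.5, -1.2, 0.3]     # current model parameters (illustrative)
proposal = [1.0, 0.4, -0.7]    # independent new realization (illustrative)
small_step = gradual_deformation(current, proposal, theta=0.1)
full_swap = gradual_deformation(current, proposal, theta=math.pi / 2)
```

In an MCMC chain, theta would be adjusted so that proposals are accepted often enough while still decorrelating the chain quickly, which is the efficiency gain over sequential resampling reported above.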

Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One of the issues not specifically addressed in the formalism occurs when the count rates reaching the detector are sufficiently high to result in camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. However, during WB planar (sweep) imaging, a variable amount of imaged activity exists in the detector's field of view as a function of time, and the camera saturation is therefore time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for relative motion between the detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed, which takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity Samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary for count rate to activity conversion was also obtained, including build-up and attenuation coefficients, in order to convert corrected count rate values to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of Samarium-153 is 46.3 h. 
Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation which takes into account the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task.
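
The core inversion behind such a correction can be sketched for the static case: under the common paralyzable dead-time model, the observed rate is m = n·exp(-n·tau), and the true rate n is recovered from m by Newton's method. This is a simplified single-rate illustration with an assumed dead-time constant, not the paper's full time-dependent algorithm, which additionally lets the rate seen by the detector vary as the heads sweep over the patient:

```python
# Sketch of a Newton-style dead-time inversion for the paralyzable model
# m = n * exp(-n * tau): recover true rate n from observed rate m.
# tau and the rates are assumed values for illustration.
import math

def true_rate(m, tau, tol=1e-10, max_iter=100):
    """Solve m = n*exp(-n*tau) for n (low-rate branch, n*tau < 1)
    by Newton's method, starting from the observed rate."""
    n = m
    for _ in range(max_iter):
        f = n * math.exp(-n * tau) - m                # residual
        fp = (1.0 - n * tau) * math.exp(-n * tau)     # d/dn of the model
        step = f / fp
        n -= step
        if abs(step) < tol:
            break
    return n

tau = 5e-6                                  # assumed dead time per event, s
n_true = 20000.0                            # true counts per second
m_obs = n_true * math.exp(-n_true * tau)    # what the saturated camera records
n_recovered = true_rate(m_obs, tau)         # Newton recovers ~20000 cps
```

The paper's time-dependent version applies this kind of inversion frame by frame, with the activity in the field of view (and hence the saturation) changing as a function of scan time.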