140 results for Discrete models
Abstract:
The formation and accumulation of toxic amyloid-β peptides (Aβ) in the brain may drive the pathogenesis of Alzheimer's disease. Accordingly, disease-modifying therapies for Alzheimer's disease and related disorders could result from treatments regulating Aβ homeostasis. Examples are the inhibition of production, misfolding, and accumulation of Aβ or the enhancement of its clearance. Here we show that oral treatment with ACI-91 (Pirenzepine) dose-dependently reduced brain Aβ burden in AβPPPS1, hAβPPSL, and AβPP/PS1 transgenic mice. A possible mechanism of action of ACI-91 is the selective inhibition of muscarinic acetylcholine receptors (AChR) on endothelial cells of brain microvessels, enhancing Aβ peptide clearance across the blood-brain barrier (BBB). One-month treatment with ACI-91 increased the clearance of intrathecally injected Aβ in plaque-bearing mice. ACI-91 also accelerated the clearance of brain-injected Aβ into blood and peripheral tissues by favoring its urinary excretion. A single oral dose of ACI-91 reduced the half-life of interstitial Aβ peptide in pre-plaque mhAβPP/PS1d mice. By extending our studies to an in vitro model, we showed that muscarinic AChR inhibition by ACI-91 and Darifenacin augmented the capacity of differentiated endothelial monolayers for active transport of Aβ peptide. Finally, ACI-91 was found to consistently affect, in vitro and in vivo, the expression of endothelial cell genes involved in Aβ transport across the BBB. Thus, increased Aβ clearance through the BBB may contribute to reduced Aβ burden and associated phenotypes. Inhibition of muscarinic AChR restricted to the periphery may present a therapeutic advantage, as it avoids adverse central cholinergic effects.
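The half-life result above can be read through a simple first-order clearance model: if interstitial Aβ decays exponentially, a shorter half-life means faster clearance. The sketch below is an illustrative toy calculation only; the numbers are invented and not taken from the study.

```python
# Toy first-order clearance model: interstitial Aβ declining exponentially,
# so a shorter half-life corresponds to faster clearance. Values illustrative.

def remaining_fraction(t, half_life):
    """Fraction of peptide remaining after time t (same units as half_life)."""
    return 0.5 ** (t / half_life)

# Halving the half-life leaves only a quarter after the original half-life:
assert abs(remaining_fraction(2.0, 2.0) - 0.5) < 1e-12
print(remaining_fraction(2.0, 1.0))  # 0.25
```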
Abstract:
A survey was undertaken among Swiss occupational health and safety specialists in 2004 to identify uses, difficulties, and possible developments of exposure models. Occupational hygienists (121), occupational physicians (169), and safety specialists (95) were surveyed with an in-depth questionnaire. The results indicate that models are not widely used in practice in Switzerland and remain confined to research groups focusing on specific topics. However, various determinants of exposure are often considered important by professionals (emission rate, work activity), and in some cases recorded and used (room parameters, operator activity). These parameters cannot be directly included in present models. Nevertheless, more than half of the occupational hygienists think it is important to develop quantitative exposure models. Among research institutions there is, however, considerable interest in using models to solve problems that are difficult to address with direct measurements, i.e., retrospective exposure assessment for specific clinical cases and prospective evaluation for new situations or estimation of the effect of selected parameters. In a recent study of acute pulmonary toxicity following waterproofing-spray exposure, exposure models were used to reconstruct the exposure of a group of patients. Finally, in the context of exposure prediction, it is also worth noting a measurement database that has existed in Switzerland since 1991. [Authors]
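A common quantitative exposure model of the kind discussed here is the one-box (well-mixed room) model, in which concentration is driven by an emission rate and a ventilation rate. The sketch below is a generic illustration with invented parameter values, not a model from the survey itself.

```python
# Minimal sketch of the classic one-box (well-mixed room) exposure model
# used in occupational hygiene; all parameter values are illustrative
# assumptions, not data from the survey described above.

import math

def box_model_concentration(G, Q, V, t):
    """Concentration (mg/m^3) in a well-mixed room at time t (hours).

    G: emission rate (mg/h), Q: ventilation rate (m^3/h), V: room volume (m^3).
    Assumes zero initial concentration and no removal other than ventilation.
    """
    c_ss = G / Q                               # steady-state concentration
    return c_ss * (1 - math.exp(-Q * t / V))   # exponential approach to c_ss

# Example: 100 mg/h emission, 200 m^3/h ventilation, 50 m^3 room
c_1h = box_model_concentration(G=100.0, Q=200.0, V=50.0, t=1.0)
print(round(c_1h, 3))  # approaches G/Q = 0.5 mg/m^3
```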
Abstract:
BACKGROUND: Optimal management of acute pulmonary embolism (PE) requires medical expertise, diagnostic testing, and therapies that may not be available consistently throughout the entire week. We sought to assess whether associations exist between weekday or weekend admission and mortality and length of hospital stay for patients hospitalized with PE. METHODS AND RESULTS: We evaluated patients discharged with a primary diagnosis of PE from 186 acute care hospitals in Pennsylvania (January 2000 to November 2002). We used random-effect logistic models to study the association between weekend admission and 30-day mortality and used discrete survival models to study the association between weekend admission and time to hospital discharge, adjusting for hospital (region, size, and teaching status) and patient factors (race, insurance, severity of illness, and use of thrombolytic therapy). Among 15 531 patient discharges with PE, 3286 patients (21.2%) had been admitted on a weekend. Patients admitted on weekends had a higher unadjusted 30-day mortality rate (11.1% versus 8.8%) than patients admitted on weekdays, with no difference in length of stay. Patients admitted on weekends had significantly greater adjusted odds of dying (odds ratio 1.17, 95% confidence interval 1.03 to 1.34) than patients admitted on weekdays. The higher mortality among patients hospitalized on weekends was driven by the increased mortality rate among the most severely ill patients. CONCLUSIONS: Patients with PE who are admitted on weekends have a significantly higher short-term mortality than patients admitted on weekdays. Quality-improvement efforts should aim to ensure a consistent approach to the management of PE 7 days a week.
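The discrete survival models used to study time to discharge are typically fit on "person-period" data: each stay is expanded into one row per day at risk with a binary event indicator, so discharge time can be modeled with ordinary logistic regression. The sketch below shows only this data expansion, on invented records, not the study's actual model or data.

```python
# Sketch of the person-period expansion behind a discrete-time survival
# model: each hospital stay becomes one row per day at risk, with a binary
# "discharged" outcome on the final day. Records are invented for illustration.

def person_period(records):
    """records: list of (patient_id, length_of_stay_days, weekend_admit)."""
    rows = []
    for pid, los, weekend in records:
        for day in range(1, los + 1):
            rows.append({
                "id": pid,
                "day": day,                      # discrete time indicator
                "weekend_admit": int(weekend),   # exposure of interest
                "discharged": int(day == los),   # event occurs on last day
            })
    return rows

rows = person_period([("a", 3, True), ("b", 2, False)])
print(len(rows))  # 5 person-day rows: 3 for "a", 2 for "b"
```

A logistic regression of `discharged` on `day` and `weekend_admit` over these rows then estimates the daily discharge hazard.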
Abstract:
Using numerical simulations we investigate shapes of random equilateral open and closed chains, one of the simplest models of freely fluctuating polymers in a solution. We are interested in the 3D density distribution of the modeled polymers where the polymers have been aligned with respect to their three principal axes of inertia. This type of approach was pioneered by Theodorou and Suter in 1985. While individual configurations of the modeled polymers are almost always nonsymmetric, the approach of Theodorou and Suter results in cumulative shapes that are highly symmetric. By taking advantage of asymmetries within the individual configurations, we modify the procedure of aligning independent configurations in a way that shows their asymmetry. This approach reveals, for example, that the 3D density distribution for linear polymers has a bean shape predicted theoretically by Kuhn. The symmetry-breaking approach reveals complementary information to the traditional, symmetrical, 3D density distributions originally introduced by Theodorou and Suter.
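The alignment step described above (rotating each configuration into its principal axes of inertia) can be sketched directly: generate a random equilateral open chain and diagonalize its gyration tensor. This is a minimal illustration of the generic technique, with assumed details, not the authors' exact procedure or their symmetry-breaking modification.

```python
# Minimal sketch: generate a random equilateral open chain of unit steps,
# then rotate it into its principal axes of inertia (Theodorou-Suter style
# alignment). Details such as chain length are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def random_equilateral_chain(n):
    """n unit steps with directions uniform on the sphere; returns n+1 vertices."""
    steps = rng.normal(size=(n, 3))
    steps /= np.linalg.norm(steps, axis=1, keepdims=True)
    return np.vstack([np.zeros(3), np.cumsum(steps, axis=0)])

def align_principal_axes(verts):
    """Center the chain and rotate so the gyration-tensor eigenvectors are the axes."""
    centered = verts - verts.mean(axis=0)
    gyration = centered.T @ centered / len(centered)   # gyration tensor
    eigvals, eigvecs = np.linalg.eigh(gyration)        # ascending eigenvalues
    return centered @ eigvecs[:, ::-1]                 # largest axis first

chain = align_principal_axes(random_equilateral_chain(100))
spans = chain.var(axis=0)          # variances along the three principal axes
assert spans[0] >= spans[1] >= spans[2]
```

Accumulating such aligned configurations into a 3D histogram yields the cumulative density distributions discussed above.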
Abstract:
This thesis aims to investigate the extremal properties of certain risk models of interest in various applications from insurance, finance, and statistics. It develops along two principal lines. In the first part, we focus on two univariate risk models, i.e., deflated risk and reinsurance risk models. We investigate their tail expansions under certain tail conditions on the common risks. The main results are illustrated by typical examples and numerical simulations. Finally, the findings are applied in insurance, for instance to approximations of Value-at-Risk and conditional tail expectations. The second part of this thesis is devoted to three bivariate models. The first model concerns bivariate censoring of extreme events. For this model, we first propose a class of estimators for both the tail dependence coefficient and the tail probability. These estimators are flexible due to a tuning parameter, and their asymptotic distributions are obtained under some second-order bivariate slowly varying conditions on the model. We then give some examples and present a small Monte Carlo simulation study, followed by an application to a real insurance data set. The objective of the second bivariate risk model is the investigation of the tail dependence coefficient of bivariate skew slash distributions. Such skew slash distributions are widely useful in statistical applications; they are generated mainly by normal mean-variance mixtures and scaled skew-normal mixtures, which distinguish the tail dependence structure, as shown by our principal results.
The third bivariate risk model is concerned with the approximation of the component-wise maxima of skew elliptical triangular arrays. The theoretical results are based on certain tail assumptions on the underlying random radius.
Abstract:
Machado-Joseph disease, or spinocerebellar ataxia type 3, the most common dominantly inherited spinocerebellar ataxia, results from translation of the polyglutamine-expanded and aggregation-prone ataxin 3 protein. Clinical manifestations include cerebellar ataxia and pyramidal signs, and there is no therapy to delay disease progression. Beclin 1, an autophagy-related protein and essential gene for cell survival, is decreased in several neurodegenerative disorders. This study aimed at evaluating whether lentiviral-mediated beclin 1 overexpression would rescue motor and neuropathological impairments when administered to pre- and post-symptomatic lentiviral-based and transgenic mouse models of Machado-Joseph disease. Beclin 1 mediated significant improvements in motor coordination, balance, and gait, with beclin 1-treated mice balancing for longer periods on the Rotarod and presenting longer and narrower footprints. Furthermore, in agreement with the improvements observed in motor function, beclin 1 overexpression prevented neuronal dysfunction and neurodegeneration, decreasing the formation of polyglutamine-expanded aggregates and preserving Purkinje cell arborization and immunoreactivity for neuronal markers. These data show that overexpression of beclin 1 in the mouse cerebellum is able to rescue and hinder the progression of motor deficits when administered at pre- and post-symptomatic stages of the disease.
Abstract:
A wide range of numerical models and tools has been developed over the last decades to support decision making in environmental applications, ranging from physical models to a variety of statistically based methods. In this study, a landslide susceptibility map of part of the Three Gorges Reservoir region of China was produced using binary logistic regression analysis. The available information includes the digital elevation model of the region, a geological map, and different GIS layers including land cover data obtained from satellite imagery. The landslides were observed and documented during field studies. A validation analysis was carried out to assess the quality of the mapping.
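The binary logistic regression behind such susceptibility maps can be sketched on synthetic data: pixel-level terrain features predict a landslide/no-landslide label, and the fitted coefficients give a susceptibility score per pixel. Everything below is simulated; the feature names and effect sizes are invented stand-ins for the study's GIS layers.

```python
# Hedged sketch: binary logistic regression on synthetic terrain features
# (slope, distance to river), standing in for real GIS layers. Fitted by
# plain gradient descent on the log-loss; all data here are simulated.

import numpy as np

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n),                  # intercept
                     rng.uniform(0, 45, n),       # slope (degrees)
                     rng.uniform(0, 2000, n)])    # distance to river (m)
X[:, 1:] = (X[:, 1:] - X[:, 1:].mean(0)) / X[:, 1:].std(0)  # standardize
true_w = np.array([-1.0, 2.0, -1.5])              # steeper + closer -> landslide
y = (rng.random(n) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

w = np.zeros(3)
for _ in range(2000):                             # gradient descent on log-loss
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n

print(np.sign(w[1:]))  # recovered signs should match the assumed effects
```

Applying the fitted model to every cell of a raster grid would produce the susceptibility map.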
Abstract:
In this paper, we analyze the magnitude and possible selectivity of attrition among first-wave respondents in the Swiss Household Panel (SHP), from wave two (2000) through wave seven (2005). After comparing attrition of first-wave respondents with that of other panel surveys, we model the selectivity of attrition in two steps: we first build separate wave-to-wave models, and then a longitudinal all-wave model. The latter model includes wave interaction effects. The first models allow tracing the development of selectivity, i.e., whether an initial selectivity is compensated or cumulates over time; the second allows assessing the effects of the covariates in a specific wave, controlling for the base attrition effect. In particular, it allows analysis of the consequences of discrete fieldwork events. Our results support the findings in the literature: attritors are in general younger people, males, foreigners, and the socially and politically "excluded", i.e., those who show little social and political interest and participation, those who are mostly dissatisfied with various aspects of their life, those who live in households with high unit nonresponse, and those who exhibit worse reporting behavior. This pattern is cumulative rather than compensating over panel waves. Excessive attrition in two waves, presumably caused by two discrete events in the panel, is not particularly selective. The remaining variation in selective attrition is worth further exploration.
Abstract:
Summary Forests are key ecosystems of the Earth and are associated with a wide range of functions. Many of these functions are beneficial to humans and are referred to as ecosystem services. Sustainable development requires that all relevant ecosystem services are quantified, managed, and monitored equally. Natural resource management therefore targets the services associated with ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of the relevant services do not correspond to a discrete forest ecosystem. As a consequence, the services are not quantified, managed, and monitored in an equal and sustainable manner. The aims of the thesis were therefore to test this hypothesis, establish an improved conceptual approach, and provide spatial applications for the relevant land cover and structure variables. The study was carried out in western Switzerland and based primarily on data from a countrywide landscape inventory. This inventory is part of the third Swiss national forest inventory and assesses continuous landscape variables based on a regular sampling of true colour aerial imagery. In addition, land cover variables were derived from Landsat 5 TM passive sensor data, and land structure variables from active sensor data from a small-footprint laser scanning system. The results confirmed the main hypothesis, as the relevant services did not scale well with the forest ecosystem. Instead, a new conceptual approach for sustainable management of natural resources was described. This concept quantifies the services as a continuous function of the landscape, rather than a discrete function of the forest ecosystem. The explanatory landscape variables are therefore called continuous fields, and the forest becomes a dependent and function-driven management unit. Continuous field mapping methods were established for land cover and structure variables. In conclusion, the discrete forest ecosystem is an adequate planning and management unit.
However, monitoring the state of and trends in the sustainability of services requires that they be quantified as a continuous function of the landscape. Sustainable natural resource management iteratively combines the ecosystem and gradient approaches.
Abstract:
The geometry and connectivity of fractures exert a strong influence on the flow and transport properties of fracture networks. We present a novel approach to stochastically generate three-dimensional discrete networks of connected fractures that are conditioned to hydrological and geophysical data. A hierarchical rejection sampling algorithm is used to draw realizations from the posterior probability density function at different conditioning levels. The method is applied to a well-studied granitic formation using data acquired within two boreholes located 6 m apart. The prior models include 27 fractures with their geometry (position and orientation) bounded by information derived from single-hole ground-penetrating radar (GPR) data acquired during saline tracer tests and optical televiewer logs. Eleven cross-hole hydraulic connections between fractures in neighboring boreholes and the order in which the tracer arrives at different fractures are used for conditioning. Furthermore, the networks are conditioned to the observed relative hydraulic importance of the different hydraulic connections by numerically simulating the flow response. Among the conditioning data considered, constraints on the relative flow contributions were the most effective in determining the variability among the network realizations. Nevertheless, we find that the posterior model space is strongly determined by the imposed prior bounds. Strong prior bounds were derived from GPR measurements and helped to make the approach computationally feasible. We analyze a set of 230 posterior realizations that reproduce all data given their uncertainties assuming the same uniform transmissivity in all fractures. The posterior models provide valuable statistics on length scales and density of connected fractures, as well as their connectivity. 
In an additional analysis, effective transmissivity estimates for the posterior realizations indicate a strong influence of the discrete fracture network (DFN) structure, which induces large variations in equivalent transmissivity between realizations. The transmissivity estimates agree well with previous estimates at the site based on pumping, flowmeter, and temperature data.
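The hierarchical rejection sampling described above can be sketched schematically: prior draws are screened by conditioning levels of increasing cost, and only survivors of one level are evaluated at the next. The "network" and constraints below are toy placeholders (the real levels involve GPR-derived geometry bounds and flow simulation), not the actual borehole data.

```python
# Schematic of hierarchical rejection sampling: prior draws are screened by
# increasingly expensive conditioning levels; all() short-circuits, so costly
# checks run only on draws that pass the cheap ones. Constraints are toy
# placeholders, not the actual conditioning data.

import random

random.seed(0)

def draw_prior():
    # toy "network": a connection count and a flow-asymmetry measure
    return {"connections": random.randint(0, 20),
            "flow_ratio": random.random()}

levels = [
    lambda m: m["connections"] >= 11,        # cheap: connectivity constraint
    lambda m: 0.2 < m["flow_ratio"] < 0.8,   # costly: flow-response constraint
]

def hierarchical_rejection(n_keep):
    kept = []
    while len(kept) < n_keep:
        model = draw_prior()
        if all(level(model) for level in levels):
            kept.append(model)
    return kept

posterior = hierarchical_rejection(50)
print(len(posterior))  # 50 accepted realizations
```

The accepted set plays the role of the posterior realizations analyzed in the study.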
Abstract:
The ultimate aim of this project is to use genetically modified T cells or mesenchymal stem cells to locally overexpress the two chemokines CXCL13 and CCL2, together or each one alone, inside a solid tumor. CXCL13 is expected to induce ectopic lymphoid structures, and a high level of CCL2 is intended to trigger acute inflammation. The combination of these two effects represents a new model for studying the mechanisms that regulate peripheral tolerance and tumor immunity.
The insights gained may help develop or improve cancer immunotherapy. The primary goal of this work was the establishment of a genetic mouse model that allows tumor-specific expression of high levels of the two chemokines of interest. To accomplish this task, which represents gene therapy of solid tumors, two types of potentially useful carrier cells were evaluated: CD8+ T cells and mesenchymal bone marrow cells, to be used in adoptive cell transfers into tumor-bearing mice. Irrespective of the envisaged immunotherapy, meeting these so far unmet needs of gene therapy would provide a highly valuable tool for many other therapeutic approaches as well. Several transgenic mouse lines were generated as a source of CD8+ T cells modified to express the chemokines of interest. In a double-transgenic approach, the properties of two T cell-specific promoters were combined using Cre-loxP technology. The granzyme B promoter confers activation dependency, and the lck distal promoter assures strong constitutive expression once the CD8+ T cell has been activated. The constructed transgenes performed well in vivo, and mice expressing CCL2 in activated CD8+ T cells were obtained. These cells can now be used with different protocols for adoptively transferring cytotoxic T cells (CTL) into tumor-bearing recipients, thus allowing their capacity as tumor-infiltrating chemokine carriers to be studied. The establishment of transgenic mice likewise expressing CXCL13 is expected in the near future. In addition, T cells from the generated single-transgenic mice, which highly express an EGFP reporter in both CD4+ and CD8+ cells, can be easily traced in vivo when setting up adoptive transfer conditions. The evaluation of mesenchymal bone marrow cells demonstrated that these cells can efficiently engraft into tumor stroma upon local co-injection with tumor cells.
This represents a valuable tool for research purposes, as it allows the introduction of manipulated stromal cells into a tumor model. The established engraftment model is therefore suited for studying the envisaged immunotherapy. These results confirm, to some extent, previously reported results in an improved model; however, the suggested systemic tumor-homing efficiency and specificity of mesenchymal bone marrow cells were not observed in our model, indicating that these cells may not be suited for therapeutic use. Another major result of this work is the establishment of tumor-conditioned in vitro culture of mesenchymal bone marrow cells, which allowed these cells to be expanded more rapidly while maintaining their tumor-homing and engrafting capacities. This offers another valuable tool, as in vitro culture is a necessary step for therapeutic manipulations.
Abstract:
Over the last decade, the development of statistical models in support of forensic fingerprint identification has been the subject of increasing research attention, spurred on recently by commentators who claim that the scientific basis for fingerprint identification has not been adequately demonstrated. Such models are increasingly seen as useful tools in support of the fingerprint identification process within, or in addition to, the ACE-V framework. This paper provides a critical review of recent statistical models from both a practical and a theoretical perspective. This includes analysis of models of two different methodologies: Probability of Random Correspondence (PRC) models, which focus on calculating probabilities of the occurrence of fingerprint configurations in a given population, and Likelihood Ratio (LR) models, which use analysis of corresponding features of fingerprints to derive a likelihood value representing the evidential weight for a potential source.
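The LR framework mentioned above reduces to a simple ratio: the probability of the observed correspondence under the same-source hypothesis divided by its probability under the different-source hypothesis. The numbers below are invented purely to illustrate the arithmetic, not values from any reviewed model.

```python
# Toy illustration of the likelihood ratio (LR) framework: LR compares the
# probability of the observed fingerprint correspondence under the
# same-source and different-source hypotheses. Probabilities are invented.

def likelihood_ratio(p_given_same, p_given_diff):
    """LR > 1 supports the same-source hypothesis; LR < 1 supports different sources."""
    return p_given_same / p_given_diff

# e.g. the observed minutiae agreement is common among impressions of the
# same finger (0.8) but rare across a population of other fingers (0.001)
lr = likelihood_ratio(0.8, 0.001)
print(round(lr))  # 800 -> strong support for the same-source hypothesis
```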