117 results for Models and Principles


Relevance: 100.00%

Abstract:

At the beginning of the 1990s, the concept of "European integration" could still be said to be fairly unambiguous. Nowadays, it has become plural and complex almost to the point of unintelligibility. This is due, of course, to the internal differentiation of EU membership, with several Member States pulling out of key integrative projects such as establishing an area without frontiers, the "Schengen" area, and a common currency. But this is also due to the differentiated extension of key integrative projects to European non-EU countries - Schengen is again a case in point. Such processes of "integration without membership", the focus of the present publication, are acquiring ever-growing topicality both in the political arena and in academia. International relations between the EU and its neighbouring countries are crucial for both, and their development through new agreements features prominently on the continent's political agenda. Over and above this aspect, the dissemination of EU values and standards beyond the Union's borders raises a whole host of theoretical and methodological questions, unsettling in some cases traditional conceptions of the autonomy and separation of national legal orders. This publication brings together the papers presented at the Integration without EU Membership workshop held in May 2008 at the EUI (Max Weber Programme and Department of Law). It aims to compare different models and experiences of integration between the EU, on the one hand, and, on the other, those European countries that do not currently have an accession perspective. In delimiting the geographical scope of the inquiry, so as to scale it down to manageable proportions, the guiding principles have been to include both the "Eastern" and "Western" neighbours of the EU, and to examine both structured frameworks of cooperation, such as the European Neighbourhood Policy and the European Economic Area, and bilateral relations developing on a more ad hoc basis. These principles are reflected in the arrangement of the papers, which consider in turn the positions of Ukraine, Russia, Norway, and Switzerland in European integration - current standing, perspectives for evolution, and consequences in terms of the EU-ization of their respective legal orders. These subjects are examined from several perspectives. We had the privilege of receiving contributions from leading practitioners and scholars from the countries concerned, from high-ranking EU officials, from prominent specialists in EU external relations law, and from young and talented researchers. We wish to thank them all here for their invaluable insights. We are moreover deeply indebted to Marise Cremona (Law Department, EUI) for her inspiring advice and encouragement, as well as to Ramon Marimon, Karin Tilmans, Lotte Holm, Alyson Price and Susan Garvin (Max Weber Programme, EUI) for their unflinching support throughout this project. A word is perhaps needed on the propriety and usefulness of the research concept embodied in this publication. Does it make sense to compare the integration models and experiences of countries as different as Norway, Russia, Switzerland, and Ukraine? Needless to say, this list of four evokes a staggering diversity of political, social, cultural, and economic conditions, and at least as great a diversity of approaches to European integration. Still, we would argue that such diversity only makes comparison more meaningful.
Indeed, while the particularities and idiosyncratic elements of each "model" of integration are fully displayed in the present volume, common themes and preoccupations run through the pages of every contribution: the difficulty of conceptualizing the finalité and essence of integration, which is evident in the EU today but greatly amplified for non-EU countries; the asymmetries and trade-offs between integration and autonomy that are inherent in any attempt to participate in European integration from the outside; and the alteration of deep-seated legal concepts, and concepts about the law, that is already observable in the most integrated of the non-EU countries concerned. These issues are not transient or coincidental: they are inextricably bound up with the integration of non-EU countries in the EU project. By publishing this collection, we make no claim to have dealt with them exhaustively, still less definitively. Our ambition is more modest: to highlight the relevance of these themes, to place them more firmly on the scientific agenda, and to provide a stimulating basis for future research and reflection.

Relevance: 100.00%

Abstract:

Characterizing the risks posed by nanomaterials is extraordinarily complex because these materials can have a wide range of sizes, shapes, chemical compositions and surface modifications, all of which may affect toxicity. There is an urgent need for a testing strategy that can rapidly and efficiently provide a screening approach for evaluating the potential hazard of nanomaterials and inform the prioritization of additional toxicological testing where necessary. Predictive toxicity models could form an integral component of such an approach by predicting which nanomaterials, as a result of their physico-chemical characteristics, have potentially hazardous properties. Strategies for directing research towards predictive models and the ancillary benefits of such research are presented here.
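
As a loose illustration of what such a predictive model could look like, the sketch below trains a random forest classifier on hypothetical physico-chemical descriptors. The features, data and hazard rule are invented for illustration; they are not taken from the abstract or any published model.

```python
# Minimal sketch of a predictive nanotoxicity screening model.
# Descriptors, data and the hazard rule are hypothetical illustrations.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical physico-chemical descriptors for 200 nanomaterials:
X = np.column_stack([
    rng.uniform(5, 200, 200),      # primary particle size (nm)
    rng.uniform(-50, 50, 200),     # zeta potential (mV)
    rng.uniform(10, 1000, 200),    # specific surface area (m2/g)
    rng.uniform(0, 100, 200),      # solubility (mg/L)
])
# Synthetic hazard label: small, high-surface-area particles flagged
# as potentially hazardous (a toy rule, purely for illustration).
y = ((X[:, 0] < 50) & (X[:, 2] > 300)).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

# Feature importances indicate which characteristics drive predicted
# hazard, which is what would inform prioritization of further testing.
model.fit(X, y)
print(dict(zip(["size", "zeta", "surface_area", "solubility"],
               model.feature_importances_.round(2))))
```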

Relevance: 100.00%

Abstract:

Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Since learning that continents move, geologists have been trying to retrace their distribution through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly across the oceanic realm but belong to a larger ensemble combining continental and oceanic crust: the tectonic plates. Unfortunately, mainly for historical and technical reasons, this idea has not yet received a sufficient echo within the reconstruction community. Nevertheless, we are convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view and, at last, reach true plate tectonics. The main aim of this study is to defend this point of view by presenting, in all necessary detail, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a new methodology placing the tectonic plates and their kinematics at the centre of the problem. 
Using assemblies of continents (referred to as "key assemblies") as anchor points distributed across the scope of our study (from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one assembly to the next, from the past towards the present. In between, lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved through time, follow a consistent geodynamic evolution and always form an interconnected network in space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides severe constraints for the model. This multi-source approach requires efficient data organization and management. Prior to this study, the critical mass of necessary data had become a barely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are informatics tools specifically devoted to storing, managing and analysing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted this mass of scattered data into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, by programming specific tools, we both facilitated the reconstruction work (task automation) and enhanced the model, greatly increasing the kinematic control of plate motions through plate velocity models. Based on the 340 newly defined terranes, we developed a revised set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset, we can now tackle major issues of modern geology, such as global sea-level variations and climate change. We began by studying another major (and not definitively resolved!) issue of modern tectonics: the mechanisms driving plate motions. We observed that, throughout the Earth's history, plate rotation poles (which describe plate motions across the Earth's surface) tend to be distributed along a band running from the northern Pacific through northern South America, the central Atlantic, northern Africa and central Asia up to Japan. In essence, this distribution signifies that plates tend to escape this median plane. Barring an unidentified methodological bias, we interpret this phenomenon as reflecting a secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next; the crustal material is symbolized by synthetic isochrons of known age. We also reconstruct the margins (active or passive), mid-ocean ridges and intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering far better precision than previously existing ones.
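
The rigid-plate motions described above are conventionally encoded as finite rotations about Euler poles. The sketch below shows this basic kinematic operation using Rodrigues' rotation formula; the pole, angle and site coordinates are hypothetical illustration values, not data from the study.

```python
# Sketch: moving a rigid plate by a finite rotation about an Euler pole,
# the basic kinematic operation behind plate reconstructions.
import numpy as np

def to_cartesian(lat, lon):
    """Geographic coordinates (degrees) to a unit vector."""
    lat, lon = np.radians(lat), np.radians(lon)
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def rotate(point, pole, angle_deg):
    """Rodrigues' rotation of a unit vector about an Euler pole."""
    k, v = pole, point
    a = np.radians(angle_deg)
    return (v * np.cos(a) + np.cross(k, v) * np.sin(a)
            + k * np.dot(k, v) * (1 - np.cos(a)))

def to_geographic(v):
    return np.degrees(np.arcsin(v[2])), np.degrees(np.arctan2(v[1], v[0]))

# Hypothetical: rotate a site at (30N, 40E) by 20 degrees
# about an Euler pole at (60N, 10W).
pole = to_cartesian(60.0, -10.0)
site = to_cartesian(30.0, 40.0)
print(to_geographic(rotate(site, pole, 20.0)))
```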

Relevance: 100.00%

Abstract:

The present study was performed in an attempt to develop an in vitro integrated testing strategy (ITS) to evaluate drug-induced neurotoxicity. A number of endpoints were analyzed using two complementary brain cell culture models and an in vitro blood-brain barrier (BBB) model after single and repeated exposure to selected drugs covering the major biological, pharmacological and neurotoxicological responses. Furthermore, four drugs (diazepam, cyclosporine A, chlorpromazine and amiodarone) were tested in more depth as representatives of different classes of neurotoxicants inducing toxicity through different pathways. The developed in vitro BBB model allowed detection of toxic effects at the level of the BBB and evaluation of drug transport through the barrier for predicting free brain concentrations of the studied drugs. The measurement of neuronal electrical activity was found to be a sensitive tool for predicting the neuroactivity and neurotoxicity of drugs after acute exposure. The histotypic 3D re-aggregating brain cell cultures, containing all brain cell types, were found to be well suited for omics analyses after both acute and long-term treatment. The data obtained suggest that an in vitro ITS based on information from BBB studies, combined with metabolomics, proteomics and neuronal electrical activity measurements performed in stable in vitro neuronal cell culture systems, has high potential to improve current in vitro evaluation of drug-induced neurotoxicity.

Relevance: 90.00%

Abstract:

Acute and chronic respiratory failure is one of the major and potentially life-threatening features in individuals with myotonic dystrophy type 1 (DM1). Despite several clinical demonstrations of respiratory problems in DM1 patients, the underlying mechanisms are still not completely understood. This study was designed to investigate whether the DMSXL transgenic mouse model of DM1 exhibits respiratory disorders and, if so, to identify the pathological changes underlying them. Using pressure plethysmography, we assessed breathing function in control mice and in DMSXL mice, which were generated after large expansions of the CTG repeat in successive generations of DM1 transgenic mice. Statistical analysis of the breathing function measurements revealed a significant decrease in the most relevant respiratory parameters in DMSXL mice, indicating impaired respiratory function. Histological and morphometric analysis showed pathological changes in the diaphragmatic muscle of DMSXL mice, characterized by an increased percentage of type I muscle fibers, the presence of central nuclei, partial denervation of end-plates (EPs), and a significant reduction in EP size, shape complexity and density of acetylcholine receptors, all of which reflect a possible breakdown in communication between the diaphragmatic muscle fibers and the nerve terminals. These diaphragm abnormalities were accompanied by an accumulation of mutant DMPK RNA foci in muscle fiber nuclei. Moreover, the number of unmyelinated phrenic afferents was significantly lower in DMSXL mice, whereas no significant neuronopathy was detected in either cervical phrenic motor neurons or brainstem respiratory neurons. Because EPs are involved in the transmission of action potentials and the unmyelinated phrenic afferents exert a modulating influence on the respiratory drive, the pathological alterations affecting these structures might underlie the respiratory impairment detected in DMSXL mice. Understanding the mechanisms of respiratory deficiency should guide pharmaceutical and clinical research towards better therapies for the respiratory deficits associated with DM1.

Relevance: 90.00%

Abstract:

Background: Leptin is produced primarily by adipocytes. Although originally associated with the central regulation of satiety and energy metabolism, increasing evidence indicates that leptin may be an important factor in congestive heart failure (CHF). In this study, we aimed to test the hypothesis that leptin may influence CHF pathophysiology via a pathway of increasing body mass index (BMI). Methods: We studied 2,389 elderly participants aged 70 and older (M: 1,161; F: 1,228) without CHF and with serum leptin measurements from the Health, Aging, and Body Composition study. We analyzed the association between serum leptin level and risk of incident CHF using Cox proportional hazards regression models. Elevated leptin was defined as a level above the highest quartile (Q4) of the leptin distribution in the total sample for each sex. Adjusted covariates included demographic, behavioral, lipid and inflammation variables (partially adjusted models), with further adjustment for BMI (fully adjusted models). Results: Over a mean 9-year follow-up, 316 participants (13.2%) developed CHF. The partially adjusted models indicated that men and women with elevated serum leptin levels (>=9.89 ng/ml in men and >=25 ng/ml in women) had significantly higher risks of developing CHF than those with leptin levels below Q4. The adjusted hazard ratios (95% CI) for incident CHF were 1.49 (1.04-2.13) in men and 1.71 (1.12-2.58) in women. However, these associations became non-significant after further adjustment for BMI in each sex: the fully adjusted hazard ratios (95% CI) were 1.43 (0.94-2.18) in men and 1.24 (0.77-1.99) in women. Conclusion: Subjects with elevated leptin levels have a higher risk of CHF. The study supports the hypothesis that the influence of leptin on CHF risk may act through a pathway related to increasing BMI.
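
For illustration, the following sketch reproduces the shape of such an analysis on synthetic data: a Cox proportional hazards model for incident CHF with an elevated-leptin (Q4) indicator, fitted with and without BMI adjustment. It uses the lifelines library; the data are simulated and this is not the study's actual code.

```python
# Sketch: Cox regression with and without BMI adjustment, synthetic data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
bmi = rng.normal(27, 4, n)
# Synthetic leptin that tracks BMI (the hypothesized pathway)
leptin = np.exp(0.15 * (bmi - 27) + rng.normal(2, 0.6, n))
elevated = (leptin >= np.quantile(leptin, 0.75)).astype(int)  # Q4 indicator

# Synthetic follow-up where the CHF hazard rises with BMI only
time = rng.exponential(scale=9 / np.exp(0.05 * (bmi - 27)), size=n)
event = (time < 9).astype(int)   # events within a ~9-year follow-up
time = np.minimum(time, 9)       # administrative censoring at 9 years

df = pd.DataFrame({"time": time, "event": event,
                   "elevated_leptin": elevated, "bmi": bmi})

# Partially adjusted model (no BMI): elevated leptin appears hazardous
CoxPHFitter().fit(df[["time", "event", "elevated_leptin"]],
                  "time", "event").print_summary()
# Fully adjusted model (with BMI): the leptin effect attenuates
CoxPHFitter().fit(df, "time", "event").print_summary()
```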

Relevance: 90.00%

Abstract:

Gas6 downregulates the activation state of macrophages and thereby their production of proinflammatory cytokines induced by various stimuli. We aimed to determine whether Gas6 is involved in sepsis. We measured Gas6 plasma levels in 13 healthy subjects, 29 patients with severe sepsis, and 18 patients with non-infectious inflammatory diseases. Gas6 levels were higher in septic patients than in the control groups (P < 0.0001). The sensitivity and specificity of Gas6 levels to predict fatal outcome were 83% and 88%, respectively. We next investigated whether Gas6 affects cytokine production and outcome in experimental models of endotoxemia and peritonitis in wild-type (WT) and Gas6-/- mice. Circulating Gas6 levels after LPS 25 mg/kg i.p. peaked at 1 hour (P < 0.001). TNF-α was higher in Gas6-/- than in WT mice 1 hour after LPS (P < 0.05). Furthermore, 62 anti- and pro-inflammatory cytokines were quantified in plasma after LPS injection. Their levels were globally higher in Gas6-/- plasma after LPS, with 47 of the 62 cytokines being at least 50% higher in Gas6-/- than in WT plasma after 1 hour. Mortality induced by 25 mg/kg LPS was 25% in WT versus 87% in Gas6-/- mice (P < 0.05). LPS-induced mortality in mice with mutant Gas6 receptors (Axl-/-, Tyro3-/- and Mer(kd)) was also enhanced compared with WT mice (P < 0.001). In peritonitis models (cecal ligation and puncture, CLP, and i.p. injection of E. coli), Gas6 plasma levels increased and remained elevated for at least 24 hours, and CLP increased mortality in Gas6-/- mice. Finally, we explored the role of Gas6 in LPS-treated macrophages. Gas6 was released by LPS-stimulated WT macrophages, and Gas6-/- macrophages produced more TNF-α and IL-6 than WT macrophages; cytokine release by Gas6-/- macrophages was also higher on a cytokine array. Addition of recombinant Gas6 to the culture medium of Gas6-/- macrophages reduced cytokine production to WT levels. In LPS-treated Gas6-/- macrophages, Akt and Erk1/2 phosphorylation was reduced whereas p38 and NF-κB activation was enhanced. Thus, in septic patients, elevated Gas6 levels were associated with fatal outcome. In mice, Gas6 levels rose in experimental endotoxemia and peritonitis models and correlated with sepsis severity; however, survival of Gas6-/- mice in these models was reduced compared with WT. Gas6 secreted by macrophages in response to LPS activated Akt and restrained p38 and NF-κB activation, thereby dampening macrophage activation. Altogether, these data suggest that, during endotoxemia, the Gas6-/- phenotype resembles that of mice subjected to PI3K inhibition, indicating that Gas6 is a major modulator of innate immunity.

Relevance: 90.00%

Abstract:

Introduction: Fragile X syndrome (FXS) is the most common inherited cause of intellectual disability. With no curative treatment available, current therapeutic approaches are aimed at symptom management. FXS is caused by silencing of the FMR1 gene, which encodes FMRP; loss of FMRP leads to the development of the symptoms associated with FXS. Areas covered: In this evaluation, the authors examine the role of the metabotropic glutamate receptor 5 (mGluR5) in the pathophysiology of FXS and its suitability as a target for rescuing the disease state. Furthermore, the authors review the evidence from preclinical studies of pharmacological interventions targeting mGluR5 in FXS. Lastly, the authors assess the findings from clinical studies in FXS, in particular the use of the Aberrant Behavior Checklist-Community Edition (ABC-C) and the recently developed ABC-C for FXS scale as clinical endpoints to assess disease modification in this patient population. Expert opinion: There is cautious optimism for the successful treatment of the core behavioral and cognitive symptoms of FXS, based on preclinical data in animal models and early studies in humans. However, the association between heightened mGluR5 responsiveness and the clinical phenotype in humans remains to be demonstrated, and many questions regarding the optimal treatment and outcome measures of FXS remain unanswered.

Relevance: 90.00%

Abstract:

For several years, all five medical faculties of Switzerland have been engaged in reforming their training curricula, for two reasons: first, according to a new federal act issued in 2006 by the federal administration, faculties needed to meet international standards in terms of content and pedagogic approaches; second, all Swiss universities, and thus all medical faculties, had to adapt the structure of their curricula to the framework and principles governing the Bologna process. This process is the result of the Bologna Declaration of June 1999, which proposes and requires a series of reforms to make European higher education more compatible and comparable, and more competitive and attractive for European students. The present paper reviews some of the results achieved in the field, focusing on issues such as the shortage of physicians and primary care practitioners; the importance of public health, community medicine and medical humanities; and the implementation of new training approaches, including e-learning and simulation. In the future, faculties should work on several specific challenges: students' mobility, the improvement of students' autonomy and critical thinking as well as their generic and specific skills, and a reflection on how to improve the attractiveness of the academic career for physicians of both sexes.

Relevance: 90.00%

Abstract:

In his timely article, Cherniss offers his vision for the future of "Emotional Intelligence" (EI). However, his goal of clarifying the concept by distinguishing definitions from models, and his support for "Emotional and Social Competence" (ESC) models, will not, in our opinion, advance the field. To be upfront, we agree that emotions are important for effective decision-making, leadership, performance and the like; however, at this time, EI and ESC have not demonstrated incremental validity over and above IQ and personality tests in meta-analyses (Harms & Credé, 2009; Van Rooy & Viswesvaran, 2004). If there is a future for EI, we see it in the ability model of Mayer, Salovey and associates (e.g., Mayer, Caruso, & Salovey, 2000), which detractors and supporters agree holds the most promise (Antonakis, Ashkanasy, & Dasborough, 2009; Zeidner, Roberts, & Matthews, 2008). With its use of quasi-objective scoring measures, the ability model grounds EI in existing frameworks of intelligence, thus differentiating itself from ESC models and their self-rated trait inventories. In fact, we do not see the value of ESC models: they overlap too much with current personality models to offer anything new to science and practice (Zeidner et al., 2008). In this commentary we raise three concerns with Cherniss's suggestions for ESC models: (1) there are important conceptual problems in both the definition of ESC and the distinction of ESC from EI; (2) Cherniss's interpretation of neuroscience findings as supporting the constructs of EI and ESC is outdated; and (3) his interpretation of the famous marshmallow experiment as indicating the existence of ESCs is flawed. Building on the promise of ability models, we conclude by providing suggestions to improve research on EI.

Relevance: 90.00%

Abstract:

There is increasing awareness that the articulation of forensic science and criminal investigation is critical to the resolution of crimes. However, models and methods to support effective collaboration between these partners are still poorly expressed or even lacking. Three propositions are borrowed from crime intelligence methods in order to bridge this gap: (a) the general intelligence process; (b) the analysis of investigative problems along principal perspectives: entities and their relationships, time and space, and quantitative aspects; and (c) visualisation methods as a mode of expressing a problem in these dimensions. Indeed, in a collaborative framework, different kinds of visualisations integrating forensic case data can play a central role in supporting decisions. Among them, link-charts are scrutinised for their ability to structure and ease the analysis of a case by describing how relevant entities are connected. However, designing an informative chart that does not bias the reasoning process is not straightforward. Using visualisation as a catalyst for a collaborative approach integrating forensic data thus calls for better specifications.
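
As a minimal illustration of a link-chart as a data structure, the sketch below builds a small graph of case entities and typed relations using networkx. The entities, relations and timestamps are invented for illustration; the paper itself prescribes no particular library.

```python
# Sketch: a link-chart as a typed graph of case entities and relations.
import networkx as nx

G = nx.MultiDiGraph()  # directed; multiple relations allowed per pair

# Nodes carry an entity type (person, phone, location, trace, ...)
G.add_node("Suspect A", kind="person")
G.add_node("Phone 079x", kind="phone")
G.add_node("Scene 1", kind="location")
G.add_node("DNA trace T1", kind="trace")

# Edges carry the relation and, where known, a time stamp
G.add_edge("Suspect A", "Phone 079x", relation="subscriber")
G.add_edge("Phone 079x", "Scene 1", relation="cell activity",
           when="2012-03-01T22:40")
G.add_edge("DNA trace T1", "Scene 1", relation="collected at")
G.add_edge("DNA trace T1", "Suspect A", relation="matches profile")

# Simple analytical queries on the chart:
print(nx.shortest_path(G.to_undirected(), "Suspect A", "Scene 1"))
print(sorted(G.degree(), key=lambda kv: -kv[1]))  # most-connected entities
```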

Relevance: 90.00%

Abstract:

Objectives: Several population pharmacokinetic (PPK) and pharmacokinetic-pharmacodynamic (PK-PD) analyses have been performed with the anticancer drug imatinib. Inspired by the approach of meta-analysis, we aimed to compare and combine results from published studies in a useful way, in particular for improving the clinical interpretation of imatinib concentration measurements in the scope of therapeutic drug monitoring (TDM). Methods: Original PPK analyses and PK-PD studies (PK surrogate: trough concentration Cmin; PD outcomes: optimal early response and specific adverse events) were searched systematically on MEDLINE. From each identified PPK model, a predicted concentration distribution under standard dosage was derived through 1000 simulations (NONMEM), after standardizing model parameters to common covariates. A "reference range" was calculated from the pooled simulated concentrations in a semi-quantitative approach (without specific weighting) over the whole dosing interval. Meta-regression summarized the relationships between Cmin and optimal versus suboptimal early treatment response. Results: Nine PPK models and six relevant PK-PD reports in CML patients were identified. Model-based predicted median Cmin ranged from 555 to 1388 ng/ml (grand median: 870 ng/ml; inter-quartile range: 520-1390 ng/ml). The probability of achieving optimal early response was predicted to increase from 60% to 85% between Cmin values of 520 and 1390 ng/ml across PK-PD studies (odds ratio for a doubling of Cmin: 2.7). Reporting of specific adverse events was too heterogeneous to perform a regression analysis; the frequencies of anemia, rash and fluid retention nonetheless increased consistently with Cmin, though less steeply than response probability. Conclusions: Predicted drug exposure may differ substantially between PPK analyses; in this review, heterogeneity was mainly attributable to two "outlying" models. The established reference range seems to cover the range where both good efficacy and acceptable tolerance are expected for most patients. TDM-guided dose adjustment therefore appears justified for imatinib in CML patients; its usefulness now remains to be prospectively validated in a randomized trial.
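
The following sketch illustrates the pooling logic on placeholder numbers: per-model Cmin distributions are simulated as lognormals, pooled into a reference range, and the reported odds ratio of 2.7 per doubling of Cmin is expressed as a logistic curve anchored at the abstract's 60% response probability. The per-model medians and variabilities are invented, not the published estimates.

```python
# Sketch: pooling simulated Cmin distributions and expressing the
# Cmin-response relation reported in the abstract as a logistic curve.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-model median Cmin (ng/ml) and between-patient CV
models = [(600, 0.45), (900, 0.40), (1100, 0.50), (800, 0.35)]
pooled = np.concatenate([
    rng.lognormal(mean=np.log(med),
                  sigma=np.sqrt(np.log(1 + cv**2)), size=1000)
    for med, cv in models
])
q25, q50, q75 = np.percentile(pooled, [25, 50, 75])
print(f"pooled median {q50:.0f} ng/ml, IQR {q25:.0f}-{q75:.0f} ng/ml")

# Logistic curve calibrated so that P(optimal response) = 60% at the
# lower IQR bound, with OR = 2.7 per doubling of Cmin (abstract values)
beta = np.log(2.7)                         # per log2(Cmin) unit
alpha = np.log(0.6 / 0.4) - beta * np.log2(q25)
for cmin in (q25, q50, q75):
    p = 1 / (1 + np.exp(-(alpha + beta * np.log2(cmin))))
    print(f"Cmin {cmin:.0f} ng/ml -> P(optimal response) ~ {p:.2f}")
```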

Relevance: 90.00%

Abstract:

OBJECTIVES: The aim of the study was to assess whether prospective follow-up data from the Swiss HIV Cohort Study can be used to predict which patients will stop smoking and, among smokers who stop, which will start smoking again. METHODS: We built prediction models first using clinical reasoning ('clinical models') and then by selecting from numerous candidate predictors using advanced statistical methods ('statistical models'). Our clinical models were based on literature suggesting that motivation drives smoking cessation, while dependence drives relapse in those attempting to stop. Our statistical models were based on automatic variable selection using additive logistic regression with component-wise gradient boosting. RESULTS: Of 4833 smokers, 26% stopped smoking, at least temporarily; among those who stopped, 48% started smoking again. The predictive performance of our clinical and statistical models was modest. A basic clinical model for cessation, with patients classified into three motivational groups, was nearly as discriminatory as a constrained statistical model with just the most important predictors (the ratio of nonsmoking visits to total visits, alcohol or drug dependence, psychiatric comorbidities, recent hospitalization and age). A basic clinical model for relapse, based on the maximum number of cigarettes per day prior to stopping, was not as discriminatory as a constrained statistical model with just the ratio of nonsmoking visits to total visits. CONCLUSIONS: Predicting smoking cessation and relapse is difficult, so simple models are nearly as discriminatory as complex ones. Patients with a history of attempting to stop and those known to have stopped recently are the best candidates for an intervention.
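
Component-wise gradient boosting performs variable selection by fitting one simple base learner per covariate at each iteration and updating only the best-fitting one; covariates never selected keep a zero coefficient. The study used additive logistic regression with this approach (as implemented, for example, in R's mboost); the toy version below, with linear base learners and synthetic data, only illustrates the principle.

```python
# Sketch: component-wise gradient boosting for a binary outcome.
import numpy as np

def componentwise_boost(X, y, n_iter=200, nu=0.1):
    n, p = X.shape
    f = np.zeros(n)                      # current additive predictor
    coefs = np.zeros(p)
    for _ in range(n_iter):
        prob = 1 / (1 + np.exp(-f))
        grad = y - prob                  # negative gradient of logistic loss
        # Fit each covariate alone by least squares; keep the best one
        betas = X.T @ grad / (X**2).sum(axis=0)
        scores = [np.sum((grad - X[:, j] * betas[j])**2) for j in range(p)]
        j = int(np.argmin(scores))
        coefs[j] += nu * betas[j]        # only the selected covariate updates
        f += nu * betas[j] * X[:, j]
    return coefs                         # zero coefficient = never selected

# Synthetic example: only 2 of 10 candidate predictors matter
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 10))
p_true = 1 / (1 + np.exp(-(1.5 * X[:, 0] - 1.0 * X[:, 3])))
y = (rng.random(500) < p_true).astype(float)
print(componentwise_boost(X, y).round(2))
```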

Relevance: 90.00%

Abstract:

On December 4th 2007, a 3-Mm3 landslide occurred along the northwestern shore of Chehalis Lake. The initiation zone is located at the intersection of the main valley slope and the northern sidewall of a prominent gully. The slope failure caused a displacement wave that ran up to 38 m on the opposite shore of the lake. The landslide is temporally associated with a rain-on-snow meteorological event which is thought to have triggered it. This paper describes the Chehalis Lake landslide and presents a comparison of discontinuity orientation datasets obtained using three techniques: field measurements, terrestrial photogrammetric 3D models and an airborne LiDAR digital elevation model to describe the orientation and characteristics of the five discontinuity sets present. The discontinuity orientation data are used to perform kinematic, surface wedge limit equilibrium and three-dimensional distinct element analyses. The kinematic and surface wedge analyses suggest that the location of the slope failure (intersection of the valley slope and a gully wall) has facilitated the development of the unstable rock mass which initiated as a planar sliding failure. Results from the three-dimensional distinct element analyses suggest that the presence, orientation and high persistence of a discontinuity set dipping obliquely to the slope were critical to the development of the landslide and led to a failure mechanism dominated by planar sliding. The three-dimensional distinct element modelling also suggests that the presence of a steeply dipping discontinuity set striking perpendicular to the slope and associated with a fault exerted a significant control on the volume and extent of the failed rock mass but not on the overall stability of the slope.
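
As background to the kinematic and limit equilibrium analyses mentioned above, the sketch below shows the standard feasibility test for planar sliding on a discontinuity set, together with the dry, cohesionless factor of safety FS = tan(phi)/tan(psi). The angles used are illustrative, not the Chehalis Lake values.

```python
# Sketch: kinematic feasibility test and dry planar-sliding factor of safety.
import math

def planar_sliding_feasible(slope_dip, slope_dipdir, joint_dip, joint_dipdir,
                            friction_angle, lateral_limit=20.0):
    """Standard kinematic test for planar sliding on a single set."""
    daylights = joint_dip < slope_dip             # plane daylights in the face
    steep_enough = joint_dip > friction_angle     # can overcome friction
    # Dip directions must agree within the lateral limit (degrees)
    aligned = abs((joint_dipdir - slope_dipdir + 180) % 360 - 180) <= lateral_limit
    return daylights and steep_enough and aligned

def fs_planar_dry(joint_dip, friction_angle):
    """Dry, cohesionless limit equilibrium: FS = tan(phi) / tan(psi)."""
    return math.tan(math.radians(friction_angle)) / math.tan(math.radians(joint_dip))

# Hypothetical slope at 65/270 with a joint set at 40/260, phi = 30 degrees
print(planar_sliding_feasible(65, 270, 40, 260, 30))   # True: sliding possible
print(round(fs_planar_dry(40, 30), 2))                 # FS < 1 -> unstable
```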

Relevance: 90.00%

Abstract:

Occupational exposure assessment is an important stage in the management of chemical exposures. Few direct measurements are carried out in workplaces, and exposures are often estimated on the basis of expert judgement. There is therefore a major need for simple, transparent tools to help occupational health specialists define exposure levels. The aim of the present research is to develop and improve modelling tools for predicting exposure levels. In a first step, a survey was conducted among occupational hygienists in Switzerland to define their expectations of modelling tools (types of results, models and potentially observable parameters). It was found that exposure models are rarely used in practice in Switzerland, exposures being mainly estimated from the past experience of the expert. Moreover, chemical emissions and their dispersion near the source were considered key parameters. Experimental and modelling studies were also performed in specific cases in order to test the flexibility and drawbacks of existing tools. In particular, predictive models were applied to assess occupational exposure to carbon monoxide in different situations, and the results were compared with the exposure levels reported in the literature for similar situations. Further, exposure to waterproofing sprays was assessed as part of an epidemiological study on a Swiss cohort. In this case, laboratory investigations were undertaken to characterize the emission rate of waterproofing overspray, and a classical two-zone model was then used to assess aerosol dispersion in the near and far field during spraying. Experiments were also carried out to gain a better understanding of the processes of emission and dispersion of tracer compounds, focusing on the characterization of near-field exposure. An experimental set-up was developed to perform simultaneous measurements at several points of an exposure chamber with direct-reading instruments. It was found that, from a statistical point of view, the compartmental theory makes sense, although the attribution to a given compartment could not be made on the basis of simple geometric considerations. In a further step, the experimental data were complemented by observations made in about 100 different workplaces, associating exposure measurements with information on observed determinants. The various data obtained were used to improve the two-zone exposure model: a tool was developed to include specific determinants in the choice of the compartment, thus largely improving the reliability of the predictions. All these investigations helped improve our understanding of modelling tools and identify their limitations. The integration of determinants better suited to experts' needs should encourage the use of such tools in field practice. Moreover, by increasing the quality of modelling tools, this research will not only encourage their systematic use, but may also improve exposure assessment based on expert judgement and, therefore, the protection of workers' health.
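
The classical two-zone model referred to above treats a small, well-mixed near field around the source that exchanges air with the rest of the room (the far field). A minimal sketch of its mass-balance equations follows; the emission rate, volumes and flows are illustrative values, not the study's.

```python
# Sketch: classical two-zone (near-field/far-field) exposure model.
import numpy as np
from scipy.integrate import solve_ivp

G = 50.0                   # emission rate, mg/min (hypothetical)
beta = 5.0                 # near-field/far-field air exchange, m3/min
Q = 20.0                   # room supply/exhaust flow, m3/min
V_nf, V_ff = 1.0, 100.0    # near- and far-field volumes, m3

def two_zone(t, c):
    c_nf, c_ff = c
    dc_nf = (G + beta * (c_ff - c_nf)) / V_nf          # near-field balance
    dc_ff = (beta * (c_nf - c_ff) - Q * c_ff) / V_ff   # far-field balance
    return [dc_nf, dc_ff]

sol = solve_ivp(two_zone, (0, 60), [0.0, 0.0], t_eval=np.linspace(0, 60, 7))
for t, c_nf, c_ff in zip(sol.t, *sol.y):
    print(f"t={t:4.0f} min  near field {c_nf:7.2f}  far field {c_ff:6.2f} mg/m3")

# Steady state for comparison: C_ff = G/Q and C_nf = G/Q + G/beta
print("steady state:", G / Q + G / beta, G / Q)
```

The structure explains the finding that near-field concentrations exceed far-field ones by the term G/beta, which depends on the source-to-worker exchange rather than on room ventilation.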