175 results for Media Events


Relevance: 20.00%

Abstract:

Density-driven instabilities arise in a wide variety of flows. One example is the geological sequestration of carbon dioxide in porous media: the gas is injected at high pressure into deep saline aquifers, and the density difference between brine saturated with dissolved CO2 and the surrounding brine induces favourable currents that transport it towards deep geological layers. Density gradients can also cause the undesired transport of toxic substances, which may eventually lead to soil and groundwater pollution. The range of scales involved in this type of phenomenon is very wide, extending from the pore scale, at which the instabilities grow, up to the aquifer scale, at which long-term processes take place. A reliable numerical reproduction of the physics therefore remains a challenge because of the multiscale character of these phenomena in both space and time, and it requires the development of efficient algorithms and the use of modern computational tools. In combination with iterative solvers, multiscale methods make it possible to solve large systems of algebraic equations efficiently. These methods were introduced as upscaling and downscaling techniques for the simulation of flow in porous media in order to handle strong heterogeneities of the permeability field. The principle rests on the parallel use of two grids: the first is chosen according to the resolution of the permeability field (fine grid), while the second (coarse grid) is used to approximate the fine-scale problem at lower cost. The quality of the multiscale solution can be improved iteratively to prevent large errors when the permeability field is complex, and adaptive methods that restrict the update procedures to regions with strong gradients limit the additional computational cost. In the case of density-driven instabilities, the scale of the phenomena changes over time, so adaptive multiscale methods are required to account for these dynamics. The objective of this thesis is to develop efficient adaptive multiscale algorithms for the simulation of density-driven instabilities. To this end, we build on the Multiscale Finite-Volume (MsFV) method, which has the advantage of allowing transport phenomena to be solved while conserving mass exactly. In the first part, we demonstrate that the approximations of the MsFV method generate unphysical fingering whose suppression requires iterative correction operations; the additional computational cost of these operations can, however, be offset by adaptive methods. We also propose using the MsFV method as a downscaling technique: the coarse grid is used in zones where the flow is relatively homogeneous, while the finer grid is used to resolve strong gradients. In the second part, the multiscale method is extended to an arbitrary number of levels, and we show that the generalised method remains efficient for the solution of large systems of algebraic equations.
In the last part, we focus on the scales that determine the evolution of density-driven instabilities. Identifying the local as well as the global structure of the flow makes it possible to upscale the instabilities at late times while preserving the small-scale structures during the onset of the instability. The results presented in this work extend the understanding of MsFV methods and offer efficient multiscale formulations for the simulation of density-driven instabilities.

Density-driven instabilities in porous media are of interest for a wide range of applications, for instance, for geological sequestration of CO2, during which CO2 is injected at high pressure into deep saline aquifers. Due to the density difference between the CO2-saturated brine and the surrounding brine, a downward migration of CO2 into deeper regions, where the risk of leakage is reduced, takes place. Similarly, undesired spontaneous mobilization of potentially hazardous substances that might endanger groundwater quality can be triggered by density differences. In recent years, these effects have been investigated with the help of numerical groundwater models. Major challenges in simulating density-driven instabilities arise from the different scales of interest involved, i.e., the scale at which instabilities are triggered and the aquifer scale over which long-term processes take place. An accurate numerical reproduction is possible only if the finest scale is captured. For large aquifers, this leads to problems with a large number of unknowns. Advanced numerical methods are required to efficiently solve these problems with today's available computational resources. Besides efficient iterative solvers, multiscale methods are available to solve large numerical systems. Originally, multiscale methods were developed as upscaling-downscaling techniques to resolve strong permeability contrasts. In this case, two static grids are used: one is chosen with respect to the resolution of the permeability field (fine grid); the other (coarse grid) is used to approximate the fine-scale problem at low computational cost. The quality of the multiscale solution can be iteratively improved to avoid large errors in case of complex permeability structures. Adaptive formulations, which restrict the iterative update to domains with large gradients, make it possible to limit the additional computational costs of the iterations. In the case of density-driven instabilities, additional spatial scales appear which change with time. Flexible adaptive methods are required to account for these emerging dynamic scales. The objective of this work is to develop an adaptive multiscale formulation for the efficient and accurate simulation of density-driven instabilities. We consider the Multiscale Finite-Volume (MsFV) method, which is well suited for simulations including the solution of transport problems as it guarantees a conservative velocity field. In the first part of this thesis, we investigate the applicability of the standard MsFV method to density-driven flow problems. We demonstrate that approximations in the MsFV method may trigger unphysical fingers, so that iterative corrections are necessary. Adaptive formulations (e.g., limiting a refined solution to domains with large concentration gradients where fingers form) can be used to balance the extra costs.
We also propose to use the MsFV method as a downscaling technique: the coarse discretization is used in areas without significant change in the flow field, whereas the problem is refined in the zones of interest. This makes it possible to account for the dynamic change in scales of density-driven instabilities. In the second part of the thesis, the MsFV algorithm, which originally employs one coarse level, is extended to an arbitrary number of coarse levels. We prove that this keeps the MsFV method efficient for problems with a large number of unknowns. In the last part of this thesis, we focus on the scales that control the evolution of density fingers. The identification of local and global flow patterns allows a coarse description at late times while conserving fine-scale details during the onset stage. Results presented in this work advance the understanding of the Multiscale Finite-Volume method and offer efficient dynamic multiscale formulations to simulate density-driven instabilities.

Aquifers characterised by porous structures and highly permeable fractures are of particular interest to hydrogeologists and environmental engineers. A wide variety of flows can be observed in these media, the most common being contaminant transport by groundwater, reactive transport, and the simultaneous flow of several immiscible phases, such as oil and water. The scale that characterises these flows is set by the interaction of geological heterogeneity and physical processes. A fluid at rest in the pore space of a porous medium can be destabilised by density gradients, which may be induced by local temperature changes or by the dissolution of a chemical compound. Density-driven instabilities are of particular interest because they can ultimately compromise water quality; a striking example is the salinisation of fresh groundwater through the penetration of denser salt water into deep regions. For density-driven flows, the characteristic scales range from the pore scale, at which the instabilities grow, up to the aquifer scale, at which long-term processes take place. Since in-situ investigations are practically impossible, numerical models are used to predict and assess the risks associated with density-driven instabilities. A correct description of these phenomena relies on resolving all the scales of the flow, which may span eight to ten orders of magnitude for large aquifers. This leads to very large numerical problems that are expensive to solve, and sophisticated numerical schemes are therefore needed to perform accurate simulations of large-scale hydrodynamic instabilities. In this work, we present several numerical methods that allow density-driven instabilities to be simulated efficiently and accurately. These new methods are based on the Multiscale Finite-Volume framework: the idea is to map the original problem onto a larger scale at which it is cheaper to solve, and then to lift the coarse solution back to the original scale.
This technique is particularly well suited to problems in which a wide range of scales is involved and evolves in space and time. It reduces computational costs by limiting the detailed description of the problem to the regions containing a moving concentration front. The achievements are illustrated by simulations of phenomena such as salt-water intrusion and carbon dioxide sequestration.
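The two-grid idea running through this abstract (solve a cheap coarse problem, map the result back to the fine grid, and correct adaptively only where errors are large) can be illustrated with a toy 1D pressure problem. The sketch below is a generic illustration of that idea, not the MsFV algorithm itself, which constructs local basis functions and guarantees conservative coarse fluxes; the grid sizes, permeability field and residual threshold are invented.

```python
# Illustrative two-grid sketch (NOT the MsFV algorithm): coarse solve of a 1D
# heterogeneous pressure equation, prolongation to the fine grid, and adaptive
# local correction where the fine-scale residual is large. All parameters are
# made up for illustration.
import numpy as np

def assemble(perm, h):
    """Finite-volume matrix for -d/dx(k(x) dp/dx) with p = 0 at both ends."""
    n = len(perm)
    t = 2.0 * perm[:-1] * perm[1:] / (perm[:-1] + perm[1:]) / h**2  # face transmissibilities
    A = np.zeros((n, n))
    for i in range(n):
        tl = t[i - 1] if i > 0 else 2.0 * perm[0] / h**2      # left (boundary) face
        tr = t[i] if i < n - 1 else 2.0 * perm[-1] / h**2     # right (boundary) face
        A[i, i] = tl + tr
        if i > 0:
            A[i, i - 1] = -tl
        if i < n - 1:
            A[i, i + 1] = -tr
    return A

nf, ratio = 200, 10                         # fine cells, coarsening ratio
nc = nf // ratio
hf, hc = 1.0 / nf, 1.0 / nc
k_fine = 10.0 ** np.random.default_rng(0).normal(0.0, 1.0, nf)  # log-normal permeability
f = np.ones(nf)                             # unit source term

# coarse problem with harmonically upscaled permeability
k_coarse = ratio / np.add.reduceat(1.0 / k_fine, np.arange(0, nf, ratio))
p_coarse = np.linalg.solve(assemble(k_coarse, hc), np.ones(nc))

# prolongation: interpolate the coarse solution onto fine cell centres
xf = (np.arange(nf) + 0.5) * hf
xc = (np.arange(nc) + 0.5) * hc
p_ms = np.interp(xf, xc, p_coarse)

# adaptive correction: Jacobi-type updates only where the fine residual is large
A = assemble(k_fine, hf)
d = np.diag(A)
for _ in range(50):
    r = f - A @ p_ms
    active = np.abs(r) > 0.05 * np.abs(r).max()
    p_ms[active] += r[active] / d[active]

p_ref = np.linalg.solve(A, f)
print("relative error of the corrected multiscale approximation:",
      np.linalg.norm(p_ms - p_ref) / np.linalg.norm(p_ref))
```

Restricting the correction to "active" cells mimics the adaptive update described in the abstract: most of the domain keeps the cheap coarse approximation, and only regions with strong gradients pay the fine-scale cost.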

Relevance: 20.00%

Abstract:

BACKGROUND: The Anaesthesia Databank Switzerland (ADS) is a voluntary data registry introduced in 1996. Its ultimate goal is to promote quality in anaesthesiology. METHODS: The ADS registry analyses routinely recorded adverse events and provides benchmark comparisons between anaesthesia departments. Data collection comprises a set of 31 variables organised into three modules, one mandatory and two optional. RESULTS: In 2010, the database included 2,158,735 anaesthetic procedures. Over time, the proportion of older patients has increased, the largest group being aged 50-64 years. The percentage of patients with American Society of Anesthesiologists (ASA) status 1 has decreased, while the percentage of ASA status 2 or 3 patients has increased. The most frequent comorbidities recorded were hypertension (21%), smoking (16%), allergy (15%) and obesity (12%). Between 1996 and 2010, 125,579 adverse events were recorded, of which 34% were cardiovascular, 7% respiratory, 39% technical and 20% non-specific. The most severe events were resuscitation (50%), oliguria (22%), myocardial ischaemia (17%) and haemorrhage (10%). CONCLUSION: Routine ADS data collection contributes to the monitoring of trends in anaesthesia care in Switzerland. The ADS system has proved usable in daily practice, although this remains a constant challenge that depends heavily on local quality management and quality culture. Nevertheless, establishing routine, regular feedback to users to initiate discussions about anaesthetic events would most likely help strengthen departmental culture regarding safety and quality of care.

Relevance: 20.00%

Abstract:

Serum-free aggregating brain cell cultures are free-floating, three-dimensional primary cell cultures able to spontaneously reconstitute a histotypic brain architecture, reproduce critical steps of brain development, and reach a high level of structural and functional maturity. This culture system therefore offers a unique model for neurotoxicity testing both during development and at advanced stages of cellular differentiation, and the high number of aggregates available, combined with the excellent reproducibility of the cultures, facilitates routine test procedures. This chapter presents a detailed description of the preparation, maintenance, and use of these cultures for neurotoxicity studies, together with a comparison of the developmental characteristics of cultures derived from the telencephalon and cultures derived from the whole brain. For culture preparation, mechanically dissociated embryonic brain tissue is used. The initial cell suspension, composed of neural stem cells, neural progenitor cells, immature postmitotic neurons, glioblasts, and microglial cells, is kept in a serum-free, chemically defined medium under continuous gyratory agitation. Spherical aggregates form spontaneously and are maintained in suspension culture for several weeks. Within the aggregates, the cells rearrange and mature, reproducing critical morphogenic events such as migration, proliferation, differentiation, synaptogenesis, and myelination. For experimentation, replicate cultures are prepared by the randomization of aggregates from several original flasks. The high yield and reproducibility of the cultures enable multiparametric endpoint analyses, including "omics" approaches.

Relevance: 20.00%

Abstract:

This paper analyses the media coverage of parental leave policies (parental and paternity leaves) in Swiss French-speaking press articles from 1999 to 2009. Switzerland is one of the few European countries with no statutory parental or paternity leave. The aim is to describe the mediatisation of these policies and to analyse the arguments for and against their implementation. We investigate the status of a fertility frame - the mobilisation of discourse relating to fertility issues - among the various arguments used to justify or reject parental leave policies. We conduct a content analysis of 579 press articles, as well as a frame analysis of a subset in which parental leave policies are the central theme (N=206). Results show that paternity leave is the predominant public issue addressed in the dataset. A mediatisation peak was reached in 2007, following an initiative by a member of the Federal executive to introduce a short paternity leave. Parental leave policies are predominantly represented in a positive light. The main positive frame is economic, in which leaves are represented as serving the interests of companies. Involved fatherhood and gender equality are also frequently mentioned as positive frames. The fertility frame is only moderately used in articles covering Swiss news on paternity leave. Conversely, the fertility frame is largely mobilised in articles covering parental leave in other countries. We discuss possible interpretations of this discrepancy and suggest future avenues of research on parental leave policies in Switzerland.

Relevance: 20.00%

Abstract:

BACKGROUND AND PURPOSE: There is no strong evidence that all ischaemic stroke types are associated with high cardiovascular risk. Our aim was to investigate whether this is the case. METHODS: All consecutive patients with ischaemic stroke registered in the Athens Stroke Registry between 1 January 1993 and 31 December 2010 were categorized according to the TOAST classification and were followed up for up to 10 years. Outcomes assessed were cardiovascular and all-cause mortality, myocardial infarction, stroke recurrence, and a composite cardiovascular outcome consisting of myocardial infarction, angina pectoris, acute heart failure, sudden cardiac death, stroke recurrence and aortic aneurysm rupture. The Kaplan-Meier product-limit method was used to estimate the probability of each end-point in each patient group. Cox proportional hazards models were used to determine the independent covariates of each end-point. RESULTS: Two thousand seven hundred and thirty patients were followed up for 48.1 ± 41.9 months. The cumulative probabilities of 10-year cardiovascular mortality in patients with cardioembolic stroke [46.6%, 95% confidence interval (CI) 40.6-52.8], lacunar stroke (22.1%, 95% CI 16.2-28.0) or undetermined stroke (35.2%, 95% CI 27.8-42.6) were either similar to or higher than those of patients with large-artery atherosclerotic stroke (LAA) (28.7%, 95% CI 22.4-35.0). Compared with LAA, all other TOAST types had a higher probability of 10-year stroke recurrence. In Cox proportional hazards analysis, compared with patients with LAA, patients with any other stroke type had a similar or higher risk for the outcomes of overall mortality, cardiovascular mortality, stroke recurrence and the composite cardiovascular outcome. CONCLUSIONS: Large-artery atherosclerotic stroke and cardioembolic stroke are associated with the highest risk for future cardiovascular events, with the latter carrying at least as high a risk as LAA stroke.
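The cumulative 10-year probabilities quoted above come from the Kaplan-Meier product-limit method. The following is a minimal sketch of that estimator on invented follow-up data; in the study, `time` would be months to the event or to censoring and `event` would flag whether the end-point (e.g., cardiovascular death) occurred.

```python
# Minimal Kaplan-Meier product-limit estimator on toy data.
import numpy as np

def kaplan_meier(time, event):
    """Return distinct event times and the survival probability S(t) just after each."""
    time, event = np.asarray(time, float), np.asarray(event, bool)
    event_times = np.unique(time[event])       # distinct times at which events occurred
    surv, s = [], 1.0
    for t in event_times:
        at_risk = np.sum(time >= t)            # still under observation at t
        deaths = np.sum((time == t) & event)   # events exactly at t
        s *= 1.0 - deaths / at_risk            # product-limit update
        surv.append(s)
    return event_times, np.array(surv)

# toy follow-up times (months) and event indicators (1 = event, 0 = censored)
time  = [6, 12, 18, 24, 30, 36, 48, 60, 60, 72]
event = [1,  0,  1,  0,  1,  0,  1,  0,  1,  0]
t, s = kaplan_meier(time, event)
for ti, si in zip(t, s):
    print(f"t = {ti:5.0f} months   S(t) = {si:.3f}   cumulative incidence = {1 - si:.3f}")
```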

Relevance: 20.00%

Abstract:

OBJECTIVE: To assess the prevalence of cardiovascular (CV) risk factors in Seychelles, a middle-income African country, and compare the cost-effectiveness of single-risk-factor management (treating individuals with arterial blood pressure ≥ 140/90 mmHg and/or total serum cholesterol ≥ 6.2 mmol/l) with that of management based on total CV risk (treating individuals with a total CV risk ≥ 10% or ≥ 20%). METHODS: CV risk factor prevalence and a CV risk prediction chart for Africa were used to estimate the 10-year risk of suffering a fatal or non-fatal CV event among individuals aged 40-64 years. These figures were used to compare single-risk-factor management with total risk management in terms of the number of people requiring treatment to avert one CV event and the number of events potentially averted over 10 years. Treatment for patients with high total CV risk (≥ 20%) was assumed to consist of a fixed-dose combination of several drugs (polypill). Cost analyses were limited to medication. FINDINGS: A total CV risk of ≥ 10% and ≥ 20% was found among 10.8% and 5.1% of individuals, respectively. With single-risk-factor management, 60% of adults would need to be treated and 157 cardiovascular events per 100 000 population would be averted per year, as opposed to 5% of adults and 92 events with total CV risk management. Management based on high total CV risk optimizes the balance between the number requiring treatment and the number of CV events averted. CONCLUSION: Total CV risk management is much more cost-effective than single-risk-factor management. These findings are relevant for all countries, but especially for those economically and demographically similar to Seychelles.
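The trade-off described above (fraction of adults treated versus events averted per 100 000 population per year) can be made concrete with back-of-the-envelope arithmetic. The sketch below uses assumed mean risk levels and an assumed relative risk reduction, so the numbers it prints are illustrative and are not the study's figures.

```python
# Illustrative arithmetic for the two treatment strategies; all risk levels
# and the relative risk reduction are assumptions, not values from the study.
population = 100_000

strategies = {
    # fraction of adults treated, assumed mean 10-year CV risk among those treated
    "single risk factor (BP/cholesterol thresholds)": (0.60, 0.08),
    "total CV risk >= 20% (polypill)":                (0.05, 0.25),
}
relative_risk_reduction = 0.50   # assumed treatment effect

for name, (treated_fraction, mean_10y_risk) in strategies.items():
    treated = treated_fraction * population
    events_averted_per_year = treated * mean_10y_risk * relative_risk_reduction / 10
    treated_per_event_averted = treated / events_averted_per_year
    print(f"{name}:")
    print(f"  adults treated            : {treated:8.0f}")
    print(f"  events averted per year   : {events_averted_per_year:8.0f}")
    print(f"  treated per event averted : {treated_per_event_averted:8.0f}")
```

Because the high-total-risk group is small but carries most of the absolute risk, far fewer people need to be treated per event averted, which is the balance the abstract's conclusion points to.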

Relevance: 20.00%

Abstract:

Objective: To investigate the association between common carotid artery intima-media thickness (cIMT) and exposure to secondhand smoke (SHS) in children. Methods: Data were available at baseline in the Quebec Adiposity and Lifestyle Investigation in Youth (QUALITY) study, an ongoing longitudinal investigation of Caucasian children aged 8-10 years at cohort inception, each of whom had at least one obese parent. Data on exposure to smoking by parents, siblings and friends were collected in an interviewer-administered child questionnaire and a self-report parent questionnaire. Blood cotinine was measured with a high-sensitivity ELISA. cIMT was measured by ultrasound. The association between blood cotinine and cIMT was investigated in multivariable linear regression analyses controlling for age, body mass index, and child smoking status. Results: Mean (SD) cIMT (0.5803 (0.04602)) did not differ by age or sex. Overall, 26%, 6% and 3% of children were exposed to smoking by parents, siblings and friends, respectively. Cotinine ranged from 0.13 ng/ml to 7.38 ng/ml (median (IQR) = 0.18 ng/ml). In the multivariable analysis, a 1 ng/ml increase in cotinine was associated with a 0.090 mm increase in cIMT (p = 0.034). Conclusion: In children as young as 8-10 years of age, exposure to SHS is related to cIMT, a marker of pre-clinical atherosclerosis. Given the wide range of health effects of SHS, increased public health efforts are needed to reduce exposure among children in homes and private vehicles.
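The adjusted association reported above comes from a multivariable linear regression of cIMT on cotinine with age, body mass index and child smoking status as covariates. Below is a minimal sketch of that model form on simulated data; the data generation and the assumed effect size are invented, only the structure of the regression follows the abstract.

```python
# Sketch of a multivariable linear regression of cIMT on cotinine, adjusted
# for age, BMI and smoking status, on simulated data.
import numpy as np

rng = np.random.default_rng(1)
n = 500
age      = rng.uniform(8, 10, n)                 # years
bmi      = rng.normal(19, 3, n)                  # kg/m^2
smoker   = rng.binomial(1, 0.02, n)              # child smoking status (0/1)
cotinine = rng.exponential(0.3, n)               # ng/ml

# simulated outcome with an assumed cotinine slope of 0.09 mm per ng/ml
cimt = (0.55 + 0.002 * age + 0.001 * bmi + 0.09 * cotinine
        + rng.normal(0, 0.04, n))

X = np.column_stack([np.ones(n), cotinine, age, bmi, smoker])
beta, *_ = np.linalg.lstsq(X, cimt, rcond=None)
print(f"adjusted cIMT change per 1 ng/ml cotinine: {beta[1]:.3f} mm")
```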

Relevance: 20.00%

Abstract:

PURPOSE: To retrospectively assess the frequency of adverse events related to percutaneous preoperative portal vein embolization (PPVE). MATERIALS AND METHODS: The institutional review board did not require approval or patient informed consent for this study. Adverse events that occurred during PPVE or before the planned hepatic surgery was performed or cancelled were retrospectively obtained from clinical, imaging, and laboratory data files in 188 patients (109 male and 79 female patients; mean age, 60 years; range, 16-78 years). Liver resection was planned for metastases (n = 137), hepatocarcinoma (n = 31), cholangiocarcinoma (n = 15), fibrolamellar hepatoma (n = 1), and benign disease (n = 4). PPVE was performed with a single-lumen 5-F catheter and a contralateral approach, with n-butyl cyanoacrylate mixed with iodized oil as the main embolic agent. The rate of complications in patients with cirrhosis was compared with that in patients without cirrhosis by using the chi-square test. RESULTS: Adverse events occurred in 24 (12.8%) of 188 patients, including 12 complications and 12 incidental imaging findings. Complications included thrombosis of the portal vein feeding the future remnant liver (n = 1); migration of emboli into the portal vein feeding the future remnant liver, which necessitated angioplasty (n = 2); hemoperitoneum (n = 1); rupture of a metastasis in the gallbladder (n = 1); transitory hemobilia (n = 1); and transient liver failure (n = 6). Incidental findings were migration of small emboli into nontargeted portal branches (n = 10) and subcapsular hematoma (n = 2). Among the 187 patients in whom PPVE was technically successful, there was a significant difference (P < .001) in the occurrence of liver failure after PPVE between patients with cirrhosis (five of 30) and those without (one of 157). Sixteen liver resections were cancelled owing to cancer progression (n = 12), insufficient hypertrophy of the nonembolized liver (n = 3), or complete portal thrombosis (n = 1). CONCLUSION: PPVE is a safe adjuvant technique for inducing hypertrophy when the anticipated liver reserve is initially insufficient. Post-PPVE transient liver failure is more common in patients with cirrhosis than in those without.
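The group comparison reported above (post-PPVE transient liver failure in 5 of 30 patients with cirrhosis versus 1 of 157 without) can be reproduced with a standard contingency-table test. The abstract reports a chi-square test; a Fisher exact test is shown alongside for comparison, since the counts are small.

```python
# Contingency-table tests for the cirrhosis vs. no-cirrhosis comparison.
from scipy.stats import chi2_contingency, fisher_exact

#              liver failure, no liver failure
table = [[5, 30 - 5],       # patients with cirrhosis
         [1, 157 - 1]]      # patients without cirrhosis

chi2, p_chi2, dof, expected = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)
print(f"chi-square test : chi2 = {chi2:.2f}, p = {p_chi2:.4f}")
print(f"Fisher exact    : OR = {odds_ratio:.1f}, p = {p_fisher:.4f}")
```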

Relevance: 20.00%

Abstract:

An Actively Heated Fiber Optics (AHFO) method to estimate soil moisture is tested and its analysis technique improved upon. The measurements were performed in a lysimeter uniformly packed with loam soil with variable water content profiles. In the first meter of the soil profile, 30 m of fiber optic cable were installed in a coil of 12 loops. The metal sheath armoring the fiber cable was used as an electrical resistance heater to generate a heat pulse, and the soil response was monitored with a Distributed Temperature Sensing (DTS) system. We study the cooling following three continuous heat pulses of 120 s at 36 W m⁻¹ by means of the long-time approximation of radial heat conduction. The soil volumetric water contents were then inferred from the estimated thermal conductivities through a specifically calibrated model relating thermal conductivity to volumetric water content. To use the pre-asymptotic data, we employed a time correction that allowed the volumetric water content to be estimated with a precision of 0.01-0.035 m³ m⁻³. A comparison of the AHFO measurements with soil-moisture measurements obtained with calibrated capacitance-based probes gave good agreement for wetter soils (the discrepancy between the two methods was less than 0.04 m³ m⁻³). In the shallow, drier soils, the AHFO method underestimated the volumetric water content because of the longer time required for the temperature increment to become asymptotic in less thermally conductive media (the discrepancy between the two methods was larger than 0.1 m³ m⁻³). The present work suggests that future applications of the AHFO method should use longer heat pulses, analyse longer heating and cooling events, and, ideally, measure temperature increments at higher frequency.
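The cooling-phase analysis mentioned above can be sketched with the standard long-time line-source approximation: after a heat pulse of duration t0 and power q, the temperature rise decays approximately as dT(t) = q/(4πλ)·ln(t/(t−t0)), so the slope of dT against ln(t/(t−t0)) yields the soil thermal conductivity λ. The calibration relating λ to volumetric water content below is an invented placeholder, not the relation fitted in the study.

```python
# Long-time line-source sketch: recover thermal conductivity from synthetic
# cooling data, then map it to water content through a placeholder calibration.
import numpy as np

q, t0 = 36.0, 120.0                        # W/m, heat pulse duration (s)
lam_true = 1.2                             # W/(m K), "unknown" conductivity used to make the data

t = np.linspace(t0 + 30.0, t0 + 600.0, 200)              # cooling-phase times (s)
dT = q / (4.0 * np.pi * lam_true) * np.log(t / (t - t0))  # long-time cooling model
dT += np.random.default_rng(2).normal(0.0, 0.01, t.size)  # DTS measurement noise

# linear fit of dT against ln(t/(t - t0)); slope = q / (4*pi*lambda)
x = np.log(t / (t - t0))
slope, intercept = np.polyfit(x, dT, 1)
lam_est = q / (4.0 * np.pi * slope)

def water_content(lam):
    """Placeholder calibration lambda -> theta; the study used its own calibrated model."""
    return (lam - 0.4) / 2.5               # assumed linear relation

print(f"estimated thermal conductivity    : {lam_est:.2f} W/(m K)")
print(f"inferred volumetric water content : {water_content(lam_est):.3f} m3/m3")
```

In drier, less conductive soils the asymptotic regime is reached later, which is why the abstract recommends longer heat pulses and longer analysed cooling windows.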

Relevance: 20.00%

Abstract:

The first experimental evidence for the development of polarized CD4+ Th1 and Th2 responses in vivo has been obtained using the murine model of infection with Leishmania major, an intracellular parasite of macrophages in their vertebrate host. Genetically determined resistance and susceptibility to infection with this parasite have been clearly demonstrated to result from the development of polarized Th1 and Th2 responses, respectively. Using this model system, the dominant role of cytokines in the induction of polarized CD4+ responses has been validated in vivo. The requisite role of IL-4 in mediating both Th2 differentiation and susceptibility to infection in BALB/c mice has directed interest towards the search for evidence of IL-4 production early after infection and identification of its cellular source. We have been able to demonstrate a burst of IL-4 production in susceptible BALB/c mice within the first day of infection with L. major and could establish that this rapidly produced IL-4 instructed Th2 lineage commitment of subsequently activated CD4+ T cells and stabilized this commitment by downregulating IL-12 Rbeta2 chain expression, resulting in susceptibility to infection. Strikingly, this early IL-4 response to infection resulted from the cognate recognition of a single epitope in a distinctive antigen, LACK, from this complex microorganism by a restricted population of CD4+ T cells that express Vbeta4-Valpha8 T cell receptors.

Relevance: 20.00%

Abstract:

Background: With the aging of the population, heart failure (HF) incidence and prevalence trends are expected to worsen significantly unless concentrated prevention efforts are undertaken. ECG abnormalities are common in the elderly, but data on their association with HF risk are limited. Objective: To assess whether baseline ECG abnormalities or dynamic changes in them are associated with an increased risk of HF. Methods: A prospective cohort study of 2915 participants aged 70 to 79 years without preexisting HF from the Health, Aging and Body Composition (Health ABC) study, followed for a median period of 11.4 (IQR 7.0-11.7) years. The Minnesota Code was used to define major and minor ECG abnormalities at baseline and at 4 years. The main outcome measure was adjudicated incident HF events. Using Cox models, we assessed (1) the association between ECG abnormalities and incident HF and (2) the incremental value of adding the ECG to the Health ABC HF Risk Score. Results: At baseline, 380 participants (13.0%) had minor and 620 (21.3%) had major ECG abnormalities. During follow-up, 485 (16.6%) participants developed incident HF. After adjusting for the eight clinical variables in the Health ABC HF Risk Score, the hazard ratio (HR) was 1.27 (95% confidence interval [CI] 0.96-1.68) for minor and 1.99 (CI 1.61-2.44) for major ECG abnormalities (P for trend <0.001), compared with no ECG abnormalities. The association did not change according to the presence of baseline CHD. At 4 years, 263 participants had developed new and 549 had persistent abnormalities, and both were associated with increased HF risk (HR = 1.94, CI 1.38-2.72 for new and HR = 2.35, CI 1.82-3.02 for persistent, compared with no ECG abnormalities). The baseline ECG correctly reclassified 10.6% of participants across the categories of the Health ABC HF Risk Score. Conclusion: Among older adults, baseline ECG abnormalities and changes in them over time are common; both are associated with an increased risk of HF. Whether the ECG should be incorporated into routine screening of older adults should be evaluated in randomized controlled trials.
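The hazard ratios above come from Cox proportional hazards models of time to incident HF with ECG-abnormality categories as covariates. Below is an illustrative sketch of that kind of model on simulated data, using the third-party lifelines package; the covariates, column names and censoring scheme are invented, and the simulated log-hazard effects are merely chosen to echo the hazard ratios reported above.

```python
# Illustrative Cox proportional hazards model on simulated data (requires lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 2915
ecg = rng.choice([0, 1, 2], n, p=[0.657, 0.130, 0.213])    # none / minor / major ECG abnormality
age = rng.uniform(70, 79, n)
sbp = rng.normal(135, 18, n)                                # an extra clinical covariate

# simulate times to HF with higher hazard for minor/major ECG abnormalities
log_hr = 0.24 * (ecg == 1) + 0.69 * (ecg == 2) + 0.03 * (age - 74)
time = rng.exponential(1.0 / (0.02 * np.exp(log_hr)))       # years to HF
observed = time < 11.4                                      # administrative censoring at 11.4 years
time = np.minimum(time, 11.4)

df = pd.DataFrame({
    "time": time, "hf": observed.astype(int),
    "ecg_minor": (ecg == 1).astype(int), "ecg_major": (ecg == 2).astype(int),
    "age": age, "sbp": sbp,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="hf")
print(cph.summary[["exp(coef)", "p"]])                      # exp(coef) is the hazard ratio
```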