80 results for Counter-insurgency
at Université de Lausanne, Switzerland
Abstract:
Recent studies have led to the discovery of a mediator that acts as an endogenous counter-regulator of glucocorticoid action within the immune system. Isolated as a product of anterior pituitary cells, this protein was found to have the sequence of macrophage migration inhibitory factor (MIF), one of the first cytokine activities to be described. Macrophages and T cells release MIF in response both to various inflammatory stimuli and upon incubation with low concentrations of glucocorticoids. The glucocorticoid-induced secretion of MIF is tightly regulated and decreases at high, anti-inflammatory steroid concentrations. Once secreted, MIF "overrides" the anti-inflammatory and immunosuppressive effects of steroids on macrophage and T-cell cytokine production. The physiological role of MIF thus appears to be to counter-balance steroid inhibition of the inflammatory response. Anti-MIF antibodies fully protect animals from experimentally induced gram-negative or gram-positive septic shock, an effect that may be the result of the increased anti-inflammatory effects of glucocorticoids after neutralization of endogenous MIF. Anti-MIF therapeutic strategies are presently under development and may prove to be a means to modulate cytokine production in septic shock as well as in other inflammatory disease states.
Abstract:
Making statins available over the counter is one of the measures proposed to correct their underuse. Since May 2004, simvastatin 10 mg has been sold over the counter in Great Britain. However, uncertainties persist concerning the efficacy of statins in primary prevention and at a 10 mg dose, and there is a risk of side effects and drug interactions. Beyond correcting statin underuse and the hope of reducing coronary heart disease mortality, the British decision reflects a will to make individuals responsible for their own health and its financial cost. In any case, the benefit of switching statins from prescription-only to over-the-counter status should be evaluated experimentally before such a change is introduced.
Abstract:
Whole-body counting is a technique of choice for assessing the intake of gamma-emitting radionuclides. An appropriate calibration is necessary, obtained either by experimental measurement or by Monte Carlo (MC) calculation. The aim of this work was to validate an MC model for calibrating whole-body counters (WBCs) by comparing the results of computations with measurements performed on an anthropomorphic phantom, and to investigate the effect of a change in the phantom's position on the WBC counting sensitivity. The GEANT MC code was used for the calculations, and an IGOR phantom loaded with several types of radionuclides was used for the experimental measurements. The results show reasonable agreement between measurements and MC computation. A 1-cm error in phantom positioning changes the activity estimation by >2%. Considering that a 5-cm deviation in the positioning of the phantom may occur in a realistic counting scenario, this implies that the uncertainty of the activity measured by a WBC is ∼10-20%.
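To make the positioning argument above concrete, here is a minimal sketch that propagates a positioning deviation into a relative activity uncertainty, assuming a linear ~2% change per centimetre as quoted in the abstract; the function and figures are illustrative assumptions, not part of the study's GEANT model.

    # Illustrative sketch: propagate a phantom-positioning deviation into a
    # relative activity uncertainty, assuming the ~2% per cm sensitivity quoted
    # in the abstract (an assumption, not the study's actual calibration model).

    def activity_uncertainty(deviation_cm: float, sensitivity_per_cm: float = 0.02) -> float:
        """Return the relative uncertainty on the measured activity."""
        return deviation_cm * sensitivity_per_cm

    for deviation in (1.0, 3.0, 5.0):  # positioning deviation in cm
        print(f"{deviation:.0f} cm deviation -> ~{activity_uncertainty(deviation):.0%} uncertainty")

Under this assumption, a 5-cm deviation maps to roughly 10%, the lower end of the 10-20% range reported.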
Abstract:
A joint project between the Paul Scherrer Institut (PSI) and the Institute of Radiation Physics was initiated to characterise the PSI whole body counter in detail through measurements and Monte Carlo simulation. Accurate knowledge of the detector geometry is essential for reliable simulations of human body phantoms filled with known activity concentrations. Unfortunately, the technical drawings provided by the manufacturer are often not detailed enough, and sometimes the specifications do not agree with the actual set-up. Therefore, the exact detector geometry and the position of the detector crystal inside the housing were determined through radiographic images. X-rays were used to analyse the structure of the detector, and (60)Co radiography was employed to measure the core of the germanium crystal. Moreover, the precise axial alignment of the detector within its housing was determined through a series of radiographic images taken at different incident angles. The information obtained in this way enables us to optimise the Monte Carlo geometry model and to perform much more accurate and reliable simulations.
Abstract:
This paper examines argumentative talk-in-interaction in the workplace. It focuses on counter-argumentative references, which consist of the various resources that the opponent uses to refer to the origin/source of his/her opposition, namely the confronted position and the person who expressed it. Particular attention is paid to the relationship - in terms of sequential positioning and referential extension - between reported speech, polyphony, pointing gestures and shifts in gaze direction. Data are taken from workplace management meetings that have been recorded in New Zealand by the Language in the Workplace Project.
Abstract:
Fifty-three patients with histologically proven carcinoma were injected with highly purified [131I]-labeled goat antibodies, or fragments of antibodies, against carcinoembryonic antigen (CEA). Each patient was examined by external photoscanning 4, 24, 36 and 48 h after injection. In 22 patients (16 of 38 injected with intact antibodies, 5 of 13 with F(ab')2 fragments and 1 of 2 with Fab' fragments), an increased concentration of 131I radioactivity corresponding to the previously known tumor location was detected by photoscanning 36-48 h after injection. Blood pool and secreted radioactivity was determined in all patients by injecting, 15 min before scanning, [99mTc]-labeled normal serum albumin and free 99mTcO4-. The computerized subtraction of 99mTc from 131I radioactivity enhanced the definition of tumor localization in the 22 positive patients. However, in spite of the computerized subtraction, interpretation of the scans remained doubtful for 12 patients and was entirely negative for 19 additional patients. In order to provide a more objective evaluation of the specificity of the tumor localization of antibodies, 14 patients scheduled for tumor resection were injected simultaneously with [131I]-labeled antibodies or fragments and with [125I]-labeled normal goat IgG or fragments. After surgery, the radioactivity of the two isotopes present either in tumor or in adjacent normal tissues was measured in a dual-channel scintillation counter. The results showed that the antibodies or their fragments were 2-4 times more concentrated in the tumor than in the normal tissues. In addition, it was shown that the injected antibodies formed immune complexes with circulating CEA and that the amount of immune complexes detectable in serum was roughly proportional to the level of circulating CEA.
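For readers unfamiliar with the dual-label bookkeeping described above, here is a minimal sketch of the two steps: subtracting the blood-pool tracer (99mTc) signal from the antibody (131I) signal, and forming a tumor-to-normal-tissue uptake ratio. All counts and the scaling factor are invented for illustration and do not reproduce the study's data.

    # Hypothetical example of dual-isotope subtraction imaging: remove the
    # 99mTc blood-pool contribution from the 131I antibody signal, then compare
    # tumor and normal-tissue uptake. All numbers are invented for illustration.

    def background_corrected(i131_counts: float, tc99m_counts: float, scale: float = 1.0) -> float:
        """Subtract the scaled blood-pool (99mTc) counts from the antibody (131I) counts."""
        return i131_counts - scale * tc99m_counts

    tumor = background_corrected(i131_counts=4000.0, tc99m_counts=1800.0)
    normal = background_corrected(i131_counts=2300.0, tc99m_counts=1500.0)
    print(f"tumor-to-normal uptake ratio: {tumor / normal:.1f}")  # falls within the 2-4x range reported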
Abstract:
Most research in gout has concentrated on the proinflammatory mechanisms to explain the inflammation that is generated when leucocytes are in contact with monosodium urate crystals. However, the episodic nature of gout and the absence of inflammation even when crystals are present suggest that there are natural counter-regulatory mechanisms to limit the inflammatory response. Gagné and colleagues showed that myeloid inhibitory C-type lectin, a C-type lectin inhibitory receptor expressed on neutrophils, modulates monosodium urate-induced neutrophil responses in vitro.
Abstract:
BACKGROUND: Identification of a primary care physician (PCP) by older patients is considered essential for the coordination of care, but the extent to which identified PCPs are general practitioners or specialists is unknown. This study described older patients' experiences with their PCP and tested the hypothesis of differences between patients who identify a specialist as their PCP (SP PCP) and those who turn to a general practitioner (GP PCP). METHODS: In 2012, a cross-sectional postal survey on care was conducted in the population aged 68 and over of the canton of Vaud. Data were provided by 2,276 participants in the ongoing Lausanne cohort 65+ (Lc65+), a study of people born between 1934 and 1943, and by 998 persons from an additional sample drawn to include the population outside of Lausanne or born before 1934. RESULTS: Participants expressed favourable perceptions, at rates exceeding 75% for most items. However, only 38% to 51% responded positively regarding out-of-hours availability, easy access, home visits, the likelihood of prescribing expensive medication if needed, and the doctor's awareness of over-the-counter drugs. 12.0% had an SP PCP; in 95.9% of these cases the PCP was specialised in a discipline implying training in internal medicine. Bivariate and multivariate analyses showed no significant differences between GP and SP PCPs regarding perceptions of accessibility/availability, the doctor-patient relationship, information and continuity of care, prevention, spontaneous use of the emergency department, or ambulatory care utilisation. CONCLUSIONS: The experiences of older patients were mostly positive, despite some gaps in reported hearing and memory testing and in colorectal cancer screening. We found no differences between the GP and SP PCP groups.
Abstract:
Background: Previous studies reported an increase of mean platelet volume (MPV) in patients with acute ischemic stroke. However, its correlation with stroke severity has not been investigated. Moreover, studies on the association of MPV with functional outcome yielded inconsistent results. Methods: We included all consecutive ischemic stroke patients admitted to CHUV (Centre Hospitalier Universitaire Vaudois) Neurology Service within 24 h after stroke onset who had MPV measured on admission. The association of MPV with stroke severity (NIHSS score at admission and at 24 h) and outcome (Rankin Scale score at 3 and 12 months) was analyzed in univariate analysis. The χ² test was performed to compare the frequency of minor strokes (NIHSS score ≤4) and good functional outcome (Rankin Scale score ≤2) across MPV quartiles. The ANOVA test was used to compare MPV between stroke subtypes according to the TOAST classification. Student's two-tailed unpaired t test was performed to compare MPV between lacunar and nonlacunar strokes. MPV was generated at admission by the Sysmex XE-2100 automated cell counter (Sysmex Corporation, Kobe, Japan) from EDTA blood samples. Results: There was no significant difference in the frequency of minor strokes (p = 0.46) and good functional outcome (p = 0.06) across MPV quartiles. MPV was not associated with stroke severity or outcome in univariate analysis. There was no significant difference in MPV between stroke subtypes according to the TOAST classification (p = 0.173) or between lacunar and nonlacunar strokes (10.50 ± 0.91 vs. 10.40 ± 0.81 fL, p = 0.322). Conclusions: MPV, assessed within 24 h after ischemic stroke onset, is not associated with stroke severity or functional outcome.
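As an illustration of the comparisons described in this abstract (χ² test across MPV quartiles, unpaired t test between stroke subtypes), the following sketch runs the same kinds of tests on synthetic data; the variables and values are assumptions and do not come from the CHUV cohort.

    # Sketch of the statistical comparisons described above, run on synthetic
    # data (not the CHUV cohort): a chi-squared test of minor-stroke frequency
    # across MPV quartiles and an unpaired t test of MPV between lacunar and
    # nonlacunar strokes.
    import numpy as np
    import pandas as pd
    from scipy import stats

    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "mpv": rng.normal(10.4, 0.9, 400),                # mean platelet volume, fL
        "nihss": rng.integers(0, 20, 400),                # admission NIHSS score
        "lacunar": rng.integers(0, 2, 400).astype(bool),  # lacunar vs. nonlacunar stroke
    })
    df["minor_stroke"] = df["nihss"] <= 4
    df["mpv_quartile"] = pd.qcut(df["mpv"], 4, labels=False)

    # Frequency of minor strokes across MPV quartiles (chi-squared test)
    chi2, p_chi2, _, _ = stats.chi2_contingency(pd.crosstab(df["mpv_quartile"], df["minor_stroke"]))
    print(f"chi-squared across quartiles: p = {p_chi2:.2f}")

    # MPV in lacunar vs. nonlacunar strokes (two-tailed unpaired t test)
    t, p_t = stats.ttest_ind(df.loc[df["lacunar"], "mpv"], df.loc[~df["lacunar"], "mpv"])
    print(f"t test, lacunar vs. nonlacunar: p = {p_t:.2f}")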
Abstract:
OBJECTIVE: Evaluate whether healthy or diabetic adult mice can tolerate an extreme loss of pancreatic α-cells and how this sudden massive depletion affects β-cell function and blood glucose homeostasis. RESEARCH DESIGN AND METHODS: We generated a new transgenic model allowing near-total α-cell removal specifically in adult mice. Massive α-cell ablation was triggered in normally grown and healthy adult animals upon diphtheria toxin (DT) administration. The metabolic status of these mice was assessed in 1) physiologic conditions, 2) a situation requiring glucagon action, and 3) after β-cell loss. RESULTS: Adult transgenic mice enduring extreme (98%) α-cell removal remained healthy and did not display major defects in insulin counter-regulatory response. We observed that 2% of the normal α-cell mass produced enough glucagon to ensure near-normal glucagonemia. β-Cell function and blood glucose homeostasis remained unaltered after α-cell loss, indicating that direct local intraislet signaling between α- and β-cells is dispensable. Escaping α-cells increased their glucagon content during subsequent months, but there was no significant α-cell regeneration. Near-total α-cell ablation did not prevent hyperglycemia in mice having also undergone massive β-cell loss, indicating that a minimal amount of α-cells can still guarantee normal glucagon signaling in diabetic conditions. CONCLUSIONS: An extremely low amount of α-cells is sufficient to prevent a major counter-regulatory deregulation, both under physiologic and diabetic conditions. We previously reported that α-cells reprogram to insulin production after extreme β-cell loss and now conjecture that the low α-cell requirement could be exploited in future diabetic therapies aimed at regenerating β-cells by reprogramming adult α-cells.
Abstract:
This thesis focuses on theoretical asset pricing models and their empirical applications. I aim to investigate the following noteworthy problems: i) whether the relationship between asset prices and investors' propensities to gamble and to fear disaster is time varying, ii) whether the conflicting evidence on firm- and market-level skewness can be explained by downside risk, and iii) whether costly learning drives liquidity risk. Empirical tests support the above assumptions and provide novel findings in asset pricing, investment decisions, and firms' funding liquidity. The first chapter considers a partial equilibrium model where investors have heterogeneous propensities to gamble and to fear disaster. Skewness preference represents the desire to gamble, while kurtosis aversion represents fear of extreme returns. Using US data from 1988 to 2012, my model demonstrates that in bad times, risk aversion is higher, more people fear disaster, and fewer people gamble than in good times. This leads to a new empirical finding: gambling preference has a greater impact on asset prices during market downturns than during booms. The second chapter consists of two essays. The first essay introduces a formula based on the conditional CAPM for decomposing market skewness. We find that major upward and downward market movements can be well predicted by the asymmetric comovement of betas, which is characterized by an indicator called "Systematic Downside Risk" (SDR). We find that SDR can effectively forecast future stock market movements, and we obtain out-of-sample R-squares (compared with a strategy using the historical mean) of more than 2.27% with monthly data. The second essay reconciles a well-known empirical fact: aggregating positively skewed firm returns leads to a negatively skewed market return. We reconcile this fact through firms' greater response to negative market news than to positive market news. We also propose several market return predictors, such as downside idiosyncratic skewness. The third chapter studies funding liquidity risk based on a general equilibrium model featuring two agents: an entrepreneur and an external investor. Only the investor needs to acquire information to estimate the unobservable fundamentals driving economic output. The novelty is that information acquisition is more costly in bad times than in good times, i.e. a counter-cyclical information cost, as supported by previous empirical evidence. We then show that liquidity risks are principally driven by costly learning.
Résumé: This thesis presents theoretical asset pricing models and their empirical applications. My objective is to study the following problems: whether the relationship between asset prices and investors' propensities to gamble and to fear disaster varies over time; whether the conflicting evidence on firm- and market-level skewness can be explained by downside risk; and whether costly learning increases liquidity risk. In addition, empirical tests support these assumptions and provide new findings concerning asset pricing, investment decisions, and firms' funding liquidity. The first chapter examines an equilibrium model in which investors have heterogeneous propensities to gamble and to fear disaster. Skewness preference represents the desire to gamble, while kurtosis aversion represents fear of disaster. Using US data from 1988 to 2012, my model demonstrates that in bad times risk aversion is higher, more people fear disaster, and fewer people gamble than in good times. This leads to a new empirical finding: gambling preference has a greater impact on asset prices during market downturns than during economic booms. Exploiting this relationship alone generates an annual excess return of 7.74% that is not explained by popular factor models. The second chapter comprises two essays. The first essay introduces a formula based on the conditional CAPM for decomposing market skewness. We find that major upward and downward market movements can be predicted by the comovement of betas; an indicator called Systematic Downside Risk (SDR) is constructed to characterise this asymmetry in the comovement of betas. We find that SDR can effectively forecast future stock market movements, and we obtain out-of-sample R-squares (compared with a strategy using the historical mean) of more than 2.272% with monthly data. An investor timing the market using SDR would have obtained a marked ratio increase of 0.206. The second essay reconciles a well-known empirical fact about firm- and market-level skewness: aggregating positively skewed firm returns leads to a negatively skewed market return. We decompose market return skewness at the firm level and reconcile this fact through firms' greater response to negative market news than to positive market news. This decomposition reveals several effective market return predictors, such as volatility-weighted idiosyncratic skewness and downside idiosyncratic skewness. The third chapter provides a new theoretical foundation for time-varying liquidity problems in an incomplete-market environment. We propose a general equilibrium model with two agents: an entrepreneur and an external investor. Only the investor needs to know the true state of the firm; consequently, payoff information is costly to acquire. The novelty is that information acquisition is more expensive in bad times than in good times, as supported by previous empirical evidence. When a recession begins, costly learning raises liquidity premia, causing liquidity to evaporate, as is also supported by previous empirical evidence.
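The out-of-sample R-squared reported above is measured against a historical-mean forecast; the abstract does not give its formula, but a standard definition of such a statistic (an assumption here, not quoted from the thesis) is the following, where r_t is the realized return, \hat{r}_t the model forecast and \bar{r}_t the historical-mean forecast:

    % Standard out-of-sample R^2 relative to a historical-mean benchmark
    % (assumed definition; not quoted from the thesis itself).
    R^2_{OS} = 1 - \frac{\sum_{t=1}^{T} \left(r_t - \hat{r}_t\right)^2}{\sum_{t=1}^{T} \left(r_t - \bar{r}_t\right)^2}

A positive value means the predictor beats the historical mean in mean squared forecast error; the thesis reports values above 2.27% at the monthly horizon.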
Abstract:
Recent advances in signal analysis have given EEG the status of a true brain mapping and brain imaging method capable of providing spatio-temporal information regarding brain (dys)function. Because of the increasing interest in the temporal dynamics of brain networks, and because of the straightforward compatibility of EEG with other brain imaging techniques, EEG is increasingly used in the neuroimaging community. However, the full capability of EEG remains highly underestimated. Many combined EEG-fMRI studies use the EEG only as a spike counter or an oscilloscope. Many cognitive and clinical EEG studies still use the EEG in its traditional way and analyze grapho-elements at certain electrodes and latencies. We show here that this way of using the EEG is not only dangerous because it leads to misinterpretations, but also largely ignores the spatial aspects of the signals. In fact, EEG primarily measures the electric potential field at the scalp surface, in the same way as MEG measures the magnetic field. By properly sampling and correctly analyzing this electric field, EEG can provide reliable information about the neuronal activity in the brain and the temporal dynamics of this activity in the millisecond range. This review explains some of these analysis methods and illustrates their potential in clinical and experimental applications.
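As one concrete example of treating the EEG as a scalp potential field rather than as traces at individual electrodes, the sketch below computes the global field power of a synthetic multichannel epoch, a common reference-free spatial measure; this particular measure is not named in the abstract, and the data are invented.

    # Illustrative only: treat a multichannel EEG epoch as a scalp potential
    # field and compute its global field power (GFP), a reference-free spatial
    # measure. GFP is not named in the abstract; it is an assumed example.
    import numpy as np

    rng = np.random.default_rng(42)
    eeg = rng.normal(0.0, 10.0, size=(64, 512))       # 64 electrodes x 512 samples, in microvolts

    avg_ref = eeg - eeg.mean(axis=0, keepdims=True)   # re-reference to the common average
    gfp = avg_ref.std(axis=0)                         # spatial standard deviation at each time point

    print(f"peak GFP: {gfp.max():.1f} microvolts at sample {gfp.argmax()}")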
Abstract:
Although tumor-specific CD8 T-cell responses often develop in cancer patients, they rarely result in tumor eradication. We aimed at studying directly the functional efficacy of tumor-specific CD8 T cells at the site of immune attack. Tumor lesions in lymphoid and nonlymphoid tissues (metastatic lymph nodes and soft tissue/visceral metastases, respectively) were collected from stage III/IV melanoma patients and investigated for the presence and function of CD8 T cells specific for the tumor differentiation antigen Melan-A/MART-1. Comparative analysis was conducted with peripheral blood T cells. We provide evidence that in vivo priming selects, within the available naive Melan-A/MART-1-specific CD8 T-cell repertoire, cells with high T-cell receptor avidity that can efficiently kill melanoma cells in vitro. In vivo, primed Melan-A/MART-1-specific CD8 T cells accumulate at high frequency in both lymphoid and nonlymphoid tumor lesions. Unexpectedly, however, whereas primed Melan-A/MART-1-specific CD8 T cells that circulate in the blood display robust inflammatory and cytotoxic functions, those that reside in tumor lesions (particularly in metastatic lymph nodes) are functionally tolerant. We show that both the lymph node and the tumor environments blunt T-cell effector functions and offer a rationale for the failure of tumor-specific responses to effectively counter tumor progression.
Abstract:
INTRODUCTION: Red cell distribution width was recently identified as a predictor of cardiovascular and all-cause mortality in patients with previous stroke. Red cell distribution width is also higher in patients with stroke compared with those without. However, there are no data on the association of red cell distribution width, assessed during the acute phase of ischemic stroke, with stroke severity and functional outcome. In the present study, we sought to investigate this relationship and ascertain the main determinants of red cell distribution width in this population. METHODS: We used data from the Acute Stroke Registry and Analysis of Lausanne for patients between January 2003 and December 2008. Red cell distribution width was generated at admission by the Sysmex XE-2100 automated cell counter from ethylene diamine tetraacetic acid blood samples stored at room temperature until measurement. A χ² test was performed to compare frequencies of categorical variables between red cell distribution width quartiles, and one-way analysis of variance was used for continuous variables. The effect of red cell distribution width on severity and functional outcome was investigated in univariate and multivariate robust regression analysis. The level of significance was set at 95%. RESULTS: There were 1504 patients (72±15.76 years, 43.9% female) included in the analysis. Red cell distribution width was significantly associated with NIHSS score (β=0.24, P=0.01) and functional outcome (odds ratio=10.73 for poor outcome, P<0.001) in univariate but not in multivariate analysis. Prehospital Rankin score (β=0.19, P<0.001), serum creatinine (β=0.008, P<0.001), hemoglobin (β=-0.009, P<0.001), mean platelet volume (β=0.09, P<0.05), age (β=0.02, P<0.001), low ejection fraction (β=0.66, P<0.001) and antihypertensive treatment (β=0.32, P<0.001) were independent determinants of red cell distribution width. CONCLUSIONS: Red cell distribution width, assessed during the early phase of acute ischemic stroke, does not predict severity or functional outcome.
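As an illustration of the univariate robust regression step described above, here is a minimal sketch fitting a robust linear model of NIHSS score on red cell distribution width with synthetic data; the variables, distributions and robust norm are assumptions, not taken from the Lausanne registry.

    # Sketch of a univariate robust regression of stroke severity (NIHSS score)
    # on red cell distribution width (RDW), using synthetic data rather than
    # the Acute Stroke Registry and Analysis of Lausanne.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    rdw = rng.normal(13.5, 1.5, 500)                   # red cell distribution width, %
    nihss = np.clip(rng.normal(6.0, 5.0, 500), 0, 30)  # admission NIHSS, independent of RDW here

    X = sm.add_constant(rdw)                           # intercept + RDW
    fit = sm.RLM(nihss, X, M=sm.robust.norms.HuberT()).fit()
    print(fit.summary())                               # the RDW coefficient should be near zero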