76 results for Term of protection


Abstract:

Background: CD8 T-cells play a critical role in antiviral immunity. However, the mechanisms of virus control and the immune correlates of protection are still not fully understood. Among other factors, TCR avidity (antigen sensitivity) is thought to play a critical role. Whereas there is broad consensus that high-TCR-avidity T-cell responses correlate with higher efficacy against cancer and acute viral infections, this may not be the case in chronic persistent viral infections. Methods: The TCR avidity (measured by the 50% effective concentration [EC50]) of HIV-1-specific CD8 T-cell responses directed against optimal epitopes was investigated in different cohorts of HIV-1-infected subjects (n=114), including early acute and chronic (progressive and non-progressive) HIV-1 infection. Overall, TCR avidity was investigated in 245 HIV-1-specific CD8 T-cell responses. The relationships between TCR avidity, T-cell differentiation, and functional profile, including cytokine secretion, proliferation, and cytotoxic potential (determined by polychromatic flow cytometry), were analyzed. Results: HIV-1-specific CD8 T-cell responses from patients with acute infection had significantly lower TCR avidity than those from patients with chronic (progressive or non-progressive) HIV infection (P=0.03 and 0.003, respectively). These differences remained significant when the analyses were restricted to common epitopes (the same epitopes restricted by the same class I HLA). Interestingly, some patients treated during acute infection underwent spontaneous treatment interruption. Re-exposure to high viral load induced two major effects: a) an increase in the TCR avidity of pre-existing high-avidity (EC50<0.01) T-cell responses (P<0.02), and b) the generation of new T-cell responses with higher TCR avidity than the average pre-existing T-cell responses. Conclusion: These results suggest that high-TCR-avidity T-cell responses are selected during the course of HIV-1 infection and that one of the potential driving mechanisms is continuous exposure to HIV-1 antigens. These results advance our understanding of the relationship between the TCR avidity and antigen exposure of antiviral memory CD8 T-cells.
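As a minimal illustration of the EC50 readout used above (a generic sketch, not the study's analysis pipeline), a Hill-type dose-response curve can be fitted to a peptide titration; the fitted EC50 is the functional avidity, with lower values indicating higher avidity. The function names and data below are hypothetical.

```python
# Minimal sketch: estimating the EC50 (functional avidity) of a T-cell
# response from a peptide titration, assuming a standard Hill-type
# dose-response model. Example data are illustrative, not from the study.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, ec50, slope):
    """Fraction of maximal response at a given peptide concentration."""
    return top * conc**slope / (ec50**slope + conc**slope)

# Peptide concentrations (ug/ml) and % responding cells (hypothetical).
conc = np.array([1e-4, 1e-3, 1e-2, 1e-1, 1.0, 10.0])
resp = np.array([2.0, 10.0, 38.0, 70.0, 88.0, 92.0])

params, _ = curve_fit(hill, conc, resp, p0=[90.0, 0.05, 1.0])
top, ec50, slope = params
print(f"EC50 = {ec50:.3g} ug/ml")  # lower EC50 = higher TCR avidity
```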

Abstract:

The generation of vaccines against HIV/AIDS able to induce long-lasting protective immunity remains a major goal in the HIV field. The modest efficacy (31.2%) against HIV infection observed in the RV144 phase III clinical trial highlighted the need for further improvement of HIV vaccine candidates, formulations, and vaccine regimens. In this study, we generated two novel NYVAC vectors, expressing HIV-1 clade C gp140(ZM96) (NYVAC-gp140) or Gag(ZM96)-Pol-Nef(CN54) (NYVAC-Gag-Pol-Nef), and defined their virological and immunological characteristics in cultured cells and in mice. The insertion of the HIV genes does not affect the replication capacity of the NYVAC recombinants in primary chicken embryo fibroblast cells, the HIV sequences remain stable after multiple passages, and the HIV antigens are correctly expressed and released from cells, with Env as a trimer (NYVAC-gp140), while in NYVAC-Gag-Pol-Nef-infected cells Gag-induced virus-like particles (VLPs) are abundant. Electron microscopy revealed that VLPs accumulated with time at the cell surface, without interfering with NYVAC morphogenesis. Both vectors trigger specific innate responses in human cells and show an attenuated profile in immunocompromised adult BALB/c and newborn CD1 mice after intracranial inoculation. Analysis of the immune responses elicited in mice after homologous NYVAC prime/NYVAC boost immunization shows that the recombinant viruses induced polyfunctional Env-specific CD4 or Gag-specific CD8 T-cell responses, and antibody responses against gp140 and p17/p24 were elicited. Our findings provide important insights into the virus-host cell interactions of NYVAC vectors expressing HIV antigens and into the activation of specific immune parameters, which will help to unravel potential correlates of protection against HIV in human clinical trials with these vectors. IMPORTANCE: We have generated two novel NYVAC-based HIV vaccine candidates expressing HIV-1 clade C trimeric soluble gp140 (ZM96) and Gag(ZM96)-Pol-Nef(CN54) as VLPs. These vectors are stable and express high levels of both HIV-1 antigens. Gag-induced VLPs do not interfere with NYVAC morphogenesis; the vectors are highly attenuated in immunocompromised and newborn mice after intracranial inoculation, trigger specific innate immune responses in human cells, and activate T-cell (Env-specific CD4 and Gag-specific CD8) and B-cell immune responses to the HIV antigens, leading to high antibody titers against gp140. For these reasons, these vectors can be considered vaccine candidates against HIV/AIDS and are currently being tested in macaques and humans.

Abstract:

A passive sampling device called the Monitor of NICotine, or "MoNIC", was constructed and evaluated by the IST laboratory for determining nicotine in environmental tobacco smoke (ETS). Vapour-phase nicotine was passively collected on a potassium bisulfate-treated glass fibre filter as collection medium. The amount of nicotine on the treated filter was determined by gas chromatography with a thermionic specific detector (GC-TSD) after liquid-liquid extraction (1 mL of 5 N NaOH : 1 mL of n-heptane saturated with NH3), using quinoline as internal standard. Taking a nicotine content of 0.2 mg per cigarette as reference, the number of cigarette equivalents (CE) inhaled by non-smokers can be calculated. By comparing the CE detected on the badges of non-smokers with the nicotine and cotinine levels in the saliva of both smokers and exposed non-smokers (N=49), we could confirm the validity of the CE concept for estimating exposure to ETS. The Valais CIPRET (centre for information on and prevention of smoking) is organizing a large campaign on passive smoking entitled "Passive smoke: we suffer from it, we die from it". This campaign will take place in 2007 and aims to clearly inform the population of Valais about the dangers of passive smoke. More than 1'500 MoNIC badges were distributed free of charge to the Swiss population for self-monitoring of exposure to ETS, expressed in terms of CE. Unstimulated saliva was also collected to determine the ETS biomarkers nicotine and cotinine in participating volunteers. Preliminary results on the CE levels observed in occupational and non-occupational situations involving ETS are presented in this study.
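For illustration, the CE conversion described above reduces to a single division by the 0.2 mg/cigarette reference; a minimal sketch (the badge value is hypothetical):

```python
# Minimal sketch of the Cigarette Equivalent (CE) idea described above:
# nicotine collected on the badge divided by a reference nicotine yield
# per cigarette (0.2 mg/cigarette, taken from the abstract).
REF_NICOTINE_MG_PER_CIG = 0.2

def cigarette_equivalents(nicotine_on_badge_mg: float) -> float:
    """Convert nicotine mass collected on a MoNIC badge into CE."""
    return nicotine_on_badge_mg / REF_NICOTINE_MG_PER_CIG

print(cigarette_equivalents(0.05))  # 0.05 mg collected -> 0.25 CE
```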

Abstract:

Rock slope instabilities are implicitly linked to the supply of sediment and debris recharging channels prone to debris flow. Hence, incorporating bedrock structure and terrain morphology can be relevant to sediment budget analysis and debris flow hazard assessment. Here, the mode of debris production of the Manival catchment (northern French Alps) is documented through the study of its morphostructural aspects extracted from a high-resolution DEM. The terrain's implication in the process of debris supply is evaluated by: a) a systematic classification of the major morphological units based on slope gradient, which enables a spatial analysis of the zones of debris production and deposition; b) a detailed structural analysis performed on the DEM in order to identify potentially unstable slopes; c) an analysis of gully orientations, which informs on the structural control of the source zones; and d) the localization of high-density joint sets, which documents whether sources of continuous debris production are controlled by the structural setting of the catchment. These DEM-based indicators can be used as proxies for assessing the influence of the current topography and make it possible to quantify the degree of susceptibility to mass wasting and hillslope erosion activity. This contribution suggests some directions for characterizing sediment flux dynamics in small alpine catchments.
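As a rough sketch of the first indicator, slope-gradient banding can be computed directly from a DEM; the thresholds and the toy DEM below are hypothetical placeholders, not values from the Manival study:

```python
# Minimal sketch: classifying terrain into morphological units by slope
# gradient computed from a DEM. np.gradient approximates the surface
# derivatives; the banding thresholds are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
cell = 2.0                                            # DEM resolution (m)
dem = np.arange(100)[:, None] * 1.5 + rng.normal(0.0, 0.5, (100, 100))

dzdy, dzdx = np.gradient(dem, cell)                   # elevation derivatives
slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# Hypothetical banding: deposition zones vs. debris-producing slopes.
units = np.digitize(slope_deg, bins=[15.0, 35.0, 60.0])
# 0: low-gradient deposition, 1: transit slopes, 2: debris-supply slopes,
# 3: very steep rock faces (potential source zones)
```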

Abstract:

Objectives: The aim of this study was to compare the specificity and sensitivity of different biological markers that can be used in the forensic field to identify drivers who are potentially dangerous because of their drinking habits. Methods: We studied 280 Swiss drivers caught driving under the influence of alcohol. 33 were excluded for not having CDT N results; 247 were included (218 men (88%) and 29 women (12%)). Mean age was 42.4 years (SD: 12; min: 20; max: 76). The evaluation of alcohol consumption concerned the month before the CDT test and was classified after the interview as: heavy drinkers (>3 drinks per day): 60 (32.7%); moderate (<3 drinks per day): 127 (51.4%) / 114 (46.5%); abstinent: 60 (24.3%) / 51 (21%). Alcohol intake was monitored by structured interviews, self-reported drinking habits and the C-Audit questionnaire, as well as information provided by the drivers' families and general practitioners. Consumption was quantified in standard drinks, each containing approximately 10 grams of pure alcohol (ref. WHO). Results (comparison between moderate drinkers, up to 3 drinks per day, and excessive drinkers, more than 3 drinks per day):

Marker             ROC area  95% CI       Cut-off  Sensitivity  LR+   Specificity  LR-
CDT TIA            0.852     0.786-0.917  2.6*     0.93         1.43  0.35         0.192
CDT N latex        0.875     0.821-0.930  2.5*     0.66         6.93  0.90         0.369
Asialo+disialo-tf  0.881     0.826-0.936  1.2*     0.78         4.07  0.80         0.268
                                          1.7°     0.66         8.9   0.93         0.360
GGT                0.659     0.580-0.737  85*      0.37         2.14  0.83         0.764

* cut-off point suggested by the manufacturer
° cut-off point suggested by our laboratory

Conclusion: With the cut-off point established by the manufacturer, CDT TIA performed poorly in terms of specificity. N latex CDT and CZE CDT performed better, especially if a 1.7 cut-off is used with CZE.
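For reference, the LR+ and LR- columns follow directly from sensitivity and specificity; the sketch below approximately reproduces the CDT N latex row (small differences reflect rounding of sensitivity and specificity in the abstract):

```python
# Minimal sketch: likelihood ratios from sensitivity and specificity,
# as a consistency check against the table above.
def likelihood_ratios(sens: float, spec: float) -> tuple[float, float]:
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    return sens / (1.0 - spec), (1.0 - sens) / spec

lr_pos, lr_neg = likelihood_ratios(sens=0.66, spec=0.90)
print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.3f}")  # ~6.6 and ~0.378
```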

Abstract:

PURPOSE: We describe the results of a preliminary prospective study using different recently developed temporary and retrievable inferior vena cava (IVC) filters. METHODS: Fifty temporary IVC filters (Gunther, Gunther Tulip, Antheor) were inserted in 47 patients when the required period of protection against pulmonary embolism (PE) was estimated to be less than 2 weeks. The indications were documented deep vein thrombosis (DVT) and temporary contraindications for anticoagulation, a high risk for PE, and PE despite DVT prophylaxis. RESULTS: Filters were removed 1-12 days after placement and nine (18%) had captured thrombi. Complications were one PE during and after removal of a filter, two minor filter migrations, and one IVC thrombosis. CONCLUSION: Temporary filters are effective in trapping clots and protecting against PE, and the complication rate does not exceed that of permanent filters. They are an alternative when protection from PE is required temporarily, and should be considered in patients with a normal life expectancy.

Abstract:

Aim: To compare a less intensive regimen based on high-dose imatinib (IM) with an intensive IM/HyperCVAD regimen in adults with Ph+ ALL, in terms of early response and outcome after stem cell transplantation (SCT). Methods: Patients aged 18-60 years with previously untreated Ph+ ALL not evolving from chronic myeloid leukemia were eligible if they had no contraindication to chemotherapy and SCT (ClinicalTrials.gov ID, NCT00327678). After a steroid prephase allowing Ph and/or BCR-ABL diagnosis, cycle 1 differed between the randomization arms. In arm A (IM-based), IM was given at 800 mg on days 1-28, combined with vincristine (2 mg, days 1, 8, 15, 22) and dexamethasone (40 mg, days 1-2, 8-9, 15-16, and 22-23) only. In arm B (IM/HyperCVAD), IM was given at 800 mg on days 1-14, combined with adriamycin (50 mg/m2, day 4), cyclophosphamide (300 mg/m2/12h, days 1, 2, 3), vincristine (2 mg, days 4 and 11), and dexamethasone (40 mg, days 1-4 and 11-14). All patients received a cycle 2 combining high-dose methotrexate (1 g/m2, day 1) and AraC (3 g/m2/12h, days 2 and 3) with IM at 800 mg on days 1-14, regardless of their response. Four intrathecal infusions were given during this induction/consolidation period. Minimal residual disease (MRD) was centrally evaluated by quantitative RQ-PCR after cycle 1 (MRD1) and cycle 2 (MRD2). Major MRD response was defined as a BCR-ABL/ABL ratio <0.1%. All patients were then to receive allogeneic SCT using related or unrelated matched donor stem cells, or autologous SCT if they had no donor and a major MRD2 response. IM/chemotherapy maintenance was planned after autologous SCT. In the absence of SCT, patients received alternating cycles 1 (as in arm B) and cycles 2, followed by maintenance, as in the published IM/HyperCVAD regimen. The primary objective was non-inferiority of arm A in terms of major MRD2 response. Secondary objectives were CR rate, SCT rate, treatment- and transplant-related mortality, and relapse-free (RFS), event-free (EFS) and overall (OS) survival. Results: Among the 270 patients randomized between May 2006 and August 2011, 265 were evaluable for this analysis (133 in arm A, 132 in arm B; median age, 47 years; median follow-up, 40 months). Main patient characteristics were well balanced between the two arms. Owing to higher induction mortality in arm B (9 versus 1 death; P=0.01), the CR rate was higher in the less intensive arm A (98% versus 89% after cycle 1 and 98% versus 91% after cycle 2; P=0.003 and 0.006, respectively). A total of 213 and 205 patients were evaluated for bone marrow MRD1 and MRD2, respectively. The rates of patients reaching major MRD response and undetectable MRD were 45% (44% arm A, 46% arm B; P=0.79) and 10% (in both arms) at MRD1, and 66% (68% arm A, 63.5% arm B; P=0.56) and 25% (28% arm A, 22% arm B; P=0.33) at MRD2, respectively. The non-inferiority primary endpoint was thus demonstrated (P=0.002). Overall, EFS was estimated at 42% (95% CI, 35-49) and OS at 51% (95% CI, 44-57) at 3 years, with no difference between arms A and B (46% versus 38% and 53% versus 49%; P=0.25 and 0.61, respectively). Of the 251 CR patients, 157 (80 arm A, 77 arm B) received allogeneic SCT and 34 (17 in each arm) received autologous SCT in first CR. Allogeneic transplant-related mortality was similar in both arms (31.5% versus 22% at 3 years; P=0.51). Of the 157 allografted patients, 133 had an MRD2 evaluation and 89 had MRD2 <0.1%. In these patients, MRD2 did not significantly influence post-transplant RFS and OS, either when tested with the 0.1% cutoff or as a continuous log covariate.
Of the 34 autografted patients, 31 had an MRD2 evaluation and, according to the protocol, 28 had MRD2 <0.1%. When restricting the comparison to patients achieving a major MRD2 response, and with the current follow-up, a trend toward better results was observed after autologous as compared with allogeneic SCT (RFS, 63% versus 49.5% and OS, 69% versus 58% at 3 years; P=0.35 and P=0.08, respectively). Conclusions: In adults, the use of tyrosine kinase inhibitors (TKIs) has markedly improved the results of Ph+ ALL therapy, which are now close to those observed in Ph-negative ALL. We demonstrated here that chemotherapy intensity can be safely reduced when combined with high-dose IM. We will further explore this TKI-based strategy using nilotinib prior to SCT in our next GRAAPH-2013 trial. The trend towards a better outcome after autologous compared with allogeneic SCT observed in MRD responders validates MRD as an important early surrogate endpoint for treatment stratification and new drug investigation in this disease.
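For illustration, the MRD readout used for the primary endpoint is a simple ratio with a 0.1% cutoff; a minimal sketch (copy numbers are invented, and the trial's exact RQ-PCR normalization may differ):

```python
# Minimal sketch of the MRD readout described above: a BCR-ABL/ABL ratio
# from RQ-PCR copy numbers, with "major MRD response" defined as < 0.1%.
def mrd_ratio_percent(bcr_abl_copies: float, abl_copies: float) -> float:
    return 100.0 * bcr_abl_copies / abl_copies

ratio = mrd_ratio_percent(bcr_abl_copies=12, abl_copies=48_000)
print(f"{ratio:.3f}% -> major MRD response: {ratio < 0.1}")  # 0.025% -> True
```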

Abstract:

Despite numerous discussions, workshops, reviews and reports about the responsible development of nanotechnology, information describing the health and environmental risks of engineered nanoparticles or nanomaterials is severely lacking and thus insufficient for completing a rigorous risk assessment of their use. However, since preliminary scientific evaluations indicate that there are reasonable suspicions that activities involving nanomaterials might have damaging effects on human health, the precautionary principle must be applied. Public and private institutions as well as industries have the duty to adopt preventive and protective measures proportionate to the risk intensity and the desired level of protection. In this work, we present a practical, 'user-friendly' procedure for university-wide safety and health management of nanomaterials, developed as a multi-stakeholder effort (government, accident insurance, researchers and experts in occupational safety and health). The process starts with a schematic decision tree that classifies each nano laboratory into one of three hazard classes, similar to a control banding approach (from Nano 3, highest hazard, to Nano 1, lowest hazard). Classifying laboratories into risk classes would require considering actual or potential exposure to the nanomaterial as well as statistical data on the health effects of exposure. Because these data (as well as exposure limits for each individual material) are not available, risk classes could not be determined. For each hazard level we then provide a list of required risk mitigation measures (technical, organizational and personal). The target 'users' of this safety and health methodology are researchers and safety officers. They can rapidly determine the precautionary hazard class of their activities and the corresponding adequate safety and health measures. We succeeded in convincing scientists engaged in nano-activities that adequate safety measures and management promote innovation and discoveries by ensuring a safe environment, even in the case of very novel products. The proposed measures are not considered constraints but a support to their research. This methodology is being implemented at the Ecole Polytechnique de Lausanne in over 100 research labs dealing with nanomaterials. It is our opinion that it would be useful to other research and academic institutions as well.
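A minimal sketch of what such a decision tree can look like in code; the two branching criteria here are hypothetical placeholders for the institution's actual questions:

```python
# Minimal sketch of a control-banding style decision tree like the one
# described above. Branching criteria are hypothetical placeholders.
def nano_hazard_class(handles_free_nanoparticles: bool,
                      powder_or_aerosol: bool) -> str:
    if handles_free_nanoparticles and powder_or_aerosol:
        return "Nano 3"  # highest hazard: strictest mitigation measures
    if handles_free_nanoparticles:
        return "Nano 2"  # e.g. nanoparticles handled in suspension
    return "Nano 1"      # lowest hazard: e.g. nanomaterials fixed in a matrix

print(nano_hazard_class(True, False))  # -> "Nano 2"
```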

Abstract:

Modeling concentration-response functions became extremely popular in ecotoxicology during the last decade. Indeed, modeling makes it possible to determine the full response pattern of a given substance. However, reliable modeling is data-intensive, which is at odds with the current trend in ecotoxicology of reducing, for cost and ethical reasons, the number of data points produced during an experiment. It is therefore crucial to design experiments in a cost-effective manner. In this paper, we propose to use the theory of locally D-optimal designs to determine the set of concentrations to be tested so that the parameters of the concentration-response function can be estimated with high precision. We illustrate this approach by determining the locally D-optimal designs for estimating the toxicity of the herbicide dinoseb to daphnids and algae. The results show that the number of concentrations to be tested is often equal to the number of parameters and is often related to their meaning, i.e. the optimal concentrations are located close to the parameter values. Furthermore, the results show that the locally D-optimal design often has the minimal number of support points and is not very sensitive to small changes in the nominal values of the parameters. In order to reduce the experimental cost and the use of test organisms, especially in long-term studies, reliable nominal values may therefore be fixed based on prior knowledge and literature research instead of preliminary experiments.
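For reference, the standard textbook formulation of local D-optimality (not specific to this paper) is: a design ξ placing weights w_i at concentrations x_i maximizes the determinant of the Fisher information matrix evaluated at a nominal parameter guess θ0.

```latex
% f(x,\theta) is the gradient of the model response with respect to \theta,
% and \theta_0 is the nominal parameter guess (hence "locally" optimal).
\[
M(\xi,\theta_0) \;=\; \sum_{i=1}^{k} w_i\, f(x_i,\theta_0)\, f(x_i,\theta_0)^{\!\top},
\qquad
\xi^{*} \;=\; \arg\max_{\xi} \,\det M(\xi,\theta_0).
\]
```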

Abstract:

In recent years, numerous studies have highlighted the toxic effects of organic micropollutants on the species of our lakes and rivers. However, most of these studies focused on the toxicity of individual substances, whereas organisms are exposed every day to thousands of substances in mixture, and the effects of these cocktails are not negligible. This doctoral thesis therefore focused on models for predicting the environmental risk of such cocktails for the aquatic environment. The main objective was to assess the ecological risk of the mixtures of chemical substances measured in Lake Geneva, but also to take a critical look at the methodologies used, in order to propose adaptations for a better estimation of the risk. In the first part of this work, the risk of pesticide and pharmaceutical mixtures for the Rhône and for Lake Geneva was established using approaches envisioned in particular in European legislation. These are screening approaches, i.e. they allow a general evaluation of mixture risk. Such an approach highlights the most problematic substances, i.e. those contributing most to the toxicity of the mixture; in our case, essentially four pesticides. The study also shows that all substances, even in trace amounts, contribute to the effect of the mixture. This observation has implications for environmental management: it implies that all sources of pollutants must be reduced, not only the most problematic ones. However, the proposed approach also presents an important conceptual bias, which makes its use questionable beyond screening and would require an adaptation of the assessment factors employed. In a second part, the study focused on the use of mixture models in environmental risk calculation. Mixture models were developed and validated species by species, not for an evaluation of the ecosystem as a whole. Their use should therefore involve a species-by-species calculation, which is rarely done owing to the lack of available ecotoxicological data. The goal was therefore to compare, using randomly generated values, the risk calculated with a rigorous species-by-species method against the classical calculation in which the models are applied to the whole community without accounting for inter-species variation. The results are similar in the majority of cases, which validates the traditional approach; however, this work identified certain cases in which the classical application can lead to an under- or overestimation of the risk. Finally, the last part of this thesis examined the influence that micropollutant cocktails may have had on communities in situ, using a two-step approach. First, the toxicity of fourteen herbicides detected in Lake Geneva was determined; over the studied period, from 2004 to 2009, this herbicide toxicity decreased from 4% of species affected to less than 1%. The question was then whether this decrease in toxicity had an impact on the development of certain species within the algal community. Statistical analysis made it possible to isolate other factors that may influence the flora, such as water temperature or the presence of phosphates, and thus to identify which species turned out to have been influenced, positively or negatively, by the decrease in toxicity in the lake over time. Interestingly, some of them had already shown similar behaviors in mesocosm studies. In conclusion, this work shows that robust models exist to predict the risk of micropollutant mixtures for aquatic species, and that they can be used to explain the role of these substances in ecosystem functioning. However, these models have limits and underlying assumptions that must be considered when they are applied. - For several years now, scientists as well as society at large have been concerned by the risk that organic micropollutants may pose to the aquatic environment. Indeed, several studies have shown the toxic effects these substances may induce in organisms living in our lakes or rivers, especially when they are exposed to acute or chronic concentrations. However, most of these studies focused on the toxicity of single compounds, i.e. considered individually. The same goes for the current European regulations concerning the environmental risk assessment procedures for these substances. But aquatic organisms are typically exposed every day and simultaneously to thousands of organic compounds, and the toxic effects resulting from these "cocktails" cannot be neglected. The ecological risk assessment of mixtures of such compounds therefore has to be addressed by scientists in the most reliable and appropriate way. In the first part of this thesis, the procedures currently envisioned for aquatic mixture risk assessment in European legislation are described. These methodologies are based on the mixture model of concentration addition and the use of predicted no-effect concentrations (PNEC) or effect concentrations (EC50) with assessment factors. These principal approaches were applied to two specific case studies, Lake Geneva and the River Rhône in Switzerland, including a discussion of the outcomes of such applications. These first-level assessments showed that the mixture risks for the studied cases rapidly exceeded the critical value. This exceedance is generally due to two or three main substances. The proposed procedures therefore allow the identification of the most problematic substances, for which management measures, such as a reduction of their input into the aquatic environment, should be envisioned. However, it was also shown that the risk levels associated with mixtures of compounds are not negligible, even without considering these main substances. Indeed, it is the sum of the substances that is problematic, which is more challenging in terms of risk management. Moreover, a lack of reliability in the procedures was highlighted, which can lead to contradictory results in terms of risk. This is linked to the inconsistency of the assessment factors applied in the different methods. In the second part of the thesis, the reliability of more advanced procedures to predict the mixture effect on communities in the aquatic system was investigated.
These established methodologies combine the models of concentration addition (CA) or response addition (RA) with species sensitivity distribution (SSD) curves. Indeed, mixture effect predictions have been shown to be consistent only when the mixture models are applied to a single species, and not to several species simultaneously aggregated in SSDs. Hence, a more stringent procedure for mixture risk assessment is proposed: first apply the CA or RA models to each species separately and, in a second step, combine the results to build an SSD for the mixture. Unfortunately, this methodology is not applicable in most cases, because it requires large data sets that are usually not available. Therefore, the differences between the two methodologies were studied with artificially created datasets, in order to characterize the robustness of the traditional approach of applying the models directly to species sensitivity distributions. The results showed that the use of CA directly on SSDs may lead to underestimations of the mixture concentration affecting 5% or 50% of species, especially when the substances present a large standard deviation in their species sensitivity distributions. The application of RA can lead to over- or underestimations, depending mainly on the slope of the dose-response curves of the individual species. The potential underestimation with RA becomes important when the ratio between the EC50 and the EC10 of the dose-response curves of the species composing the SSD is smaller than 100. However, considering common real cases of ecotoxicity data, the mixture risk calculated by the methodology applying mixture models directly to SSDs remains consistent and would rather slightly overestimate the risk. These results can be used as a theoretical validation of the currently applied methodology. Nevertheless, when assessing the risk of mixtures, one has to keep in mind this source of error with the classical methodology, especially when SSDs present a distribution of the data outside the range determined in this study. Finally, in the last part of this thesis, we confronted the mixture effect predictions with biological changes observed in the environment. In this study, long-term monitoring of a great European lake, Lake Geneva, provided the opportunity to assess to what extent the predicted toxicity of herbicide mixtures explains the changes in the composition of the phytoplankton community, alongside other classical limnological parameters such as nutrients. To reach this goal, the gradient of mixture toxicity of 14 herbicides regularly detected in the lake was calculated, using the concentration addition and response addition models with species sensitivity distribution curves. A decreasing temporal gradient of toxicity was observed from 2004 to 2009. Redundancy analysis and partial redundancy analysis showed that this gradient explains a significant portion of the variation in phytoplankton community composition, even after removing the effect of all other covariables. Moreover, some species that were revealed to have been influenced, positively or negatively, by the decrease in toxicity in the lake over time showed similar behaviors in mesocosm studies. It can be concluded that herbicide mixture toxicity is one of the key parameters explaining phytoplankton changes in Lake Geneva. To conclude, different methods exist to predict the risk of mixtures in ecosystems, but their reliability varies depending on the underlying hypotheses.
One should therefore carefully consider these hypotheses, as well as the limits of the approaches, before using the results for environmental risk management.
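As a minimal illustration of the two mixture models discussed above (a generic sketch under their textbook definitions; the concentrations and per-species EC50s are invented):

```python
# Concentration addition (CA) via toxic units: the mixture reaches its
# EC50-equivalent effect level when sum(c_i / EC50_i) = 1.
def toxic_units(concs, ec50s):
    return sum(c / e for c, e in zip(concs, ec50s))

# Response addition (RA): combine independent per-substance effect
# fractions E_i as E_mix = 1 - prod(1 - E_i).
def response_addition(effects):
    unaffected = 1.0
    for e in effects:
        unaffected *= 1.0 - e
    return 1.0 - unaffected

print(toxic_units([0.5, 1.2, 0.02], [10.0, 4.0, 0.1]))  # 0.55 toxic units
print(response_addition([0.10, 0.05, 0.20]))            # ~0.316 combined effect
```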

Abstract:

BACKGROUND & AIMS: Despite the proven ability of immunization to reduce Helicobacter infection in mouse models, the precise mechanism of protection has remained elusive. This study explores the possibility that interleukin (IL)-17 plays a role in the reduction of Helicobacter infection following vaccination of wild-type animals, or in the spontaneous reduction of bacterial infection in IL-10-deficient mice. METHODS: In mice reducing Helicobacter infection, the levels and source of IL-17 were determined, and the role of IL-17 in the reduction of Helicobacter infection was probed with neutralizing antibodies. RESULTS: Gastric IL-17 levels were strongly increased in mice mucosally immunized with urease plus cholera toxin and challenged with Helicobacter felis, as compared with controls (654 +/- 455 and 34 +/- 84 relative units for IL-17 messenger RNA expression [P < .01], and 6.9 +/- 8.4 and 0.02 +/- 0.04 pg for IL-17 protein concentration [P < .01], respectively). Flow cytometry analysis showed that a peak of CD4(+)IL-17(+) T cells infiltrating the gastric mucosa occurred in immunized mice, in contrast to control mice (4.7% +/- 0.3% and 1.4% +/- 0.3% [P < .01], respectively). Gastric mucosa-infiltrating CD4(+)IL-17(+) T cells were also observed in IL-10-deficient mice that spontaneously reduced H felis infection (4.3% +/- 2.3% and 2% +/- 0.6% [P < .01], for infected and noninfected IL-10-deficient mice, respectively). In wild-type immunized mice, intraperitoneal injection of anti-IL-17 antibodies significantly inhibited inflammation and the reduction of Helicobacter infection in comparison with control antibodies (1 of 12 mice vs 9 of 12 mice reduced Helicobacter infection [P < .01], respectively). CONCLUSIONS: IL-17 plays a critical role in the immunization-induced reduction of Helicobacter infection from the gastric mucosa.

Abstract:

Introduction: 700 to 1000 IU of vitamin D per day prevents 20% of falls and fractures. Higher doses could prevent other health problems, such as immune diseases. Adherence to daily oral vitamin D supplementation is low, and there is no guideline on how to supplement patients with rheumatic diseases. We evaluated whether one or two doses of 300'000 IU of oral vitamin D3 were enough to reach an optimal level of 25-OH vitamin D in late winter in patients with insufficiency. Methods: During November 2009 (M0), patients attending our Rheumatology Outpatient Clinic had a blood test to measure 25-OH vitamin D. Results were classified as: deficiency <10 µg/l, insufficiency 10 to 30 µg/l, and normal >30 µg/l. Patients on daily oral vitamin D3, patients who had received a single high dose of vitamin D3 in the previous 6 months, and patients with deficiency or normal results were excluded. Included patients received a single dose of 300'000 IU of oral vitamin D3 and were asked to come back for a 25-OH vitamin D blood test after 3 (M3) and 6 months (M6). If they were still insufficient at M3, they received a second high dose of 300'000 IU of oral vitamin D3. Results: 292 patients had their 25-OH vitamin D level determined at M0. 141 patients (70% women) had vitamin D insufficiency (18.5 µg/l (10.2-29.1)) and received a prescription for a single dose of 300'000 IU of oral vitamin D3. Men and women were not statistically different in terms of age or 25-OH vitamin D level at M0. 124/141 (88%) patients had a blood test at M3: 2/124 (2%) had deficiency (8.1 µg/l (7.5-8.7)) and 50/124 (40%) had normal results (36.7 µg/l (30.5-56.5)); 58% (72/124) were insufficient (23.6 µg/l (13.8-29.8)) and received a second prescription for 300'000 IU of oral vitamin D3. Of the 50/124 patients who had normal results at M3 and did not receive a second prescription, 36 (72%) had a test at M6: 47% (17/36) had normal results (34.8 µg/l (30.3-42.8)) and 53% (19/36) were insufficient (25.6 µg/l (15.2-29.9)). Of the 54/72 (75%) patients who received a second prescription, 28/54 (52%) had insufficiency (23.2 µg/l (12.8-28.7)) and 26/54 (48%) had normal results (33.8 µg/l (30.0-43.7)) at M6. Discussion: This real-life study showed that one or two oral boluses of 300'000 IU of vitamin D3 in autumn and winter were not enough to completely correct hypovitaminosis D, but were a good way of preventing the nadir of 25-OH vitamin D usually observed in spring in a Swiss rheumatologic population.
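The classification rule used above is a simple thresholding; a minimal sketch (thresholds in µg/l taken from the abstract):

```python
# Minimal sketch of the 25-OH vitamin D classification used above.
def classify_25oh_vitd(level_ug_per_l: float) -> str:
    if level_ug_per_l < 10:
        return "deficiency"
    if level_ug_per_l <= 30:
        return "insufficiency"
    return "normal"

print(classify_25oh_vitd(18.5))  # -> "insufficiency"
```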

Abstract:

INTRODUCTION: Video records are widely used to analyze performance in alpine skiing at the professional or amateur level. Part of these analyses requires the labeling of movements (i.e. determining when specific events occur). While differences among coaches, and differences for the same coach between different dates, are expected, they have never been quantified. Knowing these differences is essential to determine which parameters are reliable and should be used. This study aimed to quantify the precision and repeatability of alpine skiing coaches of various levels, as has been done in other fields (Koo et al., 2005). METHODS: Software similar to commercial products was designed for video analysis. 15 coaches divided into 3 groups (5 amateur coaches (G1), 5 professional instructors (G2) and 5 semi-professional coaches (G3)) were enrolled. They were asked to label 15 timing parameters (TP) according to the Swiss ski manual (Terribilini et al., 2001) for each curve. TP included phases (initiation, steering I-II) and body and ski movements (e.g. rotation, weighting, extension, balance). Three video sequences sampled at 25 Hz were used and one curve per video was labeled. The first video was used to familiarize the analyzer with the software. The two other videos, corresponding to slalom and giant slalom, were considered for the analysis. G1 performed the analysis twice (A1 and A2) on different dates, and TP were randomized between the two analyses. Reference TP were defined as the median of G2 and G3 at A1. Precision was defined as the RMS difference between individual TP and reference TP, whereas repeatability was calculated as the RMS difference between individual TP at A1 and at A2. RESULTS AND DISCUSSION: For G1, G2 and G3, a precision of +/-5.6, +/-3.0 and +/-2.0 frames, respectively, was obtained. These results, showing that G2 was more precise than G1 and G3 more precise than G2, are in accordance with the group levels. The repeatability for G1 was +/-3.1 frames. Furthermore, differences in precision among TP were observed for G2 and G3, with the largest difference of +/-5.9 frames for "body counter-rotation movement in steering phase II" and the smallest of 0.8 frames for "ski unweighting in initiation phase". CONCLUSION: This study quantified coaches' ability to label video in terms of precision and repeatability. The best precision, obtained for G3, was +/-0.08 s, which corresponds to +/-6.5% of the curve cycle. Regarding repeatability, we obtained +/-0.12 s for G1, corresponding to +/-12% of the curve cycle. The repeatability of G2 and G3 is expected to be lower than the precision of G1 and will be assessed soon. In conclusion, our results indicate that the labeling of video records is reliable for some TP, whereas caution is required for others. REFERENCES: Koo S, Gold MD, Andriacchi TP (2005). Osteoarthritis, 13, 782-789. Terribilini M, et al. (2001). Swiss Ski Manual, 29-46. IASS, Lucerne.
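As a minimal illustration of the precision/repeatability metric defined above (a generic RMS-difference sketch; the frame labels are invented):

```python
# Minimal sketch: RMS difference between two sets of labeled frame
# indices, converted to seconds for video sampled at 25 Hz.
import numpy as np

def rms_difference(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

coach_tp = [12, 40, 67, 95]      # frames labeled by one coach
reference_tp = [10, 42, 66, 98]  # median of expert groups (reference)

rms_frames = rms_difference(coach_tp, reference_tp)
print(f"{rms_frames:.1f} frames = {rms_frames / 25:.2f} s at 25 Hz")
```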

Abstract:

This review assesses the presentation, management, and outcome of delayed postpancreatectomy hemorrhage (PPH) and suggests a novel algorithm as a possible standard of care. An electronic search of the Medline and Embase databases from January 1990 to February 2010 was undertaken. A random-effects meta-analysis of the success rate and mortality of laparotomy vs. interventional radiology after delayed PPH was performed. Fifteen studies comprising 248 patients with delayed PPH were included. The incidence of delayed PPH was 3.3%. A sentinel bleed heralding a delayed PPH was observed in 45% of cases. Pancreatic leaks or intraabdominal abscesses were found in 62%. Interventional radiology was attempted in 41%, and laparotomy was undertaken in 49%. On meta-analysis comparing laparotomy vs. interventional radiology, no significant difference was found in terms of complete hemostasis (76% vs. 80%; P = 0.35). A statistically significant difference favored interventional radiology over laparotomy in terms of mortality (22% vs. 47%; P = 0.02). Proper management of postoperative complications, such as pancreatic leak and intraabdominal abscess, minimizes the risk of delayed PPH. Sentinel bleeding needs to be thoroughly investigated. If a pseudoaneurysm is detected, it has to be treated by interventional angiography in order to prevent a further delayed PPH. Early angiography with embolization or stenting is safe and should be the procedure of choice. Surgery remains a therapeutic option if interventional radiology is not available, or if the patient cannot be resuscitated for an interventional treatment.
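For reference, random-effects pooling in the DerSimonian-Laird style, as named in the methods, can be sketched as follows (study counts are invented, and the review's actual model details may differ):

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling of study
# proportions (e.g. successful hemostasis), on the log-odds scale.
import numpy as np

events = np.array([12, 8, 20])   # hypothetical successes per study
totals = np.array([15, 10, 26])

# Log-odds and within-study variances (0.5 continuity correction).
p = (events + 0.5) / (totals + 1.0)
y = np.log(p / (1 - p))
v = 1.0 / (events + 0.5) + 1.0 / (totals - events + 0.5)

w = 1.0 / v                              # fixed-effect weights
ybar = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - ybar) ** 2)          # Cochran's Q
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (v + tau2)                  # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
print(f"pooled proportion = {1 / (1 + np.exp(-pooled)):.2f}")
```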

Abstract:

Objective: This review assesses the presentation, management, and outcome of delayed postpancreatectomy hemorrhage (PPH) and suggests a novel algorithm as a possible standard of care. Methods: An electronic search of the Medline and Embase databases from January 1990 to February 2010 was undertaken. A random-effects meta-analysis of the success rate and mortality of laparotomy vs. interventional radiology after delayed PPH was performed. Results: Fifteen studies comprising 248 patients with delayed PPH were included. Its incidence was 3.3%. A sentinel bleed heralding a delayed PPH was observed in 45% of cases. Pancreatic leaks or intraabdominal abscesses were found in 62%. Interventional radiology was attempted in 41%, and laparotomy was undertaken in 49%. On meta-analysis comparing laparotomy vs. interventional radiology, no significant difference was observed in terms of complete hemostasis (76% vs. 80%; P = 0.35). A statistically significant difference favored interventional radiology over laparotomy in terms of mortality (22% vs. 47%; P = 0.02). Conclusion: Proper and early management of postoperative complications, such as pancreatic leak and intraabdominal abscess, minimizes the risk of delayed PPH. Sentinel bleeding needs to be thoroughly investigated. If a pseudoaneurysm is detected, it has to be treated by interventional angiography in order to prevent a further delayed PPH. Early angiography with embolization or stenting is safe and should be the procedure of choice. Surgery remains a therapeutic option if interventional radiology is not available, or if the patient cannot be resuscitated for an interventional treatment.