928 results for critical path methods


Relevance: 30.00%

Abstract:

This is the report of the first workshop on Incorporating In Vitro Alternative Methods for Developmental Neurotoxicity (DNT) Testing into International Hazard and Risk Assessment Strategies, held in Ispra, Italy, on 19-21 April 2005. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and jointly organized by ECVAM, the European Chemical Industry Council, and the Johns Hopkins University Center for Alternatives to Animal Testing. The primary aim of the workshop was to identify and catalog potential in vitro alternative methods and to assess how data from such methods could help to predict and identify DNT hazards. Working groups focused on two different aspects: a) details on the science available in the field of DNT, including discussions on the models available to capture the critical DNT mechanisms and processes, and b) policy and strategy aspects to assess the integration of alternative methods in a regulatory framework. This report summarizes these discussions and details the recommendations and priorities for future work.

Relevance: 30.00%

Abstract:

BACKGROUND AND PURPOSE: Endovascular treatment of wide-neck bifurcation aneurysms often results in incomplete occlusion or aneurysm recurrence. The goals of this study were to compare results of coil embolization with or without the assistance of self-expandable stents and to examine how stents may influence neointima formation. MATERIALS AND METHODS: Wide-neck bifurcation aneurysms were constructed in 24 animals and, after 4-6 weeks, were randomly allocated to 1 of 5 groups: 1) coil embolization using the assistance of 1 braided stent (n = 5); 2) coil embolization using the assistance of 2 braided stents in a Y configuration (n = 5); 3) coil embolization without stent assistance (n = 6); 4) Y-stenting alone (n = 4); and 5) untreated controls (n = 4). Angiographic results were compared at baseline and at 12 weeks, by using an ordinal scale. Neointima formation at the neck at 12 weeks was compared among groups by using a semiquantitative grading scale. Bench studies were performed to assess stent porosities. RESULTS: Initial angiographic results were improved with single stent-assisted coiling compared with simple coiling (P = .013). Angiographic results at 12 weeks were improved with any stent assistance (P = .014). Neointimal closure of the aneurysm neck was similar with or without stent assistance (P = .908), with neointima covering coil loops but rarely stent struts. Y-stent placement alone had no therapeutic effect. Bench studies showed that porosities can be decreased with stent compaction, but a relatively stable porous transition zone was a limiting factor. CONCLUSIONS: Stent-assisted coiling may improve results of embolization by allowing more complete initial coiling, but these high-porosity stents did not provide a scaffold for more complete neointimal closure of aneurysms.

Relevance: 30.00%

Abstract:

BACKGROUND: Outcome following foot and ankle surgery can be assessed by disease- and region-specific scores. Many scoring systems exist, making comparison among studies difficult. The present study focused on outcome measures for a common foot and ankle abnormality and compared the results obtained by 2 disease-specific and 2 body region-specific scores. METHODS: We reviewed 41 patients who underwent lateral ankle ligament reconstruction. Four outcome scales were administered simultaneously: the Cumberland Ankle Instability Tool (CAIT) and the Chronic Ankle Instability Scale (CAIS), which are disease-specific, and the American Orthopaedic Foot & Ankle Society (AOFAS) hindfoot scale and the Foot and Ankle Ability Measure (FAAM), which are both body region-specific. The degree of correlation between scores was assessed by Pearson's correlation coefficient. Nonparametric tests, the Kruskal-Wallis test and the Mann-Whitney test for pairwise comparison of the scores, were performed. RESULTS: A significant difference (P < .005) was observed between the CAIS and the AOFAS score (P = .0002), between the CAIS and the FAAM 1 (P = .0001), and between the CAIT and the AOFAS score (P = .0003). CONCLUSIONS: This study compared the performances of 4 disease- and body region-specific scoring systems. We demonstrated a correlation between the 4 administered scoring systems and notable differences between the results given by each of them. Disease-specific scores appeared more accurate than body region-specific scores. A strong correlation between the AOFAS score and the other scales was observed. The FAAM seemed a good compromise because it offered the possibility of evaluating the patient according to his or her own functional demand. CLINICAL RELEVANCE: The present study contributes to the development of more critical and accurate outcome assessment methods in foot and ankle surgery.
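The correlation analysis described in this abstract can be illustrated with a minimal pure-Python sketch. The score values below are invented for illustration only (they are not the study's data), and a real analysis would typically use a statistics library rather than a hand-rolled function:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired scores (0-100) for the same patients on two scales;
# this toy data is perfectly linear, so r comes out at 1.0.
aofas = [60, 70, 80, 90]
faam = [55, 65, 75, 85]
print(pearson_r(aofas, faam))
```

For the nonparametric comparisons the abstract mentions (Kruskal-Wallis, Mann-Whitney), `scipy.stats.kruskal` and `scipy.stats.mannwhitneyu` would be the usual tools.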

Relevance: 30.00%

Abstract:

This report summarizes research conducted at Iowa State University on behalf of the Iowa Department of Transportation, focusing on the volumetric state of hot-mix asphalt (HMA) mixtures as they transition from stable to unstable configurations. This has traditionally been addressed during mix design by meeting a minimum voids in the mineral aggregate (VMA) requirement, based solely upon the nominal maximum aggregate size without regard to other significant aggregate-related properties. The goal was to expand the current specification to include additional aggregate properties, e.g., fineness modulus, percent crushed fine and coarse aggregate, and their interactions. The work was accomplished in three phases: a literature review, extensive laboratory testing, and statistical analysis of test results. The literature review focused on the history and development of the current specification, laboratory methods of identifying critical mixtures, and the effects of other aggregate-related factors on critical mixtures. The laboratory testing involved three maximum aggregate sizes (19.0, 12.5, and 9.5 millimeters), three gradations (coarse, fine, and dense), and combinations of natural and manufactured coarse and fine aggregates. Specimens were compacted using the Superpave Gyratory Compactor (SGC), conventionally tested for bulk and maximum theoretical specific gravities, and physically tested using the Nottingham Asphalt Tester (NAT) under a repeated-load confined configuration to identify the transition state from sound to unsound. The statistical analysis used ANOVA and linear regression to examine the effects of the identified aggregate factors on critical state transitions in asphalt paving mixtures and to develop predictive equations. The results clearly demonstrate that the volumetric conditions of an HMA mixture at the stable-unstable threshold are influenced by a composite measure of the maximum aggregate size and gradation, and by aggregate shape and texture.
The currently defined VMA criterion, while significant, is insufficient by itself to correctly differentiate sound from unsound mixtures. Under current specifications, many otherwise sound mixtures are subject to rejection solely on the basis of failing to meet the VMA requirement. Based on the laboratory data and statistical analysis, a new paradigm for volumetric mix design is proposed that explicitly accounts for aggregate factors (gradation, shape, and texture).
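The regression step described above — fitting predictive equations for a critical volumetric property from an aggregate factor — can be sketched with an ordinary least-squares fit. The fineness-modulus and VMA numbers below are invented for illustration and are not taken from the report:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical observations: fineness modulus vs. critical VMA (%).
# The toy data lies exactly on vma = 8.4 + 2.0 * fm.
fm = [2.4, 2.6, 2.8, 3.0]
vma = [13.2, 13.6, 14.0, 14.4]

intercept, slope = linear_fit(fm, vma)
print(intercept, slope)
```

A real analysis with several aggregate factors and their interactions would use multiple regression (e.g. `statsmodels` or `numpy.linalg.lstsq`) rather than a single-predictor fit like this.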

Relevance: 30.00%

Abstract:

In recent years, numerous studies have demonstrated the toxic effects of organic micropollutants on the species of our lakes and rivers. However, most of these studies focused on the toxicity of individual substances, whereas organisms are exposed every day to thousands of substances in mixture, and the effects of these cocktails are far from negligible. This doctoral thesis therefore examined models for predicting the environmental risk that such cocktails pose to the aquatic environment. The main objective was to assess the ecological risk of the mixtures of chemical substances measured in Lake Geneva, and also to take a critical look at the methodologies used, in order to propose adaptations for a better estimation of the risk. In the first part of this work, the risk of mixtures of pesticides and pharmaceuticals for the Rhône and for Lake Geneva was established using approaches envisioned in particular in European legislation. These are screening approaches, that is, they provide a general assessment of mixture risk and highlight the most problematic substances, those contributing most to the toxicity of the mixture; in our case, essentially 4 pesticides. The study also shows that all substances, even in trace amounts, contribute to the effect of the mixture. This finding has implications for environmental management: all sources of pollutants must be reduced, not only the most problematic ones. The proposed approach nevertheless carries an important conceptual bias, which makes its use questionable beyond screening and would require an adaptation of the safety factors employed.
In a second part, the study focused on the use of mixture models in environmental risk calculation. Mixture models were developed and validated species by species, not for an assessment of the ecosystem as a whole. Their use should therefore proceed through a species-by-species calculation, which is rarely done owing to the lack of available ecotoxicological data. The aim was therefore to compare, using randomly generated values, the risk calculated by a rigorous species-by-species method with that calculated classically, where the models are applied to the whole community without accounting for inter-species variation. The results are similar in the majority of cases, which validates the traditionally used approach. This work nevertheless identified certain cases in which the classical application can lead to an under- or overestimation of the risk. Finally, the last part of this thesis examined the influence that cocktails of micropollutants may have had on communities in situ. A two-step approach was adopted. First, the toxicity of fourteen herbicides detected in Lake Geneva was determined. Over the period studied, 2004 to 2009, this herbicide toxicity decreased from 4% of species affected to less than 1%. The question was then whether this decrease in toxicity had an impact on the development of certain species within the algal community. Statistical analysis made it possible to isolate other factors that can influence the flora, such as water temperature or the presence of phosphates, and thus to identify which species turned out to have been influenced, positively or negatively, by the decrease of toxicity in the lake over time.
Interestingly, some of these species had already shown similar behavior in mesocosm studies. In conclusion, this work shows that robust models exist to predict the risk of micropollutant mixtures to aquatic species, and that they can be used to explain the role of these substances in ecosystem functioning. These models nonetheless have limits and underlying assumptions that must be considered when they are applied.
- For several years now, scientists as well as society have been concerned about the risks that organic micropollutants may pose to the aquatic environment. Indeed, several studies have shown the toxic effects these substances may induce on organisms living in our lakes and rivers, especially when they are exposed to acute or chronic concentrations. However, most of these studies focused on the toxicity of single compounds, i.e. considered individually. The same holds in the current European regulations concerning environmental risk assessment procedures for these substances. But aquatic organisms are typically exposed every day, simultaneously, to thousands of organic compounds, and the toxic effects resulting from these "cocktails" cannot be neglected. The ecological risk assessment of mixtures of such compounds therefore has to be addressed in the most reliable and appropriate way. In the first part of this thesis, the procedures currently envisioned for aquatic mixture risk assessment in European legislation are described. These methodologies are based on the mixture model of concentration addition and the use of predicted no-effect concentrations (PNEC) or effect concentrations (EC50) with assessment factors. These principal approaches were applied to two specific case studies, Lake Geneva and the River Rhône in Switzerland, including a discussion of the outcomes of such applications. These first-level assessments showed that the mixture risks for the studied cases rapidly exceeded the critical value, an exceedance generally due to two or three main substances.
The proposed procedures therefore allow the identification of the most problematic substances, for which management measures, such as a reduction of their entry into the aquatic environment, should be envisioned. However, it was also shown that the risk levels associated with mixtures of compounds are not negligible even without considering these main substances. Indeed, it is the sum of the substances that is problematic, which is more challenging in terms of risk management. Moreover, a lack of reliability in the procedures was highlighted, which can lead to contradictory results in terms of risk; this is linked to the inconsistency of the assessment factors applied in the different methods. In the second part of the thesis, the reliability of the more advanced procedures to predict the mixture effect on communities in the aquatic system was investigated. These established methodologies combine the model of concentration addition (CA) or response addition (RA) with species sensitivity distribution (SSD) curves. Indeed, the mixture effect predictions were shown to be consistent only when the mixture models are applied to a single species, and not to several species simultaneously aggregated into SSDs. Hence, a more stringent procedure for mixture risk assessment is proposed: to apply first the CA or RA models to each species separately and, in a second step, to combine the results to build an SSD for the mixture. Unfortunately, this methodology is not applicable in most cases, because it requires large data sets that are usually not available. Therefore, the differences between the two methodologies were studied with artificially created datasets to characterize the robustness of the traditional approach of applying the models to species sensitivity distributions.
The results showed that the use of CA directly on SSDs might lead to underestimations of the mixture concentration affecting 5% or 50% of species, especially when substances present a large standard deviation in their species sensitivity distribution. The application of RA can lead to over- or underestimates, depending mainly on the slope of the dose-response curves of the individual species. The potential underestimation with RA becomes important when the ratio between the EC50 and the EC10 of the dose-response curves of the species composing the SSD is smaller than 100. However, considering common real cases of ecotoxicity data for substances, the mixture risk calculated by applying the mixture models directly on SSDs remains consistent and would rather slightly overestimate the risk. These results can be used as a theoretical validation of the currently applied methodology. Nevertheless, when assessing the risk of mixtures with this classical methodology, one has to keep this source of error in mind, especially when the SSDs present a distribution of the data outside the range determined in this study. Finally, in the last part of this thesis, we confronted the mixture effect predictions with biological changes observed in the environment. In this study, long-term monitoring of a great European lake, Lake Geneva, provided the opportunity to assess to what extent the predicted toxicity of herbicide mixtures explains the changes in the composition of the phytoplankton community, next to other classical limnological parameters such as nutrients. To reach this goal, the gradient of the mixture toxicity of 14 herbicides regularly detected in the lake was calculated using the concentration addition and response addition models. A decreasing temporal gradient of toxicity was observed from 2004 to 2009.
Redundancy analysis and partial redundancy analysis showed that this gradient explains a significant portion of the variation in phytoplankton community composition, even after removing the effect of all other co-variables. Moreover, some species that were revealed to be influenced, positively or negatively, by the decrease of toxicity in the lake over time showed similar behaviors in mesocosm studies. It can be concluded that herbicide mixture toxicity is one of the key parameters explaining phytoplankton changes in Lake Geneva. To conclude, different methods exist to predict the risk of mixtures in ecosystems, but their reliability varies depending on the underlying hypotheses. One should therefore carefully consider these hypotheses, as well as the limits of the approaches, before using the results for environmental risk management.
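The two mixture models this thesis abstract relies on, concentration addition (CA) and response addition (RA), reduce to simple formulas: CA sums toxic units c_i/EC50_i, while RA combines independent effect fractions multiplicatively. The following sketch uses invented concentrations, EC50s, and effect fractions purely for illustration:

```python
def ca_toxic_units(conc, ec50):
    """Concentration addition: sum of toxic units c_i / EC50_i.
    A sum >= 1 predicts a mixture effect at the EC50 level."""
    return sum(c / e for c, e in zip(conc, ec50))

def ra_mixture_effect(effects):
    """Response addition (independent action): combined effect fraction
    E_mix = 1 - product over substances of (1 - E_i)."""
    prod = 1.0
    for e in effects:
        prod *= (1.0 - e)
    return 1.0 - prod

# Hypothetical mixture of three herbicides (concentrations and EC50s in ug/L)
conc = [0.5, 1.0, 2.0]
ec50 = [10.0, 20.0, 40.0]
print(ca_toxic_units(conc, ec50))          # 0.05 + 0.05 + 0.05 = 0.15

# Hypothetical individual effect fractions of the three substances
print(ra_mixture_effect([0.1, 0.2, 0.3]))  # 1 - 0.9*0.8*0.7 = 0.496
```

The species-by-species procedure discussed in the abstract would apply these functions once per species (with that species' own EC50s and dose-response curves) before aggregating the results into an SSD, rather than applying them to the SSD directly.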

Relevance: 30.00%

Abstract:

The liquid-liquid critical point scenario of water hypothesizes the existence of two metastable liquid phases, low-density liquid (LDL) and high-density liquid (HDL), deep within the supercooled region. The hypothesis originates from computer simulations of the ST2 water model, but the stability of the LDL phase with respect to the crystal is still being debated. We simulate supercooled ST2 water at constant pressure, constant temperature, and constant number of molecules N for N ≤ 729 and times up to 1 μs. We observe clear differences between the two liquids, both structural and dynamical. Using several methods, including finite-size scaling, we confirm the presence of a liquid-liquid phase transition ending in a critical point. We find that the LDL is stable with respect to the crystal in 98% of our runs (we perform 372 runs for LDL or LDL-like states), and in 100% of our runs for the two largest system sizes (N = 512 and 729, for which we perform 136 runs for LDL or LDL-like states). In all these runs, tiny crystallites grow and then melt within 1 μs. Only for N ≤ 343 do we observe six events (over 236 runs for LDL or LDL-like states) of spontaneous crystallization after crystallites reach an estimated critical size of about 70 ± 10 molecules.

Relevance: 30.00%

Abstract:

BACKGROUND: Suction-based wound healing devices with open-pore foam interfaces are widely used to treat complex tissue defects. The impact of changes in physicochemical parameters of the wound interfaces has not been investigated. METHODS: Full-thickness wounds in diabetic mice were treated with occlusive dressing or a suction device with a polyurethane foam interface varying in mean pore size diameter. Wound surface deformation on day 2 was measured on fixed tissues. Histologic cross-sections were analyzed for granulation tissue thickness (hematoxylin and eosin), myofibroblast density (α-smooth muscle actin), blood vessel density (platelet endothelial cell adhesion molecule-1), and cell proliferation (Ki67) on day 7. RESULTS: Polyurethane foam-induced wound surface deformation increased with polyurethane foam pore diameter: 15 percent (small pore size), 60 percent (medium pore size), and 150 percent (large pore size). The extent of wound strain correlated with granulation tissue thickness that increased 1.7-fold in small pore size foam-treated wounds, 2.5-fold in medium pore size foam-treated wounds, and 4.9-fold in large pore size foam-treated wounds (p < 0.05) compared with wounds treated with an occlusive dressing. All polyurethane foams increased the number of myofibroblasts over occlusive dressing, with maximal presence in large pore size foam-treated wounds compared with all other groups (p < 0.05). CONCLUSIONS: The pore size of the interface material of suction devices has a significant impact on the wound healing response. Larger pores increased wound surface strain, tissue growth, and transformation of contractile cells. Modification of the pore size is a powerful approach for meeting biological needs of specific wounds.

Relevance: 30.00%

Abstract:

PURPOSE: Hypertriglyceridemia (hyperTG) is common among intensive care unit (ICU) patients, but knowledge about hyperTG risk factors is scarce. The present study aims to identify risk factors favoring its development in patients requiring prolonged ICU treatment. METHODS: Prospective observational study in the medicosurgical ICU of a university teaching hospital. All consecutive patients staying ≥4 days were enrolled. Potential risk factors were recorded: pathology, energy intake, amount and type of nutritional lipids, intake of propofol, glucose intake, laboratory parameters, and drugs. Triglyceride (TG) levels were assessed three times weekly. Statistical analysis was based on two-way analysis of variance (ANOVA) and linear regression with potential risk factors. RESULTS: Out of 1,301 consecutive admissions, 220 patients were eligible, of whom 99 (45%) presented hyperTG (triglycerides >2 mmol/L). HyperTG patients were younger and heavier, with more brain injury and multiple trauma. Intake of propofol (mg/kg/h) and of its accompanying lipid emulsion had the highest correlations with plasma TG (r(2) = 0.28 and 0.26, respectively, both p < 0.001). Infection and inflammation were associated with development of hyperTG [C-reactive protein (CRP), r(2) = 0.19, p = 0.004]. No strong association could be found with nutritional lipids or other risk factors. Outcome was similar in normo- and hyperTG patients. CONCLUSIONS: HyperTG is frequent in the ICU but is not associated with adverse outcome. Propofol and its accompanying lipid emulsion are the strongest risk factors. Our results suggest that plasma TG should be monitored at least twice weekly in patients on propofol. The clinical consequences of propofol-related hyperTG should be investigated in further studies.

Relevance: 30.00%

Abstract:

BACKGROUND & AIMS: Despite the proven ability of immunization to reduce Helicobacter infection in mouse models, the precise mechanism of protection has remained elusive. This study explores the possibility that interleukin (IL)-17 plays a role in the reduction of Helicobacter infection following vaccination of wild-type animals, or in the spontaneous reduction of bacterial infection in IL-10-deficient mice. METHODS: In mice reducing Helicobacter infection, the levels and source of IL-17 were determined, and the role of IL-17 in the reduction of Helicobacter infection was probed with neutralizing antibodies. RESULTS: Gastric IL-17 levels were strongly increased in mice mucosally immunized with urease plus cholera toxin and challenged with Helicobacter felis, as compared with controls (654 +/- 455 vs 34 +/- 84 relative units for IL-17 messenger RNA expression [P < .01] and 6.9 +/- 8.4 vs 0.02 +/- 0.04 pg for IL-17 protein concentration [P < .01], respectively). Flow cytometry analysis showed that a peak of CD4(+)IL-17(+) T cells infiltrating the gastric mucosa occurred in immunized mice, in contrast to control mice (4.7% +/- 0.3% vs 1.4% +/- 0.3% [P < .01], respectively). Gastric mucosa-infiltrating CD4(+)IL-17(+) T cells were also observed in IL-10-deficient mice that spontaneously reduced H felis infection (4.3% +/- 2.3% vs 2% +/- 0.6% [P < .01] for infected and noninfected IL-10-deficient mice, respectively). In wild-type immunized mice, intraperitoneal injection of anti-IL-17 antibodies significantly inhibited inflammation and the reduction of Helicobacter infection in comparison with control antibodies (1 of 12 mice vs 9 of 12 mice reduced Helicobacter infection [P < .01], respectively). CONCLUSIONS: IL-17 plays a critical role in the immunization-induced reduction of Helicobacter infection in the gastric mucosa.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

INTRODUCTION: The aim of this study was to assess blood flow in the feet before and after lower limb revascularization using laser Doppler imaging (LDI). METHODS: Ten patients with critical lower limb ischemia were prospectively enrolled from June to October 2004. All patients underwent successful unilateral surgical interventions, including above-knee bypass, distal bypass, and endarterectomy. Skin blood flow (SBF) over the plantar surface of both forefeet and heels was measured by LDI 24 h before and 10 days after revascularization, expressed in perfusion units (PU), and reported as mean+/-SD. RESULTS: Measurements in the forefoot and heel were similar. Before revascularization, mean SBF was significantly lower in the ischemic foot (130+/-71 PU) than in the contralateral foot (212+/-68 PU), p<0.05. After revascularization, a significant increase of the SBF was observed on the treated side in both the forefoot (from 135+/-67 to 202+/-86 PU, p=0.001) and the hindfoot (from 148+/-58 to 203+/-83 PU, p=0.001). However, a large decrease of the SBF was seen in the forefoot and hindfoot on the untreated side (from 250+/-123 to 176+/-83 PU and from 208+/-116 to 133+/-40 PU, respectively, p=0.001). CONCLUSION: This study confirms the benefits of revascularization in patients with nonhealing foot lesions due to critical limb ischemia. A significant increase of the SBF was observed on the treated side. However, an unexpected decrease was observed on the untreated side.
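The before/after SBF comparison above is a paired design (each foot is its own control). A minimal sketch of the corresponding paired t statistic, using hypothetical perfusion-unit values rather than the study's data:

```python
import math

# Hypothetical forefoot SBF (perfusion units) in six patients,
# measured before and after revascularization. Illustrative only.
before = [120, 150, 100, 160, 140, 130]
after = [190, 210, 170, 230, 200, 195]

# Paired analysis works on the per-patient differences
diffs = [a - b for a, b in zip(after, before)]
n = len(diffs)
mean_d = sum(diffs) / n
var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance

# Paired t statistic with n - 1 degrees of freedom
t = mean_d / math.sqrt(var_d / n)

print(f"mean increase = {mean_d:.1f} PU, t = {t:.2f} (df = {n - 1})")
```

In practice one would compare t against the t distribution (e.g. with scipy.stats.ttest_rel) to obtain the p-values reported in the abstract; the manual computation is shown only to make the paired structure explicit.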

Relevância:

30.00% 30.00%

Publicador:

Resumo:

INTRODUCTION: Radiosurgery (RS) is gaining increasing acceptance in the upfront management of brain metastases (BM). It was initially used for so-called radioresistant metastases (melanoma, renal cell, sarcoma) because it allowed delivery of a higher dose to the tumor. Now, RS is also used for BM of other cancers. The high risk of developing new BM raises the question of whether associated whole-brain radiotherapy (WBRT) is needed. Recent evidence suggests that RS alone avoids the cognitive impairment related to WBRT, and that the latter should be reserved for salvage therapy. The increased use of RS for single and multiple BM thus raises new technical challenges for treatment delivery and dosimetry. We present our single-institution experience, focusing on the criteria that led to patients' selection for RS treatment with Gamma Knife (GK) in lieu of Linac. METHODS: A Leksell Gamma Knife Perfexion (Elekta, Sweden) was installed in July 2010. Currently, the Swiss federal health care system covers the costs of RS for BM with Linac but not with GK. Therefore, in our center, we always consider first the possibility of using Linac for this indication, and only select patients for GK in specific situations. All cases of BM treated with GK were retrospectively reviewed for the criteria leading to the GK indication, clinical information, and treatment data. Further work in progress includes an a posteriori dosimetry comparison with our Linac planning system (Brainscan V.5.3, Brainlab, Germany). RESULTS: From July 2010 to March 2012, 20 patients had RS for BM with GK (7 patients with single BM and 13 with multiple BM). During the same period, 31 had Linac-based RS. The primary tumor was melanoma in 9, lung in 7, renal in 2, and gastrointestinal tract in 2 patients.
In single BM, the reason for choosing GK was an anatomical location close to, or in, highly functional areas (1 motor cortex, 1 thalamic, 1 ventricular, 1 mesio-temporal, 3 deep cerebellar close to the brainstem), especially since most of these tumors were intended to be treated with high-dose RS (24 Gy at the margin) because of their histology (3 melanomas, 1 renal cell). In multiple BM, the reason for choosing GK in relation to the anatomical location of the lesions was either technical (limitations of Linac movements, especially in lower posterior fossa locations) or the closeness of multiple lesions to highly functional areas (typically, multiple posterior fossa BM close to the brainstem), precluding optimal dosimetry with Linac. Again, this was more critical for multiple BM requiring high-dose RS (6 melanoma, 2 hypernephroma). CONCLUSION: Radiosurgery for BM may represent a technical challenge in relation to the anatomical location and multiplicity of the lesions. These considerations may be accentuated for so-called radioresistant BM, when higher-dose RS is needed. In our experience, the Leksell Gamma Knife Perfexion proves useful in addressing these challenges in the treatment of BM.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Acid-sensing ion channels are members of the epithelial Na(+) channel/degenerin family. They are neuronal nonvoltage-gated Na(+) channels that are activated by extracellular acidification. In this study, we investigated the role of a highly conserved region of the extracellular part of ASIC1a that forms the contact between the finger domain, the adjacent beta-ball, and the upper palm domain in ASIC1a. The finger domain contributes to the pH-dependent gating and is linked via this contact zone to the rest of the protein. We found that mutation to Cys of residues in this region led to decreased channel expression and current amplitudes. Exposure of the engineered Cys residues to Cd(2+) or to charged methane thiosulfonate sulfhydryl reagents further reduced current amplitudes. This current inhibition was not due to changes in acid-sensing ion channel pH dependence or unitary conductance and was likely due to a decrease of the probability of channel opening. For some mutants, the effect of sulfhydryl reagents depended on the pH of exposure in the range 7.4 to 6.8, suggesting that this zone undergoes conformational changes during inactivation. Our study identifies a region in ASIC1a whose integrity is required for normal channel function.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Since the beginning of channel straightening at the turn of the century, the streams of western Iowa have degraded to 1.5 to 5 times their original depth. This vertical degradation is often accompanied by increases in channel width of 2 to 4 times the original widths. The deepening and widening of these streams has jeopardized the structural safety of many bridges by undercutting footings or pile caps, exposing considerable lengths of piling, and removing soil beneath and adjacent to abutments. Various types of flume and drop structures have been introduced in an effort to partially or totally stabilize these channels, protecting or replacing bridge structures. Although there has always been a need for economical grade-stabilization structures to stop stream channel degradation and protect highway bridges and culverts, the problem is especially critical at the present time due to rapidly increasing construction costs and decreasing revenues. Benefits derived from stabilization extend beyond the transportation sector to the agricultural sector, and increased public interest and attention are needed.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

BACKGROUND/AIMS: The present report examines a new pig model for progressive induction of high-grade stenosis, for the study of chronic myocardial ischemia and the dynamics of collateral vessel growth. METHODS: Thirty-nine Landrace pigs were instrumented with a novel experimental stent (GVD stent) in the left anterior descending coronary artery. Eight animals underwent transthoracic echocardiography at rest and under low-dose dobutamine. Seven animals were examined by nuclear PET and SPECT analysis. Epi-, mid- and endocardial fibrosis and the numbers of arterial vessels were examined by histology. RESULTS: Functional analysis showed a significant decrease in global left ventricular ejection fraction (24.5 +/- 1.6%) 3 weeks after implantation. There was a trend to increased left ventricular ejection fraction after low-dose dobutamine stress (36.0 +/- 6.6%) and a significant improvement of the impaired regional anterior wall motion. PET and SPECT imaging documented chronic hibernation. Myocardial fibrosis increased significantly in the ischemic area with a gradient from epi- to endocardial. The number of arterial vessels in the ischemic area increased and coronary angiography showed abundant collateral vessels of Rentrop class 1. CONCLUSION: The presented experimental model mimics the clinical situation of chronic myocardial ischemia secondary to 1-vessel coronary disease.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

There has been relatively little change over recent decades in the methods used in research on self-reported delinquency. Face-to-face interviews and self-administered interviews in the classroom are still the predominant alternatives envisaged. New methods have been brought into the picture by recent computer technology, the Internet, and the increasing availability of computer equipment and Internet access in schools. In the autumn of 2004, a controlled experiment was conducted with 1,203 students in Lausanne (Switzerland), in which "paper-and-pencil" questionnaires were compared with computer-assisted interviews through the Internet. The experiment included a test of two different definitions of the (same) reference period. After the introductory question ("Did you ever..."), students were asked how many times they had done it (or experienced it), if ever, "over the last 12 months" or "since the October 2003 vacation". Few significant differences were found between the results obtained by the two methods and for the two definitions of the reference period in the answers concerning victimisation, self-reported delinquency, drug use, and failure to respond (missing data). Students were found to be more motivated to respond through the Internet, to take less time filling out the questionnaire, and to be apparently more confident of privacy, while school principals were less reluctant to allow classes to be interviewed through the Internet. The Internet method also involves considerable cost reductions, which is a critical advantage if self-reported delinquency surveys are to become a routinely applied method of evaluation, particularly in countries with limited resources. On balance, the Internet may be instrumental in making research on self-reported delinquency far more feasible in situations where limited resources have so far prevented its implementation.
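A mode comparison of the kind described above (paper vs. Internet, e.g. on the rate of missing answers) typically reduces to a 2x2 chi-square test. A minimal sketch with hypothetical counts, not the Lausanne data:

```python
# Hypothetical 2x2 contingency table: survey mode vs. whether a key
# item was left unanswered. Counts are illustrative only.
#              missing   answered
# paper           30        570
# internet        25        578
table = [[30, 570], [25, 578]]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
grand = sum(row_totals)

# Pearson chi-square statistic (1 degree of freedom for a 2x2 table):
# sum over cells of (observed - expected)^2 / expected
chi2 = 0.0
for i, row in enumerate(table):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand
        chi2 += (obs - expected) ** 2 / expected

print(f"chi2 = {chi2:.3f}")
```

A chi2 below the 3.84 critical value (alpha = 0.05, df = 1) would correspond to the "few significant differences" outcome reported; in practice one would use scipy.stats.chi2_contingency, which also applies a continuity correction for 2x2 tables.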