185 results for feeding index (IAi)

at Université de Lausanne, Switzerland


Relevance: 30.00%

Abstract:

Near infrared spectroscopy (NIRS) is a non-invasive method of estimating the haemoglobin concentration changes in certain tissues. It is frequently used to monitor oxygenation of the brain in neonates. At present it is not clear whether near infrared spectroscopy of other organs (e.g. the liver as a corresponding site in the splanchnic region, which reacts very sensitively to haemodynamic instability) provides reliable values on their tissue oxygenation. The aim of the study was to test near infrared spectroscopy by measuring known physiologic changes in tissue oxygenation of the liver in newborn infants during and after feeding via a naso-gastric tube. The test-retest variability of such measurements was also determined. On 28 occasions in 25 infants we measured the tissue oxygenation index (TOI) of the liver and the brain continuously before, during and 30 minutes after feeding via a gastric tube. Simultaneously we measured arterial oxygen saturation (SaO2), heart rate (HR) and mean arterial blood pressure (MAP). In 10 other newborn infants we performed a test-retest analysis of the liver tissue oxygenation index to estimate the variability in repeated intra-individual measurements. The tissue oxygenation index of the liver increased significantly from 56.7 +/- 7.5% before to 60.3 +/- 5.6% after feeding (p < 0.005), and remained unchanged for the next 30 minutes. The tissue oxygenation index of the brain (62.1 +/- 9.7%), SaO2 (94.4 +/- 7.1%), heart rate (145 +/- 17.3 min-1) and mean arterial blood pressure (52.8 +/- 10.2 mm Hg) did not change significantly. The test-retest variability for intra-individual measurements was 2.7 +/- 2.1%. After bolus feeding the tissue oxygenation index of the liver increased as expected. This indicates that near infrared spectroscopy is suitable for monitoring changes in tissue oxygenation of the liver in newborn infants.

Relevance: 30.00%

Abstract:

OBJECTIVE: Gaining postpyloric access in ventilated, sedated ICU patients usually requires time-consuming procedures such as endoscopy. Recently, a feeding tube has been introduced that migrates spontaneously into the jejunum in surgical patients. The study aimed at assessing the rate of migration of this tube in critically ill patients. DESIGN: Prospective descriptive trial. SETTING: Surgical ICU in a tertiary University Hospital. PATIENTS: One hundred and five consecutive surgical ICU patients requiring enteral feeding were enrolled, resulting in 128 feeding-tube placement attempts. METHODS: A self-propelled tube was used and followed up for 3 days: progression was assessed by daily contrast-injected X-ray. Severity of illness was assessed with SAPS II and organ failure with the SOFA score. RESULTS: The patients were aged 55+/-19 years (mean+/-SD) with a SAPS II score of 45+/-18. In 12 of the 128 placement attempts, the tube could not be placed in the stomach; eight tubes were accidentally pulled out while in gastric position, owing to the need to avoid fixation during the progression phase. Among organ failures, respiratory failure predominated, followed by cardiovascular failure. By day 3, the postpyloric progression rate was 63/128 tubes (49%). There was no association between migration and age or SAPS II score, but the progression rate was significantly poorer in patients with hemodynamic failure. Use of norepinephrine and morphine was negatively associated with tube progression (P<0.001), while abdominal surgery was not. In ten patients, jejunal tubes were placed by endoscopy. CONCLUSION: Self-propelled feeding tubes progressed from the stomach to the postpyloric position in 49% of patients, reducing the number of endoscopic placements: these tubes may facilitate enteral nutrient delivery in the ICU.

Relevance: 30.00%

Abstract:

Abstract Introduction: Patients requiring prolonged intensive-care management and presenting a complicated course develop an intense metabolic response, generally characterized by hypermetabolism and protein catabolism. The severity of their illness exposes these patients to malnutrition, due mainly to insufficient nutritional intake and leading to a negative energy balance. In a large number of intensive care units, patient nutrition is not treated as a priority objective of care. By conducting a prospective observational study analyzing the relationship between energy balance and clinical outcome in patients with prolonged ICU stays, we wished to change this attitude and demonstrate the deleterious effect of malnutrition in this type of patient. Methods: Over a 2-year period, all patients whose ICU stay lasted 5 days or more were enrolled. Each patient's energy requirement was determined either by indirect calorimetry or by a weight-based formula (30 kcal/kg/day). The patients who underwent indirect calorimetry also served to verify the accuracy of the formula. Age, sex, preoperative weight, height, and body mass index (BMI) were recorded. Energy was delivered either in nutritional form (enteral, parenteral or combined nutrition) or in non-nutritional form (infusions: glucose solutions, non-nutritional lipid intake).
Nutrition data (theoretical target, prescribed target, nutritional energy, non-nutritional energy, total energy, nutritional energy balance, total energy balance) and clinical-course data (days of mechanical ventilation, number of infections, antibiotic use, length of stay, neurological, respiratory, gastrointestinal, cardiovascular, renal and hepatic complications, ICU severity scores, haematological, serum and microbiological values) were analyzed for each of the 669 ICU days contributed by a total of 48 patients. Results: 48 patients aged 57±16 years, with stays ranging from 5 to 49 days (admission diagnoses: multiple trauma 10; cardiac surgery 13; respiratory failure 7; gastrointestinal disease 3; sepsis 3; transplantation 4; other 8), were included. Although we could not demonstrate a relationship between energy balance (in particular, energy deficit) and mortality, there was a highly significant relationship between energy deficit and morbidity, namely complications and infections, which naturally prolong the length of stay. Moreover, although the study involved no intervention and we cannot claim a cause-and-effect relationship, multiple regression analysis showed that the most reliable prognostic factor was precisely the energy balance, outperforming the scores usually used in intensive care. Outcome was independent of age, sex, and preoperative nutritional status. The study did not collect economic data, so we cannot assert that the increased costs generated by a prolonged ICU stay are driven by an energy deficit, even though common sense suggests that a shorter stay costs less.
This study also draws attention to the origin of the energy deficit: it builds up during the first week in the ICU and could therefore be prevented by early nutritional intervention, whereas current recommendations advocate energy delivery as artificial nutrition only from 48 hours after ICU admission. Conclusions: The study shows that for the most severely ill ICU patients, energy balance should be considered an important objective of care, requiring an early nutrition protocol. Finally, since patients' course at admission is often unpredictable and the deficit develops from the first week, it is legitimate to ask whether this protocol should be applied to all ICU patients from admission onwards. Summary Background and aims: Critically ill patients with a complicated evolution are frequently hypermetabolic, catabolic, and at risk of underfeeding. The study aimed at assessing the relationship between energy balance and outcome in critically ill patients. Methods: Prospective observational study conducted in consecutive patients staying 5 days or longer in the surgical ICU of a University hospital. Demographic data, time to feeding, route, energy delivery, and outcome were recorded. Energy balance was calculated as energy delivery minus target. Data are means±SD; linear regressions between energy balance and outcome variables. Results: Forty-eight patients aged 57±16 years were investigated; complete data are available for 669 days. Mechanical ventilation lasted 11±8 days, ICU stay was 15±9 days, and 30-day mortality was 38%. Time to feeding was 3.1±2.2 days. Enteral nutrition was the most frequent route, with 433 days. Mean daily energy delivery was 1090±930 kcal. Combining enteral and parenteral nutrition achieved the highest energy delivery.
Cumulated energy balance was -12,600±10,520 kcal and correlated with complications (P<0.001), already after 1 week. Conclusion: Negative energy balances were correlated with an increasing number of complications, particularly infections. Energy debt appears to be a promising tool for nutritional follow-up and should be further tested. Delaying the initiation of nutritional support exposes patients to energy deficits that cannot be compensated later on.
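The study's central quantity, energy balance, is defined as energy delivery minus target, with the target approximated by the 30 kcal/kg/day formula from the methods. A minimal sketch of the daily and cumulated balance (the patient data are hypothetical, for illustration only):

```python
def daily_energy_balance(delivered_kcal, weight_kg, target_kcal_per_kg=30):
    """Daily balance = energy actually delivered minus the target,
    the target being approximated by the study's 30 kcal/kg/day formula."""
    return delivered_kcal - target_kcal_per_kg * weight_kg

def cumulated_energy_balance(daily_deliveries_kcal, weight_kg):
    """Cumulated balance over a stay: the sum of daily balances.
    A persistently negative sum is the 'energy debt' discussed above."""
    return sum(daily_energy_balance(d, weight_kg) for d in daily_deliveries_kcal)

# Hypothetical 70 kg patient over the first 5 ICU days (kcal delivered).
debt = cumulated_energy_balance([0, 400, 900, 1300, 1800], 70)
```

Even with feeding ramped up by day 5, the deficit accumulated early is never repaid, which is the pattern the abstract describes.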

Relevance: 20.00%

Abstract:

BACKGROUND: The WOSI (Western Ontario Shoulder Instability Index) is a self-administered quality of life questionnaire designed to be used as a primary outcome measure in clinical trials on shoulder instability, as well as to measure the effect of an intervention on any particular patient. It is validated and is reliable and sensitive. As it is designed to measure subjective outcome, it is important that translation should be methodologically rigorous, as it is subject to both linguistic and cultural interpretation. OBJECTIVE: To produce a French language version of the WOSI that is culturally adapted to both European and North American French-speaking populations. MATERIALS AND METHODS: A validated protocol was used to create a French language WOSI questionnaire (WOSI-Fr) that would be culturally acceptable for both European and North American French-speaking populations. Reliability and responsiveness analyses were carried out, and the WOSI-Fr was compared to the F-QuickDASH-D/S (Disability of the Arm, Shoulder and Hand-French translation) and Walch-Duplay scores. RESULTS: A French language version of the WOSI (WOSI-Fr) was accepted by a multinational committee. The WOSI-Fr was then validated using a total of 144 native French-speaking subjects from Canada and Switzerland. Comparison of results on two WOSI-Fr questionnaires completed at a mean interval of 16 days showed that the WOSI-Fr had strong reliability, with a Pearson correlation of r=0.85 (P=0.01) and an intraclass correlation of ICC=0.84 [95% CI=0.78-0.88]. Responsiveness, at a mean 378.9 days after surgical intervention, showed strong correlation with that of the F-QuickDASH-D/S, with r=0.67 (P<0.01). Moreover, a standardized response means analysis to calculate effect size for both the WOSI-Fr and the F-QuickDASH-D/S showed that the WOSI-Fr had a significantly greater ability to detect change (SRM 1.55 versus 0.87 for the WOSI-Fr and F-QuickDASH-D/S respectively, P<0.01).
The WOSI-Fr showed fair correlation with the Walch-Duplay. DISCUSSION: A French-language translation of the WOSI questionnaire was created and validated for use in both Canadian and Swiss French-speaking populations. This questionnaire will facilitate outcome assessment in French-speaking settings, collaboration in multinational studies and comparison between studies performed in different countries. TYPE OF STUDY: Multicenter cohort study. LEVEL OF EVIDENCE: II.

Relevance: 20.00%

Abstract:

Background/Purpose: The trabecular bone score (TBS), a novel gray-level texture index determined from lumbar spine DXA scans, correlates with 3D parameters of trabecular bone microarchitecture known to predict fracture. TBS may enhance the identification of patients at increased risk for vertebral fracture independently of bone mineral density (BMD) (Boutroy JBMR 2010; Hans JBMR 2011). Denosumab treatment for 36 months decreased bone turnover, increased BMD, and reduced new vertebral fractures in postmenopausal women with osteoporosis (Cummings NEJM 2009). We explored the effect of denosumab on TBS over 36 months and evaluated the association between TBS and lumbar spine BMD in women who had DXA scans obtained from eligible scanners for TBS evaluation in FREEDOM. Methods: FREEDOM was a 3-year, randomized, double-blind trial that enrolled postmenopausal women with a lumbar spine or total hip DXA T-score ≤ -2.5, but not ≤ -4.0, at both sites. Women received placebo or 60 mg denosumab every 6 months. A subset of women in FREEDOM participated in a DXA substudy in which lumbar spine DXA scans were obtained at baseline and months 1, 6, 12, 24, and 36. We retrospectively applied, in a blinded-to-treatment manner, a novel software program (TBS iNsight v1.9, Med-Imaps, Pessac, France) to the standard lumbar spine DXA scans obtained in these women to determine their TBS indices at baseline and months 12, 24, and 36. From previous studies, a TBS ≥ 1.35 is considered normal microarchitecture, a TBS between 1.35 and 1.20 partially deteriorated microarchitecture, and a TBS ≤ 1.20 degraded microarchitecture. Results: There were 285 women (128 placebo, 157 denosumab) with a TBS value at baseline and ≥ 1 post-baseline visit. Their mean age was 73, their mean lumbar spine BMD T-score was -2.79, and their mean lumbar spine TBS was 1.20.
In addition to the robust gains in DXA lumbar spine BMD observed with denosumab (9.8% at month 36), there were consistent, progressive, and significant increases in TBS compared with placebo and baseline (Table & Figure). BMD explained a very small fraction of the variance in TBS at baseline (r² = 0.07). In addition, the variance in the TBS change was largely unrelated to BMD change, whether expressed in absolute or percentage changes, regardless of treatment, throughout the study (all r² ≤ 0.06), indicating that TBS provides distinct information, independently of BMD. Conclusion: In postmenopausal women with osteoporosis, denosumab significantly improved TBS, an index of lumbar spine trabecular microarchitecture, independently of BMD.
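The three TBS ranges cited from previous studies (normal, partially deteriorated, degraded microarchitecture) amount to a simple threshold classifier. A sketch, assuming the conventional cutoffs of 1.35 and 1.20; how exact boundary values are assigned is our assumption, not specified by the study:

```python
def classify_tbs(tbs):
    """Map a trabecular bone score to the three microarchitecture
    categories used in the abstract. Cutoffs 1.35 and 1.20 are the
    conventional ones; boundary handling is our assumption."""
    if tbs >= 1.35:
        return "normal"
    if tbs > 1.20:
        return "partially deteriorated"
    return "degraded"
```

By this scheme the cohort's mean baseline TBS of 1.20 sits at the degraded-microarchitecture boundary.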

Relevance: 20.00%

Abstract:

Comment on: Prospective Studies Collaboration, Whitlock G, Lewington S et al. Body-mass index and cause-specific mortality in 900 000 adults: collaborative analyses of 57 prospective studies. Lancet. 2009;373(9669):1083-96. PMID: 19299006.

Relevance: 20.00%

Abstract:

Peripheral artery disease of the lower limbs (PAD; French: AOMI) is a marker of systemic atherosclerosis and is associated with substantial morbidity and mortality. History-taking and clinical examination have very low sensitivity for the diagnosis of PAD, which is why its true prevalence is largely underestimated. Measurement of the ankle-brachial index (ABI) is the examination of choice for PAD screening. It is a reliable, quick and inexpensive test that can easily be learned and used by the primary care physician. Once the diagnosis of PAD is made, as with other cardiovascular diseases, pharmacological and non-pharmacological measures must be put in place to modify cardiovascular risk factors, slow disease progression and prevent complications.

Relevance: 20.00%

Abstract:

Background and Aims: The international EEsAI study group is currently developing the first activity index (EEsAI) specific for Eosinophilic Esophagitis (EoE). Goal: To develop, evaluate and validate the EEsAI. Methods: The development comprises three phases: 1. Selection of candidate items; 2. Evaluation of the activity index in a first patient cohort; and 3. Validation in a second EoE patient cohort. Focus group interviews with patients were used in phase 1 to generate patient reported outcomes (PRO) according to guidelines of regulatory authorities (FDA and EMA), whereas the selection of biologic items was developed by Delphi rounds of international EoE experts from Europe and North America. Results: The EEsAI has a modular composition to assess the following components of EoE activity: patient reported outcomes, endoscopic activity, histologic activity, laboratory activity, and quality of life. Definitions for all aspects of endoscopic and histologic appearance were established by consensus rounds among EoE experts. Symptom assessment tools were created that take into account different food consistencies as well as food avoidance and specific processing strategies. The EEsAI has been evaluated in a cohort of adult EoE patients since March 2011. Conclusions: After successful validation, the EEsAI will allow standardized outcome assessment in EoE trials, which will likely lead to its wide applicability.

Relevance: 20.00%

Abstract:

Background: Leptin is produced primarily by adipocytes. Although originally associated with the central regulation of satiety and energy metabolism, increasing evidence indicates that leptin may be an important factor in congestive heart failure (CHF). In this study, we aimed to test the hypothesis that leptin may influence CHF pathophysiology via a pathway of increasing body mass index (BMI). Methods: We studied 2,389 elderly participants aged 70 and older (M: 1,161; F: 1,228) without CHF and with serum leptin measures from the Health, Aging and Body Composition study. We analyzed the association between serum leptin level and risk of incident CHF using Cox proportional hazards regression models. Elevated leptin level was defined as more than the highest quartile (Q4) of the leptin distribution in the total sample for each gender. Adjusted covariates included demographic, behavior, lipid and inflammation variables (partially adjusted models), and further included BMI (fully adjusted models). Results: Over a mean 9-year follow-up, 316 participants (13.2%) developed CHF. The partially adjusted models indicated that men and women with elevated serum leptin levels (>=9.89 ng/ml in men and >=25 ng/ml in women) had significantly higher risks of developing CHF than those with leptin levels below Q4. The adjusted hazard ratios (95%CI) for incident CHF were 1.49 (1.04-2.13) in men and 1.71 (1.12-2.58) in women. However, these associations became non-significant after additionally adjusting for BMI in each gender. The fully adjusted hazard ratios (95%CI) were 1.43 (0.94-2.18) in men and 1.24 (0.77-1.99) in women. Conclusion: Subjects with elevated leptin levels have a higher risk of CHF. The study supports the hypothesis that the influence of leptin level on risk of CHF may be through a pathway related to increasing BMI.
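The study's exposure definition, "elevated" leptin above the gender-specific highest quartile (Q4), can be sketched with the standard library's quantile function; the cohort values below are hypothetical, for illustration only:

```python
from statistics import quantiles

def q4_cutoff(cohort_values):
    """Upper-quartile (75th percentile) cutoff of a cohort's leptin
    distribution; the study flags values above it as 'elevated'."""
    return quantiles(cohort_values, n=4)[-1]

def has_elevated_leptin(value, cohort_values):
    """True when a serum leptin value exceeds the cohort's Q4 cutoff."""
    return value > q4_cutoff(cohort_values)

# Hypothetical serum leptin values (ng/ml) for a small male cohort.
men = [3.1, 4.8, 5.5, 6.2, 7.0, 8.4, 9.9, 12.5]
cutoff = q4_cutoff(men)
```

Note that `statistics.quantiles` defaults to the exclusive method; the study does not state which quantile estimator it used, so the exact cutoff is assumption-dependent.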

Relevance: 20.00%

Abstract:

Odds ratios for head and neck cancer increase with greater cigarette and alcohol use and lower body mass index (BMI; weight (kg)/height² (m²)). Using data from the International Head and Neck Cancer Epidemiology Consortium, the authors conducted a formal analysis of BMI as a modifier of smoking- and alcohol-related effects. Analysis of never and current smokers included 6,333 cases, while analysis of never drinkers and consumers of ≤10 drinks/day included 8,452 cases. There were 8,000 or more controls, depending on the analysis. Odds ratios for all sites increased with lower BMI, greater smoking, and greater drinking. In polytomous regression, odds ratios for BMI (P = 0.65), smoking (P = 0.52), and drinking (P = 0.73) were homogeneous for oral cavity and pharyngeal cancers. Odds ratios for BMI and drinking were greater for oral cavity/pharyngeal cancer (P < 0.01), while smoking odds ratios were greater for laryngeal cancer (P < 0.01). Lower BMI enhanced smoking- and drinking-related odds ratios for oral cavity/pharyngeal cancer (P < 0.01), while BMI did not modify smoking and drinking odds ratios for laryngeal cancer. The increased odds ratios for all sites with low BMI may suggest related carcinogenic mechanisms; however, BMI modification of smoking and drinking odds ratios for cancer of the oral cavity/pharynx but not larynx cancer suggests additional factors specific to oral cavity/pharynx cancer.
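The BMI definition used above, weight (kg) divided by height squared (m²), is a one-line formula; a minimal sketch with an illustrative worked value:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height squared (m^2),
    exactly as defined in the abstract."""
    return weight_kg / height_m ** 2

# Illustrative example: a 70 kg person 1.75 m tall.
example_bmi = bmi(70, 1.75)  # about 22.9 kg/m^2
```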

Relevance: 20.00%

Abstract:

The aim of this study was to determine whether breath 13CO2 measurements could be used to assess compliance with a diet containing carbohydrates naturally enriched in 13C. The study was divided into two periods: Period 1 (baseline, 4 days), a diet with low-13C/12C-ratio carbohydrates; and Period 2 (5 days), an isocaloric diet with high-13C/12C-ratio carbohydrates (corn, cane sugar, pineapple, millet). Respiratory gas exchange (by indirect calorimetry), urinary nitrogen excretion and breath 13CO2 were measured every morning in post-absorptive conditions, both at rest and during 45 minutes of low-intensity exercise (walking on a treadmill). The subjects were 10 healthy lean women (BMI 20.4 +/- 1.7 kg/m2, % body fat 24.4 +/- 1.3%); the 13C enrichment of oxidized carbohydrate and breath 13CO2 was compared to the enrichment of exogenous dietary carbohydrates. At rest, the enrichment of oxidized carbohydrate increased significantly after one day of the 13C-carbohydrate-enriched diet and reached a steady value (103 +/- 16%) similar to the enrichment of exogenous carbohydrates. During exercise, the 13C enrichment of oxidized carbohydrate remained significantly lower (68 +/- 17%) than that of dietary carbohydrates. Compliance with a diet with a high content of carbohydrates naturally enriched in 13C may therefore be assessed from the measurement of breath 13CO2 enrichment combined with respiratory gas exchange in resting, post-absorptive conditions.

Relevance: 20.00%

Abstract:

BACKGROUND: The correlation between noninvasive markers with endoscopic activity according to the modified Baron Index in patients with ulcerative colitis (UC) is unknown. We aimed to evaluate the correlation between endoscopic activity and fecal calprotectin (FC), C-reactive protein (CRP), hemoglobin, platelets, blood leukocytes, and the Lichtiger Index (clinical score). METHODS: UC patients undergoing complete colonoscopy were prospectively enrolled and scored clinically and endoscopically. Samples from feces and blood were analyzed in UC patients and controls. RESULTS: We enrolled 228 UC patients and 52 healthy controls. Endoscopic disease activity correlated best with FC (Spearman's rank correlation coefficient r = 0.821), followed by the Lichtiger Index (r = 0.682), CRP (r = 0.556), platelets (r = 0.488), blood leukocytes (r = 0.401), and hemoglobin (r = -0.388). FC was the only marker that could discriminate between different grades of endoscopic activity (grade 0, 16 [10-30] μg/g; grade 1, 35 [25-48] μg/g; grade 2, 102 [44-159] μg/g; grade 3, 235 [176-319] μg/g; grade 4, 611 [406-868] μg/g; P < 0.001 for discriminating the different grades). FC with a cutoff of 57 μg/g had a sensitivity of 91% and a specificity of 90% to detect endoscopically active disease (modified Baron Index ≥ 2). CONCLUSIONS: FC correlated better with endoscopic disease activity than clinical activity, CRP, platelets, hemoglobin, and blood leukocytes. The strong correlation with endoscopic disease activity suggests that FC represents a useful biomarker for noninvasive monitoring of disease activity in UC patients.
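The reported sensitivity of 91% and specificity of 90% come from comparing the FC-based call at the 57 μg/g cutoff against endoscopy (modified Baron Index ≥ 2). A minimal sketch of that comparison, with hypothetical patient data; whether a value exactly at the cutoff counts as positive is our assumption:

```python
def fc_predicts_active(fc_ug_per_g, cutoff=57):
    """Call endoscopically active disease (modified Baron Index >= 2)
    when fecal calprotectin reaches the reported 57 ug/g cutoff.
    Treating the cutoff value itself as positive is our assumption."""
    return fc_ug_per_g >= cutoff

def sensitivity_specificity(fc_values, endoscopically_active):
    """Sensitivity and specificity of the FC-based call against the
    endoscopic reference standard."""
    tp = fp = tn = fn = 0
    for fc, active in zip(fc_values, endoscopically_active):
        pred = fc_predicts_active(fc)
        if active and pred:
            tp += 1
        elif active:
            fn += 1
        elif pred:
            fp += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical FC values (ug/g) and endoscopic findings, illustration only.
sens, spec = sensitivity_specificity(
    [10, 30, 60, 200, 500, 40],
    [False, False, True, True, True, True],
)
```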

Relevance: 20.00%

Abstract:

BACKGROUND: Many emergency department (ED) providers do not follow guideline recommendations for the use of the pneumonia severity index (PSI) to determine the initial site of treatment for patients with community-acquired pneumonia (CAP). We identified the reasons why ED providers hospitalize low-risk patients or manage higher-risk patients as outpatients. METHODS: As a part of a trial to implement a PSI-based guideline for the initial site of treatment of patients with CAP, we analyzed data for patients managed at 12 EDs allocated to a high-intensity guideline implementation strategy study arm. The guideline recommended outpatient care for low-risk patients (nonhypoxemic patients with a PSI risk classification of I, II, or III) and hospitalization for higher-risk patients (hypoxemic patients or patients with a PSI risk classification of IV or V). We asked providers who made guideline-discordant decisions on site of treatment to detail the reasons for nonadherence to guideline recommendations. RESULTS: There were 1,306 patients with CAP (689 low-risk patients and 617 higher-risk patients). Among these patients, physicians admitted 258 (37.4%) of 689 low-risk patients and treated 20 (3.2%) of 617 higher-risk patients as outpatients. The most commonly reported reasons for admitting low-risk patients were the presence of a comorbid illness (178 [71.5%] of 249 patients); a laboratory value, vital sign, or symptom that precluded ED discharge (73 patients [29.3%]); or a recommendation from a primary care or a consulting physician (48 patients [19.3%]). Higher-risk patients were most often treated as outpatients because of a recommendation by a primary care or consulting physician (6 [40.0%] of 15 patients). CONCLUSION: ED providers hospitalize many low-risk patients with CAP, most frequently for a comorbid illness. Although higher-risk patients are infrequently treated as outpatients, this decision is often based on the request of an involved physician.

Relevance: 20.00%

Abstract:

Soil pseudomonads increase their competitiveness by producing toxic secondary metabolites, which inhibit competitors and repel predators. Toxin production is regulated by cell-cell signalling and efficiently protects the bacterial population. However, cell communication is unstable, and natural populations often contain signal-blind mutants displaying an altered phenotype defective in exoproduct synthesis. Such mutants are weak competitors, and we hypothesized that their fitness in natural communities depends on the exoproducts of wild-type bacteria, especially defence toxins. We established mixed populations of wild-type and signal-blind, non-toxic gacS-deficient mutants of Pseudomonas fluorescens CHA0 in batch and rhizosphere systems. Bacteria were grazed by representatives of the most important bacterial predators in soil, nematodes (Caenorhabditis elegans) and protozoa (Acanthamoeba castellanii). The gacS mutants showed a negative frequency-dependent fitness and could reach up to one-third of the population, suggesting that they rely on the exoproducts of the wild-type bacteria. Both predators preferentially consumed the mutant strain, but populations with a low mutant load were resistant to predation, allowing the mutant to remain competitive at low relative density. The results suggest that signal-blind Pseudomonas increase their fitness by exploiting the toxins produced by wild-type bacteria, and that predation promotes the production of bacterial defence compounds by selectively eliminating non-toxic mutants. Therefore, predators not only regulate population dynamics of soil bacteria but also structure the genetic and phenotypic constitution of bacterial communities.

Relevance: 20.00%

Abstract:

Practice guidelines recommend outpatient care for selected patients with non-massive pulmonary embolism (PE), but fail to specify how these low-risk patients should be identified. Using data from U.S. patients, we previously derived the Pulmonary Embolism Severity Index (PESI), a prediction rule that risk stratifies patients with PE. We sought to validate the PESI in a European patient cohort. We prospectively validated the PESI in patients with PE diagnosed at six emergency departments in three European countries. We used baseline data for the rule's 11 prognostic variables to stratify patients into five risk classes (I-V) of increasing probability of mortality. The outcome was overall mortality at 90 days after presentation. To assess the accuracy of the PESI to predict mortality, we estimated the sensitivity, specificity, and predictive values for low- (risk classes I/II) versus higher-risk patients (risk classes III-V), and the discriminatory power using the area under the receiver operating characteristic (ROC) curve. Among 357 patients with PE, overall mortality was 5.9%, ranging from 0% in class I to 17.9% in class V. The 186 (52%) low-risk patients had an overall mortality of 1.1% (95% confidence interval [CI]: 0.1-3.8%) compared to 11.1% (95% CI: 6.8-16.8%) in the 171 (48%) higher-risk patients. The PESI had a high sensitivity (91%, 95% CI: 71-97%) and a negative predictive value (99%, 95% CI: 96-100%) for predicting mortality. The area under the ROC curve was 0.78 (95% CI: 0.70-0.86). The PESI reliably identifies patients with PE who are at low risk of death and who are potential candidates for outpatient care. The PESI may help physicians make more rational decisions about hospitalization for patients with PE.
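The validation above dichotomizes the PESI's five risk classes into low risk (I/II) versus higher risk (III-V) and then evaluates the low-risk call against 90-day mortality. A minimal sketch of the negative-predictive-value computation, using hypothetical class assignments and outcomes (the PESI's 11 prognostic variables and class boundaries are not reproduced here):

```python
def is_low_risk(pesi_class):
    """PESI risk classes I and II are 'low risk'; III-V are 'higher risk'."""
    return pesi_class in (1, 2)

def npv_for_mortality(pesi_classes, died):
    """Negative predictive value of a low-risk classification:
    the fraction of low-risk patients who survived follow-up."""
    low_risk_outcomes = [d for c, d in zip(pesi_classes, died) if is_low_risk(c)]
    return sum(not d for d in low_risk_outcomes) / len(low_risk_outcomes)

# Hypothetical cohort: classes I-V encoded 1-5, died = 90-day mortality.
npv = npv_for_mortality([1, 2, 2, 3, 4, 1], [False, False, True, True, True, False])
```

In the study itself this NPV was 99% (95% CI: 96-100%), which is what supports selecting class I/II patients as candidates for outpatient care.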