959 results for Central mortality rate


Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND Many orthopaedic surgical procedures can be performed with either regional or general anesthesia. We hypothesized that total hip arthroplasty with regional anesthesia is associated with less postoperative morbidity and mortality than total hip arthroplasty with general anesthesia. METHODS This retrospective propensity-matched cohort study utilizing the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) database included patients who had undergone total hip arthroplasty from 2007 through 2011. After matching, logistic regression was used to determine the association between the type of anesthesia and deep surgical site infections, hospital length of stay, thirty-day mortality, and cardiovascular and pulmonary complications. RESULTS Of 12,929 surgical procedures, 5103 (39.5%) were performed with regional anesthesia. The adjusted odds for deep surgical site infections were significantly lower in the regional anesthesia group than in the general anesthesia group (odds ratio [OR] = 0.38; 95% confidence interval [CI] = 0.20 to 0.72; p < 0.01). The hospital length of stay (geometric mean) was decreased by 5% (95% CI = 3% to 7%; p < 0.001) with regional anesthesia, which translates to 0.17 day for each total hip arthroplasty. Regional anesthesia was also associated with a 27% decrease in the odds of prolonged hospitalization (OR = 0.73; 95% CI = 0.68 to 0.89; p < 0.001). The mortality rate was not significantly lower with regional anesthesia (OR = 0.78; 95% CI = 0.43 to 1.42; p > 0.05). The adjusted odds for cardiovascular complications (OR = 0.61; 95% CI = 0.44 to 0.85) and respiratory complications (OR = 0.51; 95% CI = 0.33 to 0.81) were both lower in the regional anesthesia group.
CONCLUSIONS Compared with general anesthesia, regional anesthesia for total hip arthroplasty was associated with a reduction in deep surgical site infection rates, hospital length of stay, and rates of postoperative cardiovascular and pulmonary complications. These findings could have an important medical and economic impact on health-care practice.
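The odds ratios above come from logistic regression on a propensity-matched cohort; the basic arithmetic behind an unadjusted odds ratio and its Wald confidence interval can be sketched as follows. The counts in the example are hypothetical, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a/b = events/non-events in the exposed group,
    c/d = events/non-events in the comparison group."""
    or_ = (a / b) / (c / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts (NOT from the study): 12 deep infections among
# 5103 regional-anesthesia cases vs. 31 among 7826 general-anesthesia cases.
or_, lo, hi = odds_ratio_ci(12, 5103 - 12, 31, 7826 - 31)
```

The study's reported ORs are adjusted estimates from the regression model, so they would not be reproduced exactly by this crude 2x2 calculation.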


INTRODUCTION Faecal peritonitis (FP) is a common cause of sepsis and admission to the intensive care unit (ICU). The Genetics of Sepsis and Septic Shock in Europe (GenOSept) project is investigating the influence of genetic variation on the host response and outcomes in a large cohort of patients with sepsis admitted to ICUs across Europe. Here we report an epidemiological survey of the subset of patients with FP. OBJECTIVES To define the clinical characteristics, outcomes and risk factors for mortality in patients with FP admitted to ICUs across Europe. METHODS Data was extracted from electronic case report forms. Phenotypic data was recorded using a detailed, quality-assured clinical database. The primary outcome measure was 6-month mortality. Patients were followed for 6 months. Kaplan-Meier analysis was used to determine mortality rates. Cox proportional hazards regression analysis was employed to identify independent risk factors for mortality. RESULTS Data for 977 FP patients admitted to 102 centres across 16 countries between 29 September 2005 and 5 January 2011 was extracted. The median age was 69.2 years (IQR 58.3-77.1), with a male preponderance (54.3%). The most common causes of FP were perforated diverticular disease (32.1%) and surgical anastomotic breakdown (31.1%). The ICU mortality rate at 28 days was 19.1%, increasing to 31.6% at 6 months. The cause of FP, pre-existing co-morbidities and time from estimated onset of symptoms to surgery did not impact on survival. The strongest independent risk factors associated with an increased rate of death at 6 months included age, higher APACHE II score, acute renal and cardiovascular dysfunction within 1 week of admission to ICU, hypothermia, lower haematocrit and bradycardia on day 1 of ICU stay. CONCLUSIONS In this large cohort of patients admitted to European ICUs with FP the 6 month mortality was 31.6%. 
The most consistent predictors of mortality across all time points were increased age, development of acute renal dysfunction during the first week of admission, lower haematocrit and hypothermia on day 1 of ICU admission.
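The Kaplan-Meier estimation used for the mortality rates above can be illustrated with a minimal estimator. This is a sketch on toy data (not study data); it assumes distinct event times and does not handle ties between deaths and censorings:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve from follow-up times and event
    indicators (1 = death observed, 0 = censored). Minimal sketch:
    assumes distinct event times (ties are not handled)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    curve = []
    for i in order:
        if events[i] == 1:
            # At each observed death, survival drops by (at_risk-1)/at_risk.
            s *= (at_risk - 1) / at_risk
            curve.append((times[i], s))
        at_risk -= 1
    return curve

# Toy data: deaths at t=1, t=2, and t=4; one patient censored at t=3.
# Survival steps to 0.75 after t=1, 0.5 after t=2, 0.0 after t=4.
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

Cox proportional hazards regression, used in the study for the risk-factor analysis, builds on the same risk-set bookkeeping but models covariate effects on the hazard.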


BACKGROUND Pulmonary hypertension (PH) frequently coexists with severe aortic stenosis, and PH severity has been shown to predict outcomes after transcatheter aortic valve implantation (TAVI). The effect of PH hemodynamic presentation on clinical outcomes after TAVI is unknown. METHODS AND RESULTS Of 606 consecutive patients undergoing TAVI, 433 (71.4%) patients with severe aortic stenosis and a preprocedural right heart catheterization were assessed. Patients were dichotomized according to whether PH was present (mean pulmonary artery pressure, ≥25 mm Hg; n=325) or not (n=108). Patients with PH were further dichotomized by left ventricular end-diastolic pressure into postcapillary (left ventricular end-diastolic pressure, >15 mm Hg; n=269) and precapillary groups (left ventricular end-diastolic pressure, ≤15 mm Hg; n=56). Finally, patients with postcapillary PH were divided into isolated (n=220) and combined (n=49) subgroups according to whether the diastolic pressure difference (diastolic pulmonary artery pressure minus left ventricular end-diastolic pressure) was normal (<7 mm Hg) or elevated (≥7 mm Hg). The primary end point was mortality at 1 year. PH was present in 325 of 433 (75%) patients and was predominantly postcapillary (n=269/325; 82%). Compared with baseline, systolic pulmonary artery pressure improved immediately after TAVI in patients with combined postcapillary PH (57.8±14.1 versus 50.4±17.3 mm Hg; P=0.015) but not in those with precapillary PH (49.0±12.6 versus 51.6±14.3 mm Hg; P=0.36). When compared with no PH, a higher 1-year mortality rate was observed in both precapillary (hazard ratio, 2.30; 95% confidence interval, 1.02-5.22; P=0.046) and combined (hazard ratio, 3.15; 95% confidence interval, 1.43-6.93; P=0.004) but not isolated PH patients (P=0.11). After adjustment, combined PH remained a strong predictor of 1-year mortality after TAVI (hazard ratio, 3.28; P=0.005).
CONCLUSIONS Invasive stratification of PH according to hemodynamic presentation predicts acute response to treatment and 1-year mortality after TAVI.
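The hemodynamic stratification described above (mean PAP ≥25 mm Hg defines PH; LVEDP >15 mm Hg separates post- from precapillary; a diastolic pressure difference ≥7 mm Hg marks combined postcapillary PH) can be written as a small classification function. The function name and return labels are illustrative, not from the study:

```python
def classify_ph(mean_pap, lvedp, diastolic_pap):
    """Hemodynamic PH classification using the study's cut-offs
    (all pressures in mm Hg); names and labels are illustrative.
    mean_pap      : mean pulmonary artery pressure
    lvedp         : left ventricular end-diastolic pressure
    diastolic_pap : diastolic pulmonary artery pressure"""
    if mean_pap < 25:
        return "no PH"
    if lvedp <= 15:
        return "precapillary"
    # Diastolic pressure difference separates isolated from combined.
    if diastolic_pap - lvedp >= 7:
        return "combined postcapillary"
    return "isolated postcapillary"
```

For example, a patient with mean PAP 30, LVEDP 18, and diastolic PAP 26 mm Hg (difference 8 mm Hg) falls in the combined postcapillary group, the group with the highest adjusted 1-year mortality in the study.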


INTRODUCTION The pentasaccharide fondaparinux is widely approved for prophylaxis and treatment of thromboembolic diseases and therapy of acute coronary syndrome. It is also used off-label in patients with acute, suspected or antecedent heparin-induced thrombocytopenia (HIT). The aim of this prospective observational cohort study was to document fondaparinux's prescription practice, tolerance and therapy safety in a representative mixed German single-centre patient cohort. PATIENTS AND METHODS Between 09/2008 and 04/2009, 231 consecutive patients treated with fondaparinux were enrolled. Medical data were obtained from patients' records. The patients were clinically screened for thrombosis (Wells score), sequelae of HIT (4T's score), and bleeding complications (ISTH criteria) and subjected to further assessment (i.e. sonography, HIT diagnostics), if necessary. The mortality rate was assessed 30 days after the start of therapy. RESULTS Overall, 153/231 patients had a prophylactic, 74/231 patients a therapeutic, and 4/231 patients a successive prophylactic/therapeutic indication. In 11/231 patients fondaparinux was used due to suspected/antecedent HIT, in 5/231 patients due to a previous cutaneous delayed-type hypersensitivity to heparins. Other indications were rare. Three new/progressive thromboses were detected. No cases of HIT, major bleedings, or fatalities occurred. CONCLUSIONS Fondaparinux was well tolerated and safe in prophylaxis and therapy; prescriptions mostly followed the current approval guidelines and were rarely related to HIT-associated indications (<5% of prescriptions), which is in contrast to previous study results in the U.S. (>94% of prescriptions were HIT-associated). A trend towards individualised fondaparinux use based on the compound's inherent properties and the patients' risk profiles, i.e., antecedent HIT, bone fractures, heparin allergy, was observed.


BACKGROUND Febrile neutropenia (FN) and other infectious complications are some of the most serious treatment-related toxicities of chemotherapy for cancer, with a mortality rate of 2% to 21%. The two main types of prophylactic regimens are granulocyte (macrophage) colony-stimulating factors (G(M)-CSF) and antibiotics, frequently quinolones or cotrimoxazole. Current guidelines recommend the use of colony-stimulating factors when the risk of febrile neutropenia is above 20%, but they do not mention the use of antibiotics. However, both regimens have been shown to reduce the incidence of infections. Since no systematic review has compared the two regimens, a systematic review was undertaken. OBJECTIVES To compare the efficacy and safety of G(M)-CSF compared to antibiotics in cancer patients receiving myelotoxic chemotherapy. SEARCH METHODS We searched The Cochrane Library, MEDLINE, EMBASE, databases of ongoing trials, and conference proceedings of the American Society of Clinical Oncology and the American Society of Hematology (1980 to December 2015). We planned to include both full-text and abstract publications. Two review authors independently screened search results. SELECTION CRITERIA We included randomised controlled trials (RCTs) comparing prophylaxis with G(M)-CSF versus antibiotics for the prevention of infection in cancer patients of all ages receiving chemotherapy. All study arms had to receive identical chemotherapy regimes and other supportive care. We included full-text, abstracts, and unpublished data if sufficient information on study design, participant characteristics, interventions and outcomes was available. We excluded cross-over trials, quasi-randomised trials and post-hoc retrospective trials. DATA COLLECTION AND ANALYSIS Two review authors independently screened the results of the search strategies, extracted data, assessed risk of bias, and analysed data according to standard Cochrane methods. 
We did the final interpretation together with an experienced clinician. MAIN RESULTS In this updated review, we included no new randomised controlled trials. We included two trials in the review: one with 40 breast cancer patients receiving high-dose chemotherapy, comparing G-CSF to antibiotics, and a second evaluating 155 patients with small-cell lung cancer receiving GM-CSF or antibiotics. We judged the overall risk of bias as high in the G-CSF trial, as neither patients nor physicians were blinded and not all included patients were analysed as randomised (7 out of 40 patients). We considered the overall risk of bias in the GM-CSF trial to be moderate, because of the risk of performance bias (neither patients nor personnel were blinded), but low risk of selection and attrition bias. For the trial comparing G-CSF to antibiotics, all-cause mortality was not reported. There was no evidence of a difference for infection-related mortality, with zero events in each arm. Microbiologically or clinically documented infections, severe infections, quality of life, and adverse events were not reported. There was no evidence of a difference in frequency of febrile neutropenia (risk ratio (RR) 1.22; 95% confidence interval (CI) 0.53 to 2.84). The quality of the evidence for the two reported outcomes, infection-related mortality and frequency of febrile neutropenia, was very low, due to the low number of patients evaluated (high imprecision) and the high risk of bias. There was no evidence of a difference in terms of median survival time in the trial comparing GM-CSF and antibiotics. Two-year survival times were 6% (0 to 12%) in both arms (high imprecision, low quality of evidence). There were four toxic deaths in the GM-CSF arm and three in the antibiotics arm (3.8%), without evidence of a difference (RR 1.32; 95% CI 0.30 to 5.69; P = 0.71; low quality of evidence).
Grade III or IV infections occurred in 28% of patients in the GM-CSF arm and 18% in the antibiotics arm, without any evidence of a difference (RR 1.55; 95% CI 0.86 to 2.80; P = 0.15; low quality of evidence). There were 5 episodes out of 360 cycles of grade IV infections in the GM-CSF arm and 3 episodes out of 334 cycles in the cotrimoxazole arm (0.8%), with no evidence of a difference (RR 1.55; 95% CI 0.37 to 6.42; P = 0.55; low quality of evidence). There was no significant difference between the two arms for non-haematological toxicities like diarrhoea, stomatitis, infections, or neurologic, respiratory, or cardiac adverse events. Grade III and IV thrombopenia occurred significantly more frequently in the GM-CSF arm (60.8%) than in the antibiotics arm (28.9%) (RR 2.10; 95% CI 1.41 to 3.12; P = 0.0002; low quality of evidence). Neither infection-related mortality, incidence of febrile neutropenia, nor quality of life was reported in this trial. AUTHORS' CONCLUSIONS As we only found two small trials with 195 patients altogether, no conclusion for clinical practice is possible. More trials are necessary to assess the benefits and harms of G(M)-CSF compared to antibiotics for infection prevention in cancer patients receiving chemotherapy.
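The grade IV infection comparison above (5/360 versus 3/334 cycles) can be reproduced with a standard risk ratio and a Wald interval on the log scale; this recovers the reported RR 1.55 (95% CI 0.37 to 6.42):

```python
import math

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    """Risk ratio with a Wald 95% CI on the log scale:
    a events among n1 in one arm, c events among n2 in the other."""
    rr = (a / n1) / (c / n2)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Grade IV infections: 5/360 cycles (GM-CSF) vs. 3/334 cycles (cotrimoxazole).
rr, lo, hi = risk_ratio_ci(5, 360, 3, 334)
# rounds to RR 1.55, 95% CI 0.37 to 6.42, as reported
```

The very wide interval illustrates the "high imprecision" the review authors grade the evidence down for: with only 5 and 3 events, the data are compatible with anything from a two-thirds reduction to a six-fold increase in risk.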


Pleural infection is a frequent clinical condition. Prompt treatment has been shown to reduce hospital costs, morbidity and mortality. Recent advances in treatment have been variably implemented in clinical practice. This statement reviews the latest developments and concepts to improve clinical management and stimulate further research. The European Association for Cardio-Thoracic Surgery (EACTS) Thoracic Domain and the EACTS Pleural Diseases Working Group established a team of thoracic surgeons to produce a comprehensive review of available scientific evidence with the aim to cover all aspects of surgical practice related to its treatment, in particular focusing on: surgical treatment of empyema in adults; surgical treatment of empyema in children; and surgical treatment of post-pneumonectomy empyema (PPE). In the management of Stage 1 empyema, prompt pleural space chest tube drainage is required. In patients with Stage 2 or 3 empyema who are fit enough to undergo an operative procedure, there is a demonstrated benefit of surgical debridement or decortication [possibly by video-assisted thoracoscopic surgery (VATS)] over tube thoracostomy alone in terms of treatment success and reduction in hospital stay. In children, a primary operative approach is an effective management strategy, associated with a lower mortality rate and a reduction of tube thoracostomy duration, length of antibiotic therapy, reintervention rate and hospital stay. Intrapleural fibrinolytic therapy is a reasonable alternative to primary operative management. Uncomplicated PPE [without bronchopleural fistula (BPF)] can be effectively managed with minimally invasive techniques, including fenestration, pleural space irrigation and VATS debridement. PPE associated with BPF can be effectively managed with individualized open surgical techniques, including direct repair, myoplastic and thoracoplastic techniques. 
Intrathoracic vacuum-assisted closure may be considered as an adjunct to the standard treatment. The current literature cements the role of VATS in the management of pleural empyema, even if the choice of surgical approach relies on the individual surgeon's preference.


We used meat-inspection data collected over a period of three years in Switzerland to evaluate slaughterhouse-level, farm-level and animal-level factors that may be associated with whole carcass condemnation (WCC) in cattle after slaughter. The objective of this study was to identify WCC risk factors so they can be communicated to, and managed by, the slaughter industry and veterinary services. During meat inspection, there were three main predictors of the risk of WCC: the slaughtered animal's sex, its age, and the size of the slaughterhouse it was processed in. WCC for injuries and significant weight loss (visible welfare indicators) was almost exclusive to smaller slaughterhouses. Cattle exhibiting clinical syndromes that are not externally visible (e.g. pneumonia lesions) and that are associated with the fattening of cattle end up in larger slaughterhouses. For this reason, it is important for animal health surveillance to collect data from both types of slaughterhouses. Other important risk factors for WCC were the on-farm mortality rate and the number of cattle on the farm of origin. This study highlights the fact that the many risk factors for WCC are as complex as the production system itself, with risk factors interacting with one another in ways which are sometimes difficult to interpret biologically. Risk-based surveillance aimed at farms with reoccurring health problems (e.g. a history of above-average condemnation rates) may be more appropriate than the selection of higher-risk animals arriving at slaughter. In Switzerland, the introduction of a benchmarking system that would provide farmers with feedback on condemnation reasons and on their performance compared to the national/regional average could be a first step towards improving herd management and financial returns for producers.


Intraspecific and interspecific architectural patterns were studied for eight tree species of a Bornean rain forest. Trees 5–19 m tall in two 4-ha permanent sample plots in primary forest were selected, and three light descriptors and seven architectural traits were measured for each tree. Two general predictions were made: (1) slow-growing individuals (or short ones) encounter lower light and have flatter crowns, fewer leaf layers, and thinner stems than do fast-growing individuals (or tall ones); (2) species with higher shade-tolerance receive less light and have flatter crowns, fewer leaf layers, and thinner stems than do species with lower shade-tolerance. Shade-tolerance is assumed to decrease with the maximum growth rate, mortality rate, and adult stature of a species.


Pregnant BALB/c mice have been widely used as an in vivo model to study Neospora caninum infection biology and to provide proof-of-concept for assessments of drugs and vaccines against neosporosis. The fact that this model has been used with different isolates of variable virulence, varying infection routes and differing methods to prepare the parasites for infection has rendered the comparison of results from different laboratories impossible. In most studies, mice were infected with a similar number of parasites (2 × 10^6) as employed in ruminant models (10^7 for cows and 10^6 for sheep), which seems inappropriate considering the enormous differences in the weight of these species. Thus, for achieving meaningful results in vaccination and drug efficacy experiments, a refinement and standardization of this experimental model is necessary. Accordingly, 2 × 10^6, 10^5, 10^4, 10^3 and 10^2 tachyzoites of the highly virulent and well-characterised Nc-Spain7 isolate were subcutaneously inoculated into mice at day 7 of pregnancy, and clinical outcome, vertical transmission, parasite burden and antibody responses were compared. Dams from all infected groups presented nervous signs, and the percentage of surviving pups at day 30 postpartum was surprisingly low (24%) in mice infected with only 10^2 tachyzoites. Importantly, infection with 10^5 tachyzoites resulted in antibody levels, cerebral parasite burden in dams and a 100% mortality rate in pups identical to those seen with 2 × 10^6 tachyzoites. Considering these results, it is reasonable to lower the challenge dose to 10^5 tachyzoites in further experiments when assessing drugs or vaccine candidates.


Larval development time is a critical factor in assessing the potential for larval transport, mortality and, subsequently, the connectivity of marine populations through larval exchange. Most estimates of larval duration are based on laboratory studies and may not reflect development times in nature. For larvae of the American lobster (Homarus americanus), temperature-dependent development times have been established in previous laboratory studies. Here, we used the timing of seasonal abundance curves for newly hatched larvae (stage 1) and the final planktonic instar (postlarva), coupled with a model of temperature-dependent development, to assess development time in the field. We were unable to reproduce the timing of the seasonal abundance curves using laboratory development rates in our model. Our results suggest that larval development in situ may be twice as fast as reported laboratory rates. This will result in reduced estimates of larval transport potential, and increased estimates of instantaneous mortality rate and production.
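The sensitivity of planktonic duration to development rate can be sketched with a simple degree-day model. This is illustrative only: the thermal constant, base temperature, and temperature series below are hypothetical, and the study's actual development functions come from the cited laboratory work:

```python
def days_to_complete(daily_temps, thermal_constant, t_base):
    """Degree-day development sketch: development completes once the
    accumulated heat above t_base reaches thermal_constant. Returns
    the day of completion, or None if it is never reached. All
    parameter values used below are hypothetical."""
    total = 0.0
    for day, temp in enumerate(daily_temps, start=1):
        total += max(0.0, temp - t_base)
        if total >= thermal_constant:
            return day
    return None

# At a constant 15 °C with a 5 °C base, halving the required thermal
# constant (development twice as fast) halves the predicted duration:
lab_estimate = days_to_complete([15.0] * 60, thermal_constant=300.0, t_base=5.0)    # 30 days
field_estimate = days_to_complete([15.0] * 60, thermal_constant=150.0, t_base=5.0)  # 15 days
```

Halving the planktonic duration in this way directly shortens the window for advection, which is why faster in-situ development implies reduced transport potential and, for a fixed observed abundance decline, a higher instantaneous mortality rate.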


Vasculogenesis is the process by which endothelial precursor cells (EPCs) form a vasculature. This process has traditionally been regarded as an embryological process of vessel formation. However, as early as the 1960s the concept of postnatal vasculogenesis was introduced, and the idea has strongly resurfaced in recent years. Similarly, previous work on a mouse skin tumor model provided us with the grounds to consider the role of vasculogenesis during tumor formation.

We examined the contribution of donor bone marrow (BM)-derived cells to neovascularization in recipient nude mice with Ewing's sarcoma. Ewing's sarcoma is a primitive neuroectodermal tumor that most often affects children and young adults between 5 and 30 years of age. Despite multiple attempts to improve the efficacy of chemotherapy for the disease, the 2-year metastasis-free survival rate for patients with Ewing's sarcoma has not improved over the past 15 years. New therapeutic approaches are therefore needed to reduce the mortality rate.

The contribution of BM endothelial precursor cells to the development of Ewing's sarcoma was examined using different strategies to track the donor-derived cells. Using a BMT model that takes advantage of MHC differences between donor and recipient mice, we found that donor BM cells were involved in the formation of the Ewing's sarcoma vasculature.

Cells responsible for this vasculogenic activity may be located within the stem cell population of the murine BM. These stem cells would not only generate the hematopoietic lineage but would also generate endothelial cells (ECs). Bone marrow SP (side population) cells belong to a subpopulation that can be identified using flow cytometric analysis of Hoechst 33342-stained BM. This population of cells has HSC activity. We tested the ability of BM SP cells to contribute to vasculogenesis in Ewing's sarcoma using our MHC-mismatched transplant model. Mice transplanted with SP cells developed tumor neovessels that were derived from the donor SP cells. Thus, SP cells not only replenished the hematopoietic system of the lethally irradiated mice, but also differentiated into a non-hematopoietic cell lineage and contributed to the formation of the tumor vasculature.

In summary, we have demonstrated that BM-derived cells are involved in the generation of the new vasculature during the growth of Ewing's sarcoma. The finding that vasculogenesis plays a role in Ewing's sarcoma development opens the possibility of using genetically modified BM-derived cells for the treatment of Ewing's sarcoma.


Background. According to the WHO 2007 country report, Haiti lags behind the Millennium Development Goal of reducing child mortality and maintains the highest under-5 mortality rate in the Western hemisphere. Overall, few studies exist that seek to better grasp barriers to caring for a seriously ill child in a resource-limited setting, and only a handful propose sustainable, effective interventions.

Objectives. The objectives of this study are to describe the prevalence of serious illnesses among children hospitalized at 2 children's hospitals in Port-au-Prince, to determine the barriers faced when caring for seriously ill children, and to report hospital outcomes of children admitted with serious illnesses.

Methods. Data were gathered from 2 major children's hospitals in Port-au-Prince, Haiti (Grace Children's Hospital [GCH] and Hopital de l'Universite d'Etat d'Haiti [HUEH]) using a triangulated approach of focus group discussions, physician questionnaires, and retrospective chart review. Twenty-three pediatric physicians participated in focus group discussions and completed a self-administered questionnaire evaluating healthcare provider knowledge, self-efficacy, and perceived barriers relating to the care of seriously ill children in a resource-limited setting. A sample of 240 patient charts meeting eligibility criteria was abstracted for pertinent elements including sociodemographics, documentation, treatment strategies, and outcomes. Factors associated with mortality were analyzed using the χ2 test and Fisher exact test (Minitab v.15).

Results. The most common primary diagnoses at admission were gastroenteritis with moderate dehydration (35.5%), severe malnutrition (25.8%), and pneumonia (19.3%) for GCH, and severe malnutrition (32.6%), sepsis (24.7%), and severe respiratory distress (18%) for HUEH. Overall, 12.9% and 27% of seriously ill patients presented with shock to GCH and HUEH, respectively. A shortage of necessary materials and equipment represented the most commonly reported limitation (18/23 respondents). According to chart data, 9.4% of children presenting with shock did not receive a fluid bolus, and only 8% of patients presenting with altered mental status or seizures received a glucose check. Sixty-five percent of patients with meningitis did not receive a lumbar puncture due to lack of materials. Hospital mortality rates did not differ by gender or by institution. Children who died were more likely to have a history of prematurity (OR 4.97 [95% CI 1.32-18.80]), an incomplete vaccination record (OR 4.05 [95% CI 1.68-9.74]), or a weight for age ≤3rd percentile (OR 6.1 [95% CI 2.49-14.93]). Case-fatality rates were significantly higher among those who presented with signs of shock compared with those who did not (23.1% vs. 10.7%, RR=2.16, p=0.03). Caregivers did not achieve shock reversal in 21% of patients and did not document shock reversal in 50% of patients.

Conclusions. Many challenges face those who seek to optimize care for seriously ill children in resource-limited settings. Specifically, in Haiti, qualitative and quantitative data suggest major issues with lack of supplies, pre-hospital factors including malnutrition as a comorbidity, and early recognition and management of shock. A tailored intervention designed to address these issues is needed in order to prospectively evaluate improvements in child mortality in a high-risk population.
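The case-fatality comparison reported above (23.1% vs. 10.7% → RR ≈ 2.16) follows directly from the two proportions, and a χ2 test of the kind used in the analysis can be sketched as below. The 2×2 counts in the example are hypothetical values consistent with those proportions (18/78 vs. 17/159), not the study's actual cell counts:

```python
def risk_ratio(p1, p2):
    """Risk ratio from two observed proportions."""
    return p1 / p2

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]]; compare to a chi-square distribution
    with 1 degree of freedom for a p-value."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

rr = risk_ratio(0.231, 0.107)  # ~2.16, matching the reported RR
# Hypothetical counts consistent with those proportions (18/78 vs. 17/159):
stat = chi2_2x2(18, 60, 17, 142)
```

The Fisher exact test, also used in the study, is the usual substitute when expected cell counts are small and the chi-square approximation becomes unreliable.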


Background. Population health within and between nations is heavily influenced by political determinants, yet these determinants have received significantly less attention than socioeconomic factors in public health. It has been hypothesized that the welfare state, as a political variable, may play a particularly prominent role in affecting both health indicators and health disparities in developed countries. The research, however, provides conflicting evidence regarding the health impact of particular regimes over others and the mechanisms through which the welfare state can most significantly affect health.

Objective. To perform a systematic review of the literature as a means of exploring what the current research indicates regarding the benefits or detriments of particular regime styles and the pathways through which the welfare state can impact health indicators and health disparities within developed countries.

Methods. A thorough search of the EBSCO, PubMed, Medline, Web of Science, and Scopus electronic databases was conducted and resulted in the identification of 15 studies that evaluated the association between welfare state regime and population health outcomes, and/or the pathways through which the welfare state influences health.

Results. Social democratic countries tended to perform best when infant mortality rate (IMR) was the primary outcome of interest, whereas liberal countries performed strongly in relation to self-perceived health. The results were mixed regarding welfare state effectiveness in mitigating health inequities, with Christian democratic countries performing as well as social democratic countries. In relation to welfare state pathways, public health spending and medical coverage were associated with positive health indicators. The redistributive impact of the welfare state was also consistently associated with better health outcomes, while social security expenditures were not.

Discussion/Conclusions. Studies consistently discovered a significant relationship between the welfare state and population health and/or health disparities, lending support to the hypothesis that the welfare state is, indeed, an important non-medical determinant of health. However, it is still fairly unclear which welfare state regime may be most protective for health, as results varied according to the measured health indicator. The research regarding welfare state pathways is particularly undeveloped, and does not provide much insight into the importance of in-kind service provision or cash transfers, or targeted or universal approaches to the welfare state. Suggestions to direct future research are provided.


Background. Community respiratory viruses, mainly RSV and influenza, are significant causes of morbidity and mortality in patients with leukemia and HSCT recipients. Data on the impact of PIV infections in these patients are lacking. Methods. We reviewed the records of patients with leukemia and HSCT recipients who developed PIV infection from Oct'02–Nov'07 to determine the outcome of such infections. Results. We identified 200 patients with PIV infections, including 80 (40%) patients with leukemia and 120 (60%) recipients of HSCT. Median age was 55 y (17-84 y). As compared to HSCT recipients, patients with leukemia had a higher APACHE II score (14 vs. 10, p<0.0001) and were more likely to have ANC<500 (48% vs. 10%, p<0.0001) and ALC<200 (45% vs. 23.5%, p=0.02). PIV type III was the commonest isolate (172/200, 86%). Most patients, 141/200 (70%), had upper respiratory infection (URI), and 59/200 (30%) had pneumonia at presentation. Patients in the leukemia group were more likely to require hospitalization due to PIV infection (77% vs. 36%, p=0.0001) and more likely to progress to pneumonia (61% vs. 39%, p=0.002). Fifty-five patients received aerosolized ribavirin and/or IVIG. There were no significant differences in the duration of symptoms, length of hospitalization, progression to pneumonia or mortality between the treated versus untreated groups. The clinical outcome was unknown in 13 (6%) patients. Complete resolution of symptoms was noted in 91% (171/187) of patients, and 9% (16/187) of patients died. Mortality rate was 17% (16/95) among patients who had PIV pneumonia, with no significant difference between the leukemia and HSCT groups (16% vs. 17%). The cause of death was acute respiratory failure and/or multi-organ failure in 13 (81%) patients. Conclusions. Patients with leukemia and HSCT recipients could be at high risk for serious PIV infections, including PIV pneumonia. Treatment with aerosolized ribavirin and/or IVIG may not have a significant effect on the outcome of PIV infection.


This retrospective cohort study examined the association between the presence of comorbidities and breast cancer disease-free survival rates among racial/ethnic groups. The study population consisted of 2389 women with stage I and II invasive breast cancer who were diagnosed and treated at the M.D. Anderson Cancer Center between 1985 and 2000. It has been suggested that as the number of comorbidities increases, breast cancer mortality increases. It is known that African Americans and Hispanics are considered to be at a higher risk for comorbid conditions such as hypertension and diabetes compared to Caucasian women (23) (10). When compared to Caucasian women, African American women also have a higher breast cancer mortality rate (1). As a result, the study also examined whether comorbid conditions contribute to racial differences in breast cancer disease-free survival. Among the study population, 24% suffered from breast cancer recurrence, 6% died from breast cancer and 24% died from all causes. The mean age was 56, with 41% of the population being women between the ages of 40-55. One or more comorbidities were reported in 84 (36%) African Americans (OR 1.57; 95% CI 1.19-2.10) and 58 (31%) Hispanics (OR 1.25; 95% CI 0.90-1.74), compared to the reference group of 531 (27%) Caucasians. Additionally, African American women were more likely to suffer from either a breast cancer recurrence or breast cancer death (OR 1.5; 95% CI 0.70-1.41) when compared to Caucasian women. Multivariate analysis found hypertension (HR 1.22; 95% CI 0.99-1.49; p<0.05) to be statistically significant and a potential prognostic factor for disease-free survival; African American women (OR 2.96; 95% CI 2.25-3.90) were more likely to suffer from hypertension when compared to Caucasian women. When compared to Caucasian women, Hispanics were also more likely to suffer from hypertension (OR 1.33; 95% CI 0.96-1.83).
This suggests that comorbid conditions like hypertension could account for the racial disparities that exist when comparing breast cancer survival rates. Future studies should investigate this relationship further.