227 results for Health Sciences, Medicine and Surgery | Health Sciences, Nutrition | Health Sciences, Epidemiology
Abstract:
Three hypotheses have been offered to explain the historical specialty selection by women physicians: (1) women choose the specialty for which the training requirements and working conditions interfere least with their commitments to marriage and children; (2) women tend to select the more "feminine" specialties, such as pediatrics and psychiatry, and to avoid the "masculine" fields, such as surgery; and (3) women have been deliberately excluded from male-dominated fields such as surgery. While these hypotheses may be true to a greater or lesser degree, none of them has been adequately tested. The major study hypotheses are as follows: (1) female physicians' choice of specialty is influenced by (a) family responsibilities, (b) sex role expectations, and (c) sex discrimination; (2) female physicians' choice of specialty is also influenced by their age and ethnicity; and (3) the primary reasons for choosing a given specialty vary by type of specialty. The reasons for specialty selection were explored through a survey of women graduates of one of the oldest medical schools in the United States, The University of Texas Medical Branch (UTMB) in Galveston, Texas (n = 930). The survey response rate was 75.3% (700 respondents). The results for the first study hypothesis showed that fewer than 14% of the respondents agreed that sex role expectations, sex discrimination, and family responsibilities played a role in their choice of specialty. Fifty-nine percent of the respondents disagreed with the idea that sex role expectations influenced specialty selection, and 64% disagreed that family responsibilities had an effect on the selection of their specialty. Around half (49%) were uncertain of the influence of sex discrimination. It was concluded that sex discrimination, sex role expectations, and family responsibilities did not have a major impact on specialty selection. With respect to the second hypothesis, age was significant in Internal Medicine, Obstetrics/Gynecology, and Psychiatry: women physicians in Internal Medicine and Obstetrics/Gynecology were significantly younger (less than 45 years old), while physicians in Psychiatry were significantly older (45 years or older), than those in the other specialties studied. The third hypothesis was confirmed: the reasons for choosing a given specialty varied by specialty. Respondents' comments written on the survey provided insight into other possible reasons for specialty selection, including the role of mentoring and job satisfaction. The retrospective cross-sectional study design used here does not adequately capture the fact that different reasons may be given for the choice of specialty at different points in time, e.g., at the time of choosing a residency program versus several years later. In conclusion, approaches that examine the range of reasons that women elect to enter and stay within a given specialty must be explored to gain a richer understanding of the complex and dynamic nature of women physicians' professional lives. (Abstract shortened by UMI.)
Abstract:
The natural history of placebo-treated travelers' diarrhea and the prognostic factors of recovery from diarrhea were evaluated using nine groups of placebo-treated subjects from nine clinical trials conducted since 1975, for use as historical controls in future clinical trials of antidiarrheal agents. All of these studies were done by the same group of investigators at one site (Guadalajara, Mexico). The studies are similar in terms of population, measured parameters, microbiologic identification of enteropathogens, and definitions of parameters. The studies had two different durations of follow-up: in some studies subjects were followed for two days, and in others for five days. Using definitions established by the Infectious Diseases Society of America and the Food and Drug Administration, the following efficacy parameters were evaluated: time to last unformed stool (TLUS), number of unformed stools passed on each of the five days after initiation of placebo treatment, microbiologic cure, and improvement of diarrhea. Among the groups that were followed for five days, the mean TLUS ranged from 59.1 to 83.5 hours. Fifty percent to 78% had diarrhea lasting more than 48 hours, and 25% had diarrhea lasting more than five days. The mean number of unformed stools passed on the first day after initiation of therapy ranged from 3.6 to 5.8, and on the fifth day from 0.5 to 1.5. By the end of follow-up, diarrhea had improved in 82.6% to 90% of the subjects. Subjects with enterotoxigenic E. coli had microbiologic cure rates of 21.6% to 90.0%, and subjects with Shigella species had microbiologic cure rates of 14.3% to 60.0%. To evaluate the prognostic factors of recovery from diarrhea (the primary efficacy parameter in assessing antidiarrheal agents against travelers' diarrhea), the subjects from five studies were pooled and the Cox proportional hazards model was used to evaluate the predictors of prolonged diarrhea. After adjusting for design characteristics of each trial, fever (rate ratio [RR] 0.40), presence of invasive pathogens (RR 0.41), presence of severe abdominal pain and cramps (RR 0.50), more than five watery stools (RR 0.60), and presence of non-invasive pathogens (RR 0.84) predicted a longer duration of diarrhea, while severe vomiting (RR 2.53) predicted a shorter duration of diarrhea. The number of soft stools, presence of fecal leukocytes, presence of nausea, and duration of diarrhea before enrollment were not associated with duration of diarrhea.
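The pooled analysis above relies on a Cox proportional hazards model for time to last unformed stool, where rate ratios below 1 correspond to slower recovery. As a rough illustration of that kind of model (not the study's data or code), here is a minimal Python sketch using the lifelines library on simulated covariates named after two of the reported predictors:

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 200
    fever = rng.integers(0, 2, n)
    invasive = rng.integers(0, 2, n)
    # Toy event times: longer diarrhea when fever or an invasive pathogen is present.
    tlus = rng.exponential(scale=48 * np.exp(0.6 * fever + 0.5 * invasive))
    recovered = (tlus <= 120).astype(int)      # censor follow-up at 5 days (120 h)
    tlus = np.minimum(tlus, 120)

    df = pd.DataFrame({"tlus_hours": tlus, "recovered": recovered,
                       "fever": fever, "invasive_pathogen": invasive})

    cph = CoxPHFitter()
    cph.fit(df, duration_col="tlus_hours", event_col="recovered")
    cph.print_summary()   # a hazard ratio < 1 means slower recovery (longer diarrhea)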
Abstract:
Background: Mortality in pneumococcal pneumonia remains as high as 20%, and most deaths occur within the first two weeks of hospitalization despite eradication of the causative organisms by antimicrobials in the first 24 hours. An inflammatory response, rather than active infection, could be responsible for this early mortality. Statins have been shown to have potent immunomodulatory activity in vitro. We investigated whether severity was decreased or outcome improved in patients who were receiving statins at the time they were admitted for pneumococcal pneumonia. Methods: Patients seen at the Michael E. DeBakey Veterans Affairs Medical Center in Houston, Texas from January 2000 to June 2010 with a diagnosis of pneumococcal pneumonia were included in this retrospective cohort study. Electronic medical records were reviewed to record demographic characteristics, comorbidities, laboratory values, and statin use at the time of admission. Severity of pneumonia was determined using the Pneumonia Outcomes Research Team (PORT) classification. Uni- and multivariate Cox regression was used to evaluate survival; variables were included in the multivariate model if they were significant in the univariate model at p<0.05. Results: Of 347 patients admitted for pneumococcal pneumonia, 90 (25.9%) were taking statins at the time of presentation. Patients in the statin group were older (age: 68.0±9.7 vs. 62.5±12.3 years, p<0.001) and had a higher prevalence of diabetes, coronary artery disease, and kidney disease (p<0.05 for each comparison). Liver disease and alcohol consumption were less prevalent among statin users (p<0.05). The PORT scores were normally distributed in both groups, with statin users having higher mean scores at admission than patients not on statins (108±32 vs. 96±32, p = 0.002). The Cox proportional hazards analyses, adjusted for age, comorbidities, length of stay, and PORT scores, showed a significantly reduced risk of mortality among statin users at 14 days (HR: 0.39; 0.15-0.98, p=0.045), 20 days (0.35; 0.14-0.88, p=0.03), and 30 days (0.41; 0.17-0.95, p=0.01) after presentation. Conclusion: Statin use is associated with improved clinical outcomes in patients with pneumococcal pneumonia.
Abstract:
Background. Necrotizing pneumonia is generally considered a rare complication of pneumococcal pneumonia in adults. We systematically studied the incidence of necrotizing changes in adult patients with pneumococcal pneumonia and examined the severity of infection, the role of the causative serotype, and the association with bacteremia. Methods. We used a database of all pneumococcal infections identified at our medical center between 2000 and 2010. Original readings of chest X-rays (CXR) and computerized tomography (CT) scans were noted. All images were then reread independently by 2 radiologists. The severity of disease was assessed using the SMART-COP scoring system. Results. There were 351 cases of pneumococcal pneumonia. Necrosis was reported in none of the original CXR readings and in 6 of 136 (4.4%) CTs. On rereading, 8 of 351 (2.3%) CXRs and 15 of 136 (11.0%) CTs had necrotizing changes. Overall, these changes were found in 23 of 351 (6.6%, 95% CI 4.0-9.1) patients. The incidence of bacteremia and the admitting SMART-COP scores were similar in patients with and without necrosis (P=1.00 and P=0.32, respectively). Type 3 pneumococcus was more commonly isolated from patients with than from patients without necrotizing pneumonia (P=0.05), but a total of 10 serotypes were identified among the 16 cases in which the organism was available for typing. Conclusions. Necrotizing changes in the lungs were seen in 6.6% (95% CI 4.0-9.1) of a large series of adults with pneumococcal pneumonia. Patients with necrosis were not more likely to have bacteremia or more severe disease. Type 3 pneumococcus was commonly implicated, but 9 other serotypes were also identified.
Abstract:
Background: Heart failure (CHF) is the most frequent and prognostically severe symptom of aortic stenosis (AS) and the most common indication for surgery. The mainstay of treatment for AS is aortic valve replacement (AVR), and the main indication for AVR is the development of symptomatic disease. ACC/AHA guidelines define severe AS as an aortic valve area (AVA) ≤1 cm², but there are limited data correlating echocardiographic AVA with the onset of symptomatic CHF. We evaluated the risk of developing CHF with progressively decreasing echocardiographic AVA. We also compared echocardiographic AVA with jet velocity (V2) and indexed AVA (AVAI) to assess the best predictor of the development of symptomatic CHF. Methods and Results: This retrospective cohort study evaluated 518 patients with asymptomatic moderate or severe AS from a single community-based cardiology practice. A total of 925 echocardiograms were performed over an 11-year period. Each echocardiogram was correlated with concurrent clinical assessments while the investigator was blinded to the echocardiographic severity of AS. The Cox proportional hazards model was used to analyze the relationship between AVA and the development of CHF. The median age of patients at entry was 76.1 years, and 54% were male. A total of 116 patients (21.8%) developed new-onset CHF during follow-up. Compared to patients with AVA >1.0 cm², patients with lower AVA had an exponentially increasing risk of developing CHF for each 0.2 cm² decrement in AVA, becoming statistically significant only at an AVA less than 0.8 cm². Also, compared to V2 and AVAI, AVA added more information in assessing the risk of developing CHF (p=0.041). Conclusion: In patients with normal or mildly impaired LVEF, the risk of CHF rises exponentially with decreasing valve area and becomes statistically significant after AVA falls below 0.8 cm². AVA is a better predictor of CHF than V2 or AVAI.
Abstract:
Early and accurate detection of TB disease in HIV-infected individuals is a critical step for a successful TB program. In Vietnam, the diagnosis of TB disease, which is based predominantly on clinical examination, chest radiography (CXR), and acid-fast bacilli (AFB) sputum smear, has been shown to have low sensitivity in immunocompromised patients. Sputum culture is not routinely performed for patients with AFB-negative smears, even in HIV-infected individuals. Against this background, we conducted a cross-sectional study to estimate the prevalence of sputum culture-confirmed pulmonary tuberculosis (PTB), smear-negative PTB, and multidrug-resistant TB (MDR-TB) in the HIV-infected population of Ho Chi Minh City (HCMC), the largest city in Vietnam, where both TB and HIV are highly prevalent. We also evaluated the diagnostic performance of various algorithms based on tools routinely available in Vietnam, such as symptom screening, CXR, and AFB smear. Nearly 400 subjects were consecutively recruited from HIV-infected patients seeking care at the An Hoa Clinic in District 6 of Ho Chi Minh City from August 2009 through June 2010. Participants' demographic data, clinical status, CXR, and laboratory results were collected. A multiple logistic regression model was developed to assess the association of covariates with PTB. The prevalences of smear-positive TB, smear-negative TB, resistant TB, and MDR-TB were 7%, 2%, 5%, 2.5%, and 0.3%, respectively. Adjusted odds ratios for a positive sputum culture associated with low CD4+ cell count, positive sputum smear, and CXR findings were 3.17, 32.04, and 4.28, respectively. Clinical findings alone had poor sensitivity, but the combination of CD4+ cell count, sputum smear, and CXR yielded a more accurate diagnosis. These results support the routine use of sputum culture to improve the detection of TB disease in HIV-infected individuals in Vietnam. When routine sputum culture is not available, an algorithm combining CD4+ cell count, sputum smear, and CXR is recommended for diagnosing PTB. Future studies on more affordable, rapid, and accurate tests for TB infection are also needed to provide timely, specific treatment for patients in need, reduce mortality, and minimize TB transmission to the general population.
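For readers unfamiliar with how adjusted odds ratios such as those above are produced, the short Python sketch below fits a multiple logistic regression on simulated data. The variable names mirror the abstract's predictors, but the data, coefficients, and library choice (statsmodels) are illustrative assumptions, not the study's analysis.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 400
    low_cd4 = rng.integers(0, 2, n)
    smear_pos = rng.integers(0, 2, n)
    cxr_abnormal = rng.integers(0, 2, n)
    # Simulated outcome: probability of a positive culture rises with each predictor.
    log_odds = -2.5 + 1.2 * low_cd4 + 3.4 * smear_pos + 1.4 * cxr_abnormal
    culture_pos = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

    df = pd.DataFrame({"culture_pos": culture_pos, "low_cd4": low_cd4,
                       "smear_pos": smear_pos, "cxr_abnormal": cxr_abnormal})
    fit = smf.logit("culture_pos ~ low_cd4 + smear_pos + cxr_abnormal", data=df).fit()
    print(np.exp(fit.params))       # adjusted odds ratios
    print(np.exp(fit.conf_int()))   # 95% confidence intervals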
Abstract:
Bisphosphonates have proven effectiveness in preventing skeletal-related events (SREs) in advanced breast cancer, prostate cancer, and multiple myeloma. The purpose of this study was to assess the efficacy of bisphosphonates in preventing SREs, controlling pain, and increasing life expectancy in lung cancer patients with bone metastases. We performed an electronic search in MEDLINE, EMBASE, Web of Science, and the Cochrane Library databases up to April 4, 2010. Hand searching and searching in clinicaltrials.gov were also performed. Two independent reviewers selected all clinical trials that included lung cancer patients with bone metastases treated with bisphosphonates. We excluded articles that involved cancers other than lung cancer, patients without bone metastasis, and treatments other than bisphosphonates. The outcomes assessed were efficacy measured as overall pain control, overall improvement in survival, and reduction in skeletal-related events (fracture, cord compression, radiation or surgery to the bone, hypercalcemia of malignancy). The quality of each study was evaluated using the Cochrane Back Review Group questionnaire to assess risk of bias (0 = worst to 11 = best). Data extraction and quality assessments were independently performed by two assessors. Meta-analyses were performed where more than one study with similar outcomes was found. We identified eight trials that met our inclusion criteria. Three studies evaluated zoledronic acid, three pamidronate, three clodronate, and two ibandronate. Two were placebo-controlled trials, two had multi-group comparisons (radiotherapy, radionuclides, and chemotherapy), and two had a different bisphosphonate as the active control. Quality scores ranged from 1 to 4 out of 11, suggesting a high risk of bias. Studies failed to report adequate explanation of randomization procedures, concealment of randomization, and blinding. Meta-analysis showed that patients treated with zoledronic acid alone had lower rates of developing SREs compared to placebo at 21 months (RR=0.80, 95% CI=0.66-0.97, p=0.02). Meta-analyses also showed increased pain control when a bisphosphonate was added to an existing treatment modality such as chemotherapy or radiation (RR=1.17, 95% CI=1.03-1.34, p=0.02). However, pain control was not statistically significantly different among the various bisphosphonates when other treatment modalities were not present. Despite improvements in SREs and pain control, bisphosphonates failed to show an improvement in overall survival (difference in means = 109.1 days, 95% CI = -51.52 to 269.71, p=0.183). Adding bisphosphonates to standard care improved pain control and reduced SREs, but bisphosphonates did not improve overall survival. Further, larger studies of higher quality are required to strengthen the evidence. Keywords/MeSH terms: bisphosphonates/diphosphonates: generic, chemical, and trade names.
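The pooled estimates quoted above (e.g., RR = 0.80, 95% CI 0.66-0.97 for SREs) come from combining per-study risk ratios. Below is a minimal, hypothetical Python sketch of inverse-variance (fixed-effect) pooling on made-up study counts; it is not the review's actual calculation, which would also need to address heterogeneity and random-effects models.

    import numpy as np

    # (events_treated, n_treated, events_control, n_control) for three invented studies
    studies = [(30, 100, 40, 100),
               (25, 120, 35, 118),
               (18,  90, 24,  92)]

    log_rr, weights = [], []
    for a, n1, c, n0 in studies:
        rr = (a / n1) / (c / n0)                          # per-study risk ratio
        se = np.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)     # standard error of log RR
        log_rr.append(np.log(rr))
        weights.append(1 / se ** 2)                       # inverse-variance weight

    log_rr, weights = np.array(log_rr), np.array(weights)
    pooled = np.sum(weights * log_rr) / np.sum(weights)
    se_pooled = 1 / np.sqrt(np.sum(weights))
    lo, hi = np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled)
    print(f"pooled RR = {np.exp(pooled):.2f}, 95% CI {lo:.2f}-{hi:.2f}")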
Abstract:
Pneumonia is a well-documented and common respiratory infection in patients with acute traumatic spinal cord injuries and may recur during the course of acute care. Using data from the North American Clinical Trials Network (NACTN) for Spinal Cord Injury, the incidence, timing, and recurrence of pneumonia were analyzed. The two main objectives were (1) to investigate the time to, and potential risk factors for, the first occurrence of pneumonia using the Cox proportional hazards model, and (2) to investigate pneumonia recurrence and its risk factors using a counting process model that generalizes the Cox proportional hazards model. The results of the survival analysis suggested that surgery, intubation, American Spinal Injury Association (ASIA) grade, direct admission to a NACTN site, and age (older than 65 years or not) were significant risk factors for both the first episode of pneumonia and multiple episodes of pneumonia. The significance of this research is that it has the potential to identify, at the time of admission, patients who are at high risk for the incidence and recurrence of pneumonia. Knowledge of the risk and timing of pneumonia is an important factor in the development of prevention strategies and may also provide some insight into the selection of emerging therapies that compromise the immune system.
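As a loose illustration of the counting-process (Andersen-Gill-style) setup mentioned above, the Python sketch below arranges recurrent pneumonia episodes as start/stop intervals and fits them with lifelines' CoxTimeVaryingFitter on a tiny invented data set. The column names, values, and penalizer are assumptions for demonstration, not the NACTN data or the study's exact model.

    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    # Counting-process layout: one row per at-risk interval; event = 1 when a
    # pneumonia episode closes the interval, so a subject can contribute several rows.
    df = pd.DataFrame(
        [(1,  0, 10, 1, 1, 0),
         (1, 10, 25, 1, 1, 0),
         (1, 25, 40, 0, 1, 0),
         (2,  0, 30, 0, 0, 1),
         (3,  0,  7, 1, 1, 1),
         (3,  7, 28, 0, 1, 1),
         (4,  0, 35, 0, 0, 0),
         (5,  0, 12, 1, 0, 1),
         (5, 12, 33, 0, 0, 1),
         (6,  0, 20, 0, 1, 0)],
        columns=["id", "start", "stop", "event", "intubated", "age_over_65"])

    ctv = CoxTimeVaryingFitter(penalizer=0.1)   # small ridge penalty steadies the toy fit
    ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
    ctv.print_summary()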
Abstract:
HIV-1 infected children display a highly variable rate of progression to AIDS. Data on the reasons underlying the variable progression to AIDS among vertically infected children are sparse, and the few studies that have examined this important question have almost exclusively been done in the developed world, despite the fact that Sub-Saharan Africa is home to over 90% of all HIV-infected children in the world. The main objective of this study was to examine predictors of slow HIV-1 progression among vertically infected children in Botswana, using a case-control design. Cases (slow progressors) and controls (rapid progressors) were drawn from the medical records of HIV-1 infected children followed for routine care and treatment at the BBCCCOE between February 2003 and February 2011. Univariate and multivariate logistic regression analyses were performed to identify independent predictors of slow disease progression and to control for confounding. The study population comprised 152 cases and 201 controls, with ages ranging from 6 months to 16 years at baseline. Low baseline HIV-1 RNA viral load was the strongest independent predictor of slow progression (adjusted OR = 5.52, 95% CI = 2.75-11.07; P < 0.001). Other independent predictors of slow disease progression were lack of a history of PMTCT with single-dose Nevirapine plus Zidovudine (adjusted OR = 4.45, 95% CI = 1.45-13.69; P = 0.009) and maternal vital status (alive) (adjusted OR = 2.46, 95% CI = 1.51-4.01; P < 0.001). The results of this study may help clinicians and policy-makers in resource-limited settings to identify, at baseline, which children are at highest risk of rapid progression to AIDS and thus prioritize them for immediate intervention with HAART and other measures that would mitigate disease progression, while HAART may be delayed in children at lower risk of disease progression. This would enable highly affected, yet impoverished, Sub-Saharan African countries to use their scarce resources more efficiently, which may in turn help make their national antiretroviral therapy programs more sustainable. Delaying HAART among low-risk children would also lower the occurrence of adverse drug reactions associated with exposure to antiretroviral drugs. Keywords: slow progressors, rapid progressors, HIV-1, predictors, children, vertical transmission, Sub-Saharan Africa.
Abstract:
Sepsis is a significant cause of multiple organ failure and death in the burn patient, yet its identification in this population is confounded by chronic hypermetabolism and impaired immune function. The purpose of this study was twofold: (1) to determine the ability of the systemic inflammatory response syndrome (SIRS) and American Burn Association (ABA) criteria to predict sepsis in the burn patient; and (2) to develop a model representing the best combination of clinical predictors associated with sepsis in the same population. A retrospective, case-controlled, within-patient comparison of burn patients admitted to a single intensive care unit (ICU) was conducted for the period January 2005 to September 2010. Blood culture results were paired with clinical condition: "positive-sick", "negative-sick", and "screening-not sick". Data were collected for the 72 hours prior to each blood culture. The most significant predictors were evaluated using logistic regression, generalized estimating equations (GEE), and ROC area under the curve (AUC) analyses to assess the model's predictive ability. Bootstrapping methods were employed to evaluate potential model over-fitting. Fifty-nine subjects were included, representing 177 culture periods. SIRS criteria were not found to be associated with culture type, with an average of 98% of subjects meeting the criteria in the 3 days prior. ABA sepsis criteria differed significantly among culture types only on the day prior (p = 0.004). The variables identified for the model were: heart rate >130 beats/min, mean blood pressure <60 mmHg, base deficit <-6 mEq/L, temperature >36°C, use of vasoactive medications, and glucose >150 mg/dL. The model was significant in predicting the "positive culture-sick" and sepsis states, with AUCs of 0.775 (p < 0.001) and 0.714 (p < 0.001), respectively; comparatively, the ABA criteria AUCs were 0.619 (p = 0.028) and 0.597 (p = 0.035), respectively. SIRS criteria are not appropriate for identifying sepsis in the burn population. The ABA criteria perform better, but only for the day prior to positive blood culture results. The time period useful for diagnosing sepsis using clinical criteria may be limited to 24 hours. A combination of predictors is superior to individual variable trends, yet algorithms or computer support will be necessary for the clinician to find such models useful.
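A simplified sketch of the kind of modelling pipeline described above (multivariable logistic regression scored by ROC AUC, with bootstrap resampling as a rough over-fitting check) is given below using scikit-learn on simulated data. It omits the GEE adjustment for repeated culture periods within patients, and all names and numbers are illustrative rather than the study's.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    n = 177                                  # number of culture periods, as in the study
    X = rng.normal(size=(n, 4))              # stand-ins for heart rate, MAP, base deficit, glucose
    y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.6 * X[:, 1] + 0.5 * X[:, 2]))))

    model = LogisticRegression().fit(X, y)
    auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
    print(f"apparent AUC: {auc:.3f}")

    # Refit on bootstrap resamples and rescore on the full data as a rough stability check.
    boot_aucs = []
    for _ in range(200):
        idx = rng.integers(0, n, n)
        m = LogisticRegression().fit(X[idx], y[idx])
        boot_aucs.append(roc_auc_score(y, m.predict_proba(X)[:, 1]))
    print(f"mean bootstrap AUC: {np.mean(boot_aucs):.3f}")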
Abstract:
Telemedicine is the use of telecommunications to support health care services, and it incorporates a wide range of technologies and devices. This systematic review seeks to determine which types of telemedicine technologies have been the most effective at improving the major health factors of subjects with type 2 diabetes. The major health factors identified were blood glucose, systolic and diastolic blood pressure, LDL cholesterol, weight, BMI, triglyceride levels, and waist circumference. A literature search was performed using peer-reviewed, scholarly articles focused on the health outcomes of type 2 diabetes patients served by various telemedicine interventions. A total of 15 articles met the search criteria and were then analyzed to determine the significant health outcomes of each telemedicine intervention for type 2 diabetes patients. Results showed that telemedicine interventions using videoconferencing technology resulted in significant improvements in five health factor outcomes (total body weight, BMI, blood glucose, LDL cholesterol, and blood pressure), while telemedicine interventions using web applications and health monitors/modems produced significant improvements only in blood glucose. Future research should focus on examining the costs and benefits of videoconferencing and other telemedicine technologies for type 2 diabetes patients.
Abstract:
A common complication of antibiotic use is the development of diarrheal illness. The pathogenesis of antibiotic-associated diarrhea (AAD) may be mediated through alteration of the intestinal microbiota, overgrowth of opportunistic pathogens, and direct drug toxicity on the gut. Alterations in the intestinal microbiota result in metabolic imbalances and loss of colonization resistance, which in turn allow proliferation of opportunistic pathogens. Currently, less than 33% of AAD cases can be attributed to Clostridium difficile, leaving a large number of cases undiagnosed and poorly treated. Although the pathogenesis of Clostridium difficile infection (CDI) has been well documented, the role of other putative microbial etiologies (Clostridium perfringens, Staphylococcus aureus, Klebsiella oxytoca, Candida species) and their pathogenic mechanisms in AAD has been unclear. This review provides a comprehensive and systematic approach to the existing data on AAD and includes concise descriptions of the pathogenesis of CDI and non-CDI AAD in the form of figures.
Abstract:
Renal insufficiency is one of the most common comorbidities in heart failure (HF) patients and has a significant impact on mortality and adverse outcomes. Cystatin C has been shown to be a promising marker of renal function. A systematic review of all published studies evaluating the prognostic role of cystatin C in both acute and chronic HF was undertaken. A comprehensive literature search was conducted using various terms for 'cystatin C' and 'heart failure' in the PubMed MEDLINE and Embase libraries using the Scopus database. A total of twelve observational studies were selected for detailed assessment in this review: six performed in acute HF patients and six in chronic HF patients. Cystatin C was used as a continuous variable, as quartiles/tertiles, or as a categorical variable in these studies, and different mortality endpoints were reported. All twelve studies demonstrated a significant association of cystatin C with mortality. This association was independent of other baseline risk factors known to affect HF outcomes. In both acute and chronic HF, cystatin C was not only a strong predictor of outcomes but also a better prognostic marker than creatinine and estimated glomerular filtration rate (eGFR). A combination of cystatin C with other biomarkers, such as N-terminal pro B-type natriuretic peptide (NT-proBNP) or creatinine, also improved risk stratification. The plausible mechanisms are renal dysfunction, inflammation, or a direct effect of cystatin C on ventricular remodeling. Either alone or in combination, cystatin C is an accurate and reliable biomarker for HF prognosis.
Abstract:
Central Line-Associated Bloodstream Infections (CLABSIs) are among the most costly and preventable causes of morbidity and mortality in intensive care units (ICUs) in health care today. In 2008, the Centers for Medicare and Medicaid Services Medicare Program, under the Deficit Reduction Act, announced that it would no longer reimburse hospitals for adverse events related to CLABSIs. This shifts the financial burden onto the hospital rather than the health care payer, who can now withhold reimbursement. With this weighing more heavily on hospital management, decision makers will need to find a way to completely prevent cases of CLABSI or simply pay for the financial consequences. To reduce the risk of CLABSIs, several clinical preventive interventions have been studied and instituted, including the Central Line (CL) Bundle and antimicrobial-coated central venous catheters (AM-CVCs). I carried out a formal systematic review to compare the cost-effectiveness of the CL Bundle with the commercially available AM-CVCs in preventing CLABSIs among critically and chronically ill patients in the U.S. Evidence was assessed for inclusion against predefined criteria, and I conducted the data extraction. Ten studies were included in the review. The efficacy of the CL Bundle and AM-CVC interventions in reducing the mean incidence rate of CLABSI was compared, along with costs. The AM-CVC impregnated with the antibiotics rifampin and minocycline (AI-RM) is more clinically effective than the CL Bundle in reducing the mean rate of CLABSI per 1,000 catheter-days. The lowest mean incidence rate of CLABSI per 1,000 catheter-days among the AM-CVC studies was as low as zero, in favor of the AI-RM. Moreover, the review revealed that the AI-RM appears to be more cost-effective than the CL Bundle: the adjusted incremental cost of the CL Bundle was approximately $196 per ICU patient requiring a CVC, while the AI-RM added only about $48 per ICU patient requiring a CVC. Limited data on the cost of the CL Bundle made it difficult to make a true comparison with the direct cost of the AM-CVCs. However, using the results of this review, I concluded that the AM-CVCs do appear to be more cost-effective than the CL Bundle in decreasing the mean rate of CLABSI while also minimizing incremental costs per CVC. This review calls for further research addressing the cost of the CL Bundle and compliance, and for more rigorous study designs, such as randomized controlled trials comparing the efficacy and cost of the CL Bundle and the AM-CVCs. Barriers that health care managers may face when implementing the CL Bundle or AM-CVCs include the additional costs associated with the intervention, educational training, and ongoing reinforcement, as well as creating a new culture of understanding.
Abstract:
A strategy of pre-hospital reduced-dose fibrinolytic administration coupled with urgent percutaneous coronary intervention (PCI) for patients with STEMI (FAST-PCI) has been found to be superior to primary PCI (PPCI) alone. A coordinated STEMI system of care that includes FAST-PCI might offer better outcomes than pre-hospital diagnosis and STEMI team activation followed by PPCI alone. We compared the in-hospital outcomes of patients treated with the FAST-PCI approach with outcomes of patients treated with the PPCI approach during a pause in the FAST-PCI protocol. In-hospital data for 253 STEMI patients (03/2003–12/2009) treated with the FAST-PCI protocol were compared with data for 124 patients (12/2009–08/2011) treated with the PPCI strategy alone. In-hospital mortality was the primary endpoint; stroke, major bleeding, and reinfarction during the index hospitalization were secondary endpoints. Comparing the strategies used during the two time intervals, in-hospital mortality was significantly lower with FAST-PCI than with PPCI (2.77% vs. 10.48%, p = 0.0017). Rates of stroke, reinfarction, and major bleeding were similar between the two groups. A lower frequency of pre-PCI TIMI 0 flow (no patency) was seen in patients treated with FAST-PCI than in the PPCI patients (26.7% vs. 62.7%, p<0.0001). Earlier infarct-related artery patency in the FAST-PCI group had a favorable impact on the incidence of cardiogenic shock at hospital admission (FAST-PCI 3.1% vs. PPCI 20.9%, p<0.0001). The FAST-PCI strategy was associated with earlier infarct-related artery patency, a lower incidence of cardiogenic shock on hospital arrival, and reduced in-hospital mortality among STEMI patients.
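As a quick worked check of the headline mortality comparison, the Python sketch below reconstructs approximate event counts from the reported percentages (2.77% of 253 ≈ 7 deaths; 10.48% of 124 ≈ 13 deaths) and applies Fisher's exact test in SciPy. The exact counts and the original statistical test are not given in the abstract, so this is an assumption-laden illustration rather than a reproduction.

    from scipy.stats import fisher_exact

    deaths_fast, n_fast = 7, 253       # ≈ 2.77% of 253 (reconstructed, not reported counts)
    deaths_ppci, n_ppci = 13, 124      # ≈ 10.48% of 124 (reconstructed, not reported counts)

    table = [[deaths_fast, n_fast - deaths_fast],
             [deaths_ppci, n_ppci - deaths_ppci]]
    odds_ratio, p_value = fisher_exact(table)
    print(f"in-hospital mortality {deaths_fast / n_fast:.2%} vs {deaths_ppci / n_ppci:.2%}, "
          f"Fisher exact p = {p_value:.4f}")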