931 results for device failure analysis
Abstract:
OBJECTIVES The purpose of this study was to compare the 2-year safety and effectiveness of new- versus early-generation drug-eluting stents (DES) according to the severity of coronary artery disease (CAD) as assessed by the SYNTAX (Synergy between Percutaneous Coronary Intervention with Taxus and Cardiac Surgery) score. BACKGROUND New-generation DES are considered the standard of care in patients with CAD undergoing percutaneous coronary intervention. However, there are few data investigating the effects of new- versus early-generation DES according to the anatomic complexity of CAD. METHODS Patient-level data from 4 contemporary, all-comers trials were pooled. The primary device-oriented clinical endpoint was the composite of cardiac death, myocardial infarction, or ischemia-driven target-lesion revascularization (TLR). The principal effectiveness and safety endpoints were TLR and definite stent thrombosis (ST), respectively. Adjusted hazard ratios (HRs) with 95% confidence intervals (CIs) were calculated at 2 years for overall comparisons, as well as stratified for patients with lower (SYNTAX score ≤11) and higher complexity (SYNTAX score >11). RESULTS A total of 6,081 patients were included in the study. New-generation DES (n = 4,554) compared with early-generation DES (n = 1,527) reduced the primary endpoint (HR: 0.75 [95% CI: 0.63 to 0.89]; p = 0.001) without interaction (p = 0.219) between patients with lower (HR: 0.86 [95% CI: 0.64 to 1.16]; p = 0.322) versus higher CAD complexity (HR: 0.68 [95% CI: 0.54 to 0.85]; p = 0.001). In patients with SYNTAX score >11, new-generation DES significantly reduced TLR (HR: 0.36 [95% CI: 0.26 to 0.51]; p < 0.001) and definite ST (HR: 0.28 [95% CI: 0.15 to 0.55]; p < 0.001) to a greater extent than in the low-complexity group (TLR p for interaction = 0.059; ST p for interaction = 0.013).
New-generation DES decreased the risk of cardiac mortality in patients with SYNTAX score >11 (HR: 0.45 [95% CI: 0.27 to 0.76]; p = 0.003) but not in patients with SYNTAX score ≤11 (p for interaction = 0.042). CONCLUSIONS New-generation DES improve clinical outcomes compared with early-generation DES, with greater safety and effectiveness in patients with SYNTAX score >11.
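As an illustration of the statistics above, the interaction p-value can be approximately reconstructed from the two subgroup hazard ratios alone: the standard error of each log-HR is recovered from the width of its 95% CI, and the subgroup effects are compared with a two-sample z-test. This is a hedged, pure-Python sketch (the helper names are our own, not from the study) using the primary-endpoint numbers reported in the abstract.

```python
import math

def se_from_ci(lo, hi):
    # SE of log(HR), recovered from the 95% CI: log-width = 2 * 1.96 * SE
    return (math.log(hi) - math.log(lo)) / (2 * 1.96)

def interaction_p(hr1, ci1, hr2, ci2):
    # z-test on the difference of log hazard ratios between two subgroups
    b1, b2 = math.log(hr1), math.log(hr2)
    se1, se2 = se_from_ci(*ci1), se_from_ci(*ci2)
    z = (b1 - b2) / math.sqrt(se1 ** 2 + se2 ** 2)
    # two-sided p from the standard normal CDF (via the error function)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Subgroup HRs for the primary endpoint: SYNTAX score <=11 vs >11
p_int = interaction_p(0.86, (0.64, 1.16), 0.68, (0.54, 0.85))
print(round(p_int, 3))  # ~0.218, close to the reported interaction p = 0.219
```

The small discrepancy from the published 0.219 is expected: the authors tested the interaction inside the Cox model itself, whereas this sketch only back-calculates from the rounded CIs.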
Abstract:
The European Registry for Patients with Mechanical Circulatory Support (EUROMACS) was founded on 10 December 2009 on the initiative of Roland Hetzer (Deutsches Herzzentrum Berlin, Berlin, Germany) and Jan Gummert (Herz- und Diabeteszentrum Nordrhein-Westfalen, Bad Oeynhausen, Germany), together with 15 other founding international members. It aims to promote scientific research to improve the care of end-stage heart failure patients who have a ventricular assist device or a total artificial heart as long-term mechanical circulatory support. Likewise, the organization aims to provide and maintain a registry of device implantation data and long-term follow-up of patients with mechanical circulatory support. Hence, EUROMACS affiliated itself with Dendrite Clinical Systems Ltd to offer its members a software tool that allows input and analysis of patient clinical data on a daily basis. EUROMACS facilitates further scientific studies by offering research groups access to the available data, with patients and centres anonymized. Furthermore, EUROMACS aims to stimulate cooperation with clinical and research institutions and with peer associations involved in furthering its aims. EUROMACS is the only European-based registry for patients with mechanical circulatory support, and its institutional and individual membership is increasing rapidly. Because of this expeditious data input, the European Association for Cardiothoracic Surgeons saw the need to optimize the availability and significance of the registry data to improve the care of patients with mechanical circulatory support and the registry's potential contribution to scientific aims; hence their alliance began in 2012. This first annual report is designed to provide an overview of EUROMACS' structure, its activities, a first data collection and an insight into its scientific contributions.
Abstract:
PURPOSE The objective of this study was to evaluate stiffness, strength, and failure modes of monolithic crowns produced using computer-aided design/computer-assisted manufacture, which are connected to diverse titanium and zirconia abutments on an implant system with tapered, internal connections. MATERIALS AND METHODS Twenty monolithic lithium disilicate (LS2) crowns were constructed and loaded on bone level-type implants in a universal testing machine under quasistatic conditions according to DIN ISO 14801. Comparative analysis included a 2 × 2 format: prefabricated titanium abutments using proprietary bonding bases (group A) vs nonproprietary bonding bases (group B), and customized zirconia abutments using proprietary Straumann CARES (group C) vs nonproprietary Astra Atlantis (group D) material. Stiffness and strength were assessed and calculated statistically with the Wilcoxon rank sum test. Cross-sections of each tested group were inspected microscopically. RESULTS Loaded LS2 crowns, implants, and abutment screws in all tested specimens (groups A, B, C, and D) did not show any visible fractures. For an analysis of titanium abutments (groups A and B), stiffness and strength showed equally high stability. In contrast, proprietary and nonproprietary customized zirconia abutments exhibited statistically significant differences with a mean strength of 366 N (Astra) and 541 N (CARES) (P < .05); as well as a mean stiffness of 884 N/mm (Astra) and 1,751 N/mm (CARES) (P < .05), respectively. Microscopic cross-sections revealed cracks in all zirconia abutments (groups C and D) below the implant shoulder. CONCLUSION Depending on the abutment design, prefabricated titanium abutment and proprietary customized zirconia implant-abutment connections in conjunction with monolithic LS2 crowns had the best results in this laboratory investigation.
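For readers unfamiliar with the Wilcoxon rank sum test used above, the sketch below shows the core computation in pure Python. The strength values are made-up illustrative numbers (not the study's measurements), and the large-sample normal approximation omits the tie correction.

```python
import math

def rank_sum_test(x, y):
    """Wilcoxon rank-sum (Mann-Whitney U) test, two-sided,
    using the large-sample normal approximation (tie correction omitted)."""
    pooled = sorted(x + y)
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + j + 1) / 2  # 1-based average rank over ties
        i = j
    n1, n2 = len(x), len(y)
    r1 = sum(ranks[v] for v in x)
    u = r1 - n1 * (n1 + 1) / 2              # U statistic for sample x
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u, p

# Hypothetical strength values (N) for two abutment groups -- illustrative only
astra = [340, 355, 372, 380, 365]
cares = [520, 545, 560, 530, 550]
u, p = rank_sum_test(astra, cares)
print(u, round(p, 4))
```

With samples this small a study would normally use the exact U distribution rather than the normal approximation; the sketch only shows the mechanics of ranking and the test statistic.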
Abstract:
PURPOSE The aim of this study was to analyze the patient pool referred to a specialty clinic for implant surgery over a 3-year period. MATERIALS AND METHODS All patients receiving dental implants between 2008 and 2010 at the Department of Oral Surgery and Stomatology were included in the study. As primary outcome parameters, the patients were analyzed according to the following criteria: age, sex, systemic diseases, and indication for therapy. For the inserted implants, the type of surgical procedure, the types of implants placed, postsurgical complications, and early failures were recorded. A logistic regression analysis was performed to identify possible local and systemic risk factors for complications. As a secondary outcome, data regarding demographics and surgical procedures were compared with the findings of a historic study group (2002 to 2004). RESULTS A total of 1,568 patients (792 women and 776 men; mean age, 52.6 years) received 2,279 implants. The most frequent indication was a single-tooth gap (52.8%). Augmentative procedures were performed in 60% of the cases. Tissue-level implants (72.1%) were more frequently used than bone-level implants (27.9%). Regarding dimensions of the implants, a diameter of 4.1 mm (59.7%) and a length of 10 mm (55.0%) were most often utilized. An early failure rate of 0.6% was recorded (13 implants). Patients were older and received more implants in the maxilla, and the complexity of surgical interventions had increased when compared to the patient pool of 2002 to 2004. CONCLUSION Implant therapy performed in a surgical specialty clinic utilizing strict patient selection and evidence-based surgical protocols showed a very low early failure rate of 0.6%.
Abstract:
Venous angioplasty with stenting of iliac veins is an important treatment option for patients suffering from post-thrombotic syndrome due to chronic venous obstruction. Interventional treatment of a chronically occluded vena cava, however, is challenging and often associated with failure. We describe a case of a chronic total occlusion of the entire inferior vena cava that was successfully recanalized using bidirectional wire access and a balloon puncture by a re-entry catheter to establish patency of the inferior vena cava.
Abstract:
Detecting lame cows is important in improving animal welfare. Automated tools are potentially useful to enable identification and monitoring of lame cows. The goals of this study were to evaluate the suitability of various physiological and behavioral parameters to automatically detect lameness in dairy cows housed in a cubicle barn. Lame cows suffering from a claw horn lesion (sole ulcer or white line disease) of one claw of the same hind limb (n=32; group L) and 10 nonlame healthy cows (group C) were included in this study. Lying and standing behavior at night (tridimensional accelerometers), weight distribution between hind limbs (4-scale weighing platform), feeding behavior at night (nose band sensor), and heart activity (Polar device; Polar Electro Oy, Kempele, Finland) were assessed. Either the entire data set or parts of the data collected over a 48-h period were used for statistical analysis, depending upon the parameter in question. The standing time at night over 12 h and the limb weight ratio (LWR) were significantly higher in group C as compared with group L, whereas the lying time at night over 12 h, the mean limb difference (Δweight), and the standard deviation (SD) of the weight applied on the limb taking less weight were significantly lower in group C as compared with group L. No significant difference was noted between the groups for the parameters of heart activity and feeding behavior at night. The locomotion score of cows in group L was positively correlated with the lying time and Δweight, whereas it was negatively correlated with LWR and SD. The highest sensitivity (0.97) for lameness detection was found for the parameter SD [specificity of 0.80 and an area under the curve (AUC) of 0.84]. The highest specificity (0.90) for lameness detection was present for Δweight (sensitivity=0.78; AUC=0.88) and LWR (sensitivity=0.81; AUC=0.87).
The model combining SD with the lying time at night was the best predictor of cows being lame, accounting for 40% of the variation in the likelihood of a cow being lame (sensitivity=0.94; specificity=0.80; AUC=0.86). In conclusion, the data derived from the 4-scale weighing platform, either alone or combined with the lying time at night over 12 h, represent the most valuable parameters for automated identification of lame cows suffering from a claw horn lesion of one individual hind limb.
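The sensitivity and specificity figures above follow directly from a 2x2 confusion table. A minimal sketch, using hypothetical counts chosen to be consistent with the reported group sizes (32 lame, 10 control) and the reported values for the SD parameter:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts consistent with the abstract: of 32 lame cows,
# 31 flagged (TP) and 1 missed (FN); of 10 healthy cows, 8 cleared (TN)
# and 2 falsely flagged (FP) -- these exact counts are an assumption.
sens, spec = sens_spec(tp=31, fn=1, tn=8, fp=2)
print(round(sens, 2), round(spec, 2))  # 0.97 0.8
```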
Abstract:
STUDY DESIGN Retrospective analysis of prospectively collected clinical data. OBJECTIVE To assess the long-term outcome of patients with monosegmental L4/5 degenerative spondylolisthesis treated with the dynamic Dynesys device. SUMMARY OF BACKGROUND DATA The Dynesys system has been used as a semirigid, lumbar dorsal pedicular stabilization device since 1994. Good short-term results have been reported, but little is known about the long-term outcome after treatment for degenerative spondylolisthesis at the L4/5 level. METHODS A total of 39 consecutive patients with symptomatic degenerative lumbar spondylolisthesis at the L4/5 level were treated with bilateral decompression and Dynesys instrumentation. At a mean follow-up of 7.2 years (range, 5.0-11.2 y), they underwent clinical and radiographic evaluation and quality of life assessment. RESULTS At final follow-up, back pain improved in 89% and leg pain improved in 86% of patients compared with preoperative status. Eighty-three percent of patients reported global subjective improvement. Ninety-two percent would undergo the surgery again. Eight patients (21%) required further surgery because of symptomatic adjacent segment disease (6 cases), late-onset infection (1 case), and screw breakage (1 case). In 9 cases, radiologic progression of spondylolisthesis at the operated segment was found. Seventy-four percent of operated segments showed limited flexion-extension range of <4 degrees. Adjacent segment pathology, although without clinical correlation, was diagnosed at the L5/S1 (17.9%) and L3/4 (28.2%) segments. In 4 cases, asymptomatic screw loosening was observed. CONCLUSIONS Monosegmental Dynesys instrumentation of degenerative spondylolisthesis at L4/5 shows good long-term results. The rate of secondary surgeries is comparable to other dorsal instrumentation devices. Residual range of motion in the stabilized segment is reduced, and the rate of radiologic and symptomatic adjacent segment degeneration is low. 
Patient satisfaction is high. Dynesys stabilization of symptomatic L4/5 degenerative spondylolisthesis is a possible alternative to other stabilization devices.
Abstract:
Purpose. To investigate and understand the illness experiences of patients and their family members living with congestive heart failure (CHF). Design. Focused ethnographic design. Setting. One outpatient cardiology clinic, two outpatient heart failure clinics, and informants' homes in a large metropolitan city located in southeast Texas. Sample. A purposeful sampling technique was used to select a sample of 28 informants. The following somewhat overlapping sampling strategies were used to implement the purposeful method: criterion; typical case; operational construct; maximum variation; atypical case; opportunistic; and confirming and disconfirming case sampling. Methods. Naturalistic inquiry consisted of data collected from observations, participant observations, and interviews. Open-ended semi-structured illness narrative interviews included questions designed to elicit informants' explanatory models of the illness, which served as a synthesizing framework for the analysis. A thematic analysis process was conducted through domain analysis and construction of data into themes and sub-themes. Credibility was enhanced through informant verification and a process of peer debriefing. Findings. Thematic analysis revealed that patients and their family members living with CHF experience a process of disruption, incoherence, and reconciling. Reconciling emerged as the salient experience described by informants. Sub-themes of reconciling that emerged from the analysis included: struggling; participating in partnerships; finding purpose and meaning in the illness experience; and surrendering. Conclusions. Understanding the experiences described in this study allows for a better understanding of living with CHF in everyday life. Findings from this study suggest that the experience of living with CHF entails more than the medical story can tell.
It is important for nurses and other providers to understand the experiences of this population in order to develop appropriate treatment plans in a successful practitioner-patient partnership.
Abstract:
Background. Heart failure (HF) is a health problem of epidemic proportions and a clinical syndrome that leads to progressively severe symptoms, which contribute significantly to the burden of the disease. Several factors may affect the symptom burden of patients with HF, including physiological, psychological, and spiritual factors. This study was designed to examine the inter-relationship of physiological, psychological, and spiritual factors affecting symptoms for patients with HF. Objectives. The aims of this study were to examine the symptom burden of heart failure patients related to: (1) the physiological factor of brain natriuretic peptide (BNP); (2) the psychological factor of depression; (3) the spiritual factors of self-transcendence and purpose in life; and (4) the combined effects of physiological, psychological, and spiritual factors. One additional aim was to describe symptom intensity related to symptom burden. Methods. A cross-sectional non-experimental correlational design was used to examine factors affecting symptom burden in 105 patients with HF from a southwestern medical center outpatient heart failure clinic. Both men and women were included; average age was 56.6 (SD = 16.86). All measures except BNP were obtained by patient self-report. Results. The mean number of symptoms present was 8.17 (SD = 3.34), with the three most common symptoms being shortness of breath on exertion, fatigue, and weakness. The mean symptom intensity was 365.66 (SD = 199.50) on a summative scale of visual analogue reports for 13 symptoms. The mean BNP level was 292.64 pg/ml (SD = 571.11). The prevalence rate for depression was 43.6%, with a mean score of 3.48 (SD = 2.75) on the Center for Epidemiological Studies Depression (CES-D 10) scale. In a multivariate analysis, depression was the only significant predictor of symptom burden (r = .474; P < .001), accounting for 18% of the variance.
Spirituality had an interaction effect with depression (P ≤ .001), serving as a moderator between depression and symptom burden. Conclusion. HF is a chronic and progressive syndrome characterized by severe symptoms, hospitalizations, and disability. Depression is significantly related to symptom burden, and this relationship is moderated by spirituality.
Abstract:
Background. Ambulatory blood pressure (ABP) measurement is a means of monitoring cardiac function in a noninvasive way, but little is known about ABP in heart failure (HF) patients. Blood pressure (BP) declines during sleep as protection from consistent BP load, a phenomenon termed "dipping." The aims of this study were (1) to compare BP dipping and physical activity (PA) between two groups of HF patients with different functional statuses and (2) to determine whether the strength of the association between ambulatory BP and PA differs between these two functional statuses of HF. Methods. This observational study used repeated measures of ABP and PA over a 24-hour period to investigate the profiles of BP and PA in community-based individuals with HF. ABP was measured every 30 minutes using a SpaceLabs 90207, and a Basic Motionlogger actigraph was used to measure PA minute by minute. Fifty-six participants completed both BP and PA monitoring over a 24-hour period. Functional status was based on New York Heart Association (NYHA) ratings. There were 27 patients with no limitation of PA (NYHA class I HF) and 29 with some limitation of PA but no discomfort at rest (NYHA class II or III HF). The sample consisted of 26 men and 30 women, aged 45 to 91 years (66.96 ± 12.35). Results. Patients with NYHA class I HF had a significantly greater dipping percent than those with NYHA class II/III HF after controlling for left ventricular ejection fraction (LVEF). In a mixed model analysis (PROC MIXED, SAS Institute, v 9.1), PA was significantly related to ambulatory systolic and diastolic BP and mean arterial pressure. The strength of the association between PA and ABP readings was not significantly different between the two groups of patients. Conclusions. These preliminary findings demonstrate differences between NYHA class I and class II/III HF in BP dipping status and ABP but not PA.
Longitudinal research is recommended to improve understanding of the influence of disease progression on changes in 24-hour physical activity and BP profiles of this patient population. Key Words. Ambulatory Blood Pressure; Blood Pressure Dipping; Heart Failure; Physical Activity.
Abstract:
Introduction. The HIV/AIDS disease burden disproportionately affects minority populations, specifically African Americans. While sexual risk behaviors play a role in the observed HIV burden, other factors including gender, age, socioeconomics, and barriers to healthcare access may also be contributory. The goal of this study was to determine how far the HIV/AIDS disease process has advanced by the time people of different ethnicities first present for healthcare. The study specifically analyzed differences in CD4 cell counts at the initial HIV-1 diagnosis with respect to ethnicity. The study also analyzed racial differences in HIV/AIDS risk factors. Methods. This is a retrospective study using data from the Adult Spectrum of HIV Disease (ASD), collected by the City of Houston Department of Health. The ASD database contains information on newly reported HIV cases in the Harris County District Hospitals between 1989 and 2000. Each patient had an initial and a follow-up report. The extracted variables of interest from the ASD data set were CD4 counts at the initial HIV diagnosis, race, gender, age at HIV diagnosis, and behavioral risk factors. One-way ANOVA was used to examine differences in baseline CD4 counts at HIV diagnosis between racial/ethnic groups. Chi square was used to analyze racial differences in risk factors. Results. The analyzed study sample comprised 4,767 patients. The study population was 47% Black, 37% White and 16% Hispanic [p<0.05]. The mean and median CD4 counts at diagnosis were 254 and 193 cells per microliter, respectively. At the initial HIV diagnosis Blacks had the highest average CD4 counts (285), followed by Whites (233) and Hispanics (212) [p<0.001]. These statistical differences, however, were only observed with CD4 counts above 350 [p<0.001], even when adjusted for age at diagnosis and gender [p<0.05].
Looking at risk factors, Blacks were mostly affected by intravenous drug use (IVDU) and heterosexual contact, whereas Whites and Hispanics were more affected by male homosexual contact [p<0.05]. Conclusion. (1) There were statistical differences in CD4 counts with respect to ethnicity, but these differences only existed for CD4 counts above 350. These differences, however, do not appear to have clinical significance. Counterintuitively, Blacks had the highest CD4 counts, followed by Whites and Hispanics. (2) 50% of this study group clinically had AIDS at their initial HIV diagnosis (median CD4 = 193), irrespective of ethnicity. It was not clear from the data analysis whether these observations were due to failure of early HIV surveillance, HIV testing policies, or healthcare access. More studies need to be done to address this question. (3) Homosexual and bisexual contact were the biggest risk factors for Whites and Hispanics, whereas Blacks were mostly affected by heterosexual contact and IVDU, implying a need for different public health intervention strategies for these racial groups.
Abstract:
Mycobacterium tuberculosis, a bacillus known to cause disease in humans since ancient times, is the etiological agent of tuberculosis (TB). The infection is primarily pulmonary, although other organs may also be affected. The prevalence of pulmonary TB disease in the US is highest along the US-Mexico border, and of the four US states bordering Mexico, Texas had the second highest percentage of cases of TB disease among Mexico-born individuals in 1999 (CDC, 2001). Between 1993 and 1998, the prevalence of drug-resistant (DR) TB was 9.1% among Mexican-born individuals and 4.4% among US-born individuals (CDC, 2001). In the same time period, the prevalence of multidrug-resistant (MDR) TB was 1.4% among Mexican-born individuals and 0.6% among US-born individuals (CDC, 2001). There is a renewed urgency in the quest for faster and more effective screening, diagnosis, and treatment methods for TB due to the resurgence of tuberculosis in the US during the mid-1980s and early 1990s (CDC, 2007a), and the emergence of drug-resistant, multidrug-resistant, and extensively drug-resistant tuberculosis worldwide. Failure to identify DR and MDR-TB quickly leads to poorer treatment outcomes (CDC, 2007b). The recent rise in TB/HIV comorbidity further complicates TB control efforts. The gold standard for identification of DR-TB requires mycobacterial growth in culture, a technique taking up to three weeks, during which time individuals harboring DR/MDR-TB organisms may be receiving inappropriate treatment. The goal of this study was to determine the sensitivity and specificity of real-time quantitative polymerase chain reaction (qPCR) using molecular beacons in the Texas population. qPCR using molecular beacons is a novel approach to detect mycobacterial mutations conferring drug resistance. This technique is time-efficient and has been shown to have high sensitivity and specificity in several populations worldwide.
Rifampin (RIF) susceptibility was chosen as the test parameter because strains of M. tuberculosis that are resistant to RIF are likely to also be MDR. Because Texas is a point of entry for many immigrants into the US, control efforts against TB and drug-resistant TB in Texas are a vital component of prevention efforts in the US as a whole. We show that qPCR using molecular beacons has high sensitivity and specificity when compared with culture (94% and 87%, respectively) and DNA sequencing (90% and 96%, respectively). We also used receiver operating characteristic curve analysis to calculate cutoff values for the objective determination of results obtained by qPCR using molecular beacons.
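The cutoff-value derivation via receiver operating characteristic analysis mentioned above is commonly done by maximizing Youden's J statistic (sensitivity + specificity - 1) across candidate thresholds. A minimal sketch with made-up qPCR signal values (illustrative only; not the study's data):

```python
def best_cutoff(scores_pos, scores_neg):
    """Pick the threshold maximizing Youden's J = sensitivity + specificity - 1.
    A sample is called positive when its score is >= the threshold."""
    best = (None, -1.0)
    for t in sorted(set(scores_pos) | set(scores_neg)):
        sens = sum(s >= t for s in scores_pos) / len(scores_pos)
        spec = sum(s < t for s in scores_neg) / len(scores_neg)
        j = sens + spec - 1
        if j > best[1]:
            best = (t, j)
    return best

# Hypothetical qPCR signals: RIF-resistant (positive) vs susceptible (negative)
pos = [0.82, 0.91, 0.75, 0.88, 0.95, 0.67]
neg = [0.30, 0.42, 0.55, 0.25, 0.61, 0.38]
cutoff, j = best_cutoff(pos, neg)
print(cutoff, round(j, 2))  # 0.67 1.0 -- the toy groups separate perfectly
```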
Abstract:
Ubiquitination is an essential process involved in basic biological processes such as the cell cycle and cell death. Ubiquitination is initiated by ubiquitin-activating enzymes (E1), which activate and transfer ubiquitin to ubiquitin-conjugating enzymes (E2). Subsequently, ubiquitin is transferred to target proteins via ubiquitin ligases (E3). Defects in ubiquitin conjugation have been implicated in several forms of malignancy, the pathogenesis of several genetic diseases, immune surveillance/viral pathogenesis, and the pathology of muscle wasting. However, the consequences of partial or complete loss of ubiquitin conjugation in multi-cellular organisms are not well understood. Here, we report the characterization of uba1, the sole E1 in Drosophila. We have determined that weak and strong uba1 alleles behave genetically differently, sometimes with opposing phenotypes. For example, weak uba1 alleles protect cells from cell death, whereas cells containing strong loss-of-function alleles are highly apoptotic. These opposing phenotypes are due to differing sensitivities of cell death pathway components to alterations in ubiquitination levels. In addition, strong uba1 alleles induce cell cycle arrest due to defects in the protein degradation of Cyclins. Surprisingly, clones of strong uba1 mutant cells stimulate neighboring wild-type tissue to undergo cell division in a non-autonomous manner, resulting in severe overgrowth phenotypes in the mosaic fly. I have determined that the observed overgrowth phenotypes were due to a failure to downregulate the Notch signaling pathway in uba1 mutant cells. Aberrant Notch signaling results in the secretion of a local cytokine and activation of the JAK/STAT pathway in neighboring cells. In addition, we elucidated a model describing the regulation of the caspase Dronc in surviving cells. Binding of Dronc by its inhibitor Diap1 is necessary but not sufficient to inhibit Dronc function.
Ubiquitin conjugation and Uba1 function are necessary for the negative regulation of Dronc.
Abstract:
The Food and Drug Administration (FDA) and the Centers for Medicare and Medicaid Services (CMS) play key roles in making Class III medical devices available to the public, and they are required by law to meet statutory deadlines for applications under review. Historically, both agencies have failed to meet their respective statutory requirements. Since these failures affect patient access and may adversely impact public health, Congress has enacted several “modernization” laws. However, the effectiveness of these modernization laws has not been adequately studied or established for Class III medical devices. The aim of this research study was, therefore, to analyze how these modernization laws may have affected public access to medical devices. Two questions were addressed: (1) How have the FDA modernization laws affected the time to approval for medical device premarket approval applications (PMAs)? (2) How has the CMS modernization law affected the time to approval for national coverage decisions (NCDs)? The data for this research study were collected from publicly available databases for the period January 1, 1995, through December 31, 2008. These dates were selected to ensure that a sufficient period of time was captured to measure pre- and post-modernization effects on time to approval. All records containing original PMAs were obtained from the FDA database, and all records containing NCDs were obtained from the CMS database. Source documents, including FDA premarket approval letters and CMS national coverage decision memoranda, were reviewed to obtain additional data not found in the search results. Analyses were conducted to determine the effects of the pre- and post-modernization laws on time to approval. Secondary analyses of FDA subcategories were conducted to uncover any causal factors that might explain differences in time to approval and to compare with the primary trends.
The primary analysis showed that the FDA modernization laws of 1997 and 2002 initially reduced PMA time to approval; after the 2002 modernization law, the time to approval began increasing and continued to increase through December 2008. The non-combined subcategory approval trends were similar to the primary analysis trends. The combined subcategory analysis showed no clear trends, with the exception of non-implantable devices, for which time to approval trended down after 1997. The CMS modernization law of 2003 reduced NCD time to approval, a trend that continued through December 2008. This study also showed that approximately 86% of PMA devices do not receive NCDs. As a result of this research study, recommendations are offered to help resolve statutory non-compliance and access issues, as follows: (1) Authorities should examine underlying causal factors for the observed trends; (2) Process improvements should be made to better coordinate FDA and CMS activities, including sharing data, reducing duplication, and establishing clear criteria for “safe and effective” and “reasonable and necessary”; (3) A common identifier should be established to allow tracking and trending of applications between FDA and CMS databases; (4) Statutory requirements may need to be revised; and (5) An investigation should be undertaken to determine why NCDs are not issued for the majority of PMAs. Any process improvements should be made without creating additional safety risks or adversely impacting public health. Finally, additional studies are needed to fully characterize and better understand the trends identified in this research study.
Abstract:
Trastuzumab is a humanized monoclonal antibody developed specifically for patients with HER2-neu over-expressed breast cancer. Although highly effective and well tolerated, it has been reported to be associated with Congestive Heart Failure (CHF) in clinical trial settings (up to 27%). This leaves a gap: the rate of Trastuzumab-related CHF in the general population, especially among older breast cancer patients on long-term Trastuzumab treatment, remains unknown. This thesis examined the rates and risk factors associated with Trastuzumab-related CHF in a large population of older breast cancer patients. A retrospective cohort study using the existing Surveillance, Epidemiology, and End Results (SEER)-Medicare linked de-identified database was performed. Breast cancer patients ≥ 66 years old, stage I-IV, diagnosed in 1998-2007, fully covered by Medicare with no HMO within 1 year before and after the first diagnosis month, who received their first chemotherapy no earlier than 30 days prior to diagnosis, were selected as the study cohort. The primary outcome of this study was a diagnosis of CHF after starting chemotherapy, with no CHF claims on or before the cancer diagnosis date. ICD-9 and HCPCS codes were used to pool the claims for Trastuzumab use, chemotherapy, comorbidities, and CHF. Statistical analyses included comparison of characteristics, Kaplan-Meier survival estimates of CHF rates over long-term follow-up, and a multivariable Cox regression model using Trastuzumab as a time-dependent variable. Of the 17,684 patients in the selected cohort, 2,037 (12%) received Trastuzumab. Among them, 35% (714 of 2,037) were diagnosed with CHF, compared with a 31% CHF rate (4,784 of 15,647) in other chemotherapy recipients (p<.0001). After 10 years of follow-up, 65% of Trastuzumab users developed CHF, compared to 47% of their counterparts.
After adjusting for patient demographic, tumor, and clinical characteristics, older breast cancer patients who used Trastuzumab showed a significantly higher risk of developing CHF than other chemotherapy recipients (HR 1.69, 95% CI 1.54-1.85), and this risk increased with age (p-value < .0001). Among Trastuzumab users, the following covariates also significantly increased the risk of CHF: older age, stage IV disease, Non-Hispanic black race, unmarried status, comorbidities, Anthracycline use, Taxane use, and lower educational level. In conclusion, Trastuzumab users among older breast cancer patients had a 69% higher risk of developing CHF than non-Trastuzumab users, a burden much higher than reported in younger clinical trial patients (up to 27%). Older age, Non-Hispanic black race, unmarried status, comorbidity, and combined use with Anthracycline or Taxane also significantly increased the risk of CHF development in older patients treated with Trastuzumab.
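The Kaplan-Meier survival estimates mentioned above come from the product-limit formula S(t) = Π (1 - d_i/n_i) over the observed event times. A minimal pure-Python sketch on toy follow-up data (illustrative only, not SEER-Medicare data; it assumes distinct event times, unlike real claims data):

```python
def kaplan_meier(times, events):
    """Product-limit estimator: S(t) = prod over event times (1 - d_i / n_i).
    times: follow-up duration; events: 1 = CHF diagnosed, 0 = censored.
    Returns [(event_time, survival_estimate), ...]; assumes distinct times."""
    s, curve = 1.0, []
    at_risk = len(times)
    for t, e in sorted(zip(times, events)):
        if e:
            s *= 1 - 1 / at_risk   # one event at this time among those at risk
            curve.append((t, s))
        at_risk -= 1               # event or censoring removes one from risk set
    return curve

# Toy follow-up (years) for 5 hypothetical patients
times  = [1.0, 2.0, 3.0, 4.0, 5.0]
events = [1,   0,   1,   1,   0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

Note how the censored patients (events 0) still shrink the risk set without dropping the survival curve, which is exactly why KM estimates differ from naive event percentages like the 35% vs 31% crude rates above.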