61 results for SURVIVAL ANALYSIS
Abstract:
Do siblings of centenarians tend to have longer life spans? To answer this question, the life spans of 184 siblings of 42 centenarians were evaluated. Two important questions were addressed in analyzing the sibling data. First, a standard needed to be established against which the life spans of the 184 siblings could be compared. In this report, an external reference population is constructed from the U.S. life tables. Its estimated mortality rates are treated as baseline hazards from which the relative mortality of the siblings is estimated. Second, standard survival models that assume independent observations are invalid when within-family correlation exists, as they underestimate the true variance. Three approaches that accommodate such correlation are illustrated. First, the cumulative relative excess mortality between the siblings and their comparison group is calculated and used as an effective graphical tool, along with the Product-Limit estimator of the survival function. The variance estimator of the cumulative relative excess mortality is adjusted for potential within-family correlation using a Taylor linearization approach. Second, approaches that adjust for the inflated variance are examined: an adjusted one-sample log-rank test using the design effect originally proposed by Rao and Scott in the correlated binomial or Poisson setting, and a robust variance estimator derived from the log-likelihood function of a multiplicative model. Neither of these two approaches provides an estimate of the within-family correlation, but the comparison with the standard remains valid under dependence. Last, using the frailty model concept, the multiplicative model with known baseline hazards is extended by adding a random frailty term based on the positive stable or the gamma distribution. Comparisons between the two frailty distributions are performed by simulation.
Based on the results from these various approaches, it is concluded that the siblings of centenarians had significantly lower mortality rates than their cohorts. The frailty models also indicate significant correlation between the life spans of siblings.
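In its simplest independence form, the multiplicative-model comparison described above reduces to a standardized-mortality-ratio calculation: observed sibling deaths divided by the deaths expected under the life-table baseline hazards. The sketch below illustrates that estimator with hypothetical person-years, hazards, and death counts (not the study's data); the naive variance shown is the one the report must inflate to account for within-family correlation.

```python
import math

def relative_mortality(person_years, baseline_hazards, observed_deaths):
    """MLE of the relative mortality theta in the multiplicative model
    h(t) = theta * h0(t): observed deaths divided by the deaths expected
    under the life-table baseline hazards."""
    expected = sum(py * h0 for py, h0 in zip(person_years, baseline_hazards))
    theta = observed_deaths / expected
    # Naive variance assuming independent observations: Var(log theta) ~ 1/D.
    # With within-family correlation this understates the true variance.
    se_log = math.sqrt(1.0 / observed_deaths)
    ci = (theta * math.exp(-1.96 * se_log), theta * math.exp(1.96 * se_log))
    return theta, ci

# Hypothetical age bands: sibling person-years and life-table hazards.
py = [1000.0, 800.0, 500.0]
h0 = [0.01, 0.02, 0.05]
theta, ci = relative_mortality(py, h0, observed_deaths=30)
```

A theta below 1 corresponds to the siblings experiencing fewer deaths than the life-table standard predicts.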
Abstract:
Objective. The goal of this study is to characterize the current workforce of Certified Industrial Hygienists (CIHs) and the lengths of the professional practice careers of past and current CIHs. Methods. This is a secondary analysis of data compiled from nearly 50 annual roster listings of the American Board of Industrial Hygiene (ABIH) for CIHs active in each year since 1960. Survival analysis was used to measure the primary outcome of interest, with the Kaplan-Meier method used to estimate the survival function. Study subjects. The population studied is all Certified Industrial Hygienists. A CIH is defined by the ABIH as an individual who has achieved the minimum requirements for education and working experience and, through examination, has demonstrated a minimum level of knowledge and competency in the prevention of occupational illnesses. Results. A Cox proportional hazards model analysis was performed across different start-time cohorts of CIHs, with cohort 1 as the reference. The estimated relative risks of the event (defined as retirement, or absence from the listing for 5 consecutive years) for cohorts 2, 3, 4, and 5 relative to cohort 1 were 0.385, 0.214, 0.234, and 0.299, respectively. The results show that cohort 2 (CIHs certified from 1970-1980) had the lowest hazard ratio, indicating the lowest retirement rate. Conclusion. The workforce of CIHs (still actively practicing up to the end of 2009) increased tremendously starting in 1980 and reached a plateau in recent decades. This indicates that the supply and demand of the profession may have reached equilibrium. More demographic information and variables are needed to actually predict the future number of CIHs needed.
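The Kaplan-Meier (Product-Limit) estimator used here has a simple recursive form: at each distinct event time, the survival estimate is multiplied by one minus the fraction of at-risk subjects who experience the event. A minimal sketch with hypothetical career durations in years (event 1 = left the listing, 0 = still listed, i.e., censored):

```python
def kaplan_meier(times, events):
    """Product-Limit estimate of S(t): at each distinct event time t,
    multiply by (1 - d/n), where d = events at t and n = number at risk."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk = len(times)
    s = 1.0
    curve = []          # (time, S(t)) at each event time
    i = 0
    while i < len(order):
        t = times[order[i]]
        d = 0            # events at this time
        removed = 0      # subjects leaving the risk set (event or censored)
        while i < len(order) and times[order[i]] == t:
            d += events[order[i]]
            removed += 1
            i += 1
        if d > 0:
            s *= 1.0 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= removed
    return curve

# Hypothetical career durations; 0 marks a CIH still active (censored).
times = [5, 8, 8, 12, 15, 20]
events = [1, 1, 0, 1, 0, 1]
km = kaplan_meier(times, events)
```

Censored observations contribute to the risk sets before their censoring time but never trigger a drop in the curve.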
Abstract:
Background. Breast cancer is the most frequently diagnosed cancer and the leading cause of cancer death among females, accounting for 23% (1.38 million) of the total new cancer cases and 14% (458,400) of the total cancer deaths in 2008. [1] Triple-negative breast cancer (TNBC) is an aggressive phenotype comprising 10–20% of all breast cancers (BCs). [2-4] TNBCs show absence of estrogen, progesterone, and HER2/neu receptors on the tumor cells and, because of the absence of these receptors, are not candidates for targeted therapies. Circulating tumor cells (CTCs), detectable by immunological and molecular analysis, are observed in the blood of many breast cancer patients even at early stages (Stage I & II) of the disease. These cells may explain relapses in early-stage breast cancer patients even after adequate local control. CTC detection may be useful in identifying patients at risk for disease progression, and therapies targeting CTCs may improve outcomes in patients harboring them. Methods. In this study we evaluated 80 patients with TNBC enrolled in a larger prospective study conducted at MD Anderson Cancer Center to determine whether the presence of circulating tumor cells is a significant prognostic factor for relapse-free and overall survival. Patients with metastatic disease at the time of presentation were excluded. CTCs were assessed using the CellSearch System™ (Veridex, Raritan, NJ) and were defined as nucleated cells lacking CD45 but expressing cytokeratins 8, 18, or 19. The distribution of patient and tumor characteristics was analyzed using the chi-square test and Fisher's exact test. The log-rank test and Cox regression analysis were applied to establish the association of circulating tumor cells with relapse-free and overall survival. Results. The median age of the study participants was 53 years.
The median duration of follow-up was 40 months. Eighty-eight percent (88%) of patients were newly diagnosed (without a previous history of breast cancer), and 60% were chemo-naïve (had not received chemotherapy at the time of their blood draw for CTC analysis). Tumor characteristics such as stage (P=0.40), tumor size (P=0.69), sentinel nodal involvement (P=0.87), axillary lymph node involvement (P=0.13), adjuvant therapy (P=0.83), and high histological grade (P=0.26) did not predict the presence of CTCs. However, CTCs predicted worse relapse-free survival (log-rank P=0.04 at 1 or more CTCs, P=0.02 at 2 or more, and P<0.0001 at 3 or more) and overall survival (log-rank P=0.08 at 1 or more CTCs, P=0.01 at 2 or more, and P=0.0001 at 3 or more). Conclusions. The number of circulating tumor cells predicted worse relapse-free survival and overall survival in TNBC patients.
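The log-rank test used above compares, at each event time, the observed number of events in one group with the number expected if both groups shared a common hazard. A self-contained sketch of the two-sample version, using a normal approximation for the two-sided p-value (illustrative data only, not the study's):

```python
import math

def logrank_test(times1, events1, times2, events2):
    """Two-sample log-rank test: accumulate observed-minus-expected events
    in group 1 over all event times, standardize, and report a two-sided
    normal-approximation p-value."""
    data = [(t, e, 0) for t, e in zip(times1, events1)] + \
           [(t, e, 1) for t, e in zip(times2, events2)]
    event_times = sorted({t for t, e, _ in data if e == 1})
    o_minus_e, var = 0.0, 0.0
    for t in event_times:
        n = sum(1 for ti, _, _ in data if ti >= t)              # at risk, total
        n1 = sum(1 for ti, _, g in data if ti >= t and g == 0)  # at risk, group 1
        d = sum(1 for ti, e, _ in data if ti == t and e == 1)   # events at t
        d1 = sum(1 for ti, e, g in data if ti == t and e == 1 and g == 0)
        o_minus_e += d1 - d * n1 / n
        if n > 1:  # hypergeometric variance of d1 given the margins
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    z = o_minus_e / math.sqrt(var)
    p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided p-value
    return z, p

# Hypothetical months to relapse: CTC-positive group fails earlier.
z, p = logrank_test([1, 2, 3, 4], [1, 1, 1, 1],
                    [5, 6, 7, 8], [1, 1, 1, 1])
```

A positive z indicates more events than expected in the first group under the null of equal hazards.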
Abstract:
Brain tumor is one of the most aggressive types of cancer in humans, with an estimated median survival time of 12 months and only 4% of patients surviving more than 5 years after diagnosis. Until recently, brain tumor prognosis was based only on clinical information such as tumor grade and patient age, but there are reports indicating that molecular profiling of gliomas can reveal subgroups of patients with distinct survival rates. We hypothesize that coupling molecular profiling of brain tumors with clinical information might improve predictions of patient survival time and, consequently, better guide future treatment decisions. To evaluate this hypothesis, the general goal of this research is to build models for survival prediction of glioma patients using DNA molecular profiles (U133 Affymetrix gene expression microarrays) along with clinical information. First, a predictive Random Forest model is built for binary outcomes (i.e., short- vs. long-term survival) and a small subset of genes whose expression values can be used to predict survival time is selected. Next, a new statistical methodology is developed for predicting time-to-death outcomes using Bayesian ensemble trees. Because of the large heterogeneity observed within the prognostic classes obtained by the Random Forest model, prediction can be improved by relating time-to-death to the gene expression profile directly. We propose a Bayesian ensemble model for survival prediction that is appropriate for high-dimensional data such as gene expression data. Our approach is based on the ensemble "sum-of-trees" model, which is flexible enough to incorporate additive and interaction effects between genes. We specify a fully Bayesian hierarchical approach and illustrate our methodology for the CPH, Weibull, and AFT survival models. We overcome the lack of conjugacy with a latent variable formulation for the covariate effects, which decreases computation time for model fitting.
Our proposed models also provide a model-free way to select important predictive prognostic markers based on controlling false discovery rates. We compare the performance of our methods with baseline reference survival methods and apply our methodology to an unpublished data set of brain tumor survival times and gene expression data, selecting genes potentially related to the development of the disease under study. A closing discussion compares the results obtained by the Random Forest and Bayesian ensemble methods from biological/clinical perspectives and highlights the statistical advantages and disadvantages of the new methodology in the context of DNA microarray data analysis.
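Of the survival models mentioned, the Weibull has a particularly transparent accelerated-failure-time (AFT) reading: covariates rescale time itself, multiplying the scale parameter by exp(β·x). The sketch below uses a single hypothetical gene-expression covariate and made-up coefficients, purely to illustrate the functional form, not any fitted model from the study:

```python
import math

def weibull_survival(t, scale, shape):
    """Weibull survival function S(t) = exp(-(t/scale)^shape)."""
    return math.exp(-((t / scale) ** shape))

def aft_scale(x, beta, base_scale):
    """AFT parameterization: covariates act multiplicatively on time,
    scale_i = base_scale * exp(sum(b * xi))."""
    return base_scale * math.exp(sum(b * xi for b, xi in zip(beta, x)))

# Hypothetical: one gene-expression covariate (value 1.2) with AFT
# coefficient 0.5, stretching the baseline 12-month scale.
scale = aft_scale([1.2], [0.5], base_scale=12.0)   # months
s24 = weibull_survival(24.0, scale, shape=1.5)     # P(survive past 24 months)
median = scale * math.log(2) ** (1.0 / 1.5)        # solves S(t) = 0.5
```

A positive AFT coefficient lengthens predicted survival times; this is the sense in which genes enter the model multiplicatively on the time axis.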
Abstract:
INTRODUCTION: Actual 5-year survival rates of 10-18% have been reported for patients with resected pancreatic adenocarcinoma (PC), but the use of multimodality therapy was uncommon in these series. We evaluated long-term survival and patterns of recurrence in patients treated for PC with contemporary staging and multimodality therapy. METHODS: We analyzed 329 consecutive patients with PC evaluated between 1990 and 2002 who underwent resection. Each received a multidisciplinary evaluation and a standard operative approach. Pre- or postoperative chemotherapy and/or chemoradiation were routine. Surgical specimens of 5-year survivors were re-reviewed. A multivariate model of factors associated with long-term survival was constructed. RESULTS: Patients underwent pancreaticoduodenectomy (n = 302; 92%), distal pancreatectomy (n = 20; 6%), or total pancreatectomy (n = 7; 2%). A total of 108 patients (33%) underwent vascular reconstruction, 301 (91%) received neoadjuvant or adjuvant therapy, 157 specimens (48%) were node positive, and margins were microscopically positive in 52 patients (16%). Median overall survival and disease-specific survival were 23.9 and 26.5 months, respectively. Eighty-eight patients (27%) survived a minimum of 5 years and had a median overall survival of 11 years. Of these, 21 (24%) experienced recurrence, 7 (8%) after 5 years. Late recurrences occurred most frequently in the lungs, the latest at 6.7 years. Multivariate analysis identified disease-negative lymph nodes (P = 0.02) and no prior attempt at resection (P = 0.01) as factors associated with 5-year survival. CONCLUSIONS: Our 27% actual 5-year survival rate for patients with resected PC is superior to that previously reported, and it is influenced by our emphasis on detailed staging and patient selection, a standardized operative approach, and routine use of multimodality therapy.
Abstract:
BACKGROUND: The incidence of hepatitis C virus (HCV) and hepatocellular carcinoma (HCC) is increasing. The purpose of this study is to establish baseline survival in a medically-underserved population and to evaluate the effect of HCV seropositivity in our patient population. MATERIALS AND METHODS: We reviewed clinicopathologic parameters from a prospective tumor registry and medical records from the Harris County Hospital District (HCHD). Outcomes were compared using Kaplan-Meier survival analysis and log-rank tests. RESULTS: A total of 298 HCC patients were identified. The median survival for the entire cohort was 3.4 mo. There was no difference in survival between the HCV-seropositive and HCV-seronegative groups (3.6 mo versus 2.6 mo, P = 0.7). Patients with a survival <1 mo had significantly higher α-fetoprotein (AFP), international normalized ratio (INR), model for end-stage liver disease (MELD) score, and total bilirubin, and lower albumin, compared with patients with a survival ≥1 mo. CONCLUSIONS: Survival for HCC patients in the HCHD is extremely poor compared with an anticipated median survival of 7 mo reported in other studies. HCV-seropositive patients have no survival advantage over HCV-seronegative patients. Poorer liver function at diagnosis appears to be related to shorter survival. Further analysis of variables contributing to decreased survival is needed.
Abstract:
It is estimated that 50% of all lung cancer patients continue to smoke after diagnosis. Many lung cancer patients who are current smokers experience tremendous guilt and responsibility for their disease, and feel it might be too late for them to quit smoking. In addition, many oncologists may be heard to say that it is 'too late', 'it doesn't matter', 'it is too difficult', or 'it is too stressful' for their patients to stop smoking, or they never identify the smoking status of the patient. Many oncologists feel unprepared to address smoking cessation as part of their clinical practice. In reality, physicians can have tremendous effects on motivating patients, particularly when patients are initially diagnosed with cancer. More information is needed to convince patients to quit smoking and to encourage clinicians to assist patients with smoking cessation. In the current study, smoking status at the time of lung cancer diagnosis was assessed to examine its impact on complications and survival, after exploring the reliability of self-reported smoking data. Logistic regression was used to determine the risks of smoking prior to lung resection. In addition, survival analysis was performed to examine the impact of smoking on survival. The reliability of patients' self-reported smoking status was high, but there was some discordance between current smokers and recent quitters. We also found that cigarette pack-year history and duration of smoking cessation were directly related to the rate of pulmonary complications. With regard to survival, we found that current smoking at the time of lung cancer diagnosis was an independent predictor of survival in early-stage lung cancer. This evidence supports the idea that it is "never too late" for patients to quit smoking, and health care providers should incorporate smoking status regularly into their clinical practice.
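A fitted logistic model of the kind described yields a predicted complication probability via the inverse-logit of a linear score in the smoking covariates. The sketch below uses entirely hypothetical coefficients for pack-years and months since quitting, purely to illustrate the functional form; it is not the study's fitted model.

```python
import math

def complication_risk(pack_years, months_quit, coefs=(-3.0, 0.03, -0.05)):
    """Predicted probability from a hypothetical fitted logistic model:
    logit(p) = b0 + b1*pack_years + b2*months_since_quitting."""
    b0, b1, b2 = coefs
    eta = b0 + b1 * pack_years + b2 * months_quit
    return 1.0 / (1.0 + math.exp(-eta))

# Heavy current smoker vs. lighter smoker who quit 2 years before surgery.
high = complication_risk(60, 0)
low = complication_risk(20, 24)
```

Under these illustrative coefficients, more pack-years raise the predicted pulmonary complication risk and longer cessation lowers it, mirroring the direction of the reported findings.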
Abstract:
Alzheimer's disease (AD), the most common form of dementia, is the fifth leading cause of death among U.S. adults aged 65 or older. Most AD patients have a shorter life expectancy than older people without dementia. The disease has become an enormous challenge for the aging society and is also a global problem: not only must the families of patients with Alzheimer's disease confront it, but so must the healthcare system and society as a whole. In dementia, functional impairment is associated with basic activities of daily living (ADL) and instrumental activities of daily living (IADL). For patients with Alzheimer's disease, problems typically appear first in performing IADL and progress to an inability to manage the less complex ADL functions of personal care. Thus, assessment of ADLs can be used for early, accurate diagnosis of Alzheimer's disease. Estimating the survival of patients with Alzheimer's disease would be useful for patients, caregivers, clinicians, and policy planners. However, it is unclear whether, when predicting patient outcomes from patient histories, time-dependent covariates provide important information on how changes in a patient's status affect survival. In this study, we examined the effect of impaired basic ADL as measured by the Physical Self-Maintenance Scale (PSMS) and utilized a multistate survival analysis approach to estimate the probability of death in the first few years after the initial visit for AD patients, taking into consideration the possibility of impaired basic ADL. The dataset used in this study was obtained from the Baylor Alzheimer's Disease and Memory Disorders Center (ADMDC). Absence of impaired basic ADL and older age at onset of impaired basic ADL were associated with longer survival. These findings suggest that the occurrence of impaired basic ADL and age at onset of impairment could be predictors of survival among patients with Alzheimer's disease.
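A multistate analysis of this kind can be caricatured as a discrete-time illness-death model with three states: AD with intact basic ADL, AD with impaired basic ADL, and death. With hypothetical yearly transition probabilities (not estimated from the ADMDC data), the probability of death within a given horizon follows by iterating the transitions:

```python
def prob_dead_by(years, p01=0.15, p02=0.05, p12=0.20):
    """Discrete-time illness-death model with hypothetical yearly transition
    probabilities. States: 0 = AD with intact basic ADL, 1 = AD with
    impaired basic ADL, 2 = dead (absorbing). Returns P(dead by `years`)
    starting from state 0."""
    state = [1.0, 0.0, 0.0]  # probability mass over states 0, 1, 2
    for _ in range(years):
        s0, s1, s2 = state
        state = [s0 * (1 - p01 - p02),            # remain intact
                 s0 * p01 + s1 * (1 - p12),       # become/stay impaired
                 s0 * p02 + s1 * p12 + s2]        # die from either state
    return state[2]
```

Because the hypothetical hazard of death from the impaired state (p12) exceeds that from the intact state (p02), delaying or avoiding ADL impairment lengthens predicted survival, which is the qualitative pattern the study reports.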
Abstract:
Stomach cancer is the fourth most common cancer in the world, and ranked 16th in the US in 2008. The age-adjusted rates among Hispanics were 2.8 times those of non-Hispanic Whites in 1998-2002. In spite of that, previous research has found that Hispanics with non-cardia adenocarcinoma of the stomach have slightly better survival than non-Hispanic Whites. However, that research did not include a comparison with African-Americans, and it was limited to data released for the years 1973-2000 in the nine original Surveillance, Epidemiology, and End Results (SEER) cancer registries. The finding was interpreted as related to the Hispanic Paradox, the phenomenon that Hispanics in the USA tend, paradoxically, to have substantially better health than other ethnic groups in spite of what their aggregate socio-economic indicators would predict. We extended this research to the SEER 17 Registry, 1973-2005, with varying years of diagnosis per registry, and compared survival from non-cardia adenocarcinoma of the stomach across ethnic groups (Hispanics, non-Hispanic Whites, and African-Americans) while controlling for age, gender, marital status, stage of disease, and treatment using Cox regression survival analysis. We found that Hispanic ethnicity by itself did not confer a survival advantage for non-cardia adenocarcinoma of the stomach; rather, being born abroad was independently associated with the apparent 'Hispanic Paradox' previously reported, and this advantage was seen among foreign-born persons across all race/ethnic groups.
Abstract:
Purpose. A descriptive analysis of glioma patients by race was carried out to better elucidate potential differences between races in demographics, treatment, tumor characteristics, prognosis, and survival. Patients and Methods. The study included 1,967 patients aged ≥18 years diagnosed with glioma and seen between July 2000 and September 2006 at The University of Texas M.D. Anderson Cancer Center (UTMDACC). Data were collated from the UTMDACC Patient History Database (PHDB) and the UTMDACC Tumor Registry Database (TRDB). Chi-square analysis, uni-/multivariate Cox proportional hazards modeling, and survival analysis were used to analyze differences by race. Results. Demographic, treatment, and histologic differences exist between races. Although risk differences were seen between races, race was not a significant predictor in multivariate regression analysis after accounting for age, surgery, chemotherapy, radiation, and tumor type as stratified by WHO tumor grade. Age was the most consistent predictor of risk for death. Overall survival by race differed significantly (p=0.0049) only in low-grade gliomas after adjustment for age, although the survival differences were very slight. Conclusion. Among this cohort of glioma patients, age was the strongest predictor of survival. It is likely that survival is influenced more by age, time to treatment, tumor grade, and surgical expertise than by racial differences. However, age at diagnosis, gender ratios, histology, and history of cancer differed significantly between races, and genetic differences to this effect cannot be excluded.
Abstract:
Background. Colorectal cancer (CRC) is the third most commonly diagnosed cancer (excluding skin cancer) in both men and women in the United States, with an estimated 148,810 new cases and 49,960 deaths in 2008 (1). Racial/ethnic disparities have been reported across the CRC care continuum. Studies have documented racial/ethnic disparities in CRC screening (2-9), but only a few studies have looked at these differences in CRC screening over time (9-11). No studies have compared these trends between a population with CRC and one without cancer. Additionally, although there is evidence suggesting that hospital factors (e.g., teaching hospital status and NCI designation) are associated with CRC survival (12-16), no studies have sought to explain the racial/ethnic differences in survival by examining differences in socio-demographics, tumor characteristics, screening, co-morbidities, treatment, and hospital characteristics. Objectives and Methods. The overall goals of this dissertation were to describe the patterns and trends of racial/ethnic disparities in CRC screening (i.e., fecal occult blood test (FOBT), sigmoidoscopy (SIG), and colonoscopy (COL)) and to determine whether racial/ethnic disparities in CRC survival are explained by differences in socio-demographics, tumor characteristics, screening, co-morbidities, treatment, and hospital factors. These goals were accomplished in a two-paper format. In Paper 1, "Racial/Ethnic Disparities and Trends in Colorectal Cancer Screening in Medicare Beneficiaries with Colorectal Cancer and without Cancer in SEER Areas, 1992-2002", the study population consisted of 50,186 Medicare beneficiaries diagnosed with CRC from 1992 to 2002 and 62,917 Medicare beneficiaries without cancer during the same period. Both cohorts were aged 67 to 89 years and resided in 16 Surveillance, Epidemiology and End Results (SEER) regions of the United States.
Screening procedures between 6 months and 3 years prior to the date of diagnosis for CRC patients, and prior to the index date for persons without cancer, were identified in Medicare claims. The crude and age-gender-adjusted percentages and odds ratios of receiving FOBT, SIG, or COL were calculated. Multivariable logistic regression was used to assess the effect of race/ethnicity on the odds of receiving CRC screening over time. Paper 2, "Racial/Ethnic Disparities in Colorectal Cancer Survival: To what extent are racial/ethnic disparities in survival explained by racial differences in socio-demographics, screening, co-morbidities, treatment, tumor or hospital characteristics?", included a cohort of 50,186 Medicare beneficiaries diagnosed with CRC from 1992 to 2002 and residing in 16 SEER regions of the United States, identified in the SEER-Medicare linked database. Survival was estimated using the Kaplan-Meier method. Cox proportional hazards modeling was used to estimate hazard ratios (HR) of mortality and 95% confidence intervals (95% CI). Results. The screening analysis demonstrated racial/ethnic disparities in screening over time among the cohort without cancer. From 1992 to 1995, Blacks and Hispanics were less likely than Whites to receive FOBT (OR=0.75, 95% CI: 0.65-0.87; OR=0.50, 95% CI: 0.34-0.72, respectively), but their odds of screening increased from 2000 to 2002 (OR=0.79, 95% CI: 0.72-0.85; OR=0.67, 95% CI: 0.54-0.75, respectively).
Blacks and Hispanics were less likely than Whites to receive SIG from 1992 to 1995 (OR=0.75, 95% CI: 0.57-0.98; OR=0.29, 95% CI: 0.12-0.71, respectively), but their odds of screening increased from 2000 to 2002 (OR=0.79, 95% CI: 0.68-0.93; OR=0.50, 95% CI: 0.35-0.72, respectively). The survival analysis showed that Blacks had worse CRC-specific survival than Whites (HR: 1.33, 95% CI: 1.23-1.44), but this was reduced for stage I-III disease after full adjustment for socio-demographic, tumor, screening, co-morbidity, treatment, and hospital characteristics (aHR=1.24, 95% CI: 1.14-1.35). Socioeconomic status, tumor characteristics, treatment, and co-morbidities contributed to the reduction in hazard ratios between Blacks and Whites with stage I-III disease. Asians had better survival than Whites before (HR: 0.73, 95% CI: 0.64-0.82) and after (aHR: 0.80, 95% CI: 0.70-0.92) adjusting for all predictors for stage I-III disease. For stage IV, both Asians and Hispanics had better survival than Whites, and after full adjustment, survival improved (aHR=0.73, 95% CI: 0.63-0.84; aHR=0.74, 95% CI: 0.61-0.92, respectively). Conclusion. Screening disparities remain between Blacks and Whites, and between Hispanics and Whites, but have decreased in recent years. Future studies should explore other factors that may contribute to screening disparities, such as physician recommendations and language/cultural barriers, in this and younger populations. There were substantial racial/ethnic differences in CRC survival among older Whites, Blacks, Asians, and Hispanics. Co-morbidities, SES, tumor characteristics, treatment, and other predictor variables contributed to, but did not fully explain, the CRC survival differences between Blacks and Whites. Future research should examine the role of quality of care, particularly the benefit of treatment and post-treatment surveillance, in racial disparities in survival.
Abstract:
Head and neck squamous cell carcinoma (HNSCC) is the sixth most common malignancy in the world, with high rates of second primary malignancy (SPM) development and moderately low survival rates. The disease has become an enormous challenge in cancer research and treatment. For HNSCC patients, a highly significant cause of post-treatment mortality and morbidity is the development of SPM. Hence, predicting the risk of developing SPM would be very helpful for patients, clinicians, and policy makers in estimating the survival of patients with HNSCC. In this study, we built a prognostic model to predict the risk of developing SPM in patients with newly diagnosed HNSCC. The dataset used in this research was obtained from The University of Texas MD Anderson Cancer Center. For the first aim, we used stepwise logistic regression to identify prognostic factors for the development of SPM. Our final model contained cancer site and overall cancer stage as risk factors for SPM. The Hosmer-Lemeshow test (p = 0.15) showed that the final prognostic model fit the data well. The area under the ROC curve was 0.72, suggesting that the discrimination ability of our model was acceptable. Internal validation confirmed that the prognostic model fit well and would not over-optimistically predict the risk of SPM. The model needs external validation on a larger sample before it can be generalized to predict SPM risk for other HNSCC patients. For the second aim, we utilized a multistate survival analysis approach to estimate the probability of death for HNSCC patients, taking into consideration the possibility of SPM. Patients without SPM had longer survival. These findings suggest that the development of SPM could be a predictor of survival among patients with HNSCC.
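The area under the ROC curve reported above (0.72) has a direct probabilistic reading: it is the chance that a randomly chosen patient who develops an SPM receives a higher predicted risk than one who does not, with ties counted as one half. A minimal sketch of this Mann-Whitney form of the AUC, on made-up risk scores:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen case with the outcome
    scores higher than one without it (ties count 1/2): the Mann-Whitney
    formulation, equivalent to integrating the ROC curve."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted SPM risks for patients who did / did not
# develop a second primary malignancy.
auc = roc_auc([0.8, 0.6, 0.45], [0.5, 0.4, 0.3])
```

An AUC of 0.5 corresponds to a coin-flip model and 1.0 to perfect ranking, which is why 0.72 is described as acceptable discrimination.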
Abstract:
Individuals with Lynch syndrome are predisposed to cancer due to an inherited DNA mismatch repair gene mutation. However, significant variability is observed in disease expression, likely due to the influence of other environmental, lifestyle, or genetic factors. Polymorphisms in genes encoding xenobiotic-metabolizing enzymes may modify cancer risk by influencing the metabolism and clearance of potential carcinogens from the body. In this retrospective analysis, we examined key candidate gene polymorphisms in CYP1A1, EPHX1, GSTT1, GSTM1, and GSTP1 as modifiers of age at onset of colorectal cancer among 257 individuals with Lynch syndrome. We found that subjects heterozygous for CYP1A1 I462V (c.1384A>G) developed colorectal cancer 4 years earlier than those with the homozygous wild-type genotype (median ages, 39 and 43 years, respectively; log-rank test P = 0.018). Furthermore, heterozygosity for the CYP1A1 polymorphisms I462V and Msp1 (g.6235T>C) was associated with an increased risk of developing colorectal cancer [adjusted hazard ratio for AG relative to AA, 1.78; 95% confidence interval, 1.16-2.74; P = 0.008; hazard ratio for TC relative to TT, 1.53; 95% confidence interval, 1.06-2.22; P = 0.02]. Because homozygous variants for both CYP1A1 polymorphisms were rare, risk estimates were imprecise. None of the other gene polymorphisms examined were associated with an earlier onset age for colorectal cancer. Our results suggest that the I462V and Msp1 polymorphisms in CYP1A1 may be additional susceptibility factors for disease expression in Lynch syndrome, modifying the age of colorectal cancer onset by up to 4 years.
Abstract:
Enterococcus faecium is a multidrug-resistant opportunist causing difficult-to-treat nosocomial infections, including endocarditis, but there are no reports experimentally demonstrating E. faecium virulence determinants. Our previous studies showed that some clinical E. faecium isolates produce a cell wall-anchored collagen adhesin, Acm, and that an isogenic acm deletion mutant of the endocarditis-derived strain TX0082 lost collagen adherence. In this study, we show with a rat endocarditis model that TX0082 Deltaacm::cat is highly attenuated versus wild-type TX0082, both in established (72 h) vegetations (P < 0.0001) and for valve colonization 1 and 3 hours after infection. Markedly reduced acm mRNA levels (≥50-fold reduction relative to an Acm producer) were found in three of five nonadherent isolates, including the sequenced strain TX0016, by quantitative reverse transcription-PCR, indicating that acm transcription is downregulated in vitro in these isolates. However, examination of TX0016 cells obtained directly from infected rat vegetations by flow cytometry showed that Acm was present on 40% of cells grown during infection. Finally, we demonstrated a significant reduction in E. faecium collagen adherence by affinity-purified anti-Acm antibodies from the sera of E. faecium endocarditis patients, suggesting that Acm may be a potential immunotarget for strategies to control this emerging pathogen.