Abstract:
AIM: To compare the 10-year peri-implant bone loss (BL) rate in periodontally compromised (PCP) and periodontally healthy patients (PHP) around two different implant systems supporting single-unit crowns. MATERIALS AND METHODS: In this retrospective, controlled study, the mean BL (mBL) rate around dental implants placed in four groups of 20 non-smokers was evaluated after a follow-up of 10 years. Two groups of patients treated for periodontitis (PCP) and two groups of PHP were created. For each category (PCP and PHP), two different types of implant had been selected. The mBL was calculated by subtracting the radiographic bone levels at the time of crown cementation from the bone levels at the 10-year follow-up. RESULTS: The mean age, mean full-mouth plaque and full-mouth bleeding scores and implant location were similar between the four groups. Implant survival rates ranged between 85% and 95%, without statistically significant differences (P>0.05) between groups. For both implant systems, PCP showed statistically significantly higher mBL rates and number of sites with BL ≥3 mm compared with PHP (P<0.0001). CONCLUSIONS: After 10 years, implants in PCP yielded lower survival rates and higher mean marginal BL rates compared with those of implants placed in PHP. These results were independent of the implant system used or the healing modality applied.
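The mBL figure above is a simple difference of radiographic measurements averaged over implants. A minimal sketch of that calculation, using invented bone-level values rather than the study's data, could look like this:

```python
# Hypothetical illustration of the mean bone loss (mBL) calculation described
# above: per-implant bone loss is the 10-year radiographic bone level minus the
# level at crown cementation. Values are invented, not study data.
import numpy as np

bl_at_cementation = np.array([0.2, 0.4, 0.3, 0.5])  # mm, baseline radiographs
bl_at_10_years    = np.array([1.5, 3.2, 0.9, 2.1])  # mm, 10-year radiographs

bl_change = bl_at_10_years - bl_at_cementation      # per-implant bone loss (mm)
mbl = bl_change.mean()                              # mean bone loss for the group
sites_ge_3mm = int((bl_change >= 3.0).sum())        # sites with BL >= 3 mm

print(f"mBL = {mbl:.2f} mm; sites with BL >= 3 mm: {sites_ge_3mm}")
```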
Abstract:
PURPOSE: To compare adjuvant dose-intensive epirubicin and cyclophosphamide chemotherapy administered with filgrastim and progenitor cell support (DI-EC) with standard-dose anthracycline-based chemotherapy (SD-CT) for patients with early-stage breast cancer and a high risk of relapse, defined as stage II disease with 10 or more positive axillary nodes, or an estrogen receptor-negative or stage III tumor with five or more positive axillary nodes. PATIENTS AND METHODS: Three hundred forty-four patients were randomized after surgery to receive seven cycles of SD-CT over 22 weeks, or three cycles of DI-EC (epirubicin 200 mg/m2 plus cyclophosphamide 4 g/m2 with filgrastim and progenitor cell support) over 6 weeks. All patients were assigned tamoxifen at the completion of chemotherapy. The primary end point was disease-free survival (DFS). RESULTS: After a median follow-up of 5.8 years (range, 3 to 8.4 years), 188 DFS events had occurred (DI-EC, 86 events; SD-CT, 102 events). The 5-year DFS was 52% for DI-EC and 43% for SD-CT, with a hazard ratio for DI-EC compared with SD-CT of 0.77 (95% CI, 0.58 to 1.02; P = .07). The 5-year overall survival was 70% for DI-EC and 61% for SD-CT, with a hazard ratio of 0.79 (95% CI, 0.56 to 1.11; P = .17). There were eight cases (5%) of anthracycline-induced cardiomyopathy (two fatal) among those who received DI-EC. Women with hormone receptor-positive tumors benefited significantly from DI-EC. CONCLUSION: There was a trend in favor of DI-EC with respect to disease-free survival. A larger trial or meta-analysis will be required to reveal the true effect of dose-intensive therapy.
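The hazard ratios quoted above come from a Cox proportional hazards analysis of time to a disease-free survival event by treatment arm. The sketch below shows how such a hazard ratio could be estimated with the lifelines library on a toy dataset; the variable names and the six synthetic patients are assumptions, not trial data.

```python
# Hedged sketch: estimating a DI-EC vs SD-CT hazard ratio for disease-free
# survival with a Cox proportional hazards model. The data frame is synthetic.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "dfs_years": [2.1, 5.8, 3.4, 7.0, 1.2, 6.5],  # time to DFS event or censoring
    "event":     [1,   0,   1,   0,   1,   0],    # 1 = DFS event, 0 = censored
    "di_ec":     [1,   1,   0,   0,   1,   0],    # 1 = DI-EC arm, 0 = SD-CT arm
})

cph = CoxPHFitter()
cph.fit(df, duration_col="dfs_years", event_col="event")
cph.print_summary()  # exp(coef) for "di_ec" is the DI-EC vs SD-CT hazard ratio
```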
Abstract:
A multicenter trial was performed to confirm the therapeutic efficacy and toxicity profile of the combination of cladribine, cyclophosphamide and prednisone in low-grade non-Hodgkin's lymphoma (NHL) and chronic lymphocytic leukemia (CLL). Twenty-three adults with previously treated (61%) or untreated (39%) NHL (International Working Formulation A) or Binet stage B and C CLL were administered cladribine 0.1 mg/kg/day as a subcutaneous bolus for 5 days, intravenous cyclophosphamide 500 mg/m2 on day 1, and oral prednisone 40 mg/m2 on days 1-5, every 4 weeks. Unexpected early hematological toxicities led to dose modifications: pretreated patients received cladribine for 3 days only, up to a maximum of five courses. Responses were observed in 75% of patients, with 7 patients obtaining a complete clinical and hematological response. Median duration of complete response was 9 months. Median time to progression or relapse was 31 months. Myelosuppression and infections were dose-limiting, and post-treatment complications, including fatalities, resulted from infections. Median overall survival time from trial entry was 60 months. The activity of the combination of cladribine, cyclophosphamide and prednisone was confirmed. However, in the specific setting of a multicenter trial, unexpected fatal infectious episodes occurred in pretreated patients. Great caution is thus required in these susceptible patients, and the routine use of corticosteroids should probably be abandoned.
Abstract:
Exercise intolerance may be reported by parents of young children with respiratory diseases. There is, however, a lack of standardized exercise protocols that allow verification of these reports, especially in younger children. Consequently, the aim of this pilot study was to develop a standardized treadmill walking test for children aged 4-10 years that demands low sensorimotor skills while achieving high physical exhaustion. In a prospective, experimental, cross-sectional pilot study, 33 healthy Caucasian children were separated into three groups: G1 (4-6 years, n = 10), G2 (7-8 years, n = 12), and G3 (9-10 years, n = 11). Children performed the treadmill walking test with increasing exercise levels up to peak condition with maximal exhaustion. Gas exchange, heart rate, and lactate were measured during the test, and spirometry was performed before and after. Parameters were statistically compared at all exercise levels as well as at the 2 and 4 mmol/L lactate levels for group differences (Kruskal-Wallis H-test, alpha = 0.05; post hoc: Mann-Whitney U-test with Bonferroni correction, alpha = 0.05/n) and for test-retest differences (Wilcoxon rank-sum test) with SPSS. The treadmill walking test proved feasible, with good within-group repeatability for most parameters. All children achieved a high exhaustion level. At peak level under exhaustion conditions, only absolute VO2 and VCO2 differed significantly between age groups. In conclusion, this newly designed treadmill walking test showed good feasibility, safety, and repeatability. It suggests the potential usefulness of exercise capacity monitoring for children aged 4 to 10 years. Various applications and test modifications will be investigated in further studies.
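The group comparison described above (a global Kruskal-Wallis test across the three age groups, followed by pairwise Mann-Whitney U tests at a Bonferroni-corrected alpha) can be reproduced in outline with scipy; the VO2 values below are placeholders, not the study's measurements.

```python
# Sketch of the statistical workflow: global Kruskal-Wallis H-test, then
# pairwise Mann-Whitney U tests with Bonferroni correction (alpha = 0.05/n).
# The values are invented placeholders, not study data.
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

groups = {
    "G1 (4-6 y)":  [0.90, 1.00, 1.10, 0.95],  # e.g. absolute peak VO2 (L/min)
    "G2 (7-8 y)":  [1.30, 1.40, 1.20, 1.35],
    "G3 (9-10 y)": [1.60, 1.70, 1.55, 1.80],
}

h_stat, p_global = kruskal(*groups.values())
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_global:.4f}")

pairs = list(combinations(groups, 2))
alpha_corrected = 0.05 / len(pairs)  # Bonferroni: alpha divided by n comparisons
for a, b in pairs:
    u, p = mannwhitneyu(groups[a], groups[b], alternative="two-sided")
    print(f"{a} vs {b}: U = {u:.1f}, p = {p:.4f}, "
          f"significant at alpha = {alpha_corrected:.4f}: {p < alpha_corrected}")
```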
Abstract:
INTRODUCTION: We report the results of a titanium acetabular reinforcement ring with a hook (ARRH), introduced in 1987 and still used routinely in our center, in primary total hip arthroplasty (THA). The favorable results of this device in arthroplasty for developmental dysplasia and in difficult revisions motivated its use in primary THA. With this implant, only minimal acetabular reaming is necessary, anatomic positioning is achieved by placing the hook around the teardrop, and a homogeneous base for cementing the polyethylene cup is provided. MATERIALS AND METHODS: Between April 1987 and December 1991, 241 THAs with insertion of an ARRH were performed in 178 unselected, consecutive patients (average age 58 years; range 30-84 years), with secondary osteoarthrosis in 41% of the cases. RESULTS: At the time of the latest follow-up, 33 patients (39 hips) had died and 17 cases had been lost to follow-up. The median follow-up was 122 months, with a minimum of 10 years. Eight hips had been revised, leaving 177 hips in 120 living patients without revision. Six cups were revised because of aseptic loosening, and two hips were revised for sepsis. The mean Merle d'Aubigné score for the remaining hips was 16 (range 7-18) at the latest follow-up. For aseptic loosening, the probability of survival of the cup was 0.97 (95% confidence interval, 0.94-0.99). However, analysis of radiographs implied loosening in seven other cups without clinical symptoms. CONCLUSIONS: The results of primary THA using an acetabular reinforcement ring parallel the excellent results often observed with these implants in difficult primary and revision arthroplasty at a minimum of 10 years. Survivorship is comparable to that of modern cementless implants. Medial migration, which occurs with loosening of the acetabular component, seems to be prevented with this implant. Radiographic loosening signs can exist without clinical symptoms.
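The cup survival probability of 0.97 for aseptic loosening is a Kaplan-Meier-type estimate with revision as the endpoint and all other hips censored. A minimal, hedged sketch of that estimator with the lifelines library, using fabricated follow-up data, might look like this:

```python
# Illustrative Kaplan-Meier estimate of cup survival with revision for aseptic
# loosening as the endpoint; unrevised hips are censored. Data are fabricated.
from lifelines import KaplanMeierFitter

followup_months = [122, 130, 98, 140, 110, 125, 90, 135]  # months to revision or censoring
revised         = [0,   0,   1,  0,   0,   0,   1,  0]    # 1 = revised for aseptic loosening

kmf = KaplanMeierFitter()
kmf.fit(followup_months, event_observed=revised)
print(kmf.predict(120))          # estimated survival probability at 10 years
print(kmf.confidence_interval_)  # 95% confidence band for the survival curve
```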
Abstract:
BACKGROUND: The role of adjuvant dose-intensive chemotherapy and its efficacy according to baseline features has not yet been established. PATIENTS AND METHODS: Three hundred and forty-four patients were randomized to receive seven courses of standard-dose chemotherapy (SD-CT) or three cycles of dose-intensive epirubicin and cyclophosphamide (DI-EC; epirubicin 200 mg/m2 plus cyclophosphamide 4 g/m2 with filgrastim and progenitor cell support). All patients were assigned tamoxifen at the completion of chemotherapy. The primary end point was disease-free survival (DFS). This paper updates the results and explores patterns of recurrence according to predictive baseline features. RESULTS: At a median follow-up of 8.3 years, patients assigned DI-EC had a significantly better DFS compared with those assigned SD-CT [8-year DFS 47% and 37%, respectively; hazard ratio (HR) 0.76; 95% confidence interval 0.58-1.00; P = 0.05]. Only patients with estrogen receptor (ER)-positive disease benefited from DI-EC (HR 0.61; 95% confidence interval 0.39-0.95; P = 0.03). CONCLUSIONS: After prolonged follow-up, DI-EC significantly improved DFS, but the effect was observed only in patients with ER-positive disease, leading to the hypothesis that the efficacy of DI-EC may relate to its endocrine effects. Further studies designed to confirm the importance of endocrine responsiveness in patients treated with dose-intensive chemotherapy are encouraged.
Abstract:
BACKGROUND: Of the approximately 2.4 million American women with a history of breast cancer, 43% are aged ≥65 years and are at risk for developing subsequent malignancies. METHODS: The study included 5-year breast cancer survivors (N = 1361) from 6 geographically diverse sites who were diagnosed between 1990 and 1994 at age ≥65 years with stage I or II disease, and a comparison group of women without breast cancer (N = 1361). Women in the comparison group were age-matched and site-matched to breast cancer survivors on the date of breast cancer diagnosis. Follow-up began 5 years after the index date (survivor diagnosis date or comparison enrollment date) and continued until death, disenrollment, or 15 years after the index date. Data were collected from medical records and electronic sources (cancer registry, administrative, clinical, National Death Index). Analyses included descriptive statistics, crude incidence rates, and Cox proportional hazards regression models for estimating the risk of incident malignancy, adjusted for death as a competing risk. RESULTS: Survivors and women in the comparison group were similar: >82% were white, 55% had a Charlson Comorbidity Index of 0, and ≥73% had a body mass index ≤30 kg/m2. Among the 306 women (N = 160 in the survivor group, N = 146 in the comparison group) who developed a first incident malignancy during follow-up, the mean time to malignancy was similar (4.37 ± 2.81 years vs 4.03 ± 2.76 years, respectively; P = .28), whereas unadjusted incidence rates were slightly higher in survivors (1882 vs 1620 per 100,000 person-years). The adjusted hazard of developing a first incident malignancy was slightly elevated in survivors relative to women in the comparison group, but it was not statistically significant (hazard ratio, 1.17; 95% confidence interval, 0.94-1.47). CONCLUSIONS: Older women who survived 5 years after an early-stage breast cancer diagnosis were not at elevated risk for developing subsequent incident malignancies up to 15 years after their breast cancer diagnosis.
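The crude incidence rates quoted above are events per accumulated person-years, scaled to 100,000. A small sketch of that calculation follows; the person-years figure is an assumption chosen to be consistent with the reported survivor rate, not a number taken from the study.

```python
# Crude incidence rate per 100,000 person-years, as reported above.
def incidence_per_100k(events: int, person_years: float) -> float:
    return events / person_years * 100_000

# 160 incident malignancies among survivors; ~8,500 person-years is assumed here
# (consistent with the reported 1882 per 100,000, but not stated in the abstract).
print(round(incidence_per_100k(160, 8_500)))  # -> 1882
```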
Abstract:
BACKGROUND: Conventional factors do not fully explain the distribution of cardiovascular outcomes. Biomarkers are known to participate in well-established pathways associated with cardiovascular disease and may therefore provide further information over and above conventional risk factors. This study sought to determine whether individual and/or combined assessment of 9 biomarkers improved discrimination, calibration, and reclassification of cardiovascular mortality. METHODS: 3267 patients (2283 men), aged 18-95 years, at intermediate to high risk of cardiovascular disease were followed in this prospective cohort study. Conventional risk factors and biomarkers were included based on forward and backward stepwise Cox proportional hazards selection models. RESULTS: During 10 years of follow-up, 546 fatal cardiovascular events occurred. Four biomarkers (interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D) were retained during the stepwise selection procedures for subsequent analyses. Simultaneous inclusion of these biomarkers significantly improved discrimination as measured by the C-index (0.78, P = 0.0001) and the integrated discrimination improvement (0.0219, P<0.0001). Collectively, these biomarkers improved net reclassification for cardiovascular death by 10.6% (P<0.0001) when added to the conventional risk model. CONCLUSIONS: In terms of adverse cardiovascular prognosis, a biomarker panel consisting of interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D offered significant incremental value beyond that conveyed by conventional risk factors alone.
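The 10.6% figure above is a net reclassification improvement (NRI). As a hedged sketch, a categorical NRI can be computed from reclassification counts as below; the counts are hypothetical, since the study's reclassification tables are not reproduced here.

```python
# Categorical net reclassification improvement (NRI): upward movement is
# credited for events and penalized for non-events. Counts are hypothetical.
def net_reclassification_improvement(up_events, down_events, n_events,
                                     up_nonevents, down_nonevents, n_nonevents):
    nri_events = (up_events - down_events) / n_events
    nri_nonevents = (down_nonevents - up_nonevents) / n_nonevents
    return nri_events + nri_nonevents

# e.g. among 546 cardiovascular deaths, 70 move to a higher risk category and 30
# to a lower one; among 2721 survivors, 200 move lower and 160 higher (invented).
print(net_reclassification_improvement(70, 30, 546, 200, 160, 2721))  # ~0.088
```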
Abstract:
Orbital blunt trauma is common, and the diagnosis of a fracture should be made by computed tomographic (CT) scan. However, this exposes patients to ionising radiation. Our objective was to identify clinical predictors of orbital fracture, in particular the presence of a black eye, to minimise unnecessary exposure to radiation. A 10-year retrospective study was made of the medical records of all patients with minor head trauma who presented with one or two black eyes to our emergency department between May 2000 and April 2010. Each patient had a CT scan, was over 16 years old, and had a Glasgow Coma Score (GCS) of 13-15. The primary outcome was whether the black eye was a valuable predictor of a fracture; accompanying clinical signs were considered as a secondary outcome. A total of 1676 patients (mean (SD) age 51 (22) years) with minor head trauma and either one or two black eyes were included. In 1144 patients the CT scan showed a fracture of the maxillofacial skeleton, an incidence of 68.3% among patients in whom a black eye was the obvious symptom. Specificity for facial fractures was particularly high for other clinical signs, such as diminished skin sensation (specificity 96.4%), diplopia or oculomotility disorders (89.3%), fracture steps (99.8%), epistaxis (95.5%), subconjunctival haemorrhage (90.4%), and emphysema (99.6%). Sensitivity for the same signs ranged from 10.8% to 22.2%. The most striking finding was that 68.3% of all patients with a black eye had an underlying fracture. We therefore conclude that a CT scan should be recommended for every patient with minor head injury who presents with a black eye.
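The sensitivity and specificity figures quoted above come from 2x2 tables of each clinical sign against the CT finding. The sketch below shows the calculation for one sign; the counts are back-filled from the reported percentages purely for illustration.

```python
# Sensitivity/specificity from a 2x2 table of a clinical sign vs CT-confirmed
# fracture. Counts are reconstructed from the reported percentages, for
# illustration only.
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    sensitivity = tp / (tp + fn)  # sign present among patients with a fracture
    specificity = tn / (tn + fp)  # sign absent among patients without a fracture
    return sensitivity, specificity

# Diminished skin sensation: ~124 of 1144 fracture patients (sensitivity ~10.8%)
# and ~513 of 532 non-fracture patients without the sign (specificity ~96.4%).
sens, spec = sensitivity_specificity(tp=124, fn=1020, tn=513, fp=19)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```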
Abstract:
BACKGROUND: The objective of the present investigation was to assess the baseline mortality-adjusted 10-year survival of rectal cancer patients. METHODS: Ten-year survival was analyzed in 771 consecutive American Joint Committee on Cancer (AJCC) stage I-IV rectal cancer patients undergoing open resection between 1991 and 2008, using risk-adjusted Cox proportional hazards regression models adjusting for population-based baseline mortality. RESULTS: The median follow-up of patients alive was 8.8 years. The 10-year relative, overall, and cancer-specific survival rates were 66.5% [95% confidence interval (CI) 61.3-72.1], 48.7% (95% CI 44.9-52.8), and 66.4% (95% CI 62.5-70.5), respectively. During the 10-year period, 47.3% of all deaths in the entire patient sample (stage I-IV) and 33.6% of all deaths in patients with stage I-III disease were related to rectal cancer. For patients with AJCC stage I rectal cancer, the 10-year overall survival was 96% and did not differ significantly from that of an average population matched for gender, age, and calendar year (p = 0.151). For the more advanced tumor stages, however, survival was significantly impaired (p < 0.001). CONCLUSIONS: Retrospective investigations of survival after rectal cancer resection should adjust for baseline mortality because a large fraction of deaths is not cancer related. Stage I rectal cancer patients, compared with patients with more advanced disease stages, have a relative survival close to 100% and can thus be considered cured. Using this relative-survival approach, the real public health burden caused by rectal cancer can be reliably analyzed and reported.
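Relative survival, as used above, is the observed overall survival divided by the survival expected in a general population matched for gender, age, and calendar year. A minimal sketch follows, with the expected survival assumed (it is not reported in the abstract).

```python
# Relative survival = observed overall survival / expected survival in a
# matched general population. The expected value below is assumed, chosen only
# to be consistent with the 10-year figures quoted in the abstract.
def relative_survival(observed: float, expected: float) -> float:
    return observed / expected

observed_10y = 0.487   # reported 10-year overall survival
expected_10y = 0.732   # assumed matched-population survival (not reported)
print(f"{relative_survival(observed_10y, expected_10y):.1%}")  # ~66.5%
```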
Abstract:
PURPOSE: Clinical studies of the long-term outcomes of implant-supported reconstructions are still sparse. The aim of this 10-year retrospective study was to assess the rate of mechanical/technical complications and failures of implant-supported fixed dental prostheses (FDPs) and single crowns (SCs) in a large cohort of partially edentulous patients. MATERIALS AND METHODS: The comprehensive multidisciplinary examination consisted of a medical/dental history, clinical examination, and radiographic analysis. The prosthodontic examination evaluated the implant-supported reconstructions for mechanical/technical complications and failures, occlusal analysis, presence/absence of attrition, and location, extension, and retention type. RESULTS: Of 397 fixed reconstructions in 303 patients, 268 were SCs and 127 were FDPs. Of these 397 implant-supported reconstructions, 18 had failed, yielding a failure rate of 4.5% and a survival rate of 95.5% after a mean observation period of 10.75 years (range: 8.4-13.5 years). The most frequent complication was ceramic chipping (20.31%), followed by occlusal screw loosening (2.57%) and loss of retention (2.06%). No occlusal screw fractures, one abutment loosening, and two abutment fractures were noted. This resulted in a total mechanical/technical complication rate of 24.7%. The prosthetic success rate over a mean follow-up time of 10.75 years was 70.8%. Generalized attrition and FDPs (compared with SCs) were associated with statistically significantly higher rates of ceramic fractures. Cantilever extensions, screw retention, anterior versus posterior location, and gender did not influence the chipping rate. CONCLUSIONS: After a mean exposure time of 10.75 years, high survival rates can be expected for reconstructions supported by sand-blasted, large-grit, acid-etched implants. Ceramic chipping was the most frequent complication and was increased in dentitions with attrition and in FDPs compared with SCs.
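The failure and survival rates above follow directly from the counts; a trivial arithmetic check, using only numbers given in the abstract, is shown below.

```python
# Arithmetic check of the reported failure and survival rates.
failed, total = 18, 397
failure_rate = failed / total * 100   # ~4.5%
survival_rate = 100 - failure_rate    # ~95.5%
print(f"failure rate = {failure_rate:.1f}%, survival rate = {survival_rate:.1f}%")
```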
Abstract:
BACKGROUND: Limited data exist on longitudinal crestal bone changes around teeth compared with implants in partially edentulous patients. This study sought to compare the 10-year radiographic crestal bone changes (bone level [BL]) around teeth and implants in periodontally compromised patients (PCPs) and periodontally healthy patients (PHPs). METHODS: A total of 120 patients were evaluated for the radiographic crestal BL around dental implants and adjacent teeth at the time of implant crown insertion and at the 10-year follow-up. Sixty patients had a previous history of periodontitis (PCPs), and the remaining 60 were PHPs. In each category (PCP and PHP), two different implant systems were used. The mean BL change at the implant and at the adjacent tooth at the interproximal area was calculated by subtracting the radiographic crestal BL at the time of crown cementation from the radiographic crestal BL at the 10-year follow-up. RESULTS: At 10 years after therapy, the survival rate for implants ranged from 80% to 95% across subgroups, whereas it was 100% for the adjacent teeth. In all eight patient categories evaluated, teeth demonstrated a significantly more stable radiographic BL compared with adjacent dental implants (teeth BL, 0.44 ± 0.23 mm; implant BL, 2.28 ± 0.72 mm; P <0.05). Radiographic BL changes around teeth seemed not to be influenced by the presence or absence of advanced bone loss (≥3 mm) at the adjacent implants. CONCLUSIONS: Natural teeth yielded better long-term results with respect to survival rate and marginal BL changes compared with dental implants. Moreover, these findings also extend to teeth with an initially reduced periodontal attachment level, provided adequate periodontal treatment and maintenance are performed. As a consequence, the decision to extract a tooth for periodontal reasons in favor of a dental implant should be carefully considered in partially edentulous patients.
Abstract:
OBJECTIVE: To analyze the precision of fit of implant-supported, screw-retained, computer-aided-designed and computer-aided-manufactured (CAD/CAM) zirconium dioxide (ZrO) frameworks. MATERIALS AND METHODS: CAD/CAM ZrO frameworks (NobelProcera) for a screw-retained 10-unit implant-supported reconstruction on six implants (FDI positions 15, 13, 11, 21, 23, 25) were fabricated using a laser scanner (ZrO-L, N = 6) or a mechanical scanner (ZrO-M, N = 5) to digitize the implant platform and the cuspid-supporting framework resin pattern. Laser-scanned CAD/CAM titanium frameworks (TIT-L, N = 6) and cast CoCrW-alloy frameworks (Cast, N = 5), fabricated on the same model and designed similarly to the ZrO frameworks, served as controls. The one-screw test (implant 25 screw-retained) was applied to assess the vertical microgap between implant and framework platform with a scanning electron microscope. The mean microgap was calculated from approximal and buccal values. Statistical comparison was performed with non-parametric tests. RESULTS: No statistically significant pairwise differences were observed in the relative effects of vertical microgap between ZrO-L (median 14 μm; 95% CI 10-26 μm), ZrO-M (18 μm; 12-27 μm), and TIT-L (15 μm; 6-18 μm), whereas the values for Cast (236 μm; 181-301 μm) were significantly higher (P < 0.001) than those of the three CAD/CAM groups. A monotonic trend of increasing values from implant 23 to 15 was observed in all groups (ZrO-L, ZrO-M, and Cast P < 0.001; TIT-L P = 0.044). CONCLUSIONS: Optical and tactile scanners combined with CAD/CAM technology allow the fabrication of highly accurate long-span screw-retained ZrO implant reconstructions. Titanium frameworks showed the most consistent precision. The fit of the cast alloy frameworks was clinically unacceptable.