Abstract:
In this single-center, cross-sectional study, we evaluated 44 very long-term survivors with a median follow-up of 17.5 years (range, 11-26 years) after hematopoietic stem cell transplantation. We assessed the telomere length difference in human leukocyte antigen-identical donor and recipient sibling pairs and searched for its relationship with clinical factors. The telomere length (in kb, mean +/- SD) was significantly shorter in all recipient blood cells compared with their donors' blood cells (P < .01): granulocytes (6.5 +/- 0.9 vs 7.1 +/- 0.9), naive/memory T cells (5.7 +/- 1.2 vs 6.6 +/- 1.2; 5.2 +/- 1.0 vs 5.7 +/- 0.9), B cells (7.1 +/- 1.1 vs 7.8 +/- 1.1), and natural killer/natural killer T cells (4.8 +/- 1.0 vs 5.6 +/- 1.3). Chronic graft-versus-host disease (P < .04) and a female donor (P < .04) were associated with a greater difference in telomere length between donor and recipient. Critically short telomeres have been described in degenerative diseases and secondary malignancies, suggesting that accelerated telomere shortening in recipients might predispose them to such late complications. If this hypothesis can be confirmed, identification of recipients at risk for cellular senescence could become part of the long-term monitoring of survivors after hematopoietic stem cell transplantation.
Abstract:
We assessed the serological responses over 10 years to repeated immunization of cystic fibrosis (CF) patients with an O-polysaccharide (OPS)-toxin A conjugate vaccine against Pseudomonas aeruginosa. A retrospective analysis was performed with sera from 25 vaccinated and 25 unvaccinated children treated at the same CF centre and matched for clinical management, age and gender. Yearly immunization led to sustained elevations of serum immunoglobulin G (IgG) antibody levels to all vaccine components. Eighteen unvaccinated patients but only eight vaccinated ones developed chronic pseudomonal lung infections. Infection rapidly caused further marked elevations of polysaccharide- but not toxin A-specific serum IgG in both immunized and nonimmunized patients, indicating that protection did not depend on the quantity of IgG present. However, qualitative analyses revealed that the protective capacity of specific serum IgG antibodies was linked to high affinity and to specificity for OPS serotypes rather than for lipopolysaccharide core epitopes.
Abstract:
There is considerable excitement about the potential use of multipotent neural stem cells for the treatment of neurodegenerative diseases. However, the strategy is compromised by the general loss of multipotency and of the ability to generate neurons after long-term in vitro propagation. In the present study, human embryonic (5 weeks post-conception) ventral mesencephalic (VM) precursor cells were propagated as neural tissue-spheres (NTS) in epidermal growth factor (EGF; 20 ng/ml) and fibroblast growth factor 2 (FGF2; 20 ng/ml). After more than 325 days, the NTS were transferred to media containing either EGF+FGF2, EGF+FGF2+heparin or leukemia inhibitory factor (LIF; 10 ng/ml)+FGF2+heparin. Cultures were subsequently propagated for more than 180 days, with NTS analyzed at various time-points. Our data show for the first time that human VM neural precursor cells can be propagated long-term as NTS in the presence of EGF and FGF2. A positive effect of heparin was found only after 150 days of treatment. After switching into the different media, only NTS exposed to LIF contained numerous cells positive for markers of newly formed neurons. Besides demonstrating the ability of human VM NTS to be propagated long-term, our study also suggests that LIF favours neurogenic differentiation of human VM precursor cells.
Abstract:
BACKGROUND AND PURPOSE: Visual neglect is a frequent disability in stroke and adversely affects mobility, discharge destination, and length of hospital stay. It is assumed that its severity is enhanced by a release of interhemispheric inhibition from the unaffected toward the affected hemisphere. Continuous theta burst transcranial magnetic stimulation (TBS) is a new inhibitory brain stimulation protocol with the potential to induce behavioral effects that outlast stimulation. We aimed to test whether parietal TBS over the unaffected hemisphere can induce a long-lasting improvement of visual neglect by reducing this interhemispheric inhibition. METHODS: Eleven patients with left-sided visual neglect attributable to right hemispheric stroke were tested in a visual perception task. To evaluate the specificity of the TBS effect, 3 conditions were tested: 2 TBS trains over the left contralesional posterior parietal cortex, 2 trains of sham stimulation over the contralesional posterior parietal cortex, and a control condition without any intervention. To evaluate the duration of the effect of repeated TBS trains within 1 session, 4 trains were applied over the contralesional posterior parietal cortex. RESULTS: Two TBS trains significantly increased the number of perceived left visual targets for up to 8 hours compared with baseline. No significant improvement was found with sham stimulation or in the control condition without any intervention. The application of 4 TBS trains significantly increased the number of perceived left targets for up to 32 hours. CONCLUSIONS: The new approach of repeating TBS on the same day may be promising for the therapy of neglect.
Abstract:
To study the time course of demineralization and fracture incidence after spinal cord injury (SCI), 100 paraplegic men with complete motor loss were investigated in a cross-sectional study 3 months to 30 years after their traumatic SCI. Fracture history was assessed and verified using patients' files and X-rays. BMD of the lumbar spine (LS), femoral neck (FN), distal forearm (ultradistal part = UDR, 1/3 distal part = 1/3R), distal tibial diaphysis (TDIA), and distal tibial epiphysis (TEPI) was measured using DXA. Stiffness of the calcaneus (QUI.CALC), speed of sound of the tibia (SOS.TIB), and amplitude-dependent SOS across the proximal phalanges (adSOS.PHAL) were measured using quantitative ultrasound (QUS). Z-scores of BMD and QUS were plotted against time-since-injury and compared among four groups of paraplegics stratified according to time-since-injury (<1 year, stratum I; 1-9 years, stratum II; 10-19 years, stratum III; 20-29 years, stratum IV). Biochemical markers of bone turnover (deoxypyridinoline/creatinine (D-pyr/Cr), osteocalcin, alkaline phosphatase) and the main parameters of calcium phosphate metabolism were measured. Fifteen out of 98 paraplegics had sustained a total of 39 fragility fractures within 1,010 patient-years of observation. All recorded fractures were fractures of the lower limbs, with a mean time to first fracture of 8.9 +/- 1.4 years. Fracture incidence increased with time-after-SCI, from 1% in the first 12 months to 4.6%/year in paraplegics injured for >20 years (p<.01). The overall fracture incidence was 2.2%/year. Compared with nonfractured paraplegics, those with a fracture history had been injured for a longer time (p<.01). Furthermore, they had lower Z-scores at FN, TEPI, and TDIA (p<.01 to <.0001), with the largest difference observed at TDIA. At the lower limbs, BMD decreased with time at all sites (r=.49 to .78, all p<.0001). At FN and TEPI, bone loss followed a log curve which leveled off between 1 and 3 years after injury. In contrast, Z-scores at TDIA continuously decreased even beyond 10 years after injury. LS BMD Z-score increased with time-since-SCI (p<.05). Similarly to DXA, QUS allowed differentiation of early and rapid trabecular bone loss (QUI.CALC) vs slow and continuous cortical bone loss (SOS.TIB). Biochemical markers reflected a disproportion between highly elevated bone resorption and almost normal bone formation early after injury. Turnover declined following a log curve with time-after-SCI; however, D-pyr/Cr remained elevated in 30% of paraplegics injured >10 years earlier. In paraplegic men, early (trabecular) and persistent (cortical) bone loss occurs at the lower limbs and leads to an increasing fracture incidence with time-after-SCI.
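The site-specific comparisons above are expressed as Z-scores, i.e., the number of standard deviations a measured value lies from an age- and sex-matched reference mean. A minimal sketch of that calculation follows; the reference mean, SD, and measured value are hypothetical and not taken from the study.

```python
def z_score(measured, ref_mean, ref_sd):
    """Number of SDs the measured value lies from the reference mean."""
    return (measured - ref_mean) / ref_sd

# Hypothetical BMD values in g/cm^2 (not from the study):
print(round(z_score(measured=0.72, ref_mean=0.95, ref_sd=0.12), 2))  # -1.92
```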
Abstract:
The aim of this study was to explore the effect of long-term cross-sex hormonal treatment on cortical and trabecular bone mineral density and the main biochemical parameters of bone metabolism in transsexuals. Twenty-four male-to-female (M-F) transsexuals and 15 female-to-male (F-M) transsexuals treated with either an antiandrogen in combination with an estrogen or parenteral testosterone were included in this cross-sectional study. BMD was measured by DXA at the distal tibial diaphysis (TDIA) and epiphysis (TEPI), lumbar spine (LS), total hip (HIP) and subregions, and whole body (WB), and Z-scores were determined for both the genetic and the phenotypic gender. Biochemical parameters of bone turnover, insulin-like growth factor-1 (IGF-1) and sex hormone levels were measured in all patients. M-F transsexuals were significantly older, taller and heavier than F-M transsexuals. They had been treated with cross-sex hormones for a median of 12.5 years before inclusion. Compared with female age-matched controls, they showed a significantly higher median Z-score at TDIA and WB only (1.7+/-1.0 and 1.8+/-1.1, P < 0.01). Based on the WHO definition, five (who did not comply with cross-sex hormone therapy) had osteoporosis. F-M transsexuals had been treated with cross-sex hormones for a median of 7.6 years. They had significantly higher median Z-scores at TEPI, TDIA and WB compared with female age-matched controls (+0.9+/-0.2 SD, +1.0+/-0.4 SD and +1.4+/-0.3 SD, respectively, P < 0.0001 for all) and reached normal male levels except at TEPI. They had significantly higher testosterone and IGF-1 levels (P < 0.001) than M-F transsexuals. We conclude that in this cross-sectional study, BMD in M-F transsexuals is preserved over a median of 12.5 years of combined antiandrogen and estrogen therapy, while in F-M transsexuals BMD is preserved or, at sites rich in cortical bone, increased to normal male levels over a median of 7.6 years of androgen treatment. IGF-1 could play a role in mediating the effect of androgens on bone in F-M transsexuals.
Abstract:
Although osteoporosis is a systemic disease, vertebral fractures due to spinal bone loss are a frequent, sometimes early and often neglected complication of the disease, generally associated with considerable disability and pain. As osteoporotic vertebral fractures are an important predictor of future fracture risk, including at the hip, medical management is targeted at reducing fracture risk. A literature search for randomized, double-blind, prospective, controlled clinical studies addressing medical treatment options for vertebral fractures in postmenopausal Caucasian women was performed on the leading medical databases. For each publication, the number of patients with at least one new vertebral fracture and the number of randomized patients per treatment arm were retrieved. The relative risk (RR) and the number needed to treat (NNT, i.e. the number of patients to be treated to avoid one radiological vertebral fracture over the duration of the study), together with the respective 95% confidence intervals (95% CI), were calculated for each study. Treatment of steroid-induced osteoporosis and treatment of osteoporosis in men were reviewed separately, owing to the low number of publications available. Forty-five publications matched the search criteria, allowing for analysis of 15 different substances tested for their anti-fracture efficacy at the vertebral level. Bisphosphonates, mainly alendronate and risedronate, were reported to have consistently reduced the risk of a vertebral fracture over up to 50 months of treatment in four (alendronate) and two (risedronate) publications. Raloxifene reduced vertebral fracture risk in one study over 36 months, which was confirmed by 48-month follow-up data. Parathormone (PTH) showed a drastic reduction in vertebral fracture risk in early studies, while calcitonin may also be a treatment option to reduce fracture risk. For other substances, published data are conflicting (calcitriol, fluoride) or insufficient to conclude about efficacy (calcium, clodronate, etidronate, hormone replacement therapy, pamidronate, strontium, tiludronate, vitamin D). The low NNTs for the leading substances (ranges: 15-64 for alendronate, 8-26 for risedronate, 23 for calcitonin and 28-31 for raloxifene) confirm that effective and efficient drug interventions for the treatment and prevention of osteoporotic vertebral fractures are available. Bisphosphonates have demonstrated similar efficacy in the treatment and prevention of steroid-induced and male osteoporosis as in postmenopausal osteoporosis. The selection of the appropriate drug for the treatment of vertebral osteoporosis from among a bisphosphonate (alendronate or risedronate), PTH, calcitonin or raloxifene will mainly depend on the efficacy, tolerability and safety profile, together with the patient's willingness to comply with long-term treatment. Although reduction of vertebral fracture risk is an important criterion for decision making, drugs with proven additional fracture risk reduction at all clinically relevant sites (especially at the hip) should be the preferred options.
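For readers who want to reproduce this kind of per-study summary, the sketch below computes the relative risk, a Wald-type 95% CI, and the NNT from per-arm fracture counts. The counts shown are hypothetical and do not correspond to any trial cited in the review.

```python
import math

def rr_and_nnt(events_tx, n_tx, events_ctrl, n_ctrl):
    """Relative risk (with Wald 95% CI) and number needed to treat,
    from per-arm counts of patients with >=1 new vertebral fracture."""
    risk_tx, risk_ctrl = events_tx / n_tx, events_ctrl / n_ctrl
    rr = risk_tx / risk_ctrl
    nnt = 1 / (risk_ctrl - risk_tx)          # 1 / absolute risk reduction
    se_log_rr = math.sqrt(1/events_tx - 1/n_tx + 1/events_ctrl - 1/n_ctrl)
    ci = tuple(math.exp(math.log(rr) + s * 1.96 * se_log_rr) for s in (-1, 1))
    return rr, ci, nnt

# Hypothetical counts (not from any of the 45 publications):
rr, ci, nnt = rr_and_nnt(events_tx=45, n_tx=1000, events_ctrl=90, n_ctrl=1000)
print(f"RR={rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}, NNT={nnt:.0f}")
```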
Abstract:
BACKGROUND Newer generation everolimus-eluting stents (EES) improve clinical outcome compared with early generation sirolimus-eluting stents (SES) and paclitaxel-eluting stents (PES). We investigated whether this advantage in safety and efficacy also holds in the high-risk population of diabetic patients during long-term follow-up. METHODS Between 2002 and 2009, a total of 1963 consecutive diabetic patients treated with the unrestricted use of EES (n=804), SES (n=612) and PES (n=547) were followed for three years for the occurrence of cardiac events at two academic institutions. The primary end point was the occurrence of definite stent thrombosis (ST). RESULTS The primary outcome occurred in 1.0% of EES-, 3.7% of SES- and 3.8% of PES-treated patients ([EES vs. SES] adjusted HR=0.58, 95% CI 0.39-0.88; [EES vs. PES] adjusted HR=0.29, 95% CI 0.13-0.67). Similarly, patients treated with EES had a lower risk of target-lesion revascularization (TLR) compared with patients treated with SES and PES ([EES vs. SES], 5.6% vs. 11.5%, adjusted HR=0.68, 95% CI: 0.55-0.83; [EES vs. PES], 5.6% vs. 11.3%, adjusted HR=0.51, 95% CI: 0.33-0.77). There were no differences in other safety end points, such as all-cause mortality, cardiac mortality, myocardial infarction (MI) and major adverse cardiac events (MACE). CONCLUSION In diabetic patients, the unrestricted use of EES appears to be associated with improved outcomes, specifically a significant decrease in TLR and ST compared with early generation SES and PES throughout 3-year follow-up.
Abstract:
BACKGROUND Long-term studies of ≥10 years are important milestones for better understanding the factors that cause implant failures or complications. PURPOSE The present study investigated the long-term outcomes of titanium dental implants with a rough, microporous surface (titanium plasma sprayed [TPS]) and the associated biologic and technical complications in partially edentulous patients with fixed dental prostheses over a 20-year follow-up period. MATERIALS AND METHODS Sixty-seven patients, who received 95 implants in the 1980s, were examined with well-established clinical and radiographic parameters. Based on these findings, each implant was classified as successful, surviving, or failed. RESULTS Ten implants in nine patients were lost during the observation period, resulting in an implant survival rate of 89.5%. Radiographically, 92% of the implants exhibited crestal bone loss below 1 mm between the 1- and 20-year follow-up examinations. Only 8% showed peri-implant bone loss of >1 mm, and none exhibited severe bone loss of more than 1.8 mm. During the observation period, 19 implants (20%) experienced a biologic complication with suppuration. Of these 19 implants, 13 (13.7%) had been treated and were successfully maintained over the 20-year follow-up period. Therefore, the 20-year implant success rate was 75.8% or 89.5%, depending on the success criteria applied. Technical complications were observed in 32%. CONCLUSION The present study is the first to report satisfactory success rates after 20 years of function of dental implants with a TPS surface in partially edentulous patients.
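The two reported success rates can be reproduced from the counts above under one plausible reading: the stricter criterion counts the 13 successfully treated implants as surviving but not successful, whereas the more lenient one counts them as successful. This is an assumption for illustration only; the exact criteria are defined in the paper itself.

```python
# Assumed reading of the success criteria (not stated in the abstract):
implants, lost, treated_biologic = 95, 10, 13

surviving = implants - lost                                  # 85
lenient_success = surviving / implants                       # 85/95
strict_success = (surviving - treated_biologic) / implants   # 72/95

print(f"{lenient_success:.1%}  {strict_success:.1%}")  # 89.5%  75.8%
```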
Abstract:
BACKGROUND: This study aimed to investigate the influence of deep sternal wound infection (DSWI) on long-term survival following cardiac surgery. MATERIAL AND METHODS: In our institutional database, we retrospectively evaluated the medical records of 4732 adult patients who underwent open-heart surgery from January 1995 through December 2005. The predictive factors for DSWI were determined using logistic regression analysis. Each patient with DSWI was then matched with 2 controls without DSWI, according to the risk factors identified previously. After checking the balance achieved by matching, short-term mortality was compared between groups using a paired test, and long-term survival was compared using Kaplan-Meier analysis and a Cox proportional hazards model. RESULTS: Overall, 4732 records were analyzed. The mean age of the investigated population was 69.3±12.8 years. DSWI occurred in 74 (1.56%) patients. Significant independent predictive factors for deep sternal infection were active smoking (OR 2.19, 95% CI 1.35-3.53, p=0.001), obesity (OR 1.96, 95% CI 1.20-3.21, p=0.007), and insulin-dependent diabetes mellitus (OR 2.09, 95% CI 1.05-10.06, p=0.016). Mean follow-up in the matched set was 125 months (IQR 99-162). After matching, in-hospital mortality was higher in the DSWI group (8.1% vs. 2.7%, p=0.03), but DSWI was not an independent predictor of long-term survival (adjusted HR 1.5, 95% CI 0.7-3.2, p=0.33). CONCLUSIONS: The results presented in this report indicate that post-sternotomy deep wound infection did not influence long-term survival in this adult general cardiac surgical patient population.
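As a pointer to how such associations are quantified, the sketch below computes a crude odds ratio with a Wald 95% CI from a 2x2 table. The counts are hypothetical, and a crude OR does not reproduce the adjusted estimates reported above, which come from multivariable logistic regression.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio with Wald 95% CI from a 2x2 table:
    a = exposed with DSWI, b = exposed without DSWI,
    c = unexposed with DSWI, d = unexposed without DSWI."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    ci = tuple(math.exp(math.log(or_) + s * 1.96 * se_log_or) for s in (-1, 1))
    return or_, ci

# Hypothetical counts (smokers vs. non-smokers), not taken from the study:
or_, (lo, hi) = odds_ratio(a=35, b=1200, c=39, d=3458)
print(f"OR={or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```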
Abstract:
Pastures containing hay-type and grazing-tolerant alfalfa hybrids were grazed in a season-long or complementary rotational stocking system with N-fertilized smooth bromegrass. The pastures were stocked at a seasonal density of 0.8 cow-calf pairs per acre for 120 days in 1998 and 141 days in 1999. Pastures were intensively managed by daily strip-stocking with the assumptions that 50% of live forage was available and that daily live dry matter consumption of each cow-calf pair was 3.5% of the cow's body weight. First-cutting forage was harvested as hay from 40% of the pasture acres to remove excess forage growth early in the grazing season. Grazing occurred on the remaining 60% of each pasture for the first 44 and 54 days and on 100% of each pasture after days 45 and 55 in 1998 and 1999, respectively. Proportions of 'Amerigraze' and 'Affinity' alfalfa in the live forage dry matter decreased by 70% and 55% in pastures stocked season-long and by 60% and 42% in pastures used for complementary stocking (alfalfa type, p<.05; grazing management, p<.05) in 1998, but decreased by a mean of 72% and were unaffected by hybrid or stocking system in 1999. Cows grazing either alfalfa hybrid in either grazing system had greater weight gains during the breeding and overall grazing seasons and greater increases in body condition score pre-breeding and during the breeding season than cows that grazed smooth bromegrass for the entire season in 1998. Also, cows grazing either alfalfa hybrid in the season-long system had greater breeding-season increases in body condition score than cows grazing alfalfa in the complementary system with smooth bromegrass in 1998. Cows grazing in the season-long alfalfa system had greater pre-breeding-season increases in weight (p<.10) and condition score (p<.05) than cows grazing alfalfa in the complementary system in 1999. Daily and seasonal body weight gains of calves were not affected (p>.10) by the presence of alfalfa in 1998 or by alfalfa type and grazing management in 1998 and 1999. Total animal production (cow and calf) in 1998 was greater (p<.10) from the season-long alfalfa pastures than from the complementary stocked pastures. Total (p<.10) and live (p<.05) forage masses, estimated by monthly clippings, were greater in September of 1998 from the season-long alfalfa pastures than from pastures using alfalfa for complementary stocking. Total (p<.10) and live (p<.05) forage masses were greater in August of 1999 from season-long alfalfa pastures than from pastures using alfalfa for complementary stocking.
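The daily strip-stocking rule described above (50% of live forage considered available, intake of 3.5% of cow body weight per pair per day) translates directly into a daily area allocation. The sketch below illustrates that arithmetic; herd size, cow weight, and forage mass are hypothetical values, not figures reported in the trial.

```python
def daily_strip_acres(n_pairs, cow_weight_lb, live_forage_lb_per_acre,
                      utilization=0.50, intake_frac=0.035):
    """Acres to allocate per day under daily strip-stocking, assuming only
    `utilization` of the live forage is available and each cow-calf pair
    consumes `intake_frac` of the cow's body weight in live dry matter daily."""
    herd_demand_lb = n_pairs * cow_weight_lb * intake_frac
    available_lb_per_acre = live_forage_lb_per_acre * utilization
    return herd_demand_lb / available_lb_per_acre

# Hypothetical inputs (not reported in the abstract):
print(round(daily_strip_acres(n_pairs=24, cow_weight_lb=1200,
                              live_forage_lb_per_acre=2000), 2))  # ~1.01 acres/day
```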
Abstract:
Pastures containing hay-type and grazing-tolerant alfalfa hybrids were grazed in a season-long or complementary rotational stocking system with N-fertilized smooth bromegrass. The pastures were stocked at a seasonal density of 0.8 cow-calf pairs per acre for 120 days. Pastures were intensively managed by daily strip-stocking with the assumptions that 50% of live forage was available and that daily live dry matter consumption of each cow-calf pair was 3.5% of the cow's body weight. First-cutting forage was harvested as hay from 40% of pasture acres to remove excess forage growth early in the grazing season. Forage was grazed from the remaining 60% of each pasture for the first 44 days of the experiment and then from the entire pasture thereafter. Live forage yields, estimated by monthly clippings, were greater in May and September on the season-long alfalfa pastures compared with the complementary pastures, and on the alfalfa pastures compared with the N-fertilized smooth bromegrass pastures. The proportions of legumes in the live dry matter in pastures with grazing-tolerant and hay-type alfalfas in the season-long grazing system declined by 70% and 50%, respectively, over the 120-day trial. The proportions of legumes in the live dry matter in pastures with grazing-tolerant and hay-type alfalfas in the complementary grazing system declined by 60% and 42%, respectively, over the 120-day trial. Cows grazing either alfalfa hybrid in either management system had greater weight gains during the breeding and grazing seasons and greater increases in body condition score pre-breeding and during the breeding season than cows that grazed N-fertilized smooth bromegrass for the entire season. Also, cows grazing either alfalfa hybrid in the season-long system had greater breeding-season increases in body condition score than cows grazing alfalfa in the complementary system with N-fertilized smooth bromegrass. Daily and seasonal gains of calves from cows grazing the alfalfa pastures tended to be greater than those of calves on the N-fertilized smooth bromegrass pastures. Within the alfalfa treatments, calves of cows grazing alfalfa pastures in the season-long system tended to produce more pounds per acre than those of cows grazing alfalfa in the complementary systems.
Abstract:
BACKGROUND There is weak evidence to support the benefit of periodontal maintenance therapy in preventing tooth loss. In addition, the effects of long-term periodontal treatment on general health are unclear. METHODS Patients who were compliant and partially compliant (15 to 25 years' follow-up) in private practice were observed for oral and systemic health changes. RESULTS A total of 219 patients who were compliant (91 males and 128 females) were observed for 19.1 (range: 15 to 25; SD ± 2.8) years. Age at reassessment was 64.6 (range: 39 to 84; SD ± 9.0) years. A total of 145 patients were stable (0 to 3 teeth lost), 54 were downhill (4 to 6 teeth lost), and 21 were extreme downhill (>6 teeth lost); 16 patients developed hypertension, 13 developed type 2 diabetes, and 15 suffered myocardial infarcts (MIs). A minority developed other systemic diseases. Risk factors for MI included overweight (odds ratio [OR]: 9.04; 95% confidence interval [CI]: 2.9 to 27.8; P < 0.001), family history of cardiovascular disease (OR: 3.10; 95% CI: 1.07 to 8.94; P = 0.029), type 1 diabetes at baseline (P = 0.02), and developing type 2 diabetes (OR: 7.9; 95% CI: 2.09 to 29.65; P < 0.001). A total of 25 patients who were partially compliant (17 males and eight females) were observed for 19 years. This group had a higher proportion of downhill and extreme downhill cases and MIs. CONCLUSIONS Patients who left the maintenance program in a periodontal specialist practice in Norway had a higher rate of tooth loss than patients who were compliant. Patients who were compliant with maintenance in a specialist practice in Norway have a similar risk of developing type 2 diabetes as the general population. A rate of 0.0037 MIs per patient per year was recorded for the compliant group. Due to the lack of external data, it is difficult to assess how this compares with patients who have periodontal disease and are untreated.
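A rough plausibility check of that event rate, assuming it was computed as events divided by patient-years of follow-up in the compliant group (the paper's exact denominator may differ slightly):

```python
# 15 MIs among 219 compliant patients followed for a mean of 19.1 years
mis, patients, mean_follow_up_years = 15, 219, 19.1

rate = mis / (patients * mean_follow_up_years)
print(round(rate, 4))  # ~0.0036 MIs per patient per year, close to the reported 0.0037
```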
Abstract:
PURPOSE The purpose of this study was to document the long-term outcome of Brånemark implants installed in augmented maxillary bone and to identify parameters associated with the peri-implant bone level. MATERIAL AND METHODS Patients of a periodontal practice who had been referred to a maxillofacial surgeon for iliac crest bone grafting of the atrophic maxilla were retrospectively recruited. Five months following grafting, they received 7-8 turned Brånemark implants. Following submerged healing for another 5 months, the implants were uncovered, and restorative procedures for fixed rehabilitation were initiated 2-3 months thereafter. The primary outcome variable was bone level, defined as the distance from the implant-abutment interface to the first visible bone-to-implant contact. Secondary outcome variables included plaque index, bleeding index, probing depth, and the levels of 40 species in subgingival plaque samples as identified by means of checkerboard DNA-DNA hybridization. RESULTS Nine out of 16 patients (eight females, one male; mean age 59) with 71 implants agreed to come in for evaluation after an average of 9 years (SD 4; range 3-13) of function. One implant was deemed mobile at the time of inspection. Clinical conditions were acceptable, with 11% of the implants showing pockets ≥ 5 mm. Periodontopathogens were encountered frequently and in high numbers. Clinical parameters and bacterial levels were highly patient dependent. The mean bone level was 2.30 mm (SD 1.53; range 0.00-6.95), with 23% of the implants demonstrating advanced resorption (bone level > 3 mm). Regression analysis showed a significant association of the patient (p < .001) and plaque index (p = .007) with bone level. CONCLUSIONS The long-term outcome of Brånemark implants installed in iliac crest-augmented maxillary bone is acceptable; however, advanced peri-implant bone loss is rather common and indicative of graft resorption. This phenomenon is patient dependent and also seems associated with oral hygiene.