967 results for 100 years


Relevance: 20.00%

Abstract:

We recently demonstrated that suppressed bone remodeling allows microdamage to accumulate and causes reductions in some mechanical properties. However, in our previous study, 1 year of treatment with high-dose etidronate (EHDP) did not increase microdamage accumulation in most skeletal sites of dogs in spite of complete remodeling suppression and the occurrence of spontaneous fractures of ribs and/or thoracic spinous processes. This study evaluates the effects of EHDP on microdamage accumulation and biomechanical properties before fractures occur. Thirty-six female beagles, 1-2 years old, were treated daily for 7 months with subcutaneous injections of saline vehicle (CNT) or EHDP at 0.5 (E-low) or 5 mg/kg per day (E-high). After the animals were killed, bone mineral measurement, histomorphometry, microdamage analysis, and biomechanical testing were performed. EHDP treatment suppressed intracortical and trabecular remodeling by 60%-75% at the lower dose, and by 100% at the higher dose. Osteoid accumulation caused by a mineralization deficit occurred only in the E-high group, and this led to a reduction of mineralized bone mass. Microdamage accumulation increased significantly by two- to fivefold in the rib, lumbar vertebra, ilium, and thoracic spinous process in E-low, and by twofold in the lumbar vertebra and ilium in E-high. However, no significant increase in damage accumulation was observed in ribs or thoracic spinous processes in E-high, where fractures occur following 12 months of treatment. Mechanical properties of lumbar vertebrae and thoracic spinous processes were reduced significantly in both E-low and E-high. These findings suggest that suppression of bone remodeling by EHDP allows microdamage accumulation, but that osteoid accumulation reduces production of microdamage. (Bone 29:271-278; 2001) (C) 2001 by Elsevier Science Inc. All rights reserved.

Relevance: 20.00%

Abstract:

Background: In severe aplastic anaemia, the treatment of choice for young patients with a human leucocyte antigen-matched sibling is now established as allogeneic bone marrow transplantation (BMT). In older patients and in those without a matched sibling donor, immunosuppressive therapy is the usual first option. 'Alternative' marrow donors are emerging as an option for those without a matched sibling donor. Aims: To review 10 years of local experience in treating severe aplastic anaemia with BMT and immunosuppressive therapy, with emphasis on long-term outcomes. Methods: A retrospective analysis was performed of all patients with severe aplastic anaemia presenting to the Royal Brisbane and Royal Children's Hospitals between 1989 and 1999. Data were abstracted regarding patient demographics, pretreatment characteristics and outcome measures, including response rates, overall survival and long-term complications. Results: Twenty-seven consecutive patients were identified, 12 treated with immunosuppression alone and 15 with BMT. In these two groups, transfusion independence was attained in 25% and 100%, respectively, with overall survival being 36% and 100%, respectively. Those treated with immunosuppression were significantly older (median 41.5 versus 22 years, P = 0.008). Long-term survivors of either treatment had extremely low morbidity. Three patients carried pregnancies to term post-transplant. Three patients received alternative donor BMT with correspondingly excellent survival. Conclusions: Patients treated with allogeneic BMT for severe aplastic anaemia enjoyed extremely good long-term survival and minimal morbidity. Patients treated with immunosuppressive therapy had a poorer outcome, reflecting their older age and different usage of therapies over the past decade. Optimal treatment strategies for severe aplastic anaemia remain to be determined.

Relevance: 20.00%

Abstract:

Traffic and tillage effects on runoff and crop performance on a heavy clay soil were investigated over a period of 4 years. Tillage treatments and the cropping program were representative of broadacre grain production practice in northern Australia, and a split-plot design was used to isolate traffic effects. Treatments subject to zero, minimum, and stubble mulch tillage each comprised pairs of 90-m² plots, from which runoff was recorded. A 3-m-wide controlled traffic system allowed one of each pair to be maintained as a non-wheeled plot, while the total surface area of the other received a single annual wheeling treatment from a working 100-kW tractor. Rainfall/runoff hydrographs demonstrate that wheeling produced a large and consistent increase in runoff, whereas tillage produced a smaller increase. Treatment effects were greater on dry soil, but were still maintained in large and intense rainfall events on wet soil. Mean annual runoff from wheeled plots was 63 mm (44%) greater than that from controlled traffic plots, whereas runoff from stubble mulch tillage plots was 38 mm (24%) greater than that from zero tillage plots. Traffic and tillage effects appeared to be cumulative, so the mean annual runoff from wheeled stubble mulch tilled plots, representing conventional cropping practice, was more than 100 mm greater than that from controlled traffic zero tilled plots, representing best practice. This increased infiltration was reflected in a 16% greater yield compared with wheeled stubble mulch plots. Minimum tilled plots demonstrated characteristics midway between those of zero and stubble mulch tillage. The results confirm that unnecessary energy dissipation in the soil during the traction process that normally accompanies tillage has a major negative effect on infiltration and crop productivity. Controlled traffic farming systems appear to be the only practicable solution to this problem.

Relevance: 20.00%

Abstract:

The BRCA2 N372H nonconservative amino acid substitution polymorphism appears to affect fetal survival in a sex-dependent manner, and the HH genotype was found to be associated with a 1.3-fold risk of breast cancer from pooling five case-control studies of Northern European women. We investigated whether the BRCA2 N372H polymorphism was associated with breast cancer in Australian women using a population-based case-control design. The BRCA2 372 genotype was determined in 1397 cases under the age of 60 years at diagnosis of a first primary breast cancer and in 775 population-sampled controls frequency matched for age. Case-control analyses and comparisons of genotype distributions were conducted using logistic regression. All of the statistical tests were two-tailed. The HH genotype was independent of age and family history of breast cancer within cases and controls, and was more common in cases (9.2% versus 6.5%). It was associated with an increased risk of breast cancer, 1.47-fold unadjusted (95% confidence interval, 1.05-2.07; P = 0.02), and 1.42-fold (95% confidence interval, 1.00-2.02; P = 0.05) after adjusting for measured risk factors. This effect was still evident after excluding women with any non-Caucasian ancestry or the 33 cases known to have inherited a mutation in BRCA1 or BRCA2, and would explain approximately 3% of breast cancer. The BRCA2 N372H polymorphism appears to be associated with a modest recessively inherited risk of breast cancer in Australian women. This result is consistent with the findings for Northern European women.
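
For readers who want to see how the unadjusted estimate quoted above follows from the reported genotype frequencies, the short Python sketch below recomputes a crude odds ratio and a Woolf-type 95% confidence interval from the approximate 2x2 counts implied by 9.2% of 1397 cases and 6.5% of 775 controls. The counts are rounded reconstructions and the variable names are illustrative assumptions; this is not the study's analysis code.

import math

# Approximate 2x2 counts implied by the reported frequencies (rounded
# reconstructions, not data taken from the paper): HH genotype in 9.2%
# of 1397 cases and 6.5% of 775 controls.
cases_hh = round(0.092 * 1397)
cases_other = 1397 - cases_hh
controls_hh = round(0.065 * 775)
controls_other = 775 - controls_hh

# Crude (unadjusted) odds ratio for breast cancer given the HH genotype.
odds_ratio = (cases_hh * controls_other) / (cases_other * controls_hh)

# Woolf (log) method for an approximate 95% confidence interval.
se_log_or = math.sqrt(1 / cases_hh + 1 / cases_other + 1 / controls_hh + 1 / controls_other)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI = {lower:.2f}-{upper:.2f}")

With these rounded counts the sketch gives roughly OR = 1.48 with a 95% CI of about 1.05-2.07, in line with the unadjusted 1.47 (1.05-2.07) reported in the abstract; the adjusted 1.42 requires the full logistic regression on measured risk factors and cannot be reproduced from the abstract alone.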

Relevance: 20.00%

Abstract:

The incidence of melanoma increases markedly in the second decade of life but almost nothing is known of the causes of melanoma in this age group. We report on the first population-based case-control study of risk factors for melanoma in adolescents (15-19 years). Data were collected through personal interviews with cases, controls and parents. A single examiner conducted full-body nevus counts, and blood samples were collected from cases for analysis of the CDKN2A melanoma predisposition gene. A total of 201 (80%) of the 250 adolescents with melanoma diagnosed between 1987 and 1994 and registered with the Queensland Cancer Registry and 205 (79%) of 258 age-, gender- and location-matched controls who were contacted agreed to participate. The strongest risk factor associated with melanoma in adolescents in a multivariate model was the presence of more than 100 nevi 2 mm or more in diameter (odds ratio [OR] = 46.5, 95% confidence interval [CI] = 11.4-190.8). Other risk factors were red hair (OR = 5.4, 95% CI = 1.0-28.4); blue eyes (OR = 4.5, 95% CI = 1.5-13.6); inability to tan after prolonged sun exposure (OR = 4.7, 95% CI = 0.9-24.6); heavy facial freckling (OR = 3.2, 95% CI = 0.9-12.3); and family history of melanoma (OR = 4.0, 95% CI = 0.8-18.9). Only 2 of 147 cases tested had germline variants or mutations in CDKN2A. There was no association with sunscreen use overall; however, never/rare use of sunscreen at home under the age of 5 years was associated with increased risk (OR = 2.2, 95% CI = 0.7-7.1). There was no difference between cases and controls in cumulative sun exposure in this high-exposure environment. Factors indicating genetic susceptibility to melanoma, in particular the propensity to develop nevi and freckles, red hair, blue eyes, inability to tan and a family history of the disease, are the primary determinants of melanoma among adolescents in this high solar radiation environment. The lack of association with reported sun exposure is consistent with the high genetic susceptibility in this group. (C) 2002 Wiley-Liss, Inc.

Relevance: 20.00%

Abstract:

Myelin proteolipid protein (PLP), the most abundant protein of central nervous system (CNS) myelin, is a hydrophobic integral membrane protein. Because of its physical properties, which make it difficult to work with, progress towards determining the exact function(s) and disease associations of myelin PLP has been slow. However, recent molecular biology advances have given new life to investigations of PLP, and suggest that it has multiple functions within myelin and is of importance in several neurological disorders. (C) 2002 Elsevier Science Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

Objectives: To determine the incidence of dysphagia (defined as the inability to manage a diet of normal consistencies) at hospital discharge and beyond 1 year post-surgery, and to examine the impact of persistent dysphagia on levels of disability, handicap, and well-being in patients. Design: Retrospective review and patient contact. Setting: Adult acute care tertiary hospital. Patients: The study group, consecutively sampled from January 1993 to December 1997, comprised 55 patients who underwent total laryngectomy and 37 patients who underwent pharyngolaryngectomy with free jejunal reconstruction. Follow-up with 36 of 55 laryngectomy and 14 of 37 pharyngolaryngectomy patients was conducted 1 to 6 years post-surgery. Main Outcome Measures: Number of days until the resumption of oral intake; swallowing complications prior to and following discharge; types of diets managed at discharge and follow-up; and ratings of disability, handicap, and distress levels related to swallowing. Results: Fifty-four (98%) of the laryngectomy and 37 (100%) of the pharyngolaryngectomy patients experienced dysphagia at discharge. By approximately 3 years post-surgery, 21 (58%) of the laryngectomy and 7 (50%) of the pharyngolaryngectomy patients managed a normal diet. Pharyngolaryngectomy patients experienced increased duration of nasogastric feeding, time to resume oral intake, and incidence of early complications affecting swallowing. Patients experiencing long-term dysphagia identified significantly increased levels of disability, handicap, and distress. Patients without dysphagia also experienced slight levels of handicap and distress resulting from taste changes and increased durations required to complete meals of normal consistency. Conclusions: The true incidence of patients experiencing a compromise in swallowing following surgery has been underestimated. The significant impact of impaired swallowing on a patient's level of perceived disability, handicap, and distress highlights the importance of providing optimal management of this negative consequence of surgery to maximize the patient's quality of life.

Relevance: 20.00%

Abstract:

Objectives: To determine (i) factors which predict whether patients hospitalised with acute myocardial infarction (AMI) receive care discordant with recommendations of clinical practice guidelines; and (ii) whether such discordant care results in worse outcomes compared with receiving guideline-concordant care. Design: Retrospective cohort study. Setting: Two community general hospitals. Participants: 607 consecutive patients admitted with AMI between July 1997 and December 2000. Main outcome measures: Clinical predictors of discordant care; crude and risk-adjusted rates of in-hospital mortality and reinfarction, and mean length of hospital stay. Results: At least one treatment recommendation for AMI was applicable for 602 of the 607 patients. Of these patients, 411 (68%) received concordant care, and 191 (32%) discordant care. Positive predictors at presentation of discordant care were age > 65 years (odds ratio [OR], 2.5; 95% CI, 1.7-3.6), silent infarction (OR, 2.7; 95% CI, 1.6-4.6), anterior infarction (OR, 2.5; 95% CI, 1.7-3.8), a history of heart failure (OR, 6.3; 95% CI, 3.7-10.7), chronic atrial fibrillation (OR, 3.2; 95% CI, 1.5-6.4); and heart rate greater than or equal to 100 beats/min (OR, 2.1; 95% CI, 1.4-3.1). Death occurred in 12.0% (23/191) of discordant-care patients versus 4.6% (19/411) of concordant-care patients (adjusted OR, 2.42; 95% CI, 1.22-4.82). Mortality was inversely related to the level of guideline concordance (P = 0.03). Reinfarction rates also tended to be higher in the discordant-care group (4.2% v 1.7%; adjusted OR, 2.5; 95% CI, 0.90-7.1). Conclusions: Certain clinical features at presentation predict a higher likelihood of guideline-discordant care in patients presenting with AMI. Such care appears to increase the risk of in-hospital death.
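
As a small illustration of how the in-hospital mortality counts above translate into an odds ratio, the hedged Python sketch below computes the crude (unadjusted) odds of death for discordant versus concordant care from the reported 23/191 and 19/411; the helper function name is illustrative, and the crude figure is deliberately distinct from the paper's risk-adjusted estimate.

def crude_odds_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    # Crude odds ratio from two event counts and their group sizes (2x2 table).
    odds_exposed = events_exposed / (n_exposed - events_exposed)
    odds_unexposed = events_unexposed / (n_unexposed - events_unexposed)
    return odds_exposed / odds_unexposed

# Reported in-hospital deaths: 23 of 191 discordant-care patients versus
# 19 of 411 concordant-care patients.
print(round(crude_odds_ratio(23, 191, 19, 411), 2))  # about 2.82

The crude ratio of about 2.82 is higher than the reported risk-adjusted OR of 2.42 (95% CI, 1.22-4.82); the gap presumably reflects adjustment for presentation features, such as age and heart failure, that predict discordant care and are themselves associated with mortality.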

Relevance: 20.00%

Abstract:

The aim of this study was to determine the effects of 7 weeks of high- and low-velocity resistance training on strength and sprint running performance in nine male elite junior sprint runners (age 19.0 +/- 1.4 years, best 100 m times 10.89 +/- 0.21 s; mean +/- s). The athletes continued their sprint training throughout the study, but their resistance training programme was replaced by one in which the movement velocities of hip extension and flexion, knee extension and flexion and squat exercises varied according to the loads lifted (i.e. 30-50% and 70-90% of 1-RM in the high- and low-velocity training groups, respectively). There were no between-group differences in hip flexion or extension torque produced at 1.05, 4.74 or 8.42 rad/s, 20 m acceleration or 20 m 'flying' running times, or 1-RM squat lift strength either before or after training. This was despite significant improvements in 20 m acceleration time (P < 0.01), squat strength (P < 0.05), isokinetic hip flexion torque at 4.74 rad/s and hip extension torque at 1.05 and 4.74 rad/s for the athletes as a whole over the training period. Although velocity-specific strength adaptations have been shown to occur rapidly in untrained and non-concurrently training individuals, the present results suggest a lack of velocity-specific performance changes in elite concurrently training sprint runners performing a combination of traditional and semi-specific resistance training exercises.

Relevance: 20.00%

Abstract:

Objective: To map out the career paths of veterinarians during their first 10 years after graduation, and to determine if this could have been predicted at entry to the veterinary course. Design: Longitudinal study of students who started their course at The University of Queensland in 1985 and 1986, and who completed questionnaires in their first and fifth year as students, and in their second, sixth and eleventh year as veterinarians. Methods: Data from 129 (96%) questionnaires completed during the eleventh year after graduation were coded numerically then analysed, together with data from previous questionnaires, with SAS System 7 for Windows 95. Results: Ten years after they graduated, 80% were doing veterinary work, 60% were in private practice, 40% in small animal practice and 18% in mixed practice. The equivalent of 25% of the working time of all females was taken up by family duties. When part-time work was taken into account, veterinary work constituted the equivalent of 66% of the group working full-time. That 66% consisted of 52% on small animals, 7% on horses, 6% on cattle/sheep and 1% on pigs/poultry. Those who had grown up on farms with animals were twice as likely to be working with farm animals as were those from other backgrounds. Forecasts made on entry to the veterinary course were of no value in predicting who would remain in mixed practice. Conclusions: Fewer than one-fifth of graduates were in mixed practice after 10 years, but the number was higher for those who grew up on farms with animals. Forecasts that may be made at interview before entry to the course were of little value in predicting the likelihood of remaining in mixed veterinary practice.