Abstract:
Objective To describe quality of life (QOL) over a 12-month period among women with breast cancer, consider the association between QOL and overall survival (OS), and explore characteristics associated with QOL declines. Methods A population-based sample of Australian women (n=287) with invasive, unilateral breast cancer (Stage I+) was observed prospectively for a median of 6.6 years. QOL was assessed at six, 12 and 18 months post-diagnosis, using the Functional Assessment of Cancer Therapy, Breast (FACT-B+4) questionnaire. Raw scores for the FACT-B+4 and subscales were computed and individuals were categorized according to whether QOL declined, remained stable or improved between six and 18 months. Kaplan-Meier and Cox proportional hazards survival methods were used to estimate OS and its associations with QOL. Logistic regression models identified factors associated with QOL decline. Results Within FACT-B+4 sub-scales, between 10% and 23% of women showed declines in QOL. Following adjustment for established prognostic factors, emotional wellbeing and FACT-B+4 scores at six months post-diagnosis were associated with OS (p<0.05). Declines in physical (p<0.01) or functional (p=0.02) well-being between six and 18 months post-diagnosis were also significantly associated with OS. Receiving multiple forms of adjuvant treatment, a perception of not handling stress well, and reporting one or more other major life events at six months post-diagnosis were factors associated with declines in QOL in multivariable analyses. Conclusions Interventions targeted at preventing QOL declines may ultimately improve quantity as well as quality of life following breast cancer.
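The OS estimates above come from Kaplan-Meier survival methods. A minimal pure-Python sketch of the estimator (the follow-up times and event indicators below are hypothetical, not study data):

```python
# Minimal Kaplan-Meier estimator; illustrative only, hypothetical data.
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = death observed, 0 = censored.
    Returns (time, survival) pairs after each observed death.
    At tied times, deaths are processed before censorings, matching the
    standard convention that same-time censored subjects are still at risk."""
    order = sorted(range(len(times)), key=lambda i: (times[i], -events[i]))
    n_at_risk = len(times)
    surv = 1.0
    curve = []
    for i in order:
        if events[i] == 1:
            surv *= (n_at_risk - 1) / n_at_risk  # conditional survival at this event
            curve.append((times[i], surv))
        n_at_risk -= 1
    return curve

curve = kaplan_meier([2, 3, 3, 5, 8], [1, 0, 1, 1, 0])
```

A full analysis (as in the paper) would add confidence bands and Cox regression for covariate adjustment; this shows only the survival-curve step.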
Abstract:
Habitat models are widely used in ecology; however, there are relatively few studies of rare species, primarily because of a paucity of survey records and the lack of robust means of assessing the accuracy of modelled spatial predictions. We investigated the potential of compiled ecological data in developing habitat models for Macadamia integrifolia, a vulnerable mid-stratum tree endemic to lowland subtropical rainforests of southeast Queensland, Australia. We compared the performance of two binomial models—Classification and Regression Trees (CART) and Generalised Additive Models (GAM)—with Maximum Entropy (MAXENT) models developed from (i) presence records and available absence data and (ii) presence records and background data. The GAM model was the best performer across the range of evaluation measures employed; however, all models were assessed as potentially useful for informing in situ conservation of M. integrifolia. A significant loss in the amount of M. integrifolia habitat has occurred (p < 0.05), with only 37% of former (pre-clearing) habitat remaining in 2003. Remnant patches are significantly smaller, have larger edge-to-area ratios and are more isolated from each other compared with pre-clearing configurations (p < 0.05). Whilst the network of suitable habitat patches is still largely intact, there are numerous smaller patches that are more isolated in the contemporary landscape compared with their connectedness before clearing. These results suggest that in situ conservation of M. integrifolia may be best achieved through a landscape approach that considers the relative contribution of small remnant habitat fragments to the species as a whole, as well as facilitating connectivity among the entire network of habitat patches.
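Evaluation measures for presence/absence habitat models of this kind typically include the threshold-independent AUC (area under the ROC curve). A pure-Python sketch of its pairwise definition, with hypothetical model scores:

```python
# AUC as the probability that a random presence site is scored above a
# random absence site (ties count 0.5). O(n*m) pairwise form, kept simple
# for clarity; scores below are hypothetical, not from the study.
def auc(scores_pos, scores_neg):
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

example = auc([0.9, 0.8, 0.4], [0.5, 0.3])  # 5 of 6 pairs correctly ordered
```

An AUC of 0.5 indicates no discrimination and 1.0 perfect discrimination, which is why it is a common yardstick when comparing CART, GAM and MAXENT outputs.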
Abstract:
Background Interdialytic weight gain (IDWG) can be reduced by lowering the dialysate sodium concentration ([Na]) in haemodialysis patients. It has been assumed that this is because thirst is reduced, although this has been difficult to prove. We compared thirst patterns in stable haemodialysis patients with high and low IDWG using a novel technique and compared the effect of low sodium dialysis (LSD) with normal sodium dialysis (NSD). Methods Eight patients with initial high IDWG and seven with low IDWG completed hourly visual analogue ratings of thirst using a modified palmtop computer during the dialysis day and the interdialytic day. The dialysate [Na] was progressively reduced by up to 5 mmol/l over five treatments. Dialysis continued at the lowest attained [Na] for 2 weeks and the measurements were repeated. The dialysate [Na] then returned to baseline and the process was repeated. Results Baseline interdialytic day mean thirst was higher than the dialysis day mean for the high IDWG group (49.9±14.0 vs 36.2±16.6) and higher than the low weight gain group (49.9±14.0 vs 34.1±14.6). This trend persisted on LSD, but there was a pronounced increase in post-dialysis thirst scores for both groups (high IDWG: 46±13 vs 30±21; low IDWG: 48±24 vs 33±18). The high IDWG group demonstrated lower IDWG during LSD than NSD (2.23±0.98 vs 2.86±0.38 kg; P<0.05). Conclusions Our results indicate that patients with high IDWG experience more intense feelings of thirst on the interdialytic day. LSD reduces their IDWG, but paradoxically increases thirst in the immediate post-dialysis period.
Abstract:
Background We have used serial visual analogue scores to demonstrate disturbances of the appetite profile in dialysis patients. This is potentially important as dialysis patients are prone to malnutrition yet have a lower nutrient intake than controls. Appetite disturbance may be influenced by accumulation of appetite inhibitors such as leptin and cholecystokinin (CCK) in dialysis patients. Methods Fasting blood samples were drawn from 43 controls, 50 haemodialysis (HD) and 39 peritoneal dialysis (PD) patients to measure leptin and CCK. Hunger and fullness scores were derived from profiles compiled using hourly visual analogue scores. Nutrient intake was derived from 3-day dietary records. Results Fasting CCK was elevated for PD (6.73 ± 4.42 ng/l vs control 4.99 ± 2.23 ng/l, P < 0.05; vs HD 4.43 ± 2.15 ng/l, P < 0.01). Fasting CCK correlated with the variability of the hunger (r = 0.426, P = 0.01) and fullness (r = 0.52, P = 0.002) scores for PD. There was a notable relationship with the increase in fullness after lunch for PD (r = 0.455, P = 0.006). When well-nourished PD patients were compared with their malnourished counterparts, CCK was higher in the malnourished group (P = 0.004). Leptin levels were higher for the dialysis patients than controls (HD and PD, P < 0.001), with pronounced hyperleptinaemia evident in some PD patients. Control leptin levels demonstrated correlation with fullness scores (e.g. peak fullness, r = 0.45, P = 0.007) but the dialysis patients' did not. PD nutrient intake (energy and protein intake, r = -0.56, P < 0.0001) demonstrated significant negative correlation with leptin. Conclusion Increased CCK levels appear to influence fullness and hunger perception in PD patients and thus may contribute to malnutrition. Leptin does not appear to affect perceived appetite in dialysis patients but it may influence nutrient intake in PD patients via central feeding centres.
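The associations reported above (e.g. fasting CCK vs variability of hunger scores) are correlation coefficients. A minimal Pearson r, shown with hypothetical data and assuming non-constant inputs (non-zero variance):

```python
# Pearson product-moment correlation; pure-Python sketch, hypothetical data.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))  # covariance term
    sxx = sum((a - mx) ** 2 for a in x)                    # variance of x
    syy = sum((b - my) ** 2 for b in y)                    # variance of y
    return sxy / (sxx * syy) ** 0.5
```

The P values quoted in the abstract would come from testing r against zero, a step omitted from this sketch.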
Abstract:
ROLE OF LOW AFFINITY β1-ADRENERGIC RECEPTOR IN NORMAL AND DISEASED HEARTS Background: The β1-adrenergic receptor (AR) has at least two binding sites, β1HAR and β1LAR (high- and low-affinity sites of the β1AR, respectively), which cause cardiostimulation. Some β-blockers, for example (-)-pindolol and (-)-CGP12177, can activate β1LAR at higher concentrations than those required to block β1HAR. While β1HAR can be blocked by all clinically used β-blockers, β1LAR is relatively resistant to blockade. Thus, chronic β1LAR activation may occur in the setting of β-blocker therapy, thereby mediating persistent βAR signaling. It is therefore important to determine the potential significance of β1LAR in vivo, particularly in disease settings. Methods and results: C57Bl/6 male mice were used. Chronic (4 weeks) β1LAR activation was achieved by treatment with (-)-CGP12177 via osmotic minipump. Cardiac function was assessed by echocardiography and catheterization. (-)-CGP12177 treatment in healthy mice increased heart rate and left ventricular (LV) contractility without detectable LV remodelling or hypertrophy. In mice subjected to an 8-week period of aorta banding, (-)-CGP12177 treatment given during weeks 4-8 led to a positive inotropic effect. (-)-CGP12177 treatment exacerbated LV remodelling, indicated by a worsening of LV hypertrophy by ??% (estimated by weight, wall thickness, cardiomyocyte size) and interstitial/perivascular fibrosis (by histology). Importantly, (-)-CGP12177 treatment of aorta-banded mice exacerbated cardiac expression of hypertrophic, fibrogenic and inflammatory genes (all p<0.05 vs. non-treated control with aorta banding). Conclusion: β1LAR activation provides functional support to the heart, in both normal and diseased (pressure overload) settings. Sustained β1LAR activation in the diseased heart exacerbates LV remodelling and therefore may promote disease progression from compensatory hypertrophy to heart failure.
Abstract:
Exercise is known to cause physiological changes that could affect the impact of nutrients on appetite control. This study was designed to assess the effect of drinks containing either sucrose or high-intensity sweeteners on food intake following exercise. Using a repeated-measures design, three drink conditions were employed: plain water (W), a low-energy drink sweetened with artificial sweeteners aspartame and acesulfame-K (L), and a high-energy, sucrose-sweetened drink (H). Following a period of challenging exercise (70% VO2 max for 50 min), subjects consumed freely from a particular drink before being offered a test meal at which energy and nutrient intakes were measured. The degree of pleasantness (palatability) of the drinks was also measured before and after exercise. At the test meal, energy intake following the artificially sweetened (L) drink was significantly greater than after water and the sucrose (H) drinks (p < 0.05). Compared with the artificially sweetened (L) drink, the high-energy (H) drink suppressed intake by approximately the energy contained in the drink itself. However, there was no difference between the water (W) and the sucrose (H) drink on test meal energy intake. When the net effects were compared (i.e., drink + test meal energy intake), total energy intake was significantly lower after the water (W) drink compared with the two sweet (L and H) drinks. The exercise period brought about changes in the perceived pleasantness of the water, but had no effect on either of the sweet drinks. The remarkably precise energy compensation demonstrated after the higher energy sucrose drink suggests that exercise may prime the system to respond sensitively to nutritional manipulations. The results may also have implications for the effect on short-term appetite control of different types of drinks used to quench thirst during and after exercise.
Abstract:
Generating accurate population-specific public health messages regarding sun protection requires knowledge about seasonal variation in sun exposure in different environments. To address this issue for a subtropical area of Australia, we used polysulphone badges to measure UVR for the township of Nambour (26° latitude) and personal UVR exposure among Nambour residents who were taking part in a skin cancer prevention trial. Badges were worn by participants for two winter and two summer days. The ambient UVR was approximately three times as high in summer as in winter. However, participants received more than twice the proportion of available UVR in winter as in summer (6.5%vs 2.7%, P < 0.05), resulting in an average ratio of summer to winter personal UVR exposure of 1.35. The average absolute difference in daily dose between summer and winter was only one-seventh of a minimal erythemal dose. Extrapolating from our data, we estimate that ca. 42% of the total exposure received in the 6 months of winter (June–August) and summer (December–February) is received during the three winter months. Our data show that in Queensland a substantial proportion of people’s annual UVR dose is obtained in winter, underscoring the need for dissemination of sun protection messages throughout the year in subtropical and tropical climates.
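The summer-to-winter ratio of personal dose follows directly from the ambient ratio and the proportion of ambient UVR received in each season. A back-of-envelope check using the abstract's rounded figures (the reported 1.35 reflects the unrounded per-person data, so the rounded inputs land slightly lower):

```python
# Personal dose = ambient dose x fraction of ambient received.
# All three inputs below are the rounded values quoted in the abstract.
ambient_summer_to_winter = 3.0   # ambient UVR "approximately three times" higher in summer
frac_received_winter = 0.065     # 6.5% of available UVR received in winter
frac_received_summer = 0.027     # 2.7% of available UVR received in summer

personal_ratio = ambient_summer_to_winter * frac_received_summer / frac_received_winter
```

With these rounded inputs the ratio comes out near 1.25, consistent with the paper's 1.35 given the "approximately three times" ambient figure.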
Abstract:
Bone generation by autogenous cell transplantation in combination with a biodegradable scaffold is one of the most promising techniques being developed in craniofacial surgery. The objective of this combined in vitro and in vivo study was to evaluate the morphology and osteogenic differentiation of bone marrow derived mesenchymal progenitor cells and calvarial osteoblasts in a two-dimensional (2-D) and three-dimensional (3-D) culture environment (Part I of this study) and their potential in combination with a biodegradable scaffold to reconstruct critical-size calvarial defects in an autologous animal model [Part II of this study; see Schantz, J.T., et al. Tissue Eng. 2003;9(Suppl. 1):S-127-S-139; this issue]. New Zealand White rabbits were used to isolate osteoblasts from calvarial bone chips and bone marrow stromal cells from iliac crest bone marrow aspirates. Multilineage differentiation potential was evaluated in a 2-D culture setting. After amplification, the cells were seeded within a fibrin matrix into a 3-D polycaprolactone (PCL) scaffold system. The constructs were cultured for up to 3 weeks in vitro and assayed for cell attachment and proliferation using phase-contrast light, confocal laser, and scanning electron microscopy and the MTS cell metabolic assay. Osteogenic differentiation was analyzed by determining the expression of alkaline phosphatase (ALP) and osteocalcin. The bone marrow-derived progenitor cells demonstrated the potential to be induced to the osteogenic, adipogenic, and chondrogenic pathways. In a 3-D environment, cell-seeded PCL scaffolds evaluated by confocal laser microscopy revealed continuous cell proliferation and homogeneous cell distribution within the PCL scaffolds. On osteogenic induction, mesenchymal progenitor cells (12 U/L) produced significantly higher (p < 0.05) ALP activity than did osteoblasts (2 U/L); however, no significant differences were found in osteocalcin expression.
In conclusion, this study showed that the combination of a mechanically stable synthetic framework (PCL scaffolds) and a biomimetic hydrogel (fibrin glue) provides a potential matrix for bone tissue-engineering applications. Comparison of osteogenic differentiation between the two mesenchymal cell sources revealed a similar pattern.
Abstract:
Potential impacts of plantation forestry practices on soil organic carbon and Fe available to microorganisms were investigated in a subtropical coastal catchment. The impacts of harvesting or replanting were largely limited to the soil top layer (0–10 cm depth). The thirty-year-old Pinus plantation showed low soil moisture content (Wc) and relatively high levels of soil total organic carbon (TOC). Harvesting and replanting increased soil Wc but reduced TOC levels. Mean dissolved organic carbon (DOC) and microbial biomass carbon (MBC) increased in harvested or replanted soils, but such changes were not statistically significant (P > 0.05). Total dithionite-citrate and aqua regia-extractable Fe did not respond to forestry practices, but acid ammonium oxalate and pyrophosphate-extractable, bioavailable Fe decreased markedly after harvesting or replanting. Numbers of heterotrophic bacteria were significantly correlated with DOC levels (P < 0.05), whereas Fe-reducing bacteria and S-bacteria detected using laboratory cultivation techniques did not show strong correlation with either soil DOC or Fe content.
Abstract:
Objective: To determine whether bifocal and prismatic bifocal spectacles could control myopia in children with high rates of myopic progression. ---------- Methods: This was a randomized controlled clinical trial. One hundred thirty-five (73 girls and 62 boys) myopic Chinese Canadian children (myopia of 1.00 diopters [D]) with myopic progression of at least 0.50 D in the preceding year were randomly assigned to 1 of 3 treatments: (1) single-vision lenses (n = 41), (2) +1.50-D executive bifocals (n = 48), or (3) +1.50-D executive bifocals with a 3–prism diopters base-in prism in the near segment of each lens (n = 46). ---------- Main Outcome Measures: Myopic progression measured by an automated refractor under cycloplegia and increase in axial length (secondary) measured by ultrasonography at 6-month intervals for 24 months. Only the data of the right eye were used. ---------- Results: Of the 135 children (mean age, 10.29 years [SE, 0.15 years]; mean refraction, –3.08 D [SE, 0.10 D]), 131 (97%) completed the trial after 24 months. Myopic progression averaged –1.55 D (SE, 0.12 D) for those who wore single-vision lenses, –0.96 D (SE, 0.09 D) for those who wore bifocals, and –0.70 D (SE, 0.10 D) for those who wore prismatic bifocals. Axial length increased an average of 0.62 mm (SE, 0.04 mm), 0.41 mm (SE, 0.04 mm), and 0.41 mm (SE, 0.05 mm), respectively. The treatment effect of bifocals (0.59 D) and prismatic bifocals (0.85 D) was significant (P < .001) and both bifocal groups had less axial elongation (0.21 mm) than the single-vision lens group (P < .001). ---------- Conclusions: Bifocal lenses can moderately slow myopic progression in children with high rates of progression after 24 months.
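The quoted treatment effects (0.59 D and 0.85 D) are simply the differences between each bifocal group's mean progression and the single-vision group's, using the means reported above:

```python
# Treatment effect = progression(control) subtracted from progression(treatment);
# values are the group means reported in the abstract (diopters, negative =
# more myopic).
progression = {"single": -1.55, "bifocal": -0.96, "prismatic": -0.70}

effect_bifocal = progression["bifocal"] - progression["single"]      # 0.59 D
effect_prismatic = progression["prismatic"] - progression["single"]  # 0.85 D
```

The standard errors and P < .001 significance quoted in the abstract come from the trial's statistical model, not from this arithmetic alone.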
Abstract:
Introduction: Management of osteoarthritis (OA) includes the use of non-pharmacological and pharmacological therapies. Although walking is commonly recommended for reducing pain and increasing physical function in people with OA, glucosamine sulphate has also been used to alleviate pain and slow the progression of OA. This study evaluated the effects of a progressive walking program and glucosamine sulphate intake on OA symptoms and physical activity participation in people with mild to moderate hip or knee OA. Methods: Thirty-six low-active participants (aged 42 to 73 years) were provided with 1500 mg glucosamine sulphate per day for 6 weeks, after which they began a 12-week progressive walking program, while continuing to take glucosamine. They were randomized to walk 3 or 5 days per week and given a pedometer to monitor step counts. For both groups, the step level of walking was gradually increased to 3000 steps/day during the first 6 weeks of walking, and to 6000 steps/day for the next 6 weeks. Primary outcomes included physical activity levels, physical function (self-paced step test), and the WOMAC Osteoarthritis Index for pain, stiffness and physical function. Assessments were conducted at baseline and at 6-, 12-, 18-, and 24-week follow-ups. The Mann-Whitney test was used to examine differences in outcome measures between groups at each assessment, and the Wilcoxon signed-rank test was used to examine differences in outcome measures between assessments. Results: During the first 6 weeks of the study (glucosamine supplementation only), physical activity levels, physical function, and total WOMAC scores improved (P<0.05). Between the start of the walking program (Week 6) and the final follow-up (Week 24), further improvements were seen in these outcomes (P<0.05), although most improvements were seen between Weeks 6 and 12. No significant differences were found between walking groups.
Conclusions: In people with hip or knee OA, walking a minimum of 3000 steps (~30 minutes), at least 3 days/week, in combination with glucosamine sulphate, may reduce OA symptoms. A more robust study with a larger sample is needed to support these preliminary findings. Trial Registration: Australian Clinical Trials Registry ACTRN012607000159459.
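The within-group comparisons between assessments use the Wilcoxon signed-rank statistic: the sum of the ranks of positive paired differences. A pure-Python sketch with hypothetical scores (zero differences dropped, no tie correction):

```python
# Wilcoxon signed-rank W+ statistic; simplified sketch, hypothetical data.
def wilcoxon_w(before, after):
    diffs = [a - b for b, a in zip(before, after) if a != b]  # drop zeros
    by_abs = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    # rank differences by absolute size (1 = smallest), sum ranks of positives
    return sum(rank + 1 for rank, i in enumerate(by_abs) if diffs[i] > 0)

w_plus = wilcoxon_w([10, 12, 9, 15], [12, 11, 13, 15])
```

A full implementation would average tied ranks and compare W+ against the null distribution to obtain the P values reported above.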
Abstract:
Grassland management affects soil organic carbon (SOC) storage and can be used to mitigate greenhouse gas emissions. However, for a country to assess emission reductions due to grassland management, there must be an inventory method for estimating the change in SOC storage. The Intergovernmental Panel on Climate Change (IPCC) has developed a simple carbon accounting approach for this purpose, and here we derive new grassland management factors that represent the effect of changing management on carbon storage for this method. Our literature search identified 49 studies dealing with effects of management practices that either degraded or improved conditions relative to nominally managed grasslands. On average, degradation reduced SOC storage to 95% +/- 0.06 and 97% +/- 0.05 of carbon stored under nominal conditions in temperate and tropical regions, respectively. In contrast, improving grasslands with a single management activity enhanced SOC storage by 14% +/- 0.06 and 17% +/- 0.05 in temperate and tropical regions, respectively, and with an additional improvement(s), storage increased by another 11% +/- 0.04. We applied the newly derived factor coefficients to analyze C sequestration potential for managed grasslands in the U.S., and found that over a 20-year period changing management could sequester from 5 to 142 Tg C yr(-1) or 0.1 to 0.9 Mg C ha(-1) yr(-1), depending on the level of change. This analysis provides revised factor coefficients for the IPCC method that can be used to estimate impacts of management; it also provides a methodological framework for countries to derive factor coefficients specific to conditions in their region.
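The factor coefficients feed into the IPCC stock-change calculation, in which the SOC stock is a reference stock multiplied by a management factor and the change is annualized over the inventory period. A sketch under stated assumptions: the 1.14 improvement factor is the temperate mean from the abstract (14% +/- 0.06), while the reference stock (60 Mg C/ha) and area are hypothetical:

```python
# IPCC-style stock-change sketch: annualized SOC change when the
# management factor shifts. Reference stock and area are hypothetical.
def soc_change(ref_stock_mg_ha, factor_before, factor_after, area_ha, years=20.0):
    """Mean annual SOC change (Mg C/yr) over the inventory period."""
    return ref_stock_mg_ha * (factor_after - factor_before) * area_ha / years

# e.g. improving 1000 ha of nominally managed temperate grassland (1.00 -> 1.14)
rate = soc_change(60.0, 1.00, 1.14, 1000.0)
```

Scaling this per-area arithmetic to national grassland areas is what yields the 5 to 142 Tg C/yr range quoted above.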
Abstract:
Background: Impairments in upper-body function (UBF) are common following breast cancer. However, the relationship between arm morbidity and quality of life (QoL) remains unclear. This investigation uses longitudinal data to describe UBF in a population-based sample of women with breast cancer and examines its relationship with QoL. ---------- Methods: Australian women (n = 287) with unilateral breast cancer were assessed at three-monthly intervals, from six- to 18-months post-surgery (PS). Strength, endurance and flexibility were used to assess objective UBF, while the Disability of the Arm, Shoulder and Hand questionnaire and the Functional Assessment of Cancer Therapy-Breast questionnaire were used to assess self-reported UBF and QoL, respectively. ---------- Results: Although mean UBF improved over time, up to 41% of women revealed declines in UBF between six- and 18-months PS. Older age, lower socioeconomic position, treatment on the dominant side, mastectomy, more extensive lymph node removal and having lymphoedema each increased the odds of declines in UBF by at least twofold (p < 0.05). Lower baseline and declines in perceived UBF between six- and 18-months PS were each associated with poorer QoL at 18-months PS (p < 0.05). ---------- Conclusions: Significant upper-body morbidity is experienced by many following breast cancer treatment, persisting longer term, and adversely influencing the QoL of breast cancer survivors.
Abstract:
At least two important transportation planning activities rely on planning-level crash prediction models. One is motivated by the Transportation Equity Act for the 21st Century, which requires departments of transportation and metropolitan planning organizations to consider safety explicitly in the transportation planning process. The second could arise from a need for state agencies to establish incentive programs to reduce injuries and save lives. Both applications require a forecast of safety for a future period. Planning-level crash prediction models for the Tucson, Arizona, metropolitan region are presented to demonstrate the feasibility of such models. Data were separated into fatal, injury, and property-damage crashes. To accommodate overdispersion in the data, negative binomial regression models were applied. To accommodate the simultaneity of fatality and injury crash outcomes, simultaneous estimation of the models was conducted. All models produce crash forecasts at the traffic analysis zone level. Statistically significant (p-values < 0.05) and theoretically meaningful variables for the fatal crash model included population density, persons 17 years old or younger as a percentage of the total population, and intersection density. Significant variables for the injury and property-damage crash models were population density, number of employees, intersection density, percentage of miles of principal arterials, percentage of miles of minor arterials, and percentage of miles of urban collectors. Among several conclusions, it is suggested that planning-level safety models are feasible and may play a role in future planning activities. However, caution must be exercised with such models.
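Negative binomial regression handles overdispersed counts because its variance exceeds the mean. A sketch of the pmf in the mean/overdispersion ("NB2") form common in crash modelling, where Var(Y) = mu + alpha*mu^2; the mu and alpha values below are illustrative, not estimates from the paper:

```python
import math

# Negative binomial pmf, NB2 parameterization: Var(Y) = mu + alpha*mu^2.
# alpha > 0 captures overdispersion; alpha -> 0 recovers the Poisson.
def nb_pmf(k, mu, alpha):
    r = 1.0 / alpha                  # NB "size" parameter
    p = r / (r + mu)                 # success probability
    log_coef = math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
    return math.exp(log_coef + r * math.log(p) + k * math.log(1.0 - p))

# probabilities over a wide support should sum to ~1
total = sum(nb_pmf(k, 2.0, 0.5) for k in range(100))
```

A full crash model links mu to zone-level covariates (population density, intersection density, etc.) through a log link and estimates the coefficients and alpha by maximum likelihood.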
Abstract:
Purpose: To investigate the influence of accommodation upon axial length (and a comprehensive range of ocular biometric parameters) in populations of young adult myopic and emmetropic subjects. Methods: Forty young adult subjects had ocular biometry measured utilizing a non-contact optical biometer (Lenstar LS 900) based upon the principle of optical low coherence reflectometry, under three different accommodation demands (0 D, 3 D and 6 D). Subjects were classified as emmetropes (n=19) or myopes (n=21) based upon their spherical equivalent refraction (mean emmetropic refraction -0.05 ± 0.27 DS and mean myopic refraction -1.82 ± 0.84 DS). Results: Axial length changed significantly with accommodation, with a mean increase of 11.9 ± 12.3 µm and 24.1 ± 22.7 µm for the 3 D and 6 D accommodation stimuli respectively. A significant axial elongation associated with accommodation was still evident even following correction of the axial length data for potential error due to lens thickness change. The mean ‘corrected’ increase in axial length was 5.2 ± 11.2 µm and 7.4 ± 18.9 µm for the 3 D and 6 D stimuli respectively. There was no significant difference between the myopic and emmetropic populations in terms of the magnitude of change in axial length with accommodation, regardless of whether the data were corrected or not. A number of other ocular biometric parameters, such as anterior chamber depth, lens thickness and vitreous chamber depth, also exhibited significant change with accommodation. The myopic and emmetropic populations likewise exhibited no significant difference in the magnitude of change in these parameters with accommodation. Conclusions: The eye undergoes a significant axial elongation associated with a brief period of accommodation, and the magnitude of this change in eye length increases for larger accommodation demands; however, there is no significant difference in the magnitude of eye elongation in myopic and emmetropic subjects.