927 results for 860[729.1].07[Sarduy]


Abstract:

Irrigation is known to stimulate soil microbial carbon and nitrogen turnover and potentially the emissions of nitrous oxide (N2O) and carbon dioxide (CO2). We conducted a study to evaluate the effect of three different irrigation intensities on soil N2O and CO2 fluxes and to determine if irrigation management can be used to mitigate N2O emissions from irrigated cotton on black vertisols in South-Eastern Queensland, Australia. Fluxes were measured over the entire 2009/2010 cotton growing season with a fully automated chamber system that measured emissions on a sub-daily basis. Irrigation intensity had a significant effect on CO2 emissions. More frequent irrigation stimulated soil respiration, and seasonal CO2 fluxes ranged from 2.7 to 4.1 Mg-C ha−1 for the treatments with the lowest and highest irrigation frequency, respectively. N2O emissions were episodic, with the highest emissions occurring when heavy rainfall or irrigation coincided with elevated soil mineral N levels; seasonal emissions ranged from 0.80 to 1.07 kg N2O-N ha−1 for the different treatments. Emission factors (EF = proportion of N fertilizer emitted as N2O) over the cotton cropping season, uncorrected for background emissions, ranged from 0.40 to 0.53% of total N applied for the different treatments. There was no significant effect of the different irrigation treatments on soil N2O fluxes because the highest emissions occurred in all treatments following heavy rainfall from a series of summer thunderstorms, which overrode the effect of the irrigation treatments. However, higher irrigation intensity increased the cotton yield and therefore reduced the N2O intensity (N2O emission per lint yield) of this cropping system. Our data suggest that there is only limited scope to reduce absolute N2O emissions by different irrigation intensities in irrigated cotton systems with summer-dominated rainfall. However, the significant impact of the irrigation treatments on the N2O intensity clearly shows that irrigation can easily be used to optimize the N2O intensity of such a system.
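As a worked illustration of the two summary metrics defined in this abstract (the uncorrected emission factor and the N2O intensity), here is a minimal Python sketch. The fertilizer rate and lint yield used below are assumed placeholder values, not figures from the study.

```python
# Sketch: uncorrected emission factor (EF) and N2O intensity, as defined in the abstract.
# The input numbers below are illustrative placeholders, not measurements from the study.

def emission_factor(seasonal_n2o_n_kg_ha: float, n_applied_kg_ha: float) -> float:
    """EF = seasonal N2O-N emitted as a percentage of total fertilizer N applied
    (uncorrected for background emissions)."""
    return 100.0 * seasonal_n2o_n_kg_ha / n_applied_kg_ha

def n2o_intensity(seasonal_n2o_n_kg_ha: float, lint_yield_kg_ha: float) -> float:
    """N2O intensity = seasonal N2O-N emission per unit of lint yield."""
    return seasonal_n2o_n_kg_ha / lint_yield_kg_ha

if __name__ == "__main__":
    n2o = 1.07          # kg N2O-N/ha over the season (upper end reported above)
    n_applied = 200.0   # kg N/ha, assumed fertilizer rate
    lint = 2200.0       # kg lint/ha, assumed yield
    print(f"EF = {emission_factor(n2o, n_applied):.2f}% of applied N")
    print(f"N2O intensity = {n2o_intensity(n2o, lint) * 1000:.2f} g N2O-N per kg lint")
```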

Abstract:

Objective: Several new types of contraception became available in Australia over the last twelve years (the implant in 2001, progestogen intra-uterine device (IUD) in 2003, and vaginal contraceptive ring in 2007). Most methods of contraception require access to health services. Permanent sterilisation and the insertion of an implant or IUD involve a surgical procedure. Access to health professionals providing these specialised services may be more difficult in rural areas. This paper examines uptake of permanent or long-acting reversible contraception (LARCs) among Australian women in rural areas compared to women in urban areas. Method: Participants in the Australian Longitudinal Study on Women's Health born in 1973-78 reported on their contraceptive use at three surveys: 2003, 2006 and 2009. Contraceptive methods included permanent sterilisation (tubal ligation, vasectomy), non-daily or LARC methods (implant, IUD, injection, vaginal ring), and other methods including daily, barrier or "natural" methods (oral contraceptive pills, condoms, withdrawal, safe period). Sociodemographic, reproductive history and health service use factors associated with using permanent, LARC or other methods were examined using a multivariable logistic regression analysis. Results: Of 9,081 women aged 25-30 in 2003, 3% used permanent methods and 4% used LARCs. Six years later in 2009, of 8,200 women (aged 31-36), 11% used permanent methods and 9% used LARCs. The fully adjusted parsimonious regression model showed that the likelihood of a woman using LARCs and permanent methods increased with number of children. Women whose youngest child was school-age were more likely to use LARCs (OR=1.83, 95%CI 1.43-2.33) or permanent methods (OR=4.39, 95%CI 3.54-5.46) compared to women with pre-school children. Compared to women living in major cities, women in inner regional areas were more likely to use LARCs (OR=1.26, 95%CI 1.03-1.55) or permanent methods (OR=1.43, 95%CI 1.17-1.76). Women living in outer regional and remote areas were more likely than women living in cities to use LARCs (OR=1.65, 95%CI 1.31-2.08) or permanent methods (OR=1.69, 95%CI 1.43-2.14). Women with poorer access to GPs were more likely to use permanent methods (OR=1.27, 95%CI 1.07-1.52). Conclusions: Location of residence and access to health services are important factors in women's choices about long-acting contraception in addition to the number and age of their children. There is a low level of uptake of non-daily, long-acting methods of contraception among Australian women in their mid-thirties.
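The analysis described above is a multivariable logistic regression reporting odds ratios for LARC use by area of residence and family factors. The following sketch shows how such a model might be fitted with statsmodels; the variable names and the synthetic data are assumptions for illustration only, not the study's dataset.

```python
# Sketch: multivariable logistic regression of LARC use on area of residence and
# number of children, of the kind described above. Data are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "larc": rng.integers(0, 2, n),   # 1 = uses a long-acting reversible method
    "area": rng.choice(["major_city", "inner_regional", "outer_regional_remote"], n),
    "children": rng.integers(0, 4, n),
})

# Logit model; odds ratios are the exponentiated coefficients.
model = smf.logit("larc ~ C(area, Treatment('major_city')) + children", data=df).fit(disp=False)
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"),
                 conf_int.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```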

Abstract:

Background: The Vulnerable Elders Survey-13 (VES-13) is increasingly used to screen for older patients who can proceed to intensive chemotherapy without further comprehensive assessment. This study compared the VES-13 determination of fitness for treatment with the oncologist's assessments of fitness. Method: Sample: Consecutive series of solid tumour patients ≥65 years (n=175; M=72; range=65-86) from an Australian cancer centre. Patients were screened with the VES-13 before proceeding to usual treatment. Blinded to screening, oncologists concurrently predicted patient fitness for chemotherapy. A sample of 175 can detect, with 90% power, kappa coefficients of agreement between VES-13 and oncologists’ assessments >0.90 ("almost perfect agreement"). Separate backward stepwise logistic regression analyses assessed potential predictors of VES-13 and oncologists’ ratings of fitness. Results: The kappa coefficient for agreement between VES-13 and oncologists’ ratings of fitness was 0.41 (p<0.001). VES-13 and oncologists’ assessments agreed in 71% of ratings. VES-13 sensitivity = 83.3%; specificity = 57%; positive predictive value = 69%; negative predictive value = 75%. Logistic regression modelling indicated that the odds of being vulnerable to chemotherapy (VES-13) increased with increasing depression (OR=1.42; 95% CI: 1.18, 1.71) and decreased with increased functional independence assessed on the Barthel Index (OR=0.82; CI: 0.74, 0.92) and Lawton instrumental activities of daily living (OR=0.44; CI: 0.30, 0.65); RSquare=.65. Similarly, the odds of a patient being vulnerable to chemotherapy, when assessed by physicians, increased with increasing age (OR=1.15; CI: 1.07, 1.23) and depression (OR=1.23; CI: 1.06, 1.43), and decreased with increasing functional independence (OR=0.91; CI: 0.85, 0.98); RSquare=.32. Conclusions: Our data indicate moderate agreement between VES-13 and clinician assessments of patients’ fitness for chemotherapy. Current ‘one-step’ screening processes to determine fitness have limits. Nonetheless, screening tools do have the potential for modification and enhanced predictive properties in cancer care by adding relevant items, thus enabling fit patients to be immediately referred for chemotherapy.
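The agreement statistics reported here (Cohen's kappa, sensitivity, specificity, PPV and NPV) all derive from the 2×2 table of VES-13 ratings against oncologists' ratings. A minimal sketch of those calculations follows; the cell counts are invented for illustration and do not reproduce the study's table.

```python
# Sketch: agreement statistics from a 2x2 table of screening tool vs clinician ratings.
# Cell counts are illustrative, not the study's data.
import numpy as np

# Rows: VES-13 (vulnerable, fit); columns: oncologist (vulnerable, fit)
table = np.array([[50, 31],
                  [10, 84]], dtype=float)

tp, fp = table[0, 0], table[0, 1]
fn, tn = table[1, 0], table[1, 1]
n = table.sum()

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)

# Cohen's kappa: observed agreement corrected for chance-expected agreement.
p_observed = (tp + tn) / n
p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (p_observed - p_expected) / (1 - p_expected)

print(f"Sensitivity={sensitivity:.2f} Specificity={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} kappa={kappa:.2f}")
```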

Abstract:

The phosphate mineral series eosphorite–childrenite–(Mn,Fe)Al(PO4)(OH)2·(H2O) has been studied using a combination of electron probe analysis and vibrational spectroscopy. Eosphorite is the manganese-rich mineral with lower iron content in comparison with childrenite, which has higher iron and lower manganese content. The determined formulae of the two studied minerals are: (Mn0.72,Fe0.13,Ca0.01)(Al)1.04(PO4, OHPO3)1.07(OH1.89,F0.02)·0.94(H2O) for SAA-090 and (Fe0.49,Mn0.35,Mg0.06,Ca0.04)(Al)1.03(PO4, OHPO3)1.05(OH)1.90·0.95(H2O) for SAA-072. Raman spectroscopy enabled the observation of bands at 970 cm−1 and 1011 cm−1 assigned to monohydrogen phosphate, phosphate and dihydrogen phosphate units. Differences are observed in the areas of the peaks between the two eosphorite minerals. Raman bands at 562 cm−1, 595 cm−1, and 608 cm−1 are assigned to the ν4 bending modes of the PO4, HPO4 and H2PO4 units; Raman bands at 405 cm−1, 427 cm−1 and 466 cm−1 are attributed to the ν2 modes of these units. Raman bands of the hydroxyl and water stretching modes are observed. Vibrational spectroscopy enabled details of the molecular structure of the eosphorite mineral series to be determined.

Abstract:

It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.

Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery
The pre-operative rate of malnutrition in a convenience sample of people with Parkinson's disease (PWP) scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, body mass index (BMI)) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B) while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0kg/m2, p<.05). There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.

Phase I: Nutritional status in community-dwelling adults with Parkinson's disease
The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, body mass index (BMI)) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0kg/m2; MAC 29.1 vs 25.5cm; waist circumference 95.5 vs 82.5cm; calf circumference 36.5 vs 32.5cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated between well-nourished and malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly.
This study concluded that malnutrition in community-dwelling PWP is higher than that documented in community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.

Phase I: Nutrition assessment tools in people with Parkinson's disease
There are a number of validated and reliable nutrition screening and assessment tools available for use. None of these tools have been evaluated in PWP. In the sample described above, the use of the World Health Organisation (WHO) cut-off (≤18.5kg/m2), age-specific BMI cut-offs (≤18.5kg/m2 for under 65 years, ≤23.5kg/m2 for 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools using the SGA classification as the gold standard. For screening, the MNA-SF performed the best, with sensitivity (Sn) of 94.7% and specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more for use as a nutrition screening tool, the MNA-SF might be more appropriate and take less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.

Phase I: Predictors of poor nutritional status in people with Parkinson's disease
A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B). Differences between the groups included disease severity (4% more severe SGA-A vs 21% SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support.

Phase II: Nutrition intervention
Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted.
Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. The INT group gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake (18.3kJ/kg vs 3.8kJ/kg and 0.3g/kg vs 0.15g/kg) increased in both groups. The increase in protein intake was only significant in the SC group. The changes in intake did not differ between the groups. There were no significant changes in any motor or non-motor symptoms or in "off" times or dyskinesias in either group. Aspects of quality of life also improved over the 12 weeks, especially emotional well-being.

This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high-protein, high-energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices, with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.

Abstract:

BACKGROUND: Observational data suggested that supplementation with vitamin D could reduce the risk of infection, but trial data are inconsistent. OBJECTIVE: We aimed to examine the effect of oral vitamin D supplementation on antibiotic use. DESIGN: We conducted a post hoc analysis of data from the pilot D-Health trial, a randomized trial carried out in a general community setting between October 2010 and February 2012. A total of 644 Australian residents aged 60-84 y were randomly assigned to receive monthly doses of a placebo (n = 214) or 30,000 (n = 215) or 60,000 (n = 215) IU oral cholecalciferol for ≤12 mo. Antibiotics prescribed during the intervention period were ascertained by linkage with pharmacy records through the national health insurance scheme (Medicare Australia). RESULTS: People who were randomly assigned 60,000 IU cholecalciferol had a nonsignificant 28% lower risk of having antibiotics prescribed at least once than did people in the placebo group (RR: 0.72; 95% CI: 0.48, 1.07). In analyses stratified by age, in subjects aged ≥70 y there was a significant reduction in antibiotic use in the high-dose vitamin D group compared with the placebo group (RR: 0.53; 95% CI: 0.32, 0.90), whereas there was no effect in participants <70 y old (RR: 1.07; 95% CI: 0.58, 1.97) (P-interaction = 0.1). CONCLUSION: Although this was a post hoc analysis and the overall result was statistically nonsignificant, this trial lends some support to the hypothesis that supplementation with 60,000 IU vitamin D/mo is associated with a lower risk of infection, particularly in older adults. The trial was registered at the Australian New Zealand Clinical Trials Registry (anzctr.org.au) as ACTRN12609001063202.
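The effect measure reported is a risk ratio for having antibiotics prescribed at least once, comparing a vitamin D dose group with placebo. Below is a minimal sketch of a risk ratio with a large-sample log-scale 95% CI; the event counts are placeholders, not the trial's data.

```python
# Sketch: risk ratio (RR) with a 95% CI on the log scale for a binary outcome
# (any antibiotic prescription) in an intervention vs placebo group.
# Counts are illustrative placeholders, not the trial's data.
import math

def risk_ratio_ci(events_tx, n_tx, events_ctrl, n_ctrl, z=1.96):
    risk_tx = events_tx / n_tx
    risk_ctrl = events_ctrl / n_ctrl
    rr = risk_tx / risk_ctrl
    # Standard error of log(RR) for independent binomial proportions.
    se_log_rr = math.sqrt(1/events_tx - 1/n_tx + 1/events_ctrl - 1/n_ctrl)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lo, hi

rr, lo, hi = risk_ratio_ci(events_tx=60, n_tx=215, events_ctrl=83, n_ctrl=214)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```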

Abstract:

Background Single nucleotide polymorphisms (SNPs) rs429358 (ε4) and rs7412 (ε2), both invoking changes in the amino-acid sequence of the apolipoprotein E (APOE) gene, have previously been tested for association with multiple sclerosis (MS) risk. However, none of these studies was sufficiently powered to detect modest effect sizes at acceptable type-I error rates. As both SNPs are only imperfectly captured on commonly used microarray genotyping platforms, their evaluation in the context of genome-wide association studies has been hindered until recently. Methods We genotyped 12 740 subjects hitherto not studied for their APOE status, imputed raw genotype data from 8739 subjects from five independent genome-wide association study datasets using the most recent high-resolution reference panels, and extracted genotype data for 8265 subjects from previous candidate gene assessments. Results Despite sufficient power to detect associations at genome-wide significance thresholds across a range of ORs, our analyses did not support a role of rs429358 or rs7412 in MS susceptibility. This included meta-analyses of the combined data across 13 913 MS cases and 15 831 controls (OR=0.95, p=0.259, and OR=1.07, p=0.0569, for rs429358 and rs7412, respectively). Conclusion Given the large sample size of our analyses, it is unlikely that the two APOE missense SNPs studied here exert any relevant effects on MS susceptibility.
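The combined estimates across the newly genotyped, imputed and candidate-gene datasets come from meta-analysis of per-dataset odds ratios. A minimal inverse-variance fixed-effect sketch is shown below; the per-dataset ORs and standard errors are invented placeholders.

```python
# Sketch: inverse-variance fixed-effect meta-analysis of log odds ratios across datasets.
# The per-dataset estimates below are illustrative placeholders, not the study's results.
import numpy as np
from scipy import stats

log_or = np.log(np.array([0.93, 0.97, 0.96]))   # per-dataset OR estimates for one SNP
se = np.array([0.05, 0.06, 0.08])               # standard errors of the log ORs

weights = 1.0 / se**2
pooled_log_or = np.sum(weights * log_or) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
z = pooled_log_or / pooled_se
p_value = 2 * stats.norm.sf(abs(z))

print(f"Pooled OR = {np.exp(pooled_log_or):.3f}, p = {p_value:.3f}")
```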

Abstract:

Background To explore the impact of geographical remoteness and area-level socioeconomic disadvantage on colorectal cancer (CRC) survival. Methods Multilevel logistic regression and Markov chain Monte Carlo simulations were used to analyze geographical variations in five-year all-cause and CRC-specific survival across 478 regions in Queensland, Australia, for 22,727 CRC cases aged 20–84 years diagnosed from 1997 to 2007. Results Area-level disadvantage and geographic remoteness were independently associated with CRC survival. After full multivariate adjustment (both levels), patients from remote areas (odds ratio [OR]: 1.24, 95% CrI: 1.07-1.42) and more disadvantaged quintiles (OR = 1.12, 1.15, 1.20, 1.23 for Quintiles 4, 3, 2 and 1 respectively) had lower CRC-specific survival than patients from major cities and the least disadvantaged areas. Similar associations were found for all-cause survival. Area disadvantage accounted for a substantial amount of the all-cause variation between areas. Conclusions We have demonstrated that the area-level inequalities in survival of colorectal cancer patients cannot be explained by the measured individual-level characteristics of the patients or their cancer, and remain after adjusting for cancer stage. Further research is urgently needed to clarify the factors that underlie these survival differences, including the importance of geographical differences in clinical management of CRC.
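The reported effects are odds ratios with 95% credible intervals from a multilevel logistic model fitted via Markov chain Monte Carlo. As a small, library-agnostic sketch of the final step, the code below converts posterior samples of a log-odds coefficient into an OR and a credible interval; the samples are simulated for illustration.

```python
# Sketch: turning MCMC posterior samples of a log-odds coefficient (e.g. remoteness)
# into an odds ratio with a 95% credible interval. Samples here are simulated.
import numpy as np

rng = np.random.default_rng(1)
posterior_log_odds = rng.normal(loc=0.215, scale=0.07, size=10_000)  # placeholder posterior

or_samples = np.exp(posterior_log_odds)
or_median = np.median(or_samples)
cri_low, cri_high = np.percentile(or_samples, [2.5, 97.5])
print(f"OR = {or_median:.2f} (95% CrI {cri_low:.2f}-{cri_high:.2f})")
```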

Abstract:

Objectives Heatwaves can have significant health consequences resulting in increased mortality and morbidity. However, their impact on people living in tropical/subtropical regions remains largely unknown. This study assessed the impact of heatwaves on mortality and emergency hospital admissions (EHAs) from non-external causes (NEC) in Brisbane, a subtropical city in Australia. Methods We acquired daily data on weather, air pollution and EHAs for patients aged 15 years and over in Brisbane between January 1996 and December 2005, and on mortality between January 1996 and November 2004. A locally derived definition of heatwave (daily maximum ≥37°C for 2 or more consecutive days) was adopted. Case–crossover analyses were used to assess the impact of heatwaves on cause-specific mortality and EHAs. Results During heatwaves, there was a statistically significant increase in NEC mortality (OR 1.46; 95% CI 1.21 to 1.77), cardiovascular mortality (OR 1.89; 95% CI 1.44 to 2.48), diabetes mortality in those aged 75+ (OR 9.96; 95% CI 1.02 to 96.85), NEC EHAs (OR 1.15; 95% CI 1.07 to 1.23) and EHAs from renal diseases (OR 1.41; 95% CI 1.09 to 1.83). The elderly were found to be particularly vulnerable to heatwaves (eg, for NEC EHAs, OR 1.24 for 65–74-year-olds and 1.39 for those aged 75+). Conclusions Significant increases in NEC mortality and EHAs were observed during heatwaves in Brisbane where people are well accustomed to hot summer weather. The most vulnerable were the elderly and people with cardiovascular, renal or diabetic disease.
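The locally derived heatwave definition used above (daily maximum ≥37°C on 2 or more consecutive days) is straightforward to operationalise. The sketch below flags heatwave days in a daily maximum temperature series; the series itself is synthetic.

```python
# Sketch: flag heatwave days, defined as daily maximum temperature >= 37 C
# on 2 or more consecutive days. The temperature series is synthetic.
import pandas as pd

def heatwave_days(tmax: pd.Series, threshold: float = 37.0, min_run: int = 2) -> pd.Series:
    """Boolean series: True on days belonging to a run of >= min_run hot days."""
    hot = tmax >= threshold
    # Label consecutive runs of hot days, then keep runs that are long enough.
    run_id = (hot != hot.shift()).cumsum()
    run_len = hot.groupby(run_id).transform("size")
    return hot & (run_len >= min_run)

tmax = pd.Series([33.1, 37.4, 38.0, 36.5, 37.2, 39.1, 37.6, 34.0],
                 index=pd.date_range("2004-01-10", periods=8, freq="D"))
print(heatwave_days(tmax))
```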

Abstract:

BACKGROUND Malaria remains a public health problem in the remote and poor areas of Yunnan Province, China. Yunnan faces an increasing risk of imported malaria infections from neighboring countries in the Mekong River region. This study aimed to identify high-risk areas of malaria transmission in Yunnan Province and to estimate the effects of climatic variability on the transmission of Plasmodium vivax and Plasmodium falciparum in the identified area. METHODS We identified spatial clusters of malaria cases using spatial cluster analysis at the county level in Yunnan Province, 2005-2010, and estimated the weekly effects of climatic factors on P. vivax and P. falciparum based on a dataset of daily malaria cases and climatic variables. A distributed lag nonlinear model was used to estimate the impact of temperature, relative humidity and rainfall at lags of up to 10 weeks on both types of malaria parasite after adjusting for seasonal and long-term effects. RESULTS The primary cluster area was identified along the China-Myanmar border in western Yunnan. A 1°C increase in minimum temperature was associated with an increased relative risk (RR) at lags of 4 to 9 weeks, with the highest effect at lag 7 weeks for P. vivax (RR = 1.03; 95% CI, 1.01, 1.05) and lag 6 weeks for P. falciparum (RR = 1.07; 95% CI, 1.04, 1.11); a 10-mm increment in rainfall was associated with increased RRs at lags of 2-4 weeks and 9-10 weeks, with the highest effect at 3 weeks for both P. vivax (RR = 1.03; 95% CI, 1.01, 1.04) and P. falciparum (RR = 1.04; 95% CI, 1.01, 1.06); and the RRs for a 10% rise in relative humidity were significant from lag 3 to 8 weeks, with the highest RR of 1.24 (95% CI, 1.10, 1.41) for P. vivax at a 5-week lag. CONCLUSIONS Our findings suggest that the China-Myanmar border is a high-risk area for malaria transmission. Climatic factors appeared to be among the major determinants of malaria transmission in this area. The estimated lag effects for the association between temperature and malaria are consistent with the life cycles of both the mosquito vector and the malaria parasite. These findings will be useful for malaria surveillance-response systems in the Mekong River region.
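The exposure-lag analysis described uses a distributed lag nonlinear model (DLNM). As a simplified stand-in, the sketch below fits a plain linear distributed-lag Poisson regression of weekly case counts on minimum temperature at lags 0-10 weeks; it illustrates the lag structure but not the nonlinear basis functions or the seasonal adjustment, and all data and variable names are assumptions.

```python
# Sketch: a simplified *linear* distributed-lag Poisson regression of weekly malaria
# counts on minimum temperature at lags 0-10 weeks. The full analysis above used a
# distributed lag nonlinear model (DLNM); this only illustrates the lag structure.
# Data are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
weeks = 312  # six years of weekly data
df = pd.DataFrame({
    "cases": rng.poisson(20, weeks),
    "tmin": 10 + 8 * np.sin(np.arange(weeks) * 2 * np.pi / 52) + rng.normal(0, 1, weeks),
})

max_lag = 10
for lag in range(max_lag + 1):
    df[f"tmin_lag{lag}"] = df["tmin"].shift(lag)
df = df.dropna()

X = sm.add_constant(df[[f"tmin_lag{lag}" for lag in range(max_lag + 1)]])
model = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()

# Relative risk per 1 degree C increase at each lag = exp(lag coefficient).
rr_by_lag = np.exp(model.params.drop("const"))
print(rr_by_lag.round(3))
```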

Abstract:

BACKGROUND: Physical activity, particularly walking, is greatly beneficial to health; yet a sizeable proportion of older adults are insufficiently active. The importance of built environment attributes for walking is known, but few studies of older adults have examined neighbourhood destinations and none have investigated access to specific, objectively-measured commercial destinations and walking. METHODS: We undertook a secondary analysis of data from the Western Australian state government's health surveillance survey for those aged 65-84 years and living in the Perth metropolitan region from 2003-2009 (n = 2,918). Individual-level road network service areas were generated at 400 m and 800 m distances, and the presence or absence of six commercial destination types within the neighbourhood service areas identified (food retail, general retail, medical care services, financial services, general services, and social infrastructure). Adjusted logistic regression models examined access to and mix of commercial destination types within neighbourhoods for associations with self-reported walking behaviour. RESULTS: On average, the sample was aged 72.9 years (SD = 5.4), and was predominantly female (55.9%) and married (62.0%). Overall, 66.2% reported some weekly walking and 30.8% reported sufficient walking (≥150 min/week). Older adults with access to general services within 400 m (OR = 1.33, 95% CI = 1.07-1.66) and 800 m (OR = 1.20, 95% CI = 1.02-1.42), and social infrastructure within 800 m (OR = 1.19, 95% CI = 1.01-1.40) were more likely to engage in some weekly walking. Access to medical care services within 400 m (OR = 0.77, 95% CI = 0.63-0.93) and 800 m (OR = 0.83, 95% CI = 0.70-0.99) reduced the odds of sufficient walking. Access to food retail, general retail, financial services, and the mix of commercial destination types within the neighbourhood were all unrelated to walking. CONCLUSIONS: The types of neighbourhood commercial destinations that encourage older adults to walk appear to differ slightly from those reported for adult samples. Destinations that facilitate more social interaction, for example eating at a restaurant or church involvement, or provide opportunities for some incidental social contact, for example visiting the pharmacy or hairdresser, were the strongest predictors for walking among seniors in this study. This underscores the importance of planning neighbourhoods with proximate access to social infrastructure, and highlights the need to create residential environments that support activity across the life course.

Abstract:

Introduction Clinical guidelines for the treatment of chronic low back pain suggest the use of supervised exercise. Motor control (MC) based exercise is widely used within clinical practice but its efficacy is equivalent to general exercise therapy. MC exercise targets the trunk musculature. Considering the mechanical links between the hip, pelvis, and lumbar spine, surprisingly little focus has been placed on investigating the contribution of the hip musculature to lumbopelvic support. The purpose of this study is to compare the efficacy of two exercise programs for the treatment of non-specific low back pain (NSLBP). Methods Eighty individuals aged 18-65 years were randomized into two groups to participate in this trial. The primary outcome measures included self-reported pain intensity (0-100mm VAS) and percent disability (Oswestry Disability Index V2). Bilateral measures of hip strength (N/kg) and two-dimensional frontal plane mechanics (°) were the secondary outcomes. Outcomes were measured at baseline and following a six-week home-based exercise program including weekly sessions of real-time ultrasound imaging. Results Within-group comparisons revealed clinically meaningful reductions in pain for both groups: the MC exercise only group (N=40, x̄=-20.9mm, 95%CI -25.7, -16.1) and the combined MC and hip exercise group (N=40, x̄=-24.9mm, 95%CI -30.8, -19.0). There was no statistical difference in the change in pain (x̄=-4.0mm, t=-1.07, p=0.29, 95%CI -11.5, 3.5) or disability (x̄=-0.3%, t=-0.19, p=0.85, 95%CI -3.5, 2.8) between groups. Conclusion Both exercise programs had similar and positive effects on NSLBP, which supports the use of home-based exercise programs with weekly supervised visits. However, the addition of specific hip strengthening exercises to a MC based exercise program did not result in significantly greater reductions in pain or disability. Trial Registration NCT01567566. Funding: Workers' Compensation Board Alberta Research Grant.

Abstract:

The aim of this study was to elucidate the thermophysiological effects of wearing lightweight non-military overt and covert personal body armour (PBA) in a hot and humid environment. Eight healthy males walked on a treadmill for 120 min at 22% of their heart rate reserve in a climate chamber simulating 31 °C (60% RH) wearing either no armour (control), overt or covert PBA in addition to a security guard uniform, in a randomised controlled crossover design. No significant difference between conditions at the end of each trial was observed in core temperature, heart rate or skin temperature (P > 0.05). Covert PBA produced a significantly greater change in body mass (−1.81 ± 0.44%) compared to the control (−1.07 ± 0.38%, P = 0.009) and overt conditions (−1.27 ± 0.44%, P = 0.025). Although a greater change in body mass was observed after the covert PBA trial, the physiological outcome measures recorded indicate that the heat strain encountered while wearing lightweight, non-military overt or covert PBA was negligible compared to no PBA. Practitioner summary: The wearing of bullet-proof vests or body armour is a requirement of personnel engaged in a wide range of occupations including police, security, customs and even journalists in theatres of war. This randomised controlled crossover study is the first to examine the thermophysiological effects of wearing lightweight non-military overt and covert personal body armour (PBA) in a hot and humid environment. We conclude that the heat strain encountered while wearing both overt and covert lightweight, non-military PBA was negligible compared to no PBA.

Abstract:

Objectives To compare the efficacy of two exercise programs in reducing pain and disability for individuals with non-specific low back pain (NSLBP), and to examine the underlying mechanical factors related to pain and disability in this population. Design A single-blind, randomized controlled trial. Methods Eighty participants were recruited from eleven community-based general medical practices and randomized into two groups completing either a lumbopelvic motor control or a combined lumbopelvic motor control and progressive hip strengthening exercise therapy program. All participants received an education session, 6 rehabilitation sessions including real-time ultrasound training, and a home-based exercise program manual and log book. The primary outcomes were pain (0-100mm visual analogue scale) and disability (Oswestry Disability Index V2). The secondary outcomes were hip strength (N/kg) and two-dimensional frontal plane biomechanics (°) measured during the static Trendelenburg test and while walking. All outcomes were measured at baseline and at 6-week follow-up. Results There was no statistical difference in the change in pain (x̄=-4.0mm, t=-1.07, p=0.29, 95%CI -11.5, 3.5) or disability (x̄=-0.3%, t=-0.19, p=0.85, 95%CI -3.5, 2.8) between groups. Within-group comparisons revealed clinically meaningful reductions in pain for both Group One (x̄=-20.9mm, 95%CI -25.7, -16.1) and Group Two (x̄=-24.9mm, 95%CI -30.8, -19.0). Conclusion Both exercise programs had similar efficacy in reducing pain. The addition of hip strengthening exercises to a motor control exercise program does not appear to result in improved clinical outcomes for pain in individuals with non-specific low back pain.
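The primary between-group comparison reported here is the difference in change scores with a t statistic and a 95% CI. The sketch below shows such a comparison for two independent groups using a pooled-variance t-test; the change scores are simulated placeholders, not trial data.

```python
# Sketch: between-group comparison of change-in-pain scores (mm VAS) with an
# independent-samples t-test and a 95% CI for the mean difference.
# The change scores are simulated placeholders, not trial data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
change_group1 = rng.normal(-20.9, 15, 40)  # motor control exercise only
change_group2 = rng.normal(-24.9, 15, 40)  # motor control plus hip strengthening

t_stat, p_value = stats.ttest_ind(change_group1, change_group2)

n1 = n2 = 40
diff = change_group1.mean() - change_group2.mean()
sp2 = ((n1 - 1) * change_group1.var(ddof=1) + (n2 - 1) * change_group2.var(ddof=1)) / (n1 + n2 - 2)
se = np.sqrt(sp2 * (1 / n1 + 1 / n2))            # pooled standard error
ci = diff + np.array([-1, 1]) * stats.t.ppf(0.975, n1 + n2 - 2) * se

print(f"difference = {diff:.1f} mm, t = {t_stat:.2f}, p = {p_value:.2f}, "
      f"95% CI ({ci[0]:.1f}, {ci[1]:.1f})")
```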

Abstract:

Purpose We designed a visual field test focused on the field utilized while driving to examine associations between field impairment and motor vehicle collision involvement in 2,000 drivers ≥70 years old. Methods The "driving visual field test" involved measuring light sensitivity for 20 targets in each eye, extending 15° superiorly, 30° inferiorly, 60° temporally and 30° nasally. The target locations were selected on the basis that they fell within the field region utilized when viewing through the windshield of a vehicle or viewing the dashboard while driving. Monocular fields were combined into a binocular field based on the more sensitive point from each eye. Severe impairment in the overall field or a region was defined as average sensitivity in the lowest quartile. At-fault collision involvement for the five years prior to enrollment was obtained from state records. Poisson regression was used to calculate crude and adjusted rate ratios examining the association between field impairment and at-fault collision involvement. Results Drivers with severe binocular impairment in the overall driving visual field had a 40% increased rate of at-fault collision involvement (RR 1.40, 95%CI 1.07-1.83). Impairment in the lower and left fields was associated with elevated collision rates (RR 1.40, 95%CI 1.07-1.82 and RR 1.49, 95%CI 1.15-1.92, respectively), whereas impairment in the upper and right field regions was not. Conclusions The results suggest that older drivers with severe impairment in the lower or left region of the driving visual field are more likely to have a history of at-fault collision involvement.
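The crude and adjusted rate ratios come from Poisson regression of at-fault collision counts on field impairment. A minimal sketch of a crude model with statsmodels follows; the data, the impairment indicator and the effect size are assumed placeholders, not the study's records.

```python
# Sketch: Poisson regression of at-fault collision counts on a binary indicator of
# severe visual field impairment; the rate ratio is the exponentiated coefficient.
# Data are synthetic placeholders, not the study's records.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({"impaired": rng.integers(0, 2, n)})
df["collisions"] = rng.poisson(0.15 * np.exp(0.34 * df["impaired"]))  # 5-year at-fault counts

model = smf.poisson("collisions ~ impaired", data=df).fit(disp=False)
rate_ratio = np.exp(model.params["impaired"])
ci_low, ci_high = np.exp(model.conf_int().loc["impaired"])
print(f"RR = {rate_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```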