Abstract:
The biomass and species composition of tropical phytoplankton in Albatross Bay, Gulf of Carpentaria, northern Australia, were examined monthly for 6 yr (1986 to 1992). Chlorophyll a (chl a) concentrations were highest (2 to 5.7 µg l⁻¹) in the wet season at inshore sites, usually coinciding with low salinities (30 to 33 ppt) and high temperatures (29 to 32°C). At the offshore sites chl a concentrations were lower (0.2 to 2 µg l⁻¹) and did not vary seasonally. Nitrate and phosphate concentrations were generally low (0 to 3.68 µM and 0.09 to 3 µM for nitrate and phosphate respectively), whereas silicate was present in concentrations in the range 0.19 to 13 µM. The phytoplankton community was dominated by diatoms, particularly at the inshore sites, as determined by a combination of microscopic and high-performance liquid chromatography (HPLC) pigment analyses. At the offshore sites the proportion of green flagellates increased. The cyanobacterium genus Trichodesmium and the diatom genera Chaetoceros, Rhizosolenia, Bacteriastrum and Thalassionema dominated the phytoplankton caught in 37 µm mesh nets; however, in contrast to many other coastal areas studied worldwide, there was no distinct species succession of the diatoms and only Trichodesmium showed seasonal changes in abundance. This reflects a stable phytoplankton community in waters without pulses of physical and chemical disturbances. These results are discussed in the context of the commercial prawn fishery in the Gulf of Carpentaria and the possible effect of phytoplankton on prawn larval growth and survival.
Abstract:
Background From the conservative estimates of registrants with the National Diabetes Supply Scheme, we will soon be passing 1.1 million Australians affected by all types of diabetes. The diabetes complications of foot ulceration and amputation are costly to all. These costs can be reduced with appropriate prevention strategies, starting with identifying people at risk through primary care diabetic foot screening. Yet levels of diabetic foot screening in Australia are difficult to quantify. This presentation aims to report on foot screening rates as recorded in the existing academic literature, national health surveys and national database reports. Methods Literature searches covered diabetic foot screening in the primary care setting for populations over 2,000 people from 2002 to 2014. Searches were performed using Medline and CINAHL, as well as internet searches of Organisation for Economic Co-operation and Development (OECD) countries' health databases. The focus is on type 1 and type 2 diabetes in adults, and not gestational diabetes or children. The two primary outcome measures were foot screening rates as a percentage of the adult diabetic population and major lower limb amputation incidence rates from standardised OECD data. Results The most recent and accurate Australian population-level data come from the AUSDIAB (Australian Diabetes and Lifestyle Survey) of 2004. This survey reported screening in primary care to be as low as 50%. Countries such as the United Kingdom and the United States of America have much higher reported rates of foot screening (67-86%), recorded using national databases and web-based initiatives that involve patients and clinicians. By comparison, major amputation rates for Australia were similar to the United Kingdom, at 6.5 versus 5.1 per 100,000 population, but dissimilar to the United States of America, at 17 per 100,000 population. Conclusions Australian rates of diabetic foot screening in primary care centres are ambiguous. There is no direct relationship between foot screening levels in a primary care environment and major lower limb amputation, based on national health surveys and OECD data. Uptake of national registers, incentives and web-based systems improves levels of diabetic foot assessment, which are the first steps to a healthier diabetic population.
Abstract:
The current study explored the influence of moral values (measured by ethical ideology) on self-reported driving anger and aggressive driving responses. A convenience sample of drivers aged 17-73 years (n = 280) in Queensland, Australia, completed a self-report survey. Measures included sensation seeking, trait aggression, driving anger, endorsement of aggressive driving responses and ethical ideology (Ethical Position Questionnaire, EPQ). Scores on the two underlying dimensions of the EPQ, idealism (highI/lowI) and relativism (highR/lowR), were used to categorise drivers into four ideological groups: Situationists (highI/highR); Absolutists (highI/lowR); Subjectivists (lowI/highR); and Exceptionists (lowI/lowR). Mean aggressive driving scores suggested that Exceptionists were significantly more likely to endorse aggressive responses. After accounting for demographic variables, sensation seeking and driving anger, ethical ideological category added significantly, though modestly, to the prediction of aggressive driving responses. Patterns in the results suggest that drivers in ideological groups characterised by greater concern to avoid affecting others negatively (i.e. highI: Situationists and Absolutists) may be less likely to endorse aggressive driving responses, even when angry. In contrast, Subjectivists (lowI/highR) reported the lowest levels of driving anger yet were significantly more likely to endorse aggressive responses. This provides further insight into why high levels of driving anger may not always translate into more aggressive driving.
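For illustration only, the hierarchical prediction described above can be sketched as a blockwise regression in which ethical-ideology category is entered after demographics, sensation seeking and driving anger, and its contribution is read from the change in R². The data file and column names below are assumptions, not the study's materials.

```python
# Hypothetical sketch of a hierarchical (blockwise) regression: does ethical
# ideology add to the prediction of aggressive driving responses once
# demographics, sensation seeking and driving anger are controlled?
# Column names and the CSV file are illustrative assumptions only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("driving_survey.csv")  # hypothetical data file

# Block 1: demographics and established predictors
m1 = smf.ols("aggressive_driving ~ age + C(sex) + sensation_seeking + driving_anger",
             data=df).fit()

# Block 2: add ethical-ideology category
# (Situationist / Absolutist / Subjectivist / Exceptionist)
m2 = smf.ols("aggressive_driving ~ age + C(sex) + sensation_seeking + driving_anger"
             " + C(ideology_group)", data=df).fit()

print(f"R2 block 1: {m1.rsquared:.3f}")
print(f"R2 block 2: {m2.rsquared:.3f}")
print(f"Delta R2 for ideology: {m2.rsquared - m1.rsquared:.3f}")
# F-test on the nested models gives the significance of the increment
print(m2.compare_f_test(m1))
```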
Abstract:
Objective To evaluate health practitioners’ confidence and knowledge of alcohol screening, brief intervention and referral after training in a culturally adapted intervention on alcohol misuse and well-being issues for trauma patients. Design Mixed methods, involving semi-structured interviews at baseline and a post-workshop questionnaire. Setting: Targeted acute care within a remote area major tertiary referral hospital. Participants Ten key informants and 69 questionnaire respondents from relevant community services and hospital-based health care professionals. Intervention Screening and brief intervention training workshops and resources for 59 hospital staff. Main outcome measures Self-reported staff knowledge of alcohol screening, brief intervention and referral, and satisfaction with workshop content and format. Results After training, 44% of participants reported being motivated to implement alcohol screening and intervention. Satisfaction with training was high, and most participants reported that their knowledge of screening and brief intervention was improved. Conclusion Targeted educational interventions can improve the knowledge and confidence of inpatient staff who manage patients at high risk of alcohol use disorder. Further research is needed to determine the duration of the effect and influence on practice behaviour. Ongoing integrated training, linked with systemic support and established quality improvement processes, is required to facilitate sustained change and widespread dissemination.
Abstract:
Objective This study explored the dimensionality and measurement invariance of the 25-item Connor-Davidson Resilience Scale (CD-RISC; Connor & Davidson, 2003) across samples of adult (n = 321; aged 20–36) and adolescent (n = 199; aged 12–18) Australian cricketers. Design Cross-sectional, self-report survey. Methods An online, multi-section questionnaire. Results Confirmatory factor and item-level analyses supported the psychometric superiority of a revised 10-item, unidimensional model of resilience over the original 25-item, five-factor measurement model. Positive and moderate correlations with hardiness, as well as negative and moderate correlations with burnout components, were evidenced, thereby providing support for the convergent validity of the unidimensional model. Measurement invariance analyses of the unidimensional model across the two age-group samples supported configural (i.e., same factor structure across groups), metric (i.e., same pattern of factor loadings across the groups), and partial scalar invariance (i.e., mostly the same intercepts across the groups). Conclusion Evidence for a psychometrically sound measure of resilient qualities of the individual provides an important foundation upon which researchers can identify the antecedents to and outcomes of resilience in sport contexts.
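As a minimal illustration of the convergent-validity step reported above, the sketch below correlates a summed 10-item CD-RISC score with hardiness and burnout scores; the confirmatory factor and invariance models themselves would be fitted in dedicated SEM software. The data file and column names are assumptions.

```python
# Minimal sketch of the convergent-validity check: correlate a 10-item
# CD-RISC total score with hardiness and burnout. The dataframe and column
# names are illustrative assumptions, not the study's data.
import pandas as pd

df = pd.read_csv("cricketer_survey.csv")  # hypothetical data file

cd_risc_items = [f"cdrisc_{i}" for i in range(1, 11)]  # the 10 retained items
df["resilience"] = df[cd_risc_items].sum(axis=1)

# Expect a positive correlation with hardiness and a negative correlation
# with burnout, as the abstract reports.
print(df[["resilience", "hardiness", "burnout"]].corr().round(2))
```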
Abstract:
Endometriosis is a chronic inflammatory condition in women that results in pelvic pain and subfertility, and has been associated with decreased body mass index (BMI). Genetic variants contributing to the heritable component have started to emerge from genome-wide association studies (GWAS), although the majority remain unknown. Unexpectedly, we observed an intergenic locus on 7p15.2 that was genome-wide significantly associated with both endometriosis and fat distribution (waist-to-hip ratio adjusted for BMI; WHRadjBMI) in an independent meta-GWAS of European ancestry individuals. This led us to investigate the potential overlap in genetic variants underlying the aetiology of endometriosis, WHRadjBMI and BMI using GWAS data. Our analyses demonstrated significant enrichment of common variants between fat distribution and endometriosis (P = 3.7 × 10⁻³), which was stronger when we restricted the investigation to more severe (Stage B) cases (P = 4.5 × 10⁻⁴). However, no genetic enrichment was observed between endometriosis and BMI (P = 0.79). In addition to 7p15.2, we identify four more variants with statistically significant evidence of involvement in both endometriosis and WHRadjBMI (in/near KIFAP3, CAB39L, WNT4, GRB14); two of these, KIFAP3 and CAB39L, are novel associations for both traits. KIFAP3, WNT4 and 7p15.2 are associated with the WNT signalling pathway; formal pathway analysis confirmed a statistically significant (P = 6.41 × 10⁻⁴) overrepresentation of shared associations in developmental processes/WNT signalling between the two traits. Our results demonstrate an example of potential biological pleiotropy that was hitherto unknown, and represent an opportunity for functional follow-up of loci and further cross-phenotype comparisons to assess how fat distribution and endometriosis pathogenesis research fields can inform each other.
Abstract:
OBJECTIVES To identify common genetic variants that predispose to caffeine-induced insomnia and to test whether genes whose expression changes in the presence of caffeine are enriched for association with caffeine-induced insomnia. DESIGN A hypothesis-free, genome-wide association study. SETTING Community-based sample of Australian twins from the Australian Twin Registry. PARTICIPANTS After removal of individuals who said that they do not drink coffee, a total of 2,402 individuals from 1,470 families in the Australian Twin Registry provided both phenotype and genotype information. MEASUREMENTS AND RESULTS The phenotype was a dichotomized scale based on whether participants reported ever or never experiencing caffeine-induced insomnia. A factor score based on responses to a number of questions regarding normal sleep habits was included as a covariate in the analysis. More than 2 million common single nucleotide polymorphisms (SNPs) were tested for association with caffeine-induced insomnia. No SNPs reached the genome-wide significance threshold. In the analysis that did not include the insomnia factor score as a covariate, the most significant SNP identified was an intronic SNP in the PRIMA1 gene (P = 1.4 × 10⁻⁶, odds ratio = 0.68 [0.53 to 0.89]). An intergenic SNP near the GBP4 gene on chromosome 1 was the most significant upon inclusion of the insomnia factor score into the model (P = 1.9 × 10⁻⁶, odds ratio = 0.70 [0.62 to 0.78]). A previously identified association with a polymorphism in the ADORA2A gene was replicated. CONCLUSIONS Several genes have been identified in the study as potentially influencing caffeine-induced insomnia. They will require replication in another sample. The results may have implications for understanding the biologic mechanisms underlying insomnia.
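The per-SNP test implied by this design can be sketched as a logistic regression of the dichotomized insomnia phenotype on genotype dosage, with the insomnia factor score as a covariate. The sketch below uses simulated arrays and omits the modelling of family relatedness that a twin-registry GWAS would require.

```python
# Hedged sketch of a single-SNP association test: logistic regression of the
# ever/never caffeine-induced-insomnia phenotype on genotype dosage (0/1/2
# minor alleles), adjusting for the normal-sleep "insomnia factor score".
# All arrays are simulated; real analyses repeat this for >2 million SNPs and
# account for relatedness among twins.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2402
dosage = rng.binomial(2, 0.3, size=n)        # simulated genotype at one SNP
factor_score = rng.normal(size=n)            # simulated insomnia factor score
phenotype = rng.binomial(1, 0.4, size=n)     # simulated ever/never outcome

X = sm.add_constant(np.column_stack([dosage, factor_score]))
fit = sm.Logit(phenotype, X).fit(disp=False)

p_value = fit.pvalues[1]                     # association P for the SNP
odds_ratio = np.exp(fit.params[1])           # per-allele odds ratio
print(f"P = {p_value:.3g}, OR = {odds_ratio:.2f}")
# Each P would then be compared against a genome-wide significance threshold.
```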
Abstract:
Based on a survey of climate change experts in different stakeholder groups and interviews with corporate climate change managers, this study provides insights into the gap between the information stakeholders expect and what Australian corporations disclose. This paper focuses on annual reports and sustainability reports, with specific reference to the disclosure of climate change-related corporate governance practices. The findings culminate in an assessment of these governance practices. Interview results indicate that the low levels of disclosure made by Australian companies may be due to a number of factors. A lack of proactive stakeholder engagement and an apparent preoccupation with financial performance and advancing shareholders' interests, coupled with a failure by managers to accept accountability, seem to go a long way towards explaining the low levels of disclosure.
Abstract:
The accumulation of deficits with increasing age results in a decline in the functional capacity of multiple organs and systems. These changes can have a significant influence on the pharmacokinetics and pharmacodynamics of prescribed drugs. Although alterations in body composition and worsening renal clearance are important considerations, for most drugs the liver has the greatest effect on metabolism. Age-related change in hepatic function thereby causes much of the variability in older people's responses to medication. In this review, we propose that a decline in the ability of the liver to inactivate toxins may contribute to a proinflammatory state in which frailty can develop. Since inflammation also downregulates drug metabolism, medication prescribed to frail older people in accordance with disease-specific guidelines may undergo reduced systemic clearance, leading to adverse drug reactions, further functional decline and increasing polypharmacy, exacerbating rather than ameliorating frailty status. We also describe how increasing chronological age and frailty status affect liver size, blood flow, protein binding and the enzymes of drug metabolism. This is used to contextualise our discussion of appropriate prescribing practices. For example, while the general axiom of 'start low, go slow' should underpin the initiation of medication (titrating to a defined therapeutic goal), it is important to consider whether drug clearance is flow- or capacity-limited. By summarising the effect of age-related changes in hepatic function on medications commonly used in older people, we aim to provide a guide that will have high clinical utility for practising geriatricians.
Abstract:
Aim This study assessed the association between compression use and changes in lymphoedema observed in women with breast cancer-related lymphoedema who completed a 12 week exercise intervention. Methods This work uses data collected from a 12 week exercise trial in which women were randomly allocated to either aerobic-based only (n=21) or resistance-based only (n=20) exercise. Compression use during the trial was at the participant's discretion. Differences in lymphoedema (measured by L-Dex score and inter-limb circumference difference [%]) and associated symptoms between those who did and did not wear compression during the 12 week intervention were assessed. We also explored participants' reasons for wearing or not wearing compression during exercise. Results No significant interaction effect between time and compression use was observed for lymphoedema. There was no difference between groups over time in the number or severity of lymphoedema symptoms. Irrespective of compression use, there were trends towards reductions in the proportion of women reporting severe symptoms, but lymphoedema status did not change. Individual reasons for the use of compression, or lack thereof, varied markedly. Conclusion Our findings demonstrate an absence of either a positive or a negative effect of compression use during exercise on lymphoedema. Current and previous findings suggest that the clinical recommendation that garments must be worn during exercise is questionable, and that its application requires an individualised approach.
Abstract:
Background Malnutrition is common in patients with advanced epithelial ovarian cancer (EOC), and is associated with impaired quality of life (QoL), longer hospital stay and a higher risk of treatment-related adverse events. This phase III multi-centre randomised clinical trial tested early enteral feeding versus standard care on postoperative QoL. Methods From 2009 to 2013, 109 patients requiring surgery for suspected advanced EOC who were moderately to severely malnourished were enrolled at five sites across Queensland and randomised to intervention (n = 53) or control (n = 56) groups. The intervention involved intraoperative nasojejunal tube placement and enteral feeding until adequate oral intake could be maintained. Despite being randomised to intervention, 20 patients did not receive feeds (13 did not receive the feeding tube; 7 had it removed early). Control involved postoperative diet as tolerated. QoL was measured at baseline, 6 weeks postoperatively and 30 days after the third cycle of chemotherapy. The primary outcome measure was the difference in QoL between the intervention and control groups. Secondary endpoints included the occurrence of treatment-related adverse events, length of stay, use of postoperative services, and nutritional status. Results Baseline characteristics were comparable between treatment groups. No significant difference in QoL was found between the groups at any time point. There was a trend towards better nutritional status in patients who received the intervention, but the differences did not reach statistical significance except in the intention-to-treat analysis at 7 days postoperatively (11.8 intervention vs. 13.8 control, p = 0.04). Conclusion Early enteral feeding did not significantly improve patients' QoL compared with standard of care but may improve nutritional status.
Abstract:
Recent research has shown that, in general, older professors are rated to have more passive-avoidant leadership styles than younger professors by their research assistants. The current study investigated professors' age-related work concerns and research assistants' favorable age stereotypes as possible explanations for this finding. Data came from 128 university professors paired to one research assistant each. Results show that professors' age-related work concerns (decreased enthusiasm for research, growing humanism, development of exiting consciousness and increased follower empowerment) did not explain the relationships between professor age and research assistant ratings of passive-avoidant and proactive leadership. However, research assistants' favorable age stereotypes influenced the relationships between professor age and research assistant ratings of leadership, such that older professors were rated as more passive-avoidant and less proactive than younger professors by research assistants with less favorable age stereotypes, but not by research assistants with more favorable age stereotypes.
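The moderation result described above can be illustrated with a simple interaction model: leadership ratings regressed on professor age, research assistants' age stereotypes and their product. The data file and column names below are hypothetical.

```python
# Illustrative sketch of the moderation pattern: research assistants' age
# stereotypes moderating the link between professor age and passive-avoidant
# leadership ratings, tested via an interaction term. The data file and
# column names are assumptions for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("professor_ra_dyads.csv")  # hypothetical dyadic data (n = 128)

model = smf.ols(
    "passive_avoidant ~ professor_age * ra_age_stereotypes",
    data=df,
).fit()
print(model.summary())
# A significant professor_age:ra_age_stereotypes coefficient indicates that
# the age-leadership slope differs with how favorable the RA's age stereotypes are.
```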
Abstract:
Background Research on the relationship between Health Related Quality of Life (HRQoL) and physical activity (PA) has, to date, rarely investigated how this relationship differs across objective and subjective measures of PA. The aim of this paper is to explore the relationship between HRQoL and PA, and to examine how this relationship differs across objective and subjective measures of PA, within the context of a large representative national survey from England. Methods Using a sample of 5,537 adults (40–60 years) from a representative national survey in England (Health Survey for England 2008), Tobit regressions with upper censoring were employed to model the association between HRQoL and objective and subjective measures of PA, controlling for potential confounders. We tested the robustness of this relationship across specific types of PA. HRQoL was assessed using the summary measure of health state utility value derived from the EuroQol-5 Dimensions (EQ-5D), whilst PA was assessed via a subjective measure (questionnaire) and an objective measure (accelerometer; ActiGraph model GT1M). The accelerometer was worn at the waist for 7 days during waking hours by a randomly selected sub-sample of the HSE 2008 respondents (4,507 adults aged 16 plus years), with a valid day constituting 10 hours of wear. Analysis was conducted in 2010. Results Findings suggest that higher levels of PA are associated with better HRQoL (regression coefficients: 0.026 to 0.072). This relationship is consistent across different measures and types of PA, although differences in the magnitude of the HRQoL benefit associated with objective and subjective measures of PA are noticeable (regression coefficients: 0.072 and 0.047 respectively), with the objective measure being associated with a relatively better HRQoL. Conclusion Higher levels of PA are associated with better HRQoL. Objectively measured PA shows a relatively stronger association with HRQoL than subjectively measured PA.
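A Tobit regression with upper censoring, as used above for the EQ-5D utility (which is capped at 1), can be sketched as a censored-normal maximum-likelihood fit. The example below uses simulated data and illustrative variable names rather than the HSE 2008 dataset.

```python
# Minimal sketch of a Tobit regression with upper censoring at 1, implemented
# as a censored-normal maximum-likelihood fit. Data are simulated; variable
# names are illustrative assumptions.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 1000
pa = rng.normal(size=n)                          # standardised physical activity
latent = 0.85 + 0.05 * pa + rng.normal(scale=0.15, size=n)
y = np.minimum(latent, 1.0)                      # utility score, censored at 1
X = np.column_stack([np.ones(n), pa])
upper = 1.0

def neg_loglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    cens = y >= upper
    ll = np.empty(n)
    # uncensored observations contribute the normal density
    ll[~cens] = stats.norm.logpdf(y[~cens], loc=xb[~cens], scale=sigma)
    # censored observations contribute P(latent >= upper)
    ll[cens] = stats.norm.logsf(upper, loc=xb[cens], scale=sigma)
    return -ll.sum()

start = np.array([y.mean(), 0.0, np.log(y.std())])
res = minimize(neg_loglik, start, method="BFGS")
print("Tobit coefficient for PA:", round(res.x[1], 3))
```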
Abstract:
Public-private partnerships (PPP) are widely used for construction project procurement. However, the briefing stage of PPP projects has been largely overlooked, although it has a far-reaching influence throughout the project life cycle. In response, we rectify this by exploring the critical factors involved. A set of 15 procurement-related factors is first identified from the existing literature. The effects of four background variables on these factors are then tested with Hong Kong government data, with an exploratory factor analysis extracting four major dimensions. The relationships between these dimensions and the background variables indicate the need to take the background variables into account when ranking the factors. The ranking of the factors is then obtained by considering their weighted importance. Finally, the practical value of the results is discussed.
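The exploratory-factor-analysis step can be sketched as follows: standardised ratings of the 15 briefing factors are reduced to a small number of dimensions and the loadings inspected. The data file and item names are illustrative assumptions; the four-component choice follows the abstract's reported solution.

```python
# Hedged sketch of an exploratory factor analysis reducing ratings of 15
# procurement-related briefing factors to four dimensions. The data file and
# column names are illustrative assumptions only.
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("ppp_briefing_ratings.csv")    # hypothetical survey ratings
items = [f"factor_{i}" for i in range(1, 16)]   # the 15 procurement-related factors

X = StandardScaler().fit_transform(df[items])
fa = FactorAnalysis(n_components=4, random_state=0)
fa.fit(X)

loadings = pd.DataFrame(fa.components_.T, index=items,
                        columns=[f"dim_{j}" for j in range(1, 5)])
print(loadings.round(2))   # which items load on which of the four dimensions
```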
Abstract:
Background People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can cause damage to the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections. Objectives To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes including pain and skin damage. Search methods In June 2015 we searched: the Cochrane Wounds Specialised Register; the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting. Selection criteria All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections, in all patients in any healthcare setting. Data collection and analysis We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion and performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate, and otherwise synthesised data descriptively when heterogeneous. Main results We included five RCTs (2,277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval between changes (2 to 5 days; control). In each study participants were followed up until the CVAD was removed or until discharge from ICU or hospital.
- Confirmed catheter-related bloodstream infection (CRBSI): One trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence).
- Suspected catheter-related bloodstream infection: Two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence).
- All-cause mortality: Three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all-cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence).
- Catheter-site infection: Two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection. It is unclear whether there is a difference in the risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence).
- Skin damage: One small trial (112 children) and three trials (1,475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled).
- Pain: Two studies involving 193 participants measured pain. It is unclear if there is a difference between long and short interval dressing changes in pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence).
Authors' conclusions The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
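The risk ratios and confidence intervals quoted above follow standard 2×2-table arithmetic, sketched below together with a simple fixed-effect (inverse-variance) pool. The trial counts in the example are invented for illustration and are not the review's data.

```python
# Worked sketch of risk-ratio arithmetic: an RR with a 95% CI from one trial's
# 2x2 counts, then a fixed-effect (inverse-variance) pool across trials.
# All event counts below are invented for illustration.
import numpy as np

def rr_with_ci(events_a, n_a, events_b, n_b):
    """Risk ratio (group A vs group B) with a 95% CI computed on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = np.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se)
    return rr, lo, hi, se

# Hypothetical counts for three trials (long vs short dressing intervals)
trials = [(12, 450, 10, 446), (8, 150, 9, 148), (20, 300, 18, 298)]
log_rrs, weights = [], []
for a, na, b, nb in trials:
    rr, lo, hi, se = rr_with_ci(a, na, b, nb)
    print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
    log_rrs.append(np.log(rr))
    weights.append(1 / se**2)          # inverse-variance weight

pooled = np.exp(np.average(log_rrs, weights=weights))
print(f"Fixed-effect pooled RR: {pooled:.2f}")
```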