Abstract:
The conflicts in Iraq and Afghanistan have been epitomized by the insurgents’ use of the improvised explosive device against vehicle-borne security forces. These weapons, capable of causing multiple severely injured casualties in a single incident, pose the most prevalent single threat to Coalition troops operating in the region. Improvements in personal protection and medical care have resulted in increasing numbers of casualties surviving with complex lower limb injuries, often leading to long-term disability. Thus, there exists an urgent requirement to investigate and mitigate against the mechanism of extremity injury caused by these devices. This will necessitate an ontological approach, linking molecular, cellular and tissue interaction to physiological dysfunction. This can only be achieved via a collaborative approach between clinicians, natural scientists and engineers, combining physical and numerical modelling tools with clinical data from the battlefield. In this article, we compile existing knowledge on the effects of explosions on skeletal injury, review and critique relevant experimental and computational research related to lower limb injury and damage and propose research foci required to drive the development of future mitigation technologies.
Abstract:
Since World War I, explosions have accounted for over 70% of all injuries in conflict. With the development of improved personnel protection of the torso, improved medical care and faster aeromedical evacuation, casualties are surviving with more severe injuries to the extremities. Understanding the processes involved in the transfer of blast-induced shock waves through biological tissues is essential for supporting efforts aimed at mitigating and treating blast injury. Given the inherent heterogeneities in the human body, we argue that studying these processes demands a highly integrated approach requiring expertise in shock physics, biomechanics and fundamental biological processes. This multidisciplinary systems approach enables one to develop the experimental framework for investigating the material properties of human tissues that are subjected to high compression waves in blast conditions and the fundamental cellular processes altered by this type of stimuli. Ultimately, we hope to use the information gained from these studies in translational research aimed at developing improved protection for those at risk and improved clinical outcomes for those who have been injured from a blast wave.
Abstract:
Introduction and Aim: Sexual assaults commonly involve alcohol use by the perpetrator, victim, or both. Beliefs about alcohol’s effects may impact on people’s perceptions of and responses to men and women who have had such experiences while intoxicated from alcohol. This study aimed to develop an alcohol expectancy scale that captures young adults’ beliefs about alcohol’s role in sexual aggression and victimisation. Design and Methods: Based on pilot focus groups, an initial pool of 135 alcohol expectancy items was developed, checked for readability and face validity, and administered via a cross-sectional survey to 201 male and female university students (18-25 years). Items were specified in terms of three target drinkers: self, men, and women. In addition, a social desirability measure was included. Results: Principal Axis Factoring revealed a 4-factor solution for the targets men and women and a 5-factor solution for the target self with 72 items retained. Factors related to sexual coercion, sexual vulnerability, confidence, self-centredness, and negative cognitive and behavioural effects. Social desirability issues were evident for the target self, but not for the targets men and women. Discussion and Conclusions: Young adults link alcohol’s effects with sexual vulnerabilities via perceived risky cognitions and behaviours. Due to social desirability, these expectancies may be difficult to explicate for the self but may be accessible instead via other-oriented assessment. The Sexual Coercion and Vulnerability Alcohol Expectancy Scale has potential as a tool to elucidate the established tendency for observers to excuse intoxicated sexual perpetrators while blaming intoxicated victims.
Abstract:
Early on Christmas morning 1974 Tropical Cyclone Tracy, a Category 4 storm, devastated the Northern Territory city of Darwin leaving only 6% of the city’s housing habitable. The extent of the disaster was largely the result of unregulated and poorly constructed buildings, predominantly housing. While the engineering and reconstruction process demonstrated a very successful response and adaptation to an existing and future risk, the impact of the cyclone on the local community, and its Indigenous population in particular, had not been well recorded. NCCARF therefore commissioned a report on the Indigenous experience of Cyclone Tracy to document how Indigenous people were impacted by, responded to, and recovered from Cyclone Tracy in comparison to non-Indigenous groups. The report also considers the research literature on disasters and Indigenous people in the Northern Territory, with a specific focus on cyclones, and considers the socio-political context of Indigenous communities in Darwin prior to Cyclone Tracy.
Abstract:
This case study will review the impact of Tropical Cyclone Tracy on the city and people of Darwin, the Australian engineering and institutional responses that it invoked and the relevance of these lessons to a world threatened by global climate change. At Christmas, 1974, Tropical Cyclone Tracy laid waste the city of Darwin, an iconic episode in the history of Australian natural disasters. It provides one of the clearest and most successful examples worldwide of adaptation to a catastrophe. Following large losses in Townsville from Tropical Cyclone Althea in 1971, the level of destruction in Darwin was such that it led to new regulations mandating the use of the wind code for reconstruction, and eventually to similar regulations for new construction in other cyclone-prone areas of Australia.
Abstract:
Cancer can be defined as a deregulation or hyperactivity in the ongoing network of intracellular and extracellular signaling events. Reverse phase protein microarray technology may offer a new opportunity to measure and profile these signaling pathways, providing data on post-translational phosphorylation events not obtainable by gene microarray analysis. Treatment of ovarian epithelial carcinoma almost always takes place in a metastatic setting since unfortunately the disease is often not detected until later stages. Thus, in addition to elucidation of the molecular network within a tumor specimen, critical questions are to what extent do signaling changes occur upon metastasis and are there common pathway elements that arise in the metastatic microenvironment. For individualized combinatorial therapy, ideal therapeutic selection based on proteomic mapping of phosphorylation end points may require evaluation of the patient's metastatic tissue. Extending these findings to the bedside will require the development of optimized protocols and reference standards. We have developed a reference standard based on a mixture of phosphorylated peptides to begin to address this challenge.
Abstract:
Informed broadly by the theory of planned behaviour, this study used qualitative methodology to understand Australian adults' sun-protective decisions. Forty-two adults participated in focus groups where they discussed behavioural (advantages and disadvantages), normative (important referents), and control (barriers and facilitators) beliefs, as well as potential social influences and images of tanned and non-tanned people. Responses were analysed using the consensual qualitative research approach to determine the dominant themes. Themes of fashion and comfort were prominent, the important role of friends and family in sun safe decision-making was highlighted, as was the availability of sun-protective measures (e.g., in an accessible place or in the environment). Additional themes included the need to model sound sun-protective behaviours to (current and future) children, the emphasis on personal choice and personal responsibility to be sun safe, and the influence of Australian identity and culture on tanning and socially acceptable forms of sun protection. These beliefs can be used to inform interventions and public health campaigns targeting sun safety among Australians, a population with the highest skin cancer incidence in the world.
Abstract:
Background. Interventions that prevent healthcare-associated infection should lead to fewer deaths and shorter hospital stays. Cleaning hands (with soap or alcohol) is an effective way to prevent the transmission of organisms, but rates of compliance with hand hygiene are sometimes disappointingly low. The National Hand Hygiene Initiative in Australia aimed to improve hand hygiene compliance among healthcare workers, with the goal of reducing rates of healthcare-associated infection. Methods. We examined whether the introduction of the National Hand Hygiene Initiative was associated with a change in infection rates. Monthly infection rates for healthcare-associated Staphylococcus aureus bloodstream infections were examined in 38 Australian hospitals across 6 states. We used Poisson regression and examined 12 possible patterns of change, with the best fitting pattern chosen using the Akaike information criterion. Monthly bed-days were included to control for increased hospital use over time. Results. The National Hand Hygiene Initiative was associated with a reduction in infection rates in 4 of the 6 states studied. Two states showed an immediate reduction in rates of 17% and 28%, 2 states showed a linear decrease in rates of 8% and 11% per year, and 2 showed no change in infection rates. Conclusions. The intervention was associated with reduced infection rates in most states. The failure in 2 states may have been because those states already had effective initiatives before the national initiative’s introduction or because infection rates were already low and could not be further reduced.
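The model-selection step described above (fitting candidate patterns of change to monthly infection counts, with bed-days as an exposure offset, and choosing by Akaike information criterion) can be sketched as follows. This is a minimal illustration, not the study's actual code: the counts, bed-days, and intervention month are hypothetical, and only two of the twelve candidate patterns (no change vs. an immediate step reduction) are compared.

```python
from math import lgamma, log

# Hypothetical monthly data (illustrative numbers only): infection counts and
# bed-days for one hospital, with the initiative introduced at month 8.
counts  = [20, 18, 22, 19, 21, 23, 20, 18,   # before introduction
           14, 15, 13, 16, 14, 12, 15, 13]   # after introduction
beddays = [10_000] * 16
post    = [int(i >= 8) for i in range(16)]

def poisson_aic(counts, beddays, groups):
    """AIC of a Poisson model with one infection rate per group,
    using bed-days as the exposure (offset)."""
    loglik, k = 0.0, 0
    for g in set(groups):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        # Maximum-likelihood estimate of the group's rate per bed-day.
        rate = sum(counts[i] for i in idx) / sum(beddays[i] for i in idx)
        loglik += sum(counts[i] * log(rate * beddays[i])
                      - rate * beddays[i] - lgamma(counts[i] + 1)
                      for i in idx)
        k += 1
    return 2 * k - 2 * loglik

aic_no_change = poisson_aic(counts, beddays, [0] * 16)    # constant rate
aic_step      = poisson_aic(counts, beddays, post)        # immediate reduction
# For these data, a lower AIC for the step model favours an immediate drop.
```

In the study, the same comparison would be repeated across all twelve candidate patterns per state, and the pattern with the lowest AIC retained.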
Abstract:
Background Research has identified associations between serum 25(OH)D and a range of clinical outcomes in chronic kidney disease and wider populations. The present study aimed to investigate vitamin D deficiency/insufficiency in dialysis patients and the relationship with vitamin D intake and sun exposure. Methods A cross-sectional study was used. Participants included 30 peritoneal dialysis (PD) (43.3% male; 56.87 ± 16.16 years) and 26 haemodialysis (HD) (80.8% male; 63.58 ± 15.09 years) patients attending a department of renal medicine. Explanatory variables were usual vitamin D intake from diet/supplements (IU day−1) and sun exposure (min day−1). Vitamin D intake, sun exposure and ethnic background were assessed by questionnaire. Weight, malnutrition status and routine biochemistry were also assessed. Data were collected during usual department visits. The main outcome measure was serum 25(OH)D (nmol/L). Results Prevalence of vitamin D insufficiency/deficiency differed by dialysis modality, with 31% and 43% found to be insufficient (<50 nmol/L) and 4% and 33% found to be deficient (<25 nmol/L) in HD and PD patients, respectively (P < 0.001). In HD patients, there was a correlation between diet and supplemental vitamin D intake and 25(OH)D (ρ = 0.84, P < 0.001) and average sun exposure and 25(OH)D (ρ = 0.50, P < 0.02). There were no associations in PD patients. The results remained significant for vitamin D intake after multiple regression, adjusting for age, gender and sun exposure. Conclusions The results highlight a strong association between vitamin D intake and 25(OH)D in HD but not PD patients, with implications for replacement recommendations. The findings indicate that, even in a sunny climate, many dialysis patients are vitamin D deficient, highlighting the need for exploration of determinants and consequences.
Abstract:
Background Obtaining single parasite clones is required for many techniques in malaria research. Cloning by limiting dilution using microscopy-based assessment for parasite growth is an arduous and labor-intensive process. An alternative method for the detection of parasite growth in limiting dilution assays is using a commercial ELISA histidine-rich protein II (HRP2) detection kit. Methods Detection of parasite growth was undertaken using HRP2 ELISA and compared to thick film microscopy. An HRP2 protein standard was used to determine the detection threshold of the HRP2 ELISA assay, and an HRP2 release model was used to extrapolate the amount of parasite growth required for a positive result. Results The HRP2 ELISA was more sensitive than microscopy for detecting parasite growth. The minimum level of HRP2 protein detection of the ELISA was 0.11 ng/mL. Modeling of HRP2 release determined that 2,116 parasites are required to complete a full erythrocytic cycle to produce sufficient HRP2 to be detected by the ELISA. Under standard culture conditions, this number of parasites is likely to be reached between 8 and 14 days of culture. Conclusions This method provides an accurate and simple way for the detection of parasite growth in limiting dilution assays, reducing time and resources required in traditional methods. Furthermore, the method uses spent culture media instead of the parasite-infected red blood cells, enabling culture to continue.
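The 8–14 day window above can be reproduced from the reported 2,116-parasite threshold with simple exponential-growth arithmetic. This is an illustrative sketch only: the per-cycle multiplication rates used here (3-fold and 8-fold) are assumed bracketing values, not figures from the study.

```python
import math

TARGET = 2116       # parasites needed for detectable HRP2 (from the study)
CYCLE_DAYS = 2      # P. falciparum erythrocytic cycle is roughly 48 h

def days_to_detection(multiplication_rate, start=1):
    """Days of culture until a clone grown from `start` parasite(s)
    reaches the detection threshold, at a given per-cycle multiplication."""
    cycles = math.ceil(math.log(TARGET / start) / math.log(multiplication_rate))
    return cycles * CYCLE_DAYS

# Assumed high and low multiplication rates bracket the reported window:
days_to_detection(8)   # 4 cycles -> 8 days
days_to_detection(3)   # 7 cycles -> 14 days
```

Under these assumptions, a single cloned parasite would become detectable by HRP2 ELISA after roughly 8 days at high growth and 14 days at low growth, consistent with the window reported in the abstract.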
Abstract:
We examined goal importance, focusing on high, but not exclusive priority goals, in the theory of planned behaviour (TPB) to predict students’ academic performance. At the beginning of semester, students in a psychology subject (N = 197) completed TPB and goal importance items for achieving a high grade. Regression analyses revealed partial support for the TPB. Perceived behavioural control, but not attitude or subjective norm, significantly predicted intention, with intention predicting final grade. Goal importance significantly predicted intention, but not final grade, indicating that perceiving a performance goal as highly, but not necessarily exclusively, important impacts on students’ achievement intentions.
Abstract:
Little is known about the beliefs that underlie the biased attributions that typically characterise people’s perceptions of intoxicated sexual perpetrators and their victims. Guided by consensual qualitative research, we explored young Australian adults’ (18-25 years; N = 15) attributions for an alcohol-involved rape based on focus groups and interviews. Prominent themes indicated that participants rarely labelled the assault as rape and, instead, adhered to miscommunication explanations. Participants emphasised the developmental value of the victimisation experience although recognising its harmful consequences. Both perpetrator and victim were held strongly responsible based on perceived opportunities to prevent the assault but implicit justifications were, nevertheless, evident. As such, explicit and implicit attributions were contradictory, with the latter reflecting the attributional double standard previously observed in quantitative rape-perception research. Findings underscore the need to challenge pervasive rape myths and equip young adults with knowledge on how to respond supportively to the commonly stigmatised victims of rape.
Abstract:
Bladder infections affect millions of people yearly, and recurrent symptomatic infections (cystitis) are very common. The rapid increase in infections caused by multidrug-resistant uropathogens threatens to make recurrent cystitis an increasingly troubling public health concern. Uropathogenic Escherichia coli (UPEC) cause the vast majority of bladder infections. Upon entry into the lower urinary tract, UPEC face obstacles to colonization that constitute population bottlenecks, reducing diversity, and selecting for fit clones. A critical mucosal barrier to bladder infection is the epithelium (urothelium). UPEC bypass this barrier when they invade urothelial cells and form intracellular bacterial communities (IBCs), a process which requires type 1 pili. IBCs are transient in nature, occurring primarily during acute infection. Chronic bladder infection is common and can be either latent, in the form of the quiescent intracellular reservoir (QIR), or active, in the form of asymptomatic bacteriuria (ASB/ABU) or chronic cystitis. In mice, the fate of bladder infection, QIR, ASB, or chronic cystitis, is determined within the first 24 h of infection and constitutes a putative host–pathogen mucosal checkpoint that contributes to susceptibility to recurrent cystitis. Knowledge of these checkpoints and bottlenecks is critical for our understanding of bladder infection and efforts to devise novel therapeutic strategies.
Abstract:
PURPOSE Every health care sector including hospice/palliative care needs to systematically improve services using patient-defined outcomes. Data from the national Australian Palliative Care Outcomes Collaboration were used to determine whether hospice/palliative care patients' outcomes and the consistency of these outcomes have improved in the last 3 years. METHODS Data were analysed by clinical phase (stable, unstable, deteriorating, terminal). Patient-level data included the Symptom Assessment Scale and the Palliative Care Problem Severity Score. Nationally collected point-of-care data were anchored for the period July-December 2008 and subsequently compared to this baseline in six 6-month reporting cycles for all services that submitted data in every time period (n = 30) using individual longitudinal multi-level random coefficient models. RESULTS Data were analysed for 19,747 patients (46% female; 85% cancer; 27,928 episodes of care; 65,463 phases). There were significant improvements across all domains (symptom control, family care, psychological and spiritual care) except pain. Simultaneously, the interquartile ranges decreased, jointly indicating that better and more consistent patient outcomes were being achieved. CONCLUSION These are the first national hospice/palliative care symptom control performance data to demonstrate improvements in clinical outcomes at a service level as a result of routine data collection and systematic feedback.
Abstract:
This paper challenges the assumptions underlying many reviews and offers alternative criteria for examining evidence for nonpharmacological interventions. We evaluated 27 reviews examining interventions for persons with dementia as they relate to the issues of selection based on randomized controlled trial (RCT) design. Reviews were described by type of intervention, level of cognitive function, and criteria for inclusion. Of the 27 reviews, 46% required RCTs for inclusion and most had stringent inclusion criteria. This resulted in poor utilization of the literature and low ecological validity. Eliminating most of the available data poses a critical problem to clinical and research development. Studies meeting strict methodological criteria may not generalize to the greater population or may exclude sub-populations and interventions. Limitations of double-blind RCTs and potential design solutions are set forth based on appropriate population, problem, intervention, and setting characteristics.