42 results for Bull Terrier
Abstract:
Introduction: The built environment is increasingly recognised as being associated with health outcomes. Relationships between the built environment and health differ among age groups, especially between children and adults, but also between younger, mid-age and older adults. Yet few studies address differences across life stage groups within a single population. Moreover, existing research mostly focuses on physical activity behaviours, with few studies of objective clinical and mental health outcomes. The Life Course Built Environment and Health (LCBEH) project explores the impact of the built environment on self-reported and objectively measured health outcomes in a random sample of people across the life course. Methods and analysis: This cross-sectional data linkage study involves 15 954 children (0–15 years), young adults (16–24 years), adults (25–64 years) and older adults (65+ years) from the Perth metropolitan region who completed the Health and Wellbeing Surveillance System survey administered by the Department of Health of Western Australia from 2003 to 2009. Survey data were linked to Western Australia's (WA) Hospital Morbidity Database System (hospital admissions) and Mental Health Information System (mental health outpatient) data. Participants’ residential addresses were geocoded and features of their ‘neighbourhood’ were measured using Geographic Information Systems software. Associations between the built environment and self-reported and clinical health outcomes will be explored across varying geographic scales and life stages. Ethics and dissemination: The University of Western Australia's Human Research Ethics Committee and the Department of Health of Western Australia approved the study protocol (#2010/1). Findings will be published in peer-reviewed journals and presented at local, national and international conferences, thus contributing to the evidence base informing the design of healthy neighbourhoods for all residents.
Abstract:
We explored the impact of neighborhood walkability on young adults, early-middle adults, middle-aged adults, and older adults' walking across different neighborhood buffers. Participants completed the Western Australian Health and Wellbeing Surveillance System Survey (2003–2009) and were allocated a neighborhood walkability score at 200 m, 400 m, 800 m, and 1600 m around their home. We found little difference in strength of associations across neighborhood size buffers for all life stages. We conclude that neighborhood walkability supports more walking regardless of adult life stage and is relevant for small (e.g., 200 m) and larger (e.g., 1600 m) neighborhood buffers.
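The multi-scale buffer idea can be sketched in a few lines. This is a hypothetical illustration, not the study's GIS workflow: it uses planar coordinates in metres and a simple Euclidean count of destinations within each buffer radius around a home location; the `shops` data are invented.

```python
from math import hypot

# Hypothetical sketch: score a "neighbourhood" around a home by counting
# destinations that fall within each buffer radius. Coordinates are assumed
# to be planar (metres), i.e. already projected; data below are invented.

BUFFERS_M = [200, 400, 800, 1600]

def buffer_counts(home, destinations, radii=BUFFERS_M):
    """Count destinations within each buffer radius around `home`."""
    counts = {}
    for r in radii:
        counts[r] = sum(
            1 for (x, y) in destinations
            if hypot(x - home[0], y - home[1]) <= r
        )
    return counts

home = (0.0, 0.0)
shops = [(150.0, 0.0), (300.0, 300.0), (900.0, 100.0), (1500.0, 200.0)]
print(buffer_counts(home, shops))  # one count per buffer size
```

A real walkability score would combine several such layers (street connectivity, density, land-use mix) computed from road-network rather than straight-line distance, but the nesting of scores across the 200 m to 1600 m scales follows this pattern.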
Abstract:
Current military conflicts are characterized by the use of the improvised explosive device. Improvements in personal protection, medical care, and evacuation logistics have resulted in increasing numbers of casualties surviving with complex musculoskeletal injuries, often leading to lifelong disability. Thus, there exists an urgent requirement to investigate the mechanism of extremity injury caused by these devices in order to develop mitigation strategies. In addition, the wounds of war are no longer restricted to the battlefield; similar injuries can be witnessed in civilian centers following a terrorist attack. Key to understanding such mechanisms of injury is the ability to deconstruct the complexities of an explosive event into a controlled, laboratory-based environment. In this article, a traumatic injury simulator, designed to recreate in the laboratory the impulse that is transferred to the lower extremity from an anti-vehicle explosion, is presented and characterized experimentally and numerically. Tests with instrumented cadaveric limbs were then conducted to assess the simulator’s ability to interact with the human in two mounting conditions, simulating typical seated and standing vehicle passengers. This experimental device will now allow us to (a) gain comprehensive understanding of the load-transfer mechanisms through the lower limb, (b) characterize the dissipating capacity of mitigation technologies, and (c) assess the bio-fidelity of surrogates.
Abstract:
Background Improvised explosive devices have become the characteristic weapon of the conflicts in Iraq and Afghanistan. While little can be done to mitigate the effects of blast in free-field explosions, scaled blast simulations have shown that the combat boot can attenuate the effects of anti-vehicular mine blasts on vehicle occupants. Although the combat boot offers some protection to the lower limb, its behaviour at the energies seen in anti-vehicular mine blast has not been documented previously. Methods The soles of eight same-size combat boots from two brands currently used by UK troops deployed to Iraq and Afghanistan were impacted at energies of up to 518 J, using a spring-assisted drop rig. Results The results showed that the Meindl Desert Fox combat boot consistently experienced a lower peak force at lower impact energies and a longer time-to-peak force at higher impact energies when compared with the Lowa Desert Fox combat boot. Discussion This reduction in peak force and extended rise time, resulting in a lower rate of energy transfer, is a potentially positive mitigating effect in terms of the trauma experienced by the lower limb. Conclusion Currently, combat boots are impact-tested at the energies seen during heel strike in running. By identifying significantly different behaviours at high loading, this study has shown that there is a rationale for adding performance under impact at energies above those set out in international standards to the list of criteria for the selection of a combat boot.
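To give a feel for the 518 J figure, the equivalent free-fall parameters can be computed from E = mgh and E = ½mv². The 25 kg drop mass below is an assumption for illustration, not a figure from the study.

```python
from math import sqrt

# Back-of-envelope sketch (assumed drop mass, not from the study):
# a free-fall rig delivering impact energy E = m*g*h needs drop height
# h = E / (m*g) and reaches impact speed v = sqrt(2*E/m).

g = 9.81             # gravitational acceleration, m/s^2
energy_j = 518.0     # peak impact energy reported in the abstract
drop_mass_kg = 25.0  # hypothetical drop mass

height_m = energy_j / (drop_mass_kg * g)
speed_ms = sqrt(2 * energy_j / drop_mass_kg)
print(f"equivalent free-fall height: {height_m:.2f} m, "
      f"impact speed: {speed_ms:.2f} m/s")
```

A spring-assisted rig reaches the same energy from a much shorter travel, which is why it is used for these tests.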
Abstract:
The conflicts in Iraq and Afghanistan have been epitomized by the insurgents’ use of the improvised explosive device against vehicle-borne security forces. These weapons, capable of causing multiple severely injured casualties in a single incident, pose the most prevalent single threat to Coalition troops operating in the region. Improvements in personal protection and medical care have resulted in increasing numbers of casualties surviving with complex lower limb injuries, often leading to long-term disability. Thus, there exists an urgent requirement to investigate the mechanism of extremity injury caused by these devices and to mitigate its effects. This will necessitate an ontological approach, linking molecular, cellular and tissue interaction to physiological dysfunction. This can only be achieved via a collaborative approach between clinicians, natural scientists and engineers, combining physical and numerical modelling tools with clinical data from the battlefield. In this article, we compile existing knowledge on the effects of explosions on skeletal injury, review and critique relevant experimental and computational research related to lower limb injury and damage, and propose research foci required to drive the development of future mitigation technologies.
Abstract:
Blast mats that can be retrofitted to the floor of military vehicles are considered to reduce the risk of injury from under‐vehicle explosions. Anthropometric test devices (ATDs) are validated for use only in the seated position. The aim of this study was to use a traumatic injury simulator fitted with 3 different blast mats in order to assess the ability of 2 ATD designs to evaluate the protective capacity of the mats in 2 occupant postures at 2 blast severities. Tests were performed for each combination of mat design, ATD, severity and posture using an anti‐vehicle under‐belly injury simulator. The differences between mitigation systems were larger under the H‐III than under the MiL‐Lx. There was little difference in how the 2 ATDs and the 2 postures ranked the mitigation systems. Results from this study suggest that conclusions obtained by testing in the seated position can be extrapolated to the standing posture. However, the different percentage reductions observed in the 2 ATDs suggest different levels of protection. It is therefore unclear which ATD should be used to assess such mitigation systems. A correlation between cadavers and ATDs on the protection offered by blast mats is required in order to elucidate this issue.
Abstract:
The lower limb of military vehicle occupants has been the most injured body part due to undervehicle explosions in recent conflicts. Understanding the injury mechanism and causality of injury severity could aid in developing better protection. Therefore, we tested 4 different occupant postures (seated, brace, standing, standing with knee locked in hyper‐extension) in a simulated under‐vehicle explosion (solid blast) using our traumatic injury simulator in the laboratory; we hypothesised that occupant posture would affect injury severity. No skeletal injury was observed in the specimens in seated and braced postures. Severe, impairing injuries were observed in the foot of standing and hyper‐extended specimens. These results demonstrate that a vehicle occupant whose posture at the time of the attack incorporates knee flexion is more likely to be protected against severe skeletal injury to the lower leg.
Abstract:
Lower extremities are particularly susceptible to injury in an under‐vehicle explosion. Operational fitness of military vehicles is assessed through anthropometric test devices (ATDs) in full‐scale blast tests. The aim of this study was to compare the response between the Hybrid‐III ATD, the MiL‐Lx ATD and cadavers in our traumatic injury simulator, which is able to replicate the response of the vehicle floor in an under‐vehicle explosion. All specimens were fitted with a combat boot and tested on our traumatic injury simulator in a seated position. The load recorded in the ATDs was above the tolerance levels recommended by NATO in all tests; no injuries were observed in any of the 3 cadaveric specimens. The Hybrid‐III produced higher peak forces than the MiL‐Lx. The time to peak strain in the calcaneus of the cadavers was similar to the time to peak force in the ATDs. Maximum compression of the sole of the combat boot was similar for cadavers and MiL‐Lx, but significantly greater for the Hybrid‐III. These results suggest that the MiL‐Lx has a more biofidelic response to under‐vehicle explosive events compared to the Hybrid‐III. Therefore, it is recommended that mitigation strategies are assessed using the MiL‐Lx surrogate and not the Hybrid‐III.
Abstract:
Since World War I, explosions have accounted for over 70% of all injuries in conflict. With the development of improved personnel protection of the torso, improved medical care and faster aeromedical evacuation, casualties are surviving with more severe injuries to the extremities. Understanding the processes involved in the transfer of blast-induced shock waves through biological tissues is essential for supporting efforts aimed at mitigating and treating blast injury. Given the inherent heterogeneities in the human body, we argue that studying these processes demands a highly integrated approach requiring expertise in shock physics, biomechanics and fundamental biological processes. This multidisciplinary systems approach enables one to develop the experimental framework for investigating the material properties of human tissues that are subjected to high compression waves in blast conditions and the fundamental cellular processes altered by this type of stimuli. Ultimately, we hope to use the information gained from these studies in translational research aimed at developing improved protection for those at risk and improved clinical outcomes for those who have been injured from a blast wave.
Abstract:
Healthy governance systems are key to delivering sound environmental management outcomes from global to local scales. There are, however, surprisingly few risk assessment methods that can pinpoint the domains and sub-domains within governance systems that are most likely to influence good environmental outcomes at any particular scale, or those that, if absent or dysfunctional, are most likely to prevent effective environmental management. This paper proposes a new risk assessment method for analysing governance systems. This method is then tested through its preliminary application to a significant real-world context: governance as it relates to the health of Australia's Great Barrier Reef (GBR). The GBR exists at a supra-regional scale along most of the north-eastern coast of Australia. Brodie et al (2012 Mar. Pollut. Bull. 65 81-100) have recently reviewed the state and trend of the health of the GBR, finding that overall trends remain of significant concern. At the same time, official international concern over the governance of the reef has recently been signalled globally by the International Union for Conservation of Nature (IUCN). These environmental and political contexts make the GBR an ideal candidate for testing and reviewing the application of improved tools for governance risk assessment. © 2013 IOP Publishing Ltd.
Abstract:
Objectives: To report the quarterly incidence of hospital-identified Clostridium difficile infection (HI-CDI) in Australia, and to estimate the burden ascribed to hospital-associated (HA) and community-associated (CA) infections. Design, setting and patients: Prospective surveillance of all cases of CDI diagnosed in hospital patients from 1 January 2011 to 31 December 2012 in 450 public hospitals in all Australian states and the Australian Capital Territory. All patients admitted to inpatient wards or units in acute public hospitals, including psychiatry, rehabilitation and aged care, were included, as well as those attending emergency departments and outpatient clinics. Main outcome measures: Incidence of HI-CDI (primary outcome); proportion and incidence of HA-CDI and CA-CDI (secondary outcomes). Results: The annual incidence of HI-CDI increased from 3.25/10 000 patient-days (PD) in 2011 to 4.03/10 000 PD in 2012. Poisson regression modelling demonstrated a 29% increase (95% CI, 25% to 34%) per quarter between April and December 2011, with a peak of 4.49/10 000 PD in the October–December quarter. The incidence plateaued in January–March 2012 and then declined by 8% (95% CI, −11% to −5%) per quarter to 3.76/10 000 PD in July–September 2012, after which the rate rose again by 11% (95% CI, 4% to 19%) per quarter to 4.09/10 000 PD in October–December 2012. Trends were similar for HA-CDI and CA-CDI. A subgroup analysis determined that 26% of cases were CA-CDI. Conclusions: A significant increase in both HA-CDI and CA-CDI identified through hospital surveillance occurred in Australia during 2011–2012. Studies are required to further characterise the epidemiology of CDI in Australia.
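The incidence measure and the quarter-on-quarter trend can be illustrated with a short sketch. The case counts and patient-day denominators below are hypothetical, chosen only to produce rates of the order reported; the study's actual denominators are not given in the abstract.

```python
# Illustrative sketch (hypothetical numbers, not the study's data):
# CDI incidence is cases per 10 000 patient-days (PD), and trends are
# quarter-on-quarter percentage changes in that rate.

quarters = [
    # (cases, patient_days) per quarter -- invented for illustration
    (650, 2_000_000),
    (840, 2_000_000),
    (898, 2_000_000),
]

rates = [10_000 * cases / pd for cases, pd in quarters]
changes = [100 * (b / a - 1) for a, b in zip(rates, rates[1:])]
print([round(r, 2) for r in rates])    # incidence per 10 000 PD
print([round(c, 1) for c in changes])  # % change per quarter
```

A Poisson regression as used in the study fits a log-linear trend to such counts with the patient-days as an exposure offset, rather than comparing adjacent quarters directly.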
Abstract:
Introduction Natural product provenance is important in the food, beverage and pharmaceutical industries, for consumer confidence and because of its health implications. Raman spectroscopy has powerful molecular fingerprinting abilities, and the sharp peaks of Surface Enhanced Raman Spectroscopy (SERS) allow distinction between minimally different molecules, so it should be suitable for this purpose. Methods Naturally caffeinated beverages containing Guarana extract and coffee, with Red Bull energy drink as a synthetic caffeinated comparator (20 µL each), were reacted 1:1 for 10 minutes with gold nanoparticles functionalised with anti-caffeine antibody (ab15221), air dried and analysed in a micro-Raman instrument. The spectral data were processed using Principal Component Analysis (PCA). Results The PCA showed that Guarana-sourced caffeine varied significantly from synthetic caffeine (Red Bull) on component 1 (containing 76.4% of the variance in the data); see figure 1. The coffee-containing beverages, in particular Robert Timms (instant coffee), were very similar on component 1, although the barista espresso showed minor variance on that component. Both coffee-sourced caffeine samples differed from Red Bull on component 2 (20% of the variance).
Figure 1: PCA comparing a naturally caffeinated beverage containing Guarana with coffee.
Discussion PCA is an unsupervised multivariate statistical method that identifies patterns within data. Figure 1 shows that caffeine in Guarana is notably different from synthetic caffeine. Other researchers have shown that caffeine in Guarana plants is complexed with tannins. In figure 1, naturally sourced or lightly processed caffeine (Monster Energy, espresso) is more inherently different from synthetic (Red Bull) or highly processed (Robert Timms) caffeine, which is consistent with this finding and demonstrates the technique's applicability.
Guarana provenance is important because it is still largely hand produced and demand is escalating with recognition of its benefits. This could be a powerful technique for Guarana provenance, and may extend to other industries where provenance or authentication is required, e.g. the wine or natural pharmaceuticals industries.
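The PCA step can be sketched in a few lines of linear algebra. The "spectra" below are synthetic, not the study's Raman data: two hypothetical groups of samples differ along one spectral feature, mirroring how natural and synthetic caffeine separate on component 1.

```python
import numpy as np

# Minimal PCA sketch (synthetic data, not the study's Raman spectra):
# centre the spectra matrix and project onto the leading right singular
# vectors. Rows = samples, columns = wavenumber channels.

rng = np.random.default_rng(0)
base = np.linspace(0, 1, 50)
# Two hypothetical groups differing in one broad spectral feature:
group_a = base + 0.05 * rng.standard_normal((4, 50))
group_b = base + 0.5 * np.sin(np.pi * base) + 0.05 * rng.standard_normal((4, 50))
X = np.vstack([group_a, group_b])

Xc = X - X.mean(axis=0)            # mean-centre each channel
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T             # PC1/PC2 scores per sample
explained = s**2 / np.sum(s**2)    # fraction of variance per component
print(scores.shape, round(float(explained[0]), 3))
```

With a dominant between-group difference, most of the variance loads onto component 1 and the two groups separate along it, which is the pattern the abstract reports for Guarana versus Red Bull.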
Abstract:
Motivated by privacy issues associated with dissemination of signed digital certificates, we define a new type of signature scheme called a ‘Universal Designated-Verifier Signature’ (UDVS). A UDVS scheme can function as a standard publicly-verifiable digital signature but has additional functionality which allows any holder of a signature (not necessarily the signer) to designate the signature to any desired designated-verifier (using the verifier’s public key). Given the designated-signature, the designated-verifier can verify that the message was signed by the signer, but is unable to convince anyone else of this fact. We propose an efficient deterministic UDVS scheme constructed using any bilinear group-pair. Our UDVS scheme functions as a standard Boneh-Lynn-Shacham (BLS) signature when no verifier-designation is performed, and is therefore compatible with the key-generation, signing and verifying algorithms of the BLS scheme. We prove that our UDVS scheme is secure in the sense of our unforgeability and privacy notions for UDVS schemes, under the Bilinear Diffie-Hellman (BDH) assumption for the underlying group-pair, in the random-oracle model. We also demonstrate a general constructive equivalence between a class of unforgeable and unconditionally-private UDVS schemes having unique signatures (which includes the deterministic UDVS schemes) and a class of ID-Based Encryption (IBE) schemes which contains the Boneh-Franklin IBE scheme but not the Cocks IBE scheme.
Abstract:
The question of whether Socially Responsible (SR) firms outperform or underperform conventional firms has been debated in the economic literature. In this study, using the Socially Responsible Investment (SRI) indexes and conventional stock indexes in the US, the UK and Japan, first and second moments of firm performance distributions are estimated based on the Markov Switching (MS) model. We find two distinct regimes (bear and bull) in the SRI markets as well as in the conventional stock markets for all three countries. These regimes occur with the same timing in both types of market. No statistical difference was found between the means and volatilities generated by the SRI indexes and the conventional indexes in any of the three countries. Furthermore, we find strong comovements between the two indexes in both regimes.
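The regime structure behind an MS model can be illustrated with a toy two-state simulation. The transition probabilities, means and volatilities below are invented for illustration, not the paper's estimates; a real MS model would estimate them from the index returns rather than assume them.

```python
import random

# Toy two-regime (bull/bear) Markov-switching return process
# (illustrative parameters, not the paper's estimates): each period's
# return draws from a regime-specific normal, and the regime evolves
# as a two-state Markov chain.

random.seed(42)

P = {0: (0.95, 0.05),   # bull: stay with prob 0.95, switch with 0.05
     1: (0.10, 0.90)}   # bear: switch with prob 0.10, stay with 0.90
MU = {0: 0.08, 1: -0.10}    # regime means (annualised, hypothetical)
SIGMA = {0: 0.05, 1: 0.15}  # regime volatilities (hypothetical)

state, states, returns = 0, [], []
for _ in range(5000):
    states.append(state)
    returns.append(random.gauss(MU[state], SIGMA[state]))
    # move to state 0 with prob P[state][0], else state 1
    state = 0 if random.random() < P[state][0] else 1

bull = [r for r, s in zip(returns, states) if s == 0]
bear = [r for r, s in zip(returns, states) if s == 1]
print(round(sum(bull) / len(bull), 3), round(sum(bear) / len(bear), 3))
```

The per-regime sample means recover the assumed bull and bear parameters; estimation in the paper runs this logic in reverse, inferring the hidden regime sequence and moments from observed index returns.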