73 results for Pit bull terriers
Abstract:
Objective: The present study investigated the foot health of the Kaimanawa feral horse population and tested the hypotheses that horses would have a large range of foot morphology and that the incidence of foot abnormality would be significantly high. Procedures: Abnormality was defined as a variation from what the two veterinarian assessors considered optimal morphology and that was judged to negatively affect the structure and/or function of the foot. Fifteen morphometric variables were measured on four calibrated photographic views of all four feet of 20 adult Kaimanawa feral horses. Four morphometric variables were measured from the lateromedial radiographs of the left forefoot of each horse. In addition, the study identified the incidence of gross abnormality observed on the photographs and radiographs of all 80 feet. Results: There was a large variation between horses in the morphometric dimensions, indicating an inconsistent foot type. Mean hoof variables were outside the normal range recommended by veterinarians and hoof care providers; 35% of all feet had a long-toe conformation and 15% had a mediolateral imbalance. Abnormalities included lateral (85% of horses) and dorsal (90% of horses) wall flares, the presence of laminar rings (80% of horses) and a bull-nose tip of the distal phalanx (75% of horses). Both hypotheses were therefore accepted. Conclusions: The Kaimanawa feral horse population demonstrated a broad range of foot abnormalities, and we propose that one reason for the questionable foot health and conformation is a lack of abrasive wearing by the environment. In comparison with other feral horse populations in Australia and America, there may be less natural-selection pressure on the foot of the Kaimanawa horses because of the forgiving environment of the Kaimanawa Ranges. Contrary to popular belief, the feral horse foot type should not be used as an ideal model for the domestic horse foot.
Abstract:
Bi-2212 thick films on silver tape are seen as a simple and low-cost alternative to high-temperature superconducting wires produced by the Powder In Tube (PIT) technique, particularly in react-and-wind applications. A rig for the continuous production of Bi-2212 tapes for use in react-and-wind component manufacture has been developed and commissioned. The rig consists of several sections, each fully automatic, for task-specific duties in the production of HTS tape. The major sections are: tape coating, sintering and annealing. High-temperature superconducting tapes with engineering critical current densities of 10 kA/cm² (77 K, self field), and lengths of up to 100 m, have been produced using the rig. Properties of the finished tape are discussed and results are presented for current density versus bend radius and applied strain. Depending on tape content and thickness, the irreversible strain ε_irr varies between 0.04 and 0.1%. Cyclic bending tests in which the applied strain did not exceed ε_irr showed negligible reduction in Jc along the length of the tape.
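As a rough guide to what the quoted strain limits mean for handling, a minimal sketch using the thin-tape bending approximation ε ≈ t/(2R); the tape thickness below is an assumed illustrative value, and only the 0.04-0.1% strain range comes from the abstract:

```python
# Thin-tape bending approximation: surface strain ~ thickness / (2 * bend radius).
# The tape thickness is an assumed, illustrative value; only the strain limits
# (0.04 % and 0.1 %) come from the abstract.

def min_bend_radius(thickness_m: float, strain_limit: float) -> float:
    """Smallest bend radius (m) that keeps surface bending strain under strain_limit."""
    return thickness_m / (2.0 * strain_limit)

tape_thickness = 0.2e-3  # 0.2 mm overall tape thickness, assumed
for eps_irr in (0.0004, 0.001):  # irreversible strain limits of 0.04 % and 0.1 %
    r = min_bend_radius(tape_thickness, eps_irr)
    print(f"eps_irr = {eps_irr:.2%}: minimum bend radius ~ {r * 100:.0f} cm")
```

Under these assumptions the tape could not be wound tighter than roughly 10-25 cm radius without exceeding ε_irr, which is the kind of constraint the bend-radius results in the paper address.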
Abstract:
In this paper, an interactive planning and scheduling framework is proposed for optimising operations from pits to crushers in the ore mining industry. A series of theoretical and practical operations research techniques is investigated to improve the overall efficiency of mining systems, given that mining managers need to tackle optimisation problems over different horizons and at different levels of detail. Under this framework, mine design planning, mine production sequencing and mine transportation scheduling models are integrated and interact within a whole optimisation system. The proposed integrated framework could be used by the mining industry to reduce equipment costs, improve production efficiency and maximise the net present value.
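The abstract does not give the model formulations, but the flavour of a pit-to-crusher allocation problem can be sketched as a small linear program that maximises discounted ore value subject to pit supply and crusher capacity. All data, variable names and the solver choice below are illustrative assumptions, not the paper's actual models:

```python
# Toy pit-to-crusher allocation LP: choose tonnes moved from each pit to each
# crusher in each period to maximise discounted value, subject to pit supply
# and crusher capacity. All numbers are made up for illustration.
import numpy as np
from scipy.optimize import linprog

pits, crushers, periods = 2, 2, 2
value = np.array([52.0, 47.0])          # $/tonne of ore from each pit (assumed)
discount = 0.95                          # per-period discount factor (assumed)
pit_supply = np.array([800.0, 600.0])    # tonnes available per pit per period
crusher_cap = np.array([500.0, 700.0])   # tonnes per crusher per period

n = pits * crushers * periods
idx = lambda p, c, t: (t * pits + p) * crushers + c  # flatten (pit, crusher, period)

# Objective: maximise discounted value => minimise its negative.
c_obj = np.zeros(n)
for t in range(periods):
    for p in range(pits):
        for c in range(crushers):
            c_obj[idx(p, c, t)] = -value[p] * discount ** t

# Capacity constraints per period: pit output limits and crusher intake limits.
A, b = [], []
for t in range(periods):
    for p in range(pits):
        row = np.zeros(n)
        for c in range(crushers):
            row[idx(p, c, t)] = 1.0
        A.append(row)
        b.append(pit_supply[p])
    for c in range(crushers):
        row = np.zeros(n)
        for p in range(pits):
            row[idx(p, c, t)] = 1.0
        A.append(row)
        b.append(crusher_cap[c])

res = linprog(c_obj, A_ub=np.array(A), b_ub=np.array(b), bounds=[(0, None)] * n)
print("tonnes moved per route:", res.x.round(1))
print("discounted value ($):", round(-res.fun, 2))
```

The integrated framework described in the abstract would layer design, sequencing and scheduling decisions on top of this kind of allocation core, typically with integer variables and far larger instances.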
Abstract:
BACKGROUND: Variations in 'slope' (how steep or flat the ground is) may be good for health. As walking up hills is a physiologically vigorous physical activity and can contribute to weight control, greater neighbourhood slopes may provide a protective barrier against weight gain and help prevent the onset of Type 2 diabetes. We explored whether living in 'hilly' neighbourhoods was associated with diabetes prevalence among the Australian adult population. METHODS: Participants (≥25 years; n=11,406) who completed the Western Australian Health and Wellbeing Surveillance System Survey (2003-2009) were asked whether or not they had medically diagnosed diabetes. Geographic Information Systems (GIS) software was used to calculate a neighbourhood mean slope score, and other built environment measures, at 1600 m around each participant's home. Logistic regression models were used to predict the odds of self-reported diabetes after progressive adjustment for individual measures (i.e., age, sex), socioeconomic status (i.e., education, income), built environment, destinations, nutrition, and amount of walking. RESULTS: After full adjustment, the odds of self-reported diabetes were 0.72 (95% CI 0.55-0.95) and 0.52 (95% CI 0.39-0.69) for adults living in neighbourhoods with moderate and higher levels of slope, respectively, compared with adults living in neighbourhoods with the lowest levels of slope. The odds of having diabetes were 13% lower (odds ratio 0.87; 95% CI 0.80-0.94) for each one-percent increase in mean slope. CONCLUSIONS: Living in a hilly neighbourhood may be protective against diabetes onset, or this finding is spurious. Nevertheless, the results are promising and have implications for future research and for the practice of flattening land in new housing developments.
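As a reminder of how a per-unit odds ratio scales, a minimal sketch of the arithmetic behind the reported 0.87 odds ratio per one-percent increase in mean slope; the multi-unit contrasts computed below are illustrations, not results reported in the abstract:

```python
# Per-unit odds ratios multiply: an OR of 0.87 per 1 % of mean slope implies
# 0.87 ** k for a k-percent difference. The k values below are illustrative.
or_per_unit = 0.87  # odds ratio per 1 % increase in mean slope (from the abstract)

for k in (1, 3, 5):
    or_k = or_per_unit ** k
    print(f"{k} % higher mean slope -> OR {or_k:.2f} "
          f"({(1 - or_k) * 100:.0f} % lower odds of self-reported diabetes)")
```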
Abstract:
Introduction: The built environment is increasingly recognised as being associated with health outcomes. Relationships between the built environment and health differ among age groups, especially between children and adults, but also between younger, mid-age and older adults. Yet few studies address differences across life stage groups within a single population study. Moreover, existing research mostly focuses on physical activity behaviours, with few studying objective clinical and mental health outcomes. The Life Course Built Environment and Health (LCBEH) project explores the impact of the built environment on self-reported and objectively measured health outcomes in a random sample of people across the life course. Methods and analysis: This cross-sectional data linkage study involves 15 954 children (0–15 years), young adults (16–24 years), adults (25–64 years) and older adults (65+ years) from the Perth metropolitan region who completed the Health and Wellbeing Surveillance System survey administered by the Department of Health of Western Australia from 2003 to 2009. Survey data were linked to Western Australia's (WA) Hospital Morbidity Database System (hospital admission) and Mental Health Information System (mental health outpatient) data. Participants' residential addresses were geocoded and features of their 'neighbourhood' were measured using Geographic Information Systems software. Associations between the built environment and self-reported and clinical health outcomes will be explored across varying geographic scales and life stages. Ethics and dissemination: The University of Western Australia's Human Research Ethics Committee and the Department of Health of Western Australia approved the study protocol (#2010/1). Findings will be published in peer-reviewed journals and presented at local, national and international conferences, thus contributing to the evidence base informing the design of healthy neighbourhoods for all residents.
Abstract:
We explored the impact of neighborhood walkability on walking among young adults, early-middle adults, middle-aged adults, and older adults across different neighborhood buffers. Participants completed the Western Australian Health and Wellbeing Surveillance System Survey (2003–2009) and were allocated a neighborhood walkability score at 200 m, 400 m, 800 m, and 1600 m around their home. We found little difference in the strength of associations across neighborhood buffer sizes for all life stages. We conclude that neighborhood walkability supports more walking regardless of adult life stage and is relevant for both small (e.g., 200 m) and larger (e.g., 1600 m) neighborhood buffers.
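A minimal sketch of how circular 'neighbourhood' buffers at the sizes used above might be generated around geocoded home locations with GeoPandas; the coordinates, projected CRS and two-participant data frame are illustrative assumptions, not the study's actual GIS workflow:

```python
# Circular buffers around geocoded home locations at the buffer sizes quoted in
# the abstract (200 m to 1600 m). Coordinates and the projected CRS are made-up
# examples; the study's own GIS processing is not described in the abstract.
import geopandas as gpd
from shapely.geometry import Point

homes = gpd.GeoDataFrame(
    {"participant_id": [1, 2]},
    geometry=[Point(115.86, -31.95), Point(115.80, -31.90)],  # lon/lat near Perth (assumed)
    crs="EPSG:4326",
)
homes_proj = homes.to_crs(epsg=32750)  # project to metres (UTM zone 50S) before buffering

for radius in (200, 400, 800, 1600):
    buffered = homes_proj.copy()
    buffered["geometry"] = homes_proj.geometry.buffer(radius)
    print(f"{radius} m buffer areas (m^2):", buffered.geometry.area.round().tolist())
```

Built-environment measures such as walkability components would then be summarised within each buffer before being attached to participants' survey records.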
Abstract:
Current military conflicts are characterized by the use of the improvised explosive device. Improvements in personal protection, medical care, and evacuation logistics have resulted in increasing numbers of casualties surviving with complex musculoskeletal injuries, often leading to lifelong disability. Thus, there exists an urgent requirement to investigate the mechanism of extremity injury caused by these devices in order to develop mitigation strategies. In addition, the wounds of war are no longer restricted to the battlefield; similar injuries can be witnessed in civilian centers following a terrorist attack. Key to understanding such mechanisms of injury is the ability to deconstruct the complexities of an explosive event into a controlled, laboratory-based environment. In this article, a traumatic injury simulator, designed to recreate in the laboratory the impulse that is transferred to the lower extremity from an anti-vehicle explosion, is presented and characterized experimentally and numerically. Tests with instrumented cadaveric limbs were then conducted to assess the simulator’s ability to interact with the human in two mounting conditions, simulating typical seated and standing vehicle passengers. This experimental device will now allow us to (a) gain comprehensive understanding of the load-transfer mechanisms through the lower limb, (b) characterize the dissipating capacity of mitigation technologies, and (c) assess the bio-fidelity of surrogates.
Abstract:
Background: Improvised explosive devices have become the characteristic weapon of the conflicts in Iraq and Afghanistan. While little can be done to mitigate the effects of blast in free-field explosions, scaled blast simulations have shown that the combat boot can attenuate the effects of anti-vehicular mine blasts on vehicle occupants. Although the combat boot offers some protection to the lower limb, its behaviour at the energies seen in anti-vehicular mine blast has not been documented previously. Methods: The soles of eight same-size combat boots from two brands currently used by UK troops deployed to Iraq and Afghanistan were impacted at energies of up to 518 J, using a spring-assisted drop rig. Results: The Meindl Desert Fox combat boot consistently experienced a lower peak force at lower impact energies and a longer time-to-peak force at higher impact energies when compared with the Lowa Desert Fox combat boot. Discussion: This reduction in peak force and extended rise time, resulting in a lower energy transfer rate, is a potentially positive mitigating effect in terms of the trauma experienced by the lower limb. Conclusion: Currently, combat boots are tested under impact at the energies seen during heel strike in running. Through the identification of significantly different behaviours at high loading, this study has shown that there is a rationale for adding the performance of combat boots under impact at energies above those set out in international standards to the list of criteria for the selection of a combat boot.
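For a sense of scale of the quoted 518 J impact energy, a minimal sketch of the equivalent free-fall drop height for a range of impactor masses; the masses are assumed illustrative values, and the study's rig is spring-assisted rather than free-fall, so its actual drop height would differ:

```python
# Equivalent free-fall drop height for a given impact energy: E = m * g * h.
# The impactor masses are assumed, illustrative values; only the 518 J energy
# comes from the abstract, and the rig described there is spring-assisted.
g = 9.81          # gravitational acceleration, m/s^2
energy_j = 518.0  # maximum impact energy reported in the abstract

for mass_kg in (10.0, 20.0, 40.0):
    height_m = energy_j / (mass_kg * g)
    print(f"{mass_kg:.0f} kg impactor -> ~{height_m:.2f} m equivalent free-drop height")
```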
Abstract:
The conflicts in Iraq and Afghanistan have been epitomized by the insurgents’ use of the improvised explosive device against vehicle-borne security forces. These weapons, capable of causing multiple severely injured casualties in a single incident, pose the most prevalent single threat to Coalition troops operating in the region. Improvements in personal protection and medical care have resulted in increasing numbers of casualties surviving with complex lower limb injuries, often leading to long-term disability. Thus, there exists an urgent requirement to investigate and mitigate against the mechanism of extremity injury caused by these devices. This will necessitate an ontological approach, linking molecular, cellular and tissue interaction to physiological dysfunction. This can only be achieved via a collaborative approach between clinicians, natural scientists and engineers, combining physical and numerical modelling tools with clinical data from the battlefield. In this article, we compile existing knowledge on the effects of explosions on skeletal injury, review and critique relevant experimental and computational research related to lower limb injury and damage and propose research foci required to drive the development of future mitigation technologies.
Abstract:
Blast mats that can be retrofitted to the floor of military vehicles are considered to reduce the risk of injury from under‐vehicle explosions. Anthropometric test devices (ATDs) are validated for use only in the seated position. The aim of this study was to use a traumatic injury simulator fitted with 3 different blast mats in order to assess the ability of 2 ATD designs to evaluate the protective capacity of the mats in 2 occupant postures under 2 severities. Tests were performed for each combination of mat design, ATD, severity and posture using an anti-vehicle under‐belly injury simulator. The differences between mitigation systems were larger under the H‐III compared to the MiL‐Lx. There was little difference in the ranking of the mitigation systems across the 2 ATDs and across postures. Results from this study suggest that conclusions obtained by testing in the seated position can be extrapolated to the standing posture. However, the different percentage reductions observed in the 2 ATDs suggest different levels of protection. It is therefore unclear which ATD should be used to assess such mitigation systems. A correlation between cadavers and ATDs on the protection offered by blast mats is required in order to elucidate this issue.
Abstract:
The lower limb of military vehicle occupants has been the most injured body part due to under‐vehicle explosions in recent conflicts. Understanding the injury mechanism and causality of injury severity could aid in developing better protection. Therefore, we tested 4 different occupant postures (seated, brace, standing, standing with knee locked in hyper‐extension) in a simulated under‐vehicle explosion (solid blast) using our traumatic injury simulator in the laboratory; we hypothesised that occupant posture would affect injury severity. No skeletal injury was observed in the specimens in seated and braced postures. Severe, impairing injuries were observed in the foot of standing and hyper‐extended specimens. These results demonstrate that a vehicle occupant whose posture at the time of the attack incorporates knee flexion is more likely to be protected against severe skeletal injury to the lower leg.
Abstract:
Lower extremities are particularly susceptible to injury in an under‐vehicle explosion. Operational fitness of military vehicles is assessed through anthropometric test devices (ATDs) in full‐scale blast tests. The aim of this study was to compare the response between the Hybrid‐III ATD, the MiL‐Lx ATD and cadavers in our traumatic injury simulator, which is able to replicate the response of the vehicle floor in an under‐vehicle explosion. All specimens were fitted with a combat boot and tested on our traumatic injury simulator in a seated position. The load recorded in the ATDs was above the tolerance levels recommended by NATO in all tests; no injuries were observed in any of the 3 cadaveric specimens. The Hybrid‐III produced higher peak forces than the MiL‐Lx. The time to peak strain in the calcaneus of the cadavers was similar to the time to peak force in the ATDs. Maximum compression of the sole of the combat boot was similar for cadavers and MiL‐Lx, but significantly greater for the Hybrid‐III. These results suggest that the MiL‐Lx has a more biofidelic response to under‐vehicle explosive events compared to the Hybrid‐III. Therefore, it is recommended that mitigation strategies are assessed using the MiL‐Lx surrogate and not the Hybrid‐III.
Abstract:
Since World War I, explosions have accounted for over 70% of all injuries in conflict. With the development of improved personnel protection of the torso, improved medical care and faster aeromedical evacuation, casualties are surviving with more severe injuries to the extremities. Understanding the processes involved in the transfer of blast-induced shock waves through biological tissues is essential for supporting efforts aimed at mitigating and treating blast injury. Given the inherent heterogeneities in the human body, we argue that studying these processes demands a highly integrated approach requiring expertise in shock physics, biomechanics and fundamental biological processes. This multidisciplinary systems approach enables one to develop the experimental framework for investigating the material properties of human tissues that are subjected to high compression waves in blast conditions and the fundamental cellular processes altered by this type of stimuli. Ultimately, we hope to use the information gained from these studies in translational research aimed at developing improved protection for those at risk and improved clinical outcomes for those who have been injured from a blast wave.
Abstract:
Healthy governance systems are key to delivering sound environmental management outcomes from global to local scales. There are, however, surprisingly few risk assessment methods that can pinpoint those domains and sub-domains within governance systems that are most likely to influence good environmental outcomes at any particular scale, or those that, if absent or dysfunctional, are most likely to prevent effective environmental management. This paper proposes a new risk assessment method for analysing governance systems. This method is then tested through its preliminary application to a significant real-world context: governance as it relates to the health of Australia's Great Barrier Reef (GBR). The GBR exists at a supra-regional scale along most of the north-eastern coast of Australia. Brodie et al. (2012, Mar. Pollut. Bull. 65: 81-100) have recently reviewed the state and trend of the health of the GBR, finding that overall trends remain of significant concern. At the same time, official international concern over the governance of the reef has recently been signalled globally by the International Union for Conservation of Nature (IUCN). These environmental and political contexts make the GBR an ideal candidate for use in testing and reviewing the application of improved tools for governance risk assessment.
Abstract:
Objectives: To report the quarterly incidence of hospital-identified Clostridium difficile infection (HI-CDI) in Australia, and to estimate the burden ascribed to hospital-associated (HA) and community-associated (CA) infections. Design, setting and patients: Prospective surveillance of all cases of CDI diagnosed in hospital patients from 1 January 2011 to 31 December 2012 in 450 public hospitals in all Australian states and the Australian Capital Territory. All patients admitted to inpatient wards or units in acute public hospitals, including psychiatry, rehabilitation and aged care, were included, as well as those attending emergency departments and outpatient clinics. Main outcome measures: Incidence of HI-CDI (primary outcome); proportion and incidence of HA-CDI and CA-CDI (secondary outcomes). Results: The annual incidence of HI-CDI increased from 3.25/10 000 patient-days (PD) in 2011 to 4.03/10 000 PD in 2012. Poisson regression modelling demonstrated a 29% increase (95% CI, 25% to 34%) per quarter between April and December 2011, with a peak of 4.49/10 000 PD in the October–December quarter. The incidence plateaued in January–March 2012 and then declined by 8% (95% CI, − 11% to − 5%) per quarter to 3.76/10 000 PD in July–September 2012, after which the rate rose again by 11% (95% CI, 4% to 19%) per quarter to 4.09/10 000 PD in October–December 2012. Trends were similar for HA-CDI and CA-CDI. A subgroup analysis determined that 26% of cases were CA-CDI. Conclusions: A significant increase in both HA-CDI and CA-CDI identified through hospital surveillance occurred in Australia during 2011–2012. Studies are required to further characterise the epidemiology of CDI in Australia.
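A minimal sketch of how quarterly incidence of this kind is typically modelled with Poisson regression using a patient-days offset, so coefficients translate into rate ratios per quarter. The counts, denominators and single linear trend below are illustrative assumptions; the study's analysis fits separate trends over distinct sub-periods and is not reproduced here:

```python
# Poisson regression of quarterly case counts with log(patient-days) as an offset,
# so the 'quarter' coefficient exponentiates to a rate ratio per quarter.
# The data below are made up; only the modelling approach mirrors the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm

quarters = np.arange(8)                        # e.g. Q1 2011 .. Q4 2012
patient_days = np.full(8, 2_000_000.0)         # assumed denominator per quarter
cases = np.array([650, 700, 810, 900, 880, 840, 790, 830])  # illustrative counts

X = sm.add_constant(pd.DataFrame({"quarter": quarters}))
model = sm.GLM(cases, X, family=sm.families.Poisson(),
               offset=np.log(patient_days)).fit()

rate_ratio = float(np.exp(model.params["quarter"]))
print(f"estimated rate ratio per quarter: {rate_ratio:.3f} "
      f"({(rate_ratio - 1) * 100:+.1f}% change per quarter)")
```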