900 results for Sports - Safety measures
Abstract:
Association rule mining is one technique that is widely used when querying databases, especially transactional ones, in order to obtain useful associations or correlations among sets of items. Much work has been done focusing on efficiency, effectiveness and redundancy. There has also been a focus on the quality of rules from single-level datasets, with many interestingness measures proposed. However, with multi-level datasets now common, there is a lack of interestingness measures developed for multi-level and cross-level rules. Single-level measures do not take into account the hierarchy found in a multi-level dataset. This leaves the Support-Confidence approach, which does not consider the hierarchy anyway and has other drawbacks, as one of the few measures available. In this chapter we propose two approaches which measure multi-level association rules to help evaluate their interestingness by considering the database's underlying taxonomy. These measures of diversity and peculiarity can be used to help identify those rules from multi-level datasets that are potentially useful.
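For context on the baseline Support-Confidence approach this abstract criticises (not the diversity and peculiarity measures it proposes, which are not specified here), a minimal sketch of computing support and confidence for a single rule over a toy transactional dataset might look like the following; the transactions and the rule itself are illustrative assumptions.

```python
# Minimal sketch: support and confidence of an association rule A -> B
# over a small, made-up transactional dataset (illustrative only).

transactions = [
    {"milk", "bread", "butter"},
    {"bread", "butter"},
    {"milk", "bread"},
    {"milk", "butter"},
    {"bread"},
]

def support(itemset, transactions):
    """Fraction of transactions that contain every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Support of the full rule divided by support of its antecedent."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

rule_antecedent = {"bread"}
rule_consequent = {"butter"}

print("support:", support(rule_antecedent | rule_consequent, transactions))      # 0.4
print("confidence:", confidence(rule_antecedent, rule_consequent, transactions))  # 0.5
```

Note that nothing in this sketch consults an item taxonomy, which is exactly the limitation the abstract raises for multi-level and cross-level rules.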
Abstract:
Objectives To evaluate the feasibility, acceptability and effects of a Tai Chi and Qigong exercise programme in adults with elevated blood glucose. Design, Setting, and Participants A single group pre–post feasibility trial with 11 participants (3 male and 8 female; aged 42–65 years) with elevated blood glucose. Intervention Participants attended Tai Chi and Qigong exercise training for 1 to 1.5 h, 3 times per week for 12 weeks, and were encouraged to practise the exercises at home. Main Outcome Measures Indicators of metabolic syndrome (body mass index (BMI), waist circumference, blood pressure, fasting blood glucose, triglycerides, HDL-cholesterol); glucose control (HbA1c, fasting insulin and insulin resistance (HOMA)); health-related quality of life; stress and depressive symptoms. Results There was good adherence and high acceptability. There were significant improvements in four of the seven indicators of metabolic syndrome including BMI (mean difference −1.05, p<0.001), waist circumference (−2.80 cm, p<0.05), and systolic (−11.64 mm Hg, p<0.01) and diastolic blood pressure (−9.73 mm Hg, p<0.001), as well as in HbA1c (−0.32%, p<0.01), insulin resistance (−0.53, p<0.05), stress (−2.27, p<0.05), depressive symptoms (−3.60, p<0.05), and the SF-36 mental health summary score (5.13, p<0.05) and subscales for general health (19.00, p<0.01), mental health (10.55, p<0.01) and vitality (23.18, p<0.05). Conclusions The programme was feasible and acceptable and participants showed improvements in metabolic and psychological variables. A larger controlled trial is now needed to confirm these promising preliminary results.
Abstract:
In Australia, protection orders are a key legal response to domestic violence, and are often viewed as a way of providing for victim safety. For instance, recently the joint Australian and New South Wales Law Reform Commissions recommended that a common core purpose of all state and territory domestic violence legislation should be ‘to ensure or maximise the safety and protection of persons who fear or experience family violence’ (2010:Recommendation 7-4). Drawing and building upon prior research in Australia and the United States (‘US’), this paper uses comparative quantitative content analysis to assess the victim safety focus of domestic violence protection order legislation in each Australian state and territory. The findings of this analysis show that the Northern Territory, South Australia and Victoria ‘stand out’ from the other jurisdictions, having the highest victim safety focus in their legislation. However, there remains sizeable scope for improvement in all Australian jurisdictions, both in terms of the victim safety focus of their legislative provisions and in addressing legislative inconsistency between jurisdictions.
Abstract:
Road traffic injuries are one of the major public health burdens worldwide. The United Nations Decade of Action for Road Safety (2011-2020) implores all nations to work to reduce this burden. This decade represents a unique and historic period of time in the field of road safety. Information exchange and co-operation between nations is an important step in achieving the goal. The burden of road crashes, fatalities and injuries is not equally distributed. We know that low and middle-income countries experience the majority of the road trauma burden. Therefore it is imperative that these countries learn from the successes of others that have developed and implemented road safety laws, public education campaigns and countermeasures over many years and have achieved significant road trauma reductions as a result. China is one of the countries experiencing a large road trauma burden. Vulnerable road users such as pedestrians and cyclists make up a large proportion of fatalities and injuries in China. Speeding, impaired/drug driving, distracted driving, vehicle overloading, inadequate road infrastructure, limited use of safety restraints and helmets, and limited road safety training have all been identified as contributing to the problem. Some important steps have been taken to strengthen China’s approach, including increased penalties for drunk driving in May 2011 and increased attention to school bus safety in 2011/12. However, there is still a large amount of work needed to improve the current road safety position in China. This paper provides details of a program to assist with road safety knowledge exchange between China and Australia, funded by the Australian Government and undertaken in the latter part of 2012. The four-month program provided the opportunity for the first author to work closely with key agencies in Australia that are responsible for policy development and implementation of a broad range of road safety initiatives. In doing so, an in-depth understanding was gained about key road safety strategies in Australia and processes for developing and implementing them. Insights were also gained into the mechanisms used for road safety policy development, implementation and evaluation in several Australian jurisdictions. Road traffic law and enforcement issues were explored with the relevant jurisdictional transport and police agencies to provide a greater understanding of how Chinese laws and practices could be enhanced. Working with agencies responsible for public education and awareness campaigns about road safety in Australia also provided relevant information about how to promote road safety at the broader community level in China. Finally, the program provided opportunities to work closely with several world-renowned Australian research centres and key expert researchers to enhance opportunities for ongoing road safety research in China. The overall program provided the opportunity for the first author to develop knowledge in key areas of road safety strategy development, implementation and management which are directly relevant to the current situation in China. This paper describes some main observations and findings from participation in the program.
Abstract:
Background In China, as in many developing countries, rapid increases in car ownership and new drivers have been coupled with a large trauma burden. The World Health Organization has identified key risk factors including speeding, drink-driving, helmet and restraint non-use, overloaded vehicles, and fatigued-driving in many rapidly motorising countries, including China. Levels of awareness of these risk factors among road users are not well understood. Although research identifies speeding as the major factor contributing to road crashes in China, there appears to be widespread acceptance of it among the broader community. Purpose To assess self-reported speeding and awareness of crash risk factors among Chinese drivers in Beijing. Methods Car drivers (n=299) were recruited from car washing locations and car parks to complete an anonymous questionnaire. Perceptions of the relative risk of drink-driving, fatigued-driving and speeding, and attitudes towards speeding and self-reported driving speeds were assessed. Results Overall, driving speeds of >10 km/h above posted limits on two road types (60 and 80 km/h zones) were reported by more than one third of drivers. High-range speeding (i.e., >30 km/h in a 60 km/h zone and >40 km/h in an 80 km/h zone) was reported by approximately 5% of the sample. Attitudinal measures indicated that approximately three quarters of drivers reported attitudes that were not supportive of speeding. Drink-driving was identified as the most risky behaviour; 18% reported the perception that drink-driving had the same level of danger as speeding and 82% reported it as more dangerous. For fatigued-driving, 1% reported the perception that it was not as dangerous as speeding; 27.4% reported it as the same level and 71.6% perceived it as more dangerous. Conclusion Driving speeds well above posted speed limits were commonly reported by drivers. Speeding was rated as the least dangerous on-road behaviour, compared to drink-driving and fatigued-driving. One third of drivers reported regularly engaging in speeds at least 10 km/h above posted limits, despite speeding being the major reported contributor to crashes. Greater awareness of the risks associated with speeding is needed to help reduce the road trauma burden in China and promote greater speed limit compliance.
Abstract:
Pilot cars are used in one-lane two-way work zones to guide traffic and keep vehicle speeds within posted limits. While many studies have examined the effectiveness of measures to reduce vehicle speeds in work zones, little is known about the reductions achievable through the use of pilot cars. This paper examines the effectiveness of a pilot car in reducing travel speeds in a rural highway work zone in Queensland, Australia. Analysis of speed data covering a period of five days showed that a pilot car reduced average speeds at the treatment location, but not downstream. The proportion of vehicles speeding through the activity area was also reduced, particularly those traveling at 10 km/h or more above the posted limit. Motorists were more likely to speed during the day, under a 40 km/h limit, when traffic volumes were higher and when there were fewer vehicles in the traffic stream. Medium vehicles were less likely to speed in the presence of a pilot car than light vehicles. To maximize these benefits, it is necessary to ensure that the pilot car itself is not speeding.
Abstract:
Objective The 2010–2011 Queensland floods resulted in the most deaths from a single flood event in Australia since 1916. This article analyses the information on these deaths for comparison with those from previous floods in modern Australia in an attempt to identify factors that have contributed to those deaths. Haddon's Matrix, originally designed for prevention of road trauma, offers a framework for understanding the interplay between contributing factors and helps facilitate a clearer understanding of the varied strategies required to ensure people's safety for particular flood types. Methods Public reports and flood-relevant literature were searched using the key words ‘flood’, ‘fatality’, ‘mortality’, ‘death’, ‘injury’ and ‘victim’ through Google Scholar, PubMed, ProQuest and EBSCO. Data relating to reported deaths during the 2010–2011 Queensland floods, and relevant data on previous Australian flood fatalities (1997–2009), were collected from these available sources. These sources were also used to identify contributing factors. Results There were 33 deaths directly attributed to the event, of which 54.5% were swept away in a flash flood on 10 January 2011. A further 15.1% of fatalities were caused by inappropriate behaviours. This differs from previous floods in modern Australia, where over 90% of deaths were related to the choices made by individuals. There is no single reason why people drown in floods, but rather a complex interplay of factors. Conclusions The present study and its integration of research findings and conceptual frameworks might assist governments and communities to develop policies and strategies to prevent flood injury and fatalities.
Abstract:
Objective Despite ‘hospital resilience’ gaining prominence in recent years, it remains poorly defined. This article aims to define hospital resilience, build a preliminary conceptual framework and highlight possible approaches to measurement. Methods Searches were conducted of the commonly used health databases to identify relevant literature and reports. Search terms included ‘resilience and framework or model’ or ‘evaluation or assess or measure and hospital and disaster or emergency or mass casualty and resilience or capacity or preparedness or response or safety’. Articles were retrieved that focussed on disaster resilience frameworks and the evaluation of various hospital capacities. Result A total of 1480 potentially eligible publications were retrieved initially but the final analysis was conducted on 47 articles, which appeared to contribute to the study objectives. Four disaster resilience frameworks and 11 evaluation instruments of hospital disaster capacity were included. Discussion and conclusion Hospital resilience is a comprehensive concept derived from existing disaster resilience frameworks. It has four key domains: hospital safety; disaster preparedness and resources; continuity of essential medical services; and recovery and adaptation. These domains were categorised according to four criteria, namely, robustness, redundancy, resourcefulness and rapidity. A conceptual understanding of hospital resilience is essential as an intellectual basis for an integrated approach to system development. This article (1) defines hospital resilience; (2) constructs a conceptual framework (including key domains); (3) proposes comprehensive measures for possible inclusion in an evaluation instrument; and (4) develops a matrix of critical issues to enhance hospital resilience to cope with future disasters.
Abstract:
This paper will identify and discuss the major occupational health and safety (OHS) hazards and risks for clean-up and recovery workers. The lessons learned from previous disasters, including the Exxon Valdez oil spill, the World Trade Centre (WTC) terrorist attack, Hurricane Katrina and the Deepwater Horizon Gulf of Mexico oil spill, will be discussed. The case for an increased level of preparation and planning to mitigate the health risks for clean-up and recovery workers will be presented, based on recurring themes identified in the peer-reviewed literature. There are a number of important issues pertaining to the occupational health and safety of workers who are engaged in clean-up and recovery operations following natural and technological disasters. These workers are often exposed to a wide range of occupational health and safety hazards, some of which may be unknown at the time. It is well established that clean-up and recovery operations involve risks of physical injury, for example, from manual handling, mechanical equipment, extreme temperatures, and slips, trips and falls. In addition to these well-established physical injury risks, there is now an increasing number of studies which highlight the risks of longer term or chronic health effects arising from clean-up and recovery work. In particular, follow-up studies from the Exxon Valdez oil spill, Hurricane Katrina and the World Trade Centre (WTC) terrorist attack have documented the longer term health consequences of these events. These health effects include respiratory symptoms and musculoskeletal disorders, as well as post-traumatic stress disorder (PTSD). In large-scale operations, many of the workers and supervisors involved have not had any specific occupational health and safety (OHS) training and may not have access to the necessary instruction, personal protective equipment or other appropriate equipment; this is especially true when volunteers are used to form part of the clean-up and recovery workforce. In general, first responders are better equipped and trained than clean-up and recovery workers, and some of the training approaches used for traditional first responders would be relevant for clean-up and recovery workers.
Abstract:
Objectives Current evidence to support non-medical prescribing is predominantly qualitative, with little evaluation of accuracy, safety and appropriateness. Our aim was to evaluate a new model of service for the Australian healthcare system, of inpatient medication prescribing by a pharmacist in an elective surgery preadmission clinic (PAC) against usual care, using an endorsed performance framework. Design Single centre, randomised controlled, two-arm trial. Setting Elective surgery PAC in a Brisbane-based tertiary hospital. Participants 400 adults scheduled for elective surgery were randomised to intervention or control. Intervention A pharmacist generated the inpatient medication chart to reflect the patient's regular medication, made a plan for medication perioperatively and prescribed venous thromboembolism (VTE) prophylaxis. In the control arm, the medication chart was generated by the Resident Medical Officers. Outcome measures Primary outcome was frequency of omissions and prescribing errors when compared against the medication history. The clinical significance of omissions was also analysed. Secondary outcome was appropriateness of VTE prophylaxis prescribing. Results There were significantly fewer unintended omissions of medications: 11 of 887 (1.2%) intervention orders compared with 383 of 1217 (31.5%) control (p<0.001). There were significantly fewer prescribing errors involving selection of drug, dose or frequency: 2 in 857 (0.2%) intervention orders compared with 51 in 807 (6.3%) control (p<0.001). Orders with at least one component of the prescription missing, incorrect or unclear occurred in 208 of 904 (23%) intervention orders and 445 of 1034 (43%) controls (p<0.001). VTE prophylaxis on admission to the ward was appropriate in 93% of intervention patients and 90% of controls (p=0.29). Conclusions Medication charts in the intervention arm contained fewer clinically significant omissions and prescribing errors when compared with controls. There was no difference in appropriateness of VTE prophylaxis on admission between the two groups.
Abstract:
AIMS: Recent studies on corneal markers have advocated corneal nerve fibre length as the most important measure of diabetic peripheral neuropathy. The aim of this study was to determine if standardizing corneal nerve fibre length for tortuosity increases its association with other measures of diabetic peripheral neuropathy. METHODS: Two hundred and thirty-one individuals with diabetes with either predominantly mild or absent neuropathic changes and 61 control subjects underwent evaluation of diabetic neuropathy symptom score, neuropathy disability score, testing with 10-g monofilament, quantitative sensory testing (warm, cold, vibration detection) and nerve conduction studies. Corneal nerve fibre length and corneal nerve fibre tortuosity were measured using corneal confocal microscopy. A tortuosity-standardised corneal nerve fibre length variable was generated by dividing corneal nerve fibre length by corneal nerve fibre tortuosity. Differences in corneal nerve morphology between individuals with and without diabetic peripheral neuropathy and control subjects were determined and associations were estimated between corneal morphology and established tests of, and risk factors for, diabetic peripheral neuropathy. RESULTS: The tortuosity-standardised corneal nerve fibre length variable was better than corneal nerve fibre length in demonstrating differences between individuals with diabetes, with and without neuropathy (tortuosity-standardised corneal nerve fibre length variable: 70.5 ± 27.3 vs. 84.9 ± 28.7, P < 0.001, receiver operating characteristic area under the curve = 0.67; corneal nerve fibre length: 15.9 ± 6.9 vs. 18.4 ± 6.2 mm/mm², P = 0.004, receiver operating characteristic area under the curve = 0.64). Furthermore, the tortuosity-standardised corneal nerve fibre length variable demonstrated a significant difference between the control subjects and individuals with diabetes, without neuropathy, while corneal nerve fibre length did not (tortuosity-standardised corneal nerve fibre length variable: 94.3 ± 27.1 vs. 84.9 ± 28.7, P = 0.028; corneal nerve fibre length: 20.1 ± 6.3 vs. 18.4 ± 6.2 mm/mm², P = 0.084). Correlations between corneal nerve fibre length and established measures of neuropathy and risk factors for neuropathy were higher when a correction was made for nerve tortuosity. CONCLUSIONS: Standardizing corneal nerve fibre length for tortuosity enhances the ability to differentiate individuals with diabetes, with and without neuropathy.
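The standardisation described above is a simple ratio of the two measured quantities; a minimal sketch of the calculation, using hypothetical input values rather than data from the study, might look like this:

```python
# Minimal sketch of the tortuosity-standardised corneal nerve fibre length
# described above: corneal nerve fibre length divided by corneal nerve
# fibre tortuosity. The input values below are illustrative, not study data.

def standardised_cnfl(cnfl_mm_per_mm2: float, tortuosity: float) -> float:
    """Return corneal nerve fibre length standardised for tortuosity."""
    return cnfl_mm_per_mm2 / tortuosity

example = standardised_cnfl(cnfl_mm_per_mm2=16.0, tortuosity=0.20)  # hypothetical values
print(round(example, 1))  # 80.0
```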
Abstract:
Objective Recently, a number of studies have identified self-employed Protective Behavioral Strategies (PBS) as effective in decreasing the level of alcohol-related harm among young people. However, much of the published research has ignored important gender differences, such as women's increased tendency to rely on PBS that are social in nature. To further the understanding of women's PBS, the current study sought to investigate the nature and correlates of the strategies young women employ to keep their friends safe when drinking (i.e., peer-directed PBS). Method A scale measuring peer-directed PBS was developed and administered in conjunction with existing measures of alcohol consumption, personal PBS, and peer attachment. Participants consisted of 422 women aged 18–30 years, recruited among psychology students and the general public. Results Exploratory factor analysis revealed two clusters of peer-directed PBS: those aimed at reducing intoxication among one's friends and those designed to minimize alcohol-related harms. Further analysis found a positive relationship between women's tendency to implement personal and peer-directed PBS, and showed that risky drinkers were less likely to engage in either type of PBS. Conclusion Findings indicate that personal and peer-directed PBS are related behaviors that are less frequently adopted by risky drinkers.
Abstract:
The contact lens industry has evolved and now provides many choices, including continuous wear, overnight orthokeratology, frequent-replacement lenses, daily-disposable lenses, and many alternatives in systems of care and maintenance. Epidemiologic studies to date have shown that how a lens is worn, particularly if worn overnight, can increase the risk of microbial keratitis. However, the risk of silicone hydrogel contact lenses worn on a continuous-wear basis has been evaluated only recently. This article summarizes the recent research data on extended-wear silicone hydrogel lenses and discusses the challenges of early evaluations of silicone hydrogel lens safety. Finally, the relevance of this information is discussed for practitioners and contact lens wearers making choices about the risks and benefits of different products and how they are used.
Abstract:
In nature, the interactions between agents in a complex system (fish schools; colonies of ants) are governed by information that is locally created. Each agent self-organizes (adjusts) its behaviour, not through a central command centre, but based on variables that emerge from the interactions with other system agents in the neighbourhood. Self-organization has been proposed as a mechanism to explain the tendencies for individual performers to interact with each other in field-invasion sports teams, displaying functional co-adaptive behaviours, without the need for central control. The relevance of self-organization as a mechanism that explains pattern-forming dynamics within attacker-defender interactions in field-invasion sports has been sustained in the literature. Nonetheless, other levels of interpersonal coordination, such as intra-team interactions, still raise important questions, particularly with reference to the role of leadership or match strategies that have been prescribed in advance by a coach. The existence of key properties of complex systems, such as system degeneracy, nonlinearity or contextual dependency, suggests that self-organization is a functional mechanism to explain the emergence of interpersonal coordination tendencies within intra-team interactions. In this opinion article we propose how leadership may act as a key constraint on the emergent, self-organizational tendencies of performers in field-invasion sports.
Abstract:
Capacity to produce data for performance analysis in sports has been enhanced in the last decade by substantial technological advances. However, current performance analysis methods have been criticised for the lack of a viable theoretical framework to assist in the development of fundamental principles that regulate performance achievement. Our aim in this paper is to discuss ecological dynamics as an explanatory framework for improving analysis and understanding of competitive performance behaviours. We argue that integration of ideas from ecological dynamics into previous approaches to performance analysis advances current understanding of how sport performance emerges from continuous interactions between individual players and teams. Exemplar data from previous studies in association football are presented to illustrate this novel perspective on performance analysis. Limitations of current ecological dynamics research and challenges for future research are discussed in order to improve the meaningfulness of information presented to coaches and managers.