305 results for high-risk medications
Abstract:
Objectives To assess the effects of information interventions which orient patients and their carers/family to a cancer care facility and the services available within the facility. Design Systematic review of randomised controlled trials (RCTs), cluster RCTs and quasi-RCTs. Data sources MEDLINE, CINAHL, PsycINFO, EMBASE and the Cochrane Central Register of Controlled Trials. Methods We included studies evaluating the effect of an orientation intervention compared with a control group receiving usual care, and trials comparing one orientation intervention with another. Results Four RCTs of 610 participants met the criteria for inclusion. Findings from two RCTs demonstrated significant benefits of the orientation intervention in relation to reduced levels of distress (mean difference (MD): −8.96, 95% confidence interval (95% CI): −11.79 to −6.13), but non-significant benefits in relation to state anxiety levels (MD: −9.77, 95% CI: −24.96 to 5.41). There are insufficient data on the other outcomes of interest. Conclusions This review has demonstrated the feasibility and some potential benefits of orientation interventions. There was a low level of evidence to suggest that orientation interventions can reduce distress in patients. However, other outcomes, including patient knowledge recall/satisfaction, remain inconclusive. The majority of trials were subject to a high risk of bias and were likely to be insufficiently powered. Further well-conducted and adequately powered RCTs are required to provide evidence for determining the most appropriate intensity, nature, mode and resources for such interventions. Patient and carer-focused outcomes should be included.
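The review reports its pooled results as a mean difference with a 95% confidence interval. As a minimal illustrative sketch (with hypothetical numbers, not the review's data), that calculation from two-group summary statistics looks like this:

```python
# Illustrative sketch (hypothetical numbers, not the review's data): computing a
# mean difference and its 95% confidence interval from two-group summary
# statistics, the form in which the distress results above are reported.
import math

def mean_difference_ci(m1, sd1, n1, m2, sd2, n2, z=1.96):
    """Unpooled mean difference and 95% CI, intervention vs control."""
    md = m1 - m2
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return md, (md - z * se, md + z * se)

# Hypothetical summary statistics for a distress score.
md, (lo, hi) = mean_difference_ci(m1=20.0, sd1=9.0, n1=75,
                                  m2=29.0, sd2=10.0, n2=75)
print(f"MD: {md:.2f}, 95% CI: {lo:.2f} to {hi:.2f}")
```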
Abstract:
Safety culture is a concept that has long been accepted in high-risk industries such as aviation, nuclear industries and mining; however, considerable research is now also being undertaken within the construction sector. This paper discusses three recent interlocked projects undertaken in the Australian construction industry. The first project examined the development and implementation of a safety competency framework targeted at safety-critical positions (SCPs) across first-tier construction organisations. Combining qualitative and quantitative methods, the project: developed a matrix of SCPs (n=11) and safety management tasks (SMTs; n=39); mapped the process steps for their acquisition and development; detailed the knowledge, skills and behaviours required for all SMTs; and outlined potential organisational cultural outcomes from a successful implementation of the framework. The second project extended this research to develop behavioural guidelines for leaders to drive safety culture change down to second-tier companies and to assist them to customise their own competency framework and implementation guidelines to match their aspirations and resources. The third interlocked project explored the use of safety effectiveness indicators (SEIs) as an industry-relevant assessment tool for reducing risk on construction sites. With direct linkages to safety competencies and SMTs, the SEIs are the next step towards an integrated safety-culture approach and extend the concept of positive performance indicators (PPIs) by providing a valid, reliable and user-friendly measurement platform. Taken together, the results of the interlocked projects suggest that industry-engaged, collaborative safety culture research has many potential benefits for the construction industry.
Abstract:
Applying ice or other forms of topical cooling is a popular method of treating sports injuries. It is commonplace for athletes to return to competitive activity shortly or immediately after the application of a cold treatment. In this article, we examine the effect of local tissue cooling on outcomes relating to functional performance and discuss their relevance to the sporting environment. A computerized literature search, citation tracking and hand search were performed up to April 2011. Eligible studies were trials involving healthy human participants, describing the effects of cooling on outcomes relating to functional performance. Two reviewers independently assessed the validity of included trials and calculated effect sizes. Thirty-five trials met the inclusion criteria; all had a high risk of bias. The mean sample size was 19. Meta-analyses were not undertaken due to clinical heterogeneity. The majority of studies used cooling durations >20 minutes. Strength (peak torque/force) was reported by 25 studies, with approximately 75% recording a decrease in strength immediately following cooling. There was evidence from six studies that cooling adversely affected speed, power and agility-based running tasks; two studies found this was negated with a short rewarming period. There was conflicting evidence on the effect of cooling on isolated muscular endurance. A small number of studies found that cooling decreased upper limb dexterity and accuracy. The current evidence base suggests that athletes will probably be at a performance disadvantage if they return to activity immediately after cooling. This is based on cooling for longer than 20 minutes, which may exceed the durations employed in some sporting environments. In addition, some of the reported changes were clinically small and may only be relevant in elite sport. Until better evidence is available, practitioners should use short cooling applications and/or undertake a progressive warm up prior to returning to play.
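The reviewers calculated an effect size for each eligible trial. As an illustrative sketch only (hypothetical torque values, not data from any included study), a standardised mean difference with Hedges' small-sample correction can be computed as follows:

```python
# Illustrative sketch (hypothetical numbers): a standardised mean difference
# (Cohen's d with Hedges' small-sample correction) of the kind reviewers
# compute for cooled vs non-cooled strength outcomes.
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardised mean difference between cooled and control conditions."""
    sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sd_pooled
    correction = 1 - 3 / (4 * (n1 + n2) - 9)   # Hedges' g small-sample correction
    return d * correction

# Hypothetical peak torque (Nm) after 20 min cooling vs no cooling, n = 19 per arm.
print(round(hedges_g(m1=142.0, sd1=25.0, n1=19, m2=158.0, sd2=27.0, n2=19), 2))
```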
Abstract:
Unlicensed driving remains a serious problem in many jurisdictions, and while it does not play a direct causative role in road crashes, it undermines driver licensing systems and is linked to other high-risk driving behaviours. Roadside licence check surveys represent the most direct means of estimating the prevalence of unlicensed driving. The current study involved the Queensland Police Service (QPS) checking the licences of 3,112 drivers intercepted at random breath testing operations across Queensland between February and April 2010. Data were matched with official licensing records from Transport and Main Roads (TMR) via the drivers’ licence numbers. In total, 2,914 (93.6%) records were matched, with the majority of the 198 unmatched cases representing international or interstate licence holders (n = 156), leaving 42 unknown cases. Among the drivers intercepted at the roadside, 20 (0.6%) were identified as being unlicensed at the time, while a further 11 (0.4%) were driving unaccompanied on a Learner Licence. However, the examination of TMR licensing records revealed that an additional 9 individuals (0.3%) had a current licence sanction but were not identified as unlicensed by QPS. Thus, in total, 29 of the drivers were unlicensed at the time, representing 0.9% of all the drivers intercepted and 1% of those whose licence records could be checked. This is considerably lower than the involvement of unlicensed drivers in fatal and serious injury crashes in Queensland, which is consistent with other research confirming the increased crash risk of this group. However, the number of unmatched records suggests that the on-road survey may have under-estimated the prevalence of unlicensed driving, so further development of the survey method is recommended.
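The prevalence figures above follow directly from the reported counts; the short sketch below simply reproduces that arithmetic (all numbers are taken from the abstract):

```python
# Reproducing the prevalence arithmetic reported above (figures from the abstract).
intercepted = 3112          # drivers checked at roadside
matched = 2914              # records matched against TMR licensing data
unlicensed_roadside = 20    # identified as unlicensed by QPS at the roadside
learner_unaccompanied = 11  # driving unaccompanied on a Learner Licence
sanction_missed = 9         # current licence sanction found only in TMR records

total_unlicensed = unlicensed_roadside + sanction_missed   # 29 drivers
print(f"{total_unlicensed / intercepted:.1%} of all drivers intercepted")   # ~0.9%
print(f"{total_unlicensed / matched:.1%} of drivers with matched records")  # ~1.0%
```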
Abstract:
Many governments throughout the world rely heavily on traffic law enforcement programs to modify driver behaviour and enhance road safety. There are two related functions of traffic law enforcement, apprehension and deterrence, and these are achieved through three processes: the establishment of traffic laws, the policing of those laws, and the application of penalties and sanctions to offenders. Traffic policing programs can vary by visibility (overt or covert) and deployment methods (scheduled and non-scheduled), while sanctions can serve to constrain, deter or reform offending behaviour. This chapter will review the effectiveness of traffic law enforcement strategies from the perspective of a range of high-risk, illegal driving behaviours including drink/drug driving, speeding, seat belt use and red light running. Additionally, this chapter discusses how traffic police are increasingly using technology to enforce traffic laws and thus reduce crashes. The chapter concludes that effective traffic policing involves a range of both overt and covert operations and includes a mix of automatic and more traditional manual enforcement methods. It is important to increase both the perceived and actual risk of detection by ensuring that traffic law enforcement operations are sufficiently intensive, unpredictable in nature and conducted as widely as possible across the road network. A key means of maintaining the unpredictability of operations is through the random deployment of enforcement and/or the random checking of drivers. The impact of traffic enforcement is also heightened when it is supported by public education campaigns. In the future, technological improvements will allow the use of more innovative enforcement strategies. Finally, further research is needed to continue the development of traffic policing approaches and address emerging road safety issues.
Abstract:
Introduction: In Singapore, motorcycle crashes account for 50% of traffic fatalities and 53% of injuries. While extensive research efforts have been devoted to improving motorcycle safety, the relationship between rider behavior and crash risk is still not well understood. The objective of this study is to evaluate how behavioral factors influence crash risk and to identify the most vulnerable group of motorcyclists. Methods: To explore rider behavior, a 61-item questionnaire examining sensation seeking (Zuckerman et al., 1978), impulsiveness (Eysenck et al., 1985), aggressiveness (Buss & Perry, 1992), and risk-taking behavior (Weber et al., 2002) was developed. A total of 240 respondents with at least one year of riding experience formed the sample, relating behavior to crash history, traffic penalty awareness, and demographic characteristics. By clustering crash risk using the medoid partitioning algorithm, a log-linear model relating rider behavior to crash risk was developed. Results and Discussion: Crash-involved motorcyclists scored higher in impulsive sensation seeking, aggression and risk-taking behavior. Aggressive and high risk-taking motorcyclists were 1.30 and 2.21 times more likely, respectively, to fall into the high crash-involvement group, while impulsive sensation seeking was not found to be significant. Based on the scores on risk-taking and aggression, the motorcyclists were clustered into four distinct personality combinations, namely: extrovert (aggressive, impulsive risk-takers), leader (cautious, aggressive risk-takers), follower (agreeable, ignorant risk-takers), and introvert (self-conscious, fainthearted risk-takers). “Extrovert” motorcyclists were most prone to crashes, being 3.34 times more likely to be involved in a crash and 8.29 times more vulnerable than the “introvert” group. Mediating factors such as demographic characteristics, riding experience, and traffic penalty awareness were not found to be significant in reducing crash risk. Conclusion: The findings of this study will be useful for road safety campaign planners in targeting this group more precisely, as well as for those who employ motorcyclists in their delivery businesses.
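The study groups riders by crash involvement with a medoid partitioning (k-medoids) algorithm before fitting the log-linear model. The sketch below is a minimal PAM-style assign/update loop in NumPy, using synthetic scores and Euclidean distance as illustrative assumptions; it is not the study's implementation:

```python
# Minimal k-medoids (PAM-style) sketch illustrating the kind of medoid
# partitioning used to group riders by crash involvement. Scores are synthetic
# and Euclidean distance is assumed; this is not the study's code.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical standardised scores per rider: [aggression, risk-taking, crash count]
scores = np.vstack([rng.normal(0.0, 1.0, size=(120, 3)),
                    rng.normal(1.5, 1.0, size=(120, 3))])

def k_medoids(X, k=2, iters=50):
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    medoids = rng.choice(len(X), size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(dist[:, medoids], axis=1)               # assign to nearest medoid
        new_medoids = []
        for j in range(k):
            members = np.flatnonzero(labels == j)
            within = dist[np.ix_(members, members)].sum(axis=0)    # total distance to cluster mates
            new_medoids.append(members[np.argmin(within)])         # most central member becomes medoid
        new_medoids = np.array(new_medoids)
        if np.array_equal(np.sort(new_medoids), np.sort(medoids)):
            break
        medoids = new_medoids
    return labels, medoids

labels, medoids = k_medoids(scores)
print("cluster sizes:", np.bincount(labels))
```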
Abstract:
Enterococci are versatile Gram-positive bacteria that can survive under extreme conditions. Most enterococci are non-virulent and found in the gastrointestinal tract of humans and animals. Other strains are opportunistic pathogens that contribute to a large number of nosocomial infections globally. Epidemiological studies have demonstrated a direct relationship between the density of enterococci in surface waters and the risk of swimmer-associated gastroenteritis. The distribution of infectious enterococcal strains from the hospital environment or other sources to environmental water bodies, through sewage discharge or other means, could increase the prevalence of these strains in the human population. Environmental water quality studies may benefit from focusing on a subset of Enterococcus spp. that are consistently associated with sources of faecal pollution such as domestic sewage, rather than testing for the entire genus. E. faecalis and E. faecium are potentially good focal species for such studies, as they have been consistently identified as the dominant Enterococcus spp. in human faeces and sewage. Moreover, enterococcal infections are predominantly caused by E. faecalis and E. faecium. The characterisation of E. faecalis and E. faecium is important in studying their population structures, particularly in environmental samples. By developing and implementing rapid, robust molecular genotyping techniques, it is possible to more accurately establish the relationship between human and environmental enterococci. Of particular importance is determining the distribution of high-risk enterococcal clonal complexes, such as E. faecium clonal complex 17 and E. faecalis clonal complexes 2 and 9, in recreational waters. These clonal complexes are recognized as particularly pathogenic enterococcal genotypes that cause severe disease in humans globally. The Pimpama-Coomera watershed is located in South East Queensland, Australia, and was investigated in this study mainly because it is used intensively for agricultural and recreational purposes and is subject to strong anthropogenic impacts. The primary aim of this study was to develop novel, universally applicable, robust, rapid and cost-effective genotyping methods which are likely to yield more definitive results for the routine monitoring of E. faecalis and E. faecium, particularly in environmental water sources. To fulfil this aim, new genotyping methods were developed based on the interrogation of highly informative single nucleotide polymorphisms (SNPs) located in housekeeping genes of both E. faecalis and E. faecium. SNP genotyping was successfully applied in field investigations of the Coomera watershed, South East Queensland, Australia. E. faecalis and E. faecium isolates were grouped into 29 and 23 SNP profiles, respectively. This study showed the high longitudinal diversity of E. faecalis and E. faecium over a period of two years, and both human-related and human-specific SNP profiles were identified. Furthermore, 4.25% of E. faecium strains isolated from water were found to correspond to the clinically important clonal complex 17 (CC17). Strains that belong to CC17 cause the majority of hospital outbreaks and clinical infections globally. Of the six sampling sites of the Coomera River, Paradise Point had the highest number of human-related and human-specific E. faecalis and E. faecium SNP profiles. The secondary aim of this study was to determine the antibiotic-resistance profiles and virulence traits associated with environmental E. faecalis and E. faecium
isolates compared to human pathogenic E. faecalis and E. faecium isolates. This was performed to predict the potential health risks associated with coming into contact with these strains in the Coomera watershed. In general, clinical isolates were found to be more resistant to all the antibiotics tested compared to water isolates, and they harbored more virulence traits. Multi-drug resistance was more prevalent in clinical isolates (71.18% of E. faecalis and 70.3% of E. faecium) compared to water isolates (only 5.66% of E. faecium). However, tetracycline, gentamicin, ciprofloxacin and ampicillin resistance was observed in water isolates. The virulence gene esp was the most prevalent virulence determinant observed in clinical isolates (67.79% of E. faecalis and 70.37% of E. faecium), and this gene has been described as a human-specific marker used for microbial source tracking (MST). The presence of esp in water isolates (16.36% of E. faecalis and 19.14% of E. faecium) could be indicative of human faecal contamination in these waterways. Finally, in order to compare overall gene expression between environmental and clinical strains of E. faecalis, a comparative gene hybridization study was performed. The results of this investigation clearly demonstrated the up-regulation of genes associated with pathogenicity in E. faecalis isolated from water. The expression study was performed at physiological temperature relative to ambient temperature. The up-regulation of virulence genes demonstrates that environmental strains of E. faecalis can pose an increased health risk and lead to serious disease, particularly if these strains belong to the virulent CC17 group. The genotyping techniques developed in this study not only provide a rapid, robust and highly discriminatory tool to characterize E. faecalis and E. faecium, but also enable the efficient identification of virulent enterococci that are distributed in environmental water sources.
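A SNP-based profile is simply the combination of alleles an isolate carries at the interrogated housekeeping-gene positions, so isolates with identical allele combinations fall into the same profile. The sketch below illustrates that grouping step with hypothetical isolates and loci (not the study's actual SNP panel):

```python
# Illustrative sketch: grouping isolates into SNP profiles from allele calls at
# a panel of housekeeping-gene SNP positions. The isolate IDs and alleles below
# are hypothetical placeholders, not the SNP set developed in the study.
from collections import defaultdict

snp_calls = {
    # isolate ID -> allele at each interrogated SNP position
    "water_001":   ("A", "G", "T", "C"),
    "water_002":   ("A", "G", "T", "C"),
    "clinical_07": ("G", "G", "C", "C"),
    "water_015":   ("G", "G", "C", "C"),
    "clinical_12": ("A", "A", "T", "C"),
}

profiles = defaultdict(list)
for isolate, alleles in snp_calls.items():
    profiles[alleles].append(isolate)   # identical allele combinations = same SNP profile

for i, (profile, members) in enumerate(sorted(profiles.items()), start=1):
    print(f"SNP profile {i}: {'-'.join(profile)} -> {members}")
```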
Abstract:
The somatosensory system plays an important role in balance control and age-related changes to this system have been implicated in falls. Parkinson’s disease (PD) is a chronic and progressive disease of the brain, characterized by postural instability and gait disturbance. Previous research has shown that deficiencies in somatosensory feedback may contribute to the poorer postural control demonstrated by PD individuals. However, few studies have comprehensively explored differences in somatosensory function and postural control between PD participants and healthy older individuals. The soles of the feet contain many cutaneous mechanoreceptors that provide important somatosensory information sources for postural control. Different types of insole devices have been developed to enhance this somatosensory information and improve postural stability, but these devices are often too complex and expensive to integrate into daily life. Textured insoles provide a more passive intervention that may be an inexpensive and accessible means to enhance the somatosensory input from the plantar surface of the feet. However, to date, there has been little work conducted to test the efficacy of enhanced somatosensory input induced by textured insoles in both healthy and PD populations during standing and walking. Therefore, the aims of this thesis were to determine: 1) whether textured insole surfaces can improve postural stability by enhancing somatosensory information in younger and older adults, 2) the differences between healthy older participants and PD participants for measures of physiological function and postural stability during standing and walking, 3) how changes in somatosensory information affect postural stability in both groups during standing and walking; and 4), whether textured insoles can improve postural stability in both groups during standing and walking. To address these aims, Study 1 recruited seven older individuals and ten healthy young controls to investigate the effects of two textured insole surfaces on postural stability while performing standing balance tests on a force plate. Participants were tested under three insole surface conditions: 1) barefoot; 2) standing on a hard textured insole surface; and 3), standing on a soft textured insole surface. Measurements derived from the centre of pressure displacement included the range of anterior-posterior and medial-lateral displacement, path length and the 90% confidence elliptical area (C90 area). Results of study 1 revealed a significant Group*Surface*Insole interaction for the four measures. Both textured insole surfaces reduced postural sway for the older group, especially in the eyes closed condition on the foam surface. However, participants reported that the soft textured insole surface was more comfortable and, hence, the soft textured insoles were adopted for Studies 2 and 3. For Study 2, 20 healthy older adults (controls) and 20 participants with Parkinson’s disease were recruited. Participants were evaluated using a series of physiological assessments that included touch sensitivity, vibratory perception, and pain and temperature threshold detection. Furthermore, nerve function and somatosensory evoked potentials tests were utilized to provide detailed information regarding peripheral nerve function for these participants. Standing balance and walking were assessed on different surfaces using a force plate and the 3D Vicon motion analysis system, respectively. 
Data derived from the force plate included the range of anterior-posterior and medial-lateral sway, while measures of stride length, stride period, cadence, double support time, stance phase, velocity and stride timing variability were reported for the walking assessment. The results of this study demonstrated that the PD group had decrements in somatosensory function compared to the healthy older control group. For electrodiagnosis, PD participants had poorer nerve function than controls, as evidenced by slower nerve conduction velocities and longer latencies in sural nerve and prolonged latency in the P37 somatosensory evoked potential. Furthermore, the PD group displayed more postural sway in both the anterior-posterior and medial-lateral directions relative to controls and these differences were increased when standing on a foam surface. With respect to the gait assessment, the PD group took shorter strides and had a reduced stride period compared with the control group. Furthermore, the PD group spent more time in the stance phase and had increased cadence and stride timing variability than the controls. Compared with walking on the firm surface, the two groups demonstrated different gait adaptations while walking on the uneven surface. Controls increased their stride length and stride period and decreased their cadence, which resulted in a consistent walking velocity on both surfaces. Conversely, while the PD patients also increased their stride period and decreased their cadence and stance period on the uneven surface, they did not increase their stride length and, hence walked slower on the uneven surface. In the PD group, there was a strong positive association between decreased somatosensory function and decreased clinical balance, as assessed by the Tinetti test. Poorer somatosensory function was also strongly positively correlated with the temporospatial gait parameters, especially shorter stride length. Study 3 evaluated the effects of manipulating the somatosensory information from the plantar surface of the feet using textured insoles in the same populations assessed in Study 2. For this study, participants performed the standing and walking balance tests under three footwear conditions: 1) barefoot; 2) with smooth insoles; and 3), with textured insoles. Standing balance and walking were evaluated using a force plate and a Vicon motion analysis system and the data were analysed in the same way outlined for Study 2. The findings showed that the smooth and textured insoles caused different effects on postural control during both the standing and walking trials. Both insoles decreased medial-lateral sway to the same level on the firm surface. The greatest benefits were observed in the PD group while wearing the textured insole. When standing under a more challenging condition on the foam surface with eyes closed, only the textured insole decreased medial-lateral sway in the PD group. With respect to the gait trials, both insoles increased walking velocity, stride length and stride time and decreased cadence, but these changes were more pronounced for the textured insoles. The effects of the textured insoles were evident under challenging conditions in the PD group and increased walking velocity and stride length, while decreasing cadence. Textured insoles were also effective in reducing the time spent in the double support and stance phases of the gait cycle and did not increase stride timing variability, as was the case for the smooth insoles for the PD group. 
The results of this study suggest that textured insoles, such as those evaluated in this research, may provide a low-cost means of improving postural stability in high-risk groups, such as people with PD, and may therefore serve as an important intervention to prevent falls.
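Study 1 above summarises standing balance with centre-of-pressure (COP) measures: anterior-posterior and medial-lateral range, path length, and a 90% confidence ellipse (C90) area. The sketch below computes these from a synthetic COP trace; the ellipse uses the usual covariance/chi-square construction, which is an assumption about the exact formula used in the thesis:

```python
# Illustrative sketch (synthetic data): the centre-of-pressure measures named in
# Study 1. The 90% ellipse follows the standard covariance/chi-square
# construction, assumed here rather than taken from the thesis.
import numpy as np

rng = np.random.default_rng(1)
cop = np.cumsum(rng.normal(0, 0.05, size=(3000, 2)), axis=0)  # [AP, ML] in cm, 30 s at 100 Hz

ap, ml = cop[:, 0], cop[:, 1]
range_ap = ap.max() - ap.min()                                  # anterior-posterior range
range_ml = ml.max() - ml.min()                                  # medial-lateral range
path_length = np.sum(np.linalg.norm(np.diff(cop, axis=0), axis=1))

chi2_90_2df = 4.605                       # chi-square critical value, 2 df, p = 0.90
c90_area = np.pi * chi2_90_2df * np.sqrt(np.linalg.det(np.cov(ap, ml)))

print(f"AP range {range_ap:.2f} cm, ML range {range_ml:.2f} cm, "
      f"path length {path_length:.1f} cm, C90 area {c90_area:.2f} cm^2")
```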
Abstract:
Marginalised young people have been consistently identified as a high-risk group in relation to sexual health. This research, undertaken through the Youth Affairs Network of Queensland, seeks to explore what impacts on youth workers’ ability to provide effective interventions around sexual health. What knowledge, skills, resources, values and ethics, training and support are available to youth workers? What do youth workers identify that they need, and what workforce development strategies are recommended to enable the youth sector to respond more effectively? This project report provides a snapshot of, and introduction to, the key themes raised by youth workers and other key stakeholders in Queensland, Australia.
Abstract:
Alcohol use disorders (AUDs) are a major public health problem, and the few treatment options available to those seeking treatment offer only modest success rates. There remains a need to identify novel targets for the treatment of AUDs. The neuronal nicotinic acetylcholine receptors (nAChRs) represent a potential therapeutic target in the brain, as recent human genetic studies have implicated gene variants of the α5 nAChR subunit as high-risk factors for developing alcohol dependence. Here, we evaluate the role of α5* nAChRs in ethanol-mediated behaviors using α5+/+ and α5-/- mice. We characterized the effect of hypnotic doses of ethanol and investigated drinking behavior using an adapted Drinking-in-the-Dark (DID) paradigm that has been shown to induce high ethanol consumption in mice. We found the α5 subunit to be critical in mediating the sedative effects of ethanol. The α5-/- mice showed slower recovery from ethanol-induced sleep, as measured by loss of righting reflex. Additionally, the α5-/- mice showed greater impairment on an ethanol-induced ataxia task. We found the initial sensitivity to ethanol and ethanol metabolism to be similar in both α5+/+ and α5-/- mice. Hence, the enhanced sedation is likely due to a difference in acute tolerance to ethanol in mice deficient in the α5 subunit. However, the α5 subunit did not play a role in ethanol consumption at ethanol concentrations ranging from 5% to 30% in the DID paradigm. Additionally, varenicline (Chantix®) was effective in reducing ethanol intake in α5-/- mice. Together, our data suggest that the α5 nAChR subunit is important for the sedative effects of hypnotic doses of ethanol but does not play a role in ethanol consumption. Varenicline can be a treatment option even when there is loss of function of the α5 nAChR subunit.
Abstract:
Exceeding the speed limit and driving too fast for the conditions are regularly cited as significant contributing factors in traffic crashes, particularly fatal and serious injury crashes. Despite an extensive body of research highlighting the relationship between increased vehicle speeds and crash risk and severity, speeding remains a pervasive behaviour on Australian roads. The development of effective countermeasures designed to reduce the prevalence of speeding behaviour requires that this behaviour is well understood. The primary aim of this program of research was to develop a better understanding of the influence of drivers’ perceptions and attitudes toward police speed enforcement on speeding behaviour. Study 1 employed focus group discussions with 39 licensed drivers to explore the influence of perceptions relating to specific characteristics of speed enforcement policies and practices on drivers’ attitudes towards speed enforcement. Three primary factors were identified as being most influential: site selection; visibility; and automaticity (i.e., whether the enforcement approach is automated/camera-based or manually operated). Perceptions regarding these enforcement characteristics were found to influence attitudes regarding the perceived legitimacy and transparency of speed enforcement. Moreover, misperceptions regarding speed enforcement policies and practices appeared to also have a substantial impact on attitudes toward speed enforcement, typically in a negative direction. These findings have important implications for road safety given that prior research has suggested that the effectiveness of speed enforcement approaches may be reduced if efforts are perceived by drivers as being illegitimate, such that they do little to encourage voluntary compliance. Study 1 also examined the impact of speed enforcement approaches varying in the degree of visibility and automaticity on self-reported willingness to comply with speed limits. These discussions suggested that all of the examined speed enforcement approaches (see Section 1.5 for more details) generally showed potential to reduce vehicle speeds and encourage compliance with posted speed limits. Nonetheless, participant responses suggested a greater willingness to comply with approaches operated in a highly visible manner, irrespective of the corresponding level of automaticity of the approach. While less visible approaches were typically associated with poorer rates of driver acceptance (e.g., perceived as “sneaky” and “unfair”), participants reported that such approaches would likely encourage long-term and network-wide impacts on their own speeding behaviour, as a function of the increased unpredictability of operations and increased direct (specific deterrence) and vicarious (general deterrence) experiences with punishment. Participants in Study 1 suggested that automated approaches, particularly when operated in a highly visible manner, do little to encourage compliance with speed limits except in the immediate vicinity of the enforcement location. While speed cameras have been criticised on such grounds in the past, such approaches can still have substantial road safety benefits if implemented in high-risk settings. Moreover, site-learning effects associated with automated approaches can also be argued to be a beneficial by-product of enforcement, such that behavioural modifications are achieved even in the absence of actual enforcement. 
Conversely, manually operated approaches were reported to be associated with more network-wide impacts on behaviour. In addition, the reported acceptance of such methods was high, due to the increased swiftness of punishment, ability for additional illegal driving behaviours to be policed and the salutary influence associated with increased face-to-face contact with authority. Study 2 involved a quantitative survey conducted with 718 licensed Queensland drivers from metropolitan and regional areas. The survey sought to further examine the influence of the visibility and automaticity of operations on self-reported likelihood and duration of compliance. Overall, the results from Study 2 corroborated those of Study 1. All examined approaches were again found to encourage compliance with speed limits, such that all approaches could be considered to be “effective”. Nonetheless, significantly greater self-reported likelihood and duration of compliance was associated with visibly operated approaches, irrespective of the corresponding automaticity of the approach. In addition, the impact of automaticity was influenced by visibility; such that significantly greater self-reported likelihood of compliance was associated with manually operated approaches, but only when they are operated in a less visible fashion. Conversely, manually operated approaches were associated with significantly greater durations of self-reported compliance, but only when they are operated in a highly visible manner. Taken together, the findings from Studies 1 and 2 suggest that enforcement efforts, irrespective of their visibility or automaticity, generally encourage compliance with speed limits. However, the duration of these effects on behaviour upon removal of the enforcement efforts remains questionable and represents an area where current speed enforcement practices could possibly be improved. Overall, it appears that identifying the optimal mix of enforcement operations, implementing them at a sufficient intensity and increasing the unpredictability of enforcement efforts (e.g., greater use of less visible approaches, random scheduling) are critical elements of success. Hierarchical multiple regression analyses were also performed in Study 2 to investigate the punishment-related and attitudinal constructs that influence self-reported frequency of speeding behaviour. The research was based on the theoretical framework of expanded deterrence theory, augmented with three particular attitudinal constructs. Specifically, previous research examining the influence of attitudes on speeding behaviour has typically focussed on attitudes toward speeding behaviour in general only. This research sought to more comprehensively explore the influence of attitudes by also individually measuring and analysing attitudes toward speed enforcement and attitudes toward the appropriateness of speed limits on speeding behaviour. Consistent with previous research, a number of classical and expanded deterrence theory variables were found to significantly predict self-reported frequency of speeding behaviour. Significantly greater speeding behaviour was typically reported by those participants who perceived punishment associated with speeding to be less certain, who reported more frequent use of punishment avoidance strategies and who reported greater direct experiences with punishment. A number of interesting differences in the significant predictors among males and females, as well as younger and older drivers, were reported. 
Specifically, classical deterrence theory variables appeared most influential on the speeding behaviour of males and younger drivers, while expanded deterrence theory constructs appeared more influential for females. These findings have important implications for the development and implementation of speeding countermeasures. Of the attitudinal factors, significantly greater self-reported frequency of speeding behaviour was reported among participants who held more favourable attitudes toward speeding and who perceived speed limits to be set inappropriately low. Disappointingly, attitudes toward speed enforcement were found to have little influence on reported speeding behaviour, over and above the other deterrence theory and attitudinal constructs. Indeed, the relationship between attitudes toward speed enforcement and self-reported speeding behaviour was completely accounted for by attitudes toward speeding. Nonetheless, the complexity of attitudes toward speed enforcement is not yet fully understood and future research should more comprehensively explore the measurement of this construct. Finally, given the wealth of evidence (both in general and emerging from this program of research) highlighting the association between punishment avoidance and speeding behaviour, Study 2 also sought to investigate the factors that influence the self-reported propensity to use punishment avoidance strategies. A standard multiple regression analysis was conducted for exploratory purposes only. The results revealed that punishment-related and attitudinal factors significantly predicted approximately one-fifth of the variance in the dependent variable. The perceived ability to avoid punishment, vicarious punishment experience, vicarious punishment avoidance and attitudes toward speeding were all significant predictors. Future research should examine these relationships more thoroughly and identify additional influential factors. In summary, the current program of research has a number of implications for road safety and speed enforcement policy and practice decision-making. The research highlights a number of potential avenues for the improvement of public education regarding enforcement efforts and provides a number of insights into punishment avoidance behaviours. In addition, the research adds strength to the argument that enforcement approaches should not only demonstrate effectiveness in achieving key road safety objectives, such as reduced vehicle speeds and associated crashes, but also strive to be transparent and legitimate, such that voluntary compliance is encouraged. A number of potential strategies are discussed (e.g., point-to-point speed cameras, intelligent speed adaptation). The correct mix and intensity of enforcement approaches appears critical for achieving optimum effectiveness from enforcement efforts, as well as enhancements in the unpredictability of operations and swiftness of punishment. Achievement of these goals should increase both the general and specific deterrent effects associated with enforcement through an increased perceived risk of detection and a more balanced exposure to punishment and punishment avoidance experiences.
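Study 2's hierarchical multiple regression enters blocks of predictors in stages and examines the change in explained variance. The sketch below illustrates that blockwise approach with statsmodels on synthetic data; the variable names are hypothetical stand-ins for the deterrence and attitudinal constructs described above:

```python
# Illustrative sketch (synthetic data, hypothetical variable names): a
# hierarchical (blockwise) regression of self-reported speeding frequency,
# entering deterrence-theory predictors first and attitudinal predictors second.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 718
df = pd.DataFrame({
    "certainty": rng.normal(size=n),          # perceived certainty of punishment
    "avoidance": rng.normal(size=n),          # use of punishment avoidance strategies
    "direct_punishment": rng.normal(size=n),  # direct experiences with punishment
    "att_speeding": rng.normal(size=n),       # attitudes toward speeding
    "att_limits": rng.normal(size=n),         # perceived appropriateness of speed limits
})
df["speeding_freq"] = (-0.3 * df["certainty"] + 0.4 * df["avoidance"]
                       + 0.5 * df["att_speeding"] + rng.normal(size=n))

block1 = ["certainty", "avoidance", "direct_punishment"]
block2 = block1 + ["att_speeding", "att_limits"]

m1 = sm.OLS(df["speeding_freq"], sm.add_constant(df[block1])).fit()
m2 = sm.OLS(df["speeding_freq"], sm.add_constant(df[block2])).fit()
print(f"Block 1 R^2 = {m1.rsquared:.3f}; "
      f"R^2 change after adding attitudes = {m2.rsquared - m1.rsquared:.3f}")
```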
Abstract:
Objective. The aim of this paper is to report the clinical practice changes resulting from strategies to standardise diabetic foot clinical management in three diverse ambulatory service sites in Queensland, Australia. Methods. Multifaceted strategies were implemented in 2008, including: multidisciplinary teams, clinical pathways, clinical training, clinical indicators, and telehealth support. Prior to the intervention, none of the aforementioned strategies were used, except that one site had a basic multidisciplinary team. A retrospective audit of consecutive patient records from July 2006 to June 2007 determined baseline clinical activity (n = 101). A clinical pathway teleform was implemented as a clinical activity analyser in 2008 (n = 327) and followed up in 2009 (n = 406). Pre- and post-implementation data were analysed using Chi-square tests with a significance level set at P < 0.05. Results. There was an improvement in surveillance of the high-risk population of 34% in 2008 and 19% in 2009, and in treating according to risk of 15% in 2009 (P < 0.05). The documentation of all best-practice clinical activities performed improved by 13–66% (P < 0.03). Conclusion. These findings support the use of multifaceted strategies to standardise practice and improve the management of diabetic foot complications in diverse ambulatory services.
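The pre/post comparisons above are proportions tested with Chi-square tests at P < 0.05. As a minimal illustrative sketch (hypothetical counts, not the audit data), such a comparison can be run as follows:

```python
# Illustrative sketch (hypothetical counts): comparing the proportion of patients
# with a best-practice activity documented before (2007 audit, n = 101) and after
# (2008 teleform, n = 327) implementation, using a chi-square test as above.
from scipy.stats import chi2_contingency

#             documented, not documented
pre_2007  = [45, 56]     # hypothetical baseline counts (n = 101)
post_2008 = [260, 67]    # hypothetical follow-up counts (n = 327)

chi2, p, dof, expected = chi2_contingency([pre_2007, post_2008])
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
```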
Abstract:
Background Diabetic foot complications are recognised as the most common reason for diabetes-related hospitalisation and lower extremity amputations. Multi-faceted strategies to reduce diabetic foot hospitalisation and amputation rates have been successful. However, most diabetic foot ulcers are managed in ambulatory settings, where data availability is poor and studies are limited. The project aimed to develop and evaluate strategies to improve the management of diabetic foot complications in three diverse ambulatory settings and to measure the subsequent impact on hospitalisation and amputation. Methods Multi-faceted strategies were implemented in 2008, including: multi-disciplinary teams, clinical pathways and training, clinical indicators, telehealth support and surveys. A retrospective audit of consecutive patient records from July 2006 to June 2007 determined baseline clinical indicators (n = 101). A clinical pathway teleform was implemented as a clinical record and clinical indicator analyser in all sites in 2008 (n = 327) and followed up in 2009 (n = 406). Results Prior to the intervention, clinical pathways were not used and multi-disciplinary teams were limited. There was an absolute improvement in treating according to risk of 15% in 2009, and in surveillance of the high-risk population of 34% and 19% in 2008 and 2009 respectively (p < 0.001). Improvements of 13–66% (p < 0.001) were recorded in 2008 for individual clinical activities, to a performance of >92% in perfusion, ulcer depth, infection assessment and management, offloading and education. Hospitalisation impacts included reductions of up to 64% in amputation rates per 100,000 population (p < 0.001) and of 24% in average length of stay (p < 0.001). Conclusion These findings support the use of multi-faceted strategies in diverse ambulatory services to standardise practice, improve the management of diabetic foot complications and positively impact hospitalisation outcomes. As of October 2010, these strategies had been rolled out to over 25 ambulatory sites, representing 66% of Queensland Health districts, managing 1,820 patients and 13,380 occasions of service, including 543 healed ulcer patients. It is expected that this number will rise dramatically as an incentive payment for the use of the teleform is expanded.
Abstract:
Background: As low HDL cholesterol levels are a risk factor for cardiovascular disease, raising HDL cholesterol substantially by inhibiting or modulating cholesteryl ester transfer protein (CETP) may be useful in coronary artery disease. The first CETP inhibitor to enter clinical trials, torcetrapib, was shown to increase the levels of HDL cholesterol, but it also increased adverse cardiovascular outcomes, probably due to an increase in blood pressure and aldosterone secretion via an off-target mechanism(s). Objective/methods: Dalcetrapib is a new CETP modulator that increases the levels of HDL cholesterol but does not increase blood pressure or aldosterone secretion. The objective was to evaluate a paper describing the effects of dalcetrapib on carotid and aortic wall thickness in subjects with, or at high risk of, coronary artery disease: the dal-PLAQUE study. Results: dal-PLAQUE showed that dalcetrapib reduced the progression of atherosclerosis, and may also reduce the associated vascular inflammation, in subjects with, or at high risk of, coronary heart disease who were already taking statins. Conclusions: These results suggest that modulating CETP with dalcetrapib may be a beneficial mechanism in cardiovascular disease. The results of the dal-HEART series, which includes dal-PLAQUE 1 and 2 and dal-OUTCOMES, when complete, will provide more definitive information about the benefit, or otherwise, of dalcetrapib in coronary artery disease.
Abstract:
Purpose: To determine whether there is a difference in neuroretinal function and in macular pigment optical density between persons with high- and low-risk gene variants for age-related macular degeneration (AMD) and no ophthalmoscopic signs of AMD, and to compare neuroretinal function with that of patients with manifest early AMD. Methods and Participants: Neuroretinal function was assessed with the multifocal electroretinogram (mfERG) for 32 participants (22 healthy persons with no AMD and 10 early AMD patients). The 22 healthy participants with no AMD had high- or low-risk genotypes for either CFH (rs380390) and/or ARMS2 (rs10490924). Trough-to-peak response densities and peak-implicit times were analyzed in 5 concentric rings. Macular pigment optical density was assessed by customized heterochromatic flicker photometry. Results: Trough-to-peak response densities for concentric rings 1 to 3 were, on average, significantly greater in participants with high-risk genotypes than in participants with low-risk genotypes and in persons with early AMD, after correction for age and smoking (p < 0.05). The group peak-implicit times for ring 1 were, on average, delayed in the patients with early AMD compared with the participants with high- or low-risk genotypes, although these differences were not significant. There was no significant correlation between genotypes and macular pigment optical density. Conclusion: Increased neuroretinal activity in persons who carry high-risk AMD genotypes may be due to genetically determined subclinical inflammatory and/or histological changes in the retina. Neuroretinal function in healthy persons genetically susceptible to AMD may be a useful additional early biomarker (in combination with genetics) before there is clinical manifestation.