817 results for Prevention strategies
Abstract:
Background and Aims: The evolution of resistance to herbicides is a substantial problem in contemporary agriculture. Solutions to this problem generally consist of using practices to control the resistant population once it has evolved, and/or instituting preventative measures before populations become resistant. Herbicide resistance evolves in populations over years or decades, so predicting the effectiveness of preventative strategies in particular relies on computational modelling approaches. While models of herbicide resistance already exist, none deals with the complex regional variability of the northern Australian sub-tropical grains farming region. For this reason, a new computer model was developed. Methods: The model consists of an age- and stage-structured population model of weeds, with an existing crop model used to simulate plant growth and competition, and extensions to the crop model added to simulate seed bank ecology and population genetics factors. Using awnless barnyard grass (Echinochloa colona) as a test case, the model was used to investigate the likely rate of evolution under conditions expected to produce high selection pressure. Key Results: Simulating continuous summer fallows with glyphosate used as the only means of weed control resulted in predicted resistant weed populations after approximately 15 years. Validation of the model against the paddock history of the first real-world glyphosate-resistant awnless barnyard grass population shows that the model predicted resistance evolution to within a few years of the real situation. Conclusions: Empirical validation of herbicide resistance models remains problematic, given the decades-long timescales involved. However, the model simulates the complexities of sub-tropical grains farming in Australia well, and can be used to investigate, generate and improve glyphosate resistance prevention strategies.
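The abstract describes the model's structure rather than its equations. As a minimal self-contained sketch of the general approach (our illustration, not the authors' implementation: a single haploid resistance locus, no mating or dominance, and invented parameter values), the following steps a resistant/susceptible seed bank through yearly cycles of germination, glyphosate selection and seed return under a continuous summer fallow:

```python
import numpy as np

# Illustrative parameters only -- invented, not taken from the paper.
germination = 0.4     # fraction of the seed bank emerging each summer
seed_decay = 0.3      # annual mortality of ungerminated seeds
kill_S = 0.99         # glyphosate efficacy against susceptible plants
kill_R = 0.95         # partial efficacy against resistant plants
fecundity = 200       # seeds returned per surviving plant
bank = np.array([1e6 * (1 - 1e-8),   # susceptible seeds per hectare
                 1e6 * 1e-8])        # resistant seeds per hectare

for year in range(1, 31):
    seedlings = bank * germination
    bank *= (1 - germination) * (1 - seed_decay)        # carry-over seed bank
    survivors = seedlings * np.array([1 - kill_S, 1 - kill_R])
    bank += survivors * fecundity                       # seed return to the bank
    if bank[1] / bank.sum() > 0.5:                      # resistant seeds dominate
        print(f"Resistance dominates the seed bank in year {year}")
        break
```

With these invented numbers the resistant type overtakes the seed bank in year 15, in the same ballpark as the abstract's prediction, though the timing is extremely sensitive to the efficacy and fecundity values chosen.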
Abstract:
Background The Researching Effective Approaches to Cleaning in Hospitals (REACH) study will generate evidence about the effectiveness and cost-effectiveness of a novel cleaning initiative that aims to improve the environmental cleanliness of hospitals. The initiative is an environmental cleaning bundle, with five interdependent, evidence-based components (training, technique, product, audit and communication) implemented with environmental services staff to enhance hospital cleaning practices. Methods/design The REACH study will use a stepped-wedge randomised controlled design to test the study intervention, an environmental cleaning bundle, in 11 Australian hospitals. All trial hospitals will receive the intervention and act as their own control, with analysis undertaken of the change within each hospital based on data collected in the control and intervention periods. Each site will be randomised to one of the 11 intervention timings, with staggered commencement dates in 2016 and an intervention period of between 20 and 50 weeks. All sites will complete the trial at the same time in 2017. The inclusion criteria allow for a purposive sample of both public and private hospitals that have higher-risk patient populations for healthcare-associated infections (HAIs). The primary outcome (objective one) is the monthly number of Staphylococcus aureus bacteraemias (SABs), Clostridium difficile infections (CDIs) and vancomycin-resistant enterococci (VRE) infections, per 10,000 bed days. Secondary outcomes for objective one include the thoroughness of hospital cleaning assessed using fluorescent marker technology, the bio-burden of frequent-touch surfaces post-cleaning, and changes in staff knowledge and attitudes about environmental cleaning. A cost-effectiveness analysis will determine the second key outcome (objective two): the incremental cost-effectiveness ratio from implementation of the cleaning bundle. The study uses the integrated Promoting Action on Research Implementation in Health Services (iPARIHS) framework to support the tailored implementation of the environmental cleaning bundle in each hospital. Discussion Evidence from the REACH trial will contribute to future policy and practice guidelines about hospital environmental cleaning. It will be used by healthcare leaders and clinicians to inform decision-making and implementation of best-practice infection prevention strategies to reduce HAIs in hospitals. Trial registration Australian New Zealand Clinical Trials Registry ACTRN12615000325505
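In a stepped-wedge design, randomisation determines when each site crosses from control to intervention, not whether it does. Purely to illustrate the scheduling logic (site labels, week numbers and trial length below are our invented stand-ins, not REACH's actual schedule):

```python
import random

random.seed(7)  # reproducible illustration
hospitals = [f"Hospital-{i:02d}" for i in range(1, 12)]   # 11 hypothetical sites
random.shuffle(hospitals)                                 # randomise crossover order

trial_weeks = 60                                    # illustrative common trial window
crossover_weeks = [10 + 3 * i for i in range(11)]   # staggered steps: weeks 10..40

# Every site contributes control time before its step and intervention
# time from its step until the shared end of trial.
for site, step in zip(hospitals, crossover_weeks):
    print(f"{site}: control weeks 1-{step}, intervention weeks "
          f"{step + 1}-{trial_weeks} ({trial_weeks - step} intervention weeks)")
```

Here every site acts as its own control and intervention periods span 20 to 50 weeks, matching the range stated in the abstract.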
Abstract:
What is the future for public health in the twenty-first century? Can we glean an idea about the future of public health from its past? As Winston Churchill once said: ‘[T]he further backward you look, the further forward you can see.’ What can we see in the history of public health that gives us an idea of where public health might be headed in the future? (Gruszin et al. 2012). In the twentieth century there was substantial progress in public health in Australia. These improvements were brought about through a number of factors. In part, they were due to increasing knowledge about the natural history of disease and its treatment. Added to this knowledge was a shifting focus from legislative measures to protect health towards improved promotion and prevention strategies, together with a general improvement in social and economic conditions for people living in countries such as Australia. Gruszin et al. (2012) consider the range of social and economic reforms of the twentieth century to be the most important determinants of the public’s health at the start of the twenty-first century (Gruszin et al. 2012, p. 201). The same could not, however, be said for second or third world countries, many of which still have the most fundamental sanitary and health protection issues to deal with. For example, in sub-Saharan Africa and in Russia the decline in life expectancy relates to a range of interconnected factors. In Russia, issues such as alcoholism, violence, suicide, accidents and cardiovascular disease could be contributing to the falling life expectancy (McMichael & Butler 2007). In sub-Saharan Africa, a range of factors, such as HIV/AIDS, poverty, malaria, tuberculosis, undernutrition, totally inadequate infrastructure, gender inequality, conflict and violence, political taboos and a complete lack of political will, have all contributed to a dramatic drop in life expectancy (McMichael & Butler 2007).
Abstract:
Major advances in the treatment of preterm infants have occurred during the last three decades. Survival rates have increased, and the first generations of preterm infants born at very low birth weight (VLBW; less than 1500 g) who benefited from modern neonatal intensive care are now in young adulthood. The literature shows that VLBW children achieve on average lower scores on cognitive tests, even after exclusion of individuals with obvious neurosensory deficits. Evidence also exists for an increased risk in VLBW children of various neuropsychiatric disorders such as attention-deficit hyperactivity disorder (ADHD) and related behavioral symptoms. To date, studies extending into adulthood are sparse, and it remains to be seen whether these problems persist. The aim of this thesis was to study ADHD-related symptoms and cognitive and executive functioning in young adults born at VLBW. In addition, we aimed to study sleep disturbances, known to adversely affect both cognition and attention. We hypothesized that preterm birth at VLBW interferes with early brain development in a way that alters the neuropsychological phenotype; this may manifest itself as ADHD symptoms and impaired cognitive abilities in young adulthood. In this cohort study from a geographically defined region, we studied 166 VLBW adults and 172 term-born controls born from 1978 through 1985. At ages 18 to 27 years, the study participants took part in a clinic study during which their physical and psychological health was assessed in detail. Three years later, 213 of these individuals participated in a follow-up. The current study is part of a larger research project (The Helsinki Study of Very Low Birth Weight Adults), and the measurements of interest for this particular study include the following: 1) The Adult Problem Questionnaire (APQ), a self-rating scale of ADHD-related symptoms in adults; 2) A computerized cognitive test battery designed for population studies (CogState®), which measures core cognitive abilities such as reaction time, working memory, and visual learning; 3) Sleep assessment by actigraphy, the Basic Nordic Sleep Questionnaire, and the Morningness-Eveningness Questionnaire. Actigraphs are wrist-worn accelerometers that separate sleep from wakefulness by registering body movements. Contrary to expectations, VLBW adults as a group reported no more ADHD-related behavioral symptoms than did controls. Further subdivision of the VLBW group into SGA (small for gestational age) and AGA (appropriate for gestational age) subgroups, however, revealed more symptoms on ADHD subscales pertaining to executive dysfunction and emotional instability among those born SGA. Thus, it seems that intrauterine growth retardation (for which SGA served as a proxy) is a more essential predictor of self-perceived ADHD symptoms in adulthood than is VLBW birth as such. In line with observations from other cohorts, the VLBW adults reported less risk-taking behavior in terms of substance use (alcohol, smoking, and recreational drugs), a finding that is reassuring for VLBW individuals and their families. On the cognitive test, VLBW adults free from neurosensory deficits had longer reaction times than did term-born peers on all tasks included in the test battery, and lower accuracy on the learning task, with no discernible effect of SGA status over and above the effect of VLBW.
Altogether, on a group level, even high-functioning VLBW adults show subtle deficits in psychomotor processing speed, visual working memory, and learning abilities. The sleep studies provided no evidence for differences in sleep quality or duration between the two groups. The VLBW adults were, however, at more than two-fold higher risk for sleep-disordered breathing (in terms of chronic snoring). Given the link between sleep-disordered breathing and health sequelae, these results suggest that VLBW individuals may benefit from an increased awareness among clinicians of this potential problem area. An unexpected finding from the sleep studies was the suggestion of an advanced sleep phase: The VLBW adults went to bed earlier according to the actigraphy registrations and also reported earlier wake-up times on the questionnaire. In further study of this issue in conjunction with the follow-up three years later, the VLBW group reported higher levels of morningness propensity, further corroborating the preliminary findings of an advanced sleep phase. Although the clinical implications are not entirely clear, the issue may be worth further study, since circadian rhythms are closely related to health and well-being. In sum, we believe that increased understanding of long-term outcomes after VLBW, and identification of areas and subgroups that are particularly vulnerable, will allow earlier recognition of potential problems and ultimately lead to improved prevention strategies.
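Actigraph vendors use proprietary scoring rules and the thesis does not reproduce its algorithm, but the movement-threshold idea mentioned above can be sketched generically: score an epoch as sleep when a weighted sum of activity counts in a small window stays below a threshold. The weights and threshold below are illustrative assumptions, not validated values.

```python
import numpy as np

def score_sleep(activity, weights=(0.04, 0.2, 1.0, 0.2, 0.04), threshold=1.0):
    """Score 1-minute epochs as sleep (True) or wake (False).

    A weighted sum of activity counts over a centred five-epoch
    window is compared with a threshold; low movement means sleep.
    """
    padded = np.pad(np.asarray(activity, dtype=float), 2, mode="edge")
    smoothed = np.convolve(padded, np.array(weights), mode="valid")
    return smoothed < threshold

# Hypothetical counts: a quiet night with one brief awakening.
counts = [0, 0, 1, 0, 30, 42, 5, 0, 0, 0]
print(score_sleep(counts))   # epochs around the movement burst score as wake
```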
Abstract:
This thesis comprises three ecological studies covering the 27 Brazilian state capitals. The three studies were as follows: 1) the association between the availability of dentists and the number of dental procedures performed in public dental services; 2) the association between the availability of dentists and the proportion of restored teeth (relative to all teeth affected by caries) in individuals aged 15 to 19 years; 3) the association of dentist availability with the prevalence and severity of caries in individuals aged 15 to 19 years. The three investigations are presented as articles. Several secondary databases, freely available on the internet, were used. The first study identified an association between the number of Oral Health Teams (ESB) of the Family Health programme, and of dentists in the SUS generally, and the number of dental procedures performed in the public service: the more ESB teams and dentists, the more dental procedures, both preventive and restorative. More dentists in the public dental service meant more preventive and collective procedures, but only a relatively small increase in restorations. The relatively small number of restorations performed by public-service dentists in Brazil is worrying, given the large number of teeth with untreated caries identified by the national oral health survey. The second study revealed that the number of dentists in the Brazilian capitals is very large and that there is therefore sufficient installed capacity to meet all needs for restorative treatment. However, the dental care index in young people aged 15 to 19 revealed that fewer than half of the teeth affected by caries had received adequate care, i.e., had been restored. This study concluded that Brazilian society's large investment in dentistry, whether in the public or the private sector, is not yielding the expected return, at least for young people aged 15 to 19. The third study concluded that broad socioeconomic factors and fluoride in the water supply were the main determinants of variation in caries prevalence and severity among young people aged 15 to 19, and that the dentist's contribution was relatively small. Given the dentist's relatively small role in caries prevention, clinical effort should therefore emphasise treatments of greater complexity, aimed at restoring and rehabilitating damage relevant to function and well-being (Personal Health Services). Effective efforts to prevent dental caries occur mainly at the level of population-wide preventive strategies (Non-Personal Health Services), with a relatively small contribution from clinical work.
Abstract:
BACKGROUND: Individuals with osteoporosis are predisposed to hip fracture during trips, stumbles or falls, but half of all hip fractures occur in those without generalised osteoporosis. By analysing ordinary clinical CT scans using a novel cortical thickness mapping technique, we discovered patches of markedly thinner bone at fracture-prone regions in the femurs of women with acute hip fracture compared with controls. METHODS: We analysed CT scans from 75 female volunteers with acute fracture and 75 age- and sex-matched controls. We classified the fracture location as femoral neck or trochanteric before creating bone thickness maps of the outer 'cortical' shell of the intact contra-lateral hip. After registration of each bone to an average femur shape and statistical parametric mapping, we were able to visualise and quantify statistically significant foci of thinner cortical bone associated with each fracture type, assuming good symmetry of bone structure between the intact and fractured hips. The technique allowed us to pinpoint systematic differences and display the results on a 3D average femur shape model. FINDINGS: The cortex was generally thinner in femoral neck fracture cases than in controls. More striking were several discrete patches of statistically significantly thinner bone, up to 30% thinner, which coincided with common sites of fracture initiation (femoral neck or trochanteric). INTERPRETATION: Femoral neck fracture patients had a thumbnail-sized patch of focal osteoporosis at the upper head-neck junction. This region coincides with a weak part of the femur that is prone both to spontaneous 'tensile' fractures of the femoral neck and to crack initiation when falling sideways. Current hip fracture prevention strategies are based on case finding: they involve clinical risk factor estimation to determine the need for single-plane bone density measurement within a standard region of interest (ROI) of the femoral neck. The precise sites of focal osteoporosis that we have identified are overlooked by current 2D bone densitometry methods.
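The statistical core of the mapping technique is a vertex-wise group comparison across thickness maps registered to the average femur shape, followed by control for multiple comparisons. The sketch below illustrates that step on synthetic data; it is our schematic of the general statistical parametric mapping idea, not the authors' pipeline, and it substitutes a simple Bonferroni correction for SPM-style inference.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_vertices = 5000   # points on a common (registered) femur surface

# Synthetic cortical thickness maps (mm): 75 cases, 75 controls
cases = rng.normal(1.8, 0.4, (75, n_vertices))
controls = rng.normal(2.0, 0.4, (75, n_vertices))
cases[:, 1000:1050] -= 0.5          # implant a focal "thin patch" for illustration

t, p = stats.ttest_ind(cases, controls, axis=0)   # vertex-wise two-sample t-test

# Bonferroni correction across vertices (real SPM pipelines typically use
# random field theory or permutation tests instead).
significant = p < 0.05 / n_vertices
thinning = 100 * (controls.mean(axis=0) - cases.mean(axis=0)) / controls.mean(axis=0)
print(f"{significant.sum()} vertices flagged; "
      f"max focal thinning about {thinning[significant].max():.0f}%")
```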
Abstract:
Project work presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Speech Therapy, with specialisation in Adult Language.
Abstract:
Motorcycle crash-related injuries and deaths are increasing rapidly in many African nations. Conspicuity measures, such as wearing reflective, fluorescent safety vests, are effective crash prevention strategies. Furthermore, use of some conspicuity measures is mandated by law among motorcycle-taxi drivers in Tanzania. Nonetheless, uptake remains low. Locally appropriate strategies to improve crash-preventative behaviors are needed.
To explore whether use of conspicuity measures could be improved by eliminating cost barriers, we tested a distribution strategy involving the provision of free motorcycle safety vests to a population of motorcycle-taxi drivers in Moshi, Tanzania. We conducted a cluster randomized controlled trial among 180 motorcycle-taxi drivers in which half of the participants (90) were randomized to the intervention arm and received a free reflective vest; the other half (90) were randomized to the control arm and did not receive free vests. Vest use was then unobtrusively observed on city streets over a period of three months.
Mixed-effects logistic regression was used to estimate differential uptake of the vests between trial arms. At baseline, 3.3% of individuals in both arms used a reflective vest. Over three months of follow-up, 79 drivers in the intervention arm and 82 drivers in the control arm were observed. In the intervention arm the average proportion of observations during which drivers were seen using a reflective vest was 9.5%, compared with 2.0% in the control arm (odds ratio: 5.5, 95% confidence interval: 1.1-26.9, p-value: 0.04). Distribution of free reflective vests did lead to an increase in vest usage; however, the increase was minimal. Removing economic barriers alone appears insufficient to adequately improve adherence to conspicuity measures.
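The analysis clusters repeated roadside observations within drivers. Since the trial data are not public, the sketch below simulates data of a similar shape and fits a GEE logistic regression clustered by driver, a population-averaged cousin of the mixed-effects model the paper reports; every number in it is invented.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulate ~10 roadside observations for each of 180 drivers.
rows = []
for driver in range(180):
    arm = int(driver < 90)                 # 1 = received a free vest
    base = -3.9 + 2.2 * arm                # log-odds of vest use (invented)
    driver_effect = rng.normal(0, 1.0)     # between-driver heterogeneity
    for _ in range(10):
        p = 1 / (1 + np.exp(-(base + driver_effect)))
        rows.append({"driver": driver, "arm": arm,
                     "used_vest": rng.binomial(1, p)})
df = pd.DataFrame(rows)

# Logistic GEE with exchangeable correlation within driver.
model = smf.gee("used_vest ~ arm", groups="driver", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(np.exp(result.params["arm"]))           # odds ratio for the vest arm
print(np.exp(result.conf_int().loc["arm"]))   # 95% confidence interval
```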
Abstract:
The lifestyles of young people excluded from school have received much attention recently, particularly in relation to illicit drug use. Commentators have acknowledged that they constitute a group at high risk of social disaffection and substance abuse. This paper reports on a group of 48 young people living in Belfast, aged 13–14 years, who are considered to be at particularly high risk of substance abuse because they are excluded from school. The evidence in this paper suggests that, compared with their contemporaries in mainstream education, many are already exhibiting behaviours associated with a high risk of problem drug use. This paper examines the evidence within the context of a limited existing literature base on this group of young people. It suggests that a more focused approach is required to develop appropriate drug-prevention strategies that meet their needs.
Abstract:
The relatively high levels of cannabis use among young people are a cause for concern because of the positive relationship between early-onset cannabis use, antisocial behaviours and the associated lifestyle. In a survey of 3919 young people in school year 11 in Northern Ireland (aged 14/15 years), 142 reported daily cannabis use. These young people also reported particularly high levels of legal and illegal drug use and accounted for a high proportion of the use of hard drugs such as cocaine and heroin across the full school cohort. Daily cannabis users also reported high levels of antisocial behaviour and disaffection with school. The findings perhaps raise questions about the existence of a potentially ‘hidden’ high-risk, school-based group of young people during adolescence who require specifically targeted prevention strategies.
Abstract:
It is now common for young people in full-time compulsory education to hold part-time jobs. However, whilst the 1990s saw a rise in illicit drug use, particularly among young people, and growing interest in identifying factors associated with drug use, little attention has been paid to the influence of the money young people have to spend and its potential links with drug use. Four thousand five hundred and twenty-four young people living in Northern Ireland completed a questionnaire in school year 10 (aged 13/14 years). The findings suggested a positive association between the amount of money young people received (and its source) and higher rates of drug use. The study concludes that money, and how young people spend it, may be an important factor to consider when investigating drug use during adolescence. The findings may help inform drug-prevention strategies, particularly through advice on money management and on taking responsibility for one's own money.
Abstract:
The BRCA1 gene was cloned in 1994 as one of the genes that conferred genetic predisposition to early-onset breast and ovarian cancer. Since then, a genetic test for identification of high-risk individuals has been developed. Despite being implicated in many important cellular pathways, including DNA repair and regulation of transcription, the exact mechanism by which inactivation of BRCA1 might lead to malignant transformation of cells remains unknown. We examine the mechanisms that underlie inactivation of BRCA1 and assess how they affect management of patients, in terms of both primary and secondary cancer prevention strategies. Furthermore, we look at the potential usefulness of BRCA1 as a prognostic tool and as a predictive marker of response to different classes of drugs. Finally, throughout this review, we draw links between the functional consequences of BRCA1 inactivation, in terms of key cellular signalling pathways, and how they might explain specific clinical observations in individuals who carry mutations in the gene.
Abstract:
Systematic reviews of systematic reviews identify good quality reviews of earlier studies of medical conditions. This article describes a systematic review of systematic reviews performed to investigate factors that might influence the risk of rupture of an intracranial aneurysm. It exemplifies the technique of this type of research and reports the findings of a specific study. The annual incidence of subarachnoid haemorrhage resulting from the rupture of intracranial aneurysms is estimated to be nine per 100,000. A large proportion of people who have this bleed will die or remain dependent on the care of others for some time. Reliable knowledge about the risks of subarachnoid haemorrhage in different populations will help in planning, screening and prevention strategies and in predicting the prognosis of individual patients. If the necessary data were available in the identified reviews, an estimate for the numerical relationship between a particular characteristic and the risk of subarachnoid haemorrhage was included in this report. The identification of eligible systematic reviews relied mainly on the two major bibliographic databases of the biomedical literature: PubMed and EMBASE. These were searched in 2006, using specially designed search strategies. Approximately 2,000 records were retrieved and each of these was checked carefully against the eligibility criteria for this systematic review. These criteria required that the report be a systematic review of studies assessing the risk of subarachnoid haemorrhage in patients known to have an unruptured intracranial aneurysm or of studies that had investigated the characteristics of people who experienced a subarachnoid haemorrhage without previously being known to have an unruptured aneurysm. Reports which included more than one systematic review were eligible and each of these reviews was potentially eligible. The quality of each systematic review was assessed. In this review, 16 separate reports were identified, including a total of 46 eligible systematic reviews. These brought together research studies for 24 different risk factors. This has shown that the following factors appear to be associated with a higher risk of subarachnoid haemorrhage: being a woman, older age, posterior circulation aneurysms, larger aneurysms, previous symptoms,
Abstract:
AIMS
Screening tools have been formulated to identify potentially inappropriate prescribing (IP) in older people. Beers’ criteria are the most widely used but have disadvantages when used in Europe. New IP screening tools, the Screening Tool of Older Person’s Prescriptions (STOPP) and the Screening Tool to Alert doctors to Right Treatment (START), have been developed to identify potential IP and potential prescribing omissions (PPOs). The aim was to measure the prevalence rates of potential IP and PPOs in primary care using Beers’ criteria, STOPP and START.
METHODS
Case records of 1329 patients aged ≥65 years from three general practices in one region of southern Ireland were studied. The mean age ± SD of the patients was 74.9 ± 6.4 years; 60.9% were female. Patients' current diagnoses and prescription medicines were reviewed and the Beers' criteria, STOPP and START tools applied.
RESULTS
The total number of medicines prescribed was 6684; the median number of medicines per patient was five (range 1–19). Overall, Beers' criteria identified 286 potentially inappropriate prescriptions in 18.3% (243) of patients, whilst the corresponding IP rate identified by STOPP was 21.4% (284), in respect of 346 potentially inappropriate prescriptions. A total of 333 PPOs were identified in 22.7% (302) of patients using the START tool.
CONCLUSION
Potentially inappropriate drug prescribing and errors of drug omission are highly prevalent among older people living in the community. Prevention strategies should involve primary care doctors and community pharmacists.
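The prevalence figures in the RESULTS section follow directly from the reported patient counts and the 1329 case records; a quick arithmetic check using only the numbers given above:

```python
patients = 1329
for tool, affected in [("Beers", 243), ("STOPP", 284), ("START", 302)]:
    print(f"{tool}: {affected}/{patients} = {100 * affected / patients:.1f}%")
# Beers: 243/1329 = 18.3%, STOPP: 284/1329 = 21.4%, START: 302/1329 = 22.7%
```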
Abstract:
OBJECTIVE: Despite recent increases in the volume of research in professional rugby union, there is little consensus on the epidemiology of injury in adolescent players. We undertook a systematic review to determine the incidence, severity, and nature of injury in adolescent rugby union players.
DATA SOURCES: In April 2009, we performed a computerized literature search on PubMed, Embase, and Cochrane Controlled Trials Register (via Ovid). Population-specific and patient-specific search terms were combined in the form of MEDLINE subject headings and key words (wound$ and injur$, rugby, adolescent$). These were supplemented with related-citation searches on PubMed and bibliographic tracking of primary and review articles.
STUDY SELECTION: Prospective epidemiologic studies in adolescent rugby union players.
DATA SYNTHESIS: A total of 15 studies were included, and the data were analyzed descriptively. Two independent reviewers extracted key study characteristics regarding the incidence, severity, and nature of injuries and the methodologic design.
CONCLUSIONS: Wide variations existed in the injury definitions and data collection procedures. The incidence of injury necessitating medical attention varied with the definition, from 27.5 to 129.8 injuries per 1000 match hours. The incidence of time-loss injury (>7 days) ranged from 0.96 to 1.6 per 1000 playing hours and from 11.4/1000 match hours (>1 day) to 12-22/1000 match hours (missed games). The highest incidence of concussion was 3.3/1000 playing hours. No catastrophic injuries were reported. The head and neck, upper limb, and lower limb were all common sites of injury, and trends were noted toward greater time loss due to upper limb fractures or dislocations and knee ligament injuries. Increasing age, the early part of the playing season, and the tackle situation were most closely associated with injury. Future injury-surveillance studies in rugby union must follow consensus guidelines to facilitate interstudy comparisons and provide further clarification as to where injury-prevention strategies should be focused.
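Incidence rates like those above follow the standard exposure-based definition used in rugby injury surveillance: injuries divided by total player match-hours, scaled to 1000. A minimal worked example (match counts and durations below are hypothetical, not taken from the reviewed studies):

```python
def injuries_per_1000_match_hours(n_injuries, n_matches, players_per_team, match_minutes):
    """Exposure-based incidence: injuries per 1000 player match-hours."""
    exposure_hours = n_matches * 2 * players_per_team * match_minutes / 60
    return 1000 * n_injuries / exposure_hours

# Hypothetical season: 120 matches, 15 players a side, 70-minute matches
print(injuries_per_1000_match_hours(180, 120, 15, 70))   # ~42.9 injuries/1000 h
```

Because the definition of a reportable injury (medical attention versus time loss) changes only the numerator, differing definitions shift these rates substantially, which is exactly why the review calls for consensus guidelines.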