915 results for Risk Identification
Abstract:
Background: The mechanisms underlying socioeconomic inequalities in mortality from cardiovascular diseases (CVD) are largely unknown. We studied the contribution of childhood socioeconomic conditions and adulthood risk factors to inequalities in CVD mortality in adulthood. Methods: The prospective GLOBE study was carried out in the Netherlands, with baseline data from 1991, and linked with the cause of death register in 2007. At baseline, participants reported on adulthood socioeconomic position (SEP) (own educational level), childhood socioeconomic conditions (occupational level of respondent’s father), and a broad range of adulthood risk factors (health behaviours, material circumstances, psychosocial factors). The present study is based on 5,395 men and 6,306 women, and the data were analysed using Cox regression models and hazard ratios (HR). Results: A low adulthood SEP was associated with increased CVD mortality for men (HR 1.84; 95% CI: 1.41-2.39) and women (HR 1.80; 95% CI: 1.04-3.10). Those with poorer childhood socioeconomic conditions were more likely to die from CVD in adulthood, but this reached statistical significance only among men with the poorest childhood socioeconomic circumstances. About half of the investigated adulthood risk factors showed significant associations with CVD mortality among both men and women, namely renting a house, experiencing financial problems, smoking, physical activity and marital status. Alcohol consumption and BMI showed a U-shaped relationship with CVD mortality among women, with the risk being significantly greater for both abstainers and heavy drinkers, and among women who were underweight or obese. Among men, being single or divorced and using sleep/anxiety drugs increased the risk of CVD mortality. In explanatory models, the largest contributors to adulthood CVD inequalities were material conditions for men (42%; 95% CI: −73 to −20) and behavioural factors for women (55%; 95% CI: −191 to −28). Simultaneous adjustment for adulthood risk factors and childhood socioeconomic conditions attenuated the HR for the lowest adulthood SEP to 1.34 (95% CI: 0.99-1.82) for men and 1.19 (95% CI: 0.65-2.15) for women. Conclusions: Adulthood material, behavioural and psychosocial factors played a major role in the explanation of adulthood SEP inequalities in CVD mortality. Childhood socioeconomic circumstances made a modest contribution, mainly via their association with adulthood risk factors. Policies and interventions to reduce health inequalities are likely to be most effective when considering the influence of socioeconomic circumstances across the entire life course and, in particular, poor material conditions and unhealthy behaviours in adulthood.
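For readers interested in how such an analysis is typically run, the following is a minimal sketch of a Cox regression with attenuation of the SEP hazard ratio, using the lifelines library in Python. The file name, column names and the percentage-explained formula ((HR_base − HR_adjusted) / (HR_base − 1) × 100) are illustrative assumptions, not the GLOBE study's actual data dictionary or code.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis file: one row per participant, follow-up in years,
# cvd_death = 1 if the participant died of CVD before the 2007 linkage.
df = pd.read_csv("globe_baseline_linked.csv")

# Age-adjusted model: hazard of CVD death for low vs high adulthood SEP (education)
base = CoxPHFitter()
base.fit(df[["follow_up_years", "cvd_death", "age", "educ_low"]],
         duration_col="follow_up_years", event_col="cvd_death")
hr_base = base.hazard_ratios_["educ_low"]

# Model additionally adjusted for adulthood risk factors and childhood SEP
full = CoxPHFitter()
full.fit(df[["follow_up_years", "cvd_death", "age", "educ_low", "smoking",
             "renting", "financial_problems", "father_manual_occupation"]],
         duration_col="follow_up_years", event_col="cvd_death")
hr_full = full.hazard_ratios_["educ_low"]

# Share of the excess hazard in the low-SEP group explained by the added covariates
explained = 100 * (hr_base - hr_full) / (hr_base - 1)
print(f"HR base {hr_base:.2f}, HR adjusted {hr_full:.2f}, explained {explained:.0f}%")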
Abstract:
AIMS: Increases in inflammatory markers, hepatic enzymes and physical inactivity are associated with the development of the metabolic syndrome (MetS). We examined whether inflammatory markers and hepatic enzymes are correlated with traditional risk factors for MetS and studied the effects of resistance training (RT) on these emerging risk factors in individuals with a high number of metabolic risk factors (HiMF, 2.9 ± 0.8) and those with a low number of metabolic risk factors (LoMF, 0.5 ± 0.5). METHODS: Twenty-eight men and 27 women aged 50.8 ± 6.5 years (mean ± SD) participated in the study. Participants were randomized to four groups: HiMF training (HiMFT), HiMF control (HiMFC), LoMF training (LoMFT) and LoMF control (LoMFC). Before and after 10 weeks of RT [3 days/week, seven exercises, three sets with intensity gradually increased from 40-50% of one repetition maximum (1RM) to 75-85% of 1RM], blood samples were obtained for the measurement of pro-inflammatory cytokines, C-reactive protein (CRP), gamma-glutamyltransferase (GGT) and alanine aminotransferase (ALT). RESULTS: At baseline, HiMF had higher interleukin-6 (33.9%), CRP (57.1%), GGT (45.2%) and ALT (40.6%) levels, compared with LoMF (all P < 0.05). CRP, GGT and ALT correlated with the number of risk factors (r = 0.48, 0.51 and 0.57, respectively, all P < 0.01) and with other anthropometric and clinical measures (r ranging from 0.26 to 0.60, P < 0.05). RT did not significantly alter inflammatory markers or hepatic enzymes (all P > 0.05). CONCLUSIONS: HiMF was associated with increased inflammatory markers and hepatic enzyme concentrations. RT did not reduce inflammatory markers and hepatic enzymes in individuals with HiMF.
Abstract:
In recent times concerns about possible adverse effects of early separation and advocacy for individual rights have resulted in a movement away from organizational level policies about the separation of twin children as they enter school. Instead, individualized approaches that focus on the twin children’s characteristics and family perspectives have been proposed. This study, conducted in Australia where all but a few families had choice about the class placement of their twin children, questioned parents (N = 156) about their placement decisions. Results indicated that most parents opted for placement together in the early years of schooling. The choice to separate twins at school entry was associated with parent identification of risk in the twin relationship, while being kept together was associated with parent identification of absence of such risk. The findings are discussed in light of the current evidence against separation, and suggest that parent choices regarding the separation of twin children in the early years are informative to educational policy and practice.
Abstract:
This article proposes an approach for real-time monitoring of risks in executable business process models. The approach considers risks in all phases of the business process management lifecycle, from process design, where risks are defined on top of process models, through to process diagnosis, where risks are detected during process execution. The approach has been realized via a distributed, sensor-based architecture. At design time, sensors are defined to specify risk conditions which, when fulfilled, indicate that negative process states (faults) are likely to eventuate. Both historical and current process execution data can be used to compose such conditions. At run time, each sensor independently notifies a sensor manager when a risk is detected. In turn, the sensor manager interacts with the monitoring component of a business process management system to present the results to process administrators, who may take remedial actions. The proposed architecture has been implemented on top of the YAWL system, and evaluated through performance measurements and usability tests with students. The results show that risk conditions can be computed efficiently and that the approach is perceived as useful by the participants in the tests.
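As a rough illustration of the design-time/run-time split described above (not the YAWL-based implementation itself, whose API is not shown here), a sensor can be modelled as a risk condition plus a notification path to a sensor manager. All class and method names below are invented for illustration.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class RiskSensor:
    """Pairs a risk condition with the likelihood that it signals an upcoming fault."""
    name: str
    condition: Callable[[Dict], bool]  # evaluated over current and historical process data
    likelihood: float

class SensorManager:
    """Collects notifications from registered sensors and forwards them to the monitor."""
    def __init__(self, notify_admin: Callable[[str, float], None]):
        self.sensors: List[RiskSensor] = []
        self.notify_admin = notify_admin

    def register(self, sensor: RiskSensor) -> None:  # design time
        self.sensors.append(sensor)

    def on_event(self, process_data: Dict) -> None:  # run time
        for sensor in self.sensors:
            if sensor.condition(process_data):
                self.notify_admin(sensor.name, sensor.likelihood)

# Example: flag a likely deadline violation once a case has consumed 80% of its deadline.
manager = SensorManager(lambda name, p: print(f"Risk '{name}' detected (likelihood {p})"))
manager.register(RiskSensor(
    name="order overdue",
    condition=lambda d: d["elapsed_hours"] > 0.8 * d["deadline_hours"],
    likelihood=0.7,
))
manager.on_event({"elapsed_hours": 20, "deadline_hours": 24})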
Abstract:
School connectedness is an important protective factor for adolescent risk-taking behaviour. This study examined a pilot version of the Skills for Preventing Injury in Youth (SPIY) programme, combining teacher professional development for increasing school connectedness (connectedness component) with a risk and injury prevention curriculum for early adolescents (curriculum component). A process evaluation was conducted on the connectedness component, involving assessments of programme reach, participant receptiveness and initial use, and a preliminary impact evaluation was conducted on the combined connectedness and curriculum programme. The connectedness component was well received by teacher participants, who saw benefits for both themselves and their students. Classroom observation also showed that teachers who received professional development made use of the programme strategies. Grade 8 students who participated in the SPIY programme were less likely to report violent behaviour at six-month follow-up than were control students, and trends also suggested reduced transport injuries. The results of this research support the use of the combined SPIY connectedness and curriculum components in a large-scale effectiveness trial to assess the impact of the programme on students’ connectedness, risk-taking and associated injuries.
Abstract:
A recent comment in the Journal of Sports Sciences (MacNamara & Collins, 2011) highlighted some major concerns with the current structure of talent identification and development (TID) programmes of Olympic athletes (e.g. Gulbin, 2008; Vaeyens, Gullich, Warr, & Philippaerts, 2009). In a cogent commentary, MacNamara and Collins (2011) provided a short review of the extant literature, which was both timely and insightful. Specifically, they criticised the ubiquitous one-dimensional ‘physically-biased’ attempts to produce world class performers, emphasising the need to consider a number of key environmental variables in a more multi-disciplinary perspective. They also lamented the wastage of talent, and alluded to the operational and opportunistic nature of current talent transfer programmes. A particularly compelling aspect of the comment was their allusion to high profile athletes who had ‘failed’ performance evaluation tests and then proceeded to succeed in that sport. This issue identifies a problem with current protocols for evaluating performance and is a line of research that is sorely needed in the area of talent development. To understand the nature of talent wastage that might be occurring in high performance programmes in sport, future empirical work should seek to follow the career paths of ‘successful’ and ‘unsuccessful’ products of TID programmes, in comparative analyses. Pertinent to the insights of MacNamara and Collins (2011), it remains clear that a number of questions have not received enough attention from sport scientists interested in talent development, including: (i) why is there so much wastage of talent in such programmes? And (ii), why are there so few reported examples of successful talent transfer programmes? These questions highlight critical areas for future investigation. The aim of this short correspondence is to discuss these and other issues researchers and practitioners might consider, and to propose how an ecological dynamics underpinning to such investigations may help the development of existing protocols...
Abstract:
Travel time is an important transport performance indicator. Different modes of transport (buses and cars) have different mechanical and operational characteristics, resulting in significantly different travel behaviours and complexities in multimodal travel time estimation on urban networks. This paper explores the relationship between bus and car travel time on urban networks by utilising the empirical Bluetooth and Bus Vehicle Identification data from Brisbane. The technologies and issues behind the two datasets are studied. After cleaning the data to remove outliers, the relationship between not-in-service bus and car travel times and the relationship between in-service bus and car travel times are discussed. The travel time estimation models reveal that not-in-service bus travel times are similar to car travel times, and that in-service bus travel times could be used to estimate car travel times during off-peak hours.
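The kind of cleaning and estimation step described here could look roughly like the sketch below, which filters outliers with a median-absolute-deviation rule and fits a simple linear relationship between not-in-service bus and car travel times. File and column names are assumptions for illustration, not the actual Brisbane datasets.

import numpy as np
import pandas as pd

car = pd.read_csv("bluetooth_travel_times.csv")       # hypothetical Bluetooth (car) records
bus = pd.read_csv("bus_vehicle_id_travel_times.csv")  # hypothetical bus identification records

def remove_outliers(s: pd.Series, k: float = 3.0) -> pd.Series:
    """Drop observations more than k scaled median absolute deviations from the median."""
    mad = (s - s.median()).abs().median()
    return s[(s - s.median()).abs() <= k * 1.4826 * mad]

car = car.loc[remove_outliers(car["travel_time_s"]).index]
bus = bus.loc[remove_outliers(bus["travel_time_s"]).index]

# Compare car travel times with not-in-service bus travel times in the same 15-minute interval
merged = pd.merge(
    car.groupby("interval_15min")["travel_time_s"].mean().rename("tt_car").reset_index(),
    bus.loc[~bus["in_service"]].groupby("interval_15min")["travel_time_s"].mean().rename("tt_bus").reset_index(),
    on="interval_15min",
)
slope, intercept = np.polyfit(merged["tt_bus"], merged["tt_car"], 1)
print(f"car_tt ≈ {slope:.2f} * bus_tt + {intercept:.1f} s")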
Abstract:
School connectedness has been shown to be an important protective factor in adolescent development, which is associated with reduced risk-taking behaviour. Interventions to increase students’ connectedness to school commonly incorporate aspects of teacher training. To date, however, research on connectedness has largely been based on student survey data, with no reported research addressing teachers’ perceptions of students’ connectedness and its association with student behaviour. This research attempted to address this gap in the literature through in-depth interviews with 14 school teachers and staff from two Australian high schools. Findings showed that teachers perceived students’ connectedness to be important with regard to reducing problem behaviour, and discussed aspects of connectedness, including fairness and discipline, feeling valued, belonging and having teacher support, and being successfully engaged in school, as being particularly important. This research enables the development of school-based intervention programmes that are based on both student- and teacher-focused research.
Abstract:
This paper reports research into teacher-librarians’ perceptions of using social media and Web 2.0 in teaching and learning. A pilot study was conducted with teacher-librarians in five government schools and five private schools in southeast Queensland. The findings revealed that there was a strong digital divide between government schools and private schools, with government schools suffering severe restrictions on the use of social media and Web 2.0, leading to an unsophisticated use of these technologies. It is argued that internet ‘over-blocking’ may lead to government school students not being empowered to manage risks in an open internet environment. Furthermore, their use of information for academic and recreational learning may be compromised. This has implications particularly for low socioeconomic students, leading to further inequity in the process and outcomes of Australian education.
Abstract:
Background: Decreased ability to perform Activities of Daily Living (ADLs) during hospitalisation has negative consequences for patients and health service delivery. Objective: To develop an Index to stratify patients at lower and higher risk of a significant decline in ability to perform ADLs at discharge. Design: Prospective two-cohort study comprising a derivation cohort (n=389; mean age 82.3 years; SD 7.1) and a validation cohort (n=153; mean age 81.5 years; SD 6.1). Patients and setting: General medical patients aged ≥ 70 years admitted to three university-affiliated acute care hospitals in Brisbane, Australia. Measurement and main results: The short ADL Scale was used to identify a significant decline in ability to perform ADLs from premorbid to discharge. In the derivation cohort, 77 patients (19.8%) experienced a significant decline. Four significant factors were identified for patients independent at baseline: 'requiring moderate assistance to being totally dependent on others with bathing'; 'difficulty understanding others (frequently or all the time)'; 'requiring moderate assistance to being totally dependent on others with performing housework'; and a 'history of at least one fall in the 90 days prior to hospital admission', in addition to 'independent at baseline', which was protective against decline at discharge. 'Difficulty understanding others (frequently or all the time)' and 'requiring moderate assistance to being totally dependent on others with performing housework' were also predictors for patients dependent in ADLs at baseline. Sensitivity, specificity, Positive Predictive Value (PPV), and Negative Predictive Value (NPV) of the DADLD dichotomised risk scores were 83.1% (95% CI 72.8; 90.7), 60.5% (95% CI 54.8; 65.9), 34.2% (95% CI 27.5; 41.5) and 93.5% (95% CI 89.2; 96.5), respectively. In the validation cohort, 47 patients (30.7%) experienced a significant decline. Sensitivity, specificity, PPV and NPV of the DADLD were 78.7% (95% CI 64.3; 89.3), 69.8% (95% CI 60.1; 78.3), 53.6% (95% CI 41.2; 65.7) and 88.1% (95% CI 79.2; 94.1), respectively. Conclusions: The DADLD Index is a useful tool for identifying patients at higher risk of decline in ability to perform ADLs at discharge.
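The reported diagnostic statistics follow directly from the 2x2 cross-tabulation of dichotomised DADLD scores against observed decline. The short function below shows the arithmetic; the counts used are hypothetical values chosen only to roughly reproduce the derivation-cohort figures, not the study's actual table.

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV and NPV from a 2x2 screening table."""
    return {
        "sensitivity": tp / (tp + fn),  # decliners correctly flagged as high risk
        "specificity": tn / (tn + fp),  # non-decliners correctly flagged as low risk
        "ppv": tp / (tp + fp),          # flagged patients who actually declined
        "npv": tn / (tn + fn),          # low-risk patients who did not decline
    }

# Illustrative counts only (n = 389 with 77 decliners, as in the derivation cohort)
print(diagnostic_metrics(tp=64, fp=123, fn=13, tn=189))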
Abstract:
One of the primary desired capabilities of any future air traffic separation management system is the ability to provide early conflict detection and resolution effectively and efficiently. In this paper, we consider the risk of conflict as a primary measurement to be used for early conflict detection. This paper focuses on developing a novel approach to assess the impact of different measurement uncertainty models on the estimated risk of conflict. The measurement uncertainty model can be used to represent different sensor accuracy and sensor choices. Our study demonstrates the value of modelling measurement uncertainty in the conflict risk estimation problem and presents techniques providing a means of assessing sensor requirements to achieve desired conflict detection performance.
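As a generic illustration of how a measurement uncertainty model enters a conflict risk estimate (this is not the paper's own estimator, which is not reproduced here), one can sample track positions from an assumed sensor error distribution and count how often the predicted separation falls below the minimum. All parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def conflict_risk(pos_a, vel_a, pos_b, vel_b, sigma_pos,
                  horizon_s=120.0, sep_min_m=9260.0, n_samples=10_000):
    """Estimated probability that horizontal separation drops below sep_min_m within the horizon."""
    hits = 0
    for _ in range(n_samples):
        # Perturb the measured positions according to the assumed sensor accuracy (1-sigma, metres)
        a = np.asarray(pos_a, dtype=float) + rng.normal(0.0, sigma_pos, 2)
        b = np.asarray(pos_b, dtype=float) + rng.normal(0.0, sigma_pos, 2)
        rel_p = a - b
        rel_v = np.asarray(vel_a, dtype=float) - np.asarray(vel_b, dtype=float)
        # Time of closest approach, clamped to the look-ahead horizon
        t = np.clip(-rel_p @ rel_v / max(rel_v @ rel_v, 1e-9), 0.0, horizon_s)
        if np.linalg.norm(rel_p + t * rel_v) < sep_min_m:
            hits += 1
    return hits / n_samples

# Two converging tracks evaluated under a more accurate and a less accurate position sensor
print(conflict_risk([0, 0], [250, 0], [30_000, 20_000], [-200, -150], sigma_pos=100.0))
print(conflict_risk([0, 0], [250, 0], [30_000, 20_000], [-200, -150], sigma_pos=1000.0))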
Abstract:
Background and Objectives In Australia, the risk of transfusion-transmitted malaria is managed through the identification of ‘at-risk’ donors, antibody screening enzyme-linked immunoassay (EIA) and, if reactive, exclusion from fresh blood component manufacture. Donor management depends on the duration of exposure in malarious regions (>6 months: ‘Resident’, <6 months: ‘Visitor’) or a history of malaria diagnosis. We analysed antibody testing and demographic data to investigate antibody persistence dynamics. To assess the yield from retesting 3 years after an initial EIA reactive result, we estimated the proportion of donors who would become non-reactive over this period. Materials and Methods Test results and demographic data from donors who were malaria EIA reactive were analysed. Time since possible exposure was estimated and antibody survival modelled. Results Among seroreverters, the time since last possible exposure was significantly shorter in ‘Visitors’ than in ‘Residents’. The antibody survival modelling predicted 20% of previously EIA reactive ‘Visitors’, but only 2% of ‘Residents’ would become non-reactive within 3 years of their first reactive EIA. Conclusion Antibody persistence in donors correlates with exposure category, with semi-immune ‘Residents’ maintaining detectable antibodies significantly longer than non-immune ‘Visitors’.
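As a simplified numerical illustration (assuming a constant seroreversion hazard, which the study's actual survival model need not use), the reported 3-year seroreversion proportions imply the following rates and projections:

import math

def implied_rate(prop_seroreverted: float, years: float) -> float:
    """Constant hazard such that 1 - exp(-rate * years) equals the observed proportion."""
    return -math.log(1.0 - prop_seroreverted) / years

for group, prop_3y in [("Visitors", 0.20), ("Residents", 0.02)]:
    rate = implied_rate(prop_3y, 3.0)
    p_5y = 1.0 - math.exp(-rate * 5.0)
    print(f"{group}: implied seroreversion rate {rate:.3f}/year; "
          f"projected {p_5y:.0%} non-reactive by 5 years")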
Abstract:
Motivated by growing considerations of the scale, severity and risks associated with human exposure to indoor particulate matter, this work reviewed existing literature to: (i) identify state-of-the-art experimental techniques used for personal exposure assessment; (ii) compare exposure levels reported for domestic/school settings in different countries (excluding exposure to environmental tobacco smoke and particulate matter from biomass cooking in developing countries); (iii) assess the contribution of outdoor background vs indoor sources to personal exposure; and (iv) examine scientific understanding of the risks posed by personal exposure to indoor aerosols. Limited studies assessing integrated daily residential exposure to just one particle size fraction, ultrafine particles, show that the contribution of indoor sources ranged from 19-76%. This indicates a strong dependence on resident activities, source events and site specificity, and highlights the importance of indoor sources for total personal exposure. Further, it was assessed that 10-30% of the total burden-of-disease from particulate matter exposure was due to indoor generated particles, signifying that indoor environments are likely to be a dominant environmental factor affecting human health. However, due to challenges associated with conducting epidemiological assessments, the role of indoor generated particles has not been fully acknowledged, and improved exposure/risk assessment methods are still needed, together with a serious focus on exposure control.
Abstract:
Vitamin D may have anti-skin cancer effects, but population-based evidence is lacking. We therefore assessed associations between vitamin D status and skin cancer risk in an Australian subtropical community. We analyzed prospective skin cancer incidence for 11 years following baseline assessment of serum 25(OH)-vitamin D in 1,191 adults (average age 54 years) and used multivariable logistic regression analysis to adjust risk estimates for age, sex, detailed assessments of usual time spent outdoors, phenotypic characteristics, and other possible confounders. Participants with serum 25(OH)-vitamin D concentrations above 75 nmol l−1 versus those below 75 nmol l−1 more often developed basal cell carcinoma (odds ratio (OR) = 1.51; 95% confidence interval (CI): 1.10-2.07; P=0.01) and melanoma (OR = 2.71; 95% CI: 0.98-7.48; P=0.05). Squamous cell carcinoma incidence tended to be lower in persons with serum 25(OH)-vitamin D concentrations above 75 nmol l−1 compared with those below 75 nmol l−1 (OR = 0.67; 95% CI: 0.44-1.03; P=0.07). Vitamin D status was not associated with skin cancer incidence when participants were classified as above or below 50 nmol l−1 25(OH)-vitamin D. Our findings do not indicate that the carcinogenicity of high sun exposure can be counteracted by high vitamin D status. High sun exposure is to be avoided as a means to achieve high vitamin D status.
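A minimal sketch of the kind of multivariable logistic regression used to produce such adjusted odds ratios, written with statsmodels; the file and variable names are illustrative assumptions rather than the study's actual dataset.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per participant, bcc = 1 if basal cell carcinoma
# developed during follow-up, vitd_over_75 = 1 if baseline 25(OH)D was above 75 nmol/l.
df = pd.read_csv("skin_cancer_cohort.csv")

model = smf.logit(
    "bcc ~ vitd_over_75 + age + sex + time_outdoors + skin_type", data=df
).fit()

or_vitd = np.exp(model.params["vitd_over_75"])
ci_low, ci_high = np.exp(model.conf_int().loc["vitd_over_75"])
print(f"Adjusted OR for 25(OH)D > 75 nmol/l: {or_vitd:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")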
Abstract:
The risk of vitamin D insufficiency is increased in persons having limited sunlight exposure and dietary vitamin D. Supplementation compliance might be improved with larger doses taken less often, but this may increase the potential for side effects. The objective of the present study was to determine whether a weekly or weekly/monthly regimen of vitamin D supplementation is as effective as daily supplementation without increasing the risk of side effects. Participants were forty-eight healthy adults who were randomly assigned for 3 months to placebo or one of three supplementation regimens: 50 μg/d (2000 IU/d, analysed dose 70 μg/d), 250 μg/week (10 000 IU/week, analysed dose 331 μg/week) or 1250 μg/week (50 000 IU/week, analysed dose 1544 μg/week) for 4 weeks and then 1250 μg/month for 2 months. Daily and weekly doses were equally effective at increasing serum 25-hydroxyvitamin D, which was significantly greater than baseline in all the supplemented groups after 30 d of treatment. Subjects in the 1250 μg treatment group, who had a BMI >26 kg/m2, had a steady increase in urinary Ca in the first 3 weeks of supplementation, and, overall, the relative risk of hypercalciuria was higher in the 1250 μg group than in the placebo group (P= 0·01). Although vitamin D supplementation remains a controversial issue, these data document that supplementing with ≤ 250 μg/week ( ≤ 10 000 IU/week) can improve or maintain vitamin D status in healthy populations without the risk of hypercalciuria, but 24 h urinary Ca excretion should be evaluated in healthy persons receiving vitamin D3 supplementation in weekly single doses of 1250 μg (50 000 IU).