Abstract:
Tomatoes are the most common crop in Italy. The production cycle requires operations in the field and factory that can cause musculoskeletal disorders due to the repetitive movements of the upper limbs of the workers employed in the sorting phase. This research aims to evaluate these risks using the OCRA (occupational repetitive actions) index method. This method is based firstly on the calculation of a maximum number of recommended actions, related to the way the operation is performed, and secondly on a comparison of the number of actions effectively carried out by the upper limb with the recommended calculated value. The results of the risk evaluation for workers who manually sort tomatoes during harvest showed a risk for the workers, with an exposure index greater than 20; the OCRA method defines an index higher than 3.5 as unacceptable. The present trend of replacing manual sorting onboard a vehicle with optical sorters seems to be appropriate to reduce the risk of work-related musculoskeletal disorders (WMSDs) and is supported both from a financial point of view and as a quality control measure.
Abstract:
Chlamydia trachomatis infection, the most common reportable disease in the United States, can lead to pelvic inflammatory disease (PID), infertility, ectopic pregnancy, and chronic pelvic pain. Although C. trachomatis is identified among many women who receive a diagnosis of PID, the incidence and timing of PID and long-term sequelae from an untreated chlamydial infection have not been fully determined. This article examines evidence reviewed as part of the Centers for Disease Control and Prevention Chlamydia Immunology and Control Expert Advisory Meeting; 24 reports were included. We found no prospective studies directly assessing risk of long-term reproductive sequelae, such as infertility, after untreated C. trachomatis infection. Several studies assessed PID diagnosis after untreated chlamydial infection, but rates varied widely, making it difficult to determine an overall estimate. In high-risk settings, 2%-5% of untreated women developed PID within the approximately 2-week period between testing positive for C. trachomatis and returning for treatment. However, the rate of PID progression in the general, asymptomatic population followed up for longer periods appeared to be low. According to the largest studies, after symptomatic PID of any cause has occurred, up to 18% of women may develop infertility. In several studies, repeated chlamydial infection was associated with PID and other reproductive sequelae, although it was difficult to determine whether the risk per infection increased with each recurrent episode. The present review critically evaluates this body of literature and suggests future research directions. Specifically, prospective studies assessing rates of symptomatic PID, subclinical tubal damage, and long-term reproductive sequelae after C. trachomatis infection; better tools to measure PID and tubal damage; and studies on the natural history of repeated chlamydial infections are needed.
Abstract:
Objective: To review the literature to identify and synthesize the evidence on risk factors for patient falls in geriatric rehabilitation hospital settings. Data sources: Eligible studies were systematically searched on 16 databases from inception to December 2010. Review methods: The search strategies used a combination of terms for rehabilitation hospital patients, falls, risk factors and older adults. Cross-sectional, cohort, case-control studies and randomized clinical trials (RCTs) published in English that investigated risks for falls among patients ≥65 years of age in rehabilitation hospital settings were included. Studies that investigated fall risk assessment tools, but did not investigate risk factors themselves or did not report a measure of risk (e.g. odds ratio, relative risk) were excluded. Results: A total of 2,824 references were identified; only eight articles concerning six studies met the inclusion criteria. In these, 1,924 geriatric rehabilitation patients were followed. The average age of the patients ranged from 77 to 83 years, the percentage of women ranged from 56% to 81%, and the percentage of fallers ranged from 15% to 54%. Two were case-control studies, two were RCTs and four were prospective cohort studies. Several intrinsic and extrinsic risk factors for falls were identified. Conclusion: Carpet flooring, vertigo, being an amputee, confusion, cognitive impairment, stroke, sleep disturbance, anticonvulsants, tranquilizers and antihypertensive medications, age between 71 and 80, previous falls, and need for transfer assistance are risk factors for geriatric patient falls in rehabilitation hospital settings.
Abstract:
Mr. Pechersky set out to examine a specific feature of the employer-employee relationship in Russian business organisations. He wanted to study to what extent the so-called "moral hazard" problem is being solved (if it is being solved at all), whether there is a relationship between pay and performance, and whether there is a correlation between economic theory and Russian reality. Finally, he set out to construct a model of the Russian economy that better reflects the way it actually functions than do certain other well-known models (for example, models of incentive compensation, the Shapiro-Stiglitz model, etc.). His report was presented to the RSS in the form of a series of manuscripts in English and Russian, and on disc, with many tables and graphs. He begins by pointing out the different sources of randomness that exist in the relationship between employee and employer. Firstly, results are frequently affected by circumstances outside the employee's control that have nothing to do with how intelligently, honestly, and diligently the employee has worked. When rewards are based on results, uncontrollable randomness in the employee's output induces randomness in his or her income. A second source of randomness involves outside events beyond the control of the employee that may affect his or her ability to perform as contracted. A third source of randomness arises when the performance itself (rather than the result) is measured, and the performance evaluation procedures include random or subjective elements. Mr. Pechersky's study shows that in Russia the third source of randomness plays an important role. Moreover, he points out that employer-employee relationships in Russia are sometimes opposite to those in the West. Drawing on game theory, he characterises the Western system as follows. The two players are the principal and the agent, who are usually representative individuals.
The principal hires an agent to perform a task, and the agent acquires an information advantage concerning his actions or the outside world at some point in the game, i.e. it is assumed that the employee is better informed. In Russia, on the other hand, incentive contracts are typically negotiated in situations in which the employer has the information advantage concerning the outcome. Mr. Pechersky schematises it thus. Compensation (the wage) is W and consists of a base amount, plus a portion that varies with the outcome, x. So W = a + bx, where b is used to measure the intensity of the incentives provided to the employee. This means that one contract will be said to provide stronger incentives than another if it specifies a higher value for b. This is the incentive contract as it operates in the West. The key feature distinguishing the Russian example is that x is observed by the employer but is not observed by the employee. So the employer promises to pay in accordance with an incentive scheme, but since the outcome is not observable by the employee the contract cannot be enforced, and the question arises: is there any incentive for the employer to fulfil his or her promises? Mr. Pechersky considers two simple models of employer-employee relationships displaying the above type of information asymmetry. In a static framework the result obtained is somewhat surprising: at the Nash equilibrium the employer pays nothing, even though his objective function contains a quadratic term reflecting negative consequences for the employer if the actual level of compensation deviates from the expectations of the employee. These can include, for example, labour turnover, or the expenses resulting from a bad reputation. In a dynamic framework, the conclusion can be formulated as follows: the higher the discount factor, the higher the incentive for the employer to be honest in his or her relationships with the employee.
If the discount factor is taken to be a parameter reflecting the degree of (un)certainty (the higher the degree of uncertainty, the lower the discount factor), we can conclude that the answer to the question posed above depends on the stability of the political, social and economic situation in a country. Mr. Pechersky believes that the strength of a market system with private property lies not just in its providing the information needed to compute an efficient allocation of resources. At least equally important is the manner in which it accepts individually self-interested behaviour, but then channels this behaviour in desired directions. People do not have to be cajoled, artificially induced, or forced to do their parts in a well-functioning market system. Instead, they are simply left to pursue their own objectives as they see fit. Under the right circumstances, people are led by Adam Smith's "invisible hand" of impersonal market forces to take the actions needed to achieve an efficient, co-ordinated pattern of choices. The problem, as Mr. Pechersky sees it, is that there is no reason to believe that the circumstances in Russia are right and that the invisible hand is doing its work properly. Political instability, social tension and other circumstances prevent it from doing so. Mr. Pechersky believes that the discount factor plays a crucial role in employer-employee relationships. Such relationships can be considered satisfactory from a normative point of view only in those cases where the discount factor is sufficiently large. Unfortunately, in modern Russia the evidence points to the typical discount factor being relatively small. This can be explained as a manifestation of economic agents' aversion to risk. Mr. Pechersky hopes that when political stabilisation occurs, the discount factors of economic agents will increase, and their behaviour will be explicable in terms of more traditional models.
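The linear contract and the discount-factor argument described above can be sketched in a few lines of code. This is an illustrative toy, not Mr. Pechersky's actual model; all numerical values are assumptions chosen for the example.

```python
# Illustrative sketch of the linear incentive contract W = a + bx and a
# toy repeated-game check of when the employer prefers honesty.
# All numbers are assumptions for illustration.

def wage(a, b, x):
    """Linear incentive contract: base pay a plus share b of outcome x."""
    return a + b * x

def employer_prefers_honesty(one_shot_gain, per_period_loss, delta):
    """Grim-trigger style condition: reneging saves `one_shot_gain` today
    but costs `per_period_loss` (turnover, bad reputation) in every future
    period, discounted by delta. Honesty pays iff the discounted stream
    of future losses outweighs the immediate gain."""
    future_loss = per_period_loss * delta / (1 - delta)
    return future_loss >= one_shot_gain

print(wage(100, 0.2, 500))                    # 200.0
print(employer_prefers_honesty(50, 10, 0.9))  # True: high delta -> honest
print(employer_prefers_honesty(50, 10, 0.5))  # False: low delta -> renege
```

The check mirrors the text's conclusion: the higher the discount factor, the stronger the employer's incentive to honour the promised compensation.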
Abstract:
BACKGROUND: Pain is a common experience in later life. There is conflicting evidence of the prevalence, impact, and context of pain in older people. GPs are criticised for underestimating and under-treating pain. AIM: To assess the extent to which older people experience pain, and to explore relationships between self-reported pain and functional ability and depression. DESIGN OF STUDY: Secondary analysis of baseline data from a randomised controlled trial of health risk appraisal. SETTING: A total of 1090 community-dwelling non-disabled people aged 65 years and over were included in the study from three group practices in suburban London. METHOD: Main outcome measures were pain in the last 4 weeks and the impact of pain, measured using the 24-item Geriatric Pain Measure; depression symptoms captured using the 5-item Mental Health Inventory; social relationships measured using the 6-item Lubben Social Network Scale; Basic and Instrumental Activities of Daily Living and self-reported symptoms. RESULTS: Forty-five per cent of women and 34% of men reported pain in the previous 4 weeks. Pain experience appeared to be less in the 'oldest old': 27.5% of those aged 85 years and over reported pain compared with 38-53% of the 'younger old'. Those with arthritis were four times more likely to report pain. Pain had a profound impact on activities of daily living, but most of those reporting pain described their health as good or excellent. Although there was a significant association between the experience of pain and depressed mood, the majority of those reporting pain did not have depressed mood. CONCLUSION: A multidimensional approach to assessing pain is appropriate. Primary care practitioners should also assess the impact of pain on activities of daily living.
Abstract:
OBJECTIVE: To explore the feasibility and psychometric properties of a self-administered version of the 24-item Geriatric Pain Measure (GPM-24-SA). DESIGN: Secondary analysis of baseline data from the Prevention in Older People-Assessment in Generalists' practices trial, an international multi-center study of a health-risk appraisal system. PARTICIPANTS: One thousand seventy-two community dwelling nondisabled older adults self-reporting pain from London, UK; Hamburg, Germany; and Solothurn, Switzerland. OUTCOME MEASURES: GPM-24-SA as part of a multidimensional Health Risk Appraisal Questionnaire including self-reported demographic and health-related information. RESULTS: Among the 1,072 subjects, 655 had complete GPM-24-SA data, 404 had
Abstract:
Traditional methods do not actually measure people's risk attitudes naturally and precisely. Therefore, a fuzzy risk attitude classification method is developed. Since prospect theory is usually considered an effective model of decision making, the personalized parameters in prospect theory are first fuzzified to distinguish people with different risk attitudes, and then a fuzzy classification database schema is applied to calculate the exact value of risk value attitude and risk behavior attitude. Finally, by applying a two-hierarchical classification model, the precise value of synthetical risk attitude can be acquired.
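The prospect-theory machinery this method builds on can be illustrated with the standard Kahneman-Tversky value function. The parameter values and the crisp attitude labels below are assumptions for illustration, not the paper's fuzzified parameters or its fuzzy classification database schema.

```python
# Sketch of the prospect-theory value function (Kahneman-Tversky form).
# Default parameters are the commonly cited estimates; the crisp
# attitude labels are a toy stand-in for the paper's fuzzy classes.

def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value of outcome x relative to a reference point of 0.
    alpha/beta control diminishing sensitivity; lam is loss aversion."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

def attitude_from_alpha(alpha):
    """Toy crisp classification: alpha < 1 gives a concave value
    function for gains (risk aversion there); alpha > 1 gives convexity."""
    if alpha < 1:
        return "risk averse over gains"
    if alpha > 1:
        return "risk seeking over gains"
    return "risk neutral over gains"

print(round(pt_value(100), 2))    # gains are compressed
print(round(pt_value(-100), 2))   # losses loom larger (loss aversion)
print(attitude_from_alpha(0.88))  # risk averse over gains
```

Fuzzifying alpha, beta, and lam (rather than fixing them) is what lets the method grade membership in risk-attitude classes instead of forcing a single crisp label.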
Abstract:
PRINCIPLES To evaluate the validity and feasibility of a novel photography-based home assessment (PhoHA) protocol as a possible substitute for on-site home assessment (OsHA). METHODS A total of 20 patients aged ≥65 years who were hospitalised in a rehabilitation centre for musculoskeletal disorders affecting mobility participated in this prospective validation study. For PhoHA, occupational therapists rated photographs and measurements of patients' homes provided by patients' confidants. For OsHA, occupational therapists conducted a conventional home visit. RESULTS Information obtained by PhoHA was 79.1% complete (1,120 environmental factors identified by PhoHA vs 1,416 by OsHA). Of the 1,120 factors, 749 had dichotomous scores (potential hazards) and 371 had continuous scores (measurements with a tape measure). The validity of PhoHA for identifying potential hazards was good (sensitivity 78.9%, specificity 84.9%), except in two subdomains (pathways, slippery surfaces). Pearson's correlation coefficient for the validity of measurements was 0.87 (95% confidence interval [CI] 0.80-0.92, p<0.001). Agreement between methods was 0.52 (95% CI 0.34-0.67, p<0.001, Cohen's kappa coefficient) for dichotomous scores and 0.86 (95% CI 0.79-0.91, p<0.001, intraclass correlation coefficient) for continuous scores. Costs of PhoHA were 53.0% lower than those of OsHA (p<0.001). CONCLUSIONS PhoHA has good concurrent validity for environmental assessment if instructions for confidants are improved. PhoHA is potentially a cost-effective method for environmental assessment.
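The agreement figure for dichotomous scores (kappa 0.52) follows the standard formula for Cohen's kappa. A minimal sketch for two raters on binary items follows; the counts in the example are made up, not the study's data.

```python
# Cohen's kappa for two raters (e.g., PhoHA vs OsHA) judging binary
# items: kappa = (po - pe) / (1 - pe), where po is observed agreement
# and pe is agreement expected by chance. Example counts are invented.

def cohens_kappa(both_yes, a_yes_b_no, a_no_b_yes, both_no):
    n = both_yes + a_yes_b_no + a_no_b_yes + both_no
    po = (both_yes + both_no) / n                 # observed agreement
    p_a_yes = (both_yes + a_yes_b_no) / n
    p_b_yes = (both_yes + a_no_b_yes) / n
    pe = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)  # chance agreement
    return (po - pe) / (1 - pe)

print(cohens_kappa(40, 0, 0, 60))  # 1.0: perfect agreement
print(round(cohens_kappa(45, 15, 25, 15), 2))  # moderate raw agreement,
                                               # low chance-corrected kappa
```

Kappa below raw percent agreement is expected: the correction removes the agreement two raters would reach by chance alone.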
Abstract:
Recent evidence suggests that transition risks from initial clinical high risk (CHR) status to psychosis are decreasing. The role played by remission in this context is mostly unknown. The present study addresses this issue by means of a meta-analysis including eight relevant studies published up to January 2012 that reported remission rates from an initial CHR status. The primary effect size measure was the longitudinal proportion of remissions compared to non-remissions in subjects with a baseline CHR state. Random effect models were employed to address the high heterogeneity across the studies included. To assess the robustness of the results, we performed sensitivity analyses by sequentially removing each study and rerunning the analysis. Of 773 subjects who met initial CHR criteria, 73% did not convert to psychosis over a 2-year follow-up. Of these, about 46% fully remitted from the baseline attenuated psychotic symptoms, as evaluated on the psychometric measures usually employed by prodromal services. The corresponding clinical remission was estimated at as high as 35% of the baseline CHR sample. The CHR state is associated with a significant proportion of remitting subjects, which can be accounted for by the effective treatments received, a lead-time bias, a dilution effect, or a comorbid effect of other psychiatric diagnoses.
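The reported proportions can be cross-checked with simple arithmetic: 73% of the baseline sample did not convert, and about 46% of those remitted, which works out to roughly the ~35% clinical remission quoted for the baseline sample.

```python
# Consistency check of the abstract's remission figures.

n_chr = 773                       # subjects meeting initial CHR criteria
non_converters = 0.73 * n_chr     # 73% did not convert over 2 years
remitted = 0.46 * non_converters  # ~46% of non-converters fully remitted
share_of_baseline = remitted / n_chr
print(round(share_of_baseline, 3))  # 0.336, i.e. about 34-35% of baseline
```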
Abstract:
Cancer is a chronic disease that often necessitates recurrent hospitalizations, a costly pattern of medical care utilization. In chronically ill patients, most readmissions are for treatment of the same condition that caused the preceding hospitalization. There is concern that rather than reducing costs, earlier discharge may shift costs from the initial hospitalization to emergency center visits. This is the first descriptive study to measure the incidence of emergency center visits (ECVs) after hospitalization at The University of Texas M. D. Anderson Cancer Center (UTMDACC), to identify the risk factors for and outcomes of these ECVs, and to compare 30-day all-cause mortality and costs for episodes of care with and without ECVs. We identified all hospitalizations at UTMDACC with admission dates from September 1, 1993 through August 31, 1997 that met the inclusion criteria. Data were obtained electronically, primarily from UTMDACC's institutional database. Demographic factors, clinical factors, duration of the index hospitalization, method of payment for care, and year of hospitalization were the variables determined for each hospitalization. The overall incidence of ECVs was 18%. Forty-five percent of ECVs resulted in hospital readmission (8% of all hospitalizations). In 1% of ECVs the patient died in the emergency center, and in the remaining 54% of ECVs the patient was discharged home. Risk factors for ECVs were marital status, type of index hospitalization, cancer type, and duration of the index hospitalization. The overall 30-day all-cause mortality rate was 8.6% for hospitalizations with an ECV and 5.3% for those without an ECV. In all subgroups, the 30-day all-cause mortality rate was higher for groups with ECVs than for those without. The most important factor increasing cost was having an ECV. In all patient subgroups, the cost per episode of care with an ECV was at least 1.9 times the cost per episode without an ECV.
The higher costs and poorer outcomes of episodes of care with ECVs and hospital readmissions suggest that interventions to avoid these ECVs or mitigate their costs are needed. Further research is needed to improve understanding of the methodological issues involved in health care for cancer patients.
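The headline rates above are internally consistent, as a quick arithmetic check shows: 18% of hospitalizations led to an ECV, and 45% of those visits ended in readmission, matching the reported 8% readmission rate over all hospitalizations.

```python
# Consistency check of the abstract's ECV and readmission rates.

ecv_rate = 0.18           # share of hospitalizations followed by an ECV
readmit_given_ecv = 0.45  # share of ECVs resulting in readmission
readmit_rate = ecv_rate * readmit_given_ecv
print(round(readmit_rate, 3))  # 0.081, i.e. about 8% of all hospitalizations
```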
Abstract:
BACKGROUND Prediction studies in subjects at Clinical High Risk (CHR) for psychosis are hampered by a high proportion of uncertain outcomes. We therefore investigated whether quantitative EEG (QEEG) parameters can contribute to an improved identification of CHR subjects with a later conversion to psychosis. METHODS This investigation was a project within the European Prediction of Psychosis Study (EPOS), a prospective multicenter, naturalistic field study with an 18-month follow-up period. QEEG spectral power and alpha peak frequencies (APF) were determined in 113 CHR subjects. The primary outcome measure was conversion to psychosis. RESULTS Cox regression yielded a model including frontal theta (HR=1.82; [95% CI 1.00-3.32]) and delta (HR=2.60; [95% CI 1.30-5.20]) power, and occipital-parietal APF (HR=.52; [95% CI .35-.80]) as predictors of conversion to psychosis. The resulting equation enabled the development of a prognostic index with three risk classes (hazard rate 0.057 to 0.81). CONCLUSIONS Power in theta and delta ranges and APF contribute to the short-term prediction of psychosis and enable a further stratification of risk in CHR samples. Combined with (other) clinical ratings, EEG parameters may therefore be a useful tool for individualized risk estimation and, consequently, targeted prevention.
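A prognostic index of the kind described is typically the Cox model's linear predictor, with coefficients taken as the logarithms of the hazard ratios. The sketch below uses the HRs reported above, but the subject's standardized predictor values are hypothetical, and the actual EPOS equation and risk-class cutpoints are not reproduced here.

```python
# Sketch of a Cox-model prognostic index: linear predictor
# sum(beta_i * x_i) with beta = ln(HR), using the hazard ratios
# reported in the abstract. The subject's values are hypothetical
# standardized scores, not EPOS data.

import math

betas = {
    "frontal_theta": math.log(1.82),
    "frontal_delta": math.log(2.60),
    "occipital_parietal_apf": math.log(0.52),  # HR < 1: protective
}

def prognostic_index(predictors):
    """Linear predictor; higher values mean higher conversion risk."""
    return sum(betas[name] * x for name, x in predictors.items())

# Hypothetical subject: elevated slow-wave power, low alpha peak frequency
subject = {"frontal_theta": 1.0, "frontal_delta": 1.0,
           "occipital_parietal_apf": -1.0}
print(round(prognostic_index(subject), 3))
```

Stratifying such an index into bands is how a model of this kind yields the three risk classes mentioned in the abstract.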
Abstract:
BACKGROUND The number of older adults in the global population is increasing. This demographic shift leads to an increasing prevalence of age-associated disorders, such as Alzheimer's disease and other types of dementia. With the progression of the disease, the risk of institutional care increases, which contrasts with the desire of most patients to stay in their home environment. Despite doctors' and caregivers' awareness of the patient's cognitive status, they are often uncertain about its consequences for activities of daily living (ADL). To provide effective care, they need to know how patients cope with ADL, and in particular how to estimate the risks associated with cognitive decline. The occurrence, performance, and duration of different ADL are important indicators of functional ability. The patient's ability to cope with these activities is traditionally assessed with questionnaires, which have disadvantages (e.g., lack of reliability and sensitivity). Several groups have proposed sensor-based systems to recognize and quantify these activities in the patient's home. Combined with Web technology, these systems can inform caregivers about their patients in real time (e.g., via smartphone). OBJECTIVE We hypothesize that a non-intrusive system, which does not use body-mounted sensors, video-based imaging, or microphone recordings, would be better suited for use in dementia patients. Since it does not require the patient's attention and compliance, such a system might be well accepted by patients. We present a passive, Web-based, non-intrusive, assistive technology system that recognizes and classifies ADL. METHODS The components of this novel assistive technology system were wireless sensors distributed in every room of the participant's home and a central computer unit (CCU). The environmental data were acquired for 20 days (per participant) and then stored and processed on the CCU. In consultation with medical experts, eight ADL were classified.
RESULTS In this study, 10 healthy participants (6 women, 4 men; mean age 48.8 years; SD 20.0 years; age range 28-79 years) were included. For explorative purposes, one female Alzheimer patient (Montreal Cognitive Assessment score=23, Timed Up and Go=19.8 seconds, Trail Making Test A=84.3 seconds, Trail Making Test B=146 seconds) was measured in parallel with the healthy subjects. In total, 1317 ADL were performed by the participants, 1211 ADL were classified correctly, and 106 ADL were missed. This led to an overall sensitivity of 91.27% and a specificity of 92.52%. Each subject performed an average of 134.8 ADL (SD 75). CONCLUSIONS The non-intrusive wireless sensor system can acquire environmental data essential for the classification of activities of daily living. By analyzing retrieved data, it is possible to distinguish and assign data patterns to subjects' specific activities and to identify eight different activities in daily living. The Web-based technology allows the system to improve care and provides valuable information about the patient in real-time.
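The evaluation metrics above are the standard sensitivity and specificity of a classifier. A minimal sketch follows; the counts in the example are illustrative, since the abstract does not report the full confusion matrix.

```python
# Sensitivity and specificity in their standard form, as used to
# evaluate the ADL classifier. Example counts are invented.

def sensitivity(true_pos, false_neg):
    """Share of actually performed activities the system detected."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Share of non-events the system correctly left unclassified."""
    return true_neg / (true_neg + false_pos)

print(sensitivity(90, 10))  # 0.9
print(specificity(93, 7))   # 0.93
```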
Abstract:
Decisions require careful weighing of the risks and benefits associated with a choice. Some people need to be offered large rewards to balance even minimal risks, whereas others take great risks in the hope for an only minimal benefit. We show here that risk-taking is a modifiable behavior that depends on right hemisphere prefrontal activity. We used low-frequency, repetitive transcranial magnetic stimulation to transiently disrupt left or right dorsolateral prefrontal cortex (DLPFC) function before applying a well known gambling paradigm that provides a measure of decision-making under risk. Individuals displayed significantly riskier decision-making after disruption of the right, but not the left, DLPFC. Our findings suggest that the right DLPFC plays a crucial role in the suppression of superficially seductive options. This confirms the asymmetric role of the prefrontal cortex in decision-making and reveals that this fundamental human capacity can be manipulated in normal subjects through cortical stimulation. The ability to modify risk-taking behavior may be translated into therapeutic interventions for disorders such as drug abuse or pathological gambling.
Abstract:
BACKGROUND AND PURPOSE Treatment with statins reduces the rate of cardiovascular events in high-risk patients, but residual risk persists. At least part of that risk may be attributable to atherogenic dyslipidemia, characterized by low high-density lipoprotein cholesterol (≤40 mg/dL) and high triglycerides (≥150 mg/dL). METHODS We studied subjects with stroke or transient ischemic attack in the Prevention of Cerebrovascular and Cardiovascular Events of Ischemic Origin With Terutroban in Patients With a History of Ischemic Stroke or Transient Ischemic Attack (PERFORM; n=19,100) and Stroke Prevention by Aggressive Reduction in Cholesterol Levels (SPARCL; n=4,731) trials who were treated with a statin and who had high-density lipoprotein cholesterol and triglyceride measurements 3 months after randomization (n=10,498 and 2,900, respectively). The primary outcome measure for this exploratory analysis was the occurrence of major cardiovascular events (nonfatal myocardial infarction, nonfatal stroke, or cardiovascular death). We also performed a time-varying analysis to account for all available high-density lipoprotein cholesterol and triglyceride measurements. RESULTS A total of 10% of subjects in PERFORM and 9% in SPARCL had atherogenic dyslipidemia after ≥3 months on statin therapy. After a follow-up of 2.3 years (PERFORM) and 4.9 years (SPARCL), a major cardiovascular event occurred in 1123 and 485 patients in the 2 trials, respectively. The risk of major cardiovascular events was higher in subjects with versus those without atherogenic dyslipidemia in both PERFORM (hazard ratio, 1.36; 95% confidence interval, 1.14-1.63) and SPARCL (hazard ratio, 1.40; 95% confidence interval, 1.06-1.85). The association was attenuated after multivariable adjustment (hazard ratio, 1.23; 95% confidence interval, 1.03-1.48 in PERFORM and hazard ratio, 1.24; 95% confidence interval, 0.93-1.65 in SPARCL). Time-varying analysis confirmed these findings.
CONCLUSIONS The presence of atherogenic dyslipidemia was associated with higher residual cardiovascular risk in PERFORM and SPARCL subjects with stroke or transient ischemic attack receiving statin therapy. Specific therapeutic interventions should now be trialed to address this residual risk.
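The atherogenic dyslipidemia definition used in these analyses reduces to a simple predicate on the two lipid measurements, as given in the abstract (HDL cholesterol ≤40 mg/dL and triglycerides ≥150 mg/dL):

```python
# The abstract's definition of atherogenic dyslipidemia as a predicate.

def atherogenic_dyslipidemia(hdl_mg_dl, tg_mg_dl):
    """Low HDL cholesterol combined with high triglycerides."""
    return hdl_mg_dl <= 40 and tg_mg_dl >= 150

print(atherogenic_dyslipidemia(38, 180))  # True
print(atherogenic_dyslipidemia(55, 120))  # False
```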
Abstract:
BACKGROUND Early identification of patients at risk of developing persistent low back pain (LBP) is crucial. OBJECTIVE The aim of this study was to identify, in patients with a new episode of LBP, the time point at which those at risk of developing persistent LBP can best be identified. METHODS Prospective cohort study of 315 patients presenting to a health practitioner with a first episode of acute LBP. The primary outcome measure was functional limitation. Patients were assessed at baseline, three, six, and twelve weeks, and six months, with factors of maladaptive cognition examined as potential predictors. Multivariate logistic regression analysis was performed for all time points. RESULTS The best time point to predict the development of persistent LBP at six months was the twelve-week follow-up (sensitivity 78%; overall predictive value 90%). Cognitions assessed at the first visit to a health practitioner were not predictive. CONCLUSIONS Maladaptive cognitions at twelve weeks appear to be suitable predictors of a transition from acute to persistent LBP. Cognitions might already influence the development of persistent LBP three weeks after patients present to a health practitioner with acute LBP. Therefore, cognitive-behavioral interventions should be considered as early adjuvant LBP treatment in patients at risk of developing persistent LBP.