Abstract:
Background: According to the World Health Organization, stroke is the 'incoming epidemic of the 21st century'. In light of recent data suggesting that 85% of all strokes may be preventable, strategies for prevention are moving to the forefront of stroke management. Summary: This review discusses the risk factors and provides evidence on the effective medical interventions and lifestyle modifications for optimal stroke prevention. Key Messages: Stroke risk can be substantially reduced by medical measures that have been proven in many randomized trials, in combination with effective lifestyle modifications. Global modification of health behaviors and lifestyle is more beneficial than treating individual risk factors in isolation. Clinical Implications: Hypertension is the most important modifiable risk factor for stroke. Effective reduction of blood pressure is essential for stroke prevention, even more so than the choice of antihypertensive drug. Indications for the use of antihypertensive drugs depend on blood pressure values and the vascular risk profile; thus, treatment should be initiated earlier in patients with diabetes mellitus or in those with a high vascular risk profile. Treatment of dyslipidemia with statins, anticoagulation therapy in atrial fibrillation, and carotid endarterectomy in symptomatic high-grade carotid stenosis are also effective for stroke prevention. Lifestyle factors proven to reduce stroke risk include reducing salt intake, smoking cessation, regular physical activity, and maintaining a normal body weight. © 2015 S. Karger AG, Basel.
Abstract:
OBJECTIVE The cause precipitating intracranial aneurysm rupture remains unknown in many cases. It has been observed that aneurysm ruptures cluster in time, but the trigger mechanism remains obscure. Because solar activity has been associated with cardiovascular mortality and morbidity, we studied its association with aneurysm rupture in the Swiss population. METHODS Patient data were extracted from the Swiss SOS database, which at the time of analysis covered 918 consecutive patients with angiography-proven aneurysmal subarachnoid hemorrhage treated at 7 Swiss neurovascular centers between January 1, 2009, and December 31, 2011. The daily rupture frequency (RF) was correlated with the absolute amount of, and the change in, various parameters representing continuous measurements of solar activity (radioflux [F10.7 index], solar proton flux, solar flare occurrence, planetary K-index/planetary A-index, Space Environment Services Center [SESC] sunspot number, and sunspot area) using Poisson regression analysis. RESULTS During the period of interest, there were 517 days without a recorded aneurysm rupture, and 398, 139, 27, 12, 1, and 1 days with 1, 2, 3, 4, 5, and 6 ruptures per day, respectively. Poisson regression analysis demonstrated a significant correlation between the F10.7 index and RF (incidence rate ratio [IRR] 1.006303; standard error [SE] 0.0013201; 95% confidence interval [CI] 1.003719-1.008894; P < 0.001), according to which every 1-unit increase of the F10.7 index increased the expected daily rupture count by 0.63%. Likewise, statistically significant relationships emerged for both the SESC sunspot number (IRR 1.003413; SE 0.0007913; 95% CI 1.001864-1.004965; P < 0.001) and the sunspot area (IRR 1.000419; SE 0.0000866; 95% CI 1.000249-1.000589; P < 0.001). All other variables analyzed showed no significant correlation with RF. CONCLUSIONS We found greater radioflux, SESC sunspot number, and sunspot area to be associated with an increased count of aneurysm ruptures. The clinical meaningfulness of this statistical association must be interpreted carefully, and future studies are warranted to rule out a type I error.
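The reported effect sizes follow directly from how Poisson regression coefficients are read: the incidence rate ratio is exp(b1), so an IRR of 1.006303 corresponds to a 100 × (IRR − 1) ≈ 0.63% increase in the expected daily rupture count per unit of F10.7. A minimal sketch of this type of analysis, using simulated stand-in data rather than the Swiss SOS records:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated stand-in data (NOT the Swiss SOS records): daily F10.7
# radioflux and daily aneurysm rupture counts over three years.
n_days = 1096
f107 = rng.normal(100, 25, n_days)        # solar radioflux, solar flux units
true_rate = np.exp(-0.4 + 0.00628 * f107)  # rate used only to simulate counts
ruptures = rng.poisson(true_rate)

# Poisson regression: log E[count] = b0 + b1 * F10.7
X = sm.add_constant(f107)
fit = sm.GLM(ruptures, X, family=sm.families.Poisson()).fit()

# The incidence rate ratio exp(b1) multiplies the expected daily count
# for each 1-unit increase in F10.7.
irr = np.exp(fit.params[1])
print(f"IRR = {irr:.6f} -> {100 * (irr - 1):.2f}% increase per unit F10.7")
```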
Abstract:
BACKGROUND Assessment of the proportion of patients with well-controlled cardiovascular risk factors underestimates the proportion of patients receiving high-quality care. Evaluating whether physicians respond appropriately to poor risk factor control gives a different picture of quality of care. We assessed physician responses to poorly controlled cardiovascular risk factors, as well as markers of potential overtreatment, in Switzerland, a country with universal healthcare coverage but without systematic quality monitoring, annual report cards on quality of care, or financial incentives to improve quality. METHODS We performed a retrospective cohort study of 1002 randomly selected patients aged 50-80 years from four university primary care settings in Switzerland. For hypertension, dyslipidemia and diabetes mellitus, we first measured the proportions in control, then assessed therapy modifications among those in poor control. "Appropriate clinical action" was defined as a therapy modification, or a return to control without therapy modification, within 12 months among patients with baseline poor control. Potential overtreatment of these conditions was defined as intensive treatment among low-risk patients with optimal target values. RESULTS 20% of patients with hypertension, 41% with dyslipidemia and 36% with diabetes mellitus were in control at baseline. When appropriate clinical action in response to poor control was integrated into measuring quality of care, 52 to 55% had appropriate quality of care. Over 12 months, therapy was modified for 61% of patients with baseline poor control of hypertension, 33% for dyslipidemia, and 85% for diabetes mellitus. Increases in the number of drug classes (28-51%) and in drug doses (10-61%) were the most common therapy modifications. Patients with target organ damage and higher baseline values were more likely to receive appropriate clinical action. We found low rates of potential overtreatment: 2% for hypertension, 3% for diabetes mellitus and 3-6% for dyslipidemia. CONCLUSIONS In primary care, evaluating whether physicians respond appropriately to poor risk factor control, in addition to assessing proportions in control, provides a broader view of quality of care than relying solely on measures of proportions in control. Such measures could be more clinically relevant and acceptable to physicians than simply reporting levels of control.
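As a rough illustration of how the "appropriate clinical action" indicator defined above could be computed from patient-level records, here is a short sketch; the table layout and column names are hypothetical, not the study's actual data dictionary:

```python
import pandas as pd

# Hypothetical patient-level records; column names are assumptions
# made for illustration only.
patients = pd.DataFrame({
    "patient_id":        [1, 2, 3, 4],
    "poor_control_base": [True, True, True, False],   # poor control at baseline
    "therapy_modified":  [True, False, False, False], # any modification within 12 months
    "in_control_12m":    [False, True, False, True],  # back in control at 12 months
})

# "Appropriate clinical action": among patients with baseline poor control,
# therapy was modified OR control was regained without modification.
base = patients[patients["poor_control_base"]]
appropriate = base["therapy_modified"] | base["in_control_12m"]
print(f"Appropriate clinical action: {appropriate.mean():.0%} of poorly controlled patients")
```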
Abstract:
OBJECTIVES Sensorineural hearing loss from sound overexposure has a considerable prevalence. Identification of sound hazards is crucial because, in the absence of definitive therapies, prevention is the sole alternative to hearing aids. One subjectively loud, yet little studied, potential sound hazard is movie theaters. This study evaluates the applicability of smart phones as widely available, validated sound pressure level (SPL) meters, and uses them to measure sound levels in movie theaters to determine whether those levels exceed safe occupational noise exposure limits and whether they differ as a function of movie, movie theater, presentation time, and seat location within the theater. DESIGN Six smart phones with an SPL meter software application were calibrated with a precision SPL meter and validated as SPL meters. Additionally, three different smart phone generations were measured in comparison to an integrating SPL meter. Two different movies, an action movie and a children's movie, were each measured six times in 10 different venues (n = 117). To maximize representativeness, large release productions with probable high attendance were selected. Movie theaters were selected in the San Francisco, CA, area based on whether they screened both chosen movies and to represent the largest variety of theater proprietors. Measurements were analyzed with regard to differences between theaters, location within the theater, and movie, as well as presentation time and day as indirect indicators of film attendance. RESULTS The smart phone measurements demonstrated high accuracy and reliability. Overall, sound levels in movie theaters did not exceed safe exposure limits by occupational standards. Sound levels varied significantly across theaters, and the action movie produced significantly higher sound levels and exposures than the children's movie. Sound levels decreased with distance from the screen. However, no influence of time of day or day of the week, as indirect indicators of film attendance, was found. CONCLUSIONS Calibrated smart phones with an appropriate software application, as used in this study, can serve as validated SPL meters. Because of their wide availability, smart phones combined with such an application can provide large quantities of recreational sound exposure measurements, which can facilitate the identification of potential noise hazards. Sound levels in movie theaters decrease with distance from the screen but do not exceed safe occupational noise exposure limits. Additionally, there are significant differences in sound levels across movie theaters and movies, but not across presentation times.
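The abstract does not name the specific occupational standard applied; as one common reference point, the NIOSH recommended exposure limit is 85 dBA as an 8-hour time-weighted average with a 3-dB exchange rate. A sketch of how a noise dose could be computed from periodic SPL samples under that assumption (the movie data below are invented for illustration):

```python
import numpy as np

def noise_dose(levels_dba, sample_seconds, criterion_db=85.0,
               criterion_hours=8.0, exchange_rate_db=3.0):
    """Percent noise dose from a series of A-weighted SPL samples.

    Uses dose = 100 * sum(t_i / T_i), where T_i is the allowable duration
    at level L_i:
        T_i = criterion_hours * 2 ** ((criterion_db - L_i) / exchange_rate_db)
    Defaults follow the NIOSH REL (85 dBA, 8 h, 3-dB exchange rate).
    """
    levels = np.asarray(levels_dba, dtype=float)
    allowed_hours = criterion_hours * 2.0 ** ((criterion_db - levels) / exchange_rate_db)
    hours = sample_seconds / 3600.0
    return 100.0 * np.sum(hours / allowed_hours)

# Hypothetical example: a 2-hour action movie sampled once per minute,
# mostly around 75 dBA with occasional 95 dBA loud scenes.
rng = np.random.default_rng(1)
movie = np.where(rng.random(120) < 0.1, 95.0, 75.0)
print(f"Dose: {noise_dose(movie, sample_seconds=60):.1f}% of the daily limit")
```

A dose below 100% means the exposure stays within the assumed limit, consistent with the study's overall finding.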
Abstract:
The unprecedented success of social networking sites (SNSs) has recently been overshadowed by concerns about privacy risks. As SNS users grow weary of privacy breaches and thus develop distrust, they may restrict or even terminate their platform activities. In the long run, these developments endanger SNS platforms' financial viability and undermine their ability to create individual and social value. By applying a justice perspective, this study aims to understand the means at the disposal of SNS providers to leverage the privacy concerns and trusting beliefs of their users, two important determinants of user participation on SNSs. Considering that SNSs have a global appeal, empirical tests assess the effectiveness of justice measures in three culturally distinct countries: Germany, Russia and Morocco. The results indicate that these measures are particularly suited to addressing the trusting beliefs of the SNS audience. Specifically, in all examined countries, procedural justice and the awareness dimension of informational justice improve perceptions of trust in the SNS provider. Privacy concerns, however, are not as easy to manage, because the impact of justice-based measures on privacy concerns is not universal. Beyond its theoretical value, this research offers valuable practical insights into the use of justice-based measures to promote trust and mitigate privacy concerns in a cross-cultural setting.
Abstract:
Assessing and managing risks relating to the consumption of foodstuffs by humans and to the environment has been one of the most complex legal issues in WTO law ever since the Agreement on Sanitary and Phytosanitary Measures was adopted at the end of the Uruguay Round and entered into force in 1995. The problem was expounded in a number of cases. Panels and the Appellate Body adopted different philosophies in interpreting the Agreement and the basic concept of risk assessment as defined in Annex A para. 4 of the Agreement. Risk assessment entails fundamental questions of law and science. Different interpretations reflect different underlying perceptions of science and its relationship to the law. The present thesis, supported by the Swiss National Research Foundation, undertakes an in-depth analysis of these underlying perceptions. The author expounds the essence of, and differences between, positivism and relativism in philosophy and the natural sciences. He clarifies the relationship of fundamental concepts such as risk, hazard and probability. This investigation is a remarkable effort on the part of a lawyer keen to learn more about the fundamentals upon which the law, often unconsciously, is operated by the legal profession and the trade community. Based upon these insights, he turns to a critical assessment of the jurisprudence of both panels and the Appellate Body. Extensively referring to and discussing the literature, he deconstructs findings and decisions in light of implied and assumed underlying philosophies and perceptions as to the relationship of law and science, in particular in the field of food standards. Finding that neither positivism nor relativism provides adequate answers, the author turns to critical rationalism and applies the methodology of falsification developed by Karl R. Popper. Critical rationalism allows discourse in science and law to be combined and helps prepare the ground for a new approach to risk assessment and risk management. Linking the problem to the doctrine of multilevel governance, the author develops a theory allocating risk assessment to international fora while leaving the matter of risk management to national and democratically accountable governments. While the author throughout the thesis questions the possibility of separating risk assessment and risk management, the thesis offers new avenues which may assist in structuring a complex and difficult problem.
Abstract:
Numerous studies have reported a strong link between working memory capacity (WMC) and fluid intelligence (Gf), although views differ with respect to how closely the two constructs are related. In the present study, we used a WMC task with five levels of task demand to assess the relationship between WMC and Gf by means of a new methodological approach referred to as fixed-links modeling. Fixed-links models belong to the family of confirmatory factor analysis (CFA) and are of particular interest for experimental, repeated-measures designs. With this technique, processes systematically varying across task conditions can be disentangled from processes unaffected by the experimental manipulation. Proceeding from the assumption that the experimental manipulation in a WMC task leads to increasing demands on WMC, the processes systematically varying across task conditions can be assumed to be WMC-specific. Processes not varying across task conditions, on the other hand, are probably independent of WMC. Fixed-links models allow these two kinds of processes to be represented by two independent latent variables. In contrast to traditional CFA, where a common latent variable is derived from the different task conditions, fixed-links models facilitate a more precise, purified representation of the WMC-related processes of interest. By using fixed-links modeling to analyze data from 200 participants, we identified a non-experimental latent variable, representing processes that remained constant irrespective of the WMC task conditions, and an experimental latent variable reflecting processes that varied as a function of the experimental manipulation. This latter variable represents the increasing demands on WMC and, hence, was considered a purified measure of WMC controlled for the constant processes. Fixed-links modeling showed that both the purified measure of WMC (β = .48) and the constant processes involved in the task (β = .45) were related to Gf. Taken together, these two latent variables explained the same portion of variance in Gf as a single latent variable obtained by traditional CFA (β = .65), indicating that traditional CFA overestimates the effective relationship between WMC and Gf. Thus, fixed-links modeling provides a feasible method for a more valid investigation of the functional relationship between specific constructs.
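Fixed-links models are typically estimated in SEM software; the decomposition they formalize can be illustrated with a toy simulation in which each condition score is the sum of a constant, load-independent component and a component whose loading increases with task demand. The sketch below is purely illustrative and is not the authors' analysis:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200                  # participants (matching the study's sample size)
loads = np.arange(1, 6)  # five levels of WM task demand

# Two independent person-level latent variables: a constant process
# and a demand-sensitive (WMC-specific) process.
constant = rng.normal(0, 1, n)
wmc = rng.normal(0, 1, n)

# Fixed-links logic: loadings on the constant factor are fixed to 1 for
# every condition; loadings on the experimental factor increase with load.
scores = (constant[:, None] * 1.0
          + wmc[:, None] * loads[None, :]
          + rng.normal(0, 1, (n, len(loads))))

# In a fixed-links CFA both factors are estimated jointly; here, because
# the loadings are fixed and known, least squares recovers them directly.
design = np.column_stack([np.ones(len(loads)), loads])
est, *_ = np.linalg.lstsq(design, scores.T, rcond=None)
print("corr(estimated constant, true):", np.corrcoef(est[0], constant)[0, 1].round(2))
print("corr(estimated WMC, true):     ", np.corrcoef(est[1], wmc)[0, 1].round(2))
```

The two recovered person scores are uncorrelated by construction, which is what lets the model relate the "purified" WMC component and the constant component to Gf separately.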
Abstract:
Neuropsychologists often face interpretational difficulties when assessing cognitive deficits, particularly in cases of unclear cerebral etiology. How can we be sure whether a single test score below the population average is indicative of a pathological brain condition or is normal variation? In the past few years, the topic of intra-individual performance variability has gained great interest. On the basis of a large normative sample, two measures of performance variability and their importance for neuropsychological interpretation are presented in this paper: the number of low scores and the level of dispersion. We conclude that low scores are common in healthy individuals. On the other hand, the level of dispersion is relatively small. Here, base rate information about abnormally low scores and abnormally high dispersion across cognitive abilities is provided to improve awareness of normal variability and to serve clinicians as additional interpretive measures in the diagnostic process.
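A small sketch of the two interpretive measures described above, computed over a battery of standardized scores; the cutoff of one SD below the mean and the operationalization of dispersion as the z-score range are assumptions for illustration, not necessarily the paper's exact definitions:

```python
import numpy as np

def variability_measures(scores, low_cutoff=-1.0):
    """Two intra-individual variability measures for a battery of z-scores.

    low_cutoff of -1.0 (one SD below the population mean) is an assumed
    illustrative threshold; clinical cutoffs vary by battery and norms.
    """
    scores = np.asarray(scores, dtype=float)
    n_low = int(np.sum(scores < low_cutoff))         # number of low scores
    dispersion = float(scores.max() - scores.min())  # range across subtests
    return n_low, dispersion

# Hypothetical z-scores of one healthy individual across 10 subtests.
battery = [0.3, -1.2, 0.8, -0.4, 1.1, -0.9, 0.1, -1.4, 0.5, 0.0]
n_low, dispersion = variability_measures(battery)
print(f"{n_low} low scores; dispersion = {dispersion:.1f} SD units")
```

Comparing such values against normative base rates is what allows a clinician to judge whether an individual's profile is unusual rather than a routine consequence of taking many tests.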
Abstract:
We investigated whether amygdala activation, autonomic responses, respiratory responses, and facial muscle activity (measured over the brow and cheek [fear grin] regions) are all sensitive to phobic versus nonphobic fear and, more importantly, whether effects in these variables vary as a function of both phobic and nonphobic fear intensity. Spider-phobic and comparably low spider-fearful control participants imagined encountering different animals and rated their subjective fear while their central and peripheral nervous system activity was measured. All measures included in our study were sensitive to variations in subjective fear, but were related to different ranges and positions on the subjective fear continuum. Left amygdala activation, heart rate, and facial muscle activity over the cheek region captured fear intensity variations even within narrowly defined regions of the fear continuum (here, within extremely low levels of fear and within considerable phobic fear). Skin conductance and facial muscle activity over the brow region did not capture fear intensity variations within low levels of fear: skin conductance mirrored only extreme levels of fear, and activity over the brow region distinguished phobic from nonphobic fear, and also low-to-moderate from high phobic fear. Finally, respiratory measures distinguished phobic from nonphobic fear with no further differentiation within phobic and nonphobic fear. We conclude that careful consideration of the measures to be used in an investigation and the population to be examined can be critical for obtaining significant results.
Abstract:
Soil carbon (C) storage is a key ecosystem service. Soil C stocks play a vital role in soil fertility and climate regulation, but the factors that control these stocks at regional and national scales are unknown, particularly when their composition and stability are considered. As a result, their mapping relies on either unreliable proxy measures or laborious direct measurements. Using data from an extensive national survey of English grasslands, we show that surface soil (0–7 cm) C stocks in size fractions of varying stability can be predicted at both regional and national scales from plant traits and simple measures of soil and climatic conditions. Soil C stocks in the largest pool, of intermediate particle size (50–250 μm), were best explained by mean annual temperature (MAT), soil pH and soil moisture content. The second largest C pool, highly stable physically and biochemically protected particles (0.45–50 μm), was explained by soil pH and the community abundance-weighted mean (CWM) leaf nitrogen (N) content, with the highest soil C stocks under N-rich vegetation. The C stock in the small active fraction (250–4000 μm) was explained by a wide range of variables: MAT, mean annual precipitation, mean growing season length, soil pH and CWM specific leaf area; stocks were higher under vegetation with thick and/or dense leaves. Testing the models describing these fractions against data from an independent English region indicated moderately strong correlation between predicted and actual values and no systematic bias, with the exception of the active fraction, for which predictions were inaccurate. Synthesis and applications. Validation indicates that readily available climate, soils and plant survey data can be effective in making local- to landscape-scale (1–100 000 km²) soil C stock predictions. Such predictions are a crucial component of effective management strategies to protect C stocks and enhance soil C sequestration.
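The community abundance-weighted mean (CWM) trait values used as predictors follow the standard formula CWM = sum_i(p_i * t_i), where p_i is the relative abundance of species i and t_i its trait value. A minimal sketch with invented plot data:

```python
import numpy as np

def community_weighted_mean(abundances, trait_values):
    """Community abundance-weighted mean (CWM) of a plant trait:
    CWM = sum_i (p_i * t_i), with p_i the relative abundance of species i."""
    a = np.asarray(abundances, dtype=float)
    t = np.asarray(trait_values, dtype=float)
    return float(np.sum((a / a.sum()) * t))

# Hypothetical plot: three species with cover abundances and leaf N (mg/g).
cover = [60, 30, 10]
leaf_n = [18.0, 25.0, 31.0]
cwm_n = community_weighted_mean(cover, leaf_n)
print(f"CWM leaf N = {cwm_n:.1f} mg/g")  # a predictor for the stable C fraction
```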