100 results for preventative measures


Relevance: 20.00%

Abstract:

The unprecedented success of social networking sites (SNSs) has recently been overshadowed by concerns about privacy risks. As SNS users grow weary of privacy breaches and thus develop distrust, they may restrict or even terminate their platform activities. In the long run, these developments endanger SNS platforms’ financial viability and undermine their ability to create individual and social value. By applying a justice perspective, this study aims to understand the means at the disposal of SNS providers to influence the privacy concerns and trusting beliefs of their users—two important determinants of user participation on SNSs. Considering that SNSs have a global appeal, empirical tests assess the effectiveness of justice measures for three culturally distinct countries: Germany, Russia and Morocco. The results indicate that these measures are particularly suited to addressing the trusting beliefs of the SNS audience. Specifically, in all examined countries, procedural justice and the awareness dimension of informational justice improve perceptions of trust in the SNS provider. Privacy concerns, however, are not as easy to manage, because the impact of justice-based measures on privacy concerns is not universal. Beyond its theoretical value, this research offers valuable practical insights into the use of justice-based measures to promote trust and mitigate privacy concerns in a cross-cultural setting.

Relevance: 20.00%

Abstract:

Assessing and managing risks relating to the consumption of foodstuffs by humans and to the environment has been one of the most complex legal issues in WTO law ever since the Agreement on Sanitary and Phytosanitary Measures was adopted at the end of the Uruguay Round and entered into force in 1995. The problem has been addressed in a number of cases, and panels and the Appellate Body adopted different philosophies in interpreting the Agreement and the basic concept of risk assessment as defined in Annex A para. 4 of the Agreement. Risk assessment entails fundamental questions of law and science, and different interpretations reflect different underlying perceptions of science and its relationship to the law. The present thesis, supported by the Swiss National Research Foundation, undertakes an in-depth analysis of these underlying perceptions. The author expounds the essence of, and differences between, positivism and relativism in philosophy and the natural sciences, and clarifies the relationship of fundamental concepts such as risk, hazard and probability. This investigation is a remarkable effort on the part of a lawyer keen to learn more about the fundamentals upon which the law – often unconsciously – is operated by the legal profession and the trade community. Based upon these insights, he turns to a critical assessment of the jurisprudence of both panels and the Appellate Body. Extensively referring to and discussing the literature, he deconstructs findings and decisions in light of the implied and assumed underlying philosophies and perceptions of the relationship between law and science, in particular in the field of food standards. Finding that neither positivism nor relativism provides adequate answers, the author turns to critical rationalism and applies the methodology of falsification developed by Karl R. Popper. Critical rationalism allows discourse in science and law to be combined and helps prepare the ground for a new approach to risk assessment and risk management. Linking the problem to the doctrine of multilevel governance, the author develops a theory allocating risk assessment to international fora while leaving risk management to national and democratically accountable governments. While the author questions throughout the thesis whether risk assessment and risk management can be separated, the thesis offers new avenues which may assist in structuring a complex and difficult problem.

Relevance: 20.00%

Abstract:

Numerous studies have reported a strong link between working memory capacity (WMC) and fluid intelligence (Gf), although views differ with respect to how closely the two constructs are related. In the present study, we used a WMC task with five levels of task demands to assess the relationship between WMC and Gf by means of a new methodological approach referred to as fixed-links modeling. Fixed-links models belong to the family of confirmatory factor analysis (CFA) and are of particular interest for experimental, repeated-measures designs. With this technique, processes systematically varying across task conditions can be disentangled from processes unaffected by the experimental manipulation. Proceeding from the assumption that the experimental manipulation in a WMC task leads to increasing demands on WMC, the processes systematically varying across task conditions can be assumed to be WMC-specific. Processes not varying across task conditions, on the other hand, are probably independent of WMC. Fixed-links models allow these two kinds of processes to be represented by two independent latent variables. In contrast to traditional CFA, where a common latent variable is derived from the different task conditions, fixed-links models facilitate a more precise, or purified, representation of the WMC-related processes of interest. By using fixed-links modeling to analyze data of 200 participants, we identified a non-experimental latent variable, representing processes that remained constant irrespective of the WMC task conditions, and an experimental latent variable reflecting processes that varied as a function of the experimental manipulation. This latter variable represents the increasing demands on WMC and, hence, was considered a purified measure of WMC controlled for the constant processes. Fixed-links modeling showed that both the purified measure of WMC (β = .48) and the constant processes involved in the task (β = .45) were related to Gf. Taken together, these two latent variables explained the same portion of variance in Gf as a single latent variable obtained by traditional CFA (β = .65), indicating that traditional CFA overestimates the effective relationship between WMC and Gf. Thus, fixed-links modeling provides a feasible method for a more valid investigation of the functional relationship between specific constructs.
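To make the fixed-links approach concrete, the sketch below shows how such a model might be specified in Python with the SEM package semopy, which uses lavaan-style model syntax. The column names (wmc_1 … wmc_5 for the five task conditions, gf for a fluid-intelligence score), the data file and the linearly increasing loadings are illustrative assumptions, not the authors' actual specification.

# Illustrative fixed-links specification (lavaan-style syntax as used by semopy).
# F_const: loadings fixed to 1 across all five conditions -> processes that do
#          not vary with task demands.
# F_exp:   loadings fixed to increase across conditions -> processes that grow
#          with WMC demands (the purified WMC factor).
# The two factors are constrained to be uncorrelated, and both predict gf.
import pandas as pd
import semopy

MODEL_DESC = """
F_const =~ 1*wmc_1 + 1*wmc_2 + 1*wmc_3 + 1*wmc_4 + 1*wmc_5
F_exp =~ 1*wmc_1 + 2*wmc_2 + 3*wmc_3 + 4*wmc_4 + 5*wmc_5
F_const ~~ 0*F_exp
gf ~ F_const + F_exp
"""

data = pd.read_csv("wmc_gf_scores.csv")  # hypothetical file with the columns above
model = semopy.Model(MODEL_DESC)
model.fit(data)
print(model.inspect())  # the two regression estimates play the role of the reported betas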

Relevance: 20.00%

Abstract:

Neuropsychologists often face interpretational difficulties when assessing cognitive deficits, particularly in cases of unclear cerebral etiology. How can we be sure whether a single test score below the population average is indicative of a pathological brain condition or falls within the normal range? In the past few years, the topic of intra-individual performance variability has gained considerable interest. On the basis of a large normative sample, two measures of performance variability and their importance for neuropsychological interpretation are presented in this paper: the number of low scores and the level of dispersion. We conclude that low scores are common in healthy individuals. The level of dispersion, on the other hand, is relatively small. Here, base rate information about abnormally low scores and abnormally high dispersion across cognitive abilities is provided to improve awareness of normal variability and to serve clinicians as additional interpretive measures in the diagnostic process.
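As a concrete illustration of the two variability indices, the sketch below computes the number of low scores and the level of dispersion from a set of standardized subtest scores. The cutoff of 1 SD below the mean, the function names and the example profile are illustrative assumptions, not the normative values reported in the paper.

import numpy as np

def low_score_count(z_scores, cutoff=-1.0):
    """Number of subtest scores falling below the cutoff (in z-score units)."""
    z = np.asarray(z_scores, dtype=float)
    return int(np.sum(z < cutoff))

def dispersion(z_scores):
    """Level of dispersion: range between the highest and lowest subtest score."""
    z = np.asarray(z_scores, dtype=float)
    return float(z.max() - z.min())

# Example: a healthy adult's profile across eight hypothetical subtests.
profile = [0.3, -0.8, 1.1, -1.2, 0.0, 0.6, -0.4, 0.9]
print(low_score_count(profile))  # 1 score below -1 SD
print(dispersion(profile))       # 2.3 (difference between best and worst score)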

Relevance: 20.00%

Abstract:

We investigated whether amygdala activation, autonomic responses, respiratory responses, and facial muscle activity (measured over the brow and cheek [fear grin] regions) are all sensitive to phobic versus nonphobic fear and, more importantly, whether effects in these variables vary as a function of both phobic and nonphobic fear intensity. Spider-phobic participants and control participants with comparably low spider fear imagined encountering different animals and rated their subjective fear while their central and peripheral nervous system activity was measured. All measures included in our study were sensitive to variations in subjective fear, but were related to different ranges and positions on the subjective fear continuum. Left amygdala activation, heart rate, and facial muscle activity over the cheek region captured fear intensity variations even within narrowly circumscribed regions of the fear continuum (here, within extremely low levels of fear and within considerable phobic fear). Skin conductance and facial muscle activity over the brow region did not capture fear intensity variations within low levels of fear: skin conductance mirrored only extreme levels of fear, and activity over the brow region distinguished phobic from nonphobic fear as well as low-to-moderate from high phobic fear. Finally, respiratory measures distinguished phobic from nonphobic fear with no further differentiation within phobic and nonphobic fear. We conclude that careful consideration of the measures to be used in an investigation and of the population to be examined can be critical in order to obtain significant results.

Relevance: 20.00%

Abstract:

Soil carbon (C) storage is a key ecosystem service. Soil C stocks play a vital role in soil fertility and climate regulation, but the factors that control these stocks at regional and national scales are unknown, particularly when their composition and stability are considered. As a result, their mapping relies on either unreliable proxy measures or laborious direct measurements. Using data from an extensive national survey of English grasslands, we show that surface soil (0–7 cm) C stocks in size fractions of varying stability can be predicted at both regional and national scales from plant traits and simple measures of soil and climatic conditions. Soil C stocks in the largest pool, of intermediate particle size (50–250 μm), were best explained by mean annual temperature (MAT), soil pH and soil moisture content. The second largest C pool, highly stable physically and biochemically protected particles (0.45–50 μm), was explained by soil pH and the community abundance-weighted mean (CWM) leaf nitrogen (N) content, with the highest soil C stocks under N-rich vegetation. The C stock in the small active fraction (250–4000 μm) was explained by a wide range of variables: MAT, mean annual precipitation, mean growing season length, soil pH and CWM specific leaf area; stocks were higher under vegetation with thick and/or dense leaves. Testing the models describing these fractions against data from an independent English region indicated a moderately strong correlation between predicted and actual values and no systematic bias, with the exception of the active fraction, for which predictions were inaccurate. Synthesis and applications. Validation indicates that readily available climate, soil and plant survey data can be effective in making local- to landscape-scale (1–100 000 km²) soil C stock predictions. Such predictions are a crucial component of effective management strategies to protect C stocks and enhance soil C sequestration.
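As an illustration of how such predictions can be generated from survey data, the sketch below fits a simple linear model for the intermediate-fraction C stock from the three predictors named above and evaluates it on data from an independent region. The file names and column names (mat, soil_ph, soil_moisture, c_stock_intermediate) are hypothetical placeholders, and the linear form is an assumption rather than the authors' actual model.

import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical national survey table with one row per grassland plot.
data = pd.read_csv("grassland_survey.csv")
X = data[["mat", "soil_ph", "soil_moisture"]]
y = data["c_stock_intermediate"]

model = LinearRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_)))  # direction and size of each predictor's effect

# Validate against a hypothetical independent region, mirroring the study design.
holdout = pd.read_csv("independent_region.csv")
pred = model.predict(holdout[["mat", "soil_ph", "soil_moisture"]])
print(r2_score(holdout["c_stock_intermediate"], pred))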

Relevance: 20.00%

Abstract:

Elicitability has recently been discussed as a desirable property for risk measures. Kou and Peng (2014) showed that an elicitable distortion risk measure is either a Value-at-Risk or the mean. We give a concise alternative proof of this result, and discuss the conflict between comonotonic additivity and elicitability.
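For reference, a risk measure (or, more generally, a statistical functional) ρ is called elicitable if there is a strictly consistent scoring function S whose expected value it minimizes; in LaTeX notation (generic, not taken from the paper):

\rho(F) = \arg\min_{x \in \mathbb{R}} \; \mathbb{E}_{Y \sim F}\big[ S(x, Y) \big].

Value-at-Risk at level \alpha (the \alpha-quantile) is elicited by the pinball loss S_\alpha(x, y) = (\mathbf{1}\{x \ge y\} - \alpha)(x - y), and the mean is elicited by the squared error S(x, y) = (x - y)^2, consistent with the result quoted above that these are the only elicitable distortion risk measures.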

Relevance: 20.00%

Abstract:

In the present contribution, we characterise law-determined convex risk measures that have convex level sets at the level of distributions. By relaxing the assumptions in Weber (Math. Finance 16:419–441, 2006), we show that these risk measures can be identified with a class of generalised shortfall risk measures. As a direct consequence, we are able to extend the results in Ziegel (Math. Finance, 2014, http://onlinelibrary.wiley.com/doi/10.1111/mafi.12080/abstract) and Bellini and Bignozzi (Quant. Finance 15:725–733, 2014) on convex elicitable risk measures and confirm that expectiles are the only elicitable coherent risk measures. Further, we provide a simple characterisation of robustness for convex risk measures in terms of a weak notion of mixture continuity.
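As a point of reference for the expectile result (notation generic, not taken from the paper), the τ-expectile of a square-integrable random variable X can be written as the minimizer of an asymmetric squared loss, which is the property underlying its elicitability:

e_\tau(X) = \arg\min_{x \in \mathbb{R}} \; \mathbb{E}\big[ \tau\,(X - x)_+^2 + (1 - \tau)\,(x - X)_+^2 \big], \qquad \tau \in (0, 1),

and for \tau \ge 1/2, with X interpreted as a loss, e_\tau is monotone, translation equivariant, positively homogeneous and subadditive, i.e. a coherent risk measure.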

Relevance: 20.00%

Abstract:

OBJECTIVE Our aim was to assess the diagnostic and predictive value of several quantitative EEG (qEEG) analysis methods in comatose patients. METHODS In 79 patients, coupling between EEG signals on the left-right (inter-hemispheric) axis and on the anterior-posterior (intra-hemispheric) axis was measured with four synchronization measures: relative delta power asymmetry, cross-correlation, symbolic mutual information and transfer entropy directionality. Results were compared with the etiology of coma and clinical outcome. Using cross-validation, the predictive value of measure combinations was assessed with a Bayes classifier with a mixture of Gaussians. RESULTS Five of the eight measures (four synchronization measures applied on each of the two axes) showed a statistically significant difference between patients grouped according to outcome; one measure revealed differences between patients grouped according to etiology. Interestingly, a high level of synchrony between the left and right hemispheres was associated with mortality in the intensive care unit, whereas higher synchrony between anterior and posterior brain regions was associated with survival. The combination with the best predictive value reached an area under the curve of 0.875 (for patients with postanoxic encephalopathy: 0.946). CONCLUSIONS EEG synchronization measures can contribute to clinical assessment and provide new approaches for understanding the pathophysiology of coma. SIGNIFICANCE Prognostication in coma remains a challenging task; qEEG could improve current multi-modal approaches.
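To illustrate one of the synchronization measures named above, the sketch below computes the maximum normalized cross-correlation between two EEG channels, for example a left and a right electrode of an inter-hemispheric pair. It is a generic illustration rather than the study's pipeline; the sampling rate, the simulated signals and the lag window are assumptions made for the example.

import numpy as np

def max_cross_correlation(x, y, max_lag):
    """Maximum normalized cross-correlation of two signals over lags in [-max_lag, max_lag]."""
    x = (np.asarray(x, dtype=float) - np.mean(x)) / np.std(x)
    y = (np.asarray(y, dtype=float) - np.mean(y)) / np.std(y)
    n = len(x)
    corrs = []
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            c = np.dot(x[:n + lag], y[-lag:]) / (n + lag)
        elif lag > 0:
            c = np.dot(x[lag:], y[:n - lag]) / (n - lag)
        else:
            c = np.dot(x, y) / n
        corrs.append(c)
    return float(max(corrs))

# Example with two noisy, partially coupled 10-second signals at 250 Hz.
rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 10, 1 / fs)
left = np.sin(2 * np.pi * 2 * t) + rng.normal(scale=0.5, size=t.size)
right = np.sin(2 * np.pi * 2 * t + 0.3) + rng.normal(scale=0.5, size=t.size)
print(max_cross_correlation(left, right, max_lag=fs // 2))  # higher values indicate stronger coupling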