736 results for Violation
Abstract:
The protection of fundamental human values (life, bodily integrity, human dignity, privacy) becomes imperative with the rapid progress of modern biotechnology, which can result in major alterations in the genetic make-up of organisms. It has become possible to insert human genes into pigs so that their internal organs, coated in human proteins, are more suitable for transplantation into humans (xenotransplantation), and micro-organisms that can make insulin have been created, thus changing the genetic make-up of humans. At the end of the 1980s, the Central and Eastern European (CEE) countries either initiated new legislation or started to amend existing laws in this area (clinical testing of drugs, experiments on humans, prenatal genetic diagnosis, legal protection of the embryo/foetus, etc.). The analysis here indicates that the CEE countries have not sufficiently adjusted their regulations to the findings of modern biotechnology, either because of the relatively short period they have had to do so, or because there are no definite answers to the questions which modern biotechnology has raised (the ethical aspects of xenotransplantation, or of the use of live-aborted embryonic or foetal tissue in neuro-transplantation, etc.). In order to harmonise the existing regulations in CEE countries with the EU and supranational contexts, two critical issues should be taken into consideration. The first is the necessity for CEE countries to recognise the place of humans within the achievements of modern biotechnology (a broader affirmation of the principle of autonomy, an explicit ban on the violation of the genetic identity of either born or unborn life, etc.). The second concerns the definition of the status of different biotechnological procedures and their permissibility (gene therapy, therapeutic genomes, xenotransplantation, etc.). The road towards such answers may be more easily identified once all CEE countries become members of the Council of Europe and express their wish to join the EU, which in turn presupposes taking over the entire body of EU legislation.
Abstract:
In this article I review some aspects of flavour phenomenology in the MSSM. After an overview of various flavour observables I discuss the constraints on the off-diagonal elements of the squark mass matrices. In this context I present the Fortran code SUSY_FLAVOR, which calculates these processes in the generic MSSM, including the complete resummation of all chirally enhanced effects as a new feature of version 2. Then I discuss where large new-physics effects in the MSSM are still possible. As an example of a model which can give large effects in flavour physics I review a model with "radiative flavour violation" (RFV) and update the results in the light of the recent LHCb measurement of Bs -> \mu\mu. Finally, I recall that the MSSM can generate a sizable right-handed W coupling which affects B -> \tau\nu and can solve the Vub problem.
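The constraints on the off-diagonal squark mass-matrix elements mentioned above are conventionally quoted in terms of dimensionless mass-insertion parameters. The abstract does not spell out the normalization, so the following display shows only the commonly used convention, stated here as an assumption:

\[
(\delta^q_{AB})_{ij} \;=\; \frac{\big(M^2_{\tilde q}\big)_{i_A j_B}}{\sqrt{\big(M^2_{\tilde q}\big)_{i_A i_A}\,\big(M^2_{\tilde q}\big)_{j_B j_B}}}, \qquad A,B \in \{L,R\},\quad i \neq j .
\]

Bounds on these parameters are then obtained by requiring that the predicted flavour observables remain compatible with their experimental and theoretical uncertainties.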
Abstract:
We calculate the set of O(\alpha_s) corrections to the double differential decay width d\Gamma_{77}/(ds_1 \, ds_2) for the process \bar{B} \to X_s \gamma \gamma originating from diagrams involving the electromagnetic dipole operator O_7. The kinematical variables s_1 and s_2 are defined as s_i=(p_b - q_i)^2/m_b^2, where p_b, q_1, q_2 are the momenta of the b-quark and the two photons. While the (renormalized) virtual corrections are worked out exactly for a certain range of s_1 and s_2, we retain in the gluon bremsstrahlung process only the leading power w.r.t. the (normalized) hadronic mass s_3=(p_b-q_1-q_2)^2/m_b^2 in the underlying triple differential decay width d\Gamma_{77}/(ds_1 ds_2 ds_3). The double differential decay width based on this approximation is free of infrared and collinear singularities when the virtual and bremsstrahlung corrections are combined. The corresponding results are obtained analytically. When retaining all powers in s_3, the sum of virtual and bremsstrahlung corrections contains uncanceled 1/\epsilon singularities (which are due to collinear photon emission from the s-quark); removing them requires concepts that go beyond perturbation theory, such as parton fragmentation functions of a quark or a gluon into a photon, which is beyond the scope of our paper.
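For readability, the kinematic definitions quoted above can be written in display form, together with the obvious relation between the double and triple differential widths (an integration over s_3, stated here as a sketch of the setup rather than the paper's exact phase-space boundaries):

\[
s_i = \frac{(p_b - q_i)^2}{m_b^2}\ (i=1,2), \qquad
s_3 = \frac{(p_b - q_1 - q_2)^2}{m_b^2}, \qquad
\frac{d\Gamma_{77}}{ds_1\,ds_2} = \int ds_3\;\frac{d\Gamma_{77}}{ds_1\,ds_2\,ds_3}.
\]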
Abstract:
Recently the issue of radiative corrections to leptogenesis has been raised. Considering the "strong washout" regime, in which OPE techniques permit a streamlined setup, we report the thermal self-energy matrix of heavy right-handed neutrinos at NLO (resummed 2-loop level) in Standard Model couplings. The renormalized expression describes flavour transitions and "inclusive" decays of chemically decoupled right-handed neutrinos. Although CP-violation is not addressed, the result may find use in existing leptogenesis frameworks.
Abstract:
BACKGROUND: In contrast to hypnosis, there is no surrogate parameter for analgesia in anesthetized patients. Opioids are titrated to suppress the blood pressure response to noxious stimulation. The authors evaluated a novel model predictive controller for closed-loop administration of alfentanil using mean arterial blood pressure and the predicted plasma alfentanil concentration (Cp Alf) as input parameters. METHODS: The authors studied 13 healthy patients scheduled to undergo minor lumbar and cervical spine surgery. After induction with propofol, alfentanil, and mivacurium and tracheal intubation, isoflurane was titrated to maintain the Bispectral Index at 55 (+/- 5), and the alfentanil administration was switched from manual to closed-loop control. The controller adjusted the alfentanil infusion rate to maintain the mean arterial blood pressure near the set-point (70 mmHg) while minimizing the Cp Alf toward the set-point plasma alfentanil concentration (Cp Alfref) (100 ng/ml). RESULTS: Two patients were excluded because of loss of the arterial pressure signal and a protocol violation. The alfentanil infusion was closed-loop controlled for a mean (SD) of 98.9 (1.5)% of the presurgery time and 95.5 (4.3)% of the surgery time. The mean (SD) end-tidal isoflurane concentrations were 0.78 (0.1) and 0.86 (0.1) vol%, the Cp Alf values were 122 (35) and 181 (58) ng/ml, and the Bispectral Index values were 51 (9) and 52 (4) before surgery and during surgery, respectively. The mean (SD) absolute deviations of mean arterial blood pressure were 7.6 (2.6) and 10.0 (4.2) mmHg (P = 0.262), and the median performance error, median absolute performance error, and wobble were 4.2 (6.2) and 8.8 (9.4)% (P = 0.002), 7.9 (3.8) and 11.8 (6.3)% (P = 0.129), and 14.5 (8.4) and 5.7 (1.2)% (P = 0.002) before surgery and during surgery, respectively. A post hoc simulation showed that including the Cp Alfref term decreased the predicted Cp Alf compared with control based on mean arterial blood pressure alone. CONCLUSION: The authors' controller has a set-point precision similar to that of previous hypnotic controllers and provides adequate alfentanil dosing during surgery. It may help to standardize opioid dosing in research and may be a further step toward a multiple-input multiple-output controller.
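A minimal sketch of the control objective described above: choose an alfentanil infusion rate that keeps predicted mean arterial pressure (MAP) near 70 mmHg while pulling the predicted plasma concentration toward 100 ng/ml. The response models, weights, and candidate rates below are illustrative placeholders, not the authors' pharmacokinetic/pharmacodynamic model or implementation.

```python
import numpy as np

MAP_SET = 70.0     # mmHg, MAP set-point from the study
CP_REF = 100.0     # ng/ml, set-point plasma alfentanil concentration

def predict_cp(current_cp, rate, dt=1.0, ke=0.05, gain=2.0):
    # Placeholder one-compartment update: infusion input minus first-order elimination.
    return current_cp + dt * (gain * rate - ke * current_cp)

def predict_map(current_map, cp_pred):
    # Placeholder pharmacodynamic link: a higher predicted concentration lowers MAP.
    return current_map - 0.05 * (cp_pred - CP_REF)

def choose_rate(current_map, current_cp, candidate_rates, w_map=1.0, w_cp=0.01):
    # Evaluate a quadratic cost for each candidate infusion rate and return the cheapest one.
    best_rate, best_cost = None, float("inf")
    for rate in candidate_rates:
        cp_pred = predict_cp(current_cp, rate)
        map_pred = predict_map(current_map, cp_pred)
        cost = w_map * (map_pred - MAP_SET) ** 2 + w_cp * (cp_pred - CP_REF) ** 2
        if cost < best_cost:
            best_rate, best_cost = rate, cost
    return best_rate

rates = np.linspace(0.0, 50.0, 51)   # candidate infusion rates (arbitrary units)
print(choose_rate(current_map=85.0, current_cp=120.0, candidate_rates=rates))
```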
Abstract:
Suppose that, having established a marginal total effect of a point exposure on a time-to-event outcome, an investigator wishes to decompose this effect into its direct and indirect pathways, also known as natural direct and indirect effects, mediated by a variable known to occur after the exposure and prior to the outcome. This paper proposes a theory of estimation of natural direct and indirect effects in two important semiparametric models for a failure time outcome. The underlying survival model for the marginal total effect, and thus for the direct and indirect effects, can be either a marginal structural Cox proportional hazards model or a marginal structural additive hazards model. The proposed theory delivers new estimators for mediation analysis in each of these models, with appealing robustness properties. Specifically, in order to guarantee ignorability with respect to the exposure and mediator variables, the approach, which is multiply robust, allows the investigator to use several flexible working models to adjust for confounding by a large number of pre-exposure variables. Multiple robustness is appealing because it only requires a subset of the working models to be correct for consistency; furthermore, the analyst need not know which subset of working models is in fact correct to report valid inferences. Finally, a novel semiparametric sensitivity analysis technique is developed for each of these models to assess the impact on inference of a violation of the assumption of ignorability of the mediator.
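As a sketch of the effect decomposition being estimated (in counterfactual notation; the paper's exact parameterization may differ), write T(a, M(a*)) for the failure time under exposure level a with the mediator set to its value under exposure level a*. On the log hazard-ratio scale of a marginal structural Cox model for a binary exposure, the total effect then splits as

\[
\underbrace{\log\frac{\lambda_{T(1,M(1))}(t)}{\lambda_{T(0,M(0))}(t)}}_{\text{total effect}}
=
\underbrace{\log\frac{\lambda_{T(1,M(0))}(t)}{\lambda_{T(0,M(0))}(t)}}_{\text{natural direct effect}}
+
\underbrace{\log\frac{\lambda_{T(1,M(1))}(t)}{\lambda_{T(1,M(0))}(t)}}_{\text{natural indirect effect}} .
\]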
Abstract:
Latent class regression models are useful tools for assessing associations between covariates and latent variables. However, evaluation of key model assumptions cannot be performed using methods from standard regression models due to the unobserved nature of latent outcome variables. This paper presents graphical diagnostic tools to evaluate whether or not latent class regression models adhere to standard assumptions of the model: conditional independence and non-differential measurement. An integral part of these methods is the use of a Markov Chain Monte Carlo estimation procedure. Unlike standard maximum likelihood implementations for latent class regression model estimation, the MCMC approach allows us to calculate posterior distributions and point estimates of any functions of parameters. It is this convenience that allows us to provide the diagnostic methods that we introduce. As a motivating example we present an analysis focusing on the association between depression and socioeconomic status, using data from the Epidemiologic Catchment Area study. We consider a latent class regression analysis investigating the association between depression and socioeconomic status measures, where the latent variable depression is regressed on education and income indicators, in addition to age, gender, and marital status variables. While the fitted latent class regression model yields interesting results, the model parameters are found to be invalid due to the violation of model assumptions. The violation of these assumptions is clearly identified by the presented diagnostic plots. These methods can be applied to standard latent class and latent class regression models, and the general principle can be extended to evaluate model assumptions in other types of models.
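The practical point about MCMC output can be illustrated with a short sketch: once posterior draws of the regression parameters are available, the posterior of any function of those parameters is obtained by applying that function to each draw. The draws, covariate coding, and intercept below are simulated placeholders, not output from the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_draws = 4000

# Placeholder posterior draws of two logistic regression coefficients
# (e.g., effects of education and income on latent depression class membership).
beta_edu = rng.normal(loc=-0.40, scale=0.10, size=n_draws)
beta_inc = rng.normal(loc=-0.25, scale=0.12, size=n_draws)

def class_probability(b_edu, b_inc, edu, inc, intercept=-1.0):
    # Posterior draws of P(depressed class) for a given covariate profile.
    eta = intercept + b_edu * edu + b_inc * inc
    return 1.0 / (1.0 + np.exp(-eta))

# Function of the parameters: risk difference between low- and high-SES profiles.
diff = class_probability(beta_edu, beta_inc, edu=0, inc=0) - \
       class_probability(beta_edu, beta_inc, edu=1, inc=1)

print("posterior mean:", diff.mean())
print("95% credible interval:", np.percentile(diff, [2.5, 97.5]))
```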
Abstract:
The future of Brazilian children who lack the protection offered by familial bonds is threatened by social inequities that force them to seek refuge and grow up in shelters. According to the Institute of Applied Economic Research, an estimated 20,000 children and adolescents are served by institutions. The majority of these children are Afro-descendant males between the ages of seven and fifteen. Of those researched, 87.6% have families (58.2% receive visits from their families, 22.7% are rarely visited by their families, and 5.8% are legally prohibited from contacting or being contacted by their families). The percentage of children and adolescents "without families" or with "missing families" is 11.3%. There is no information available for 2% of the children and adolescents residing in shelters. The principal factors that lead to the placement of Brazilian children in institutions that provide care and shelter include poverty (including children forced to work, sell drugs, or beg, for example); domestic violence; chemical dependence of parents or guardians; homelessness; death of parents or guardians; imprisonment of parents; and sexual abuse committed by parents or guardians. The issue of abandoned children and adolescents and their care and shelter in the Brazilian context expresses a perverse violation of Child and Adolescent Rights.
Abstract:
This judgment by the Presidium of the Supreme Arbitration Court of the Russian Federation can be considered a landmark ruling on Internet Service Provider (ISP) liability. For the first time, the Court sets out concise principles determining the circumstances under which an ISP shall be exempt from liability for transmitting copyright-infringing content. However, under the Russian legislation on ISP liability, which liability rules apply to an ISP depends on the type of information concerned. Where a violation of intellectual property rights is claimed, the principles now given by the Supreme Arbitration Court are applicable, and these basically follow the liability limitations of the so-called EU E-Commerce Directive. Furthermore, preventive measures provided for in service-provider contracts to suppress violations committed through the use of the services should be taken into account as well. Where other information is concerned, on the other hand, the limitations of the respective Information Law, which stipulates different liability requirements, might be applicable. This article gives a translation of the Supreme Arbitration Court's decision as well as a comment on its key rulings with respect to the legal framework and on possible consequences for practice.
Abstract:
Big Brother Watch and others have filed a complaint against the United Kingdom under the European Convention on Human Rights about a violation of Article 8, the right to privacy. It regards the NSA affair and UK-based surveillance activities operated by secret services. The question is whether it will be declared admissible and, if so, whether the European Court of Human Rights will find a violation. This article discusses three possible challenges for these types of complaints and analyses whether the current privacy paradigm is still adequate in view of the development known as Big Data.
Abstract:
On October 10, 2013, a Chamber of the European Court of Human Rights (ECtHR) handed down a judgment (Delfi v. Estonia) upholding an Estonian law which, as interpreted, held a news portal liable for the defamatory comments of its users. Amongst the considerations that led the Court to find no violation of freedom of expression in this particular case were, above all, the inadequacy of the automatic screening system adopted by the website and the users' option to post their comments anonymously (i.e. without prior registration via email), which in the Court's view rendered ineffective the protection conferred on the injured party through direct legal action against the authors of the comments. Drawing on the implications of this (not yet final) ruling, this paper discusses a few questions that the tension between the risk of wrongful use of information and the right to anonymity generates for the development of Internet communication, and examines the role that intermediary liability legislation can play in managing this tension.
Abstract:
Despite the considerable amount of self-disclosure in Online Social Networks (OSN), the motivation behind this phenomenon is still little understood. Building on the Privacy Calculus theory, this study fills this gap by taking a closer look at the factors behind individual self-disclosure decisions. In a Structural Equation Model with 237 subjects we find Perceived Enjoyment and Privacy Concerns to be significant determinants of information revelation. We confirm that the privacy concerns of OSN users are primarily determined by the perceived likelihood of a privacy violation and much less by the expected damage. These insights provide a solid basis for OSN providers and policy-makers in their effort to ensure healthy disclosure levels that are based on objective rationale rather than subjective misconceptions.
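Schematically, the structural relations reported above can be written as follows; the paths and signs follow the abstract, but the measurement part and the exact specification of the authors' model are not shown and this display is only an illustrative assumption:

\[
\text{Disclosure} = \gamma_1\,\text{Enjoyment} + \gamma_2\,\text{PrivacyConcerns} + \zeta_1,
\qquad
\text{PrivacyConcerns} = \gamma_3\,\text{PerceivedLikelihood} + \gamma_4\,\text{ExpectedDamage} + \zeta_2 .
\]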
Abstract:
The following commentary serves as a response to the article, “Sex Trafficking of Minors in the U.S.: Implications for Policy, Prevention and Research,” drawing the important, though not often mentioned, connection between the sex trafficking of minors and human rights. The commentary argues that child trafficking has been inadequately addressed due to its relative invisibility, a lack of knowledge about human rights, and a lack of discourse about the human rights issues involved in sexual trafficking. As such, in the current day, the recognition that women and girls who are sexually exploited by traffickers are victims has seemingly been forgotten. The commentator commends the authors of “Sex Trafficking of Minors in the U.S.: Implications for Policy, Prevention and Research” for their work to better understand the characteristics of minor sex trafficking victims, which will play an important role in fighting deadly misperceptions about the victims, educating others about this lethal human rights violation, and finding ways to care for those victims who are rescued.
Abstract:
INTRODUCTION: Task stressors typically refer to characteristics such as not having enough time or resources, ambiguous demands, or the like. We suggest the perceived lack of legitimacy as an additional feature of tasks that can be a source of stress. Tasks are "illegitimate" to the extent that it is perceived as improper to expect employees to execute them: not because of difficulties in executing them, but because of their content for a given person, time, and situation. They are illegitimate because a) they do not conform to a specific occupational role, as in "non-nursing activities" (called unreasonable tasks), or b) there is no legitimate need for them to exist (called unnecessary tasks; Semmer et al., 2007). These features make illegitimate tasks a unique task-related stressor. The concept of illegitimate tasks grew from the "Stress-as-Offense-to-Self" theory (SOS; Semmer et al., 2007); it is conceptually related to role stress (Kahn et al., 1964; Beehr & Glazer, 2005) and the organizational justice tradition (Cropanzano et al., 2001; Greenberg, 2010). SOS argues that a threat to one's self-image is at the core of many stressful experiences. Because they violate role expectations, illegitimate tasks can be regarded as a special case of role conflict. As roles shape identities, this violation is postulated to constitute a threat to one's professional identity. Being assigned a task considered illegitimate is likely to be considered unfair. Lack of fairness, in turn, contains a message about one's social standing, and thus about the self. However, these aspects have not received much attention in either the role stress or the justice/fairness tradition. OBJECTIVE: Illegitimate tasks are a rather recent concept; to establish it as a construct in its own right, one must show that it is associated with well-being/strain while controlling for other stressors, most notably role conflict and lack of justice. The aim of the presentation is to present the evidence accumulated so far. METHODS AND RESULTS: We present several studies employing different designs, using different control variables, and testing associations with different criteria. Study 1 demonstrates associations of illegitimate tasks with self-esteem, feelings of resentment against one's organization, and burnout, controlling for distributive justice, role conflict, and social stressors (i.e. tensions). Study 2 yielded comparable results, using the same outcome variables but controlling for distributive as well as procedural/interactional justice. Study 3 demonstrated associations between illegitimate tasks and feelings of stress, sleeping problems, and emotional exhaustion, controlling for demands, control, and social support among medical doctors. Study 4 showed that feeling appreciated by one's superior mediated the association between illegitimate tasks and both job satisfaction and resentment towards the military among Swiss military officers. Study 5 demonstrated an association of illegitimate tasks with counterproductive work behavior (Semmer et al., 2010). Studies 1 to 5 were cross-sectional. In Study 6, illegitimate demands predicted irritability and resentment towards one's organization longitudinally. Study 7 was also longitudinal, focusing on intra-individual variation in multilevel modeling; occasion-specific illegitimate tasks predicted cortisol among those who judged their health as comparatively poor. Studies 1-3 and 6 used SEM, and measurement models that used unreasonable and unnecessary tasks as indicators (isolated parceling) yielded a good fit.
IMPLICATIONS & CONCLUSIONS: These studies demonstrate that illegitimate tasks are a stressor in their own right that is worth studying. The concept illuminates the social meaning of job design, emphasizing the implications of tasks for the (professional) self and thus combining aspects that are traditionally treated as separate, namely social aspects and task characteristics. A practical implication is that supervisors and managers should be alert to the social messages that may be contained in task assignments (cf. Semmer & Beehr, in press).
Abstract:
The use of group-randomized trials is particularly widespread in the evaluation of health care, educational, and screening strategies. Group-randomized trials represent a subset of a larger class of designs often labeled nested, hierarchical, or multilevel and are characterized by the randomization of intact social units or groups, rather than individuals. The application of random effects models to group-randomized trials requires the specification of fixed and random components of the model. The underlying assumption is usually that these random components are normally distributed. This research is intended to determine whether the Type I error rate and power are affected when the assumption of normality for the random component representing the group effect is violated. In this study, simulated data are used to examine the Type I error rate, power, bias, and mean squared error of the estimates of the fixed effect and the observed intraclass correlation coefficient (ICC) when the random component representing the group effect possesses distributions with non-normal characteristics, such as heavy tails or severe skewness. The simulated data are generated with various characteristics (e.g. number of schools per condition, number of students per school, and several within-school ICCs) observed in most small, school-based, group-randomized trials. The analysis is carried out using SAS PROC MIXED, Version 6.12, with random effects specified in a RANDOM statement and restricted maximum likelihood (REML) estimation. The results from the non-normally distributed data are compared to the results obtained from the analysis of data with similar design characteristics but normally distributed random effects. The results suggest that violation of the normality assumption for the group component by a skewed or heavy-tailed distribution does not appear to influence the estimation of the fixed effect, the Type I error rate, or power. Negative biases were detected when estimating the sample ICC and increased dramatically in magnitude as the true ICC increased. These biases were not as pronounced when the true ICC was within the range observed in most group-randomized trials (i.e. 0.00 to 0.05). The normally distributed group effect also resulted in biased ICC estimates when the true ICC was greater than 0.05; however, this may be a result of higher correlation within the data.
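A compact sketch of the kind of simulation described above (not the original SAS PROC MIXED analysis): generate one small group-randomized trial with a skewed school-level random effect, fit a random-intercept model by REML, and recover the fixed effect and the ICC. The school counts, cluster size, treatment effect, and variance components below are assumptions chosen for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_schools_per_arm, n_students, icc_true = 10, 50, 0.05

sigma_b2 = icc_true / (1 - icc_true)   # school-level variance when residual variance is 1
rows = []
for arm in (0, 1):
    for s in range(n_schools_per_arm):
        # Skewed school effect: a centered, rescaled chi-square(1) draw instead of a normal one.
        b = (rng.chisquare(1) - 1) / np.sqrt(2) * np.sqrt(sigma_b2)
        y = 0.2 * arm + b + rng.normal(size=n_students)   # assumed treatment effect of 0.2
        rows.append(pd.DataFrame({"y": y, "trt": arm, "school": f"{arm}_{s}"}))

data = pd.concat(rows, ignore_index=True)
fit = smf.mixedlm("y ~ trt", data, groups=data["school"]).fit(reml=True)

var_school = fit.cov_re.iloc[0, 0]                 # estimated between-school variance
icc_hat = var_school / (var_school + fit.scale)    # estimated ICC
print("treatment effect estimate:", fit.params["trt"])
print("estimated ICC:", icc_hat)
```

Repeating such a run many times per scenario, and comparing rejection rates and ICC bias against runs with normally distributed school effects, reproduces the logic of the study design.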