488 results for disagreement
Abstract:
Incorporation of enediynes into anticancer drugs remains an intriguing yet elusive strategy for the design of therapeutically active agents. Density functional theory was used to locate reactants, products, and transition states along the Bergman cyclization pathways connecting enediynes to reactive para-biradicals. A sum-method correction to low-level calculations, herein described as MI:Sum, confirmed B3LYP/6-31G(d,p) as the method of choice for investigating enediynes; calculated reaction enthalpies differed from experiment by an average of 2.1 kcal·mol−1 (mean unsigned error). A combination of the strain energy released across the reaction coordinate and the critical intramolecular distance between the reacting diynes explains the reactivity differences. Where experimental and calculated barrier heights disagree, higher-level multireference treatment of the enediynes confirms the lower-level estimates. Previous work concerning the chemically reactive fragment of esperamicin, MTC, is expanded to our model system MTC2.
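As an illustration of the error metric quoted above, the short Python sketch below computes a mean unsigned error between calculated and experimental reaction enthalpies; the enthalpy values are placeholders, not data from the study.

```python
# Illustrative sketch only: mean unsigned error (MUE) between calculated and
# experimental reaction enthalpies. The values below are hypothetical
# placeholders, not results from the MI:Sum study.

calculated_dH = [28.9, 30.4, 25.1, 27.8]    # kcal/mol, hypothetical calculated enthalpies
experimental_dH = [28.2, 32.0, 24.0, 29.5]  # kcal/mol, hypothetical measured enthalpies

errors = [abs(c - e) for c, e in zip(calculated_dH, experimental_dH)]
mue = sum(errors) / len(errors)
print(f"mean unsigned error = {mue:.1f} kcal/mol")
```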
Abstract:
Shared Decision Making (SDM) is widely accepted as the preferred method for reaching treatment decisions in the oncology setting, including decisions about clinical trial participation; however, there is some disagreement among researchers over the components of SDM. Specific standardized coding systems are needed to help overcome this difficulty.
Abstract:
This study evaluated the correlation between three strip-type colorimetric tests and two laboratory methods with respect to the analysis of salivary buffering. The strip-type tests were the Saliva-Check Buffer, Dentobuff Strip and CRT® Buffer tests. The laboratory methods comprised Ericsson's laboratory method and a monotone acid/base titration used to create a reference scale for salivary titratable acidity. Additionally, defined buffer solutions were prepared and tested to simulate the carbonate, phosphate and protein buffer systems of saliva. The correlation between the methods was analysed with Spearman's rank test. Disagreement was detected between the buffering capacity values obtained with the three strip-type tests, and it was more pronounced for saliva samples with medium and low buffering capacities. All strip-type tests were able to assign the hydrogen carbonate, dihydrogen phosphate and 0.1% protein buffer solutions to the correct buffer categories. However, at a total protein concentration of 0.6%, none of the test systems worked accurately. Improvements to the strip-type tests are necessary because of their partial disagreement with Ericsson's laboratory method and their dependence on the protein content of saliva.
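For readers unfamiliar with the statistic mentioned above, the sketch below shows how a Spearman rank correlation between a strip-type test and a laboratory reference could be computed; the paired values are hypothetical and only illustrate the procedure, not the study's data.

```python
# Illustrative sketch only: Spearman rank correlation between strip-type
# buffer categories and a laboratory reference measurement.
# All paired values are hypothetical.
from scipy.stats import spearmanr

strip_test_scores = [3, 2, 3, 1, 2, 3, 1, 2]                   # hypothetical ordinal strip readings
laboratory_values = [6.4, 4.9, 6.1, 3.2, 5.0, 6.8, 2.9, 4.5]   # hypothetical titration-based values

rho, p_value = spearmanr(strip_test_scores, laboratory_values)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```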
Abstract:
Published opinions regarding outcomes and complications in older patients span a broad spectrum, and there is disagreement as to whether surgery in older patients entails a higher risk. This study therefore examines the risk of surgery for lumbar spinal stenosis relative to age in the pooled data set of the Spine Tango registry.
Abstract:
Conflict has marked civilization from Biblical times to the present day. Each of us, with our different and competing interests and our desires to pursue those interests, has over time wronged another person. Not surprisingly, then, forgiveness is a concern of individuals and groups (communities, countries, religious groups, races), yet it is a complex idea that philosophers, theologians, political scientists, and psychologists have grappled with. Some have argued that forgiveness is a therapeutic means for overcoming guilt, pain, and anger. Forgiveness is often portrayed as a coping mechanism: how often we hear the phrase "forgive and forget" as an arrangement to help two parties surmount the complications of disagreement. But forgiveness is not simply a modus vivendi; the ability to forgive, and conversely to ask for forgiveness, is counted as an admirable trait and virtue. This essay will explore the nature of forgiveness, which in Christian dogma is often posited as an unqualified virtue. The secular world has appropriated the Christian notion of forgiveness as such a virtue, but are there instances in which offering forgiveness is morally inappropriate or dangerous? I will consider the situations in which forgiveness, understood in this essay as the overcoming of resentment, may not be a virtue, when perhaps maintaining resentment is as virtuous as, if not more virtuous than, forgiving. I will explain the various ethical frameworks involved in understanding forgiveness as a virtue, and the relationship between them. I will argue that within Divine Command Theory forgiveness is a virtue, and thus morally right, because God commands it. This ethical system has established forgiveness as unconditional, an idea which has been adopted into popular culture. With virtue ethics in mind, which holds virtues to be those traits that benefit the person who possesses them, contributing to the good life, I will argue that unqualified forgiveness is not always a virtue, as it will not always benefit the victim. Because there is no way to avoid wrongdoing, humans are confronted with the question of forgiveness with every indiscretion. Its limits, its possibilities, its relationship to one's character: forgiveness is a concern of all people at some time, if for no other reason than the plain fact that the past cannot be undone. I will evaluate the idea of forgiveness as a virtue, in contrast to its counterpart, resentment. How can forgiveness be a response to evil, a way to renounce resentment, and a means of creating a positive self-narrative? And what happens when a sense of moral responsibility is impossible to reconcile with the Christian (and now secularized) imperative of forgiveness? Is it ever not virtuous to forgive? In an attempt to answer that question, I will argue that there are indeed times when forgiveness is not a virtue, specifically: when forgiveness compromises one's own self-respect; when it is not compatible with respect for the moral community; and when the offender is unapologetic. The kind of offense I have in mind is a dehumanizing one, one that intends to diminish another person's worth or humanity. These are moral injuries, to which I will argue resentment is a better response than forgiveness when the three qualifications cannot be met.
Abstract:
Model-based calibration of steady-state engine operation is commonly performed with highly parameterized empirical models that are accurate but not very robust, particularly when predicting highly nonlinear responses such as diesel smoke emissions. To address this problem, and to boost the accuracy of more robust non-parametric methods to the same level, GT-Power was used to transform the empirical model input space into multiple input spaces that simplified the input-output relationship and improved the accuracy and robustness of smoke predictions made by three commonly used empirical modeling methods: Multivariate Regression, Neural Networks and the k-Nearest Neighbor method. The availability of multiple input spaces allowed the development of two committee techniques: a 'Simple Committee' technique, which averaged predictions from a set of 10 input spaces pre-selected using the training data, and a 'Minimum Variance Committee' technique, in which the input spaces for each prediction were chosen on the basis of disagreement between the three modeling methods. This latter technique equalized the performance of the three modeling methods. The successively increasing improvements resulting from the use of a single best transformed input space (the 'Best Combination' technique), the Simple Committee technique and the Minimum Variance Committee technique were verified with hypothesis testing. The transformed input spaces were also shown to improve outlier detection and to improve k-Nearest Neighbor performance when predicting dynamic emissions with steady-state training data. An unexpected finding was that the benefits of input space transformation were unaffected by changes in the hardware or in the calibration of the underlying GT-Power model.
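The two committee ideas can be sketched in a few lines of Python. The snippet below is not the authors' implementation; the prediction values, the number of input spaces, and the choice to average across both spaces and methods in the 'Simple Committee' are assumptions for illustration only.

```python
# Illustrative sketch of the two committee techniques described above.
# predict[s][m] holds the (hypothetical) smoke prediction of modeling method m
# in transformed input space s for a single operating point.
import statistics

predict = [
    [0.82, 0.79, 0.85],
    [0.75, 0.74, 0.77],
    [0.90, 0.70, 0.95],
    [0.78, 0.80, 0.79],
    [0.88, 0.86, 0.83],
]

# 'Simple Committee': average the methods over a pre-selected set of input
# spaces (indices chosen here arbitrarily; the paper selects them from training data).
preselected = [0, 1, 3]
simple_committee = statistics.mean(p for s in preselected for p in predict[s])

# 'Minimum Variance Committee': for this point, pick the input space where the
# three methods disagree least (smallest variance), then average them there.
best_space = min(range(len(predict)), key=lambda s: statistics.variance(predict[s]))
min_variance_committee = statistics.mean(predict[best_space])

print(f"simple committee: {simple_committee:.3f}")
print(f"minimum variance committee (space {best_space}): {min_variance_committee:.3f}")
```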
Abstract:
BACKGROUND: There is a lack of studies about how to proceed surgically in rare strabismus disorders. The aim of this study was to interview experienced German-speaking strabismologists about how they perform surgery in rare, but also in some frequent, strabismic conditions. The focus was on the choice of technique, the timing, and the dosage. METHOD: A validated questionnaire was sent to 11 experienced strabismus surgeons. It contained questions about the following topics: congenital fibrosis syndrome, Jaentsch-Brown syndrome, intermittent exotropia, maximum dosage for rectus muscle surgery, Kestenbaum surgery, sixth nerve palsy, heterophorias, myokymia of the superior oblique muscle, thyroid (endocrine) orbitopathy, dissociated vertical deviation, adjustable sutures, advancement of previously recessed rectus muscles, retroequatorial myopexy, and congenital esotropia. RESULTS: Ten experts answered the questionnaire (91%). There was broad consensus on many topics. However, for many procedures there was disagreement about the dosage and the timing. Since some questions addressed rare diseases and many strabismologists use only certain types of surgical procedures, some questions could only be answered by a few surgeons. CONCLUSIONS: German-speaking strabismologists show broad consensus about the type of surgical procedure to use, but often disagree about the dosage and timing of the operation.
Abstract:
OBJECTIVE: To study the inter-observer variation related to extraction of continuous and numerical rating scale data from trial reports for use in meta-analyses. DESIGN: Observer agreement study. DATA SOURCES: A random sample of 10 Cochrane reviews that presented a result as a standardised mean difference (SMD), the protocols for the reviews, and the trial reports (n=45) were retrieved. DATA EXTRACTION: Five experienced methodologists and five PhD students independently extracted data from the trial reports for calculation of the first SMD result in each review. The observers did not have access to the reviews, only to the protocols, in which the relevant outcome was highlighted. Agreement was analysed at both the trial and the meta-analysis level, pairing the observers in all possible ways (45 pairs, yielding 2025 pairs of trials and 450 pairs of meta-analyses). Agreement was defined as SMDs that differed by less than 0.1 in their point estimates or confidence intervals. RESULTS: Agreement was 53% at the trial level and 31% at the meta-analysis level. Across all pairs, the median disagreement was SMD=0.22 (interquartile range 0.07-0.61). The experts agreed somewhat more often than the PhD students at the trial level (61% v 46%), but not at the meta-analysis level. Important reasons for disagreement were differences in the selection of time points, scales, control groups, and types of calculation; whether to include a trial in the meta-analysis; and data extraction errors made by the observers. In 14 of the 100 SMDs calculated at the meta-analysis level, individual observers reached conclusions different from those of the originally published review. CONCLUSIONS: Disagreements were common and often larger than the effect of commonly used treatments. Meta-analyses using SMDs are prone to observer variation and should be interpreted with caution. The reliability of meta-analyses might be improved by having more detailed review protocols, more than one observer, and statistical expertise.
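To make the agreement criterion concrete, the sketch below computes a standardized mean difference from summary statistics and applies the 0.1 threshold to two observers' point estimates; the pooled-SD (Cohen's d style) formula and all numbers are illustrative assumptions, not the review's actual extraction data or exact SMD variant.

```python
# Illustrative sketch only: a standardized mean difference (pooled-SD form)
# and the 0.1-difference agreement check applied to two observers' point
# estimates. All summary statistics are hypothetical.
import math

def smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# two observers extract slightly different values from the same trial report
smd_observer_a = smd(12.1, 4.0, 30, 14.0, 4.2, 31)
smd_observer_b = smd(12.5, 4.1, 30, 14.0, 4.2, 31)

agree = abs(smd_observer_a - smd_observer_b) < 0.1
print(f"SMD A = {smd_observer_a:.2f}, SMD B = {smd_observer_b:.2f}, agree: {agree}")
```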
Abstract:
In Europe, disagreement persists in the courts about whether plaintiffs may request the transfer of a domain name in domain name disputes. In the last ten years, Slovak and Czech courts have also produced some jurisprudence on this issue. Interestingly, the BGH's influential opinion in the shell.de decision, which denied domain name transfer as an available remedy under German law back in 2002, was not initially followed. On the contrary, several Slovak and Czech lower-court decisions allowed domain name transfers on two different legal bases. This seemingly settled case law was rejected a few months ago by the globtour.cz decision of the Czech Supreme Court, which refused domain name transfers for the time being.
Abstract:
This article brings to light several inconsistencies within the narrative of the EU policy on institutional multilingualism. The EU has invoked the fundamental EU principles of democracy, equality and transparent government to publicly bolster the need for its institutions to communicate and operate in the languages of its citizens. However, these principles cannot be reconciled with the pragmatic and budgetary arguments that the EU uses to justify the limited number of official and de facto working languages its institutions use in practice. The article argues that this disagreement could be resolved if the narrative of the EU's language policy included the objective that all European citizens master at least one of the languages that the EU institutions use. In that light, the article recommends further research into whether the EU should accept or even encourage the spontaneous development of English as a de facto pan-European lingua franca.
Abstract:
The early phase of psychotherapy has been regarded as a sensitive period in the unfolding of treatment towards positive outcomes. However, there is disagreement about the degree to which early (especially relationship-related) session experiences predict outcome over and above initial levels of distress and early response to treatment. The goal of the present study was to examine outcome at post-treatment simultaneously as a function of (a) intake symptom and interpersonal distress as well as early change in well-being and symptoms, (b) the patient's early session experiences, (c) the therapist's early session experiences/interventions, and (d) their interactions. The data of 430 psychotherapy completers treated by 151 therapists were analyzed using hierarchical linear models. Results indicate that early positive intra- and interpersonal session experiences, as reported by patients and therapists after the sessions, explained 58% of the variance of a composite outcome measure, taking intake distress and early response into account. All predictors (other than problem-activating therapist interventions) contributed to later treatment outcomes when entered as single predictors. However, the multi-predictor analyses indicated that interpersonal distress at intake as well as the early interpersonal session experiences of patients and therapists remained robust predictors of outcome. The findings underscore that, early in therapy, therapists (and their supervisors) need to understand and monitor multiple interconnected components simultaneously.
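A minimal sketch of the kind of two-level model referred to above (patients nested within therapists) is given below using statsmodels; the synthetic data, column names and simple random-intercept structure are assumptions for illustration and do not reproduce the study's analysis.

```python
# Illustrative sketch only: a hierarchical (mixed-effects) linear model with a
# random intercept per therapist, fitted to synthetic data. Column names and
# effect sizes are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_patients, n_therapists = 400, 20
data = pd.DataFrame({
    "therapist_id": rng.integers(0, n_therapists, n_patients),
    "intake_distress": rng.normal(size=n_patients),
    "early_change": rng.normal(size=n_patients),
    "patient_session_exp": rng.normal(size=n_patients),
    "therapist_session_exp": rng.normal(size=n_patients),
})
data["outcome"] = (
    0.4 * data["intake_distress"]
    + 0.3 * data["early_change"]
    + 0.2 * data["patient_session_exp"]
    + rng.normal(scale=0.5, size=n_patients)
)

model = smf.mixedlm(
    "outcome ~ intake_distress + early_change + patient_session_exp + therapist_session_exp",
    data=data,
    groups=data["therapist_id"],  # random intercept for each therapist
)
print(model.fit().summary())
```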
Abstract:
Understanding natural climate variability and its driving factors is crucial to assessing future climate change. Therefore, comparing proxy-based climate reconstructions with forcing factors, as well as comparing both with paleoclimate model simulations, is key to gaining insights into the relative roles of internal versus forced variability. A review of the state of modelling of the climate of the last millennium prior to the CMIP5–PMIP3 (Coupled Model Intercomparison Project Phase 5–Paleoclimate Modelling Intercomparison Project Phase 3) coordinated effort is presented, and the simulations are compared to the available temperature reconstructions. Simulations and reconstructions broadly agree in reproducing the major temperature changes and suggest an overall linear response to external forcing on multidecadal or longer timescales. Internal variability is found to have an important influence at hemispheric and global scales. The spatial distribution of simulated temperature changes during the transition from the Medieval Climate Anomaly to the Little Ice Age disagrees with that found in the reconstructions. Thus, either internal variability is a major player in shaping temperature changes through the millennium, or the model simulations have problems realistically representing the response pattern to external forcing. A last millennium transient climate response (LMTCR) is defined to provide a quantitative framework for analysing the consistency between simulated and reconstructed climate. Beyond an overall agreement between simulated and reconstructed LMTCR ranges, this analysis is able to single out specific discrepancies between some reconstructions and the ensemble of simulations. The disagreement is found in cases where the reconstructions show reduced covariability with external forcings or present high rates of temperature change.
Abstract:
Global wetlands are believed to be climate sensitive, and are the largest natural emitters of methane (CH4). Increased wetland CH4 emissions could act as a positive feedback to future warming. The Wetland and Wetland CH4 Inter-comparison of Models Project (WETCHIMP) investigated our present ability to simulate large-scale wetland characteristics and corresponding CH4 emissions. To ensure inter-comparability, we used a common experimental protocol driving all models with the same climate and carbon dioxide (CO2) forcing datasets. The WETCHIMP experiments were conducted for model equilibrium states as well as transient simulations covering the last century. Sensitivity experiments investigated model response to changes in selected forcing inputs (precipitation, temperature, and atmospheric CO2 concentration). Ten models participated, covering the spectrum from simple to relatively complex, including models tailored either for regional or global simulations. The models also varied in their methods for calculating wetland size and location, with some models simulating wetland area prognostically, while others relied on remotely sensed inundation datasets or on an approach intermediate between the two. Four major conclusions emerged from the project. First, the suite of models demonstrates extensive disagreement in simulations of wetland areal extent and CH4 emissions, in both space and time. Simple metrics of wetland area, such as the latitudinal gradient, show large variability, principally between models that use inundation dataset information and those that independently determine wetland area. Agreement between the models improves for zonally summed CH4 emissions, but large variation between the models remains. For annual global CH4 emissions, the models vary by ±40% of the all-model mean (190 Tg CH4 yr−1). Second, all models show a strong positive response to increased atmospheric CO2 concentrations (857 ppm) in both CH4 emissions and wetland area. In response to increasing global temperatures (+3.4 °C, globally spatially uniform), the models on average decreased wetland area and CH4 fluxes, primarily in the tropics, but the magnitude and sign of the response varied greatly. Models were least sensitive to increased global precipitation (+3.9%, globally spatially uniform), showing a consistent small positive response in CH4 fluxes and wetland area. Results from the 20th century transient simulation show that interactions between climate forcings could have strong non-linear effects. Third, we presently do not have wetland methane observation datasets adequate to evaluate model fluxes at a spatial scale comparable to model grid cells (commonly 0.5°). This limitation severely restricts our ability to model global wetland CH4 emissions with confidence. Our simulated wetland extents are also difficult to evaluate because of extensive disagreements between wetland mapping and remotely sensed inundation datasets. Fourth, the large range in predicted CH4 emission rates leads to the conclusion that there is both substantial parameter and structural uncertainty in large-scale CH4 emission models, even after uncertainties in wetland areas are accounted for.
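The "±40% of the all-model mean" statement is simple arithmetic on the per-model global totals; the sketch below illustrates that calculation with placeholder emission values rather than actual WETCHIMP output.

```python
# Illustrative arithmetic only: across-model spread in global CH4 emissions
# expressed as a percentage of the all-model mean. Values are placeholders.

global_emissions = [141, 165, 178, 190, 204, 228, 251]  # Tg CH4 per year, hypothetical per-model totals

mean_emission = sum(global_emissions) / len(global_emissions)
half_range = (max(global_emissions) - min(global_emissions)) / 2
spread_pct = 100 * half_range / mean_emission
print(f"all-model mean = {mean_emission:.0f} Tg CH4/yr, spread = ±{spread_pct:.0f}%")
```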
Abstract:
To enable buyers to be better informed before purchasing, products and services can be experienced virtually on the internet. Research into virtual experience (VE) and the related construct of telepresence (TP) as means of online marketing has made great progress in recent years. However, there is still disagreement in the literature concerning the exact understanding of these terms. In this study, the two terms are analyzed by means of a systematically executed literature review, differentiated from one another, and their understandings explained. To our knowledge, this study is the first to compare the concepts of VE and TP in a systematic way. The analysis shows that TP is regarded as the feeling of presence conveyed by a communication medium. VE, on the other hand, is defined as an active state of the consumer induced through the use of computer-based presentation formats, and constitutes a subtype of TP. These findings are intended to help VE and TP become more uniformly understood and to make it easier to compare the results of future studies. Finally, the literature review makes it possible to derive focal points for future research.
Abstract:
The task of encoding and processing complex sensory input requires many types of transsynaptic signals. This requirement is served in part by an extensive group of neurotransmitter substances, which may include thirty or more different compounds. At the next level of information processing, the existence of multiple receptors for a given neurotransmitter appears to be a widely used mechanism to generate multiple responses to a given first messenger (Snyder and Goodman, 1980). Despite the wealth of published data on GABA receptors, the existence of more than one GABA receptor was in doubt until the mid-1980s. Presently there is still disagreement on the number of types of GABA receptors, estimates for which range from two to four (DeFeudis, 1983; Johnston, 1985). Part of the problem in evaluating data concerning multiple receptor types is the lack of information on the number of gene products and their subsequent supramolecular organization in different neurons. In order to evaluate the question concerning the diversity of GABA receptors in the nervous system, we must rely on indirect information derived from a wide variety of experimental techniques. These include pharmacological binding studies on membrane fractions, electrophysiological studies, localization studies, purification studies, and functional assays. Almost all parts of the central and peripheral nervous system use GABA as a neurotransmitter, and these experimental techniques have therefore been applied to many different parts of the nervous system for the analysis of GABA receptor characteristics. We are left with a large amount of data from a wide variety of techniques derived from many parts of the nervous system. When this project was initiated in 1983, there were only a handful of pharmacological tools available to assess the question of multiple GABA receptors. The approach adopted was to focus on a single model system, using a variety of experimental techniques, in order to evaluate the existence of multiple forms of GABA receptors. Using the in vitro rabbit retina, a combination of pharmacological binding studies, functional release studies and partial purification studies was undertaken to examine the GABA receptor composition of this tissue. Three types of GABA receptors were observed: A1 receptors coupled to benzodiazepine and barbiturate modulation, A2 or uncoupled GABA-A receptors, and GABA-B receptors. These results are evaluated and discussed in light of recent findings by others concerning the number and subtypes of GABA receptors in the nervous system.