815 results for Consensus Panel
Abstract:
Introduction. Feature usage is a prerequisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. Having identified core features from the literature, we refine them further with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that serve as indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology for validating the content of this dependent variable. This is the first step of instrument development, prior to statistical confirmation with a larger sample.
Abstract:
We first propose a simple task for eliciting attitudes toward risky choice, the SGG lottery-panel task, which consists of a series of lotteries constructed to compensate riskier options with higher risk-return trade-offs. Using principal component analysis, we show that the SGG lottery-panel task is capable of capturing two dimensions of individual risky decision making, i.e., subjects' average risk taking and their sensitivity to variations in risk-return. From the results of a large experimental dataset, we confirm that the task systematically captures a number of regularities, such as: a tendency toward risk-averse behavior (only around 10% of choices are compatible with risk neutrality); an attraction to certain payoffs compared to low-risk lotteries, compatible with the over- (under-)weighting of small (large) probabilities predicted by prospect theory (PT); and gender differences, i.e., males being consistently less risk averse than females, but both genders being similarly responsive to increases in the risk premium. Another interesting result is that in hypothetical choices most individuals increase their risk taking in response to an increase in the return to risk, as predicted by PT, while across panels with real rewards we see even more changes, but opposite to the expected pattern of riskier choices for higher risk-returns. We therefore conclude from our data that an "economic anomaly" emerges in the real-reward choices, opposite to the hypothetical choices. These findings are in line with Camerer's (1995) view that although in many domains paid subjects probably do exert extra mental effort which improves their performance, choice over money gambles is not likely to be a domain in which effort will improve adherence to rational axioms (p. 635).
Finally, we demonstrate that both dimensions of risk attitudes, average risk taking and sensitivity to variations in the return to risk, are desirable not only to describe behavior under risk but also to explain behavior in other contexts, as illustrated by an example. In the second study, we propose three additional treatments intended to elicit risk attitudes under high stakes and mixed-outcome (gains and losses) lotteries. Using a dataset obtained from a hypothetical implementation of the tasks, we show that the new treatments are able to capture both dimensions of risk attitudes. This new dataset allows us to describe several regularities, both at the aggregate and at the within-subjects level. We find that in every treatment over 70% of choices show some degree of risk aversion, and only between 0.6% and 15.3% of individuals are consistently risk neutral within the same treatment. We also confirm the existence of gender differences in the degree of risk taking; that is, in all treatments females prefer safer lotteries than males do. Regarding our second dimension of risk attitudes, we observe in all treatments an increase in risk taking in response to risk-premium increases. Treatment comparisons reveal other regularities, such as a lower degree of risk taking in large-stake treatments compared to low-stake treatments, and a lower degree of risk taking when losses are incorporated into the large-stake lotteries. These results are compatible with previous findings in the literature on stake-size effects (e.g., Binswanger, 1980; Bosch-Domènech & Silvestre, 1999; Hogarth & Einhorn, 1990; Holt & Laury, 2002; Kachelmeier & Shehata, 1992; Kühberger et al., 1999; Weber & Chapman, 2005; Wik et al., 2007) and domain effects (e.g., Brooks & Zank, 2005; Schoemaker, 1990; Wik et al., 2007). For small-stake treatments, however, we find that the effect of incorporating losses into the outcomes is less clear.
At the aggregate level an increase in risk taking is observed, but also more dispersion in the choices, whilst at the within-subjects level the effect weakens. Finally, regarding responses to the risk premium, we find that sensitivity is lower in the mixed-lottery treatments (SL and LL) than in the gains-only treatments. In general, sensitivity to risk-return is more affected by the domain than by the stake size. Having described the properties of risk attitudes as captured by the SGG risk elicitation task and its three new versions, it is important to recall that the danger of using unidimensional descriptions of risk attitudes goes beyond their incompatibility with modern economic theories such as PT and CPT, all of which call for tests with multiple degrees of freedom. Faithful to this recommendation, the contribution of this essay is an empirically and endogenously determined bi-dimensional specification of risk attitudes, useful for describing behavior under uncertainty and for explaining behavior in other contexts. We hope this will contribute to the creation of large datasets containing a multidimensional description of individual risk attitudes, while at the same time allowing for a robust context, compatible with present and even future, more complex descriptions of human attitudes towards risk.
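The dimensionality-reduction step described above can be sketched minimally: extracting two components from subject-by-panel choice data. All data, sizes, and parameters below are synthetic illustrations, not the original study's dataset or task design.

```python
import numpy as np

# Synthetic data: rows = subjects, columns = lottery panels with increasing
# risk premia; entries = a continuous "riskiness" of the chosen option.
# Each subject is generated from a level (average risk taking) and a slope
# (sensitivity to the risk premium), mimicking the two hypothesised dimensions.
rng = np.random.default_rng(0)
n_subjects, n_panels = 200, 4
level = rng.normal(2.0, 1.0, size=(n_subjects, 1))
slope = rng.normal(0.5, 0.3, size=(n_subjects, 1))
premium = np.arange(n_panels)                      # risk premium rises by panel
choices = level + slope * premium + rng.normal(0, 0.2, (n_subjects, n_panels))

# PCA on the covariance matrix of the centred choice matrix.
centred = choices - choices.mean(axis=0)
cov = np.cov(centred, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)                  # ascending order
explained = eigvals[::-1] / eigvals.sum()          # variance share, descending

# Because the data have a level + slope structure, the first two components
# should account for nearly all of the variance.
print(explained[:2].sum())
```

With real choice data the component loadings, not just the variance shares, would be inspected to interpret the two dimensions.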
Abstract:
Using panel data for 111 countries over the period 1982–2002, we employ two indexes that cover a wide range of human rights to empirically analyze whether, and to what extent, terrorism affects human rights. According to our results, terrorism significantly, but not dramatically, diminishes governments' respect for basic human rights such as the absence of extrajudicial killings, political imprisonment, and torture. The result is robust to how we measure terrorist attacks, to the method of estimation, and to the choice of countries in our sample. However, we find no effect of terrorism on empowerment rights.
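A fixed-effects (within) estimator of the kind typically used in such country-panel studies can be sketched on synthetic data. The variable names, effect size, and data-generating process below are illustrative assumptions, not the authors' dataset or estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n_countries, n_years = 30, 20
true_beta = -0.4          # assumed (negative) effect of terror on rights scores

# Synthetic panel: rights = country fixed effect + beta * terror + noise.
country_effect = rng.normal(0, 2, n_countries)
terror = rng.poisson(3, (n_countries, n_years)).astype(float)
rights = (country_effect[:, None] + true_beta * terror
          + rng.normal(0, 0.5, (n_countries, n_years)))

# Within transformation: demean each country's series to absorb the
# time-invariant country effects, then run pooled OLS on the demeaned data.
x = terror - terror.mean(axis=1, keepdims=True)
y = rights - rights.mean(axis=1, keepdims=True)
beta_hat = (x * y).sum() / (x ** 2).sum()
print(beta_hat)
```

The estimate recovers the assumed coefficient despite the large country effects, which is why fixed-effects specifications are standard when unobserved country heterogeneity is plausible.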
Abstract:
Using the KOF Index of Globalization and two indices of economic freedom, the authors empirically analyze whether globalization and economic liberalization affect governments’ respect for human rights in a panel of 106 countries over the 1981–2004 period. According to their results, physical integrity rights significantly and robustly increase with globalization and economic freedom, while empowerment rights are not robustly affected. Due to the lack of consensus about the appropriate level of empowerment rights as compared to the outright rejection of any violation of physical integrity rights, the global community is presumably less effective in promoting empowerment rights.
Abstract:
A number of studies have used the Survey of Professional Forecasters (SPF) to address the relationship between intra-personal uncertainty and inter-personal disagreement about the future values of economic variables such as output growth and inflation. By making use of the SPF respondents' probability forecasts of declines in output, we are able to construct a quarterly series of output-growth uncertainty to supplement the annual series that are often used in such analyses. We also consider the relationship between disagreement and uncertainty for probability forecasts of declines in output.
Abstract:
The bitter taste elicited by dairy protein hydrolysates (DPH) is a well-known obstacle to their acceptability by consumers and therefore to their incorporation into foods. The traditional method of assessing taste in foods is sensory analysis, but this can be problematic due to the overall unpleasantness of the samples. Thus, there is growing interest in the use of electronic tongues (e-tongues) as an alternative method to quantify the bitterness of such samples. In the present study, the response of the e-tongue to the standard bitter agent caffeine and to a range of both casein- and whey-based hydrolysates was compared to that of a trained sensory panel. Partial least squares (PLS) regression was employed to compare the responses of the e-tongue and the sensory panel. Strong correlation between the two methods was found in the analysis of caffeine (R² of 0.98) and of DPH samples, with R² values ranging from 0.94 to 0.99. This study demonstrates the potential of the e-tongue for bitterness screening in DPHs, reducing the reliance on expensive and time-consuming sensory panels.
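The PLS step can be sketched with a minimal one-component (NIPALS-style) implementation on synthetic data. The sensor count, sample size, and noise level are assumptions for illustration, not the study's actual instrumentation or measurements.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic data: 30 hydrolysate samples, 7 e-tongue sensor channels, and one
# trained-panel bitterness score per sample.
n_samples, n_sensors = 30, 7
bitterness = rng.uniform(1, 9, n_samples)          # panel scores (1-9 scale)
loadings = rng.normal(0, 1, n_sensors)             # how each sensor responds
sensors = np.outer(bitterness, loadings) + rng.normal(0, 0.3, (n_samples, n_sensors))

# One-component PLS: the weight vector is proportional to X'y, so the score
# direction maximises covariance between sensor readings and panel scores.
Xc = sensors - sensors.mean(axis=0)
yc = bitterness - bitterness.mean()
w = Xc.T @ yc
w /= np.linalg.norm(w)
t = Xc @ w                                         # latent scores per sample
b = (t @ yc) / (t @ t)                             # regress y on the score
pred = bitterness.mean() + b * t

r2 = 1 - ((bitterness - pred) ** 2).sum() / (yc ** 2).sum()
print(r2)
```

In practice several PLS components and cross-validation would be used, but one component already shows how a multichannel sensor response is calibrated against a single sensory score.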
Abstract:
Dystrophin, the protein product of the Duchenne muscular dystrophy (DMD) gene, was studied in 19 patients with Xp21 disorders and in 25 individuals with non-Xp21 muscular dystrophy. Antibodies raised to seven different regions spanning most of the protein were used for immunocytochemistry. In all patients, specific dystrophin-staining anomalies were detected and correlated with clinical severity and with gene deletion. In patients with Becker muscular dystrophy (BMD), the anomalies detected ranged from inter- and intra-fibre variation in labelling intensity with the same antibody or several antibodies, to a general reduction in staining and discontinuous staining. In vitro evidence of abnormal dystrophin breakdown was observed on reanalysing the muscle of patients with BMD, but not that of non-Xp21 dystrophies, after it had been stored for several months. A number of patients with DMD showed some staining, but this did not represent a diagnostic problem. Based on the data presented, it was concluded that immunocytochemistry is a powerful technique in the prognostic diagnosis of Xp21 muscular dystrophies.
Abstract:
Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a fault-tolerant failure detection and consensus algorithm. This paper presents and compares two novel failure detection and consensus algorithms. The proposed algorithms are based on gossip protocols and are inherently fault-tolerant and scalable. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in both algorithms the number of gossip cycles needed to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage, and perfect synchronization in achieving global consensus.
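The logarithmic scaling of gossip-based dissemination can be illustrated with a toy push-gossip simulation. This is a sketch of the general mechanism, not the paper's algorithms: each cycle, every process that already knows the failed-process list forwards it to one uniformly random peer.

```python
import math
import random

def cycles_to_global_knowledge(n, seed=0):
    """Simulate push gossip among n alive processes and return the number of
    cycles until all of them know the failed-process list."""
    rng = random.Random(seed)
    informed = {0}                     # one process detects the failure first
    cycles = 0
    while len(informed) < n:
        # Every informed process pushes to one random peer this cycle.
        targets = {rng.randrange(n) for _ in informed}
        informed |= targets
        cycles += 1
    return cycles

# Cycle counts grow roughly with log2(n), consistent with the reported
# logarithmic scaling of consensus cycles with system size.
for n in (64, 256, 1024):
    print(n, cycles_to_global_knowledge(n), math.ceil(math.log2(n)))
```

A real implementation would also aggregate per-process failure suspicions and detect when consensus on the failed set has been reached, which is the harder part the paper addresses.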
Abstract:
Using a newly developed integrated indicator system with entropy weighting, we analyzed panel data on 577 recorded disasters in 30 provinces of China from 1985 to 2011 to identify their links with subsequent economic growth. Meteorological disasters promoted economic growth through human capital rather than physical capital. Geological disasters did not trigger local economic growth from 1999 to 2011. Overall, natural disasters had no significant impact on economic growth from 1985 to 1998. Thus, human capital reinvestment should be the aim in managing recoveries, and it should be used to regenerate the local economy on the basis of long-term sustainable development.
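The entropy-weighting step for building an integrated indicator can be sketched as follows. The indicator matrix is synthetic, and the procedure shown is the standard entropy weight method; the authors' exact indicator set and normalisation are assumptions here.

```python
import numpy as np

# Synthetic indicator matrix: rows = provinces, columns = disaster indicators
# (e.g. deaths, direct economic loss, affected area); strictly positive values.
rng = np.random.default_rng(3)
data = rng.uniform(1, 100, size=(30, 4))

# Entropy weight method: normalise each indicator column to proportions,
# compute its information entropy, and weight indicators by 1 - entropy,
# so indicators with more cross-province variation get larger weights.
p = data / data.sum(axis=0)
k = 1.0 / np.log(data.shape[0])                # scales entropy into [0, 1]
entropy = -k * (p * np.log(p)).sum(axis=0)
weights = (1 - entropy) / (1 - entropy).sum()

composite = data @ weights                     # integrated index per province
print(weights, composite.shape)
```

The resulting composite series per province is what would then enter the growth regressions.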
Abstract:
Crosstalk between nuclear receptors is important for converting external and internal stimuli into a physiologically meaningful response by cells. Previous studies from this laboratory have demonstrated crosstalk between the estrogen receptor (ER) and thyroid hormone receptor (TR) on two estrogen-responsive physiological promoters, the preproenkephalin and oxytocin receptor gene promoters. Since ERα and ERβ are isoforms possessing overlapping and distinct transactivation properties, we hypothesized that the interactions of ERα and ERβ with the various TR isoforms would not be equivalent. To explore this hypothesis, the consensus estrogen response element (ERE) derived from the Xenopus vitellogenin gene was used to investigate the differences in interaction between the ERα and ERβ isoforms and the different TR isoforms in fibroblast cells. Both ER isoforms transactivate from the consensus ERE, though ERα transactivates to a greater extent than ERβ. Although neither of the TRβ isoforms has an effect on ERα transactivation from the consensus ERE, liganded TRα1 inhibits ERα transactivation from the consensus ERE. In contrast, liganded TRα1 facilitates ERβ-mediated transactivation. The crosstalk of the TRβ isoforms with the ERα isoform on the consensus ERE differs from that with the ERβ isoform. The use of a TRα1 mutant that is unable to bind DNA abolishes the ability of the TRα1 isoform to interact with either of the ER isoforms. These differences in nuclear receptor crosstalk reveal an important functional difference between isoforms, which provides a novel mechanism for neuroendocrine integration.
Abstract:
Thyroid hormones (T) and estrogens (E) are nuclear receptor ligands with at least two molecular mechanisms of action: (i) relatively slow genomic effects, such as the regulation of transcription by cognate T receptors (TR) and E receptors (ER); and (ii) relatively rapid nongenomic effects, such as kinase activation and calcium release initiated at the membrane by putative membrane receptors. Genomic and nongenomic effects were long thought to be disparate and independent. However, in a previous study using a two-pulse paradigm in neuroblastoma cells, we showed that E acting at the membrane could potentiate transcription from an E-driven reporter gene in the nucleus. Because both T and E can have important effects on mood and cognition, it is possible that the two hormones act synergistically. In this study, we demonstrate that early actions of T via TRα1 and TRβ1 can potentiate E-mediated transcription (genomic effects) from a consensus E response element (ERE)-driven reporter gene in transiently transfected neuroblastoma cells. Such potentiation was reduced by inhibition of mitogen-activated protein kinase (MAPK). Using phosphomutants of ERα, we also show that probable MAPK phosphorylation sites on ERα, the serines at positions 167 and 118, are important in TRβ1-mediated potentiation of ERα-induced transactivation. We suggest that crosstalk between T and E includes potential interactions through both nuclear and membrane-initiated molecular mechanisms of hormone signaling.
Abstract:
Chromosomal microarray (CMA) analysis is increasingly utilized for genetic testing of individuals with unexplained developmental delay/intellectual disability (DD/ID), autism spectrum disorders (ASD), or multiple congenital anomalies (MCA). Performing both CMA and G-banded karyotyping on every patient substantially increases the total cost of genetic testing. The International Standard Cytogenomic Array (ISCA) Consortium held two international workshops and conducted a literature review of 33 studies, including 21,698 patients tested by CMA. We provide an evidence-based summary of clinical cytogenetic testing, comparing CMA to G-banded karyotyping with respect to technical advantages and limitations, diagnostic yield for various types of chromosomal aberrations, and issues that affect test interpretation. CMA offers a much higher diagnostic yield (15–20%) for genetic testing of individuals with unexplained DD/ID, ASD, or MCA than a G-banded karyotype (~3%, excluding Down syndrome and other recognizable chromosomal syndromes), primarily because of its higher sensitivity for submicroscopic deletions and duplications. Truly balanced rearrangements and low-level mosaicism are generally not detectable by arrays, but these are relatively infrequent causes of abnormal phenotypes in this population (<1%). Available evidence strongly supports the use of CMA in place of G-banded karyotyping as the first-tier cytogenetic diagnostic test for patients with DD/ID, ASD, or MCA. G-banded karyotype analysis should be reserved for patients with obvious chromosomal syndromes (e.g., Down syndrome), a family history of chromosomal rearrangement, or a history of multiple miscarriages.