33 results for Voting.
Abstract:
OBJECTIVES The aim of this study was to forecast trends in restorative dentistry over the next 20 years and to identify treatment goals and the corresponding properties of restorative materials. METHODS Using the Delphi method, a panel of 3 experts identified 8 key questions, which were sent to experts in restorative and preventive dentistry. In round 1 of this survey, 15 international experts devised a clearer semantic definition of the key questions and completed the respective items for two additional rounds. In round 2, 125 experts from 35 countries rated the items developed in round 1 on a Likert scale. In round 3, the same 125 experts received the ratings of round 2 and were asked to agree or disagree with these ratings by re-voting on all key questions and items. A total of 105 experts re-voted and thus completed the full survey. Among the 8 key questions, two were selected for the present report: (Q1) "What will be the future role of restorative treatment?" and (Q6) "What will be the key qualities for clinical success of restorations?" For both questions and their respective items, the experts were asked to evaluate importance and feasibility for the later calculation of the scientific value (i.e., the opportunity, where opportunity = importance + [importance − feasibility]). RESULTS The three items of highest importance for Q1 were "preservation of existing enamel and dentin tissue," "prevention of secondary caries," and "maintenance of pulp vitality," and for Q6 they were "optimization of adhesion," "biocompatibility," and "minimizing technique sensitivity." SIGNIFICANCE Bioactivity toward the pulp-dentin complex and prevention of secondary caries were the items generally rated as having the highest opportunity.
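The opportunity score used in the abstract (opportunity = importance + [importance − feasibility]) can be sketched in a few lines; the item names and mean Likert ratings below are hypothetical illustrations, not the survey's data:

```python
# Sketch of the "scientific value" (opportunity) score from the abstract:
# opportunity = importance + (importance - feasibility), so items that are
# important but hard to achieve rank highest. All ratings here are made up.

def opportunity(importance: float, feasibility: float) -> float:
    return importance + (importance - feasibility)

# Hypothetical mean Likert ratings: (importance, feasibility)
items = {
    "optimization of adhesion": (4.6, 3.8),
    "biocompatibility": (4.5, 4.1),
    "prevention of secondary caries": (4.7, 3.2),
}

ranked = sorted(items, key=lambda name: opportunity(*items[name]), reverse=True)
for name in ranked:
    print(name, round(opportunity(*items[name]), 1))
```

With these made-up ratings, "prevention of secondary caries" ranks first: high importance combined with low feasibility signals the greatest opportunity.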
Abstract:
While the pathology peer review/pathology working group (PWG) model has long been used in mammalian toxicologic pathology to ensure the accuracy, consistency, and objectivity of histopathology data, application of this paradigm to ecotoxicological studies has thus far been limited. In the current project, the PWG approach was used to evaluate histopathologic sections of gills, liver, kidney, and/or intestines from three previously published studies of diclofenac in trout, among which there was substantial variation in the reported histopathologic findings. The main objectives of this review process were to investigate and potentially reconcile these interstudy differences, and based on the results, to establish an appropriate no observed effect concentration (NOEC). Following a complete examination of all histologic sections and original diagnoses by a single experienced fish pathologist (pathology peer review), a two-day PWG session was conducted to allow members of a four-person expert panel to determine the extent of treatment-related findings in each of the three trout studies. The PWG was performed according to the United States Environmental Protection Agency (US EPA) Pesticide Regulation (PR) 94-5 (EPA Pesticide Regulation, 1994). In accordance with standard procedures, the PWG review was conducted by the non-voting chairperson in a manner intended to minimize bias, and thus during the evaluation, the four voting panelists were unaware of the treatment group status of individual fish and the original diagnoses associated with the histologic sections. Based on the results of this review, findings related to diclofenac exposure included minimal to slightly increased thickening of the gill filament tips in fish exposed to the highest concentration tested (1,000 μg/L), plus a previously undiagnosed finding, decreased hepatic glycogen, which also occurred at the 1,000 μg/L dose level. 
The panel found little evidence to support other reported effects of diclofenac in trout, and thus the overall NOEC was determined to be >320 μg/L. By consensus, the PWG panel was able to identify diagnostic inconsistencies among and within the three prior studies; this exercise therefore demonstrated the value of the pathology peer review/PWG approach for assessing the reliability of histopathology results that may be used by regulatory agencies for risk assessment.
Abstract:
This article asks whether voters' participation in federal elections is lower in the new Länder (East Germany) than in the old Länder (West Germany). It is assumed that voters in the new Länder are less convinced that they can influence politics by voting. Using the perspective of cognitive psychology, the article stresses differences in individual interpretations of the election context among citizens of the new and old Länder. Furthermore, it is argued that the strength of the expected influence of voting depends on the structure and direction of individuals' beliefs in their competence and control, as well as their beliefs in causality and self-efficacy. These beliefs may differ among voters in the new and old Länder. For the empirical analysis, the article uses data from the German General Social Survey 1998.
Abstract:
This article seeks to contribute to illuminating the so-called 'paradox of voting', using the German Bundestag election of 1998 as an empirical case. Downs' model of voter participation is extended to include elements of the theory of subjective expected utility (SEU). This allows a theoretical and empirical exploration of the crucial mechanisms behind individual voters' decisions to participate in, or abstain from, the German general election of 1998. It is argued that the vanishingly small probability that an individual citizen's vote will decide the election outcome does not necessarily reduce the probability of electoral participation. The empirical analysis is largely based on data from the ALLBUS 1998. It confirms the predictions derived from SEU theory. Voters' expected benefits and their subjective expectation of being able to influence government policy by voting are the crucial mechanisms explaining participation. By contrast, the explanatory contribution of perceived information and opportunity costs is low.
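As a rough illustration of the mechanism discussed above, the baseline Downsian calculus of voting, R = p·B − C + D, can be sketched with hypothetical values; this is the textbook formulation, not the article's extended SEU model, and all numbers are invented:

```python
# Minimal sketch of the Downsian calculus of voting, R = p*B - C + D.
# p: probability of casting the decisive vote (subjective, under SEU theory),
# B: benefit from one's preferred outcome, C: cost of voting,
# D: non-instrumental (duty/expressive) benefit. All values are hypothetical.

def reward_from_voting(p: float, B: float, C: float, D: float) -> float:
    return p * B - C + D

# With the near-zero objective probability of being decisive,
# instrumental benefits alone do not outweigh costs...
objective = reward_from_voting(p=1e-8, B=100.0, C=0.5, D=0.0)

# ...but a subjectively inflated efficacy belief plus a duty term
# can make participation rational in SEU terms.
subjective = reward_from_voting(p=0.05, B=100.0, C=0.5, D=1.0)

print(objective < 0, subjective > 0)  # prints: True True
```

The contrast shows why substituting subjective expectations for objective probabilities dissolves the paradox: participation hinges on perceived, not actual, influence.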
Abstract:
We propose a new method for fully automatic landmark detection and shape segmentation in X-ray images. Our algorithm works by estimating the displacements from image patches to the (unknown) landmark positions and then integrating them via voting. The fundamental contribution is that we jointly estimate the displacements from all patches to multiple landmarks together, considering not only the training data but also geometric constraints on the test image. The various constraints constitute a convex objective function that can be solved efficiently. Validated on three challenging datasets, our method achieves high accuracy in landmark detection and, combined with a statistical shape model, gives a better performance in shape segmentation than state-of-the-art methods.
Abstract:
Parties, the Green Liberal Party and the Conservative Democratic Party. This election thus marks an end to the trend towards a growing level of party polarization. The introduction to this special issue puts these recent developments in context, discusses some of their consequences, and highlights how the articles in this special issue shed light on voting behaviour in Switzerland.
Abstract:
On 9 February 2014, the Swiss people accepted the popular initiative "against mass immigration" launched by the national-conservative Swiss People's Party (SVP). This voting outcome has triggered wide-ranging debates about both immigration policy and the future of Switzerland within the European context. Against this background, we evaluate attitudes toward immigration in Switzerland. Using hitherto unexplored survey data from MOSAiCH, our empirical analyses show that as early as 2013, before the debate about the initiative on mass immigration was in full swing, roughly 53 percent of the 1011 interviewed Swiss citizens stated that immigration should be reduced. Moreover, our estimates indicate that the threats and fears induced by immigration and the will to maintain sovereignty and autonomy are particularly relevant to attitudes toward immigration. By contrast, education and national or personal economic conditions are only weakly related to the immigration issue.
Abstract:
In this paper, we propose a new method for fully automatic landmark detection and shape segmentation in X-ray images. To detect landmarks, we estimate the displacements from randomly sampled image patches to the (unknown) landmark positions and then integrate these predictions via a voting scheme. Our key contribution is a new algorithm for estimating these displacements. Unlike other methods, in which each image patch predicts its displacement independently, we jointly estimate the displacements from all patches together in a data-driven way, considering not only the training data but also geometric constraints on the test image. The displacement estimation is formulated as a convex optimization problem that can be solved efficiently. Finally, we use the sparse shape composition model as a priori information to regularize the landmark positions and thus generate the segmented shape contour. We validate our method on X-ray image datasets of three different anatomical structures: the complete femur, the proximal femur, and the pelvis. Experiments show that our method is accurate and robust in landmark detection and, combined with the shape model, gives a better or comparable performance in shape segmentation compared to state-of-the-art methods. Finally, a preliminary study using CT data shows the extensibility of our method to 3D data.
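The voting step of the method can be illustrated with a toy sketch. Here the per-patch displacement "estimates" are simulated as the true offsets plus noise; the paper's actual estimates come from the jointly solved convex program, which this sketch does not reproduce:

```python
import numpy as np

# Toy sketch of displacement voting for landmark detection: each sampled
# patch predicts a displacement to the landmark, and the predictions are
# integrated by accumulating votes on a grid. The displacements below are
# simulated (true offset + noise) purely for illustration.

rng = np.random.default_rng(0)
H, W = 64, 64
landmark = np.array([40, 25])                       # unknown target (row, col)

patch_centers = rng.integers(0, 64, size=(200, 2))  # sampled patch positions
displacements = landmark - patch_centers            # ideal regressor output
displacements = displacements + rng.normal(0, 2, size=displacements.shape)

votes = np.zeros((H, W))
for center, disp in zip(patch_centers, displacements):
    r, c = np.round(center + disp).astype(int)
    if 0 <= r < H and 0 <= c < W:
        votes[r, c] += 1                            # each patch casts one vote

detected = np.unravel_index(np.argmax(votes), votes.shape)
print(detected)  # lands close to (40, 25)
```

Each patch casts one vote at its predicted landmark position, so the mode of the vote map recovers the landmark even though individual predictions are noisy.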
Abstract:
Well-established methods exist for measuring party positions, but reliable means for estimating intra-party preferences remain underdeveloped. Most efforts focus on estimating the ideal points of individual legislators by inductively scaling roll-call votes, but these data suffer from two problems: selection bias due to unrecorded votes, and strong party discipline, which tends to make voting a strategic rather than a sincere indication of preferences. By contrast, legislative speeches are relatively unconstrained, as party leaders are less likely to punish MPs for speaking freely as long as they vote with the party line. Yet the differences between roll-call estimates and text scalings remain essentially unexplored, despite the growing use of statistical analysis of textual data to measure policy preferences. Our paper addresses this lacuna by exploiting a rich feature of the Swiss legislature: on most bills, legislators both vote and speak many times. Using these data, we compare text-based scalings of ideal points to vote-based scalings for a crucial piece of energy legislation. Our findings confirm that text scalings reveal larger intra-party differences than roll calls. Using regression models, we further explain the differences between roll-call and text scalings, attributing them to constituency-level preferences for energy policy.
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. 
In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller. 
Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which substantially reduces efficiency. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. 
In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
Abstract:
Studies assessing citizens' attitudes towards Europe have mostly used explicit concepts and measures. However, psychologists have shown that human behaviour is determined not only by explicit attitudes, which can be assessed via self-report, but also by implicit attitudes, which require indirect measurement. We combine a self-report questionnaire with an implicit Affect Misattribution Procedure, for the first time in an online environment, to estimate the reliability, validity and predictive power of this implicit measure for explaining European Union-skeptical behaviour. Based on a survey with a sample representative of Germany, we found evidence for good reliability and validity of the implicit measure. In addition, the implicit attitude had a significant incremental impact, beyond explicit attitudes, on citizens' proneness to engage in EU-skeptical information and voting behaviour.
Abstract:
We respond to the critique of our article (Ackermann & Traunmüller 2014) and argue that theories of the declining importance of socio-structural characteristics for voting behaviour are misguided. We are instead interested in the more substantive question of how, and under which conditions, these characteristics become politically effective. This theoretical perspective opens up the view to regional and temporal variation in processes of social influence, which contradicts common views on cleavage voting. We support our argument by demonstrating that social contexts are more important for individual voting behaviour today than they were decades ago. Finally, we discuss further implications for social-context analyses of voting behaviour.
Abstract:
Surveys on voting behavior typically overestimate turnout rates substantially. To disentangle different sources of bias (coverage error, nonresponse bias, and overreporting), we conducted a validation study in which respondents' self-reported voting behavior was compared to administrative voting records (N = 2000). Our results show that all three sources of error inflate the survey estimate of the turnout rate and also bias estimates from political participation models, although coverage error is only moderate compared to the more pronounced biases due to nonresponse and overreporting. Furthermore, results from a wording experiment provide no evidence that revised wording reduces measurement bias.
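The study's decomposition can be mimicked on synthetic data to show how validated records separate the error sources; every rate and parameter below is invented, and coverage error is omitted for brevity:

```python
import numpy as np

# Toy sketch of how validated voting records let one decompose survey
# turnout bias into nonresponse bias and overreporting. All parameters
# are hypothetical; only the study's design, not its data, is imitated.

rng = np.random.default_rng(1)
N = 2000
voted = rng.random(N) < 0.45                  # true turnout per records (~45%)

# Nonresponse correlated with voting: voters respond more often.
responds = rng.random(N) < np.where(voted, 0.7, 0.5)

# Overreporting: some abstainers claim to have voted; voters report truthfully.
reported = voted | (rng.random(N) < 0.15)

true_rate = voted.mean()
respondent_true_rate = voted[responds].mean()  # inflated by nonresponse
survey_estimate = reported[responds].mean()    # further inflated by overreporting

nonresponse_bias = respondent_true_rate - true_rate
overreporting_bias = survey_estimate - respondent_true_rate
print(f"{true_rate:.2f} {nonresponse_bias:+.2f} {overreporting_bias:+.2f}")
```

Both bias terms come out positive, reproducing the qualitative pattern the abstract reports: the survey estimate exceeds the validated turnout rate through two distinct channels.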