951 results for Conspiracist belief
Abstract:
Relevance theory (Sperber & Wilson, 1995) suggests that people expend cognitive effort when processing information in proportion to the cognitive effects to be gained from doing so. This theory has been used to explain how people apply their knowledge appropriately when evaluating category-based inductive arguments (Medin, Coley, Storms, & Hayes, 2003). In such arguments, people are told that a property is true of premise categories and are asked to evaluate the likelihood that it is also true of conclusion categories. According to the relevance framework, reasoners generate hypotheses about the relevant relation between the categories in the argument. We reasoned that premises inconsistent with early hypotheses about the relevant relation would have greater effects than consistent premises. We designed three-premise garden-path arguments where the same third premise was either consistent or inconsistent with likely hypotheses about the relevant relation. In Experiments 1 and 2, we showed that the effort expended processing consistent premises (measured via reading times) was significantly less than the effort expended on inconsistent premises. In Experiments 2 and 3, we demonstrated a direct relation between cognitive effect and cognitive effort. For garden-path arguments, belief change given inconsistent third premises was significantly correlated with Premise 3 (Experiment 3) and conclusion (Experiments 2 and 3) reading times. For consistent arguments, the correlation between belief change and reading times did not approach significance. These results support the relevance framework for induction but are difficult to accommodate under other approaches.
Abstract:
Some have entertained the belief that early modern Gaelic society conferred substantial rights on women. This could hardly be farther from the truth. In aristocratic Gaelic circles, women were used ruthlessly as pawns in political alliances and other manoeuvres. The status of women at the lower levels of society also seems to have been low relative to men. While patriarchal relationships persisted after the Plantation of Ulster, they took new forms. Some women actually benefited in terms of property rights relative to men. Economic change in the eighteenth century, in particular the development of proto-industry, opened up opportunities for poorer women, but it is notable that women did not feature at all in the public political sphere before 1800.
Abstract:
Contrary to a commonly held belief that broiler chickens need more space, there is increasing evidence that these birds are attracted to other birds. Indeed, commercially farmed birds exhibit a range of socially facilitated behaviours, such as increased feeding and preening in response to the presence of other birds. Social facilitation can generate feedback loops, whereby the adoption of a particular behaviour can spread rapidly and suddenly through the population. Here, by measuring the rate at which broiler chickens join and leave a feeding trough as a function of the number of birds already there, we quantify social facilitation. We use these measurements to parameterize a simulation model of chicken feeding behaviour. This model predicts, and further observations of broiler chickens confirm, that social facilitation leads to excitatory and synchronized patterns of group feeding. Such models could prove a powerful tool in understanding how feeding patterns depend on broiler house design.
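To make the quantitative idea concrete, the following minimal sketch (an assumed form with invented parameter values, not the authors' parameterized model) simulates a feeder where each bird's probability of joining increases with the number of birds already feeding, which is enough to produce bout-like, synchronized feeding:

import random

def simulate(n_birds=50, steps=2000, base_join=0.01, facilitation=0.004, p_leave=0.05):
    # feeding = number of birds currently at the trough
    feeding, history = 0, []
    for _ in range(steps):
        # joining probability rises with the number of birds already feeding (social facilitation)
        p_join = min(1.0, base_join + facilitation * feeding)
        joins = sum(random.random() < p_join for _ in range(n_birds - feeding))
        leaves = sum(random.random() < p_leave for _ in range(feeding))
        feeding += joins - leaves
        history.append(feeding)
    return history

trace = simulate()
print(max(trace), min(trace[500:]))  # large swings between peaks and troughs indicate synchronized feeding bouts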
Abstract:
Modern ‘nonscripted’ theatre (NST) clearly owes much to improvisation. Perhaps less obviously, and more surprisingly, so too does modern law. In this article I will contend that, despite all the rules of evidence and procedure, statutes and legal precedents that fundamentally govern the decisions and actions of a judge, it is only through ‘spontaneity’ that judgment can take place. This claim may appear strange to those well versed in the common law tradition, which proceeds on the basis of past legal decisions, or on reason where no precedent exists. NST, on the other hand, is assumed to rely heavily on the unprecedented and the unreasoned. Therefore, when the public watches an NST production, it places its faith in the belief that what is being observed is entirely new and is being produced ‘on the spur of the moment’.
Abstract:
For any proposed software project, once the software requirements specification has been established, requirements changes may result not only in a modification of the requirements specification but also in a series of modifications of all existing artifacts produced during development. It is therefore necessary to provide effective and flexible management of requirements changes. In this paper, we present an approach to managing requirements changes based on Booth’s negotiation-style framework for belief revision. Informally, we consider the current requirements specification as a belief set about the system-to-be. The request for a requirements change is viewed as new information about the same system-to-be. The process of executing the requirements change is then a process of revising beliefs about the system-to-be. We design a family of belief negotiation models appropriate for different processes of requirements revision, including settings in which the change request is fully accepted, the current requirements specification is fully preserved, or the current specification and the change request reach a compromise. In particular, the prioritization of requirements plays an important role in reaching an agreement in each belief negotiation model designed in this paper.
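As a rough illustration of how prioritization can drive such a negotiation (a minimal sketch with invented requirement names and an explicit conflict relation, not Booth's actual framework), a change request below prevails only over conflicting requirements of lower priority:

def revise(spec, request, conflicts):
    """spec: dict requirement -> priority; request: (requirement, priority);
    conflicts: set of frozensets naming mutually inconsistent requirements."""
    name, prio = request
    clashing = [r for r in spec if frozenset({r, name}) in conflicts]
    if all(spec[r] < prio for r in clashing):
        for r in clashing:          # the input prevails: retract weaker conflicting requirements
            del spec[r]
        spec[name] = prio
    # otherwise the prior specification is preserved for the conflicting part (a compromise)
    return spec

spec = {"encrypt_at_rest": 3, "plaintext_export": 1}
conflicts = {frozenset({"plaintext_export", "no_unencrypted_output"})}
print(revise(spec, ("no_unencrypted_output", 2), conflicts))
# {'encrypt_at_rest': 3, 'no_unencrypted_output': 2}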
Abstract:
Introduction: The laboratory mouse is a powerful tool in cardiovascular research. In this report, we describe a method for a reproducible mouse myocardial infarction model that would allow subsequent comparative and quantitative studies of molecular and pathophysiological variables. Methods: (A) The distribution of the major coronary arteries, including the septal artery, in the left ventricle of C57BL/6J mice (n=20) was mapped by perfusion of latex dye or fluorescent beads through the aorta. (B) The territory of myocardial infarction after ligation of the most proximal aspect of the left anterior descending (LAD) coronary artery was quantified. (C) The consistency of the histological changes accompanying the infarction at different time points was analyzed. Results: (A) The coronary artery tree of the mouse differs from that of the human, particularly in regard to the blood supply of the septum. (B) Contrary to previous belief, the septal coronary artery in the mouse is variable in origin. (C) A constant ligation of the LAD immediately below the left auricular level ensures a statistically significant, reproducible infarct size. (D) The ischemic changes can be monitored at a histological level in a way similar to what is described in the human. Conclusion: We illustrate a method for maximal reproducibility of experimental acute myocardial infarction in the mouse model, due to a consistent loss of perfusion in the lower half of the left ventricle. This will allow the study of molecular and physiological variables in a controlled and quantifiable experimental model environment.
Abstract:
Concern with what can explain variation in generalized social trust has led to an abundance of theoretical models. Defining generalized social trust as a belief in human benevolence, we focus on the emancipation theory and social capital theory as well as the ethnic diversity and economic development models of trust. We then determine which dimensions of individuals’ behavior and attitudes as well as of their national context are the most important predictors. Using data from 20 countries that participated in round one of the European Social Survey, we test these models at their respective level of analysis, individual and/or national. Our analysis revealed that individuals’ own trust in the political system as a moral and competent institution was the most important predictor of generalized social trust at the individual level, while a country’s level of affluence was the most important contextual predictor, indicating that different dimensions are significant at the two levels of analysis. This analysis also raised further questions as to the meaning of social capital at the two levels of analysis and the conceptual equivalence of its civic engagement dimension across cultures.
Abstract:
Substantial sums of money are invested annually in preventative medicine and therapeutic treatment for people with a wide range of physical and psychological health problems, sometimes to no avail. There is now mounting evidence to suggest that companion animals, such as dogs and cats, can enhance the health of their human owners and may thus have significant implications for national health expenditure. This paper explores the evidence that pets can contribute to human health and well-being. The article initially concentrates on the value of animals for short- and long-term physical health, before exploring the relationship between animals and psychological health, focusing on the ability of dogs, cats, and other species to aid the disabled and serve as a "therapist" to those in institutional settings. The paper also discusses the evidence for the ability of dogs to facilitate the diagnosis and treatment of specific chronic diseases, notably cancer, epilepsy, and diabetes. Mechanisms underlying the ability of animals to promote human health are discussed within a theoretical framework. Whereas the evidence for a direct causal association between human well-being and companion animals is not conclusive, the literature reviewed is largely supportive of the widely held, and long-standing, belief that "pets are good for us."
Abstract:
Some geological fakes and frauds are carried out solely for financial gain (mining fraud), whereas others may have increasing aesthetic appeal (faked fossils) or academic advancement (fabricated data) as their motive. All types of geological fake or fraud can be ingenious and sophisticated, as demonstrated in this article. Fake gems, faked fossils and mining fraud are common examples where monetary profit is to blame; nonetheless, these may impact both scientific theory and the reputation of geologists and Earth scientists. The substitution or fabrication of both physical and intellectual data also occurs for motives other than direct financial gain, such as career advancement or the establishment of belief (e.g. evolution vs. creationism). Knowledge of such fakes and frauds may assist in spotting undetected geological crimes: application of geoforensic techniques helps the scientific community to detect such activity, which ultimately undermines scientific integrity.
Abstract:
Belief revision characterizes the process of revising an agent’s beliefs when new evidence is received. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics. However, so far there is not much literature on this topic in evidence theory. By contrast, the combination rules proposed so far in the theory of evidence, especially Dempster’s rule, are symmetric. They rely on a basic assumption, namely that the pieces of evidence being combined are on a par, i.e. play the same role. When one source of evidence is less reliable than another, it is possible to discount it and then still use a symmetric combination operation. In the case of revision, the idea is to let the prior knowledge of an agent be altered by some input information. The change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained, whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster’s rule of combination, just as revision in the sense of Alchourrón, Gärdenfors, and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey’s rule of updating, Dempster’s rule of conditioning and a form of AGM revision.
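For reference, the Dempster rule of combination mentioned here (standard background in evidence theory, not this paper's specific proposal) combines two mass functions m_1 and m_2 over a frame of discernment Θ as

\[
(m_1 \oplus m_2)(A) = \frac{1}{1-K} \sum_{B \cap C = A} m_1(B)\, m_2(C), \qquad A \subseteq \Theta,\ A \neq \emptyset,
\]
\[
(m_1 \oplus m_2)(\emptyset) = 0, \qquad K = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C),
\]

where K is the mass assigned to conflicting (empty) intersections; the rule is undefined when K = 1 (total conflict).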
Abstract:
Body mass has been shown to scale negatively with abundance in a wide range of habitats and ecosystems. It is believed that this relationship has important consequences for the distribution and maintenance of energy in natural communities. Some studies have shown that the relationship between body mass and abundance may be robust to major food web perturbations, fuelling the belief that natural processes may preserve the slope of this relationship and the associated cycling of energy and nutrients. Here, we use data from a long-term experimental food web manipulation to examine this issue in a semi-natural environment. Similar communities were developed in large experimental mesocosms over a six month period. Some of the mesocosms were then subjected to species removals, based on the mean strength of their trophic interactions in the communities. In treatments where the strongest interactors were removed, a community-level trophic cascade occurred. The biomass density of invertebrates increased dramatically in these communities, which led to a suppression of primary production. In spite of these widespread changes in ecosystem functioning, the slope of the relationship between body mass and abundance remained unchanged. This was the case whether average species body mass and abundance or individual organism size spectra were considered. An examination of changes in species composition before and after the experimental manipulations revealed an important mechanism for maintaining the body mass-abundance relationship. The manipulated communities all had a higher species turnover than the intact communities, with the highest turnover in communities that experienced cascading effects. As some species increased in body mass and abundance, new species filled the available size-abundance niches that were created. This maintained the overall body mass-abundance relationship and provided a stabilising structure to these experimental communities.
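As an illustration of what the "slope of this relationship" refers to (a minimal sketch with invented numbers, not the study's data or analysis), the scaling exponent is typically estimated as the slope of an ordinary least-squares fit on log-transformed species means:

import numpy as np

# invented example values: mean body mass and abundance for five species
body_mass = np.array([0.1, 0.5, 2.0, 10.0, 50.0])      # e.g. mg dry mass per individual
abundance = np.array([900.0, 300.0, 80.0, 25.0, 6.0])  # e.g. individuals per mesocosm

# slope of log(abundance) against log(body mass) = mass-abundance scaling exponent
slope, intercept = np.polyfit(np.log10(body_mass), np.log10(abundance), 1)
print(f"estimated scaling exponent: {slope:.2f}")  # negative: larger-bodied species are rarer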
Abstract:
We have studied over 1600 Am stars at a photometric precision of 1 mmag with SuperWASP photometric data. Contrary to previous belief, we find that around 200 Am stars are pulsating δ Sct and γ Dor stars, with low amplitudes that have been missed in previous, less extensive studies. While the amplitudes are generally low, the presence of pulsation in Am stars places a strong constraint on atmospheric convection, and may require the pulsation to be laminar. While some pulsating Am stars have been previously found to be δ Sct stars, the vast majority of Am stars known to pulsate are presented in this paper. They will form the basis of future statistical studies of pulsation in the presence of atomic diffusion. An extended version of Table 1 containing all the detected frequencies and amplitudes is only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/535/A3
Abstract:
SN 2009ku, discovered by Pan-STARRS-1, is a Type Ia supernova (SN Ia), and a member of the distinct SN 2002cx-like class of SNe Ia. Its light curves are similar to those of the prototypical SN 2002cx, but are slightly broader and have a later rise to maximum in g. SN 2009ku is brighter (~0.6 mag) than other SN 2002cx-like objects, peaking at M_V = -18.4 mag, which is still significantly fainter than typical SNe Ia. SN 2009ku, which had an ejecta velocity of ~2000 km s^-1 at 18 days after maximum brightness, is spectroscopically most similar to SN 2008ha, which also had extremely low-velocity ejecta. However, SN 2008ha had an exceedingly low luminosity, peaking at M_V = -14.2 mag, ~4 mag fainter than SN 2009ku. The contrast of high luminosity and low ejecta velocity for SN 2009ku is contrary to an emerging trend seen for the SN 2002cx class. SN 2009ku is a counterexample to the previously held belief that the class was more homogeneous than typical SNe Ia, indicating that the class has a diverse progenitor population and/or complicated explosion physics. As the first example of a member of this class of objects from the new generation of transient surveys, SN 2009ku is an indication of the potential for these surveys to find rare and interesting objects.
Abstract:
While the Quality and Outcomes Framework (QOF) is reported to improve performance, its impact on some aspects of organisations needs to be explored, given the increased reliance on such schemes. Organisational culture can be seen as providing a sense of common values, beliefs, and norms, which may act as guidelines for behaviour in organisational settings. This research employs a competing values framework that depicts different types of culture based on specific foci and processes. The study is based on interviews with two GP practices in the north of England involving 19 participants. Healthcare professionals were aware that there is a dominant value held and shared strongly among members of the organisations: to provide high-quality, patient-centred services. This study found that while clan culture is still strong in both practices, changes occurred in respondents' culture after the implementation of the QOF.
Abstract:
Traditional static analysis fails to auto-parallelize programs with complex control and data flow. Furthermore, thread-level parallelism in such programs is often restricted to pipeline parallelism, which can be hard for a programmer to discover. In this paper we propose a tool that, based on profiling information, helps the programmer to discover parallelism. The programmer hand-picks the code transformations from among the proposed candidates, which are then applied by automatic code transformation techniques.
This paper contributes to the literature by presenting a profiling tool for discovering thread-level parallelism. We track dependencies at the whole-data-structure level rather than at the element or byte level in order to limit the profiling overhead. We perform a thorough analysis of the needs and costs of this technique. Furthermore, we present and validate the belief that programs with complex control and data flow contain significant amounts of exploitable coarse-grain pipeline parallelism in their outer loops. This observation validates our approach to whole-data-structure dependencies. As state-of-the-art compilers focus on loops iterating over data structure members, this observation also explains why our approach finds coarse-grain pipeline parallelism in cases that have remained out of reach for state-of-the-art compilers. In cases where traditional compilation techniques do find parallelism, our approach allows the discovery of higher degrees of parallelism, yielding a 40% speedup over traditional compilation techniques. Moreover, we demonstrate real speedups on multiple hardware platforms.
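As a minimal illustration of the kind of coarse-grain pipeline parallelism discussed here (a generic sketch with invented stage names and data, not the paper's tool or benchmarks), an outer loop can be split into stages that run concurrently and communicate through a queue:

import queue
import threading

def parse_stage(records, out_q):
    # stage 1: parse each record of the outer loop and pass it downstream
    for rec in records:
        out_q.put(rec.strip().split(","))
    out_q.put(None)  # sentinel: no more work

def process_stage(in_q, results):
    # stage 2: consume parsed records concurrently with stage 1
    while True:
        fields = in_q.get()
        if fields is None:
            break
        results.append(len(fields))

records = ["a,b,c", "d,e", "f"]
q, results = queue.Queue(maxsize=64), []
producer = threading.Thread(target=parse_stage, args=(records, q))
consumer = threading.Thread(target=process_stage, args=(q, results))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(results)  # [3, 2, 1]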