63 results for arbitrariness


Relevance:

20.00%

Publisher:

Abstract:

We extend the contingent valuation (CV) method to test three differing conceptions of individuals' preferences as either (i) a priori well-formed or readily divined, and revealed through a single dichotomous choice question (as per the NOAA CV guidelines [K. Arrow, R. Solow, P.R. Portney, E.E. Leamer, R. Radner, H. Schuman, Report of the NOAA panel on contingent valuation, Fed. Reg. 58 (1993) 4601-4614]); (ii) learned or 'discovered' through a process of repetition and experience [J.A. List, Does market experience eliminate market anomalies? Q. J. Econ. (2003) 41-72; C.R. Plott, Rational individual behaviour in markets and social choice processes: the discovered preference hypothesis, in: K. Arrow, E. Colombatto, M. Perlman, C. Schmidt (Eds.), Rational Foundations of Economic Behaviour, Macmillan, London, St. Martin's, New York, 1996, pp. 225-250]; (iii) internally coherent but strongly influenced by some initial arbitrary anchor [D. Ariely, G. Loewenstein, D. Prelec, 'Coherent arbitrariness': stable demand curves without stable preferences, Q. J. Econ. 118(1) (2003) 73-105]. Findings reject both the first and last of these conceptions in favour of a model in which preferences converge towards standard expectations through a process of repetition and learning. In doing so, we show that such a 'learning design' CV method overturns the 'stylised facts' of bias and anchoring within the double-bounded dichotomous choice elicitation format. (C) 2007 Elsevier Inc. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

This paper develops an account of the normative basis of priority setting in health care as combining the values which a given society holds for the common good of its members with the universal element provided by a principle of common humanity. We discuss national differences in health baskets in Europe and argue that health care decision-making in complex social and moral frameworks is best thought of as anchored in such a principle, by drawing on the philosophy of need. We show that health care needs are ethically ‘thick’ needs whose psychological and social construction can best be understood in terms of David Wiggins's notion of vital need: a person's need is vital when failure to meet it leads to their harm and suffering. The moral dimension of priority setting which operates across different societies’ health care systems is located in the demands both of and on any society to avoid harm to its members.

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates a puzzling feature of social conventions: the fact that they are both arbitrary and normative. We examine how this tension is addressed in sociological accounts of conventional phenomena. Traditional approaches tend to generate either synchronic accounts that fail to consider the arbitrariness of conventions, or diachronic accounts that miss central aspects of their normativity. As a remedy, we propose a processual conception that considers conventions as both the outcome and material cause of much human activity. This conceptualization, which borrows from the économie des conventions as well as critical realism, provides a novel perspective on how conventions are nested and defined, and on how they are established, maintained and challenged.

Relevance:

20.00%

Publisher:

Abstract:

Criminal sanctions involve the deliberate infliction of hardship on offenders. In sentencing, the state acts in its most coercive and decisive manner: ‘the state may use its most awesome power: the power to use force against its citizens and others’. Despite the importance of the interests at stake in the sentencing realm, sentencing is arguably the least coherent, predictable and principled area of law. The High Court of Australia has not facilitated attempts to inject clarity and precision into sentencing determinations. It has repeatedly endorsed the ‘instinctive synthesis’ approach to sentencing, emphasising the need for ‘individual justice’ over the need for transparency and a step-wise, systematic approach to sentencing.

Relevance:

10.00%

Publisher:

Abstract:

Technologies and languages for integrated processes are a relatively recent innovation, and over that period many divergent waves of innovation have transformed process integration. Like sockets and distributed objects, early workflow systems offered programming interfaces that connected the process modelling layer to any middleware. BPM systems emerged later, connecting the modelling world to middleware through components. While BPM systems increased ease of use (modelling convenience), long-standing and complex interactions involving many process instances remained difficult to model. Enterprise Service Buses (ESBs) followed, connecting process models to heterogeneous forms of middleware. ESBs, however, generally forced modellers to choose a particular underlying middleware and to stick to it, despite their ability to connect with many forms of middleware. Furthermore, ESBs encourage process integrations to be modelled on their own, logically separate from the process model. This can lead to an inability to reason about long-standing conversations at the process layer. Technologies and languages for process integration generally lack formality, which has led to arbitrariness in the underlying language building blocks. Conceptual holes exist in a range of technologies and languages for process integration, and these can lead to customer dissatisfaction and to integration projects failing to reach their potential. Standards for process integration share fundamental flaws similar to those of the languages and technologies, and are also in direct competition with other standards, causing a lack of clarity. Thus the area of greatest risk in a BPM project remains process integration, despite major advancements in the technology base. This research examines some fundamental aspects of communication middleware and how these fundamental building blocks of integration can be brought to the process modelling layer in a technology-agnostic manner.
This way, process modelling can be conceptually complete without becoming stuck in a particular middleware technology. Coloured Petri nets are used to define a formal semantics for the fundamental aspects of communication middleware; they provide the means to define and model the dynamic aspects of various integration middleware. Process integration patterns are used as a tool to codify common problems to be solved. Object Role Modelling, a formal modelling technique, was used to define the syntax of a proposed process integration language. This thesis provides several contributions to the field of process integration. It proposes a framework defining the key notions of integration middleware; this framework provides a conceptual foundation upon which a process integration language could be built. The thesis defines an architecture that allows various forms of middleware to be aggregated and reasoned about at the process layer. It provides a comprehensive set of process integration patterns, which constitute a benchmark for the kinds of problems a process integration language must support. The thesis proposes a process integration modelling language and a partial implementation that is able to enact the language. A process integration pilot project in a German hospital, based on ideas in this thesis, is briefly described at the end.
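The abstract does not reproduce the Coloured-Petri-net semantics itself. As an illustration of the modelling style it describes — places holding multisets of coloured tokens, transitions moving a message through a middleware channel — a minimal sketch might look like the following (the interpreter and the net are hypothetical, not taken from the thesis):

```python
from collections import Counter

class CPN:
    """Tiny coloured-Petri-net interpreter: places hold multisets of
    coloured tokens; a transition fires by consuming one matching token
    from each input place and producing tokens in its output places."""
    def __init__(self):
        self.places = {}          # place name -> Counter of tokens

    def add_place(self, name, tokens=()):
        self.places[name] = Counter(tokens)

    def fire(self, inputs, outputs, guard=lambda tok: True):
        """Try to fire a transition; return False if not enabled."""
        chosen = {}
        for place in inputs:
            tok = next((t for t in self.places[place] if guard(t)), None)
            if tok is None:
                return False      # no token satisfies the guard
            chosen[place] = tok
        for place, tok in chosen.items():      # consume input tokens
            self.places[place][tok] -= 1
            if self.places[place][tok] == 0:
                del self.places[place][tok]
        for place, f in outputs.items():       # produce (transformed) tokens
            for tok in chosen.values():
                self.places[place][f(tok)] += 1
        return True

# Model an asynchronous channel between a sender and a receiver process.
net = CPN()
net.add_place("to_send", [("req", 1), ("req", 2)])   # coloured message tokens
net.add_place("channel")
net.add_place("received")

# "send" moves a message token onto the channel place (middleware buffer);
# "deliver" consumes it from the channel into the receiver's place.
net.fire(["to_send"], {"channel": lambda t: t})
net.fire(["channel"], {"received": lambda t: t})

print(net.places["received"])    # one delivered message token
```

The point of such a semantics is that "send" and "deliver" are separate transitions, so buffering, loss, or reordering in the middleware can be modelled explicitly at the process layer.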

Relevance:

10.00%

Publisher:

Abstract:

The discipline of education in Anglophone-dominant contexts has always grappled with a kind of status anxiety relative to other disciplines. This is in part due to the ways in which evidence has been thought about in the theoretico-experimental sciences relative to the ethico-redemptive ones. By examining that which was considered to fall to the side of science, even of social science, this paper complexifies contemporary debates over educational science and research, including debates over evidence-based education or assumed divisions between the quantitative/qualitative and empirical/conceptual. It reapproaches historical vagaries in discourses of vision that underscore the arbitrariness of approaches to social scientific research and its objects. A less-considered set of spatializations and regionalisms in social scientific conceptions of rationality especially are exposed through a close reading of the Harvard University philosopher William James' more marginalized texts.

Relevance:

10.00%

Publisher:

Abstract:

One of the most fundamental questions in the philosophy of mathematics concerns the relation between truth and formal proof. The position according to which the two concepts are the same is called deflationism, and the opposing viewpoint substantialism. In an important result of mathematical logic, Kurt Gödel proved in his first incompleteness theorem that all consistent formal systems containing arithmetic include sentences that can neither be proved nor disproved within that system. However, such undecidable Gödel sentences can be established to be true once we expand the formal system with Alfred Tarski's semantical theory of truth, as shown by Stewart Shapiro and Jeffrey Ketland in their semantical arguments for the substantiality of truth. According to them, in Gödel sentences we have an explicit case of true but unprovable sentences, and hence deflationism is refuted. Against that, Neil Tennant has shown that instead of Tarskian truth we can expand the formal system with a soundness principle, according to which all provable sentences are assertable, and the assertability of Gödel sentences follows. This way, the relevant question is not whether we can establish the truth of Gödel sentences, but whether Tarskian truth is a more plausible expansion than a soundness principle. In this work I will argue that this problem is best approached once we think of mathematics as the full human phenomenon, and not just as consisting of formal systems. When pre-formal mathematical thinking is included in our account, we see that Tarskian truth is in fact not an expansion at all. I claim that what proof is to formal mathematics, truth is to pre-formal thinking, and the Tarskian account of semantical truth mirrors this relation accurately. However, the introduction of pre-formal mathematics is vulnerable to the deflationist counterargument that while existing in practice, pre-formal thinking could still be philosophically superfluous if it does not refer to anything objective.
Against this, I argue that all truly deflationist philosophical theories lead to the arbitrariness of mathematics. In all other philosophical accounts of mathematics there is room for a reference of the pre-formal mathematics, and the expansion with Tarskian truth can be made naturally. Hence, if we reject the arbitrariness of mathematics, I argue in this work, we must accept the substantiality of truth. Related subjects such as neo-Fregeanism will also be covered, and shown not to change the need for Tarskian truth. The only remaining route for the deflationist is to change the underlying logic so that our formal languages can include their own truth predicates, which Tarski showed to be impossible for classical first-order languages. With such logics we would have no need to expand the formal systems, and the above argument would fail. Of the alternative approaches, in this work I focus mostly on the Independence Friendly (IF) logic of Jaakko Hintikka and Gabriel Sandu. Hintikka has claimed that an IF language can include its own adequate truth predicate. I argue that while this is indeed the case, we cannot recognize the truth predicate as such within the same IF language, and the need for Tarskian truth remains. In addition to IF logic, second-order logic and Saul Kripke's approach using Kleenean logic will also be shown to fail in a similar fashion.
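The two candidate expansions contrasted above can be stated schematically. In standard notation (the symbols are generic, not taken from the thesis; S is a consistent formal system containing arithmetic and Prov_S its provability predicate):

```latex
% Gödel sentence for S, asserting its own unprovability:
G_S \;\leftrightarrow\; \neg\,\mathrm{Prov}_S(\ulcorner G_S \urcorner)

% Tennant's soundness (reflection) principle, licensing assertion of G_S:
\mathrm{Prov}_S(\ulcorner \varphi \urcorner) \;\rightarrow\; \varphi

% Tarski's T-schema, from which the truth of G_S follows in the expanded system:
T(\ulcorner \varphi \urcorner) \;\leftrightarrow\; \varphi
```

Both additions let one assert G_S; the dispute is over which addition is the better-motivated expansion of S.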

Relevance:

10.00%

Publisher:

Abstract:

We study the elasticity, topological defects, and hydrodynamics of the recently discovered incommensurate smectic (AIC) phase, characterized by two collinear mass density waves of incommensurate spatial frequency. The low-energy long-wavelength excitations of the system can be described by a displacement field u(x) and a 'phason' field w(x) associated, respectively, with collective and relative motion of the two constituent density waves. We formulate the elastic free energy in terms of these two variables and find that when w=0, its functional dependence on u is identical to that of a conventional smectic liquid crystal, while when u=0, its functional dependence on w is the same as that for the angle variable in a slightly anisotropic XY model. An arbitrariness in the definition of u and w allows a choice that eliminates all relevant couplings between them in the long-wavelength elastic energy. The topological defects of the system are dislocations with nonzero u and w components. We introduce a two-dimensional Burgers lattice for these dislocations, and compute the interaction between them. This has two parts: one arising from the u field that is short ranged and identical to the interaction between dislocations in an ordinary smectic liquid crystal, and one arising from the w field that is long ranged and identical to the logarithmic interaction between vortices in an XY model. The hydrodynamic modes of the AIC include first- and second-sound modes whose direction-dependent velocities are identical to those in ordinary smectics. The sound attenuations have a different direction dependence, however. The breakdown of hydrodynamics found in conventional smectic liquid crystals, with three of the five viscosities diverging as 1/ω at small frequencies ω, occurs in these systems as well and is identical in all its details.
In addition, there is a diffusive phason mode, not found in ordinary smectic liquid crystals, that leads to anomalously slow mechanical response analogous to that predicted in quasicrystals, but on a far more experimentally accessible time scale.
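The stated correspondences for the u and w sectors match textbook forms of smectic-A elasticity and of the anisotropic XY model; as a sketch (the coefficients B, K, ρ∥, ρ⊥ are generic, not taken from the paper):

```latex
% u-dependence at w = 0: conventional smectic-A elasticity
F_u = \frac{1}{2}\int d^3x \left[ B\,(\partial_z u)^2 + K\,(\nabla_\perp^2 u)^2 \right]

% w-dependence at u = 0: slightly anisotropic XY model for the phason angle
F_w = \frac{1}{2}\int d^3x \left[ \rho_\parallel\,(\partial_z w)^2 + \rho_\perp\,(\nabla_\perp w)^2 \right]
```

The absence of relevant u–w couplings, after the choice of variables described above, is what lets the two sectors be analysed independently.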

Relevance:

10.00%

Publisher:

Abstract:

Direct use of the experimental eigenvalues of the vibrational secular equation in the ab initio predicted eigenvector space is suggested as a means of obtaining a reliable set of intramolecular force constants. This method, which we have termed RECOVES (recovery in the eigenvector space), is computationally simple and free from arbitrariness. The RECOVES force constants, by definition, reproduce the experimental vibrational frequencies of the parent molecule exactly. The ab initio calculations were carried out for ethylene as a test molecule, and the force constants obtained by the present procedure also correctly predict the vibrational frequencies of the deuterated species. The RECOVES force constants for ethylene are compared with those obtained by using the SQM procedure.
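The core operation described — keeping the ab initio eigenvectors but imposing the experimental eigenvalues — can be sketched in a few lines. The matrix and eigenvalues below are made up for illustration; units and mass-weighting are omitted:

```python
import numpy as np

# Hypothetical ab initio force-constant matrix (mass-weighted, arbitrary units)
F_ab_initio = np.array([[5.0, 0.8],
                        [0.8, 2.0]])
# Hypothetical experimental eigenvalues, ordered to match the eigenvectors
lam_exp = np.array([1.9, 5.3])

# Diagonalise the ab initio matrix to obtain its eigenvector space
# (np.linalg.eigh returns eigenvalues in ascending order).
_, L = np.linalg.eigh(F_ab_initio)

# Impose the experimental eigenvalues on the ab initio eigenvectors:
F_recovered = L @ np.diag(lam_exp) @ L.T

# By construction, the recovered matrix reproduces the experimental
# eigenvalues exactly:
print(np.round(np.linalg.eigvalsh(F_recovered), 6))   # → [1.9 5.3]
```

This makes the "reproduce the experimental frequencies exactly" property immediate: the recovered matrix has the experimental spectrum by construction, while its eigenvectors remain those predicted ab initio.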

Relevance:

10.00%

Publisher:

Abstract:

We previously proposed a method for estimating Young's modulus from instrumented nanoindentation data based on a model assuming that the indenter had a spherical-capped Berkovich geometry, to take account of the bluntness effect. The method is now further improved by relaxing the constraint on the tip shape, allowing much broader arbitrariness: the tip may range from a conical-tipped shape to a flat-ended shape, with the spherical-capped shape a special case in between. The method requires two parameters to specify a tip geometry, namely a volume bluntness ratio V_r and a height bluntness ratio h_r. A set of functional relationships correlating the nominal hardness/reduced elastic modulus ratio (H_n/E_r) with the elastic work/total work ratio (W_e/W) was established based on dimensional analysis and finite element simulations, each relationship specified by a pair (V_r, h_r). Young's modulus of an indented material can be estimated from these relationships. The method was shown to be valid when applied to S45C carbon steel and 6061 aluminum alloy.
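Numerically, estimating E_r from such a relationship amounts to inverting a fitted curve W_e/W = f(H_n/E_r) for the measured work ratio. The function f below is a made-up monotone stand-in for the paper's FE-fitted curves (one curve per tip geometry (V_r, h_r)); it is not the published relationship:

```python
def f(x):
    """Hypothetical fitted curve: W_e/W as a monotone function of H_n/E_r."""
    return 5.0 * x / (1.0 + 8.0 * x)

def solve_Hn_over_Er(we_over_w, lo=1e-6, hi=0.5, tol=1e-12):
    """Invert f by bisection: find x with f(x) = we_over_w."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < we_over_w:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Measured (hypothetical) quantities from a load-displacement curve:
H_n = 2.1          # nominal hardness, GPa
we_over_w = 0.25   # elastic-work / total-work ratio

x = solve_Hn_over_Er(we_over_w)   # recovered H_n / E_r
E_r = H_n / x                     # reduced-modulus estimate, GPa
print(E_r)
```

With the stand-in curve, f(x) = 0.25 gives x = 1/12, so E_r = 25.2 GPa; with the paper's fitted relationships the same inversion would yield the physical estimate.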

Relevance:

10.00%

Publisher:

Abstract:

Multiple genders: binarisms versus pluralism in Stone Butch Blues and Stella Manhattan aims to discuss the arbitrariness of the sex and gender system of contemporary Western society, which categorizes and fixes individuals' biological sex into only two possible gender expressions: man/masculine versus woman/feminine. Cases in which this correspondence between sex and gender does not occur are treated as aberrations subject to physical and moral punishment. The literary corpus of this thesis comprises novels from North American literature (Stone Butch Blues, by Leslie Feinberg) and Brazilian literature (Stella Manhattan, by Silviano Santiago). The introduction briefly discusses the history and theory of the novel, the main object of this study, as practised by renowned novelists. The second chapter addresses theoretical questions about sex and gender, important for grounding the literary discussion, together with the plot of and critical literature on Stone Butch Blues, including this author's own analysis of the novel. The third chapter discusses further theoretical questions, this time on literary and gender theory, presents the critical literature on Stella Manhattan, and likewise culminates in this author's critical analysis of the Brazilian novel. The research ultimately aims to demonstrate that the socially imposed gender binary must, in reality, give way to a plural and fluid system in which biology loses its determining role in an individual's masculinity or femininity.

Relevance:

10.00%

Publisher:

Abstract:

People are alarmingly susceptible to manipulations that change both their expectations and experience of the value of goods. Recent studies in behavioral economics suggest such variability reflects more than mere caprice. People commonly judge options and prices in relative terms, rather than absolutely, and display strong sensitivity to exemplar and price anchors. We propose that these findings elucidate important principles about reward processing in the brain. In particular, relative valuation may be a natural consequence of adaptive coding of neuronal firing to optimise sensitivity across large ranges of value. Furthermore, the initial apparent arbitrariness of value may reflect the brain's attempts to optimally integrate diverse sources of value-relevant information in the face of perceived uncertainty. Recent findings in neuroscience support both accounts, and implicate regions in the orbitofrontal cortex, striatum, and ventromedial prefrontal cortex in the construction of value.
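The adaptive-coding idea can be illustrated concretely: if a neuron's response is normalised to the range of recently encountered values, its output encodes relative rather than absolute value. The range-normalisation scheme below is a generic textbook form, not a model from the paper, and the price contexts are hypothetical:

```python
def adaptive_response(value, context, r_max=100.0):
    """Firing rate scaled to the min-max range of the current value context."""
    lo, hi = min(context), max(context)
    return r_max * (value - lo) / (hi - lo)

cheap_menu  = [1.0, 2.0, 5.0]       # hypothetical price contexts
pricey_menu = [20.0, 40.0, 100.0]

# An option at the same relative position in its range elicits the same
# response in both contexts, despite a twentyfold difference in absolute value:
print(adaptive_response(2.0, cheap_menu))    # → 25.0
print(adaptive_response(40.0, pricey_menu))  # → 25.0
```

This is one way relative valuation and anchor sensitivity could fall out of a coding scheme that maximises sensitivity over whatever value range the context supplies.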

Relevance:

10.00%

Publisher:

Abstract:

Latent Dirichlet Allocation (LDA) is a document-level language model. In general, LDA employs a symmetric Dirichlet distribution as the prior of the topic-word distributions to implement model smoothing. In this paper, we propose a data-driven smoothing strategy in which probability mass is allocated from smoothing data to latent variables by the intrinsic inference procedure of LDA. In this way, the arbitrariness of choosing the latent variables' priors for the multi-level graphical model is overcome. Following this data-driven strategy, two concrete methods, Laplacian smoothing and Jelinek-Mercer smoothing, are applied to the LDA model. Evaluations on different text categorization collections show that data-driven smoothing can significantly improve performance on both balanced and unbalanced corpora.
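The two named smoothing schemes, in their generic textbook form for a single discrete distribution (not the paper's data-driven variant), can be sketched as follows; the counts and background distribution are made up:

```python
def laplace(counts, alpha=1.0):
    """Additive (Laplacian) smoothing: add alpha pseudo-counts per word."""
    total = sum(counts) + alpha * len(counts)
    return [(c + alpha) / total for c in counts]

def jelinek_mercer(counts, background, lam=0.7):
    """Interpolate the ML estimate with a background (corpus) distribution."""
    total = sum(counts)
    return [lam * c / total + (1 - lam) * b
            for c, b in zip(counts, background)]

counts = [3, 1, 0, 0]               # hypothetical word counts in one topic
background = [0.4, 0.3, 0.2, 0.1]   # hypothetical corpus-level distribution

print(laplace(counts))               # zero counts receive non-zero probability
print(jelinek_mercer(counts, background))
```

In both cases unseen words get non-zero mass; the paper's point is that routing this mass through LDA's own inference, rather than through a hand-chosen prior, removes the arbitrariness of the choice.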

Relevance:

10.00%

Publisher:

Abstract:

As the social and organizational environment becomes more and more complex, the topic of leadership complexity is gaining increasing attention. So far, some critical issues in this field need further exploration, such as clarifying the theoretical framework, developing and validating measurements, and exploring the mechanisms of leadership effectiveness. Using BEI (Behavioral Event Interview), content analysis, EFA/CFA, ANOVA, regression analysis and other qualitative/quantitative methods, this research explored the leadership structure of Chinese enterprise managers, developed a new leadership questionnaire, investigated the differences in leadership roles among various managerial areas and at different hierarchical levels, and examined the impacts of leadership roles and leadership complexity on different indicators of leadership effectiveness in various organizational contexts. In total, 1,020 managers were surveyed. The main findings are as follows. First, the structure of leadership behaviors of Chinese enterprise managers included ethical model, authoritarian, producer, director, monitor, mentor, strategist and enterpriser, among which ethical model and authoritarian are new findings in the Chinese cultural context. Ethical model was characterized by presenting honesty, setting an example to others, and being just and diligent. Authoritarian was characterized by showing power and arbitrariness. In addition, mentor, strategist and enterpriser embodied some cultural features of present-day China. The newly developed leadership questionnaire's reliability and validity reached the criteria of standardized measurement. Second, there were significant differences in the frequency of leadership behaviors among managers at different managerial positions and hierarchical levels, while the impacts of different leadership roles on different leadership effectiveness indicators were also significantly different. Ethical model had positive impacts on overall performance and on three indicators across task and context performance, and authoritarian's impacts on overall performance and department performance were significantly negative. Third, the impact of leadership complexity on overall leadership effectiveness was significantly positive, while the moderating effects of organization level and position function were not significant.