895 results for Epistemic justification
Abstract:
The paper focuses on inter-personal aspects of the context in the analysis of evidential and related epistemic marking systems. While evidentiality is defined by its capacity to qualify the speaker's indexical point of view in terms of information source, it is argued that other aspects of the context are important to analyze evidentiality both conceptually and grammatically. These distinct, analytical components concern the illocutionary status of a given marker and its scope properties. The importance of the hearer's point of view in pragmatics and semantics is well attested and constitutes a convincing argument for an increased emphasis on the perspective of the hearer/addressee in analyses of epistemic marking, such as evidentiality. The paper discusses available accounts of evidentials that attend to the perspective of the addressee and also introduces lesser-known epistemic marking systems that share a functional space with evidentiality.
Abstract:
Discusses the implications for the doctrine of common mistake of the Court of Appeal ruling in Great Peace Shipping Ltd v Tsavliris Salvage (International) Ltd on whether a contract for the hire of a ship was void on the ground of common mistake regarding the position of the ship. Reviews the origins of the doctrine of common mistake and the relationship between the doctrine and the implication of terms. Considers the determination of impossibility. Examines the role of equity in common mistake and remedial equitable intervention.
Abstract:
Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
Due to copyright restrictions, only available for consultation at Aston University Library and Information Services.
Abstract:
A considerable body of research has developed on processes of neoliberal urban regeneration and gentrification. On the one hand, there are many political economy accounts emphasising the role of economic capital in processes of urban change and gentrification. On the other hand, there is a wealth of governmentality studies on the art of government that fail to explain how ungovernable subjects develop. Similarly, within gentrification studies there are many accounts of the role of changing consumer lifestyles and of defining gentrification, but less concern with the governance processes between actors in urban regeneration and gentrification. Yet such issues are of considerable importance given the role of the state in urban regeneration and its dependence on private capital. This paper utilises the French Pragmatist approach of Boltanski and Thévenot to examine a case study of a state-led gentrification project. Boltanski and Thévenot argue that social coordination occurs by way of actors working through broader value-laden ‘worlds of justification’ that underpin processes of argumentation and coordination. The examined case study is a deprived area within an English city where a major state-led gentrification programme has been introduced. The rationale for the programme is based on the assumption that reducing deprivation relies upon substantially increasing the number of higher income earners. The paper concludes that market values have overridden broader civic values in the negotiation process, with this intensifying as the state internalised market crisis tendencies within the project. More broadly, there is a need for French Pragmatism to be more sensitive to the spatial processes of social coordination, which can be achieved through critical engagement with recent concepts of ‘assemblages’.
Abstract:
Recent times have witnessed a growing belief in urban spaces as 'assemblages' produced through interwoven and spatially differentiated forces that converge at particular sites. There is also continuing interest in the nature of neoliberal tendencies and the rise of post-politics and post-democracy in urban governance. These accounts typically pay little attention to comprehensively conceptualizing the heterogeneous logics and mechanics of relations and negotiations between actors. This paper seeks to advance these perspectives by exploring the potential contribution of French pragmatist thinking to how social life is produced through practical dialogue between actors via critique, argumentation and justification. © The Author(s) 2012.
Abstract:
In this article I argue that the study of the linguistic aspects of epistemology has become unhelpfully focused on the corpus-based study of hedging, and that a corpus-driven approach can help to improve upon this. Through focusing on a corpus of texts from one discourse community (that of genetics) and identifying frequent tri-lexical clusters containing highly frequent lexical items identified as keywords, I undertake an inductive analysis identifying patterns of epistemic significance. Several of these patterns are shown to be hedging devices, and the whole-corpus frequencies of the most salient of these, candidate and putative, are then compared to the whole-corpus frequencies for comparable wordforms and clusters of epistemic significance. Finally, I interviewed a ‘friendly geneticist’ in order to check my interpretation of some of the terms used and to get an expert interpretation of the overall findings. In summary, I argue that the highly unexpected patterns of hedging found in genetics demonstrate the value of adopting a corpus-driven approach and constitute an advance in our current understanding of how to approach the relationship between language and epistemology.
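The cluster-identification step this abstract describes can be sketched roughly as follows; the helper function, the toy mini-corpus, and the keyword set are illustrative assumptions, not the article's own method or data (the article only reports candidate and putative as salient hedging keywords in genetics writing):

```python
from collections import Counter

def trigram_clusters(tokens, keywords):
    """Count tri-lexical clusters (trigrams) containing at least one keyword."""
    counts = Counter()
    for i in range(len(tokens) - 2):
        tri = tuple(tokens[i:i + 3])
        if any(t in keywords for t in tri):
            counts[tri] += 1
    return counts

# Illustrative mini-corpus, not taken from the genetics corpus itself.
corpus = ("the candidate gene for this disorder and a putative "
          "candidate gene for the related phenotype").split()
clusters = trigram_clusters(corpus, {"candidate", "putative"})
```

Ranking the resulting counts would then surface the most frequent keyword-bearing clusters for inductive inspection.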
Abstract:
Meier (2012) gave a "mathematical logic foundation" of the purely measurable universal type space (Heifetz and Samet, 1998). The mathematical logic foundation, however, discloses an inconsistency in the type space literature: a finitary language is used for the belief hierarchies and an infinitary language is used for the beliefs. In this paper we propose an epistemic model to fix the inconsistency above. We show that in this new model the universal knowledge-belief space exists, is complete and encompasses all belief hierarchies. Moreover, by examples we demonstrate that in this model the players can agree to disagree, that is, Aumann (1976)'s result does not hold, and Aumann and Brandenburger (1995)'s conditions are not sufficient for Nash equilibrium. However, we show that if we substitute self-evidence (Osborne and Rubinstein, 1994) for common knowledge, then we get that both Aumann (1976)'s and Aumann and Brandenburger (1995)'s results hold.
Abstract:
Auditor decisions regarding the causes of accounting misstatements can have implications for audit effectiveness and efficiency. Specifically, overconfidence in one's decision can lead to an ineffective audit, whereas underconfidence in one's decision can lead to an inefficient audit. This dissertation explored the implications of providing various types of information cues to decision-makers regarding an Analytical Procedure task and investigated the relationship between different types of evidence cues (confirming, disconfirming, redundant or non-redundant) and the reduction in calibration bias. Information was collected using a laboratory experiment from 45 accounting student participants. Research questions were analyzed using a 2 x 2 x 2 between-subject and within-subject analysis of covariance (ANCOVA). Results indicated that presenting subjects with information cues dissimilar to the choice they made is an effective intervention in reducing the common overconfidence found in decision-making. In addition, other information characteristics, specifically non-redundant information, can help in reducing a decision-maker's overconfidence/calibration bias for difficult (compared to easy) decision tasks.
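A standard way to operationalize the calibration bias this abstract refers to is the gap between mean stated confidence and actual accuracy; the sketch below is a minimal illustration under that assumption, with hypothetical response data rather than anything from the dissertation's experiment:

```python
def overconfidence_bias(confidences, correct):
    """Mean stated confidence minus proportion correct.
    Positive values indicate overconfidence; negative, underconfidence."""
    assert len(confidences) == len(correct) and confidences
    mean_conf = sum(confidences) / len(confidences)
    accuracy = sum(correct) / len(correct)
    return mean_conf - accuracy

# Hypothetical responses: confidence in [0, 1] for each judgment, and
# whether the chosen misstatement cause was correct (1) or not (0).
conf = [0.9, 0.8, 0.95, 0.7]
hits = [1, 0, 1, 0]
bias = overconfidence_bias(conf, hits)  # positive: subjects overconfident
```

An intervention that reduces calibration bias would move this value toward zero across conditions.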
Abstract:
Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications. Examples of such applications are network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make these results inapplicable. Many of these applications are based on time-series data that take the form of time-ordered series of events. Such applications also need to handle large volumes of unexpected events, often modified on-the-fly and containing conflicting information, and to deal with rapidly changing contexts while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of the event detection, time-delays are considered. Events are no longer instantaneous; rather, a duration is associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means of converging time and hence to be better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events. This belief value is generated by a consensus among participating entities in a computer network.
The scheme taps into in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
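One common belief-theoretic way to form such a consensus from independent entities is Dempster's rule of combination; the sketch below applies it on a minimal frame for a single composite event. The frame labels, mass values, and entity names are illustrative assumptions, not the dissertation's actual scheme:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over the frame {'e', 'not_e', 'theta'},
    where 'theta' represents ignorance, using Dempster's rule."""
    frames = ("e", "not_e", "theta")
    combined = {f: 0.0 for f in frames}
    conflict = 0.0
    for a in frames:
        for b in frames:
            product = m1[a] * m2[b]
            if a == "theta":          # theta ∩ X = X
                combined[b] += product
            elif b == "theta" or a == b:
                combined[a] += product
            else:                     # 'e' vs 'not_e': contradictory evidence
                conflict += product
    # Normalize out the conflicting mass.
    return {f: v / (1.0 - conflict) for f, v in combined.items()}

# Two network entities report belief masses for a composite event,
# each reserving some mass for ignorance ('theta').
entity_a = {"e": 0.6, "not_e": 0.1, "theta": 0.3}
entity_b = {"e": 0.5, "not_e": 0.2, "theta": 0.3}
fused = dempster_combine(entity_a, entity_b)
```

Because both entities lean toward the event, the fused mass on 'e' exceeds either individual report, while conflicting mass is discarded by normalization.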
Abstract:
Acknowledgments The research for this paper was primarily funded by an Australian Research Council (ARC) grant (DP120101092), How do we know what works? Ethics and evidence in surgical research. Katrina Hutchison’s research was also partly funded by the ARC Centre of Excellence for Electromaterials Science, where she has worked since June 2015. Discussions about the paper were facilitated by Macquarie University funding of a visit by Vikki A. Entwistle to participate in a Centre for Agency, Values and Ethics (CAVE) seminar on Capabilities Approaches to Justice. The authors would like to thank the anonymous reviewers for a number of helpful comments.
Abstract:
This paper discusses the Court’s reasoning in interpreting the EU Charter, using recent case law on horizontal effect as a case study. It identifies two possible means of interpreting the provisions of the Charter: firstly, an approach based on common values (e.g. equality or solidarity) and, secondly, an approach based on access to the public sphere. It argues in favour of the latter. Whereas an approach based on common values is more consonant with the development of the case law so far, it is conceptually problematic: it involves subjective assessments of the importance and degree of ‘sharedness’ of the value in question, which can undermine the equal constitutional status of different Charter provisions. Furthermore, it marginalises the Charter’s overall politically constructional character, which distinguishes it from other sources of rights protection listed in Art 6 TEU. The paper argues that, as the Charter’s provisions concretise the notion of political status in the EU, they have a primarily constitutional, rather than ethical, basis. Interpreting the Charter based on the very commitment to a process of sharing, drawing on Hannah Arendt’s idea of the ‘right to have rights’ (a right to access a political community on equal terms), is therefore preferable. This approach retains the pluralistic, post-national fabric of the EU polity, as it accommodates multiple narratives about its underlying values, while also having an inclusionary impact on previously underrepresented groups (e.g. non-market-active citizens or the sans-papiers) by recognising their equal political disposition.
Abstract:
This paper deals with the place of narrative, that is, storytelling, in public deliberation. A distinction is made between weak and strong conceptions of narrative. According to the weak one, storytelling is but one rhetorical device among others with which social actors produce and convey meaning. In contrast, the strong conception holds that narrative is necessary to communicate, and argue, about topics such as the human experience of time, collective identities and the moral and ethical validity of values. The upshot of this idea is that storytelling should be a necessary component of any ideal of public deliberation. Contrary to recent work by deliberative theorists, who tend to adopt the weak conception of narrative, the author argues for embracing the strong one. The main contention of this article is that stories not only have a legitimate place in deliberation, but are even necessary to formulate certain arguments in the first place; for instance, arguments drawing on historical experience. This claim, namely that narrative is constitutive of certain arguments, in the sense that, without it, such arguments cannot be articulated, is illustrated by deliberative theory’s own narrative underpinnings. Finally, certain possible objections against the strong conception of narrative are dispelled.