12 results for general strain theory

in Greenwich Academic Literature Archive - UK


Relevance:

90.00%

Publisher:

Abstract:

This paper introduces a characterization of the so-called most general temporal constraint (GTC), which guarantees the common-sense assertion that "the beginning of the effect cannot precede the beginning of the cause". The formalism is based on a general time theory which takes both points and intervals as primitive. It is shown that there are in fact 8 possible causal relationships which satisfy GTC, including cases where, on the one hand, effects start simultaneously with, during, immediately after, or some time after their causes, and, on the other hand, effects end before, simultaneously with, or after their causes. These causal relationships are versatile enough to subsume the representative ones proposed in the literature.
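The constraint itself is straightforward to operationalise. The sketch below is a minimal illustration, assuming a simplified model in which an event is just a numeric (begin, end) pair; the paper's formalism, which takes both points and intervals as primitive, is richer than this, and the Event class and function names here are hypothetical.

```python
# Minimal sketch, NOT the paper's formalism: events as numeric (begin, end)
# pairs, with GTC as a simple comparison of the two beginnings.

from dataclasses import dataclass

@dataclass
class Event:
    begin: float
    end: float

def satisfies_gtc(cause: Event, effect: Event) -> bool:
    """GTC: the beginning of the effect cannot precede the beginning of the cause."""
    return effect.begin >= cause.begin

def classify(cause: Event, effect: Event) -> tuple[str, str]:
    """Name the begin/end relations that distinguish the causal relationships.

    Assumes GTC already holds for the pair."""
    assert satisfies_gtc(cause, effect)
    if effect.begin == cause.begin:
        start = "starts simultaneously with cause"
    elif effect.begin < cause.end:
        start = "starts during cause"
    elif effect.begin == cause.end:
        start = "starts immediately after cause"
    else:
        start = "starts some time after cause"
    if effect.end < cause.end:
        finish = "ends before cause"
    elif effect.end == cause.end:
        finish = "ends simultaneously with cause"
    else:
        finish = "ends after cause"
    return start, finish

cause, effect = Event(0, 10), Event(5, 8)
print(classify(cause, effect))  # ('starts during cause', 'ends before cause')
```

Of the 4 x 3 = 12 begin/end combinations, some are mutually inconsistent (an effect that starts some time after its cause ends cannot also end before that cause), which is consistent with the paper's count of 8 admissible relationships.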

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a framework for Historical Case-Based Reasoning (HCBR) which allows the expression of both relative and absolute temporal knowledge, representing case histories in the real world. The formalism is founded on a general temporal theory that accommodates both points and intervals as primitive time elements. A case history is formally defined as a collection of (time-independent) elemental cases, together with its corresponding temporal reference. Case history matching is twofold: two similarity values need to be computed, the non-temporal similarity degree and the temporal similarity degree. On the one hand, based on elemental case matching, the non-temporal similarity degree between case histories is defined by computing the unions and intersections of the elemental cases involved. On the other hand, by means of the graphical representation of temporal references, the temporal similarity degree in case history matching is transformed into a conventional graph similarity measurement.
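A minimal sketch of the non-temporal side of this matching, assuming elemental cases can be treated as plain set members, is given below; the union/intersection computation then reduces to a Jaccard-style ratio. The function name and the reduction to exact set membership are illustrative assumptions, since the paper's elemental-case matching is graded rather than exact.

```python
# Illustrative sketch: non-temporal similarity between two case histories
# as |intersection| / |union| over their elemental cases, assuming exact
# (rather than graded) elemental-case matching.

def non_temporal_similarity(history_a: set[str], history_b: set[str]) -> float:
    """Jaccard-style ratio over the elemental cases of two case histories."""
    if not history_a and not history_b:
        return 1.0  # two empty histories are trivially identical
    return len(history_a & history_b) / len(history_a | history_b)

a = {"fever", "cough", "rash"}
b = {"fever", "cough", "headache"}
print(non_temporal_similarity(a, b))  # 0.5
```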

Relevance:

80.00%

Publisher:

Abstract:

In terms of a general time theory which addresses time elements as typed point-based intervals, a formal characterization of time series and state sequences is introduced. Based on this framework, the subsequence matching problem is tackled by transforming it into a bipartite graph matching problem. A hybrid similarity model with high tolerance of inversion, crossover and noise is then proposed for matching the corresponding bipartite graphs, involving both temporal and non-temporal measurements. Experimental results on reconstructed time-series data from the UCI KDD Archive demonstrate that this approach is more effective than traditional similarity-model-based algorithms, promising robust techniques for larger time-series databases and real-life applications such as Content-based Video Retrieval (CBVR).
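The reduction to bipartite matching can be sketched as follows, assuming each state is a (time, value) pair and using the Hungarian-style assignment solver from SciPy; the particular cost weights and distance functions are illustrative assumptions, not the paper's hybrid model.

```python
# Illustrative sketch: score two state sequences by minimum-cost bipartite
# matching, mixing a non-temporal distance on values with a temporal
# distance on timestamps. Weights are arbitrary. Requires numpy and scipy.

import numpy as np
from scipy.optimize import linear_sum_assignment

def hybrid_match_cost(seq_a, seq_b, w_value=1.0, w_time=0.1):
    """Build a pairwise cost matrix and solve the assignment problem."""
    cost = np.empty((len(seq_a), len(seq_b)))
    for i, (t_a, v_a) in enumerate(seq_a):
        for j, (t_b, v_b) in enumerate(seq_b):
            cost[i, j] = w_value * abs(v_a - v_b) + w_time * abs(t_a - t_b)
    rows, cols = linear_sum_assignment(cost)  # optimal bipartite matching
    return cost[rows, cols].sum(), list(zip(rows.tolist(), cols.tolist()))

# Each state is (time, value); the second and third states are inverted.
a = [(0, 1.0), (1, 2.0), (2, 3.0)]
b = [(0, 1.1), (1, 3.0), (2, 2.1)]
total, pairs = hybrid_match_cost(a, b)
print(total, pairs)
```

Because the assignment solver is free to cross edges, an inversion such as the swapped second and third states above still yields a low-cost matching, which is the kind of tolerance the abstract claims.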

Relevance:

80.00%

Publisher:

Abstract:

This Leadership Academy Workshop presentation focused on 'Trust and Leadership in the Downturn', with particular reference to the public sector and to education. The presentation discussed a range of definitions of trust, including the view of Mayer, Davis and Schoorman (1995) that trust can be described as 'the willingness of a person to be vulnerable to the actions of another, based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that action'. The presentation then focused on the reasons why this relational psychological state is important, particularly in an economic recession when people were facing job cuts and economic uncertainty in a wider political and social environment characterised by cynicism and a downturn in trust. If trust is defined in part as a belief in the honesty, competence and benevolence of others, it tends to act like 'social glue', cushioning difficult situations and enabling actions to take place easily that otherwise would not be possible. A worrying state of affairs has recently been developing across the world, however, in the economic downturn, as reported in the Edelman Trust Barometer for 2009, which recorded a marked diminution of trust in corporations, businesses and government as a result of the credit crunch. While the US and parts of Europe were showing recovery from a generalised loss of trust by mid-year 2009, the UK had not. Social attitudes in Britain may be hardening: from being a nation of sceptics we may be becoming a nation of cynics. For example, 69% of the population surveyed by Edelman trusted the government less than they had six months earlier. In this situation, there is a need to promote positive measures to build trust, including the establishment of more transparent and honest business practices and practices to ensure that employees are treated well. Following the presentation, a workshop was held to discuss the nature of a possible loss of trust in the downturn in the UK and its implications for leadership practices and development.

Relevance:

30.00%

Publisher:

Abstract:

Lennart Åqvist (1992) proposed a logical theory of legal evidence, based on the Bolding-Ekelöf degrees of evidential strength. This paper reformulates Åqvist's model in terms of the probabilistic version of the kappa calculus. Proving its acceptability in the legal context is beyond the present scope, but the epistemological debate about Bayesian Law is clearly relevant. While the present model is a possible link to that line of inquiry, we offer some considerations about the broader picture of the potential of AI & Law in the evidentiary context. Whereas probabilistic reasoning is well-researched in AI, calculations about the threshold of persuasion in litigation, whatever their value, are just the tip of the iceberg. The bulk of the modeling desiderata is arguably elsewhere, if one is to ideally make the most of AI's distinctive contribution as envisaged for legal evidence research.
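For readers unfamiliar with the kappa calculus, the sketch below illustrates its probabilistic reading, in which a disbelief rank kappa corresponds to a probability of order epsilon**kappa and ranks combine additively for independent propositions. How the Bolding-Ekelöf degrees of evidential strength are mapped onto ranks is precisely what an Åqvist-style model must specify, and no concrete mapping is assumed here.

```python
# Illustrative sketch of the probabilistic (order-of-magnitude) reading of
# the kappa calculus. A rank-k proposition has probability of order eps**k;
# ranks add under independent conjunction and take the minimum under
# disjunction. The mapping from legal degrees of proof to ranks is left out.

EPS = 1e-3  # stand-in for the "infinitesimal" epsilon

def prob_of_rank(kappa: int, eps: float = EPS) -> float:
    """A rank-kappa proposition has probability of order eps**kappa."""
    return eps ** kappa

def rank_and(k_a: int, k_b: int) -> int:
    """Independent conjunction: ranks add (probabilities multiply)."""
    return k_a + k_b

def rank_or(k_a: int, k_b: int) -> int:
    """Disjunction: the minimum rank dominates."""
    return min(k_a, k_b)

# Two independent items of evidence with disbelief ranks 1 and 2:
print(rank_and(1, 2), prob_of_rank(1) * prob_of_rank(2))  # 3, ~eps**3
print(rank_or(1, 2))                                      # 1
```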

Relevance:

30.00%

Publisher:

Abstract:

C.G. Jung and Literary Theory remedies a significant omission in literary studies by doing for Jung and poststructuralist literary theories what has been done extensively for Freud, Lacan and post-Freudian psychoanalysis. This work represents a complete departure from traditional Jungian literary criticism. Instead, radically new Jungian literary theories of deconstruction, feminist theory, gender and psyche, the body and sexuality, spirituality, postcolonialism, historicism and reader-response are developed. As well as linking Jung to the work of Derrida, Kristeva and Irigaray, the book traces contentious occult, cultural and political narratives in Jung's career. It contains a chapter on Jung and fascism in a literary context. [From the Publisher]

Relevance:

30.00%

Publisher:

Abstract:

Since 1984 David Kolb's Experiential Learning Theory (ELT) has been a leading influence in the development of learner-centred pedagogy in management and business. It forms the basis of Kolb's own Learning Style Inventory and those of other authors, including Honey and Mumford (2000). It also provides powerful underpinning for the emphasis, nay insistence, on reflection as a way of learning and the use of reflective practice in the preparation of students for business, management and other professions. In this paper, we confirm that Kolb's ELT is still the most commonly cited source used in relation to reflective practice. Kolb himself continues to propound its relevance to teaching and learning in general. However, we also review some of the criticisms that ELT has attracted over the years and advance new criticisms that challenge its relevance to higher education and its validity as a model for formal, intentional learning.

Relevance:

30.00%

Publisher:

Abstract:

Prediction of tandem mass spectrometric (MS/MS) fragmentation for non-peptidic molecules based on structure is of immense interest to the mass spectrometrist. If a reliable approach to MS/MS prediction could be achieved, its impact within the pharmaceutical industry could be immense. Many publications have stressed that the fragmentation of a molecular ion or protonated molecule is a complex process that depends on many parameters, making prediction difficult. Commercial prediction software relies on a collection of general heuristic rules of fragmentation, which involve cleaving every bond in the structure to produce a list of 'expected' masses which can be compared with the experimental data. These approaches do not take into account the thermodynamic or molecular orbital effects that act on the molecule at the point of protonation, which could influence the potential sites of bond cleavage based on the structural motif. A series of compounds has been studied by examining the experimentally derived high-resolution MS/MS data and comparing them with in silico modelling of the neutral and protonated structures. The effect that protonation at specific sites can have on bond lengths has also been determined. We have calculated the thermodynamically most stable protonated species and have observed how that information can help predict the cleavage site for that ion. The data have shown that this use of in silico techniques could offer a way to predict MS/MS spectra.
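The heuristic baseline the abstract describes, cleaving every bond to produce a list of 'expected' masses, can be caricatured in a few lines. The sketch below assumes RDKit is installed and cleaves only acyclic single bonds of an arbitrary example molecule; real tools add hydrogen-transfer and charge-state rules, so these masses are crude approximations.

```python
# Naive rule-based fragment enumeration: break each acyclic single bond and
# record the exact masses of the resulting pieces. Aspirin is an arbitrary
# example; no protonation, H-transfer, or charge logic is modelled.

from rdkit import Chem
from rdkit.Chem.Descriptors import ExactMolWt

mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin

expected_masses = set()
for bond in mol.GetBonds():
    if bond.IsInRing() or bond.GetBondType() != Chem.BondType.SINGLE:
        continue  # only acyclic single bonds are cleaved in this sketch
    pieces = Chem.FragmentOnBonds(mol, [bond.GetIdx()], addDummies=False)
    for frag in Chem.GetMolFrags(pieces, asMols=True, sanitizeFrags=False):
        frag.UpdatePropertyCache(strict=False)  # refresh valences sans full sanitization
        expected_masses.add(round(ExactMolWt(frag), 4))

print(sorted(expected_masses))  # candidate masses to compare with MS/MS peaks
```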

Relevance:

30.00%

Publisher:

Abstract:

Economic analysis treats technology as exogenously given, even though it is endogenously determined. This paper examines this conceptual conflict. The paper outlines an alternative conceptual framework, which uses a 'General Vertical Division of Labour' into conceptual and executive parts to facilitate a coherent political-economic explanation of technological change. It suggests that we may acquire, rather than impose, an understanding of technological change. It also suggests that we may re-define and reassess the efficiency of technological change through the values inculcated into it.