26 results for Humanitarianism and complex emergencies

in Aston University Research Archive


Relevance:

100.00%

Abstract:

The two areas of theory upon which this research was based were 'strategy development process' (SDP) and 'complex adaptive systems' (CAS), the latter as part of complexity theory focused on human social organisations. The literature reviewed showed a paucity of empirical work and theory in the overlap of the two areas, providing an opportunity for contributions to knowledge in each area of theory, and for practitioners. An inductive approach was adopted for this research, in an effort to discover new insights into the focus area of study. It was undertaken from within an interpretivist paradigm and based on a novel conceptual framework. The organisationally intimate nature of the research topic and the researcher's circumstances required a research design that was both in-depth and long-term. The result was a single exploratory case study, drawing on 44 in-depth, semi-structured interviews with 36 people, including all top management team members and other significant staff; observations, rumour, and grapevine (ORG) data; and archive data, over a 5½-year period (2005-2010). The findings confirm the validity of the conceptual framework and show that complex adaptive systems theory has the potential to extend strategy development process theory. The study shows how and why the strategy process developed in the case study organisation by providing deeper insights into the behaviour of the people, their backgrounds, and their interactions. Broad predictions of the 'latent strategy development' process and some elements of the strategy content are also possible. Based on this research, the utility of the SDP model can be extended by including people's behavioural characteristics within the organisation, via complex adaptive systems theory. Further research is recommended to test the limits of the application of the conceptual framework and to improve its efficacy with more organisations across a variety of sectors.

Relevance:

100.00%

Abstract:

The deoxidation of steel with complex deoxidisers was studied at 1550°C and compared with silicon, aluminium, and silicon/aluminium alloys as standards. The deoxidation alloy systems Ca/Si/Al, Mg/Si/Al, and Mn/Si/Al were chosen for the low liquidus temperatures of many of their oxide mixtures and the potential deoxidising power of their constituent elements. Product separation rates and compositional relationships following deoxidation were examined. Silicon/aluminium alloy deoxidation resulted in the product compositions and residual oxygen contents expected from equilibrium and stoichiometric considerations, but with the Ca/Si/Al and Mg/Si/Al alloys the volatility of calcium and magnesium prevented them from participating in the final solute equilibrium, despite their reported solubility in liquid iron. Electron-probe microanalysis of the products showed various concentrations of lime and magnesia, possibly resulting from reaction between the metal vapours and dissolved oxygen. The consequent reduction of silica activity in the products due to the presence of CaO and MgO produced an indirect effect of calcium and magnesium on the residual oxygen content. Product separation rates, indicated by vacuum fusion analyses, were not significantly influenced by calcium and magnesium, but the rapid separation of products having a high Al2O3/SiO2 ratio was confirmed. Manganese participated in deoxidation when present either as an alloying element in the steel or as a deoxidation alloy constituent. The compositions of initial oxide products were related to deoxidation alloy compositions. Separated products that were not alumina-saturated dissolved crucible material to achieve saturation; the melt then equilibrated with this slag and crucible by diffusion, which determined the residual oxygen content. MnO and SiO2 activities were calculated, and approximate values of the MnO activity were deduced for the compositions obtained. Separation rates were greater for products of high interfacial tension. Rates calculated from a model based on Stokes' Law showed qualitative agreement with experimental data when corrected for coalescence effects.
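The Stokes' Law calculation underlying the separation-rate model is simple enough to sketch. The following Python snippet computes the terminal rise velocity of a buoyant spherical oxide inclusion in liquid steel; the density and viscosity figures are illustrative assumptions, not values taken from the thesis.

```python
def stokes_rise_velocity(radius_m, rho_melt, rho_inclusion, viscosity_pa_s, g=9.81):
    """Terminal rise velocity (m/s) of a buoyant sphere from Stokes' Law:
    v = 2 g r^2 (rho_melt - rho_inclusion) / (9 mu)."""
    return 2.0 * g * radius_m**2 * (rho_melt - rho_inclusion) / (9.0 * viscosity_pa_s)

# Illustrative values (assumed): liquid steel at ~1550 degC with density
# ~7000 kg/m^3 and viscosity ~0.006 Pa.s; an alumina-rich inclusion ~3500 kg/m^3.
v = stokes_rise_velocity(radius_m=10e-6, rho_melt=7000.0,
                         rho_inclusion=3500.0, viscosity_pa_s=0.006)
print(f"10 um inclusion rises at about {v * 1000:.2f} mm/s")
```

Because the rise velocity scales with the square of the radius, coalescence of inclusions sharply accelerates separation, which is consistent with the need to correct the Stokes' Law model for coalescence effects.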

Relevance:

100.00%

Abstract:

MEG beamformer algorithms work by assuming that correlated yet spatially distinct local field potentials do not develop in the human brain. Despite this assumption, images produced by such algorithms concur with those from other non-invasive and invasive estimates of brain function. In this paper we set out to develop a method that could be applied to raw MEG data to test this assumption explicitly. We show that a promax rotation of MEG channel data can be used as an approximate estimator of the number of spatially distinct correlated sources in any frequency band.
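As a rough illustration of the idea, the following Python sketch factors the channel covariance and counts rotated components with non-trivial loadings. It uses a varimax rotation as an orthogonal stand-in for the oblique promax rotation the paper describes, and every threshold and dimension here is an illustrative assumption.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=50, tol=1e-6):
    """Orthogonal varimax rotation (used here as a stand-in for promax)."""
    p, k = loadings.shape
    R = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L * (L**2).sum(axis=0)))
        R = u @ vt
        var_new = s.sum()
        if var > 0.0 and var_new / var < 1.0 + tol:
            break
        var = var_new
    return loadings @ R

def estimate_n_sources(data, n_keep=10, loading_thresh=0.3):
    """data: channels x samples. Count rotated components on which at
    least one channel loads strongly (the threshold is an assumption)."""
    evals, evecs = np.linalg.eigh(np.cov(data))
    order = np.argsort(evals)[::-1][:n_keep]
    loadings = evecs[:, order] * np.sqrt(evals[order])    # PCA loadings
    rotated = varimax(loadings)
    strength = np.abs(rotated).max(axis=0)
    return int((strength / strength.max() > loading_thresh).sum())
```

Applied band-by-band to band-pass-filtered data, the count of strongly loading rotated factors approximates the number of spatially distinct correlated sources in that frequency band.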

Relevance:

100.00%

Abstract:

This thesis studied the effect of (i) the number of grating components and (ii) parameter randomisation on root-mean-square (r.m.s.) contrast sensitivity and spatial integration. The effectiveness of spatial integration without external spatial noise depended on the number of equally spaced orientation components in the sum of gratings. The critical area marking the saturation of spatial integration decreased as the number of components increased from 1 to 5-6, but increased again at 8-16 components. The critical area behaved similarly as a function of the number of grating components when stimuli consisted of 3, 6, or 16 components with different orientations and/or phases embedded in spatial noise. Spatial integration thus seemed to depend on the global Fourier structure of the stimulus. Spatial integration was similar for sums of two vertical cosine or sine gratings with various Michelson contrasts in noise. The critical area for a grating sum was found to be the sum of the logarithmic critical areas for the component gratings, weighted by their relative Michelson contrasts. The human visual system was modelled as a simple image processor in which the visual stimulus is first low-pass filtered by the optical modulation transfer function of the human eye and then high-pass filtered, up to the spatial cut-off frequency determined by the lowest neural sampling density, by the neural modulation transfer function of the visual pathways. Internal noise is added before signal interpretation occurs in the brain, and detection is mediated by a local, spatially windowed matched filter. The model was extended to include complex stimuli and was found to describe the data successfully. The shape of the spatial integration function was similar for non-randomised and randomised simple and complex gratings; however, orientation and/or phase randomisation reduced r.m.s. contrast sensitivity by a factor of 2. The effect of parameter randomisation on spatial integration was modelled under the assumption that human observers change strategy from cross-correlation (i.e., a matched filter) to auto-correlation detection when uncertainty is introduced to the task. The model described the data accurately.
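The observer model described above lends itself to a compact sketch. The 1-D Python snippet below applies a Gaussian optical MTF, a high-pass neural MTF with an upper cut-off, additive internal noise, and a spatially windowed matched filter; the specific filter shapes and parameter values are illustrative assumptions rather than the thesis's fitted values.

```python
import numpy as np

def detect_response(stimulus, template, sigma_optics=1.0, f_cutoff=0.25,
                    noise_sd=0.05, window_sd=20.0, rng=None):
    """1-D sketch of the observer model: optical low-pass, neural high-pass
    with an upper cut-off, internal noise, then a windowed matched filter.
    All parameter values are illustrative assumptions."""
    rng = np.random.default_rng() if rng is None else rng
    n = stimulus.size
    f = np.fft.rfftfreq(n)                                # cycles per sample
    spec = np.fft.rfft(stimulus)
    otf = np.exp(-2.0 * (np.pi * sigma_optics * f) ** 2)  # Gaussian optical MTF
    neural = np.where(f < f_cutoff, f / f_cutoff, 0.0)    # high-pass up to neural cut-off
    filtered = np.fft.irfft(spec * otf * neural, n)
    noisy = filtered + rng.normal(0.0, noise_sd, n)       # internal noise
    x = np.arange(n) - n / 2.0
    window = np.exp(-x**2 / (2.0 * window_sd**2))         # local spatial window
    return float(np.sum(window * noisy * template))       # matched-filter decision variable

# Usage: compare responses to a grating profile vs. a blank field.
n = 256
x = np.arange(n)
grating = 0.2 * np.sin(2 * np.pi * 0.05 * x)              # 0.05 cycles/sample test grating
rng = np.random.default_rng(0)
print(detect_response(grating, grating, rng=rng),
      detect_response(np.zeros(n), grating, rng=rng))
```

Under parameter randomisation, the thesis's account amounts to replacing the fixed `template` with the noisy response itself (auto-correlation detection), consistent with the reported factor-of-2 loss in r.m.s. contrast sensitivity.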

Relevance:

100.00%

Abstract:

The Roma population has become a highly debated policy issue in the European Union (EU). The EU acknowledges that this ethnic minority faces extreme poverty and complex social and economic problems: 52% of the Roma population live in extreme poverty, 75% in poverty (Soros Foundation, 2007, p. 8), with a life expectancy at birth about ten years lower than that of the majority population. As a result, Romania has received a great deal of policy attention and EU funding, being eligible for 19.7 billion Euros from the EU for 2007-2013. Yet progress is slow, and it is debated whether Romania's government and companies have been capable of using these funds (EurActiv.ro, 2012). Analysing three case studies, this research looks at policy implementation in relation to the role of Roma networks in different geographical regions of Romania. It gives insights into how to get things done in complex settings, and it explains responses to the Roma problem as a 'wicked' policy issue. This longitudinal research was conducted between 2008 and 2011, comprising 86 semi-structured interviews, 15 observations, and documentary sources, and using a purposive sample focused on institutions responsible for implementing social policies for Roma: Public Health Departments, School Inspectorates, City Halls, Prefectures, and NGOs. Respondents included governmental workers, academics, Roma school mediators, Roma health mediators, Roma experts, Roma Councillors, NGO workers, and Roma service users. By triangulating data collected through various methods and from various categories of respondents, a comprehensive and precise representation of Roma network practices was created. The provisions of the 2001 'Governmental Strategy to Improve the Situation of the Roma Population' facilitated the formation of a Roma network by introducing special jobs in local and central administration. Resources, people, skills, and practices varied across counties. In contrast to the communist period, a new Roma elite emerged: social entrepreneurs set the pace of change by creating either closed cliques or open alliances and by using more or less transparent practices. This research deploys the concept of social/institutional entrepreneurs to analyse how key actors influence clique and alliance formation and functioning. Significantly, by contrasting three case studies, it shows that both closed cliques and open alliances help to achieve public policy network objectives, but that closed cliques can also lead to failure to improve the health and education of Roma people in a given region.

Relevance:

100.00%

Abstract:

Assessment criteria are increasingly incorporated into teaching, making it important to clarify the pedagogic status of the qualities to which they refer. We reviewed theory and evidence about the extent to which four core criteria for student writing (critical thinking, use of language, structuring, and argument) refer to the outcomes of three types of learning: generic skills learning, a deep approach to learning, and complex learning. The analysis showed that all four core criteria describe, to some extent, properties of text resulting from using skills, but none qualifies fully as a description of the outcomes of applying generic skills. Most also describe certain aspects of the outcomes of taking a deep approach to learning. Critical thinking and argument correspond most closely to the outcomes of complex learning. At lower levels of performance, use of language and structuring describe the outcomes of applying transferable skills; at higher levels of performance, they describe the outcomes of taking a deep approach to learning. We propose that the type of learning required to meet the core criteria is most usefully and accurately conceptualized as the learning of complex skills, and that this provides a conceptual framework for maximizing the benefits of using assessment criteria as part of teaching. © 2006 Taylor & Francis.

Relevance:

100.00%

Abstract:

Alzheimer's disease is the commonest degenerative disease of the nervous system affecting elderly people. It is characterised by 'dementia', a global cognitive decline involving loss of short-term memory, judgement, and emotional control. In addition, patients may suffer a range of visual problems, including impairment of visual acuity, colour vision, and eye movements, as well as complex visual disturbances.

Relevance:

100.00%

Abstract:

We employ the methods of statistical physics to study the performance of Gallager-type error-correcting codes. In this approach, the transmitted codeword comprises Boolean sums of the original message bits, selected by two randomly constructed sparse matrices. We show that a broad range of these codes potentially saturate Shannon's bound but are limited by the decoding dynamics used. Other codes show sub-optimal performance but are not restricted by the decoding dynamics. We also show how these codes may be employed as a practical public-key cryptosystem, with performance competitive with modern cryptographic methods.
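As a sketch of the encoding step only: the snippet below builds a random sparse binary matrix and forms the mod-2 (Boolean) sums of selected message bits. In the two-matrix construction the abstract refers to, a second sparse matrix is also applied (via its GF(2) inverse) before transmission, and decoding proceeds by belief propagation; neither is shown here, and all sizes are illustrative assumptions.

```python
import numpy as np

def sparse_binary_matrix(rows, cols, ones_per_row, rng):
    """Random sparse matrix over GF(2) with a fixed number of 1s per row."""
    M = np.zeros((rows, cols), dtype=np.uint8)
    for r in range(rows):
        M[r, rng.choice(cols, size=ones_per_row, replace=False)] = 1
    return M

rng = np.random.default_rng(0)
K, N = 100, 200                                  # message / codeword lengths (illustrative)
A = sparse_binary_matrix(N, K, ones_per_row=3, rng=rng)

s = rng.integers(0, 2, size=K, dtype=np.uint8)   # original message bits
t = (A @ s) % 2                                  # Boolean (mod-2) sums of selected bits
print(t[:16])
```

The sparsity (here, three 1s per row) is what makes belief-propagation decoding tractable, and it is also the source of the decoding-dynamics limitation the abstract mentions.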

Relevance:

100.00%

Abstract:

The assertion that social phenomena are peculiarly intricate and complex has a virtually uncontested tradition in much of social discourse. A significant part of this premise is the conviction that such complexity complicates, perhaps even inhibits, the development and application of social scientific knowledge. Our paper explores the origins, the basis, and the consequences of this assertion, asking in particular whether the classic complexity assertion still deserves to be invoked in analyses of the production and utilization of social scientific knowledge in modern society. As an illustration we refer to one of the most prominent and politically influential social scientific theories, John Maynard Keynes's economic theory. We conclude that the practical value of social scientific knowledge does not necessarily depend on a faithful, in the sense of complete, representation of (complex) social reality. Practical knowledge is context-sensitive, if not project-bound. Social scientific knowledge that wants to optimize its practicality has to attend and attach itself to elements of practical social situations that can be altered, or are actionable, by relevant actors. This chapter thus re-examines the relation between social reality, social scientific knowledge, and its practical application, challenging the widely accepted view that the peculiar complexity of social reality is an impediment to good theoretical comprehension and hence to applicability.

Relevance:

100.00%

Abstract:

This paper complements the preceding one by Clarke et al., which looked at the long-term impact of retail restructuring on consumer choice at the local level. Whereas the previous paper was based on quantitative evidence from survey research, this paper draws on the qualitative phases of the same three-year study; in it we aim to understand how changing forms of retail provision are experienced at the neighbourhood and household level. The empirical material is drawn from focus groups, accompanied shopping trips, diaries, interviews, and kitchen visits with eight households in two contrasting neighbourhoods in the Portsmouth area. The data demonstrate that consumer choice involves judgments of taste, quality, and value as well as more ‘objective’ questions of convenience, price, and accessibility. These judgments are related to households’ differential levels of cultural capital and involve ethical and moral considerations as well as more mundane considerations of practical utility. Our evidence suggests that many of the terms conventionally advanced as explanations of consumer choice (such as ‘convenience’, ‘value’, and ‘habit’) have very different meanings according to household circumstances. To understand these meanings requires us to relate consumers’ in-store behaviour to the domestic context in which their consumption choices are embedded. Bringing theories of practice to bear on the nature of consumer choice, our research demonstrates that choice between stores can be understood in terms of accessibility and convenience, whereas choice within stores involves notions of value, price, and quality. We also demonstrate that choice both between and within stores is strongly mediated by consumers’ household contexts, reflecting the extent to which shopping practices are embedded within consumers’ domestic routines and complex everyday lives. The paper concludes with a summary of the overall findings of the project and a discussion of the practical and theoretical implications of the study.

Relevance:

100.00%

Abstract:

Signal integration determines cell fate at the cellular level, affects cognitive processes and affective responses at the behavioural level, and is likely to be involved in psychoneurobiological processes underlying mood disorders. Interactions between stimuli may be subject to time effects, and such time-dependencies typically lead to complex responses at both the cellular and the behavioural level. We show that both three-factor models and time series models can be used to uncover these time-dependencies; however, we argue that for short longitudinal data the three-factor modelling approach is more suitable. To illustrate both approaches, we re-analysed previously published short longitudinal data sets. We found that in human embryonic kidney (HEK293) cells the interaction effect of insulin and epidermal growth factor on the activation of extracellular signal-regulated kinase (ERK) 1 signalling is subject to a time effect and decays dramatically at peak values of ERK activation. In contrast, the interaction effect induced by hypoxia and tumour necrosis factor-alpha on the transcriptional activity of the human cyclo-oxygenase-2 promoter in HEK293 cells is time-invariant, at least in the first 12-h window after stimulation. Furthermore, we applied the three-factor model to previously reported animal studies in which memory storage was found to be subject to an interaction effect between the beta-adrenoceptor agonist clenbuterol and certain antagonists acting on the alpha-1-adrenoceptor/glucocorticoid-receptor system. Our model-based analysis suggests that the interaction effect is relevant only if the antagonist drug is administered within a critical time window.
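For the three-factor approach, a minimal sketch with statsmodels is shown below: the three-way interaction term tests whether the two-stimulus interaction itself changes with time. The data are simulated for illustration; the factor names (stim_a, stim_b) and effect sizes are assumptions, not the paper's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(1)

# Simulated short longitudinal design (illustrative assumptions):
# two binary stimuli and a two-level time factor, with the stimulus
# interaction decaying at the later time point.
rows = []
for a in (0, 1):                 # stimulus A, e.g. insulin absent/present
    for b in (0, 1):             # stimulus B, e.g. EGF absent/present
        for t in (0, 1):         # early vs late measurement
            for _ in range(10):  # replicates per cell
                effect = a + b + 2.0 * a * b * (1 - t)   # interaction decays with time
                rows.append(dict(stim_a=a, stim_b=b, time=t,
                                 response=effect + rng.normal(0.0, 0.5)))
df = pd.DataFrame(rows)

# Three-factor ANOVA: the stim_a:stim_b:time term tests whether the
# two-stimulus interaction itself changes over time.
model = ols("response ~ C(stim_a) * C(stim_b) * C(time)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```

A significant stim_a:stim_b:time term in this layout is exactly the kind of time-dependent interaction the paper argues three-factor models can uncover in short longitudinal data.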

Relevance:

100.00%

Abstract:

To make vision possible, the visual nervous system must represent the most informative features in the light pattern captured by the eye. Here we use Gaussian scale-space theory to derive a multiscale model for edge analysis, and we test it in perceptual experiments. At all scales there are two stages of spatial filtering: an odd-symmetric Gaussian first-derivative filter provides the input to a Gaussian second-derivative filter. Crucially, the output at each stage is half-wave rectified before feeding forward to the next. This creates nonlinear channels selectively responsive to one edge polarity while suppressing spurious or "phantom" edges. The two stages have properties analogous to simple and complex cells in the visual cortex. Edges are found as peaks in a scale-space response map that is the output of the second stage; the position and scale of the peak response identify the location and blur of the edge. The model predicts remarkably accurately our results on human perception of edge location and blur for a wide range of luminance profiles, including the surprising finding that blurred edges look sharper when their length is made shorter. The model enhances our understanding of early vision by integrating computational, physiological, and psychophysical approaches. © ARVO.
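A minimal 1-D sketch of the two-stage filtering model, using scipy's Gaussian derivative filters; the scale set, rectification signs, scale-normalisation exponent, and test edge are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def edge_scale_space(profile, scales, norm_power=1.5):
    """Two-stage sketch: Gaussian 1st derivative -> half-wave rectify ->
    Gaussian 2nd derivative -> rectify, at each scale. Responses are
    multiplied by sigma**norm_power (an assumed normalisation) so that
    the peak's scale coordinate tracks edge blur. Returns scales x positions."""
    rows = []
    for sigma in scales:
        stage1 = np.maximum(gaussian_filter1d(profile, sigma, order=1), 0.0)
        stage2 = np.maximum(-gaussian_filter1d(stage1, sigma, order=2), 0.0)
        rows.append(sigma**norm_power * stage2)
    return np.array(rows)

# Usage: a step edge blurred by a known Gaussian (sigma_blur = 10 samples).
step = np.repeat([0.0, 1.0], 256)
edge = gaussian_filter1d(step, 10.0)
scales = [1, 2, 4, 6, 8, 12, 16]
rmap = edge_scale_space(edge, scales)
s_idx, pos = np.unravel_index(rmap.argmax(), rmap.shape)
print(f"edge located at sample {pos}, preferred scale {scales[s_idx]}")
```

The peak of the response map supplies both readouts the abstract describes: its position marks the edge location, and its scale coordinate grows with the edge's blur.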

Relevance:

100.00%

Abstract:

The purpose of this paper is to demonstrate a strong and significant effect of complexity in aphasia, independent of other variables including length. Complexity was a strong and significant predictor of accurate repetition in a group of 13 Italian aphasic patients when entered in a regression equation either simultaneously with, or after, a large number of other variables. Significant effects were found both when complexity was measured in terms of the number of complex onsets (as in a recent paper by Nickels & Howard, 2004) and when it was measured in a more comprehensive way. Significant complexity effects were also found with matched lists contrasting simple and complex words, and in analyses of errors. Effects of complexity, however, were restricted to patients with articulatory difficulties. Reasons for this association, and for the lack of significant results in Nickels and Howard (2004), are discussed. © 2005 Psychology Press Ltd.
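The hierarchical entry of complexity after other predictors can be illustrated with a short statsmodels sketch: a logistic regression of item-level repetition accuracy, with complexity added after length and frequency and assessed with a likelihood-ratio test. The data, predictor names, and effect sizes below are simulated assumptions, not the paper's materials.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(2)

# Simulated item-level data (illustrative only).
n = 400
df = pd.DataFrame({
    "length": rng.integers(2, 6, n),            # word length in syllables
    "log_freq": rng.normal(0.0, 1.0, n),        # log lexical frequency
    "complexity": rng.integers(0, 3, n),        # e.g. number of complex onsets
})
logit = 2.0 - 0.5 * df.length + 0.4 * df.log_freq - 0.8 * df.complexity
df["correct"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# Step 1: covariates only; step 2: add complexity.
base = smf.logit("correct ~ length + log_freq", data=df).fit(disp=0)
full = smf.logit("correct ~ length + log_freq + complexity", data=df).fit(disp=0)
lr = 2.0 * (full.llf - base.llf)                # likelihood-ratio statistic
print(f"LR chi2(1) = {lr:.2f}, p = {stats.chi2.sf(lr, df=1):.4g}")
```

A significant improvement in fit at step 2 is the regression analogue of the paper's claim that complexity predicts accurate repetition over and above length and the other variables.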

Relevance:

100.00%

Abstract:

In this chapter we discuss the interviewing of adult witnesses and victims, with reference to how the extant psychological and linguistic literature has contributed to understanding and informing interview practice over the past 20 years, and how it continues to support practical and procedural improvements. We have only scratched the surface of this important and complex topic, but throughout the chapter we direct readers to many in-depth reviews and some of the most contemporary research literature currently available in this domain. We introduce the PEACE model and describe the Cognitive Interview procedure and its development. We also discuss rapport building, question types, and communication style, all with reference to witness memory and practical interviewing. Finally, we highlight areas that would benefit from research, for example conducting interviews with interpreters, and how new training initiatives are seeking to improve interview procedures and interviewer practice.