18 results for 230106 Real and Complex Functions
in Aston University Research Archive
Abstract:
Isotropic scattering Raman spectra of liquid acetonitrile (AN) solutions of LiBF4 and NaI at various temperatures and concentrations have been investigated. For the first time, imaginary as well as real parts of the solvent vibrational correlation functions have been extracted from the spectra. Such imaginary parts are currently an important component of modern theories of vibrational relaxation in liquids. This investigation thus provides the first experimental data on imaginary parts of a correlation function in AN solutions. Using the fitting algorithm we recently developed, statistically confident models for the Raman spectra were deduced. The parameters of the band shapes, with an additional correction, of the ν2 AN vibration (CN stretching), together with their confidence intervals, are also reported for the first time. It is shown that three distinct species of AN molecules, with lifetimes greater than ~10⁻¹³ s, can be detected in solutions containing Li+ and Na+. These species are attributed to AN molecules directly solvating cations; the single oriented and polarised molecules interleaving the cation and anion of a Solvent Shared Ion Pair (SShIP); and molecules solvating anions. These last are considered to be equivalent to the next layer of solvent molecules, because the CN end of the molecule is distant from the anion and thus less affected by the ionic charge than in the case of cation solvation. Calculations showed that at the concentrations employed, 1 and 0.3 M, there were essentially no other solvent molecules remaining that could be considered as bulk solvent. Calculations also showed that the internuclear distance in these solutions supported the proposal that the ionic entity dominating in solution was the SShIP, and other evidence was adduced that confirmed the absence of Contact Ion Pairs at these concentrations. The parameters of the shape of the vibrational correlation functions of all three species are reported.
The parameters of intramolecular anharmonic coupling between the potential surfaces in AN and the dynamics of the intermolecular environment fluctuations and intermolecular energy transfer are presented. These results will assist investigations made at higher and lower concentrations, when additional species and interactions with AN molecules will be present.
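The extraction step described above rests on the standard relation between an isotropic Raman band and the vibrational correlation function: C(t) is the Fourier transform of the band profile, and any asymmetry in the band produces a non-zero imaginary part. A minimal numerical sketch, using an illustrative displaced Lorentzian rather than the study's fitted band models:

```python
import numpy as np

# Hedged sketch (illustrative parameters, not the study's fitted models):
# the vibrational correlation function is the Fourier transform of the
# isotropic Raman band profile; an asymmetric band gives Im C(t) != 0.
omega = np.linspace(-100.0, 100.0, 2001)            # wavenumber offset, cm^-1
dw = omega[1] - omega[0]
gamma, shift = 5.0, 1.5                             # assumed width and asymmetry
band = gamma / ((omega - shift) ** 2 + gamma ** 2)  # displaced Lorentzian
band /= band.sum() * dw                             # normalise to unit area

c = 2.0 * np.pi * 2.998e10                          # rad/s per cm^-1
t = np.linspace(0.0, 2e-12, 400)                    # time grid, seconds
# C(t) = integral of I(omega) * exp(-i * omega * t) d(omega)
C = (band[None, :] * np.exp(-1j * c * omega[None, :] * t[:, None])).sum(axis=1) * dw

print(round(abs(C[0]), 6))   # unit magnitude at t = 0 by normalisation
```

With a symmetric band (shift = 0) the imaginary part vanishes identically, which is why extracting it experimentally requires resolving the band asymmetry.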
Abstract:
The two areas of theory upon which this research was based were 'strategy development process' (SDP) and 'complex adaptive systems' (CAS), as part of complexity theory, focused on human social organisations. The literature reviewed showed that there is a paucity of empirical work and theory in the overlap of the two areas, providing an opportunity for contributions to knowledge in each area of theory, and for practitioners. An inductive approach was adopted for this research, in an effort to discover new insights into the focus area of study. It was undertaken from within an interpretivist paradigm, and based on a novel conceptual framework. The organisationally intimate nature of the research topic, and the researcher's circumstances, required a research design that was both in-depth and long term. The result was a single, exploratory, case study, which included use of data from 44 in-depth, semi-structured interviews with 36 people, involving all the top management team members and significant other staff members; observations, rumour and grapevine (ORG) data; and archive data, over a 5½-year period (2005–2010). Findings confirm the validity of the conceptual framework, and that complex adaptive systems theory has the potential to extend strategy development process theory. It has shown how and why the strategy process developed in the case study organisation by providing deeper insights into the behaviour of the people, their backgrounds, and interactions. Broad predictions of the 'latent strategy development' process and some elements of the strategy content are also possible. Based on this research, it is possible to extend the utility of the SDP model by including people's behavioural characteristics within the organisation, via complex adaptive systems theory. Further research is recommended to test the limits of the application of the conceptual framework and improve its efficacy with more organisations across a variety of sectors.
Abstract:
The deoxidation of steel with complex deoxidisers was studied at 1550°C and compared with silicon, aluminium and silicon/aluminium alloys as standards. The deoxidation alloy systems Ca/Si/Al, Mg/Si/Al and Mn/Si/Al were chosen for the low liquidus temperatures of many of their oxide mixtures and the potential deoxidising power of their constituent elements. Product separation rates and compositional relationships following deoxidation were examined. Silicon/aluminium alloy deoxidation resulted in the product compositions and residual oxygen contents expected from equilibrium and stoichiometric considerations, but with the Ca/Si/Al and Mg/Si/Al alloys the volatility of calcium and magnesium prevented them participating in the final solute equilibrium, despite their reported solubility in liquid iron. Electron-probe microanalysis of the products showed various concentrations of lime and magnesia, possibly resulting from reaction between the metal vapours and dissolved oxygen. The consequent reduction of silica activity in the products, due to the presence of CaO and MgO, produced an indirect effect of calcium and magnesium on the residual oxygen content. Product separation rates, indicated by vacuum fusion analyses, were not significantly influenced by calcium and magnesium, but the rapid separation of products having a high Al2O3/SiO2 ratio was confirmed. Manganese participated in deoxidation, when present either as an alloying element in the steel or as a deoxidation alloy constituent. The compositions of initial oxide products were related to deoxidation alloy compositions. Separated products which were not alumina-saturated dissolved crucible material to achieve saturation. The melt equilibrated with this slag and crucible by diffusion to determine the residual oxygen content. MnO and SiO2 activities were calculated, and approximate values of MnO activity deduced for the compositions obtained. Separation rates were greater for products of high interfacial tension.
The rates calculated from a model based on Stokes' law showed qualitative agreement with experimental data when corrected for coalescence effects.
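Stokes' law gives the terminal rise velocity of a small spherical inclusion in a quiescent melt as v = 2gr²(ρ_metal − ρ_product)/(9η). A minimal sketch with typical literature property values (assumed here, not the thesis's measured data):

```python
# Hedged sketch: Stokes'-law rise velocity of a spherical oxide product
# in liquid steel. All property values below are typical literature
# figures, assumed for illustration only.
g = 9.81            # gravitational acceleration, m/s^2
rho_metal = 7000.0  # liquid steel density, kg/m^3 (assumed)
rho_oxide = 3000.0  # alumina-silicate product density, kg/m^3 (assumed)
eta = 0.005         # dynamic viscosity of liquid steel, Pa.s (assumed)

def stokes_velocity(radius_m):
    """Terminal rise velocity (m/s): v = 2 g r^2 (rho_m - rho_p) / (9 eta)."""
    return 2.0 * g * radius_m ** 2 * (rho_metal - rho_oxide) / (9.0 * eta)

for r_um in (1.0, 10.0, 100.0):
    print(f"r = {r_um:6.1f} um  ->  v = {stokes_velocity(r_um * 1e-6):.3e} m/s")
```

The quadratic dependence on radius is why coalescence corrections matter: merged products rise far faster than the individual particles they formed from.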
Abstract:
MEG beamformer algorithms work by making the assumption that correlated and spatially distinct local field potentials do not develop in the human brain. Despite this assumption, images produced by such algorithms concur with those from other non-invasive and invasive estimates of brain function. In this paper we set out to develop a method that could be applied to raw MEG data to explicitly test this assumption. We show that a promax rotation of MEG channel data can be used as an approximate estimator of the number of spatially distinct correlated sources in any frequency band.
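The first stage of such an estimator can be sketched on simulated data: mix a few source time courses into many channels, then count the covariance eigenvalues that rise above the noise floor. This is a toy illustration only; the promax rotation itself, which the paper uses to sharpen the loading structure, is omitted, and the threshold rule is an assumption:

```python
import numpy as np

# Hedged sketch: simulate MEG-like channel data as a mixture of a few
# sources plus sensor noise, then count dominant covariance eigenvalues.
# The 5x-median threshold is an assumed rule of thumb, and the promax
# rotation step from the paper is not reproduced here.
rng = np.random.default_rng(0)
n_channels, n_samples, n_sources = 64, 5000, 3

sources = rng.standard_normal((n_sources, n_samples))   # source time courses
mixing = rng.standard_normal((n_channels, n_sources))   # fixed topographies
data = mixing @ sources + 0.1 * rng.standard_normal((n_channels, n_samples))

cov = np.cov(data)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
n_est = int(np.sum(eigvals > 5.0 * np.median(eigvals)))
print(n_est)
```

On real data the signal subspace is less cleanly separated from noise, which is why a rotation of the retained components is needed before they can be read as distinct sources.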
Abstract:
We used magnetoencephalography (MEG) to examine the nature of oscillatory brain rhythms when passively viewing both illusory and real visual contours. Three stimuli were employed: a Kanizsa triangle; a Kanizsa triangle with a real triangular contour superimposed; and a control figure in which the corner elements used to form the Kanizsa triangle were rotated to negate the formation of illusory contours. The MEG data were analysed using synthetic aperture magnetometry (SAM) to enable the spatial localisation of task-related oscillatory power changes within specific frequency bands, and the time-course of activity within given locations-of-interest was determined by calculating time-frequency plots using a Morlet wavelet transform. In contrast to earlier studies, we did not find increases in gamma activity (> 30 Hz) to illusory shapes, but instead a decrease in 10–30 Hz activity approximately 200 ms after stimulus presentation. The reduction in oscillatory activity was primarily evident within extrastriate areas, including the lateral occipital complex (LOC). Importantly, this same pattern of results was evident for each stimulus type. Our results further highlight the importance of the LOC and a network of posterior brain regions in processing visual contours, be they illusory or real in nature. The similarity of the results for both real and illusory contours, however, leads us to conclude that the broadband (< 30 Hz) decrease in power we observed is more likely to reflect general changes in visual attention than neural computations specific to processing visual contours.
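The Morlet-wavelet time-frequency analysis described above can be sketched for a single simulated channel: convolve the signal with a complex Gaussian-windowed sinusoid at the frequency of interest and take the squared magnitude as band-limited power over time. All parameters here are illustrative assumptions, not the study's settings:

```python
import numpy as np

# Hedged sketch of a Morlet-wavelet power time course at one frequency,
# of the kind used to build time-frequency plots. Sampling rate, burst
# timing and n_cycles are illustrative assumptions.
fs = 600.0                                   # sampling rate, Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)
signal = 0.2 * rng.standard_normal(t.size)   # background noise
burst = (t > 0.3) & (t < 0.6)
signal[burst] += np.sin(2 * np.pi * 20.0 * t[burst])   # 20 Hz burst

def morlet_power(x, fs, freq, n_cycles=7.0):
    """Power time course at `freq` via convolution with a Morlet wavelet."""
    sigma_t = n_cycles / (2.0 * np.pi * freq)            # temporal width
    tw = np.arange(-4 * sigma_t, 4 * sigma_t, 1.0 / fs)
    wavelet = np.exp(2j * np.pi * freq * tw) * np.exp(-tw**2 / (2 * sigma_t**2))
    wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))     # unit energy
    return np.abs(np.convolve(x, wavelet, mode="same")) ** 2

power20 = morlet_power(signal, fs, 20.0)
print(power20[(t > 0.35) & (t < 0.55)].mean() > power20[t < 0.2].mean())
```

Repeating this over a grid of frequencies and stacking the rows gives the time-frequency plot; a post-stimulus decrease in 10–30 Hz power, as reported above, would appear as a band-limited dip relative to baseline.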
Abstract:
We report on teaching Information Systems Analysis (ISA) in a way that takes the classroom into the real world to enrich students' understanding of the broader role of being an IS professional. Through exposure to less controllable and more uncomfortable issues (e.g., client deadlines; unclear scope; client expectations; unhelpful colleagues; uncertainty about what the problem is, never mind the solution) we aim to better prepare students to respond to the complex issues surrounding deployment of systems analysis methodologies in the real world. In this paper we provide enough detail on what these classes involve to allow a reader to replicate appealing elements in their own teaching. This paper is a reflection on integrating the real world into the teaching of ISA, from the standpoint of students who face an unstructured and complex world and of lecturers who aim to prepare students to hit the ground running when they encounter that world.
Abstract:
This research is concerned with the development of distributed real-time systems, in which software is used for the control of concurrent physical processes. These distributed control systems are required to periodically coordinate the operation of several autonomous physical processes, with the property of an atomic action. The implementation of this coordination must be fault-tolerant if the integrity of the system is to be maintained in the presence of processor or communication failures. Commit protocols have been widely used to provide this type of atomicity and ensure consistency in distributed computer systems. The objective of this research is the development of a class of robust commit protocols, applicable to the coordination of distributed real-time control systems. Extended forms of the standard two-phase commit protocol, providing fault-tolerant and real-time behaviour, were developed. Petri nets are used for the design of the distributed controllers, and to embed the commit protocol models within these controller designs. This composition of controller and protocol model allows the analysis of the complete system in a unified manner. A common problem for Petri net based techniques is that of state space explosion; a modular approach to both design and analysis helps cope with this problem. Although extensions to Petri nets that allow module construction exist, the modularisation is generally restricted to the specification, and analysis must be performed on the (flat) detailed net. The Petri net designs for the type of distributed systems considered in this research are both large and complex. The top-down, bottom-up and hybrid synthesis techniques that are used to model large systems in Petri nets are considered. A hybrid approach to Petri net design for a restricted class of communicating processes is developed. Designs produced using this hybrid approach are modular and allow re-use of verified modules.
In order to use this form of modular analysis, it is necessary to project an equivalent but reduced behaviour on the modules used. These projections conceal events local to modules that are not essential for the purpose of analysis. To generate the external behaviour, each firing sequence of the subnet is replaced by an atomic transition internal to the module, and the firing of these transitions transforms the input and output markings of the module. Thus local events are concealed through the projection of the external behaviour of modules. This hybrid design approach preserves properties of interest, such as boundedness and liveness, while the systematic concealment of local events allows the management of state space. The approach presented in this research is particularly suited to distributed systems, as the underlying communication model is used as the basis for the interconnection of modules in the design procedure. This hybrid approach is applied to Petri net based design and analysis of distributed controllers for two industrial applications that incorporate the robust, real-time commit protocols developed. Temporal Petri nets, which combine Petri nets and temporal logic, are used to capture and verify causal and temporal aspects of the designs in a unified manner.
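The standard two-phase commit protocol that the thesis extends can be sketched in simplified form: a coordinator collects votes in phase one and broadcasts the global decision in phase two. Timeouts, recovery, and the real-time and fault-tolerance extensions developed in the thesis are deliberately omitted:

```python
from enum import Enum

# Hedged sketch of the baseline two-phase commit protocol (no timeouts,
# no recovery, no real-time extensions): any single ABORT vote in
# phase 1 forces a global abort in phase 2.
class Vote(Enum):
    COMMIT = "commit"
    ABORT = "abort"

class Participant:
    def __init__(self, can_commit=True):
        self.can_commit = can_commit
        self.state = "working"

    def prepare(self):                  # phase 1: respond to the vote request
        self.state = "prepared" if self.can_commit else "aborted"
        return Vote.COMMIT if self.can_commit else Vote.ABORT

    def finish(self, decision):         # phase 2: apply the global decision
        self.state = "committed" if decision is Vote.COMMIT else "aborted"

def two_phase_commit(participants):
    votes = [p.prepare() for p in participants]                       # phase 1
    decision = Vote.COMMIT if all(v is Vote.COMMIT for v in votes) else Vote.ABORT
    for p in participants:                                            # phase 2
        p.finish(decision)
    return decision

group = [Participant(), Participant(), Participant(can_commit=False)]
print(two_phase_commit(group).value)   # prints "abort": one dissenting vote aborts all
```

The blocking behaviour of this baseline (a prepared participant must wait for the coordinator's decision) is precisely what motivates the fault-tolerant extensions and their verification via Petri net analysis.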
Abstract:
This thesis studied the effect of (i) the number of grating components and (ii) parameter randomisation on root-mean-square (r.m.s.) contrast sensitivity and spatial integration. The effectiveness of spatial integration without external spatial noise depended on the number of equally spaced orientation components in the sum of gratings. The critical area marking the saturation of spatial integration was found to decrease when the number of components increased from 1 to 5-6, but increased again at 8-16 components. The critical area behaved similarly as a function of the number of grating components when stimuli consisted of 3, 6 or 16 components with different orientations and/or phases embedded in spatial noise. Spatial integration seemed to depend on the global Fourier structure of the stimulus. Spatial integration was similar for sums of two vertical cosine or sine gratings with various Michelson contrasts in noise. The critical area for a grating sum was found to be a sum of logarithmic critical areas for the component gratings, weighted by their relative Michelson contrasts. The human visual system was modelled as a simple image processor in which the visual stimulus is first low-pass filtered by the optical modulation transfer function of the human eye and then high-pass filtered, up to the spatial cut-off frequency determined by the lowest neural sampling density, by the neural modulation transfer function of the visual pathways. Internal noise is then added before signal interpretation occurs in the brain. Detection is mediated by a local, spatially windowed matched filter. The model was extended to include complex stimuli and its applicability to the data was found to be successful. The shape of the spatial integration function was similar for non-randomised and randomised simple and complex gratings. However, orientation and/or phase randomisation reduced r.m.s. contrast sensitivity by a factor of 2.
The effect of parameter randomisation on spatial integration was modelled under the assumption that human observers change the observer strategy from cross-correlation (i.e., a matched filter) to auto-correlation detection when uncertainty is introduced to the task. The model described the data accurately.
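The strategy change in the model can be illustrated with a one-dimensional toy simulation: a cross-correlation (matched filter) detector performs well when the grating's phase is known, while under phase randomisation an auto-correlation (energy) statistic still detects the signal but the matched filter falls to chance. Stimulus size, contrast and trial counts below are illustrative assumptions:

```python
import numpy as np

# Hedged toy sketch of the two observer strategies in the model:
# cross-correlation with a fixed template versus an auto-correlation
# (energy) statistic, compared in a two-alternative forced-choice task.
rng = np.random.default_rng(2)
n, trials, contrast = 128, 2000, 0.5
x = np.arange(n)

def run(detector, randomise_phase):
    hits = 0
    for _ in range(trials):
        phase = rng.uniform(0, 2 * np.pi) if randomise_phase else 0.0
        signal = contrast * np.sin(2 * np.pi * 8 * x / n + phase)
        a, b = signal + rng.standard_normal(n), rng.standard_normal(n)
        hits += detector(a) > detector(b)     # 2AFC: choose the larger statistic
    return hits / trials

template = np.sin(2 * np.pi * 8 * x / n)
matched = lambda s: s @ template              # cross-correlation detector
energy = lambda s: s @ s                      # auto-correlation at zero lag

print(run(matched, randomise_phase=False))    # known phase: well above chance
print(run(energy, randomise_phase=True))      # random phase: energy still detects
print(run(matched, randomise_phase=True))     # random phase: near chance (0.5)
```

The drop from the matched-filter to the energy detector's proportion correct is the simulation analogue of the factor-of-2 sensitivity loss the thesis attributes to parameter randomisation.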
Abstract:
The Roma population has become a highly debated policy issue in the European Union (EU). The EU acknowledges that this ethnic minority faces extreme poverty and complex social and economic problems: 52% of the Roma population live in extreme poverty, 75% in poverty (Soros Foundation, 2007, p. 8), with a life expectancy at birth about ten years less than that of the majority population. As a result, Romania has received a great deal of policy attention and EU funding, being eligible for 19.7 billion Euros from the EU for 2007-2013. Yet progress is slow; it is debated whether Romania's government and companies were capable of using these funds (EurActiv.ro, 2012). Analysing three case studies, this research looks at policy implementation in relation to the role of Roma networks in different geographical regions of Romania. It gives insights into how to get things done in complex settings and explains responses to the Roma problem as a 'wicked' policy issue. This longitudinal research was conducted between 2008 and 2011, comprising 86 semi-structured interviews, 15 observations, and documentary sources, and using a purposive sample focused on the institutions responsible for implementing social policies for Roma: Public Health Departments, School Inspectorates, City Halls, Prefectures, and NGOs. Respondents included governmental workers, academics, Roma school mediators, Roma health mediators, Roma experts, Roma Councillors, NGO workers, and Roma service users. By triangulating the data collected with various methods and applied to various categories of respondents, a comprehensive and precise representation of Roma network practices was created. The provisions of the 2001 'Governmental Strategy to Improve the Situation of the Roma Population' facilitated the formation of a Roma network by introducing special jobs in local and central administration. In different counties, resources, people, their skills, and practices varied.
As opposed to the communist period, a new Roma elite emerged: social entrepreneurs set the pace of change by creating either closed cliques or open alliances and by using more or less transparent practices. This research deploys the concept of social/institutional entrepreneurs to analyse how key actors influence clique and alliance formation and functioning. Significantly, by contrasting three case studies, it shows that both closed cliques and open alliances help to achieve public policy network objectives, but that closed cliques can also lead to failure to improve the health and education of Roma people in a certain region.
Abstract:
The thesis contributes to the evolving process of moving the study of Complexity from the arena of metaphor to something real and operational. Acknowledging this phenomenon ultimately changes the underlying assumptions made about working environments and leadership: organisations are dynamic, and so should their leaders be. Dynamic leaders are behaviourally complex. Behavioural Complexity is a product of behavioural repertoire (the range of behaviours) and behavioural differentiation (the application by effective leaders of behaviour appropriate to the demands of the situation). Behavioural Complexity was operationalised using the Competing Values Framework (CVF). The CVF is a measure that captures the extent to which leaders demonstrate four behaviours across four quadrants: Control, Compete, Collaborate and Create, which are argued to be critical to all types of organisational leadership. The results provide evidence to suggest that Behavioural Complexity is an enabler of leadership effectiveness; that Organisational Complexity (captured using a new measure developed in the thesis) moderates the relationship between Behavioural Complexity and leadership effectiveness; and that leadership training supports Behavioural Complexity in contributing to leadership effectiveness. Most definitions of leadership come down to changing people's behaviour. Such definitions have focused leadership research on how to elicit change in others, when some of that attention should perhaps have been on eliciting change in the leaders themselves. It is hoped that this research will provoke interest in the factors that cause behavioural change in leaders, which in turn enable leadership effectiveness, and in doing so contribute to a better understanding of leadership in organisations.
Abstract:
Dementia with Lewy bodies ('Lewy body dementia' or 'diffuse Lewy body disease') (DLB) is the second most common form of dementia to affect elderly people, after Alzheimer's disease. A combination of the clinical symptoms of Alzheimer's disease and Parkinson's disease is present in DLB and the disorder is classified as a 'parkinsonian syndrome', a group of diseases which also includes Parkinson's disease, progressive supranuclear palsy, corticobasal degeneration and multiple system atrophy. Characteristics of DLB are fluctuating cognitive ability with pronounced variations in attention and alertness, recurrent visual hallucinations and spontaneous motor features, including akinesia, rigidity and tremor. In addition, DLB patients may exhibit visual signs and symptoms, including defects in eye movement, pupillary function and complex visual functions. Visual symptoms may aid the differential diagnoses of parkinsonian syndromes. Hence, the presence of visual hallucinations supports a diagnosis of Parkinson's disease or DLB rather than progressive supranuclear palsy. DLB and Parkinson's disease may exhibit similar impairments on a variety of saccadic and visual perception tasks (visual discrimination, space-motion and object-form recognition). Nevertheless, deficits in orientation, trail-making and reading the names of colours are often significantly greater in DLB than in Parkinson's disease. As primary eye-care practitioners, optometrists should be able to work with patients with DLB and their carers to manage their visual welfare.
Abstract:
This thesis begins with a review of the literature on team-based working in organisations, highlighting the variations in research findings, and the need for greater precision in our measurement of teams. It continues with an illustration of the nature and prevalence of real and pseudo team-based working, by presenting results from a large sample of secondary data from the UK National Health Service. Results demonstrate that ‘real teams’ have an important and significant impact on the reduction of many work-related safety outcomes. Based on both theoretical and methodological limitations of existing approaches, the thesis moves on to provide a clarification and extension of the ‘real team’ construct, demarcating this from other (pseudo-like) team typologies on a sliding scale, rather than a simple dichotomy. A conceptual model for defining real teams is presented, providing a theoretical basis for the development of a scale on which teams can be measured for varying extents of ‘realness’. A new twelve-item scale is developed and tested with three samples of data comprising 53 undergraduate teams, 52 postgraduate teams, and 63 public sector teams from a large UK organisation. Evidence for the content, construct and criterion-related validity of the real team scale is examined over seven separate validation studies. Theoretical, methodological and practical implications of the real team scale are then discussed.
Abstract:
The Semantic Web relies on carefully structured, well-defined data to allow machines to communicate and understand one another. In many domains (e.g. geospatial) the data being described contains some uncertainty, often due to incomplete knowledge; meaningful processing of this data requires these uncertainties to be carefully analysed and integrated into the process chain. Currently, within the Semantic Web there is no standard mechanism for the interoperable description and exchange of uncertain information, which renders the automated processing of such information implausible, particularly where error must be considered and captured as it propagates through a processing sequence. In particular we adopt a Bayesian perspective and focus on the case where the inputs and outputs are naturally treated as random variables. This paper discusses a solution to the problem in the form of the Uncertainty Markup Language (UncertML). UncertML is a conceptual model, realised as an XML schema, that allows uncertainty to be quantified in a variety of ways, i.e. realisations, statistics and probability distributions. UncertML is based upon a soft-typed XML schema design that provides a generic framework from which any statistic or distribution may be created. Making extensive use of Geography Markup Language (GML) dictionaries, UncertML provides a collection of definitions for common uncertainty types. Containing both written descriptions and mathematical functions, encoded as MathML, the definitions within these dictionaries provide a robust mechanism for defining any statistic or distribution, and can be easily extended. Universal Resource Identifiers (URIs) are used to introduce semantics to the soft-typed elements by linking to these dictionary definitions. The INTAMAP (INTeroperability and Automated MAPping) project provides a use case for UncertML.
This paper demonstrates how observation errors can be quantified using UncertML and wrapped within an Observations & Measurements (O&M) Observation. The interpolation service uses the information within these observations to influence the prediction outcome. The output uncertainties may be encoded in a variety of UncertML types, e.g. a series of marginal Gaussian distributions, a set of statistics, such as the first three marginal moments, or a set of realisations from a Monte Carlo treatment. Quantifying and propagating uncertainty in this way allows such interpolation results to be consumed by other services. This could form part of a risk management chain or a decision support system, and ultimately paves the way for complex data processing chains in the Semantic Web.
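The soft-typed, dictionary-linked encoding described above can be sketched with Python's standard XML tooling. Note that the namespace, element and attribute names below are placeholders for illustration; the normative encoding is defined by the actual UncertML schema and its GML dictionaries:

```python
import xml.etree.ElementTree as ET

# Hedged sketch of an UncertML-style soft-typed encoding of a Gaussian
# marginal. The namespace and element/attribute names are illustrative
# placeholders, NOT the real UncertML schema; only the soft-typing idea
# (a generic element pointing at a dictionary definition) is shown.
NS = "http://example.org/uncertml"          # placeholder namespace

dist = ET.Element(f"{{{NS}}}Distribution",
                  definition=f"{NS}/distributions/gaussian")
mean = ET.SubElement(dist, f"{{{NS}}}parameter", name="mean")
mean.text = "12.3"
var = ET.SubElement(dist, f"{{{NS}}}parameter", name="variance")
var.text = "2.25"

xml_text = ET.tostring(dist, encoding="unicode")
print(xml_text)

# Round trip: a consumer recovers the parameters generically, without
# needing a hard-typed element per distribution family.
parsed = ET.fromstring(xml_text)
params = {p.get("name"): float(p.text) for p in parsed}
print(params)   # {'mean': 12.3, 'variance': 2.25}
```

Because the `definition` URI, not the element name, identifies the distribution, the same generic structure can carry any statistic or distribution the dictionaries define, which is what makes the schema extensible.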