35 results for Multi-level Analysis
Abstract:
Projection of a high-dimensional dataset onto a two-dimensional space is a useful tool to visualise structures and relationships in the dataset. However, a single two-dimensional visualisation may not display all the intrinsic structure. Therefore, hierarchical/multi-level visualisation methods have been used to extract more detailed understanding of the data. Here we propose a multi-level Gaussian process latent variable model (MLGPLVM). MLGPLVM works by segmenting data (with e.g. K-means, Gaussian mixture model or interactive clustering) in the visualisation space and then fitting a visualisation model to each subset. To measure the quality of multi-level visualisation (with respect to parent and child models), metrics such as trustworthiness, continuity, mean relative rank errors, visualisation distance distortion and the negative log-likelihood per point are used. We evaluate the MLGPLVM approach on the ‘Oil Flow’ dataset and a dataset of protein electrostatic potentials for the ‘Major Histocompatibility Complex (MHC) class I’ of humans. In both cases, visual observation and the quantitative quality measures have shown better visualisation at lower levels.
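A minimal sketch of the two-stage idea described above, using scikit-learn: segment the parent two-dimensional embedding with K-means (one of the segmentation options the abstract names) and fit a child visualisation model to each subset. PCA stands in here for the GP-LVM, and the data and parameter values are illustrative assumptions, not the paper's setup.

```python
# Sketch of multi-level visualisation: cluster the parent embedding, then
# fit one child model per cluster. PCA is a stand-in for the GP-LVM.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

def multi_level_embed(X, parent_latent, n_clusters=3, seed=0):
    """X: (n, d) data; parent_latent: (n, 2) top-level visualisation."""
    labels = KMeans(n_clusters=n_clusters, random_state=seed).fit_predict(parent_latent)
    children = {}
    for k in range(n_clusters):
        idx = np.where(labels == k)[0]
        child = PCA(n_components=2).fit(X[idx])       # child model on subset
        children[k] = (idx, child.transform(X[idx]))  # lower-level embedding
    return labels, children

# Illustrative run on random data; the parent embedding is itself a PCA here.
X = np.random.default_rng(0).normal(size=(300, 12))
parent = PCA(n_components=2).fit_transform(X)
labels, children = multi_level_embed(X, parent)
```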
Abstract:
We develop an analytical methodology for optimizing phase regeneration based on phase sensitive amplification. The results demonstrate the scalability of the scheme and show the significance of simultaneous optimization of the transfer function and the signal alphabet.
Abstract:
The re-entrant flow shop scheduling problem (RFSP) is NP-hard and has attracted the attention of both researchers and industry. Current approaches attempt to minimize the makespan of the RFSP without considering the interdependency between the resource constraints and the re-entrant probability. This paper proposes a multi-level genetic algorithm (GA) that encodes the correlated re-entrant possibility and the production mode in a multi-level chromosome. A repair operator is incorporated into the algorithm to revise infeasible solutions by resolving resource conflicts, and ANOVA is used to fine-tune the GA's parameter settings, with the objective of minimizing the makespan. The experiments show that the proposed approach is more effective than simulated annealing at finding near-optimal schedules for both small-size and large-size problems. © 2013 Published by Elsevier Ltd.
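The abstract gives the GA's outline but not its encoding, so the following is only a sketch of the generic skeleton it builds on: a permutation GA for flow-shop makespan with one-point crossover and a repair step that restores a valid job permutation. The multi-level chromosome (re-entrant probability plus production mode) and the resource-conflict repair are simplified away; the processing-time data and all parameter values are assumptions.

```python
import random

def makespan(perm, proc):
    """proc[job][machine] = processing time; returns flow-shop makespan."""
    m = len(proc[0])
    finish = [0] * m                       # completion times per machine
    for j in perm:
        for k in range(m):
            start = max(finish[k], finish[k - 1] if k else 0)
            finish[k] = start + proc[j][k]
    return finish[-1]

def repair(child, n):
    """Replace duplicate genes with the missing jobs, in index order."""
    missing = [j for j in range(n) if j not in set(child)]
    seen, out = set(), []
    for g in child:
        out.append(missing.pop(0) if g in seen else g)
        seen.add(g)
    return out

def ga(proc, pop_size=30, generations=100, seed=1):
    rng = random.Random(seed)
    n = len(proc)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: makespan(p, proc))
        nxt = pop[:2]                      # elitism: keep the two best
        while len(nxt) < pop_size:
            a, b = rng.sample(pop[:10], 2) # truncation selection
            cut = rng.randrange(1, n)      # one-point crossover
            nxt.append(repair(a[:cut] + b[cut:], n))
        pop = nxt
    return min(pop, key=lambda p: makespan(p, proc))

# Tiny example: 3 jobs on 2 machines.
proc = [[3, 2], [1, 4], [2, 2]]
best = ga(proc)
print(best, makespan(best, proc))
```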
Abstract:
Nonlinearity management is explored as a complete tool to obtain maximum transmission reach in a WDM fiber transmission system, making it possible to optimize multiple system parameters, including optimal dispersion pre-compensation, with fast simulations based on the continuous-wave approximation. © 2006 Optical Society of America.
Abstract:
The purpose of this article is to investigate the ways in which multi-level actor cooperation advances national and local implementation processes of human rights norms in weak-state contexts. Examining the cases of women’s rights in Bosnia and Herzegovina and children’s rights in Bangladesh, we comparatively point to some of the advantages and disadvantages that cooperative relations between international organisations, national governments and local NGOs can entail. Whereas these multi-level actor constellations (MACs) usually initiate norm implementation processes reliably and compensate for governmental deficits, they are not always sustainable in the long run. If international organisations withdraw support from temporary missions or policy projects, local NGOs are unable to perpetuate implementation activities where state capacities have not been strengthened by MACs. Our aim is to highlight the functions of local agency within multi-level cooperation and to critically raise sustainability issues in human rights implementation, thereby supplementing norm research in International Relations.
Abstract:
The central goal of this research is to explore the approach of the Islamic banking industry to defining and implementing religious compliance at the regulatory, institutional and individual levels within the Islamic Banking and Finance (IBF) industry. It also examines the discrepancies, ambiguities and paradoxes exhibited in individual and institutional behaviour in relation to the infusion and enactment of religious exigencies into compliance processes in IBF. Through the combined lenses of institutional work and a sensemaking perspective, this research portrays the practice of infusing Islamic law into Islamic banks as ambiguous and as drifting down to the institutional and actor levels. In instances of both well-codified and non-codified regulatory frameworks for Shariah compliance, ambiguity in institutional rules and in their interpretation and enactment was found to be prevalent. Individual IBF professionals performed retrospective and prospective actions to adjust role and rule boundaries in the cases of both a Muslim and a non-Muslim country. The sensitizing concept of religious compliance is the primary theoretical contribution of this research and provides a tool for understanding what constitutes Shariah compliance and the dynamics of its implementation. It helps to explain the empirical consequences of the lack of a clear definition of Shariah compliance in the regulatory frameworks and standards available to the industry, and it addresses the calls, made in previous studies, for a clear reference on what constitutes Shariah compliance in IBF (Hayat, Butter, & Kock, 2013; Maurer, 2003, 2012; Pitluck, 2012). The methodological and theoretical perspectives of this research are distinctive in their use of multi-level analysis and of approaches that blend micro and macro perspectives of the research field, to illuminate and provide a more complete picture of the infusion and enactment of religious compliance in IBF.
Abstract:
In this concluding chapter, we bring together the threads and reflections on the chapters contained in this text and show how they relate to multi-level issues. The book has focused on the world of Human Resource Management (HRM) and the systems and practices it must put in place to foster innovation. Many of the contributions argue that in order to bring innovation about, organisations have to think carefully about the way in which they will integrate what is, in practice, organisationally relevant — but socially distributed — knowledge. They need to build a series of knowledge-intensive activities and networks, both within their own boundaries and across other important external inter-relationships. In so doing, they help to co-ordinate important information structures. They have, in effect, to find ways of enabling people to collaborate with each other at lower cost, by reducing both the costs of their co-ordination and the levels of unproductive search activity. They have to engineer these behaviours by reducing the risks for people that might be associated with incorrect ideas and help individuals, teams and business units to advance incomplete ideas that are so often difficult to codify. In short, a range of intangible assets must flow more rapidly throughout the organisation and an appropriate balance must be found between the rewards and incentives associated with creativity, novelty and innovation, versus the risks that innovation may also bring.
Abstract:
Underpinned by the resource-based view (RBV), social exchange theory (SET) and a theory of intrinsic motivation (empowerment), I proposed and tested a multi-level model that simultaneously examines the intermediate linkages or mechanisms through which high-performance work systems (HPWS) impact individual and organizational performance. First, underpinned by the RBV, I examined, at the unit level, collective human capital and competitive advantage as pathways through which the use of HPWS influences branch market performance. Second, underpinned by social exchange (perceived organizational support) and intrinsic motivation (psychological empowerment) theories, I examined cross-level and individual-level mechanisms through which experienced HPWS may influence employee performance. I tested the propositions of this study with multisource data obtained from junior and senior customer contact employees, and managers, of 37 branches of two banks in Ghana. Results of the structural equation modeling (SEM) analysis revealed that collective human capital partially mediated the relationship between management-rated HPWS and competitive advantage, while competitive advantage completely mediated the influence of human capital on branch market performance; consequently, management-rated HPWS influenced branch market performance indirectly through collective human capital and competitive advantage. Additionally, results of hierarchical linear modeling (HLM) tests of the cross-level influences on the motivational implications of HPWS revealed that (i) management-rated HPWS influenced experienced HPWS; (ii) perceived organizational support (POS) and psychological empowerment fully mediated the influence of experienced HPWS on service-oriented organizational citizenship behaviour (OCB); and (iii) service-oriented OCB mediated the influence of psychological empowerment and POS on service quality and task performance. I discuss the theoretical and practical implications of these findings.
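As a concrete illustration of the HLM step described above, the sketch below fits a random-intercept model with employees (level 1) nested in branches (level 2) using statsmodels. The dataset and column names (branch, ocb, hpws_exp, pos, empower) are hypothetical stand-ins for the study's measures, and a full mediation test would require additional models.

```python
# Hedged sketch of a cross-level HLM; all names and the CSV are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("branch_survey.csv")          # one row per employee
model = smf.mixedlm(
    "ocb ~ hpws_exp + pos + empower",          # level-1 fixed effects
    data=df,
    groups=df["branch"],                       # random intercept per branch
)
result = model.fit()
print(result.summary())
```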
Abstract:
The present work describes the development of a proton induced X-ray emission (PIXE) analysis system, especially designed and built for routine quantitative multi-elemental analysis of a large number of samples. The historical and general developments of the analytical technique and the physical processes involved are discussed. The philosophy, design, constructional details and evaluation of a versatile vacuum chamber, an automatic multi-sample changer, an on-demand beam pulsing system and an ion beam current monitoring facility are described. The system calibration, using thin standard foils of Si, P, S, Cl, K, Ca, Ti, V, Fe, Cu, Ga, Ge, Rb, Y and Mo, was undertaken at proton beam energies of 1 to 3 MeV in steps of 0.5 MeV and compared with theoretical calculations. An independent calibration check using bovine liver Standard Reference Material was performed. The minimum detectable limits have been experimentally determined at detector positions of 90° and 135° with respect to the incident beam, for the above range of proton energies, as a function of atomic number Z. The system has detection limits of typically 10⁻⁷ to 10⁻⁹ g for elements 14
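The abstract does not give the formula behind the quoted detection limits; a common PIXE convention, assumed here purely for illustration, is the 3σ criterion: the smallest mass whose peak exceeds three standard deviations of the background beneath it.

```python
# Illustrative 3-sigma minimum detectable limit; the formula and numbers
# are assumptions, not values taken from the thesis.
import math

def mdl_grams(background_counts, counts_per_gram):
    """Smallest mass whose peak clears 3x the background noise."""
    return 3.0 * math.sqrt(background_counts) / counts_per_gram

# Hypothetical values: 1e4 background counts and 3e11 counts/g sensitivity
# give an MDL of about 1e-9 g, the bottom of the quoted range.
print(f"MDL = {mdl_grams(1e4, 3e11):.1e} g")
```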
Abstract:
This doctoral thesis responds to the need for greater understanding of small businesses and their inherently unique problem-types. Integral to the investigation is the theme that, for governments to influence small business effectively, a sound understanding of the factors they are seeking to influence is essential. Moreover, recognising the many shortcomings in management research, and in particular that the research methods and approaches adopted often fail to give adequate understanding of the issues under study, the study attempts to develop an innovative and creative research approach. The aim is thus to produce not only advances in small business management knowledge from the standpoints of government policy makers and the ‘recipient’ small business, but also insights into potential future research methods for the continued development of that knowledge. The origins of the methodology lie in the non-acceptance of traditional philosophical positions in epistemology and ontology, with a philosophical standpoint of internal realism underpinning the research. Internal realism presents the basis for the potential co-existence of qualitative and quantitative research strategies and underlines the crucial contributory role of research method in providing the ultimate factual status of the assertions of research findings. The concept of epistemological bootstrapping is thus used to develop a ‘partial’ research framework to foothold the case study research, thereby avoiding the limitations of objectivism and brute inductivism. The major insights and issues highlighted by the ‘bootstrap’ guide the researcher around the participant case studies. A novel contextualist (linked multi-level and processual) analysis was attempted in the major in-depth case study, with two further cases playing a supporting role and contributing to a balanced emphasis of empirical research within the context of the time constraints inherent in part-time research.
Abstract:
The absence of a definitive approach to the design of manufacturing systems signifies the importance of a control mechanism to ensure the timely application of relevant design techniques. To provide effective control, design development needs to be continually assessed in relation to the required system performance, which can only be achieved analytically through computer simulation, the only technique capable of accurately replicating the highly complex and dynamic interrelationships inherent within manufacturing facilities and realistically predicting system behaviour. Owing to these unique capabilities, the application of computer simulation should support and encourage a thorough investigation of all alternative designs, allowing attention to focus specifically on critical design areas and enabling continuous assessment of system evolution. To achieve this, system analysis needs to be efficient in terms of data requirements and both speed and accuracy of evaluation. A hierarchical or multi-level modelling procedure has therefore been developed to provide an effective control mechanism, specifying the appropriate degree of evaluation support necessary at each phase of design. An underlying assumption of the proposal is that evaluation is quick, easy and allows models to expand in line with design developments. However, current approaches to computer simulation are totally inappropriate for supporting this hierarchical evaluation. Implementation of computer simulation through traditional approaches is typically characterized by a requirement for very specialist expertise, a lengthy model development phase and a correspondingly high expenditure, resulting in very little, and rather inappropriate, use of the technique. Simulation, when used, is generally only applied to check or verify a final design proposal; rarely is its full potential utilized to aid, support or complement the manufacturing system design procedure. To implement the proposed modelling procedure, the concept of a generic simulator was therefore adopted, as such systems require no specialist expertise, instead facilitating quick and easy model creation, execution and modification through simple data inputs. Previously, generic simulators have tended to be too restricted, lacking the flexibility to be generally applicable to manufacturing systems. Development of the ATOMS manufacturing simulator, however, has proven that such systems can be relevant to a wide range of applications, besides verifying the benefits of multi-level modelling.
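The generic-simulator idea, building a model from simple data inputs rather than bespoke code so it can be created and modified quickly as a design evolves, can be sketched as follows. ATOMS itself is not publicly available, so SimPy stands in, and the routing and timing data are invented for illustration.

```python
# Data-driven simulation sketch: the model is defined entirely by the
# ROUTING table, so changing the design means editing data, not code.
import simpy

ROUTING = {"part_a": [("mill", 4.0), ("drill", 2.0)],
           "part_b": [("drill", 3.0), ("mill", 1.5)]}

def job(env, name, route, machines):
    for station, minutes in route:
        with machines[station].request() as req:
            yield req                      # queue for the machine
            yield env.timeout(minutes)     # process the operation
    print(f"{name} done at t={env.now}")

env = simpy.Environment()
machines = {m: simpy.Resource(env, capacity=1) for m in ("mill", "drill")}
for i, part in enumerate(["part_a", "part_b", "part_a"]):
    env.process(job(env, f"{part}#{i}", ROUTING[part], machines))
env.run()
```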
Abstract:
This thesis is based upon a case study of the adoption of digital, electronic, microprocessor-based control systems by Albright & Wilson Limited, a UK chemical producer. It offers an explanation of the company's changing technology policy between 1978 and 1981 by examining its past development, internal features and industrial environment. Part One of the thesis gives an industry-level analysis which relates the development of process control technology to changes in the economic requirements of production. The rapid diffusion of microcomputers and other microelectronic equipment in the chemical industry is found to be a response to the general need to raise the efficiency of all processes, imposed by the economic recession following 1973. Part Two examines the impact of these technical and economic changes upon Albright & Wilson Limited. The company's slowness in adopting new control technology is explained by its long history, in which trends are identified that produced the conservatism of the 1970s. By contrast, a study of Tenneco Incorporated, a much more successful adopter of automating technology, is offered, with an analysis of the new policy of adopting such equipment which Tenneco imposed upon Albright & Wilson following the latter's takeover in 1978. Some indications of the consequences of this new policy of widespread adoption of microprocessor-based control equipment are derived from a study of the first Albright & Wilson plant to use such equipment. The thesis concludes that companies which fail to adopt the new control technology rapidly may not survive in the recessionary environment, that long-established British companies may lack the flexibility to make such necessary changes, and that multi-national companies may have an important role in the planned transfer and adoption of new production technology through their subsidiaries in the UK.
Abstract:
Magnetoencephalography (MEG), a non-invasive technique for characterizing brain electrical activity, is gaining popularity as a tool for assessing group-level differences between experimental conditions. One method for assessing task-condition effects involves beamforming, where a weighted sum of field measurements is used to estimate activity on a voxel-by-voxel basis. However, this method has been shown to produce inhomogeneous smoothness differences as a function of signal-to-noise across a volumetric image, which can then produce false positives at the group level. Here we describe a novel method for group-level analysis with MEG beamformer images that utilizes the peak locations within each participant's volumetric image to assess group-level effects. We compared our peak-clustering algorithm with SnPM using simulated data and found that our method was immune to the artefactual group effects that can arise from inhomogeneous smoothness differences across a volumetric image. We also applied our peak-clustering algorithm to experimental data and identified regions corresponding to task-related regions reported in the literature. These findings suggest that our technique is a robust method for group-level analysis with MEG beamformer images.
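The weighted sum the abstract refers to can be made concrete with the textbook LCMV beamformer, shown below as a minimal sketch. This is the standard formulation rather than the paper's exact pipeline, and the sensor data and leadfield are random stand-ins.

```python
# LCMV beamformer sketch: a weighted sum of sensor data estimates source
# activity at one voxel, with weights w = C^-1 L / (L^T C^-1 L).
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(64, 1000))        # sensors x time samples (stand-in)
C = np.cov(data)                          # sensor covariance
L = rng.normal(size=(64, 1))              # leadfield for one voxel (assumed)

Cinv_L = np.linalg.solve(C, L)
w = Cinv_L / (L.T @ Cinv_L)               # unit-gain LCMV weights
source = w.T @ data                       # (1, time) voxel time course
power = source.var()                      # basis of the volumetric image
```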
Abstract:
Purpose: To investigate the coexistence of ocular microvascular and systemic macrovascular abnormalities in early-stage, newly diagnosed and previously untreated patients with normal tension glaucoma (NTG). Methods: Retinal vascular reactivity to flickering light was assessed in 19 NTG patients and 28 age-matched controls by means of dynamic retinal vessel analysis (IMEDOS GmbH, Jena, Germany). Using a newly developed computational model, the entire dynamic vascular response profile to flicker light was imaged and used for analysis. In addition, assessments of carotid intima-media thickness (IMT) and pulse wave analysis (PWA) were conducted on all participants, along with blood pressure (BP) measurements and blood analyses for lipid metabolism markers. Results: Patients with NTG demonstrated increased right and left carotid IMT (p = 0.015, p = 0.045) and an elevated PWA augmentation index (p = 0.017) in comparison with healthy controls, along with an enhanced retinal arterial constriction response (p = 0.028), a steeper retinal arterial constriction slope (p = 0.031) and a reduced retinal venous dilation response (p = 0.026) following flicker light stimulation. Conclusions: Early-stage, newly diagnosed NTG patients showed signs of subclinical vascular abnormalities at both the macro- and micro-vascular levels, highlighting the need to consider multi-level circulation-related pathologies in the development and progression of this type of glaucoma.