14 results for new in ILL units
in Aston University Research Archive
Abstract:
During the 1970s and 1980s, close linkages were established between unionists in Volkswagen's Uitenhage plant in South Africa and Wolfsburg in Germany. The ensuing relationship resulted in trade union internationalism and solidarity with South African workers in their struggle against apartheid. After the insertion of the South African plant into the global production networks of the company, a range of new pressures and challenges confronted the union in South Africa. This resulted in a mass dismissal in 2000. In an attempt to garner international support and solidarity, the dismissed workers tapped into existing structures, with no success, illustrating the reconfiguration of trade union internationalism away from worker interests to those of the unions and company. © 2010 UALE.
Abstract:
This research concerns the development of coordination and co-governance within three different regeneration programmes in one Midlands city over the period from 1999 to 2002. The New Labour government, in office since 1997, had an agenda for ‘joining-up’ government, part of which has had considerable impact in the area of regeneration policy. Joining-up government encompasses a set of related activities which can include the coordination of policy-making and service delivery. In regeneration, it also includes a commitment to operate through co-governance. Central government and local and regional organisations have sought to put this idea into practice by using what may be referred to as network management processes. Many characteristics of new policies are designed to address the management of networks. Network management is not new in this area; it has been developing at least since the early 1990s, with the City Challenge and Single Regeneration Budget (SRB) programmes, as a way of encouraging more inclusive and effective regeneration interventions. Network management theory suggests that better management can improve decision-making outcomes in complex networks. The theories and concepts are utilised in three case studies as a way of understanding how and why regeneration attempts demonstrate real advances in inter-organisational working at certain times whilst faltering at others. Current cases are compared to the historical case of the original SRB programme as a method of assessing change. The findings suggest that:
- The use of network management can be identified at all levels of governance. As previous literature has highlighted, central government is the most important actor regarding network structuring; however, network structuring and game management are both practised by central and local actors.
- All three of the theoretical perspectives within network management (instrumental, institutional and interactive) have been identified within UK regeneration networks. All may have a role to play, with no single perspective likely to succeed on its own; all could therefore make an important contribution to the understanding of how groups can be brought together to work jointly.
- The findings support Klijn’s (1997) assertion that the institutional perspective is dominant for understanding network management processes.
- Instrumentalism continues on all sides, as the acquisition of resources remains the major driver for partnership activity.
- The level of interaction appears to be low despite the intentions for interactive decision-making.
- Overall, network management remains partial. Little attention is paid to the issues of accountability or to the institutional structures which can prevent networks from implementing the policies designed by central government and/or the regional tier.
Abstract:
Modern business trends such as agile manufacturing and virtual corporations require high levels of flexibility and responsiveness to consumer demand, and require the ability to quickly and efficiently select trading partners. Automated computational techniques for supply chain formation have the potential to provide significant advantages in terms of speed and efficiency over the traditional manual approach to partner selection. Automated supply chain formation is the process of determining the participants within a supply chain and the terms of the exchanges made between these participants. In this thesis we present an automated technique for supply chain formation based upon the min-sum loopy belief propagation algorithm (LBP). LBP is a decentralised and distributed message-passing algorithm which allows participants to share their beliefs about the optimal structure of the supply chain based upon their costs, capabilities and requirements. We propose a novel framework for the application of LBP to the existing state-of-the-art case of the decentralised supply chain formation problem, and extend this framework to allow for application to further novel and established problem cases. Specifically, the contributions made by this thesis are:
• A novel framework to allow for the application of LBP to the decentralised supply chain formation scenario investigated using the current state-of-the-art approach. Our experimental analysis indicates that LBP is able to match or outperform this approach for the vast majority of problem instances tested.
• A new solution goal for supply chain formation in which economically motivated producers aim to maximise their profits by intelligently altering their profit margins. We propose a rational pricing strategy that allows producers to earn significantly greater profits than a comparable LBP-based profit-making approach.
• An LBP-based framework which allows the algorithm to be used to solve supply chain formation problems in which goods are exchanged in multiple units, a first for a fully decentralised technique. As well as multiple-unit exchanges, we also model realistic constraints in this scenario, such as factory capacities and input-to-output ratios. LBP continues to match or outperform an extended version of the existing state-of-the-art approach in this scenario.
• Introduction of a dynamic supply chain formation scenario in which participants are able to alter their properties or to enter or leave the process at any time. Our results suggest that LBP deals easily with individual occurrences of these alterations and that performance degrades gracefully when they occur in larger numbers.
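The min-sum recursion underlying this family of techniques can be illustrated with a minimal, centralised sketch on a chain-structured cost graph. The thesis technique is a decentralised message-passing scheme over richer network structures; the chain topology, the `min_sum_chain` name and the cost tables below are illustrative assumptions, not the thesis model.

```python
def min_sum_chain(unary, pairwise):
    """Exact min-cost assignment on a chain via forward min-sum messages.

    unary[i][s]       -- cost of tier i choosing provider s
    pairwise[i][s][t] -- cost of tiers i and i+1 choosing s and t
    """
    n = len(unary)
    msg = [0.0] * len(unary[0])   # msg[s]: best cost of earlier tiers, given tier i = s
    back = []                     # backpointers for decoding the argmin
    for i in range(n - 1):
        new_msg, ptrs = [], []
        for t in range(len(unary[i + 1])):
            # minimise over the previous tier's choices, as in min-sum BP
            costs = [msg[s] + unary[i][s] + pairwise[i][s][t]
                     for s in range(len(unary[i]))]
            best = min(range(len(costs)), key=costs.__getitem__)
            new_msg.append(costs[best])
            ptrs.append(best)
        msg, back = new_msg, back + [ptrs]
    # terminate at the last tier, then backtrack to recover the assignment
    total = [msg[t] + unary[-1][t] for t in range(len(unary[-1]))]
    t = min(range(len(total)), key=total.__getitem__)
    best_cost, states = total[t], [t]
    for ptrs in reversed(back):
        t = ptrs[t]
        states.append(t)
    return best_cost, states[::-1]
```

On a tree-structured graph, messages of this form converge to the exact optimum; "loopy" BP applies the same updates on graphs with cycles, where convergence is no longer guaranteed but solutions are often strong in practice.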
Abstract:
Supply chain formation (SCF) is the process of determining the set of participants and exchange relationships within a network with the goal of setting up a supply chain that meets some predefined social objective. Many proposed solutions for the SCF problem rely on centralized computation, which presents a single point of failure and can also lead to problems with scalability. Decentralized techniques that aid supply chain emergence offer a more robust and scalable approach by allowing participants to deliberate between themselves about the structure of the optimal supply chain. Current decentralized supply chain emergence mechanisms are only able to deal with simplistic scenarios in which goods are produced and traded in single units only, and without taking into account production capacities or input-output ratios other than 1:1. In this paper, we demonstrate the performance of a graphical inference technique, max-sum loopy belief propagation (LBP), in a complex multiunit supply chain emergence scenario which models additional constraints such as production capacities and input-to-output ratios. We also provide results demonstrating the performance of LBP in dynamic environments, where the properties and composition of participants are altered as the algorithm is running. Our results suggest that max-sum LBP produces consistently strong solutions on a variety of network structures in a multiunit problem scenario, and that performance tends not to be affected by on-the-fly changes to the properties or composition of participants.
Abstract:
Since publication of the first edition, huge developments have taken place in sensory biology research, and new insights have been provided in particular by molecular biology. These show the similarities in the molecular architecture and in the physiology of sensory cells across species and across sensory modality, and often indicate a common ancestry dating back over half a billion years. Biology of Sensory Systems has thus been completely revised and takes a molecular, evolutionary and comparative approach, providing an overview of sensory systems in vertebrates, invertebrates and prokaryotes, with a strong focus on human senses. Written by a renowned author with extensive teaching experience, the book covers, in six parts, the general features of sensory systems; the mechanosenses; the chemosenses; the senses which detect electromagnetic radiation; other sensory systems, including pain, thermosensitivity and some of the minority senses; and, finally, an outline and discussion of philosophical implications. New in this edition:
- Greater emphasis on molecular biology and intracellular mechanisms
- New chapter on genomics and sensory systems
- Sections on TRP channels, synaptic transmission, evolution of nervous systems, arachnid mechanosensitive sensilla and photoreceptors, electroreception in the Monotremata, language and the FOXP2 gene, mirror neurons and the molecular biology of pain
- Updated passages on human olfaction and gustation
Over four hundred illustrations, boxes containing supplementary material and self-assessment questions, and a full bibliography at the end of each part make Biology of Sensory Systems essential reading for undergraduate students of biology, zoology, animal physiology, neuroscience, anatomy and physiological psychology. The book is also suitable for postgraduate students in more specialised courses such as vision sciences, optometry, neurophysiology, neuropathology and developmental biology.
The transformational implementation of JSD process specifications via finite automata representation
Abstract:
Conventional structured methods of software engineering are often based on the use of functional decomposition coupled with the Waterfall development process model. This approach is argued to be inadequate for coping with the evolutionary nature of large software systems. Alternative development paradigms, including the operational paradigm and the transformational paradigm, have been proposed to address the inadequacies of this conventional view of software development, and these are reviewed. JSD is presented as an example of an operational approach to software engineering, and is contrasted with other well-documented examples. The thesis shows how aspects of JSD can be characterised with reference to formal language theory and automata theory. In particular, it is noted that Jackson structure diagrams are equivalent to regular expressions and can be thought of as specifying corresponding finite automata. The thesis discusses the automatic transformation of structure diagrams into finite automata using an algorithm adapted from compiler theory, and then extends the technique to deal with areas of JSD which are not strictly formalisable in terms of regular languages. In particular, an elegant and novel method for dealing with so-called recognition (or parsing) difficulties is described. Various applications of the extended technique are described. They include a new method of automatically implementing the dismemberment transformation; an efficient way of implementing inversion in languages lacking a goto statement; and a new in-the-large implementation strategy.
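The diagram-to-automaton equivalence the thesis builds on can be sketched concretely: Jackson structure diagrams are composed of sequence, selection and iteration, which are exactly the regular operators, so each diagram denotes a regular expression and hence a finite automaton. The Thompson-style construction below is a textbook compiler-theory illustration of that mapping, not the thesis's algorithm; the AST encoding and the action names are assumptions made for the example.

```python
def build_nfa(ast, trans=None, counter=None):
    """Build an epsilon-NFA (start, accept, transitions) from a small AST.

    ast is a leaf action name (str), or one of:
      ('seq', a, b)   -- sequence
      ('sel', a, ...) -- selection among branches
      ('itr', a)      -- iteration (zero or more repetitions)
    """
    if trans is None:
        trans, counter = [], [0]
    def new_state():
        counter[0] += 1
        return counter[0] - 1
    if isinstance(ast, str):                      # a single action
        s, t = new_state(), new_state()
        trans.append((s, ast, t))
        return s, t, trans
    if ast[0] == 'seq':                           # run a, then b
        s1, t1, _ = build_nfa(ast[1], trans, counter)
        s2, t2, _ = build_nfa(ast[2], trans, counter)
        trans.append((t1, None, s2))              # epsilon link
        return s1, t2, trans
    if ast[0] == 'sel':                           # choose one branch
        s, t = new_state(), new_state()
        for branch in ast[1:]:
            bs, bt, _ = build_nfa(branch, trans, counter)
            trans.append((s, None, bs))
            trans.append((bt, None, t))
        return s, t, trans
    if ast[0] == 'itr':                           # zero or more repetitions
        s, t = new_state(), new_state()
        bs, bt, _ = build_nfa(ast[1], trans, counter)
        trans.extend([(s, None, bs), (bt, None, t),
                      (s, None, t), (bt, None, bs)])
        return s, t, trans
    raise ValueError(ast)

def accepts(nfa, actions):
    """Simulate the NFA on a sequence of actions."""
    start, accept, trans = nfa
    def closure(states):                          # epsilon closure
        stack, seen = list(states), set(states)
        while stack:
            u = stack.pop()
            for (a, lab, b) in trans:
                if a == u and lab is None and b not in seen:
                    seen.add(b)
                    stack.append(b)
        return seen
    current = closure({start})
    for act in actions:
        current = closure({b for (a, lab, b) in trans
                           if a in current and lab == act})
    return accept in current

# e.g. a process life history: open, then zero or more (read or write), then close
proc = ('seq', ('seq', 'open', ('itr', ('sel', 'read', 'write'))), 'close')
```

Here `accepts(build_nfa(proc), ['open', 'read', 'close'])` checks whether an observed action sequence is a valid life history of the process; a subset construction would convert the NFA into the deterministic automaton an implementation would actually run.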
Abstract:
Hospital employees who work in an environment with zero tolerance for error face several stressors that may result in psychological, physiological, and behavioural strains and, subsequently, in suboptimal performance. This thesis includes two studies which investigate the stressor-to-strain-to-performance relationships in hospitals. The first study is a cross-sectional, multi-group investigation based on secondary data from 65,142 respondents in 172 acute/specialist UK NHS trusts. This model proposes that senior management leadership predicts social support and job design which, in turn, moderate stressors-to-strains across team structure. The results confirm the model's robustness. Regression analysis provides support for main effects and minimal support for moderation hypotheses. Based on its conclusions and inherent limitations, study one therefore lays the framework for study two. The second study is a cross-sectional, multilevel investigation of the strain-reducing effects of social environment on externally-rated unit-level performance, based on primary data from 1,137 employees in 136 units in a hospital in Malta. The term "social environment" refers to the prediction of the moderator variables, that is, social support and decision latitude/control, by transformational leadership and team climate across hospital units. This study demonstrates that transformational leadership is positively associated with social support, whereas team climate is positively associated with both moderators. At the same time, it identifies a number of moderating effects which social support and decision latitude/control, both separately and together, had on specific stressor-to-strain relationships. The results show significant mediated stressor-to-strain-to-performance relationships.
Furthermore, at the higher level, unit-level performance is positively associated with shared unit-level team climate and with unit-level vision, the latter being one of the five sub-dimensions of transformational leadership. At the same time, performance is also positively related to both transformational leadership and team climate when the two constructs are tested together. Few studies have linked the buffering effects of the social environment in occupational stress with performance. This research therefore strives to make a significant contribution to the occupational stress and performance literature, with a focus on hospital practice. Indeed, the study highlights the wide-ranging and far-reaching implications that these findings provide for theory, management, and practice.
Abstract:
With the rebirth of coherent detection, various algorithms have come forth to alleviate phase noise, one of the main impairments for coherent receivers. These algorithms provide stable compensation; however, they limit the DSP. With this key issue in mind, Fabry-Perot filter based self-coherent optical OFDM was analyzed, which does not require phase noise compensation, reducing the complexity of the DSP at low OSNR. However, the performance of such a receiver is limited by ASE noise at the carrier wavelength, especially since an optical amplifier is typically employed with the filter to ensure sufficient carrier power. Subsequently, the use of an injection-locked laser (ILL) to retrieve the frequency and phase information from the extracted carrier without the use of an amplifier was recently proposed. In an ILL-based system, an optical carrier is sent along with the OFDM signal at the transmitter. At the receiver, the carrier is extracted from the OFDM signal using a Fabry-Perot tunable filter, and an ILL is used to significantly amplify the carrier and reduce intensity and phase noise. In contrast to CO-OFDM, such a system supports low-cost broad-linewidth lasers and benefits from lower complexity in the DSP, as no carrier frequency estimation and correction or phase noise compensation is required.
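Why co-transmitting the carrier removes phase-noise compensation from the DSP can be seen in a toy numerical sketch: the data band and the co-propagating carrier acquire the same laser phase noise (modelled here as a Wiener process), so mixing the received signal with the extracted carrier cancels it exactly. This deliberately omits ASE noise, filter leakage and ILL dynamics, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
# random QPSK data symbols on the OFDM band
bits = rng.integers(0, 4, n)
symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))
# Wiener-process phase noise from a broad-linewidth laser
phase = np.cumsum(rng.normal(0.0, 0.05, n))
signal_rx = symbols * np.exp(1j * phase)     # received data band
carrier_rx = np.exp(1j * phase)              # extracted co-propagating carrier
recovered = signal_rx * np.conj(carrier_rx)  # self-coherent mixing cancels the phase noise
```

Because the cancellation happens in the mixing step itself, no carrier-frequency estimation or phase-tracking stage is needed in this idealised model; in a real receiver the ILL's role is to deliver a clean, amplified copy of `carrier_rx` without adding ASE.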
Abstract:
The use of arm's-length bodies to deliver certain services, to regulate certain sectors or to assume responsibility for particularly salient political issues is neither new in historical terms nor a feature unique to the UK in comparative terms. What is particularly distinctive, however, is the Coalition Government's attempt since 2010 to reduce the number of ‘quangos’ while also strengthening the capacity of the core executive and sponsor departments to control and co-ordinate this dense and fragmented sphere of delegated governance. Drawing upon the findings of the first research project to analyse the current Public Bodies Reform Agenda, this article provides an account of the ‘filling-in’ of the ‘hollowing out’. It argues that, when viewed through a historical lens, the Coalition Government has adopted a distinctive approach to ‘the quango problem’.
Abstract:
Service innovations in retailing have the potential to benefit consumers as well as retailers. Incorporation of technology within physical stores affords opportunities for the retailer to reduce costs while enhancing the service provided to consumers. This research models key factors associated with the trial and continuous use of a specific self-service technology (SST) in the retail context, the personal shopping assistant (PSA), and estimates retailer benefits from implementing that innovation. In so doing, the study contributes to the nascent area of research on SSTs in the retail sector. Based on theoretical insights from prior SST studies, the diffusion of innovation literature, and the technology acceptance model (TAM), this study develops specific hypotheses regarding (1) the antecedent effects of technological anxiety, novelty seeking, market mavenism, and trust in the retailer on trial of the service innovation; (2) the effects of ease of use, perceived waiting time, and need for interaction on continuous use of the innovation; and (3) the effect of use of the innovation on consumer spending at the store. The hypotheses were tested on a sample of 104 actual users of the PSA and 345 nonusers who shopped at the retail store offering the PSA device, one of the early adopters of the PSA in Germany. Data were analyzed using logistic regression (antecedents of trial), multiple regression (antecedents of continuous use), and propensity score matching (assessing retailer benefits). Results indicate that factors affecting initial trial differ from those affecting continuous use. More specifically, consumers' trust toward the retailer, novelty seeking, and market mavenism are positively related to trial, while technology anxiety hinders the likelihood of trying the PSA. Perceived ease of use of the device positively impacts continuous use, while consumers' need for interaction in shopping environments reduces the likelihood of continuous use. Importantly, there is evidence of retailer benefits from introducing the innovation, since consumers using the PSA tend to spend more during each shopping trip. However, given the high costs of technology, the payback period for recovery of investments in the innovation depends largely upon continued use of the innovation by consumers. Important implications are provided for retailers considering investments in new in-store service innovations. The study contributes to the literature through its (1) simultaneous examination of antecedents of trial and continuous usage of a specific SST, (2) demonstration of the economic benefits of SST introduction for the retailer, and (3) contribution to the stream of research on service innovation, as against product innovation.
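The propensity-score-matching step used to assess retailer benefits can be sketched as follows. This is a hypothetical nearest-neighbour matcher on precomputed scores; in the study, the scores themselves would come from a logistic regression on the trial antecedents, and all numbers below are invented for illustration.

```python
def psm_effect(users, nonusers):
    """Estimate the spending effect of PSA use by nearest-neighbour matching.

    users / nonusers: lists of (propensity_score, spend_per_trip).
    Each user is paired with the nonuser whose score is closest, and the
    average spending difference over matched pairs is returned.
    """
    diffs = []
    for score, spend in users:
        _, matched_spend = min(nonusers, key=lambda p: abs(p[0] - score))
        diffs.append(spend - matched_spend)
    return sum(diffs) / len(diffs)

# invented example data: (propensity score, spend per trip)
users = [(0.62, 58.0), (0.71, 64.0), (0.55, 49.0)]
nonusers = [(0.60, 51.0), (0.70, 57.0), (0.40, 45.0), (0.56, 50.0)]
effect = psm_effect(users, nonusers)
```

Matching on the propensity score compares each PSA user with a nonuser who was similarly likely to try the device, so the spending difference is less confounded by self-selection than a raw user/nonuser comparison.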
Abstract:
With the rebirth of coherent detection, various algorithms have come forth to alleviate phase noise, one of the main impairments for coherent receivers. These algorithms provide stable compensation; however, they limit the DSP. With this key issue in mind, Fabry-Perot filter based self-coherent optical OFDM was analyzed, which does not require phase noise compensation, reducing the complexity of the DSP at low OSNR. However, the performance of such a receiver is limited by ASE noise at the carrier wavelength, especially since an optical amplifier is typically employed with the filter to ensure sufficient carrier power. Subsequently, the use of an injection-locked laser (ILL) to retrieve the frequency and phase information from the extracted carrier without the use of an amplifier was recently proposed. In an ILL-based system, an optical carrier is sent along with the OFDM signal at the transmitter. At the receiver, the carrier is extracted from the OFDM signal using a Fabry-Perot tunable filter, and an ILL is used to significantly amplify the carrier and reduce intensity and phase noise. In contrast to CO-OFDM, such a system supports low-cost broad-linewidth lasers and benefits from lower complexity in the DSP, as no carrier frequency estimation and correction or phase noise compensation is required.
Abstract:
This paper introduces a new mathematical method for improving the discrimination power of data envelopment analysis and for completely ranking the efficient decision-making units (DMUs). A fuzzy concept is utilised. For this purpose, all DMUs are first evaluated with the CCR model. Thereafter, the resulting weights for each output are considered as fuzzy sets and are then converted to fuzzy numbers. The introduced model is a multi-objective linear model whose endpoints are the highest and lowest of the weighted values. An added advantage of the model is its ability to handle the infeasibility situation sometimes faced by previously introduced models.
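One ingredient of such an approach, summarising the CCR output weights observed across DMU evaluations as fuzzy numbers and using a defuzzified score to discriminate among units, can be sketched as follows. This is an illustrative simplification, not the paper's multi-objective model: the triangular form, the centroid defuzzification and all weight and output values below are invented assumptions.

```python
def triangular(samples):
    """Collapse a set of CCR weight values into a (low, mode, high) triple."""
    lo, hi = min(samples), max(samples)
    mode = sum(samples) / len(samples)   # crude surrogate for the mode
    return lo, mode, hi

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    lo, mode, hi = tfn
    return (lo + mode + hi) / 3.0

def fuzzy_score(outputs, tfns):
    """Weighted output score using the defuzzified weights."""
    return sum(y * centroid(t) for y, t in zip(outputs, tfns))

# hypothetical CCR weights for two outputs, one value per DMU evaluation
w1_samples = [0.20, 0.35, 0.25]
w2_samples = [0.10, 0.05, 0.15]
tfns = [triangular(w1_samples), triangular(w2_samples)]

# two hypothetical efficient DMUs and their output vectors
dmus = {"A": [10.0, 30.0], "B": [14.0, 22.0]}
ranking = sorted(dmus, key=lambda d: fuzzy_score(dmus[d], tfns), reverse=True)
```

The point of moving from a single weight vector to a fuzzy summary is that efficient DMUs, which all score 1 under their own most favourable CCR weights, can still be discriminated when judged against the whole spread of weights observed across the evaluations.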
Abstract:
DEA literature continues apace, but software has lagged behind. This session uses suitably selected data to present newly developed software which includes many of the most recent DEA models. The software enables the user to address a variety of issues not frequently found in existing DEA software, such as:
- Assessments under a variety of possible assumptions of returns to scale, including NIRS and NDRS
- Scale elasticity computations
- Numerous input/output variables and a truly unlimited number of assessment units (DMUs)
- Panel data analysis
- Analysis of categorical data (multiple categories)
- Malmquist Index and its decompositions
- Computation of super-efficiency
- Automated removal of super-efficient outliers under user-specified criteria
- Graphical presentation of results
- Integrated statistical tests
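As an illustration of the Malmquist computations such software automates, the index and its standard decomposition into efficiency change and technical change follow from four distance-function (efficiency) scores. `malmquist` is a hypothetical helper, not part of the software described, and the score values below are invented.

```python
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Malmquist index from four distance scores.

    d_a_b = distance (efficiency) of period-b data measured against the
    period-a frontier, e.g. d_t_t1 is period t+1 data on the period t frontier.
    """
    # overall productivity change (geometric mean over the two frontiers)
    mi = math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))
    # catching-up towards the frontier
    ec = d_t1_t1 / d_t_t
    # shift of the frontier itself
    tc = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return mi, ec, tc

mi, ec, tc = malmquist(0.80, 0.90, 0.85, 0.95)
# by construction, mi == ec * tc
```

The four scores themselves come from solving a DEA model for each DMU against each period's frontier; the decomposition then separates "catching up" (efficiency change) from "frontier shift" (technical change), and the two factors multiply back to the index.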