15 results for Effective number of parties

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

In recent years, UK industry has seen an explosive growth in the number of 'Computer Aided Production Management' (CAPM) system installations. Of the many CAPM systems, material requirements planning/manufacturing resource planning (MRP/MRPII) is the most widely implemented. Despite the huge investments in MRP systems, over 80 percent are said to have failed within 3 to 5 years of installation. Many people now assume that Just-In-Time (JIT) is the best manufacturing technique. However, those who have implemented JIT have found that it also has many problems. The author argues that the success of a manufacturing company will not be due to a system which complies with a single technique, but to the integration of many techniques and the ability to make them complement each other in a specific manufacturing environment. This dissertation examines the potential for integrating MRP with JIT and Two-Bin systems to reduce the operational costs involved in managing bought-out inventory. Within this framework it shows that controlling MRP is essential to facilitate the integration process. The behaviour of MRP systems depends on the complex interactions between the numerous control parameters used. Methodologies/models are developed to set these parameters. The models are based on the Pareto principle. The idea is to use business targets to set a coherent set of parameters, which not only enables those business targets to be realised, but also facilitates JIT implementation. This approach is illustrated in the context of an actual manufacturing plant, IBM Havant (a high-volume electronics assembly plant where the majority of materials are bought out). The parameter-setting models are applicable to controlling bought-out items in a wide range of industries and do not depend on specific MRP software. The models have produced successful results in several companies and are now being developed as commercial products.
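
The Pareto-principle idea behind the parameter-setting models can be illustrated with a simple ABC classification of bought-out items: a small fraction of items accounts for most of the usage value and warrants tight MRP control, while the long tail suits simpler Two-Bin/JIT replenishment. The cut-offs and class policies below are illustrative assumptions, not the dissertation's actual models.

```python
# Hedged illustration: ABC (Pareto) classification of bought-out items.
# Class cut-offs and the per-class control policies are hypothetical.

def abc_classify(items, a_cut=0.8, b_cut=0.95):
    """items: dict of item -> annual usage value. Returns item -> class."""
    total = sum(items.values())
    classes = {}
    cumulative = 0.0
    for name, value in sorted(items.items(), key=lambda kv: -kv[1]):
        cumulative += value
        share = cumulative / total
        if share <= a_cut:
            classes[name] = "A"   # tight MRP control, frequent review
        elif share <= b_cut:
            classes[name] = "B"   # moderate control
        else:
            classes[name] = "C"   # candidate for Two-Bin / JIT pull
    return classes

usage = {"PCB": 50000, "CPU": 30000, "Case": 12000,
         "Screws": 500, "Labels": 300, "Washers": 200}
print(abc_classify(usage))
```

A few high-value items land in class A and receive careful parameter setting; the many low-value items fall into class C, where a Two-Bin system keeps administration cheap.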

Relevance:

100.00%

Publisher:

Abstract:

Fibre overlay is a cost-effective technique to alleviate wavelength blocking in some links of a wavelength-routed optical network by increasing the number of wavelengths in those links. In this letter, we investigate the effects of overlaying fibre in an all-optical network (AON) based on the GÉANT2 topology. The constraint-based routing and wavelength assignment (CB-RWA) algorithm locates where cost-efficient upgrades should be implemented. Through numerical examples, we demonstrate that the network capacity improves by 25 per cent when fibre is overlaid on 10 per cent of the links, and by 12 per cent when hop-reduction links comprising 2 per cent of the links are provided. For the upgraded network, we also show the impact of dynamic traffic allocation on the blocking probability. Copyright © 2010 John Wiley & Sons, Ltd.
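
Wavelength blocking on a single link is commonly modelled with the Erlang B formula, so the benefit of adding wavelengths to a congested link can be illustrated numerically. This is a generic sketch of the blocking effect, not the paper's CB-RWA algorithm.

```python
# Standard Erlang B recursion: blocking probability of a link with m
# wavelengths (servers) under offered load a (in Erlangs).

def erlang_b(m, a):
    b = 1.0                      # B(0) = 1
    for k in range(1, m + 1):
        b = (a * b) / (k + a * b)
    return b

# Overlaying fibre (more wavelengths on the same link) sharply cuts blocking:
for wavelengths in (8, 16, 32):
    print(wavelengths, round(erlang_b(wavelengths, 10.0), 4))
```

The steep drop in blocking as wavelength count grows is why targeted overlays on the few most congested links yield large capacity gains.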

Relevance:

100.00%

Publisher:

Abstract:

Conventional project management techniques are not always sufficient to ensure time, cost and quality achievement of large-scale construction projects, due to complexity in planning, design and implementation processes. The main reasons for project non-achievement are changes in scope and design, changes in government policies and regulations, unforeseen inflation, under-estimation and improper estimation. Projects that are exposed to such an uncertain environment can be effectively managed with the application of risk management throughout the project's life cycle. However, the effectiveness of risk management depends on the technique through which the effects of risk factors are analysed and quantified. This study proposes the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, as a tool for risk analysis because it can handle subjective as well as objective factors, which are often conflicting in nature, within a single decision model. This provides a decision support system (DSS) to project management for making the right decision at the right time, ensuring project success in line with organisation policy, project objectives and a competitive business environment. The whole methodology is explained through a case application of a cross-country petroleum pipeline project in India, and its effectiveness in project management is demonstrated.
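
The core AHP computation the study relies on, deriving priority weights from a pairwise comparison matrix and checking their consistency, can be sketched as follows. The risk factors and judgements are hypothetical, not those of the pipeline case study.

```python
# Hedged sketch of AHP: principal-eigenvector weights plus Saaty's
# consistency ratio. Judgements below are illustrative only.
import numpy as np

def ahp_weights(pairwise):
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    i = np.argmax(eigvals.real)              # principal eigenvalue
    w = np.abs(eigvecs[:, i].real)
    w /= w.sum()                             # normalised priority weights
    n = A.shape[0]
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # random index
    ci = (eigvals.real[i] - n) / (n - 1)     # consistency index
    cr = ci / ri if ri else 0.0              # consistency ratio
    return w, cr

# Pairwise judgements on Saaty's 1-9 scale for three hypothetical risks
# (e.g. scope change, regulatory change, inflation).
pairwise = [[1,   3,   5],
            [1/3, 1,   3],
            [1/5, 1/3, 1]]
weights, cr = ahp_weights(pairwise)
print(weights.round(3), round(cr, 3))  # CR < 0.1 is conventionally acceptable
```

Weights from each level of the hierarchy are then combined multiplicatively down the tree to rank risks or response options.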

Relevance:

100.00%

Publisher:

Abstract:

Using methods of statistical physics, we study the average number and kernel size of general sparse random matrices over GF(q), with a given connectivity profile, in the thermodynamic limit of large matrices. We introduce a mapping of GF(q) matrices onto spin systems using the representation of the cyclic group of order q as the q-th complex roots of unity. This representation facilitates the derivation of the average kernel size of random matrices using the replica approach, under the replica-symmetric ansatz, resulting in saddle-point equations for general connectivity distributions. Numerical solutions are then obtained for particular cases by population dynamics. Similar techniques also allow us to obtain an expression for the exact and average numbers of such random matrices for any general connectivity profile. We present numerical results for particular distributions.
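
For the simplest case q = 2, the quantity studied can be checked by brute force: the kernel of an m × n matrix over GF(2) contains 2^(n − rank) vectors. The sketch below (a direct numerical check, not the paper's replica-method calculation) computes the rank of a sparse random GF(2) matrix by Gaussian elimination with rows stored as bitmasks.

```python
# Kernel size of a sparse random GF(2) matrix via Gaussian elimination.
# Illustrative check for q = 2 only; sparsity choice is arbitrary.
import random

def gf2_rank(rows, n):
    """rows: list of n-bit integers, each encoding a matrix row over GF(2)."""
    rank = 0
    for col in range(n):
        pivot = None
        for i in range(rank, len(rows)):
            if rows[i] >> col & 1:
                pivot = i
                break
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(len(rows)):
            if i != rank and (rows[i] >> col & 1):
                rows[i] ^= rows[rank]          # XOR = row addition over GF(2)
        rank += 1
    return rank

random.seed(0)
n, m, per_row = 12, 10, 3                      # sparse: 3 non-zeros per row
rows = [sum(1 << j for j in random.sample(range(n), per_row)) for _ in range(m)]
r = gf2_rank(rows, n)
print("rank:", r, "kernel size:", 2 ** (n - r))
```

Averaging this kernel size over many sampled matrices with a fixed connectivity profile gives the quantity the replica calculation predicts analytically.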

Relevance:

100.00%

Publisher:

Abstract:

This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language, without syntactic or semantic extensions, into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical self-contained model of concurrency, which enables a simplified second model for implementing the compiling process. There is a further presentation of principles that, if followed, maximise the potential levels of parallelism. Model of Concurrency. The concurrency model is designed to be a straightforward target onto which sequential programs can be mapped, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronisation of objects. Further, the model is sufficiently complete that a compiler can be, and has been, built for it. Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute-grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase. Programming Principles. The set of principles presented is based upon information hiding, sharing and containment of objects, and the division of methods on the basis of a command/query distinction. When followed, the level of potential parallelism within the presented concurrency model is maximised. Further, these principles arise naturally from good programming practice. Summary. In summary, this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: no parallel primitives are added, and the parallel program is modelled to execute with semantics equivalent to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
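
The command/query division described under Programming Principles can be sketched with a small "active object": commands return no result and can be dispatched asynchronously to the object's worker, while queries must wait for a reply. This is a hedged Python stand-in for the thesis's Eiffel-based concurrency model; all names are illustrative.

```python
# Active object with command/query separation: commands are fire-and-forget,
# queries synchronise on the worker's reply. Illustrative sketch only.
import queue
import threading

class ActiveCounter:
    def __init__(self):
        self._value = 0
        self._inbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        while True:
            action, reply = self._inbox.get()
            result = action()
            if reply is not None:
                reply.put(result)

    def increment(self):                 # command: no result, asynchronous
        self._inbox.put((self._incr, None))

    def _incr(self):
        self._value += 1

    def value(self):                     # query: caller blocks for the answer
        reply = queue.Queue()
        self._inbox.put((lambda: self._value, reply))
        return reply.get()

c = ActiveCounter()
for _ in range(100):
    c.increment()                        # 100 commands queued without waiting
print(c.value())                         # query synchronises: prints 100
```

Because only commands can proceed without waiting, splitting methods along the command/query line directly determines how much of a program can run asynchronously.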

Relevance:

100.00%

Publisher:

Abstract:

This doctoral thesis responds to the need for greater understanding of small businesses and their inherently unique problem types. Integral to the investigation is the theme that for governments to influence small business effectively, a sound understanding of the factors they seek to influence is essential. Moreover, recognising the many shortcomings in management research, and in particular that the research methods and approaches adopted often fail to give adequate understanding of the issues under study, the study attempts to develop an innovative and creative research approach. The aim is thus to produce not only advances in small business management knowledge, from the standpoints of both government policy makers and the 'recipient' small business, but also insights into future potential research methods for the continued development of that knowledge. The origins of the methodology lay in the non-acceptance of traditional philosophical positions in epistemology and ontology, with a philosophical standpoint of internal realism underpinning the research. Internal realism presents the basis for the potential co-existence of qualitative and quantitative research strategies, and underlines the crucial contributory role of research method in establishing the factual status of the assertions of research findings. The concept of epistemological bootstrapping is thus used to develop a 'partial' research framework to foothold the case study research, thereby avoiding the limitations of objectivism and brute inductivism. The major insights and issues highlighted by the 'bootstrap' guide the researcher around the participant case studies. A novel attempt at contextualist (linked multi-level and processual) analysis was made in the major in-depth case study, with two further cases playing a supporting role and contributing to a balanced emphasis of empirical research within the time constraints inherent in part-time research.

Relevance:

100.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance:

100.00%

Publisher:

Abstract:

Although the importance of dataset fitness-for-use evaluation and intercomparison is widely recognised within the GIS community, no practical tools have yet been developed to support such interrogation. GeoViQua aims to develop a GEO label which will visually summarise and allow interrogation of key informational aspects of geospatial datasets upon which users rely when selecting datasets for use. The proposed GEO label will be integrated in the Global Earth Observation System of Systems (GEOSS) and will be used as a value and trust indicator for datasets accessible through the GEO Portal. As envisioned, the GEO label will act as a decision support mechanism for dataset selection and thereby, it is hoped, improve user recognition of the quality of datasets. To date we have conducted three user studies to (1) identify the informational aspects of geospatial datasets upon which users rely when assessing dataset quality and trustworthiness, (2) elicit initial user views on a GEO label and its potential role, and (3) evaluate prototype label visualisations. Our first study revealed that, when evaluating the quality of data, users consider eight facets: dataset producer information; producer comments on dataset quality; dataset compliance with international standards; community advice; dataset ratings; links to dataset citations; expert value judgements; and quantitative quality information. Our second study confirmed the relevance of these facets in terms of the community-perceived function that a GEO label should fulfil: users and producers of geospatial data supported the concept of a GEO label that provides a drill-down interrogation facility covering all eight informational aspects. Consequently, we developed three prototype label visualisations and evaluated their comparative effectiveness and user preference via a third user study to arrive at a final graphical GEO label representation.
When integrated in the GEOSS, an individual GEO label will be provided for each dataset in the GEOSS clearinghouse (or other data portals and clearinghouses) based on its available quality information. Producer and feedback metadata documents are being used to dynamically assess information availability and generate the GEO labels. The producer metadata document can either be a standard ISO compliant metadata record supplied with the dataset, or an extended version of a GeoViQua-derived metadata record, and is used to assess the availability of a producer profile, producer comments, compliance with standards, citations and quantitative quality information. GeoViQua is also currently developing a feedback server to collect and encode (as metadata records) user and producer feedback on datasets; these metadata records will be used to assess the availability of user comments, ratings, expert reviews and user-supplied citations for a dataset. The GEO label will provide drill-down functionality which will allow a user to navigate to a GEO label page offering detailed quality information for its associated dataset. At this stage, we are developing the GEO label service that will be used to provide GEO labels on demand based on supplied metadata records. In this presentation, we will provide a comprehensive overview of the GEO label development process, with specific emphasis on the GEO label implementation and integration into the GEOSS.
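
The availability assessment described above can be sketched as a simple facet check over merged producer and feedback metadata: a label facet is shown as available only if the corresponding information is non-empty. The field names and the merging rule here are assumptions for illustration, not GeoViQua's actual metadata schema.

```python
# Hedged sketch: derive GEO label facet availability from metadata records.
# Facet keys and record structure are illustrative assumptions.

FACETS = ["producer_profile", "producer_comments", "standards_compliance",
          "community_advice", "ratings", "citations", "expert_reviews",
          "quality_information"]

def geo_label(producer_meta, feedback_meta):
    """Return facet -> availability, from producer and feedback metadata."""
    merged = {**producer_meta, **feedback_meta}
    return {facet: bool(merged.get(facet)) for facet in FACETS}

producer = {"producer_profile": "Aston University",
            "standards_compliance": ["ISO 19115"],
            "quality_information": {"rmse": 0.4}}
feedback = {"ratings": [4, 5], "expert_reviews": []}   # no reviews yet

label = geo_label(producer, feedback)
print(sum(label.values()), "of", len(FACETS), "facets available")
```

A drill-down page would then present, for each available facet, the underlying metadata that the boolean summarises.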

Relevance:

100.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance:

100.00%

Publisher:

Abstract:

Guidelines on developing strategy and on the planning and implementation of projects

Relevance:

100.00%

Publisher:

Abstract:

Time, cost and quality are the prime objectives of any project. Unfortunately, today's project management does not always ensure the realisation of these objectives. The main reasons for project non-achievement are changes in scope and design, changes in government policies and regulations, unforeseen inflation, under-estimation and mis-estimation. Only an overall organisational approach, with the application of appropriate management philosophies, tools and techniques, can solve the problem. The present study establishes a methodology for achieving success in implementing projects using a business process re-engineering (BPR) framework. Internal performance characteristics are examined through a condition diagnosis that identifies and prioritises areas of concern requiring attention. Process re-engineering emerges as the most critical area for immediate attention. Project process re-engineering is carried out by eliminating non-value-added activities, undertaking activities concurrently through the rigorous application of information systems, and applying risk management techniques throughout the project life cycle. The overall methodology is demonstrated through its application to a cross-country petroleum pipeline project organisation in an Indian scenario.

Relevance:

100.00%

Publisher:

Abstract:

The number of nodes has a large impact on the performance, lifetime and cost of a wireless sensor network (WSN). This number is difficult to determine because it depends on many factors, such as the network protocols and the collaborative signal processing (CSP) algorithms. A mathematical model is proposed in this paper to calculate the number of nodes based on the required working time. It can be used in general situations by treating these factors as parameters of energy consumption. © 2004 IEEE.
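
As a hedged illustration of such a model (a simplified stand-in, not the paper's actual equations), one can fold protocol and CSP overheads into per-node energy parameters and search for the smallest node count whose per-node duty share lets every node survive the required working time.

```python
# Illustrative WSN sizing: smallest n such that each node's share of the
# sensing/radio load fits within its battery over the required lifetime.
# All parameter values below are hypothetical.

def nodes_required(lifetime_h, battery_j, sense_w, radio_w, duty_per_node):
    """duty_per_node(n): fraction of time each node is active with n nodes."""
    n = 1
    while True:
        power = duty_per_node(n) * (sense_w + radio_w)   # avg per-node watts
        if power * lifetime_h * 3600 <= battery_j:       # energy fits battery
            return n
        n += 1

# Example: the field needs continuous coverage, shared evenly among n nodes.
n = nodes_required(lifetime_h=24 * 30, battery_j=10_000,
                   sense_w=0.05, radio_w=0.15,
                   duty_per_node=lambda n: 1.0 / n)
print(n)
```

Swapping in a different `duty_per_node` function is how protocol- or algorithm-specific energy behaviour would enter the calculation.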

Relevance:

100.00%

Publisher:

Abstract:

Macroeconomic developments, such as the business cycle, have a remarkable influence on firms and their performance. In business-to-business (B-to-B) markets characterized by a strong emphasis on long-term customer relationships, market orientation (MO) provides a particularly important safeguard for firms against fluctuating market forces. Using panel data from an economic upturn and downturn, we examine the effectiveness of different forms of MO (i.e., customer orientation, competitor orientation, interfunctional coordination, and their combinations) on firm performance in B-to-B firms. Our findings suggest that the impact of MO increases especially during a downturn, with interfunctional coordination clearly boosting firm performance and, conversely, competitor orientation becoming even detrimental. The findings further indicate that both the role of MO and its most effective forms vary across industry sectors, MO having a particularly strong impact on performance among B-to-B service firms. The findings of our study provide guidelines for executives to better manage performance across the business cycle and tailor their investments in MO more effectively, according to the firm's specific industry sector.