18 results for Multiprocessor scheduling with resource sharing
in Aston University Research Archive
Abstract:
This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author concludes that simulation is the only tool powerful enough to develop a model of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends; this approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model: the first part tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
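As a purely illustrative aside (not taken from the thesis), the following Python sketch shows the kind of modular skeleton-plus-segments structure described above: a minimal event-driven skeleton into which hypothetical library segments, here a round-robin CPU segment and a measurement segment, are plugged to form a model.

    import heapq, itertools

    class Skeleton:
        """Minimal event-driven simulation skeleton; segments plug into named slots."""
        def __init__(self):
            self.clock = 0.0
            self._seq = itertools.count()       # tie-breaker for simultaneous events
            self.events = []                    # heap of (time, seq, action)
            self.segments = {}                  # slot name -> segment object

        def plug(self, slot, segment):
            self.segments[slot] = segment       # swap a library segment into the skeleton

        def schedule(self, delay, action):
            heapq.heappush(self.events, (self.clock + delay, next(self._seq), action))

        def run(self, until):
            while self.events and self.clock < until:
                self.clock, _, action = heapq.heappop(self.events)
                action(self)

    class RoundRobinCPU:
        """Example library segment: hands a fixed quantum to each ready job in turn."""
        def __init__(self, jobs, quantum=1.0):
            self.jobs, self.quantum = list(jobs), quantum
        def dispatch(self, sim):
            if self.jobs:
                job = self.jobs.pop(0)
                job['remaining'] -= self.quantum
                if job['remaining'] > 0:
                    self.jobs.append(job)
                sim.segments['monitor'].record(sim.clock, job['name'])
                sim.schedule(self.quantum, lambda s: self.dispatch(s))

    class Monitor:
        """Example measurement segment, built alongside the design as the principles above advocate."""
        def __init__(self): self.trace = []
        def record(self, t, name): self.trace.append((t, name))

    sim = Skeleton()
    sim.plug('cpu', RoundRobinCPU([{'name': 'A', 'remaining': 3}, {'name': 'B', 'remaining': 2}]))
    sim.plug('monitor', Monitor())
    sim.schedule(0.0, lambda s: s.segments['cpu'].dispatch(s))
    sim.run(until=10.0)
    print(sim.segments['monitor'].trace)        # when each job received the processor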
Abstract:
This paper analyses the mechanisms through which profit-sharing schemes may induce debt-constrained firms to improve technical efficiency over time in order to guarantee positive profits. This hypothesis is first formalised in a partial equilibrium framework and then tested on a sample of Italian traditional and cooperative firms. Technical efficiency change indexes are computed by DEA and regressed on a measure of finance constraints to analyse their impact on firms' efficiency growth. The results support the hypothesis that a restriction in the availability of financial resources can positively affect efficiency growth in firms with profit-sharing schemes.
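For reference, the kind of linear program that typically underlies such DEA efficiency scores (the paper may of course use a different orientation or returns-to-scale assumption) is the input-oriented CCR envelopment model for firm $o$, with inputs $x_{ij}$ and outputs $y_{rj}$ over $n$ firms:

\[
\theta_o^{*} \;=\; \min_{\theta,\ \lambda \ge 0}\ \theta
\quad\text{subject to}\quad
\sum_{j=1}^{n} \lambda_j x_{ij} \;\le\; \theta\, x_{io}\ \ \forall i,
\qquad
\sum_{j=1}^{n} \lambda_j y_{rj} \;\ge\; y_{ro}\ \ \forall r .
\]

Technical efficiency change between two periods can then be summarised by comparing the scores $\theta_o^{*}$ computed against the respective period frontiers, which is the type of index regressed on the finance-constraint measure above.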
Abstract:
Definitions and measures of supply chain integration (SCI) are diverse; more empirical research, with clear definitions and appropriate measures, is needed. The purpose of this article is to identify dimensions and variables for SCI and to develop an integrated framework to facilitate this. A literature review of the relevant academic papers in international journals in Logistics, Supply Chain Management and Operations Management for the period 1995-2009 has been undertaken. This study reveals that information integration, coordination and resource sharing, and organisational relationship linkage are three major dimensions of SCI. The proposed framework helps integrate both upstream suppliers and downstream customers with the focal organisation. It also allows SCI to be measured using both qualitative and quantitative approaches. This study encourages researchers and practitioners to identify dimensions and variables for SCI and to analyse how they affect overall supply chain (SC) performance in terms of efficiency and responsiveness. Although there is extensive research in the area of SCI, a comprehensive and integrated approach is missing. This study bridges the gap by developing a framework for measuring SCI, which enables any organisation to identify critical success factors for integrating its SC, to measure the degree of integration qualitatively and quantitatively, and to suggest improvement measures. © 2013 Copyright Taylor and Francis Group, LLC.
Abstract:
We investigate the impact of market-supporting institutions on business strategies by analyzing the entry strategies of foreign investors entering emerging economies. We apply and advance the institution-based view of strategy by integrating it with resource-based considerations. In particular, we show how resource-seeking strategies are pursued using different entry modes in different institutional contexts. Alternative modes of entry—greenfield, acquisition, and joint venture (JV)—allow firms to overcome different kinds of market inefficiencies related to both characteristics of the resources and to the institutional context. In a weaker institutional framework, JVs are used to access many resources, but in a stronger institutional framework, JVs become less important while acquisitions can play a more important role in accessing resources that are intangible and organizationally embedded. Combining survey and archival data from four emerging economies, India, Vietnam, South Africa, and Egypt, we provide empirical support for our hypotheses.
Abstract:
Advances in statistical physics relating to our understanding of large-scale complex systems have recently been successfully applied in the context of communication networks. Statistical mechanics methods can be used to decompose global system behavior into simple local interactions. Thus, large-scale problems can be solved or approximated in a distributed manner with iterative lightweight local messaging. This survey discusses how statistical physics methodology can provide efficient solutions to hard network problems that are intractable by classical methods. We highlight three typical examples in the realm of networking and communications. In each case we show how a fundamental idea of statistical physics helps solve the problem in an efficient manner. In particular, we discuss how to perform multicast scheduling with message passing methods, how to improve coding using the crystallization process, and how to compute optimal routing by representing routes as interacting polymers.
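To make the idea of iterative, lightweight local messaging concrete, the Python sketch below (an illustration added here, not from the survey) computes shortest-path distances to a destination with min-sum message passing: each node repeatedly combines only its neighbours' current messages with local edge costs, which is the style of distributed update behind the routing example.

    import math

    edges = {                       # undirected example graph: (u, v) -> cost (illustrative values)
        ('a', 'b'): 1.0, ('b', 'c'): 2.0, ('a', 'c'): 4.0, ('c', 'd'): 1.0,
    }

    def neighbours(node):
        out = {}
        for (u, v), w in edges.items():
            if u == node: out[v] = w
            if v == node: out[u] = w
        return out

    nodes = {n for e in edges for n in e}
    dest = 'd'
    msg = {n: (0.0 if n == dest else math.inf) for n in nodes}   # each node's current estimate

    for _ in range(len(nodes)):                  # enough sweeps to propagate across the graph
        new_msg = {}
        for n in nodes:
            if n == dest:
                new_msg[n] = 0.0
            else:
                nbrs = neighbours(n)             # purely local information
                new_msg[n] = min((w + msg[m] for m, w in nbrs.items()), default=math.inf)
        msg = new_msg

    print(msg)   # e.g. {'a': 4.0, 'b': 3.0, 'c': 1.0, 'd': 0.0}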
Abstract:
The widespread implementation of Manufacturing Resource Planning (MRPII) systems in the U.K. and abroad, and the reported dissatisfaction with their use, formed the initial basis of this piece of research, which concentrates on the fundamental theory and design of the Closed Loop MRPII system itself. The dissertation concentrates on two key aspects, namely how Master Production Scheduling is carried out in differing business environments, and how well the 'closing of the loop' operates by checking the capacity requirements of the different levels of plans within an organisation. The main hypothesis tested is that in U.K. manufacturing industry, resource checks are either not being carried out satisfactorily or are not being fed back to the appropriate plan in a timely fashion. The research methodology employed involved initial detailed investigations into Master Scheduling and capacity planning in eight diverse manufacturing companies. This was followed by a nationwide survey of users in 349 companies, a survey of all the major suppliers of Production Management software in the U.K., and an analysis of the facilities offered by current software packages. The main conclusion drawn is that the hypothesis is proved in the majority of companies: only just over 50% of companies are attempting Resource and Capacity Planning, and only 20% are successfully feeding back CRP information to 'close the loop'. Various causative factors are put forward and remedies are suggested.
Abstract:
The re-entrant flow shop scheduling problem (RFSP) is an NP-hard problem and has attracted the attention of both researchers and industry. Current approaches attempt to minimize the makespan of the RFSP without considering the interdependency between the resource constraints and the re-entrant probability. This paper proposes a multi-level genetic algorithm (GA) that includes the correlated re-entrant possibility and production mode in a multi-level chromosome encoding. A repair operator is incorporated in the multi-level GA to revise infeasible solutions by resolving resource conflicts. With the objective of minimizing the makespan, ANOVA is used to fine-tune the GA parameter settings. The experiments show that the proposed approach is more effective than simulated annealing at finding near-optimal schedules for both small-size and large-size problems. © 2013 Published by Elsevier Ltd.
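As an illustration of the flavour of approach described (a sketch using invented data, not the authors' implementation), the Python fragment below encodes a tiny re-entrant flow shop as a job-sequence chromosome, applies a repair operator that restores the required number of passes per job after crossover and mutation, and uses makespan as the fitness.

    import random

    proc = {'J1': [3, 2, 4], 'J2': [2, 4, 1], 'J3': [4, 1, 3]}   # made-up times on machines M1..M3
    passes = {'J1': 2, 'J2': 1, 'J3': 1}                         # J1 re-enters the line once

    def makespan(seq):
        m = len(next(iter(proc.values())))
        ready = [0.0] * m                       # completion time of the last operation on each machine
        for job in seq:
            t = 0.0
            for k in range(m):                  # each pass flows M1 -> M2 -> ... -> Mm
                t = max(t, ready[k]) + proc[job][k]
                ready[k] = t
        return ready[-1]

    def repair(seq):
        """Rebuild the sequence so every job appears exactly `passes[job]` times."""
        counts = {j: 0 for j in proc}
        fixed = []
        for job in seq:
            if counts[job] < passes[job]:
                fixed.append(job)
                counts[job] += 1
        for job, need in passes.items():        # append any missing passes
            fixed += [job] * (need - counts[job])
        return fixed

    def crossover(a, b):
        cut = random.randint(1, len(a) - 1)
        return repair(a[:cut] + b[cut:])

    def mutate(seq, rate=0.2):
        s = seq[:]
        if random.random() < rate:
            i, j = random.sample(range(len(s)), 2)
            s[i], s[j] = s[j], s[i]
        return repair(s)

    base = [j for j, n in passes.items() for _ in range(n)]
    pop = [random.sample(base, len(base)) for _ in range(20)]
    for _ in range(100):                        # simple elitist generational loop
        pop.sort(key=makespan)
        survivors = pop[:10]
        children = [mutate(crossover(*random.sample(survivors, 2))) for _ in range(10)]
        pop = survivors + children
    pop.sort(key=makespan)
    print(makespan(pop[0]), pop[0])             # best schedule found and its makespan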
Abstract:
From a manufacturing perspective, the efficiency of manufacturing operations (such as process planning and production scheduling) is a key element in enhancing manufacturing competence. Process planning and production scheduling functions have traditionally been treated as two separate activities, and this has resulted in a range of inefficiencies, including infeasible process plans, non-available or overloaded resources, high production costs, long production lead times, and so on. Above all, it is unlikely that dynamic changes can be dealt with efficiently. Although much research has been conducted on integrating process planning and production scheduling to generate optimised solutions that improve manufacturing efficiency, there is still a gap between what is achieved and the competence required for the current global competitive market. In this research, the concept of a multi-agent system (MAS) is adopted as a means to address this gap. A MAS consists of a collection of intelligent autonomous agents able to solve complex problems; these agents possess individual objectives and interact with each other to fulfil the global goal. This paper describes a novel use of an autonomous agent system to facilitate the integration of process planning and production scheduling functions to cope with unpredictable demands, in terms of uncertainties in product mix and demand pattern. The novelty lies in the currency-based iterative agent bidding mechanism, which allows process planning and production scheduling options to be evaluated simultaneously so as to search for an optimised, cost-effective solution. This agent-based system aims to achieve manufacturing competence by enhancing the flexibility and agility of manufacturing enterprises.
Abstract:
In this paper, we propose a resource allocation scheme to minimize transmit power for multicast orthogonal frequency division multiple access (OFDMA) systems. The proposed scheme allows users to have different symbol error rates (SER) across subcarriers while guaranteeing an average bit error rate and transmission rate for all users. We first provide an algorithm to determine the optimal bits and target SER on each subcarrier. Because the worst-case complexity of the optimal algorithm is exponential, we further propose a suboptimal algorithm that separately assigns bits and adjusts SER with lower complexity. Numerical results show that the proposed algorithm can effectively improve the performance of multicast OFDMA systems and that the performance of the suboptimal algorithm is close to that of the optimal one. Copyright © 2012 John Wiley & Sons, Ltd.
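The paper's own algorithms are not reproduced here, but the general greedy bit-loading idea behind such suboptimal schemes can be sketched as follows: bits are added one at a time on whichever subcarrier needs the least extra transmit power for its next bit, until the rate target is met. The power model, channel gains and targets in this Python sketch are placeholders, not the paper's formulation.

    import heapq

    def power_for(bits, gain):
        """Placeholder transmit-power model: monotone and convex in the number of bits,
        which is all the greedy argument needs (the paper's SER-dependent model differs)."""
        return 0.0 if bits == 0 else (2 ** bits - 1) / gain

    def greedy_bit_loading(gains, rate_target, max_bits=6):
        bits = [0] * len(gains)
        total_power = 0.0
        # heap of (extra power needed for one more bit, subcarrier index)
        heap = [(power_for(1, g), i) for i, g in enumerate(gains)]
        heapq.heapify(heap)
        loaded = 0
        while loaded < rate_target and heap:
            delta, i = heapq.heappop(heap)      # cheapest next bit across all subcarriers
            bits[i] += 1
            total_power += delta
            loaded += 1
            if bits[i] < max_bits:
                nxt = power_for(bits[i] + 1, gains[i]) - power_for(bits[i], gains[i])
                heapq.heappush(heap, (nxt, i))
        return bits, total_power

    gains = [0.9, 0.5, 1.4, 0.2]                # made-up per-subcarrier channel gains
    bits, power = greedy_bit_loading(gains, rate_target=8)
    print(bits, round(power, 2))                # bit allocation per subcarrier and total power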
Abstract:
This paper presents a simulation-based genetic algorithm (GA) model for scheduling the flow shop problem with re-entrant jobs. The objective of this research is to minimize the weighted tardiness and makespan. The proposed model considers that jobs with non-identical due dates are processed on the machines in the same order. Furthermore, the re-entrant jobs are stochastic, as only some jobs are required to re-enter the flow shop. The tardiness weight is adjusted once the jobs re-enter the shop. The performance of the proposed GA model is verified by a number of numerical experiments using data from the case company. The results show that the proposed method achieves a higher order satisfaction rate than current industrial practice.
Abstract:
Computational performance increasingly depends on parallelism, and many systems rely on heterogeneous resources such as GPUs and FPGAs to accelerate computationally intensive applications. However, implementations for such heterogeneous systems are often hand-crafted and optimised for one computation scenario, and it can be challenging to maintain high performance when application parameters change. In this paper, we demonstrate that machine learning can help to dynamically choose parameters for task scheduling and load-balancing based on changing characteristics of the incoming workload. We use a financial option pricing application as a case study. We propose a simulation of processing financial tasks on a heterogeneous system with GPUs and FPGAs, and show how dynamic, on-line optimisations could improve such a system. We compare on-line and batch processing algorithms, and we also consider cases with no dynamic optimisations.
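As a hedged sketch of the kind of on-line, learning-driven parameter choice studied here (all names, features and numbers below are invented), the Python fragment keeps a small history of workload features, the chosen GPU/FPGA load split and the measured throughput, and uses a nearest-neighbour estimate to pick the split for the next batch.

    import random

    history = []                                 # (features, split, throughput) triples

    def predict(features, split, k=3):
        """k-nearest-neighbour throughput estimate over past observations."""
        if not history:
            return 0.0
        def dist(h):
            return sum((a - b) ** 2 for a, b in zip(h[0], features)) + (h[1] - split) ** 2
        top = sorted(history, key=dist)[:k]
        return sum(t for _, _, t in top) / len(top)

    def choose_split(features, candidates=(0.0, 0.25, 0.5, 0.75, 1.0), explore=0.1):
        if random.random() < explore or not history:
            return random.choice(candidates)     # explore occasionally
        return max(candidates, key=lambda s: predict(features, s))

    def run_batch(features, split):
        """Stand-in for the heterogeneous-system simulation: a noisy synthetic throughput."""
        batch_size, task_complexity = features
        gpu_rate = split * (1.0 + 0.5 * batch_size)
        fpga_rate = (1.0 - split) * (1.0 + 0.5 * task_complexity)
        return gpu_rate + fpga_rate + random.gauss(0.0, 0.05)

    for step in range(200):                      # on-line loop: observe, decide, measure, learn
        features = (random.random(), random.random())
        split = choose_split(features)
        history.append((features, split, run_batch(features, split)))

    print(len(history), "observations collected")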
Abstract:
A real-time adaptive resource allocation algorithm that considers the end user's Quality of Experience (QoE) in the context of video streaming services is presented in this work. An objective no-reference quality metric, namely Pause Intensity (PI), is used to control the priority of resource allocation to users during the scheduling process. An online adjustment is introduced to adaptively set the scheduler's parameter and maintain the desired trade-off between fairness and efficiency. The correlation between the data rates (i.e. video code rates) demanded by users and the data rates allocated by the scheduler is also taken into account. The final allocated rates are determined based on the channel status, the distribution of PI values among users, and the scheduling policy adopted. Furthermore, since the user's capability varies as environmental conditions change, the rate adaptation mechanism for video streaming is considered and its interaction with the scheduling process under the same PI metric is studied. The feasibility of implementing this algorithm is examined and the results are compared with the most commonly used scheduling methods.
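The precise definition of Pause Intensity and of the scheduler are given in the work itself; purely as a rough illustration, the Python sketch below uses a simplified PI proxy (the product of the pause-time ratio and the pause frequency over an observation window) to bias per-user scheduling priority, with a parameter trading fairness against efficiency.

    def pause_intensity(pause_durations, window):
        """Simplified PI proxy: (total pause time / window) * (number of pauses / window)."""
        if window <= 0:
            return 0.0
        return (sum(pause_durations) / window) * (len(pause_durations) / window)

    def priority(channel_rate, pi, demanded_rate, allocated_rate, alpha=50.0):
        """Higher channel quality and worse playback (higher PI) both raise priority;
        alpha tunes the fairness/efficiency trade-off in this toy rule."""
        shortfall = max(0.0, demanded_rate - allocated_rate) / max(demanded_rate, 1e-9)
        return channel_rate * (1.0 + alpha * pi) * (0.5 + 0.5 * shortfall)

    users = {
        'u1': dict(channel_rate=2.0, pauses=[1.0, 0.5], demanded=1.5, allocated=1.0),
        'u2': dict(channel_rate=3.5, pauses=[], demanded=1.0, allocated=1.0),
    }
    window = 30.0
    ranked = sorted(users, reverse=True,
                    key=lambda u: priority(users[u]['channel_rate'],
                                           pause_intensity(users[u]['pauses'], window),
                                           users[u]['demanded'], users[u]['allocated']))
    print(ranked)                               # users ordered by the illustrative priority rule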
Abstract:
This research is focused on the optimisation of resource utilisation in wireless mobile networks, taking into account the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated, including video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems.

A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for the evaluation of the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment have been used in the validation tests. It has been shown that Pause Intensity is closely correlated with the subjective quality measurement in terms of the Mean Opinion Score, and that this correlation property is content independent.

Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for the given user requirements, communication system specifications and network performance. This approach concerns both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with the consideration of the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness. The 3GPP Long Term Evolution (LTE) system is used as the main application environment, where the proposed research framework is examined and the results are compared with existing scheduling methods in terms of achievable fairness, efficiency and correlation.

Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritisation of users according to the perceived quality of the services received, while a trade-off between fairness and efficiency is maintained through an online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst the users for different scheduling policies, have been demonstrated in the context of LTE.

Finally, the work on interworking between the mobile communication system at the macro-cell level and the different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading mechanism of the user's data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of the resource allocation across the fairness-efficiency spectrum. The associated offloading mechanism can properly control the number of users within the coverage of the macro-cell base station and each of the WiFi access points involved. The performance of non-seamless and user-controlled mobile traffic offloading (through mobile WiFi devices) has been evaluated and compared with that of the standard operator-controlled WiFi hotspots.
Abstract:
Purpose – Traditionally, most studies focus on institutionalized, management-driven actors to understand technology management innovation. The purpose of this paper is to argue that there is a need for research to study the nature and role of dissident non-institutionalized actors (i.e. outsourced web designers and rapid application software developers). The authors propose that, through online social knowledge sharing, non-institutionalized actors' solution-finding tensions enable technology management innovation.
Design/methodology/approach – A synthesis of the literature and an analysis of the data (21 interviews) provided insights into three areas of solution-finding tensions enabling management innovation. The authors frame the analysis around peripherally deviant work and the ways in which dissident non-institutionalized actors deviate from their clients' (understood as the firm) originally contracted objectives.
Findings – The findings provide insights into the productive role of solution-finding tensions in enabling opportunities for management service innovation. Furthermore, deviant practices that leverage non-institutionalized actors' online social knowledge to fulfil customers' requirements are not interpreted negatively, but as a positive willingness to proactively explore alternative paths.
Research limitations/implications – The findings demonstrate the importance of dissident non-institutionalized actors in technology management innovation. However, this work is based on a single country (USA) and additional research is needed to validate and generalize the findings in other cultural and institutional settings.
Originality/value – This paper provides new insights into the perceptions of dissident non-institutionalized actors in the practice of IT managerial decision making. The work departs from, but also extends, the previous literature, demonstrating that peripherally deviant work in solution-finding practice creates tensions, enabling management innovation between IT providers and users.
Abstract:
Financing is a critical entrepreneurial activity (Shane et al. 2003), and within the study of entrepreneurship, behaviour has been identified as an area requiring further exploration (Bird et al. 2012). Since 2008 supply-side conditions for SMEs have been severe, and entrepreneurs increasingly have to bundle or 'orchestrate' funding from a variety of sources in order to finance the firm successfully (Wright and Stigliani 2013: p.15). This longitudinal study uses psychometric testing to measure the behavioural competences of a panel of sixty entrepreneurs in the Creative Industries sector. Interviews were conducted over a three-year period to identify finance-finding behaviour. The research takes a pragmatic realism perspective to examine process and the different behavioural competences of entrepreneurs, and the predictive qualities of this behaviour are explored in a funding context. The research confirmed that a strong behavioural characteristic, validated through interviews and psychometric testing, was an orientation towards engagement and working with other organisations. In a funding context, this manifested itself in entrepreneurs using networks, seeking advice and sharing equity to fund growth. These co-operative, collaborative characteristics differ from the classic image of the entrepreneur as a risk-taker or extrovert; leadership and achievement orientation were amongst the lowest scores. Three distinctive groups were identified, and subsequent analysis showed that they offer a positive contribution to how entrepreneurial behavioural competences can be considered. Belonging to one of these three clusters is a strong predictive indicator of entrepreneurial behaviour – in this context, how entrepreneurs access finance. These clusters were also shown to have different characteristics in relation to funding outcomes. The study seeks to make a contribution through the development of a methodology for entrepreneurs, policy makers and financial institutions to identify competencies in finding finance and to overcome problems of information asymmetry.