836 results for load-balancing scheduling


Relevance: 20.00%

Publisher:

Abstract:

This work contains a series of studies on the optimization of three real-world scheduling problems: school timetabling, sports scheduling and staff scheduling. These challenging problems are solved to customer satisfaction using the proposed PEAST algorithm. Customer satisfaction here refers to the fact that implementations of the algorithm are in industrial use. The PEAST algorithm is a product of long-term research and development; the first version of it was introduced in 1998. This thesis is the result of a five-year development of the algorithm. One of the most valuable characteristics of the algorithm has proven to be its ability to solve a wide range of scheduling problems, and it is likely that it can also be tuned to tackle a range of other combinatorial problems. The algorithm combines features from numerous metaheuristics, which is the main reason for its success. In addition, the implementation of the algorithm is fast enough for real-world use.

Relevance: 20.00%

Publisher:

Abstract:

The objective of this study was to characterize the microclimatic profile of broiler transport trucks under commercial conditions in the summer, by continuous monitoring of environmental variables (temperature and relative air humidity). Three loads were monitored from farm to slaughterhouse, considering factors such as distance and time of day (morning, afternoon, and night). To obtain the profile of the environmental variables during the journey, data loggers that recorded the microclimate to which the birds were submitted were installed in the trucks; the data loggers also allowed visualization of the Enthalpy Comfort Index (ECI), so that load regions could be classified according to heat comfort limits for 6-week-old broilers. Temperature, relative humidity, and ECI in the truck were analyzed using geostatistics, by the ordinary kriging method. The afternoon was the most critical period, and the central and rear regions of the truck were the most problematic for the chickens, and thus the most susceptible to losses.
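
The ECI mentioned above is based on the specific enthalpy of moist air. For reference, a commonly used psychrometric form is given below; this is a general formulation, and the exact ECI expression and comfort limits used in the study may differ.

    h = 1.006\,T + W\,(2501 + 1.805\,T) \quad \mathrm{[kJ\ per\ kg\ of\ dry\ air]}

where T is the dry-bulb temperature in °C and W is the humidity ratio (kg of water vapour per kg of dry air), obtained from T and the relative humidity.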

Relevance: 20.00%

Publisher:

Abstract:

The maintenance of electric distribution networks is a topical question for distribution system operators because of the increasing significance of failure costs. In this dissertation, the maintenance practices of distribution system operators are analyzed and a theory for scheduling maintenance activities and reinvestment in distribution components is created. The scheduling is based on the deterioration of components and the increasing failure rates due to aging. Dynamic programming is used to solve the maintenance problem caused by the increasing failure rates of the network. Other impacts of network maintenance, such as environmental and regulatory considerations, are outside the scope of this thesis; likewise, tree trimming of line corridors and major network disturbances are not included in the optimization problem. Four dynamic programming models are presented and tested; they are implemented in VBA, and two different test networks are used for the evaluation. Because distribution system operators prefer to work with larger component groups, the optimal timing for component groups is also analyzed. A maintenance software package is created to apply the presented theories in practice, and an overview of the program is presented.
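
As an illustration of the kind of age-based scheduling problem described above, the sketch below uses dynamic programming to decide when to replace an aging component whose expected failure cost grows each year. It is a hypothetical Python example, not the thesis' VBA models; the cost parameters are invented.

    # Hypothetical illustration of age-based maintenance scheduling by dynamic
    # programming; parameters are invented, not taken from the thesis.
    from functools import lru_cache

    def failure_cost(age, base=100.0, growth=0.3):
        """Expected annual failure cost, increasing with component age."""
        return base * (1.0 + growth) ** age

    def plan_replacements(horizon=20, replace_cost=800.0, start_age=0):
        """Minimal expected cost and replacement years over the planning horizon."""

        @lru_cache(maxsize=None)
        def best(year, age):
            if year == horizon:
                return 0.0, ()
            # Option 1: keep the aging component for one more year.
            keep_rest, keep_plan = best(year + 1, age + 1)
            keep_total = failure_cost(age) + keep_rest
            # Option 2: replace now; the new component is one year old next year.
            repl_rest, repl_plan = best(year + 1, 1)
            repl_total = replace_cost + failure_cost(0) + repl_rest
            if repl_total < keep_total:
                return repl_total, (year,) + repl_plan
            return keep_total, keep_plan

        return best(0, start_age)

    total, years = plan_replacements(start_age=10)
    print(f"minimal expected cost: {total:.0f}, replace in years: {years}")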

Relevance: 20.00%

Publisher:

Abstract:

Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle for using the above mentioned formal methods. In this thesis, we study how guarded command based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case study model in parallel using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own. The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
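
The guarded-command execution model described above can be illustrated with a toy interpreter: each action pairs a guard with a body, and among the actions whose guards hold one is chosen nondeterministically, with no further control flow. The Python sketch below is illustrative only; it is not Event-B, Action Systems, or the scheduling language proposed in the thesis.

    # Toy guarded-command interpreter: an illustrative sketch only.
    import random

    def run(state, actions, max_steps=100):
        """Repeatedly pick one enabled action nondeterministically and execute it."""
        for _ in range(max_steps):
            enabled = [(guard, body) for guard, body in actions if guard(state)]
            if not enabled:
                break                       # no guard holds: execution terminates
            _, body = random.choice(enabled)
            body(state)                     # execute the chosen action's body
        return state

    # Example: two actions incrementing a shared counter up to a bound.
    state = {"x": 0}
    actions = [
        (lambda s: s["x"] < 5, lambda s: s.update(x=s["x"] + 1)),
        (lambda s: s["x"] % 2 == 0 and s["x"] < 5, lambda s: s.update(x=s["x"] + 2)),
    ]
    print(run(state, actions))              # e.g. {'x': 5} or {'x': 6}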

Relevance: 20.00%

Publisher:

Abstract:

Eutrophication caused by anthropogenic nutrient pollution has become one of the most severe threats to water bodies. Nutrients enter water bodies from atmospheric precipitation, industrial and domestic wastewaters and surface runoff from agricultural and forest areas. As point pollution has been significantly reduced in developed countries in recent decades, agricultural non-point sources have been increasingly identified as the largest source of nutrient loading in water bodies. In this study, Lake Säkylän Pyhäjärvi and its catchment are studied as an example of a long-term, voluntary-based, co-operative model of lake and catchment management. Lake Pyhäjärvi is located in the centre of an intensive agricultural area in southwestern Finland. More than 20 professional fishermen operate in the lake area, and the lake is used as a drinking water source and for various recreational activities. Lake Pyhäjärvi is a good example of a large and shallow lake that suffers from eutrophication and is subject to measures to improve this undesired state under changing conditions. Climate change is one of the most important challenges faced by Lake Pyhäjärvi and other water bodies. The results show that climatic variation affects the amounts of runoff and nutrient loading and their timing during the year. The findings from the study area concerning warm winters and their influences on nutrient loading are in accordance with the IPCC scenarios of future climate change. In addition to nutrient reduction measures, the restoration of food chains (biomanipulation) is a key method in water quality management. The food-web structure in Lake Pyhäjärvi has, however, become disturbed due to mild winters, short ice cover and low fish catch. Ice cover that enables winter seining is extremely important to the water quality and ecosystem of Lake Pyhäjärvi, as the vendace stock is one of the key factors affecting the food web and the state of the lake. New methods for the reduction of nutrient loading and the treatment of runoff waters from agriculture, such as sand filters, were tested in field conditions. The results confirm that the filter technique is an applicable method for nutrient reduction, but further development is needed. The ability of sand filters to absorb nutrients can be improved with nutrient binding compounds, such as lime. Long-term hydrological, chemical and biological research and monitoring data on Lake Pyhäjärvi and its catchment provide a basis for water protection measures and improve our understanding of the complicated physical, chemical and biological interactions between the terrestrial and aquatic realms. In addition to measurements carried out in field conditions, Lake Pyhäjärvi and its catchment were studied using various modelling methods. In the calibration and validation of models, long-term and wide-ranging time series data proved to be valuable. Collaboration between researchers, modellers and local water managers further improves the reliability and usefulness of models. Lake Pyhäjärvi and its catchment can also be regarded as a good research laboratory from the point of view of the Baltic Sea. The main problem in both of them is eutrophication caused by excess nutrients, and nutrient loading has to be reduced – especially from agriculture. Mitigation measures are also similar in both cases.

Relevance: 20.00%

Publisher:

Abstract:

The main strengths of professional knowledge-intensive business services (P-KIBS) are knowledge and creativity, which need to be fostered, maintained and supported. The process of managing P-KIBS companies deals with financial, operational and strategic risks, which is why it is reasonable to apply risk management techniques and frameworks in this context. A significant challenge lies in choosing ways of implementing risk management that do not limit creative ability in the organization and, furthermore, contribute to the process. This choice is related to a risk-intelligent approach, which becomes a justified way of finding the required balance. On a theoretical level, the field of managing both creativity and risk intelligence as a balanced process remains understudied, in particular within the KIBS industry. For instance, there appears to be a wide range of separate models for innovation and risk management, but very little discussion in terms of trying to find the right balance between them. This study aims to shed light on the importance of a well-managed combination of these concepts. The research purpose of the present study is to find out how the balance between creativity and risk intelligence can be managed in P-KIBS. The methodological approach utilized in the study is purely conceptual, without empirical aspects. The research purpose can be achieved by answering the following supporting research questions: 1. What are the characteristics and role of creativity as a component of the innovation process in a P-KIBS company? 2. What are the characteristics and role of risk intelligence as an approach towards risk management process implementation in a P-KIBS company? 3. How can risk intelligence and creativity be balanced in P-KIBS? The main theoretical contribution of the study lies in a proposed creativity and risk intelligence stage-process framework. It is designed as an algorithm that can be applied on an organizational canvas. It consists of several distinct stages specified by the actors involved, their roles and implications. An additional stage-wise description provides detailed tasks for each of the enterprise levels, while combining the strategies into one. The insights derived from the framework can be utilized by a wide range of specialists, from strategists to risk managers and from innovation managers to entrepreneurs. Any business that designs and delivers knowledge services can potentially gain valuable ideas and expand its conceptual understanding from the present report. Risk intelligence in the current study is a unique way of emphasizing the role of creativity in the professional knowledge-intensive industry and a worthwhile technique for making sound decisions about risks.

Relevance: 20.00%

Publisher:

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, the node can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field; digital filters are typically described with boxes and arrows also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle, any application working on an information stream fits the dataflow paradigm. Such applications are, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used. The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information to generate a minimal but complete model to be used for model checking.
The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications. The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, in the context of design space exploration, the program representation is optimized by the development tools to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
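
A minimal sketch of the dataflow execution model described above: actors communicate only through FIFO queues, and an actor may fire whenever its firing rule, here simply the availability of input tokens, is satisfied. The naive dynamic scheduler below is a Python illustration only, not RVC-CAL or the quasi-static scheduling framework developed in the thesis.

    # Minimal dataflow sketch: actors connected by FIFO queues plus a naive
    # dynamic scheduler. Illustrative only.
    from collections import deque

    class Actor:
        def __init__(self, name, inputs, outputs, fire):
            self.name, self.inputs, self.outputs, self.fire = name, inputs, outputs, fire

        def can_fire(self):
            """Firing rule: here simply one token available on every input queue."""
            return all(queue for queue in self.inputs)

        def step(self):
            tokens = [queue.popleft() for queue in self.inputs]
            for queue, value in zip(self.outputs, self.fire(*tokens)):
                queue.append(value)

    def run(actors, max_firings=1000):
        """Naive dynamic scheduler: fire any enabled actor until none can fire."""
        for _ in range(max_firings):
            enabled = [actor for actor in actors if actor.can_fire()]
            if not enabled:
                break
            enabled[0].step()

    source, doubled = deque([1, 2, 3]), deque()
    double = Actor("double", [source], [doubled], lambda x: (2 * x,))
    run([double])
    print(list(doubled))                    # [2, 4, 6]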

Relevance: 20.00%

Publisher:

Abstract:

One of the main challenges in Software Engineering is to cope with the transition from an industry based on software as a product to software as a service. The field of Software Engineering should provide the necessary methods and tools to develop and deploy new cost-efficient and scalable digital services. In this thesis, we focus on deployment platforms to ensure cost-efficient scalability of multi-tier web applications and of an on-demand video transcoding service under different types of load conditions. Infrastructure as a Service (IaaS) clouds provide Virtual Machines (VMs) under the pay-per-use business model. Dynamically provisioning VMs on demand allows service providers to cope with fluctuations in the number of service users. However, VM provisioning must be done carefully, because over-provisioning results in an increased operational cost, while under-provisioning leads to a subpar service. Therefore, our main focus in this thesis is on cost-efficient VM provisioning for multi-tier web applications and on-demand video transcoding. Moreover, to prevent provisioned VMs from becoming overloaded, we augment VM provisioning with an admission control mechanism. Similarly, to ensure efficient use of provisioned VMs, web applications on under-utilized VMs are consolidated periodically. Thus, the main problem that we address is cost-efficient VM provisioning augmented with server consolidation and admission control on the provisioned VMs. We seek solutions for two types of applications: multi-tier web applications that follow the request-response paradigm, and on-demand video transcoding that is based on video streams with soft real-time constraints. Our first contribution is a cost-efficient VM provisioning approach for multi-tier web applications. The proposed approach comprises two sub-approaches: a reactive VM provisioning approach called ARVUE and a hybrid reactive-proactive VM provisioning approach called Cost-efficient Resource Allocation for Multiple web applications with Proactive scaling. Our second contribution is a prediction-based VM provisioning approach for on-demand video transcoding in the cloud. Moreover, to prevent virtualized servers from becoming overloaded, the proposed VM provisioning approaches are augmented with admission control approaches. Therefore, our third contribution is a session-based admission control approach for multi-tier web applications called adaptive Admission Control for Virtualized Application Servers. Similarly, the fourth contribution in this thesis is a stream-based admission control and scheduling approach for on-demand video transcoding called Stream-Based Admission Control and Scheduling. Our fifth contribution is a computation and storage trade-off strategy for cost-efficient video transcoding in cloud computing. Finally, the sixth and last contribution is a web application consolidation approach, which uses Ant Colony System to minimize the under-utilization of the virtualized application servers.
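
The reactive side of the provisioning-plus-admission-control scheme described above can be sketched as a simple threshold controller. The Python example below is purely illustrative: the class name, thresholds and capacity figures are invented and do not correspond to ARVUE, the admission controller, or any other system from the thesis.

    # Hypothetical sketch of reactive VM provisioning with simple session-based
    # admission control; all names and numbers are invented for illustration.
    class Provisioner:
        def __init__(self, capacity_per_vm=100, scale_up_at=0.8, scale_down_at=0.3):
            self.capacity_per_vm = capacity_per_vm
            self.scale_up_at = scale_up_at
            self.scale_down_at = scale_down_at
            self.vms = 1

        def admit(self, current_sessions):
            """Admission control: reject new sessions when provisioned VMs are full."""
            return current_sessions < self.vms * self.capacity_per_vm

        def adjust(self, current_sessions):
            """Reactive scaling: add or remove VMs based on measured utilization."""
            utilization = current_sessions / (self.vms * self.capacity_per_vm)
            if utilization > self.scale_up_at:
                self.vms += 1                   # over-utilized: provision one more VM
            elif utilization < self.scale_down_at and self.vms > 1:
                self.vms -= 1                   # under-utilized: consolidate
            return self.vms

    provisioner = Provisioner()
    for sessions in (40, 90, 150, 60, 20):
        print(sessions, provisioner.admit(sessions), provisioner.adjust(sessions))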

Relevance: 20.00%

Publisher:

Abstract:

To assess the clinical relevance of a semi-quantitative measurement of human cytomegalovirus (HCMV) DNA in renal transplant recipients within the typical clinical context of a developing country, where virtually 100% of both recipients and donors are seropositive for this virus, we undertook HCMV DNA quantification using a simple, semi-quantitative, limiting dilution polymerase chain reaction (PCR). We evaluated this assay prospectively in 52 renal transplant patients from whom a total of 495 serial blood samples were collected. The samples scored HCMV positive by qualitative PCR had their levels of HCMV DNA determined by end-point dilution PCR. All patients were HCMV DNA positive during the monitoring period, and a diagnosis of symptomatic infection was made for 4 of the 52 patients. In symptomatic patients the geometric mean of the highest level of HCMV DNAemia was 152,000 copies per 10^6 leukocytes, while for the asymptomatic group this value was 12,050. Symptomatic patients showed high, protracted HCMV DNA levels, whereas asymptomatic patients demonstrated intermittent low or moderate levels. Using a cut-off value of 100,000 copies per 10^6 leukocytes, the limiting dilution assay had a sensitivity of 100%, a specificity of 92%, a positive predictive value of 43% and a negative predictive value of 100% for HCMV disease. In this patient group, there was universal HCMV infection but relatively infrequent symptomatic HCMV disease. The two patient groups were readily distinguished by monitoring with the limiting dilution assay, an extremely simple technology immediately applicable in any clinical laboratory with PCR capability.
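
For context, the predictive values reported above depend on the prevalence of HCMV disease in the tested group as well as on the assay's sensitivity and specificity. The standard relations are, in general form (not a recomputation of the study's figures):

    \mathrm{PPV} = \frac{Se \cdot p}{Se \cdot p + (1 - Sp)(1 - p)}, \qquad
    \mathrm{NPV} = \frac{Sp\,(1 - p)}{Sp\,(1 - p) + (1 - Se)\,p}

where Se is the sensitivity, Sp the specificity and p the prevalence of disease among the tested patients.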

Relevance: 20.00%

Publisher:

Abstract:

The objective of this project was to introduce a new software product to the pulp industry, a new market for the case company. An optimization-based scheduling tool has been developed to allow pulp operations to better control their production processes and improve both production efficiency and stability. Both the work here and earlier research indicate that there is potential for savings of around 1-5%. All the supporting data is available today, coming from distributed control systems, data historians and other existing sources. The pulp mill model, together with the scheduler, allows what-if analyses of the impacts and timely feasibility of various external actions, such as planned maintenance of any particular mill operation. The visibility gained from the model also proves to be a real benefit. The aim is to satisfy demand and gain extra profit while achieving the required customer service level. Research effort has been put both into understanding the minimum features needed to satisfy the scheduling requirements of the industry and into establishing the overall existence of the market. A qualitative study was conducted to identify both the competitive situation and the requirements versus gaps in the market. It became clear that there is no such system on the market today, and that there is room to improve the target market's overall process efficiency through such a planning tool. This thesis also provides the case company with a better overall understanding of the different processes in this particular industry.

Relevance: 20.00%

Publisher:

Abstract:

The aim of this thesis is to investigate how to match demand and supply effectively in an industrial and project-oriented business environment. The demand-supply balancing process is examined in three phases: demand planning and forecasting, synchronization of demand and supply, and measurement of the results. The thesis contains a single case study carried out at a company called Outotec. In the case study, demand is planned and forecasted with a qualitative (judgmental) forecasting method. Quantitative forecasting methods are investigated further to support the demand forecast and long-term planning. The sales and operations planning process is used in the synchronization of demand and supply. The demand forecast is applied in managing the supply chain of a critical unit of an elemental analyzer. Different metrics at the operational and strategic levels are proposed for measuring performance.
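
As one example of the kind of quantitative method that could complement the judgmental forecast mentioned above, simple exponential smoothing updates a one-step-ahead forecast from each new demand observation. The Python sketch below is illustrative only and is not necessarily the method examined in the case study; the demand history is invented.

    # Illustrative quantitative forecasting sketch (simple exponential smoothing);
    # the demand history is invented.
    def exponential_smoothing(demand, alpha=0.3):
        """Return one-step-ahead forecasts for a demand history (0 < alpha <= 1)."""
        forecast = demand[0]                  # initialize with the first observation
        forecasts = [forecast]
        for actual in demand[1:]:
            forecast = alpha * actual + (1 - alpha) * forecast
            forecasts.append(forecast)
        return forecasts

    history = [120, 135, 128, 150, 160, 142]
    print(exponential_smoothing(history))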

Relevance: 20.00%

Publisher:

Abstract:

This thesis studies the possibility of using lean tools and methods in a quotation process carried out in an office environment. The aim of the study was to find out and test the relevant lean tools and methods that can help to balance and standardize the quotation process and reduce the variance in quotation lead times and quality. Seminal works, research papers and guidebooks related to the topic were used as the basis for the theory development. Based on the literature review and the case company's own lean experience, the applicable lean tools and methods were selected to be tested by a sales support team. Production leveling, through product categorization and value stream mapping, was the key method used to balance the quotation process. The 5S method was introduced concurrently to standardize the work. The results of the testing period showed that lean tools and methods are applicable in an office process, and the selected tools and methods helped to balance and standardize the quotation process. The case company's sales support team decided to implement the new lean-based quotation process model.

Relevance: 20.00%

Publisher:

Abstract:

The aim of the present study was to assess the spectral behavior of the erector spinae muscle during isometric contractions performed before and after a dynamic manual load-lifting test carried out by the trunk, in order to determine the capacity of the muscle to perform this task. Nine healthy female students participated in the experiment. Their average age, height, and body mass (± SD) were 20 ± 1 years, 1.6 ± 0.03 m, and 53 ± 4 kg, respectively. The development of muscle fatigue was assessed by spectral analysis (median frequency) and by the root mean square amplitude over time. The test consisted of repeated bending movements of the trunk, starting from a 45º angle of flexion, with the application of approximately 15, 25 and 50% of the maximum individual load, to the upright position. The protocol used proved to be more reliable with loads exceeding 50% of the maximum for the identification of muscle fatigue by electromyography as a function of time. Most of the volunteers showed an increase in root mean square versus time on both the right (N = 7) and the left (N = 6) side, indicating a tendency to become fatigued. With respect to the changes in the median frequency of the electromyographic signal, the loads used in this study had no significant effect on either the right or the left side of the erector spinae muscle, suggesting that higher loads would produce more substantial results in the study of isotonic contractions.
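
The two fatigue indicators used above, RMS amplitude in the time domain and median frequency of the power spectrum, are typically computed per analysis window as in the Python sketch below. This is an illustrative example only, not the study's processing pipeline; the sampling rate, window length and synthetic signal are arbitrary.

    # Illustrative computation of the two EMG fatigue indicators discussed above.
    import numpy as np
    from scipy.signal import welch

    def emg_indicators(signal, fs=1000):
        """Return (RMS amplitude, median frequency in Hz) for one EMG segment."""
        rms = np.sqrt(np.mean(signal ** 2))
        freqs, psd = welch(signal, fs=fs, nperseg=min(256, len(signal)))
        cumulative = np.cumsum(psd)
        median_freq = freqs[np.searchsorted(cumulative, cumulative[-1] / 2)]
        return rms, median_freq

    rng = np.random.default_rng(0)
    segment = rng.normal(size=2000)           # synthetic noise standing in for raw EMG
    print(emg_indicators(segment))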

Relevance: 20.00%

Publisher:

Abstract:

The response to an oral calcium load test was assessed in 17 hypercalciuric nephrolithiasis patients who presented elevated parathyroid hormone (PTH) irrespective of ionized calcium (sCa2+) levels. Blood samples were collected at baseline (0 min) and at 60 and 180 min after a 1-g calcium load for determination of serum PTH, total calcium, sCa2+, and 1,25(OH)2D3. According to the sCa2+ level at baseline, patients were classified as normocalcemic (N = 9) or hypercalcemic (N = 8). Six healthy subjects were also evaluated as controls. Bone mineral density was reduced in 14/17 patients. In the normocalcemic group, mean PTH levels at 0, 60 and 180 min (95 ± 76, 56 ± 40, 57 ± 45 pg/ml, respectively) did not differ from the hypercalcemic group (130 ± 75, 68 ± 35, 80 ± 33 pg/ml) but were significantly higher than in healthy subjects, despite a similar elevation in sCa2+ at 60 and 180 min vs baseline in all 3 groups. Mean total calcium and 1,25(OH)2D3 were similar in the 3 groups. Additionally, we observed that 5 of the 9 normocalcemic patients presented a significantly higher concentration-time curve for serum PTH (AUC at 0, 60 and 180 min) than the other 4 patients and the healthy subjects, suggesting a primary parathyroid dysfunction. These data suggest that the individual response to an oral calcium load test may be a valuable dynamic tool to disclose subtle primary hyperparathyroidism in patients with high PTH and fluctuating sCa2+ levels, avoiding repeated measurements of both parameters.
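
The concentration-time comparison above is based on the area under the PTH curve over the three sampling times; a simple trapezoidal integration illustrates the idea. The Python sketch below uses the normocalcemic group means quoted above purely as example inputs; it is not the study's statistical analysis.

    # Trapezoidal AUC of the PTH concentration-time curve; example inputs echo
    # the normocalcemic group means quoted above.
    import numpy as np

    times = np.array([0, 60, 180])            # minutes after the 1-g calcium load
    pth = np.array([95.0, 56.0, 57.0])        # mean serum PTH, pg/ml, at each time

    auc = np.trapz(pth, times)                # area under the curve, pg*min/ml
    print(f"AUC(0-180 min) = {auc:.0f} pg*min/ml")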

Relevance: 20.00%

Publisher:

Abstract:

To evaluate the human T-cell lymphotropic virus type I (HTLV-I) proviral DNA load among asymptomatic HTLV-I-infected carriers and patients with HTLV-I-associated myelopathy/tropical spastic paraparesis (HAM/TSP), real-time PCR using TaqMan probes for the pol gene was performed on two million peripheral blood mononuclear cells (PBMC). The albumin gene was the internal genomic control and MT2 cells were used as the positive control. The results are reported as copies/10,000 PBMC, and the detection limit was 10 copies. A total of 89 subjects (44 HAM/TSP and 45 healthy HTLV-I-infected carriers) followed up at the Institute of Infectious Diseases "Emilio Ribas" and in the Neurology Division of Hospital of Clínicas were studied. The asymptomatic HTLV-I-infected carriers had a median of 271 copies (range 5 to 4756 copies), whereas the HAM/TSP cases presented a median of 679 copies (range 5-5360 copies) per 10,000 PBMC. Thus, HAM/TSP patients presented a significantly higher HTLV-I proviral DNA load than healthy HTLV-I carriers (P = 0.005, one-tailed Mann-Whitney test). As observed in other persistent infections, proviral DNA load quantification may be an important tool for monitoring HTLV-I-infected subjects. However, long-term follow-up is necessary to validate this assay in the clinical setting.
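
The group comparison described above uses a nonparametric Mann-Whitney test on the proviral loads of the two groups. The Python sketch below shows the form of such a comparison; the copy numbers per 10,000 PBMC are invented, not the study's measurements.

    # Sketch of the nonparametric group comparison described above; the copy
    # numbers per 10,000 PBMC below are invented, not the study's data.
    from scipy.stats import mannwhitneyu

    asymptomatic = [50, 120, 271, 300, 410, 95]
    ham_tsp = [400, 679, 820, 1500, 250, 900]

    stat, p_value = mannwhitneyu(ham_tsp, asymptomatic, alternative="greater")
    print(f"U = {stat}, one-sided p = {p_value:.3f}")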