963 results for load balancing algorithm
Abstract:
Work presented within the scope of the Master's in Informatics Engineering (Engenharia Informática), as a partial requirement for obtaining the degree of Master in Informatics Engineering
Abstract:
In this paper we present the operational matrices of the left Caputo fractional derivative, the right Caputo fractional derivative and the Riemann–Liouville fractional integral for shifted Legendre polynomials. We develop an accurate numerical algorithm to solve the two-sided space–time fractional advection–dispersion equation (FADE) based on a spectral shifted Legendre tau (SLT) method in combination with the derived shifted Legendre operational matrices. The fractional derivatives are described in the Caputo sense. We propose a spectral SLT method for both the temporal and spatial discretizations of the two-sided space–time FADE. This technique reduces the two-sided space–time FADE to a system of algebraic equations, which simplifies the problem. Numerical experiments are carried out to confirm the spectral accuracy and efficiency of the proposed algorithm. By selecting relatively few Legendre polynomial degrees, we are able to obtain very accurate approximations, demonstrating the utility of the new approach over other numerical methods.
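For reference, the standard definitions of the fractional operators named above, on an interval [a, b] with n − 1 < α ≤ n (a sketch of the usual notation; the paper's own symbols are not reproduced here):

\[ {}^{C}D_{a^+}^{\alpha} f(x) = \frac{1}{\Gamma(n-\alpha)} \int_a^x (x-t)^{n-\alpha-1} f^{(n)}(t)\,dt \quad \text{(left Caputo derivative)} \]
\[ {}^{C}D_{b^-}^{\alpha} f(x) = \frac{(-1)^n}{\Gamma(n-\alpha)} \int_x^b (t-x)^{n-\alpha-1} f^{(n)}(t)\,dt \quad \text{(right Caputo derivative)} \]
\[ I_{a^+}^{\alpha} f(x) = \frac{1}{\Gamma(\alpha)} \int_a^x (x-t)^{\alpha-1} f(t)\,dt \quad \text{(Riemann–Liouville integral)} \]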
Abstract:
In this paper, we formulate the electricity retailers' short-term decision-making problem in a liberalized retail market as a multi-objective optimization model. Retailers with light physical assets, such as generation and storage units in the distribution network, are considered. Following advances in smart grid technologies, electricity retailers are increasingly able to employ incentive-based demand response (DR) programs in addition to their physical assets to effectively manage the risks of market price and load variations. In this model, DR scheduling is performed simultaneously with the dispatch of generation and storage units. The ultimate goal is to find the optimal values of the hourly financial incentives offered to the end-users. The proposed model considers the capacity obligations imposed on retailers by the grid operator. The profit-seeking retailer also aims to minimize peak demand in order to avoid high capacity charges in the form of grid tariffs or penalties. The non-dominated sorting genetic algorithm II (NSGA-II), a fast and elitist multi-objective evolutionary algorithm, is used to solve the multi-objective problem. A case study is solved to illustrate the efficient performance of the proposed methodology. Simulation results show the effectiveness of the model for designing incentive-based DR programs and indicate the efficiency of NSGA-II in solving the retailers' multi-objective problem.
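As an illustration of the non-dominated sorting step at the core of NSGA-II, a minimal Python sketch; the two objectives and all values here are illustrative assumptions, not taken from the paper:

    def dominates(a, b):
        """True if objective vector a is no worse than b in every objective
        and strictly better in at least one (minimization)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated_sort(objectives):
        """Split objective vectors into successive Pareto fronts (index lists)."""
        fronts, remaining = [], set(range(len(objectives)))
        while remaining:
            front = [i for i in remaining
                     if not any(dominates(objectives[j], objectives[i])
                                for j in remaining if j != i)]
            fronts.append(front)
            remaining -= set(front)
        return fronts

    # Illustrative objectives per solution: (negated profit, peak demand)
    population = [(10.0, 5.0), (8.0, 7.0), (12.0, 4.0), (11.0, 6.0)]
    print(non_dominated_sort(population))  # e.g. [[0, 1, 2], [3]]: (11.0, 6.0) is dominated by (10.0, 5.0)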
Abstract:
Biomechanical gait parameters—ground reaction forces (GRFs) and plantar pressures—during load carriage of young adults were compared at a low gait cadence and a high gait cadence. Differences between load carriage and normal walking at both gait cadences were also assessed. A force plate and an in-shoe plantar pressure system were used to assess 60 adults while they were walking either normally (unloaded condition) or wearing a backpack (loaded condition) at low (70 steps per minute) and high (120 steps per minute) gait cadences. GRF and plantar pressure peaks were scaled to body weight (or body weight plus backpack weight). With medium to high effect sizes, we found greater anterior–posterior and vertical GRFs and greater plantar pressure peaks in the rearfoot, forefoot and hallux when the participants walked carrying a backpack at the high gait cadence compared to the low gait cadence. Differences between loaded and unloaded conditions at both gait cadences were also observed.
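The scaling step described above amounts to dividing each peak by the supported weight; a trivial Python sketch, with all names and values illustrative:

    def normalize_peak(peak_newtons, body_mass_kg, backpack_mass_kg=0.0, g=9.81):
        """Scale a GRF or pressure peak to body weight (unloaded condition)
        or to body weight plus backpack weight (loaded condition)."""
        return peak_newtons / ((body_mass_kg + backpack_mass_kg) * g)

    print(normalize_peak(850.0, 70.0))         # unloaded: normalized to body weight
    print(normalize_peak(1000.0, 70.0, 10.0))  # loaded: body weight + backpack weight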
Abstract:
The objective of the present study was to evaluate the serum viral load in chronically infected hepatitis B virus (HBV) patients and to investigate the distribution of HBV genotypes in São Paulo city. Quantitative HBV-DNA assays and HBV genotyping have gained importance for predicting HBV disease progression and have been employed for assessing infectivity, monitoring treatment and detecting the emergence of drug resistance. Twenty-nine Brazilian patients with suspected chronic hepatitis B were studied, using real-time PCR for viral load determination and direct DNA sequencing for genotyping. Serology revealed chronic HBV infection in 22 samples. HBV-DNA was positive in 68% of the samples (15/22). The phylogenetic analysis disclosed that eleven patients were infected with HBV genotype A, two with genotype F and two with genotype D. Thus, genotype A was the most prevalent in our study.
Abstract:
Within operational research, the container loading problem is known for seeking to define a cargo configuration that optimizes the use of the space available for packing. This problem can be presented in several forms, which vary according to the characteristics of each packing instance. These characteristics may be: the type of cargo to be loaded (homogeneous or heterogeneous), whether the cargo may be rotated in all of its dimensions or only in some, the profit associated with each loaded box, or constraints inherent to the container, such as its dimensions. Interest in the study of container loading problems has been receiving ever more emphasis, for several reasons. One of them is financial: transport is a practice that represents costs, and it is important to reduce these costs by making the best possible use of the container volume. Another concern motivating the study of this problem relates to environmental factors, seeking to rationalize natural resources, which is also linked to financial matters. Several proposals to solve this problem can be found in the literature, each addressing a variant of the problem; these proposals may be deterministic or non-deterministic, using heuristics or metaheuristics. The study carried out in this dissertation describes some of these proposals, namely the metaheuristics used to solve this problem. The work presented here also brings a new metaheuristic, more precisely a genetic algorithm whose objective is to present a cargo configuration for a single-container packing problem. The genetic algorithm aims at solving the following problem: packing several rectangular boxes of various sizes into a container. This problem is known as Bin-Packing. The novelty this genetic algorithm introduces over the solutions presented to date is a new way of creating initial patterns: the HSSI heuristic (Heurística de Suavização de Superfícies Irregulares, a heuristic for smoothing irregular surfaces) is used to create an initial population so as to improve the genetic algorithm. The HSSI heuristic tries to solve packing problems by simulating the behaviour of most people when doing this process in real life; however, it has a reduced search field among the possible solutions, so a genetic algorithm is used to widen this search field and explore new solutions. In the end, the aim is to obtain a software tool in which a given single-container packing problem can be configured and its solution obtained through the genetic algorithm. The main objective of this study is therefore to contribute research and conclusions on this problem and to bring a new solution proposal for the container loading problem.
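As an illustration of the genetic-algorithm machinery described above, a minimal Python sketch over a volume-based simplification of the problem; the geometric decoding of a packing (and the HSSI surface-smoothing seeding) is deliberately left out, and every name and value here is an illustrative assumption:

    import random

    def fitness(order, volumes, capacity):
        """Proxy fitness: total volume packed when boxes are taken greedily in
        chromosome order. A real decoder would place boxes geometrically; the
        dissertation seeds the initial population with the HSSI heuristic."""
        packed = 0.0
        for i in order:
            if packed + volumes[i] <= capacity:
                packed += volumes[i]
        return packed

    def order_crossover(p1, p2):
        """Order crossover (OX): keep a slice of parent 1, fill the remaining
        genes in the order they appear in parent 2."""
        a, b = sorted(random.sample(range(len(p1)), 2))
        child = [None] * len(p1)
        child[a:b] = p1[a:b]
        rest = [g for g in p2 if g not in child[a:b]]
        for k in range(len(child)):
            if child[k] is None:
                child[k] = rest.pop(0)
        return child

    def genetic_algorithm(volumes, capacity, pop_size=30, generations=200):
        n = len(volumes)
        population = [random.sample(range(n), n) for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=lambda o: fitness(o, volumes, capacity), reverse=True)
            parents = population[:pop_size // 2]
            children = [order_crossover(*random.sample(parents, 2))
                        for _ in range(pop_size - len(parents))]
            for c in children:                        # swap mutation
                if random.random() < 0.2:
                    i, j = random.sample(range(n), 2)
                    c[i], c[j] = c[j], c[i]
            population = parents + children
        return max(population, key=lambda o: fitness(o, volumes, capacity))

    box_volumes = [12.0, 7.5, 9.0, 4.0, 6.5, 11.0, 3.0]
    print(genetic_algorithm(box_volumes, capacity=30.0))  # a good packing order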
Abstract:
Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality of service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as optimizing energy consumption. However, the performance of an application running inside a VM is not guaranteed, due to interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make efficient provisioning of resources an even more difficult and challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance deviation estimator and a scheduling algorithm to guide resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill contracted SLAs of real-world environments while reducing energy costs by as much as 21%.
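A hedged Python sketch of how such a combined placement score might look; the linear power model, the interference table, and every name and weight below are assumptions for illustration, not the paper's actual estimator:

    from dataclasses import dataclass, field

    # Assumed interference penalties between co-hosted workload types
    INTERFERENCE = {("cpu", "cpu"): 0.3, ("cpu", "net"): 0.1,
                    ("net", "cpu"): 0.1, ("net", "net"): 0.4}

    @dataclass
    class Host:
        cpu_capacity: float
        cpu_load: float = 0.0
        workloads: list = field(default_factory=list)   # types of hosted VMs

        def power(self, load):
            """Assumed linear power model: idle power plus a load-proportional term."""
            return 100.0 + 150.0 * (load / self.cpu_capacity)

    def placement_score(host, vm_type, vm_demand, alpha=0.5):
        """Lower is better: weighs expected interference against marginal power cost."""
        interference = sum(INTERFERENCE[(w, vm_type)] for w in host.workloads)
        power_delta = host.power(host.cpu_load + vm_demand) - host.power(host.cpu_load)
        return alpha * interference + (1.0 - alpha) * power_delta / 150.0

    def schedule(vm_type, vm_demand, hosts):
        """Place the VM on the feasible host with the best combined score."""
        feasible = [h for h in hosts if h.cpu_load + vm_demand <= h.cpu_capacity]
        best = min(feasible, key=lambda h: placement_score(h, vm_type, vm_demand))
        best.cpu_load += vm_demand
        best.workloads.append(vm_type)
        return best

    hosts = [Host(16.0), Host(16.0, cpu_load=8.0, workloads=["cpu", "net"])]
    print(schedule("cpu", 4.0, hosts))   # lands on the emptier, less interfering host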
Abstract:
Stone masonry is one of the oldest and most widely used building techniques worldwide. Nevertheless, the structural response of masonry structures is complex, and effective knowledge about their mechanical behaviour is still limited. This is particularly evident when describing their out-of-plane behaviour under horizontal loading, as in the case of earthquake action. In this context, this paper describes an experimental program, conducted in a laboratory environment, aimed at characterizing the out-of-plane behaviour of traditional unreinforced stone masonry walls. In the scope of this campaign, six full-scale sacco stone masonry specimens were fully characterised with regard to their most important mechanical, geometric and dynamic features and were tested using two different loading techniques under three distinct vertical pre-compression states: three of the specimens were subjected to an out-of-plane surface load by means of a system of airbags, and the remaining three were subjected to an out-of-plane horizontal line load at the top. From the experiments it was possible to observe that both test setups were able to globally mobilize the out-of-plane response of the walls, which presented substantial displacement capacity, with ratios of ultimate displacement to wall thickness ranging between 26 and 45%, as well as good energy dissipation capacity. Finally, very interesting results were also obtained from a simple analytical model used herein to compute a set of experiment-based ratios, namely between the maximum stability displacement and the wall thickness, for which a mean value of about 60% was found.
Abstract:
The purpose of this work is to present an algorithm to solve nonlinear constrained optimization problems, using the filter method with the inexact restoration (IR) approach. In the IR approach, two independent phases are performed in each iteration: the feasibility phase and the optimality phase. The first directs the iterative process towards the feasible region, i.e. it finds a point with smaller constraint violation. The optimality phase starts from this point, and its goal is to optimize the objective function over the space of satisfied constraints. To evaluate the solution approximations in each iteration, a scheme based on the filter method is used in both phases of the algorithm. This method replaces the merit functions that are based on penalty schemes, avoiding the related difficulties, such as estimating the penalty parameter and the non-differentiability of some of these functions. The filter method is implemented in the context of the line search globalization technique. A set of more than two hundred AMPL test problems is solved. The algorithm developed is compared with the LOQO and NPSOL software packages.
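As an illustration of the filter idea that replaces a merit function, a minimal Python sketch; a trial point is the pair (h, f) of constraint violation and objective value, and the margin gamma and all values are illustrative assumptions:

    def acceptable(point, filter_entries, gamma=1e-5):
        """A trial pair (h, f) is accepted if, against every stored entry,
        it sufficiently reduces either the infeasibility h or the objective f."""
        h, f = point
        return all(h <= (1.0 - gamma) * h_k or f <= f_k - gamma * h_k
                   for h_k, f_k in filter_entries)

    def add_to_filter(point, filter_entries):
        """Store an accepted pair and drop any entries it dominates."""
        h, f = point
        kept = [(h_k, f_k) for h_k, f_k in filter_entries
                if not (h <= h_k and f <= f_k)]
        return kept + [(h, f)]

    # Both the feasibility and the optimality phase test their trial points
    # against the same filter before accepting a step:
    flt = [(0.5, 3.0), (0.1, 5.0)]
    print(acceptable((0.05, 4.0), flt))  # True: less infeasible than both entries
    print(acceptable((0.5, 3.0), flt))   # False: no sufficient improvement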
Abstract:
BACKGROUND: Cytomegalovirus (CMV) remains an important pathogen for immunocompromised patients, even in the era of HAART. The present study aimed at evaluating the influence of CMV viral load and its gB genotypes on the outcome of AIDS patients. METHODS: Blood samples from 101 AIDS patients were collected and tested for HIV load, CD4-cell count and opportunistic pathogens, including CMV. Semi-nested PCRs were run to detect the CMV genome and, in the positive samples, gB genotyping and CMV load were established using enzymatic restriction and real-time PCR, respectively. All patients were clinically followed for four years. RESULTS: In thirty patients (31%) CMV was detected, and all fatal cases (n = 5) occurred in this group of patients (p = 0.007), although only two patients had CMV disease (1.9%). However, viral load was not statistically associated with any analyzed parameter. The most frequently observed CMV genotype was gB2 (45.16%), followed by gB3 (35.48%). The gB2 genotype was more frequently found in patients with CD4-cell counts under 200 cells/mm³ (p = 0.0017), and almost all fatal cases (80%) had the gB2 genotype. CONCLUSIONS: Our study suggests that CMV and its polymorphisms in biologically relevant genes, such as the gB-encoding ORF, may still influence the prognosis and outcome of AIDS patients. The gB2 genotype was associated with poor patient outcome.
Abstract:
Schistosomiasis constitutes a major public health problem, with an estimated 200 million individuals infected worldwide and 700 million people living in risk areas. In Brazil there are areas of high, medium and low endemicity. Studies have shown that in endemic areas with a low prevalence of Schistosoma infection the sensitivity of parasitological methods is clearly reduced; consequently, diagnosis is often impeded by false-negative results. The aim of this study is to present a PCR reamplification (Re-PCR) protocol for the detection of Schistosoma mansoni in samples with a low parasite load (fewer than 100 eggs per gram (epg) of feces). Three methods were used for lysing the envelopes of the S. mansoni eggs, and two DNA extraction techniques were carried out. The extracted DNA was quantified, and the results suggested that the extraction technique mixing glass beads with a guanidine isothiocyanate/phenol/chloroform (GT) solution produced good results. PCR reamplification was conducted, and the detection sensitivity was found to be five eggs per 500 mg of artificially marked feces. The results achieved using these methods suggest that they are potentially viable for the detection of Schistosoma infection with a low parasite load.
Abstract:
This report describes the full research proposal for the project "Balancing and lot-sizing mixed-model lines in the footwear industry", to be developed as part of the master programme in Engenharia Electrotécnica e de Computadores - Sistemas de Planeamento Industrial at the Instituto Superior de Engenharia do Porto. The Portuguese footwear industry is undergoing a period of great development and innovation. The numbers speak for themselves: Portuguese footwear exported 71 million pairs of shoes to over 130 countries in 2012. It is a diverse sector, which covers different categories of women's, men's and children's shoes, each of them with various models. New and technologically advanced mixed-model assembly lines are being designed and installed to replace traditional mass assembly lines. Obviously, there is a need to manage them conveniently and to improve their operations. This work focuses on balancing and lot-sizing stitching mixed-model lines in a real-world environment. For that purpose, it will be fundamental to develop and evaluate adequate, effective solution methods. Different objectives relevant to the companies may be considered, such as minimizing the number of workstations and minimizing the makespan, while taking into account many practical restrictions. The solution approaches will be based on approximate methods, namely metaheuristics. To show the impact of having different lots in production, the initial maximum amount for each lot is changed and a Tabu Search based procedure is used to improve the solutions. The developed approaches will be evaluated and tested, with special attention given to the solution of real applied problems. Future work may include the study of other neighbourhood structures related to Tabu Search and the development of ways to speed up the evaluation of neighbours, as well as improving the balancing solution method.
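A minimal Tabu Search skeleton in Python for a line-balancing flavour of the problem; the task times, the single-move neighbourhood and the max-station-load cost are illustrative assumptions, not the dissertation's actual model:

    from collections import deque

    TASK_TIMES = [4, 3, 6, 2, 5, 4]   # illustrative stitching-operation times
    STATIONS = 3

    def cost(assignment):
        """Cycle-time proxy: the load of the most loaded workstation."""
        loads = [0] * STATIONS
        for task, station in enumerate(assignment):
            loads[station] += TASK_TIMES[task]
        return max(loads)

    def neighbours(assignment):
        """All solutions reachable by moving one task to another workstation."""
        for task, station in enumerate(assignment):
            for s in range(STATIONS):
                if s != station:
                    moved = list(assignment)
                    moved[task] = s
                    yield tuple(moved)

    def tabu_search(initial, tenure=8, iterations=200):
        current = best = tuple(initial)
        tabu = deque([current], maxlen=tenure)   # short-term memory
        for _ in range(iterations):
            candidates = [n for n in neighbours(current) if n not in tabu]
            if not candidates:
                break
            current = min(candidates, key=cost)  # best admissible move, even if worse
            tabu.append(current)
            if cost(current) < cost(best):
                best = current
        return best, cost(best)

    print(tabu_search([0] * len(TASK_TIMES)))   # balanced assignment and its cycle time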
Abstract:
Dissertation submitted for obtaining the degree of Master in Biomedical Engineering
Abstract:
Nowadays, real-time systems are growing in importance and complexity. With the move from the uniprocessor to the multiprocessor environment, the work done for the former is not fully applicable to the latter, since the level of complexity differs, mainly due to the existence of multiple processors in the system. It was soon realized that the complexity of the problem does not grow linearly with the addition of processors. In fact, this complexity stands as a barrier to scientific progress in this area, much of which remains unexplored, and this is witnessed essentially in the case of task scheduling. The move to this new environment, whether for real-time systems or not, promises the opportunity to carry out work that would never be possible in the former, thus creating new performance guarantees, lower monetary costs and lower energy consumption. This last factor appeared early on as perhaps the greatest barrier to the development of new processors in the uniprocessor field: as new processors were released to the market offering higher performance, a heat-generation limit became apparent, forcing the emergence of the multiprocessor field. In the future, the number of processors on a given chip is expected to increase and, obviously, new techniques for exploiting their inherent advantages have to be developed; the area of scheduling algorithms is no exception. Over the years, different categories of multiprocessor algorithms have been developed in response to this problem, chiefly: global, partitioned and semi-partitioned. The global approach assumes the existence of a global queue that is accessible by all available processors. This makes task migration possible, i.e. the execution of a task can be stopped and resumed on a different processor. At any given instant, from a group of tasks, the m highest-priority tasks are selected for execution. This type promises high utilization bounds, at a high cost in task preemptions/migrations. In contrast, partitioned algorithms place tasks into partitions, and these are assigned to the available processors, i.e. one partition per processor. For this reason task migration is not possible, so the utilization bound is not as high as in the previous case, but the number of task preemptions decreases significantly. The semi-partitioned scheme is a hybrid of the previous two: some tasks are partitioned, to be executed exclusively by a group of processors, while others are assigned to a single processor. The result is a solution capable of distributing the work to be performed in a more efficient and balanced way. Unfortunately, in all of these cases there is a discrepancy between theory and practice, because assumptions end up being made that do not hold in real life. To address this problem, it is necessary to implement these scheduling algorithms in real operating systems and verify their applicability so that, when they fall short, the necessary changes can be made, both at the theoretical and at the practical level.
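To make the partitioned category concrete, a common baseline is first-fit decreasing assignment of task utilizations to processors; a minimal Python sketch, in which the utilization bound, the task set and all names are illustrative assumptions, not from the thesis:

    def partition_first_fit_decreasing(utilizations, num_cpus, bound=1.0):
        """Assign each task to the first processor whose accumulated utilization
        stays within the per-processor schedulability bound (1.0 for EDF).
        Returns one task-index list per processor, or None when some task
        cannot be placed -- the case where migration (global or
        semi-partitioned scheduling) could still help."""
        assignments = [[] for _ in range(num_cpus)]
        loads = [0.0] * num_cpus
        for i in sorted(range(len(utilizations)), key=lambda i: -utilizations[i]):
            for c in range(num_cpus):
                if loads[c] + utilizations[i] <= bound:
                    assignments[c].append(i)
                    loads[c] += utilizations[i]
                    break
            else:
                return None   # partitioning forbids splitting the task
        return assignments

    # Utilization = worst-case execution time / period; e.g. four tasks, two CPUs
    print(partition_first_fit_decreasing([0.6, 0.5, 0.4, 0.3], num_cpus=2))  # [[0, 2], [1, 3]]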
Abstract:
Classical serological screening assays for Chagas' disease are time-consuming and subjective. The objective of the present work is to evaluate the enzyme immunoassay (ELISA) methodology and to propose an algorithm for Chagas' disease screening to be applied in blood banks. A total of 7,999 blood donor samples were screened by both reverse passive hemagglutination (RPHA) and indirect immunofluorescence assay (IFA). Samples reactive on RPHA and/or IFA were submitted to supplementary RPHA, IFA and complement fixation (CFA) tests. This strategy allowed us to create a panel of 60 samples to evaluate the ELISA methodology from three different manufacturers. The sensitivity of screening by IFA and by the three different ELISAs was 100%. Specificity was better with the ELISA methodology. For Chagas' disease, ELISA seems to be the best test for blood donor screening, because it showed high sensitivity and specificity, is not subjective and can be automated. It was therefore possible to propose an algorithm to screen samples and confirm donor results at the blood bank.
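One possible reading of such a screening algorithm as a decision flow, sketched in Python; the function, outcomes and confirmatory sequence below are illustrative assumptions rather than the exact protocol proposed in the paper:

    def screen_donor(elisa_reactive, supplementary_positive=False):
        """Sketch of a blood-bank flow: ELISA as the single screening test
        (sensitive, specific, automatable); reactive samples go on to
        supplementary tests (e.g. IFA, CFA) to confirm the donor's status."""
        if not elisa_reactive:
            return "release unit"
        if supplementary_positive:
            return "discard unit, notify and refer donor"
        return "discard unit, recall donor for retesting"

    print(screen_donor(False))        # non-reactive sample
    print(screen_donor(True, True))   # reactive and confirmed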