35 results for stress based approach
Abstract:
This paper proposes a novel agent-based approach to Meta-Heuristics self-configuration. Meta-heuristics are algorithms with parameters that need to be set as efficiently as possible in order to ensure their performance. A learning module for the self-parameterization of Meta-heuristics (MH) in a Multi-Agent System (MAS) for the resolution of scheduling problems is proposed in this work. The learning module is based on Case-based Reasoning (CBR), and two different integration approaches are proposed. A computational study is carried out to compare the two CBR integration perspectives. Finally, some conclusions are reached and future work is outlined.
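The CBR retrieval step at the heart of such a self-parameterization module can be sketched as follows; the problem features, the meta-heuristic parameters, and the similarity metric are illustrative assumptions, not the paper's actual design:

```python
# Hedged sketch: nearest-neighbour case retrieval for meta-heuristic
# self-parameterization. Feature and parameter names are hypothetical.
import math

case_base = [
    # (problem features, meta-heuristic parameters that worked well)
    ({"jobs": 20,  "machines": 5},  {"tabu_tenure": 7,  "iterations": 500}),
    ({"jobs": 100, "machines": 10}, {"tabu_tenure": 15, "iterations": 2000}),
    ({"jobs": 50,  "machines": 8},  {"tabu_tenure": 10, "iterations": 1000}),
]

def distance(a, b):
    """Euclidean distance over the shared numeric features of two cases."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def retrieve_parameters(new_problem):
    """Reuse the parameters of the most similar stored case."""
    _, params = min(case_base, key=lambda c: distance(c[0], new_problem))
    return params

print(retrieve_parameters({"jobs": 90, "machines": 12}))
```

A full CBR cycle would also revise and retain cases after running the meta-heuristic; only the retrieve/reuse steps are shown here.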
Abstract:
Following the deregulation experience of retail electricity markets in most countries, the majority of new entrants to the liberalized retail market were pure REPs (retail electricity providers). These entities were subject to financial risks because of unexpected price variations, price spikes, volatile loads and the potential for market power exertion by GENCOs (generation companies). A REP can manage market risks by employing DR (demand response) programs and using its generation and storage assets at the distribution network to serve its customers. The proposed model suggests how a REP with light physical assets, such as DG (distributed generation) units and ESSs (energy storage systems), can survive in a competitive retail market. The paper discusses effective risk management strategies for REPs to deal with the uncertainties of the DAM (day-ahead market) and how to hedge financial losses in the market. A two-stage stochastic programming problem is formulated. It aims to establish financial incentive-based DR programs and the optimal dispatch of the DG units and ESSs. The uncertainty of the forecasted day-ahead load demand and electricity price is also taken into account with a scenario-based approach. The principal advantage of this model for REPs is reducing the risk of financial losses in DAMs, and the main benefit for the whole system is market power mitigation by virtually increasing the price elasticity of demand and reducing the peak demand.
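The scenario-based, two-stage structure can be illustrated with a deliberately tiny example: a first-stage DG commitment is chosen to maximize the probability-weighted second-stage profit over price/load scenarios. All numbers and the cost model are illustrative, not the paper's formulation:

```python
# Hedged sketch: scenario-based evaluation of a two-stage decision for a
# retail electricity provider. Prices, loads and costs are illustrative.
scenarios = [  # (probability, day-ahead price EUR/MWh, load MW)
    (0.2, 120.0,  90.0),
    (0.5,  60.0, 100.0),
    (0.3,  40.0, 110.0),
]
RETAIL_PRICE = 80.0   # fixed tariff charged to customers
DG_COST = 70.0        # marginal cost of own distributed generation

def expected_profit(dg_capacity):
    """First stage: commit DG capacity. Second stage: per scenario,
    serve the load from DG (up to capacity) when the DAM is dearer."""
    total = 0.0
    for prob, price, load in scenarios:
        dg = min(dg_capacity, load) if price > DG_COST else 0.0
        cost = dg * DG_COST + (load - dg) * price
        total += prob * (RETAIL_PRICE * load - cost)
    return total

# Enumerate candidate first-stage commitments (a real model would solve
# a stochastic program instead of enumerating).
best = max(range(0, 101, 10), key=expected_profit)
print(best, round(expected_profit(best), 1))
```

The hedging value of the physical asset shows up as the gap between the expected profit with and without DG capacity.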
Abstract:
The intensification of agricultural productivity is an important challenge worldwide. However, environmental stressors can pose challenges to this intensification. The progressive occurrence of the cyanotoxins cylindrospermopsin (CYN) and microcystin-LR (MC-LR) as a potential consequence of eutrophication and climate change is of increasing concern in the agricultural sector, because these cyanotoxins have been reported to exert harmful effects in crop plants. A proteomic-based approach has been shown to be a suitable tool for the detection and identification of the primary responses of organisms exposed to cyanotoxins. The aim of this study was to compare the leaf-proteome profiles of lettuce plants exposed to environmentally relevant concentrations of CYN and a MC-LR/CYN mixture. Lettuce plants were exposed to 1, 10, and 100 µg/L CYN and a MC-LR/CYN mixture for five days. The proteins of lettuce leaves were separated by two-dimensional electrophoresis (2-DE), and those that were differentially abundant were then identified by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF/TOF MS). The biological functions of the proteins most represented in both experiments were photosynthesis, carbon metabolism, and the stress/defense response. Proteins involved in protein synthesis and signal transduction were also highly represented in the MC-LR/CYN experiment. Although distinct protein abundance patterns were observed in the two experiments, the effects appear to be concentration-dependent, and the effects of the mixture were clearly stronger than those of CYN alone. The obtained results highlight the putative tolerance of lettuce to CYN at concentrations up to 100 µg/L. Furthermore, the combination of CYN with MC-LR at low concentrations (1 µg/L) stimulated a significant increase in the fresh weight (fr. wt) of lettuce leaves and, at the proteomic level, resulted in an increase in abundance of a high number of proteins.
In contrast, many proteins exhibited a decrease in abundance or were absent in the gels of the simultaneous exposure to 10 and 100 µg/L MC-LR/CYN. In the latter, a significant decrease in the fr. wt of lettuce leaves was also observed. These findings provide important insights into the molecular mechanisms of the lettuce response to CYN and MC-LR/CYN and may contribute to the identification of potential protein markers of exposure and of proteins that may confer tolerance to CYN and MC-LR/CYN. Furthermore, because lettuce is an important crop worldwide, this study may improve our understanding of the potential impact of these cyanotoxins on its quality traits (e.g., the presence of allergenic proteins).
Abstract:
Over the past decades, several approaches to schedulability analysis have been proposed for both uni-processor and multi-processor real-time systems. Although different techniques are employed, very little has been put forward on the use of formal specifications, with the consequent possibility of misinterpretations or ambiguities in the problem statement. Using a logic-based approach to schedulability analysis in the design of hard real-time systems eases the synthesis of correct-by-construction procedures for both static and dynamic verification processes. In this paper we propose a novel approach to schedulability analysis based on a timed temporal logic with time durations. Our approach subsumes classical methods for uni-processor scheduling analysis over compositional resource models by providing the developer with counter-examples, and by ruling out schedules that cause safety violations in the system. We also provide an example showing the effectiveness of our proposal.
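One of the classical uni-processor methods such a logic-based framework would subsume is the rate-monotonic utilization bound of Liu and Layland; a minimal sketch, with an illustrative task set, is:

```python
# Hedged sketch: the Liu & Layland sufficient schedulability test for
# rate-monotonic scheduling on one processor. Task sets are illustrative.

def rm_utilization_test(tasks):
    """tasks: list of (computation_time, period) pairs.
    Sufficient condition: total utilization <= n * (2^(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    return utilization <= n * (2 ** (1 / n) - 1)

print(rm_utilization_test([(1, 4), (2, 8)]))  # U = 0.5, below the bound
print(rm_utilization_test([(3, 4), (2, 8)]))  # U = 1.0, above the bound
```

Note the test is only sufficient: a task set failing the bound may still be schedulable, which is where exact analyses (and counter-example-producing logics) add value.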
Abstract:
It is imperative to accept that failures can and will occur, even in meticulously designed distributed systems, and to design proper measures to counter those failures. Passive replication minimises resource consumption by only activating redundant replicas in case of failure, as providing and applying state updates is typically less resource-demanding than requesting execution. However, most existing solutions for passive fault tolerance are designed and configured at design time, explicitly and statically identifying the most critical components and their number of replicas, and lack the flexibility needed to handle the runtime dynamics of distributed component-based embedded systems. This paper proposes a cost-effective adaptive fault tolerance solution with significantly lower overhead than a strict active redundancy-based approach, achieving high error coverage with the minimum amount of redundancy. The activation of passive replicas is coordinated through a feedback-based coordination model that reduces the complexity of the interactions needed among components until a new collective global service solution is determined, improving the overall maintainability and robustness of the system.
Abstract:
Objectives: To verify the effect of an intervention based on the Bobath Concept approach on the Anticipatory Postural Adjustments at gait initiation in two children with spastic hemiparesis. It was also intended to verify the effect of this approach on activities and participation, as well as to compare the individual characteristics of the two children with their capacity for change after the intervention. Methods: The assessment was performed before and three months after the intervention using electromyography, a force platform, a video camera system, a photographic camera, and the International Classification of Functioning for Children and Youth. Results: The muscle activation sequence changed only in child A. Standing posture, muscle activity, centre-of-pressure displacement, and activities and participation changed in both children, with child A showing a greater capacity for change. Conclusion: The intervention based on the Bobath Concept approach induced positive changes in the Anticipatory Postural Adjustments and in the activities and participation of the cases under study.
Abstract:
To comply with natural gas demand growth patterns and Europe's import dependency, the gas industry needs to organize an efficient upstream infrastructure. The best location of Gas Supply Units – GSUs – and the alternative transportation mode – physical or virtual pipelines – are the keys to a successful industry. In this work we study the optimal location of GSUs, as well as the most efficient allocation of gas loads to sources, selecting the best transportation mode, observing specific technical restrictions and minimizing total system costs. For the location of GSUs on the system we use the P-median problem; for assigning gas demand nodes to source facilities we use the classical transportation problem. The developed model is an optimisation-based approach built on a Lagrangean heuristic, using Lagrangean relaxation for P-median problems – the Simple Lagrangean Heuristic. The solution of this heuristic can be improved by adding a local search procedure – the Lagrangean Reallocation Heuristic. These two heuristics, Simple Lagrangean and Lagrangean Reallocation, were tested on a realistic network – the primary Iberian natural gas network, organized with 65 nodes connected by physical and virtual pipelines. Computational results are presented for both approaches, showing the gas source locations and load allocation arrangement, total system costs and gas transportation modes.
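The core of a Lagrangean heuristic for the P-median problem is the relaxed subproblem: dualizing the "each demand is assigned exactly once" constraints makes the problem decompose by candidate facility, yielding a lower bound for given multipliers. A minimal sketch with an illustrative cost matrix and fixed multipliers (a real heuristic would update them by subgradient steps):

```python
# Hedged sketch: the Lagrangean lower bound of a p-median problem with
# the assignment constraints relaxed. Data and multipliers illustrative.

def lagrangean_bound(cost, lam, p):
    """cost[i][j]: cost of serving demand i from candidate facility j.
    lam[i]: Lagrange multiplier of demand i's assignment constraint.
    Returns (lower bound, indices of the p facilities opened)."""
    m = len(cost[0])
    # Contribution of opening facility j in the relaxed problem: only
    # assignments with negative reduced cost are made.
    rho = [sum(min(0.0, cost[i][j] - lam[i]) for i in range(len(cost)))
           for j in range(m)]
    opened = sorted(range(m), key=lambda j: rho[j])[:p]
    bound = sum(lam) + sum(rho[j] for j in opened)
    return bound, sorted(opened)

cost = [[0, 4, 7],
        [4, 0, 3],
        [7, 3, 0]]
print(lagrangean_bound(cost, lam=[3.0, 3.0, 3.0], p=1))
```

The reallocation local search mentioned in the abstract would then repair the relaxed solution into a feasible assignment and try single-demand moves between open facilities.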
Abstract:
This paper proposes a swarm intelligence long-term hedging tool to support electricity producers in competitive electricity markets. This tool investigates the long-term hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward) and financial (options) settlement. To find the optimal portfolio, the producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and the variance of the return. The expected return and the variance estimation are based on a forecasted scenario interval determined by a long-term price range forecast model, developed by the authors, whose explanation is outside the scope of this paper. The proposed tool makes use of Particle Swarm Optimization (PSO), and its performance has been evaluated by comparing it with a Genetic Algorithm (GA) based approach. To validate the risk management tool, a case study using real historical price data from the mainland Spanish market is presented to demonstrate the effectiveness of the proposed methodology.
Abstract:
This paper addresses the optimal involvement of a power producer in derivatives electricity markets to hedge against pool price volatility. To achieve this aim, a swarm intelligence meta-heuristic optimization technique is proposed for a long-term risk management tool. This tool investigates the long-term risk-hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward contracts) and financial (options contracts) settlement. The producer's risk preference is formulated as a utility function (U) expressing the trade-off between the expectation and the variance of the return. The expectation and variance of the return are based on a forecasted scenario interval determined by a long-term price range forecasting model. This model also makes use of particle swarm optimization (PSO) to find the best parameters, allowing better forecasting results to be achieved. On the other hand, the price estimation depends on load forecasting. This work therefore also presents a regressive long-term load forecast model that makes use of PSO to find the best parameters, as in the price estimation. The performance of the PSO technique has been evaluated by comparison with a Genetic Algorithm (GA) based approach. A case study is presented, and the results are discussed taking into account real historical price and load data from the mainland Spanish electricity market, demonstrating the effectiveness of the methodology in handling this type of problem. Finally, conclusions are duly drawn.
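The mean-variance utility U = E[R] − A·Var[R] optimized by PSO can be sketched on a one-dimensional toy portfolio (the fraction of output sold forward); the return scenarios, risk-aversion coefficient and PSO constants are illustrative assumptions:

```python
# Hedged sketch: PSO maximizing a mean-variance utility over the share
# of production sold forward. All data and PSO parameters illustrative.
import random

random.seed(0)
SPOT = [30.0, 80.0, 55.0]  # scenario returns of full spot exposure
FORWARD = 50.0             # certain return of the forward contract
A = 0.01                   # risk-aversion coefficient

def utility(w):
    """U = E[R] - A*Var[R] for forward share w, clamped to [0, 1]."""
    w = min(1.0, max(0.0, w))
    rets = [w * FORWARD + (1 - w) * s for s in SPOT]
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / len(rets)
    return mean - A * var

def pso(n=10, iters=60):
    """Standard global-best PSO with inertia 0.7 and c1 = c2 = 1.5."""
    pos = [random.random() for _ in range(n)]
    vel = [0.0] * n
    pbest = pos[:]
    gbest = max(pos, key=utility)
    for _ in range(iters):
        for i in range(n):
            r1, r2 = random.random(), random.random()
            vel[i] = (0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] += vel[i]
            if utility(pos[i]) > utility(pbest[i]):
                pbest[i] = pos[i]
            if utility(pos[i]) > utility(gbest):
                gbest = pos[i]
    return min(1.0, max(0.0, gbest))

print(round(pso(), 2))
```

With these toy numbers the utility is concave in w, so the swarm settles near the analytic optimum (w ≈ 0.4); the real tool searches a much larger contract space where no closed form exists.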
Abstract:
Managing programming exercises requires several heterogeneous systems, such as evaluation engines, learning object repositories and exercise resolution environments. The coordination of networks of such disparate systems is rather complex. These tools would be too specific to incorporate into an e-Learning platform. Even if they could be provided as pluggable components, the burden of maintaining them would be prohibitive for institutions with few courses in those domains. This work presents a standards-based approach for the coordination of a network of e-Learning systems participating in the automatic evaluation of programming exercises. The proposed approach uses a pivot component to orchestrate the interaction among all the systems using communication standards. This approach was validated through its effective use in the classroom, and some preliminary results are presented.
Abstract:
Several projects in the recent past have aimed at promoting Wireless Sensor Networks as an infrastructure technology in which several independent users can submit applications that execute concurrently across the network. Concurrent multiple applications cause significant energy-usage overhead on sensor nodes that cannot be eliminated by traditional schemes optimized for single-application scenarios. In this paper, we outline two main optimization techniques for reducing power consumption across applications. First, we describe a compiler-based approach that identifies redundant sensing requests across applications and eliminates them. Second, we cluster radio transmissions by concatenating packets from independent applications based on Rate-Harmonized Scheduling.
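The first technique, eliminating redundant sensing requests, can be illustrated with harmonic sampling periods: the sensor is read once at the fastest compatible rate and each application consumes every k-th sample. Application names and periods below are illustrative, not from the paper:

```python
# Hedged sketch: serving several applications' requests on one sensor
# from a single merged sampling schedule. Names/periods are hypothetical.
from functools import reduce
from math import gcd

requests = {  # application -> requested sampling period (ms)
    "app_habitat": 400,
    "app_security": 200,
    "app_hvac": 600,
}

def merged_schedule(reqs):
    """One physical sampling period serving every request: the GCD of
    the requested periods. Each app then consumes every k-th sample."""
    base = reduce(gcd, reqs.values())
    return base, {app: period // base for app, period in reqs.items()}

base, fanout = merged_schedule(requests)
print(base, fanout)
```

Instead of three independent sampling streams, the node performs one physical read every 200 ms and fans the result out, which is where the cross-application energy saving comes from.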
Abstract:
This work reports on an experimental and finite element method (FEM) parametric study of adhesively-bonded single- and double-strap repairs on carbon-epoxy structures under buckling-unrestrained compression. The influence of the overlap length and patch thickness was evaluated. This loading gains particular significance from the additional failure mechanisms characteristic of structures under compression, such as fibre microbuckling for buckling-restrained structures, or global buckling of the assembly if no transverse restriction exists. The FEM analysis is based on the use of cohesive elements, including mixed-mode criteria, to simulate cohesive fracture of the adhesive layer. Trapezoidal laws in pure modes I and II were used to account for the ductility of most structural adhesives. These laws were estimated for the adhesive used from double cantilever beam (DCB) and end-notched flexure (ENF) tests, respectively, using an inverse technique. The pure mode III cohesive law was assumed equal to the pure mode II one. Compression failure in the laminates was predicted using a stress-based criterion. The accurate FEM predictions open a good prospect for reducing the extensive experimentation involved in the design of carbon-epoxy repairs. Design principles were also established for these repairs under buckling.
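The trapezoidal traction-separation law mentioned above has three branches: a linear elastic rise, a ductile plateau, and linear softening to failure. A minimal sketch, with illustrative parameter values rather than the values identified from the DCB/ENF tests:

```python
# Hedged sketch: a trapezoidal cohesive (traction-separation) law of the
# kind used for ductile adhesives. All parameter values illustrative.
SIGMA_MAX = 20.0               # plateau traction (MPa)
D1, D2, DF = 0.01, 0.05, 0.08  # knee, plateau end, failure opening (mm)

def traction(delta):
    """Traction transmitted across the adhesive layer at opening delta."""
    if delta <= 0.0:
        return 0.0
    if delta < D1:                      # linear elastic rise
        return SIGMA_MAX * delta / D1
    if delta <= D2:                     # ductile plateau
        return SIGMA_MAX
    if delta < DF:                      # linear softening to failure
        return SIGMA_MAX * (DF - delta) / (DF - D2)
    return 0.0                          # complete failure
```

The area under this curve is the fracture toughness of the corresponding pure mode, which is what the inverse technique fits against the DCB and ENF test data.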
Abstract:
This paper presents a genetic algorithm-based approach for project scheduling with multiple modes and renewable resources. In this problem, project activities may be executed in more than one operating mode, and renewable resource constraints are imposed. The objective function is the minimization of the project completion time. The idea of this approach is to integrate a genetic algorithm with a schedule generation scheme. This study also proposes applying a local search procedure to try to yield a better solution once the genetic algorithm and the schedule generation scheme have obtained one. The experimental results show that this algorithm is an effective method for solving this problem.
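The schedule generation scheme that decodes each chromosome into a schedule can be sketched as a serial SGS over a single-mode toy instance with one renewable resource; the project data are illustrative and the multi-mode aspect is omitted for brevity:

```python
# Hedged sketch: a serial schedule generation scheme (SGS) decoding a
# priority list into a resource-feasible schedule. Instance illustrative.
# activity: (duration, resource demand, predecessors)
project = {
    "A": (3, 2, []),
    "B": (2, 2, []),
    "C": (2, 1, ["A"]),
    "D": (4, 2, ["B"]),
}
CAPACITY = 3  # units of the single renewable resource

def serial_sgs(priority_order):
    start, usage = {}, {}  # usage[t]: resource units busy in period t
    for act in priority_order:
        dur, dem, preds = project[act]
        # earliest precedence-feasible start
        t = max((start[p] + project[p][0] for p in preds), default=0)
        # shift right until the activity fits within capacity everywhere
        while any(usage.get(t + k, 0) + dem > CAPACITY for k in range(dur)):
            t += 1
        start[act] = t
        for k in range(dur):
            usage[t + k] = usage.get(t + k, 0) + dem
    makespan = max(start[a] + project[a][0] for a in start)
    return start, makespan

print(serial_sgs(["A", "B", "C", "D"]))
```

In the full approach the genetic algorithm evolves the priority lists (and mode choices), calls the SGS to evaluate each chromosome's makespan, and then applies local search to the decoded schedule.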
Abstract:
Remote Laboratories, or WebLabs, constitute a first-order didactic resource in engineering faculties. However, in many cases they lack a proper software design, on both the client and server sides, which degrades their quality and academic usefulness. This paper presents the main characteristics of a Remote Laboratory, analyzes the software technologies available to implement the client and server sides of a WebLab, and correlates these technologies with those characteristics to facilitate the selection of a technology for implementing a WebLab. The results obtained suggest the adoption of a Service Oriented Laboratory Architecture-based approach for the design of future Remote Laboratories, so that client-agnostic Remote Laboratories and Remote Laboratory composition are enabled. Experience with a real Remote Laboratory, WebLab-Deusto, is also presented.
Abstract:
This article presents a new approach (MM-GAV-FBI) applicable to the problem of project scheduling with resource constraints and multiple execution modes per activity, known in the literature as the MRCPSP. Each project has a set of activities with defined technological precedences and a set of limited resources, and each activity may have more than one execution mode. Project scheduling is carried out using a Schedule Generation Scheme (SGS) integrated with a metaheuristic. The metaheuristic is based on the genetic algorithm paradigm. Activity priorities are obtained from a genetic algorithm. The chromosome representation used is based on random keys. The SGS generates non-delay schedules. After a solution is obtained, a local improvement step is applied. The objective of the approach is to find the best schedule, that is, the schedule with the shortest possible duration that satisfies the activity precedences and the resource constraints. The proposed approach is tested on a set of problems taken from the specialised literature, and the computational results are compared with other approaches. The computational results validate the good performance of the approach, not only in terms of solution quality but also in terms of computation time.