792 results for cloud-based computing
Abstract:
An environmentally friendly analytical procedure with high sensitivity for the determination of the pesticide carbaryl in natural waters was developed. The flow system was designed with solenoid micro-pumps in order to improve mixing conditions and to minimize both reagent consumption and waste generation. A long-pathlength (100 cm) flow cell based on a liquid core waveguide (LCW) was employed to increase sensitivity in the detection of the indophenol formed by the reaction between carbaryl and p-aminophenol (PAP). A clean-up step based on cloud-point extraction was explored to remove interfering organic matter, avoiding the use of toxic organic solvents. A linear response was observed within the range 5-200 µg L⁻¹, and the detection limit, coefficient of variation, and sampling rate were estimated as 1.7 µg L⁻¹ (99.7% confidence level), 0.7% (n = 20), and 55 determinations per hour, respectively. Reagent consumption was 1.9 µg of PAP and 5.7 µg of potassium metaperiodate, with 2.6 mL of effluent generated per determination. The proposed procedure was selective for the determination of carbaryl, without interference from other carbamate pesticides. Recoveries between 84% and 104% were estimated for carbaryl spiked into water samples, and the results agreed with those found by a batch spectrophotometric procedure at the 95% confidence level. The waste from the analytical procedure was treated with potassium persulphate and ultraviolet irradiation, yielding a colorless residue and a 94% decrease in total organic carbon. In addition, the residue after treatment was not toxic to Vibrio fischeri bacteria.
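For readers unfamiliar with the 99.7% confidence convention quoted above, here is a minimal sketch of how such a 3-sigma detection limit is typically estimated from blank replicates and a calibration slope; the numbers are hypothetical illustrations, not data from the paper:

```python
import statistics

def detection_limit(blank_signals, slope):
    """3-sigma detection limit (99.7% confidence level): three times
    the standard deviation of blank measurements divided by the
    calibration slope."""
    s_blank = statistics.stdev(blank_signals)
    return 3 * s_blank / slope

# Hypothetical blank absorbances and calibration slope, for illustration only.
blanks = [0.0102, 0.0098, 0.0105, 0.0097, 0.0101,
          0.0099, 0.0103, 0.0100, 0.0104, 0.0096]
slope = 0.00051  # absorbance per (µg L⁻¹), assumed
print(f"LOD ≈ {detection_limit(blanks, slope):.1f} µg L⁻¹")
```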
Abstract:
A procedure for the simultaneous separation/preconcentration of copper, zinc, cadmium, and nickel in water samples, based on cloud point extraction (CPE) as a step prior to their determination by inductively coupled plasma optical emission spectrometry (ICP-OES), has been developed. The analytes reacted with 4-(2-pyridylazo)-resorcinol (PAR) at pH 5 to form hydrophobic chelates, which were separated and preconcentrated in a surfactant-rich phase of octylphenoxypolyethoxyethanol (Triton X-114). The parameters affecting the extraction efficiency of the proposed method, such as sample pH, complexing agent concentration, buffer amount, surfactant concentration, temperature, kinetics of the complexation reaction, and incubation time, were optimized; their respective values were 5, 0.6 mmol L⁻¹, 0.3 mL, 0.15% (w/v), 50 °C, 40 min, and 10 min for 15 mL of preconcentrated solution. The method presented a precision (R.S.D.) between 1.3% and 2.6% (n = 9). The concentration factors for the analytes, with and without dilution of the surfactant-rich phase, ranged from 9.4 to 10.1 and from 94.0 to 100.1, respectively. The limits of detection (L.O.D.) obtained for copper, zinc, cadmium, and nickel were 1.2, 1.1, 1.0, and 6.3 µg L⁻¹, respectively. The accuracy of the procedure was evaluated through recovery experiments on aqueous samples.
Abstract:
An improved procedure is proposed for the determination of the pesticide carbaryl in natural waters based on double cloud point extraction. The clean-up step was carried out only with Triton X-114 in alkaline medium, in order to avoid the use of toxic organic solvents as well as to minimise waste generation. Cloud point preconcentration of the product of the reaction of the analyte with p-aminophenol and cetyltrimethylammonium bromide was explored to increase sensitivity and improve the detection limit. A linear response was achieved between 10 and 500 µg L⁻¹, and the apparent molar absorptivity was estimated as 4.6 × 10⁵ L mol⁻¹ cm⁻¹. The detection limit was estimated as 7 µg L⁻¹ at the 99.7% confidence level, and the coefficient of variation was 3.4% (n = 8). Recoveries between 91% and 99% were estimated for carbaryl-spiked water samples. The results obtained for natural water samples agreed with those achieved by a batch spectrophotometric procedure at the 95% confidence level. The proposed procedure is thus a simple, fast, inexpensive, and greener alternative for carbaryl determination.
Abstract:
Support for the interoperability and interchangeability of software components in a fieldbus automation system relies on the definition of open architectures, most of which involve proprietary technologies. Concurrently, standard, open, non-proprietary technologies such as XML, SOAP, and Web Services have greatly evolved and been widely adopted in computing. This article presents a FOUNDATION Fieldbus(TM) device description technology named Open-EDD, based on XML and related technologies (XSLT, DOM using the Xerces implementation, OO, XML Schema), proposing an open and non-proprietary alternative to the EDD (Electronic Device Description). This initial proposal includes defining Open-EDDML as the programming language of the technology in the FOUNDATION Fieldbus(TM) protocol, implementing a compiler and a parser, and, finally, integrating and testing the new technology using field devices and a commercial fieldbus configurator. This study attests that the new technology is feasible and can be applied to other configurators or HMI applications used in fieldbus automation systems.
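As a rough illustration of the kind of XML-based device description Open-EDD proposes, here is a minimal Python sketch (the paper's toolchain uses Xerces DOM; Python's minidom is only an analogue, and the element and attribute names below are hypothetical, not the actual Open-EDDML schema):

```python
# Parse a hypothetical XML device description with the DOM API.
from xml.dom.minidom import parseString

doc = parseString("""
<device manufacturer="AcmeFieldbus" type="PressureTransmitter">
  <parameter name="PV" datatype="float" unit="kPa"/>
  <parameter name="SP" datatype="float" unit="kPa"/>
</device>
""")

root = doc.documentElement
print(root.getAttribute("manufacturer"), root.getAttribute("type"))
for p in root.getElementsByTagName("parameter"):
    # Each parameter element carries the metadata a configurator
    # or HMI application would need to render the device.
    print(p.getAttribute("name"), p.getAttribute("datatype"), p.getAttribute("unit"))
```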
Abstract:
This research presents a method for frequency estimation in power systems using an adaptive filter based on the Least Mean Square (LMS) algorithm. To analyze a power system, the three-phase voltages were converted into a complex signal by applying the alpha-beta transform, and the result was used in an adaptive filtering algorithm. Although the use of the complex LMS algorithm is described in the literature, this paper deals with some practical aspects of its implementation. In order to reduce computing time, a coefficient generator was implemented. For validation, a computing simulation of a power system was carried out using the ATP software. Many different situations were simulated for the performance analysis of the proposed methodology. The results were compared with a commercial relay, showing the advantages of the new method.
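A minimal sketch of the core idea, the alpha-beta transform followed by a one-tap complex LMS predictor, under simplifying assumptions (balanced, noise-free voltages; illustrative sampling rate and step size, not the paper's values):

```python
import numpy as np

fs = 1000.0    # sampling rate (Hz), assumed
f_true = 50.2  # system frequency to be estimated (Hz), assumed
t = np.arange(2000) / fs

# Balanced three-phase test voltages (unit amplitude, hypothetical).
va = np.cos(2 * np.pi * f_true * t)
vb = np.cos(2 * np.pi * f_true * t - 2 * np.pi / 3)
vc = np.cos(2 * np.pi * f_true * t + 2 * np.pi / 3)

# Alpha-beta (Clarke) transform -> complex signal v = v_alpha + j*v_beta.
v_alpha = (2 / 3) * (va - 0.5 * vb - 0.5 * vc)
v_beta = (2 / 3) * (np.sqrt(3) / 2) * (vb - vc)
v = v_alpha + 1j * v_beta

# One-tap complex LMS predictor: for v[k] = A*exp(j*w*k) the optimal
# weight is exp(j*w), so the frequency is angle(w) * fs / (2*pi).
mu = 0.05  # step size, assumed
w = 1.0 + 0.0j
for k in range(len(v) - 1):
    e = v[k + 1] - w * v[k]      # prediction error
    w += mu * e * np.conj(v[k])  # complex LMS weight update

print(f"estimated frequency: {np.angle(w) * fs / (2 * np.pi):.3f} Hz")
```

For a pure positive-sequence signal the complex signal is a single rotating phasor, which is why one complex coefficient suffices; harmonics and noise would call for a longer filter.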
Abstract:
The crossflow filtration process differs from conventional filtration in that the circulation flow runs tangentially to the filtration surface. The conventional mathematical models used to represent the process have limitations in identifying and generalizing the system's behaviour. In this paper, a system based on artificial neural networks is developed to overcome the problems usually found in the conventional mathematical models. More specifically, the developed system uses an artificial neural network that simulates the behaviour of the crossflow filtration process in a robust way. Imprecisions and uncertainties associated with the measurements made on the system are automatically incorporated into the neural approach. Simulation results are presented to confirm the validity of the proposed approach.
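A minimal sketch of the approach's core ingredient, a small neural network fitted to (synthetic) flux-decline measurements; the data, network size, and training settings are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data (illustrative only): permeate flux decaying
# with time -- a stand-in for measurements from a crossflow unit.
t = rng.uniform(0.0, 10.0, size=(200, 1))      # filtration time (h)
flux = 80.0 * np.exp(-0.3 * t) + 20.0          # flux (L m^-2 h^-1)
flux += rng.normal(0.0, 1.0, size=t.shape)     # measurement noise

# Normalize inputs/outputs for stable training.
x = t / 10.0
y = (flux - flux.mean()) / flux.std()

# One-hidden-layer MLP trained with plain gradient descent.
w1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
w2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(x @ w1 + b1)          # hidden layer
    pred = h @ w2 + b2                # linear output
    err = pred - y
    # Backpropagate the mean-squared-error gradient.
    g2 = h.T @ err / len(x)
    gh = (err @ w2.T) * (1 - h ** 2)
    g1 = x.T @ gh / len(x)
    w2 -= lr * g2; b2 -= lr * err.mean(0)
    w1 -= lr * g1; b1 -= lr * gh.mean(0)

print("training MSE:", float((err ** 2).mean()))
```

Because the network is fitted to noisy measurements directly, the measurement uncertainty is absorbed into the learned mapping rather than modelled explicitly, which is the robustness the abstract refers to.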
Abstract:
Voltage and current waveforms of a distribution or transmission power system are not pure sinusoids. These waveforms contain distortions that can be represented as a combination of the fundamental frequency, harmonics, and high-frequency transients. This paper presents a novel approach to identifying harmonics in distorted power system waveforms. The proposed method is based on Genetic Algorithms, an optimization technique inspired by genetics and natural evolution. GOOAL, an intelligent algorithm specially designed for optimization problems, was successfully implemented and tested. Two chromosome representations are used: binary and real. The results show that the proposed method is more precise than the traditional Fourier Transform, especially with the real representation of the chromosomes.
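A minimal sketch of harmonic identification with a generic real-coded genetic algorithm (not the paper's GOOAL algorithm); the waveform, harmonic orders, and GA settings are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

fs, f0 = 1920.0, 60.0                  # sampling rate and fundamental (assumed)
t = np.arange(128) / fs

# Distorted test waveform: fundamental plus 3rd and 5th harmonics.
true_amps = np.array([1.0, 0.2, 0.1])  # amplitudes of orders 1, 3, 5
orders = np.array([1, 3, 5])
signal = sum(a * np.sin(2 * np.pi * k * f0 * t) for a, k in zip(true_amps, orders))

def fitness(amps):
    # Maximizing fitness = minimizing reconstruction error.
    est = sum(a * np.sin(2 * np.pi * k * f0 * t) for a, k in zip(amps, orders))
    return -np.mean((est - signal) ** 2)

# Real-coded GA: tournament selection, arithmetic crossover, Gaussian mutation.
pop = rng.uniform(0.0, 2.0, size=(60, 3))
for gen in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    new_pop = [pop[scores.argmax()].copy()]  # elitism: keep the best
    while len(new_pop) < len(pop):
        i, j = rng.integers(0, len(pop), 2)
        p1 = pop[i] if scores[i] > scores[j] else pop[j]  # tournament
        i, j = rng.integers(0, len(pop), 2)
        p2 = pop[i] if scores[i] > scores[j] else pop[j]
        alpha = rng.random()
        child = alpha * p1 + (1 - alpha) * p2             # crossover
        child += rng.normal(0.0, 0.02, size=3)            # mutation
        new_pop.append(child)
    pop = np.array(new_pop)

best = pop[np.array([fitness(ind) for ind in pop]).argmax()]
print("estimated harmonic amplitudes:", np.round(best, 3))
```

The real-valued chromosome here is simply the vector of harmonic amplitudes, which illustrates why the abstract reports the real representation as the more precise of the two.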
Abstract:
We explore the feasibility of the computationally oriented institutional agency framework proposed by Governatori and Rotolo by testing it against an industrial-strength scenario. In particular, we show how to encode in defeasible logic the dispute resolution policy described in Article 67 of FIDIC.
Abstract:
In this paper we present a model of specification-based testing of interactive systems. This model provides the basis for a framework to guide such testing. Interactive systems are traditionally decomposed into a functionality component and a user interface component; this distinction is termed dialogue separation and is the underlying basis for conceptual and architectural models of such systems. Correctness involves both proper behaviour of the user interface and proper computation by the underlying functionality. Specification-based testing is one method used to increase confidence in correctness, but it has had limited application to interactive system development to date.
Abstract:
This project aims to provide a service platform for managing and accounting for remunerable time, through the recording of working hours, vacations, and absences (justified or not). The platform is intended to provide reports based on this information, as well as automatic data analysis, for example of excessive absences and overlapping vacations among workers. The emphasis of the project is on providing an architecture that facilitates the inclusion of these features. The project is implemented on the Google App Engine (GAE) platform, in order to deliver a solution under the Software as a Service paradigm, with guaranteed availability and data replication. The platform was chosen after an analysis of the main existing cloud platforms: Google App Engine, Windows Azure, and Amazon Web Services. The characteristics of each platform were analysed, namely the programming models, the available data models, the existing services, and their respective costs. The choice was based on the platforms' characteristics at the start of this project. The solution is structured in layers, with the following components: platform interface, business logic, and data access logic. The provided interface follows REST architectural principles, supporting data in JSON and XML formats. An authorization component, built on Spring Security, was added to this base architecture, with authentication delegated to the Google Accounts service. To decouple the various layers, the Dependency Injection pattern was used, reducing the dependence on the technologies adopted in each layer. A prototype was implemented to demonstrate the work carried out, allowing interaction with the implemented service features via AJAX requests. This prototype took advantage of several JavaScript libraries and patterns that simplified its development, such as model-view-viewmodel through data binding. To support the development of the project, an agile approach based on Scrum was adopted in order to implement the system requirements, expressed as user stories. To ensure the quality of the service implementation, unit tests were performed; the functionality was analysed beforehand, and documentation was subsequently produced using UML diagrams.
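A minimal illustration of the Dependency Injection pattern mentioned above, in Python rather than the project's Java/GAE stack; all names are hypothetical:

```python
from dataclasses import dataclass
from typing import Protocol

# Data-access abstraction: the business logic depends only on this
# interface, never on a concrete datastore.
class TimeEntryRepository(Protocol):
    def hours_for(self, worker: str) -> float: ...

class InMemoryRepository:
    """Test double standing in for a real datastore implementation."""
    def __init__(self, data: dict[str, float]):
        self.data = data
    def hours_for(self, worker: str) -> float:
        return self.data.get(worker, 0.0)

# Business-logic layer: the repository is injected, so swapping the
# cloud datastore for a test double requires no changes here.
@dataclass
class PayrollService:
    repo: TimeEntryRepository
    def billable_hours(self, worker: str) -> float:
        return self.repo.hours_for(worker)

service = PayrollService(repo=InMemoryRepository({"alice": 37.5}))
print(service.billable_hours("alice"))
```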
Abstract:
Object-oriented programming languages are presently the dominant paradigm of application development (e.g., Java, .NET). Lately, increasingly more Java applications have long (or very long) execution times and manipulate large amounts of data/information, gaining relevance in fields related to e-Science (with Grid and Cloud computing). Significant examples include Chemistry, Computational Biology, and Bioinformatics, with many available Java-based APIs (e.g., NeoBio). Often, when the execution of such an application terminates abruptly because of a failure (whether the cause is a hardware or software fault, lack of available resources, etc.), all the work already performed is simply lost; when the application is later re-initiated, it has to restart from scratch, wasting resources and time, while remaining prone to another failure and possibly delaying its completion with no deadline guarantees. Our proposed solution to these issues is to incorporate checkpointing and migration mechanisms in a JVM. These make applications more robust and flexible, as they become able to move to other nodes without any intervention from the programmer. This article provides a solution for Java applications with long execution times by extending a JVM (the Jikes Research Virtual Machine) with such mechanisms.
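The paper implements checkpointing inside the JVM itself; as a rough application-level analogue of the idea, here is a minimal Python sketch that periodically persists progress and resumes from the last checkpoint after a failure (file name and workload are hypothetical):

```python
import os
import pickle

CHECKPOINT = "work.ckpt"  # hypothetical checkpoint file

def long_computation(total_steps=1_000_000):
    # Resume from the last checkpoint if one exists, else start fresh.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            step, acc = pickle.load(f)
    else:
        step, acc = 0, 0.0
    while step < total_steps:
        acc += step * 1e-6           # stand-in for real work
        step += 1
        if step % 100_000 == 0:      # periodically persist progress
            with open(CHECKPOINT, "wb") as f:
                pickle.dump((step, acc), f)
    os.remove(CHECKPOINT)            # done: discard the checkpoint
    return acc

print(long_computation())
```

The JVM-level mechanism described in the paper achieves this transparently, saving the whole execution state without the programmer writing any checkpoint logic; the sketch only conveys the save-and-resume principle.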
Abstract:
The performance of metaheuristics is highly dependent on their parameters, which need to be tuned. Parameter tuning may allow greater flexibility and robustness, but requires careful initialization, and the process of deciding which parameter settings to use is not obvious: the values depend mainly on the problem, the instance to be solved, the search time available for solving the problem, and the required solution quality. This paper presents a learning module proposal for the autonomous parameterization of metaheuristics, integrated into a Multi-Agent System for the resolution of dynamic scheduling problems. The proposed learning module is inspired by the Autonomic Computing Self-Optimization concept, which states that systems must continuously and proactively improve their performance. The learning is implemented with Case-based Reasoning, which solves new cases by using previous similar data, under the assumption that similar cases have similar solutions. After a literature review of the topics involved, both the AutoDynAgents system and the Self-Optimization module are described. Finally, a computational study is presented in which the proposed module is evaluated and the results are compared with previous ones; conclusions are drawn and future work is outlined. This proposal is expected to be a significant contribution to the self-parameterization of metaheuristics and to the resolution of scheduling problems in dynamic environments.
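A minimal sketch of the Case-based Reasoning retrieval step described above: reuse the parameter setting of the most similar past case, assuming similar cases have similar solutions (features, parameters, and cases are illustrative, not AutoDynAgents data):

```python
import math

# Case base: (problem features, parameter setting that worked well).
case_base = [
    ({"jobs": 20, "machines": 5,  "urgency": 0.3}, {"tabu_tenure": 9,  "iters": 500}),
    ({"jobs": 80, "machines": 10, "urgency": 0.8}, {"tabu_tenure": 15, "iters": 2000}),
    ({"jobs": 50, "machines": 8,  "urgency": 0.5}, {"tabu_tenure": 12, "iters": 1200}),
]

def distance(a, b):
    # Euclidean distance over shared numeric features (in practice the
    # features would be normalized to comparable scales first).
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def retrieve_parameters(new_problem):
    # Retrieve step: reuse the parameters of the nearest past case.
    features, params = min(case_base, key=lambda c: distance(c[0], new_problem))
    return params

print(retrieve_parameters({"jobs": 60, "machines": 9, "urgency": 0.6}))
```

A full CBR cycle would also revise the reused parameters after the run and retain the new case, which is how the module keeps improving proactively.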
Abstract:
This paper presents a Swarm-based Cooperation Mechanism for scheduling optimization. We conceptualize real manufacturing systems as interacting autonomous entities in order to support decision making in agile manufacturing environments. Agents coordinate their actions automatically, without human supervision, towards a common objective, a global scheduling solution, taking advantage of the collective behavior of species through implicit and explicit cooperation. The performance of the cooperation mechanism is evaluated considering, in a first stage, implicit cooperation through the ACS, PSO, and ABC algorithms, and subsequently explicit cooperation through the application of the cooperation mechanism itself.
Abstract:
This paper presents a Multi-Agent Market simulator designed for analyzing agent market strategies, based on a complete understanding of buyer and seller behaviors, preference models, and pricing algorithms, and considering user risk preferences and game theory for scenario analysis. The system includes agents that are capable of improving their performance with their own experience, by adapting to market conditions, and of considering other agents' reactions.
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator built to model market players and simulate their operation in the market. Market players are entities with specific characteristics and objectives, making their own decisions and interacting with other players. MASCEM provides several dynamic strategies for agents' behaviour. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains from the market. The method uses an auxiliary forecasting tool, e.g. an Artificial Neural Network, to predict electricity market prices, and analyses its forecasting error patterns. By recognizing the occurrence of such patterns, the method predicts the expected error of the next forecast and uses it to adapt the actual forecast. The goal is to bring the forecast closer to the real value, reducing the forecasting error.
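A minimal sketch of the error-pattern idea: estimate the expected forecasting error for a given context (here, hour of day) from past errors and add it to the raw forecast; the data and bias pattern are synthetic, not MASCEM results:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical history: ANN price forecasts vs. observed market prices,
# with a systematic hour-of-day bias the ANN fails to capture.
hours = np.arange(240) % 24
real = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, 240)
forecast = real - 3.0 * (hours >= 18) + rng.normal(0, 1, 240)  # under-forecasts peak hours

errors = real - forecast  # historical forecasting errors

def adapted_forecast(raw_forecast, hour):
    # Expected error for this context = mean past error at the same hour;
    # adding it back moves the forecast closer to the real value.
    expected_error = errors[hours == hour].mean()
    return raw_forecast + expected_error

print("raw:", 55.0, "adapted:", round(adapted_forecast(55.0, 19), 2))
```

Here the "pattern" is a simple hour-of-day average; the paper's method recognizes richer error patterns, but the correction step, forecast plus expected error, is the same.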