37 results for Probabilistic methodology
Abstract:
A QuEChERS method for the extraction of ochratoxin A (OTA) from bread samples was evaluated. A 2³ factorial design was used to find the optimal QuEChERS parameters (extraction time, extraction solvent volume and sample mass). Extracts were analysed by LC with fluorescence detection. The optimal extraction conditions were: 5 g of sample, 15 mL of acetonitrile and 3 min of agitation. The extraction procedure was validated by systematic recovery experiments at three levels. The recoveries obtained ranged from 94.8% (at 1.0 μg kg⁻¹) to 96.6% (at 3.0 μg kg⁻¹). The limit of quantification of the method was 0.05 μg kg⁻¹. The optimised procedure was applied to 20 samples of different bread types (“Carcaça”, “Broa de Milho”, and “Broa de Avintes”) highly consumed in Portugal. None of the samples exceeded the established European legal limit of 3 μg kg⁻¹.
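As a sketch of the experimental plan, a 2³ full factorial design simply enumerates every combination of two levels per factor; the low/high levels below are hypothetical (only the optimal values are reported in the abstract):

```python
from itertools import product

# Hypothetical low/high levels for the three QuEChERS factors; only the
# optimal settings (5 g, 15 mL, 3 min) are stated in the abstract.
factors = {
    "sample_mass_g": (2, 5),
    "solvent_volume_mL": (10, 15),
    "extraction_time_min": (1, 3),
}

# A 2^3 full factorial design enumerates all 8 level combinations.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for i, run in enumerate(runs, 1):
    print(i, run)

print(len(runs), "experimental runs")  # 8 runs
```

Each run would then be executed and the recoveries analysed to estimate the main effects and interactions of the three factors.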
Abstract:
This thesis presents the Fuzzy Monte Carlo Model for Transmission Power Systems Reliability based studies (FMC-TRel) methodology, which is based on statistical failure and repair data of the transmission power system components and uses fuzzy-probabilistic modelling for system component outage parameters. Using statistical records allows the fuzzy membership functions of the system component outage parameters to be developed. The proposed hybrid method of fuzzy sets and Monte Carlo simulation, based on the fuzzy-probabilistic models, captures both the randomness and the fuzziness of component outage parameters. Once the system states are obtained, a network contingency analysis is performed to identify any overloading or voltage violation in the network. This is followed by a remedial action algorithm, based on Optimal Power Flow, to reschedule generation and alleviate constraint violations while avoiding any load curtailment if possible or, otherwise, minimizing the total load curtailment for the states identified by the contingency analysis. For the system states that cause load curtailment, an optimization approach is applied to reduce the probability of occurrence of these states while minimizing the costs of achieving that reduction. This methodology is of great importance for supporting the transmission system operator's decision making, namely in the identification of critical components and in the planning of future investments in the transmission power system. A case study based on the IEEE 24-bus Reliability Test System (RTS) 1996 is presented to illustrate in detail the application of the proposed methodology.
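A minimal sketch of the hybrid idea, not the FMC-TRel implementation: each component's outage probability is treated as a triangular fuzzy number, each Monte Carlo draw first samples a crisp probability from that fuzzy number (a simplification of α-cut handling) and then samples the component state. All component names and fuzzy parameters are hypothetical:

```python
import random

random.seed(42)

# Hypothetical triangular fuzzy outage probabilities (low, modal, high)
# for three transmission components; the thesis derives the actual
# membership functions from statistical failure/repair records.
fuzzy_outage = {
    "line_1": (0.01, 0.02, 0.04),
    "line_2": (0.02, 0.05, 0.08),
    "transformer_1": (0.005, 0.01, 0.02),
}

def sample_state(fuzzy_p):
    """Draw a crisp outage probability from the fuzzy number (triangular
    sampling as a simplification), then sample the component state."""
    low, mode, high = fuzzy_p
    p = random.triangular(low, high, mode)
    return "out" if random.random() < p else "up"

def monte_carlo(n_samples):
    # Count sampled system states with at least one component out;
    # these are the candidates for the contingency analysis and the
    # remedial-action (OPF) step described above.
    contingencies = 0
    for _ in range(n_samples):
        states = {c: sample_state(fp) for c, fp in fuzzy_outage.items()}
        if "out" in states.values():
            contingencies += 1
    return contingencies / n_samples

print(f"fraction of sampled states with an outage: {monte_carlo(10000):.3f}")
```

In the full methodology each sampled state with violations would be passed to the contingency analysis and OPF-based remedial action algorithm.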
Abstract:
A Box–Behnken factorial design coupled with response surface methodology was used to evaluate the effects of temperature, pH and initial concentration on the Cu(II) sorption process onto the marine macroalga Ascophyllum nodosum. The effect of the operating variables on metal uptake capacity was studied in a batch system and a mathematical model showing the influence of each variable and their interactions was obtained. Study ranges were 10–40 ºC for temperature, 3.0–5.0 for pH and 50–150 mg L⁻¹ for initial Cu(II) concentration. Within these ranges, the biosorption capacity is slightly dependent on temperature but markedly increases with pH and initial concentration of Cu(II). The uptake capacities predicted by the model are in good agreement with the experimental values. The maximum biosorption capacity of Cu(II) by A. nodosum is 70 mg g⁻¹ and corresponds to the following values of those variables: temperature = 40 ºC, pH = 5.0 and initial Cu(II) concentration = 150 mg L⁻¹.
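The fitted model itself is not given in the abstract; the sketch below evaluates a hypothetical second-order response surface on coded variables to show how such a model reproduces the reported behaviour (weak temperature effect, strong pH and concentration effects, maximum near 70 mg g⁻¹ at the high corner of the study region):

```python
# Hypothetical second-order response-surface coefficients; the abstract
# reports only the optimum (70 mg/g at 40 ºC, pH 5.0, 150 mg/L), so the
# numbers here are illustrative, not the authors' fitted model.
def predicted_uptake(T, pH, C):
    """Quadratic response surface q = b0 + sum(bi*xi) + sum(bii*xi^2),
    written on coded variables in [-1, +1]."""
    # Code the natural variables onto [-1, 1] using the study ranges.
    x1 = (T - 25) / 15        # temperature: 10-40 ºC
    x2 = (pH - 4.0) / 1.0     # pH: 3.0-5.0
    x3 = (C - 100) / 50       # initial Cu(II): 50-150 mg/L
    return (52.0 + 2.0 * x1 + 8.0 * x2 + 9.0 * x3
            - 0.5 * x1**2 - 1.0 * x2**2 - 0.5 * x3**2)

# Uptake should peak at the high corner of the study region, as reported.
corner = predicted_uptake(40, 5.0, 150)
centre = predicted_uptake(25, 4.0, 100)
print(round(corner, 1), round(centre, 1))  # prints 69.0 52.0
```

The small coefficient on the coded temperature term mirrors the weak temperature dependence described in the abstract.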
Abstract:
A method for the determination of some pesticide residues in must and wine samples was developed using solid-phase microextraction (SPME) and gas chromatography with electron capture detection (GC/ECD). The procedure only needs dilution as sample pre-treatment and is therefore simple, fast and solvent-free. Eight fungicides (vinclozolin, procymidone, iprodione, penconazole, fenarimol, folpet, nuarimol and hexaconazole), one insecticide (chlorpyriphos) and two acaricides (bromopropylate and tetradifon) can be quantified. Good linearity was observed for all the compounds in the range 5–100 µg/L. The reproducibility of the measurements was acceptable (RSDs below 20%). Detection limits of 11 µg/L, on average, are sufficiently below the proposed maximum residue limits (MRLs) for these compounds in wine. The analytical method was applied to the determination of these compounds in Portuguese must and wine samples from the Demarcated Region of Alentejo, where no residues could be detected.
Abstract:
Microwave-assisted extraction (MAE) of agar from Gracilaria vermiculophylla, produced in an integrated multi-trophic aquaculture (IMTA) system in the Ria de Aveiro (northwestern Portugal), was tested and optimized using response surface methodology. The influence of the MAE operational parameters (extraction time, temperature, solvent volume and stirring speed) on the physical and chemical properties of the agar (yield, gel strength, gelling and melting temperatures, as well as sulphate and 3,6-anhydro-L-galactose contents) was evaluated in a 2⁴ orthogonal composite design. The quality of the extracted agar compared favorably with that attained using traditional extraction (2 h at 85 ºC) while drastically reducing extraction time, solvent consumption and waste disposal requirements. The optimum MAE results were: a yield of 14.4 ± 0.4%, a gel strength of 1331 ± 51 g/cm², a gelling temperature of 40.7 ± 0.2 ºC, a melting temperature of 93.1 ± 0.5 ºC, a sulphate content of 1.73 ± 0.13% and a 3,6-anhydro-L-galactose content of 39.4 ± 0.3%. Furthermore, this study suggests the feasibility of exploiting G. vermiculophylla grown in IMTA systems for agar production.
Abstract:
Serious games are starting to attain a higher profile as tools for learning in various contexts, in particular in areas such as education and training. Due to their characteristics, such as rules, behavior simulation and feedback to the player's actions, serious games provide a favorable learning environment where errors can occur without real-life penalty and students get instant feedback on challenges. These challenges are in accordance with the intended objectives and self-adapt and repeat according to the student's difficulty level. Through motivating and engaging environments, which serve as a basis for problem solving and the simulation of different situations and contexts, serious games have great potential to help players develop professional skills. But how do we certify the acquired knowledge and skills? With this work we propose a methodology to establish a relationship between the game mechanics of serious games and an array of competences for certification, evaluating the applicability of various aspects of game design and development, such as user interfaces and gameplay, and obtaining learning outcomes within the game itself. Through the definition of game mechanics combined with the necessary pedagogical elements, the game will ensure the certification. This paper presents a matrix of generic skills, based on the European Qualifications Framework, and the definition of the game mechanics necessary for certification in a tour guide training context. The certification matrix has as its reference axes skills, knowledge and competences, which describe what students should learn, understand and be able to do after they complete the learning process. Guide-interpreters welcome and accompany tourists on trips and visits to places of tourist interest and cultural heritage, such as museums, palaces and national monuments, where they provide a variety of information.
Tour guide certification requirements include specific skills and knowledge of foreign languages and of the History, Ethnology, Politics, Religion, Geography and Art of the territory in which the guide works. These skills include communication, interpersonal relationships, motivation, organization and management. This certification process aims to validate the skills needed to plan and conduct guided tours of the territory, to demonstrate knowledge appropriate to the context and, finally, to act as a good group leader. After defining which competences are to be certified, the next step is to delineate the expected learning outcomes, as well as to identify the game mechanics associated with them. Game mechanics, as methods invoked by agents for interaction with the game world, in combination with game elements/objects, allow multiple paths through which to explore the game environment and its educational process. Mechanics such as achievements, appointments, progression, reward schedules or status describe how a game can be designed to affect players in unprecedented ways. In order for the game to be able to certify tour guides, the design of the training game will incorporate a set of theoretical and practical tasks for acquiring skills and knowledge across various transversal themes. To this end, patterns of skills and abilities in acquiring different knowledge will be identified.
Abstract:
The application of information technologies (especially the Internet, Web 2.0 and social tools) makes informal learning more visible. This kind of learning is not linked to an institution or a period of time, but it is important enough to be taken into account. On the one hand, learners should be able to communicate to the institutions they are related to which skills they possess, whether these were achieved in a formal or an informal way. On the other hand, companies and educational institutions need deeper knowledge of the competences of their staff. The TRAILER project provides a methodology, supported by a technological framework, to facilitate communication about informal learning between businesses, employees and learners. The paper presents the project and some of the work carried out: an exploratory analysis of how informal learning is considered, and the technological framework proposed. Whilst challenges remain in establishing the meaningfulness of technological engagement for employees and businesses, the continuing transformation of the social, technological and educational environment is likely to lead to greater emphasis on the effective exploitation of informal learning.
Abstract:
An analytical method using microwave-assisted extraction (MAE) and liquid chromatography (LC) with fluorescence detection (FD) for the determination of ochratoxin A (OTA) in bread samples is described. A 2⁴ orthogonal composite design coupled with response surface methodology was used to study the influence of the MAE parameters (extraction time, temperature, solvent volume, and stirring speed) in order to maximize OTA recovery. The optimized MAE conditions were the following: 25 mL of acetonitrile, 10 min of extraction at 80 °C, and maximum stirring speed. Validation of the overall methodology was performed by spiking assays at five levels (0.1–3.00 ng/g). The quantification limit was 0.005 ng/g. The established method was then applied to 64 bread samples (wheat, maize, and wheat/maize bread) collected in the Oporto region (Northern Portugal). OTA was detected in 84% of the samples, with a maximum value of 2.87 ng/g, below the European maximum limit of 3 ng/g established for OTA in cereal products.
Abstract:
Over the last few years, there has been growing concern about the presence of pharmaceuticals in the environment. The main objective of this study was to develop and validate an SPE method, using response surface methodology, for the determination of ibuprofen in different types of water samples. The influence of sample pH and sample volume on ibuprofen recovery was studied; both independent variables had a pronounced effect on the dependent variable (ibuprofen recovery). Good selectivity, extraction efficiency, and precision were achieved using 600 mL of sample volume with the pH adjusted to 2.2. LC with fluorescence detection was employed. The optimized method was applied to 20 water samples from the North and South of Portugal.
Abstract:
The work presented focuses on the determination of the construction costs of small- and medium-diameter High-Density Polyethylene (HDPE) pipelines for basic sanitation, based on the methodology described in the book Custos de Construção e Exploração – Volume 9 of the series Gestão de Sistemas de Saneamento Básico, by Lencastre et al. (1994). This methodology was applied to construction management procedures, for which unit costs were estimated for several sets of works. According to Lencastre et al. (1994), "these sets refer to earthworks, piping, fittings and the respective operating devices, paving and the construction site, the site portion encompassing the ancillary works associated with the project." The costs were obtained by analysing several budgets for sanitation works resulting from recently held public tenders. To turn this methodology into an effective tool, spreadsheets were organized that make it possible to obtain realistic estimates of the execution costs of a given work in the phases preceding project development, namely when preparing the master plan of a system or when drawing up economic and financial feasibility studies, that is, even before any preliminary sizing of the system elements exists. Another technique implemented to evaluate the input data was "Robust Data Analysis" (Pestana, 1992). This methodology allowed the data to be analysed in greater detail before hypotheses were formulated to develop the risk analysis. The main idea is a highly flexible examination of the data, often even before comparing them to a probabilistic model. Thus, for a large data set, this technique made it possible to analyse the dispersion of the values found for the various works referred to above.
With the collected data, after their treatment, a Risk Analysis methodology was applied through Monte Carlo Simulation. This risk analysis was carried out using a Palisade software tool, @Risk, available at the Department of Civil Engineering. This quantitative risk analysis technique makes it possible to express the uncertainty of the input data, represented through the probability distributions the software provides. To put this methodology into practice, the spreadsheets built following the approach proposed in Lencastre et al. (1994) were used. The preparation and analysis of these estimates can support decisions on the viability of the work or works to be carried out, particularly with regard to economic aspects, allowing a well-founded decision analysis regarding the investments to be made.
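The Monte Carlo step can be sketched as follows; the work groups mirror those listed above, but the triangular unit-cost parameters and pipeline length are illustrative assumptions, not values from the tender budgets (the original study used @Risk rather than hand-written code):

```python
import random

random.seed(1)

# Illustrative triangular unit-cost distributions (EUR per metre) for the
# work groups used in the spreadsheets; the (min, mode, max) values are
# hypothetical stand-ins for the dispersion observed in tender budgets.
unit_costs = {            # (min, mode, max) EUR/m
    "earthworks": (15, 20, 30),
    "piping": (10, 14, 22),
    "fittings": (2, 3, 6),
    "paving": (8, 12, 20),
    "site": (3, 4, 7),
}
pipe_length_m = 1200      # hypothetical pipeline length

def simulate_total_cost():
    # One Monte Carlo sample: draw each unit cost, sum, scale by length.
    return pipe_length_m * sum(
        random.triangular(lo, hi, mode) for lo, mode, hi in unit_costs.values()
    )

samples = sorted(simulate_total_cost() for _ in range(10000))
p50 = samples[len(samples) // 2]
p90 = samples[int(0.9 * len(samples))]
print(f"median cost: {p50:,.0f} EUR, 90th percentile: {p90:,.0f} EUR")
```

The resulting percentiles play the role of the @Risk output distributions used to assess the economic viability of a planned work.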
Abstract:
This paper presents an electricity medium voltage (MV) customer characterization framework supported by knowledge discovery in databases (KDD). The main idea is to identify typical load profiles (TLP) of MV consumers and to develop a rule set for the automatic classification of new consumers. To achieve our goal, a methodology is proposed consisting of several steps: data pre-processing; application of several clustering algorithms to segment the daily load profiles; selection of the best partition, corresponding to the best consumer segmentation, based on the assessment of several clustering validity indices; and, finally, construction of a classification model based on the resulting clusters. To validate the proposed framework, a case study including a real database of MV consumers is performed.
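A minimal sketch of the clustering step, using a hand-rolled k-means on synthetic daily load profiles (the real framework compares several clustering algorithms and validity indices; the two profile shapes here are hypothetical):

```python
import math
import random

random.seed(0)

# Synthetic 24-point daily load profiles standing in for the real MV
# consumer database: a daytime industrial shape and an evening-peaking
# residential shape (both hypothetical).
def make_profile(kind):
    if kind == "industrial":
        base = [1.0 if 8 <= h < 18 else 0.4 for h in range(24)]
    else:  # residential
        base = [0.3 + 0.7 * math.exp(-((h - 20) ** 2) / 8) for h in range(24)]
    return [v + random.gauss(0, 0.05) for v in base]

profiles = ([make_profile("industrial") for _ in range(10)]
            + [make_profile("residential") for _ in range(10)])

def dist2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans2(data, iters=20):
    # Seed the two centroids with the first and last profiles (a
    # deterministic simplification of the usual random initialisation).
    centroids = [list(data[0]), list(data[-1])]
    clusters = [[], []]
    for _ in range(iters):
        clusters = [[], []]
        for p in data:
            j = 0 if dist2(p, centroids[0]) <= dist2(p, centroids[1]) else 1
            clusters[j].append(p)
        centroids = [
            [sum(col) / len(cl) for col in zip(*cl)] if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans2(profiles)
# Within-cluster sum of squares, a simple clustering validity measure.
wcss = sum(dist2(p, centroids[i]) for i, cl in enumerate(clusters) for p in cl)
print("cluster sizes:", [len(c) for c in clusters], "WCSS:", round(wcss, 2))
```

The cluster centroids are the "typical load profiles"; a rule-based classifier for new consumers would then be trained on the cluster labels.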
Abstract:
This paper presents the first phase of the redevelopment of the Electric Vehicle Scenario Simulator (EVeSSi) tool. A new methodology to generate traffic demand scenarios for the Simulation of Urban MObility (SUMO) urban traffic simulation tool is described. This methodology uses a Portuguese census database to generate a synthetic population for a given area under study. A realistic case study of a Portuguese city, Vila Real, is assessed. For this area the road network was created, along with a synthetic population and public transport. Traffic results were obtained and an electric bus fleet was evaluated, assuming that the current fleet would be replaced in the near future. The energy required to charge the electric fleet overnight was estimated in order to evaluate the impact it would have on the local electricity network.
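The overnight charging estimate reduces to simple energy bookkeeping; every figure below (fleet size, daily distance, consumption, charging window, efficiency) is an illustrative assumption, not a value from the Vila Real case study:

```python
# Back-of-the-envelope estimate of the overnight energy needed to charge
# an electric bus fleet; all inputs are hypothetical.
fleet_size = 12               # buses
daily_distance_km = 180       # km driven per bus per day
consumption_kwh_per_km = 1.3  # traction energy per km
charging_window_h = 8         # overnight charging window
charger_efficiency = 0.9      # grid-to-battery efficiency

# Energy drawn from the grid = fleet energy demand / charger efficiency.
energy_kwh = fleet_size * daily_distance_km * consumption_kwh_per_km / charger_efficiency
# Average grid power if charging is spread evenly over the window.
avg_power_kw = energy_kwh / charging_window_h

print(f"energy: {energy_kwh:.0f} kWh, average grid power: {avg_power_kw:.0f} kW")
# prints "energy: 3120 kWh, average grid power: 390 kW"
```

The average power figure is what would be compared against the local network's capacity to judge the impact of overnight charging.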
Abstract:
Most distributed generation and smart grid research is dedicated to studies of network operation parameters, reliability, and related topics. However, many of these works use traditional test systems, for instance the IEEE test systems. This paper proposes voltage magnitude and reliability studies in the presence of fault conditions, considering realistic conditions found in countries like Brazil. The methodology uses a hybrid method of fuzzy sets and Monte Carlo simulation, based on fuzzy-probabilistic models, together with a remedial action algorithm based on optimal power flow. To illustrate the application of the proposed method, the paper includes a case study of a real 12-bus sub-transmission network.
Abstract:
20th International Conference on Reliable Software Technologies (Ada-Europe 2015), 22–26 June 2015, Madrid, Spain.
Abstract:
Demo presented at the 12th Workshop on Models and Algorithms for Planning and Scheduling Problems (MAPSP 2015), 8–12 June 2015, La Roche-en-Ardenne, Belgium. Extended abstract.