985 results for Particle method
Abstract:
Competitive electricity markets have arisen as a result of power-sector restructuring and power-system deregulation. The players participating in competitive electricity markets must define strategies and make decisions using all the available information and business opportunities.
Abstract:
This paper presents a modified Particle Swarm Optimization (PSO) methodology to solve the problem of energy resources management with high penetration of distributed generation and Electric Vehicles (EVs) with gridable capability (V2G). The objective of the day-ahead scheduling problem in this work is to minimize operation costs, namely energy costs, regarding the management of these resources in the smart grid context. The modifications applied to the PSO aimed to improve its adequacy to solve the mentioned problem. The proposed Application Specific Modified Particle Swarm Optimization (ASMPSO) includes an intelligent mechanism to adjust velocity limits during the search process, as well as self-parameterization of PSO parameters, making it more user-independent. It presents better robustness and convergence characteristics than the tested PSO variants, as well as better constraint handling. This enables its use for addressing real-world large-scale problems in much shorter times than deterministic methods, providing system operators with adequate decision support and achieving efficient resource scheduling, even when a significant number of alternative scenarios must be considered. The paper includes two realistic case studies with different penetrations of gridable vehicles (1000 and 2000). The proposed methodology is about 2600 times faster than the Mixed-Integer Non-Linear Programming (MINLP) reference technique, reducing the time required from 25 h to 36 s for the scenario with 2000 vehicles, with about one percent of difference in the objective function cost value.
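To make the velocity-limit idea concrete, the sketch below shows a plain PSO loop in Python in which the velocity bounds shrink as the iterations progress. The shrinking rule, parameter values and test function are illustrative assumptions only, not the ASMPSO mechanism or the self-parameterization scheme proposed in the paper.

```python
import numpy as np

def pso_adaptive_vmax(obj, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO sketch with a shrinking velocity limit (illustrative, not ASMPSO)."""
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    dim = len(bounds)
    x = np.random.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_f = x.copy(), np.array([obj(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for t in range(n_iter):
        vmax = 0.5 * (hi - lo) * (1.0 - t / n_iter)  # velocity limit shrinks over time (assumed rule)
        r1 = np.random.rand(n_particles, dim)
        r2 = np.random.rand(n_particles, dim)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        v = np.clip(v, -vmax, vmax)                   # enforce the adaptive velocity bound
        x = np.clip(x + v, lo, hi)                    # keep particles inside the search space
        f = np.array([obj(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Example: minimise the sphere function in 5 dimensions
best_x, best_f = pso_adaptive_vmax(lambda p: float(np.sum(p**2)), [(-10, 10)] * 5)
```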
Abstract:
The project “Avaliação da Exposição a Fungos e Partículas em Explorações Avícolas e Suinícolas” (Assessment of Exposure to Fungi and Particles in Poultry and Swine Farms) involved a large number of environmental and biological samples and their laboratory processing, and was only possible thanks to the funding provided by the Autoridade para as Condições de Trabalho. A cross-sectional study was carried out to assess the contamination caused by fungi and particles in 7 poultry farms and 7 swine farms. Regarding biological monitoring, spirometric parameters were measured using the MK8 Microlab spirometer, the presence of clinical symptoms associated with asthma and other allergic diseases was assessed through a questionnaire adapted from the European Community Respiratory Health Survey, and sensitisation to fungal agents (IgE) was also evaluated. Two further objectives were added to the study, namely: to verify the presence of three potentially pathogenic/toxigenic species/strains using molecular biology, and to assess workers' exposure to the mycotoxin aflatoxin B1 using a biological indicator of exposure. Twenty-seven 25-litre air samples were collected in the poultry farms and 56 samples of 50 litres in the swine farms using the impaction method. Air sampling and particle concentration measurements were carried out inside and outside the pavilions, the latter being taken as the reference location. Temperature and relative humidity were also recorded simultaneously. Surface samples were collected by the swab technique, using a stainless-steel square template of 10 cm side, in accordance with International Standard ISO 18593:2004. The swabs obtained (20 from the poultry farms and 48 from the swine farms) were inoculated on malt extract agar (2%) with chloramphenicol (0.05 g/L). In addition to the air and surface samples, samples of the litter of the poultry farms (7 new and 14 used) and of the floor covering of the swine farms (3 new and 4 used) were also collected and packed in sterilised bags. Each sample was diluted and inoculated on plates containing malt extract agar. All samples were incubated at 27.5°C for 5 to 7 days, and quantitative (CFU/m3; CFU/m2; CFU/g) and qualitative results were obtained, with identification of the fungal species. For the molecular biology methods, 300-litre air samples were collected using the impinger method at a sampling rate of 300 L/min. Molecular identification of three potentially pathogenic and/or toxigenic species (Aspergillus flavus, Aspergillus fumigatus and Stachybotrys chartarum) was performed by real-time PCR (RT-PCR) using the Rotor-Gene 6000 qPCR Detection System. Particle measurements were carried out with direct-reading equipment (Lighthouse, model 2016 IAQ), which measured the particle concentration (mg/m3) in 5 distinct size fractions (PM0.5; PM1.0; PM2.5; PM5.0; PM10). In the poultry farms, 28 fungal species/genera were isolated from the air, Aspergillus versicolor being the most frequent species (20.9%), followed by Scopulariopsis brevicaulis (17.0%) and Penicillium sp. (14.1%). Within the genus Aspergillus, Aspergillus flavus presented the highest spore counts (>2000 CFU/m3). Regarding surfaces, A. versicolor was detected in the highest numbers (>3 × 10−2 CFU/m2).
In the new litter, Penicillium was the most frequent genus (59.9%), followed by Alternaria (17.8%), Cladosporium (7.1%) and Aspergillus (5.7%). In the used litter, Penicillium sp. was the most frequent (42.3%), followed by Scopulariopsis sp. (38.3%), Trichosporon sp. (8.8%) and Aspergillus sp. (5.5%). Regarding particle contamination, the larger particles were detected in the highest concentrations, namely PM5.0 (particles with a size of 5.0 μm or less) and PM10 (particles with a size of 10 μm or less). In this setting, the prevalence of obstructive ventilatory impairment was higher in the individuals with longer exposure times (31.7%), regardless of whether they were smokers (17.1%) or non-smokers (14.6%). Specific IgE assessment was performed only in poultry farm workers (14 women and 33 men), and no positive association (p<0.05) was found between fungal contamination and sensitisation to fungal antigens. In the swine farms, Aspergillus versicolor was the most frequent species (20.9%), followed by Scopulariopsis brevicaulis (17.0%) and Penicillium sp. (14.1%). Within the genus Aspergillus, A. versicolor showed the highest airborne counts (>2000 CFU/m3) and the highest prevalence (41.9%), followed by A. flavus and A. fumigatus (8.1%). Regarding the surfaces analysed, A. versicolor was detected in the highest numbers (>3 × 10−2 CFU/m2). In the floor covering of the swine farms, the genus Trichoderma was the most frequent in the new covering (28.0%), followed by A. versicolor and Acremonium sp. (14.0%). The genus Mucor was the most frequent in the used covering (25.1%), followed by Trichoderma sp. (18.3%) and Acremonium sp. (11.2%). Regarding particles, higher values were again found for the PM5.0 fraction, with PM10 predominating. In this setting, only 4 participants (22.2%) showed obstructive ventilatory impairment; among these, the most severe obstructions were found in those who also had the longest exposure times. The prevalence of asthma in the sample of workers from the two settings studied was 8.75%, and a high prevalence of respiratory symptoms was also found in non-asthmatic workers. Regarding the complementary use of conventional and molecular methods, it is recommended that the assessment of fungal contamination in these settings, and consequently of occupational exposure to fungi, be supported by both methodologies; it was also found that occupational exposure to the mycotoxin aflatoxin B1 occurs in both occupational settings. Given the results obtained, it is important to stress that the settings studied require an integrated Occupational Health intervention, covering both environmental surveillance and health surveillance, with the aim of reducing exposure to the two risk factors studied (fungi and particles).
Abstract:
Component joining is typically performed by welding, fastening, or adhesive bonding. For bonded aerospace applications, adhesives must withstand high temperatures (200°C or above, depending on the application), which implies their mechanical characterization under identical conditions. The extended finite element method (XFEM) is an enhancement of the finite element method (FEM) that can be used for the strength prediction of bonded structures. This work proposes and validates damage laws for a thin layer of an epoxy adhesive at room temperature (RT), 100, 150, and 200°C using the XFEM. The fracture toughness (GIc) and maximum load in pure tensile loading were defined by testing double-cantilever beam (DCB) and bulk tensile specimens, respectively, which permitted building the damage laws for each temperature. The bulk test results revealed that the maximum load decreased gradually with the temperature. On the other hand, the value of GIc of the adhesive, extracted from the DCB data, was shown to be relatively insensitive to temperature up to the glass transition temperature (Tg), while above Tg (at 200°C) a great reduction took place. The output of the DCB numerical simulations for the various temperatures showed a good agreement with the experimental results, which validated the obtained data for strength prediction of bonded joints in tension. By the obtained results, the XFEM proved to be an alternative for the accurate strength prediction of bonded structures.
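For context, a common way to build a traction–separation (damage) law from the two measured quantities is the triangular law, whose area equals the fracture toughness. The relation below is given only as an illustration of how GIc and the peak cohesive strength define such a law; it is not necessarily the exact shape used in the paper:

$$G_{Ic} = \int_0^{\delta_f} \sigma(\delta)\,\mathrm{d}\delta = \tfrac{1}{2}\,\sigma_{\max}\,\delta_f \quad\Longrightarrow\quad \delta_f = \frac{2\,G_{Ic}}{\sigma_{\max}},$$

where $\sigma_{\max}$ is the peak stress obtained from the bulk tensile tests and $\delta_f$ is the separation at complete failure.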
Abstract:
Dissertation submitted to obtain the Master's degree in Electrical Engineering, branch of Automation and Industrial Electronics.
Abstract:
Global warming and the associated climate changes have been the subject of intensive research due to their major impact on social, economic and health aspects of human life. Surface temperature time-series characterise the Earth as a slow-dynamics spatiotemporal system, evidencing long memory behaviour, typical of fractional-order systems. Such phenomena are difficult to model and analyse, demanding alternative approaches. This paper studies the complex correlations between global temperature time-series using the multidimensional scaling (MDS) approach. MDS provides a graphical representation of the pattern of climatic similarities between regions around the globe. The similarities are quantified through two mathematical indices that correlate the monthly average temperatures observed in meteorological stations over a given period of time. Furthermore, time dynamics is analysed by performing the MDS analysis over slices sampling the time series. MDS generates maps describing the stations' locus in the perspective that, if they are perceived to be similar to each other, then they are placed on the map forming clusters. We show that MDS provides an intuitive and useful visual representation of the complex relationships that are present among temperature time-series, which are not perceived on traditional geographic maps. Moreover, MDS avoids sensitivity to the irregular distribution density of the meteorological stations.
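A minimal Python sketch of this kind of analysis is given below: stations are compared through a correlation-based dissimilarity and then embedded in two dimensions with MDS. The random data and the specific index (1 − correlation) are assumptions for illustration; the paper uses real station records and its own two correlation indices.

```python
import numpy as np
from sklearn.manifold import MDS

# Rows = meteorological stations, columns = monthly average temperatures
# (random placeholder data; the paper uses real station records)
rng = np.random.default_rng(0)
temps = rng.normal(size=(40, 240))

# One possible correlation-based dissimilarity (an assumption, not necessarily
# either of the two indices used in the paper)
dissim = 1.0 - np.corrcoef(temps)

# MDS embedding of the stations into 2-D: similar stations end up close together
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)
print(coords.shape)  # (40, 2), one point per station
```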
Abstract:
This paper studies a discrete dynamical system of particles that evolve through mutual interactions. The computational model is an abstraction of the natural world, and real systems can range from the huge cosmological scale down to the scale of the biological cell, or even molecules. Different conditions for the system evolution are tested. The emerging patterns are analysed by means of fractal dimension and entropy measures. It is observed that the population of particles evolves towards geometrical objects with a fractal nature. Moreover, the time signature of the entropy can be interpreted in the light of complex dynamical systems.
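Of the two measures mentioned, the fractal dimension is the less standard to compute, so a minimal box-counting sketch is given below; the point set and the box sizes are illustrative assumptions, not configurations produced by the paper's particle model.

```python
import numpy as np

def box_counting_dimension(points, sizes=(1/2, 1/4, 1/8, 1/16, 1/32)):
    """Estimate the box-counting (fractal) dimension of a 2-D point set in [0, 1)^2."""
    counts = []
    for s in sizes:
        boxes = np.floor(points / s).astype(int)          # box index of each point at scale s
        counts.append(len({tuple(b) for b in boxes}))     # number of occupied boxes N(s)
    # Dimension estimate = slope of log N(s) versus log(1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Uniformly scattered points should give a dimension close to 2
rng = np.random.default_rng(1)
print(box_counting_dimension(rng.random((5000, 2))))
```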
Abstract:
This manuscript analyses the data generated by a Zero Length Column (ZLC) diffusion experimental set-up for 1,3-di-isopropyl benzene in a 100% alumina matrix with variable particle size. The time evolution of the phenomena resembles that of fractional-order systems, namely those with a fast initial transient followed by long and slow tails. The experimental measurements are best fitted with the Harris model, revealing a power-law behavior.
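A minimal fitting sketch in the spirit of that analysis is shown below. The Harris-type form c(t)/c0 = 1/(1 + a·t^b) and the synthetic data are assumptions made here for illustration; they are not necessarily the exact expression or the measurements used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def harris_type(t, a, b):
    """Heavy-tailed decay: fast initial transient followed by a slow power-law tail."""
    return 1.0 / (1.0 + a * t**b)

# Synthetic ZLC-like response used only to exercise the fit
t = np.linspace(0.1, 100.0, 200)
y = harris_type(t, 0.5, 1.2) + 0.01 * np.random.default_rng(2).normal(size=t.size)

(a, b), _ = curve_fit(harris_type, t, y, p0=(1.0, 1.0))
print(f"fitted a = {a:.3f}, b = {b:.3f}")  # b characterises the power-law tail
```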
Abstract:
One of the most well-known bio-inspired algorithms used in optimization problems is the particle swarm optimization (PSO), which basically consists of a machine-learning technique loosely inspired by birds flocking in search of food. More specifically, it consists of a number of particles that collectively move through the search space in search of the global optimum. The Darwinian particle swarm optimization (DPSO) is an evolutionary algorithm that extends the PSO using natural selection, or survival of the fittest, to enhance the ability to escape from local optima. This paper firstly presents a survey on PSO algorithms, mainly focusing on the DPSO. Afterward, a method for controlling the convergence rate of the DPSO using fractional calculus (FC) concepts is proposed. The fractional-order optimization algorithm, denoted FO-DPSO, is tested using several well-known functions, and the relationship between the fractional-order velocity and the convergence of the algorithm is observed. Moreover, experimental results show that the FO-DPSO significantly outperforms the previously presented FO-PSO.
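For reference, the fractional-order velocity update reported in the FO-PSO/FO-DPSO literature replaces the inertia term by a truncated Grünwald–Letnikov derivative of order $\alpha$; the four-term truncation usually quoted is reproduced below (treat the truncation depth and coefficients as an assumption here, not as a statement of this paper's exact formulation):

$$v_{t+1} = \alpha\,v_t + \frac{\alpha(1-\alpha)}{2}\,v_{t-1} + \frac{\alpha(1-\alpha)(2-\alpha)}{6}\,v_{t-2} + \frac{\alpha(1-\alpha)(2-\alpha)(3-\alpha)}{24}\,v_{t-3} + \phi_1 r_1\,(p_t - x_t) + \phi_2 r_2\,(g_t - x_t),$$

so that $\alpha = 1$ recovers a standard PSO-like update with unit inertia, while smaller $\alpha$ gives the velocity a longer memory and slows convergence.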
Abstract:
Collective behaviours can be observed in both natural and man-made systems composed of a large number of elemental subsystems. Typically, each elemental subsystem has its own dynamics but, whenever interaction between individuals occurs, the individual behaviours tend to be relaxed, and collective behaviours emerge. In this paper, the collective behaviour of a large-scale system composed of several coupled elemental particles is analysed. The dynamics of the particles are governed by the same type of equations but having different parameter values and initial conditions. Coupling between particles is based on statistical feedback, which means that each particle is affected by the average behaviour of its neighbours. It is shown that the global system may unveil several types of collective behaviours, corresponding to partial synchronisation, characterised by the existence of several clusters of synchronised subsystems, and global synchronisation between particles, where all the elemental particles synchronise completely.
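The statistical-feedback coupling described above can be illustrated with globally coupled maps, where each unit is pulled toward the ensemble average. The logistic map, the parameter spread and the coupling strength below are illustrative choices, not the particle equations used in the paper.

```python
import numpy as np

def simulate(n=100, steps=500, eps=0.2, seed=3):
    """Mean-field coupled logistic maps: each unit feels the average of the ensemble."""
    rng = np.random.default_rng(seed)
    r = rng.uniform(3.6, 4.0, n)   # different parameter values per element
    x = rng.random(n)              # different initial conditions
    for _ in range(steps):
        fx = r * x * (1.0 - x)                   # individual (uncoupled) dynamics
        x = (1.0 - eps) * fx + eps * fx.mean()   # statistical feedback from the mean field
    return x

x_final = simulate()
# A small spread of final states indicates clustering or full synchronisation
print(x_final.std())
```

Varying eps sweeps such a toy system from independent behaviour (eps = 0) towards cluster formation and, for strong enough coupling, complete synchronisation.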
Abstract:
In a “perfect” drinking water system, the water quality for the consumers should be the same as the quality of the water leaving the treatment plant. However, some variability along the system can lead to a decrease in water quality (such as discolouration), which is usually reflected in the number of customer complaints. This change may be related to the amount of sediment in the distribution network, leading to an increase in turbidity at the water supply. Since there is no such thing as a perfect drinking water system, the behaviour of particles in a drinking water network needs a suitable approach in order to understand how it works. Moreover, the combination of measurements such as turbidity patterns and the Resuspension Potential Method (RPM) aids in the prevention of discoloured water complaints and in interventions such as a treatment upgrade or network cleaning. Besides sediments, there is also bacterial regrowth in the network, which is related to the water quality and the characteristics of the distribution network. In a theoretical drinking water system, higher velocities, higher temperatures and shorter residence times lead to greater bacterial growth. In this study we observe steady states of velocity and residence time, and bacterial growth does not seem to be related to either. It can be concluded that adequate measurements of RPM, customer complaints and bacterial concentrations allow a broader knowledge of particle behaviour in drinking water systems.
Abstract:
Environment monitoring has an important role in occupational exposure assessment. However, due to several factors, it is done with insufficient frequency and normally does not give the necessary information to choose the most adequate safety measures to avoid or control exposure. Identifying all the tasks developed in each workplace and conducting a task-based exposure assessment help to refine the exposure characterization and reduce assessment errors. A task-based assessment can also provide a better evaluation of exposure variability, instead of assessing personal exposures using continuous 8-hour time-weighted average measurements. Health effects related to exposure to particles have mainly been investigated with mass-measuring instruments or gravimetric analysis. More recently, however, some studies support that size distribution and particle number concentration may have advantages over particle mass concentration for assessing the health effects of airborne particles. Several exposure assessments were performed in different occupational settings (bakery, grill house, cork industry and horse stable), applying these two resources: task-based exposure assessment and particle number concentration by size. The task-based approach permitted identifying the tasks with higher exposure to the smaller particles (0.3 μm) in the different occupational settings. The data obtained allow a more concrete and effective risk assessment and the identification of priorities for safety investments.
Abstract:
OBJECTIVE To propose a method of redistributing ill-defined causes of death (IDCD) based on the investigation of such causes. METHODS In 2010, an evaluation of the results of investigating the causes of death classified as IDCD in accordance with chapter 18 of the International Classification of Diseases (ICD-10) by the Mortality Information System was performed. The redistribution coefficients were calculated according to the proportional distribution of ill-defined causes reclassified after investigation into any chapter of the ICD-10, except for chapter 18, and were used to redistribute the remaining, non-investigated ill-defined causes by sex and age. The IDCD redistribution coefficient was compared with two usual methods of redistribution: a) the total redistribution coefficient, based on the proportional distribution of all the defined causes originally notified, and b) the non-external redistribution coefficient, similar to the previous one but excluding external causes. RESULTS Of the 97,314 deaths by ill-defined causes reported in 2010, 30.3% were investigated, and 65.5% of those were reclassified as defined causes after the investigation. Endocrine diseases, mental disorders, and maternal causes had a higher representation among the reclassified ill-defined causes, unlike infectious diseases, neoplasms, and genitourinary diseases, which had higher proportions among the defined causes reported. External causes represented 9.3% of the ill-defined causes reclassified. The correction of mortality rates by the total redistribution coefficient and the non-external redistribution coefficient increased the magnitude of the rates by a relatively similar factor for most causes, unlike the IDCD redistribution coefficient, which corrected the different causes of death with differentiated weights. CONCLUSIONS The proportional distribution of causes among the ill-defined causes reclassified after investigation was not similar to the original distribution of defined causes. Therefore, the redistribution of the remaining ill-defined causes based on the investigation allows for more appropriate estimates of the mortality risk due to specific causes.
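The core of the redistribution can be sketched in a few lines of Python: coefficients come from the proportional distribution of the investigated ill-defined deaths that were reclassified, and are then applied to the ill-defined deaths that were not investigated. The cause groups and counts below are invented for illustration, and the stratification by sex and age described in the paper is omitted for brevity.

```python
# Causes to which investigated ill-defined deaths were reclassified (illustrative counts)
reclassified = {"circulatory": 12000, "neoplasms": 4000, "external": 1800, "other": 2200}
not_investigated_idcd = 67000  # remaining ill-defined deaths (illustrative figure)

# IDCD redistribution coefficients: proportional distribution of the reclassified causes
total = sum(reclassified.values())
coefficients = {cause: n / total for cause, n in reclassified.items()}

# Redistribute the remaining ill-defined deaths according to those coefficients
for cause, coef in coefficients.items():
    extra = coef * not_investigated_idcd
    print(f"{cause}: +{extra:.0f} deaths added to the originally reported counts")
```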
Abstract:
A simple procedure to measure the cohesive laws of bonded joints under mode I loading using the double cantilever beam test is proposed. The method only requires recording the applied load–displacement data and measuring the crack opening displacement at the crack tip in the course of the experimental test. The strain energy release rate is obtained by a procedure involving the Timoshenko beam theory, the specimen's compliance and the crack equivalent concept. Following the proposed approach, the influence of the fracture process zone is taken into account, which is fundamental for an accurate estimation of the failure process details. The cohesive law is obtained by differentiation of the strain energy release rate as a function of the crack opening displacement. The model was validated numerically considering three representative cohesive laws. Numerical simulations using finite element analysis including cohesive zone modeling were performed. The good agreement between the input and resulting laws for all the cases considered validates the model. An experimental confirmation was also performed by comparing the numerical and experimental load–displacement curves. The numerical load–displacement curves were obtained by adjusting typical cohesive laws to the ones measured experimentally following the proposed approach, using finite element analysis including cohesive zone modeling. Once again, good agreement was obtained in the comparisons, thus demonstrating the good performance of the proposed methodology.
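The final step mentioned above, recovering the cohesive law from the measured strain energy release rate, corresponds to the standard relation

$$\sigma(w) = \frac{\partial G_I(w)}{\partial w},$$

where $G_I$ is the mode I strain energy release rate obtained from the compliance-based (crack equivalent) data reduction and $w$ is the crack opening displacement at the crack tip; the symbols are generic and do not presume the paper's exact notation.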
Abstract:
Constrained nonlinear optimization problems can be solved using penalty or barrier functions. This strategy, based on solving unconstrained problems derived from the original problem, has been shown to be effective, particularly when used with direct search methods. An alternative for solving such problems is the filters method. The filters method, introduced by Fletcher and Leyffer in 2002, has been widely used to solve problems of the type mentioned above. These methods use a strategy different from barrier or penalty functions: whereas the latter define a new function that combines the objective function and the constraints, the filters method treats an optimization problem as a bi-objective problem that minimizes the objective function and a function that aggregates the constraints. Motivated by the work of Audet and Dennis in 2004, using the filters method with derivative-free algorithms, the authors developed works in which other direct search methods were used, combining their potential with the filters method. More recently, a new variant of these methods was presented, in which some alternative constraint aggregations for the construction of filters were proposed. This paper presents a variant of the filters method, more robust than the previous ones, that has been implemented with a safeguard procedure in which the values of the function and constraints are interlinked and not treated completely independently.
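A minimal sketch of the filter mechanism described above is given below: a trial point with objective value f and aggregated constraint violation h is accepted only if no stored pair dominates it. The sum-of-violations aggregation, the margin gamma and the dominance test are illustrative choices; the paper's variant interlinks the function and constraint values through its own safeguard procedure.

```python
def violation(gs):
    """Aggregate constraint violation h(x) for constraints written as g_i(x) <= 0."""
    return sum(max(0.0, g) for g in gs)

def acceptable(f_new, h_new, filter_pairs, gamma=1e-5):
    """Filter test: accept (f_new, h_new) unless some stored pair dominates it."""
    return all(f_new < f_j - gamma * h_j or h_new < h_j - gamma * h_j
               for f_j, h_j in filter_pairs)

def add_to_filter(f_new, h_new, filter_pairs):
    """Insert an accepted pair and drop any stored pair it dominates."""
    kept = [(f_j, h_j) for f_j, h_j in filter_pairs if f_j < f_new or h_j < h_new]
    kept.append((f_new, h_new))
    return kept

# The pair (1.0, 0.0) dominates (2.0, 0.1) but not (0.5, 0.3)
flt = [(1.0, 0.0)]
print(acceptable(2.0, 0.1, flt), acceptable(0.5, 0.3, flt))  # False True
```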