974 results for Symmetry Ratio Algorithm
Abstract:
This paper presents an algorithm to efficiently generate the state-space of systems specified using the IOPT Petri-net modeling formalism. IOPT nets are a non-autonomous Petri-net class, based on Place-Transition nets with an extended set of features designed to allow the rapid prototyping and synthesis of system controllers through an existing hardware-software co-design framework. To obtain coherent and deterministic operation, IOPT nets use a maximal-step execution semantics where, in a single execution step, all enabled transitions fire simultaneously. This increases the resulting state-space complexity and can cause an arc "explosion" effect: real-world applications with several million states can reach a number of arcs one order of magnitude higher, leading to the need for high-performance state-space generator algorithms. The proposed algorithm applies a compilation approach to read a PNML file containing one IOPT model and automatically generate an optimized C program to calculate the corresponding state-space.
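The maximal-step semantics can be sketched as a successor function inside a breadth-first state-space construction. This is a minimal Python illustration of the semantics, not the paper's compiled C generator; the two-transition net is hypothetical, and conflict resolution between transitions competing for the same tokens is omitted.

```python
# Breadth-first state-space construction under a maximal-step semantics:
# every enabled transition fires simultaneously in each step.
from collections import deque

# Hypothetical net: a marking is a tuple of token counts per place.
# Each transition is a (pre, post) pair of vectors over places.
transitions = [
    ((1, 0, 0), (0, 1, 0)),   # t0: p0 -> p1
    ((0, 1, 0), (0, 0, 1)),   # t1: p1 -> p2
]

def enabled(marking, pre):
    return all(m >= p for m, p in zip(marking, pre))

def maximal_step(marking):
    """Fire all simultaneously enabled transitions in one step."""
    step = [t for t in transitions if enabled(marking, t[0])]
    m = list(marking)
    for pre, post in step:
        m = [x - a + b for x, a, b in zip(m, pre, post)]
    return tuple(m), bool(step)

def state_space(initial):
    seen, arcs = {initial}, []
    frontier = deque([initial])
    while frontier:
        s = frontier.popleft()
        nxt, fired = maximal_step(s)
        if not fired:          # dead marking: no successor
            continue
        arcs.append((s, nxt))
        if nxt not in seen:
            seen.add(nxt)
            frontier.append(nxt)
    return seen, arcs

states, arcs = state_space((1, 0, 0))
```

Because every state has a single maximal-step successor for fixed inputs, the arc count grows with the number of reachable input combinations, which is where the "explosion" the abstract mentions comes from.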
Abstract:
This study explores a large set of OC and EC measurements in PM(10) and PM(2.5) aerosol samples, undertaken with a long term constant analytical methodology, to evaluate the capability of the OC/EC minimum ratio to represent the ratio between the OC and EC aerosol components resulting from fossil fuel combustion (OC(ff)/EC(ff)). The data set covers a wide geographical area in Europe, but with a particular focus upon Portugal, Spain and the United Kingdom, and includes a great variety of sites: urban (background, kerbside and tunnel), industrial, rural and remote. The highest minimum ratios were found in samples from remote and rural sites. Urban background sites have shown spatially and temporally consistent minimum ratios, of around 1.0 for PM(10) and 0.7 for PM(2.5). The consistency of results has suggested that the method could be used as a tool to derive the ratio between OC and EC from fossil fuel combustion and consequently to differentiate OC from primary and secondary sources. To explore this capability, OC and EC measurements were performed in a busy roadway tunnel in central Lisbon. The OC/EC ratio, which reflected the composition of vehicle combustion emissions, was in the range of 0.3-0.4. Ratios of OC/EC in roadside increment air (roadside minus urban background) in Birmingham, UK also lie within the range 0.3-0.4. Additional measurements were performed under heavy traffic conditions at two double kerbside sites located in the centre of Lisbon and Madrid. The OC/EC minimum ratios observed at both sites were found to be between those of the tunnel and those of urban background air, suggesting that minimum values commonly obtained for this parameter in open urban atmospheres over-predict the direct emissions of OC(ff) from road transport. Possible reasons for this discrepancy are explored. (C) 2011 Elsevier Ltd. All rights reserved.
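The minimum-ratio idea (the EC tracer method) can be sketched numerically: the smallest observed OC/EC ratio approximates the primary fossil-fuel ratio, and secondary OC follows by subtraction. The sample values below are hypothetical, not the study's data.

```python
# EC tracer sketch: (OC, EC) pairs in ug/m3, hypothetical urban data.
samples = [
    (3.2, 2.9), (4.1, 3.0), (2.8, 2.7), (5.0, 2.5),
]

# Minimum observed OC/EC, taken as the primary OC(ff)/EC(ff) estimate.
ratio_min = min(oc / ec for oc, ec in samples)

def secondary_oc(oc, ec, r_min=ratio_min):
    """EC-tracer estimate of secondary organic carbon: OC - EC * (OC/EC)_min."""
    return max(oc - ec * r_min, 0.0)
```

The sample fixing the minimum is, by construction, attributed zero secondary OC; every other sample's excess over `EC * ratio_min` is interpreted as non-primary OC.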
Abstract:
In recent years the use of several new resources in power systems, such as distributed generation, demand response and, more recently, electric vehicles, has increased significantly. Power systems aim at lowering operational costs, which requires adequate energy resource management. In this context, load consumption management plays an important role, making it necessary to use optimization strategies to adjust consumption to the supply profile. These optimization strategies can be integrated into demand response programs. Controlling the energy consumption of an intelligent house aims to optimize the load consumption. This paper presents a genetic algorithm approach to manage the consumption of a residential house, making use of a SCADA system developed by the authors. Consumption management is done by reducing or curtailing loads to keep the power consumption at, or below, a specified energy consumption limit. This limit is determined according to the consumer's strategy, taking into account the renewable based micro generation, energy price, supplier solicitations, and the consumer's preferences. The proposed approach is compared with a mixed integer non-linear approach.
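A minimal sketch of a GA for this kind of load curtailment: a binary chromosome marks which loads stay on, and fitness rewards kept consumer value while penalizing consumption above the limit. The loads, preference weights, limit, and GA parameters are hypothetical, not the paper's.

```python
import random

# Hypothetical loads: (power in kW, consumer preference value).
loads = [(2.0, 5), (1.5, 3), (1.0, 4), (3.0, 2)]
LIMIT = 4.0  # kW consumption limit from the consumer strategy

def fitness(chrom):
    power = sum(kw for bit, (kw, _) in zip(chrom, loads) if bit)
    value = sum(v for bit, (_, v) in zip(chrom, loads) if bit)
    penalty = 100 * max(power - LIMIT, 0)   # penalize limit violations
    return value - penalty

def evolve(pop_size=20, gens=40, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in loads] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]       # elitist selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(loads))
            child = a[:cut] + b[cut:]        # one-point crossover
            if rng.random() < 0.1:           # bit-flip mutation
                i = rng.randrange(len(loads))
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

With the large penalty factor, any schedule exceeding the limit scores below every feasible one, so the GA converges toward the most valued load set that fits under the limit.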
Abstract:
To maintain a power system within its operating limits, level-ahead planning requires competitive techniques to solve the optimal power flow (OPF). OPF is a non-linear, large combinatorial problem. The Ant Colony Search (ACS) optimization algorithm is inspired by the organized natural movement of real ants and has been successfully applied to a variety of large combinatorial optimization problems. This paper presents an implementation of Ant Colony optimization to solve the OPF in an economic dispatch context. The proposed methodology has been developed for maintenance and repair planning with 48- to 24-hour anticipation. The main advantage of this method is its low execution time, which allows the use of OPF when a large set of scenarios has to be analyzed. The paper includes a case study using the IEEE 30-bus network. The results are compared with other well-known methodologies presented in the literature.
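As a toy illustration of the ant-colony mechanics (pheromone-biased construction, evaporation, and reinforcement of the best solution) on a heavily simplified dispatch: ants pick a discretized output level for each generator. The costs, levels, and demand are hypothetical, not the paper's OPF formulation.

```python
import random

levels = [0.0, 0.5, 1.0]   # per-unit output options
cost = [1.0, 2.0]          # $/p.u. for generators g0, g1 (hypothetical)
DEMAND = 1.5

def dispatch_cost(choice):
    out = [levels[i] for i in choice]
    if abs(sum(out) - DEMAND) > 1e-9:   # infeasible: demand not met
        return float("inf")
    return sum(c * o for c, o in zip(cost, out))

def aco(n_ants=10, iters=30, rho=0.3, seed=2):
    rng = random.Random(seed)
    tau = [[1.0] * len(levels) for _ in cost]   # pheromone per (gen, level)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            # Each ant samples one level per generator, biased by pheromone.
            choice = [rng.choices(range(len(levels)), weights=tau[g])[0]
                      for g in range(len(cost))]
            c = dispatch_cost(choice)
            if c < best_cost:
                best, best_cost = choice, c
        if best is not None:
            for g in range(len(cost)):          # evaporate, reinforce best
                tau[g] = [(1 - rho) * t for t in tau[g]]
                tau[g][best[g]] += 1.0
    return best, best_cost

best_choice, best_cost = aco()
```

Here the cheapest feasible dispatch is the cheap generator at full output and the expensive one at half, and the pheromone trail concentrates on it after a few iterations.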
Abstract:
This paper presents a Unit Commitment model with reactive power compensation, solved by Genetic Algorithm (GA) optimization techniques. The GA has been developed as a computational tool coded in MATLAB. The main objective is to find the best generation schedule that minimizes active power losses and the reactive power to be compensated, subject to the power system's technical constraints: the full AC power flow equations and the active and reactive power generation limits. All constraints represented in the objective function are weighted with penalty factors. The IEEE 14-bus system has been used as a test case to demonstrate the effectiveness of the proposed algorithm. Results and conclusions are duly drawn.
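The penalty-factor treatment of constraints can be sketched as follows: violations of the power balance and generation limits are added to the cost with large weights so the GA can rank infeasible schedules. The quadratic fuel cost, weights, and limits are hypothetical, not the paper's exact objective.

```python
# Penalty-weighted objective sketch for a GA-based commitment/dispatch.
def penalized_fitness(p_gen, demand, p_min, p_max,
                      w_balance=1e3, w_limit=1e3):
    cost = sum(0.1 * p * p + 5.0 * p for p in p_gen)   # quadratic fuel cost
    balance = abs(sum(p_gen) - demand)                 # power-balance gap
    limits = sum(max(p_min[i] - p, 0) + max(p - p_max[i], 0)
                 for i, p in enumerate(p_gen))         # limit violations
    return cost + w_balance * balance + w_limit * limits

# A feasible schedule incurs only fuel cost; an infeasible one is penalized.
ok = penalized_fitness([50, 50], 100, [10, 10], [80, 80])
bad = penalized_fitness([90, 30], 100, [10, 10], [80, 80])
```

The penalty weights must dominate the fuel-cost scale so that no infeasible schedule can outrank a feasible one, which mirrors the abstract's statement that all constraints in the objective are weighted with penalty factors.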
Abstract:
Electricity market players operating in a liberalized environment require access to adequate decision support tools that allow them to consider all business opportunities and take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players, so decision support tools must include ancillary market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is included in this paper.
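The paper's dispatch methods are LP and GA based; as a simpler stand-in for intuition, clearing a single service (say, Spinning Reserve) by merit order might look like this. The bids and the requirement are hypothetical.

```python
# Merit-order clearing sketch: cheapest bids accepted first until the
# service requirement is met.
def merit_order_dispatch(bids, requirement):
    """bids: list of (agent, price, capacity_MW); returns accepted amounts."""
    accepted, remaining = [], requirement
    for agent, price, cap in sorted(bids, key=lambda b: b[1]):
        if remaining <= 0:
            break
        take = min(cap, remaining)
        accepted.append((agent, price, take))
        remaining -= take
    return accepted, requirement - remaining

bids = [("A", 12.0, 30), ("B", 9.5, 20), ("C", 11.0, 25)]
accepted, cleared = merit_order_dispatch(bids, 50)
```

An LP formulation generalizes this by handling several services and coupling constraints simultaneously, which is where the paper's two approaches come in.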
Abstract:
From a security standpoint, the link between an RFID tag and a terminal is inherently weak, since RFID tags themselves carry no security mechanisms. Much recent work has tried to protect this link, but the physical constraints of RFID, namely low power consumption and high operating speed, make conventional protection techniques infeasible. Current approaches to RFID security rely on a security server, a security policy, and a security module; the best-known of these, the security module, provides authentication and encryption capabilities. In this paper, we design and implement a modified version of the original SEED cipher, reduced to 8 rounds and a 64-bit block, suitable for tags.
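The round-reduction idea can be illustrated with a generic 8-round Feistel network on a 64-bit block. The round function below is a toy placeholder, not SEED's actual G function or key schedule; it only shows the structure of an 8-round, 64-bit design.

```python
# Generic 8-round Feistel sketch on a 64-bit block (32-bit halves).
MASK32 = 0xFFFFFFFF

def round_f(half, subkey):
    """Toy round function (NOT SEED's G function): add, rotate, xor."""
    x = (half + subkey) & MASK32
    return (((x << 7) | (x >> 25)) & MASK32) ^ x

def encrypt(block64, subkeys):           # 8 subkeys -> 8 rounds
    left, right = block64 >> 32, block64 & MASK32
    for k in subkeys:
        left, right = right, left ^ round_f(right, k)
    return (left << 32) | right

def decrypt(block64, subkeys):
    left, right = block64 >> 32, block64 & MASK32
    for k in reversed(subkeys):          # Feistel rounds invert for free
        left, right = right ^ round_f(left, k), left
    return (left << 32) | right

keys = [0x9E3779B9 ^ i for i in range(8)]   # hypothetical subkeys
ct = encrypt(0x0123456789ABCDEF, keys)
assert decrypt(ct, keys) == 0x0123456789ABCDEF
```

The Feistel structure is what makes round reduction cheap on constrained hardware: decryption reuses the same round function with the subkey order reversed, so no inverse circuit is needed.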
Abstract:
Master's degree in Radiotherapy.
Abstract:
Master's degree in Radiotherapy.
The use of non-standard CT conversion ramps for Monte Carlo verification of 6 MV prostate IMRT plans
Abstract:
Monte Carlo (MC) dose calculation algorithms have been widely used to verify the accuracy of intensity-modulated radiotherapy (IMRT) dose distributions computed by conventional algorithms due to the ability to precisely account for the effects of tissue inhomogeneities and multileaf collimator characteristics. The two approaches differ, however, in how dose is calculated and reported: whereas dose from conventional methods is traditionally computed and reported as the water-equivalent dose (Dw), MC dose algorithms calculate and report dose to medium (Dm). In order to compare both methods consistently, the conversion of MC Dm into Dw is therefore necessary. This study aims to assess the effect of applying the conversion of MC-based Dm distributions to Dw for prostate IMRT plans generated for 6 MV photon beams. MC phantoms were created from the patient CT images using three different ramps to convert CT numbers into material and mass density: a conventional four material ramp (CTCREATE) and two simplified CT conversion ramps: (1) air and water with variable densities and (2) air and water with unit density. MC simulations were performed using the BEAMnrc code for the treatment head simulation and the DOSXYZnrc code for the patient dose calculation. The conversion of Dm to Dw by scaling with the stopping power ratios of water to medium was also performed in a post-MC calculation process. The comparison of MC dose distributions calculated in conventional and simplified (water with variable densities) phantoms showed that the effect of material composition on dose-volume histograms (DVH) was less than 1% for soft tissue and about 2.5% near and inside bone structures. The effect of material density on DVH was less than 1% for all tissues through the comparison of MC distributions performed in the two simplified phantoms considering water.
Additionally, MC dose distributions were compared with the predictions from an Eclipse treatment planning system (TPS), which employed a pencil beam convolution (PBC) algorithm with Modified Batho Power Law heterogeneity correction. Eclipse PBC and MC calculations (conventional and simplified phantoms) agreed well (<1%) for soft tissues. For femoral heads, differences up to 3% were observed between the DVH for Eclipse PBC and MC calculated in conventional phantoms. The use of the CT conversion ramp of water with variable densities for MC simulations showed no dose discrepancies (0.5%) with the PBC algorithm. Moreover, converting Dm to Dw using mass stopping power ratios resulted in a significant shift (up to 6%) in the DVH for the femoral heads compared to the Eclipse PBC one. Our results show that, for prostate IMRT plans delivered with 6 MV photon beams, no conversion of MC dose from medium to water using stopping power ratio is needed. In contrast, MC dose calculations using water with variable density may be a simple way to solve the problem found using the dose conversion method based on the stopping power ratio.
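The Dm-to-Dw post-processing step discussed above amounts to a per-voxel scaling by the water-to-medium mass stopping-power ratio of each voxel's assigned material. A minimal sketch follows; the ratio values are illustrative placeholders, not tabulated stopping-power data.

```python
# Per-voxel Dm -> Dw conversion sketch. The stopping-power ratios below
# are placeholder values for illustration only.
SPR_WATER_TO_MEDIUM = {"soft_tissue": 1.00, "bone": 1.11, "air": 1.07}

def dose_to_water(dose_medium, materials, spr=SPR_WATER_TO_MEDIUM):
    """Scale each voxel's dose to medium by the water/medium ratio."""
    return [d * spr[m] for d, m in zip(dose_medium, materials)]

dw = dose_to_water([2.0, 2.0], ["soft_tissue", "bone"])
```

Because the ratio departs from unity mainly in bone, this scaling shifts the femoral-head DVH while leaving soft tissue nearly unchanged, which is the pattern the study reports.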
Abstract:
The fatty acid profile of erythrocyte membranes has been considered a good biomarker for several pathologic situations. Dietary intake, digestion, absorption, metabolism, storage and exchange amongst compartments greatly influence the fatty acid composition of different cells and tissues. Lipoprotein and hepatic lipases are also involved in fatty acid availability. In the present work we examined the correlations between fatty acids in Red Blood Cell (RBC) membranes, the fatty acid desaturase and elongase activities, glycaemia, blood lipids, lipoproteins and apoproteins, and the endothelial lipase (EL) mass in plasma. Twenty-one individuals older than 18 years were considered in the present study. RBC membranes were obtained and analysed for fatty acid composition by gas chromatography. The amounts of fatty acids (as percentages) were analysed, and the ratios between fatty acids 16:1/16:0, 18:1/18:0, 18:0/16:0, 22:6 n-3/20:5 n-3 and 20:4 n-6/18:2 n-6 were calculated. Bivariate analysis (rs) and partial correlations were determined. SCD16 estimated activity correlated positively with BMI (rs=0.466, p=0.043) and triacylglycerols (TAG) (rs=0.483, p=0.026), and negatively with the ratio ApoA1/ApoB (rs=-0.566, p=0.007). Endothelial lipase (EL) correlated positively with the EPA/AA ratio in RBC membranes (rs=0.524, p=0.045). After multi-adjustment for BMI, age, hs-CRP and dietary n3/n6 ratio, the correlation between EL and the EPA/AA ratio remained significant. To the best of our knowledge, this is the first report correlating EL with the fatty acid profile of RBC plasma membranes. The association found here suggests that the enzyme may be involved in the bioavailability and distribution of n-3/n-6 fatty acids, pointing to a major role for EL in pathophysiological mechanisms involving biomembrane fatty acids, such as the inflammatory response and eicosanoid metabolite pathways.
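The desaturase and elongase activity estimates mentioned above are product-to-precursor ratios of membrane fatty acid percentages. A minimal sketch with illustrative values (not the study's measurements; the index names are the common shorthand, used here as assumptions):

```python
# Product/precursor activity indices from fatty acid percentages.
def indices(fa):
    """fa: dict of fatty acid percentages keyed by chain notation."""
    return {
        "SCD16": fa["16:1"] / fa["16:0"],     # stearoyl-CoA desaturase-16
        "SCD18": fa["18:1"] / fa["18:0"],     # stearoyl-CoA desaturase-18
        "ELOVL": fa["18:0"] / fa["16:0"],     # elongase index
        "D5D": fa["20:4n6"] / fa["18:2n6"],   # delta-5 desaturase proxy
    }

# Hypothetical membrane composition (percent of total fatty acids).
sample = {"16:0": 22.0, "16:1": 0.5, "18:0": 16.0, "18:1": 14.0,
          "20:4n6": 15.0, "18:2n6": 10.0}
idx = indices(sample)
```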
Abstract:
In the hustle and bustle of daily life, how often do we stop to pay attention to the tiny details around us, some of them right beneath our feet? Such is the case of interesting decorative patterns that can be found in squares and sidewalks beautified by the traditional Portuguese pavement. Its most common colors are the black and the white of the basalt and the limestone used; the result is a large variety and richness in patterns. No doubt, it is worth devoting some of our time enjoying the lovely Portuguese pavement, a true worldwide attraction. The interesting patterns found on the Azorean handicrafts are as fascinating and substantial from the cultural point of view. Patterns existing in the sidewalks and crafts can be studied from the mathematical point of view, thus allowing a thorough and rigorous cataloguing of such heritage. The mathematical classification is based on the concept of symmetry, a unifying principle of geometry. Symmetry is a unique tool for helping us relate things that at first glance may appear to have no common ground at all. By interlacing different fields of endeavor, the mathematical approach to sidewalks and crafts is particularly interesting, and an excellent source of inspiration for the development of highly motivated recreational activities. This text is an invitation to visit the nine islands of the Azores and to identify a wide range of patterns, namely rosettes and friezes, by getting to know different arts and crafts and sidewalks.
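The symmetry-based cataloguing mentioned above can be made concrete for friezes: the seven frieze groups are distinguished by which symmetries a strip pattern admits besides translation. A small decision procedure follows (standard flowchart logic; the Boolean flags for a given pavement pattern would have to be determined by inspection):

```python
# Classify a frieze pattern by its symmetries (IUC notation).
def frieze_group(rot180, v_mirror, h_mirror, glide):
    """Flags: half-turn, vertical mirror, horizontal mirror, glide."""
    if rot180:
        if v_mirror and h_mirror:
            return "p2mm"   # all symmetries present
        if v_mirror:
            return "p2mg"   # vertical mirrors + glide + half-turns
        return "p2"         # half-turns only
    if h_mirror:
        return "p11m"       # horizontal mirror
    if v_mirror:
        return "p1m1"       # vertical mirrors
    if glide:
        return "p11g"       # glide reflection only
    return "p1"             # translations only
```

Running a pavement border through such a checklist is essentially what the rigorous cataloguing described in the text amounts to.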
Abstract:
Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
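The Dirichlet modelling above enforces non-negativity and sum-to-one on abundance fractions by construction: a Dirichlet draw can be obtained from independent Gamma variates and is automatically a valid abundance vector. A sketch with hypothetical concentration parameters:

```python
import random

def dirichlet(alphas, rng):
    """Sample from Dirichlet(alphas) via normalized Gamma variates."""
    g = [rng.gammavariate(a, 1.0) for a in alphas]
    s = sum(g)
    return [x / s for x in g]

rng = random.Random(0)
a = dirichlet([2.0, 5.0, 3.0], rng)   # abundances of three endmembers
```

This is why a Dirichlet-mixture prior suits unmixing better than ICA's independence assumption: the constant-sum constraint makes abundance fractions statistically dependent, which the Dirichlet captures and ICA violates.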
Abstract:
Chapter in book proceedings with peer review: First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
Given a set of mixed spectral (multispectral or hyperspectral) vectors, linear spectral mixture analysis, or linear unmixing, aims at estimating the number of reference substances, also called endmembers, their spectral signatures, and their abundance fractions. This paper presents a new method for unsupervised endmember extraction from hyperspectral data, termed vertex component analysis (VCA). The algorithm exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. In a series of experiments using simulated and real data, the VCA algorithm competes with state-of-the-art methods, with a computational complexity between one and two orders of magnitude lower than the best available method.
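The vertex-seeking iteration at the heart of VCA can be sketched in a few lines: project the data onto a direction orthogonal to the subspace of the endmembers found so far and keep the extreme pixel as the next vertex. This simplified pure-Python version (hypothetical three-band pixels) omits VCA's dimensionality reduction and SNR-dependent projections.

```python
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def perp(w, ortho):
    """Component of w orthogonal to an orthonormal basis."""
    for b in ortho:
        c = dot(w, b)
        w = [wi - c * bi for wi, bi in zip(w, b)]
    return w

def vca(pixels, p, seed=0):
    rng = random.Random(seed)
    endmembers, ortho = [], []
    dim = len(pixels[0])
    for _ in range(p):
        # Random direction orthogonal to the endmembers found so far.
        w = perp([rng.gauss(0, 1) for _ in range(dim)], ortho)
        extreme = max(pixels, key=lambda x: abs(dot(x, w)))
        endmembers.append(extreme)
        v = perp(list(extreme), ortho)       # Gram-Schmidt update
        n = dot(v, v) ** 0.5
        if n > 1e-12:
            ortho.append([vi / n for vi in v])
    return endmembers

# Hypothetical data: three pure spectra plus convex mixtures of them.
E = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
pixels = E + [[0.5, 0.3, 0.2], [0.2, 0.5, 0.3], [0.1, 0.2, 0.7]]
found = vca(pixels, 3)
```

Because a convex mixture can never project further than the purest pixel along any direction, each iteration recovers a simplex vertex, which is the geometric fact (1) and (2) in the abstract exploit.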