971 results for Logic outer-approximation algorithm


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a single-phase Series Active Power Filter (Series APF) for mitigating the harmonic content of the load voltage while keeping the DC-side voltage regulated without the support of a voltage source. The proposed series active power filter control algorithm eliminates the additional voltage source used to regulate the DC voltage, and with the adopted topology no coupling transformer is needed to interface the series active power filter with the electrical power grid. The paper describes the control strategy, which encapsulates the grid synchronization scheme, the compensation voltage calculation, the damping algorithm and the dead-time compensation. The topology and control strategy of the series active power filter have been evaluated in simulation software, and simulation results are presented. Experimental results, obtained with a laboratory prototype, validate the theoretical assumptions and are within the harmonic spectrum limits imposed by the international recommendations of the IEEE-519 standard.

Relevance:

20.00%

Publisher:

Abstract:

Natural selection favors the survival and reproduction of organisms that are best adapted to their environment. The selection mechanism in evolutionary algorithms mimics this process, aiming to create environmental conditions in which artificial organisms can evolve to solve the problem at hand. This paper proposes a new selection scheme for evolutionary multiobjective optimization. A key feature of the proposed selection is the similarity measure that defines the concept of neighborhood. Contrary to commonly used approaches, usually defined on the basis of distances between either individuals or weight vectors, it is suggested to base similarity and neighborhood on the angle between individuals in the objective space: the smaller the angle, the more similar the individuals. This notion is exploited during both mating and environmental selection. Convergence is ensured by minimizing the distances from individuals to a reference point, whereas diversity is preserved by maximizing the angles between neighboring individuals. Experimental results reveal highly competitive performance and useful characteristics of the proposed selection. Its strong diversity-preserving ability produces significantly better performance on some problems when compared with state-of-the-art algorithms.
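The angle-based neighborhood described above can be sketched as follows; this is an illustrative reading of the selection scheme, not the authors' implementation, and the function names and two-objective example are hypothetical:

```python
import math

def angle(u, v):
    """Angle in radians between two objective vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def nearest_neighbor_by_angle(population, i):
    """Most similar individual to population[i]: the one at the smallest angle."""
    return min((j for j in range(len(population)) if j != i),
               key=lambda j: angle(population[i], population[j]))
```

Convergence would then be driven by each individual's distance to a reference point, while diversity is kept by penalizing pairs of individuals whose mutual angle is small.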

Relevance:

20.00%

Publisher:

Abstract:

Master's dissertation in Biophysics and Bionanosystems

Relevance:

20.00%

Publisher:

Abstract:

The Amazon várzeas are an important component of the Amazon biome, but anthropic and climatic impacts have been leading to forest loss and interruption of essential ecosystem functions and services. The objectives of this study were to evaluate the capability of the Landsat-based Detection of Trends in Disturbance and Recovery (LandTrendr) algorithm to characterize changes in várzea forest cover in the Lower Amazon, and to analyze the potential of spectral and temporal attributes to classify forest loss as either natural or anthropogenic. We used a time series of 37 Landsat TM and ETM+ images acquired between 1984 and 2009. We used the LandTrendr algorithm to detect forest cover change and the attributes of "start year", "magnitude", and "duration" of the changes, as well as "NDVI at the end of series". Detection was restricted to areas identified as having forest cover at the start and/or end of the time series. We used the Support Vector Machine (SVM) algorithm to classify the extracted attributes, differentiating between anthropogenic and natural forest loss. Detection reliability was consistently high for change events along the Amazon River channel, but variable for changes within the floodplain. Spectral-temporal trajectories faithfully represented the nature of changes in floodplain forest cover, corroborating field observations. We estimated anthropogenic forest losses to be larger (1,071 ha) than natural losses (884 ha), with an overall classification accuracy of 94%. We conclude that the LandTrendr algorithm is a reliable tool for studies of forest dynamics throughout the floodplain.

Relevance:

20.00%

Publisher:

Abstract:

Residual lignocellulosic materials from agroindustrial activities can be exploited as a source of lignin, hemicellulose and cellulose. Chemical treatment of lignocellulosic material must contend with the fact that the material is quite recalcitrant to such attack, mainly because of the lignin polymer. Delignification can also be achieved using white-rot fungi, which produce extracellular ligninolytic enzymes, chiefly laccase, an enzyme that oxidizes lignin to CO2. Laccase also oxidizes a wide range of substrates (phenols, polyphenols, anilines, aryl-diamines, methoxy-substituted phenols, and others), which makes it attractive for biotechnological applications. The enzyme has potential application in processes such as the delignification of lignocellulosic materials and the biobleaching of paper pulp, the treatment of industrial wastewater, fiber modification and dye decolorization in the textile and dye industries, the improvement of animal feed, the detoxification of pollutants, and the bioremediation of contaminated soils. It has also been used in organic chemistry for the oxidation of functional groups, the formation of carbon-nitrogen bonds, and the synthesis of complex natural products. HYPOTHESIS: White-rot fungi grown under optimal culture conditions produce different types of oxidase enzymes, of which laccases are the most suitable to explore as catalysts in the following processes: delignification of forest-industry residues so that such waste can be used in animal feed, and decontamination/remediation of soils and/or industrial effluents. Studies will be carried out on the design of bioreactors that address the two questions raised in the hypothesis.
For the delignification of lignocellulosic material, two strategies are proposed: (1) treat the material with the fungal mycelium, adjusting the supply of nutrients for sustained growth and to favor release of the enzyme; (2) use partially purified laccase coupled to a mediator system to oxidize the polyphenolic compounds. For the decontamination/remediation of soils and/or industrial effluents, work will also proceed on two fronts: (3) a positive correlation has been described between the activity of certain soil enzymes and soil fertility; an enzymatic system tentatively identified as a laccase of microbial origin is known to be responsible for the transformation of organic compounds in soil, protecting the soil from the accumulation of hazardous organic compounds by catalyzing reactions involving degradation, polymerization and incorporation into humic acid complexes. Soils spiked with different pollutants (e.g., polychlorophenols or chloroanilines) will be used. (4) Work will be carried out with polluting industrial effluents (olive-mill wastewater and/or the liquid effluent from the olive debittering process).

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study was to evaluate the determinism of the AS-Interface (AS-i) network and of the three main families of control systems that may use it, namely PLC, PC and RTOS. During the course of this study the PROFIBUS and Ethernet field-level networks were also considered, in order to ensure that they would not introduce unacceptable latencies into the overall control system. This research demonstrated that an incorrectly configured Ethernet network introduces unacceptable latencies of variable duration into the control system; care must therefore be exercised if the determinism of a control system is not to be compromised. The study introduces a new concept: using statistics and process capability metrics, in the form of Cpk values, to specify how suitable a control system is for a given control task. The PLC systems that were tested demonstrated extremely deterministic responses, but when a large number of iterations was introduced in the user program, the mean control system latency was much too great for an AS-i network; thus the PLC was found to be unsuitable for an AS-i network if a large, complex user program is required. The PC systems that were tested were non-deterministic and had latencies of variable duration, which became extremely exaggerated when a graphing ActiveX control was included in the control application. These PC systems also exhibited a non-normal frequency distribution of control system latencies, and as such are unsuitable for implementation with an AS-i network. The RTOS system that was tested overcame the problems identified with the PLC systems and produced an extremely deterministic response, even when a large number of iterations was introduced in the user program. It is capable of providing a suitably deterministic control system response even when an extremely large, complex user program is required.
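The process-capability idea can be illustrated with the textbook Cpk formula applied to latency samples; this is a generic sketch, and the specification limits used below are hypothetical, not taken from the study:

```python
import statistics

def cpk(latencies, usl, lsl=0.0):
    """Process capability index Cpk of latency samples against spec limits
    (all in the same time unit). Higher is better; a common acceptance
    threshold in quality engineering is Cpk >= 1.33."""
    mu = statistics.mean(latencies)
    sigma = statistics.stdev(latencies)  # sample standard deviation
    return min(usl - mu, mu - lsl) / (3.0 * sigma)
```

A deterministic response shows up as a small sigma, and hence a high Cpk for the same specification window, which is how a metric like this can rank PLC, PC and RTOS platforms for a given control task.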

Relevance:

20.00%

Publisher:

Abstract:

Navier-Stokes equations, slip boundary condition, convection-diffusion equation, finite element method, multigrid method, error estimation, iterative decoupling

Relevance:

20.00%

Publisher:

Abstract:

Background: Vascular remodeling, the dynamic dimensional change in the face of stress, can assume different directions as well as magnitudes in atherosclerotic disease. Classical measurements rely on reference segments at a distance, risking inappropriate comparison between dissimilar vessel portions. Objective: To explore a new method for quantifying vessel remodeling, based on the comparison between a given target segment and its inferred normal dimensions. Methods: Geometric parameters and plaque composition were determined in 67 patients using three-vessel intravascular ultrasound with virtual histology (IVUS-VH). Coronary vessel remodeling at the cross-section (n = 27,639) and lesion (n = 618) levels was assessed using classical metrics and a novel analytic algorithm based on the fractional vessel remodeling index (FVRI), which quantifies the total change in arterial wall dimensions relative to the estimated normal dimension of the vessel. A prediction model was built to estimate the normal dimension of the vessel for calculation of the FVRI. Results: According to the new algorithm, the "ectatic" remodeling pattern was least common, "complete compensatory" remodeling was present in approximately half of the instances, and "negative" and "incomplete compensatory" remodeling types were detected in the remainder. Compared to a traditional diagnostic scheme, the FVRI-based classification seemed to better discriminate plaque composition by IVUS-VH. Conclusion: Quantitative assessment of coronary remodeling using target segment dimensions offers a promising approach to evaluate the vessel response to plaque growth or regression.

Relevance:

20.00%

Publisher:

Abstract:

A comparative analysis of the restoration of continuous signals by different kinds of approximation is performed. A software product is proposed that determines the optimal restoration method for different original signals (Lagrange polynomial, Kotelnikov interpolation series, linear and cubic splines, Haar wavelet, or Kotelnikov-Shannon wavelet) based on the criterion of minimum mean-square deviation. Practical recommendations on the selection of the approximation function for different classes of signals are obtained.
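The selection criterion can be sketched as follows: restore the signal with each candidate method and keep the one with the smallest mean-square deviation from the original. The two methods below (piecewise-linear and zero-order hold) are simple stand-ins for the paper's richer set of approximations:

```python
import math

def rmse(a, b):
    """Root-mean-square deviation between two equally long sequences."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def linear(xs, ys, x):
    """Piecewise-linear restoration from samples (xs, ys), xs sorted."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (x - x0) / (x1 - x0) * (y1 - y0)
    return ys[-1]

def zero_order_hold(xs, ys, x):
    """Restoration by the most recent sample value."""
    return ys[max(i for i, xv in enumerate(xs) if xv <= x)]

def best_method(methods, xs, ys, grid, truth):
    """Name of the method minimizing the mean-square deviation on the grid."""
    return min(methods,
               key=lambda m: rmse([methods[m](xs, ys, x) for x in grid], truth))
```

For a smooth signal sampled sparsely, the linear restoration wins under this criterion; the same harness extends to splines or wavelet reconstructions.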

Relevance:

20.00%

Publisher:

Abstract:

Magdeburg, University, Faculty of Computer Science, Dissertation, 2015

Relevance:

20.00%

Publisher:

Abstract:

We quantify the long-time behavior of a system of (partially) inelastic particles in a stochastic thermostat by means of the contractivity of a suitable metric on the set of probability measures. Existence, uniqueness, boundedness of moments and regularity of a steady state are derived from this basic property. The solutions of the kinetic model are proved to converge exponentially as t → ∞ to this diffusive equilibrium in this distance, which metrizes the weak convergence of measures. We then prove a uniform-in-time bound on Sobolev norms of the solution, provided the initial data has a finite norm in the corresponding Sobolev space. These results are combined, using interpolation inequalities, to obtain exponential convergence to the diffusive equilibrium in the strong L¹ norm, as well as in various Sobolev norms.
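Schematically, the last step rests on an interpolation inequality of the following form; this is a standard type of estimate written here for illustration, with the exponent θ and the constant C not taken from the paper:

```latex
\|f\|_{H^{s'}} \;\le\; C\,\|f\|_{L^{1}}^{\theta}\,\|f\|_{H^{s}}^{1-\theta},
\qquad 0 < s' < s, \quad \theta = \theta(s', s) \in (0, 1),
```

so that exponential decay of the difference from equilibrium in the weak/L¹ sense, together with a uniform-in-time bound in H^s, yields exponential decay (at a reduced rate) in the intermediate norms H^{s'}.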

Relevance:

20.00%

Publisher:

Abstract:

"See the abstract at the beginning of the document in the attached file."

Relevance:

20.00%

Publisher:

Abstract:

The parameterized expectations algorithm (PEA) involves a long simulation and a nonlinear least squares (NLS) fit, both embedded in a loop. Both steps are natural candidates for parallelization. This note shows that parallelization can lead to important speedups for the PEA. I provide example code for a simple model that can serve as a template for parallelization of more interesting models, as well as a download link for an image of a bootable CD that allows creation of a cluster and execution of the example code in minutes, with no need to install any software.
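One way the simulation step can be parallelized is to replace the single long path with several shorter independent paths whose statistics are pooled. The sketch below is not the note's code: the AR(1) step is a toy stand-in for the model's law of motion, and a thread pool is used for brevity (a process pool would be used in practice for CPU-bound simulation):

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(args):
    """One independent simulated path; the AR(1) step is a hypothetical
    stand-in for the model's law of motion."""
    seed, n_periods = args
    x, total = float(seed), 0.0
    for _ in range(n_periods):
        x = 0.9 * x + 1.0   # toy dynamics
        total += x
    return total

def parallel_simulation(n_periods, n_workers=2):
    """Split the long simulation across workers and pool the results.
    For real speedups, swap ThreadPoolExecutor for ProcessPoolExecutor."""
    chunk = n_periods // n_workers
    jobs = [(seed, chunk) for seed in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(simulate_chunk, jobs))
```

The NLS fit is the other natural candidate: residuals for different observations can be evaluated concurrently with the same map-over-chunks pattern.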

Relevance:

20.00%

Publisher:

Abstract:

"See the abstract at the beginning of the document in the attached file."