764 results for Consensus algorithm


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a single-phase Series Active Power Filter (Series APF) for mitigating the harmonic content of the load voltage while keeping the DC-side voltage regulated without the support of an external voltage source. The proposed control algorithm eliminates the additional voltage source otherwise needed to regulate the DC voltage, and the adopted topology does not require a coupling transformer to interface the series active power filter with the electrical power grid. The paper describes the control strategy, which encompasses the grid synchronization scheme, the compensation voltage calculation, the damping algorithm, and the dead-time compensation. The topology and control strategy of the series active power filter were evaluated in simulation software, and simulation results are presented. Experimental results, obtained with a laboratory prototype, validate the theoretical assumptions and fall within the harmonic spectrum limits imposed by the international recommendations of the IEEE-519 Standard.
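As an illustration of the compensation-voltage step described above, here is a minimal sketch assuming the fundamental is extracted with an FFT rather than the paper's grid-synchronization scheme; the sampling rate, 50 Hz grid, and function names are placeholders, not the paper's implementation:

```python
import numpy as np

def compensation_voltage(v_load, fs, f_grid=50.0):
    """Illustrative series-APF compensation: the filter injects the
    difference between the fundamental of the measured voltage and
    the measured voltage itself, so the load sees a clean sinusoid."""
    n = len(v_load)
    spectrum = np.fft.rfft(v_load)
    # Keep only the bin closest to the grid fundamental.
    k = int(round(f_grid * n / fs))
    fundamental = np.zeros_like(spectrum)
    fundamental[k] = spectrum[k]
    v_fund = np.fft.irfft(fundamental, n)
    return v_fund - v_load  # voltage the series APF must inject

# Example: distorted grid voltage with a 5th harmonic.
fs = 10_000.0
t = np.arange(0, 0.1, 1 / fs)
v = 325 * np.sin(2 * np.pi * 50 * t) + 40 * np.sin(2 * np.pi * 250 * t)
v_comp = compensation_voltage(v, fs)
print(f"residual RMS: {np.std(v + v_comp - 325 * np.sin(2 * np.pi * 50 * t)):.2e} V")
```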

Relevance:

20.00%

Publisher:

Abstract:

Natural selection favors the survival and reproduction of organisms that are best adapted to their environment. The selection mechanism in evolutionary algorithms mimics this process, aiming to create environmental conditions in which artificial organisms can evolve to solve the problem at hand. This paper proposes a new selection scheme for evolutionary multiobjective optimization. A key feature of the proposed selection is the similarity measure that defines the concept of neighborhood. Contrary to commonly used approaches, which are usually defined on the basis of distances between either individuals or weight vectors, similarity and neighborhood are here based on the angle between individuals in the objective space: the smaller the angle, the more similar the individuals. This notion is exploited during the mating and environmental selections. Convergence is ensured by minimizing the distances from individuals to a reference point, whereas diversity is preserved by maximizing the angles between neighboring individuals. Experimental results reveal a highly competitive performance and useful characteristics of the proposed selection. Its strong diversity-preserving ability produces significantly better performance on some problems when compared with state-of-the-art algorithms.
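A minimal sketch of the angle-based neighborhood idea, under the assumption that objective vectors are translated by the reference point before angles are measured; the greedy survivor loop below is an illustrative simplification, not the paper's actual selection procedure:

```python
import numpy as np

def pairwise_angles(F, ref):
    """Angles (radians) between individuals' objective vectors,
    measured from a reference point: the smaller the angle, the
    more similar two individuals are considered."""
    V = F - ref                       # translate so ref is the origin
    V = V / np.linalg.norm(V, axis=1, keepdims=True)
    cos = np.clip(V @ V.T, -1.0, 1.0)
    return np.arccos(cos)

def environmental_selection(F, ref, k):
    """Greedy sketch: keep k individuals, repeatedly dropping one of
    the two with the smallest mutual angle (diversity), preferring to
    drop the one farther from the reference point (convergence)."""
    keep = list(range(len(F)))
    while len(keep) > k:
        A = pairwise_angles(F[keep], ref)
        np.fill_diagonal(A, np.inf)
        i, j = np.unravel_index(np.argmin(A), A.shape)
        di = np.linalg.norm(F[keep[i]] - ref)
        dj = np.linalg.norm(F[keep[j]] - ref)
        keep.pop(i if di > dj else j)
    return keep

F = np.random.rand(20, 3)             # 20 individuals, 3 objectives
print(environmental_selection(F, ref=np.zeros(3), k=5))
```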

Relevance:

20.00%

Publisher:

Abstract:

The Amazon várzeas are an important component of the Amazon biome, but anthropic and climatic impacts have been leading to forest loss and to the interruption of essential ecosystem functions and services. The objectives of this study were to evaluate the capability of the Landsat-based Detection of Trends in Disturbance and Recovery (LandTrendr) algorithm to characterize changes in várzea forest cover in the Lower Amazon, and to analyze the potential of spectral and temporal attributes to classify forest loss as either natural or anthropogenic. We used a time series of 37 Landsat TM and ETM+ images acquired between 1984 and 2009. We used the LandTrendr algorithm to detect forest cover change and to extract the attributes "start year", "magnitude", and "duration" of the changes, as well as "NDVI at the end of series". Detection was restricted to areas identified as having forest cover at the start and/or end of the time series. We used the Support Vector Machine (SVM) algorithm to classify the extracted attributes, differentiating between anthropogenic and natural forest loss. Detection reliability was consistently high for change events along the Amazon River channel, but variable for changes within the floodplain. Spectral-temporal trajectories faithfully represented the nature of changes in floodplain forest cover, corroborating field observations. We estimated anthropogenic forest losses to be larger (1,071 ha) than natural losses (884 ha), with an overall classification accuracy of 94%. We conclude that the LandTrendr algorithm is a reliable tool for studies of forest dynamics throughout the floodplain.
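A minimal sketch of the attribute-classification step, assuming scikit-learn's SVM; the feature columns mirror the attributes named in the abstract, but the data below are synthetic placeholders, not the study's measurements:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Illustrative stand-ins for the LandTrendr attributes named in the
# abstract; real values would come from the change-detection output.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.integers(1984, 2010, 200),   # start year of the change event
    rng.uniform(0.1, 1.0, 200),      # magnitude of the change
    rng.integers(1, 10, 200),        # duration in years
    rng.uniform(0.0, 0.9, 200),      # NDVI at the end of the series
])
y = rng.integers(0, 2, 200)          # 0 = natural, 1 = anthropogenic loss

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```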

Relevance:

20.00%

Publisher:

Abstract:

We sought to verify the prevalence of lymphocytic thyroiditis (LT) and Hashimoto's thyroiditis (HT) in autopsy material. Cases examined between 2003 and 2007 at the Department of Pathology of the Faculty of Medicine of São Paulo University were studied. Immunohistochemical analyses were conducted in selected cases to characterize the type of infiltrating mononuclear cells; in addition, we evaluated the frequency of apoptosis by the TUNEL assay and by caspase-3 immunostaining. A significant increase in overall thyroiditis frequency was observed in the present series when compared with the previous report (2.2978% vs. 0.0392%). Thyroiditis was more prevalent among older people. Selected cases of LT and HT (5 cases each) had their infiltrating lymphocytes characterized by immunohistochemical analyses. Both LT and HT showed similar immunostaining patterns for CD4, CD8, and CD68, thus supporting a common pathophysiological mechanism and indicating that LT and HT should be considered different presentations of the same condition, namely autoimmune thyroiditis. Moreover, apoptosis markers clearly showed that apoptosis was present in all studied cases. Our results demonstrate a marked increase in the prevalence of thyroiditis in recent years, and our data support using the term autoimmune thyroiditis to designate both LT and HT. This classification would facilitate the comparison of prevalence data from different series and studies.

Relevance:

20.00%

Publisher:

Abstract:

Today's advances in high-performance computing are driven by the parallel processing capabilities of available hardware architectures. These architectures can accelerate algorithms when the algorithms are properly parallelized and exploit the specific processing power of the underlying architecture. However, converting an algorithm into its parallel form is complex, and the resulting form is specific to each type of parallel hardware. Most current general-purpose processors integrate several cores on a single chip, resulting in what is known as a Symmetric Multiprocessing (SMP) unit; nowadays even desktop computers use multicore processors, and the industry trend is to integrate ever more cores as technology matures. Graphics Processor Units (GPU), originally designed to handle only video processing, have emerged as interesting alternatives for algorithm acceleration: currently available GPUs can run on the order of 200 to 400 parallel processing threads. These processors are very fast and specific to the task they were designed for, but because video processing has much in common with scientific computing, programmable GPUs have been redirected to that use under the name General Processing Graphics Processor Units (GPGPU). Unlike the SMP processors noted above, however, GPGPUs are not general purpose: they offer little memory compared with general-purpose processors and require a particular style of parallel processing to be used productively. Finally, Field Programmable Gate Arrays (FPGA) are programmable devices capable of performing large numbers of operations in parallel, with low latency and deep pipelines, so they can be used to implement specific algorithms that must run at very high speed; their drawback is that programming and testing an algorithm instantiated on the device is harder and more time-consuming than a software approach. Given this diversity of parallel processors, our work aims at analyzing the specific characteristics of each of these architectures and their impact on algorithm structure, so that their use yields processing performance commensurate with the resources employed and so that they can be combined to complement one another. Specifically, starting from the hardware characteristics, we determine the properties a parallel algorithm must have in order to be accelerated; these properties in turn determine which of these hardware types is most suitable for its implementation. In particular, we consider the degree of data dependency, the need for synchronization during parallel processing, the size of the data to be processed, and the complexity of parallel programming on each type of hardware.
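As a minimal illustration of the data-dependency criterion mentioned above (a sketch, not code from this work): a loop whose iterations are independent parallelizes directly on an SMP multicore, whereas a chain of dependent iterations does not.

```python
from multiprocessing import Pool

def heavy(x):
    """Stand-in for an expensive, independent computation."""
    return sum(i * i for i in range(x))

if __name__ == "__main__":
    data = [200_000] * 16

    # Independent iterations: trivially parallel on an SMP multicore.
    with Pool() as pool:
        results = pool.map(heavy, data)

    # Dependent iterations: each step needs the previous result,
    # so this chain cannot be parallelized the same way.
    acc = 0
    for x in data:
        acc = (acc + heavy(x)) % 1_000_003
    print(len(results), acc)
```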

Relevance:

20.00%

Publisher:

Abstract:

Background: Vascular remodeling, the dynamic dimensional change of a vessel in the face of stress, can assume different directions as well as magnitudes in atherosclerotic disease. Classical measurements rely on reference segments at a distance, risking inappropriate comparison between dissimilar vessel portions. Objective: To explore a new method for quantifying vessel remodeling, based on the comparison between a given target segment and its inferred normal dimensions. Methods: Geometric parameters and plaque composition were determined in 67 patients using three-vessel intravascular ultrasound with virtual histology (IVUS-VH). Coronary vessel remodeling at the cross-section (n = 27,639) and lesion (n = 618) levels was assessed using classical metrics and a novel analytic algorithm based on the fractional vessel remodeling index (FVRI), which quantifies the total change in arterial wall dimensions relative to the estimated normal dimension of the vessel. A prediction model was built to estimate the normal dimension of the vessel for the calculation of FVRI. Results: According to the new algorithm, the "Ectatic" remodeling pattern was least common, "Complete compensatory" remodeling was present in approximately half of the instances, and the "Negative" and "Incomplete compensatory" remodeling types were detected in the remainder. Compared to a traditional diagnostic scheme, the FVRI-based classification seemed to better discriminate plaque composition by IVUS-VH. Conclusion: Quantitative assessment of coronary remodeling using target segment dimensions offers a promising approach to evaluating the vessel response to plaque growth or regression.
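The abstract does not give the exact formula, but the FVRI as described could be sketched as the relative deviation of the measured vessel dimension from its estimated normal value; the cut-offs mapping the index to the four patterns named above are purely hypothetical placeholders, not the paper's values:

```python
def fvri(measured_area, predicted_normal_area):
    """Hypothetical fractional vessel remodeling index: relative change
    of the vessel wall dimension versus its estimated normal value."""
    return (measured_area - predicted_normal_area) / predicted_normal_area

def remodeling_pattern(index, plaque_burden):
    """Map an FVRI value to the four patterns named in the abstract.
    The cut-offs below are illustrative assumptions only."""
    if index <= -0.05:
        return "Negative"
    if index >= plaque_burden:          # growth beyond plaque area
        return "Ectatic"
    if index >= 0.95 * plaque_burden:   # growth roughly offsets plaque
        return "Complete compensatory"
    return "Incomplete compensatory"

print(remodeling_pattern(fvri(16.2, 15.0), plaque_burden=0.10))
```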

Relevance:

20.00%

Publisher:

Abstract:

Magdeburg, Univ., Faculty of Computer Science, Dissertation, 2015

Relevance:

20.00%

Publisher:

Abstract:

The parameterized expectations algorithm (PEA) involves a long simulation and a nonlinear least squares (NLS) fit, both embedded in a loop. Both steps are natural candidates for parallelization. This note shows that parallelization can lead to substantial speedups for the PEA. I provide example code for a simple model that can serve as a template for the parallelization of more interesting models, as well as a download link for an image of a bootable CD that allows creation of a cluster and execution of the example code in minutes, with no need to install any software.
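A minimal sketch of the parallelization point being made, using Python's multiprocessing in place of the note's actual cluster setup; the model, simulate_block, and its parameters are placeholders, not the note's code:

```python
import numpy as np
from multiprocessing import Pool

def simulate_block(args):
    """Placeholder for one block of the long PEA simulation: each
    worker simulates an independent block with its own seed and
    returns the data needed for the NLS fit."""
    beta, n_periods, seed = args
    rng = np.random.default_rng(seed)
    shocks = rng.normal(size=n_periods)
    # ... model-specific recursion using the current coefficients beta ...
    return shocks  # stand-in for the block's simulated data

if __name__ == "__main__":
    beta = np.zeros(3)                # current expectations coefficients
    blocks = [(beta, 50_000, s) for s in range(8)]
    with Pool() as pool:
        sim = np.concatenate(pool.map(simulate_block, blocks))
    # The NLS step would now refit beta on `sim`, and the loop repeats.
    print(sim.shape)
```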

Relevance:

20.00%

Publisher:

Abstract:

The fracture risk assessment tool (FRAX®) was developed to identify individuals at high risk of fracture in whom treatment to prevent fractures would be appropriate. FRAX models are not yet available for all countries or ethnicities, but surrogate models can be used within regions with similar fracture risk. The International Society for Clinical Densitometry (ISCD) and the International Osteoporosis Foundation (IOF) are nonprofit multidisciplinary international professional organizations whose shared vision is to advance the awareness, education, prevention, and treatment of osteoporosis. In November 2010, the IOF/ISCD FRAX initiative was held in Bucharest, bringing together international experts to review and create evidence-based official positions guiding clinicians in the practical use of FRAX. A consensus meeting of the Asia-Pacific (AP) Panel of the ISCD recently reviewed the most current Joint Official Positions of the ISCD and IOF on FRAX in view of the different population characteristics and health standards in the AP regions. The reviewed position statements covered not only the key spectrum of positions but also concerns unique to the AP regions.

Relevance:

20.00%

Publisher:

Abstract:

The 2009 International Society of Urological Pathology consensus conference in Boston made recommendations regarding the standardization of pathology reporting of radical prostatectomy specimens. Issues relating to the substaging of pT2 prostate cancers according to the TNM 2002/2010 system, the reporting of tumor size/volume, and the zonal location of prostate cancers were coordinated by working group 2. A survey circulated before the consensus conference demonstrated that 74% of the 157 participants considered pT2 substaging of prostate cancer to be of clinical and/or academic relevance. The survey also revealed considerable variation in the frequency of reporting of pT2b substage prostate cancer, likely a consequence of the variable methodologies used to distinguish pT2a from pT2b tumors. An overview of the literature indicates that current pT2 substaging criteria lack clinical relevance, and the majority (65.5%) of conference attendees wished to discontinue pT2 substaging. Therefore, the consensus was that reporting of pT2 substages should, at present, be optional. Several studies have shown that prostate cancer volume is significantly correlated with other clinicopathological features, including Gleason score and extraprostatic extension of tumor; however, most studies fail to demonstrate prognostic significance on multivariate analysis. Consensus was reached with regard to reporting some quantitative measure of the volume of tumor in a prostatectomy specimen, without prescribing a specific methodology. Incorporation of the zonal and/or anterior location of the dominant/index tumor in the pathology report was accepted by most participants, but a formal definition of the identifying features of the dominant/index tumor remained undecided.

Relevance:

20.00%

Publisher:

Abstract:

INTRODUCTION: Many clinical practice guidelines (CPG) have been published in response to the development of the concept of "evidence-based medicine" (EBM) and as a solution to the difficulty of synthesizing and selecting relevant medical literature. Given the proliferation of new CPG, the question of choice arises: which CPG should be considered in a given clinical situation? It is of primary importance to evaluate the quality of a CPG, but until recently there was no standardized tool for evaluating or comparing CPG quality. An instrument for evaluating the quality of CPG, called AGREE (Appraisal of Guidelines for Research and Evaluation), was validated in 2002. AIM OF THE STUDY: The six principal CPG concerning the treatment of schizophrenia are compared with the help of the AGREE instrument: (1) "the Agence nationale pour le développement de l'évaluation médicale (ANDEM) recommendations"; (2) "The American Psychiatric Association (APA) practice guideline for the treatment of patients with schizophrenia"; (3) "The quick reference guide of the APA practice guideline for the treatment of patients with schizophrenia"; (4) "The schizophrenia patient outcomes research team (PORT) treatment recommendations"; (5) "The Texas medication algorithm project (T-MAP)"; and (6) "The expert consensus guideline for the treatment of schizophrenia". RESULTS: The results of our study were compared with those of a similar investigation published in 2005, based on 24 CPG addressing the treatment of schizophrenia, which also used the AGREE tool with two investigators. In general, the scores of the two studies differed little, and the two global evaluations of the CPG converged; however, each of the six CPG leaves room for improvement. DISCUSSION: The rigour with which the six CPG were developed was, in general, average. Consideration of the opinions of potential users was incomplete, and improved presentation of the recommendations would facilitate their clinical use. Moreover, the authors gave little consideration to the applicability of the recommendations. CONCLUSION: Overall, two CPG are strongly recommended: "the quick reference guide of the APA practice guideline for the treatment of patients with schizophrenia" and "the T-MAP".

Relevance:

20.00%

Publisher:

Abstract:

The 2009 International Society of Urological Pathology Consensus Conference in Boston made recommendations regarding the standardization of pathology reporting of radical prostatectomy specimens. Issues relating to the handling and processing of radical prostatectomy specimens were coordinated by working group 1. Most uropathologists followed similar procedures for the fixation of radical prostatectomy specimens, with 51% of respondents transporting tissue in formalin. There was consensus that the prostate weight without the seminal vesicles should be recorded, and that the surface of the prostate should be painted. It was agreed that both the prostate apex and base should be examined by the cone method with sagittal sectioning of the tissue sample. There was consensus that the gland should be fully fixed before sectioning. Both partial and complete embedding of prostates were considered acceptable, as long as the method of partial embedding is stated. No consensus was reached regarding the necessity of weighing and measuring the length of the seminal vesicles, the preparation of whole mounts rather than standardized blocks, or the methodology for sampling fresh tissue for research purposes; it was agreed that these should be left to the discretion of the working pathologist.

Relevance:

20.00%

Publisher:

Abstract:

The implicit projection algorithm of isotropic plasticity is extended to an objective anisotropic elastic, perfectly plastic model. The recursion formula developed to project the trial stress onto the yield surface is applicable to any nonlinear elastic law and any plastic yield function. A curvilinear transverse isotropic model based on a quadratic elastic potential and on Hill's quadratic yield criterion is then developed and implemented in a computer program for bone mechanics applications. The paper concludes with a numerical study of a schematic bone-prosthesis system to illustrate the potential of the model.
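As a concrete instance of the implicit projection idea in its isotropic base case: for von Mises perfect plasticity the projection of the trial stress onto the yield surface reduces to a closed-form radial return (a standard textbook sketch, not the paper's anisotropic extension, which would require Newton iterations on Hill's criterion):

```python
import numpy as np

def radial_return(sigma_trial, sigma_y, mu):
    """Implicit projection for von Mises perfect plasticity: if the
    trial stress lies outside the yield surface, scale its deviatoric
    part back onto the surface. The plastic multiplier has a closed
    form here; anisotropic yield functions generally do not."""
    p = np.trace(sigma_trial) / 3.0
    s = sigma_trial - p * np.eye(3)               # deviatoric part
    s_norm = np.linalg.norm(s)
    f_trial = s_norm - np.sqrt(2.0 / 3.0) * sigma_y
    if f_trial <= 0.0:
        return sigma_trial                        # elastic step
    dgamma = f_trial / (2.0 * mu)                 # plastic multiplier
    return p * np.eye(3) + (1.0 - 2.0 * mu * dgamma / s_norm) * s

sigma_trial = np.diag([300.0, -100.0, 0.0])       # MPa
print(radial_return(sigma_trial, sigma_y=250.0, mu=80_000.0))
```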