963 results for Link variables method


Relevance:

20.00%

Publisher:

Abstract:

One of the most well-known bio-inspired algorithms used in optimization problems is particle swarm optimization (PSO), which basically consists of a machine-learning technique loosely inspired by birds flocking in search of food. More specifically, it consists of a number of particles that collectively move through the search space in search of the global optimum. The Darwinian particle swarm optimization (DPSO) is an evolutionary algorithm that extends the PSO using natural selection, or survival of the fittest, to enhance the ability to escape from local optima. This paper first presents a survey on PSO algorithms, focusing mainly on the DPSO. Afterwards, a method for controlling the convergence rate of the DPSO using fractional calculus (FC) concepts is proposed. The fractional-order optimization algorithm, denoted FO-DPSO, is tested using several well-known functions, and the relationship between the fractional-order velocity and the convergence of the algorithm is observed. Moreover, experimental results show that the FO-DPSO significantly outperforms the previously presented FO-PSO.
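
As a rough illustration of the fractional-order idea, the sketch below replaces the usual inertia term of a PSO velocity update with a truncated Grünwald-Letnikov-style memory over the last four velocities. The derivative order alpha, the four-term truncation, the swarm parameters and the sphere test function are all illustrative assumptions rather than the paper's exact FO-DPSO settings (in particular, no Darwinian selection of swarms is included here).

```python
# Minimal sketch of a fractional-order PSO velocity update (illustrative only).
# The fractional "inertia" replaces w*v[t] with a Grunwald-Letnikov-style
# truncation over the last few velocities; alpha, the truncation depth and all
# other parameters below are assumptions for demonstration, not the paper's values.
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def fo_pso(f, dim=5, n_particles=20, iters=200, alpha=0.6, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    # Keep the last 4 velocities per particle for the fractional memory term.
    v_hist = [np.zeros((n_particles, dim)) for _ in range(4)]
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()

    # Truncated Grunwald-Letnikov coefficients for derivative order alpha.
    gl = [alpha,
          0.5 * alpha * (1 - alpha),
          (1 / 6) * alpha * (1 - alpha) * (2 - alpha),
          (1 / 24) * alpha * (1 - alpha) * (2 - alpha) * (3 - alpha)]

    for _ in range(iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        memory = sum(g * v for g, v in zip(gl, v_hist))
        v_new = memory + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v_new
        v_hist = [v_new] + v_hist[:3]
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

best_x, best_val = fo_pso(sphere)
print(best_val)  # should approach 0 on the sphere test function
```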

Relevance:

20.00%

Publisher:

Abstract:

Introduction – The estimation of relative renal function (RRF) by renal scintigraphy (RS) with technetium-99m-labelled dimercaptosuccinic acid (99mTc-DMSA) can be influenced by renal depth (RD), owing to the attenuation effect of the soft tissues surrounding the kidneys. Since this RD is rarely known, different attenuation correction (AC) methods have been developed, namely those using empirical formulas, such as the Raynaud, Taylor or Tonnesen methods, or the direct application of the geometric mean (GM). Objectives – To identify the influence of the different AC methods on the quantification of relative renal function by 99mTc-DMSA RS and to assess the respective variability of the RD results. Methodology – Thirty-one patients referred for 99mTc-DMSA RS underwent the same acquisition protocol. Processing was performed by two independent operators, three times per examination, varying for the same processing the method used to determine the RRF: Raynaud, Taylor, Tonnesen, GM, or no attenuation correction (NAC). The Friedman test was applied to study the influence of the different AC methods, and Pearson correlation was used to assess the association and significance of the RD values with the variables age, weight and height. Results – The Friedman test showed statistically significant differences between the various methods (p=0.000), except for the comparisons NAC/Raynaud, Tonnesen/GM and Taylor/GM (p=1.000), for both kidneys. Pearson correlation shows that weight has a strong positive correlation with all RD calculation methods. Conclusions – Of the three RD calculation methods, the Taylor method is the one yielding RRF values closest to the GM. The choice of AC method significantly influences the quantitative RRF parameters.
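
To make the quantification step concrete, here is a minimal sketch of how relative renal function can be computed from a DMSA study, either from the geometric mean of anterior/posterior counts or from posterior counts corrected for attenuation with an assumed renal depth. The attenuation coefficient, example counts and depths are illustrative placeholders and are not the Raynaud, Taylor or Tonnesen formulas themselves.

```python
# Minimal sketch of relative renal function (RRF) from a 99mTc-DMSA study using
# the geometric mean of anterior/posterior counts and, alternatively, a
# posterior-view attenuation correction from an assumed renal depth. The
# attenuation coefficient and all numbers below are illustrative assumptions.
import math

MU_TC99M = 0.153  # assumed linear attenuation coefficient for 140 keV in soft tissue, cm^-1

def rrf_geometric_mean(ant_left, post_left, ant_right, post_right):
    """RRF (%) from the geometric mean of background-corrected counts."""
    gm_left = math.sqrt(ant_left * post_left)
    gm_right = math.sqrt(ant_right * post_right)
    total = gm_left + gm_right
    return 100.0 * gm_left / total, 100.0 * gm_right / total

def rrf_depth_corrected(post_left, post_right, depth_left_cm, depth_right_cm, mu=MU_TC99M):
    """RRF (%) from posterior counts corrected for soft-tissue attenuation."""
    c_left = post_left * math.exp(mu * depth_left_cm)
    c_right = post_right * math.exp(mu * depth_right_cm)
    total = c_left + c_right
    return 100.0 * c_left / total, 100.0 * c_right / total

# Illustrative counts: a deeper left kidney looks artificially "weaker" posteriorly,
# which the geometric-mean or depth-based correction compensates for.
print(rrf_geometric_mean(52000, 41000, 48000, 47000))
print(rrf_depth_corrected(41000, 47000, depth_left_cm=7.0, depth_right_cm=5.5))
```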

Relevance:

20.00%

Publisher:

Abstract:

Radio Link Quality Estimation (LQE) is a fundamental building block for Wireless Sensor Networks, namely for reliable deployment, resource management and routing. Existing LQEs (e.g. PRR, ETX, Fourbit, and LQI) are based on a single link property, thus leading to inaccurate estimation. In this paper, we propose F-LQE, which estimates link quality on the basis of four link quality properties: packet delivery, asymmetry, stability, and channel quality. Each of these properties is defined in linguistic terms, the natural language of Fuzzy Logic. The overall quality of the link is specified as a fuzzy rule whose evaluation returns the membership of the link in the fuzzy subset of good links. Values of the membership function are smoothed using an EWMA filter to improve stability. An extensive experimental analysis shows that F-LQE outperforms existing estimators.
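
A minimal sketch of the overall scheme follows, combining four per-link properties through simple piecewise-linear memberships, an and-like fuzzy rule (minimum blended with the mean) and EWMA smoothing. The membership shapes, the blend factor and the EWMA weight are illustrative assumptions; F-LQE's actual membership functions and parameters may differ.

```python
# Minimal sketch of a fuzzy link-quality estimate combined from four link
# properties and smoothed with an EWMA filter. Membership shapes, the blend
# factor beta and the EWMA weight alpha are illustrative assumptions.
def ramp(x, low, high):
    """Piecewise-linear membership: 0 below `low`, 1 above `high`."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def fuzzy_link_quality(prr, asymmetry, stability, snr, beta=0.6):
    """Membership of the link in the fuzzy set of 'good links' (0..1)."""
    mu = [
        ramp(prr, 0.2, 0.95),             # packet delivery
        1.0 - ramp(asymmetry, 0.0, 0.4),  # low up/down asymmetry is good
        1.0 - ramp(stability, 0.0, 0.5),  # low PRR variability is good
        ramp(snr, 0.0, 15.0),             # channel quality (dB)
    ]
    # And-like aggregation: blend the weakest property with the average.
    return beta * min(mu) + (1.0 - beta) * sum(mu) / len(mu)

def ewma(previous, sample, alpha=0.7):
    """Smooth successive estimates to improve stability."""
    return alpha * previous + (1.0 - alpha) * sample

estimate = 0.5
for window in [(0.9, 0.05, 0.1, 12.0), (0.7, 0.15, 0.3, 8.0)]:
    estimate = ewma(estimate, fuzzy_link_quality(*window))
    print(round(estimate, 3))
```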

Relevance:

20.00%

Publisher:

Abstract:

Link quality estimation is a fundamental building block for the design of several different mechanisms and protocols in wireless sensor networks (WSN). A thorough experimental evaluation of link quality estimators (LQEs) is thus mandatory. Several WSN experimental testbeds have been designed ([1–4]), but only [3] and [2] targeted link quality measurements; however, these were exploited for analyzing low-power link characteristics rather than the performance of LQEs. Despite its importance, the experimental performance evaluation of LQEs remains an open problem, mainly due to the difficulty of providing a quantitative evaluation of their accuracy. This motivated us to build RadiaLE, a benchmarking testbed for LQEs, which we present here as a demo. It includes (i) hardware components that represent the WSN under test and (ii) a software tool for the setup and control of the experiments and also for analyzing the collected data, allowing for LQE evaluation.
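
As an illustration of the kind of analysis such a testbed's software performs on collected packet logs, the sketch below derives a ground-truth link metric (PRR over a window) per link, against which LQE outputs could be compared. The log format and helper function are generic assumptions, not RadiaLE's actual data format or API.

```python
# Minimal sketch of deriving per-link packet reception ratio (PRR) from a
# collected packet log. Generic illustration only; not RadiaLE's actual format.
from collections import defaultdict

def prr_per_link(log, window=100):
    """log: iterable of (link_id, received: bool); returns PRR over the last `window` packets."""
    received = defaultdict(list)
    for link, ok in log:
        received[link].append(1 if ok else 0)
    return {link: sum(hits[-window:]) / min(len(hits), window)
            for link, hits in received.items()}

log = [("A->B", True), ("A->B", False), ("A->B", True), ("C->D", True)]
print(prr_per_link(log))  # PRR of roughly 0.67 for A->B and 1.0 for C->D
```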

Relevance:

20.00%

Publisher:

Abstract:

Background: In Angola, malaria is an endemic disease with a major impact on the economy. The WHO recommends testing for all suspected malaria cases, to avoid presumptive treatment of this disease. In malaria-endemic regions laboratory technicians must be very comfortable with microscopy, the gold standard for malaria diagnosis, to avoid incorrect diagnoses. The improper use of medication promotes drug resistance and undesirable side effects. The present study aims to assess the impact of a three-day refresher course on the knowledge of technicians, the quality of blood smear preparation and the accuracy of microscopy malaria diagnosis, using qPCR as the reference method. Methods: This study was implemented in laboratories from three hospitals in different provinces of Angola: Bengo, Benguela and Luanda. In each laboratory, samples were collected before and after the training course (a slide with thin and thick blood smears, a dried blood spot and a form). The impact of the intervention was evaluated through a written test, the quality of slide preparation and the performance of microscopy. Results: A significant increase in the median written test score was found, from 52.5% to 65.0%. A total of 973 slides were analysed to evaluate the quality of thick and thin blood smears. Considering all laboratories, there was a significant increase in the quality of thick and thin blood smears. To determine the performance of microscopy using qPCR as the reference method, we used 1,028 samples. Benguela presented the highest specificity values, 92.9% and 98.8% pre- and post-course, respectively; for sensitivity, the best pre-course value was in Benguela (75.9%) and the best post-course value in Luanda (75.0%). However, no significant increase in sensitivity or specificity after the training course was registered in any laboratory analysed. Discussion: The findings of this study support the need for continuous refresher training for microscopists and other laboratory staff. The laboratories should have a quality control programme to supervise the diagnosis and also to assess the periodicity of new training. However, other variables need to be considered for a correct malaria diagnosis, such as adequate equipment and reagents for staining and visualization, good working conditions, and motivated and qualified personnel.
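
For reference, microscopy performance against the qPCR reference method reduces to a confusion-matrix calculation like the one sketched below. The counts used are made up for illustration and are not the study's data.

```python
# Minimal sketch of how microscopy performance is scored against qPCR as the
# reference method. The confusion-matrix counts below are illustrative only.
def diagnostic_performance(tp, fp, tn, fn):
    """Sensitivity and specificity of microscopy vs. the qPCR reference."""
    sensitivity = tp / (tp + fn)   # qPCR-positives the microscopist caught
    specificity = tn / (tn + fp)   # qPCR-negatives correctly called negative
    return sensitivity, specificity

# Example: 100 qPCR-positive and 250 qPCR-negative slides read by microscopy.
sens, spec = diagnostic_performance(tp=76, fp=18, tn=232, fn=24)
print(f"sensitivity={sens:.1%} specificity={spec:.1%}")
```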

Relevance:

20.00%

Publisher:

Abstract:

High loads of fungi have been reported in different types of waste management plants. This study intends to assess fungal contamination in one waste-sorting plant before and after cleaning procedures in order to analyze their effectiveness. Air samples of 50 L were collected through an impaction method, while surface samples, taken at the same time, were collected by the swabbing method and subjected to further macro- and microscopic observations. In addition, we collected air samples of 250 L using the impinger Coriolis μ air sampler (Bertin Technologies) at a 300 L/min airflow rate in order to perform real-time quantitative PCR (qPCR) amplification of genes from specific fungal species, namely the Aspergillus fumigatus and Aspergillus flavus complexes, as well as the Stachybotrys chartarum species. Fungal quantification in the air ranged from 180 to 5,280 CFU m−3 before cleaning and from 220 to 2,460 CFU m−3 after cleaning procedures. Surfaces presented results that ranged from 29 × 10⁴ to 109 × 10⁴ CFU m−2 before cleaning and from 11 × 10⁴ to 89 × 10⁴ CFU m−2 after cleaning. No statistically significant differences in fungal load were detected between before and after the cleaning procedures. Toxigenic strains from the A. flavus complex and S. chartarum were not detected by qPCR. Conversely, the A. fumigatus species was successfully detected by qPCR and, interestingly, it was amplified in two samples where no detection by conventional methods was observed. Overall, these results reveal the inefficacy of the cleaning procedures and show that it is important to determine fungal burden in order to carry out risk assessment.
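
The reported air and surface concentrations follow from straightforward scaling of plate counts by the sampled air volume or swabbed area; a minimal sketch is shown below, with illustrative counts, swab area and dilution factor (the study's exact surface sampling parameters are not stated here).

```python
# Minimal sketch of converting plate counts from air and surface samples into
# the reported units (CFU per cubic metre of air, CFU per square metre of
# surface). Counts, swab area and dilution factor are illustrative assumptions.
def cfu_per_m3(colonies, sampled_litres=50.0):
    """Impaction air sample: scale colonies to one cubic metre (1000 L)."""
    return colonies * 1000.0 / sampled_litres

def cfu_per_m2(colonies, swabbed_cm2=100.0, dilution_factor=1.0):
    """Surface swab: scale colonies to one square metre (10,000 cm^2)."""
    return colonies * dilution_factor * 10000.0 / swabbed_cm2

print(cfu_per_m3(9))                    # 9 colonies from a 50 L sample -> 180 CFU m^-3
print(cfu_per_m2(29, 100.0, 100.0))     # hypothetical dilution giving 29 x 10^4 CFU m^-2
```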

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To compare image quality and effective dose when the 10 kVp rule is applied in manual and AEC modes in PA chest X-ray. Methods and Materials: A total of 68 images (with and without lesions) of an anthropomorphic chest phantom were acquired on a Wolverson Arcoma X-ray unit. The images were evaluated against a reference image using image quality criteria and the two-alternative forced choice (2AFC) method by five radiographers. The effective dose was calculated with PCXMC software from the exposure parameters and DAP. The exposure index (lgM) was recorded. Results: Exposure time decreases considerably when applying the 10 kVp rule in manual mode (50%-28%) compared to AEC mode (36%-23%). Statistically significant differences in effective dose between the several AEC modes were found (p=0.002). The effective dose is lower when using only the right AEC ionization chamber. Considering image quality, there are no statistically significant differences (p=0.348) between the different AEC modes for images with no lesions. With higher kVp values the lgM values also increase; the lgM values showed statistically significant differences (p=0.000). The image quality scores did not present statistically significant differences (p=0.043) for the images with lesions when comparing manual with AEC modes. Conclusion: In general, the dose is lower in manual mode. Using the right AEC ionization chamber yields the lowest effective dose in comparison with the other ionization chambers. The use of the 10 kVp rule did not affect the detectability of the lesions.
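
For context, the 10 kVp rule referred to above trades tube voltage for mAs: raising the kVp by 10 while halving the mAs keeps receptor exposure roughly constant, shortening exposure time and lowering dose. The sketch below applies the rule to a hypothetical PA chest technique; the starting factors are illustrative, not the study's actual settings.

```python
# Minimal sketch of the 10 kVp rule: raise tube voltage by 10 kVp and halve the
# mAs to keep receptor exposure roughly constant. The starting technique
# factors are illustrative assumptions.
def apply_10kvp_rule(kvp, mas):
    """Return the adjusted technique after one application of the rule."""
    return kvp + 10, mas / 2.0

kvp, mas = 110, 2.0   # hypothetical PA chest technique
new_kvp, new_mas = apply_10kvp_rule(kvp, mas)
print(f"{kvp} kVp / {mas} mAs  ->  {new_kvp} kVp / {new_mas} mAs")
# At a fixed tube current, halving the mAs halves the exposure time, which is
# what underlies the considerable exposure-time reductions seen in manual mode.
```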

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To propose a method of redistributing ill-defined causes of death (IDCD) based on the investigation of such causes. METHODS: In 2010, an evaluation of the results of investigating the causes of death classified as IDCD, in accordance with chapter 18 of the International Classification of Diseases (ICD-10), by the Mortality Information System was performed. The redistribution coefficients were calculated according to the proportional distribution of ill-defined causes reclassified after investigation into any chapter of the ICD-10, except for chapter 18, and used to redistribute the remaining, uninvestigated ill-defined causes by sex and age. The IDCD redistribution coefficient was compared with two usual methods of redistribution: a) the total redistribution coefficient, based on the proportional distribution of all the defined causes originally notified, and b) the non-external redistribution coefficient, similar to the previous one but excluding external causes. RESULTS: Of the 97,314 deaths from ill-defined causes reported in 2010, 30.3% were investigated, and 65.5% of those were reclassified as defined causes after the investigation. Endocrine diseases, mental disorders, and maternal causes had a higher representation among the reclassified ill-defined causes, contrary to infectious diseases, neoplasms, and genitourinary diseases, which had higher proportions among the defined causes reported. External causes represented 9.3% of the ill-defined causes reclassified. The correction of mortality rates by the total redistribution coefficient and the non-external redistribution coefficient increased the magnitude of the rates by a relatively similar factor for most causes, contrary to the IDCD redistribution coefficient, which corrected the different causes of death with differentiated weights. CONCLUSIONS: The proportional distribution of causes among the ill-defined causes reclassified after investigation was not similar to the original distribution of defined causes. Therefore, the redistribution of the remaining ill-defined causes based on the investigation allows for more appropriate estimates of the mortality risk due to specific causes.
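
The proposed redistribution can be summarised in two steps: compute, from the investigated deaths, the proportion of reclassified ill-defined causes falling in each defined-cause chapter, then reallocate the remaining uninvestigated ill-defined deaths with those proportions. A minimal sketch with made-up counts (not the 2010 data) follows.

```python
# Minimal sketch of the proposed redistribution: coefficients come from the
# proportional distribution of ill-defined causes reclassified to defined-cause
# chapters after investigation; the remaining uninvestigated ill-defined deaths
# are reallocated with those coefficients. Counts below are illustrative only.
def redistribution_coefficients(reclassified_counts):
    """Proportion of reclassified IDCD deaths falling in each defined-cause chapter."""
    total = sum(reclassified_counts.values())
    return {chapter: n / total for chapter, n in reclassified_counts.items()}

def redistribute(remaining_idcd, coefficients, reported_defined):
    """Add each chapter's redistributed share of uninvestigated IDCD deaths."""
    return {chapter: reported_defined.get(chapter, 0) + remaining_idcd * coef
            for chapter, coef in coefficients.items()}

reclassified = {"circulatory": 5200, "neoplasms": 900, "external": 1200, "infectious": 700}
coeffs = redistribution_coefficients(reclassified)
corrected = redistribute(remaining_idcd=60000, coefficients=coeffs,
                         reported_defined={"circulatory": 320000, "neoplasms": 180000,
                                           "external": 140000, "infectious": 45000})
print({chapter: round(v) for chapter, v in corrected.items()})
```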

Relevance:

20.00%

Publisher:

Abstract:

Optimization problems arise in science, engineering, economics, etc., and we need to find the best solutions for each reality. The methods used to solve these problems depend on several factors, including the amount and type of accessible information, the available algorithms for solving them, and, obviously, the intrinsic characteristics of the problem. There are many kinds of optimization problems and, consequently, many kinds of methods to solve them. When the involved functions are nonlinear and their derivatives are not known or are very difficult to calculate, such methods are scarcer. These kinds of functions are frequently called black-box functions. To solve such problems without constraints (unconstrained optimization), we can use direct search methods, which do not require any derivatives or approximations of them. But when the problem has constraints (nonlinear programming problems) and, additionally, the constraint functions are black-box functions, it is much more difficult to find the most appropriate method. Penalty methods can then be used. They transform the original problem into a sequence of other problems, derived from the initial one, all without constraints. This sequence of problems (without constraints) can then be solved using the methods available for unconstrained optimization. In this chapter, we present a classification of some of the existing penalty methods and describe some of their assumptions and limitations. These methods allow the solving of optimization problems with continuous, discrete, and mixed constraints, without requiring continuity, differentiability, or convexity. Thus, penalty methods can be used as the first step in the resolution of constrained problems, by means of methods typically used for unconstrained problems. We also discuss a new class of penalty methods for nonlinear optimization, which adjust the penalty parameter dynamically.
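
A minimal sketch of the basic idea follows: a quadratic penalty turns the constrained problem into a sequence of unconstrained ones with a growing penalty parameter, each solved here with a derivative-free Nelder-Mead search. The test problem, penalty form and update schedule are illustrative assumptions, not the specific methods classified in the chapter.

```python
# Minimal sketch of a quadratic penalty method: the constrained problem is
# replaced by a sequence of unconstrained problems with a growing penalty
# parameter, each solved with a derivative-free method (Nelder-Mead here).
import numpy as np
from scipy.optimize import minimize

def objective(x):
    return (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2

def constraint_violation(x):
    # Single inequality constraint g(x) = x0 + x1 - 2 <= 0; violation is max(g, 0).
    return max(x[0] + x[1] - 2.0, 0.0)

def penalty_method(x0, mu=1.0, growth=10.0, outer_iters=6):
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        penalized = lambda z: objective(z) + mu * constraint_violation(z) ** 2
        x = minimize(penalized, x, method="Nelder-Mead").x
        mu *= growth  # tighten the penalty and re-solve from the previous solution
    return x

solution = penalty_method([0.0, 0.0])
print(solution, constraint_violation(solution))  # approaches (1.5, 0.5) on this example
```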

Relevance:

20.00%

Publisher:

Abstract:

A simple procedure to measure the cohesive laws of bonded joints under mode I loading using the double cantilever beam test is proposed. The method only requires recording the applied load–displacement data and measuring the crack opening displacement at the crack tip in the course of the experimental test. The strain energy release rate is obtained by a procedure involving the Timoshenko beam theory, the specimen's compliance and the crack equivalent concept. Following the proposed approach, the influence of the fracture process zone is taken into account, which is fundamental for an accurate estimation of the failure process details. The cohesive law is obtained by differentiation of the strain energy release rate as a function of the crack opening displacement. The model was validated numerically considering three representative cohesive laws. Numerical simulations using finite element analysis including cohesive zone modeling were performed. The good agreement between the inputted and resulting laws for all the cases considered validates the model. An experimental confirmation was also performed by comparing the numerical and experimental load–displacement curves. The numerical load–displacement curves were obtained by adjusting typical cohesive laws to the ones measured experimentally following the proposed approach and using finite element analysis including cohesive zone modeling. Once again, good agreement was obtained in the comparisons, thus demonstrating the good performance of the proposed methodology.
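
The last step of the procedure, recovering the cohesive law by differentiating the strain energy release rate with respect to the crack opening displacement, is sketched below on synthetic G(w) data generated from an assumed saturating curve; the toughness and shape parameters are illustrative only.

```python
# Minimal sketch of the final step of the method: once the strain energy
# release rate G_I has been evaluated against the crack-tip opening
# displacement w (via the equivalent-crack / beam-theory data reduction), the
# mode I cohesive law follows from sigma(w) = dG_I/dw. The G(w) samples below
# are synthetic, generated from an assumed curve only to show this step.
import numpy as np

def cohesive_law_from_G(w, G):
    """Differentiate the G_I(w) data to recover the traction sigma(w)."""
    return np.gradient(G, w)

# Synthetic data: G grows and saturates at G_Ic as the process zone develops.
w = np.linspace(0.0, 0.05, 200)                  # crack opening displacement, mm
G_Ic, w0 = 0.6, 0.015                            # assumed toughness (N/mm) and shape
G = G_Ic * (1.0 - np.exp(-w / w0) * (1.0 + w / w0))
sigma = cohesive_law_from_G(w, G)                # traction, MPa
print(f"peak traction ~ {sigma.max():.2f} MPa at w = {w[np.argmax(sigma)]:.4f} mm")
```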

Relevance:

20.00%

Publisher:

Abstract:

Constrained nonlinear optimization problems can be solved using penalty or barrier functions. This strategy, based on solving unconstrained problems obtained from the original problem, has been shown to be effective, particularly when used with direct search methods. An alternative for solving such problems is the filters method. The filters method, introduced by Fletcher and Leyffer in 2002, has been widely used to solve problems of the type mentioned above. These methods use a strategy different from that of barrier or penalty functions: the latter define a new function that combines the objective function and the constraints, while the filters method treats optimization problems as bi-objective problems that minimize the objective function and a function that aggregates the constraints. Motivated by the work of Audet and Dennis in 2004, which used the filters method with derivative-free algorithms, the authors have developed works in which other direct search methods were used, combining their potential with the filters method. More recently, a new variant of these methods was presented, in which some alternative ways of aggregating the constraints for the construction of the filters were proposed. This paper presents a variant of the filters method, more robust than the previous ones, implemented with a safeguard procedure in which the values of the objective function and the constraints are interlinked and not treated completely independently.
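
For readers unfamiliar with filters, the sketch below shows the generic acceptance rule on pairs (f, h) of objective value and aggregated constraint violation: a trial point is accepted only if no stored pair dominates it, and it evicts the entries it dominates. This is the textbook rule, not the safeguarded, interlinked variant proposed in the paper.

```python
# Minimal sketch of the filter acceptance rule used by filter methods: a trial
# point is kept only if no stored pair dominates it in both objective value f
# and aggregated constraint violation h; accepted points evict entries they
# dominate. Generic textbook version, not the paper's exact variant.
def dominates(a, b):
    """Pair a = (f, h) dominates b if it is no worse in both and not identical."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def filter_accept(filter_entries, candidate):
    """Return the updated filter and whether the candidate was accepted."""
    if any(dominates(entry, candidate) for entry in filter_entries):
        return filter_entries, False
    kept = [entry for entry in filter_entries if not dominates(candidate, entry)]
    return kept + [candidate], True

F = [(10.0, 0.0), (6.0, 0.8)]
F, ok = filter_accept(F, (7.0, 0.2))   # accepted: trades some violation for a lower f
print(F, ok)
F, ok = filter_accept(F, (12.0, 1.0))  # rejected: dominated by (10.0, 0.0)
print(F, ok)
```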

Relevance:

20.00%

Publisher:

Abstract:

Constrained nonlinear optimization problems are usually solved using penalty or barrier methods combined with unconstrained optimization methods. Another alternative used to solve constrained nonlinear optimization problems is the filters method. The filters method, introduced by Fletcher and Leyffer in 2002, has been widely used in several areas of constrained nonlinear optimization. These methods treat the optimization problem as a bi-objective problem, attempting to minimize the objective function and a continuous function that aggregates the constraint violation functions. Audet and Dennis presented the first filters method for derivative-free nonlinear programming, based on pattern search methods. Motivated by this work, we have developed a new direct search method, based on simplex methods, for general constrained optimization, which combines the features of the simplex method and the filters method. This work presents a new variant of these methods, which combines the filters method with other direct search methods, and proposes some alternatives for aggregating the constraint violation functions.
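
The aggregation of the constraint violation functions into the single measure h(x) can be done in several ways; the sketch below shows three common generic choices (L1 sum, sum of squares, maximum violation). These are illustrative and are not necessarily the alternatives proposed in this work.

```python
# Minimal sketch of alternative ways to aggregate constraint violation
# functions into the single measure h(x) minimized alongside f(x) in a filter
# method. The three aggregations shown are common generic choices.
def violations(g_values):
    """Inequality constraints g_i(x) <= 0: only positive parts count as violation."""
    return [max(g, 0.0) for g in g_values]

def h_l1(g_values):
    return sum(violations(g_values))

def h_l2sq(g_values):
    return sum(v ** 2 for v in violations(g_values))

def h_max(g_values):
    return max(violations(g_values), default=0.0)

g = [-0.5, 0.3, 1.2]  # hypothetical constraint values at a trial point
print(h_l1(g), h_l2sq(g), h_max(g))  # 1.5, 1.53, 1.2
```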

Relevance:

20.00%

Publisher:

Abstract:

The Maxwell equations play a fundamental role in electromagnetic theory and lead to models useful in physics and engineering. This formalism involves integer-order differential calculus, but electromagnetic diffusion points towards the adoption of a fractional calculus approach. This study addresses the skin effect and develops a new method for implementing fractional-order inductive elements. Two genetic algorithms are adopted, one for the numerical evaluation of the system and another for parameter identification, both with good results.
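
As a small illustration of what a fractional-order inductive element looks like in the frequency domain, the sketch below evaluates an impedance of the form Z(jω) = K(jω)^α, whose magnitude slope (20α dB/decade) and constant phase (α·90°) lie between a resistor and an ideal inductor. The values of K and α are illustrative, not parameters identified in the study.

```python
# Minimal sketch of the frequency response of a fractional-order inductive
# element, Z(jw) = K * (jw)**alpha. K and alpha below are illustrative values.
import cmath
import math

def fractional_impedance(freq_hz, K=1.0, alpha=0.5):
    jw = 1j * 2.0 * math.pi * freq_hz
    return K * jw ** alpha

for f in (10.0, 100.0, 1000.0):
    Z = fractional_impedance(f)
    print(f"f={f:7.1f} Hz  |Z|={abs(Z):8.3f}  phase={math.degrees(cmath.phase(Z)):5.1f} deg")
# Phase stays near 45 degrees for alpha = 0.5, between a resistor (0 deg)
# and an ideal inductor (90 deg).
```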

Relevance:

20.00%

Publisher:

Abstract:

Adhesive bonding for the unions in multi-component structures is gaining momentum over welding, riveting and fastening. The availability of accurate damage models is vital for the design of bonded structures, to minimize design costs and time to market. Cohesive Zone Models (CZMs) have been used for fracture prediction in structures. The eXtended Finite Element Method (XFEM) is a recent improvement of the Finite Element Method (FEM) that relies on traction-separation laws similar to those of CZMs, but it allows the growth of discontinuities within bulk solids along an arbitrary path by enriching degrees of freedom. This work proposes and validates a damage law to model crack propagation in a thin layer of a structural epoxy adhesive using the XFEM. The fracture toughness in pure mode I (GIc) and the tensile cohesive strength (σn0) were defined by Double-Cantilever Beam (DCB) and bulk tensile tests, respectively, which made it possible to build the damage law. The XFEM simulations of the DCB tests accurately matched the experimental load-displacement (P-δ) curves, which validated the analysis procedure.
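
A damage (traction-separation) law of the kind used here can be parameterised from exactly the two measured quantities, GIc and the cohesive strength; the sketch below builds a generic triangular law from them. The triangular shape, the initial stiffness and the numerical values are illustrative assumptions, not the specific law proposed and validated in the paper.

```python
# Minimal sketch of a triangular mode I traction-separation (damage) law built
# from a fracture toughness G_Ic and a cohesive strength t_n0, the two
# quantities provided by the DCB and bulk tensile tests. Shape, stiffness and
# numbers below are illustrative assumptions, not the paper's law.
def triangular_law(G_Ic, t_n0, K_n=1.0e4):
    """Return a traction(separation) function for a triangular cohesive law."""
    d0 = t_n0 / K_n            # separation at damage onset, mm
    df = 2.0 * G_Ic / t_n0     # separation at full failure (area under law = G_Ic)
    def traction(d):
        if d <= d0:
            return K_n * d                         # linear elastic branch
        if d >= df:
            return 0.0                             # fully cracked
        return t_n0 * (df - d) / (df - d0)         # linear softening branch
    return traction

law = triangular_law(G_Ic=0.4, t_n0=20.0)          # N/mm and MPa, illustrative
for d in (0.0005, 0.002, 0.01, 0.04, 0.05):
    print(f"d={d:.4f} mm  t={law(d):6.2f} MPa")
```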

Relevance:

20.00%

Publisher:

Abstract:

This paper presents the pseudo phase plane (PPP) method for detecting the existence of a nanofilm on the nitroazobenzene-modified glassy carbon electrode (NAB-GC) system. The modified electrode systems and the nitroazobenzene nanofilm were prepared by electrochemical reduction of the diazonium salt of NAB at glassy carbon electrodes (GCE) in nonaqueous media. The IR spectra of the bare glassy carbon electrodes (GCE), the NAB-GC electrode system and the organic NAB film were recorded. The IR data of the bare GC, NAB-GC and NAB film were categorized into five series consisting of FILM1, GC-NAB1, GC1; FILM2, GC-NAB2, GC2; FILM3, GC-NAB3, GC3; and FILM4, GC-NAB4, GC4, respectively. The PPP approach was applied to each group of data from the unmodified and modified electrode systems with the nanofilm. The results provided by the PPP method show the existence of the NAB film on the modified GC electrode.
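
The pseudo phase plane is built by plotting a signal against a delayed copy of itself, x[n] versus x[n+τ], instead of against its derivative; the sketch below constructs such a trajectory from a synthetic, spectrum-like signal. The test signal and delay are illustrative; the paper applies the method to the measured IR spectra of the bare and modified electrodes.

```python
# Minimal sketch of constructing a pseudo phase plane (PPP) from a sampled
# signal: the signal is plotted against a delayed copy of itself,
# x[n] vs. x[n + tau]. The test signal and delay below are illustrative.
import numpy as np

def pseudo_phase_plane(x, tau):
    """Return the two coordinates (x[n], x[n+tau]) of the PPP trajectory."""
    x = np.asarray(x, dtype=float)
    return x[:-tau], x[tau:]

# Illustrative 'spectrum-like' signal: a few absorption bands on a noisy baseline.
n = np.arange(1000)
signal = (np.exp(-((n - 300) / 30.0) ** 2) + 0.6 * np.exp(-((n - 650) / 20.0) ** 2)
          + 0.02 * np.random.default_rng(0).standard_normal(n.size))
u, v = pseudo_phase_plane(signal, tau=15)
print(u.shape, v.shape)  # the (u, v) trajectory is what the PPP method compares across series
```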