23 results for RM (rate monotonic) algorithm
at Universidade do Minho
Abstract:
The Electromagnetism-like (EM) algorithm is a population-based stochastic global optimization algorithm that uses an attraction-repulsion mechanism to move sample points towards the optimum. In this paper, an implementation of the EM algorithm in the Matlab environment is proposed, as a useful function for practitioners and for those who want to experiment with a new global optimization solver. A set of benchmark problems is solved in order to evaluate the performance of the implemented method when compared with other stochastic methods available in the Matlab environment. The results confirm that our implementation is a competitive alternative both in terms of numerical results and performance. Finally, a case study based on a parameter estimation problem of a biological system shows that the EM implementation could be applied with promising results in the control optimization area.
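The attraction-repulsion move described above can be sketched as follows. This is a minimal, illustrative Python version (the paper's implementation is in Matlab); the charge formula and the random step size are plausible simplifications of the EM scheme, not taken from the paper:

```python
import math
import random

def em_step(points, f, lower, upper):
    """One attraction-repulsion move of an EM-like search (simplified sketch).
    Better points attract worse ones, worse points repel; the incumbent best
    point is kept unchanged (elitism)."""
    n = len(points[0])
    vals = [f(p) for p in points]
    best = min(range(len(points)), key=vals.__getitem__)
    denom = sum(v - vals[best] for v in vals) or 1.0
    # Charge: larger for better (lower-valued) points.
    q = [math.exp(-n * (v - vals[best]) / denom) for v in vals]
    new_points = []
    for i, p in enumerate(points):
        if i == best:
            new_points.append(p)  # elitism: keep the incumbent best
            continue
        force = [0.0] * n
        for j, pj in enumerate(points):
            if j == i:
                continue
            d = [pj[k] - p[k] for k in range(n)]
            dist2 = sum(c * c for c in d) or 1e-12
            sign = 1.0 if vals[j] < vals[i] else -1.0  # attract towards better points
            for k in range(n):
                force[k] += sign * q[i] * q[j] * d[k] / dist2
        norm = math.sqrt(sum(c * c for c in force)) or 1.0
        step = random.random()
        moved = [min(max(p[k] + step * force[k] / norm, lower), upper)
                 for k in range(n)]
        new_points.append(moved)
    return new_points

# Usage: minimize the sphere function on [-5, 5]^2.
random.seed(0)
f = lambda x: sum(c * c for c in x)
pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(20)]
init_best = min(f(p) for p in pop)
for _ in range(200):
    pop = em_step(pop, f, -5.0, 5.0)
best_val = min(f(p) for p in pop)
```

Because the best point is carried over unchanged each iteration, the best objective value can never deteriorate over the run.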
Abstract:
The present paper reports the precipitation process of Al3Sc structures in an aluminum-scandium alloy, which has been simulated with a synchronous parallel kinetic Monte Carlo (spkMC) algorithm. The spkMC implementation is based on the vacancy diffusion mechanism. To filter the raw data generated by the spkMC simulations, the density-based clustering with noise (DBSCAN) method has been employed. The spkMC and DBSCAN algorithms were implemented in the C language using the MPI library. The simulations were conducted on the SeARCH cluster located at the University of Minho. The Al3Sc precipitation was successfully simulated at the atomistic scale with spkMC. DBSCAN proved to be a valuable aid in identifying the precipitates by performing a cluster analysis of the simulation results. The achieved simulation results are in good agreement with those reported in the literature for sequential kinetic Monte Carlo (kMC) simulations. The parallel implementation of kMC provided a 4x speedup over the sequential version.
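DBSCAN itself is a standard algorithm; a minimal sketch (in Python here, whereas the paper's implementation is in C with MPI) illustrates the cluster/noise labelling used to identify precipitates. The point data, `eps` and `min_pts` values below are illustrative, not the paper's:

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN sketch: returns one label per point
    (cluster id starting at 0, or -1 for noise)."""
    def neighbours(i):
        # Indices of all points within eps of points[i] (including i itself).
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps * eps]

    labels = [None] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1  # provisionally noise; may be reclaimed as a border point
            continue
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point reclaimed from noise
                continue
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbours(j)
            if len(jn) >= min_pts:   # core point: keep expanding the cluster
                seeds.extend(jn)
        cluster += 1
    return labels

# Two dense blobs plus one outlier.
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
       (5.0, 5.0), (5.1, 5.0), (5.0, 5.1),
       (10.0, 0.0)]
labels = dbscan(pts, eps=0.5, min_pts=3)
```

In the precipitation study, each cluster of atoms found this way would correspond to one candidate precipitate, and isolated solute atoms fall out as noise.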
Abstract:
The development of novel strengthening techniques to address the seismic vulnerability of masonry elements is gradually leading to simpler, faster and more effective strengthening strategies. In particular, the use of fabric reinforced cementitious matrix systems is considered of great potential, given the increase in ductility achieved with simple and economic strengthening procedures. To assess the effectiveness of these strengthening systems, and considering that the seismic action is involved, one important component of the structural behaviour is the in-plane cyclic response. This work discusses the applicability of the diagonal tensile test for the assessment of the cyclic response of strengthened masonry. The results obtained made it possible to assess the contribution of the strengthening system to the increase of the load-carrying capacity of masonry elements, as well as to evaluate the damage evolution and the stiffness degradation mechanisms that develop under cyclic loading.
Abstract:
Delay Tolerant Networking (DTN) is a communication architecture enabling connectivity in topologies without regular end-to-end network connections. DTN enables communication in environments with intermittent connectivity, large delays and delivery time variations, and high error rates. DTN can be used in vehicular networks involving public transport. This research aims to analyze the role of public transit as a DTN routing infrastructure. The impact of using public transit as a relay router is investigated with reference to network performance, defined by delivery ratio, average delay and overhead. The results show that public transit can be used as a backbone for DTN in an urban scenario using existing protocols. This opens insights for future research on routing algorithm and protocol design.
Abstract:
Doctoral thesis in Civil Engineering
Abstract:
The monitoring data collected during tunnel excavation can be used in inverse analysis procedures in order to identify more realistic geomechanical parameters and thus increase the knowledge about the formations of interest. These more realistic parameters can be used in real time to adapt the project to the real in situ behaviour of the structure. However, monitoring plans are normally designed for safety assessment and not especially for the purpose of inverse analysis. In fact, there is a lack of knowledge about which types and quantities of measurements are needed to successfully identify the parameters of interest. The optimisation algorithm chosen for the identification procedure may also be important in this regard. In this work, this problem is addressed using a theoretical case, with which a thorough parametric study was carried out using two optimisation algorithms based on different calculation paradigms, namely a conventional gradient-based algorithm and an evolution strategy algorithm. Calculations were carried out for different sets of parameters to identify and for several combinations of types and amounts of monitoring data. The results clearly show the high importance of the available monitoring data and of the chosen algorithm for the success rate of the inverse analysis process.
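As an illustration of the evolution-strategy paradigm mentioned above, here is a minimal (1+1)-ES with the classical 1/5th success rule, applied to a toy parameter-identification problem. The forward model, the "geomechanical parameters" and all settings below are hypothetical stand-ins, not the paper's case:

```python
import random

def one_plus_one_es(misfit, x0, sigma=0.5, iters=300, seed=1):
    """Minimal (1+1) evolution strategy: mutate the current estimate with
    Gaussian noise, keep the candidate only if it lowers the misfit, and
    adapt the step size with the 1/5th success rule."""
    rng = random.Random(seed)
    x, fx = list(x0), misfit(x0)
    successes = 0
    for t in range(1, iters + 1):
        cand = [xi + sigma * rng.gauss(0, 1) for xi in x]
        fc = misfit(cand)
        if fc < fx:              # accept only improvements
            x, fx = cand, fc
            successes += 1
        if t % 20 == 0:          # 1/5th success rule: adapt the mutation step
            sigma *= 1.5 if successes / 20 > 0.2 else 0.6
            successes = 0
    return x, fx

# Hypothetical example: recover two parameters from synthetic "monitoring
# data" generated by a toy forward model (a quadratic response in time).
true_params = [2.0, -1.0]
forward = lambda p: [p[0] * t + p[1] * t * t for t in (0.5, 1.0, 1.5, 2.0)]
data = forward(true_params)
misfit = lambda p: sum((a - b) ** 2 for a, b in zip(forward(p), data))
est, err = one_plus_one_es(misfit, [0.0, 0.0])
```

Because candidates are accepted only when they reduce the misfit, the final error can never exceed the misfit of the starting guess.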
Abstract:
Integrated master's dissertation in Psychology
Abstract:
Wireless body sensor networks (WBSNs) constitute a key technology for closing the loop between patients and healthcare providers, as WBSNs provide sensing ability as well as mobility and portability, essential characteristics for wide acceptance of wireless healthcare technology. However, one important and difficult aspect of WBSNs is providing data transmissions with quality of service, due, among other factors, to the antennas being small and placed close to the body. Such transmissions cannot be fully provided without a MAC protocol that solves the problems of medium sharing. A vast number of MAC protocols conceived for wireless networks are based on random or scheduled schemes. This paper first studies the suitability of two MAC protocols, one using CSMA and the other TDMA, to transmit the signals collected continuously from multiple sensor nodes placed on the human body directly to the base station. Tests in a real scenario show that the beaconed TDMA MAC protocol presents a lower average packet loss ratio than CSMA. However, the average packet loss ratio is above 1.0 %. To improve this performance, which is of vital importance in areas such as e-health and ambient assisted living, a hybrid TDMA/CSMA scheme is proposed and tested in a real scenario with two WBSNs and four sensor nodes per WBSN. An average packet loss ratio lower than 0.2 % was obtained with the hybrid scheme. To achieve this significant improvement, the hybrid scheme uses a lightweight algorithm to dynamically control the start of the superframes. Scalability and traffic rate variation tests show that this strategy allows approximately ten WBSNs to operate simultaneously without significant performance degradation.
Abstract:
The artificial fish swarm algorithm has recently emerged in continuous global optimization. It uses points of a population in space to identify the position of fish in the school. Many real-world optimization problems are described by 0-1 multidimensional knapsack problems, which are NP-hard. In the last decades, several exact as well as heuristic methods have been proposed for solving these problems. In this paper, a new simplified binary version of the artificial fish swarm algorithm is presented, where a point/fish is represented by a binary string of 0/1 bits. Trial points are created by using crossover and mutation in the different fish behaviours, which are randomly selected using two user-defined probability values. In order to make the points feasible, the presented algorithm uses a random heuristic drop-item procedure followed by an add-item procedure aiming to increase the profit through the addition of more items to the knapsack. A cyclic reinitialization of 50% of the population, and a simple local search that allows the progress of a small percentage of points towards optimality and then refines the best point in the population, greatly improve the quality of the solutions. The presented method is tested on a set of benchmark instances and a comparison with other methods available in the literature is shown. The comparison shows that the proposed method can be an alternative method for solving these problems.
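The drop-item/add-item repair step can be sketched as follows for the single-constraint 0-1 knapsack (the paper treats the multidimensional case). The item data are illustrative, and ordering the add phase by decreasing profit is an assumed greedy choice, not necessarily the paper's rule:

```python
import random

def repair(bits, weights, profits, capacity, rng):
    """Drop-item / add-item repair sketch for a 0-1 knapsack solution:
    randomly drop items until the solution is feasible, then greedily add
    items that still fit, in decreasing profit order, to raise the profit."""
    bits = list(bits)
    inside = [i for i, b in enumerate(bits) if b]
    rng.shuffle(inside)                      # random drop order
    load = sum(weights[i] for i in inside)
    # Drop phase: remove random items until the capacity is respected.
    while load > capacity and inside:
        i = inside.pop()
        bits[i] = 0
        load -= weights[i]
    # Add phase: insert the most profitable items that still fit.
    for i in sorted(range(len(bits)), key=lambda i: -profits[i]):
        if not bits[i] and load + weights[i] <= capacity:
            bits[i] = 1
            load += weights[i]
    return bits

# Illustrative instance: an infeasible all-ones string gets repaired.
weights = [4, 3, 2, 5]
profits = [10, 7, 3, 12]
capacity = 7
fixed = repair([1, 1, 1, 1], weights, profits, capacity, random.Random(0))
```

The repaired string is always feasible, and no further item can be added without exceeding the capacity, which is exactly the role this step plays inside the binary fish swarm loop.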
Abstract:
In this paper, we propose an extension of the firefly algorithm (FA) to multi-objective optimization. FA is a swarm intelligence optimization algorithm, inspired by the flashing behavior of fireflies at night, that is capable of computing global solutions to continuous optimization problems. Our proposal relies on a fitness assignment scheme that gives lower fitness values to the positions of fireflies that correspond to non-dominated points with smaller aggregation of objective function distances to the minimum values. Furthermore, FA randomness is based on the spread metric to reduce the gaps between consecutive non-dominated solutions. The results obtained from the preliminary computational experiments show that our proposal gives a dense and well-distributed approximated Pareto front with a large number of points.
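One plausible reading of the described fitness assignment — non-dominated points ranked by the aggregated distance of their objective values to the per-objective minima, dominated points pushed behind all of them — can be sketched as below. This is an illustrative interpretation under that assumption, not the paper's exact formula:

```python
def dominates(a, b):
    """Pareto dominance for minimization: a dominates b."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def assign_fitness(objs):
    """Sketch of the fitness idea: lower fitness is better. Non-dominated
    points get the sum of their objective distances to the per-objective
    minima; dominated points get that sum shifted past the worst value."""
    ideal = [min(col) for col in zip(*objs)]                  # per-objective minima
    agg = [sum(o - z for o, z in zip(v, ideal)) for v in objs]
    worst = max(agg)
    fit = []
    for i, v in enumerate(objs):
        nd = not any(dominates(u, v) for j, u in enumerate(objs) if j != i)
        fit.append(agg[i] if nd else worst + agg[i])          # penalize dominated points
    return fit

# Three non-dominated points and one point dominated by (2.0, 2.0).
objs = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)]
fit = assign_fitness(objs)
```

With this assignment, fireflies move towards positions with lower fitness, so the swarm is pulled towards the non-dominated set first and towards the ideal point within it.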
Abstract:
This paper presents a single-phase Series Active Power Filter (Series APF) for mitigation of the load voltage harmonic content, while keeping the voltage on the DC side regulated without the support of a voltage source. The proposed series active power filter control algorithm eliminates the additional voltage source used to regulate the DC voltage, and with the adopted topology no coupling transformer is needed to interface the series active power filter with the electrical power grid. The paper describes the control strategy, which encapsulates the grid synchronization scheme, the compensation voltage calculation, the damping algorithm and the dead-time compensation. The topology and control strategy of the series active power filter have been evaluated in simulation software, and simulation results are presented. Experimental results, obtained with a developed laboratory prototype, validate the theoretical assumptions and are within the harmonic spectrum limits imposed by the international recommendations of the IEEE 519 standard.
Abstract:
Purpose: To study the relationship among the variables intensity of end-of-day (EOD) dryness, corneal sensitivity and blink rate in soft contact lens (CL) wearers. Methods: Thirty-eight soft CL wearers (25 women and 13 men; mean age 27.1 ± 7.2 years) were enrolled. EOD dryness was assessed using a scale of 0-5 (0, none, to 5, very intense). Mechanical and thermal (heat and cold) sensitivity were measured using a Belmonte's gas esthesiometer. The blink rate was recorded using a video camera while subjects were wearing a hydrogel CL and watching a film for 90 min in a controlled environmental chamber. Results: A significant inverse correlation was found between EOD dryness and mechanical sensitivity (r = −0.39; p = 0.02); however, there were no significant correlations between EOD dryness and thermal sensitivity. A significant correlation (r = 0.56; p < 0.001) was also observed between EOD dryness and blink rate, but no correlations were found between blink rate and mechanical or thermal sensitivity. Conclusions: CL wearers with higher corneal sensitivity to mechanical stimulation reported more EOD dryness with habitual CL wear. Moreover, subjects reporting more EOD dryness had increased blink rates during wear of a standard CL type. The increased blink rate could act to improve the ocular surface environment and relieve symptoms.
Abstract:
Natural selection favors the survival and reproduction of organisms that are best adapted to their environment. Selection mechanisms in evolutionary algorithms mimic this process, aiming to create environmental conditions in which artificial organisms can evolve to solve the problem at hand. This paper proposes a new selection scheme for evolutionary multiobjective optimization. The similarity measure that defines the concept of neighborhood is a key feature of the proposed selection. Contrary to commonly used approaches, usually defined on the basis of distances between either individuals or weight vectors, it is suggested to base similarity and neighborhood on the angle between individuals in the objective space. The smaller the angle, the more similar the individuals. This notion is exploited during the mating and environmental selections. Convergence is ensured by minimizing distances from individuals to a reference point, whereas diversity is preserved by maximizing angles between neighboring individuals. Experimental results reveal a highly competitive performance and useful characteristics of the proposed selection. Its strong diversity-preserving ability allows it to produce significantly better performance on some problems when compared with state-of-the-art algorithms.
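The angle-based similarity can be made concrete with a small helper. Measuring the angle between objective vectors from the origin is an assumption of this sketch (the abstract mentions a reference point but does not specify the construction):

```python
import math

def angle(a, b):
    """Angle in radians between two objective vectors, measured from the
    origin (assumed here to play the role of the reference point)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

# Smaller angle = more similar individuals in the objective space.
a, b, c = (1.0, 0.0), (0.9, 0.1), (0.0, 1.0)
```

Under this measure, `a` and `b` are near neighbors while `a` and `c` sit at opposite ends of the front, which is exactly the contrast the selection scheme exploits when maximizing angles between neighboring individuals.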
Abstract:
This Letter presents measurements of correlated production of nearby jets in Pb+Pb collisions at √s_NN = 2.76 TeV using the ATLAS detector at the Large Hadron Collider. The measurement was performed using 0.14 nb⁻¹ of data recorded in 2011. The production of correlated jet pairs was quantified using the rate, R_ΔR, of "neighbouring" jets that accompany "test" jets within a given range of angular distance, ΔR, in the pseudorapidity-azimuthal angle plane. The jets were measured in the ATLAS calorimeter and were reconstructed using the anti-kt algorithm with radius parameters d = 0.2, 0.3, and 0.4. R_ΔR was measured in different Pb+Pb collision centrality bins, characterized by the total transverse energy measured in the forward calorimeters. A centrality dependence of R_ΔR is observed for all three jet radii, with R_ΔR found to be lower in central collisions than in peripheral collisions. The ratios formed by the R_ΔR values in different centrality bins and the values in the 40-80% centrality bin are presented.
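The angular distance used above is the standard ΔR = √(Δη² + Δφ²) in the pseudorapidity-azimuthal angle plane. A small helper makes the definition concrete; wrapping Δφ into [-π, π] is the usual convention, not spelled out in the abstract:

```python
import math

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance ΔR = sqrt(Δη² + Δφ²) between two jets in the
    pseudorapidity-azimuthal angle plane, with Δφ wrapped into [-π, π]."""
    deta = eta1 - eta2
    # Wrap the azimuthal difference so two nearly back-to-front angles
    # (e.g. +3.0 and -3.0 rad) give a small separation, not ~6 rad.
    dphi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(deta, dphi)
```

A "neighbouring" jet is then simply one whose ΔR to the test jet falls inside the chosen window.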
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management