21 results for parametric implicit vector equilibrium problems

at Universidade do Minho


Relevance:

30.00%

Publisher:

Abstract:

Natural selection favors the survival and reproduction of organisms that are best adapted to their environment. The selection mechanism in evolutionary algorithms mimics this process, aiming to create environmental conditions in which artificial organisms can evolve to solve the problem at hand. This paper proposes a new selection scheme for evolutionary multiobjective optimization. The similarity measure that defines the concept of neighborhood is a key feature of the proposed selection. Contrary to commonly used approaches, usually defined on the basis of distances between either individuals or weight vectors, it is suggested to base similarity and neighborhood on the angle between individuals in the objective space. The smaller the angle, the more similar the individuals. This notion is exploited during the mating and environmental selections. Convergence is ensured by minimizing distances from individuals to a reference point, whereas diversity is preserved by maximizing angles between neighboring individuals. Experimental results reveal a highly competitive performance and useful characteristics of the proposed selection. Its strong diversity-preserving ability yields significantly better performance on some problems when compared with state-of-the-art algorithms.
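The angle-based environmental selection described above can be sketched in a few lines; the greedy smallest-angle pair elimination and the function names below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def angle(a, b):
    """Angle (radians) between two objective vectors."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def environmental_selection(objs, ref, k):
    """Reduce the population to k individuals: repeatedly find the
    most similar (smallest-angle) pair and discard the member that is
    farther from the reference point, balancing convergence (distance
    to `ref`) against diversity (angles between neighbors)."""
    survivors = list(range(len(objs)))
    while len(survivors) > k:
        best = None  # (angle, index_a, index_b)
        for i in range(len(survivors)):
            for j in range(i + 1, len(survivors)):
                a, b = survivors[i], survivors[j]
                ang = angle(objs[a] - ref, objs[b] - ref)
                if best is None or ang < best[0]:
                    best = (ang, a, b)
        _, a, b = best
        # keep the member of the pair that is closer to the reference point
        drop = a if np.linalg.norm(objs[a] - ref) > np.linalg.norm(objs[b] - ref) else b
        survivors.remove(drop)
    return survivors
```

For two near-parallel objective vectors such as [1, 4] and [1.1, 3.9], the angle is tiny, so one of them is eliminated in favor of a more distant neighbor like [4, 1].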

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a critical and quantitative analysis of the influence of Power Quality on grid-connected solar photovoltaic microgeneration installations. First, the main regulations and legislation related to solar photovoltaic microgeneration in Portugal and Europe are introduced. Next, Power Quality monitoring results obtained from two residential solar photovoltaic installations located in the north of Portugal are presented, and it is explained how Power Quality events affect the operation of these installations. Afterwards, a methodology is described to estimate the energy production losses and the impact on revenue caused by the abnormal operation of the electrical installation. This is done by comparing the amount of energy that was injected into the power grid with the theoretical value of energy that could be injected under normal conditions. The performed analysis shows that Power Quality severely affects the operation of solar photovoltaic installations. The losses of revenue in the two monitored installations, M1 and M2, are estimated at about 27% and 22%, respectively.
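The loss-estimation step (injected energy versus theoretical energy under normal conditions) reduces to simple arithmetic; the helper below is a sketch, and the numbers in the test are illustrative, not the monitored data from M1 and M2:

```python
def production_loss(injected_kwh, theoretical_kwh, tariff_eur_per_kwh):
    """Estimate the production loss fraction and the lost revenue by
    comparing the energy actually injected into the grid with the
    theoretical energy that could have been injected under normal
    operating conditions."""
    lost_kwh = theoretical_kwh - injected_kwh
    loss_fraction = lost_kwh / theoretical_kwh
    lost_revenue = lost_kwh * tariff_eur_per_kwh
    return loss_fraction, lost_revenue
```

For example, an installation that injected 730 kWh when 1000 kWh were theoretically possible lost 27% of its production.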

Relevance:

20.00%

Publisher:

Abstract:

The Gibbs free energy of transfer of a methylene group, G*(CH2), is reported as a measure of the relative hydrophobicity of the equilibrium phases. Furthermore, G*(CH2) is a characteristic parameter of each tie-line, and for that reason can be used to compare different tie-lines of a given aqueous two-phase system (ATPS) or even to establish comparisons among different ATPSs. In this work, the partition coefficients of a series of four dinitrophenylated amino acids were experimentally determined, at 23 °C, in five different tie-lines of PEG 8000-(sodium or potassium) citrate ATPSs. G*(CH2) values were calculated from the partition coefficients and used to evaluate the relative hydrophobicity of the equilibrium phases. PEG 8000-potassium citrate ATPSs presented larger relative hydrophobicity than PEG 8000-sodium citrate ATPSs. Furthermore, the results obtained indicated that the PEG-rich phase (top phase) has a higher affinity to participate in hydrophobic hydration interactions than the salt-rich phase (bottom phase).
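G*(CH2) is conventionally obtained from the slope of ln K against the number of methylene groups in a homologous solute series, via G*(CH2) = -R·T·slope. The sketch below assumes this standard linear free-energy relationship; the data in the test are synthetic, not the measured partition coefficients:

```python
R = 8.314  # gas constant, J mol^-1 K^-1

def methylene_transfer_energy(n_ch2, ln_k, temp_c=23.0):
    """Least-squares slope of ln K versus the number of CH2 groups,
    converted to the Gibbs free energy of transfer of one methylene
    group: G*(CH2) = -R * T * slope (J/mol)."""
    n = len(n_ch2)
    mean_x = sum(n_ch2) / n
    mean_y = sum(ln_k) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(n_ch2, ln_k))
             / sum((x - mean_x) ** 2 for x in n_ch2))
    return -R * (temp_c + 273.15) * slope
```

A positive slope (K grows with chain length) gives a negative G*(CH2), i.e. transfer of a CH2 group to the top phase is favorable, the usual signature of a more hydrophobic PEG-rich phase.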

Relevance:

20.00%

Publisher:

Abstract:

Existing masonry structures are usually associated with high seismic vulnerability, mainly due to the properties of the materials, weak connections between floors and load-bearing walls, the high mass of the masonry walls and the flexibility of the floors. For these reasons, the seismic performance of existing masonry structures has received much attention in the last decades. This study presents a parametric analysis taking into account deviations in the features of gaioleiro buildings, a Portuguese building typology. The main objective of the parametric analysis is to compare the seismic performance of the structure as a function of the variations of its properties with respect to the response of a reference model. The parametric analysis was carried out for two types of structural analysis, namely non-linear dynamic analysis with time integration and pushover analysis with a distribution of forces proportional to the inertial forces of the structure. The Young's modulus of the masonry walls, the Young's modulus of the timber floors, and the compressive and tensile non-linear properties (strength and fracture energy) were the properties considered in both types of analysis. Additionally, in the dynamic analysis, the influences of the viscous damping and of the vertical component of the earthquake were evaluated. A pushover analysis proportional to the modal displacement of the first mode in each direction was also carried out. The results show that the Young's modulus of the masonry walls, the Young's modulus of the timber floors and the compressive non-linear properties are the parameters that most influence the seismic performance of this type of tall and weak existing masonry structure. Furthermore, it is concluded that the stiffness of the floors significantly influences the strength capacity and the collapse mechanism of the numerical model. Thus, a study on the strengthening of the floors was also carried out. Increasing the thickness of the timber floors was the strengthening technique that presented the best seismic performance, in which the reduction of the out-of-plane displacements of the masonry walls is highlighted.

Relevance:

20.00%

Publisher:

Abstract:

During recent decades it has been possible to identify several problems in construction industry project management, related to systematic failures in fulfilling schedule, cost and quality targets, which highlight a need to evaluate the factors that may cause these failures. Therefore, it is important to understand how project managers plan projects, so that performance and results can be improved. It is also important to understand whether areas beyond cost and time management, which several studies mention as the most critical areas, receive the necessary attention from construction project managers. Although cost and time are the most sensitive areas, there are several other factors that may lead to project failure. This study aims to understand the reasons that may cause deviations in terms of cost, time and quality, from the project management point of view, looking at the knowledge areas mentioned by the PMI (Project Management Institute).

Relevance:

20.00%

Publisher:

Abstract:

The occurrence of Barotrauma is identified as a major concern for health professionals, since it can be fatal for patients. In order to support the decision process and to predict the risk of Barotrauma occurring, Data Mining models were induced. Based on this principle, the present study addresses the Data Mining process with the aim of providing the hourly probability of a patient having Barotrauma. The process of discovering implicit knowledge in data collected from Intensive Care Unit patients was achieved through the Cross Industry Standard Process for Data Mining. With the goal of making predictions under the classification approach, several Data Mining techniques were selected: Decision Trees, Naive Bayes and Support Vector Machines. The study focused on assessing the validity and viability of predicting a composite variable. To predict Barotrauma, two classes were created: "risk" and "no risk". This target comes from combining two variables: Plateau Pressure and PCO2. The best models presented a sensitivity between 96.19% and 100%. In terms of accuracy, the values varied between 87.5% and 100%. This study and the achieved results demonstrate the feasibility of predicting the risk of a patient having Barotrauma by presenting the associated probability.
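Deriving the composite "risk"/"no risk" target from the two physiological variables can be sketched as below; the cut-off values are illustrative assumptions for the example, not the thresholds used in the study:

```python
def barotrauma_class(plateau_pressure, pco2,
                     plateau_limit=30.0, pco2_limit=50.0):
    """Combine Plateau Pressure (cmH2O) and PCO2 (mmHg) into the
    composite target class: 'risk' if either variable exceeds its
    cut-off, 'no risk' otherwise. The limits here are hypothetical
    placeholders, not the study's clinical thresholds."""
    if plateau_pressure > plateau_limit or pco2 > pco2_limit:
        return "risk"
    return "no risk"
```

The resulting labels would then serve as the target for the Decision Tree, Naive Bayes and Support Vector Machine classifiers.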

Relevance:

20.00%

Publisher:

Abstract:

Doctoral Thesis in Civil Engineering

Relevance:

20.00%

Publisher:

Abstract:

Doctoral Thesis in Industrial and Systems Engineering (PDEIS)

Relevance:

20.00%

Publisher:

Abstract:

The Closest Vector Problem (CVP) and the Shortest Vector Problem (SVP) are prime problems in lattice-based cryptanalysis, since they underpin the security of many lattice-based cryptosystems. Despite the importance of these problems, only a few CVP-solvers are publicly available, and their scalability was never studied. This paper presents a scalable implementation of an enumeration-based CVP-solver for multi-cores, which can be easily adapted to solve the SVP. In particular, it achieves super-linear speedups in some instances on up to 8 cores and almost linear speedups on 16 cores when solving the CVP on a 50-dimensional lattice. Our results show that enumeration-based CVP-solvers can be parallelized as effectively as enumeration-based solvers for the SVP, based on a comparison with a state-of-the-art SVP-solver. In addition, we show that we can optimize the SVP variant of our solver in such a way that it becomes 35%-60% faster than the fastest enumeration-based SVP-solver to date.
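To make the problem statement concrete, here is a brute-force CVP sketch over a bounded coefficient range. This is only workable in tiny dimensions; practical solvers like the one in the paper enumerate a pruned search tree instead:

```python
import itertools
import math

def closest_vector(basis, target, bound=3):
    """Brute-force CVP: try every integer combination of the basis
    vectors with coefficients in [-bound, bound] and return the
    lattice vector closest to `target` in Euclidean norm, together
    with its distance. Exponential in the lattice dimension."""
    dim = len(basis)
    best_vec, best_dist = None, math.inf
    for coeffs in itertools.product(range(-bound, bound + 1), repeat=dim):
        vec = [sum(c * basis[i][j] for i, c in enumerate(coeffs))
               for j in range(len(target))]
        dist = math.dist(vec, target)
        if dist < best_dist:
            best_vec, best_dist = vec, dist
    return best_vec, best_dist
```

On the integer lattice Z^2, the closest lattice point to (2.4, -1.3) is (2, -1), at distance 0.5; setting the target to the origin-excluded minimum instead turns the same search into an SVP routine, mirroring the CVP-to-SVP adaptation mentioned above.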

Relevance:

20.00%

Publisher:

Abstract:

A search has been performed for pair production of heavy vector-like down-type (B) quarks. The analysis explores the lepton-plus-jets final state, characterized by events with one isolated charged lepton (electron or muon), significant missing transverse momentum and multiple jets. One or more jets are required to be tagged as arising from b-quarks, and at least one pair of jets must be tagged as arising from the hadronic decay of an electroweak boson. The analysis uses the full data sample of pp collisions recorded in 2012 by the ATLAS detector at the LHC, operating at a center-of-mass energy of 8 TeV, corresponding to an integrated luminosity of 20.3 fb⁻¹. No significant excess of events is observed above the expected background. Limits are set on vector-like B production, as a function of the B branching ratios, assuming the allowable decay modes are B→Wt/Zb/Hb. In the chiral limit with a branching ratio of 100% for the decay B→Wt, the observed (expected) 95% CL lower limit on the vector-like B mass is 810 GeV (760 GeV). In the case where the vector-like B quark has branching ratio values corresponding to those of an SU(2) singlet state, the observed (expected) 95% CL lower limit on the vector-like B mass is 640 GeV (505 GeV). The same analysis, when used to investigate pair production of a colored, charge 5/3 exotic fermion T5/3, with subsequent decay T5/3→Wt, sets an observed (expected) 95% CL lower limit on the T5/3 mass of 840 GeV (780 GeV).

Relevance:

20.00%

Publisher:

Abstract:

A search for a charged Higgs boson, H±, decaying to a W± boson and a Z boson is presented. The search is based on 20.3 fb⁻¹ of proton-proton collision data at a center-of-mass energy of 8 TeV recorded with the ATLAS detector at the LHC. The H± boson is assumed to be produced via vector-boson fusion and the decays W±→qq′¯ and Z→e+e−/μ+μ− are considered. The search is performed in a range of charged Higgs boson masses from 200 to 1000 GeV. No evidence for the production of an H± boson is observed. Upper limits of 31 to 1020 fb at 95% CL are placed on the cross section for vector-boson fusion production of an H± boson times its branching fraction to W±Z. The limits are compared with predictions from the Georgi-Machacek Higgs Triplet Model.

Relevance:

20.00%

Publisher:

Abstract:

A search for pair production of vector-like quarks, both up-type (T) and down-type (B), as well as for four-top-quark production, is presented. The search is based on pp collisions at √s = 8 TeV recorded in 2012 with the ATLAS detector at the CERN Large Hadron Collider and corresponding to an integrated luminosity of 20.3 fb⁻¹. Data are analysed in the lepton-plus-jets final state, characterised by an isolated electron or muon with high transverse momentum, large missing transverse momentum and multiple jets. Dedicated analyses are performed targeting three cases: a T quark with significant branching ratio to a W boson and a b-quark (TT¯→Wb+X), and both a T quark and a B quark with significant branching ratio to a Higgs boson and a third-generation quark (TT¯→Ht+X and BB¯→Hb+X respectively). No significant excess of events above the Standard Model expectation is observed, and 95% CL lower limits are derived on the masses of the vector-like T and B quarks under several branching ratio hypotheses assuming contributions from T→Wb, Zt, Ht and B→Wt, Zb, Hb decays. The 95% CL observed lower limits on the T quark mass range between 715 GeV and 950 GeV for all possible values of the branching ratios into the three decay modes, and are the most stringent constraints to date. Additionally, the most restrictive upper bounds on four-top-quark production are set in a number of new physics scenarios.

Relevance:

20.00%

Publisher:

Abstract:

The artificial fish swarm algorithm has recently emerged in continuous global optimization. It uses points of a population in space to identify the position of fish in the school. Many real-world optimization problems are described by 0-1 multidimensional knapsack problems, which are NP-hard. In the last decades, several exact as well as heuristic methods have been proposed for solving these problems. In this paper, a new simplified binary version of the artificial fish swarm algorithm is presented, where a point/fish is represented by a binary string of 0/1 bits. Trial points are created by using crossover and mutation in the different fish behaviors, which are randomly selected using two user-defined probability values. In order to make the points feasible, the presented algorithm uses a random heuristic drop-item procedure followed by an add-item procedure aiming to increase the profit through the addition of more items to the knapsack. A cyclic reinitialization of 50% of the population, and a simple local search that allows the progress of a small percentage of points towards optimality and then refines the best point in the population, greatly improve the quality of the solutions. The presented method is tested on a set of benchmark instances and a comparison with other methods available in the literature is shown. The comparison shows that the proposed method can be an alternative for solving these problems.
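The drop-item/add-item repair step can be sketched as follows, for a single-constraint knapsack; the profit-first ordering in the add phase is an illustrative assumption, since the abstract does not specify the ordering:

```python
import random

def repair(x, weights, profits, capacity, rng=random):
    """Make a 0/1 knapsack point feasible: randomly drop selected
    items until the capacity constraint holds, then greedily add
    unselected items (most profitable first) while they still fit.
    A sketch of the drop/add procedure the paper describes."""
    x = list(x)
    load = sum(w for w, bit in zip(weights, x) if bit)
    # drop phase: remove randomly chosen selected items until feasible
    while load > capacity:
        selected = [i for i, bit in enumerate(x) if bit]
        i = rng.choice(selected)
        x[i] = 0
        load -= weights[i]
    # add phase: insert unselected items, most profitable first
    for i in sorted(range(len(x)), key=lambda i: -profits[i]):
        if not x[i] and load + weights[i] <= capacity:
            x[i] = 1
            load += weights[i]
    return x
```

Every trial point produced by the crossover/mutation behaviors would pass through this repair before its profit is evaluated.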

Relevance:

20.00%

Publisher:

Abstract:

The Firefly Algorithm is a recent swarm intelligence method, inspired by the social behavior of fireflies and based on their flashing and attraction characteristics [1, 2]. In this paper, we analyze the implementation of a dynamic penalty approach combined with the Firefly Algorithm for solving constrained global optimization problems. In order to assess the applicability and performance of the proposed method, some benchmark problems from engineering design optimization are considered.
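A dynamic penalty turns a constrained problem into an unconstrained one whose infeasibility cost grows with the iteration counter. The sketch below uses the common textbook form (c·t)^α with quadratic violation terms; these exponents are assumptions, not necessarily the values used in the paper:

```python
def dynamic_penalty(x, objective, constraints, iteration,
                    c=0.5, alpha=2.0, beta=2.0):
    """Dynamic penalty function: the total constraint violation is
    weighted by (c * t)^alpha, so infeasible points become ever more
    expensive as iteration t grows. `constraints` is a list of
    functions g_i with feasibility meaning g_i(x) <= 0."""
    violation = sum(max(0.0, g(x)) ** beta for g in constraints)
    return objective(x) + (c * iteration) ** alpha * violation
```

In the combined method, the fireflies' brightness would be ranked by this penalized value, so early iterations tolerate infeasible exploration while later ones push the swarm into the feasible region.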