8 results for many-objective problems
at Universidad de Alicante
Abstract:
In many classification problems, it is necessary to consider the specific location in an n-dimensional space from which features have been calculated. For example, considering the location of features extracted from specific areas of a two-dimensional space, such as an image, could improve the understanding of a scene for a video surveillance system. In the same way, identical features extracted from different locations could denote different actions for a 3D HCI system. In this paper, we present a self-organizing feature map able to preserve the topology of the locations of the n-dimensional space from which the feature vectors have been extracted. The main contribution is to implicitly preserve the topology of the original space, since considering the locations of the extracted features and their topology can ease the solution of certain problems. Specifically, the paper proposes the n-dimensional constrained self-organizing map preserving the input topology (nD-SOM-PINT). Features in adjacent areas of the n-dimensional space used to extract the feature vectors lie explicitly in adjacent areas of the nD-SOM-PINT, constraining the neural network structure and learning. As a case study, the neural network has been instantiated to represent and classify features, such as trajectories extracted from a sequence of images, at a high level of semantic understanding. Experiments have been thoroughly carried out using the CAVIAR datasets (Corridor, Frontal and Inria), taking into account the global behaviour of an individual, in order to validate the ability to preserve the topology of the two-dimensional space and to obtain high-performance trajectory classification, in contrast to approaches that ignore the location of features. Moreover, a brief example has been included to validate the nD-SOM-PINT proposal in a domain other than individual trajectories.
Results confirm the high accuracy of the nD-SOM-PINT, which outperforms previous methods aimed at classifying the same datasets.
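The self-organizing map machinery underlying this proposal can be sketched as follows. This is a minimal, generic 2-D SOM in Python with NumPy, not the nD-SOM-PINT itself (the location constraints on structure and learning are not reproduced here); grid sizes and decay schedules are illustrative assumptions:

```python
import numpy as np

def train_som(data, grid_h=4, grid_w=4, epochs=50, lr0=0.5, sigma0=1.5, seed=0):
    """Train a plain 2-D SOM; the Gaussian neighbourhood on the unit grid
    is what makes the map topology-preserving."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates of every unit, used by the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                  indexing="ij"), axis=-1)
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1 - frac)               # learning rate decays linearly
        sigma = sigma0 * (1 - frac) + 1e-3  # neighbourhood radius shrinks
        for x in rng.permutation(data):
            # Best-matching unit (BMU): closest weight vector to the input.
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Gaussian neighbourhood around the BMU on the grid.
            grid_d2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-grid_d2 / (2 * sigma ** 2))
            # Pull every unit toward x, weighted by grid proximity to the BMU.
            weights += lr * h[..., None] * (x - weights)
    return weights

# Illustrative usage on random 2-D inputs:
# weights = train_som(np.random.default_rng(0).random((100, 2)))
```

Because updates are shared with grid neighbours of the winning unit, nearby units end up representing nearby regions of the input space, which is the property the nD-SOM-PINT additionally enforces with respect to feature locations.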
Abstract:
Background: in both Spain and Italy the number of immigrants has increased strongly in the last 20 years, currently representing more than 10% of the workforce in each country. The segregation of immigrants into unskilled or risky jobs brings negative consequences for their health. The objective of this study is to compare the prevalence of work-related health problems between immigrant and native workers in Italy and Spain. Methods: data come from the Italian Labour Force Survey (n=65 779) and the Spanish Working Conditions Survey (n=11 019), both conducted in 2007. We analyzed merged datasets to evaluate whether interviewees, both natives and migrants, judged their health to be affected by their working conditions and, if so, by which specific diseases. As migrants, we considered those coming from countries with a Human Development Index lower than 0.85. Logistic regression models were used, including gender, age, and education as adjusting factors. Results: migrants reported skin diseases (Mantel-Haenszel pooled OR=1.49; 95%CI: 0.59-3.74) and, among those employed in the agricultural sector, musculoskeletal problems (Mantel-Haenszel pooled OR=1.16; 95%CI: 0.69-1.96) more frequently than natives; country-specific analysis showed higher risks of musculoskeletal problems among migrants compared to the non-migrant population in Italy (OR=1.17; 95%CI: 0.48-1.59) and of respiratory problems in Spain (OR=2.02; 95%CI: 1.02-4.0). In both countries the risk of psychological stress was predominant among national workers. Conclusions: this collaborative study strengthens the evidence concerning the health of migrant workers in Southern European countries.
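The Mantel-Haenszel pooled odds ratio used above is simple to compute from stratified 2x2 tables; a minimal sketch in Python, where the example counts are purely hypothetical and not the study's data:

```python
def mh_pooled_or(strata):
    """Mantel-Haenszel pooled odds ratio over 2x2 tables.

    Each stratum is (a, b, c, d):
      a = exposed cases,   b = exposed non-cases,
      c = unexposed cases, d = unexposed non-cases.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Hypothetical per-country strata (migrant vs native, illustrative only):
italy = (120, 880, 700, 9300)
spain = (30, 170, 250, 2550)
pooled = mh_pooled_or([italy, spain])
```

With a single stratum the formula reduces to the familiar cross-product ratio ad/bc; pooling weights each stratum's contribution by its size, which is why the two country surveys can be combined.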
Abstract:
The design of fault-tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques, based on the selective application of redundancy, have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions provided by these techniques, and the costly process required to assess their reliability, make design space exploration a very difficult and time-consuming task. This paper proposes the integration of a multi-objective optimization tool with a software hardening environment to perform an automatic design space exploration in search of the best trade-offs between reliability, cost, and performance. The first tool is commanded by a genetic algorithm that can simultaneously fulfill many design goals thanks to the use of the NSGA-II multi-objective algorithm. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault coverage estimations. The advantages of our proposal are illustrated by means of a complex and detailed case study involving a typical embedded application, the AES (Advanced Encryption Standard).
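A core ingredient of NSGA-II is sorting candidate solutions into successive Pareto fronts. A minimal, non-optimized sketch in Python for minimization objectives (for instance cost, overhead, and one minus fault coverage); NSGA-II proper uses a more efficient bookkeeping scheme, and the crowding-distance step is omitted here:

```python
def dominates(p, q):
    """True if p Pareto-dominates q (minimization): no worse in every
    objective and strictly better in at least one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def nondominated_fronts(points):
    """Split indices of objective vectors into successive Pareto fronts.
    Front 0 holds the non-dominated solutions, front 1 those dominated
    only by front 0, and so on."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts
```

The first front is exactly the set of best available trade-offs; the exploration described above would present such a front of hardened program versions to the designer.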
Abstract:
Modern compilers present a great and ever-increasing number of options that can modify the features and behavior of a compiled program. Many of these options are often wasted because exploiting them requires comprehensive knowledge of both the underlying architecture and the internal processes of the compiler. In this context, it is common not to have a single design goal but rather a complex set of objectives. In addition, the dependencies between different goals are difficult to infer a priori. This paper proposes a strategy for tuning the compilation of any given application. This is accomplished by automatically varying the compilation options by means of multi-objective optimization and evolutionary computation commanded by the NSGA-II algorithm. This makes it possible to find compilation options that simultaneously optimize different objectives. The advantages of our proposal are illustrated by means of a case study based on the well-known Apache web server. Our strategy has demonstrated an ability to find improvements of up to 7.5% in context switches and up to 27% in L2 cache misses, and it also discovers the most important bottlenecks involved in the application performance.
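The search space of such a strategy can be encoded as a bit vector over compiler options; a minimal sketch in Python, where the flag pool and mutation rate are illustrative assumptions (a real setup would compile and benchmark the application for every genome instead of operating on the encoding alone):

```python
import random

# Illustrative GCC-style option pool (assumed, not the paper's actual set).
FLAGS = ["-funroll-loops", "-fomit-frame-pointer",
         "-ftree-vectorize", "-fno-inline"]

def genome_to_flags(genome):
    """Decode a 0/1 genome into the list of enabled compiler options."""
    return [flag for bit, flag in zip(genome, FLAGS) if bit]

def random_genome(rng):
    """Random starting individual: one bit per candidate option."""
    return [rng.randint(0, 1) for _ in FLAGS]

def mutate(genome, rng, p=0.25):
    """Bit-flip mutation, the basic variation operator a GA would apply."""
    return [bit ^ (1 if rng.random() < p else 0) for bit in genome]
```

Each genome would be evaluated on several objectives at once (for instance context switches and L2 cache misses), and NSGA-II would evolve the population toward the Pareto front of those measurements.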
Abstract:
Feature selection is an important and active issue in clustering and classification problems. Choosing an adequate feature subset reduces the dataset dimensionality, thus decreasing the computational complexity of classification and improving classifier performance by avoiding redundant or irrelevant features. Although feature selection can be formally defined as an optimisation problem with only one objective, namely the classification accuracy obtained by using the selected feature subset, in recent years some multi-objective approaches to this problem have been proposed. These either select features that improve not only the classification accuracy but also the generalisation capability, in the case of supervised classifiers, or counterbalance the bias toward lower or higher numbers of features exhibited by some methods used to validate the clustering/classification, in the case of unsupervised classifiers. The main contribution of this paper is a multi-objective approach to feature selection and its application to an unsupervised clustering procedure based on Growing Hierarchical Self-Organising Maps (GHSOMs) that includes a new method for unit labelling and efficient determination of the winning unit. In the network anomaly detection problem considered here, this multi-objective approach makes it possible not only to differentiate between normal and anomalous traffic but also among different anomalies. The efficiency of our proposals has been evaluated by using the well-known DARPA/NSL-KDD datasets, which contain extracted features and labelled attacks from around 2 million connections. The selected feature sets computed in our experiments provide detection rates up to 99.8% with normal traffic and up to 99.6% with anomalous traffic, as well as accuracy values up to 99.12%.
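The detection rates and accuracy reported above follow the usual confusion-matrix definitions; a minimal sketch in Python (the counts in the test are hypothetical, not the paper's results):

```python
def detection_metrics(tp, fp, tn, fn):
    """Standard intrusion-detection metrics from a binary confusion matrix.

    tp: anomalies flagged as anomalous,  fn: anomalies missed,
    tn: normal traffic passed,           fp: normal traffic flagged.
    """
    detection_rate = tp / (tp + fn)            # recall on the anomalous class
    false_alarm_rate = fp / (fp + tn)          # fraction of normal traffic flagged
    accuracy = (tp + tn) / (tp + fp + tn + fn) # overall correct decisions
    return detection_rate, false_alarm_rate, accuracy
```

In a multi-objective setting, a feature subset would be scored on several of these at once (for example maximizing the detection rate while minimizing the number of selected features), which is exactly where the Pareto-based formulation pays off.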
Abstract:
In this paper we examine multi-objective linear programming problems in the face of data uncertainty in both the objective function and the constraints. First, we derive a formula for the radius of robust feasibility guaranteeing constraint feasibility for all possible scenarios within a specified uncertainty set under affine data parametrization. We then present numerically tractable optimality conditions for minmax robust weakly efficient solutions, i.e., the weakly efficient solutions of the robust counterpart. We also consider highly robust weakly efficient solutions, i.e., robust feasible solutions which are weakly efficient for any possible instance of the objective matrix within a specified uncertainty set, providing lower bounds for the radius of highly robust efficiency guaranteeing the existence of solutions of this type under affine and rank-1 objective data uncertainty. Finally, we provide numerically tractable optimality conditions for highly robust weakly efficient solutions.
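In generic notation (illustrative only; the symbols and the exact shape of the uncertainty sets are assumptions, not the paper's formulation), the uncertain multi-objective LP reads:

```latex
% Uncertain multi-objective LP with objective matrix C = (c_1,...,c_p)^T
% and constraint data (a_j, b_j) ranging over uncertainty sets U_j:
\begin{aligned}
\operatorname{Min}\ & Cx = \bigl(c_1^{\top}x,\dots,c_p^{\top}x\bigr)\\
\text{s.t.}\ & a_j^{\top}x \le b_j \quad \forall (a_j,b_j)\in\mathcal{U}_j,\ j=1,\dots,m.
\end{aligned}
```

Under affine parametrization the data take the form \((a_j,b_j)=(\bar a_j,\bar b_j)+\sum_k u_k\,(a_j^k,b_j^k)\) with \(\|u\|\le\alpha\), and the radius of robust feasibility is the supremum of the values \(\alpha\) for which the robust feasible set remains nonempty.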
Abstract:
Convex vector (or multi-objective) semi-infinite optimization deals with the simultaneous minimization of finitely many convex scalar functions subject to infinitely many convex constraints. This paper provides characterizations of the weakly efficient, efficient and properly efficient points in terms of cones involving the data and Karush–Kuhn–Tucker conditions. The latter characterizations rely on different local and global constraint qualifications. The results in this paper generalize those obtained by the same authors on linear vector semi-infinite optimization problems.
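KKT-type characterizations in this convex setting typically take the following shape (a generic sketch under a suitable constraint qualification, not the paper's exact statement). A feasible point \(\bar{x}\) is weakly efficient when there exist multipliers such that

```latex
0 \in \sum_{i=1}^{p} \lambda_i\, \partial f_i(\bar{x})
    + \sum_{t \in T(\bar{x})} \mu_t\, \partial g_t(\bar{x}),
\qquad \lambda_i \ge 0,\ \ \sum_{i=1}^{p} \lambda_i = 1,\ \ \mu_t \ge 0,
```

where \(T(\bar{x})\) is a finite subset of active constraint indices (those with \(g_t(\bar{x})=0\)) drawn from the infinite index set, and \(\partial\) denotes the convex subdifferential. The constraint qualifications mentioned above govern when such multipliers are guaranteed to exist.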
Abstract:
Natural stone has been a popular and reliable building material throughout history, appearing in many historic monuments and in more recent buildings. Research into the intrinsic properties of specific stones is important because it gives us a greater understanding of the factors that limit and act on them. This can help prevent serious problems from occurring in our buildings, bringing both esthetic benefits and financial savings. To this end, the main objective of this research has been to study the influence of the fabric and the mineral composition of two types of sandstone on their durability. The first is a red continental sandstone from the Buntsandstein Age called “Molinaza Roja”, which is quarried in Montoro (Cordoba). The second is quarried in Ronda (Malaga) and is sold under the trade name of “Arenisca Ronda”. It is a light pinkish-white calcarenite deposited from the Late Tortonian to the Late Messinian. We characterized their petrological and petrophysical properties by studying their rock fabrics, porous systems and mechanical properties. In order to obtain a complete picture of the behavior of their rock fabrics, we also carried out two decay tests, the salt crystallization and freeze–thaw tests. We then measured the effects on the textures of the altered samples during and after the decay tests, and we evaluated the changes in the porous system. By comparing the results between intact and altered samples, we found that Arenisca Ronda is less durable because it has a high quantity of expandable clays (smectites) and a high percentage of pores in the 0.1–1 μm range, in which the pressure produced by salt crystallization is strongest. In Molinaza Roja the decay agents caused significant sanding due to loss of cohesion between the clasts, especially during the salt crystallization test.
In both stones, the anisotropies (oriented textures) play an important role in their hydric and dynamic behavior and also affect their mechanical properties (especially their compressive strength). No changes in color were detected.
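The crystallization pressure mentioned above is commonly estimated with Correns' equation, quoted here for orientation (the symbols are generic, not measured values from this study):

```latex
% Correns' equation: pressure exerted by a crystal growing in a confined pore
P = \frac{R\,T}{V_m}\,\ln\!\left(\frac{C}{C_s}\right),
```

where \(R\) is the gas constant, \(T\) the absolute temperature, \(V_m\) the molar volume of the crystallizing salt, and \(C/C_s\) the supersaturation ratio of the pore solution. Small pores sustain higher supersaturation before growth is relieved, which is consistent with the damage concentrating in the 0.1–1 μm pore range.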