942 results for Hybrid Methods
Abstract:
This research is a step forward in improving the accuracy of anomaly detection in a data graph representing connectivity between people in an online social network. The proposed hybrid methods are based on fuzzy machine learning techniques utilising different types of structural input features. The methods are presented within a multi-layered framework which provides the full set of requirements for finding anomalies in data graphs generated from online social networks, including data modelling and analysis, labelling, and evaluation.
Abstract:
Objective: Smoking prevalence among Vietnamese men is among the highest in the world. Our aim was to provide estimates of tobacco-attributable mortality to support tobacco control policies. Method: We used the Peto–Lopez method, which uses lung cancer mortality to derive a Smoking Impact Ratio (SIR) as a marker of cumulative exposure to smoking. SIRs were applied to relative risks from the Cancer Prevention Study, Phase II. Prevalence-based and hybrid methods, using the SIR for cancers and chronic obstructive pulmonary disease and smoking prevalence for all other outcomes, were used in sensitivity analyses. Results: When lung cancer was used to measure cumulative smoking exposure, 28% (95% uncertainty interval 24–31%) of all adult male deaths (> 35 years) in Vietnam in 2008 were attributable to smoking. Lower estimates resulted from prevalence-based methods [24% (95% uncertainty interval 21–26%)], with the hybrid method yielding intermediate estimates [26% (95% uncertainty interval 23–28%)]. Conclusion: Despite uncertainty in these estimates of attributable mortality, tobacco smoking is already a major risk factor for death in Vietnamese men. Given the high current prevalence of smoking, this has important implications not only for preventing the uptake of tobacco but also for immediate action to adopt and enforce stronger tobacco control measures.
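As a rough illustration of the hybrid logic described above, the Peto–Lopez step can be sketched as follows. The function names and all numbers are illustrative assumptions, not the study's data: the SIR rescales observed lung-cancer mortality against reference rates from CPS-II, and the result is used in place of smoking prevalence in Levin's attributable-fraction formula.

```python
# Hedged sketch of the Peto-Lopez attributable-fraction calculation.
# All rates below are hypothetical (deaths per 100,000), not the study's data.

def smoking_impact_ratio(observed_lc, never_smoker_lc, smoker_lc_cps2):
    """SIR: observed excess lung-cancer mortality, expressed as a fraction of
    the smoker vs. never-smoker lung-cancer gap in the CPS-II reference."""
    return (observed_lc - never_smoker_lc) / (smoker_lc_cps2 - never_smoker_lc)

def attributable_fraction(exposure, rr):
    """Levin's population attributable fraction for exposure level p and
    relative risk RR: AF = p(RR - 1) / (p(RR - 1) + 1)."""
    return exposure * (rr - 1) / (exposure * (rr - 1) + 1)

# SIR replaces smoking prevalence as the cumulative-exposure marker.
sir = smoking_impact_ratio(observed_lc=80.0, never_smoker_lc=15.0,
                           smoker_lc_cps2=250.0)
af = attributable_fraction(sir, rr=2.0)  # RR for some smoking-related outcome
```

In the hybrid variant described in the abstract, the SIR would feed this formula for cancers and COPD, while surveyed smoking prevalence would be used for all other outcomes.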
Abstract:
Hybrid methods based on the Reynolds Averaged Navier Stokes (RANS) equations and the Large Eddy Simulation (LES) formulation are investigated to try and improve the accuracy of heat transfer and surface temperature predictions for electronics systems and components. Two relatively low Reynolds number flows are studied using hybrid RANS-LES, RANS-Implicit-LES (RANS-ILES) and non-linear LES models. Predictions using these methods are in good agreement with each other, even using different grid resolutions. © 2008 IEEE.
Abstract:
Wheat stripe rust (Puccinia striiformis f. sp. tritici) is a worldwide wheat disease that can cause yield losses of more than 30 percent, or even total crop failure. The disease is severe in China's southwest and in North China, and Sichuan province is one of the most heavily affected regions, suffering large wheat yield losses whenever stripe rust epidemics occur.
The use of resistant varieties is the safest and most economical way to control wheat stripe rust. It is therefore essential to identify new disease-resistance genes and to carry out genetic research on disease resistance. Aegilops ventricosa (DDMvMv, 2n = 28) is an annual herbaceous plant originating in the coastal areas of the western Mediterranean. It carries valuable traits such as high resistance or immunity to wheat powdery mildew and rusts, salt tolerance, cold tolerance and high protein content, making it a good germplasm resource for wheat breeding. In this study, the wheat-Aegilops ventricosa 6Mv/6B substitution line Moisson 6Mv/6B (highly resistant to wheat stripe rust) was used to study three questions: the transmission of the Aegilops ventricosa chromosome 6Mv in different genetic backgrounds of Sichuan wheat varieties; its hybridization with the wheat-Haynaldia villosa ditelosomic addition line Pana (highly resistant to powdery mildew); and the screening of wheat-Aegilops ventricosa translocation lines by exposing Moisson 6Mv/6B to ionizing radiation. The main results are as follows: 1. Moisson 6Mv/6B was crossed with the Sichuan wheat varieties Mianyang 26, Mianyang 93-124 and SW3243 (all highly susceptible to stripe rust). The F1 hybrids were then backcrossed, as male and as female, to the corresponding wheat varieties. The seed-setting rate, chromosome configuration at mitotic metaphase in root-tip cells, and stripe rust resistance of the resulting BC1 and F2 plants were investigated. The average seed-setting rate of backcrosses with the 6Mv-carrying F1 as the female parent (83.10%) was higher than with it as the male parent (48.61%), and the seed-setting rate was closely associated with the wheat genotype (χ² = 34.15 ≫ χ²(0.05, df = 2) = 5.99). In all analyzed populations, the transmission frequency of chromosome 6Mv was not correlated with whether 6Mv was transmitted through the male or the female; however, it was significantly correlated with the Sichuan wheat genotype (χ² = 6.42 > χ²(0.05, df = 2) = 5.99). 2.
To combine the resistances to stripe rust and powdery mildew, and to study the resistance reactions in different genetic backgrounds, Moisson 6Mv/6B was reciprocally hybridized with the wheat-Haynaldia villosa ditelosomic addition line Pana (highly resistant to powdery mildew). The seed-setting rate, agronomic traits, genomic in situ hybridization (GISH) of the hybrid progenies, and resistances to stripe rust and powdery mildew were investigated. The results showed that the seed-setting rate of hybridization with Moisson 6Mv/6B as the female parent (80.56%) was significantly higher than with Pana as the female parent (58.33%); the seed-setting rate was associated with the direction of the cross (χ² = 4.96 > χ²(0.05, df = 1) = 3.84). The hybrid progenies were about 10 cm taller than Pana, the taller parent, and matured about two weeks earlier than both parents. Among the hybrid progenies, plants carrying the 6Mv chromosome were resistant to stripe rust, and plants carrying the telosome from Haynaldia villosa were resistant to powdery mildew. Four plants carrying both the 6Mv chromosome and the Haynaldia villosa telosome were resistant to both stripe rust and powdery mildew, indicating that the two resistances had been combined without mutual inhibition, and implying that pyramiding the two resistance genes can provide new wheat germplasm resistant to both diseases. 3. Adult Moisson 6Mv/6B plants at meiosis were irradiated with 60Co γ-rays at a total dose of 6 Gy (120 rad/min). The agronomic traits, chromosome configuration at mitotic metaphase in root-tip cells, and stripe rust resistance of the selfed progeny of the irradiated plants were investigated. The seed-setting rate of the irradiated plants was only 2.22%, clear chromosome fragments were observed at mitotic metaphase, and the progeny of the irradiated plants showed adult-plant resistance to stripe rust.
The wheat-Aegilops ventricosa 6Mv/6B substitution line is a good donor of stripe rust resistance owing to its stable resistance. Our study demonstrated that the key to using this resistance in a Sichuan wheat background is to choose a good recipient genotype; provided the seed number meets the requirement, Moisson 6Mv/6B can be used either as the female or as the male parent. Moreover, the stripe rust resistance of Moisson 6Mv/6B showed no mutual inhibition with the powdery mildew resistance from Haynaldia villosa, so the two resistances can be effectively combined in wheat breeding.
Abstract:
In this PhD by Publication I revisit and contextualize art works and essays I collaboratively created under the name Flow Motion between 2004 and 2013, in order to generate new insights into the contributions they have made to diverse and emerging fields of contemporary arts practice/research, including digital, virtual, sonic and interdisciplinary art. The works discussed comprise the digital multimedia installation and sound art performance Astro Black Morphologies/Astro Dub Morphologies (2004-5), the sound installation and performance Invisible (2006-7), the web art archive and performance presentation project promised lands (2008-10), and two related texts, Astro Black Morphologies: Music and Science Lovers (2004) and Music and Migration (2013). I show how these works map new thematic constellations around questions of space and diaspora, music and cosmology, invisibility and spectrality, the body and perception. I also show how the works generate new connections between and across contemporary avant-garde, experimental and popular music, and visual art and cinema traditions. I describe the methodological design, approaches and processes through which the works were produced, with an emphasis on transversality, deconstruction and contemporary black music forms as key tools in my collaborative artistic and textual practice. I discuss how, through the development of methods of data translation and transformation, and distinctive visual approaches for the re-elaboration of archival material, the works produced multiple readings of scientific narratives, of digital X-ray data derived from astronomical research on black holes and dark energy, and of musical, photographic and textual material related to historical and contemporary accounts of migration. I also elaborate on the relation between difference and repetition, the concepts of multiplicity and translation, and the processes of collective creation which characterize my/Flow Motion's work.
The art works and essays I engage with in this commentary produce an idea of contemporary art as the result of a fluid, open and mutating assemblage of diverse and hybrid methods and mediums, and as an embodiment of a cross-cultural, transversal and transdisciplinary knowledge shaped by research, process, creative dialogues, collaborative practice and collective signature.
Abstract:
This thesis addresses the active reconstruction of 3D models using a camera and a projector. Standard reconstruction methods use coded-light patterns, which have their strengths and weaknesses. We introduce new patterns based on unstructured light in order to overcome the shortcomings of existing methods. The work presented revolves around three axes: robustness, accuracy, and finally the comparison of unstructured-light patterns with other methods. Unstructured-light patterns are distinguished first by their robustness to interreflections and to depth discontinuities. They are designed to homogenize the amount of indirect illumination caused by projection onto difficult surfaces. In return, matching the projected and captured images is more complex than with so-called structured methods. An efficient probabilistic matching method is proposed to solve this problem. Another important aspect of unstructured-light reconstruction is the ability to recover subpixel correspondences, that is, at a level of precision finer than the pixel. We present a method for generating codes of very great length from unstructured-light patterns. These codes have the double advantage of allowing the extraction of more precise correspondences while requiring fewer images. This contribution places our method among the best in terms of accuracy while guaranteeing very good robustness. Finally, the last part of this thesis focuses on the comparison of existing methods, in particular on the relationship between the number of projected images and the quality of the reconstruction.
Although some methods require a constant number of images, others, such as ours, can make do with fewer at the cost of lower quality. We propose a simple method for establishing an optimal correspondence that can serve as a reference for comparison purposes. Finally, we present hybrid methods that give very good results with few images.
Abstract:
The electronic structure of an isolated oxygen vacancy in SrTiO3 has been investigated with a variety of ab initio quantum mechanical approaches. In particular we compared pure density functional theory (DFT) approaches with the Hartree-Fock method, and with hybrid methods where the exchange term is treated in a mixed way. Both local cluster models and periodic calculations with large supercells containing up to 80 atoms have been performed. Both diamagnetic (singlet state) and paramagnetic (triplet state) solutions have been considered. We found that the formation of an O vacancy is accompanied by the transfer of two electrons to the 3d(z²) orbitals of the two Ti atoms along the Ti-Vac-Ti axis. The two electrons are spin coupled and the ground state is diamagnetic. New states associated with the defect center appear in the gap just below the conduction band edge. The formation energy computed with respect to an isolated oxygen atom in the triplet state is 9.4 eV.
Abstract:
Recommender systems attempt to predict items in which a user might be interested, given some information about the user's and items' profiles. Most existing recommender systems use content-based or collaborative filtering methods, or hybrid methods that combine both techniques (see the sidebar for more details). We created Informed Recommender to address the problem of using consumer opinions about products, expressed online in free-form text, to generate product recommendations. Informed Recommender uses prioritized consumer product reviews to make recommendations. Using text-mining techniques, it automatically maps each piece of each review comment into an ontology.
Abstract:
Optimization techniques known as metaheuristics have achieved success in solving many problems classified as NP-hard. These methods use non-deterministic approaches that reach very good solutions which, however, do not guarantee finding the global optimum. Beyond the inherent difficulties related to the complexity that characterizes optimization problems, metaheuristics still face the exploration/exploitation dilemma, which consists of choosing between a greedy search and a wider exploration of the solution space. One way to guide such algorithms in the search for better solutions is to supply them with more knowledge of the problem through an intelligent agent able to recognize promising regions and to identify when the direction of the search should be diversified. Accordingly, this work proposes the use of a Reinforcement Learning technique - the Q-learning algorithm - as an exploration/exploitation strategy for the metaheuristics GRASP (Greedy Randomized Adaptive Search Procedure) and Genetic Algorithm. The GRASP metaheuristic uses Q-learning instead of the traditional greedy-randomized algorithm in the construction phase. This replacement aims to improve the quality of the initial solutions used in the local search phase of GRASP, and also provides the metaheuristic with an adaptive memory mechanism that allows good previous decisions to be reused and the repetition of bad decisions to be avoided. In the Genetic Algorithm, the Q-learning algorithm was used to generate an initial population of high fitness and, after a given number of generations in which the diversity rate of the population falls below a certain limit L, also to supply one of the parents used in the genetic crossover operator. Another significant change in the hybrid genetic algorithm is the proposal of a mutually interactive cooperation process between the genetic operators and the Q-learning algorithm.
In this interactive/cooperative process, the Q-learning algorithm receives an additional update to the matrix of Q-values based on the current best solution of the Genetic Algorithm. The computational experiments presented in this thesis compare the results obtained with traditional implementations of the GRASP metaheuristic and the Genetic Algorithm against those obtained using the proposed hybrid methods. Both algorithms were successfully applied to the symmetric Traveling Salesman Problem, which was modeled as a Markov decision process.
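The core idea of replacing GRASP's greedy-randomized constructor with a Q-learning policy can be sketched on a toy symmetric TSP. This is a minimal illustration under assumed details (the tiny instance, the negative-distance reward, and all parameter values are my assumptions, not the thesis's implementation):

```python
import random

# Sketch: an epsilon-greedy Q-learning construction phase for GRASP on a toy
# symmetric TSP. Repeated constructions refine the Q-table, acting as the
# adaptive memory described in the abstract. All details are illustrative.

def construct_tour(dist, Q, alpha=0.5, gamma=0.9, eps=0.2, rng=random.Random(0)):
    n = len(dist)
    tour = [0]
    unvisited = set(range(1, n))
    while unvisited:
        s = tour[-1]
        if rng.random() < eps:                      # explore a random city
            a = rng.choice(sorted(unvisited))
        else:                                        # exploit the best Q-value
            a = max(unvisited, key=lambda c: Q[s][c])
        reward = -dist[s][a]                         # shorter edges score higher
        nxt = max(unvisited - {a}, key=lambda c: Q[a][c], default=None)
        target = reward + (gamma * Q[a][nxt] if nxt is not None else 0.0)
        Q[s][a] += alpha * (target - Q[s][a])        # standard Q-learning update
        tour.append(a)
        unvisited.remove(a)
    return tour

def tour_length(dist, tour):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
Q = [[0.0] * 4 for _ in range(4)]
best = None
for _ in range(50):                                  # repeated constructions
    t = construct_tour(dist, Q)
    if best is None or tour_length(dist, t) < tour_length(dist, best):
        best = t
```

In a full GRASP, each constructed tour would then be passed to a local search phase (e.g. 2-opt), and the Q-table would persist across iterations as the reusable memory of good decisions.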
Abstract:
Metaheuristic techniques are known to solve optimization problems classified as NP-complete and are successful in obtaining good-quality solutions. They use non-deterministic approaches to generate solutions that are close to the optimum, without the guarantee of finding the global optimum. Motivated by the difficulties in solving these problems, this work proposes the development of parallel hybrid methods combining reinforcement learning with the metaheuristics GRASP and Genetic Algorithms. With these techniques, we aim to improve the efficiency of obtaining good solutions. Instead of using the Q-learning reinforcement learning algorithm merely as a technique for generating the initial solutions of the metaheuristics, we use it in cooperative and competitive approaches with the Genetic Algorithm and GRASP, in a parallel implementation. In this context, it was possible to verify that the implementations in this study showed satisfactory results under both strategies, that is, cooperation and competition between the algorithms as well as cooperation and competition between groups. In some instances the global optimum was found; in others the implementations came close to it. A performance analysis of the proposed approach was carried out, and it shows good performance on the measures that demonstrate the efficiency and speedup (gain in speed from parallel processing) of the implementations.
Abstract:
Graduate Program in Electrical Engineering - FEB
Abstract:
Graduate Program in Molecular Biophysics - IBILCE
Abstract:
Graduate Program in Molecular Biophysics - IBILCE
Abstract:
Support Vector Machines (SVMs) have achieved very good performance on different learning problems. However, the success of SVMs depends on the adequate choice of the values of a number of parameters (e.g., the kernel and regularization parameters). In the current work, we propose the combination of meta-learning and search algorithms to deal with the problem of SVM parameter selection. In this combination, given a new problem to be solved, meta-learning is employed to recommend SVM parameter values based on parameter configurations that have been successfully adopted in previous similar problems. The parameter values returned by meta-learning are then used as initial search points by a search technique, which further explores the parameter space. The expectation is that the initial solutions provided by meta-learning lie in good regions of the search space (i.e., closer to the optimal solutions), so the search algorithm needs to evaluate fewer candidate solutions when looking for an adequate one. In this work, we investigate the combination of meta-learning with two search algorithms: Particle Swarm Optimization and Tabu Search. The implemented hybrid algorithms were used to select the values of two SVM parameters in the regression domain. These combinations were compared with the use of the search algorithms without meta-learning. The experimental results on a set of 40 regression problems showed that, on average, the proposed hybrid methods obtained lower error rates than their components applied in isolation.
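The warm-start idea above can be sketched in a few lines. Everything here is a hypothetical stand-in: the quadratic surface plays the role of SVM cross-validation error over two parameters (e.g. log C and log gamma), and the "recommended" points play the role of configurations retrieved by meta-learning from similar past datasets.

```python
import random

# Sketch (assumed names and data): a tiny Particle Swarm Optimization run
# warm-started from meta-learning recommendations instead of random points.

def error_surface(c, g):
    # Hypothetical smooth validation-error surface with its optimum at (2, -3);
    # in practice this value would come from cross-validating an SVM.
    return (c - 2.0) ** 2 + (g + 3.0) ** 2

def pso(start_points, iters=60, w=0.7, c1=1.4, c2=1.4, rng=random.Random(1)):
    pos = [list(p) for p in start_points]
    vel = [[0.0, 0.0] for _ in start_points]
    pbest = [p[:] for p in pos]                     # per-particle best positions
    gbest = min(pos, key=lambda p: error_surface(*p))[:]
    for _ in range(iters):
        for i, p in enumerate(pos):
            for d in range(2):                      # standard velocity update
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - p[d])
                             + c2 * rng.random() * (gbest[d] - p[d]))
                p[d] += vel[i][d]
            if error_surface(*p) < error_surface(*pbest[i]):
                pbest[i] = p[:]
            if error_surface(*p) < error_surface(*gbest):
                gbest = p[:]
    return gbest

# Meta-learning step (simulated): reuse configurations that worked on
# "similar" past datasets as the swarm's starting points.
recommended = [(1.5, -2.5), (2.5, -3.5), (1.0, -3.0)]
best_c, best_g = pso(recommended)
```

Because the swarm starts near a good region, it typically needs far fewer error-surface evaluations than a randomly initialized search, which is the efficiency argument made in the abstract.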
Abstract:
Safety assessment of historic masonry structures is an open problem. The material is heterogeneous and anisotropic, the pre-existing state of stress is hard to know, and the boundary conditions are uncertain. In the early 1950s it was proven that limit analysis was applicable to this kind of structure, and it has been considered a suitable tool since then. In cases where no sliding occurs, the application of the standard limit analysis theorems constitutes an excellent tool due to its simplicity and robustness. It is not necessary to know the actual stress state: it is enough to find any equilibrium solution that satisfies the limit conditions of the material, in the certainty that its load will be equal to or less than the actual load at the onset of collapse. Furthermore, this onset-of-collapse load is unique (uniqueness theorem), and it can be obtained as the optimum of either of a pair of dual convex mathematical programs. However, if the mechanisms at the onset of collapse involve sliding, any solution must satisfy both static and kinematic constraints, as well as a special kind of disjunctive constraints linking the two, which can be formulated as complementarity constraints. In the latter case the existence of a single solution is not guaranteed, so other methods are needed to treat the uncertainty associated with its multiplicity. In recent years, research has focused on finding an absolute minimum below which collapse is impossible. This method is easy to set up from a mathematical point of view, but computationally intractable, due to the complementarity constraints 0 ≤ y ⊥ z ≥ 0, which are neither convex nor smooth. The resulting decision problem is Nondeterministic Polynomial-complete (NP-complete), and the corresponding global optimization problem is NP-hard. Nevertheless, obtaining a solution (success is not guaranteed) is an affordable problem. This thesis proposes to solve the problem through Sequential Linear Programming, taking advantage of the special characteristics of the complementarity constraints, which written in bilinear form are y·z = 0, y ≥ 0, z ≥ 0, and of the fact that the complementarity error (in bilinear form) is an exact penalty function. But when it comes to finding the worst solution, the equivalent global optimization problem is intractable (NP-hard). Furthermore, until a maximum or minimum principle is demonstrated, it is questionable whether the effort spent in approximating this minimum is justified. In Chapter 5, it is proposed to find the frequency distribution of the load factor over all possible onset-of-collapse solutions for a simple example. For this purpose, solutions are sampled by the Monte Carlo method, using an exact polytope-computation method as a check. The ultimate goal is to determine to what extent the search for the global minimum is justified, and to propose an alternative approach to safety assessment based on probabilities. The frequency distributions of the load factors obtained for the case study show that both the maximum and the minimum load factors are very infrequent, all the more so the more perfect and continuous the contact is. The results confirm the interest of developing new probabilistic methods. In Chapter 6, such a method is proposed, based on obtaining multiple solutions from random starting points and qualifying the results through Order Statistics. The purpose is to determine, for each solution, the probability of the onset of collapse. The method is applied (following the expectation reduction proposed by Ordinal Optimization) to obtain a solution that lies within a given percentage of the worst solutions. Finally, in Chapter 7, hybrid methods incorporating metaheuristics are proposed for cases in which the search for the global minimum is justified.
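The complementarity conditions referred to in this abstract have a compact standard form. Written out in generic notation (my symbols, not necessarily the thesis's own), with y standing for the static variables and z for the kinematic ones:

```latex
% Complementarity (disjunctive) constraints:
% y and z are componentwise nonnegative and orthogonal,
y \ge 0, \qquad z \ge 0, \qquad y^{\mathsf{T}} z = 0,
% which is often abbreviated as
0 \le y \;\perp\; z \ge 0.
```

It is the bilinear term y^T z = 0 that destroys both convexity and smoothness, which is why the thesis treats its violation as an exact penalty within a Sequential Linear Programming scheme rather than attacking the global problem directly.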