921 results for Complex problems
Abstract:
Computer-aided modelling approaches that design and generate logistics systems are a highly complex task. The planning and control models for intralogistics systems that exist in practice today show weaknesses with respect to current and future requirements such as coping with complexity, responsiveness, and adaptability. Multi-agent systems represent an innovative approach to meeting these demands. With their decentralized and modular character, they are well suited to complex problems with a low degree of structure. In addition, these computer-based intelligent systems offer users simple, low-effort handling.
Abstract:
Synchronizing mind maps with fuzzy cognitive maps can help handle complex problems involving many stakeholders by taking advantage of human creativity. The proposed approach has the capacity to instantiate cognitive cities by including cognitive computing. A use case in the context of decision-finding (concerning a transportation system) is presented to illustrate the approach.
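A fuzzy cognitive map propagates concept activations over signed, weighted links; the following is a minimal sketch of the iterated inference step in Python, with concept names and weights invented for a transport-style scenario (not taken from the paper).

```python
import numpy as np

def fcm_step(state, weights):
    """One fuzzy-cognitive-map update: weighted influences squashed to [0, 1]."""
    return 1.0 / (1.0 + np.exp(-(weights.T @ state)))

# Illustrative concepts for a transport scenario: demand, congestion, new tram line.
concepts = ["demand", "congestion", "tram_line"]
W = np.array([[0.0, 0.7, 0.0],    # demand increases congestion
              [0.0, 0.0, 0.6],    # congestion supports building the tram line
              [0.0, -0.8, 0.0]])  # tram line reduces congestion
state = np.array([0.9, 0.2, 0.0])
for _ in range(10):               # iterate until the activations settle
    state = fcm_step(state, W)
print(dict(zip(concepts, state.round(2))))
```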
Abstract:
The aim of this paper is to propose a complex concept of interdisciplinarity that is at once epistemologically rigorous, methodologically feasible, and politically critical. This implies an explicit ideological stance involving a particular conception of the relationship between science and society: interdisciplinary knowledge is necessary for a transformative politics addressing the complex problems that affect the lives of the peoples of Latin America.
Abstract:
This doctoral thesis falls within the field of membrane computing, a bio-inspired model of computation based on the cells of living organisms, in which many reactions take place simultaneously. From the structure and operation of the cell, different formal models, called P systems, have been defined. These models do not attempt to model the biological behavior of a cell; rather, they abstract its basic principles in order to find new computational paradigms. P systems are non-deterministic, massively parallel computational models, which is why they have attracted interest in recent years for solving complex problems: in many cases they can, in theory, solve NP-complete problems in polynomial or linear time. Membrane computing has also been applied successfully in many other research areas, especially those related to biology. A large number of these computational models have by now been studied from a theoretical point of view; how they can be implemented, however, remains an open research challenge. Several lines of work exist, based on distributed architectures or on dedicated hardware, that try to come as close as possible to their non-deterministic and massively parallel character within a context of viability and efficiency. This thesis proposes a static analysis of the P system as a way to optimize its execution on such platforms. The idea is that the information collected at analysis time is used to configure the platform on which the P system will later run, improving performance as a consequence. Transition P systems are taken as the reference model for this static analysis. More specifically, the analysis aims to let each membrane determine its active rules efficiently at every evolution step, that is, the rules that meet the conditions required to be applied. In this line, the thesis addresses the problem of the usefulness states of a given membrane, which at run time allow it to know at every moment the membranes with which it can communicate, a question that determines which rules can be applied at each step. The static analysis also draws on other features of the P system, such as the membrane structure, rule antecedents, rule consequents, and priorities. Once all this information has been obtained at analysis time, it is organized as a decision tree, so that at run time the membrane can obtain its active rules as efficiently as possible. The thesis also reviews a significant number of hardware and software architectures that different authors have proposed for implementing P systems: mainly distributed architectures, dedicated hardware based on FPGA boards, and platforms based on PIC microcontrollers. The goal is to propose solutions that allow the results of the static analysis (usefulness states and decision trees for active rules) to be deployed on those architectures.
In general, the conclusions are positive, in the sense that these optimizations integrate well into the architectures without significant penalties.
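As a rough illustration of the active-rule problem the static analysis targets, the sketch below checks which transition rules of a membrane are applicable given its multiset, its reachable target membranes, and rule priorities; the rule encoding and the naive scan are illustrative stand-ins for the usefulness states and decision trees developed in the thesis.

```python
from collections import Counter

def active_rules(multiset, rules, reachable):
    """Return the applicable rules of a membrane for one evolution step.

    Each rule is sketched as (antecedent Counter, consequent list of
    (symbol, target-membrane) pairs, integer priority); higher numbers
    mean higher priority.  A decision tree built at analysis time would
    replace this linear scan.
    """
    usable = [r for r in rules
              if all(multiset[s] >= n for s, n in r[0].items())       # enough objects
              and all(target in reachable for _, target in r[1])]     # targets reachable
    if not usable:
        return []
    top = max(p for _, _, p in usable)                                # highest priority wins
    return [r for r in usable if r[2] == top]

rules = [(Counter({"a": 2}), [("b", "here")], 2),
         (Counter({"a": 1}), [("c", "out")], 1)]
print(active_rules(Counter({"a": 3}), rules, {"here", "out"}))
```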
Abstract:
Conventional programming techniques are not well suited to solving many highly combinatorial industrial problems, such as scheduling, decision making, resource allocation or planning. Constraint Programming (CP), an emerging software technology, offers an original approach that allows complex problems to be solved efficiently and flexibly, through the combined use of various constraint solvers and expert heuristics. Its applications are increasingly fielded in various industries.
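As an illustration of the declarative style, the sketch below solves a toy scheduling problem with a naive backtracking search over finite domains; real CP engines such as those discussed in the paper add constraint propagation and expert heuristics on top of this idea.

```python
def solve(domains, constraints):
    """Naive chronological backtracking over finite domains.

    `domains` maps variable -> list of values; `constraints` is a list of
    (vars, predicate) pairs, checked as soon as all their variables are bound.
    """
    def consistent(assign):
        return all(pred(*[assign[v] for v in vs])
                   for vs, pred in constraints
                   if all(v in assign for v in vs))

    def backtrack(assign, remaining):
        if not remaining:
            return assign
        var, rest = remaining[0], remaining[1:]
        for value in domains[var]:
            trial = {**assign, var: value}
            if consistent(trial):
                result = backtrack(trial, rest)
                if result is not None:
                    return result
        return None

    return backtrack({}, list(domains))

# Toy scheduling: three jobs in distinct 1-hour slots, job C after job A.
domains = {"A": [1, 2, 3], "B": [1, 2, 3], "C": [1, 2, 3]}
constraints = [(("A", "B"), lambda a, b: a != b),
               (("A", "C"), lambda a, c: a != c and c > a),
               (("B", "C"), lambda b, c: b != c)]
print(solve(domains, constraints))  # {'A': 1, 'B': 2, 'C': 3}
```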
Abstract:
The objective of this thesis is to model processes from nature, such as evolution and co-evolution, and to propose techniques that ensure these learning processes actually take place and are useful for solving complex problems such as the game of Go. Go is an ancient and very complex game with simple rules that remains a challenge for Artificial Intelligence. This dissertation covers approaches that have been applied to this problem and proposes solving it with competitive and cooperative co-evolutionary learning methods, together with other techniques proposed by the author. To study, implement, and validate these methods, several neural network structures were used, a freely available framework was employed, and many programs were coded. The proposed techniques were implemented by the author, many experiments were performed to find the configuration that best ensures co-evolutionary progress, and the results are discussed. Co-evolutionary learning processes can exhibit pathologies that hinder progress; this dissertation introduces techniques to address pathologies such as loss of gradient, cycling dynamics, and forgetting. According to some authors, one way to mitigate these pathologies is to introduce more diversity into the evolving populations. This thesis therefore proposes techniques for increasing diversity, together with diversity measures for neural network structures that allow diversity to be monitored during co-evolution. The evolved genotype diversity is analyzed in terms of its impact on the global fitness of the evolved strategies and on their generalization. In addition, a memory mechanism is introduced into the neural network structures to reinforce certain strategies in the genes of the evolved neurons, so that good strategies that have been learned are not forgotten. The dissertation also reviews work by other authors in which cooperative and competitive co-evolution has been applied. The Go board size used in this thesis is 9x9, but the approach can easily be scaled to larger boards. The author believes that the programs coded and the techniques introduced in this dissertation can also be used in other domains.
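A minimal sketch of one generation of competitive co-evolution between two populations is given below; `play_game` is a hypothetical placeholder for an actual 9x9 Go match between two evaluated networks, and the representation of individuals as flat weight vectors is purely illustrative.

```python
import random

def play_game(black_weights, white_weights):
    """Placeholder for a 9x9 Go match; returns 1 if black wins, else 0.

    A real implementation would drive two neural-network players through a
    game engine; here a noisy comparison stands in for the outcome.
    """
    return int(sum(black_weights) + random.gauss(0, 0.1) > sum(white_weights))

def coevolve_step(pop_a, pop_b, n_opponents=5, mutation=0.05):
    """Score each individual against sampled rivals, keep the best half, mutate."""
    def evolve(pop, rivals, as_black):
        scored = sorted(pop,
                        key=lambda ind: sum(play_game(ind, r) if as_black
                                            else 1 - play_game(r, ind)
                                            for r in random.sample(rivals, n_opponents)),
                        reverse=True)
        elite = scored[:len(pop) // 2]
        children = [[w + random.gauss(0, mutation) for w in random.choice(elite)]
                    for _ in range(len(pop) - len(elite))]
        return elite + children
    return evolve(pop_a, pop_b, True), evolve(pop_b, pop_a, False)

pop_a = [[random.uniform(-1, 1) for _ in range(10)] for _ in range(20)]
pop_b = [[random.uniform(-1, 1) for _ in range(10)] for _ in range(20)]
for _ in range(5):
    pop_a, pop_b = coevolve_step(pop_a, pop_b)
```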
Abstract:
Thanks to their inherent properties, probabilistic graphical models are among the prime candidates for machine learning and decision-making tasks, especially in uncertain domains. Their capabilities, such as representation, inference and learning, if used effectively, can greatly help in building intelligent systems able to act appropriately in different problem domains. Evolutionary algorithms are one such discipline that has employed probabilistic graphical models to improve the search for optimal solutions in complex problems. This paper shows how probabilistic graphical models have been used in evolutionary algorithms to improve their performance in solving complex problems. Specifically, we survey probabilistic model-building evolutionary algorithms, called estimation of distribution algorithms, and compare different methods for probabilistic modeling in these algorithms.
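The simplest estimation of distribution algorithm, UMDA, fits an independent Bernoulli model to the selected individuals and samples the next population from it; a minimal sketch on the OneMax toy problem follows (population sizes and parameters are illustrative, not taken from the survey).

```python
import numpy as np

def umda(fitness, n_bits, pop_size=100, elite_frac=0.5, generations=50, rng=None):
    """Univariate Marginal Distribution Algorithm, the simplest EDA.

    Each generation fits an independent Bernoulli model to the best individuals
    and samples the next population from it (no pairwise dependencies modeled).
    """
    rng = rng or np.random.default_rng(0)
    probs = np.full(n_bits, 0.5)                      # initial univariate model
    for _ in range(generations):
        pop = (rng.random((pop_size, n_bits)) < probs).astype(int)
        scores = np.apply_along_axis(fitness, 1, pop)
        elite = pop[np.argsort(scores)[-int(pop_size * elite_frac):]]
        probs = elite.mean(axis=0).clip(0.05, 0.95)   # refit model, keep some diversity
    return probs

# OneMax: maximize the number of ones; the learned model converges toward all ones.
model = umda(fitness=np.sum, n_bits=20)
print(model.round(2))
```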
Abstract:
The heterogeneity of the geological medium introduces into underground construction projects a high degree of uncertainty that must be properly managed in order to reduce the associated risks, which are mainly geotechnical. Among the main problems facing modern Rock Mechanics in underground construction are rock squeezing in tunnels and the failure of coal pillars. Their occurrence is widely known to cause serious harm to the cost and safety of projects, so their study has traditionally been linked to predicting their occurrence. Existing solutions to these problems include those based on analytical and numerical methods. These methodologies can represent the real geotechnical behavior with a high degree of fidelity; however, they can only be used when sufficient geotechnical characterization is available, and hence a detailed definition of the parameters that feed the complex constitutive models and failure criteria that the studied phenomena require. Naturally, this level of definition is only reached at advanced stages of a project, or even during construction itself, when the parameters entered in the models can be properly calibrated, which limits their use in early stages, when prediction is truly meaningful. Empirical methods, in contrast, provide solutions to these complex problems in a simple way, with low parameterization and, given their eminently observational nature, with high reliability when applied under boundary conditions similar to the original ones. The simplicity and small number of parameters used allow these methodologies to be applied from the preliminary stages of a project, since such information is usually easy and cheap to obtain. Prediction can therefore be incorporated from the beginning of the design process, anticipating the risk at its origin. This doctoral thesis presents a new empirical methodology for predicting the occurrence of squeezing and coal pillar failure, based on an extensive compilation of real cases of tunnels and mines in which both phenomena were evaluated. This information, collected from prestigious bibliographic references, has made it possible to assemble one of the most extensive databases on these phenomena to date, which is in itself an important contribution to the state of the art. With all this information, and with the aid of the theory of statistical classifiers, a linear classifier of the logistic regression type has been fitted on both databases; it predicts the occurrence of each phenomenon in terms of probability and thus weighs the uncertainty associated with the heterogeneity introduced by the geological medium. This aspect of the development is the real added value of the thesis and the main advantage of the proposed solution over other empirical methodologies. This probabilistic weighting capability makes the classifier a very attractive methodology for geotechnical risk assessment and decision-making. In fact, as a practical validation exercise, the developed solution has been implemented in a cost-benefit model for optimizing the design of the pillars of a "virtual" longwall mine. The ability of the classifier to quantify the probability of design failure, together with an adequate quantification of the consequences of that failure, has made it possible to define a risk law that is incorporated into the balance of costs and benefits and that, by iteratively resizing the pillar system and the layout of the mine itself, maximizes the economic result of the mining project under acceptable safety conditions fixed in advance.
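The classifier type named in the abstract is standard logistic regression; a minimal from-scratch sketch is shown below, with purely hypothetical pillar features and labels standing in for the thesis' database.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain gradient-descent logistic regression returning weights and bias."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))       # predicted failure probabilities
        grad_w, grad_b = X.T @ (p - y) / len(y), np.mean(p - y)
        w, b = w - lr * grad_w, b - lr * grad_b
    return w, b

# Hypothetical pillar records: (width/height ratio, strength/stress ratio), 1 = failed.
X = np.array([[1.2, 0.9], [2.5, 1.6], [1.0, 0.8], [3.0, 1.8], [1.5, 1.0], [2.8, 1.7]])
y = np.array([1, 0, 1, 0, 1, 0])
mu, sigma = X.mean(axis=0), X.std(axis=0)
w, b = fit_logistic((X - mu) / sigma, y)
candidate = (np.array([1.8, 1.2]) - mu) / sigma       # a new pillar design to assess
print("P(failure) =", 1.0 / (1.0 + np.exp(-(candidate @ w + b))))
```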
Abstract:
In this work we propose the use of a Bayesian method to estimate the memory parameter of a stochastic process with long memory when its likelihood function is intractable or unavailable. This approach provides an approximation to the posterior distribution over the memory and other parameters and is based on a simple application of the method known as approximate Bayesian computation (ABC). Some popular estimators of the memory parameter are reviewed and compared with this approach. Our proposal makes it feasible to solve complex problems from a Bayesian point of view and, although approximate, performs very satisfactorily when compared with classical methods.
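A minimal sketch of rejection ABC is shown below; the AR(1) simulator and the lag-1 autocorrelation summary are cheap illustrative stand-ins, since the thesis targets genuine long-memory processes with their own simulators and summary statistics.

```python
import numpy as np

def rejection_abc(observed, simulate, summary, prior_draw,
                  n_draws=5000, quantile=0.01, rng=None):
    """Rejection ABC: keep prior draws whose simulated summaries fall closest to the data."""
    rng = rng or np.random.default_rng(1)
    s_obs = summary(observed)
    thetas = np.array([prior_draw(rng) for _ in range(n_draws)])
    dists = np.array([abs(summary(simulate(t, len(observed), rng)) - s_obs) for t in thetas])
    keep = dists <= np.quantile(dists, quantile)
    return thetas[keep]                               # approximate posterior sample

# Stand-in model: AR(1) as a cheap proxy for a long-memory simulator (illustrative only).
def simulate(phi, n, rng):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

lag1 = lambda x: np.corrcoef(x[:-1], x[1:])[0, 1]     # single summary statistic
observed = simulate(0.7, 300, np.random.default_rng(0))
posterior = rejection_abc(observed, simulate, lag1, lambda rng: rng.uniform(0, 0.99))
print(posterior.mean(), posterior.std())
```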
Abstract:
The world is undergoing a paradigm shift. The modern paradigm based on the ideas of René Descartes and Isaac Newton is being replaced by a new paradigm called postmodern or complex. This shift has generated a crisis throughout society, and this crisis is clearly visible in schools. While the world has come to value creativity, autonomy, and skills such as teamwork and the ability to solve complex problems, schools tend to close in on themselves, demanding the memorization and reproduction of ready-made content aimed at solving artificial problems that bear no relation to the daily life and interests of this new society. From this perspective, this work proposes a methodology for teaching first-year high school Physics inspired by Edgar Morin's theory of complex thought. To that end, we developed a didactic sequence of activities that aim not only to cover the content prescribed by the Official Curriculum of the State of São Paulo in a contextualized and motivating way, but also to address it with the goal of beginning to develop students' complex thinking. In designing each activity we therefore took into account Morin's "Os Sete Saberes Necessários à Educação do Futuro" as well as the competencies and skills set out in the National Curriculum Parameters (PCN). We know that complex thinking does not develop overnight, as it is a process that extends throughout life. Nevertheless, we hope that this work makes a contribution in that direction and that, over the school year, we can glimpse the beginning of the development of this way of thinking in students.
Abstract:
This work addresses the logistics involved in disaster response operations, focusing on the final delivery of supplies intended to help victims. Its purpose is to investigate the objectives relevant to planning cargo transportation and to find a methodology for defining a strategy that supports decision-making in the field. To that end, the objectives adopted in Operations Research models for this task are first identified through content analysis of the relevant publications. Then, the Value-Focused Thinking approach is used to structure the problem. Finally, the Simple Multi-Attribute Rating Technique Exploiting Ranks (SMARTER) method is employed to build a Multi-Criteria Decision Analysis (MCDA) model, in consultation with an experienced humanitarian practitioner and drawing on the literature analysis previously carried out. In this process, six decision alternatives consistent with the values of the humanitarian community are developed and evaluated. The results show a mismatch between the performance criteria identified in the existing publications and the objectives pursued by the real Decision Maker (DM). According to the model built, meeting priorities and maintaining the sustainability of the operation are the objectives that should be taken into account when planning post-disaster cargo delivery, while the cost and equity of distribution should not be considered. It is concluded that the adopted method is useful for defining these criteria and also for developing strategies that lead to better aid distributions in the eyes of the DM. This work thus contributes to Humanitarian Logistics by investigating the objectives, and to the MCDA field by formalizing the process of generating alternatives and by adding another possible application to the repertoire of the SMARTER method.
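SMARTER typically combines a ranking of criteria with rank-order-centroid (ROC) weights and an additive value model; the sketch below illustrates that mechanism with hypothetical alternatives and scores, not the values elicited in the study.

```python
def roc_weights(n):
    """Rank-order-centroid weights for criteria ranked 1 (most important) to n."""
    return [sum(1.0 / i for i in range(k, n + 1)) / n for k in range(1, n + 1)]

# Hypothetical post-disaster delivery alternatives scored on 0-1 value scales for
# two criteria ranked by importance: (1) meeting priorities, (2) operational sustainability.
alternatives = {"serve_worst_hit_first": [0.9, 0.5],
                "balanced_routes":       [0.6, 0.8],
                "cheapest_routes":       [0.3, 0.9]}
w = roc_weights(2)                                    # [0.75, 0.25]
ranking = sorted(alternatives.items(),
                 key=lambda kv: sum(wi * vi for wi, vi in zip(w, kv[1])),
                 reverse=True)
for name, scores in ranking:
    print(name, round(sum(wi * vi for wi, vi in zip(w, scores)), 3))
```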
Abstract:
Tool path generation is one of the most complex problems in Computer Aided Manufacturing. Although some efficient strategies have been developed, most of them are only useful for standard machining. However, the algorithms used for tool path computation demand high computational performance, which makes their implementation on many existing systems very slow or even impractical. Hardware acceleration is an incremental solution that can be cleanly added to these systems while keeping everything else intact; it is completely transparent to the user, and its cost is much lower and its development time much shorter than replacing the computers with faster ones. This paper presents an optimisation that uses a specific graphics-hardware approach, exploiting the power of multi-core Graphics Processing Units (GPUs) to improve tool path computation. The improvement is applied to a highly accurate and robust tool path generation algorithm. As a case study, the paper presents a fully implemented algorithm used for turning-lathe machining of shoe lasts. A comparative study shows the gain achieved in terms of total computing time: execution is almost two orders of magnitude faster than on modern PCs.
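The shoe-last algorithm itself is not reproduced here; the sketch below only shows the kind of embarrassingly parallel, per-point computation (nearest surface point for each candidate tool position) that benefits from GPU execution. It is written with NumPy, and the same array expressions can be run on a GPU with a drop-in array library such as CuPy; a real tool-path kernel would of course be considerably more involved.

```python
import numpy as np

def closest_surface_points(tool_positions, surface_points):
    """For each candidate tool position, find the nearest surface point.

    The pairwise-distance computation is independent per point, which is why
    moving it to a many-core GPU gives large speedups for dense tool paths.
    """
    # (T, 1, 3) - (1, S, 3) -> (T, S, 3) pairwise differences, reduced to distances.
    diff = tool_positions[:, None, :] - surface_points[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=-1))
    nearest = dists.argmin(axis=1)
    return nearest, dists[np.arange(len(tool_positions)), nearest]

rng = np.random.default_rng(0)
surface = rng.random((5000, 3))            # point cloud standing in for the machined surface
path = rng.random((500, 3))                # candidate tool positions along a path
idx, d = closest_surface_points(path, surface)
print(d.min(), d.max())
```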
Abstract:
Purpose – The purpose of this paper is to present a new geometric model based on the mathematical morphology paradigm, specialized to provide determinism to the classic morphological operations. This determinism is needed to model dynamic processes that require an order of application, as is the case when designing and manufacturing objects in CAD/CAM environments. Design/methodology/approach – The basic trajectory-based operation is the basis of the proposed morphological specialization. This operation allows the definition of morphological operators that obtain sequentially ordered sets of points from the boundary of the target objects, a determinism that does not exist in the classical morphological paradigm. From this basic operation, the complete set of morphological operators is redefined, incorporating the concepts of boundary and determinism: trajectory-based erosion and dilation, and other morphological filtering operations. Findings – This new morphological framework allows the definition of complex three-dimensional objects, providing arithmetical support for generating machining trajectories, one of the most complex problems currently found in CAD/CAM. Originality/value – The model proposes the integration of the design and manufacturing processes, thereby avoiding the accuracy and integrity problems of other classic geometric models that split these processes into two phases. Furthermore, the morphological machinery is based on point sets, so the geometric data structures and operations are intrinsically simple and efficient. Another important value is that no excessive computational resources are needed, because only the points on the boundary are processed.
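The trajectory-based operators are specific to the paper; the sketch below shows only the classical point-set dilation and erosion that they specialize, on small illustrative integer point sets (the structuring element contains the origin).

```python
def dilation(shape, structuring_element):
    """Minkowski sum on integer point sets: every shape point shifted by every SE offset."""
    return {(x + dx, y + dy) for (x, y) in shape for (dx, dy) in structuring_element}

def erosion(shape, structuring_element):
    """Points whose whole structuring-element neighbourhood lies inside the shape."""
    return {(x, y) for (x, y) in shape
            if all((x + dx, y + dy) in shape for (dx, dy) in structuring_element)}

# A 3x3 square of points and a cross-shaped structuring element (illustrative data).
square = {(x, y) for x in range(3) for y in range(3)}
cross = {(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)}
print(len(dilation(square, cross)), len(erosion(square, cross)))  # grown vs. shrunken point counts
```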