34 results for Level-Set method
Abstract:
Consider a single processor and a software system. The software system comprises components and interfaces, where each component has an associated interface and each component comprises a set of constrained-deadline sporadic tasks. A scheduling algorithm (called the global scheduler) determines at each instant which component is active. The active component uses another scheduling algorithm (called the local scheduler) to determine which task is selected for execution on the processor. The interface of a component makes certain information about the component visible to other components; the interfaces of all components are used for schedulability analysis. We address the problem of generating an interface for a component based on the tasks inside the component. We aim to (i) incur only a small loss in schedulability analysis due to the interface and (ii) ensure that the amount of space (counted in bits) of the interface is small, since such an interface hides as many details of the component as possible. We present an algorithm for generating such an interface.
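A minimal sketch of the kind of computation such interface generation involves, assuming (as an illustration, not the paper's algorithm) that the interface samples the component's demand bound function at a few checkpoints:

```python
from math import floor

# Tasks as (WCET, deadline, period) with deadline <= period
# (constrained-deadline sporadic tasks).
def dbf(tasks, t):
    """Demand bound function: worst-case execution demand of the
    task set in any time window of length t."""
    return sum(max(0, floor((t - d) / p) + 1) * c for (c, d, p) in tasks)

def interface(tasks, checkpoints):
    """Hypothetical compact interface: the DBF sampled at a few points.
    Fewer checkpoints means fewer bits, at the cost of a more
    pessimistic (but still safe) schedulability analysis."""
    return [(t, dbf(tasks, t)) for t in checkpoints]

print(interface([(1, 5, 10), (2, 8, 20)], [5, 8, 10, 20]))
```

The trade-off named in the abstract shows up directly here: shrinking the checkpoint list shrinks the interface's bit count while coarsening the demand information exposed to the global analysis.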
Abstract:
An approach for the analysis of uncertainty propagation in reliability-based design optimization of composite laminate structures is presented. Using the Uniform Design Method (UDM), a set of design points is generated over a domain centered on the mean reference values of the random variables. A methodology based on inverse optimal design of composite structures to achieve a specified reliability level is proposed, and the corresponding maximum load is obtained as a function of ply angle. Using the generated UDM design points as input/output patterns, an Artificial Neural Network (ANN) is developed based on an evolutionary learning process. A Monte Carlo simulation using the developed ANN is then performed to simulate the behavior of the critical Tsai number, the structural reliability index, and their relative sensitivities as functions of the ply angle of the laminates. The results are generated for uniformly distributed random variables on a domain centered on the mean values. The statistical analysis of the results enables the study of the variability of the reliability index and its sensitivity relative to the ply angle. Numerical examples showing the utility of the approach for the robust design of angle-ply laminates are presented.
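A compact sketch of the surrogate-based pipeline the abstract describes, with plain uniform sampling standing in for the Uniform Design Method, gradient training in place of the evolutionary learning process, and an illustrative closed-form stand-in for the structural model:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical stand-in for the structural model: critical Tsai number
# as a function of two normalised random variables.
def tsai_number(x):
    return 1.1 + 0.3 * x[:, 0] - 0.2 * x[:, 1]

# 1. Design points over a domain centred on the mean values.
X = rng.uniform(low=[-1, -1], high=[1, 1], size=(50, 2))
y = tsai_number(X)

# 2. ANN surrogate trained on the design points.
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                   random_state=0).fit(X, y)

# 3. Monte Carlo through the cheap surrogate instead of the full model;
#    Tsai number below 1 is taken here to indicate failure.
samples = rng.uniform(-1, 1, size=(100_000, 2))
pred = ann.predict(samples)
print("P(failure) ~", np.mean(pred < 1.0))
```

The point of the construction is that the expensive structural evaluation is paid only at the design points; the 100,000 Monte Carlo samples run through the trained network alone.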
Abstract:
Due to the growing complexity and adaptability requirements of real-time systems, which often exhibit unrestricted Quality of Service (QoS) inter-dependencies among supported services and user-imposed quality constraints, it is increasingly difficult to optimise the level of service of a dynamic task set within a useful and bounded time. This is even more difficult when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand and tasks may be inter-dependent. This paper focuses on optimising a dynamic local set of inter-dependent tasks that can be executed at varying levels of QoS, to achieve an efficient resource usage that is constantly adapted to the specific constraints of devices and users, the nature of the executing tasks, and dynamically changing system conditions. Extensive simulations demonstrate that the proposed anytime algorithms are able to quickly find a good initial solution and to effectively improve the rate at which the quality of the current solution improves as the algorithms are given more time to run, with minimum overhead compared against their traditional versions.
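A generic sketch of the anytime property the paper evaluates (not its QoS-specific heuristics; all names and the toy problem below are illustrative):

```python
import time

def anytime_optimise(initial, neighbours, quality, budget_s):
    """Anytime loop: always holds a feasible solution and keeps
    improving it while time remains, so it can be interrupted at
    any point with a valid answer."""
    best, best_q = initial, quality(initial)
    deadline = time.monotonic() + budget_s
    while time.monotonic() < deadline:
        improved = False
        for cand in neighbours(best):
            q = quality(cand)
            if q > best_q:
                best, best_q, improved = cand, q, True
                break
        if not improved:
            break  # local optimum reached before the deadline
    return best, best_q

# Toy usage: pick per-task QoS levels subject to a resource budget.
levels = [(0, 1, 2)] * 3                       # QoS levels per task
cost = lambda s: sum(s)                        # resource usage
quality = lambda s: sum(l * l for l in s) if cost(s) <= 4 else -1
neighbours = lambda s: (s[:i] + (l,) + s[i + 1:]
                        for i in range(len(s)) for l in levels[i])
print(anytime_optimise((0, 0, 0), neighbours, quality, 0.1))
```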
Abstract:
Master's degree in Civil Engineering - Construction Management branch
Abstract:
This paper presents a methodology for applying scheduling algorithms using Monte Carlo simulation, built around a decision support system (DSS). The proposed methodology combines a genetic algorithm with a new local search based on the Monte Carlo method. The methodology is applied to the job shop scheduling problem (JSSP), a difficult combinatorial optimization problem to which extensive investigation has been devoted in the development of efficient algorithms. The methodology is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed methodology. The DSS developed can be utilized in a common industrial or construction environment.
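The abstract does not spell out the operators, so the following is a minimal sketch of the Monte Carlo local-search idea, with a toy single-machine objective standing in for a full JSSP decode:

```python
import random

# Toy stand-in objective: total completion time on a single machine.
# A real JSSP evaluation would decode the operation sequence against
# machine and route constraints.
DURATIONS = [4, 2, 7, 3, 5]

def cost(order):
    t, total = 0, 0
    for job in order:
        t += DURATIONS[job]
        total += t
    return total

def monte_carlo_local_search(order, iters=1000, seed=0):
    """Hypothetical Monte Carlo local search: sample random swap
    neighbours and keep improvements (the general idea the abstract
    describes, not the paper's exact operator)."""
    rng = random.Random(seed)
    best, best_cost = list(order), cost(order)
    for _ in range(iters):
        i, j = rng.sample(range(len(best)), 2)
        cand = list(best)
        cand[i], cand[j] = cand[j], cand[i]
        if (c := cost(cand)) < best_cost:
            best, best_cost = cand, c
    return best, best_cost

print(monte_carlo_local_search(range(5)))
```

In the hybrid the paper proposes, a step like this would refine the individuals produced by the genetic algorithm rather than run standalone.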
Abstract:
Prototype validation is a major concern in modern electronic product design and development. Simulation, structural test, and functional and timing debug all form part of the validation process, although they are very often addressed as dissociated tasks. In this paper we describe an integrated approach to board-level prototype validation, based on a set of mandatory/optional BST instructions and a built-in controller for debug and test, which addresses the aforementioned tasks as inherent parts of a whole process.
Abstract:
The paper presents an automated RFDSCA synthesis procedure. The algorithm determines several RFDSCA circuits from the top-level system specifications, all with the same maximum performance. The genetic synthesis tool optimizes a fitness function proportional to the RFDSCA quality factor and uses the ε-concept and a maximin sorting scheme to achieve a set of solutions well distributed along the non-dominated front. To confirm the results of the algorithm, three RFDSCAs were simulated in SpectreRF and one of them was implemented and tested. The design used a 0.25 μm BiCMOS process. All the results (synthesized, simulated and measured) are very close, which indicates that the genetic synthesis method is a very useful tool for designing optimum-performance RFDSCAs.
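A small sketch of the maximin sorting idea used to keep solutions spread along the non-dominated front (a Balling-style maximin fitness, assuming minimisation; the ε-concept part is omitted):

```python
def maximin_fitness(points):
    """For each solution i: max over j != i of min over objectives k
    of (f_k(i) - f_k(j)). Negative values mark non-dominated points;
    values closer to zero on the front indicate crowding, so ranking
    by this fitness promotes a well-spread front."""
    fit = []
    for i, p in enumerate(points):
        fit.append(max(min(pk - qk for pk, qk in zip(p, q))
                       for j, q in enumerate(points) if j != i))
    return fit

front = [(1.0, 4.0), (2.0, 2.5), (3.0, 1.0), (3.5, 3.5)]
print(maximin_fitness(front))  # last point is dominated -> positive value
```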
Abstract:
The main purpose of this work was the development of procedures for the simulation of atmospheric flows over complex terrain using OpenFOAM. To this end, tools and procedures for preprocessing and data extraction were developed apart from this code and thereafter applied in the simulation of a real case. For the generation of the computational domain, a systematic method able to translate the terrain elevation model into a native OpenFOAM format (blockMeshDict) was developed. The outcome was a structured mesh in which the user can define the number of control volumes and their dimensions. With this procedure, the difficulties of case set-up and the high computational effort reported in the literature in association with the use of snappyHexMesh, the OpenFOAM resource explored until then for this task, were considered to be overcome. The procedures developed for the generation of boundary conditions allowed the automatic creation of idealized inlet vertical profiles, the definition of wall-function boundary conditions, and the calculation of internal-field first guesses for the iterative solution process, taking as input experimental data supplied by the user. The applicability of the generated boundary conditions was limited to the simulation of turbulent, steady-state, incompressible and neutrally stratified atmospheric flows, always using RaNS (Reynolds-averaged Navier-Stokes) models. For the modelling of terrain roughness, the developed procedure let the user define idealized conditions, such as a uniform aerodynamic roughness length or a value varying as a function of characteristic topography values, or use real site data; it was complemented by techniques for the visual inspection of the generated roughness maps. The absence of a forest canopy model limited the applicability of this procedure to low aerodynamic roughness lengths. The developed tools and procedures were then applied in the simulation of a neutrally stratified atmospheric flow over the Askervein hill. The performed simulations evaluated the sensitivity of the solution to different convection schemes, mesh dimensions, ground roughness, and formulations of the k-ε and k-ω models. When compared to experimental data, the calculated values showed good agreement of the speed-up at the hill top and lee side, with a relative error of less than 10% at a height of 10 m above ground level. Turbulent kinetic energy was considered to be well simulated on the windward side and hill top, and only roughly predicted on the lee side, where a zone of flow separation was also identified. Despite the need for more work to evaluate the importance of the downstream recirculation zone in the quality of the gathered results, the agreement between the calculated and experimental values and the OpenFOAM sensitivity to the tested parameters were considered to be generally in line with the simulations presented in the reviewed bibliographic sources.
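A minimal sketch of the mesh-generation step, writing a single-block structured mesh in the native blockMeshDict format; a flat box stands in for the terrain-following bottom face, and the helper name, dimensions and cell counts are illustrative:

```python
# Hypothetical helper, not part of OpenFOAM: emit a one-block
# blockMeshDict that blockMesh can read.
def write_blockmeshdict(path, zmin, zmax, xmax=1000.0, ymax=1000.0,
                        ncells=(40, 40, 20)):
    verts = [(0, 0, zmin), (xmax, 0, zmin), (xmax, ymax, zmin),
             (0, ymax, zmin), (0, 0, zmax), (xmax, 0, zmax),
             (xmax, ymax, zmax), (0, ymax, zmax)]
    nx, ny, nz = ncells
    with open(path, "w") as f:
        f.write("FoamFile { version 2.0; format ascii; "
                "class dictionary; object blockMeshDict; }\n\n")
        f.write("convertToMeters 1;\n\nvertices\n(\n")
        for x, y, z in verts:
            f.write(f"    ({x} {y} {z})\n")
        f.write(");\n\nblocks\n(\n")
        f.write(f"    hex (0 1 2 3 4 5 6 7) ({nx} {ny} {nz}) "
                "simpleGrading (1 1 1)\n);\n\n")
        f.write("edges ();\nboundary ();\nmergePatchPairs ();\n")

write_blockmeshdict("blockMeshDict", zmin=0.0, zmax=500.0)
```

The procedure described in the abstract would, in addition, displace the bottom vertices according to the elevation model and grade the cells towards the ground.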
Abstract:
Understanding a child's functioning is a persistent challenge in health and education contexts. In an attempt to overcome this challenge, in 2007 the World Health Organization developed the International Classification of Functioning, Disability and Health for Children and Youth (ICF-CY) as the first universal classification system for documenting children's health and functioning. Although the ICF-CY is not an assessment and intervention instrument, it can nevertheless serve as a framework for developing tools adapted to the needs of its users. Considering that, in the school context, handwriting is among the activities most required for a child's full participation, it seems pertinent to define a set of codes intended to characterise a child's functioning profile with respect to handwriting. The aim of this study was therefore to develop a preliminary set of ICF-CY-based codes that may come to constitute a code set for handwriting. Given the complexity of the topic, and since the goal was to reach consensus among specialists on which ICF-CY categories should be considered, the Delphi technique was used. The choice of methodology followed the procedures adopted by the ICF Core Set project. Of the eighteen professionals contacted, responses were obtained from seven occupational therapists with experience in paediatrics, who participated in all rounds. In total, three rounds of questionnaires were carried out to reach consensus, with a previously defined agreement level of 70%. The study yielded a preliminary set of codes with 54 ICF-CY categories (16 second-level categories, 14 third-level categories and one fourth-level category), of which 31 are body function categories, one is a body structure category, 12 are activities and participation categories and 10 are environmental factor categories. This study is a first step towards the development of an ICF-CY-based code set for handwriting; further research on the development and validation of this code set is clearly needed.
Abstract:
This paper addresses a gap in the literature concerning the management of Intellectual Capital (IC) in a port, which is a network of independent organizations that act together in the provision of a set of services. As far as the authors are aware, this type of empirical context has been unexplored with regard to knowledge management or IC creation/destruction. Indeed, most research in IC still focuses on individual firms, despite the more recent interest placed on the analysis of macro-level units such as regions or nations. In this study, we conceptualise the port as a meta-organisation, which has the generic goal of economic development, both for itself and for the region where it is located. It provides us with a unique environment due to its complexity as an “organisation” composed of several organisations, connected by interdependency relationships and, typically, with no formal hierarchy. Accordingly, actors’ interests are not always aligned, and in some situations their individual interests can be misaligned with the collective goals of the port. Moreover, besides having their own interests, port actors also have different sources of influence and different levels of power, which can impact the port’s Collective Intellectual Capital (CIC). Consequently, the management of the port’s CIC can be crucial for its goals to be met. With this paper we intend to discuss how the network coordinator (the port authority) manages those complex relations of interest and power in order to develop collaboration and mitigate conflict, thus creating collective intellectual assets or avoiding intellectual liabilities that may emerge for the whole port. The fact that we are studying complex and dynamic processes, about which there is a lack of understanding, in a complex and atypical organisation leads us to consider the case study an appropriate method of research. The evidence presented in this study results from preliminary interviews and from document analysis. Findings suggest that the alignment of interests and actions, at both the dyadic and network levels, is critical to develop a context of collaboration/cooperation within the port community; accordingly, the port coordinator should make use of different types of power to ensure that the port’s goals are achieved.
Abstract:
More than ever, there is an increase in the number of decision support methods and computer-aided diagnostic systems applied to various areas of medicine. In breast cancer research, much work has been done to reduce false positives when such systems are used as a double reading method. In this study, we present a set of data mining techniques applied to build a decision support system for breast cancer diagnosis. The method is geared to assist clinical practice in identifying mammographic findings such as microcalcifications, masses and even normal tissue, in order to avoid misdiagnosis. A reliable database was used, with 410 images from about 115 patients, containing prior assessments by radiologists of microcalcifications, masses and normal tissue findings. Two feature extraction techniques were used throughout this work: the gray level co-occurrence matrix and the gray level run length matrix. For classification purposes, we considered various scenarios according to distinct patterns of lesions, and several classifiers, in order to determine the best performance in each case. The classifiers used were Naïve Bayes, Support Vector Machines, k-Nearest Neighbors and Decision Trees (J48 and Random Forests). The results in distinguishing mammographic findings revealed high positive predictive values (PPV) and very good accuracy. Related results on the classification of breast density and the BI-RADS® scale are also presented. The best predictive method for all tested groups was the Random Forest classifier, and the best performance was achieved in distinguishing microcalcifications. The conclusions based on the several tested scenarios represent a new perspective in breast cancer diagnosis using data mining techniques.
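A brief sketch of the feature-extraction and classification pipeline on synthetic stand-in data (the run length matrix descriptor and the real 410-image database are omitted; only the co-occurrence-matrix branch with a Random Forest is shown):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

def glcm_features(image):
    """Texture features from the gray level co-occurrence matrix,
    one of the two descriptors the study uses."""
    glcm = graycomatrix(image, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return [graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity",
                         "energy", "correlation")]

# Hypothetical data in place of the study's mammographic database.
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(40, 32, 32), dtype=np.uint8)
labels = rng.integers(0, 3, size=40)  # microcalcification/mass/normal

X = np.array([glcm_features(im) for im in images])
clf = RandomForestClassifier(random_state=0).fit(X, labels)
print(clf.score(X, labels))  # training accuracy on the toy data
```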
Abstract:
This work proposes a novel approach for the suitable orientation of antibodies (Ab) on an immunosensing platform, applied here to the determination of 8-hydroxy-2′-deoxyguanosine (8OHdG), a biomarker of oxidative stress that has been associated with chronic diseases such as cancer. The anti-8OHdG was bound to an amine-modified gold support through its Fc region after activation of its carboxylic functions. Non-oriented approaches of Ab binding to the platform were tested in parallel, in order to show that the presented methodology favored Ab/Ag affinity and immunodetection of the antigen. The immunosensor design was evaluated by quartz-crystal microbalance with dissipation, atomic force microscopy, electrochemical impedance spectroscopy (EIS) and square-wave voltammetry. EIS was also a suitable technique to follow the analytical behavior of the device against 8OHdG. The affinity binding between 8OHdG and the antibody immobilized on the gold-modified platform increased the charge transfer resistance across the electrochemical set-up. The observed behavior was linear for 8OHdG concentrations from 0.02 to 7.0 ng/mL. The interference from glucose, urea and creatinine was found to be negligible. An application to synthetic samples was also conducted successfully. Overall, the presented approach enabled the production of suitably oriented Abs on a gold platform by means of a much simpler process than other oriented-Ab binding approaches described in the literature, as far as we know, and was successful in terms of analytical features and sample application.
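The reported linearity suggests a simple least-squares calibration of charge transfer resistance against concentration; a sketch on illustrative numbers (not the paper's measurements):

```python
import numpy as np

# Hypothetical calibration data: charge transfer resistance (Rct)
# rising with 8OHdG concentration over the reported linear range.
conc = np.array([0.02, 0.1, 0.5, 1.0, 3.0, 7.0])   # ng/mL
rct = np.array([510, 560, 810, 1120, 2350, 4800])  # ohm

slope, intercept = np.polyfit(conc, rct, 1)
r = np.corrcoef(conc, rct)[0, 1]
print(f"Rct ~ {slope:.0f}*C + {intercept:.0f}  (r = {r:.3f})")
```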
Abstract:
This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the N-M algorithm context, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
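A condensed sketch of the repulsion idea on a toy system, with an erf-based penalty around roots already found (coefficients and strategy details are illustrative, not the paper's exact formulation):

```python
import numpy as np
from math import erf
from scipy.optimize import minimize

def F(x):
    """Toy system with roots at (1, 0) and (-1, 0)."""
    return np.array([x[0] ** 2 - 1.0, x[1]])

def merit(x, roots, rho=10.0, delta=0.5):
    """Penalty-type merit: squared residual norm plus an erf-based
    repulsion term around each root already found, so N-M is pushed
    away from known solutions."""
    m = float(np.sum(F(x) ** 2))
    for r in roots:
        d = float(np.linalg.norm(x - r))
        if d < delta:
            m += rho * (1.0 - erf(d))  # grows as x nears a known root
    return m

rng = np.random.default_rng(0)
roots = []
for _ in range(8):  # multistart with repulsion from earlier roots
    res = minimize(merit, rng.uniform(-2, 2, 2), args=(roots,),
                   method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-10})
    if res.fun < 1e-8 and all(np.linalg.norm(res.x - r) > 1e-3
                              for r in roots):
        roots.append(res.x)
print(roots)  # expect approximations of (1, 0) and (-1, 0)
```

Because the penalty raises the merit value near known roots, restarts that drift back towards an already-found solution fail the residual test and are discarded, which is what lets the multistart enumerate distinct roots.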
Abstract:
The continuing evolution of information and communication technologies, combined with industrial development, has increased competitiveness among industries, raising the level of the quality-price relationship. This premise has led to greater concern with the continuous search for process improvement, so as to increase value-added activities and eliminate all kinds of waste. In this context, Grohe Portugal Componentes Sanitários, Lda proposed an improvement in the management of the stock of components kept in flow racks. This improvement consists of defining and implementing a method for managing these components, accompanied by a set of rules identifying activities and the respective participants, in order to optimise existing means and avoid component shortages on the lines. The method is based on calculating the lines' needs: from the average weekly demand and the composition of the finished products, it defines a priority level among components, identifying those most requested by the lines and enabling the management of the flow racks. In the continuing effort to prevent shortages, a Kanban-type management system was developed, capable of managing semi-finished products for internal consumption. Improvements were also introduced that increase the efficiency of managing components in flow racks, reducing the capital tied up in stocks, leading to a rearrangement of layouts, providing better working conditions and optimising routes and resources. The following are described in detail: (i) the updating, definition and implementation of the method for managing components in flow racks, together with the corresponding set of rules; (ii) the implementation of a Kanban-type system oriented to the company's real concerns; (iii) the redefinition of layouts in line with the updated flow racks; and (iv) the identification and execution of a set of improvements. All these activities are accompanied by their financial impact on the organisation. Finally, the project is assessed and further improvement opportunities are suggested.
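A minimal sketch of the prioritisation rule described above: explode the average weekly demand of finished products through their composition to rank components by line consumption (product names, components and quantities are illustrative):

```python
# Average weekly demand of finished products (illustrative).
weekly_demand = {"mixer_A": 120, "mixer_B": 80}

# Composition of each finished product: component -> quantity per unit.
bom = {
    "mixer_A": {"cartridge": 1, "handle": 1, "screw": 4},
    "mixer_B": {"cartridge": 1, "spout": 1, "screw": 2},
}

needs = {}
for product, qty in weekly_demand.items():
    for comp, per_unit in bom[product].items():
        needs[comp] = needs.get(comp, 0) + qty * per_unit

# Highest weekly consumption first -> highest flow-rack priority.
for comp, qty in sorted(needs.items(), key=lambda kv: -kv[1]):
    print(comp, qty)
```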