960 results for "Softwares dinâmicos"
Abstract:
This internship report describes the activities carried out during the curricular internship of the Master's degree in Civil Engineering – Civil Constructions. It addresses a case study of a single-family dwelling for which a structural (stability) design was to be produced. The need for the structural analysis arose because the architectural design had been prepared in the company's office, but its structural design did not yet exist. The study involved building structural models in two automatic calculation software packages, one provided by the company and the other learned during the academic period: CypeCad 2016 and Robot Structural Analysis Professional 2015 (student version). The report discusses how each package works and how data are entered, identifying the actions and characteristic values selected for the structural calculation of the case study. In the course of the study it was found that there are several options for entering the actions on the structure: the model can be created automatically, that is, from predefined data in the software's database according to the chosen regulations, or more manually, with the user entering the desired values at the cost of introducing a larger amount of data. Accordingly, a comparative analysis was made of the models generated automatically and manually in each calculation package. Once the models were created and calculated, the internal forces in selected structural elements were compared and the differences in the results analysed. Finally, the structural design of the dwelling was carried out with the software used by the company, and all the stability-project drawings for the case study were produced.
Abstract:
The problems related to modelling water quality in reservoirs can be approached from different viewpoints. This work resorts to problem-solving methodologies from the scientific area of Artificial Intelligence, as well as to tools used in the search for solutions such as Decision Trees, Artificial Neural Networks and the Nearest-Neighbour method. Currently, the methods for assessing water quality are very restrictive because they do not indicate water quality in real time. The development of forecasting models based on techniques of Knowledge Discovery in Databases proved to be an alternative, supporting a pro-active approach that can contribute decisively to diagnosing, preserving and requalifying reservoirs. In the course of the work, unsupervised learning was used to study the dynamics of the reservoirs, and two distinct behaviours, related to the time of year, were described.
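As a minimal illustration of the unsupervised-learning step described above (not the thesis's actual code or data), the sketch below clusters water-quality samples into two groups, matching the two seasonal regimes mentioned in the abstract. The file name and the feature columns (temperature, pH, dissolved_oxygen, chlorophyll_a) are assumptions made only for this example.

import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical per-sample table of reservoir water-quality measurements.
samples = pd.read_csv("reservoir_samples.csv")
features = samples[["temperature", "pH", "dissolved_oxygen", "chlorophyll_a"]]

# Standardise so that no single variable dominates the distance metric.
X = StandardScaler().fit_transform(features)

# Two clusters, one per seasonal regime described in the abstract.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
samples["regime"] = model.labels_
print(samples.groupby("regime")[features.columns].mean())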
Abstract:
This work studies the application of Genetic Algorithms to anaerobic digestion modelling, in particular when dynamical models are used. Different types of bioreactors are presented, such as batch, semi-batch and continuous, together with their mathematical modelling. The work aims to estimate the parameter values of a two-reaction biological model by fitting it to simulated results in which only one output variable, the produced biogas, is known. The problems associated with inverse optimisation are therefore studied, using graphics that provide clues about the sensitivity and identifiability associated with the problem. Particular solutions obtained from the identifiability analysis using the GENSSI and DAISY software packages are also presented. Finally, the optimisation is performed using genetic algorithms. During this optimisation the need to improve the convergence of the genetic algorithms was felt, which led to an adaptation of the genetic algorithms called Neighbored Genetic Algorithms (NGA1 and NGA2). To assess whether this new approach outperforms the Basic Genetic Algorithm (BGA) and achieves the proposed goals, a study of 100 full optimisation runs was carried out for each situation. The results show that NGA1 and NGA2 are statistically better than BGA. However, because fully consistent results could not be obtained, the Nelder-Mead method was also used, with the estimates produced by the genetic algorithms as initial guesses.
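The two-stage workflow summarised above (a population-based global search refined by Nelder-Mead) can be sketched as follows. This is an illustration only: it uses SciPy's differential evolution as a stand-in for the genetic algorithms of the thesis, and the toy biogas model and parameters k1, k2 are assumptions made for the example.

import numpy as np
from scipy.optimize import differential_evolution, minimize

t = np.linspace(0, 30, 200)            # time in days
true_params = (0.35, 0.12)

def biogas_model(params, t):
    # Toy saturating response standing in for the anaerobic digestion model output.
    k1, k2 = params
    return k1 * (1 - np.exp(-k2 * t))

# "Measured" biogas: simulated output with a little noise, as in the abstract.
observed = biogas_model(true_params, t) + np.random.normal(0, 0.005, t.size)

def sse(params):
    return np.sum((biogas_model(params, t) - observed) ** 2)

# Stage 1: evolutionary global search (stand-in for the genetic algorithm).
coarse = differential_evolution(sse, bounds=[(0.0, 1.0), (0.0, 1.0)], seed=0)

# Stage 2: Nelder-Mead refinement seeded with the evolutionary estimate.
refined = minimize(sse, x0=coarse.x, method="Nelder-Mead")
print(coarse.x, refined.x)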
Abstract:
The 'Rama Forte' persimmon belongs to the group of cultivars whose fruits remain astringent even when ripe, so artificial astringency removal is required before consumption. Soluble tannins are responsible for this astringency, which produces a sensation of dryness on the palate. A qualitative method for determining the astringency index of persimmons was proposed by Gazit & Levy (1963). It consists of comparing the imprint obtained by pressing the persimmon pulp onto filter paper treated with ferric chloride (FeCl3) against a scale of scores ranging from 1 (non-astringent) to 5 (very astringent). Because it is a visual method and therefore prone to error, a new technique offering more reliable results is needed. The objective of this work was therefore to study the application of image-analysis software to these imprints in order to quantify, as a percentage, the area in which the soluble tannins reacted with the FeCl3, to group similar images and, finally, to determine the astringency index of the fruits.
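A hedged sketch of the image-based quantification proposed above: estimating the percentage of a filter-paper imprint that darkened after the tannin/FeCl3 reaction. The file name and the use of Otsu thresholding are assumptions for illustration, not the method adopted in the work.

import numpy as np
from skimage import io, color, filters

image = io.imread("imprint_sample.png")          # hypothetical scanned imprint (RGB)
gray = color.rgb2gray(image[:, :, :3])           # drop any alpha channel

# Pixels darker than the Otsu threshold are counted as reacted (stained) area.
threshold = filters.threshold_otsu(gray)
reacted = gray < threshold
percent_reacted = 100.0 * reacted.sum() / reacted.size
print(f"Reacted area: {percent_reacted:.1f}%")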
Abstract:
The current level of land-use change has an impact on global environmental change. Land-use and land-cover change processes are complex and do not occur at random over a region. These changes are generally determined locally, regionally or globally by geographic, environmental, social, economic and political factors interacting at several temporal and spatial scales. Part of this complexity is captured by land-use and land-cover change simulation models. One step of the CLUE-S model's simulation process is the quantification of the local influence of the drivers of change on the probability of occurrence of a land-use class. This local influence is obtained by fitting a logistic regression model. A spatial regression model is proposed here as an alternative for selecting the drivers of change, since it incorporates the spatial-neighbourhood information present in the data that is not considered by the logistic regression. Based on a linear-trend scenario for the aggregate land-use demand, land-use change simulations for the Coxim watershed, Mato Grosso do Sul, were generated, compared and analysed using the CLUE-S model under both the logistic and the spatial regression approaches for the period from 2001 to 2011. Both approaches produced simulations in very good agreement with the land use of the 2004 reference year, with high overall accuracy and Kappa values. The difference between the approaches was observed in the spatial distribution of the simulated land use for 2011, with the spatial regression approach yielding the simulation with the smallest discrepancy from the land-use demand for that year.
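A minimal sketch of the driver-quantification step described above: a logistic regression of the occurrence of one land-use class against candidate drivers, as done in CLUE-S. The column names and example drivers are assumptions; the spatial-regression alternative proposed in the work would additionally bring in neighbourhood information (for instance, the fraction of neighbouring cells already in the class) as an extra term.

import pandas as pd
from sklearn.linear_model import LogisticRegression

cells = pd.read_csv("grid_cells.csv")            # hypothetical per-cell table
drivers = cells[["slope", "elevation", "dist_to_road", "soil_index"]]
target = cells["is_pasture"]                     # 1 if the land-use class occurs in the cell

model = LogisticRegression(max_iter=1000).fit(drivers, target)
print(dict(zip(drivers.columns, model.coef_[0])))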
Abstract:
Agriculture is the human activity that most affects the environment. Huge amounts of agricultural inputs are applied to the soil, and a large share of them degrades water resources. A proper investigation of the effect of these inputs requires studying the hydraulic properties of the soil, which influence solute transport in this medium. Measuring such properties and modelling the related parameters are extremely complex tasks because of the time, money, instrumentation and scale required. Conventional methodologies infer the hydraulic properties from samples at equilibrium, using invasive techniques and under special restrictions. This thesis contributes to Environmental Science, via Soil Science, by proposing a new method for studying water infiltration in the unsaturated zone of the soil using computed tomography (CT). The tomograph was developed and built in this work. Here, CT measured the moisture content during unsaturated flow and, through the numerical solution of the Richards equation and the Rossi-Nimmo model, the retention curve, sorptivity and diffusivity were obtained. Qualitative results, such as 2D and 3D images, and quantitative results showed a good correlation between the proposed method and the traditional method of measuring the retention curve. Structured soil samples were analysed in the laboratory and in the field.
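For reference, a standard water-content (diffusivity) form of the Richards equation for vertical unsaturated flow, consistent with the quantities obtained above, is

\[
\frac{\partial \theta}{\partial t}
  = \frac{\partial}{\partial z}\!\left[ D(\theta)\,\frac{\partial \theta}{\partial z} \right]
  - \frac{\partial K(\theta)}{\partial z},
\qquad
D(\theta) = K(\theta)\,\frac{\partial h}{\partial \theta},
\]

where \(\theta\) is the volumetric water content, \(z\) the depth, \(K(\theta)\) the hydraulic conductivity, \(h\) the matric potential head and \(D(\theta)\) the soil-water diffusivity. This is the generic textbook form; the exact formulation and boundary conditions used in the thesis may differ.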
Abstract:
The little grey cat engine (greyCat) is part of a series of projects which explore software that can enable access to the potentially empowering nature of represented space and game design. GreyCat is the result of research into the culture of the software itself in order to provide participatory environments which enable the telling of 'small stories' – stories and experiences of the everyday, or of a cultural perspective other than that prioritised by most world-building software or game engines. GreyCat offers a simple framework which allows participants to use their own image materials (photographs for the most part) as a basis for spatial exploration of their own places.

Truna aka j.turner (2008) The little grey cat engine: telling small stories (Demo), Australasian Computer Human Interaction Conference, OZCHI 2008, December 8th-12th, Cairns, Australia

Research Publications:
truna aka j.turner & Browning, D. (2009) Designing spatial story telling software, in proceedings OZCHI09, Melbourne
Truna aka j.turner, Browning, D. & Champion, E. (2008) Designing for Engaged Experience, in proceedings Australasian Computer Human Interaction Conference, OZCHI 2008, December 8th-12th, Cairns, Australia
Truna aka j.turner & Bidwell, N. (2007) Through the looking glass: game worlds as representations and views from elsewhere, in proceedings of the 4th Australasian Conference on Interactive Entertainment, Melbourne, Australia
Truna aka j.turner, Browning, D. & Bidwell, N. (2007) Wanderer beyond game worlds, in proceedings, Hutchinson, A. (ed) PerthDAC 2007: The Seventh International Digital Arts and Culture Conference: The future of digital media culture, 15-18 September 2007, Perth, Australia, Curtin University of Technology
Truna aka j.turner (2006) To explore strange new worlds: experience design in 3 dimensional immersive environments – role and place in a world as object of interaction, in proceedings, Australasian Computer Human Interaction Conference, OZCHI 2006, Sydney, Australia, November 20th-24th 2006, pp 26-29
Truna aka j.turner (2006) Digital songlines environment (Demonstration), in proceedings 2006 International Conference on Game Research and Development, Perth, Australia
Truna aka j.turner (2006) Destination Space: Experiential Spatiality and Stories, Special Session on Experiential Spatiality, in proceedings 2006 International Conference on Game Research and Development, Perth, Australia
Abstract:
Computer-aided technologies, medical imaging, and rapid prototyping have created new possibilities in biomedical engineering. The systematic variation of scaffold architecture, as well as the mineralization inside a scaffold/bone construct, can be studied using computer imaging technology, CAD/CAM and micro-computed tomography (micro-CT). In this paper, the potential of combining these technologies is exploited in the study of scaffolds and osteochondral repair. Porosity, surface area per unit volume and the degree of interconnectivity were evaluated through imaging and computer-aided manipulation of the scaffold scan data. For the osteochondral model, the spatial distribution and the degree of bone regeneration were evaluated. In this study, the versatility of two software packages, Mimics (Materialize) and CTan with 3D realistic visualization (Skyscan), was also assessed.
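As a hedged illustration of two of the metrics named above (not the Mimics/CTan workflow used in the paper), the sketch below computes porosity and surface area per unit volume from a binarised micro-CT volume. The file name and voxel size are assumptions.

import numpy as np
from skimage import measure

volume = np.load("scaffold_binary.npy")     # hypothetical 3D array: 1 = material, 0 = pore
voxel = 0.01                                # assumed voxel edge length in mm

porosity = 1.0 - volume.mean()

# Triangulate the material/pore interface and sum the triangle areas.
verts, faces, _, _ = measure.marching_cubes(volume.astype(float), level=0.5,
                                            spacing=(voxel, voxel, voxel))
surface_area = measure.mesh_surface_area(verts, faces)       # mm^2
total_volume = volume.size * voxel ** 3                      # mm^3
print(f"porosity = {porosity:.3f}, S/V = {surface_area / total_volume:.2f} 1/mm")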
Abstract:
The economic environment of today can be characterized as highly dynamic and competitive, if not in constant flux. Globalization and the Information Technology (IT) revolution are perhaps the main contributing factors to this observation. While companies have to some extent adapted to the current business environment, new pressures such as the recent increase in environmental awareness and its likely effects on regulations are underway. Hence, in the light of market and competitive pressures, companies must constantly evaluate and, if necessary, update their strategies to sustain and increase the value they create for shareholders (Hunt and Morgan, 1995; Christopher and Towill, 2002). One way to create greater value is to become more efficient in producing and delivering goods and services to customers, which can lead to a strategy known as cost leadership (Porter, 1980). Even though Porter (1996) notes that in the long run cost leadership may not be a sufficient strategy for competitive advantage, operational efficiency is certainly necessary and should therefore be on the agenda of every company.

Better workflow management, technology and resource utilization can lead to greater internal operational efficiency, which explains why, for example, many companies have recently adopted Enterprise Resource Planning (ERP) systems: integrated software that streamlines business processes. However, as more and more companies approach internal operational excellence, the focus for finding inefficiencies and cost-saving opportunities is moving beyond the boundaries of the firm. Today many firms in the supply chain are engaging in collaborative relationships with customers, suppliers and third parties (services) in an attempt to cut costs related to, for example, inventory and production, as well as to facilitate synergies. Thus, recent years have witnessed fluidity and blurring of organizational boundaries (Coad and Cullen, 2006).

The Information Technology (IT) revolution of the late 1990s has played an important role in bringing organizations closer together. In their efforts to become more efficient, companies first integrated their information systems to speed up transactions such as ordering and billing. Later, collaboration on a multidimensional scale including logistics, production and Research & Development became evident as companies expected substantial benefits from collaboration. However, one could also argue that the recent popularity of the concepts falling under Supply Chain Management (SCM), such as Vendor Managed Inventory and Collaborative Planning, Forecasting and Replenishment, owes much to the marketing efforts of the software vendors and consultants who provide these solutions. Nevertheless, reports from professional organizations as well as academia indicate that the trend towards interorganizational collaboration is gaining wider ground. For example, the ARC Advisory Group, a research organization on supply chain solutions, estimated that the market for SCM, which includes various kinds of collaboration tools and related services, would grow at an annual rate of 7.4% during the years 2004-2008, reaching $7.4 billion in 2008 (Engineeringtalk 2004).
Abstract:
Computational Intelligence Systems (CIS) are an advanced class of software. CIS occupy an important position in solving single-objective, reverse/inverse and multi-objective design problems in engineering. This paper hybridises a CIS for optimisation with the concept of Nash equilibrium as an optimisation pre-conditioner to accelerate the optimisation process. The hybridised CIS (Hybrid Intelligence System), coupled to a Finite Element Analysis (FEA) tool and one type of Computer Aided Design (CAD) system, GiD, is applied to solve an inverse engineering design problem: the reconstruction of High Lift Systems (HLS). Numerical results obtained by the hybridised CIS are compared to the results obtained by the original CIS. The benefits of using the concept of Nash equilibrium are clearly demonstrated in terms of solution accuracy and optimisation efficiency.
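The idea of a Nash equilibrium as an optimisation pre-conditioner can be illustrated, in heavily simplified form, as follows: two "players" each best-respond on their own subset of design variables, and the resulting equilibrium point warm-starts the full optimiser. This is a generic sketch on a toy objective, not the paper's Hybrid Intelligence System coupled to FEA and GiD; the warm start via the x0 argument requires SciPy 1.7 or later.

import numpy as np
from scipy.optimize import minimize_scalar, differential_evolution

def objective(xy):
    # Toy design objective standing in for the inverse HLS reconstruction problem.
    x, y = xy
    return (x - 1.2) ** 2 + (y + 0.7) ** 2 + 0.3 * np.sin(3 * x * y) ** 2

# Nash iteration: player 1 owns x, player 2 owns y; each best-responds in turn.
x, y = 0.0, 0.0
for _ in range(20):
    x = minimize_scalar(lambda xv: objective((xv, y)), bounds=(-5, 5), method="bounded").x
    y = minimize_scalar(lambda yv: objective((x, yv)), bounds=(-5, 5), method="bounded").x

# The (near-)equilibrium point then seeds the global optimisation stage.
result = differential_evolution(objective, bounds=[(-5, 5), (-5, 5)], x0=[x, y], seed=0)
print((x, y), result.x)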
Abstract:
Voltage drop and rise at network peak and off-peak periods, together with voltage unbalance, are the major power quality problems in low voltage distribution networks. Utilities usually try to address voltage drop by adjusting transformer tap changers, and voltage unbalance by distributing loads equally across phases. On the other hand, the ever increasing energy demand, along with the need for cost reduction and higher reliability, is driving modern power systems towards Distributed Generation (DG) units. These can take the form of small rooftop photovoltaic cells (PV), Plug-in Electric Vehicles (PEVs) or Micro Grids (MGs). Rooftop PVs, typically with power levels ranging from 1-5 kW and installed by householders, are gaining popularity due to their financial benefits for the householders. PEVs will also soon appear in residential distribution networks; they behave as a large residential load while being charged, and later generations are expected to support the network as small DG units that transfer the energy stored in their battery into the grid. Furthermore, the MG, a cluster of loads and several DG units such as diesel generators, PVs, fuel cells and batteries, has recently been introduced to distribution networks. Voltage unbalance in the network can increase due to the uncertainties in the connection point of the PVs and PEVs, their nominal capacity and their time of operation. It is therefore of high interest to investigate the voltage unbalance in these networks resulting from the integration of MGs, PVs and PEVs into low voltage networks. In addition, the network might experience non-standard voltage drop due to high penetration of PEVs being charged at night, or non-standard voltage rise due to high penetration of PVs and PEVs feeding electricity back into the grid during off-peak periods. In this thesis, a voltage unbalance sensitivity analysis and stochastic evaluation is carried out for PVs installed by householders, considering their installation point, nominal capacity and penetration level as different uncertainties. A similar analysis is carried out for PEV penetration in the network in two different modes: grid-to-vehicle and vehicle-to-grid. Furthermore, conventional methods for improving voltage unbalance within these networks are discussed, and new, efficient methods are proposed for improving the voltage profile at network peak and off-peak periods and for reducing voltage unbalance. In addition, voltage unbalance reduction is investigated for MGs, and new improvement methods are proposed and applied to the MG test bed planned to be established at Queensland University of Technology (QUT). MATLAB and PSCAD/EMTDC simulation software are used to verify the analyses and proposals.
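For reference, the voltage unbalance studied above is conventionally quantified by the voltage unbalance factor (VUF), the ratio of negative- to positive-sequence voltage. A small self-contained helper follows; the phase voltages in the example are assumed values, purely for illustration.

import numpy as np

def vuf_percent(va, vb, vc):
    # Voltage unbalance factor (%) from complex phase-to-neutral voltages.
    a = np.exp(2j * np.pi / 3)
    v1 = (va + a * vb + a ** 2 * vc) / 3     # positive-sequence component
    v2 = (va + a ** 2 * vb + a * vc) / 3     # negative-sequence component
    return 100.0 * abs(v2) / abs(v1)

# Example: a slightly unbalanced 230 V low-voltage feeder.
vb = 225 * np.exp(-1j * 2 * np.pi / 3)
vc = 236 * np.exp(1j * 2 * np.pi / 3)
print(f"VUF = {vuf_percent(230, vb, vc):.2f}%")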
Abstract:
Peeling is an essential phase of the post-harvest and processing industry; however, undesirable processing losses are unavoidable and have always been the main concern of the food processing sector. There are three methods of peeling fruits and vegetables, namely mechanical, chemical and thermal, depending on the class and type of fruit. By comparison, mechanical methods are the most preferred: they do not create any harmful effects on the tissue and they keep the edible portions of the produce fresh. The main disadvantage of mechanical peeling is the rate of material loss and deformation. Reducing material losses and increasing the quality of the process obviously has a direct effect on the overall efficiency of the food processing industry, which calls for further study of the technological aspects of these operations. In order to enhance the effectiveness of industrial food practices, it is essential to have a clear understanding of the material properties and the behaviour of tissues under industrial processes. This paper presents a scheme of research that seeks to examine tissue damage of tough-skinned vegetables under the mechanical peeling process by developing a novel FE model of the process using an explicit dynamic finite element analysis approach. A computer model of the mechanical peeling process will be developed in this study to simulate the energy consumption and the stress-strain interactions of cutter and tissue. Available finite element software packages and methods will be applied to establish the model. Improving the knowledge of the interactions and variables involved in food operations, particularly in the peeling process, is the main objective of the proposed study. Understanding these interrelationships will help researchers and designers of food processing equipment to develop new and more efficient technologies. The presented work reviews the available literature and previous work done in this area of research and identifies the current gap in the modelling and simulation of food processes.
Abstract:
Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and comfortable, healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Among current building performance guidelines, daylight factors (DF) or minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. New computer software for daylighting analysis contains newer, more advanced daylighting metrics (Climate-Based Daylight Metrics, CBDM). Yet these tools (new metrics or simulation tools) are not currently understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most used by industry. The purpose of this paper is to assess and compare these computer simulation tools and the new tools available to architects and designers for daylighting. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities) and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that import a wide variety of file formats or can be integrated into the current 3D modelling software or package. These tools need to be able to calculate both point-in-time simulations and annual analyses. There is a current need among these software solutions for an open-source program able to read raw data (in the form of spreadsheets) and display it graphically within a 3D medium. Development of plug-in based software is currently trying to meet this need through third-party analysis; however, some of these packages are heavily reliant on their host program. Programs that allow dynamic daylighting simulation will make it easier to calculate accurate daylighting regardless of the modelling platform the designer uses, while producing more tangible analysis today without the need to process raw data.
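For context, the daylight factor named above is conventionally defined as the ratio of the illuminance at a point indoors to the simultaneous unobstructed horizontal illuminance outdoors under a CIE overcast sky,

\[
\mathrm{DF} = \frac{E_{\mathrm{in}}}{E_{\mathrm{ext}}} \times 100\%,
\]

which is a static metric; the climate-based metrics (CBDM) discussed in the paper instead evaluate illuminance over an annual climate file.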
Abstract:
An efficient strategy for identification of delamination in composite beams and connected structures is presented. A spectral finite-element model consisting of a damaged spectral element is used for model-based prediction of the damaged structural response in the frequency domain. A genetic algorithm (GA) specially tailored for damage identification is derived and integrated with the finite-element code for automation. For best application of the GA, the sensitivities of various objective functions with respect to the delamination parameters are studied and important conclusions are presented. Model-based simulations of increasing complexity illustrate some of the attractive features of the strategy in terms of accuracy as well as computational cost. This shows the possibility of using such strategies in the development of smart structural health monitoring software and systems.
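As a hedged illustration of the kind of objective function discussed above (not necessarily the exact one derived in the paper), a GA-based delamination search typically minimises a residual between measured and model-predicted frequency-domain responses over the delamination parameters \(\mathbf{p}\) (e.g. location and size):

\[
J(\mathbf{p}) \;=\; \sum_{\omega}\,\bigl|\hat{u}_{\mathrm{meas}}(\omega) - \hat{u}_{\mathrm{model}}(\omega;\mathbf{p})\bigr|^{2},
\]

where \(\hat{u}(\omega)\) denotes the spectral response at a sensor location; the GA then searches for the \(\mathbf{p}\) that minimises \(J\).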
Reconstructing Solid Model from 2D Scanned Images of Biological Organs for Finite Element Simulation
Abstract:
This work presents a methodology to reconstruct 3D biological organs from image sequences or other scan data using readily available free software, with the final goal of using the organs (3D solids) for finite element analysis. The methodology deals with issues such as segmentation, conversion to polygonal surface meshes, and finally conversion of these meshes to 3D solids. The user is able to control the detail or the level of complexity of the solid constructed. The methodology is illustrated using the 3D reconstruction of a porcine liver as an example. Finally, the reconstructed liver is imported into the commercial software ANSYS and, together with a cyst inside the liver, a nonlinear analysis is performed. The results confirm that the methodology can be used to obtain the 3D geometry of biological organs and that the geometry obtained in this way can be used for nonlinear finite element analysis of organs. The methodology would be of use in surgery planning and surgery simulation, since both make extensive use of finite elements for numerical simulations and it is better if these simulations are carried out on patient-specific organ geometries. Buying commercial software that can reconstruct 3D biological organs from scanned image sequences would cost far more than following the present methodology.
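A minimal sketch of the reconstruction pipeline summarised above, using freely available Python libraries (scikit-image and trimesh) rather than the specific free tools used in the work; the file names and the simple global threshold are assumptions made for illustration.

import numpy as np
from skimage import io, filters, measure
import trimesh

# 1. Load the scanned slice sequence into a 3D volume (z, y, x).
volume = io.imread("organ_slices.tif")            # hypothetical multi-page TIFF

# 2. Segment the organ from the background (global Otsu threshold as a stand-in).
mask = volume > filters.threshold_otsu(volume)

# 3. Extract a polygonal surface mesh from the binary volume.
verts, faces, _, _ = measure.marching_cubes(mask.astype(float), level=0.5)

# 4. Export the surface for downstream solid modelling, meshing and FE analysis.
trimesh.Trimesh(vertices=verts, faces=faces).export("organ_surface.stl")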