9 results for Analysis Tools

at Instituto Politécnico do Porto, Portugal


Relevance: 60.00%

Abstract:

This work was carried out within a curricular internship at the SE2P design office, during which structural fire design tools were developed and integrated into a workflow that follows the principles of BIM (Building Information Modeling) technology. In particular, a fire analysis procedure was implemented according to the simplified models prescribed by the Eurocodes. These models ensure structural safety and allow the passive protection requirements for different scenarios to be determined quickly and efficiently, with a view to obtaining the most economical solution. Beyond presenting the work developed during the internship, this dissertation aims to provide the reader with a document that introduces the main concepts of structural design in fire situations, outlining the available analysis options and their respective advantages and disadvantages, and helping to assess their suitability for the project under study. In this context, a general introduction to the phenomenon of fire and to the most common protection measures is given, together with the normative documents applicable both to structural design and to protection materials. The interaction between the various standards that must be consulted when a fire analysis is performed, and which of them apply to each phase of the analysis, is also addressed. A clear distinction is made between the thermal and the mechanical analyses, indicating the main material properties relevant to each type of analysis and how they are affected by temperature. In the thermal analysis domain, the focus is essentially on the simplified calculation models for the temperature development in steel members and composite beams, with and without passive protection. In the mechanical analysis domain, the simplified calculation models for verifying structural safety are described, taking into account the actions and load combinations in fire situations and the loss of strength at elevated temperatures. Regarding the work carried out at SE2P on the development of calculation tools and their use in fire analysis, the whole process is described in detail, including how it fits into the BIM concept, drawing information from the structural models and feeding new data back into them. The full analysis procedure and the developed tools were also applied to a case study based on a residential building. This case study was further used to create optimization scenarios using market price references for steel, its fabrication and passive protection systems, demonstrating the difficulty of finding quick and direct decision paths in the optimization process.
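
For readers unfamiliar with the simplified Eurocode models mentioned above, the following is a minimal sketch (not the SE2P tool itself) of the EN 1993-1-2 step-by-step heating calculation for an unprotected steel member exposed to the ISO 834 standard fire; the section factor and the constant specific heat are illustrative assumptions.

```python
# Sketch of the EN 1993-1-2 incremental heating model for an UNPROTECTED
# steel member under the ISO 834 standard fire. Constants follow the usual
# Eurocode values; the section factor and constant c_a are assumptions.
import math

SIGMA = 5.67e-8           # Stefan-Boltzmann constant [W/m2K4]
ALPHA_C = 25.0            # convective coefficient, standard fire [W/m2K]
EPS_M, EPS_F = 0.7, 1.0   # emissivity of steel surface / of the fire
RHO_A = 7850.0            # steel density [kg/m3]
C_A = 600.0               # simplified constant specific heat [J/kgK]

def iso834(t_s):
    """Gas temperature [C] of the ISO 834 curve; t_s in seconds."""
    return 20.0 + 345.0 * math.log10(8.0 * t_s / 60.0 + 1.0)

def unprotected_steel_temperature(section_factor, k_sh=1.0,
                                  duration_s=1800, dt=5.0):
    """Step rule: d_theta = k_sh * (Am/V) / (c_a * rho_a) * h_net * dt."""
    theta_a, t, history = 20.0, 0.0, []
    while t <= duration_s:
        theta_g = iso834(t)
        h_net = (ALPHA_C * (theta_g - theta_a)
                 + EPS_M * EPS_F * SIGMA
                 * ((theta_g + 273.0) ** 4 - (theta_a + 273.0) ** 4))
        theta_a += k_sh * section_factor / (C_A * RHO_A) * h_net * dt
        history.append((t, theta_a))
        t += dt
    return history

# e.g. a profile with an assumed section factor Am/V of 200 m^-1
print(unprotected_steel_temperature(200.0)[-1])
```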

Relevance: 40.00%

Abstract:

Si3N4 tools were coated with a thin diamond film in a Hot-Filament Chemical Vapour Deposition (HFCVD) reactor in order to machine grey cast iron. The wear behaviour of these tools in high-speed machining was the main subject of this work. Turning tests were performed with combinations of cutting speeds of 500, 700 and 900 m min−1 and feed rates of 0.1, 0.25 and 0.4 mm rot−1, while the depth of cut was kept constant at 1 mm. To evaluate tool behaviour during the turning tests, the cutting forces were analysed and showed a significant increase with feed rate. Removal of the diamond film occurred for the most severe set of cutting parameters. Adhesion of iron and manganese from the workpiece to the tool was also observed. The tests were performed on a CNC lathe fitted with a 3-axis dynamometer, and the results were collected and recorded by in-house software. Tool wear was analysed with a Scanning Electron Microscope (SEM) equipped with an X-ray Energy Dispersive Spectroscopy (EDS) system, and surface analysis was performed with a profilometer.
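
As a quick sanity check of the cutting conditions listed above (generic machining arithmetic, not part of the paper's analysis), the nominal material removal rate for each speed/feed combination at the constant 1 mm depth of cut can be computed as Q = a_p * f * v_c.

```python
# Nominal material removal rate for the quoted turning conditions.
# With a_p in mm, f in mm/rev and v_c in m/min, Q comes out in cm^3/min.
from itertools import product

depth = 1.0                      # a_p [mm], constant in the tests
speeds = [500, 700, 900]         # v_c [m/min]
feeds = [0.1, 0.25, 0.4]         # f [mm/rev]

for v_c, f in product(speeds, feeds):
    q = depth * f * v_c          # cm^3/min
    print(f"v_c={v_c} m/min, f={f} mm/rev -> Q = {q:.0f} cm^3/min")
```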

Relevance: 30.00%

Abstract:

This paper analyzes the DNA code of several species from the perspective of information content. For that purpose, several concepts and mathematical tools are selected to establish a quantitative method that does not a priori distort the alphabet represented by the sequence of DNA bases. The synergy of combining Gray code, histogram characterization and multidimensional scaling visualization leads to a collection of plots with a categorical representation of species and chromosomes.
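
The abstract names the ingredients without detail; below is a minimal sketch, under assumed choices (a particular 2-bit Gray labelling of the bases, word length k=3, Euclidean distances between histograms, toy sequences), of how such a Gray code / histogram / MDS pipeline can be wired together.

```python
# Rough sketch, not the paper's exact pipeline: bases get an assumed 2-bit
# Gray-code labelling, fixed-length words are histogrammed, and pairwise
# histogram distances are embedded with multidimensional scaling.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

GRAY = {"A": "00", "C": "01", "G": "11", "T": "10"}   # assumed assignment

def gray_histogram(seq, k=3):
    """Normalised histogram of the integer values of Gray-coded k-mers."""
    counts = np.zeros(4 ** k)
    for i in range(len(seq) - k + 1):
        word = seq[i:i + k]
        if all(b in GRAY for b in word):
            counts[int("".join(GRAY[b] for b in word), 2)] += 1
    return counts / max(counts.sum(), 1)

# toy sequences standing in for real chromosomes
sequences = {
    "toy_1": "ACGTACGTACGGTTAC",
    "toy_2": "GGGCCCATATATACGC",
    "toy_3": "ACACACACGTGTGTGT",
}
H = np.array([gray_histogram(s) for s in sequences.values()])
D = squareform(pdist(H, metric="euclidean"))
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
print(dict(zip(sequences, coords.tolist())))
```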

Relevance: 30.00%

Abstract:

Seismic data are difficult to analyze, and classical mathematical tools reveal strong limitations in exposing hidden relationships between earthquakes. In this paper, we study earthquake phenomena from the perspective of complex systems. Global seismic data covering the period from 1962 to 2011 are analyzed. The events, characterized by their magnitude, geographic location and time of occurrence, are divided into groups, either according to the Flinn-Engdahl (F-E) seismic regions of the Earth or using a rectangular grid based on latitude and longitude coordinates. Two methods of analysis are considered and compared in this study. In the first method, the distributions of magnitudes are approximated by Gutenberg-Richter (G-R) distributions and the fitted parameters are used to reveal the relationships among regions. In the second method, the mutual information is calculated and adopted as a measure of similarity between regions. In both cases, clustering analysis is used to generate visualization maps, providing an intuitive and useful representation of the complex relationships present in the seismic data. Such relationships might not be perceived on classical geographic maps, so the generated charts are a valid alternative to other visualization tools for understanding the global behavior of earthquakes.
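
The first method rests on the Gutenberg-Richter law, log10 N(>=M) = a - b*M. As a minimal illustration of how a per-region parameter can be obtained from a catalogue (Aki's maximum-likelihood b-value estimator, toy catalogues rather than the paper's data), consider:

```python
# Sketch only: per-region Gutenberg-Richter b-values via the Aki estimator,
# usable as simple features when comparing or clustering seismic regions.
import numpy as np

def b_value(magnitudes, m_min):
    """Aki (1965) estimator: b = log10(e) / (mean(M) - M_min)."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

# toy catalogues for two hypothetical regions
region_a = [4.1, 4.3, 4.0, 5.2, 4.8, 4.4, 6.0, 4.2]
region_b = [4.0, 4.1, 4.2, 4.0, 4.3, 4.1, 4.5, 4.9]
print(b_value(region_a, 4.0), b_value(region_b, 4.0))
```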

Relevance: 30.00%

Abstract:

The process of resources systems selection plays an important part in the integration of Distributed/Agile/Virtual Enterprises (D/A/VEs). However, resources systems selection is still a difficult problem to solve in a D/A/VE, as this paper points out. Broadly speaking, the selection problem has been formulated from different perspectives, giving rise to different kinds of models and algorithms to solve it. To support the development of an intelligent and flexible web prototype (broker tool) that integrates all the selection model activities and tools and can adapt to each D/A/VE project or instance (the major goal of our overall project), this paper presents a formulation of a particular class of resources selection problem and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, which is solved using simplex and branch-and-bound algorithms, and identify their performance limitations (in terms of processing time) based on simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, defining the domain of applicability of the algorithms for the problem studied. The limitations detected show the need for other kinds of algorithms (approximate solution algorithms) outside the domain of applicability found for the simulated algorithms. For a broker tool, however, knowledge of the algorithms' limitations is very important, so that, based on the problem features, the most suitable algorithm can be developed and selected to guarantee good performance.
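
The abstract does not give the exact formulation; as an illustration only, the sketch below states one simplified variant (assumed: each processing task receives exactly one pre-selected resource, and the objective is cost minimisation) and solves it with SciPy's branch-and-bound MILP solver.

```python
# Illustrative integer-programming sketch of a resources selection problem:
# binary x[t, r] = 1 if pre-selected resource r is assigned to task t.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# toy data: cost[t][r] of assigning resource r to processing task t
cost = np.array([[4.0, 2.0, 7.0],
                 [3.0, 6.0, 1.0],
                 [5.0, 5.0, 2.0]])
n_tasks, n_res = cost.shape
c = cost.ravel()                                  # flattened objective

# each processing task must receive exactly one resource
A = np.zeros((n_tasks, n_tasks * n_res))
for t in range(n_tasks):
    A[t, t * n_res:(t + 1) * n_res] = 1.0
constraints = LinearConstraint(A, lb=1, ub=1)

res = milp(c, constraints=constraints,
           integrality=np.ones_like(c, dtype=int),   # all variables integer
           bounds=Bounds(0, 1))                      # ...and binary via bounds
assignment = res.x.reshape(n_tasks, n_res).argmax(axis=1)
print(assignment, res.fun)
```

Processing time grows quickly with the number of tasks and of pre-selected resources per task, which is exactly the limitation the paper quantifies before turning to approximate algorithms.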

Relevance: 30.00%

Abstract:

The increasing complexity of VLSI circuits and the reduced accessibility of modern packaging and mounting technologies restrict the usefulness of conventional in-circuit debugging tools, such as in-circuit emulators for microprocessors and microcontrollers. However, this same trend enables the development of more complex products, which in turn require more powerful debugging tools. These conflicting demands could be met if the standard scan test infrastructures now common in most complex components were able to match the debugging requirements of design verification and prototype validation. This paper analyses the main debug requirements in the design of microprocessor-based applications and the feasibility of their implementation using the mandatory, optional and additional operating modes of the standard IEEE 1149.1 test infrastructure.
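
As background on what the standard test infrastructure provides (general IEEE 1149.1 behaviour, not material from the paper), the sketch below reproduces the 16-state TAP controller transition table, driven solely by TMS on each TCK edge, and shows the TMS sequence that reaches the instruction scan path.

```python
# IEEE 1149.1 TAP controller transitions: (next state if TMS=0, if TMS=1).
TAP_TRANSITIONS = {
    "Test-Logic-Reset": ("Run-Test/Idle", "Test-Logic-Reset"),
    "Run-Test/Idle":    ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-DR-Scan":   ("Capture-DR",    "Select-IR-Scan"),
    "Capture-DR":       ("Shift-DR",      "Exit1-DR"),
    "Shift-DR":         ("Shift-DR",      "Exit1-DR"),
    "Exit1-DR":         ("Pause-DR",      "Update-DR"),
    "Pause-DR":         ("Pause-DR",      "Exit2-DR"),
    "Exit2-DR":         ("Shift-DR",      "Update-DR"),
    "Update-DR":        ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-IR-Scan":   ("Capture-IR",    "Test-Logic-Reset"),
    "Capture-IR":       ("Shift-IR",      "Exit1-IR"),
    "Shift-IR":         ("Shift-IR",      "Exit1-IR"),
    "Exit1-IR":         ("Pause-IR",      "Update-IR"),
    "Pause-IR":         ("Pause-IR",      "Exit2-IR"),
    "Exit2-IR":         ("Shift-IR",      "Update-IR"),
    "Update-IR":        ("Run-Test/Idle", "Select-DR-Scan"),
}

def walk_tms(tms_bits, state="Test-Logic-Reset"):
    """Return the TAP state reached after clocking the given TMS bits."""
    for bit in tms_bits:
        state = TAP_TRANSITIONS[state][bit]
    return state

print(walk_tms([0, 1, 1, 0, 0]))   # -> Shift-IR (instruction scan path)
```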

Relevance: 30.00%

Abstract:

This study aims to optimize the water quality monitoring of a polluted watercourse (Leça River, Portugal) through principal component analysis (PCA) and cluster analysis (CA). These statistical methodologies were applied to physicochemical, bacteriological and ecotoxicological data (obtained with the marine bacterium Vibrio fischeri and the green alga Chlorella vulgaris) from water samples collected monthly at seven monitoring sites during five campaigns (February, May, June, August, and September 2006). The results of some variables were assigned to water quality classes according to national guidelines. Chemical and bacteriological quality data led to the classification of the Leça River water quality as “bad” or “very bad”. PCA and CA identified monitoring sites with similar pollution patterns, distinguishing site 1 (located in the upstream stretch of the river) from all other sampling sites downstream. Ecotoxicity results corroborated this classification, revealing differences in space and time. The present study includes not only physical, chemical and bacteriological but also ecotoxicological parameters, which opens new perspectives in river water characterization. Moreover, the application of PCA and CA is very useful for optimizing water quality monitoring networks, defining the minimum number of sites and their location. These tools can therefore support appropriate management decisions.
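
A minimal sketch of this kind of PCA plus cluster analysis on a sites-by-parameters matrix follows; the numbers are toy values arranged so that site 1 stands apart, mirroring the study's finding, and are not the Leça River data.

```python
# PCA and hierarchical cluster analysis on a small sites-by-parameters matrix.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# rows = monitoring sites 1..7, columns = measured parameters (illustrative)
X = np.array([
    [7.8, 2.1,  120, 0.5],
    [7.1, 6.4,  900, 2.3],
    [7.0, 7.2,  950, 2.6],
    [6.9, 8.0, 1100, 3.1],
    [6.8, 8.5, 1200, 3.4],
    [6.9, 8.1, 1150, 3.2],
    [6.7, 9.0, 1300, 3.8],
])
Xs = StandardScaler().fit_transform(X)

scores = PCA(n_components=2).fit_transform(Xs)   # principal component scores
labels = fcluster(linkage(Xs, method="ward"), t=2, criterion="maxclust")
print(scores.round(2))
print(labels)   # with this toy matrix, site 1 falls in its own cluster
```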

Relevance: 30.00%

Abstract:

Forest fire dynamics are often characterized by the absence of a characteristic length-scale, long-range correlations in space and time, and long memory, features also associated with fractional-order systems. In this paper, a public-domain forest fire catalogue containing information on events in Portugal over the period from 1980 to 2012 is analysed. The events are modelled as time series of Dirac impulses with amplitude proportional to the burnt area. These time series are viewed as the system output and interpreted as a manifestation of the system dynamics. In the first phase we use the pseudo phase plane (PPP) technique to describe forest fire dynamics. In the second phase we use multidimensional scaling (MDS) visualization tools. The PPP allows forest fire dynamics to be represented in a two-dimensional space, using time series representative of the phenomena. The MDS approach generates maps where objects perceived to be similar to each other are placed close together, forming clusters. The results are analysed in order to extract relationships among the data and to better understand forest fire behaviour.
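
A minimal sketch, with made-up events rather than the Portuguese catalogue, of how the impulse series and the pseudo phase plane can be constructed: the fire record becomes a series of impulses weighted by burnt area, and the PPP is simply the series plotted against a delayed copy of itself (the lag value here is an arbitrary choice for illustration).

```python
# Impulse time series and pseudo phase plane (PPP) construction.
import numpy as np

def impulse_series(events, n_days):
    """events: iterable of (day_index, burnt_area); returns a daily series."""
    x = np.zeros(n_days)
    for day, area in events:
        x[day] += area
    return x

def pseudo_phase_plane(x, tau):
    """Pairs (x[n], x[n+tau]) defining the PPP trajectory."""
    return np.column_stack((x[:-tau], x[tau:]))

toy_events = [(3, 12.0), (10, 4.5), (11, 80.0), (40, 7.2), (41, 150.0)]
x = impulse_series(toy_events, n_days=60)
print(pseudo_phase_plane(x, tau=7)[:5])
```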

Relevance: 30.00%

Abstract:

An overwhelming problem in mathematics curricula at Higher Education Institutions (HEI), one we have faced daily over the last decade, is the substantial difference in the mathematical background of our students. Trying to transmit, engage with and teach subjects and contents that the “audience” is unable to respond to, or even to understand, is frustrating. In this sense, the math projects and other didactic strategies developed through the Moodle Learning Management System, which include an array of activities combining higher-order thinking skills with math subjects and technology for higher education students, appear as remedial but important, proactive and innovative measures to face and try to overcome these considerable problems. In this paper we present some of these strategies, developed in several organic units of the Polytechnic Institute of Porto (IPP). But how “fruitful” are the endless hours teachers spend developing and implementing these platforms? Do students react to them as we would expect? Do they embrace this opportunity to overcome their difficulties? How do they interact individually with LMS platforms? Can this environment, which provides the teacher with many interesting tools to improve the teaching-learning process, encourage students to reinforce their abilities and knowledge? In what way do they use each available material (videos, interactive tasks, texts, among others)? What is the best way to assess students' performance in these online learning environments? Learning Analytics tools provide us with a huge amount of data, but how can we extract “good” and helpful information from it? These and many other questions remain unanswered, but we look forward to getting some help in at least drafting answers to them, because we feel that this “learning analysis”, which tackles the path from the objectives to the actual results, is perhaps the only way to move forward in the “best” learning and teaching direction.