52 results for 3D representation method
in Instituto Polit
Abstract:
Global warming and the associated climate changes have been the subject of intensive research due to their major impact on social, economic and health aspects of human life. Surface temperature time-series characterise Earth as a slow-dynamics spatiotemporal system, evidencing long-memory behaviour typical of fractional-order systems. Such phenomena are difficult to model and analyse, demanding alternative approaches. This paper studies the complex correlations between global temperature time-series using the multidimensional scaling (MDS) approach. MDS provides a graphical representation of the pattern of climatic similarities between regions around the globe. The similarities are quantified through two mathematical indices that correlate the monthly average temperatures observed in meteorological stations over a given period of time. Furthermore, time dynamics is analysed by performing the MDS analysis over slices sampling the time series. MDS generates maps describing the stations' loci such that stations perceived as similar to each other are placed on the map forming clusters. We show that MDS provides an intuitive and useful visual representation of the complex relationships present among temperature time-series, which are not perceived on traditional geographic maps. Moreover, MDS avoids sensitivity to the irregular distribution density of the meteorological stations.
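The following sketch illustrates the core of such an MDS analysis in Python, assuming the station-versus-station similarity is a plain Pearson correlation converted to a dissimilarity; the paper's two actual indices, and real station data, would replace the placeholders used here.

```python
# Minimal sketch: MDS map of stations from temperature correlations.
# `temps` stands in for an (n_stations, n_months) array of monthly means;
# the correlation-based index below is illustrative, not the paper's exact one.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
temps = rng.normal(15, 8, size=(10, 240))          # placeholder data

corr = np.corrcoef(temps)                          # station-vs-station Pearson r
dissim = 1.0 - corr                                # similar stations -> small distance
np.fill_diagonal(dissim, 0.0)

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)                 # 2D locus of each station
print(coords)
```

Clusters in `coords` then correspond to regions with similar climatic behaviour, independently of where the stations sit on a geographic map.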
Abstract:
For the design of any existing structure (buildings, bridges, vehicles, machines, etc.) it is necessary to know the loading conditions, geometry and behaviour of all its parts, as well as to comply with the standards in force in the countries where the structure will be used. The first part of any project in this area is the structural analysis phase, in which all the interactions and effects of loads on the physical structures and their components are calculated in order to verify the structure's fitness for use. Initially one starts from a structure with simplified geometry, setting aside physically irrelevant elements (fasteners, coatings, etc.) so as to simplify the calculation of complex structures and, depending on the results of the structural analysis, improve the structure if necessary. Finite element analysis is the main tool during this first phase of the project, and nowadays, given market demands, computational support is indispensable to speed up this phase. A wide range of programs exists for this purpose, covering tasks from structural design, static load analysis, and dynamic and vibration analysis to real-time visualization of physical behaviour (deformations), allowing the optimization of the structure under analysis. However, these programs show a certain complexity in the input of parameters, often leading to wrong results. It is therefore essential for the designer to have a reliable, easy-to-use tool for structural design and optimization purposes. On this basis this thesis project was born, in which a program with a graphical interface was developed in the Matlab® environment for finite element analysis of structures with bar and beam elements, in both 2D and 3D. The program lets the user define the structure by means of coordinates, entered quickly and clearly, together with the mechanical properties of the elements, boundary conditions and applied loads. As results it returns to the user the reactions, deformations and stress distributions in the elements, both in tabular form and as a graphical representation over the structure under analysis. Data can also be imported and results exported as XLS and XLSX files, to ease information management. Several tests and structural analyses were carried out to validate the program's results and its integrity. All results were satisfactory and converge to the results of other programs, of examples published in books, and of hand calculations performed by the author.
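As a minimal illustration of the direct-stiffness computation such a program performs for bar (truss) elements, here is a sketch in Python rather than the thesis's Matlab GUI; the geometry, section properties and load values are invented for the example.

```python
# Illustrative direct-stiffness solve for a tiny 2D truss (not the thesis's
# Matlab program): two bars, one free loaded node, two pinned supports.
import numpy as np

E, A = 210e9, 1e-4                       # steel, 1 cm^2 cross-section
nodes = np.array([[0.0, 0.0],            # node 0 (pinned)
                  [2.0, 0.0],            # node 1 (pinned)
                  [1.0, 1.0]])           # node 2 (free, loaded)
bars = [(0, 2), (1, 2)]

K = np.zeros((6, 6))                     # 2 DOF per node
for i, j in bars:
    d = nodes[j] - nodes[i]
    L = np.linalg.norm(d)
    c, s = d / L                         # direction cosines
    k = (E * A / L) * np.outer([-c, -s, c, s], [-c, -s, c, s])
    dofs = [2 * i, 2 * i + 1, 2 * j, 2 * j + 1]
    K[np.ix_(dofs, dofs)] += k           # assemble element stiffness

F = np.zeros(6)
F[5] = -10e3                             # 10 kN downward at node 2
free = [4, 5]                            # only node 2 is unconstrained
u = np.zeros(6)
u[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])
print("node 2 displacement [m]:", u[4], u[5])
```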
Abstract:
Underground scenarios are among the most challenging environments for accurate and precise 3D mapping, where hostile conditions such as the absence of Global Positioning System coverage, extreme lighting variations, and geometrically smooth surfaces are to be expected. So far, state-of-the-art methods in underground modelling remain restricted to environments in which pronounced geometric features are abundant. This limitation is a consequence of the scan-matching algorithms used to solve the localization and registration problems. This paper contributes to the expansion of modelling capabilities to structures characterized by uniform geometry and smooth surfaces, as is the case of road and train tunnels. To achieve that, we combine state-of-the-art techniques from mobile robotics and propose a method for 6DOF platform positioning in such scenarios, which is later used for environment modelling. A visual monocular Simultaneous Localization and Mapping (MonoSLAM) approach based on the Extended Kalman Filter (EKF), complemented by the introduction of inertial measurements in the prediction step, allows our system to localize itself over long distances using exclusively sensors carried on board a mobile platform. By feeding the Extended Kalman Filter with inertial data we were able to overcome the major problem of MonoSLAM implementations, known as scale-factor ambiguity. Despite extreme lighting variations, reliable visual features were extracted through the SIFT algorithm and inserted directly into the EKF mechanism according to the Inverse Depth Parametrization. Wrong frame-to-frame feature matches were rejected through 1-Point RANSAC (Random Sample Consensus). The developed method was tested on a dataset acquired inside a road tunnel, and the navigation results were compared with a ground truth obtained by post-processing a high-grade Inertial Navigation System and L1/L2 RTK-GPS measurements acquired outside the tunnel. Results from the localization strategy are presented and analyzed.
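To make the "inertial data in the prediction step" idea concrete, here is a deliberately reduced sketch: a 1D position-velocity EKF prediction driven by a measured acceleration. The paper's filter is a full 6DOF MonoSLAM; only the mechanism by which an inertial input anchors the metric scale is shown, and all numbers are illustrative.

```python
# Hedged sketch of feeding inertial data into an EKF prediction step:
# a 1D state [position, velocity] propagated with a measured acceleration.
import numpy as np

def ekf_predict(x, P, a_meas, dt, sigma_a=0.1):
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])              # constant-velocity transition
    B = np.array([dt**2 / 2.0, dt])         # how acceleration enters the state
    x = F @ x + B * a_meas                  # metric inertial input fixes the scale
    Q = sigma_a**2 * np.outer(B, B)         # process noise from accel noise
    P = F @ P @ F.T + Q
    return x, P

x, P = np.zeros(2), np.eye(2)
x, P = ekf_predict(x, P, a_meas=0.5, dt=0.01)
print(x, np.diag(P))
```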
Abstract:
This work presents an automatic calibration method for a vision-based external underwater ground-truth positioning system. Such systems are a relevant tool in benchmarking and assessing the quality of research in underwater robotics applications. In suitable environments, such as test tanks or clear-water conditions, a stereo vision system can provide accurate positioning with low cost and flexible operation. In this work we present a two-step extrinsic camera parameter calibration procedure intended to reduce setup time and provide accurate results. The proposed method uses a planar homography decomposition to determine the relative camera poses, and the determination of vanishing points of detected lines in the image to obtain the global pose of the stereo rig in the reference frame. This method was applied to our external vision-based ground-truth system at the INESC TEC/Robotics test tank. Results are presented in comparison with a precise calibration performed using points obtained from an accurate 3D LIDAR model of the environment.
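A hedged sketch of the two geometric ingredients named above, using OpenCV's stock homography decomposition and a cross-product vanishing-point construction; the intrinsic matrix, homography and line endpoints are made up for illustration.

```python
# (1) Relative camera pose from a planar homography (OpenCV built-in).
# (2) A vanishing point as the intersection of two image lines in
#     homogeneous coordinates. All values below are illustrative.
import numpy as np
import cv2

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])            # assumed intrinsics

H = np.array([[0.9, 0.05, 10.0],           # made-up homography between
              [-0.04, 0.95, 5.0],          # two views of the same plane
              [1e-5, 2e-5, 1.0]])
n_sol, Rs, ts, normals = cv2.decomposeHomographyMat(H, K)
print(f"{n_sol} candidate (R, t, n) solutions")  # pruned with visibility checks

# Vanishing point of two parallel scene lines: cross product of their
# homogeneous image-line representations.
l1 = np.cross([100, 200, 1], [400, 260, 1])      # line through two points
l2 = np.cross([120, 400, 1], [420, 430, 1])
vp = np.cross(l1, l2)
print("vanishing point:", vp[:2] / vp[2])
```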
Abstract:
Learning is not a spectator's sport. Students do not learn much by just sitting in class listening to their teachers, memorizing pre-packaged assignments and spitting out answers. The teaching-learning process has been a constant target of studies, particularly in Higher Education, as a consequence of the annual increase in new students. The concern with maintaining a desired quality level in the training of these students, combined with the will to widen access to all of those who finish Secondary School Education, has triggered greater intervention from education specialists, in partnership with teachers from all Higher Education areas, in the analysis of this problem. Considering the particular case of Engineering, there has been a rising concern with active learning strategies and forms of assessment. Research has demonstrated that students learn more if they are actively engaged with the material they are studying. In this presentation we describe, present and discuss the techniques and the results of the Peer Instruction method in an introductory Calculus course of an Engineering Bachelor's degree.
Abstract:
In this paper we present a Constraint Logic Programming (CLP) based model and a hybrid solving method for the scheduling of maintenance activities in the power transmission network. The model is distinguished from others not only by its completeness but also by the way it models and solves the electrical constraints; specifically, we present an efficient filtering algorithm for the electrical constraints. Furthermore, the solving method improves on the efficiency of pure CLP methods by integrating a form of local search with CLP. To test the approach, we compare the method's results with those of another method on a 24-bus network with 42 tasks and 24 maintenance periods.
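The abstract does not give the model itself, so the following is only a generic toy with the same shape, tasks assigned to maintenance periods under a capacity constraint standing in for the electrical constraints, written with OR-tools CP-SAT rather than a CLP system.

```python
# Generic toy of the scheduling model's shape (not the paper's CLP model or
# its electrical-constraint filtering): assign each maintenance task to a
# period so that at most `cap` tasks are out in the same period.
from ortools.sat.python import cp_model

n_tasks, n_periods, cap = 6, 4, 2
model = cp_model.CpModel()
start = [model.NewIntVar(0, n_periods - 1, f"t{i}") for i in range(n_tasks)]

for p in range(n_periods):
    hits = []
    for i in range(n_tasks):
        b = model.NewBoolVar(f"t{i}_in_{p}")
        model.Add(start[i] == p).OnlyEnforceIf(b)
        model.Add(start[i] != p).OnlyEnforceIf(b.Not())
        hits.append(b)
    model.Add(sum(hits) <= cap)        # stands in for the network constraints

model.Minimize(sum(start))             # e.g. prefer early maintenance
solver = cp_model.CpSolver()
if solver.Solve(model) == cp_model.OPTIMAL:
    print([solver.Value(s) for s in start])
```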
Abstract:
This paper addresses DNA code analysis from the perspective of dynamics and fractional calculus. Several mathematical tools are selected to establish a quantitative method without distorting the alphabet represented by the sequence of DNA bases. The association of Gray code, the Fourier transform and fractional calculus leads to a categorical representation of species and chromosomes.
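A minimal sketch of the first two steps of that pipeline, assuming one particular base-to-Gray-code assignment; the paper's exact mapping and its fractional-calculus processing are not reproduced here.

```python
# Map DNA bases to a 2-bit Gray code, treat the coded sequence as a signal,
# and inspect its Fourier spectrum. The base assignment is one arbitrary
# choice of Gray ordering (adjacent codes differ by a single bit).
import numpy as np

gray = {"A": 0b00, "C": 0b01, "T": 0b11, "G": 0b10}
seq = "ATGCGTACGTTAGCCATAGC"
signal = np.array([gray[b] for b in seq], dtype=float)

spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
print(spectrum.round(2))      # peaks hint at periodicities in the sequence
```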
Abstract:
This paper proposes a computationally efficient methodology for the optimal location and sizing of static and switched shunt capacitors in large distribution systems. The problem is formulated as the maximization of the savings produced by the reduction in energy losses and the avoided costs due to investment deferral in the expansion of the network. The proposed method selects the nodes to be compensated, as well as the optimal capacitor ratings and their operational characteristics, i.e. fixed or switched. After an appropriate linearization, the optimization problem was formulated as a large-scale mixed-integer linear problem, suitable for solution by a widespread commercial package. Results of the proposed optimization method are compared with another recent methodology reported in the literature using two test cases: a 15-bus and a 33-bus distribution network. For both test cases, the proposed methodology delivers better solutions, indicated by higher loss savings achieved with lower amounts of capacitive compensation. The proposed method has also been applied to compensate an actual large distribution network served by AES-Venezuela in the metropolitan area of Caracas. A convergence time of about 4 seconds after 22298 iterations demonstrates the ability of the proposed methodology to efficiently handle large-scale compensation problems.
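The linearized model itself is not given in the abstract; the toy below only illustrates the mixed-integer shape of such a formulation, binary choices of node and rating under a count constraint, using PuLP with invented savings and cost figures.

```python
# Toy MILP with the flavour of the formulation described (not the paper's
# model): pick at most `max_caps` nodes to compensate, one rating each,
# maximizing assumed loss savings minus capacitor cost.
import pulp

nodes, ratings = range(4), [300, 600]                  # kvar options (assumed)
saving = {(n, r): 10 * (n + 1) * (r / 300) for n in nodes for r in ratings}
cost = {r: 0.5 * r for r in ratings}
max_caps = 2

prob = pulp.LpProblem("cap_placement", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", (nodes, ratings), cat="Binary")

prob += pulp.lpSum(x[n][r] * (saving[n, r] - cost[r]) for n in nodes for r in ratings)
for n in nodes:                                        # at most one rating per node
    prob += pulp.lpSum(x[n][r] for r in ratings) <= 1
prob += pulp.lpSum(x[n][r] for n in nodes for r in ratings) <= max_caps

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([(n, r) for n in nodes for r in ratings if x[n][r].value() == 1])
```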
Abstract:
Introduction / Aims: Making important decisions is a specific task of the manager. An efficient manager takes these decisions through a systematic process with well-defined elements, each in a precise order. In pharmaceutical practice and business, during the pharmacy supply process, there are situations in which medicine distributors offer a certain discount but require payment within a shorter period of time. In these cases, the offer can be analysed with the help of the decision tree method, which permits identifying the decision offering the best possible result in a given situation. The aims of the research were to analyse the product offers of several different suppliers and to establish the most advantageous ways of supplying pharmacies. Material / Methods: The general product offers of the following medical stores were studied: A&G Med, Farmanord, Farmexim, Mediplus, Montero and Relad. For medicine offers including a discount, the decision tree method was applied in order to select the most advantageous offers. The decision tree is a management method used for taking sound decisions, generally applied when one needs to evaluate decisions that involve a series of stages. The tree diagram is used to look for the most efficient means to attain a specific goal. Decision trees are probabilistic methods, useful when making decisions under risk. Results: The analysis of the tree diagrams indicated that purchasing medicines with a discount (1%, 10%, 15%) and payment within a shorter time interval (120 days) is more profitable than purchasing without a discount and payment within a longer time interval (160 days). Discussion / Conclusion: Depending on the results of the tree diagram analysis, the pharmacies would purchase from the selected suppliers. The research has shown that the decision tree method is a valuable working instrument for choosing the best ways of supplying pharmacies, and it is very useful to specialists in the pharmaceutical field and pharmaceutical management, to medicine suppliers, to pharmacy practitioners in community pharmacies, and especially to pharmacy managers and chief pharmacists.
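A worked toy of the comparison described, assuming a cost-of-capital rate (8% per year, purely illustrative) to price the value of paying 40 days earlier; under that assumption even the 1% discount beats the no-discount, 160-day option, matching the qualitative conclusion above.

```python
# Compare: buy with a discount and pay in 120 days, or buy without a
# discount and pay in 160 days. The price and annual rate are assumed.
price = 10_000.0
annual_rate = 0.08

def option_cost(price, discount, pay_days, horizon=160):
    # Paying (horizon - pay_days) days earlier forgoes some use of capital.
    capital_cost = price * annual_rate * (horizon - pay_days) / 365.0
    return price * (1 - discount) + capital_cost

for disc in (0.01, 0.10, 0.15):
    print(f"{disc:.0%} discount, pay in 120 days: {option_cost(price, disc, 120):,.2f}")
print(f"no discount, pay in 160 days: {option_cost(price, 0.0, 160):,.2f}")
```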
Abstract:
Electricity markets are complex environments with very particular characteristics. MASCEM is a market simulator developed to allow deep studies of the interactions between the players that take part in electricity market negotiations. This paper presents a new proposal for the definition of MASCEM players' strategies for negotiating in the market. The proposed methodology is multiagent based, using reinforcement learning algorithms to give players the capability to perceive changes in the environment and adapt their bid formulation accordingly, drawing on a set of different techniques at their disposal. Each agent knows a different method for defining a market-playing strategy; the main agent chooses the best among them and provides it, on request, to the market player for use in the market. This paper also presents a methodology for managing the efficiency/effectiveness balance of this method, to guarantee that the degradation of the simulator's processing times is kept within acceptable limits.
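A hedged sketch of the adaptive layer described: the main agent keeps a value estimate for each strategy agent and picks among them with a simple epsilon-greedy reinforcement learning update. MASCEM's actual algorithms and strategy names are not given in the abstract, so everything below is illustrative.

```python
# Epsilon-greedy selection over strategy agents, updated from observed
# market payoffs. Strategy names and payoffs are invented placeholders.
import random

class StrategyChooser:
    def __init__(self, strategies, epsilon=0.1, alpha=0.2):
        self.q = {s: 0.0 for s in strategies}   # estimated value per strategy
        self.epsilon, self.alpha = epsilon, alpha

    def choose(self):
        if random.random() < self.epsilon:       # explore
            return random.choice(list(self.q))
        return max(self.q, key=self.q.get)       # exploit the best so far

    def update(self, strategy, payoff):
        self.q[strategy] += self.alpha * (payoff - self.q[strategy])

chooser = StrategyChooser(["regression", "game_theory", "neural_net"])
for _ in range(100):
    s = chooser.choose()
    payoff = {"regression": 1.0, "game_theory": 1.5, "neural_net": 0.8}[s]
    chooser.update(s, payoff + random.gauss(0, 0.3))
print(chooser.q)
```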
Abstract:
This paper analyzes the DNA code of several species from the perspective of information content. For that purpose, several concepts and mathematical tools are selected toward establishing a quantitative method without a priori distorting the alphabet represented by the sequence of DNA bases. The synergies of associating Gray code, histogram characterization and multidimensional scaling visualization lead to a collection of plots with a categorical representation of species and chromosomes.
Abstract:
Proteins are biochemical entities consisting of one or more blocks typically folded in a 3D pattern. Each block (a polypeptide) is a single linear sequence of amino acids that are biochemically bonded together. The amino acid sequence in a protein is defined by the sequence of a gene or several genes encoded in the DNA-based genetic code. This genetic code typically uses twenty amino acids, but in certain organisms it can also include two others. After the amino acids are linked during protein synthesis, each becomes a residue in the protein, which is then chemically modified, ultimately changing and defining the protein's function. In this study, the authors analyze amino acid sequences using alignment-free methods, aiming to identify structural patterns in sets of proteins and in the proteome without any prior assumptions. The paper starts by analyzing amino acid sequence data by means of histograms using fixed-length amino acid words (tuples). After the initial relative frequency histograms are created, they are transformed and processed to generate quantitative results for information extraction and graphical visualization. Selected samples from two reference datasets are used, and the results reveal that the proposed method is able to generate relevant outputs in accordance with current scientific knowledge in domains like protein sequence and proteome analysis.
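The histogram step lends itself to a compact sketch: relative frequencies of fixed-length amino acid tuples (k-mers), computed alignment-free. The tuple length and the toy sequence below are arbitrary choices, not the paper's datasets.

```python
# Relative-frequency histogram of fixed-length amino acid tuples (k-mers),
# computed without any alignment. k = 2 is an illustrative choice.
from collections import Counter

def kmer_histogram(seq, k=2):
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

protein = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # toy sequence
hist = kmer_histogram(protein, k=2)
for kmer, freq in sorted(hist.items(), key=lambda kv: -kv[1])[:5]:
    print(kmer, round(freq, 3))
```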
Abstract:
Reaching a decision is difficult when many users meet in the same place; too much time tends to be spent solving a problem because of the differing opinions involved. TAmI (Group Decision Making Toolkit) is a system for group decision making in Ambient Intelligence [1]. The program is composed of IGATA [2], WebMeeting and the related database system. However, because the IP address and password are sent without any encryption, they are exposed to attackers, who can use them for malicious purposes. As a result, even if attackers produce a wrong outcome, the participating members cannot detect it. Therefore, in this paper we study a method for applying user authentication to TAmI.
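As one illustration of the kind of fix such an authentication study might consider, and not TAmI's actual protocol, the sketch below stores only a salted password hash and answers a server challenge with an HMAC, so the password itself never crosses the network.

```python
# Salted-hash credential store plus HMAC challenge-response, standard
# library only. A real deployment would also need TLS, nonce tracking, etc.
import hashlib, hmac, os

def register(password: str):
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, key                          # server stores these, never the password

def prove(password: str, salt: bytes, challenge: bytes) -> bytes:
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.new(key, challenge, "sha256").digest()

salt, stored_key = register("s3cret")
challenge = os.urandom(16)                    # server sends a fresh challenge
response = prove("s3cret", salt, challenge)   # client answers without the password
ok = hmac.compare_digest(response, hmac.new(stored_key, challenge, "sha256").digest())
print("authenticated:", ok)
```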
Abstract:
Dissertation submitted for the Master's Degree in Auditing. Scientific supervision by Professor Coordenador Rodrigo Mário Oliveira Carvalho.
Abstract:
The performance of an amperometric biosensor, constructed by associating the tyrosinase (Tyr) enzyme with the advantages of a 3D gold nanoelectrode ensemble (GNEE), is evaluated in a flow-injection analysis (FIA) system for the analysis of L-dopa. GNEEs were fabricated by electroless deposition of the metal within the pores of polycarbonate track-etched membranes. A simple solvent-etching procedure based on the solubility of polycarbonate membranes was adopted for the fabrication of the 3D GNEE. Afterward, the enzyme was immobilized onto preformed self-assembled monolayers of cysteamine on the 3D GNEEs (GNEE-Tyr) via cross-linking with glutaraldehyde. The experimental conditions of the FIA system, such as the detection potential (−0.200 V vs. Ag/AgCl) and the flow rate (1.0 mL min−1), were optimized. Analytical responses for L-dopa were obtained over a wide concentration range, between 1 × 10−8 mol L−1 and 1 × 10−2 mol L−1. The limit of quantification was found to be 1 × 10−8 mol L−1, with an RSD of 7.23% (n = 5). The limit of detection was found to be 1 × 10−9 mol L−1 (S/N = 3). The common interfering compounds, namely glucose (10 mmol L−1), ascorbic acid (10 mmol L−1) and urea (10 mmol L−1), were studied. The recovery of L-dopa (1 × 10−7 mol L−1) from spiked urine samples was found to be 96%. Therefore, the developed method is adequate for application in clinical analysis.
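For reference, the S/N-based detection-limit convention used above reduces to a short computation once a calibration slope and a blank-noise estimate are available; all numbers below are invented for illustration, not the paper's data.

```python
# LOD at S/N = 3 and LOQ at S/N = 10 from a linear calibration fit.
import numpy as np

conc = np.array([1e-8, 1e-7, 1e-6, 1e-5])        # mol L^-1 (toy standards)
current = np.array([0.8, 8.1, 79.5, 802.0])      # nA (toy responses)
slope, intercept = np.polyfit(conc, current, 1)

sigma_blank = 0.25                               # nA, assumed blank noise
print("LOD:", 3 * sigma_blank / slope, "mol L^-1")
print("LOQ:", 10 * sigma_blank / slope, "mol L^-1")
```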