877 results for Analysis tools


Relevance: 60.00%

Abstract:

Finance is one of the fastest growing areas of modern applied mathematics, with real-world applications. The interest of this branch of applied mathematics is best described by an example involving shares. Shareholders of a company receive dividends, which come from the profit made by the company. The proceeds of the company, once it is taken over or wound up, will also be distributed to shareholders. Shares therefore have a value that reflects the views of investors about the likely dividend payments and capital growth of the company, and this value is quantified by the share price on stock exchanges. Financial modelling thus serves to understand the correlations between assets and buy/sell movements in order to reduce risk. Such activities depend on financial analysis tools being available to the trader, with which rapid and systematic evaluations of buy/sell contracts can be made. There are other financial activities, but it is not the intention of this paper to discuss all of them. The main concern of this paper is to propose a parallel algorithm for the numerical solution of a European option. The paper is organised as follows. First, a brief introduction is given to a simple mathematical model for European options and to possible numerical schemes for solving it. Second, the Laplace transform is applied to the mathematical model, which leads to a set of parametric equations whose solutions may be found concurrently. The numerical inverse Laplace transform is performed by means of the inversion algorithm developed by Stehfest. The scalability of the algorithm in a distributed environment is demonstrated. Third, the performance of the present algorithm is compared with that of a spatial domain decomposition developed particularly for the time-dependent heat equation. Finally, a number of issues are discussed and future work is suggested.
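The Stehfest inversion named above is simple to sketch. The following minimal Python illustration (not the paper's implementation) samples the transform F(s) at s = k·ln2/t and combines the samples with precomputed weights; since each sample may come from an independently solved parametric equation, the F evaluations parallelise naturally.

```python
import math

def stehfest_coefficients(N):
    """Stehfest weights V_k for an even number of terms N."""
    half = N // 2
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, half) + 1):
            s += (j ** half * math.factorial(2 * j)) / (
                math.factorial(half - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        V.append((-1) ** (k + half) * s)
    return V

def stehfest_invert(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s); each F evaluation
    is independent, which is what makes the method easy to parallelise."""
    ln2 = math.log(2.0)
    V = stehfest_coefficients(N)
    return (ln2 / t) * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))

# Sanity check against a known transform pair: L{exp(-t)}(s) = 1/(s + 1).
print(stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0))  # ~ exp(-1) = 0.368
```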

Relevance: 60.00%

Abstract:

Hyperspectral sensors are being developed for remote sensing applications. These sensors produce huge data volumes, which require faster processing and analysis tools. Vertex component analysis (VCA) has become a very useful tool for unmixing hyperspectral data. It has been successfully used to determine endmembers and unmix large hyperspectral data sets without the use of any a priori knowledge of the constituent spectra. Compared with other geometric-based approaches, VCA is an efficient method from the computational point of view. In this paper we introduce new developments for VCA: 1) a new signal subspace identification method (HySime) is applied to infer the signal subspace in which the data set lives; this step also infers the number of endmembers present in the data set; 2) after the projection of the data set onto the signal subspace, the algorithm iteratively projects the data set onto directions orthogonal to the subspace spanned by the endmembers already determined, and each new endmember signature corresponds to the extreme of the resulting projection. The capability of VCA to unmix large hyperspectral scenes (real or simulated) with low computational complexity is also illustrated.
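A minimal sketch of the iterative projection step described in 2) might look as follows, assuming the data have already been projected onto the signal subspace (an illustration of the idea, not the authors' code):

```python
import numpy as np

def vca_iterate(Y, p, seed=0):
    """Toy sketch of VCA's projection loop. Y is a (d x n) matrix of n pixel
    spectra already projected onto a d-dimensional signal subspace; p is the
    number of endmembers to extract."""
    rng = np.random.default_rng(seed)
    d, n = Y.shape
    E = np.zeros((d, p))                      # endmember signatures found so far
    for i in range(p):
        f = rng.standard_normal(d)            # random direction
        if i > 0:                             # make it orthogonal to span(E)
            P = E[:, :i] @ np.linalg.pinv(E[:, :i])
            f = f - P @ f
        f /= np.linalg.norm(f)
        v = f @ Y                             # project every pixel onto f
        E[:, i] = Y[:, np.argmax(np.abs(v))]  # extreme of the projection
    return E
```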

Relevance: 60.00%

Abstract:

This work is the result of an experimental project whose main objective was the reduction of setup times through the implementation of the Single Minute Exchange of Die (SMED) methodology on wire and die-sinking electrical discharge machining (EDM) equipment. To ensure a successful implementation, after a literature review centred on SMED, a methodology was established following a continuous-improvement logic analogous to a PDCA cycle. The established methodology involves the following phases: characterisation of the initial situation, observation, data collection, data analysis, and SMED implementation. The characterisation of the initial situation aims to qualify the situation found and to identify the area of intervention, i.e. the equipment under study. Observation of the production process is supported by the process flowchart tool, while data collection supports the initial assessment, for which the OEE approach is used. In the data analysis phase, the VSM and cause-and-effect diagram analysis tools are introduced; these help identify a set of improvement actions to be carried out before the SMED implementation phase, thus culminating in a PRE-SMED phase. Once the ideal conditions were in place, the effective implementation of the SMED methodology began, achieving reductions in total setup time of more than 60%. This study shows that significant differences can arise across different SMED applications, emphasising the importance of a PRE-SMED phase in order to maximise the results achieved with a SMED implementation.
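Since the initial assessment relies on OEE, a one-line illustration of how that indicator is conventionally computed may help; the factor values below are hypothetical, not taken from the study:

```python
def oee(availability, performance, quality):
    """OEE = Availability x Performance x Quality (all fractions in [0, 1])."""
    return availability * performance * quality

# Illustrative numbers only: a machine running 75% of planned time,
# at 90% of ideal speed, producing 98% good parts.
print(f"OEE = {oee(0.75, 0.90, 0.98):.1%}")  # -> OEE = 66.2%
```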

Relevance: 60.00%

Abstract:

Security Onion is a Network Security Monitoring (NSM) platform that provides multiple Intrusion Detection Systems (IDS), including Host IDS (HIDS) and Network IDS (NIDS). Many types of data can be acquired using Security Onion for analysis, including data related to hosts, networks, sessions, assets, alerts and protocols. Security Onion can be implemented as a standalone deployment, with server and sensor included, or with a master server and multiple sensors, allowing the system to be scaled as required. Many interfaces and tools are available for management of the system and analysis of data, such as Sguil, Snorby, Squert and Enterprise Log Search and Archive (ELSA). These interfaces can be used for analysis of alerts and captured events, which can then be exported for further analysis in Network Forensic Analysis Tools (NFAT) such as NetworkMiner, CapME or Xplico. The Security Onion platform also provides various methods of management, such as Secure Shell (SSH) for management of server and sensors, and Web client remote access. All of this, together with the ability to replay and analyse example malicious traffic, makes Security Onion a suitable low-cost alternative for network security management. In this paper, we present a feature and functionality review of Security Onion in terms of types of data, configuration, interfaces, tools and system management.
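As one concrete example of the traffic-replay capability mentioned above, a capture file can be injected onto a sensor's monitored interface so that the NIDS rules fire. The sketch below uses scapy; the capture file name and interface name are placeholders, not values from the paper:

```python
# Minimal sketch, assuming scapy is installed; "sample.pcap" and "eth1"
# stand in for a real capture file and the sensor's sniffing interface.
# Replaying the packets makes the NIDS rules fire so the resulting
# alerts can then be inspected in Sguil, Squert or ELSA.
from scapy.all import rdpcap, sendp

packets = rdpcap("sample.pcap")   # load the example malicious traffic
sendp(packets, iface="eth1")      # inject it at layer 2 on the sensor side
```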

Relevance: 60.00%

Abstract:

The building envelope is the principal means of interaction between the indoor space and the environment, with a direct influence on the thermal and energy performance of the building. By intervening in the envelope, with the proposal of specific architectural elements, it is possible to promote the use of passive conditioning strategies such as natural ventilation. Cross ventilation is recommended by NBR 15220-3 as the main bioclimatic strategy for the hot and humid climate of Natal/RN, offering, among other benefits, the thermal comfort of occupants. The analysis tools for natural ventilation, on the other hand, cover a variety of techniques, from simplified calculation methods to computational fluid dynamics (CFD), whose limitations are discussed in several papers, but without detailing the problems encountered. In this sense, the present study aims to evaluate the potential of wind catchers, envelope elements used to increase natural ventilation in the building, through simplified CFD simulation; moreover, it seeks to quantify the limitations encountered during the analysis. The procedure adopted to evaluate the implementation and efficiency of these elements was CFD simulation with the software DesignBuilder CFD. A base case was defined, to which wind catchers with various settings were added, in order to compare them with each other and assess the differences in the airflows and air speeds encountered. Initially, sensitivity tests were carried out to become familiar with the software and to observe simulation patterns, mapping the settings used and the simulation time for each case simulated. The results show the limitations encountered during the simulation process, as well as an overview of the efficiency and potential of wind catchers: increased ventilation with the use of catchers, differences in airflow patterns and significant increases in indoor air speeds, besides changes due to the different element geometries. It is considered that the software used can help designers during preliminary analysis in the early stages of design.

Relevance: 60.00%

Abstract:

Efficient crop monitoring and pest damage assessments are key to protecting the Australian agricultural industry and ensuring its leading position internationally. An important element in pest detection is gathering reliable crop data frequently and integrating analysis tools for decision making. Unmanned aerial systems are emerging as a cost-effective solution to a number of precision agriculture challenges. An important advantage of this technology is that it provides a non-invasive aerial sensor platform to accurately monitor broad-acre crops. In this presentation, we will give an overview of how unmanned aerial systems and machine learning can be combined to address crop protection challenges. A recent 2015 study on insect damage in sorghum illustrates the effectiveness of this methodology. A UAV platform equipped with a high-resolution camera was deployed to autonomously perform a flight pattern over the target area. We describe the image processing pipeline implemented to create a georeferenced orthoimage and visualise the spatial distribution of the damage. An image analysis tool has been developed to minimise human input requirements. The computer program is based on a machine learning algorithm that automatically creates a meaningful partition of the image into clusters. Results show the algorithm delivers decision boundaries that accurately classify the field into crop health levels. The methodology presented in this paper represents an avenue for further research towards automated crop protection assessments in the cotton industry, with applications in detecting, quantifying and monitoring the presence of mealybug, mite and aphid pests.
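The abstract does not name the clustering algorithm, so as a generic illustration of the unsupervised-partition idea, a k-means sketch over orthoimage pixels could look like this (the function and parameters are hypothetical):

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_field(orthoimage, k=3, seed=0):
    """Partition an (H x W x C) orthoimage into k clusters as a rough
    proxy for crop health levels; returns an (H x W) label map."""
    h, w, c = orthoimage.shape
    pixels = orthoimage.reshape(-1, c).astype(float)
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(pixels)
    return labels.reshape(h, w)

# The label map keeps the image's pixel grid, so it can be re-projected
# with the orthoimage's georeference to map damage spatially.
```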

Relevance: 60.00%

Abstract:

This study is a Master's dissertation in Environmental Education (EA) whose objective was to establish a dialogue around the concept of nature, based on an analysis of the forms of this discourse conveyed in the comic strips (HQs) of the character Chico Bento and their relations with the conceptions present in the field of Environmental Education. With the intention of problematising how the HQs, through the discourse of nature, have contributed to our thinking about Environmental Education, Chico Bento comics published between 2009 and 2013 that make reference to nature were selected. Drawing on authors such as Michel Foucault, Isabel Carvalho, Leandro Belinaso Guimarães, Maria Lúcia Castagna Wortmann, Mônica Meyer, Keith Thomas and Raymond Williams, among others, the research analysed the enunciations of nature that made up the corpus of this investigation. The methodological path chosen to work with the empirical material draws specifically on tools of Discourse Analysis, following Michel Foucault. In the analysis of the material under examination, the research pointed to two powerful statements that have been helping to constitute the discourse of nature through the HQs: a nature constituted in the displacements produced by the cultural differences between rural and urban realities, and a romantic ideal of nature produced by the visibility and enunciability of the Chico Bento comics. It thus became evident that the analysed HQs enter the order of true discourse in the field of Environmental Education. Hence the importance of paying attention to comic books and their stories as a powerful cultural artefact that helps us look at the apparatus of Environmental Education, an apparatus that interpellates us to constitute ways of being and living in the face of knowledges and truths produced in and by culture, for it is through the meanings negotiated within culture that we engender our ways of life and establish relations with the world in which we live.

Relevance: 60.00%

Abstract:

Intelligent agents offer a new and exciting way of understanding the world of work. Agent-Based Simulation (ABS), one way of using intelligent agents, carries great potential for progressing our understanding of management practices and how they link to retail performance. We have developed simulation models based on research by a multi-disciplinary team of economists, work psychologists and computer scientists, and we will discuss our experiences of implementing these concepts in work with a well-known retail department store. There is no doubt that management practices are linked to the performance of an organisation (Reynolds et al., 2005; Wall & Wood, 2005). Best practices have been developed, but when it comes down to the actual application of these guidelines, considerable ambiguity remains regarding their effectiveness within particular contexts (Siebers et al., forthcoming a). Most Operational Research (OR) methods can only be used as analysis tools once management practices have been implemented, and they are often not very useful for answering speculative ‘what-if’ questions, particularly when one is interested in the development of the system over time rather than just its state at a certain point in time. Simulation can be used to analyse the operation of dynamic and stochastic systems. ABS is particularly useful when complex interactions between system entities exist, such as autonomous decision making or negotiation. In an ABS model the researcher explicitly describes the decision process of the simulated actors at the micro level. Structures emerge at the macro level as a result of the actions of the agents and their interactions with other agents and the environment. We will show how ABS experiments can deal with testing and optimising management practices such as training, empowerment or teamwork. Hence, questions such as “will staff setting their own break times improve performance?” can be investigated.
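A deliberately tiny sketch of the break-times 'what-if' (not the authors' department-store model; all numbers are illustrative) shows the flavour of such an experiment: micro-level break decisions produce a macro-level difference in customer waiting.

```python
def run_shift(break_steps, arrivals=(2, 2, 6, 6, 2, 2, 2, 2)):
    """Toy shop-floor ABS: each staff agent serves one customer per time
    step unless that step is its self-set break; customers queue first
    come, first served. Returns the total customer waiting time."""
    queue, total_wait = [], 0
    for t, n in enumerate(arrivals):
        queue += [t] * n                               # arrival times
        on_duty = sum(1 for b in break_steps if b != t)
        for _ in range(min(on_duty, len(queue))):
            total_wait += t - queue.pop(0)             # wait of served customer
    return total_wait

# Speculative 'what-if': three staff staggering their breaks off-peak
# versus all breaking during the mid-shift rush (steps 2-3).
print(run_shift(break_steps=[0, 5, 7]))   # staggered -> lower total wait
print(run_shift(break_steps=[2, 2, 2]))   # simultaneous during the rush
```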

Relevance: 60.00%

Abstract:

The incredibly rapid growth in the volume of air travel, mainly due to the jet airliners that appeared in the sky in the 1950s, created the need for systematic aviation safety research and for collecting data about air traffic. Structured data can be analysed easily by querying databases and running these results through graphic tools. However, for analysing narratives, which often give more accurate information about a case, mining tools are needed. The analysis of textual data with computers was not possible until data mining tools were developed, and their use, at least in aviation, is still at a moderate level. The research aims at discovering lethal trends in flight safety reports. The narratives of 1,200 flight safety reports in Finnish from the years 1994–1996 were processed with three text mining tools. One of them was totally language independent, another had a specific configuration for Finnish, and the third was originally created for English, but encouraging results had been achieved with Spanish, which is why a Finnish test was undertaken too. The global accident rate is stabilising and the situation can now be regarded as satisfactory, but because of the growth in air traffic, the absolute number of fatal accidents per year might increase if flight safety is not improved. Data collection and reporting systems have reached a high level of maturity; the focal point in improving flight safety is analysis. Air traffic has generally been forecast to grow 5–6 per cent annually over the next two decades. During this period, global air travel will probably double even under relatively conservative expectations of economic growth. This development confronts airline management with growing pressure due to increasing competition, a significant rise in fuel prices and the need to reduce the incident rate in the face of the expected growth in air traffic volumes. All this emphasises the urgent need for new tools and methods. All systems provided encouraging results, as well as revealing challenges still to be overcome. Flight safety can be improved through the development and utilisation of sophisticated analysis tools and methods, such as data mining, using their results to support the decision processes of executives.
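The three commercial tools used in the study are not reproducible here; as a generic illustration of mining free-text narratives, a TF-IDF weighting followed by topic factorisation is one common approach (a sketch under that assumption, not the study's method):

```python
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

def mine_topics(narratives, n_topics=5, top_n=8):
    """Generic narrative-mining sketch: TF-IDF term weighting followed by
    NMF topic extraction to surface recurring themes in safety reports.
    sklearn ships no Finnish stop-word list, so none is applied here."""
    vec = TfidfVectorizer(max_df=0.8, min_df=2)
    X = vec.fit_transform(narratives)
    model = NMF(n_components=n_topics, random_state=0).fit(X)
    terms = vec.get_feature_names_out()
    return [[terms[i] for i in comp.argsort()[-top_n:][::-1]]
            for comp in model.components_]

# Usage: mine_topics(report_texts) returns, per topic, the terms that an
# analyst would inspect for recurring (potentially lethal) trends.
```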

Relevance: 60.00%

Abstract:

This study presents an application of geographical information system technology to plant disease, involving the multidisciplinary teamwork of geoprocessing and phytopathology specialists. The spatial analysis tools of a GIS were used to evaluate the spatial distribution of two diseases of maize in Brazil: polysora rust, caused by Puccinia polysora, and tropical rust, caused by Physopella zeae. A database of climate variables (mean temperature, relative humidity, and leaf wetness duration) from the 1961-1990 climatological normals was obtained and related to a mathematical model of disease development (polysora rust) and to favourable climate intervals (tropical rust) in order to obtain the maps. The choice of the model or of the favourable climate interval is the key challenge of the method, because of the difficulty of matching the spatial and temporal scales to the specific application. The major incidence of both diseases occurred in almost all of the North region from January to June, although this region traditionally has a low maize production. Considering the biggest producing regions, favourable areas for both diseases are located in parts of Mato Grosso, Tocantins, Minas Gerais and Mato Grosso do Sul, and in the coastal areas of São Paulo, Paraná and Santa Catarina, varying among the different months from January to June. The method allowed an adequate distinction between the states and the months considered.
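The climate-interval approach used for tropical rust amounts to a raster-logic operation: a cell is favourable when all climate variables fall inside the disease's favourable ranges. A minimal sketch follows; the threshold values are hypothetical placeholders, not those of the study:

```python
import numpy as np

def favourable(temp, rh, lwd, t_range=(22.0, 28.0), rh_min=80.0, lwd_min=8.0):
    """Climate-interval sketch: a grid cell is marked favourable when mean
    temperature (degC), relative humidity (%) and leaf wetness duration (h)
    all fall within the disease's favourable interval."""
    return ((temp >= t_range[0]) & (temp <= t_range[1])
            & (rh >= rh_min) & (lwd >= lwd_min))

# temp, rh and lwd are 2-D arrays of monthly climatological normals; the
# boolean raster returned can be exported to the GIS for monthly mapping.
```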

Relevance: 60.00%

Abstract:

Human activities are altering greenhouse gas concentrations in the atmosphere and causing global climate change. The impacts of human-induced climate change have become increasingly important in recent years. The objective of this work was to develop a database of climate information for future scenarios using Geographic Information System (GIS) tools. Future scenarios focused on the 2020s, 2050s and 2080s (scenarios A2 and B2) were obtained from the General Circulation Models (GCM) available at the Data Distribution Centre of the Third Assessment Report (TAR) of the Intergovernmental Panel on Climate Change (IPCC). The TAR comprises six GCMs with different spatial resolutions (ECHAM4: 2.8125×2.8125º, HadCM3: 3.75×2.5º, CGCM2: 3.75×3.75º, CSIROMk2b: 5.625×3.214º, and CCSR/NIES: 5.625×5.625º). The monthly means of the climate variables were obtained by averaging the available models using the GIS spatial analysis tools (arithmetic operations). Maps of monthly mean temperature, minimum temperature, maximum temperature, rainfall, relative humidity and solar radiation were elaborated at a spatial resolution of 0.5°×0.5° latitude and longitude. The method of elaborating maps using GIS tools made it possible to evaluate the spatial distribution of the future climate scenarios. This database is currently being used in studies of the impacts of climate change on plant diseases within Embrapa projects.
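Because the models sit on different native grids, averaging them requires resampling to the common 0.5° grid first. A sketch of that step (an illustration under the assumption of ascending regular lat/lon coordinates, not the GIS workflow itself):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def to_common_grid(field, lats, lons, res=0.5):
    """Resample one GCM field (a 2-D array on its native, ascending lat/lon
    grid) onto a common res-degree grid by linear interpolation."""
    tgt_lat = np.arange(lats.min(), lats.max(), res)
    tgt_lon = np.arange(lons.min(), lons.max(), res)
    interp = RegularGridInterpolator((lats, lons), field)
    mesh = np.array(np.meshgrid(tgt_lat, tgt_lon, indexing="ij"))
    return interp(mesh.reshape(2, -1).T).reshape(len(tgt_lat), len(tgt_lon))

# Ensemble mean, mirroring the GIS "arithmetic operation" described above:
# ensemble = np.mean([to_common_grid(f, la, lo) for f, la, lo in models], axis=0)
```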

Relevance: 60.00%

Abstract:

This internship report describes the development of a methodology, using Geographic Information Technologies, for identifying zones that constrain the planning, design and maintenance processes of the structures that make up the Electricity Distribution Network. It thus focuses on Geographic Information Technologies as instruments for territorial studies of relevance to the energy sector. The identification and delimitation of ice zones, industrial pollution zones and salinity zones in mainland Portugal are addressed. The delimitation of these zones was carried out using spatial analysis tools, with reliable data from public administration institutions. Additional tasks proposed by EDP Distribuição - Energia, S.A. are also documented. In this way it is possible to understand the advantages and disadvantages of the use of these technologies by EDP Distribuição, with a view to improving the management of available resources and helping to maintain the quality of the service provided. A recommendation was nevertheless made for greater investment in these technologies.
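Zone delimitation of this kind is typically a buffer-and-overlay operation. A minimal sketch with shapely (all coordinates and the 5 km threshold are hypothetical, purely to show the pattern):

```python
from shapely.geometry import LineString, Point

# Classify network assets as lying in a salinity zone when they fall
# within 5 km of the coastline (coordinates in km for simplicity).
coastline = LineString([(0, 0), (0, 100)])          # simplified coast
assets = [Point(2, 10), Point(40, 50), Point(4, 80)]
salinity_zone = coastline.buffer(5.0)               # 5 km buffer polygon
flags = [salinity_zone.contains(a) for a in assets]
print(flags)  # [True, False, True]
```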

Relevance: 60.00%

Abstract:

This dissertation is devoted to the equations of motion governing the evolution of a fluid or gas at the macroscopic scale. The classical model is a PDE description known as the Navier-Stokes equations. The behavior of solutions is notoriously complex, leading many in the scientific community to describe fluid mechanics using a statistical language. In the physics literature, this is often done in an ad hoc manner with limited precision about the sense in which the randomness enters the evolution equation. The stochastic PDE community has begun proposing precise models, where a random perturbation appears explicitly in the evolution equation. Although this has been an active area of study in recent years, the existing literature is almost entirely devoted to incompressible fluids. The purpose of this thesis is to take a step forward in addressing this statistical perspective in the setting of compressible fluids. In particular, we study the well-posedness of the corresponding system of stochastic Navier-Stokes equations, satisfied by the density, velocity, and temperature. The evolution of the momentum involves a random forcing which is Brownian in time and colored in space. We allow for multiplicative noise, meaning that spatial correlations may depend locally on the fluid variables. Our main result is a proof of global existence of weak martingale solutions to the Cauchy problem set within a bounded domain, emanating from large initial data. The proof involves a mix of deterministic and stochastic analysis tools. Fundamentally, the approach is based on weak compactness techniques from the deterministic theory combined with martingale methods. Four layers of approximate stochastic PDEs are built and analyzed. A careful study of the probability laws of our approximating sequences is required. We prove appropriate tightness results and appeal to a recent generalization of the Skorohod theorem. This ultimately allows us to deduce analogues of the weak compactness tools of Lions and Feireisl, appropriately interpreted in the stochastic setting.
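For orientation, a schematic form of the stochastic momentum balance described above can be written down; the notation below is assumed for illustration and is not taken verbatim from the thesis:

```latex
% Schematic stochastic momentum equation: density \rho, velocity u,
% temperature \vartheta, viscous stress \mathbb{S}, pressure p, and a
% multiplicative noise coefficient \mathbb{G}(\rho, \rho u) driven by a
% cylindrical Wiener process W (Brownian in time, coloured in space).
\[
  d(\rho u) + \Big[ \operatorname{div}(\rho u \otimes u)
      + \nabla p(\rho, \vartheta)
      - \operatorname{div} \mathbb{S}(\nabla u) \Big]\, dt
  = \mathbb{G}(\rho, \rho u)\, dW .
\]
```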

Relevance: 60.00%

Abstract:

Because of the occupation that has occurred over the last thirty years in the Xingu river basin, this region has been suffering strong deforestation pressure, especially in its headwater areas. This study aims to apply GIS techniques to evaluate how land use change has influenced the deforestation dynamics of the Xingu water basin in Mato Grosso State. For that, a GIS-based study was carried out in which the deforestation data for the period between 2000 and 2005 were spatially integrated with settlement areas, indigenous lands, and sites of mineral deposits and prospecting areas. From this spatial integration, it was possible to analyse statistically how deforestation has manifested itself in each kind of occupation, considering the original forest area. The techniques used, including inventory and database organisation in a GIS environment, together with spatial analysis tools, made it possible to analyse deforestation in the Xingu basin in Mato Grosso State between 2000 and 2005 and to identify the most affected areas, considering the different land uses.
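The spatial-integration step described here is essentially an overlay followed by a per-class area summary. A sketch with geopandas (file names, the attribute column and the CRS are hypothetical choices, not those of the study):

```python
import geopandas as gpd

# EPSG:5880 (SIRGAS 2000 / Brazil Polyconic) is a projected CRS in metres,
# so polygon areas can be totalled per occupation class after the overlay.
defor = gpd.read_file("deforestation_2000_2005.shp").to_crs(epsg=5880)
landuse = gpd.read_file("land_use.shp").to_crs(epsg=5880)

pieces = gpd.overlay(defor, landuse, how="intersection")
pieces["area_km2"] = pieces.geometry.area / 1e6
print(pieces.groupby("occupation_class")["area_km2"].sum())
```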
