Abstract:
We developed the numerical modeling of synthetic Marine Controlled Source Electromagnetic (MCSEM) data, used in hydrocarbon exploration, for simple three-dimensional models using parallel computing. The models consist of two stratified layers, the sea and the sediments hosting a thin three-dimensional reservoir, overlain by the half-space corresponding to the air. In this work we present a three-dimensional approach of the finite element technique applied to the MCSEM method, using the primary/secondary decomposition formulation of the coupled magnetic and electric potentials. In a post-processing step, the electromagnetic fields are computed from the scattered potentials via numerical differentiation. We exploit the data parallelism of 3D MCSEM data in a multi-transmitter survey, in which each transmitter position requires the same computation applied to different data. For this we use the Message Passing Interface (MPI) library and the client-server model, where the manager processor sends the input data to the client processors, which compute the modeling. The input data consist of the finite element mesh parameters, the transmitters, and the geoelectric model of the reservoir, which has a prismatic geometry representing hydrocarbon reservoir lenses in deep water. We observe that when the horizontal width and length of these reservoirs have the same order of magnitude, the in-line responses are very similar and consequently the three-dimensional effect is not detected. In turn, when the difference between the width and the length of the reservoir is significant, the 3D effect is easily detected in in-line measurements along the larger horizontal dimension of the reservoir. For measurements along the smaller dimension this effect is not detectable, since in this case the 3D model approaches a two-dimensional model. The data parallelism is fast to implement and to run. The execution time of the multi-transmitter modeling in a parallel environment is equivalent to the processing time of the modeling for a single transmitter on a sequential machine, plus the latency of data transmission between the cluster nodes, which justifies the use of this methodology in the modeling and interpretation of MCSEM data. Due to the limited memory (2 GB) on each processor of the cluster of the geophysics department of UFPA, only very simple models were run.
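A minimal sketch of the multi-transmitter data parallelism described above, using mpi4py; the broadcast-then-round-robin distribution and the model_transmitter() routine are illustrative stand-ins, since the abstract does not give the actual code.

```python
from mpi4py import MPI

def model_transmitter(mesh, model, tx):
    # Placeholder for the 3D finite element solve for one transmitter:
    # compute the coupled secondary potentials, then differentiate them
    # numerically to obtain the scattered EM fields.
    return {"tx": tx, "fields": None}

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    # The manager assembles the input data: finite element mesh
    # parameters, transmitter positions, and the geoelectric model.
    inputs = {
        "mesh": {"element_size": 50.0},
        "model": {"sea": 0.3, "sediments": 1.0, "reservoir": 100.0},  # resistivities in ohm-m (illustrative)
        "transmitters": [(x, 0.0, 900.0) for x in range(0, 10000, 500)],
    }
else:
    inputs = None

# The manager broadcasts the inputs; each processor then models every
# size-th transmitter position (static round-robin data parallelism).
inputs = comm.bcast(inputs, root=0)
mine = inputs["transmitters"][rank::size]
results = [model_transmitter(inputs["mesh"], inputs["model"], tx) for tx in mine]

# Gather the per-transmitter responses back on the manager.
gathered = comm.gather(results, root=0)
if rank == 0:
    total = sum(len(g) for g in gathered)
    print(f"modeled {total} transmitter positions on {size} processors")
```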
Abstract:
Intrusion Detection Systems (IDS) and Intrusion Prevention Systems (IPS) are well-known and well-established tools in the information security world. However, their lack of integration with network equipment such as switches and routers limits their reach and demands careful sizing of the hardware resources used to implement them, such as processing power, memory, and high-speed network interfaces. Faced with the various limitations encountered by researchers and network administrators, the concept of Software Defined Networking (SDN) emerged: by separating the control and data planes, it allows the behavior of the network to be adapted to the needs of each user. Thus, given the standardization and flexibility proposed by SDNs and the limitations of IPSs presented, this master's thesis proposes IPSFlow, a framework that uses a network based on the SDN architecture and the OpenFlow protocol to build an IPS with wide coverage, capable of blocking traffic classified as malicious by the IDS(s) at the equipment closest to the origin. To validate the framework, experiments were carried out in the Mininet virtual environment using Snort as the IDS to analyze scan traffic generated by Nmap from one host to another. The collected results show that IPSFlow worked as planned, blocking 85% of the scan traffic.
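A hedged sketch of the blocking step such a framework automates; Snort's fast-alert format is real, but the controller REST endpoint and the switch lookup below are hypothetical placeholders, as the abstract does not specify IPSFlow's interfaces.

```python
# Read a Snort "fast" alert, extract the offending source IP, and ask
# the SDN controller to install a drop flow on the switch closest to
# the origin. The /flows endpoint and nearest_switch() are hypothetical.
import re
import requests

ALERT_RE = re.compile(r"\{(?:TCP|UDP|ICMP)\} (\d+\.\d+\.\d+\.\d+)[:\d]* ->")

def parse_snort_alert(line: str):
    """Return the source IP of a Snort fast-format alert line, if any."""
    m = ALERT_RE.search(line)
    return m.group(1) if m else None

def nearest_switch(src_ip: str) -> str:
    # Hypothetical topology lookup: map the attacker's IP to the
    # datapath ID of its access switch.
    return "of:0000000000000001"

def block_at_origin(src_ip: str, controller="http://127.0.0.1:8080"):
    dpid = nearest_switch(src_ip)
    # OpenFlow-style rule: match on the source IP, empty action list
    # means drop. The REST schema is illustrative.
    rule = {"dpid": dpid, "priority": 1000,
            "match": {"eth_type": 0x0800, "ipv4_src": src_ip},
            "actions": []}
    requests.post(f"{controller}/flows", json=rule, timeout=5)

line = "09/15-12:00:01.000000 [**] [1:1000:1] Nmap scan [**] {TCP} 10.0.0.1:40000 -> 10.0.0.2:22"
ip = parse_snort_alert(line)
if ip:
    block_at_origin(ip)
```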
Abstract:
In many science fiction movies, machines were capable of speaking with humans. However, mankind is still far from building such machines, like the famous character C3PO of Star Wars. Over the last six decades, automatic speech recognition systems have been the target of many studies, and many techniques have been developed for both software and hardware applications. There are many types of automatic speech recognition systems; the one used in this work is an isolated-word, speaker-independent system that uses Hidden Markov Models for recognition. The goal of this work is to design and synthesize the first two stages of the speech recognition system: speech signal acquisition and signal pre-processing. Both stages were developed on a reprogrammable device, an FPGA, using the VHDL hardware description language, owing to the high performance of this device and the flexibility of the language. This work presents the underlying theory of digital signal processing, such as Fast Fourier Transforms and digital filters, as well as the theory of speech recognition using Hidden Markov Models and the LPC processor. It also presents the results obtained for each of the blocks synthesized and verified in hardware.
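A NumPy reference model of the pre-processing front-end the thesis implements in VHDL (pre-emphasis, framing, windowing, FFT); the frame length, hop, and pre-emphasis coefficient are common textbook values, not parameters taken from the thesis.

```python
import numpy as np

def preprocess(signal, fs=8000, frame_ms=25, hop_ms=10, alpha=0.95):
    # Pre-emphasis: boost high frequencies, y[n] = x[n] - alpha * x[n-1].
    emphasized = np.append(signal[0], signal[1:] - alpha * signal[:-1])

    frame_len = int(fs * frame_ms / 1000)
    hop = int(fs * hop_ms / 1000)
    n_frames = 1 + max(0, (len(emphasized) - frame_len) // hop)

    window = np.hamming(frame_len)
    spectra = []
    for i in range(n_frames):
        frame = emphasized[i * hop : i * hop + frame_len] * window
        # Magnitude spectrum of each windowed frame via the FFT.
        spectra.append(np.abs(np.fft.rfft(frame, n=512)))
    return np.array(spectra)

# Usage: one second of a synthetic 440 Hz tone.
t = np.arange(8000) / 8000.0
print(preprocess(np.sin(2 * np.pi * 440 * t)).shape)  # (frames, 257)
```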
Abstract:
This work is an action research study conducted in a consumer goods industry, presenting a new inventory management model for the company's processors. The replacement of the company's inventory management stemmed from the need to reduce the large number of deviations in product write-offs from stock, which generated low accuracy and reliability in the processor inventory data shown by the company's ERP system. Spending on inventory adjustments could thus be reduced with the implementation of the new model, generating cost savings for the company and increasing its competitive potential in the market. In the old system adopted by the company, raw material inventory write-offs were performed automatically by the system through transactions customized by the company. However, since the implementation of the ERP, the automatic write-offs, based on the historical consumption of each product, were in many cases effectively random, generating many mistakes. The new management system replaced the automatic write-off with a manual one performed when the processed product returns to the company, thus creating a record of which lots and quantities were consumed in processing and making the stock shown by the ERP reliable and accurate.
Abstract:
Identifying opportunities for parallelism in software is a task that consumes a great deal of human time, but once certain code patterns amenable to parallelism have been identified, software can accomplish this task quickly. Automating this process thus brings many benefits, such as saving time and reducing errors introduced by the programmer [1]. This work aims at developing a software environment that identifies opportunities for parallelism in source code written in the C language and generates a program with the same behavior but a higher degree of parallelism, compatible with graphics processors based on the CUDA architecture.
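As an illustration of the loop-to-kernel pattern such a translator exploits, a sketch using Numba's CUDA backend rather than the C-to-CUDA environment the abstract proposes: each iteration of an independent element-wise loop becomes one GPU thread.

```python
# Requires a CUDA-capable GPU. An independent element-wise loop
# (here saxpy: out[i] = a*x[i] + y[i]) becomes a kernel in which
# each thread handles one loop index.
import numpy as np
from numba import cuda

@cuda.jit
def saxpy_kernel(a, x, y, out):
    i = cuda.grid(1)            # global thread index = loop index
    if i < out.size:            # guard: the grid may exceed the array length
        out[i] = a * x[i] + y[i]

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
out = np.zeros_like(x)

threads = 256
blocks = (n + threads - 1) // threads
saxpy_kernel[blocks, threads](2.0, x, y, out)   # Numba copies the arrays to the GPU

assert np.allclose(out, 2.0 * x + y)
```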
Abstract:
The objective of this work is to determine the membership functions for the construction of a fuzzy controller that evaluates a company's energy situation with respect to its load and power factors. The energy assessment of a company is performed by technicians and experts based on the load and power factor indices and on an analysis of the machines used in the production processes. This assessment is conducted periodically to detect whether the procedures carried out by employees with respect to how electricity is used are correct. With a fuzzy controller, this assessment can be performed by machines. The construction of a fuzzy controller begins with the definition of the input and output variables and their associated membership functions. One must also define an inference method and an output processor (defuzzifier). Finally, the help of technicians and experts is needed to build a rule base, consisting of the answers these professionals provide as a function of the characteristics of the input variables. The controller proposed in this paper has the load and power factors as input variables and the company's situation as output. Their membership functions represent fuzzy sets labeled with linguistic qualities such as "VERY BAD" and "GOOD". With the Mamdani inference method and the Center of Area output processor chosen, the structure of the fuzzy controller is established; it only remains for technicians and experts in the energy field to determine a set of rules appropriate to the chosen company. Thus, the interpretation of load and power factors by software meets the need for a single index that indicates, on an overall basis, how rationally and efficiently the energy is being used.
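A minimal sketch of the controller structure described above (triangular membership functions, Mamdani min inference, Center of Area defuzzification); the breakpoints and the two example rules are illustrative, since the thesis leaves the rule base to domain experts.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b (a < b < c)."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def evaluate(load_factor, power_factor):
    u = np.linspace(0.0, 1.0, 501)          # output universe: company "situation"
    # Input fuzzy sets (illustrative breakpoints; feet outside [0, 1]
    # make the end sets behave as shoulders).
    lf_low  = tri(load_factor, -0.6, 0.0, 0.6)
    lf_high = tri(load_factor,  0.4, 1.0, 1.6)
    pf_low  = tri(power_factor, -1.0, 0.0, 0.92)
    pf_high = tri(power_factor, 0.85, 1.0, 1.15)
    # Output fuzzy sets: VERY BAD and GOOD, as in the abstract.
    very_bad = tri(u, -0.5, 0.0, 0.5)
    good     = tri(u,  0.5, 1.0, 1.5)

    # Two example Mamdani rules (min for AND, clip the consequent):
    #   IF load LOW  AND power LOW  THEN situation VERY BAD
    #   IF load HIGH AND power HIGH THEN situation GOOD
    agg = np.maximum(np.minimum(min(lf_low, pf_low), very_bad),
                     np.minimum(min(lf_high, pf_high), good))

    # Center of Area defuzzification (guarded against an empty aggregate).
    return float((u * agg).sum() / (agg.sum() + 1e-9))

print(evaluate(load_factor=0.75, power_factor=0.96))  # closer to 1 = "GOOD"
```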
Abstract:
The aim of this study was to determine the carrot cooking method most suitable for minimizing nutrient losses. Carrot peel slices were subjected to preliminary cooking tests that started at 0.5 min and were extended in successive 0.5 min increments. The texture of the carrot pieces was monitored during the pre-tests so that all would have the same texture regardless of the type of cooking; the degree of softness was evaluated by pressing the pieces between the fingers. The carrot pulp and peel were subjected to four types of heat treatment (pressure, immersion, microwave, and steam), after which they were ground in a food processor and stored at -18 ºC. The nutritional analyses comprised the determination of proteins, lipids, fibers, reducing and total sugars, ascorbic acid content, and minerals (iron, calcium, zinc, magnesium, potassium, and phosphorus). The analyses were performed on fresh carrot and after cooking by the different methods. The carrot peel presented percentages of proteins, lipids, fibers, reducing and total sugars, and ascorbic acid content equivalent to those of the pulp. In addition, the mineral content was higher in the peel than in the pulp, with respective percentages of 38.10%, 95.12%, 47.04%, 58.88%, 70.27%, and 21.27%. There were nutrient losses relative to the raw vegetable when the carrot pieces were submitted to the different cooking methods; steaming and microwaving showed the lowest nutritional losses.
Abstract:
The web services (WS) technology provides a comprehensive solution for representing, discovering, and invoking services in a wide variety of environments, including Service Oriented Architectures (SOA) and grid computing systems. At the core of WS technology lie a number of XML-based standards, such as the Simple Object Access Protocol (SOAP), that have successfully ensured WS extensibility, transparency, and interoperability. Nonetheless, there is an increasing demand to enhance WS performance, which is severely impaired by XML's verbosity. SOAP communications produce considerable network traffic, making them unfit for distributed, loosely coupled, and heterogeneous computing environments such as the open Internet. They also introduce higher latency and processing delays than other technologies, like Java RMI and CORBA. WS research has recently focused on SOAP performance enhancement. Many approaches build on the observation that SOAP message exchange usually involves highly similar messages: those created by the same implementation usually have the same structure, and those sent from a server to multiple clients tend to show similarities in structure and content. Similarity evaluation and differential encoding have thus emerged as SOAP performance enhancement techniques. The main idea is to identify the common parts of SOAP messages, to be processed only once, avoiding a large amount of overhead. Other approaches investigate nontraditional processor architectures, including micro- and macro-level parallel processing solutions, so as to further increase the processing rates of SOAP/XML software toolkits. This survey paper provides a concise yet comprehensive review of the research efforts aimed at SOAP performance enhancement. A unified view of the problem is provided, covering almost every phase of SOAP processing: message parsing, serialization, deserialization, compression, multicasting, security evaluation, and data/instruction-level processing.
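A small sketch of the differential-encoding idea the survey reviews: when two SOAP messages are highly similar, only the differences against a shared reference need to be transmitted. Here difflib stands in for the specialized XML-aware encoders proposed in the literature.

```python
import difflib

reference = """<soap:Envelope><soap:Body>
<getQuote><symbol>IBM</symbol><volume>100</volume></getQuote>
</soap:Body></soap:Envelope>"""

message = """<soap:Envelope><soap:Body>
<getQuote><symbol>ACME</symbol><volume>250</volume></getQuote>
</soap:Body></soap:Envelope>"""

def encode(ref: str, msg: str):
    """Replace runs shared with the reference by (offset, length) pairs."""
    ops = []
    sm = difflib.SequenceMatcher(a=ref, b=msg, autojunk=False)
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            ops.append(("copy", i1, i2 - i1))      # reuse ref[i1 : i1 + length]
        else:
            ops.append(("literal", msg[j1:j2]))    # transmit only the new bytes
    return ops

def decode(ref: str, ops) -> str:
    out = []
    for op in ops:
        out.append(ref[op[1]:op[1] + op[2]] if op[0] == "copy" else op[1])
    return "".join(out)

ops = encode(reference, message)
assert decode(reference, ops) == message
literal = sum(len(o[1]) for o in ops if o[0] == "literal")
print(f"{literal} literal bytes instead of {len(message)}")
```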