919 results for Data encryption (Computing)


Relevance:

20.00%

Publisher:

Abstract:

Image compression consists in representing an image with a small amount of data without loss of visual quality. Compression is important when large images are used, for example satellite images. Full-color digital images typically use 24 bits to specify the color of each pixel, 8 bits for each of the primary components red, green and blue (RGB). Compressing an image with three or more bands (multispectral) is fundamental to reduce transmission, processing and storage time, and many applications depend on such images: medical imaging, satellite imaging, sensor data, etc. In this work a new method for compressing color images is proposed. The method is based on measuring the information content of each band. The technique, called Self-Adaptive Compression (SAC), compresses each band of the image with a different threshold in order to preserve information and obtain better results: SAC applies strong compression to highly redundant bands, i.e. those carrying less information, and mild compression to bands carrying more information. Two image transforms are used in the technique: the Discrete Cosine Transform (DCT) and Principal Component Analysis (PCA). The first step converts the data into new, decorrelated bands with PCA; the DCT is then applied to each band. Loss is introduced when a threshold discards some coefficients. This threshold is computed from two elements: the PCA result and a user parameter that defines the compression rate. The system produces three different thresholds, one for each band of the image, proportional to its amount of information. For image reconstruction, the inverse DCT and inverse PCA are applied. SAC was compared with the JPEG (Joint Photographic Experts Group) standard and with YIQ compression, and better results were obtained in terms of MSE (Mean Squared Error). Tests showed that SAC yields better quality under strong compression, with two advantages: (a) being adaptive, it is sensitive to the image type and presents good results for diverse kinds of images (synthetic, landscapes, people, etc.), and (b) it needs only one user parameter, that is, little human intervention is required
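
A minimal sketch of the SAC idea as described above (PCA to decorrelate the RGB bands, DCT per band, one threshold per band scaled by the band's information share, then the inverse transforms). The exact threshold rule, the `user_param` value and the function name are illustrative assumptions, not the thesis' actual formulation.

```python
# Minimal sketch of the Self-Adaptive Compression (SAC) idea described above,
# assuming an RGB image stored as a float NumPy array of shape (H, W, 3).
# The threshold rule (scaling a user parameter by each principal component's
# share of the total variance) is an illustrative guess.
import numpy as np
from scipy.fft import dctn, idctn

def sac_compress_decompress(img, user_param=0.05):
    h, w, _ = img.shape
    pixels = img.reshape(-1, 3)
    mean = pixels.mean(axis=0)
    centered = pixels - mean

    # PCA: decorrelate the three color bands.
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending order
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    bands = (centered @ eigvecs).T.reshape(3, h, w)

    # One threshold per band: low-information bands (small variance share)
    # are thresholded more aggressively.
    info_share = eigvals / eigvals.sum()
    reconstructed = np.empty_like(bands)
    for i, band in enumerate(bands):
        coeffs = dctn(band, norm='ortho')
        thr = user_param * np.abs(coeffs).max() * (1.0 - info_share[i])
        coeffs[np.abs(coeffs) < thr] = 0.0          # lossy step
        reconstructed[i] = idctn(coeffs, norm='ortho')

    # Inverse PCA back to RGB.
    out = (reconstructed.reshape(3, -1).T @ eigvecs.T) + mean
    return out.reshape(h, w, 3)
```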

Relevance:

20.00%

Publisher:

Abstract:

Attacks on devices connected to networks are one of the main problems related to the confidentiality of sensitive data and the correct functioning of computer systems. Despite the availability of tools and procedures that harden systems or prevent security incidents, network devices are still successfully attacked using strategies applied in previous events. The lack of knowledge about the scenarios in which those attacks occurred effectively contributes to the success of new attacks, so a tool that makes this kind of information available is of great relevance. This work presents a support system for corporate security management, used to store, retrieve and help construct attack scenarios and related information. When an incident occurs in a corporation, an expert accesses the system to record the specific attack scenario. The scenario, made available through controlled access, can then be analyzed so that effective decisions or actions are taken in similar cases. Besides the strategy used by the attacker, attack scenarios also make explicit the vulnerabilities present in the devices. Access to this kind of information contributes to increasing the security level of a corporation's network devices and to decreasing the response time to incidents
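
As a rough illustration of the storage and retrieval side of such a system, the sketch below defines a scenario record and a keyword search over it; the field names and the search strategy are assumptions made for the example, not the system's actual schema.

```python
# Illustrative attack-scenario record and keyword-based retrieval; all field
# names and the search rule are hypothetical.
from dataclasses import dataclass, field

@dataclass
class AttackScenario:
    incident_id: str
    attacker_strategy: str                 # e.g. "SSH brute force then privilege escalation"
    exploited_vulnerabilities: list[str] = field(default_factory=list)
    affected_devices: list[str] = field(default_factory=list)
    mitigation_notes: str = ""

class ScenarioRepository:
    def __init__(self):
        self._scenarios: list[AttackScenario] = []

    def store(self, scenario: AttackScenario) -> None:
        self._scenarios.append(scenario)

    def search(self, keyword: str) -> list[AttackScenario]:
        # Return scenarios whose strategy or vulnerabilities mention the keyword.
        kw = keyword.lower()
        return [s for s in self._scenarios
                if kw in s.attacker_strategy.lower()
                or any(kw in v.lower() for v in s.exploited_vulnerabilities)]
```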

Relevance:

20.00%

Publisher:

Abstract:

The use of supervisory systems has become increasingly essential for accessing, managing and obtaining data from industrial processes, as a consequence of the constant developments in industrial automation. These supervisory (SCADA) systems have been widely used in many industrial environments to store process data and to control processes according to some adopted strategy. The SCADA control hardware is the set of equipment that executes this control, while the SCADA supervision software accesses process data through the control hardware and presents it to the users. Currently, many industrial systems adopt supervision software developed by the same manufacturer as the control hardware, and such software usually cannot be used with equipment from other manufacturers. This work proposes an approach for developing supervisory systems able to access process information through different control hardware. An architecture for supervisory systems is first defined in order to guarantee efficient communication and data exchange; the architecture is then applied in a supervisory system that monitors oil wells using distinct control hardware. The implementation was modeled and verified using the formal method of Petri nets. Finally, experimental results are presented to demonstrate the applicability of the proposed solution
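
One common way to obtain this kind of hardware independence is to place an abstract driver interface between the supervision software and each vendor's equipment; the sketch below shows that pattern. The interface, class and tag names are illustrative assumptions, not the architecture actually defined in the work.

```python
# Hardware-abstraction sketch: the supervisory layer talks only to the
# abstract driver interface, and each control hardware gets its own driver.
from abc import ABC, abstractmethod

class ControlHardwareDriver(ABC):
    """Uniform interface the supervisory software uses for any vendor."""

    @abstractmethod
    def connect(self, address: str) -> None: ...

    @abstractmethod
    def read_tag(self, tag: str) -> float: ...

    @abstractmethod
    def write_tag(self, tag: str, value: float) -> None: ...

class VendorAPlcDriver(ControlHardwareDriver):
    def connect(self, address: str) -> None:
        print(f"opening vendor-A protocol session to {address}")

    def read_tag(self, tag: str) -> float:
        return 0.0      # placeholder: vendor-A specific request would go here

    def write_tag(self, tag: str, value: float) -> None:
        pass            # placeholder: vendor-A specific command would go here

def poll_well(driver: ControlHardwareDriver, address: str) -> float:
    # The supervisory code is identical no matter which driver is passed in.
    driver.connect(address)
    return driver.read_tag("WELLHEAD_PRESSURE")   # hypothetical tag name
```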

Relevance:

20.00%

Publisher:

Abstract:

Reinforcement learning is a machine learning technique that, although it has found a large number of applications, may not yet have reached its full potential. One of the insufficiently explored possibilities is the use of reinforcement learning in combination with other methods to solve pattern classification problems. The generalization problems faced by ensembles of support vector machines are well documented in the literature, and algorithms such as AdaBoost do not deal appropriately with the imbalances that arise in those situations. Several alternatives have been proposed, with varying degrees of success. This dissertation presents a new approach to building committees of support vector machines: the proposed algorithm combines AdaBoost with a reinforcement learning layer that adjusts the committee parameters so that imbalances among the committee components do not harm the generalization performance of the final hypothesis. Comparisons were made between ensembles with and without the reinforcement learning layer, on benchmark data sets widely known in the pattern classification area
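
For context, the sketch below shows a plain AdaBoost committee of SVMs, the baseline the dissertation builds on; the reinforcement learning layer that re-adjusts the committee parameters is the dissertation's contribution and is not reproduced here. Kernel, C and the number of boosting rounds are illustrative choices.

```python
# Plain binary AdaBoost with RBF-SVM weak learners (labels must be in {-1, +1}).
import numpy as np
from sklearn.svm import SVC

def adaboost_svm(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)                   # sample weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        clf = SVC(kernel="rbf", C=1.0, gamma="scale")
        clf.fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)        # boost weights of misclassified samples
        w /= w.sum()
        learners.append(clf)
        alphas.append(alpha)
    return learners, alphas

def committee_predict(learners, alphas, X):
    votes = sum(a * clf.predict(X) for a, clf in zip(alphas, learners))
    return np.sign(votes)
```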

Relevance:

20.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

20.00%

Publisher:

Abstract:

The use of wireless sensor and actuator networks in industry has been increasing in recent years, bringing multiple benefits compared to wired systems, such as network flexibility and manageability. Such networks consist of a possibly large number of small, autonomous sensor and actuator devices with wireless communication capabilities. The data collected by the sensors are sent, directly or through intermediate nodes, to a base station called the sink node. Data routing in this environment is an essential matter, since it is strictly bound to energy efficiency and hence to the network lifetime. This work investigates the application of a routing technique based on the reinforcement learning algorithm Q-Learning to a wireless sensor network, using an NS-2 simulated environment. Several metrics, such as energy consumption, data packet delivery rate and delay, are used to validate the proposal by comparing it with other solutions from the literature
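
A minimal sketch of Q-Learning applied to next-hop selection, in the spirit of the routing technique above: the state is the current node, the action is the neighbour chosen as next hop, and the per-hop reward (a fixed negative energy cost here) is an assumption made for the example.

```python
# Tabular Q-learning over (node, next_hop) pairs with epsilon-greedy choice.
import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1
q = defaultdict(float)                            # q[(node, neighbour)] -> value

def choose_next_hop(node, neighbours):
    if random.random() < EPSILON:                 # explore
        return random.choice(neighbours)
    return max(neighbours, key=lambda n: q[(node, n)])   # exploit

def update(node, next_hop, reward, next_neighbours):
    best_future = max((q[(next_hop, n)] for n in next_neighbours), default=0.0)
    q[(node, next_hop)] += ALPHA * (reward + GAMMA * best_future - q[(node, next_hop)])

# Example: forwarding one packet across a toy topology toward the sink.
topology = {'A': ['B', 'C'], 'B': ['SINK'], 'C': ['SINK'], 'SINK': []}
node = 'A'
while node != 'SINK':
    nxt = choose_next_hop(node, topology[node])
    reward = -1.0                                  # assumed per-hop energy cost
    update(node, nxt, reward, topology[nxt])
    node = nxt
```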

Relevance:

20.00%

Publisher:

Abstract:

Modern wireless systems employ adaptive techniques to provide high throughput while meeting the desired coverage, Quality of Service (QoS) and capacity. An alternative for further enhancing the data rate is to apply cognitive radio concepts, in which a system exploits unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques such as Automatic Modulation Classification (AMC) can help, or even be vital, in such scenarios. Usually, AMC implementations rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g. Gaussianity of the noise). This work proposes a new AMC method that uses a similarity measure from the Information Theoretic Learning (ITL) framework known as the correntropy coefficient. It is capable of extracting similarity measurements over a pair of random processes using higher-order statistics, yielding better similarity estimates than, for example, the correlation coefficient. Experiments carried out by means of computer simulation show that the proposed technique achieves a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN)
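
The sketch below estimates the correntropy coefficient from samples, using a Gaussian kernel and the usual centering and normalization from the ITL literature; the kernel bandwidth and the toy signals are illustrative assumptions, and the classification stage built on top of this measure is not reproduced.

```python
# Sample estimator of the correntropy coefficient (Gaussian kernel,
# centered correntropy normalised to [-1, 1]).
import numpy as np

def gaussian_kernel(u, sigma):
    return np.exp(-u ** 2 / (2.0 * sigma ** 2))

def centered_correntropy(x, y, sigma):
    # Mean kernel over paired samples, minus the mean over all cross pairs.
    paired = gaussian_kernel(x - y, sigma).mean()
    cross = gaussian_kernel(x[:, None] - y[None, :], sigma).mean()
    return paired - cross

def correntropy_coefficient(x, y, sigma=1.0):
    u_xy = centered_correntropy(x, y, sigma)
    u_xx = centered_correntropy(x, x, sigma)
    u_yy = centered_correntropy(y, y, sigma)
    return u_xy / np.sqrt(u_xx * u_yy)

# Example: compare a noisy received signal against a candidate template.
rng = np.random.default_rng(0)
t = np.arange(256)
template = np.cos(0.2 * t)
received = template + 0.3 * rng.standard_normal(t.size)
print(correntropy_coefficient(received, template))
```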

Relevance:

20.00%

Publisher:

Abstract:

The opening of the Brazilian electricity market and the competition between companies in the energy sector have increased the utilities' search for useful information and for tools that assist decision-making activities. An important source of knowledge for these utilities is the time series of energy demand. Identifying behavior patterns and describing events becomes important for planning, seeking improvements in service quality and financial benefits. This dissertation presents a methodology based on time series mining and representation tools to extract knowledge relating the electricity demand series of several interconnected substations of an electric utility. The method exploits the duration, coincidence and partial-order relations of events in multidimensional time series, and the extracted knowledge is represented with the language proposed by Mörchen (2005), the Time Series Knowledge Representation (TSKR). A case study was conducted using the energy demand time series of 8 substations interconnected by a ring system, which feeds the metropolitan area of Goiânia-GO, provided by CELG (Companhia Energética de Goiás), the utility responsible for power distribution in the state of Goiás (Brazil). Using the proposed methodology, three levels of knowledge describing the behavior of the studied system were extracted, clearly representing the system dynamics and providing a tool to assist planning activities
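
As a small illustration of the coincidence relation mentioned above (overlap between labeled duration intervals), the sketch below checks which labeled intervals coincide and for how long; the labels and the minimum-overlap rule are assumptions for the example, not part of the TSKR definition.

```python
# Detect coincidences between labeled duration intervals.
from dataclasses import dataclass

@dataclass
class Interval:
    label: str        # e.g. "high demand at substation 1"
    start: int
    end: int          # exclusive

def coincidences(intervals, min_overlap=1):
    """Return label pairs whose intervals overlap for at least min_overlap samples."""
    pairs = []
    for i, a in enumerate(intervals):
        for b in intervals[i + 1:]:
            overlap = min(a.end, b.end) - max(a.start, b.start)
            if overlap >= min_overlap:
                pairs.append((a.label, b.label, overlap))
    return pairs

events = [Interval("high demand S1", 10, 40),
          Interval("high demand S2", 30, 60),
          Interval("low demand S3", 70, 90)]
print(coincidences(events))   # S1 and S2 coincide for 10 samples
```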

Relevance:

20.00%

Publisher:

Abstract:

This work proposes a collaborative system for marking dangerous points along transport routes and generating alerts for drivers. It consists of a proximity warning system for danger points that is fed by the drivers themselves through a mobile device equipped with GPS. The system consolidates the data provided by several drivers and generates a set of common points to be used by the warning system. Although the application is designed to protect drivers, the data it generates can also serve as input for the authorities responsible for improving the signaling and maintenance of public roads
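
A minimal sketch of the proximity check at the heart of such a warning system: flag any consolidated danger point within a given radius of the current GPS fix. The 200 m radius and the sample coordinates are illustrative values.

```python
# Haversine distance plus a simple radius test against known danger points.
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def nearby_danger_points(position, danger_points, radius_m=200):
    lat, lon = position
    return [p for p in danger_points if haversine_m(lat, lon, p[0], p[1]) <= radius_m]

danger_points = [(-5.8125, -35.2050), (-5.8301, -35.2127)]   # hypothetical marks
print(nearby_danger_points((-5.8120, -35.2045), danger_points))
```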

Relevance:

20.00%

Publisher:

Abstract:

The motivation for this work was the need for a software architecture that supports the development of a SCADA supervisory system for monitoring simulated industrial processes, with the flexibility to add intelligent modules and devices such as PLCs according to the specifications of the problem. In the present study, an intelligent supervisory system was developed on top of a simulation of a distillation column modeled with Unisim. OLE Automation was used for the communication between the supervisory and the simulation software, which, together with the use of a database, resulted in an architecture that is both scalable and easy to maintain. Moreover, intelligent modules were developed for pre-processing, feature extraction and variable inference; these modules were fundamentally based on the Encog library
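
A rough sketch of the modular structure this describes: a chain of pluggable modules (pre-processing, feature extraction, inference) driven by the supervisory layer. The interface, the tag names and the pipeline wiring are assumptions for the example; the OLE Automation link to Unisim and the Encog models themselves are not reproduced.

```python
# Pluggable intelligent-module pipeline; all names and tags are hypothetical.
from abc import ABC, abstractmethod

class IntelligentModule(ABC):
    @abstractmethod
    def process(self, data: dict) -> dict: ...

class Preprocessor(IntelligentModule):
    def process(self, data: dict) -> dict:
        # e.g. drop missing readings before the other modules see them
        return {k: v for k, v in data.items() if v is not None}

class FeatureExtractor(IntelligentModule):
    def process(self, data: dict) -> dict:
        # hypothetical derived characteristic from two simulated tags
        data["top_bottom_temp_diff"] = data["top_temp"] - data["bottom_temp"]
        return data

class VariableInference(IntelligentModule):
    def process(self, data: dict) -> dict:
        # a trained model (Encog, in the dissertation) would be queried here
        data["estimated_purity"] = 0.0
        return data

def run_pipeline(sample: dict, modules: list) -> dict:
    for module in modules:
        sample = module.process(sample)
    return sample
```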

Relevance:

20.00%

Publisher:

Abstract:

The seismic method is of extreme importance in geophysics. Mainly associated with oil exploration, this line of research concentrates most of the investment in the area. Acquisition, processing and interpretation of seismic data are the stages of a seismic study. Seismic processing, in particular, is focused on producing an image that represents the geological structures in the subsurface. It has evolved significantly in recent decades, driven by the demands of the oil industry and by hardware advances that provided greater storage and digital processing capabilities, enabling the development of more sophisticated processing algorithms, such as those that use parallel architectures. One of the most important steps in seismic processing is imaging. Migration of seismic data is one of the techniques used for imaging, with the goal of obtaining a seismic section that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults and salt domes, among other structures of interest such as potential hydrocarbon reservoirs. However, a migration performed with quality and accuracy can be very time-consuming, due to the heuristics of the mathematical algorithms and the extensive amount of input and output data involved; it may take days, weeks or even months of uninterrupted execution on supercomputers, representing large computational and financial costs that could make the adoption of these methods unfeasible. Aiming at performance improvement, this work parallelized the core of a Reverse Time Migration (RTM) algorithm using the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort required by this migration technique. In addition, speedup and efficiency analyses were performed and, ultimately, the degree of scalability of the algorithm with respect to the technological advances expected in future processors was identified
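
To give an idea of the kernel being parallelized, the sketch below propagates a 2D acoustic wavefield with a simple second-order finite-difference stencil, the kind of grid sweep that the RTM core repeats forward and backward in time. The dissertation's implementation is an OpenMP-parallelized one, not this vectorized NumPy toy; the velocity model, wavelet and grid parameters are illustrative.

```python
# Toy 2D acoustic wave propagation with a 5-point Laplacian stencil.
import numpy as np

def propagate(velocity, source, src_pos, nt, dt, dx):
    """Return the final wavefield after nt time steps."""
    nz, nx = velocity.shape
    prev = np.zeros((nz, nx))
    curr = np.zeros((nz, nx))
    c2 = (velocity * dt / dx) ** 2
    for it in range(nt):
        lap = np.zeros((nz, nx))
        # interior-point stencil: the hot loop an OpenMP version sweeps in parallel
        lap[1:-1, 1:-1] = (curr[2:, 1:-1] + curr[:-2, 1:-1] +
                           curr[1:-1, 2:] + curr[1:-1, :-2] - 4.0 * curr[1:-1, 1:-1])
        nxt = 2.0 * curr - prev + c2 * lap
        nxt[src_pos] += source[it]            # inject the source wavelet
        prev, curr = curr, nxt
    return curr

# Tiny example: constant-velocity model and a Ricker-like pulse at the centre.
v = np.full((101, 101), 2000.0)               # m/s
dt, dx, nt = 0.001, 10.0, 300
t = np.arange(nt) * dt
wavelet = (1 - 2 * (np.pi * 15 * (t - 0.05)) ** 2) * np.exp(-(np.pi * 15 * (t - 0.05)) ** 2)
field = propagate(v, wavelet, (50, 50), nt, dt, dx)
```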

Relevance:

20.00%

Publisher:

Abstract:

New versions of the SCTP protocol allow the implementation of handover procedures at the transport layer, as well as the provision of a partially reliable communication service. A communication architecture is proposed herein, integrating SCTP with the session initiation protocol SIP, in addition to other protocols. This architecture is intended to handle voice applications over IP networks with mobility requirements. User localization procedures are also specified at the application layer, using SIP, as an alternative to the mechanisms used by traditional protocols that support mobility at the network layer. The SDL formal specification language is used to specify the operation of a control module, which coordinates the operation of the protocols that compose the system. This formal specification is intended to prevent ambiguities and inconsistencies in the definition of this module, assisting in the correct implementation of the elements of the architecture
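
Since the control module is specified as a state machine in SDL, the sketch below shows what a drastically simplified version of such a machine could look like in code; the states, events and transitions are assumptions made for illustration, not the module's actual specification.

```python
# Simplified state machine for a control module coordinating SIP signalling
# and an SCTP path switch; all states and events are hypothetical.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    REGISTERING = auto()      # SIP REGISTER in progress
    IN_CALL = auto()          # voice session established over SCTP
    HANDOVER = auto()         # switching the active SCTP path

TRANSITIONS = {
    (State.IDLE, "register"): State.REGISTERING,
    (State.REGISTERING, "registered"): State.IDLE,
    (State.IDLE, "call_established"): State.IN_CALL,
    (State.IN_CALL, "new_network_detected"): State.HANDOVER,
    (State.HANDOVER, "path_switched"): State.IN_CALL,
    (State.IN_CALL, "call_ended"): State.IDLE,
}

def step(state: State, event: str) -> State:
    # Events not expected in the current state are ignored.
    return TRANSITIONS.get((state, event), state)

s = State.IDLE
for ev in ["register", "registered", "call_established", "new_network_detected", "path_switched"]:
    s = step(s, ev)
    print(ev, "->", s.name)
```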

Relevance:

20.00%

Publisher:

Abstract:

Ethernet technology dominates the market of local computer networks. However, it has not established itself in industrial automation, where the requirements demand determinism and real-time performance. Many solutions have been proposed to address the lack of determinism, based mainly on TDMA (Time Division Multiple Access), token passing and master-slave schemes. This research carries out performance measurements that allow comparing the behavior of Ethernet networks when data is transmitted over the UDP and raw Ethernet protocols, as well as over three different types of Ethernet technology. The objective is to identify which of the analyzed protocols and Ethernet technologies offers the most satisfactory support for industrial automation networks and distributed real-time applications
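
A minimal sketch of the kind of round-trip latency measurement used in such comparisons, shown only for the UDP case (raw Ethernet sockets require elevated privileges and OS-specific code). The address, port, packet size and sample count are illustrative, and a matching echo server is assumed on the other end.

```python
# Measure per-packet round-trip times over UDP against an echo server.
import socket
import time

def udp_rtt_samples(host="127.0.0.1", port=9000, n=100, payload=64 * b"x"):
    """Send n datagrams and return per-packet RTTs in seconds."""
    rtts = []
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(1.0)
        for _ in range(n):
            start = time.perf_counter()
            sock.sendto(payload, (host, port))
            try:
                sock.recvfrom(2048)                       # wait for the echo
                rtts.append(time.perf_counter() - start)
            except socket.timeout:
                rtts.append(float("inf"))                 # count losses separately
    return rtts

# A matching echo server would send every received datagram straight back.
```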

Relevance:

20.00%

Publisher:

Abstract:

Traditional irrigation projects do not locally determine the water availability in the soil, so irregular irrigation cycles may occur: some with an insufficient amount of water, leading to water deficit, and others with excessive watering, which causes lack of oxygen for the plants. Due to the nonlinear nature of this problem and the multivariable context of irrigation processes, fuzzy logic is suggested to replace commercial ON-OFF irrigation systems with predefined timing. Another limitation of commercial solutions is that their irrigation schedules consider neither the different watering needs throughout the plant growth cycle nor climate changes. In order to fulfill location-based agricultural needs, it is advisable to monitor environmental data using wireless sensors connected to an intelligent control system, which is most evident in applications such as precision agriculture. This work presents the theoretical and experimental development of a fuzzy system that implements spatially differentiated control of an irrigation system, based on soil moisture measurements from wireless sensor nodes. The control architecture is modular: a fuzzy supervisor determines the soil-moisture set point of each sensor node area (according to the soil-plant set), and another fuzzy system, embedded in the sensor node, performs the local control and actuates the irrigation system. The fuzzy control system was simulated with the SIMULINK® programming tool and was experimentally implemented embedded in SunSPOT™ mobile devices operating over ZigBee. Controller models were designed and evaluated with different combinations of input variables and inference rule bases
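
A tiny sketch of the local fuzzy control step: the soil-moisture error is fuzzified with triangular membership functions and mapped to an irrigation duration by weighted-average defuzzification. The membership limits, rule consequents and units are illustrative values, not the controller actually designed in the work.

```python
# Fuzzify the soil-moisture error and defuzzify to an irrigation duration.
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def irrigation_minutes(moisture_error_pct):
    # Degrees of membership for the error (% of moisture below the set point).
    ok = tri(moisture_error_pct, -10, 0, 10)
    dry = tri(moisture_error_pct, 5, 15, 25)
    very_dry = tri(moisture_error_pct, 15, 30, 45)

    # Rule consequents in minutes, combined by weighted average.
    rules = [(ok, 0.0), (dry, 10.0), (very_dry, 25.0)]
    total_weight = sum(w for w, _ in rules)
    if total_weight == 0.0:
        return 0.0
    return sum(w * out for w, out in rules) / total_weight

print(irrigation_minutes(20.0))   # mostly "dry", partly "very dry"
```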

Relevance:

20.00%

Publisher:

Abstract:

A serious problem that affects the processing units of an oil refinery is the deposition of solid particles, or fouling, on the equipment. These residues are naturally present in the oil or are by-products of chemical reactions during its transport. A fouled heat exchanger loses its capacity to adequately heat the oil and needs to be shut down periodically for cleaning. Knowing in advance the best period to shut down the exchanger may improve the energy and production efficiency of the plant. In this work we develop a system to predict fouling in a heat exchanger of the Potiguar Clara Camarão Refinery, based on data collected in a partnership with Petrobras. Recurrent neural networks are used to predict the heat exchanger's flow rate at future times. This variable is the main indicator of fouling, because its value decreases gradually as the deposits on the tubes reduce their diameter. The prediction can be used to tell when the flow will have fallen below an acceptable value, indicating when the exchanger must be shut down for cleaning
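
A sketch of one-step-ahead flow prediction from a sliding window of past measurements with a recurrent network. The abstract does not state which recurrent architecture was used, so the LSTM layer, window length, synthetic placeholder data and training settings below are assumptions for illustration only.

```python
# One-step-ahead flow forecasting with a small LSTM on sliding windows.
import numpy as np
import tensorflow as tf

def make_windows(series, window=24):
    """Turn a 1-D flow series into (samples, window, 1) inputs and next-step targets."""
    x = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return x[..., None], y

flow = np.sin(np.linspace(0, 20, 500)) + 0.05 * np.random.randn(500)   # placeholder data
x, y = make_windows(flow)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(x.shape[1], 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=5, batch_size=32, verbose=0)

next_flow = model.predict(x[-1:], verbose=0)[0, 0]   # forecast the next flow value
```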