10 results for "redundância" (redundancy)

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

10.00%

Publisher:

Abstract:

High levels of local, regional, and global extinction have progressively simplified communities in terms of both species and ecosystem functioning. Theoretical models have demonstrated that the degree of functional redundancy determines the rate of functional group loss in response to species extinctions. Here, we improve these theoretical predictions by incorporating into the model interactions between species and between functional groups. We tested the effect of different scenarios of interspecific interactions, and of effects between functional groups, on a community's resistance to the loss of functional groups. Virtual communities were built with different distribution patterns of species across functional groups, with both high and low evenness. A matrix A was created to represent the net effect of interspecific interactions among all species, capturing nestedness, modularity, sensitive species, and dominant species. A second matrix B was created to represent the interactions between functional groups, also exhibiting different patterns. The extinction probability of each species was calculated from community species richness, the intensity of the interspecific interactions acting upon it, and the group to which it belongs. In the model, successive extinctions decrease community species richness, the degree of functional redundancy and, consequently, the number of functional groups that remain in the system. For each scenario of functional redundancy, A, and B, we ran 1000 simulations to generate an average functional extinction curve. Different model assumptions generated remarkable variation in the functional extinction curves. The most extreme variations occurred when matrices A and B caused greater heterogeneity in the species extinction probabilities. Scenarios with sensitive species, whether positive or negative, showed greater variation than scenarios with dominant species. Nested interactions showed greater variation than scenarios in which the interactions were modular. Communities with maximal functional evenness can only be destabilized by the interactions between species and functional groups. In contrast, communities with low functional evenness can have their resistance either increased or decreased by the interactions. The concentration of positive interactions in low-redundancy groups, or of negative interactions in high-redundancy groups, decreased functional extinction rates; conversely, the concentration of negative interactions in low-redundancy groups, or of positive interactions in high-redundancy groups, increased them. These results are relevant for species prioritization in ecosystem conservation and restoration.
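As a rough illustration of the kind of simulation the abstract describes (not the authors' actual model, which includes the interaction matrices A and B), a minimal sketch of functional-group loss under random species extinctions could look like the following; the community, grouping, and uniform extinction order are simplifying assumptions:

```python
import random

def simulate_functional_loss(groups, n_extinctions, seed=0):
    """Remove species one at a time at random and track how many
    functional groups still have at least one member.
    `groups` maps each species name to its functional group."""
    rng = random.Random(seed)
    alive = dict(groups)  # species -> functional group
    surviving = []
    for _ in range(n_extinctions):
        if not alive:
            break
        victim = rng.choice(sorted(alive))
        del alive[victim]
        surviving.append(len(set(alive.values())))
    return surviving

# Hypothetical community: group "A" is redundant (3 species), group
# "B" has only 2, so "B" tends to be lost from the system first.
community = {"s1": "A", "s2": "A", "s3": "A", "s4": "B", "s5": "B"}
curve = simulate_functional_loss(community, n_extinctions=5)
```

Averaging such curves over many random orderings (the abstract uses 1000 runs per scenario) yields the functional extinction curve whose shape the interaction matrices modulate.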

Relevance:

10.00%

Publisher:

Abstract:

Ensuring dependability requirements is essential for industrial applications, since faults may cause failures whose consequences include economic losses, environmental damage, or harm to people. Given the relevance of the topic, this thesis proposes a methodology for the dependability evaluation of industrial wireless networks (WirelessHART, ISA100.11a, WIA-PA) at the early design phase, although the proposal can be easily adapted to the maintenance and expansion stages of a network. The proposal uses graph theory and the fault tree formalism to automatically create, from a given industrial wireless network topology, an analytical model on which dependability can be evaluated. The supported evaluation metrics are reliability, availability, MTTF (mean time to failure), importance measures of devices, redundancy aspects, and common cause failures. The proposal is independent of any particular tool for the quantitative evaluation of the target metrics; for validation purposes, however, a tool widely accepted in academia (SHARPE) was used. In addition, an algorithm to generate minimal cut sets, originally applied in graph theory, was adapted to the fault tree formalism to guarantee the scalability of the methodology in industrial wireless network environments (< 100 devices). Finally, the proposed methodology was validated on typical scenarios found in industrial environments, such as star, line, cluster, and mesh topologies. Scenarios with common cause failures were also evaluated, along with best practices to guide the design of an industrial wireless network. To verify the scalability requirements, the performance of the methodology was analyzed in different scenarios, and the results show the applicability of the proposal to networks typically found in industrial environments.
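To make the minimal-cut-set idea concrete: a minimal cut set is a smallest set of devices whose joint failure brings the system down. A small sketch under simplified assumptions (independent failures, exhaustive state enumeration, made-up device names) shows how reliability follows from the cut sets; real tools such as SHARPE use far more scalable evaluation:

```python
from itertools import product

def system_reliability(cut_sets, p_fail):
    """Exact reliability by enumerating device states: the system is
    up unless every device in some minimal cut set has failed.
    `cut_sets` is a list of sets of device names; `p_fail` maps each
    device to its independent failure probability."""
    devices = sorted(p_fail)
    rel = 0.0
    for states in product([False, True], repeat=len(devices)):
        failed = {d for d, s in zip(devices, states) if s}
        prob = 1.0
        for d, s in zip(devices, states):
            prob *= p_fail[d] if s else 1 - p_fail[d]
        if not any(cs <= failed for cs in cut_sets):
            rel += prob
    return rel

# Hypothetical topology: two redundant gateways feeding one field
# device. The system fails if both gateways fail, or the device fails.
cuts = [{"gw1", "gw2"}, {"dev"}]
r = system_reliability(cuts, {"gw1": 0.1, "gw2": 0.1, "dev": 0.01})
# r = (1 - 0.1*0.1) * (1 - 0.01) = 0.9801
```

The brute-force enumeration is exponential in the number of devices, which is precisely why the thesis adapts a dedicated minimal-cut-set algorithm for networks approaching 100 devices.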

Relevance:

10.00%

Publisher:

Abstract:

The development of wireless sensor networks for control and monitoring functions has created a vibrant investigation scenario, ranging from communication aspects to issues related to energy efficiency. When source sensors are endowed with cameras for visual monitoring, a new scope of challenges arises, as transmission and monitoring requirements change considerably. In particular, visual sensors collect data following a directional sensing model, altering the meaning of concepts such as vicinity and redundancy but allowing source nodes to be differentiated by their sensing relevance for the application. In this context, we propose the combined use of two differentiation strategies as a novel QoS parameter, exploring the sensing relevance of source nodes and DWT image coding. This approach supports a new scope of optimizations to improve the performance of visual sensor networks at the cost of a small reduction in the overall monitoring quality of the application. Besides defining a new concept of relevance and proposing mechanisms to support its practical exploitation, we propose five different optimizations in the way images are transmitted in wireless visual sensor networks, aiming at energy saving, low-delay transmission, and error recovery. Taken together, the proposed differentiation strategies and the related optimizations open a relevant research trend, in which the application monitoring requirements are used to guide a more efficient operation of sensor networks.
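One way to picture relevance-based differentiation is as a transmission policy that spends a limited energy or bandwidth budget on the most relevant sources first. The sketch below is a hypothetical greedy policy, not one of the five optimizations proposed in the thesis; the packet tuples and budget units are invented for illustration:

```python
def select_packets(packets, budget):
    """Budget-aware selection: transmit packets from the most relevant
    sources first until the budget is exhausted. Each packet is a
    tuple (source_relevance, transmission_cost, payload_id)."""
    chosen = []
    spent = 0
    for rel, cost, pid in sorted(packets, key=lambda p: -p[0]):
        if spent + cost <= budget:
            chosen.append(pid)
            spent += cost
    return chosen

# Three camera nodes with decreasing sensing relevance for the
# application; only the two most relevant fit in the budget.
packets = [(0.9, 3, "cam1"), (0.5, 2, "cam2"), (0.2, 2, "cam3")]
chosen = select_packets(packets, budget=5)
```

In a DWT-coded image the same idea can be applied per subband, dropping high-frequency subbands of low-relevance sources first so that overall monitoring quality degrades gracefully.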

Relevance:

10.00%

Publisher:

Abstract:

Industries are becoming more and more rigorous where safety is concerned, whether to avoid financial losses due to accidents and low productivity or to protect the environment. It was in light of major accidents around the world involving aircraft and industrial processes (nuclear, petrochemical, and so on) that we decided to invest in systems for fault detection and diagnosis (FDD). FDD systems can prevent eventual failures by assisting operators with the maintenance and replacement of defective equipment. Nowadays, the issues involving detection, isolation, diagnosis, and fault-tolerant control are gaining strength in academic and industrial environments. Based on this, in this work we discuss the importance of techniques that can assist in the development of FDD systems and propose a hybrid method for FDD in dynamic systems. We present a brief history to contextualize the techniques used in working environments. Fault detection in the proposed system is based on state observers in conjunction with other statistical techniques. The principal idea is to use the observer itself, in addition to serving as an analytical redundancy, to generate a residue; this residue is used for FDD. A signature database assists in the identification of system faults: based on signatures derived from trend analysis of the residue signal and of its difference, it classifies faults using a decision tree. This FDD system is tested and validated on two plants: a simulated plant with coupled tanks and a didactic plant with industrial instrumentation. All results collected from these tests are discussed.
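The residue idea can be sketched in a few lines. Below is a minimal scalar Luenberger-style observer, not the modified observer of the thesis; the plant dynamics, gain, and injected sensor bias are illustrative assumptions. While the model matches the plant the residue stays near zero; a fault shows up as a jump:

```python
def observer_residues(ys, a=0.9, c=1.0, gain=0.5):
    """Scalar observer for x[k+1] = a*x[k], y[k] = c*x[k].
    The residue y - c*x_hat is the analytical-redundancy signal
    used for fault detection."""
    x_hat = ys[0] / c  # initialize estimate from the first sample
    residues = []
    for y in ys:
        r = y - c * x_hat              # residue: measured vs estimated
        residues.append(r)
        x_hat = a * x_hat + gain * r   # predict + correct (Luenberger)
    return residues

# Healthy output of x[k+1] = 0.9 x[k] starting at 10; a +2.0 sensor
# bias (hypothetical fault) is injected at sample 5.
healthy = [10 * 0.9 ** k for k in range(10)]
faulty = healthy[:5] + [v + 2.0 for v in healthy[5:]]
res = observer_residues(faulty)
```

Trend analysis of `res` (and of its first difference) is what feeds the signature database and the decision-tree classifier described above.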

Relevance:

10.00%

Publisher:

Abstract:

Image compression consists in representing an image with a small amount of data without loss of visual quality. Data compression is important when large images are used, for example satellite images. Full-color digital images typically use 24 bits to specify the color of each pixel, with 8 bits for each of the primary components: red, green, and blue (RGB). Compressing an image with three or more bands (multispectral) is fundamental to reduce transmission, processing, and storage time. Many applications depend on images, among them medical imaging, satellite imaging, and sensing, which makes image compression important. In this work a new method for compressing color images is proposed. The method is based on a measure of the information in each band. The technique is called Self-Adaptive Compression (SAC), and each band of the image is compressed with a different threshold in order to better preserve information. SAC applies strong compression to highly redundant bands, that is, those carrying less information, and soft compression to bands with larger amounts of information. Two image transforms are used in this technique: the Discrete Cosine Transform (DCT) and Principal Component Analysis (PCA). The first step is to convert the data into decorrelated bands with PCA; the DCT is then applied to each band. Loss occurs when a threshold discards coefficients. This threshold is calculated from two elements: the PCA result and a user parameter that defines the compression rate. The system produces three different thresholds, one for each band of the image, each proportional to that band's amount of information. For image reconstruction, the inverse DCT and inverse PCA are applied. SAC was compared with the JPEG (Joint Photographic Experts Group) standard and with YIQ compression, and better results were obtained in terms of MSE (mean squared error). Tests showed that SAC yields better quality under strong compression, with two advantages: (a) being adaptive, it is sensitive to the image type, presenting good results for diverse kinds of images (synthetic, landscapes, people, etc.); and (b) it needs only one user parameter, so little human intervention is required.
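The per-band thresholding step can be sketched as follows. This is an illustrative simplification, not the SAC implementation: a naive 1-D DCT stands in for the 2-D transform, the PCA step is omitted, and the keep ratio plays the role of the information-proportional threshold:

```python
import math

def dct(xs):
    """Naive 1-D DCT-II (illustrative; real codecs use fast transforms)."""
    n = len(xs)
    return [sum(x * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, x in enumerate(xs)) for k in range(n)]

def compress_band(xs, keep_ratio):
    """Zero out the smallest DCT coefficients of one band. A band with
    more information would be given a higher keep_ratio, mimicking
    SAC's softer compression of information-rich bands."""
    coefs = dct(xs)
    n_keep = max(1, round(keep_ratio * len(coefs)))
    cutoff = sorted((abs(c) for c in coefs), reverse=True)[n_keep - 1]
    return [c if abs(c) >= cutoff else 0.0 for c in coefs]

# Keep only the 2 largest of 8 coefficients of a smooth ramp signal.
out = compress_band([1, 2, 3, 4, 5, 6, 7, 8], keep_ratio=0.25)
```

Decoding would apply the inverse DCT to the sparse coefficients; smooth bands survive aggressive thresholding well, which is why redundant bands tolerate large compression.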

Relevance:

10.00%

Publisher:

Abstract:

Potentially toxic cyanobacterial blooms in water bodies are spread across the globe, resulting in the loss of water quality and adverse effects on human health. In arid and semiarid regions, the hydrologic regime, characterized by an annual cycle of drought and rain, changes the volume and retention time of reservoirs. Such changes affect limnological characteristics and alter the composition and biomass of the cyanobacteria community. The Cruzeta reservoir (Zmax = 8.7 m) is a eutrophic water supply source located in the tropical semiarid region of Northeast Brazil. We raised the hypothesis that the hydrological regime of the tropical semiarid zone is a determining factor in the availability of resources in eutrophic water sources, influencing the composition of the dominant cyanobacteria species. The aim of this study was to analyze the changes in biomass and species composition of cyanobacteria over two annual hydrological cycles and to evaluate the driving factors. The study was divided into five distinct periods (dry 2010, rainy 2011, dry 2011, rainy 2012, dry 2012). The dominant group in all periods was Cyanobacteria (99% of total biomass), which contributed to the low diversity. The filamentous species Cylindrospermopsis raciborskii was present at both sampling points throughout almost the entire study. The colonial species Microcystis panniformis and Sphaerocavum brasiliensis dominated only in periods with lower water volumes. Diatoms contributed more to the biomass during the period of severe drought. The point near the dam (P1) had larger phytoplankton biomass than the point near the tributary (P2). The dominance of colonial cyanobacteria lasted until the overflow at P1, while at P2 it lasted until the first rains. Redundancy analysis indicated that physical factors such as light availability and water level were the main drivers of the seasonal succession of phytoplankton. The phytoplankton composition in the reservoir alternated between filamentous cyanobacteria, such as Cylindrospermopsis raciborskii, under conditions of poor water-column stability, and colonial species, such as Microcystis panniformis and Sphaerocavum brasiliensis, under conditions of high water-column stability. The extremes of torrential rain and severe drought governed by the hydrological regime of the semiarid region determined the availability of resources in the watershed, directing the spatial and temporal dynamics of phytoplankton in the Cruzeta reservoir.

Relevance:

10.00%

Publisher:

Abstract:

Nowadays several electronic devices support digital video; examples include cell phones, digital cameras, video cameras, and digital televisions. However, raw video contains a huge amount of data, millions of bits, to represent the sequence as captured. Storing it in this primary form would require enormous disk space, and transmitting it would require enormous bandwidth. Video compression is therefore essential to make the storage and transmission of this information feasible. Motion estimation is a technique used in video coders that exploits the temporal redundancy present in video sequences to reduce the amount of data necessary to represent the information. This work presents a hardware architecture of a motion estimation module for high-resolution video according to the H.264/AVC standard. H.264/AVC is the most advanced video coding standard, with several new features that allow it to achieve high compression rates. The architecture presented in this work was developed to provide high data reuse; the adopted data reuse scheme reduces the bandwidth required to execute motion estimation. Motion estimation is the task responsible for the largest share of the gains obtained with the H.264/AVC standard, so this module is essential for the final video coder performance. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian System of Digital Television.
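At its core, motion estimation searches a reference frame for the block that best matches the current block, typically by minimizing the sum of absolute differences (SAD). The full-search sketch below is a software illustration of that idea, not the thesis's hardware architecture; the tiny frame and search radius are made up:

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized blocks."""
    return sum(abs(a - b) for ra, rb in zip(block_a, block_b)
               for a, b in zip(ra, rb))

def best_match(ref, cur_block, top, left, radius):
    """Full search in a window of the reference frame around
    (top, left); returns (cost, dy, dx) for the best motion vector."""
    bh, bw = len(cur_block), len(cur_block[0])
    best = None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if 0 <= y <= len(ref) - bh and 0 <= x <= len(ref[0]) - bw:
                cand = [row[x:x + bw] for row in ref[y:y + bh]]
                cost = sad(cand, cur_block)
                if best is None or cost < best[0]:
                    best = (cost, dy, dx)
    return best

# 6x6 reference frame with unique pixel values; the current 2x2 block
# is the reference content shifted one pixel to the right.
ref = [[r * 10 + c for c in range(6)] for r in range(6)]
cur_block = [[23, 24], [33, 34]]
best = best_match(ref, cur_block, top=2, left=2, radius=2)
```

Overlapping candidate blocks within the search window share most of their pixels, and exploiting that overlap is exactly the data reuse that the hardware architecture targets to cut memory bandwidth.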

Relevance:

10.00%

Publisher:

Abstract:

The detection and diagnosis of faults, i.e., finding out how, where, and why failures occur, has been an important area of study since humans began to be replaced by machines. However, no technique studied to date solves the problem definitively. Differences among dynamic systems, whether linear or nonlinear, time-variant or time-invariant, with physical or analytical redundancy, hamper the search for a single solution. In this work, a technique for fault detection and diagnosis (FDD) in dynamic systems is presented, using state observers in conjunction with other tools to create a hybrid FDD scheme. A modified state observer is used to create a residue that enables both the detection and the diagnosis of faults. A bank of fault signatures is created using statistical tools, and finally an approach based on the mean squared error (MSE) assists in studying the behavior of the fault diagnosis even in the presence of noise. The methodology is then applied to an educational plant with coupled tanks and to another plant with industrial instrumentation to validate the system.
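The MSE-based matching against a bank of fault signatures can be sketched simply: compare the observed residue trend to each stored signature and pick the closest. The signature names and values below are hypothetical, and this nearest-signature rule is a simplification of the thesis's approach:

```python
def mse(a, b):
    """Mean squared error between two equally long sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def classify_fault(residue, signatures):
    """Return the name of the stored fault signature closest to the
    observed residue trend in the mean-squared-error sense."""
    return min(signatures, key=lambda name: mse(residue, signatures[name]))

# Hypothetical signature bank: constant sensor bias, slow drift, no fault.
signatures = {
    "bias":  [2.0, 2.0, 2.0, 2.0],
    "drift": [0.0, 1.0, 2.0, 3.0],
    "none":  [0.0, 0.0, 0.0, 0.0],
}
label = classify_fault([1.9, 2.1, 2.0, 1.8], signatures)
```

Because MSE averages squared deviations, moderate measurement noise perturbs each distance only slightly, which is why the comparison remains usable in noisy conditions.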

Relevance:

10.00%

Publisher:

Abstract:

Coding is a fundamental aspect of cerebral functioning. The transformation of sensory stimuli into neurophysiological responses has been a research theme in several areas of neuroscience. One of the most common ways to measure the efficiency of a neural code is through Information Theory measures, such as mutual information. Using these tools, recent studies show that in the auditory cortex both local field potentials (LFPs) and action potential spiking times encode information about sound stimuli. However, there are no studies applying Information Theory tools to investigate the efficiency of codes that use postsynaptic potentials (PSPs), alone or combined with LFP analysis. These signals are related in the sense that LFPs are partly created by the joint action of several PSPs. The present dissertation reports information measures between auditory stimuli of distinct frequencies and the PSP and LFP responses obtained in the primary auditory cortex of anaesthetized rats. Our results show that PSP responses hold information about sound stimuli at levels comparable to, and even greater than, those of LFP responses. We also found that PSPs and LFPs encode sound information independently, since the joint analysis of these signals showed neither synergy nor redundancy.
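For readers unfamiliar with the measure, mutual information between a discretized stimulus variable and a discretized response variable can be estimated directly from paired samples with the plug-in formula I(X;Y) = Σ p(x,y) log2[p(x,y) / (p(x)p(y))]. The sketch below is a generic estimator, not the dissertation's analysis pipeline (which would also handle binning and bias correction):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts of X
    py = Counter(ys)             # marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with counts/n as probs
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

# Perfectly dependent binary variables carry 1 bit of shared information.
mi_dep = mutual_information([0, 1, 0, 1], [0, 1, 0, 1])
```

Synergy and redundancy are then assessed by comparing the information in the joint (PSP, LFP) response against the sum of the individual informations; equality, as reported above, indicates independent coding.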

Relevance:

10.00%

Publisher:

Abstract:

The continuous evolution of integrated circuit technology has allowed thousands of transistors to be integrated on a single chip. This is due to the miniaturization process, which reduces the diameter of wires and transistors. One drawback of this process is that the circuit becomes more fragile and susceptible to breakage, making it more prone to permanent faults both during the manufacturing process and during its lifetime. Coarse-Grained Reconfigurable Architectures (CGRAs) have been used as an alternative to traditional architectures in an attempt to tolerate such faults, thanks to their intrinsic hardware redundancy and high performance. This work proposes a fault tolerance mechanism for a CGRA in order to increase the architecture's fault tolerance even under a high fault rate. The proposed mechanism was added to the scheduler, the component responsible for mapping instructions onto the architecture. Instruction mapping occurs at runtime, translating binary code without the need for recompilation. Furthermore, to allow a faster implementation, instruction mapping is performed using a greedy modulo scheduling algorithm, a software pipelining technique for loop acceleration. The results show that, even with the proposed mechanism, the time for mapping instructions remains on the order of microseconds, which allows the instruction mapping process to stay at runtime. In addition, the scheduler's mapping success rate was also studied. The results demonstrate that even at fault rates over 50% in functional units and interconnection components, the scheduler was able to map instructions onto the architecture in most of the tested applications.
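The fault-aware placement idea can be illustrated with a deliberately simplified greedy mapper: skip functional units known to be faulty and fill the remaining ones cycle by cycle. This ignores interconnect constraints and the modulo-scheduling initiation interval of the real scheduler; unit indices and instruction names are invented:

```python
def map_instructions(instructions, n_units, faulty):
    """Greedy fault-aware placement: assign each instruction to the
    first healthy functional unit free in the current cycle; open a
    new cycle when all healthy units are busy."""
    healthy = [u for u in range(n_units) if u not in faulty]
    if not healthy:
        raise ValueError("no healthy functional units")
    schedule = []  # list of cycles, each mapping unit -> instruction
    for ins in instructions:
        if not schedule or len(schedule[-1]) == len(healthy):
            schedule.append({})
        unit = next(u for u in healthy if u not in schedule[-1])
        schedule[-1][unit] = ins
    return schedule

# 4-unit CGRA row with units 1 and 3 marked permanently faulty:
# three instructions now need two cycles instead of one.
schedule = map_instructions(["a", "b", "c"], n_units=4, faulty={1, 3})
```

Faults thus cost throughput (more cycles per iteration) rather than correctness, which matches the observation that mapping still succeeds at fault rates above 50%, only on a smaller healthy subset of the fabric.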