588 results for redundancy


Relevance:

10.00%

Publisher:

Abstract:

In the present study, we propose a theoretical graph procedure to investigate multiple pathways in brain functional networks. By taking into account all the possible paths consisting of h links between the node pairs of the network, we measured the global network redundancy R(h) as the number of parallel paths and the global network permeability P(h) as the probability of getting connected. We used this procedure to investigate the structural and dynamical changes in the cortical networks estimated from a dataset of high-resolution EEG signals in a group of spinal cord injured (SCI) patients during attempted foot movement. In a statistical contrast with a healthy population, the permeability index P(h) of the SCI networks increased significantly (P < 0.01) in the Theta frequency band (3-6 Hz) for distances h ranging from 2 to 4. On the contrary, no significant differences were found between the two populations for the redundancy index R(h). The most significant changes in the brain functional network of SCI patients occurred mainly in the lower spectral contents. These changes were related to an improved propagation of communication between the closest cortical areas rather than to a different level of redundancy. This evidence strengthens the hypothesis of the need for a higher functional interaction among the closest ROIs as a mechanism to compensate for the lack of feedback from the peripheral nerves to the sensorimotor areas.
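As an illustrative sketch (not the authors' implementation), the number of h-link walks between node pairs can be read off powers of the adjacency matrix, giving simple proxies for R(h) and P(h); the exact path definition used in the paper may differ:

```python
import numpy as np

def redundancy_permeability(A, h):
    """Count h-step walks between every node pair (a proxy for the
    paper's h-link parallel paths) and derive two global indices:
    R(h), the total number of h-step walks over all distinct pairs, and
    P(h), the fraction of pairs connected by at least one h-step walk."""
    A = np.asarray(A, dtype=int)
    n = A.shape[0]
    Ah = np.linalg.matrix_power(A, h)  # (A^h)[i, j] = number of i->j walks of length h
    off = ~np.eye(n, dtype=bool)       # exclude self-loops from the pair counts
    R = int(Ah[off].sum())
    P = float((Ah[off] > 0).mean())
    return R, P

# Toy directed network: a 3-node cycle 0 -> 1 -> 2 -> 0
A = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]
print(redundancy_permeability(A, 2))  # (3, 0.5)
```

For the 3-cycle, each node reaches exactly one other node in two steps, so three pairs out of six are connected: R(2) = 3 and P(2) = 0.5.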

Relevance:

10.00%

Publisher:

Abstract:

Microbial community composition was examined in two soil types, Anthrosols and adjacent soils, sampled from three locations in the Brazilian Amazon. The Anthrosols, also known as Amazonian dark earths, are highly fertile soils that are a legacy of pre-Columbian settlement. Both Anthrosols and adjacent soils are derived from the same parent material and subject to the same environmental conditions, including rainfall and temperature; however, the Anthrosols contain high levels of charcoal-like black carbon from which they derive their dark color. The Anthrosols typically have higher cation exchange capacity, higher pH, and higher phosphorus and calcium contents. We used culture media prepared from soil extracts to isolate bacteria unique to the two soil types and then sequenced their 16S rRNA genes to determine their phylogenetic placement. Higher numbers of culturable bacteria, by over two orders of magnitude at the deepest sampling depths, were counted in the Anthrosols. Sequences of bacteria isolated on soil extract media yielded five possible new bacterial families. Also, a higher number of bacterial families were represented by isolates from the deeper soil depths in the Anthrosols. Higher bacterial populations and a greater diversity of isolates were found in all of the Anthrosols, to a depth of up to 1 m, compared to adjacent soils located within 50-500 m of their associated Anthrosols. Compared to standard culture media, soil extract media revealed diverse soil microbial populations adapted to the unique biochemistry and physiological ecology of these Anthrosols.

Relevance:

10.00%

Publisher:

Abstract:

In recent years, software clones and plagiarism have become an increasing threat to creativity. Clones are the result of copying and using others' work. According to the Merriam-Webster dictionary, "a clone is one that appears to be a copy of an original form"; it is a synonym for duplicate. Clones lead to redundancy of code, but not all redundant code is a clone. Against this background, in order to safeguard one's ideas and to discourage intentional code duplication that passes off others' work as one's own, software clone detection should receive more emphasis. The objective of this paper is to review methods for clone detection, to apply those methods to find the extent of plagiarism among Master-level computer science theses at Swedish universities, and to analyze the results. The rest of the paper discusses software plagiarism detection employing a data analysis technique, followed by a statistical analysis of the results. Plagiarism is the act of stealing the ideas and words of another person and passing them off as one's own. Using the data analysis technique, samples (Master-level computer science thesis reports) were taken from various Swedish universities and processed with the Ephorus anti-plagiarism detection software. Ephorus gives the percentage of plagiarism for each thesis document, and from these results a statistical analysis was carried out using Minitab software. The results give a very low percentage of plagiarism among the Swedish universities, which suggests that plagiarism is not a threat to Sweden's standard of education in computer science. This paper is based on data analysis, intelligence techniques, the Ephorus plagiarism detection tool, and Minitab statistical analysis software.
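Ephorus is a proprietary tool, but the core overlap idea behind most textual plagiarism detectors can be sketched with word n-gram fingerprints and a Jaccard score; this minimal example is an illustration of the general technique, not Ephorus's algorithm:

```python
def ngrams(text, n=3):
    """Set of lower-case word n-grams of a document."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(doc_a, doc_b, n=3):
    """Jaccard overlap of word n-grams: 1.0 means identical n-gram sets."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "software clones are the results of copying and using the work of others"
copy = "software clones are the results of copying and reusing code"
print(round(similarity(original, copy), 2))  # 0.46
```

The two sentences share 6 of their 13 distinct trigrams, so the score is 6/13 ≈ 0.46; a real detector would add normalization, hashing, and a corpus index on top of this idea.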

Relevance:

10.00%

Publisher:

Abstract:

We introduce a calculus of stratified resolution, in which special attention is paid to clauses that "define" relations. If such clauses are discovered in the initial set of clauses, they are treated using the rule of definition unfolding, i.e. the rule that replaces defined relations by their definitions. Stratified resolution comes with a powerful notion of redundancy: a clause to which definition unfolding has been applied can be removed from the search space. To prove the completeness of stratified resolution with redundancies, we use a novel combination of Bachmair and Ganzinger's model construction technique and a hierarchical construction of orderings and least fixpoints.

Relevance:

10.00%

Publisher:

Abstract:

This thesis presents the study and development of fault-tolerant techniques for programmable architectures, the well-known Field Programmable Gate Arrays (FPGAs), customizable by SRAM. FPGAs are becoming more valuable for space applications because of their high density, high performance, reduced development cost and re-programmability. In particular, SRAM-based FPGAs are very valuable for remote missions because of the possibility of being reprogrammed by the user as many times as necessary in a very short period. SRAM-based FPGAs and micro-controllers represent a wide range of components in space applications, and as a result are the focus of this work, more specifically the Virtex® family from Xilinx and the architecture of the 8051 micro-controller from Intel. Triple Modular Redundancy (TMR) with voters is a common high-level technique to protect ASICs against single event upsets (SEUs) and it can also be applied to FPGAs. The TMR technique was first tested in the Virtex® FPGA architecture by using a small design based on counters. Faults were injected in all sensitive parts of the FPGA and a detailed analysis of the effect of a fault in a TMR design synthesized on the Virtex® platform was performed. Results from fault injection and from a radiation ground test facility showed the efficiency of TMR for the case study circuit. Although TMR has shown high reliability, this technique presents some limitations, such as area overhead, three times more input and output pins and, consequently, a significant increase in power dissipation. Aiming to reduce TMR costs and improve reliability, an innovative high-level technique for designing fault-tolerant systems in SRAM-based FPGAs was developed, without modification of the FPGA architecture. This technique combines time and hardware redundancy to reduce overhead and to ensure reliability. It is based on duplication with comparison and concurrent error detection. 
The new technique proposed in this work was specifically developed for FPGAs to cope with transient faults in the user combinational and sequential logic, while also reducing pin count, area and power dissipation. The methodology was validated by fault injection experiments in an emulation board. The thesis presents comparison results in fault coverage, area and performance between the discussed techniques.
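The 2-of-3 majority voting at the heart of TMR can be modeled in a few lines; real designs implement the voter in HDL on the FPGA fabric, so this Python sketch only illustrates the logic:

```python
def tmr_vote(a, b, c):
    """Bitwise 2-of-3 majority: the output equals any two agreeing inputs,
    so a single upset module cannot corrupt the result."""
    return (a & b) | (b & c) | (a & c)

# Three redundant copies of an 8-bit register value; one suffers an SEU.
good = 0b10110100
upset = good ^ 0b00001000          # single event upset flips one bit in copy c
print(bin(tmr_vote(good, good, upset)))  # 0b10110100
```

Because the voter masks any single divergent copy, the faulty module can be scrubbed or reconfigured later without the error ever reaching the output.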

Relevance:

10.00%

Publisher:

Abstract:

It has been recently shown that local field potentials (LFPs) from the auditory and visual cortices carry information about sensory stimuli, but whether this is a universal property of sensory cortices remains to be determined. Moreover, little is known about the temporal dynamics of sensory information contained in LFPs following stimulus onset. Here we investigated the time course of the amount of stimulus information in LFPs and spikes from the gustatory cortex of awake rats subjected to tastants and water delivery on the tongue. We found that the phase and amplitude of multiple LFP frequencies carry information about stimuli, which have specific time courses after stimulus delivery. The information carried by LFP phase and amplitude was independent within frequency bands, since the joint information exhibited neither synergy nor redundancy. Tastant information in LFPs was also independent and had a different time course from the information carried by spikes. These findings support the hypothesis that the brain uses different frequency channels to dynamically code for multiple features of a stimulus.
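The synergy/redundancy test described above compares the information in the joint (phase, amplitude) code with the sum of the single-feature informations; a minimal discrete sketch with toy data and a plug-in estimator (not the authors' bias-corrected method):

```python
from collections import Counter
from math import log2

def mutual_info(pairs):
    """I(S;R) in bits from a list of (stimulus, response) samples,
    using the plug-in (empirical frequency) estimator."""
    n = len(pairs)
    pj = Counter(pairs)
    ps = Counter(s for s, _ in pairs)
    pr = Counter(r for _, r in pairs)
    return sum(c / n * log2((c / n) / ((ps[s] / n) * (pr[r] / n)))
               for (s, r), c in pj.items())

# Toy samples: (stimulus s, LFP phase x, LFP amplitude y).
# x copies s and y is constant, so the joint code adds nothing beyond x.
data = [(0, 0, 0), (0, 0, 0), (1, 1, 0), (1, 1, 0)]
ix = mutual_info([(s, x) for s, x, y in data])
iy = mutual_info([(s, y) for s, x, y in data])
ijoint = mutual_info([(s, (x, y)) for s, x, y in data])
print(ix, iy, ijoint, ijoint - (ix + iy))  # 1.0 0.0 1.0 0.0
```

A joint-minus-sum difference of zero is the "neither synergy nor redundancy" case reported in the abstract; positive values would indicate synergy and negative values redundancy.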

Relevance:

10.00%

Publisher:

Abstract:

High levels of local, regional, and global extinctions have progressively simplified communities in terms of both species and ecosystem functioning. Theoretical models have demonstrated that the degree of functional redundancy determines the rates of functional group loss in response to species extinctions. Here, we improve the theoretical predictions by incorporating into the model interactions between species and between functional groups. In this study, we tested the effect of different scenarios of interspecific interactions and effects between functional groups on the resistance to loss of community functional groups. Virtual communities were built with different distribution patterns of species in functional groups, with both high and low evenness. A matrix A was created to represent the net effect of interspecific interactions among all species, representing nesting patterns, modularity, sensitive species, and dominant species. Moreover, a second matrix B was created to represent the interactions between functional groups, also exhibiting different patterns. The extinction probability of each species was calculated based on community species richness, on the intensity of the interspecific interactions that act upon it, and on the group to which it belongs. In the model, successive extinctions decrease the community species richness, the degree of functional redundancy and, consequently, the number of functional groups that remain in the system. For each scenario of functional redundancy, A, and B, we ran 1000 simulations to generate an average functional extinction curve. Different model assumptions were able to generate remarkable variation in the functional extinction curves. More extreme variations occurred when the matrices A and B caused higher heterogeneity in the species extinction probabilities. Scenarios with sensitive species, positive or negative, showed greater variation than scenarios with dominant species. 
Nested interactions showed greater variation than scenarios where the interactions were in modules. Communities with maximal functional evenness can only be destabilized by the interactions between species and functional groups. In contrast, communities with low functional evenness can have their resistance either increased or decreased by the interactions. The concentration of positive interactions in low-redundancy groups, or of negative interactions in high-redundancy groups, decreased the functional extinction rates. In contrast, the concentration of negative interactions in low-redundancy groups, or of positive interactions in high-redundancy groups, increased the functional extinction rates. This model shows results that are relevant for species prioritization in ecosystem conservation and restoration.
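A stripped-down version of such a simulation, with uniform extinction risk instead of the A- and B-weighted probabilities used in the thesis, can be sketched as:

```python
import random

def simulate(groups, n_steps, seed=0):
    """Sequentially remove species (uniform extinction risk, the simplest
    case; the thesis weights risks by interaction matrices A and B) and
    track how many functional groups keep at least one member."""
    rng = random.Random(seed)
    alive = dict(groups)                     # species -> functional group
    surviving = []
    for _ in range(n_steps):
        victim = rng.choice(sorted(alive))   # pick a species to go extinct
        del alive[victim]
        surviving.append(len(set(alive.values())))
    return surviving

# Uneven redundancy: group 'a' has three species, group 'b' only one,
# so group 'b' is far more vulnerable to random extinctions.
groups = {"s1": "a", "s2": "a", "s3": "a", "s4": "b"}
print(simulate(groups, 3))
```

Averaging many such runs per scenario yields the functional extinction curve; replacing the uniform `rng.choice` with probabilities weighted by interaction matrices reproduces the heterogeneity effects discussed above.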

Relevance:

10.00%

Publisher:

Abstract:

The response of guava trees to liming and fertilization can be monitored by plant tissue analysis. The nutritional profile is defined relative to nutrient concentration standards. However, standard nutrient concentrations are often criticized for not taking into account the interactions among nutrients and for generating numerical biases arising from data redundancy, scale dependence, and non-normal distribution. Compositional data analysis techniques can control these biases by balancing groups of nutrients, such as those involved in liming and fertilization. The use of sequentially arranged orthonormal isometric log-ratio (ilr) coordinates avoids the numerical biases inherent to compositional data. The objectives of this work were to relate the nutrient balances of plant tissues to the yield of guava trees in 'Paluma' orchards under different liming and fertilization regimes, and to adjust the current nutrient standards to the equilibrium range of the most productive guava trees. A seven-year liming experiment and three three-year experiments with N, P2O5 and K2O rates were conducted in 'Paluma' guava orchards on a Red-Yellow Latosol. Plant concentrations of N, P, K, Ca and Mg were monitored annually. The balances [N, P, K | Ca, Mg], [N, P | K], [N | P] and [Ca | Mg] were selected to separate the effects of liming (Ca-Mg) and fertilizers (N-K) on the macronutrient balances. The balances were influenced more by liming than by fertilization. Guava yield and nutrient balance allowed the definition of nutrient equilibrium ranges and their validation against the critical concentration ranges currently used in Brazil, combined in ilr coordinates.
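A single ilr balance such as [N, P | K] can be computed directly from its definition; the leaf concentrations below are hypothetical, for illustration only:

```python
from math import exp, log, sqrt

def ilr_balance(numerator, denominator):
    """Isometric log-ratio balance [numerator | denominator]:
    sqrt(r*s/(r+s)) * ln( gmean(numerator) / gmean(denominator) ),
    where r and s are the number of parts on each side."""
    r, s = len(numerator), len(denominator)
    gmean = lambda parts: exp(sum(map(log, parts)) / len(parts))
    return sqrt(r * s / (r + s)) * log(gmean(numerator) / gmean(denominator))

# Hypothetical leaf composition (g/kg): N, P, K, Ca, Mg
n, p, k, ca, mg = 20.0, 1.5, 14.0, 7.0, 2.5
print(round(ilr_balance([n, p], [k]), 3))         # balance [N, P | K]
print(round(ilr_balance([n, p, k], [ca, mg]), 3)) # balance [N, P, K | Ca, Mg]
```

Each balance contrasts the geometric means of two nutrient groups on a log scale, which removes the scale dependence and redundancy of raw concentration data mentioned above.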

Relevance:

10.00%

Publisher:

Abstract:

Ensuring dependability requirements is essential for industrial applications, since faults may cause failures whose consequences result in economic losses, environmental damage or harm to people. Given the relevance of the topic, this thesis proposes a methodology for the dependability evaluation of industrial wireless networks (WirelessHART, ISA100.11a, WIA-PA) at the early design phase, although the proposal can be easily adapted to the maintenance and expansion stages of a network. The proposal uses graph theory and the fault tree formalism to automatically create, from a given industrial wireless network topology, an analytical model on which dependability can be evaluated. The supported evaluation metrics are reliability, availability, MTTF (mean time to failure), device importance measures, redundancy aspects and common cause failures. It must be emphasized that the proposal is independent of any particular tool for quantitatively evaluating the target metrics; however, for validation purposes a tool widely accepted in academia (SHARPE) was used. In addition, an algorithm to generate minimal cut sets, originally applied in graph theory, was adapted to the fault tree formalism to guarantee the scalability of the methodology in industrial wireless network environments (< 100 devices). Finally, the proposed methodology was validated on typical scenarios found in industrial environments, such as star, line, cluster and mesh topologies. Scenarios with common cause failures were also evaluated, along with best practices to guide the design of an industrial wireless network. To verify the scalability requirements, the performance of the methodology was analyzed in different scenarios, and the results show the applicability of the proposal to networks typically found in industrial environments.
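Once the minimal cut sets of a topology are known, system unreliability follows by inclusion-exclusion over the cut sets, assuming independent device failures; the gateway/router topology below is a hypothetical example, not one of the thesis scenarios:

```python
from itertools import combinations

def system_unreliability(cut_sets, q):
    """Probability that at least one minimal cut set has all its devices
    failed (inclusion-exclusion over the cut sets; q[d] is the failure
    probability of device d, with independent failures assumed)."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            devices = set().union(*combo)   # union of the chosen cut sets
            prob = 1.0
            for d in devices:
                prob *= q[d]
            total += (-1) ** (k + 1) * prob
    return total

# Toy mesh: the sink is reachable via router r1 or r2; gateway g is a
# single point of failure, so the minimal cut sets are {g} and {r1, r2}.
q = {"g": 0.01, "r1": 0.05, "r2": 0.05}
u = system_unreliability([{"g"}, {"r1", "r2"}], q)
print(round(u, 6))  # 0.01 + 0.0025 - 0.01*0.0025 = 0.012475
```

The exponential number of terms is why the thesis needs a dedicated minimal-cut-set algorithm to stay tractable below roughly 100 devices.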

Relevance:

10.00%

Publisher:

Abstract:

The development of wireless sensor networks for control and monitoring functions has created a vibrant research scenario, covering topics from communication aspects to issues related to energy efficiency. When source sensors are endowed with cameras for visual monitoring, a new scope of challenges is raised, as transmission and monitoring requirements are considerably changed. In particular, visual sensors collect data following a directional sensing model, altering the meaning of concepts such as vicinity and redundancy but allowing the differentiation of source nodes by their sensing relevance to the application. In this context, we propose the combined use of two differentiation strategies as a novel QoS parameter, exploiting the sensing relevance of source nodes and DWT image coding. This innovative approach supports a new scope of optimizations to improve the performance of visual sensor networks at the cost of a small reduction in the overall monitoring quality of the application. Besides defining a new concept of relevance and proposing mechanisms to support its practical exploitation, we propose five different optimizations in the way images are transmitted in wireless visual sensor networks, aiming at energy saving, transmission with low delay, and error recovery. Putting all these together, the proposed differentiation strategies and the related optimizations open a relevant research trend, where the application monitoring requirements are used to guide a more efficient operation of sensor networks.
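A minimal sketch of relevance-driven DWT transmission, assuming a hypothetical policy where low-relevance sources send only the approximation subband of a one-level Haar transform (the five optimizations proposed in the work are more elaborate than this):

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar DWT: returns the (LL, LH, HL, HH) subbands."""
    a = np.asarray(img, dtype=float)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2      # row-wise average
    hi = (a[:, 0::2] - a[:, 1::2]) / 2      # row-wise detail
    ll = (lo[0::2] + lo[1::2]) / 2
    lh = (lo[0::2] - lo[1::2]) / 2
    hl = (hi[0::2] + hi[1::2]) / 2
    hh = (hi[0::2] - hi[1::2]) / 2
    return ll, lh, hl, hh

def subbands_to_send(img, relevance):
    """Hypothetical policy: low-relevance sources transmit only the LL
    approximation; high-relevance sources transmit all four subbands."""
    ll, lh, hl, hh = haar2d(img)
    return [ll] if relevance < 0.5 else [ll, lh, hl, hh]

img = np.arange(16).reshape(4, 4)
print(len(subbands_to_send(img, 0.2)), len(subbands_to_send(img, 0.9)))  # 1 4
```

Dropping the detail subbands quarters the payload of low-relevance nodes while the sink still reconstructs a coarse view, which is the energy/quality trade-off the abstract describes.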

Relevance:

10.00%

Publisher:

Abstract:

Industries are becoming more and more rigorous where safety is concerned, whether to avoid financial losses due to accidents and low productivity, or to protect the environment. It was in light of major accidents around the world involving aircraft and industrial processes (nuclear, petrochemical, and so on) that investment began in systems for fault detection and diagnosis (FDD). FDD systems can prevent eventual failures by assisting personnel in the maintenance and replacement of defective equipment. Nowadays, issues involving fault detection, isolation, diagnosis, and fault-tolerant control are gathering strength in academic and industrial environments. Based on this, in this work we discuss the importance of techniques that can assist in the development of FDD systems and propose a hybrid method for FDD in dynamic systems. We present a brief history to contextualize the techniques used in working environments. Fault detection in the proposed system is based on state observers in conjunction with other statistical techniques. The main idea is to use the observer itself, in addition to serving as an analytical redundancy, to allow the creation of a residual. This residual is used for FDD. A signature database assists in the identification of system faults: based on signatures derived from trend analysis of the residual signal and its difference, it classifies faults using a decision tree. This FDD system is tested and validated on two plants: a simulated plant with coupled tanks and a didactic plant with industrial instrumentation. All results collected from those tests are discussed.
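The observer-based residual generation can be sketched for a scalar discrete-time plant with assumed dynamics; a Luenberger observer tracks the measurement, and an additive sensor fault shows up as a jump in the residual:

```python
# Hypothetical scalar plant x[k+1] = a*x[k] + b*u[k], measured as y = x.
a, b = 0.9, 0.5          # assumed plant dynamics
L = 0.6                  # observer gain (error pole at a - L = 0.3)

def run(fault_at=None, steps=20):
    """Simulate the plant and a Luenberger observer; the residual y - x_hat
    stays near zero until an additive sensor fault of +2.0 is injected."""
    x, x_hat = 1.0, 0.0
    residuals = []
    for k in range(steps):
        u = 1.0
        y = x + (2.0 if fault_at is not None and k >= fault_at else 0.0)
        residuals.append(abs(y - x_hat))             # residual used for FDD
        x_hat = a * x_hat + b * u + L * (y - x_hat)  # observer update
        x = a * x + b * u                            # plant update
    return residuals

healthy = run()
faulty = run(fault_at=10)
print(healthy[-1] < 0.05, faulty[10] > 1.5)  # True True
```

In the full method described above, the residual trend (and its difference) would then be matched against a signature database and classified by a decision tree.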

Relevance:

10.00%

Publisher:

Abstract:

Image compression consists of representing an image with a small amount of data without losing visual quality. Data compression is important when large images are used, for example satellite images. Full-color digital images typically use 24 bits to specify the color of each pixel, with 8 bits for each of the primary components: red, green and blue (RGB). Compressing an image with three or more bands (multispectral) is fundamental to reduce transmission, processing and storage time. Many applications depend on compressed image data: medical imaging, satellite imaging, sensors, etc. In this work a new color image compression method is proposed. The method is based on a measure of the information in each band. The technique, called Self-Adaptive Compression (SAC), compresses each band of the image with a different threshold in order to better preserve information. SAC applies strong compression to highly redundant bands, i.e., those with less information, and soft compression to bands with a larger amount of information. Two image transforms are used in this technique: the Discrete Cosine Transform (DCT) and Principal Component Analysis (PCA). The first step is to convert the data into decorrelated bands with PCA; the DCT is then applied to each band. Loss occurs when a threshold discards coefficients. This threshold is calculated from two elements: the PCA result and a user parameter that defines the compression rate. The system produces three different thresholds, one for each image band, proportional to its amount of information. For image reconstruction, the inverse DCT and PCA are applied. SAC was compared with the JPEG (Joint Photographic Experts Group) standard and with YIQ compression, and better results were obtained in terms of MSE (mean square error). Tests showed that SAC yields better quality under strong compression, with two advantages: (a) being adaptive, it is sensitive to the image type, i.e., it presents good results for diverse kinds of images (synthetic, landscapes, people, etc.); and (b) it needs only one user parameter, i.e., little human intervention is required.
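A minimal sketch of the per-band thresholding idea, using an orthonormal 2D DCT and magnitude-based coefficient discarding (the PCA step and the actual SAC threshold rule are omitted; the keep ratios are hypothetical):

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis as an n x n matrix (rows = frequencies)."""
    k = np.arange(n)
    c = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0] /= np.sqrt(2)
    return c * np.sqrt(2 / n)

def compress_band(band, keep_ratio):
    """2D DCT, zero all but the largest-magnitude coefficients, invert."""
    D = dct_matrix(band.shape[0])
    coeffs = D @ band @ D.T
    cutoff = np.quantile(np.abs(coeffs), 1 - keep_ratio)
    coeffs[np.abs(coeffs) < cutoff] = 0.0
    return D.T @ coeffs @ D

rng = np.random.default_rng(1)
band = rng.normal(size=(8, 8))
# SAC idea: a band carrying more information keeps more coefficients
# (softer threshold); a redundant band is compressed harder.
soft = compress_band(band, keep_ratio=0.75)
hard = compress_band(band, keep_ratio=0.25)
print(np.linalg.norm(band - soft) < np.linalg.norm(band - hard))  # True
```

Because the DCT basis is orthonormal, the reconstruction error equals the energy of the discarded coefficients, so harder thresholds trade reconstruction quality for compression exactly as the abstract describes.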

Relevance:

10.00%

Publisher:

Abstract:

Potentially toxic cyanobacterial blooms in water bodies are spread across the globe, resulting in the loss of water quality and adverse effects on human health. In arid and semiarid regions, the hydrologic regime, characterized by an annual cycle of drought and rain, changes the volume and the retention time of a reservoir. Such changes affect the limnological characteristics and cause changes in the composition and biomass of the cyanobacterial community. The Cruzeta reservoir (Zmax = 8.7 m) is a eutrophic water supply source located in the tropical semiarid region (Northeast Brazil). We raised the hypothesis that the hydrological regime of the tropical semiarid region is a determining factor in the availability of resources in eutrophic water sources, which influences the composition of the dominant cyanobacterial species. The aim of this study was to analyze the changes in biomass and species composition of cyanobacteria over two annual hydrological cycles and to evaluate the driving factors. The study was divided into five distinct periods (dry 2010, rainy 2011, dry 2011, rainy 2012, dry 2012). The dominant group found in all periods was Cyanobacteria (99% of total biomass), which contributed to the low diversity. The filamentous species Cylindrospermopsis raciborskii was present at both sampling points throughout almost the entire study. The colonial species Microcystis panniformis and Sphaerocavum brasiliensis dominated only in periods with lower water volumes. Diatoms contributed more to the biomass during the period of severe drought. The point near the dam (P1) had larger phytoplankton biomass than the point near the tributary (P2). The dominance of the colonial cyanobacteria lasted until the overflow at P1, while at P2 this dominance lasted until the first rains. The redundancy analysis indicated that physical factors such as light availability and water level were the main factors driving the seasonal succession of phytoplankton. 
The phytoplankton composition in the water source alternated between filamentous cyanobacteria under conditions of low water-column stability, such as Cylindrospermopsis raciborskii, and colonial species under conditions of high water-column stability, such as Microcystis panniformis and Sphaerocavum brasiliensis. The extremes of torrential rains and severe droughts, governed by the hydrological regime of the semiarid region, controlled the availability of resources in the watershed, directing the spatial and temporal dynamics of phytoplankton in the Cruzeta reservoir.

Relevance:

10.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

10.00%

Publisher:

Abstract:

To contribute to our understanding of the genome complexity of sugarcane, we undertook a large-scale expressed sequence tag (EST) program. More than 260,000 cDNA clones were partially sequenced from 26 standard cDNA libraries generated from different sugarcane tissues. After processing of the sequences, 237,954 high-quality ESTs were identified. These ESTs were assembled into 43,141 putative transcripts. Of the assembled sequences, 35.6% presented no matches with existing sequences in public databases. A global analysis of the whole SUCEST data set indicated that 14,409 assembled sequences (33% of the total) contained at least one cDNA clone with a full-length insert. Annotation of the 43,141 assembled sequences associated almost 50% of the putatively identified sugarcane genes with protein metabolism, cellular communication/signal transduction, bioenergetics, and stress responses. Inspection of the translated assembled sequences for conserved protein domains revealed 40,821 amino acid sequences with 1415 Pfam domains. Reassembling the consensus sequences of the 43,141 transcripts revealed a 22% redundancy in the first assembly. This indicated that possibly 33,620 unique genes had been identified and that >90% of the sugarcane expressed genes were tagged.