32 results for Redes de Nova Geração (Next Generation Networks)
Abstract:
ART networks present several advantages: online learning, convergence in a few training epochs, incremental learning, among others. Nevertheless, some problems remain, such as category proliferation, sensitivity to the presentation order of training patterns, and the choice of a good vigilance parameter. Among these, category proliferation is probably the most critical: it makes the network create too many categories, consuming resources to store an unnecessarily large number of categories and degrading, or even making unfeasible, the processing time, without contributing to the quality of the representation. In many cases, the excessive number of categories generated by ART networks makes the quality of generalization inferior to what could otherwise be reached. Another factor that leads to category proliferation in ART networks is the difficulty of approximating regions with non-rectangular geometry, yielding generalization inferior to that obtained by other classification methods. From the observation of these problems, three methodologies were proposed: two of them use a more flexible geometry than that of traditional ART networks, minimizing the category proliferation problem, while the third minimizes the problem of the presentation order of training patterns. To validate these new approaches, many tests were performed, whose results demonstrate that the new methodologies can improve the quality of generalization of ART networks.
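As an illustration of the vigilance mechanism at the heart of the proliferation problem, the following minimal Fuzzy ART trainer (a sketch, not the thesis code; it omits the choice function, searches categories in order, and assumes complement-coded inputs in [0, 1]) shows how a low match value forces the creation of a new category:

```python
def fuzzy_and(a, b):
    """Component-wise fuzzy AND (minimum)."""
    return [min(x, y) for x, y in zip(a, b)]

def norm1(v):
    """L1 norm of a non-negative vector."""
    return sum(v)

def train_fuzzy_art(patterns, vigilance=0.75, beta=1.0):
    """Train a minimal Fuzzy ART and return the category prototypes.

    Hypothetical sketch: the first category passing the vigilance test
    resonates and is updated; a mismatch on all categories creates a new one,
    which is exactly how high vigilance drives category proliferation."""
    categories = []
    for pattern in patterns:
        placed = False
        for k, w in enumerate(categories):
            match = norm1(fuzzy_and(pattern, w)) / norm1(pattern)
            if match >= vigilance:  # resonance: move prototype toward input
                updated = fuzzy_and(pattern, w)
                categories[k] = [(1 - beta) * wj + beta * uj
                                 for wj, uj in zip(w, updated)]
                placed = True
                break
        if not placed:  # mismatch reset on every category: new category
            categories.append(list(pattern))
    return categories
```

With a repeated pattern one category suffices, while a high vigilance value splits dissimilar inputs into separate categories.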
Abstract:
Ensuring dependability requirements is essential for industrial applications, since faults may cause failures whose consequences result in economic losses, environmental damage or harm to people. Given the relevance of the topic, this thesis proposes a methodology for the dependability evaluation of industrial wireless networks (WirelessHART, ISA100.11a, WIA-PA) at the early design phase, although the proposal can easily be adapted to the maintenance and expansion stages of a network. The proposal uses graph theory and the fault tree formalism to automatically create, from a given industrial wireless network topology, an analytical model on which dependability can be evaluated. The supported evaluation metrics are reliability, availability, MTTF (mean time to failure), importance measures of devices, redundancy aspects, and common cause failures. It must be emphasized that the proposal is independent of any particular tool for quantitatively evaluating the target metrics; however, for validation purposes a tool widely accepted in academia (SHARPE) was used. In addition, an algorithm to generate minimal cut sets, originally applied in graph theory, was adapted to the fault tree formalism to guarantee the scalability of the methodology in industrial wireless network environments (< 100 devices). Finally, the proposed methodology was validated on typical scenarios found in industrial environments, such as star, line, cluster and mesh topologies. Scenarios with common cause failures were also evaluated, along with best practices to guide the design of an industrial wireless network. To guarantee scalability requirements, the performance of the methodology was analyzed in different scenarios, and the results show the applicability of the proposal to networks typically found in industrial environments.
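The role minimal cut sets play in such an analytical model can be illustrated with a small inclusion-exclusion computation (a hypothetical sketch assuming statistically independent device failures, not the thesis implementation):

```python
from itertools import combinations

def system_unreliability(cut_sets, q):
    """Exact top-event (system failure) probability from minimal cut sets.

    cut_sets: list of frozensets of device ids (a cut set fails when all of
    its devices fail); q: dict mapping device id -> failure probability.
    Uses inclusion-exclusion over the cut sets; assumes independent failures."""
    total = 0.0
    for r in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, r):
            union = frozenset().union(*combo)
            p = 1.0
            for dev in union:
                p *= q[dev]
            total += (-1) ** (r + 1) * p
    return total
```

For two devices with failure probability 0.1 each, a series arrangement (two single-device cut sets) yields 0.19, while a parallel arrangement (one two-device cut set) yields 0.01. Inclusion-exclusion is exponential in the number of cut sets, which is why scalable cut-set generation matters for networks approaching 100 devices.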
Abstract:
Microstrip antennas feature constantly in current research due to the several advantages they present. Fractal geometry, coupled with the good performance and convenience of planar structures, is an excellent combination for the design and analysis of ever smaller, multi-resonant and broadband structures. This geometry has been applied to microstrip patch antennas to reduce their size and highlight their multi-band behavior. Compared with conventional microstrip antennas, quasi-fractal patch antennas have lower resonance frequencies, enabling the manufacture of more compact antennas. The aim of this work is the design of quasi-fractal patch antennas through the use of Koch and Minkowski fractal curves applied to the radiating and non-radiating edges of a conventional rectangular patch antenna fed by an inset-fed microstrip line, initially designed for the frequency of 2.45 GHz. The inset-fed technique is investigated for the impedance matching of the fractal antennas, which are fed through microstrip lines. The efficiency of this technique is investigated experimentally and compared with simulations carried out in the commercial software Ansoft Designer, used for precise analysis of the electromagnetic behavior of the antennas by the method of moments, and with the proposed neural model. This dissertation also includes a literature review on microstrip antenna theory and on fractal geometry, with emphasis on its various forms, fractal generation techniques and applicability. The work further presents a study on artificial neural networks, showing the network types/architectures used and their characteristics, as well as the training algorithms employed in their implementation. The parameter-update equations for the networks used in this study were derived from the gradient method.
Research is also carried out with emphasis on the miniaturization of the proposed new structures, showing how an antenna designed with fractal contours can miniaturize a conventional rectangular patch antenna. The study also includes modeling, through artificial neural networks, of the various electromagnetic parameters of the quasi-fractal antennas. The presented results demonstrate the excellent capacity of neural modeling techniques for microstrip antennas; all algorithms used in this work to obtain the proposed models were implemented in the commercial simulation software Matlab 7. In order to validate the results, several antenna prototypes were built, measured on a vector network analyzer and simulated in software for comparison.
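The fractal edge contours can be illustrated by generating a Koch curve. The recursive sketch below (illustrative only; point orientation and function names are assumptions, not the dissertation's design code) shows how each refinement lengthens the contour by a factor of 4/3 while keeping its endpoints fixed, the geometric property exploited to lower the resonance frequency of a patch of given footprint:

```python
import math

def koch_points(p1, p2, depth):
    """Vertices of a Koch curve built recursively on the segment p1-p2.

    Each level replaces a segment with four segments of one third its
    length (the classic Koch generator)."""
    if depth == 0:
        return [p1, p2]
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = (x2 - x1) / 3, (y2 - y1) / 3
    a = (x1 + dx, y1 + dy)          # first third point
    b = (x1 + 2 * dx, y1 + 2 * dy)  # second third point
    seg = math.hypot(dx, dy)
    ang = math.atan2(dy, dx) - math.pi / 3
    c = (a[0] + seg * math.cos(ang), a[1] + seg * math.sin(ang))  # apex
    pts = []
    for q1, q2 in [(p1, a), (a, c), (c, b), (b, p2)]:
        part = koch_points(q1, q2, depth - 1)
        pts.extend(part[:-1])
    pts.append(p2)
    return pts

def curve_length(pts):
    """Total polyline length."""
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
```

On a unit segment, one iteration gives length 4/3 and two iterations 16/9, so the electrical length of the patch edge grows without enlarging the patch.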
Abstract:
The development of wireless sensor networks for control and monitoring functions has created a vibrant research scenario, covering everything from communication aspects to issues related to energy efficiency. When source sensors are endowed with cameras for visual monitoring, a new scope of challenges arises, as transmission and monitoring requirements change considerably. In particular, visual sensors collect data following a directional sensing model, altering the meaning of concepts such as vicinity and redundancy but allowing source nodes to be differentiated by their sensing relevance for the application. In this context, we propose the combined use of two differentiation strategies as a novel QoS parameter, exploiting the sensing relevance of source nodes and DWT image coding. This innovative approach supports a new scope of optimizations to improve the performance of visual sensor networks at the cost of a small reduction in the overall monitoring quality of the application. Besides the definition of a new concept of relevance and the proposition of mechanisms to support its practical exploitation, we propose five different optimizations in the way images are transmitted in wireless visual sensor networks, aiming at energy saving, low-delay transmission and error recovery. Put together, the proposed differentiation strategies and the related optimizations open a relevant research trend, where the application's monitoring requirements are used to guide a more efficient operation of sensor networks.
Abstract:
This work presents the performance analysis of traffic retransmission algorithms proposed for the HCCA medium access mechanism of the IEEE 802.11e standard applied to industrial environments. The nature of this kind of environment, with its electromagnetic interference, combined with the wireless medium of the IEEE 802.11 standard, which is susceptible to such interference, and the lack of retransmission mechanisms, makes it impracticable to ensure the quality of service for real-time traffic that the IEEE 802.11e standard proposes and this environment requires. To solve this problem, this work proposes a new approach involving the creation and evaluation of retransmission algorithms in order to ensure a level of robustness, reliability and quality of service for wireless communication in such environments. According to this approach, if a transmission error occurs, the traffic scheduler is able to manage retransmissions to recover the lost data. The proposed approach is evaluated through simulations, in which the retransmission algorithms are applied to different scenarios, each an abstraction of an industrial environment. The results are obtained using a network simulator developed in-house and compared with each other to assess which of the algorithms performs best in a pre-defined application.
Abstract:
This work develops a methodology for defining the maximum active power that can be injected into predefined nodes of the studied distribution networks, considering the possibility of multiple accesses by generating units. These maximum values are obtained from an optimization study in which the resulting losses may not exceed those of the base case, i.e., without the presence of distributed generation, while the loading constraints on the branches and the voltage limits of the system are respected. To address the problem, an algorithm is proposed based on the numerical method known as particle swarm optimization, applied to conventional AC load flow and to optimal load flow for maximizing the penetration of distributed generation. Alternatively, the Newton-Raphson method was incorporated for solving the load flow. The computer program is implemented in the SCILAB software. The proposed algorithm is tested with data from the IEEE 14-node network and from another network, this one in the Rio Grande do Norte State, at high voltage (69 kV), with 25 nodes. Starting from reference values, the algorithm defines the allowed nominal active power of distributed generation, in percentage terms relative to the demand of the network.
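The core loop of particle swarm optimization can be conveyed with a minimal 1-D sketch (illustrative assumptions: a toy convex objective, a simple box constraint, and standard inertia/cognitive/social weights; this is not the thesis's load-flow formulation):

```python
import random

def pso_minimize(f, lo, hi, n_particles=20, n_iter=100,
                 w=0.7, c1=1.5, c2=1.5, seed=42):
    """Minimal 1-D particle swarm optimizer.

    Each particle moves under inertia (w), attraction to its own best
    position (c1) and attraction to the swarm's best position (c2);
    positions are clipped to the box [lo, hi]."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]
    v = [0.0] * n_particles
    pbest = list(x)
    pbest_f = [f(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g], pbest_f[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            v[i] = (w * v[i]
                    + c1 * rng.random() * (pbest[i] - x[i])
                    + c2 * rng.random() * (gbest - x[i]))
            x[i] = min(hi, max(lo, x[i] + v[i]))  # enforce box constraint
            fi = f(x[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i], fi
                if fi < gbest_f:
                    gbest, gbest_f = x[i], fi
    return gbest, gbest_f
```

In the thesis's setting the objective would be the network losses evaluated through a load flow, with voltage and branch-loading limits as constraints; here a toy quadratic stands in for that evaluation.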
Abstract:
Bayesian networks are powerful tools, as they represent probability distributions as graphs and can handle the uncertainties of real systems. Since the last decade there has been special interest in learning network structures from data; however, learning the best network structure is an NP-hard problem, so many heuristic algorithms have been created to generate network structures from data, and many of them use score metrics to build the network model. This thesis compares three of the most used score metrics. The K2 algorithm and two standard benchmarks, ASIA and ALARM, were used to carry out the comparison. The results show that score metrics whose hyperparameters strengthen the tendency to select simpler network structures perform better than score metrics with a weaker such tendency, for both Heckerman-Geiger and modified MDL. The Heckerman-Geiger Bayesian score metric works better than MDL on large datasets, and MDL works better than Heckerman-Geiger on small datasets. The modified MDL gives results similar to Heckerman-Geiger on large datasets and close to MDL on small datasets, with a stronger tendency to select simpler network structures.
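The MDL-style trade-off between data fit and structural complexity can be sketched for a single node of the network (an illustrative decomposable score in the spirit of MDL, not the exact modified-MDL or Heckerman-Geiger metrics compared in the thesis):

```python
import math
from collections import Counter

def mdl_score(data, parents, var, arity):
    """MDL-style score of one node given a candidate parent set.

    data: list of dicts mapping variable -> discrete value;
    arity: dict mapping variable -> number of states.
    Score = maximized log-likelihood minus a (log n)/2 penalty per free
    parameter, so larger parent sets are penalized (simpler is favored)."""
    n = len(data)
    q = 1  # number of parent configurations
    for p in parents:
        q *= arity[p]
    counts = Counter((tuple(row[p] for p in parents), row[var]) for row in data)
    parent_counts = Counter(tuple(row[p] for p in parents) for row in data)
    loglik = 0.0
    for (pa, _), nij in counts.items():
        loglik += nij * math.log(nij / parent_counts[pa])
    penalty = 0.5 * math.log(n) * q * (arity[var] - 1)
    return loglik - penalty
```

When one variable deterministically follows another, adding that parent raises the log-likelihood by more than the extra penalty, so the dependent structure scores higher; with weak dependence the penalty dominates and the empty parent set wins, which is the "tendency to select simpler structures" the thesis measures.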
Abstract:
The use of non-human primates in scientific research has contributed significantly to the biomedical area and, in the case of Callithrix jacchus, has provided important evidence on physiological mechanisms that help explain its biology, making the species a valuable experimental model for different pathologies. However, raising non-human primates in captivity for long periods is accompanied by behavioral disorders and chronic diseases, as well as progressive weight loss in most of the animals. The Primatology Center of the Universidade Federal do Rio Grande do Norte (UFRN) has housed a colony of C. jacchus for nearly 30 years, and during this period the animals have been weighed systematically to detect possible alterations in their clinical condition. This procedure has generated a large volume of data on the weight of animals at different ages, data of great importance for studying this variable from different perspectives. Accordingly, this work presents three studies using weight data collected over 15 years (1985-2000) as a way of verifying the health status and development of the animals. The first study produced the first article, which describes the histopathological findings in animals with a probable diagnosis of wasting marmoset syndrome (WMS). All these animals carried trematode parasites (Platynosomum spp.) and had obstructions in the hepatobiliary system; it is suggested that this agent is one of the etiological factors of the syndrome. The second article compares the environmental profile and cortisol levels of animals with a normal weight curve and those with WMS. We observed a marked decrease in locomotion, increased use of the lower strata of the cage and hypocortisolemia, the latter likely associated with an adaptation of the mechanisms of the hypothalamus-pituitary-adrenal axis, as observed in other mammals under chronic malnutrition.
Finally, in the third study, the animals with weight alterations were excluded from the sample and, using computational tools (K-means and SOM) in an unsupervised way, we propose new ontogenetic development classes for C. jacchus. The classes were expanded from five to eight: infant I, infant II, infant III, juvenile I, juvenile II, sub-adult, young adult and elderly adult, providing a more suitable classification for detailed studies that require better control over animal development.
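The unsupervised grouping of animals by weight can be illustrated with a plain 1-D k-means (a deterministic sketch with spread-out initialization; the study itself also used SOM, and its real feature set is richer than a single weight value):

```python
def kmeans_1d(values, k, n_iter=50):
    """1-D k-means: returns the k cluster centroids, sorted ascending.

    Initial centroids are spread deterministically over the sorted values,
    so well-separated groups are each seeded with one centroid."""
    vals = sorted(values)
    centroids = [vals[i * (len(vals) - 1) // max(k - 1, 1)] for i in range(k)]
    for _ in range(n_iter):
        clusters = [[] for _ in range(k)]
        for v in vals:
            j = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[j].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)
```

Run on three well-separated groups of hypothetical weights, the three centroids land on the group means; extending the idea to the colony's weight-by-age data is what yields the redefined development classes.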
Abstract:
The next generation of computers is expected to be built on architectures with multiple processors and/or multicore processors. This raises challenges related to interconnection features, operating frequency, on-chip area, power dissipation, performance and programmability. The interconnection and communication mechanism considered ideal for this type of architecture is the network-on-chip, due to its scalability, reusability and intrinsic parallelism. Communication in networks-on-chip is accomplished by transmitting packets that carry data and instructions representing requests and responses between the processing elements interconnected by the network. Packets are transmitted in pipeline fashion between the routers of the network, from the source to the destination of the communication, even allowing simultaneous communications between different source-destination pairs. Based on this, we propose transforming the entire communication infrastructure of the network-on-chip, with its routing, arbitration and storage mechanisms, into a high-performance parallel processing system. In this proposal, packets are formed by the instructions and data that represent the applications, and these instructions are executed on the routers as the packets are transmitted, exploiting the pipeline and the parallelism of the communications. Traditional processors are not used; only simple cores that control access to memory. An implementation of this idea, called IPNoSys (Integrated Processing NoC System), has its own programming model and a routing algorithm that guarantees the execution of all instructions in the packets while preventing deadlock, livelock and starvation. The architecture provides mechanisms for input and output, interrupts and operating system support.
As a proof of concept, a programming environment and a SystemC simulator for this architecture were developed, allowing the configuration of various parameters and the collection of several results for its evaluation.
Abstract:
The aim of this work is to describe the methodological procedures required to perform 3D digital imaging of the external and internal geometry of reservoir analogue outcrops and to build a Virtual Outcrop Model (VOM). The external geometry was imaged using laser scanner, geodesic GPS and total station procedures, while the internal geometry was imaged by GPR (Ground Penetrating Radar). The resulting VOMs were then enriched with more detailed information through the addition of geological data and gamma ray and permeability profiles. Two outcrops located in the eastern part of the Parnaíba Basin were selected as models for applying these methodological procedures. The first exposes rocks from the aeolian deposits of the Piauí Formation (Neo-Carboniferous) and tidal flat deposits of the Pedra de Fogo Formation (Permian), which crop out extensively between Floriano and Teresina (Piauí). The second area, located in the Sete Cidades National Park, also in Piauí, exposes rocks of the Cabeças Formation, deposited in fluvial-deltaic systems during the Late Devonian. From the data of the enriched VOMs it was possible to identify lines, surfaces and 3D geometries and, therefore, to quantify the geometries of interest. Among the parameterization results, the most relevant is a table of the thicknesses and widths of channel and lobe deposits obtained at the Paredão and Biblioteca outcrops; this table can be used as input for the stochastic simulation of reservoirs. An example of the direct use of the radargrams was the identification of bounding surfaces in the aeolian sites of the Piauí Formation.
Although radargrams supply only two-dimensional data, lines acquired following a mesh profile were used to add a third dimension to the imaging of the internal geometry, an approach that proved valid for all studied outcrops. In conclusion, the tool presented here can become a new methodology that combines the advantages of digital imaging acquired with the laser scanner (precision, accuracy and speed of acquisition) and the total station (precision) with the classical digital photomosaic technique.
Abstract:
This research reflects on tourist activity in the socio-spatial organization of the coastline of the municipalities of Extremoz and Ceará-Mirim, in the Greater Natal region. Our main objective is to study the spatial transformations and their socio-environmental implications under way in the production of tourist space on the coast of these municipalities from 1997 to 2007, a period of important public-private investment which, starting with PRODETUR, laid the basis for developing their tourist potential. Several techniques were used to better understand the aspirations and perceptions of the actors involved in tourist activity (merchants, tourists, the population and local public authorities), in order to learn their views on the changes brought by the introduction of tourism: improvement in the quality of life, generation of employment and income, commerce, conservation and preservation of the environment, compliance with legislation, cultural affirmation, and the actions implemented in the municipalities. To this end, statistical data were analyzed using questionnaires with structured and semi-open questions as the data-collection instrument, correlated with the opinions of local actors so as to identify and understand the basic elements of the tourist spaces in focus. Aerial photographs of the municipalities of Extremoz and Ceará-Mirim, provided by IDEMA, were used to assess the spatial changes and socio-environmental implications in the study area.
Based on the results, we conclude that the tourism model conceived for Brazil, stimulated and financed by the Federal Government, is embedded in the context of the global economy; the State of Rio Grande do Norte, and specifically the coastal spaces of the municipalities of Extremoz and Ceará-Mirim, share the characteristics of this model, with their own particularities, which translate into social exclusion; private appropriation of public spaces and environmentally protected areas such as beaches, dunes and lagoons; disregard for or non-compliance with environmental legislation; increasing income inequality in a region with serious social problems and little investment; infrastructure deployment in the absence of local public policy; and the prioritization of economic interests over popular demands. We suggest rethinking the current development model, so that planning is based on the integrated participation of the various agents involved in tourist activity, including, as far as possible, the aspirations of the local population as expressions of their real needs. Such interactive action would certainly contribute significantly to building a new paradigm, a model of sustainable development, making it possible to gradually overcome growing poverty, exclusion and environmental impacts, with quality of life as a fundamental factor.
Abstract:
This work analyzes the socio-spatial dynamics of associative supermarket chains and their importance in redefining the urban roles of small cities in Rio Grande do Norte. The theoretical approach prioritizes commerce as a constituent of the city, whose understanding allows us to grasp the new socio-spatial dynamics of small towns in the face of globalization and the changes it has caused in their commercial forms. In this sense, we understand that trade, as an essentially urban activity, has a very specific characteristic: its ability to transform the content and meaning of places. Another important element in the construction of this work was the context of changes in the capitalist production system, with the advent of flexible production and the determinations of the economic globalization process, which brought new ways of organizing trade. The empirical analysis covers two associative supermarket chains, "Rede 10" and "Rede Seridó", bringing together the basic elements for understanding the genesis and evolution of this new organizational model of trade in the small towns of the state, and allowing us to understand the main changes in this segment of commercial activity.
The methodology combined a literature review in books and periodicals, secondary data collection mainly from SEBRAE and ABRAS, and field research in which interviews were conducted with the managers of the associative supermarket chains, the owners of associated stores and consumers of the surveyed chains. Finally, we conclude that the formation and expansion of associative supermarket chains in the context of the small cities of Rio Grande do Norte is essentially a survival alternative for traditional small traders who, by sharing associative principles, albeit somewhat rigidly, through the formation of cooperation networks, manage not only to stay in the market but also to assert themselves as new agents in the process of capital reproduction. Thus, in their search for new spaces, particularly within small towns, the associative supermarket chains end up giving these cities new momentum, creating different flows and interconnections with different places and conferring on them new content and urban roles. By assuming not only the condition of places of living but also of places for the reproduction of capital, small towns offer their population better purchasing conditions, thus avoiding forced population displacement to other urban centers to meet consumption needs.
Abstract:
A typical electrical power system is characterized by the centralization of power generation. However, with the restructuring of the electric system, this topology is changing through the insertion of generators in parallel with the distribution system (distributed generation), which provides several benefits by being located near energy consumers. The integration of distributed generators, especially from renewable sources, has therefore become more common in the Brazilian system every year. This new system topology, however, may pose new challenges for power system control, operation and protection. One of the main problems related to distributed generation is islanding, which can pose safety risks to people and to the power grid. Among the several islanding protection techniques, passive techniques stand out for their low implementation cost and simplicity, requiring only voltage and current measurements to detect system problems. This work proposes a protection system based on the wavelet transform, with overcurrent and under/overvoltage functions as well as information from fault-induced transients, in order to provide fast detection and identification of faults in the system. The proposed protection scheme was evaluated through simulation and experimental studies, with performance similar to the conventional overcurrent and under/overvoltage methods, but with the additional detection of the exact moment of the fault.
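The way wavelet detail coefficients pinpoint the instant of a fault-induced transient can be sketched with a one-level Haar DWT (an illustrative toy, not the thesis's protection scheme, which combines this information with overcurrent and under/overvoltage functions):

```python
import math

def haar_dwt(signal):
    """One-level Haar DWT of an even-length signal.

    Returns (approximation, detail): each pair of samples yields one
    smoothed coefficient and one difference coefficient; abrupt changes
    in the signal show up as large-magnitude detail coefficients."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        approx.append((signal[i] + signal[i + 1]) / math.sqrt(2))
        detail.append((signal[i] - signal[i + 1]) / math.sqrt(2))
    return approx, detail

def transient_index(signal):
    """Sample index (start of the pair) with the largest detail magnitude,
    i.e., the estimated moment of the transient."""
    _, detail = haar_dwt(signal)
    j = max(range(len(detail)), key=lambda i: abs(detail[i]))
    return 2 * j
```

On a steady waveform that steps abruptly mid-record, the detail coefficients are zero everywhere except at the step, which is exactly why the wavelet-based scheme can flag the moment of the fault rather than just its presence.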
Abstract:
Wireless communication is currently a trend in industrial environments, and within this trend the WirelessHART technology stands out. In this situation, the search for improvements to the technology is natural, and such improvements can relate directly to the routing and scheduling algorithms. This thesis presents a literature review of the main routing and scheduling solutions specific to WirelessHART and proposes a new scheduling algorithm, called Flow Scheduling, intended to improve superframe utilization and flexibility. For validation purposes, we developed a simulation module for Network Simulator 3 (NS-3) that models aspects such as positioning, signal attenuation and energy consumption, and provides per-link error configuration. The module also allows the scheduling superframe to be built using the Flow and Han algorithms. To validate the new algorithm, we executed a series of comparative tests and evaluated the algorithms' performance in terms of link allocation, delay and superframe occupation. To validate the physical layer of the simulation module, we statically configured the routing and scheduling aspects and performed reliability and energy consumption tests using various topologies and error probabilities from the literature.
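A toy model of superframe construction clarifies the constraint scheduling must respect: two links sharing a node cannot occupy the same time slot, since a node cannot transmit and receive simultaneously. The greedy sketch below is a hypothetical illustration, not the Flow or Han algorithm:

```python
def schedule_links(links, n_slots):
    """Greedy TDMA slot assignment for a superframe of n_slots slots.

    links: list of (src, dst) pairs; returns dict link -> slot index.
    Each link takes the first slot in which neither of its endpoints is
    already busy; raises ValueError if the superframe is too short."""
    slot_nodes = [set() for _ in range(n_slots)]
    assignment = {}
    for link in links:
        src, dst = link
        for s in range(n_slots):
            if src not in slot_nodes[s] and dst not in slot_nodes[s]:
                slot_nodes[s].update((src, dst))
                assignment[link] = s
                break
        else:
            raise ValueError("superframe too short for the requested links")
    return assignment
```

Note that links with disjoint endpoints (e.g. A-B and C-D) share a slot, which is the spatial reuse that better superframe-utilization algorithms such as Flow Scheduling aim to maximize.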