957 results for Wide Area Monitoring
Resumo:
Drawing upon Brazilian experience, this research explores some of the key issues to be addressed in using e-government technical cooperation designed to enhance the service provision of Patent Offices in developing countries. While the development of software applications is often seen merely as a technical engineering exercise, localization and adaptation are context-bounded matters characterized by many entanglements of humans and non-humans. This work also discusses the technical, legal and policy implications of technical cooperation in a complex and dynamic implementation environment, one characterized by the influence of powerful hidden agendas in the arena of intellectual property (IP), which are shaped by recent technological, economic and social developments in our current knowledge-based economy. The research employs two different theoretical lenses to examine the same case, which consists of the transfer of a Patent Management System (PMS) from the European Patent Office (EPO) to the Brazilian Patent Office, locally named ‘Instituto Nacional da Propriedade Industrial’ (INPI). Fundamentally, we have opted for a multi-paper thesis comprising an introduction, three scientific articles and a concluding chapter that discusses and compares the insights obtained from each article. The first article presents an extensive literature review on e-government and technology transfer. This review allowed the proposition of an integrative meta-model of e-government technology transfer, named the E-government Transfer Model (ETM). Subsequently, in the second article, we present Actor-Network Theory (ANT) as a framework for understanding the processes of transferring e-government technologies from Patent Offices in developed countries to Patent Offices in developing countries. Overall, ANT is seen as having a potentially wide area of application and as a promising theoretical vehicle in IS research for carrying out a social analysis of the messy and heterogeneous processes that drive technical change. Drawing particularly on the works of Bruno Latour, Michel Callon and John Law, this work applies the theory to a longitudinal study of the management information systems supporting the Brazilian Patent Office restructuring plan, which involved the implementation of a European Patent Management System in Brazil. Based upon the ANT elements, we follow the actors to identify and understand patterns of group formation associated with the technical cooperation between the Brazilian Patent Office (INPI) and the European Patent Office (EPO). The research thus explores the intricate relationships and interactions between human and non-human actors in their attempts to construct various network alliances, thereby demonstrating that technologies embody compromise. Finally, the third article applies the ETM model as a heuristic frame to examine the same case previously studied from an ANT perspective. We have found evidence that ETM has strong heuristic qualities that can guide practitioners engaged in the transfer of e-government systems from developed to developing countries. The successful implementation of e-government projects in developing countries is important for stimulating economic growth and, as a result, we need to understand the processes through which such projects are implemented and succeed.
Here, we attempt to improve understanding of the development and stabilization of a complex socio-technical system in the arena of intellectual property. Our preliminary findings suggest that e-government technology transfer is an inherently political process and that successful outcomes require continuous incremental actions and improvisations to address ongoing issues as they emerge.
Resumo:
Tangara da Serra is located in southwestern Mato Grosso, on the dispersion route of pollutants originating in the deforestation area of the Legal Amazon. The region also has a wide area of sugarcane cultivation, which leaves the site heavily exposed to atmospheric pollutants. The objective of this work was to evaluate the genotoxicity of three different concentrations of organic particulate matter collected from August through December 2008 in Tangara da Serra, using the micronucleus test in Tradescantia pallida (Trad-MCN). The levels of particulate matter smaller than 10 μm (PM10) and black carbon (BC) collected on the Teflon and polycarbonate filters were also determined. In addition, the alkanes and polycyclic aromatic hydrocarbons (PAHs) in the samples from the burning period were identified and quantified by gas chromatography with flame ionization detection (GC-FID). The results of the alkane analysis indicate an anthropic influence. Among the PAHs, retene, an indicator of biomass burning, was found in the highest quantity. The compounds indeno(1,2,3-cd)pyrene and benzo(k)fluoranthene, considered potentially mutagenic and carcinogenic, were also identified in the samples. Using Trad-MCN, a significant increase in micronucleus frequency was observed during the burning period, which can be related to the mutagenic PAHs found in the extracts. When the period with fewer burnings was compared with the negative control group, no significant difference in the micronucleus rate was noted; in contrast, the period with more intense burning showed statistically significant differences. This study showed that the Trad-MCN assay was sensitive and efficient in evaluating the genotoxic potential of organic matter from biomass burning, and it emphasizes the importance of performing a chemical composition analysis in order to achieve a complete diagnosis for environmental risk control.
Resumo:
This work presents a theoretical and numerical analysis of structures using frequency selective surfaces applied to patch antennas. The FDTD method is used to determine the reflected fields in the time domain. Applications of frequency selective surfaces and patch antennas cover a wide area of telecommunications, especially mobile communications, filters and wideband antennas. Scattering parameters are obtained from the Fourier transform of the transmitted and reflected fields in the time domain. PML absorbing boundary conditions are used, allowing the fields to be determined with little interference from reflections at the limits of the discretized space. Rectangular patches on a dielectric layer, fed by a microstrip line, are considered. Frequency selective surfaces with periodic and quasi-periodic structures are analyzed on both sides of the antenna. A literature review of the use of frequency selective surfaces in patch antennas is also performed. Numerical results are compared with measured results for the return loss of the analyzed structures. Suggestions for the continuation of this work are also presented.
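As a hedged illustration of the post-processing step described in this abstract, the following Python sketch computes return loss from incident and reflected time-domain fields recorded at the feed by taking their Fourier transforms. The field arrays, probe placement and time step are assumptions made for the example, not data from the work.

import numpy as np

def return_loss_db(incident_t, reflected_t, dt):
    """Estimate return loss versus frequency from FDTD time-domain fields.

    incident_t, reflected_t : field samples at a hypothetical feed probe.
    dt : FDTD time step in seconds.
    """
    n = len(incident_t)
    freqs = np.fft.rfftfreq(n, d=dt)          # frequency axis in Hz
    inc_f = np.fft.rfft(incident_t)           # spectrum of the incident field
    ref_f = np.fft.rfft(reflected_t)          # spectrum of the reflected field
    # Only evaluate where the excitation carries meaningful energy, to avoid
    # dividing by a nearly zero incident spectrum.
    valid = np.abs(inc_f) > 1e-6 * np.abs(inc_f).max()
    s11 = np.abs(ref_f[valid]) / np.abs(inc_f[valid])   # reflection coefficient
    return freqs[valid], -20.0 * np.log10(s11 + 1e-30)  # return loss in dB

# Synthetic Gaussian pulses as placeholders for the recorded fields:
t = np.arange(0, 2e-9, 1e-12)
incident = np.exp(-((t - 0.5e-9) / 0.1e-9) ** 2)
reflected = 0.2 * np.exp(-((t - 0.9e-9) / 0.1e-9) ** 2)
freqs, rl = return_loss_db(incident, reflected, dt=1e-12)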
Resumo:
Objective: Control of microleakage represents a challenge for posterior composite restorations. The technique for composite placement may reduce microleakage. The null hypothesis of this in vitro study was that centripetal incremental insertion of composite resin would result in less microleakage than that obtained with the oblique incremental technique or bulk technique. Method and Materials: Standardized Class 2 preparations were made in 60 caries-free extracted third molars and randomly assigned to 3 groups (n = 20): (1) oblique incremental insertion technique (control), (2) centripetal incremental insertion technique, and (3) bulk insertion. The teeth were restored with a total-etch adhesive and micro-hybrid composite resin. The specimens were isolated with nail varnish except for a 2-mm-wide area around the restoration and then thermocycled (1,000 thermal cycles, 5°C/55°C; 30-second dwell time). The specimens were immersed in an aqueous solution of 50% silver nitrate for 24 hours, followed by 8 hours of immersion in a photo-developing solution, and subsequently evaluated for leakage. The microleakage scores (0 to 4) obtained from the occlusal and cervical walls were analyzed with median nonparametric tests (P < .05). Results: The null hypothesis was rejected. All techniques attained statistically similar dentin microleakage scores (P = .15). The centripetal insertion technique displayed significantly less microleakage than the oblique technique at the enamel margins (P = .04). Conclusion: None of the techniques eliminated marginal microleakage in Class 2 preparations. However, in occlusal areas, the centripetal technique performed significantly better than the other techniques.
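As an illustration of the "median nonparametric tests" named in the methods, the short Python sketch below applies Mood's median test to three groups of ordinal microleakage scores; the scores shown are placeholders, not the study's data.

from scipy.stats import median_test

# Hypothetical microleakage scores (0-4) for the three insertion techniques;
# placeholders only, not the values reported in the study.
oblique = [1, 2, 2, 3, 1, 2, 4, 2, 3, 1]
centripetal = [0, 1, 1, 2, 0, 1, 2, 1, 1, 0]
bulk = [2, 3, 2, 4, 3, 2, 3, 4, 2, 3]

# Mood's median test asks whether the groups share a common median.
stat, p_value, grand_median, table = median_test(oblique, centripetal, bulk)
print(f"chi-square = {stat:.2f}, p = {p_value:.4f}, grand median = {grand_median}")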
Resumo:
Satellite remote sensing of ocean colour is the only method currently available for synoptically measuring wide-area properties of ocean ecosystems, such as phytoplankton chlorophyll biomass. Recently, a variety of bio-optical and ecological methods have been established that use satellite data to identify and differentiate between either phytoplankton functional types (PFTs) or phytoplankton size classes (PSCs). In this study, several of these techniques were evaluated against in situ observations to determine their ability to detect dominant phytoplankton size classes (micro-, nano- and picoplankton). The techniques are applied to a 10-year ocean-colour data series from the SeaWiFS satellite sensor and compared with in situ data (6504 samples) from a variety of locations in the global ocean. Results show that spectral-response, ecological and abundance-based approaches can all perform with similar accuracy. Detection of microplankton and picoplankton was generally better than detection of nanoplankton. Abundance-based approaches were shown to provide better spatial retrieval of PSCs. Individual model performance varied according to PSC, input satellite data sources and in situ validation data types. Uncertainty in the comparison procedure and data sources was considered. Improved availability of in situ observations would aid ongoing research in this field. (C) 2010 Elsevier B.V. All rights reserved.
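For orientation, abundance-based approaches of the kind compared above typically model the chlorophyll contribution of each size class as a saturating function of total chlorophyll-a. The Python sketch below illustrates that general form with deliberately round, hypothetical parameters; it is not the calibrated model evaluated in the study.

import numpy as np

def size_class_fractions(chl, c_pn_max=0.8, s_pn=1.0, c_p_max=0.15, s_p=6.0):
    """Split total chlorophyll-a into micro-, nano- and picoplankton fractions.

    Uses the generic exponential-saturation form of abundance-based models;
    all parameter values here are illustrative placeholders.
    chl : total chlorophyll-a concentration in mg m^-3 (scalar or array).
    """
    chl = np.asarray(chl, dtype=float)
    c_pico_nano = c_pn_max * (1.0 - np.exp(-s_pn * chl))  # combined pico + nano
    c_pico = c_p_max * (1.0 - np.exp(-s_p * chl))         # picoplankton alone
    c_nano = c_pico_nano - c_pico
    c_micro = chl - c_pico_nano                           # remainder goes to micro
    return c_micro / chl, c_nano / chl, c_pico / chl

# Oligotrophic, mesotrophic and bloom-like conditions (example inputs only):
f_micro, f_nano, f_pico = size_class_fractions([0.05, 0.3, 3.0])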
Resumo:
A study of the characteristics and distribution of the soil humus fractions in representative ecosystems of central Brazil was carried out, with special emphasis on the comparison between soils under virgin vegetation (Cerrado) and those subjected to cultivation. In spite of the contrasting vegetation and cultural practices at the sites studied, the soil humus showed analogous characteristics: there was a negligible amount of plant residues, the humic and fulvic acids amounted to approximately 70% of the total organic carbon, and about 40% of these humic substances were in extremely stable association with the soil mineral fraction, the HCl-HF treatment being required for their extraction. The stability of such organo-mineral complexes increased slightly at the cultivated sites. The study of the humic acid fraction showed increased oxidation and aromaticity at most of the cultivated sites: the lowest values for the IR alkyl vibrations and H/C atomic ratios and the highest values for the optical density at 465 nm were observed at sites transformed into orchards, whereas these changes were small at sites used as pasture. The 13C NMR spectra confirmed that the proportion of polyalkyl structures decreased in the humic acids of soils subjected to cultivation, as opposed to that of carboxyl groups. In spite of the high stability inferred for the organic matter throughout the wide area examined, the samples from the original Cerrado, as well as those transformed into pastures, showed higher mineralization rates under laboratory conditions than those from the sites subjected to cultivation. This is partly attributed to the decreased proportions of extractable humic substances in the latter. © 1992.
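Because this abstract uses H/C atomic ratios as an index of aromaticity, a brief worked example may help: the atomic ratio is obtained from elemental mass percentages by dividing each by the element's atomic mass. The Python sketch below uses hypothetical elemental compositions, not the paper's measurements.

# H/C atomic ratio from elemental analysis mass percentages.
# Lower H/C generally indicates a more aromatic (less alkyl) humic acid.
ATOMIC_MASS_H = 1.008
ATOMIC_MASS_C = 12.011

def h_to_c_atomic(percent_h, percent_c):
    """Return the H/C atomic ratio given mass percentages of H and C."""
    return (percent_h / ATOMIC_MASS_H) / (percent_c / ATOMIC_MASS_C)

# Hypothetical humic acids (illustrative values only):
print(h_to_c_atomic(4.8, 52.0))   # about 1.10, more aliphatic character
print(h_to_c_atomic(3.6, 54.0))   # about 0.79, more aromatic character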
Resumo:
Graduate Program in Biological Sciences (Cellular and Molecular Biology) - IBRC
Resumo:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Resumo:
A transparent (wide-area) wavelength-routed optical network may be constructed by using wavelength cross-connect switches connected together by fiber to form an arbitrary mesh structure. The network is accessed through electronic stations that are attached to some of these cross-connects. These wavelength cross-connect switches have the property that they may configure themselves into unspecified states. Each input port of a switch is always connected to some output port of the switch, whether or not such a connection is required for the purpose of information transfer. Due to the presence of these unspecified states, there exists the possibility of setting up unintended all-optical cycles in the network (viz., a loop with no terminating electronics in it). If such a cycle contains amplifiers [e.g., Erbium-Doped Fiber Amplifiers (EDFAs)], there exists the possibility that the net loop gain is greater than the net loop loss. The amplified spontaneous emission (ASE) noise from amplifiers can build up in such a feedback loop to saturate the amplifiers and result in oscillations of the ASE noise in the loop. Such all-optical cycles as defined above (and hereafter referred to as "white" cycles) must be eliminated from an optical network in order for the network to perform any useful operation. Furthermore, for the realistic case in which the wavelength cross-connects result in signal crosstalk, there is a possibility of having closed cycles with oscillating crosstalk signals. We examine algorithms that set up new transparent optical connections upon request while avoiding the creation of such cycles in the network. These algorithms attempt to find a route for a connection and then (in a post-processing fashion) configure switches such that white cycles that might get created would automatically get eliminated. In addition, our call-set-up algorithms can avoid the possibility of crosstalk cycles.
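To make the cycle-detection idea concrete, here is a hedged Python sketch, not the paper's algorithm, that models each cross-connect's input-to-output mapping on a single wavelength as a chain of ports and reports closed loops that never pass through a terminating electronic station (i.e., "white" cycles). All identifiers and the toy topology are assumptions for illustration.

def find_white_cycles(switch_config, fiber_links, station_ports):
    """Return all-optical cycles that contain no terminating station.

    switch_config : {switch: {in_port: out_port}}   current cross-connect states
    fiber_links   : {(switch, out_port): (next_switch, next_in_port)}
    station_ports : set of (switch, port) pairs attached to electronic stations
    """
    cycles, seen = [], set()
    for switch, mapping in switch_config.items():
        for in_port in mapping:
            start = (switch, in_port)
            if start in seen:
                continue
            path, current = [], start
            while current is not None and current not in path:
                path.append(current)
                sw, p_in = current
                p_out = switch_config.get(sw, {}).get(p_in)
                if p_out is None or (sw, p_out) in station_ports:
                    current = None          # light dropped or terminated electronically
                    break
                current = fiber_links.get((sw, p_out))
                if current in station_ports:
                    current = None          # terminated at the far-end station
            if current is not None and current in path:
                cycles.append(path[path.index(current):])   # closed transparent loop
            seen.update(path)
    return cycles

# Toy example: two cross-connects patched back-to-back with no station in between.
config = {"A": {1: 2}, "B": {1: 2}}
links = {("A", 2): ("B", 1), ("B", 2): ("A", 1)}
print(find_white_cycles(config, links, station_ports=set()))   # -> [[('A', 1), ('B', 1)]]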
Resumo:
Recently, there has been growing interest in developing optical fiber networks to support the increasing bandwidth demands of multimedia applications, such as video conferencing and World Wide Web browsing. One technique for accessing the huge bandwidth available in an optical fiber is wavelength-division multiplexing (WDM). Under WDM, the optical fiber bandwidth is divided into a number of nonoverlapping wavelength bands, each of which may be accessed at peak electronic rates by an end user. By utilizing WDM in optical networks, we can build links that exploit a usable fiber bandwidth on the order of 50 THz. The success of WDM networks depends heavily on the available optical device technology. This paper is intended as a tutorial on some of the optical device issues in WDM networks. It discusses the basic principles of optical transmission in fiber and reviews the current state of the art in optical device technology. It introduces some of the basic components in WDM networks, discusses various implementations of these components, and provides insights into their capabilities and limitations. Then, this paper demonstrates how various optical components can be incorporated into WDM optical networks for both local and wide-area applications. Lastly, the paper provides a brief review of experimental WDM networks that have been implemented.
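As a back-of-the-envelope companion to the bandwidth figures above, the short Python sketch below estimates how many wavelength channels fit into a given optical band and the resulting aggregate link capacity. The band width, channel spacing and per-channel rate are example values chosen for illustration, not figures from the paper.

def wdm_aggregate_capacity(band_hz, channel_spacing_hz, per_channel_bps):
    """Rough aggregate capacity of a WDM link: channel count times per-channel rate."""
    channels = int(band_hz // channel_spacing_hz)
    return channels, channels * per_channel_bps

# Example: a 4.4 THz band (roughly the width of the C-band), 100 GHz channel spacing,
# and 10 Gb/s per channel.
channels, capacity_bps = wdm_aggregate_capacity(4.4e12, 100e9, 10e9)
print(channels, capacity_bps / 1e12, "Tb/s")   # 44 channels, 0.44 Tb/s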
Resumo:
Computer and telecommunication networks are changing the world dramatically and will continue to do so in the foreseeable future. The Internet, primarily based on packet switches, provides very flexible data services such as e-mail and access to the World Wide Web. The Internet is a variable-delay, variable-bandwidth network that, in its initial phase, provides no guarantee on quality of service (QoS). New services are being added to yesterday's pure data-delivery framework, and their high demands on capacity could lead to a "bandwidth crunch" at the core wide-area network, resulting in degradation of service quality. Fortunately, technological innovations have emerged which can provide relief to the end user and overcome the Internet's well-known delay and bandwidth limitations. At the physical layer, a major overhaul of existing networks has been envisaged, from electronic media (e.g., twisted pair and cable) to optical fibers, in wide-area, metropolitan-area, and even local-area settings. In order to exploit the immense bandwidth potential of optical fiber, interesting multiplexing techniques have been developed over the years.
Resumo:
The bandwidth requirements of the Internet are increasing every day, and newer and more bandwidth-hungry applications are emerging on the horizon. Wavelength division multiplexing (WDM) is the next step towards leveraging the capabilities of the optical fiber, especially for wide-area backbone networks. The ability to switch a signal at intermediate nodes in a WDM network based on its wavelength is known as wavelength routing. One of the greatest advantages of wavelength-routed WDM is the ability to create a virtual topology different from the physical topology of the underlying network. This virtual topology can be reconfigured when necessary to improve performance. We review previous work on virtual topology design and discuss and propose reconfiguration algorithms applicable under different scenarios.
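To make the distinction between physical and virtual topology concrete, the following Python sketch, which is an illustration rather than any of the algorithms proposed in the paper, routes a few lightpaths over a small physical topology using shortest-path routing and first-fit wavelength assignment; the set of accepted lightpaths is the virtual topology seen by the higher layer. The topology, demand set and wavelength count are assumed example values.

from collections import deque

# Physical topology: adjacency list of fiber links (illustrative 4-node ring).
physical = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C", "A"]}
NUM_WAVELENGTHS = 2
# Wavelengths already in use on each directed fiber link.
used = {(u, v): set() for u in physical for v in physical[u]}

def shortest_path(src, dst):
    """BFS shortest path over the physical topology (assumes dst is reachable)."""
    queue, prev = deque([src]), {src: None}
    while queue:
        node = queue.popleft()
        if node == dst:
            break
        for nxt in physical[node]:
            if nxt not in prev:
                prev[nxt] = node
                queue.append(nxt)
    path, node = [], dst
    while node is not None:
        path.append(node)
        node = prev[node]
    return path[::-1]

def set_up_lightpath(src, dst):
    """Route a lightpath and assign the first wavelength free on every hop."""
    path = shortest_path(src, dst)
    hops = list(zip(path, path[1:]))
    for w in range(NUM_WAVELENGTHS):           # first-fit wavelength assignment
        if all(w not in used[h] for h in hops):
            for h in hops:
                used[h].add(w)
            return path, w
    return None                                # blocked: no common free wavelength

# The set of accepted lightpaths forms the virtual topology.
virtual_topology = {}
for s, d in [("A", "C"), ("B", "D"), ("A", "B")]:
    result = set_up_lightpath(s, d)
    if result:
        virtual_topology[(s, d)] = result
print(virtual_topology)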
Resumo:
To open this Third Vertebrate Pest Conference is a real privilege. It is a pleasure to welcome all of you in attendance, and I know there are others who would like to be meeting with us but, for one reason or another, cannot be. However, we can serve them by taking back the results of discussion and by making available the printed transactions of what is said here. It has been the interest and demand for the proceedings of the two previous conferences which, along with personal contacts many of you have with the sponsoring committee, have gauged the need for continuing these meetings. The National Pest Control Association officers who printed the 1962 proceedings are still supplying copies of that conference. Two reprintings of the 1964 conference have been necessary, and repeat orders from several universities indicate that those proceedings have become textbooks for special classes. When Dr. Howard mentioned in opening the first Conference in 1962 that publication of those papers would make a valuable handbook of animal control, he was prophetic, indeed. We are pleased that this has happened, but not surprised, since to many of us in this specialized field the conferences have provided a unique opportunity to meet colleagues with similar interests, to exchange information on control techniques and to be informed by research workers of problem-solving investigations as well as to hear of promising basic research. The development of research is a two-way street, and we think these conferences also identify areas of inadequate knowledge, thereby stimulating needed research.
We have represented here a number of types of specialists: animal ecologists, public health and transmissible disease experts, control methods specialists, public agency administration and enforcement staffs, agricultural extension people, manufacturing and sales industry representatives, commercial pest control operators, and others. In addition to improving communications among these professional groups, an equally important purpose of these conferences is to improve understanding between them and the general public. Within the term general public are many individuals and also organizations dedicated to appreciation and protection of certain animal forms or animal life in general. Proper concepts of vertebrate pest control do not conflict with such views. It is worth repeating for the record the definition of "vertebrate pest" which has been stated at our previous conferences: "A vertebrate pest is any native or introduced, wild or feral, non-human species of vertebrate animal that is currently troublesome locally or over a wide area to one or more persons, either by being a general nuisance, a health hazard or by destroying food or natural resources. In other words, vertebrate pest status is not an inherent quality or fixed classification but is a circumstantial relationship to man's interests."
I believe progress has been made in reducing the misunderstanding and emotion with which vertebrate pest control was formerly treated whenever a necessity for control was stated. If this is true, I likewise believe it is deserved, because control methods and programs have progressed. Control no longer refers only to population reductions by lethal means. We have learned something of alternate control approaches and the necessity for studying the total environment; where reduction of pest animal numbers is the required solution to a problem situation, we have a wider choice of more selective, safe and efficient materials.
Although increased attention has been given to control methods research, when we take a close look at the severity of animal damage to so many facets of our economy, particularly to agricultural production and public health, we realize that the effort is still pitifully small and slow. The tremendous acceleration of the world's food and health requirements seems to demand expediting vertebrate pest control to effectively neutralize the enormous impact of animal damage on vital resources. The efforts we are making here at problem delineation, idea communication and exchange of methodology could well serve as both nucleus and rough model for a broader application elsewhere. I know we all hope this Third Conference will advance these general objectives, and I think there is no doubt of its value in increasing our own scope of information.
Resumo:
The work was divided into three macro-areas. The first concerns a theoretical analysis of how intrusions work, of which software is used to carry them out, and of how to protect against them (using the devices generically known as firewalls). The second macro-area analyzes an intrusion coming from the outside towards sensitive servers of a LAN. This analysis is conducted on the files captured by the two network interfaces configured in promiscuous mode on a probe located in the LAN. Two interfaces are used in order to attach to two LAN segments with different subnet masks. The attack is analyzed with various software tools. A third part of the work can in fact be identified: the part in which the files captured by the two interfaces are analyzed, first with software that handles full-content data, such as Wireshark, then with software that handles session data, which were processed with Argus, and finally with the statistical data, which were processed with Ntop. The penultimate chapter, the one before the conclusions, covers the installation of Nagios and its configuration for monitoring, through plugins, the remaining disk space on a remote agent machine and the MySql and DNS services. Naturally, Nagios can be configured to monitor any type of service offered on the network.
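As an illustration of the plugin-based disk check described above, here is a minimal Nagios-style plugin written in Python. It follows the standard plugin convention (exit code 0 = OK, 1 = WARNING, 2 = CRITICAL, plus a one-line status message) but is a generic sketch with assumed thresholds, not the configuration used in this work.

#!/usr/bin/env python3
"""Minimal Nagios-style plugin: warn/critical on remaining disk space."""
import shutil
import sys

# Thresholds are illustrative; Nagios reacts to the exit code, not these numbers.
WARN_FREE_PERCENT = 20.0
CRIT_FREE_PERCENT = 10.0

def main(path="/"):
    usage = shutil.disk_usage(path)
    free_percent = 100.0 * usage.free / usage.total
    message = f"DISK {path} - {free_percent:.1f}% free"
    if free_percent < CRIT_FREE_PERCENT:
        print(f"CRITICAL: {message}")
        return 2
    if free_percent < WARN_FREE_PERCENT:
        print(f"WARNING: {message}")
        return 1
    print(f"OK: {message}")
    return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv[1] if len(sys.argv) > 1 else "/"))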
Resumo:
OBJECTIVE: A previous study of radiofrequency neurotomy of the articular branches of the obturator nerve for hip joint pain produced modest results. Based on an anatomical and radiological study, we sought to define a potentially more effective radiofrequency method. DESIGN: Ten cadavers were studied, four of them bilaterally. The obturator nerve and its articular branches were marked by wires. Their radiological relationship to the bone structures on fluoroscopy was imaged and analyzed. A magnetic resonance imaging (MRI) study was undertaken on 20 patients to determine the structures that would be encountered by the radiofrequency electrode during different possible percutaneous approaches. RESULTS: The articular branches of the obturator nerve vary in location over a wide area. The previously described method of denervating the hip joint did not take this variation into account. Moreover, it approached the nerves perpendicularly. Because optimal coagulation requires electrodes to lie parallel to the nerves, a perpendicular approach probably produced only a minimal lesion. In addition, MRI demonstrated that a perpendicular approach is likely to puncture femoral vessels. Vessel puncture can be avoided if an oblique pass is used. Such an approach minimizes the angle between the target nerves and the electrode, and increases the likelihood of the nerve being captured by the lesion made. Multiple lesions need to be made in order to accommodate the variability in location of the articular nerves. CONCLUSIONS: The method that we described has the potential to produce complete and reliable nerve coagulation. Moreover, it minimizes the risk of penetrating the great vessels. The efficacy of this approach should be tested in clinical trials.