52 results for Metadata. Monitoring. Measurement. QoS. QoC. Synchronous requests. Asynchronous requests
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
Context-aware applications are typically dynamic and use services provided by several sources, with different quality levels. Context information quality is expressed in terms of Quality of Context (QoC) metadata, such as precision, correctness, refreshment, and resolution; service quality, in turn, is expressed via Quality of Service (QoS) metadata such as response time, availability, and error rate. In order to ensure that an application is using services and context information that meet its requirements, it is essential to continuously monitor this metadata. For this purpose, a QoS and QoC monitoring mechanism is needed that meets the following requirements: (i) support for the measurement and monitoring of QoS and QoC metadata; (ii) support for synchronous and asynchronous operation, enabling the application both to periodically gather the monitored metadata and to be asynchronously notified whenever a given metadata value becomes available; (iii) use of ontologies to represent information, in order to avoid ambiguous interpretation. This work presents QoMonitor, a module for QoS and QoC metadata monitoring that meets the abovementioned requirements. The architecture and implementation of QoMonitor are discussed. To support asynchronous communication, QoMonitor uses two protocols: JMS and Light-PubSubHubbub. To illustrate the use of QoMonitor in the development of ubiquitous applications, it was integrated with OpenCOPI (Open COntext Platform Integration), a middleware platform that integrates several context provision middleware systems. To validate QoMonitor, two applications were used as proof of concept: an oil and gas monitoring application and a healthcare application. This work also presents a validation of QoMonitor in terms of performance, for both synchronous and asynchronous requests.
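A minimal sketch of the synchronous/asynchronous monitoring pattern this abstract describes; the class and method names are illustrative, not QoMonitor's actual API, and the in-process callback wiring merely stands in for JMS/Light-PubSubHubbub:

```python
# Illustrative sketch of synchronous polling plus asynchronous notification
# of QoS/QoC metadata; all names are hypothetical, not QoMonitor's API.
import threading
import time
import random

class MetadataMonitor:
    def __init__(self):
        self._values = {}        # latest metadata value by name
        self._subscribers = {}   # metadata name -> list of callbacks
        self._lock = threading.Lock()

    def measure(self, name, value):
        """Record a new metadata sample and notify async subscribers."""
        with self._lock:
            self._values[name] = value
            callbacks = list(self._subscribers.get(name, []))
        for cb in callbacks:     # asynchronous push (publish/subscribe)
            cb(name, value)

    def get(self, name):
        """Synchronous request: the application polls the latest value."""
        with self._lock:
            return self._values.get(name)

    def subscribe(self, name, callback):
        """Asynchronous request: notify whenever the metadata is updated."""
        with self._lock:
            self._subscribers.setdefault(name, []).append(callback)

monitor = MetadataMonitor()
monitor.subscribe("response_time", lambda n, v: print(f"notified: {n} = {v} ms"))

def probe():                     # simulated QoS measurement loop
    for _ in range(3):
        monitor.measure("response_time", round(random.uniform(10, 50), 1))
        time.sleep(0.1)

t = threading.Thread(target=probe)
t.start()
t.join()
print("polled:", monitor.get("response_time"), "ms")
```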
Abstract:
The development of wireless sensor networks for control and monitoring functions has created a vibrant research scenario, covering aspects ranging from communication to energy efficiency. When source sensors are endowed with cameras for visual monitoring, a new scope of challenges arises, since transmission and monitoring requirements change considerably. In particular, visual sensors collect data following a directional sensing model, which alters the meaning of concepts such as vicinity and redundancy but allows source nodes to be differentiated by their sensing relevance to the application. In this context, we propose the combined use of two differentiation strategies as a novel QoS parameter, exploiting the sensing relevancies of source nodes and DWT image coding. This innovative approach supports a new scope of optimizations to improve the performance of visual sensor networks, at the cost of a small reduction in the overall monitoring quality of the application. Besides defining a new concept of relevance and proposing mechanisms to support its practical exploitation, we propose five different optimizations in the way images are transmitted in wireless visual sensor networks, aiming at energy saving, low-delay transmission, and error recovery. Put together, the proposed differentiation strategies and the related optimizations open a relevant research trend, in which the application's monitoring requirements are used to guide a more efficient operation of sensor networks.
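A sketch of the kind of relevance-driven DWT transmission the abstract combines: nodes with lower sensing relevance drop high-frequency subbands to save energy. The one-level Haar transform and the relevance thresholds are illustrative; the thesis's five specific optimizations are not reproduced here.

```python
# Relevance-based subband selection over a one-level 2D Haar DWT.
import numpy as np

def haar_dwt2(img):
    """One-level 2D Haar DWT; returns (LL, LH, HL, HH) subbands."""
    a = img[0::2, 0::2].astype(float)
    b = img[0::2, 1::2].astype(float)
    c = img[1::2, 0::2].astype(float)
    d = img[1::2, 1::2].astype(float)
    ll = (a + b + c + d) / 4
    lh = (a + b - c - d) / 4
    hl = (a - b + c - d) / 4
    hh = (a - b - c + d) / 4
    return ll, lh, hl, hh

def subbands_to_send(img, relevance):
    """Select subbands according to the node's sensing relevance in [0, 1]."""
    ll, lh, hl, hh = haar_dwt2(img)
    if relevance > 0.8:
        return [ll, lh, hl, hh]   # fully relevant: send everything
    if relevance > 0.4:
        return [ll, lh, hl]       # medium relevance: drop finest details
    return [ll]                   # low relevance: approximation only

img = np.random.randint(0, 256, (64, 64))
print([s.shape for s in subbands_to_send(img, relevance=0.5)])
```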
Abstract:
This work presents a proposal for a building automation security system whose main objective is to supervise, in a simple way, the installations of a building, in order to preserve personal and patrimonial security, aiming at portability, low cost, and ease of use. An alarm central and access controller was designed, with digital and analog inputs for sensors and outputs for a buzzer, telephone dialing, and an electronic lock. The system is supervised by software that requests information from the alarm central through the computer's serial port (RS-232). The supervisory software was developed on the LabVIEW platform and displays the received data on a graphical interface, showing the current states of the sensors distributed throughout the building and system events such as alarm occurrences. The system can also be viewed through the Internet by people registered by the building security system's administrator.
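A small sketch of how a supervisory program might poll such an alarm central over RS-232; the request/response framing is hypothetical (the actual protocol is not given in the abstract), and it requires the pyserial package and a device on the given port:

```python
# Poll an alarm central over a serial port and decode a hypothetical
# "S1=0;S2=1;..." status reply into sensor states.
import serial

def poll_sensors(port="/dev/ttyUSB0"):
    with serial.Serial(port, baudrate=9600, timeout=1) as ser:
        ser.write(b"STATUS?\r\n")            # hypothetical status request
        reply = ser.readline().decode().strip()
        states = {}
        for field in reply.split(";"):
            if "=" in field:
                name, value = field.split("=")
                states[name] = value == "1"  # "1" means sensor triggered
        return states

if __name__ == "__main__":
    for sensor, triggered in poll_sensors().items():
        print(sensor, "ALARM" if triggered else "ok")
```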
Abstract:
The use of middleware technology in various types of systems, in order to abstract low-level details related to the distribution of application logic, is increasingly common. Among the many systems that can benefit from these components, we highlight distributed systems, where communication between software components located on different physical machines must be supported. An important issue related to the communication between distributed components is the provision of mechanisms for managing quality of service. This work presents a metamodel for modeling component-based middleware, in order to provide an application with the abstraction of communication between the components involved in a data stream, regardless of their location. Another feature of the metamodel is the possibility of self-adaptation of the communication mechanism, either by updating the values of its configuration parameters or by replacing it with another mechanism when the specified quality-of-service restrictions are not being guaranteed. To this end, the communication state is monitored (applying techniques such as a feedback control loop) and the related performance metrics are analyzed. The Model-Driven Development (MDD) paradigm was used to generate the implementation of a middleware that serves as a proof of concept of the metamodel, together with the configuration and reconfiguration policies related to the dynamic adaptation processes. In this sense, the metamodel associated with the communication configuration process was defined. The MDD application also comprises the definition of the following transformations: from the architectural model of the middleware to Java code, and from the configuration model to XML.
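A minimal sketch of the feedback control loop the abstract mentions: monitor a performance metric and either retune or replace the communication mechanism when a QoS restriction is violated. All names, thresholds, and the adaptation policy are illustrative, not the metamodel's actual rules.

```python
# Feedback control loop: probe latency, tune a parameter, then swap the
# communication mechanism if tuning is exhausted.
import random
import statistics

QOS_MAX_LATENCY_MS = 40.0

class CommMechanism:
    def __init__(self, name, buffer_size):
        self.name = name
        self.buffer_size = buffer_size   # example configuration parameter

    def measure_latency(self):
        # stand-in for a real probe of the data stream
        return random.uniform(10, 60) - self.buffer_size * 0.1

def control_loop(mechanism, fallback, rounds=5):
    for _ in range(rounds):
        avg = statistics.mean(mechanism.measure_latency() for _ in range(10))
        if avg <= QOS_MAX_LATENCY_MS:
            continue                             # QoS satisfied
        if mechanism.buffer_size < 100:
            mechanism.buffer_size += 20          # adapt: tune a parameter
            print(f"tuned {mechanism.name}: buffer={mechanism.buffer_size}")
        else:
            print(f"replacing {mechanism.name} with {fallback.name}")
            mechanism = fallback                 # adapt: swap the mechanism
    return mechanism

final = control_loop(CommMechanism("rmi-stream", 20),
                     CommMechanism("message-queue", 50))
print("active mechanism:", final.name)
```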
Abstract:
Cloud Computing is a paradigm that enables access, in a simple and pervasive way, through the network, to shared and configurable computing resources. Such resources can be offered on demand to users in a pay-per-use model. As this paradigm advances, a single service offered by a cloud platform might not be enough to meet all of a client's requirements, so it becomes necessary to compose services provided by different cloud platforms. However, current cloud platforms are not implemented using common standards; each one has its own APIs and development tools, which is a barrier to composing different services. In this context, Cloud Integrator, a service-oriented middleware platform, provides an environment that facilitates the development and execution of multi-cloud applications, which are compositions of services from different cloud platforms represented by abstract workflows. However, Cloud Integrator has some limitations: (i) applications are executed locally; (ii) users cannot specify an application in terms of its inputs and outputs; and (iii) experienced users cannot directly determine the concrete Web services that will perform the workflow. In order to address these limitations, this work proposes Cloud Stratus, a middleware platform that extends Cloud Integrator and offers different ways to specify an application: as an abstract workflow or as a complete/partial execution flow. The platform enables application deployment on cloud virtual machines, so that several users can access it through the Internet. It also supports the access and management of virtual machines on different cloud platforms and provides service monitoring mechanisms and the assessment of QoS parameters. Cloud Stratus was validated through a case study consisting of an application that uses services provided by different cloud platforms, and it was also evaluated through computational experiments that analyze the performance of its processes.
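A sketch of one step such a platform must perform: resolving the abstract activities of a workflow into concrete services chosen by their QoS metadata. The catalog, scoring rule, and names below are illustrative, not Cloud Stratus's actual selection algorithm.

```python
# Resolve an abstract workflow into concrete services by a QoS utility score.
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    activity: str          # abstract activity it implements
    response_time: float   # ms, lower is better
    availability: float    # fraction, higher is better

CATALOG = [
    Service("storage-a", "store", 120.0, 0.999),
    Service("storage-b", "store", 80.0, 0.990),
    Service("mail-a", "notify", 200.0, 0.995),
]

def score(s: Service) -> float:
    # simple weighted utility: favor availability, penalize latency
    return s.availability * 100 - s.response_time * 0.1

def resolve(workflow):
    """Map each abstract activity to the best-scoring concrete service."""
    plan = []
    for activity in workflow:
        candidates = [s for s in CATALOG if s.activity == activity]
        if not candidates:
            raise LookupError(f"no service implements '{activity}'")
        plan.append(max(candidates, key=score))
    return plan

for step in resolve(["store", "notify"]):
    print(step.activity, "->", step.name)
```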
Abstract:
This study consists in the design of the structure and communication protocol of the ePoste project, a wireless monitoring system whose aim is to sense and act on one or many public lighting points, and to process the collected data in order to detect activity in the sensors located on the lamp posts. Previously, communication with the sensors was done in locus: whenever data collection or actuation on the devices was necessary, an operator had to travel to the network site. Accordingly, the proposal aims to make the system more dynamic, enabling integration with the lighting management systems already available. Communication between the sensors uses the ZigBee protocol running on a mesh network; communication with this network is done over the Internet through a GPRS gateway, a device with two basic functions: data forwarding and firmware updates on the sensors. The data management functionality is currently being tested in an integrated way, with the local sensor network and the Internet data server in operation. Besides incorporating ZigBee, the protocol developed comprises a lower-level framing layer, which defines the start, size, and error checking of messages, as well as the communication between sensor and concentrator, encapsulated in ZigBee; at the upper level, a service platform was developed to handle GET and POST requests over the HTTP protocol. This service is implemented in the data server, which makes communication available to client systems, in this case, lighting management systems.
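A sketch of an HTTP service handling GET and POST requests for sensor data, in the spirit of the upper-level protocol the abstract describes; the paths, payload format, and field names are hypothetical. Standard library only.

```python
# Minimal HTTP data server: GET returns last readings, POST stores one.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

READINGS = {}   # in-memory store: sensor id -> last reading

class SensorHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /sensors returns all last readings as JSON
        body = json.dumps(READINGS).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # e.g. POST with body {"id": "post-17", "lamp_on": true}
        length = int(self.headers.get("Content-Length", 0))
        data = json.loads(self.rfile.read(length))
        READINGS[data["id"]] = data
        self.send_response(204)   # stored, no content to return
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), SensorHandler).serve_forever()
```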
Abstract:
The technique of surface coating using magnetron sputtering is one of the most widely used in surface engineering, for its versatility in obtaining different films as well as in controlling thickness at the micro/nanometric scale. Among the various process parameters, those related to the active species of the plasma are of fundamental importance to the mechanism and kinetics of deposition. In order to identify the active species of the plasma, parameters such as gas flow, pressure, and electric power density were varied during titanium coating on a glass substrate. Argon gas flows of 10, 20, 30, 40, and 50 sccm (standard cubic centimeters per minute) were used and, for each gas flow, a sequential scan of the electric current at 0.10, 0.20, 0.30, 0.40, and 0.50 A was performed. The maximum value of 0.50 A was chosen based both on literature data and on limitations of the equipment. The plasma species present during deposition were monitored in situ by optical emission spectroscopy (OES), using an Ocean Optics USB2000 Series spectrometer. For this purpose, an apparatus was developed to adapt the OES probe inside the plasma reactor, positioned as close as possible to the target. The radiation emitted by the species was detected by an optical fiber placed behind the glass substrate, and the intensities as a function of wavelength were displayed on a monitor screen. The acquisition time for each set of plasma parameters was related to the minima of the spectral line intensities caused by the film formed on the substrate. The intensities of different emission lines of argon and titanium were then analyzed as a function of time, in order to determine the active species and to estimate the thickness of the deposited films. After deposition, the coated glass thin films were characterized by optical transmittance using an infrared laser. It was found that the thickness and deposition rate determined by the in situ analysis were consistent with the results obtained by laser transmittance.
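One simple way to turn line-intensity decay into a thickness estimate is a Beer-Lambert attenuation model, I = I0·exp(-α·d), for light crossing the growing film; the sketch below works that arithmetic through. The attenuation coefficient and the sampled intensities are assumed values for illustration, not measurements or the model actually used in the thesis.

```python
# Estimate film thickness and deposition rate from the attenuation of an
# emission line measured through the growing film (Beer-Lambert model).
import math

ALPHA = 0.05  # assumed attenuation coefficient of the Ti film, 1/nm

def thickness_nm(i0, i):
    """Thickness that attenuates intensity from i0 to i."""
    return math.log(i0 / i) / ALPHA

# one argon line sampled over deposition time: (time s, intensity a.u.)
samples = [(0, 1000.0), (60, 860.0), (120, 740.0), (180, 640.0)]

i0 = samples[0][1]
for t, i in samples[1:]:
    d = thickness_nm(i0, i)
    print(f"t={t:4d} s  thickness≈{d:5.1f} nm  rate≈{d / t:.3f} nm/s")
```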
Abstract:
The general objective of this thesis was the seasonal monitoring (on a quarterly time scale) of coastal and estuarine areas of a section of the northern coast of Rio Grande do Norte, Brazil, an environmentally sensitive area with intense sediment erosion affected by oil industry activities, in order to underpin the implementation of projects for erosion containment and for mitigating the impacts of coastal dynamics. To achieve the general objective, the work was carried out systematically in three stages, which constituted the specific objectives. The first stage was the implementation of the geodetic reference infrastructure for carrying out the geodetic survey of the study area. This process included the implementation of the RGLS (Northern Coast of RN GPS Network), consisting of stations with precise geodetic coordinates and orthometric heights; the positioning of benchmarks and the evaluation of the available gravimetric geoid, for use in precise GPS altimetry; and the development of software for precise GPS altimetry. The second stage was the development and improvement of methodologies for the collection, processing, representation, integration, and analysis of CoastLines (CL) and Digital Elevation Models (DEM) obtained by geodetic positioning techniques. This stage included the choice of the equipment and positioning methods to be used, depending on the required precision and the implemented infrastructure, and the definition of the CL indicator and of the geodetic references best suited to precise coastal monitoring. The third stage was the seasonal geodetic monitoring of the study area. It comprised the definition of the execution times of the geodetic surveys, by analyzing the sediment dynamics pattern of the study area; the execution of surveys in order to calculate and locate areas and volumes of erosion and accretion (areal and volumetric sedimentary balance) occurring on the CL and on the beach and island surfaces throughout the year; and the study of correlations between the variations measured (in area and volume) between each survey and the action of the coastal dynamic agents. The results allowed an integrated study of the spatial and temporal interrelationships of the causes and consequences of the intense coastal processes operating in the area, especially the measurement of the variability of erosion, transport, sedimentary balance, and supply over the annual cycle of construction and destruction of beaches. In the analysis of the results, it was possible to identify the causes and consequences of the severe coastal erosion occurring on exposed beaches, and to analyze the recovery of beaches and the accretion occurring in tidal inlets and estuaries. From the perspective of the seasonal variations in the CL, human interventions for erosion containment were proposed, with the aim of restoring the previous situation of the beaches undergoing erosion.
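A minimal sketch of the volumetric sedimentary balance described above, computed by differencing two DEMs of the same beach surface; the grids and cell size are invented for illustration, not survey data from the thesis.

```python
# Volumetric sedimentary balance from two co-registered DEMs.
import numpy as np

CELL_AREA_M2 = 1.0 * 1.0          # assumed 1 m x 1 m grid cells

# elevations (m) from two surveys on the same grid
dem_t1 = np.array([[2.0, 2.1, 1.9],
                   [1.8, 2.0, 2.2],
                   [1.5, 1.7, 1.9]])
dem_t2 = np.array([[1.8, 2.0, 2.0],
                   [1.9, 2.1, 2.1],
                   [1.4, 1.8, 2.0]])

dz = dem_t2 - dem_t1                         # elevation change per cell
accretion = dz[dz > 0].sum() * CELL_AREA_M2  # m^3 gained
erosion = -dz[dz < 0].sum() * CELL_AREA_M2   # m^3 lost
print(f"accretion: {accretion:.2f} m3, erosion: {erosion:.2f} m3,"
      f" net balance: {accretion - erosion:+.2f} m3")
```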
Abstract:
The objective of this doctoral thesis was the quarterly monitoring of the coastal morphology of sections of the northeastern coast of the state of Rio Grande do Norte, Brazil, an area of the Potiguar Basin influenced by oil industry activities. The studied sections comprise coastal areas with intense sedimentary erosion and high environmental sensitivity to oil spills. In order to achieve the general objective of this study, the work was systematized in four steps. The first refers to the evaluation of the geomorphological data acquisition methodologies used in Digital Elevation Models (DEM) of sandy beaches; the data were obtained at Soledade beach, located on the northeastern coast of Rio Grande do Norte. The second step centered on expanding the geodetic reference infrastructure for the geodetic survey of the studied area, by installing a station on the Corta Cachorro Barrier Island and by conducting monitoring geodetic surveys to understand the beach system based on multitemporal analysis of the CoastLine (CL) and of the DEMs. The third step applied the methodology developed by Santos and Amaro (2011) and Santos et al. (2012) for the surveying, processing, representation, integration, and analysis of coastlines of sandy coasts obtained through geodetic positioning techniques, the analysis of morphological changes, and sediment transport. The fourth step represents an innovation in coastal surveying: the use of Terrestrial Laser Scanning (TLS), based on Light Detection and Ranging (LiDAR), to evaluate a highly eroded section of Soledade beach where oil industry structures are located, through high-precision, high-accuracy DEMs for modeling the changes in coastal morphology. The analysis of the results of the integrated study of the spatial and temporal interrelations of the intense coastal processes, over the cycles of construction and destruction of beaches, allowed identifying the causes and consequences of the intense coastal erosion in exposed beach sections and on barrier islands.
Abstract:
The human activities responsible for environmental degradation in the modern world are diverse, and industrial activities are preponderant among the impacts on Brazilian ecosystems. Among these activities, the petroleum industry operating in the Potiguar Petroliferous Basin (PPB) poses a constant risk of environmental impacts in the surrounding municipalities, affecting not only human populations and the environment but also the native microorganisms of the Caatinga soil and of the mangrove sediment. Nevertheless, the design of bioremediation strategies for impacted areas requires knowledge of the microbiota and its relations with the environment. Moreover, among the groups of microorganisms associated with oil, the sulfate-reducing prokaryotes (SRP) stand out: in their anaerobic metabolism, these organisms participate in sulfate reduction, releasing H2S, which poses environmental risks and causes corrosion of surfaces such as pipelines and tanks, resulting in losses for the industry. Some lineages of SRP belong to the Archaea domain, a group of microorganisms whose sequenced genomes show a predominance of extremophilic adaptations, including to environments where oil is present. This work has two related objectives: i) the detection and monitoring of the dsrB gene, present in sulfate-reducing prokaryotes, through DGGE analysis of mDNA samples from mangrove sediment and semiarid soil, both in the PPB; and ii) relating genomic characteristics to the ecological aspects of Archaea through in silico studies, highlighting their importance to the oil and gas industry. The results of the first study suggest that the oil-degrading SRP communities persist after contamination with oil in both the mangrove sediment and the semiarid soil; comparison of the populations of the two sites reveals variations in size and composition over one year of experiments. In the second study, functional and structural factors are the probable cause of the selective pressure maintaining the conservation of the sequences in the multiple copies of the 16S rDNA gene. A discrepancy between the total GC content and the GC content of this gene was also verified. Such results relating ribosomal genes and environmental factors are important for metagenomic evaluations using PCR-DGGE. Knowledge of the oil-associated microbiota can contribute to a better allocation of resources by the petroleum industry and to the development of bioremediation strategies. Likewise, it seeks to lead to a better understanding of the role of the native microbiota in the biogeochemical cycles of the Potiguar Petroliferous Basin ecosystem.
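A minimal sketch of the GC-content comparison the abstract mentions: total GC content of a genome versus the GC content of a single gene such as a 16S rDNA copy. The sequences below are invented stand-ins, not data from the study.

```python
# Compare the GC content of a whole sequence with that of one gene.
def gc_content(seq):
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

genome = "ATGCGCGTATAGCGCGATATATGCGCATATGCGCGCTA" * 100  # stand-in genome
gene_16s = "GCGCGGCGCATGCGGCCGCGGAT"                     # stand-in gene

print(f"genome GC: {gc_content(genome):.2%}")
print(f"16S gene GC: {gc_content(gene_16s):.2%}")
```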
Abstract:
Coral bleaching has increasingly been the focus of research around the world since the early 1980s, when it was found to be increasing in frequency, intensity, and number of areas affected. The phenomenon has been recorded since 1993 in association with elevations of the sea surface temperature due to El Niño events and thermal anomalies of the water, according to most reports around the world. On the coast of Rio Grande do Norte, Brazil, a mass coral bleaching event was recorded in the Environmental Protection Area of Coral Reefs (APARC) during March and April 2010, when the water temperature reached 34 °C for several days; about 80% of the corals in the Maracajaú reef complex exhibited partial or total bleaching. The aims of this study were to estimate the coral coverage and to verify how the bleaching dynamics developed among different species. Coral coverage was estimated according to the Reef Check Brazil protocol combined with the quadrant method, and bleaching was evaluated through biweekly visual surveys of 80 colonies of Favia gravida, Porites astreoides, Siderastrea stellata and Millepora alcicornis. At the same time, temperature, pH, salinity, and horizontal transparency, as well as mortality and disease occurrence, were monitored. Analysis of variance and multiple regression under the time-lag concept were used to evaluate, respectively, the bleaching dynamics among species and the relationship between the variation of the bleaching means and the variations of the abiotic parameters. The species showed significant differences among themselves as to the variation of the bleaching means over time, but the dynamics of this variation exhibited similar patterns.
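A sketch of a lagged regression between mean bleaching and temperature, in the spirit of the time-lag analysis the abstract mentions; both series and the one-survey (two-week) lag are invented for illustration, not the study's data or results.

```python
# Regress mean bleaching on temperature from the previous survey.
import numpy as np

temp = np.array([29.5, 30.2, 31.0, 32.5, 33.8, 34.0, 33.0, 31.5])  # deg C
bleach = np.array([5, 6, 9, 14, 25, 38, 41, 33])  # mean % bleached cover

LAG = 1  # surveys: bleaching responds to the previous survey's temperature
x = temp[:-LAG]
y = bleach[LAG:]

slope, intercept = np.polyfit(x, y, 1)
r = np.corrcoef(x, y)[0, 1]
print(f"bleach ≈ {slope:.1f} * lagged temp + {intercept:.1f},  r = {r:.2f}")
```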
Abstract:
The present work is based on the use of the basic tools of Statistical Process Control (SPC), with the intent of detecting non-conformities in a given production process. It consists of a case study carried out at a blood center in Natal, Rio Grande do Norte. The study shows that the SPC technique, used as a tool, is useful for identifying non-conformities in the volume of blood components. The data used were gathered by means of document analysis, direct observation, and database queries. The results achieved show that the analyzed products, even though they presented out-of-control points in some cases, satisfied the ANVISA standards. Finally, suggestions for further improvement of the final product and guidance for the future use of SPC, extended to other production lines as well, are presented.
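A sketch of one of the basic SPC tools referred to above: an X-bar control chart with the classical 3-sigma limits estimated from subgroup ranges. The subgroup volumes are invented, not the blood center's data.

```python
# X-bar control chart: center line and control limits from subgroup ranges.
import statistics

# volume (mL) of blood-component units, in subgroups of size 4
subgroups = [
    [272, 268, 275, 270],
    [269, 271, 274, 266],
    [281, 265, 270, 273],
    [268, 272, 269, 271],
]

means = [statistics.mean(g) for g in subgroups]
grand_mean = statistics.mean(means)
mean_range = statistics.mean(max(g) - min(g) for g in subgroups)
A2 = 0.729  # Shewhart constant for subgroup size n = 4

ucl = grand_mean + A2 * mean_range
lcl = grand_mean - A2 * mean_range
print(f"center={grand_mean:.1f}  UCL={ucl:.1f}  LCL={lcl:.1f}")
for i, m in enumerate(means, 1):
    flag = "" if lcl <= m <= ucl else "  <-- out of control"
    print(f"subgroup {i}: mean={m:.1f}{flag}")
```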
Abstract:
This Master's thesis presents a case study on the use of Statistical Process Control (SPC) at the Núcleo de Pesquisas em Alimentos e Medicamentos (NUPLAM). The basic SPC tools were applied to the encapsulation process of tuberculostatic drugs, with the initial objective of choosing which of two machine speeds is the better one for performing the encapsulation. Later, with the company effectively operating, SPC was applied with the intent of characterizing the variability of the process and, through the tracking of the process itself, arriving at estimated limits for the control of future lots of tuberculostatics of equal dosage. As special causes were detected acting on the process, a cause-and-effect diagram was built in order to try to discover, in each factor composing the production process, the possible causes of variation of the average capsule weight. The hypotheses raised can serve as a basis for a deeper study to eliminate or reduce these interferences in the process. A study on the capability of the process to meet the specifications was also carried out, and it showed that the process is not capable of meeting them. Nevertheless, NUPLAM has a genuine desire to implement SPC and consequently to improve the quality already present in its medicines.
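A sketch of the kind of process capability study mentioned above, using the standard Cp/Cpk indices; the capsule weights and specification limits are assumed values (the actual specifications are not given in the abstract).

```python
# Process capability: Cp (spread) and Cpk (spread plus centering).
import statistics

weights = [498, 503, 495, 507, 501, 499, 510, 492, 500, 505]  # capsule mg
LSL, USL = 485.0, 515.0   # assumed specification limits, mg

mean = statistics.mean(weights)
sigma = statistics.stdev(weights)

cp = (USL - LSL) / (6 * sigma)
cpk = min(USL - mean, mean - LSL) / (3 * sigma)
print(f"mean={mean:.1f} mg  sigma={sigma:.2f} mg")
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  capable={cpk >= 1.33}")
```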
Abstract:
The area of hospital automation has been the subject of much research, addressing relevant issues that can be automated, such as: management and control (electronic medical records, appointment scheduling, hospitalization, among others); communication (tracking of patients, staff, and materials); the development of medical, hospital, and laboratory equipment; monitoring (of patients, staff, and materials); and aid to medical diagnosis (according to each specialty). This thesis presents an architecture for patient monitoring and alert systems. The architecture is based on intelligent systems techniques and is applied to hospital automation, specifically to patient monitoring in the Intensive Care Unit (ICU). Its main goal is to transform multiparameter monitor data into useful information, through the knowledge of specialists and the normal ranges of vital signs, using fuzzy logic to extract information about the clinical condition of ICU patients and to provide a pre-diagnosis. Finally, alerts are dispatched to medical professionals whenever an abnormality is found during monitoring. After the validation of the architecture, the fuzzy logic inferences were applied to the training and validation of an Artificial Neural Network for the classification of the cases previously validated with the fuzzy system.
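A minimal fuzzy-logic sketch for grading a single vital sign (heart rate) into an alert level, illustrating the kind of inference the abstract describes; the membership functions and thresholds are illustrative, not clinical values or the thesis's rule base.

```python
# Fuzzy grading of heart rate into an alert level in [0, 1].
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def heart_rate_alert(hr_bpm):
    low = tri(hr_bpm, 20, 40, 60)
    normal = tri(hr_bpm, 50, 75, 100)
    high = tri(hr_bpm, 90, 140, 200)
    # defuzzify: weighted average of the alert score assigned to each set
    scores = {"low": (low, 0.8), "normal": (normal, 0.0), "high": (high, 1.0)}
    total = sum(m for m, _ in scores.values())
    return sum(m * s for m, s in scores.values()) / total if total else 1.0

for hr in (45, 72, 95, 150):
    print(f"HR {hr:3d} bpm -> alert level {heart_rate_alert(hr):.2f}")
```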
Abstract:
Most current solutions for monitoring onshore Oil and Gas environments are based on wireless technology. However, these solutions rely on an out-of-date technological configuration, mainly because they use analog radios and inefficient communication topologies. Solutions based on digital radios, on the other hand, can be more efficient with respect to energy consumption, security, and fault tolerance. Thus, this work evaluates whether Wireless Sensor Networks, a communication technology based on digital radios, are adequate for monitoring onshore Oil and Gas wells. The percentage of successfully transmitted packets, energy consumption, communication delay, and routing techniques applied to a mesh topology are used as metrics to validate the proposal, with the different routing techniques evaluated through the NS-2 network simulation tool.
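A sketch of computing the evaluation metrics named above (packet delivery ratio, average end-to-end delay, energy consumption) from per-packet records; the records are invented stand-ins, not NS-2 trace output or the thesis's results.

```python
# Compute delivery ratio, average delay, and TX energy from packet records.
packets = [
    # (sent_at_s, received_at_s or None if lost, tx_energy_mJ)
    (0.00, 0.12, 2.1),
    (0.50, 0.59, 2.0),
    (1.00, None, 2.2),   # lost packet
    (1.50, 1.66, 2.1),
    (2.00, 2.09, 2.0),
]

delivered = [(s, r) for s, r, _ in packets if r is not None]
pdr = len(delivered) / len(packets)
avg_delay = sum(r - s for s, r in delivered) / len(delivered)
energy = sum(e for _, _, e in packets)

print(f"packet delivery ratio: {pdr:.0%}")
print(f"average delay: {avg_delay * 1000:.0f} ms")
print(f"total TX energy: {energy:.1f} mJ")
```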