11 results for clustering and QoS-aware routing

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

100.00%

Publisher:

Abstract:

This study aims to assess the level of command and the influence of the neuromarketing construct among professionals at advertising agencies in Brazil. Concepts related to this new approach are still little publicized, and few analyses have been carried out in the area. The research is therefore qualitative and exploratory in nature, drawing on books and articles on marketing, neuroscience, and psychology as primary sources, as well as on secondary sources. In-depth interviews were conducted at the main advertising agencies in Brazil, with a sample composed of the managers responsible for planning, and a content analysis was performed afterwards. Advances in brain science have enabled technological innovations aimed primarily at understanding the unconscious experiences of consumers, which drive decision making and consumer behavior. These issues fall under neuromarketing, which in turn uses techniques such as fMRI, PET, and fDOT. These techniques scan the consumer's brain and produce images of neuronal structure and functioning while activities such as mental tasks involving the visualization of brands, images, or products, or the watching of videos and commercials, are performed. It was observed that the agencies are constantly in search of new technologies and are aware of the limitations of current research instruments; on the other hand, they are not fully familiar with concepts related to neuromarketing. Regarding neuroimaging techniques, the research points to complete unawareness, although some agencies envision positive impacts from using such techniques to evaluate films and to understand the consumer better. Neuroimaging is perceived as one technique among others, but its application is not yet real: there are barriers in the market and within the agencies themselves.
These barriers, together with some reservations and the scarce knowledge of neuromarketing, make it impossible to put the approach into practice in the advertising market. It was also observed that even with greater use of neuromarketing there would be no meaningful changes in the functioning and structure of these agencies; the use of neuroimaging machines should take place in research institutes and in the research centers of large companies. The results show that command of the neuromarketing construct in Brazilian advertising agencies is only theoretical: little is known of the subject and of the neurological studies, and nothing at all of neuroimaging techniques.

Relevance:

100.00%

Publisher:

Abstract:

This thesis proposes the specification and performance analysis of a real-time communication mechanism for the IEEE 802.11/11e standards, called Group Sequential Communication (GSC). The GSC performs better than the HCCA mechanism when dealing with small data packets, by adopting decentralized medium access control with a publish/subscribe communication scheme. The main objective of the thesis is to reduce the HCCA overhead of the Poll, ACK, and QoS Null frames exchanged between the Hybrid Coordinator and the polled stations. The GSC eliminates the polling scheme used by the HCCA scheduling algorithm through a Virtual Token Passing procedure among the members of the real-time group, which are granted high-priority, sequential access to the communication medium. To improve the reliability of the proposed mechanism over a noisy channel, an error recovery scheme called the second chance algorithm is presented. This scheme is based on a block acknowledgment strategy that allows missing real-time messages to be retransmitted. The GSC mechanism thus sustains real-time traffic across many IEEE 802.11/11e devices with optimized bandwidth usage and minimal delay variation for data packets in the wireless network. To validate the communication scheme, the GSC and HCCA mechanisms were implemented in network simulation software developed in C/C++ and their performance results were compared. The experiments show the efficiency of the GSC mechanism, especially in industrial communication scenarios.
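The core idea of Virtual Token Passing, as the abstract describes it, is that group members transmit in a fixed sequence without any Poll/ACK/QoS-Null exchange. A minimal sketch of one such round follows; the function and station names are illustrative, not taken from the thesis.

```python
# Sketch of a Virtual Token Passing round: each member of the real-time
# group gets its turn in a globally known order, and the "token" passes
# implicitly to the next member, with no polling frames on the medium.

def virtual_token_rounds(members, has_data, rounds=1):
    """Return the transmission schedule over the given number of rounds.

    members  -- ordered list of real-time group members
    has_data -- maps a member to True when it has a queued message
    """
    schedule = []
    for _ in range(rounds):
        for station in members:        # sequential, high-priority access
            if has_data.get(station):  # transmit only if a message is queued
                schedule.append(station)
    return schedule

order = ["STA1", "STA2", "STA3"]
print(virtual_token_rounds(order, {"STA1": True, "STA3": True}))
# ['STA1', 'STA3']
```

In a real implementation each station would time its slot locally from the previous station's transmission; the list here only models the resulting access order.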

Relevance:

100.00%

Publisher:

Abstract:

Most current solutions for monitoring onshore oil and gas environments are wireless. However, their technological configuration is out of date, mainly because they rely on analog radios and inefficient communication topologies. Solutions based on digital radios, on the other hand, can be more efficient with respect to energy consumption, security, and fault tolerance. This work therefore evaluates whether Wireless Sensor Networks, a communication technology based on digital radios, are adequate for monitoring onshore oil and gas wells. The percentage of successfully transmitted packets, energy consumption, communication delay, and the routing techniques applied to a mesh topology are used as metrics to validate the proposal across the different routing techniques in the NS-2 network simulator.

Relevance:

100.00%

Publisher:

Abstract:

Image segmentation is one of the image processing problems that deserves special attention from the scientific community. This work studies unsupervised clustering and pattern recognition methods applicable to medical image segmentation. Methods based on Natural Computing have proven very attractive for such tasks and are studied here to verify their applicability to medical image segmentation. The following methods are implemented: GKA (Genetic K-means Algorithm), GFCMA (Genetic FCM Algorithm), PSOKA (PSO and K-means based Clustering Algorithm), and PSOFCM (PSO and FCM based Clustering Algorithm). To evaluate the results produced by the algorithms, clustering validity indexes are used as quantitative measures; visual and qualitative evaluations are also carried out, mainly using data from the BrainWeb brain simulator as ground truth.
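All four methods listed above wrap a base clustering step (K-means or FCM) with a genetic or PSO search over cluster centers. A minimal sketch of the K-means step on 1-D pixel intensities, the kind of quantity a grayscale segmentation would cluster, is shown below; it is illustrative only, not the thesis's implementation.

```python
# Minimal K-means on 1-D intensities: assign each value to its nearest
# center, then recompute each center as the mean of its cluster.
def kmeans_1d(values, centers, iters=10):
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            idx = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)                 # nearest-center assignment
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]  # update step: cluster means
    return centers

# Two intensity populations (dark ~11, bright ~92) are recovered:
print(kmeans_1d([10, 12, 11, 90, 95, 92], [0.0, 50.0]))
```

GKA and PSOKA differ from this plain loop in how the initial and intermediate centers are searched (genetic operators and particle swarms, respectively), which helps avoid the poor local optima that a single K-means run can fall into.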

Relevance:

100.00%

Publisher:

Abstract:

Ubiquitous computing systems operate in environments where the available resources change significantly during system operation, and thus require adaptive, context-aware mechanisms to sense changes in the environment and adapt to new execution contexts. Motivated by this requirement, this work proposes PACCA, a framework for developing and executing adaptive context-aware applications. PACCA employs aspect-oriented techniques to modularize the adaptive behavior and keep it separate from the application logic. It uses the abstract-aspect concept to provide flexibility, allowing new adaptive concerns to be added by extending the abstract aspect, and it provides a default aspect model covering the adaptive concerns that are common in ubiquitous applications. PACCA exploits the synergy between aspect orientation and dynamic composition to achieve context-aware adaptation guided by predefined policies, and aims to allow software modules to be loaded on demand, making better use of mobile devices and their limited resources. A development process for conceiving ubiquitous applications is also proposed, with a set of activities that guide the developer of adaptive context-aware applications. Finally, a quantitative, metrics-based study evaluates the aspect- and dynamic-composition-based approach for building ubiquitous applications.

Relevance:

100.00%

Publisher:

Abstract:

Symbolic Data Analysis (SDA) mainly aims to provide tools for reducing large databases in order to extract knowledge, and techniques for describing the units of such data as complex units, such as intervals or histograms. The objective of this work is to extend classical clustering methods to symbolic interval data using interval-based distances. The main advantage of using an interval-based distance for interval data is that it preserves the underlying imprecision of the intervals, which is usually lost when real-valued distances are applied. The work also includes an approach that allows existing validity indexes to be adapted to the interval context. The proposed methods with interval-based distances are compared with the point distances of the existing literature through experiments on simulated and real interval data.
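To make the contrast concrete, one common interval-based distance is the Hausdorff distance between intervals, which compares both endpoints rather than collapsing each interval to a point. The abstract does not say which distance the work adopts, so the sketch below is only an example of the family of distances it refers to.

```python
# Hausdorff distance between two intervals [a, b] and [c, d]:
#   d_H = max(|a - c|, |b - d|)
# It keeps the intervals' width information, which a point distance
# on midpoints would discard.
def hausdorff_interval(x, y):
    (a, b), (c, d) = x, y
    return max(abs(a - c), abs(b - d))

def midpoint_distance(x, y):
    # point distance on interval midpoints, for comparison
    return abs((x[0] + x[1]) / 2 - (y[0] + y[1]) / 2)

print(hausdorff_interval((1, 5), (2, 9)))  # 4
print(midpoint_distance((1, 5), (2, 9)))   # 2.5
```

Note how intervals with close midpoints but very different widths can look near under the midpoint distance yet far under the interval-based one, which is precisely the imprecision the abstract says should be preserved.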

Relevance:

100.00%

Publisher:

Abstract:

Peng was the first to work with the DFA (Detrended Fluctuation Analysis) technique, a tool capable of detecting long-range autocorrelation in non-stationary time series. In this study, the DFA technique is used to obtain the Hurst exponent (H) profile of the neutron porosity logs of 52 oil wells in the Namorado Field, located in the Campos Basin, Brazil. The purpose is to determine whether the Hurst exponent can be used to characterize the spatial distribution of the wells, that is, whether wells with close values of H are also spatially close together. Both a hierarchical clustering method and a non-hierarchical one (the k-means method) were used, and the two were compared to see which provides the better result. From this, a neighborhood index was computed to check whether a data set produced by the k-means method, or generated at random, in fact shows spatial patterns: high values of the index indicate that the data are aggregated, while low values indicate that the data are scattered (no spatial correlation). A Monte Carlo test showed that randomly combined data yield a distribution of the index below the empirical value, so the empirical H values obtained from the 52 wells are geographically grouped. Comparing the standard-curve data with the results obtained by k-means confirms that H is effective for correlating wells by spatial distribution.
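DFA itself follows a fixed recipe: integrate the mean-subtracted series into a profile, split the profile into boxes of size n, subtract a local linear trend in each box, compute the RMS fluctuation F(n), and take H as the slope of log F(n) versus log n. A compact, stdlib-only sketch of that recipe (not the study's code) is:

```python
import math
import random

def _linfit(xs, ys):
    # least-squares line fit; returns (intercept, slope)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

def dfa_exponent(series, box_sizes=(4, 8, 16)):
    mean = sum(series) / len(series)
    profile, s = [], 0.0
    for v in series:
        s += v - mean
        profile.append(s)                      # integrated (profile) series
    log_n, log_f = [], []
    for n in box_sizes:
        sq, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            a, b = _linfit(list(range(n)), seg)      # local linear trend
            sq += sum((seg[i] - (a + b * i)) ** 2 for i in range(n))
            count += n
        log_n.append(math.log(n))
        log_f.append(math.log(math.sqrt(sq / count)))  # RMS fluctuation F(n)
    return _linfit(log_n, log_f)[1]            # slope of log-log fit ~ H

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(512)]
print(round(dfa_exponent(noise), 2))  # typically near 0.5 for white noise
```

A porosity log with H > 0.5 would indicate persistent long-range correlation along depth; the study then clusters the per-well H values to test for spatial grouping.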

Relevance:

100.00%

Publisher:

Abstract:

A Wireless Sensor Network (WSN) consists of devices distributed over an area in order to monitor physical variables such as temperature, pressure, vibration, and motion, as well as environmental conditions, in places where wired networks would be difficult or impractical to deploy: for example, hard-to-reach industrial installations, the monitoring and control of onshore or offshore oil wells, and the monitoring of large agricultural and livestock areas. To be viable, a WSN must meet important requirements such as low cost, low latency, and, especially, low power consumption. To satisfy these requirements, however, such networks operate with limited resources and are sometimes deployed in hostile environments, leading to high failure rates, such as segmented routes and message loss, which reduce efficiency and can even compromise the entire network. This work presents FTE-LEACH, a fault-tolerant and energy-efficient routing protocol that maintains efficient communication and data dissemination. The protocol was developed based on the IEEE 802.15.4 standard and is suitable for industrial networks with limited energy resources.
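The abstract does not detail FTE-LEACH's election rule, but protocols in the LEACH family build on the classic rotating cluster-head election, where in each round a node becomes cluster head with a probability threshold that guarantees every node serves once per 1/p rounds. As background only, that threshold can be sketched as:

```python
import random

def leach_threshold(p, r):
    """Classic LEACH cluster-head threshold T(n) = p / (1 - p * (r mod 1/p)).

    p -- desired fraction of cluster heads per round
    r -- current round number (for nodes that have not yet been head
         in the current epoch of 1/p rounds)
    """
    return p / (1 - p * (r % int(1 / p)))

def elects_cluster_head(p, r, rng=random.random):
    # a node elects itself head when its random draw falls below T(n)
    return rng() < leach_threshold(p, r)

print(round(leach_threshold(0.1, 0), 3))  # 0.1 at the start of an epoch
print(round(leach_threshold(0.1, 9), 3))  # 1.0 in the last round of the epoch
```

The threshold rises toward 1 as the epoch progresses, forcing the remaining nodes to take their turn, which spreads the energy-hungry cluster-head role evenly; FTE-LEACH's fault-tolerance additions are not shown here.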

Relevance:

100.00%

Publisher:

Abstract:

The spread of wireless networks and the growing proliferation of mobile devices require the development of mobility control mechanisms to support different traffic demands under different network conditions. A major obstacle to developing this kind of technology is the complexity involved in handling all the information about the large number of Moving Objects (MOs), as well as the entire signaling overhead required to manage these procedures in the network. Although several initiatives have been proposed by the scientific community to address this issue, they have not proved effective, since they depend on a specific request from the MO, which is responsible for triggering the mobility process. Moreover, they are often guided only by wireless-medium statistics, such as the Received Signal Strength Indicator (RSSI) of the candidate Point of Attachment (PoA). This work therefore seeks to develop, evaluate, and validate a sophisticated communication infrastructure for Wireless Networking for Moving Objects (WiNeMO) systems by exploiting the flexibility provided by the Software-Defined Networking (SDN) paradigm, in which network functions are easily and efficiently deployed by integrating the OpenFlow and IEEE 802.21 standards. For benchmarking purposes, the analysis covered both the control and data planes, and it demonstrates that the proposal significantly outperforms typical IP-based SDN with QoS-enabled capabilities, allowing the network to handle multimedia traffic with optimal Quality of Service (QoS) transport and acceptable Quality of Experience (QoE) over time.


Relevance:

40.00%

Publisher:

Abstract:

Context-aware applications are typically dynamic and use services provided by several sources with different quality levels. The quality of context information is expressed in terms of Quality of Context (QoC) metadata, such as precision, correctness, refreshment, and resolution, while service quality is expressed via Quality of Service (QoS) metadata such as response time, availability, and error rate. To ensure that an application is using services and context information that meet its requirements, it is essential to monitor this metadata continuously. This calls for a QoS and QoC monitoring mechanism that meets the following requirements: (i) support for measuring and monitoring QoS and QoC metadata; (ii) support for synchronous and asynchronous operation, so that the application can periodically gather the monitored metadata and also be notified asynchronously whenever a given metadata item becomes available; and (iii) the use of ontologies to represent information, in order to avoid ambiguous interpretation. This work presents QoMonitor, a module for monitoring QoS and QoC metadata that meets the above requirements, and discusses its architecture and implementation. To support asynchronous communication, QoMonitor uses two protocols: JMS and Light-PubSubHubbub. To illustrate its use in the development of ubiquitous applications, QoMonitor was integrated into OpenCOPI (Open COntext Platform Integration), a middleware platform that integrates several context-provision middleware systems. QoMonitor was validated with two proof-of-concept applications, an oil and gas monitoring application and a healthcare application, and its performance was evaluated for both synchronous and asynchronous requests.
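The two access modes in requirement (ii) can be illustrated with a small in-memory monitor: a synchronous `get` for periodic polling and a subscription callback for asynchronous notification. The class and method names below are hypothetical, not QoMonitor's actual API.

```python
# Hypothetical sketch of a QoS/QoC metadata monitor supporting both
# synchronous polling and asynchronous (publish/subscribe) notification.
class MetadataMonitor:
    def __init__(self):
        self._metadata = {}      # latest observed value per metadata item
        self._subscribers = {}   # metadata name -> list of callbacks

    def update(self, name, value):
        """Record a new measurement and push it to subscribers."""
        self._metadata[name] = value
        for callback in self._subscribers.get(name, []):
            callback(name, value)          # asynchronous-style notification

    def get(self, name):
        """Synchronous pull of the most recent value, or None."""
        return self._metadata.get(name)

    def subscribe(self, name, callback):
        """Register to be notified whenever `name` is updated."""
        self._subscribers.setdefault(name, []).append(callback)

mon = MetadataMonitor()
events = []
mon.subscribe("response_time", lambda k, v: events.append((k, v)))
mon.update("response_time", 120)
print(mon.get("response_time"), events)  # 120 [('response_time', 120)]
```

In QoMonitor the push side is carried over JMS or Light-PubSubHubbub rather than in-process callbacks, but the sync/async split is the same.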