857 results for clustering and QoS-aware routing
Abstract:
This study aims to assess the level of mastery and the influence of the neuromarketing construct among professionals at advertising agencies in Brazil. Concepts related to this new approach are still scarcely disseminated, and few analyses have been performed in this area. The research is therefore qualitative and exploratory in nature, using books and articles on marketing, neuroscience and psychology as primary sources, complemented by secondary sources. In-depth interviews were conducted at the main advertising agencies in Brazil, with the managers responsible for planning as respondents, followed by a content analysis. Advances in brain science have enabled technological innovations aimed primarily at the knowledge and unconscious experiences of consumers, which drive decision making and consumer behavior. These issues belong to neuromarketing, which in turn uses techniques such as fMRI, PET and fDOT. These techniques scan the consumer's brain and produce images of neural structures and functioning while activities such as visualizing brands, images or products, or watching videos and commercials, are performed. The agencies are constantly searching for new technologies and are aware of the limitations of current research instruments; on the other hand, they are not fully familiar with concepts related to neuromarketing. Regarding neuroimaging techniques, the research points to complete unawareness, although some agencies envision positive impacts from using these techniques to evaluate films and to know the consumer better. Neuroimaging is perceived as one technique among others, but its application is not yet real: there are barriers in the market and in the agencies themselves. These barriers, together with some reservations and the scarce knowledge of neuromarketing, prevent it from being put into practice in the advertising market. It was also observed that even with greater use of neuromarketing there would be no meaningful changes in the functioning and structure of these agencies; the use of neuroimaging machines should remain in research institutes and the research centers of large companies. The results show that mastery of the neuromarketing construct in Brazilian advertising agencies is only theoretical: little is known about the subject and the neurological studies, and nothing at all about neuroimaging techniques.
Abstract:
The relationships between physical and morphological characteristics of semen from bulls of the subspecies Bos taurus taurus and Bos taurus indicus and the age of the bulls and the season of semen collection were evaluated. Observations made from 1993 to 1999 on 42 bulls aged 12 to 174 months were used, divided into five classes: Bos taurus taurus - 12 to 36 months; 37 to 60 months; 61 to 84 months; 85 to 108 months; and 109 to 138 months; and Bos taurus indicus - 12 to 42 months; 43 to 72 months; 73 to 102 months; 103 to 132 months; and 133 to 174 months. The characteristics analysed were: volume, sperm mass motion, sperm concentration, sperm motility, sperm vigour, sperm abnormalities and acrosome integrity. The analyses used the means of the characteristics in each age class and the month of semen collection over the 1993-1999 period. The multivariate statistical techniques of principal components and hierarchical clustering produced results that can help in choosing semen of better quality. The classes of 103 to 132 months and 133 to 174 months for the subspecies Bos taurus indicus and the age class of 109 to 138 months for the subspecies Bos taurus taurus were the most contrasting. For both subspecies, semen quality was lower in the more humid months, especially for the subspecies Bos taurus taurus. The most contrasting semen characteristics across the age classes and collection periods were: sperm vigour, sperm motility, sperm concentration, acrosome integrity and tertiary sperm abnormalities.
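As a rough illustration of the workflow this abstract describes (standardize the trait means per age class, run principal components, then cluster hierarchically), the sketch below uses invented per-class values; the real inputs would be the 1993-1999 collection records, which are not reproduced here.

```python
# Minimal sketch of the PCA + hierarchical-clustering workflow described
# above; the trait means per age class are illustrative placeholders.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Rows: age classes; columns: volume, mass motion, concentration,
# motility, vigour, abnormalities, acrosome integrity (invented values).
X = np.array([
    [5.1, 3.2, 1100.0, 72.0, 3.4, 14.0, 88.0],
    [5.6, 3.4, 1210.0, 75.0, 3.6, 12.0, 90.0],
    [6.0, 3.1, 1150.0, 70.0, 3.3, 15.0, 87.0],
    [5.8, 2.9, 1020.0, 66.0, 3.0, 18.0, 84.0],
    [5.2, 2.6,  950.0, 61.0, 2.8, 22.0, 80.0],
])
Xz = zscore(X, axis=0)            # standardise traits: zero mean, unit sd

# PCA via SVD on the standardised matrix
U, s, Vt = np.linalg.svd(Xz, full_matrices=False)
scores = U * s                    # age-class coordinates on the PCs
print("variance explained:", np.round(s**2 / np.sum(s**2), 3))

# Hierarchical (average-linkage) clustering on the same standardised data
Z = linkage(Xz, method="average", metric="euclidean")
print("age-class clusters:", fcluster(Z, t=2, criterion="maxclust"))
```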
Abstract:
Soils under no-tillage production systems are subject to compaction caused by machinery traffic, making it necessary to monitor changes in the physical environment, which, when unfavourable, restricts root growth and can reduce crop yield. The objective of this work was to evaluate the effect of different compaction intensities on the physical quality of a medium-textured Oxisol (Latossolo Vermelho) in Jaboticabal (SP, Brazil) under maize, using multivariate statistical methods. The experimental design was completely randomized, with six compaction intensities and four replicates. Undisturbed soil samples were collected from the 0.02-0.05, 0.08-0.11 and 0.15-0.18 m layers to determine soil bulk density (Ds) in the 0-0.20 m layer. The crop characteristics evaluated were: root density, root diameter, root dry matter, plant height, height of first ear insertion, stem diameter and plant dry matter. Cluster and principal component analyses identified three groups of high, medium and low maize productivity according to variables of the soil, the root system and the plant shoots. The classification into groups was performed by three methods: hierarchical clustering, non-hierarchical k-means and principal component analysis. The principal components showed that high maize yields are correlated with good shoot growth under lower soil bulk density, which favours high root dry matter production, albeit with small root diameter. The physical quality of the Oxisol for maize cultivation was maintained up to a soil bulk density of 1.38 Mg m-3.
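A minimal sketch of the non-hierarchical k-means step mentioned above, separating plots into three productivity groups; the soil and plant values below are placeholders, not the measured Jaboticabal data.

```python
# Hypothetical k-means grouping of plots by soil and plant variables.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Columns: bulk density, root density, root diameter, root dry matter,
# plant height, ear insertion height, stem diameter, shoot dry matter.
plots = np.array([
    [1.25, 0.9, 0.45, 3.1, 2.2, 1.1, 2.3, 180.0],
    [1.32, 0.8, 0.47, 2.8, 2.1, 1.0, 2.2, 165.0],
    [1.40, 0.6, 0.52, 2.2, 1.9, 0.9, 2.0, 140.0],
    [1.48, 0.5, 0.55, 1.9, 1.8, 0.9, 1.9, 120.0],
    [1.55, 0.4, 0.60, 1.5, 1.6, 0.8, 1.7, 100.0],
    [1.62, 0.3, 0.66, 1.2, 1.5, 0.8, 1.6,  90.0],
])
X = StandardScaler().fit_transform(plots)   # put variables on one scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("productivity group per compaction level:", labels)
```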
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
This thesis proposes the specification and performance analysis of a real-time communication mechanism for the IEEE 802.11/11e standards, called Group Sequential Communication (GSC). The GSC outperforms the HCCA mechanism for small data packets by adopting decentralized medium access control with a publish/subscribe communication scheme. The main objective of the thesis is to reduce the HCCA overhead of the Polling, ACK and QoS Null frames exchanged between the Hybrid Coordinator and the polled stations. The GSC replaces the polling scheme of the HCCA scheduling algorithm with a Virtual Token Passing procedure among the members of the real-time group, which are granted high-priority, sequential access to the communication medium. To improve the reliability of the proposed mechanism over a noisy channel, an error-recovery scheme called the second chance algorithm is presented. This scheme is based on a block acknowledgment strategy that allows missing real-time messages to be retransmitted. The GSC mechanism thus sustains real-time traffic across many IEEE 802.11/11e devices with optimized bandwidth usage and minimal delay variation for data packets in the wireless network. To validate the communication scheme, the GSC and HCCA mechanisms were implemented in network simulation software developed in C/C++ and their performance results were compared. The experiments show the efficiency of the GSC mechanism, especially in industrial communication scenarios.
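As a loose illustration of the virtual-token idea (this is not the GSC specification: the group size, loss model and "second chance" timing below are invented), one round of decentralized access could be sketched as follows, with the token passing implicitly from each station to the next with no coordinator polling frames:

```python
# Toy sketch: a fixed station ordering acts as the virtual token; after each
# station's slot, the next station in the list gains the medium. Lost
# messages are reported via a block acknowledgment and retransmitted once
# in the next cycle ("second chance").
import random

GROUP = ["S0", "S1", "S2", "S3"]      # ordered real-time group (invented)
LOSS_PROB = 0.2                       # crude noisy-channel model

def cycle(pending_retx):
    """One virtual-token cycle; returns the messages still missing."""
    missing = []
    for station in GROUP:
        delivered = random.random() > LOSS_PROB
        print(f"{station}: token -> transmit ({'ok' if delivered else 'lost'})")
        if not delivered:
            missing.append(station)
    # second chance: retransmit the messages the block ACK flagged missing
    for station in pending_retx:
        print(f"{station}: second-chance retransmission")
    return missing

missing = cycle([])
cycle(missing)
```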
Abstract:
Most current monitoring of onshore Oil and Gas environments is based on wireless solutions. However, these solutions rely on out-of-date technology, mainly analog radios and inefficient communication topologies. Solutions based on digital radios, on the other hand, can be more efficient with respect to energy consumption, security and fault tolerance. This work therefore evaluated whether Wireless Sensor Networks, a communication technology based on digital radios, are adequate for monitoring onshore Oil and Gas wells. The percentage of packets transmitted successfully, energy consumption, communication delay and routing techniques applied to a mesh topology were used as metrics to validate the proposal, comparing different routing techniques in the NS-2 network simulation tool.
Abstract:
Image segmentation is one of the image processing problems that deserves special attention from the scientific community. This work studies unsupervised clustering and pattern recognition methods applicable to medical image segmentation. Methods based on Natural Computing have proved very attractive for such tasks and are studied here to verify their applicability to medical image segmentation. The following methods are implemented: GKA (Genetic K-means Algorithm), GFCMA (Genetic FCM Algorithm), PSOKA (PSO- and K-means-based Clustering Algorithm) and PSOFCM (PSO- and FCM-based Clustering Algorithm). To evaluate the results produced by the algorithms, clustering validity indexes are used as quantitative measures. Visual and qualitative evaluations are also carried out, mainly using data from the BrainWeb brain simulator as ground truth.
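For reference, a minimal sketch of plain fuzzy c-means (FCM), the base clusterer that GFCMA and PSOFCM hybridise with genetic and PSO search; the one-dimensional intensities below stand in for the BrainWeb voxel data, which is not reproduced here.

```python
# Minimal FCM: alternate fuzzy-membership and centre updates.
import numpy as np

def fcm(X, c=3, m=2.0, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))       # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        # u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        U = 1.0 / (d ** p * np.sum(d ** (-p), axis=1, keepdims=True))
    return centers, U

# Three synthetic grey-level populations standing in for tissue classes
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(mu, 5.0, (100, 1)) for mu in (30.0, 100.0, 180.0)])
centers, U = fcm(X)
print("cluster centres:", np.sort(centers.ravel()))   # expected near 30, 100, 180
```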
Abstract:
Ubiquitous computing systems operate in environments where the available resources change significantly during system operation, requiring adaptive and context-aware mechanisms to sense changes in the environment and adapt to new execution contexts. Motivated by this requirement, a framework for developing and executing adaptive context-aware applications is proposed. The PACCA framework employs aspect-oriented techniques to modularize the adaptive behavior and keep it separate from the application logic. PACCA uses the abstract aspect concept to provide flexibility, allowing new adaptive concerns to be added by extending the abstract aspect. Furthermore, PACCA has a default aspect model covering the adaptive concerns habitual in ubiquitous applications. It exploits the synergy between aspect orientation and dynamic composition to achieve context-aware adaptation, guided by predefined policies, and allows software modules to be loaded on demand, making better use of mobile devices and their limited resources. A development process for conceiving ubiquitous applications is also proposed, presenting a set of activities that guide the developer of adaptive context-aware applications. Finally, a quantitative, metrics-based study evaluates the aspect- and dynamic-composition-based approach for the construction of ubiquitous applications.
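To make the core idea concrete (separating the adaptation policy from the application logic), here is a loose sketch in which a decorator plays the role of an adaptation aspect; all names and the context source below are hypothetical, not the PACCA API.

```python
# Aspect-like wrapper: the adaptation policy lives in the decorator, not in
# the application function, and picks a lightweight variant when the sensed
# context reports scarce resources.
import functools

CONTEXT = {"bandwidth_kbps": 90}      # value a context sensor would provide

def adaptive(low_variant):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if CONTEXT["bandwidth_kbps"] < 128:   # predefined policy
                return low_variant(*args, **kwargs)
            return func(*args, **kwargs)
        return wrapper
    return decorator

def fetch_thumbnail(url):
    return f"low-res image from {url}"

@adaptive(fetch_thumbnail)
def fetch_image(url):
    return f"full image from {url}"

print(fetch_image("http://example.org/pic"))   # low-res under scarce bandwidth
```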
Abstract:
Symbolic Data Analysis (SDA) mainly aims to provide tools for reducing large databases in order to extract knowledge, and techniques for describing the units of such data as complex units, such as intervals or histograms. The objective of this work is to extend classical clustering methods to symbolic interval data using interval-based distances. The main advantage of using an interval-based distance for interval data is that it preserves the underlying imprecision of the intervals, which is usually lost when real-valued (point) distances are applied. This work also includes an approach that allows existing indices to be adapted to the interval context. The proposed methods with interval-based distances are compared with the point distances in the existing literature through experiments with simulated and real interval data.
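As an illustration of what an interval-based distance buys over a point distance, the sketch below plugs one common choice (the Hausdorff distance between intervals) into a k-means-style assignment step; this particular distance and the toy data are assumptions, since the work compares several such distances.

```python
# Hausdorff distance between intervals [a, b] and [c, d]: max(|a-c|, |b-d|).
# Unlike a distance on midpoints, it keeps the imprecision carried by the
# interval bounds.
import numpy as np

def hausdorff(i1, i2):
    return max(abs(i1[0] - i2[0]), abs(i1[1] - i2[1]))

# Interval-valued observations: (lower bound, upper bound)
data = np.array([[1.0, 2.0], [1.2, 2.5], [8.0, 9.5], [7.5, 9.0]])
prototypes = np.array([[1.0, 2.0], [8.0, 9.0]])

# Assignment step of a k-means-style algorithm on intervals
labels = [np.argmin([hausdorff(x, p) for p in prototypes]) for x in data]
print(labels)   # expected grouping: [0, 0, 1, 1]
```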
Abstract:
Peng was the first to work with the DFA technique (Detrended Fluctuation Analysis), a tool capable of detecting long-range autocorrelation in non-stationary time series. In this study, the DFA technique is used to obtain the Hurst exponent (H) of the neutron porosity log of 52 oil wells in the Namorado Field, located in the Campos Basin, Brazil. The purpose is to determine whether the Hurst exponent can be used to characterize the spatial distribution of the wells, that is, whether wells with close values of H are spatially close together. Both a hierarchical clustering method and the non-hierarchical k-means method were used, and the two methods were compared to see which provides the better result. From this, a neighborhood index was computed, which checks whether a data set, generated by the k-means method or at random, in fact shows spatial patterns. High values of the index indicate that the data are aggregated, while low values indicate that the data are scattered (no spatial correlation). A Monte Carlo procedure showed that randomly combined data yield index values below the empirical value, so the empirical values of H obtained from the 52 wells are indeed grouped geographically. Comparing the wells' standard curves with the results obtained by k-means confirms that the method is effective for correlating wells by their spatial distribution.
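A minimal sketch of the DFA estimate of H: integrate the demeaned series, detrend it in windows of several sizes, and read H off the log-log slope of the fluctuation function. The synthetic white-noise signal stands in for the neutron-porosity logs, which are not reproduced here.

```python
# Minimal DFA: F(s) ~ s^H, so H is the slope of log F(s) vs log s.
import numpy as np

def dfa_hurst(x, scales=(8, 16, 32, 64, 128)):
    y = np.cumsum(x - np.mean(x))        # integrated (profile) series
    F = []
    for s in scales:
        n = len(y) // s
        rms = []
        for i in range(n):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        F.append(np.mean(rms))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(0)
print("H for white noise (expected ~0.5):",
      round(dfa_hurst(rng.normal(size=4096)), 2))
```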
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
A methodology for pipeline leakage detection using a combination of clustering and classification tools for fault detection is presented here. A fuzzy system classifies the running mode and identifies the operational and process transients, and the relationship between these transients and the mass-balance deviation is discussed. This strategy allows better identification of leakage because the thresholds are adjusted by the fuzzy system as a function of the running mode and the classified transient level. The fuzzy system is initially trained off-line with a modified data set that includes simulated leakages. The methodology is applied to a small-scale LPG pipeline monitoring case in which portability, robustness and reliability are amongst the most important criteria for the detection system. The results are very encouraging, with relatively low levels of false alarms and increased leakage detection at low computational cost.
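An illustrative sketch of the threshold-adaptation idea: a leak alarm fires when the mass-balance deviation exceeds a threshold that widens with the fuzzy-classified transient level. The membership functions and weights below are invented, not the trained fuzzy system of the paper.

```python
# Hypothetical fuzzy memberships for 'steady', 'mild' and 'strong' transients,
# driven by the magnitude of a flow change.
def transient_level(flow_change):
    steady = max(0.0, 1.0 - abs(flow_change) / 5.0)
    strong = min(1.0, max(0.0, (abs(flow_change) - 5.0) / 10.0))
    mild = max(0.0, 1.0 - steady - strong)
    return steady, mild, strong

def leak_alarm(mass_balance_dev, flow_change, base_threshold=2.0):
    steady, mild, strong = transient_level(flow_change)
    # widen the threshold during transients so they are not flagged as leaks
    threshold = base_threshold * (1.0 * steady + 2.0 * mild + 4.0 * strong)
    return mass_balance_dev > threshold

print(leak_alarm(3.0, flow_change=0.5))   # near-steady flow: alarm fires
print(leak_alarm(3.0, flow_change=12.0))  # strong transient: alarm suppressed
```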
Abstract:
A simulation study was made of the effects of mixing two evolutionary forces (natural selection and random genetic drift), combined in a single data matrix of gene frequencies, on the resulting genetic distances among populations. Twenty-one kinds of simulated gene-frequency surfaces, for 15 populations distributed linearly over geographic space, were used to construct 21 data matrices combining different proportions of two types of surfaces (gradients and random surfaces). These matrices were analysed by Unweighted Pair-Group Method with Arithmetic Averages (UPGMA) clustering and by Principal Coordinate Analysis. The results show that ordination is more accurate than UPGMA in revealing the spatial patterns in the genetic distances, judged against results obtained with the Mantel test comparing the genetic and geographic distances directly.
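A hedged sketch of the benchmark comparison the abstract mentions: a Mantel test correlating the genetic and geographic distance matrices, with significance assessed by a permutation null. The matrices below are invented to mirror the 15-population linear layout.

```python
# Mantel test: correlate the upper triangles of two distance matrices and
# compare against row/column permutations of one of them.
import numpy as np

def mantel(D1, D2, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(D1, k=1)
    r_obs = np.corrcoef(D1[iu], D2[iu])[0, 1]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(len(D1))
        r = np.corrcoef(D1[p][:, p][iu], D2[iu])[0, 1]
        count += r >= r_obs
    return r_obs, (count + 1) / (n_perm + 1)

# 15 populations on a line: geographic distances + a correlated 'genetic' matrix
pos = np.arange(15, dtype=float)
geo = np.abs(pos[:, None] - pos[None, :])
gen = geo + np.random.default_rng(1).normal(0, 2, geo.shape)
gen = (gen + gen.T) / 2
np.fill_diagonal(gen, 0)
r, p = mantel(geo, gen)
print(f"Mantel r = {r:.2f}, p = {p:.3f}")
```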
Abstract:
The genetic divergence among 20 Eucalyptus spp. clones was evaluated by multivariate techniques based on 167 RAPD markers, of which 155 were polymorphic and 12 monomorphic. Genetic distances were obtained as the arithmetic complement of the Jaccard and the Sørensen (Nei and Li) coefficients and evaluated by the hierarchical methods of Single Linkage clustering and the Unweighted Pair Group Method with Arithmetic Mean (UPGMA). Regardless of the dissimilarity coefficient, the greatest divergence was found between clones 7 and 17 and the smallest between clones 11 and 14. Clone clustering was little influenced by the procedure applied: adopting the same divergence percentage, UPGMA identified two groups fewer with the Sørensen (Nei and Li) coefficient. The clones showed considerable genetic divergence, which is partly associated with the origin of the study material. The clusters formed by the UPGMA algorithm with the arithmetic complement of the Jaccard coefficient were the most consistent.
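A minimal sketch of the dissimilarity step: the arithmetic complement of Jaccard's coefficient on binary band data, followed by UPGMA. The 0/1 band matrix below is invented, not the 167-marker Eucalyptus data.

```python
# Jaccard complement (1 - similarity) on presence/absence bands, then UPGMA.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

# Rows: clones; columns: presence/absence of RAPD bands (invented)
bands = np.array([
    [1, 1, 0, 1, 0, 1, 1, 0],
    [1, 1, 0, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0, 1, 1],
    [0, 1, 1, 0, 1, 0, 1, 0],
], dtype=bool)

d = pdist(bands, metric="jaccard")   # 1 - Jaccard coefficient (the complement)
Z = linkage(d, method="average")     # UPGMA = average linkage
print(dendrogram(Z, no_plot=True)["ivl"])   # leaf order of the resulting tree
```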
Abstract:
The post-processing of association rules is a difficult task, since a huge number of the generated rules are of no interest to the user. Many approaches have been developed to overcome this problem, such as objective measures and clustering. However, objective measures neither reduce nor organize the collection of rules, making it difficult to understand the domain; clustering, on the other hand, neither reduces the exploration space nor directs the user toward interesting knowledge, making the search for relevant knowledge harder. In this context, this paper presents the PAR-COM methodology which, by combining clustering and objective measures, reduces the association rule exploration space and directs the user to what is potentially interesting. An experimental study demonstrates the potential of PAR-COM to minimize the user's effort during post-processing.
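A toy sketch of the combination PAR-COM describes, under assumptions of our own (the rule set, its feature representation and the choice of lift as the objective measure are invented): cluster the rules, then let an objective measure surface the best representative of each cluster, shrinking the space the user must explore.

```python
# Cluster rules by (support, confidence, lift), then rank within clusters.
import numpy as np
from sklearn.cluster import KMeans

rules = {
    "bread -> butter":  (0.30, 0.80, 1.9),
    "bread -> milk":    (0.28, 0.75, 1.7),
    "beer -> diapers":  (0.05, 0.60, 3.2),
    "beer -> chips":    (0.06, 0.55, 2.9),
}
names = list(rules)
X = np.array([rules[n] for n in names])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for c in range(2):   # one representative rule per cluster, ranked by lift
    members = [n for n, l in zip(names, labels) if l == c]
    best = max(members, key=lambda n: rules[n][2])
    print(f"cluster {c}: show '{best}' out of {len(members)} rules")
```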