857 results for clustering and QoS-aware routing
Abstract:
The study of motor unit action potential (MUAP) activity from electromyographic signals is an important stage in neurological investigations that aim to understand the state of the neuromuscular system. In this context, the identification and clustering of MUAPs that exhibit common characteristics, and the assessment of which data features are most relevant to the definition of such cluster structure, are central issues. In this paper, we propose the application of an unsupervised Feature Relevance Determination (FRD) method to the analysis of experimental MUAPs obtained from healthy human subjects. In contrast to approaches that require a priori information about the data, this FRD method is embedded in a constrained mixture model, known as Generative Topographic Mapping, which simultaneously performs clustering and visualization of MUAPs. The experimental results of the analysis of a data set consisting of MUAPs measured from the surface of the First Dorsal Interosseous, a hand muscle, indicate that the MUAP features corresponding to the hyperpolarization period in the physiological process of generation of muscle fibre action potentials are consistently estimated as the most relevant and, therefore, as those that deserve preferential attention in the interpretation of the MUAP groupings.
Abstract:
This paper deals with the selection of centres for radial basis function (RBF) networks. A novel mean-tracking clustering algorithm is described as a way in which centres can be chosen from a batch of collected data. A direct comparison is made between the mean-tracking algorithm and k-means clustering, and it is shown that mean-tracking clustering is significantly better at producing an RBF network that performs accurate function modelling.
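The paper's mean-tracking algorithm is not specified in the abstract, but the k-means baseline it is compared against is standard: centres are initialized from the batch, points are assigned to their nearest centre, and each centre moves to the mean of its cluster. A minimal one-dimensional sketch (function names and data are illustrative, not from the paper):

```python
import random

def kmeans_centres(data, k, iters=50, seed=0):
    """Pick k RBF centres from a batch of 1-D data points via k-means."""
    rng = random.Random(seed)
    centres = rng.sample(data, k)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centre.
        clusters = [[] for _ in range(k)]
        for x in data:
            i = min(range(k), key=lambda j: (x - centres[j]) ** 2)
            clusters[i].append(x)
        # Update step: move each centre to the mean of its cluster
        # (keep the old centre if a cluster ends up empty).
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return sorted(centres)
```

The resulting centres would then serve as the fixed basis-function locations of the RBF network, with only the output weights left to train.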
Abstract:
The Self-Organizing Map (SOM) is a popular unsupervised neural network that provides effective clustering and data visualization for multidimensional input datasets. In this paper, we present an application of the simulated annealing procedure to the SOM learning algorithm, with the aim of obtaining faster learning and better performance in terms of quantization error. The proposed learning algorithm, called Fast Learning Self-Organized Map, preserves the simplicity of the standard SOM's basic learning algorithm. It also improves the quality of the resulting maps by providing better clustering quality and topology preservation of the input multidimensional data. Several experiments compare the proposed approach with the original algorithm and some of its modifications and speed-up techniques.
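The paper's exact annealing schedule is not given in the abstract, but the standard SOM update it modifies is simple: find the best-matching unit (BMU) for each input, then pull the BMU and its neighbours toward the input, with the learning rate and neighbourhood width decaying over time (an annealing-like cooling). A minimal 1-D sketch under those assumptions (the exponential schedules here are illustrative, not the paper's):

```python
import math
import random

def train_som(data, n_units=4, epochs=100, seed=1):
    """Minimal 1-D SOM: units are scalars on a line topology;
    learning rate and neighbourhood width decay ("cool") over epochs."""
    rng = random.Random(seed)
    units = [rng.uniform(min(data), max(data)) for _ in range(n_units)]
    for t in range(epochs):
        lr = 0.5 * math.exp(-3.0 * t / epochs)                  # cooled learning rate
        sigma = max(1e-3, (n_units / 2) * math.exp(-3.0 * t / epochs))
        for x in data:
            # Best-matching unit: the closest unit to the input.
            bmu = min(range(n_units), key=lambda i: (x - units[i]) ** 2)
            # Pull the BMU and its topological neighbours toward x.
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                units[i] += lr * h * (x - units[i])
    return sorted(units)
```

The quantization error the paper optimizes is the average distance from each input to its nearest unit, which the cooling schedule is meant to drive down faster than the standard linear decay.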
Abstract:
Possible future changes in the clustering and return periods (RPs) of European storm series with high potential losses are quantified. Historical storm series are identified using 40 winters of reanalysis. Time series of top events (1-, 2- or 5-year return levels (RLs)) are used to assess RPs of storm series both empirically and theoretically. Additionally, 800 winters of general circulation model simulations for present (1960–2000) and future (2060–2100) climate conditions are investigated. Clustering is identified for most countries, and estimated RPs are similar for reanalysis and present-day simulations. Future changes in RPs are estimated for fixed RLs and fixed loss index thresholds. For the former, shorter RPs are found for Western Europe, but the changes are small and spatially heterogeneous. For the latter, which combines the effects of clustering and event ranking shifts, shorter RPs are found everywhere except for Mediterranean countries. These changes are generally not statistically significant between recent and future climate. However, the RPs for the fixed loss index approach are mostly beyond the range of pre-industrial natural climate variability; this is not true for fixed RLs. The quantification of losses associated with storm series permits a more adequate windstorm risk assessment in a changing climate.
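The empirical side of the RP estimation can be illustrated simply: given per-winter counts of events exceeding a return level, the empirical return period of a storm series (a winter with at least k such events) is the number of winters divided by the number of winters in which the series occurred. This is a generic sketch, not the paper's loss-index methodology:

```python
def empirical_return_period(winter_counts, min_events):
    """Empirical return period (in years) of winters containing at least
    `min_events` top storm events, from a list of per-winter event counts."""
    n_years = len(winter_counts)
    n_hits = sum(1 for c in winter_counts if c >= min_events)
    if n_hits == 0:
        return float('inf')   # series never observed in the record
    return n_years / n_hits
```

Clustering shows up in such counts as winters with several top events occurring more often than a Poisson (serially independent) model would predict, which shortens the empirical RP of multi-event series relative to the theoretical one.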
Abstract:
There is a family of well-known external clustering validity indexes for measuring the degree of compatibility or similarity between two hard partitions of a given data set, including partitions with different numbers of categories. A unified, fully equivalent set-theoretic formulation for an important class of such indexes was derived and extended to the fuzzy domain in a previous work by the author [Campello, R.J.G.B., 2007. A fuzzy extension of the Rand index and other related indexes for clustering and classification assessment. Pattern Recognition Lett., 28, 833-841]. However, the proposed fuzzy set-theoretic formulation is not valid as a general approach for comparing two fuzzy partitions of data. Instead, it is an approach for comparing a fuzzy partition against a hard referential partition of the data into mutually disjoint categories. In this paper, generalized external indexes for comparing two data partitions with overlapping categories are introduced. These indexes can be used as general measures for comparing two partitions of the same data set into overlapping categories. An important issue that is seldom touched upon in the literature is also addressed, namely, how to compare two partitions of different subsamples of data. A number of pedagogical examples and three simulation experiments are presented and analyzed in detail. A review of recent related work compiled from the literature is also provided. (c) 2010 Elsevier B.V. All rights reserved.
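The hard-partition Rand index that this line of work generalizes is easy to state: over all pairs of points, count the fraction of pairs on which the two partitions agree, i.e. the pair is co-clustered in both or separated in both. A minimal sketch of the classical (non-fuzzy) index:

```python
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Rand index between two hard partitions given as label lists:
    the fraction of point pairs on which the partitions agree
    (same cluster in both, or different clusters in both)."""
    pairs = list(combinations(range(len(labels_a)), 2))
    agree = sum(
        1 for i, j in pairs
        if (labels_a[i] == labels_a[j]) == (labels_b[i] == labels_b[j])
    )
    return agree / len(pairs)
```

Note that the index depends only on the co-membership relation, not on label names, which is why it handles partitions with different numbers of categories; extending this pairwise-agreement idea to overlapping or fuzzy memberships is precisely what the paper addresses.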
Abstract:
During the last decade, Internet usage has been growing at an enormous rate, accompanied by the development of network applications (e.g., video conferencing, audio/video streaming, e-learning, e-commerce and real-time applications) carrying several types of information, including data, voice, pictures and media streams. While end-users demand very high quality of service (QoS) from their service providers, the network undergoes complex traffic that leads to transmission bottlenecks. Considerable effort has been made to study the characteristics and behavior of the Internet. Simulation modeling of computer network congestion is a profitable and effective technique that fulfills the requirements for evaluating the performance and QoS of networks. To simulate a single congested link, the simulation is run with a single load generator, while for larger simulations with complex traffic, where nodes are spread across different geographical locations, generating distributed artificial loads is indispensable. One solution is to build a load generation system based on a master/slave architecture.
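The single-congested-link case mentioned above can be sketched as an event-driven queue simulation: one load generator produces packet arrivals, and a single server (the bottleneck link) processes them in order, so delay accumulates whenever arrivals outpace service. This is a generic M/M/1-style sketch, not the system described in the abstract:

```python
import random

def simulate_bottleneck(arrival_rate, service_rate, n_packets=10000, seed=7):
    """Event-driven sketch of a single congested link with Poisson arrivals
    and exponential service times; returns the mean per-packet delay
    (queueing wait plus transmission time)."""
    rng = random.Random(seed)
    t_arrival = 0.0        # time the current packet arrives
    server_free_at = 0.0   # time the link finishes its previous packet
    total_delay = 0.0
    for _ in range(n_packets):
        t_arrival += rng.expovariate(arrival_rate)
        start = max(t_arrival, server_free_at)   # wait if the link is busy
        service = rng.expovariate(service_rate)
        server_free_at = start + service
        total_delay += server_free_at - t_arrival
    return total_delay / n_packets
```

For this model, queueing theory predicts a mean delay of 1/(service_rate - arrival_rate), which gives a sanity check on the simulation; the master/slave architecture in the abstract distributes many such generators across nodes to create complex aggregate traffic.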
Abstract:
Personalized communication is when the marketing message is adapted to each individual by using information from a database and utilizing it in the various media channels available today. That gives the marketer the possibility to create a campaign that cuts through today's clutter of marketing messages and gets the recipient's attention. PODi is a non-profit organization that was started with the aim of contributing knowledge in the field of digital printing technologies. It has created a database of case studies showing companies that have successfully implemented personalized communication in their marketing campaigns. The purpose of the project was therefore to analyze PODi case studies with the main objective of finding out whether and how the PODi cases have been successful and what made them so. To collect the data found in the PODi cases, the authors carried out a content analysis with a sample of 140 PODi cases from the years 2008 to 2010. The study analyzed the cases' measurable indicators of success: response rate, conversion rate, visited PURLs (personalized URLs) and ROI (return on investment). In order to find out whether there were any relationships between the measurable results and the type of industry, campaign objective and media vehicle used in the campaign, the authors formulated different research questions to explore this. After clustering and merging the collected data, the results were found to be quite spread out, but they show that the averages of response rates, visited PURLs and conversion rates were consistently very high. The authors also collected and summarized what the companies themselves claim to be the reasons for success with their marketing campaigns. The results show that the creation of a personalized campaign is complex and dependent on many different variables.
It is, for instance, of great importance to have a well-thought-out plan for the campaign and to have good data and insights about the customer in order to perform creative personalization. It is also important to make it easy for the recipient to reply, to use several media vehicles for multiple touch points, and to have an attractive and clever design.
Abstract:
GPS tracking of mobile objects provides spatial and temporal data for a broad range of applications, including traffic management and control, and transportation routing and planning. Previous transport research has focused on GPS tracking data as an appealing alternative to travel diaries. Moreover, GPS-based data are gradually becoming a cornerstone for real-time traffic management. Tracking data of vehicles from GPS devices are, however, susceptible to measurement errors – a neglected issue in transport research. By conducting a randomized experiment, we assess the reliability of GPS-based traffic data on geographical position, velocity, and altitude for three types of vehicles: bike, car, and bus. We find the geographical positioning reliable, but with an error greater than postulated by the manufacturer and a non-negligible risk of aberrant positioning. Velocity is slightly underestimated, whereas altitude measurements are unreliable.
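Positional reliability of the kind assessed here is typically quantified by comparing each logged fix against a surveyed reference point and measuring the great-circle distance between them. A sketch using the standard haversine formula (the evaluation setup is an assumption, not the paper's exact protocol):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude
    fixes, using the haversine formula on a spherical Earth."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def positioning_errors(fixes, ref):
    """Distance of each GPS fix (lat, lon) from a surveyed reference point."""
    return [haversine_m(lat, lon, ref[0], ref[1]) for lat, lon in fixes]
```

Summary statistics over these per-fix errors (mean, percentiles, and the frequency of large outliers) correspond to the reliability and "aberrant positioning" findings reported in the abstract.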
Abstract:
A great transformation took place in the world with the emergence of the Internet. It caused an astonishing increase in communications, which became real-time and worldwide in reach. It was the democratization of information. The Internet serves as a powerful tool for companies, especially with the globalization of markets, a phenomenon that grows every day and affects everyone. Globalization has made organizations global, and the Internet works as a link between companies, suppliers and consumers. This work consisted of an exploratory survey with the objective of verifying and describing the potential use of the Internet as a tool to support global business in micro, small and medium-sized companies in Porto Alegre. The sample of surveyed companies was drawn from the Trade Point Porto Alegre, an entity whose purpose is to help companies carry out global operations. Regarding the global market, the study found that almost all companies believe it offers business opportunities. The main means of entering this market are participation in trade fairs and business rounds, personal contact, and the Trade Point. The Internet is already widespread in the business community: all companies know it, and a good share of them carry out operations that the network can support, such as communication, product promotion and after-sales follow-up. The study found that micro-enterprises are the ones that least believe in the international market, but they indicated that the Internet can help them in their activities. Small companies, in turn, operate in the international market and believe the Internet can help them in some of their activities. Finally, medium-sized companies also operate in the international market, mainly through exports, and are the ones already using the Internet.
The Trade Point proved to be a service in high demand among companies, especially those engaged in international trade. The main advantages cited were the centralization of information and the generation of new business.
Abstract:
In this text we attempt a critical analysis of knowledge management, by means of case studies of some branches of Banco do Brasil S.A., seeking to identify the presence or absence of elements that characterize a working environment focused on continuous learning. The research was carried out at several branches of Banco do Brasil S.A. in Curitiba, State of Paraná, Brazil. We asked the managers to answer the questions, as they represent the leadership of each branch. The critical points that interfere positively or negatively with daily knowledge management activities were examined, such as: time, branch administration, market changes, information systems, knowledge generation and transmission, and internal and external rules and standards. The conclusion is that knowledge management in the branches studied is still in its infancy, although there is an advanced technical infrastructure and the managers have demonstrated some knowledge of the subject and are aware of its importance for the company. Knowledge management is seen more as an auxiliary technology for current models than as a new way of managing the organization. The most important expected revolution, modifying the way employees think and act, is still at its starting point.
Abstract:
This study analyzes the consumption relations involved in purchases from street vendors (camelôs) in the city of Rio de Janeiro. With the objective of understanding these relations and trying to identify behaviors that resemble one another and that can be recognized as representing one or more of the facets presented by Gabriel and Lang (2006), nine individuals were interviewed. The data were collected through in-depth interviews and analyzed using the technique of content analysis (BARDIN, 2011). The nine camelô consumers were interviewed between January and March 2013 and were aged from 22 to 36. The results show that camelô consumers are informed and know the risks and advantages of this market. In addition, it was possible to perceive the strong presence of different emotions and feelings involved at the moment of purchase. It was thus possible to conclude that the camelô consumer may exhibit behaviors, even simultaneously, that represent one or more of the nine facets proposed by Gabriel and Lang (2006).
Abstract:
Market risk exposure plays a key role in financial institutions' risk management. A possible measure of this exposure is to evaluate the losses likely to be incurred when the prices of the portfolio's assets decline, using Value-at-Risk (VaR) estimates, one of the most prominent measures of financial downside market risk. This paper suggests an evolving possibilistic fuzzy modeling (ePFM) approach for VaR estimation. The approach is based on an extension of possibilistic fuzzy c-means clustering and functional fuzzy rule-based modeling, which employs memberships and typicalities to update clusters and creates new clusters based on a statistical-control, distance-based criterion. ePFM also uses a utility measure to evaluate the quality of the current cluster structure. Computational experiments consider data from the main global equity market indexes of the United States, London, Germany, Spain and Brazil from January 2000 to December 2012 for VaR estimation using ePFM, traditional VaR benchmarks such as Historical Simulation, GARCH, EWMA, and Extreme Value Theory, and state-of-the-art evolving approaches. The results show that ePFM is a potential candidate for VaR modeling, with better performance than the alternative approaches.
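Of the benchmarks listed, Historical Simulation is the simplest to state: the VaR at confidence level alpha is the alpha-quantile of the empirical loss distribution of past returns. A minimal sketch (the quantile convention used here is one common choice, not necessarily the paper's):

```python
def historical_var(returns, alpha=0.99):
    """Historical-simulation VaR: the alpha-quantile of past losses.
    Returns a loss figure (positive means a loss) that past returns
    did not exceed with empirical probability alpha."""
    losses = sorted(-r for r in returns)      # losses, ascending
    k = min(int(alpha * len(losses)), len(losses) - 1)
    return losses[k]
```

ePFM and the other evolving approaches instead model the return process adaptively, re-estimating cluster structure as new observations arrive, which is what allows them to react faster to regime changes than a fixed historical window.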
Social Organizations and access to culture: the case of the Bibliotecas Parque (park libraries) of the State of Rio de Janeiro
Abstract:
This dissertation analyzes the contribution of Social Organizations to access to cultural rights, based on the experience of the Bibliotecas Parque (park libraries) of the State of Rio de Janeiro, in particular those of Manguinhos and Rocinha. While the forms of cooperation for realizing cultural rights are multiple and need to be considered through the interrelation of several actors and aspects, all of them invariably need to flow into management frameworks that enable access to culture. The research adopts the case study method, drawing on bibliographic, documentary and field research. It presents the scenario of the construction of cultural rights, expanding widely in Brazil, and highlights that, to materialize them, it is necessary to study, evaluate and adopt organizational models that are alternatives to the traditional ones characterizing direct and indirect public administration. It addresses the field of management and cultural rights in the context of the three main reforms of the Republican state apparatus, which occurred in the 1930s, 1960s and 1990s, with emphasis on the last, which incorporates the theory of New Public Management, the basis of this dissertation. It focuses on the Social Organization as an alternative model for managing cultural institutions or programs, considering the existing reality, motivations, advantages and perspectives, and offers a narrative of the process of conceiving the legislation of the state of Rio de Janeiro. It examines how these cultural facilities emerged and how the network of Bibliotecas Parque was formed. It describes the process of implementing the Social Organizations of Culture in the state and presents the libraries' managing entity and its relationship with the Secretariat of Culture. It concludes that management mechanisms need to be improved so that the model can, in fact, contribute to access to cultural rights.
Abstract:
In the last decade, mobile wireless communications have witnessed explosive growth in user penetration rates and widespread deployment around the globe. This tendency is expected to continue with the convergence of fixed wired Internet networks with mobile ones and with the evolution to the full IP architecture paradigm. Therefore, mobile wireless communications will be of paramount importance in the development of the information society of the near future. A research topic of particular relevance in telecommunications today is the design and implementation of 4th-generation (4G) mobile communication systems. 4G networks will be characterized by the support of multiple radio access technologies in a core network fully compliant with the Internet Protocol (all-IP paradigm). Such networks will sustain the stringent quality of service (QoS) requirements and the high data rates expected from the type of multimedia applications that will be available in the near future. The approach followed in the design and implementation of the mobile wireless networks of the current generations (2G and 3G) has been the stratification of the architecture into a communication protocol model composed of a set of layers, each encompassing a set of functionalities. In such a layered protocol model, communication is only allowed between adjacent layers and through specific interface service points. This modular concept eases the implementation of new functionalities, as the behaviour of each layer in the protocol stack is not affected by the others. However, the fact that lower layers in the protocol stack do not utilize information available from upper layers, and vice versa, degrades the performance achieved. This is particularly relevant if multiple antenna systems, in a MIMO (Multiple Input Multiple Output) configuration, are implemented.
MIMO schemes introduce another degree of freedom for radio resource allocation: the space domain. Contrary to the time and frequency domains, radio resources mapped into the spatial domain cannot be assumed to be completely orthogonal, due to the interference resulting from users transmitting in the same frequency sub-channel and/or time slots but in different spatial beams. Therefore, the availability of information regarding the state of radio resources, from lower to upper layers, is of fundamental importance in achieving the levels of QoS expected by those multimedia applications. In order to match application requirements to the constraints of the mobile radio channel, in the last few years researchers have proposed a new paradigm for the layered communication architecture: the cross-layer design framework. In general terms, the cross-layer design paradigm refers to a protocol design in which the dependence between protocol layers is actively exploited, breaking the stringent rules that restrict communication to adjacent layers in the original reference model and allowing direct interaction among different layers of the stack. Efficient management of the set of available radio resources demands the implementation of efficient, low-complexity packet schedulers that prioritize users' transmissions according to inputs provided by lower as well as upper layers of the protocol stack, fully compliant with the cross-layer design paradigm. Specifically, efficiently designed packet schedulers for 4G networks should maximize the available capacity while accounting for the limitations imposed by the mobile radio channel and complying with the set of QoS requirements from the application layer. The IEEE 802.16e standard, also known as Mobile WiMAX, appears to comply with the specifications of 4G mobile networks.
Its scalable architecture, low-cost implementation and high data throughput enable efficient data multiplexing and low data latency, attributes essential to broadband data services. Also, the connection-oriented approach of its medium access layer is fully compliant with the quality of service demands of such applications. Therefore, Mobile WiMAX seems to be a promising candidate for 4G mobile wireless networks. This thesis proposes the investigation, design and implementation of packet scheduling algorithms for the efficient management of the set of available radio resources, in the time, frequency and spatial domains of Mobile WiMAX networks. The proposed algorithms combine input metrics from the physical layer and QoS requirements from upper layers, according to the cross-layer design paradigm. The proposed schedulers are evaluated by means of system-level simulations, conducted in a system-level simulation platform implementing the physical and medium access control layers of the IEEE 802.16e standard.
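The thesis' own scheduling algorithms are not detailed in the abstract, but the cross-layer idea it describes can be sketched with a proportional-fair metric augmented by a QoS weight: the physical layer supplies each user's instantaneous achievable rate, upper layers supply a QoS priority, and the scheduler picks the user maximising rate over average throughput, weighted by that priority. All field names and the averaging constant below are illustrative assumptions:

```python
def schedule_slot(users, beta=0.2):
    """Cross-layer scheduler sketch for one transmission slot.

    users: list of dicts with
      'rate'       - instantaneous achievable rate (PHY-layer feedback),
      'avg_tput'   - exponentially averaged throughput (fairness state),
      'qos_weight' - priority from upper layers (QoS requirements).
    Picks the user maximising qos_weight * rate / avg_tput,
    updates every user's throughput average, and returns the chosen index."""
    def metric(u):
        return u['qos_weight'] * u['rate'] / max(u['avg_tput'], 1e-9)
    chosen = max(range(len(users)), key=lambda i: metric(users[i]))
    for i, u in enumerate(users):
        served = u['rate'] if i == chosen else 0.0
        u['avg_tput'] = (1 - beta) * u['avg_tput'] + beta * served
    return chosen
```

Dividing by the average throughput keeps starved users competitive even with mediocre channels, while the QoS weight lets delay-sensitive flows pre-empt best-effort traffic, which is the kind of PHY/application trade-off the cross-layer paradigm is meant to enable.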
Abstract:
Ubiquitous computing raises new usability challenges that cut across design and development. We are particularly interested in environments enhanced with sensors, public displays and personal devices. How can prototypes be used to explore users' mobility and interaction, both explicit and implicit, in accessing services within these environments? Because of the potential cost of development and design failure, these systems must be explored using early assessment techniques and versions of the systems that would be disruptive if deployed in the target environment. These techniques are required to evaluate alternative solutions before making the decision to deploy the system on location. This is crucial for successful development that anticipates potential user problems and reduces the cost of redesign. This thesis reports on the development of a framework for the rapid prototyping and analysis of ubiquitous computing environments that facilitates the evaluation of design alternatives. It describes APEX, a framework that brings together an existing 3D Application Server with a modelling tool. APEX-based prototypes enable users to navigate a virtual world simulation of the envisaged ubiquitous environment. By this means users can experience many of the features of the proposed design. Prototypes and their simulations are generated in the framework to help the developer understand how the user might experience the system. These are supported through three different layers: a simulation layer (using a 3D Application Server); a modelling layer (using a modelling tool); and a physical layer (using external devices and real users). APEX allows the developer to move between these layers to evaluate different features. It supports exploration of user experience through observation of how users might behave with the system, as well as enabling exhaustive analysis based on models. The models support the checking of properties based on patterns.
These patterns are based on ones that have been used successfully in interactive system analysis in other contexts. They help the analyst to generate and verify relevant properties. Where these properties fail, the scenarios suggested by the failure provide an important aid to redesign.