902 results for Math Applications in Computer Science
Abstract:
Pocket Data Mining (PDM) describes the full process of analysing data streams in mobile ad hoc distributed environments. Advances in mobile devices such as smartphones and tablet computers have made it possible for a wide range of applications to run in such an environment. In this paper, we propose the adoption of data stream classification techniques for PDM. A thorough experimental study shows that running heterogeneous (different) or homogeneous (similar) data stream classification techniques over vertically partitioned data (data partitioned according to the feature space) yields performance comparable to batch, centralised learning techniques.
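The core idea of classifying vertically partitioned data can be sketched as follows: each hypothetical device trains a learner on its own slice of the feature space, and predictions are combined by majority vote. The data, the partitioning and the nearest-centroid learner below are illustrative stand-ins, not the paper's actual classifiers:

```python
# Each learner sees only a vertical slice (a subset of features) of the data;
# predictions are combined across learners by majority vote.
data = [
    ([1.0, 0.2, 5.1, 0.9], 1),
    ([0.9, 0.1, 4.8, 1.1], 1),
    ([0.1, 1.1, 1.2, 4.0], 0),
    ([0.2, 0.9, 0.8, 4.2], 0),
]
partitions = [[0, 1], [2, 3]]   # feature indices assigned to each device

def centroid_model(samples, feats):
    """Train a nearest-centroid classifier on one vertical partition."""
    sums, counts = {}, {}
    for x, y in samples:
        v = [x[i] for i in feats]
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [a + b for a, b in zip(sums.get(y, [0.0] * len(v)), v)]
    cents = {y: [s / counts[y] for s in sums[y]] for y in sums}
    def predict(x):
        v = [x[i] for i in feats]
        return min(cents, key=lambda y: sum((a - b) ** 2
                                            for a, b in zip(v, cents[y])))
    return predict

models = [centroid_model(data, f) for f in partitions]

def ensemble_predict(x):
    votes = [m(x) for m in models]
    return max(set(votes), key=votes.count)

print(ensemble_predict([1.0, 0.15, 5.0, 1.0]))   # -> 1
```

Each model only ever touches its own feature indices, mirroring how PDM agents on separate devices would each hold part of the feature space.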
Abstract:
The Twitter network has been described as the most widely used microblogging application today. With an estimated 500 million registered users as of June 2012, Twitter has become a credible medium for expressing sentiment and opinion. It is also a notable medium for information dissemination, including breaking news on diverse issues, since it was launched in 2007. Many organisations, individuals and even government bodies follow activities on the network in order to learn how their audience reacts to tweets that affect them. Postings on Twitter (known as tweets) can be used to analyse patterns associated with events by detecting the dynamics of the tweets. A common way of labelling a tweet is to include a number of hashtags that describe its contents. Association Rule Mining can find the likelihood of co-occurrence of hashtags. In this paper, we propose the use of temporal Association Rule Mining to detect rule dynamics and, consequently, the dynamics of tweets. We have coined our methodology Transaction-based Rule Change Mining (TRCM). A number of patterns are identifiable in these rule dynamics, including new rules, emerging rules, unexpected rules and 'dead' rules. The linkage between the different types of rule dynamics is also investigated experimentally in this paper.
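The underlying mechanism of comparing rules mined in consecutive time windows can be sketched in a few lines. This toy version only tracks frequent hashtag pairs and classifies them as new, dead or persistent; the tweets and the support threshold are invented for illustration, and TRCM itself works on full association rules with support and confidence measures:

```python
from itertools import combinations

# Hypothetical hashtag "transactions" from two consecutive time windows.
window1 = [{"#ukraine", "#news"}, {"#ukraine", "#news"}, {"#olympics", "#london"}]
window2 = [{"#ukraine", "#peace"}, {"#olympics", "#london"}, {"#olympics", "#london"}]

def pair_rules(transactions, min_support=2):
    """Return hashtag pairs co-occurring at least min_support times."""
    counts = {}
    for t in transactions:
        for pair in combinations(sorted(t), 2):
            counts[pair] = counts.get(pair, 0) + 1
    return {p for p, c in counts.items() if c >= min_support}

old, new = pair_rules(window1), pair_rules(window2)
print("new rules:", new - old)    # appear only in the later window
print("dead rules:", old - new)   # no longer hold in the later window
print("persistent:", old & new)
```

Running this reports the "#london"/"#olympics" pair as a new rule and the "#news"/"#ukraine" pair as a dead one, the simplest two of the rule-change types the paper distinguishes.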
Abstract:
The objective of this book is to present the quantitative techniques that are commonly employed in empirical finance research, together with real-world, state-of-the-art research examples. Each chapter is written by international experts in their fields. The unique approach is to describe a question or issue in finance and then to demonstrate the methodologies that may be used to solve it. All of the techniques described are used to address real problems rather than being presented for their own sake, and the areas of application have been carefully selected so that a broad range of methodological approaches can be covered. The book is aimed primarily at doctoral researchers and academics engaged in conducting original empirical research in finance. In addition, it will be useful to researchers in the financial markets and to advanced Master's-level students who are writing dissertations.
Abstract:
SOA (Service Oriented Architecture), workflow, the Semantic Web, and Grid computing are key enabling information technologies in the development of increasingly sophisticated e-Science infrastructures and application platforms. While the emergence of Cloud computing as a new computing paradigm has provided new directions and opportunities for e-Science infrastructure development, it also presents some challenges. Scientific research increasingly finds it difficult to handle “big data” using traditional data processing techniques. Such challenges demonstrate the need for a comprehensive analysis of how the above-mentioned informatics techniques can be used to develop appropriate e-Science infrastructures and platforms in the context of Cloud computing. This survey paper describes recent research advances in applying informatics techniques to facilitate scientific research, particularly from the Cloud computing perspective. Our contributions include identifying associated research challenges and opportunities, presenting lessons learned, and describing our future vision for applying Cloud computing to e-Science. We believe our research findings can help indicate the future trend of e-Science, and can inform funding and research directions on how to employ computing technologies in scientific research more appropriately. We point out open research issues in the hope of sparking new development and innovation in the e-Science field.
Abstract:
This paper describes an approach to teaching and learning that combines elements of ludic engagement, gamification and digital creativity in order to make the learning of a serious subject a fun, interactive and inclusive experience for students regardless of their gender, age, culture, experience or any disabilities that they may have. This approach has been successfully used to teach software engineering to first year students but could in principle be transferred to any subject or discipline.
Abstract:
Melt-polycondensation of succinic acid anhydride with oxazoline-based diol monomers gave hyperbranched polymers with carboxylic acid terminal groups. ¹H NMR and quantitative ¹³C NMR spectroscopy, coupled with DEPT-135 ¹³C NMR experiments, showed high degrees of branching (over 60%). Esterification of the acid end groups by addition of citronellol at 160 °C produced novel white-spirit-soluble resins, which were characterized by Fourier transform infrared (FTIR) spectroscopy, gel permeation chromatography (GPC), differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). Blends of the new hyperbranched materials with commercial alkyd resins resulted in a dramatic, concentration-dependent drop in viscosity. Solvent-borne coatings were formulated containing the hyperbranched polymers. Dynamic mechanical analysis studies revealed that the air-drying rates of the new coating systems were enhanced compared with identical formulations containing only commercial alkyd resins.
Abstract:
Techniques to retrieve reliable images from complicated objects are described, overcoming problems introduced by uneven surfaces, giving enhanced depth resolution and improving image contrast. The techniques are illustrated with application to THz imaging of concealed wall paintings.
Abstract:
This paper presents a novel mobile sink area allocation scheme for consumer-based mobile robotic devices, with a proven application to robotic vacuum cleaners. In a home or office environment, rooms are physically separated by walls, and an automated robotic cleaner cannot decide which room to move to in order to perform its cleaning task. Likewise, state-of-the-art cleaning robots do not move to other rooms without direct human intervention. In a smart home monitoring system, sensor nodes may be deployed to monitor each separate room. In this work, a quad-tree-based data gathering scheme is proposed whereby the mobile sink physically moves through every room and logically links all separated sub-networks together. The proposed scheme sequentially collects data from the monitored environment and transmits the information back to a base station. Based on the sensor node information, the base station can command a cleaning robot to move to a specific location in the home environment. The quad-tree-based data gathering scheme minimizes the data gathering tour length and time through the efficient allocation of data gathering areas. A calculated shortest-path data gathering tour can be efficiently allocated to the robotic cleaner so that it completes the cleaning task within a minimum time period. Simulation results show that the proposed scheme can effectively allocate and control the cleaning area for the robot vacuum cleaner without any direct intervention from the consumer. The performance of the proposed scheme is then validated with a set of practical sequential data gathering tours in a typical office/home environment.
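The data gathering tour described above can be sketched with a simple greedy nearest-neighbour heuristic over room waypoints. This is a minimal stand-in for illustration only; the room coordinates are invented, and the paper's scheme uses quad-tree-based area allocation rather than this heuristic:

```python
import math

# Hypothetical waypoints (room centres) the mobile sink must visit,
# starting from and returning to a base station at the origin.
rooms = [(2, 1), (6, 1), (6, 5), (2, 5)]

def nn_tour(start, points):
    """Greedy nearest-neighbour tour: repeatedly visit the closest
    unvisited waypoint, then return to the base station."""
    tour, remaining, cur = [start], list(points), start
    length = 0.0
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(cur, p))
        length += math.dist(cur, nxt)
        tour.append(nxt)
        remaining.remove(nxt)
        cur = nxt
    length += math.dist(cur, start)   # close the loop back to base
    tour.append(start)
    return tour, length

tour, length = nn_tour((0, 0), rooms)
print(tour, round(length, 2))
```

The printed length is the total travel distance of one sequential data gathering round, the quantity the paper's allocation scheme seeks to minimize.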
Abstract:
Nonlinear data assimilation is high on the agenda in all fields of the geosciences: with ever-increasing model resolution, the inclusion of more physical (biological, etc.) processes, and more complex observation operators, the data-assimilation problem becomes more and more nonlinear. The suitability of particle filters for solving the nonlinear data assimilation problem in high-dimensional geophysical problems will be discussed. Several existing and new schemes will be presented, and it is shown that at least one of them, the Equivalent-Weights Particle Filter, does indeed beat the curse of dimensionality and provides a way forward for solving the problem of nonlinear data assimilation in high-dimensional systems.
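The baseline mechanism that such schemes build on can be sketched as a standard bootstrap particle filter on a one-dimensional nonlinear toy model. The model, noise levels and particle count below are illustrative assumptions; the Equivalent-Weights Particle Filter modifies the proposal and weighting steps precisely to avoid the weight degeneracy that plagues this basic filter in high dimensions, and is not reproduced here:

```python
import math, random

random.seed(0)
N = 500                                   # number of particles

def f(x):                                 # nonlinear state transition (toy model)
    return 0.5 * x + 25 * x / (1 + x * x)

def h(x):                                 # nonlinear observation operator
    return x * x / 20.0

# Simulate a short truth trajectory and noisy observations of it.
truth, obs, x = [], [], 0.1
for _ in range(20):
    x = f(x) + random.gauss(0, 1.0)
    truth.append(x)
    obs.append(h(x) + random.gauss(0, 1.0))

particles = [random.gauss(0, 2) for _ in range(N)]
estimates = []
for y in obs:
    # Forecast: propagate each particle through the model with noise.
    particles = [f(p) + random.gauss(0, 1.0) for p in particles]
    # Weight by the Gaussian observation likelihood (obs error sd = 1).
    w = [math.exp(-0.5 * (y - h(p)) ** 2) for p in particles]
    s = sum(w)
    if s == 0.0:
        w = [1.0 / N] * N                 # guard against total weight underflow
    else:
        w = [wi / s for wi in w]
    estimates.append(sum(wi * p for wi, p in zip(w, particles)))
    # Resample (multinomial) to combat weight degeneracy.
    particles = random.choices(particles, weights=w, k=N)

print(estimates[-1])
```

In one dimension the resampling step keeps the ensemble healthy; the "curse of dimensionality" the abstract refers to is that, as the state and observation dimensions grow, almost all weight collapses onto a single particle.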
Abstract:
With the fast development of wireless communications, ZigBee and semiconductor devices, home automation networks have recently become very popular. Since typical consumer products deployed in home automation networks are often powered by tiny, limited batteries, one of the most challenging research issues concerns reducing and balancing energy consumption across the network in order to prolong the home network lifetime for consumer devices. The introduction of clustering and sink mobility techniques into home automation networks has been shown to be an efficient way to improve network performance and has received significant research attention. Taking inspiration from nature, this paper proposes an Ant Colony Optimization (ACO) based clustering algorithm with mobile sink support specifically for home automation networks. In this work, the network is divided into several clusters and cluster heads are selected within each cluster. A mobile sink then communicates with each cluster head to collect data directly through short-range communications. The ACO algorithm is used to find the optimal mobility trajectory for the mobile sink. Extensive simulation results show that, when using mobile sinks, the proposed algorithm significantly improves home network performance in terms of energy consumption and network lifetime compared to other routing algorithms currently deployed for home automation networks.
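Finding a short visiting order over cluster heads with ACO can be sketched as follows. This is a generic textbook-style ant system for a tiny tour problem; the cluster-head coordinates, parameter values and plain tour formulation are illustrative assumptions, not the paper's algorithm:

```python
import math, random

random.seed(42)

# Hypothetical cluster-head coordinates (metres) in a home layout.
heads = [(0, 0), (5, 0), (5, 4), (0, 4), (2, 8)]
n = len(heads)
d = [[math.dist(heads[i], heads[j]) for j in range(n)] for i in range(n)]
tau = [[1.0] * n for _ in range(n)]       # pheromone on each edge
ALPHA, BETA, RHO, Q = 1.0, 2.0, 0.5, 10.0  # illustrative ACO parameters

def build_tour():
    """One ant builds a tour, biased by pheromone and inverse distance."""
    tour, unvisited = [0], set(range(1, n))
    while unvisited:
        i, cand = tour[-1], list(unvisited)
        weights = [tau[i][j] ** ALPHA * (1.0 / d[i][j]) ** BETA for j in cand]
        nxt = random.choices(cand, weights)[0]
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_len(t):
    return sum(d[t[k]][t[(k + 1) % n]] for k in range(n))

best, best_len = None, float("inf")
for _ in range(100):                       # iterations
    tours = [build_tour() for _ in range(10)]   # ants per iteration
    for t in tours:
        L = tour_len(t)
        if L < best_len:
            best, best_len = t, L
    # Evaporate, then deposit pheromone proportional to tour quality.
    for i in range(n):
        for j in range(n):
            tau[i][j] *= (1 - RHO)
    for t in tours:
        L = tour_len(t)
        for k in range(n):
            a, b = t[k], t[(k + 1) % n]
            tau[a][b] += Q / L
            tau[b][a] += Q / L

print(best, round(best_len, 2))
```

The pheromone update gradually concentrates probability mass on edges that appear in short tours, which is the mechanism the paper exploits to shape the mobile sink's trajectory.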
Abstract:
Liquidity is a fundamentally important facet of investments, but there is no single measure that quantifies it perfectly. Instead, a range of measures are necessary to capture different dimensions of liquidity such as the breadth and depth of markets, the costs of transacting, the speed with which transactions can occur and the resilience of prices to trading activity. This article considers how different dimensions have been measured in financial markets and for various forms of real estate investment. The purpose of this exercise is to establish the range of liquidity measures that could be used for real estate investments before considering which measures and questions have been investigated so far. Most measures reviewed here are applicable to public real estate, but not all can be applied to private real estate assets or funds. Use of a broader range of liquidity measures could help real estate researchers tackle issues such as quantification of illiquidity premiums for the real estate asset class or different types of real estate, and how liquidity differences might be incorporated into portfolio allocation models.
Abstract:
This article explores the way users of an online gay chat room negotiate the exchange of photographs and the conduct of video conferencing sessions and how this negotiation changes the way participants manage their interactions and claim and impute social identities. Different modes of communication provide users with different resources for the control of information, affecting not just what users are able to reveal, but also what they are able to conceal. Thus, the shift from a purely textual mode for interacting to one involving visual images fundamentally changes the kinds of identities and relationships available to users. At the same time, the strategies users employ to negotiate these shifts of mode can alter the resources available in different modes. The kinds of social actions made possible through different modes, it is argued, are not just a matter of the modes themselves but also of how modes are introduced into the ongoing flow of interaction.
Abstract:
The aim of this paper is to give an overview of the issues and actions concerning Brazilian cultural heritage, and then to discuss the contributions and relationships that may be established from the principles of Information Science. The first part concerns the relationship between heritage and the concept of the document; the second relates documentary processes to the information scientist; and finally, an approach to the mediation and appropriation of cultural heritage is presented.
Abstract:
Teaching and learning with the history and philosophy of science (HPS) has been, and continues to be, supported by science educators. While science education standards documents in many countries also stress the importance of teaching and learning with HPS, the approach still suffers from ineffective implementation in school science teaching. In order to better understand this problem, an analysis of the obstacles to implementing HPS in classrooms was undertaken. The obstacles taken into account were structured in four groups: (1) the culture of teaching physics; (2) teachers' skills, and their epistemological and didactical attitudes and beliefs; (3) the institutional framework of science teaching; and (4) textbooks as fundamental didactical support. Implications for more effective implementation of HPS are presented, taking the social nature of educational systems into account.