530 results for "extraction methods"
Abstract:
In this paper, we present WebPut, a prototype system that adopts a novel web-based approach to the data imputation problem. To this end, WebPut utilizes the available information in an incomplete database in conjunction with the data consistency principle. Moreover, WebPut extends effective Information Extraction (IE) methods to formulate web search queries capable of retrieving missing values with high accuracy. WebPut employs a confidence-based scheme that efficiently leverages our suite of data imputation queries to automatically select the most effective imputation query for each missing value. A greedy iterative algorithm is also proposed to schedule the imputation order of the different missing values in a database, and in turn the issuing of their corresponding imputation queries, improving both the accuracy and efficiency of WebPut. Experiments based on several real-world data collections demonstrate that WebPut outperforms existing approaches.
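As a rough illustration of the confidence-based greedy scheduling idea, the sketch below orders missing cells so that the cell with the most confident available query is imputed first, letting early answers raise the confidence of later queries. The confidence function, the cell representation, and the lookup stub are all illustrative assumptions, not WebPut's actual components.

```python
# Hedged sketch of a greedy, confidence-based imputation scheduler in the
# spirit of the abstract's algorithm. All names here are illustrative.

def best_query(cell, rows):
    """Return (confidence, query) for the most promising imputation query.
    Here confidence simply grows with the number of already-known attributes
    in the same row -- a stand-in for WebPut's learned confidence scores."""
    row = rows[cell[0]]
    filled = sum(1 for v in row.values() if v is not None)
    return filled / len(row), f"lookup row={cell[0]} attr={cell[1]}"

def greedy_impute(rows):
    """rows: list of dicts with None for missing values. Imputes cells in
    order of decreasing query confidence, rescoring after each imputation."""
    schedule = []
    missing = [(i, a) for i, r in enumerate(rows)
               for a, v in r.items() if v is None]
    while missing:
        conf, cell = max((best_query(c, rows)[0], c) for c in missing)
        missing.remove(cell)
        i, attr = cell
        rows[i][attr] = f"<imputed:{attr}>"  # stand-in for issuing a web query
        schedule.append((cell, round(conf, 2)))
    return schedule
```

In this toy scheduler, a row with two of three attributes known is imputed before a row with only one known attribute, mirroring the intuition that better-specified queries should be issued first.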
Abstract:
The rapid increase in the deployment of CCTV systems has led to a greater demand for algorithms that can process incoming video feeds and extract information of interest for human operators. Over the past several years there has been a large effort to detect abnormal activities using computer vision techniques. Typically, the problem is formulated as a novelty detection task in which the system is trained on normal data and is required to detect events that do not fit the learned 'normal' model. Many researchers have tried various sets of features to train different learning models to detect abnormal behaviour in video footage. In this work we propose using a Semi-2D Hidden Markov Model (HMM) to model the normal activities of people; outliers with insufficient likelihood under the model are identified as abnormal activities. Our Semi-2D HMM is designed to model both the temporal and spatial causalities of crowd behaviour by assuming that the current state of the HMM depends not only on the previous state in the temporal direction, but also on the previous states of the adjacent spatial locations. Two different HMMs are trained to model the vertical and horizontal spatial causal information. Location features, flow features and optical-flow textures are used as the features for the model. The proposed approach is evaluated on the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
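The likelihood-thresholding step can be sketched with an ordinary first-order HMM standing in for the Semi-2D model: score a sequence with the forward algorithm and flag it as abnormal when its per-frame log-likelihood falls below a threshold. The model parameters and threshold below are illustrative assumptions only.

```python
import numpy as np

# Hedged sketch: novelty detection by HMM log-likelihood thresholding.
# A plain first-order discrete-emission HMM stands in for the paper's
# Semi-2D HMM; the matrices and threshold are illustrative.

def log_likelihood(obs, pi, A, B):
    """Forward algorithm in log space.
    obs: sequence of symbol indices; pi: initial probs (K,);
    A: transition matrix (K, K); B: emission matrix (K, M)."""
    log_alpha = np.log(pi) + np.log(B[:, obs[0]])
    for o in obs[1:]:
        # logsumexp over previous states, then add the emission term
        log_alpha = np.log(B[:, o]) + \
            np.logaddexp.reduce(log_alpha[:, None] + np.log(A), axis=0)
    return np.logaddexp.reduce(log_alpha)

def is_abnormal(obs, pi, A, B, threshold):
    """Flag a sequence whose per-frame log-likelihood is below threshold."""
    return log_likelihood(obs, pi, A, B) / len(obs) < threshold
```

A model trained on normal footage assigns high likelihood to familiar dynamics; a sequence that keeps switching against the learned transition structure scores much lower and is flagged.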
Abstract:
Spatio-temporal interest points are the most popular feature representation in the field of action recognition. A variety of methods have been proposed to detect and describe local patches in video, with several techniques reporting state-of-the-art performance for action recognition. However, the reported results are obtained under different experimental settings with different datasets, making it difficult to compare the various approaches. We therefore comprehensively evaluate state-of-the-art spatio-temporal features under a common evaluation framework, using popular benchmark datasets (KTH, Weizmann) and more challenging datasets such as Hollywood2. The purpose of this work is to provide guidance for researchers when selecting features for different applications with different environmental conditions. We evaluate four popular descriptors (HOG, HOF, HOG/HOF, HOG3D) using a popular bag-of-visual-features representation and Support Vector Machines (SVM) for classification. Moreover, we provide an in-depth analysis of local feature descriptors and optimize the codebook sizes for different datasets with different descriptors. We demonstrate that motion-based features offer better performance than those that rely solely on spatial information, while features that combine both types of data are more consistent across a variety of conditions, but typically require a larger codebook for optimal performance.
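The bag-of-visual-features stage of such a pipeline can be sketched as follows: cluster local descriptors into a codebook with k-means, then represent each video as a normalised histogram of codeword assignments. Random vectors stand in here for real HOG/HOF descriptors, and `k` plays the role of the codebook size the evaluation tunes per dataset; the SVM stage is omitted.

```python
import numpy as np

# Hedged sketch of the bag-of-visual-features representation: a tiny
# k-means codebook plus codeword histograms. Descriptors are random
# stand-ins for HOG/HOF features extracted at interest points.

def kmeans(X, k, iters=20, seed=0):
    """Minimal Lloyd's k-means returning the (k, d) codebook."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each descriptor to its nearest centre
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):  # recompute centres, skipping empty clusters
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return centers

def bow_histogram(descriptors, codebook):
    """L1-normalised histogram of nearest-codeword assignments."""
    d = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    hist = np.bincount(d.argmin(1), minlength=len(codebook)).astype(float)
    return hist / hist.sum()
```

Each video's histogram is then a fixed-length vector suitable as SVM input; enlarging `k` gives finer quantisation at the cost of sparser, higher-dimensional histograms, which is the trade-off behind the codebook-size tuning discussed above.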
Abstract:
This chapter explores the objectives, principles and methods of climate law. The United Nations Framework Convention on Climate Change (UNFCCC) lays the foundations of the international regime by setting out its ultimate objectives in Article 2, the key principles in Article 3, and the methods of the regime in Article 4. The ultimate objective of the regime – to avoid dangerous anthropogenic interference – is examined, and assessments of the Intergovernmental Panel on Climate Change (IPCC) are considered when seeking to understand the definition of this concept. The international environmental principles of state sovereignty and responsibility, preventative action, cooperation, sustainable development, precaution, polluter pays and common but differentiated responsibility are then examined, and their incorporation within the international climate regime instruments evaluated. This is followed by an examination of the methods used by the mitigation and adaptation regimes in seeking to achieve the objective of the UNFCCC. Methods of the mitigation regime include domestic implementation of policies, setting of standards and targets, allocation of rights, use of flexibility mechanisms, and reporting. While it is noted that methods of the adaptation regime are still evolving, the latter include measures such as impact assessments, national adaptation plans and the provision of funding.
Abstract:
Cancer poses an undeniable burden to the health and wellbeing of the Australian community. According to a recent report commissioned by the Australian Institute of Health and Welfare (AIHW, 2010), one in every two Australians on average will be diagnosed with cancer by the age of 85, making cancer the second leading cause of death in 2007, preceded only by cardiovascular disease. Despite modest decreases in standardised combined cancer mortality over the past few decades, in part due to increased funding and access to screening programs, cancer remains a significant economic burden. In 2010, all cancers accounted for an estimated 19% of the country's total burden of disease, equating to approximately $3.8 billion in direct health system costs (Cancer Council Australia, 2011). Furthermore, there remain established socio-economic and other demographic inequalities in cancer incidence and survival, for example by indigenous status and rurality. Therefore, in the interests of the nation's health and economic management, there is an immediate need to devise data-driven strategies to not only understand the socio-economic drivers of cancer but also facilitate the implementation of cost-effective resource allocation for cancer management...
Abstract:
This case-study explores alternative and experimental methods of research data acquisition through an emerging research methodology, 'Guerrilla Research Tactics' [GRT]. The premise is that the researcher develops covert tactics for attracting and engaging with research participants. These methods range from simple analogue interventions to physical bespoke artefacts that contain an embedded digital link to a live, interactive data-collecting resource, such as an online poll or survey. These artefacts are purposefully placed in environments where the researcher anticipates an encounter and response from the potential research participant; the choice of design and placement of artefacts is specific and intentional. This case-study assesses the application of GRT methodology as an alternative, engaging and interactive method of data acquisition for higher degree research. Extending Gauntlett's definition of 'new creative methods… an alternative to language driven qualitative research methods' (2007), this case-study contributes to the existing body of literature addressing creative and interactive approaches to HDR data collection. The case-study was undertaken with Masters of Architecture and Urban Design research students at QUT in 2012. Typically, students within these creative disciplines view research as a taxing and boring process, distracting them from their studio design focus. An obstacle that many students face is acquiring data from their intended participant groups. In response to these challenges, the authors worked with students to develop research methods that are creative, fun and engaging for both the students and their research participants.
GRT is influenced by and developed from a combination of participatory action research (Kindon, 2008) and unobtrusive research methods (Kellehear, 1993) to enhance social research, taking unobtrusive research in a new direction beyond typical social research methods. The Masters research students developed alternative methods for acquiring data, which relied on a combination of analogue design interventions and online platforms commonly distributed through social networks. They identified critical issues that required action by the community, and the processes they developed focused on engaging with communities to propose solutions. Key characteristics shared between GRT and Guerrilla Activism are notions of political issues, the unexpected, the unconventional, and being interactive, unique and thought-provoking. The trend of Guerrilla Activism has been adapted to marketing, communication, gardening, craftivism, theatre, poetry, and art. Focusing on the action element and examining current trends within Guerrilla marketing, we believe that GRT can be applied to a range of research areas across various academic disciplines.
Abstract:
Raman spectroscopy, X-ray diffraction (XRD), and scanning electron microscopy (SEM) have been used to compare samples of YBa2Cu3O7 (YBCO) synthesised by the solid-state method and by a novel co-precipitation technique. XRD results indicate that the YBCO prepared by these two methods is phase pure; however, the Raman and SEM results show marked differences between the samples.
Abstract:
This paper gives a review of recent progress in the design of numerical methods for computing the trajectories (sample paths) of solutions to stochastic differential equations. We give a brief survey of the area focusing on a number of application areas where approximations to strong solutions are important, with a particular focus on computational biology applications, and give the necessary analytical tools for understanding some of the important concepts associated with stochastic processes. We present the stochastic Taylor series expansion as the fundamental mechanism for constructing effective numerical methods, give general results that relate local and global order of convergence and mention the Magnus expansion as a mechanism for designing methods that preserve the underlying structure of the problem. We also present various classes of explicit and implicit methods for strong solutions, based on the underlying structure of the problem. Finally, we discuss implementation issues relating to maintaining the Brownian path, efficient simulation of stochastic integrals and variable-step-size implementations based on various types of control.
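The simplest strong scheme obtained by truncating the stochastic Taylor expansion is the Euler-Maruyama method. As a minimal sketch (not the survey's own code), the example below applies it to geometric Brownian motion and measures the pathwise (strong) error against the exact solution driven by the same Brownian increments.

```python
import numpy as np

# Hedged sketch: Euler-Maruyama for dX = mu*X dt + sigma*X dW.
# Geometric Brownian motion has the exact solution
#   X_t = X_0 * exp((mu - sigma^2/2) t + sigma W_t),
# so the strong (pathwise) error can be measured on the same Brownian path.

def euler_maruyama(f, g, x0, T, n, dW):
    """One trajectory of dX = f(X) dt + g(X) dW with n steps and
    pre-generated Brownian increments dW (shape (n,))."""
    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x[i + 1] = x[i] + f(x[i]) * dt + g(x[i]) * dW[i]
    return x

mu, sigma, x0, T, n = 0.5, 0.3, 1.0, 1.0, 2000
rng = np.random.default_rng(0)
dW = rng.normal(0.0, np.sqrt(T / n), n)
path = euler_maruyama(lambda x: mu * x, lambda x: sigma * x, x0, T, n, dW)

# exact solution on the same Brownian path
W = np.concatenate(([0.0], np.cumsum(dW)))
t = np.linspace(0.0, T, n + 1)
exact = x0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * W)
strong_err = abs(path[-1] - exact[-1])
```

Because the scheme has strong order 0.5, halving the step size shrinks this pathwise error by roughly a factor of √2, which is the kind of local-to-global convergence relation the survey's general results formalise.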
Abstract:
The pioneering work of Runge and Kutta a hundred years ago has ultimately led to suites of sophisticated numerical methods suitable for solving complex systems of deterministic ordinary differential equations. However, in many modelling situations the appropriate representation is a stochastic differential equation, and here numerical methods are much less sophisticated. In this paper a very general class of stochastic Runge-Kutta methods is presented, and classes of explicit methods that are much more efficient than previously extant methods are constructed. In particular, a method of strong order 2 with a deterministic component based on the classical Runge-Kutta method is constructed, and some numerical results are presented to demonstrate the efficacy of this approach.
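As a hedged illustration of a Runge-Kutta-type strong scheme (not the strong order-2 method constructed in the paper, which needs more stages), the sketch below implements Platen's explicit strong order-1.0 scheme: a derivative-free Milstein method that replaces the diffusion derivative with a finite difference through a supporting stage, exactly the stage-based idea that stochastic Runge-Kutta methods generalise.

```python
import numpy as np

# Hedged sketch: Platen's explicit strong order-1.0 scheme for
# dX = f(X) dt + g(X) dW. A supporting stage replaces the diffusion
# derivative g'(x) in the Milstein correction term.

def platen_strong1(f, g, x0, T, n, dW):
    dt = T / n
    sdt = np.sqrt(dt)
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        xi = x[i]
        support = xi + f(xi) * dt + g(xi) * sdt  # supporting RK stage
        x[i + 1] = (xi + f(xi) * dt + g(xi) * dW[i]
                    + (g(support) - g(xi)) * (dW[i]**2 - dt) / (2 * sdt))
    return x

# demo on geometric Brownian motion, where the exact solution is known
mu, sigma, x0, T, n = 0.5, 0.3, 1.0, 1.0, 1000
rng = np.random.default_rng(0)
dW = rng.normal(0.0, np.sqrt(T / n), n)
path = platen_strong1(lambda x: mu * x, lambda x: sigma * x, x0, T, n, dW)
W = np.concatenate(([0.0], np.cumsum(dW)))
t = np.linspace(0.0, T, n + 1)
exact = x0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * W)
err = abs(path[-1] - exact[-1])
```

The extra stage raises the strong order from 0.5 (Euler-Maruyama) to 1.0 without requiring the derivative of the diffusion coefficient; the paper's higher-order methods extend the same stage construction further.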