547 results for least common subgraph algorithm


Relevance:

20.00%

Publisher:

Abstract:

A fundamental problem faced by stereo vision algorithms is that of determining correspondences between the two images that comprise a stereo pair. This paper presents work towards the development of a new matching algorithm based on the rank transform. The algorithm makes use of both area-based and edge-based information, and is therefore referred to as a hybrid algorithm. In addition, it uses a number of matching constraints, including the novel rank constraint. Results obtained using a number of test pairs show that the matching algorithm is capable of removing most invalid matches. The accuracy of matching in the vicinity of edges is also improved.
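
As a rough illustration of the area-based component, the sketch below computes a rank transform and matches windows on the transformed images using a sum-of-absolute-differences cost; the window sizes, disparity search and SAD cost are illustrative assumptions, and the paper's edge-based information and rank constraint are not shown.

```python
import numpy as np

def rank_transform(img, w=2):
    """Replace each pixel with the count of neighbours in its
    (2w+1)x(2w+1) window that are darker than the centre pixel."""
    H, W = img.shape
    out = np.zeros((H, W), dtype=np.int32)
    for y in range(w, H - w):
        for x in range(w, W - w):
            window = img[y - w:y + w + 1, x - w:x + w + 1]
            out[y, x] = np.count_nonzero(window < img[y, x])
    return out

def best_disparity(left_rt, right_rt, y, x, max_disp=16, w=3):
    """Pick the disparity minimising SAD between rank-transformed windows."""
    lw = left_rt[y - w:y + w + 1, x - w:x + w + 1].astype(np.int64)
    costs = [np.abs(lw - right_rt[y - w:y + w + 1,
                                  x - d - w:x - d + w + 1].astype(np.int64)).sum()
             for d in range(min(max_disp, x - w) + 1)]
    return int(np.argmin(costs))
```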

Relevance:

20.00%

Publisher:

Abstract:

Interaction design is about finding better ways for people to interact with each other through communication technologies. Interaction design involves understanding how people learn, work and play, so that we can engineer better, more valuable technologies that are more appropriate to the contexts of their lives. As an academic discipline, interaction design is about the people-research that underpins these technologies. As a comparative tool for business, it is about creating innovations that have market pull rather than technology push. Many examples can be found which demonstrate the value of interaction design within both industry and academia; however, finding the common ground across this spectrum of activity is often difficult. Differences in language, approach and outcomes often lead to researchers from either side of the spectrum complaining of an uncommon ground, which often results in a lack of collaboration within such projects. However, as demonstrated through this case study, rather than focussing on finding a common ground to assist in better collaboration between industry and academia, celebrating the uniqueness of each approach whilst bridging them with a common language can lead to new knowledge and commercial innovation. This case study will focus on the research and development phase of a Diversionary Therapy Platform, a collaboration between the Australasian CRC for Interaction Design and The Royal Children's Hospital (Brisbane, Australia). This collaborative effort has led to the formation of a new commercial venture, Diversionary Therapy Pty Ltd, which aims to bring the project's research outcomes to market. The case study will outline the collaborative research and development process undertaken between the many stakeholders and reflect on the challenges identified within this process. A key finding from this collaboration was allowing for the co-existence of the common and uncommon ground throughout the project. This concept will be discussed further throughout this paper.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present WebPut, a prototype system that adopts a novel web-based approach to the data imputation problem. Towards this, WebPut utilizes the available information in an incomplete database in conjunction with the data consistency principle. Moreover, WebPut extends effective Information Extraction (IE) methods for the purpose of formulating web search queries that are capable of effectively retrieving missing values with high accuracy. WebPut employs a confidence-based scheme that efficiently leverages our suite of data imputation queries to automatically select the most effective imputation query for each missing value. A greedy iterative algorithm is also proposed to schedule the imputation order of the different missing values in a database, and in turn the issuing of their corresponding imputation queries, to improve the accuracy and efficiency of WebPut. Experiments based on several real-world data collections demonstrate that WebPut outperforms existing approaches.
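
A minimal sketch of the greedy, confidence-based scheduling idea follows. The candidate values and confidences are toy stand-ins for what the real system obtains by issuing IE-derived web search queries, and the function and data names are assumptions, not WebPut's actual API; the real scheduler can also re-rank candidates as values are filled, which this static sketch omits.

```python
def greedy_impute(missing, candidates):
    """missing: iterable of cell ids; candidates: cell -> [(value, confidence)].

    Fill the cell whose best candidate is most confident first, so early,
    reliable imputations are committed before more doubtful ones."""
    filled, remaining = {}, set(missing)
    while remaining:
        cell = max(remaining, key=lambda c: max(conf for _, conf in candidates[c]))
        value, conf = max(candidates[cell], key=lambda vc: vc[1])
        filled[cell] = (value, conf)
        remaining.remove(cell)
    return filled

# toy example: two missing cells with extracted candidates and confidences
candidates = {
    ("row1", "city"): [("Brisbane", 0.9), ("Sydney", 0.4)],
    ("row2", "zip"):  [("4000", 0.7)],
}
print(greedy_impute(candidates.keys(), candidates))
```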

Relevance:

20.00%

Publisher:

Abstract:

Over the past decade the discipline of nursing has been reviewing its practice, especially in relation to specialty areas. There has been an appreciation by nursing leaders that specialisation brings with it concerns about a disuniting effect on the discipline and a fragmentation of nursing's traditional generalist practice. Accompanying these concerns is a debate over what constitutes a specialty and how to define a specialist. This qualitative study drew upon a constructivist methodology to explore how nurses working in specialty areas define and give meaning to their practice. Three groups of nurses (n=20) from the specialty of critical care were interviewed using a focus group technique. The data were analysed to build constructions of specialty practice. A distinct and qualitative difference was recognised in the practice behaviours of nurses working in the specialty area. These qualitatively different practice behaviours were identified as ‘nursing-in-a-specialty’ and ‘specialist nurse’. Two constructions emerged to differentiate the skill behaviours: ‘practice’ and ‘knowledge’. Specialist nurse practice was based on two distinct types of practice: ‘discretion’ and ‘incorporation’. ‘Knowledge’ was constructed as a synthesis of propositional and practice knowledge.

Relevance:

20.00%

Publisher:

Abstract:

This chapter is devoted to the issue of non-fiduciary common law obligations of good faith, as they may arise in the performance and enforcement of joint ventures. In recent times a rush of commercial contractual claims involving good faith has signified the need for a separate chapter examining this issue. Although most of these decisions have arisen in commercial contexts other than joint ventures, the decisions, nevertheless, warrant careful consideration to the extent that they cast light on the likely contours of the common law good faith obligation as it may apply in the joint venture context.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to implement a game-theory-based offline mission path planner for aerial inspection tasks on large linear infrastructure. Like most real-world optimisation problems, mission path planning involves a number of objectives which ideally should be minimised simultaneously. The goal of this work is therefore to develop a Multi-Objective (MO) optimisation tool able to provide a set of optimal solutions for the inspection task, given the environment data, the mission requirements and the definition of the objectives to minimise. Results indicate the robustness of the method and its capability to find trade-offs among the Pareto-optimal solutions.
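
For a sense of what such an MO tool returns, the sketch below filters candidate inspection paths down to the Pareto-optimal set; the two objectives (path length and risk) and the scores are illustrative assumptions, not the paper's objective set.

```python
def dominates(a, b):
    """a dominates b: no worse on every objective, strictly better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scored):
    """scored: list of (path, objective_vector); keep non-dominated entries."""
    return [(p, s) for p, s in scored
            if not any(dominates(t, s) for q, t in scored if q != p)]

# toy candidates scored as (path length, collision risk)
paths = [("A", (10.0, 0.2)), ("B", (12.0, 0.1)), ("C", (13.0, 0.3))]
print([p for p, _ in pareto_front(paths)])  # ['A', 'B']; C is dominated by A
```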

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a novel evolutionary computation approach to three-dimensional path planning for unmanned aerial vehicles (UAVs) with tactical and kinematic constraints. A genetic algorithm (GA) is modified and extended for path planning. Two GAs are seeded at the initial and final positions with a common objective: to minimise their distance apart under the given UAV constraints. This is accomplished by the synchronous optimisation of subsequent control vectors. The proposed evolutionary computation approach is called the synchronous genetic algorithm (SGA). The sequence of control vectors generated by the SGA constitutes a near-optimal path plan. The resulting path plan exhibits no discontinuity when transitioning from curved to straight trajectories. Experiments show that the paths generated by the SGA are within 2% of the optimal solution. Such a path planner, when implemented on a hardware accelerator such as a field-programmable gate array, can be used in the UAV as an on-board replanner, as well as in ground station systems for assisting in high-precision planning and modelling of mission scenarios.
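
A highly simplified sketch of the synchronous idea: two populations grow candidate paths from the start and the goal, and each index-paired individual is scored by the single shared objective, the gap between their endpoints. The 2D heading encoding, operators and parameters are assumptions for illustration, not the paper's exact formulation.

```python
import math, random

def endpoint(origin, headings, step=1.0):
    """Integrate a sequence of heading commands into a 2D endpoint."""
    x, y = origin
    for h in headings:
        x, y = x + step * math.cos(h), y + step * math.sin(h)
    return x, y

def gap(start, goal, a, b):
    """Shared objective: distance between the two growing path ends."""
    return math.dist(endpoint(start, a), endpoint(goal, b))

def sga(start, goal, pop=30, genes=12, gens=300, turn=0.4):
    A = [[random.uniform(-turn, turn) for _ in range(genes)] for _ in range(pop)]
    B = [[random.uniform(-turn, turn) for _ in range(genes)] for _ in range(pop)]
    for _ in range(gens):
        # rank index-paired individuals by the shared objective
        order = sorted(range(pop), key=lambda i: gap(start, goal, A[i], B[i]))
        elite = order[:pop // 2]
        def refill(P):
            kept = [P[i][:] for i in elite]
            while len(kept) < pop:  # mutate elites to refill the population
                kept.append([h + random.gauss(0, 0.05)
                             for h in P[random.choice(elite)]])
            return kept
        A, B = refill(A), refill(B)
    best = min(range(pop), key=lambda i: gap(start, goal, A[i], B[i]))
    return A[best], B[best]  # forward and reverse halves of the path plan

fwd, rev = sga(start=(0.0, 0.0), goal=(10.0, 4.0))
```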

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a new approach to network upgrading that improves the penetration level of small-scale generators in residential feeders. It is proposed that a common DC link be added to the LV network to alleviate the negative impact of increased export power on the AC lines, allowing customers to inject their surplus power into the common DC link without restriction. In addition, it is shown that the proposed approach can serve as a pathway from the current AC network to a future DC network.
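
As a back-of-envelope illustration of the motivation, the sketch below applies the common approximation ΔV ≈ (PR + QX)/V for steady-state feeder voltage rise; the feeder impedance and export levels are invented for illustration and are not from the paper.

```python
def voltage_rise(p_export_kw, q_kvar, r_ohm, x_ohm, v_nom=230.0):
    """Approximate per-phase voltage rise at the feeder end: (P*R + Q*X) / V."""
    return (p_export_kw * 1e3 * r_ohm + q_kvar * 1e3 * x_ohm) / v_nom

# all surplus exported on the AC line vs. most of it diverted to the DC link
print(voltage_rise(20.0, 0.0, 0.4, 0.25))  # ~34.8 V rise on the AC feeder
print(voltage_rise(4.0, 0.0, 0.4, 0.25))   # ~7.0 V once 16 kW goes to DC
```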

Relevance:

20.00%

Publisher:

Abstract:

Laboratory-based studies of human dietary behaviour benefit from highly controlled conditions; however, this approach can lack ecological validity. Identifying a reliable method to capture and quantify natural dietary behaviours represents an important challenge for researchers. In this study, we scrutinised cafeteria-style meals in the ‘Restaurant of the Future.’ Self-selected meals were weighed and photographed, both before and after consumption. These images were independently coded against standard portions of the same foods to produce accurate and reliable estimates of (i) initial self-served portions, and (ii) food remaining at the end of the meal. Plate cleaning was extremely common; in 86% of meals at least 90% of self-selected calories were consumed. Males ate a greater proportion of their self-selected meals than did females. Finally, when participants visited the restaurant more than once, the correspondence between selected portions was better predicted by the weight of the meal than by its energy content. These findings illustrate the potential benefits of meal photography in this context. However, they also highlight significant limitations, in particular the need to exclude large amounts of data when one food obscures another.

Relevance:

20.00%

Publisher:

Abstract:

Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, where a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. The assessment of model uncertainty is therefore an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, namely ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates; the stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the Bayesian approach combined with the Monte Carlo simulation technique provides a powerful tool which makes the best use of the available knowledge in the prediction, and thereby presents a practical solution to counteract the limitations otherwise imposed on water quality modelling.
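
To make the contrast concrete, the sketch below fits a power-function build-up model, ln B = ln a + b ln D (build-up B against antecedent dry days D), by ordinary and weighted least squares; the model form, synthetic data and weights are assumptions for illustration only, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
days = np.array([1.0, 2.0, 3.0, 5.0, 7.0, 10.0, 14.0])
buildup = 2.0 * days**0.4 * rng.lognormal(0.0, 0.2, days.size)  # noisy observations

X = np.column_stack([np.ones_like(days), np.log(days)])  # columns: [1, ln D]
y = np.log(buildup)

# OLS: every observation trusted equally
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# WLS: down-weight observations with larger assumed measurement variance
w = 1.0 / np.array([0.1, 0.1, 0.2, 0.2, 0.4, 0.4, 0.6])
W = np.diag(w)
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

print("OLS (ln a, b):", beta_ols)
print("WLS (ln a, b):", beta_wls)
```

A Bayesian weighted fit would additionally place priors on (ln a, b) and propagate posterior samples, e.g. via Monte Carlo simulation, into the build-up prediction, which is what gives the uncertainty estimates discussed above.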

Relevance:

20.00%

Publisher:

Abstract:

Accurate thin-film energy dispersive spectroscopic (EDS) analyses of clays with low-atomic-number (low-Z) elements (e.g. Na, Al, Si) present a challenge to the microscopist, not only because of the spatial resolution required, but also because of their susceptibility to electron beam-induced radiation damage and their very low X-ray count rates. Most common clays, such as kaolinite, smectite and illite, occur as submicrometer crystallites with varying degrees of structural disorder in at least two directions, and may have dimensions as small as one or two unit cells along the basal direction. Thus, even clays with relatively large a-b dimensions (e.g. 100 × 100 nm) may be <5 nm in the c-axis direction. For typical conditions in an analytical electron microscope (AEM), this sample thickness gives rise to very poor count rates (<200 cps) and necessitates long counting times (>300 s) in order to obtain satisfactory statistical precision. Unfortunately, beam damage rates for the common clays are very rapid (<10 s in imaging mode) between 100 kV and 200 kV. With a focussed probe for elemental analyses, the damage rate is exacerbated by the high current density, and may result in the loss of low-Z elements during data collection and a consequent loss of analytical accuracy.
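
The acquisition figures quoted above translate directly into counting statistics: X-ray counting is Poisson-distributed, so relative precision scales as 1/sqrt(N). The short worked example below uses only the numbers given in the abstract.

```python
import math

rate_cps, live_time_s = 200, 300          # thin-foil count rate and long dwell
N = rate_cps * live_time_s
print(f"{N} counts -> {1 / math.sqrt(N):.1%} relative error")   # ~0.4%

# a 10 s dwell (the imaging-mode damage limit) yields far worse precision,
# even before beam damage starts to drive off the low-Z elements
N_short = rate_cps * 10
print(f"{N_short} counts -> {1 / math.sqrt(N_short):.1%} relative error")  # ~2.2%
```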

Relevance:

20.00%

Publisher:

Abstract:

In this paper we propose a framework for both gradient descent image and object alignment in the Fourier domain. Our method centers upon the classical Lucas & Kanade (LK) algorithm, where we represent the source and template/model in the complex 2D Fourier domain rather than in the spatial 2D domain. We refer to our approach as the Fourier LK (FLK) algorithm. The FLK formulation is advantageous when one pre-processes the source image and template/model with a bank of filters (e.g. oriented edges, Gabor, etc.) as: (i) it can handle substantial illumination variations; (ii) the inefficient pre-processing filter bank step can be subsumed within the FLK algorithm as a sparse diagonal weighting matrix; (iii) unlike traditional LK, the computational cost is invariant to the number of filters, and the approach is as a result far more efficient; and (iv) the approach can be extended to the inverse compositional form of the LK algorithm, where nearly all steps (including the Fourier transform and filter bank pre-processing) can be pre-computed, leading to an extremely efficient and robust approach to gradient descent image matching. Further, these computational savings translate to non-rigid object alignment tasks that are considered extensions of the LK algorithm, such as those found in Active Appearance Models (AAMs).
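
The sketch below illustrates only the core weighting trick, not the full gradient-descent alignment: by Parseval's relation, summing SSDs over a bank of filtered channels equals a single Fourier-domain SSD weighted by the precomputed sum of squared filter responses. The toy filters and images are assumptions for illustration.

```python
import numpy as np

def fourier_weights(filters):
    """Diagonal weighting: sum of squared filter magnitude responses."""
    return sum(np.abs(np.fft.fft2(f)) ** 2 for f in filters)

def weighted_fourier_ssd(img, tmpl, S):
    """SSD over all filtered channels, evaluated as one weighted Fourier SSD."""
    d = np.fft.fft2(img) - np.fft.fft2(tmpl)
    return np.sum(S * np.abs(d) ** 2) / d.size

img, tmpl = np.random.rand(8, 8), np.random.rand(8, 8)
fx = np.zeros((8, 8)); fx[0, :2] = [1, -1]   # horizontal derivative filter
fy = np.zeros((8, 8)); fy[:2, 0] = [1, -1]   # vertical derivative filter
S = fourier_weights([fx, fy])                # precompute once, reuse every iteration
print(weighted_fourier_ssd(img, tmpl, S))
```

Because S is computed once and applied as an element-wise weight, the per-iteration cost does not grow with the number of filters, which is the efficiency claim (iii) above.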

Relevance:

20.00%

Publisher:

Abstract:

The present study examined Queensland Transcultural Mental Health Centre (QTMHC) client characteristics in order to provide a better understanding for the development of future health service delivery models. Archived data collected for 1499 clients over a two-year period (2007-2009) were analysed using descriptive statistics and chi-square tests. The results indicated that clients were referred from a range of sources and were generally adults. More women than men sought services. At least half of the clients had language barriers and relied on bilingual workers. The most frequently expressed mental health issues were mood disorder symptoms, followed by symptoms of schizophrenia and psychosis, and anxiety. Acculturation strains and stressors were described as the most common psychosocial issues. Mental health and psychosocial issues differed by age, gender and the world regions from which the CALD clients originated. The findings provide an understanding of clients who seek services at QTMHC. Various ways in which transcultural services and databases can be further improved are discussed.

Relevance:

20.00%

Publisher:

Abstract:

Threats against computer networks evolve very fast and require increasingly complex countermeasures. We argue that teams, or groups with a common purpose, for intrusion detection and prevention improve the measures against rapidly propagating attacks, analogous to the concept of teams solving complex tasks known from the sociology of work. Collaboration in this sense is not an easy task, especially in heterarchical environments. We propose CIMD (collaborative intrusion and malware detection), a security overlay framework that enables cooperative intrusion detection approaches. Objectives and associated interests are used to create detection groups for the exchange of security-related data. In this work, we contribute a tree-oriented data model for device representation in the scope of security. We introduce an algorithm for the formation of detection groups, show realization strategies for the system, and conduct a vulnerability analysis. We evaluate the benefit of CIMD by simulation and probabilistic analysis.
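
A small sketch of how a tree-oriented device model could support detection-group formation: devices whose attribute trees contain a path matching a shared interest are grouped for data exchange. The attribute hierarchy and matching rule are illustrative assumptions, not CIMD's exact schema or grouping algorithm.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    children: list = field(default_factory=list)

def paths(node, prefix=()):
    """Enumerate root-to-leaf attribute paths of a device tree."""
    here = prefix + (node.label,)
    if not node.children:
        yield here
    for child in node.children:
        yield from paths(child, here)

def form_group(devices, interest):
    """Group devices whose trees end a path with the interest tuple,
    e.g. all hosts exposing the same service."""
    return [name for name, tree in devices.items()
            if any(p[-len(interest):] == interest for p in paths(tree))]

web = Node("device", [Node("os", [Node("linux")]), Node("service", [Node("http")])])
db  = Node("device", [Node("os", [Node("linux")]), Node("service", [Node("sql")])])
print(form_group({"web01": web, "db01": db}, interest=("service", "http")))  # ['web01']
```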