247 results for GRASP filtering


Relevance:

10.00%

Publisher:

Abstract:

We contribute an empirically derived noise model for the Kinect sensor. We systematically measure the lateral and axial noise distributions as a function of both the distance and the angle of the Kinect to an observed surface. The derived noise model can be used to filter Kinect depth maps for a variety of applications. Our second contribution applies the derived noise model to the KinectFusion system, extending filtering, volumetric fusion, and pose estimation within the pipeline. Qualitative results show that our method allows reconstruction of finer details and the ability to reconstruct smaller objects and thinner surfaces. Quantitative results also show that our method improves pose estimation accuracy. © 2012 IEEE.
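
A minimal sketch of how such a distance- and angle-dependent noise model might be used to filter a depth map, in Python with NumPy. The functional form and the coefficients below are illustrative assumptions, not the fitted values reported in the paper.

```python
import numpy as np

def axial_noise_std(z, theta, a=0.0012, b=0.0019, z0=0.4):
    """Predicted axial noise std (m) as a function of distance z (m) and
    incidence angle theta (rad). The quadratic-in-distance form and the
    coefficients a, b, z0 are placeholders, not the paper's fitted values;
    dividing by cos(theta) makes the noise grow towards grazing angles."""
    sigma = a + b * (z - z0) ** 2
    return sigma / np.maximum(np.cos(theta), 1e-3)

def filter_depth(depth, incidence_angle, max_sigma=0.01):
    """Discard depth pixels whose predicted noise exceeds max_sigma (m)."""
    sigma = axial_noise_std(depth, incidence_angle)
    out = depth.astype(float).copy()
    out[sigma > max_sigma] = np.nan  # mark unreliable measurements
    return out

# toy usage: a 2x2 depth map (metres) with per-pixel incidence angles (rad)
depth = np.array([[1.0, 3.5], [2.0, 4.5]])
angle = np.array([[0.1, 1.3], [0.4, 0.2]])
print(filter_depth(depth, angle))
```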

Relevance:

10.00%

Publisher:

Abstract:

The Semantic Web offers many possibilities for future Web technologies. There is therefore a need to find ways of bringing the huge amount of unstructured documents on the current Web into the Semantic Web automatically. One big challenge is how patterns can be understood by both humans and machines. To address this issue, we present an innovative model which interprets patterns as high-level concepts. These concepts can explain the patterns' meanings in a human-understandable way while improving information filtering performance. The model is evaluated against a state-of-the-art benchmark model on the standard Reuters dataset, and the results show that the proposed model is successful. The significance of this model is threefold: it provides a way to interpret text mining output; it provides a technique to find concepts relevant to the whole set of patterns, which is essential to understanding the topic; and it, to some extent, overcomes the information mismatch and overload problems of existing models. This model will be very useful for knowledge-based applications.
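
As a rough illustration of the general idea of mapping mined patterns to human-readable concepts, the toy Python sketch below scores concepts from an assumed concept lexicon by their term overlap with the patterns. The lexicon and scoring rule are assumptions for illustration, not the paper's model.

```python
def concepts_for_patterns(patterns, concept_lexicon):
    """Toy interpretation step: score each concept by its term overlap with
    the mined patterns. 'concept_lexicon' (concept -> set of terms) is an
    assumed external resource; the overlap score is illustrative only."""
    scores = {}
    for concept, terms in concept_lexicon.items():
        overlap = sum(len(set(p) & terms) / len(p) for p in patterns)
        if overlap:
            scores[concept] = overlap
    return dict(sorted(scores.items(), key=lambda kv: -kv[1]))

# toy usage with two mined term patterns and a two-concept lexicon
patterns = [('crude', 'oil', 'price'), ('barrel', 'opec')]
lexicon = {'Energy markets': {'oil', 'barrel', 'opec', 'gas'},
           'Sport': {'match', 'goal'}}
print(concepts_for_patterns(patterns, lexicon))
```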

Relevance:

10.00%

Publisher:

Abstract:

User profiling is the process of constructing user models which represent the personal characteristics and preferences of customers. User profiles play a central role in many recommender systems, which recommend items to users based on those profiles; the items can be any objects the users are interested in, such as documents, web pages, books, movies, etc. In recent years, multidimensional data have attracted increasing attention, from both academia and industry, for building better recommender systems: additional metadata provides algorithms with more detail for better understanding the interactions between users and items. However, most existing user/item profiling techniques for multidimensional data analyze the data by splitting the multidimensional relations, which loses information about the multidimensionality. In this paper, we propose a user profiling approach using a tensor reduction algorithm, which we show is based on a Tucker2 model. The proposed profiling approach incorporates latent interactions between all dimensions into user profiles, which significantly benefits the quality of neighborhood formation. We further propose to integrate the profiling approach into neighborhood-based collaborative filtering recommender algorithms. Experimental results show significant improvements in recommendation accuracy.
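
A minimal sketch of a Tucker2-style reduction of a user × item × context tensor using NumPy, where the rows of the user-mode factor matrix serve as latent user profiles. The HOSVD-style construction, the ranks and the toy data are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker2_profiles(T, rank_user, rank_item):
    """HOSVD-style Tucker2: factor matrices for the user and item modes;
    the context mode is left uncompressed. Returns (U, V, core)."""
    U = np.linalg.svd(unfold(T, 0), full_matrices=False)[0][:, :rank_user]
    V = np.linalg.svd(unfold(T, 1), full_matrices=False)[0][:, :rank_item]
    core = np.einsum('uic,ur,is->rsc', T, U, V)   # T x_0 U^T x_1 V^T
    return U, V, core

# toy usage: 6 users x 5 items x 3 contexts
T = np.random.rand(6, 5, 3)
U, V, core = tucker2_profiles(T, rank_user=2, rank_item=2)
# each row of U is a latent user profile; neighbours can then be formed by
# cosine similarity between rows before running neighborhood-based CF
print(U.shape, V.shape, core.shape)
```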

Relevance:

10.00%

Publisher:

Abstract:

We propose the use of optical flow information as a method for detecting and describing changes in the environment from the perspective of a mobile camera. We analyze the characteristics of the optical flow signal and demonstrate how robust flow vectors can be generated and used to detect depth discontinuities and appearance changes at key locations. To successfully achieve this task, a full discussion of camera positioning, distortion compensation, noise filtering, and parameter estimation is presented. We then extract statistical attributes from the flow signal to describe the location of the scene changes, and employ clustering and the dominant shape of the vectors to increase descriptiveness. Once a database of nodes (where a node is a detected scene change) and their corresponding flow features is created, matching can be performed whenever nodes are encountered, such that topological localization can be achieved. We retrieve the most likely node according to the Mahalanobis and Chi-square distances between the current frame and the database. The results illustrate the applicability of the technique for detecting and describing scene changes in diverse lighting conditions, in indoor and outdoor environments and on different robot platforms.
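
A hedged sketch of the general pipeline with OpenCV: dense Farnebäck flow, a small statistical descriptor of the (median-filtered) flow field, and node matching by Mahalanobis distance. The specific functions, parameters and descriptor are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np
import cv2

def flow_descriptor(prev_gray, cur_gray):
    """Dense Farneback flow, then a small statistical descriptor of the
    median-filtered flow magnitudes and orientations (crude noise filtering)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, cur_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    mag = cv2.medianBlur(mag.astype(np.float32), 5)
    return np.array([mag.mean(), mag.std(),
                     np.cos(ang).mean(), np.sin(ang).mean()])

def match_node(query, node_means, node_covs):
    """Return the index of the most likely stored node by Mahalanobis distance."""
    dists = [np.sqrt((query - m) @ np.linalg.inv(c) @ (query - m))
             for m, c in zip(node_means, node_covs)]
    return int(np.argmin(dists))

# toy usage: two synthetic frames related by a small horizontal shift
a = np.random.randint(0, 255, (120, 160), np.uint8)
b = np.roll(a, 3, axis=1)          # simulate camera motion
print(flow_descriptor(a, b))
```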

Relevance:

10.00%

Publisher:

Abstract:

Traditional information filtering (IF) models often assume that the documents in a collection are related to only one topic. In reality, however, users' interests can be diverse and the documents in a collection often involve multiple topics. Topic modelling was proposed to generate statistical models that represent multiple topics in a collection of documents, but in a topic model topics are represented by distributions over words, which are limited in their ability to distinctively represent the semantics of the topics. Patterns are generally considered more discriminative than single terms and are able to reveal the inner relations between words. This paper proposes a novel information filtering model, the Significant matched Pattern-based Topic Model (SPBTM). The SPBTM represents user information needs in terms of multiple topics, and each topic is represented by patterns. More importantly, the patterns are organized into groups based on their statistical and taxonomic features, from which the more representative patterns, called Significant Matched Patterns, can be identified and used to estimate document relevance. Experiments on benchmark data sets demonstrate that the SPBTM significantly outperforms the state-of-the-art models.
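
The toy sketch below illustrates the general idea of estimating document relevance from matched patterns grouped by topic. The data structures and the scoring rule are illustrative simplifications, not the SPBTM formulation.

```python
def document_relevance(doc_terms, topics, topic_weights):
    """Toy relevance estimate in the spirit of pattern-based topic models:
    'topics' is a list of {pattern (frozenset): support} dicts, and
    'topic_weights' is the user's topic distribution. A pattern is matched
    when all of its terms occur in the document."""
    doc = set(doc_terms)
    score = 0.0
    for weight, patterns in zip(topic_weights, topics):
        matched = [sup for pat, sup in patterns.items() if pat <= doc]
        if matched:
            score += weight * sum(matched) / len(matched)
    return score

# toy usage: two topics, each represented by a small group of patterns
topics = [{frozenset({'topic', 'model'}): 0.8, frozenset({'lda'}): 0.5},
          {frozenset({'pattern', 'mining'}): 0.7}]
print(document_relevance(['a', 'topic', 'model', 'paper'], topics, [0.6, 0.4]))
```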

Relevance:

10.00%

Publisher:

Abstract:

In recommender systems based on multidimensional data, additional metadata provides algorithms with more information for better understanding the interaction between users and items. However, most profiling approaches in neighbourhood-based recommendation for multidimensional data merely split or project the dimensional data and do not consider the latent interactions between the dimensions. In this paper, we propose a novel user/item profiling approach for Collaborative Filtering (CF) item recommendation on multidimensional data, and we further present an incremental profiling method for updating the profiles. For item recommendation, we seek to delve into the different types of relations in the data to understand the interaction between users and items more fully, and propose three multidimensional CF recommendation approaches for top-N item recommendation based on the proposed user/item profiles. The proposed multidimensional CF approaches are capable of incorporating not only localized relations of user-user and/or item-item neighbourhoods but also latent interactions between all dimensions of the data. Experimental results show significant improvements in recommendation accuracy.
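
A minimal sketch of the neighbourhood step, assuming latent user profiles (for example, the user-mode factors from a tensor decomposition) are already available: neighbours are found by cosine similarity between profiles and top-N items come from similarity-weighted votes. The details are illustrative, not the paper's algorithms.

```python
import numpy as np

def top_n_from_profiles(profiles, ratings, user, n_items=5, k=3):
    """Neighbourhood-based top-N sketch: 'profiles' is an (n_users, d)
    matrix of latent user profiles, 'ratings' is an (n_users, n_items)
    interaction matrix."""
    p = profiles / np.linalg.norm(profiles, axis=1, keepdims=True)
    sims = p @ p[user]                      # cosine similarity to all users
    sims[user] = -np.inf                    # exclude the user themselves
    neigh = np.argsort(sims)[-k:]           # k most similar users
    scores = sims[neigh] @ ratings[neigh]   # similarity-weighted votes
    scores[ratings[user] > 0] = -np.inf     # drop already-seen items
    return np.argsort(scores)[::-1][:n_items]

# toy usage: 6 users, 8 items, 4-dimensional latent profiles
ratings = np.random.randint(0, 2, (6, 8)).astype(float)
profiles = np.random.rand(6, 4)
print(top_n_from_profiles(profiles, ratings, user=0))
```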

Relevance:

10.00%

Publisher:

Abstract:

There is consensus among practitioners and academics that culture is a critical factor that is able to determine success or failure of BPM initiatives. Yet, culture is a topic that seems difficult to grasp and manage. This may be the reason for the overall lack of guidance on how to address this topic in practice. We have conducted in-depth research for more than three years to examine why and how culture is relevant to BPM. In this chapter, we introduce a framework that explains the role of culture in BPM. We also present the relevant cultural values that compose a BPM culture, and we introduce a tool to examine the supportiveness of organizational cultures for BPM. Our research results provide the basis for further empirical analyses on the topic and support practitioners in the management of culture as an important factor in BPM initiatives.

Relevance:

10.00%

Publisher:

Abstract:

In recent years, disaster risk reduction efforts have focused on disturbances ranging from climate variability and seismic hazards to geo-political instability and public and animal health crises. These factors, combined with the uncertainty arising from inter-dependencies within and across systems of critical infrastructure, create significant problems of governance for the private and public sector alike. The potential for rapid spread of impacts, geographically and virtually, can place a comprehensive understanding of disaster response and recovery needs and risk mitigation issues beyond the grasp of a competent authority. Because of such cascading effects, communities and governments at local and state levels are unlikely to face single incidents but rather series of systemic impacts, often appearing concurrently. A further point to note is that both natural and technological hazards can act directly on socio-technical systems as well as being propagated by them, as network events. Such events have been categorised as ‘outside of the box,’ ‘too fast,’ and ‘too strange’ (Lagadec, 2004). Emergent complexities in linked systems can make disaster effects difficult to anticipate and recovery efforts difficult to plan for. Beyond the uncertainties of real-world disasters that might be called familiar or even regular, can we safely assume that the generic capability we use now will suit future disaster contexts? This paper presents the initial scoping of research funded by the Bushfire and Natural Hazards Cooperative Research Centre seeking to define the future capability needs of disaster management organisations. It explores the challenges of anticipating the needs of representative agencies and groups active in the before, during and after phases of emergency and disaster situations, using capability deficit assessments and scenario assessment.

Relevance:

10.00%

Publisher:

Abstract:

Resource assignment and scheduling is a difficult task when job processing times are stochastic and resources are to be used for both known and unknown demand. To operate effectively within such an environment, several novel strategies are investigated. The first focuses upon the creation of a robust schedule and utilises the concept of strategically placed idle time (i.e. buffering). The second approach introduces the idea of maintaining a number of free resources at each time, and culminates in another form of strategically placed buffering. The attraction of these approaches is that they are easy to grasp conceptually and mimic what practitioners already do in practice. Our extensive numerical testing has shown that these techniques ensure more prompt job processing and reduce job cancellations and waiting time. They are effective in the considered setting and could easily be adapted to many real-life problems, for instance those in health care. More importantly, this article demonstrates that integrating the two approaches is a better strategy and provides an effective stochastic scheduling approach.
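
A toy Python sketch of the first strategy, strategically placed idle time: each job's planned slot is followed by a buffer proportional to the standard deviation of its processing time, and a simple simulation shows how the buffers absorb overruns. The buffer rule and the figures are illustrative assumptions, not the paper's method.

```python
import random

def build_buffered_schedule(jobs, buffer_factor=0.5):
    """Sequential schedule on one resource with strategically placed idle
    time: each job is followed by a buffer proportional to the standard
    deviation of its processing time. 'jobs' is a list of
    (mean_duration, std_duration) tuples."""
    t, schedule = 0.0, []
    for mean, std in jobs:
        schedule.append((t, t + mean))        # planned start / end
        t += mean + buffer_factor * std       # buffer absorbs overruns
    return schedule

def simulate(schedule, jobs, seed=1):
    """Realised start delays: a job cannot start before its planned
    (buffered) start time, and durations are drawn stochastically."""
    random.seed(seed)
    clock, delays = 0.0, []
    for (start, _), (mean, std) in zip(schedule, jobs):
        clock = max(clock, start)
        delays.append(max(0.0, clock - start))
        clock += max(0.1, random.gauss(mean, std))   # stochastic duration
    return delays

jobs = [(3.0, 1.0), (2.0, 0.5), (4.0, 2.0)]
print(simulate(build_buffered_schedule(jobs), jobs))
```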

Relevance:

10.00%

Publisher:

Abstract:

In this work we present an autonomous mobile manipulator that is used to collect sample containers in an unknown environment. The manipulator is part of a team of heterogeneous mobile robots tasked with searching for and identifying sample containers in an unknown environment. A map of the environment, along with possible positions of sample containers, is shared between the robots in the team using a cloud-based communication interface. To grasp a container with its manipulator arm, the robot has to place itself in a position suitable for the manipulation task. This optimal base placement pose is selected by querying a precomputed inverse reachability database.
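
A minimal sketch of the base-placement query, assuming the precomputed inverse reachability database is available as a mapping from discretised base poses (relative to the target) to reachability scores. The data layout, scores and collision check are illustrative assumptions, not the system's actual implementation.

```python
import math

def best_base_pose(container_xy, irdb, is_free):
    """Pick the base pose maximising the precomputed reachability score.
    'irdb' maps a discretised base pose relative to the target,
    (dx, dy, dyaw), to a score in [0, 1]; 'is_free(x, y)' is a collision
    check against the shared map."""
    cx, cy = container_xy
    best, best_score = None, -1.0
    for (dx, dy, dyaw), score in irdb.items():
        x, y = cx + dx, cy + dy
        if is_free(x, y) and score > best_score:
            best, best_score = (x, y, dyaw), score
    return best

# toy database: standing ~0.5 m away and facing the container scores highest
irdb = {(-0.5, 0.0, 0.0): 0.9, (0.0, -0.5, math.pi / 2): 0.7,
        (-0.8, 0.0, 0.0): 0.4}
print(best_base_pose((2.0, 1.0), irdb, is_free=lambda x, y: True))
```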

Relevance:

10.00%

Publisher:

Abstract:

In an estuary, mixing and dispersion result from a combination of large-scale advection and small-scale turbulence, which are complex to estimate. Predictions of scalar transport and mixing are often inferred and rarely accurate, owing to an inadequate understanding of the contributions of these different scales to estuarine recirculation. A multi-device field study was conducted in a small sub-tropical estuary under neap tide conditions with near-zero fresh water discharge for about 48 hours. During the study, acoustic Doppler velocimeters (ADV) were sampled at high frequency (50 Hz), while an acoustic Doppler current profiler (ADCP) and global positioning system (GPS) tracked drifters were used to obtain lower-frequency spatial distributions of the flow parameters within the estuary. The velocity measurements were complemented with continuous measurements of water depth, conductivity, temperature and other physicochemical parameters. Thorough quality control was carried out by applying relevant error-removal filters to the individual data sets to intercept spurious data. A triple decomposition (TD) technique was introduced to assess the contributions of tides, resonance and ‘true’ turbulence in the flow field. The time series of mean flow measurements for both the ADCP and the drifters were consistent with those of the mean ADV data when sampled within a similar spatial domain. The tidal-scale fluctuations of velocity and water level were used to examine the response of the estuary to the tidal inertial current. The channel exhibited a mixed-type wave with a typical phase lag between 0.035π and 0.116π. A striking feature of the ADV velocity data was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude, particularly in slack waters. Such slow fluctuations were simultaneously observed in a number of physicochemical properties of the channel. The ensuing turbulence field showed some degree of anisotropy. For all ADV units, the horizontal turbulence ratio ranged between 0.4 and 0.9 and decreased towards the bed, while the vertical turbulence ratio was on average unity at z = 0.32 m and approximately 0.5 for the upper ADV (z = 0.55 m). The statistical analysis suggested that the ebb-phase turbulence field was dominated by eddies that evolved from ejection-type processes, while that of the flood phase contained mixed eddies with a significant amount related to sweep-type processes. Over 65% of the skewness values fell within the range expected of a finite Gaussian distribution, and the bulk of the excess kurtosis values (over 70%) fell within the range of -0.5 to +2. The TD technique described herein allowed the characterisation of a broader temporal scale of fluctuations in the high-frequency data sampled over a few tidal cycles. The study characterises the ranges of fluctuation required for accurate modelling of shallow-water dispersion and mixing in a sub-tropical estuary.
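
A minimal sketch of a triple decomposition of a 50 Hz velocity record into tidal, slow-fluctuation and turbulent components using two centred moving averages (SciPy). The window lengths and the synthetic record are illustrative assumptions, not the cut-offs or data used in the study.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def triple_decompose(u, fs=50.0, slow_window_s=60.0, tidal_window_s=1800.0):
    """Split a velocity record u (m/s) into tidal, slow-fluctuation and
    turbulent parts with two centred moving averages: everything slower
    than tidal_window_s is 'tidal', scales between the two windows are
    'slow' fluctuations, and the remainder is 'true' turbulence."""
    slow_plus_tidal = uniform_filter1d(u, int(slow_window_s * fs))
    tidal = uniform_filter1d(u, int(tidal_window_s * fs))
    slow = slow_plus_tidal - tidal
    turb = u - slow_plus_tidal
    return tidal, slow, turb

# toy 40-minute record at 50 Hz: tidal-scale trend + slow oscillation + noise
t = np.arange(0, 2400, 1 / 50.0)
u = 0.3 * np.sin(2 * np.pi * t / 7200)        # tidal-scale trend
u += 0.05 * np.sin(2 * np.pi * t / 300)       # slow fluctuation
u += 0.02 * np.random.randn(t.size)           # "true" turbulence
tidal, slow, turb = triple_decompose(u)
print(turb.std())                             # roughly the injected 0.02
```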

Relevance:

10.00%

Publisher:

Abstract:

We argue that safeguards are necessary to ensure human rights are adequately protected. All systems of blocking access to online content necessarily raise difficult and problematic issues of infringement of freedom of speech and access to information. Given the importance of access to information across the breadth of modern life, great care must be taken to ensure that any measures designed to protect copyright by blocking access to online locations are proportionate. Any measures to block access to online content must be carefully tailored to avoid serious and disproportionate impact on human rights. This means, first, that the measures must be effective and adapted to achieve a legitimate purpose. The experience of foreign jurisdictions suggests that this legislation is unlikely to be effective. Unless and until there is clear evidence that the proposed scheme is likely to increase effective returns to Australian creators, this legislation should not be introduced. Second, the principle of proportionality requires ensuring that the proposed legislation does not unnecessarily burden legitimate speech or access to information. As currently worded, the draft legislation may result in online locations being blocked even though they would, if operated in Australia, not contravene Australian law. This is unacceptable; if introduced, the law should be drafted so that it is clearly limited only to foreign locations where there is clear and compelling evidence that the location would authorise copyright infringement if it were in Australia. Third, proportionality requires that measures are reasonable and strike an appropriate balance between competing interests. This draft legislation provides few safeguards for the public interest or the interests of private actors who would access legitimate information. New safeguards should be introduced to ensure that the public interest is well represented at both the stage of the primary application and at any applications to rescind or vary injunctions. We recommend that:

- The legislation not be introduced unless and until there is compelling evidence that it will have a real and significant positive impact on the effective incomes of Australian creators.
- The ‘facilitates an infringement’ test in s 115A(1)(b) should be replaced with ‘authorises infringement’.
- The ‘primary purpose’ test in s 115A(1)(c) should be replaced with: “the online location has no substantial non-infringing uses”.
- An explicit role for public interest groups as amici curiae should be introduced.
- Costs of successful applications should be borne by applicants.
- Injunctions should be valid only for renewable two year terms.
- Section 115A(5) should be clarified, and cl (b) and (c) removed.
- The effectiveness of the scheme should be evaluated in two years.