214 results for 336.222


Relevance:

10.00%

Publisher:

Abstract:

Due to the demand for better and deeper analysis in sports, organizations (both professional teams and broadcasters) are looking to use spatiotemporal data, in the form of player tracking information, to gain an advantage over their competitors. However, due to the large volume of data, its unstructured nature, and the lack of associated team activity labels (e.g. strategic/tactical), effective and efficient strategies for dealing with such data have yet to be deployed. A bottleneck restricting such solutions is the lack of a suitable representation (i.e. ordering of players) that is immune to the combinatorially large number of possible permutations of player orderings, in addition to the high dimensionality of the temporal signal (e.g. a game of soccer lasts 90 minutes). We leverage a recent method which utilizes a "role representation", as well as a feature reduction strategy that uses a spatiotemporal bilinear basis model, to form a compact spatiotemporal representation. Using this representation, we find the most likely formation patterns of a team associated with match events across nearly 14 hours of continuous player and ball tracking data in soccer. Additionally, we show that we can accurately segment a match into distinct game phases and detect highlights (i.e. shots, corners, free kicks, etc.) completely automatically using a decision-tree formulation.
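
The compact representation described above hinges on projecting role-ordered tracking data onto low-rank temporal and spatial bases. Below is a minimal Python sketch of that bilinear-basis idea; the DCT basis choice, frame rate, number of roles and retained coefficients are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def dct_basis(n, k):
    """First k orthonormal DCT-II basis vectors of length n (as columns)."""
    t = np.arange(n)
    B = np.cos(np.pi * (t[:, None] + 0.5) * np.arange(k)[None, :] / n)
    return B / np.linalg.norm(B, axis=0)

def bilinear_compress(X, kt, ks):
    """Compress a (time x flattened-role-positions) matrix X so that
    X is approximated by Theta @ A @ Omega.T with a small coefficient matrix A."""
    Theta, Omega = dct_basis(X.shape[0], kt), dct_basis(X.shape[1], ks)
    A = Theta.T @ X @ Omega              # least-squares fit (orthonormal bases)
    return A, Theta @ A @ Omega.T        # coefficients and reconstruction

# toy data: 2 minutes at 10 Hz, 10 roles -> 20 (x, y) columns, random-walk tracks
X = np.cumsum(0.1 * np.random.randn(1200, 20), axis=0)
A, X_hat = bilinear_compress(X, kt=30, ks=10)
print(A.shape, np.linalg.norm(X - X_hat) / np.linalg.norm(X))

A 1200 x 20 track matrix is reduced here to a 30 x 10 coefficient block, which is the kind of compaction that makes clustering of formation patterns tractable.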

Relevance:

10.00%

Publisher:

Abstract:

Over the past decade, vision-based tracking systems have been successfully deployed in professional sports such as tennis and cricket for enhanced broadcast visualizations as well as aiding umpiring decisions. Despite the high level of accuracy of these tracking systems and the sheer volume of spatiotemporal data they generate, the use of this high-quality data for quantitative player performance analysis and prediction has been lacking. In this paper, we present a method which predicts the location of a future shot based on the spatiotemporal parameters of the incoming shots (i.e. shot speed, location, angle and feet location) from such a vision system. The ability to accurately predict future short-term events has enormous implications for automatic sports broadcasting, in addition to the coaching and commentary domains. Using Hawk-Eye data from the 2012 Australian Open Men's draw, we utilize a Dynamic Bayesian Network to model player behaviors and use an online model adaptation method to match the player's behavior and enhance shot predictability. To show the utility of our approach, we analyze the shot predictability of the top three seeds in the tournament (Djokovic, Federer and Nadal), as they played the most games.
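
As a rough illustration of modeling a player's shot response and adapting it online, the sketch below maintains decayed conditional counts over discretized shot contexts. The discretization, decay factor and Laplace prior are assumptions, and this is a simple stand-in for the paper's Dynamic Bayesian Network rather than its actual model.

import numpy as np
from collections import defaultdict

class ShotPredictor:
    """Decayed conditional counts: P(next shot region | incoming-shot context)."""
    def __init__(self, n_regions=9, decay=0.98):
        self.decay = decay
        self.counts = defaultdict(lambda: np.ones(n_regions))  # Laplace prior

    def update(self, context, next_region):
        c = self.counts[context]
        c *= self.decay          # online adaptation: down-weight old behaviour
        c[next_region] += 1.0    # reinforce the observed response

    def predict(self, context):
        c = self.counts[context]
        return c / c.sum()       # distribution over next-shot court regions

pred = ShotPredictor()
# context = (speed bin, incoming-shot region); region 4 = hypothetical centre court
pred.update(("fast", "deep-ad"), 4)
print(pred.predict(("fast", "deep-ad")).round(3))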

Relevance:

10.00%

Publisher:

Abstract:

Efficient and effective feature detection and representation is an important consideration when processing videos, and a large number of applications, such as motion analysis, 3D scene understanding and tracking, depend on it. Among feature description methods, local features are becoming increasingly popular for representing videos because of their simplicity and efficiency. While they achieve state-of-the-art performance with low computational complexity, their performance is still too limited for real-world applications. Furthermore, the rapid increase in the uptake of mobile devices has increased the demand for algorithms that can run with reduced memory and computational requirements. In this paper we propose a semi-binary feature detector/descriptor based on the BRISK detector, which can detect and represent videos with significantly reduced computational requirements while achieving performance comparable to state-of-the-art spatio-temporal feature descriptors. First, the BRISK feature detector is applied on a frame-by-frame basis to detect interest points; the detected keypoints are then compared against consecutive frames for significant motion. Keypoints with significant motion are encoded with the BRISK descriptor in the spatial domain and the Motion Boundary Histogram in the temporal domain. This descriptor is not only lightweight but also has lower memory requirements because of the binary nature of the BRISK descriptor, opening up the possibility of applications on hand-held devices. We evaluate the detector/descriptor combination in the context of action classification with a standard, popular bag-of-features and SVM framework. Experiments are carried out on two popular datasets of varying complexity, and we demonstrate performance comparable to other descriptors with reduced computational complexity.
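
A minimal sketch of the detect-then-filter pipeline using OpenCV's BRISK detector and dense optical flow as the motion test. The video path, Farneback parameters and motion threshold are placeholders, and the Motion Boundary Histogram temporal part is only indicated by a comment.

import cv2
import numpy as np

brisk = cv2.BRISK_create()                       # binary detector/descriptor
cap = cv2.VideoCapture("video.mp4")              # placeholder input path
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # dense flow between consecutive frames, used as the motion test
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)
    kps = brisk.detect(gray, None)
    # keep only keypoints with significant motion (threshold is an assumption)
    moving = [kp for kp in kps if mag[int(kp.pt[1]), int(kp.pt[0])] > 1.0]
    moving, desc = brisk.compute(gray, moving)   # binary spatial descriptor
    # a Motion Boundary Histogram around each keypoint would be computed
    # from the flow derivatives here (omitted for brevity)
    prev_gray = gray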

Relevance:

10.00%

Publisher:

Abstract:

At the highest level of competitive sport, nearly all performances of athletes (both training and competitive) are chronicled on video. The video is then often viewed by expert coaches or analysts, who manually label important performance indicators to gauge performance. Stroke rate and pacing are important performance measures in swimming, and these have previously been digitised manually by a human. This is problematic, as annotating large volumes of video is costly and time-consuming. Further, since it is difficult to accurately estimate the position of the swimmer at each frame, measures such as stroke rate are generally aggregated over an entire swimming lap. Vision-based techniques which can automatically, objectively and reliably track the swimmer's location can potentially solve these issues and allow for large-scale analysis of a swimmer across many videos. However, the aquatic environment is challenging due to fluctuations in the scene from splashes and reflections, and because swimmers are frequently submerged at different points in a race. In this paper, we temporally segment races into distinct, sequential states and propose a multimodal approach which employs individual detectors tuned to each race state. Our approach allows the swimmer to be located and tracked smoothly in each frame despite a diverse range of conditions. We test our approach on a video dataset compiled at the 2012 Australian Short Course Swimming Championships.
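
The sketch below illustrates the multimodal idea of segmenting a race into sequential states and dispatching a detector tuned to each state. The state timings, frame rate and detector stubs are invented placeholders, not the paper's segmentation or detectors.

FPS = 50  # assumed frame rate

def race_state(t_sec, lap_sec=30.0):
    """Rough sequential segmentation of a short-course lap (timings assumed)."""
    t = t_sec % lap_sec
    if t_sec < 1.5:
        return "start"           # dive off the blocks
    if t < 5.0:
        return "underwater"      # submerged glide and kick
    if t > lap_sec - 2.0:
        return "turn"
    return "free_swimming"

def make_detector(name):
    """Placeholder for a detector tuned to one race state."""
    def detect(frame):
        return {"state": name, "bbox": None}  # a real detector would localise the swimmer
    return detect

DETECTORS = {s: make_detector(s)
             for s in ("start", "underwater", "free_swimming", "turn")}

def track(frames):
    """Dispatch each frame to the detector for its race state."""
    return [DETECTORS[race_state(i / FPS)](f) for i, f in enumerate(frames)]

states = [d["state"] for d in track(range(20 * FPS))]
print(states[::FPS])  # one sampled state per second of 'video'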

Relevance:

10.00%

Publisher:

Abstract:

Purpose. To compare the self-assessed driving habits and skills of licensed drivers with central visual loss who use bioptic telescopes to those of age-matched normally sighted drivers, and to examine the association between bioptic drivers' impressions of the quality of their driving and ratings by a “backseat” evaluator. Methods. Participants were licensed bioptic drivers (n = 23) and age-matched normally sighted drivers (n = 23). A questionnaire was administered addressing driving difficulty, space, quality, exposure, and, for bioptic drivers, whether the telescope was helpful in on-road situations. Visual acuity and contrast sensitivity were assessed. Information on ocular diagnosis, telescope characteristics, and bioptic driving experience was collected from the medical record or in an interview. On-road driving performance in regular traffic conditions was rated independently by two evaluators. Results. Like normally sighted drivers, bioptic drivers reported no or little difficulty in many driving situations (e.g., left turns, rush hour), but reported more difficulty under poor visibility conditions and in unfamiliar areas (P < 0.05). Driving exposure was reduced in bioptic drivers (250 miles driven per week on average vs. 410 miles per week for normally sighted drivers, P = 0.02), but driving space was similar to that of normally sighted drivers (P = 0.29). All but one bioptic driver used the telescope in at least one driving task, and 56% used the telescope in three or more tasks. Bioptic drivers' judgments about the quality of their driving were very similar to the backseat evaluators' ratings. Conclusions. Bioptic drivers show insight into the overall quality of their driving and the areas in which they experience driving difficulty. They report using the bioptic telescope while driving, contrary to previous claims that it is primarily used to pass the vision screening test at licensure.

Relevance:

10.00%

Publisher:

Abstract:

Online dating, a new community- and communication-oriented type of social network, is gaining momentum. With many people joining a dating network, users become overwhelmed by the choices for an ideal partner. A solution to this problem is to provide users with partner recommendations based on their interests and activities. Traditional recommendation methods ignore differences in users' needs and provide recommendations in the same way to all users. In this paper, we propose a recommendation approach that employs different recommendation strategies for different groups of members. A segmentation method using the Gaussian Mixture Model (GMM) is proposed to capture users' differing needs, and a targeted recommendation strategy is then applied to each identified segment. Empirical results show that the proposed approach outperforms several existing recommendation methods.
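
A minimal sketch of the segmentation step using scikit-learn's GaussianMixture; the user-activity features, number of components and per-segment strategies are hypothetical.

import numpy as np
from sklearn.mixture import GaussianMixture

# hypothetical per-user activity features, e.g. [logins/week, messages sent,
# profile views, reply rate], scaled to [0, 1]
X = np.random.rand(500, 4)

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
segments = gmm.fit_predict(X)            # GMM-based user segmentation

# hypothetical per-segment recommendation strategies
STRATEGY = {0: "reciprocal-interest", 1: "popularity-based", 2: "activity-matched"}
for seg in np.unique(segments):
    print(f"segment {seg}: {np.sum(segments == seg)} users -> {STRATEGY[seg]}")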

Relevance:

10.00%

Publisher:

Abstract:

The rapid development of the World Wide Web has created massive amounts of information, leading to the information overload problem. Under this circumstance, personalization techniques have been developed to help users find content that meets their personal interests or needs out of this massively increasing information. User profiling techniques play the core role in this research. Traditionally, most user profiling techniques create user representations in a static way; however, in real-world applications user interests change with time. In this research we develop algorithms for mining user interests by integrating time decay mechanisms into topic-based user interest profiling. Time-forgetting functions are integrated into the calculation of topic interest measurements at a fine-grained level. The experimental study shows that accounting for the temporal effects of user interests by integrating time-forgetting mechanisms yields better recommendation performance.
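
The sketch below illustrates a topic-based profile with an exponential time-forgetting function; the half-life value and the event format are assumptions made for illustration.

from collections import defaultdict

HALF_LIFE_DAYS = 30.0   # assumed decay rate

def decay_weight(age_days, half_life=HALF_LIFE_DAYS):
    """Time-forgetting function: a signal's weight halves every half_life days."""
    return 0.5 ** (age_days / half_life)

def topic_profile(events, now):
    """events: (timestamp_days, topic, raw_interest) tuples from user history."""
    profile = defaultdict(float)
    for t, topic, interest in events:
        profile[topic] += interest * decay_weight(now - t)
    return dict(profile)

events = [(0, "cameras", 1.0), (50, "cameras", 1.0), (58, "running", 1.0)]
print(topic_profile(events, now=60))  # recent interests dominate older ones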

Relevance:

10.00%

Publisher:

Abstract:

Most recommender systems use collaborative filtering, content-based filtering or a hybrid approach to recommend items to new users. Collaborative filtering recommends items to new users based on their similar neighbours, while the content-based filtering approach tries to recommend items that are similar to new users' profiles. The fundamental issues include how to profile new users and how to deal with over-specialization in content-based recommender systems. Notably, the terms used to describe items can be organized into a concept hierarchy; we therefore aim to describe user profiles, or information needs, using concept vectors. This paper presents a new method to acquire users' information needs which allows new users to describe their preferences over a concept hierarchy rather than by rating items. It also develops a new ranking function to recommend items to new users based on their information needs. The proposed approach is evaluated on Amazon book datasets, and the experimental results demonstrate that it can significantly improve the effectiveness of recommender systems.
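
As an illustration of concept-vector profiling over a hierarchy, the sketch below propagates stated preferences to sub-concepts and scores items by cosine similarity. The toy hierarchy, damping factor and similarity choice are assumptions, not the paper's ranking function.

import numpy as np

# toy concept hierarchy: general concept -> sub-concepts (assumed)
HIERARCHY = {"science": ["physics", "biology"], "fiction": ["fantasy"]}
CONCEPTS = ["science", "physics", "biology", "fiction", "fantasy"]
IDX = {c: i for i, c in enumerate(CONCEPTS)}

def expand(weights, damp=0.5):
    """Propagate stated preferences down the hierarchy with damping."""
    v = np.zeros(len(CONCEPTS))
    for concept, w in weights.items():
        v[IDX[concept]] += w
        for child in HIERARCHY.get(concept, []):
            v[IDX[child]] += damp * w
    return v

def rank(user_prefs, items):
    """Score items by cosine similarity between concept vectors."""
    u = expand(user_prefs)
    def score(tags):
        x = expand(tags)
        return float(u @ x / (np.linalg.norm(u) * np.linalg.norm(x) + 1e-9))
    return sorted(((n, score(t)) for n, t in items.items()), key=lambda kv: -kv[1])

items = {"book_a": {"physics": 1.0}, "book_b": {"fantasy": 1.0}}
print(rank({"science": 1.0}, items))  # new user states concept preferences, not ratings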

Relevance:

10.00%

Publisher:

Abstract:

Different reputation models are used on the web to generate reputation values for products from users' review data. Most current reputation models use review ratings and neglect users' textual reviews, because text is more difficult to process. However, we argue that an overall reputation score for an item does not reflect the actual reputation of all of its features, which is why using users' textual reviews is necessary. In our work we introduce a new reputation model that defines a new aggregation method for users' opinions about product features extracted from review text. Our model uses a feature ontology to define the general features and sub-features of a product. It also reflects the frequencies of positive and negative opinions. We provide a case study to show how our results compare with those of other reputation models.
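
A toy sketch of the aggregation idea: per-feature opinion scores weighted by positive/negative frequency, rolled up through a feature ontology. The ontology and scoring rule are illustrative assumptions.

from collections import Counter

# assumed feature ontology for a phone: general feature -> sub-features
ONTOLOGY = {"battery": ["battery life", "charging"], "screen": ["brightness"]}

def feature_scores(opinions):
    """opinions: (feature, polarity) pairs mined from review text, polarity
    in {+1, -1}; returns a frequency-weighted score per feature in [-1, 1]."""
    pos, neg = Counter(), Counter()
    for feature, polarity in opinions:
        (pos if polarity > 0 else neg)[feature] += 1
    return {f: (pos[f] - neg[f]) / (pos[f] + neg[f]) for f in set(pos) | set(neg)}

def rollup(scores):
    """Aggregate sub-feature scores up to each general feature."""
    out = {}
    for parent, subs in ONTOLOGY.items():
        vals = [scores[s] for s in subs if s in scores]
        if parent in scores:
            vals.append(scores[parent])
        if vals:
            out[parent] = sum(vals) / len(vals)
    return out

ops = [("battery life", +1), ("battery life", +1), ("charging", -1), ("brightness", +1)]
print(rollup(feature_scores(ops)))   # {'battery': 0.0, 'screen': 1.0}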

Relevance:

10.00%

Publisher:

Abstract:

Fire incidents in buildings are common, so the fire safety design of framed structures is imperative, especially for unprotected or partly protected bare steel frames. However, software for structural fire analysis is not widely available. As a result, performance-based structural fire design should be enabled through user-friendly, conventional nonlinear computer analysis programs, so that engineers do not need to acquire new structural analysis software for structural fire analysis and design. Such a tool should be capable of efficiently simulating different fire scenarios and their associated detrimental effects, including second-order P-Δ and P-δ effects and material yielding. Moreover, the nonlinear behaviour of a large-scale structure becomes complicated under fire, so its simulation relies on an efficient and effective numerical analysis to cope with the intricate nonlinear effects due to fire. To this end, the present fire study utilizes the second-order elastic/plastic analysis software NIDA to predict the structural behaviour of bare steel framed structures at elevated temperatures. The study considers thermal expansion and material degradation due to heating. Degradation of material strength with increasing temperature is included via a set of temperature-stress-strain curves, mainly according to BS5950 Part 8, which implicitly allows for creep deformation. The finite element stiffness formulation of the beam-column elements is derived from the fifth-order PEP element, which facilitates computer modeling with one element per member. The Newton-Raphson method is used in the nonlinear solution procedure to trace the nonlinear equilibrium path at specified elevated temperatures. Several numerical and experimental verifications of framed structures are presented and compared against solutions in the literature. The proposed method permits engineers to adopt performance-based structural fire analysis and design using typical second-order nonlinear structural analysis software.
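
As a one-degree-of-freedom illustration of the Newton-Raphson equilibrium iteration with temperature-degraded stiffness, consider the sketch below. The reduction factor and the cubic softening term are invented for illustration; they are not the BS5950 curves or the PEP element formulation.

def reduction(T):
    """Toy stiffness/strength reduction factor at temperature T (deg C)."""
    return max(0.05, 1.0 - 0.0008 * max(T - 100.0, 0.0))

def solve(P, T, k0=1.0e4, tol=1e-8, max_iter=50):
    """Newton-Raphson equilibrium iteration for a softening 1-DOF spring."""
    k = k0 * reduction(T)                       # degraded stiffness at T
    u = 0.0
    for _ in range(max_iter):
        resid = P - (k * u - 0.1 * k * u**3)    # out-of-balance force
        if abs(resid) < tol:
            return u
        k_tan = k - 0.3 * k * u**2              # tangent stiffness
        u += resid / k_tan                      # Newton-Raphson update
    raise RuntimeError("equilibrium iteration did not converge")

for T in (20, 400, 600):                        # deflection grows as stiffness degrades
    print(T, round(solve(P=500.0, T=T), 5))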

Relevance:

10.00%

Publisher:

Abstract:

To minimise the number of load sheddings in a microgrid (MG) during autonomous operation, islanded neighbouring MGs can be interconnected if they are on a self-healing network and extra generation capacity is available in the distributed energy resources (DERs) of one of the MGs. In this way, the total load in the system of interconnected MGs can be shared by all the DERs within those MGs. However, this requires a carefully designed self-healing and supply restoration control algorithm, protection systems and communication infrastructure at the network and MG levels. In this study, a hierarchical control structure is first discussed for interconnecting neighbouring autonomous MGs, with the introduced primary control level being the main focus. Through the developed primary control level, this study demonstrates how the parallel DERs in a system of multiple interconnected autonomous MGs can properly share the load of the system. The controller is designed such that the converter-interfaced DERs operate in a voltage-controlled mode following a decentralised power sharing algorithm based on droop control. The DER converters are controlled with a per-phase technique instead of the conventional direct-quadrature (dq) transformation technique. In addition, linear quadratic regulator-based state feedback controllers, which are more stable than conventional proportional-integral controllers, are utilised to prevent instability and poor dynamic performance of the DERs when autonomous MGs are interconnected. The efficacy of the primary control level of the DERs in the system of multiple interconnected autonomous MGs is validated through PSCAD/EMTDC simulations considering detailed dynamic models of the DERs and converters.
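
The decentralised power sharing the study builds on can be summarised by conventional P-f and Q-V droop characteristics, sketched below; the nominal values and droop gains are illustrative assumptions.

F_NOM, V_NOM = 50.0, 1.0   # assumed nominal frequency (Hz) and voltage (p.u.)

def droop_setpoints(P, Q, P_rated, Q_rated, m=0.01, n=0.05):
    """Frequency/voltage references from measured real and reactive power.
    m, n are droop gains; the values here are illustrative."""
    f_ref = F_NOM - m * (P / P_rated)   # P-f droop
    v_ref = V_NOM - n * (Q / Q_rated)   # Q-V droop
    return f_ref, v_ref

# two DERs of different ratings: equal per-unit loading -> equal frequency,
# so the system load splits in proportion to the ratings
print(droop_setpoints(P=0.5, Q=0.1, P_rated=1.0, Q_rated=0.5))
print(droop_setpoints(P=1.0, Q=0.2, P_rated=2.0, Q_rated=1.0))

With equal per-unit droop gains, the interconnected DERs settle at a common frequency and share the total load in proportion to their ratings, which is what makes the scheme decentralised: no communication is needed for steady-state sharing.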

Relevance:

10.00%

Publisher:

Abstract:

Introduction. Informal caring networks contribute significantly to end-of-life (EOL) care in the community. However, to ensure that these networks are sustainable and that unpaid carers are not exploited, primary carers need permission and practical assistance to gather networks together and negotiate the help they need. Our aim in this study was to develop an understanding of how formal and informal carers work together when care is being provided in a dying person's home. We were particularly interested in formal providers' perceptions and knowledge of informal networks of care and in identifying barriers to the networks working together. Methods. Qualitative methods, informed by an interpretive approach, were used. Between February and July 2012, 10 focus groups comprising 88 participants were conducted in urban, regional, and rural Australia. Findings. Our findings show that formal providers are aware, and supportive, of the vital role informal networks play in the care of the dying at home. A number of barriers to formal and informal networks working together more effectively were identified. In particular, we found that the Australian policy of health-promoting palliative care is not substantially translating into practice. Conclusion. Combinations of formal and informal caring networks are essential to support dying people and their primary carers. Formal service providers do little to establish, support, or maintain the informal networks, although there is much goodwill and scope for them to do so. Further re-orientation towards a health-promoting palliative care and community capacity-building approach is suggested.

Relevance:

10.00%

Publisher:

Abstract:

Increased levels of polybrominated diphenyl ethers (PBDEs) can occur particularly in dust and soil surrounding facilities that recycle products containing PBDEs, which may be a source of increased exposure for nearby workers and residents. To investigate, we measured PBDE levels in the soil, office dust and blood of workers at the closest workplace (i.e. within 100 m) to a large automotive shredding and metal recycling facility in Brisbane, Australia. The workplace investigated in this study was independent of the automotive shredding facility and was one of approximately 50 businesses of varying types within a relatively large commercial/industrial area surrounding the recycling facility. Concentrations of PBDEs in soils were at least an order of magnitude greater than background levels in the area. Congener profiles were dominated by higher molecular weight congeners, in particular BDE-209, reflecting the profile in outdoor air samples previously collected at this site. Biomonitoring data from blood serum indicated no differential exposure for workers near the recycling facility compared to a reference group of office workers, also in Brisbane. Unlike the air, indoor dust and soil sample profiles, serum samples from both worker groups were dominated by congeners BDE-47, BDE-153, BDE-99, BDE-100 and BDE-183 and were similar to the profile previously reported in the general Australian population. Estimated exposures for workers near the industrial point source suggested indoor workers had significantly higher exposure than outdoor workers due to their exposure to indoor dust rather than soil. However, no relationship was observed between blood PBDE levels and the different roles and activity patterns of workers on-site. These comparisons of PBDE levels in serum provide additional insight into inter-individual variability within Australia. The results also indicate that congener patterns in the workplace environment did not match the blood profiles of workers, which was attributed to the relatively high background exposures of the general Australian population via dietary intake and the home environment.

Relevance:

10.00%

Publisher:

Abstract:

Recently, the botnet, a network of compromised computers, has been recognized as the biggest threat to the Internet. The bots in a botnet communicate with the botnet owner via a communication channel called the Command and Control (C & C) channel. There are three main types of C & C channel: Internet Relay Chat (IRC), Peer-to-Peer (P2P) and web-based protocols. By exploiting the flexibility of Web 2.0 technology, web-based botnets have reached a new level of sophistication. In August 2009, such a botnet was found on Twitter, one of the most popular Web 2.0 services. In this paper, we describe a new type of botnet that uses a Web 2.0 service as a C & C channel and as temporary storage for its stolen information. We then propose a novel approach to thwart this type of attack. Our method combines a unique identifier of the computer, an encryption algorithm with session keys, and CAPTCHA verification.
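
A toy sketch of the three proposed ingredients (a machine-bound identifier, a per-session encryption key, and a CAPTCHA gate), using the third-party 'cryptography' package. This is an assumed composition for illustration, not the paper's exact protocol.

import hashlib
import uuid
from cryptography.fernet import Fernet   # third-party 'cryptography' package

def machine_id():
    """Hash of a hardware-derived value (the MAC address via uuid.getnode())."""
    return hashlib.sha256(str(uuid.getnode()).encode()).hexdigest()

def new_session():
    """Fresh symmetric session key, one per authenticated session."""
    key = Fernet.generate_key()
    return key, Fernet(key)

def send(fernet, payload: bytes) -> bytes:
    # bind the message to this machine so tokens replayed by bots elsewhere fail
    return fernet.encrypt(machine_id().encode() + b"|" + payload)

# a CAPTCHA challenge would gate new_session() so an automated bot cannot
# establish the channel; solving it is outside the scope of this sketch
key, f = new_session()
token = send(f, b"status update")
print(f.decrypt(token))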