955 results for Large datasets


Relevance: 20.00%

Abstract:

Purpose: Data from two randomized phase III trials were analyzed to evaluate prognostic factors and treatment selection in the first-line management of advanced non-small cell lung cancer patients with performance status (PS) 2. Patients and Methods: Patients randomized to combination chemotherapy (carboplatin and paclitaxel) in one trial and single-agent therapy (gemcitabine or vinorelbine) in the second were included in these analyses. Both studies had identical eligibility criteria and were conducted simultaneously. Comparison of efficacy and safety was performed between the two cohorts. A regression analysis identified prognostic factors and subgroups of patients that may benefit from combination or single-agent therapy. Results: Two hundred one patients were treated with combination and 190 with single-agent therapy. Objective response rates were 37 and 15%, respectively. Median time to progression was 4.6 months in the combination arm and 3.5 months in the single-agent arm (p < 0.001). Median survival times were 8.0 and 6.6 months, and 1-year survival rates were 31 and 26%, respectively. Albumin <3.5 g, extrathoracic metastases, lactate dehydrogenase ≥200 IU, and 2 comorbid conditions predicted outcome. Patients with 0-2 risk factors had similar outcomes independent of treatment, whereas patients with 3-4 factors had a nonsignificant improvement in median survival with combination chemotherapy. Conclusion: Our results show that PS 2 non-small cell lung cancer patients are a heterogeneous group who have significantly different outcomes. Patients treated with first-line combination chemotherapy had a higher response rate and longer time to progression, whereas overall survival did not appear significantly different. A prognostic model may be helpful in selecting PS 2 patients for either treatment strategy. © 2009 by the International Association for the Study of Lung Cancer.
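The prognostic model described above reduces to counting four risk factors. A minimal sketch follows; the g/dL unit for albumin and the "two or more comorbidities" reading are assumptions, since the abstract only says "Albumin <3.5 g" and "2 comorbid conditions":

```python
def risk_factor_count(albumin_g_per_dl, extrathoracic_mets, ldh_iu, comorbidities):
    """Count the four prognostic factors reported in the abstract.
    Albumin unit (g/dL) and the >=2 comorbidity cutoff are assumptions."""
    factors = 0
    factors += albumin_g_per_dl < 3.5   # low albumin
    factors += bool(extrathoracic_mets) # extrathoracic metastases present
    factors += ldh_iu >= 200            # elevated lactate dehydrogenase
    factors += comorbidities >= 2       # comorbid conditions
    return factors

# Per the abstract: 0-2 factors -> similar outcomes on either regimen;
# 3-4 factors -> nonsignificant survival trend favouring combination therapy.
print(risk_factor_count(3.0, True, 250, 2))
```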

Relevance: 20.00%

Abstract:

In most of the advanced economies, students are losing interest in careers, especially in engineering and related industries. Hence, western economies are confronting a critical skilled labour shortage in areas of technology, science and engineering. Decisions about career pathways are made as early as the primary years of schooling, and hence cooperation between industry and schools to attract students to the professions is crucial. The aim of this paper is to document how the organisational and institutional elements of one industry-school partnership initiative — The Gateway Schools Program — contribute to productive knowledge sharing and networking. In particular, this paper focuses on an initiative of an Australian State government in response to a perceived crisis around the skills shortage in an economy transitioning from a localised to a global knowledge production economy. The Gateway Schools initiative signals the first sustained attempt in Australia to incorporate schools into production networks through strategic partnerships linking them to partner organisations at the industry level. We provide case examples of how four schools operationalise the partnerships with the minerals and energy industries and how these partnerships, as knowledge assets, impact the delivery of curriculum and capacity building among teachers. Our ultimate goal is to define those characteristics of successful partnerships that contribute to enhanced interest and engagement by students in those careers that are currently experiencing critical shortages.

Relevance: 20.00%

Abstract:

Cooperation and caring are best taught within a group as it promotes connectedness, collaborative effort, and relationship building.

Relevance: 20.00%

Abstract:

This thesis presents a novel approach to mobile robot navigation using visual information towards the goal of long-term autonomy. A novel concept of a continuous appearance-based trajectory is proposed in order to solve the limitations of previous robot navigation systems, and two new algorithms for mobile robots, CAT-SLAM and CAT-Graph, are presented and evaluated. These algorithms yield performance exceeding state-of-the-art methods on public benchmark datasets and large-scale real-world environments, and will help enable widespread use of mobile robots in everyday applications.

Relevance: 20.00%

Abstract:

The huge amount of CCTV footage available makes it very burdensome to process these videos manually through human operators, making automated processing of video footage through computer vision technologies necessary. During the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task, where the system is trained on normal data and is required to detect events which do not fit the learned ‘normal’ model. There is no precise and exact definition of an abnormal activity; it depends on the context of the scene. Hence, different feature sets are required to detect different kinds of abnormal activities. In this work we evaluate the performance of different state-of-the-art features for detecting the presence of abnormal objects in the scene. These include optical flow vectors to detect motion-related anomalies, and textures of optical flow and image textures to detect the presence of abnormal objects. These extracted features, in different combinations, are modelled using different state-of-the-art models such as the Gaussian mixture model (GMM) and the semi-2D hidden Markov model (HMM) to analyse their performance. Further, we apply perspective normalisation to the extracted features to compensate for perspective distortion due to the distance between the camera and the objects under consideration. The proposed approach is evaluated using the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
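The novelty-detection formulation described above can be sketched with a toy example: fit a probabilistic model to "normal" motion features and flag low-likelihood observations. A single Gaussian stands in for the paper's GMM, and the synthetic features and threshold are invented for illustration:

```python
import numpy as np

def fit_normal_model(features):
    """Fit one Gaussian to 'normal' training features.
    (The paper uses a full GMM; one component keeps the sketch short.)"""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False) + 1e-6 * np.eye(features.shape[1])
    return mu, cov

def log_likelihood(x, mu, cov):
    # Log of the multivariate normal density at x.
    d = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.inv(cov) @ d + logdet + len(mu) * np.log(2 * np.pi))

def is_abnormal(x, mu, cov, threshold):
    # An event is flagged when it fits the learned 'normal' model poorly.
    return log_likelihood(x, mu, cov) < threshold

# Toy optical-flow-like features: normal motion clusters near the origin.
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(500, 2))
mu, cov = fit_normal_model(normal)
threshold = -8.0  # in practice tuned on held-out normal data
print(is_abnormal(np.array([0.1, -0.2]), mu, cov, threshold))  # typical motion
print(is_abnormal(np.array([9.0, 9.0]), mu, cov, threshold))   # far from normal
```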

Relevance: 20.00%

Abstract:

Several I- and A-type granite and syenite plutons and spatially associated, giant Fe–Ti–V deposit-bearing mafic–ultramafic layered intrusions occur in the Pan–Xi (Panzhihua–Xichang) area within the inner zone of the Emeishan large igneous province (ELIP). These complexes are interpreted to be related to the Emeishan mantle plume. We present LA-ICP-MS and SIMS zircon U–Pb ages and Hf–Nd isotopic compositions for the gabbros, syenites and granites from these complexes. The dating shows that the age of the felsic intrusive magmatism (256.2 ± 3.0 to 259.8 ± 1.6 Ma) is indistinguishable from that of the mafic intrusive magmatism (255.4 ± 3.1 to 259.5 ± 2.7 Ma) and represents the final phase of a continuous magmatic episode that lasted no more than 10 Myr. The upper gabbros in the mafic–ultramafic intrusions are generally more isotopically enriched (lower εNd and εHf) than the middle and lower gabbros, suggesting that the upper gabbros have experienced a higher level of crustal contamination than the lower gabbros. The significantly positive εHf(t) values of the A-type granites and syenites (+4.9 to +10.8) are higher than those of the upper gabbros of the associated mafic intrusion, which shows that they cannot be derived by fractional crystallization of these bodies. They are, however, identical to those of the mafic enclaves (+7.0 to +11.4) and the middle and lower gabbros, implying that they are cogenetic. We suggest that they were generated by fractionation of large-volume, plume-related basaltic magmas that ponded deep in the crust. The deep-seated magma chamber erupted in two stages: the first near a density minimum in the basaltic fractionation trend, and the second during the final stage of fractionation, when the magma was a low-density, Fe-poor, Si-rich felsic magma. The basaltic magmas emplaced in the shallow-level magma chambers differentiated to form mafic–ultramafic layered intrusions, accompanied by a small amount of crustal assimilation through roof melting. Evolved A-type granites (syenites and syenodiorites) were produced dominantly by crystallization in the deep crustal magma chamber. In contrast, the I-type granites have negative εNd(t) (−6.3 to −7.5) and εHf(t) (−1.3 to −6.7) values, with Nd model ages (T_DM2) of 1.63–1.67 Ga and Hf model ages (T_DM2) of 1.56–1.58 Ga, suggesting that they were mainly derived from partial melting of Mesoproterozoic crust. In combination with previous studies, this study also shows that plume activity not only gave rise to reworking of ancient crust, but also to significant growth of juvenile crust in the center of the ELIP.

Relevance: 20.00%

Abstract:

Statistical methodology was applied to a survey of time-course incidence of four viruses (alfalfa mosaic virus, clover yellow vein virus, subterranean clover mottle virus and subterranean clover red leaf virus) in improved pastures in southern regions of Australia. -from Authors

Relevance: 20.00%

Abstract:

Due to the demand for better and deeper analysis in sports, organizations (both professional teams and broadcasters) are looking to use spatiotemporal data, in the form of player tracking information, to obtain an advantage over their competitors. However, due to the large volume of data, its unstructured nature, and the lack of associated team activity labels (e.g. strategic/tactical), effective and efficient strategies to deal with such data have yet to be deployed. A bottleneck restricting such solutions is the lack of a suitable representation (i.e. ordering of players) which is immune to the very large number of possible permutations of player orderings, in addition to the high dimensionality of the temporal signal (e.g. a game of soccer lasts for 90 minutes). We leverage a recent method which utilizes a "role representation", as well as a feature reduction strategy that uses a spatiotemporal bilinear basis model, to form a compact spatiotemporal representation. Using this representation, we find the most likely formation patterns of a team associated with match events across nearly 14 hours of continuous player and ball tracking data in soccer. Additionally, we show that we can accurately segment a match into distinct game phases and detect highlights (i.e. shots, corners, free-kicks, etc.) completely automatically using a decision-tree formulation.
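A decision-tree formulation for tagging match events could look roughly like the sketch below; the features, thresholds and event labels are invented for illustration and are not taken from the paper:

```python
def classify_event(ball_x, ball_speed, in_play):
    """Toy decision-tree-style rules for tagging soccer match events.
    ball_x is a normalised pitch position (0 = own goal, 1 = opponent goal);
    all names and thresholds here are hypothetical."""
    if not in_play:
        # Dead-ball situations: distinguish corners from other restarts.
        if ball_x > 0.95:
            return "corner"
        return "free-kick"
    # Fast ball movement near the opponent goal suggests a shot.
    if ball_speed > 20.0 and ball_x > 0.8:
        return "shot"
    return "open-play"

print(classify_event(0.97, 0.0, False))   # restart deep in the corner
print(classify_event(0.85, 25.0, True))   # fast ball near the goal
```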

Relevance: 20.00%

Abstract:

Efficient and effective feature detection and representation is an important consideration when processing videos, and a large number of applications such as motion analysis, 3D scene understanding and tracking depend on this. Amongst several feature description methods, local features are becoming increasingly popular for representing videos because of their simplicity and efficiency. While they achieve state-of-the-art performance with low computational complexity, their performance is still too limited for real-world applications. Furthermore, rapid increases in the uptake of mobile devices have increased the demand for algorithms that can run with reduced memory and computational requirements. In this paper we propose a semi-binary feature detector-descriptor based on the BRISK detector, which can detect and represent videos with significantly reduced computational requirements, while achieving performance comparable to state-of-the-art spatio-temporal feature descriptors. First, the BRISK feature detector is applied on a frame-by-frame basis to detect interest points; then the detected key points are compared against consecutive frames for significant motion. Key points with significant motion are encoded with the BRISK descriptor in the spatial domain and the Motion Boundary Histogram in the temporal domain. This descriptor is not only lightweight but also has lower memory requirements because of the binary nature of the BRISK descriptor, allowing the possibility of applications on hand-held devices. We evaluate the combined detector-descriptor performance in the context of action classification with a standard, popular bag-of-features with SVM framework. Experiments are carried out on two popular datasets with varying complexity, and we demonstrate comparable performance to other descriptors with reduced computational complexity.
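The pruning step (keep only key points that show significant motion between consecutive frames) can be sketched as follows. Plain pixel patches stand in for the BRISK machinery, and the patch size and threshold are arbitrary:

```python
import numpy as np

def motion_filter(keypoints, frame_t, frame_t1, patch=3, thresh=10.0):
    """Keep only keypoints whose local patch changes significantly between
    consecutive frames. The paper detects points with BRISK; here the
    keypoints are simply given, and mean absolute patch difference is the
    motion test."""
    kept = []
    for (r, c) in keypoints:
        p0 = frame_t[r - patch:r + patch + 1, c - patch:c + patch + 1].astype(float)
        p1 = frame_t1[r - patch:r + patch + 1, c - patch:c + patch + 1].astype(float)
        if np.abs(p0 - p1).mean() > thresh:
            kept.append((r, c))
    return kept

# Two synthetic frames: a bright blob moves right, the background is static.
f0 = np.zeros((32, 32), dtype=np.uint8)
f1 = np.zeros((32, 32), dtype=np.uint8)
f0[10:14, 10:14] = 255
f1[10:14, 14:18] = 255   # blob shifted right by 4 pixels
static_kp, moving_kp = (25, 25), (12, 12)
print(motion_filter([static_kp, moving_kp], f0, f1))
```

Only the key point on the moving blob survives; the static one is dropped before any descriptor is computed, which is where the computational savings come from.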

Relevance: 20.00%

Abstract:

Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information, be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or information which can give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the datasets collected and analyzed are pre-formed, that is, they are built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders, for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as authoritative sources.
Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect and make the tweets available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created — in a way, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods to extract smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid in future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
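The content-analysis idea in the first paper (score each incoming tweet for topical relevance and urgency, then surface only the highest-scoring ones to responders) can be sketched as follows; the term lists and weights are invented for illustration:

```python
def tweet_score(text, topic_terms, urgency_terms):
    """Score a tweet by topical relevance and urgency, roughly following the
    panel's content-analysis proposal. Term lists and weights are made up."""
    words = [w.strip('#,.!:') for w in text.lower().split()]
    topical = sum(w in topic_terms for w in words)
    urgent = sum(w in urgency_terms for w in words)
    return 1.0 * topical + 2.0 * urgent  # urgency weighted higher

# Hypothetical term lists for a flood event.
topic = {"flood", "evacuation", "brisbane"}
urgency = {"trapped", "help", "urgent"}
print(tweet_score("URGENT: family trapped by flood in Brisbane", topic, urgency))
```

In a live system the topic list itself would be refined over time, matching the paper's point that keyword creation is iterative.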

Relevance: 20.00%

Abstract:

In this paper, we provide an overview of the Social Event Detection (SED) task that is part of the MediaEval Benchmark for Multimedia Evaluation 2013. This task requires participants to discover social events and organize the related media items in event-specific clusters within a collection of Web multimedia. Social events are events that are planned by people, attended by people, and for which the social multimedia are also captured by people. We describe the challenges, datasets, and the evaluation methodology.

Relevance: 20.00%

Abstract:

As the all-atom molecular dynamics method is limited by its enormous computational cost, various coarse-grained strategies have been developed to extend the accessible length scales of soft matter in the modeling of mechanical behaviors. However, the classical thermostat algorithms used in highly coarse-grained molecular dynamics methods underestimate the thermodynamic behaviors of soft matter (e.g. microfilaments in cells), which can weaken the ability of materials to overcome local energy traps in granular modeling. Based on all-atom molecular dynamics modeling of microfilament fragments (G-actin clusters), a new stochastic thermostat algorithm is developed to retain the representation of the thermodynamic properties of microfilaments at an extra coarse-grained level. The accuracy of this stochastic thermostat algorithm is validated against all-atom MD simulation. This new stochastic thermostat algorithm provides an efficient way to investigate the thermomechanical properties of large-scale soft matter.
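The abstract does not give the algorithm itself, but a generic stochastic (Langevin-type) thermostat illustrates the idea: a friction term drains kinetic energy while a matched random force injects it back, so the coarse-grained degrees of freedom keep sampling the target temperature:

```python
import math
import random

def langevin_step(v, mass, gamma, kT, dt, rng):
    """One exact velocity update of a Langevin (Ornstein-Uhlenbeck)
    thermostat for a free particle: exponential friction decay plus a
    Gaussian kick sized so the stationary velocity variance is kT/m.
    This is a generic scheme, not the paper's specific algorithm."""
    c1 = math.exp(-gamma * dt)
    sigma = math.sqrt(kT / mass * (1.0 - c1 * c1))
    return c1 * v + sigma * rng.gauss(0.0, 1.0)

rng = random.Random(42)
kT, mass, gamma, dt = 1.0, 1.0, 1.0, 0.01
v = 0.0
samples = []
for _ in range(100_000):
    v = langevin_step(v, mass, gamma, kT, dt, rng)
    samples.append(v * v)

# By equipartition, <v^2> should approach kT/m = 1.
print(sum(samples) / len(samples))
```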

Relevance: 20.00%

Abstract:

This paper presents an accurate and robust geometric and material nonlinear formulation to predict the structural behaviour of unprotected steel members at elevated temperatures. A fire analysis including large-displacement effects for frame structures is presented. This finite element formulation of beam-column elements is based on the plastic hinge approach to model the elasto-plastic, strain-hardening material behaviour. The Newton-Raphson method, allowing for thermal time-dependent effects, was employed for the solution of the nonlinear governing equations for large deflection over the thermal history. A combined incremental and total formulation for determining member resistance is employed in this nonlinear solution procedure for the efficient modelling of nonlinear effects. Degradation of material strength with increasing temperature is simulated by a set of temperature-stress-strain curves according to both ECCS and BS5950 Part 8, which implicitly allow for creep deformation. The effects of uniform or non-uniform temperature distribution over the section of the structural steel member are also considered. Several numerical and experimental verifications are presented.
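The Newton-Raphson iteration used for the nonlinear governing equations can be illustrated on a one-degree-of-freedom problem; the softening spring below is a hypothetical stand-in for a member whose stiffness degrades with deflection, not the paper's element formulation:

```python
def newton_raphson(residual, tangent, u0, tol=1e-10, max_iter=50):
    """Generic Newton-Raphson iteration: repeatedly correct the displacement
    by the residual divided by the tangent stiffness until equilibrium."""
    u = u0
    for _ in range(max_iter):
        r = residual(u)
        if abs(r) < tol:
            return u
        u -= r / tangent(u)
    raise RuntimeError("Newton-Raphson did not converge")

# Softening spring: internal force k*u - a*u**3 must balance external load P.
k, a, P = 100.0, 5.0, 50.0
residual = lambda u: k * u - a * u**3 - P   # out-of-balance force
tangent = lambda u: k - 3 * a * u**2        # tangent stiffness dR/du
u = newton_raphson(residual, tangent, u0=0.0)
print(round(u, 4))
```

In the paper's setting the scalar residual becomes a vector of out-of-balance member forces and the tangent a stiffness matrix that is reassembled as temperature rises.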

Relevance: 20.00%

Abstract:

Background: Nontuberculous mycobacteria (NTM) are normal inhabitants of a variety of environmental reservoirs, including natural and municipal water. The aim of this study was to document the variety of species of NTM in potable water in Brisbane, QLD, with a specific interest in the main pathogens responsible for disease in this region, and to explore factors associated with the isolation of NTM. One-litre water samples were collected from 189 routine collection sites in summer and 195 sites in winter. Samples were split, with half decontaminated with 0.005% CPC, then concentrated by filtration and cultured on 7H11 plates and in MGIT tubes (winter only). Results: Mycobacteria were grown from 40.21% of sites in summer (76/189) and 82.05% of sites in winter (160/195). The winter samples yielded the greatest number and variety of mycobacteria, as there was a high degree of subculture overgrowth and contamination in summer. Of those samples that did yield mycobacteria in summer, the variety of species differed from those isolated in winter. The inclusion of liquid media increased the yield for some species of NTM. Species that have been documented to cause disease in humans residing in Brisbane and were also found in water include M. gordonae, M. kansasii, M. abscessus, M. chelonae, M. fortuitum complex, M. intracellulare, M. avium complex, M. flavescens, M. interjectum, M. lentiflavum, M. mucogenicum, M. simiae, M. szulgai and M. terrae. M. kansasii was frequently isolated, but M. avium and M. intracellulare (the main pathogens responsible for disease in QLD) were isolated infrequently. The distance of the sampling site from the treatment plant in summer was associated with isolation of NTM. Pathogenic NTM (defined as those known to cause disease in QLD) were more likely to be identified from sites with narrower-diameter pipes, predominantly distribution sample points, and from sites with asbestos cement or modified PVC pipes.
Conclusions: NTM responsible for human disease can be found in large urban water distribution systems in Australia. Based on our findings, additional point chlorination, maintenance of more constant pressure gradients in the system, and the utilisation of particular pipe materials should be considered.

Relevance: 20.00%

Abstract:

This document describes large, accurately calibrated and time-synchronised datasets, gathered in controlled environmental conditions, using an unmanned ground vehicle equipped with a wide variety of sensors. These sensors include: multiple laser scanners, a millimetre-wave radar scanner, a colour camera and an infra-red camera. Full details of the sensors are given, as well as the calibration parameters needed to locate them with respect to each other and to the platform. This report also specifies the format and content of the data, and the conditions in which the data have been gathered. Data collection was carried out with the vehicle in two different situations: static and dynamic. The static tests consisted of sensing a fixed ‘reference’ terrain, containing simple known objects, from a motionless vehicle. For the dynamic tests, data were acquired from a moving vehicle in various environments, mainly rural, including an open area, a semi-urban zone and a natural area with different types of vegetation. For both categories, data were gathered in controlled environmental conditions, which included the presence of dust, smoke and rain. Most of the environments involved were static, except for a few specific datasets which involve the presence of a walking pedestrian. Finally, this document presents illustrations of the effects of adverse environmental conditions on sensor data, as a first step towards reliability and integrity in autonomous perceptual systems.