389 results for Large datasets


Relevance:

20.00%

Publisher:

Abstract:

In this issue of Cancer Discovery, Guagnano and colleagues use a large and diverse annotated collection of cancer cell lines, the Cancer Cell Line Encyclopedia, to correlate whole-genome expression and genomic-alteration datasets with cell line sensitivity data for the novel pan-fibroblast growth factor receptor (FGFR) inhibitor NVP-BGJ398. Their findings underscore not only the preclinical utility of such cell line panels in identifying predictive biomarkers, but also the emergence of the FGFRs as valid therapeutic targets across an increasingly broad range of malignancies.

Relevance:

20.00%

Publisher:

Abstract:

The first year of a property degree program is a time to establish threshold concept knowledge and to acculturise students into their discipline or professional group. Because of the foundational nature of first year in many property degrees, students are enrolled in large, multi-disciplinary classes. Delivering large first-year multi-disciplinary units presents several challenges for engaging students in a community of learning and aiding student retention. Through action-based research, this study shows how social networking, particularly Facebook, can be used to create a sense of community across large, multi-disciplinary units, to elicit 'real-time' feedback from students, and to encourage peer-to-peer learning. This study assesses the benefits of using social media and considers the potential limitations of this medium.

Relevance:

20.00%

Publisher:

Abstract:

The use of Bayesian methodologies for solving optimal experimental design problems has increased. Many of these methods are computationally intensive for design problems that require a large number of design points. A simulation-based approach is presented that can solve optimal design problems in which one is interested in finding a large number of (near) optimal design points for a small number of design variables. The approach uses lower-dimensional parameterisations, consisting of a few design variables, that generate multiple design points. One therefore searches over only a few design variables rather than over a large number of optimal design points, providing substantial computational savings. The methodologies are demonstrated on four applications involving nonlinear models, including the selection of sampling times for pharmacokinetic and heat transfer studies. Several Bayesian design criteria are compared and contrasted, as are several different lower-dimensional parameterisation schemes for generating the many design points.
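
The core computational trick, a low-dimensional schedule that generates many design points, can be sketched in a few lines. The following is a minimal illustration, not the paper's code: a two-parameter geometric schedule produces 15 sampling times for a one-compartment exponential-decay model, and a pseudo-Bayesian D-optimality utility is optimised over just those two variables. The model, prior, and utility below are assumptions chosen for illustration.

```python
# Search over 2 design variables (t_max, p) that generate N_TIMES design points,
# instead of optimising the N_TIMES sampling times directly.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N_TIMES = 15                                                   # many design points
THETA_PRIOR = rng.normal([1.0, 0.1], [0.2, 0.02], size=(200, 2))  # draws of (V, k)

def schedule(t_max, p, n=N_TIMES):
    """Generate n sampling times from just two design variables."""
    u = (np.arange(1, n + 1) / n) ** p      # warped uniform grid on (0, 1]
    return t_max * u

def utility(x):
    """Pseudo-Bayesian D-optimality: E_theta[log det Fisher information]."""
    t = schedule(np.exp(x[0]), np.exp(x[1]))    # log-parameters keep both positive
    total = 0.0
    for V, k in THETA_PRIOR:
        c = np.exp(-k * t) / V                  # C(t) = (1/V) exp(-k t)
        J = np.column_stack([-c / V, -t * c])   # sensitivities w.r.t. (V, k)
        total += np.linalg.slogdet(J.T @ J)[1]
    return -total / len(THETA_PRIOR)            # minimise the negative utility

res = minimize(utility, x0=[np.log(24.0), np.log(1.0)], method="Nelder-Mead")
t_max, p = np.exp(res.x)
print("optimal t_max, p:", t_max, p)
print("sampling times:", np.round(schedule(t_max, p), 2))
```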

Relevance:

20.00%

Publisher:

Abstract:

In 2009 and 2010, withdrawal rates of accelerated QUT nursing students from a first-year Pharmacology unit were higher than those of continuing students. The 2011 cohort of 216 accelerated students held university or non-university qualifications or equivalent experience and included domestic and international students. A previously tested intervention was introduced in 2011 to improve retention rates and support all Pharmacology students in their first year of nursing. The intervention involved a community website, online tutors and an "O week" workshop comprising information about library resources, effective learning strategies and learning tips from a previous student, as well as review lectures in anatomy, physiology and microbiology. Withdrawal rates for accelerated students in the Pharmacology unit improved, and all students found the workshop and review lectures informative and valuable. The intervention was therefore successfully transferred to a large, diverse cohort of accelerated nursing students.

Relevance:

20.00%

Publisher:

Abstract:

The research described in this paper forms part of an in-depth investigation of safety culture in one of Australia's largest construction companies. The research builds on a previous qualitative study with organisational safety leaders and further investigates how safety culture is perceived and experienced by organisational members, as well as how this relates to their safety behaviour and related outcomes at work. Participants were 2273 employees of the case study organisation, 689 from the Construction function and 1584 from the Resources function. Several analyses revealed notable organisational variance: the Construction function scored significantly higher on all key measures, namely safety climate, safety motivation, safety compliance, and safety participation. The results are discussed in terms of their relevance in an applied research context.

Relevance:

20.00%

Publisher:

Abstract:

The rapid increase in the deployment of CCTV systems has led to a greater demand for algorithms that can process incoming video feeds and extract information of interest for human operators. Over the past several years, there has been a large effort to detect abnormal activities with computer vision techniques. Typically, the problem is formulated as a novelty detection task: the system is trained on normal data and must detect events that do not fit the learned 'normal' model. Many researchers have tried various sets of features to train different learning models to detect abnormal behaviour in video footage. In this work we propose a Semi-2D Hidden Markov Model (HMM) to model the normal activities of people; observations with insufficient likelihood under the model are identified as abnormal activities. Our Semi-2D HMM models both the temporal and spatial causalities of crowd behaviour by assuming that the current state depends not only on the previous state in the temporal direction, but also on the previous states of adjacent spatial locations. Two HMMs are trained to model the vertical and horizontal spatial causal information respectively. Location features, flow features and optical flow textures are used as the model's features. The proposed approach is evaluated on the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
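
As a rough illustration of the novelty-detection formulation (though not of the Semi-2D spatial coupling itself, which standard libraries do not provide), the sketch below trains a plain temporal Gaussian HMM on 'normal' features with hmmlearn and flags low-likelihood frames. The feature dimensions and threshold quantile are assumptions.

```python
# Novelty detection: fit an HMM to normal activity, flag frames whose
# log-likelihood under the fitted model falls below a low quantile.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(1)
normal_feats = rng.normal(0.0, 1.0, size=(5000, 8))   # stand-in flow features per frame

# Train on normal data only.
model = hmm.GaussianHMM(n_components=5, covariance_type="diag", n_iter=50)
model.fit(normal_feats)

# Threshold: a low quantile of per-frame log-likelihood on held-out normal data.
ll_normal = np.array([model.score(f[None, :]) for f in normal_feats[:500]])
threshold = np.quantile(ll_normal, 0.01)

def is_abnormal(frame_feat):
    """Flag a frame whose likelihood under the 'normal' model is too low."""
    return model.score(frame_feat[None, :]) < threshold

test = rng.normal(4.0, 1.0, size=8)                   # an out-of-distribution frame
print("abnormal:", is_abnormal(test))
```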

Relevance:

20.00%

Publisher:

Abstract:

A numerical study using large-eddy simulation is carried out on the heat and toxic gases released by fires in real road tunnels. Tunnel-fire disasters over the previous decade have drawn increasing research attention to safe and reliable ventilation design. In this research, a real tunnel with a 10 MW fire (approximately the heat release rate of a burning bus) at its midpoint is simulated using FDS (Fire Dynamics Simulator) for different ventilation velocities. Vertical profiles of carbon monoxide concentration and temperature are shown at various locations to explore the flow field. It is found that increasing the longitudinal ventilation velocity reduces the vertical profile gradients of both CO concentration and smoke temperature. However, a relatively large longitudinal ventilation velocity leads to a high similarity between the vertical profile of CO volume concentration and that of temperature rise.
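
A parameter sweep like the one described is typically scripted by generating one FDS input file per ventilation velocity. The sketch below writes minimal, illustrative FDS namelist files; the geometry, mesh and fire settings are placeholders, not the study's configuration.

```python
# Generate one FDS input file per longitudinal ventilation velocity.
from pathlib import Path

VELOCITIES = [1.0, 1.5, 2.0, 2.5, 3.0]   # m/s, longitudinal ventilation

TEMPLATE = """&HEAD CHID='tunnel_{label}' /
&TIME T_END=600.0 /
&MESH IJK=500,20,10, XB=0.0,500.0,0.0,10.0,0.0,5.0 /
&REAC FUEL='PROPANE' /
&SURF ID='FIRE', HRRPUA=1000.0 /      ! 1000 kW/m2 over a 10 m2 patch = 10 MW
&VENT XB=248.0,252.0,4.0,6.5,0.0,0.0, SURF_ID='FIRE' /
&SURF ID='INLET', VEL={neg_v:.2f} /   ! negative VEL pushes air into the domain
&VENT MB='XMIN', SURF_ID='INLET' /
&VENT MB='XMAX', SURF_ID='OPEN' /
&TAIL /
"""

for v in VELOCITIES:
    label = f"{v:.1f}".replace(".", "p")   # FDS CHID may not contain '.'
    Path(f"tunnel_{label}.fds").write_text(TEMPLATE.format(label=label, neg_v=-v))
    print(f"wrote tunnel_{label}.fds")
```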

Relevance:

20.00%

Publisher:

Abstract:

Reliable communication is one of the major concerns in wireless sensor networks (WSNs). Multipath routing is an effective way to improve communication reliability in WSNs. However, most existing multipath routing protocols for sensor networks are reactive and require dynamic route discovery. If there are many sensor nodes between a source and a destination, the route discovery process creates a long end-to-end transmission delay, which causes difficulties for time-critical applications. To overcome this difficulty, efficient route update and maintenance processes are proposed in this paper. They limit routing overhead with a two-tier routing architecture and combine piggybacked and triggered updates to replace the periodic update process, which is the main source of unnecessary routing overhead. Simulations demonstrate the effectiveness of the proposed processes in reducing total routing overhead relative to existing popular routing protocols.
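
The piggyback-plus-triggered-update idea can be illustrated in a few lines. The sketch below is a hypothetical node model, not the paper's protocol: route state rides on ordinary data packets, and dedicated routing messages are generated only when the local topology actually changes.

```python
# Piggyback route state on data packets; flood an update only on topology change.
from dataclasses import dataclass, field

@dataclass
class Packet:
    payload: bytes
    route_info: dict = field(default_factory=dict)   # piggybacked routing state

class SensorNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.routes = {}          # dest -> next hop
        self.neighbors = set()
        self.overhead = 0         # count of dedicated routing messages

    def send_data(self, packet: Packet, dest):
        # Piggyback: attach our current route entry instead of sending a
        # separate periodic update message.
        packet.route_info[self.node_id] = self.routes.get(dest)
        return self.routes.get(dest)

    def on_neighbor_change(self, neighbor, alive: bool):
        # Triggered update: only a real topology change costs routing overhead.
        changed = (neighbor in self.neighbors) != alive
        if alive:
            self.neighbors.add(neighbor)
        else:
            self.neighbors.discard(neighbor)
            self.routes = {d: nh for d, nh in self.routes.items() if nh != neighbor}
        if changed:
            self.overhead += 1    # one triggered update instead of N periodic ones

node = SensorNode("s1")
node.on_neighbor_change("s2", alive=True)
node.routes["sink"] = "s2"
print(node.send_data(Packet(b"temp=21.5"), "sink"), "overhead:", node.overhead)
```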

Relevance:

20.00%

Publisher:

Abstract:

A major factor in the stratospheric collection process is the relative density of particles at the collection altitude. With current aircraft-borne collector plate geometries, one potential extraterrestrial particle of about 10 micron diameter is collected approximately every hour. A new design for the collector plate, termed the Large Area Collector (LAC), offers a factor-of-10 improvement in collection efficiency over the current conventional geometry. Implementing the LAC design on future stratospheric collection flights will provide many opportunities for additional data on both terrestrial and extraterrestrial phenomena. With this improvement in collection efficiency, LACs may provide a suitable number of potential extraterrestrial particles in one short flight of between 4 and 8 hours. Alternatively, total collection periods of approximately 40 hours enhance the probability that rare particles can be retrieved from the stratosphere. The latter approach is of great value to the cosmochemist who may wish to perform sophisticated analyses on interplanetary dust particles of more than a picogram. The former approach, involving short-duration flights, may also provide invaluable data on the sources of many extraterrestrial particles: the time dependence of particle arrival at the collection altitude is an important parameter that may be correlated with specific global events (e.g., meteoroid streams), provided the collection time is known to an accuracy of 2 hours.
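
Taking the quoted figures at face value (one ~10 micron candidate per hour with conventional geometry, and a factor-of-10 LAC gain), the expected yields work out as follows.

```python
# Back-of-envelope check of the collection rates quoted above.
CONVENTIONAL_RATE = 1.0          # potential ET particles per flight hour
LAC_GAIN = 10.0                  # improvement factor for the Large Area Collector
lac_rate = CONVENTIONAL_RATE * LAC_GAIN

for hours in (4, 8, 40):
    print(f"{hours:>2} h collection: ~{lac_rate * hours:.0f} potential ET particles")
# 4 h -> ~40 and 8 h -> ~80: one short flight already yields a usable sample;
# 40 h -> ~400, raising the odds of catching rare particle types.
```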

Relevance:

20.00%

Publisher:

Abstract:

The selection of optimal camera configurations (camera locations, orientations, etc.) for multi-camera networks remains an unsolved problem. Previous approaches largely focus on proposing various objective functions for different tasks; most of them, however, do not generalize well to large-scale networks. To tackle this, we introduce a statistical formulation of the optimal selection of camera configurations and propose a Trans-Dimensional Simulated Annealing (TDSA) algorithm to solve the problem effectively. We compare our approach with a state-of-the-art method based on Binary Integer Programming (BIP) and show that it offers similar performance on small-scale problems. We also demonstrate its capability on large-scale problems, where it produces better results than two alternative heuristics designed to address BIP's scalability issue.
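
A minimal sketch of the trans-dimensional move structure follows, assuming a simple disc-coverage objective in place of the paper's statistical formulation: 'birth' and 'death' moves change the number of cameras, a perturbation move adjusts a single camera, and candidates are accepted with the usual Metropolis rule under a cooling temperature.

```python
# Trans-dimensional simulated annealing over camera sets of varying size.
import math, random

random.seed(0)
TARGETS = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(200)]
RADIUS, CAMERA_COST = 25.0, 10.0

def score(cams):
    """Targets covered by any camera, minus a per-camera cost."""
    covered = sum(any(math.dist(t, c) <= RADIUS for c in cams) for t in TARGETS)
    return covered - CAMERA_COST * len(cams)

def propose(cams):
    cams = list(cams)
    move = random.random()
    if move < 0.2:                                   # birth: add a camera
        cams.append((random.uniform(0, 100), random.uniform(0, 100)))
    elif move < 0.4 and len(cams) > 1:               # death: remove one
        cams.pop(random.randrange(len(cams)))
    else:                                            # perturb one camera
        i = random.randrange(len(cams))
        x, y = cams[i]
        cams[i] = (x + random.gauss(0, 5), y + random.gauss(0, 5))
    return cams

state, current, temp = [(50.0, 50.0)], None, 10.0
current = score(state)
for step in range(5000):
    cand = propose(state)
    cand_score = score(cand)
    if math.log(random.random() + 1e-12) < (cand_score - current) / temp:
        state, current = cand, cand_score            # Metropolis acceptance
    temp *= 0.999                                    # geometric cooling
print(f"{len(state)} cameras, score {current:.1f}")
```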

Relevance:

20.00%

Publisher:

Abstract:

Computer worms represent a serious threat to modern communication infrastructures. These epidemics can cause great damage, such as financial losses or the interruption of critical services that support the lives of citizens, and they can spread at a speed that precludes timely human intervention. Automatic detection and mitigation techniques therefore need to be developed. However, if these techniques are not designed and intensively tested in realistic environments, they may cause even more harm, as they heavily interfere with high-volume communication flows. We present a simulation model which allows studies of worm spread and countermeasures in large-scale multi-AS topologies with millions of IP addresses.
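
At mean-field level, the dynamics such a simulator resolves in packet-level detail can be caricatured by a discrete-time SI epidemic for a random-scanning worm; the population and scan-rate figures below are assumptions, not the model's parameters.

```python
# Mean-field SI epidemic for a worm scanning random IPv4 addresses.
ADDRESS_SPACE = 2 ** 32          # IPv4-sized address space
VULNERABLE = 350_000             # vulnerable hosts in the topology
SCANS_PER_TICK = 500             # probes per infected host per time step

infected = 1.0
for tick in range(401):
    p_hit = VULNERABLE / ADDRESS_SPACE                 # probe finds a vulnerable host
    susceptible_frac = 1.0 - infected / VULNERABLE     # target not yet infected
    infected += infected * SCANS_PER_TICK * p_hit * susceptible_frac
    infected = min(infected, VULNERABLE)
    if tick % 50 == 0:
        print(f"t={tick:3d}  infected ~ {infected:10,.0f}")
# Shows the characteristic logistic curve: slow start, explosive growth,
# saturation as the vulnerable population is exhausted.
```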

Relevance:

20.00%

Publisher:

Abstract:

Large Igneous Provinces are exceptional intraplate igneous events throughout Earth's history. Their significance and potential global impact stem from the total volume of magma intruded and released during these geologically brief events (peak eruptions often last 1-5 Myrs), in which millions to tens of millions of cubic kilometres of magma are produced. In some cases, at least 1% of the Earth's surface has been directly covered in volcanic rock, an area equivalent to a small continent with comparable crustal thickness. Large Igneous Provinces are thus important, albeit episodic, episodes of new crust addition. However, most of the magmatism is basaltic, so contributions to crustal growth will not always be picked up in zircon geochronology studies, which better trace major episodes of extension-related silicic magmatism and the silicic Large Igneous Provinces. Much headway has been made in our understanding of these anomalous igneous events over the last 25 years, driving many new ideas and models. This includes their:

1) global spatial and temporal distribution, with a long-term average of one event approximately every 20 Myrs but a clear clustering of events at times of supercontinent break-up; Large Igneous Provinces are thus an integral part of the Wilson cycle and are becoming an increasingly important tool for reconnecting dispersed continental fragments;

2) compositional diversity, which partly reflects their crustal setting in ocean basins, continental interiors and continental margins; in the latter setting, LIP magmatism can be silicic-dominant;

3) mineral and energy resources, with major PGE and precious metal resources hosted in these provinces, and with magmatism affecting the hydrocarbon potential of volcanic basins and rifted margins by enhancing source rock maturation, providing fluid migration pathways, and forming traps;

4) biospheric, hydrospheric and atmospheric impacts, with Large Igneous Provinces now widely regarded as a key trigger mechanism for mass extinctions, although the exact kill mechanism(s) are still being resolved;

5) role in mantle geodynamics and the thermal evolution of the Earth, potentially recording the transport of material from the lower mantle or core-mantle boundary to the Earth's surface and forming a fundamental component of whole-mantle convection models; and

6) recognition on the inner planets, where the lack of plate tectonics and erosional processes, together with planetary antiquity, means that the very earliest record of LIP events during planetary evolution may be better preserved than on Earth.

Relevance:

20.00%

Publisher:

Abstract:

LIP emplacement is linked to the timing and evolution of supercontinental break-up. LIP-related break-up produces volcanic rifted margins, new and large (up to 10^8 km²) ocean basins, and new, smaller continents that undergo dispersal and potentially reassembly (e.g., India). However, not all continental LIPs lead to continental rupture. We analysed the <330 Ma continental LIP record (following the final assembly of Pangea) to find relationships between LIP event attributes (e.g., igneous volume, extent, distance from a pre-existing continental margin) and ocean basin attributes (e.g., length of the new ocean basin/rifted margin), and to see how these varied during the progressive break-up of Pangea. No correlation exists between LIP magnitude and the size of the subsequent ocean basin or rifted margin. Our review suggests a three-phase break-up history of Pangea:

1) "Preconditioning" phase (∼330-200 Ma): LIP events (n=7) occurred largely around the supercontinental margin, clustering today in Asia, with a low (<20%) rifting success rate. The Panjal Traps at ∼280 Ma may represent the first continental rupturing event of Pangea, resulting in continental ribboning along the Tethyan margin;

2) "Main break-up" phase (∼200-100 Ma): numerous large LIP events (n=10) in the supercontinent interior, resulting in highly successful fragmentation (90%) and large, new ocean basins (e.g., Central/South Atlantic, Indian; >3000 km long);

3) "Waning" phase (∼100-0 Ma): declining LIP magnitudes (n=6) and greater proximity to continental margins (e.g., Madagascar, North Atlantic, Afro-Arabia, Sierra Madre), producing smaller ocean basins (<2600 km long).

How Pangea broke up may thus have implications for earlier supercontinent reconstructions and the LIP record.

Relevance:

20.00%

Publisher:

Abstract:

Background: Predicting protein subnuclear localization is a challenging problem. Previous work based on non-sequence information, including Gene Ontology annotations and kernel fusion, has respective limitations. The aim of this work is twofold: to propose a novel individual feature extraction method, and to develop an ensemble method that improves prediction performance using comprehensive information represented as a high-dimensional feature vector obtained from 11 feature extraction methods.

Methodology/Principal Findings: A novel two-stage multiclass support vector machine is proposed to predict protein subnuclear localizations. It considers only those feature extraction methods based on amino acid classifications and physicochemical properties. To speed up the system, an automatic search method for the kernel parameter is used. The prediction performance of our method is evaluated on four datasets: the Lei dataset, a multi-localization dataset, the SNL9 dataset and a new independent dataset. The overall accuracy of prediction for 6 localizations on the Lei dataset is 75.2%, and that for 9 localizations on the SNL9 dataset is 72.1%, in leave-one-out cross-validation; accuracy is 71.7% for the multi-localization dataset and 69.8% for the new independent dataset. Comparisons with existing methods show that our method performs better for both single-localization and multi-localization proteins and achieves more balanced sensitivities and specificities on large-size and small-size localizations. The overall accuracy improvements are 4.0% and 4.7% for single-localization proteins and 6.5% for multi-localization proteins. The reliability and stability of our classification model are further confirmed by permutation analysis.

Conclusions: Our method is effective and valuable for predicting protein subnuclear localizations. A web server implementing the proposed method is freely available at http://bioinformatics.awowshop.com/snlpred_page.php.
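
The ensemble idea, separate multiclass SVMs over different feature views combined by voting, with an automatic search over the kernel parameter, can be sketched as follows. This is a hedged illustration on synthetic data, not the authors' two-stage pipeline; the synthetic "views" stand in for the 11 feature extraction methods described above.

```python
# One RBF-SVM per feature view, kernel parameter found by grid search,
# predictions combined by majority vote.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, n_classes = 300, 6
y = rng.integers(0, n_classes, size=n)
views = [rng.normal(y[:, None], 1.0, size=(n, 20)) for _ in range(3)]  # 3 of 11 views

models = []
for X in views:
    search = GridSearchCV(SVC(kernel="rbf"),
                          {"gamma": np.logspace(-3, 1, 5), "C": [1, 10, 100]},
                          cv=3)                      # automatic kernel-parameter search
    search.fit(X, y)
    models.append(search.best_estimator_)

def ensemble_predict(view_samples):
    """Majority vote across the per-view SVMs."""
    votes = np.stack([m.predict(X) for m, X in zip(models, view_samples)])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

pred = ensemble_predict(views)
print("training-set vote accuracy:", (pred == y).mean())
```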

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To use a large wavefront database of a clinical population to investigate relationships between refractions and higher-order aberrations, and between the aberrations of right and left eyes.

Methods: Third- and fourth-order aberration coefficients and higher-order root-mean-squared aberrations (HO RMS), scaled to a pupil of 4.5 mm diameter, were analysed in a population of about 24,000 patients from Carl Zeiss Vision's European wavefront database. Correlations were determined between the aberrations and the variables of refraction, near addition and cylinder.

Results: Most aberration coefficients were significantly dependent upon these variables, but the proportion of each aberration explained by these factors was less than 2%, except for spherical aberration (12%), horizontal coma (9%) and HO RMS (7%). Near addition was the major contributor for horizontal coma (8.5% out of 9.5%) and spherical equivalent was the major contributor for spherical aberration (7.7% out of 11.6%). Interocular correlations were highly significant for all aberration coefficients, varying between 0.16 and 0.81. Anisometropia was a significant variable for three aberrations (vertical coma, secondary astigmatism and tetrafoil), but little importance can be placed on this because of the small proportions of those aberrations explained by refraction (all less than 1.0%).

Conclusions: Most third- and fourth-order aberration coefficients were significantly dependent upon spherical equivalent, near addition and cylinder, but only horizontal coma (9%) and spherical aberration (12%) showed dependencies greater than 2%. Interocular correlations were highly significant for all aberration coefficients, but anisometropia had little influence on aberration coefficients.
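
The two headline analyses, interocular correlations and the share of an aberration's variance explained by refraction, amount to the computations below, shown here on simulated stand-in data rather than the Zeiss database; all coefficient values and effect sizes are synthetic.

```python
# (1) Interocular Pearson correlation for one Zernike coefficient;
# (2) variance in that coefficient explained by spherical equivalent (R^2),
#     the style of figure quoted as "12% for spherical aberration".
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 24_000
sph_equiv = rng.normal(-1.0, 2.5, size=n)                      # refraction, D
c40_right = 0.03 * sph_equiv + rng.normal(0.05, 0.08, size=n)  # spherical aberration, um
c40_left = 0.7 * c40_right + rng.normal(0.0, 0.05, size=n)     # correlated fellow eye

r_interocular, _ = stats.pearsonr(c40_right, c40_left)
slope, intercept, r_value, p, se = stats.linregress(sph_equiv, c40_right)

print(f"interocular correlation: {r_interocular:.2f}")
print(f"variance explained by spherical equivalent: {100 * r_value**2:.1f}%")
```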