855 results for Large-scale databases
Abstract:
A novel approach to large-scale production of high-quality graphene flakes in magnetically-enhanced arc discharges between carbon electrodes is reported. A non-uniform magnetic field is used to control the growth and deposition zones, where the Y-Ni catalyst experiences a transition to the ferromagnetic state, which in turn leads to the graphene deposition in a collection area. The quality of the produced material is characterized by the SEM, TEM, AFM, and Raman techniques. The proposed growth mechanism is supported by the nucleation and growth model.
Abstract:
Next Generation Sequencing (NGS) has revolutionised molecular biology, resulting in an explosion of data sets and an increasing role in clinical practice. Such applications necessarily require rapid identification of the organism as a prelude to annotation and further analysis. NGS data consist of a substantial number of short sequence reads, given context through downstream assembly and annotation, a process requiring reads consistent with the assumed species or species group. Highly accurate results have been obtained for restricted sets using SVM classifiers, but such methods are difficult to parallelise and success depends on careful attention to feature selection. This work examines the problem at very large scale, using a mix of synthetic and real data with a view to determining the overall structure of the problem and the effectiveness of parallel ensembles of simpler classifiers (principally random forests) in addressing the challenges of large scale genomics.
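The abstract does not spell out the feature representation; a common choice for read classification is a k-mer frequency vector fed to a votes-based ensemble. The sketch below is illustrative only (the k-mer featurization and the toy classifier functions are assumptions, not the paper's pipeline):

```python
from collections import Counter
from itertools import product

def kmer_vector(read, k=3):
    """Normalized k-mer frequency vector over the ACGT alphabet,
    a typical feature encoding for short sequence reads."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = Counter(read[i:i + k] for i in range(len(read) - k + 1))
    total = max(sum(counts.values()), 1)
    return [counts[km] / total for km in kmers]

def ensemble_vote(classifiers, features):
    """Majority vote over an ensemble of simple, independently
    trainable classifiers (each maps a feature vector to a label)."""
    votes = Counter(clf(features) for clf in classifiers)
    return votes.most_common(1)[0][0]
```

Because each ensemble member sees the same features but can be trained and evaluated independently, this style of classifier parallelises far more naturally than a single large SVM.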
Abstract:
In this paper we describe CubIT, a multi-user presentation and collaboration system installed at the Queensland University of Technology’s (QUT) Cube facility. The ‘Cube’ is an interactive visualisation facility made up of five very large-scale interactive multi-panel wall displays, each consisting of up to twelve 55-inch multi-touch screens (48 screens in total) and massive projected display screens situated above the display panels. The paper outlines the unique design challenges, features, implementation and evaluation of CubIT. The system was built to make the Cube facility accessible to QUT’s academic and student population. CubIT enables users to easily upload and share their own media content, and allows multiple users to simultaneously interact with the Cube’s wall displays. The features of CubIT were implemented via three user interfaces, a multi-touch interface working on the wall displays, a mobile phone and tablet application and a web-based content management system. Each of these interfaces plays a different role and offers different interaction mechanisms. Together they support a wide range of collaborative features including multi-user shared workspaces, drag and drop upload and sharing between users, session management and dynamic state control between different parts of the system. The results of our evaluation study showed that CubIT was successfully used for a variety of tasks, and highlighted challenges with regards to user expectations regarding functionality as well as issues arising from public use.
Abstract:
Although the collection of player and ball tracking data is fast becoming the norm in professional sports, large-scale mining of such spatiotemporal data has yet to surface. In this paper, given an entire season's worth of player and ball tracking data from a professional soccer league (approximately 400,000,000 data points), we present a method which can conduct both individual player and team analysis. Due to the dynamic, continuous and multi-player nature of team sports like soccer, a major issue is aligning player positions over time. We present a "role-based" representation that dynamically updates each player's relative role at each frame and demonstrate how this captures the short-term context to enable both individual player and team analysis. We discover roles directly from data by utilizing a minimum entropy data partitioning method and show how this can be used to accurately detect and visualize formations, as well as analyze individual player behavior.
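The paper learns roles via minimum entropy data partitioning; as a rough illustration of the underlying per-frame alignment problem, one can assign players to role template positions by minimizing total squared distance (brute force over permutations, fine for one team; the template positions here are hypothetical):

```python
from itertools import permutations

def assign_roles(players, templates):
    """Return a role index for each player, minimizing the total squared
    distance between player positions and role template positions.
    Brute-force search; a sketch of role alignment, not the paper's
    minimum-entropy method."""
    def cost(perm):
        return sum((px - templates[r][0]) ** 2 + (py - templates[r][1]) ** 2
                   for (px, py), r in zip(players, perm))
    return list(min(permutations(range(len(templates))), key=cost))
```

Re-running such an assignment at every frame is what lets the representation track players swapping roles (e.g. overlapping full-backs) instead of pinning each player to a fixed position.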
Abstract:
Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better-informed decision making.
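The key idea of scale-specific variability can be illustrated with a toy two-level variance decomposition (within-reef vs between-reef); this is a deliberately simple sketch, not the paper's Bayesian semi-parametric hierarchical model:

```python
from statistics import mean, pvariance

def scale_variances(reefs):
    """Toy two-scale decomposition of coral-cover variability:
    within-reef variance (averaged over reefs) versus between-reef
    variance of the reef means. reefs: one observation list per reef."""
    within = mean(pvariance(obs) for obs in reefs)
    between = pvariance([mean(obs) for obs in reefs])
    return within, between
```

When the between-reef component dominates, aggregate trajectories across reefs become uncertain even if each individual reef is well described, which is the pattern motivating reef-scale management in the abstract.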
Abstract:
Project work can involve multiple people from varying disciplines coming together to solve problems as a group. Large-scale interactive displays are presenting new opportunities to support such interactions with interactive and semantically enabled cooperative work tools such as intelligent mind maps. In this paper, we present a novel digital, touch-enabled mind-mapping tool as a first step towards achieving such a vision. This first prototype allows an evaluation of the benefits of a digital environment for a task that would otherwise be performed on paper or flat interactive surfaces. Observations and surveys of 12 participants in 3 groups allowed the formulation of several recommendations for further research into: new methods for capturing text input on touch screens; inclusion of complex structures; multi-user environments and how users make the shift from single-user applications; and how best to navigate large screen real estate in a touch-enabled, co-present multi-user setting.
Abstract:
Large-scale integration of non-inertial generators such as wind farms will create frequency stability issues due to reduced system inertia. Inertia-based frequency stability studies are important for predicting the performance of a power system with an increased level of renewables. This paper focuses on the impact of large-scale wind penetration on the frequency stability of the Australian power network. MATLAB Simulink is used to develop a frequency-based dynamic model utilizing the network data from a simplified 14-generator Australian power system. The loss of generation is modeled as the active power disturbance, and the minimum inertia required to maintain frequency stability is determined for the five-area power system.
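The inertia-frequency link follows from the swing equation: immediately after a generation loss, df/dt = -ΔP · f0 / (2 · E_k), where E_k is the system's stored kinetic energy. A minimal sketch (the numbers in the usage below are illustrative, not the paper's 14-generator data):

```python
def rocof(delta_p_mw, f0_hz, kinetic_energy_mws):
    """Initial rate of change of frequency (Hz/s) after losing
    delta_p_mw of generation. kinetic_energy_mws is the total system
    kinetic energy E_k = sum(H_i * S_i) in MW·s; halving it doubles
    the RoCoF, which is why displacing synchronous inertia matters."""
    return -delta_p_mw * f0_hz / (2.0 * kinetic_energy_mws)
```

For example, a 500 MW contingency on a 50 Hz system with 25,000 MW·s of kinetic energy gives an initial RoCoF of -0.5 Hz/s; the same event with half the inertia gives -1.0 Hz/s.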
Abstract:
A hippocampal-CA3 memory model was constructed with PGENESIS, a recently developed version of GENESIS that allows for distributed processing of a neural network simulation. A number of neural models of the human memory system have identified the CA3 region of the hippocampus as storing the declarative memory trace. However, computational models designed to assess the viability of the putative mechanisms of storage and retrieval have generally been too abstract to allow comparison with empirical data. Recent experimental evidence has shown that selective knock-out of NMDA receptors in the CA1 of mice leads to reduced stability of firing specificity in place cells. Here a similar reduction of stability of input specificity is demonstrated in a biologically plausible neural network model of the CA3 region, under conditions of Hebbian synaptic plasticity versus an absence of plasticity. The CA3 region is also commonly associated with seizure activity. Further simulations of the same model tested the response to continuously repeating versus randomized non-repeating input patterns. Each paradigm delivered input of equal intensity and duration. Non-repeating input patterns elicited a greater pyramidal cell spike count. This suggests that repetitive versus non-repeating neocortical input has a quantitatively different effect on the hippocampus. This may be relevant to the production of independent epileptogenic zones and the process of encoding new memories.
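The plasticity-versus-no-plasticity comparison rests on the basic Hebbian rule, Δw_ij = lr · pre_i · post_j. A minimal sketch (a generic rate-based update, not the biophysical PGENESIS model):

```python
def hebbian_update(weights, pre, post, lr=0.01):
    """One Hebbian step: w_ij += lr * pre_i * post_j, so a synapse
    strengthens when pre- and postsynaptic activity coincide.
    Setting lr = 0 reproduces the no-plasticity control condition."""
    return [[w + lr * p * q for w, q in zip(row, post)]
            for row, p in zip(weights, pre)]
```

Under repeated coincident input this rule stabilizes the mapping from inputs to firing patterns, which is the property whose loss (with plasticity absent) parallels the reduced place-cell stability seen in the NMDA knock-out experiments.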
Abstract:
Numerous efforts have been dedicated to the synthesis of large-volume methacrylate monoliths for large-scale biomolecule purification, but most were obstructed by the enormous release of exotherms during preparation, thereby introducing structural heterogeneity in the monolith pore system. A significant radial temperature gradient develops along the monolith thickness, reaching a terminal temperature that exceeds the maximum temperature allowable for the preparation of structurally homogeneous monoliths. The heat build-up is thought to comprise the heat associated with initiator decomposition and the heat released from free radical-monomer and monomer-monomer interactions. The heat resulting from the initiator decomposition was expelled along with some gaseous fumes before commencing polymerization in a gradual-addition fashion. Characteristics of an 80 mL monolith prepared using this technique were compared with those of a similar monolith synthesized in a bulk polymerization mode. A close similarity in the radial temperature profiles was observed for the monolith synthesized via the heat expulsion technique. A maximum radial temperature gradient of only 4.3°C was recorded at the center and 2.1°C at the monolith periphery for the combined heat expulsion and gradual addition technique. The comparable radial temperature distributions yielded identical pore size distributions at different radial points along the monolith thickness.
Abstract:
This paper overviews the development of a vision-based AUV along with a set of complementary operational strategies to allow reliable autonomous data collection in relatively shallow water and coral reef environments. The development of the AUV, called Starbug, encountered many challenges in terms of vehicle design, navigation and control. Some of these challenges are discussed with focus on operational strategies for estimating and reducing the total navigation error when using lower-resolution sensing modalities. Results are presented from recent field trials which illustrate the ability of the vehicle and associated operational strategies to enable rapid collection of visual data sets suitable for marine research applications.
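The navigation-error problem the abstract alludes to can be illustrated with the simplest dead-reckoning error model: a constant heading bias produces a cross-track error that grows with distance travelled. This is a toy bound of my own construction, not Starbug's actual navigation filter:

```python
import math

def cross_track_drift(speed_mps, heading_bias_rad, duration_s):
    """Cross-track position error accumulated while dead reckoning
    with a constant heading bias: error ~ distance * sin(bias).
    Operational strategies bound this growth by taking periodic
    position fixes rather than relying on dead reckoning alone."""
    return speed_mps * duration_s * math.sin(heading_bias_rad)
```

Because the error grows without bound between fixes, survey-pattern design (leg length, fix frequency) directly trades mission coverage against total navigation error, which is the trade-off such operational strategies manage.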
Abstract:
PURPOSE: This paper describes dynamic agent composition, used to support the development of flexible and extensible large-scale agent-based models (ABMs). This approach was motivated by a need to extend and modify, with ease, an ABM with an underlying networked structure as more information becomes available. Flexibility was also sought so that simulations can be set up with ease, without the need to program. METHODS: The dynamic agent composition approach consists of having agents, whose implementation has been broken into atomic units, come together at runtime to form the complex system representation on which simulations are run. These components capture information at a fine level of detail and provide a vast range of combinations and options for a modeller to create ABMs. RESULTS: A description of the dynamic agent composition is given in this paper, as well as details about its implementation within MODAM (MODular Agent-based Model), a software framework which is applied to the planning of the electricity distribution network. Illustrations of the implementation of the dynamic agent composition are consequently given for that domain throughout the paper. It is however expected that this approach will be beneficial to other problem domains, especially those with a networked structure, such as water or gas networks. CONCLUSIONS: Dynamic agent composition has many advantages over the way agent-based models are traditionally built for the users, the developers, as well as for agent-based modelling as a scientific approach. Developers can extend the model without the need to access or modify previously written code; they can develop groups of entities independently and add them to those already defined to extend the model. Users can mix-and-match already implemented components to form large-scale ABMs, allowing them to quickly set up simulations and easily compare scenarios without the need to program.
The dynamic agent composition provides a natural simulation space over which ABMs of networked structures are represented, facilitating their implementation; verification and validation of models are also facilitated by quickly setting up alternative simulations.
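The composition idea above can be sketched as agents assembled at runtime from independently developed components; the component classes below are hypothetical illustrations in the electricity-distribution spirit, not MODAM's actual API:

```python
class Agent:
    """An agent assembled at runtime from atomic components, in the
    spirit of dynamic agent composition: new component classes extend
    an agent without modifying previously written code."""
    def __init__(self, *components):
        self.components = list(components)

    def step(self):
        # Delegate one simulation tick to every attached component.
        return [c.act() for c in self.components]

class SolarPanel:
    def act(self):
        return "generate"

class Battery:
    def act(self):
        return "store"

# Mix-and-match components into a "house" agent at runtime.
house = Agent(SolarPanel(), Battery())
```

Adding, say, an electric-vehicle charger to the model then means writing one new component class and passing an instance to `Agent(...)`, leaving existing code untouched.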
Abstract:
There is an increasing need in biology and clinical medicine to robustly and reliably measure tens-to-hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma, and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and 7 control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to sub-nanogram/mL sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and inter-laboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy isotope labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an inter-laboratory clinical study of patient samples. 
Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible and quantitative measurements of proteins and peptides in complex biological matrices such as plasma.
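The reported "<20% median intra- and inter-laboratory reproducibility" is a coefficient-of-variation summary; a minimal sketch of that computation (the analyte names and values in the usage are made up for illustration):

```python
from statistics import mean, median, pstdev

def percent_cv(values):
    """Coefficient of variation (%) for a set of replicate
    measurements of one analyte: 100 * stdev / mean."""
    return 100.0 * pstdev(values) / mean(values)

def median_cv(measurements):
    """Median %CV across analytes; measurements maps each analyte
    (e.g. a target peptide) to its per-laboratory values."""
    return median(percent_cv(v) for v in measurements.values())
```

Summarizing with the median rather than the mean keeps a few poorly behaved peptides from dominating the headline reproducibility figure.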
Abstract:
Existing field data for Rangal coals (Late Permian) of the Bowen Basin, Queensland, Australia, are inconsistent with the depositional model generally accepted in the current geological literature to explain coal deposition. Given the apparent unsuitability of the current depositional model to the Bowen Basin coal data, a new depositional model, here named the Cyclic Salinity Model, is proposed and tested in this study.