Abstract:
A hippocampal CA3 memory model was constructed with PGENESIS, a recently developed version of GENESIS that allows for distributed processing of a neural network simulation. A number of neural models of the human memory system have identified the CA3 region of the hippocampus as storing the declarative memory trace. However, computational models designed to assess the viability of the putative mechanisms of storage and retrieval have generally been too abstract to allow comparison with empirical data. Recent experimental evidence has shown that selective knock-out of NMDA receptors in the CA1 of mice leads to reduced stability of firing specificity in place cells. Here a similar reduction in the stability of input specificity is demonstrated in a biologically plausible neural network model of the CA3 region, under conditions of Hebbian synaptic plasticity versus an absence of plasticity. The CA3 region is also commonly associated with seizure activity. Further simulations of the same model tested the response to continuously repeating versus randomized non-repeating input patterns. Each paradigm delivered input of equal intensity and duration. Non-repeating input patterns elicited a greater pyramidal cell spike count. This suggests that repetitive versus non-repeating neocortical input has a quantitatively different effect on the hippocampus. This may be relevant to the production of independent epileptogenic zones and the process of encoding new memories.
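The plasticity-versus-no-plasticity contrast at the heart of this abstract can be made concrete with a toy Hebbian update rule. The following is a minimal NumPy sketch under invented parameters, not the PGENESIS CA3 model: with a positive learning rate, a fixed input pattern recruits a stable, input-specific set of responding units, while a learning rate of zero leaves responses governed by the initial random weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_step(w, pre, post, lr=0.01, w_max=1.0):
    """One Hebbian update: strengthen synapses whose pre- and
    postsynaptic activity coincide; weights clipped to [0, w_max]."""
    return np.clip(w + lr * np.outer(post, pre), 0.0, w_max)

n_in, n_out = 50, 20
w = rng.uniform(0.0, 0.1, (n_out, n_in))            # random initial synapses
pattern = (rng.random(n_in) < 0.2).astype(float)    # one fixed input pattern

for _ in range(200):                                # repeated presentations
    post = (w @ pattern > 0.5).astype(float)        # crude thresholded response
    w = hebbian_step(w, pattern, post)              # plasticity ON (lr > 0)

# With lr=0 ("plasticity off") responses never stabilise beyond what the
# initial random weights dictate; with Hebbian updates the stored pattern
# recruits a stable, input-specific population.
print(int((w @ pattern > 0.5).sum()), "units respond to the stored pattern")
```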
Abstract:
Numerous efforts have been dedicated to the synthesis of large-volume methacrylate monoliths for large-scale biomolecule purification, but most were obstructed by the enormous release of exotherms during preparation, thereby introducing structural heterogeneity in the monolith pore system. A significant radial temperature gradient develops across the monolith thickness, reaching a terminal temperature that exceeds the maximum temperature allowable for the preparation of structurally homogeneous monoliths. The enormous heat build-up is understood to comprise the heat associated with initiator decomposition and the heat released from free radical-monomer and monomer-monomer interactions. The heat resulting from the initiator decomposition was expelled along with some gaseous fumes before polymerization was commenced in a gradual-addition fashion. Characteristics of an 80 mL monolith prepared using this technique were compared with those of a similar monolith synthesized in a bulk polymerization mode. A close similarity in the radial temperature profiles was observed for the monolith synthesized via the heat expulsion technique. A maximum radial temperature gradient of only 4.3°C was recorded at the center and 2.1°C at the monolith periphery for the combined heat expulsion and gradual addition technique. The comparable radial temperature distributions obtained yielded identical pore size distributions at different radial points across the monolith thickness.
Abstract:
This paper provides an overview of the development of a vision-based AUV, along with a set of complementary operational strategies, to allow reliable autonomous data collection in relatively shallow water and coral reef environments. The development of the AUV, called Starbug, encountered many challenges in terms of vehicle design, navigation and control. Some of these challenges are discussed, with a focus on operational strategies for estimating and reducing the total navigation error when using lower-resolution sensing modalities. Results are presented from recent field trials which illustrate the ability of the vehicle and associated operational strategies to enable rapid collection of visual data sets suitable for marine research applications.
Abstract:
PURPOSE: This paper describes dynamic agent composition, used to support the development of flexible and extensible large-scale agent-based models (ABMs). This approach was motivated by a need to extend and modify, with ease, an ABM with an underlying networked structure as more information becomes available. Flexibility was also sought so that simulations can be set up with ease, without the need to program. METHODS: The dynamic agent composition approach consists of having agents, whose implementation has been broken into atomic units, come together at runtime to form the complex system representation on which simulations are run. These components capture information at a fine level of detail and provide a vast range of combinations and options for a modeller to create ABMs. RESULTS: A description of dynamic agent composition is given in this paper, as well as details about its implementation within MODAM (MODular Agent-based Model), a software framework which is applied to the planning of the electricity distribution network. Illustrations of its implementation are accordingly given for that domain throughout the paper. It is, however, expected that this approach will be beneficial to other problem domains, especially those with a networked structure, such as water or gas networks. CONCLUSIONS: Dynamic agent composition has many advantages over the way agent-based models are traditionally built, for users and developers as well as for agent-based modelling as a scientific approach. Developers can extend the model without the need to access or modify previously written code; they can develop groups of entities independently and add them to those already defined to extend the model. Users can mix and match already implemented components to form large-scale ABMs, allowing them to quickly set up simulations and easily compare scenarios without the need to program. Dynamic agent composition provides a natural simulation space over which ABMs of networked structures are represented, facilitating their implementation, while verification and validation of models are made easier by the ability to quickly set up alternative simulations.
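As a rough illustration of the composition idea, the sketch below assembles an agent from atomic components at runtime. All class and component names are hypothetical stand-ins; MODAM's actual API will differ.

```python
from dataclasses import dataclass, field

class Component:
    """Atomic unit of behaviour; agents are assembled from these."""
    def step(self, agent, t):
        pass

class Load(Component):
    def __init__(self, kw):
        self.kw = kw
    def step(self, agent, t):
        agent.state["demand_kw"] = self.kw      # toy constant demand

class SolarPanel(Component):
    def __init__(self, kw):
        self.kw = kw
    def step(self, agent, t):
        agent.state["generation_kw"] = self.kw  # toy constant output

@dataclass
class Agent:
    name: str
    components: list
    state: dict = field(default_factory=dict)

    def step(self, t):
        for c in self.components:   # agent behaviour = sum of its parts
            c.step(self, t)

# Compose at runtime: mixing and matching components builds a new kind
# of agent without touching any existing class.
house = Agent("house_1", [Load(1.2), SolarPanel(3.0)])
house.step(t=0)
print(house.state)   # {'demand_kw': 1.2, 'generation_kw': 3.0}
```

Because behaviour lives in the components, extending such a model means writing a new component class rather than modifying previously written agent code.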
Abstract:
There is an increasing need in biology and clinical medicine to robustly and reliably measure tens-to-hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma, and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and 7 control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to sub-nanogram/mL sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and inter-laboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy isotope labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an inter-laboratory clinical study of patient samples. Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible and quantitative measurements of proteins and peptides in complex biological matrices such as plasma.
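The reproducibility figure quoted (median intra- and inter-laboratory reproducibility below 20%) is a coefficient of variation on measured analyte ratios. A small illustrative computation, with synthetic numbers standing in for real light/heavy peak-area ratios:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for measured light/heavy peak-area ratios:
# one row per laboratory, one column per replicate measurement.
ratios = rng.normal(loc=0.80, scale=0.06, size=(11, 5))

def cv_percent(x):
    """Coefficient of variation as a percentage."""
    return 100.0 * np.std(x, ddof=1) / np.mean(x)

intra_lab_cvs = [cv_percent(lab) for lab in ratios]   # spread within each lab
inter_lab_cv = cv_percent(ratios.mean(axis=1))        # spread of lab means

print(f"median intra-lab CV: {np.median(intra_lab_cvs):.1f}%")
print(f"inter-lab CV: {inter_lab_cv:.1f}%")
```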
Abstract:
Existing field data for Rangal coals (Late Permian) of the Bowen Basin, Queensland, Australia, are inconsistent with the depositional model generally accepted in the current geological literature to explain coal deposition. Given the apparent unsuitability of the current depositional model to the Bowen Basin coal data, a new depositional model, here named the Cyclic Salinity Model, is proposed and tested in this study.
Abstract:
On the 18th of July 2013, three hundred local residents of Gladstone, Queensland erupted into song and dance, performing the fraught history of their harbourside community through tug boat ballets, taiko drumming, German bell ringing and BMX bike riding. Over 17,500 people attended the four performances of Boomtown, a Queensland Music Festival event. This was the largest regional, outdoor community-engaged musical performance staged in Australia. The narrative moved beyond the dominant, pejorative view of Gladstone as an industrial town to include the community members’ sense of purpose and aspirations. It was a celebratory, contentious and ambitious project that sought to disrupt the traditional conventions of performance-making by working in artistically democratic ways. This article explores the potential for Australian Community Engaged Arts (CEA) projects such as Boomtown to democratically engage community members and co-create culturally meaningful work within a community. Research into CEA projects rarely considers how the often delicate conversations between practitioners and the community work. The complex processes of finding and co-writing the narrative, casting, and rehearsing Boomtown are discussed with reference to artistic director/dramaturge Sean Mee’s innovative approaches. Boomtown began and concluded with community conversations. Skilful negotiation ensured congruence between the townspeople’s stories and the “community story” presented on stage, obviating potential problems of narrative ownership. To supplement the research, twenty-one personal interviews were undertaken with Gladstone community members invested in the production before, during and after the project: performers, audience members and local professionals. The stories shared and emphasised in the theatricalised story were based on propitious, meaningful, local stories from lived experiences rather than preconceived, trivial or tokenistic matters, and were underpinned by a consensus on what was in the best interests of the majority of community members. Boomtown exposed hidden issues in the community and gave voice to thoughts, feelings and concerns, triggering not just engagement but honest conversation within the community.
Abstract:
Public buildings and large infrastructure are typically monitored by tens or hundreds of cameras, all capturing different physical spaces and observing different types of interactions and behaviours. However, to date, in large part due to limited data availability, crowd monitoring and operational surveillance research has focused on single-camera scenarios which are not representative of real-world applications. In this paper we present a new, publicly available database for large-scale crowd surveillance. Footage from 12 cameras for a full work day, covering the main floor of a busy university campus building including an internal and external foyer, elevator foyers, and the main external approach, is provided; alongside annotation for crowd counting (single or multi-camera) and pedestrian flow analysis for 10 and 6 sites respectively. We describe how this large dataset can be used to perform distributed monitoring of building utilisation, and demonstrate the potential of this dataset to understand and learn the relationship between different areas of a building.
Abstract:
Outlines some of the potential risks or actual harms that result from large-scale land leases or acquisitions and the relevant human rights and environmental law principles.
Abstract:
With the advent of functional neuroimaging techniques, in particular functional magnetic resonance imaging (fMRI), we have gained greater insight into the neural correlates of visuospatial function. However, it may not always be easy to identify the cerebral regions most specifically associated with performance on a given task. One approach is to examine the quantitative relationships between regional activation and behavioral performance measures. In the present study, we investigated the functional neuroanatomy of two different visuospatial processing tasks, judgement of line orientation and mental rotation. Twenty-four normal participants were scanned with fMRI using blocked periodic designs for experimental task presentation. Accuracy and reaction time (RT) for each trial of both activation and baseline conditions in each experiment were recorded. Both experiments activated dorsal and ventral visual cortical areas as well as dorsolateral prefrontal cortex. More regionally specific associations with task performance were identified by estimating the association between (sinusoidal) power of functional response and mean RT to the activation condition; a permutation test based on spatial statistics was used for inference. There was a significant behavioral-physiological association in right ventral extrastriate cortex for the line orientation task and in bilateral (predominantly right) superior parietal lobule for the mental rotation task. Comparable associations were not found between power of response and RT to the baseline conditions of the tasks. These data suggest that one region in a neurocognitive network may be most strongly associated with behavioral performance, and this may be regarded as the computationally least efficient or rate-limiting node of the network.
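A stripped-down version of this kind of behavioural-physiological permutation test, correlating a region's response power with mean RT across participants and building the null distribution by shuffling, might look as follows. The data are synthetic and the statistic is a plain Pearson correlation, not the authors' spatial statistic:

```python
import numpy as np

rng = np.random.default_rng(2)

n = 24                                        # participants, as in the study
rt = rng.normal(900.0, 120.0, n)              # synthetic mean RT (ms)
power = 0.002 * rt + rng.normal(0.0, 0.2, n)  # synthetic regional power

obs = np.corrcoef(power, rt)[0, 1]            # observed association

n_perm = 10_000
null = np.empty(n_perm)
for i in range(n_perm):                       # break the pairing by shuffling
    null[i] = np.corrcoef(power, rng.permutation(rt))[0, 1]

p = (np.abs(null) >= abs(obs)).mean()         # two-sided permutation p-value
print(f"r = {obs:.2f}, permutation p = {p:.4f}")
```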
Abstract:
The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience, genetics, and medicine, ENIGMA studies have analyzed neuroimaging data from over 12,826 subjects. In addition, data from 12,171 individuals were provided by the CHARGE consortium for replication of findings, in a total of 24,997 subjects. By meta-analyzing results from many sites, ENIGMA has detected factors that affect the brain that no individual site could detect on its own, and that require larger numbers of subjects than any individual neuroimaging study has currently collected. ENIGMA's first project was a genome-wide association study identifying common variants in the genome associated with hippocampal volume or intracranial volume. Continuing work is exploring genetic associations with subcortical volumes (ENIGMA2) and white matter microstructure (ENIGMA-DTI). Working groups also focus on understanding how schizophrenia, bipolar illness, major depression and attention deficit/hyperactivity disorder (ADHD) affect the brain. We review the current progress of the ENIGMA Consortium, along with challenges and unexpected discoveries made on the way.
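The core arithmetic behind pooling results from many sites, as in ENIGMA's site-level meta-analyses, is inverse-variance weighting. A minimal sketch with invented numbers (not ENIGMA data):

```python
import numpy as np

# Illustrative per-site results: each site's estimated effect of a variant
# on hippocampal volume (mm^3 per allele) and its standard error.
effects = np.array([12.0, 8.5, 15.2, 10.1])
ses = np.array([4.0, 3.5, 6.0, 2.8])

weights = 1.0 / ses**2                               # inverse-variance weights
beta = np.sum(weights * effects) / np.sum(weights)   # pooled effect estimate
se = np.sqrt(1.0 / np.sum(weights))                  # pooled standard error
z = beta / se                                        # test statistic

print(f"pooled beta = {beta:.2f} +/- {se:.2f}, z = {z:.2f}")
```

Sites more precise than others (smaller standard errors) contribute more to the pooled estimate, which is how a consortium can detect effects that no individual site could detect on its own.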
Abstract:
Moreton Island and several other large siliceous sand dune islands and mainland barrier deposits in SE Queensland represent the distal, onshore component of an extensive Quaternary continental shelf sediment system. This sediment has been transported up to 1000 km along the coast and shelf of SE Australia over multiple glacioeustatic sea-level cycles. Stratigraphic relationships and a preliminary Optically Stimulated Luminescence (OSL) chronology for Moreton Island indicate a middle Pleistocene age for the large majority of the deposit. Dune units exposed in the centre of the island and on the east coast have OSL ages that indicate deposition occurred between approximately 540 ka and 350 ka BP, and at around 96±10 ka BP. Much of the southern half of the island has a veneer of much younger sediment, with OSL ages of 0.90±0.11 ka, 1.28±0.16 ka, 5.75±0.53 ka and <0.45 ka BP. The younger deposits were partially derived from the reworking of the upper leached zone of the much older dunes. A large parabolic dune at the northern end of the island, with an OSL age of 9.90±1.0 ka BP, and palaeosol exposures that extend below present sea level suggest the Pleistocene dunes were sourced from shorelines positioned several to tens of metres lower than, and up to a few kilometres seaward of, the present shoreline. Given the lower gradient of the inner shelf a few km seaward of the island, it seems likely that periods of intermediate sea level (e.g. ~20 m below present) produced strongly positive onshore sediment budgets and the mobilisation of dunes inland to form much of what now comprises Moreton Island. The new OSL ages, together with the comprehensive OSL chronology for the Cooloola deposit 100 km north of Moreton Island, indicate that the bulk of the coastal dune deposits in SE Queensland were emplaced between approximately 540 ka BP and the Last Interglacial. This chronostratigraphic information improves our fundamental understanding of long-term sediment transport and accumulation in large-scale continental shelf sediment systems.
Abstract:
A key feature of the current era of Australian schooling is the dominance of publicly available student, school and teacher performance data. Our paper examines the intersection of data on teachers’ postgraduate qualifications and students’ end-of-schooling outcomes in 26 Catholic Systemic Secondary Schools and 18 Catholic Independent Secondary Schools throughout the State of Queensland. We introduce and justify taking up a new socially-just measurement model of students’ end-of-schooling outcomes, called the ‘Tracking and Academic Management Index’, otherwise known as ‘TAMI’. Additional analysis focuses on the outcomes of top-end students vis-à-vis all students who are encouraged to remain in institutionalised education of one form or another for the two final years of senior secondary schooling. The findings on the correlations between Catholic teachers’ postgraduate qualifications and students’ end-of-schooling outcomes are also compared with teachers’ postgraduate qualifications and students’ end-of-schooling outcomes across 174 Queensland Government Secondary Schools and 58 Queensland Independent Secondary Schools from the same data collection period. The findings raise important questions about the transference of teachers’ postgraduate qualifications into progress in students’ end-of-schooling outcomes, as well as about the performance of Queensland Catholic Systemic Secondary Schools and Queensland Catholic Independent Secondary Schools during a particular era of education.
Abstract:
This article investigates whether participation on Twitter during Toronto’s 2014 WorldPride festival facilitated challenges to heteronormativity through increased visibility, connections, and messages about LGBTQ people. Analysis of 68,231 tweets found that surges in activity using WorldPride hashtags, connections among users, and the circulation of affective content with common symbols made celebrations visible. However, the platform’s features catered to politicians, celebrities, and advertisers in ways that accentuated self-promotional, local, and often banal content, overshadowing individual users and the festival’s global mandate. By identifying Twitter’s limits in fostering the visibility of users and messages that circulate nonnormative discourses, this study makes way for future research identifying alternative platform dynamics that can enhance the visibility of diversity.
Abstract:
In the mining optimisation literature, most researchers have focused on two strategic- and tactical-level open-pit mine optimisation problems, respectively termed the ultimate pit limit (UPIT) and constrained pit limit (CPIT) problems. However, many researchers note that the substantial numbers of variables and constraints in real-world instances (e.g., with 50-1000 thousand blocks) make the CPIT’s mixed integer programming (MIP) model intractable in practice. It therefore becomes a considerable challenge to solve large-scale CPIT instances without relying on an exact MIP optimiser or on complicated MIP relaxation/decomposition methods. To address this challenge, two new graph-based algorithms, built on network flow graphs and conjunctive graph theory, are developed by taking advantage of problem properties. The performance of the proposed algorithms is validated on the large-scale benchmark UPIT and CPIT instances from the 2013 MineLib datasets. Compared with the best known results from MineLib, the proposed algorithms outperform the other CPIT solution approaches in the literature. The proposed graph-based algorithms lead to a more competent mine scheduling optimisation expert system because a third-party MIP optimiser is no longer indispensable and random neighbourhood search is not necessary.
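While the abstract does not spell out the algorithms, the classical network-flow formulation of UPIT (Picard, 1976) shows the kind of graph construction such methods build on: finding a maximum-weight closure of the block-precedence graph via a single minimum cut. Below is a toy sketch with networkx on an invented three-block instance; it is not the authors' own algorithm:

```python
import networkx as nx

# Toy block model: block id -> economic value; block "c" is valuable ore
# sitting beneath waste blocks "a" and "b", which must be mined first.
values = {"a": -1.0, "b": -1.0, "c": 5.0}
precedence = [("c", "a"), ("c", "b")]   # mining c requires mining a and b

G = nx.DiGraph()
for blk, v in values.items():
    if v >= 0:
        G.add_edge("s", blk, capacity=v)    # profitable blocks: source arcs
    else:
        G.add_edge(blk, "t", capacity=-v)   # costly blocks: sink arcs
for lower, upper in precedence:
    G.add_edge(lower, upper, capacity=float("inf"))  # precedence is uncuttable

cut_value, (s_side, _) = nx.minimum_cut(G, "s", "t")
pit = s_side - {"s"}                        # blocks in the optimal pit
print("optimal pit:", sorted(pit), "profit:", sum(values[b] for b in pit))
```

Here the min cut severs the two waste arcs (total cost 2) rather than the ore arc (value 5), so the optimal pit mines all three blocks for a profit of 3; avoiding a third-party MIP optimiser is possible precisely because this structure admits purely graph-based solutions.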