827 results for Multiple-scale processing
Abstract:
The paradigm that mangroves are critical for sustaining production in coastal fisheries is widely accepted, but empirical evidence has been tenuous. This study showed that links between mangrove extent and coastal fisheries production could be detected for some species at a broad regional scale (1000s of kilometres) on the east coast of Queensland, Australia. The relationships between catch-per-unit-effort for different commercially caught species in four fisheries (trawl, line, net and pot fisheries) and mangrove characteristics, estimated from Landsat images, were examined using multiple regression analyses. The species were categorised into three groups based on information on their life history characteristics, namely mangrove-related species (banana prawns Penaeus merguiensis, mud crabs Scylla serrata and barramundi Lates calcarifer), estuarine species (tiger prawns Penaeus esculentus and Penaeus semisulcatus, blue swimmer crabs Portunus pelagicus and blue threadfin Eleutheronema tetradactylum) and offshore species (coral trout Plectropomus spp.). For the mangrove-related species, mangrove characteristics such as area and perimeter accounted for most of the variation in the model; for the non-mangrove estuarine species, latitude was the dominant parameter, but some mangrove characteristics (e.g. mangrove perimeter) also made significant contributions to the models. In contrast, for the offshore species, latitude was the dominant variable, with no contribution from mangrove characteristics. This study also identified that finer-scale spatial data for the fisheries, enabling catch information to be attributed to a particular catchment, would help to improve our understanding of relationships between mangroves and fisheries production.
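A minimal sketch of the kind of multiple regression analysis described above, using entirely hypothetical data; the predictor names, the response variable and the coefficients are assumptions for illustration and are not taken from the study.

```python
# Hedged sketch: regress a hypothetical catch-per-unit-effort (CPUE) series on
# mangrove characteristics and latitude, as the abstract describes in outline.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50
df = pd.DataFrame({
    "mangrove_area_km2": rng.uniform(1, 200, n),
    "mangrove_perimeter_km": rng.uniform(5, 400, n),
    "latitude_deg_s": rng.uniform(10, 28, n),
})
# Invented response for a "mangrove-related" species.
df["cpue"] = (0.02 * df["mangrove_area_km2"]
              + 0.01 * df["mangrove_perimeter_km"]
              + rng.normal(0, 1, n))

X = sm.add_constant(df[["mangrove_area_km2", "mangrove_perimeter_km", "latitude_deg_s"]])
model = sm.OLS(df["cpue"], X).fit()
print(model.summary())  # coefficient table shows each predictor's contribution
```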
Abstract:
The use of Wireless Sensor Networks (WSNs) for vibration-based Structural Health Monitoring (SHM) has become a promising approach due to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data asynchronicity and data loss have prevented these systems from being used extensively. Recently, several SHM-oriented WSNs have been proposed and are believed to overcome a large number of these technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs for demanding SHM applications such as modal analysis and damage identification. Based on a brief review, this paper first identifies Data Synchronization Error (DSE) as the most prevalent inherent source of uncertainty in SHM-oriented WSNs. The effects of this factor on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques are then investigated when merging data from multiple sensor setups. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and data-driven Stochastic Subspace Identification (SSI-data), as both have been widely applied in the past decade. Accelerations collected by a wired sensor system on a large-scale laboratory bridge model are used as benchmark data, after noise is added to account for the higher presence of this factor in SHM-oriented WSNs. From this source, a large number of simulations were run to generate multiple DSE-corrupted datasets for statistical analysis. The results show the robustness of FDD and the precautions needed for the SSI-data family when dealing with DSE at a relaxed level. Finally, the combination of preferred OMA techniques and the use of channel projection for the time-domain OMA technique to cope with DSE are recommended.
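As a rough illustration of the FDD family mentioned above, the sketch below applies a basic frequency domain decomposition (cross-spectral matrix plus singular value decomposition) to synthetic two-channel accelerations with a small simulated clock offset. The signal parameters and the offset are assumptions for illustration and do not reproduce the paper's benchmark data or its DSE levels.

```python
# Hedged FDD sketch on synthetic data with a simulated synchronization error.
import numpy as np
from scipy import signal

fs = 100.0                      # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
f_true = 2.5                    # one structural mode, for illustration
dse = 0.02                      # 20 ms clock offset on the second channel
x1 = np.sin(2 * np.pi * f_true * t) + 0.1 * np.random.randn(t.size)
x2 = 0.8 * np.sin(2 * np.pi * f_true * (t - dse)) + 0.1 * np.random.randn(t.size)
data = np.vstack([x1, x2])

# Cross power spectral density matrix G(f) for every channel pair.
n_ch = data.shape[0]
freqs, _ = signal.csd(data[0], data[0], fs=fs, nperseg=1024)
G = np.zeros((freqs.size, n_ch, n_ch), dtype=complex)
for i in range(n_ch):
    for j in range(n_ch):
        _, G[:, i, j] = signal.csd(data[i], data[j], fs=fs, nperseg=1024)

# Peaks of the first singular value of G(f) indicate natural frequencies;
# the corresponding singular vectors approximate the mode shapes.
s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(freqs.size)])
print("identified frequency ~", freqs[np.argmax(s1)], "Hz")
```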
Abstract:
The wind loading on most structural elements is made up of both an external and an internal pressure. Internal pressures are also important for the design of naturally ventilated buildings. The internal pressure results from the interaction between the external pressure propagating through the building envelope and any internal plant causing building pressurization. Although the external pressure field can be well defined through a series of wind tunnel tests, modeling complexities make accurate prediction of the internal pressure difficult. In commercial testing for the determination of design cladding pressures, an internal pressure coefficient is generally assumed from wind loading standards. Several theories regarding the propagation of internal pressures through single and multiple dominant openings have been proposed for small and large flexible buildings (Harris (1990), Holmes (1979), Liu & Saathoff (1981), Vickery (1986, 1994), Vickery & Bloxham (1992), Vickery & Georgiou (1991))...
Abstract:
Recently, a variety of high-aspect-ratio nanostructures have been grown and profiled for various applications ranging from field emission transistors to gene/drug delivery devices. However, fabricating and processing arrays of these structures, and determining how changing certain physical parameters affects the final outcome, is quite challenging. We have developed several modules that can be used to simulate the processes of various physical vapour deposition systems, from precursor interactions in the gas phase to gas-surface interactions and surface processes. In this paper, multi-scale hybrid numerical simulations are used to study how low-temperature non-equilibrium plasmas can be employed in the processing of high-aspect-ratio structures such that the resulting nanostructures have properties suitable for their eventual device application. We show that whilst using plasma techniques is beneficial in many nanofabrication processes, it is especially useful in making dense arrays of high-aspect-ratio nanostructures.
Abstract:
The ability to build high-fidelity 3D representations of the environment from sensor data is critical for autonomous robots. Multi-sensor data fusion allows for more complete and accurate representations. Furthermore, using distinct sensing modalities (i.e. sensors using a different physical process and/or operating at different electromagnetic frequencies) usually leads to more reliable perception, especially in challenging environments, as modalities may complement each other. However, they may react differently to certain materials or environmental conditions, leading to catastrophic fusion. In this paper, we propose a new method to reliably fuse data from multiple sensing modalities, including in situations where they detect different targets. We first compute distinct continuous surface representations for each sensing modality, with uncertainty, using Gaussian Process Implicit Surfaces (GPIS). Second, we perform a local consistency test between these representations, to separate consistent data (i.e. data corresponding to the detection of the same target by the sensors) from inconsistent data. The consistent data can then be fused together, using another GPIS process, and the rest of the data can be combined as appropriate. The approach is first validated using synthetic data. We then demonstrate its benefit using a mobile robot, equipped with a laser scanner and a radar, which operates in an outdoor environment in the presence of large clouds of airborne dust and smoke.
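The sketch below illustrates, on a one-dimensional toy problem, the two steps described above: fit a separate Gaussian process per sensing modality and then apply a local consistency test before fusing. A simple scalar signal stands in for the implicit surface values, and the kernel and the consistency threshold are assumptions, not the paper's GPIS formulation.

```python
# Hedged sketch: per-modality GP regression plus a local consistency test.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
x_laser = rng.uniform(0, 10, 40)[:, None]
x_radar = rng.uniform(0, 10, 40)[:, None]
y_laser = np.sin(x_laser).ravel() + 0.05 * rng.normal(size=40)
y_radar = np.sin(x_radar).ravel() + 0.05 * rng.normal(size=40)
# Simulate the laser returning a spurious target (e.g. a dust cloud) in [4, 6].
y_laser[(x_laser.ravel() > 4) & (x_laser.ravel() < 6)] += 1.5

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gp_laser = GaussianProcessRegressor(kernel=kernel).fit(x_laser, y_laser)
gp_radar = GaussianProcessRegressor(kernel=kernel).fit(x_radar, y_radar)

xq = np.linspace(0, 10, 200)[:, None]
mu_l, sd_l = gp_laser.predict(xq, return_std=True)
mu_r, sd_r = gp_radar.predict(xq, return_std=True)

# Consistency test: fuse only where the two modalities agree within their
# combined uncertainty; elsewhere, handle the data separately.
consistent = np.abs(mu_l - mu_r) < 2.0 * np.sqrt(sd_l**2 + sd_r**2)
print(f"{consistent.mean():.0%} of query points are consistent")
```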
Abstract:
This paper presents a novel method to rank map hypotheses by the quality of localization they afford. The highest ranked hypothesis at any moment becomes the active representation that is used to guide the robot to its goal location. A single static representation is insufficient for navigation in dynamic environments where paths can be blocked periodically, a common scenario which poses significant challenges for typical planners. In our approach we simultaneously rank multiple map hypotheses by the influence that localization in each of them has on locally accurate odometry. This is done online for the current locally accurate window by formulating a factor graph of odometry relaxed by localization constraints. Comparing the resulting perturbed odometry of each hypothesis with the original odometry yields a score that can be used to rank map hypotheses by their utility. We deploy the proposed approach on a real robot navigating a structurally noisy office environment. The configuration of the environment is physically altered outside the robot's sensory horizon during navigation tasks to demonstrate the proposed hypothesis selection approach.
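A rough sketch of the scoring idea described above, on a one-dimensional toy problem: an odometry chain is relaxed by localization constraints from one map hypothesis via least squares, and the hypothesis is scored by how far the relaxed trajectory departs from the original odometry. The 1D poses, factor weights and example maps are assumptions for illustration, not the paper's factor graph formulation.

```python
# Hedged sketch: score map hypotheses by their perturbation of local odometry.
import numpy as np

def score_hypothesis(odom, loc, w_odom=1.0, w_loc=0.5):
    """odom: original 1D trajectory; loc: localization estimates from one map
    hypothesis at the same indices. Returns (relaxed trajectory, score)."""
    n = len(odom)
    A, b = [], []
    # Odometry factors: successive differences should match the original ones.
    for i in range(n - 1):
        row = np.zeros(n)
        row[i], row[i + 1] = -w_odom, w_odom
        A.append(row)
        b.append(w_odom * (odom[i + 1] - odom[i]))
    # Localization factors pull poses towards the hypothesis' estimates.
    for i in range(n):
        row = np.zeros(n)
        row[i] = w_loc
        A.append(row)
        b.append(w_loc * loc[i])
    x, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    score = np.linalg.norm(x - odom)   # smaller = hypothesis agrees with odometry
    return x, score

odom = np.linspace(0.0, 10.0, 11)
good_map = odom + np.random.normal(0, 0.05, odom.size)     # consistent hypothesis
stale_map = odom + np.linspace(0, 3.0, odom.size)          # systematically wrong
print(score_hypothesis(odom, good_map)[1], score_hypothesis(odom, stale_map)[1])
```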
Abstract:
Live migration of multiple Virtual Machines (VMs) has become an integral management activity in data centers for power saving, load balancing and system maintenance. While state-of-the-art live migration techniques focus on improving the migration performance of an independent single VM, little has been investigated for the case of live migration of multiple interacting VMs. Live migration is largely influenced by network bandwidth, and arbitrarily migrating a VM which has data inter-dependencies with other VMs may increase bandwidth consumption and adversely affect the performance of subsequent migrations. In this paper, we propose a Random Key Genetic Algorithm (RKGA) that efficiently schedules the migration of a given set of VMs, accounting for both inter-VM dependencies and the data center communication network. The experimental results show that the RKGA can schedule the migration of multiple VMs with significantly shorter total migration time and total downtime compared to a heuristic algorithm.
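The sketch below illustrates the random-key encoding that gives an RKGA its name: each chromosome is a vector of real-valued keys that is decoded into a migration order by sorting. The toy fitness function, dependency model and GA parameters are assumptions for illustration only and do not reproduce the paper's model of inter-VM dependencies or network bandwidth.

```python
# Hedged sketch of a random-key genetic algorithm for ordering VM migrations.
import numpy as np

rng = np.random.default_rng(42)
n_vms = 8
mem = rng.uniform(1, 8, n_vms)                 # GB of memory to transfer per VM
dep = (rng.random((n_vms, n_vms)) < 0.2)       # dep[i, j]: VM i depends on VM j

def decode(keys):
    return np.argsort(keys)                    # smaller key = migrate earlier

def migration_cost(order):
    # Toy cost: base transfer time plus a penalty whenever a VM is migrated
    # before another VM it depends on (dependency traffic crosses the network).
    pos = np.argsort(order)
    penalty = sum(mem[i] for i in range(n_vms) for j in range(n_vms)
                  if dep[i, j] and pos[i] < pos[j])
    return mem.sum() + 0.5 * penalty

pop = rng.random((40, n_vms))                  # population of random-key chromosomes
for generation in range(100):
    fitness = np.array([migration_cost(decode(c)) for c in pop])
    elite = pop[np.argsort(fitness)[:10]]      # keep the best 10 chromosomes
    # Uniform crossover between random elite parents, plus fresh immigrants.
    parents = elite[rng.integers(0, 10, (28, 2))]
    mask = rng.random((28, n_vms)) < 0.7
    children = np.where(mask, parents[:, 0], parents[:, 1])
    pop = np.vstack([elite, children, rng.random((2, n_vms))])

best = decode(pop[np.argmin([migration_cost(decode(c)) for c in pop])])
print("best migration order:", best)
```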
Abstract:
This chapter describes decentralized data fusion algorithms for a team of multiple autonomous platforms. Decentralized data fusion (DDF) provides a useful basis on which to build cooperative information gathering tasks for robotic teams operating in outdoor environments. Through the DDF algorithms, each platform can maintain a consistent global solution from which decisions may then be made. Comparisons are made between implementations of DDF using two probabilistic representations, the first based on Gaussian estimates and the second on Gaussian mixtures, evaluated on a common data set. The overall system design is detailed, providing insight into the complexity of implementing a robust DDF system for information gathering tasks in outdoor UAV applications.
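A minimal sketch of how two platforms holding Gaussian estimates can fuse them in information form, which is a common building block of DDF: each node exchanges its information matrix and vector, and previously shared (common) information is subtracted to avoid double counting. The numbers are invented, and the chapter's full system (communication, Gaussian mixtures) is not reproduced.

```python
# Hedged sketch: information-form fusion of two Gaussian estimates.
import numpy as np

def to_info(x, P):
    Y = np.linalg.inv(P)
    return Y, Y @ x

def from_info(Y, y):
    P = np.linalg.inv(Y)
    return P @ y, P

# Local estimates of a 2D feature position held by two platforms (illustrative).
x_a, P_a = np.array([2.0, 1.0]), np.diag([0.5, 0.8])
x_b, P_b = np.array([2.3, 0.9]), np.diag([0.9, 0.4])

Y_a, y_a = to_info(x_a, P_a)
Y_b, y_b = to_info(x_b, P_b)

# Common (previously communicated) information; zero here for a first exchange.
Y_common, y_common = np.zeros((2, 2)), np.zeros(2)

Y_fused = Y_a + Y_b - Y_common
y_fused = y_a + y_b - y_common
x_fused, P_fused = from_info(Y_fused, y_fused)
print("fused estimate:", x_fused, "\nfused covariance:\n", P_fused)
```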
Abstract:
The mining industry faces three long-term strategic risks in relation to its water and energy use: 1) securing enough water and energy to meet increased production; 2) reducing water use, energy consumption and emissions due to social, environmental and economic pressures; and 3) understanding the links between water and energy, so that an improvement in one area does not create an adverse effect in another. This project helps the industry analyse these risks by creating a hierarchical systems model (HSM) that represents water and energy interactions at sub-site, site and regional scales, coupled with a flexible risk framework. The HSM consists of components that represent sources of water and energy; activities that use water and energy; and off-site destinations of water and produced emissions. It can also represent more complex components on a site, with inbuilt examples including tailings dams and water treatment plants. The HSM also allows multiple sites and other infrastructure to be connected together to explore regional water and energy interactions. By representing water and energy as a single interconnected system, the HSM can explore trade-offs and synergies. For example, on a synthetic case study representing a typical site, simulations suggested that while a synergy in terms of water use and energy use could be achieved when chemical additives were used to enhance dust suppression, there were trade-offs when either thickened tailings or dry processing were used. On a regional scale, the HSM was used to simulate various scenarios, including: mines only withdrawing water when needed; achieving economies of scale through use of a single centralised treatment plant rather than smaller decentralised treatment plants; and capturing fugitive emissions for energy generation. The HSM also includes an integrated risk framework for interpreting model output, so that onsite and off-site impacts of various water and energy management strategies can be compared in a managerial context. The case studies in this report explored company, social and environmental risks for scenarios of regional water scarcity, unregulated saline discharge, and the use of plantation forestry to offset carbon emissions. The HSM was able to represent non-linear causal relationships at the regional scale, such as a forestry scheme offsetting a small percentage of carbon emissions but causing severe regional water shortages. The HSM software developed in this project will be released as an open source tool to allow industry personnel to easily and inexpensively quantify and explore the links between water use, energy use, and carbon emissions. The tool can be easily adapted to represent specific sites or regions. Case studies conducted in this project highlighted the potential complexity of these links between water, energy, and carbon emissions, as well as the significance of their cumulative effects over time. A deeper understanding of these links is vital if the mining industry is to progress to more sustainable operations, and the HSM provides an accessible, robust framework for investigating them.
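A toy sketch of the hierarchical structure the HSM describes: components that supply water and energy, activities that consume them, and site-level totals that can be aggregated regionally. All class names and numbers are invented for illustration and are not taken from the project's model or software.

```python
# Hedged sketch of a site/activity hierarchy for water, energy and emissions.
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    water_use_ml: float      # megalitres per period (illustrative)
    energy_use_gj: float     # gigajoules per period (illustrative)
    emissions_t_co2e: float  # tonnes CO2-e per period (illustrative)

@dataclass
class Site:
    name: str
    activities: list = field(default_factory=list)

    def totals(self):
        return {
            "water_ML": sum(a.water_use_ml for a in self.activities),
            "energy_GJ": sum(a.energy_use_gj for a in self.activities),
            "emissions_tCO2e": sum(a.emissions_t_co2e for a in self.activities),
        }

site = Site("synthetic mine", [
    Activity("processing plant", water_use_ml=120, energy_use_gj=900, emissions_t_co2e=70),
    Activity("dust suppression", water_use_ml=40, energy_use_gj=30, emissions_t_co2e=2),
    Activity("tailings dam", water_use_ml=15, energy_use_gj=10, emissions_t_co2e=1),
])
print(site.totals())   # several Site objects could be summed for a regional view
```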
Abstract:
In this paper we describe CubIT, a multi-user presentation and collaboration system installed at the Queensland University of Technology’s (QUT) Cube facility. The ‘Cube’ is an interactive visualisation facility made up of five very large-scale interactive multi-panel wall displays, each consisting of up to twelve 55-inch multi-touch screens (48 screens in total), with massive projected display screens situated above the display panels. The paper outlines the unique design challenges, features, implementation and evaluation of CubIT. The system was built to make the Cube facility accessible to QUT’s academic and student population. CubIT enables users to easily upload and share their own media content, and allows multiple users to simultaneously interact with the Cube’s wall displays. The features of CubIT were implemented via three user interfaces: a multi-touch interface running on the wall displays, a mobile phone and tablet application, and a web-based content management system. Each of these interfaces plays a different role and offers different interaction mechanisms. Together they support a wide range of collaborative features including multi-user shared workspaces, drag-and-drop upload and sharing between users, session management, and dynamic state control between different parts of the system. The results of our evaluation study showed that CubIT was successfully used for a variety of tasks, and highlighted challenges with regard to user expectations about functionality as well as issues arising from public use.
Abstract:
Although the collection of player and ball tracking data is fast becoming the norm in professional sports, large-scale mining of such spatiotemporal data has yet to surface. In this paper, given an entire season's worth of player and ball tracking data from a professional soccer league (approximately 400,000,000 data points), we present a method which can conduct both individual player and team analysis. Due to the dynamic, continuous and multi-player nature of team sports like soccer, a major issue is aligning player positions over time. We present a "role-based" representation that dynamically updates each player's relative role at each frame, and demonstrate how this captures the short-term context to enable both individual player and team analysis. We discover roles directly from data by utilizing a minimum entropy data partitioning method, and show how this can be used to accurately detect and visualize formations, as well as analyze individual player behavior.
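A simplified sketch of the per-frame player-to-role alignment idea: given a set of role "templates" (mean positions) and the current player positions, each player is assigned to a role via a one-to-one minimum-cost assignment. This is a stand-in for the minimum entropy data partitioning used in the paper, and all positions are fabricated for illustration.

```python
# Hedged sketch: align tracked players to roles with a linear assignment.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(7)
n_players = 10
role_templates = rng.uniform(0, 100, (n_players, 2))      # assumed role positions (x, y)
frame_positions = role_templates + rng.normal(0, 5, (n_players, 2))
rng.shuffle(frame_positions)                               # tracking order is arbitrary

# Cost matrix: squared distance between every player track and every role template.
cost = ((frame_positions[:, None, :] - role_templates[None, :, :]) ** 2).sum(axis=2)
players, roles = linear_sum_assignment(cost)
for p, r in zip(players, roles):
    print(f"player track {p} -> role {r}")
```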
Abstract:
1. Marine ecosystems provide critically important goods and services to society, and hence their accelerated degradation underpins an urgent need to take rapid, ambitious and informed decisions regarding their conservation and management. 2. The capacity, however, to generate the detailed field data required to inform conservation planning at appropriate scales is limited by time- and resource-consuming methods for collecting and analysing field data at the large scales required. 3. The ‘Catlin Seaview Survey’, described here, introduces a novel framework for large-scale monitoring of coral reefs using high-definition underwater imagery collected using customized underwater vehicles in combination with computer vision and machine learning. This enables quantitative and geo-referenced outputs of coral reef features such as habitat types, benthic composition, and structural complexity (rugosity) to be generated across multiple kilometre-scale transects with a spatial resolution ranging from 2 to 6 m2. 4. The novel application of technology described here has enormous potential to contribute to our understanding of coral reefs and associated impacts by underpinning management decisions with kilometre-scale measurements of reef health. 5. Imagery datasets from an initial survey of 500 km of seascape are freely available through an online tool called the Catlin Global Reef Record. Outputs from the image analysis using the technologies described here will be updated on the online repository as work progresses on each dataset. 6. Case studies illustrate the utility of outputs as well as their potential to link to information from remote sensing. The potential implications of the innovative technologies for marine resource management and conservation are also discussed, along with the accuracy and efficiency of the methodologies deployed.
Abstract:
Purpose: This study investigated the impact of simulated hyperopic anisometropia and sustained near work on performance of academic-related measures in children. Methods: Participants included 16 children (mean age: 11.1 ± 0.8 years) with minimal refractive error. Academic-related outcome measures included a reading test (Neale Analysis of Reading Ability), visual information processing tests (Coding and Symbol Search subtests from the Wechsler Intelligence Scale for Children) and a reading-related eye movement test (Developmental Eye Movement test). Performance was assessed with and without 0.75 D of imposed monocular hyperopic defocus (administered in a randomised order), before and after 20 minutes of sustained near work. Unilateral hyperopic defocus was systematically assigned to either the dominant or non-dominant sighting eye to evaluate the impact of ocular dominance on any performance decrements. Results: Simulated hyperopic anisometropia and sustained near work both independently reduced performance on all of the outcome measures (p<0.001). A significant interaction was also observed between simulated anisometropia and near work (p<0.05), with the greatest decrement in performance observed during simulated anisometropia in combination with sustained near work. Laterality of the refractive error simulation (ocular dominance) did not significantly influence the outcome measures (p>0.05). A reduction of up to 12% in performance was observed across the range of academic-related measures following sustained near work undertaken during the anisometropic simulation. Conclusion: Simulated hyperopic anisometropia significantly impaired academic-related performance, particularly in combination with sustained near work. The impact of uncorrected habitual anisometropia on academic-related performance in children requires further investigation.
Abstract:
Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with the multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better-informed decision making.
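A minimal sketch of a two-level Bayesian hierarchical model for coral cover, with reef-level means partially pooled towards a regional mean, written with PyMC. This illustrates hierarchical partial pooling only; the paper's semi-parametric, four-tiered model is not reproduced, and all data and priors below are invented.

```python
# Hedged sketch: two-level hierarchical model on synthetic coral cover data.
import numpy as np
import pymc as pm

rng = np.random.default_rng(3)
n_reefs, n_obs = 5, 20
true_reef_means = rng.normal(0.35, 0.08, n_reefs)               # cover proportions
reef_idx = np.repeat(np.arange(n_reefs), n_obs)
y = np.clip(rng.normal(true_reef_means[reef_idx], 0.05), 0, 1)  # observed cover

with pm.Model():
    mu_region = pm.Normal("mu_region", mu=0.3, sigma=0.2)        # regional mean
    sigma_reef = pm.HalfNormal("sigma_reef", sigma=0.1)          # between-reef spread
    mu_reef = pm.Normal("mu_reef", mu=mu_region, sigma=sigma_reef, shape=n_reefs)
    sigma_obs = pm.HalfNormal("sigma_obs", sigma=0.1)            # within-reef noise
    pm.Normal("obs", mu=mu_reef[reef_idx], sigma=sigma_obs, observed=y)
    trace = pm.sample(1000, tune=1000, chains=2, progressbar=False)

print(trace.posterior["mu_region"].mean().item())
```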
Abstract:
Most of the existing algorithms for approximate Bayesian computation (ABC) assume that it is feasible to simulate pseudo-data from the model at each iteration. However, the computational cost of these simulations can be prohibitive for high dimensional data. An important example is the Potts model, which is commonly used in image analysis. Images encountered in real world applications can have millions of pixels, therefore scalability is a major concern. We apply ABC with a synthetic likelihood to the hidden Potts model with additive Gaussian noise. Using a pre-processing step, we fit a binding function to model the relationship between the model parameters and the synthetic likelihood parameters. Our numerical experiments demonstrate that the precomputed binding function dramatically improves the scalability of ABC, reducing the average runtime required for model fitting from 71 hours to only 7 minutes. We also illustrate the method by estimating the smoothing parameter for remotely sensed satellite imagery. Without precomputation, Bayesian inference is impractical for datasets of that scale.
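A toy sketch of the "precomputed binding function" idea described above: simulate from the model on a grid of parameter values once, fit a smooth mapping from the parameter to the mean and standard deviation of a summary statistic, and then evaluate a synthetic (Gaussian) likelihood without any further simulation. A trivial simulator stands in for the hidden Potts model, and all settings are assumptions for illustration.

```python
# Hedged sketch: synthetic likelihood via a precomputed binding function.
import numpy as np

rng = np.random.default_rng(11)

def simulate_summary(beta, size=1000):
    # Stand-in simulator: the summary statistic grows smoothly with beta.
    return np.tanh(beta) * size + rng.normal(0, 25)

# 1. Pre-processing: build the binding function on a grid of beta values.
grid = np.linspace(0.0, 2.0, 41)
sims = np.array([[simulate_summary(b) for _ in range(50)] for b in grid])
mean_fit = np.polynomial.Polynomial.fit(grid, sims.mean(axis=1), deg=3)
sd_fit = np.polynomial.Polynomial.fit(grid, sims.std(axis=1), deg=3)

# 2. Inference: synthetic log-likelihood of the observed summary statistic,
#    evaluated from the binding function alone (no new simulations needed).
def synthetic_loglik(beta, s_obs):
    m, s = mean_fit(beta), max(sd_fit(beta), 1e-6)
    return -0.5 * ((s_obs - m) / s) ** 2 - np.log(s)

s_obs = simulate_summary(0.8)                      # pretend this is the observed data
betas = np.linspace(0.0, 2.0, 201)
loglik = np.array([synthetic_loglik(b, s_obs) for b in betas])
print("synthetic-likelihood estimate of beta:", betas[np.argmax(loglik)])
```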