90 results for Large-scale Distribution

at Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with the multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock the value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better-informed decision making.
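The abstract does not give the model equations, so the following is only an illustrative sketch of what a multi-tiered hierarchical specification for coral cover could look like; the tiers, priors and link function shown here are assumptions for illustration, not the authors' exact model.

```latex
\begin{align*}
y_{rst} &\sim \operatorname{Beta}\bigl(\mu_{rst}\,\phi,\ (1-\mu_{rst})\,\phi\bigr)
  && \text{observed cover on transect } t \\
\operatorname{logit}(\mu_{rst}) &= f(\text{year}) + u_r + v_{rs} + w_{rst}
  && \text{smooth temporal trend plus scale-specific effects} \\
u_r \sim \mathcal{N}(0,\sigma_u^2),\quad
v_{rs} &\sim \mathcal{N}(0,\sigma_v^2),\quad
w_{rst} \sim \mathcal{N}(0,\sigma_w^2)
  && \text{variability partitioned across spatial scales}
\end{align*}
```

In a structure of this kind the variance components at each level are what allow uncertainty to be reported separately for individual reefs and for aggregates of reefs, which is the distinction the abstract draws.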

Relevance: 100.00%

Abstract:

PURPOSE: This paper describes dynamic agent composition, used to support the development of flexible and extensible large-scale agent-based models (ABMs). This approach was motivated by a need to extend and modify, with ease, an ABM with an underlying networked structure as more information becomes available. Flexibility was also sought so that simulations can be set up easily, without the need to program. METHODS: The dynamic agent composition approach consists of having agents, whose implementation has been broken into atomic units, come together at runtime to form the complex system representation on which simulations are run. These components capture information at a fine level of detail and provide a vast range of combinations and options for a modeller to create ABMs. RESULTS: A description of dynamic agent composition is given in this paper, as well as details of its implementation within MODAM (MODular Agent-based Model), a software framework applied to the planning of the electricity distribution network. Illustrations of the implementation of dynamic agent composition are given for that domain throughout the paper. It is, however, expected that this approach will be beneficial to other problem domains, especially those with a networked structure, such as water or gas networks. CONCLUSIONS: Dynamic agent composition has many advantages over the way agent-based models are traditionally built, for users and developers as well as for agent-based modelling as a scientific approach. Developers can extend the model without the need to access or modify previously written code; they can develop groups of entities independently and add them to those already defined to extend the model. Users can mix and match already implemented components to form large-scale ABMs, allowing them to quickly set up simulations and easily compare scenarios without the need to program. Dynamic agent composition provides a natural simulation space over which ABMs of networked structures are represented, facilitating their implementation; verification and validation of models is also facilitated by quickly setting up alternative simulations.
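MODAM's actual API is not given in the abstract; the sketch below is a hypothetical Python illustration of the composition idea, in which atomic components are registered by name and assembled into an agent at runtime, so new behaviour can be mixed in without touching existing code. All class and component names here are invented for illustration.

```python
# Hypothetical sketch of dynamic agent composition (not MODAM's API): atomic
# components are registered by name and assembled into agents at runtime.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


class Component:
    """Atomic unit of state or behaviour that an agent can be built from."""
    def step(self, agent: "Agent", t: int) -> None:
        pass


@dataclass
class Agent:
    name: str
    components: List[Component] = field(default_factory=list)

    def step(self, t: int) -> None:
        for c in self.components:          # each component contributes one concern
            c.step(self, t)


REGISTRY: Dict[str, Callable[[], Component]] = {}

def register(name: str):
    def deco(cls):
        REGISTRY[name] = cls
        return cls
    return deco


@register("load_profile")
class LoadProfile(Component):
    def step(self, agent, t):
        agent.demand_kw = 1.0 + 0.5 * (t % 24 > 17)          # toy evening peak


@register("rooftop_pv")
class RooftopPV(Component):
    def step(self, agent, t):
        agent.generation_kw = max(0.0, 3.0 - abs(t % 24 - 12) * 0.5)


def compose(name: str, component_names: List[str]) -> Agent:
    """Assemble an agent from already-registered components, without new code."""
    return Agent(name, [REGISTRY[c]() for c in component_names])


house = compose("house_42", ["load_profile", "rooftop_pv"])  # mix and match at runtime
for t in range(24):
    house.step(t)
```

The point of the registry is that a developer can add a new component class in isolation and a user can include it in a simulation by name, which mirrors the extend-without-modifying claim in the conclusions.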

Relevance: 100.00%

Abstract:

We consider the problem of controlling a Markov decision process (MDP) with a large state space, so as to minimize average cost. Since it is intractable to compete with the optimal policy for large scale problems, we pursue the more modest goal of competing with a low-dimensional family of policies. We use the dual linear programming formulation of the MDP average cost problem, in which the variable is a stationary distribution over state-action pairs, and we consider a neighborhood of a low-dimensional subset of the set of stationary distributions (defined in terms of state-action features) as the comparison class. We propose a technique based on stochastic convex optimization and give bounds that show that the performance of our algorithm approaches the best achievable by any policy in the comparison class. Most importantly, this result depends on the size of the comparison class, but not on the size of the state space. Preliminary experiments show the effectiveness of the proposed algorithm in a queuing application.
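For context, the dual linear programming formulation of the average-cost MDP problem referred to here optimises over stationary state-action distributions mu; in its standard form (notation assumed: c the cost function, P the transition kernel) it reads:

```latex
\begin{align*}
\min_{\mu \ge 0}\ & \sum_{s,a} \mu(s,a)\, c(s,a) \\
\text{s.t. }\ & \sum_{a'} \mu(s',a') \;=\; \sum_{s,a} P(s' \mid s,a)\, \mu(s,a) \quad \forall s', \\
& \sum_{s,a} \mu(s,a) \;=\; 1 .
\end{align*}
```

The comparison class described in the abstract then restricts mu to a neighbourhood of a low-dimensional, feature-defined subset of this feasible set, rather than searching over all stationary distributions, which is what makes the guarantees independent of the size of the state space.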

Relevance: 100.00%

Abstract:

Most departmental computing infrastructure reflects the state of networking technology and available funds at the time of construction, which converge in a preconceived notion of homogeneity of network architecture and usage patterns. The DMAN (Digital Media Access Network) project, a large-scale server and network foundation for the Hong Kong Polytechnic University's School of Design, was created as a platform that would support a highly complex academic environment while giving maximum freedom to students, faculty and researchers through simplicity and ease of use. As a centralized multi-user computation backbone, DMAN faces an extremely heterogeneous user and application profile, exceeding the implementation and maintenance challenges of typical enterprise, and even most academic, server set-ups. This paper summarizes the specification, implementation and application of the system while describing its significance for design education in a computational context.

Relevance: 100.00%

Abstract:

This paper draws on a study of government initiatives aimed at facilitating economic development, specifically the Multifunction Polis Feasibility Study involving the governments and business enterprises of Australia and Japan (1987-1991). Large-scale projects that involve collaboration between government and business (termed large-scale collaborative ventures, LSCVs) are identified as one aspect of competing in the new economy. The study pursued the research proposition that an LSCV can be effectively facilitated by following a theory-based process similar to those used in corporate practice. An approach to managing such ventures is outlined, based on strategic marketing theory, that may enhance their success and thereby help countries participate more successfully in global competition through such ventures.

Relevance: 100.00%

Abstract:

This paper relates to government initiatives which aim at advancing their country's economic development and investor attractiveness. It identifies large-scale projects that involve collaboration between government and business (termed: large-scale collaborative venture – LSCV) as one aspect of competing in the new economy. The study pursued the research proposition that an LSCV can be effectively facilitated by following a theory-based process similar to what is used in corporate practice. An approach to managing such ventures is outlined, based on strategic marketing theory applied to a major project, the Multifunction Polis. It is proposed that such an approach may enhance the success of a collaborative venture and thereby help countries participate more successfully in global competition through such ventures.

Relevance: 100.00%

Abstract:

We introduce K-tree in an information retrieval context. It is an efficient approximation of the k-means clustering algorithm, but unlike k-means it forms a hierarchy of clusters. It has been extended to address issues with sparse representations. We compare performance and quality to CLUTO using document collections. The K-tree has a low time complexity that is suitable for large document collections, and its tree structure allows for efficient disk-based implementations where space requirements exceed that of main memory.
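The abstract does not describe the K-tree insertion algorithm itself; the following is only a rough Python sketch of the general idea of a cluster hierarchy with sub-linear nearest-cluster lookup, built here by recursive k-means rather than by the incremental construction the actual K-tree uses. All parameter values are illustrative.

```python
# Illustrative only: a hierarchy of clusters built by recursive k-means,
# in the spirit of (but not identical to) the K-tree described in the abstract.
import numpy as np
from sklearn.cluster import KMeans


def cluster_tree(X, branching=4, leaf_size=50, depth=0, max_depth=6):
    """Recursively split X into `branching` clusters until leaves are small."""
    if len(X) <= leaf_size or depth >= max_depth:
        return {"mean": X.mean(axis=0), "points": X}           # leaf node
    km = KMeans(n_clusters=branching, n_init=5, random_state=0).fit(X)
    children = [cluster_tree(X[km.labels_ == c], branching, leaf_size,
                             depth + 1, max_depth)
                for c in range(branching)]
    return {"mean": X.mean(axis=0), "children": children,
            "centroids": km.cluster_centers_}


def nearest_leaf(node, q):
    """Descend by nearest centroid: cost grows with branching * depth, not n."""
    while "children" in node:
        dists = np.linalg.norm(node["centroids"] - q, axis=1)
        node = node["children"][int(np.argmin(dists))]
    return node


# Toy usage on random "document vectors"
X = np.random.rand(2000, 64)
tree = cluster_tree(X)
leaf = nearest_leaf(tree, X[0])
```

The logarithmic-depth search over cluster means is what gives this family of structures the low lookup cost the abstract refers to, and the leaves can be paged to disk when the collection does not fit in main memory.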

Relevance: 100.00%

Abstract:

Climate change and human activity are subjecting the environment to unprecedented rates of change. Monitoring these changes is an immense task that demands new levels of automated monitoring and analysis. We propose the use of acoustics as a proxy for the time-consuming auditing of fauna, especially for determining the presence or absence of species. Acoustic monitoring is deceptively simple; seemingly all that is required is a sound recorder. However, there are many major challenges if acoustics are to be used for large-scale monitoring of ecosystems. Key issues are scalability and automation. This paper discusses our approach to this important research problem. Our work is being undertaken in collaboration with ecologists interested both in identifying particular species and in general ecosystem health.

Relevance: 100.00%

Abstract:

Precise, up-to-date and increasingly detailed road maps are crucial for various advanced road applications, such as lane-level vehicle navigation and advanced driver assistance systems. With very high resolution (VHR) imagery from digital airborne sources, data acquisition, collection and updating would be greatly facilitated if road details could be automatically extracted from the aerial images. In this paper, we propose an effective approach to detect road lane information from aerial images using object-oriented image analysis. Our algorithm starts by constructing the DSM and true orthophotos from the stereo images. The road lane details are then detected using an object-oriented, rule-based image classification approach. Because other objects with similar spectral and geometrical attributes interfere with the extraction, the extracted road lanes are filtered with the road surface obtained by a progressive two-class decision classifier. The generated road network is evaluated using datasets provided by the Queensland Department of Main Roads. The evaluation shows completeness values that range between 76% and 98% and correctness values that range between 82% and 97%.
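The completeness and correctness figures quoted above are standard road-extraction quality measures. As a small illustration (not the paper's evaluation code), they can be computed from rasterised binary masks roughly as follows; buffered matching of the reference, normally part of the evaluation protocol, is omitted for brevity.

```python
# Sketch of the completeness / correctness measures on binary masks.
import numpy as np

def completeness_correctness(extracted: np.ndarray, reference: np.ndarray):
    """extracted, reference: boolean masks of road-lane pixels."""
    tp = np.logical_and(extracted, reference).sum()
    completeness = tp / reference.sum()        # fraction of the reference that was found
    correctness = tp / extracted.sum()         # fraction of the extraction that is correct
    return completeness, correctness

# Toy usage
ref = np.zeros((100, 100), bool); ref[40:60, :] = True
ext = np.zeros((100, 100), bool); ext[45:65, :] = True
print(completeness_correctness(ext, ref))
```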

Relevance: 100.00%

Abstract:

The automatic extraction of road features from remotely sensed images has been a topic of great interest within the photogrammetric and remote sensing communities for over three decades. Although various techniques have been reported in the literature, it remains challenging to extract road details efficiently as image resolution increases and the demand for accurate, up-to-date road data grows. In this paper, we focus on the automatic detection of road lane markings, which are crucial for many applications, including lane-level navigation and lane departure warning. The approach consists of four steps: i) data preprocessing, ii) image segmentation and road surface detection, iii) road lane marking extraction based on the generated road surface, and iv) testing and system evaluation. The proposed approach utilizes the unsupervised ISODATA image segmentation algorithm, which segments the image into vegetation regions and road surface based only on the Cb component of the YCbCr color space. A shadow detection method based on the YCbCr color space is also employed to detect and recover the shadows cast on the road surface by vehicles and trees. Finally, the lane marking features are detected from the road surface using histogram clustering. Experiments applying the proposed method to the aerial imagery dataset of Gympie, Queensland demonstrate the efficiency of the approach.
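As a rough, hedged illustration of the colour-space logic described above (not the authors' implementation): the sketch below substitutes a 2-class k-means for ISODATA and a simple percentile threshold for the histogram-clustering step; the file name, channel assumptions and thresholds are all invented.

```python
# Hedged sketch of Cb-based road-surface segmentation and lane-marking thresholding.
import cv2
import numpy as np

img = cv2.imread("aerial_tile.tif")                      # hypothetical input tile
ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)           # OpenCV orders channels Y, Cr, Cb
y, cr, cb = cv2.split(ycrcb)

# 2-class clustering on Cb only (stand-in for the unsupervised ISODATA step)
samples = cb.reshape(-1, 1).astype(np.float32)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, labels, centers = cv2.kmeans(samples, 2, None, criteria, 5, cv2.KMEANS_PP_CENTERS)
road_label = int(np.argmin(centers))                     # assumption: road surface has the lower Cb mean
road_mask = (labels.reshape(cb.shape) == road_label).astype(np.uint8) * 255

# Lane markings: bright pixels on the detected road surface (the paper uses
# histogram clustering; a fixed percentile threshold is used here for brevity)
road_pixels = y[road_mask > 0]
thresh = np.percentile(road_pixels, 95)
markings = ((y > thresh) & (road_mask > 0)).astype(np.uint8) * 255
```

Shadow recovery, which the abstract also describes, would sit between the segmentation and thresholding steps and is not sketched here.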

Relevance: 100.00%

Abstract:

This paper describes technologies we have developed to perform autonomous large-scale off-world excavation. A scale dragline excavator, of a size similar to that required for lunar excavation, was made capable of autonomous control. Systems have been put in place to allow remote operation of the machine from anywhere in the world. Algorithms have been developed for complete autonomous digging and dumping of material, taking into account machine and terrain constraints and regolith variability. Experimental results are presented showing the ability to autonomously excavate and move large amounts of regolith and accurately place it at a specified location.