920 results for Ecosystem management -- Queensland -- Johnstone (Shire) -- Data processing.


Relevance:

100.00%

Publisher:

Abstract:

Data on fisheries catch and effort trends have been collected since the latter half of the 1800s. With current trends of declining stocks and stricter management regimes, data need to be collected and analyzed over shorter periods and at finer spatial resolution than in the past. New methods of electronic reporting may reduce the lag time in data collection and provide more accurate spatial resolution. In this study I evaluated the differences between fish dealer and vessel reporting systems for federal fisheries in the US New England and Mid-Atlantic areas. Using data on landing date, report date, gear used, port landed, number of hauls, number of fish sampled, and species quotas from available catch and effort records, I compared electronically collected dealer and vessel data against paper-collected dealer and vessel data to determine whether electronically collected data are timelier and more accurate. To determine whether vessel or dealer electronic reporting is more useful for management, I measured differences in timeliness and accuracy between vessel and dealer electronic reports. I also compared the cost and efficiency of these new methods with less technology-intensive reporting methods, using available cost data and surveys of seafood dealers for cost information. From this information I identified potentially unnecessary duplication of effort, as well as applications in ecosystem-based fisheries management. This information can be used to guide the decisions of fisheries managers in the United States and other countries that are attempting to identify appropriate fisheries reporting methods for the management regimes under consideration. (PDF contains 370 pages)
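
The thesis itself does not reproduce its analysis code; the following is a minimal sketch of the timeliness comparison described above, in which the CSV layout and the column names (landing_date, report_date, method) are illustrative assumptions rather than the study's actual data format.

```python
# Sketch: compare reporting lag (report date minus landing date) across
# reporting methods. Column names are hypothetical.
import csv
from collections import defaultdict
from datetime import date

def mean_lag_by_method(path):
    """Return the mean lag in days for each reporting method in a CSV of records."""
    lags = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            landed = date.fromisoformat(row["landing_date"])
            reported = date.fromisoformat(row["report_date"])
            lags[row["method"]].append((reported - landed).days)
    return {method: sum(v) / len(v) for method, v in lags.items()}

# Hypothetical output: {'dealer_electronic': 1.8, 'dealer_paper': 12.4, ...}
```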

Relevance:

100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

100.00%

Publisher:

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states, and over 1000 single-base RTK systems support precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future-generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS, and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated network and single-base RTK systems, together with multiple GNSS constellations, for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:

• multiple GNSS constellations and multiple frequencies;
• large-scale, wide-area NRTK services built from a network of networks;
• complex computation algorithms and processes;
• a greater part of the positioning process shifting from the user end to the network centre, which must cope with hundreds of simultaneous user requests (reverse RTK).

These four challenges lead to two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transferring capability. This research explores new approaches to address these future NRTK challenges and requirements using a Grid Computing facility, in particular for large data-processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed, consisting of three layers: 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. The user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework was performed in a five-node Grid environment at QUT and on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open-source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new NRTK framework based on Grid Computing, while some aspects of system performance remain to be improved in future work.
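
The abstract names Ntrip as the transport for real-time RTCM data. As a rough illustration of that data-collection step (not the framework's own client, which the abstract does not show), the sketch below implements a bare-bones Ntrip v1 client; the caster host, port, mountpoint, and credentials are placeholders.

```python
# Sketch: bare-bones Ntrip v1 client streaming raw RTCM frames from a
# caster mountpoint. Host, port, mountpoint, and credentials are placeholders.
import base64
import socket

def stream_rtcm(host, port, mountpoint, user, password):
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    request = (
        f"GET /{mountpoint} HTTP/1.0\r\n"
        "User-Agent: NTRIP SimpleClient/1.0\r\n"
        f"Authorization: Basic {creds}\r\n"
        "\r\n"
    )
    with socket.create_connection((host, port)) as sock:
        sock.sendall(request.encode())
        reply = sock.recv(4096)          # a v1 caster answers "ICY 200 OK"
        if not reply.startswith(b"ICY 200 OK"):
            raise RuntimeError(f"caster refused request: {reply[:40]!r}")
        while True:
            chunk = sock.recv(4096)      # raw RTCM bytes, ready for RTK processing
            if not chunk:
                break
            yield chunk
```

In a network-of-networks setting, one such stream would be opened per reference station and the chunks handed to the job scheduler for RTK computation.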

Relevance:

100.00%

Publisher:

Abstract:

The focus of the present research was to investigate how Local Governments in Queensland were progressing with the adoption of delineated disaster management (DM) policies and supporting guidelines. The study consulted Local Government representatives, and hence the results reflect their views on these issues. Is adoption occurring? To what degree? Are policies and guidelines being effectively implemented, so that the objective of a safer, more resilient community is being achieved? If not, what are the current barriers to achieving this, and can recommendations be made to overcome them? These questions defined the basis on which the present study was designed and the survey tools developed. While it was recognised that the Local Government Association of Queensland (LGAQ) and Emergency Management Queensland (EMQ) may have differing views on some reported issues, it was beyond the scope of the present study to canvass those views. The study resolved to document and analyse these questions under the broad themes of:

• building community capacity (notably via community awareness);
• Council operationalisation of DM;
• regional partnerships (in mitigation/adaptation).

Data were collected via a survey tool comprising two components:

• an online questionnaire survey distributed via the LGAQ Disaster Management Alliance (hereafter referred to as the “Alliance”) to the DM sections of all Queensland Local Government Councils; and
• a series of focus groups with selected Queensland Councils.

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a novel program parallelization technique incorporating dynamic and static scheduling. It utilizes a problem-specific pattern developed from prior knowledge of the targeted problem abstraction. Suitable for solving complex parallelization problems such as data-intensive all-to-all comparison constrained by memory, the technique delivers more robust and faster task scheduling than state-of-the-art techniques. The technique achieves good performance in data-intensive bioinformatics applications.
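
As a rough illustration of the scheduling problem (not the thesis's own algorithm, which the abstract does not detail), the sketch below statically decomposes an all-to-all comparison into pairwise tasks and dispatches them dynamically over a process pool; the comparison function and the chunk size are placeholders.

```python
# Sketch: all-to-all comparison combining a static decomposition into
# pairwise tasks with dynamic dispatch over a process pool.
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

def compare(pair):
    a, b = pair
    return (a, b, abs(len(a) - len(b)))  # placeholder comparison metric

def all_to_all(items, workers=4, chunksize=64):
    pairs = combinations(items, 2)       # static: every unordered pair exactly once
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # dynamic: chunks of pairs go to whichever worker is free, balancing
        # load when individual comparisons take uneven time
        return list(pool.map(compare, pairs, chunksize=chunksize))

if __name__ == "__main__":
    for a, b, score in all_to_all(["ACGT", "ACGGT", "TTGCA"]):
        print(a, b, score)
```

A memory-constrained variant would additionally bound how many pairs are in flight at once, which is precisely the kind of constraint the thesis targets.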

Relevance:

100.00%

Publisher:

Abstract:

Policy makers, natural resource managers, regulators, and the public often call on scientists to estimate the potential ecological changes caused by both natural and human-induced stresses, and to determine how those changes will impact people and the environment. To develop accurate forecasts of ecological changes we need to: 1) increase understanding of ecosystem composition, structure, and functioning, 2) expand ecosystem monitoring and apply advanced scientific information to make these complex data widely available, and 3) develop and improve forecast and interpretative tools that use a scientific basis to assess the results of management and science policy actions. (PDF contains 120 pages)

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: The characterization of urinary calculi using noninvasive methods has the potential to affect clinical management. CT remains the gold standard for diagnosis of urinary calculi but has not reliably differentiated varying stone compositions. Dual-energy CT (DECT) has emerged as a technology to improve CT characterization of anatomic structures. This study aims to assess the ability of DECT to accurately discriminate between different types of urinary calculi in an in vitro model using novel postimage-acquisition data processing techniques. METHODS: Fifty urinary calculi were assessed, of which 44 had ≥60% composition of one component. DECT was performed utilizing 64-slice multidetector CT. The attenuation profiles of the lower-energy (DECT-Low) and higher-energy (DECT-High) datasets were used to investigate whether differences could be seen between stone compositions. RESULTS: Postimage-acquisition processing allowed for identification of the main chemical compositions of urinary calculi: brushite, calcium oxalate-calcium phosphate, struvite, cystine, and uric acid. Statistical analysis demonstrated that this processing identified all stone compositions without obvious graphical overlap. CONCLUSION: Dual-energy multidetector CT with postprocessing techniques allows for accurate discrimination among the main subtypes of urinary calculi in an in vitro model. The ability to better detect stone composition may have implications in determining the optimum clinical treatment modality for urinary calculi from noninvasive, preprocedure radiological assessment.
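
By way of illustration only: dual-energy discrimination is commonly based on the ratio of low- to high-energy attenuation. The sketch below classifies a stone from such a ratio; the threshold values are invented placeholders, not values derived in this study.

```python
# Sketch: classify a stone from mean attenuation (in HU) measured on the
# low- and high-energy DECT datasets. Thresholds are illustrative
# placeholders, not results from the study.
def classify_stone(hu_low, hu_high):
    ratio = hu_low / hu_high                 # dual-energy ratio
    if ratio < 1.1:
        return "uric acid"                   # low effective atomic number
    if ratio < 1.35:
        return "cystine or struvite"
    return "calcium-based (brushite, oxalate, phosphate)"

print(classify_stone(hu_low=450.0, hu_high=430.0))   # -> "uric acid"
```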

Relevance:

100.00%

Publisher:

Abstract:

The PAMELA (Phased Array Monitoring for Enhanced Life Assessment) SHM™ system is an integrated, embedded, ultrasonic guided-wave based system consisting of several electronic devices and one system manager controller. The data collected by all PAMELA devices in the system must be transmitted to the controller, which is responsible for carrying out the advanced signal processing to obtain SHM maps. PAMELA devices consist of hardware based on a Virtex 5 FPGA with a PowerPC 440 running an embedded Linux distribution. Therefore, PAMELA devices, in addition to being able to perform tests and transmit the collected data to the controller, can perform local data processing or pre-processing (reduction, normalization, pattern recognition, feature extraction, etc.). Local data processing decreases the data traffic over the network and reduces the CPU load of the external computer. PAMELA devices can even run autonomously, performing scheduled tests and communicating with the controller only when structural damage is detected or when programmed to do so. Each PAMELA device integrates a software management application (SMA) that allows the developer to download his or her own algorithm code and add the new data processing algorithm to the device. Development for the SMA is done in a virtual machine with an Ubuntu Linux distribution that includes all software tools necessary to perform the entire development cycle. The Eclipse IDE (Integrated Development Environment) is used to develop the SMA project and to write the code of each data processing algorithm. This paper presents the developed software architecture and describes the steps necessary to add new data processing algorithms to the SMA in order to increase the processing capabilities of PAMELA devices. An example of basic damage index estimation using the delay-and-sum algorithm is provided.
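
The paper's worked example is delay-and-sum damage index estimation. A minimal sketch of that technique is shown below, assuming baseline-subtracted traces and a known group velocity; the array geometry and sampling parameters are placeholders, not PAMELA specifics.

```python
# Sketch: basic delay-and-sum damage imaging for guided waves. For every
# image pixel, each actuator-sensor pair's residual trace is sampled at
# the pixel's time of flight and the samples are summed.
import numpy as np

def delay_and_sum(pixels, pairs, traces, fs, velocity):
    """pixels: (P, 2) pixel coordinates in metres.
    pairs:  list of (actuator_xy, sensor_xy) tuples.
    traces: (len(pairs), N) baseline-subtracted signals.
    fs:     sampling rate in Hz; velocity: group velocity in m/s."""
    image = np.zeros(len(pixels))
    for (act, sen), trace in zip(pairs, traces):
        # travel distance actuator -> pixel -> sensor, for every pixel at once
        dist = (np.linalg.norm(pixels - np.asarray(act), axis=1)
                + np.linalg.norm(pixels - np.asarray(sen), axis=1))
        idx = np.clip((dist / velocity * fs).astype(int), 0, trace.shape[0] - 1)
        image += np.abs(trace[idx])          # sum each pair's contribution
    return image / len(pairs)                # damage index per pixel
```

Running such a kernel on the device rather than the controller is exactly the kind of local processing the SMA mechanism is designed to deploy.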

Relevance:

100.00%

Publisher:

Abstract:

"Prepared by: Staff Development Unit, Administrative Management Section, Management Coordination Branch, Divison of Accounting Operations."

Relevance:

100.00%

Publisher:

Abstract:

Cover title.