7 results for wide-area surveillance

in University of Queensland eSpace - Australia


Relevance:

100.00%

Abstract:

Arguably, the world has become one large pervasive computing environment. Our planet is growing a digital skin of a wide array of sensors, hand-held computers, mobile phones, laptops, web services and publicly accessible web-cams. Often, these devices and services are deployed in groups, forming small communities of interacting devices. Service discovery protocols allow processes executing on each device to discover services offered by other devices within the community. These communities can be linked together to form a wide-area pervasive environment, allowing processes in one group to interact with services in another. However, the costs of communication and the protocols by which this communication is mediated in the wide-area differ from those of intra-group, or local-area, communication. Communication is an expensive operation for small, battery-powered devices, but it is less expensive for servers and workstations, which have a constant power supply and are connected to high-bandwidth networks. This paper introduces Superstring, a peer-to-peer service discovery protocol optimised for use in the wide-area. Its goals are to minimise computation and memory overhead in the face of large numbers of resources. It achieves this memory and computation scalability by distributing the storage cost of service descriptions and the computation cost of queries over multiple resolvers.
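A minimal sketch of that closing idea, partitioning service descriptions (and the query work for them) across several resolvers by hashing the service name. The class and the hash-based assignment are illustrative assumptions, not the actual Superstring protocol.

    import hashlib
    from collections import defaultdict

    class ResolverRing:
        """Partition service descriptions over resolvers by hashing the
        service name, so each resolver stores, and answers queries for,
        only a fraction of all descriptions."""

        def __init__(self, resolvers):
            self.resolvers = sorted(resolvers)
            self.store = defaultdict(dict)   # resolver id -> {name: description}

        def _owner(self, name):
            # Deterministically map a service description to one resolver,
            # so storage and query load are partitioned, not replicated.
            h = int(hashlib.sha1(name.encode()).hexdigest(), 16)
            return self.resolvers[h % len(self.resolvers)]

        def register(self, name, description):
            self.store[self._owner(name)][name] = description

        def lookup(self, name):
            # Only the owning resolver does any work for this query.
            return self.store[self._owner(name)].get(name)

    ring = ResolverRing(["resolver-a", "resolver-b", "resolver-c"])
    ring.register("printer/floor2", {"proto": "ipp", "host": "10.0.2.7"})
    print(ring.lookup("printer/floor2"))   # {'proto': 'ipp', 'host': '10.0.2.7'}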

Relevance:

80.00%

Abstract:

Assessment of the extent of coral bleaching has become an important part of studies that aim to understand the condition of coral reefs. In this study a reference card that uses differences in coral colour was developed as an inexpensive, rapid and non-invasive method for the assessment of bleaching. The card uses a 6-point brightness/saturation scale within four colour hues to record changes in bleaching state. Changes on the scale of 2 units or more reflect a change in symbiont density and chlorophyll a content, and therefore the bleaching state of the coral. When used by non-specialist observers in the field (here on an intertidal reef flat), there was an inter-observer error of 1 colour score. This technique improves on existing subjective assessment of bleaching state by visual observation and offers the potential for rapid, wide-area assessment of changing coral condition.
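A toy rendering of the decision rule stated in the abstract (a 2-unit change on the 6-point scale signals a change in bleaching state, against roughly 1 unit of inter-observer error); the function name is hypothetical.

    def bleaching_state_changed(score_before: int, score_after: int) -> bool:
        """Scores are readings on the card's 6-point brightness/saturation
        scale; a shift of 2 or more units reflects a real change in
        symbiont density and chlorophyll a content."""
        return abs(score_after - score_before) >= 2

    assert bleaching_state_changed(5, 2)       # clear bleaching signal
    assert not bleaching_state_changed(4, 3)   # within ~1 unit observer error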

Relevance:

80.00%

Abstract:

Retrieving large amounts of information over wide area networks, including the Internet, is problematic due to issues arising from latency of response, lack of direct memory access to data serving resources, and fault tolerance. This paper describes a design pattern for solving the issues of handling results from queries that return large amounts of data. Typically these queries would be made by a client process across a wide area network (or Internet), with one or more middle-tiers, to a relational database residing on a remote server. The solution involves implementing a combination of data retrieval strategies, including the use of iterators for traversing data sets and providing an appropriate level of abstraction to the client, double-buffering of data subsets, multi-threaded data retrieval, and query slicing. This design has recently been implemented and incorporated into the framework of a commercial software product developed at Oracle Corporation.
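A sketch of how the named strategies might fit together, assuming a Python DB-API style connection: the client sees a plain iterator, a bounded queue provides the double-buffering, a background thread does the multi-threaded retrieval, and cursor.fetchmany() stands in for query slicing. This illustrates the pattern, not Oracle's implementation.

    import threading
    from queue import Queue

    def sliced_results(conn, query, slice_size=1000, prefetch=2):
        """Yield rows one at a time while a background thread fetches
        upcoming slices, overlapping network latency with processing."""
        buffers = Queue(maxsize=prefetch)   # at most `prefetch` slices in flight

        def producer():
            cur = conn.cursor()
            cur.execute(query)
            while True:
                rows = cur.fetchmany(slice_size)   # one "slice" of the result set
                buffers.put(rows)
                if not rows:                       # empty list signals exhaustion
                    break

        threading.Thread(target=producer, daemon=True).start()
        while True:
            rows = buffers.get()
            if not rows:
                return
            yield from rows

Because the queue is bounded, the producer fills the next slice while the consumer drains the current one, and naturally pauses when the client falls behind.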

Relevance:

30.00%

Abstract:

Background: Reliable information on causes of death is a fundamental component of health development strategies, yet globally only about one-third of countries have access to such information. For countries currently without adequate mortality reporting systems there are useful models other than resource-intensive population-wide medical certification. Sample-based mortality surveillance is one such approach. This paper provides methods for addressing appropriate sample size considerations in relation to mortality surveillance, with particular reference to situations in which prior information on mortality is lacking.

Methods: The feasibility of model-based approaches for predicting the expected mortality structure and cause composition is demonstrated for populations in which only limited empirical data are available. An algorithmic approach is then provided to derive the minimum person-years of observation needed to generate robust estimates for the rarest cause of interest in three hypothetical populations, each representing a different level of health development.

Results: Modelled life expectancies at birth and cause of death structures were within expected ranges based on published estimates for countries at comparable levels of health development. Total person-years of observation required in each population could be more than halved by limiting the set of age, sex, and cause groups regarded as 'of interest'.

Discussion: The methods proposed are consistent with the philosophy of establishing priorities across broad clusters of causes for which the public health response implications are similar. The examples provided illustrate the options available when considering the design of mortality surveillance for population health monitoring purposes.
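A worked example of the sample-size arithmetic that such a design rests on, assuming deaths from the rarest cause of interest arrive as a Poisson process; the rate and precision target below are illustrative, not the paper's figures.

    def min_person_years(rate_per_1000, target_rse=0.2):
        """Person-years of observation needed so the estimated death rate
        for a cause has relative standard error <= target_rse. For a
        Poisson count d, RSE = 1/sqrt(d), so we need d >= 1/target_rse**2
        expected deaths, and person-years T = d / rate."""
        needed_deaths = 1.0 / target_rse ** 2
        return needed_deaths / (rate_per_1000 / 1000.0)

    # A cause killing 0.05 per 1000 person-years, estimated to within 20%
    # relative standard error, needs 25 deaths -> 500,000 person-years.
    print(f"{min_person_years(0.05):,.0f}")

Because the rarest cause drives the total, narrowing the set of causes regarded as 'of interest' shrinks the required person-years sharply, which is the effect reported in the Results.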

Relevance:

30.00%

Abstract:

In this paper we apply a new method for the determination of the surface area of carbonaceous materials, using the local surface excess isotherms obtained from Grand Canonical Monte Carlo (GCMC) simulation and a concept of area distribution in terms of the energy well-depth of the solid–fluid interaction. The range of well-depth considered in our GCMC simulation is from 10 to 100 K, which is wide enough to cover all the carbon surfaces we dealt with (for comparison, the well-depth for a perfect graphite surface is about 58 K). Having the set of local surface excess isotherms and the differential area distribution, the overall adsorption isotherm can be obtained in an integral form. Thus, given experimental data for nitrogen or argon adsorption on a carbon material, the differential area distribution can be obtained by an inversion process, using the regularization method. The total surface area is then obtained as the area under this distribution. We test this approach with a number of data sets from the literature, and compare our GCMC surface area with that obtained from the classical BET method. In general, we find that the two surface areas differ by about 10%, underlining the need for a consistent and reliable method of determining surface area. We therefore suggest the approach of this paper as an alternative to the BET method, because of the long-recognized unrealistic assumptions used in the BET theory. Besides the surface area, the method also provides the differential area distribution versus well-depth. This distribution can serve as a microscopic fingerprint of the carbon surface: samples prepared from different precursors and under different activation conditions are expected to have distinct fingerprints. We illustrate this with Cabot BP120, 280 and 460 samples; the differential area distributions obtained from adsorption of argon at 77 K and of nitrogen at 77 K have exactly the same patterns, suggesting that the distribution is indeed a characteristic of the carbon itself.
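A minimal sketch of the inversion step, assuming the GCMC local surface excess isotherms have already been tabulated into a kernel matrix; the regularised non-negative least-squares below is one standard way to realise the regularization method the abstract mentions. Names, shapes and the regularisation scheme are illustrative, not the paper's implementation.

    import numpy as np
    from scipy.optimize import nnls

    def area_distribution(K, gamma_exp, lam=1e-2):
        """Recover the differential area distribution f from measured
        excess adsorption gamma_exp, where K[i, j] is the local excess
        isotherm at pressure point i for well-depth bin j.

        Solves min ||K f - gamma_exp||^2 + lam^2 ||f||^2 subject to
        f >= 0 by stacking a Tikhonov block under the kernel."""
        n_bins = K.shape[1]
        A = np.vstack([K, lam * np.eye(n_bins)])
        b = np.concatenate([gamma_exp, np.zeros(n_bins)])
        f, _residual = nnls(A, b)
        return f

    # The total surface area is then the area under the distribution on
    # the well-depth grid eps (10-100 K in the paper): S = np.trapz(f, eps).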

Relevance:

30.00%

Abstract:

Pesticides and herbicides including organochlorine compounds have had extensive current and past application by Queensland's intensive coastal agriculture industry, as well as for a wide range of domestic, public health and agricultural purposes in urban areas. The persistent nature of these types of compounds, together with possible continued illegal use of banned organochlorine compounds, raises the potential for continued long-term chronic exposure to plants and animals of the Great Barrier Reef. Sediment and seagrass samples were collected from 16 intertidal and 25 subtidal sampling sites between Torres Strait and Townsville, near Mackay and Gladstone, and in Hervey and Moreton Bays in 1997 and 1998 and analysed for pesticide and herbicide residues. Low levels of atrazine (0.1-0.3 µg kg⁻¹), diuron (0.2-10.1 µg kg⁻¹), lindane (0.08-0.19 µg kg⁻¹), dieldrin (0.05-0.37 µg kg⁻¹), DDT (0.05-0.26 µg kg⁻¹), and DDE (0.05-0.26 µg kg⁻¹) were detected in sediments and/or seagrasses. Contaminants were mainly detected in samples collected along the high-rainfall, tropical coast between Townsville and Port Douglas and in Moreton Bay. Of the contaminants detected, the herbicide diuron is of most concern, as the concentrations detected have some potential to impact local seagrass communities. (C) 2000 Elsevier Science Ltd. All rights reserved.

Relevance:

30.00%

Abstract:

Many queries sent to search engines refer to specific locations in the world. Location-based queries try to find local services and facilities around the user's environment or in a particular area. This paper reviews the specifications of geospatial queries and discusses the similarities and differences between location-based queries and other queries. We introduce nine patterns for location-based queries containing either a service name alone or a service name accompanied by a location name. Our survey indicates that at least 22% of Web queries have a geospatial dimension, and most of these can be considered location-based queries. We propose that location-based queries should be treated differently from general queries to produce more relevant results.
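An illustrative classifier for the two broad query shapes the abstract distinguishes (a service name alone, or a service name with a location name); the regular expression and the toy gazetteer are assumptions, and the paper's nine patterns are finer-grained than this.

    import re

    LOCATIONS = {"brisbane", "toowoomba", "queensland"}   # toy gazetteer

    LOCATION_PATTERN = re.compile(
        r"^(?P<service>.+?)\s+(?:in|near|at)\s+(?P<location>.+)$")

    def classify(query):
        """Return (pattern, service, location) for a query string."""
        q = query.lower().strip()
        m = LOCATION_PATTERN.match(q)
        if m and m.group("location") in LOCATIONS:
            return "service+location", m.group("service"), m.group("location")
        return "service-only", q, None

    print(classify("dentist near Brisbane"))   # ('service+location', 'dentist', 'brisbane')
    print(classify("24 hour pharmacy"))        # ('service-only', '24 hour pharmacy', None)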