5 results for decentralised data fusion framework

in DigitalCommons@University of Nebraska - Lincoln


Relevance:

100.00%

Publisher:

Abstract:

Computer and telecommunication networks are changing the world dramatically and will continue to do so for the foreseeable future. The Internet, primarily based on packet switches, provides very flexible data services such as e-mail and access to the World Wide Web. The Internet is a variable-delay, variable-bandwidth network that provided no guarantee on quality of service (QoS) in its initial phase. New services are being added to the pure data-delivery framework of yesterday. Such high demands on capacity could lead to a “bandwidth crunch” at the core wide-area network, resulting in degradation of service quality. Fortunately, technological innovations have emerged that can help the end user overcome the Internet’s well-known delay and bandwidth limitations. At the physical layer, a major overhaul of existing networks has been envisaged: a move from electronic media (e.g., twisted pair and cable) to optical fiber in wide-area, metropolitan-area, and even local-area settings. To exploit the immense bandwidth potential of optical fiber, a number of multiplexing techniques have been developed over the years.
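The capacity argument above can be made concrete with a back-of-the-envelope wavelength-division multiplexing (WDM) calculation; the channel count and per-wavelength rate below are illustrative assumptions, not figures from the abstract:

```python
# Back-of-the-envelope WDM capacity sketch. The channel count and
# per-wavelength line rate are illustrative assumptions only.
channels = 80                  # hypothetical number of DWDM wavelengths on one fiber
rate_per_channel_gbps = 10.0   # hypothetical line rate per wavelength (Gb/s)

aggregate_gbps = channels * rate_per_channel_gbps
print(f"Aggregate capacity of one fiber: {aggregate_gbps:.0f} Gb/s")  # 800 Gb/s
```

Multiplying many wavelengths per fiber by a per-wavelength electronic line rate is what lets a single fiber sidestep the "bandwidth crunch" at the core.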

Relevance:

40.00%

Publisher:

Abstract:

We propose a general framework for the analysis of animal telemetry data through the use of weighted distributions. It is shown that several interpretations of resource selection functions arise when constructed from the ratio of a use distribution and an availability distribution. Through the proposed general framework, several popular resource selection models are shown to be special cases of the general model under assumptions about animal movement and behavior. The weighted distribution framework is easily extended to account for telemetry data that are highly autocorrelated, as is typical of animal relocations collected with newer technologies such as global positioning systems. An analysis of simulated data using several models constructed within the proposed framework is also presented to illustrate the possible gains from the flexible modeling framework. The proposed model is applied to a brown bear data set from southeast Alaska.
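The weighted-distribution construction (a use distribution proportional to a selection function times an availability distribution) can be sketched in simulation. Everything below — the one-dimensional covariate, the exponential selection function, and the parameter values — is a hypothetical illustration, not the authors' model or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical availability sample of a single covariate (e.g. elevation).
available = rng.normal(0.0, 1.0, size=10_000)

# Simulate "use" by weighting availability with w(x) = exp(beta * x),
# i.e. the weighted-distribution construction; beta is an assumed value.
beta_true = 1.2
w = np.exp(beta_true * available)
use = rng.choice(available, size=5_000, p=w / w.sum())

# Estimate the selection function as the ratio of the use density to the
# availability density on a common grid.
bins = np.linspace(-3.0, 3.0, 25)
f_use, _ = np.histogram(use, bins=bins, density=True)
f_avail, _ = np.histogram(available, bins=bins, density=True)
ok = (f_use > 0) & (f_avail > 0)
log_rsf = np.log(f_use[ok] / f_avail[ok])

# Under exponential weighting, log w(x) is linear in x with slope beta,
# so a straight-line fit should roughly recover beta.
centers = 0.5 * (bins[:-1] + bins[1:])
slope = np.polyfit(centers[ok], log_rsf, 1)[0]
print(f"recovered selection coefficient ~ {slope:.2f}")
```

The point of the sketch is only the ratio construction itself: dividing the use density by the availability density recovers the selection function up to a constant.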

Relevance:

30.00%

Publisher:

Abstract:

Prior studies of phylogenetic relationships among phocoenids based on morphology and molecular sequence data conflict and yield unresolved relationships among species. This study evaluates a comprehensive set of cranial, postcranial, and soft anatomical characters to infer interrelationships among extant species and several well-known fossil phocoenids, using two different methods to analyze polymorphic data: polymorphic coding and a frequency step matrix. Our phylogenetic results confirmed phocoenid monophyly. The previously proposed division of Phocoenidae into two subfamilies was rejected, as was the alliance of the two extinct genera Salumiphocaena and Piscolithax with Phocoena dioptrica and Phocoenoides dalli. Extinct phocoenids are basal to all extant species. We also examined the origin and distribution of porpoises within the context of this phylogenetic framework. Phocoenid phylogeny, together with available geologic evidence, suggests that the early history of phocoenids was centered in the North Pacific during the middle Miocene, with subsequent dispersal into the southern hemisphere in the middle Pliocene. A cooling period in the Pleistocene allowed dispersal of the southern ancestor of Phocoena sinus into the North Pacific (Gulf of California).

Relevance:

30.00%

Publisher:

Abstract:

We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate the abundance (or density) of animals or plants in a spatially explicit study area, but there is no readily available method for making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework to assess the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and an analysis of the Dubbo weed data set, and also propose a simple ad hoc method for handling overdispersion. The simulation study showed that the model-based approach compared favorably with conventional distance sampling methods for abundance estimation, and the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike’s information criterion model selection; further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
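The thinning idea at the heart of this approach — animals form a point process, and detection thins it through a distance-dependent detection function — can be sketched in a minimal simulation. The half-normal detection function, truncation distance, scale, and abundance below are all assumed values for illustration, not the Dubbo data or the authors' exact likelihood:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated strip transect: animals uniform in perpendicular distance up to W.
W = 1.0            # truncation distance (assumed units)
sigma_true = 0.4   # half-normal detection scale (assumed)
n_animals = 500    # true abundance in the strip (assumed)

dist = rng.uniform(0.0, W, n_animals)
p_detect = np.exp(-dist**2 / (2.0 * sigma_true**2))   # half-normal g(d)
d_obs = dist[rng.uniform(size=n_animals) < p_detect]  # thinned observations

# The density of an observed distance is g(d) / integral_0^W g(u) du;
# profile the log-likelihood over sigma on a grid (crude, dependency-free MLE).
u = np.linspace(0.0, W, 1000)
def neg_loglik(sigma):
    mu = np.exp(-u**2 / (2.0 * sigma**2)).mean() * W  # Riemann approximation
    return (d_obs**2).sum() / (2.0 * sigma**2) + len(d_obs) * np.log(mu)

sigmas = np.linspace(0.05, 2.0, 400)
sigma_hat = sigmas[np.argmin([neg_loglik(s) for s in sigmas])]

# Average detection probability gives a Horvitz-Thompson-style abundance estimate.
p_bar = np.exp(-u**2 / (2.0 * sigma_hat**2)).mean()
N_hat = len(d_obs) / p_bar
print(f"sigma_hat={sigma_hat:.2f}, N_hat={N_hat:.0f} (true {n_animals})")
```

Replacing the uniform animal locations with a covariate-dependent Poisson intensity is what turns this sketch into the full model-based framework, where detection and intensity parameters are estimated jointly.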

Relevance:

30.00%

Publisher:

Abstract:

Rapidly accumulating Holocene sediments in estuaries are commonly difficult to sample and date. In Chesapeake Bay, we obtained sediment cores as much as 20 m in length and used numerous radiocarbon ages measured by accelerator mass spectrometry to provide the first detailed chronologies of Holocene sediment accumulation in the bay. Carbon in these sediments is a complex mixture of materials from a variety of sources. Analyses of different components of the sediments show that total organic carbon ages are largely unreliable, because much of the carbon (including coal) has been transported to the bay from upstream sources and is older than the sediments in which it was deposited. Mollusk shells (clams, oysters) and foraminifera appear to give reliable results, although reworking and burrowing are potential problems. Analyses of museum specimens collected alive before atmospheric nuclear testing suggest that the standard reservoir correction for marine samples is appropriate for the middle to lower Chesapeake Bay. The biogenic carbonate radiocarbon ages are compatible with 210Pb and 137Cs data and pollen stratigraphy from the same sites. Post-settlement changes in sediment transport and accumulation are an important environmental issue in many estuaries, including the Chesapeake. Our data show that large variations in sediment mass accumulation rates occur among sites. At shallow-water sites, local factors seem to control changes in accumulation rates over time. Our two relatively deep-water sites in the axial channel of the bay have different long-term average accumulation rates, but the history of sediment accumulation at these sites appears to reflect overall conditions in the bay. Mass accumulation rates at the two deep-water sites increased rapidly, by about fourfold, coincident with widespread land clearance for agriculture in the Chesapeake watershed.