11 results for path sampling

at DigitalCommons@University of Nebraska - Lincoln


Relevance: 20.00%

Abstract:

This paper addresses the problem of survivable lightpath provisioning in wavelength-division-multiplexing (WDM) mesh networks, taking into consideration optical-layer protection and realistic optical signal quality constraints. The investigated networks use sparsely placed optical–electrical–optical (O/E/O) modules for regeneration and wavelength conversion. Given a fixed network topology with a number of sparsely placed O/E/O modules and a set of connection requests, a pair of link-disjoint lightpaths is established for each connection. Due to physical impairments and wavelength continuity, both the working and protection lightpaths need to be regenerated at some intermediate nodes to overcome signal quality degradation and wavelength contention. Resource-efficient provisioning solutions are achieved with the objective of maximizing resource sharing. The authors propose a resource-sharing scheme that supports three kinds of resource-sharing scenarios: a conventional wavelength-link sharing scenario, which shares wavelength links between protection lightpaths, and two new scenarios, which share O/E/O modules between protection lightpaths and between working and protection lightpaths. An integer linear programming (ILP)-based solution approach is used to find optimal solutions. The authors also propose a local optimization heuristic approach and a tabu search heuristic approach to solve this problem for real-world, large mesh networks. Numerical results show that the proposed solution approaches work well under a variety of network settings and achieve high resource-sharing rates (over 60% for O/E/O modules and over 30% for wavelength links), which translate into substantial savings in network cost.
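
As a rough illustration of the basic survivability requirement described above (a pair of link-disjoint working and protection lightpaths per connection), the sketch below uses a simple two-step shortest-path heuristic on the physical topology. It is not the paper's ILP or tabu-search method; the networkx-based helper and the example topology are assumptions for illustration, and removing the working path's links before the second search can fail or give suboptimal pairs where an algorithm such as Suurballe's would not.

    # Minimal sketch (not the paper's ILP/tabu-search approach): a two-step
    # heuristic for a pair of link-disjoint working and protection lightpaths.
    import networkx as nx

    def link_disjoint_pair(g, src, dst):
        """Return (working, protection) paths sharing no links, or None."""
        working = nx.shortest_path(g, src, dst, weight="weight")
        reduced = g.copy()
        # Remove the working path's links, then search for the protection path.
        reduced.remove_edges_from(zip(working, working[1:]))
        try:
            protection = nx.shortest_path(reduced, src, dst, weight="weight")
        except nx.NetworkXNoPath:
            return None   # heuristic failure; Suurballe's algorithm avoids this
        return working, protection

    # Hypothetical mesh topology, for illustration only.
    g = nx.Graph()
    g.add_weighted_edges_from([("A", "B", 1), ("B", "C", 1), ("A", "D", 2),
                               ("D", "C", 2), ("B", "D", 1)])
    print(link_disjoint_pair(g, "A", "C"))   # (['A', 'B', 'C'], ['A', 'D', 'C'])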

Relevance: 20.00%

Abstract:

Survivable traffic grooming (STG) is a promising approach to provide reliable and resource-efficient multigranularity connection services in wavelength-division-multiplexing (WDM) optical networks. In this paper, we study the STG problem in WDM mesh optical networks employing path protection at the connection level. Both dedicated-protection and shared-protection schemes are considered. Given network resources, the objective of the STG problem is to maximize network throughput. To enable survivability under various kinds of single failures, such as fiber cut and duct cut, we consider the general shared-risk-link-group (SRLG) diverse routing constraints. We first resort to the integer-linear-programming (ILP) approach to obtain optimal solutions. To address its high computational complexity, we then propose three efficient heuristics, namely separated survivable grooming algorithm (SSGA), integrated survivable grooming algorithm (ISGA), and tabu-search survivable grooming algorithm (TSGA). While SSGA and ISGA correspond to an overlay network model and a peer network model, respectively, TSGA further improves the grooming results from SSGA and ISGA by incorporating the effective tabu-search (TS) method. Numerical results show that the heuristics achieve comparable solutions to the ILP approach, which uses significantly longer running times than the heuristics.
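
To make the SRLG-diverse routing constraint concrete: a working/protection path pair is admissible only if no shared-risk link group (for example, a common duct) contains links of both paths. The sketch below is a minimal illustration of that check; the srlg_of mapping and the example links are assumptions, not data from the paper.

    # Minimal sketch of the SRLG-diversity check used in survivable grooming:
    # reject a path pair if any SRLG (e.g. a shared duct) touches both paths.
    def srlg_diverse(working_links, protection_links, srlg_of):
        """True if the two link sets share no shared-risk link group."""
        working_groups = set()
        for link in working_links:
            working_groups |= srlg_of.get(link, set())
        return all(srlg_of.get(link, set()).isdisjoint(working_groups)
                   for link in protection_links)

    # Hypothetical example: links 1 and 3 run through the same duct ("d1"),
    # so a pair using both is rejected even though the links are distinct.
    srlg_of = {1: {"d1"}, 2: {"d2"}, 3: {"d1"}, 4: set()}
    print(srlg_diverse({1, 2}, {3, 4}, srlg_of))   # False
    print(srlg_diverse({1, 2}, {4}, srlg_of))      # True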

Relevance: 20.00%

Abstract:

This paper considers the problem of dedicated path protection in wavelength-division-multiplexed (WDM) mesh networks with waveband switching functionality under shared risk link group (SRLG) constraints. Two dedicated path protection schemes are proposed, namely the PBABL scheme and the MPABWL scheme. The PBABL scheme protects each working waveband-path with a backup waveband-path. The MPABWL scheme protects each working waveband-path with either a backup waveband-path or multiple backup lightpaths. Heuristic algorithms adopting a random optimization technique are proposed for both schemes. The performance of the two protection schemes is studied and compared. Simulation results show that both heuristics can obtain optimal solutions and that the MPABWL scheme incurs lower switching and transmission costs than the PBABL scheme.

Relevance: 20.00%

Abstract:

Wavelength-routed networks (WRN) are very promising candidates for next-generation Internet and telecommunication backbones. In such a network, optical-layer protection is of paramount importance due to the risk of losing large amounts of data under a failure. To protect the network against this risk, service providers usually provide a pair of risk-independent working and protection paths for each optical connection. However, the investment made for optical-layer protection increases network cost. To reduce the capital expenditure, service providers need to utilize their network resources efficiently. Among the existing approaches, shared-path protection has proven to be practical and cost-efficient [1]. In shared-path protection, several protection paths can share a wavelength on a fiber link if their working paths are risk-independent. In real-world networks, provisioning is usually implemented without knowledge of future network resource utilization. As the network changes with the addition and deletion of connections, network utilization becomes sub-optimal. Reconfiguration, that is, re-provisioning the existing connections, is an attractive solution for closing the gap between the current network utilization and its optimal value [2]. In this paper, we propose a new shared-protection-path reconfiguration approach. Unlike some previous reconfiguration approaches that alter the working paths, our approach changes only the protection paths; it therefore does not interfere with the ongoing services on the working paths and is risk-free. Previous studies have verified the benefits arising from the reconfiguration of existing connections [2] [3] [4]. Most of them aim at minimizing the total number of used wavelength-links or ports. However, this objective does not directly relate to cost saving, because minimizing total network resource consumption does not necessarily maximize the capability to accommodate future connections. As a result, service providers may still need to pay for early network upgrades. Our proposed shared-protection-path reconfiguration approach is instead based on a load-balancing objective, which minimizes the network load distribution vector (LDV, see Section 2). This new objective is designed to postpone network upgrades, thus bringing extra cost savings to service providers. In other words, by using the new objective, service providers can establish as many connections as possible before network upgrades, resulting in increased revenue. We develop a heuristic load-balancing (LB) reconfiguration approach based on this new objective and compare its performance with an approach previously introduced in [2] and [4], whose objective is minimizing the total network resource consumption.
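
The LDV itself is defined in the paper's Section 2 and is not reproduced here. Purely as an illustration of the load-balancing idea, the sketch below assumes a common formulation: the LDV is the vector of per-link loads sorted in descending order, and a candidate protection-path reconfiguration is preferred when its LDV is lexicographically smaller, i.e. the most heavily loaded links are relieved first. The link names and load values are hypothetical.

    # Illustrative only: assumes the LDV is the descending-sorted vector of
    # per-link loads, compared lexicographically (smaller = better balanced).
    def load_distribution_vector(link_loads):
        return sorted(link_loads.values(), reverse=True)

    def better_balanced(loads_after, loads_before):
        return load_distribution_vector(loads_after) < load_distribution_vector(loads_before)

    before = {"A-B": 7, "B-C": 3, "A-D": 2}
    after  = {"A-B": 5, "B-C": 4, "A-D": 3}   # protection paths rerouted off A-B
    print(better_balanced(after, before))      # True: [5, 4, 3] < [7, 3, 2]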

Relevance: 20.00%

Abstract:

Survivable traffic grooming (STG) is a promising approach to provide reliable and resource-efficient multigranularity connection services in wavelength division multiplexing (WDM) optical networks. In this paper, we study the STG problem in WDM mesh optical networks employing path protection at the connection level. Both dedicated protection and shared protection schemes are considered. Given the network resources, the objective of the STG problem is to maximize network throughput. To enable survivability under various kinds of single failures such as fiber cut and duct cut, we consider the general shared risk link group (SRLG) diverse routing constraints. We first resort to the integer linear programming (ILP) approach to obtain optimal solutions. To address its high computational complexity, we then propose three efficient heuristics, namely separated survivable grooming algorithm (SSGA), integrated survivable grooming algorithm (ISGA), and tabu search survivable grooming algorithm (TSGA). While SSGA and ISGA correspond to an overlay network model and a peer network model, respectively, TSGA further improves the grooming results from SSGA and ISGA by incorporating the effective tabu search method. Numerical results show that the heuristics achieve comparable solutions to the ILP approach, which uses significantly longer running times than the heuristics.

Relevance: 20.00%

Abstract:

We propose a resource-sharing scheme that supports three kinds of sharing scenarios in a WDM mesh network with path-based protection and sparse OEO regeneration. Several approaches are used to maximize the sharing of wavelength-links and OEO regenerators.

Relevance: 20.00%

Abstract:

Killer whale (Orcinus orca Linnaeus, 1758) abundance in the North Pacific is known only for a few populations for which extensive longitudinal data are available, with little quantitative data from more remote regions. Line-transect ship surveys were conducted in July and August of 2001–2003 in coastal waters of the western Gulf of Alaska and the Aleutian Islands. Conventional and Multiple Covariate Distance Sampling methods were used to estimate the abundance of different killer whale ecotypes, which were distinguished based upon morphological and genetic data. Abundance was calculated separately for two data sets that differed in the method by which killer whale group size data were obtained. Initial group size (IGS) data corresponded to estimates of group size at the time of first sighting, and post-encounter group size (PEGS) corresponded to estimates made after closely approaching sighted groups.
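
For readers unfamiliar with the method, the arithmetic behind a conventional line-transect estimate can be sketched as follows; this is the generic estimator, not the survey's multiple-covariate models, and all numbers are hypothetical. With a half-normal detection function g(x) = exp(-x^2 / (2*sigma^2)), the effective strip half-width is mu = sigma * sqrt(pi/2), and density is estimated as n * E[s] / (2 * L * mu), where n is the number of detected groups, E[s] the mean group size (IGS or PEGS), and L the total transect length.

    # Sketch of the conventional line-transect density estimator (illustrative;
    # not the survey's data or its multiple-covariate models).
    import math

    def density_estimate(n_groups, mean_group_size, transect_length_km, sigma_km):
        mu = sigma_km * math.sqrt(math.pi / 2.0)   # effective strip half-width
        return n_groups * mean_group_size / (2.0 * transect_length_km * mu)

    # Hypothetical inputs, to show the arithmetic only (animals per km^2).
    print(density_estimate(n_groups=40, mean_group_size=5.2,
                           transect_length_km=3000.0, sigma_km=2.0))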

Relevance: 20.00%

Abstract:

Classical sampling methods can be used to estimate the mean of a finite or infinite population. Block kriging also estimates the mean, but of an infinite population in a continuous spatial domain. In this paper, I consider a finite population version of block kriging (FPBK) for plot-based sampling. The data are assumed to come from a spatial stochastic process. Minimizing mean-squared-prediction errors yields best linear unbiased predictions that are a finite population version of block kriging. FPBK has versions comparable to simple random sampling and stratified sampling, and includes the general linear model. This method has been tested for several years for moose surveys in Alaska, and an example is given where results are compared to stratified random sampling. In general, assuming a spatial model gives three main advantages over classical sampling: (1) FPBK is usually more precise than simple or stratified random sampling, (2) FPBK allows small area estimation, and (3) FPBK allows nonrandom sampling designs.
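
A compact way to state the finite-population version is the following sketch of the standard set-up (generic notation, not necessarily the paper's). Writing the population total as the prediction target and the predictor as a linear combination of the sampled values z_s, FPBK chooses the weight vector lambda to minimize the mean-squared prediction error subject to unbiasedness under the assumed spatial linear model:

    \tau = \sum_{i=1}^{N} z(s_i), \qquad
    \hat{\tau} = \lambda' z_s, \qquad
    \min_{\lambda}\; E\!\left(\lambda' z_s - \tau\right)^{2}
    \quad \text{subject to} \quad E(\lambda' z_s) = E(\tau),

where z = X\beta + \varepsilon with \operatorname{Cov}(\varepsilon) = \Sigma supplies the required expectations and covariances; the solution is a best linear unbiased predictor, the finite-population analogue of the block kriging predictor.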

Relevance: 20.00%

Abstract:

1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance.
2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use.
3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated.
4. A first step in analysis of distance sampling data is modeling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance (a minimal sketch of this step follows this abstract).
5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap.
6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat modeling, and information about accessing the analysis engines directly from other software.
7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, state-of-the-art software implementing these methods is described, making the methods accessible to practicing ecologists.
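
As a minimal sketch of what the conventional-distance-sampling step does at its core (this is not the Distance software or its interface), one can fit a half-normal detection function g(x) = exp(-x^2 / (2*sigma^2)), with certain detection at zero distance, to perpendicular detection distances by maximum likelihood with truncation at distance w. The data and truncation distance below are hypothetical.

    # Sketch of a half-normal detection-function fit by maximum likelihood
    # (illustrative; not the Distance software).  g(0) = 1 is assumed.
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import norm

    def fit_half_normal(distances, w):
        """MLE of sigma for a half-normal detection function truncated at w."""
        x = np.asarray(distances, dtype=float)

        def neg_log_lik(log_sigma):
            sigma = np.exp(log_sigma)
            # f(x) = g(x) / int_0^w g(u) du, and the integral equals
            # sigma * sqrt(2*pi) * (Phi(w/sigma) - 0.5) for the half-normal.
            area = sigma * np.sqrt(2 * np.pi) * (norm.cdf(w / sigma) - 0.5)
            return -np.sum(-x**2 / (2 * sigma**2) - np.log(area))

        res = minimize_scalar(neg_log_lik)
        return float(np.exp(res.x))

    # Hypothetical perpendicular distances (same units as w).
    print(fit_half_normal([5, 12, 3, 22, 8, 15, 1, 30, 7, 11], w=50))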

Relevance: 20.00%

Abstract:

We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method of making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework to assess the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and an analysis of the Dubbo weed data set. A simple ad hoc method for handling overdispersion is also proposed. The simulation study showed that the model-based approach compared favorably to conventional distance sampling methods for abundance estimation. In addition, the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike's information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
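
The thinned-point-process likelihood the abstract refers to can be sketched in a standard form (the paper's exact parameterization may differ). With intensity \lambda(s; \beta), typically log-linear in environmental covariates, and detection probability g(d(s); \theta) depending on distance from the transect, the detected locations s_1, ..., s_n over the covered region A form a Poisson process with thinned intensity \lambda(s) g(d(s)), so

    L(\beta, \theta) \;=\;
    \exp\!\Big(-\int_{A} \lambda(s;\beta)\, g\big(d(s);\theta\big)\, ds\Big)
    \prod_{i=1}^{n} \lambda(s_i;\beta)\, g\big(d(s_i);\theta\big),

and maximizing over (\beta, \theta) estimates the intensity and detection parameters simultaneously.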

Relevance: 20.00%

Abstract:

"How large a sample is needed to survey the bird damage to corn in a county in Ohio or New Jersey or South Dakota?" Like those in the Bureau of Sport Fisheries and Wildlife and the U.S.D.A. who have been faced with a question of this sort we found only meager information on which to base an answer, whether the problem related to a county in Ohio or to one in New Jersey, or elsewhere. Many sampling methods and rates of sampling did yield reliable estimates but the judgment was often intuitive or based on the reasonableness of the resulting data. Later, when planning the next study or survey, little additional information was available on whether 40 samples of 5 ears each or 5 samples of 200 ears should be examined, i.e., examination of a large number of small samples or a small number of large samples. What information is needed to make a reliable decision? Those of us involved with the Agricultural Experiment Station regional project concerned with the problems of bird damage to crops, known as NE-49, thought we might supply an ans¬wer if we had a corn field in which all the damage was measured. If all the damage were known, we could then sample this field in various ways and see how the estimates from these samplings compared to the actual damage and pin-point the best and most accurate sampling procedure. Eventually the investigators in four states became involved in this work1 and instead of one field we were able to broaden the geographical base by examining all the corn ears in 2 half-acre sections of fields in each state, 8 sections in all. When the corn had matured well past the dough stage, damage on each corn ear was assessed, without removing the ear from the stalk, by visually estimating the percent of the kernel surface which had been destroyed and rating it in one of 5 damage categories. Measurements (by row-centimeters) of the rows of kernels pecked by birds also were made on selected ears representing all categories and all parts of each field section. These measurements provided conversion factors that, when fed into a computer, were applied to the more than 72,000 visually assessed ears. The machine now had in its memory and could supply on demand a map showing each ear, its location and the intensity of the damage.