946 results for Reserve site selection


Relevance:

20.00%

Publisher:

Abstract:

Compressive Sampling Matching Pursuit (CoSaMP) is one of the popular greedy methods in the emerging field of Compressed Sensing (CS). In addition to its appealing empirical performance, CoSaMP also has splendid theoretical guarantees for convergence. In this paper, we propose a modification of CoSaMP that adaptively chooses the dimension of the search space in each iteration, using a threshold-based approach. Using Monte Carlo simulations, we show that this modification improves the reconstruction capability of the CoSaMP algorithm in both clean and noisy measurement cases. From empirical observations, we also propose an optimum value of the threshold for use in applications.
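The adaptive identification step described above can be sketched as follows; the relative-threshold rule and its default value tau = 0.5 are illustrative assumptions, not the paper's exact choice.

```python
import numpy as np

def cosamp_adaptive(Phi, y, s, tau=0.5, iters=20):
    """CoSaMP with a threshold-based adaptive identification step: instead of
    always merging the 2s largest proxy entries, keep every entry whose
    magnitude exceeds tau * max|proxy|. (tau = 0.5 is a placeholder; the
    paper derives an empirically optimal threshold.)"""
    m, n = Phi.shape
    x = np.zeros(n)
    r = y.astype(float).copy()
    for _ in range(iters):
        proxy = Phi.T @ r
        # adaptive identification relative to the strongest correlation
        keep = np.flatnonzero(np.abs(proxy) >= tau * np.abs(proxy).max())
        omega = np.union1d(keep, np.flatnonzero(x))
        # least-squares fit restricted to the merged support
        b = np.zeros(n)
        b[omega], *_ = np.linalg.lstsq(Phi[:, omega], y, rcond=None)
        # prune back to the s largest coefficients and update the residual
        x = np.zeros(n)
        top = np.argsort(np.abs(b))[-s:]
        x[top] = b[top]
        r = y - Phi @ x
        if np.linalg.norm(r) < 1e-12:
            break
    return x
```

With an easy sensing matrix the sketch recovers an s-sparse vector exactly; the pruning step guarantees the output never has more than s nonzeros.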

Relevance:

20.00%

Publisher:

Abstract:

In the underlay mode of cognitive radio, secondary users are allowed to transmit while the primary is transmitting, but under tight interference constraints that protect the primary. However, these constraints limit the secondary system performance. Antenna selection (AS)-based multiple antenna techniques, which exploit spatial diversity with less hardware, help improve secondary system performance. We develop a novel and optimal transmit AS rule that minimizes the symbol error probability (SEP) of an average interference-constrained multiple-input-single-output secondary system that operates in the underlay mode. We show that the optimal rule is a non-linear function of the power gains of the channels from the secondary transmit antenna to the primary receiver and to the secondary receive antenna. We also propose a simpler, tractable variant of the optimal rule that performs as well as the optimal rule. We then analyze its SEP with L transmit antennas, and extensively benchmark it against several heuristic selection rules proposed in the literature. We also enhance these rules in order to provide a fair comparison, and derive new expressions for their SEPs. The results bring out new inter-relationships between the various rules, and show that the optimal rule can significantly reduce the SEP.

Relevance:

20.00%

Publisher:

Abstract:

Novel transmit antenna selection techniques are conceived for Spatial Modulation (SM) systems and their symbol error rate (SER) performance is investigated. Specifically, low-complexity Euclidean Distance optimized Antenna Selection (EDAS) and Capacity Optimized Antenna Selection (COAS) are studied. It is observed that the COAS scheme gives a better SER performance than the EDAS scheme. We show that the proposed antenna selection based SM systems are capable of attaining a significant gain in signal-to-noise ratio (SNR) compared to conventional SM systems, and also outperform the conventional MIMO systems employing antenna selection at both low and medium SNRs.
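The two selection criteria can be contrasted with a minimal sketch of the norm-based COAS step; the function below is an illustrative assumption about how the strongest transmit antennas are picked, not the authors' implementation.

```python
import numpy as np

def coas_select(H, n_sel):
    """Capacity-Optimized Antenna Selection in its low-complexity norm-based
    form: keep the n_sel transmit antennas (columns of H) with the largest
    channel gains. EDAS would instead search antenna subsets maximizing the
    minimum Euclidean distance between SM constellation points, which is far
    costlier. (A sketch of the selection step only, not the full SM system.)"""
    norms = np.linalg.norm(H, axis=0)             # per-antenna channel gain
    chosen = np.sort(np.argsort(norms)[-n_sel:])  # indices of the strongest antennas
    return chosen, H[:, chosen]
```

For a 2x4 channel with column norms (1, 2, 3, 1), selecting two antennas keeps indices 1 and 2.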

Relevance:

20.00%

Publisher:

Abstract:

Bilateral filters perform edge-preserving smoothing and are widely used for image denoising. The denoising performance is sensitive to the choice of the bilateral filter parameters. We propose an optimal parameter selection for bilateral filtering of images corrupted with Poisson noise. We employ the Poisson Unbiased Risk Estimate (PURE), which is an unbiased estimate of the Mean Squared Error (MSE). It does not require a priori knowledge of the ground truth and is useful in practical scenarios where there is no access to the original image. Experimental results show that the quality of denoising obtained with PURE-optimal bilateral filters is almost indistinguishable from that of the Oracle-MSE-optimal bilateral filters.
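A minimal 1-D sketch of PURE-driven parameter selection follows; the paper works with 2-D images, and the grid of candidate range parameters (`sigma_r`) here is a hypothetical example. PURE relies on the Poisson identity E[x_i f_i(y)] = E[y_i f_i(y - e_i)], giving PURE = (1/N)(||f(y)||^2 - 2 sum_i y_i f_i(y - e_i) + sum_i y_i (y_i - 1)).

```python
import numpy as np

def bilateral_1d(y, sigma_s=2.0, sigma_r=5.0, radius=4):
    """1-D bilateral filter: Gaussian spatial weights times Gaussian range
    weights on intensity differences (a sketch; the paper uses 2-D images)."""
    n = len(y)
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        d = np.arange(lo, hi) - i
        w = np.exp(-d**2 / (2 * sigma_s**2)) * \
            np.exp(-(y[lo:hi] - y[i])**2 / (2 * sigma_r**2))
        out[i] = np.sum(w * y[lo:hi]) / np.sum(w)
    return out

def pure(f, y):
    """Poisson unbiased risk estimate of the per-sample MSE of estimator f:
    PURE = (1/N)(||f(y)||^2 - 2*sum_i y_i f_i(y - e_i) + sum_i y_i(y_i - 1))."""
    n = len(y)
    fy = f(y)
    term = 0.0
    for i in range(n):
        if y[i] > 0:                    # y_i = 0 contributes nothing
            yp = y.astype(float).copy()
            yp[i] -= 1.0
            term += y[i] * f(yp)[i]
    return (fy @ fy - 2 * term + np.sum(y * (y - 1))) / n

def select_sigma_r(y, grid=(1.0, 2.0, 5.0, 10.0)):
    # PURE-driven parameter selection: pick the sigma_r minimizing the estimate
    return min(grid, key=lambda s: pure(lambda v: bilateral_1d(v, sigma_r=s), y))
```

Sanity check: for the identity estimator f(y) = y, PURE reduces exactly to mean(y), the unbiased estimate of the Poisson noise variance.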

Relevance:

20.00%

Publisher:

Abstract:

Classification of a large document collection involves dealing with a huge feature space where each distinct word is a feature. In such an environment, classification is a costly task both in terms of running time and computing resources. Further, it does not guarantee optimal results, because it is likely to overfit by considering every feature for classification. In such a context, feature selection is inevitable. This work analyses feature selection methods, explores the relations among them, and attempts to find a minimal subset of features which are discriminative for document classification.
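One standard instance of such feature selection is ranking terms by information gain; the toy functions below sketch this under the assumption of binary term-presence features (the work itself compares several selection methods).

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(docs, labels, term):
    """Information gain of a binary 'term present' feature: H(class) minus
    the class entropy conditioned on presence/absence of the term."""
    with_t = [l for d, l in zip(docs, labels) if term in d]
    without = [l for d, l in zip(docs, labels) if term not in d]
    n = len(labels)
    cond = sum(len(part) / n * entropy(part)
               for part in (with_t, without) if part)
    return entropy(labels) - cond

def select_features(docs, labels, k):
    """Keep the k terms with the highest information gain."""
    vocab = sorted({t for d in docs for t in d})
    return sorted(vocab, key=lambda t: information_gain(docs, labels, t),
                  reverse=True)[:k]
```

A term that perfectly splits the classes (e.g. appearing only in one class) attains the maximum gain, here 1 bit for a balanced two-class collection.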

Relevance:

20.00%

Publisher:

Abstract:

Seismic site classifications are used to represent site effects for estimating hazard parameters (response spectral ordinates) at the soil surface. Seismic site classifications have generally been carried out using the average shear wave velocity and/or standard penetration test N-values of the top 30-m soil layers, according to the recommendations of the National Earthquake Hazards Reduction Program (NEHRP) or the International Building Code (IBC). The site classification system in the NEHRP and the IBC is based on studies carried out in the United States, where soil layers extend up to several hundred meters before reaching any distinct soil-bedrock interface, and may not be directly applicable to other regions, especially regions having shallow geological deposits. This paper investigates the influence of rock depth on site classes based on the recommendations of the NEHRP and the IBC. For this study, soil sites having a wide range of average shear wave velocities (or standard penetration test N-values) have been collected from different parts of Australia, China, and India. Shear wave velocities of rock layers underneath soil layers have also been collected at depths from a few meters to 180 m. It is shown that a site classification system based on the top 30-m soil layers often represents stiffer site classes for soil sites having shallow rock depths (rock depths less than 25 m from the soil surface). A new site classification system based on average soil thickness up to engineering bedrock has been proposed herein, which is considered more representative for soil sites in shallow bedrock regions. It has been observed that response spectral ordinates, amplification factors, and site periods estimated using one-dimensional shear wave analysis considering the depth of engineering bedrock are different from those obtained considering the top 30-m soil layers.
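The 30-m averaging that the paper questions can be sketched directly: Vs30 is the time-averaged shear wave velocity of the top 30 m, Vs30 = 30 / sum(d_i / v_i), mapped to NEHRP site classes. The example below (assuming a profile at least 30 m deep) shows how a shallow-rock site with only 10 m of soft soil lands in stiff class C.

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear wave velocity of the top 30 m:
    Vs30 = 30 / sum(d_i / v_i), with layer thicknesses truncated so that
    only the top 30 m contribute (assumes the profile reaches 30 m)."""
    total, travel = 0.0, 0.0
    for d, v in zip(thicknesses_m, velocities_mps):
        d = min(d, 30.0 - total)     # truncate the layer at 30 m depth
        travel += d / v              # travel time through the layer
        total += d
        if total >= 30.0:
            break
    return 30.0 / travel

def nehrp_class(v):
    """NEHRP site class from Vs30 in m/s (E < 180, D < 360, C < 760, B < 1500, else A)."""
    for limit, cls in ((180, "E"), (360, "D"), (760, "C"), (1500, "B")):
        if v < limit:
            return cls
    return "A"
```

A 10 m soft layer (200 m/s) over shallow rock (1500 m/s) gives Vs30 of about 474 m/s and class C, whereas 30 m of the same soft soil would be class D, which is the bias toward stiffer classes discussed above.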

Relevance:

20.00%

Publisher:

Abstract:

Accidental spills and improper disposal of industrial effluent/sludge containing heavy metals onto open land or into the subsurface result in soil and water contamination. Detailed investigations were carried out to identify the source of heavy-metal contamination in an industrial suburb near Bangalore, India, covering both ground water and subsurface soil. Ground water samples were collected across the entire area through a cluster of borewells, and subsurface soil samples were collected near the borewells that were found to contain heavy metals. Water samples and soil samples (after acid digestion) were analysed as per the APHA standard methods of analysis. While the results showed that Zn, Ni and Cd are within allowable limits in the soil, the ground water and soils at the site have concentrations of Cr+6 far exceeding the allowable limits (up to 832 mg/kg). Considering the topography of the area, the ground water movement and the chromium concentrations in the borewells and subsurface, it was possible to identify the origin, the zone of contamination and the migration path of Cr+6. The results indicated that the predominant mechanism of migration of Cr+6 is diffusion.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present a methodology for identifying the best features from a large feature space. In a high-dimensional feature space, nearest neighbor search is meaningless, and we see quality and performance issues with it. Many data mining algorithms use nearest neighbor search, so instead of searching over all the features we need to select the relevant ones. We propose feature selection using Non-negative Matrix Factorization (NMF) and apply it to nearest neighbor search. A recent clustering algorithm based on Locally Consistent Concept Factorization (LCCF) shows better document clustering quality by using the local geometrical and discriminating structure of the data. Using our feature selection method, we show a further improvement in clustering performance.
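A minimal sketch of NMF-based feature scoring follows; the multiplicative-update factorization is standard, but the max-loading scoring rule is a hypothetical stand-in for the paper's method.

```python
import numpy as np

def nmf(V, k, iters=200, seed=0):
    """Multiplicative-update NMF: V (docs x terms) ~ W @ H, all non-negative."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 0.1
    H = rng.random((k, n)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

def nmf_feature_scores(V, k):
    """Score each feature (column of V) by its largest loading across the k
    NMF basis vectors; high-scoring features characterize some latent topic.
    (An illustrative scoring rule in the spirit of the paper, not its exact one.)"""
    _, H = nmf(V, k)
    H = H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-9)
    return H.max(axis=0)
```

On a block-structured document-term matrix, features that never occur receive a zero score, so ranking by this score discards them before any nearest neighbor search.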

Relevance:

20.00%

Publisher:

Abstract:

Outlier detection in high dimensional categorical data has been a problem of much interest due to the extensive use of qualitative features for describing the data across various application areas. Though there exist various established methods for dealing with the dimensionality aspect through feature selection on numerical data, the categorical domain is actively being explored. As outlier detection is generally considered as an unsupervised learning problem due to lack of knowledge about the nature of various types of outliers, the related feature selection task also needs to be handled in a similar manner. This motivates the need to develop an unsupervised feature selection algorithm for efficient detection of outliers in categorical data. Addressing this aspect, we propose a novel feature selection algorithm based on the mutual information measure and the entropy computation. The redundancy among the features is characterized using the mutual information measure for identifying a suitable feature subset with less redundancy. The performance of the proposed algorithm in comparison with the information gain based feature selection shows its effectiveness for outlier detection. The efficacy of the proposed algorithm is demonstrated on various high-dimensional benchmark data sets employing two existing outlier detection methods.
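The entropy and mutual information computations can be sketched for categorical columns as follows; the greedy low-redundancy criterion below is an illustrative variant, not the paper's exact algorithm.

```python
import math
from collections import Counter

def entropy(col):
    """Shannon entropy (bits) of a categorical column."""
    n = len(col)
    return -sum(c / n * math.log2(c / n) for c in Counter(col).values())

def mutual_info(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B) for categorical columns."""
    return entropy(a) + entropy(b) - entropy(list(zip(a, b)))

def select_low_redundancy(columns, k):
    """Greedy selection: start from the highest-entropy feature, then add the
    feature maximizing H(f) minus its mean MI with the features already chosen.
    (A sketch of MI-based redundancy control, not the paper's exact criterion.)"""
    chosen = [max(range(len(columns)), key=lambda i: entropy(columns[i]))]
    while len(chosen) < k:
        rest = [i for i in range(len(columns)) if i not in chosen]
        chosen.append(max(rest, key=lambda i: entropy(columns[i]) -
                          sum(mutual_info(columns[i], columns[j])
                              for j in chosen) / len(chosen)))
    return chosen
```

A duplicated feature has maximal MI with its copy and is skipped in favor of an independent one, which is exactly the redundancy reduction described above.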

Relevance:

20.00%

Publisher:

Abstract:

The impact of global warming on daily rainfall is examined using atmospheric variables from five General Circulation Models (GCMs) and a stochastic downscaling model. Daily rainfall at eleven raingauges over the Malaprabha catchment of India and National Center for Environmental Prediction (NCEP) reanalysis data at grid points over the catchment for the continuous period 1971-2000 (current climate) are used to calibrate the downscaling model. The downscaled rainfall simulations obtained using GCM atmospheric variables corresponding to the IPCC-SRES (Intergovernmental Panel on Climate Change - Special Report on Emission Scenarios) A2 emission scenario for the same period are used to validate the results. Following this, future downscaled rainfall projections are constructed and examined for two 20-year time slices, viz. 2055 (i.e. 2046-2065) and 2090 (i.e. 2081-2100). The model results show reasonable skill in simulating the rainfall over the study region for the current climate. The downscaled rainfall projections indicate no significant changes in the rainfall regime in this catchment in the future. More specifically, a 2% decrease by 2055 and a 5% decrease by 2090 in monsoon (JJAS) rainfall compared to the current climate (1971-2000) are noticed under global warming conditions. Also, pre-monsoon (JFMAM) rainfall is projected to increase by 2% in 2055 and 6% in 2090, and post-monsoon (OND) rainfall by 2% in 2055 and 12% in 2090, over the region. On an annual basis, slight decreases of 1% and 2% are noted for 2055 and 2090, respectively.

Relevance:

20.00%

Publisher:

Abstract:

The accuracy of pairing of the anticodon of the initiator tRNA (tRNA(fMet)) and the initiation codon of an mRNA, in the ribosomal P-site, is crucial for determining the translational reading frame. However, a direct role of any ribosomal element(s) in scrutinizing this pairing is unknown. The P-site elements m(2)G966 (methylated by RsmD), m(5)C967 (methylated by RsmB) and the C-terminal tail of the protein S9 lie in the vicinity of tRNA(fMet). We investigated the role of these elements in initiation from various codons, namely, AUG, GUG, UUG, CUG, AUA, AUU, AUC and ACG with tRNA(CAU)(fMet) (tRNA(fMet) with CAU anticodon); CAC and CAU with tRNA(GUG)(fMet); and UAG with tRNA(GAU)(fMet), using in vivo and computational methods. Although RsmB deficiency did not impact initiation from most codons, RsmD deficiency increased initiation from AUA, CAC and CAU (2- to 3.6-fold). Deletion of the S9 C-terminal tail resulted in poorer initiation from UUG, GUG and CUG, but in increased initiation from CAC, CAU and UAC codons (up to 4-fold). Also, the S9 tail suppressed initiation with tRNA(CAU)(fMet) lacking the 3GC base pairs in the anticodon stem. These observations suggest distinctive roles of the 966/967 methylations and the S9 tail in initiation.

Relevance:

20.00%

Publisher:

Abstract:

We propose energy harvesting technologies and cooperative relaying techniques to power the devices and improve reliability. Specifically, we propose schemes to (a) maximize the packet reception ratio (PRR) and (b) minimize the average packet delay (APD) through cooperation amongst nodes. Our key result and insight from the testbed implementation concerns the total data transmitted by each relay: a greedy policy that relays more data under good harvesting conditions turns out to be suboptimal, because energy replenishment is a slow process. The optimal scheme offers a low APD and also improves the PRR.

Relevance:

20.00%

Publisher:

Abstract:

The objective of the paper is to estimate the Safe Shutdown Earthquake (SSE) and Operating/Design Basis Earthquake (OBE/DBE) for the Nuclear Power Plant (NPP) site located at Kalpakkam, Tamil Nadu, India. The NPP is located at 12.558 degrees N, 80.175 degrees E, and a 500 km circular area around the NPP site is considered the `seismic study area' based on past regional earthquake damage distribution. The geology, seismicity and seismotectonics of the study area are studied, and a seismotectonic map is prepared showing the seismic sources and the past earthquakes. Earthquake data gathered from the literature are homogenized and declustered to form a complete earthquake catalogue for the seismic study area. The conventional maximum magnitude of each source is estimated considering the maximum observed magnitude (M-max(obs)) and/or the addition of 0.3 to 0.5 to M-max(obs). In this study, the maximum earthquake magnitude has also been estimated by establishing the region's rupture character based on source length and the associated M-max(obs). A final source-specific M-max is selected from the three M-max values by following logical criteria. To estimate the hazard at the NPP site, ten Ground-Motion Prediction Equations (GMPEs) valid for the study area are considered. These GMPEs are ranked based on Log-Likelihood (LLH) values, and the top five GMPEs are used to estimate the peak ground acceleration (PGA) for the site. The three faults giving the maximum PGA are identified as the vulnerable sources used to decide the magnitudes of the OBE and SSE. The average, normalized site-specific response spectrum is prepared considering the three vulnerable sources and further used to establish the site-specific design spectrum at the NPP site.
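The LLH ranking step can be sketched as follows, assuming each GMPE is reduced to a vector of predicted (log) ground motions and a standard deviation; the observations and model values below are hypothetical, and lower LLH means a better-fitting model.

```python
import math

def log2_normal_pdf(x, mu, sigma):
    # log2 of the N(mu, sigma^2) density, computed in log space for stability
    return (-(x - mu) ** 2 / (2 * sigma ** 2)
            - math.log(sigma * math.sqrt(2.0 * math.pi))) / math.log(2.0)

def llh(observations, predictions, sigma):
    """Average negative log2-likelihood of observed (log) ground motions under
    a GMPE that predicts mean `predictions` with standard deviation `sigma`."""
    return -sum(log2_normal_pdf(o, p, sigma)
                for o, p in zip(observations, predictions)) / len(observations)

def rank_gmpes(observations, gmpes):
    """gmpes: dict name -> (predictions, sigma); returns names sorted best-first."""
    return sorted(gmpes, key=lambda g: llh(observations, *gmpes[g]))
```

A model whose predictions sit on top of the observations scores a much lower LLH than one biased by a full log unit, so the ranking picks it first.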

Relevance:

20.00%

Publisher:

Abstract:

The timer-based selection scheme is a popular, simple, and distributed scheme that is used to select the best node from a set of available nodes. In it, each node sets a timer as a function of a local preference number called a metric, and transmits a packet when its timer expires. The scheme ensures that the timer of the best node, which has the highest metric, expires first. However, it fails to select the best node if another node transmits a packet within Delta seconds of the transmission by the best node. We derive the optimal timer mapping that maximizes the average success probability for the practical scenario in which the number of nodes in the system is unknown but only its probability distribution is known. We show that it has a special discrete structure, and present a recursive characterization to determine it. We benchmark its performance against ad hoc approaches proposed in the literature, and show that it delivers significant gains. New insights about the optimality of some ad hoc approaches are also developed.
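The vulnerability window can be illustrated with a short Monte Carlo, using a simple continuous timer mapping as a stand-in (the paper shows the optimal mapping is instead a discrete staircase):

```python
import random

def success_prob(n, mapping, delta, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that the best node is selected:
    each node draws a metric ~ U(0,1) and sets its timer via `mapping`
    (monotone decreasing in the metric, so the best node's timer expires
    first); selection succeeds iff the earliest timer beats the second
    earliest by more than delta, the vulnerability window."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        timers = sorted(mapping(rng.random()) for _ in range(n))
        if len(timers) == 1 or timers[1] - timers[0] > delta:
            wins += 1
    return wins / trials

def linear_mapping(m, t_max=1.0):
    # a simple monotone decreasing ramp onto [0, t_max]; an ad hoc choice,
    # not the optimal discrete mapping derived in the paper
    return t_max * (1.0 - m)
```

With delta = 0 the best node essentially always wins, while a window of 0.2 s with five nodes already drops the success probability to roughly (1 - 0.2)^5, about 0.33, which is the loss the optimal mapping is designed to fight.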

Relevance:

20.00%

Publisher:

Abstract:

Sport hunting is often proposed as a tool to support the conservation of large carnivores. However, it is challenging to provide tangible economic benefits from this activity as an incentive for local people to conserve carnivores. We assessed economic gains from sport hunting and poaching of leopards (Panthera pardus), costs of leopard depredation of livestock, and attitudes of people toward leopards in Niassa National Reserve, Mozambique. We sent questionnaires to hunting concessionaires (n = 8) to investigate the economic value of leopards and their importance relative to other key trophy-hunted species. We asked villagers (n = 158) the number of and prices for leopards poached in the reserve and the number of goats depredated by leopards. Leopards were the mainstay of the hunting industry; a single animal was worth approximately U.S.$24,000. Most safari revenues are retained at national and international levels, but poached leopards are illegally traded locally for small amounts ($83). Leopards depredated 11 goats over 2 years in 2 of the 4 surveyed villages, resulting in losses of $440 to 6 households. People in these households had negative attitudes toward leopards. Although leopard sport hunting generates larger gross revenues than poaching, illegal hunting provides higher economic benefits for the households involved in the activity. Sport-hunting revenues did not compensate for the economic losses of livestock at the household level. On the basis of our results, we propose that poaching be reduced by increasing the costs of apprehension and that the economic benefits from leopard sport hunting be used to improve community livelihoods and provide incentives not to poach.