112 results for Probability Of Target Attainment


Relevance: 100.00%

Abstract:

We study a State Dependent Attempt Rate (SDAR) approximation to model M queues (one queue per node) served by the Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) protocol as standardized in the IEEE 802.11 Distributed Coordination Function (DCF). The approximation is that, when n of the M queues are non-empty, the (transmission) attempt probability of each of the n non-empty nodes is given by the long-term attempt probability of n saturated nodes. With packets arriving into the M queues according to independent Poisson processes, the SDAR approximation reduces a single cell with non-saturated nodes to a Markovian coupled queueing system. We provide a sufficient condition under which the joint queue-length Markov chain is positive recurrent. For the symmetric case of equal arrival rates and finite, equal buffers, we develop an iterative method that yields accurate predictions for important performance measures such as collision probability, throughput, and mean packet delay. We replace the MAC layer with the SDAR model of contention by modifying the NS-2 source code pertaining to the MAC layer, keeping all other layers unchanged. With this model-based simulation technique at the MAC layer, we achieve speed-ups (with respect to MAC-layer operations) of up to 5.4×. Through extensive model-based simulations and numerical results, we show that the SDAR model is an accurate model for the DCF MAC protocol in single cells. (C) 2012 Elsevier B.V. All rights reserved.
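The SDAR coupling can be illustrated with a compact slotted simulation. The sketch below is a minimal illustration under stated assumptions, not the authors' NS-2 implementation: the attempt-probability table beta[n] would in practice come from a saturated fixed-point analysis, and the placeholder values, arrival rate, and buffer size here are invented.

```python
# Minimal SDAR sketch: M coupled queues; when n queues are non-empty, each
# attempts with the saturated-node probability beta[n] (placeholder values).
import random

def simulate_sdar(M=10, lam=0.02, beta=None, buf=50, slots=200_000, seed=1):
    """Slotted simulation of M coupled queues under the SDAR approximation."""
    rng = random.Random(seed)
    if beta is None:
        # Placeholder table: long-term attempt probability of n saturated nodes.
        beta = [0.0] + [min(0.06 / n**0.5, 1.0) for n in range(1, M + 1)]
    q = [0] * M
    successes = busy_slots = collisions = 0
    for _ in range(slots):
        for i in range(M):                 # Bernoulli approximation of Poisson arrivals
            if rng.random() < lam:
                q[i] = min(q[i] + 1, buf)  # finite buffer
        nonempty = [i for i in range(M) if q[i] > 0]
        n = len(nonempty)
        if n == 0:
            continue
        # SDAR: each non-empty node attempts with probability beta[n].
        attempters = [i for i in nonempty if rng.random() < beta[n]]
        if len(attempters) == 1:           # exactly one attempt: success
            q[attempters[0]] -= 1
            successes += 1
        elif len(attempters) > 1:          # simultaneous attempts: collision
            collisions += 1
        if attempters:
            busy_slots += 1
    return successes / slots, collisions / max(busy_slots, 1)

throughput, p_coll = simulate_sdar()
print(f"throughput = {throughput:.4f} pkts/slot, collision fraction = {p_coll:.3f}")
```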

Relevance: 100.00%

Abstract:

A lightning strike in the neighborhood can induce significant currents in tall down conductors. Although the magnitude of the induced current in this case is much smaller than that encountered during a direct strike, its probability of occurrence and frequency content are higher. In view of this, knowledge of the characteristics of such induced currents is relevant for the scrutiny of recorded currents and for the evaluation of interference to electrical and electronic systems in the vicinity. A previous study examined the characteristics of induced currents under ideal conditions, i.e., with no influencing objects in the vicinity of the down conductor and the channel. However, some influencing conducting bodies will always be present, such as trees, electricity and communication towers, buildings, and other elevated objects, and these can affect the induced currents in a down conductor. The present work investigates the influence of nearby conducting objects on the characteristics of currents induced by a strike to ground in the vicinity of a tall down conductor. For the study, an electromagnetic model is employed to represent the down conductor, the channel, and the neighboring conducting objects, and the Numerical Electromagnetics Code-2 (NEC-2) is used for the numerical field computations. Neighboring objects of different heights and shapes, and at different locations, are considered. It is found that neighboring objects have a significant influence on the magnitude and nature of the induced currents in a down conductor when their height is comparable to that of the down conductor.

Relevance: 100.00%

Abstract:

Gujarat is one of the fastest-growing states of India, with high industrial activity coming up in its major cities. Seismic hazard analysis for the region is indispensable, as it is considered among the most seismically active parts of the stable continental region of India. The Bhuj earthquake of 2001 caused extensive damage in terms of casualties and economic loss. In the present study, the seismic hazard of Gujarat is evaluated using a probabilistic approach within a logic-tree framework, which minimizes the uncertainties in hazard assessment. Peak horizontal acceleration (PHA) and spectral acceleration (Sa) values were evaluated for 10% and 2% probabilities of exceedance in 50 years. Two important geotechnical effects of earthquakes, site amplification and liquefaction, are also evaluated, considering site characterization based on site classes. The liquefaction return period for the entire state of Gujarat is evaluated using a performance-based approach. The maps of PHA and PGA values prepared in this study are very useful for seismic hazard mitigation in the region in the future.
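For reference, under the Poisson recurrence assumption that is standard in probabilistic seismic hazard analysis (and implicit in the abstract), the quoted exceedance probabilities correspond to the familiar 475- and 2475-year return periods; a minimal check:

```python
import math

def return_period(p_exceed: float, t_years: float = 50.0) -> float:
    """Return period T_R for exceedance probability p_exceed in t_years,
    assuming Poisson occurrences: p = 1 - exp(-t / T_R)."""
    return -t_years / math.log(1.0 - p_exceed)

print(return_period(0.10))  # ~475 years  (10% in 50 years)
print(return_period(0.02))  # ~2475 years ( 2% in 50 years)
```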

Relevance: 100.00%

Abstract:

This paper analyzes the error exponents in Bayesian decentralized spectrum sensing, i.e., the detection of occupancy of the primary spectrum by a cognitive radio, with probability of error as the performance metric. At the individual sensors, the error exponents of a Central Limit Theorem (CLT) based detection scheme are analyzed. At the fusion center, a K-out-of-N rule is employed to arrive at the overall decision. It is shown that, in the presence of fading, for a fixed number of sensors, the error exponents with respect to the number of observations are zero at both the individual sensors and the fusion center. This motivates the development of the error exponent with a certain probability as a novel metric for comparing detection schemes in the presence of fading. The metric is useful, for example, in answering the question of whether to sense a pilot tone in a narrow band (and suffer Rayleigh fading) or to sense the entire wide-band signal (and suffer log-normal shadowing), in terms of error exponent performance. The error exponents with a certain probability at both the individual sensors and the fusion center are derived under both Rayleigh fading and log-normal shadowing. Numerical results illustrate and provide a visual feel for the theoretical expressions obtained.
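A minimal Monte Carlo sketch of the K-out-of-N fusion rule under Rayleigh fading is given below; the energy-detection threshold, SNR, sensor count, and prior are illustrative stand-ins, not the paper's CLT-based design.

```python
# K-out-of-N decision fusion with per-sensor energy detection under
# Rayleigh fading (all parameters illustrative).
import numpy as np

rng = np.random.default_rng(0)

def sensor_decisions(n_sensors, n_obs, snr_db, signal_present):
    """Each sensor compares its average received energy to a threshold."""
    snr = 10 ** (snr_db / 10)
    h = (rng.normal(size=n_sensors) + 1j * rng.normal(size=n_sensors)) / np.sqrt(2)
    energy = (rng.normal(size=(n_sensors, n_obs)) ** 2).mean(axis=1)  # noise energy
    if signal_present:
        energy += snr * np.abs(h) ** 2        # Rayleigh-faded signal power
    threshold = 1.0 + np.sqrt(snr)            # illustrative threshold choice
    return energy > threshold

def fused_error(K, N=8, n_obs=100, snr_db=0, trials=5_000):
    errors = 0
    for _ in range(trials):
        present = rng.random() < 0.5          # equiprobable Bayesian prior
        votes = sensor_decisions(N, n_obs, snr_db, present).sum()
        decide_present = votes >= K           # K-out-of-N fusion rule
        errors += decide_present != present
    return errors / trials

for K in (1, 4, 8):
    print(f"K={K}: P_err ~ {fused_error(K):.3f}")
```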

Relevance: 100.00%

Abstract:

We studied the development of surface instabilities leading to the generation of multielectron bubbles (MEBs) in superfluid helium upon the application of a pulsed electric field. We found the statistical distribution of the charge of individual instabilities to be strongly dependent on the duration of the electric field pulse. The rate and probability of generation of these instabilities in relation to the temporal characteristics of the applied field were also investigated.

Relevance: 100.00%

Abstract:

Future space-based gravity wave (GW) experiments such as the Big Bang Observatory (BBO), with their excellent projected one-sigma angular resolution, will measure the luminosity distance to a large number of GW sources to high precision, and the redshifts of single galaxies lying within the narrow solid angles towards the sources will provide the redshifts of the GW sources. One-sigma BBO beams contain the actual source in only 68% of cases; beams that do not contain the source may contain a spurious single galaxy, leading to misidentification. To increase the probability of the source falling within the beam, larger beams have to be considered, which decreases the chance of finding a single galaxy in the beam. Saini et al. [T.D. Saini, S.K. Sethi, and V. Sahni, Phys. Rev. D 81, 103009 (2010)] argued, largely analytically, that identifying even a small number of GW source galaxies furnishes a rough distance-redshift relation, which can be used to further resolve sources that have multiple objects in the angular beam. In this work we develop this idea further by introducing a self-calibrating iterative scheme, working in conjunction with Monte Carlo simulations, to determine the luminosity distance to GW sources with progressively greater accuracy. This iterative scheme allows one to determine the equation of state of dark energy to within an accuracy of a few percent for a gravity wave experiment with a beam width an order of magnitude larger than BBO's (and therefore a far poorer angular resolution). This is achieved with no prior information about the nature of dark energy from other data sets such as type Ia supernovae, baryon acoustic oscillations, or the cosmic microwave background. DOI: 10.1103/PhysRevD.87.083001
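A toy version of such a self-calibrating iteration might look as follows; the beam contents, the linear low-redshift distance law, the noise level, and the convergence rule are all invented for illustration and stand in for the full d_L(z) and Monte Carlo machinery of the paper.

```python
# Hypothetical sketch: iteratively identify host galaxies of GW sources
# using a progressively refined distance-redshift relation.
import numpy as np

rng = np.random.default_rng(42)
H0_FID = 70.0          # km/s/Mpc; illustrative linear Hubble law d_L ~ c z / H0
C = 3.0e5              # km/s

def d_of_z(z, h0):
    return C * z / h0  # low-z approximation, stands in for the full d_L(z)

# Fake catalogue: each GW source has a measured luminosity distance and a
# beam containing the true host plus interloper galaxies at random redshifts.
n_src, n_interlopers = 200, 4
z_true = rng.uniform(0.02, 0.3, n_src)
d_meas = d_of_z(z_true, H0_FID) * (1 + 0.05 * rng.normal(size=n_src))
beams = [np.append(z, rng.uniform(0.02, 0.3, n_interlopers)) for z in z_true]

h0 = 60.0              # deliberately wrong starting guess
for it in range(10):
    # Step 1: in each beam, pick the galaxy most consistent with d_L(z; h0).
    z_pick = np.array([zs[np.argmin(np.abs(d_of_z(zs, h0) - d))]
                       for zs, d in zip(beams, d_meas)])
    # Step 2: refit h0 from the picked (z, d) pairs (least squares through origin).
    h0_new = C * np.sum(z_pick * d_meas) / np.sum(d_meas ** 2)
    if abs(h0_new - h0) < 1e-3:
        break
    h0 = h0_new

print(f"recovered H0 ~ {h0:.1f} (truth {H0_FID})")
```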

Relevance: 100.00%

Abstract:

In this paper, we analyze the coexistence of a primary and a secondary (cognitive) network when both networks use the IEEE 802.11 distributed coordination function for medium access control. Specifically, we consider the problem of channel capture by a secondary network that uses spectrum sensing to determine the availability of the channel, and its impact on the primary throughput. We integrate the notion of transmission slots in Bianchi's Markov model with physical time slots to derive the transmission probability of the secondary network as a function of its scan duration. This is used to obtain analytical expressions for the throughput achievable by the primary and secondary networks. Our analysis considers both saturated and unsaturated networks. By performing a numerical search, the secondary network parameters are selected to maximize its throughput for a given level of protection of the primary network throughput. The theoretical expressions are validated through extensive simulations carried out in Network Simulator 2 (NS-2). Our results provide critical insights into the performance and robustness of different schemes for medium access by the secondary network. In particular, we find that channel capture by the secondary network does not significantly impact the primary throughput, and that simply increasing the secondary contention window size is only marginally inferior to silent-period based methods in terms of throughput performance.
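The starting point for such an analysis is Bianchi's saturated fixed point, which the paper extends by coupling transmission slots with physical time slots and the scan duration. A minimal sketch of the textbook fixed point follows; W, m, and the damping scheme are generic choices, not the paper's extended model.

```python
# Bianchi's saturated fixed point for the 802.11 DCF (textbook form).
def bianchi_tau(n, W=32, m=5, iters=500):
    """Damped fixed-point iteration for
    tau = 2(1-2p) / ((1-2p)(W+1) + p W (1-(2p)^m)),  p = 1-(1-tau)^(n-1)."""
    tau = 0.01
    for _ in range(iters):
        p = 1.0 - (1.0 - tau) ** (n - 1)
        tau_new = 2 * (1 - 2 * p) / ((1 - 2 * p) * (W + 1)
                                     + p * W * (1 - (2 * p) ** m))
        tau = 0.5 * tau + 0.5 * tau_new   # damping keeps the iteration stable
    return tau, p

tau, p = bianchi_tau(n=10)
print(f"attempt probability tau = {tau:.4f}, conditional collision prob p = {p:.4f}")
```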

Relevance: 100.00%

Abstract:

We theoretically explore the annihilation of vortex dipoles, generated when an obstacle moves through an oblate Bose-Einstein condensate, and examine the energetics of the annihilation event. We show that the grey soliton which results from the vortex dipole annihilation is lower in energy than the vortex dipole. We also investigate the annihilation events numerically and observe that annihilation occurs only when the vortex dipole overtakes the obstacle and comes closer to it than the coherence length. Furthermore, we find that noise reduces the probability of annihilation events, which may explain the lack of annihilation events in experimental realizations.
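Such vortex-dipole dynamics are typically simulated with a split-step Fourier integration of the 2D Gross-Pitaevskii equation. The following minimal sketch (dimensionless units, with an invented moving Gaussian obstacle and grid parameters) illustrates the technique; it is not the authors' code.

```python
# Second-order split-step Fourier evolution of the 2D GPE with a moving
# Gaussian obstacle on a uniform background (all parameters illustrative).
import numpy as np

N, L = 256, 25.6                      # grid points, box size
dx = L / N
x = (np.arange(N) - N // 2) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
KX, KY = np.meshgrid(k, k, indexing="ij")
K2 = KX**2 + KY**2

g, mu, dt = 1.0, 1.0, 0.005           # interaction, chemical potential, time step
psi = np.sqrt(mu / g) * np.ones((N, N), dtype=complex)  # uniform background

def obstacle(t, v=0.8, height=2.0, width=1.0, x0=-8.0):
    """Gaussian obstacle moving in +x at speed v."""
    return height * np.exp(-(((X - (x0 + v * t)) ** 2 + Y**2) / width**2))

t = 0.0
for _ in range(2000):                 # Strang splitting: V/2, T, V/2
    psi *= np.exp(-0.5j * dt * (obstacle(t) + g * np.abs(psi) ** 2 - mu))
    psi = np.fft.ifft2(np.exp(-0.5j * dt * K2) * np.fft.fft2(psi))
    psi *= np.exp(-0.5j * dt * (obstacle(t) + g * np.abs(psi) ** 2 - mu))
    t += dt

density = np.abs(psi) ** 2            # vortex cores show up as density holes
print(f"min/max density: {density.min():.3f} / {density.max():.3f}")
```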

Relevance: 100.00%

Abstract:

The delineation of seismic source zones plays an important role in the evaluation of seismic hazard. In most studies, seismic sources are delineated based on geological features. In the present study, an attempt has been made to delineate seismic source zones in the study area (south India) based on seismicity parameters. The seismicity parameters and the maximum probable earthquake for these source zones were evaluated and used in the hazard evaluation. The probabilistic evaluation of seismic hazard for south India was carried out using a logic tree approach. Two different types of seismic sources, linear and areal, were considered in order to model the seismic sources in the region more precisely. To properly account for the attenuation characteristics of the region, three different attenuation relations were used with different weighting factors. Seismic hazard evaluation was done for probabilities of exceedance (PE) of 10% and 2% in 50 years. The spatial variation of rock-level peak horizontal acceleration (PHA) and spectral acceleration (Sa) values corresponding to return periods of 475 and 2500 years is presented for the entire study area. The peak ground acceleration (PGA) values at ground surface level were estimated for different NEHRP site classes, considering local site effects.
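The logic-tree treatment of attenuation relations amounts to a weighted combination over ground-motion branches; the sketch below illustrates the mechanics with invented placeholder relations and weights, not the three relations used in the study.

```python
# Logic-tree weighting over ground-motion (attenuation) branches.
import math

def gmpe_a(m, r):   # placeholder attenuation relation: ln(PHA in g)
    return -3.5 + 0.80 * m - 1.1 * math.log(r + 10.0)

def gmpe_b(m, r):
    return -3.2 + 0.75 * m - 1.0 * math.log(r + 8.0)

def gmpe_c(m, r):
    return -3.8 + 0.85 * m - 1.2 * math.log(r + 12.0)

BRANCHES = [(gmpe_a, 0.40), (gmpe_b, 0.35), (gmpe_c, 0.25)]  # weights sum to 1

def mean_ln_pha(magnitude, distance_km):
    """Logic-tree mean: weighted combination over the GMPE branches."""
    return sum(w * g(magnitude, distance_km) for g, w in BRANCHES)

pha_g = math.exp(mean_ln_pha(magnitude=6.5, distance_km=50.0))
print(f"weighted-branch PHA ~ {pha_g:.3f} g")
```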

Relevance: 100.00%

Abstract:

We propose a simple, reliable method, based on the probability of transitions and the distribution of adjacent pixel pairs, for steganalysis of digital images in the spatial domain subjected to Least Significant Bit (LSB) replacement steganography. Our method is sensitive to the statistics of the underlying cover image and is a variant of the Sample Pair Method. We use the new method to reliably estimate the length of the hidden message. The novelty of our method is that it detects, from statistics of the underlying image that are invariant under embedding, whether the results it calculates are reliable or not. To our knowledge, no steganalytic method so far predicts, from the properties of the stego image, whether its results are accurate.
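The ingredients of such a detector, LSB-replacement embedding and statistics of adjacent pixel pairs, can be illustrated as follows; this generic sketch is not the authors' estimator, and the synthetic cover image only stands in for a real, spatially correlated photograph.

```python
# LSB replacement plus a simple adjacent-pixel-pair statistic of the kind
# Sample-Pair-style detectors build on.
import numpy as np

rng = np.random.default_rng(0)

def lsb_embed(img, rate):
    """Replace the LSBs of a random fraction `rate` of pixels with message bits."""
    out = img.copy()
    mask = rng.random(img.shape) < rate
    bits = rng.integers(0, 2, img.shape, dtype=img.dtype)
    out[mask] = (out[mask] & ~np.uint8(1)) | bits[mask]
    return out

def even_odd_pair_stat(img):
    """Fraction of horizontally adjacent pairs (u, v) with u even and v = u + 1.
    LSB replacement perturbs such within-pair transitions in a predictable way."""
    u, v = img[:, :-1].astype(int), img[:, 1:].astype(int)
    return np.mean((u % 2 == 0) & (v == u + 1))

# Synthetic, spatially correlated cover: smoothed random walks along rows.
base = np.cumsum(rng.normal(0, 2, (256, 256)), axis=1)
cover = np.clip(base - base.min(), 0, 255).astype(np.uint8)

for rate in (0.0, 0.5, 1.0):
    print(rate, round(even_odd_pair_stat(lsb_embed(cover, rate)), 5))
```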

Relevance: 100.00%

Abstract:

Several concepts have been developed in recent years for nanomaterial-based integrated MEMS platforms, in order to accelerate biological sample preparation followed by selective screening and identification of target molecules. In this context, several challenges need to be addressed in the electrical lysis of biological cells: (i) low-resource settings while achieving maximal lysis; (ii) high throughput of the target molecules to be detected; (iii) automated extraction and purification of relevant molecules, such as DNA and protein, from extremely small sample volumes; (iv) the requirement of fast, accurate, and yet scalable methods; (v) multifunctionality toward process monitoring; and (vi) downward compatibility with existing diagnostic protocols. This paper reports on the optimization of the electrical lysis process using different nanocomposite-coated electrodes placed in a microfluidic channel. The nanocomposites are synthesized using nanomaterials such as zinc nanorod dispersions in polymer. The efficiency of electrical lysis with the different electrode coatings has been verified experimentally in terms of DNA concentration, amplification, and protein yield. The influence of the coating thickness on the injection current densities has been analyzed. We further correlate, experimentally, the current density versus voltage relationship with the extent of bacterial cell lysis. A coupled multiphysics simulation model is used to predict cell trajectories and lysis efficiencies under various electrode boundary conditions estimated from the experimental results. Detailed in-situ fluorescence imaging and spectroscopy studies are performed to validate the various hypotheses.

Relevance: 100.00%

Abstract:

Efficient photon detection in gaseous photomultipliers requires maximum photoelectron yield from the photocathode surface as well as efficient detection of the emitted photoelectrons. In this work we have investigated the parameters that affect the photoelectron yield from the photocathode surface, and methods to improve it, thus ensuring high detection efficiency of the gaseous photomultiplier. The parameters studied are the electric field at the photocathode surface, the surface properties of the photocathode, and the pressure of the gas mixture inside the gaseous photomultiplier. It was observed that an optimized electric field at the photocathode ensures high detection efficiency. A lower fill-gas pressure increases the photoelectron yield from the photocathode surface but reduces the probability of the electrons being focused into the electron multiplier. Evacuation for a longer duration before gas filling also increases the photoelectron yield.

Relevance: 100.00%

Abstract:

The uncertainty in material properties and traffic characterization in the design of flexible pavements has led to significant efforts in recent years to incorporate reliability methods and probabilistic design procedures for the design, rehabilitation, and maintenance of pavements. In the mechanistic-empirical (ME) design of pavements, despite the existence of multiple failure modes, the design criteria applied in the majority of analytical pavement design methods guard only against fatigue cracking and subgrade rutting, which are usually treated as independent failure events. This study carries out reliability analysis of a flexible pavement section for these failure criteria using the first-order reliability method (FORM), the second-order reliability method (SORM), and crude Monte Carlo simulation. Through a sensitivity analysis, the surface layer thickness was identified as the most critical parameter affecting the design reliability for both the fatigue and rutting failure criteria. Reliability analysis in pavement design is most useful, however, if it can be efficiently and accurately applied to the components of pavement design and to the combination of these components in an overall system analysis. The study shows that, for the pavement section considered, there is a high degree of dependence between the two failure modes, and demonstrates that the probability of simultaneous failure can be almost as high as the probability of the component failures. This highlights the need to consider system reliability in pavement analysis, and indicates that improving pavement performance should be tackled by reducing the undesirable event of simultaneous failure, not merely by considering the more critical failure mode. Furthermore, the probability of simultaneous failure is seen to increase considerably with small increments in the mean traffic load, which also results in wider system reliability bounds. The study also advocates the use of narrow bounds on the probability of failure, which provide a better estimate of the probability of failure, as validated against results obtained from Monte Carlo simulation (MCS).
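The system-reliability argument can be illustrated with a crude Monte Carlo simulation of two correlated failure margins; the reliability indices and the correlation below are invented for illustration, not values from the study.

```python
# Monte Carlo system reliability for two correlated failure modes
# (fatigue and rutting), modeled via correlated standard-normal margins.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
rho = 0.8                                  # assumed correlation between modes
beta1, beta2 = 2.0, 2.2                    # assumed component reliability indices

z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
fail1, fail2 = z1 < -beta1, z2 < -beta2    # failure when a margin drops below zero

pf1, pf2 = fail1.mean(), fail2.mean()
pf_sys = (fail1 | fail2).mean()            # series system: either mode fails
pf_both = (fail1 & fail2).mean()           # simultaneous failure of both modes

# First-order series-system bounds: max(pf1, pf2) <= pf_sys <= pf1 + pf2
print(f"pf1={pf1:.4f} pf2={pf2:.4f} system={pf_sys:.4f} both={pf_both:.4f}")
print(f"bounds: [{max(pf1, pf2):.4f}, {pf1 + pf2:.4f}]")
```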

Relevance: 100.00%

Abstract:

Development of simple functionalization methods to attach biomolecules such as proteins and DNA to inexpensive substrates is important for the widespread use of low-cost, disposable biosensors. Here, we describe a method based on polyelectrolyte multilayers for attaching single-stranded DNA molecules to conventional glass slides as well as to a completely non-standard substrate, namely flexible plastic transparency sheets. We then use the functionalized transparency sheets to specifically detect single-stranded Hepatitis B DNA sequences from samples. We also demonstrate a blocking method that reduces non-specific binding of target DNA sequences using negatively charged polyelectrolyte molecules. The polyelectrolyte-based functionalization method, which relies on surface charge rather than covalent surface linkages, could be an attractive platform for developing assays on inexpensive substrates for low-cost biosensing.

Relevance: 100.00%

Abstract:

The evolution of sexually dimorphic, elaborate male traits that are seemingly maladaptive may be driven by sexual selection (male-male competition and/or female mate choice). Tusk possession in the Asian elephant is sexually dimorphic and exaggerated, but its role in male-male competition has not yet been determined. We examined the role of tusks in establishing dominance, along with two other known male-male signals, namely body size and musth (a temporary, physiologically heightened sexual state), in an Asian elephant population in northeastern India with equal proportions of tusked and tuskless males. We observed 116 agonistic interactions with clear dominance outcomes between adult (>15 years) males during 458 field days in the dry-season months of 2008-2011. A generalized linear mixed-effects model was used to predict the probability of winning as a function of body size, tusk possession, and musth status relative to the opponent. A hierarchy of the three male-male signals emerged from this analysis, with musth overriding body size and body size overriding tusk possession. In this elephant population, tusk possession thus plays a relatively minor role in male-male competition. An important implication of musth and body size being stronger determinants of dominance than tusk possession is that it could facilitate the rapid evolution of tuskless males in the population under artificial selection against tusked individuals, which are poached for ivory. (C) 2013 The Association for the Study of Animal Behaviour. Published by Elsevier Ltd. All rights reserved.
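A simplified sketch of this kind of model follows: a fixed-effects logistic regression on simulated dyadic contests. The paper fit a mixed-effects model (e.g. with male identity as a random effect), which this sketch omits; the coefficients used to simulate the data simply mirror the reported musth > body size > tusk hierarchy and are invented.

```python
# Logistic regression of contest outcome on relative musth, size and tusk
# status (simulated data; fixed effects only, for illustration).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 116                                    # number of decided interactions

df = pd.DataFrame({
    # Relative (focal minus opponent) predictors, coded -1 / 0 / +1:
    "musth_adv": rng.integers(-1, 2, n),
    "size_adv":  rng.integers(-1, 2, n),
    "tusk_adv":  rng.integers(-1, 2, n),
})
# Simulate the reported hierarchy: musth > body size > tusks.
logit_p = 2.5 * df.musth_adv + 1.2 * df.size_adv + 0.3 * df.tusk_adv
df["win"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("win ~ musth_adv + size_adv + tusk_adv", data=df).fit(disp=0)
print(fit.params)                          # coefficient sizes mirror the hierarchy
```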