34 results for Mitigate



Abstract:

We consider the classical problem of sequential detection of a change in a distribution (from hypothesis 0 to hypothesis 1), where the fusion centre receives vectors of periodic measurements, with the measurements being i.i.d. over time and across the vector components under each of the two hypotheses. In our problem, the sensor devices ("motes") that generate the measurements constitute an ad hoc wireless network. The motes contend using a random access protocol (such as CSMA/CA) to transmit their measurement packets to the fusion centre. The fusion centre waits for vectors of measurements to accumulate before taking decisions. We formulate the optimal detection problem, taking into account the network delay experienced by the vectors of measurements, and find that, under periodic sampling, the detection delay decouples into network delay and decision delay. We obtain a lower bound on the network delay and propose a censoring scheme in which lagging sensors drop their delayed observations in order to mitigate network delay. We show that this scheme can achieve the lower bound; this approach is explored via simulation. We also use numerical evaluation and simulation to study issues such as the optimal sampling rate for a given number of sensors and the optimal number of sensors for a given measurement rate.
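
A minimal sketch of the decision stage at the fusion centre, assuming a standard CUSUM test on accumulated measurement vectors with illustrative Gaussian pre- and post-change models and an arbitrary threshold (these choices are placeholders, not the paper's formulation):

    import numpy as np

    # Minimal CUSUM-style sequential change detection on vectors of sensor
    # measurements (illustrative assumption: N(0,1) before the change,
    # N(0.5,1) after; the threshold is arbitrary).
    def cusum_detect(vectors, mu0=0.0, mu1=0.5, sigma=1.0, threshold=8.0):
        stat = 0.0
        for t, x in enumerate(vectors):
            # Log-likelihood ratio of the whole vector under the two hypotheses.
            llr = np.sum((mu1 - mu0) * (x - (mu0 + mu1) / 2.0)) / sigma**2
            stat = max(0.0, stat + llr)   # CUSUM recursion
            if stat > threshold:
                return t                  # decision delay in sampling periods
        return None

    rng = np.random.default_rng(0)
    pre = rng.normal(0.0, 1.0, size=(20, 8))    # 20 periods, 8 motes, pre-change
    post = rng.normal(0.5, 1.0, size=(40, 8))   # post-change measurements
    print(cusum_detect(np.vstack([pre, post])))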


Abstract:

In this work, using 3-D device simulation, we perform an extensive gate-to-source/drain underlap optimization for the recently proposed hybrid transistor, HFinFET, to show that the underlap lengths can be suitably tuned to improve the ON-OFF ratio as well as the subthreshold characteristics in an ultrashort-channel n-type device without significant ON performance degradation. We also show that the underlap knob can be tuned to mitigate the device quality degradation in the presence of interface traps. The obtained results are shown to be promising when compared against ITRS 2009 performance projections, as well as published state-of-the-art planar and nonplanar silicon MOSFET data of comparable gate lengths, using standard benchmarking techniques.


Abstract:

Software transactional memory (STM) has been proposed as a promising programming paradigm for shared-memory multi-threaded programs, as an alternative to conventional lock-based synchronization primitives. Typical STM implementations employ a conflict detection scheme that works with uniform access granularity, tracking shared data accesses either at the word/cache-line level or at the object level. It is well known that a single fixed access-tracking granularity cannot meet the conflicting goals of reducing false conflicts without adversely impacting concurrency. A fine-grained granularity, while improving concurrency, can have an adverse impact on performance due to lock aliasing, lock validation overheads, and additional cache pressure. On the other hand, a coarse-grained granularity can impact performance due to reduced concurrency. Thus, in general, a fixed or uniform granularity access tracking (UGAT) scheme is application-unaware and rarely matches the access patterns of an individual application or parts of an application, leading to sub-optimal performance for different parts of the application(s). In order to mitigate the disadvantages associated with the UGAT scheme, we propose a Variable Granularity Access Tracking (VGAT) scheme in this paper. We propose a compiler-based approach wherein the compiler uses inter-procedural whole-program static analysis to select the access-tracking granularity for different shared data structures of the application based on the application's data access pattern. We describe our prototype VGAT scheme, using TL2 as our STM implementation. Our experimental results reveal that the VGAT-STM scheme can improve the performance of STAMP benchmarks by between 1.87% and 21.2%.
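
A toy sketch of what access-tracking granularity means in practice, assuming a simple stripe-lock mapping (this is not the paper's TL2-based implementation; NUM_LOCKS and the addresses are arbitrary):

    # Map shared-data addresses to conflict-tracking lock stripes. A coarser
    # granularity maps more distinct addresses onto the same lock, producing
    # false conflicts; a finer one needs more locks and more validation work.
    NUM_LOCKS = 1 << 20

    def lock_index(address: int, granularity_bytes: int) -> int:
        """Map a byte address to a lock stripe for the given tracking granularity."""
        return (address // granularity_bytes) % NUM_LOCKS

    a, b = 0x1000, 0x1008          # two different 8-byte fields of one object
    print(lock_index(a, 8)  == lock_index(b, 8))    # False: word-level, no false conflict
    print(lock_index(a, 64) == lock_index(b, 64))   # True: cache-line level, false conflict

A VGAT-style scheme would choose the granularity per shared data structure rather than once for the whole application.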


Abstract:

Land cover (LC) and land use (LU) dynamics induced by human and natural processes play a major role in global as well as regional patterns of landscapes, influencing biodiversity, hydrology, ecology and climate. Changes in LC features resulting in forest fragmentation have posed direct threats to biodiversity, endangering the sustainability of ecological goods and services. Habitat fragmentation is of added concern as the residual spatial patterns mitigate or exacerbate edge effects. LU dynamics are obtained by classifying temporal remotely sensed satellite imagery of different spatial and spectral resolutions. This paper reviews five different image classification algorithms using spatio-temporal data of a temperate watershed in Himachal Pradesh, India. The Gaussian Maximum Likelihood classifier was found to be apt for analysing spatial patterns at the regional scale, based on accuracy assessment through error matrices and ROC (receiver operating characteristic) curves. The LU information thus derived was then used to assess spatial changes from temporal data using principal component analysis and correspondence-analysis-based image differencing. Forest area dynamics were further studied by analysing the different types of fragmentation through forest fragmentation models. The computed forest fragmentation and landscape metrics show a decline of interior intact forests with a substantial increase in patch forest during 1972-2007.
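
A minimal sketch of the Gaussian maximum likelihood classification rule applied to multispectral pixels, with synthetic band values and class names standing in for real training data (the paper's full pipeline, accuracy assessment and change detection are not reproduced here):

    import numpy as np

    # Gaussian maximum likelihood (GML) rule: assign each pixel to the class
    # whose multivariate Gaussian (mean and covariance estimated from training
    # pixels) gives the highest log-likelihood.
    def train_gml(samples_by_class):
        params = {}
        for label, X in samples_by_class.items():
            mu = X.mean(axis=0)
            cov = np.cov(X, rowvar=False)
            params[label] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
        return params

    def classify_gml(params, pixels):
        labels = list(params)
        scores = []
        for label in labels:
            mu, cov_inv, logdet = params[label]
            d = pixels - mu
            # log-likelihood up to a constant: -0.5 * (log|C| + d^T C^-1 d)
            scores.append(-0.5 * (logdet + np.einsum('ij,jk,ik->i', d, cov_inv, d)))
        return np.array(labels)[np.argmax(scores, axis=0)]

    rng = np.random.default_rng(1)
    train = {"forest": rng.normal([40, 80, 30], 5, (200, 3)),
             "water":  rng.normal([20, 25, 10], 5, (200, 3))}
    model = train_gml(train)
    print(classify_gml(model, rng.normal([38, 78, 29], 5, (5, 3))))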


Abstract:

In order to improve the tracking and erosion performance of outdoor polymeric silicone rubber (SR) insulators used in HV power transmission lines, micron-sized inorganic fillers are usually added to the base SR matrix. In addition, insulators used in high-voltage dc transmission lines are designed to have increased creepage distance to mitigate the tracking and erosion problems. The ASTM D2303 standard gives a procedure for finding the tracking and erosion resistance of outdoor polymeric insulator weathershed material samples under laboratory conditions for ac voltages. In this paper, inclined-plane (IP) tracking and erosion tests similar to ASTM D2303 were conducted under both positive and negative dc voltages for silicone rubber samples filled with micron- and nano-sized particles, to understand the phenomena occurring during such tests. Micron-sized alumina trihydrate (ATH) and nano-sized alumina fillers were added to the silicone rubber matrix to improve the resistance to tracking and erosion. The leakage current during the tests and the eroded mass at the end of the tests were monitored. Scanning electron microscopy (SEM) and energy-dispersive X-ray (EDX) studies were conducted to understand the filler dispersion and the changes in surface morphology in both nanocomposite and microcomposite samples. The results suggest that nanocomposites performed better than microcomposites even for a small filler loading (4%) under both positive and negative dc stresses. It was also seen that the tracking and erosion performance of silicone rubber is better under negative dc than under positive dc voltage. EDX studies showed migration of different ions onto the surface of the sample during the IP test under positive dc, which led to an inferior performance compared to that under negative dc.


Abstract:

The financial crisis set off by the default of Lehman Brothers in 2008, which led to disastrous consequences for the global economy, has focused attention on regulation and pricing issues related to credit derivatives. Credit risk refers to the potential losses that can arise due to changes in the credit quality of financial instruments. These changes could be due to changes in the ratings, the market price (spread) or default on contractual obligations. Credit derivatives are financial instruments designed to mitigate the adverse impact that may arise due to credit risks. However, they also allow investors to take up purely speculative positions. In this article we provide a succinct introduction to the notions of credit risk and the credit derivatives market, and describe some of the important credit derivative products. There are two approaches to pricing credit derivatives, namely the structural and the reduced-form or intensity-based models. A crucial aspect of the modelling that we touch upon briefly in this article is the problem of calibration of these models. We hope to convey through this article the challenges that are inherent in credit risk modelling, the elegant mathematics and concepts that underlie some of the models, and the importance of understanding the limitations of the models.
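
For orientation, the display below sketches the simplest reduced-form (intensity-based) setup with a constant default intensity; the constant-hazard assumption and the resulting "credit triangle" approximation are textbook simplifications, not results of this article:

    % Constant-intensity reduced-form model: the default time \tau is
    % exponential with hazard rate \lambda, and R denotes the recovery rate.
    \Pr(\tau > t) = e^{-\lambda t}, \qquad
    s_{\mathrm{CDS}} \approx \lambda\,(1 - R) \quad \text{(the ``credit triangle'')}.

In this toy setting, calibration amounts to backing out the intensity lambda from an observed CDS spread and an assumed recovery rate R.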


Abstract:

Most Java programmers would agree that Java is a language that promotes a philosophy of “create and go forth”. By design, temporary objects are meant to be created on the heap, possibly used and then abandoned to be collected by the garbage collector. Excessive generation of temporary objects is termed “object churn” and is a form of software bloat that often leads to performance and memory problems. To mitigate this problem, many compiler optimizations aim at identifying objects that may be allocated on the stack. However, most such optimizations miss large opportunities for memory reuse when dealing with objects inside loops or when dealing with container objects. In this paper, we describe a novel algorithm that detects bloat caused by the creation of temporary container and String objects within a loop. Our analysis determines which objects created within a loop can be reused. We then describe a source-to-source transformation that efficiently reuses such objects. Empirical evaluation indicates that our solution can reduce up to 40% of temporary object allocations in large programs, resulting in a performance improvement that can be as high as a 20% reduction in run time, specifically when a program has a high churn rate or when the program is memory intensive and needs to run the GC often.
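
The paper's transformation targets Java containers and String objects; the hoist-and-reuse idea is sketched below in Python purely as a language-agnostic illustration (function names and data are invented):

    # Before: a fresh temporary container is churned on every iteration.
    def churny(records):
        totals = []
        for rec in records:
            tmp = [x * x for x in rec]     # new temporary per iteration
            totals.append(sum(tmp))
        return totals

    # After: the temporary container is allocated once and reused (cleared)
    # across iterations, mirroring the source-to-source reuse transformation.
    def reusing(records):
        totals = []
        tmp = []                           # hoisted out of the loop
        for rec in records:
            tmp.clear()                    # reuse instead of reallocating
            for x in rec:
                tmp.append(x * x)
            totals.append(sum(tmp))
        return totals

    print(churny([[1, 2], [3, 4]]) == reusing([[1, 2], [3, 4]]))   # True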


Abstract:

This paper proposes an algorithm for joint data detection and tracking of the dominant singular mode of a time-varying channel at the transmitter and receiver of a time-division duplex multiple-input multiple-output beamforming system. The proposed method is a modified expectation-maximization algorithm which utilizes an initial estimate to blindly track the dominant modes of the channel at the transmitter and the receiver, and simultaneously detects the unknown data. Furthermore, the estimates are constrained to be within a confidence interval of the previous estimate in order to improve the tracking performance and mitigate the effect of error propagation. Monte Carlo simulation results of the symbol error rate and the mean square inner product between the estimated and the true singular vector are plotted to show the performance benefits offered by the proposed method compared to existing techniques.
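
A rough sketch of the constraint idea, assuming the dominant right singular vector of each new channel estimate is pulled back toward the previous estimate when their inner product falls below a trust threshold (this is an illustration, not the paper's EM-based update; the threshold and blending weights are arbitrary):

    import numpy as np

    # Track the dominant right singular vector of a slowly varying channel and
    # constrain each new estimate to stay close to the previous one, limiting
    # error propagation from a bad channel estimate.
    def constrained_update(H_est, v_prev, min_corr=0.9):
        _, _, Vh = np.linalg.svd(H_est)
        v_new = Vh[0]
        if np.dot(v_new, v_prev) < 0:            # resolve sign ambiguity
            v_new = -v_new
        if np.dot(v_new, v_prev) < min_corr:     # outside the trust region:
            v_new = 0.5 * v_new + 0.5 * v_prev   # shrink toward previous estimate
            v_new /= np.linalg.norm(v_new)
        return v_new

    rng = np.random.default_rng(2)
    H = rng.standard_normal((4, 4))
    v = constrained_update(H, np.ones(4) / 2.0)
    print(np.linalg.norm(v))                     # unit-norm tracked singular vector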


Abstract:

The last decade has witnessed two unusually large tsunamigenic earthquakes. The devastation from the 2004 Sumatra-Andaman and the 2011 Tohoku-Oki earthquakes (both of moment magnitude >= 9.0) and their ensuing tsunamis comes as a harsh reminder of the need to assess and mitigate coastal hazards due to earthquakes and tsunamis worldwide. Along any given subduction zone, megathrust tsunamigenic earthquakes occur over intervals considerably longer than their documented histories, and thus 2004-type events may appear totally 'out of the blue'. In order to understand and assess the risk from tsunamis, we need to know their long-term frequency and magnitude, going beyond documented history to recent geological records. The ability to do this depends on our knowledge of the processes that govern subduction zones, their responses to interseismic and coseismic deformation, and on our expertise in identifying and relating tsunami deposits to earthquake sources. In this article, we review the current state of understanding of the recurrence of great thrust earthquakes along global subduction zones.


Abstract:

For necessary goods like water, under supply constraints, fairness considerations lead to negative externalities. The objective of this paper is to design an infinite-horizon contract, or relational contract (a type of long-term contract), that ensures self-enforcing (instead of court-enforced) behaviour by the agents to mitigate the externality due to fairness issues. In this contract, the consumer is induced to consume at the firm-supply level using the threat of a higher fair price in future time periods. The pricing mechanism computed in this paper internalizes the externality, is shown to be economically efficient, and provides revenue sufficiency.


Abstract:

To achieve food security and meet the demands of ever-growing human populations, farming systems have adopted unsustainable practices to produce more from a finite land area. This has been a cause for concern, mainly due to the often-irreversible damage done to otherwise productive agricultural landscapes. Agro-ecology is proclaimed to be deteriorating due to the eroding integrity of connected ecological mosaics and vulnerability to climate change. This has contributed to declining species diversity, loss of buffer vegetation, fragmentation of habitats, and loss of natural pollinators or predators, which eventually leads to a decline in ecosystem services. Currently, a hierarchy of conservation initiatives is being considered to restore the ecological integrity of agricultural landscapes. However, identifying a suitable conservation strategy is a daunting task in view of the socio-ecological factors that may constrain the choice of available strategies. One way to mitigate this situation and integrate biodiversity with agricultural landscapes is to implement offset mechanisms, which are compensatory and balancing approaches to restore the ecological health and function of an ecosystem. These need to be tailored to the history of location-specific agricultural practices and to the social, ecological and environmental conditions. Offset mechanisms can complement other initiatives through which farmers are insured against landscape-level risks such as droughts, fire and floods. For countries in the developing world with significant biodiversity and extensive agriculture, we should promote a comprehensive model of sustainable agricultural landscapes and ecosystem services, replicable at landscape to regional scales. Arguably, this model can be a potential option to sustain the integrity of the biodiversity mosaic in agricultural landscapes.


Abstract:

Software transactional memory (STM) is a promising programming paradigm for shared-memory multithreaded programs. While STM offers the promise of being less error-prone and more programmer-friendly than traditional lock-based synchronization, it also needs to be competitive in performance in order to be adopted in mainstream software. A major source of performance overhead in STM is transactional aborts. Conflict resolution and aborting a transaction typically happen at the transaction level, which has the advantage of being automatic and application-agnostic. However, it has a substantial disadvantage in that the STM declares the entire transaction as conflicting and hence aborts and re-executes it fully, instead of partially re-executing only those parts of the transaction that have been affected by the conflict. This "re-execute everything" approach has a significant adverse impact on STM performance. In order to mitigate the abort overheads, we propose a compiler-aided Selective Reconciliation STM (SR-STM) scheme, wherein certain transactional conflicts can be reconciled by performing partial re-execution of the transaction. Ours is a selective hybrid approach which uses compiler analysis to identify those data accesses which are legal and profitable candidates for reconciliation and applies partial re-execution only to these candidates, while other conflicting data accesses are handled by the default STM approach of abort and full re-execution. We describe the compiler analysis and code transformations required for supporting selective reconciliation. We find that SR-STM is effective in reducing the transactional abort overheads, improving the performance of a set of five STAMP benchmarks by 12.58% on average and by up to 22.34%.
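
A toy contrast between full and partial re-execution, with a transaction modelled as a list of independent steps (this is only a conceptual illustration; SR-STM's compiler analysis and reconciliation machinery are not shown):

    # "Re-execute everything" reruns every step after an abort; partial
    # re-execution recomputes only the step whose input a conflict invalidated.
    def full_reexecute(steps, inputs):
        return [f(x) for f, x in zip(steps, inputs)]        # abort -> redo all

    def partial_reexecute(cached, steps, inputs, conflicting):
        out = list(cached)
        for i in conflicting:                               # redo only affected steps
            out[i] = steps[i](inputs[i])
        return out

    steps = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]
    inputs = [10, 20, 30]
    cached = full_reexecute(steps, inputs)                  # first execution
    inputs[1] = 25                                          # conflict invalidates step 1
    print(partial_reexecute(cached, steps, inputs, [1]))    # [11, 50, 27]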


Abstract:

Increasing nitrate concentrations in groundwater are deleterious to human health, as ingestion of such water can cause methemoglobinemia in infants and even cancer in adults (desirable limit for nitrate as NO3: 45 mg/L, IS 10500-1991). Excess nitrate in groundwater is contributed mainly by disposal of sewage and excessive use of fertilizers. Though numerous technologies such as reverse osmosis, ion exchange, electro-dialysis and permeable reactive barriers using zerovalent iron exist, nitrate removal continues to be a challenging issue as the nitrate ion is highly mobile within the soil strata. Tapping the denitrification potential of soil denitrifiers, which are inherently available in the soil matrix, is the most sustainable approach to mitigate the accumulation of nitrate in groundwater. The in situ denitrification of sand and bentonite-enhanced sand (bentonite content = 5%) in the presence of an easily assimilable organic carbon source such as ethanol was studied. Batch studies showed that nitrate reduction follows first-order kinetics, with a rate constant of 5.3 x 10^-2 h^-1 for sand and 4.3 x 10^-2 h^-1 for bentonite-enhanced sand (BS) at 25 degrees C. Filter columns (height = 5 cm, diameter = 8.2 cm) were constructed using sand and bentonite-enhanced sand as filter media. The filtration rate through both filter columns was maintained at an average value of 2.60 cm/h. The nitrate removal rates through both filter media were assessed for a solution containing 22.6 mg NO3-N/L while keeping the C/N mass ratio at 3. For the sand filter column, the nitrate removal efficiency reached an average value of 97.6% after passing 50 pore volumes of the nitrate solution. For the bentonite-enhanced sand filter column, the average nitrate removal efficiency was 83.5%. The time required for the sand filter bed to reach effective operation was 100 hours, while the bentonite-enhanced sand filter bed did not require any such maturation period, because the presence of micropores in bentonite increases the hydraulic retention time of the solution inside the filter bed.
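
For orientation, the sketch below evaluates the reported first-order rate constants: under C(t) = C0 * exp(-k * t) the half-lives follow directly; the 24-hour contact time used here is an arbitrary illustration, not a value from the study:

    import math

    # First-order nitrate reduction: C(t) = C0 * exp(-k * t).
    # Rate constants reported in the abstract (per hour) at 25 degrees C.
    k_sand = 5.3e-2       # sand
    k_bs   = 4.3e-2       # bentonite-enhanced sand

    for name, k in [("sand", k_sand), ("bentonite-enhanced sand", k_bs)]:
        half_life = math.log(2) / k               # time for C to fall to C0/2
        remaining_24h = math.exp(-k * 24)         # fraction left after 24 h (illustrative)
        print(f"{name}: t1/2 = {half_life:.1f} h, "
              f"{remaining_24h:.0%} of nitrate remains after 24 h")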


Abstract:

There has been a continuous surge toward developing new biopolymers that exhibit better in vivo biocompatibility in terms of demonstrating a reduced foreign body response (FBR). One approach to mitigate the undesired FBR is to develop an implant capable of releasing anti-inflammatory molecules in a sustained manner over a long time period. Implants causing inflammation are also more susceptible to infection. In this article, the in vivo biocompatibility of a novel, biodegradable salicylic acid releasing polyester (SAP) has been investigated by subcutaneous implantation in a mouse model. The tissue response to SAP was compared with that of a widely used biodegradable polymer, poly(lactic acid-co-glycolic acid) (PLGA), as a control over three time points: 2, 4, and 16 weeks post-implantation. A long-term in vitro study illustrates a continuous, linear (zero-order) release of salicylic acid with a cumulative mass percent release rate of 7.34 x 10^-4 h^-1 over ~1.5-17 months. On the basis of physicochemical analysis, surface erosion for SAP and bulk erosion for PLGA have been confirmed as their dominant degradation modes in vivo. On the basis of histomorphometrical analysis of inflammatory cell densities and collagen distribution, as well as quantification of proinflammatory cytokine levels (TNF-alpha and IL-1 beta), a reduced foreign body response toward SAP with respect to that generated by PLGA has been unambiguously established. The favorable in vivo tissue response to SAP, as manifested by the uniform and well-vascularized encapsulation around the implant, is consistent with the decrease in inflammatory cell density and increase in angiogenesis with time. These observations, together with the demonstration of long-term and sustained release of salicylic acid, establish the potential use of SAP in improved matrices for tissue engineering and chronic wound healing.


Abstract:

Two-dimensional magnetic recording (TDMR) is an emerging technology that aims to achieve areal densities as high as 10 Tb/in^2 using sophisticated 2-D signal-processing algorithms. High areal densities are achieved by reducing the size of a bit to the order of the size of the magnetic grains, resulting in severe 2-D intersymbol interference (ISI). Jitter noise due to irregular grain positions on the magnetic medium is more pronounced at these areal densities. Therefore, a viable read-channel architecture for TDMR requires 2-D signal-detection algorithms that can mitigate 2-D ISI and combat noise comprising jitter and electronic components. The partial-response maximum-likelihood (PRML) detection scheme allows controlled ISI as seen by the detector. With the controlled and reduced span of 2-D ISI, the PRML scheme overcomes practical difficulties such as the Nyquist-rate signaling required for full-response 2-D equalization. As in the case of 1-D magnetic recording, jitter noise can be handled using a data-dependent noise-prediction (DDNP) filter bank within a 2-D signal-detection engine. The contributions of this paper are threefold: 1) we empirically study the jitter noise characteristics in TDMR as a function of grain density using a Voronoi-based granular media model; 2) we develop a 2-D DDNP algorithm to handle the media noise seen in TDMR; and 3) we develop techniques to design 2-D separable and nonseparable targets for generalized partial-response equalization for TDMR, which can be used along with a 2-D signal-detection algorithm. The DDNP algorithm is observed to give a 2.5 dB gain in SNR over uncoded data compared with noise-predictive maximum-likelihood detection for the same choice of channel model parameters, achieving a channel bit density of 1.3 Tb/in^2 with a media grain center-to-center distance of 10 nm. The DDNP algorithm is also observed to give a ~10% gain in areal density near 5 grains/bit. The proposed signal-processing framework can broadly scale to various TDMR realizations and areal density points.
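
As a minimal illustration of the 2-D ISI such detectors must handle, the sketch below convolves a +/-1 bit pattern with a small 2-D partial-response target and adds white noise; the target coefficients and noise level are arbitrary placeholders, not the paper's generalized partial-response targets:

    import numpy as np
    from scipy.signal import convolve2d

    # Toy 2-D readback model: recorded bits convolved with a small 2-D
    # partial-response target (arbitrary coefficients) plus additive noise.
    rng = np.random.default_rng(3)
    bits = rng.integers(0, 2, size=(16, 16)) * 2 - 1      # +/-1 recorded bits
    target = np.array([[0.1, 0.2, 0.1],
                       [0.2, 1.0, 0.2],                   # centre tap dominates
                       [0.1, 0.2, 0.1]])
    readback = convolve2d(bits, target, mode='same') + 0.1 * rng.standard_normal((16, 16))
    print(readback.shape)                                  # (16, 16) noisy ISI-corrupted samples

A 2-D detector (with or without DDNP) would then try to recover the bits from such ISI- and noise-corrupted samples.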