94 results for Reliability allocation


Relevance: 20.00%

Abstract:

Bandwidth allocation for multimedia applications in the case of network congestion and failure poses technical challenges due to the bursty and delay-sensitive nature of these applications. The growth of multimedia services on the Internet and the development of agent technology have led us to investigate new techniques for resolving bandwidth issues in multimedia communications. Agent technology is emerging as a flexible and promising solution for network resource management and QoS (Quality of Service) control in a distributed environment. In this paper, we propose an adaptive bandwidth allocation scheme for multimedia applications that deploys static and mobile agents. It is a run-time allocation scheme that functions at the network nodes. The technique adaptively finds an alternate patch-up route for every congested or failed link and reallocates bandwidth for the affected multimedia applications. The designed method has been tested analytically and through simulation with various network sizes and conditions. Results are presented to assess the performance and effectiveness of the approach. This work also demonstrates some of the benefits of agent-based schemes in providing flexibility, adaptability, software reusability, and maintainability. (C) 2004 Elsevier Inc. All rights reserved.
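The patch-up idea described above can be sketched as a breadth-first search over links with sufficient spare capacity; the topology, capacities, and demand below are hypothetical, and the paper's agent-based machinery is omitted.

```python
from collections import deque

# Hypothetical undirected links with spare capacity
LINKS = [("A", "B", 10), ("B", "C", 8), ("C", "D", 8), ("D", "A", 8)]

def patchup_route(links, failed, demand):
    """BFS for an alternate route between the endpoints of a failed link,
    using only links with at least `demand` units of spare capacity."""
    src, dst = failed
    adj = {}
    for u, v, cap in links:
        if {u, v} == {src, dst} or cap < demand:
            continue  # skip the failed link and links without spare capacity
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    parent, queue = {src: None}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:  # reconstruct the patch-up path
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nxt in adj.get(node, []):
            if nxt not in parent:
                parent[nxt] = node
                queue.append(nxt)
    return None  # no patch-up route with enough spare bandwidth

route = patchup_route(LINKS, failed=("A", "B"), demand=5)
```

With the failed link A-B and a demand of 5, the traffic is rerouted over the remaining three links of the square; raising the demand above every link's spare capacity yields no route.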

Relevance: 20.00%

Abstract:

In this paper, an analytical study of the effect of uncertainties on the seismic analysis of geosynthetic-reinforced soil (GRS) walls is presented. Using the limit equilibrium method and assuming a sliding-wedge failure mechanism, an analysis is conducted to evaluate the external stability of GRS walls subjected to earthquake loads. A target-reliability-based approach is used to estimate the probability of failure in three failure modes, viz., sliding, bearing, and eccentricity. The properties of the reinforced backfill, retained backfill, foundation soil, and geosynthetic reinforcement are treated as random variables. In addition, the uncertainties associated with horizontal seismic acceleration and the surcharge load acting on the wall are considered. The optimum length of reinforcement needed to maintain stability against the three failure modes while targeting various component and system reliability indices is obtained. Studies have also been carried out to examine the influence of various parameters on seismic stability in the three failure modes. The results are compared with those given by the first-order second-moment method and Monte Carlo simulation. In an illustrative example, the external stability of two walls, the Gould and Valencia walls, subjected to the Northridge earthquake is reexamined.
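As a minimal illustration of how a reliability index and failure probability are computed for one failure mode (via the first-order second-moment approximation the paper compares against), with hypothetical resistance and load statistics:

```python
import math

def fosm_beta(mu_r, sig_r, mu_s, sig_s):
    # Reliability index for the margin M = R - S with independent,
    # normally distributed resistance R and load effect S
    return (mu_r - mu_s) / math.sqrt(sig_r ** 2 + sig_s ** 2)

def failure_prob(beta):
    # Pf = Phi(-beta), evaluated via the complementary error function
    return 0.5 * math.erfc(beta / math.sqrt(2))

# Hypothetical statistics for one mode (e.g. sliding resistance vs.
# seismic driving force, both in kN/m):
beta = fosm_beta(mu_r=120.0, sig_r=15.0, mu_s=80.0, sig_s=20.0)
pf = failure_prob(beta)
```

Here beta = 40 / 25 = 1.6, giving a failure probability of about 5.5 percent; a target-reliability approach would instead fix beta and solve for the design variable (e.g. reinforcement length).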

Relevance: 20.00%

Abstract:

In the present study, results of reliability analyses of four selected rehabilitated earth dam sections, i.e., Chang, Tapar, Rudramata, and Kaswati, under pseudostatic loading conditions are presented. Using response surface methodology in combination with the first-order reliability method and numerical analysis, reliability index (beta) values are obtained, and the results are interpreted in conjunction with conventional factor-of-safety values. The influence of variability in the input soil shear strength parameters, the horizontal seismic coefficient (alpha(h)), and the location of the reservoir full level on the stability assessment of the earth dam sections is discussed in a probabilistic framework. A comparison with results obtained from another method of reliability analysis, viz., Monte Carlo simulation combined with the limit equilibrium approach, provided a basis for discussing the stability of earth dams in probabilistic terms. The results of the analysis suggest that the considered earth dam sections are reliable and are expected to perform satisfactorily.
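A minimal sketch of the comparison method mentioned above, Monte Carlo simulation combined with a limit equilibrium model; the pseudostatic infinite-slope factor of safety and the input statistics are illustrative assumptions, not the dam sections analyzed in the paper:

```python
import math
import random
import statistics

def pseudostatic_fs(phi_deg, kh, slope_deg):
    # Simplified pseudostatic factor of safety for an infinite slope in
    # cohesionless soil with horizontal seismic coefficient kh
    a, p = math.radians(slope_deg), math.radians(phi_deg)
    return (math.cos(a) - kh * math.sin(a)) * math.tan(p) / (
        math.sin(a) + kh * math.cos(a))

def mc_reliability(n=20000, seed=42):
    # Monte Carlo estimate of Pf and the equivalent reliability index beta
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        phi = rng.gauss(30.0, 2.0)             # friction angle, degrees
        kh = max(0.0, rng.gauss(0.10, 0.02))   # seismic coefficient
        if pseudostatic_fs(phi, kh, slope_deg=20.0) < 1.0:
            fails += 1
    pf = fails / n
    beta = -statistics.NormalDist().inv_cdf(pf)
    return pf, beta

pf, beta = mc_reliability()
```

With these assumed statistics the failure probability comes out at a few percent, i.e. beta roughly between 1.5 and 2, which is the kind of value a response-surface/FORM analysis would then be checked against.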

Relevance: 20.00%

Abstract:

One of the foremost design considerations in microelectronics miniaturization is the use of embedded passives, which provide a practical solution. In a typical circuit, over 80 percent of the electronic components are passives such as resistors, inductors, and capacitors, which can take up almost 50 percent of the entire printed circuit board area. By integrating passive components within the substrate instead of on the surface, embedded passives reduce the system real estate, eliminate the need for discrete components and their assembly, enhance electrical performance and reliability, and potentially reduce the overall cost. Moreover, the technology is lead-free. Even with these advantages, embedded passive technology is at a relatively immature stage, and more characterization and optimization are needed for the practical applications leading to its commercialization. This paper presents the entire process, from design and fabrication to electrical characterization and reliability testing, of embedded passives on a multilayered microvia organic substrate. Two test vehicles focusing on resistors and capacitors have been designed and fabricated. The embedded capacitors in this study are made with a polymer/ceramic nanocomposite (BaTiO3) material to take advantage of the low processing temperature of polymers and the relatively high dielectric constant of ceramics; the values of these capacitors range from 50 pF to 1.5 nF, with capacitance per area of approximately 1.5 nF/cm(2). Limited high-frequency measurement of these capacitors was performed. Furthermore, reliability assessments comprising thermal shock and temperature-humidity tests based on JEDEC standards were carried out. The resistors used in this work are of three types: 1) carbon-ink-based polymer thick film (PTF), 2) resistor foils with known sheet resistivities, laminated to the printed wiring board (PWB) during a sequential build-up (SBU) process, and 3) thin-film resistors plated by an electroless method.
Realization of embedded resistors on conventional board-level high-loss epoxy (loss of approximately 0.015 at 1 GHz) and the proposed low-loss BCB dielectric (approximately 0.0008 at > 40 GHz) has been explored in this study. Ni-P and Ni-W-P alloys were plated using conventional electroless plating, and NiCr and NiCrAlSi foils were used for the foil transfer process. For the first time, benzocyclobutene (BCB) has been proposed as a board-level dielectric for an advanced System-on-Package (SOP) module, primarily due to its attractive low-loss (for RF applications) and thin-film (for high-density wiring) properties. Although embedded passives are more reliable because they eliminate solder joint interconnects, they also introduce other concerns such as cracks, delamination, and component instability. More layers may be needed to accommodate the embedded passives, and the various materials within the substrate may cause significant thermo-mechanical stress due to coefficient of thermal expansion (CTE) mismatch. In this work, numerical models of embedded capacitors have been developed to qualitatively examine the effects of process conditions and electrical performance under thermo-mechanical deformations. Also, a prototype working product with a board-level design including embedded resistors and capacitors is under way. Preliminary results of these efforts are presented.
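The reported capacitance density can be sanity-checked with the parallel-plate formula; the dielectric constant and film thickness below are hypothetical values chosen only to show that about 1.5 nF/cm^2 is plausible for a BaTiO3-filled nanocomposite film:

```python
EPS0 = 8.854e-12  # permittivity of free space, F/m

def cap_density_nf_per_cm2(eps_r, thickness_m):
    # Parallel-plate capacitance per unit area, C/A = eps0 * eps_r / d,
    # converted from F/m^2 to nF/cm^2
    return EPS0 * eps_r / thickness_m * 1e9 / 1e4

# Assumed: relative permittivity ~17 for the composite, 10 um film
density = cap_density_nf_per_cm2(eps_r=17.0, thickness_m=10e-6)
```

This gives roughly 1.5 nF/cm^2, consistent with the values quoted above; thinner films or higher ceramic loading would raise the density proportionally.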

Relevance: 20.00%

Abstract:

The problem of time-variant reliability analysis of existing structures subjected to stationary random dynamic excitations is considered. The study assumes that samples of the dynamic response of the structure, under the action of external excitations, have been measured at a set of sparse points on the structure. The utilization of these measurements in updating reliability models, postulated prior to making any measurements, is considered. This is achieved by using dynamic state estimation methods which combine results from Markov process theory and Bayes' theorem. The uncertainties present in the measurements as well as in the postulated model for the structural behaviour are accounted for. The samples of external excitations are taken to emanate from known stochastic models, and allowance is made for the ability (or lack of it) to measure the applied excitations. The future reliability of the structure is modeled using the expected structural response conditioned on all the measurements made. This expected response is shown to have a time-varying mean and a random component that can be treated as weakly stationary. For linear systems, an approximate analytical solution to the problem of reliability model updating is obtained by combining the theories of the discrete Kalman filter and level crossing statistics. For nonlinear systems, the problem is tackled by combining particle filtering strategies with data-based extreme value analysis. In all these studies, the governing stochastic differential equations are discretized using strong forms of Ito-Taylor discretization schemes. The possibility of using conditional simulation strategies, when the applied external actions are measured, is also considered. The proposed procedures are exemplified by considering the reliability analysis of a few low-dimensional dynamical systems based on synthetically generated measurement data. The performance of the procedures developed is also assessed based on a limited amount of pertinent Monte Carlo simulations. (C) 2010 Elsevier Ltd. All rights reserved.
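For the linear case, the level-crossing ingredient can be sketched via Rice's formula for a stationary Gaussian response, with reliability over a duration estimated under the usual Poisson crossing assumption; all numbers are illustrative:

```python
import math

def upcrossing_rate(level, sigma_x, sigma_xdot, mean=0.0):
    # Rice's formula: mean rate of up-crossings of `level` by a stationary
    # Gaussian process with response std sigma_x and velocity std sigma_xdot
    return (sigma_xdot / (2.0 * math.pi * sigma_x)) * math.exp(
        -(level - mean) ** 2 / (2.0 * sigma_x ** 2))

def survival_probability(level, sigma_x, sigma_xdot, duration, mean=0.0):
    # Poisson approximation: rare up-crossings treated as independent events
    nu = upcrossing_rate(level, sigma_x, sigma_xdot, mean)
    return math.exp(-nu * duration)

# Illustrative: unit-variance response, zero-crossing rate of 1 Hz,
# safe threshold at 3 standard deviations, 10 s exposure
p = survival_probability(level=3.0, sigma_x=1.0, sigma_xdot=2.0 * math.pi,
                         duration=10.0)
```

In the paper, the mean and variance entering such a formula would come from the Kalman-filter-updated (measurement-conditioned) response, rather than from an a priori model.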

Relevance: 20.00%

Abstract:

Reliability analysis for computing systems in aerospace applications must account for the actual computations the system performs in its use environment. This paper introduces a theoretical nonhomogeneous Markov model for such applications.
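A minimal sketch of what "nonhomogeneous" means here: the per-step transition matrix depends on the step index, e.g. because different computation phases stress the system differently. The phase schedule and failure probabilities below are hypothetical, not the paper's model:

```python
def step_matrix(k):
    # Time-varying transition matrix: the failure probability is assumed
    # higher during "busy" computation phases (steps 8-9 of every 10)
    p_fail = 0.05 if k % 10 >= 8 else 0.01
    return [[1.0 - p_fail, p_fail],  # state 0: operational
            [0.0, 1.0]]              # state 1: failed (absorbing)

def propagate(v, steps):
    # Row-vector times matrix product over the nonhomogeneous chain
    for k in range(steps):
        m = step_matrix(k)
        v = [sum(v[j] * m[j][i] for j in range(2)) for i in range(2)]
    return v

dist = propagate([1.0, 0.0], steps=20)  # dist[0] = P(still operational)
```

Because the failed state is absorbing, the survival probability is just the product of the per-step success probabilities, which a homogeneous model with one averaged rate would not reproduce step for step.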

Relevance: 20.00%

Abstract:

It is important to identify the "correct" number of topics in mechanisms like Latent Dirichlet Allocation (LDA), since it determines the quality of the features presented to classifiers such as SVMs. In this work we propose a measure to identify the correct number of topics and offer empirical evidence in its favor in terms of classification accuracy and the number of topics naturally present in the corpus. We show the merit of the measure by applying it to real-world as well as synthetic data sets (both text and images). In proposing this measure, we view LDA as a matrix factorization mechanism, wherein a given corpus C is split into two matrix factors M1 and M2, as given by C(d*w) = M1(d*t) x M2(t*w), where d is the number of documents in the corpus and w is the size of the vocabulary. The quality of the split depends on t, the number of topics chosen. The measure is computed in terms of the symmetric KL-divergence of salient distributions derived from these matrix factors. We observe that the divergence values are higher for non-optimal numbers of topics; this is shown by a "dip" at the right value of t.
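The symmetric KL-divergence at the heart of the measure can be sketched as follows; in the paper it is applied to distributions derived from the matrix factors (e.g. from the singular values of M1), whereas the inputs here are toy distributions:

```python
import math

def sym_kl(p, q, eps=1e-12):
    # Symmetric KL divergence between two discrete distributions;
    # eps guards against zero probabilities
    kl = lambda a, b: sum(x * math.log((x + eps) / (y + eps))
                          for x, y in zip(a, b))
    return kl(p, q) + kl(q, p)

same = sym_kl([0.5, 0.5], [0.5, 0.5])   # identical distributions
far = sym_kl([0.5, 0.5], [0.9, 0.1])    # clearly different distributions
```

Sweeping t and plotting this divergence between the factor-derived distributions is what produces the "dip" at the natural number of topics.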

Relevance: 20.00%

Abstract:

Methodologies are presented for the minimization of risk in a river water quality management problem. A risk minimization model is developed to minimize the risk of low water quality along a river in the face of conflict among various stakeholders. The model consists of three parts: a water quality simulation model, a risk evaluation model with uncertainty analysis, and an optimization model. Sensitivity analysis, First Order Reliability Analysis (FORA), and Monte Carlo simulations are performed to evaluate the fuzzy risk of low water quality. Fuzzy multiobjective programming is used to formulate the multiobjective model. Probabilistic Global Search Lausanne (PGSL), a recently developed global search algorithm, is used for solving the resulting non-linear optimization problem. The algorithm is based on the assumption that better sets of points are more likely to be found in the neighborhood of good sets of points, and it therefore intensifies the search in regions that contain good solutions. Another model is developed for risk minimization which deals only with the moments of the generated probability density functions of the water quality indicators. Suitable skewness values of the water quality indicators, which lead to low fuzzy risk, are identified. Results of the models are compared with the results of a deterministic fuzzy waste load allocation model (FWLAM) when the methodologies are applied to a case study of the Tunga-Bhadra river system in southern India, with a steady-state BOD-DO model. The fractional removal levels resulting from the risk minimization model are slightly higher, but they result in a significant reduction in the risk of low water quality. (c) 2005 Elsevier Ltd. All rights reserved.
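The fuzzy-risk ingredient can be sketched as a Monte Carlo estimate of the expected membership in the "low water quality" event; the dissolved-oxygen thresholds and statistics below are hypothetical, not the case-study values:

```python
import random

def low_quality_membership(do_mg_l, desirable=6.0, minimum=4.0):
    # Linear fuzzy membership of "low water quality" in terms of dissolved
    # oxygen: 0 above the desirable level, 1 below the minimum permissible
    # level (both thresholds are assumed for illustration)
    if do_mg_l >= desirable:
        return 0.0
    if do_mg_l <= minimum:
        return 1.0
    return (desirable - do_mg_l) / (desirable - minimum)

def fuzzy_risk(mean_do, sd_do, n=50000, seed=7):
    # Monte Carlo estimate of the expected membership (the fuzzy risk)
    rng = random.Random(seed)
    return sum(low_quality_membership(rng.gauss(mean_do, sd_do))
               for _ in range(n)) / n

risk_clean = fuzzy_risk(mean_do=7.0, sd_do=0.5)  # DO well above desirable
risk_poor = fuzzy_risk(mean_do=5.0, sd_do=0.5)   # DO between thresholds
```

An optimizer such as PGSL would then choose fractional removal levels that push the simulated DO distribution toward the low-risk regime.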

Relevance: 20.00%

Abstract:

In this paper we propose a novel technique to model and analyze the performability of parallel and distributed architectures using GSPN-reward models.
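Once a GSPN's reachability graph has been reduced to a Markov reward model, performability amounts to weighting reward rates by state probabilities. A sketch on a hypothetical two-state availability model (the rates and rewards are assumptions, not from the paper):

```python
def expected_reward_rate(pi, rewards):
    # Performability: steady-state probabilities weighted by reward rates
    return sum(p * r for p, r in zip(pi, rewards))

# Hypothetical two-state model: failure rate lam, repair rate mu (per hour)
lam, mu = 0.01, 1.0
pi_up = mu / (lam + mu)            # steady-state probability of "up"
perf = expected_reward_rate([pi_up, 1.0 - pi_up], [1.0, 0.0])
```

With reward 1 for full service and 0 when down, the expected reward rate is just the steady-state availability, here about 0.99.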

Relevance: 20.00%

Abstract:

FDDI (Fibre Distributed Data Interface) is a 100 Mbit/s token ring network with two counter-rotating optical rings. In this paper, various possible faults (such as a lost token or link failures) are considered, and fault detection, the ring recovery process in case of a failure, and the reliability mechanisms provided are studied. We suggest a new method to improve the fault detection and ring recovery process. The performance improvement, in terms of station queue length and average delay, is compared with the performance of the existing fault detection and ring recovery process through simulation. We also suggest a modification of the physical configuration of FDDI networks, within the guidelines set by the standard, to make the network more reliable. It is shown that, unlike in the existing FDDI network, full connectivity is maintained among the stations even when multiple single link failures occur. A distributed algorithm is proposed for link reconfiguration of the modified FDDI network when many successive as well as simultaneous link failures occur. The performance of the modified FDDI network under link failures is studied through simulation and compared with that of the existing FDDI network.
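The connectivity claim can be made concrete with a plain single ring, which survives one link failure but partitions under two simultaneous failures; a quick reachability check illustrates this (the modified FDDI configuration itself is not modeled here):

```python
def connected_after_failures(n, failed):
    # Stations 0..n-1 on a single ring; `failed` is a set of ring links
    # (i, i+1 mod n). Traverse surviving links to test whether every
    # station remains reachable from station 0.
    adj = {i: [] for i in range(n)}
    for i in range(n):
        j = (i + 1) % n
        if (i, j) not in failed and (j, i) not in failed:
            adj[i].append(j)
            adj[j].append(i)
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen) == n
```

In FDDI, the second counter-rotating ring lets stations wrap around a single failure, which is what a single-ring model with one removed link mimics; two failures at different points partition a ring, motivating the reconfiguration scheme proposed above.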

Relevance: 20.00%

Abstract:

In this paper we propose a multiple resource interaction model in a game-theoretic framework to solve resource allocation problems in theater-level military campaigns. An air raid campaign using SEAD aircraft and bombers against an enemy target defended by air defense units is considered as the basic platform. Conditions for the existence of a saddle point in pure strategies are proved, and explicit feedback strategies are obtained for a simplified model with a linear attrition function limited by resource availability. An illustrative example demonstrates the key features.
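In a finite matrix-game abstraction of such a model, a pure-strategy saddle point exists exactly when the maximin and minimax values coincide; the payoff matrices below are toy examples, not the campaign model itself:

```python
def pure_saddle_point(A):
    # A[i][j] is the payoff to the row (maximizing) player; returns
    # (row, col, value) if a pure-strategy saddle point exists, else None
    maximin = max((min(row), i) for i, row in enumerate(A))
    cols = list(zip(*A))
    minimax = min((max(col), j) for j, col in enumerate(cols))
    if maximin[0] == minimax[0]:   # maximin value equals minimax value
        return maximin[1], minimax[1], maximin[0]
    return None

saddle = pure_saddle_point([[4, 3, 8],
                            [9, 5, 6],
                            [7, 4, 5]])   # value 5 at row 1, column 1
```

A game like matching pennies, [[1, -1], [-1, 1]], returns None: its value is achievable only in mixed strategies, which is why existence conditions for pure-strategy saddle points need to be proved rather than assumed.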

Relevance: 20.00%

Abstract:

Reduction of carbon emissions is of paramount importance in the context of global warming. Countries and global companies are now engaged in understanding systematic ways of achieving well-defined emission targets. In fact, carbon credits have become significant and strategic instruments of finance for countries and global companies. In this paper, we formulate and suggest a solution to the carbon allocation problem, which involves determining a cost-minimizing allocation of carbon credits among different emitting agents. We address this challenge in the context of a global company faced with the challenge of determining an allocation of carbon credit caps among its divisions in a cost-effective way. The problem is formulated as a reverse auction in which the company plays the role of a buyer or carbon planning authority and the different divisions within the company are the emitting agents, which specify cost curves for carbon credit reductions. Two natural variants of the problem are considered: (a) with an unlimited budget and (b) with a limited budget. Suitable assumptions are made on the cost curves, and in each of the two cases we show that the resulting problem formulation is a knapsack problem that can be solved optimally using a greedy heuristic. The solution of the allocation problem provides critical decision support to global companies seriously engaged in green programs.
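For increasing marginal costs (convex cost curves), the greedy solution of the unlimited-budget variant simply buys the cheapest reduction units first; the divisions, supply segments, and costs below are hypothetical:

```python
def allocate_reductions(cost_steps, target):
    """Greedy allocation of a total emission reduction target.
    cost_steps: (division, units, unit_cost) supply segments approximating
    each division's cost curve, assumed to have increasing marginal cost."""
    steps = sorted(cost_steps, key=lambda s: s[2])  # cheapest units first
    alloc, total_cost, remaining = {}, 0.0, target
    for div, units, unit_cost in steps:
        if remaining <= 0:
            break
        take = min(units, remaining)
        alloc[div] = alloc.get(div, 0) + take
        total_cost += take * unit_cost
        remaining -= take
    return alloc, total_cost

# Division A: 10 units at cost 2, then 10 at cost 5;
# Division B: 8 units at cost 3, then 12 at cost 4
alloc, cost = allocate_reductions(
    [("A", 10, 2), ("A", 10, 5), ("B", 8, 3), ("B", 12, 4)], target=25)
```

With these segments the greedy pass takes A's 10 cheap units, B's 8 units at cost 3, and 7 more of B's units at cost 4, for a total cost of 72; for convex (increasing-marginal-cost) curves this cheapest-first order is exactly why the knapsack formulation admits an optimal greedy solution. A limited budget would additionally stop once the money runs out.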