90 results for release planning
Abstract:
Seismic microzonation has generally been recognized as the most accepted tool in seismic hazard assessment and risk evaluation. In general, risk can be reduced by reducing the hazard, the vulnerability or the value at risk. Since the earthquake hazard cannot be reduced, one has to concentrate on vulnerability and value at risk. The vulnerability of an urban area or municipality depends on the vulnerability of its infrastructure and the redundancies within that infrastructure. The earthquake risk comprises the damage to buildings, the number of people killed or injured, and the economic losses caused by an earthquake with a given return period. The principal approaches for reducing these losses are to avoid, where possible, siting buildings and infrastructure in high-hazard areas, and further to ensure that buildings and infrastructure are designed and constructed to resist the expected earthquake loads. This is possible only if the hazard can be assessed at local scales. Seismic microzonation maps provide the basis for scientifically based decision-making to reduce earthquake risk for government and public agencies, private owners and the general public. Further, seismic microzonation carried out at an appropriate scale provides a valuable tool for disaster mitigation planning and emergency response planning for urban centres and municipalities, as it identifies the areas of a city or municipality that are most likely to experience serious damage in the event of an earthquake.
Abstract:
Characteristics of the process of entrainment in plane mixing layers, and their changes with compressibility and heat release, were studied using temporal DNS with simultaneous fluid-packet tracking. The convective Mach numbers of the simulations are 0.15, 0.7 and 1.1. The Reynolds number is quite high (between 11 000 and 37 000 based on layer width and velocity difference) and is above the mixing transition. The study confirms recent findings in round jets: first, the engulfed fluid volume and its growth rate are both very small compared with the volume of the turbulent region and its growth rate, respectively; secondly, the process most often occurs close to the turbulent/non-turbulent boundaries. A new finding is that both compressibility and heat release retard the entrainment process, so that it takes an O(1) time for vorticity or scalar levels to grow even after growth has been initiated. This delay manifests itself as the fall in mixing-layer growth rate as compressibility and heat-release levels increase.
Abstract:
The problem of scheduling divisible loads in distributed computing systems in the presence of processor release times is considered. The objective is to find the optimal sequence of load distribution and the optimal load fractions assigned to each processor in the system such that the processing time of the entire load is minimized. This is a difficult combinatorial optimization problem, and hence a genetic-algorithm approach is presented for its solution.
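A minimal sketch of the genetic-algorithm idea described above, under illustrative assumptions not taken from the paper: a linear communication cost per unit load, processors with hypothetical per-unit compute times and release times, and a chromosome encoding both a distribution sequence and raw load weights. It is not the authors' algorithm, only an instance of the general approach.

```python
import random

def makespan(order, fracs, release, comm, speed):
    """Sequential distribution: the master sends each fraction in turn
    (communication time comm * fraction); a processor starts computing
    only after its data has arrived AND after its release time."""
    t_comm, finish = 0.0, 0.0
    for i in order:
        t_comm += comm * fracs[i]              # data arrival for processor i
        start = max(t_comm, release[i])
        finish = max(finish, start + speed[i] * fracs[i])
    return finish

def ga_schedule(release, comm, speed, pop_size=40, gens=200, seed=1):
    random.seed(seed)
    n = len(release)

    def normalize(w):
        s = sum(w)
        return [x / s for x in w]

    def fitness(ch):
        order, w = ch
        return makespan(order, normalize(w), release, comm, speed)

    # seed the population with the naive chromosome (identity order, equal split)
    naive = (list(range(n)), [1.0] * n)
    pop = [naive] + [(random.sample(range(n), n),
                      [random.uniform(0.1, 1.0) for _ in range(n)])
                     for _ in range(pop_size - 1)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]        # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            (o1, w1), (o2, w2) = random.sample(survivors, 2)
            cut = random.randrange(1, n)       # order crossover: keep a prefix
            order = o1[:cut] + [p for p in o2 if p not in o1[:cut]]
            w = [(a + b) / 2 for a, b in zip(w1, w2)]
            if random.random() < 0.3:          # mutate one load weight
                w[random.randrange(n)] *= random.uniform(0.5, 1.5)
            children.append((order, w))
        pop = survivors + children
    best = min(pop, key=fitness)
    return fitness(best), best
```

With hypothetical release times [0, 1, 2], communication cost 0.1 and per-unit compute times [1, 2, 3], the GA tends to shift load toward the early, fast processor and improves on the equal-split schedule.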
Abstract:
This study considers the scheduling problem observed in the burn-in operation of semiconductor final testing, where jobs are associated with release times, due dates, processing times, sizes, and non-agreeable release times and due dates. The burn-in oven is modeled as a batch-processing machine which can process a batch of several jobs as long as the total size of the jobs does not exceed the machine capacity; the processing time of a batch is equal to the longest processing time among all the jobs in the batch. Due to the importance of on-time delivery in semiconductor manufacturing, the objective of this problem is to minimize total weighted tardiness. We formulate the scheduling problem as an integer linear programming model and empirically show its computational intractability. Because of this intractability, we propose a few simple greedy heuristics and a meta-heuristic, simulated annealing (SA). A series of computational experiments is conducted to evaluate the performance of the proposed heuristics against exact solutions on various small-size problem instances and against estimated optimal solutions on various real-life, large-size problem instances. The computational results show that the SA algorithm, with an initial solution obtained using our own greedy heuristic, consistently finds a robust solution in a reasonable amount of computation time.
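A minimal sketch of a greedy heuristic of the kind the abstract mentions, under assumptions of my own: jobs are sorted by due date (EDD), packed first-fit into batches under the capacity limit, and the batches run in order, with each batch starting no earlier than the latest release time of its jobs. The job data and capacity below are hypothetical.

```python
def greedy_batch_twt(jobs, capacity):
    """EDD-based greedy for a batch-processing machine.
    Each job is a dict with size, proc, release, due, weight.
    Returns the total weighted tardiness of the resulting schedule."""
    jobs = sorted(jobs, key=lambda j: j["due"])          # earliest due date first
    batches = []
    for job in jobs:                                     # first-fit packing
        if batches and sum(j["size"] for j in batches[-1]) + job["size"] <= capacity:
            batches[-1].append(job)
        else:
            batches.append([job])
    t, twt = 0.0, 0.0
    for batch in batches:
        ready = max(j["release"] for j in batch)         # non-agreeable release times
        proc = max(j["proc"] for j in batch)             # batch time = longest job
        t = max(t, ready) + proc
        twt += sum(j["weight"] * max(0.0, t - j["due"]) for j in batch)
    return twt
```

For example, with capacity 10 and three jobs of sizes 4, 5 and 6, the first two share a batch and the third starts after it, so tardiness comes only from jobs whose batch finishes past their due date.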
Abstract:
A study is presented which is aimed at developing techniques suitable for effective planning and efficient operation of fleets of aircraft typical of the air force of a developing country. An important aspect of fleet management, the problem of resource allocation for achieving prescribed operational effectiveness of the fleet, is considered. For analysis purposes, it is assumed that the planes operate in a single flying-base, repair-depot environment. The perennial problem of resource allocation for fleet and facility buildup that faces planners is modeled and solved as an optimal control problem. These models contain two "policy" variables representing investments in aircraft and repair facilities. The feasibility of decentralized control is explored by assuming the two policy variables are under the control of two independent decision-makers guided by different and often poorly coordinated objectives.
Abstract:
This paper describes the use of simulation in the planning and operation of a small fleet of aircraft typical of the air force of a developing country. We consider a single flying base, where the operationally ready aircraft are stationed, and a repair depot, where the planes are overhauled. The measure of effectiveness used is "system availability", the percentage of airplanes that are usable. The system is modeled in GPSS as a cyclic queue process. The simulation model is used to perform sensitivity analyses and to validate the principal assumptions of the analytical model on which the simulation model is based.
Abstract:
The inverse relationship that exists between thyroxine and the vitamin A level of plasma has been examined in chicken. Thyroxine treatment leads to a decrease in the level of the vitamin A carrier proteins, retinol-binding protein and prealbumin-2, in plasma and liver. There is an accumulation of vitamin A in the liver, with a greater proportion of vitamin A alcohol being present compared with control birds. Under thyroxine treatment there is enhanced plasma turnover of retinol-binding protein and prealbumin-2, while their rates of synthesis are only marginally increased. Amino acid supplementation partially counteracts the effects of thyroxine treatment: in thyroxine-treated birds it does not alter the plasma turnover rates of retinol-binding protein and prealbumin-2 but substantially increases their rates of synthesis. In hyperthyroidism, the release of vitamin A into circulation is impaired because retinol-binding protein is inadequately available, its enhanced plasma turnover not being compensated for by synthesis.
Abstract:
The objective of the present paper is to select the best compromise irrigation planning strategy for the case study of the Jayakwadi irrigation project, Maharashtra, India. A four-phase methodology is employed. In phase 1, separate linear programming (LP) models are formulated for the three objectives, namely net economic benefits, agricultural production and labour employment. In phase 2, nondominated (compromise) irrigation planning strategies are generated using the constraint method of multiobjective optimisation. In phase 3, a Kohonen neural network (KNN) based classification algorithm is employed to sort the nondominated irrigation planning strategies into smaller groups. In phase 4, a multicriterion analysis (MCA) technique, namely Compromise Programming, is applied to rank the strategies obtained from phase 3. It is concluded that the above integrated methodology is effective for modelling multiobjective irrigation planning problems, and that the present approach can be extended to situations where the number of irrigation planning strategies is even larger.
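A minimal sketch of the constraint method from phase 2, on a hypothetical two-variable problem of my own (none of the coefficients come from the paper): the primary objective is maximized while the secondary objective is held above a varying threshold epsilon, each single-objective LP being solved here by simple vertex enumeration rather than a production solver.

```python
def solve_lp_2d(constraints, obj):
    """Maximize obj = (c1, c2) over {(x, y): a*x + b*y <= c for each (a, b, c)}
    by enumerating vertices (intersections of constraint-boundary pairs)."""
    best, best_val = None, None
    n = len(constraints)
    for i in range(n):
        a1, b1, c1 = constraints[i]
        for j in range(i + 1, n):
            a2, b2, c2 = constraints[j]
            det = a1 * b2 - a2 * b1
            if abs(det) < 1e-12:                  # parallel boundaries
                continue
            x = (c1 * b2 - c2 * b1) / det
            y = (a1 * c2 - a2 * c1) / det
            if all(a * x + b * y <= c + 1e-9 for a, b, c in constraints):
                val = obj[0] * x + obj[1] * y
                if best_val is None or val > best_val:
                    best, best_val = (x, y), val
    return best, best_val

def constraint_method(base, primary, secondary, eps_values):
    """For each epsilon, maximize the primary objective subject to
    secondary >= epsilon (encoded as -secondary <= -epsilon)."""
    frontier = []
    for eps in eps_values:
        cons = base + [(-secondary[0], -secondary[1], -eps)]
        point, val = solve_lp_2d(cons, primary)
        if point is not None:
            frontier.append((eps, val))
    return frontier
```

Sweeping epsilon traces the nondominated set: as the secondary-objective floor rises, the achievable primary objective is non-increasing, which is exactly the trade-off curve phases 3 and 4 then classify and rank.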
Abstract:
In this paper, we propose a novel and efficient algorithm for modelling sub-65 nm clock interconnect networks in the presence of process variation. We develop a method for delay analysis of interconnects considering the impact of Gaussian metal process variations. The resistance and capacitance of a distributed RC line are expressed as correlated Gaussian random variables, which are then used to compute the standard deviation of the delay probability distribution function (PDF) at all nodes in the interconnect network. The main objective is to find the delay PDF at lower computational cost; the approach converges in probability distribution, though not in the mean of the delay. We validate our approach against SPICE-based Monte Carlo simulations and show that the current method entails significantly lower computational cost.
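A minimal sketch of the core statistical idea, not the authors' algorithm: for a single lumped RC stage with delay proportional to the product R*C, the exact variance of a product of two jointly Gaussian variables can be compared against a plain Monte Carlo estimate (the role SPICE-based Monte Carlo plays in the paper). The means, sigmas and correlation below are illustrative.

```python
import math
import random

def delay_std_analytic(mu_r, sig_r, mu_c, sig_c, rho):
    """Exact std of R*C for jointly Gaussian (R, C):
    Var(RC) = mu_c^2 sig_r^2 + mu_r^2 sig_c^2
              + 2 mu_r mu_c rho sig_r sig_c + sig_r^2 sig_c^2 (1 + rho^2)."""
    var = (mu_c**2 * sig_r**2 + mu_r**2 * sig_c**2
           + 2 * mu_r * mu_c * rho * sig_r * sig_c
           + sig_r**2 * sig_c**2 * (1 + rho**2))
    return math.sqrt(var)

def delay_std_mc(mu_r, sig_r, mu_c, sig_c, rho, n=200_000, seed=7):
    """Monte Carlo estimate of the same std, drawing correlated Gaussians
    via the two-variable Cholesky construction."""
    random.seed(seed)
    samples = []
    for _ in range(n):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        r = mu_r + sig_r * z1
        c = mu_c + sig_c * (rho * z1 + math.sqrt(1 - rho**2) * z2)
        samples.append(r * c)
    mean = sum(samples) / n
    return math.sqrt(sum((s - mean) ** 2 for s in samples) / (n - 1))
```

The closed-form expression needs one evaluation where Monte Carlo needs hundreds of thousands of samples, which is the cost asymmetry the abstract exploits.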
Abstract:
The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst, as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards in chemical industries. Fault tree analysis (FTA) is an established technique for hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from the storage and filling facility of a chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and a proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation.
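The quantitative side of FTA reduces, for independent basic events, to evaluating AND gates as products of probabilities and OR gates as one minus the product of complements. A minimal sketch with a hypothetical chlorine-release tree (the events and probabilities are illustrative, not taken from the paper):

```python
def top_event_probability(gate):
    """Evaluate a fault tree bottom-up. A node is either a number (a
    basic-event probability) or a tuple ('AND' | 'OR', [children]).
    Basic events are assumed independent."""
    if isinstance(gate, (int, float)):
        return float(gate)
    kind, children = gate
    probs = [top_event_probability(c) for c in children]
    if kind == "AND":                      # all events must occur
        p = 1.0
        for q in probs:
            p *= q
        return p
    p = 1.0                                # OR: 1 - product of complements
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical tree: release = (valve leak AND detection failure) OR tank rupture
release = ("OR", [("AND", [0.01, 0.1]), 0.001])
```

Sensitivity analysis of the kind the abstract mentions can then be done by perturbing one basic-event probability at a time and re-evaluating the top event.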
Abstract:
The high cost and extraordinary demands made on sophisticated air defence systems pose hard challenges to the managers and engineers who plan the operation and maintenance of such systems. This paper presents a study aimed at developing simulation and systems analysis techniques for the effective planning and efficient operation of small fleets of aircraft, typical of the air force of a developing country. We consider an important aspect of fleet management: the problem of resource allocation for achieving prescribed operational effectiveness of the fleet. At this stage, we consider a single flying-base, where the operationally ready aircraft are stationed, and a repair-depot, where the planes are overhauled. An important measure of operational effectiveness is 'availability', which may be defined as the expected fraction of the fleet fit for use at a given instant. The tour of aircraft in a flying-base, repair-depot system through a cycle of 'operationally ready' and 'scheduled overhaul' phases is represented first by a deterministic flow process and then by a cyclic queuing process. Initially the steady-state availability at the flying-base is computed under the assumptions of Poisson arrivals, exponential service times and an equivalent single-server repair-depot. This analysis also brings out the effect of fleet size on availability. It defines a 'small' fleet essentially in terms of the important 'traffic' parameter of service rate/maximum arrival rate. A simulation model of the system has been developed using GPSS to study sensitivity to distributional assumptions, to validate the principal assumptions of the analytical model such as the single-server assumption and to obtain confidence intervals for the statistical parameters of interest.