958 results for Cost Optimization


Relevance: 20.00%

Abstract:

TO THE EDITOR: Kinner and colleagues described the high proportion of deaths among recently released prisoners in Australia...

Relevance: 20.00%

Abstract:

In this paper, we consider non-linear transceiver designs for multiuser multi-input multi-output (MIMO) down-link in the presence of imperfections in the channel state information at the transmitter (CSIT). The base station (BS) is equipped with multiple transmit antennas and each user terminal is equipped with multiple receive antennas. The BS employs Tomlinson-Harashima precoding (THP) for inter-user interference pre-cancellation at the transmitter. We investigate robust THP transceiver designs based on the minimization of BS transmit power with mean square error (MSE) constraints, and balancing of MSE among users with a constraint on the total BS transmit power. We show that these design problems can be solved by iterative algorithms, wherein each iteration involves a pair of convex optimization problems. The robustness of the proposed algorithms to imperfections in CSIT is illustrated through simulations.
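The interference pre-cancellation that THP performs at the transmitter can be illustrated on a toy two-user channel. The sketch below (plain Python, invented channel gains, perfect CSIT, no noise, and none of the paper's robust optimization) shows only the core idea: encode users in order, pre-subtract the interference from already-encoded users, and fold the result back into a bounded region with a modulo operation that the receiver undoes.

```python
import math

def mod_lattice(v, m=8.0):
    # symmetric modulo into [-m/2, m/2); bounds the transmit power
    return v - m * math.floor(v / m + 0.5)

# hypothetical lower-triangular effective channel (after receive processing)
H = [[1.0, 0.0],
     [0.7, 1.0]]
s = [3.0, -3.0]          # 4-PAM symbols intended for users 1 and 2

# THP encoding: user 1 sees no interference; user 2's symbol has user 1's
# contribution pre-subtracted, then is folded back into the modulo region
x = [0.0, 0.0]
x[0] = mod_lattice(s[0])
x[1] = mod_lattice(s[1] - H[1][0] * x[0])

# noiseless channel, then the receiver-side modulo recovers each symbol
y = [H[0][0] * x[0],
     H[1][0] * x[0] + H[1][1] * x[1]]
s_hat = [mod_lattice(yi) for yi in y]
```

With the gains above, `s_hat` equals `s` even though user 2's received signal contains user 1's transmission; the modulo at both ends is what distinguishes THP from plain (power-hungry) linear pre-subtraction.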

Relevance: 20.00%

Abstract:

There has been a recent spate of high-profile infrastructure cost overruns in Australia and internationally. This is just the tip of a longer-term and more deep-seated problem with initial budget estimating practice, well recognised in both academic research and industry reviews: the problem of uncertainty. A case study of the Sydney Opera House is used to identify and illustrate the key causal factors and system dynamics of cost overruns. It is conventionally the role of risk management to deal with such uncertainty, but the type and extent of the uncertainty involved in complex projects is shown to render established risk management techniques ineffective. This paper considers a radical advance on current budget estimating practice that involves a particular approach to statistical modelling complemented by explicit training in estimating practice. The statistical modelling approach combines the probability management techniques of Savage, which operate on actual distributions of values rather than flawed representations of distributions, and the data pooling technique of Skitmore, where the size of the reference set is optimised. Estimating training employs calibration development methods pioneered by Hubbard, which reduce the bias of experts caused by over-confidence and improve the consistency of subjective decision-making. A new framework for initial budget estimating practice is developed based on the combined statistical and training methods, with each technique being explained and discussed.
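The core of the pooled-distribution idea can be sketched in a few lines: instead of a single point estimate, the budget is read off the empirical distribution of outcomes resampled from a reference set of comparable past projects. The overrun ratios below are invented for illustration, and this is only the resampling idea, not Savage's or Skitmore's actual procedures.

```python
import random

# hypothetical reference set: cost-overrun ratios (final/initial budget)
# observed on comparable past projects
pooled_ratios = [1.05, 1.20, 0.95, 1.60, 1.10, 2.40, 1.30, 1.15, 1.80, 1.00]

def budget_percentile(base_estimate, ratios, p=0.8, trials=10_000, seed=1):
    """Resample the pooled ratios directly (preserving the actual empirical
    distribution rather than a fitted parametric curve) and return the
    budget that would have been sufficient in fraction p of trials."""
    rng = random.Random(seed)
    outcomes = sorted(base_estimate * rng.choice(ratios) for _ in range(trials))
    return outcomes[int(p * trials)]

budget = budget_percentile(100.0, pooled_ratios)
```

For a 100-unit base estimate, the 80th-percentile budget lands well above the naive estimate because the pooled data retain the fat right tail (the 1.6x, 1.8x and 2.4x overruns) that a point estimate hides.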

Relevance: 20.00%

Abstract:

The rupture of a cerebral artery aneurysm causes a devastating subarachnoid hemorrhage (SAH), with a mortality of almost 50% during the first month. Each year, 8-11/100 000 people suffer from aneurysmal SAH in Western countries, but the number is twice as high in Finland and Japan. The disease is most common among those of working age, the mean age at rupture being 50-55 years. Unruptured cerebral aneurysms are found in 2-6% of the population, but knowledge about the true risk of rupture is limited. The vast majority of aneurysms should be considered rupture-prone, and treatment for these patients is warranted. Both unruptured and ruptured aneurysms can be treated by either microsurgical clipping or endovascular embolization. In a standard microsurgical procedure, the neck of the aneurysm is closed by a metal clip, sealing off the aneurysm from the circulation. Endovascular embolization is performed by packing the aneurysm from the inside of the vessel lumen with detachable platinum coils. Coiling is associated with slightly lower morbidity and mortality than microsurgery, but the long-term results of microsurgically treated aneurysms are better. Endovascular treatment methods are constantly being developed further in order to achieve better long-term results. New coils and novel embolic agents need to be tested in a variety of animal models before they can be used in humans. In this study, we developed an experimental rat aneurysm model and showed its suitability for testing endovascular devices. We optimized noninvasive MRI sequences at 4.7 Tesla for follow-up of coiled experimental aneurysms and for volumetric measurement of aneurysm neck remnants. We used this model to compare platinum coils with polyglycolic-polylactic acid (PGLA) -coated coils, and showed the benefits of the latter in this model. 
The experimental aneurysm model and the imaging methods also gave insight into the mechanisms involved in aneurysm formation, and the model can be used in the development of novel imaging techniques. This model is affordable, easily reproducible, reliable, and suitable for MRI follow-up. It is also suitable for endovascular treatment, and it evades spontaneous occlusion.

Relevance: 20.00%

Abstract:

This article analyzes the effect, on the design of composite structures, of devising a new failure envelope from a combination of the most commonly used failure criteria for composite laminates. The failure criteria considered are the maximum stress and Tsai-Wu criteria. In addition to these popular phenomenological failure criteria, a micromechanics-based criterion called the failure mechanism-based failure criterion is also considered. The failure envelopes obtained from these criteria are superimposed over one another, and a new envelope is constructed from the lowest absolute values of the strengths they predict; the result is therefore named the most conservative failure envelope. A minimum-weight design of composite laminates is performed using genetic algorithms, and the effect of stacking sequence on the minimum laminate weight is also studied. Results are compared across the different failure envelopes, and the conservative design is evaluated against designs obtained using a single failure criterion. The design approach is recommended for structures where composites are the key load-carrying members, such as helicopter rotor blades.
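The construction of the most conservative envelope reduces to a pointwise minimum over the individual criteria. The sketch below uses deliberately simplified, invented strength predictors as stand-ins for the real maximum-stress, Tsai-Wu, and failure mechanism-based expressions, just to show the combination step.

```python
def conservative_strength(stress_state, criteria):
    """Most conservative envelope: the lowest strength predicted by any
    of the supplied failure criteria at the given stress state."""
    return min(c(stress_state) for c in criteria)

# hypothetical, simplified strength predictors in MPa (placeholders, NOT
# the real criterion formulas, which depend on the full stress tensor)
max_stress = lambda s: 1500.0 - 1.0 * abs(s)
tsai_wu    = lambda s: 1400.0 - 0.8 * abs(s)
fmb        = lambda s: 1450.0 - 0.9 * abs(s)

allowable = conservative_strength(100.0, [max_stress, tsai_wu, fmb])
```

At this stress state the three placeholders predict 1400, 1320 and 1360 MPa, so the conservative envelope takes 1320 MPa; inside an optimizer, a laminate is declared safe only if every ply satisfies this minimum.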

Relevance: 20.00%

Abstract:

Swarm intelligence techniques such as particle swarm optimization (PSO) are shown to be inadequate for accurate estimation of global solutions in several engineering applications. The problem is more severe in inverse optimization problems, where fitness calculations are computationally expensive. In this work, a novel strategy is introduced to alleviate this problem. The proposed inverse model, based on a modified particle swarm optimization algorithm, is applied to a contaminant transport inverse model. Inverse models based on the standard PSO and the proposed PSO are validated to estimate the accuracy of the models, and the proposed model is shown to outperform the standard one in terms of parameter-estimation accuracy. Preliminary results obtained using the proposed model are presented in this work.
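For reference, the standard (global-best) PSO that the paper's modified variant improves upon can be written very compactly. This is a generic textbook sketch with commonly used inertia and acceleration constants, not the authors' modified algorithm or their contaminant transport model.

```python
import random

def pso(f, dim=2, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal standard PSO minimizing f over [-5, 5]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]               # each particle's best position
    pbest_f = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # swarm's best position
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                # inertia + cognitive pull (own best) + social pull (swarm best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

best, val = pso(lambda x: sum(v * v for v in x))  # sphere test function
```

On the cheap sphere function this converges easily; the paper's point is that when each call to `f` is an expensive forward simulation, the standard update rule wastes evaluations, motivating the modified variant.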

Relevance: 20.00%

Abstract:

Esophageal and gastroesophageal junction (GEJ) adenocarcinoma is a rapidly increasing disease with a pathophysiology connected to oxidative stress. Exact pre-treatment clinical staging is essential for optimal care of this lethal malignancy, and the cost-effectiveness of treatment is increasingly important. We measured oxidative metabolism in the distal and proximal esophagus by myeloperoxidase activity (MPA), glutathione content (GSH), and superoxide dismutase (SOD) in 20 patients operated on with Nissen fundoplication and 9 controls during a 4-year follow-up. Further, we assessed oxidative damage to DNA by 8-hydroxydeoxyguanosine (8-OHdG) in esophageal samples (13 Barrett's metaplasia, 6 Barrett's esophagus with high-grade dysplasia, 18 adenocarcinoma of the distal esophagus/GEJ, and 14 normal controls). We estimated the accuracy (42 patients) and preoperative prognostic value (55 patients) of PET compared with computed tomography (CT) and endoscopic ultrasound (EUS) in patients with adenocarcinoma of the esophagus/GEJ. Finally, we clarified the specialty-related costs and the utility of either radical (30 patients) or palliative (23 patients) treatment of esophageal/GEJ carcinoma by the 15D health-related quality-of-life (HRQoL) questionnaire and the survival rate. The cost-utility of radical treatment of esophageal/GEJ carcinoma was investigated using a decision tree analysis model comparing radical, palliative, and hypothetical new treatments. We found elevated oxidative stress (measured by MPA) and decreased antioxidant defense (measured by GSH) after antireflux surgery. This indicates that antireflux surgery is not a perfect solution for oxidative stress of the esophageal mucosa, which may partly explain why adenocarcinoma of the distal esophagus is found even after successful fundoplication.
In GERD patients, proximal esophageal mucosal anti-oxidative defense seems to be defective before and even years after successful antireflux surgery. In addition, antireflux surgery apparently does not change the level of oxidative stress in the proximal esophagus, suggesting that defective mucosal anti-oxidative capacity plays a role in the development of oxidative damage to the esophageal mucosa in GERD. Oxidative stress appears to be an important component of the malignant transformation of Barrett's esophagus. DNA damage may be mediated by 8-OHdG, which we found to be increased in Barrett's epithelium, in high-grade dysplasia, and in adenocarcinoma of the esophagus/GEJ compared with controls; the entire esophagus of Barrett's patients suffers from increased oxidative stress (measured by 8-OHdG). PET is a useful tool in the staging and prognostication of adenocarcinoma of the esophagus/GEJ, detecting organ metastases better than CT, although its accuracy in staging paratumoral and distant lymph nodes is limited. Radical surgery for esophageal/GEJ carcinoma provides the greatest benefit in terms of survival, and its cost-utility appears to be the best of currently available treatments.

Relevance: 20.00%

Abstract:

At the beginning of 2008, I visited a watershed in Karkinatam village in the state of Karnataka, South India, where crops are intensively irrigated using groundwater. The water table had been depleted from a depth of 5 m to 50 m across a large part of the area, and 42% of the watershed's 158 water wells are now dry. Speaking with the farmers, I was amazed to learn that they were drilling down to 500 m to tap water. This case is, of course, not isolated.

Relevance: 20.00%

Abstract:

Experimental characterization of high-dimensional dynamic systems sometimes uses the proper orthogonal decomposition (POD). If there are many measurement locations and relatively few sensors, steady-state behavior can still be studied by sequentially taking several sets of simultaneous measurements. The number of such measurement sets required can be minimized by solving a combinatorial optimization problem. We aim to bring this problem to the attention of engineering audiences, summarize some known mathematical results about it, and present a heuristic (suboptimal) calculation that gives reasonable, if not stellar, results.
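One way to read the combinatorial problem: to fill in every cross-correlation needed for POD, each pair of measurement locations must appear together in at least one round of simultaneous measurements, and rounds are limited to the number of sensors. The greedy sketch below is an assumption-laden illustration of that pair-covering view (it is not the paper's heuristic), and greedy covering is known to be suboptimal in general.

```python
from itertools import combinations

def greedy_rounds(n_locations, n_sensors):
    """Greedy heuristic: build measurement rounds of size n_sensors so
    that every pair of locations is measured simultaneously at least once."""
    uncovered = set(combinations(range(n_locations), 2))
    rounds = []
    while uncovered:
        group = set(next(iter(uncovered)))      # seed with an uncovered pair
        while len(group) < n_sensors:
            # add the location that pairs with the most uncovered partners
            best = max((loc for loc in range(n_locations) if loc not in group),
                       key=lambda loc: sum(1 for g in group
                                           if tuple(sorted((loc, g))) in uncovered))
            group.add(best)
        rounds.append(sorted(group))
        uncovered -= {tuple(sorted(p)) for p in combinations(group, 2)}
    return rounds

rounds = greedy_rounds(10, 4)
```

With 10 locations and 4 sensors there are C(10,2) = 45 pairs and each round covers at most C(4,2) = 6 of them, so at least 8 rounds are needed; the greedy answer will be close to, but not necessarily at, that bound.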

Relevance: 20.00%

Abstract:

In this paper we consider a decentralized supply chain formation problem for linear multi-echelon supply chains when the managers of the individual echelons are autonomous, rational, and intelligent. At each echelon, there is a choice of service providers and the specific problem we solve is that of determining a cost-optimal mix of service providers so as to achieve a desired level of end-to-end delivery performance. The problem can be broken up into two sub-problems following a mechanism design approach: (1) Design of an incentive compatible mechanism to elicit the true cost functions from the echelon managers; (2) Formulation and solution of an appropriate optimization problem using the true cost information. In this paper we propose a novel Bayesian incentive compatible mechanism for eliciting the true cost functions. This improves upon existing solutions in the literature which are all based on the classical Vickrey-Clarke-Groves mechanisms, requiring significant incentives to be paid to the echelon managers for achieving dominant strategy incentive compatibility. The proposed solution, which we call SCF-BIC (Supply Chain Formation with Bayesian Incentive Compatibility), significantly reduces the cost of supply chain formation. We illustrate the efficacy of the proposed methodology using the example of a three echelon manufacturing supply chain.
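Once truthful cost information has been elicited (sub-problem 1), sub-problem 2 is an ordinary optimization. For short chains it can even be brute-forced, as in the hedged sketch below: all numbers are invented, delays at the echelons are assumed independent, and end-to-end delivery performance is modeled simply as the product of per-echelon on-time probabilities. This illustrates only the provider-mix selection, not the paper's SCF-BIC mechanism.

```python
from itertools import product

# hypothetical (cost, on-time probability) options per echelon
echelons = [
    [(10, 0.99), (7, 0.95), (5, 0.90)],   # e.g. supplier tier
    [(12, 0.98), (8, 0.92)],              # e.g. manufacturer tier
    [(6, 0.97), (4, 0.90)],               # e.g. distributor tier
]

def cheapest_mix(echelons, target=0.85):
    """Pick one provider per echelon minimizing total cost subject to an
    end-to-end on-time probability target (independence assumed)."""
    best = None
    for mix in product(*echelons):
        p = 1.0
        for _, pi in mix:
            p *= pi
        if p >= target:
            total = sum(c for c, _ in mix)
            if best is None or total < best[0]:
                best = (total, mix)
    return best

cost, mix = cheapest_mix(echelons)
```

Note that the cheapest providers at every echelon (cost 17) fail the 0.85 target; the optimum trades up at the middle and last echelons instead. With self-interested echelon managers the reported `(cost, probability)` pairs cannot be trusted, which is exactly why the mechanism-design sub-problem comes first.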

Relevance: 20.00%

Abstract:

A business cluster is a co-located group of micro, small, medium scale enterprises. Such firms can benefit significantly from their co-location through shared infrastructure and shared services. Cost sharing becomes an important issue in such sharing arrangements especially when the firms exhibit strategic behavior. There are many cost sharing methods and mechanisms proposed in the literature based on game theoretic foundations. These mechanisms satisfy a variety of efficiency and fairness properties such as allocative efficiency, budget balance, individual rationality, consumer sovereignty, strategyproofness, and group strategyproofness. In this paper, we motivate the problem of cost sharing in a business cluster with strategic firms and illustrate different cost sharing mechanisms through the example of a cluster of firms sharing a logistics service. Next we look into the problem of a business cluster sharing ICT (information and communication technologies) infrastructure and explore the use of cost sharing mechanisms.
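Among the game-theoretic cost-sharing methods the abstract alludes to, the Shapley value is a standard example that is budget-balanced by construction: it averages each firm's marginal contribution to the shared cost over all orders in which firms could join. The cost function below (a shared logistics truck with a fixed charge plus per-unit demand) is invented for illustration.

```python
from itertools import permutations

def shapley_shares(firms, cost):
    """Shapley-value cost shares: average each firm's marginal cost over
    all join orders. cost maps a frozenset of firms to a total cost."""
    shares = {f: 0.0 for f in firms}
    perms = list(permutations(firms))
    for order in perms:
        so_far = frozenset()
        for f in order:
            shares[f] += cost(so_far | {f}) - cost(so_far)
            so_far = so_far | {f}
    return {f: s / len(perms) for f, s in shares.items()}

# hypothetical shared logistics service: fixed truck cost of 100 plus
# 10 per unit of each participating firm's demand
demand = {"A": 5, "B": 5, "C": 20}
def cost(coalition):
    return (100 + 10 * sum(demand[f] for f in coalition)) if coalition else 0

shares = shapley_shares(["A", "B", "C"], cost)
```

Here each firm pays its own variable cost plus an equal third of the fixed charge, and the shares sum exactly to the grand-coalition cost of 400 (budget balance). The enumeration over all permutations is exponential, which is one reason the literature also studies approximations and other mechanisms with different fairness/strategyproofness trade-offs.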

Relevance: 20.00%

Abstract:

Optimal allocation of water resources among various stakeholders often involves considerable complexity and several conflicting goals, which leads naturally to multi-objective optimization. To aid water managers in effective decision-making, there is a need not only for effective multi-objective mathematical models but also for efficient Pareto-optimal solutions to real-world problems. This study proposes a swarm-intelligence-based multi-objective technique, the elitist-mutated multi-objective particle swarm optimization (EM-MOPSO) technique, for arriving at efficient Pareto-optimal solutions to multi-objective water resource management problems. The EM-MOPSO technique is applied to a case study of a multi-objective reservoir operation problem. Model performance is evaluated by comparison with a non-dominated sorting genetic algorithm (NSGA-II) model, and EM-MOPSO is found to perform better. The developed method can be used as an effective aid for multi-objective decision-making in integrated water resource management.
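The notion of Pareto optimality that both EM-MOPSO and NSGA-II are chasing is easy to state in code: a solution is on the front if no other solution is at least as good in every objective and strictly better in one. The sketch below is just that definition (for minimization), with invented two-objective trade-off points; it is not the EM-MOPSO algorithm itself.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated points."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# hypothetical (irrigation deficit, flood risk) objective pairs for six
# candidate reservoir release policies
pts = [(1.0, 9.0), (2.0, 7.0), (3.0, 8.0), (4.0, 4.0), (5.0, 5.0), (6.0, 1.0)]
front = pareto_front(pts)
```

Points (3.0, 8.0) and (5.0, 5.0) are dominated and drop out; the four survivors form the trade-off curve from which a decision-maker picks a compromise. Population algorithms like EM-MOPSO aim to return a well-spread sample of exactly this set.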

Relevance: 20.00%

Abstract:

Fluctuation of field emission in carbon nanotubes (CNTs) is not desirable in many applications, the design of biomedical x-ray devices being one of them. In these applications, it is of great importance to have precise control of electron beams over multiple spatio-temporal scales. In this paper, a new design is proposed in order to optimize the field emission performance of CNT arrays. A diode configuration is used for the analysis, where arrays of CNTs act as the cathode. The results indicate that the linear height distribution of CNTs proposed in this study shows more stable performance than the conventionally used uniform distribution.

Relevance: 20.00%

Abstract:

Traditionally, an instruction decoder is designed as a monolithic structure, which inhibits leakage-energy optimization. In this paper, we consider a split instruction decoder that enables leakage-energy optimization. We also propose a compiler scheduling algorithm that exploits instruction slack to increase the simultaneously active and idle durations in the instruction decoder. The proposed compiler-assisted scheme obtains a further 14.5% reduction in the energy consumption of the instruction decoder over a hardware-only scheme for a VLIW architecture. The benefits are 17.3% and 18.7% in the context of 2-clustered and 4-clustered VLIW architectures, respectively.