981 results for cost estimation


Relevance: 20.00%

Abstract:

The subspace intersection method (SIM) provides unbiased bearing estimates of multiple acoustic sources in a range-independent shallow ocean using a one-dimensional search, without prior knowledge of source ranges and depths. The original formulation of this method is based on deployment of a horizontal linear array of hydrophones which measure acoustic pressure. In this paper, we extend SIM to an array of acoustic vector sensors which measure pressure as well as all components of particle velocity. Use of vector sensors reduces the minimum number of sensors required by a factor of 4, and also eliminates the constraint that the intersensor spacing should not exceed half a wavelength. The additional information provided by the vector sensors leads to performance enhancement in the form of lower estimation error and higher resolution.
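
As a rough illustration of the vector-sensor data model underlying this extension (not the SIM algorithm itself), the sketch below builds steering vectors in which every sensor contributes a pressure channel plus particle-velocity channels, then scans a conventional beamformer over bearing; the array geometry, frequency, and source bearings are assumed values chosen for the example.

```python
# Illustrative data model only (not the authors' SIM): each acoustic vector
# sensor contributes pressure plus in-plane particle-velocity channels, so a
# 4-element array yields 12 channels here. Geometry and bearings are assumed.
import numpy as np

c, f = 1500.0, 100.0                    # sound speed (m/s), frequency (Hz)
k = 2 * np.pi * f / c                   # wavenumber; wavelength is 15 m
x = np.arange(4) * 10.0                 # 4 sensors, 10 m spacing (> half wavelength)

def steering(theta_deg):
    """Array response of a linear array of 2-D acoustic vector sensors."""
    th = np.radians(theta_deg)
    phase = np.exp(1j * k * x * np.cos(th))               # spatial phase per sensor
    per_sensor = np.array([1.0, np.cos(th), np.sin(th)])  # [pressure, vx, vy]
    return np.kron(phase, per_sensor)

# Two plane-wave sources plus noise; scan a conventional beamformer over bearing.
rng = np.random.default_rng(0)
snaps = sum(steering(b)[:, None] * rng.standard_normal(200) for b in (40.0, 75.0))
snaps = snaps + 0.1 * (rng.standard_normal(snaps.shape)
                       + 1j * rng.standard_normal(snaps.shape))
R = snaps @ snaps.conj().T / snaps.shape[1]               # sample covariance
scan = np.arange(0.0, 180.0, 0.5)
power = np.array([np.real(steering(b).conj() @ R @ steering(b)) for b in scan])

peaks = [i for i in range(1, len(scan) - 1) if power[i - 1] < power[i] > power[i + 1]]
top_two = sorted(sorted(peaks, key=lambda i: power[i])[-2:])
print("estimated bearings (deg):", scan[top_two])         # close to 40 and 75
```

Because each sensor carries its own directional response, the directional per-sensor factor suppresses the grating-lobe ambiguity that a pressure-only array with this spacing would exhibit.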

Relevance: 20.00%

Abstract:

TO THE EDITOR: Kinner and colleagues described the high proportion of deaths among recently released prisoners in Australia...

Relevance: 20.00%

Abstract:

There has been a recent spate of high-profile infrastructure cost overruns in Australia and internationally. This is just the tip of a longer-term and more deeply seated problem with initial budget estimating practice, well recognised in both academic research and industry reviews: the problem of uncertainty. A case study of the Sydney Opera House is used to identify and illustrate the key causal factors and system dynamics of cost overruns. It is conventionally the role of risk management to deal with such uncertainty, but the type and extent of the uncertainty involved in complex projects is shown to render established risk management techniques ineffective. This paper considers a radical advance on current budget estimating practice, which involves a particular approach to statistical modelling complemented by explicit training in estimating practice. The statistical modelling approach combines the probability management techniques of Savage, which operate on actual distributions of values rather than flawed representations of distributions, with the data pooling technique of Skitmore, in which the size of the reference set is optimised. Estimating training employs the calibration development methods pioneered by Hubbard, which reduce the bias of experts caused by over-confidence and improve the consistency of subjective decision-making. A new framework for initial budget estimating practice is developed based on the combined statistical and training methods, with each technique explained and discussed.
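
A minimal sketch of the probability-management idea (carrying each cost component as an array of sampled values and doing arithmetic element-wise, rather than summing single-point figures) is given below; the three cost components and their distributions are assumed purely for illustration and do not reproduce the paper's case study or the actual methods of Savage, Skitmore, or Hubbard.

```python
# A minimal sketch of probability-management-style estimating (assumed
# illustrative distributions, not the authors' data or framework): each cost
# component is carried as an array of sampled trials and the budget total is
# formed by element-wise arithmetic, so percentiles of the total reflect the
# full shape of the uncertainty rather than a single "most likely" figure.
import numpy as np

rng = np.random.default_rng(42)
trials = 100_000

# Three cost components, each a sampled distribution (values in $M, assumed).
structure = rng.lognormal(mean=np.log(40), sigma=0.30, size=trials)
services  = rng.lognormal(mean=np.log(25), sigma=0.45, size=trials)
fit_out   = rng.triangular(left=10, mode=14, right=30, size=trials)

total = structure + services + fit_out        # element-wise across trials

point_estimate = 40 + 25 + 14                 # sum of "most likely" single points
print(f"single-point estimate:       {point_estimate:.1f}")
print(f"median of total:             {np.median(total):.1f}")
print(f"P80 budget (80% confidence): {np.percentile(total, 80):.1f}")
print(f"chance of overrunning the single-point estimate: "
      f"{np.mean(total > point_estimate):.0%}")
```

Because the total is formed element-wise across trials, its percentiles capture skewness and tail risk, which is the contrast the paper draws with conventional single-point budget figures.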

Relevance: 20.00%

Abstract:

Ruptured abdominal aortic aneurysm (RAAA) is a life-threatening event, and without operative treatment the patient will die. The overall mortality can be as high as 80-90%; thus repair of RAAA should be attempted whenever feasible. Quality of life (QoL) has become an increasingly important outcome measure in vascular surgery. The aim of the study was to evaluate the outcomes of RAAA and to identify predictors of mortality. In the Helsinki and Uusimaa district, 626 patients were identified as having RAAA in 1996-2004. Altogether 352 of them were admitted to Helsinki University Central Hospital (HUCH). Based on the Finnvasc Registry, 836 patients underwent repair of RAAA in 1991-1999. The 30-day operative mortality and the hospital and population-based mortality were assessed, as were the effects of regional centralisation and improved in-hospital quality on the outcome of RAAA. QoL of RAAA survivors was evaluated with the RAND-36 questionnaire. Quality-adjusted life years (QALYs), which combine length of life and QoL, were calculated using the EQ-5D index and an estimate of life expectancy. The predictors of outcome after RAAA were assessed at admission and 48 hours after repair. The 30-day operative mortality rate was 38% in HUCH and 44% nationwide, whereas the hospital mortality was 45% in HUCH. Population-based mortality was 69% in 1996-2004 and 56% in 2003-2004. After organisational changes were undertaken, mortality decreased significantly at all levels. Among the survivors, QoL was almost equal to the norms of age- and sex-matched controls; only physical functioning was slightly impaired. Successful repair of RAAA gave a mean of 4.1 (0-30.9) QALYs for all RAAA patients, even with non-survivors included. The preoperative Glasgow Aneurysm Score was an independent predictor of 30-day operative mortality after RAAA, and it also predicted the outcome at 48 hours for initial survivors of repair. A high Glasgow Aneurysm Score and high age were associated with a low number of QALYs to be achieved. Organ dysfunction measured by the Sequential Organ Failure Assessment (SOFA) score at 48 hours after repair of RAAA was the strongest predictor of death. In conclusion, surgery for RAAA is a life-saving and cost-effective procedure. The centralisation of vascular emergencies improved the outcome of RAAA patients. The survivors had a good QoL after RAAA. Owing to their moderate discriminatory value, predictive models can be used at the individual level only to provide supplementary information for clinical decision-making. These results support an active operation policy, as there is no reliable measure to predict the outcome after RAAA.
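
For readers unfamiliar with the QALY arithmetic mentioned above, the sketch below shows the standard utility-times-time calculation with optional discounting; the EQ-5D index, remaining life expectancy, and discount rate are assumed numbers, not values from this study.

```python
# Purely illustrative sketch of how QALYs combine length of life with an
# EQ-5D utility weight (numbers assumed, not taken from the study).
def qalys(eq5d_index: float, life_expectancy_years: float,
          annual_discount: float = 0.0) -> float:
    """Sum of yearly utility weights, optionally discounted to present value."""
    return sum(eq5d_index / (1.0 + annual_discount) ** year
               for year in range(int(round(life_expectancy_years))))

# e.g. a survivor with EQ-5D 0.80 and 10 remaining years, 3% discounting (assumed)
print(round(qalys(0.80, 10, 0.03), 1))   # ~7.0 QALYs
```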

Relevance: 20.00%

Abstract:

In this paper, we present a low-complexity algorithm for detection in high-rate, non-orthogonal space-time block coded (STBC) large multiple-input multiple-output (MIMO) systems that achieve high spectral efficiencies of the order of tens of bps/Hz. We also present a training-based iterative detection/channel estimation scheme for such large STBC MIMO systems. Our simulation results show that excellent bit error rate and nearness-to-capacity performance are achieved by the proposed multistage likelihood ascent search (M-LAS) detector in conjunction with the proposed iterative detection/channel estimation scheme at low complexity. The fact that we could show such good results for large STBCs, such as 16 × 16 and 32 × 32 STBCs from Cyclic Division Algebras (CDA) operating at spectral efficiencies in excess of 20 bps/Hz (even after accounting for the overheads of pilot-based training for channel estimation and turbo coding), establishes the effectiveness of the proposed detector and channel estimator. We decode perfect codes of large dimensions using the proposed detector. With the feasibility of such a low-complexity detection/channel estimation scheme, large-MIMO systems with tens of antennas operating at several tens of bps/Hz spectral efficiency can become practical, enabling interesting high-data-rate wireless applications.
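
The core of a likelihood ascent search is a greedy local search on the maximum-likelihood cost, started from a linear-detector output. The sketch below shows a single-stage LAS for BPSK symbols in a plain flat-fading model y = Hx + n; it only illustrates the idea and is not the paper's multistage M-LAS for non-orthogonal STBCs, and the 32-antenna setup and noise level are assumed.

```python
# A minimal single-stage likelihood-ascent-search (LAS) sketch for BPSK symbols
# in a flat-fading MIMO model y = Hx + n (real-valued toy; the paper's M-LAS for
# large non-orthogonal STBCs is more elaborate, this only shows the core idea).
import numpy as np

def las_detect(y, H, x0):
    """Greedily flip the single symbol that reduces ||y - Hx||^2 the most,
    repeating until no one-symbol flip improves the ML cost (a local optimum)."""
    x = x0.copy()
    cost = lambda v: np.sum((y - H @ v) ** 2)
    best = cost(x)
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            trial = x.copy()
            trial[i] = -trial[i]            # flip one BPSK symbol
            c = cost(trial)
            if c < best:
                x, best, improved = trial, c, True
    return x

rng = np.random.default_rng(1)
n_tx = 32
H = rng.standard_normal((n_tx, n_tx)) / np.sqrt(n_tx)
x_true = rng.choice([-1.0, 1.0], size=n_tx)
y = H @ x_true + 0.05 * rng.standard_normal(n_tx)

# Initialise from an MMSE-style linear estimate, then run the local search.
x_init = np.sign(np.linalg.solve(H.T @ H + 0.05 ** 2 * np.eye(n_tx), H.T @ y))
x_hat = las_detect(y, H, x_init)
print("residual symbol errors after LAS:", int(np.sum(x_hat != x_true)))
```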

Relevance: 20.00%

Abstract:

Swarm intelligence techniques such as particle swarm optimization (PSO) are shown to be inadequate for accurate estimation of global solutions in several engineering applications. This problem is more severe in the case of inverse optimization problems, where fitness calculations are computationally expensive. In this work, a novel strategy is introduced to alleviate this problem. The proposed inverse model, based on a modified particle swarm optimization algorithm, is applied to a contaminant transport inverse problem. The inverse models based on standard PSO and the proposed PSO are validated to estimate the accuracy of the models. The proposed model is shown to outperform the standard one in terms of accuracy of parameter estimation. The preliminary results obtained using the proposed model are presented in this work.
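
For reference, a standard global-best PSO applied to a toy inverse (parameter-estimation) problem is sketched below; the forward model, search ranges, and PSO constants are assumed for illustration, and the paper's modification to PSO is not reproduced here.

```python
# A standard-PSO sketch for an inverse (parameter-estimation) problem: find the
# parameters that minimise the misfit between model output and observations.
# The forward model and data here are assumed toys, not the contaminant
# transport model of the paper.
import numpy as np

rng = np.random.default_rng(7)

def forward(params, t):
    """Toy forward model: exponentially decaying pulse with two parameters."""
    amplitude, decay = params
    return amplitude * np.exp(-decay * t)

t_obs = np.linspace(0, 10, 50)
observed = forward((3.0, 0.4), t_obs)               # synthetic "measurements"
misfit = lambda p: np.sum((forward(p, t_obs) - observed) ** 2)

# Standard global-best PSO
n_particles, n_iters, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.uniform([0.1, 0.01], [10.0, 2.0], size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([misfit(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([misfit(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("estimated parameters:", np.round(gbest, 3))   # close to the true (3.0, 0.4)
```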

Relevance: 20.00%

Abstract:

A considerable amount of work has been dedicated to the development of analytical solutions for the flow of chemical contaminants through soils. Most of the analytical solutions for complex transport problems are closed-form series solutions, whose convergence depends on the eigenvalues obtained from a corresponding transcendental equation. The difficulty in obtaining exact solutions from analytical models thus encourages the use of numerical solutions for parameter estimation, even though the latter models are computationally expensive. In this paper, a combination of two swarm intelligence based algorithms is used for accurate estimation of design transport parameters from the closed-form analytical solutions. Estimation of eigenvalues from a transcendental equation is treated as a multimodal, discontinuous function optimization problem. The eigenvalues are estimated using an algorithm derived from the glowworm swarm strategy, while parameter estimation for the inverse problem is handled using the standard PSO algorithm. Integration of these two algorithms enables accurate estimation of design parameters from closed-form analytical solutions. The present solver is applied to a real-world inverse problem in environmental engineering. The inverse model based on swarm intelligence techniques is validated and its accuracy in parameter estimation is shown. The proposed solver quickly estimates the design parameters with great precision.
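
The multimodal eigenvalue step can be appreciated with a much simpler stand-in than the glowworm swarm: the sketch below scans a representative transcendental eigencondition for sign changes and refines each bracket with Brent's method. The equation form and its constant are assumed for illustration; the paper's glowworm-swarm and PSO machinery is not reproduced here.

```python
# Stand-in sketch (not the authors' glowworm algorithm): bracket the many roots
# of a representative transcendental eigencondition by scanning for sign changes,
# then refine each bracket with Brent's method. Equation form and H are assumed.
import numpy as np
from scipy.optimize import brentq

H = 2.5                                      # assumed boundary parameter
f = lambda b: b * np.cos(b) + H * np.sin(b)  # eigencondition f(beta) = 0

grid = np.linspace(1e-6, 30.0, 30_000)       # fine scan avoids missing roots
vals = f(grid)
betas = []
for i in range(len(grid) - 1):
    if vals[i] * vals[i + 1] < 0:            # sign change brackets one root
        betas.append(brentq(f, grid[i], grid[i + 1]))

print(np.round(betas[:5], 4))                # first few eigenvalues
```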

Relevance: 20.00%

Abstract:

Esophageal and gastroesophageal junction (GEJ) adenocarcinoma is a rapidly increasing disease with a pathophysiology connected to oxidative stress. Exact pre-treatment clinical staging is essential for optimal care of this lethal malignancy, and the cost-effectiveness of treatment is increasingly important. We measured oxidative metabolism in the distal and proximal esophagus by myeloperoxidase activity (MPA), glutathione content (GSH), and superoxide dismutase (SOD) in 20 patients operated on with Nissen fundoplication and 9 controls during a 4-year follow-up. Further, we assessed the oxidative damage of DNA by 8-hydroxydeoxyguanosine (8-OHdG) in esophageal samples (13 Barrett's metaplasia, 6 Barrett's esophagus with high-grade dysplasia, 18 adenocarcinoma of the distal esophagus/GEJ, and 14 normal controls). We estimated the accuracy (42 patients) and preoperative prognostic value (55 patients) of PET compared with computed tomography (CT) and endoscopic ultrasound (EUS) in patients with adenocarcinoma of the esophagus/GEJ. Finally, we clarified the specialty-related costs and the utility of either radical (30 patients) or palliative (23 patients) treatment of esophageal/GEJ carcinoma by the 15D health-related quality-of-life (HRQoL) questionnaire and the survival rate. The cost-utility of radical treatment of esophageal/GEJ carcinoma was investigated using a decision tree analysis model comparing radical, palliative, and a hypothetical new treatment. We found elevated oxidative stress (measured by MPA) and decreased antioxidant defense (measured by GSH) after antireflux surgery. This indicates that antireflux surgery is not a perfect solution for oxidative stress of the esophageal mucosa. Elevated oxidative stress in turn may partly explain why adenocarcinoma of the distal esophagus is found even after successful fundoplication. In GERD patients, proximal esophageal mucosal anti-oxidative defense seems to be defective before and even years after successful antireflux surgery. In addition, antireflux surgery apparently does not change the level of oxidative stress in the proximal esophagus, suggesting that defective mucosal anti-oxidative capacity plays a role in the development of oxidative damage to the esophageal mucosa in GERD. Oxidative stress appears to be an important component in the malignant transformation of Barrett's esophagus. DNA damage may be mediated by 8-OHdG, which we found to be increased in Barrett's epithelium and in high-grade dysplasia as well as in adenocarcinoma of the esophagus/GEJ compared with controls. The entire esophagus of Barrett's patients suffers from increased oxidative stress (measured by 8-OHdG). PET is a useful tool in the staging and prognostication of adenocarcinoma of the esophagus/GEJ, detecting organ metastases better than CT, although its accuracy in staging paratumoral and distant lymph nodes is limited. Radical surgery for esophageal/GEJ carcinoma provides the greatest benefit in terms of survival, and its cost-utility appears to be the best of currently available treatments.

Relevance: 20.00%

Abstract:

At the beginning of 2008, I visited a watershed located in Karkinatam village in the state of Karnataka, South India, where crops are intensively irrigated using groundwater. The water table had fallen from a depth of 5 m to 50 m in a large part of the area. At present, 42% of the 158 water wells in the watershed are dry. Speaking with the farmers, I was amazed to learn that they were drilling down to 500 m to tap water. This case is, of course, not isolated.

Relevance: 20.00%

Abstract:

Virtual machine (VM) management is an obvious need in today's data centers for various management activities, and it is accomplished in two phases: finding an optimal VM placement plan and implementing that placement through live VM migrations. These phases give rise to two research problems: the VM placement problem (VMPP) and the VM migration scheduling problem (VMMSP). This research proposes and develops several evolutionary algorithms and heuristic algorithms to address the VMPP and VMMSP. Experimental results show the effectiveness and scalability of the proposed algorithms. Finally, a VM management framework has been proposed and developed to automate VM management in a cost-efficient way.
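
To make the VM placement problem concrete, the sketch below applies a simple first-fit-decreasing heuristic to assumed VM demands and host capacities; it stands in for, and is much simpler than, the evolutionary and heuristic algorithms the research actually develops.

```python
# A minimal placement-heuristic sketch (first-fit decreasing on CPU demand) for
# the VM placement problem; VM demands and host capacities are assumed values
# chosen only to illustrate the problem shape.
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    cpu_free: int
    mem_free: int
    vms: list = field(default_factory=list)

def first_fit_decreasing(vms, hosts):
    """Place each VM (largest CPU demand first) on the first host that fits."""
    placement = {}
    for vm_name, cpu, mem in sorted(vms, key=lambda v: v[1], reverse=True):
        for host in hosts:
            if host.cpu_free >= cpu and host.mem_free >= mem:
                host.cpu_free -= cpu
                host.mem_free -= mem
                host.vms.append(vm_name)
                placement[vm_name] = host.name
                break
        else:
            placement[vm_name] = None          # no feasible host found
    return placement

hosts = [Host("h1", cpu_free=16, mem_free=64), Host("h2", cpu_free=8, mem_free=32)]
vms = [("vm1", 8, 16), ("vm2", 4, 8), ("vm3", 6, 24), ("vm4", 2, 4)]
print(first_fit_decreasing(vms, hosts))
```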

Relevance: 20.00%

Abstract:

A business cluster is a co-located group of micro, small, and medium-scale enterprises. Such firms can benefit significantly from their co-location through shared infrastructure and shared services. Cost sharing becomes an important issue in such sharing arrangements, especially when the firms exhibit strategic behavior. Many cost sharing methods and mechanisms have been proposed in the literature based on game-theoretic foundations. These mechanisms satisfy a variety of efficiency and fairness properties such as allocative efficiency, budget balance, individual rationality, consumer sovereignty, strategyproofness, and group strategyproofness. In this paper, we motivate the problem of cost sharing in a business cluster with strategic firms and illustrate different cost sharing mechanisms through the example of a cluster of firms sharing a logistics service. Next, we look into the problem of a business cluster sharing ICT (information and communication technologies) infrastructure and explore the use of cost sharing mechanisms there.
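
As one concrete example of a game-theoretic cost-sharing rule, the sketch below computes Shapley-value shares for three firms sharing a logistics service; the cost function and stand-alone costs are assumed for illustration, and the Shapley value is only one of the mechanisms the paper surveys.

```python
# One classic game-theoretic cost-sharing rule, the Shapley value, applied to an
# assumed cost function for firms sharing a logistics service (illustrative only).
from itertools import permutations

firms = ["A", "B", "C"]

def cost(coalition: frozenset) -> float:
    """Assumed joint service cost: a shared fixed cost plus economies of scale."""
    base = {"A": 100.0, "B": 80.0, "C": 60.0}   # stand-alone costs (assumed)
    if not coalition:
        return 0.0
    return 40.0 + 0.6 * sum(base[f] for f in coalition)

def shapley_shares(players, cost_fn):
    """Average each player's marginal cost over all join orders."""
    shares = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            shares[p] += cost_fn(with_p) - cost_fn(coalition)
            coalition = with_p
    return {p: round(v / len(orders), 2) for p, v in shares.items()}

print(shapley_shares(firms, cost))
```

The resulting shares sum exactly to the cost of serving all three firms together, i.e. the rule is budget balanced, one of the fairness properties listed above.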

Relevance: 20.00%

Abstract:

The aim of this paper is to assess the heritability of the cerebral cortex, based on measurements of grey matter (GM) thickness derived from structural MR images (sMRI). With data acquired from a large twin cohort (328 subjects), an automated method was used to estimate cortical thickness, and an EM-ICP surface registration algorithm was used to establish the correspondence of the cortex across the population. An ACE model was then employed to compute the heritability of cortical thickness. Cortical thickness was found to be heritable in various cortical regions, especially in the frontal and parietal lobes, including the bilateral postcentral gyri, superior occipital gyri, superior parietal gyri, precuneus, the orbital part of the right frontal gyrus, the right medial superior frontal gyrus, the right middle occipital gyrus, the right paracentral lobule, the left precentral gyrus, and the left dorsolateral superior frontal gyrus.
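
The full ACE model is fitted by structural equation modelling, but its logic can be illustrated with Falconer's simple approximation from monozygotic and dizygotic twin correlations; the correlation values below are assumed, not results from this cohort.

```python
# Rough illustration of the A/C/E decomposition via Falconer's approximation
# (the paper fits a full ACE model; the twin correlations here are assumed).
def falconer_ace(r_mz: float, r_dz: float):
    """Approximate additive-genetic (A), shared-environment (C) and unique-
    environment (E) variance proportions from MZ and DZ twin correlations."""
    a2 = 2.0 * (r_mz - r_dz)     # heritability
    c2 = 2.0 * r_dz - r_mz       # shared environment
    e2 = 1.0 - r_mz              # unique environment plus measurement error
    return a2, c2, e2

# e.g. assumed cortical-thickness correlations of 0.70 (MZ) and 0.45 (DZ):
print(falconer_ace(0.70, 0.45))   # -> (0.50, 0.20, 0.30)
```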

Relevance: 20.00%

Abstract:

Ductility-based design of reinforced concrete structures implicitly assumes a certain level of damage under the action of a design basis earthquake. The damage undergone by a structure needs to be quantified in order to assess the post-seismic reparability and functionality of the structure. This paper presents an analytical method for quantifying and locating seismic damage through system identification methods. Soft ground-storey buildings are among the major casualties in any earthquake, and hence the example structure is a soft (weak) first-storey frame, whose seismic response and temporal variation of damage are computed using a non-linear dynamic analysis program (IDARC) and compared with those of a regular structure. A time-period-based damage identification model is used and suitably calibrated against classic damage models. The regenerated stiffness of the three-degrees-of-freedom model (for the three-storey frame) is used to locate the damage, both online and after the seismic event. Multi-resolution analysis using wavelets is also used for localized damage identification in the soft-storey columns.
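
One common form of a time-period-based (softening) damage index is sketched below; the exact calibration used in the paper may differ, and the period values are assumed for illustration.

```python
# One common "softening" form of a time-period-based damage index (assumed here
# for illustration; the paper calibrates its own model against classic indices):
# since T is proportional to 1/sqrt(K) for constant mass, a lengthened period
# maps directly to a global stiffness loss.
def period_damage_index(t_undamaged: float, t_damaged: float) -> float:
    """D = 1 - (T_undamaged / T_damaged)^2 = 1 - K_damaged / K_undamaged."""
    return 1.0 - (t_undamaged / t_damaged) ** 2

# e.g. a soft-storey frame whose fundamental period lengthens from 0.60 s to
# 0.85 s during the event (values assumed):
print(round(period_damage_index(0.60, 0.85), 2))   # ~0.50, i.e. ~50% stiffness loss
```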

Relevance: 20.00%

Abstract:

Inadvertent failure of power transformers has serious consequences for power system reliability, economics, and revenue accrual. Insulation is the weakest link in a power transformer, prompting periodic inspection of the status of the insulation at different points in time. Close monitoring of the electrical, chemical, and other insulation properties that are sensitive to time-dependent degradation is mandatory for judging the status of the equipment. Data-driven diagnostic testing and condition monitoring (DTCM) specific to power transformers is the aspect in focus. The authors develop a Monte Carlo approach for augmenting the rather scanty experimental data normally acquired from prototypes of power transformers. Also described is a validation procedure for estimating the accuracy of the model so developed.
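
A hedged sketch of the Monte Carlo augmentation idea is given below: fit a parametric distribution to a handful of prototype measurements and resample it to estimate statistics the small sample cannot support. The Weibull choice and the breakdown-voltage figures are assumed for illustration and are not the authors' data or model.

```python
# Hedged sketch of Monte Carlo augmentation of scanty test data (data and the
# Weibull choice are assumed, not the authors' measurements or model): fit a
# distribution to a few prototype breakdown values, then resample it heavily.
import numpy as np
from scipy import stats

measured_kv = np.array([142.0, 151.0, 138.0, 160.0, 147.0, 155.0])  # assumed

# Fit a two-parameter Weibull (location fixed at zero) to the small sample.
shape, loc, scale = stats.weibull_min.fit(measured_kv, floc=0.0)

# Draw a large synthetic sample from the fitted distribution.
synthetic = stats.weibull_min.rvs(shape, loc=loc, scale=scale,
                                  size=100_000, random_state=3)

# e.g. an estimate of the 1st-percentile withstand level, which six measured
# points alone could not support.
print(f"fitted shape={shape:.1f}, scale={scale:.1f} kV, "
      f"P1 ~ {np.percentile(synthetic, 1):.0f} kV")
```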