977 results for cost estimation


Relevance: 30.00%

Abstract:

In networks with small buffers, such as networks based on optical packet switching (OPS), the convolution approach (CA) is regarded as one of the most accurate methods for connection admission control. Admission control and resource management have been addressed in other works oriented to bursty traffic and ATM. This paper focuses on heterogeneous traffic in OPS-based networks. For heterogeneous traffic in bufferless networks, the enhanced convolution approach (ECA) is a good solution. However, both methods (CA and ECA) incur a high computational cost when the number of connections is large. Two new mechanisms (UMCA and ISCA) based on the Monte Carlo method are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost than the enhanced convolution approach, with only a small stochastic error in the probability estimation.
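
As a rough illustration of the Monte Carlo idea underlying UMCA and ISCA (the abstract gives no implementation details, so the traffic model, names and parameters below are hypothetical), a plain Monte Carlo estimator of the overflow probability for a set of heterogeneous on-off connections might look like this:

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_overflow_probability(connections, capacity, n_samples=100_000):
    """Estimate P(aggregate rate > capacity) for independent on-off sources.

    connections: list of (peak_rate, activity_prob) pairs, one per connection.
    """
    peaks = np.array([c[0] for c in connections], dtype=float)
    probs = np.array([c[1] for c in connections], dtype=float)
    # Draw on/off states for every connection in every sample.
    active = rng.random((n_samples, len(connections))) < probs
    load = active.astype(float) @ peaks
    p_hat = float((load > capacity).mean())
    # Standard error of the Monte Carlo estimate (the "stochastic error").
    stderr = np.sqrt(p_hat * (1.0 - p_hat) / n_samples)
    return p_hat, stderr

# 50 heterogeneous connections: mixed peak rates and activity factors.
conns = [(1.0, 0.2)] * 30 + [(2.5, 0.1)] * 20
p, se = mc_overflow_probability(conns, capacity=12.0)
print(f"overflow probability ~ {p:.4f} +/- {se:.4f}")
```

Each sample costs one pass over the connections, so the estimator's cost scales with the sample budget rather than combinatorially with the number of connections, at the price of the reported stochastic error.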

Relevance: 30.00%

Abstract:

Large-scale image mosaicing methods are in great demand among scientists who study different aspects of the seabed, and have been fostered by impressive advances in the capabilities of underwater robots in gathering optical data from the seafloor. Cost and weight constraints mean that low-cost remotely operated vehicles (ROVs) usually have a very limited number of sensors. When a low-cost robot carries out a seafloor survey using a down-looking camera, it usually follows a predetermined trajectory that provides several non-time-consecutive overlapping image pairs. Finding these pairs (a process known as topology estimation) is indispensable for obtaining globally consistent mosaics and accurate trajectory estimates, which are necessary for a global view of the surveyed area, especially when optical sensors are the only data source. This thesis presents a set of consistent methods aimed at creating large-area image mosaics from optical data obtained during surveys with low-cost underwater vehicles. First, a global alignment method developed within a feature-based image mosaicing (FIM) framework, where nonlinear minimisation is substituted by two linear steps, is discussed. Then, a simple four-point mosaic rectifying method is proposed to reduce distortions that might occur due to lens distortion, error accumulation and the difficulties of optical imaging in an underwater medium. The topology estimation problem is addressed by means of a combined augmented-state and extended Kalman filter framework, aimed at minimising the total number of matching attempts while simultaneously obtaining the best possible trajectory. Potential image pairs are predicted by taking into account the uncertainty in the trajectory. The contribution of matching an image pair is investigated using information theory principles. Lastly, a different solution to the topology estimation problem is proposed in a bundle adjustment framework. Innovative aspects include the use of a fast image similarity criterion combined with a minimum spanning tree (MST) solution to obtain a tentative topology. This topology is improved by attempting image matching with the pairs for which there is the most overlap evidence. Unlike previous approaches to large-area mosaicing, our framework is able to deal naturally with cases where time-consecutive images cannot be matched successfully, such as completely unordered sets. Finally, the efficiency of the proposed methods is discussed and compared with other state-of-the-art approaches, using a series of challenging datasets in underwater scenarios.
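
As a sketch of the MST-based tentative-topology step (the similarity criterion and the toy data are placeholders; the thesis pipeline itself is far richer), one could keep the spanning set of strongest overlap links like this:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def tentative_topology(similarity):
    """Build a tentative mosaic topology from a pairwise image-similarity
    matrix by keeping only the spanning set of strongest links.

    similarity: (n, n) symmetric array, higher = more overlap evidence.
    """
    n = similarity.shape[0]
    # MST minimises total edge weight, so use dissimilarity as the weight.
    cost = 1.0 / (similarity + 1e-9)
    np.fill_diagonal(cost, 0.0)
    mst = minimum_spanning_tree(cost).toarray()
    return [(i, j) for i in range(n) for j in range(n) if mst[i, j] > 0]

# Toy example: 4 images where pairs 0-1 and 2-3 overlap strongly.
sim = np.array([[0.0, 0.9, 0.1, 0.0],
                [0.9, 0.0, 0.6, 0.1],
                [0.1, 0.6, 0.0, 0.8],
                [0.0, 0.1, 0.8, 0.0]])
print(tentative_topology(sim))   # spanning edges, e.g. (0,1), (1,2), (2,3)
```

The tentative topology is then refined, as the abstract describes, by attempting image matching on the remaining pairs with the most overlap evidence.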

Relevance: 30.00%

Abstract:

The work developed in this thesis explores and contributes innovative solutions to the correspondence problem in underwater images. In these environments, what really complicates the processing tasks is the lack of well-defined contours caused by blurred images, a fact that is mainly due to deficient illumination or to the lack of uniformity of artificial lighting systems. The objectives achieved in this thesis can be highlighted along two main directions. To improve the motion estimation algorithm, a new method was proposed that introduces texture parameters to reject false correspondences between image pairs. A series of tests on real underwater images was carried out to select the most suitable strategies. In order to achieve real-time results, an innovative VLSI architecture is proposed for the implementation of some of the computationally most expensive parts of the motion estimation algorithm.
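
A minimal sketch of the texture-based rejection idea, using grey-level variance as a stand-in for the thesis's texture parameters (the actual parameters and thresholds are not given in the abstract):

```python
import numpy as np

def _patch(img, x, y, win):
    """Window around (x, y), clamped to the image borders."""
    return img[max(0, y - win):y + win, max(0, x - win):x + win]

def filter_matches(img1, img2, matches, win=8, min_texture=25.0):
    """Reject correspondences whose neighbourhoods carry too little texture,
    since featureless (blurred) regions produce unreliable matches.

    matches: list of ((x1, y1), (x2, y2)) pixel correspondences.
    """
    kept = []
    for (x1, y1), (x2, y2) in matches:
        p1 = _patch(img1, x1, y1, win)
        p2 = _patch(img2, x2, y2, win)
        if p1.size and p2.size and min(np.var(p1), np.var(p2)) >= min_texture:
            kept.append(((x1, y1), (x2, y2)))
    return kept
```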

Relevance: 30.00%

Abstract:

Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward constrained regression manner. The leave-one-out (LOO) test score is used for kernel selection. The jackknife parameter estimator, subject to a positivity constraint check, is used to estimate the single parameter added at each forward step. As such, the proposed approach is simple to implement and the associated computational cost is very low. An illustrative example is employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy comparable to that of the classical Parzen window estimate.
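
For reference, the classical Parzen window target that the sparse estimator approximates can be computed directly; the following one-dimensional Gaussian-kernel version is a minimal sketch (the kernel width h is chosen arbitrarily here):

```python
import numpy as np

def parzen_window(train, x, h):
    """Classical Parzen window (kernel) density estimate with Gaussian
    kernels: every training point carries equal weight 1/N."""
    train = np.asarray(train, dtype=float)
    n = train.size
    # Sum of Gaussian bumps centred on the training samples.
    k = np.exp(-0.5 * ((x[:, None] - train[None, :]) / h) ** 2)
    return k.sum(axis=1) / (n * h * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 200)
grid = np.linspace(-4.0, 4.0, 9)
print(parzen_window(data, grid, h=0.3))
```

The sparse estimator keeps this full-sample estimate as the regression target while retaining only a small subset of the kernels.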

Relevance: 30.00%

Abstract:

Hidden Markov models (HMMs) have been successfully applied to different modelling and classification problems from various areas in recent years. An important step in using HMMs is the initialisation of the model parameters, since the subsequent learning of the parameters depends on these values. This initialisation should take into account knowledge about the addressed problem as well as optimisation techniques, so as to estimate the best initial parameters given a cost function and, consequently, to obtain the best log-likelihood. This paper proposes initialising the parameters of hidden Markov models using the Differential Evolution optimisation algorithm, with the aim of obtaining the best log-likelihood.
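
A minimal sketch of the idea, assuming a 2-state discrete HMM and using SciPy's differential_evolution as the optimiser (the paper's exact parameterisation and settings are not specified in the abstract): DE searches over raw values that are normalised into stochastic matrices, and the negative forward-algorithm log-likelihood is the cost.

```python
import numpy as np
from scipy.optimize import differential_evolution

obs = np.array([0, 1, 0, 0, 1, 1, 0, 1])   # toy observation sequence
S, V = 2, 2                                 # hidden states, symbols

def neg_log_likelihood(theta):
    """Scaled forward algorithm; theta holds raw (unnormalised) rows of
    the transition matrix A, emission matrix B and initial vector pi."""
    A = theta[0:4].reshape(S, S) + 1e-9
    B = theta[4:8].reshape(S, V) + 1e-9
    pi = theta[8:10] + 1e-9
    A /= A.sum(axis=1, keepdims=True)
    B /= B.sum(axis=1, keepdims=True)
    pi /= pi.sum()
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        ll += np.log(c)
        alpha /= c
    return -ll

result = differential_evolution(neg_log_likelihood,
                                bounds=[(0.01, 1.0)] * 10, seed=1)
print("best initial log-likelihood:", -result.fun)
# result.x (after the same normalisation) would seed Baum-Welch training.
```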

Relevance: 30.00%

Abstract:

Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward-constrained regression (FCR) manner. The proposed algorithm selects significant kernels one at a time, while the leave-one-out (LOO) test score is minimised subject to a simple positivity constraint in each forward stage. The model parameter estimation in each forward stage is simply the solution of the jackknife parameter estimator for a single parameter, subject to the same positivity constraint check. For each selected kernel, the associated kernel width is updated via the Gauss-Newton method with the model parameter estimate fixed. The proposed approach is simple to implement and the associated computational cost is very low. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
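
The forward-stage structure can be sketched as follows. For brevity, this toy version fits each candidate's single weight by least squares against the current residual of the Parzen target (the paper instead minimises the LOO test score via the jackknife estimator, and also tunes kernel widths), keeping only the positivity check:

```python
import numpy as np

def gaussian_gram(x, centres, h):
    """Kernel design matrix: column j is the kernel centred at centres[j]."""
    return np.exp(-0.5 * ((x[:, None] - centres[None, :]) / h) ** 2) \
        / (h * np.sqrt(2.0 * np.pi))

def forward_select(x, h, n_kernels):
    """Greedy forward stages: at each stage add the kernel (with a
    nonnegative weight fitted against the current residual) that most
    reduces the squared error to the Parzen window target."""
    Phi = gaussian_gram(x, x, h)
    y = Phi.mean(axis=1)            # Parzen window target on the data
    residual, chosen, weights = y.copy(), [], []
    for _ in range(n_kernels):
        best = None
        for j in range(len(x)):
            if j in chosen:
                continue
            col = Phi[:, j]
            w = max(0.0, col @ residual / (col @ col))   # positivity check
            err = np.sum((residual - w * col) ** 2)
            if best is None or err < best[0]:
                best = (err, j, w)
        _, j, w = best
        chosen.append(j)
        weights.append(w)
        residual -= w * Phi[:, j]
    return chosen, weights
```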

Relevance: 30.00%

Abstract:

A new robust neurofuzzy model construction algorithm has been introduced for the modeling of a priori unknown dynamical systems from observed finite data sets, in the form of a set of fuzzy rules. Based on a Takagi-Sugeno (T-S) inference mechanism, a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace is established. This link enables rule-based knowledge to be extracted from the matrix subspace to enhance model transparency. In order to maximize model robustness and sparsity, a new robust extended Gram-Schmidt (G-S) method has been introduced via two effective and complementary approaches: regularization and D-optimality experimental design. Model rule bases are decomposed into orthogonal subspaces, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. A locally regularized orthogonal least squares algorithm, combined with D-optimality for subspace-based rule selection, has been extended for fuzzy rule regularization and subspace-based information extraction. By using a weighting for the D-optimality cost function, the entire model construction procedure becomes automatic. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
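
The Gram-Schmidt backbone of such constructions, stripped of the regularization and D-optimality terms, is classical orthogonal forward regression; a compact sketch (the design matrix Phi and target y are placeholders):

```python
import numpy as np

def ols_forward(Phi, y, n_terms):
    """Classical orthogonal forward regression: at each step orthogonalise
    the remaining candidate regressors against those already selected
    (Gram-Schmidt) and pick the one with the largest error reduction ratio."""
    n, m = Phi.shape
    selected, Q = [], []
    for _ in range(n_terms):
        best = None
        for j in range(m):
            if j in selected:
                continue
            w = Phi[:, j].astype(float).copy()
            for q in Q:                          # Gram-Schmidt step
                w -= (q @ Phi[:, j]) / (q @ q) * q
            if w @ w < 1e-12:                    # numerically dependent
                continue
            err = (w @ y) ** 2 / ((w @ w) * (y @ y))   # error reduction ratio
            if best is None or err > best[0]:
                best = (err, j, w)
        if best is None:
            break
        _, j, w = best
        selected.append(j)
        Q.append(w)
    return selected
```

In the paper's setting the selected regressors correspond to fuzzy-rule subspaces, and the selection score is augmented with regularization and a weighted D-optimality term.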

Relevance: 30.00%

Abstract:

This letter proposes a subspace-based blind adaptive channel estimation algorithm for dual-rate quasi-synchronous DS/CDMA systems, which can operate in the low-rate (LR) or high-rate (HR) mode. Simulation results show that the proposed blind adaptive algorithm in the LR mode has better performance than in the HR mode, at the cost of increased computational complexity.
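
The generic subspace step behind such estimators can be sketched as follows (this omits everything dual-rate specific, such as the spreading-code structure and the adaptive subspace tracking): split the received-signal covariance into signal and noise subspaces, then take the channel estimate as the direction most orthogonal to the noise subspace.

```python
import numpy as np

def subspace_channel_estimate(R, signal_dim):
    """Generic subspace method: eigendecompose the received-signal
    covariance R, keep the noise subspace (smallest eigenvalues), and
    return the unit vector h minimising h^H (Un Un^H) h."""
    eigval, eigvec = np.linalg.eigh(R)            # ascending eigenvalues
    Un = eigvec[:, : R.shape[0] - signal_dim]     # noise subspace basis
    Q = Un @ Un.conj().T
    qval, qvec = np.linalg.eigh(Q)
    return qvec[:, 0]                             # smallest-eigenvalue vector
```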

Relevance: 30.00%

Abstract:

This paper proposes a subspace-based blind adaptive channel estimation algorithm for dual-rate DS-CDMA systems, which can operate in the low-rate (LR) or high-rate (HR) mode. Simulation results show that the proposed blind adaptive algorithm in the LR mode has better performance than in the HR mode, at the cost of increased computational complexity.

Relevance: 30.00%

Abstract:

There is a current need to constrain the parameters of gravity wave drag (GWD) schemes in climate models using observational information instead of tuning them subjectively. In this work, an inverse technique is developed using data assimilation principles to estimate gravity wave parameters. Because most GWD schemes assume instantaneous vertical propagation of gravity waves within a column, observations in a single column can be used to formulate a one-dimensional assimilation problem to estimate the unknown parameters. We define a cost function that measures the differences between the unresolved drag inferred from observations (referred to here as the 'observed' GWD) and the GWD calculated with a parametrisation scheme. The geometry of the cost function presents some difficulties, including multiple minima and ill-conditioning arising from the non-independence of the gravity wave parameters. To overcome these difficulties we propose a genetic algorithm to minimize the cost function, which provides robust parameter estimation over a broad range of prescribed 'true' parameters. When real experiments using an independent estimate of the 'observed' GWD are performed, physically unrealistic values of the parameters can result, owing to the non-independence of the parameters. However, by constraining one of the parameters to lie within a physically realistic range, this degeneracy is broken and the other parameters are also found to lie within physically realistic ranges. This argues for the essential physical self-consistency of the gravity wave scheme. A much better fit to the observed GWD at high latitudes is obtained when the parameters are allowed to vary with latitude. However, a close fit can be obtained either in the upper or the lower part of the profiles, but not in both at the same time. This result is a consequence of assuming an isotropic launch spectrum. The changes of sign in the GWD found in the tropical lower stratosphere, which are associated with part of the quasi-biennial oscillation forcing, cannot be captured by the parametrisation with optimal parameters.
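
A bare-bones real-coded genetic algorithm of the kind used to minimise such a cost function might look as follows (operators, rates and population size are illustrative guesses, not the paper's configuration):

```python
import numpy as np

rng = np.random.default_rng(7)

def genetic_minimise(cost, bounds, pop_size=40, n_gen=100,
                     mut_sigma=0.1, elite=4):
    """Real-coded GA: truncation selection, arithmetic crossover and
    Gaussian mutation, keeping the best `elite` individuals unchanged.
    `cost` maps a parameter vector to the misfit between 'observed'
    and parametrised drag."""
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(n_gen):
        fitness = np.array([cost(p) for p in pop])
        order = np.argsort(fitness)
        parents = pop[order[: pop_size // 2]]
        children = []
        while len(children) < pop_size - elite:
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.random()
            child = w * a + (1.0 - w) * b                    # crossover
            child += rng.normal(0.0, mut_sigma * (hi - lo))  # mutation
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([pop[order[:elite]], np.array(children)])
    fitness = np.array([cost(p) for p in pop])
    return pop[np.argmin(fitness)]
```

For instance, with a toy quadratic misfit cost = lambda p: float(np.sum((p - np.array([0.3, 1.5])) ** 2)) and bounds [(0, 1), (0, 3)], genetic_minimise recovers the prescribed 'true' parameters despite making no use of gradients, which is what makes it robust to the multiple minima described above.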

Relevance: 30.00%

Abstract:

We develop a new sparse kernel density estimator using a forward constrained regression framework, within which the nonnegative and summing-to-unity constraints on the mixing weights can easily be satisfied. Our main contribution is to derive a recursive algorithm that selects significant kernels one at a time based on the minimum integrated square error (MISE) criterion, for both the selection of kernels and the estimation of mixing weights. The proposed approach is simple to implement and the associated computational cost is very low. Specifically, the complexity of our algorithm is of the order of the number of training data, N, which is much lower than the order N² of the best existing sparse kernel density estimators. Numerical examples are employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy comparable to that of the classical Parzen window estimate and other existing sparse kernel density estimators.
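
For reference, the MISE criterion that drives both the kernel selection and the weight estimation is the standard one:

```latex
% Mean integrated square error of the estimate \hat p(x;\beta_N)
% with respect to the unknown true density p(x):
\mathrm{MISE}(\hat p)
  = E\!\left[\int \bigl(\hat p(x;\beta_N) - p(x)\bigr)^{2}\,\mathrm{d}x\right]
  = E\!\left[\int \hat p^{2}\,\mathrm{d}x\right]
    - 2\,E\!\left[\int \hat p\,p\,\mathrm{d}x\right]
    + \int p^{2}\,\mathrm{d}x
```

The last term does not depend on the mixing weights, so it can be dropped during minimisation.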

Relevance: 30.00%

Abstract:

Partial budgeting was used to estimate the net benefit of blending Jersey milk with Holstein-Friesian milk for Cheddar cheese production. Jersey milk increases Cheddar cheese yield; however, the cost of Jersey milk is also higher, so the balance of profitability must be determined, including consideration of seasonal effects. Input variables were based on a pilot-plant experiment run from 2012 to 2013 and on industry milk and cheese prices during this period. When Jersey milk was blended with Holstein-Friesian milk at increasing rates (25, 50, 75, and 100% Jersey milk), the average net profit increased by 3.41, 6.44, 8.57, and 11.18 pence per kilogram of milk, respectively, and this additional profit was constant throughout the year. Sensitivity analysis showed that the most influential input on additional profit was cheese yield, whereas cheese price and milk price had a small effect. The minimum increase in yield necessary for the use of Jersey milk to be profitable was 2.63, 7.28, 9.95, and 12.37% at 25, 50, 75, and 100% Jersey milk, respectively. Including Jersey milk did not affect the quantity of whey butter and powder produced. Although further research is needed to ascertain the amount of additional profit that would be achieved on a commercial scale, the results indicate that using Jersey milk for Cheddar cheese making would improve profit for cheese makers, especially at higher inclusion rates.
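
The partial-budget logic reduces to a simple per-kilogram comparison; the figures in the sketch below are hypothetical placeholders, not the study's data:

```python
def net_benefit_per_kg(extra_yield, cheese_price, milk_premium):
    """Toy partial budget for blending Jersey milk: added cheese revenue
    per kg of milk minus the extra cost of the milk itself.

    extra_yield   -- additional kg of cheese per kg of milk from blending
    cheese_price  -- pence per kg of cheese
    milk_premium  -- extra pence paid per kg of Jersey-blended milk
    """
    return extra_yield * cheese_price - milk_premium

# Hypothetical figures: +0.02 kg cheese/kg milk, cheese at 300 p/kg,
# milk costing an extra 2 p/kg -> net benefit in pence per kg of milk.
print(net_benefit_per_kg(0.02, 300.0, 2.0))   # 4.0 p/kg
```

Setting the expression to zero and solving for extra_yield gives the break-even yield increase, which is the quantity the sensitivity analysis reports for each inclusion rate.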

Relevance: 30.00%

Abstract:

Sensitivity and specificity are measures that allow us to evaluate the performance of a diagnostic test. In practice, it is common to encounter situations where a proportion of the selected individuals cannot have the true disease state verified, since verification may be an invasive procedure, as occurs with biopsy. This happens, for example, in the diagnosis of prostate cancer, and in any other situation where verification is risky, impracticable, unethical, or too costly. In such cases, it is common to use diagnostic tests based only on the information from verified individuals, a procedure that can lead to biased results (workup bias). In this paper, we introduce a Bayesian approach to estimate the sensitivity and specificity of two diagnostic tests considering both verified and unverified individuals, a result that generalizes the usual situation based on only one diagnostic test.
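
A minimal single-test sketch of the Bayesian data-augmentation idea (the paper treats two tests jointly; the priors, updates and variable names here are illustrative assumptions): unverified subjects' disease states are imputed from the current parameters, then conjugate Beta updates use the completed data.

```python
import numpy as np

rng = np.random.default_rng(3)

def gibbs_se_sp(t, d, n_iter=2000):
    """Gibbs sampler for prevalence, sensitivity (se) and specificity (sp)
    of one binary test when only some subjects are verified.

    t -- test results (0/1) for all subjects
    d -- verified disease state (0/1), or -1 where unverified
    Beta(1, 1) priors throughout.
    """
    t = np.asarray(t, dtype=int)
    d = np.array(d, dtype=float)          # copy; -1 entries get imputed
    unver = d < 0
    prev, se, sp = 0.5, 0.8, 0.8          # arbitrary starting values
    draws = []
    for _ in range(n_iter):
        # 1. Impute latent disease states for unverified subjects.
        p1 = prev * se ** t * (1.0 - se) ** (1 - t)
        p0 = (1.0 - prev) * (1.0 - sp) ** t * sp ** (1 - t)
        d[unver] = rng.random(unver.sum()) < (p1 / (p1 + p0))[unver]
        # 2. Conjugate Beta updates given the completed data.
        dis = d == 1
        prev = rng.beta(1 + dis.sum(), 1 + (~dis).sum())
        se = rng.beta(1 + t[dis].sum(), 1 + (1 - t[dis]).sum())
        sp = rng.beta(1 + (1 - t[~dis]).sum(), 1 + t[~dis].sum())
        draws.append((se, sp))
    return np.array(draws[n_iter // 2:])  # drop burn-in
```

Posterior means of the retained draws estimate sensitivity and specificity while using the unverified individuals, which is what removes the workup bias of the verified-only analysis.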

Relevance: 30.00%

Abstract:

This paper addresses the routing of a cable cycle through the available routes in a building in order to link a set of devices in the most reasonable way. Despite the similarities to other NP-hard routing problems, the goal is not only to minimize the cost (the length of the cycle) but also to increase the reliability of the path in case of a cable cut, which is assessed by a risk factor. Since there is often a trade-off between the risk and length factors, a criterion for ranking candidates and deciding on the most reasonable solution is defined. A set of techniques is proposed to perform an efficient and exact search among candidates. A novel graph is introduced to reduce the search space and steer the search toward feasible and desirable solutions. Moreover, an admissible heuristic length estimate enables early detection of partial cycles that lead to unreasonable solutions. The results show that the method provides solutions that are both technically and financially reasonable. Furthermore, it is demonstrated that the proposed techniques are very efficient in reducing the computational time of the search to a reasonable amount.
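
The ranking criterion is not spelled out in the abstract; one simple way to encode the length/risk trade-off is a weighted score, sketched below with hypothetical names and weights:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    route: list          # sequence of nodes forming the cable cycle
    length: float        # metres of cable
    risk: float          # assessed risk factor (e.g. shared-duct exposure)

def most_reasonable(candidates, cost_per_metre=1.0, risk_weight=50.0):
    """Illustrative ranking rule: score each exact-search candidate by
    cable cost plus a weighted risk penalty and return the lowest-scoring
    cycle.  The risk weight is a design parameter of the criterion."""
    def score(c):
        return c.length * cost_per_metre + c.risk * risk_weight
    return min(candidates, key=score)
```

Raising risk_weight trades cable length for reliability, which mirrors the trade-off the paper's criterion is designed to arbitrate.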

Relevance: 30.00%

Abstract:

Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains in forecasting accuracy and estimation uncertainty from two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration; the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length; criteria selecting lag and rank simultaneously have superior performance in this case. Second, this translates into superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even for the long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
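
The VECM-versus-VAR part of the comparison can be reproduced in miniature with statsmodels (SCCF restrictions are not available there, so this sketch contrasts only an unrestricted VAR in levels with the cointegration-restricted VECM, on synthetic cointegrated data):

```python
import numpy as np
from statsmodels.tsa.api import VAR
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(11)

# Two cointegrated series: y2 tracks the random walk y1 plus stationary noise.
y1 = np.cumsum(rng.normal(size=300))
y2 = y1 + rng.normal(scale=0.5, size=300)
data = np.column_stack([y1, y2])
train, test = data[:280], data[280:]

# Unrestricted VAR in levels versus rank-restricted VECM.
var_fc = VAR(train).fit(2).forecast(train[-2:], steps=20)
vecm_fc = VECM(train, k_ar_diff=1, coint_rank=1).fit().predict(steps=20)

for name, fc in [("VAR", var_fc), ("VECM", vecm_fc)]:
    rmse = np.sqrt(np.mean((fc - test) ** 2))
    print(f"{name} forecast RMSE: {rmse:.3f}")
```

Imposing the correct cointegrating rank typically tightens the long-horizon forecasts, the direction of the gains the simulations quantify; the paper's restricted VECM additionally imposes the SCCF short-run restrictions.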