33 results for cost estimation accuracy
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
Query processing over the Internet involving autonomous data sources is a major task in data integration. It requires estimating the costs of the possible queries in order to select the one with the minimum cost. In this context, the cost of a query is affected by three factors: network congestion, server contention state, and the complexity of the query. In this paper, we study the effects of both network congestion and server contention state on the cost of a query; we refer to these two factors together as system contention states. We present a new approach to determining the system contention states by clustering the costs of a sample query. For each system contention state, we construct two cost formulas, for unary and join queries respectively, using multiple regression. When a new query is submitted, its system contention state is first estimated using either the time-slides method or the statistical method. The cost of the query is then calculated using the corresponding cost formulas, and the estimate is further adjusted to improve its accuracy. Our experiments show that these methods produce quite accurate cost estimates for queries submitted to remote data sources over the Internet.
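The pipeline this abstract describes (cluster sampled query costs into contention states, then fit a per-state regression cost formula) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the data is synthetic, the clustering is plain 1-D k-means, and the cost formula is a simple linear fit on a hypothetical result-size driver.

```python
import numpy as np

def kmeans_1d(values, k, iters=50, seed=0):
    """Cluster 1-D query-cost samples into k contention states (plain k-means)."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False)
    for _ in range(iters):
        # assign each sample to its nearest center, then recompute centers
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        centers = np.array([values[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return np.sort(centers)

def fit_cost_formula(result_sizes, costs):
    """Per-state cost formula by least squares: cost ~ a + b * result_size."""
    b, a = np.polyfit(result_sizes, costs, 1)
    return a, b

# Synthetic costs of one sample query observed under two contention states.
rng = np.random.default_rng(1)
samples = np.concatenate([rng.normal(1.0, 0.05, 100),   # lightly loaded, ~1 s
                          rng.normal(3.0, 0.10, 100)])  # congested, ~3 s
centers = kmeans_1d(samples, k=2)
print(centers.round(2))  # cluster centers near 1.0 and 3.0

# Within one contention state, fit a linear cost formula on a synthetic driver.
sizes = np.arange(1.0, 51.0)
a, b = fit_cost_formula(sizes, 0.5 + 0.02 * sizes)
print(round(a, 2), round(b, 3))  # 0.5 0.02
```

At query time one would first pick the contention state (nearest center, per the time-slides or statistical method in the paper) and then apply that state's formula.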
Abstract:
A generic, hierarchical, and multifidelity methodology for estimating the unit cost of acquisition of outside-production machined parts is presented. The originality of the work lies in the method's inherent capability to generate multilevel and multifidelity cost relations for large volumes of parts, utilizing process data, supply chain costing data, and varying degrees of part design definition information. Estimates can be generated throughout the life cycle of a part using different grades of the combined information available. Considering design development for a given part, additional design definition may be used within the developed method as it becomes available, improving the quality of the resulting estimate. Via a process of analogous classification, parts are classified into groups of increasing similarity using design-based descriptors. A parametric estimating method is then applied to each subgroup of the machined-part commodity, from which a relationship linking design variables to manufacturing cycle time may be generated. A rate cost reflective of the supply chain is then applied to the cycle time estimate for a given part to arrive at an estimate of make cost, which is totalled with the material and treatments cost components to give an overall estimate of unit acquisition cost. Both the rate charge applied and the treatments cost calculated for a given procured part are derived via ratio analysis.
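The cost rollup the abstract outlines (make cost from cycle time times a supply-chain rate, plus material and treatments) reduces to simple arithmetic. The sketch below is illustrative only; the treatments-as-a-ratio-of-make-cost form is an assumption standing in for the paper's ratio analysis, and all figures are invented.

```python
def unit_acquisition_cost(cycle_time_hr, rate_per_hr, material_cost, treatment_ratio):
    """Unit acquisition cost rollup (illustrative structure, not the paper's model).
    make cost       = cycle time * supply-chain rate
    treatments cost = assumed ratio applied to make cost (ratio analysis)
    total           = make + material + treatments
    """
    make_cost = cycle_time_hr * rate_per_hr
    treatments_cost = treatment_ratio * make_cost
    return make_cost + material_cost + treatments_cost

# 2.5 h at 80/h -> 200 make; +45 material; +10% treatments (20) -> 265
print(unit_acquisition_cost(2.5, 80.0, 45.0, 0.10))  # 265.0
```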
Abstract:
Bridge construction responds to the need for environmentally friendly design of motorways and facilitates the passage through sensitive natural areas and the bypassing of urban areas. However, according to numerous research studies, bridge construction presents substantial budget overruns. Therefore, it is necessary early in the planning process for decision makers to have reliable estimates of the final cost based on previously constructed projects. At the same time, the current European financial crisis reduces the available capital for investments, and financial institutions are even less willing to finance transportation infrastructure. Consequently, it is even more necessary today to estimate the budget of high-cost construction projects (such as road bridges) with reasonable accuracy, so that state funds are invested with lower risk and projects are designed with the highest possible efficiency. In this paper, a Bill-of-Quantities (BoQ) estimation tool for road bridges is developed in order to support the decisions made at the preliminary planning and design stages of highways. Specifically, a Feed-Forward Artificial Neural Network (ANN) with a hidden layer of 10 neurons is trained to predict the superstructure material quantities (concrete, pre-stressed steel and reinforcing steel) using the width of the deck, the adjusted length of span or cantilever and the type of the bridge as input variables. The training dataset includes actual data from 68 recently constructed concrete motorway bridges in Greece. According to the relevant metrics, the developed model captures the complex interrelations in the dataset very well and demonstrates strong generalisation capability. Furthermore, it outperforms the linear regression models developed for the same dataset. Therefore, the proposed cost estimation model stands as a useful and reliable tool for the construction industry, as it enables planners to reach informed decisions for the technical and economic planning of concrete bridge projects from their early implementation stages.
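The network architecture named in the abstract (one hidden layer of 10 neurons, three inputs) can be sketched in plain NumPy. Everything here is a stand-in: the dataset is synthetic (not the 68 Greek bridges), the target is a single invented quantity, and gradient descent with tanh units is assumed rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: deck width (m), adjusted span (m), bridge type (0/1).
X = rng.uniform([10, 20, 0], [15, 45, 1], size=(68, 3))
# Invented near-linear target standing in for a material quantity.
y = 5.0 * X[:, 0] + 2.0 * X[:, 1] + 30.0 * X[:, 2] + rng.normal(0, 1, 68)

# Standardize inputs and target.
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()

# One hidden layer of 10 tanh neurons, trained by plain gradient descent.
W1 = rng.normal(0, 0.5, (3, 10)); b1 = np.zeros(10)
W2 = rng.normal(0, 0.5, 10);      b2 = 0.0
lr, n = 0.1, len(ys)
for _ in range(3000):
    H = np.tanh(Xs @ W1 + b1)            # hidden activations (68, 10)
    err = H @ W2 + b2 - ys               # residuals
    gW2 = H.T @ err / n; gb2 = err.mean()
    dH = np.outer(err, W2) * (1 - H**2)  # backprop through tanh
    gW1 = Xs.T @ dH / n; gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(Xs @ W1 + b1) @ W2 + b2 - ys) ** 2))
print(round(mse, 3))  # small: the net fits the synthetic data well
```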
Abstract:
This Letter rethinks the problems of available bandwidth estimation in IEEE 802.11-based ad hoc networks. The estimation accuracy is increased by improving the accuracy of calculating the probability that two adjacent nodes' idle periods overlap (a key issue when estimating the available bandwidth in 802.11 networks).
Abstract:
The need to account for the effect of design decisions on manufacture, and for the impact of manufacturing cost on the life cycle cost of any product, is well established. In this context, digital design and manufacturing solutions have to be further developed to facilitate and automate the integration of cost as one of the major drivers in product life cycle management. This article presents an integration methodology for implementing cost estimation capability within a digital manufacturing environment. A digital manufacturing structure of knowledge databases is set out, and an ontology of assembly and part costing consistent with that structure is provided. Although the methodology is currently used for recurring cost prediction, it can equally be applied to other functional developments, such as process planning. A prototype tool is developed to integrate both assembly time costs and part manufacturing costs within the same digital environment. An industrial example is used to validate the approach.
Abstract:
Variations are inherent in all manufacturing processes and can significantly affect the quality of a final assembly, particularly in multistage assembly systems. Existing research in variation management has primarily focused on incorporating GD&T factors into variation propagation models in order to predict product quality and allocate tolerances. However, process-induced variation, which has a key influence on process planning, has not been fully studied. Furthermore, the link between variation and cost has not been well established, in particular the effect that assembly process selection has on the final quality and cost of a product. To overcome these barriers, this paper proposes a novel method utilizing process capabilities to establish the relationship between variation and cost. The methodology is discussed using a real industrial case study. The benefits include determining the optimum configuration of an assembly system and facilitating the rapid introduction of novel assembly techniques to achieve a competitive edge.
Abstract:
Temporal dynamics and speaker characteristics are two important features of speech that distinguish speech from noise. In this paper, we propose a method to maximally extract these two features of speech for speech enhancement. We demonstrate that this can reduce the requirement for prior information about the noise, which can be difficult to estimate for fast-varying noise. Given noisy speech, the new approach estimates clean speech by recognizing long segments of the clean speech as whole units. In the recognition, clean speech sentences, taken from a speech corpus, are used as examples. Matching segments are identified between the noisy sentence and the corpus sentences. The estimate is formed by using the longest matching segments found in the corpus sentences. Longer speech segments as whole units contain more distinct dynamics and richer speaker characteristics, and can be identified more accurately from noise than shorter speech segments. Therefore, estimation based on the longest recognized segments increases the noise immunity and hence the estimation accuracy. The new approach consists of a statistical model to represent up to sentence-long temporal dynamics in the corpus speech, and an algorithm to identify the longest matching segments between the noisy sentence and the corpus sentences. The algorithm is made more robust to noise uncertainty by introducing missing-feature based noise compensation into the corpus sentences. Experiments have been conducted on the TIMIT database for speech enhancement from various types of nonstationary noise including song, music, and crosstalk speech. The new approach has shown improved performance over conventional enhancement algorithms in both objective and subjective evaluations.
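The core matching step described above, finding the longest contiguous segment shared by the noisy sentence and a corpus sentence, can be illustrated with the classic longest-common-substring dynamic program. This operates on plain symbols for clarity; the paper matches acoustic feature sequences under a statistical model with missing-feature noise compensation, so this is only a structural sketch.

```python
def longest_matching_segment(noisy, corpus_sentence):
    """Longest contiguous run of matching symbols (e.g. quantized frame labels)
    between a noisy utterance and one corpus sentence, via standard DP."""
    n, m = len(noisy), len(corpus_sentence)
    best_len, best_end = 0, 0
    prev = [0] * (m + 1)           # run lengths ending at previous noisy symbol
    for i in range(1, n + 1):
        cur = [0] * (m + 1)
        for j in range(1, m + 1):
            if noisy[i - 1] == corpus_sentence[j - 1]:
                cur[j] = prev[j - 1] + 1    # extend the diagonal run
                if cur[j] > best_len:
                    best_len, best_end = cur[j], i
        prev = cur
    return noisy[best_end - best_len:best_end]

print(longest_matching_segment("xxhelloworldyy", "zzhelloworldzz"))  # helloworld
```

The longer the recovered run, the more temporal dynamics and speaker character it carries, which is exactly why the paper prefers the longest matching segments.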
Abstract:
When an agent wants to fulfill its desires about the world, it usually has multiple plans to choose from, and these plans have different preconditions and side effects beyond achieving its goals. Therefore, for further reasoning and interaction with the world, a plan selection strategy (usually based on plan cost estimation) is mandatory for an autonomous agent. This demand becomes even more critical when uncertainty in the observation of the world is taken into account, since in this case we consider not only the costs of different plans but also their chances of success, estimated according to the agent's beliefs. In addition, when multiple goals are considered together, different plans achieving the goals can conflict in their preconditions (contexts) or in the resources they require. Hence a plan selection strategy should be able to choose a subset of plans that fulfills the maximum number of goals while maintaining context consistency and resource tolerance among the chosen plans. To address these two issues, in this paper we first propose several principles that a plan selection strategy should satisfy, and then present selection strategies that stem from those principles, depending on whether plan cost is taken into account. We also show that our selection strategy can partially recover intention revision.
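The selection problem the abstract poses (pick a mutually consistent subset of plans covering the most goals) can be sketched as a small brute-force search. This is a toy model, not the paper's principled strategies: conflicts are given as explicit pairs, costs and success probabilities are omitted, and exhaustive enumeration is assumed to be affordable for the handful of plans an agent weighs.

```python
from itertools import combinations

def select_plans(plans, conflicts):
    """Choose a conflict-free subset of plans covering the most goals.
    plans: {plan_name: set_of_goals}; conflicts: set of frozenset pairs."""
    best, best_goals = set(), set()
    names = list(plans)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            # skip subsets containing a conflicting pair (context/resource clash)
            if any(frozenset(p) in conflicts for p in combinations(combo, 2)):
                continue
            goals = set().union(*(plans[p] for p in combo)) if combo else set()
            if len(goals) > len(best_goals):
                best, best_goals = set(combo), goals
    return best, best_goals

plans = {"p1": {"g1", "g2"}, "p2": {"g3"}, "p3": {"g2", "g3"}}
conflicts = {frozenset({"p1", "p3"})}   # p1 and p3 clash on context/resources
best_set, covered = select_plans(plans, conflicts)
print(sorted(best_set), sorted(covered))  # ['p1', 'p2'] ['g1', 'g2', 'g3']
```

Here p1 and p3 conflict, so the search settles on {p1, p2}, which still covers all three goals.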
Abstract:
Recent improvements in the speed, cost and accuracy of next generation sequencing are revolutionizing the discovery of single nucleotide polymorphisms (SNPs). SNPs are increasingly being used as an addition to the molecular ecology toolkit in nonmodel organisms, but their efficient use remains challenging. Here, we discuss common issues when employing SNP markers, including the high numbers of markers typically employed, the effects of ascertainment bias and the inclusion of nonneutral loci in a marker panel. We provide a critique of considerations specifically associated with the application and population genetic analysis of SNPs in nonmodel taxa, focusing specifically on some of the most commonly applied methods.
Abstract:
The work presented is concerned with the estimation of manufacturing cost at the concept design stage, when little technical information is readily available. The work focuses on the nose cowl sections of a wide range of engine nacelles built at Bombardier Aerospace Shorts of Belfast. A core methodology is presented that: defines the manufacturing cost elements that are prominent; utilises the technical parameters that are highly influential in generating those costs; establishes the linkage between the two; and builds the associated cost estimating relations into models. The methodology is readily adapted to deal with both the early and more mature conceptual design phases, which highlights the generic, flexible and fundamental nature of the method. The early concept cost model simplifies cost as a cumulative element that can be estimated using higher-level complexity ratings, while the mature concept cost model breaks manufacturing cost down into a number of constituents that are each driven by their own specific drivers. Both methodologies have an average error of less than ten percent when correlated with actual findings, thus achieving an acceptable level of accuracy. By way of validity and application, the research is firmly based on industrial case studies and practice and addresses the integration of design and manufacture through cost. The main contribution of the paper is the cost modelling methodology. The elemental modelling of the cost breakdown structure through materials, part fabrication, assembly and their associated drivers is relevant to the analytical design procedure, as it utilises design definition and complexity that is understood by engineers.
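The early-concept model described (cost as a cumulative element scaled by a high-level complexity rating) can be sketched as follows. The elemental breakdown into material, fabrication and assembly follows the abstract; the linear scaling by a single complexity factor and all figures are illustrative assumptions, not Bombardier's actual cost estimating relations.

```python
def early_concept_cost(material_cost, fab_hours, asm_hours, labour_rate, complexity):
    """Illustrative early-concept estimate: elemental costs summed, then scaled
    by a high-level complexity rating (assumed multiplicative form)."""
    make_cost = (fab_hours + asm_hours) * labour_rate
    return (material_cost + make_cost) * complexity

# 1200 material; (30 + 12) h at 95/h = 3990 make; x1.15 complexity -> 5968.5
print(early_concept_cost(1200.0, 30.0, 12.0, 95.0, 1.15))
```

The mature-concept model would replace the single complexity factor with one driver per cost constituent, per the abstract.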
Abstract:
This paper presents a novel approach based on the use of evolutionary agents for epipolar geometry estimation. In contrast to conventional nonlinear optimization methods, the proposed technique employs each agent to denote a minimal subset used to compute the fundamental matrix, and considers the data set of correspondences as a 1D cellular environment, which the agents inhabit and in which they evolve. The agents execute evolutionary behaviours and evolve autonomously in a vast solution space to reach the optimal (or near-optimal) result. Three different techniques are then proposed to improve the searching ability and computational efficiency of the original agents. Subset templates enable agents to collaborate more efficiently with each other and to inherit accurate information from the whole agent set. Competitive evolutionary agents (CEA) and finite multiple evolutionary agents (FMEA) apply a better evolutionary strategy or decision rule, and focus on different aspects of the evolutionary process. Experimental results on both synthetic data and real images show that the proposed agent-based approaches perform better than other typical methods in terms of accuracy and speed, and are more robust to noise and outliers.
Abstract:
In this paper, a new reconfigurable multi-standard Motion Estimation (ME) architecture is proposed and a standard-cell based design study is presented. The architecture exhibits simpler control, high throughput and relatively low hardware cost, and is highly competitive when compared with existing designs for specific video standards. ©2007 IEEE.
Abstract:
Quality of Service (QoS) support in IEEE 802.11-based ad hoc networks relies on the networks' ability to estimate the available bandwidth on a given link. However, no mechanism has been standardized to accurately evaluate this resource, and it remains one of the main open research issues in this field. This paper proposes an available bandwidth estimation approach which achieves more accurate estimation than existing research. The proposed approach differentiates channel busy time caused by transmitting or receiving from that caused by carrier sensing, and thus improves the accuracy of estimating the overlap probability of two adjacent nodes' idle time. Simulation results demonstrate the improvement of this approach when compared with well-known bandwidth estimation methods in the literature.
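The estimation idea in this abstract can be sketched numerically. Under the common independence assumption, the probability that both ends of a link are idle at once is the product of their idle-time fractions; the refinement hinted at here, treating time that is "busy" only due to carrier sensing as partly usable, is modelled below as simply adding that fraction back. Both formulas and all figures are illustrative simplifications, not the paper's actual mechanism.

```python
def estimate_abw(capacity_mbps, idle_frac_s, idle_frac_r):
    """Baseline estimate: capacity scaled by the probability that sender and
    receiver are idle simultaneously (independent idle periods assumed)."""
    return capacity_mbps * idle_frac_s * idle_frac_r

def estimate_abw_refined(capacity_mbps, idle_s, idle_r, sense_s, sense_r):
    """Hypothetical refinement: time busy only from carrier-sensing a distant
    transmission is added back to each node's usable idle fraction."""
    return capacity_mbps * min(idle_s + sense_s, 1.0) * min(idle_r + sense_r, 1.0)

print(estimate_abw(54.0, 0.5, 0.6))                     # ~16.2 Mb/s
print(estimate_abw_refined(54.0, 0.5, 0.6, 0.1, 0.05))  # ~21.1 Mb/s
```

Distinguishing the two kinds of busy time raises the estimate, which is the direction of correction the abstract claims.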