14 results for Estimation process


Relevance:

70.00%

Publisher:

Abstract:

A new domain-specific, reconfigurable system-on-a-chip (SoC) architecture is proposed for video motion estimation. This has been designed to cover most of the common block-based video coding standards, including MPEG-2, MPEG-4, H.264, WMV-9 and AVS. The architecture exhibits simple control, high throughput and relatively low hardware cost when compared with existing circuits. It can also easily handle flexible search ranges without any increase in silicon area and can be configured prior to the start of the motion estimation process for a specific standard. The computational rates achieved make the circuit suitable for high-end video processing applications, such as HDTV. Silicon design studies indicate that circuits based on this approach incur only a relatively small penalty in terms of power dissipation and silicon area when compared with implementations for specific standards. Indeed, the cost/performance achieved exceeds that of existing standard-specific solutions and greatly exceeds that of general-purpose field programmable gate array (FPGA) designs.
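The block-matching search at the heart of such motion estimators can be sketched in software. This is a minimal full-search illustration of the idea, not the paper's hardware datapath; the frame contents, block size and search range below are hypothetical toy values, and widening `search` mirrors the abstract's point that a flexible range costs more comparisons rather than more silicon.

```python
# Minimal full-search block matching: for one n x n block of the current
# frame, find the displacement into the reference frame minimising the SAD.

def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized blocks."""
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def full_search(cur, ref, bx, by, n=4, search=2):
    """Return the best motion vector and SAD for the block at (bx, by)."""
    cur_block = [row[bx:bx + n] for row in cur[by:by + n]]
    best, best_cost = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or y + n > len(ref) or x + n > len(ref[0]):
                continue  # candidate block falls outside the reference frame
            cand = [row[x:x + n] for row in ref[y:y + n]]
            cost = sad(cur_block, cand)
            if cost < best_cost:
                best_cost, best = cost, (dx, dy)
    return best, best_cost

# Toy frames: the current frame is the reference shifted left by one pixel,
# so the best motion vector should be (dx=1, dy=0) with zero SAD.
ref = [[(x * 7 + y * 13 + x * y) % 31 for x in range(9)] for y in range(8)]
cur = [row[1:] for row in ref]
mv, cost = full_search(cur, ref, 2, 2)
```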

Relevance:

70.00%

Publisher:

Abstract:

Variations are inherent in all manufacturing processes and can significantly affect the quality of a final assembly, particularly in multistage assembly systems. Existing research in variation management has primarily focused on incorporating GD&T factors into variation propagation models in order to predict product quality and allocate tolerances. However, process-induced variation, which has a key influence on process planning, has not been fully studied. Furthermore, the link between variation and cost has not been well established, in particular the effect that assembly process selection has on the final quality and cost of a product. To overcome these barriers, this paper proposes a novel method utilizing process capabilities to establish the relationship between variation and cost. The methodology is discussed using a real industrial case study. The benefits include determining the optimum configuration of an assembly system and facilitating rapid introduction of novel assembly techniques to achieve a competitive edge.

Relevance:

60.00%

Publisher:

Abstract:

Time-domain modelling of single-reed woodwind instruments usually involves a lumped model of the excitation mechanism. The parameters of this lumped model have to be estimated for use in numerical simulations. Several attempts have been made to estimate these parameters, including observations of the mechanics of isolated reeds, measurements under artificial or real playing conditions and estimations based on numerical simulations. In this study an optimisation routine is presented that can estimate reed-model parameters, given the pressure and flow signals in the mouthpiece. The method is validated by testing it on a series of numerically synthesised data. In order to incorporate the actions of the player in the parameter estimation process, the optimisation routine has to be applied to signals obtained under real playing conditions. The estimated parameters can then be used to resynthesise the pressure and flow signals in the mouthpiece. In the case of measured data, as opposed to numerically synthesised data, special care needs to be taken while modelling the bore of the instrument. In fact, a careful study of various experimental datasets revealed that for resynthesis to work, the bore termination impedance should be known very precisely from theory. An example is given where the above requirement is satisfied, and the resynthesised signals closely match the original signals generated by the player.

Relevance:

30.00%

Publisher:

Abstract:

A generic, hierarchical and multifidelity methodology for estimating the unit acquisition cost of outside-production machined parts is presented. The originality of the work lies in the method's inherent capability to generate multilevel and multifidelity cost relations for large volumes of parts, utilizing process data, supply chain costing data and varying degrees of part design definition information. Estimates can be generated throughout the life cycle of a part using different grades of the combined information available. As design development proceeds for a given part, additional design definition may be fed into the method as it becomes available, improving the quality of the resulting estimate. Via a process of analogous classification, parts are classified into groups of increasing similarity using design-based descriptors. A parametric estimating method is then applied to each subgroup of the machined-part commodity, following this direction of improving classification, to generate a relationship linking design variables to manufacturing cycle time. A rate cost reflective of the supply chain is then applied to the cycle-time estimate for a given part to arrive at an estimate of make cost, which is then totalled with the material and treatments cost components to give an overall estimate of unit acquisition cost. Both the rate charge applied and the treatments cost calculated for a given procured part are derived via ratio analysis.
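The cost roll-up described above (make cost from rate and cycle time, totalled with material and a ratio-derived treatments component) can be sketched in a few lines. All figures and the 15% treatments ratio below are hypothetical, chosen only to show the arithmetic, not values from the paper.

```python
# Illustrative roll-up of unit acquisition cost for one machined part.
# make cost  = supply-chain rate charge x estimated cycle time
# treatments = derived here by a simple ratio analysis on the make cost
# total      = make + material + treatments

def estimate_unit_cost(cycle_time_h, rate_per_h, material_cost,
                       treatments_ratio=0.15):
    """Return (make, treatments, total) for one part; ratio is hypothetical."""
    make = cycle_time_h * rate_per_h
    treatments = treatments_ratio * make
    return make, treatments, make + material_cost + treatments

make, treat, total = estimate_unit_cost(cycle_time_h=2.5, rate_per_h=60.0,
                                        material_cost=40.0)
# make = 150.0, treatments = 22.5, total = 212.5
```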

Relevance:

30.00%

Publisher:

Abstract:

With the advent of new video standards such as MPEG-4 part-10 and H.264/H.26L, demands for advanced video coding, particularly in the area of variable block size video motion estimation (VBSME), are increasing. In this paper, we propose a new one-dimensional (1-D) very large-scale integration architecture for full-search VBSME (FSVBSME). The VBS sum of absolute differences (SAD) computation is performed by re-using the results of smaller sub-block computations. These are distributed and combined by incorporating a shuffling mechanism within each processing element. Whereas a conventional 1-D architecture can process only one motion vector (MV), this new architecture can process up to 41 MV sub-blocks (within a macroblock) in the same number of clock cycles.
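The sub-block reuse idea can be illustrated in software: compute the SADs of the sixteen 4x4 sub-blocks of a macroblock once, then form the larger variable-block-size SADs by adding them rather than re-scanning pixels. This is a toy functional sketch, not the paper's VLSI datapath or shuffling mechanism, and the frame contents are invented.

```python
# Variable-block-size SAD by reuse: 4x4 base SADs are combined into 8x8 and
# 16x16 SADs by summation alone.

def sad4x4(cur, ref, x, y):
    """SAD of the 4x4 sub-block whose top-left corner is (x, y)."""
    return sum(abs(cur[y + j][x + i] - ref[y + j][x + i])
               for j in range(4) for i in range(4))

def vbs_sads(cur, ref):
    # 4x4 grid of base SADs covering one 16x16 macroblock
    base = [[sad4x4(cur, ref, 4 * i, 4 * j) for i in range(4)]
            for j in range(4)]
    # each 8x8 SAD is the sum of four neighbouring 4x4 SADs
    sad8x8 = [[base[2*j][2*i] + base[2*j][2*i+1] +
               base[2*j+1][2*i] + base[2*j+1][2*i+1]
               for i in range(2)] for j in range(2)]
    sad16x16 = sum(sum(row) for row in sad8x8)
    return base, sad8x8, sad16x16

cur = [[(x * y + x) % 9 for x in range(16)] for y in range(16)]
ref = [[(x * y) % 9 for x in range(16)] for y in range(16)]
base, s8, s16 = vbs_sads(cur, ref)
# The 16x16 SAD equals the sum of all sixteen 4x4 SADs by construction.
```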

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a novel approach based on the use of evolutionary agents for epipolar geometry estimation. In contrast to conventional nonlinear optimization methods, the proposed technique employs each agent to denote a minimal subset used to compute the fundamental matrix, and treats the data set of correspondences as a 1D cellular environment in which the agents inhabit and evolve. The agents execute evolutionary behaviours and evolve autonomously in a vast solution space to reach the optimal (or near-optimal) result. Three different techniques are then proposed to improve the searching ability and computational efficiency of the original agents. A subset template enables agents to collaborate more efficiently with each other and to inherit accurate information from the whole agent set. The competitive evolutionary agent (CEA) and finite multiple evolutionary agent (FMEA) apply a better evolutionary strategy or decision rule, each focusing on a different aspect of the evolutionary process. Experimental results with both synthetic data and real images show that the proposed agent-based approaches perform better than other typical methods in terms of accuracy and speed, and are more robust to noise and outliers.

Relevance:

30.00%

Publisher:

Abstract:

Query processing over the Internet involving autonomous data sources is a major task in data integration. It requires the estimated costs of possible queries in order to select the best one that has the minimum cost. In this context, the cost of a query is affected by three factors: network congestion, server contention state, and complexity of the query. In this paper, we study the effects of both the network congestion and server contention state on the cost of a query. We refer to these two factors together as system contention states. We present a new approach to determining the system contention states by clustering the costs of a sample query. For each system contention state, we construct two cost formulas for unary and join queries respectively using the multiple regression process. When a new query is submitted, its system contention state is estimated first using either the time slides method or the statistical method. The cost of the query is then calculated using the corresponding cost formulas. The estimated cost of the query is further adjusted to improve its accuracy. Our experiments show that our methods can produce quite accurate cost estimates of the submitted queries to remote data sources over the Internet.
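The per-contention-state cost formulas can be illustrated with a toy single-regressor version: fit a linear cost model for one system contention state by ordinary least squares, then use it to predict the cost of a new query. The sample sizes and costs below are made up for illustration; the paper's formulas use multiple regression over several query characteristics.

```python
# Fit cost = b0 + b1 * result_size for one contention state, then estimate
# the cost of a newly submitted query under that state.

def fit_linear(xs, ys):
    """Ordinary least squares for a single regressor; returns (b0, b1)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
          sum((x - mx) ** 2 for x in xs))
    return my - b1 * mx, b1

# Observed (result size, elapsed cost) pairs for sample queries in one state.
sizes = [10, 20, 30, 40]
costs = [25, 45, 65, 85]          # perfectly linear: cost = 5 + 2 * size
b0, b1 = fit_linear(sizes, costs)
estimate = b0 + b1 * 25           # predicted cost of a new query
```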

Relevance:

30.00%

Publisher:

Abstract:

Analysis of the acoustical functioning of musical instruments invariably involves the estimation of model parameters. The broad aim of this paper is to develop methods for estimation of clarinet reed parameters that are representative of actual playing conditions. This presents various challenges because of the difficulties of measuring the directly relevant variables without interfering with the control of the instrument. An inverse modelling approach is therefore proposed, in which the equations governing the sound generation mechanism of the clarinet are employed in an optimisation procedure to determine the reed parameters from the mouthpiece pressure and volume flow signals. The underlying physical model captures most of the reed dynamics and is simple enough to be used in an inversion process. The optimisation procedure is first tested by applying it to numerically synthesised signals, and then applied to mouthpiece signals acquired during notes blown by a human player. The proposed inverse modelling approach raises the possibility of revealing information about the way in which the embouchure-related reed parameters are controlled by the player, and also facilitates physics-based re-synthesis of clarinet sounds.
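The inverse-modelling workflow (synthesise from known parameters, then recover the parameters by minimising the resynthesis error) can be shown with a deliberately trivial stand-in model. The one-parameter model `y = k * x` and all values below are hypothetical; the paper's physical reed model is far richer.

```python
# Schematic inverse modelling: recover a model parameter by minimising the
# squared error between observed and resynthesised signals.

def synthesise(k, xs):
    """Toy forward model standing in for the reed/bore simulation."""
    return [k * x for x in xs]

def estimate_k(xs, observed, candidates):
    """Grid-search optimisation: pick the candidate with least squared error."""
    def err(k):
        return sum((y - m) ** 2 for y, m in zip(observed, synthesise(k, xs)))
    return min(candidates, key=err)

xs = [0.1 * i for i in range(20)]
observed = synthesise(1.7, xs)                  # numerically synthesised data
k_hat = estimate_k(xs, observed, [i * 0.1 for i in range(1, 31)])
# With noise-free synthetic data the true parameter (1.7) is recovered.
```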

Relevance:

30.00%

Publisher:

Abstract:

The optimization of full-scale biogas plant operation is of great importance to make biomass a competitive source of renewable energy. The implementation of innovative control and optimization algorithms, such as Nonlinear Model Predictive Control, requires an online estimation of operating states of biogas plants. This state estimation allows for optimal control and operating decisions according to the actual state of a plant. In this paper such a state estimator is developed using a calibrated simulation model of a full-scale biogas plant, which is based on the Anaerobic Digestion Model No.1. The use of advanced pattern recognition methods shows that model states can be predicted from basic online measurements such as biogas production, CH4 and CO2 content in the biogas, pH value and substrate feed volume of known substrates. The machine learning methods used are trained and evaluated using synthetic data created with the biogas plant model simulating over a wide range of possible plant operating regions. Results show that the operating state vector of the modelled anaerobic digestion process can be predicted with an overall accuracy of about 90%. This facilitates the application of state-based optimization and control algorithms on full-scale biogas plants and therefore fosters the production of eco-friendly energy from biomass.
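The pattern-recognition step (predicting an operating state from basic online measurements) can be sketched with a nearest-centroid rule. The features, state labels and centroid values below are invented for illustration; the paper trains its machine learning methods on synthetic data from an ADM1-based plant model.

```python
# Classify a plant's operating state from online measurements using a
# nearest-centroid rule, as if the centroids were learned from simulations.

def nearest_state(sample, centroids):
    """Return the label of the centroid closest to the measurement vector."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

# Hypothetical centroids of (biogas production, CH4 fraction, pH) per state.
centroids = {
    "normal":     (100.0, 0.55, 7.2),
    "overloaded": (140.0, 0.45, 6.5),
    "starved":    ( 40.0, 0.60, 7.6),
}
state = nearest_state((135.0, 0.47, 6.6), centroids)
```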

Relevance:

30.00%

Publisher:

Abstract:

Arsenic contamination of rice plants by arsenic-polluted irrigation groundwater could result in high arsenic concentrations in cooked rice. The main objective of the study was to estimate the total and inorganic arsenic intakes in a rural population of West Bengal, India, through both drinking water and cooked rice. Simulated cooking of rice with different levels of arsenic species in the cooking water was carried out. The arsenic in the cooking water was provided as one of four arsenic species (arsenite, arsenate, methylarsonate or dimethylarsinate) and at three total arsenic concentrations (50, 250 or 500 µg l⁻¹). The results show that the arsenic concentration in cooked rice is always higher than that in raw rice, ranging from 227 to 1642 µg kg⁻¹. The cooking process did not change the arsenic speciation in rice. Cooked rice contributed a mean of 41% to the daily intake of inorganic arsenic. The daily inorganic arsenic intakes for water plus rice were 229, 1024 and 2000 µg day⁻¹ for initial arsenic concentrations in the cooking water of 50, 250 and 500 µg arsenic l⁻¹, respectively, compared with the tolerable daily intake of 150 µg day⁻¹.
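The intake arithmetic behind figures of this kind is simply concentration times amount consumed, summed over water and rice. The daily water volume and cooked-rice mass below are hypothetical values chosen only to show the calculation, not the study's actual survey figures, so the result does not reproduce the paper's numbers.

```python
# Back-of-the-envelope daily inorganic arsenic intake from water plus rice.

def daily_intake_ug(water_ug_per_l, water_l, rice_ug_per_kg, rice_kg):
    """Total arsenic intake in µg/day from drinking water and cooked rice."""
    return water_ug_per_l * water_l + rice_ug_per_kg * rice_kg

intake = daily_intake_ug(water_ug_per_l=50, water_l=3.0,
                         rice_ug_per_kg=227, rice_kg=0.4)
tdi = 150  # tolerable daily intake quoted in the abstract, µg/day
exceeds = intake > tdi
# 50*3.0 + 227*0.4 = 240.8 µg/day, already above the 150 µg/day TDI
```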

Relevance:

30.00%

Publisher:

Abstract:

Plasma etch is a key process in modern semiconductor manufacturing facilities as it offers process simplification and yet greater dimensional tolerances compared to wet chemical etch technology. The main challenge of operating plasma etchers is to maintain a consistent etch rate spatially and temporally for a given wafer and for successive wafers processed in the same etch tool. Etch rate measurements require expensive metrology steps and therefore in general only limited sampling is performed. Furthermore, the results of measurements are not accessible in real-time, limiting the options for run-to-run control. This paper investigates a Virtual Metrology (VM) enabled Dynamic Sampling (DS) methodology as an alternative paradigm for balancing the need to reduce costly metrology with the need to measure more frequently and in a timely fashion to enable wafer-to-wafer control. Using a Gaussian Process Regression (GPR) VM model for etch rate estimation of a plasma etch process, the proposed dynamic sampling methodology is demonstrated and evaluated for a number of different predictive dynamic sampling rules. © 2013 IEEE.
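One plausible predictive dynamic-sampling rule of the kind evaluated in such a scheme: send a wafer to costly physical metrology only when the VM model's predictive uncertainty exceeds a threshold. The GPR model is mocked here as a (prediction, standard deviation) pair per wafer, and the threshold value is hypothetical.

```python
# VM-enabled dynamic sampling: measure only the wafers whose virtual
# metrology estimate is too uncertain to trust.

def schedule_metrology(vm_outputs, std_threshold=0.5):
    """Return indices of wafers sent to (costly) physical metrology."""
    return [i for i, (_, std) in enumerate(vm_outputs) if std > std_threshold]

# (predicted etch rate, predictive standard deviation) per wafer
vm_outputs = [(3.1, 0.2), (3.0, 0.7), (3.3, 0.4), (2.9, 0.9), (3.2, 0.1)]
to_measure = schedule_metrology(vm_outputs)
# Only wafers 1 and 3 trigger a measurement; the rest rely on the VM estimate.
```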

Relevance:

30.00%

Publisher:

Abstract:

Bridge construction responds to the need for environmentally friendly design of motorways and facilitates the passage through sensitive natural areas and the bypassing of urban areas. However, according to numerous research studies, bridge construction presents substantial budget overruns. Therefore, it is necessary early in the planning process for the decision makers to have reliable estimates of the final cost based on previously constructed projects. At the same time, the current European financial crisis reduces the available capital for investments and financial institutions are even less willing to finance transportation infrastructure. Consequently, it is even more necessary today to estimate the budget of high-cost construction projects, such as road bridges, with reasonable accuracy, in order for the state funds to be invested with lower risk and the projects to be designed with the highest possible efficiency. In this paper, a Bill-of-Quantities (BoQ) estimation tool for road bridges is developed in order to support the decisions made at the preliminary planning and design stages of highways. Specifically, a Feed-Forward Artificial Neural Network (ANN) with a hidden layer of 10 neurons is trained to predict the superstructure material quantities (concrete, pre-stressed steel and reinforcing steel) using the width of the deck, the adjusted length of span or cantilever and the type of the bridge as input variables. The training dataset includes actual data from 68 recently constructed concrete motorway bridges in Greece. According to the relevant metrics, the developed model captures very well the complex interrelations in the dataset and demonstrates strong generalisation capability. Furthermore, it outperforms the linear regression models developed for the same dataset.
Therefore, the proposed cost estimation model stands as a useful and reliable tool for the construction industry as it enables planners to reach informed decisions for technical and economic planning of concrete bridge projects from their early implementation stages.
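The shape of the network described above (3 inputs, one hidden layer of 10 neurons, 3 outputs) can be sketched as a forward pass. The weights, activation choice and input values below are placeholders for illustration; the real model's weights come from training on the 68-bridge dataset.

```python
import math

# Forward pass of a 3-10-3 feed-forward network: tanh hidden layer,
# linear output layer producing three material-quantity estimates.

def forward(x, w1, b1, w2, b2):
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sum(wi * hi for wi, hi in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]

n_in, n_hidden, n_out = 3, 10, 3
w1 = [[0.1] * n_in for _ in range(n_hidden)]   # placeholder weights
b1 = [0.0] * n_hidden
w2 = [[0.1] * n_hidden for _ in range(n_out)]
b2 = [0.0] * n_out
x = [12.0, 35.0, 1.0]   # deck width (m), adjusted length (m), bridge type code
quantities = forward(x, w1, b1, w2, b2)   # concrete, pre-stressed, reinforcing
```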

Relevance:

30.00%

Publisher:

Abstract:

Camera traps are used to estimate densities or abundances using capture-recapture and, more recently, random encounter models (REMs). We deploy REMs to describe an invasive-native species replacement process, and to demonstrate their wider application beyond abundance estimation. The Irish hare Lepus timidus hibernicus is a high priority endemic of conservation concern. It is threatened by an expanding population of non-native, European hares L. europaeus, an invasive species of global importance. Camera traps were deployed in thirteen 1 km squares, wherein the ratio of invader to native densities was corroborated by night-driven line transect distance sampling throughout the study area of 1652 km². Spatial patterns of invasive and native densities between the invader's core and peripheral ranges, and native allopatry, were comparable between methods. Native densities in the peripheral range were comparable to those in native allopatry using REM, or marginally depressed using Distance Sampling. Numbers of the invader were substantially higher than the native in the core range, irrespective of method, with a 5:1 invader-to-native ratio indicating species replacement. We also describe a post hoc optimization protocol for REM which will inform subsequent (re-)surveys, allowing survey effort (camera hours) to be reduced by up to 57% without compromising the width of confidence intervals associated with density estimates. This approach will form the basis of a more cost-effective means of surveillance and monitoring for both the endemic and invasive species. The European hare undoubtedly represents a significant threat to the endemic Irish hare.
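The REM converts a camera trapping rate into a density without needing to recognise individuals. A commonly cited form of the model (due to Rowcliffe et al.) is D = (y/t) · π / (v · r · (2 + θ)); the survey figures below are invented for illustration and do not come from the hare study.

```python
import math

# Random encounter model: density from trapping rate, animal day range,
# and the camera's detection radius and angle.

def rem_density(photos, camera_days, speed, radius, angle):
    """Density = (y/t) * pi / (v * r * (2 + theta)).

    photos / camera_days : trapping rate y/t (photos per camera-day)
    speed                : animal day range v (km per day)
    radius, angle        : detection radius (km) and detection angle (radians)
    """
    return (photos / camera_days) * math.pi / (speed * radius * (2 + angle))

d = rem_density(photos=40, camera_days=200, speed=1.5,
                radius=0.01, angle=math.radians(40))
# animals per square km for this hypothetical survey
```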