799 results for recursive partitioning algorithm


Relevance: 30.00%

Abstract:

For an increasing number of applications, mesoscale modelling systems now aim to better represent urban areas. The complexity of processes resolved by urban parametrization schemes varies with the application. The concept of fitness-for-purpose is therefore critical for both the choice of parametrizations and the way in which the scheme should be evaluated. A systematic and objective model response analysis procedure (Multiobjective Shuffled Complex Evolution Metropolis (MOSCEM) algorithm) is used to assess the fitness of the single-layer urban canopy parametrization implemented in the Weather Research and Forecasting (WRF) model. The scheme is evaluated regarding its ability to simulate observed surface energy fluxes and the sensitivity to input parameters. Recent amendments are described, focussing on features which improve its applicability to numerical weather prediction, such as a reduced and physically more meaningful list of input parameters. The study shows a high sensitivity of the scheme to parameters characterizing roof properties in contrast to a low response to road-related ones. Problems in partitioning of energy between turbulent sensible and latent heat fluxes are also emphasized. Some initial guidelines to prioritize efforts to obtain urban land-cover class characteristics in WRF are provided. Copyright © 2010 Royal Meteorological Society and Crown Copyright.

Relevance: 30.00%

Abstract:

A new sparse kernel density estimator is introduced. Our main contribution is to develop a recursive algorithm for the selection of significant kernels one at a time using the minimum integrated square error (MISE) criterion. The proposed approach is simple to implement and the associated computational cost is very low. Numerical examples demonstrate that the proposed approach is effective in constructing sparse kernel density estimators whose accuracy is competitive with existing kernel density estimators.
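As an illustration of this kind of recursive kernel selection, the sketch below greedily adds one Gaussian kernel at a time, scoring each candidate by the integrated squared error to the full Parzen estimate as a stand-in for the paper's MISE criterion; the bandwidth h, kernel count k, equal-weight mixture and function names are simplifying assumptions, not the authors' formulation.

import numpy as np

def gauss_overlap(c1, c2, h):
    # closed form: integral of N(x; c1, h^2) * N(x; c2, h^2) dx = N(c1; c2, 2*h^2)
    return np.exp(-(c1 - c2) ** 2 / (4 * h ** 2)) / np.sqrt(4 * np.pi * h ** 2)

def sparse_kde_centres(x, h, k):
    # Greedily pick k kernel centres from the 1-D data x that minimise the
    # integrated squared error to the full Parzen estimate built from all of x.
    n = len(x)
    G = gauss_overlap(x[:, None], x[None, :], h)   # pairwise overlap integrals
    full_cross = G.mean(axis=1)                    # integral of kernel_i * full estimate
    chosen = []
    for _ in range(k):
        best_j, best_ise = None, np.inf
        for j in range(n):
            if j in chosen:
                continue
            idx = chosen + [j]
            m = len(idx)
            # ISE up to a constant: integral of sparse^2 minus 2 * integral of sparse * full
            ise = G[np.ix_(idx, idx)].sum() / m ** 2 - 2.0 * full_cross[idx].mean()
            if ise < best_ise:
                best_j, best_ise = j, ise
        chosen.append(best_j)
    return x[chosen]

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(1.0, 1.0, 200)])
print(sparse_kde_centres(data, h=0.3, k=5))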

Relevance: 30.00%

Abstract:

This paper shows the existence of an approximate recursive equilibrium with minimal state space in an environment of incomplete markets. We prove that the approximate recursive equilibrium implements an approximate sequential equilibrium that is always close to a Magill and Quinzii equilibrium without short sales, for arbitrarily small errors. This implies that the competitive equilibrium can be implemented by using forecast statistics with minimal state space, provided that agents reduce the errors in their estimates in the long run. We also develop an alternative algorithm to compute the approximate recursive equilibrium with incomplete markets and heterogeneous agents through a procedure of iterating functional equations, without using the first-order conditions of optimality.

Relevance: 30.00%

Abstract:

This paper uses dynamic programming to study the time consistency of optimal macroeconomic policy in economies with recurring public deficits. To this end, a general equilibrium recursive model introduced in Chang (1998) is extended to include government bonds and production. The original model presents a Sidrauski economy with money and transfers only, implying that the need for government financing through the inflation tax is minimal. The extended model introduces government expenditures and a deficit-financing scheme, analyzing the Sargent-Wallace (1981) problem: recurring deficits may lead the government to default on part of its public debt through inflation. The methodology allows for the computation of the set of all sustainable stabilization plans even when the government cannot pre-commit to an optimal inflation path. This is done through value function iterations, which can be carried out on a computer. The parameters of the extended model are calibrated with Brazilian data, using as case studies three Brazilian stabilization attempts: the Cruzado (1986), Collor (1990) and Real (1994) plans. The calibration of the parameters of the extended model is straightforward, but its numerical solution proves unfeasible due to a dimensionality problem in the algorithm arising from limitations of available computer technology. However, a numerical solution using the original algorithm and some calibrated parameters is obtained. Results indicate that in the absence of government bonds or production only the Real Plan is sustainable in the long run. The numerical solution of the extended algorithm is left for future research.
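The solution method rests on value function iteration; the sketch below shows that iteration on a deliberately toy savings problem with an invented grid, period payoff and discount factor, not on the Chang (1998) model with money, bonds and production.

import numpy as np

beta = 0.95                        # discount factor (assumed)
grid = np.linspace(0.1, 2.0, 50)   # discretized state grid (assumed)

def payoff(s, a):
    # toy period utility: each period the agent receives income 0.5 plus the
    # carried state s, saves a for next period and consumes the rest
    c = s + 0.5 - a
    return np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)

V = np.zeros(len(grid))
for _ in range(500):
    # Q[i, j] = current payoff of moving from state grid[i] to state grid[j]
    Q = payoff(grid[:, None], grid[None, :]) + beta * V[None, :]
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-6:   # sup-norm convergence test
        V = V_new
        break
    V = V_new
policy = grid[Q.argmax(axis=1)]            # optimal next state for each current state
print(np.round(policy[:5], 3))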

Relevance: 30.00%

Abstract:

A low-cost computational procedure to determine the orbit of an artificial satellite using short-arc data from an onboard GPS receiver is proposed. Pseudoranges are used as measurements to estimate the orbit via a recursive least-squares method. The algorithm applies orthogonal Givens rotations to solve recursive and sequential orbit determination problems. To assess the procedure, it was applied to the TOPEX/POSEIDON satellite for data batches of one orbital period (approximately two hours), with force modelling based on the full JGM-2 gravity field model. When compared with the reference Precision Orbit Ephemeris (POE) of JPL/NASA, the results indicate that precision better than 9 m is easily obtained, even when short batches of data are used. Copyright (c) 2007.
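A minimal sketch of a Givens-rotation-based recursive least-squares update of the kind described, applied to a generic linear model rather than the pseudorange and orbit dynamics model; the helper name givens_update and the toy data are assumptions. Each new measurement row is folded into the triangular factor without re-solving from scratch, which is what makes sequential processing cheap.

import numpy as np

def givens_update(R, d, a_row, b):
    # Fold one new measurement row (a_row, b) into the upper-triangular factor R
    # and transformed right-hand side d using Givens rotations.
    n = len(d)
    a = a_row.astype(float)
    b = float(b)
    for i in range(n):
        if a[i] == 0.0:
            continue
        r = np.hypot(R[i, i], a[i])
        c, s = R[i, i] / r, a[i] / r
        R_row = R[i, i:].copy()
        R[i, i:] = c * R_row + s * a[i:]
        a[i:] = -s * R_row + c * a[i:]
        d[i], b = c * d[i] + s * b, -s * d[i] + c * b
    return R, d

# toy usage: recursively estimate x in A x ~= y, one measurement row at a time
rng = np.random.default_rng(1)
x_true = np.array([1.0, -2.0, 0.5])
R, d = np.zeros((3, 3)), np.zeros(3)
for _ in range(200):
    a_row = rng.normal(size=3)
    y = a_row @ x_true + 0.01 * rng.normal()
    R, d = givens_update(R, d, a_row, y)
print(np.linalg.solve(R, d))   # current least-squares estimate, close to x_true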

Relevance: 30.00%

Abstract:

Multi-relational data mining enables pattern mining from multiple tables. Existing multi-relational association rule mining algorithms are not able to process large volumes of data, because the amount of memory required exceeds the amount available. The proposed MR-Radix algorithm provides a framework that optimizes memory usage and uses partitioning to handle large volumes of data. The original contribution of this proposal is to deliver superior performance compared with related algorithms and to successfully complete the task of mining association rules in large databases, bypassing the problem of limited available memory. One of the tests showed that MR-Radix uses fourteen times less memory than GFP-growth. © 2011 IEEE.
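MR-Radix itself is not reproduced here; the sketch below only illustrates the partitioning idea it builds on: mine each memory-sized partition locally, take the union of locally frequent itemsets as global candidates, then confirm them in one full scan. The function names and toy transactions are invented for the example.

from collections import Counter
from itertools import combinations

def local_frequent(transactions, min_sup):
    # naive local miner: count every itemset of size 1..3 within one partition
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for k in (1, 2, 3):
            counts.update(combinations(items, k))
    return {itemset for itemset, c in counts.items() if c >= min_sup}

def partition_mine(transactions, n_parts, min_sup_ratio):
    # phase 1: each partition is small enough to mine in memory; the union of
    # locally frequent itemsets forms the global candidate set
    size = -(-len(transactions) // n_parts)          # ceiling division
    candidates = set()
    for i in range(0, len(transactions), size):
        part = transactions[i:i + size]
        candidates |= local_frequent(part, max(1, int(min_sup_ratio * len(part))))
    # phase 2: one full counting pass confirms which candidates are truly frequent
    counts = Counter()
    for t in transactions:
        items = set(t)
        counts.update(c for c in candidates if set(c) <= items)
    min_sup = min_sup_ratio * len(transactions)
    return {c: n for c, n in counts.items() if n >= min_sup}

txs = [("a", "b", "c"), ("a", "b"), ("b", "c"), ("a", "c"), ("a", "b", "c"), ("b",)]
print(partition_mine(txs, n_parts=2, min_sup_ratio=0.5))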

Relevance: 30.00%

Abstract:

FEAST is a recently developed eigenvalue algorithm which computes selected interior eigenvalues of real symmetric matrices. It uses contour-integral, resolvent-based projections. A weakness is that the existing algorithm relies on accurate reasoned estimates of the number of eigenvalues within the contour. Examining the singular values of the projections on moderately sized, randomly generated test problems motivates orthogonalization-based improvements to the algorithm. The singular value distributions provide experimentally robust estimates of the number of eigenvalues within the contour. The algorithm is modified to handle both Hermitian and general complex matrices. The original algorithm (based on circular contours and Gauss-Legendre quadrature) is extended to contours and quadrature schemes that are recursively subdividable. A general complex recursive algorithm is implemented on rectangular and diamond contours. The accuracy of different quadrature schemes for various contours is investigated.
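A small numpy sketch of the underlying idea, not the FEAST implementation itself: approximate the contour-integral spectral projector by quadrature (a plain trapezoidal rule on a circle here, rather than Gauss-Legendre or the recursively subdividable contours of the thesis) and read the number of enclosed eigenvalues off its singular values. The test matrix, contour and quadrature count are assumptions.

import numpy as np

def contour_projector(A, center, radius, n_quad=64):
    # trapezoidal approximation of the spectral projector
    # P = (1 / (2*pi*i)) * integral over the circle of (z*I - A)^(-1) dz
    n = A.shape[0]
    P = np.zeros((n, n), dtype=complex)
    for k in range(n_quad):
        z = center + radius * np.exp(2j * np.pi * k / n_quad)
        P += (z - center) * np.linalg.inv(z * np.eye(n) - A)
    return P / n_quad

# symmetric test matrix with known, well-separated eigenvalues
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.normal(size=(60, 60)))
eigs = np.linspace(-5.0, 5.0, 60)
A = Q @ np.diag(eigs) @ Q.T

P = contour_projector(A, center=0.0, radius=2.0)
s = np.linalg.svd(P, compute_uv=False)        # close to 1 inside the contour, close to 0 outside
estimated = int(np.sum(s > 0.5))
actual = int(np.sum(np.abs(eigs) < 2.0))
print(estimated, actual)                      # the counts agree for this well-separated spectrum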

Relevance: 30.00%

Abstract:

Bruynooghe described a framework for the top-down abstract interpretation of logic programs. In this framework, abstract interpretation is carried out by constructing an abstract and-or tree in a top-down fashion for a given query and program. Such an abstract interpreter requires fixpoint computation for programs which contain recursive predicates. This paper presents in detail a fixpoint algorithm that has been developed for this purpose and the motivation behind it. We start off by describing a simple-minded algorithm. After pointing out its shortcomings, we present a series of refinements to this algorithm, until we reach the final version. The aim is to give an intuitive grasp and provide justification for the relative complexity of the final algorithm. We also present an informal proof of correctness of the algorithm and some results obtained from an implementation.
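The fixpoint computation itself has a simple shape, sketched below as plain Kleene iteration of a monotone operator. The example iterates the immediate-consequence operator of a recursive path/2 predicate over concrete facts, standing in for iteration over an abstract domain; the predicate, facts and function names are invented for illustration and this is not the refined algorithm the paper develops.

def least_fixpoint(step, bottom):
    # plain Kleene iteration: apply the monotone operator until nothing changes
    current = bottom
    while True:
        nxt = step(current)
        if nxt == current:
            return current
        current = nxt

# illustration on concrete facts: least fixpoint of the immediate-consequence
# operator for a recursive predicate path/2 defined from edge/2
edges = {("a", "b"), ("b", "c"), ("c", "d")}

def step(paths):
    derived = set(edges)                                            # path(X,Y) :- edge(X,Y).
    derived |= {(x, z) for (x, y) in edges for (y2, z) in paths if y == y2}
    return frozenset(derived)                                       # path(X,Z) :- edge(X,Y), path(Y,Z).

print(sorted(least_fixpoint(step, frozenset())))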

Relevance: 30.00%

Abstract:

Markov chain Monte Carlo methods are widely used in signal processing and communications for statistical inference and stochastic optimization. In this work, we introduce an efficient adaptive Metropolis-Hastings algorithm to draw samples from generic multimodal and multidimensional target distributions. The proposal density is a mixture of Gaussian densities with all parameters (weights, mean vectors and covariance matrices) updated from all the previously generated samples via simple recursive rules. Numerical results for the one- and two-dimensional cases are provided.
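For illustration, the sketch below uses a single adapted Gaussian proposal whose mean and covariance are updated with simple recursive rules, in the spirit of Haario-style adaptive Metropolis, rather than the full Gaussian-mixture proposal of the paper; the bimodal target, warm-up length and scaling constant are assumptions.

import numpy as np

def log_target(x):
    # illustrative bimodal 2-D target: equal mixture of Gaussians at (2, 2) and (-2, -2)
    return np.logaddexp(-0.5 * np.sum((x - 2.0) ** 2), -0.5 * np.sum((x + 2.0) ** 2))

def adaptive_mh(n_iter=20000, d=2, warmup=500, eps=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    mean, cov = np.zeros(d), np.zeros((d, d))     # recursively updated proposal statistics
    samples = np.empty((n_iter, d))
    for t in range(1, n_iter + 1):
        if t <= warmup:
            prop_cov = np.eye(d)                  # fixed proposal while statistics build up
        else:
            prop_cov = (2.38 ** 2 / d) * cov + eps * np.eye(d)
        y = x + rng.multivariate_normal(np.zeros(d), prop_cov)
        if np.log(rng.random()) < log_target(y) - log_target(x):
            x = y
        delta = x - mean                          # simple recursive mean / covariance rules
        mean = mean + delta / t
        cov = cov + (np.outer(delta, x - mean) - cov) / t
        samples[t - 1] = x
    return samples

chain = adaptive_mh()
print(chain[5000:].mean(axis=0))   # near the midpoint of the modes once both have been visited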

Relevance: 30.00%

Abstract:

Hardware/software partitioning (HSP) is a key task in embedded system co-design. Its main goal is to decide which components of an application are executed on a general-purpose processor (software) and which ones on specific hardware, taking into account a set of restrictions expressed by metrics. In recent years, several approaches driven by metaheuristic algorithms have been proposed for solving the HSP problem. However, owing to the diversity of models and metrics used, the choice of the best-suited algorithm remains an open problem. This article presents the results of applying a fuzzy approach to the HSP problem. This approach is more flexible than many others because it can accept reasonably good solutions and reject those that do not seem good. In this work we compare six metaheuristic algorithms: Random Search, Tabu Search, Simulated Annealing, Hill Climbing, Genetic Algorithm and Evolutionary Strategy. The presented model aims to simultaneously minimize the hardware area and the execution time. The results show that Restart Hill Climbing is the best-performing algorithm in most cases.
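As a sketch of the best-performing strategy reported, restart hill climbing on a binary hardware/software assignment is shown below, with an invented component table and a plain weighted sum of area and time in place of the paper's fuzzy cost model.

import random

# hypothetical per-component costs: (software time, hardware time, hardware area)
COMPONENTS = [(9.0, 2.0, 5.0), (4.0, 1.5, 3.0), (7.0, 2.5, 6.0),
              (3.0, 1.0, 2.0), (8.0, 3.0, 4.0), (5.0, 2.0, 3.5)]

def cost(assign, w_time=0.5, w_area=0.5):
    # assign[i] == 1 puts component i in hardware, 0 keeps it in software
    time = sum(hw if a else sw for a, (sw, hw, _) in zip(assign, COMPONENTS))
    area = sum(ar for a, (_, _, ar) in zip(assign, COMPONENTS) if a)
    return w_time * time + w_area * area

def restart_hill_climbing(restarts=20, seed=7):
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(restarts):
        x = [rng.randint(0, 1) for _ in COMPONENTS]   # random starting partition
        improved = True
        while improved:
            improved = False
            for i in range(len(x)):                   # flip one component at a time
                y = x[:]
                y[i] ^= 1
                if cost(y) < cost(x):
                    x, improved = y, True
        if cost(x) < best_cost:
            best, best_cost = x, cost(x)
    return best, best_cost

print(restart_hill_climbing())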

Relevance: 30.00%

Abstract:

Frequent itemset mining is well explored for various data types, and its computational complexity is well understood; methods exist to deal effectively with the computational problems. This paper presents a further approach to improving the performance of frequent itemset computation. We have made a series of observations that led us to invent data pre-processing methods such that the final step of the Partition algorithm, in which the combination of all local candidate sets must be processed, is executed on substantially smaller input data. The paper reports results from several experiments that confirm our general and formally presented observations.
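One illustrative pre-processing step in this spirit, not necessarily the paper's method, is to strip from each transaction every item that occurs in no global candidate before the final counting pass, as sketched below with invented helper names and toy data.

from collections import Counter

def final_count_pass(transactions, global_candidates, min_sup):
    # final step of a Partition-style miner: count every global candidate over the
    # full database, after shrinking each transaction to the items that can matter
    used_items = set().union(*global_candidates)
    counts = Counter()
    for t in transactions:
        items = set(t) & used_items              # pre-processing: drop irrelevant items
        if not items:
            continue
        counts.update(c for c in global_candidates if set(c) <= items)
    return {c: n for c, n in counts.items() if n >= min_sup}

txs = [("a", "b", "x"), ("a", "b", "y"), ("b", "c", "z"), ("a", "c")]
cands = {("a",), ("b",), ("a", "b")}
print(final_count_pass(txs, cands, min_sup=2))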

Relevance: 30.00%

Abstract:

2000 Mathematics Subject Classification: 62G07, 62L20.

Relevance: 30.00%

Abstract:

This thesis describes the development of an adaptive control algorithm for Computerized Numerical Control (CNC) machines implemented on a multi-axis motion control board based on the TMS320C31 DSP chip. The adaptive process involves two stages: plant modeling and inverse control application. The first stage builds a non-recursive model of the CNC system (plant) using the Least-Mean-Square (LMS) algorithm. The second stage defines a recursive structure (the controller) that implements an inverse model of the plant by using the coefficients of the model in an algorithm called Forward-Time Calculation (FTC). In this way, when the inverse controller is placed in series with the plant, it pre-compensates for the modification that the original plant introduces into the input signal. The performance of this solution was verified at three different levels: software simulation, implementation on a set of isolated motor-encoder pairs, and implementation on a real CNC machine. The adaptive inverse controller effectively improved the step response of the system at all three levels. In the simulation, an ideal response was obtained. In the motor-encoder test, the rise time was reduced by as much as 80%, without overshoot, in some cases. Even with the larger mass of the actual CNC machine, a decrease in rise time and elimination of overshoot were obtained in most cases. These results lead to the conclusion that the adaptive inverse controller is a viable approach to position control in CNC machinery.
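A sketch of the first stage only: LMS identification of a non-recursive (FIR) plant model. The plant coefficients, model order and step size below are invented, and the Forward-Time Calculation inverse-controller stage is not shown.

import numpy as np

rng = np.random.default_rng(4)
plant = np.array([0.5, 0.8, -0.3, 0.1])   # unknown FIR plant to identify (invented)
n_taps, mu, n_samples = 8, 0.01, 5000     # model order and LMS step size (invented)

w = np.zeros(n_taps)                      # coefficients of the non-recursive (FIR) model
x_hist = np.zeros(n_taps)                 # most recent input samples, newest first
for _ in range(n_samples):
    x = rng.normal()                      # excitation input
    x_hist = np.roll(x_hist, 1)
    x_hist[0] = x
    d = plant @ x_hist[:len(plant)]       # plant output (desired signal)
    e = d - w @ x_hist                    # modelling error
    w += mu * e * x_hist                  # LMS coefficient update

print(np.round(w, 3))                     # leading taps approach the plant coefficients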

Relevance: 30.00%

Abstract:

In this work, we present an adaptive unequal loss protection (ULP) scheme for H.264/AVC video transmission over lossy networks. The scheme combines erasure coding, H.264/AVC error resilience techniques and importance measures in video coding. The unequal importance of the video packets is identified at the group-of-pictures (GOP) and H.264/AVC data partitioning levels. The method can adaptively assign unequal amounts of forward error correction (FEC) parity across the video packets according to the network conditions, such as the available network bandwidth, packet loss rate and average packet burst loss length. A near-optimal algorithm is developed to perform the FEC assignment. Simulation results show that our scheme effectively utilizes network resources such as bandwidth while improving the quality of the video transmission. In addition, the proposed ULP strategy ensures graceful degradation of the received video quality as the packet loss rate increases. © 2010 IEEE.
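The sketch below illustrates unequal FEC assignment with a simple greedy rule: each parity packet goes to the priority class where it buys the largest importance-weighted drop in residual loss under a maximum-distance-separable erasure code. It is an illustration of the idea, not the paper's near-optimal algorithm; the class weights, packet counts and loss rate are assumptions.

from math import comb

def residual_loss(k, parity, p):
    # probability that a (k + parity, k) erasure-coded block cannot be fully
    # recovered under an i.i.d. packet loss rate p (more than `parity` losses)
    n = k + parity
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(parity + 1, n + 1))

def assign_parity(classes, budget, p):
    # greedy unequal loss protection: classes is a list of (importance, source packets);
    # each parity packet goes where it buys the largest importance-weighted improvement
    parity = [0] * len(classes)
    for _ in range(budget):
        gains = [imp * (residual_loss(k, parity[i], p) - residual_loss(k, parity[i] + 1, p))
                 for i, (imp, k) in enumerate(classes)]
        j = max(range(len(classes)), key=gains.__getitem__)
        parity[j] += 1
    return parity

# three illustrative importance levels, e.g. data partition A > B > C within a GOP
classes = [(10.0, 8), (3.0, 8), (1.0, 8)]
print(assign_parity(classes, budget=12, p=0.1))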

Relevance: 30.00%

Abstract:

Multilevel algorithms are a successful class of optimisation techniques which address the mesh partitioning problem for mapping meshes onto parallel computers. They usually combine a graph contraction algorithm with a local optimisation method which refines the partition at each graph level. To date these algorithms have been used almost exclusively to minimise the cut-edge weight in the graph, with the aim of minimising the parallel communication overhead. However, it has been shown that for certain classes of problem, the convergence of the underlying solution algorithm is strongly influenced by the shape or aspect ratio of the subdomains. In this paper, therefore, we modify the multilevel algorithms in order to optimise a cost function based on aspect ratio. Several variants of the algorithms are tested and shown to provide excellent results.
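As one concrete notion of a shape-based cost, the sketch below scores a partition of a 2-D vertex set by the average bounding-box aspect ratio of its subdomains and compares a striped against a blocked decomposition of a regular grid; this is a simple stand-in for an aspect-ratio cost function, not the paper's definition, and the grid and partition layouts are invented.

import numpy as np

def aspect_ratio_cost(coords, part, n_parts):
    # average bounding-box aspect ratio of the subdomains of a partition -- one
    # simple stand-in for a shape-based cost function
    total = 0.0
    for p in range(n_parts):
        pts = coords[part == p]
        w, h = np.ptp(pts[:, 0]) + 1, np.ptp(pts[:, 1]) + 1
        total += max(w, h) / min(w, h)
    return total / n_parts

# 16 x 16 grid of mesh vertices split into four subdomains in two different ways
xs, ys = np.meshgrid(np.arange(16), np.arange(16))
coords = np.column_stack([xs.ravel(), ys.ravel()])
strips = coords[:, 0] // 4                              # four 4 x 16 strips
blocks = (coords[:, 0] // 8) * 2 + coords[:, 1] // 8    # four 8 x 8 blocks

print(aspect_ratio_cost(coords, strips, 4))   # elongated subdomains, cost 4.0
print(aspect_ratio_cost(coords, blocks, 4))   # compact subdomains, cost 1.0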