62 results for Non-Gaussian dynamic models
Abstract:
A problem with use of the geostatistical Kriging error for optimal sampling design is that the design does not adapt locally to the character of spatial variation. This is because a stationary variogram or covariance function is a parameter of the geostatistical model. The objective of this paper was to investigate the utility of non-stationary geostatistics for optimal sampling design. First, a contour data set of Wiltshire was split into 25 equal sub-regions and a local variogram was predicted for each. These variograms were fitted with models and the coefficients used in Kriging to select optimal sample spacings for each sub-region. Large differences existed between the designs for the whole region (based on the global variogram) and for the sub-regions (based on the local variograms). Second, a segmentation approach was used to divide a digital terrain model into separate segments. Segment-based variograms were predicted and fitted with models. Optimal sample spacings were then determined for the whole region and for the sub-regions. It was demonstrated that the global design was inadequate, grossly over-sampling some segments while under-sampling others.
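A minimal sketch of the underlying design criterion, assuming a fitted spherical variogram and ordinary kriging of the centre of a square sampling cell (numpy only; the variogram coefficients and error tolerance below are illustrative, not taken from the study):

```python
import numpy as np

def spherical_gamma(h, nugget, sill, a_range):
    """Spherical variogram model gamma(h)."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < a_range,
                 nugget + sill * (1.5 * h / a_range - 0.5 * (h / a_range) ** 3),
                 nugget + sill)
    return np.where(h == 0.0, 0.0, g)

def kriging_variance(spacing, nugget, sill, a_range):
    """Ordinary kriging variance at the centre of a square cell of side `spacing`
    whose four corners are sampled."""
    pts = np.array([[0, 0], [spacing, 0], [0, spacing], [spacing, spacing]], float)
    target = np.array([spacing / 2, spacing / 2])
    n = len(pts)
    # Ordinary kriging system in variogram form: [Gamma 1; 1' 0][lambda; mu] = [gamma0; 1]
    A = np.ones((n + 1, n + 1))
    A[n, n] = 0.0
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    A[:n, :n] = spherical_gamma(d, nugget, sill, a_range)
    b = np.ones(n + 1)
    b[:n] = spherical_gamma(np.linalg.norm(pts - target, axis=1), nugget, sill, a_range)
    sol = np.linalg.solve(A, b)
    return sol[:n] @ b[:n] + sol[n]          # lambda' gamma0 + mu

# Illustrative local variogram coefficients and kriging-variance tolerance.
nugget, sill, a_range = 0.1, 1.0, 500.0
tolerance = 0.5
spacings = np.arange(10.0, 500.0, 10.0)
ok = [s for s in spacings if kriging_variance(s, nugget, sill, a_range) <= tolerance]
print("largest admissible spacing:", max(ok) if ok else None)
```

Repeating the same search with each sub-region's local variogram coefficients is what makes the resulting spacings differ from the single global design.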
Abstract:
The monitoring of multivariate systems that exhibit non-Gaussian behavior is addressed. Existing work advocates the use of independent component analysis (ICA) to extract the underlying non-Gaussian data structure. Since some of the source signals may be Gaussian, the use of principal component analysis (PCA) is proposed to capture the Gaussian and non-Gaussian source signals. A subsequent application of ICA then allows the extraction of non-Gaussian components from the retained principal components (PCs). A further contribution is the utilization of a support vector data description to determine a confidence limit for the non-Gaussian components. Finally, a statistical test is developed for determining how many non-Gaussian components are encapsulated within the retained PCs, and associated monitoring statistics are defined. The utility of the proposed scheme is demonstrated by a simulation example and the analysis of recorded data from an industrial melter.
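A minimal sketch of the PCA-then-ICA extraction step, using scikit-learn's PCA and FastICA on synthetic mixed Gaussian/non-Gaussian sources; the support vector data description and the statistical test for the number of non-Gaussian components are not reproduced here:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
# Illustrative data: one Gaussian and one non-Gaussian (uniform) source, mixed.
n = 2000
sources = np.column_stack([rng.normal(size=n), rng.uniform(-1, 1, size=n)])
X = sources @ rng.normal(size=(2, 5))        # mix into 5 observed variables
X += 0.05 * rng.normal(size=X.shape)         # sensor noise

# Step 1: PCA retains the subspace carrying both Gaussian and non-Gaussian signals.
pca = PCA(n_components=0.99)                 # keep 99 % of the variance
scores = pca.fit_transform(X)

# Step 2: ICA on the retained principal components isolates the non-Gaussian part.
ica = FastICA(n_components=scores.shape[1], random_state=0)
ics = ica.fit_transform(scores)

# Rank components by |excess kurtosis| as a crude non-Gaussianity measure.
kurt = np.abs(((ics - ics.mean(0)) ** 4).mean(0) / ics.var(0) ** 2 - 3)
order = np.argsort(kurt)[::-1]
print("components ordered by |excess kurtosis|:", order, kurt[order].round(2))
```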
Abstract:
The stochastic nature of oil price fluctuations is investigated over a twelve-year period, drawing on data from an existing database (USA Energy Information Administration database, available online). We evaluate the scaling exponents of the fluctuations by employing different statistical analysis methods, namely rescaled range analysis (R/S), scaled windowed variance analysis (SWV) and the generalized Hurst exponent (GH) method. Relying on the scaling exponents obtained, we apply a rescaling procedure to investigate the complex characteristics of the probability density functions (PDFs) dominating oil price fluctuations. It is found that the PDFs exhibit scale invariance, and in fact collapse onto a single curve when increments are measured over microscales (typically less than 30 days). The time evolution of the distributions is well fitted by a Lévy-type stable distribution. The relevance of a Lévy distribution is made plausible by a simple model of nonlinear transfer. Our results also exhibit a degree of multifractality as the PDFs change and converge toward a Gaussian distribution at the macroscales.
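A minimal sketch of one of the scaling-exponent estimators mentioned above, rescaled range (R/S) analysis, applied here to synthetic increments (a real analysis would use the daily oil price series from the database):

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Estimate the Hurst exponent of series x by rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())            # cumulative deviation from the mean
            r = dev.max() - dev.min()                # range
            s = w.std(ddof=1)                        # standard deviation
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_vals)))
    # Slope of log(R/S) against log(n) is the Hurst exponent H.
    return np.polyfit(log_n, log_rs, 1)[0]

# Illustrative use on white-noise increments (expected H close to 0.5).
rng = np.random.default_rng(1)
increments = rng.normal(size=4000)
print("H estimate:", round(hurst_rs(increments, [16, 32, 64, 128, 256, 512]), 3))
```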
Abstract:
This paper investigates sub-integer implementations of the adaptive Gaussian mixture model (GMM) for background/foreground segmentation, to allow deployment of the method on low-cost/low-power processors that lack a Floating Point Unit (FPU). We propose two novel integer computer arithmetic techniques to update the Gaussian parameters. Specifically, the mean value and the variance of each Gaussian are updated by a redefined and generalised "round" operation that emulates the original updating rules for a large set of learning rates. Weights are represented by counters that are updated following stochastic rules to allow a wider range of learning rates, and the weight trend is approximated by a line or a staircase. We demonstrate that the memory footprint and computational cost of the GMM are significantly reduced, without significantly affecting the performance of background/foreground segmentation.
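A minimal sketch of the idea behind an integer-only parameter update, assuming a power-of-two learning rate and stochastic rounding; the paper's generalised round operation and counter-based weight rules are not reproduced exactly:

```python
import random

# Pixel values and means are stored as integers; the learning rate is a power of
# two, alpha = 1 / 2**SHIFT, so the update needs no floating-point arithmetic.
SHIFT = 4          # alpha = 1/16 (illustrative)

def update_mean_int(mean, pixel):
    """Integer-only approximation of mean += alpha * (pixel - mean).

    The quotient is taken by an arithmetic shift (floor division by 2**SHIFT);
    the discarded remainder is added back stochastically so that the expected
    update equals alpha * (pixel - mean) exactly.
    """
    diff = pixel - mean
    step = diff >> SHIFT                       # floor division by 2**SHIFT
    remainder = diff - (step << SHIFT)         # always in [0, 2**SHIFT)
    if random.randrange(1 << SHIFT) < remainder:
        step += 1                              # stochastic rounding
    return mean + step

# Illustrative run: the integer mean tracks a constant background level of 200.
mean = 50
for _ in range(200):
    mean = update_mean_int(mean, 200)
print("tracked mean:", mean)
```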
Abstract:
Electric vehicles are a key prospect for future transportation. A large penetration of electric vehicles has the potential to reduce global fossil fuel consumption and hence greenhouse gas emissions and air pollution. However, the additional stochastic loads imposed by plug-in electric vehicles will possibly introduce significant changes to existing load profiles. In this paper, electric vehicle loads are integrated into a 5-unit system using a non-convex dynamic dispatch model. The actual infrastructure characteristics, including valve-point effects, load balance constraints and transmission losses, have been included in the model. Multiple load profiles are compared in terms of economic and environmental impacts in order to identify proper charging patterns. As expected, the study shows that off-peak charging is the best scenario with respect to using less fuel and producing fewer emissions.
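A minimal sketch of the non-convex cost model referred to above, with a valve-point-effect fuel cost and a simple load-balance check that adds an EV charging load to the demand; all unit coefficients and loads are illustrative:

```python
import numpy as np

# Illustrative 5-unit cost coefficients: a + b*P + c*P^2 + |e * sin(f * (Pmin - P))|
# (the valve-point term makes the cost non-convex).
a = np.array([25.0, 60.0, 100.0, 120.0, 40.0])
b = np.array([2.0, 1.8, 2.1, 2.0, 1.9])
c = np.array([0.008, 0.003, 0.0012, 0.001, 0.015])
e = np.array([100.0, 140.0, 160.0, 180.0, 120.0])
f = np.array([0.042, 0.040, 0.038, 0.037, 0.045])
p_min = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
p_max = np.array([75.0, 125.0, 175.0, 250.0, 300.0])

def fuel_cost(p):
    """Total fuel cost of a dispatch vector p, including valve-point effects."""
    return np.sum(a + b * p + c * p ** 2 + np.abs(e * np.sin(f * (p_min - p))))

def load_balance_violation(p, demand, ev_load, loss_frac=0.03):
    """Power balance residual: generation must cover demand, EV charging and losses."""
    losses = loss_frac * p.sum()              # crude stand-in for a B-matrix loss model
    return p.sum() - (demand + ev_load + losses)

# One hour of an off-peak charging scenario (illustrative numbers, MW).
demand, ev_load = 700.0, 60.0
p = np.array([70.0, 110.0, 160.0, 230.0, 230.0])    # a candidate dispatch within limits
print("cost:", round(fuel_cost(p), 1),
      "balance residual:", round(load_balance_violation(p, demand, ev_load), 1))
```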
Abstract:
Cascade control is one of the routinely used control strategies in industrial processes because it can dramatically improve the performance of single-loop control, reducing both the maximum deviation and the integral error of the disturbance response. Currently, many control performance assessment methods for cascade control loops are developed under the assumption that all disturbances follow a Gaussian distribution. In practice, however, disturbance sources acting on the manipulated variable or the upstream process often exhibit nonlinear behaviors. In this paper, a general and effective index for performance assessment of cascade control systems subject to disturbances of unknown distribution is proposed. As in minimum variance control (MVC) design, the outputs of the primary and secondary loops are decomposed into a cascade-invariant and a cascade-dependent term, but the ARMA model of the cascade control loop is estimated using the minimum entropy criterion, rather than the minimum mean squared error, to accommodate non-Gaussian disturbances. Unlike the MVC index, the proposed control performance index is based on information theory and the minimum entropy criterion. The index is informative and in agreement with expected control knowledge. To illustrate the wide applicability and effectiveness of the minimum entropy cascade control index, a simulation problem and a cascade control case from an oil refinery are presented, together with a comparison against MVC-based cascade control assessment.
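A rough sketch of how an entropy-based index of this kind can be computed for a single loop, assuming a known time delay, an AR whitening model fitted by least squares and a kernel density estimate of entropy; this illustrates the principle only and is not the paper's cascade decomposition:

```python
import numpy as np
from scipy.stats import gaussian_kde

def entropy_kde(x):
    """Resubstitution estimate of differential entropy via a Gaussian kernel density."""
    kde = gaussian_kde(x)
    return -np.mean(kde.logpdf(x))

def min_entropy_index(y, delay, ar_order=5):
    """Entropy-based performance index for a control error series y with known delay.

    The error is whitened with a least-squares AR model; the delay-invariant part is
    rebuilt from the first `delay` impulse-response terms, and the index compares its
    entropy with that of the actual error (values near 1 indicate near-minimum entropy).
    """
    y = np.asarray(y, float) - np.mean(y)
    p = ar_order
    X = np.column_stack([y[p - k: len(y) - k] for k in range(1, p + 1)])
    phi = np.linalg.lstsq(X, y[p:], rcond=None)[0]
    eps = y[p:] - X @ phi                         # estimated innovations
    psi = np.zeros(delay)                         # impulse response of the AR fit
    psi[0] = 1.0
    for j in range(1, delay):
        psi[j] = sum(phi[k - 1] * psi[j - k] for k in range(1, min(j, p) + 1))
    invariant = np.convolve(eps, psi)[:len(eps)]  # feedback-invariant error component
    return float(np.exp(entropy_kde(invariant) - entropy_kde(y[p:])))

# Illustrative use: a non-Gaussian (uniform) disturbance filtered by a sluggish loop.
rng = np.random.default_rng(2)
disturbance = rng.uniform(-1.0, 1.0, 3000)
error = np.convolve(disturbance, [1.0, 0.8, 0.5, 0.3])[:3000]
print("minimum entropy performance index:", round(min_entropy_index(error, delay=2), 2))
```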
Abstract:
We numerically analyse the behavior of the full distribution of collective observables in quantum spin chains. While most previous studies of quantum critical phenomena are limited to the first moments, here we demonstrate how quantum fluctuations at criticality lead to highly non-Gaussian distributions. Interestingly, we show that the distributions for different system sizes collapse onto the same curve after scaling for a wide range of transitions: first- and second-order quantum transitions and transitions of the Berezinskii–Kosterlitz–Thouless type. We propose and analyse the feasibility of an experimental reconstruction of the distribution using light–matter interfaces for atoms in optical lattices or in optical resonators.
Abstract:
We address the problem of non-linearity in 2D shape modelling of a particular articulated object: the human body. This issue is partially resolved by applying a different Point Distribution Model (PDM) depending on the viewpoint. The remaining non-linearity is handled by using Gaussian Mixture Models (GMM). A dynamics-based clustering is proposed and carried out in the Pose Eigenspace. A fundamental question when clustering is how to determine the optimal number of clusters; from our point of view, the main aspect to be evaluated is the mean Gaussianity. This partitioning is then used to fit a GMM to each of the view-based PDMs, derived from a database of silhouettes and skeletons. Dynamic correspondences are then obtained between the Gaussian models of the four mixtures. Finally, we compare this approach with two other methods we previously developed to cope with non-linearity: a Nearest Neighbor (NN) classifier and Independent Component Analysis (ICA).
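A minimal sketch of the clustering step, using scikit-learn's PCA and GaussianMixture on stand-in shape vectors; the cited work selects the number of clusters by a mean-Gaussianity criterion, for which BIC is used here as a simple stand-in:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Stand-in for aligned 2D landmark vectors of body silhouettes (x1, y1, ..., xK, yK).
shapes = rng.normal(size=(500, 40))

# Pose eigenspace: project the shape vectors onto the leading PCA modes (the PDM step).
pca = PCA(n_components=5)
pose_coords = pca.fit_transform(shapes)

# Fit one GMM per candidate cluster count and keep the best model.
candidates = range(1, 8)
gmms = [GaussianMixture(n_components=k, covariance_type='full', random_state=0)
        .fit(pose_coords) for k in candidates]
best = min(gmms, key=lambda g: g.bic(pose_coords))
print("selected number of clusters:", best.n_components)

# Each frame is then assigned to a Gaussian, giving the dynamics-based partition
# that the view-specific PDMs are fitted to.
labels = best.predict(pose_coords)
```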
Abstract:
Polymer extrusion is a complex process, and the availability of good dynamic models is key to improved system operation. Previous modelling attempts have either failed to adequately capture the non-linearities of the process or have proved too complex for control applications. This work presents a novel approach to the problem through the modelling of extrusion viscosity and pressure, adopting a grey-box modelling technique that combines mechanistic knowledge with empirical data using a genetic algorithm approach. The models are shown to outperform those of a much higher order generated by a conventional black-box technique, while providing insight into the underlying processes at work within the extruder.
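A minimal sketch of grey-box fitting in this spirit, assuming a first-order-plus-dead-time mechanistic core with empirically fitted gain and bias, and scipy's differential evolution as a stand-in for the genetic algorithm; the data and model form are illustrative, not the extruder models of the paper:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Illustrative "measured" data: melt pressure response to a step in screw speed.
t = np.arange(0.0, 60.0, 0.5)                               # s
speed = np.where(t >= 5.0, 60.0, 40.0)                      # rpm
true_p = 50.0 + 2.0 * 20.0 * (1.0 - np.exp(-np.clip(t - 8.0, 0.0, None) / 8.0))
pressure = true_p + np.random.default_rng(4).normal(0.0, 0.3, t.size)   # bar

def simulate(params, t, u):
    """Grey-box model: first-order lag with dead time (mechanistic core) acting on
    the speed deviation, with gain and bias as empirically fitted coefficients."""
    gain, tau, delay, bias = params
    dt = t[1] - t[0]
    shift = int(round(delay / dt))
    u_del = np.concatenate([np.full(shift, u[0]), u[:len(u) - shift]]) if shift else u
    y = np.empty_like(t)
    y[0] = bias
    alpha = dt / (tau + dt)                                  # backward-Euler first-order lag
    for k in range(1, len(t)):
        target = bias + gain * (u_del[k] - u[0])
        y[k] = y[k - 1] + alpha * (target - y[k - 1])
    return y

def sse(params):
    return np.sum((simulate(params, t, speed) - pressure) ** 2)

# Evolutionary search over physically sensible bounds: gain, tau, dead time, bias.
bounds = [(0.5, 5.0), (1.0, 30.0), (0.0, 10.0), (40.0, 60.0)]
result = differential_evolution(sse, bounds, seed=0)
print("fitted (gain, tau, delay, bias):", np.round(result.x, 2))
```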
Abstract:
Traditionally the simulation of the thermodynamic aspects of the internal combustion engine has been undertaken using one-dimensional gas-dynamic models to represent the intake and exhaust systems. CFD analysis of engines has been restricted to modelling of in-cylinder flow structures. With the increasing accessibility of CFD software it is now worth considering its use for complete gas-dynamic engine simulation. This paper appraises the accuracy of various CFD models in comparison to a 1D gas-dynamic simulation. All of the models are compared to experimental data acquired on an apparatus that generates a single gas-dynamic pressure wave. The progress of the wave along a constant area pipe and its subsequent reflection from the open pipe end are recorded with a number of high speed pressure transducers. It was found that there was little to choose between the accuracy of the 1D model and the best CFD model. The CFD model did not require experimentally derived loss coefficients to accurately represent the open pipe end; however, it took several hundred times longer to complete its analysis. The best congruency between the CFD models and the experimental data was achieved using the RNG k-ε turbulence model. The open end of the pipe was most effectively represented by surrounding it with a relatively small volume of cells connected to the rest of the environment using a pressure boundary.
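A minimal sketch of a 1D pressure-wave calculation with an open-end reflection, using linear acoustics on a uniform grid; engine codes such as the 1D model discussed above solve the nonlinear gas-dynamic equations, so this only illustrates the open-end boundary behaviour:

```python
import numpy as np

# Linear acoustic model of a pressure pulse in a constant-area pipe with one closed
# and one open end.  The open end returns the pulse with its sign inverted.
c = 343.0                               # speed of sound, m/s
L = 5.0                                 # pipe length, m
nx = 500
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
dt = 0.9 * dx / c                       # CFL-stable time step

p = np.exp(-((x - 1.0) / 0.1) ** 2)     # initial Gaussian gauge-pressure pulse
p_prev = p.copy()                       # zero initial rate -> pulse splits in two

n_steps = int(5.5 / (c * dt))           # enough travel time for one open-end reflection
for _ in range(n_steps):
    p_next = np.empty_like(p)
    p_next[1:-1] = (2.0 * p[1:-1] - p_prev[1:-1]
                    + (c * dt / dx) ** 2 * (p[2:] - 2.0 * p[1:-1] + p[:-2]))
    p_next[0] = p_next[1]               # closed end: zero pressure gradient
    p_next[-1] = 0.0                    # open end: pressure pinned at ambient
    p_prev, p = p, p_next

# The half-pulse that reached the open end comes back inverted in sign.
print("minimum gauge pressure after reflection:", round(float(p.min()), 2))
```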