65 results for linear predictive coding (LPC)


Relevance: 20.00%

Publisher:

Abstract:

Kinematic analysis is conducted to derive the geometric design constraints of foldable barrel vaults (FBVs) composed of polar or angulated scissor units. Non-linear structural analysis is then performed to determine the structural response of FBVs in the fully deployed configuration under static loading. Two load cases are considered: cross wind and longitudinal wind. The effects of varying member sizes, depth-to-span ratio and geometric imperfections are examined. (C) 2000 Elsevier Science Ltd. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

1. Although population viability analysis (PVA) is widely employed, forecasts from PVA models are rarely tested. This study in a fragmented forest in southern Australia contrasted field data on patch occupancy and abundance for the arboreal marsupial greater glider Petauroides volans with predictions from a generic spatially explicit PVA model. This work represents one of the first landscape-scale tests of its type.
2. Initially we contrasted field data from a set of eucalypt forest patches totalling 437 ha with a naive null model in which forecasts of patch occupancy were made assuming no fragmentation effects, based simply on remnant area and densities measured in nearby unfragmented forest. The naive null model predicted an average total of approximately 170 greater gliders, considerably more than the true count (n = 81).
3. Congruence was then examined between field data and PVA predictions under several metapopulation modelling scenarios. The metapopulation models performed better than the naive null model. Logistic regression showed highly significant positive relationships between predicted and actual patch occupancy for the four scenarios (P = 0.001-0.006). When the model-derived probability of patch occupancy was high (0.50-0.75, 0.75-1.00), congruence between actual occupancy and the predicted probability of occupancy was greater.
4. For many patches, probability distribution functions indicated that model predictions of animal abundance in a given patch were not outside those expected by chance. For some patches, however, the model substantially over-predicted or under-predicted actual abundance. Some important processes that influence the distribution and abundance of the greater glider, such as inter-patch dispersal, may not have been adequately modelled.
5. Additional landscape-scale tests of PVA models, on a wider range of species, are required to further assess predictions made using these tools. This will help determine the taxa for which predictions are and are not accurate, and give insights for improving models for applied conservation management.

Relevance: 20.00%

Publisher:

Abstract:

High-performance video codecs are mandatory for multimedia applications such as video-on-demand and video conferencing. Recent research has proposed numerous video coding techniques to meet requirements on bandwidth, delay, loss and quality of service (QoS). In this paper, we present our investigations of inter-subband self-similarity within wavelet-decomposed video frames using neural networks, and study the performance of applying the spatial network model to all video frames over time. The goal of our proposed method is to restore the highest perceptual quality for video transmitted over a highly congested network. Our contributions in this paper are: (1) a new wavelet-based video coding model with neural-network-based inter-subband redundancy (ISR) prediction; and (2) an evaluation of 1D and 2D ISR prediction, including multiple levels of wavelet decomposition. Our results show that a short-term quality enhancement may be obtained using both 1D and 2D ISR prediction.
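As an illustrative aside (not from the paper), the inter-subband redundancy idea can be sketched in miniature. A hand-rolled one-level Haar transform stands in for the paper's wavelet decomposition, and a least-squares linear predictor stands in for its neural network; the signal and features below are assumptions for demonstration only.

```python
import numpy as np

def haar_decompose(x):
    """One-level 1D Haar transform: approximation and detail subbands."""
    x = x[: len(x) // 2 * 2]                      # drop a trailing odd sample
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

# Toy "scanline" standing in for one row of a video frame.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 256)
signal = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(256)

approx, detail = haar_decompose(signal)

# Inter-subband redundancy: predict the detail subband from a local feature
# (the slope) of the approximation subband with a least-squares linear model,
# a stand-in for the paper's neural-network predictor.
grad = np.gradient(approx)
A = np.column_stack([grad, np.ones_like(grad)])
coeffs, *_ = np.linalg.lstsq(A, detail, rcond=None)
residual = detail - A @ coeffs

print("detail energy:  ", float(np.sum(detail ** 2)))
print("residual energy:", float(np.sum(residual ** 2)))
```

If the subbands really are redundant, the residual carries less energy than the raw detail subband, which is what makes such prediction useful for restoring lost detail coefficients.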

Relevance: 20.00%

Publisher:

Abstract:

We present a scheme which offers a significant reduction in the resources required to implement linear optics quantum computing. The scheme is a variation of the proposal of Knill, Laflamme and Milburn, and makes use of an incremental approach to the error encoding to boost the probability of success.

Relevance: 20.00%

Publisher:

Abstract:

PURPOSE: Many guidelines advocate measurement of total or low density lipoprotein cholesterol (LDL), high density lipoprotein cholesterol (HDL), and triglycerides (TG) to determine treatment recommendations for preventing coronary heart disease (CHD) and cardiovascular disease (CVD). This analysis is a comparison of lipid variables as predictors of cardiovascular disease. METHODS: Hazard ratios for coronary and cardiovascular deaths by fourths of total cholesterol (TC), LDL, HDL, TG, non-HDL, TC/HDL, and TG/HDL values, and for a one standard deviation change in these variables, were derived in an individual participant data meta-analysis of 32 cohort studies conducted in the Asia-Pacific region. The predictive value of each lipid variable was assessed using the likelihood ratio statistic. RESULTS: Adjusting for confounders and regression dilution, each lipid variable had a positive (negative for HDL) log-linear association with fatal CHD and CVD. Individuals in the highest fourth of each lipid variable had approximately twice the risk of CHD compared with those with lowest levels. TG and HDL were each better predictors of CHD and CVD risk compared with TC alone, with test statistics similar to TC/HDL and TG/HDL ratios. Calculated LDL was a relatively poor predictor. CONCLUSIONS: While LDL reduction remains the main target of intervention for lipid-lowering, these data support the potential use of TG or lipid ratios for CHD risk prediction. (c) 2005 Elsevier Inc. All rights reserved.
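As an illustrative aside (not from the study), the likelihood ratio statistic used above to rank lipid predictors can be sketched with a univariate logistic model. The cohort below is simulated, and the coefficients driving the events are assumptions chosen so that the TG/HDL ratio is the stronger predictor.

```python
import numpy as np

def logistic_loglik(x, y, n_iter=40):
    """Maximised log-likelihood of a univariate logistic model (Newton-Raphson)."""
    X = np.column_stack([np.ones(len(x)), x])
    beta = np.zeros(2)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        H = X.T @ (X * (p * (1 - p))[:, None]) + 1e-8 * np.eye(2)
        beta += np.linalg.solve(H, X.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

# Hypothetical simulated cohort: events driven mainly by the TG/HDL ratio.
rng = np.random.default_rng(1)
n = 500
tg_hdl = rng.lognormal(0.5, 0.5, n)          # TG/HDL ratio
tc = rng.normal(5.5, 1.0, n)                 # total cholesterol, mmol/L
true_logit = -2.0 + 1.2 * np.log(tg_hdl) + 0.1 * (tc - 5.5)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Null (intercept-only) log-likelihood has a closed form at p = mean(y).
p0 = y.mean()
ll_null = n * (p0 * np.log(p0) + (1 - p0) * np.log(1 - p0))

# Likelihood ratio statistic: larger means more predictive information.
lr_ratio = 2.0 * (logistic_loglik(np.log(tg_hdl), y) - ll_null)
lr_tc = 2.0 * (logistic_loglik(tc, y) - ll_null)
print(f"LR statistic, log(TG/HDL): {lr_ratio:.1f}")
print(f"LR statistic, TC:          {lr_tc:.1f}")
```

The statistic is twice the log-likelihood gain over the null model, which is how each lipid variable's predictive value was compared in the analysis (the study itself used survival models on cohort data, not this toy logistic setup).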

Relevance: 20.00%

Publisher:

Abstract:

The classification rules of linear discriminant analysis are defined by the true mean vectors and the common covariance matrix of the populations from which the data come. Because these true parameters are generally unknown, they are commonly estimated by the sample mean vector and covariance matrix of the data in a training sample randomly drawn from each population. However, these sample statistics are notoriously susceptible to contamination by outliers, a problem compounded by the fact that the outliers may be invisible to conventional diagnostics. High-breakdown estimation is a procedure designed to remove this cause for concern by producing estimates that are immune to serious distortion by a minority of outliers, regardless of their severity. In this article we motivate and develop a high-breakdown criterion for linear discriminant analysis and give an algorithm for its implementation. The procedure is intended to supplement rather than replace the usual sample-moment methodology of discriminant analysis either by providing indications that the dataset is not seriously affected by outliers (supporting the usual analysis) or by identifying apparently aberrant points and giving resistant estimators that are not affected by them.
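As an illustrative aside (not the article's algorithm), the effect of outlier contamination on the sample moments used by linear discriminant analysis can be sketched as follows. A crude distance-based trimming step stands in for a genuine high-breakdown estimator such as minimum covariance determinant; the data and contamination pattern are assumptions.

```python
import numpy as np

def lda_rule(mu0, mu1, cov):
    """Linear discriminant for equal priors: assign class 1 when w @ x > c."""
    w = np.linalg.solve(cov, mu1 - mu0)
    return w, w @ (mu0 + mu1) / 2

def trimmed_moments(X, keep=0.8):
    """Crude outlier-resistant moments: drop the points farthest from the
    coordinate-wise median (a stand-in for a true high-breakdown estimator)."""
    d = np.linalg.norm(X - np.median(X, axis=0), axis=1)
    kept = X[np.argsort(d)[: int(keep * len(X))]]
    return kept.mean(axis=0), np.cov(kept, rowvar=False)

rng = np.random.default_rng(2)
X0 = rng.normal([0.0, 0.0], 1.0, (100, 2))        # class 0, centred at (0, 0)
X1 = rng.normal([3.0, 3.0], 1.0, (100, 2))        # class 1, centred at (3, 3)
X0[:10] = rng.normal([20.0, 20.0], 0.5, (10, 2))  # gross outliers in class 0

mu0_c = X0.mean(axis=0)                 # classical mean, dragged toward (20, 20)
mu0_r, cov_r = trimmed_moments(X0)      # resistant estimate, stays near (0, 0)
w, c = lda_rule(mu0_r, X1.mean(axis=0), cov_r)

print("classical mean of class 0:", np.round(mu0_c, 2))
print("trimmed mean of class 0:  ", np.round(mu0_r, 2))
```

With 10% contamination the classical mean shifts by roughly two units in each coordinate, while the resistant estimate stays near the true centre, so the discriminant rule built from it is not distorted by the outliers.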

Relevance: 20.00%

Publisher:

Abstract:

When linear equality constraints are invariant through time they can be incorporated into estimation by restricted least squares. If, however, the constraints are time-varying, this standard methodology cannot be applied. In this paper we show how to incorporate linear time-varying constraints into the estimation of econometric models. The method involves the augmentation of the observation equation of a state-space model prior to estimation by the Kalman filter. Numerical optimisation routines are used for the estimation. A simple example drawn from demand analysis is used to illustrate the method and its application.
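As an illustrative aside, the augmentation idea can be sketched numerically: each time-varying constraint is appended to the observation equation as a zero-variance pseudo-observation, so the Kalman update enforces it exactly. The random-walk coefficient model, noise variances, and data below are assumptions for demonstration, not the paper's application.

```python
import numpy as np

def constrained_kf(ys, xs, a_s, cs, q=0.01, r=0.1):
    """Kalman filter for y_t = x_t @ b_t + e_t with random-walk coefficients b_t.
    The time-varying constraint a_t @ b_t = c_t is imposed by augmenting the
    observation equation with a zero-variance pseudo-observation."""
    k = xs.shape[1]
    b, P = np.zeros(k), 10.0 * np.eye(k)
    path = []
    for y, x, a, c in zip(ys, xs, a_s, cs):
        P = P + q * np.eye(k)                  # random-walk prediction step
        Z = np.vstack([x, a])                  # augmented observation matrix
        R = np.diag([r, 0.0])                  # zero variance -> exact constraint
        K = P @ Z.T @ np.linalg.inv(Z @ P @ Z.T + R)
        b = b + K @ (np.array([y, c]) - Z @ b)
        P = (np.eye(k) - K @ Z) @ P
        path.append(b.copy())
    return np.array(path)

# Hypothetical example: two coefficients constrained to sum to a drifting target s_t.
rng = np.random.default_rng(3)
T = 50
s_t = np.linspace(1.0, 2.0, T)
b_true = np.column_stack([0.4 * s_t, 0.6 * s_t])
xs = rng.normal(size=(T, 2))
ys = np.sum(xs * b_true, axis=1) + 0.1 * rng.normal(size=T)
bs = constrained_kf(ys, xs, np.ones((T, 2)), s_t)

print("max constraint violation:", np.abs(bs.sum(axis=1) - s_t).max())
```

Because the pseudo-observation has zero variance, each filtered estimate satisfies its period's constraint exactly (up to floating-point rounding), which is precisely what restricted least squares cannot deliver when the restriction changes through time.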

Relevance: 20.00%

Publisher:

Abstract:

The anisotropic norm of a linear discrete time-invariant system measures system output sensitivity to stationary Gaussian input disturbances of bounded mean anisotropy. Mean anisotropy characterizes the degree of predictability (or colouredness) and spatial non-roundness of the noise. The anisotropic norm falls between the H-2 and H-infinity norms and accommodates their loss of performance when the probability structure of input disturbances is not exactly known. This paper develops a method for numerical computation of the anisotropic norm which involves linked Riccati and Lyapunov equations and an associated equation of a special type.
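As an illustrative aside (not the paper's algorithm), the two norms that bracket the anisotropic norm can be computed for a small random stable system: the H-2 norm via the controllability Gramian's Lyapunov equation, and the H-infinity norm by frequency gridding. The system matrices and grid density below are assumptions; the anisotropic norm itself requires the paper's linked Riccati-Lyapunov machinery and is not computed here.

```python
import numpy as np

# A random discrete-time system (A, B, C), scaled so the spectral radius is 0.5.
rng = np.random.default_rng(4)
A = rng.normal(size=(3, 3))
A *= 0.5 / np.abs(np.linalg.eigvals(A)).max()
B = rng.normal(size=(3, 2))
C = rng.normal(size=(2, 3))

# H-2 norm via the controllability Gramian P solving A P A' - P + B B' = 0,
# computed by fixed-point iteration (convergent because A is stable).
P = B @ B.T
for _ in range(500):
    P = A @ P @ A.T + B @ B.T
h2 = float(np.sqrt(np.trace(C @ P @ C.T)))

# H-infinity norm approximated as the largest singular value of the transfer
# function G(z) = C (zI - A)^{-1} B over a grid on the unit circle.
h_inf = 0.0
for w in np.linspace(0.0, np.pi, 2000):
    G = C @ np.linalg.inv(np.exp(1j * w) * np.eye(3) - A) @ B
    h_inf = max(h_inf, float(np.linalg.svd(G, compute_uv=False)[0]))

print(f"H-2 norm:            {h2:.4f}")
print(f"H-infinity estimate: {h_inf:.4f}")
```

For an m-input system these satisfy h2 <= sqrt(m) * h_inf, since the squared Frobenius norm of G at each frequency is at most m times its squared largest singular value; the anisotropic norm interpolates between the two regimes as the anisotropy bound grows.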

Relevance: 20.00%

Publisher:

Abstract:

Ussing [1] considered the steady flux of a single chemical component diffusing through a membrane under the influence of chemical potentials, and derived from his linear model an expression for the ratio of this flux to that of the complementary experiment in which the boundary conditions were interchanged. Here, an extension of Ussing's flux ratio theorem is obtained for n chemically interacting components governed by a linear system of diffusion-migration equations that may also incorporate linear temporary trapping reactions. The determinants of the output flux matrices for complementary experiments are shown to satisfy an Ussing flux ratio formula for steady state conditions of the same form as for the well-known one-component case. (C) 2000 Elsevier Science Ltd. All rights reserved.
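As an illustrative aside, the classical one-component, diffusion-only case of Ussing's theorem can be checked numerically: the ratio of the fluxes in the two complementary experiments equals the ratio of the boundary concentrations, independent of the diffusivity profile. The profile, boundary values, and discretisation below are assumptions for demonstration.

```python
import numpy as np

def steady_flux(D, c_left, c_right, L=1.0, n=400):
    """Steady-state flux through a membrane with position-dependent diffusivity
    D(x), from a finite-difference solution of d/dx (D dc/dx) = 0."""
    x = np.linspace(0.0, L, n + 1)
    h = L / n
    Dm = D((x[:-1] + x[1:]) / 2)            # diffusivity at cell interfaces
    # Tridiagonal system for the n - 1 interior concentrations.
    M = np.zeros((n - 1, n - 1))
    rhs = np.zeros(n - 1)
    for i in range(n - 1):
        M[i, i] = -(Dm[i] + Dm[i + 1])
        if i > 0:
            M[i, i - 1] = Dm[i]
        if i < n - 2:
            M[i, i + 1] = Dm[i + 1]
    rhs[0] = -Dm[0] * c_left
    rhs[-1] = -Dm[-1] * c_right
    c = np.linalg.solve(M, rhs)
    return -Dm[0] * (c[0] - c_left) / h     # flux at the left boundary

D_profile = lambda x: 1.0 + 2.0 * x ** 2    # arbitrary smooth diffusivity
J_forward = steady_flux(D_profile, c_left=3.0, c_right=0.0)
J_reverse = steady_flux(D_profile, c_left=0.0, c_right=2.0)

# Ussing's flux ratio (one component, no migration): |J_forward| / |J_reverse|
# equals the concentration ratio 3.0 / 2.0, whatever D(x) is.
print("flux ratio:", J_forward / -J_reverse)
```

The n-component extension in the abstract replaces the scalar fluxes by the determinants of the output flux matrices, but the structure of the ratio formula is the same.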