118 results for Exponential Smoothing


Relevance: 10.00%

Abstract:

In this paper, various types of fault detection methods for fuel cells are compared, including those that use a model-based approach, a data-driven approach, or a combination of the two. The potential advantages and drawbacks of each method are discussed and comparisons between methods are made. In particular, classification algorithms are investigated, which separate a data set into classes or clusters based on some prior knowledge or measure of similarity. Specifically, the application of classification methods to vectors of currents reconstructed by magnetic tomography, or directly to vectors of magnetic field measurements, is explored. Bases are simulated using the finite integration technique (FIT), and regularization techniques are employed to overcome ill-posedness. Fisher's linear discriminant is used to illustrate these concepts. Numerical experiments show that the ill-posedness of the magnetic tomography problem carries over to the classification problem on magnetic field measurements as well. This is independent of the particular working mode of the cell but is influenced by the type of faulty behavior studied. The numerical results demonstrate the ill-posedness through the exponential decay of the singular values for three examples of fault classes.
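As a concrete illustration of the classification step, the sketch below applies Fisher's linear discriminant to two toy classes of two-dimensional feature vectors; the data and feature choice are invented for illustration and are not taken from the paper's simulations.

```python
# Minimal sketch of Fisher's linear discriminant for two fault classes.
# The 2-D "feature vectors" stand in for (heavily reduced) reconstructed
# currents or magnetic field measurements; all numbers are illustrative.

def mean(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def fisher_direction(class_a, class_b):
    """Return w maximising between-class over within-class scatter (2-D case)."""
    ma, mb = mean(class_a), mean(class_b)
    # Within-class scatter S_w = sum over both classes of (x - m)(x - m)^T.
    s = [[0.0, 0.0], [0.0, 0.0]]
    for data, m in ((class_a, ma), (class_b, mb)):
        for x in data:
            d = [x[0] - m[0], x[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    # w = S_w^{-1} (m_a - m_b), via the closed-form 2x2 inverse.
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    diff = [ma[0] - mb[0], ma[1] - mb[1]]
    return [(s[1][1] * diff[0] - s[0][1] * diff[1]) / det,
            (-s[1][0] * diff[0] + s[0][0] * diff[1]) / det]

# Two toy fault classes of 2-D feature vectors.
healthy = [[1.0, 1.1], [1.2, 0.9], [0.9, 1.0], [1.1, 1.2]]
faulty = [[3.0, 2.9], [3.2, 3.1], [2.8, 3.0], [3.1, 2.8]]
w = fisher_direction(healthy, faulty)
proj = lambda x: w[0] * x[0] + w[1] * x[1]
```

Projecting each vector onto w separates the two classes: for this toy data every healthy projection lies above every faulty one.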

Relevance: 10.00%

Abstract:

The bifidobacterial β-galactosidase (BbgIV) was produced in E. coli DH5α at 37 and 30 °C in a 5 L bioreactor under varied conditions of dissolved oxygen (dO2) and pH. The yield of soluble BbgIV was significantly (P < 0.05) increased once the dO2 dropped to 0–2% and remained at such low values during the exponential phase. Limited dO2 significantly (P < 0.05) increased the plasmid copy number and decreased the cells' growth rate. Consequently, the BbgIV yield increased to its maximum (71–75 mg per g dry cell weight), which represented 20–25% of the total soluble protein in the cells. In addition, the specific activity and catalytic efficiency of BbgIV were significantly (P < 0.05) enhanced under limited dO2 conditions. This was concomitant with a change in the enzyme's secondary structure, suggesting a link between the enzyme's structure and function. The knowledge generated from this work is important for producing BbgIV as a biocatalyst in a cost-effective process for the synthesis of prebiotic galactooligosaccharides from lactose.

Relevance: 10.00%

Abstract:

Background: Psychotic phenomena appear to form a continuum with normal experience and beliefs, and may build on common emotional interpersonal concerns. Aims: We tested predictions that paranoid ideation is exponentially distributed and hierarchically arranged in the general population, and that persecutory ideas build on more common cognitions of mistrust, interpersonal sensitivity and ideas of reference. Method: Items were chosen from the Structured Clinical Interview for DSM-IV Axis II Disorders (SCID-II) questionnaire and the Psychosis Screening Questionnaire in the second British National Survey of Psychiatric Morbidity (n = 8580), to test a putative hierarchy of paranoid development using confirmatory factor analysis, latent class analysis and factor mixture modelling analysis. Results: Different types of paranoid ideation ranged in frequency from less than 2% to nearly 30%. Total scores on these items followed an almost perfect exponential distribution (r = 0.99). Our four a priori first-order factors were corroborated (interpersonal sensitivity; mistrust; ideas of reference; ideas of persecution). These mapped onto four classes of individual respondents: a rare, severe, persecutory class with high endorsement of all item factors, including persecutory ideation; a quasi-normal class with infrequent endorsement of interpersonal sensitivity, mistrust and ideas of reference, and no ideas of persecution; and two intermediate classes, characterised respectively by relatively high endorsement of items relating to mistrust and to ideas of reference. Conclusions: The paranoia continuum has implications for the aetiology, mechanisms and treatment of psychotic disorders, while confirming the lack of a clear distinction from normal experiences and processes.
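The near-perfect exponential fit reported for the total scores can be illustrated numerically: if frequencies decay exponentially with the score, log-frequency is linear in the score and the correlation of that fit approaches 1. The counts below are synthetic and are not the survey data.

```python
import math

# Synthetic score frequencies that roughly halve with each extra point.
scores = [0, 1, 2, 3, 4, 5]
counts = [5000, 2400, 1150, 560, 270, 130]

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# For an exponential distribution, log(count) is linear in the score,
# so |r| of the log-linear fit is close to 1.
r = pearson_r(scores, [math.log(c) for c in counts])
```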

Relevance: 10.00%

Abstract:

In this paper we propose and analyze a hybrid $hp$ boundary element method for the solution of problems of high-frequency acoustic scattering by sound-soft convex polygons, in which the approximation space is enriched with oscillatory basis functions that efficiently capture the high-frequency asymptotics of the solution. We demonstrate, both theoretically and via numerical examples, exponential convergence with respect to the order of the polynomials, and we provide rigorous error estimates for our approximations to the solution and to the far-field pattern, in which the dependence of all constants on the frequency is explicit. Importantly, these estimates prove that, to achieve any desired accuracy in the computation of these quantities, it is sufficient to increase the number of degrees of freedom in proportion to the logarithm of the frequency as the frequency increases, in contrast to the at least linear growth required by conventional methods.

Relevance: 10.00%

Abstract:

Sea surface temperature (SST) can be estimated from day and night observations of the Spinning Enhanced Visible and Infra-Red Imager (SEVIRI) by optimal estimation (OE). We show that exploiting the 8.7 μm channel, in addition to the "traditional" wavelengths of 10.8 and 12.0 μm, improves OE SST retrieval statistics in validation. However, the main benefit is an improvement in the sensitivity of the SST estimate to variability in true SST. In a fair, single-pixel comparison, the 3-channel OE gives better results than the SST estimation technique presently operational within the Ocean and Sea Ice Satellite Application Facility. The operational technique applies SST retrieval coefficients, followed by a bias-correction step informed by radiative transfer simulation. It also includes an additional "atmospheric correction smoothing", which improves its noise performance and hitherto had no analogue within the OE framework. Here, we propose an analogue to atmospheric correction smoothing, based on the expectation that atmospheric total column water vapour has a longer spatial correlation length scale than SST features. The approach extends the observations input to the OE to include the averaged brightness temperatures (BTs) of nearby clear-sky pixels, in addition to the BTs of the pixel for which SST is being retrieved. The retrieved quantities are then the single-pixel SST and the clear-sky total column water vapour averaged over the vicinity of the pixel. This reduces the noise in the retrieved SST significantly. The robust standard deviation of the new OE SST compared to matched drifting buoys becomes 0.39 K for all data. The smoothed OE gives an SST sensitivity of 98% on average. This means that diurnal temperature variability and ocean frontal gradients are more faithfully estimated, and that the influence of the prior SST used is minimal (2%). This benefit is not available using traditional atmospheric correction smoothing.
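For readers unfamiliar with OE retrieval, here is a minimal sketch of the linear optimal-estimation update with a two-element state (SST and total column water vapour) and three brightness-temperature channels; the Jacobian, prior and covariances are invented for illustration and do not reproduce the operational SEVIRI configuration.

```python
def solve(a, b):
    """Solve a x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# Prior state [SST (K), TCWV (kg m^-2)] and its covariance (illustrative).
x_a = [290.0, 30.0]
S_a = [[1.0, 0.0], [0.0, 25.0]]
# Invented Jacobian dBT/dstate for the 8.7, 10.8 and 12.0 um channels.
K = [[0.80, -0.10], [0.90, -0.05], [0.95, -0.15]]
noise = 0.04                     # per-channel noise variance (K^2)

# Simulated observations from a "true" state, with a linear forward model.
x_t = [292.0, 40.0]
y = [sum(K[i][j] * x_t[j] for j in range(2)) for i in range(3)]

# OE update: x_hat = x_a + S_a K^T (K S_a K^T + S_e)^{-1} (y - K x_a).
SaKT = [[sum(S_a[i][k] * K[j][k] for k in range(2)) for j in range(3)]
        for i in range(2)]
A = [[sum(K[i][k] * SaKT[k][j] for k in range(2)) + (noise if i == j else 0.0)
     for j in range(3)] for i in range(3)]
d = [y[i] - sum(K[i][j] * x_a[j] for j in range(2)) for i in range(3)]
z = solve(A, d)
x_hat = [x_a[i] + sum(SaKT[i][j] * z[j] for j in range(3)) for i in range(2)]
```

With three channels and low noise the update pulls the state from the prior toward the truth; for these numbers the retrieved SST lands within about half a kelvin of the true 292 K.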

Relevance: 10.00%

Abstract:

The use of Bayesian inference for time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing-spline-based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We demonstrate the algorithm by tracking chirps and by analysing musical data.
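As a simplified stand-in for the online estimator (the paper's method uses a particle filter with a local-Whittle likelihood, which is beyond a short sketch), the following recursive, exponentially weighted periodogram tracks a time-varying dominant frequency; window length, forgetting factor and the test signal are all illustrative.

```python
import cmath
import math

def periodogram(window):
    """Naive DFT periodogram over the positive-frequency bins."""
    n = len(window)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(window))) ** 2 / n
            for k in range(n // 2)]

def track_peak(signal, n=32, lam=0.6):
    """Dominant frequency bin after each window, smoothed recursively:
    est <- lam * est + (1 - lam) * periodogram(latest window)."""
    est, peaks = None, []
    for start in range(0, len(signal) - n + 1, n):
        p = periodogram(signal[start:start + n])
        est = p if est is None else [lam * e + (1 - lam) * q for e, q in zip(est, p)]
        peaks.append(max(range(len(est)), key=lambda k: est[k]))
    return peaks

# Two-tone test signal: frequency bin 2 for the first half, bin 8 for the second.
sig = [math.sin(2 * math.pi * 2 * t / 32) for t in range(160)] + \
      [math.sin(2 * math.pi * 8 * t / 32) for t in range(160)]
peaks = track_peak(sig)
```

The tracked peak stays at bin 2 and then locks on to bin 8 shortly after the frequency change, the short delay reflecting the forgetting factor.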

Relevance: 10.00%

Abstract:

We study the influence of the intrinsic curvature on the large-time behaviour of the heat equation in a tubular neighbourhood of an unbounded geodesic in a two-dimensional Riemannian manifold. Since we consider killing boundary conditions, there is always an exponential-type decay for the heat semigroup. We show that this exponential-type decay is slower for positively curved manifolds compared to the flat case. As the main result, we establish a sharp extra polynomial-type decay for the heat semigroup on negatively curved manifolds compared to the flat case. The proof employs the existence of Hardy-type inequalities for the Dirichlet Laplacian in the tubular neighbourhoods on negatively curved manifolds and the method of self-similar variables and weighted Sobolev spaces for the heat equation.

Relevance: 10.00%

Abstract:

Flood simulation models and hazard maps are only as good as the underlying data against which they are calibrated and tested. However, extreme flood events are by definition rare, so the observational data of flood inundation extent are limited in both quality and quantity. The relative importance of these observational uncertainties has increased now that computing power and accurate lidar scans make it possible to run high-resolution 2D models to simulate floods in urban areas. However, the value of these simulations is limited by the uncertainty in the true extent of the flood. This paper addresses that challenge by analyzing a point dataset of maximum water extent from a flood event on the River Eden at Carlisle, United Kingdom, in January 2005. The observation dataset is based on a collection of wrack and water marks from two postevent surveys. A smoothing algorithm for identifying, quantifying, and reducing localized inconsistencies in the dataset is proposed and evaluated, showing positive results. The proposed smoothing algorithm can be applied to improve the assessment of flood inundation models and the determination of risk zones on the floodplain.
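A minimal sketch of one plausible form of such localized-inconsistency smoothing, assuming a median rule over nearby marks; the search radius, tolerance and surveyed points are invented, and the paper's actual algorithm may differ in detail.

```python
def smooth_marks(points, radius, tol=0.5):
    """points: list of (x, y, level); replace levels deviating from the local
    median of neighbouring marks by more than tol."""
    out = []
    for i, (xi, yi, zi) in enumerate(points):
        neigh = sorted(z for j, (x, y, z) in enumerate(points)
                       if j != i and (x - xi) ** 2 + (y - yi) ** 2 <= radius ** 2)
        if not neigh:
            out.append(zi)
            continue
        m = neigh[len(neigh) // 2]          # (upper) median of neighbour levels
        out.append(m if abs(zi - m) > tol else zi)
    return out

# Five wrack/water marks along a reach; the third is locally inconsistent
# (e.g. debris caught above the true maximum water line).
marks = [(0, 0, 10.1), (50, 0, 10.2), (100, 0, 12.9), (150, 0, 10.3), (200, 0, 10.2)]
smoothed = smooth_marks(marks, radius=160.0)
```

Only the inconsistent third mark is adjusted (to the neighbourhood median of 10.2); consistent marks pass through unchanged.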

Relevance: 10.00%

Abstract:

Considerable effort is presently being devoted to producing high-resolution sea surface temperature (SST) analyses with a goal of spatial grid resolutions as low as 1 km. Because grid resolution is not the same as feature resolution, a method is needed to objectively determine the resolution capability and accuracy of SST analysis products. Ocean model SST fields are used in this study as simulated "true" SST data and subsampled based on actual infrared and microwave satellite data coverage. The subsampled data are used to simulate sampling errors due to missing data. Two different SST analyses are considered and run using both the full and the subsampled model SST fields, with and without additional noise. The results are compared as a function of spatial scales of variability using wavenumber auto- and cross-spectral analysis. The spectral variance at high wavenumbers (smallest wavelengths) is shown to be attenuated relative to the true SST because of smoothing that is inherent to both analysis procedures. Comparisons of the two analyses (both having similar grid sizes) show important differences. One analysis tends to reproduce small-scale features more accurately when the high-resolution data coverage is good but produces more spurious small-scale noise when the high-resolution data coverage is poor. Analysis procedures can thus generate small-scale features with and without data, but the small-scale features in an SST analysis may be just noise when high-resolution data are sparse. Users must therefore be skeptical of high-resolution SST products, especially in regions where high-resolution (~5 km) infrared satellite data are limited because of cloud cover.
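The attenuation of small-scale variance by analysis smoothing can be demonstrated on a toy 1-D "SST transect": a simple running mean preserves low-wavenumber variance while strongly damping high wavenumbers. The transect and the 3-point smoother are stand-ins for the model fields and analysis procedures.

```python
import cmath
import math

def power_spectrum(x):
    """Naive DFT power spectrum over the positive-frequency bins."""
    n = len(x)
    return [abs(sum(v * cmath.exp(-2j * math.pi * k * t / n)
                    for t, v in enumerate(x))) ** 2 / n
            for k in range(n // 2)]

def running_mean3(x):
    """3-point periodic running mean, standing in for analysis smoothing."""
    n = len(x)
    return [(x[(i - 1) % n] + x[i] + x[(i + 1) % n]) / 3 for i in range(n)]

n = 64
# "True" transect: a large low-wavenumber feature plus a weaker small-scale one.
true_sst = [math.sin(2 * math.pi * 3 * t / n) + 0.5 * math.sin(2 * math.pi * 20 * t / n)
            for t in range(n)]
analysed = running_mean3(true_sst)
p_true, p_anal = power_spectrum(true_sst), power_spectrum(analysed)
ratio_low = p_anal[3] / p_true[3]     # low-wavenumber variance nearly preserved
ratio_high = p_anal[20] / p_true[20]  # high-wavenumber variance strongly damped
```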

Relevance: 10.00%

Abstract:

In this article, we investigate how the choice of the attenuation factor in an extended version of Katz centrality influences the centrality of the nodes in evolving communication networks. For given snapshots of a network, observed over a period of time, recently developed communicability indices aim to identify the best broadcasters and listeners (receivers) in the network. Here we explore the attenuation factor constraint, in relation to the spectral radius (the largest eigenvalue) of the network at any point in time, and its computation in the case of large networks. We compare three different communicability measures: standard, exponential, and relaxed (where the spectral radius bound on the attenuation factor is relaxed and the adjacency matrix is normalised, in order to maintain the convergence of the measure). Furthermore, using a vitality-based measure of both standard and relaxed communicability indices, we look at ways of establishing the most important individuals for broadcasting and receiving of messages, related to community bridging roles. We compare those measures with the scores produced by an iterative version of the PageRank algorithm and illustrate our findings with three examples of real-life evolving networks: the MIT reality mining data set, consisting of daily communications between 106 individuals over the period of one year; a UK Twitter mentions network, constructed from the direct tweets between 12.4k individuals during one week; and a subset of the Enron email data set.
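A small sketch of Katz centrality and the attenuation-factor constraint discussed here: the defining series converges only while alpha < 1/rho(A), where rho(A) is the spectral radius of the adjacency matrix. The 4-node network and the choice alpha = 0.3 are illustrative.

```python
def spectral_radius(a, iters=100):
    """Estimate the largest eigenvalue of a nonnegative matrix by power iteration."""
    n = len(a)
    v = [1.0] * n
    r = 1.0
    for _ in range(iters):
        w = [sum(a[i][j] * v[j] for j in range(n)) for i in range(n)]
        r = max(abs(x) for x in w)
        v = [x / r for x in w]
    return r

def katz(a, alpha, iters=500):
    """Fixed-point iteration of c = 1 + alpha * A c; converges for alpha < 1/rho(A)."""
    n = len(a)
    c = [1.0] * n
    for _ in range(iters):
        c = [1.0 + alpha * sum(a[i][j] * c[j] for j in range(n)) for i in range(n)]
    return c

# A small undirected communication network; node 2 has the most contacts.
A = [[0, 1, 1, 0],
     [1, 0, 1, 0],
     [1, 1, 0, 1],
     [0, 0, 1, 0]]
rho = spectral_radius(A)   # about 2.17 for this graph
alpha = 0.3                # safely below 1 / rho, so the series converges
centrality = katz(A, alpha)
```

As expected, the best-connected node (node 2) receives the highest Katz score; pushing alpha toward or past 1/rho would make the iteration diverge.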

Relevance: 10.00%

Abstract:

We study the solutions of the Smoluchowski coagulation equation with a regularization term which removes clusters from the system when their mass exceeds a specified cutoff size, M. We focus primarily on collision kernels which would exhibit an instantaneous gelation transition in the absence of any regularization. Numerical simulations demonstrate that for such kernels with monodisperse initial data, the regularized gelation time decreases as M increases, consistent with the expectation that the gelation time is zero in the unregularized system. This decrease appears to be a logarithmically slow function of M, indicating that instantaneously gelling kernels may still be justifiable as physical models despite the fact that they are highly singular in the absence of a cutoff. We also study the case when a source of monomers is introduced in the regularized system. In this case a stationary state is reached. We present a complete analytic description of this regularized stationary state for the model kernel K(m1, m2) = max{m1, m2}^ν, which gels instantaneously when M → ∞ if ν > 1. The stationary cluster size distribution decays as a stretched exponential for small cluster sizes and crosses over to a power-law decay with exponent ν for large cluster sizes. The total particle density in the stationary state slowly vanishes as [(ν − 1) log M]^(−1/2) when M → ∞. The approach to the stationary state is nontrivial: oscillations about the stationary state emerge from the interplay between the monomer injection and the cutoff, M, and these decay very slowly when M is large. A quantitative analysis of these oscillations is provided for the addition model, which describes the situation in which clusters can only grow by absorbing monomers.
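The regularized model with a monomer source can be explored with a crude forward-Euler integration of the truncated Smoluchowski system for the model kernel K(m1, m2) = max{m1, m2}^ν; the cutoff M, injection rate, time step and horizon below are illustrative, and clusters formed with mass above M are simply discarded.

```python
def simulate(M=10, nu=2.0, J=1.0, dt=0.001, steps=20000, checkpoint=12000):
    """Forward-Euler integration of the truncated coagulation equations
    with a monomer source and removal of clusters heavier than M."""
    Kmat = [[float(max(i, j)) ** nu for j in range(1, M + 1)]
            for i in range(1, M + 1)]
    c = [0.0] * M           # c[k] is the density of clusters of mass k + 1
    mid = None
    for s in range(steps):
        loss_rate = [sum(Kmat[k][j] * c[j] for j in range(M)) for k in range(M)]
        new = []
        for k in range(M):
            # Gain from coagulation of mass pairs (i + 1) + (k - i) = k + 1.
            gain = 0.5 * sum(Kmat[i][k - 1 - i] * c[i] * c[k - 1 - i]
                             for i in range(k))
            new.append(c[k] + dt * (gain - c[k] * loss_rate[k]))
        new[0] += dt * J    # monomer injection
        c = new             # clusters pushed above mass M have been dropped
        if s == checkpoint:
            mid = sum(c)
    return c, mid

c, mid_total = simulate()
total = sum(c)
```

For this small cutoff the system settles into a stationary state well within the integration horizon, with a cluster distribution that decreases with size.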

Relevance: 10.00%

Abstract:

Protons and electrons are exploited in different natural charge transfer processes. Both types of charge carriers could therefore be responsible for charge transport in biomimetic self-assembled peptide nanostructures. The relative contribution of each type of charge carrier is studied in the present work for fibrils self-assembled from amyloid-β-derived peptide molecules, in which two non-natural thiophene-based amino acids are included. It is shown that under low-humidity conditions both electrons and protons contribute to the conduction, with a current ratio of 1:2, respectively, while at higher relative humidity proton transport dominates the conductance. This hybrid conduction behavior leads to a bimodal exponential dependence of the conductance on the relative humidity. Furthermore, in both cases the conductance is shown to be affected by the peptide folding state over the entire relative humidity range. This unique hybrid conductivity behavior makes self-assembled peptide nanostructures powerful building blocks for the construction of electric devices that could use either or both types of charge carriers for their function.

Relevance: 10.00%

Abstract:

Various complex oscillatory processes are involved in the generation of the motor command. The temporal dynamics of these processes were studied for movement detection from single-trial electroencephalogram (EEG) recordings. Autocorrelation analysis was performed on the EEG signals to find robust markers of movement detection. The evolution of the autocorrelation function was characterised via the relaxation time of the autocorrelation, obtained by exponential curve fitting. It was observed that the decay constant of the exponential curve increased during movement, indicating that the autocorrelation function decays more slowly during motor execution. Significant differences were observed between movement and no-movement tasks. Additionally, a linear discriminant analysis (LDA) classifier was used to identify movement trials, with a peak accuracy of 74%.
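The analysis pipeline can be sketched as follows: estimate the autocorrelation of a single-trial signal, then fit an exponential to its initial decay to obtain the relaxation time. AR(1) noise with two different memory parameters stands in for EEG in the two conditions; a slower-decaying autocorrelation yields a larger decay constant.

```python
import math
import random

def autocorr(x, max_lag):
    """Sample autocorrelation function up to max_lag."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x)
    return [sum((x[t] - m) * (x[t + k] - m) for t in range(n - k)) / var
            for k in range(max_lag + 1)]

def relaxation_time(acf):
    """Fit acf[k] ~ exp(-k / tau) by least squares on log(acf) over the
    initial positive segment; returns the decay constant tau."""
    ks, ys = [], []
    for k, a in enumerate(acf):
        if a <= 0:
            break
        ks.append(k)
        ys.append(math.log(a))
    mk, my = sum(ks) / len(ks), sum(ys) / len(ys)
    slope = (sum((k - mk) * (y - my) for k, y in zip(ks, ys))
             / sum((k - mk) ** 2 for k in ks))
    return -1.0 / slope

def ar1(phi, n=4000):
    """AR(1) noise: theoretical autocorrelation decays as phi**k."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, 1.0)
        out.append(x)
    return out

random.seed(1)
tau_rest = relaxation_time(autocorr(ar1(0.80), 20))   # faster decay
tau_move = relaxation_time(autocorr(ar1(0.95), 20))   # slower decay
```

The slowly decaying ("movement-like") signal yields the larger relaxation time, mirroring the increase in decay constant reported during motor execution.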

Relevance: 10.00%

Abstract:

Binding to bovine serum albumin of monomeric (vescalagin and pedunculagin) and dimeric ellagitannins (roburin A, oenothein B, and gemin A) was investigated by isothermal titration calorimetry and fluorescence spectroscopy, which indicated two types of binding sites. Stronger and more specific sites exhibited affinity constants, K1, of 10^4–10^6 M^-1 and stoichiometries, n1, of 2–13 and dominated at low tannin concentrations. Weaker and less-specific binding sites had K2 constants of 10^3–10^5 M^-1 and stoichiometries, n2, of 16–30 and dominated at higher tannin concentrations. Binding to stronger sites appeared to be dependent on tannin flexibility and the presence of free galloyl groups. Positive entropies for all but gemin A indicated that hydrophobic interactions dominated during complexation. This was supported by an exponential relationship between the affinity, K1, and the modeled hydrophobic accessible surface area and by a linear relationship between K1 and the Stern–Volmer quenching constant, KSV.

Relevance: 10.00%

Abstract:

Transient responses of electrorheological fluids to square-wave electric fields in steady shear are investigated by a computational simulation method. The structural responses of the fluids to a field with high frequency are found to be very similar to those to a field with very low frequency or to a suddenly applied direct-current field. The stress rise processes are also similar in both cases and can be described by an exponential expression. The characteristic time τ of the stress response is found to decrease as the shear rate γ̇ and the area fraction of the particles φ2 increase. The relation between them can be roughly expressed as τ ∝ γ̇^(−3/4) φ2^(−3/2). The simulation results are compared with experimental measurements. The aggregation kinetics of the particles in steady shear is also discussed in light of these results.
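A short sketch of how the characteristic time τ can be extracted from an exponential stress rise σ(t) = σ∞(1 − e^(−t/τ)), together with the reported scaling τ ∝ γ̇^(−3/4) φ2^(−3/2); the synthetic trace and unit prefactor are illustrative, not simulation output.

```python
import math

def rise_time(ts, sigmas, sigma_inf):
    """Least-squares fit (through the origin) of log(1 - sigma/sigma_inf) = -t/tau."""
    pairs = [(t, math.log(1.0 - s / sigma_inf))
             for t, s in zip(ts, sigmas) if s < sigma_inf]
    num = sum(t * y for t, y in pairs)
    den = sum(t * t for t, _ in pairs)
    return -den / num

def tau_scaling(gammadot, phi2, c=1.0):
    """Reported scaling tau ~ gammadot**(-3/4) * phi2**(-3/2); prefactor c unknown."""
    return c * gammadot ** (-0.75) * phi2 ** (-1.5)

# Synthetic transient with tau = 0.2 and sigma_inf = 50 (arbitrary units).
ts = [0.01 * i for i in range(1, 100)]
sigmas = [50.0 * (1.0 - math.exp(-t / 0.2)) for t in ts]
tau = rise_time(ts, sigmas, 50.0)
```

For this noise-free trace the fit recovers τ = 0.2; in the scaling law, doubling the shear rate multiplies τ by 2^(−3/4).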