877 results for Non-stationary iterative method


Relevance:

100.00%

Publisher:

Abstract:

In this paper we show how to accurately perform a quasi-a priori estimation of the truncation error of steady-state solutions computed by a discontinuous Galerkin spectral element method. We estimate the spatial truncation error using the τ-estimation procedure. While most works in the literature rely on fully time-converged solutions on grids with different spacing to perform the estimation, we use non-time-converged solutions on one grid with different polynomial orders. The quasi-a priori approach estimates the error while the residual of the time-iterative method is not yet negligible. Furthermore, the method permits one to decouple the surface and the volume contributions of the truncation error, and provides information about the anisotropy of the solution as well as its rate of convergence in polynomial order. First, we focus on the analysis of one-dimensional scalar conservation laws to examine the accuracy of the estimate. Then, we extend the analysis to two-dimensional problems. We demonstrate that this quasi-a priori approach yields a spectrally accurate estimate of the truncation error.
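
The essence of τ-estimation is easiest to see in a toy setting: insert a more accurate solution into the coarser discrete operator and read off the residual. The sketch below is our own hypothetical finite-difference analogue (not the paper's DG spectral element formulation), where a finer grid plays the role of the higher polynomial order for a 1-D Poisson problem.

```python
import numpy as np

def laplacian(n):
    # 3-point Laplacian for u'' on (0,1) with u(0) = u(1) = 0, n interior nodes
    h = 1.0 / (n + 1)
    A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1)) / h**2
    return A, np.linspace(h, 1.0 - h, n)

f = lambda x: -np.pi**2 * np.sin(np.pi * x)   # manufactured so that u = sin(pi x)
u_exact = lambda x: np.sin(np.pi * x)

n_c, k = 32, 8                                # coarse size; fine grid is k times finer
A_c, x_c = laplacian(n_c)
A_f, x_f = laplacian((n_c + 1) * k - 1)
u_f = np.linalg.solve(A_f, f(x_f))            # accurate "higher-order" reference solve

u_f_on_c = u_f[k - 1::k]                      # restrict fine solution to coarse nodes
tau_est = A_c @ u_f_on_c - f(x_c)             # estimated truncation error of coarse operator

tau_true = A_c @ u_exact(x_c) - f(x_c)        # exact truncation error for comparison
# relative mismatch is about (1/k)^2, i.e. the estimate tracks tau closely
print(np.max(np.abs(tau_est - tau_true)) / np.max(np.abs(tau_true)))
```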

Relevance:

100.00%

Publisher:

Abstract:

In recent years, the detrended fluctuation analysis (DFA) introduced by Peng was established as an important tool for detecting long-range autocorrelation in non-stationary time series. This technique has been successfully applied in areas such as econophysics, biophysics, medicine, physics, and climatology. In this study, we used DFA to obtain the Hurst exponent (H) of the density log profiles (RHOB) of 53 wells from the Namorado Field School. We ask whether H can be used to characterize the field spatially. Two cases arise: in the first, the set of H values reflects the local geology, with geographically closer wells showing similar H, so that H can be used in geostatistical procedures; in the second, each well has its own H, the well information is uncorrelated, and the profiles show only random fluctuations in H with no spatial structure. Cluster analysis is a widely used statistical method; here we use the non-hierarchical k-means method. To verify whether a clustering generated by the k-means method shows spatial patterns, we introduce the parameter Ω (neighborhood index): high Ω indicates more aggregated data, while low Ω indicates dispersed data without spatial correlation. Using this index together with Monte Carlo simulation, we verify that randomly clustered data show a distribution of Ω lower than the Ω of the actual clusters. We thus conclude that the H values obtained from the 53 wells are spatially grouped and can be used to characterize spatial patterns. The analysis of level curves confirmed the k-means results.
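
For reference, here is a minimal DFA-1 sketch (our own illustration, not the authors' code) that recovers the scaling exponent of a 1-D series such as a density log; applied to white noise it should return a value near 0.5.

```python
import numpy as np

def dfa(series, scales):
    # integrated, mean-subtracted profile of the series
    profile = np.cumsum(series - np.mean(series))
    fluct = []
    for n in scales:
        n_windows = len(profile) // n
        rms = []
        for w in range(n_windows):
            seg = profile[w * n:(w + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        fluct.append(np.mean(rms))
    # slope of log F(n) versus log n is the DFA exponent (a Hurst-type exponent)
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(0)
noise = rng.normal(size=4096)                 # white noise: expected exponent near 0.5
print(dfa(noise, np.array([16, 32, 64, 128, 256])))
```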

Relevance:

100.00%

Publisher:

Abstract:

United States federal agencies assess flood risk using Bulletin 17B procedures, which assume that annual maximum flood series are stationary. This is a significant limitation of current flood frequency models, as the flood distribution is thereby assumed to be unaffected by trends or periodicity in atmospheric/climatic variables and/or by anthropogenic activities. The validity of this assumption is at the core of this thesis, which aims to improve understanding of the forms and potential causes of non-stationarity in flood series for moderately impaired watersheds in the Upper Midwest and Northeastern US. Prior studies investigated non-stationarity in flood series for unimpaired watersheds; however, since the majority of streams are located in areas of increasing human activity, the relative and coupled impacts of natural and anthropogenic factors need to be considered, so that non-stationary flood frequency models can be developed for flood risk forecasting over planning horizons relevant to large-scale water resources planning and management.

Relevance:

100.00%

Publisher:

Abstract:

We consider the Cauchy problem for the Laplace equation in 3-dimensional doubly-connected domains, that is, the reconstruction of a harmonic function from knowledge of the function values and normal derivative on the outer of two closed boundary surfaces. We employ the alternating iterative method, which is a regularizing procedure for the stable determination of the solution. In each iteration step, mixed boundary value problems are solved. The solution to each mixed problem is represented as a sum of two single-layer potentials, giving two unknown densities (one for each of the two boundary surfaces) to determine; matching the given boundary data yields a system of boundary integral equations to be solved for the densities. For the discretisation, Weinert's method [24] is employed, which generates a Galerkin-type procedure for the numerical solution by rewriting the boundary integrals over the unit sphere and expanding the densities in terms of spherical harmonics. Numerical results are included as well.

Relevance:

100.00%

Publisher:

Abstract:

Non-intrusive monitoring of the health state of induction machines within industrial processes and harsh environments poses a technical challenge. In the field, winding failures are a major fault, accounting for over 45% of total machine failures. In the literature, many condition monitoring techniques based on different failure mechanisms and fault indicators have been developed, of which machine current signature analysis (MCSA) is currently a very popular and effective method. However, it is extremely difficult to distinguish different types of failures, and hard to obtain local information, if a non-intrusive method is adopted. Typically, some sensors need to be installed inside the machines to collect key information, which leads to disruption of machine operation and additional costs. This paper presents a new non-invasive monitoring method based on giant magnetoresistance (GMR) sensors to measure the stray flux leaking from the machines. It focuses on the influence of potential winding failures on the stray magnetic flux in induction machines. Finite element analysis and experimental tests on a 1.5-kW machine are presented to validate the proposed method. With time-frequency spectrogram analysis, the method proves effective in detecting several winding faults from the stray flux information. The novelty lies in the implementation of GMR sensing and the analysis of machine faults.
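
To give a flavor of the time-frequency step, the sketch below builds a synthetic stray-flux-like signal in which a fault sideband appears halfway through the record and detects it with scipy's spectrogram. The 50 Hz supply component, the 150 Hz sideband and all amplitudes are illustrative assumptions of ours, not measured GMR data.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 5000.0                                       # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
supply = np.sin(2 * np.pi * 50 * t)               # 50 Hz supply component
fault = 0.1 * np.sin(2 * np.pi * 150 * t) * (t > 1.0)  # sideband appearing at t = 1 s
x = supply + fault + 0.02 * np.random.default_rng(1).normal(size=t.size)

f, tt, Sxx = spectrogram(x, fs=fs, nperseg=1024, noverlap=512)
# compare mean power in the 140-160 Hz band before and after the synthetic fault onset
band = (f > 140) & (f < 160)
print(Sxx[band][:, tt < 1.0].mean(), Sxx[band][:, tt > 1.0].mean())
```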

Relevance:

100.00%

Publisher:

Abstract:

The current approach to data analysis for the Laser Interferometer Space Antenna (LISA) depends on the time delay interferometry (TDI) observables, which have to be generated before any weak signal detection can be performed. These are linear combinations of the raw data, with appropriate time shifts, that lead to the cancellation of the laser frequency noises. This is possible because of the multiple occurrences of the same noises in the different raw data streams. Originally, these observables were derived manually, starting with LISA as a simple stationary array and then adjusting for the antenna's motion. However, none of the observables survived the flexing of the arms, in that they no longer led to cancellation with the same structure. The principal component approach presented by Romano and Woan is another way of handling these noises; it simplifies the data analysis by removing the need to construct the observables before the analysis. This method also depends on the multiple occurrences of the same noises but, instead of using them for cancellation, it takes advantage of the correlations that they produce between the different readings. These correlations can be expressed in a noise (data) covariance matrix, which appears in the Bayesian likelihood function when the noises are assumed to be Gaussian. Romano and Woan showed that an eigendecomposition of this matrix produces two distinct sets of eigenvalues, distinguished by the absence of laser frequency noise from one set. Transforming the raw data using the corresponding eigenvectors likewise produces data free from the laser frequency noises. This result led to the idea that the principal components may actually be time delay interferometry observables, since they produce the same outcome: data that are free from laser frequency noise. The aims here were (i) to investigate the connection between the principal components and these observables, (ii) to prove that data analysis using them is equivalent to that using the traditional observables, and (iii) to determine how this method adapts to the real LISA, especially the flexing of the antenna. To test the connection between the principal components and the TDI observables, a 10 × 10 covariance matrix containing integer values was used in order to obtain an algebraic solution for the eigendecomposition. The matrix was generated using fixed unequal arm lengths and stationary noises with equal variances for each noise type. The results confirm that all four Sagnac observables can be generated from the eigenvectors of the principal components. The observables obtained from this method, however, are tied to the length of the data and are not general expressions like the traditional observables; for example, the Sagnac observables for two different time stamps were generated from different sets of eigenvectors. It was also possible to generate the frequency-domain optimal AET observables from the principal components obtained from the power spectral density matrix. These results indicate that this method is another way of producing the observables, and therefore analysis using principal components should give the same results as analysis using the traditional observables. This was proven by the fact that the same relative likelihoods (within 0.3%) were obtained from the Bayesian estimates of the signal amplitude of a simple sinusoidal gravitational wave using the principal components and the optimal AET observables.
This method fails if the eigenvalues that are free from laser frequency noises are not generated. These are obtained from the covariance matrix, and the properties of LISA required for its computation are the phase-locking, the arm lengths and the noise variances. Preliminary results on the effects of these properties on the principal components indicate that only the absence of phase-locking prevented their production. The flexing of the antenna results in time-varying arm lengths, which appear in the covariance matrix; from our toy model investigations, this did not prevent the occurrence of the principal components. The difficulty with flexing, and also with non-stationary noises, is that the Toeplitz structure of the matrix is destroyed, which affects any computation methods that take advantage of this structure. In terms of separating the two sets of data for the analysis, this was not necessary, because the laser frequency noises are very large compared to the photodetector noises, which results in a significant reduction of the data containing them after the matrix inversion. In the frequency domain the power spectral density matrices are block diagonal, which simplifies the computation of the eigenvalues by allowing it to be done separately for each block. The results in general showed a lack of principal components in the absence of phase-locking, except for the zero bin. The major difference with the power spectral density matrix is that the time-varying arm lengths and the non-stationarity do not show up, because of the summation in the Fourier transform.
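
A toy numerical illustration of the Romano and Woan idea (our own minimal sketch, not the LISA pipeline; the three channels, the coupling vector and the noise levels are all assumptions): channels sharing one large common "laser" noise have covariance eigenvectors whose small-eigenvalue combinations cancel that noise, just as the laser-noise-free principal components do.

```python
import numpy as np

rng = np.random.default_rng(2)
n_samp, sigma_laser, sigma_det = 20000, 100.0, 1.0
M = np.array([[1.0], [1.0], [1.0]])          # same laser noise enters all 3 channels
laser = sigma_laser * rng.normal(size=(1, n_samp))
det = sigma_det * rng.normal(size=(3, n_samp))
y = M @ laser + det                          # raw data: huge common noise + small local noise

C = np.cov(y)                                # 3 x 3 data covariance
w, V = np.linalg.eigh(C)                     # eigenvalues in ascending order
print(w)                                     # two near sigma_det^2, one near 3 * sigma_laser^2
clean = V[:, :2].T @ y                       # combinations from the small-eigenvalue set
print(clean.std(axis=1))                     # near sigma_det: the laser noise has cancelled
```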

Relevance:

100.00%

Publisher:

Abstract:

In recent years, autonomous aerial vehicles have gained great popularity in a variety of automation applications. To accomplish varied and challenging tasks, the capability of generating trajectories has assumed a key role. As higher performance is sought, traditional flatness-based trajectory generation schemes show their limitations, since these approaches neglect the highly nonlinear dynamics of the quadrotor. Strategies based on optimal control principles therefore turn out to be beneficial: in the trajectory generation process they allow the control unit to best exploit the actual dynamics, and they enable the drone to perform quite aggressive maneuvers. This dissertation is concerned with the development of an optimal control technique to generate trajectories for autonomous drones. The algorithm adopted to this end is a second-order iterative method working directly in continuous time which, under proper initialization, guarantees quadratic convergence to a locally optimal trajectory. At each iteration a quadratic approximation of the cost functional is minimized, and a descent direction is obtained as a linear-affine control law after solving a differential Riccati equation. The algorithm has been implemented, and its effectiveness has been tested on the vectored-thrust dynamical model of a quadrotor in a realistic simulation setup.
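
To make the Riccati step concrete, here is a minimal sketch of ours (a double integrator standing in for the linearized quadrotor dynamics, with hypothetical cost weights) that integrates a differential Riccati equation backwards in time and recovers the linear feedback gain of the affine control law.

```python
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0], [0.0, 0.0]])       # double-integrator dynamics
B = np.array([[0.0], [1.0]])
Q = np.eye(2)                                # state cost weight (assumed)
R = np.array([[0.1]])                        # control cost weight (assumed)
Pf = np.eye(2)                               # terminal cost weight (assumed)
T = 5.0                                      # horizon, s

def riccati_rhs(s, p):
    # s = T - t, so integrating forward in s runs the Riccati equation backwards in t
    P = p.reshape(2, 2)
    dP = A.T @ P + P @ A - P @ B @ np.linalg.solve(R, B.T) @ P + Q
    return dP.ravel()

sol = solve_ivp(riccati_rhs, (0.0, T), Pf.ravel())
P0 = sol.y[:, -1].reshape(2, 2)              # Riccati solution at t = 0
K0 = np.linalg.solve(R, B.T @ P0)            # feedback gain: u = -K0 x at t = 0
print(K0)
```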

Relevance:

100.00%

Publisher:

Abstract:

An extreme precipitation event occurred in the first week of the year 2000, from 1 to 5 January, in the Vale do Paraíba, in the eastern part of the State of São Paulo, Brazil, causing enormous socioeconomic impact, with deaths and destruction. This work studied the event at 10 selected meteorological stations, considered to be those with the most homogeneous data among the stations in the region. A generalized Pareto distribution (GPD) model for extreme 5-day precipitation values was developed individually for each of these stations. In the GPD modelling, a non-stationary approach was adopted, with the annual cycle and the long-term trend as covariates. One conclusion of this investigation is that the precipitation amounts accumulated during the 5 days of the studied event can be classified as extremely rare for the region, with a probability of occurrence of less than 1% for most of the stations, and less than 0.1% at three stations.
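
As a point of reference, the sketch below fits a stationary GPD to threshold excesses of synthetic 5-day totals and reads off an exceedance probability; the paper's model additionally includes seasonal and trend covariates, and every number here (the gamma data generator, the threshold, the event size) is an assumption of ours.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
rain5d = rng.gamma(shape=2.0, scale=20.0, size=5000)   # synthetic 5-day totals, mm
u = np.quantile(rain5d, 0.95)                          # peaks-over-threshold cutoff
excess = rain5d[rain5d > u] - u

c, loc, scale = genpareto.fit(excess, floc=0.0)        # GPD fit with location fixed at 0
event = 300.0                                          # a hypothetical extreme 5-day total
p_exceed_given_u = genpareto.sf(event - u, c, loc=0.0, scale=scale)
p_event = 0.05 * p_exceed_given_u                      # unconditional exceedance probability
print(u, c, scale, p_event)
```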

Relevance:

100.00%

Publisher:

Abstract:

Electroretinography (ERG) is an objective, non-invasive diagnostic tool for evaluating retinal function and, in several species, for early detection of lesions in the outer retinal layers. The most common indications for ERG in dogs are pre-surgical evaluation of cataract patients and characterization of blinding disorders; the dog also serves as an important model for the study of the retinal dystrophies that affect humans. Several factors can alter the ERG, such as the electroretinography unit, the light stimulation source, the electrode type, the dark adaptation time, pupil size, opacity of the ocular media and the sedation or anesthesia protocol, in addition to species, breed and age. The aim of this study was to standardize the ERG for dogs under sedation, following the protocol of the International Society for Clinical Electrophysiology of Vision (ISCEV), using a Ganzfeld stimulator and Burian-Allen electrodes. A total of 233 electroretinograms were performed in dogs, 147 females and 86 males, aged between one and 14 years. Of the 233 dogs examined, 100 presented cataracts at different stages of maturation, 72 were diabetic with mature or hypermature cataracts, 26 had electroretinograms compatible with progressive retinal degeneration, three had electroretinograms compatible with sudden acquired retinal degeneration syndrome, and 32 showed no retinal lesion capable of attenuating the ERG responses and were considered normal with respect to retinal function. Sedation produced good immobilization of the patient without rotation of the eye, allowing adequate bilateral retinal stimulation with the aid of the Ganzfeld. The Veris electrodiagnostic system successfully recorded the five responses recommended by the ISCEV simultaneously from both eyes. As full-field ERG has become a fundamental examination in the ophthalmological routine, its standardization is indispensable when comparing results from different laboratories. The reliability and reproducibility of this protocol were demonstrated by obtaining recordings of excellent quality using the standard ISCEV protocol, the Veris electroretinography system, the Ganzfeld and Burian-Allen electrodes in dogs under sedation.

Relevance:

100.00%

Publisher:

Abstract:

Interference from autofluorescence is one of the major concerns in immunofluorescence analysis of in situ hybridization-based diagnostic assays. We present a useful technique that reduces autofluorescent background without affecting tissue integrity or direct immunofluorescence signals in brain sections. Using six different protocols (ammonia/ethanol, Sudan Black B (SBB) in 70% ethanol, photobleaching with UV light, and different combinations of them) in both formalin-fixed paraffin-embedded and frozen human brain tissue sections, we found that treating the tissue with SBB at a concentration of 0.1% in 70% ethanol is the best approach to reduce or eliminate tissue autofluorescence and background while preserving the specific fluorescence hybridization signals. This strategy is a feasible and time-efficient method that provides a reasonable compromise between total reduction of tissue autofluorescence and maintenance of specific fluorescent labels.

Relevance:

100.00%

Publisher:

Abstract:

The detection of seizures in the newborn is a critical aspect of neurological research. Current automatic detection techniques are difficult to assess because of the problems associated with acquiring and labelling newborn electroencephalogram (EEG) data. A realistic model for newborn EEG would allow confident development, assessment and comparison of these detection techniques. This paper presents a model for newborn EEG that accounts for its self-similar and non-stationary nature. The model consists of background and seizure sub-models. The newborn EEG background model is based on the short-time power spectrum with a time-varying power law; the relationship between the fractal dimension and the power law of a power spectrum is utilized for accurate estimation of the short-time power-law exponent. The newborn EEG seizure model is based on a well-known time-frequency signal model and addresses all significant time-frequency characteristics of newborn EEG seizure, which include multiple components or harmonics, piecewise linear instantaneous frequency laws, and harmonic amplitude modulation. Estimates of the parameters of both models are shown to be random and are modelled using data from a total of 500 background epochs and 204 seizure epochs. The newborn EEG background and seizure models are validated against real newborn EEG data using the correlation coefficient. The results show that the output of the proposed models has a higher correlation with real newborn EEG than currently accepted models (a 10% and 38% improvement for the background and seizure models, respectively).
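
The background sub-model rests on a power-law spectrum. Purely as an illustration, the sketch below (our own toy, with a fixed rather than time-varying exponent) synthesizes noise whose power spectral density follows 1/f^lambda by spectral shaping of white noise.

```python
import numpy as np

def power_law_noise(n, lam, fs, rng):
    # shape the rFFT of white noise so the PSD scales as f^(-lam)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
    scale = np.ones_like(freqs)
    scale[1:] = freqs[1:] ** (-lam / 2.0)    # amplitude ~ f^(-lam/2) gives PSD ~ f^(-lam)
    spectrum *= scale
    spectrum[0] = 0.0                        # remove the DC term (zero-mean signal)
    return np.fft.irfft(spectrum, n)

rng = np.random.default_rng(4)
x = power_law_noise(4096, lam=1.5, fs=256.0, rng=rng)  # EEG-like sampling rate assumed
print(x.std())
```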

Relevance:

100.00%

Publisher:

Abstract:

This article deals with the efficiency of fractional integration parameter estimators. The study was based on Monte Carlo experiments involving simulated stochastic processes with integration orders in the range ]-1,1[. The evaluated estimation methods were classified into two groups: heuristic and semiparametric/maximum likelihood (ML). The study revealed that the comparative efficiency of the estimators, measured by the smaller mean squared error, depends on the stationary/non-stationary and persistent/anti-persistent conditions of the series. The ML estimator was shown to be superior for stationary persistent processes; the wavelet spectrum-based estimators were better for non-stationary mean-reverting and invertible anti-persistent processes; and the weighted periodogram-based estimator was superior for non-invertible anti-persistent processes.
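
As a sketch of this kind of Monte Carlo comparison, here is our own minimal version for one standard semiparametric choice, the GPH log-periodogram estimator (not necessarily among the article's exact set), applied to ARFIMA(0,d,0) series simulated via a truncated MA expansion.

```python
import numpy as np

def arfima0d0(n, d, rng, burn=2000):
    # MA(inf) weights psi_k = Gamma(k + d) / (Gamma(d) Gamma(k + 1)), built recursively
    k = np.arange(1, n + burn)
    psi = np.concatenate(([1.0], np.cumprod((k - 1 + d) / k)))
    eps = rng.normal(size=n + burn)
    x = np.convolve(eps, psi)[:n + burn]
    return x[burn:]                          # drop the burn-in

def gph(x):
    n = len(x)
    m = int(np.sqrt(n))                      # standard bandwidth choice
    I = np.abs(np.fft.rfft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)   # periodogram
    w = 2 * np.pi * np.arange(1, m + 1) / n
    reg = -np.log(4 * np.sin(w / 2) ** 2)    # regressor; the slope estimates d
    return np.polyfit(reg, np.log(I), 1)[0]

rng = np.random.default_rng(5)
est = [gph(arfima0d0(2048, 0.3, rng)) for _ in range(20)]
print(np.mean(est), np.std(est))             # mean should fall near d = 0.3
```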

Relevance:

100.00%

Publisher:

Abstract:

In this work, a fast, non-destructive voltammetric method for cocaine detection in acetonitrile medium, using a platinum disk electrode chemically modified with a cobalt hexacyanoferrate (CoHCFe) film, is described. The deposition of the CoHCFe film on the platinum disk (working electrode) was carried out in aqueous solution containing NaClO4 at 0.1 mol L(-1) as supporting electrolyte. Stability studies of the film and the subsequent voltammetric analysis of cocaine were performed in acetonitrile medium with NaClO4 at 0.1 mol L(-1) as supporting electrolyte. A reversible interaction between cocaine and CoHCFe in the film produces a proportional decrease of the original peak current, due to the formation of a complex between cocaine and cobalt ions with subsequent partial passivation of the film surface; the magnitude of the current decrease is used as the analytical signal for cocaine. A linear response for cocaine was obtained in the range from 2.4 x 10(-4) to 1.5 x 10(-3) mol L(-1), with a linear correlation coefficient of 0.994 and a detection limit of 1.4 x 10(-4) mol L(-1). The analysis of confiscated samples by the proposed method indicated cocaine levels from 37% to 95% (m/m), and these results were validated by comparison with an HPLC technique, with good correlation obtained between the two methods. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
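
The figures of merit quoted above follow from a standard linear calibration. Purely to illustrate that arithmetic (with synthetic numbers of ours, not the paper's data), a minimal sketch:

```python
import numpy as np

conc = np.linspace(2.4e-4, 1.5e-3, 6)             # mol L^-1, spanning the linear range
rng = np.random.default_rng(7)
signal = 5.0e3 * conc + rng.normal(0.0, 0.02, 6)  # hypothetical current-decrease response

slope, intercept = np.polyfit(conc, signal, 1)     # least-squares calibration line
r = np.corrcoef(conc, signal)[0, 1]                # linear correlation coefficient
s_blank = 0.02                                     # assumed blank standard deviation
lod = 3.0 * s_blank / slope                        # 3-sigma detection limit, mol L^-1
print(slope, r, lod)
```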

Relevance:

100.00%

Publisher:

Abstract:

Image reconstruction using the EIT (Electrical Impedance Tomography) technique is a nonlinear and ill-posed inverse problem which demands a powerful direct or iterative method. A typical approach is to minimize an error functional using an iterative method; in this case, an initial solution close enough to the global minimum is mandatory to ensure convergence to the correct minimum in an appropriate time interval. The aim of this paper is to present a new, simple and low-cost technique (quadrant-searching) to reduce the search space and consequently obtain an initial solution for the inverse problem of EIT. This technique calculates the error functional for four different contrast distributions, placing a large prospective inclusion in each of the four quadrants of the domain. By comparing the four values of the error functional, it is possible to draw conclusions about the internal electric contrast. For this purpose, we initially performed tests to assess the accuracy of the BEM (Boundary Element Method) when applied to the direct problem of EIT and to examine the behavior of the error functional surface in the search space. Finally, numerical tests were performed to verify the new technique.
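
A minimal sketch of the quadrant-searching comparison (our own toy: the random linear forward_map stands in for the paper's BEM forward solver, and every name and number here is a hypothetical assumption).

```python
import numpy as np

rng = np.random.default_rng(6)
n = 16                                        # domain discretized as an n x n pixel grid
forward_map = rng.normal(size=(32, n * n))    # mock linear "contrast -> boundary data" map

def inclusion(quadrant):
    # unit background contrast with a large prospective inclusion in one quadrant
    img = np.ones((n, n))
    half = n // 2
    rows = slice(0, half) if quadrant in (0, 1) else slice(half, n)
    cols = slice(0, half) if quadrant in (0, 2) else slice(half, n)
    img[rows, cols] = 2.0
    return img.ravel()

true_img = inclusion(3)                       # "unknown" target sits in quadrant 3
measured = forward_map @ true_img             # synthetic boundary measurements

# evaluate the error functional once per candidate quadrant and keep the best
errors = [np.linalg.norm(forward_map @ inclusion(q) - measured) for q in range(4)]
print(errors, "-> initial guess: quadrant", int(np.argmin(errors)))
```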

Relevance:

100.00%

Publisher:

Abstract:

Objective: This study compared the prevalence and correlates of functional impairment and the service utilization for eating disorders among Latinos, Asians, and African Americans living in the United States with those of non-Latino Whites. Method: Pooled data from the NIMH Collaborative Psychiatric Epidemiological Studies (CPES; NIMH, 2007) were used. Results: The prevalence of anorexia nervosa (AN) and binge-eating disorder (BED) was similar across all groups examined, but bulimia nervosa (BN) was more prevalent among Latinos and African Americans than among non-Latino Whites. Despite the similar prevalence of BED among the ethnic groups examined, the lifetime prevalence of any binge eating (ABE) was greater in each of the ethnic minority groups than among non-Latino Whites. The lifetime prevalence of mental health service utilization was lower among the ethnic minority groups studied than among non-Latino Whites for respondents with a lifetime history of any eating disorder. Discussion: These findings suggest the need for clinician training and health policy interventions to achieve optimal and equitable care for eating disorders across all ethnic groups in the United States. (C) 2010 by Wiley Periodicals, Inc.