12 results for Linear analysis

in Aston University Research Archive


Relevance:

60.00%

Publisher:

Abstract:

Open-loop operation of the stepping motor exploits the inherent advantages of the machine. For near-optimum operation in this mode, however, an accurate system model is required to facilitate controller design. Such a model must be comprehensive and take account of the non-linearities inherent in the system. The result is a complex formulation which can be made manageable with a computational aid. A digital simulation of a hybrid-type stepping motor and its associated drive circuit is proposed. The simulation is based upon a block diagram model which includes reasonable approximations to the major non-linearities. The simulation is shown to yield accurate performance predictions. The determination of the transfer functions is based upon the consideration of the physical processes involved rather than upon direct input-output measurements. The effects of eddy currents, saturation, hysteresis, drive circuit characteristics and non-linear torque-displacement characteristics are considered, and methods of determining transfer functions which take account of these effects are offered. The static torque-displacement characteristic is considered in detail and a model is proposed which predicts static torque for any combination of phase currents and shaft position. Methods of predicting the characteristic directly from machine geometry are investigated. Drive circuit design for high-efficiency operation is considered and a model of a bipolar, bilevel circuit is proposed. The transfers between stator voltage and stator current and between stator current and air gap flux are complicated by the effects of eddy currents, saturation and hysteresis. Frequency response methods, combined with average inductance measurements, are shown to yield reasonable transfer functions. The modelling procedure and subsequent digital simulation is concluded to be a powerful method of non-linear analysis.
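The kind of block-diagram simulation the abstract describes could be sketched, under strong simplifying assumptions (no eddy currents, saturation or hysteresis; a single phase; an idealised sinusoidal torque-displacement characteristic; all parameter values hypothetical, not taken from the thesis), as:

```python
import numpy as np

def simulate_step(J=1e-5, B=1e-3, K=0.5, N=50, i_phase=1.0,
                  dt=1e-5, t_end=0.1):
    """Semi-implicit Euler integration of J*theta'' + B*theta' = -K*i*sin(N*theta).

    A crude single-phase stand-in for the full block-diagram model;
    a real hybrid stepping motor needs the complete non-linear torque
    surface and drive-circuit dynamics described in the abstract.
    """
    theta, omega = 0.02, 0.0   # rotor starts slightly off the detent position (rad)
    for _ in range(int(t_end / dt)):
        torque = -K * i_phase * np.sin(N * theta)   # idealised static torque
        omega += dt * (torque - B * omega) / J      # update velocity first
        theta += dt * omega                         # then position (symplectic)
    return theta

print(simulate_step())   # rotor should settle near the detent at theta = 0
```

With the assumed damping the rotor rings down onto the equilibrium position, which is the behaviour an open-loop controller must predict.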

Relevance:

60.00%

Publisher:

Abstract:

The stability of internally heated convective flows in a vertical channel under the influence of a pressure gradient and in the limit of small Prandtl number is examined numerically. In each of the cases studied the basic flow, which can have two inflection points, loses stability in a Hopf bifurcation to two-dimensional states at the critical point identified by the corresponding linear analysis. These marginal points determine the linear stability curve that identifies the minimum Grashof number (based on the strength of the homogeneous heat source) at which the two-dimensional periodic flow can bifurcate. The range of stability of the finite-amplitude secondary flow is determined by its (linear) stability against three-dimensional infinitesimal disturbances. By first examining the behavior of the eigenvalues as functions of the Floquet parameters in the streamwise and spanwise directions, we show that the secondary flow also loses stability in a Hopf bifurcation as the Grashof number increases, indicating that the tertiary flow is quasi-periodic. Secondly, the Eckhaus marginal stability curve, which bounds the domain of stable transverse vortices towards smaller and larger wavenumbers but does not cause a transition as the Grashof number increases, is also given for the cases studied in this work.
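In the standard notation of such analyses (the symbols below are generic, not taken from the paper), the linear analysis assumes normal-mode perturbations of the basic flow, and a Hopf bifurcation corresponds to a marginal mode whose growth rate vanishes while its frequency does not:

```latex
u'(x,y,z,t) = \hat{u}(x)\, e^{\sigma t + i(\alpha y + \beta z)}, \qquad
\operatorname{Re}\,\sigma(\mathrm{Gr}_c,\alpha,\beta) = 0, \quad
\operatorname{Im}\,\sigma \neq 0 ,
```

where $\alpha$ and $\beta$ are the streamwise and spanwise wavenumbers (the Floquet parameters of the secondary-stability problem) and $\mathrm{Gr}_c$ is the critical Grashof number.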

Relevance:

60.00%

Publisher:

Abstract:

Technological advances have driven attempts at vital-parameter monitoring in adverse environments; these improvements will make it possible to monitor cardiac activity in automotive environments as well. In this scenario, heart rate changes associated with alcohol consumption become of great importance in assessing the driver's state over time. This paper presents the results of a first set of experiments aimed at discovering heart rate variability modifications induced by moderate consumption of an alcoholic drink (i.e. a single draft beer), as typically occurs at weekends among some people. In the study, twenty subjects were enrolled and two electrocardiographic recordings were carried out for each of them: the first before alcohol ingestion and the second 25-30 minutes afterwards. Each participant remained fasting until the second ECG acquisition was completed. ECG signals were analyzed using typical time-domain, frequency-domain and non-linear analyses. Results showed a small increase in the LF/HF ratio, which reflects a dominance of the sympathetic system over the parasympathetic system, and an increase in signal complexity, as shown by the non-linear analysis. However, the study highlighted the need to monitor HRV from alcohol ingestion until its complete metabolization to allow a more precise description of its variation. © Springer International Publishing Switzerland 2014.
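A minimal sketch of the LF/HF computation mentioned in the abstract, using only NumPy and the standard HRV frequency bands (LF 0.04-0.15 Hz, HF 0.15-0.40 Hz); the resampling rate, synthetic data and simple periodogram are assumptions, not the paper's exact pipeline:

```python
import numpy as np

def lf_hf_ratio(rr_ms, fs=4.0):
    """LF/HF power ratio from a series of RR intervals (ms).

    The unevenly spaced RR series is resampled on an even grid at fs Hz,
    detrended, and its band powers are read off an FFT periodogram.
    """
    t = np.cumsum(rr_ms) / 1000.0                    # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = np.interp(grid, t, rr_ms)              # even resampling
    rr_even -= rr_even.mean()
    psd = np.abs(np.fft.rfft(rr_even)) ** 2
    freqs = np.fft.rfftfreq(len(rr_even), d=1.0 / fs)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf / hf

# Synthetic RR series with a dominant 0.1 Hz (LF-band) oscillation:
rng = np.random.default_rng(0)
beats = 300
rr = 800 + 50 * np.sin(2 * np.pi * 0.1 * np.arange(beats) * 0.8) \
         + rng.normal(0, 5, beats)
print(lf_hf_ratio(rr))   # LF-dominated input gives a ratio well above 1
```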

Relevance:

40.00%

Publisher:

Abstract:

The spatial patterns of discrete beta-amyloid (Abeta) deposits in brain tissue from patients with Alzheimer disease (AD) were studied using a statistical method based on linear regression, the results being compared with the more conventional variance/mean (V/M) method. Both methods suggested that Abeta deposits occurred in clusters (400 to <12,800 µm in diameter) in all but 1 of the 42 tissues examined. In many tissues, a regular periodicity of the Abeta deposit clusters parallel to the tissue boundary was observed. In 23 of 42 (55%) tissues, the two methods revealed essentially the same spatial patterns of Abeta deposits; in 15 of 42 (36%), the regression method indicated the presence of clusters at a scale not revealed by the V/M method; and in 4 of 42 (9%), there was no agreement between the two methods. Perceived advantages of the regression method are that there is a greater probability of detecting clustering at multiple scales, the dimension of larger Abeta clusters can be estimated more accurately, and the spacing between the clusters may be estimated. However, both methods may be useful, with the regression method providing greater resolution and the V/M method providing greater simplicity and ease of interpretation. Estimates of the distance between regularly spaced Abeta clusters were in the range 2,200-11,800 µm, depending on tissue and cluster size. The regular periodicity of Abeta deposit clusters in many tissues would be consistent with their development in relation to clusters of neurons that give rise to specific neuronal projections.
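The conventional V/M method the abstract compares against can be sketched as follows: divide the transect into contiguous quadrats, count deposits per quadrat, and compute the variance/mean ratio of the counts (V/M > 1 suggests clustering at that quadrat scale, V/M ≈ 1 randomness, V/M < 1 regularity). The synthetic deposit positions and cluster spacing below are illustrative assumptions, not the paper's data:

```python
import numpy as np

def vm_ratio(positions, extent, bin_size):
    """Variance/mean ratio of quadrat counts along a 1-D transect (µm)."""
    edges = np.arange(0, extent + bin_size, bin_size)
    counts, _ = np.histogram(positions, bins=edges)
    return counts.var() / counts.mean()

# Synthetic deposits clustered every 2,000 µm along a 20,000 µm transect:
rng = np.random.default_rng(1)
centres = np.arange(1000, 20000, 2000)
deposits = np.concatenate([c + rng.normal(0, 100, 20) for c in centres])
print(vm_ratio(deposits, 20000, 500))   # > 1 at a scale matching the clusters
```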

Relevance:

40.00%

Publisher:

Abstract:

Multiple regression analysis is a complex statistical method with many potential uses. It has also become one of the most abused of all statistical procedures, since anyone with a database and suitable software can carry it out. An investigator should always have a clear hypothesis in mind before carrying out such a procedure, together with knowledge of the limitations of each aspect of the analysis. In addition, multiple regression is probably best used in an exploratory context, identifying variables that might profitably be examined by more detailed studies. Where there are many variables potentially influencing Y, they are likely to be intercorrelated and to account for relatively small amounts of the variance. Any analysis in which R² is less than 50% should be suspect as probably not indicating the presence of significant variables. A further problem relates to sample size. It is often stated that the number of subjects or patients must be at least 5-10 times the number of variables included in the study [5]. This advice should be taken only as a rough guide, but it does indicate that the variables included should be selected with great care, as inclusion of an obviously unimportant variable may have a significant impact on the sample size required.
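The R² statistic and the 5-10x sample-size rule of thumb can be illustrated with a small least-squares sketch (synthetic data; NumPy only; the coefficient values are arbitrary):

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares with an intercept; returns coefficients and R²."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return beta, r2

rng = np.random.default_rng(2)
n, p = 100, 3                 # 100 subjects, 3 predictors: satisfies the 5-10x rule
X = rng.normal(size=(n, p))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.5, n)
beta, r2 = fit_ols(X, y)
print(f"R² = {r2:.2f}, subjects per variable = {n // p}")
```

With genuinely influential predictors and adequate n, R² comfortably clears the 50% threshold the text warns about; with many weak, intercorrelated predictors it would not.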

Relevance:

40.00%

Publisher:

Abstract:

1. The techniques associated with regression, whether linear or non-linear, are some of the most useful statistical procedures that can be applied in clinical studies in optometry.
2. In some cases, there may be no scientific model of the relationship between X and Y that can be specified in advance, and the objective may be to provide a 'curve of best fit' for predictive purposes. In such cases, the fitting of a general polynomial-type curve may be the best approach.
3. An investigator may have a specific model in mind that relates Y to X, and the data may provide a test of this hypothesis. Some of these curves can be reduced to a linear regression by transformation, e.g., the exponential and negative exponential decay curves.
4. In some circumstances, e.g., the asymptotic curve or logistic growth law, a more complex process of curve fitting involving non-linear estimation will be required.
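The transformation mentioned in point 3 can be sketched concretely: an exponential decay y = a·exp(bx) becomes a straight line after taking logarithms, log y = log a + bx, so a degree-1 polynomial fit recovers the parameters (synthetic data and parameter values are illustrative assumptions):

```python
import numpy as np

# Fit y = a * exp(b * x) by linear regression on log(y).
rng = np.random.default_rng(3)
x = np.linspace(0, 5, 50)
y = 4.0 * np.exp(-0.8 * x) * np.exp(rng.normal(0, 0.05, x.size))  # multiplicative noise

b, log_a = np.polyfit(x, np.log(y), 1)   # slope = b, intercept = log(a)
a = np.exp(log_a)
print(a, b)   # estimates should lie close to the true 4.0 and -0.8
```

Note that fitting on the log scale implicitly assumes multiplicative error; with additive error, the non-linear estimation of point 4 is the more appropriate route.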

Relevance:

40.00%

Publisher:

Abstract:

Exploratory analysis of data seeks to find common patterns to gain insights into the structure and distribution of the data. In geochemistry it is a valuable means of gaining insight into the complicated processes making up a petroleum system. Typically, linear visualisation methods like principal components analysis, linked plots, or brushing are used. These methods cannot be directly employed when dealing with missing data, and they struggle to capture global non-linear structures in the data, although they can do so locally. This thesis discusses a complementary approach based on a non-linear probabilistic model. The generative topographic mapping (GTM) enables the visualisation of the effects of very many variables on a single plot, which is able to incorporate more structure than a two-dimensional principal components plot. The model can deal with uncertainty and missing data, and allows for the exploration of the non-linear structure in the data. In this thesis a novel approach to initialising the GTM with arbitrary projections is developed. This makes it possible to combine GTM with algorithms like Isomap and to fit complex non-linear structures like the Swiss roll. Another novel extension is the incorporation of prior knowledge about the structure of the covariance matrix. This extension greatly enhances the modelling capabilities of the algorithm, resulting in a better fit to the data and better imputation capabilities for missing data. Additionally, an extensive benchmark study of the missing-data imputation capabilities of GTM is performed. Further, a novel approach based on missing data is introduced to benchmark the fit of probabilistic visualisation algorithms on unlabelled data. Finally, the work is complemented by evaluating the algorithms on real-life datasets from geochemical projects.
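The general shape of a hold-out imputation benchmark such as the abstract mentions can be sketched as follows: hide a fraction of known values, impute them, and score the recovery error. The imputer here is simple column-mean imputation standing in for the GTM, and all data are synthetic; this is an illustration of the benchmarking idea, not the thesis's method:

```python
import numpy as np

def holdout_imputation_rmse(data, frac=0.1, seed=0):
    """Benchmark an imputer by hiding known entries and scoring their recovery.

    Column-mean imputation is used as a placeholder imputer; a GTM-based
    imputer would be plugged in at the same point.
    """
    rng = np.random.default_rng(seed)
    mask = rng.random(data.shape) < frac          # entries to hide
    observed = np.where(mask, np.nan, data)
    col_means = np.nanmean(observed, axis=0)      # impute from remaining values
    imputed = np.where(mask, col_means, observed)
    return float(np.sqrt(np.mean((imputed[mask] - data[mask]) ** 2)))

rng = np.random.default_rng(4)
data = rng.normal(loc=[0.0, 5.0, -3.0], scale=1.0, size=(200, 3))
print(holdout_imputation_rmse(data))
```

For unit-variance columns, mean imputation yields an RMSE near 1; a model that captures structure in the data should score lower, which is what makes the hold-out error a useful figure of merit.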

Relevance:

40.00%

Publisher:

Abstract:

The accurate in silico identification of T-cell epitopes is a critical step in the development of peptide-based vaccines, reagents, and diagnostics, and has a direct impact on the success of subsequent experimental work. Epitopes arise as a consequence of complex proteolytic processing within the cell. Prior to being recognized by T cells, an epitope is presented on the cell surface as a complex with a major histocompatibility complex (MHC) protein. A prerequisite for T-cell recognition is therefore that an epitope is also a good MHC binder. Thus, T-cell epitope prediction overlaps strongly with the prediction of MHC binding. In the present study, we compare discriminant analysis and multiple linear regression as algorithmic engines for the definition of quantitative matrices for binding affinity prediction. We apply these methods to peptides which bind the well-studied human MHC allele HLA-A*0201. A matrix combining the results of the two methods proved highly predictive under cross-validation. The new matrix was also tested on an external set of 160 binders to HLA-A*0201; it was able to recognize 135 (84%) of them.
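A quantitative matrix of this kind is applied additively: each residue contributes a position-specific score, and the peptide's predicted binding score is their sum. The sketch below illustrates only this scoring mechanism; the matrix values are random placeholders, not the paper's HLA-A*0201 matrix:

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"   # the 20 standard amino acids

def score_peptide(peptide, matrix):
    """Additive score of a 9-mer against a (position x amino-acid) matrix."""
    idx = {aa: i for i, aa in enumerate(AMINO_ACIDS)}
    return float(sum(matrix[pos, idx[aa]] for pos, aa in enumerate(peptide)))

rng = np.random.default_rng(5)
matrix = rng.normal(size=(9, 20))       # placeholder: 9 positions x 20 amino acids
print(score_peptide("ILKEPVHGV", matrix))
```

In practice, such a score is compared against a threshold (or an affinity scale) to classify a peptide as a likely binder or non-binder.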