954 results for RBF NLGA neural networks quadrotor identification Matlab simulators automatic control
Abstract:
We present a study whose aim is to analyse the understanding of the tangent line in a learning environment in which a CAS can be used. From historical and cognitive (APOS) perspectives, we analyse a series of Bachillerato (upper-secondary) and Engineering textbooks, which allows us to put forward a proposal for understanding the tangent line as the limit of a sequence of secant lines that share the point of tangency. Finally, we present a set of tools designed with the mathematical assistant MATLAB© (instrumental genesis), accessible online, which can help students, especially in the graphical register, to construct the cognitive objects described in the genetic decomposition.
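The secant-to-tangent construction that such tools visualise can be stated numerically in a few lines; the sketch below (Python rather than MATLAB, purely for illustration, with an invented example function) shows secant slopes through a fixed point converging to the tangent slope.

```python
# Secant lines through (a, f(a)) approach the tangent line as h -> 0.
f = lambda x: x ** 2          # illustrative function, f'(1) = 2
a = 1.0

# Slope of the secant through (a, f(a)) and (a + h, f(a + h))
secant_slopes = [(f(a + h) - f(a)) / h for h in (1.0, 0.1, 0.01, 0.001)]
# The slopes tend to the tangent slope f'(a) = 2 as h shrinks.
```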
Abstract:
In a completed research project, desktop software for teaching and learning the topic of Numerical Solution of Nonlinear Equations was designed using the MatLab package.
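As a flavour of that topic, Newton's method is the canonical routine such software demonstrates; a minimal sketch (in Python here, though the project's software was MatLab-based, and with an invented example equation):

```python
# Newton's method for solving f(x) = 0, a staple of numerical-equations courses.
def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Iterate x <- x - f(x)/f'(x) until the step is below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: the positive root of x^2 - 2 = 0, i.e. sqrt(2)
root = newton(lambda x: x ** 2 - 2, lambda x: 2 * x, x0=1.0)
```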
Abstract:
Artificial neural network (ANN) models for water loss (WL) and solid gain (SG) were evaluated as a potential alternative to multiple linear regression (MLR) for osmotic dehydration of apple, banana and potato. A radial basis function (RBF) network with a Gaussian function was used in this study, trained with the orthogonal least squares learning method. When predictions of the experimental data from MLR and ANN were compared, the ANN models agreed with the data more closely than the MLR models, particularly for SG. The coefficient of determination (R2) for SG was 0.31 for the MLR models and 0.91 for the ANN, while the R2 for WL was 0.89 for MLR and 0.84 for ANN. The osmotic dehydration experiments found that the amounts of WL and SG occurred in the following descending order: Golden Delicious apple > Cox apple > potato > banana. The effects of temperature and of osmotic solution concentration on WL and SG of the plant materials followed the descending orders 55 > 40 > 32.2 °C and 70 > 60 > 50 > 40%, respectively.
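A Gaussian RBF network with linear output weights, as described above, can be sketched in a few lines; the toy data and parameters below are invented, and random centre selection stands in for the paper's orthogonal least squares method.

```python
import numpy as np

def rbf_design(X, centres, width):
    """Gaussian RBF design matrix: phi[i, j] = exp(-||x_i - c_j||^2 / (2 w^2))."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# Toy stand-in for (temperature, concentration) -> water-loss data
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.standard_normal(200)

# Randomly chosen centres (the paper selects them by orthogonal least squares)
centres = X[rng.choice(len(X), 20, replace=False)]
Phi = rbf_design(X, centres, width=0.3)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # linear output-layer weights

r2 = 1 - ((Phi @ w - y) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```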
Abstract:
The Continuous Plankton Recorder (CPR) survey provides a unique multi-decadal dataset on the abundance of plankton in the North Sea and North Atlantic and is one of only a few monitoring programmes operating at a large spatio-temporal scale. The results of all samples analysed from the survey since 1946 are stored on an Access Database at the Sir Alister Hardy Foundation for Ocean Science (SAHFOS) in Plymouth. The database is large, containing more than two million records (~80 million data points, if zero results are added) for more than 450 taxonomic entities. An open data policy is operated by SAHFOS. However, the data are not on-line and so access by scientists and others wishing to use the results is not interactive. Requests for data are dealt with by the Database Manager. To facilitate access to the data from the North Sea, which is an area of high research interest, a selected set of data for key phytoplankton and zooplankton species has been processed in a form that makes them readily available on CD for research and other applications. A set of MATLAB tools has been developed to provide an interpolated spatio-temporal description of plankton sampled by the CPR in the North Sea, as well as easy and fast access to users in the form of a browser. Using geostatistical techniques, plankton abundance values have been interpolated on a regular grid covering the North Sea. The grid is established on centres of 1 degree longitude x 0.5 degree latitude (~32 x 30 nautical miles). Based on a monthly temporal resolution over a fifty-year period (1948-1997), 600 distribution maps have been produced for 54 zooplankton species, and 480 distribution maps for 57 phytoplankton species over the shorter period 1958-1997.
The gridded database has been developed in a user-friendly form and incorporates, as a package on a CD, a set of options for visualisation and interpretation, including the facility to plot maps for selected species by month, year, groups of months or years, long-term means or as time series and contour plots. This study constitutes the first application of an easily accessed and interactive gridded database of plankton abundance in the North Sea. As a further development the MATLAB browser is being converted to a user-friendly Windows-compatible format (WinCPR) for release on CD and via the Web in 2003.
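The gridding step can be illustrated with ordinary spatial interpolation; the sketch below uses SciPy's `griddata` (linear interpolation as a simple stand-in for the geostatistical methods used for the CPR data) on an invented abundance field, with the same 1° longitude x 0.5° latitude grid spacing.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)
# Hypothetical CPR-style samples: scattered (lon, lat, abundance) triples
lon = rng.uniform(-4, 9, 500)      # roughly North Sea longitudes
lat = rng.uniform(51, 61, 500)     # roughly North Sea latitudes
z = np.exp(-((lon - 3) ** 2 + (lat - 56) ** 2) / 20)  # smooth toy field

# Regular grid at 1 deg longitude x 0.5 deg latitude, as in the CPR gridding
glon, glat = np.meshgrid(np.arange(-4, 9.01, 1.0), np.arange(51, 61.01, 0.5))
grid = griddata((lon, lat), z, (glon, glat), method="linear")
# Cells outside the convex hull of the samples come back as NaN.
```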
Abstract:
This paper describes the application of regularisation to the training of feedforward neural networks, as a means of improving the quality of solutions obtained. The basic principles of regularisation theory are outlined for both linear and nonlinear training and then extended to cover a new hybrid training algorithm for feedforward neural networks recently proposed by the authors. The concept of functional regularisation is also introduced and discussed in relation to MLP and RBF networks. The tendency for the hybrid training algorithm and many linear optimisation strategies to generate large magnitude weight solutions when applied to ill-conditioned neural paradigms is illustrated graphically and reasoned analytically. While such weight solutions do not generally result in poor fits, it is argued that they could be subject to numerical instability and are therefore undesirable. Using an illustrative example it is shown that, as well as being beneficial from a generalisation perspective, regularisation also provides a means for controlling the magnitude of solutions. (C) 2001 Elsevier Science B.V. All rights reserved.
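The point that regularisation bounds weight magnitudes can be seen already in the linear case; the sketch below fits an ill-conditioned design matrix with and without a ridge (weight-decay) penalty. It is a toy stand-in for the hybrid training algorithm discussed above, with invented data and parameters.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
t = np.linspace(0, 1, n)
# Two nearly collinear columns make the design matrix ill-conditioned
Phi = np.column_stack([t, t + 1e-4 * rng.standard_normal(n), t ** 2])
y = t + 0.01 * rng.standard_normal(n)

def ridge(Phi, y, lam):
    """Regularised least squares: (Phi'Phi + lam I) w = Phi'y."""
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

w_unreg = ridge(Phi, y, 0.0)    # unregularised: large-magnitude weights
w_reg = ridge(Phi, y, 1e-3)     # regularised: small weights, similar fit
```

Each component of the ridge solution is shrunk relative to the unregularised one along the eigendirections of Phi'Phi, so the weight norm drops while the fit quality barely changes.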
Abstract:
This paper describes the application of multivariate regression techniques to the Tennessee Eastman benchmark process for modelling and fault detection. Two methods are applied: linear partial least squares, and a nonlinear variant of this procedure using a radial basis function inner relation. The performance of the RBF networks is enhanced through the use of a recently developed training algorithm which uses quasi-Newton optimization to ensure an efficient and parsimonious network; details of this algorithm can be found in this paper. The PLS and PLS/RBF methods are then used to create on-line inferential models of delayed process measurements. As these measurements relate to the final product composition, these models suggest that on-line statistical quality control analysis should be possible for this plant. The generation of 'soft sensors' for these measurements has the further effect of introducing a redundant element into the system, redundancy which can then be used to generate a fault detection and isolation scheme for these sensors. This is achieved by arranging the sensors and models in a manner comparable to the dedicated estimator scheme of Clarke et al. 1975, IEEE Trans. Aero. Elect. Sys., AES-14R, 465-473. The effectiveness of this scheme is demonstrated on a series of simulated sensor and process faults, with full detection and isolation shown to be possible for sensor malfunctions, and detection feasible in the case of process faults. Suggestions for enhancing the diagnostic capacity in the latter case are covered towards the end of the paper.
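The linear PLS inner relation above can be written directly from the NIPALS recursion; the sketch below builds a single-output "soft sensor" on synthetic process data. Dimensions and variable names are illustrative, not taken from the Tennessee Eastman benchmark.

```python
import numpy as np

def pls1(X, y, n_comp):
    """NIPALS PLS regression for a single response; returns coefficients."""
    Xk, yk = X.copy(), y.copy()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)       # weight vector
        t = Xk @ w                   # score vector
        tt = t @ t
        p = Xk.T @ t / tt            # X loading
        q = (yk @ t) / tt            # y loading
        Xk -= np.outer(t, p)         # deflate X
        yk -= q * t                  # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, q)   # regression coefficients

rng = np.random.default_rng(3)
X = rng.standard_normal((300, 10))           # fast on-line measurements
b = rng.standard_normal(10)
y = X @ b + 0.1 * rng.standard_normal(300)   # delayed quality variable

beta = pls1(X, y, n_comp=3)
r2 = 1 - ((X @ beta - y) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```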
Abstract:
The simultaneous heat and moisture transfer in the building envelope has an important influence on the indoor environment and the overall performance of buildings. In this paper, a model for predicting whole building heat and moisture transfer was presented. Both heat and moisture transfer in the building envelope and indoor air were simultaneously considered; their interactions were modeled. The coupled model takes into account most of the main hygrothermal effects in buildings. The coupled system model was implemented in MATLAB-Simulink, and validated by using a series of published testing tools. The new program was applied to investigate the moisture transfer effect on indoor air humidity and building energy consumption under different climates. The results show that the use of more detailed simulation routines can result in improvements to the building's design for energy optimisation through the choice of proper hygroscopic materials, which would not be indicated by simpler calculation techniques.
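A coupled heat-and-moisture balance of this kind can be illustrated with a lumped two-state model; the sketch below integrates a toy indoor temperature/humidity system with SciPy. All coefficients and boundary conditions are invented for illustration and are not taken from the validated MATLAB-Simulink model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy lumped model: T = indoor air temperature [C], w = humidity ratio [kg/kg]
T_out, w_out = 5.0, 0.003   # outdoor conditions (illustrative)
ach = 0.5 / 3600.0          # 0.5 air changes per hour, per second
k_sorp = 1e-5               # moisture-buffering rate of the envelope
w_eq = 0.007                # equilibrium humidity of the hygroscopic walls

def rhs(t, x):
    T, w = x
    dT = ach * (T_out - T) + 2e-4 * (20.0 - T)    # ventilation + heating
    dw = ach * (w_out - w) + k_sorp * (w_eq - w)  # ventilation + wall buffering
    return [dT, dw]

# 48 hours starting from a warm, humid interior
sol = solve_ivp(rhs, (0, 48 * 3600), [20.0, 0.010])
T_end, w_end = sol.y[:, -1]
# The wall term partially offsets drying by ventilation air, so the indoor
# humidity settles above the pure-ventilation steady state.
```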
Abstract:
This paper deals with an experimental investigation into the velocity distribution downstream of a propeller, operating at bollard pull conditions and in the presence of a mobile sediment bed. Previous investigations either ignored the effect of a rudder in the wash or considered only its influence on an unconfined jet. The velocity profiles within the jet produced by a rotating propeller with a rudder present were measured at a mobile bed and compared to currently available predictive equations. The velocity distribution profiles in the jet, influenced by bed proximity, were found not to comply with current predictive methods. The velocity distributions measured within the jet were found to be complex and non-symmetrical. To provide a basic velocity predictive tool, a neural network analysis toolbox within Matlab was utilised and trained using the experimental data.
Abstract:
This study emphasises the importance of landscape in Michelangelo Antonioni's Deserto Rosso, where it stands as a fundamental element of the film's plot. Starting from this premise, the meaning of that landscape is analysed in the socio-artistic context of the 1960s. Man's interaction with his environment appears to be the point of departure for a deeper reflection on human becoming. The new aesthetic achievements attained, together with the historical-artistic analysis of the film's most immediate precedents, situate Deserto Rosso as a pinnacle work of the European postmodern neo-avant-garde.
Abstract:
This paper investigates the control and operation of doubly-fed induction generator (DFIG) and fixed-speed induction generator (FSIG) based wind farms under unbalanced grid conditions. A DFIG system model suitable for analyzing unbalanced operation is developed, and used to assess the impact of an unbalanced supply on DFIG and FSIG operation. Unbalanced voltage at DFIG and FSIG terminals can cause unequal heating on the stator windings, extra mechanical stresses and output power fluctuations. These problems are particularly serious for the FSIG-based wind farm without a power electronic interface to the grid. To improve the stability of a wind energy system containing both DFIG and FSIG based wind farms during network unbalance, a control strategy of unbalanced voltage compensation by the DFIG systems is proposed. The DFIG system compensation ability and the impact of transmission network impedance are illustrated. The simulation results implemented in Matlab/Simulink show that the proposed DFIG control system improves not only its own performance, but also the stability of the FSIG system with the same grid connection point during network unbalance.
Abstract:
In this paper, a novel video-based multimodal biometric verification scheme using subspace-based low-level feature fusion of face and speech is developed for specific speaker recognition for perceptual human-computer interaction (HCI). In the proposed scheme, the human face is tracked and face pose is estimated to weight the detected face-like regions in successive frames, where ill-posed faces and false-positive detections are assigned lower credit to enhance accuracy. In the audio modality, mel-frequency cepstral coefficients are extracted for voice-based biometric verification. In the fusion step, features from both modalities are projected into a nonlinear Laplacian Eigenmap subspace for multimodal speaker recognition and combined at low level. The proposed approach is tested on a video database of ten human subjects, and the results show that the proposed scheme attains better accuracy than conventional multimodal fusion using latent semantic analysis, as well as the single-modality verifications. Experiments in MATLAB show the potential of the proposed scheme to attain real-time performance for perceptual HCI applications.
Abstract:
This paper describes the application of an improved nonlinear principal component analysis (PCA) to the detection of faults in polymer extrusion processes. Since the processes are complex in nature and nonlinear relationships exist between the recorded variables, an improved nonlinear PCA, which incorporates the radial basis function (RBF) networks and principal curves, is proposed. This algorithm comprises two stages. The first stage involves the use of the serial principal curve to obtain the nonlinear scores and approximated data. The second stage is to construct two RBF networks using a fast recursive algorithm to solve the topology problem in traditional nonlinear PCA. The benefits of this improvement are demonstrated in the practical application to a polymer extrusion process.
Abstract:
Nonlinear principal component analysis (PCA) based on neural networks has drawn significant attention as a monitoring tool for complex nonlinear processes, but there remains a difficulty with determining the optimal network topology. This paper exploits the advantages of the Fast Recursive Algorithm, where the number of nodes, the location of centres, and the weights between the hidden layer and the output layer can be identified simultaneously for the radial basis function (RBF) networks. The topology problem for the nonlinear PCA based on neural networks can thus be solved. Another problem with nonlinear PCA is that the derived nonlinear scores may not be statistically independent or follow a simple parametric distribution. This hinders its applications in process monitoring since the simplicity of applying predetermined probability distribution functions is lost. This paper proposes the use of a support vector data description and shows that transforming the nonlinear principal components into a feature space allows a simple statistical inference. Results from both simulated and industrial data confirm the efficacy of the proposed method for solving nonlinear principal component problems, compared with linear PCA and kernel PCA.
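The idea of mapping nonlinear scores into a feature space and drawing a simple statistical boundary can be sketched with off-the-shelf pieces: below, scikit-learn's KernelPCA provides nonlinear scores and a one-class SVM (closely related to SVDD when both use an RBF kernel) encloses the normal-operation region. The data and parameters are synthetic stand-ins, not the Fast Recursive Algorithm RBF-PCA of the paper.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(4)
# "Normal operation" data scattered around a quadratic manifold
t = rng.uniform(-1, 1, 300)
X_train = np.column_stack([t, t ** 2]) + 0.05 * rng.standard_normal((300, 2))

# Nonlinear scores via kernel PCA (a stand-in for RBF-network nonlinear PCA)
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=2.0).fit(X_train)
Z = kpca.transform(X_train)

# SVDD-style boundary around the scores; nu bounds the training-outlier rate
boundary = OneClassSVM(nu=0.05, gamma="scale").fit(Z)
inlier_rate = (boundary.predict(Z) == 1).mean()

# A new sample is monitored by projecting it and querying the boundary
z_new = kpca.transform(np.array([[0.5, 0.25]]))
label_new = boundary.predict(z_new)[0]    # +1 inside, -1 outside
```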
Abstract:
To improve classification performance with Support Vector Machines (SVMs) while reducing model selection time, this paper introduces Differential Evolution, a heuristic optimisation method, for model selection in two-class SVMs with an RBF kernel. The model selection method and the related tuning algorithm are both presented. Experimental results on a selection of benchmark datasets for SVMs show that this method can produce an optimized classification in less time and with higher accuracy than a classical grid search. A comparison with a Particle Swarm Optimization (PSO) based alternative is also included.
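Differential Evolution over the (C, gamma) plane of an RBF-kernel SVM is easy to reproduce with standard tools; the sketch below uses SciPy's DE implementation and scikit-learn's SVC on a synthetic two-class problem. The dataset, search ranges and DE settings are illustrative, not the paper's benchmarks.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

def neg_cv_accuracy(params):
    """Objective for DE: negative 5-fold CV accuracy of an RBF SVM."""
    log_C, log_gamma = params
    clf = SVC(C=10.0 ** log_C, gamma=10.0 ** log_gamma, kernel="rbf")
    return -cross_val_score(clf, X, y, cv=5).mean()

# Search log10(C) in [-2, 3] and log10(gamma) in [-3, 2]
result = differential_evolution(neg_cv_accuracy, [(-2, 3), (-3, 2)],
                                seed=1, maxiter=10)
best_acc = -result.fun
```

Searching in log space keeps the DE mutation steps meaningful across the several orders of magnitude that C and gamma typically span, which is also why grid searches for SVMs use logarithmic grids.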
Abstract:
The relationship between changes in retinal vessel morphology and the onset and progression of diseases such as diabetes, hypertension and retinopathy of prematurity (ROP) has been the subject of several large scale clinical studies. However, the difficulty of quantifying changes in retinal vessels in a sufficiently fast, accurate and repeatable manner has restricted the application of the insights gleaned from these studies to clinical practice. This paper presents a novel algorithm for the efficient detection and measurement of retinal vessels, which is general enough that it can be applied to both low and high resolution fundus photographs and fluorescein angiograms upon the adjustment of only a few intuitive parameters. Firstly, we describe the simple vessel segmentation strategy, formulated in the language of wavelets, that is used for fast vessel detection. When validated using a publicly available database of retinal images, this segmentation achieves a true positive rate of 70.27%, false positive rate of 2.83%, and accuracy score of 0.9371. Vessel edges are then more precisely localised using image profiles computed perpendicularly across a spline fit of each detected vessel centreline, so that both local and global changes in vessel diameter can be readily quantified. Using a second image database, we show that the diameters output by our algorithm display good agreement with the manual measurements made by three independent observers. We conclude that the improved speed and generality offered by our algorithm are achieved without sacrificing accuracy. The algorithm is implemented in MATLAB along with a graphical user interface, and we have made the source code freely available.
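A wavelet-flavoured vessel enhancement can be sketched with a Laplacian-of-Gaussian (Mexican-hat) filter; the code below detects a synthetic dark vessel on a bright background. It is a greatly simplified stand-in for the paper's wavelet segmentation, with an invented image and parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Synthetic "retina": a dark vessel (low intensity) on a bright background
img = np.ones((64, 64))
img[30:33, :] = 0.2                 # horizontal vessel, 3 pixels wide

# LoG response (a Mexican-hat wavelet): positive at the centre of dark lines
response = gaussian_laplace(img, sigma=1.5)

# Keep the strongest 5% of responses as the vessel mask
mask = response > np.percentile(response, 95)
```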