14 results for Eigensystem realization algorithms

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

20.00%

Publisher:

Abstract:

Identification of the order of an Autoregressive Moving Average (ARMA) model by the usual graphical method is subjective. Hence, there is a need to develop a technique that identifies the order without graphical investigation of the series autocorrelations. To avoid subjectivity, this thesis focuses on determining the order of the Autoregressive Moving Average model using Reversible Jump Markov Chain Monte Carlo (RJMCMC). RJMCMC selects the model from a set of candidate models suggested by better fit, standard deviation errors and the frequency of accepted data. Together with a deep analysis of the classical Box-Jenkins modelling methodology, its integration with MCMC algorithms is studied through parameter estimation and model fitting of ARMA models. This helps to verify how well the MCMC algorithms can treat ARMA models, by comparing the results with the graphical method. The MCMC approach was seen to produce better results than the classical time series approach.
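The RJMCMC sampler itself is too involved for a short sketch, but the non-graphical flavour of order selection it replaces can be illustrated with a simpler toy: fit pure AR(p) models for a range of candidate orders via Levinson-Durbin and pick the order minimizing AIC. This is a hedged illustration of automatic order selection, not the thesis's RJMCMC method; all function names are invented for this sketch.

```python
import math

def autocov(x, max_lag):
    """Biased sample autocovariances r[0..max_lag] of a series x."""
    n = len(x)
    m = sum(x) / n
    return [sum((x[t] - m) * (x[t + k] - m) for t in range(n - k)) / n
            for k in range(max_lag + 1)]

def levinson_durbin(r, p):
    """AR(p) coefficients and innovation variance from autocovariances."""
    a, e = [], r[0]
    for k in range(1, p + 1):
        acc = r[k] - sum(a[j] * r[k - 1 - j] for j in range(k - 1))
        ref = acc / e                       # reflection coefficient
        a = [a[j] - ref * a[k - 2 - j] for j in range(k - 1)] + [ref]
        e *= (1.0 - ref * ref)              # updated innovation variance
    return a, e

def select_order_aic(x, max_p):
    """Pick the AR order minimizing AIC = n*ln(variance) + 2p."""
    r = autocov(x, max_p)
    n = len(x)
    aic = {p: n * math.log(levinson_durbin(r, p)[1]) + 2 * p
           for p in range(1, max_p + 1)}
    return aic, min(aic, key=aic.get)
```

On data simulated from a strong AR(2), the AIC curve drops sharply at order 2 and flattens afterwards, which is exactly the kind of objective criterion that removes the subjectivity of reading autocorrelation plots by eye.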

Relevance:

20.00%

Publisher:

Abstract:

In modern warfare there is active development of a new trend connected with robotic warfare. One of the critical elements of robotic warfare systems is the automatic target recognition system, which allows objects to be recognized based on data received from sensors. This work considers aspects of an optical realization of such a system by means of NIR target scanning at fixed wavelengths. An algorithm was designed, an experimental setup was built and samples of various modern gear and apparel materials were tested. For pattern testing, camouflage samples of armies actively engaged in armed conflict were chosen. Tests were performed both in a clear atmosphere and in an artificial, extremely humid and hot atmosphere to simulate field conditions.
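A minimal sketch of the recognition idea, assuming reflectance is measured at a few fixed NIR wavelengths per material: match a measured spectrum to the nearest library signature. The material library, its reflectance values and the wavelength count below are invented for illustration; the thesis's actual algorithm and wavelengths are not reproduced here.

```python
import math

# Hypothetical reflectance signatures at three fixed NIR wavelengths.
LIBRARY = {
    "cotton":    (0.62, 0.58, 0.55),
    "polyester": (0.40, 0.35, 0.33),
    "foliage":   (0.55, 0.30, 0.10),
}

def classify(measurement, library=LIBRARY):
    """Nearest-neighbour match of a measured spectrum to the library."""
    return min(library, key=lambda name: math.dist(measurement, library[name]))
```

Even this crude scheme shows why humidity and temperature matter: water absorption shifts NIR reflectance, so signatures recorded in a clear atmosphere may drift toward a different library entry under field conditions.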

Relevance:

20.00%

Publisher:

Relevance:

20.00%

Publisher:

Abstract:

Parameter estimation still remains a challenge in many important applications. There is a need to develop methods that utilize the growing capabilities of modern computational systems. Owing to this, various kinds of Evolutionary Algorithms are becoming an especially promising field of research. The main aim of this thesis is to explore theoretical aspects of a specific class of Evolutionary Algorithms, the Differential Evolution (DE) method, and to implement this algorithm as code capable of solving a large range of problems. Matlab, a numerical computing environment provided by MathWorks Inc., has been utilized for this purpose. Our implementation empirically demonstrates the benefits of stochastic optimizers over deterministic optimizers in the case of stochastic and chaotic problems. Furthermore, the advanced features of Differential Evolution are discussed and taken into account in the Matlab realization. Test "toy case" examples are presented in order to show the advantages and disadvantages caused by additional aspects involved in extensions of the basic algorithm. Another aim of this work is to apply the DE approach to the parameter estimation problem of a system exhibiting chaotic behavior; the well-known Lorenz system with a specific set of parameter values is taken as an example. Finally, the DE approach for estimating chaotic dynamics is compared to the Ensemble Prediction and Parameter Estimation System (EPPES) approach, which was recently proposed as a possible solution for similar problems.
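The thesis's implementation is in Matlab and includes advanced features not shown here, but the basic algorithm can be sketched compactly. The classic DE/rand/1/bin variant mutates each member with a scaled difference of two other members, applies binomial crossover, and keeps the trial vector only if it is at least as good; the function name and parameter defaults below are ours.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.7, CR=0.9, gens=300):
    """Minimal DE/rand/1/bin: difference-vector mutation, binomial
    crossover, greedy one-to-one selection. Minimizes f over bounds."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Three distinct members, none equal to the target i.
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                      for k in range(dim)]
            jrand = random.randrange(dim)   # ensure at least one mutant gene
            trial = [mutant[k] if (random.random() < CR or k == jrand)
                     else pop[i][k] for k in range(dim)]
            ft = f(trial)
            if ft <= fit[i]:                # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]
```

For chaotic parameter estimation as in the Lorenz example, `f` would measure the misfit between simulated and observed trajectories; here a simple sphere function suffices to see the population contract onto the optimum.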

Relevance:

20.00%

Publisher:

Abstract:

Global illumination algorithms are at the center of realistic image synthesis and account for non-trivial light transport and occlusion within scenes, such as indirect illumination, ambient occlusion, and environment lighting. Their computationally most difficult part is determining light source visibility at each visible scene point. Height fields, on the other hand, constitute an important special case of geometry and are mainly used to describe certain types of objects such as terrains and to map detailed geometry onto object surfaces. The geometry of an entire scene can also be approximated by treating the distance values of its camera projection as a screen-space height field. In order to shadow height fields from environment lights, a horizon map is usually used to occlude incident light. We reduce the per-receiver time complexity of generating the horizon map on N × N height fields from the O(N) of previous work to O(1) by using an algorithm that incrementally traverses the height field and reuses the information already gathered along the path of traversal. We also propose an accurate method to integrate the incident light within the limits given by the horizon map. Indirect illumination in height fields requires information about which other points are visible to each height field point. We present an algorithm that determines this intervisibility in a time complexity that matches the space complexity of the produced visibility information, in contrast to previous methods, which scale with the height field size. As a result, the amount of computation is reduced by two orders of magnitude in common use cases. Screen-space ambient obscurance methods approximate ambient obscurance from the depth buffer geometry and have been widely adopted by contemporary real-time applications. They work by sampling the screen-space geometry around each receiver point, but have previously been limited to near-field effects because sampling a large radius quickly exceeds the render time budget. We present an algorithm that reduces the quadratic per-pixel complexity of previous methods to a linear complexity by line sweeping over the depth buffer and maintaining an internal representation of the processed geometry from which occluders can be efficiently queried. Another algorithm is presented to determine ambient obscurance from the entire depth buffer at each screen pixel. The algorithm scans the depth buffer in a quick pre-pass and locates important features in it, which are then used to evaluate the ambient obscurance integral accurately. We also propose an evaluation of the integral such that results within a few percent of the ray-traced screen-space reference are obtained at real-time render times.
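For reference, the naive per-receiver horizon computation that the incremental traversal improves upon scans every sample on one side of the receiver, i.e. O(N) work per receiver along one sweep direction. The 1-D sketch below (names and the flat-horizon clamp are ours, and it is the baseline, not the thesis's O(1) algorithm) makes the cost explicit.

```python
import math

def horizon_angles_left(heights, spacing=1.0):
    """Naive O(N)-per-receiver horizon map along one row: for each
    sample, the maximum elevation angle toward occluders on its left,
    clamped at the horizontal. The thesis's incremental traversal
    reuses information gathered along the sweep to reach amortized
    O(1) per receiver instead of rescanning all previous samples."""
    out = []
    for i, h in enumerate(heights):
        best = 0.0  # flat horizon when nothing occludes
        for j in range(i):
            best = max(best, math.atan2(heights[j] - h, (i - j) * spacing))
        out.append(best)
    return out
```

Running one such sweep per direction and per row yields the horizon map used to clip environment lighting; the incident light is then integrated only above the stored horizon angle at each receiver.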

Relevance:

20.00%

Publisher:

Abstract:

The AQUAREL project studied the availability of and potential utilization methods for fish processing side streams and other aquatic biomaterial in the Republic of Karelia. Additionally, processing aquatic biomaterial together with manure and sewage sludge was studied. Based on the results, the most feasible option today is to process fish side streams into fish oil and a dewatered oil-free residue and to use them for fish or animal feed production. However, it is necessary to highlight that changes in, for example, the economic environment, energy prices and demand may require re-evaluating the results and conclusions made in the project. Producing fish oil from fish processing side streams is a relatively simple production process generating a valuable end product. The functionality of the process was confirmed in a pilot conducted in the project. The oil and solids are separated from the heated fish waste by gravity, and the fish oil separating on top of the separator unit is removed. Fish oil can be utilized as such for heating purposes or for fish meal or animal feed production, but it can also be further processed into biodiesel. However, due to the currently moderate energy prices in Russia, biodiesel production is not economically profitable. Even if the fish oil production process is not complicated, the operative management of a small-scale fish oil production unit requires dedicated resources and separate facilities, especially to meet hygiene requirements. Managing the side streams is not a core business for fish farmers. Efficient and economically profitable fish oil production requires a centralized production unit with bigger processing capacity. One fish processing unit needs to be designed to manage side streams collected from several fish farms, and the optimum location for the processing unit is in the middle of the fish farms. Based on the transportation cost analysis in the Republic of Karelia, it is not economically efficient to transport bio-wastes for more than 100 km, since beyond that the transportation costs start increasing substantially. Another issue to be considered is that the collection of side streams, including dead fish, from the fish farms should be organized on a daily basis in order to eliminate the need to store the side streams at the farms. Based on the AQUAREL project studies, various public funding sources are available for supporting and enabling profitable and environmentally sustainable utilization, research or development of fish processing side streams and other aquatic biomaterial. The different funding programmes can be utilized by companies, research organizations, authorities and non-governmental organizations.

Relevance:

20.00%

Publisher:

Abstract:

Identification of low-dimensional structures and main sources of variation from multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve the solution of an optimization problem. Thus, the objective of this thesis is to develop computationally efficient and theoretically justified methods for solving such problems. Most of the thesis is based on a statistical model where ridges of the density estimated from the data are considered as relevant features. Finding ridges, which are generalized maxima, necessitates the development of advanced optimization methods. An efficient and convergent trust-region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential-equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically using Gaussian kernels. This allows the application of ridge-based methods with only mild assumptions on the underlying structure of the data. The statistical model and the ridge-finding methods are adapted to two different applications. The first one is the extraction of curvilinear structures from noisy data mixed with background clutter. The second one is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications where most of the earlier approaches are inadequate. Examples include identification of faults from seismic data and identification of filaments from cosmological data. Applicability of the nonlinear PCA to climate analysis and reconstruction of periodic patterns from noisy time series data are also demonstrated. Other contributions of the thesis include the development of an efficient semidefinite optimization method for embedding graphs into the Euclidean space. The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but also has potential applications in graph theory and various areas of physics, chemistry and engineering. The asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated when the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, which is a typical problem in visual object tracking.
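The thesis projects points onto density ridges with a trust-region Newton method; a much simpler relative of that idea is the mean-shift fixed-point iteration, which climbs to a mode (a 0-dimensional ridge) of a Gaussian kernel density estimate. The sketch below is that simpler scheme in one dimension, with invented data and bandwidth, not the thesis's projection method.

```python
import math

def mean_shift_mode(x0, data, bandwidth, iters=50):
    """Fixed-point iteration x <- Gaussian-weighted mean of the data;
    converges to a mode of the kernel density estimate near x0."""
    x = x0
    for _ in range(iters):
        w = [math.exp(-0.5 * ((x - d) / bandwidth) ** 2) for d in data]
        x = sum(wi * di for wi, di in zip(w, data)) / sum(w)
    return x
```

Starting points in different basins of attraction settle on different modes, which is the mode-finding counterpart of projecting a point onto its nearest ridge; the Newton approach in the thesis achieves the same goal with far fewer, quadratically convergent steps.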

Relevance:

20.00%

Publisher:

Abstract:

Many industrial applications need object recognition and tracking capabilities. The algorithms developed for these purposes are computationally expensive. Yet real-time performance, high accuracy and low power consumption are essential requirements for such systems. When all these requirements are combined, hardware acceleration of these algorithms becomes a feasible solution. The purpose of this study is to analyze the current state of these hardware acceleration solutions: which algorithms have been implemented in hardware and what modifications have been made in order to adapt these algorithms to hardware.

Relevance:

20.00%

Publisher:

Abstract:

Simplification of highly detailed CAD models is an important step when CAD models are visualized or otherwise utilized in augmented reality applications. Without simplification, CAD models may cause severe processing and storage issues, especially on mobile devices. In addition, simplified models may have other advantages such as better visual clarity or improved reliability when used for visual pose tracking. The geometry of CAD models is invariably represented in the form of a 3D mesh. In this paper, we survey mesh simplification algorithms in general and focus especially on algorithms that can be used to simplify CAD models. We test some commonly known algorithms with real-world CAD data and characterize some new CAD-related simplification algorithms that have not been covered in previous mesh simplification reviews.
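A minimal sketch of the edge-collapse family that most mesh simplifiers build on: repeatedly collapse the shortest edge until a target triangle count is reached. Production simplifiers rank collapses by an error metric (e.g. quadrics) rather than edge length, and handle topology far more carefully; names and the midpoint placement below are ours.

```python
import math

def edge_collapse_simplify(vertices, triangles, target_tris):
    """Greedy shortest-edge collapse on an indexed triangle mesh.
    Assumes each collapse removes at least one triangle, which holds
    for edges shared by at least one triangle, as here."""
    verts = [list(v) for v in vertices]
    tris = [list(t) for t in triangles]
    while len(tris) > target_tris:
        # Find the shortest edge among the remaining triangles.
        best = None
        for t in tris:
            for a, b in ((t[0], t[1]), (t[1], t[2]), (t[2], t[0])):
                d = math.dist(verts[a], verts[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        # Collapse b into a, placing a at the edge midpoint.
        verts[a] = [(p + q) / 2 for p, q in zip(verts[a], verts[b])]
        tris = [[a if i == b else i for i in t] for t in tris]
        # Drop triangles that became degenerate (a repeated vertex).
        tris = [t for t in tris if len(set(t)) == 3]
    return verts, tris
```

CAD meshes add wrinkles not shown here, such as preserving sharp feature edges and exactly coincident but unconnected surface patches, which is one reason CAD-specific simplification algorithms exist.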

Relevance:

20.00%

Publisher:

Abstract:

This study aims to extend prior knowledge on the learning and developmental outcomes of David Kolb's experiential learning cycle by analyzing its practical realization at Team Academy. The study is based on the constructivist approach to learning and considers, among others, the concepts of autonomy support, Nonaka and Takeuchi's knowledge creation model, Luft and Ingham's Johari Window and Deci and Ryan's Self-Determination Theory. For the investigation, in-depth interviews were carried out with participants of Team Academy, both learners and coaches. Taking the interview results and the theories described above into consideration, this study concludes that experiential learning results not only in effective learning, but also in remarkable soft-skill acquisition, self-development and an increase in motivation with an internal locus of causality. Real-life projects permit the learners to experience real challenges. Through the practical activities and teamwork they also have the opportunity to discover their personal strengths, weaknesses and unique capacities.

Relevance:

20.00%

Publisher:

Abstract:

The increasing performance of computers has made it possible to solve algorithmically problems for which manual and possibly inaccurate methods were previously used. Nevertheless, one must still pay attention to the performance of an algorithm if huge datasets are used or if the problem is computationally difficult. Two geographic problems are studied in the articles included in this thesis. In the first problem the goal is to determine distances from points, called study points, to shorelines in predefined directions. Together with other information, mainly related to wind, these distances can be used to estimate wave exposure at different areas. In the second problem the input consists of a set of sites where water quality observations have been made and of the results of the measurements at the different sites. The goal is to select a subset of the observational sites in such a manner that water quality is still measured with sufficient accuracy when monitoring at the other sites is stopped to reduce economic cost. Most of the thesis concentrates on the first problem, known as the fetch length problem. The main challenge is that the two-dimensional map is represented as a set of polygons with millions of vertices in total, and the distances may also be computed for millions of study points in several directions. Efficient algorithms are developed for the problem, one of them approximate and the others exact except for rounding errors. The solutions also differ in that three of them are targeted for serial operation or for a small number of CPU cores, whereas one, together with its further developments, is suitable also for parallel machines such as GPUs.
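The geometric core of the fetch length problem can be sketched as ray casting: shoot a ray from the study point in the predefined direction and take the distance to the nearest shoreline segment it hits. The function names below are ours, and this brute-force loop over all segments is exactly what the thesis's algorithms avoid at scale.

```python
import math

def ray_segment_distance(p, d, a, b):
    """Distance along ray p + t*d (t >= 0, d unit length) to segment
    ab, or None if the ray misses it. Solves p + t*d = a + s*(b - a)."""
    ax, ay = a; bx, by = b; px, py = p; dx, dy = d
    ex, ey = bx - ax, by - ay
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-12:
        return None  # ray parallel to the segment
    t = ((ax - px) * ey - (ay - py) * ex) / denom
    s = ((ax - px) * dy - (ay - py) * dx) / denom
    return t if t >= 0.0 and 0.0 <= s <= 1.0 else None

def fetch_length(p, d, shoreline_segments):
    """Fetch in direction d: distance to the nearest shoreline hit,
    or infinity on open water (brute force over all segments)."""
    hits = [t for a, b in shoreline_segments
            if (t := ray_segment_distance(p, d, a, b)) is not None]
    return min(hits, default=math.inf)
```

With millions of study points, directions, and shoreline vertices, this O(segments) inner loop is the bottleneck, which motivates the thesis's spatially indexed and GPU-parallel variants.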

Relevance:

20.00%

Publisher:

Abstract:

Dignity is seen as important in the health care context but is considered a controversial and complex concept. In the health care context, it is described as being influenced by, for example, autonomy, respect, communication, privacy and the hospital environment. Patient dignity is related to satisfaction with care, reduced stress, better confidence in health services, enhanced patient outcomes and shorter hospital stays. Stroke patients may struggle for dignity, as being dependent on other people has an impact on a patient's self-image. In all, stroke patients are a very specific patient group and are considered emotionally vulnerable. Therefore, study findings from other patient groups in the area of ethical problems cannot be transferred to stroke patients. This master's thesis consists of two parts. The first part is a literature review of patients' dignity in hospital care. The literature defined dignity and described factors promoting and reducing it. The results were ambiguous, and thus a clear understanding could not be formed. That was the basis for the second part of the master's thesis, the empirical study. This part aimed to develop a theoretical construction to explore the realization of stroke patients' dignity in hospital care. The data of the second part were collected by interviewing 16 stroke patients and analyzed using the constant comparison of Grounded Theory. The result was 'The Theory of Realization of Stroke Patients' Dignity in Hospital Care', which is described not only in this master's thesis but also in a scientific article. The theory consists of the core category, four generic elements and five specific types of realization. The core category emerged as 'dignity in a new situation'. After a stroke, dignity is defined in a new way, which is influenced by the generic elements: life history, health history, individuality and the stroke. Stroke patients' dignity is realized through five specific types of realization: the person-related, control-related, independence-related, social-related and care-related dignity types. The theory points out possible special characteristics of stroke patients' dignity in the control-related and independence-related dignity types. Before implementing the theory, the relation between the core category, the generic elements and the specific types of realization needs to be studied further.