10 results for Weighting

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

10.00%

Publisher:

Abstract:

The need for high performance, high precision, and energy saving in rotating machinery demands an alternative to traditional bearings. Because of their contactless operating principle, rotating machines employing active magnetic bearings (AMBs) provide many advantages over traditional ones. Advantages such as contamination-free operation, low maintenance costs, high rotational speeds, low parasitic losses, programmable stiffness and damping, and vibration isolation come at the expense of high cost and a complex technical solution. All these properties make AMBs appropriate primarily for specific and highly demanding applications. High-performance, high-precision control requires model-based control methods and accurate models of the flexible rotor. In turn, complex models lead to high-order controllers and impose a considerable computational burden. Fortunately, advances in signal-processing devices over the last few years have opened a new perspective on the real-time control of AMBs. The design and real-time digital implementation of high-order LQ controllers, with a focus on fast execution times, are the subjects of this work. In particular, controller design and implementation on field-programmable gate array (FPGA) circuits are investigated. The optimal design is guided by the physical constraints of the system when selecting the optimal weighting matrices. The plant model is augmented with appropriate disturbance models. Compensation of the force-field nonlinearities is proposed to decrease the uncertainty of the actuator. A disturbance-observer-based unbalance compensation for canceling the magnetic force vibrations, or the vibrations in the measured positions, is presented. The theoretical studies are verified by practical experiments on a custom-built laboratory test rig, which uses a prototyping control platform developed in the scope of this work. To sum up, the work takes a step toward an embedded single-chip FPGA-based controller for AMBs.
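
As a rough illustration of the weighting-matrix step described above, the sketch below sets up a standard continuous-time LQ design in Python; the plant matrices, weights, and gain are illustrative placeholders, not the thesis' rotor model or values.

```python
# Minimal LQR design sketch: the weighting matrices Q and R trade off
# state regulation against actuator effort, as in the optimal design above.
# The 2x2 plant is an illustrative placeholder, not the AMB rotor model.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [1000.0, 0.0]])   # unstable open loop, as for an AMB axis
B = np.array([[0.0],
              [1.0]])

# Physical constraints guide the weights: a large Q entry penalizes
# position error, a larger R limits coil-current (force) demand.
Q = np.diag([1e6, 1.0])
R = np.array([[1e-3]])

P = solve_continuous_are(A, B, Q, R)   # solve the algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)        # optimal state feedback u = -K x

print("LQ gain K =", K)
```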

Relevance:

10.00%

Publisher:

Abstract:

The main subject of this master's thesis was predicting the diffusion of innovations. The prediction was done for a special case: a product has been available in some countries, and based on its diffusion in those countries, its diffusion is predicted for other countries. The prediction was based on finding similar countries with a Self-Organizing Map (SOM), using country parameters that included various economic and social key figures. The SOM was optimised for different products using two methods: (a) adding product diffusion information to the country parameters, and (b) weighting the country parameters by their importance for the diffusion of different products. A novel method using Differential Evolution (DE) was developed to solve the latter, highly non-linear optimisation problem. The results were fairly good: the prediction method appears to rest on a solid theoretical foundation, and the results based on country data were good. Optimisation for different products, by contrast, did not generally offer a clear benefit, although in some cases the improvement was clearly noticeable. The parameter weights found with the developed SOM optimisation method were interesting, and most of them could be explained by properties of the products.
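
A minimal sketch of method (b), under stated assumptions: the country parameters are scaled by a weight vector, a plain nearest-neighbour match stands in for the SOM, and the weights are tuned with SciPy's Differential Evolution. The data is a random placeholder, not the thesis data.

```python
# Idea (b): weight country parameters so that countries with similar
# weighted profiles have similar diffusion, tuning the weights with DE.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))      # 30 countries x 5 parameters (illustrative)
diffusion = rng.normal(size=30)   # known diffusion figure per country

def prediction_error(w):
    Xw = X * w                    # weighted country parameters
    err = 0.0
    for i in range(len(Xw)):
        d = np.linalg.norm(Xw - Xw[i], axis=1)
        d[i] = np.inf             # exclude the country itself
        j = np.argmin(d)          # most similar country under weights w
        err += (diffusion[i] - diffusion[j]) ** 2
    return err

bounds = [(0.0, 1.0)] * X.shape[1]
result = differential_evolution(prediction_error, bounds, seed=0)
print("learned parameter weights:", result.x)
```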

Relevance:

10.00%

Publisher:

Abstract:

This thesis deals with distance transforms, which are a fundamental tool in image processing and computer vision. Two new distance transforms for gray level images are presented, and as a new application, they are applied to gray level image compression. Both new distance transforms extend the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been adapted to calculate a chessboard-like distance transform with integer values (DTOCS) and a real-valued distance transform (EDTOCS) on gray level images. Both the DTOCS and the EDTOCS require only two passes over the gray level image and are extremely simple to implement. Only two image buffers are needed: the original gray level image and the binary image that defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images, the two-pass distance algorithm has to be applied to the image more than once, typically 3-10 times. Different types of kernels can be adopted. It is important to note that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance algorithms (GRAYMAT etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map in which the weights are not constant but are the gray-value differences of the original image. The difference between the DTOCS map and other distance transforms for gray level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray level differences in a different way: it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Distance transforms are commonly used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for them. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points considered fundamental for the reconstruction of the image, are selected from the gray level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as control points, and the second group compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally, and it is shown to be independent of the number of control points, i.e. the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented, and several decompressed images are shown. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
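
A hedged reconstruction of the two-pass idea, pieced together from this abstract rather than the thesis itself: forward and backward chamfer-style scans in which the local step cost is the gray-value difference plus one (the chessboard step), iterated a few times for complicated images.

```python
# Two-pass sketch in the spirit of the DTOCS; a rough reconstruction
# from the abstract, not the thesis code.
import numpy as np

def dtocs(gray, seeds, n_iter=3):
    """gray: 2D array; seeds: bool mask of zero-distance pixels."""
    d = np.where(seeds, 0.0, np.inf)
    h, w = gray.shape
    nbrs_fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]  # causal 8-neighbours
    for _ in range(n_iter):                           # extra rounds for
        for nbrs, rows, cols in (                     # complicated images
            (nbrs_fwd, range(h), range(w)),                        # forward
            ([(-dy, -dx) for dy, dx in nbrs_fwd],
             range(h - 1, -1, -1), range(w - 1, -1, -1)),          # backward
        ):
            for y in rows:
                for x in cols:
                    for dy, dx in nbrs:
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            # local cost: gray difference + chessboard step
                            step = abs(gray[y, x] - gray[ny, nx]) + 1.0
                            if d[ny, nx] + step < d[y, x]:
                                d[y, x] = d[ny, nx] + step
    return d
```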

Relevance:

10.00%

Publisher:

Abstract:

Searching for job advertisements on the internet is very common nowadays, but the process has not developed over the years the way other services have. As a result, it is very difficult to perform an efficient search targeted at one's own skills. In this thesis, a web service is implemented that lets the user browse IT-sector job advertisements collected from several sources and find the ones that best match their skills. The system behind the service retrieves the advertisements and analyzes them to extract the required data. At the same time, statistics are compiled from the advertisements for users to explore. The collected data also reveals what kinds of connections exist between different professions and terms. The service makes it easy to run searches weighted by one's own areas of expertise. The search results are printed from best match to worst. Each advertisement is printed together with a listing of the professional terms it contains, and at the end of each search a listing of all the advertisements found in the results is also printed. Targeted searches are possible because the service collects and classifies the terms found in the advertisements. From the statistics, the user can follow weekly changes in the number of job advertisements in both the mol and Monster systems. Besides the plain numbers of advertisements, the statistics also allow following the occurrence of individual professional terms as well as the numbers of advertisements in a given professional field.
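
A minimal sketch of the skill-weighted ranking described above; the advertisements, terms, and weights are invented placeholders, not the service's actual data or code.

```python
# Score each advertisement by the user's weights for the professional
# terms it contains, then print from best to worst match.
ads = [
    {"title": "Backend Developer", "terms": {"python", "sql", "linux"}},
    {"title": "Frontend Developer", "terms": {"javascript", "css"}},
    {"title": "Data Engineer", "terms": {"python", "sql", "spark"}},
]
skill_weights = {"python": 3.0, "sql": 2.0, "javascript": 1.0}

def score(ad):
    return sum(skill_weights.get(term, 0.0) for term in ad["terms"])

for ad in sorted(ads, key=score, reverse=True):
    print(f"{score(ad):4.1f}  {ad['title']}  {sorted(ad['terms'])}")
```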

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this thesis is twofold. The first and major part is devoted to sensitivity analysis of various discrete optimization problems, while the second part addresses methods for calculating measures of solution stability and for solving multicriteria discrete optimization problems. Among the numerous approaches to stability analysis of discrete optimization problems, two major directions can be singled out: quantitative and qualitative. Qualitative sensitivity analysis is conducted for multicriteria discrete optimization problems with minisum, minimax and minimin partial criteria. The main results obtained here are necessary and sufficient conditions for different stability types of optimal solutions (or a set of optimal solutions) of the considered problems. Within the quantitative direction, various measures of solution stability are investigated. A formula for a quantitative characteristic called the stability radius is obtained for the generalized equilibrium situation invariant to changes of game parameters in the case of the Hölder metric. The quality of a problem solution can also be described in terms of robustness analysis. In this work, the concepts of accuracy and robustness tolerances are presented for a strategic game with a finite number of players in which the initial coefficients (costs) of the linear payoff functions are subject to perturbations. The investigation of the stability radius also aims to devise methods for its calculation. A new metaheuristic approach is derived for calculating the stability radius of an optimal solution to the shortest path problem; its main advantage is that it is potentially applicable to calculating stability radii of NP-hard problems. The last chapter of the thesis focuses on deriving innovative methods, based on an interactive optimization approach, for solving multicriteria combinatorial optimization problems. The key idea of the proposed approach is to utilize a parameterized achievement scalarizing function for solution calculation and to direct the interactive procedure by changing the weighting coefficients of this function. To illustrate the introduced ideas, a decision-making process is simulated for a three-objective median location problem. The concepts, models, and ideas collected and analyzed in this thesis create good and relevant grounds for developing more complicated and integrated models of post-optimal analysis and for solving the most computationally challenging problems related to it.
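
The parameterized achievement scalarizing function could look like the Wierzbicki-style sketch below; the exact formulation in the thesis may differ, and the objective vectors are illustrative only.

```python
# Augmented achievement scalarizing function (ASF): changing the
# weighting coefficients w steers the interactive procedure toward
# different Pareto-optimal solutions.
import numpy as np

def asf(f, z_ref, w, rho=1e-6):
    """f: objective vector, z_ref: reference point, w: positive weights."""
    scaled = np.asarray(w) * (np.asarray(f) - np.asarray(z_ref))
    return scaled.max() + rho * scaled.sum()   # augmented max-term

# Illustrative use with a three-objective point, e.g. a median location:
f = [4.0, 2.5, 3.0]
z_ref = [3.0, 2.0, 3.0]
print(asf(f, z_ref, w=[1.0, 1.0, 1.0]))
print(asf(f, z_ref, w=[0.2, 1.0, 2.0]))   # re-weighted search direction
```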

Relevance:

10.00%

Publisher:

Abstract:

This study examines the use of different features derived from remotely sensed data in segmentation of forest stands. Surface interpolation methods were applied to LiDAR points in order to represent the data in the form of grayscale images. Median and mean-shift filtering was applied to the data for noise reduction. The ability of different compositions of rasters obtained from LiDAR data and an aerial image to maximize stand homogeneity in the segmentation was evaluated. The quality of the forest stand delineations was assessed by the Akaike information criterion. The research was performed in co-operation with Arbonaut Ltd., Joensuu, Finland.
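
A rough sketch of the raster-building step, under stated assumptions: nearest-cell binning stands in for the surface interpolation methods, SciPy's median filter handles the noise reduction, and the point cloud is a random placeholder.

```python
# Turn LiDAR points into a grayscale raster, then smooth it.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, size=(5000, 3))   # x, y, height (illustrative)

cell = 2.0                                   # raster cell size in metres
nx = ny = int(100 / cell)
grid = np.zeros((ny, nx))
count = np.zeros((ny, nx))
ix = np.minimum((pts[:, 0] / cell).astype(int), nx - 1)
iy = np.minimum((pts[:, 1] / cell).astype(int), ny - 1)
np.add.at(grid, (iy, ix), pts[:, 2])         # sum heights per cell
np.add.at(count, (iy, ix), 1.0)
raster = np.divide(grid, count, out=np.zeros_like(grid), where=count > 0)

smoothed = median_filter(raster, size=3)     # noise reduction
```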

Relevance:

10.00%

Publisher:

Abstract:

Elevator landing doors are fire tested to measure their fire resistance. The objective of this master's thesis was to create a method for evaluating the fire tests and the organizations that provide these testing services. The main focus was on creating accurate evaluation criteria and on weighting the criteria. The thesis first presents a literature review of the most closely related theories: systematic decision making, supplier selection, and make-or-buy and outsourcing theories. The empirical section presents the created process for evaluating fire testing, together with an analysis of the current fire testing processes and evaluation methods. Evaluating fire testing services required forming two types of criteria: technical criteria to evaluate the technical requirements, and service criteria to evaluate the organization offering the testing service. These criteria form the core of the evaluation process, which consists of five phases developed on the basis of the literature review. The process was tested in order to establish best practices and to make improvement proposals accordingly. A systematic process for evaluating fire testing helps to recognize the most important technical and service-related aspects. The created criteria can also be used in the future to benchmark and monitor the situation of fire testing. The results of the process can be used when deciding whether to outsource the service or keep it in-house.
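
A minimal weighted-scoring sketch of the evaluation idea above: each criterion carries a weight reflecting its importance, and each testing organization gets a composite score. The criteria, weights, and scores are invented for illustration and are not the thesis' actual criteria.

```python
# Weighted-sum evaluation of fire testing organizations (illustrative).
criteria_weights = {
    "furnace_capability": 0.30,    # technical criteria
    "measurement_accuracy": 0.25,
    "accreditation": 0.20,         # service criteria
    "lead_time": 0.15,
    "cost": 0.10,
}

labs = {
    "Lab A": {"furnace_capability": 4, "measurement_accuracy": 5,
              "accreditation": 5, "lead_time": 3, "cost": 2},
    "Lab B": {"furnace_capability": 5, "measurement_accuracy": 3,
              "accreditation": 4, "lead_time": 4, "cost": 4},
}

for name, scores in labs.items():
    total = sum(criteria_weights[c] * s for c, s in scores.items())
    print(f"{name}: {total:.2f}")
```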

Relevance:

10.00%

Publisher:

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes, such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility to use combined longitudinal survey-register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to cause bias in estimates from event history models, with low measurement accuracy affecting the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to the design weights reduces bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, for researchers using surveys for event history analysis, and for researchers developing methods to correct for non-sampling biases in event history data.
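
A small sketch of the IPCW idea evaluated in the simulation study, assuming invented spell data: the censoring survival curve G(t) is estimated by a Kaplan-Meier on the censoring indicator, and each observed event is weighted by 1/G(t-).

```python
# Inverse Probability of Censoring Weighting (IPCW) sketch; the spell
# data is an illustrative placeholder, not the FI ECHP unemployment spells.
import numpy as np

t = np.array([2.0, 3.0, 3.0, 5.0, 6.0, 8.0, 9.0])   # spell durations
event = np.array([1, 1, 0, 1, 0, 1, 1])              # 1 = event, 0 = censored

def km_survival(times, observed):
    """Kaplan-Meier estimator evaluated just before each input time."""
    order = np.argsort(times)
    surv, s = np.empty(len(times)), 1.0
    at_risk = len(times)
    for k in order:
        surv[k] = s                      # S(t-) for this observation
        if observed[k]:
            s *= 1.0 - 1.0 / at_risk     # step down at each observed time
        at_risk -= 1
    return surv

G = km_survival(t, 1 - event)            # censoring survival G(t-)
ipcw = np.where(event == 1, 1.0 / G, 0.0)
print("IPCW weights:", np.round(ipcw, 3))
```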