896 results for Exact Algorithms
Abstract:
Vehicle routing has been studied since the 1950s, originally in the search for optimal routes for fuel deliveries from a depot to a number of service stations. Since then, vehicle routing problems have been studied academically and dozens of different variants have been formulated. Solution methods are typically divided into exact methods on the one hand and heuristics and metaheuristics on the other. With advances in computing power and in the algorithms underlying the heuristics, route optimization has begun to be offered commercially. The goal of the CO-SKY project is to commercialize vehicle routing as a web-based service or as a component integrated into an ERP system. This master's thesis examines the key characteristics affecting the commercialization of transport planning and route optimization software. These characteristics are considered 1) from the standpoint of the needs and requirements of small and medium-sized transport companies in particular, and 2) by assessing the supply of software currently on the market. On this basis, the thesis also seeks to assess how well supply and demand meet. Interviews with pilot customers made it possible to set requirements for the software and, at the same time, to hear users' opinions on optimization. Numerous logistics software vendors were interviewed at logistics trade fairs in both Finland and Germany. These interviews provided an overview of the software in question and of both the supply of and demand for optimization. Academic research on the topic is extensive, covering both technical implementation and (survey) studies of the features and quality of available software. The needs of transport companies vary by company and by sector. The basic problems are the same ones addressed in academic route optimization research, and commercial software is able to solve them. Although the benefits of route optimization are measurable, planning, especially in SMEs, is still mostly done by hand. Based on the trade fair interviews and end-user opinions, commercial solutions appear to be designed for larger transport companies: the price of a typical IT project, the deployment time and installation effort, and the payback period of the solution all affect SMEs' purchasing decisions. The challenges of commercialization relate in particular to segmentation and to marketing through verifying and communicating customer value.
Abstract:
Identification of low-dimensional structures and main sources of variation from multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve solution of an optimization problem. Thus, the objective of this thesis is to develop computationally efficient and theoretically justified methods for solving such problems. Most of the thesis is based on a statistical model in which ridges of the density estimated from the data are considered as relevant features. Finding ridges, which are generalized maxima, necessitates development of advanced optimization methods. An efficient and convergent trust region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically using Gaussian kernels, which allows application of ridge-based methods with only mild assumptions on the underlying structure of the data. The statistical model and the ridge finding methods are adapted to two different applications. The first is extraction of curvilinear structures from noisy data mixed with background clutter. The second is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications where most earlier approaches are inadequate; examples include identification of faults from seismic data and identification of filaments from cosmological data. Applicability of the nonlinear PCA to climate analysis and to reconstruction of periodic patterns from noisy time series data is also demonstrated. Other contributions of the thesis include an efficient semidefinite optimization method for embedding graphs into Euclidean space. The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but also has potential applications in graph theory and in various areas of physics, chemistry and engineering. The asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated as the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, which is a typical problem in visual object tracking.
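The ridge-projection idea can be illustrated with a small sketch. The thesis develops a trust region Newton method; the snippet below instead uses the simpler subspace-constrained mean shift on a Gaussian kernel density estimate, which pursues the same goal (moving a point onto a ridge along the Hessian's smallest-eigenvalue directions) with a different iteration. The function names, step normalization and stopping rule are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def kde_parts(x, data, h):
    # Gaussian kernel density (up to a constant factor), its gradient
    # and Hessian at the point x; data has shape (n, d).
    diffs = (data - x) / h                          # (n, d)
    w = np.exp(-0.5 * np.sum(diffs ** 2, axis=1))   # kernel weights
    f = w.sum()
    grad = diffs.T @ w / h
    hess = (diffs * w[:, None]).T @ diffs / h**2 - f * np.eye(x.size) / h**2
    return f, grad, hess

def project_to_ridge(x, data, h, ridge_dim=1, tol=1e-8, max_iter=500):
    # Subspace-constrained mean shift: move x only along the eigenvectors
    # of the Hessian with the smallest eigenvalues until the gradient
    # component in that subspace vanishes, i.e. x lies on a density ridge.
    x = np.asarray(x, float).copy()
    for _ in range(max_iter):
        f, grad, hess = kde_parts(x, data, h)
        eigvec = np.linalg.eigh(hess)[1]            # columns by ascending eigenvalue
        V = eigvec[:, : x.size - ridge_dim]         # normal space of the ridge
        step = (h ** 2 / f) * (V @ (V.T @ grad))    # projected mean-shift step
        x += step
        if np.linalg.norm(step) < tol:
            break
    return x
```

Run on points scattered around a curve, `project_to_ridge` pulls each query point onto the one-dimensional ridge of the estimated density, which is the basic operation behind both the curvilinear-structure extraction and the nonlinear PCA described above.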
Abstract:
This thesis considers optimization problems arising in printed circuit board (PCB) assembly. In particular, the case in which the electronic components of a single circuit board are placed using a single placement machine is studied. Although there is a large number of different placement machines, the focus is on collect-and-place-type gantry machines because of their flexibility and increasing popularity in the industry. Instead of solving the entire control optimization problem of a collect-and-place machine with a single application, the problem is divided into multiple subproblems because of its hard combinatorial nature; this dividing technique is called hierarchical decomposition. All the subproblems of the one-PCB, one-machine context are described, classified and reviewed. The derived subproblems are then either solved with exact methods or new heuristic algorithms are developed and applied. The exact methods include, for example, a greedy algorithm and a solution based on dynamic programming. Some of the proposed heuristics contain constructive parts while others utilize local search or are based on frequency calculations. Comprehensive experimental tests establish that the heuristics are applicable and feasible. A number of quality functions are proposed for evaluation and applied to the subproblems. In the experimental tests, artificially generated data from Markov models and data from real-world PCB production are used. The thesis consists of an introduction and five publications in which the developed and used solution methods are described in full detail. For all the problems stated in this thesis, the proposed methods are efficient enough for practical use and are readily applicable in the PCB manufacturing industry.
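To give a flavour of the constructive heuristics involved, the sketch below builds a placement sequence greedily by always visiting the nearest remaining placement position. This is a generic nearest-neighbour construction under an assumed Euclidean nozzle-travel cost; it illustrates the kind of subproblem heuristic meant here, not any specific algorithm from the thesis.

```python
import numpy as np

def greedy_placement_order(points, start=0):
    # Nearest-neighbour sequencing of component placement positions:
    # from the current position, always place next the component whose
    # board location minimizes the (assumed Euclidean) head travel.
    unvisited = set(range(len(points)))
    order = [start]
    unvisited.remove(start)
    while unvisited:
        last = points[order[-1]]
        nxt = min(unvisited, key=lambda i: np.linalg.norm(points[i] - last))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# Example: five placement positions on a board, in millimetres.
positions = np.array([[0.0, 0.0], [10.0, 2.0], [1.0, 8.0], [9.0, 9.0], [2.0, 1.0]])
print(greedy_placement_order(positions))  # e.g. [0, 4, 2, 3, 1]
```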
Abstract:
Map printed in two parts.
Abstract:
We compared the cost-benefit of two algorithms recently proposed by the Centers for Disease Control and Prevention, USA, with the conventional one, the most appropriate for the diagnosis of hepatitis C virus (HCV) infection in the Brazilian population. Serum samples were obtained from 517 ELISA-positive or -inconclusive blood donors who had returned to Fundação Pró-Sangue/Hemocentro de São Paulo to confirm previous results. Algorithm A was based on the signal-to-cut-off (s/co) ratio of anti-HCV ELISA samples, reporting as positive those whose s/co ratio shows ≥95% concordance with immunoblot (IB) positivity. For algorithm B, reflex nucleic acid amplification testing by PCR was required for ELISA-positive or -inconclusive samples, with IB for PCR-negative samples. For algorithm C, all ELISA-positive or -inconclusive samples were submitted to IB. We observed a similar rate of positive results with the three algorithms: 287, 287, and 285 for A, B, and C, respectively, of which 283 were concordant with one another. Indeterminate results from algorithms A and C were elucidated by PCR (expanded algorithm), which detected two more positive samples. The estimated cost of algorithms A and B was US$21,299.39 and US$32,397.40, respectively, making them 43.5 and 14.0% more economical than C (US$37,673.79). The cost can vary according to the technique used. We conclude that both algorithms A and B are suitable for diagnosing HCV infection in the Brazilian population. Furthermore, algorithm A is the more practical and economical one, since it requires supplemental tests for only 54% of the samples. Algorithm B provides early information about the presence of viremia.
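The reflex-testing flows of algorithms A and B can be written down directly from the description above. In the sketch below, the assay callables and result labels are hypothetical stand-ins for the laboratory tests, and indeterminate IB outcomes are simplified to a binary result.

```python
def algorithm_a(elisa, sco_ratio, ib_test, cutoff):
    # Algorithm A: samples whose s/co ratio reaches a cutoff chosen for
    # >=95% concordance with immunoblot positivity are reported positive
    # without further testing; the remainder go to immunoblot (IB).
    if elisa not in ("positive", "inconclusive"):
        return "negative"
    if sco_ratio >= cutoff:
        return "positive"
    return "positive" if ib_test() else "negative"

def algorithm_b(elisa, pcr_test, ib_test):
    # Algorithm B: reflex nucleic acid amplification (PCR) for every
    # ELISA-positive or -inconclusive sample; IB only for PCR negatives.
    if elisa not in ("positive", "inconclusive"):
        return "negative"
    if pcr_test():
        return "positive"
    return "positive" if ib_test() else "negative"
```

The cost difference reported above follows from these flows: algorithm A resolves many samples with the ELISA result alone, whereas algorithm B runs PCR on every positive or inconclusive sample.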
Abstract:
Our objective was to evaluate the accuracy of three algorithms in differentiating the origins of outflow tract ventricular arrhythmias (OTVAs). This study involved 110 consecutive patients with OTVAs for whom a standard 12-lead surface electrocardiogram (ECG) showed typical left bundle branch block morphology with an inferior axis. All ECG tracings were retrospectively analyzed using three recently published ECG algorithms: 1) the transitional zone (TZ) index, 2) the V2 transition ratio, and 3) the V2 R wave duration and R/S wave amplitude indices. Considering all patients, the V2 transition ratio had the highest sensitivity (92.3%), while the R wave duration and R/S wave amplitude indices in V2 had the highest specificity (93.9%); the latter had a maximal area under the ROC curve of 0.925. In patients with left ventricular (LV) rotation, the V2 transition ratio had the highest sensitivity (94.1%), while the R wave duration and R/S wave amplitude indices in V2 had the highest specificity (87.5%); the former had a maximal area under the ROC curve of 0.892. All three published ECG algorithms are effective in differentiating the origin of OTVAs; the V2 transition ratio and the V2 R wave duration and R/S wave amplitude indices are the most sensitive and specific algorithms, respectively. Among all patients, the V2 R wave duration and R/S wave amplitude algorithm had the maximal area under the ROC curve, but in patients with LV rotation the V2 transition ratio algorithm had the maximum area under the ROC curve.
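Of the three indices, the V2 transition ratio has a simple closed form as reported in the literature: the percentage R wave in lead V2 during the arrhythmia divided by the same quantity during sinus rhythm. The sketch below states that formula; the amplitude units and the decision cutoff are assumptions drawn from published accounts, not from this study's data.

```python
def v2_transition_ratio(r_vt, s_vt, r_sinus, s_sinus):
    # Percentage R wave in lead V2 during the arrhythmia, divided by the
    # percentage R wave in V2 during sinus rhythm (amplitudes, e.g. in mV).
    # Values >= 0.6 have been reported to favour an LVOT over an RVOT origin.
    return (r_vt / (r_vt + s_vt)) / (r_sinus / (r_sinus + s_sinus))
```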
Abstract:
Many industrial applications need object recognition and tracking capabilities. The algorithms developed for these purposes are computationally expensive. Yet real-time performance, high accuracy and low power consumption are essential measures of such systems. When all these requirements are combined, hardware acceleration of the algorithms becomes a feasible solution. The purpose of this study is to analyze the current state of these hardware acceleration solutions: which algorithms have been implemented in hardware, and what modifications have been made to adapt these algorithms to hardware.
Abstract:
Simplification of highly detailed CAD models is an important step when CAD models are visualized or otherwise utilized in augmented reality applications. Without simplification, CAD models may cause severe processing and storage issues, especially on mobile devices. In addition, simplified models may have other advantages, such as better visual clarity or improved reliability when used for visual pose tracking. The geometry of CAD models is invariably presented in the form of a 3D mesh. In this paper, we survey mesh simplification algorithms in general and focus especially on algorithms that can be used to simplify CAD models. We test some commonly known algorithms with real-world CAD data and characterize some new CAD-related simplification algorithms that have not been covered in previous mesh simplification reviews.
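For concreteness, the sketch below implements vertex clustering, one of the simplest simplification schemes covered in such surveys: every vertex is snapped to the centroid of its grid cell and faces that degenerate in the process are dropped. It illustrates the general idea only and is not any specific algorithm tested in the paper.

```python
import numpy as np

def vertex_clustering_simplify(vertices, faces, cell_size):
    # Vertex clustering: quantize vertices to a uniform grid, merge each
    # cell's vertices into their centroid, and discard faces whose three
    # corners no longer reference three distinct vertices.
    keys = np.floor(vertices / cell_size).astype(int)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    n_clusters = inverse.max() + 1
    counts = np.bincount(inverse, minlength=n_clusters)
    new_vertices = np.zeros((n_clusters, 3))
    for dim in range(3):
        new_vertices[:, dim] = np.bincount(
            inverse, weights=vertices[:, dim], minlength=n_clusters) / counts
    remapped = inverse[faces]
    keep = ((remapped[:, 0] != remapped[:, 1]) &
            (remapped[:, 1] != remapped[:, 2]) &
            (remapped[:, 0] != remapped[:, 2]))
    return new_vertices, remapped[keep]
```

Larger `cell_size` values merge more vertices and thus simplify more aggressively, which is the basic quality/size trade-off the evaluation in such a survey has to quantify.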
Abstract:
This thesis concerns the analysis of epidemic models. We adopt the Bayesian paradigm and develop suitable Markov chain Monte Carlo (MCMC) algorithms. This is done by considering an Ebola outbreak in the Democratic Republic of Congo, former Zaïre, in 1995 as a case study for SEIR epidemic models. We model the Ebola epidemic deterministically using ODEs and stochastically through SDEs to take into account a possible bias in each compartment. Since the model has unknown parameters, we use different methods to estimate them, such as least squares, maximum likelihood and MCMC. The motivation for choosing MCMC over the other methods in this thesis is its ability to tackle complicated nonlinear problems with a large number of parameters. First, in a deterministic Ebola model, we compute the likelihood function by the sum-of-squared-residuals method and estimate parameters using the LSQ and MCMC methods. We sample parameters and then use them to calculate the basic reproduction number and to study the disease-free equilibrium. From the sampled posterior chain, we run convergence diagnostics and confirm the viability of the model. The results show that the Ebola model fits the observed onset data with high precision, and all the unknown model parameters are well identified. Second, we convert the ODE model into an SDE Ebola model. We compute the likelihood function using the extended Kalman filter (EKF) and estimate the parameters again. The motivation for using the SDE formulation here is to consider the impact of modelling errors; moreover, the EKF approach allows us to formulate a filtered likelihood for the parameters of such a stochastic model. We use the MCMC procedure to attain the posterior distributions of the parameters of the drift and diffusion parts of the SDE Ebola model. In this thesis, we analyse two cases: (1) the model error covariance matrix of the dynamic noise is close to zero, i.e. only a small amount of stochasticity is added to the model; the results are then similar to those obtained from the deterministic Ebola model, even though the methods of computing the likelihood function differ; (2) the model error covariance matrix differs from zero, i.e. considerable stochasticity is introduced into the Ebola model, accounting for the situation where we know the model is not exact. As a result, we obtain parameter posteriors with larger variances; consequently, the model predictions show larger uncertainties, in accordance with the assumption of an incomplete model.
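The deterministic half of this setup fits in a short sketch: a random-walk Metropolis sampler whose Gaussian likelihood comes from the sum of squared residuals between observed onset counts and the ODE's infectious compartment. The parameter names, the flat prior, and the choice to compare the data against the I compartment are illustrative assumptions; the thesis's EKF-filtered SDE likelihood is more involved.

```python
import numpy as np
from scipy.integrate import solve_ivp

def seir_rhs(t, y, beta, sigma, gamma):
    # Standard SEIR dynamics (illustrative parametrization).
    S, E, I, R = y
    N = y.sum()
    return [-beta * S * I / N,
            beta * S * I / N - sigma * E,
            sigma * E - gamma * I,
            gamma * I]

def sum_of_squares(params, t_obs, onsets, y0):
    # Sum of squared residuals between data and the I compartment.
    beta, sigma, gamma = params
    sol = solve_ivp(seir_rhs, (t_obs[0], t_obs[-1]), y0,
                    t_eval=t_obs, args=(beta, sigma, gamma))
    return np.sum((sol.y[2] - onsets) ** 2)

def metropolis(t_obs, onsets, y0, theta0, n_iter=5000, step=0.01, s2=1.0):
    # Random-walk Metropolis targeting a Gaussian likelihood built from
    # the sum of squared residuals; prior is flat on the positive orthant.
    theta = np.asarray(theta0, float)
    ss = sum_of_squares(theta, t_obs, onsets, y0)
    chain = [theta.copy()]
    rng = np.random.default_rng(0)
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)
        if np.all(prop > 0):
            ss_prop = sum_of_squares(prop, t_obs, onsets, y0)
            if np.log(rng.random()) < -(ss_prop - ss) / (2 * s2):
                theta, ss = prop, ss_prop
        chain.append(theta.copy())
    return np.array(chain)
```

Posterior summaries such as the basic reproduction number then follow by evaluating, for example, beta/gamma over the sampled chain, and convergence can be checked with standard chain diagnostics.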
Abstract:
Map cut along the outer edge of the frame bordering the map image.