15 results for Mesh generation from image data

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

The main objective of this study was to perform a statistical analysis of ecological type from optical satellite data, using Tipping's sparse Bayesian algorithm. This thesis applies the Relevance Vector Machine (RVM) algorithm to ecological classification between forestland and wetland. This binary classification technique was then extended to several other tree species, producing a hierarchical classification of the subclasses within a given target class. We also attempted to use an airborne image of the same forest area: by applying various image processing operations, we extracted suitable features and used them to classify forestland and wetland.
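
As a rough illustration of the cascaded binary classification described above, the sketch below uses scikit-learn; since scikit-learn ships no Relevance Vector Machine, a probabilistic SVM stands in for Tipping's RVM, and the feature matrices and species labels are synthetic placeholders rather than the study's satellite data.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for per-pixel satellite features:
# class 0 = wetland, class 1 = forestland.
X = rng.normal(size=(200, 4)) + np.repeat([[0.0], [1.5]], 100, axis=0)
y = np.repeat([0, 1], 100)

# Stage 1: binary forestland-vs-wetland classifier (RVM in the thesis,
# SVM stand-in here).
stage1 = SVC(kernel="rbf", probability=True).fit(X, y)

# Stage 2, applied only to pixels classified as forestland: one tree
# species vs the rest. Repeating such binary splits on each target class
# yields the hierarchical classification of subclasses mentioned above.
forest = X[stage1.predict(X) == 1]
y_species = rng.integers(0, 2, size=len(forest))  # placeholder species labels
stage2 = SVC(kernel="rbf", probability=True).fit(forest, y_species)
```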

Relevance:

100.00%

Publisher:

Abstract:

This study examines the use of different features derived from remotely sensed data in the segmentation of forest stands. Surface interpolation methods were applied to LiDAR points in order to represent the data as grayscale images. Median and mean shift filtering were applied to the data for noise reduction. The ability of different compositions of rasters, obtained from LiDAR data and an aerial image, to maximize stand homogeneity in the segmentation was evaluated. The quality of the forest stand delineations was assessed by the Akaike information criterion. The research was performed in co-operation with Arbonaut Ltd., Joensuu, Finland.
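
A minimal sketch of the raster-generation step described above, assuming a simple gridded mean-height interpolation and scipy's median filter; the point cloud and cell size are hypothetical stand-ins for the LiDAR data.

```python
import numpy as np
from scipy.ndimage import median_filter

def lidar_to_raster(points, cell=1.0):
    """points: (n, 3) array of x, y, z; returns a mean-height grid."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)
    height = np.zeros((iy.max() + 1, ix.max() + 1))
    count = np.zeros_like(height)
    np.add.at(height, (iy, ix), z)   # accumulate point heights per cell
    np.add.at(count, (iy, ix), 1)
    return np.divide(height, count, out=np.zeros_like(height),
                     where=count > 0)

points = np.random.rand(10000, 3) * [100.0, 100.0, 30.0]  # synthetic cloud
raster = lidar_to_raster(points, cell=2.0)   # grayscale-image representation
smoothed = median_filter(raster, size=3)     # noise reduction, as above
```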

Relevance:

100.00%

Publisher:

Abstract:

Developing bioimage informatics – from microscopy to software solutions – with α2β1 integrin as an application example. When the human genome was sequenced in 2003, the main task of the life sciences became determining the functions of individual genes, and various bioimaging techniques became central research methods. Technological advances led to an explosive growth in the popularity of fluorescence-based light microscopy techniques in particular, but microscopy had to transform from a qualitative science into a quantitative one. This transformation gave rise to a new discipline, bioimage informatics, which has been said to have the potential to revolutionize the life sciences. This thesis presents a broad, interdisciplinary body of work in the field of bioimage informatics. The first aim of the thesis was to develop protocols for four-dimensional confocal microscopy of living cells, which was one of the fastest-growing bioimaging methods. The human collagen receptor α2β1 integrin, an important molecule in many physiological and pathological processes, served as the application example. The work produced clear visualizations of integrin movement, clustering and internalization, but tools for quantitative analysis of the image information were lacking. The second aim of the thesis therefore became the development of software suitable for such analysis. At the same time bioimage informatics emerged as a field, and what the new field needed most urgently was specialized software. The most important result of this thesis thus became BioImageXD, a novel open-source software package for the visualization, processing and analysis of multidimensional bioimages. BioImageXD grew into one of the largest and most versatile packages in its field. It was published in a special issue of Nature Methods on bioimage informatics, and it became well known and widely used. The third aim of the thesis was to apply the developed methods to something more practical. Synthetic silica nanoparticles were made, carrying antibodies that recognize α2β1 integrin as "address labels". Using BioImageXD, it was shown that the nanoparticles have potential in targeted drug delivery applications. One fundamental aim of this thesis was to advance the new and unknown discipline of bioimage informatics, and this aim was achieved in particular through BioImageXD and its numerous published applications. The work has significant future potential, but bioimage informatics faces serious challenges. The field is too complex for the average biomedical researcher to master, and its most central element, open-source software development, is undervalued. Several improvements are needed to address these issues.

Relevance:

100.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

Diabetic retinopathy, age-related macular degeneration and glaucoma are the leading causes of blindness worldwide. Automatic methods for diagnosis exist, but their performance is limited by the quality of the data. Spectral retinal images provide a significantly better representation of the colour information than common grayscale or red-green-blue retinal imaging, and thus have the potential to improve the performance of automatic diagnosis methods. This work studies the image processing techniques required for composing spectral retinal images with accurate reflection spectra, including wavelength channel image registration, spectral and spatial calibration, illumination correction, and the estimation of depth information from image disparities. The composition of a spectral retinal image database of patients with diabetic retinopathy is described. The database includes gold standards for a number of pathologies and retinal structures, marked by two expert ophthalmologists. The diagnostic applications of the reflectance spectra are studied using supervised classifiers for lesion detection. In addition, inversion of a model of light transport is used to estimate histological parameters from the reflectance spectra. Experimental results suggest that the methods presented in this work for composing, calibrating and postprocessing spectral images can be used to improve the quality of the spectral data. The experiments on the direct and indirect use of the data show the diagnostic potential of spectral retinal data over standard retinal images. The use of spectral data could improve automatic and semi-automated diagnostics for the screening of retinal diseases, for the quantitative detection of retinal changes in follow-up, and for clinically relevant end-points in clinical studies and the development of new therapeutic modalities.
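
One of the composition steps above, wavelength channel image registration, might look roughly like the following scikit-image sketch; the spectral channel stack is a hypothetical input, and phase correlation is one common choice of registration method rather than necessarily the one used in the thesis.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def register_channels(stack, ref_index=0):
    """stack: (channels, H, W) spectral image; aligns all channels to one."""
    ref = stack[ref_index]
    aligned = np.empty_like(stack)
    for i, channel in enumerate(stack):
        # Estimate the (row, col) translation between channel and reference.
        offset, _, _ = phase_cross_correlation(ref, channel,
                                               upsample_factor=10)
        aligned[i] = nd_shift(channel, offset)  # undo the estimated shift
    return aligned
```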

Relevance:

100.00%

Publisher:

Abstract:

We provide an incremental quantile estimator for non-stationary streaming data. We propose a method for the simultaneous estimation of multiple quantiles, corresponding to given probability levels, from streaming data. Because memory is limited, it is not feasible to compute the quantiles by storing the data, so the quantiles must be estimated as the data pass by. This can be effective in network measurement. To minimize the mean-squared error of the estimation we use a parabolic approximation, and for comparison we simulate results over different numbers of runs using both linear and parabolic approximations.
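
The general idea of memory-free quantile tracking can be sketched with a stochastic-approximation update; the constant-step linear rule below is an illustration only, and the abstract's parabolic approximation would replace the fixed step with one shaped by a local parabola.

```python
import numpy as np

def incremental_quantiles(stream, probs, step=0.05):
    """Track several quantiles of a (possibly non-stationary) stream."""
    probs = np.asarray(probs, dtype=float)
    q = None
    for x in stream:
        if q is None:
            q = np.full(probs.shape, float(x))  # initialize at first sample
            continue
        # Move each estimate up with weight p, down with weight (1 - p),
        # so it settles where a fraction p of the data lies below it; the
        # constant step keeps the estimator adaptive to non-stationarity.
        q += np.where(x > q, step * probs, -step * (1.0 - probs))
    return q

rng = np.random.default_rng(1)
est = incremental_quantiles(rng.normal(size=50000), [0.25, 0.5, 0.75])
```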

Relevance:

100.00%

Publisher:

Abstract:

Contrast enhancement is an image processing technique whose objective is to preprocess the image so that relevant information can be either seen or further processed more reliably. These techniques are typically applied when the image itself, or the device used for image reproduction, provides poor visibility and distinguishability of the different regions of interest in the image. In most studies the emphasis is on the visualization of image data, but this human-observer-biased goal often results in images that are not optimal for automated processing. The main contribution of this study is to express contrast enhancement as a mapping from N-channel image data to a 1-channel gray-level image, and to devise a projection method that yields an image with minimal error with respect to the correct contrast image. This projection, the minimum-error contrast image, possesses the optimal contrast between the regions of interest in the image. The method is based on estimating the probability densities of the region values, and it employs Bayesian inference to establish the minimum-error projection.
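
As a hedged illustration of projecting N-channel data to a single gray level with maximal separation between two regions, the sketch below uses Fisher's linear discriminant as a stand-in for the thesis's Bayesian minimum-error projection; the labelled pixel samples are hypothetical.

```python
import numpy as np

def fisher_projection(ch_a, ch_b):
    """ch_a, ch_b: (n, N) pixel samples from the two regions of interest."""
    mean_a, mean_b = ch_a.mean(axis=0), ch_b.mean(axis=0)
    # Pooled within-class scatter of the N channels.
    sw = np.cov(ch_a, rowvar=False) + np.cov(ch_b, rowvar=False)
    w = np.linalg.solve(sw, mean_b - mean_a)   # discriminant direction
    return w / np.linalg.norm(w)

def to_gray(image, w):
    """image: (H, W, N) multi-channel image -> (H, W) contrast image."""
    gray = image @ w
    lo, hi = gray.min(), gray.max()
    return (gray - lo) / (hi - lo)             # rescale to [0, 1]
```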

Relevance:

100.00%

Publisher:

Abstract:

Coating and filler pigments have a strong influence on the properties of paper. The filler content can exceed 30 %, and the pigment content of a coating is about 85-95 weight percent. The physical and chemical properties of the pigments differ, and knowledge of these properties is important for optimising the optical and printing properties of the paper. The size and shape of pigment particles can be measured by different analysers based on sedimentation, laser diffraction, changes in an electric field, etc. This master's thesis studied particle properties in particular with a scanning electron microscope (SEM) and image analysis programs. The research covered nine pigments with different particle sizes and shapes. The pigments were analysed with two image analysis programs (INCA Feature and Poikki), a Coulter LS230 (laser diffraction) and a SediGraph 5100 (sedimentation). The results were compared to assess the effect of particle shape on the performance of the analysers. Only the image analysis programs gave parameters describing particle shape. One part of the research was also sample preparation for SEM: in an ideal sample, individual particles are separated and distinct. The analysing methods gave different results, but the results from the image analysis programs agreed with either sedimentation or laser diffraction, depending on the particle shape. Detailed analysis of particle shape required high magnification in the SEM, but the measured parameters described the shape of the particles very well. Large particles (ecd ~ 1 µm) could also be used in 3D modelling, which enabled measurement of particle thickness. The scanning electron microscope and image analysis programs proved effective and versatile tools for particle analysis. Further development and experience will determine the usability of the method in routine use.
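
In the spirit of the image-analysis measurements above, a minimal scikit-image sketch for extracting size and shape parameters from a segmented SEM image might look as follows; the binary input image and the choice of shape descriptors are assumptions, not the exact parameters of INCA Feature or Poikki.

```python
import numpy as np
from skimage.measure import label, regionprops

def particle_parameters(binary_image):
    """binary_image: 2-D bool array, True on particle pixels."""
    results = []
    for region in regionprops(label(binary_image)):
        results.append({
            # Equivalent circular diameter (the 'ecd' mentioned above),
            # in pixels; multiply by the SEM pixel size for micrometres.
            "ecd": region.equivalent_diameter,
            "aspect_ratio": region.major_axis_length
                            / max(region.minor_axis_length, 1e-9),
            "circularity": 4 * np.pi * region.area
                           / max(region.perimeter, 1e-9) ** 2,
        })
    return results
```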

Relevance:

100.00%

Publisher:

Abstract:

With fossil fuels such as oil, gas and coal, humanity found new sources of energy that have played a critical role in the progress of our modern society. Coal is very abundant compared to the other two fossil fuels: global coal reserves at the end of 2005 were estimated at 847.5 billion tonnes. Among the major energy sources, coal is the fastest-growing fuel on a global basis; it provides 26% of primary energy needs and remains essential to the economies of many developed and developing countries. Coal-fired power generation accounts for 41% of the world's total electricity production, and in some countries, such as South Africa, Poland, China, Australia, Kazakhstan and India, the share is even higher. Still, coal utilization poses challenges related to high emissions of air pollutants such as sulphur and nitrogen dioxides, particulate matter, mercury and carbon dioxide. In response, a number of technologies have been developed and are in commercial use, with further potential developments towards "Near Zero Emission" coal plants. In the present work, coals mined in Russia and the countries of the former Soviet Union were reviewed. The distribution of coal reserves across the territory of Russia and the potential for power generation from coal-fired plants across Russia were shown. The physical and chemical properties of the coals produced were listed and examined, as the main factors influencing the design of the combustion facility and the performance of the incineration process. Ash-related problems in coal-fired boilers were described. Analyses of the coal ash of Russia and the countries of the former Soviet Union were compiled. Feasible combustion technologies were also reviewed.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work was to study the separation of complex-forming metals from chloride solution by ion exchange. The literature part examines the formation of metal complexes, in particular the complexes formed by silver, calcium, magnesium, lead and zinc with chloride and nitrate. The literature part also covers the separation of metals in fixed-bed columns using continuous ion-exchange methods. The process alternatives for continuous ion exchange were divided into rotating and stationary columns, and the different process alternatives were examined with respect to column configurations. The experimental part studied the separation of divalent metals from monovalent metals and produced experimental data for the simulation of a corresponding separation process. An anion-exchange resin and a chelating, selective ion-exchange resin were used in the experiments. The adsorption of divalent calcium, magnesium, lead and zinc on the resins was studied with equilibrium, kinetic and column experiments. The results of the equilibrium and column experiments with the anion-exchange resin showed that the resin adsorbs zinc efficiently from chloride solutions, because zinc forms stable anionic chloro complexes. The adsorption of the other divalent metals studied was considerably lower. Based on the results, the studied anion-exchange resin is a good option for separating zinc from the other divalent metals studied in a chloride environment. The equilibrium and column experiments with the chelating resin showed that the resin adsorbs divalent calcium, magnesium, lead and zinc well from chloride solutions, but does not adsorb monovalent silver. Based on the results, the separation of divalent metals from monovalent metals can be accomplished with the chelating ion-exchange resin used in the experiments.

Relevance:

100.00%

Publisher:

Abstract:

The consumption of manganese is increasing, but huge amounts of manganese still end up as waste in hydrometallurgical processes. The recovery of manganese from multi-metal solutions at low concentrations may not be economical. In addition, poor iron control typically prevents the production of high-purity manganese. Separation of iron from manganese can be done with chemical precipitation or solvent extraction methods. Combined carbonate precipitation with air oxidation is a feasible method to separate iron and manganese due to its fast kinetics, good controllability and economical reagents. In addition, the leaching of manganese carbonate is easier and less acid-consuming than that of hydroxide or sulfide precipitates. Selective and efficient iron removal from MnSO4 solution is achieved by combined oxygen or air oxidation and CaCO3 precipitation at pH > 5.8 and a redox potential of > 200 mV. In order to avoid gypsum formation, soda ash should be used instead of limestone; in that case, however, extra attention needs to be paid to the reagent mole ratios in order to avoid manganese coprecipitation. After iron removal, pure MnSO4 solution was obtained by solvent extraction using the organophosphorus reagents di-(2-ethylhexyl)phosphoric acid (D2EHPA) and bis(2,4,4-trimethylpentyl)phosphinic acid (CYANEX 272). The Mn/Ca and Mn/Mg selectivities can be increased by decreasing the temperature from the commonly used 40-60 °C to 5 °C. The extraction order of D2EHPA (Ca before Mn) remains unchanged at low temperature, but the lower temperature increases viscosity and slows phase separation. Of these reagents, CYANEX 272 is selective for Mn over Ca and would therefore be the better choice if Ca is present in the solution. A three-stage Mn extraction followed by two-stage scrubbing and two-stage sulfuric acid stripping is an effective method of producing a very pure MnSO4 intermediate solution for further processing. From the intermediate MnSO4, special Mn products for ion exchange applications were synthesized and studied. Three types of octahedrally coordinated manganese oxide (OMS) materials were chosen for synthesis as alternative final products for manganese: layer-structured Na-birnessite, tunnel-structured Mg-todorokite and K-kryptomelane. As an alternative to using the pure MnSO4 intermediate as the source, kryptomelane was also synthesized from synthetic hydrometallurgical tailings. The results show that the studied OMS materials selectively adsorb Cu, Ni, Cd and K in the presence of Ca and Mg. The exchange rates were also found to be reasonably high owing to the small particle dimensions. The materials are stable in the studied conditions, and their maximum Cu uptake capacity was 1.3 mmol/g. Competitive uptake of metals and acid was studied using equilibrium, batch kinetic and fixed-bed measurements. The experimental data were correlated with a dynamic model that also accounts for the dissolution of the framework manganese. Manganese oxide micro-crystals were also bound onto silica to prepare a composite material with a particle size large enough for column separation experiments. The MnOx/SiO2 ratio was found to significantly affect the properties of the composite: the higher the ratio, the lower the specific surface area, pore volume and pore size. On the other hand, a higher amount of silica binder gives the composites better mechanical properties.
Birnessite and todorokite can be aggregated successfully with colloidal silica at pH 4 and an MnO2/SiO2 weight ratio of 0.7. The best gelation and drying temperature was 110 °C, and sufficiently strong composites were obtained by additional heat treatment at 250 °C for 2 h. The results show that silica-supported MnO2 materials can be utilized to separate copper from nickel and cadmium. The behavior of the composites can be explained reasonably well with the presented model and the parameters estimated from the data for the unsupported oxides. The metal uptake capacities of the prepared materials were quite small; for example, the final copper loading was 0.14 mmol/g MnO2. According to the results, these special MnO2 materials have potential in specific environmental applications for the uptake of harmful metal ions.

Relevance:

100.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

Identification of low-dimensional structures and the main sources of variation from multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve the solution of an optimization problem. Thus, the objective of this thesis is to develop computationally efficient and theoretically justified methods for solving such problems. Most of the thesis is based on a statistical model in which ridges of the density estimated from the data are considered relevant features. Finding ridges, which are generalized maxima, necessitates the development of advanced optimization methods. An efficient and convergent trust-region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential-equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically using Gaussian kernels, which allows the application of ridge-based methods with only mild assumptions on the underlying structure of the data. The statistical model and the ridge-finding methods are adapted to two different applications. The first is the extraction of curvilinear structures from noisy data mixed with background clutter. The second is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications where most earlier approaches are inadequate; examples include the identification of faults from seismic data and of filaments from cosmological data. The applicability of the nonlinear PCA to climate analysis and to the reconstruction of periodic patterns from noisy time series data is also demonstrated. Other contributions of the thesis include the development of an efficient semidefinite optimization method for embedding graphs into Euclidean space. The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but also has potential applications in graph theory and various areas of physics, chemistry and engineering. The asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated as the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, a typical problem in visual object tracking.
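
To make the ridge idea concrete: the sketch below projects a point onto a one-dimensional ridge of a Gaussian kernel density using a subspace-constrained mean-shift (SCMS) iteration, a simpler stand-in for the trust-region Newton method developed in the thesis.

```python
import numpy as np

def scms_project(x, data, h=0.5, iters=200, tol=1e-6):
    """Iterate x toward a 1-D ridge of the KDE built from `data` (n, d)."""
    for _ in range(iters):
        diff = data - x                                  # (n, d)
        w = np.exp(-0.5 * np.sum(diff**2, axis=1) / h**2)
        # KDE Hessian up to a positive factor: sum_i w_i (d_i d_i^T / h^2 - I).
        hess = (diff * w[:, None]).T @ diff / h**2 - w.sum() * np.eye(len(x))
        evals, evecs = np.linalg.eigh(hess)
        v = evecs[:, :-1]            # span of the d-1 smallest eigenvalues
        shift = w @ diff / w.sum()   # ordinary mean-shift vector
        step = v @ (v.T @ shift)     # constrain the step to that subspace
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

# Usage: project a point onto the ridge of a noisy arc.
rng = np.random.default_rng(2)
t = rng.uniform(0, np.pi, 400)
data = np.c_[np.cos(t), np.sin(t)] + 0.1 * rng.normal(size=(400, 2))
on_ridge = scms_project(np.array([0.5, 0.5]), data, h=0.3)
```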

Relevance:

100.00%

Publisher:

Abstract:

Personalized medicine will revolutionize our capabilities to combat disease. Working toward this goal, a fundamental task is deciphering the genetic variants that are predictive of complex diseases. Modern studies, in the form of genome-wide association studies (GWAS), have afforded researchers the opportunity to reveal new genotype-phenotype relationships through the extensive scanning of genetic variants. These studies typically contain over half a million genetic features for thousands of individuals. Examining such data with methods other than univariate statistics is a challenging task requiring advanced algorithms that are scalable to the genome-wide level. In the future, next-generation sequencing (NGS) studies will contain an even larger number of common and rare variants. Machine-learning-based feature selection algorithms have been shown to effectively create predictive models for various genotype-phenotype relationships. This work explores the problem of selecting the genetic variant subsets that are most predictive of complex disease phenotypes through various feature selection methodologies, including filter, wrapper and embedded algorithms. The examined machine learning algorithms were demonstrated not only to be effective at predicting the disease phenotypes, but also to do so efficiently through the use of computational shortcuts. While much of the work could be run on high-end desktops, some of it was further extended to run on parallel computers, helping to ensure that the methods will also scale to NGS data sets. Furthermore, these studies analyzed the relationships between the various feature selection methods and demonstrated the need for careful testing when selecting an algorithm. It was shown that there is no universally optimal algorithm for variant selection in GWAS; rather, methodologies need to be selected based on the desired outcome, such as the number of features to be included in the prediction model. It was also demonstrated that without proper model validation, for example using nested cross-validation, models can yield overly optimistic prediction accuracies and decreased generalization ability. It is through the implementation and application of machine learning methods that one can extract predictive genotype-phenotype relationships and biological insights from genetic data sets.
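
A small scikit-learn sketch of the nested cross-validation point above, with filter-style variant selection kept inside the pipeline so that selection never sees the evaluation folds; the genotype matrix, labels and grid of k values are synthetic assumptions.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(300, 1000)).astype(float)  # 0/1/2 genotype codes
y = rng.integers(0, 2, size=300)                        # case/control labels

pipe = Pipeline([
    ("select", SelectKBest(f_classif)),      # filter-style variant selection
    ("clf", LogisticRegression(max_iter=1000)),
])

# Inner CV tunes the number of selected variants; outer CV estimates
# generalization without the optimistic bias the text warns about.
inner = GridSearchCV(pipe, {"select__k": [10, 50, 100]}, cv=3)
scores = cross_val_score(inner, X, y, cv=5)
print(scores.mean())
```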