18 results for digital spiral analysis
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Summary: Calibration of a flatbed scanner and a digital image analysis method for the quantification of root morphology
Abstract:
This dissertation discusses Holocene palaeoenvironmental and palaeomagnetic secular variation (PSV) records reconstructed from sediments preserved in Lake Lehmilampi (63°37′N, 29°06′E) and Lake Kortejärvi (63°37′N, 28°56′E) in eastern Finland. Several piston and freeze cores were obtained from both lakes for varve and magnetic analyses. Sediment samples were impregnated in low-viscosity epoxy, and physical parameters of varves, including varve thickness and relative grey-scale values, were recorded using X-ray densitometry combined with semi-automatic digital image analysis. On average, the varve records of Lehmilampi and Kortejärvi cover 5122 and 3902 years, respectively. Past solar activity, as estimated from residual 14C data, compares favourably with varve thicknesses from Lehmilampi during the last 2000 years, indicating the potential of clastic-organic varves to sensitively record climatic variations. Bulk magnetic parameters, including magnetic susceptibility together with natural, anhysteretic and isothermal remanent magnetizations, were measured to describe the mineral magnetic properties and the geomagnetic palaeosecular variation recorded in the sediments. The main stages in the development of the investigated lakes are reflected in the variations in the mineral magnetic records, sediment lithology and composition. Similar variations in magnetic parameters and sediment organic matter suggest a contribution of bacterial magnetite to the magnetic assemblages of Lehmilampi. Inclination and relative declination records yielded largely consistent results, attesting to the great potential of these sediments to preserve directional palaeosecular variation at high resolution. The PSV data from Lehmilampi and Kortejärvi were stacked into a North Karelian PSV stack, which may be used for dating homogeneous lake sediments in the same regional context. Reconstructed millennial variations in relative palaeointensity are approximately in agreement with those seen in the absolute palaeointensity data from Europe. Centennial variations in the relative palaeointensity, however, are influenced by environmental changes. Caution is recommended when using varved lake sediments to reconstruct relative palaeointensity.
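As an illustration of the stacking step described above, the following sketch (numpy-based, with hypothetical age models and inclination values, not the published data) shows how two varve-dated inclination records might be interpolated onto a common timescale and averaged into a regional PSV stack:

    # Sketch: stack two PSV inclination records onto a common age scale
    # by interpolation and unweighted averaging. All values are placeholders.
    import numpy as np

    age_lehmilampi = np.array([0, 100, 200, 300])        # varve years BP (hypothetical)
    inc_lehmilampi = np.array([72.0, 70.5, 68.9, 71.2])  # inclination, degrees

    age_kortejarvi = np.array([0, 150, 250, 350])
    inc_kortejarvi = np.array([71.4, 69.8, 69.5, 70.9])

    common_age = np.arange(0, 301, 50)                   # overlap of both records

    inc1 = np.interp(common_age, age_lehmilampi, inc_lehmilampi)
    inc2 = np.interp(common_age, age_kortejarvi, inc_kortejarvi)
    psv_stack = (inc1 + inc2) / 2.0                      # simple two-record stack

    print(np.column_stack([common_age, psv_stack]))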
Abstract:
This master's thesis investigates techniques for embedding a watermark into a spectral image and methods for identifying and detecting watermarks in spectral images. The spectral dimensionality of the original images was reduced using the PCA (Principal Component Analysis) algorithm, and the watermark was embedded in the transform space. In the proposed model, one transform-space component is replaced by a linear combination of the watermark and another transform-space component. The set of parameters used in the embedding was studied, the quality of the watermarked images was measured and analysed, and recommendations for watermark embedding were given. Several methods were used for watermark identification, and the identification results were analysed. The ability of the watermarks to withstand various attacks was also tested. A series of detection experiments was carried out taking into account the parameters used in watermark embedding. The ICA (Independent Component Analysis) method is considered one possible alternative for watermark detection.
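The embedding model described above can be sketched in a few lines. The following fragment (scikit-learn based, with a random cube standing in for a real spectral image; the mixing coefficient alpha and the component indices are assumptions, not the thesis parameters) replaces one PCA component with a linear combination of the watermark and another component:

    # Sketch: watermark embedding in PCA transform space.
    import numpy as np
    from sklearn.decomposition import PCA

    h, w, bands = 64, 64, 30                      # hypothetical spectral cube
    cube = np.random.rand(h, w, bands)
    watermark = np.random.rand(h, w)              # hypothetical watermark image

    pixels = cube.reshape(-1, bands)              # (h*w, bands)
    pca = PCA(n_components=8)                     # reduce spectral dimensionality
    scores = pca.fit_transform(pixels)            # transform-space components

    alpha = 0.01                                  # mixing coefficient (assumed)
    target, carrier = 7, 0                        # component indices (assumed)
    scores[:, target] = (1 - alpha) * scores[:, carrier] + alpha * watermark.ravel()

    watermarked = pca.inverse_transform(scores).reshape(h, w, bands)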
Abstract:
Problems in designing active magnetic bearing control are addressed. An estimation controller is designed and applied to a rigid rotor. A mathematical model of the active magnetic bearing controller is developed and implemented on a DSP. The results of this implementation are analysed, and conclusions about the digital signal processing are drawn.
Abstract:
In a centrifugal compressor the flow around the diffuser is collected and led to the pipe system by a spiral-shaped volute. In this study a single-stage centrifugal compressor with three different volutes is investigated. The compressor was first equipped with the original volute, the cross-section of which was a combination of a rectangle and a semi-circle. Next a new volute with a fully circular cross-section was designed and manufactured. Finally, the circular volute was modified by rounding the tongue and smoothing the tongue area. The overall performance of the compressor as well as the static pressure distribution after the impeller and on the volute surface were measured. The flow entering the volute was measured using a three-hole Cobra probe, and flow visualisations were carried out in the exit cone of the volute. In addition, the radial force acting on the impeller was measured using magnetic bearings. The complete compressor with the circular volute (inlet pipe, full impeller, diffuser, volute and outlet pipe) was also modelled using computational fluid dynamics (CFD). The fully 3-D viscous flow was solved using a Navier-Stokes solver, Finflo, developed at Helsinki University of Technology. Chien's k-ε model was used to account for the turbulence. The differences observed in the performance of the different volutes were quite small. The biggest differences occurred at low speeds and high volume flows, i.e. when the flow entered the volute most radially. In this operating regime the efficiency of the compressor with the modified circular volute was about two percentage points higher than with the other volutes. Also, according to the Cobra-probe measurements and flow visualisations, the modified circular volute performed better than the other volutes in this operating area. The circumferential static pressure distribution in the volute increased at low flow, was constant at the design flow and decreased at high flow. The non-uniform static pressure distribution of the volute was transmitted backwards across the vaneless diffuser and observed at the impeller exit. At low volume flow a strong two-wave pattern developed in the static pressure distribution at the impeller exit due to the response of the impeller to the non-uniformity of pressure. The radial force on the impeller was greatest at the choke limit, smallest at the design flow, and moderate at low flow. At low flow the force increase was quite mild, whereas the increase at high flow was rapid. Thus, the non-uniformity of pressure and the force related to it are strong especially at high flow. The force caused by the modified circular volute was weaker at choke and more symmetric as a function of the volume flow than the force caused by the other volutes.
Abstract:
Industry's growing need for higher productivity is placing new demands on mechanisms connected to electrical motors, because these can easily lead to vibration problems due to fast dynamics. Furthermore, the nonlinear effects caused by a motor frequently reduce servo stability, which diminishes the controller's ability to predict and maintain speed. Hence, the flexibility of a mechanism and its control has become an important area of research. The basic approach in control system engineering is to assume that the mechanism connected to a motor is rigid, so that vibrations in the tool mechanism, reel, gripper or any apparatus connected to the motor are not taken into account. This might reduce the ability of the machine system to carry out its assignment and shorten the lifetime of the equipment. It is therefore usually more important to know how the mechanism, in other words the load on the motor, behaves. A nonlinear load control method for a permanent magnet linear synchronous motor is developed and implemented in the thesis. The purpose of the controller is to drive a flexible load to the desired velocity reference as fast as possible and without excessive oscillation. The control method is based on an adaptive backstepping algorithm, with its stability ensured by the Lyapunov stability theorem. As a reference controller for the backstepping method, a hybrid neural controller is introduced, in which the linear motor itself is controlled by a conventional PI velocity controller and the vibration of the associated flexible mechanism is suppressed by an outer control loop using a compensation signal from a multilayer perceptron network. To avoid the local-minimum problem inherent in neural networks, the initial weights are searched for offline by means of a differential evolution algorithm. The states of the mechanical system needed by the controllers are estimated using a Kalman filter. The theoretical results obtained from the control design are validated with a lumped mass model of the mechanism. Generalization of the mechanism allows the methods derived here to be widely implemented in machine automation. The control algorithms are first designed in a specially introduced nonlinear simulation model and then implemented in the physical linear motor using a DSP (Digital Signal Processor) application. The measurements prove that both controllers are capable of suppressing vibration, but that the backstepping method is superior due to its response accuracy and stability properties.
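As a rough illustration of the validation setup, the sketch below (plain Euler integration with invented parameters, not the thesis code or the backstepping law itself) simulates the lumped two-mass model, a motor mass coupled to a flexible load through a spring and damper, driven by the baseline PI velocity controller mentioned above:

    # Sketch: two-mass flexible-load model under PI velocity control.
    # All parameter values are illustrative, not identified machine data.
    m1, m2 = 2.0, 1.0            # motor and load masses [kg]
    k, c = 500.0, 2.0            # coupling stiffness [N/m], damping [Ns/m]
    kp, ki = 50.0, 200.0         # PI gains (assumed)
    v_ref, dt, T = 0.5, 1e-4, 2.0

    x1 = v1 = x2 = v2 = integ = 0.0
    for _ in range(int(T / dt)):
        err = v_ref - v1
        integ += err * dt
        f_motor = kp * err + ki * integ            # PI control force
        f_link = k * (x1 - x2) + c * (v1 - v2)     # flexible coupling force
        a1 = (f_motor - f_link) / m1
        a2 = f_link / m2
        v1 += a1 * dt; x1 += v1 * dt
        v2 += a2 * dt; x2 += v2 * dt

    print(f"motor velocity {v1:.3f} m/s, load velocity {v2:.3f} m/s")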
Abstract:
Testing complex software is time-consuming. Many automated tools are available for desktop applications, but embedded systems require a custom-made tool. Building a complete test framework is a complicated task; therefore, the test platform was built on top of an existing tool, CANoe, a tool for CAN bus analysis and node simulation. The functionality of CANoe was extended with a LabVIEW DLL, and the LabVIEW software was used to simulate the hardware components of the embedded device. As a result of the study, a platform was created on which tests could be automated. Of the current test plan, 10 percent was automated, and up to 60 percent could be automated with the current functionality.
Abstract:
Diabetes is a rapidly increasing worldwide problem characterised by defective metabolism of glucose that causes long-term dysfunction and failure of various organs. The most common complication of diabetes is diabetic retinopathy (DR), which is one of the primary causes of blindness and visual impairment in adults. The rapid increase of diabetes is pushing the limits of current DR screening capabilities, for which digital imaging of the eye fundus (retinal imaging) and automatic or semi-automatic image analysis algorithms provide a potential solution. In this work, the use of colour in the detection of diabetic retinopathy is statistically studied using a supervised algorithm based on one-class classification and Gaussian mixture model estimation. The presented algorithm distinguishes a certain diabetic lesion type from all other possible objects in eye fundus images by estimating only the probability density function of that lesion type. For training and ground truth estimation, the algorithm combines the manual annotations of several experts, for which the best practices were experimentally selected. By assessing the algorithm's performance in experiments with colour space selection, illuminance and colour correction, and background class information, the use of colour in the detection of diabetic retinopathy was quantitatively evaluated. Another contribution of this work is a benchmarking framework for eye fundus image analysis algorithms, needed for the development of automatic DR detection algorithms. The benchmarking framework provides guidelines on how to construct a benchmarking database that comprises true patient images, ground truth, and an evaluation protocol. The evaluation is based on standard receiver operating characteristic analysis and follows medical decision-making practice, providing protocols for image- and pixel-based evaluations. During the work, two public medical image databases with ground truth were published: DIARETDB0 and DIARETDB1. The framework, the DR databases and the final algorithm are made public on the web to set baseline results for the automatic detection of diabetic retinopathy. Although deviating from the general context of the thesis, a simple and effective optic disc localisation method is also presented, since normal eye fundus structures are fundamental in the characterisation of DR.
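The one-class idea described above can be sketched as follows (scikit-learn based; the training pixels are randomly generated stand-ins for expert-annotated lesion pixels, and the 5th-percentile threshold is an assumed operating point, not the published one):

    # Sketch: one-class lesion detection with a Gaussian mixture model.
    # Only the lesion class is modelled; low-likelihood pixels are rejected.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    lesion_rgb = rng.normal([160.0, 40.0, 30.0], 10.0, size=(500, 3))
    gmm = GaussianMixture(n_components=3, covariance_type="full").fit(lesion_rgb)

    # Threshold from the training data so ~95% of lesion pixels pass;
    # sweeping this threshold traces the ROC curve used in the evaluation.
    thr = np.percentile(gmm.score_samples(lesion_rgb), 5)

    test_pixels = rng.uniform(0, 255, size=(4, 3))   # hypothetical image pixels
    is_lesion = gmm.score_samples(test_pixels) >= thr
    print(is_lesion)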
Abstract:
Paper presented at the 40th Annual Conference of LIBER (Ligue des Bibliothèques Européennes de Recherche - Association of European Research Libraries) on July 1st, 2011, together with the slides used at the presentation.
Abstract:
With the increasing use of digital media, methods of multimedia protection are becoming extremely important. The number of solutions to the problem, from encryption to watermarking, is large and growing every year. In this work digital image watermarking is considered, specifically a novel method for the digital watermarking of color and spectral images. An overview of existing methods for watermarking color and grayscale images is given. Methods using independent component analysis (ICA) for detection, and methods using the discrete wavelet transform (DWT) and discrete cosine transform (DCT), are considered in more detail. The novel watermarking method proposed in this paper allows a color or spectral watermark image to be embedded into a color or spectral image and subsequently extracted successfully from the resulting watermarked image. A number of experiments were performed on the quality of extraction depending on the parameters of the embedding procedure. Another set of experiments tested the robustness of the proposed algorithm. Three techniques were chosen for that purpose: the median filter, the low-pass filter (LPF) and the discrete cosine transform (DCT), which are part of the widely known StirMark Image Watermarking Robustness Test. The study shows that the proposed watermarking technique is fragile, i.e. the watermark is altered by simple image processing operations. Moreover, we found that the contents of the image to be watermarked do not affect the quality of the extraction. The mixing coefficients, which determine the proportion of the key and watermark image in the result, should not exceed 1% of the original. The proposed algorithm has proven successful in the task of watermark embedding and extraction.
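One of the robustness checks named above, the median-filter attack, can be sketched as follows (SciPy-based, with a random placeholder image; PSNR is used here as a generic distortion measure, not necessarily the paper's exact metric):

    # Sketch: median-filter "attack" on a watermarked image, with PSNR
    # as a distortion measure. A fragile watermark is destroyed even by
    # such mild processing, which is what the study reports.
    import numpy as np
    from scipy.ndimage import median_filter

    def psnr(a, b, peak=1.0):
        mse = np.mean((a - b) ** 2)
        return 10 * np.log10(peak ** 2 / mse)

    watermarked = np.random.rand(128, 128)          # hypothetical watermarked image
    attacked = median_filter(watermarked, size=3)   # 3x3 median filter

    print(f"PSNR after attack: {psnr(watermarked, attacked):.1f} dB")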
Abstract:
This study was conducted in order to learn how companies' revenue models will be transformed by the digitalisation of their products and processes. Because there is still only a limited amount of research focusing solely on revenue models, and particularly on revenue model change caused by changes in the business environment, the topic was initially approached through the business model concept, which organises the different value-creating operations and resources at a company in order to create profitable revenue streams. This was used as the base for constructing the theoretical framework of this study, used to collect and analyse the information. The empirical section is based on a qualitative study approach and a multiple-case analysis of companies operating in the learning materials publishing industry. Their operations are compared with companies operating in other industries that have undergone a comparable transformation, in order to recognise similarities or contrasts between the cases. The sources of evidence are a literature review to find the essential dimensions researched earlier, and 29 interviews with managers and executives at 17 organisations representing six industries. Based on the earlier literature and the empirical findings of this study, the change of the revenue model is linked with changes in the other dimensions of the business model: when one dimension is altered, the others should be adjusted accordingly. At the case companies the transformation is observed as the simultaneous use of several revenue models and as revenue creation processes becoming more complex.
Abstract:
This paper describes the cost-benefit analysis of digital long-term preservation (LTP) that was carried out in the context of the Finnish National Digital Library Project (NDL) in 2010. The analysis was based on the assumption that as many as 200 archives, libraries, and museums will share an LTP system. The term ‘system’ shall be understood as encompassing not only information technology, but also human resources, organizational structures, policies and funding mechanisms. The cost analysis shows that an LTP system will incur, over the first 12 years, cumulative costs of €42 million, i.e. an average of €3.5 million per annum. Human resources and investments in information technology are the major cost factors. After the initial stages, the analysis predicts annual costs of circa €4 million. The analysis compared scenarios with and without a shared LTP system. The results indicate that a shared system will have remarkable benefits. At the development and implementation stages, a shared system shows an advantage of €30 million against the alternative scenario consisting of five independent LTP solutions. During the later stages, the advantage is estimated at €10 million per annum. The cumulative cost benefit over the first 12 years would amount to circa €100 million.
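The headline figures are consistent with simple arithmetic (a sketch under the stated assumptions; the split of the 12 years into initial and later stages is inferred, not given explicitly):

    # Sketch: arithmetic behind the quoted figures.
    cumulative_cost = 42e6            # EUR over the first 12 years
    print(cumulative_cost / 12)       # 3.5e6 -> EUR 3.5 million per annum

    # Shared-system benefit: EUR 30 million at development/implementation,
    # then roughly EUR 10 million per annum in the later stages
    # (about 7 of the 12 years assumed here).
    print(30e6 + 7 * 10e6)            # 1.0e8 -> circa EUR 100 million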
Abstract:
This thesis aims to find an effective way of conducting a target audience analysis (TAA) in the cyber domain. Two main focal points are addressed: the nature of the cyber domain and the method of the TAA. For the cyber domain, the objective is to identify the opportunities, restrictions and caveats that result from its digital and temporal nature; this is the environment in which the TAA method is examined in this study. As the TAA is an important step of any psychological operation and critical to its success, the method used must cover all the main aspects affecting the choice of a proper target audience. The first part of the research was done by sending an open-ended questionnaire to operators in the field of information warfare both in Finland and abroad. As the results were inconclusive, the research was completed by assessing the applicability of the United States Army Field Manual FM 3-05.301 in the cyber domain via a theory-based content analysis. FM 3-05.301 was chosen because it presents a complete method for the TAA process. The findings were tested against the results of the questionnaire and new scientific research in the field of psychology. The cyber domain was found to be "fast and vast", volatile and uncontrollable. Although governed by laws to some extent, the cyber domain is unpredictable by nature and cannot be controlled to any reasonable degree. The anonymity and lack of verification often present in digital channels mean that anyone can have an opinion, and any message sent may change or even become counterproductive to its original purpose. The TAA method of FM 3-05.301 is applicable in the cyber domain, although some parts of the method are outdated and should be updated if used in that environment. The target audience categories of step two of the process were replaced by new groups that exist in the digital environment. The accessibility assessment (step eight) was also redefined, since in digital media the mere existence of a written text is typically not enough to convey the intended message to the target audience. The scientific studies made in computer science, psychology and sociology on the behaviour of people in social media (and in the cyber domain generally) call for a more extensive remake of the TAA process; this falls, however, outside the scope of this work. It is thus suggested that further research be carried out on computer-assisted methods and a more thorough TAA process, utilising the latest discoveries about human behaviour.
Abstract:
Prostate cancer is a heterogeneous disease affecting an increasing number of men all over the world, particularly in countries with a Western lifestyle. The best biomarker assay currently available for the diagnosis of the disease, the measurement of prostate-specific antigen (PSA) levels in blood, lacks specificity, and even when combined with invasive tests such as the digital rectal exam and prostate tissue biopsies, these methods can both miss cancers and lead to overdiagnosis and subsequent overtreatment. Moreover, they cannot provide an accurate prognosis for the disease. Due to the high prevalence of indolent prostate cancers, the majority of men affected by prostate cancer would be able to live without any medical intervention. Their latent prostate tumors would not cause any clinical symptoms during their lifetime, but few are willing to take the risk, as there are currently no methods or biomarkers to reliably differentiate the indolent cancers from the aggressive, lethal cases that really are in need of immediate medical treatment. This doctoral work concentrated on validating 12 novel candidate genes for use as biomarkers for prostate cancer by measuring their mRNA expression levels in the prostate tissue and peripheral blood of men with cancer as well as of unaffected individuals. The panel of genes included the most prominent markers in the current literature, PCA3 and the fusion gene TMPRSS2-ERG, in addition to BMP-6, FGF-8b, MSMB, PSCA, SPINK1 and TRPM8, and the kallikrein-related peptidase genes 2, 3, 4 and 15. Truly quantitative reverse-transcription PCR assays were developed for each of the genes for this purpose; time-resolved fluorometry was applied in the real-time detection of the amplification products, and the gene expression data were normalized using artificial internal RNA standards. Cancer-related, statistically significant differences in gene transcript levels were found for TMPRSS2-ERG and PCA3, and on a more modest scale for KLK15, PSCA and SPINK1. PCA3 RNA was found in the blood of men with metastatic prostate cancer, but not in localized cases, suggesting limitations to using this method for early cancer detection in blood. TMPRSS2-ERG mRNA transcripts were found more frequently in cancerous than in benign prostate tissues, but they were also present in 51% of the histologically benign prostate tissues of men with prostate cancer, while being absent in specimens from men without any signs of prostate cancer. PCA3 was shown to be 5.8 times overexpressed in cancerous tissue, but, similarly to the fusion gene mRNA, its levels were upregulated also in the histologically benign regions of the tissue if the corresponding prostate was harboring carcinoma. These results indicate a possibility to utilize these molecular assays to assist in prostate cancer risk evaluation, especially in men with initially histologically negative biopsies.
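As a toy illustration of the normalization step mentioned above (invented signal values and a simple proportionality assumption, not the assay's actual calibration), a target gene's signal can be scaled against an artificial internal RNA standard of known copy number:

    # Sketch: normalise a target-gene signal against an artificial
    # internal RNA standard spiked in at a known copy number.
    def copies_per_reaction(target_signal, standard_signal, standard_copies):
        """Assumes detected signal is proportional to template copies."""
        return target_signal / standard_signal * standard_copies

    # Hypothetical time-resolved fluorometry readings (arbitrary units)
    print(copies_per_reaction(5200.0, 1300.0, 1e4))   # -> 40000.0 copies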
Abstract:
Digital business ecosystems (DBE) are becoming an increasingly popular concept for modelling and building distributed systems in heterogeneous, decentralized and open environments. Information and communication technology (ICT) enabled business solutions have created an opportunity for automated business relations and transactions. The deployment of ICT in business-to-business (B2B) integration seeks to improve competitiveness by establishing real-time information flows and offering better information visibility to business ecosystem actors. Product, component and raw material flows in supply chains are traditionally studied in logistics research. In this study, we expand the research to cover the processes parallel to the service and information flows, as information logistics integration. In this thesis, we show how better integration and automation of information flows enhance the speed of processes and thus provide cost savings and other benefits for organizations. Investments in DBE are intended to add value through business automation and are key decisions in building up information logistics integration. Business solutions that build on automation are important sources of value in networks that promote and support business relations and transactions. Value is created through improved productivity and effectiveness when new, more efficient collaboration methods are discovered and integrated into the DBE. Organizations, business networks and collaborations, even with competitors, form DBEs in which information logistics integration has a significant role as a value driver. However, traditional economic and computing theories do not treat digital business ecosystems as a separate form of organization, and they do not provide conceptual frameworks that can be used to explore digital business ecosystems as value drivers; combined internal management and external coordination mechanisms for information logistics integration are not current practice in a company's strategic process. In this thesis, we have developed and tested a framework to explore digital business ecosystems and a coordination model for digital business ecosystem integration; moreover, we have analysed the value of information logistics integration. The research is based on a case study and on mixed methods, in which we use the Delphi method and Internet-based tools for idea generation and development. We conducted many interviews with key experts, which we recorded, transcribed and coded to find success factors. Quantitative analyses were based on a Monte Carlo simulation, which sought cost savings, and on Real Option Valuation, which sought an optimal investment program at the ecosystem level. This study provides valuable knowledge regarding information logistics integration by utilizing a suitable business process information model for collaboration. The information model is based on business process scenarios and on detailed transactions for the mapping and automation of product, service and information flows. The research results illustrate the current gap in understanding information logistics integration in a digital business ecosystem. Based on the success factors, we were able to illustrate how specific coordination mechanisms related to network management and orchestration could be designed. We also pointed out the potential of information logistics integration in value creation. With the help of global standardization experts, we designed the core information model for B2B integration.
We built the quantitative analysis using a Monte Carlo-based simulation model and the Real Option Value model. This research covers relevant new research disciplines, such as information logistics integration and digital business ecosystems, where the current literature needs to be improved. The research was carried out with high-level experts and managers responsible for global business network B2B integration. However, it was dominated by one industry domain, and a more comprehensive exploration should therefore be undertaken to cover a larger population of business sectors. Building on this research, a new quantitative survey could provide new possibilities to examine information logistics integration in digital business ecosystems. The value activities indicate that further studies should continue, especially with regard to collaboration issues in integration, focusing on a user-centric approach. We should better understand how real-time information supports customer value creation by embedding the information into the lifetime value of products and services. The aim of this research was to build competitive advantage through B2B integration to support a real-time economy. For practitioners, this research created several tools and concepts to improve value activities, information logistics integration design, and management and orchestration models. Based on the results, the companies were able to better understand the formulation of the digital business ecosystem and the importance of joint efforts in collaboration. However, the challenge of incorporating this new knowledge into strategic processes in a multi-stakeholder environment remains. This challenge has been noted, and new projects have been established in pursuit of a real-time economy.
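The Monte Carlo style of cost-savings estimate mentioned above can be sketched as follows (numpy-based; the distributions and their parameters are invented for illustration, as the study's inputs are not reproduced here):

    # Sketch: Monte Carlo estimate of annual cost savings from B2B
    # integration, with assumed input distributions.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    saving_per_tx = rng.triangular(0.5, 2.0, 5.0, n)   # EUR per transaction (assumed)
    tx_volume = rng.normal(200_000, 30_000, n)         # transactions per year (assumed)
    annual_saving = saving_per_tx * tx_volume

    print(f"mean EUR {annual_saving.mean():,.0f}, "
          f"5th-95th percentile EUR {np.percentile(annual_saving, 5):,.0f}"
          f" - {np.percentile(annual_saving, 95):,.0f}")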