39 results for stream processing crowdsensing scheduling traffic analysis
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
The aim of this Master's thesis was to select the most suitable acquisition target from among several competitors for a supplier of wood-handling machinery. First, the Finnish forest industry and the forest cluster that grew out of its competence needs were introduced, mainly from the target company's perspective. Next, an overview of the company's products, competitors and customers was given. The acquisition process was described, and the general motives and critical success factors were presented. In addition, the analysis of competitors and the business environment was described as a precondition for company success. The woodworking machinery market was segmented and analysed from 1990 to the present day in order to find areas with development potential, i.e. areas where the company's market share could be increased. The candidates' characteristics were compared against the acquisition motives. The companies' products and geographical locations were scored so that the most suitable companies would stand out. Three companies were selected for deeper examination. The companies' products, financial position and global networks were compared with each other alongside other factors, such as the world economy. A financially stable and technically versatile company met the acquisition motives best. The target's positive aspects were its location, products and services. In addition, the company fits the buyer's strategy and helps to meet customers' current and future needs.
Abstract:
Diabetes is a rapidly increasing worldwide problem, characterised by defective metabolism of glucose that causes long-term dysfunction and failure of various organs. The most common complication of diabetes is diabetic retinopathy (DR), which is one of the primary causes of blindness and visual impairment in adults. The rapid increase of diabetes pushes the limits of the current DR screening capabilities, for which digital imaging of the eye fundus (retinal imaging) and automatic or semi-automatic image analysis algorithms provide a potential solution. In this work, the use of colour in the detection of diabetic retinopathy is statistically studied using a supervised algorithm based on one-class classification and Gaussian mixture model estimation. The presented algorithm distinguishes a certain diabetic lesion type from all other possible objects in eye fundus images by estimating only the probability density function of that lesion type. For the training and ground truth estimation, the algorithm combines manual annotations of several experts, for which the best practices were experimentally selected. By assessing the algorithm's performance while conducting experiments with the colour space selection, illuminance and colour correction, and background class information, the use of colour in the detection of diabetic retinopathy was quantitatively evaluated. Another contribution of this work is the benchmarking framework for eye fundus image analysis algorithms needed for the development of automatic DR detection algorithms. The benchmarking framework provides guidelines on how to construct a benchmarking database that comprises true patient images, ground truth, and an evaluation protocol. The evaluation is based on standard receiver operating characteristic analysis and follows medical practice in decision making, providing protocols for image- and pixel-based evaluations.
During the work, two public medical image databases with ground truth were published: DIARETDB0 and DIARETDB1. The framework, the DR databases and the final algorithm are made public on the web to set the baseline results for automatic detection of diabetic retinopathy. Although deviating from the general context of the thesis, a simple and effective optic disc localisation method is presented. The optic disc localisation is discussed, since normal eye fundus structures are fundamental in the characterisation of DR.
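The one-class idea described above can be sketched in a few lines: fit a Gaussian mixture to the colour values of annotated lesion pixels only, and accept a pixel as lesion if its likelihood under that model exceeds a threshold. The synthetic data, component count, and threshold rule below are illustrative assumptions, not the thesis's actual configuration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical training data: RGB values of pixels annotated as one lesion
# type (dark-reddish colours), combined from several expert annotations.
lesion_pixels = rng.normal(loc=[120, 40, 40], scale=10, size=(500, 3))

# One-class setting: only the lesion class density is estimated;
# no background model is needed at training time.
gmm = GaussianMixture(n_components=2, random_state=0).fit(lesion_pixels)

# Pick a likelihood threshold from the training data, e.g. so that 95 %
# of known lesion pixels are accepted (an illustrative choice).
threshold = np.quantile(gmm.score_samples(lesion_pixels), 0.05)

def is_lesion(pixels):
    """Flag pixels whose log-likelihood under the lesion model passes the threshold."""
    return gmm.score_samples(pixels) >= threshold

test_lesion = rng.normal(loc=[120, 40, 40], scale=10, size=(100, 3))
test_background = rng.normal(loc=[200, 180, 150], scale=10, size=(100, 3))
print(is_lesion(test_lesion).mean(), is_lesion(test_background).mean())
```

Everything outside the lesion density, background included, is rejected by the same threshold, which is what lets the method work without modelling all other fundus objects.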
Abstract:
Studying testis is complex, because the tissue has a very heterogeneous cell composition and its structure changes dynamically during development. In the reproductive field, the cell composition is traditionally studied by morphometric methods such as immunohistochemistry and immunofluorescence. These techniques provide accurate quantitative information about cell composition, cell-cell association and localization of the cells of interest. However, the sample preparation, processing, staining and data analysis are laborious and may take several working days. Flow cytometry protocols coupled with DNA stains have played an important role in providing quantitative information on testicular cell populations in ex vivo and in vitro studies. Nevertheless, the addition of specific cell markers such as intracellular antibodies would allow more specific identification of cells of crucial interest during spermatogenesis. For this study, adult Sprague-Dawley rats were used for optimization of the flow cytometry protocol. Specific steps within the protocol were optimized to obtain a single-cell suspension representative of the cell composition of the starting material. The fixation and permeabilization procedures were optimized to be compatible with DNA stains and fluorescent intracellular antibodies. Optimization was achieved by quantitative analysis of specific parameters such as recovery of meiotic cells, amount of debris and comparison of the proportions of the various cell populations with already published data. As a result, a new and fast flow cytometry method coupled with DNA staining and intracellular antigen detection was developed. This new technique is suitable for analysis of population behavior and specific cells during postnatal testis development and spermatogenesis in rodents. This rapid protocol recapitulated the known vimentin and γH2AX protein expression patterns during rodent testis ontogenesis.
Moreover, the assay was applicable for phenotype characterization of SCRbKO and E2F1KO mouse models.
Abstract:
This Master's thesis assumes that the fourth-generation mobile network is a seamless combination of the existing second- and third-generation wireless networks together with short-range WLAN and Bluetooth radio technologies. These technologies are also assumed to be so interoperable that the user does not notice a change of access network. The thesis presents the architecture and basic operating principles of the most important wireless technologies related to fourth-generation mobile networks. It describes different techniques and practices for measuring and collecting data. The obtained transaction measurements can be used in offering differentiated service levels and in optimizing network and service capacity. In addition, the thesis introduces the Internet Business Information Manager, a software framework for distributed data collection. The measurement data it collects can be used for service-level monitoring and reporting as well as for billing. In the practical part of the work, the aim was to develop an agent that monitors wireless network traffic and observes quality of service. The agent was to reside in a mobile phone, measuring network traffic. The agent could not, however, be implemented because the software environment proved insufficient. In any case, the work showed that, from the user's point of view, there is a real need for data-collecting agents.
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult as the current popular programming languages are inherently sequential and introducing parallelism is typically up to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent a data dependency in the form of a queue. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, the node can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field. Digital filters are typically described with boxes and arrows also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle, any application working on an information stream fits the dataflow paradigm. Such applications are, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
The explicit parallelism of a dataflow program is descriptive and enables an improved utilization of available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then an, as small as possible, set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which are able to produce quasi-static schedulers for a wide range of applications.
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized, in the context of design space exploration, by the development tools to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
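The execution model described above, nodes communicating only through queues and firing once sufficient inputs are available, can be illustrated with a minimal sketch. This is a toy dynamic scheduler for demonstration only, not RVC-CAL or the thesis's compiler infrastructure; all names are made up.

```python
from collections import deque

class Node:
    """A dataflow actor: fires when every input queue holds at least one token."""
    def __init__(self, func, n_inputs):
        self.func = func
        self.inputs = [deque() for _ in range(n_inputs)]
        self.outputs = []                     # downstream (node, input port) pairs

    def can_fire(self):
        # The firing rule here is the simplest possible: one token per input.
        return all(q for q in self.inputs)

    def fire(self):
        # Consume one token from each input, compute, push the result downstream.
        args = [q.popleft() for q in self.inputs]
        result = self.func(*args)
        for node, port in self.outputs:
            node.inputs[port].append(result)
        return result

# A tiny graph computing (a + b) * 2; the queues make the dependency explicit.
add = Node(lambda x, y: x + y, n_inputs=2)
dbl = Node(lambda x: x * 2, n_inputs=1)
add.outputs.append((dbl, 0))

add.inputs[0].append(3)
add.inputs[1].append(4)

# A trivial dynamic scheduler: repeatedly fire any node whose rule is satisfied.
results = []
for node in (add, dbl):
    while node.can_fire():
        results.append(node.fire())
print(results)   # [7, 14]
```

In this toy scheduler every firing re-evaluates the firing rule at run-time; the quasi-static approach of the thesis replaces most of those checks with pre-calculated static sequences.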
Abstract:
This paper explores behavioral patterns of web users on an online magazine website. The goal of the study is first to find and visualize user paths within the data generated during collection, and then to identify some generic typologies of user behavior. To form a theoretical foundation for processing data and identifying behavioral archetypes, the study relies on established consumer behavior literature to propose typologies of behavior. For data processing, the study utilizes methodologies of applied cluster analysis and sequential path analysis, drawing on a dataset of clickstream data generated from the real-life clicks of 250 randomly selected website visitors over a period of six weeks. Based on the data collected, an exploratory method is followed in order to find and visualize generally occurring paths of users on the website. Six distinct behavioral typologies were recognized, with the dominant user consuming mainly blog content, as opposed to editorial content. Most importantly, it was observed that approximately 80% of clicks were of the blog content category, meaning that the majority of web traffic on the site takes place in content other than the desired editorial content pages. The outcome of the study is a set of managerial recommendations for each identified behavioral archetype.
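As an illustration of the clustering step, visitors can be grouped by the share of their clicks falling in each content category. The synthetic data, the category split, and the cluster count below are assumptions for demonstration, not the study's actual dataset or method details.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Hypothetical per-visitor features: share of clicks in each content category
# (blog, editorial, other); each row sums to 1. Most visitors are blog-heavy,
# mirroring the ~80 % blog-click observation in the study.
blog_heavy = rng.dirichlet([8, 1, 1], size=120)
editorial_heavy = rng.dirichlet([1, 8, 1], size=30)
sessions = np.vstack([blog_heavy, editorial_heavy])

# Cluster visitors into behavioural groups; each cluster centre summarises
# a group's typical content mix and can be read as a behavioural archetype.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(sessions)
centres = km.cluster_centers_
print(np.round(centres, 2))
```

Sequential path analysis would then examine click order within each cluster, which a bag-of-categories representation like this one deliberately ignores.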
Abstract:
It is clear that the centre of gravity of world trade is gradually shifting towards Asia, and China in particular has been at the centre of attention. Especially from the perspective of manufacturing companies the change has been significant, and this fact increases the pressure on companies to create cost-effective supply chain solutions with response times as short as possible. At the same time, an examination of transport flows reveals a large imbalance between continents. This is mostly a consequence of the supply chain strategies of large, globally operating companies. Most of these actors optimize their networks by resorting to 'local sourcing' in order to better control their supply chains and make them more responsive. Manufacturing units in Europe are therefore often forced to use expensive raw materials and semi-finished products. Transport and warehousing costs prove to be critical factors, as does the resulting idle time caused by delays. To reach an optimal solution, a decision must be made on whether products are stored centrally or decentrally, and this choice must be integrated with suitable transport modes. Sea transport from Asia to Northern Europe is cheap, but the operation takes a very long time, in some cases up to eight weeks. Air transport, on the other hand, is both expensive and limits the batch size of the goods moved. There is also a third option that could provide a solution: rail transport is cheaper than air transport, and response times are shorter than in sea transport. In this study the situation is examined with a survey directed at companies operating in Finland and Sweden. Based on the results, we draw conclusions on what the future market shares of the transport modes will be and form a picture of the transport flows between Europe, Russia, South Korea, India, China and Japan.
At the same time, the aim is to anticipate how the companies under examination intend to develop their transport and warehousing in the coming years. Based on the results, it appears that transport costs will not change significantly over the next five years, and sea and road transport will remain the most popular options. However, the share of air transport will decline slightly, while the emphasis on rail transport will grow. The results reveal that the container volumes transported in China and Russia will increase; in India the trend is similar, although not as strong. According to our analysis, the imbalance in transport flows will persist for Russian traffic: companies will continue their export-based strategy in the future as well. On the warehousing side, we identify a smaller change: the number of small warehouses will probably decrease in the future, while interest in large warehouses will increase. It should be mentioned here that Finnish companies have more warehouses in Central and Eastern Europe than Swedish actors, which focus more clearly on Western European countries. In both cases, the companies keep a large share of their warehouses in their home country. When choosing locations for their warehouses, companies emphasize the following criteria: low distribution costs, proximity to the assembly site or manufacturing plant, the ability to integrate inbound logistics, and the available logistics services. At the end of our study we conclude that, because of the structure of ports and transport connections, warehouse locations will not change very quickly.
Abstract:
This thesis researches automatic traffic sign inventory and condition analysis using machine vision and pattern recognition methods. Automatic traffic sign inventory and condition analysis can be used to make road maintenance more efficient, to improve maintenance processes, and to enable intelligent driving systems. Automatic traffic sign detection and classification has been researched before from the viewpoint of self-driving vehicles, driver assistance systems, and the use of signs in mapping services. Machine vision based inventory of traffic signs consists of detection, classification, localization, and condition analysis of traffic signs. The performance of the produced machine vision system is estimated with three datasets, two of which have been collected for this thesis. Based on the experiments, almost all traffic signs can be detected, classified, and localized, and their condition analysed. In the future, the performance of the inventory system has to be verified in challenging conditions and the system has to be pilot tested.
Abstract:
In this thesis, the suitability of different trackers for finger tracking in high-speed videos was studied. Tracked finger trajectories from the videos were post-processed and analysed using various filtering and smoothing methods. Position derivatives of the trajectories, speed and acceleration were extracted for the purposes of hand motion analysis. Overall, two methods, Kernelized Correlation Filters and Spatio-Temporal Context Learning tracking, performed better than the others in the tests. Both achieved high accuracy for the selected high-speed videos and also allowed real-time processing, being able to process over 500 frames per second. In addition, the results showed that different filtering methods can be applied to produce more appropriate velocity and acceleration curves calculated from the tracking data. Local Regression filtering and Unscented Kalman Smoother gave the best results in the tests. Furthermore, the results show that tracking and filtering methods are suitable for high-speed hand-tracking and trajectory-data post-processing.
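The derivative-extraction step described above can be sketched as follows. A plain moving average stands in for the Local Regression and Unscented Kalman smoothing used in the thesis, and the signal, noise level, and frame rate are illustrative assumptions.

```python
import numpy as np

fps = 500.0                # assumed high-speed camera frame rate (frames/s)
dt = 1.0 / fps

# Hypothetical 1-D fingertip x-positions (pixels): a 5 Hz oscillation with
# measurement noise, standing in for a tracked finger trajectory.
t = np.arange(0.0, 0.2, dt)
rng = np.random.default_rng(2)
x = 100.0 * np.sin(2 * np.pi * 5 * t) + rng.normal(0.0, 0.5, t.size)

def smooth(signal, window=15):
    """Moving-average smoothing; a simple stand-in for LOESS/Kalman smoothing."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

x_smooth = smooth(x)
velocity = np.gradient(x_smooth, dt)          # first derivative: speed (px/s)
acceleration = np.gradient(velocity, dt)      # second derivative (px/s^2)
print(velocity.shape, acceleration.shape)
```

Smoothing before differentiation matters because numerical differentiation amplifies noise; differentiating the raw positions directly would swamp the acceleration curve, which is why the thesis compares dedicated filtering methods.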
Abstract:
Feature extraction is the part of pattern recognition where the sensor data is transformed into a form more suitable for the machine to interpret. The purpose of this step is also to reduce the amount of information passed to the next stages of the system, and to preserve the information essential for discriminating the data into different classes. For instance, in the case of image analysis, the actual image intensities are vulnerable to various environmental effects, such as lighting changes, and feature extraction can be used as a means for detecting features which are invariant to certain types of illumination changes. Finally, classification tries to make decisions based on the previously transformed data. The main focus of this thesis is on developing new methods for embedded feature extraction based on local non-parametric image descriptors. Also, feature analysis is carried out for the selected image features. Low-level Local Binary Pattern (LBP) based features play a main role in the analysis. In the embedded domain, the pattern recognition system must usually meet strict performance constraints, such as high speed, compact size and low power consumption. The characteristics of the final system can be seen as a trade-off between these metrics, which is largely affected by the decisions made during the implementation phase. The implementation alternatives of LBP based feature extraction are explored in the embedded domain in the context of focal-plane vision processors. In particular, the thesis demonstrates LBP extraction with the MIPA4k massively parallel focal-plane processor IC. Higher-level processing is also incorporated into this framework, by means of a framework for implementing a single-chip face recognition system. Furthermore, a new method for determining optical flow based on LBPs, designed in particular for the embedded domain, is presented.
Inspired by some of the principles observed through the feature analysis of the Local Binary Patterns, an extension to the well-known non-parametric rank transform is proposed, and its performance is evaluated in face recognition experiments with a standard dataset. Finally, an a priori model where the LBPs are seen as combinations of n-tuples is also presented.
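For reference, the basic 3x3 Local Binary Pattern that the thesis builds on can be computed as below. This is a plain software reference implementation, not the MIPA4k focal-plane version, and the clockwise bit ordering used here is just one common convention.

```python
import numpy as np

def lbp_codes(img):
    """Basic 3x3 Local Binary Pattern: threshold the 8 neighbours of each
    pixel against the centre value and pack the comparison bits into a byte.
    The resulting code is invariant to monotonic illumination changes."""
    # Neighbour offsets in clockwise order starting from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    centre = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbour >= centre).astype(np.uint8) << bit
    return codes

img = np.array([[1, 2, 3],
                [4, 5, 6],
                [7, 8, 9]], dtype=np.uint8)
print(lbp_codes(img))   # one code for the single interior pixel
```

Because each bit records only an ordering relation between neighbouring intensities, the descriptor is non-parametric in the same sense as the rank transform the thesis extends.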
Abstract:
Diabetic retinopathy, age-related macular degeneration and glaucoma are the leading causes of blindness worldwide. Automatic methods for diagnosis exist, but their performance is limited by the quality of the data. Spectral retinal images provide a significantly better representation of the colour information than common grayscale or red-green-blue retinal imaging, having the potential to improve the performance of automatic diagnosis methods. This work studies the image processing techniques required for composing spectral retinal images with accurate reflection spectra, including wavelength channel image registration, spectral and spatial calibration, illumination correction, and the estimation of depth information from image disparities. The composition of a spectral retinal image database of patients with diabetic retinopathy is described. The database includes gold standards for a number of pathologies and retinal structures, marked by two expert ophthalmologists. The diagnostic applications of the reflectance spectra are studied using supervised classifiers for lesion detection. In addition, inversion of a model of light transport is used to estimate histological parameters from the reflectance spectra. Experimental results suggest that the methods for composing, calibrating and postprocessing spectral images presented in this work can be used to improve the quality of the spectral data. The experiments on the direct and indirect use of the data show the diagnostic potential of spectral retinal data over standard retinal images. The use of spectral data could improve automatic and semi-automated diagnostics for the screening of retinal diseases, the quantitative detection of retinal changes for follow-up, clinically relevant end-points for clinical studies, and the development of new therapeutic modalities.
Abstract:
This study examines how MPEG-2 Transport Stream, used in DVB-T video transmission, can be reliably and efficiently transferred to remote locations over an MPLS network. All the relevant technologies used in this scenario are also discussed in the study. This study was done for Digita Oy, which is a major radio and television content distributor in Finland. The theoretical part of the study begins with the introduction to MPLS technology and continues with explanation of IP Multicast and its components. The fourth section discusses MPEG-2 and the formation and content of MPEG-2 Transport Stream. These technologies were studied in relevant literature and RFC documentation. After the theoretical part of the study, the test setup and the test cases are presented. The results of the test cases, and the conclusions that can be drawn based on them, are discussed in the last section of the study. The tests showed that it is possible to transfer digital video quite reliably over an MPLS network using IP Multicast. By configuring the equipment correctly, the recovery time of the network in case of a failure can be shortened remarkably. Also, the unwanted effect of other traffic on the critical video traffic can be eliminated by defining the Quality of Service parameters correctly. There are, however, some issues that need to be tested further before this setup can be used in broadcast networks. Reliable operation of IP Multicast and proper error correction are the main subjects for future testing.
Abstract:
The topic of this thesis is studying how lesions in the retina caused by diabetic retinopathy can be detected from color fundus images by using machine vision methods. Methods for equalizing uneven illumination in fundus images, detecting regions of poor image quality due to inadequate illumination, and recognizing abnormal lesions were developed during the work. The developed methods exploit mainly the color information and simple shape features to detect lesions. In addition, a graphical tool for collecting lesion data was developed. The tool was used by an ophthalmologist who marked lesions in the images to help method development and evaluation. The tool is a general-purpose one, and thus it is possible to reuse it in similar projects. The developed methods were tested with a separate test set of 128 color fundus images. From the test results it was calculated how accurately the methods classify abnormal funduses as abnormal (sensitivity) and healthy funduses as normal (specificity). The sensitivity values were 92% for hemorrhages, 73% for red small dots (microaneurysms and small hemorrhages), and 77% for exudates (hard and soft exudates). The specificity values were 75% for hemorrhages, 70% for red small dots, and 50% for exudates. Thus, the developed methods detected hemorrhages accurately, and microaneurysms and exudates moderately.
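The sensitivity and specificity figures quoted above follow the usual definitions, which can be computed per image as in this sketch; the example decisions and labels are made up for illustration.

```python
def sensitivity_specificity(predictions, labels):
    """Image-based evaluation: sensitivity is the fraction of abnormal images
    flagged abnormal; specificity is the fraction of normal images left normal."""
    tp = sum(p and l for p, l in zip(predictions, labels))          # true positives
    tn = sum(not p and not l for p, l in zip(predictions, labels))  # true negatives
    fn = sum(not p and l for p, l in zip(predictions, labels))      # missed lesions
    fp = sum(p and not l for p, l in zip(predictions, labels))      # false alarms
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical per-image decisions (True = lesion detected / lesion present).
predicted = [True, True, False, True, False, False]
actual    = [True, True, True,  False, False, False]
sens, spec = sensitivity_specificity(predicted, actual)
print(sens, spec)   # 2/3 sensitivity, 2/3 specificity
```

The 50% exudate specificity reported above illustrates the trade-off these two numbers capture: a detector can reach good sensitivity while still flagging many healthy funduses.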
Abstract:
Problems of designing active magnetic bearing control are considered. An estimation controller is designed and applied to a rigid rotor. A mathematical model of the active magnetic bearing controller is developed and realized on a DSP. The results of this realization are analyzed, and conclusions about the digital signal processing are drawn.