62 results for Geo-scientific processing
Abstract:
The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data grows rapidly, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is commonly used to make originally incomplete data complete, making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. We observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA). Finally, we studied the visualization of biological network data. Biological interaction networks are a typical outcome of multiple biological experiments, such as gene microarray studies.
Such networks are typically very large and highly connected, so there is a need for fast algorithms for producing visually pleasing layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed. The algorithm uses multilevel optimization within a regular force-directed graph layout algorithm.
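The k-NN imputation mentioned above can be sketched in a few lines. This is a minimal illustration of the general technique (estimate each missing entry from the k most similar rows), not the thesis's implementation; the function name and distance choice are assumptions for the example.

```python
import numpy as np

def knn_impute(X, k=3):
    """Fill NaN entries of a samples-x-genes matrix with the average of the
    k nearest rows (Euclidean distance over the columns observed in both)."""
    X = X.astype(float).copy()
    for i, row in enumerate(X):
        missing = np.isnan(row)
        if not missing.any():
            continue
        observed = ~missing
        dists = []
        for j, other in enumerate(X):
            # Candidate neighbours must have values at this row's missing columns.
            if j == i or np.isnan(other[missing]).any():
                continue
            mask = observed & ~np.isnan(other)
            if mask.sum() == 0:
                continue
            dists.append((np.linalg.norm(row[mask] - other[mask]), j))
        dists.sort()
        neighbours = [j for _, j in dists[:k]]
        # Impute each missing column with the neighbours' mean for that column.
        X[i, missing] = X[neighbours][:, missing].mean(axis=0)
    return X
```

Libraries such as scikit-learn offer an equivalent `KNNImputer`; the loop above only makes the neighbour-selection step explicit.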
Abstract:
In this thesis, the suitability of different trackers for finger tracking in high-speed videos was studied. Tracked finger trajectories from the videos were post-processed and analysed using various filtering and smoothing methods. Position derivatives of the trajectories, speed and acceleration, were extracted for the purposes of hand motion analysis. Overall, two methods, Kernelized Correlation Filters and Spatio-Temporal Context Learning tracking, performed better than the others in the tests. Both achieved high accuracy on the selected high-speed videos and also allowed real-time processing, being able to process over 500 frames per second. In addition, the results showed that different filtering methods can be applied to produce more appropriate velocity and acceleration curves from the tracking data. Local Regression filtering and the Unscented Kalman Smoother gave the best results in the tests. Furthermore, the results show that the tracking and filtering methods are suitable for high-speed hand tracking and trajectory-data post-processing.
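Extracting speed and acceleration from a tracked trajectory amounts to numerical differentiation of the position samples. The sketch below uses central differences at an assumed 500 fps and a plain moving average as a lightweight stand-in for the Local Regression and Unscented Kalman smoothers named above; function names are illustrative.

```python
import numpy as np

def derivatives(positions, fps=500):
    """Central-difference speed and acceleration from an (N, 2) pixel trajectory."""
    dt = 1.0 / fps
    velocity = np.gradient(positions, dt, axis=0)   # px/s, per axis
    speed = np.linalg.norm(velocity, axis=1)
    accel = np.linalg.norm(np.gradient(velocity, dt, axis=0), axis=1)
    return speed, accel

def smooth(signal, window=5):
    """Moving average; a stand-in for heavier smoothers such as LOESS or a UKS."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")
```

Raw differentiation amplifies tracking jitter, which is why the thesis applies smoothing before interpreting the velocity and acceleration curves.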
Abstract:
The aim of this master’s thesis is to research and analyze how purchase invoice processing can be automated and streamlined in a system renewal project. The impacts of workflow automation on invoice handling are studied in terms of time, cost, and quality. Purchase invoice processing has a lot of potential for automation because of its labor-intensive and repetitive nature. As a case study combining both qualitative and quantitative methods, the topic is approached from a business process management point of view. The current process was first explored through interviews and workshop meetings to create a holistic understanding of the process at hand. Requirements for process streamlining were then researched, focusing on specified vendors and their purchase invoices, which helped to identify the critical factors for successful invoice automation. To optimize the flow from invoice receipt to approval for payment, the invoice receiving process was outsourced and the automation functionalities of the new system were utilized in invoice handling. The quality of invoice data and the need for simple, structured purchase order (PO) invoices were emphasized in the system testing phase. Hence, consolidated invoices containing references to multiple PO or blanket release numbers should be simplified in order to use automated PO matching. With non-PO invoices, it is important to receive the buyer reference details in an applicable invoice data field so that automation rules can be created to route invoices to a review and approval flow. At the beginning of the project, invoice processing was seen as ineffective both time- and cost-wise, and it required a lot of manual labor to carry out all tasks. Based on the testing results, it was estimated that over half of the invoices could be automated within a year after system implementation. Processing times could be reduced remarkably, which would result in savings of up to 40% in annual processing costs.
Due to several advancements in the purchase invoice process, business process quality could also be perceived as improved.
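The routing logic described above (automated PO matching for simple PO invoices, a review/approval flow for non-PO invoices with a buyer reference, manual handling otherwise) can be sketched as a rule function. The field names are assumptions for illustration, not the case system's actual data model.

```python
def route_invoice(invoice):
    """Illustrative routing rules; 'po_numbers' and 'buyer_reference'
    are hypothetical field names, not the case system's schema."""
    po_refs = invoice.get("po_numbers", [])
    if len(po_refs) == 1:
        return "automated_po_matching"
    if len(po_refs) > 1:
        # Consolidated invoices must be simplified before automated matching.
        return "manual"
    if invoice.get("buyer_reference"):
        return "review_and_approval"
    return "manual"
```

In practice such rules live in the invoice workflow system's configuration rather than in code, but the decision structure is the same.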
Abstract:
Feature extraction is the part of pattern recognition in which the sensor data is transformed into a form more suitable for the machine to interpret. The purpose of this step is also to reduce the amount of information passed to the next stages of the system while preserving the information essential for discriminating the data into different classes. For instance, in image analysis the raw image intensities are vulnerable to various environmental effects, such as lighting changes, and feature extraction can be used as a means of detecting features that are invariant to certain types of illumination changes. Finally, classification tries to make decisions based on the previously transformed data. The main focus of this thesis is on developing new methods for embedded feature extraction based on local non-parametric image descriptors. Feature analysis is also carried out for the selected image features, with low-level Local Binary Pattern (LBP) based features in a main role. In the embedded domain, the pattern recognition system must usually meet strict performance constraints, such as high speed, compact size, and low power consumption. The characteristics of the final system can be seen as a trade-off between these metrics, which is largely affected by the decisions made during the implementation phase. The implementation alternatives of LBP-based feature extraction are explored in the embedded domain in the context of focal-plane vision processors. In particular, the thesis demonstrates LBP extraction with the MIPA4k massively parallel focal-plane processor IC. Higher-level processing is also incorporated by means of a framework for implementing a single-chip face recognition system. Furthermore, a new method for determining optical flow based on LBPs, designed in particular for the embedded domain, is presented.
Inspired by some of the principles observed through the feature analysis of Local Binary Patterns, an extension to the well-known non-parametric rank transform is proposed, and its performance is evaluated in face recognition experiments with a standard dataset. Finally, an a priori model in which LBPs are seen as combinations of n-tuples is also presented.
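For readers unfamiliar with the descriptor, the basic 3x3 LBP thresholds each pixel's 8 neighbours against the centre and packs the results into an 8-bit code. The sketch below is a plain software version of that standard operator, not the thesis's focal-plane (MIPA4k) implementation.

```python
import numpy as np

def lbp_8(image):
    """Basic 3x3 Local Binary Pattern: for each interior pixel, threshold the
    8 neighbours against the centre and pack the bits into a uint8 code."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    # Neighbour offsets in a fixed clockwise order; index doubles as bit weight.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    centre = img[1:h-1, 1:w-1]
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1+dy:h-1+dy, 1+dx:w-1+dx]
        codes |= (neigh >= centre).astype(np.uint8) << bit
    return codes
```

Because only the sign of each neighbour-centre difference is kept, the code is invariant to any monotonic change in illumination, which is the property the thesis exploits.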
Abstract:
Bar scale: Russian versts.
Abstract:
1:25000.
Abstract:
Sheets: 1 leaf without a signature mark, A-B4. - P. [2] blank.
Abstract:
In recent years, technological advancements in microelectronics and sensor technologies have revolutionized the field of electrical engineering. New manufacturing techniques have enabled a higher level of integration that has combined sensors and electronics into compact and inexpensive systems. Previously, the challenge in measurements was to understand the operation of the electronics and sensors, but this has now changed. Nowadays, the challenge in measurement instrumentation lies in mastering the whole system, not just the electronics. To address this issue, this doctoral dissertation studies whether it is beneficial to consider a measurement system as a whole, from the physical phenomenon to the digital recording device, where each piece of the measurement system affects the system performance, rather than as a collection of small independent parts, such as a sensor or an amplifier, that could be designed separately. The objective of this doctoral dissertation is to describe in depth the development of a measurement system, taking into account the challenges caused by the electrical and mechanical requirements and the measurement environment. The work is done as an empirical case study on two example applications, both intended for scientific studies. The cases are a light-sensitive biological sensor used in imaging and a gas electron multiplier detector for particle physics. The study showed that in these two cases a number of different parts of the measurement system interacted with each other. Without considering these interactions, the reliability of the measurement may be compromised, which may lead to wrong conclusions about the measurement. For this reason it is beneficial to conceptualize the measurement system as a whole, from the physical phenomenon to the digital recording device, where each piece of the measurement system affects the system performance.
The results serve as examples of how a measurement system can be successfully constructed to support the study of sensors and electronics.
Abstract:
This study examines a waste power plant’s optimal processing chain. To ensure that the right decision is made, it is important to consider from several points of view why one option is better than another. Incineration of waste has developed into a decent option for waste disposal. There are several legislative matters and technical options to consider when starting up a waste power plant. Of the techniques, pretreatment, the burner, and flue gas cleaning are the biggest ones to consider. The treatment of incineration residues is important, since they can be very harmful to the environment. The actual energy production from waste is not highly efficient, and several harmful compounds are emitted. Recycling of waste before incineration is not very typical, and there are not many recycling options for materials that cannot easily be recycled into the same product. Life cycle assessment is a good option for studying the environmental effects of the system. It has four phases that are part of the iterative study process. In this study the case environment is a waste power plant. The modeling of the plant is done with GaBi 6 software, and the scope is from gate to grave. There are three different scenarios, of which the first and second are compared with each other to reach conclusions. The zero scenario is part of the study to demonstrate the situation without the power plant. In this study the power plant recycles some materials in scenario one; in scenario two it recycles even more materials and utilizes the bottom ash in more ways than one. The model has substitutive processes for the materials when they are not recycled in the plant. The global warming potential results show that scenario one is the best option. The variable costs that have been considered point to the same result. The conclusion is that the waste power plant should not recycle more or utilize bottom ash in a number of ways.
The area is not ready for that kind of utilization and production from recycled materials.
Abstract:
This doctoral study conducts an empirical analysis of the impact of Word-of-Mouth (WOM) on marketing-relevant outcomes, such as attitudes and consumer choice, during a high-involvement and complex service decision. Due to its importance to decision-making, WOM has attracted interest from academia and practitioners for decades. Consumers are known to discuss products and services with one another. These discussions help consumers to form an evaluative opinion, as WOM reduces perceived risk, simplifies complexity, and increases the confidence of consumers in decision-making. These discussions are also highly impactful, as WOM is a trustworthy source of information, since it is independent of the company or brand. In responding to calls for more research on what happens after WOM information is received and how it affects marketing-relevant outcomes, this dissertation extends prior WOM literature by investigating how consumers process information in a high-involvement service domain, in particular higher education. Further, the dissertation studies how the form of WOM influences consumer choice. The research contributes to the WOM and services marketing literature by developing and empirically testing a framework for information processing and studying the long-term effects of WOM. The results of the dissertation are presented in five research publications. The publications are based on longitudinal data. The research leads to the development of a proposed theoretical framework for the processing of WOM, based on theories from social psychology. The framework is specifically focused on service decisions, as it takes into account evaluation difficulty through the complex nature of the choice criteria associated with service purchase decisions. Further, other gaps in the current WOM literature are taken into account by, for example, examining how the source of WOM and service values affect the processing mechanism.
The research also provides implications for managers aiming to trigger favorable WOM through marketing efforts, such as advertising and testimonials. The results provide suggestions on how to design these marketing efforts by taking into account the mechanism through which information is processed, or the form of social influence.