Abstract:
During the past decades, testing has matured from an ad-hoc activity into an integral part of the development process. The benefits of testing are obvious for modern communication systems, which operate in heterogeneous environments amongst devices from various manufacturers. The increased demand for testing also creates demand for tools and technologies that support and automate testing activities. This thesis discusses the applicability of visualization techniques in the result-analysis part of the testing process. In particular, the primary focus of this work is the visualization of test execution logs produced by a TTCN-3 test system. TTCN-3 is an internationally standardized test specification and implementation language. The TTCN-3 standard suite includes a specification of a test logging interface and a graphical presentation format, but no immediate relationship between them. This thesis presents a technique for mapping the log events to the graphical presentation format, along with a concrete implementation integrated with the Eclipse Platform and the OpenTTCN Tester toolchain. The results of this work indicate that for the majority of the log events, a visual representation may be derived from the TTCN-3 standard suite. The remaining events were analysed, and three categories relevant to either log analysis or the implementation of the visualization tool were identified: events indicating insertion of something into the incoming queue of a port, events indicating a mismatch, and events describing the control flow during the execution. The applicability of the results is limited to the domain of TTCN-3, but the developed mapping and the implementation may be utilized with any TTCN-3 tool that is able to produce the execution log in the standardized XML format.
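The kind of event-to-presentation mapping the abstract describes can be sketched as a lookup table. This is a hypothetical illustration: the event names and target elements below are invented for clarity and are not the identifiers used in the TTCN-3 standard suite or in the thesis.

```python
# Hypothetical mapping from TTCN-3 log event types to graphical elements,
# with the three special categories from the analysis called out explicitly.
MAPPING = {
    "tcstart": "testcase symbol",        # events with a direct graphical counterpart
    "send": "message-send arrow",
    "receive": "message-receive arrow",
    "enqueue": "port-queue category",    # insertion into a port's incoming queue
    "mismatch": "mismatch category",     # received value did not match a template
    "function": "control-flow category", # control flow during execution
}

def render(event_type):
    # Fall back to a generic note for events outside the mapping.
    return MAPPING.get(event_type, "unmapped event")

print(render("mismatch"))   # mismatch category
```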
Abstract:
The web is an enormous source of information. The content of web pages is a visible type of information, but the web also contains information of another, more hidden kind, in the form of the relationships and networks that hyperlinks create between the websites and pages they connect. The research field of webometrics aims, among other things, to create new knowledge from this hidden information embedded in hyperlinks, and to build an understanding of what kinds of phenomena and relationships outside the web may be represented in them. The goal of this research was to increase understanding of the use of hyperlinks on the web, and in particular of municipalities' use of hyperlinks. This research investigated how the municipalities of Egentliga Finland created and received hyperlinks, and what kinds of networks these hyperlinks formed. The research mapped networks of direct links between the municipalities and of co-links to and from the municipalities, and examined whether these networks could be used to study geopolitical relationships and cooperation between the municipalities of Egentliga Finland. The overarching research questions answered in this research are: 1) From a webometric perspective, how do the municipalities of Egentliga Finland use the web? 2) Can hyperlinks (direct links and co-links) be used to map geopolitical relationships and cooperation between municipalities? 3) What are the most important motivations for creating links between, to and from the municipalities' websites? This research arrived at unusually clear results for a webometric study, both regarding the discovered geographic factors influencing the hyperlinking and the classified motivations for creating the links.
The results showed that the direct hyperlinks between the municipalities can be used to map geopolitical relationships and cooperation between them, because the direct links were motivated by official reasons and were clearly influenced by the distance between the municipalities and by the economic regions. Co-inlinks to the municipalities proved to function as a measure of geographic similarity between the municipalities, while co-outlinks from the municipalities showed potential for mapping the municipalities' shared interests. The research also contributed to the development of the field of webometrics. Among the most important contributions of this research were the development of new methods for webometric research and increased knowledge of how existing methods from network analysis can be used effectively in webometric research. The results of this research and the developed methods can be used for rapid mapping of various relationships between organizations and companies, using information freely available on the web.
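The two link types the abstract distinguishes, direct links between municipalities and co-inlinks from third-party pages, can be sketched with a toy link set. The site names and link data below are invented for illustration; a real webometric study would build them from a crawl or a search-engine index.

```python
from collections import Counter
from itertools import combinations

# Hypothetical (source, target) link pairs; "muni1"/"muni2" stand in for
# municipal websites, "siteA"-"siteC" for unrelated third-party sites.
links = [
    ("siteA", "muni1"), ("siteA", "muni2"),   # siteA links to both municipalities
    ("siteB", "muni1"), ("siteB", "muni2"),
    ("siteC", "muni2"),
    ("muni1", "muni2"),                       # one direct inter-municipal link
]
munis = {"muni1", "muni2"}

# Direct links: both endpoints are municipal sites.
direct = [(s, t) for s, t in links if s in munis and t in munis]

# Co-inlinks: count third-party sites that link to the same pair of municipalities.
inlinks = {}
for s, t in links:
    if t in munis and s not in munis:
        inlinks.setdefault(s, set()).add(t)

co_inlink = Counter()
for targets in inlinks.values():
    for pair in combinations(sorted(targets), 2):
        co_inlink[pair] += 1

print(direct)                         # [('muni1', 'muni2')]
print(co_inlink[("muni1", "muni2")])  # 2
```

Edge weights of this kind are what the network-analysis methods mentioned in the abstract would then operate on.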
Abstract:
With the increasing use of digital media, methods for multimedia protection have become extremely important. The number of solutions to the problem, from encryption to watermarking, is large and growing every year. This work considers digital image watermarking, specifically a novel method for digital watermarking of color and spectral images. An overview of existing methods for watermarking color and grayscale images is given. Methods using independent component analysis (ICA) for detection, and methods using the discrete wavelet transform (DWT) and discrete cosine transform (DCT), are considered in more detail. The novel watermarking method proposed in this work allows embedding a color or spectral watermark image into a color or spectral image, respectively, and successfully extracting the watermark from the resulting watermarked image. A number of experiments were performed on the quality of extraction depending on the parameters of the embedding procedure. Another set of experiments tested the robustness of the proposed algorithm. Three techniques were chosen for that purpose: the median filter, the low-pass filter (LPF) and the discrete cosine transform (DCT), which are part of the widely known StirMark Image Watermarking Robustness Test. The study shows that the proposed watermarking technique is fragile, i.e., the watermark is altered by simple image processing operations. Moreover, we found that the contents of the image to be watermarked do not affect the quality of the extraction. The mixing coefficients, which determine the amount of the key and watermark image in the result, should not exceed 1% of the original. The proposed algorithm has proven successful in the task of watermark embedding and extraction.
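The role of the mixing coefficients can be sketched with a minimal additive embedding scheme. This is an illustrative simplification, not the thesis's method: the thesis detects with ICA, whereas the extraction below assumes the host and key are known. Array contents and the 0.5% coefficients are placeholders, chosen to respect the ~1% bound quoted above.

```python
import numpy as np

# Minimal additive watermark embedding by linear mixing (illustrative).
rng = np.random.default_rng(0)
host = rng.random((8, 8))        # host image, normalized to [0, 1]
watermark = rng.random((8, 8))   # watermark image
key = rng.random((8, 8))         # secret key image

a_w, a_k = 0.005, 0.005          # mixing coefficients, kept below 1%
watermarked = host + a_w * watermark + a_k * key

# Naive extraction when host and key are known (informed detection),
# standing in for the ICA-based detection used in the thesis.
extracted = (watermarked - host - a_k * key) / a_w
print(np.allclose(extracted, watermark))   # True
```

Because the additions are this small, mild processing (median or low-pass filtering) easily destroys them, which is consistent with the fragility reported above.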
Abstract:
Visual data mining (VDM) tools employ information visualization techniques in order to represent large amounts of high-dimensional data graphically and to involve the user in exploring data at different levels of detail. The users look for outliers, patterns and models – in the form of clusters, classes, trends, and relationships – in different categories of data, e.g., financial and business information. The focus of this thesis is the evaluation of multidimensional visualization techniques, especially from the business user's perspective. We address three research problems. The first problem is the evaluation of projection-based visualizations with respect to their effectiveness in preserving the original distances between data points and the clustering structure of the data. In this respect, we propose the use of existing clustering validity measures. We illustrate their usefulness in evaluating five visualization techniques: Principal Components Analysis (PCA), Sammon's Mapping, the Self-Organizing Map (SOM), Radial Coordinate Visualization and Star Coordinates. The second problem concerns evaluating different visualization techniques as to their effectiveness in visual data mining of business data. For this purpose, we propose an inquiry evaluation technique and conduct an evaluation of nine visualization techniques: Multiple Line Graphs, Permutation Matrix, Survey Plot, Scatter Plot Matrix, Parallel Coordinates, Treemap, PCA, Sammon's Mapping and the SOM. The third problem is the evaluation of the quality of use of VDM tools. We provide a conceptual framework for evaluating the quality of use of VDM tools and apply it to the evaluation of the SOM. In the evaluation, we use an inquiry technique for which we developed a questionnaire based on the proposed framework. The contributions of the thesis consist of three new evaluation techniques and the results obtained by applying them.
The thesis provides a systematic approach to the evaluation of various visualization techniques. First, we performed and described the evaluations in a systematic way, highlighting the evaluation activities and their inputs and outputs. Secondly, we integrated the evaluation studies into the broad framework of usability evaluation. The results of the evaluations are intended to help developers and researchers of visualization systems select appropriate visualization techniques in specific situations. The results also contribute to the understanding of the strengths and limitations of the evaluated visualization techniques, and further to their improvement.
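The first research problem, measuring how well a projection preserves the original distances, can be illustrated with a small numeric sketch. The quality measure below is Sammon's stress, chosen here only as a familiar distance-preservation score; the thesis itself proposes clustering validity measures, and the data is random for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((30, 5))                      # 30 points in 5 dimensions

# PCA projection to 2-D via SVD of the centered data.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Y = Xc @ Vt[:2].T                            # 2-D display coordinates

def pairwise(A):
    # Upper-triangle pairwise Euclidean distances.
    d = np.linalg.norm(A[:, None, :] - A[None, :, :], axis=-1)
    return d[np.triu_indices(len(A), k=1)]

D, d = pairwise(X), pairwise(Y)
stress = np.sum((D - d) ** 2 / D) / np.sum(D)   # Sammon's stress; 0 = perfect
print(0.0 <= stress < 1.0)   # True
```

The same score computed for several projections (PCA, Sammon's Mapping, SOM, ...) gives a simple quantitative basis for the kind of comparison described above.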
Abstract:
Summary of the article: Tuominen, S., Eerikäinen, K., Schibalski, A., Haakana, M. & Lehtonen, A. / Mapping biomass variables with a multi-source forest inventory technique. Silva Fennica 44 (2010) : 1, 109-119.
Abstract:
The purpose of this study was to simulate and optimize integrated gasification combined cycle (IGCC) for power generation and hydrogen (H2) production using low-grade Thar lignite coal and cotton stalk. Lignite coal is high in moisture and ash content; the idea behind adding cotton stalk is to increase the mass of combustible material per mass of feed used in the process, to reduce the consumption of coal, and to utilize cotton stalk efficiently in the IGCC process. Aspen Plus software is used to simulate the process with different mass ratios of coal to cotton stalk. For the optimization, process efficiencies, net power generation and H2 production are considered, while hazardous emissions are kept to an acceptable level. With the addition of cotton stalk to the feed, process efficiencies started to decline along with net power production. For H2 production, the addition was beneficial at first, but beyond 40% cotton stalk, H2 production also started to decline. The addition also affects hazardous emissions negatively, and the mass of emissions per unit of net power production increases linearly with the fraction of cotton stalk in the feed mixture. In summary, the overall effects of adding cotton stalk appear negative, and markedly more so beyond 40%. It is therefore concluded that, to obtain maximum process efficiencies and high production, a small amount of cotton stalk in the feed is preferable, with the maximum addition level estimated at 40%. The gasification temperature should be kept low, around 1140 °C, and the preferred technique for the studied feed in IGCC is a fluidized bed gasifier (ash in dry form) rather than an ash-slagging gasifier.
Abstract:
The objective of this master's thesis is to investigate the loss behavior of a three-level ANPC inverter and compare it with a conventional NPC inverter. Both inverters are controlled with a mature space vector modulation (SVM) strategy. In order to provide the comparison, sufficiently accurate and detailed NPC and ANPC simulation models are required. The same SVM control model is utilized for both the NPC and ANPC inverter models. The principles of the control algorithms and the structure and description of the models are clarified. The power loss calculation model is based on practical calculation approaches with certain assumptions. The comparison between the NPC and ANPC topologies is presented based on the results obtained for each semiconductor device, their switching and conduction losses, and the efficiency of the inverters. The alternative switching states of the ANPC topology allow losses to be distributed among the switches more evenly than in the NPC inverter. As expected, the losses of a switching device depend on its position in the topology. The loss distribution among the components in the ANPC topology reduces the stress on certain switches, so that losses are shared more equally among the semiconductors; the overall efficiency of the two inverters is, however, the same. As a new contribution to earlier studies, models of the SVM control and of the NPC and ANPC inverters have been built. Thus, this thesis can be used in further, more complicated modelling of full-power converters for modern multi-megawatt wind energy conversion systems.
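The per-device loss bookkeeping behind such a comparison typically splits losses into conduction and switching terms. The sketch below shows that common textbook decomposition; all numeric values are illustrative placeholders, not parameters or results from the thesis.

```python
# Conduction loss from a piecewise-linear device model, switching loss from
# datasheet energies scaled linearly to the operating point (illustrative).

def conduction_loss(v_on, r_on, i_avg, i_rms):
    # P_cond = V_on * I_avg + R_on * I_rms^2
    return v_on * i_avg + r_on * i_rms ** 2

def switching_loss(e_on, e_off, f_sw, i, v, i_ref, v_ref):
    # P_sw = (E_on + E_off) * f_sw, scaled from the datasheet test point.
    return (e_on + e_off) * f_sw * (i / i_ref) * (v / v_ref)

p_cond = conduction_loss(v_on=1.0, r_on=0.01, i_avg=50.0, i_rms=70.0)
p_sw = switching_loss(e_on=5e-3, e_off=4e-3, f_sw=2000.0,
                      i=50.0, v=600.0, i_ref=100.0, v_ref=1200.0)
print(p_cond, p_sw)   # 99.0 4.5
```

Summing such terms per device, and per switching state, is what makes the uneven NPC loss distribution and the more even ANPC distribution visible.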
Abstract:
The objective of this thesis is the development of a multibody dynamic model matching the observed movements of the lower limb of a skier performing the skating technique in cross-country skiing. During the construction of this model, the equation of motion was formulated using the Euler-Lagrange approach with multipliers, applied to a multibody system in three dimensions. The lower limb of the skate skier and the ski were described by three bodies, one representing the ski and two representing the natural movements of the skier's leg. The resulting system has 13 joint constraints due to the interconnection of the bodies, and four prescribed kinematic constraints to account for the movements of the leg, leaving the number of degrees of freedom equal to one. The push-off force exerted by the skate skier was taken directly from measurements made on-site in the ski tunnel at the Vuokatti facilities (Finland) and was input into the model as a continuous function. The resulting velocities and movement of the ski, the center of mass of the skier, and the variation of the skating angle were then studied to understand the response of the model to the variation of important parameters of the skating technique. This allowed a comparison of the model results with the real movement of the skier. The model can be further developed to better approximate the real movement of the leg, for instance by changing the constraints to include the behavior of the real leg joints and muscle actuation. As mentioned in the introduction of this thesis, a multibody dynamic model can provide relevant information to ski designers and yield optimized results for the given variables, which athletes can use to improve their performance.
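The degree-of-freedom count stated above follows directly from the standard mobility bookkeeping for spatial rigid bodies, which can be checked in a couple of lines:

```python
# Mobility check for the skier model: three rigid bodies in 3-D, minus
# the 13 joint constraints and 4 prescribed (driving) constraints.
n_bodies = 3
coords_per_body = 6          # 3 translations + 3 rotations per rigid body
joint_constraints = 13
driving_constraints = 4

dof = n_bodies * coords_per_body - joint_constraints - driving_constraints
print(dof)   # 1
```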
Abstract:
The topic of this master's thesis is translation and communication in a foreign language in NGO development cooperation projects funded by the Finnish Ministry for Foreign Affairs. The thesis aims to map the translation and foreign-language communication needs and practices of these development cooperation projects. The topic has not otherwise been studied extensively, so the empirical part plays a particularly significant role. The thesis begins by presenting the Ministry for Foreign Affairs' terms and guidelines for Finnish NGOs' development cooperation projects that are relevant to this study. The theoretical part of the thesis consists mainly of methodological reflection, because little prior research exists. The analysis of research methods is also emphasized because the empirical part carries so much weight. The empirical material of the thesis comes, on the one hand, from the researcher's observations, obtained by drawing on earlier project administration experience and by using participant observation in the development cooperation projects of two student organizations between 2009 and 2012. On the other hand, the material was collected from the responses to a survey. The survey was sent to 35 Finnish NGOs whose development cooperation projects are funded by the Finnish Ministry for Foreign Affairs. The organizations were selected by random sampling. The analysis of the empirical material revealed, among other things, that the researcher's original hypothesis about the prevalence of translating development cooperation texts is partially correct, although the organizations translated fewer texts than the researcher expected. The material is not, however, extensive enough for the results to be generalized, as the NGOs, their projects, their translation needs and their expertise proved, as expected, to be very different. The results also cover, for example, the backgrounds of those who translate and communicate in foreign languages within the organizations.
Further research is nevertheless needed, and there is an abundance of possible research topics.
Abstract:
The ongoing global financial crisis has demonstrated the importance of a system-wide, or macroprudential, approach to safeguarding financial stability. An essential part of macroprudential oversight concerns the tasks of early identification and assessment of risks and vulnerabilities that may eventually lead to a systemic financial crisis. Effective tools are crucial, as they allow early policy actions to decrease or prevent the further build-up of risks, or to otherwise enhance the shock absorption capacity of the financial system. In the literature, three types of systemic risk can be identified: i) build-up of widespread imbalances, ii) exogenous aggregate shocks, and iii) contagion. Accordingly, the systemic risks are matched by three categories of analytical methods for decision support: i) early-warning models, ii) macro stress-testing, and iii) contagion models. Stimulated by the prolonged global financial crisis, today's toolbox of analytical methods includes a wide range of innovative solutions to the two tasks of risk identification and risk assessment. Yet the literature lacks a focus on the task of risk communication. This thesis discusses macroprudential oversight from the viewpoint of all three tasks: within analytical tools for risk identification and risk assessment, the focus is on a tight integration of means for risk communication. Data and dimension reduction methods, and their combinations, hold promise for representing multivariate data structures in easily understandable formats. The overall task of this thesis is to represent high-dimensional data concerning financial entities on low-dimensional displays. The low-dimensional representations have two subtasks: i) to function as a display for individual data concerning entities and their time series, and ii) to serve as a basis to which additional information can be linked. The final nuance of the task is, however, set by the needs of the domain, data and methods.
The following five questions comprise subsequent steps addressed in the process of this thesis: 1. What are the needs for macroprudential oversight? 2. What form do macroprudential data take? 3. Which data and dimension reduction methods hold the most promise for the task? 4. How should the methods be extended and enhanced for the task? 5. How should the methods and their extensions be applied to the task? Based upon the Self-Organizing Map (SOM), this thesis not only creates the Self-Organizing Financial Stability Map (SOFSM), but also lays out a general framework for mapping the state of financial stability. The thesis also introduces three extensions to the standard SOM for enhancing the visualization and extraction of information: i) fuzzifications, ii) transition probabilities, and iii) network analysis. Thus, the SOFSM functions as a display for risk identification, on top of which risk assessments can be illustrated. In addition, this thesis puts forward the Self-Organizing Time Map (SOTM) to provide means for visual dynamic clustering, which in the context of macroprudential oversight concerns the identification of cross-sectional changes in risks and vulnerabilities over time. Rather than automated analysis, the aim of visual means for identifying and assessing risks is to support disciplined and structured judgmental analysis based upon policymakers' experience and domain intelligence, as well as external risk communication.
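The SOM underlying the SOFSM can be sketched in a few lines: a grid of prototype vectors is iteratively pulled toward the data, with a neighborhood function spreading each update across the grid so that nearby units come to represent similar states. Grid size, learning rates and the random data below are illustrative, not the thesis's configuration.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.random((200, 4))            # 200 observations of 4 indicators
grid_w, grid_h = 5, 5
codebook = rng.random((grid_w * grid_h, 4))
coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)], float)

for t in range(100):
    lr = 0.5 * (1 - t / 100)           # decaying learning rate
    sigma = 2.0 * (1 - t / 100) + 0.5  # decaying neighborhood radius
    x = data[rng.integers(len(data))]
    bmu = np.argmin(np.linalg.norm(codebook - x, axis=1))   # best-matching unit
    h = np.exp(-np.sum((coords - coords[bmu]) ** 2, axis=1) / (2 * sigma ** 2))
    codebook += lr * h[:, None] * (x - codebook)

# After training, each observation maps to its BMU on the 2-D grid; the grid
# is the low-dimensional display on which further information can be layered.
bmus = np.argmin(np.linalg.norm(data[:, None] - codebook[None], axis=-1), axis=1)
print(bmus.shape)   # (200,)
```

Coloring the grid by a crisis indicator and plotting an entity's time series as a trajectory over it is, in essence, how such a map serves risk identification and communication.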
Abstract:
In this work, image-based estimation methods, also known as direct methods, are studied; these avoid feature extraction and matching completely. The cost functions use raw pixels as measurements, and the goal is to produce precise 3D pose and structure estimates. The cost functions presented minimize the sensor error, because the measurements are not transformed or modified. In photometric camera pose estimation, 3D rotation and translation parameters are estimated by minimizing a sequence of image-based cost functions, which are non-linear due to perspective projection and lens distortion. In image-based structure refinement, on the other hand, the 3D structure is refined using a number of additional views and an image-based cost metric. Image-based estimation methods are particularly useful in conditions where the Lambertian assumption holds and the 3D points have constant color regardless of viewing angle. The goal is to improve image-based estimation methods and to produce computationally efficient methods that can be accommodated in real-time applications. The developed image-based 3D pose and structure estimation methods are finally demonstrated in practice in an indoor 3D reconstruction setting and in a live augmented reality application.
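The core idea, a cost built directly from raw pixel intensities, can be sketched with a deliberately simplified warp. Here the "pose" is a pure 2-D integer translation and the minimizer is an exhaustive search; a real direct method warps through the full 3-D pose and camera model and solves a non-linear least-squares problem. The images are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
ref = rng.random((20, 20))                        # reference image

# Synthesize a second image shifted by (1, 2) pixels.
target = np.roll(np.roll(ref, 1, axis=0), 2, axis=1)

def photometric_cost(shift):
    # Sum of squared intensity differences: raw pixels as measurements.
    dy, dx = shift
    warped = np.roll(np.roll(target, -dy, axis=0), -dx, axis=1)
    return np.sum((ref - warped) ** 2)

# Exhaustive search over small shifts stands in for non-linear optimization.
best = min(((dy, dx) for dy in range(-3, 4) for dx in range(-3, 4)),
           key=photometric_cost)
print(best)   # (1, 2)
```

The recovered shift matches the synthetic motion exactly because no features were extracted and no measurement was transformed; only the pixel values themselves entered the cost.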
Abstract:
Information gained from the Human Genome Project and improvements in compound synthesis have increased the number of both therapeutic targets and potential lead compounds. This has created a need for better screening techniques with the capacity to screen large compound libraries against an increasing number of targets. Radioactivity-based assays have traditionally been used in drug screening, but fluorescence-based assays have become more popular in high-throughput screening (HTS) as they avoid the safety and waste problems associated with radioactivity. Compared to conventional fluorescence, more sensitive detection is obtained with time-resolved luminescence, which has increased the popularity of assays based on time-resolved fluorescence resonance energy transfer (TR-FRET). To simplify the current TR-FRET-based assay concept, a homogeneous, luminometric single-label assay technique, Quenching Resonance Energy Transfer (QRET), was developed. The technique utilizes a soluble quencher to non-specifically quench the signal of the unbound fraction of the lanthanide-labeled ligand. A single labeling procedure and fewer manipulation steps in the assay concept save resources. The QRET technique is suitable for both biochemical and cell-based assays, as indicated in four studies: 1) a ligand screening study of the β2-adrenergic receptor (cell-based), 2) an activation study of Gs-/Gi-protein coupled receptors measuring the intracellular concentration of cyclic adenosine monophosphate (cell-based), 3) an activation study of G-protein coupled receptors observing the binding of guanosine-5'-triphosphate (cell membranes), and 4) an activation study of the small GTP-binding protein Ras (biochemical). Signal-to-background ratios were between 2.4 and 10, and coefficients of variation ranged from 0.5 to 17%, indicating suitability for HTS use.
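The two assay-quality statistics quoted above, signal-to-background ratio and coefficient of variation (CV), are straightforward to compute from replicate measurements. The replicate values below are invented for illustration, not data from the studies.

```python
import statistics

# Hypothetical replicate readings from a plate reader.
signal = [5200.0, 4900.0, 5100.0, 4800.0]   # specific-signal wells
background = [500.0, 520.0, 480.0, 500.0]   # background wells

# Signal-to-background: mean signal over mean background.
sb = statistics.mean(signal) / statistics.mean(background)

# Coefficient of variation: relative spread of the signal replicates, in %.
cv = 100 * statistics.stdev(signal) / statistics.mean(signal)

print(round(sb, 1))   # 10.0
print(cv < 20)        # True
```

Ratios and CVs in the ranges reported above are the usual evidence that an assay separates signal from background reproducibly enough for HTS.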